View a file whilst it is being written - Python

I am making a game in Python 2.7 for fun and am trying to make a map to go along with it. I am using file I/O to read and write the map and have also set Notepad++ to silent update, however I can only see the changes once my program has fully run, and I want to view the file as it is updated.
I have this code which I am testing with:
from time import sleep

# Raw strings so the backslashes in the Windows paths are not treated as escapes
map = open(r'C:\Users\Ryan\Desktop\Codes\Python RPG\Maps\map.txt', 'r+')
map.truncate()  # empty the map file
print "file deleted"
sleep(1)
worldMap = open(r'C:\Users\Ryan\Desktop\Codes\Python RPG\Maps\worldMap.txt', 'r')
for line in worldMap:
    map.write(line)
print "file updated"
worldMap.close()
map.close()
Any help is greatly appreciated :)

By default Python uses buffered I/O. This means that written data is stored in memory before it is actually written to the file. Calling the file's flush() method forces the buffered data to be written out to the file.
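For example, here is a minimal sketch of the copy loop from the question with an explicit flush after each write (the paths and variable names are taken from the question; the per-line flush and sleep are only there so the update can be watched while the script runs):
from time import sleep

map = open(r'C:\Users\Ryan\Desktop\Codes\Python RPG\Maps\map.txt', 'r+')
map.truncate()
worldMap = open(r'C:\Users\Ryan\Desktop\Codes\Python RPG\Maps\worldMap.txt', 'r')
for line in worldMap:
    map.write(line)
    map.flush()    # push the buffered data out so other programs see the change
    sleep(0.2)     # slow the copy down so the update is visible in the editor
worldMap.close()
map.close()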

Related

Reading a dynamically updated log file via readline

I'm trying to read a log file, written line by line, via readline.
I'm surprised to observe the following behaviour (code executed in the interpreter, but the same happens when variations are executed from a file):
f = open('myfile.log')
line = f.readline()
while line:
    print(line)
    line = f.readline()
# --> This displays all lines the file contains so far, as expected

# At this point, I open the log file with a text editor (Vim),
# add a line, save and close the editor.

line = f.readline()
print(line)
# --> I'm expecting to see the new line, but this does not print anything!
Is this behaviour standard? Am I missing something?
Note: I know there are better ways to deal with an updated file, for instance with generators as pointed out here: Reading from a frequently updated file. I'm just interested in understanding the issue with this precise use case.
For your specific use case, the explanation is that Vim uses a write-to-temp strategy. This means that all writing operations are performed on a temporary file, which then replaces the original when you save.
Your script, on the other hand, keeps reading from the original file, so it does not see any change on it.
To further test this, instead of Vim you can try writing directly to the file using:
echo "Hello World" >> myfile.log
You should then see the new line from Python.
For following your file as it grows, you can use code like this:
import time

f = open('myfile.log')
while True:
    line = f.readline()
    if not line:
        time.sleep(0.5)   # no new data yet, wait before polling again
        continue
    print(line)

Writing to a text file does not occur in real-time. How to fix this

I have a python script that takes a long time to run.
I placed print-outs throughout the script to observe its progress.
As this script runs different programs, some of which print many messages, it is unfeasible to print directly to the screen.
Therefore, I am using a report file
f_report = open(os.path.join("//shared_directory/projects/work_area/", 'report.txt'), 'w')
To which I print my messages:
f_report.write(" "+current_image+"\n")
However, when I look at the file while the script is running, I do not see the messages. They appear only when the program finishes and closes the file, making my approach useless for monitoring on-going progress.
What should I do in order to make python output the messages to the report file in real time?
Many thanks.
You should use the flush() method to write immediately to the file:
f_report.write(" "+current_image+"\n")
f_report.flush()
Try this:
newbuffer = 0
f_report = open(os.path.join("//shared_directory/projects/work_area/", 'report.txt'), 'w', newbuffer)
This sets the buffer size to 0, which pushes the OS to write content to the file immediately. Different operating systems may behave differently, but in general the content will be flushed out right away.
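One caveat worth noting: buffering=0 is only accepted for binary-mode files in Python 3, so if the script is ever run under Python 3 a text-mode alternative is line buffering (buffering=1), sketched here with the same path as above:
import os

# buffering=1 means line-buffered: each completed line is pushed out to the OS.
report_path = os.path.join("//shared_directory/projects/work_area/", 'report.txt')
f_report = open(report_path, 'w', 1)
f_report.write("processing image_001\n")   # visible in the file almost immediately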

How to make Python file-writing faster with IDLE?

Writing from file_A to file_B using IDLE always makes IDLE print out the lines as they are being written. If the file is very large, then the process would take hours to finish.
How can I make IDLE not print anything while the process of writing to a new file is ongoing, in order to speed things up?
A simple code to demonstrate that IDLE prints the lines as they are being written:
file = open('file.csv', 'r')
copy = open('copy.csv', 'w')
for i in file:
    i = i.split()
    copy.write(str(i))
I assume you are using Python 3, where write returns the number of characters written to the file, and IDLE's Python shell prints this return value when you call write interactively. In Python 2, write returns None, which is not printed by IDLE's shell.
The workaround is to assign the return value of write to a temporary dummy variable
dummy = f.write("my text")
For your example the following code should work:
file = open('file.csv', 'r')
copy = open('copy.csv', 'w')
for i in file:
    i = i.split()
    dummy = copy.write(str(i))
I added two screenshots for all of you to see the difference between the writes in Python 2 and Python 3 on my system.

Blocking until a file is closed in python

I have Python set up to create and open a txt file [see Open document with default application in Python], which I then manually make some changes to and close. Immediately after this is complete I want Python to open up the next txt file. I currently have this set up so that Python waits for a key command that I type after I have closed the file, and on that key it opens the next one for me to edit.
Is there a way of getting Python to open the next document as soon as the prior one is closed (i.e. to skip having Python wait for a key to be pressed)? ... I will be repeating this task approximately 100,000 times, and thus every fraction of a second of clicking adds up very quickly. I basically want to get rid of having to interact with Python, and simply to have the next txt file automatically appear as soon as the prior one is closed.
I couldn't work out how to do it, but was thinking along the lines of waiting until the prior file is closed (I wasn't sure if there is a way for Python to tell whether a file is open or closed).
For reference, I am using python2.7 and Windows.
Use the subprocess module's Popen constructor to open the file. It returns an object with a wait() method, which will block until the editor process exits, i.e. until you close the file.
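A minimal sketch of that idea (assuming Notepad on Windows; notepad.exe and the file list are placeholders for whatever editor and files you actually use):
import subprocess

list_of_files = ['notes1.txt', 'notes2.txt']  # placeholder list of files to edit

for fname in list_of_files:
    # Launch the editor and block until that editor process exits
    proc = subprocess.Popen(['notepad.exe', fname])
    proc.wait()
    print "finished editing", fname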
How about something like:
for fname in list_of_files:
    with open(fname, mode) as f:
        # do stuff
In case it is of interest, the following code using the modified-time method worked:
import os
import sys
import time

os.startfile(text_file_name)
modified = time.ctime(os.path.getmtime(text_file_name))
created = time.ctime(os.path.getctime(text_file_name))
# Poll until the file's modification time changes, i.e. until it has been saved
while modified == created:
    time.sleep(0.5)
    modified = time.ctime(os.path.getmtime(text_file_name))
print modified
print "moving on to next item"
time.sleep(0.5)
sys.stdout.flush()
Although I think I will use the Popen constructor in the future, since that seems a much more elegant way of doing it (and it also allows for situations where the file is closed without an edit being needed).

How to not lose Intermediate Data in File

While learning Python we print to the screen, but eventually graduate to printing to output files. However, most of the time not all errors in the code have been resolved, and in such cases the script aborts after running some 10-20 loops, or say 80% of the way through. When that happens, the data that was printed to the file is lost, because file.close() is never executed.
Is there a way in Python to save the work-in-progress file? I want to do this without closing and reopening the file multiple times in append mode. This would help with debugging and would also avoid losing the data that had accumulated before the error occurred.
After searching I did not find something like this. If someone has or can give any ideas on how to make a module for this, that would be great. What we need is a generic catch-all: in case of any error, execute the catch-all code to close the file and then exit from Python.
You can flush the internal file buffer by calling f.flush() on the file object in question.
Even better is to wrap the file access in a with block. If an exception is raised, the file is still closed automatically:
with open('tmp.txt', 'r') as f:
    do_stuff_with(f)
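Applied to the write case in the question, here is a minimal sketch (the file name and compute_result() are placeholders): even if the loop raises partway through, the with block closes the file and the lines written so far are preserved:
def compute_result(i):
    # Placeholder for the real per-iteration work; fails partway through on purpose.
    if i == 7:
        raise ValueError("simulated failure")
    return "result %d\n" % i

try:
    with open('output.txt', 'w') as out:
        for i in range(20):
            out.write(compute_result(i))
except Exception as err:
    # Generic catch-all: the with block has already closed (and flushed) the file,
    # so the lines for i = 0..6 are preserved in output.txt.
    print "aborted after an error:", err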
See the documentation of file objects: https://docs.python.org/2/library/stdtypes.html?highlight=flush#file.flush
Use the flush() function. There is also a note in the docs about the os.fsync() function, to make sure the data is actually written to disk.
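A minimal sketch of flush() combined with os.fsync(), as suggested in that note (the file name is a placeholder):
import os

f = open('progress.log', 'w')
f.write("step 1 done\n")
f.flush()                # push Python's buffer to the operating system
os.fsync(f.fileno())     # ask the OS to write its buffers to the physical disk
f.close()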
