I have a function that watches a csv file. It iterates through all the lines in the file, returning each one, then waits for the file to be updated and, when it is, returns the new lines as well. This is what it looks like:
import csv
import time

def follow():
    file_path = filedialog.askopenfilename()
    with open(file_path) as csvDataFile:
        csvReader = csv.reader(csvDataFile)
        csvDataFile.seek(0, 0)
        while True:
            line = csvDataFile.readline()
            if not line:
                time.sleep(0.5)
                continue
            yield line
I am doing this for a gui. I have a button that, when pressed, loads the csv file and calls this function. It saves each line to a list that is a member variable.
def browseForFile(self):
    line = pull_csv_data.follow()
    for item in line:
        self.list.append(item)
When another button is pressed, it iterates through that list and displays the information on the gui:
def listItems(self):
    for i in self.list:
        time.sleep(0.1)
        item = QListWidgetItem(i)
        self.prev_scenes.addItem(item)
The problem is that whenever I click the button to display the data in the list, the GUI stops responding until I kill the Python script that contains the follow() function. For clarification, the follow function is in a separate file that I import in my "main" file.
Essentially, the follow function does not seem to allow anything else to run at the same time. Is that true? Is there a workaround? Or is there a better way to do this?
The follow function runs in the same process, and the same thread, as your main function. Because of this, when your follow function sleeps or loops waiting for new lines, so does your entire program, including the GUI event loop. If you want it to run asynchronously, you'll have to make it do so explicitly.
You should probably use something like https://pythonhosted.org/watchdog/index.html or https://github.com/seb-m/pyinotify/wiki instead of trying to write your own file handling function.
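If you want to keep your own follow() generator, one workaround is to run it on a worker thread and hand each line back to the GUI thread through a signal. A minimal sketch, assuming PyQt5 (the class, signal, and slot names here are illustrative, not from your code); in practice you would also ask for the file name in the GUI thread and pass it into follow() rather than opening a dialog from the worker:

from PyQt5.QtCore import QThread, pyqtSignal

import pull_csv_data

class FollowWorker(QThread):
    # Emitted once for every line the generator yields.
    line_received = pyqtSignal(str)

    def run(self):
        for line in pull_csv_data.follow():
            self.line_received.emit(line)

# In the main window, instead of looping over follow() in browseForFile():
#     self.worker = FollowWorker()
#     self.worker.line_received.connect(self.onNewLine)  # onNewLine appends to self.list and the QListWidget
#     self.worker.start()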
Maybe this is just how Python works, but I think I'm missing something; I've looked around and I can't find anything addressing this.
Are text files only created after the process is terminated, or have I made a mistake?
Here's the code I'm using to generate the text file:
user_name_token = open("user_name_token.txt", "w")
If that is just how Python works and .txt files are only generated after the process is terminated, then there's no need to look at the code below. However, if I've made a mistake and that's why the .txt file is only generated after the process is terminated, please do let me know.
I've put what I feel to be the relevant code sequence below.
The following code, if the file exists, reads the text inside into a str value and leads to a greeting that uses it, but that isn't relevant here. Instead, we'll carry on as if the file does not exist yet and go to initial_greeting_and_initial_user_name_set() after initialization():
initialization()

def initialization():
    global user_name
    global initial_login_token
    if os.path.isfile("user_name_token.txt") == True:
        file = open("user_name_token.txt")
        user_name = file.read().replace("\n", " ")
        file.close
        print(user_name)
        time_based_greeting()
    else:
        initial_greeting_and_initial_user_name_set()
The following code allows the user to create the user name and leads to initial_greeting_and_initial_user_name_set_token_generation() when the button is pressed:
def initial_greeting_and_initial_user_name_set():
    global initial_greeting_label
    global initial_greeting_space0
    global user_name
    global initial_greeting_progression_button
    initial_greeting_label = Label(root, text="Hello! It's nice to meet you, my name is Eve, what's yours?")
    initial_greeting_space0 = Label(root, text=" ")
    user_name = Entry(root)
    initial_greeting_progression_button = Button(root,
                                                 text="Enter",
                                                 command=initial_greeting_and_initial_user_name_set_token_generation)
    initial_greeting_label.pack()
    initial_greeting_space0.pack()
    user_name.pack()
    initial_greeting_progression_button.pack()
The following code uses the user Entry to create a .txt file with the user name inside and loops back to initialization():
def initial_greeting_and_initial_user_name_set_token_generation():
    global user_name_token
    user_name_token = open("user_name_token.txt", "w")
    user_name_token.write(user_name.get())
    initialization()
Thank you for any help, especially given it's a long read if I have in fact made a mistake and that's why the .txt file is only created after the window is closed.
Again, if it's just normal for .txt files to only be generated after the process is terminated, and not when the code to create them has been run, then there's no need to take the above code into account.
I'm not sure that's the problem, but in that code:
initialization()

def initialization():
    global user_name
    global initial_login_token
    if os.path.isfile("user_name_token.txt") == True:
        file = open("user_name_token.txt")
        user_name = file.read().replace("\n", " ")
        file.close
        print(user_name)
        time_based_greeting()
    else:
        initial_greeting_and_initial_user_name_set()
It seems to me that you forgot to call the file.close method; you reference it but never actually call it.
Add parentheses at the end, so it looks as follows:
file.close()
It is unclear just what your code is doing as you're opening a file for writing and then opening it again for reading before you've closed it for writing. You should close your file for writing before you go on to open it for reading:
def initial_greeting_and_initial_user_name_set_token_generation():
    global user_name_token
    user_name_token = open("user_name_token.txt", "w")
    user_name_token.write(user_name.get())
    user_name_token.close()
    initialization()
or better yet:
def initial_greeting_and_initial_user_name_set_token_generation():
    global user_name_token
    with open("user_name_token.txt", "w") as user_name_token:
        user_name_token.write(user_name.get())
    initialization()
This latter construct is the preferred way to open a file such that you're sure it gets closed. It will remain open only inside the with block and will then be closed automatically.
Python does not hold back any I/O operations. Like other languages, it performs the operations right away. It may buffer up write operations and write the results in larger blocks, but all writing is completed when you call close() on the file (or when the associated with block is exited, closing the file automatically).
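If you need the text to appear on disk while the file is still open, for example so you can inspect it before the program ends, you can flush explicitly. A minimal sketch of that idea:

user_name_token = open("user_name_token.txt", "w")
user_name_token.write(user_name.get())
user_name_token.flush()  # push the buffered text out now, even though the file stays open
# ... continue using the file ...
user_name_token.close()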
I am trying to figure out how to restart a python program with Tkinter. Basically, when the user clicks on the restart button, the program should close all existing Tkinter tabs and "restart" the program. How can I do this?
This is what I am currently doing
def Restart(root, root2, root3):
    fh = open(".exeinfo/AccountData/Filepath.baseconfig", 'r')
    path = fh.read()
    call(["Restart.bat"])
    root2.destroy()
    try:
        root.destroy()
    except:
        pass
    try:
        root3.destroy()
    except:
        pass
Normally I would say that you should put the entirety of the code within a loop, but tkinter breaks that, so what I will suggest is a solution that is very unclean: two more simple scripts to run alongside this one.
The first is a simple launcher, something along the lines of using multiprocessing to run your current script/application alongside the other one.
Then add this to the end of your function you've made here to restart the program:
f = open("restart.txt", "w+")
f.write("true")
f.close()
exit() #or however you choose to end your program after destroying the roots
And this to the start of your script/application (before anything else is done, except importing stuff):
r = open("restart.txt", "w+")
r.write("false")
r.close()
Finally, in the third script just add a simple loop:
x = 5
while x == 5:
    f = open("restart.txt", "r+")
    restart = f.read()
    f.close()
    if restart == "true":
        # run your application/script here
        pass
The goal of this would be to run your application and this simple loop simultaneously. When you want to restart your program, you update the text file to "true" and close the program; the loop then runs your program again, which changes the value back to "false" and prevents infinite windows of your program from being opened. I doubt this is the best solution, nor exactly what you wanted, but it should "restart" your program.
I have a situation where I only want to store a single tensor at a time. IIUC, the FileWriter appends summaries to the existing event file. Right now, every time I want to write a summary, I do the following in a class init:
self.WRITER = tf.summary.FileWriter(self.LOGDIR,
                                    max_queue=1,
                                    flush_secs=9999999)
and in a class method:
summary = self.SESSION.run(tf.summary.tensor_summary('frame', image_tensor))
self.WRITER.add_summary(summary)
self.WRITER.flush()
self.WRITER.close()
With .close(), it only writes once. Without .close(), it appends to the event file. With self.WRITER.reopen() at the beginning of the method, it adds new event files. I would like to have a single event file that is overwritten every time.
Is there any way to do this through TensorFlow, or do I need to remove the old file manually and create a new summary at each iteration?
As far as I can see, there is no way to make the writer keep only the last event, but you can call
os.remove("path/to/file")
on each iteration before writing the new summary.
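A sketch of that approach, assuming TensorFlow 1.x's tf.summary.FileWriter and the default event-file naming (the write_single_summary helper is made up for illustration):

import glob
import os
import tensorflow as tf

def write_single_summary(logdir, summary):
    # Remove any previous event file(s) so the log directory only ever
    # holds the most recent summary, then write the new one.
    for old_file in glob.glob(os.path.join(logdir, "events.out.tfevents.*")):
        os.remove(old_file)
    writer = tf.summary.FileWriter(logdir)
    writer.add_summary(summary)
    writer.close()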
I'd like to ask if there's any way we could watch a directory in Python and parse the latest text file being generated in that directory.
I do have this start-up code which parses a specific text file:
import time

def follow(thefile):
    thefile.seek(0, 2)
    while True:
        line = thefile.readline()
        if not line:
            time.sleep(0.1)
            continue
        yield line
if __name__ == '__main__':
    logfile = open(r'\\some directory\files.txt', "r")
    loglines = follow(logfile)
    for line in loglines:
        print line,
See the hard-coded files.txt above; I need that to be dynamic, by watching the directory for newly generated text files, switching to the latest text file, and parsing it.
It will run on Windows XP Service Pack 3.
I'm using Python 2.7.
The directory I'm watching is also on a Windows XP machine.
Thank you.
To check for new files, repeatedly get a list of files currently in the directory with os.listdir('directory'). Save the entries in a set and calculate the difference of the set with the previous set.
# Initialize before the event loop:
old_entries = set()

# You need a loop that calls two handlers, each handler returning soon.
# Inside your loop, check for a "new file" event this way:
now_entries = set(os.listdir(r'\\some directory'))
for new_entry in now_entries - old_entries:
    handle_new_file(new_entry)
old_entries = now_entries
Your program needs to listen for two events:
New file in the directory.
New line in the old file.
You call follow(), which is like an event handler that never returns. I think you want that handler to return to one main event loop that checks for both kinds of event. Your follow() function never returns because it stays inside the while True loop, only yielding when a new line is added to the file; if no more lines are added to that file, it never yields and never gets a chance to check the directory. A sketch of such a combined polling loop follows.
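For the polling route, here is a minimal sketch of such a loop; the newest_txt helper and the 0.5 second poll interval are my own choices, and the directory path is the one from your snippet:

import os
import time

WATCH_DIR = r'\\some directory'

def newest_txt(directory):
    # Return the most recently modified .txt file in the directory, or None.
    paths = [os.path.join(directory, name)
             for name in os.listdir(directory)
             if name.lower().endswith('.txt')]
    return max(paths, key=os.path.getmtime) if paths else None

current_path = None
logfile = None
while True:
    latest = newest_txt(WATCH_DIR)
    if latest and latest != current_path:
        # A newer file has appeared; switch to it.
        if logfile:
            logfile.close()
        current_path = latest
        logfile = open(current_path, 'r')
    line = logfile.readline() if logfile else None
    if line:
        print line,        # Python 2 print, as in the question
    else:
        time.sleep(0.5)    # nothing new yet; poll again shortly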
Take a look at the FindFirstChangeNotification API:
http://timgolden.me.uk/python/win32_how_do_i/watch_directory_for_changes.html
The approach here is to use the MS FindFirstChangeNotification API, exposed via the pywin32 win32file module. It needs a little explanation: you get a change handle for a directory (optionally with its subdirectories) for certain kinds of change. You then use the ubiquitous WaitForSingleObject call from win32event, which fires when something's changed in one of your directories.
Essentially because the Windows OS is responsible for managing creation/modification of files, you can ask it to let you know immediately when a file is changed/created.
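A trimmed sketch of the recipe from that page, assuming the pywin32 package is installed (the 500 ms wait timeout is arbitrary):

import os
import win32file
import win32event
import win32con

path_to_watch = r'\\some directory'
change_handle = win32file.FindFirstChangeNotification(
    path_to_watch,
    0,                                       # do not watch subdirectories
    win32con.FILE_NOTIFY_CHANGE_FILE_NAME    # fire on file create/delete/rename
)
try:
    old_contents = set(os.listdir(path_to_watch))
    while True:
        result = win32event.WaitForSingleObject(change_handle, 500)
        if result == win32con.WAIT_OBJECT_0:
            new_contents = set(os.listdir(path_to_watch))
            for added in new_contents - old_contents:
                print added      # a new file appeared; open and parse it here
            old_contents = new_contents
            win32file.FindNextChangeNotification(change_handle)
finally:
    win32file.FindCloseChangeNotification(change_handle)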
I have a process that gets files from a directory and puts them in a list. It then iterates that list in a loop. The last line of the loop is where it should update my GUI display; then it begins the loop again with the next item in the list.
My problem is that it does not actually update the GUI until the entire process is complete, which, depending on the size of the list, could be 30 seconds to over a minute. This gives the feeling of the program being 'hung'.
What I wanted it to do was process one line in the list, update the GUI, and then continue. Where did I go wrong? The line that updates the list view is marked # Populate listview with drive contents. The print statements are just for debugging.
def populateList(self):
    print "populateList"
    sSource = self.txSource.Value
    sDest = self.txDest.Value
    # Re-initialize listview and validated list
    self.listView1.DeleteAllItems()
    self.validatedMove = None
    self.validatedMove = []
    # Create list of files
    listOfFiles = getList(sSource)
    # Prompt if no files detected
    if listOfFiles == []:
        self.lvActions.Append([datetime.datetime.now(), "Parse Source for .MP3 files", "No .MP3 files in source directory"])
    # Populate list after both Source and Dest are chosen
    if len(sSource) > 1 and len(sDest) > 1:
        print "-iterate listOfFiles"
        for file in listOfFiles:
            sFilename = os.path.basename(file)
            sTitle = getTitle(file)
            sArtist = getArtist(file)
            sAlbum = getAblum(file)
            # Make path = sDest + Artist + Album
            sDestDir = os.path.join(sDest, sArtist)
            sDestDir = os.path.join(sDestDir, sAlbum)
            # If the file already exists, change destination to *.copyX.mp3
            sDestDir = self.defineDestFilename(os.path.join(sDestDir, sFilename))
            # Populate listview with drive contents
            self.listView1.Append([sFilename, sTitle, sArtist, sAlbum, sDestDir])
            # Populate list to later use in move command
            self.validatedMove.append([file, sDestDir])
            print "-item added to SourceDest list"
    else:
        print "-list not iterated"
Create a worker thread/process that does your processing in the background and updates the GUI after the processing is done, maybe reporting progress during work.
Have a look at the threading or multiprocessing modules.
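A minimal sketch of that idea using the threading module and wx.CallAfter; the populateListAsync name and the split into a nested worker function are made up, and it reuses the getList, getTitle, getArtist and getAblum helpers from your code:

import os
import threading

import wx

def populateListAsync(self):
    sSource = self.txSource.Value

    def worker():
        # Runs in a background thread: scan and tag the files here.
        for file in getList(sSource):
            row = [os.path.basename(file), getTitle(file),
                   getArtist(file), getAblum(file)]
            # wx controls must only be touched from the GUI thread,
            # so schedule the Append there instead of calling it directly.
            wx.CallAfter(self.listView1.Append, row)

    threading.Thread(target=worker).start()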
This is a common problem with GUI programs. Controls don't get updated until a "repaint" command is received and processed, and that won't happen until your function returns.
You can force a control to repaint at any time by calling its Update method, as shown in the answer to this question: How do you force refresh of a wx.Panel?
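In your loop that would look roughly like the following; this is only a sketch of the repaint call, and it keeps all the work on the GUI thread, so the window may still feel sluggish:

# Inside the for loop, right after appending a row:
self.listView1.Append([sFilename, sTitle, sArtist, sAlbum, sDestDir])
self.listView1.Refresh()   # mark the control as needing a repaint
self.listView1.Update()    # process the repaint now instead of waiting for idle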
I might suggest you try wx.lib.delayedresult. It is essentially a simplified multithreading workaround. You can put your business logic into a worker function and the rest (including GUI appending and updating) into a consumer function. The worker function runs in another thread, while the consumer function is guaranteed to run in the main thread after the worker function has finished.
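A minimal sketch of that split, reusing the helpers from the question; the method names _populateWorker and _populateDone are made up for illustration:

import os
from wx.lib.delayedresult import startWorker

def populateList(self):
    # Kick off the slow part in a background thread; the GUI stays responsive.
    startWorker(self._populateDone, self._populateWorker)

def _populateWorker(self):
    # Runs in a worker thread: no GUI calls in here.
    rows = []
    for file in getList(self.txSource.Value):
        rows.append([os.path.basename(file), getTitle(file),
                     getArtist(file), getAblum(file)])
    return rows

def _populateDone(self, delayedResult):
    # Runs in the main (GUI) thread after the worker has finished.
    for row in delayedResult.get():
        self.listView1.Append(row)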