"Peek of a closed file" right after file is opened - python

I'm trying to load data from a pickled object into a list, but despite opening the file, I am receiving
Traceback (most recent call last):
File "/path/to/file.py", line 18, in <module>
data.append(pickle.load(file))
ValueError: peek of closed file
I assumed I had missed something when opening the file, but what I have looks fine to me (this is my first foray into I/O with pickle):
# load data to list
with open('tasks.txt', 'rb') as file:
    data = []
    while True:
        try:
            data.append(pickle.load(file))
        except EOFError:
            break
        file.close()
Am I handling the opening wrong, or is it something else?

You closed the file after the first load; remove the file.close() entirely (the with statement already handles that), and it should work fine.
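For reference, a minimal sketch of the corrected loop (assuming tasks.txt was written by repeated pickle.dump calls):

import pickle

# load data to list
data = []
with open('tasks.txt', 'rb') as file:
    while True:
        try:
            data.append(pickle.load(file))
        except EOFError:
            break
# no explicit file.close(): the with block closes the file on exit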

Related

How do you open text files in python3? [duplicate]

file = open('python-ai-info.txt', 'r')
words = file.read()
file.close()
Is this correct? Please help me fix it. I want to open a file named python-ai-info.txt. I am currently getting this error:
Traceback (most recent call last):
File "C:\Users\sgcoder1337\AppData\Local\Programs\Python\Python38-32\rhyme ai.py", line 1, in <module>
file = open('python-ai-info.txt', 'r')
FileNotFoundError: [Errno 2] No such file or directory: 'python-ai-info.txt'
Your code is correct; the error occurs because the file isn't where Python is looking for it (the current working directory). Try using the complete path, like:
'C:/Users/Username/Desktop/Folder/filename.txt'
For opening a file:
file = open(path, mode)
The main modes are:
"r" for reading text, throws an exception if the file doesn't exist
"rb" for reading bytes, throws an exception if the file doesn't exist
"w" for writing text, doesn't throw an exception if the file doesn't exist, but creates it
"wb" for writing bytes, doesn't throw an exception if the file doesn't exist, but creates it
"a" for appending text, throws an exception if the file doesn't exist
"ab" for appending bytes, throws an exception if the file doesn't exist
"x" for creating files without opening, throws an exception if the file already exist
For reading files:
txt = file.read() # get the whole text
line = file.readline() # get a line
lines = file.readlines() # get a list of lines
For writing text:
file.write('txt')
For closing:
file.close()
For more detailed and complete info have a look here
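If python-ai-info.txt lives next to the script rather than in whatever directory Python was launched from, one approach (a sketch using pathlib) is to build the path from the script's own location:

from pathlib import Path

# build an absolute path relative to this script's directory, so the
# file is found no matter where the interpreter is launched from
path = Path(__file__).resolve().parent / 'python-ai-info.txt'

with open(path, 'r') as file:
    words = file.read()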

Python get data from named pipe

How do I read from a named pipe in Python 3.5.3?
The pipe's name and location is /tmp/shairport-sync-metadata and it can be viewed by anybody, but only modified by the shairport-sync user.
Other websites and questions say to use os.mkfifo("pipe-location") but I've been getting this error:
>>> import os
>>> os.mkfifo("/tmp/shairport-sync-metadata")
Traceback (most recent call last):
File "<pyshell#1>", line 1, in <module>
os.mkfifo("/tmp/shairport-sync-metadata")
FileExistsError: [Errno 17] File exists
Is there a way to get around this? Sorry for the nooby question.
os.mkfifo is used to create a FIFO. Use open to read a FIFO that already exists:
with open('/tmp/shairport-sync-metadata') as f:  # add `rb` for binary mode
    # line-by-line read
    for line in f:
        print(line)
    # f.read(1024)  # to read 1024 characters
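If you do need to create the FIFO yourself when it is missing, a minimal sketch is to treat the FileExistsError as harmless and then open the pipe for reading:

import os

fifo_path = '/tmp/shairport-sync-metadata'

try:
    os.mkfifo(fifo_path)   # create the FIFO only if it doesn't exist yet
except FileExistsError:
    pass                   # shairport-sync already created it

with open(fifo_path, 'rb') as f:  # 'rb' for binary mode, as suggested above
    for line in f:
        print(line)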

CSV file gives error on reading with open() in python 2.7

import unicodecsv

engagement_file = r'G:\college\udacity\intro to data analitics\datasets\daily_engagement.csv'
enrollment_file = r'G:\college\udacity\intro to data analitics\datasets\enrollments.csv'
project_submissions_file = r'G:\college\udacity\intro to data analitics\datasets\project_submissions.csv'

def csv_to_list(csv_file):
    with open(csv_file, 'rb') as f:
        reader = unicodecsv.DictReader(f)
    return list(reader)

daily_engagement = csv_to_list(engagement_file)
enrollment = csv_to_list(enrollment_file)
project_submissions = csv_to_list(project_submissions_file)
On executing this piece of code I get the following error:
Traceback (most recent call last):
File "G:\college\udacity\intro to data analitics\data_analytis_csv_to_list.py", line 10, in <module>
daily_engagement=csv_to_list(engagement_file)
File "G:\college\udacity\intro to data analitics\data_analytis_csv_to_list.py", line 8, in csv_to_list
return list(reader)
File "C:\ProgramData\Anaconda2\lib\site-packages\unicodecsv\py2.py", line 217, in next
row = csv.DictReader.next(self)
File "C:\ProgramData\Anaconda2\lib\csv.py", line 108, in next
row = self.reader.next()
File "C:\ProgramData\Anaconda2\lib\site-packages\unicodecsv\py2.py", line 117, in next
row = self.reader.next()
ValueError: I/O operation on closed file
I don't know how to solve it; I'm new to Python.
Thanks in advance.
When using with open() as f: in Python, the file f is only open inside the with block. That is the point of using it: it provides automatic file closing and cleanup in an easy and readable way.
If you want to work on the file, either open it without the with statement (a plain open call) or do the operations on the file inside the block, referring to it directly as f.
You need to move your return inside your with statement. Once control flow has left the with statement, Python automatically closes the file for you. That means any file I/O you have to do needs to happen under the context manager:
def csv_to_list(csv_file):
    with open(csv_file, 'rb') as f:
        reader = unicodecsv.DictReader(f)
        return list(reader)  # consume the reader while the file is still open

Multiple File Stream Issue

In my project, I am using 3 files throughout the whole process. The source file (.ada), a "three address code" file (.TAC), and my own temporary file for use during processing (.TACTMP).
in Caller.py:
TACFILE = open(str(sys.argv[1])[:-4] + ".TAC", 'w') # line 17
# Does a bunch of stuff
TACFILE.close() # line 653
# the below function is imported from Called.py
post_process_file_handler() # line 654
in Called.py:
TAC_FILE_NAME = str(sys.argv[1])[:-4] # line 6
TAC_lines = open(TAC_FILE_NAME + ".TAC", 'r').readlines() # line 7
If I try to run my program without a .TAC file already existing (even a blank one), I get the following error:
Traceback (most recent call last):
File "Caller.py", line 8, in <module>
from Called import post_process_file_handler
File "Called.py", line 7, in <module>
TAC_lines = file(TAC_FILE_NAME + ".TAC", 'r').readlines()
IOError: [Errno 2] No such file or directory: 'test76.TAC'
Why would this be happening? This error is being thrown even if I put a breakpoint at the beginning of Caller.py, well before the post_process_file_handler() function ever gets called.
For clarity: test76.TAC should be generated by Caller.py, and then Called.py should open that file to process it further; for some reason that isn't happening.
This may be specific to my case, but I found out the issue is due to the order and manner in which I was using these streams.
In short, when the import line was encountered:
from Called import post_process_file_handler
Python executed the top-level code of Called.py, and since the file is opened into a global variable there, it was opened before Caller.py had a chance to create the .TAC file it would read from.
Moving the import line to just before I use the function fixed my issue, as nothing in Called.py is initialized until after Caller.py is done doing its work.
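A sketch of the same fix from the other direction, assuming Called.py only needs the file inside post_process_file_handler: keep the import at the top of Caller.py, but move the open out of module level so importing Called.py has no side effects:

# Called.py
import sys

TAC_FILE_NAME = str(sys.argv[1])[:-4]

def post_process_file_handler():
    # the .TAC file is opened only when the function is called,
    # i.e. after Caller.py has written and closed it
    with open(TAC_FILE_NAME + ".TAC", 'r') as tac_file:
        TAC_lines = tac_file.readlines()
    # ... process TAC_lines ...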

Does fp.readlines() close a file?

In Python, I'm seeing evidence that fp.readlines() is closing the file when I try to access fp later in the program. Can you confirm this behavior? Do I need to re-open the file later if I want to read from it again?
Is the file closed? is similar, but didn't answer all of my questions.
import sys

def lines(fp):
    print str(len(fp.readlines()))

def main():
    sent_file = open(sys.argv[1], "r")
    lines(sent_file)
    for line in sent_file:
        print line
this returns:
20
Once you have read a file, the file pointer has been moved to the end and no more lines will be 'found' beyond that point.
Re-open the file or seek back to the start:
sent_file.seek(0)
Your file is not closed; a closed file raises an exception when you attempt to access it:
>>> fileobj = open('names.txt')
>>> fileobj.close()
>>> fileobj.read()
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ValueError: I/O operation on closed file
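Putting the seek(0) advice into the question's main(), a minimal sketch (same Python 2 style as the question):

def main():
    sent_file = open(sys.argv[1], "r")
    lines(sent_file)     # readlines() leaves the file pointer at the end
    sent_file.seek(0)    # rewind so the same file object can be read again
    for line in sent_file:
        print line
    sent_file.close()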
It doesn't close the file, but it does read the lines in it so they cannot be read again without reopening the file or setting the file pointer back to the beginning with fp.seek(0).
As evidence that it doesn't close the file, try changing the function to actually close the file:
def lines(fp):
    print str(len(fp.readlines()))
    fp.close()
You will get the error:
Traceback (most recent call last):
File "test5.py", line 16, in <module>
main()
File "test5.py", line 12, in main
for line in sent_file:
ValueError: I/O operation on closed file
It won't be closed, but the file will be at the end. If you want to read its contents a second time then consider using
f.seek(0)
You may want to use the with statement and context manager:
>>> with open('data.txt', 'w+') as my_file:  # This will always ensure
...     my_file.write('TEST\n')              # that the file is closed.
...     my_file.seek(0)
...     my_file.read()
...
'TEST'
If you use a plain open() call instead, remember to close the file manually (in theory Python closes file objects and garbage-collects them as needed):
>>> my_file = open('data.txt', 'w+')
>>> my_file.write('TEST\n') # 'del my_file' should close it and garbage collect it
>>> my_file.seek(0)
>>> my_file.read()
'TEST'
>>> my_file.close()  # Makes sure buffers are flushed to disk
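If you can't use with but don't want to rely on garbage collection, a try/finally block gives the same closing guarantee (a minimal sketch using the same data.txt):

my_file = open('data.txt', 'w+')
try:
    my_file.write('TEST\n')
    my_file.seek(0)
    print(my_file.read())
finally:
    my_file.close()  # runs even if an exception was raised above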
