How do I read from a named pipe in Python 3.5.3?
The pipe is located at /tmp/shairport-sync-metadata; it can be viewed by anybody, but only modified by the shairport-sync user.
Other websites and questions say to use os.mkfifo("pipe-location") but I've been getting this error:
>>> import os
>>> os.mkfifo("/tmp/shairport-sync-metadata")
Traceback (most recent call last):
File "<pyshell#1>", line 1, in <module>
os.mkfifo("/tmp/shairport-sync-metadata")
FileExistsError: [Errno 17] File exists
Is there a way to get around this? Sorry for the nooby question.
os.mkfifo is used to create a FIFO. Use open to open and read a FIFO that already exists:
with open('/tmp/shairport-sync-metadata') as f:  # add `rb` for binary mode
    # line-by-line read
    for line in f:
        print(line)
    # f.read(1024)  # to read 1024 characters
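If you also want the script itself to be able to create the pipe when it is missing (a sketch of one option, not part of the original answer), you can simply ignore the FileExistsError:

import os

path = '/tmp/shairport-sync-metadata'

# Create the FIFO only if nothing exists at that path yet;
# shairport-sync normally creates it for us already.
try:
    os.mkfifo(path)
except FileExistsError:
    pass

# Opening a FIFO for reading blocks until a writer opens the other end.
with open(path, 'rb') as f:
    for line in f:
        print(line)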
I just tried to read a big JSON file (the Wikipedia JSON dump) in Python line by line and got this error:
Traceback (most recent call last):
  File "C:/.../test_json_wiki_file.py", line 19, in <module>
    test_fct()
  File "C:/.../test_json_wiki_file.py", line 12, in test_fct
    for line in f:
OSError: [Errno 9] Bad file descriptor
Here is my code:
import json

def test_fct():
    data = []
    i = 0
    with open('E:/.../20200713.json/20200713.json') as f:
        for line in f:
            data.append(json.loads(line))
            i = i + 1
            if i > 1:
                input_file.close()
    return data

test_data = test_fct()
The file size is around 700GB and the description (https://www.wikidata.org/wiki/Wikidata:Database_download) of the file states that it can be read line by line. I don't know if this is important but the E:/ hard drive is an external one.
Thank you for your help in advance :)
I don't have any firsthand knowledge of opening large files in Python, but did you mean to have the path as 20200713.json/20200713.json? Is the first one actually a directory that has a .json extension? I'd also suggest trying to load a smaller sample of the file first (opening it might be hard, so maybe just use the more command in a terminal?).
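As a sketch of the "smaller sample" idea (the path is a placeholder, and the one-entity-per-line layout of the dump is an assumption based on the Wikidata documentation, not something from the answer above), you could peek at just the first few lines without closing the file inside the loop:

import json
from itertools import islice

def read_first_entries(path, n=5):
    # Read at most n lines from a (possibly huge) file and try to parse each as JSON.
    entries = []
    with open(path, encoding='utf-8') as f:
        for line in islice(f, n):
            line = line.strip().rstrip(',')   # dump lines may end with a trailing comma
            if line in ('[', ']', ''):        # skip the surrounding JSON array brackets
                continue
            entries.append(json.loads(line))
    return entries

sample = read_first_entries('E:/20200713.json/20200713.json', n=3)  # hypothetical path
print(len(sample))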
I'm trying to load data from a pickled object into a list, but despite opening the file, I am receiving
Traceback (most recent call last):
File "/path/to/file.py", line 18, in <module>
data.append(pickle.load(file))
ValueError: peek of closed file
I assumed that I missed something in opening the file, but I looked and what I have seems fine to me (this is my first foray into I/O with pickle):
# load data to list
with open('tasks.txt', 'rb') as file:
    data = []
    while True:
        try:
            data.append(pickle.load(file))
        except EOFError:
            break
        file.close()
Am I handling the opening wrong, or is it something else?
You closed the file after the first load; remove the file.close() entirely (the with statement already handles that), and it should work fine.
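For reference, a corrected version of the snippet (same logic, just without the premature close) might look like this:

import pickle

# load data to list
data = []
with open('tasks.txt', 'rb') as file:
    while True:
        try:
            data.append(pickle.load(file))
        except EOFError:
            break
# the with block closes the file automatically once the loop is done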
I'm trying to access a .txt file in Python and I can't figure out how to open the file. I ended up copying the contents into a list directly, but I would like to know how to open a file for the future.
If I run this, nothing prints. I think it's because Python is looking in the wrong folder/directory, but I don't know how to change file paths.
sourcefile = open("CompletedDirectory.txt").read()
print(sourcefile)
The file CompletedDirectory.txt is probably empty.
If Python could not find the file, you would get a FileNotFoundError exception:
>>> sourcefile = open("CompletedDirectory.txt").read()
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
FileNotFoundError: [Errno 2] No such file or directory: 'CompletedDirectory.txt'
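A quick way to test the "empty file" theory (just a sketch; the file name is the one from the question) is to check the size on disk:

import os

# 0 means the file really is empty, so read() correctly returns ''
print(os.path.getsize("CompletedDirectory.txt"))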
Note that using read() in this way is not recommended. You're not closing the file properly. Use a context manager:
with open("CompletedDirectory.txt") as infile:
sourcefile = infile.read()
This will automatically close infile on leaving the with block.
You can get the current working directory:
import os
os.getcwd()
Then join it with the directory that contains the file:
os.path.join("targetDir", "fileName")
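Putting the two pieces together (the directory and file names here are only placeholders), a sketch might look like this:

import os

print(os.getcwd())  # shows where Python resolves relative paths from

# Build an absolute path from the working directory and the file name
path = os.path.join(os.getcwd(), "CompletedDirectory.txt")

with open(path) as infile:
    sourcefile = infile.read()
print(sourcefile)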
I'm trying to read a dataset and collect meta-features from it.
I get the following error after executing the Python file.
Traceback (most recent call last):
  File "runmeta.py", line 79, in <module>
    np.savetxt('datasets/'+str(i)+'/metafeatures',meta[i],delimiter=',')
  File "/usr/lib/python2.7/dist-packages/numpy/lib/npyio.py", line 940, in savetxt
    fh = open(fname, 'w')
IOError: [Errno 2] No such file or directory: 'datasets/2/metafeatures'
The error you're getting is simply telling you that Python could not find that path. Since the file is being opened for writing ('w'), the most likely cause is that the datasets/2/ directory does not exist. I would suggest looking into absolute and relative file paths.
Advice on error handling: the error is triggered on this line:
fh = open(fname, 'w')
So as you debug your program, look at the line Python shows you. Maybe change the variable fname; that is where I would start. Currently:
fname = 'datasets/2/metafeatures'
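One way to rule out a missing output directory (a guess about the cause; the helper name is mine, not from the original script) is to create the directory before saving:

import os
import numpy as np

def save_metafeatures(i, features):
    # np.savetxt will not create missing directories, so create them first
    out_dir = 'datasets/' + str(i)
    if not os.path.isdir(out_dir):
        os.makedirs(out_dir)
    np.savetxt(os.path.join(out_dir, 'metafeatures'), features, delimiter=',')

# e.g. inside the original loop: save_metafeatures(i, meta[i])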
In Python I'm seeing evidence that fp.readlines() is closing the file when I try to access fp later in the program. Can you confirm this behavior? Do I need to re-open the file later if I also want to read from it again?
The question "Is the file closed?" is similar, but didn't answer all of my questions.
import sys

def lines(fp):
    print str(len(fp.readlines()))

def main():
    sent_file = open(sys.argv[1], "r")
    lines(sent_file)
    for line in sent_file:
        print line
This returns:
20
Once you have read a file, the file pointer has been moved to the end and no more lines will be 'found' beyond that point.
Re-open the file or seek back to the start:
sent_file.seek(0)
Your file is not closed; a closed file raises an exception when you attempt to access it:
>>> fileobj = open('names.txt')
>>> fileobj.close()
>>> fileobj.read()
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ValueError: I/O operation on closed file
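To make the original example print the lines after counting them, a sketch along these lines (keeping the question's Python 2 print syntax) would be:

import sys

def lines(fp):
    print str(len(fp.readlines()))

def main():
    sent_file = open(sys.argv[1], "r")
    lines(sent_file)       # consumes the whole file
    sent_file.seek(0)      # rewind so the file can be read again
    for line in sent_file:
        print line
    sent_file.close()

main()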
It doesn't close the file, but it does read the lines in it so they cannot be read again without reopening the file or setting the file pointer back to the beginning with fp.seek(0).
As evidence that it doesn't close the file, try changing the function to actually close the file:
def lines(fp):
    print str(len(fp.readlines()))
    fp.close()
You will get the error:
Traceback (most recent call last):
  File "test5.py", line 16, in <module>
    main()
  File "test5.py", line 12, in main
    for line in sent_file:
ValueError: I/O operation on closed file
It won't be closed, but the file will be at the end. If you want to read its contents a second time, then consider using:
f.seek(0)
You may want to use the with statement and context manager:
>>> with open('data.txt', 'w+') as my_file:  # This will always ensure
...     my_file.write('TEST\n')              # that the file is closed.
...     my_file.seek(0)
...     my_file.read()
...
'TEST'
If you use a normal call, remember to close it manually (in theory, Python closes file objects and garbage collects them as needed):
>>> my_file = open('data.txt', 'w+')
>>> my_file.write('TEST\n') # 'del my_file' should close it and garbage collect it
>>> my_file.seek(0)
>>> my_file.read()
'TEST'
>>> my_file.close() # Makes sure to flush buffers to disk