I'm trying to send a file over a socket. Everything seems to be working except that the file isn't being written properly. I've trimmed the code down to the problem area, but I can post the full server and client code if necessary.
if inst == "send":
try:
print ("Receiving...")
l = s.recv(1024)
with open('torecv.py', 'wb') as f:
print ("Writing...")
newFile = l.decode("UTF-8")
f.write(newFile)
f.close()
print ("Done Receiving")
except:
pass
The output returns:
Receiving...
Writing...
and newFile holds the correct data, which tells me the code is running; f.write seems to be the problem, because "torecv.py" ends up empty.
I'm pretty new to Python, so go easy on me. Thanks!
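(For what it's worth, a minimal sketch of one possible fix, assuming the goal is to write the received bytes as-is: the file is opened in binary mode, so writing the decoded str raises a TypeError, and the bare except silently swallows it.)

# sketch only: write the raw bytes instead of decoding them,
# and avoid a bare except that hides the real error
if inst == "send":
    print("Receiving...")
    data = s.recv(1024)           # bytes received from the socket
    with open('torecv.py', 'wb') as f:
        print("Writing...")
        f.write(data)             # no decode, no manual close needed
    print("Done Receiving")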
What I need to do is write some messages to a .txt file, close it, and send it to a server. This happens in an infinite loop, so the code looks more or less like this:
import time

import requests
from requests_toolbelt.multipart.encoder import MultipartEncoder

num = 0
while True:
    num += 1
    filename = f"example{num}.txt"
    with open(filename, "w") as f:
        f.write("Hello")
        f.close()
    mp_encoder = MultipartEncoder(
        fields={
            'file': ("file", open(filename, 'rb'), 'text/plain')
        }
    )
    r = requests.post("my_url/save_file", data=mp_encoder, headers=my_headers)
    time.sleep(10)
The post works if the file is created manually inside my working directory, but if I try to create it and write to it through code, I receive this response message:
500 - Internal Server Error
System.IO.IOException: Unexpected end of Stream, the content may have already been read by another component.
I don't see the file appearing in the PyCharm project window either. I even added time.sleep(10) because at first I thought it could be a timing problem, but that didn't solve it. In fact, the file only appears in my working directory once I stop the program, so it seems the file is still held by the program even after I explicitly call f.close(). I know the with statement should take care of closing files, but it didn't look like it was, so I added close() to see whether that was the problem (spoiler: it was not).
I solved the problem by using another file:
with open(filename, "r") as firstfile, open("new.txt", "a+") as secondfile:
secondfile.write(firstfile.read())
with open(filename, 'w'):
pass
r = requests.post("my_url/save_file", data=mp_encoder, headers=my_headers)
if r.status_code == requests.codes.ok:
os.remove("new.txt")
else:
print("File not saved")
I make a copy of the file, empty the original file to save space, and send the copy to the server (and then delete the copy). It looks like the problem was that the original file was being held open by the Python logging module.
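If the file really is being kept open by a logging FileHandler, another possible approach is to detach and close that handler before uploading; a minimal sketch, assuming the file was attached via logging.FileHandler (the logger name here is hypothetical):

import logging

logger = logging.getLogger("my_app")          # hypothetical logger name
for handler in list(logger.handlers):
    if isinstance(handler, logging.FileHandler) and handler.baseFilename.endswith(filename):
        logger.removeHandler(handler)
        handler.close()                       # releases the OS file handle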
Firstly, change open(f, 'rb') to open("example.txt", 'rb'): open() should be given a file name, not a closed file object.
Also, you can use os.path.abspath to print the working directory and confirm where the file is being written:
import os
os.path.abspath('.')
Third, when you use a with context manager to open a file, you don't need to close it yourself; the context manager is supposed to do that:
with open("example.txt", "w") as f:
f.write("Hello")
I am downloading dynamic data from an API server and writing it to a file in an endless loop in Python. For whatever reason the program stops writing to the file after a couple of thousand lines, while the program itself seems to keep running. I am not sure where the problem is. The program does not raise an error, i.e. it is not that the API server is refusing to respond. When restarted it continues as planned. I am using Python 3.6 and Windows 10.
A simplified version of the code looks something like this:
import requests, json, time

while True:
    try:
        r = requests.get('https://someapiserver.com/data/')
        line = r.json()
        with open('file.txt', 'a') as f:
            f.write(line)
        time.sleep(5)
    except:
        print('error')
        time.sleep(10)
Try opening the file once up front and keeping it open, like so:
import requests, json, time

f = open('file.txt', 'a')
while True:
    try:
        r = requests.get('https://someapiserver.com/data/')
        line = r.json()
        f.write(line)
        f.flush()
        time.sleep(5)
    except:
        print('error')
        time.sleep(10)
f.close()  # remember to close the file
The solution is ugly but it will do.
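As a side note, r.json() returns a parsed object (typically a dict or list), and f.write() only accepts a string, so that write would raise a TypeError which the bare except reports only as 'error'. A possible variation that keeps the with block per iteration and serializes the payload first (a sketch only, against the same hypothetical endpoint):

import json
import time

import requests

while True:
    try:
        r = requests.get('https://someapiserver.com/data/')
        payload = r.json()                       # parsed JSON (dict or list)
        with open('file.txt', 'a') as f:
            f.write(json.dumps(payload) + '\n')  # serialize before writing
        time.sleep(5)
    except requests.RequestException as exc:     # don't hide unrelated bugs
        print('error:', exc)
        time.sleep(10)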
I'm trying to handle exceptions when reading files, but I have a problem. I'm new to Python, and I don't yet know how to catch an exception and still continue reading text from the other files I am accessing. This is my code:
import errno
import sys

class Read:
    # FIXME: make these two constants immutable
    ROUTE = "d:\\Profiles\\user\\Desktop\\"
    EXT = ".txt"

    def setFileReaded(self, fileToRead):
        content = ""
        try:
            infile = open(self.ROUTE + fileToRead + self.EXT)
        except FileNotFoundError as error:
            if error.errno == errno.ENOENT:
                print("File not found, please check the name and try again")
            else:
                raise
            sys.exit()
        with infile:
            content = infile.read()
            infile.close()
        return content
And from another class I call it like this:
read = Read()
print(read.setFileReaded("verbs"))
print(read.setFileReaded("object"))
print(read.setFileReaded("sites"))
print(read.setFileReaded("texts"))
But it only prints this:
turn on
connect
plug
File not found, please check the name and try again
and doesn't continue with the next files. How can the program keep reading all the files?
It's a little difficult to understand exactly what you're asking here, but I'll try to provide some pointers.
sys.exit() will terminate the Python script gracefully. In your code, this is called when the FileNotFoundError exception is caught. Nothing further will be run after this, because your script will terminate. So none of the other files will be read.
Another thing to point out is that you close the file after reading it, which is not needed when you open it like this:
with open('myfile.txt') as f:
    content = f.read()
The file will be closed automatically after the with block.
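To keep reading the remaining files, one option is to catch the exception and return (or skip) instead of calling sys.exit(); a minimal sketch, reusing the names from the question:

def setFileReaded(self, fileToRead):
    try:
        with open(self.ROUTE + fileToRead + self.EXT) as infile:
            return infile.read()
    except FileNotFoundError:
        print("File not found, please check the name and try again")
        return ""   # let the caller continue with the next file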
The following is a snippet of my program:
with open(completepathname, 'wb') as f:
    print('file opened')
    print(f)
    while True:
        print('receiving data...')
        print('hello')
        data = send_receive_protocol.recv_msg(conn)
        #print('hi')
        print(data)
        if not data:
            print('printing1')
            break
        print('printing')
        data1 = data.encode()
        print(data1)
        f.write(data1)  # write to file
        f.close()
The output prints correctly to the console, but if I go and open the file it's blank. If I delete the file and run my program again, the file is created but is still empty.
The following snippet works as expected:
with open('test.txt', 'wb') as f:
    while True:
        data = 'test-data'
        if not data:
            break
        data1 = data.encode()
        f.write(data1)
        f.flush()
You should be careful not to .close() a file-like object and then continue to attempt to write to it. If you need to ensure the output is written right away, use .flush().
I am using Twisted and Python to send a large text file over the network, and I would like to use UDP and multicast. What would be the best way to do this? I would appreciate sample code because I am already confused. When I try it, I get error 24 from Python, which says too many open files. Could you help me resolve this issue?
Here is part of my code:
if (options.upt != None):
    print "UPGRADE is initiating"
    sourceFile = open(options.upt, "r")
    reactor.listenMulticast(1888, UpgradeReciever("Control Listener"), listenMultiple=True)
    #with open(options.upt) as sourceFile:
    for line in sourceFile:
        upgradeSenderObj = UpgradeSender(line, "224.0.0.8", 1888)
        reactor.listenMulticast(1888, upgradeSenderObj, listenMultiple=True)
    reactor.run()
I have also tried reading the whole file into a list and then sending each element of the list (which are in fact the lines of my file) through Twisted, but I still get a similar problem. Here is my updated code:
if (options.upt != None):
    print "UPGRADE is initiating"
    sourceFile = open(options.upt, "r")
    reactor.listenMulticast(1888, UpgradeReciever("Control Listener"), listenMultiple=True)
    dataContainer = list(sourceFile)
    print dataContainer
    for i in range(len(dataContainer)):
        upgradeSenderObj = UpgradeSender(dataContainer[i], "224.0.0.8", 1888)
        reactor.listenMulticast(1888, upgradeSenderObj, listenMultiple=True)
    reactor.run()
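(For what it's worth, the repeated reactor.listenMulticast() call inside the loop opens a new socket for every line, which is what exhausts the file descriptors and produces error 24. A minimal sketch of one possible restructuring that sends all lines over a single multicast socket; the UpgradeFileSender class and its parameters are hypothetical, not part of the original program:)

from twisted.internet import reactor
from twisted.internet.protocol import DatagramProtocol

class UpgradeFileSender(DatagramProtocol):
    """Joins the multicast group once and sends every line over one socket."""

    def __init__(self, path, group="224.0.0.8", port=1888):
        self.path = path
        self.group = group
        self.port = port

    def startProtocol(self):
        self.transport.joinGroup(self.group)
        with open(self.path, "rb") as sourceFile:
            for line in sourceFile:
                self.transport.write(line, (self.group, self.port))

if options.upt is not None:
    reactor.listenMulticast(1888, UpgradeReciever("Control Listener"), listenMultiple=True)
    reactor.listenMulticast(1888, UpgradeFileSender(options.upt), listenMultiple=True)
    reactor.run()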