My Python 2 script uploads files nicely using this method, but Python 3 is presenting problems and I'm stuck as to where to go next (googling hasn't helped).
from ftplib import FTP
ftp = FTP(ftp_host, ftp_user, ftp_pass)
ftp.storbinary('STOR myfile.txt', open('myfile.txt'))
The error I get is
Traceback (most recent call last):
File "/Library/WebServer/CGI-Executables/rob3/functions/cli_f.py", line 12, in upload
ftp.storlines('STOR myfile.txt', open('myfile.txt'))
File "/Library/Frameworks/Python.framework/Versions/3.1/lib/python3.1/ftplib.py", line 454, in storbinary
conn.sendall(buf)
TypeError: must be bytes or buffer, not str
I tried altering the code to
from ftplib import FTP
ftp = FTP(ftp_host, ftp_user, ftp_pass)
ftp.storbinary('STOR myfile.txt'.encode('utf-8'), open('myfile.txt'))
But instead I got this
Traceback (most recent call last):
File "/Library/WebServer/CGI-Executables/rob3/functions/cli_f.py", line 12, in upload
ftp.storbinary('STOR myfile.txt'.encode('utf-8'), open('myfile.txt'))
File "/Library/Frameworks/Python.framework/Versions/3.1/lib/python3.1/ftplib.py", line 450, in storbinary
conn = self.transfercmd(cmd)
File "/Library/Frameworks/Python.framework/Versions/3.1/lib/python3.1/ftplib.py", line 358, in transfercmd
return self.ntransfercmd(cmd, rest)[0]
File "/Library/Frameworks/Python.framework/Versions/3.1/lib/python3.1/ftplib.py", line 329, in ntransfercmd
resp = self.sendcmd(cmd)
File "/Library/Frameworks/Python.framework/Versions/3.1/lib/python3.1/ftplib.py", line 244, in sendcmd
self.putcmd(cmd)
File "/Library/Frameworks/Python.framework/Versions/3.1/lib/python3.1/ftplib.py", line 179, in putcmd
self.putline(line)
File "/Library/Frameworks/Python.framework/Versions/3.1/lib/python3.1/ftplib.py", line 172, in putline
line = line + CRLF
TypeError: can't concat bytes to str
Can anybody point me in the right direction?
The issue is not with the command argument but with the file object. Since you're storing binary data, you need to open the file with the 'rb' flag:
>>> ftp.storbinary('STOR myfile.txt', open('myfile.txt', 'rb'))
'226 File receive OK.'
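Putting it together, a minimal sketch of the corrected upload, assuming the same ftp_host, ftp_user and ftp_pass variables as in the question:
from ftplib import FTP

ftp = FTP(ftp_host, ftp_user, ftp_pass)
# Open the local file in binary mode so storbinary sends bytes, not str
with open('myfile.txt', 'rb') as f:
    ftp.storbinary('STOR myfile.txt', f)
ftp.quit()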
APPEND to a file in FTP.
Note: this is plain FTP only, not SFTP.
import ftplib

ftp = ftplib.FTP('localhost')
ftp.login('user', 'password')
# Open in binary mode; storbinary expects a file object that yields bytes
fin = open('foo.txt', 'rb')
# APPE appends to the remote foo2.txt instead of replacing it; 1 is the blocksize
ftp.storbinary('APPE foo2.txt', fin, 1)
fin.close()
Ref: Thanks to Noah
Related
I am building a test upload API to check Dropbox for my final project, but I am getting this error when I run the Python file in cmd:
Traceback (most recent call last):
File "C:\Users\sufiy\Desktop\test.py", line 7, in <module>
dbx.files_upload(file_contents, '/testdropbox.txt', mode=dropbox.files.WriteMode.overwrite)
File "C:\Users\sufiy\AppData\Local\Programs\Python\Python311\Lib\site-packages\dropbox\base.py", line 3210, in files_upload
r = self.request(
^^^^^^^^^^^^^
File "C:\Users\sufiy\AppData\Local\Programs\Python\Python311\Lib\site-packages\dropbox\dropbox_client.py", line 326, in request
res = self.request_json_string_with_retry(host,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\sufiy\AppData\Local\Programs\Python\Python311\Lib\site-packages\dropbox\dropbox_client.py", line 476, in request_json_string_with_retry
return self.request_json_string(host,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\sufiy\AppData\Local\Programs\Python\Python311\Lib\site-packages\dropbox\dropbox_client.py", line 538, in request_json_string
raise TypeError('expected request_binary as binary type, got %s' %
TypeError: expected request_binary as binary type, got <class 'str'>
Here is my code:
import dropbox
dbx = dropbox.Dropbox('my api key')
with open('testdropbox.txt', 'r') as f:
    file_contents = f.read()
dbx.files_upload(file_contents, '/testdropbox.txt', mode=dropbox.files.WriteMode('overwrite'))
I am trying to build a program that overwrites a txt file every minute, and I want it to work so I can schedule it with Windows Task Scheduler.
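The traceback says files_upload received a str where it expects bytes. A minimal sketch of the likely fix, assuming the same token and file name as in the question, is to read the file in binary mode:
import dropbox

dbx = dropbox.Dropbox('my api key')
# Read in binary mode so files_upload receives bytes, not str
with open('testdropbox.txt', 'rb') as f:
    file_contents = f.read()
dbx.files_upload(file_contents, '/testdropbox.txt', mode=dropbox.files.WriteMode('overwrite'))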
I need to decompress "H4sIAAAAAAAA/6tWKkktLjFUsjI00lEAs42UrCAMpVoAbyLr+R0AAAA=", which is actually the compressed form of {"test1":12, "test2": "test"}. In Python I'm using the gzip library and getting the response below:
>>> import gzip
>>> gzip.decompress("H4sIAAAAAAAA/6tWKkktLjFUsjI00lEAs42UrCAMpVoAbyLr+R0AAAA=".encode("UTF-8"))
Traceback (most recent call last):
File "<console>", line 1, in <module>
File "/data/python-3.8.10/lib/python3.8/gzip.py", line 548, in decompress
return f.read()
File "/data/python-3.8.10/lib/python3.8/gzip.py", line 292, in read
return self._buffer.read(size)
File "/data/python-3.8.10/lib/python3.8/gzip.py", line 479, in read
if not self._read_gzip_header():
File "/data/python-3.8.10/lib/python3.8/gzip.py", line 427, in _read_gzip_header
raise BadGzipFile('Not a gzipped file (%r)' % magic)
gzip.BadGzipFile: Not a gzipped file (b'H4')
Is there any way to decompress the string in Python?
The string is Base64 encoded, so decode it first and then gunzip the result:
import gzip
import base64
b = base64.b64decode('H4sIAAAAAAAA/6tWKkktLjFUsjI00lEAs42UrCAMpVoAbyLr+R0AAAA=')
r = gzip.decompress(b)
print(r.decode())
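If you need the original JSON object rather than the raw text, a small follow-up sketch (reusing r from the snippet above) is:
import json

# Parse the decompressed text back into a Python dict
data = json.loads(r.decode())
print(data["test1"], data["test2"])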
I am trying to parse a GFF file with Biopython, and am using the sample code from their website. This is the code:
from BCBio import GFF
in_file = "infile.gff"
in_handle = open(in_file)
for rec in GFF.parse(in_handle):
    print(rec)
in_handle.close()
When I run the code I get the following error:
Traceback (most recent call last):
File "/Users/juliofdiaz/anaconda2/envs/python37/lib/python3.7/site-packages/Bio/SeqIO/Interfaces.py", line 47, in __init__
self.stream = open(source, "r" + mode)
TypeError: expected str, bytes or os.PathLike object, not FakeHandle
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "get_genes_dpt.py", line 37, in <module>
for rec in GFF.parse(in_handle):
File "/Users/juliofdiaz/anaconda2/envs/python37/lib/python3.7/site-packages/BCBio/GFF/GFFParser.py", line 746, in parse
target_lines):
File "/Users/juliofdiaz/anaconda2/envs/python37/lib/python3.7/site-packages/BCBio/GFF/GFFParser.py", line 322, in parse_in_parts
for results in self.parse_simple(gff_files, limit_info, target_lines):
File "/Users/juliofdiaz/anaconda2/envs/python37/lib/python3.7/site-packages/BCBio/GFF/GFFParser.py", line 343, in parse_simple
for results in self._gff_process(gff_files, limit_info, target_lines):
File "/Users/juliofdiaz/anaconda2/envs/python37/lib/python3.7/site-packages/BCBio/GFF/GFFParser.py", line 637, in _gff_process
for out in self._lines_to_out_info(line_gen, limit_info, target_lines):
File "/Users/juliofdiaz/anaconda2/envs/python37/lib/python3.7/site-packages/BCBio/GFF/GFFParser.py", line 699, in _lines_to_out_info
fasta_recs = self._parse_fasta(FakeHandle(line_iter))
File "/Users/juliofdiaz/anaconda2/envs/python37/lib/python3.7/site-packages/BCBio/GFF/GFFParser.py", line 560, in _parse_fasta
return list(SeqIO.parse(in_handle, "fasta"))
File "/Users/juliofdiaz/anaconda2/envs/python37/lib/python3.7/site-packages/Bio/SeqIO/__init__.py", line 607, in parse
return iterator_generator(handle)
File "/Users/juliofdiaz/anaconda2/envs/python37/lib/python3.7/site-packages/Bio/SeqIO/FastaIO.py", line 183, in __init__
super().__init__(source, mode="t", fmt="Fasta")
File "/Users/juliofdiaz/anaconda2/envs/python37/lib/python3.7/site-packages/Bio/SeqIO/Interfaces.py", line 51, in __init__
if source.read(0) != "":
TypeError: read() takes 1 positional argument but 2 were given
I am not sure how to fix the error, since it seems a FakeHandle is being passed where a str is expected. I am running Biopython 1.78 with conda.
I'm having trouble figuring out why this file, whose contents are "DELETE ME LATER" and which is opened with encoding utf-8, causes an exception in botocore when it's being hashed.
with io.open('deleteme', 'r', encoding='utf-8') as f:
    try:
        resp = client.put_object(
            Body=f,
            Bucket='s3-bucket-actual-name-for-real',
            Key='testing/a/put'
        )
        print('deleteme exists')
        print(resp)
    except:
        print('deleteme could not put')
        raise
Produces:
deleteme could not put
Traceback (most recent call last):
File "./test_operator.py", line 41, in
Key='testing/a/put'
File "/Users/lamblin/VEnvs/awscli/lib/python3.6/site-packages/botocore/client.py", line 312, in _api_call
return self._make_api_call(operation_name, kwargs)
File "/Users/lamblin/VEnvs/awscli/lib/python3.6/site-packages/botocore/client.py", line 582, in _make_api_call
request_signer=self._request_signer, context=request_context)
File "/Users/lamblin/VEnvs/awscli/lib/python3.6/site-packages/botocore/hooks.py", line 242, in emit_until_response
responses = self._emit(event_name, kwargs, stop_on_response=True)
File "/Users/lamblin/VEnvs/awscli/lib/python3.6/site-packages/botocore/hooks.py", line 210, in _emit
response = handler(**kwargs)
File "/Users/lamblin/VEnvs/awscli/lib/python3.6/site-packages/botocore/handlers.py", line 201, in conditionally_calculate_md5
calculate_md5(params, **kwargs)
File "/Users/lamblin/VEnvs/awscli/lib/python3.6/site-packages/botocore/handlers.py", line 179, in calculate_md5
binary_md5 = _calculate_md5_from_file(body)
File "/Users/lamblin/VEnvs/awscli/lib/python3.6/site-packages/botocore/handlers.py", line 193, in _calculate_md5_from_file
md5.update(chunk)
TypeError: Unicode-objects must be encoded before hashing
Now this can be avoided by opening the file with 'rb', but isn't the file object f clearly using an encoding?
The encoding specified to io.open in mode='r' is used to decode the content, so by the time you iterate over f the content has already been converted from bytes to str (text) by Python.
To interface with botocore directly, open your file with mode 'rb' and drop the encoding kwarg. There is no point decoding the content to text when the first thing botocore has to do in order to transport it is encode it back into bytes again.
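A minimal sketch of the binary-mode version, assuming the same bucket and key as in the question and that client is a boto3 S3 client:
import boto3

client = boto3.client('s3')  # assumed; the question's client is created elsewhere
# Open in binary mode so botocore can hash and send raw bytes
with open('deleteme', 'rb') as f:
    resp = client.put_object(
        Body=f,
        Bucket='s3-bucket-actual-name-for-real',
        Key='testing/a/put'
    )
print(resp)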
I am serving large static files (~70 MB). I am able to download the files when working with Flask alone, but I am getting the error below when using Tornado and Flask.
Exception ignored in: <bound method Future.__del__ of <tornado.concurrent.Future object at 0x32c61acc>>
Traceback (most recent call last):
File "/home/user/virtual/lib/python3.4/site-packages/tornado/concurrent.py", line 333, in __del__
File "/usr/local/lib/python3.4/traceback.py", line 181, in format_exception
File "/usr/local/lib/python3.4/traceback.py", line 153, in _format_exception_iter
File "/usr/local/lib/python3.4/traceback.py", line 18, in _format_list_iter
File "/usr/local/lib/python3.4/traceback.py", line 65, in _extract_tb_or_stack_iter
File "/usr/local/lib/python3.4/linecache.py", line 15, in getline
File "/usr/local/lib/python3.4/linecache.py", line 41, in getlines
File "/usr/local/lib/python3.4/linecache.py", line 126, in updatecache
File "/usr/local/lib/python3.4/tokenize.py", line 437, in open
AttributeError: 'module' object has no attribute 'open'
Here is the code that I am using to serve files:
import os
from flask import Response
from tornado.httpserver import HTTPServer
from tornado.ioloop import IOLoop
from tornado.wsgi import WSGIContainer

def download(path):
    def generate():
        # Stream the file in 1 KB chunks instead of loading it into memory
        with open(path, 'rb') as file_handler:
            while True:
                chunk = file_handler.read(1024)
                if not chunk:
                    break
                yield chunk
    return Response(generate(), direct_passthrough=True, mimetype='application/octet-stream',
                    headers={'Content-Disposition': 'attachment;filename={}'.format(os.path.basename(path))})

# APP (the Flask app) and PORT are defined elsewhere in the script
http_server = HTTPServer(WSGIContainer(APP))
http_server.listen(PORT, address='0.0.0.0')
IOLoop.instance().start()