PIL.Image.save() to an FTP server - python

Right now, I have the following code:
pilimg = PILImage.open(img_file_tmp) # img_file_tmp just contains the image to read
pilimg.thumbnail((200,200), PILImage.ANTIALIAS)
pilimg.save(fn, 'PNG') # fn is a filename
This works just fine for saving to a local file pointed to by fn. However, what I would want this to do instead is to save the file on a remote FTP server.
What is the easiest way to achieve this?

Python's ftplib library can initiate an FTP transfer, but PIL cannot write directly to an FTP server.
What you can do is write the result to a file and then upload it to the FTP server using the FTP library. There are complete examples of how to connect in the ftplib manual so I'll focus just on the sending part:
# (assumes you already created an instance of FTP
# as "ftp", and already logged in)
f = open(fn, 'rb')  # open in binary mode; PNG data is not text
ftp.storbinary("STOR remote_filename.png", f)
If you have enough memory for the compressed image data, you can avoid the intermediate file by having PIL write to a StringIO, and then passing that object into the FTP library:
from StringIO import StringIO  # Python 2; on Python 3 use io.BytesIO (see below)
f = StringIO()
pilimg.save(f, 'PNG')
f.seek(0) # return the StringIO's file pointer to the beginning of the file
# again this assumes you already connected and logged in
ftp.storbinary("STOR remote_filename.png", f)

Related

Gzip a file in Python before uploading to Cloud Storage

I have the following Python function to write the given content to a bucket in Cloud Storage:
import gzip
from google.cloud import storage
def upload_to_cloud_storage(json):
    """Write to Cloud Storage."""
    # The contents to upload as a JSON string.
    contents = json
    storage_client = storage.Client()
    # Path and name of the file to upload (file doesn't yet exist).
    destination = "path/to/name.json.gz"
    # Gzip the contents before uploading
    with gzip.open(destination, "wb") as f:
        f.write(contents.encode("utf-8"))
    # Bucket
    my_bucket = storage_client.bucket('my_bucket')
    # Blob (content)
    blob = my_bucket.blob(destination)
    blob.content_encoding = 'gzip'
    # Write to storage
    blob.upload_from_string(contents, content_type='application/json')
However, I receive an error when running the function:
FileNotFoundError: [Errno 2] No such file or directory: 'path/to/name.json.gz'
Highlighting this line as the cause:
with gzip.open(destination, "wb") as f:
I can confirm that the bucket and path both exist although the file itself is new and to be written.
I can also confirm that removing the Gzipping part sees the file successfully written to Cloud Storage.
How can I gzip a new file and upload to Cloud Storage?
Other answers I've used for reference:
https://stackoverflow.com/a/54769937
https://stackoverflow.com/a/67995040
Although @David's answer wasn't complete at the time I solved my problem, it got me on the right track. Here's what I ended up using, along with explanations I found out along the way.
import gzip
from google.cloud import storage
from google.cloud.storage import fileio
def upload_to_cloud_storage(json_string):
    """Gzip and write to Cloud Storage."""
    storage_client = storage.Client()
    bucket = storage_client.bucket('my_bucket')
    # Filename (include path)
    blob = bucket.blob('path/to/file.json')
    # Set blob metadata for decompressive transcoding
    blob.content_encoding = 'gzip'
    blob.content_type = 'application/json'
    writer = fileio.BlobWriter(blob)
    # Must write as bytes
    gz = gzip.GzipFile(fileobj=writer, mode="wb")
    # When writing as bytes we must encode our JSON string.
    gz.write(json_string.encode('utf-8'))
    # Close connections
    gz.close()
    writer.close()
We use the GzipFile() class instead of the gzip.compress() convenience function so that we can pass in the mode. When trying to write using w or wt you will receive the error:
TypeError: memoryview: a bytes-like object is required, not 'str'
So we must write in binary mode (wb), which is what makes the .write() call work. When doing so we need to encode our JSON string; this can be done with str.encode() using utf-8. Failing to do this will result in the same error.
Finally, I wanted to enable decompressive transcoding, where the requester (a browser in my case) receives the uncompressed version of the file on request. To enable this, google.cloud.storage.blob allows you to set metadata including content_type and content_encoding, so we can follow best practices.
This sees the JSON object in memory written to your chosen destination in Cloud Storage in a compressed format and decompressed on the fly (without needing to download a gzip archive).
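As a quick sanity check (a sketch; the bucket and object names are the ones used above), you can download the raw stored bytes and decompress them yourself, since Blob.download_as_bytes accepts a raw_download flag:
import gzip
from google.cloud import storage

client = storage.Client()
blob = client.bucket('my_bucket').blob('path/to/file.json')
# raw_download=True skips decompressive transcoding and returns the
# gzipped bytes exactly as they are stored.
raw = blob.download_as_bytes(raw_download=True)
print(gzip.decompress(raw).decode('utf-8'))  # the original JSON string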
Thanks also to @JohnHanley for the troubleshooting advice.
The best solution is not to write the gzip to a file at all, but to compress and stream directly to GCS.
import gzip
from google.cloud import storage
from google.cloud.storage import fileio
storage_client = storage.Client()
bucket = storage_client.bucket('my_bucket')
blob = bucket.blob('my_object')
writer = fileio.BlobWriter(blob)
gz = gzip.GzipFile(fileobj=writer, mode="wb")  # GzipFile always writes bytes
gz.write(contents)  # contents must be bytes; use contents.encode() for a str
gz.close()
writer.close()

Writing data in Python to a local file and uploading to FTP at the same time does not work

I have this weird issue with my code on a Raspberry Pi 4.
from gpiozero import CPUTemperature
from datetime import datetime
import ftplib
cpu = CPUTemperature()
now = datetime.now()
time = now.strftime('%H:%M:%S')
# Save data to file
f = open('/home/pi/temp/temp.txt', 'a+')
f.write(str(time) + ' - Temperature is: ' + str(cpu.temperature) + ' C\n')
# Login and store file to FTP server
ftp = ftplib.FTP('10.0.0.2', 'username', 'pass')
ftp.cwd('AiDisk_a1/usb/temperature_logs')
ftp.storbinary('STOR temp.txt', f)
# Close file and connection
ftp.close()
f.close()
With this code, the script doesn't write anything to the .txt file, and the file that is transferred to the FTP server has a size of 0 bytes.
When I remove this part of the code, the script writes to the file just fine.
# Login and store file to FTP server
ftp = ftplib.FTP('10.0.0.2', 'username', 'pass')
ftp.cwd('AiDisk_a1/usb/temperature_logs')
ftp.storbinary('STOR temp.txt', f)
...
ftp.close()
I also tried writing some random text to the file first and running the script, and the file transferred normally.
Do you have any idea what I am missing?
After you write the file, the file pointer is at the end. So if you pass the file handle to FTP, it reads nothing, and hence nothing is uploaded.
I do not have a direct explanation for why the local file ends up empty, but the unusual combination of append mode and reading ('a+') may be the reason.
If you want to both append data to a local file and upload it over FTP, I suggest you either:
append the data to the file, seek back to the beginning, and upload the appended file contents (a sketch follows below); or
write the data to memory first, then separately 1) dump the in-memory data to the file and 2) upload it.
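A minimal sketch of the first option, reusing the question's paths and credentials; the file is opened in binary mode ('a+b') because storbinary needs bytes, and it is flushed and rewound before the upload:
import ftplib
from datetime import datetime
from gpiozero import CPUTemperature

cpu = CPUTemperature()
timestamp = datetime.now().strftime('%H:%M:%S')

with open('/home/pi/temp/temp.txt', 'a+b') as f:
    line = '{} - Temperature is: {} C\n'.format(timestamp, cpu.temperature)
    f.write(line.encode('utf-8'))
    f.flush()  # make sure the line actually reaches the file
    f.seek(0)  # rewind so storbinary reads the whole file
    ftp = ftplib.FTP('10.0.0.2', 'username', 'pass')
    ftp.cwd('AiDisk_a1/usb/temperature_logs')
    ftp.storbinary('STOR temp.txt', f)
    ftp.quit()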

ftplib upload and download get stuck

I'm trying to upload a file to my VPS (hosted by GoDaddy) via Python's ftplib library:
from ftplib import FTP
session = FTP('ftp.wangsibo.xyz','wsb','Wsb.139764')
file = open('source10.png','rb')
session.storbinary('store_source10.png', file)
file.close()
session.quit()
However, it gets stuck at line 4 (the file is only a few KB yet it takes minutes). The same thing happens when I try to read using retrbinary.
I've tried using FileZilla and it worked fine. Any suggestions?
FTP.storbinary(command, fp[, blocksize, callback, rest])
Store a file in binary transfer mode. command should be an appropriate STOR command: "STOR filename". fp is an open file object which is read until EOF using its read() method in blocks of size blocksize to provide the data to be stored.
store_source10.png is not a valid STOR command; try STOR source10.png instead.
e.g.
from ftplib import FTP

session = FTP('ftp.wangsibo.xyz', 'wsb', 'Wsb.139764')
file = open('source10.png', 'rb')
session.storbinary('STOR source10.png', file)
file.close()
session.quit()

Image handling in Python with Numpy

We are importing a screen capture from a web page directly into a variable in Python, and then producing a NumPy array using the following code.
The capture is a PNG image (note: the device URL has an embedded CGI that does the capture work):
response = requests.get(url.format(ip, device), auth=credentials)
Once the screen is captured, convert it to a NumPy array called image:
image = imread(BytesIO(response.content))
After analyzing the image, we would like to FTP the captured PNG to a server for reference at a later date. The best solution we can find right now involves using imsave to create a file locally and then FTPing it with storbinary.
Is it possible to FTP response.content, or a conversion of the NumPy array back into a PNG (using imsave?), directly to the server and skip the local storage step?
Update
As per MattDMo's comment, we tried:
def ftp_artifact(ftp_ip, ftp_dir, tid, artifact_name, artifact_path, imgdata):
    ftp = FTP(ftp_ip)
    ftp.login("autoftp", "autoftp")
    ftp.mkd("FTP/" + ftp_dir)
    ftp.cwd("FTP/" + ftp_dir)
    filepath = artifact_path
    filename = artifact_name
    f = BytesIO(imgdata)
    ftp.storbinary('STOR ' + filename, f)
    ftp.quit()
where imgdata is the result of io.imread. The resulting file is 5x bigger and not an image. The BytesIO object is just the raw NumPy array, I presume?
In the ftplib module, the FTP.storbinary() method takes an open file object as its second argument. Since your BytesIO object can act as a file object, all you'd need to do is pass that - no need for a temporary file on the server.
EDIT
Without seeing your full code, what I suspect is happening is that you are passing the NumPy array to storbinary(), not the BytesIO object. You also need to make sure the object's read pointer is at the beginning by calling bytesio_object.seek(0) before uploading. The following code demonstrates how to do everything:
from ftplib import FTP
from io import BytesIO
import requests
r = requests.get("http://example.com/foo.png")
png = BytesIO(r.content)
# do image analysis
png.seek(0)
ftp = FTP("ftp.server.com")
ftp.login(user="username", passwd="password")
# change to desired upload directory
ftp.storbinary("STOR " + file_name, png)
try:
ftp.quit()
except:
ftp.close()
It took a bit of research, but our student figured it out:
from ftplib import FTP
from io import BytesIO
import matplotlib.pyplot as plt

def ftp_image_to(ftp_ip, ftp_dir, filename, data):
    ftp = FTP(ftp_ip)
    print("logging in")
    ftp.login('autoftp', 'autoftp')
    print("making dir")
    ftp.mkd('FTP/' + ftp_dir)
    ftp.cwd('FTP/' + ftp_dir)
    print("formatting image")
    buf = BytesIO()
    plt.imsave(buf, data, format='png')
    buf.seek(0)
    print("storing binary")
    ftp.storbinary('STOR ' + filename, buf)
    ftp.quit()
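For example, with the capture from earlier (the server address and directory name here are hypothetical):
# image is the array produced by imread(BytesIO(response.content))
ftp_image_to('10.0.0.5', 'captures', 'screen.png', image)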
Thanks IH!

Can I upload an object in memory to FTP using Python?

Here's what I'm doing now:
mysock = urllib.urlopen('http://localhost/image.jpg')
fileToSave = mysock.read()
oFile = open(r"C:\image.jpg", 'wb')
oFile.write(fileToSave)
oFile.close()  # note the parentheses; without them the file is never closed
f = open(r"C:\image.jpg", 'rb')
ftp.storbinary('STOR ' + os.path.basename('image.jpg'), f)
f.close()  # close before removing, or Windows refuses to delete an open file
os.remove(r"C:\image.jpg")
Writing files to disk and then immediately deleting them seems like extra work on the system that should be avoided. Can I upload an object in memory to FTP using Python?
Because of duck-typing, the file object (f in your code) only needs to support the .read(blocksize) call to work with storbinary. When faced with questions like this, I go to the source, in this case lib/python2.6/ftplib.py:
def storbinary(self, cmd, fp, blocksize=8192, callback=None):
    """Store a file in binary mode.  A new port is created for you.

    Args:
      cmd: A STOR command.
      fp: A file-like object with a read(num_bytes) method.
      blocksize: The maximum data size to read from fp and send over
                 the connection at once.  [default: 8192]
      callback: An optional single parameter callable that is called
                on each block of data after it is sent.  [default: None]

    Returns:
      The response code.
    """
    self.voidcmd('TYPE I')
    conn = self.transfercmd(cmd)
    while 1:
        buf = fp.read(blocksize)
        if not buf: break
        conn.sendall(buf)
        if callback: callback(buf)
    conn.close()
    return self.voidresp()
As documented, it only wants a file-like object; indeed it need not even be particularly file-like, it just needs read(n). StringIO provides such "memory file" services.
import urllib
import ftplib
ftp = ftplib.FTP(...)
f = urllib.urlopen('http://localhost/image.jpg')
ftp.storbinary('STOR image.jpg', f)
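On Python 3 the same trick works with urllib.request, whose response objects also expose read() (a sketch with placeholder host and credentials):
import ftplib
import urllib.request

ftp = ftplib.FTP('ftp.example.com', 'user', 'password')  # placeholders
with urllib.request.urlopen('http://localhost/image.jpg') as resp:
    # resp.read(blocksize) feeds the upload chunk by chunk
    ftp.storbinary('STOR image.jpg', resp)
ftp.quit()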
You can use any in-memory file-like object, like BytesIO:
from io import BytesIO
It works both in binary mode with FTP.storbinary:
f = BytesIO(b"the contents")
ftp.storbinary("STOR /path/file.txt", f)
as well as in ascii/text mode with FTP.storlines:
f = BytesIO(b"the contents")
ftp.storlines("STOR /path/file.txt", f)
For more advanced examples, see:
Python - Upload a in-memory file (generated by API calls) in FTP by chunks
Python - Transfer a file from HTTP(S) URL to FTP/Dropbox without disk writing (chunked upload)
How to send CSV file directly to an FTP server
