I have a Python web project based on Pyramid. I'm unsure which tools to use to enable image uploading. I previously used pyramid_storage (https://github.com/danjac/pyramid_storage) to handle image uploads, but I haven't figured out how to expire the uploaded files.
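As far as I can tell, pyramid_storage only writes files into a configured store and has no expiry mechanism of its own, so a generic workaround is a periodic cleanup job that deletes files older than some cutoff. A minimal sketch, assuming a local uploads/ directory and a seven-day lifetime:

# Hypothetical cleanup job: delete uploads older than MAX_AGE seconds.
# UPLOAD_DIR should match the base path configured for pyramid_storage;
# run this from cron or a scheduler such as APScheduler.
import os
import time

UPLOAD_DIR = 'uploads'       # placeholder
MAX_AGE = 7 * 24 * 3600      # seven days, in seconds

def expire_old_uploads():
    cutoff = time.time() - MAX_AGE
    for entry in os.scandir(UPLOAD_DIR):
        # Treat the last-modified time as the upload time, which holds
        # when files are written once and never touched again.
        if entry.is_file() and entry.stat().st_mtime < cutoff:
            os.remove(entry.path)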
I have a working Python script that generates and saves hi-res image files to a local Dropbox folder (synced through the Windows Dropbox app). Is there a way in Python to change the Smart Sync setting for the newly created image from "Local" to "Online Only" so that I can save space on my local hard drive? I know I could use the Dropbox API v2 to just upload the file and then delete the temporary local files after, but I'm wondering if there is a way to directly change the file settings since it already gets saved to the synced Dropbox folder.
Thanks!
No, unfortunately Dropbox doesn't offer an API for managing Smart Sync settings like this, but I'll pass this along as a feature request.
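For reference, the upload-then-delete fallback mentioned in the question would look roughly like this with the official dropbox SDK; the access token and both paths are placeholders:

# Hypothetical fallback: upload through the Dropbox API v2, then delete
# the local copy. ACCESS_TOKEN and both paths are placeholders.
import os
import dropbox

dbx = dropbox.Dropbox('ACCESS_TOKEN')

local_path = 'renders/image_0001.png'
remote_path = '/renders/image_0001.png'

with open(local_path, 'rb') as f:
    # files_upload handles files up to ~150 MB; larger files need an
    # upload session.
    dbx.files_upload(f.read(), remote_path,
                     mode=dropbox.files.WriteMode.overwrite)

os.remove(local_path)  # reclaim the local disk space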
I want to use the google.appengine.api images package, but I do not know how to install it in my virtualenv. The package works fine when I use dev_appserver.py in my normal environment, but when I use the flexible environment with Flask it cannot find the package. Is there a way to add the images library to my virtualenv?
I tried using Pillow to resize the image before uploading it to the server, but when I did that the image would arrive in Cloud Storage at 0 bytes.
import io
from PIL import Image
from google.cloud import storage

if file and allowed_file(file.filename):
    filename = '%s_%s.jpg' % (item.id, len(item.photos))
    # Resize the file using Pillow
    image = Image.open(file)
    image.thumbnail((300, 300))
    resized_image = io.BytesIO()
    image.save(resized_image, format='JPEG')
    # If I do an image.show() here, the image is
    # properly shown resized
    gcs = storage.Client()
    bucket = gcs.get_bucket(CLOUD_STORAGE_BUCKET)
    blob = bucket.blob(filename)
    blob.upload_from_file(resized_image,
                          content_type=file.content_type)
    # I then view the image in the bucket and it shows up as 0 bytes
    # and blank.
    # If I just upload the original file, it uploads fine.
You may be out of luck; the Images service is not available outside the standard environment.
From the Migrating Services from the Standard Environment to the Flexible Environment guide:
The Images service is not available outside of the standard environment. However, you can easily serve images directly from your application or directly from Cloud Storage.

If you need to do image processing, you can install and use any image processing library such as Pillow.

The Images service also provided functionality to avoid dynamic requests to your application by handling image resizing using a serving URL. If you want similar functionality, you can generate the re-sized images ahead of time and upload them to Cloud Storage for serving. Alternatively, you could use a third-party content delivery network (CDN) service that offers image resizing.
For more resources, see the following guides:
Using Cloud Storage
Serving Static Files
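Separately, regarding the 0-byte uploads in your snippet: that is almost certainly a stream-position issue rather than anything environment-specific. image.save() leaves the BytesIO cursor at the end of the buffer, and blob.upload_from_file() reads from the current position onward, so it uploads nothing. Rewinding the buffer before uploading should fix it:

# Rewind the in-memory buffer after saving into it, so the upload
# starts from the first byte instead of the end of the stream.
resized_image = io.BytesIO()
image.save(resized_image, format='JPEG')
resized_image.seek(0)  # without this, upload_from_file sees 0 bytes
blob.upload_from_file(resized_image, content_type='image/jpeg')

Note the content type is set to image/jpeg explicitly here, since the buffer now holds a re-encoded JPEG regardless of the original upload's type.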
I want to upload images to Microsoft Azure through a Python script and show those images in a dashboard built on the Django admin interface. Now, I figured that since I am sending pictures I should be using FTP. So this is the code:
import ftplib
session = ftplib.FTP('server.address.com','USERNAME','PASSWORD')
file = open('kitten.jpg','rb') # file to send
session.storbinary('STOR kitten.jpg', file) # send the file
file.close() # close file and FTP
session.quit()
Now, I don't know how to set up an FTP server in Azure, or how I would fetch those images from the server into my dashboard. I don't know much about deployment, so any link or guide on how to do this would be welcome.
It sounds like you are trying to create a Django app that uploads & shows images via FTP on Azure Web Apps.

In my experience, the only feasible way to do this on Azure Web Apps is reading and writing images via Kudu FTP; please refer to the official wiki page for Kudu FTP, and set up the user name & password via the Azure portal.

However, I don't think that's a best practice for serving images from an Azure Web App: FTP normally has concurrency limits for uploading & downloading that make it unsuitable for your scenario, and the storage reached via FTP on an Azure Web App is intended for the app itself, not for resources.

So my suggestion is to use Azure Blob Storage to read & write images; Django supports integration with Azure Storage via simple configuration. Please refer to the Django documentation reference for Azure Storage to learn how to do it.
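As a sketch of the Blob Storage route, an upload from your script could look like this with the azure-storage-blob package (v12 API); the connection string and container name are placeholders:

# Hypothetical upload using azure-storage-blob (v12 API).
# CONNECTION_STRING and the 'images' container are placeholders.
from azure.storage.blob import BlobServiceClient, ContentSettings

service = BlobServiceClient.from_connection_string('CONNECTION_STRING')
blob = service.get_blob_client(container='images', blob='kitten.jpg')

with open('kitten.jpg', 'rb') as f:
    blob.upload_blob(
        f,
        overwrite=True,
        content_settings=ContentSettings(content_type='image/jpeg'))

# The dashboard can then render the blob's URL directly.
print(blob.url)

On the Django side, the django-storages package provides an Azure Storage backend, so model ImageFields can store uploads in the same container with only settings changes.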
I need to export my blobstore from one App Engine project and upload it to another project. How can I switch between projects programmatically with Python?
If by "python" you mean a python GAE app's code itself - AFAIK you can't switch apps - each such code runs only inside the app specified in the .yaml file.
You could teach the exporting app to serve the blobs, and for the actual transfer you could either:
have the receiving app directly pull the blobs from the exporting app
have an external (Python) script pull blobs from the exporting app and upload them to the importing app.
Either way you'd need to write some code to actually perform the transfer.
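A minimal sketch of the second option, assuming the two apps expose hypothetical serve/upload handlers at the URLs below:

# Hypothetical one-off transfer script (the second option above): pull
# each blob from the exporting app's serving handler and re-upload it
# through the importing app's upload handler. Both URLs and the 'key'
# parameter are placeholders for whatever handlers you write.
import requests

EXPORT_URL = 'https://exporter-app.appspot.com/serve_blob'
IMPORT_URL = 'https://importer-app.appspot.com/upload_blob'

def transfer(blob_keys):
    for key in blob_keys:
        resp = requests.get(EXPORT_URL, params={'key': key})
        resp.raise_for_status()
        content_type = resp.headers.get('Content-Type',
                                        'application/octet-stream')
        upload = requests.post(
            IMPORT_URL,
            files={'file': (key, resp.content, content_type)})
        upload.raise_for_status()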
So instead of doing that, I'd rather write and execute a one-time conversion script to move the data from the blobstore (presently shown in the GAE Python docs under Storing Data > Superseded Storage Solutions in the left-side menubar) to the Datastore or to GCS, both of which have better backup/restore options, including across apps :) GCS can probably even be used to share the same data across apps. And you can still serve the GCS data using the blobstore API; see Uploading files directly to Google Cloud Storage for GAE app.
If you mean some external Python app's code: AFAIK the blobstore doesn't offer generic direct access to an external application (I might be wrong, though). So an external app would need to go through the regular upload/download handlers of the two apps, and in this case switching between projects really means switching between the two apps' upload/download URLs.
Even for this scenario it might be worthwhile to migrate to GCS, which does offer direct access; see Sharing and Collaboration.
I'm working on a Django app that allows users to upload documents to Google Drive and share them with friends. The problem is I want to restrict the shared documents to view only (no download option). How can I go about doing this?
You can insert/upload files using the Drive API and set the "restricted" label to prevent downloading of the file. You would then set the appropriate permissions to this file to allow anyone or a specified set of users to access the file.
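A minimal sketch of that flow with google-api-python-client against the Drive v2 API (where the "restricted" label lives); creds is assumed to be an already-authorized credentials object and FILE_ID a placeholder:

# Hypothetical sketch: mark an uploaded file view-only via the v2
# "restricted" label, then share it read-only with one user.
# `creds` and FILE_ID are assumed to exist already.
from googleapiclient.discovery import build

drive = build('drive', 'v2', credentials=creds)

# The restricted label disables download/print/copy for readers.
drive.files().patch(
    fileId=FILE_ID,
    body={'labels': {'restricted': True}}).execute()

# Grant read-only access to a specific user.
drive.permissions().insert(
    fileId=FILE_ID,
    body={'type': 'user', 'role': 'reader',
          'value': 'friend@example.com'}).execute()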
Download restrictions may or may not apply to files that are converted to one of the Google Apps formats, because the option to prevent downloading seems unavailable for these files through the Google Drive UI. You would have to test this yourself.