I would like to create an app in Python on Google App Engine that handles file uploads and stores the files in the Blobstore.
However, the Blobstore currently requires blobstore.create_upload_url to create the URL for the file-upload form POST. Since I am uploading from another server, my question is: is it possible to upload a file to the GAE Blobstore without using that dynamic URL from blobstore.create_upload_url?
FYI, it is acceptable for my Python script to request an upload URL before uploading from the other server, but this adds extra latency, which I want to avoid. I also read about the so-called "file-like API" at http://code.google.com/intl/en/appengine/docs/python/blobstore/overview.html#Writing_Files_to_the_Blobstore, but the documentation doesn't seem to cover the uploading part.
Also, I previously tried using the datastore for file uploads, but its maximum entity size of 1MB is not enough for my case. Please advise, thanks.
There are exactly two ways to write files to the Blobstore: 1) using create_upload_url and posting a form with a file attachment to it, or 2) writing to the Blobstore directly using an API (with a solution here for large files).
If you want a remote server to be able to upload, you have the same two choices:
1) The remote server first requests a URL. You have a handler whose only job is to return such a URL. With this URL in hand, the remote server crafts a properly encoded multipart POST message and sends it to that URL.
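A minimal Python 3 sketch of that flow from the remote server's side, using only the standard library; the get-URL endpoint is an assumption (a handler presumed to return the create_upload_url() result as plain text), and the multipart body is assembled by hand:

```python
import io
import urllib.request
import uuid

def build_multipart_body(field_name, filename, file_bytes,
                         content_type="application/octet-stream"):
    """Assemble a multipart/form-data body by hand; returns (body, boundary)."""
    boundary = uuid.uuid4().hex
    buf = io.BytesIO()
    buf.write(b"--" + boundary.encode() + b"\r\n")
    buf.write((
        'Content-Disposition: form-data; '
        'name="%s"; filename="%s"\r\n' % (field_name, filename)
    ).encode())
    buf.write(("Content-Type: %s\r\n\r\n" % content_type).encode())
    buf.write(file_bytes)
    buf.write(b"\r\n--" + boundary.encode() + b"--\r\n")
    return buf.getvalue(), boundary

def upload_via_one_time_url(get_url_endpoint, filename, file_bytes):
    # Step 1: ask the App Engine handler for a fresh one-time upload URL.
    upload_url = urllib.request.urlopen(get_url_endpoint).read().decode().strip()
    # Step 2: POST the file to that URL as multipart/form-data.
    body, boundary = build_multipart_body("file", filename, file_bytes)
    req = urllib.request.Request(
        upload_url,
        data=body,
        headers={"Content-Type": "multipart/form-data; boundary=" + boundary},
    )
    return urllib.request.urlopen(req)
```

Note that each one-time URL from create_upload_url can only be used once, so step 1 has to be repeated per upload.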
2) You send the file data as an HTTP parameter to a handler on your App Engine server, which then uses the Blobstore write API to write the file directly. Request size is limited to 32MB, so the maximum file size you can write using this method is slightly under that.
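For the App Engine side of option 2, a sketch assuming the (since-deprecated) Files API from the overview page linked in the question; the 32MB guard is plain Python, while write_to_blobstore only runs inside App Engine:

```python
MAX_REQUEST_BYTES = 32 * 1024 * 1024  # App Engine's hard request-size limit

def fits_in_request(file_bytes, overhead=64 * 1024):
    """True if the payload, plus some headroom for headers and encoding,
    stays under the 32MB request limit."""
    return len(file_bytes) + overhead <= MAX_REQUEST_BYTES

def write_to_blobstore(file_bytes, mime_type="application/octet-stream"):
    """Write raw bytes via the (deprecated) Blobstore Files API and
    return the resulting blob key. Runs only inside App Engine."""
    from google.appengine.api import files  # available on App Engine only
    file_name = files.blobstore.create(mime_type=mime_type)
    with files.open(file_name, "a") as f:
        f.write(file_bytes)
    files.finalize(file_name)
    return files.blobstore.get_blob_key(file_name)
```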
Related
I have built a Python web services application using Flask.
After creating the swagger document, I published the API in WSO2 API Manager.
Both publishing the API and subscribing to it were successful.
I am able to get authentication token as well.
What I am doing
1. To consume the API, I upload two Excel files from Postman or an Angular application.
2. The files are sent to the WSO2 server URL, which forwards them to the Python server.
3. When the Python server receives the files, it parses them,
4. does some calculations, and returns the response data object.
Now the problem is at step 3. The files received at the Python end are not in Excel format; the data of both files is combined into a single FileStorage object.
Please see the snapshot below
Instead of two, one FileStorage object is received in request.files
I tried all of this in Postman as well as with an Angular application, and neither works.
I even tried it via Swagger on the published app's page in WSO2 API Manager, but that doesn't work either.
Sending Files Via WSO2 Swagger
What does work
When I use the Python app's own Swagger page, or when I consume the services via Postman by making a request directly to the Python server,
I receive both files in the proper required format.
File received properly when tried with python app's swagger page
Can you please help me understand what I might be doing wrong?
I am using the content type as below:
content-type: multipart/form-data;
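One client-side detail worth checking alongside the gateway config: a multipart/form-data Content-Type header is only usable by the receiver if it carries the boundary parameter, and each file has to sit in its own part. A standard-library sketch that builds such a request for two files (the field names, dummy bytes, and gateway URL are placeholders):

```python
import urllib.request
import uuid

def build_two_file_body(files):
    """files: list of (field_name, filename, data_bytes).
    Returns (body, content_type) with the boundary included in the header."""
    boundary = uuid.uuid4().hex
    parts = []
    for field, fname, data in files:
        parts.append(b"--" + boundary.encode() + b"\r\n")
        parts.append((
            'Content-Disposition: form-data; name="%s"; filename="%s"\r\n'
            "Content-Type: application/vnd.openxmlformats-officedocument"
            ".spreadsheetml.sheet\r\n\r\n" % (field, fname)
        ).encode())
        parts.append(data + b"\r\n")
    parts.append(b"--" + boundary.encode() + b"--\r\n")
    # Without "; boundary=..." the receiver cannot split the parts apart.
    return b"".join(parts), "multipart/form-data; boundary=" + boundary

body, ctype = build_two_file_body(
    [("file1", "a.xlsx", b"dummy-bytes"), ("file2", "b.xlsx", b"dummy-bytes")]
)
req = urllib.request.Request(
    "https://wso2-gateway.example/myapi/upload",  # placeholder gateway URL
    data=body,
    headers={"Content-Type": ctype},
)
```

Postman and Angular generate the boundary automatically when you do not override the header by hand, which is why a manually pinned `content-type: multipart/form-data` can itself break parsing.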
After some digging I found my resolution at Multipart form data file upload using WSO2 API manager?
I added the
<messageBuilder contentType="multipart/form-data"
class="org.wso2.carbon.relay.BinaryRelayBuilder"/>
<messageFormatter contentType="multipart/form-data"
class="org.wso2.carbon.relay.ExpandingMessageFormatter"/>
inside the JSON Message Builders and JSON Message Formatters section in axis2.xml file at
<API-M_HOME>repository\conf\axis2\axis2.xml
I'm trying to upload a file from a REST API (Google Cloud Endpoints) to GCS, but I keep getting errors. I don't know whether I'm going about it the wrong way or whether Google Endpoints simply cannot upload a file.
I want my customers to be able to upload files to my project bucket.
I read "Endpoints doesn't accept the multipart/form-data encoding so you can't upload the image directly to Endpoints".
Mike answered me in this post, but I don't know how to implement that in my project.
I'm using this library (Python):
https://cloud.google.com/appengine/docs/python/googlecloudstorageclient/
If it is possible, what's the best way? Any example?
Thanks so much.
I think what Mike means in the previous post is that you should use the Blobstore API to upload the file to GCS, instead of using Endpoints, and then read the data back through the Blobstore.
But that depends on what platform your client is. If you have a web-based client, you should use the ordinary approach Mike explained (an HTML form and an upload handler). But if you have an Android or other mobile client, you can use the GCS client library or the GCS REST API.
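For the server side of the web-based route, a sketch of writing the received bytes to GCS with the App Engine cloudstorage client library from the link in the question; the bucket name is an assumption, and only the path helper runs outside App Engine:

```python
def gcs_object_name(bucket, filename):
    """Build the /bucket/object path the cloudstorage library expects."""
    return "/%s/%s" % (bucket, filename.lstrip("/"))

def save_to_gcs(filename, data, mime_type="application/octet-stream"):
    """Write bytes to GCS; runs only inside App Engine (or dev_appserver)."""
    import cloudstorage  # GoogleAppEngineCloudStorageClient, App Engine only
    path = gcs_object_name("my-project-bucket", filename)  # assumed bucket
    with cloudstorage.open(path, "w", content_type=mime_type) as gcs_file:
        gcs_file.write(data)
```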
I have an application hosted on Google App Engine and would like to make uploading data more secure. Currently, I upload data by using a python script to make POST requests with the relevant data to a URL on the /_ah/spi/... path (and the app takes care of uploading the data correctly). My goal is to make uploading data only available to admins of the application. I tried specifying an admin security-constraint for /_ah/spi/* in my web.xml but this seemed to block me from being able to upload data.
What is the best/easiest way to allow only admins to upload data via my Python script?
I didn't quite get exactly what I wanted (allowing access to my application endpoints by admins only), but I did find a way to secure it. I followed the steps here. Basically, I had to:
generate a client ID via the Google Developer Console
add the client ID to the @Api annotation (under clientIds) for my endpoint
add a User parameter to all of the protected methods
authenticate myself using OAuth and my public/secret client ID (generated in step 1) in my Python script (sample code here)
This scheme at least requires the script accessing my application to have both the public and secret client IDs. Also, this question had some very useful information.
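The User-parameter check in the list above boils down to rejecting calls that arrive without valid OAuth credentials and then comparing the authenticated email against your own admin list (Endpoints authenticates the caller, but it does not know who your admins are). A rough Python sketch, with the whitelist as an assumption:

```python
ALLOWED_UPLOADERS = {"admin@example.com"}  # assumed admin whitelist

def user_email(user):
    # Endpoints User objects expose .email(); plain strings stand in elsewhere.
    return user.email() if hasattr(user, "email") else str(user)

def check_uploader(user):
    """Reject calls without valid credentials or from non-admin accounts.
    In an Endpoints method, user is None when no valid OAuth token was sent."""
    if user is None:
        raise PermissionError("Invalid or missing OAuth credentials")
    email = user_email(user)
    if email not in ALLOWED_UPLOADERS:
        raise PermissionError("Not an authorized uploader")
    return email
```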
I want to upload an image to the Blobstore because I want to support files larger than 1MB. Now, the only way I can find is for the client to issue a POST that sends the metadata, such as geo-location and tags, which the server puts in an entity. In this entity the server also puts the key of the blob where the actual image data is going to be stored, and the server concludes the request by returning to the client the URL obtained from create_upload_url(). This works fine; however, it can leave me inconsistent, for example if the second request is never issued and the blob is therefore never filled. The entity is then pointing to an empty blob.
The only solution to this problem I can see is to trigger a deferred task that checks whether the blob was ever filled by an upload. I'm not a big fan of this solution, so I'm wondering whether anybody has a better one in mind.
I went through exactly the same thought process, but in Java, and ended up using Apache Commons FileUpload. I'm not familiar with Python, but you'll just need a way of handling a multipart/form-data upload.
I upload the image and my additional fields together, using jQuery to assemble the multipart form data, which I then POST to my server.
On the server side I then take the file and write it to Google Cloud Storage using the Google Cloud Storage client library (Python link). This can be done in one chunk, or 'streamed' if it's a large file. Once it's in GCS, your App Engine app can read it using the same library, or you can serve it directly with a public URL, depending on the ACL you set.
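The 'streamed' variant mentioned above can be sketched like this, again assuming the App Engine cloudstorage client library; the chunking helper itself is plain Python:

```python
def iter_chunks(data, chunk_size=256 * 1024):
    """Yield successive chunk_size slices (GCS prefers 256KB multiples)."""
    for i in range(0, len(data), chunk_size):
        yield data[i:i + chunk_size]

def stream_to_gcs(object_path, data, mime_type):
    """Write data to GCS chunk by chunk; runs only inside App Engine."""
    import cloudstorage  # App Engine only
    with cloudstorage.open(object_path, "w", content_type=mime_type) as dest:
        for chunk in iter_chunks(data):
            dest.write(chunk)
```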
I am developing a web app on Google App Engine to parse data from various incoming sources and save it all in one place. My ultimate goal is to save the files to Dropbox, but the App Engine hosting service doesn't allow me to save files to disk. Is there a way to send raw data to a Dropbox app and have that app save it as a file?
You can write to a file in Google App Engine using either the Blobstore (https://developers.google.com/appengine/docs/python/blobstore/overview#Writing_Files_to_the_Blobstore) or Google Cloud Storage (https://developers.google.com/appengine/docs/python/googlestorage/overview).
To write to a file using the dropbox api, have a look here: https://www.dropbox.com/developers/reference/api#files-POST
You'll have to set up an authenticated request, but this will write the contents of your POST body into a file in the authenticated user's Dropbox.
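As a sketch, the files_put call is a plain HTTPS PUT against the v1 endpoint documented above; the token and path below are placeholders, and the request is only built, not sent:

```python
import urllib.request

def build_dropbox_put(access_token, dropbox_path, data):
    """Build (but do not send) a files_put request for the Dropbox v1 REST API.
    dropbox_path is relative to the app/user root, e.g. 'reports/out.txt'."""
    url = "https://api-content.dropbox.com/1/files_put/auto/" + dropbox_path
    return urllib.request.Request(
        url,
        data=data,
        headers={
            "Authorization": "Bearer " + access_token,
            "Content-Type": "application/octet-stream",
        },
        method="PUT",
    )

req = build_dropbox_put("<ACCESS_TOKEN>", "reports/out.txt", b"raw data")
# urllib.request.urlopen(req) would perform the actual upload
```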
I think that for the Blobstore, Google Cloud Storage, and Dropbox you cannot append to existing files, so if you need to do this you must either create a new file each time you want to write data and combine the files later, or read in the previous file's data and prepend it to the new data before writing.
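The read-and-rewrite workaround from the last sentence can be sketched with local files standing in for Blobstore/GCS/Dropbox objects:

```python
import os
import tempfile

def append_by_rewrite(path, new_data):
    """Emulate append on a store that only supports whole-file writes:
    read the old contents (if any), then write old + new as a fresh file."""
    old = b""
    if os.path.exists(path):
        with open(path, "rb") as f:
            old = f.read()
    with open(path, "wb") as f:
        f.write(old + new_data)

# usage, with a temp file standing in for a remote object
path = os.path.join(tempfile.mkdtemp(), "log.bin")
append_by_rewrite(path, b"first ")
append_by_rewrite(path, b"second")
with open(path, "rb") as f:
    combined = f.read()
```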