I would like to upload a file to Google Drive from a Flask app that runs on App Engine. I have the Google Drive service (constructed with the Google Drive API) up and running and was able to upload a file from the server using files().insert(...).
Now I would like to implement uploading via HTML form. This gives me a FileStorage object. How should I proceed from this point in order to get the file inside Google Drive?
Please note that I'm uploading files that exceed App Engine's 5MB limit on request size.
Thank you for your suggestions.
Upload the form to the Blobstore with a callback into your frontend.
If the file is very large, you might need to go as far as a backend with resumable uploads done on chained backends (since a backend could also go down in the middle of an upload of a huge file).
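If you go the Blobstore route, here is a rough sketch of the upload callback handler streaming the blob into Drive with a resumable upload. It assumes drive_service is the authorized Drive v2 service already built in the app, uses the webapp2 BlobstoreUploadHandler from the Blobstore docs rather than Flask, and 'file' is a placeholder form field name:
from google.appengine.ext import blobstore
from google.appengine.ext.webapp import blobstore_handlers
from apiclient.http import MediaIoBaseUpload  # googleapiclient.http in newer releases

class DriveUploadHandler(blobstore_handlers.BlobstoreUploadHandler):
    def post(self):
        # The Blobstore has already stored the upload; read it back as a file-like object.
        blob_info = self.get_uploads('file')[0]
        reader = blobstore.BlobReader(blob_info.key())
        media = MediaIoBaseUpload(reader,
                                  mimetype=blob_info.content_type,
                                  resumable=True)  # chunked upload, no single huge request
        # drive_service: the authorized Drive v2 service built elsewhere in the app
        drive_service.files().insert(body={'title': blob_info.filename},
                                     media_body=media).execute()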
Related
I need to load CSV files from Google Drive into BigQuery automatically and I was wondering if it's possible to do it this way:
Google Drive Folder
Pub/Sub, Cloud Functions, Drive API... ??
Cloud Storage Bucket
BigQuery
I have developed a python script that uploads the CSV file stored in Cloud Storage automatically to BigQuery, now I need to create the workflow between Google Drive and Cloud Storage.
I've been researching but don't really know how to proceed.
Any hints?
You will need to develop an app to listen for changes; Google App Engine or Cloud Functions work well here.
The app will need to implement the Retrieve Changes logic that makes sense to your use case.
See these Google Drive API docs https://developers.google.com/drive/api/v3/manage-changes
With Drive, I recommend asking whether the OAuth flow is worth it for any app. Asking your users to submit to a lightweight frontend might be easier and faster to develop.
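A minimal sketch of that Retrieve Changes loop, assuming drive is an authorized Drive v3 service and that the page token is persisted between runs (the CSV filter and the function name are just placeholders):
def poll_drive_changes(drive, saved_page_token):
    # Start from the saved token, or fetch a fresh starting point on the first run.
    page_token = saved_page_token or drive.changes().getStartPageToken().execute()['startPageToken']
    while page_token:
        response = drive.changes().list(pageToken=page_token,
                                        spaces='drive',
                                        fields='*').execute()
        for change in response.get('changes', []):
            file_info = change.get('file', {})
            if file_info.get('mimeType') == 'text/csv':  # only react to CSV files
                print('New/updated CSV: %s' % change.get('fileId'))
        if 'newStartPageToken' in response:
            saved_page_token = response['newStartPageToken']  # store for the next poll
        page_token = response.get('nextPageToken')
    return saved_page_token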
Try using the Google Drive API to pull data from Google Drive and load it to whichever location you want, i.e. GCS, a BigQuery table and so on.
You can refer to the following example to create code that achieves the same.
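A rough sketch of that Drive-to-GCS-to-BigQuery flow, assuming authorized clients and placeholder file, bucket, dataset, and table names:
import io
from googleapiclient.http import MediaIoBaseDownload
from google.cloud import storage, bigquery

def drive_csv_to_bigquery(drive, file_id, bucket_name, dataset, table):
    # 1) Pull the CSV bytes out of Drive.
    buf = io.BytesIO()
    downloader = MediaIoBaseDownload(buf, drive.files().get_media(fileId=file_id))
    done = False
    while not done:
        _, done = downloader.next_chunk()

    # 2) Stage the bytes in a Cloud Storage bucket.
    blob = storage.Client().bucket(bucket_name).blob('staging/%s.csv' % file_id)
    blob.upload_from_string(buf.getvalue(), content_type='text/csv')

    # 3) Load the staged object into BigQuery.
    bq = bigquery.Client()
    job_config = bigquery.LoadJobConfig(source_format=bigquery.SourceFormat.CSV,
                                        skip_leading_rows=1,
                                        autodetect=True)
    job = bq.load_table_from_uri('gs://%s/%s' % (bucket_name, blob.name),
                                 '%s.%s' % (dataset, table),
                                 job_config=job_config)
    job.result()  # wait for the load job to finish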
I am trying to figure out a way to upload an image into Google Cloud Storage using Google App Engine.
I have checked:
Sending images to google cloud storage using google app engine
Upload images/video to google cloud storage using Google App Engine
They all show how to do it using the Blobstore API.
When I checked the Blobstore API: https://cloud.google.com/appengine/docs/python/blobstore/
They have a note saying to use Google Cloud Storage instead. What's the current status of the Blobstore and will it be supported in the future?
I see an example for image upload using the Blobstore API: https://github.com/GoogleCloudPlatform/python-docs-samples/blob/master/appengine/standard/blobstore/main.py
Is there an example for Google Cloud Storage using Google App Engine?
Uploading images through App Engine has three problems:
First, it's very inefficient. You are using your App Engine instance hours simply to pass data through from a user's browser to Google Cloud Storage.
Second, it's slower (again, because you use an intermediary).
Finally, App Engine does not support streaming and all requests are limited to 32MB.
The best option is to upload files directly to Cloud Storage using one of the upload options.
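For example, one way to do that is to hand the browser a short-lived signed URL and let it PUT the file straight into the bucket; a minimal sketch with the google-cloud-storage client (bucket and object names are placeholders, and the credentials must be able to sign URLs):
import datetime
from google.cloud import storage

def make_upload_url(bucket_name, object_name, content_type='image/jpeg'):
    # Generate a short-lived URL the browser can PUT the file body to directly,
    # so the bytes never pass through the App Engine app.
    blob = storage.Client().bucket(bucket_name).blob(object_name)
    return blob.generate_signed_url(
        version='v4',
        expiration=datetime.timedelta(minutes=15),
        method='PUT',                     # the client will PUT the file body
        content_type=content_type)        # must match the upload request header

# The browser (or any HTTP client) then uploads with a plain PUT:
#   curl -X PUT -H "Content-Type: image/jpeg" --upload-file photo.jpg "<signed url>"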
I ended up using https://cloud.google.com/appengine/docs/python/googlecloudstorageclient/app-engine-cloud-storage-sample (the GCS client @ChrisC73 pointed out).
I also referred to @voscausa's signed_url project: https://github.com/voscausa/appengine-gcs-signed-url
I upvoted the above answer, as you should try to avoid passing data through App Engine if possible. If you really need to upload an image (or any other data) from App Engine to Google Cloud Storage in Python, the answer, as ChrisC73 pointed out, is to use the GoogleAppEngineCloudStorageClient, since there is no built-in API for Cloud Storage on the Python runtime.
In contrast, the PHP runtime has built-in support for Google Cloud Storage with the standard filesystem functions.
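A minimal sketch of writing an uploaded image with that client, assuming a placeholder bucket name:
import cloudstorage as gcs  # the GoogleAppEngineCloudStorageClient package

def save_image(image_bytes, filename):
    # GCS paths use the '/<bucket>/<object>' form; 'my-bucket' is a placeholder.
    gcs_path = '/my-bucket/' + filename
    with gcs.open(gcs_path, 'w', content_type='image/jpeg') as f:
        f.write(image_bytes)
    return gcs_path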
I have a .json local file and I want to upload it to google cloud storage using Python.
First, note that my Python script is not on App Engine; it's a simple script, so I don't want to use the App Engine API. Secondly, I read this tutorial and I think it is extremely unclear.
Hence, I am looking for an alternative API, perhaps written by someone outside of Google, to help me upload my JSON file using my JSON credentials.
If you want to interact with Google Cloud Storage services outside of the App Engine environment, you may use gcloud-python (https://googlecloudplatform.github.io/gcloud-python/stable/) to do so.
You need a service account for your application, and you need to download its JSON credentials file. With those, once you install the proper packages, you'll be able to make queries as well as write data.
Here is an example of authenticating yourself to use the library:
import os
from gcloud import datastore
# the location of the JSON file on your local machine
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/location/client_secret.json"
# project ID from the Developers Console
projectID = "THE_ID_OF_YOUR_PROJECT"
os.environ["GCLOUD_TESTS_PROJECT_ID"] = projectID
os.environ["GCLOUD_TESTS_DATASET_ID"] = projectID
client = datastore.Client(dataset_id=projectID)
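The storage side looks very similar; a minimal sketch of the JSON upload itself, with placeholder bucket and file names and the same credentials as above:
from gcloud import storage
# reuses the GOOGLE_APPLICATION_CREDENTIALS set above
client = storage.Client(project=projectID)
bucket = client.get_bucket("my-bucket")          # the bucket must already exist
blob = bucket.blob("data/my_file.json")          # object name inside the bucket
blob.upload_from_filename("/path/to/my_file.json")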
Edit:
Per the docs here (https://github.com/GoogleCloudPlatform/gcloud-python):
This client supports the following Google Cloud Platform services:
Google Cloud Datastore
Google Cloud Storage
Google Cloud Pub/Sub
Google BigQuery
Google Cloud Resource Manager
I am developing a Google web app using Google App Engine to parse some data from various incoming sources and save it all in one place. My ultimate goal is to save the file to Dropbox, but the Google app hosting service doesn't allow me to save files on disk. Is there a way to send raw data to a Dropbox app and have that app save it as a file?
You can write to a file on Google App Engine by using either the Blobstore (https://developers.google.com/appengine/docs/python/blobstore/overview#Writing_Files_to_the_Blobstore) or Google Cloud Storage (https://developers.google.com/appengine/docs/python/googlestorage/overview).
To write to a file using the dropbox api, have a look here: https://www.dropbox.com/developers/reference/api#files-POST
You'll have to set up an authenticated request, but this will write the contents of your POST body into a file in the authenticated user's Dropbox.
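A rough sketch of such a request with Python's requests library, using Dropbox's current files/upload endpoint (the access token and path are placeholders obtained from the OAuth flow):
import json
import requests

def save_to_dropbox(access_token, dropbox_path, data):
    # The raw bytes of the request body become the contents of the new file.
    response = requests.post(
        'https://content.dropboxapi.com/2/files/upload',
        headers={
            'Authorization': 'Bearer ' + access_token,
            'Dropbox-API-Arg': json.dumps({'path': dropbox_path, 'mode': 'add'}),
            'Content-Type': 'application/octet-stream',
        },
        data=data)
    response.raise_for_status()
    return response.json()  # metadata of the newly written file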
I think that for the Blobstore, Google Cloud Storage, and Dropbox you cannot append to existing files. If you need to do this, you can either create a new file each time you want to write data and combine the files at a later point, or read in the previous file's data and prepend it to the new data before writing.
I would like to create an app using python on google app engine to handle file upload and store them into the blobstore.
However, the Blobstore currently requires the use of blobstore.create_upload_url to create the URL for the file upload form post. Since I am uploading from another server, my question is: is it possible to upload a file to the GAE Blobstore without using that dynamic URL from blobstore.create_upload_url?
FYI, it is OK for me to request an upload URL from the Python script before I upload from the other server, but this creates extra latency, and that is not what I want. I also read about the so-called "file-like API" from http://code.google.com/intl/en/appengine/docs/python/blobstore/overview.html#Writing_Files_to_the_Blobstore, but the documentation didn't seem to cover uploading.
Also, I previously tried to use the datastore for file uploads, but the max file size is 1MB, which is not enough for my case. Please kindly advise, thanks.
There are exactly two ways to write files to the Blobstore: 1) using create_upload_url and posting a form with a file attachment to it, or 2) writing to the Blobstore directly using an API (with a solution here for large files).
If you want a remote server to be able to upload, you have the same two choices:
1) The remote server first requests a URL. You have a handler whose only job is to return such a URL. With this URL, the remote server crafts a properly-encoded POST message and posts it to the URL.
2) You send the file data as an HTTP parameter to a handler on your App Engine server, which uses the Blobstore write API to write the file directly. Request size is limited to 32MB, so the maximum file size you can write using this method will be slightly under that.
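A rough sketch of option 2, using the experimental (and later deprecated) Files API documented in the overview linked above; the handler, field name, and response format are placeholders:
from google.appengine.api import files
import webapp2

class WriteBlobHandler(webapp2.RequestHandler):
    def post(self):
        # Raw file bytes sent by the remote server as a plain HTTP parameter.
        data = self.request.get('file_data')
        file_name = files.blobstore.create(mime_type='application/octet-stream')
        with files.open(file_name, 'a') as f:   # append mode is required for writing
            f.write(data)
        files.finalize(file_name)                # no more writes after this point
        blob_key = files.blobstore.get_blob_key(file_name)
        self.response.write(str(blob_key))       # return the key so the caller can reference the blob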