I am developing a Google web app using Google App Engine to parse data from various incoming sources and save it all in one place. My ultimate goal is to save the file to Dropbox, but the Google app hosting service doesn't allow me to save files on disk. Is there a way to send raw data to a Dropbox app and have that app save it as a file?
You can write to a file in Google App Engine by using either the Blobstore (https://developers.google.com/appengine/docs/python/blobstore/overview#Writing_Files_to_the_Blobstore) or Google Cloud Storage (https://developers.google.com/appengine/docs/python/googlestorage/overview).
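For the Google Cloud Storage route, a minimal sketch might look like the following. This assumes the `cloudstorage` client library bundled with the App Engine Python runtime; the bucket and object names are made up for illustration:

```python
# Hedged sketch: write text into a GCS object from App Engine. The
# `cloudstorage` module only exists inside the App Engine runtime, so the
# import is deferred; bucket/object names here are assumptions.

def object_path(bucket, name):
    """The GCS client library addresses objects as /bucket/object."""
    return "/%s/%s" % (bucket, name.lstrip("/"))

def write_text(bucket, name, data):
    """Create (or overwrite) a text object in the given bucket (untested sketch)."""
    import cloudstorage as gcs  # bundled with the App Engine Python SDK
    with gcs.open(object_path(bucket, name), "w",
                  content_type="text/plain") as fh:
        fh.write(data)
```

The same pattern works for binary data by changing the content type and writing bytes.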
To write to a file using the Dropbox API, have a look here: https://www.dropbox.com/developers/reference/api#files-POST
You'll have to set up an authenticated request, but this will write the contents of your POST body into a file in the authenticated user's Dropbox.
I think none of the Blobstore, Google Cloud Storage, or Dropbox lets you append to existing files. If you need that, you can either create a new file each time you want to write data and combine the files at a later point, or read in the previous file's data and prepend it to the new data before writing.
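For the Dropbox side, a sketch of pushing raw bytes to the `files_put` endpoint linked above (the v1 REST API) could look like this. The access token and file path are assumptions, and the network call itself is untested:

```python
# Hypothetical sketch: PUT raw data into a file in the authenticated user's
# Dropbox via the v1 files_put endpoint referenced above.
import urllib.parse

API_CONTENT_HOST = "https://api-content.dropbox.com"

def build_put_url(path, root="auto"):
    """Build the files_put URL for a path inside the user's Dropbox."""
    quoted = urllib.parse.quote(path.lstrip("/"))
    return "%s/1/files_put/%s/%s" % (API_CONTENT_HOST, root, quoted)

def upload_bytes(token, path, data):
    """Send the request body to Dropbox as the file's contents (untested sketch)."""
    import requests  # assumed available on the server doing the upload
    resp = requests.put(
        build_put_url(path),
        headers={"Authorization": "Bearer %s" % token},
        data=data,
    )
    resp.raise_for_status()
    return resp.json()
```

Each call overwrites the whole file, which is consistent with the no-append caveat above.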
I want to automatically sync new files that are added to Google Drive to Google Cloud Storage.
I have seen various people asking this on the web and most of them suggest something along the lines of:
Develop an app to poll for new files in Drive
Retrieve new files and upload them to GCS
If someone has already written an open-source library/script for this then I would like to reuse it instead of re-inventing the wheel.
Edit:
I have now written a watcher webhook API in Python and subscribed to the folder to get a notification when a new file is added to Google Drive.
Now the issue is that when the webhook is called by Google, no information is provided about the files/folders that were added.
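For reference, this is expected behaviour: Drive push notifications carry only headers, not a change payload, so the webhook handler has to turn around and call the Changes API with a page token it saved earlier. A hedged sketch, assuming `google-api-python-client` and made-up handler names:

```python
# Sketch of handling a Drive push notification. The notification itself only
# identifies the channel and resource state; the actual changed files come
# from a follow-up changes().list() call.

def parse_notification(headers):
    """Pull the interesting Google push-notification headers out of the request."""
    return {
        "channel_id": headers.get("X-Goog-Channel-ID"),
        "resource_state": headers.get("X-Goog-Resource-State"),  # e.g. "sync", "change"
        "message_number": headers.get("X-Goog-Message-Number"),
    }

def fetch_changes(service, saved_page_token):
    """List what changed since the saved token (untested sketch).

    `service` is a googleapiclient Drive v3 service object; the token must
    have been obtained earlier (e.g. via changes().getStartPageToken()).
    """
    changed, page_token = [], saved_page_token
    while page_token:
        resp = service.changes().list(pageToken=page_token,
                                      spaces="drive").execute()
        changed.extend(resp.get("changes", []))
        if "newStartPageToken" in resp:
            saved_page_token = resp["newStartPageToken"]  # persist for next time
        page_token = resp.get("nextPageToken")
    return changed, saved_page_token
```

Each change entry names a file, which the handler could then copy into GCS.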
I understand you are looking for a method to sync content from different services (NFS, disks, etc.) to GCS in order to have a backup there and make the data accessible to applications that can only access Cloud Storage buckets.
We don't have a Google-owned solution for this; however, we have a list of partners that offer proprietary solutions which might work for your use case.
I am looking at Google Drive API tutorials, and they tell you to store credentials.json in your working directory (e.g. https://developers.google.com/drive/api/v3/quickstart/python).
My goal is to make a script which regularly runs on my system and downloads files from Google Drive. My concern is: does storing the credentials.json file leave me open to security risks? If anyone gets access to this file, can they not use it to gain access to all my Google Drive data?
If so, then how should I store the credentials file in a secure manner?
The credentials.json file is used to create user credentials for your application. If someone got hold of it, they could pretend to be your application, request access from users, and then do whatever they wanted with those users' data. It is very important that you keep this file secure.
Note: if you are only accessing your own Google Drive account, and not accounts owned by other users, then you should consider looking into service accounts.
I'm trying to upload a file from a REST API (Google Cloud Endpoints) to GCS, but I keep getting errors. I don't know whether I'm going about it the wrong way or whether Google Endpoints simply can't upload a file.
I want my customers to be able to upload files to my project bucket.
I read "Endpoints doesn't accept the multipart/form-data encoding so you can't upload the image directly to Endpoints".
Mike answered me in this post, but I don't know how to implement that in my project.
I'm using this library (Python):
https://cloud.google.com/appengine/docs/python/googlecloudstorageclient/
If it's possible, what's the best way? Any example?
Thanks so much.
I think what Mike means in the previous post is that you should use the Blobstore API to upload the file to GCS, instead of sending it through Endpoints and then writing the data to the Blobstore yourself.
But that depends on what platform your client is. If it's a web-based client, you should use the ordinary approach Mike explained (an HTML form and an upload handler). But if it's an Android or other mobile client, you can use the GCS client library or the GCS REST API.
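A sketch of the web-based route: generate a one-time upload URL with the Blobstore API (files land in GCS when `gs_bucket_name` is given) and serve a plain HTML form that posts to it. The handler path and bucket name are assumptions:

```python
# Hedged sketch of the "ordinary way": Blobstore upload URL + HTML form.
# The /upload_done callback path and the bucket name are made up.

def upload_form(upload_url):
    """Render the minimal multipart form a Blobstore upload URL expects."""
    return (
        '<form action="%s" method="POST" enctype="multipart/form-data">'
        '<input type="file" name="file">'
        '<input type="submit" value="Upload">'
        "</form>" % upload_url
    )

def make_upload_url(bucket):
    """Ask App Engine for a one-time upload URL targeting a GCS bucket (untested)."""
    from google.appengine.ext import blobstore  # only available on App Engine
    return blobstore.create_upload_url("/upload_done", gs_bucket_name=bucket)
```

After the POST, App Engine stores the file and invokes the callback handler with a reference to the stored object, so the upload itself never passes through Endpoints.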
I would like to upload a file to Google Drive using Flask, which runs on App Engine. I got the Google Drive service (constructed with the Google Drive API) up and running and was able to upload a file from the server using files().insert(...).
Now I would like to implement uploading via HTML form. This gives me a FileStorage object. How should I proceed from this point in order to get the file inside Google Drive?
Please note that I'm uploading files that exceed App Engine's 5MB limit on request size.
Thank you for your suggestions.
Upload the form to blobstore with a callback into your frontend.
If the file is very large, you might need to go as far as a backend with resumable uploads done on chained backends, since a backend could also go down in the middle of an upload of a huge file.
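The resumable-upload part can be sketched as follows, assuming `google-api-python-client` (the Drive v3 `files().create` call is shown; v2 uses `files().insert`). Streaming the Werkzeug `FileStorage` in chunks is what lets you retry a chunk if something dies mid-upload:

```python
# Hedged sketch: stream a FileStorage (request.files['file'] in Flask) into
# Drive with a resumable upload. Names and chunk size are assumptions.
import mimetypes

def guess_mimetype(filename, default="application/octet-stream"):
    """Best-effort MIME type for the Drive file metadata."""
    return mimetypes.guess_type(filename)[0] or default

def upload_filestorage(service, fs):
    """fs is a Werkzeug FileStorage; service is a Drive client (untested sketch)."""
    from googleapiclient.http import MediaIoBaseUpload
    media = MediaIoBaseUpload(
        fs.stream,
        mimetype=guess_mimetype(fs.filename),
        chunksize=1024 * 1024,  # send in 1 MiB chunks
        resumable=True,
    )
    request = service.files().create(body={"name": fs.filename}, media_body=media)
    response = None
    while response is None:  # each next_chunk() can be retried on failure
        _, response = request.next_chunk()
    return response
```

Because each chunk is its own request, the 5MB request-size limit applies per chunk rather than to the whole file.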
I would like to create an app in Python on Google App Engine to handle file uploads and store them in the Blobstore.
However, the Blobstore currently requires blobstore.create_upload_url to create the URL that the file-upload form posts to. Since I am uploading from another server, my question is: is it possible to upload a file to the GAE Blobstore without using that dynamic URL from blobstore.create_upload_url?
FYI, it is OK for me to request an upload URL from the Python script before I upload from the other server, but this adds extra latency, which is not what I want. I also read about the so-called "file-like API" at http://code.google.com/intl/en/appengine/docs/python/blobstore/overview.html#Writing_Files_to_the_Blobstore but the documentation didn't seem to cover the uploading part.
Also, I previously tried to use the datastore for file uploads, but the maximum size is 1MB, which is not enough for my case. Please kindly advise, thanks.
There are exactly two ways to write files to the Blobstore: 1) using create_upload_url and posting a form with a file attachment to it, or 2) writing to the Blobstore directly using the file-like API (with a solution here for large files).
If you want a remote server to be able to upload, you have the same two choices:
1) The remote server first requests a URL. You have a handler whose only job is to return such a URL. With this URL, the remote server crafts a properly-encoded POST message and posts it to that URL.
2) You send the file data as an HTTP parameter to a handler on your App Engine server, which uses the Blobstore write API to write the file directly. Request size is limited to 32MB, so the maximum file size you can write with this method is slightly under that.
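From the remote server's side, option 1 can be sketched as two requests: fetch the one-time URL, then POST the file as multipart form data. The `/get_upload_url` endpoint name is an assumption, and the network calls are untested:

```python
# Hedged sketch of option 1 from the remote server's point of view.
# The endpoint path is made up; the handler behind it would just return
# blobstore.create_upload_url(...) as plain text.

def upload_url_endpoint(app_base_url):
    """URL of the (hypothetical) handler that hands out one-time upload URLs."""
    return app_base_url.rstrip("/") + "/get_upload_url"

def push_file(app_base_url, local_path):
    """Two requests: one for the URL, one for the upload (untested sketch)."""
    import requests  # assumed available on the remote server
    upload_url = requests.get(upload_url_endpoint(app_base_url)).text.strip()
    with open(local_path, "rb") as fh:
        resp = requests.post(upload_url, files={"file": fh})
    resp.raise_for_status()
    return resp
```

The extra round trip is the latency cost the question mentions; option 2 avoids it at the price of the 32MB request cap.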