I am not hosting the project on git or anything like that. I'm trying to use PyDrive, and it is not letting me load service account credentials from Heroku environment variables. If I put the JSON credentials file with my project, is there any chance someone could find it in my Heroku project? This project is computer-to-computer, basically generating Word docs in a Google Team Drive. There is no web interface for it.
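For reference, what I'm trying to do is roughly this (a sketch only; the config var name is just an example, and it assumes oauth2client, which PyDrive builds on): store the whole JSON key in a Heroku config var and build the credentials from the parsed dict instead of a file.

import json
import os

from oauth2client.service_account import ServiceAccountCredentials
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive

# The full contents of the service-account JSON key live in a Heroku config var
# (the variable name here is just an example).
keyfile_dict = json.loads(os.environ["GOOGLE_SERVICE_ACCOUNT_JSON"])
scopes = ["https://www.googleapis.com/auth/drive"]

gauth = GoogleAuth()
gauth.credentials = ServiceAccountCredentials.from_json_keyfile_dict(keyfile_dict, scopes)
drive = GoogleDrive(gauth)

That way the key would never have to be committed alongside the project files.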
Related
I'm creating a CLI tool to move files around in a user's Google Drive space. I'm using Python and the Google Drive API Python SDK to do that, and I've created this repo.
Now I have to run this tool every midnight to move files from one folder to another, with input from the user. Locally I can create credentials.json, retrieve credentials from it, generate the token.json file, and use it to authenticate to the Drive API. But in a public CI environment, I would save my credentials in a GitHub Secret and pass them to my tool at runtime using an option, roughly as sketched below.
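A rough sketch of what I have in mind (the secret/variable name and scope are placeholders; this assumes google-auth and google-api-python-client rather than any particular wrapper):

import json
import os

from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

# The contents of token.json are stored in a GitHub Secret and exposed to the
# CI job as an environment variable (or passed in via a CLI option).
token_info = json.loads(os.environ["DRIVE_TOKEN_JSON"])
creds = Credentials.from_authorized_user_info(
    token_info, scopes=["https://www.googleapis.com/auth/drive"]
)
service = build("drive", "v3", credentials=creds)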
Can I do that?
Are there any security issues with it?
I would like to publish this tool in some forums (like the Python subreddit) to get suggestions and improvements, but before doing that, I'd like to make this move-files function as complete as possible.
I am using the Python Google API Client by following this tutorial. I was able to run the API locally by downloading the credential JSON and running export GOOGLE_APPLICATION_CREDENTIALS="[PATH]". Now I would like to deploy my application that uses the API to App Engine. How can I do so? Do I have to somehow download the JSON onto my App Engine machine and run the export via app.yaml?
Place the JSON file inside your application directory and it will be deployed to GAE as part of your application. Then you can reference it in your app with just a file path (relative to your app.yaml file). Easiest is to put it right beside your app.yaml file and reference it with just the filename, no path.
The exact way to reference it may depend on the GAE environment and runtime your app uses (which you didn't mention); exporting GOOGLE_APPLICATION_CREDENTIALS via app.yaml may be right.
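For example, with a Python standard environment app it could look something like this (a sketch only; the runtime and filename are placeholders):

runtime: python39
env_variables:
  GOOGLE_APPLICATION_CREDENTIALS: "service-account.json"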
You may want to check your app's handlers and make sure you don't accidentally serve the JSON file in response to a carefully crafted external request - you probably don't want that file exposed :)
I am pretty new to Google App Engine, and though I have hacked around with a lot of languages, I am finding the Google documentation a little overwhelming. I have successfully launched a static site and successfully run some Python code from the console, but I have not run any Python from my static site.
I am a small company trying to set up a Google App Engine static/dynamic website that I only want to expose to my G Suite users.
I have some Python code I want to run on App Engine, which will download a file from Google Drive / a Team Drive, process the file, create a new file from the results, and then upload the resulting file to the same folder.
I may also at a later date have this static/dynamic website interface with Cloud SQL (MySQL) or an external database.
My questions:
1. What authentication method should I use to expose this website only to my G Suite users?
2. Though I have worked through some of the Drive API examples, what's the best and easiest method of passing Drive text files to my Python code? (Though I have hacked around with Python a lot in the past, the HTML and Python combination perplexes me.)
Thanks!
You would use OAuth 2.0 to acquire a token which you can then use to interact with that user's files through the Drive API (see "About Authorization" as well as this quickstart Python example).
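The installed-app quickstart pattern is roughly this (a sketch, assuming the google-auth-oauthlib and google-api-python-client packages and a downloaded credentials.json OAuth client file):

from google_auth_oauthlib.flow import InstalledAppFlow
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/drive"]

# Run the consent flow in a browser and obtain the user's token.
flow = InstalledAppFlow.from_client_secrets_file("credentials.json", SCOPES)
creds = flow.run_local_server(port=0)

# Use the token to call the Drive API on behalf of that user.
service = build("drive", "v3", credentials=creds)
results = service.files().list(pageSize=10, fields="files(id, name)").execute()
print(results.get("files", []))

For a web app served from App Engine you would use the web server flow instead of run_local_server, but the idea is the same: the token you obtain is what authorizes the Drive calls.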
I'm having trouble submitting an Apache Beam example from a local machine to our cloud platform.
Using gcloud auth list I can see that the correct account is currently active. I can use gsutil and the web client to interact with the file system. I can use the Cloud Shell to run pipelines through the Python REPL.
But when I try to run the Python wordcount example, I get the following error:
IOError: Could not upload to GCS path gs://my_bucket/tmp: access denied.
Please verify that credentials are valid and that you have write access
to the specified path.
Is there something I am missing with regards to the credentials?
Here are my two cents after spending the whole morning on the issue.
You should make sure that you log in with gcloud on your local machine; however, pay attention to the warning message returned by gcloud auth login:
WARNING: `gcloud auth login` no longer writes application default credentials.
Those application default credentials are what the Python code needs in order to authenticate properly.
Solution is rather simple, just use:
gcloud auth application-default login
This will write a credentials file under ~/.config/gcloud/application_default_credentials.json, which is used for authentication in the local development environment.
You'll need to create a GCS bucket and folder for your project, then specify that as the pipeline parameter instead of using the default value.
https://cloud.google.com/storage/docs/creating-buckets
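For example, the wordcount example can be pointed at your own bucket with pipeline options roughly like this (project, region and bucket names are placeholders):

python -m apache_beam.examples.wordcount \
  --runner DataflowRunner \
  --project YOUR_PROJECT \
  --region us-central1 \
  --temp_location gs://YOUR_BUCKET/tmp/ \
  --output gs://YOUR_BUCKET/results/output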
Same error here; it was solved after creating the bucket:
gsutil mb gs://<bucket-name-from-the-error>/
I have faced the same issue where it throws up the IO error. Things that helped me here (in no particular order):
Checking the name of the bucket. This step helped me a lot. Bucket names are global. If you make a mistake in the bucket name while accessing your bucket, you might be accessing a bucket that you have NOT created and don't have permission to use.
Checking the service account key file that you have set:
export GOOGLE_APPLICATION_CREDENTIALS=yourkeyfile.json
Activating the service account for the key file you have plugged in -
gcloud auth activate-service-account --key-file=your-key-file.json
Also, listing out the auth accounts available might help you too.
gcloud auth list
One solution might work for you. It did for me.
In the Cloud Shell window, click on "Launch Code Editor" (the pencil icon). The editor worked in Chrome (not sure about Firefox); it did not work in the Brave browser.
Now, browse to your code file (.py or .java) in the launched code editor on GCP, locate the pre-defined PROJECT and BUCKET names, replace them with your own project and bucket names, and save the file.
Now execute the file and it should work.
Python doesn't use gcloud auth to authenticate; instead it uses the environment variable GOOGLE_APPLICATION_CREDENTIALS. So before you run the Python command to launch the Dataflow job, you will need to set that environment variable:
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/key"
More info on setting up the environment variable: https://cloud.google.com/docs/authentication/getting-started#setting_the_environment_variable
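A quick way to sanity-check that the key is being picked up (a sketch, assuming the google-auth package):

import google.auth

# Loads the application default credentials, i.e. the key file pointed to by
# GOOGLE_APPLICATION_CREDENTIALS, and reports the project it belongs to.
credentials, project_id = google.auth.default()
print(project_id)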
Then you'll have to make sure that the account you set up has the necessary permissions in your GCP project.
Permissions and service accounts:
User service account or user account: it needs the Dataflow Admin role at the project level and to be able to act as the worker service account (source: https://cloud.google.com/dataflow/docs/concepts/security-and-permissions#worker_service_account).
Worker service account: there will be one worker service account per Dataflow pipeline. This account needs the Dataflow Worker role at the project level plus the necessary permissions on the resources accessed by the Dataflow pipeline (an example gcloud command is sketched after this list; source: https://cloud.google.com/dataflow/docs/concepts/security-and-permissions#worker_service_account). Example: if the Dataflow pipeline's input is a Pub/Sub topic and its output is a BigQuery table, the worker service account will need read access to the topic as well as write permission to the BQ table.
Dataflow service account: this is the account that gets created automatically when you enable the Dataflow API in a project. It automatically gets the Dataflow Service Agent role at the project level (source: https://cloud.google.com/dataflow/docs/concepts/security-and-permissions#service_account).
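For example, granting the Dataflow Worker role to a worker service account could look roughly like this (the project and service account names are placeholders):

gcloud projects add-iam-policy-binding YOUR_PROJECT \
  --member="serviceAccount:my-worker-sa@YOUR_PROJECT.iam.gserviceaccount.com" \
  --role="roles/dataflow.worker"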
I deployed a webapp2 Python application on GAE. Is there any way I can explore the source code or make changes to the project files from the GAE console? Is it possible to update only a single .py file on an already deployed app, rather than deploying the whole project again?
I think you're looking for this:
https://console.cloud.google.com/code/develop
I pushed my code to Google Cloud Platform with git, and I'm able to change text files directly online.
The doc is here:
https://cloud.google.com/source-repositories/