I'm trying to mimic the flow for authenticating with GCP using the gcloud CLI in a Go project. In this case, I can't just shell out to gcloud using the os package because I have to assume it's not installed on the system. I also need to avoid having the user go in and set an OAuth2 client_id and client_secret in the developer console since the gcloud CLI doesn't seem to require it.
I've tried looking through the Python source for gcloud, and I can sort of see what it's doing, but it's difficult to follow: I'm not very experienced in Python, and the authentication code has many layers of abstraction that are hard to break through.
I can see the GCP URLs (the ones that you open in the browser when doing gcloud auth login) have a client_id but they don't have a client_secret. They also look like they do PKCE but I'm not super familiar with how that works. I also can't figure out exactly where they get the client_id from. If it's somehow bundled with the gcloud CLI I can't find it anywhere.
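For reference, here's my rough understanding of how the PKCE parameters in those URLs are constructed (a sketch in Python, since that's the source I was reading; the client_id, redirect port, and scope below are placeholders, not gcloud's actual values):

import base64
import hashlib
import os
import urllib.parse

AUTH_ENDPOINT = "https://accounts.google.com/o/oauth2/v2/auth"
CLIENT_ID = "1234567890.apps.googleusercontent.com"   # placeholder, not gcloud's real client ID
REDIRECT_URI = "http://localhost:8085/"               # loopback redirect; port is a placeholder
SCOPE = "https://www.googleapis.com/auth/cloud-platform"

# PKCE: a random code_verifier and its S256 code_challenge (base64url, no padding).
code_verifier = base64.urlsafe_b64encode(os.urandom(32)).rstrip(b"=").decode()
code_challenge = base64.urlsafe_b64encode(
    hashlib.sha256(code_verifier.encode()).digest()
).rstrip(b"=").decode()

# The URL opened in the browser carries the challenge but never a client_secret.
auth_url = AUTH_ENDPOINT + "?" + urllib.parse.urlencode({
    "response_type": "code",
    "client_id": CLIENT_ID,
    "redirect_uri": REDIRECT_URI,
    "scope": SCOPE,
    "code_challenge": code_challenge,
    "code_challenge_method": "S256",
    "access_type": "offline",
})
print(auth_url)

# After the browser redirects back with ?code=..., the authorization code plus the
# original code_verifier are POSTed to https://oauth2.googleapis.com/token
# (grant_type=authorization_code) to obtain the access and refresh tokens.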
Related
Currently I'm using the Google Python SDK to automate data collection for resources deployed in GCP. However, when parsing through the data I often come across URLs (https://googleapis.com/compute/.....). To access these, I'm currently using the access token obtained with the gcloud command gcloud auth print-access-token.
I'm looking to automate this: every time the token expires, I run the gcloud command and manually replace the token in my Python script.
Please assist.
Thank you
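In case it helps, here is a minimal sketch of fetching that access token directly from Python with the google-auth library, assuming Application Default Credentials are already configured on the machine (e.g. via gcloud auth application-default login or GOOGLE_APPLICATION_CREDENTIALS):

import google.auth
from google.auth.transport.requests import Request

# Load Application Default Credentials for the cloud-platform scope.
credentials, project = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"]
)
credentials.refresh(Request())  # obtains/renews the short-lived access token
print(credentials.token)        # roughly what `gcloud auth print-access-token` prints

# The token can then be sent as a Bearer token when requesting those googleapis.com URLs,
# e.g. requests.get(url, headers={"Authorization": "Bearer " + credentials.token}).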
I have a Python script that accesses Google Cloud Platform. I also set up a service account; I can request and save its JSON key file through the Cloud Console web page after logging in with my Google account, and setting GOOGLE_APPLICATION_CREDENTIALS to that JSON file gives the Python script access.
Now I want to share the script with others. I have a requirements.txt for the Python script that installs the gcloud API library, but I don't want to force others to install the gcloud SDK, and I don't want to share that JSON file with them. I would like to let others run the script, and if the JSON credential file is not found, have the script ask them to:
log in to gcloud
generate and save a JSON credential, e.g. to a default directory
set GOOGLE_APPLICATION_CREDENTIALS to that JSON file
Ideally, all of these steps would be done without a browser. Is there a way to do this in Python? I did some research and googling but had no luck.
I believe I could do this anyway by having Python invoke curl or use requests, but I wonder whether there is a simpler way.
UPDATE
Thanks for the comments, but I just want to release a single Python script file to others.
I read up on service accounts and workload identity federation; I don't have the infrastructure to set up an identity provider. Based on my reading and the comments, I believe that if I want to use something like OAuth, I need to register my script as a client with Google. I am not sure whether this is feasible or considered good practice...
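For what it's worth, if I did register the script as an OAuth client, my understanding is that the installed-app flow with google-auth-oauthlib would look roughly like the sketch below (client_secrets.json, the scope, and the token path are placeholders, and this flow does open a browser once per user):

import os
from google.oauth2.credentials import Credentials
from google_auth_oauthlib.flow import InstalledAppFlow

SCOPES = ["https://www.googleapis.com/auth/cloud-platform"]       # placeholder scope
TOKEN_PATH = os.path.expanduser("~/.config/myscript/token.json")  # hypothetical default location

if os.path.exists(TOKEN_PATH):
    # Reuse the credential saved on a previous run.
    creds = Credentials.from_authorized_user_file(TOKEN_PATH, SCOPES)
else:
    # client_secrets.json would come from registering the script as a "Desktop app" client.
    flow = InstalledAppFlow.from_client_secrets_file("client_secrets.json", SCOPES)
    creds = flow.run_local_server(port=0)  # opens a browser once for the user to log in
    os.makedirs(os.path.dirname(TOKEN_PATH), exist_ok=True)
    with open(TOKEN_PATH, "w") as f:
        f.write(creds.to_json())           # save for later runs instead of sharing a JSON key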
I'm having trouble submitting an Apache Beam example from a local machine to our cloud platform.
Using gcloud auth list I can see that the correct account is currently active. I can use gsutil and the web client to interact with the file system. I can use the cloud shell to run pipelines through the python REPL.
But when I try to run the Python wordcount example, I get the following error:
IOError: Could not upload to GCS path gs://my_bucket/tmp: access denied.
Please verify that credentials are valid and that you have write access
to the specified path.
Is there something I am missing with regards to the credentials?
Here are my two cents after spending the whole morning on the issue.
You should make sure that you log in with gcloud on your local machine; however, pay attention to the warning message returned by gcloud auth login:
WARNING: `gcloud auth login` no longer writes application default credentials.
These application default credentials are what the Python code needs in order to authenticate properly.
The solution is rather simple; just run:
gcloud auth application-default login
This writes a credentials file to ~/.config/gcloud/application_default_credentials.json, which is used for authentication in the local development environment.
You'll need to create a GCS bucket and folder for your project, then specify that as the pipeline parameter instead of using the default value.
https://cloud.google.com/storage/docs/creating-buckets
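For example, the wordcount example can be pointed at your own bucket with pipeline parameters along these lines (the project ID, region, and bucket name below are placeholders):

python -m apache_beam.examples.wordcount \
    --runner DataflowRunner \
    --project your-project-id \
    --region us-central1 \
    --temp_location gs://your-bucket/tmp \
    --staging_location gs://your-bucket/staging \
    --output gs://your-bucket/results/output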
I had the same error; it was solved after creating a bucket:
gsutil mb gs://<bucket-name-from-the-error>/
I faced the same issue with the IOError. Things that helped me (in no particular order):
Checking the name of the bucket. This step helped me a lot. Bucket names are global, so if you make a mistake in the bucket name while accessing your bucket, you might be accessing a bucket that you have NOT created and don't have permission to use.
Checking the service account key file that you have set in the environment:
export GOOGLE_APPLICATION_CREDENTIALS=yourkeyfile.json
Activating the service account for the key file you have plugged in -
gcloud auth activate-service-account --key-file=your-key-file.json
Also, listing the available auth accounts might help:
gcloud auth list
One solution that might work for you (it did for me):
In the Cloud Shell window, click on "Launch code Editor" (the pencil icon). The editor works in Chrome (not sure about Firefox); it did not work in the Brave browser.
Now browse to your code file (.py or .java) in the launched code editor, locate the predefined PROJECT and BUCKET names, replace them with your own project and bucket names, and save the file.
Then execute the file; it should work.
Python doesn't use gcloud auth to authenticate; it uses the environment variable GOOGLE_APPLICATION_CREDENTIALS. So before you run the Python command to launch the Dataflow job, you will need to set that environment variable:
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/key"
More info on setting up the environment variable: https://cloud.google.com/docs/authentication/getting-started#setting_the_environment_variable
Then you'll have to make sure that the account you set up has the necessary permissions in your GCP project.
Permissions and service accounts:
User service account or user account: it needs the Dataflow Admin role at the project level and must be able to act as the worker service account (source: https://cloud.google.com/dataflow/docs/concepts/security-and-permissions#worker_service_account).
Worker service account: there is one worker service account per Dataflow pipeline. This account needs the Dataflow Worker role at the project level plus the necessary permissions on the resources accessed by the Dataflow pipeline (source: https://cloud.google.com/dataflow/docs/concepts/security-and-permissions#worker_service_account).
Example: if the Dataflow pipeline's input is a Pub/Sub topic and its output is a BigQuery table, the worker service account needs read access to the topic as well as write permission on the BQ table.
Dataflow service account: this is the account that gets created automatically when you enable the Dataflow API in a project. It automatically gets the Dataflow Service Agent role at the project level (source: https://cloud.google.com/dataflow/docs/concepts/security-and-permissions#service_account).
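For illustration, the role bindings above can be granted with gcloud commands along these lines (the project ID, user, and service-account email are placeholders):

gcloud projects add-iam-policy-binding your-project-id \
    --member="user:you@example.com" \
    --role="roles/dataflow.admin"
gcloud projects add-iam-policy-binding your-project-id \
    --member="serviceAccount:worker-sa@your-project-id.iam.gserviceaccount.com" \
    --role="roles/dataflow.worker"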
My reading of Cognito is that it can be used in place of a local Django admin database to authenticate users of a website. However I am not finding any soup-to-nuts examples of a basic "Hello, World" app with a login screen that goes through Cognito. I would very much appreciate it if someone could post an article that shows, step-by-step, how to create a Hello World Django app and a Cognito user pool, and then how to replace the default authentication in Django with a call to AWS Cognito.
In particular I need to know how to gather the information from the Cognito admin site that is needed to set up a call to Cognito API to authenticate a user.
There are two cases to consider: app users logging in to the app, and admins logging in to the Django admin URL of the site. I assume I would want to use Cognito for both cases; otherwise I am leaving a potential hole where the admin URL uses a weaker login technology.
Current answers on the AWS forums and Stack Exchange either say:
(1) It is a waste of time to use Cognito for authenticating a website; it is only for access to AWS resources.
(2) It is not a waste of time.
(3) https://github.com/capless/warrant and https://github.com/metametricsinc/django-warrant are two possible solutions from the AWS forums.
I am about to give up. I have gone as far as creating a sample Cognito user pool and user groups, and scouring the web for proper examples of this use case. (None found, or I wouldn't be writing.)
If you are reading this, you probably googled "aws cognito django" xD.
I just want to share what I did in order to get this thing to work:
Use django-warrant, a great AWS Cognito wrapper package.
Make sure you understand your current User model structure. If you use a custom user model, don't forget to map it using the COGNITO_ATTR_MAPPING setting.
Change your authentication to support third-party connectivity. When you receive a Cognito token from the client, convert it into your own token using OAuth/JWT/sessions.
Rethink your login/register process. Do you want different registration? The django-warrant package supports it...
At the end of the day, this is a GREAT solution for fast authentication.
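As a rough illustration, the kind of settings.py wiring involved looks something like the sketch below; the setting names are based on my reading of the django-warrant README and may differ between versions, and the pool and app client IDs are placeholders:

# settings.py (sketch): wiring django-warrant's Cognito backend into Django auth.
# Setting names are assumptions from the django-warrant README; check the version you install.
AUTHENTICATION_BACKENDS = [
    'django_warrant.backend.CognitoBackend',
    'django.contrib.auth.backends.ModelBackend',  # keep as a fallback, e.g. for the admin
]

COGNITO_USER_POOL_ID = 'us-east-1_xxxxxxxxx'   # placeholder user pool ID
COGNITO_APP_ID = 'your-app-client-id'          # placeholder app client created in that pool

# Map Cognito attributes onto your (custom) user model fields.
COGNITO_ATTR_MAPPING = {
    'email': 'email',
    'given_name': 'first_name',
    'family_name': 'last_name',
}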
To add to the accepted answer, there is a simple but very important extra step that I found necessary in order to use django-warrant with Django 2.0:
The conditional in backend.py in the root package needs to be changed from:
if DJANGO_VERSION[1] > 10:
to:
if DJANGO_VERSION[1] > 10 or DJANGO_VERSION[0] > 1:
Using django-warrant with Zappa and AWS Lambda:
The project I am working on also uses Zappa to enable the serverless deployment of my Django app to AWS Lambda. Although the above code fixed django-warrant for me when testing locally, after deploying the app to the Lambda environment, I had another significant issue stemming from some of django-warrant's supporting packages - primarily related to python-jose-pycryptodome, which django-warrant uses during the authentication process. The issue showed itself in the form of a FileNotFound error related to the Crypto._SHA256 file. This error appears to have been caused because pycryptodome expects different files to be available in the Crypto package at runtime on Windows (which I am developing on) and Linux (the Lambda environment) respectively. I ended up solving this issue by downloading the Linux version of pycryptodome and merging its Crypto package with the Crypto package from the Windows version.
TLDR: If you want to use django-warrant with AWS Lambda and you are developing on a Windows machine, make sure to download the Linux version of pycryptodome and merge its Crypto package with the same from the Windows version.
Note: The versions of pycryptodome and python-jose (not python-jose-cryptodome) that I ended up using to achieve the above were 3.7.2 and 3.0.1 respectively.
I have an application that needs to log into a single Drive account and perform operations on the files automatically using a cron job. Initially I tried to use the domain administrator login to do this; however, I am unable to do any testing with the domain administrator, as it seems that you cannot use the test server with a domain administrator account, which makes testing my application a bit impossible!
As such, I started looking at storing arbitrary OAuth tokens (especially the refresh token) to log into this account automatically after the initial setup. However, all of the APIs and documentation assume that multiple individual users are logging in manually, and I cannot find functionality in the OAuth APIs that allows or accounts for logging into anything but the currently logged-in user.
How can I achieve this in a way that I can test my code on a test domain? Can I do it without writing my own oauth library and doing the oauth requests by hand? Or is there a way to get the domain administrator authorization to work on a local test server?
You can load the credentials for a single account into your datastore using the Remote API, which can be enabled in your app.yaml file:
builtins:
- remote_api: on
By executing
remote_api_shell.py -s your_app_id.appspot.com
from the command line you'll have access to a shell which can execute in the environment of your application. Before doing this, make sure you have your application deployed (more on local development below) and make sure the source for google-api-python-client is included by pip-installing it and running enable-app-engine-project /path/to/project to add it to your App Engine project.
Once you are in the remote shell (after executing the remote command above), perform the following:
from oauth2client.appengine import CredentialsModel
from oauth2client.appengine import StorageByKeyName
from oauth2client.client import OAuth2WebServerFlow
from oauth2client.tools import run

KEY_NAME = 'your_choice_here'
CREDENTIALS_PROPERTY_NAME = 'credentials'
SCOPE = 'https://www.googleapis.com/auth/drive'

# Datastore-backed storage for the single account's credentials.
storage = StorageByKeyName(CredentialsModel, KEY_NAME, CREDENTIALS_PROPERTY_NAME)

# Standard web-server OAuth flow for the Drive scope.
flow = OAuth2WebServerFlow(
    client_id=YOUR_CLIENT_ID,
    client_secret=YOUR_CLIENT_SECRET,
    scope=SCOPE)

# Opens a browser, completes the flow and saves the credentials via `storage`.
run(flow, storage)
NOTE: If you have not deployed your application with the google-api-python-client code, this will fail, because your application won't know how to make the same imports you made on your local machine, e.g. from oauth2client.appengine import CredentialsModel.
When run is called, your web browser will open and prompt you to accept the OAuth access for the client you've specified with YOUR_CLIENT_ID and YOUR_CLIENT_SECRET. After it completes successfully, it will save an instance of CredentialsModel in the datastore of the deployed application your_app_id.appspot.com, stored under the KEY_NAME you provided.
After doing this, any caller in your application -- including your cron jobs -- can access those credentials by executing
storage = StorageByKeyName(CredentialsModel, KEY_NAME, CREDENTIALS_PROPERTY_NAME)
credentials = storage.get()
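From there, a rough sketch (same oauth2client era as the code above) of using those stored credentials in a cron handler to call the Drive API; the file listing is just an example:

import httplib2
from apiclient.discovery import build
from oauth2client.appengine import CredentialsModel, StorageByKeyName

KEY_NAME = 'your_choice_here'                 # same values used when storing above
CREDENTIALS_PROPERTY_NAME = 'credentials'

# Fetch the stored credentials and attach them to an HTTP client.
storage = StorageByKeyName(CredentialsModel, KEY_NAME, CREDENTIALS_PROPERTY_NAME)
credentials = storage.get()
http = credentials.authorize(httplib2.Http())

# Build a Drive client and perform an operation, e.g. list files in the account.
drive = build('drive', 'v2', http=http)
files = drive.files().list().execute()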
Local Development:
If you'd like to test this locally, you can run your application locally via
dev_appserver.py --port=PORT /path/to/project
and you can execute the same commands using the remote API shell and pointing it at your local application:
remote_api_shell.py -s localhost:PORT
Once there, you can execute the same code you ran in the remote API shell, and similarly an instance of CredentialsModel will be stored in the datastore of your local development server.
As above, if you don't have the correct google-api-python-client modules included, this will fail.
EDIT: This used to recommend using the Interactive Console at:
http://localhost:PORT/_ah/admin/interactive
but it was discovered that this doesn't work because socket does not work properly in the App Engine local development sandbox.
This article explains how to interact with Google Drive on behalf of users of your domain by having the Domain Administrator delegate domain-wide authority to a Service Account.
This other article explains how to interact with a Drive owned by your application using a Service Account.
Note that both methods use JWT-based Service Accounts, which currently need a modified version of the google-api-python-client in order to work on App Engine.
Unlike the Google App Engine service account, JWT-based Service Accounts should work with the development server.
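For reference, the service-account variant looks roughly like the sketch below (oauth2client era; the service account email, key file, and impersonated user are placeholders, and the exact constructor arguments may differ between library versions):

import httplib2
from apiclient.discovery import build
from oauth2client.client import SignedJwtAssertionCredentials

# Private key downloaded for the service account (placeholder filename).
with open('privatekey.p12', 'rb') as f:
    private_key = f.read()

# A JWT-based service account credential; `sub` asserts domain-wide delegation
# to act as a specific user of the domain.
credentials = SignedJwtAssertionCredentials(
    'service-account@your-project.iam.gserviceaccount.com',  # placeholder email
    private_key,
    scope='https://www.googleapis.com/auth/drive',
    sub='user@yourdomain.com')                               # placeholder impersonated user

http = credentials.authorize(httplib2.Http())
drive = build('drive', 'v2', http=http)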