How to get credentials in Google AppEngine Python37 - python

I started a new app in App Engine Python 3.7 standard.
I am trying to get credentials using the following snippet, but it's failing.
Has anybody been able to get credentials in GAE standard Python 3.7?
Input:
from google.auth import app_engine
credentials = app_engine.Credentials()
Output:
The App Engine APIs are not available

When using App Engine Standard with Python 3.7, none of the google.xxx libraries are available. You have to build your own, or use standard Python libraries. This goes for auth, users, images, search, mail, taskqueue, memcache, urlfetch, deferred, etc., and even the ndb datastore interface.
For datastore, you use google-cloud-datastore or some third-party library.
For the others, you use a standard Python library, e.g. google.auth => rauth, google.appengine.api.memcache => python-memcached
Read more here: https://cloud.google.com/appengine/docs/standard/python3/python-differences
That page recommends Google Identity Platform or Firebase Authentication to do authorization.
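For example, a minimal sketch of swapping google.appengine.api.memcache for the python-memcached package (this assumes you run your own memcached instance or a Memorystore for Memcached endpoint; the host and port below are placeholders):
import memcache

# Connect to a memcached instance you manage (placeholder host/port).
mc = memcache.Client(['10.0.0.1:11211'])

# Same basic get/set semantics as the old bundled memcache API.
mc.set('greeting', 'hello', time=60)  # expire after 60 seconds
value = mc.get('greeting')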

Related

Google App engine user authentication (python 3.7)

I am looking for a simple method to authenticate users (using a Google account) into a Python 3.7 app running on GAE.
In the past (using Python 2.7) I have used
https://cloud.google.com/appengine/docs/standard/python/users/loginurls
and the logic was something like:
def my_end_point():
    if user is not authenticated:
        redirect_to_google_login()
    else:
        user_email = get_user_email_from_auth_service()
        do_the_endpoint_logic(user_email)
I would like to keep this simple logic under GAE Python 3.7.
I have found this and I was wondering if someone has wrapped this logic in a decorator...
any directions?
The App Engine standard environment Python 3.7 runtime does not support the Users API service. Your options are:
Firebase Authentication
Google Sign-In
OAuth 2.0 and OpenID Connect
via https://pypi.org/project/Flask-OAuthlib
via https://authlib.org
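For the Google Sign-In option, a minimal sketch of a decorator using the google-auth package to verify an ID token sent by the client (the Authorization header convention, the decorator name, and OAUTH_CLIENT_ID are assumptions; adapt them to however your front end sends the token):
from functools import wraps
from flask import request, abort
from google.oauth2 import id_token
from google.auth.transport import requests as google_requests

OAUTH_CLIENT_ID = 'your-oauth-client-id.apps.googleusercontent.com'  # placeholder

def login_required(handler):
    @wraps(handler)
    def wrapper(*args, **kwargs):
        token = request.headers.get('Authorization', '').replace('Bearer ', '')
        try:
            # Verifies signature, expiry and audience of the Google ID token.
            idinfo = id_token.verify_oauth2_token(
                token, google_requests.Request(), OAUTH_CLIENT_ID)
        except ValueError:
            abort(401)
        return handler(*args, user_email=idinfo['email'], **kwargs)
    return wrapper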
Nowadays (January 2022), the Python 3.x runtime does support the Users API by means of the App Engine bundled services. There is a very detailed and straightforward documentation page on how to enable this in App Engine.
Several other options for user authentication are described here.
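If you go the bundled-services route, the setup is roughly the following (a sketch based on that documentation; it assumes a Flask app, appengine-python-standard in requirements.txt, and app_engine_apis: true in app.yaml):
from flask import Flask, redirect
from google.appengine.api import users, wrap_wsgi_app

app = Flask(__name__)
# Route App Engine API calls through the bundled-services middleware.
app.wsgi_app = wrap_wsgi_app(app.wsgi_app)

@app.route('/my-endpoint')
def my_end_point():
    user = users.get_current_user()
    if not user:
        return redirect(users.create_login_url('/my-endpoint'))
    return 'Hello, {}'.format(user.email())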

Integrating MQTT with GCP using IOT adapter and google pub/sub api in python

Integration with Cloud Pub/Sub APIs from App Engine Standard
I am working on developing a Google app engine app in standard Python environment. For some portions of the code, I need to integrate with Google Cloud pub/sub APIs.
As mentioned here, Pub/Sub can only be integrated in the App Engine flexible environment (BTW it is also only in alpha). Can someone please describe how to integrate with Pub/Sub in the App Engine Standard environment?
My use case description
I am trying to integrate MQTT with Google App Engine by using the Agosto IoT broker. I will be using MQTT for the clients (currently mobile platforms), and on the server side I plan to use Pub/Sub for receiving/sending the messages and saving relevant data to the database.
You might want to try the new Google Cloud IoT Core product (full disclosure: I worked on it) instead of hosting MQTT on App Engine. Cloud IoT Core lets you connect to a Google-provided MQTT bridge that will put your data into Google Cloud Pub/Sub. You can use Google Cloud Dataflow to move your data from Pub/Sub to your data warehouse for analytics, or you can use your own database as the output from Dataflow.
The connection details for communicating with the Google Cloud IoT Core MQTT bridge are discussed in detail in the documentation, but the important connection properties you will need to be aware of are the hostname (mqtt.googleapis.com), the port (8883 or 443), and the MQTT password / client ID, which are based on the devices you've provisioned for the service.
Your actual MQTT client will need to be chosen depending on which programming language you're using to access the MQTT bridge. If you're connecting from Android, you could start from the Java MQTT client sample and would probably end up with something like the Android Things Cloud IoT sensor hub connector from the AndroidThings team.
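From Python, a minimal connection sketch with the paho-mqtt and PyJWT packages might look like this (the project, region, registry and device IDs and the key path are placeholders; see the Cloud IoT Core documentation for the authoritative connection details):
import datetime
import ssl
import jwt  # PyJWT
import paho.mqtt.client as mqtt

PROJECT_ID = 'my-project'          # placeholders
REGION = 'us-central1'
REGISTRY_ID = 'my-registry'
DEVICE_ID = 'my-device'
PRIVATE_KEY_FILE = 'rsa_private.pem'

def create_jwt():
    # The MQTT password is a short-lived JWT signed with the device's private key.
    now = datetime.datetime.utcnow()
    claims = {'iat': now, 'exp': now + datetime.timedelta(minutes=60), 'aud': PROJECT_ID}
    with open(PRIVATE_KEY_FILE, 'r') as f:
        return jwt.encode(claims, f.read(), algorithm='RS256')

client_id = 'projects/{}/locations/{}/registries/{}/devices/{}'.format(
    PROJECT_ID, REGION, REGISTRY_ID, DEVICE_ID)
client = mqtt.Client(client_id=client_id)
# The username is ignored by the bridge; the JWT goes in the password field.
client.username_pw_set(username='unused', password=create_jwt())
client.tls_set(tls_version=ssl.PROTOCOL_TLSv1_2)
client.connect('mqtt.googleapis.com', 8883)
client.publish('/devices/{}/events'.format(DEVICE_ID), 'hello', qos=1)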
TL;DR - App Engine standard does not support the newer Google Cloud Client libraries. You will instead need to use the older Google Cloud API Client libraries to communicate with Cloud Pub/Sub.
Cloud API Client libraries (older) vs Google Cloud Client libraries (newer)
The Cloud Pub/Sub client library documentation you're pointing to advises you to use the older Google API Client libraries (which are supported in the App Engine standard environment) instead of the Google Cloud Client libraries (which are supported in the App Engine flexible environment but not in standard).
The client libraries are explained in detail here.
Google API Client libraries for Cloud Pub/Sub
Here is the full list of APIs supported by the Google API Client libraries. The Cloud Pub/Sub APIs are also part of this list.
Using Google API Client libraries with App Engine Standard
If you scroll down that page, there is a section describing how this API library can be used in App Engine Standard environment. In short, you will need to bundle the library along with your application just like other third-party libraries you use.
App Engine
Because the Python client libraries are not installed in the App
Engine Python runtime environment, they must be vendored into your
application just like third-party libraries.
This warning, which you will see on the page, advises you to use the regular Cloud Client library if possible. But since App Engine Standard does not support it, you can ignore the warning for this use case.
While this library is still supported, we suggest trying the newer
Cloud Client Library for Google Cloud Pub/Sub, especially for new
projects. See Google Cloud Pub/Sub Libraries for installation and
usage details.
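For reference, vendoring on the Python 2.7 standard environment typically looks like this (a sketch assuming the dependencies have been installed into a local lib/ directory with pip install -t lib google-api-python-client oauth2client):
# appengine_config.py (loaded automatically by the App Engine runtime)
from google.appengine.ext import vendor

# Add the bundled lib/ directory to the import path.
vendor.add('lib')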
Examples using google-api-python-client library to invoke PubSub APIs
Using credentials from a service account json file
The following example shows you how you can use a service account to authenticate with Google Cloud PubSub APIs and invoke them. The information about how to use credentials from a Service account is available here.
You will need to have the following python packages pre-installed for this example to work: google-api-python-client and oauth2client.
If you're using pip, you can do:
pip install google-api-python-client oauth2client
Example which I have tested personally:
from googleapiclient import discovery
from httplib2 import Http
from oauth2client.service_account import ServiceAccountCredentials

# BEGIN CONFIG
PRIVATE_KEY_JSON = 'path/to/service_account_private_key.json'
API_SCOPES = ['https://www.googleapis.com/auth/pubsub']
PROJECT_NAME = 'FILL_IN_PROJECT_NAME_HERE'
# END CONFIG

# The format of project name expected by PubSub
PROJECT = 'projects/{0}'.format(PROJECT_NAME)

# Create a ServiceAccountCredentials object by reading the credentials from
# your JSON file.
credentials = ServiceAccountCredentials.from_json_keyfile_name(
    PRIVATE_KEY_JSON, scopes=API_SCOPES)

# Build the Cloud PubSub API object which you will be using for
# invoking the corresponding APIs using the credentials object
# you created previously
pubsub = discovery.build('pubsub', 'v1', credentials=credentials)

# List all topics in the specified project
topics = pubsub.projects().topics().list(
    project=PROJECT).execute()
print(topics)

# Add a new topic
topic_name = 'TOPIC_NAME_TO_ADD'
added_topic_response = pubsub.projects().topics().create(
    name='{0}/topics/{1}'.format(PROJECT, topic_name), body={}).execute()
print(added_topic_response)
Using credentials from a service account within an App Engine app
There is some info here regarding how to use Service account credentials from your App Engine apps.
The above example will work for the most part for invoking PubSub APIs, except for the part where you will initialize the credentials object. That part can be replaced roughly as described below:
Service Accounts
If your App Engine application needs to call an API to access data
owned by the application's project, you can simplify OAuth 2.0 by
using Service Accounts. These server-to-server interactions do
not involve a user, and only your application needs to authenticate
itself. Use the AppAssertionCredentials class to create a
Credentials object without using a Flow object.
In the following code snippet, a Credentials object is created and an
Http object is authorized:
import httplib2
from google.appengine.api import memcache
from oauth2client.contrib.appengine import AppAssertionCredentials
...
credentials = AppAssertionCredentials(scope='https://www.googleapis.com/auth/devstorage.read_write')
http = credentials.authorize(httplib2.Http(memcache))
pubsub = discovery.build('pubsub', 'v1', http=http)
...
Once you have an authorized Http object, you can pass it to the
build() or execute() functions as you normally would.
Using Application Default Credentials
You can also use Application Default Credentials; this works for local testing and also within the App Engine environment.
from oauth2client.client import GoogleCredentials
...
credentials = GoogleCredentials.get_application_default()
pubsub = discovery.build('pubsub', 'v1', credentials=credentials)
...
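Once you have the pubsub service object (built with any of the credential options above), publishing works the same way. For example, a sketch of publishing a message to an existing topic (the topic name is a placeholder, and the message data must be base64-encoded):
import base64

topic = '{0}/topics/{1}'.format(PROJECT, 'TOPIC_NAME_TO_PUBLISH_TO')
body = {
    'messages': [
        {'data': base64.b64encode(b'hello from app engine').decode('utf-8')}
    ]
}
publish_response = pubsub.projects().topics().publish(
    topic=topic, body=body).execute()
print(publish_response)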

Google App Engine authorization for Google BigQuery - Multiple Projects

I am running multiple projects that use the same source code (Python) on GAE. I am currently trying to include BigQuery functionality in those projects. I have enabled the BigQuery API on all of the projects and successfully imported some data to BigQuery from GCS using the new Developers Console.
I am able to make queries from the GAE app using AppAssertionCredentials from some of the projects but get a 403 "Access Not Configured. The API (BigQuery API) is not enabled for your project. Please use the Google Developers Console to update your configuration." error for others.
tl;dr AppAssertionCredentials with BigQuery fails for some projects, but not for others (same source code).
All of the projects have the BigQuery API and billing enabled. I followed all the steps from Google App Engine authorization for Google BigQuery.
The only difference between the projects is how they were created.
Original project:
project_id: project_A
service_account: project_A@appspot.gserviceaccount.com
Second project:
project_id: project_B
service_account: project_B@appspot.gserviceaccount.com
Third project (cloned from project_A):
project_id: project_C
service_account: project_A@appspot.gserviceaccount.com
The third project, project_C, was created using the cloning feature of the old App Engine console, which is why it shares the same service account email. This is the project that AppAssertionCredentials fails for when trying to query BigQuery (although everything works fine when authenticating to GCS with the same credentials).
I have added project_A@appspot.gserviceaccount.com to project_C's Permissions list with "Edit" permissions - that didn't help. The service discovery code:
from googleapiclient.discovery import build
from oauth2client.appengine import AppAssertionCredentials
credentials = AppAssertionCredentials('https://www.googleapis.com/auth/bigquery')
return build('bigquery', 'v2', credentials=credentials)
Is there a workaround this problem or maybe anything else I need to check for? I would really like to avoid using any other authorization method than AppAssertionCredentials.

Upload Files To Google Cloud Storage With Google App Engine (Python)

I'm trying to set up a basic Python-based Google App Engine site that allows users to upload files to Google Cloud Storage (mostly images).
I've been going through the documentation for the JSON API and the GCS client library overview (as well as Blobstore etc.) and still don't have a good handle on which is the best method and how they are related. It would be great if someone could give an overview of this or point me to some resources I can check out.
Also, any sample code that's relevant would be really helpful. I've been able to run the upload samples here but not sure if they're useful for an app engine setup: https://github.com/GoogleCloudPlatform/storage-file-transfer-json-python
Thanks!!
Google Cloud Storage has two APIs -- the XML API and the JSON API. The XML API is XML based and very like the Amazon S3 API. The JSON API is similar to many other Google APIs, and it works with the standard Google API client libraries (for example, the Google API Python library). Both of these APIs can be used from anywhere, with or without App Engine, and are based on RESTful HTTP calls.
App Engine provides a couple of standard ways to access Google Cloud Storage. The first is built into App Engine's API as a feature called the "Google Cloud Storage Python API". This does not directly use either the XML or the JSON API. It's deprecated and no longer recommended.
The second App Engine library is called the "Google Cloud Storage Python Client Library" and is not part of the core App Engine API. Instead, it's a Python library put out by Google that you can download and add to your application like any other library. This library happens to be implemented using the XML API. It provides a few extra features that are useful for App Engine users, such as the ability to serialize an upload while it's in progress. There's an example of using this library included as part of the download, in the python/demo directory. You can also see it online.
Equivalents of these tools also exist in Java and Go.
There's no need for users to use the App Engine-specific libraries unless they find them to be useful. The standard Python library or even just hand-written HTTP calls using urlfetch will work just as well. The App Engine library merely provides some useful extras for App Engine users.
App Engine also has a "Blobstore Python API". This is a feature specific to App Engine and distinct from Google Cloud Storage, except that it provides a few hooks into Google Cloud Storage, such as the ability to store files in Google Cloud Storage using the Blobstore API.
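One such hook is the Blobstore upload URL, which can be pointed at a Cloud Storage bucket; a minimal sketch (the handler path and bucket name are placeholders):
from google.appengine.ext import blobstore

# Generates a URL that accepts a direct file upload and stores the result
# in the given Cloud Storage bucket instead of the Blobstore.
upload_url = blobstore.create_upload_url('/upload_handler',
                                         gs_bucket_name='your-bucket-name')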

How can I log in to an arbitrary user in appengine for use with the Drive SDK?

I have an application that needs to log into a singular Drive account and perform operations on the files automatically using a cron job. Initially, I tried to use the domain administrator login to do this, however I am unable to do any testing with the domain administrator as it seems that you cannot use the test server with a domain administrator account, which makes testing my application a bit impossible!
As such, I started looking at storing arbitrary OAuth tokens--especially the refresh token--to log into this account automatically after the initial setup. However, all of the APIs and documentation assume that multiple individual users are logging in manually, and I cannot find functionality in the OAuth APIs that allows or accounts for logging into anything but the currently logged in user.
How can I achieve this in a way that I can test my code on a test domain? Can I do it without writing my own oauth library and doing the oauth requests by hand? Or is there a way to get the domain administrator authorization to work on a local test server?
You can load the credentials for a single account into your datastore using the Remote API, which can be enabled in your app.yaml file:
builtins:
- remote_api: on
By executing
remote_api_shell.py -s your_app_id.appspot.com
from the command line you'll have access to a shell which can execute in the environment of your application. Before doing this, make sure you have your application deployed (more on local development below) and make sure the source for google-api-python-client is included by pip-installing it and running enable-app-engine-project /path/to/project to add it to your App Engine project.
Once you are in the remote shell (after executing the remote command above), perform the following:
from oauth2client.appengine import CredentialsModel
from oauth2client.appengine import StorageByKeyName
from oauth2client.client import OAuth2WebServerFlow
from oauth2client.tools import run
KEY_NAME = 'your_choice_here'
CREDENTIALS_PROPERTY_NAME = 'credentials'
SCOPE = 'https://www.googleapis.com/auth/drive'
storage = StorageByKeyName(CredentialsModel, KEY_NAME, CREDENTIALS_PROPERTY_NAME)
flow = OAuth2WebServerFlow(
    client_id=YOUR_CLIENT_ID,
    client_secret=YOUR_CLIENT_SECRET,
    scope=SCOPE)
run(flow, storage)
NOTE: If you have not deployed your application with the google-api-python-client code, this will fail, because your application won't know how to make the same imports you made on your local machine, e.g. from oauth2client.appengine import CredentialsModel.
When run is called, your web browser will open and prompt you to accept the OAuth access for the client you've specified with CLIENT_ID and CLIENT_SECRET. After successfully completing, it will save an instance of CredentialsModel in the datastore of the deployed application your_app_id.appspot.com and store it under the KEY_NAME you provided.
After doing this, any caller in your application -- including your cron jobs -- can access those credentials by executing
storage = StorageByKeyName(CredentialsModel, KEY_NAME, CREDENTIALS_PROPERTY_NAME)
credentials = storage.get()
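For example, a cron handler could then build an authorized Drive service from those stored credentials along these lines (a sketch; the Drive API version is whatever your app targets):
import httplib2
from googleapiclient.discovery import build

# Authorize an Http object with the stored credentials and build the Drive client.
http = credentials.authorize(httplib2.Http())
drive = build('drive', 'v2', http=http)

# e.g. list the first page of files in the singular Drive account.
files = drive.files().list().execute()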
Local Development:
If you'd like to test this locally, you can run your application locally via
dev_appserver.py --port=PORT /path/to/project
and you can execute the same commands using the remote API shell and pointing it at your local application:
remote_api_shell.py -s localhost:PORT
Once here, you can execute the same code you did in the remote api shell and similarly an instance of CredentialsModel will be stored in the datastore of your local development server.
As above, if you don't have the correct google-api-python-client modules included, this will fail.
EDIT: This used to recommend using the Interactive Console at:
http://localhost:PORT/_ah/admin/interactive
but it was discovered that this doesn't work because socket does not work properly in the App Engine local development sandbox.
This article explains how to interact with Google Drive on behalf of users of your domain by having the domain administrator delegate domain-wide authority to a Service Account.
This other article explains how to interact with a Drive owned by your application using a Service Account.
Note that both methods use JWT-based Service Accounts, which currently need a modified version of the google-api-python-client in order to work on App Engine.
Unlike the Google App Engine service account, JWT-based Service Accounts should work with the development server.
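As a rough sketch of the first (domain-wide delegation) approach with the oauth2client library of that era, impersonating a user of your domain looked something like this (the service account email, key file, and user address are placeholders):
import httplib2
from googleapiclient.discovery import build
from oauth2client.client import SignedJwtAssertionCredentials

with open('your-service-account-privatekey.pem') as f:
    private_key = f.read()

# 'sub' is the domain user to impersonate; this requires domain-wide delegation
# to have been granted to the service account by the domain administrator.
credentials = SignedJwtAssertionCredentials(
    'your-service-account@developer.gserviceaccount.com',
    private_key,
    scope='https://www.googleapis.com/auth/drive',
    sub='user@yourdomain.com')

drive = build('drive', 'v2', http=credentials.authorize(httplib2.Http()))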
