Google Calendar and client secrets with a desktop app - Python

I am trying to figure out how the OAuth flow will work for my Google Calendar app. I have a desktop app which I will be distributing and which will be using Google Calendar. I know there's a client secrets file, but I'd like to know if there's a way to request a token rather than shipping the client secrets file along with the app. My worry is that someone will just spam the calendar and my app won't work for anyone else. Is this a possibility? What solutions exist to mitigate this?
Thanks.

Using OAuth2, the client secret is used to authenticate your app to Google, not vice versa. The OAuth2 server will only issue a token to an application that submits a correct client id and client secret (among other things). This arrangement is (in part) precisely about making sure that someone else cannot (e.g.) "spam your calendar". Otherwise, how is Google to know that the requesting application is actually legitimate, rather than malicious code crafted by a spammer to just look like your application?
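For reference, the request in which these credentials are presented to the token endpoint looks roughly like this (a sketch of the standard OAuth2 authorization-code exchange from RFC 6749; all values below are placeholders):

```python
from urllib.parse import urlencode

# Parameters sent to Google's token endpoint when exchanging an
# authorization code for an access token (RFC 6749, section 4.1.3).
# All of the values here are placeholders.
token_request = {
    "grant_type": "authorization_code",
    "code": "AUTH_CODE_FROM_REDIRECT",
    "client_id": "9876543210.apps.googleusercontent.com",
    "client_secret": "secret-12345678901234567890",
    "redirect_uri": "http://localhost:8000/login_done/",
}

# The form-encoded body of the POST to the token_uri. The server refuses
# to issue a token unless client_id and client_secret match a registered
# application.
body = urlencode(token_request)
```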
Before you can use OAuth2, your application has to be registered with Google. As part of this process Google issues you with a client secret, which you then have to build into your application instance. This application instance is also tied to redirect URIs for when the authorisation handshake is complete.
The upshot of this is that you can't really distribute an OAuth2-using app without each deployment having to go through the registration process. If you try and distribute the secret with your application, then it's no longer a secret, and in any case, you can't know all of the URIs at which it may be deployed.
An approach I've taken with an application I'm working on is to have the application read a client secret file that has to be provided by the installer of the application, based on their own registration. This has a format based on the JSON download that Google provides when you register an application. It's something of a pain requiring every installer to go through this dance, but as things stand I don't believe there's an easier way that is also secure.
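A minimal sketch of reading and sanity-checking such an installer-provided file (the required-key set here is illustrative, not Google's definition):

```python
import json

# Keys we expect before starting an OAuth2 flow; illustrative, not
# an official list.
REQUIRED_KEYS = {"client_id", "client_secret", "auth_uri", "token_uri"}

def load_client_secrets(path):
    """Read an installer-provided client secrets file and check that the
    fields needed to start an OAuth2 flow are present."""
    with open(path) as f:
        data = json.load(f)
    # Google's download wraps the details under "web" or "installed",
    # depending on the application type that was registered.
    section = data.get("web") or data.get("installed")
    if section is None:
        raise ValueError("no 'web' or 'installed' section in client secrets")
    missing = REQUIRED_KEYS - section.keys()
    if missing:
        raise ValueError("client secrets missing keys: %s" % sorted(missing))
    return section
```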
For example, I have an implementation of OpenID Connect authentication using Google that builds upon the oauth2client library, and uses a file format based on Google's client secrets file. When a service is registered with Google to use OAuth2, there's a "client secrets" file that Google can provide that looks something like this:
{
  "web": {
    "client_id": "9876543210.apps.googleusercontent.com",
    "client_secret": "secret-12345678901234567890",
    "client_email": "9876543210@developer.gserviceaccount.com",
    "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/9876543210@developer.gserviceaccount.com",
    "redirect_uris": [
      "http://localhost:8000/annalist/login_done/",
      "http://annalist-demo.example.org:8000/annalist/login_done/"
    ],
    "auth_uri": "https://accounts.google.com/o/oauth2/auth",
    "token_uri": "https://accounts.google.com/o/oauth2/token",
    "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs"
  }
}
In my code, I use this file to initiate an OAuth2 flow using the oauth2client library, with code that looks like this:
flow = flow_from_clientsecrets(
    clientsecrets_filename,
    scope=scope,
    redirect_uri=request.build_absolute_uri(login_done)
)
So you can see that there's much more information than just the client id that is used when initiating the flow. The full code of my implementation is at https://github.com/gklyne/annalist/blob/master/src/annalist_root/oauth2/views.py, beginning about line 273, but there's a lot more logic in that module that's to do with passing details back to an application running in the Django framework.
I've also created documentation of the procedure I use to register the application with Google and deploy the client credentials. Note that these instructions are for the identity service, not the Calendar API, and that the details are specific to my application, but I'm hoping there's enough commonality to help you on your way.
Looking to the future, the IETF are working on a spec for allowing automated registration of an OAuth2-using application instance. I think this is the relevant specification: https://datatracker.ietf.org/doc/draft-ietf-oauth-dyn-reg/. It appears that it is currently (as of September 2014) being considered for standards track publication, but that doesn't say when it might be more widely available.


How to setup python social auth for web app and for mobile app?

We have:
- An existing Django backend with Python social auth for signing in with Google, providing a web-based application and an API for the mobile app.
- An iOS mobile app with the GoogleSignIn pod.
Now we would like to allow mobile app users to sign in with Google inside the app, and then authenticate them on the backend, so that they can access their personal data via the app.
So my idea of the algorithm is:
1. App uses the GoogleSignIn and finally receives access_token.
2. App sends this access_token to the Backend.
3. Backend verifies this access_token, fetches/creates the user, returns some sessionid to the App.
4. App uses this sessionid for further requests.
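In outline, the backend side of steps 3 and 4 could look something like this (a toy sketch, not production code: verify_token stands in for whichever verification method ends up being used, and the session store is just a dict):

```python
import secrets

SESSIONS = {}  # sessionid -> user id; a real app would use Django's session store

def exchange_token(access_token, verify_token, get_or_create_user):
    """Step 3: verify the token from the app and fetch/create the user,
    then hand back a session id for further requests (step 4)."""
    claims = verify_token(access_token)
    if claims is None:
        return None  # verification failed; the app gets no session
    user_id = get_or_create_user(claims)
    sessionid = secrets.token_urlsafe(32)
    SESSIONS[sessionid] = user_id
    return sessionid
```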
The problem is with the third step: token verification. I found two ways of verifying:
1. Python social auth flow
As described in the docs:
token = request.GET.get('access_token')
user = request.backend.do_auth(token)
if user:
    login(request, user)
    return 'OK'
else:
    return 'ERROR'
This would be a preferred flow, since it already has all the required steps and is working perfectly with the web app (like, accounts creation, defaults for newly created users, analytics collection, etc.).
But the problem is that the backend and the app use different CLIENT_IDs for the auth. This is due to the limitations in the Google Developers Console: when creating credentials, you need to select whether it will be a web app or an iOS app, and it cannot be both.
I tried to use different client ids (then the backend cannot verify), tried to use the web id inside the app (then the pod does not work), and tried to use the app id inside the web app (then the backend cannot verify again).
2. Google API Client Library
Another option is to utilize the way from the Google Sign-In for iOS documentation:
from google.oauth2 import id_token
from google.auth.transport import requests

try:
    idinfo = id_token.verify_oauth2_token(token, requests.Request(), CLIENT_ID)
    userid = idinfo['sub']
except ValueError:
    # Invalid token
    pass
It worked, but here we're missing all the pipeline provided by social auth (e.g. we need to create a user somehow), and I could not find a proper way of starting the pipeline from the middle, and I'm afraid it would be quite fragile and bug-prone code.
Another problem with this solution is that in reality we also have Sign in with Apple and Sign in with Facebook, and this solution will demand ad-hoc coding for each of these backends, which also brings more mess and unreliability.
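If we do go with option 2, one way to keep the per-provider ad-hoc code contained is a small verifier registry (just a sketch; the provider names and verifier signatures are made up for illustration):

```python
VERIFIERS = {}

def verifier(provider):
    """Register a token-verification function for one sign-in provider."""
    def register(fn):
        VERIFIERS[provider] = fn
        return fn
    return register

@verifier("google")
def verify_google(token):
    # Would call google.oauth2.id_token.verify_oauth2_token(...) here.
    raise NotImplementedError

def verify(provider, token):
    """Dispatch to the right provider; returns claims or None."""
    fn = VERIFIERS.get(provider)
    if fn is None:
        raise ValueError("unknown provider: %s" % provider)
    return fn(token)
```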
3. Webview
The third option would be not to use SDKs in the Swift code and just use a web view onto the web application, as in the browser.
This solves the problem with the pipeline and client ids.
But it doesn't look native, and some users may suspect phishing attempts (how does it differ from a malicious app trying to steal Google identity by crafting the same-looking form?). Also, I'm not sure it will play nicely with the accounts configured on the device. And it also will require us to open a browser even for signing in with Apple, which looks somewhat awkward. And we're not sure such an app will pass the review.
But, maybe, these all are minor concerns?
⁂
So, what do you think? Is there a fourth option? Or maybe improvements to the options above? How is it solved in your app?

Compute Engine's service account has insufficient scopes for Cloud Vision API

I need to use the Cloud Vision API in my Python solution. I've been relying on an API key for a while now, but at the moment I'm trying to give my Compute Engine's default service account the scope needed to call Vision, with little luck so far.
I have enabled the Vision API in my project via the Cloud Console, but I still get this 403 error:
Request had insufficient authentication scopes.
I would set access individually for each API from my GCE's edit details tab, but couldn't find Vision listed among the other APIs.
The only way I managed to receive a correct response from the Vision API is by flagging the "Allow full access to all Cloud APIs" checkbox, again from my GCE's edit details tab, but that doesn't sound too secure to me.
Hopefully there are better ways to do this, but I couldn't find any in Vision's documentation on authentication, nor in any question here on Stack Overflow (some had a close topic, but none of the proposed answers quite fit my case or provided a working solution).
Thank you in advance for your help.
EDIT
I'm adding the list of every API I can individually enable in my gce's default service account from cloud console:
BigQuery; Bigtable Admin; Bigtable Data; Cloud Datastore; Cloud Debugger; Cloud Pub/Sub; Cloud Source Repositories; Cloud SQL; Compute Engine; Service Control; Service Management; Stackdriver Logging API; Stackdriver Monitoring API; Stackdriver Trace; Storage; Task queue; User info
None of them seems useful to my needs, although the fact that enabling full access to them all solves my problem is pretty confusing to me.
EDIT #2
I'll try and state my question(s) more concisely:
How do I add https://www.googleapis.com/auth/cloud-vision to my gce instance's default account?
I'm looking for a way to do that via any of the following: GCP console, gcloud command line, or even through Python (at the moment I'm using googleapiclient.discovery.build, I don't know if there is any way to ask for vision api scope through the library).
Or is it ok to enable all the scopes as long as I limit the roles via IAM? And if that's the case, how do I do that?
I really can't find my way around the documentation, thank you once again.
Google Cloud APIs (Vision, Natural Language, Translation, etc.) do not need any special permissions; you should just enable them in your project (by going to the API Library tab in the Console) and create an API key or a Service Account to access them.
Your decision to move from API keys to Service Accounts is the correct one, given that Service Accounts are the recommended approach for authentication with Google Cloud Platform services; for security reasons, Google recommends using them instead of API keys.
That being said, I see that you are using the old Python API Client Libraries, which make use of the googleapiclient.discovery.build service that you mentioned. As of now, the newer idiomatic Client Libraries are the recommended approach, and they supersede the legacy API Client Libraries that you are using, so I would strongly encourage you to move in that direction. They are easier to use, more understandable, better documented and are the recommended approach to access Cloud APIs programmatically.
Taking that as the starting point, I will divide this answer in two parts:
Using Client Libraries
If you decided to follow my advice and migrate to the new Client Libraries, authentication will be really easy for you, given that Client Libraries use Application Default Credentials (ADC) for authentication. ADC makes use of the default service account for Compute Engine in order to provide authentication, so you should not worry about it at all, as it will work by default.
Once that part is clear, you can move on to create a sample code (such as the one available in the documentation), and as soon as you test that everything is working as expected, you can move on to the complete Vision API Client Library reference page to get the information about how the library works.
Using (legacy) API Client Libraries
If, despite my words, you want to stick to the old API Client Libraries, you might be interested in this other documentation page, where there is some complete information about Authentication using the API Client Libraries. More specifically, there is a whole chapter devoted to explaining OAuth 2.0 authentication using Service Accounts.
With simple code like the one below, you can use the google.oauth2.service_account module in order to load the credentials from the JSON key file of your preferred SA, specify the required scopes, and use them when building the Vision client by specifying credentials=credentials:
from google.oauth2 import service_account
import googleapiclient.discovery

SCOPES = ['https://www.googleapis.com/auth/cloud-vision']
SERVICE_ACCOUNT_FILE = '/path/to/SA_key.json'

credentials = service_account.Credentials.from_service_account_file(
    SERVICE_ACCOUNT_FILE, scopes=SCOPES)

vision = googleapiclient.discovery.build('vision', 'v1', credentials=credentials)
EDIT:
I forgot to add that, in order for Compute Engine instances to be able to work with Google APIs, the instance has to be granted the https://www.googleapis.com/auth/cloud-platform scope (in fact, this is the same as choosing Allow full access to all Cloud APIs). This is documented in the GCE Service Accounts best practices, but you are right that this allows full access to all resources and services in the project.
Alternatively, if you are concerned about the implications of allowing "access-all" scopes, this other documentation page explains that you can allow full access and then restrict access via IAM roles.
In any case, if you want to grant only the Vision scope to the instance, you can do so by running the following gcloud command:
gcloud compute instances set-service-account INSTANCE_NAME --zone=INSTANCE_ZONE --scopes=https://www.googleapis.com/auth/cloud-vision
The Cloud Vision API scope (https://www.googleapis.com/auth/cloud-vision) can be obtained, as for any other Cloud API, from this page.
Additionally, as explained in this section about SA permissions and access scopes, SA permissions should be compliant with instance scopes; that means that most restrictive permission would apply, so you need to have that in mind too.
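That "most restrictive wins" rule can be pictured with plain sets (a conceptual sketch only; the permission strings are illustrative labels, not an exact list of Vision IAM permissions):

```python
# An instance's effective access is limited by BOTH its access scopes and
# the IAM roles granted to its service account: an API call succeeds only
# if it is allowed by each of them.
instance_scopes = {"https://www.googleapis.com/auth/cloud-platform"}
iam_allows = {"vision.images.annotate"}  # permissions from the SA's IAM roles

def call_allowed(required_scope, required_permission):
    """True only when both the instance scopes and IAM allow the call."""
    scope_ok = (required_scope in instance_scopes
                or "https://www.googleapis.com/auth/cloud-platform" in instance_scopes)
    return scope_ok and required_permission in iam_allows
```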
To set the access scopes from the Python client library, with the same effect as that radio button in the GUI:
from google.cloud import compute_v1

instance_client = compute_v1.InstancesClient()
instance.service_accounts = [
    compute_v1.ServiceAccount(
        email="$$$$$$$-compute@developer.gserviceaccount.com",
        scopes=[
            "https://www.googleapis.com/auth/compute",
            "https://www.googleapis.com/auth/cloud-platform",
        ],
    )
]
There's a tutorial for creating instances from Python here.

Google Drive API with Python from server (backend) without browser authentication

I'd like to save the files of my SaaS application to my Google Drive account. All the examples I've seen use OAuth2 authentication and need the end user to authenticate by opening the browser. I need to upload files from my server without any user interaction, sending files directly to my account!
I have tried many tutorials I found on the internet with no success, mainly the official
Google Drive API with Python
How can I authenticate programmatically from my server and upload files and use API features such as sharing folders and others?
I'm using Python; the PyDrive library uses the same approach to authenticate.
You can do this, but you need to use a Service Account, which is (or rather can be used as) an account for an application, and doesn't require a browser to open.
The documentation is here: https://developers.google.com/api-client-library/python/auth/service-accounts
And an example (without PyDrive, which is just a wrapper around all this, but makes service accounts a bit trickier):
from apiclient.discovery import build
from oauth2client.service_account import ServiceAccountCredentials
from httplib2 import Http

scopes = ['https://www.googleapis.com/auth/drive.readonly']
credentials = ServiceAccountCredentials.from_json_keyfile_name(
    'YourDownloadedFile-5ahjaosi2df2d.json', scopes)
http_auth = credentials.authorize(Http())

drive = build('drive', 'v3', http=http_auth)
request = drive.files().list().execute()
files = request.get('files', [])  # Drive v3 returns results under 'files' (v2 used 'items')
for f in files:
    print(f)
To add to andyhasit's answer, using a service account is the correct and easiest way to do this.
The problem with using the JSON key file is it becomes hard to deploy code anywhere else, because you don't want the file in version control. An easier solution is to use an environment variable like so:
https://benjames.io/2020/09/13/authorise-your-python-google-drive-api-the-easy-way/
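The idea behind the environment-variable approach can be sketched like this (assuming, for illustration, that the whole JSON key is stored in a variable named GOOGLE_SA_KEY; with the google-auth library installed, the parsed dict could then be passed to service_account.Credentials.from_service_account_info):

```python
import json
import os

def load_sa_info(var="GOOGLE_SA_KEY"):
    """Read the service-account key JSON from an environment variable
    instead of a file on disk, so no key file ends up in version control
    or in the deployment artifact."""
    raw = os.environ.get(var)
    if raw is None:
        raise RuntimeError("environment variable %s is not set" % var)
    info = json.loads(raw)
    # With google-auth installed, this dict can be turned into credentials:
    #   from google.oauth2 import service_account
    #   creds = service_account.Credentials.from_service_account_info(info)
    return info
```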
I know it's quite late for an answer, but this worked for me:
Use the same API you were using, this time on your own computer; it will generate a Storage.json file which, used along with your scripts, will solve the issue (especially on read-only platforms like Heroku).
Check out Using OAuth 2.0 for Web Server Applications. It seems that's what you're looking for.
Any application that uses OAuth 2.0 to access Google APIs must have authorization credentials that identify the application to Google's OAuth 2.0 server. The following steps explain how to create credentials for your project. Your applications can then use the credentials to access APIs that you have enabled for that project.
Open the Credentials page in the API Console. Click Create credentials > OAuth client ID. Complete the form. Set the application type to Web application. Applications that use languages and frameworks like PHP, Java, Python, Ruby, and .NET must specify authorized redirect URIs. The redirect URIs are the endpoints to which the OAuth 2.0 server can send responses. For testing, you can specify URIs that refer to the local machine, such as http://localhost:8080.
We recommend that you design your app's auth endpoints so that your application does not expose authorization codes to other resources on the page.
Might be a bit late, but I've been working with gdrive over Python, JS and .NET, and here's one proposed solution (REST API) once you get the authorization code:
How to refresh token in .net google api v3?
Please let me know if you have any questions

Custom Authentication (User Model) for Cloud Endpoints - Python

I am developing an Android application with a GAE backend, for sessions etc.
I want to use Google Cloud Endpoints and develop an API with a custom-authentication user model. I don't want to use Google's OAuth. I want to implement a simple email/pass user authentication model with a session-based token. I have no experience on GAE whatsoever. I have worked in Python and its frameworks (Django, Flask, etc.).
I have looked for a sample project of this kind for the past week (with no luck).
Can someone please provide me with sample code/resources on how to implement such an endpoint with session management and CSRF protection, along with SSL?
PS: If you think Cloud Endpoints is not a good approach for my application (server backend), then please direct me to a source that may aid me in creating my own RESTful API with JSON encoding + CSRF protection and session management.
I have already seen the following but none of them have a detailed solution:
Custom Authentication for Google Cloud Endpoints (instead of OAuth2)
Google App Engine: Endpoints authentication when custom auth or Open ID is used
AppEngine Cloud Endpoints and custom Users service
You're in for a ride. It's not a simple process, but I've managed to do just what you're looking for--albeit in a slightly hackish way.
First, there's a boilerplate project for GAE (in Python) that implements a custom email/pwd login system using webapp2's extras: http://appengine.beecoss.com/
It follows the guidelines for setting up custom authentication detailed in this blog post: http://blog.abahgat.com/2013/01/07/user-authentication-with-webapp2-on-google-app-engine/
This project will set things up so that your user will start a session upon login. Now, in order to access the user information on this session in your endpoints, you'll follow the instructions to that first StackOverflow link you posted.
The key, after following the previous steps, is to match the session key in your endpoints to the session key in the config file of the boilerplate code. Then, you'll be able to get which user made the request and follow through with the endpoint call if they're validated:
self.get_user_from_cookie()
if not self.user:
    raise endpoints.UnauthorizedException('Invalid token.')
It is incredibly ridiculous that this is how it works for custom authentication, so if you're used to Django and would like to implement your app that way, DO IT. It was "too late to turn back now" for me, and I despise Google for only documenting authentication schemes that work for Google account holders only.
OP, just use Django on GAE and save yourself the frustration. I'm sure there's plenty of quick integration with mobile apps that the Django community can provide.
No one wants to force their app users to have Google accounts in order to log in, Google. Stop it.

Where to store web authentication session in PySide?

I'm building a little application in Python. I use PySide for the GUI and Django to read data from my web application.
Everything works well, but I have a login step, like the Dropbox application.
I want to store this information on the current machine (like a session; I don't want to log in every time I open the application).
Now my question is: what is the safest way to do this? Environment variables?
Usually when you have an API that you're exposing in your app to the outside world (even your own desktop/mobile app), you'll design this API to be stateless, as part of the REST architecture. So your app should always include an HTTP header or some other method of carrying an authentication token that will let your API identify the user.
You only log in once, and when the log-in procedure is successful you should get an authentication token from your API, and then you will store this token somewhere safe.
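A minimal sketch of storing the token (assuming a plain file restricted to the current user; the file name here is made up, and on desktop a platform keychain, e.g. via the keyring package, would be an even safer home):

```python
import os

TOKEN_PATH = os.path.expanduser("~/.myapp_token")  # illustrative path

def save_token(token, path=TOKEN_PATH):
    """Write the auth token to a file readable only by the current user."""
    # Create the file with owner-only permissions before writing the secret.
    fd = os.open(path, os.O_WRONLY | os.O_CREAT | os.O_TRUNC, 0o600)
    with os.fdopen(fd, "w") as f:
        f.write(token)

def load_token(path=TOKEN_PATH):
    """Return the stored token, or None if the user has not logged in yet."""
    try:
        with open(path) as f:
            return f.read()
    except FileNotFoundError:
        return None
```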
You can also look into implementing OAuth2 for the authentication.
