Google Drive Python API without Creating a Project

For the Google Drive Python API, all the tutorials I have seen require
users to create a project in the Google Cloud dashboard before obtaining a client ID and a client secret JSON file. I've been researching both the default Google Drive API and the pydrive module.
Is there a way for users to simply log in to their Google Account with a username and password,
without having to create a project? So that once they log in to their Google Account, they are free to
access all files in their Google Drive?

It's not possible to use the Drive API without creating a GCP project for the application. Without one, Google has no idea what application is requesting access or what scope of account access it should have.
Logging in with just a username and password is not possible. You need to create a project and use OAuth.
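For reference, here is a minimal sketch of that flow using google-auth-oauthlib, assuming you have already created a project and downloaded its client_secret.json (pydrive wraps a similar flow); the file name and scope below are ordinary placeholders:

from google_auth_oauthlib.flow import InstalledAppFlow
from googleapiclient.discovery import build

SCOPES = ['https://www.googleapis.com/auth/drive']
# Opens the normal Google sign-in page in a browser; the user logs in
# with their username and password, but the app is identified by the project.
flow = InstalledAppFlow.from_client_secrets_file('client_secret.json', SCOPES)
creds = flow.run_local_server(port=0)

drive = build('drive', 'v3', credentials=creds)
files = drive.files().list(pageSize=10, fields='files(id, name)').execute()
print(files.get('files', []))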

It might be possible with some PySimpleGUI hackery, or by modifying the code of a Python-based browser, but in most cases it is not practical.
The exception is if you need to automate something (like renaming files) that would otherwise take an hour, in a place where you do not have access to GCP.

Related

Authenticate Google Sheets API Using Access Key in Python

I am trying to access a Google Sheet stored in my Drive through the Google Sheets REST API.
This will just be a Python script without any user interaction. How can I authenticate my request using something like an access key or a service account?
I understand how to generate access keys or create a service account in my Google Cloud console. But I don't quite understand how the Sheet in my Drive can be associated with them.
I would like to know the steps I should follow in order to accomplish this. For instance, how can I send a request to this API endpoint?
GET https://sheets.googleapis.com/v4/spreadsheets/{spreadsheetId}
Note: I want to do this using the REST API. I do not want to use an existing Python client library. So I simply want to hit the above endpoint, using maybe the requests package.
Google does not permit API-only access to Google (Workspace?) documents.
See Authorizing Requests.
API keys authenticate programs.
OAuth is used to authenticate users, and Google requires that users authenticate requests when accessing user data stored in Workspace documents.
Domain-wide Delegation enables the use of a service account to operate on behalf of users in situations such as this, but it is only available for (paid) Workspace accounts.
I'm unsure how to refer to the free and paid (Workspace) versions.
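One further note: if you share the Sheet with the service account's client_email (just as you would share it with a person), the service account can read it without domain-wide delegation. A minimal sketch using google-auth to mint the token and requests to call the endpoint; the key file name and spreadsheet ID are placeholders:

import requests
import google.auth.transport.requests
from google.oauth2 import service_account

SCOPES = ['https://www.googleapis.com/auth/spreadsheets.readonly']
creds = service_account.Credentials.from_service_account_file(
    'service_account.json', scopes=SCOPES)
# Mint a short-lived OAuth 2.0 access token for the service account.
creds.refresh(google.auth.transport.requests.Request())

spreadsheet_id = '<spreadsheetId>'
resp = requests.get(
    'https://sheets.googleapis.com/v4/spreadsheets/%s' % spreadsheet_id,
    headers={'Authorization': 'Bearer %s' % creds.token})
resp.raise_for_status()
print(resp.json()['properties']['title'])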

Authenticated Google Drive API with access token from Python Social App

I created a website that I plan to connect to Google Drive, so that when students or teachers upload files, the files go to their respective Google Drive accounts. Meanwhile, the only thing that goes into the database is the file link.
I've tried using Python Social Auth and implemented it successfully. But I'm still confused about how to use the access token I obtained to access the Google Drive API.
Please help me.
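A minimal sketch of one way to do this, assuming the Google login requested a Drive scope; get_social_token() is a hypothetical stand-in for however your app retrieves the token stored by Python Social Auth:

from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build
from googleapiclient.http import MediaFileUpload

def get_social_token(user):
    # Hypothetical: e.g. user.social_auth.get(provider='google-oauth2')
    #                        .extra_data['access_token']
    raise NotImplementedError

def upload_to_drive(user, path):
    # Wrap the raw access token in a google-auth credentials object.
    creds = Credentials(token=get_social_token(user))
    drive = build('drive', 'v3', credentials=creds)
    created = drive.files().create(
        body={'name': path},
        media_body=MediaFileUpload(path),
        fields='id, webViewLink').execute()
    return created['webViewLink']  # store only this link in the database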

What is the workflow to encrypt and access Google Application Credentials in Google's App Engine?

I know that the Google Cloud environment has tons of solutions for encryption, but I keep running in circles, ultimately finding myself holding my own key when it should be unknown to the application.
My current strategy is:
1. Access my Google credentials as JSON held locally.

import os
os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = '<my_google_user>.json'

2. Set my secret with its payload in Google Cloud via Python (longer code omitted for conciseness).
3. Access my secret as a dictionary to use directly later in code.

from google.cloud import secretmanager
import json

client = secretmanager.SecretManagerServiceClient()
sf_str = client.access_secret_version(request={"name": "<my_version_name>"}).payload.data.decode("utf-8")
sf_cred = json.loads(sf_str)
So my question is: how can I encrypt my Google application credentials JSON?
I am very new to this environment and to workflows involving encryption, so please be precise with Python or cloud terminal examples. Feel free to knock the security of my strategy as a whole so that I may learn a better one.
P.S. I have created a cryptographic key in Google Cloud Platform, if that is a step, but I don't know how to use/automate it for this 12-hour recurring task I want to use in setting up this password access.
I'm not sure I understand your question correctly, but you might not need the credentials file at all.
Your code in the cloud is executed by App Engine, Cloud Functions, Cloud Run, etc. under some service account. It can be the default service account or a specifically created one. For example: Using the Default App Engine Service Account.
In order to access a secret in Secret Manager, it may be enough to assign the relevant IAM roles to the service account which executes the code. For example, add IAM roles to the default App Engine service account.
Secret Manager IAM is described here: Access control.
Most likely the roles/secretmanager.secretAccessor role will be enough. In that case your code will be able to get the secret value and use it.
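In other words, on App Engine the sketch from the question shrinks to something like this (no GOOGLE_APPLICATION_CREDENTIALS and no key file; the secret name is a placeholder):

import json
from google.cloud import secretmanager

# The client picks up the runtime service account automatically,
# provided it has roles/secretmanager.secretAccessor.
client = secretmanager.SecretManagerServiceClient()
name = 'projects/<project>/secrets/<secret>/versions/latest'
payload = client.access_secret_version(request={'name': name}).payload.data.decode('utf-8')
sf_cred = json.loads(payload)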

How do I get an access token of my own account to be stored/used in my server?

I'm going to be using the Google Cloud Storage JSON API, and the app will only be using my own Cloud Storage project, not the individual users'. How do I get an access token/API key so I can make requests to my Cloud Storage project from AppEngine without having to worry about expired tokens, etc.?
I see that there's a "Server Key" entry in the Cloud Console, but I'm not sure what it's for.
I'm aware of "activating" my AppEngine project with the Google Cloud project (god that's confusing) to achieve what I want, but unfortunately, my AppEngine project is already "activated" with a different Google Cloud project (which doesn't have Cloud Storage enabled), and the Google Cloud project is already associated with another AppEngine project (wat?).
Also, how would I use this "API key" in my requests via the wrapper library? All the samples online use OAuth2 and sign requests with a user's access token. (I don't think the wrapper library even supports non-OAuth2 requests?)
Your AppEngine project has a service account associated with it. That account can be granted membership in your second project.
To find the name of your service account, head over to the AppEngine console ( https://appengine.google.com ) and choose your AppEngine project. Down under the Application Settings tab, you'll see a "Service Account Name" that looks like an email address.
Now head over to the cloud console ( https://cloud.google.com/console ) and choose the second project, the one you're using for this Google Cloud Storage stuff. While you're there, make sure you've enabled it for Google Cloud Storage (and the JSON version), and that you've set up billing.
Under "permissions", you'll see a list of email addresses. Click "Add member" and put in the address we found earlier in the AppEngine console. Choose between owner, editor, or viewer, depending on what your AppEngine app is going to do need to do.
Alternately, rather than adding that account to the project itself, you could also grant it permissions for just the buckets or objects it needs to use.
When you invoke the Google Cloud Storage JSON API, you can specify which API key you want to use as a keyword argument on the build() function. You can use either API key.
In order to get hold of credentials for invoking the JSON API, you'll most likely want to use AppAssertionCredentials, as shown here: https://developers.google.com/api-client-library/python/guide/google_app_engine#ServiceAccounts
import httplib2
from google.appengine.api import memcache
from oauth2client.appengine import AppAssertionCredentials
from apiclient import discovery
...
# Credentials for the app's own service account, scoped to Cloud Storage.
credentials = AppAssertionCredentials(scope='https://www.googleapis.com/auth/devstorage.read_write')
# memcache is passed as httplib2's cache.
http = credentials.authorize(httplib2.Http(memcache))
storage = discovery.build(serviceName='storage', version='v1beta2', http=http, developerKey=your_api_key)
Also note that, in addition to the JSON API, there is also an AppEngine-specific Python library for accessing Google Cloud Storage: https://developers.google.com/appengine/docs/python/googlecloudstorageclient/

How can I protect my AWS access id and secret key in my python application

I'm making an application in Python and using Amazon Web Services in some modules.
I'm currently hard-coding my AWS access ID and secret key in a *.py file, though I might move them out to a configuration file in the future.
But there's a problem: how can I protect the AWS information from other people? As I understand it, Python is easy to decompile.
Is there a way to do this?
What I'm making is an app to help users upload/download stuff from the cloud. I'm using Amazon S3 as the cloud storage. As I understand it, Dropbox also uses S3, so I'm wondering how they protect their keys.
After a day's research I found something.
I'm now using boto (an AWS library for Python). I can use its generate_url(X) function to get a URL for the app to access an object in S3. The URL expires after X seconds.
So I can build a web service for my apps that provides them the URLs. The AWS keys will not be embedded in the app, but in the web service.
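For illustration, the download side in boto looks roughly like this (the bucket and key names are placeholders):

import boto

conn = boto.connect_s3()  # the web service holds the keys, not the app
bucket = conn.get_bucket('my-app-bucket')
key = bucket.get_key('some/object.txt')
url = key.generate_url(expires_in=300)  # GET URL valid for 300 seconds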
It sounds great, but so far I can only download objects with this function; uploading doesn't work. Does anybody know how to use it for uploading?
Does anyone here know how to use boto's key.generate_url() to get a temporary URL for uploading stuff to S3?
There's no way to protect your keys if you're going to distribute your code. They're going to be accessible to anyone who has access to your server or source code.
There are two things you can do to protect yourself against malicious use of your keys.
1. Use the Amazon IAM service to create a set of keys that only has permission to perform the tasks your script requires. http://aws.amazon.com/iam/
2. If you have a mobile app or some other app that requires user accounts, you can create a service that issues temporary tokens for each user. The user must have a valid token and your keys to perform any actions. If you want to stop a user from using your keys, you stop generating new tokens for them. http://awsdocs.s3.amazonaws.com/STS/latest/sts-api.pdf
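For example, option 1 with classic boto might look like this (the user, policy, and bucket names are hypothetical):

import json
import boto

iam = boto.connect_iam()
iam.create_user('uploader-app')
# Keys created for this user can only read/write objects in one bucket.
iam.put_user_policy('uploader-app', 's3-only', json.dumps({
    'Statement': [{
        'Effect': 'Allow',
        'Action': ['s3:PutObject', 's3:GetObject'],
        'Resource': 'arn:aws:s3:::my-app-bucket/*'
    }]
}))
keys = iam.create_access_key('uploader-app')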
Specifically for S3: if you're creating an application that allows people to upload content, the only way to protect your account and the other users' information is to make them register an account with you.
The first step of the application would be to authenticate with your server.
Once your server authenticates the user, it makes a request to Amazon's token service and returns a token.
Your application then makes a request using the keys built into the exe and the token.
Based on the permissions applied to that user, he can upload only to the bucket that is assigned to him.
If this seems pretty difficult, then you're probably not ready to design an application that helps users upload data to S3. You're going to have significant security problems if you distribute only one key; even if you can hide that key from the user, they would be able to edit any data added by any user.
The only way around this is to have each user create their own AWS account, with your application helping them upload files to their own S3 account. In that case you don't need to worry about protecting the keys, because each user is responsible for adding their own keys after installing your application.
I've been trying to answer the same question... the generate_url(x) looks quite promising.
This link had a suggestion about creating a CloudFront origin access identity, which I'm guessing taps into the IAM authentication... meaning you could create a key for each application without giving away your main account details. With IAM, you can set per-key permissions as to what they can do, so they can have limited access.
Note: I don't know if this really works, I haven't tried it yet, but it might be another avenue to explore.
2 - Create a CloudFront "Origin Access Identity"
This identity can be reused for many different distributions and keypairs. It is only used to allow CloudFront to access your private S3 objects without allowing everyone. As of now, this step can only be performed using the API.
Boto code is here:

import boto

# Connect to CloudFront, then create a new Origin Access Identity
cf = boto.connect_cloudfront()
oai = cf.create_origin_access_identity(comment='New identity for secure videos')
print("Origin Access Identity ID: %s" % oai.id)
print("Origin Access Identity S3CanonicalUserId: %s" % oai.s3_user_id)
You're right, you can't upload using pre-signed URLs.
There is a different, more complex capability that you can use called GetFederationToken. This will return you some temporary credentials, to which you can apply any policy (permissions) that you like.
So for example, you could write a web service POST /upload that creates a new folder in S3, then creates temporary credentials with permissions to PutObject to only this folder, and returns the folder path and credentials to the caller. Presumably, some authorization check would be performed by this method as well.
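A rough sketch of that flow with classic boto; the bucket, folder, and region are placeholders, and the authorization check is omitted:

import json
import boto.sts

# Policy that allows PutObject only under this user's folder.
policy = json.dumps({
    'Statement': [{
        'Effect': 'Allow',
        'Action': 's3:PutObject',
        'Resource': 'arn:aws:s3:::my-app-bucket/uploads/user-123/*'
    }]
})

sts = boto.sts.connect_to_region('us-east-1')
token = sts.get_federation_token(name='user-123', duration=3600, policy=policy)
# Return token.credentials (access_key, secret_key, session_token)
# and the folder path to the caller.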
You can't embed cloud credentials, or any other credentials, in your application code. Which isn't to say that nobody ever accidentally does this, even security professionals.
To safely distribute credentials to your infrastructure, you need tool support. If you use an AWS facility like CloudFormation, you can (somewhat more) safely give it your credentials. CloudFormation can also create new credentials on the fly. If you use a PaaS like Heroku, you can load your credentials into it, and Heroku will presumably treat them carefully. Another option for AWS is IAM Role. You can create an IAM Role with permission to do what you need, then "pass" the role to your EC2 instance. It will be able to perform the actions permitted by the role.
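For the IAM Role option, the effect on your code is that the keys disappear from it entirely; with a role attached to the instance, boto reads temporary credentials from the instance metadata service:

import boto

# No access_key/secret_key arguments: boto falls back to the
# instance role's temporary credentials automatically.
conn = boto.connect_s3()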
A final option is a dedicated secrets management service, such as Conjur. (Disclaimer: I'm a founder of the company). You load your credentials and other secrets into a dedicated virtual appliance, and you define access permissions that govern the modification and distribution of the credentials. These permissions can be granted to people or to "robots" like your EC2 box. Credentials can be retrieved via REST or client APIs, and every interaction with credentials is recorded to a permanent record.
Don't put it in applications you plan to distribute. It'll be visible, and others can launch instances that are directly billable to you, or worse, they can take down instances if you use the keys in production.
I would look at your program's design and seriously question why you need to include that information in the app. If you post more details on the design, I'm sure we can help you figure out a way in which you don't need to bundle this information.
