I've rolled my own email validation function before, but I have a project that uses a pre-Google-Cloud-SDK API:
from google.appengine.api.mail import IsEmailValid
It takes a string and returns True or False depending on whether the string follows the format of an email address. It doesn't test whether the address is live; it only parses the string.
Does this functionality exist in Google's Cloud SDK API?
I suspect not, as bulk mail support was dropped from App Engine and, with it, this support function.
The Google Cloud API does not offer this type of validation. You can confirm this by searching the available methods in the client library documentation and in the GitHub repository.
If you deploy your application with Google App Engine, you can always configure the app to handle bounce notifications in case a mail is not delivered.
Alternatively, Gmail has its own getProfile API that lets you check whether a user is valid; unfortunately, it only works for Gmail accounts.
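If you still need a rough, self-contained format check to stand in for the old helper, here is a minimal sketch using only Python's standard library (the regex and function name are my own, and it is deliberately loose, not a full RFC 5322 parser):

```python
import re

# Loose pattern: a non-empty local part, one '@', and a domain
# containing at least one dot. Intentionally not RFC 5322-complete.
_EMAIL_RE = re.compile(r'^[^@\s]+@[^@\s]+\.[^@\s]+$')

def is_email_valid(address):
    """Return True if the string loosely looks like user@domain.tld."""
    return bool(_EMAIL_RE.match(address))
```

Like the old App Engine helper, this only parses the string; it says nothing about whether the mailbox actually exists.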
Box.com supports different authentication methods: OAuth2 and JWT. I'm currently using OAuth2 with developer tokens, which works just fine. Developer tokens expire within an hour, though, so I can't use them in production.
I'm using the Python SDK to upload files to Box, and there is no user interaction here at all. It seems like I can't use the OAuth2 authentication method since no users are involved (it's an automated script). Am I right?
The JWT authentication method requires an enterprise ID, which I can't find. I used this page as a reference: https://box-content.readme.io/docs/box-platform
I've logged in as a co-admin in Box, but can't find the enterprise ID or Custom Apps under the APPS menu.
Is there anything I have missed?
You have to use JWT to make server-to-server API calls. You can find your enterprise ID under Admin Console --> Enterprise Settings --> Account Info --> Enterprise ID.
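For reference, the assertion Box expects in the JWT flow is built from your client ID and that enterprise ID. Below is a sketch of just the claim set, using only the standard library; the helper name is mine, and you would still need a JWT library (e.g. PyJWT) to sign it with your app's RSA key and exchange it at https://api.box.com/oauth2/token. The official boxsdk's JWTAuth class does all of this for you.

```python
import time
import uuid

def build_box_jwt_claims(client_id, enterprise_id):
    """Claim set for a Box server-to-server (JWT) token request."""
    return {
        'iss': client_id,              # your app's client ID
        'sub': enterprise_id,          # the enterprise the app acts on behalf of
        'box_sub_type': 'enterprise',  # app-level (not per-user) token
        'aud': 'https://api.box.com/oauth2/token',
        'jti': uuid.uuid4().hex,       # unique ID so the assertion can't be replayed
        'exp': int(time.time()) + 45,  # Box rejects assertions expiring >60s out
    }
```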
On the client side I have Android users I wish to authenticate using Google Identity Toolkit. I'll mainly be using email/password authentication, but I'm also looking into federated logins. I'm just not sure how to use the Identity Toolkit with Google Cloud Endpoints. So far the only thing I know for sure is that I can't use the get_current_user() method to validate a user.
I came across this user authentication API Explorer demonstration on Google's website, which uses Identity Toolkit. This is what I want to do, but I don't know how to do it. I couldn't find proper documentation that shows how to authenticate users on Cloud Endpoints using the Google Identity Toolkit API.
A step by step guide would be great!
We could do it by adding a security definition with this data to the service's OpenAPI spec:
    authorizationUrl: ''
    flow: implicit
    type: oauth2
    x-google-audiences: [PROJECT_ID]
    x-google-issuer: https://identitytoolkit.google.com/
    x-google-jwks_uri: https://www.googleapis.com/identitytoolkit/v3/relyingparty/publicKeys
I have a website and I need to test it with 250 users. However, I am using Google login via OAuth2. The website is hosted on Google App Engine.
I am stuck at the login part. After we log in, we get an access token from Google that is passed to Google APIs via the Authorization: Bearer header. We use the access token in the application to get user details and to access other Google apps for that user. I don't know how to get that access token from my external test script.
One option is to mock or stub this part of your application out during testing. For instance, you can provide a certain header that tells your application that it's in test mode, and instead of calling the real Google APIs, it calls a mock API instead. If your application is set up for dependency injection, this could be trivial; otherwise, it may involve monkey-patching or similar.
Another option is to use an OAuth2 service account and acquire access tokens for a bunch of users in a test Google Apps domain. Your test script can do this and then pass the access tokens just as a client normally would.
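A minimal sketch of the first option's test-mode switch (every name here, including the fake token table, is invented for illustration; wire the flag into whatever request-handling layer you have):

```python
# Hypothetical fake profiles keyed by made-up test tokens.
FAKE_PROFILES = {
    'test-token-1': {'email': 'loadtest1@example.com', 'name': 'Load Test 1'},
}

def get_user_profile(bearer_token, test_mode=False):
    """Resolve an access token to a user profile.

    In test mode, tokens are looked up in a local table instead of being
    sent to Google's userinfo endpoint, so a load test never hits Google.
    """
    if test_mode:
        return FAKE_PROFILES.get(bearer_token)
    # Real path (omitted in this sketch): call Google's userinfo endpoint
    # with the 'Authorization: Bearer <token>' header.
    raise NotImplementedError('real Google API call goes here')
```

Your load-test script then sends the made-up tokens with the test-mode header, and the application behaves as if 250 distinct users were logged in.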
I'm going to be using the Google Cloud Storage JSON API, and the app will only be using my own Cloud Storage project, not the individual users'. How do I get an access token or API key so I can make requests to my Cloud Storage project from AppEngine without having to worry about expired tokens, etc.?
I see that there's a "Server Key" entry in the Cloud Console, but I'm not sure what it's for.
I'm aware of "activating" my AppEngine project with the Google Cloud project (god, that's confusing) to achieve what I want, but unfortunately my AppEngine project is already "activated" with a different Google Cloud project (which doesn't have Cloud Storage enabled), and the Google Cloud project is already associated with another AppEngine project (wat?).
Also, how would I use this "API key" in my requests via the wrapper library? All the samples online use OAuth2 and sign requests with a user's access token. (I don't think the wrapper library even has support for non-OAuth2 requests?)
Your AppEngine project has a service account associated with it. That account can be granted membership in your second project.
To find the name of your service account, head over to the AppEngine console ( https://appengine.google.com ) and choose your AppEngine project. Down under the Application Settings tab, you'll see a "Service Account Name" that looks like an email address.
Now head over to the cloud console ( https://cloud.google.com/console ) and choose the second project, the one you're using for this Google Cloud Storage stuff. While you're there, make sure you've enabled it for Google Cloud Storage (and the JSON version), and that you've set up billing.
Under "Permissions", you'll see a list of email addresses. Click "Add member" and put in the address we found earlier in the AppEngine console. Choose between owner, editor, or viewer, depending on what your AppEngine app is going to need to do.
Alternately, rather than adding that account to the project itself, you could also grant it permissions for just the buckets or objects it needs to use.
When you invoke the Google Cloud Storage JSON API, you can specify which API key you want to use as a keyword argument on the build() function. You can use the API key from either project.
In order to get hold of credentials for invoking the JSON API, you'll most likely want to use AppAssertionCredentials, as shown here: https://developers.google.com/api-client-library/python/guide/google_app_engine#ServiceAccounts
import httplib2
from google.appengine.api import memcache
from oauth2client.appengine import AppAssertionCredentials
from apiclient import discovery
...
# Credentials for the app's own service account, scoped to Cloud Storage.
credentials = AppAssertionCredentials(scope='https://www.googleapis.com/auth/devstorage.read_write')
# Memcache is passed as httplib2's cache so discovery documents aren't refetched.
http = credentials.authorize(httplib2.Http(memcache))
storage = discovery.build(serviceName='storage', version='v1beta2', http=http, developerKey=your_api_key)
Also note that, in addition to the JSON API, there is also an AppEngine-specific Python library for accessing Google Cloud Storage: https://developers.google.com/appengine/docs/python/googlecloudstorageclient/
I am working on a script to migrate from domain X to domain Y using Python Google Apps APIs.
For each account on my domain I need to export the mail from domain X and import it into domain Y.
I see that I can create an mbox file for each user account using the createMailboxExportRequest method. I then can download the mbox file(s) when it is ready.
Now how can I get the mbox file back into a Google account on domain Y? I need a solution in Python.
There are methods of migrating using the Email Migration API, but that requires an RFC822-format email, and I don't believe mbox is that format.
I would hope there is a method in one of the APIs that can simply import the mbox file that Google exported.
The Audit API you referenced for export is not suitable for use here. From the ToS section 4:
Email Audit API: The Email Audit API is not designed and should not be used for general backup, archival, or journaling purposes. Google reserves the right to prevent a customer from using the Email Audit API in ways that might adversely impact the performance or usability of the Email Audit API.
Additionally, when using the Audit API, you do not get message state (read/unread, starred, labels, etc.).
You should be looking at Gmail IMAP as the method to connect and export messages. IMAP, along with Google's IMAP extensions, gives you access to all of each message's metadata (read/unread, starred, labels, etc.). You can authenticate with the Gmail IMAP servers via OAuth 2.0.
Messages extracted via IMAP should be in RFC822 format and ready for submission to the Migration API (along with their metadata).
Got Your Back (GYB) is an open source Python script that uses OAuth (1.0, since 2.0 wasn't out when I wrote it) and Gmail-specific IMAP commands to back up and restore accounts. I used IMAP for the restore portion so that it'd be compatible with consumer Gmail accounts, which don't support the migration API. However, it may prove a good reference point for you.