I am working on a script to migrate from domain X to domain Y using Python Google Apps APIs.
For each account on my domain I need to export the mail from domain X and import it into domain Y.
I see that I can create an mbox file for each user account using the createMailboxExportRequest method and then download the mbox file(s) when they are ready.
Now how can I get the mbox file back into a Google account on domain Y? I need a solution in Python.
There are methods for migrating using the Email Migration API, but that requires email in RFC822 format, and I don't believe mbox is that format.
I would hope there is a method in one of the APIs that can simply import the mbox file that Google exported.
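Incidentally, as far as I can tell an mbox file is essentially a sequence of RFC822 messages, so I can at least pull the individual messages out with Python's standard mailbox module. A rough sketch (the file name is just a placeholder for the downloaded export):

import mailbox

# Each message inside an mbox file is itself an RFC822 message; the stdlib
# mailbox module can iterate over them.
mbox = mailbox.mbox('exported-user.mbox')   # placeholder file name
for key, message in mbox.items():
    raw_rfc822 = message.as_bytes()         # full RFC822 source of one message
    print(key, message['Subject'], len(raw_rfc822))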
The Audit API you referenced for export is not suitable for use here. From the ToS section 4:
Email Audit API: The Email Audit API is not designed and should not be used for general backup, archival, or journaling purposes. Google reserves the right to prevent a customer from using the Email Audit API in ways that might adversely impact the performance or usability of the Email Audit API.
Additionally, when using the Audit API, you do not get message state (read/unread, starred, labels, etc.).
You should be looking at Gmail IMAP as the method to connect and export messages. IMAP, along with Google's IMAP extensions, provides access to all of the message metadata (read/unread, starred, labels, etc.). You can authenticate with the Gmail IMAP servers via OAuth 2.0.
Messages extracted via IMAP should be in RFC822 format and ready for submission to the Migration API (along with their metadata).
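As a rough illustration of that flow (not drop-in code: the address and the OAuth 2.0 access token below are placeholders, and error handling is omitted), imaplib with Gmail's XOAUTH2 mechanism and X-GM-LABELS extension looks something like this:

import imaplib

user = 'someone@domain-x.example'         # placeholder address
access_token = 'ACCESS_TOKEN'             # placeholder OAuth 2.0 token
auth_string = 'user=%s\1auth=Bearer %s\1\1' % (user, access_token)

imap = imaplib.IMAP4_SSL('imap.gmail.com')
imap.authenticate('XOAUTH2', lambda _: auth_string.encode())
imap.select('"[Gmail]/All Mail"', readonly=True)

typ, data = imap.search(None, 'ALL')
for num in data[0].split():
    # RFC822 source plus read state and Gmail labels for one message
    typ, fetched = imap.fetch(num, '(RFC822 FLAGS X-GM-LABELS)')
    raw_rfc822 = fetched[0][1]
    # ... submit raw_rfc822 (and its metadata) to the Migration API here
imap.logout()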
Got Your Back (GYB) is an open source Python script that uses OAuth (1.0, since 2.0 wasn't out when I wrote it) and Gmail-specific IMAP commands to back up and restore accounts. I used IMAP for the restore portion so that it would be compatible with consumer Gmail accounts, which don't support the Migration API. However, it may prove a good reference point for you.
I am trying to access a Google Sheet stored in my Drive through the Google Sheets REST API.
This will just be a Python script without any user interaction. How can I authenticate my request using something like an access key or a service account?
I understand the concept of generating access keys or creating a service account in my Google Cloud console, but I don't quite understand how the Sheet in my Drive gets associated with it.
I would like to know the steps I should follow in order to accomplish this. For instance, how can I send a request to this API endpoint?
GET https://sheets.googleapis.com/v4/spreadsheets/{spreadsheetId}
Note: I want to do this using the REST API directly; I do not want to use an existing Python client library. I simply want to hit the above endpoint using, say, the requests package.
Google does not permit access to Google (Workspace?) documents with an API key alone.
See Authorizing Requests
API keys authenticate programs.
OAuth is used to authenticate users, and Google requires that users authenticate requests when accessing user data stored in Workspace documents.
Domain-wide Delegation enables the use of a Service Account to operate on behalf of users in situations such as this, but it is only available for (paid) Workspace accounts.
I'm unsure how to refer to the free and paid (Workspace) versions.
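For a paid Workspace domain where the service account has been granted domain-wide delegation for the Sheets scope, the token exchange can be done with the google-auth library (used only to mint the access token; the endpoint itself is still hit with requests). A sketch only; the key file, delegated user, and spreadsheet ID are placeholders:

import requests
from google.oauth2 import service_account
from google.auth.transport.requests import Request

SCOPES = ['https://www.googleapis.com/auth/spreadsheets.readonly']

# Service account key file and delegated user are placeholders.
creds = service_account.Credentials.from_service_account_file(
    'service-account.json', scopes=SCOPES)
delegated = creds.with_subject('user@your-domain.example')  # act on this user's behalf
delegated.refresh(Request())                                # obtain an access token

spreadsheet_id = 'SPREADSHEET_ID'
resp = requests.get(
    'https://sheets.googleapis.com/v4/spreadsheets/' + spreadsheet_id,
    headers={'Authorization': 'Bearer ' + delegated.token})
print(resp.status_code, resp.json().get('properties', {}).get('title'))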
Preface: I work at an agency with access to many clients' Google Analytics accounts, and I am setting up an API pipeline to move all the clients' data to a data warehouse. To use the Python API library, I created a service account that needs to be added as a user by each client, which can take a while depending on the client.
Question: why is it that a tool like Stitch Data can access all of my clients' data without having a service account of theirs added to each individual client? How does the Singer tap it uses work around this service-account problem?
I've rolled my own email validation function before, but have a project that uses a pre-Google-Cloud-SDK API:
from google.appengine.api.mail import IsEmailValid
This takes a string and returns True/False depending on whether it follows the format of an email address. It doesn't test whether the email address is live; it only parses the string.
Does this functionality exist in Google's Cloud SDK api?
I suspect not, as bulk mail support was dropped from App Engine and, with it, this support function.
The Google Cloud API does not offer this type of validation. You can actually search all the available methods in the Client library documentation and in the GitHub repository.
If you are deploying your application with Google App Engine, you can always configure the App to handle Bounce notifications in case a mail is not delivered.
Alternatively, Gmail has its own getProfile API that lets you check whether a user is valid; unfortunately, it only works for Gmail accounts.
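If all that's needed is the old format-only behaviour, a small stand-in can be written by hand. This is only a sketch: it does not replicate App Engine's exact rules and does not check that the mailbox actually exists.

import re
from email.utils import parseaddr

_SIMPLE_EMAIL_RE = re.compile(r'^[^@\s]+@[^@\s]+\.[^@\s]+$')

def is_email_valid(address):
    # Format-only check, roughly in the spirit of the old IsEmailValid:
    # parse the string and test it against a simple address pattern.
    _, addr = parseaddr(address)
    return bool(addr) and bool(_SIMPLE_EMAIL_RE.match(addr))

print(is_email_valid('alice@example.com'))  # True
print(is_email_valid('not-an-email'))       # False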
I have a script (that I did not write) that uses basic authentication to access email boxes online. The script uses the following code:
from O365 import Connection
Connection.login(user, password)
It recently failed. Doing a bit of research, I found the following error:
'Basic Authentication for Outlook REST API is no longer supported...'
I found some information on https://aka.ms/BasicAuthDeprecated:
Last year, we announced that in November 2018, we will stop supporting Basic Authentication in the Office 365 Outlook REST API v1.0 and this is a follow up announcement to reiterate that we will be decommissioning Basic Authentication in Outlook REST API v1.0 this month. This means that new or existing apps will not be able to use Basic Authentication in v1.0 and Beta versions of Outlook REST API starting December 2018.

If you have been using Basic Authentication in Office 365 Outlook REST API v1.0/Beta in your app, you should immediately transition to Microsoft Graph-based Outlook REST APIs to continue accessing Exchange Online data.
Web authentication is not my strong suit and I'm not sure what I need to do here. It looks like some sort of registration/token generation is required.
You will need to use OAuth to authenticate with O365, as Basic Authentication is no longer supported.
https://pypi.org/project/O365/#authentication has details on the Python library integration with OAuth.
Circling back to this so that others see it: the code example in the authentication section works:
from O365 import Account, FileSystemTokenBackend
credentials = ('id', 'secret')
# this will store the token under: "my_project_folder/my_folder/my_token.txt".
# you can pass strings to token_path or Path instances from pathlib
token_backend = FileSystemTokenBackend(token_path='my_folder', token_filename='my_token.txt')
account = Account(credentials, token_backend=token_backend)
# This account instance's tokens will be stored in the token_backend configured above.
# You don't have to do anything more
# ...
When you run the code, it provides a URL that you paste into a browser; after you authorize, the browser lands on a new URL that you need to copy back, which completes authentication and stores the token for further use.
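For completeness, the call that kicks off that flow is Account.authenticate; the scope names here follow the O365 project's README and may differ between library versions:

# Triggers the flow described above and saves the token via the
# configured token backend.
if account.authenticate(scopes=['basic', 'message_all']):
    print('Authenticated; token stored for future runs.')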
I am making an application in GAE (python) which allows users (who must be logged in with a Google Account) to upload files to Google Cloud Storage directly. The upload would happen asynchronously (a button would execute some javascript code to send the XHR PUT request with the file).
I decided to use the JSON API, and I'm trying to do a multipart upload. According to the docs I need to include an Authorization header with a token. I was wondering, since I don't need access to anything in the user's account (like calendar or that kind of thing), what would be the best way to achieve this?
Is it possible to create a temporary access token from the application default credentials and send that to the user within the html file to use it then from the js function? If not, what should I do?
You would need to ask the user to grant you the Google Cloud Storage write scopes (which I would strongly recommend any Google Cloud Storage user not grant to a random application). You would also need to grant the end users write permission on your bucket (which also means they could delete everything in it). Unless you know all of your end users up front, this means you would need to make the bucket publicly writable, which I doubt is something you want to do.
I strongly recommend delegating access via your service account instead; unfortunately, the JSON API does not currently support any form of authentication delegation. However, the XML API does support signed URLs.
https://cloud.google.com/storage/docs/access-control/signed-urls
You only need to use it for the client-side uploads; everything else can use the JSON API.
There are three options:
Just sign a simple PUT request
https://cloud.google.com/storage/docs/xml-api/put-object-upload#query_string_parameters
Use a form POST and sign a policy document
https://cloud.google.com/storage/docs/xml-api/post-object#policydocument
Initiate a resumable upload server side and pass the upload URL back to the client. I would only recommend this option if being able to resume the upload is important (e.g. large uploads).
https://cloud.google.com/storage/docs/access-control/signed-urls#signing-resumable
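As a sketch of option 1 (not a drop-in implementation: bucket and object names are placeholders, and the server needs credentials capable of signing, e.g. a service account key), the google-cloud-storage client can produce the signed PUT URL server-side:

import datetime
from google.cloud import storage

client = storage.Client()
bucket = client.bucket('my-upload-bucket')          # placeholder bucket
blob = bucket.blob('uploads/user-file.bin')         # placeholder object name

# Short-lived URL that lets the browser PUT this one object, nothing else.
signed_url = blob.generate_signed_url(
    version='v4',
    expiration=datetime.timedelta(minutes=15),
    method='PUT',
    content_type='application/octet-stream')

# The client-side JavaScript then PUTs the file to signed_url with the same
# Content-Type header; no Google credentials ever reach the browser.
print(signed_url)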