Google API service object is not created when run as exe - python

I've written a script to add events to Google Calendar using the Google Calendar API for Python. It works perfectly when run as a plain Python script, but after making it an exe with pyinstaller it fails to create a service object.
Code for creating the service object:
import os
import pickle

from google_auth_oauthlib.flow import InstalledAppFlow
from google.auth.transport.requests import Request
from googleapiclient.discovery import build

def create_service(client_secret_file, api_name, api_version, *scopes, prefix=''):
    CLIENT_SECRET_FILE = client_secret_file
    API_SERVICE_NAME = api_name
    API_VERSION = api_version
    SCOPES = [scope for scope in scopes[0]]

    cred = None
    working_dir = os.getcwd()
    token_dir = 'token files'
    pickle_file = f'token_{API_SERVICE_NAME}_{API_VERSION}{prefix}.pickle'

    ### Check if token dir exists first, if not, create the folder
    if not os.path.exists(os.path.join(working_dir, token_dir)):
        os.mkdir(os.path.join(working_dir, token_dir))

    if os.path.exists(os.path.join(working_dir, token_dir, pickle_file)):
        with open(os.path.join(working_dir, token_dir, pickle_file), 'rb') as token:
            cred = pickle.load(token)

    if not cred or not cred.valid:
        if cred and cred.expired and cred.refresh_token:
            cred.refresh(Request())
        else:
            flow = InstalledAppFlow.from_client_secrets_file(CLIENT_SECRET_FILE, SCOPES)
            cred = flow.run_local_server()

        with open(os.path.join(working_dir, token_dir, pickle_file), 'wb') as token:
            pickle.dump(cred, token)

    try:
        service = build(API_SERVICE_NAME, API_VERSION, credentials=cred)
        print(API_SERVICE_NAME, API_VERSION, 'service created successfully')
        return service
    except Exception as e:
        print(e)
        print(f'Failed to create service instance for {API_SERVICE_NAME}')
        os.remove(os.path.join(working_dir, token_dir, pickle_file))
        return None
And then I create the event using the service object:

created_event = service.events().quickAdd(
    calendarId=Calendar_two,
    text=eventtext).execute()
When run as a .py it works fine and I get the "service created successfully" print, but when run as an exe it fails, gives me the "Failed to create service" print, and spits out the following error: "AttributeError: 'NoneType' object has no attribute 'events'".
Visual Studio's PyLint also reports this error: Instance of 'Resource' has no 'events' member [E:no-member]. But again, it still works as a .py.
Here is the same code in a Google example.
Someone had the exact same problem a few months ago but didn't get any replies.
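
To narrow down why build() only fails inside the exe, this is a diagnostic variant of the try/except from create_service that I can run (API_SERVICE_NAME, API_VERSION and cred come from the function above). It prints the full traceback instead of just the exception message; the static_discovery argument is an assumption on my part, since it only exists in google-api-python-client 2.x, where it makes the client fetch the discovery document over HTTP instead of loading a bundled JSON file:

import traceback
from googleapiclient.discovery import build

try:
    # static_discovery=False only exists in google-api-python-client 2.x (assumption)
    service = build(API_SERVICE_NAME, API_VERSION,
                    credentials=cred,
                    static_discovery=False)
except Exception:
    traceback.print_exc()   # full traceback instead of just print(e)
    service = None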

Related

How to upload files into google drive via api using Python?

I have an Excel file on my local computer that I would like to upload to Google Drive using Python via an API.
The following is my code:
import pickle
import os
import datetime

from google_auth_oauthlib.flow import Flow, InstalledAppFlow
from googleapiclient.discovery import build
from googleapiclient.http import MediaFileUpload, MediaIoBaseDownload
from google.auth.transport.requests import Request

def Create_Service(client_secret_file, api_name, api_version, *scopes):
    print(client_secret_file, api_name, api_version, scopes, sep='-')
    CLIENT_SECRET_FILE = client_secret_file
    API_SERVICE_NAME = api_name
    API_VERSION = api_version
    SCOPES = [scope for scope in scopes[0]]
    print(SCOPES)

    cred = None
    pickle_file = f'token_{API_SERVICE_NAME}_{API_VERSION}.pickle'

    if os.path.exists(pickle_file):
        with open(pickle_file, 'rb') as token:
            cred = pickle.load(token)

    if not cred or not cred.valid:
        if cred and cred.expired and cred.refresh_token:
            cred.refresh(Request())
        else:
            flow = InstalledAppFlow.from_client_secrets_file(CLIENT_SECRET_FILE, SCOPES)
            cred = flow.run_local_server()

        with open(pickle_file, 'wb') as token:
            pickle.dump(cred, token)

    try:
        service = build(API_SERVICE_NAME, API_VERSION, credentials=cred)
        print(API_SERVICE_NAME, 'service created successfully')
        return service
    except Exception as e:
        print('Unable to connect.')
        print(e)
        return None

def convert_to_RFC_datetime(year=1900, month=1, day=1, hour=0, minute=0):
    dt = datetime.datetime(year, month, day, hour, minute, 0).isoformat() + 'Z'
    return dt

CLIENT_SECRET_FILE = '/Users/shruthiravishankar/Downloads/client_secret_316665721335-819139d5ea0aeet1ddshhk6p0mpl8mv2.apps.googleusercontent.com.json'
API_NAME = 'Desktop client 1'
API_VERSION = 'v3'
SCOPES = ['https://www.googleapis.com/auth/drive']

service = Create_Service(CLIENT_SECRET_FILE, API_NAME, API_VERSION, SCOPES)

folder_id = '10Xct2T1vBpqW3-3Ud6mjPuf_lKCN1bUL'
file_names = ['Manual_SIC_MVP (9).xlsx']
mime_types = ['application/vnd.openxmlformats-officedocument.spreadsheetml.sheet']

for file_name, mime_type in zip(file_names, mime_types):
    file_metadata = {
        'name': file_name,
        'parents': [folder_id]
    }
    media = MediaFileUpload('/Users/shruthiravishankar/Downloads/{0}'.format(file_name), mimetype=mime_type)
    print(media)

    service.files().create(
        supportsTeamDrives=True,
        body=file_metadata,
        media_body=media,
        fields='id'
    ).execute()
The error that I am getting is "Unable to connect.", followed by a 'NoneType' error on files().
Is there an issue with the api_version and scope? I am unable to figure out the issue. This is my first time dealing with an API.
The issue is with this statement actually:
API_NAME = 'Desktop client 1'
That's not a valid Google service name.
This particular library needs to know which API service it needs to build, not how you want to call it.
Change that line to the following:
API_NAME = 'drive'
See the GitHub line in question here and the general quickstart docs here.
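
Concretely, keeping the rest of your values the same, the call would then look like this:

API_NAME = 'drive'        # the API's service name, not the OAuth client's display name
API_VERSION = 'v3'
SCOPES = ['https://www.googleapis.com/auth/drive']

service = Create_Service(CLIENT_SECRET_FILE, API_NAME, API_VERSION, SCOPES)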

How can I upload a file to a shared folder in my Google Drive? {python}

I have tried almost everything in the Google documentation and almost all the possibilities I could explore by myself, and I still can't find a viable solution.
I just need to create a program which uploads a given file, for example "test.zip", from my working directory to Google Drive.
I have a client_secret.json, but none of the solutions online actually help, as I am having issues with authentication.
from Google import Create_Service
from googleapiclient.http import MediaFileUpload

CLIENT_SECRET_FILE = "client_secret.json"
API_NAME = "drive"
API_VERSION = "v3"
SCOPES = ["https://www.googleapis.com/auth/drive"]

service = Create_Service(CLIENT_SECRET_FILE, API_NAME, API_VERSION, SCOPES)

folder_id = "1QpsQB_R7JyqxueQwIe8_AvKGm7a25IoJ"
file_names = ["my_file.zip"]
mime_types = ['application/zip']

for file_name, mime_type in zip(file_names, mime_types):
    file_metadata = {
        "name": file_name,
        "parents": [folder_id]
    }
    media = MediaFileUpload('./Uploads/{0}'.format(file_name), mimetype=mime_type)

    service.files().create(
        body=file_metadata,
        media_body=media,
        fields="id"
    ).execute()
This is the code I am using right now. Create_Service is taken from Google.py, shown below:
import pickle
import os
import datetime

from google_auth_oauthlib.flow import Flow, InstalledAppFlow
from googleapiclient.discovery import build
from googleapiclient.http import MediaFileUpload, MediaIoBaseDownload
from google.auth.transport.requests import Request

def Create_Service(client_secret_file, api_name, api_version, *scopes):
    print(client_secret_file, api_name, api_version, scopes, sep='-')
    CLIENT_SECRET_FILE = client_secret_file
    API_SERVICE_NAME = api_name
    API_VERSION = api_version
    SCOPES = [scope for scope in scopes[0]]
    print(SCOPES)

    cred = None
    pickle_file = f'token_{API_SERVICE_NAME}_{API_VERSION}.pickle'
    # print(pickle_file)

    if os.path.exists(pickle_file):
        with open(pickle_file, 'rb') as token:
            cred = pickle.load(token)

    if not cred or not cred.valid:
        if cred and cred.expired and cred.refresh_token:
            cred.refresh(Request())
        else:
            flow = InstalledAppFlow.from_client_secrets_file(CLIENT_SECRET_FILE, SCOPES)
            cred = flow.run_local_server()

        with open(pickle_file, 'wb') as token:
            pickle.dump(cred, token)

    try:
        service = build(API_SERVICE_NAME, API_VERSION, credentials=cred)
        print(API_SERVICE_NAME, 'service created successfully')
        return service
    except Exception as e:
        print('Unable to connect.')
        print(e)
        return None

def convert_to_RFC_datetime(year=1900, month=1, day=1, hour=0, minute=0):
    dt = datetime.datetime(year, month, day, hour, minute, 0).isoformat() + 'Z'
    return dt
But even after allowing authentication it still shows an error.
Any help will be appreciated :)

Is it possible to sync or upload the file from google drive without copying the whole thing in the folder using python?

I just started learning Python scripting and I created a script using PyDrive that uploads all files from a local folder (Linux OS) to Google Drive. I'm planning to modify the script for my automation and add a function that uploads only the most recent file added to the local folder, without re-uploading all the files already inside the folder. May I know if this is possible with a Python script alone?
Thank you in advance!
You don't need to use PyDrive. You can use the Google API Python client library directly; as far as I know, PyDrive uses the client library internally. There's a starter example here.
Quick start (Python):
from __future__ import print_function

import os.path

from google.auth.transport.requests import Request
from google.oauth2.credentials import Credentials
from google_auth_oauthlib.flow import InstalledAppFlow
from googleapiclient.discovery import build
from googleapiclient.errors import HttpError

# If modifying these scopes, delete the file token.json.
SCOPES = ['https://www.googleapis.com/auth/drive.metadata.readonly']

def main():
    """Shows basic usage of the Drive v3 API.
    Prints the names and ids of the first 10 files the user has access to.
    """
    creds = None
    # The file token.json stores the user's access and refresh tokens, and is
    # created automatically when the authorization flow completes for the first
    # time.
    if os.path.exists('token.json'):
        creds = Credentials.from_authorized_user_file('token.json', SCOPES)
    # If there are no (valid) credentials available, let the user log in.
    if not creds or not creds.valid:
        if creds and creds.expired and creds.refresh_token:
            creds.refresh(Request())
        else:
            flow = InstalledAppFlow.from_client_secrets_file(
                'credentials.json', SCOPES)
            creds = flow.run_local_server(port=0)
        # Save the credentials for the next run
        with open('token.json', 'w') as token:
            token.write(creds.to_json())

    try:
        service = build('drive', 'v3', credentials=creds)

        # Call the Drive v3 API
        results = service.files().list(
            pageSize=10, fields="nextPageToken, files(id, name)").execute()
        items = results.get('files', [])

        if not items:
            print('No files found.')
            return
        print('Files:')
        for item in items:
            print(u'{0} ({1})'.format(item['name'], item['id']))
    except HttpError as error:
        # TODO(developer) - Handle errors from drive API.
        print(f'An error occurred: {error}')

if __name__ == '__main__':
    main()
Manage uploads:

file_metadata = {'name': 'photo.jpg'}
media = MediaFileUpload('files/photo.jpg', mimetype='image/jpeg')
# drive_service is the Drive service object built above (called `service` in the quickstart)
file = drive_service.files().create(body=file_metadata,
                                    media_body=media,
                                    fields='id').execute()
print('File ID: %s' % file.get('id'))
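
For the part of the question about uploading only the most recent file rather than everything: one simple approach is to remember the modification time of the last file you uploaded and, on each run, upload only files newer than that. Below is a minimal sketch under those assumptions; the local folder path, Drive folder ID, and state-file name are placeholders, and the service object is assumed to have been built with a scope that allows writing (for example https://www.googleapis.com/auth/drive.file, not the read-only metadata scope used in the quickstart):

import os

from googleapiclient.http import MediaFileUpload

LOCAL_FOLDER = '/path/to/watched/folder'   # placeholder
FOLDER_ID = 'your-drive-folder-id'         # placeholder
STATE_FILE = 'last_upload_mtime.txt'       # remembers how far the last run got

def load_last_mtime():
    # First run: 0.0 means every existing file counts as new
    try:
        with open(STATE_FILE) as f:
            return float(f.read().strip())
    except (FileNotFoundError, ValueError):
        return 0.0

def upload_new_files(service):
    last_mtime = load_last_mtime()
    newest = last_mtime
    for name in sorted(os.listdir(LOCAL_FOLDER)):
        path = os.path.join(LOCAL_FOLDER, name)
        if not os.path.isfile(path):
            continue
        mtime = os.path.getmtime(path)
        if mtime <= last_mtime:
            continue  # already uploaded on a previous run
        media = MediaFileUpload(path, mimetype='application/octet-stream')
        service.files().create(
            body={'name': name, 'parents': [FOLDER_ID]},
            media_body=media,
            fields='id').execute()
        newest = max(newest, mtime)
    # Persist the newest uploaded mtime so the next run skips these files
    with open(STATE_FILE, 'w') as f:
        f.write(str(newest))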

How to deal with HTTPError 403 when using OAuth2 authentication with Python (V. 3.8.1)?

When trying to query Google Search Console for certain metrics (not pictured) using the Search Console API with OAuth 2.0, I am getting an HttpError 403: "User does not have sufficient permission for site 'XXXXXXXX.com'. See also: https://support.google.com/webmasters/answer/2451999." Details: [{'message': "User does not have sufficient permission for site 'https://www.xxxxxx.com'. See also: https://support.google.com/webmasters/answer/2451999.", 'domain': 'global', 'reason': 'forbidden'}] (site URL replaced with x's).
I have also gotten a 403 error that says I do not have a Google Analytics account on some other attempts.
Below is the code to authorize the access:
import os
import argparse

import httplib2
from urllib.parse import urlparse

from googleapiclient.discovery import build
from oauth2client import client, file, tools

def get_domain_name(start_url):
    domain_name = '{uri.netloc}'.format(uri=urlparse(start_url))  # Get domain name to name the project
    domain_name = domain_name.replace('.', '_')
    return domain_name

def create_project(directory):
    if not os.path.exists(directory):
        print('Create project: ' + directory)
        os.makedirs(directory)

def authorize_creds(creds):
    # Variable parameter that controls the set of resources that the access token permits.
    SCOPES = ['https://www.googleapis.com/auth/webmasters.readonly']

    # Path to client_secrets.json file
    CLIENT_SECRETS_PATH = creds

    # Create a parser to be able to open a browser for authorization
    parser = argparse.ArgumentParser(
        formatter_class=argparse.RawDescriptionHelpFormatter,
        parents=[tools.argparser])
    flags = parser.parse_args([])

    flow = client.flow_from_clientsecrets(
        CLIENT_SECRETS_PATH, scope=SCOPES,
        message=tools.message_if_missing(CLIENT_SECRETS_PATH))

    # Prepare credentials and authorize HTTP.
    # If they exist, get them from the storage object;
    # credentials will get written back to a file.
    storage = file.Storage('authorizedcreds.dat')
    credentials = storage.get()

    # If authenticated credentials don't exist, open a browser to authenticate
    if credentials is None or credentials.invalid:
        credentials = tools.run_flow(flow, storage, flags)
    http = credentials.authorize(http=httplib2.Http())
    webmasters_service = build('webmasters', 'v3', http=http)
    return webmasters_service
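
For context, the kind of query I then run against this service (not pictured above) looks roughly like the following; the site URL, dates, and dimensions here are placeholders, not my real values:

webmasters_service = authorize_creds('client_secrets.json')

request = {
    'startDate': '2021-01-01',
    'endDate': '2021-01-31',
    'dimensions': ['query'],
    'rowLimit': 10
}

# The authenticated Google account must have access to exactly this property in
# Search Console, otherwise the API returns the 403 "insufficient permission" error quoted above.
response = webmasters_service.searchanalytics().query(
    siteUrl='https://www.example.com/', body=request).execute()

for row in response.get('rows', []):
    print(row['keys'][0], row['clicks'], row['impressions'])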

How to resolve 'pkg_resources.DistributionNotFound' error in pyinstaller exe

I'm trying to make an application that uses the Google Drive API. I am converting my Python file to an executable using PyInstaller. I ran the .exe file generated from the most basic Python script and got an error that says:
pkg_resources.DistributionNotFound: The 'google-api-python-client' distribution was not found and is required by the application
My code doesn't even use 'google-api-python-client' directly, yet I installed it in my anaconda environment and I am still facing the same issue.
I am adding the code for reference:
import pickle
import os

from googleapiclient.discovery import build
from google_auth_oauthlib.flow import InstalledAppFlow
from google.auth.transport.requests import Request

def Create_Service(client_secret_file, api_name, api_version, *scopes):
    print(client_secret_file, api_name, api_version, scopes, sep='-')
    CLIENT_SECRET_FILE = client_secret_file
    API_SERVICE_NAME = api_name
    API_VERSION = api_version
    SCOPES = [scope for scope in scopes[0]]
    print(SCOPES)

    cred = None
    pickle_file = f'token_{API_SERVICE_NAME}_{API_VERSION}.pickle'
    print(pickle_file)

    if os.path.exists(pickle_file):
        with open(pickle_file, 'rb') as token:
            cred = pickle.load(token)

    if not cred or not cred.valid:
        if cred and cred.expired and cred.refresh_token:
            cred.refresh(Request())
        else:
            flow = InstalledAppFlow.from_client_secrets_file(CLIENT_SECRET_FILE, SCOPES)
            cred = flow.run_local_server(port=0)

        with open(pickle_file, 'wb') as token:
            pickle.dump(cred, token)

    try:
        service = build(API_SERVICE_NAME, API_VERSION, credentials=cred)
        print(API_SERVICE_NAME, 'service created successfully')
        return service
    except Exception as e:
        print('Unable to connect.')
        print(e)
        return None

s = Create_Service('client_secret_kc.json', 'drive', 'v3', ['https://www.googleapis.com/auth/drive'])
The PyInstaller command I used to convert it into a .exe is as follows:
pyinstaller -c -F --add-data "client_secret_kc.json;." Google.py
NB: Google.py is the name of the Python script given above.
Thank you for helping.
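
One workaround I have seen suggested for this exact error (I have not verified it with this setup, so treat it as an assumption) is to have PyInstaller also bundle the package metadata that pkg_resources looks up at runtime, using --copy-metadata:

pyinstaller -c -F --add-data "client_secret_kc.json;." --copy-metadata google-api-python-client Google.py

The googleapiclient import belongs to the google-api-python-client distribution, which is why the error names that package even though the script never imports it under that name.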
