I'm running a Python script on Compute Engine that extracts data from a Google Sheet via the Sheets API, and it works fine. When I run the same thing from Composer, it doesn't work.
From Composer I log in to the VM as the same user via an SSH command, and I'm using the same service account for the Google Sheet and the VM.
The following is the error message I'm getting:
File "/home/utetwork_multiplier.py", line 15, in sheet_reade
result = service.spreadsheets().values().get(spreadsheetId=spreadsheet_id, range=range_name).execute(
File "/hsite-packages/googleapiclient/_helpers.py", line 130, in positional_wrappe
return wrapped(*args, **kwargs
File "/on3.5/site-packages/googleapiclient/http.py", line 849, in execut
raise HttpError(resp, content, uri=self.uri
googleapiclient.errors.HttpError: <HttpError 403 when requesting https://sheets.googleapis.com/v4/spreadsheets/14r0cQ1RCXhLyd7i0VCrmddXOFiugFnfioRb6cYI_BWQ/values/Master%21A%3AJ?alt=json returned "Request had insufficient authentication scopes."
Traceback (most recent call last)
File "/usr/local/lib/airflow/airflow/contrib/operators/ssh_operator.py", line 164, in execut
.format(self.command, error_msg)
airflow.exceptions.AirflowException: error running cmd: set -e;cd src/digital_platform && ../../venvs/bdp/bin/python -m.marketing.adnetwork_multiplier, error: Traceback (most recent call last)
File "/usr/lib/python3.5/runpy.py", line 184, in _run_module_as_mai
"__main__", mod_spec
File "/usr/lib/python3.5/runpy.py", line 85, in _run_cod
exec(code, run_globals
File "/home/tt/src/digital_platform//marketing/adnetwork_multiplier.py", line 74, in <module
main(
File "/home/tt/src/digital_platform//marketing/adnetwork_multiplier.py", line 28, in mai
data = sheet_reader(range_name
File "/home/tt/src/digital_platform//marketing/adnetwork_multiplier.py", line 15, in sheet_reade
result = service.spreadsheets().values().get(spreadsheetId=spreadsheet_id, range=range_name).execute(
File "/home//venvs/bdp/lib/python3.5/site-packages/googleapiclient/_helpers.py", line 130, in positional_wrappe
return wrapped(*args, **kwargs
File "/home//venvs/bdp/lib/python3.5/site-packages/googleapiclient/http.py", line 849, in execut
raise HttpError(resp, content, uri=self.uri
googleapiclient.errors.HttpError: <HttpError 403 when requesting https://sheets.googleapis.com/v4/spreadsheets/14r0cQ1RCXhLydcYI_BWQ/values/Master%21A%3AJ?alt=json returned "Request had insufficient authentication scopes."
During handling of the above exception, another exception occurred
Scopes:
import os

from oauth2client import tools
from oauth2client.client import flow_from_clientsecrets
from oauth2client.file import Storage

scope = [
    'https://www.googleapis.com/auth/bigquery',
    'https://www.googleapis.com/auth/spreadsheets.readonly',
    'https://www.googleapis.com/auth/spreadsheets'
]
home = os.path.expanduser('~')
csf = os.path.join(home, '.client_secret.json')
token_filename = os.path.join(home, '.google_auth.dat')
flow = flow_from_clientsecrets(csf, scope=scope, message="%s is missing" % csf)
storage = Storage(token_filename)
credentials = storage.get()
if credentials is None or credentials.invalid:
    # No cached token, or it is invalid: run the OAuth consent flow without a local browser.
    flags = tools.argparser.parse_args(args=[])
    flags.noauth_local_webserver = True
    credentials = tools.run_flow(flow, storage, flags=flags)
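For reference, the service object used in the failing call is presumably built along these lines (a sketch assuming the oauth2client credentials above; not taken from the original post):

import httplib2
from googleapiclient.discovery import build

# Authorize an HTTP client with the stored credentials and build the Sheets service.
http = credentials.authorize(httplib2.Http())
service = build('sheets', 'v4', http=http)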
From the error message you appear to be calling Method: spreadsheets.values.get.
That method requires one of the following scopes for access:
https://www.googleapis.com/auth/drive
https://www.googleapis.com/auth/drive.readonly
https://www.googleapis.com/auth/drive.file
https://www.googleapis.com/auth/spreadsheets
https://www.googleapis.com/auth/spreadsheets.readonly
The error message states:
403 when requesting https://sheets.googleapis.com/v4/spreadsheets/14r0cQ1RCXhLydcYI_BWQ/values/Master%21A%3AJ?alt=json returned "Request had insufficient authentication scopes."
This means that the user you authenticated with did not grant one of the required scopes above. I suggest that you log the user out, delete that user's stored credentials, and run your code again, making sure that when the consent screen pops up it asks for one of the required scopes. I suspect you are running your script with outdated grants from the user.
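In terms of the snippet above, that boils down to something like the following (a sketch, assuming the same oauth2client setup and the token_filename/flow/storage objects from the question):

import os

# Remove the cached token so the next run re-prompts for consent
# with the scopes currently listed in `scope`.
if os.path.exists(token_filename):
    os.remove(token_filename)

flags = tools.argparser.parse_args(args=[])
flags.noauth_local_webserver = True
credentials = tools.run_flow(flow, storage, flags=flags)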
Related
I am building a python client-side application that uses Firestore. I have successfully used Google Identity Platform to sign up and sign in to the Firebase project, and created a working Firestore client using google.cloud.firestore.Client which is authenticated as a user:
import json
import requests
from requests.exceptions import HTTPError

import google.oauth2.credentials
from google.cloud import firestore

# Sign in with email/password via the Identity Toolkit REST API.
request_url = f"https://identitytoolkit.googleapis.com/v1/accounts:signInWithPassword?key={self.__api_key}"
headers = {"Content-Type": "application/json; charset=UTF-8"}
data = json.dumps({"email": self.__email, "password": self.__password, "returnSecureToken": True})
response = requests.post(request_url, headers=headers, data=data)
try:
    response.raise_for_status()
except (HTTPError, Exception):
    content = response.json()
    error = f"error: {content['error']['message']}"
    raise AuthError(error)  # AuthError is an application-specific exception

json_response = response.json()
self.__token = json_response["idToken"]
self.__refresh_token = json_response["refreshToken"]

# Build user credentials from the Firebase ID token and refresh token.
credentials = google.oauth2.credentials.Credentials(self.__token,
                                                    self.__refresh_token,
                                                    client_id="",
                                                    client_secret="",
                                                    token_uri=f"https://securetoken.googleapis.com/v1/token?key={self.__api_key}")
self.__db = firestore.Client(self.__project_id, credentials)
I have the problem, however, that when the token has expired, I get the following error:
Traceback (most recent call last):
File "/usr/local/lib/python3.7/dist-packages/google/api_core/grpc_helpers.py", line 57, in error_remapped_callable
return callable_(*args, **kwargs)
File "/usr/local/lib/python3.7/dist-packages/grpc/_channel.py", line 826, in __call__
return _end_unary_response_blocking(state, call, False, None)
File "/usr/local/lib/python3.7/dist-packages/grpc/_channel.py", line 729, in _end_unary_response_blocking
raise _InactiveRpcError(state)
grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
status = StatusCode.UNAUTHENTICATED
details = "Missing or invalid authentication."
debug_error_string = "{"created":"#1613043524.699081937","description":"Error received from peer ipv4:172.217.16.74:443","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Missing or invalid authentication.","grpc_status":16}"
>
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/usr/lib/python3.7/threading.py", line 917, in _bootstrap_inner
self.run()
File "/home/my_app/src/controllers/im_alive.py", line 20, in run
self.__device_api.set_last_updated(utils.device_id())
File "/home/my_app/src/api/firestore/firestore_device_api.py", line 21, in set_last_updated
"lastUpdatedTime": self.__firestore.SERVER_TIMESTAMP
File "/home/my_app/src/api/firestore/firestore.py", line 100, in update
ref.update(data)
File "/usr/local/lib/python3.7/dist-packages/google/cloud/firestore_v1/document.py", line 382, in update
write_results = batch.commit()
File "/usr/local/lib/python3.7/dist-packages/google/cloud/firestore_v1/batch.py", line 147, in commit
metadata=self._client._rpc_metadata,
File "/usr/local/lib/python3.7/dist-packages/google/cloud/firestore_v1/gapic/firestore_client.py", line 1121, in commit
request, retry=retry, timeout=timeout, metadata=metadata
File "/usr/local/lib/python3.7/dist-packages/google/api_core/gapic_v1/method.py", line 145, in __call__
return wrapped_func(*args, **kwargs)
File "/usr/local/lib/python3.7/dist-packages/google/api_core/retry.py", line 286, in retry_wrapped_func
on_error=on_error,
File "/usr/local/lib/python3.7/dist-packages/google/api_core/retry.py", line 184, in retry_target
return target()
File "/usr/local/lib/python3.7/dist-packages/google/api_core/timeout.py", line 214, in func_with_timeout
return func(*args, **kwargs)
File "/usr/local/lib/python3.7/dist-packages/google/api_core/grpc_helpers.py", line 59, in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)
File "<string>", line 3, in raise_from
google.api_core.exceptions.Unauthenticated: 401 Missing or invalid authentication.
I have tried omitting the token and only specifying the refresh token, and then calling credentials.refresh(), but the expires_in in the response from the https://securetoken.googleapis.com/v1/token endpoint is a string instead of a number (docs here), which makes _parse_expiry(response_data) in google.oauth2._client.py:257 raise an exception.
Is there any way to use the firestore.Client from either google.cloud or firebase_admin and have it automatically handle refreshing tokens, or do I need to switch to the manually calling the Firestore RPC API and refreshing tokens at the correct time?
Note: There are no users interacting with the python app, so the solution must not require user interaction.
Can't you just pass the string cast as an integer, i.e. _parse_expiry(int(float(response_data)))?
If that does not work, you could try making the call and refreshing the token after getting a 401 error; see my answer for the general idea of how to handle tokens.
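The general idea looks roughly like this (a sketch, not from the original answer; make_client is a hypothetical factory that re-runs the signInWithPassword flow from the question and returns a freshly authenticated firestore.Client):

from google.api_core.exceptions import Unauthenticated

def update_with_refresh(make_client, doc_path, data):
    client = make_client()
    try:
        client.document(doc_path).update(data)
    except Unauthenticated:
        # The ID token has expired: sign in again and retry the write once.
        client = make_client()
        client.document(doc_path).update(data)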
As mentioned by @Marco, it is recommended that you use a service account if it's going to be used in an environment without a user. When you use a service account, you can just set the GOOGLE_APPLICATION_CREDENTIALS environment variable to the location of the service account JSON file and instantiate the Firestore client without any credentials (the credentials will be picked up automatically):
from google.cloud import firestore

client = firestore.Client()
and run it as (assuming Linux):
$ export GOOGLE_APPLICATION_CREDENTIALS=/path/to/credentials.json
$ python file.py
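If you prefer not to rely on the environment variable, you can also load the key file explicitly (a sketch; the path is a placeholder):

from google.cloud import firestore
from google.oauth2 import service_account

# Build credentials directly from the service account key file.
credentials = service_account.Credentials.from_service_account_file('/path/to/credentials.json')
client = firestore.Client(project=credentials.project_id, credentials=credentials)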
Still, if you really want to use user credentials for the script, you can install the Google Cloud SDK, then:
$ gcloud auth application-default login
This will open a browser for you to select an account and log in. After logging in, it creates a "virtual" service account file corresponding to your user account (which will also be loaded automatically by clients). Here too, you don't need to pass any parameters to your client.
See also: Difference between “gcloud auth application-default login” and “gcloud auth login”
I'm a beginner trying to access a Google Sheet via Python gspread. I'm following these instructions: https://towardsdatascience.com/accessing-google-spreadsheet-data-using-python-90a5bc214fd2.
I believe I've successfully created my credentials: survivor-points-update@update-survivor-gspread.iam.gserviceaccount.com. When I try to share the Google Sheets document with this email, I get an email response telling me delivery failed because the domain couldn't be found. More specifically:
DNS Error: 19491 DNS type 'mx' lookup of update-survivor-gspread.iam.gserviceaccount.com responded with code NXDOMAIN Domain name not found: update-survivor-gspread.iam.gserviceaccount.com
I've found this alternative, but I got a 401 error ("The OAuth client was not found"):
client_id=<CLIENTID_FROM_GOOGLE_DEVLOPER_CONSOLE>
redirect_uri=http://localhost:8080/
scope=https://spreadsheets.google.com/feeds
access_type=offline
response_type=code
I've tried setting up my credentials a second time, but those don't work either. Thanks in advance for any help.
EDIT: On a hunch, I tried to share other documents with my two service accounts, and that didn't help.
EDIT 2: I tried a few different settings and scopes as suggested in the GitHub issue.
My main code is:
import gspread
from oauth2client.service_account import ServiceAccountCredentials

scope = ['https://spreadsheets.google.com/feeds']
creds = ServiceAccountCredentials.from_json_keyfile_name(cred_path, scope)
client = gspread.authorize(creds)
sheet = client.open(gsheet).sheet1
With this scope, I get the following error:
Traceback (most recent call last):
File "<ipython-input-8-b2d4b9300652>", line 1, in <module>
sheet = client.open(gsheet).sheet1
File "C:\Users\User\Anaconda3\lib\site-packages\gspread\client.py", line 123, in open
self.list_spreadsheet_files()
File "C:\Users\User\Anaconda3\lib\site-packages\gspread\client.py", line 96, in list_spreadsheet_files
res = self.request('get', url, params=params).json()
File "C:\Users\User\Anaconda3\lib\site-packages\gspread\client.py", line 79, in request
raise APIError(response)
APIError: {
"error": {
"errors": [
{
"domain": "global",
"reason": "insufficientPermissions",
"message": "Insufficient Permission: Request had insufficient authentication scopes."
}
],
"code": 403,
"message": "Insufficient Permission: Request had insufficient authentication scopes."
}
}
If I change:
scope = ['https://spreadsheets.google.com/feeds',
'https://www.googleapis.com/auth/drive']
then I get:
Traceback (most recent call last):
File "C:\Users\User\Anaconda3\lib\site-packages\gspread\client.py", line 123, in open
self.list_spreadsheet_files()
File "C:\Users\User\Anaconda3\lib\site-packages\gspread\utils.py", line 37, in finditem
return next((item for item in seq if func(item)))
StopIteration
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "<ipython-input-22-b2d4b9300652>", line 1, in <module>
sheet = client.open(gsheet).sheet1
File "C:\Users\User\Anaconda3\lib\site-packages\gspread\client.py", line 131, in open
raise SpreadsheetNotFound
SpreadsheetNotFound
I've tried enabling the Google Sheets API for both my projects and running the code again with both scope options.
If I run client.openall() it returns only the following list: [<Spreadsheet 'Access Rights Spreadsheet' id:1TwF6neGtogHAsqu4vjIiGWpfUPiQmVFgF5kR8k_8jC0>]
Is there another scope I can add or API I can enable?
EDIT 3: I don't know how [<Spreadsheet 'Access Rights Spreadsheet' id:1TwF6neGtogHAsqu4vjIiGWpfUPiQmVFgF5kR8k_8jC0>] was shared with me. I thought it was just one that everybody had.
When I run this:
bk = client.open_by_key('1TwF6neGtogHAsqu4vjIiGWpfUPiQmVFgF5kR8k_8jC0')
sht = bk.get_worksheet(0)
l = sht.get_all_values()
l is an empty list. bk.worksheets() returns [<Worksheet 'Sheet1' id:0>] so I don't see any other data.
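One thing that may be worth double-checking (an aside, not from the original post): the exact address to share the sheet with is the client_email field inside the service-account key file, which you can print like this:

import json

# Print the service account's email address from the key file;
# the spreadsheet must be shared with exactly this address.
with open(cred_path) as f:
    print(json.load(f)['client_email'])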
I'm trying to access a Google Cloud Storage bucket from a Cloud Functions (Python) instance and it's throwing a cryptic 500 error.
I have given the service account the Editor role too; it didn't make any difference.
I also checked whether any of the quotas were being exceeded; the usage was nowhere near the limits.
Can anyone help me find the cause of this error?
Here is the code:
from google.cloud import storage
import os
import base64

storage_client = storage.Client()

def init_analysis(event, context):
    print("event", event)
    # Decode the Pub/Sub message payload.
    pubsub_message = base64.b64decode(event['data']).decode('utf-8')
    print(pubsub_message)
    bucket_name = 'my-bucket'
    bucket = storage_client.get_bucket(bucket_name)
    blobs = bucket.list_blobs()
    for blob in blobs:
        print(blob.name)
Error:
Traceback (most recent call last):
File "/env/local/lib/python3.7/site-packages/google/auth/compute_engine/credentials.py", line 99, in refresh
service_account=self._service_account_email)
File "/env/local/lib/python3.7/site-packages/google/auth/compute_engine/_metadata.py", line 208, in get_service_account_token
'instance/service-accounts/{0}/token'.format(service_account))
File "/env/local/lib/python3.7/site-packages/google/auth/compute_engine/_metadata.py", line 140, in get
url, response.status, response.data), response)
google.auth.exceptions.TransportError: ("Failed to retrieve http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/my-project@appspot.gserviceaccount.com/token from the Google Compute Engine metadata service. Status: 500 Response:\nb'Could not fetch URI /computeMetadata/v1/instance/service-accounts/my-project@appspot.gserviceaccount.com/token\\n'", <google.auth.transport.requests._Response object at 0x2b0ef9edf438>)
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/env/local/lib/python3.7/site-packages/google/cloud/functions/worker.py", line 383, in run_background_function
_function_handler.invoke_user_function(event_object)
File "/env/local/lib/python3.7/site-packages/google/cloud/functions/worker.py", line 217, in invoke_user_function
return call_user_function(request_or_event)
File "/env/local/lib/python3.7/site-packages/google/cloud/functions/worker.py", line 214, in call_user_function
event_context.Context(**request_or_event.context))
File "/user_code/main.py", line 21, in init_analysis
bucket = storage_client.get_bucket(bucket_name)
File "/env/local/lib/python3.7/site-packages/google/cloud/storage/client.py", line 227, in get_bucket
bucket.reload(client=self)
File "/env/local/lib/python3.7/site-packages/google/cloud/storage/_helpers.py", line 130, in reload
_target_object=self,
File "/env/local/lib/python3.7/site-packages/google/cloud/_http.py", line 315, in api_request
target_object=_target_object,
File "/env/local/lib/python3.7/site-packages/google/cloud/_http.py", line 192, in _make_request
return self._do_request(method, url, headers, data, target_object)
File "/env/local/lib/python3.7/site-packages/google/cloud/_http.py", line 221, in _do_request
return self.http.request(url=url, method=method, headers=headers, data=data)
File "/env/local/lib/python3.7/site-packages/google/auth/transport/requests.py", line 205, in request
self._auth_request, method, url, request_headers)
File "/env/local/lib/python3.7/site-packages/google/auth/credentials.py", line 122, in before_request
self.refresh(request)
File "/env/local/lib/python3.7/site-packages/google/auth/compute_engine/credentials.py", line 102, in refresh
six.raise_from(new_exc, caught_exc)
File "<string>", line 3, in raise_from
google.auth.exceptions.RefreshError: ("Failed to retrieve http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/my-project@appspot.gserviceaccount.com/token from the Google Compute Engine metadata service. Status: 500 Response:\nb'Could not fetch URI /computeMetadata/v1/instance/service-accounts/my-project@appspot.gserviceaccount.com/token\\n'", <google.auth.transport.requests._Response object at 0x2b0ef9edf438>)
google.auth.exceptions.TransportError: ("Failed to retrieve http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/my-project@appspot.gserviceaccount.com/token from the Google Compute Engine metadata service. Status: 500 Response:\nb'Could not fetch URI /computeMetadata/v1/instance/service-accounts/my-project@appspot.gserviceaccount.com/token\\n'"
The error you are getting is because your Cloud Functions service account doesn't have the cloudfunctions.serviceAgent role. As you can see in the documentation:
Authenticating as the runtime service account from inside your function may fail if you change the Cloud Functions service account's permissions.
However, I found that sometimes you can not add this role as it doesn't show up as an option. I have reported this issue to the Google Cloud Functions engineering team and they are working to solve it.
Nevertheless, you can add the role again using this gcloud command:
gcloud projects add-iam-policy-binding <project_name> --role=roles/cloudfunctions.serviceAgent --member=serviceAccount:service-<project_number>@gcf-admin-robot.iam.gserviceaccount.com
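To check whether the binding is present, you can inspect the project's IAM policy (a quick check, assuming the gcloud SDK is installed; <project_name> is a placeholder):

gcloud projects get-iam-policy <project_name> \
    --flatten="bindings[].members" \
    --filter="bindings.role:roles/cloudfunctions.serviceAgent" \
    --format="table(bindings.members)"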
I'm trying to launch an ml-engine jobs submit training job using Cloud Composer; I'm using this guide for instructions: recommendation-system-tensorflow-deploy.
I'm using a plugin which Google created (see the implementation here).
I'm trying to make it work on Python version 3.5, by changing line 206 from:
training_request = {
    'jobId': job_id,
    'trainingInput': {
        'scaleTier': self._scale_tier,
        'packageUris': self._package_uris,
        'pythonModule': self._training_python_module,
        'region': self._region,
        'args': self._training_args,
        'masterType': self._master_type
    }
}
To:
training_request = {
    'jobId': job_id,
    'trainingInput': {
        'scaleTier': self._scale_tier,
        'packageUris': self._package_uris,
        'pythonModule': self._training_python_module,
        'region': self._region,
        'args': self._training_args,
        'masterType': self._master_type,
        'python-version': '3.5'  # self._python_version
    }
}
I also tried adding the runtime version to it (runtime-version='1.12'), but I keep getting the following error:
[2019-01-20 11:58:36,331] {models.py:1594} ERROR - <HttpError 400 when requesting https://ml.googleapis.com/v1/projects/hallowed-forge-577/jobs?alt=json returned "Invalid JSON payload received. Unknown name "python-version" at 'job.training_input': Cannot find field.">
Traceback (most recent call last):
File "/usr/local/lib/airflow/airflow/models.py", line 1492, in _run_raw_task
result = task_copy.execute(context=context)
File "/home/airflow/gcs/plugins/ml_engine_plugin.py", line 241, in execute
self._project_id, training_request, check_existing_job)
File "/home/airflow/gcs/plugins/ml_engine_plugin.py", line 79, in create_job
request.execute()
File "/usr/local/lib/python3.6/site-packages/oauth2client/util.py", line 135, in positional_wrapper
return wrapped(*args, **kwargs)
File "/usr/local/lib/python3.6/site-packages/googleapiclient/http.py", line 838, in execute
raise HttpError(resp, content, uri=self.uri)
googleapiclient.errors.HttpError: <HttpError 400 when requesting https://ml.googleapis.com/v1/projects/hallowed-forge-577/jobs?alt=json returned "Invalid JSON payload received. Unknown name "python-version" at 'job.training_input': Cannot find field.">
[2019-01-20 11:58:36,334] {models.py:1623} INFO - Marking task as FAILED.
[2019-01-20 11:58:36,513] {models.py:1627} ERROR - Failed to send email to: ['airflow@example.com']
[2019-01-20 11:58:36,516] {models.py:1628} ERROR - HTTP Error 401: Unauthorized
Traceback (most recent call last):
File "/usr/local/lib/airflow/airflow/models.py", line 1625, in handle_failure
self.email_alert(error, is_retry=False)
File "/usr/local/lib/airflow/airflow/models.py", line 1778, in email_alert
send_email(task.email, title, body)
File "/usr/local/lib/airflow/airflow/utils/email.py", line 44, in send_email
return backend(to, subject, html_content, files=files, dryrun=dryrun, cc=cc, bcc=bcc, mime_subtype=mime_subtype)
File "/usr/local/lib/airflow/airflow/contrib/utils/sendgrid.py", line 116, in send_email
_post_sendgrid_mail(mail.get())
File "/usr/local/lib/airflow/airflow/contrib/utils/sendgrid.py", line 122, in _post_sendgrid_mail
response = sg.client.mail.send.post(request_body=mail_data)
File "/usr/local/lib/python3.6/site-packages/python_http_client/client.py", line 252, in http_request
return Response(self._make_request(opener, request, timeout=timeout))
File "/usr/local/lib/python3.6/site-packages/python_http_client/client.py", line 176, in _make_request
raise ex
python_http_client.exceptions.UnauthorizedError: HTTP Error 401: Unauthorized
Notice that the Python version actually changes (to 3.6 from the original 2.7), so changing the Python version does do something, but then it gets stuck.
Any help on what I'm missing here would be awesome!
It seems like the example uses an old version of the Airflow MLEngineTrainingOperator.
The latest version implements the runtime_version/python_version training params.
Use the current version:
mlengine_operator.py
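For illustration, with the current operator you can pass the versions directly instead of patching the request dict by hand (a sketch; the project, bucket, and module names are placeholders):

from airflow.contrib.operators.mlengine_operator import MLEngineTrainingOperator

train_task = MLEngineTrainingOperator(
    task_id='submit_training',
    project_id='my-project',  # placeholder
    job_id='recommendation_training_{{ ds_nodash }}',
    package_uris=['gs://my-bucket/code/trainer-0.1.tar.gz'],  # placeholder
    training_python_module='trainer.task',  # placeholder
    training_args=[],
    region='us-central1',
    scale_tier='BASIC',
    runtime_version='1.12',
    python_version='3.5',
    dag=dag,
)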
Generating an access token for the Python Instagram API requires running this file and then entering a Client ID, Client Secret, Redirect URI, and Scope. The console then outputs a URL to follow to authorize the app and asks for the code generated afterwards. Theoretically after this process it should return an access token.
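For context, the flow described above boils down to roughly the following (a sketch based on my reading of the python-instagram client; the client_id, client_secret, redirect_uri, and code values are placeholders):

from instagram.client import InstagramAPI

# Values that get_access_token.py prompts for (placeholders here).
api = InstagramAPI(client_id="CLIENT_ID",
                   client_secret="CLIENT_SECRET",
                   redirect_uri="http://localhost:8515/oauth_callback")

# Visit this URL in a browser, authorize the app, and copy the code from the redirect.
print(api.get_authorize_login_url(scope=["basic"]))

code = "CODE_FROM_REDIRECT"  # placeholder
access_token = api.exchange_code_for_access_token(code)
print(access_token)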
Instead, it's throwing an error:
Traceback (most recent call last):
File "get_access_token.py", line 40, in <module>
access_token = api.exchange_code_for_access_token(code)
File "C:\Users\Daniel Leybzon\Anaconda2\lib\site-packages\instagram\oauth2.py", line 48, in exchange_code_for_access_token
return req.exchange_for_access_token(code=code)
File "C:\Users\Daniel Leybzon\Anaconda2\lib\site-packages\instagram\oauth2.py", line 115, in exchange_for_access_token
raise OAuth2AuthExchangeError(parsed_content.get("error_message", ""))
instagram.oauth2.OAuth2AuthExchangeError: You must provide a client_id
Screenshot provided for context: