StatusCode.PERMISSION_DENIED error while publishing message to Google PubSub - python

I am trying to publish messages to Google PubSub in Python. Here is the code that I tried:
from google.cloud import pubsub

ps = pubsub.Client()               # picks up GOOGLE_APPLICATION_CREDENTIALS
topic = ps.topic("topic_name")     # reference to an existing topic
topic.publish("Message to topic")  # publish a single message
I am getting the below error:
File "/usr/local/lib/python2.7/dist-packages/google/cloud/iterator.py", line 218, in _items_iter
for page in self._page_iter(increment=False):
File "/usr/local/lib/python2.7/dist-packages/google/cloud/iterator.py", line 247, in _page_iter
page = self._next_page()
File "/usr/local/lib/python2.7/dist-packages/google/cloud/iterator.py", line 445, in _next_page
items = six.next(self._gax_page_iter)
File "/usr/local/lib/python2.7/dist-packages/google/gax/__init__.py", line 455, in next
return self.__next__()
File "/usr/local/lib/python2.7/dist-packages/google/gax/__init__.py", line 465, in __next__
response = self._func(self._request, **self._kwargs)
File "/usr/local/lib/python2.7/dist-packages/google/gax/api_callable.py", line 376, in inner
return a_func(*args, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/google/gax/retry.py", line 127, in inner
' classified as transient', exception)
google.gax.errors.RetryError: GaxError(Exception occurred in retry method that was not classified as transient, caused by <_Rendezvous of RPC that terminated with (StatusCode.PERMISSION_DENIED, User not authorized to perform this action.)>)
I've downloaded service-account.json, and its path is set in GOOGLE_APPLICATION_CREDENTIALS.
I've also tried installing gcloud and executing gcloud auth application-default login.
Please note that I am able to publish messages using the gcloud command and in Java:
$ gcloud beta pubsub topics publish sample "hello"
messageIds: '127284267552464'
Java code
TopicName topicName = TopicName.create(SRC_PROJECT, SRC_TOPIC);
Publisher publisher = Publisher.defaultBuilder(topicName).build();
ByteString data1 = ByteString.copyFromUtf8("hello");
PubsubMessage pubsubMessage1 = PubsubMessage.newBuilder().setData(data1).build();
publisher.publish(pubsubMessage1);
What is missing in the Python code?
I followed the steps described over here.

This was an issue with my setup: my service-account.json did not have enough permissions to publish messages to PubSub.
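For anyone else hitting this: the fix is to grant the service account a role that includes the pubsub.topics.publish permission. A sketch with the gcloud CLI, using hypothetical project and service-account names:

gcloud projects add-iam-policy-binding my-project \
    --member serviceAccount:my-sa@my-project.iam.gserviceaccount.com \
    --role roles/pubsub.publisher

After the binding takes effect, the Python snippet above should publish successfully.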

Related

Sendgrid HTTP: 400 error using Cloud Composer

I'm trying to set up an Airflow DAG that is able to send emails through the EmailOperator in Composer 2, Airflow 2.3.4. I've followed this guide. I tried running the example DAG that is provided in the guide, but I get an HTTP 400 error.
The log looks like this:
[2023-01-20, 10:46:45 UTC] {taskinstance.py:1904} ERROR - Task failed with exception
Traceback (most recent call last):
File "/opt/python3.8/lib/python3.8/site-packages/airflow/operators/email.py", line 75, in execute
send_email(
File "/opt/python3.8/lib/python3.8/site-packages/airflow/utils/email.py", line 58, in send_email
return backend(
File "/opt/python3.8/lib/python3.8/site-packages/airflow/providers/sendgrid/utils/emailer.py", line 123, in send_email
_post_sendgrid_mail(mail.get(), conn_id)
File "/opt/python3.8/lib/python3.8/site-packages/airflow/providers/sendgrid/utils/emailer.py", line 142, in _post_sendgrid_mail
response = sendgrid_client.client.mail.send.post(request_body=mail_data)
File "/opt/python3.8/lib/python3.8/site-packages/python_http_client/client.py", line 277, in http_request
self._make_request(opener, request, timeout=timeout)
File "/opt/python3.8/lib/python3.8/site-packages/python_http_client/client.py", line 184, in _make_request
raise exc
python_http_client.exceptions.BadRequestsError: HTTP Error 400: Bad Request
I've looked at similar threads on Stack Overflow, but none of those suggestions worked for me.
I have set up and verified the from email address in Sendgrid, and it uses a whole email address including the domain.
I also set this email address up in Secret Manager (as well as the API key).
I haven't changed the test DAG from the guide, except for the 'to' address.
In another DAG I've tried enabling 'email_on_retry', and that also didn't trigger any mail.
I'm at a loss here; can someone give me suggestions on things to try?
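For reference, a minimal sketch of the kind of test DAG the guide describes (the DAG id and 'to' address here are placeholders, not the guide's exact values):

from datetime import datetime

from airflow import DAG
from airflow.operators.email import EmailOperator

with DAG(
    dag_id="composer_sendgrid_test",     # placeholder name
    start_date=datetime(2023, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    EmailOperator(
        task_id="send_email",
        to="recipient@example.com",      # replace with your 'to' address
        subject="EmailOperator test",
        html_content="Test email from Cloud Composer.",
    )

A 400 from SendGrid's mail send endpoint generally points at the request body, so it is worth double-checking that the configured from address exactly matches a sender you verified in SendGrid.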

403 Request had insufficient authentication issues while accessing Secrets on GCP within a container

I am trying to access a secret in GCP Secret Manager and I get the following error:
in get_total_results
"api_key": get_credentials("somekey").get("somekey within key"),
File "/helper.py", line 153, in get_credentials
response = client.access_secret_version(request={"name": resource_name})
File "/usr/local/lib/python3.8/site-packages/google/cloud/secretmanager_v1/services/secret_manager_service/client.py", line 1136, in access_secret_version
response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
File "/usr/local/lib/python3.8/site-packages/google/api_core/gapic_v1/method.py", line 145, in __call__
return wrapped_func(*args, **kwargs)
File "/usr/local/lib/python3.8/site-packages/google/api_core/retry.py", line 285, in retry_wrapped_func
return retry_target(
File "/usr/local/lib/python3.8/site-packages/google/api_core/retry.py", line 188, in retry_target
return target()
File "/usr/local/lib/python3.8/site-packages/google/api_core/grpc_helpers.py", line 69, in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)
File "<string>", line 3, in raise_from
google.api_core.exceptions.PermissionDenied: 403 Request had insufficient authentication scopes.
The code is fairly simple:
import json
import os

from google.cloud import secretmanager


def get_credentials(secret_id):
    project_id = os.environ.get("PROJECT_ID")
    resource_name = f"projects/{project_id}/secrets/{secret_id}/versions/1"
    client = secretmanager.SecretManagerServiceClient()
    response = client.access_secret_version(request={"name": resource_name})
    secret_string = response.payload.data.decode("UTF-8")
    secret_dict = json.loads(secret_string)
    return secret_dict
So, what I have is a Cloud Function, which is deployed using triggers and uses a service account that has the Owner role.
The Cloud Function triggers a Kubernetes Job and creates a container, which downloads a repo inside the container and executes it.
Dockerfile is:
FROM gcr.io/project/repo:latest
FROM python:3.8-slim-buster
COPY . /some_dir
WORKDIR /some_dir
COPY --from=0 ./repo /a_repo
RUN pip install -r requirements.txt && pip install -r a_repo/requirements.txt
ENTRYPOINT ["python3" , "main.py"]
The GCE instance might not have the correct authentication scope.
From https://developers.google.com/identity/protocols/oauth2/scopes#secretmanager, https://www.googleapis.com/auth/cloud-platform is the required scope.
When creating the GCE instance, you need to select the option that gives the instance the correct scope to call out to Cloud APIs.
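The same scope can be set from the command line when creating the instance; a sketch with a hypothetical instance name:

gcloud compute instances create my-instance \
    --scopes=https://www.googleapis.com/auth/cloud-platform

Note that scopes only cap what the instance's default credentials can ask for; the attached service account still needs an IAM role on the secret, such as roles/secretmanager.secretAccessor.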

Firestore client in python (as user) using firebase_admin or google.cloud.firestore

I am building a python client-side application that uses Firestore. I have successfully used Google Identity Platform to sign up and sign in to the Firebase project, and created a working Firestore client using google.cloud.firestore.Client which is authenticated as a user:
import json

import google.oauth2.credentials
import requests
from google.cloud import firestore
from requests.exceptions import HTTPError  # was missing from the original snippet

request_url = f"https://identitytoolkit.googleapis.com/v1/accounts:signInWithPassword?key={self.__api_key}"
headers = {"Content-Type": "application/json; charset=UTF-8"}
data = json.dumps({"email": self.__email, "password": self.__password, "returnSecureToken": True})
response = requests.post(request_url, headers=headers, data=data)
try:
    response.raise_for_status()
except HTTPError:
    content = response.json()
    error = f"error: {content['error']['message']}"
    raise AuthError(error)  # AuthError is the app's own exception type
json_response = response.json()
self.__token = json_response["idToken"]
self.__refresh_token = json_response["refreshToken"]
credentials = google.oauth2.credentials.Credentials(
    self.__token,
    self.__refresh_token,
    client_id="",
    client_secret="",
    token_uri=f"https://securetoken.googleapis.com/v1/token?key={self.__api_key}",
)
self.__db = firestore.Client(self.__project_id, credentials)
I have the problem, however, that when the token has expired, I get the following error:
Traceback (most recent call last):
File "/usr/local/lib/python3.7/dist-packages/google/api_core/grpc_helpers.py", line 57, in error_remapped_callable
return callable_(*args, **kwargs)
File "/usr/local/lib/python3.7/dist-packages/grpc/_channel.py", line 826, in __call__
return _end_unary_response_blocking(state, call, False, None)
File "/usr/local/lib/python3.7/dist-packages/grpc/_channel.py", line 729, in _end_unary_response_blocking
raise _InactiveRpcError(state)
grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
status = StatusCode.UNAUTHENTICATED
details = "Missing or invalid authentication."
debug_error_string = "{"created":"#1613043524.699081937","description":"Error received from peer ipv4:172.217.16.74:443","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Missing or invalid authentication.","grpc_status":16}"
>
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/usr/lib/python3.7/threading.py", line 917, in _bootstrap_inner
self.run()
File "/home/my_app/src/controllers/im_alive.py", line 20, in run
self.__device_api.set_last_updated(utils.device_id())
File "/home/my_app/src/api/firestore/firestore_device_api.py", line 21, in set_last_updated
"lastUpdatedTime": self.__firestore.SERVER_TIMESTAMP
File "/home/my_app/src/api/firestore/firestore.py", line 100, in update
ref.update(data)
File "/usr/local/lib/python3.7/dist-packages/google/cloud/firestore_v1/document.py", line 382, in update
write_results = batch.commit()
File "/usr/local/lib/python3.7/dist-packages/google/cloud/firestore_v1/batch.py", line 147, in commit
metadata=self._client._rpc_metadata,
File "/usr/local/lib/python3.7/dist-packages/google/cloud/firestore_v1/gapic/firestore_client.py", line 1121, in commit
request, retry=retry, timeout=timeout, metadata=metadata
File "/usr/local/lib/python3.7/dist-packages/google/api_core/gapic_v1/method.py", line 145, in __call__
return wrapped_func(*args, **kwargs)
File "/usr/local/lib/python3.7/dist-packages/google/api_core/retry.py", line 286, in retry_wrapped_func
on_error=on_error,
File "/usr/local/lib/python3.7/dist-packages/google/api_core/retry.py", line 184, in retry_target
return target()
File "/usr/local/lib/python3.7/dist-packages/google/api_core/timeout.py", line 214, in func_with_timeout
return func(*args, **kwargs)
File "/usr/local/lib/python3.7/dist-packages/google/api_core/grpc_helpers.py", line 59, in error_remapped_callable
six.raise_from(exceptions.from_grpc_error(exc), exc)
File "<string>", line 3, in raise_from
google.api_core.exceptions.Unauthenticated: 401 Missing or invalid authentication.
I have tried omitting the token and only specifying the refresh token, and then calling credentials.refresh(), but the expires_in in the response from the https://securetoken.googleapis.com/v1/token endpoint is a string instead of a number (docs here), which makes _parse_expiry(response_data) in google.oauth2._client.py:257 raise an exception.
Is there any way to use the firestore.Client from either google.cloud or firebase_admin and have it automatically handle refreshing tokens, or do I need to switch to manually calling the Firestore RPC API and refreshing tokens at the correct time?
Note: There are no users interacting with the Python app, so the solution must not require user interaction.
Can't you just pass the string cast as an integer, i.e. _parse_expiry(int(float(response_data)))?
If that doesn't work, you could try to make the call and refresh the token after getting a 401 error; see my answer for the general idea of how to handle tokens.
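A rough sketch of that retry-on-401 idea, assuming a hypothetical sign_in() helper that re-runs the signInWithPassword request above and rebuilds the Firestore client:

from google.api_core.exceptions import Unauthenticated


def update_with_reauth(get_client, doc_path, data, sign_in):
    # Try the write once; if the token has expired, re-authenticate,
    # rebuild the client, and retry the write.
    try:
        get_client().document(doc_path).update(data)
    except Unauthenticated:
        sign_in()  # hypothetical helper: repeats the signInWithPassword flow
        get_client().document(doc_path).update(data)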
As mentioned by @Marco, it is recommended that you use a service account if the script runs in an environment without users. When you use a service account, you can just set the GOOGLE_APPLICATION_CREDENTIALS environment variable to the location of the service account JSON file and instantiate the Firestore client without any credentials (the credentials will be picked up automatically):
from google.cloud import firestore

client = firestore.Client()
and run it as (assuming Linux):
$ export GOOGLE_APPLICATION_CREDENTIALS=/path/to/credentials.json
$ python file.py
Still, if you really want to use user credentials for the script, you can install the Google Cloud SDK and then run:
$ gcloud auth application-default login
This will open a browser for you to select an account and log in. After logging in, it creates a "virtual" service account file corresponding to your user account (which will also be loaded automatically by the client libraries). Here too, you don't need to pass any parameters to your client.
See also: Difference between “gcloud auth application-default login” and “gcloud auth login”

Google Cloud Dataflow job throws alert after a few hours

Running a Dataflow streaming job using the 2.11.0 release.
I get the following authentication error after a few hours:
File "streaming_twitter.py", line 188, in <lambda>
File "streaming_twitter.py", line 102, in estimate
File "streaming_twitter.py", line 84, in estimate_aiplatform
File "streaming_twitter.py", line 42, in get_service
File "/usr/local/lib/python2.7/dist-packages/googleapiclient/_helpers.py", line 130, in positional_wrapper return wrapped(*args, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/googleapiclient/discovery.py", line 227, in build credentials=credentials)
File "/usr/local/lib/python2.7/dist-packages/googleapiclient/_helpers.py", line 130, in positional_wrapper return wrapped(*args, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/googleapiclient/discovery.py", line 363, in build_from_document credentials = _auth.default_credentials()
File "/usr/local/lib/python2.7/dist-packages/googleapiclient/_auth.py", line 42, in default_credentials credentials, _ = google.auth.default()
File "/usr/local/lib/python2.7/dist-packages/google/auth/_default.py", line 306, in default raise exceptions.DefaultCredentialsError(_HELP_MESSAGE) DefaultCredentialsError: Could not automatically determine credentials. Please set GOOGLE_APPLICATION_CREDENTIALS or explicitly create credentials and re-run the application.
This Dataflow job performs an API request to AI Platform prediction, and the authentication token seems to be expiring.
Code snippet:
from googleapiclient import discovery


def get_service():
    # If it hasn't been instantiated yet: do it now
    return discovery.build('ml', 'v1',
                           discoveryServiceUrl=DISCOVERY_SERVICE,
                           cache_discovery=True)
I tried adding the following lines to the service function:
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/tmp/key.json"
But I get:
DefaultCredentialsError: File "/tmp/key.json" was not found. [while running 'generatedPtransform-930']
I assume this is because the file does not exist on the Dataflow workers.
Another option is to use the developerKey param in the build method, but it doesn't seem to be supported by AI Platform prediction; I get this error:
Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project."> [while running 'generatedPtransform-22624']
I'm looking to understand how to fix this and what the best practice is.
Any suggestions?
Complete logs here
Complete code here
Setting os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = '/tmp/key.json' only works locally with the DirectRunner. Once deployed to a distributed runner like Dataflow, each worker won't be able to find the local file /tmp/key.json.
If you want each worker to use a specific service account, you can tell Beam which service account to use to identify workers.
First, grant the roles/dataflow.worker role to the service account you want your workers to use. There is no need to download the service account key file :)
Then, if you're letting PipelineOptions parse your command-line arguments, you can simply use the service_account_email option and specify it like --service_account_email=your-email@your-project.iam.gserviceaccount.com when running your pipeline.
The service account pointed to by your GOOGLE_APPLICATION_CREDENTIALS is only used to start the job; each worker uses the service account specified by service_account_email. If service_account_email is not passed, it defaults to the email from your GOOGLE_APPLICATION_CREDENTIALS file.
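For example, a sketch of the launch command, assuming the streaming_twitter.py pipeline from the question and a hypothetical worker service account:

python streaming_twitter.py \
    --runner DataflowRunner \
    --project your-project \
    --service_account_email dataflow-worker@your-project.iam.gserviceaccount.com \
    --streaming

Workers then fetch short-lived tokens for that account from the metadata server, so credentials are refreshed automatically and no key file needs to exist at /tmp/key.json.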

Cannot authenticate with GCS in Python

I am following the example in https://developers.google.com/storage/docs/gspythonlibrary#credentials
I created a client/secret pair by choosing "create new client id", "installed application", and "other" in the dev console.
I have the following code in my python script:
import boto
from gcs_oauth2_boto_plugin.oauth2_helper import SetFallbackClientIdAndSecret

CLIENT_ID = 'my_client_id'
CLIENT_SECRET = 'xxxfoo'
proj_id = 'my-project-id'  # placeholder: the project id (not defined in the original snippet)

SetFallbackClientIdAndSecret(CLIENT_ID, CLIENT_SECRET)
uri = boto.storage_uri('foobartest2014', 'gs')
header_values = {"x-goog-project-id": proj_id}
uri.create_bucket(headers=header_values)
and it fails with the following error:
File "/usr/local/lib/python2.7/dist-packages/boto/storage_uri.py", line 555, in create_bucket
conn = self.connect()
File "/usr/local/lib/python2.7/dist-packages/boto/storage_uri.py", line 140, in connect
**connection_args)
File "/usr/local/lib/python2.7/dist-packages/boto/gs/connection.py", line 47, in __init__
suppress_consec_slashes=suppress_consec_slashes)
File "/usr/local/lib/python2.7/dist-packages/boto/s3/connection.py", line 190, in __init__
validate_certs=validate_certs, profile_name=profile_name)
File "/usr/local/lib/python2.7/dist-packages/boto/connection.py", line 572, in __init__
host, config, self.provider, self._required_auth_capability())
File "/usr/local/lib/python2.7/dist-packages/boto/auth.py", line 883, in get_auth_handler
'Check your credentials' % (len(names), str(names)))
boto.exception.NoAuthHandlerFound: No handler was ready to authenticate. 3 handlers were checked. ['OAuth2Auth', 'OAuth2ServiceAccountAuth', 'HmacAuthV1Handler'] Check your credentials
I have been struggling with this for the last couple of days; it turns out that boto and gspythonlibrary are totally obsolete.
The latest example code showing how to use/authenticate Google Cloud Storage is here:
https://github.com/GoogleCloudPlatform/python-docs-samples/tree/master/storage/api
You need to provide a client/secret pair in a .boto file and then run gsutil config.
It will create a refresh token, and then it should work!
For more info, see https://developers.google.com/storage/docs/gspythonlibrary#credentials
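For anyone starting today, a minimal sketch with the current google-cloud-storage client (assuming application-default credentials are configured and a hypothetical project id):

from google.cloud import storage

client = storage.Client(project="my-project-id")  # hypothetical project id
bucket = client.create_bucket("foobartest2014")
print("Created bucket:", bucket.name)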
You can also build a console application that passes gsutil commands (gsutil cp, gsutil rm, gsutil config -a) through to the Cloud SDK for authentication and execution.
