Access Secret Manager Service Account from Cloud Run - python

Using Azure DevOps, I'm developing a Cloud Run service on PROJECT_A that needs to use a Service Account from PROJECT_B to read logs from Stackdriver.
I've successfully deployed the Cloud Run service using its gcloud commands.
gcloud run deploy [[SERVICE] --namespace=NAMESPACE] [--service-account=Service_Account@PROJECT_A.iam.gserviceaccount.com]
Since I'm storing my service account key as a secure file in Azure DevOps, I've uploaded the PROJECT_B service account to PROJECT_A's GCP Secret Manager using:
echo $(service_account_PROJECT_B.json) > SA_PROJECT_B.txt
gcloud secrets create SA_PROJECT_B --data-file=SA_PROJECT_B.txt --replication-policy=user-managed --project=PROJECT_A
I'm running into issues when accessing the Service Account stored in Secret Manager.
Locally, when I create the client, I use:
# config.py
if DEPLOY_ENVIRONMENT == "local":
    SA_PROJECT_B = os.path.join(BASE_DIR / "SA_PROJECT_B.json")
    os.environ["SA_PROJECT_B"] = str(SA_PROJECT_B)
# client.py
from google.cloud import logging
from config import SA_PROJECT_B

logging_client = logging.Client.from_service_account_json(SA_PROJECT_B)
And it works.
When I execute the code from Cloud Run, I get an error message stating that it cannot import name 'SA_PROJECT_B'.
So here is my question:
how should I reference a secret stored in Secret Manager from the code?
I've tried following this Google Cloud community tutorial, which showed me that the issue happens when I upload the secret to Secret Manager.
secrets = secretmanager.SecretManagerServiceClient()
SA_PROJECT_B = secrets.access_secret_version(
    request={"name": "projects/" + "PROJECT_B" + "/secrets/PROJECT_B/versions/1"}
).payload.data.decode("utf-8")
print(SA_PROJECT_B) returns '$(service_account_PROJECT_B.json)'
I can't understand what I'm doing wrong.
Is it something related to how I upload the service account to GCP, or to how I'm accessing Secret Manager?
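For reference, here is a minimal sketch of my own (not part of the original question) of what the Cloud Run code could look like once the secret really does contain the JSON key; it assumes google-cloud-secret-manager and google-cloud-logging are installed, and note the secret was created in PROJECT_A, not PROJECT_B:

import json

from google.cloud import logging, secretmanager
from google.oauth2 import service_account

# The secret was created in PROJECT_A, so the resource name must use PROJECT_A.
secret_name = "projects/PROJECT_A/secrets/SA_PROJECT_B/versions/1"

secrets = secretmanager.SecretManagerServiceClient()
payload = secrets.access_secret_version(
    request={"name": secret_name}
).payload.data.decode("utf-8")

# Build credentials from the JSON string instead of writing it to disk.
credentials = service_account.Credentials.from_service_account_info(json.loads(payload))
logging_client = logging.Client(credentials=credentials, project="PROJECT_B")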

Related

Flask web app on Cloud Run - google.auth.exceptions.DefaultCredentialsError:

I'm hosting a Flask web app on Cloud Run. I'm also using Secret Manager to store Service Account keys. (I previously downloaded a JSON file with the keys)
In my code, I'm accessing the payload and then using os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = payload to authenticate. When I deploy the app and try to visit the page, I get an Internal Server Error. Reviewing the logs, I see:
File "/usr/local/lib/python3.10/site-packages/google/auth/_default.py", line 121, in load_credentials_from_file
raise exceptions.DefaultCredentialsError(
google.auth.exceptions.DefaultCredentialsError: File {"
I can access the secret through gcloud just fine with: gcloud secrets versions access 1 --secret="<secret_id>" while acting as the Service Account.
Here is my Python code:
# Grabbing keys from Secret Manager
def access_secret_version():
    # Create the Secret Manager client.
    client = secretmanager.SecretManagerServiceClient()
    # Build the resource name of the secret version.
    name = "projects/{project_id}/secrets/{secret_id}/versions/1"
    # Access the secret version.
    response = client.access_secret_version(request={"name": name})
    payload = response.payload.data.decode("UTF-8")
    return payload

@app.route('/page/page_two')
def some_random_func():
    # New way
    payload = access_secret_version()  # <---- calling the payload
    os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = payload
    # Old way
    # os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "service-account-keys.json"
I'm not technically accessing a JSON file like I was before; the payload variable is storing the entire key. Is this why it's not working?
Your approach is incorrect.
When you run on a Google compute service like Cloud Run, the code runs under the identity of the compute service.
In this case, by default, Cloud Run uses the Compute Engine default service account, but it's good practice to create a Service Account for your service and specify it when you deploy to Cloud Run (see Service accounts).
This mechanism is one of the "legs" of Application Default Credentials. When your code is running on Google Cloud, you don't specify the environment variable (and you don't need to create a key); the Cloud Run service acquires its credentials from the metadata service:
import google.auth
credentials, project_id = google.auth.default()
See google.auth package
It is bad practice to define/set an environment variable within code. By their nature, environment variables should be provided by the environment. Doing this with GOOGLE_APPLICATION_CREDENTIALS means that your code always sets this value, when it should only be set when the code is running off Google Cloud.
For completeness, if you need to create Credentials from a JSON string rather than from a file containing a JSON string, you can use from_service_account_info (see google.oauth2.service_account).
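As a hedged sketch of that last point, reusing the access_secret_version() helper from the question (the JSON parsing and client wiring here are my own assumptions, not part of the original answer):

import json

from google.oauth2 import service_account

# access_secret_version() (defined in the question) returns the key file content as a string.
credentials = service_account.Credentials.from_service_account_info(
    json.loads(access_secret_version())
)
# Pass these credentials explicitly to whichever google-cloud client you build,
# e.g. SomeClient(credentials=credentials), instead of setting GOOGLE_APPLICATION_CREDENTIALS.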

Python connection to google big query using ADC

I am trying to get data from a Google BigQuery table using Python. I don't have service account access, but I have individual access to BigQuery using gcloud, and I have an application default credentials JSON file. I need to know how to make a connection to BigQuery using ADC.
code snippet:
from google.cloud import bigquery
conn=bigquery.Client()
query="select * from my_data.test1"
conn.query(query)
When I run the above code snippet, I get an error saying:
NewConnectionError: <urllib3.connection.HttpsConnection object at 0x83dh46bdu640>: Failed to establish a new connection:[Error -2] Name or Service not known
Note: the environment variable GOOGLE_APPLICATION_CREDENTIALS is not set (it's empty).
Your script works for me because I authenticated using end-user credentials from the Google Cloud SDK. Once you have the SDK installed, you can simply run:
gcloud auth application-default login
The credentials from your JSON file are not being passed to the BigQuery client, e.g.:
client = bigquery.Client(project=project, credentials=credentials)
To set that up, you can follow these steps: https://cloud.google.com/bigquery/docs/authentication/end-user-installed
Or this thread has some good details on setting the credentials environment variable: Setting GOOGLE_APPLICATION_CREDENTIALS for BigQuery Python CLI
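A minimal sketch of the ADC approach after running that command (my own example, not from the original answer; the query and project fallback are placeholders):

import google.auth
from google.cloud import bigquery

# After `gcloud auth application-default login`, ADC resolves to your user credentials.
# project_id may be None for user credentials, so pass your project explicitly if needed.
credentials, project_id = google.auth.default()
client = bigquery.Client(project=project_id or "my-project", credentials=credentials)

for row in client.query("select * from my_data.test1").result():
    print(row)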

How to get Google OAUTH Token without using Gcloud Command Line

I am currently using the following code to get the OAUTH Token
import subprocess

command = 'gcloud auth print-access-token'
result = str(subprocess.Popen(command, universal_newlines=True, shell=True,
                              stdout=subprocess.PIPE, stderr=subprocess.PIPE).communicate())
The result variable holds the OAuth token. This technique uses my currently logged-in gcloud config.
However, I am looking out for a way to get the OAUTH Token without using command line.
I am using this OAUTH Token to make CDAP calls to get the Google Dataflow Pipeline Execution Details.
I checked some Google blogs. This is the one I think I should try, but it asks to create a consent screen, and it requires a one-time activity to provide consent to the defined scopes; then it should work.
Google Document
Shall I follow the steps in the above document, or is there another way to get the OAuth token?
Is there a way to authenticate with a service account instead of a Google user account and get the OAuth token?
For an automated process, a service account is the recommended way. You can use the google-auth library for this. You can generate an access token like this:
import google.auth
from google.auth.transport import requests

# With default credentials (your user account or the Google Cloud component's service account,
# or the service account key file defined in the GOOGLE_APPLICATION_CREDENTIALS env var
# for platforms outside GCP).
credentials, project_id = google.auth.default(scopes=["https://www.googleapis.com/auth/cloud-platform"])

# With a service account key file (not recommended):
# from google.oauth2 import service_account
# credentials = service_account.Credentials.from_service_account_file(
#     'service-account.json', scopes=["https://www.googleapis.com/auth/cloud-platform"])

credentials.refresh(requests.Request())
print(credentials.token)
However, if you want to call Google Cloud APIs, I recommend you use an authorized request object.
Here is an example of a BigQuery call. You can use a service account key file to generate your credentials as in my previous example.
import google.auth
from google.auth.transport.requests import AuthorizedSession

base_url = 'https://bigquery.googleapis.com'
credentials, project_id = google.auth.default(scopes=['https://www.googleapis.com/auth/cloud-platform'])
project_id = 'MyProjectId'

authed_session = AuthorizedSession(credentials)
response = authed_session.request('GET', f'{base_url}/bigquery/v2/projects/{project_id}/jobs')
print(response.json())
EDIT
When you want to use Google APIs, a service account key file is not needed (and I recommend you not to use one), either on your computer or on a GCP component. Application Default Credentials are always sufficient.
When you are in your local environment, you must run the command gcloud auth application-default login. With this command, you register your personal account as the default credential when you run your app locally. (Of course, your user account email needs to be authorized on the component that you call.)
When you are in a GCP environment, each component has a default service account (or you can specify one when you configure your component). Thanks to the component's "identity", you can use the default credentials. (Of course, the service account email needs to be authorized on the component that you call.)
ONLY when you run an app automatically and outside GCP do you need a service account key file (for example, in a CI/CD system other than Cloud Build, or in an app deployed on another cloud provider or on premises).
Why is a service account key file not recommended? It's at least my recommendation, because this file is... a file! That's the problem. You have a way to authenticate a service account in a simple file: you have to store it securely (it's a secret and an authentication method!), you can copy it, you can send it by email, you can even commit it to a public Git repository... In addition, Google recommends rotating keys every 90 days, so it's a nightmare to rotate, trace and manage.

AWS Glue - Python Shell Jobs Secret Manager Connectivity Issues

I am using Python Shell Jobs under AWS Glue, which has boto3 and a few other libraries built in. I am facing issues trying to access Secrets Manager to get credentials for my RDS instance running MySQL; the job keeps running forever without any (error/success) message, nor does it time out.
Below is the simple code that runs fine from my local machine or from a Python 3.7 Lambda, but not in a Python Shell job under Glue:
import boto3
import base64
from botocore.exceptions import ClientError

secret_name = "secret_name"
region_name = "eu-west-1"

session = boto3.session.Session()
client = session.client(
    service_name='secretsmanager',
    region_name=region_name
)

get_secret_value_response = client.get_secret_value(SecretId=secret_name)
print(get_secret_value_response)
It would be very helpful if someone could point out whether anything additional needs to be done in Python Shell jobs under AWS Glue in order to access the Secrets Manager credentials.
Make sure the IAM role used by the Glue job has the SecretsManagerReadWrite policy,
as well as AWSGlueServiceRole and AmazonS3FullAccess.
According to the documentation:
When you create a job without any VPC configuration, Glue tries to reach Secrets Manager over the internet; if the policies allow an internet route, then it can connect to Secrets Manager.
But when a Glue job is created with a VPC configuration/connection, all requests are made from the VPC/subnet that the connection points to. If this is the case, make sure you have a Secrets Manager endpoint present in the route table of the subnet where Glue launches its resources.
https://docs.aws.amazon.com/glue/latest/dg/setup-vpc-for-glue-access.html
https://docs.aws.amazon.com/secretsmanager/latest/userguide/vpc-endpoint-overview.html
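For illustration, here is a hedged boto3 sketch of creating that Secrets Manager interface endpoint (my own example, not from the original answer; all IDs below are placeholders to be replaced with the VPC/subnet used by your Glue connection):

import boto3

ec2 = boto3.client("ec2", region_name="eu-west-1")

# Interface endpoint so the Glue job's subnet can reach Secrets Manager privately.
ec2.create_vpc_endpoint(
    VpcEndpointType="Interface",
    VpcId="vpc-0123456789abcdef0",            # placeholder
    ServiceName="com.amazonaws.eu-west-1.secretsmanager",
    SubnetIds=["subnet-0123456789abcdef0"],   # placeholder: subnet used by the Glue connection
    SecurityGroupIds=["sg-0123456789abcdef0"],  # placeholder
    PrivateDnsEnabled=True,
)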

How to list Azure vms using python?

I am trying to list Azure VMs using Python code. Could someone help me with this?
I have already tried to go through the code on the Microsoft website, but it is not clear to me.
First, you need to follow the section Register your client application with Azure AD of the official Azure document Azure REST API Reference to register an application with Azure AD in the Azure portal, in order to get the required parameters client_id and secret for authentication to list VMs.
Then you need the other required parameters, subscription_id and tenant_id.
Next, create a virtual environment in Python and install the Azure SDK for Python via pip install azure, or just install Azure Compute Management for Python via pip install azure-mgmt-compute.
Here is my sample code.
from azure.common.credentials import ServicePrincipalCredentials
from azure.mgmt.compute import ComputeManagementClient

credentials = ServicePrincipalCredentials(
    client_id='<your client id>',
    secret='<your client secret>',
    tenant='<your tenant id>'
)
subscription_id = '<your subscription id>'
client = ComputeManagementClient(credentials, subscription_id)
If you just want to list VMs by resource group, use the function list(resource_group_name, custom_headers=None, raw=False, **operation_config) as in the code below.
resource_group_name = '<your resource group name>'
vms_by_resource_group = client.virtual_machines.list(resource_group_name)
Or, if you want to list all VMs in your subscription, use the function list_all(custom_headers=None, raw=False, **operation_config) as in the code below.
all_vms = client.virtual_machines.list_all()
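For example (a small sketch of my own, assuming the client and resource_group_name defined above), you can iterate the results to print the VM names:

# Print the names of every VM visible to the service principal.
for vm in client.virtual_machines.list_all():
    print(vm.name)

# Or only the VMs in one resource group.
for vm in client.virtual_machines.list(resource_group_name):
    print(vm.name, vm.location)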
As references, there are two SO threads which I think may help for a deeper understanding: How to get list of Azure VMs (non-classic/Resource Managed) using Java API and Is it anyway to get ftpsState of azure web app (Azure Function app) using Python SDK?.
Hope it helps.
