Specify GOOGLE_APPLICATION_CREDENTIALS in Airflow - python

So I am trying to orchestrate a workflow in Airflow. One task reads from GCP Cloud Storage, which requires me to specify the Google Application Credentials.
I decided to create a new folder in the dag folder and put the JSON key there. Then I specified this in the dag.py file:
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "dags\support\keys\key.json"
Unfortunately, I am getting the error below:
google.auth.exceptions.DefaultCredentialsError: File dags\support\keys\dummy-surveillance-project-6915f229d012.json was not found
Can anyone help with how I should go about declaring the service account key?
Thank you.

You can create a connection to Google Cloud from the Airflow webserver Admin menu. In this menu you can pass the service account key file path.
For example, the Keyfile Path could be /usr/local/airflow/dags/gcp.json.
Beforehand, you need to mount your key file as a volume in your Docker container at that path.
You can also paste the key JSON content directly into the Airflow connection, in the Keyfile JSON field.
You can check the following links:
Airflow-connections
Airflow-with-google-cloud
Airflow-composer-managing-connections
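Once the connection exists, your DAG code can reference it by its connection id instead of setting GOOGLE_APPLICATION_CREDENTIALS at all. A minimal sketch, assuming the google provider package is installed; the connection id google_cloud_default, the bucket name and the prefix are placeholders:
from airflow.providers.google.cloud.hooks.gcs import GCSHook

def list_gcs_objects(**context):
    # The hook resolves credentials from the Airflow connection,
    # so no GOOGLE_APPLICATION_CREDENTIALS variable is needed.
    hook = GCSHook(gcp_conn_id="google_cloud_default")
    # "my-bucket" is a placeholder; replace it with your bucket name.
    return hook.list(bucket_name="my-bucket", prefix="data/")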

If you are trying to download data from Google Cloud Storage using Airflow, you should use the GCSToLocalFilesystemOperator described here. It is already provided as part of the Google provider package for Airflow (if you have it installed), so you don't have to write the code yourself using the Python operator.
Also, if you use this operator you can enter the GCP credentials on the connections screen (where they should be). This is a better approach than putting your credentials in a folder with your DAGs, since that could lead to your credentials being committed into your version control system, which could lead to security issues.
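A minimal sketch of a task using this operator, assuming the google provider package is installed; the bucket, object path, local filename and connection id are placeholders:
from airflow.providers.google.cloud.transfers.gcs_to_local import GCSToLocalFilesystemOperator

download_file = GCSToLocalFilesystemOperator(
    task_id="download_from_gcs",
    bucket="my-bucket",                 # placeholder bucket name
    object_name="data/input.csv",       # placeholder object path
    filename="/tmp/input.csv",          # local destination on the worker
    gcp_conn_id="google_cloud_default", # connection holding your credentials
)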

Related

Google Storage not using service account even with environment variable properly set

I was trying to save two files to GCP Storage using the following commands in a Jupyter Notebook:
!gsutil cp ./dist/my_custom_code-0.1.tar.gz gs://$BUCKET_NAME/custom_prediction_routine_tutorial/my_custom_code-0.1.tar.gz
!gsutil cp model.h5 preprocessor.pkl gs://$BUCKET_NAME/custom_prediction_routine_tutorial/model/
The bucket has been created properly, since I can see it in the bucket list on GCP. Also, in the Permissions for the bucket, I can see the service account I created. Plus, I made sure the environment variable is set by running:
export GOOGLE_APPLICATION_CREDENTIALS="/home/george/Documents/Credentials/prediction-routine-new-b7a445077e61.json"
This can be verified by running this in Python:
import os
print('Credentials from environ: {}'.format(os.environ.get('GOOGLE_APPLICATION_CREDENTIALS')))
which shows:
Credentials from environ: /home/george/Documents/Credentials/prediction-routine-new-b7a445077e61.json
And I do have the json file stored at the specified location. However, when I tried to save files using the commands shown at the top, I kept getting this error message:
AccessDeniedException: 403 george***@gmail.com does not have storage.objects.list access to the Google Cloud Storage bucket.
Copying file://model.h5 [Content-Type=application/octet-stream]...
AccessDeniedException: 403 george***@gmail.com does not have storage.objects.create access to the Google Cloud Storage object.
So the question is, how come Google Storage is not using my service account and keeps using my user account?
UPDATE
After activating the service account for the project as pointed out by @Hao Z, GCP is using my service account now. However, I do have the permissions set for this service account...
UPDATE 2
This seems to be a known issue: https://github.com/GoogleCloudPlatform/gsutil/issues/546
Check How to use Service Accounts with gsutil, for uploading to CS + BigQuery
Relevant bit:
Download service account key file, and put it in e.g. /etc/backup-account.json
gcloud auth activate-service-account --key-file /etc/backup-account.json
Or you can do gsutil -i to impersonate a service account. Use 'gsutil help creds' for more info. I guess the env variable is just used by the Python SDK and not by the CLI.
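As a quick way to see which credentials the Python SDK actually resolves (a sketch, assuming the google-auth package that ships with the Google client libraries), you can inspect the Application Default Credentials, which do honour GOOGLE_APPLICATION_CREDENTIALS:
import google.auth

# Application Default Credentials pick up GOOGLE_APPLICATION_CREDENTIALS
# when it is set; gcloud/gsutil maintain their own credential store.
credentials, project = google.auth.default()
print(project)
print(getattr(credentials, "service_account_email", "not a service-account credential"))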
I was able to resolve this in the following steps:
First, using the approach suggested by @Hao Z above, I was able to activate the service account in the Jupyter Notebook using:
!gcloud auth activate-service-account \
prediction-routine-new@prediction-routine-test.iam.gserviceaccount.com \
--key-file=/home/george/Documents/Credentials/prediction-routine-new-b7a445077e61.json \
--project=prediction-routine-test
Second, I changed the bucket name used after realizing that I was using the wrong name - it should be "prediction-routine" instead of "prediction-routine-bucket".
BUCKET_NAME="prediction-routine"
Third, I changed the role from "Storage Object Admin" to "Storage Admin" for the service account's permissions.
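If you would rather not depend on gsutil's credential state at all, a minimal sketch of the same upload with the google-cloud-storage Python client, reusing the key file path and bucket name from the question (the object path is a placeholder), would be:
from google.cloud import storage

# Build the client directly from the service account key file,
# independently of gcloud/gsutil authentication state.
client = storage.Client.from_service_account_json(
    "/home/george/Documents/Credentials/prediction-routine-new-b7a445077e61.json")

bucket = client.bucket("prediction-routine")
blob = bucket.blob("custom_prediction_routine_tutorial/model/model.h5")
blob.upload_from_filename("model.h5")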

Unable to switch gcloud platform account using python script

Please could someone help me with a query related to permissions on the Google Cloud Platform? I realise that this is only loosely programming related, so I apologise if this is the wrong forum!
I have a project ("ProjectA") written in Python that uses Google's Cloud Storage and Compute Engine. The project has various buckets that are accessed using Python code from both compute instances and from my home computer. This project uses a service account which is a Project "owner"; I believe it has all APIs enabled, and the project works really well. The service account name is "master@projectA.iam.gserviceaccount.com".
Recently I started a new project that needs similar resources (storage, compute, etc.), but I want to keep it separate. The new project is called "ProjectB" and I set up a new master service account called master@projectB.iam.gserviceaccount.com. My code in ProjectB generates an error related to access permissions, which is reproduced even if I strip the code down to these few lines:
The code from ProjectA looked like this:
from google.cloud import storage
client = storage.Client(project='projectA')
mybucket = storage.bucket.Bucket(client=client, name='projectA-bucket-name')
currentblob = mybucket.get_blob('somefile.txt')
The code from ProjectB looks like this:
from google.cloud import storage
client = storage.Client(project='projectB')
mybucket = storage.bucket.Bucket(client=client, name='projectB-bucket-name')
currentblob = mybucket.get_blob('somefile.txt')
Both buckets definitely exist, and obviously if "somefile.txt" does not exist then currentblob is None, which is fine, but when I execute this code I receive the following error:
Traceback (most recent call last):
  File "... .py", line 6, in <module>
    currentblob = mybucket.get_blob('somefile.txt')
  File "C:\Python27\lib\site-packages\google\cloud\storage\bucket.py", line 599, in get_blob
    _target_object=blob,
  File "C:\Python27\lib\site-packages\google\cloud\_http.py", line 319, in api_request
    raise exceptions.from_http_response(response)
google.api_core.exceptions.Forbidden: 403 GET https://www.googleapis.com/storage/v1/b/<ProjectB-bucket>/o/somefile.txt: master@ProjectA.iam.gserviceaccount.com does not have storage.objects.get access to projectB/somefile.txt.
Notice how the error message says the "ProjectA" service account doesn't have ProjectB access - well, I would somewhat expect that, but I was expecting to use the service account on ProjectB!
I have read the documentation and links such as this and this, but even after removing and reinstating the service account or giving it limited scopes, nothing has helped. I have tried a few things:
1) Make sure that my new service account was "activated" on my local machine (where the code is being run for now):
gcloud auth activate-service-account master@projectB.iam.gserviceaccount.com --key-file="C:\my-path-to-file\123456789.json"
This appears to be successful.
2) Verify the list of credentialled accounts:
gcloud auth list
This lists two accounts: one is my email address (the one I use for gmail, etc.), and the other is master@projectB.iam.gserviceaccount.com, so it appears that my account is "registered" properly.
3) Set the service account as the active account:
gcloud config set account master@projectB.iam.gserviceaccount.com
When I look at the auth list again, there is an asterisk "*" next to the service account, so presumably this is good.
4) Check that the project is set to ProjectB:
gcloud config set project projectB
This also appears to be ok.
It's strange that when I run the Python code, it is "using" the service account from my old project even though I have changed seemingly everything to refer to ProjectB - I've activated the account, selected it, etc.
Please could someone point me in the direction of something that I might have missed? I don't recall going through this much pain when setting up my original project, and I'm finding it incredibly frustrating that something I thought would be simple is proving so difficult.
Thank you to anyone who can offer me any assistance.
I'm not entirely sure, but this answer is from a similar question on here:
Permission to Google Cloud Storage via service account in Python
Specify the account explicitly by pointing to the credentials in your code, as documented here:
Example from the documentation page:
def explicit():
    from google.cloud import storage

    # Explicitly use service account credentials by specifying the private
    # key file.
    storage_client = storage.Client.from_service_account_json(
        'service_account.json')

    # Make an authenticated API request
    buckets = list(storage_client.list_buckets())
    print(buckets)
Don't you have a GOOGLE_APPLICATION_CREDENTIALS environment variable configured that points to Project A's SA?
The default behavior of the Google SDK is to take the service account from the GOOGLE_APPLICATION_CREDENTIALS environment variable.
If you want to change the account you can do something like:
import os
from google.cloud import storage

credentials_json_file = os.environ.get('env_var_with_path_to_account_json')
client = storage.Client.from_service_account_json(credentials_json_file)
The above assumes you have created a JSON service account key file as described in: https://cloud.google.com/iam/docs/creating-managing-service-account-keys
and that the path to that key file is stored in the environment variable env_var_with_path_to_account_json.
This way you can have 2 account files and decide which one to use.

Allow Google Cloud Compute Engine Instance to write file to Google Storage Bucket - Python

My Python server script, which runs on a Google Cloud VM instance, tries to save an image (JPEG) to storage, but it throws the following error.
File "/home/thamindudj_16/server/object_detection/object_detector.py",
line 109, in detect Hand
new_img.save("slicedhand/{}#sliced_image{}.jpeg".format(threadname,
i)) File
"/home/thamindudj_16/.local/lib/python3.5/site-packages/PIL/Image.py",
line 2004, in save
fp = builtins.open(filename, "w+b")
OSError: [Errno 5] Input/output error: 'slicedhand/thread_1#sliced_image0.jpeg'
All the files, including the Python scripts, are in a Google Storage bucket that is mounted to the VM instance using gcsfuse. The app tries to save the new image in the slicedhand folder.
Python code snippet where image saving happen.
from PIL import Image
...
...
i = 0
new_img = Image.fromarray(bounding_box_img) ## conversion to an image
new_img.save("slicedhand/{}#sliced_image{}.jpeg".format(threadname, i))
I think the problem may be with access permissions. The docs say to use --key_file, but what key file should I use and where can I find it? I'm not clear whether this is the problem or something else.
Any help would be appreciated.
I understand that you are using gcsfuse on your Linux VM Instance to access Google Cloud Storage.
The key file is a Service Account credentials key that allows you to initialize the Cloud SDK or a Client Library as another Service Account. You can download a key file from the Cloud Console. However, if you are using a VM Instance, you are automatically using the Compute Engine default Service Account. You can check this with the console command: $ gcloud init.
To configure your credentials properly, please follow the documentation.
The Compute Engine default Service Account needs to have the Access Scope Storage > Full enabled. An Access Scope is the mechanism that limits the access level to Cloud APIs. This can be set during machine creation or while the VM Instance is stopped.
Please note that Access Scopes are defined explicitly for the Service Account that you select for the VM Instance.
Cloud Storage object names have naming requirements. It is strongly recommended to avoid using the hash symbol "#" in object names.
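As an alternative to writing through the gcsfuse mount, a minimal sketch that uploads the image directly with the Cloud Storage client library could look like this; the bucket name is a placeholder and the "#" has been removed from the object name:
import io
from google.cloud import storage
from PIL import Image

def upload_sliced_image(bounding_box_img, threadname, i):
    new_img = Image.fromarray(bounding_box_img)

    # Serialize the image in memory instead of writing through gcsfuse.
    buf = io.BytesIO()
    new_img.save(buf, format="JPEG")
    buf.seek(0)

    # On a Compute Engine VM this uses the default service account,
    # provided the instance has the Storage access scope enabled.
    client = storage.Client()
    bucket = client.bucket("my-bucket")  # placeholder bucket name
    blob = bucket.blob("slicedhand/{}_sliced_image{}.jpeg".format(threadname, i))
    blob.upload_from_file(buf, content_type="image/jpeg")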

Secrets in a google cloud bucket

We want to have a production Airflow environment, but we do not know how to deal properly with secrets, in particular Google BigQuery client JSON key files.
We tried setting up Kubernetes secrets on the Kubernetes cluster that is created automatically when you create a Google Cloud Composer (Airflow) environment. We currently just put the files in the bucket, but we would like a better way.
from os.path import join
from google.cloud import bigquery as bq

def get_bq_client():
    """Returns a BigQuery client."""
    return bq.Client.from_service_account_json(
        join("volumes", "bigquery.json")
    )
We would like some form of proper management of the required secrets. Sadly, using Airflow Variables won't work, because we can't create the client object using the JSON file as text.
One solution that would work is to encrypt the JSON files and put the encrypted copies in the bucket. As long as the decryption key exists in the bucket and nowhere else, you can check the code, together with the encrypted secrets, into source control, and then check it out and decrypt it where the bucket is available.
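A minimal sketch of that approach, assuming the JSON key has been encrypted with Fernet from the cryptography package; both file paths shown are placeholders:
import tempfile
from cryptography.fernet import Fernet
from google.cloud import bigquery as bq

def get_bq_client():
    # The Fernet key lives only in the bucket; the encrypted JSON can be
    # checked into source control alongside the DAGs.
    with open("volumes/decryption.key", "rb") as f:      # placeholder path
        fernet = Fernet(f.read())
    with open("volumes/bigquery.json.enc", "rb") as f:   # placeholder path
        key_json = fernet.decrypt(f.read())

    # Write the decrypted key to a temporary file and build the client from it.
    with tempfile.NamedTemporaryFile(suffix=".json") as tmp:
        tmp.write(key_json)
        tmp.flush()
        return bq.Client.from_service_account_json(tmp.name)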

how to provide credentials in apache beam python programmatically?

We are using Apache Beam through Airflow. The default GCS account is set with the environment variable GOOGLE_APPLICATION_CREDENTIALS. We don't want to change the environment variable, as it might affect other processes running at the time. I couldn't find a way to change the Google Cloud Dataflow service account programmatically.
We are creating the pipeline in the following way:
p = beam.Pipeline(argv=self.conf)
Is there any option through argv or options where I can specify the location of the GCS credentials file?
I searched through the documentation but didn't find much information.
You can specify a service account when you launch the job with a basic flag:
--serviceAccount=my-service-account-name@my-project.iam.gserviceaccount.com
That account will need the Dataflow Worker role attached, plus whatever else you need (GCS/BQ/etc.). Details here. You don't need the SA key to be stored in GCS, or keys stored locally, to use it.
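In the Beam Python SDK, the corresponding pipeline option is service_account_email. A minimal sketch with placeholder project, region, bucket, and service account values (the credentials used to launch the job still come from your environment; the Dataflow workers then run as the specified service account):
from apache_beam import Pipeline
from apache_beam.options.pipeline_options import PipelineOptions, GoogleCloudOptions

options = PipelineOptions(
    runner="DataflowRunner",
    project="my-project",                # placeholder project id
    region="us-central1",                # placeholder region
    temp_location="gs://my-bucket/tmp",  # placeholder temp bucket
)
# Workers will run as this service account instead of the project default.
options.view_as(GoogleCloudOptions).service_account_email = (
    "my-service-account-name@my-project.iam.gserviceaccount.com"
)

p = Pipeline(options=options)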
