I am trying to get authentication working for the Google Translation API. Currently, on my local machine, I simply do this:
import os
from google.cloud import translate_v2 as translate

os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = path_to_cred_json_file
translate_client = translate.Client()
which works fine. However, I wish to do this on AWS, where I have stored the credential JSON file in AWS Secrets Manager. In the documentation for translate.Client I see this:
Init signature:
translate.Client(
    target_language='en',
    credentials=None,
    ...
)
...
:type credentials: :class:`~google.auth.credentials.Credentials`
However, if I read in the JSON file and try to pass it in as the credentials argument, it throws an error.
The only answer I have for now on AWS is to read the secret, write it out as a JSON file, and then set os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = path_to_cred_json_file, which works, but a data engineer told me this is a security risk.
So the question is: how do I get this google.auth.credentials.Credentials object without reading a physical file? I have access to the plain-text version of the JSON file in memory (via AWS Secrets Manager). I'm really new to AWS in general, so go easy on me.
Thanks to @miles-budnek and this GitHub comment, I found the answer.
Supposing I have the JSON string as a dictionary called secret:
from google.cloud import translate_v2 as translate
from google.oauth2 import service_account

# Build the Credentials object directly from the in-memory dict; no key file on disk
credentials = service_account.Credentials.from_service_account_info(secret)
t_client = translate.Client(credentials=credentials)
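For completeness, a minimal sketch of how the secret dict itself could be pulled from AWS Secrets Manager without touching disk; the secret name google-translate-sa-key is a hypothetical placeholder, and this assumes the secret's value is the raw service-account JSON:

import json
import boto3

# Fetch the secret's string value and parse it into the dict used above
sm = boto3.client("secretsmanager")
secret = json.loads(sm.get_secret_value(SecretId="google-translate-sa-key")["SecretString"])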
I am using Python and an Azure Function App to send a document to be translated using the Google Cloud Translation API.
I am trying to load the credentials from a temporary JSON file using the code below. The idea is to later download the JSON file from blob storage and store it in a temp file, but I am not thinking about the blob storage for now.
import os, json, tempfile
from google.cloud import translate

key = {cred info}
f = tempfile.NamedTemporaryFile(suffix='.json', mode='a+')
json.dump(key, f)
f.flush()
f.seek(0)
os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = f.name
client = translate.TranslationServiceClient()
But when I run this I get the following error:
Exception: PermissionError: [Errno 13] Permission denied:
How can I correctly load the creds from a temp file? Also, what is the relationship between translate.TranslationServiceClient() and os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = f.name? Does TranslationServiceClient() get the creds from the environment variable?
I have been looking at this problem for a while now and I cannot find a good solution. Any help would be amazing!
EDIT:
When I change it to
os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = f.read()
I get a different error:
System.Private.CoreLib: Exception while executing function:
Functions.Trigger. System.Private.CoreLib: Result: Failure
Exception: DefaultCredentialsError:
EDIT 2:
It's really weird, but it works when I read the file just before, like so:
contents = f.read()
os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = f.name
client = translate.TranslationServiceClient()
Any ideas why?
Any application that connects to any GCP product requires credentials to authenticate, and there are several ways this authentication can work.
According to the Google documentation:
Additionally, we recommend you use Google Cloud Client Libraries for your application. Google Cloud Client Libraries use a library called Application Default Credentials (ADC) to automatically find your service account credentials. ADC looks for service account credentials in the following order:
1. If the environment variable GOOGLE_APPLICATION_CREDENTIALS is set, ADC uses the service account key or configuration file that the variable points to.
2. If the environment variable GOOGLE_APPLICATION_CREDENTIALS isn't set, ADC uses the service account that is attached to the resource that is running your code. This service account might be a default service account provided by Compute Engine, Google Kubernetes Engine, App Engine, Cloud Run, or Cloud Functions. It might also be a user-managed service account that you created.
3. If ADC can't use any of the above credentials, an error occurs.
There are also modules provided by Google that can be used to pass the credentials directly in code. If you already have the JSON value as a dictionary, you can simply pass it to from_service_account_info(key).
Example:
import json
from google.cloud import translate

key = json.load(open("JSON File Path"))  # loading my JSON file into a dictionary
client = translate.TranslationServiceClient.from_service_account_info(key)
Note that from_service_account_info is a classmethod, so it is called on the class itself rather than on an instance.
In your case, you already have the key as a dictionary.
As for the error you are getting, I believe it has something to do with the temp file, because GOOGLE_APPLICATION_CREDENTIALS needs a path to a fully written, readable JSON file.
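A minimal sketch of one way to sidestep the temp-file issue, assuming the credentials dict is called key: create the file with delete=False, close it so the JSON is fully flushed to disk, and only then point the environment variable at it:

import os, json, tempfile
from google.cloud import translate

# Write the credentials to a temp file that survives closing
f = tempfile.NamedTemporaryFile(suffix='.json', mode='w', delete=False)
json.dump(key, f)
f.close()  # close before the client library tries to read the file

os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = f.name
client = translate.TranslationServiceClient()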
I'm having a problem uploading a file to an Azure Blob Storage container, using azure.storage.blob for Python 2.7. (I know I should use a newer Python, but it's part of a big ROS application, hence not so easy to upgrade it all.)
from azure.storage.blob import BlobServiceClient
...
container_name = "operationinput"
self.back_up_root = "~/backup/sql/lp/"
self.back_up_root = os.path.expanduser(self.back_up_root)
file = 'test.sql'
try:
    client = BlobServiceClient.from_connection_string(conn_str=connection_string)
    blob = client.get_blob_client(container='container_name', blob='datafile')
except Exception as err:
    print(str(err))
with open(self.back_up_root + file, "rb") as data:
    blob.upload_blob(data)
I get the following error:
azure.core.exceptions.HttpResponseError: The specifed resource name contains invalid characters.
RequestId:3fcb6c26-101e-007e-596d-1c7d61000000
Time:2022-02-07T21:58:17.1308670Z
ErrorCode:InvalidResourceName
All the posts I have found refer to people using capital letters or similar, but I have:
operationinput
datafile
Both should be within specification.
Any ideas?
We tried the sample code below to upload files to an Azure Blob Storage container using a SAS token, and were able to achieve it successfully.
Code sample:
from azure.storage.blob import BlobClient

upload_file_path = "C:\\Users\\Desktop\\filename"
sas_url = "https://xxx.blob.core.windows.net/<container>/<blob>?sastoken"
client = BlobClient.from_blob_url(sas_url)
with open(upload_file_path, 'rb') as data:
    client.upload_blob(data)
print("**file uploaded**")
To generate the SAS URL and connection string, we selected the appropriate options in the Azure portal.
For more information, please refer to this Microsoft documentation: Allow or disallow public read access for a storage account.
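As an aside, the snippet in the original question passes the literal string 'container_name' to get_blob_client rather than the container_name variable; since underscores are not allowed in container names, that alone would produce InvalidResourceName. A minimal sketch of the connection-string approach with the variable used, assuming connection_string is already defined:

from azure.storage.blob import BlobServiceClient

container_name = "operationinput"  # container names allow only lowercase letters, numbers, and hyphens

client = BlobServiceClient.from_connection_string(conn_str=connection_string)
blob = client.get_blob_client(container=container_name, blob='datafile')
with open('test.sql', 'rb') as data:
    blob.upload_blob(data)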
I have the following Python 3 script:
import os, json
import googleapiclient.discovery
from google.oauth2 import service_account
from google.cloud import storage

storage_client = storage.Client.from_service_account_json('gcp-sa.json')
buckets = list(storage_client.list_buckets())
print(buckets)

compute = googleapiclient.discovery.build('compute', 'v1')

def list_instances(compute, project, zone):
    result = compute.instances().list(project=project, zone=zone).execute()
    return result['items'] if 'items' in result else None

list_instances(compute, "my-project", "my-zone")
Listing only the buckets, without the rest, works fine; that tells me that my service account (which has read access to the whole project) should work. How can I now list VMs? Using the code above, I get
raise exceptions.DefaultCredentialsError(_HELP_MESSAGE)
google.auth.exceptions.DefaultCredentialsError: Could not automatically determine credentials. Please set GOOGLE_APPLICATION_CREDENTIALS or explicitly create credentials and re-run the application. For more information, please see https://cloud.google.com/docs/authentication/getting-started
So that tells me I somehow have to pass the service account JSON to the compute client as well. How is that possible?
Thanks!!
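A minimal sketch of one way to do this, assuming the same gcp-sa.json key file as above: googleapiclient.discovery.build accepts a credentials argument, so the key can be loaded explicitly and passed in:

import googleapiclient.discovery
from google.oauth2 import service_account

# Load the service account key explicitly instead of relying on ADC
credentials = service_account.Credentials.from_service_account_file('gcp-sa.json')
compute = googleapiclient.discovery.build('compute', 'v1', credentials=credentials)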
Let it be stated that this is the first time I'm working with Google Cloud, so this is probably a noob question.
I want to use the Google Cloud Speech library to do some speech to text on an audio file. I also want to explicitly state within the code which Google Cloud service account credentials to use by giving the private key file. How do I do so?
It seems that what I want is a mix of this quickstart for speech recognition and this example of how to set credentials within the code (the explicit() part).
I tried doing so, but explicit() uses google.cloud.storage to set the client,
from google.cloud import storage
storage_client = storage.Client.from_service_account_json('service_account.json')
in order to make the API request.
Setting
client = storage.Client.from_service_account_json('service_account.json')
and then running
client.recognize(config, audio)
obviously throws an error saying that client doesn't have that attribute. My guess is that what I need is something similar, but for google.cloud.speech? I have tried looking through the documentation - am I missing something?
Almost all of the Google Client libraries follow the same design pattern for credentials. In the example below, the code loads the credentials and then creates the client using those credentials.
Link to the Speech Client
from google.oauth2 import service_account
from google.cloud import speech_v1
from google.cloud.speech_v1 import enums

SCOPES = ["https://www.googleapis.com/auth/cloud-platform"]
SERVICE_ACCOUNT_FILE = 'service-account.json'

cred = service_account.Credentials.from_service_account_file(
    SERVICE_ACCOUNT_FILE, scopes=SCOPES)

client = speech_v1.SpeechClient(credentials=cred)
OR (notice we are not specifying scopes here, which is usually OK):
client = speech_v1.SpeechClient.from_service_account_file(SERVICE_ACCOUNT_FILE)
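To round this out, a brief hypothetical usage sketch, assuming the pre-2.0 google-cloud-speech library implied by the enums import and a local 16 kHz LINEAR16 WAV file named audio.wav:

from google.cloud.speech_v1 import types

# Read the audio file and build the request objects
with open('audio.wav', 'rb') as f:
    audio = types.RecognitionAudio(content=f.read())
config = types.RecognitionConfig(
    encoding=enums.RecognitionConfig.AudioEncoding.LINEAR16,
    sample_rate_hertz=16000,
    language_code='en-US')

response = client.recognize(config, audio)
for result in response.results:
    print(result.alternatives[0].transcript)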
I'm trying to execute a SQL query on a BigQuery table. I keep getting a DefaultCredentialsError when trying to instantiate a BigQuery client object, for example by doing this:
from google.cloud import bigquery
client = bigquery.Client.from_service_account_json('service_account_key.json')
Or by doing this:
from oauth2client.service_account import ServiceAccountCredentials

key = open('service_account_key.json', 'rb').read()
credentials = ServiceAccountCredentials(
    'my_email',
    key,
    scope='https://www.googleapis.com/auth/bigquery')
client = bigquery.Client(credentials=credentials)
Could there be a problem with my .json credentials file? I created a service account key in the Cloud Console.
Any other suggestions?
You are most likely hitting a bug with the from_service_account_json method.
Instead, try the recommended way of authenticating: export the GOOGLE_APPLICATION_CREDENTIALS environment variable, as described here.
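If setting an environment variable is not an option, a minimal sketch of loading the key explicitly with the modern google.oauth2 module (rather than the legacy oauth2client), assuming the same service_account_key.json file:

from google.cloud import bigquery
from google.oauth2 import service_account

# Build credentials from the key file and pass them to the client
credentials = service_account.Credentials.from_service_account_file(
    'service_account_key.json')
client = bigquery.Client(credentials=credentials, project=credentials.project_id)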