After reading the official Google Translate API documentation, I found that it provides the following sample code:
from google.cloud import translate

def translate_text(text="Hello, world!", project_id="weighty-site-333613"):
    client = translate.TranslationServiceClient().from_service_account_json('key.json')
    location = "global"
    parent = f"projects/{project_id}/locations/{location}"
    response = client.translate_text(
        request={
            "parent": parent,
            "contents": [text],
            "mime_type": "text/plain",
            "source_language_code": "en-US",
            "target_language_code": "zh-CN",
        }
    )
    for translation in response.translations:
        print("Translated text: {}".format(translation.translated_text))

translate_text()
This code worked properly in the Google Cloud Shell.
However, even though I put the "key.json" file in the same folder, the following error is shown:
/usr/local/bin/python3.6 /Users/jiajunmao/Documents/GitHub/translator_of_excel/google_trans.py
Traceback (most recent call last):
  File "/Users/jiajunmao/Documents/GitHub/translator_of_excel/google_trans.py", line 37, in <module>
    translate_text()
  File "/Users/jiajunmao/Documents/GitHub/translator_of_excel/google_trans.py", line 22, in translate_text
    client = translate.TranslationServiceClient().from_service_account_json('key.json')
  File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/google/cloud/translate_v3/services/translation_service/client.py", line 354, in __init__
    always_use_jwt_access=True,
  File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/google/cloud/translate_v3/services/translation_service/transports/grpc.py", line 158, in __init__
    always_use_jwt_access=always_use_jwt_access,
  File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/google/cloud/translate_v3/services/translation_service/transports/base.py", line 110, in __init__
    **scopes_kwargs, quota_project_id=quota_project_id
  File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/google/auth/_default.py", line 488, in default
    raise exceptions.DefaultCredentialsError(_HELP_MESSAGE)
google.auth.exceptions.DefaultCredentialsError: Could not automatically determine credentials. Please set GOOGLE_APPLICATION_CREDENTIALS or explicitly create credentials and re-run the application. For more information, please see https://cloud.google.com/docs/authentication/getting-started

Process finished with exit code 1
Can someone tell me what I should do at this step? Thank you so much.
You need a service account JSON key file with the correct permissions, created in the GCP console under IAM & Admin > Service Accounts.
Then you need to set the environment variable:
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/your/service_account.json"
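Alternatively, you can load the key file directly in code. A minimal sketch, assuming key.json sits next to the script: note that from_service_account_json is a classmethod, so call it on the class itself. Calling it on an already-constructed instance (as in the question) runs __init__ first, which tries to resolve default credentials and raises the DefaultCredentialsError.

import os
from google.cloud import translate

# Option 1: point the default credential lookup at the key file
# before any client is constructed.
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = os.path.join(
    os.path.dirname(os.path.abspath(__file__)), "key.json"
)
client = translate.TranslationServiceClient()

# Option 2: build the client from the key file explicitly.
# from_service_account_json is a classmethod, so no instance is
# created (and no default credential lookup happens) beforehand.
client = translate.TranslationServiceClient.from_service_account_json("key.json")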
Related
I'm running a Dataflow streaming job using the 2.11.0 release.
I get the following authentication error after a few hours:
File "streaming_twitter.py", line 188, in <lambda>
File "streaming_twitter.py", line 102, in estimate
File "streaming_twitter.py", line 84, in estimate_aiplatform
File "streaming_twitter.py", line 42, in get_service
File "/usr/local/lib/python2.7/dist-packages/googleapiclient/_helpers.py", line 130, in positional_wrapper return wrapped(*args, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/googleapiclient/discovery.py", line 227, in build credentials=credentials)
File "/usr/local/lib/python2.7/dist-packages/googleapiclient/_helpers.py", line 130, in positional_wrapper return wrapped(*args, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/googleapiclient/discovery.py", line 363, in build_from_document credentials = _auth.default_credentials()
File "/usr/local/lib/python2.7/dist-packages/googleapiclient/_auth.py", line 42, in default_credentials credentials, _ = google.auth.default()
File "/usr/local/lib/python2.7/dist-packages/google/auth/_default.py", line 306, in default raise exceptions.DefaultCredentialsError(_HELP_MESSAGE) DefaultCredentialsError: Could not automatically determine credentials. Please set GOOGLE_APPLICATION_CREDENTIALS or explicitly create credentials and re-run the application.
This Dataflow job performs API requests to AI Platform Prediction, and the authentication token seems to be expiring.
Code snippet:
def get_service():
    # If it hasn't been instantiated yet: do it now
    return discovery.build('ml', 'v1',
                           discoveryServiceUrl=DISCOVERY_SERVICE,
                           cache_discovery=True)
I tried adding the following lines to the service function:
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/tmp/key.json"
But I get:
DefaultCredentialsError: File "/tmp/key.json" was not found. [while running 'generatedPtransform-930']
I assume this is because the file does not exist on the Dataflow worker machines.
Another option is to use the developerKey param in the build method, but it doesn't seem to be supported by AI Platform Prediction; I get this error:
Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project. [while running 'generatedPtransform-22624']
I'm looking to understand how to fix this and what the best practice is. Any suggestions?
Complete logs here
Complete code here
Setting os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = '/tmp/key.json' only works locally with the DirectRunner. Once deploying to a distributed runner like Dataflow, each worker won't be able to find the local file /tmp/key.json.
If you want each worker to use a specific service account, you can tell Beam which service account to use to identify workers.
First, grant the roles/dataflow.worker role to the service account you want your workers to use. There is no need to download the service account key file :)
Then, if you're letting PipelineOptions parse your command-line arguments, you can simply use the service_account_email option, and specify it like --service_account_email=your-email@your-project.iam.gserviceaccount.com when running your pipeline.
The service account pointed to by your GOOGLE_APPLICATION_CREDENTIALS is only used to start the job; each worker uses the service account specified by service_account_email. If service_account_email is not passed, it defaults to the email from your GOOGLE_APPLICATION_CREDENTIALS file.
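For illustration, a minimal sketch of setting the option programmatically; the project, region, bucket, and service account names below are placeholders:

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# All names here are placeholders; substitute your own project, bucket,
# region, and worker service account.
options = PipelineOptions(
    runner="DataflowRunner",
    project="your-project",
    region="us-central1",
    temp_location="gs://your-bucket/tmp",
    service_account_email="dataflow-worker@your-project.iam.gserviceaccount.com",
)

with beam.Pipeline(options=options) as pipeline:
    # Trivial pipeline body, just to make the sketch runnable.
    _ = pipeline | beam.Create(["hello"]) | beam.Map(print)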
Generating an access token for the Python Instagram API requires running this file and then entering a Client ID, Client Secret, Redirect URI, and Scope. The console then outputs a URL to follow to authorize the app and asks for the code generated afterwards. Theoretically after this process it should return an access token.
Instead, it's throwing an error:
Traceback (most recent call last):
  File "get_access_token.py", line 40, in <module>
    access_token = api.exchange_code_for_access_token(code)
  File "C:\Users\Daniel Leybzon\Anaconda2\lib\site-packages\instagram\oauth2.py", line 48, in exchange_code_for_access_token
    return req.exchange_for_access_token(code=code)
  File "C:\Users\Daniel Leybzon\Anaconda2\lib\site-packages\instagram\oauth2.py", line 115, in exchange_for_access_token
    raise OAuth2AuthExchangeError(parsed_content.get("error_message", ""))
instagram.oauth2.OAuth2AuthExchangeError: You must provide a client_id
Screenshot provided for context:
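For reference, a minimal sketch of the exchange flow with the (long-deprecated) python-instagram library; the credential values and redirect URI are hypothetical placeholders. The "You must provide a client_id" error is raised when the InstagramAPI object was created without client credentials, so make sure both client_id and client_secret are set before exchanging the code:

from instagram.client import InstagramAPI

api = InstagramAPI(
    client_id="YOUR_CLIENT_ID",          # placeholder
    client_secret="YOUR_CLIENT_SECRET",  # placeholder
    redirect_uri="http://localhost:8515/oauth_callback",  # placeholder
)

print("Authorize here:", api.get_authorize_login_url(scope=["basic"]))
code = raw_input("Paste the code from the redirect URL: ").strip()

# Depending on the library version this returns a token or a
# (token, user_info) tuple.
access_token = api.exchange_code_for_access_token(code)
print("Access token:", access_token)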
I'm trying to get my account information with the Evernote API in Python; however, I get this error:
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "build/bdist.linux-x86_64/egg/evernote/api/client.py", line 148, in delegate_method
  File "build/bdist.linux-x86_64/egg/evernote/edam/userstore/UserStore.py", line 1033, in getUser
  File "build/bdist.linux-x86_64/egg/evernote/edam/userstore/UserStore.py", line 1058, in recv_getUser
evernote.edam.error.ttypes.EDAMSystemException: EDAMSystemException(errorCode=8, rateLimitDuration=None, _message='authenticationToken')
My Python code is below:
from evernote.api.client import EvernoteClient
dev_token="my develop token"
client = EvernoteClient(token=dev_token)
userStore = client.get_user_store()
user = userStore.getUser()
I'm sure I've generated a valid developer token for my Evernote account; as shown in the picture, I have a developer token in my account.
Is there anything I missed?
By the way, when I use the code above with a developer token generated by an Evernote sandbox account instead, it works fine.
If you are not on the sandbox, try:
client = EvernoteClient(token=dev_token, sandbox=False)
EvernoteClient connects to the sandbox service by default, so a production token fails authentication unless sandbox=False is passed.
I am following the example in https://developers.google.com/storage/docs/gspythonlibrary#credentials
I created a client/secret pair by choosing "Create new Client ID", "Installed application", "Other" in the developer console.
I have the following code in my python script:
import boto
from gcs_oauth2_boto_plugin.oauth2_helper import SetFallbackClientIdAndSecret
CLIENT_ID = 'my_client_id'
CLIENT_SECRET = 'xxxfoo'
SetFallbackClientIdAndSecret(CLIENT_ID, CLIENT_SECRET)
uri = boto.storage_uri('foobartest2014', 'gs')
header_values = {"x-goog-project-id": proj_id}
uri.create_bucket(headers=header_values)
and it fails with the following error:
File "/usr/local/lib/python2.7/dist-packages/boto/storage_uri.py", line 555, in create_bucket
conn = self.connect()
File "/usr/local/lib/python2.7/dist-packages/boto/storage_uri.py", line 140, in connect
**connection_args)
File "/usr/local/lib/python2.7/dist-packages/boto/gs/connection.py", line 47, in __init__
suppress_consec_slashes=suppress_consec_slashes)
File "/usr/local/lib/python2.7/dist-packages/boto/s3/connection.py", line 190, in __init__
validate_certs=validate_certs, profile_name=profile_name)
File "/usr/local/lib/python2.7/dist-packages/boto/connection.py", line 572, in __init__
host, config, self.provider, self._required_auth_capability())
File "/usr/local/lib/python2.7/dist-packages/boto/auth.py", line 883, in get_auth_handler
'Check your credentials' % (len(names), str(names)))
boto.exception.NoAuthHandlerFound: No handler was ready to authenticate. 3 handlers were checked. ['OAuth2Auth', 'OAuth2ServiceAccountAuth', 'HmacAuthV1Handler'] Check your credentials
I struggled with this for the last couple of days; it turns out that boto and gspythonlibrary are totally obsolete.
The latest example code showing how to use/authenticate Google Cloud Storage is here:
https://github.com/GoogleCloudPlatform/python-docs-samples/tree/master/storage/api
You need to provide a client/secret pair in a .boto file and then run gsutil config.
It will create a refresh token, and it should then work!
For more info, see https://developers.google.com/storage/docs/gspythonlibrary#credentials
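For comparison, a minimal sketch using the current google-cloud-storage library instead of boto; the project ID is a placeholder, and credentials are assumed to be available via the standard lookup:

from google.cloud import storage

# Credentials are resolved via GOOGLE_APPLICATION_CREDENTIALS or
# `gcloud auth application-default login`; the project ID is a placeholder.
client = storage.Client(project="your-project-id")
bucket = client.create_bucket("foobartest2014")
print("Created bucket:", bucket.name)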
You can also build a console application that passes gsutil commands through to the Cloud SDK and executes them: run gsutil config -a for authentication, then gsutil cp, gsutil rm, and so on.
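As a rough sketch of that pass-through approach, assuming the Cloud SDK is installed and gsutil is on the PATH; the bucket and file names are placeholders:

import subprocess

# One-time interactive authentication with HMAC credentials.
subprocess.run(["gsutil", "config", "-a"], check=True)

# Subsequent commands reuse the stored credentials.
subprocess.run(["gsutil", "cp", "local.txt", "gs://your-bucket/"], check=True)
subprocess.run(["gsutil", "rm", "gs://your-bucket/old.txt"], check=True)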
I'm trying to update a document on Google Docs/Drive after creating it, using the GData helpers for Python.
The new version of the API lacks Python documentation.
client = gdata.docs.client.DocsClient(source=PluginConfig.APP_NAME)
client.http_client.debug = PluginConfig.DEBUG
client.client_login(
    PluginConfig.EMAIL,
    PluginConfig.PASSWORD,
    source=PluginConfig.APP_NAME,
    service=client.auth_service
)

[...]

# Upload the text file
ms = gdata.data.MediaSource()
ms.SetFileHandle(file_path, content_type)
doc = gdata.docs.data.Resource(type='document', title=title)
doc.description = gdata.docs.data.Description(description)
doc = client.CreateResource(doc, media=ms)
doc = client.UpdateResource(doc, media=ms, new_revision=True)
Login and document creation work fine, but the UpdateResource() call receives a 400 Bad Request:
Traceback (most recent call last):
  File "coll.py", line 301, in <module>
    main()
  File "coll.py", line 293, in main
    doc = client.UpdateResource(doc, media=ms, new_revision=True)
  File "/usr/lib/python2.7/dist-packages/gdata/docs/client.py", line 344, in update_resource
    uri_params=uri_params, **kwargs)
  File "/usr/lib/python2.7/dist-packages/gdata/client.py", line 1151, in update_file
    auth_token=auth_token, method='PUT')
  File "/usr/lib/python2.7/dist-packages/gdata/client.py", line 1085, in upload_file
    start_byte, self.file_handle.read(self.chunk_size))
  File "/usr/lib/python2.7/dist-packages/gdata/client.py", line 1044, in upload_chunk
    raise error
gdata.client.RequestError: Server responded with: 400, Invalid Request
More output: http://pastebin.com/LZL3qV0N
Any help is appreciated.
Try using the newer Drive API; its documentation includes Python samples in the reference guide, a Python quickstart, and a complete sample application written in Python on App Engine.
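For illustration, a minimal sketch of updating a file's content through the Drive v3 API with google-api-python-client; the file ID and local path are placeholders, and creds is assumed to come from your own OAuth flow:

from googleapiclient.discovery import build
from googleapiclient.http import MediaFileUpload

# creds is assumed to be an authorized credentials object obtained from
# your own OAuth flow (e.g. via google-auth / google_auth_oauthlib).
drive = build("drive", "v3", credentials=creds)

# Replace the file's content with a local text file.
media = MediaFileUpload("updated.txt", mimetype="text/plain")
drive.files().update(fileId="your-file-id", media_body=media).execute()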