I want to use the Google Translation API but I have some problems.
My environment is Ubuntu 18 with Python, using the Atom editor.
I used gcloud to set my configuration and obtained an auth login and an application-default access token:
export GOOGLE_APPLICATION_CREDENTIALS=//api_key.json
gcloud init
gcloud auth application-default login
gcloud auth application-default print-access-token
so I could use curl and get some test data:
curl -X POST -H "Authorization: Bearer "$(gcloud auth application-default print-access-token) -H "Content-Type: application/json; charset=utf-8" --data
"{
'q': 'Hello world',
'q': 'My name is Jeff',
'target': 'de'
}" "https://translation.googleapis.com/language/translate/v2"
{
  "data": {
    "translations": [
      {
        "translatedText": "Hallo Welt",
        "detectedSourceLanguage": "en"
      },
      {
        "translatedText": "Mein Name ist Jeff",
        "detectedSourceLanguage": "en"
      }
    ]
  }
}
When I run the test code in Atom, the project number is wrong; it is an old project of mine.
Even when I run the test code from Python in bash, it is the same situation.
I don't know what is wrong; I just guess there is some problem in my Python environment.
The error raised:
raise exceptions.from_http_response(response)
google.api_core.exceptions.Forbidden: 403 POST
https://translation.googleapis.com/language/translate/v2: Cloud Translation
API has not been used in project [wrong number] before or it is disabled.
Enable it by visiting
https://console.developers.google.com/apis/api/translate.googleapis.com
/overview?project=[wrong number] then retry. If you enabled this API
recently, wait a few minutes for the action to propagate to our systems and
retry.
This error message is usually thrown when the application is not being authenticated correctly, due to several possible causes such as missing files, invalid credential paths, or incorrect environment variable assignments, among others. Since the Client Libraries need to pull the credentials data from the environment variable or the client object, you must ensure you are pointing to the correct authentication files. Keep in mind this issue might not occur when using the curl command because there you were passing the access token directly.
Based on this, I recommend confirming that you are using the JSON credentials file of your current project, as well as following the Obtaining and providing service account credentials manually guide in order to explicitly specify your service account file directly in your code; this way you can set it permanently and verify that you are passing the service credentials correctly. Additionally, you can take a look at the Using Client Libraries guide, which contains the step-by-step process required to use the Translation API with Python.
Example of passing the path to the service account key in code:
def explicit():
    from google.cloud import storage

    # Explicitly use service account credentials by specifying the private key
    # file.
    storage_client = storage.Client.from_service_account_json('service_account.json')

    # Make an authenticated API request
    buckets = list(storage_client.list_buckets())
    print(buckets)
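That snippet targets Cloud Storage; for the Translation API itself, a minimal sketch along the same lines (assuming the google-cloud-translate package is installed and service_account.json belongs to the correct project) could be:
from google.cloud import translate_v2 as translate

# Explicitly load the service account key instead of relying on the
# GOOGLE_APPLICATION_CREDENTIALS variable, so the correct project is always used.
translate_client = translate.Client.from_service_account_json('service_account.json')

result = translate_client.translate('Hello world', target_language='de')
print(result['translatedText'])  # expected: "Hallo Welt"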
I tried so many methods, but none seem to work. Help me make a connection to LinkedIn using Python. I have an issue generating an access token: I received the CODE, but it doesn't work. I have Python 3.9. Please post a sample of basic code that establishes a connection and gets an access token. Also, which redirectUri do I have to use? Can I use any website link for the redirectUri?
I tried to check the API through curl and Postman but didn't get a solution; it says Unauthorized Access.
https://github.com/ozgur/python-linkedin <--- This is where I got some idea of how to use the API to receive an access token.
The first solution is valid for any application (including free ones); it uses the so-called 3-Legged OAuth 2.0 Authentication:
Log in to your account in the browser.
Create a new application by this link.
If you already have an application, you may use it by selecting it here and changing its options if needed.
In the application credentials, copy the Client ID and Client Secret; you'll need them later.
On your application's server side, create the authorization request URL with the next code and send/redirect it to the client. If your Python code runs locally, you may just open this URL in your browser with import webbrowser; webbrowser.open(url). Fill in all fields with your own values. There is a redirect_uri in the code; this is the URL where the authorization response is sent back. For a locally running script you have to run a Python HTTP web server to retrieve the result (a minimal sketch of such a callback server is shown after the redirect explanation below).
# Needs: python -m pip install requests
import requests, secrets
url = requests.Request(
'GET',
'https://www.linkedin.com/oauth/v2/authorization',
params = {
'response_type': 'code', # Always should equal to fixed string "code"
# ClientID of your created application
'client_id': 'REPLACE_WITH_YOUR_CLIENT_ID',
# The URI your users are sent back to after authorization.
# This value must match one of the OAuth 2.0 Authorized Redirect
# URLs defined in your application configuration.
# This is basically URL of your server that processes authorized requests like:
# https://your.server.com/linkedin_authorized_callback
'redirect_uri': 'REPLACE_WITH_REDIRECT_URL', # Replace this with your value
# state, any unique non-secret randomly generated string like DCEeFWf45A53sdfKef424
# that identifies current authorization request on server side.
# One way of generating such state is by using standard "secrets" module like below.
# Store generated state string on your server for further identifying this authorization session.
'state': secrets.token_hex(8).upper(),
# Requested permissions, below is just example, change them to what you need.
# List of possible permissions is here:
# https://learn.microsoft.com/en-us/linkedin/shared/references/migrations/default-scopes-migration#scope-to-consent-message-mapping
'scope': ' '.join(['r_liteprofile', 'r_emailaddress', 'w_member_social']),
},
).prepare().url
# You may now send this url from server to user
# Or if code runs locally just open browser like below
import webbrowser
webbrowser.open(url)
After the user authorizes your app via the previous URL, their browser is redirected to redirect_uri with two fields, code and state, attached to the URL. code is a unique authorization code that you should store on your server; it expires after 30 minutes if not used. state is a copy of the state from the previous code above; it acts like a unique id of your current authorization session. Use each state string only once and generate it randomly each time. state is not a secret (you send it to the user inside the authorization URL), but it should be unique and reasonably long. An example of a full redirected URL is https://your.server.com/linkedin_authorized_callback?code=987ab12uiu98onvokm56&state=D5B1C1348F110D7C.
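For a locally running script, a minimal sketch of such a callback server (not from the original answer; the port and path are assumptions and must match your redirect_uri, e.g. http://localhost:8080/linkedin_authorized_callback) could be:
# Minimal sketch of a local callback server; port and path are assumptions.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

class CallbackHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # LinkedIn redirects here with ?code=...&state=... appended to the URL
        params = parse_qs(urlparse(self.path).query)
        print('code: ', params.get('code', [''])[0])
        print('state:', params.get('state', [''])[0])
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b'Authorization received, you may close this window.')

# Serve a single request (the redirect), then exit.
HTTPServer(('localhost', 8080), CallbackHandler).handle_request()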
Next you have to exchange the code obtained previously for an access_token with the next code snippet. This code should run on your server, or wherever your application is running, because it uses the client_secret of your application, which is a secret value: you shouldn't expose it publicly. Never share the Client Secret with anyone except perhaps a few trusted people, because anyone who has it can pretend (fake) to be your application.
# Needs: python -m pip install requests
import requests
access_token = requests.post(
'https://www.linkedin.com/oauth/v2/accessToken',
params = {
'grant_type': 'authorization_code',
# This is code obtained on previous step by Python script.
'code': 'REPLACE_WITH_CODE',
# This should be same as 'redirect_uri' field value of previous Python script.
'redirect_uri': 'REPLACE_WITH_REDIRECT_URL',
# Client ID of your created application
'client_id': 'REPLACE_WITH_YOUR_CLIENT_ID',
# Client Secret of your created application
'client_secret': 'REPLACE_WITH_YOUR_CLIENT_SECRET',
},
).json()['access_token']
print(access_token)
The access_token obtained by the previous script is valid for 60 days, which is quite a long period. If you're planning to use your application only for yourself or your friends, you can simply pre-generate a few tokens by hand once every two months, without needing any servers.
Next, use the access_token for any API calls on behalf of the LinkedIn user who has just authorized above. Include an Authorization: Bearer ACCESS_TOKEN HTTP header in all calls. An example of one such API call is below:
import requests
print(requests.get(
'https://api.linkedin.com/v2/jobs',
params = {
# Any API params go here
},
headers = {
'Authorization': 'Bearer ' + access_token,
# Any other needed HTTP headers go here
},
).json())
More details can be read here. Regarding how your application is organized, there are 3 options:
Your application runs fully on a remote server, meaning both authentication and the application itself (API calls) run on some dedicated remote server. Then there is no problem with security; the server doesn't share any secrets like client_secret, code, or access_token.
Your application runs locally on the user's machine, while authentication is performed once in a while by your server; some other things, like storing necessary data in a database, can also be done by the server. Then your server doesn't need to share client_secret or code, but it does share the access_token, which is sent back to the application on the user's machine. This is also OK: your server can keep track of which users are using your application and can revoke some or all access_tokens if needed to block a user.
Your application runs fully on the local user's machine, with no dedicated server at all. In this case all of client_secret, code, and access_token are stored on the user's machine. You can't revoke access for specific users; you can only revoke everyone by regenerating client_secret in your app settings. You also can't track your app users' activity (although there may be some usage statistics in your app settings/info pages). Any user can look into your app code and copy client_secret, unless you compile the Python to an .exe/.dll/.so and encrypt the client secret there. If anyone gets the client_secret, they can pretend (fake) to be your application, meaning that if your app contacts other users somehow, they can try to authorize other people by showing your app's interface while running fraudulent code underneath; basically your app is no longer as secure or trusted. Local code can also be easily modified, so you shouldn't trust your application to run exactly your code. Finally, in order to authorize users as was done in the previous steps 5)-7), a local app has to start a Python HTTP server to retrieve the redirected result of step 5).
Below is a second solution, valid only if your application is part of the LinkedIn Developer Enterprise Products paid subscription; you also need to Enable Client Credentials Flow in your application settings. The next steps use the so-called 2-Legged OAuth 2.0 Authentication:
Log in to your account in the browser.
Create a new application by this link.
If you already have an application, you may use it by selecting it here and changing its options if needed.
In the application credentials, copy the ClientID and ClientSecret; you'll need them later.
Create an AccessToken with the next Python code (put in the correct client id and client secret). You should run this code only on your server side, or on the computers of trusted people only, because the code uses the ClientSecret of your application, which is a secret and shouldn't be exposed publicly:
# Needs: python -m pip install requests
import requests
access_token = requests.post(
'https://www.linkedin.com/oauth/v2/accessToken',
params = {
'grant_type': 'client_credentials',
'client_id': 'REPLACE_WITH_YOUR_CLIENT_ID',
'client_secret': 'REPLACE_WITH_YOUR_CLIENT_SECRET',
},
).json()['access_token']
print(access_token)
Copy the access_token from the previous response; it expires 30 minutes after being issued, so you need to run the previous script often to obtain a new access token.
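Since the token is short-lived, one possible approach (not part of the original answer, just a hedged sketch) is to cache it and request a new one shortly before it expires:
# Hedged sketch: cache the 2-legged token and renew it before the expiry reported by 'expires_in'.
import time
import requests

_cached = {'token': None, 'expires_at': 0.0}

def get_access_token(client_id, client_secret):
    # Reuse the cached token while it is still valid (60-second safety margin).
    if _cached['token'] and time.time() < _cached['expires_at'] - 60:
        return _cached['token']
    data = requests.post(
        'https://www.linkedin.com/oauth/v2/accessToken',
        params = {
            'grant_type': 'client_credentials',
            'client_id': client_id,
            'client_secret': client_secret,
        },
    ).json()
    _cached['token'] = data['access_token']
    _cached['expires_at'] = time.time() + data.get('expires_in', 1800)
    return _cached['token']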
Now you can make any API requests you need using this token, as in the code below (access_token is taken from the previous steps):
import requests
print(requests.get(
'https://api.linkedin.com/v2/jobs',
params = {
# Any API params go here
},
headers = {
'Authorization': 'Bearer ' + access_token,
# Any other needed HTTP headers go here
},
).json())
More details can be read here or here.
I am currently using the following code to get the OAuth token:
import subprocess

command = 'gcloud auth print-access-token'
result = str(subprocess.Popen(command, universal_newlines=True, shell=True,
                              stdout=subprocess.PIPE, stderr=subprocess.PIPE).communicate())
The result variable holds the OAuth token. This technique uses my currently logged-in gcloud config.
However, I am looking for a way to get the OAuth token without using the command line.
I am using this OAuth token to make CDAP calls to get the Google Dataflow pipeline execution details.
I checked some Google blogs. This is the one I think I should try, but it asks to create a consent screen, and it requires a one-time activity to provide consent to the defined scopes; then it should work.
Google Document
Shall I follow the steps in the above document, or is there any other way to get the OAuth token?
Is there a way to get authentication done by a service account instead of a Google user account and get the OAuth token?
For an automated process, a service account is the recommended way. You can use the google-auth library for this. You can generate an access token like this:
import google.auth
from google.auth.transport import requests

# With default credentials (your user account or the Google Cloud component's service account,
# or the service account key file defined in the GOOGLE_APPLICATION_CREDENTIALS env var
# -> for platforms outside GCP)
credentials, project_id = google.auth.default(scopes=["https://www.googleapis.com/auth/cloud-platform"])

# With a service account key file (not recommended)
# from google.oauth2 import service_account
# credentials = service_account.Credentials.from_service_account_file('service-account.json',
#     scopes=["https://www.googleapis.com/auth/cloud-platform"])

credentials.refresh(requests.Request())
print(credentials.token)
However, if you want to call Google Cloud APIs, I recommend using an authorized request object.
Here is an example of a BigQuery call. You can use a service account key file to generate your credentials as in my previous example.
import google.auth
from google.auth.transport.requests import AuthorizedSession

base_url = 'https://bigquery.googleapis.com'
credentials, project_id = google.auth.default(scopes=['https://www.googleapis.com/auth/cloud-platform'])
project_id = 'MyProjectId'

authed_session = AuthorizedSession(credentials)

response = authed_session.request('GET', f'{base_url}/bigquery/v2/projects/{project_id}/jobs')
print(response.json())
EDIT
When you want to use Google APIs, a service account key file is not needed (and I recommend not using one), whether on your computer or on a GCP component. The Application Default Credentials are always sufficient.
When you are in your local environment, you must run the command gcloud auth application-default login. With this command, you register your personal account as the default credential when you run your app locally. (Of course, your user account email needs to be authorized on the component that you call.)
When you are in a GCP environment, each component has a default service account (or you can specify one when you configure your component). Thanks to the component's "identity", you can use the default credential. (Of course, the service account email needs to be authorized on the component that you call.)
ONLY when you run an app automatically and outside GCP do you need a service account key file (for example, in CI/CD other than Cloud Build, or in an app deployed on another cloud provider or on premises).
Why is a service account key file not recommended? It's at least my recommendation, because this file is ..... a file!! That's the problem. You have a way to authenticate a service account in a simple file: you have to store it securely (it's a secret and an authentication method!!), you can copy it, you can send it by email, you can even commit it to a public Git repository... In addition, Google recommends rotating keys every 90 days, so it's a nightmare to rotate, trace, and manage.
I have two gcloud accounts; call them x and y.
I ran the command gcloud config set account x, as only account x has access to that particular gcloud project.
But every time I run a local job task such as:
python -m trainer.task --bucket=bucket-name --output_dir=outputdir --job-dir=./tmp --train_steps=200
I get the following error:
tensorflow.python.framework.errors_impl.PermissionDeniedError: Error executing an HTTP request: HTTP response code 403 with body '{
  "error": {
    "errors": [
      {
        "domain": "global",
        "reason": "forbidden",
        "message": "y does not have storage.objects.create access to bucket-name."
      }
    ],
    "code": 403,
    "message": "y does not have storage.objects.create access to bucket-name."
  }
}
It seems to me that the command line is accessing the y account even though I am logged into the x account. I double-checked that I am logged into the right account, with access to the right project.
The gcloud config set command only seems to affect the Cloud SDK authentication-wise. This means that despite having account x set as the default, the API calls are still made through the application-default credentials.
If you want to log in with account y, using the gcloud auth login y command should do the trick. I understand that this is your local development environment, so you should have no problems after doing this.
As well, for ML Engine it is recommended to use the gcloud ml-engine local train command to run the jobs locally (documentation on this); you can see this example of how to do it.
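As a side note (not part of the original answer), a quick hedged way to check from Python which project and account the application-default credentials actually resolve to is:
# Hedged check of what Application Default Credentials resolve to (requires google-auth).
import google.auth

credentials, project = google.auth.default()
print('ADC project:', project)
# Service-account credentials expose their email; user credentials generally don't.
print('Account:', getattr(credentials, 'service_account_email', '<user credentials>'))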
I ran into this issue trying to upload some files to GCS through an API.
It seems that APIs use the default account and not the account you are signed in to.
Fix:
gcloud auth application-default login
Followed by the auth flow.
That fixed it for me, as @joan grau noel mentioned. I had to go through a new auth flow to allow the Auth Library access via the browser. After changing the default service account to the one I needed, the errors stopped.
gcloud auth revoke --all
Just execute the above command and log back in with
gcloud auth login
Done. Enjoy executing commands with the desired user thereafter.
The access token I'm getting with gcloud auth print-access-token is obviously a different access token than the one I can get with some basic Python code:
export GOOGLE_APPLICATION_CREDENTIALS=/the-credentials.json
from oauth2client.client import GoogleCredentials
credentials = GoogleCredentials.get_application_default()
credentials.get_access_token()
What I am trying to do is get a token that would work with:
curl -u _token:<mytoken> https://eu.gcr.io/v2/my-project/my-docker-image/tags/list
I'd prefer not to install the gcloud utility as a dependency for my app, hence my attempts to obtain the access token programmatically via OAuth Google credentials.
I know this is a very old question, but I just faced the exact same problem of requiring an ACCESS_TOKEN in Python and not being able to generate it, and I managed to make it work.
What you need to do is use the credentials.token attribute, except it won't exist right after you first create the credentials object (it returns None). In order to generate a token, the credentials must first be used by a google-cloud library, which in my case was done by using the googleapiclient.discovery.build method:
import json
import googleapiclient.discovery

sqladmin = googleapiclient.discovery.build('sqladmin', 'v1beta4', credentials=credentials)
response = sqladmin.instances().get(project=PROJECT_ID, instance=INSTANCE_ID).execute()
print(json.dumps(response))
After which the ACCESS_TOKEN could be properly generated using
access_token = credentials.token
I've also tested it using google.cloud.storage as a way to exercise the credentials, and it also worked, by just trying to access a bucket in GCS through the appropriate Python library:
from google.oauth2 import service_account
from google.cloud import storage

PROJECT_ID = 'your_project_id_here'
SCOPES = ['https://www.googleapis.com/auth/sqlservice.admin']
SERVICE_ACCOUNT_FILE = '/path/to/service.json'

credentials = service_account.Credentials.from_service_account_file(
    SERVICE_ACCOUNT_FILE, scopes=SCOPES)

try:
    list(storage.Client(project=PROJECT_ID, credentials=credentials).bucket('random_bucket').list_blobs())
except:
    print("Failed because no bucket exists named 'random_bucket' in your project... but that doesn't "
          "matter; what matters is that the library tried to use the credentials and, in doing so, "
          "generated an access_token, which is what we're interested in right now")

access_token = credentials.token
print(access_token)
So I think there are a few questions:
gcloud auth print-access-token vs GoogleCredentials.get_application_default()
gcloud no longer sets application default credentials by default when performing a gcloud auth login, so the access_token you're getting from gcloud auth print-access-token is going to be the one corresponding to the user you used to log in.
As long as you follow the instructions to create ADCs for a service account, that account has the necessary permissions, and the environment from which you are executing the script has access to the ENV var and the adc.json file, you should be fine.
How to make curl work
The Docker Registry API specifies that a token exchange should happen, swapping your Basic auth (i.e. Authorization: Basic base64(_token:<gcloud_access_token>)) for a short-lived Bearer token. This process can be a bit involved, but is documented here under "How to authenticate" and "Requesting a Token". Replace auth.docker.io/token with eu.gcr.io/v2/token and service=registry.docker.io with service=eu.gcr.io, etc. Use curl -u oauth2accesstoken:<mytoken> here.
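A rough Python sketch of that exchange (the token endpoint and scope parameters here are assumptions based on the Docker Registry token flow, so verify them against the linked docs) might look like:
# Hedged sketch of the Basic -> Bearer exchange described above; service/scope are assumptions.
import base64
import requests

gcloud_access_token = '<mytoken>'  # e.g. the token obtained from the credentials code above
basic = base64.b64encode(f'oauth2accesstoken:{gcloud_access_token}'.encode()).decode()

token_resp = requests.get(
    'https://eu.gcr.io/v2/token',
    params = {'service': 'eu.gcr.io', 'scope': 'repository:my-project/my-docker-image:pull'},
    headers = {'Authorization': f'Basic {basic}'},
).json()

tags = requests.get(
    'https://eu.gcr.io/v2/my-project/my-docker-image/tags/list',
    headers = {'Authorization': f'Bearer {token_resp["token"]}'},
).json()
print(tags)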
See also: How to list images and tags from the gcr.io Docker Registry using the HTTP API?
Avoid the question entirely
We have a python lib that might be relevant to your needs:
https://github.com/google/containerregistry
I have a server-side application in Python that calls the AdWords API. Since ClientLogin is being deprecated, I'm going to have to use OAuth 2.0.
Basically I need to generate an access token without any user interaction (to allow access to the app), because I'm always using the same username and password on the server, and I'd like to use that username and password to make the AdWords API calls.
I believe the right way to do it is through an OAuth2 call with grant_type 'password' to https://accounts.google.com/o/oauth2/token.
This is what I understood from the OAuth 2.0 RFC (https://www.rfc-editor.org/rfc/rfc6749#page-38). The RFC says that the request/query string must contain 3 parameters: grant_type (set to 'password'), username, and password.
So I constructed my curl script, which looks like:
curl -v --data "grant_type=password&username=user#gmail.com&password=password" https://accounts.google.com/o/oauth2/token
But when I launch the command, I get back the following response from Google:
{"error" : "invalid_request"}
Am I missing something? Is there a simple Python library that supports grant_type=password and has decent documentation?
This grant type isn't supported by the AdWords API, so you can't get an access token using a username and password, but you can get one using a refresh token. Your application just needs to store the refresh token and use it to get new access tokens from the OAuth API. You only need to authorize the account and get the code once; after you input the code into the example, you'll get an access and refresh token.
Here's an outline of the process:
1: Construct an authorization URL:
https://accounts.google.com/o/oauth2/auth?response_type=code&client_id=1234567890123.apps.googleusercontent.com&redirect_uri=urn:ietf:wg:oauth:2.0:oob&scope=https://adwords.google.com/api/adwords/&access_type=offline
Note the access_type of offline that requests a refresh token, which you can use to generate new access tokens when they expire.
2: Browse to the URL and authorize your account.
3: Extract the code from the redirect page, and request your access and refresh tokens:
curl -v --data "code=4/v6xr77ewYqhvHSyW6UJ1w7jKwAzu&client_id=8819981768.apps.googleusercontent.com&client_secret={client_secret}&redirect_uri=urn:ietf:wg:oauth:2.0:oob&grant_type=authorization_code" https://accounts.google.com/o/oauth2/token
This should return your access and refresh tokens:
{
  "access_token": "1/fFAGRNJru1FTz70BzhT3Zg",
  "expires_in": 3920,
  "token_type": "Bearer",
  "refresh_token": "1/xEoDL4iW3cxlI7yDbSRFYNG01kVKM2C-259HOF2aQbI"
}
4: The access token only lasts for an hour, but you can use the refresh token to generate a new one without repeating steps 1-3:
curl -v --data "client_id=8819981768.apps.googleusercontent.com&
client_secret={client_secret}&refresh_token=1/xEoDL4iW3cxlI7yDbSRFYNG01kVKM2C-259HOF2aQbI&
grant_type=refresh_token" https://accounts.google.com/o/oauth2/token
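Since the question asks for Python, a hedged equivalent of this refresh call using the requests library (placeholders are the same as in the curl example above) would be:
# Hedged Python equivalent of the curl refresh-token call above.
import requests

resp = requests.post(
    'https://accounts.google.com/o/oauth2/token',
    data = {
        'client_id': '8819981768.apps.googleusercontent.com',
        'client_secret': '{client_secret}',  # placeholder, as in the curl example
        'refresh_token': '1/xEoDL4iW3cxlI7yDbSRFYNG01kVKM2C-259HOF2aQbI',
        'grant_type': 'refresh_token',
    },
)
print(resp.json()['access_token'])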
You can find a Python-specific example here.