I have two gcloud accounts; call them x and y.
I ran the command gcloud config set account x, as only account x has access to that particular gcloud project.
But every time I run a local job task such as:
python -m trainer.task --bucket=bucket-name --output_dir=outputdir --job-dir=./tmp --train_steps=200
I get the following error:
tensorflow.python.framework.errors_impl.PermissionDeniedError: Error executing an HTTP request: HTTP response code 403 with body '{
  "error": {
    "errors": [
      {
        "domain": "global",
        "reason": "forbidden",
        "message": "y does not have storage.objects.create access to bucket-name."
      }
    ],
    "code": 403,
    "message": "y does not have storage.objects.create access to bucket-name."
  }
}
It seems to me that the command line is using the y account even though I am logged into the x account. I double-checked that I am logged into the right account, with access to the right project.
The gcloud config set command only affects the Cloud SDK side of authentication. This means that despite having account x set as the default, the API calls are still made with the application-default credentials.
If you want the API calls to go through account x, running gcloud auth application-default login and signing in as x should do the trick. I understand that this is your local development environment, so you should have no problems after doing this.
Also, for ML Engine it is recommended to use the gcloud ml-engine local train command to run jobs locally (documentation on this); you can see this example of how to do it.
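A local run along those lines might look like the following sketch; the module and package names are assumed to match the trainer layout from the question, and everything after the bare -- is passed through to trainer.task:
gcloud ml-engine local train \
  --module-name trainer.task \
  --package-path trainer/ \
  -- \
  --bucket=bucket-name --output_dir=outputdir --job-dir=./tmp --train_steps=200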
I ran into this issue trying to upload some files to GCS through an API.
It seems that the client libraries use the application-default credentials rather than the account you are signed in to with gcloud.
Fix:
gcloud auth application-default login
Followed by the auth flow.
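If you want to double-check afterwards, gcloud auth list shows the accounts the CLI knows about, and printing an application-default token verifies that the new default credentials work:
gcloud auth list
gcloud auth application-default print-access-token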
That fixed it for me, as @joan grau noel mentioned. I had to go through a new auth flow to allow Auth Library access via the browser. After changing the default service account to the one I needed, the errors stopped.
gcloud auth revoke --all
Just execute the above command and log back in with
gcloud auth login
Done. Enjoy executing commands with the desired user thereafter.
Related
Due to the deprecation of some of Microsoft's services, I need to migrate the login method for an unattended script. The script is currently set up to log in to the service using Basic Auth username/password, but it must now go through something like MSAL instead.
Again, this will be an unattended script so it cannot accommodate any interactive prompts.
According to the docs, ROPC (while "not recommended") does not require any UI. However, when I try to acquire an access token using this method, I get the following:
>>> app = msal.PublicClientApplication(client_id, authority=f'https://login.microsoftonline.com/{tenant_id}')
>>> app.acquire_token_by_username_password(username, passwd, scopes=['Mail.ReadWrite'])
{
'error': 'invalid_grant',
'error_description': "AADSTS65001: The user or administrator has not consented to use the
application with ID '...' named '...'. Send an interactive authorization request for this
user and resource.\r\n...",
'error_codes': [65001],
...,
'suberror': 'consent_required'
}
I have also tried to use the API directly:
POST
https://login.microsoftonline.com/{tenant}/oauth2/v2.0/token
body:
{
"client_id": "{client_id}",
"scope": "https://graph.microsoft.com/mail.readwrite",
"username": "{username}",
"password": "{password}",
"grant_type": "password"
}
The response is the same as above.
I have set Delegated permissions in the App registration (but there doesn't seem to be a way to grant admin consent here).
The response message suggests this type of request requires an interactive prompt, but the documentation explicitly states this is a non-UI authentication flow.
What am I missing?
I tried to reproduce the same in my environment and got the same error.
The error usually occurs when the API permissions granted to the Azure AD application have not been consented to by a Global Admin.
To resolve the error, make sure to grant admin consent to the Mail.ReadWrite permission in the App registration's API permissions.
After granting the admin consent, I was able to generate the access token successfully with the following request:
POST https://login.microsoftonline.com/TenantID/oauth2/v2.0/token
client_id: f2e61f2e-7340-4f37-9dac-XXXXXX
scope: https://graph.microsoft.com/mail.readwrite
username: rukadmin@XXXX.onmicrosoft.com
password: ****
grant_type: password
Alternatively, you can make use of the endpoint below, sign in as a Global Admin, and accept the consent on behalf of the organization:
https://login.microsoftonline.com/TenantID/adminconsent?client_id=ClientID
You can also make use of the interactive grant type and allow users to consent to the application accessing their account with the following setting:
Go to Azure Portal -> Enterprise Applications -> Consent and permissions -> User consent settings
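Putting it together, here is a minimal Python sketch of the ROPC flow once admin consent has been granted; the tenant ID, client ID, and account values are placeholders:
import msal

tenant_id = "your-tenant-id"   # placeholder
client_id = "your-client-id"   # placeholder

app = msal.PublicClientApplication(
    client_id,
    authority=f"https://login.microsoftonline.com/{tenant_id}")

# ROPC only works for accounts without MFA, and only once the delegated
# Mail.ReadWrite permission has been admin-consented.
result = app.acquire_token_by_username_password(
    "user@yourtenant.onmicrosoft.com",  # placeholder account
    "password",                         # placeholder password
    scopes=["Mail.ReadWrite"])

if "access_token" in result:
    print(result["access_token"])
else:
    print(result.get("error"), result.get("error_description"))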
I recently deployed a Python web application to the Heroku platform that takes advantage of the Google Calendar API. I was able to get the events list from the calendar using OAuth 2.0 credentials, and web application domain verification isn't generally required in such cases. The opposite is true of the push-notification feature, which requires both domain verification and an HTTPS request sent to the Google API to activate it.
When I try to execute the HTTPS request using the watch method
with code:
body = {
    "id": "<specified_uuid_read_from_file>",
    "type": "web_hook",
    "address": "https://<heroku_application_id>.herokuapp.com"
}
events_result = service.events().watch(calendarId="primary", body=body).execute()
I get this response:
googleapiclient.errors.HttpError: <HttpError 401 when requesting https://www.googleapis.com/calendar/v3/calendars/primary/events/watch?alt=json returned "Unauthorized WebHook callback channel: https://<heroku_application_id>.herokuapp.com">
I have already done the following:
Verified the domain https://<heroku_application_id>.herokuapp.com via Google Search Console by uploading an HTML file.
Added this domain to the Google API "Domain verification" tab.
I would like to know what other possible reasons there could be, besides domain verification issues resulting from an incorrect Google authorization process, and how to fix this.
Thank you in advance.
EDIT:
Eventually I was able to gain access by renewing the Google API credentials. Everything works fine for now. Nevertheless, I still can't explain what the actual problem that caused this error was.
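For reference, the renewed setup was roughly the sketch below; the token file path is a placeholder for wherever the refreshed OAuth 2.0 credentials are stored, and the channel id is just a fresh UUID:
import uuid
from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

# Load the renewed OAuth 2.0 user credentials ('token.json' is a placeholder).
creds = Credentials.from_authorized_user_file(
    'token.json', scopes=['https://www.googleapis.com/auth/calendar'])
service = build('calendar', 'v3', credentials=creds)

body = {
    "id": str(uuid.uuid4()),  # channel id
    "type": "web_hook",
    "address": "https://<heroku_application_id>.herokuapp.com"
}
events_result = service.events().watch(calendarId="primary", body=body).execute()
print(events_result)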
I am currently using the following code to get the OAuth token:
import subprocess

command = 'gcloud auth print-access-token'
stdout, _ = subprocess.Popen(command, universal_newlines=True, shell=True,
                             stdout=subprocess.PIPE,
                             stderr=subprocess.PIPE).communicate()
result = stdout.strip()
The result variable holds the OAuth token. This technique uses my currently logged-in gcloud config.
However, I am looking for a way to get the OAuth token without using the command line.
I am using this OAuth token to make CDAP calls to get the Google Dataflow pipeline execution details.
I checked some Google blogs. This is the one I think I should try, but it asks to create a consent screen, and it requires a one-time step of consenting to the defined scopes before it works:
Google Document
Shall I follow the steps in the above document, or is there any other way to get the OAuth token?
Is there a way to have authentication done by a service account instead of a Google user account and get the OAuth token?
For automated processes, a service account is the recommended way. You can use the google-auth library for this. You can generate an access token like this:
import google.auth
from google.auth.transport import requests
# from google.oauth2 import service_account

# With default credentials (your user account or the Google Cloud component's
# service account; or the service account key file defined in the
# GOOGLE_APPLICATION_CREDENTIALS env var -> for platforms outside GCP)
credentials, project_id = google.auth.default(scopes=["https://www.googleapis.com/auth/cloud-platform"])

# With a service account key file (not recommended)
# credentials = service_account.Credentials.from_service_account_file(
#     'service-account.json',
#     scopes=["https://www.googleapis.com/auth/cloud-platform"])

credentials.refresh(requests.Request())
print(credentials.token)
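The raw token can then be sent as a bearer header; for example, against the Dataflow jobs endpoint (the project ID is a placeholder, and the standard requests library is aliased to avoid clashing with google.auth.transport.requests above):
import requests as http_requests

headers = {'Authorization': f'Bearer {credentials.token}'}
response = http_requests.get(
    'https://dataflow.googleapis.com/v1b3/projects/MyProjectId/jobs',
    headers=headers)
print(response.json())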
However, if you want to call Google Cloud APIs, I recommend you use an authorized request object.
Here is an example of a BigQuery call. You can use a service account key file to generate your credentials, as in my previous example.
import google.auth
from google.auth.transport.requests import AuthorizedSession

base_url = 'https://bigquery.googleapis.com'
credentials, project_id = google.auth.default(scopes=['https://www.googleapis.com/auth/cloud-platform'])
project_id = 'MyProjectId'  # override the project detected from the credentials if needed

authed_session = AuthorizedSession(credentials)
response = authed_session.request('GET', f'{base_url}/bigquery/v2/projects/{project_id}/jobs')
print(response.json())
EDIT
When you want to use Google APIs, a service account key file is not needed (and I recommend you not use one) on your computer or on a GCP component. The Application Default Credentials are always sufficient.
When you are in your local environment, you must run the command gcloud auth application-default login. With this command, you register your personal account as the default credential when you run your app locally. (Of course, your user account email needs to be authorized on the component that you call.)
When you are in a GCP environment, each component has a default service account (or you can specify one when you configure your component). Thanks to the component's "identity", you can use the default credentials. (Of course, the service account email needs to be authorized on the component that you call.)
ONLY when you run an app automatically and outside GCP do you need a service account key file (for example, in a CI/CD system other than Cloud Build, or in an app deployed on another cloud provider or on premises).
Why is a service account key file not recommended? It's at least my recommendation, because this file is... a file! That's the problem. You have a way to authenticate a service account in a simple file: you have to store it securely (it's a secret and an authentication method!), you can copy it, you can send it by email, you can even commit it to a public Git repository... In addition, Google recommends rotating keys every 90 days, so it's a nightmare to rotate, trace, and manage.
I want to use the Google Translation API, but I have some problems.
My environment is Ubuntu 18 (Linux) and Python with the Atom editor.
I used gcloud to set my configuration and got the auth login and auth login token:
export GOOGLE_APPLICATION_CREDENTIALS=//api_key.json
gcloud init
gcloud auth application-default login
gcloud auth application-default print-access-token
So I could use curl and get some test data:
curl -X POST \
  -H "Authorization: Bearer "$(gcloud auth application-default print-access-token) \
  -H "Content-Type: application/json; charset=utf-8" \
  --data "{
    'q': 'Hello world',
    'q': 'My name is Jeff',
    'target': 'de'
  }" "https://translation.googleapis.com/language/translate/v2"
{
  "data": {
    "translations": [
      {
        "translatedText": "Hallo Welt",
        "detectedSourceLanguage": "en"
      },
      {
        "translatedText": "Mein Name ist Jeff",
        "detectedSourceLanguage": "en"
      }
    ]
  }
}
When I run the test code in Atom, my project number is wrong.
It is my past project.
Even when I run the test code in bash with Python, the situation is the same.
I don't know what is wrong; I just guess there is some problem in my Python environment.
The raised error:
raise exceptions.from_http_response(response)
google.api_core.exceptions.Forbidden: 403 POST
https://translation.googleapis.com/language/translate/v2: Cloud Translation
API has not been used in project [wrong number] before or it is disabled.
Enable it by visiting
https://console.developers.google.com/apis/api/translate.googleapis.com
/overview?project=[wrong number] then retry. If you enabled this API
recently, wait a few minutes for the action to propagate to our systems and
retry.
This error message is usually thrown when the application is not being authenticated correctly, due to reasons such as missing files, invalid credential paths, or incorrect environment variable assignments, among other causes. Since the client libraries need to pull the credentials data from the environment variable or the client object, you must ensure you are pointing to the correct authentication files. Keep in mind this issue might not occur when using the curl command, because there you were passing the access token directly.
Based on this, I recommend you confirm that you are using the JSON credentials file of your current project, and follow the Obtaining and providing service account credentials manually guide to explicitly specify your service account file directly in your code; this way, you can set it permanently and verify that you are passing the service credentials correctly. Additionally, you can take a look at the Using Client Libraries guide, which contains the step-by-step process required to use the Translation API with Python.
Example of passing the path to the service account key in code:
def explicit():
    from google.cloud import storage

    # Explicitly use service account credentials by specifying the private key
    # file.
    storage_client = storage.Client.from_service_account_json('service_account.json')

    # Make an authenticated API request
    buckets = list(storage_client.list_buckets())
    print(buckets)
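Since your error comes from the Translation API, a similar explicit-credentials sketch for that client could look like this ('service_account.json' is a placeholder path for the key file of your current project):
def explicit_translate():
    from google.cloud import translate_v2 as translate

    # Explicitly use service account credentials from the current project.
    translate_client = translate.Client.from_service_account_json('service_account.json')

    # Make an authenticated API request
    result = translate_client.translate('Hello world', target_language='de')
    print(result['translatedText'])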
I am running this code in a small example:
import os

from google.cloud import storage
from google.appengine.api import app_identity

class TestB(base_handler.TRNHandler):
    #...
    def post(self):
        client = storage.Client()
        bucket_name = os.environ.get('BUCKET_NAME',
                                     app_identity.get_default_gcs_bucket_name())
        bucket = client.get_bucket(bucket_name)
        #...
If I deploy this code everything works as expected. But when I run it locally (SDK), I get an error: Unauthorized: 401 Invalid Credentials. What's happening and how can I fix it?
I've got a pretty strong guess, although I can't be sure without seeing your exact logs and whatnot.
The google.cloud library is smart about authorization. It uses a thing called "application default credentials." If you run that code on App Engine or on a GCE instance, the code will be able to figure out which service account is associated with that instance and authorize itself with the credentials of that account.
However, when you run the program locally, the library has no way of knowing which credentials to use, and so it just makes calls anonymously. Your bucket probably hasn't granted anonymous users access (which is good), and so the call fails with a 401.
You can, however, register credentials locally with the gcloud command:
$> gcloud auth application-default login
Run that, and the library will use whatever credentials you've used to log in for a while. Alternatively, you could also make sure that the environment variable GOOGLE_APPLICATION_CREDENTIALS points to a service account's JSON key file.
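For example (the path is a placeholder):
$> export GOOGLE_APPLICATION_CREDENTIALS=/path/to/key_file.json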
There's a bunch of documentation on exactly how Application Default Credentials pick a credential.
Alternately, if you'd prefer to specify auth right in the program, you can do that too:
client = storage.Client.from_service_account_json('/path/to/key_file.json')