Cloud Tasks masks Bearer Token (To a Public Cloud Run) - python

I have a public Cloud Run service, authenticated by a JWT token. It works 100%.
The logic inside the Cloud Run service that decodes the token is in Python:
import time

import jwt  # PyJWT
from jwt import InvalidTokenError

# JWT_SECRET and JWT_ALGORITHM are defined elsewhere in the service.
def decode_jwt(token: str) -> dict:
    try:
        decoded_token = jwt.decode(
            token, JWT_SECRET, algorithms=[JWT_ALGORITHM])
        # The custom "expires" claim is checked against the current time.
        return decoded_token if decoded_token["expires"] >= time.time() else None
    except Exception:
        raise InvalidTokenError
The Cloud Run service is publicly available via a custom domain.
Now I want to make some requests to the Cloud Run service using Cloud Tasks (each request has different parameters, created beforehand by a Cloud Function).
In Cloud Tasks, I create each task with a "Bearer {token}" header.
Cloud Task Headers Code:
task["http_request"]["headers"] = \
{"Authorization": f"Bearer {token}",
"Accept": "application/json"}
First situation:
When I create the task without the "oidc_token" parameter in the http_request, Cloud Run returns "403 Forbidden" and the request never reaches the decode_jwt function inside Cloud Run.
Cloud Task http_request Code:
task = {
    "http_request": {
        "http_method": tasks_v2.HttpMethod.POST,
        "url": url,
    }
}
Second situation:
I add an "oidc_token".
task = {
    "http_request": {
        "http_method": tasks_v2.HttpMethod.POST,
        "url": url,
        "oidc_token": {
            "service_account_email": "service-task@xxxxx.iam.gserviceaccount.com",
        },
    }
}
Now the request reaches the Cloud Run decode_jwt function, and the Cloud Run log shows "InvalidTokenError".
Extra: I added a logging.info call to expose the token received in Cloud Run, and it is not the token I passed when creating the Cloud Task.

Problem Summary:
You have a public (allUsers) Cloud Run service.
You have created your own authorization mechanism (HS256, i.e. HMAC with SHA-256).
You want to assign a custom token as the HTTP Authorization Bearer value.
Authorization for the Cloud Run service is enforced by Google's frontend infrastructure before requests reach your container. If you add an HTTP Authorization Bearer token, that infrastructure will try to verify it as a Google-signed token. That step fails for your custom token, which results in an HTTP 403 Forbidden error.
Cloud Tasks supports two types of HTTP Authorization Bearer tokens: OAuth access tokens and OIDC identity tokens. You cannot use your own token value in place of the supported types.
That leaves you with two options:
Enhance your code to support Google-signed OIDC identity tokens (see the sketch after the note below).
Use a custom HTTP header that carries your custom token format.
Note: I do not recommend using HS256. HS256 is a symmetric algorithm, which means the secret must be known to both sides in order to validate the payload. RS256 is an asymmetric algorithm that uses private/public key pairs, so verification requires only the public key. This is one of the strong design features of Google's use of private keys for service accounts and identities. If you switch to Google's method, all of the hard work is done for you.
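For option 1, here is a minimal sketch of verifying a Google-signed OIDC identity token inside the Cloud Run service with the google-auth library; the audience value is a hypothetical placeholder for your service's base URL:

from google.auth.transport import requests as google_requests
from google.oauth2 import id_token

EXPECTED_AUDIENCE = "https://my-service.example.com"  # hypothetical base URL

def verify_google_oidc(token: str) -> dict:
    # Raises ValueError if the signature, issuer, expiry, or audience check fails.
    return id_token.verify_oauth2_token(
        token, google_requests.Request(), audience=EXPECTED_AUDIENCE)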

You have to specify the audience of your Cloud Run service, like this:
task = {
    "http_request": {  # Specify the type of request.
        "http_method": tasks_v2.HttpMethod.POST,
        "url": url,  # The full URL path that the task will be sent to.
        "oidc_token": {
            "service_account_email": "service-task@xxxxx.iam.gserviceaccount.com",
            # The audience is the base URL of the Cloud Run service, with no /sub/path.
            "audience": "https://my-service-xxxxx-uc.a.run.app",  # hypothetical
        },
    }
}
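For completeness, a minimal sketch of submitting the task above, assuming hypothetical project, location, and queue names:

from google.cloud import tasks_v2

client = tasks_v2.CloudTasksClient()
parent = client.queue_path("my-project", "us-central1", "my-queue")  # hypothetical names
response = client.create_task(parent=parent, task=task)
print(response.name)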

Related

Authenticate with AWS ALB / Cognito

I am trying to authorize with an ALB from Python. As I understand it, the ALB looks for the "AWSELBAuthSessionCookie" cookies before letting you through to the website. I also see these cookies when logging into the application myself (using username and password). The question is: how do I obtain the values of these cookies if I want to authenticate myself to the website/API from a Python program? Has anybody done this before?
I had the exact same problem and could only make it work using an API Gateway, since API Gateway allows authorization via a JWT in the Authorization header of the request. This can easily be done in Python, e.g.:
import boto3
import requests

client = boto3.client(
    "cognito-idp",
    region_name="<aws region of the cognito app client>"
)
response = client.initiate_auth(
    ClientId="<cognito app client ID>",
    AuthFlow="USER_PASSWORD_AUTH",
    AuthParameters={
        "USERNAME": "<username>",
        "PASSWORD": "<password>",
        "SECRET_HASH": "<secret hash>",
    },
)
token = response["AuthenticationResult"]["AccessToken"]
headers = {"Authorization": f"Bearer {token}"}
requests.get("<api gateway url>", headers=headers)
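Note that "<secret hash>" above is not a literal value: for Cognito it is an HMAC-SHA256 of the username concatenated with the app client ID, keyed with the app client secret and base64-encoded. A minimal sketch:

import base64
import hashlib
import hmac

def secret_hash(username: str, client_id: str, client_secret: str) -> str:
    # HMAC-SHA256(key=client_secret, msg=username + client_id), base64-encoded.
    digest = hmac.new(client_secret.encode(), (username + client_id).encode(),
                      hashlib.sha256).digest()
    return base64.b64encode(digest).decode()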
However, I also needed to allow authorization via the Cognito UI. Thus, I had to use both the ALB and the API Gateway.
While this solved the issue of making my application available both from the browser (i.e. for humans) and from code (i.e. for machines), it introduced a lot of additional AWS components I had to use. And, as a disadvantageous side effect, the API Gateway has a request payload limit of 10 MB that cannot be increased. This is another issue for me.
I know it's been a year, but if you've solved the issue, feel free to share your solution.

Triggering a cloud function from google sheets (via google apps script)

I have been trying (with little success) to have a Google Cloud Function be triggered via an HTTP request from a Google Sheet (Google Apps Script), and it seemingly won't work. A few important things:
The function should only run if the user comes from my organization.
The user should not have to be invited to the GCP project.
I know this can be done very easily in Google Colab and Python. The following script will let a user in my organization who is not in the GCP project trigger the cloud function:
import requests
import google.auth
from google.auth.transport.requests import Request
from google.colab import auth

auth.authenticate_user()  # prompt the Colab user to sign in
credentials, project_id = google.auth.default()
request = Request()
credentials.refresh(request=request)

GCF_URL = 'https://project_location-project_id.cloudfunctions.net/name-of-your-function'
resp = requests.get(GCF_URL, headers={'Authorization': f'Bearer {credentials.id_token}'})
This works and triggers the cloud function for any user inside my organization, but it does not work for my personal email, for example.
Now, I would like to replicate this behaviour inside a Google Apps Script so that an end user with access to that sheet can trigger the cloud function as long as they are a member of my organization.
I have tried some things I have seen online, such as this example:
function callExternalUrl() {
  var url = 'https://project_location-project_id.cloudfunctions.net/name-of-your-function';
  var oauthToken = ScriptApp.getOAuthToken(); // get the OAuth token of the user logged into Google Sheets
  const data = {
    oauthToken, // 1. add the oauth token to the payload
    activeUser: param.user // 2. this is important as it adds the userinfo.email scope to the token
    // any other data you need to send to the Cloud Function can be added here
  };
  var options = {
    'method': 'get', // or post, depending on how you set up your Cloud Function
    'contentType': 'application/json',
    // Convert the JavaScript object to a JSON string.
    'payload': JSON.stringify(data)
  };
  const response = UrlFetchApp.fetch(url, options);
  Logger.log('Response Code: ' + response.getResponseCode());
}
This gives a 403 error, but if I change it so that it sends the OAuth token in the correct format, like this:
function callExternalUrl() {
  var url = 'https://project_location-project_id.cloudfunctions.net/name-of-your-function';
  var oauthToken = ScriptApp.getOAuthToken(); // get the OAuth token of the user logged into Google Sheets
  var response = UrlFetchApp.fetch(url, {
    headers: {
      Authorization: 'Bearer ' + oauthToken
    }
  });
  Logger.log('Response Code: ' + response.getResponseCode());
}
I get a 401 (i.e. the authorization failed). Now, it seems that I simply have to get the correct authentication from the users to send with this request for it to work. I have seen a GitHub repo that focuses on getting OAuth2 tokens from Google Apps Script (https://github.com/gsuitedevs/apps-script-oauth2), but I can't seem to get that to work either; it would presumably have to be adapted to Cloud Functions in some way I am unaware of.
I have read
Securely calling a Google Cloud Function via a Google Apps Script
which is very similar, but it did not seem to get to the root of the problem. Any input on how to make this process possible?
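One possible lead, sketched here as an assumption rather than a confirmed fix: ScriptApp.getOAuthToken() returns an OAuth access token, whereas an IAM-protected Cloud Function expects an OIDC identity token, which Apps Script exposes via ScriptApp.getIdentityToken() (this requires the openid scope in the manifest):

function callCloudFunction() {
  var url = 'https://project_location-project_id.cloudfunctions.net/name-of-your-function';
  var idToken = ScriptApp.getIdentityToken(); // OIDC identity token, not an access token
  var response = UrlFetchApp.fetch(url, {
    headers: {
      Authorization: 'Bearer ' + idToken
    }
  });
  Logger.log('Response Code: ' + response.getResponseCode());
}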

Authenticating Against an IAP Protected Resource with Bearer Header?

Is it possible to use an Authorization: Bearer … header to make a request through Identity Aware Proxy to my protected application? (Using a service account, of course. From outside GCP.)
I would like to avoid performing the OIDC token exchange; is this supported?
If so, does anyone have any examples?
So far, I have the following but it doesn't work:
import time

import jwt  # PyJWT

iat = time.time()
exp = iat + 3600
payload = {'iss': account['client_email'],
           'sub': account['client_email'],
           'aud': '/projects/NNNNN/apps/XXXXXXX',
           'iat': iat,
           'exp': exp}
# Note: 'kid' normally carries the key ID (account['private_key_id']), not the key itself.
additional_headers = {'kid': account['private_key']}
signed_jwt = jwt.encode(payload, account['private_key'], headers=additional_headers,
                        algorithm='RS256')
signed_jwt = signed_jwt.decode('utf-8')
This produces: Invalid IAP credentials: JWT signature is invalid.
This is not currently supported. IAP is expecting a signature generated by the Google accounts infrastructure using its private key, so that's why the signature check is failing. Could you tell me more about why you'd like to avoid the OIDC token exchange? --Matthew, Google IAP Engineering
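For reference, a minimal sketch of the supported flow, using google-auth to perform the OIDC token exchange for IAP; the client ID and key file path are hypothetical placeholders:

from google.auth.transport.requests import Request
from google.oauth2 import service_account

IAP_CLIENT_ID = 'NNNNN-xxxx.apps.googleusercontent.com'  # hypothetical OAuth client ID of the IAP app

credentials = service_account.IDTokenCredentials.from_service_account_file(
    'service-account.json', target_audience=IAP_CLIENT_ID)
credentials.refresh(Request())
headers = {'Authorization': f'Bearer {credentials.token}'}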

Python get info from API / Oauth Authentication

This is my first try with an API, the API in question being called OPS.
I would like to get information using the API (OAuth 2) within my Python code.
The resource URL is:
http://ops.epo.org/3.2/rest-services/register/{publication}/{EPODOC}/{EP2814089}/biblio
I also received :
Consumer Key: O220VlTQqAmodifiedsf0YeqgM6c
Consumer Secret Key: swWmodified3edjORU
The documentation states that:
OPS uses the OAuth framework for Authentication and Authorization. At this point in
time, only the “Client Credentials” flow is supported using a Consumer key and
Consumer secret.
The actual steps to follow are:
Step 1: Client converts Consumer key and Consumer secret to
Base64Encode(Consumer key:Consumer secret).
This should be done programmatically using the language you are developing the client
application in. For the purposes of this example, a public website was used to perform
this conversion.
By entering the colon-separated client credentials, an encoded response is generated.
This response is then used for Basic Authentication.
Step 2: Client requests an access token using Basic Authentication, supplying its
Consumer key and Consumer secret with base64Encoding over encrypted HTTPS
connection:
OPS authenticates the client credentials passed in the Authorization header using basic
authentication method.
If credentials are valid, OPS responds with a valid access token.
Step 3: Client accesses OPS resources with access token in authorization header
(bearer tokens) over encrypted HTTPS connection
I tried a few samples of code with requests but, until now, nothing worked.
The client credentials flow is described in the OAuth2 RFC 6749. The client ID and secret are base64-encoded in a Basic authentication scheme, as described in RFC 7617.
You should be able to get a token using Python code like:
import requests
import base64
url = 'https://ops.epo.org/3.2/auth/accesstoken'
data = {"grant_type": "client_credentials"}
creds = base64.b64encode("O220VlTQqAmodifiedsf0YeqgM6c:swWmodified3edjORU".encode())
headers = {'Authorization': 'Basic ' + creds.decode('UTF-8'), 'Content-Type': 'application/x-www-form-urlencoded'}
response = requests.post(url, headers=headers, data=data)
access_token = response.json()["access_token"]
Using the previous answer, I can obtain a token. (Thanks a lot for your answer!)
So I tried:
myUrl = 'http://ops.epo.org/3.2/rest-services/register/publication/EPODOC/EP2814089/biblio'
header = {'PRIVATE-TOKEN': myToken}
response = requests.get(myUrl, headers=header)
print(response.text)
but I obtained a 403 error.
I finally found a dedicated library that does the job:
EPO OPS Library
But I still don't know how to do it on my own...
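Based on step 3 of the quoted documentation (bearer tokens in the Authorization header), the 403 most likely happened because the token was sent in a PRIVATE-TOKEN header. A minimal sketch of the same request with a standard Bearer header, reusing access_token from the answer above:

import requests

myUrl = 'http://ops.epo.org/3.2/rest-services/register/publication/EPODOC/EP2814089/biblio'
headers = {'Authorization': 'Bearer ' + access_token}
response = requests.get(myUrl, headers=headers)
print(response.status_code, response.text)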

Google API Service Account Access Token expires

I want to retrieve realtime user data from Google Analytics, and this article by Google outlines how to do this (and it works as expected).
gapi.client.analytics.data.realtime.get({
  "ids": "ga:21660971",
  "metrics": "rt:activeUsers"
})
However, I am following the server-side authentication path because I want the data to be always visible, even to users who do not have access. But overnight the access token becomes invalid. What can I do or rework to make this work without the need for constant manual intervention to refresh the token?
Python script (as is in the Google article).
# service-account.py
import time
from oauth2client.service_account import ServiceAccountCredentials

# The scope for the OAuth2 request.
SCOPE = 'https://www.googleapis.com/auth/analytics.readonly'

# The location of the key file with the key data.
KEY_FILEPATH = 'MY-JSON-FILE.json'

# Defines a method to get an access token from the ServiceAccount object.
def get_access_token():
    return ServiceAccountCredentials.from_json_keyfile_name(
        KEY_FILEPATH, SCOPE).get_access_token().access_token

print(get_access_token())
JS authentication
/**
 * Authorize the user with an access token obtained server side.
 */
gapi.analytics.auth.authorize({
  'serverAuth': {
    'access_token': 'TOKEN-FROM-PY-SCRIPT-ABOVE'
  }
});
This is the error given in the console, and it only appears after 60 minutes or so:
{domain: "global", reason: "authError", message: "Invalid Credentials", locationType: "header", location: "Authorization"}
