Databricks API 2.0 - Create Secret Scope - TEMPORARILY_UNAVAILABLE - Python

I am automating the deployment of an infrastructure containing an Azure Databricks instance. To be able to use the Azure Blob Storage from within Databricks I want to create a Secret Scope via the Databricks REST API 2.0 in my DevOps Pipeline running a Python job.
When I try to create the secret scope, I get the response
{"message":"Authentication is temporarily unavailable. Please try again later.", "error_code": "TEMPORARILY_UNAVAILABLE"}
I was already able to create a Databricks access token using the API, i.e. the /token/create endpoint works perfectly.
I am authenticating to databricks using the Code from the answer to this question: https://stackoverflow.com/a/61826488/2196531
This is how I am able to create a token and how I try to generate the scope:
import requests
import adal
import json

# set variables
clientId = "<Service Principal Id>"
tenantId = "<Tenant Id>"
clientSecret = "<Service Principal Secret>"
subscription_id = "<Subscription Id>"
resource_group = "<Resource Group Name>"
databricks_workspace = "<Databricks Workspace Name>"
dbricks_url = "<Databricks Azure URL>"

# Acquire a token to authenticate against Azure management API
authority_url = 'https://login.microsoftonline.com/' + tenantId
context = adal.AuthenticationContext(authority_url)
token = context.acquire_token_with_client_credentials(
    resource='https://management.core.windows.net/',
    client_id=clientId,
    client_secret=clientSecret
)
azToken = token.get('accessToken')

# Acquire a token to authenticate against the Azure Databricks Resource
token = context.acquire_token_with_client_credentials(
    resource="2ff814a6-3304-4ab8-85cb-cd0e6f879c1d",
    client_id=clientId,
    client_secret=clientSecret
)
adbToken = token.get('accessToken')

# Format Request API Url
dbricks_api = "https://{}/api/2.0".format(dbricks_url)

# Request Authentication
dbricks_auth = {
    "Authorization": "Bearer {}".format(adbToken),
    "X-Databricks-Azure-SP-Management-Token": azToken,
    "X-Databricks-Azure-Workspace-Resource-Id": (
        "/subscriptions/{}/resourceGroups/{}/providers/Microsoft.Databricks/workspaces/{}".format(
            subscription_id, resource_group, databricks_workspace
        )
    )
}

# Creating a databricks token
payload = {
    "comment": "This token is created by API call"
}
requests.post(f"{dbricks_api}/token/create", headers=dbricks_auth, json=payload)
# works

# Creating a databricks secret scope
payload = {
    "scope": "my-databricks-secret-scope",
    "initial_manage_principal": "users"
}
requests.post(f"{dbricks_api}/secrets/scopes/create", headers=dbricks_auth, json=payload)
# returns {"message":"Authentication is temporarily unavailable. Please try again later.", "error_code": "TEMPORARILY_UNAVAILABLE"}
Databricks is running in westeurope.
Python 3.8.5 x64
Packages used in the code snippet:
adal-1.2.4
requests-2.24.0
Is there a problem with the Databricks API, or am I doing something wrong?

According to my testing, when we use the Databricks REST API to create a secret scope, we need to use a personal access token rather than the Azure AD tokens alone.
For example
Create a service principal
az login
az ad sp create-for-rbac -n "MyApp"
Code
import requests
import adal
import json

# set variables
clientId = "<Service Principal Id>"
tenantId = "<Tenant Id>"
clientSecret = "<Service Principal Secret>"
subscription_id = "<Subscription Id>"
resource_group = "<Resource Group Name>"
databricks_workspace = "<Databricks Workspace Name>"
dbricks_url = "<Databricks Azure URL>"

# Acquire a token to authenticate against Azure management API
authority_url = 'https://login.microsoftonline.com/' + tenantId
context = adal.AuthenticationContext(authority_url)
token = context.acquire_token_with_client_credentials(
    resource='https://management.core.windows.net/',
    client_id=clientId,
    client_secret=clientSecret
)
azToken = token.get('accessToken')

# Acquire a token to authenticate against the Azure Databricks Resource
token = context.acquire_token_with_client_credentials(
    resource="2ff814a6-3304-4ab8-85cb-cd0e6f879c1d",
    client_id=clientId,
    client_secret=clientSecret
)
adbToken = token.get('accessToken')

# Format Request API Url
dbricks_api = "https://{}/api/2.0".format(dbricks_url)

# Request Authentication
dbricks_auth = {
    "Authorization": "Bearer {}".format(adbToken),
    "X-Databricks-Azure-SP-Management-Token": azToken,
    "X-Databricks-Azure-Workspace-Resource-Id": (
        "/subscriptions/{}/resourceGroups/{}/providers/Microsoft.Databricks/workspaces/{}".format(
            subscription_id, resource_group, databricks_workspace
        )
    )
}

# Creating a databricks personal access token
payload = {
    "lifetime_seconds": 3600,  # the token lifetime
    "comment": "This token is created by API call"
}
data = requests.post(f"{dbricks_api}/token/create", headers=dbricks_auth, json=payload)
dict_content = json.loads(data.content.decode('utf-8'))
token = dict_content.get('token_value')

# Create the secret scope using the personal access token
payload = {
    "scope": "my-databricks-secret-scope",
    "initial_manage_principal": "users"
}
res = requests.post(
    f"{dbricks_api}/secrets/scopes/create",
    headers={"Authorization": "Bearer {}".format(token)},
    json=payload
)
print(res.status_code)
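If you want to double-check that the scope was actually created, a small optional check (a minimal sketch, reusing the dbricks_api and token variables from above) is to list the scopes with the Secrets API:

# Optional check: list secret scopes with the same personal access token
scopes_res = requests.get(
    f"{dbricks_api}/secrets/scopes/list",
    headers={"Authorization": "Bearer {}".format(token)}
)
print(scopes_res.json())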

Related

Python Google Cloud Billing APIs from a Cloud Run container instance give a 404 error

The service account that Cloud Run runs as has been linked to the billing account, but the Cloud Billing Python APIs still throw errors. What else needs to be done for the service account?
1. Created a JSON key for the service account and passed it into the access_bills() method.
2. The service account has the Billing Account Viewer role.
3. Copied these methods, as advised in the comments, from John Hanley's blog:
import time
import json
import logging

import jwt
import requests

logger = logging.getLogger(__name__)

def load_private_key(json_cred):
    ''' Return the private key from the json credentials '''
    return json_cred['private_key']

def create_signed_jwt(pkey, pkey_id, email, scope):
    '''
    Create a Signed JWT from a service account Json credentials file
    This Signed JWT will later be exchanged for an Access Token
    '''
    # Google Endpoint for creating OAuth 2.0 Access Tokens from Signed-JWT
    auth_url = "https://www.googleapis.com/oauth2/v4/token"
    expires_in = 3600
    issued = int(time.time())
    expires = issued + expires_in  # expires_in is in seconds
    # Note: this token expires and cannot be refreshed. The token must be recreated
    # JWT Headers
    additional_headers = {
        'kid': pkey_id,
        "alg": "RS256",  # Google uses SHA256withRSA
        "typ": "JWT"
    }
    # JWT Payload
    payload = {
        "iss": email,      # Issuer claim
        "sub": email,      # Issuer claim
        "aud": auth_url,   # Audience claim
        "iat": issued,     # Issued At claim
        "exp": expires,    # Expire time
        "scope": scope     # Permissions
    }
    # Encode the headers and payload and sign creating a Signed JWT (JWS)
    sig = jwt.encode(payload, pkey, algorithm="RS256", headers=additional_headers)
    return sig

def exchangeJwtForAccessToken(signed_jwt):
    '''
    This function takes a Signed JWT and exchanges it for a Google OAuth Access Token
    '''
    auth_url = "https://www.googleapis.com/oauth2/v4/token"
    params = {
        "grant_type": "urn:ietf:params:oauth:grant-type:jwt-bearer",
        "assertion": signed_jwt
    }
    r = requests.post(auth_url, data=params)
    if r.ok:
        return (r.json()['access_token'], '')
    return None, r.text

def access_bills(sa_json):
    cred = json.loads(sa_json)
    private_key = load_private_key(cred)
    # scopes = "https://www.googleapis.com/auth/cloud-platform"  # this does not work, gets 404
    scopes = "https://www.googleapis.com/auth/cloud-billing.readonly"
    s_jwt = create_signed_jwt(
        private_key,
        cred['private_key_id'],
        cred['client_email'],
        scopes)
    token, err = exchangeJwtForAccessToken(s_jwt)
    if token is None:
        logger.error("Error: {}".format(err))
        exit(1)
    logger.info("Token response: {}".format(token))
    # the token is obtained and prints in the log
    headers = {
        "Host": "www.googleapis.com",
        "Authorization": "Bearer " + token,
        "Content-Type": "application/json"
    }
    try:
        url = "https://cloudbilling.googleapis.com/v1/billingAccounts/01C8DC-336472-E177E1"  # account name is "Billing Core"
        response = requests.get(url=url, headers=headers)
        logger.info("Response: {}".format(response))
        # logs -> app - INFO - Response: <Response [404]>
        return {
            'statusCode': 200,
            'body': 'Success'
        }
    except Exception as e:
        logger.error("Error")
        raise e
It gives a 404 error, as shown in the comment in the log, when calling that URL.
I found that the only way that works for now is to set up a BigQuery export from the billing account, sort out the dataViewer permission, and run the corresponding SQL from the Python application; that works.
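For reference, a rough sketch of that workaround, assuming the billing export is already flowing into a BigQuery table and the google-cloud-bigquery client library is installed (the project, dataset and table names below are placeholders):

from google.cloud import bigquery

# Hypothetical: billing export already configured into this dataset/table
client = bigquery.Client()
query = """
    SELECT service.description AS service, SUM(cost) AS total_cost
    FROM `my-project.billing_export.gcp_billing_export_v1_XXXXXX`
    GROUP BY service
    ORDER BY total_cost DESC
"""
for row in client.query(query).result():
    print(row.service, row.total_cost)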

How to use MS Graph API and Python to add a client secret to AAD application

I am looking for Python examples that use the MS Graph API to generate a new client secret for a Microsoft Azure AD app registration. Can someone please help me?
Here is an example, along with obtaining a token using a client ID and secret.
See the documentation for the addPassword API
Obtain a bearer (and access) token
import json
import msal
import requests

# The App Registration's application (client) ID
client_id = ""
# Client secret created under App Registration blade
client_secret = ""
# Your Azure AD tenant ID
tenant_id = ""

app = msal.ConfidentialClientApplication(
    client_id=client_id,
    client_credential=client_secret,
    authority=f"https://login.microsoftonline.com/{tenant_id}")

scopes = ["https://graph.microsoft.com/.default"]

# Obtain bearer token from MS Graph
token = app.acquire_token_for_client(scopes=scopes)
Add a new client secret using Graph API
# The App Registration's object ID
app_object_id = ""

req_uri = f"https://graph.microsoft.com/v1.0/applications/{app_object_id}/addPassword"
req_headers = {
    "Authorization": "Bearer " + token['access_token'],
    "Content-Type": "application/json"
}
req_body = json.dumps(
    {
        "passwordCredential": {
            "displayName": "Secret Description"
        }
    }
)
result = requests.post(url=req_uri, headers=req_headers, data=req_body)
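The generated secret value comes back in the response body; a short sketch of reading it, using the field names documented for the addPassword response:

new_secret = result.json()
print(new_secret['secretText'])   # the new client secret value (returned only once)
print(new_secret['keyId'])        # the credential's key ID
print(new_secret['endDateTime'])  # expiry of the new secret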

How to get number of compute engine instances running on Google Cloud

In the Google Cloud documentation, the command line to get the list and number of instances running in a project is given as follows:
gcloud compute instances list
or
GET https://compute.googleapis.com/compute/v1/projects/{project}/zones/{zone}/instances
How can I do the equivalent in Python using the Google Cloud libraries?
There is documentation on this here: https://cloud.google.com/compute/docs/tutorials/python-guide
In short:
import googleapiclient.discovery

project = "your project"
zone = "your zone"

compute = googleapiclient.discovery.build('compute', 'v1')
result = compute.instances().list(project=project, zone=zone).execute()

# The response is a dict; the instances themselves are under the 'items' key
for instance in result.get('items', []):
    print(' - ' + instance['name'])
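One caveat: the list call returns a single page of results. If the zone has many instances, a sketch of walking all pages with the discovery client's list_next helper looks roughly like this:

# Walk all pages of results; list_next handles the nextPageToken for you
request = compute.instances().list(project=project, zone=zone)
while request is not None:
    response = request.execute()
    for instance in response.get('items', []):
        print(' - ' + instance['name'])
    request = compute.instances().list_next(
        previous_request=request, previous_response=response)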
Below is an example I wrote that uses the REST API and does not use one of the Google Cloud SDKs.
This example will teach you the low-level details of service accounts, authorization, access tokens and the Compute Engine REST API.
'''
This program lists the Google Compute Engine Instances in one zone
'''
import time
import json
import jwt
import requests
import httplib2

# Project ID for this request.
project = 'development-123456'

# The name of the zone for this request.
zone = 'us-west1-a'

# Service Account Credentials, Json format
json_filename = 'service-account.json'

# Permissions to request for Access Token
scopes = "https://www.googleapis.com/auth/cloud-platform"

# Set how long this token will be valid in seconds
expires_in = 3600  # Expires in 1 hour

def load_json_credentials(filename):
    ''' Load the Google Service Account Credentials from Json file '''
    with open(filename, 'r') as f:
        data = f.read()
    return json.loads(data)

def load_private_key(json_cred):
    ''' Return the private key from the json credentials '''
    return json_cred['private_key']

def create_signed_jwt(pkey, pkey_id, email, scope):
    '''
    Create a Signed JWT from a service account Json credentials file
    This Signed JWT will later be exchanged for an Access Token
    '''
    # Google Endpoint for creating OAuth 2.0 Access Tokens from Signed-JWT
    auth_url = "https://www.googleapis.com/oauth2/v4/token"
    issued = int(time.time())
    expires = issued + expires_in  # expires_in is in seconds
    # Note: this token expires and cannot be refreshed. The token must be recreated
    # JWT Headers
    additional_headers = {
        'kid': pkey_id,
        "alg": "RS256",  # Google uses SHA256withRSA
        "typ": "JWT"
    }
    # JWT Payload
    payload = {
        "iss": email,      # Issuer claim
        "sub": email,      # Issuer claim
        "aud": auth_url,   # Audience claim
        "iat": issued,     # Issued At claim
        "exp": expires,    # Expire time
        "scope": scope     # Permissions
    }
    # Encode the headers and payload and sign creating a Signed JWT (JWS)
    sig = jwt.encode(payload, pkey, algorithm="RS256", headers=additional_headers)
    return sig

def exchangeJwtForAccessToken(signed_jwt):
    '''
    This function takes a Signed JWT and exchanges it for a Google OAuth Access Token
    '''
    auth_url = "https://www.googleapis.com/oauth2/v4/token"
    params = {
        "grant_type": "urn:ietf:params:oauth:grant-type:jwt-bearer",
        "assertion": signed_jwt
    }
    r = requests.post(auth_url, data=params)
    if r.ok:
        return (r.json()['access_token'], '')
    return None, r.text

def gce_list_instances(accessToken):
    '''
    This function lists the Google Compute Engine Instances in one zone
    '''
    # Endpoint that we will call
    url = "https://www.googleapis.com/compute/v1/projects/" + project + "/zones/" + zone + "/instances"
    # One of the headers is "Authorization: Bearer $TOKEN"
    headers = {
        "Host": "www.googleapis.com",
        "Authorization": "Bearer " + accessToken,
        "Content-Type": "application/json"
    }
    h = httplib2.Http()
    resp, content = h.request(uri=url, method="GET", headers=headers)
    status = int(resp.status)
    if status < 200 or status >= 300:
        print('Error: HTTP Request failed')
        return
    j = json.loads(content.decode('utf-8').replace('\n', ''))
    print('Compute instances in zone', zone)
    print('------------------------------------------------------------')
    for item in j['items']:
        print(item['name'])

if __name__ == '__main__':
    cred = load_json_credentials(json_filename)
    private_key = load_private_key(cred)
    s_jwt = create_signed_jwt(
        private_key,
        cred['private_key_id'],
        cred['client_email'],
        scopes)
    token, err = exchangeJwtForAccessToken(s_jwt)
    if token is None:
        print('Error:', err)
        exit(1)
    gce_list_instances(token)

using python to authenticate to GCP compute API endpoint

My goal is to reproduce/replicate the functionality of gcloud compute addresses create without depending on the gcloud binary.
I am trying to use python to authenticate a POST to a googleapis compute endpoint per the documentation at https://cloud.google.com/compute/docs/ip-addresses/reserve-static-external-ip-address about reserving a static external ip address
But my POSTs return 401 every time.
I have created a JWT with the google.auth.jwt Python module, and when I decode it, the JWT has all the claims embedded that I would expect to be there.
I've also tried combinations of the following OAuth scopes to be included in the JWT:
- "https://www.googleapis.com/auth/userinfo.email"
- "https://www.googleapis.com/auth/compute"
- "https://www.googleapis.com/auth/cloud-platform"
This is my function for getting a JWT using the information in my service account's JSON key file:
def _generate_jwt(tokenPath, expiry_length=3600):
    now = int(time.time())
    tokenData = load_json_data(tokenPath)
    sa_email = tokenData['client_email']
    payload = {
        'iat': now,
        # expires after 'expiry_length' seconds.
        "exp": now + expiry_length,
        'iss': sa_email,
        "scope": " ".join([
            "https://www.googleapis.com/auth/cloud-platform",
            "https://www.googleapis.com/auth/compute",
            "https://www.googleapis.com/auth/userinfo.email"
        ]),
        'aud': "https://www.googleapis.com/oauth2/v4/token",
        'email': sa_email
    }
    # sign with keyfile
    signer = google.auth.crypt.RSASigner.from_service_account_file(tokenPath)
    jwt = google.auth.jwt.encode(signer, payload)
    return jwt
Once I have the JWT, I make the following POST, which fails with a 401:
gapiURL = 'https://www.googleapis.com/compute/v1/projects/' + projectID + '/regions/' + region + '/addresses'
jwtToken = _generate_jwt(servicetoken)
headers = {
    'Authorization': 'Bearer {}'.format(jwtToken),
    'content-type': 'application/json',
}
post = requests.post(url=gapiURL, headers=headers, data=data)
post.raise_for_status()
return post.text
I received a 401 no matter how many combinations of scopes I used in the JWT or permissions I provided to my service account. What am I doing wrong?
Edit: many thanks to @JohnHanley for pointing out that I was missing the next/second POST to the https://www.googleapis.com/oauth2/v4/token URL in GCP's auth sequence. So, you use the JWT to get an access token.
I've changed my calls to use the Python jwt module rather than the google.auth.jwt module combined with google.auth.crypt.RSASigner. The code is a bit simpler and I put it in a single method:
## serviceAccount auth sequence for google :: JWT -> accessToken
def gke_get_token(serviceKeyDict, expiry_seconds=3600):
    epoch_time = int(time.time())
    # Generate a claim from the service account file.
    claim = {
        "iss": serviceKeyDict["client_email"],
        "scope": " ".join([
            "https://www.googleapis.com/auth/cloud-platform",
            "https://www.googleapis.com/auth/userinfo.email"
        ]),
        "aud": "https://www.googleapis.com/oauth2/v4/token",
        "exp": epoch_time + expiry_seconds,
        "iat": epoch_time
    }
    # Sign claim with JWT.
    assertion = jwt.encode(claim, serviceKeyDict["private_key"], algorithm='RS256').decode()
    data = urllib.urlencode({
        "grant_type": "urn:ietf:params:oauth:grant-type:jwt-bearer",
        "assertion": assertion
    })
    # Request the access token.
    result = requests.post(
        url="https://www.googleapis.com/oauth2/v4/token",
        headers={
            "Content-Type": "application/x-www-form-urlencoded"
        },
        data=data
    )
    result.raise_for_status()
    return loadJsonData(result.text)["access_token"]
In Google Cloud there are three types of "tokens" that grant access:
Signed JWT
Access Token
Identity Token
In your case you created a Signed JWT. A few Google services accept this token. Most do not.
Once you create a Signed JWT, the next step is to call a Google OAuth endpoint and exchange it for an Access Token. I wrote an article that describes this in detail:
Google Cloud – Creating OAuth Access Tokens for REST API Calls
Some Google services now accept Identity Tokens. This is called Identity Based Access Control (IBAC). This does not apply to your question but is the trend for the future in Google Cloud Authorization. An example is my article on Cloud Run + Cloud Storage + KMS:
Google Cloud – Go – Identity Based Access Control
The following example Python code shows how to exchange tokens:
def exchangeJwtForAccessToken(signed_jwt):
    '''
    This function takes a Signed JWT and exchanges it for a Google OAuth Access Token
    '''
    auth_url = "https://www.googleapis.com/oauth2/v4/token"
    params = {
        "grant_type": "urn:ietf:params:oauth:grant-type:jwt-bearer",
        "assertion": signed_jwt
    }
    r = requests.post(auth_url, data=params)
    if r.ok:
        return (r.json()['access_token'], '')
    return None, r.text
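To tie this back to the original goal, here is a minimal sketch of using the resulting access token to reserve a static external IP via the Compute Engine addresses REST endpoint. It assumes signed_jwt was produced as in the question; the project ID, region and address name are placeholders:

import requests

project_id = "my-project"   # placeholder
region = "us-west1"         # placeholder

access_token, err = exchangeJwtForAccessToken(signed_jwt)

url = f"https://compute.googleapis.com/compute/v1/projects/{project_id}/regions/{region}/addresses"
resp = requests.post(
    url,
    headers={
        "Authorization": f"Bearer {access_token}",
        "Content-Type": "application/json",
    },
    json={"name": "my-reserved-ip"},  # placeholder address name
)
print(resp.status_code, resp.text)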

AWS Custom Federation Broker: calling federation endpoint error 400 python

I'm trying to create a URL that enables federated users to access the AWS Management Console, following the
[official documentation][1]. I'm using Cognito with the [enhanced authflow][2] in order to authenticate the user with a username and password. This is the code:
################## 1. LOGIN ####################
cognito = boto3.client('cognito-idp', aws_access_key_id='', aws_secret_access_key='')
response = cognito.initiate_auth(
    ClientId=app_client_id,
    AuthFlow='USER_PASSWORD_AUTH',
    AuthParameters={
        "USERNAME": username,
        "PASSWORD": password
    },
    ClientMetadata={'UserPoolId': user_pool_id}
)
id_token = response['AuthenticationResult']['IdToken']

################## 2. GET ID ####################
cognito_identity = boto3.client('cognito-identity', aws_access_key_id='', aws_secret_access_key='', region_name=region)
response = cognito_identity.get_id(
    IdentityPoolId=identity_pool_id,
    Logins={
        'cognito-idp.{}.amazonaws.com/{}'.format(region, user_pool_id): id_token
    }
)
identity_id = response['IdentityId']

################## 3. RETRIEVE CREDENTIALS ####################
response = cognito_identity.get_credentials_for_identity(
    IdentityId=identity_id,
    Logins={
        'cognito-idp.{}.amazonaws.com/{}'.format(region, user_pool_id): id_token
    }
)
access_key_id = response['Credentials']['AccessKeyId']
secret_key = response['Credentials']['SecretKey']
session_token = response['Credentials']['SessionToken']
For the next step (assume the role and call the federation endpoint) I'm not using the example in the official documentation linked above, because it uses boto rather than boto3. This is the code:
sts_boto_3 = boto3.client('sts', aws_access_key_id=access_key_id,
                          aws_secret_access_key=secret_key,
                          aws_session_token=session_token,
                          region_name=region)
response = sts_boto_3.assume_role(
    RoleArn=role,
    RoleSessionName=role_session_name,
)
session_id = response['Credentials']['AccessKeyId']
session_key = response['Credentials']['SecretAccessKey']
session_token = response['Credentials']['SessionToken']

session_string = '{{"sessioId" : "{}" , "sessionKey": "{}", "sessionToken" : "{}"}}'.format(session_id, session_key, session_token)
req_url = 'https://signin.aws.amazon.com/federation?Action=getSigninToken&SessionDuration={}&Session={}'.format(3600, urllib.quote_plus(session_string))
r = requests.get(req_url)
print r
The result is
<Response [503]>
What am I doing wrong?
[EDIT]
There was an error in session_string (sessioId instead of sessionId). With that fixed:
session_string = '{{"sessionId" : "{}" , "sessionKey": "{}", "sessionToken" : "{}"}}'.format(session_id, session_key, session_token)
Now the response is 400 BAD REQUEST:
<Response [400]>
I've added a full example of how to set up credentials and construct a URL that gives federated users direct access to the AWS Management Console on GitHub.
Here's the salient part of the code that constructs the URL:
def construct_federated_url(assume_role_arn, session_name, issuer, sts_client):
    """
    Constructs a URL that gives federated users direct access to the AWS Management
    Console.

    1. Acquires temporary credentials from AWS Security Token Service (AWS STS) that
       can be used to assume a role with limited permissions.
    2. Uses the temporary credentials to request a sign-in token from the
       AWS federation endpoint.
    3. Builds a URL that can be used in a browser to navigate to the AWS federation
       endpoint, includes the sign-in token for authentication, and redirects to
       the AWS Management Console with permissions defined by the role that was
       specified in step 1.

    For more information, see Enabling Custom Identity Broker Access to the AWS Console
    [https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_providers_enable-console-custom-url.html]
    in the AWS Identity and Access Management User Guide.

    :param assume_role_arn: The role that specifies the permissions that are granted.
                            The current user must have permission to assume the role.
    :param session_name: The name for the STS session.
    :param issuer: The organization that issues the URL.
    :param sts_client: A Boto3 STS instance that can assume the role.
    :return: The federated URL.
    """
    response = sts_client.assume_role(
        RoleArn=assume_role_arn, RoleSessionName=session_name)
    temp_credentials = response['Credentials']
    print(f"Assumed role {assume_role_arn} and got temporary credentials.")

    session_data = {
        'sessionId': temp_credentials['AccessKeyId'],
        'sessionKey': temp_credentials['SecretAccessKey'],
        'sessionToken': temp_credentials['SessionToken']
    }
    aws_federated_signin_endpoint = 'https://signin.aws.amazon.com/federation'

    # Make a request to the AWS federation endpoint to get a sign-in token.
    # The requests.get function URL-encodes the parameters and builds the query string
    # before making the request.
    response = requests.get(
        aws_federated_signin_endpoint,
        params={
            'Action': 'getSigninToken',
            'SessionDuration': str(datetime.timedelta(hours=12).seconds),
            'Session': json.dumps(session_data)
        })
    signin_token = json.loads(response.text)
    print("Got a sign-in token from the AWS sign-in federation endpoint.")

    # Make a federated URL that can be used to sign into the AWS Management Console.
    query_string = urllib.parse.urlencode({
        'Action': 'login',
        'Issuer': issuer,
        'Destination': 'https://console.aws.amazon.com/',
        'SigninToken': signin_token['SigninToken']
    })
    federated_url = f'{aws_federated_signin_endpoint}?{query_string}'
    return federated_url
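And a quick sketch of calling that helper with a plain boto3 STS client, assuming the function above is in scope (the role ARN, session name and issuer are placeholders):

import boto3

sts_client = boto3.client('sts')
url = construct_federated_url(
    assume_role_arn='arn:aws:iam::123456789012:role/MyFederationRole',  # placeholder
    session_name='example-session',                                      # placeholder
    issuer='example.com',                                                # placeholder
    sts_client=sts_client)
print(url)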
