Trigger an Azure Function upon VM creation - python

I'm looking for a way to trigger an Azure Function that runs some Python code each time a new virtual machine is created. I have already done the same thing in AWS using CloudWatch + Lambda, but I can't find where/how to achieve the same thing in Azure.
I have tried to use Logic Apps with Event Grid, but there is no trigger to monitor VM state.
Could anyone provide me with some guidance here?
Many thanks in advance.

Azure doesn't have a built-in method to achieve your requirement, but I think you can achieve it with your own Python code. The main logic is to poll the VM names in your subscription and store them somewhere; when they change, post a request to something like an 'HttpTrigger' endpoint (or just put the handling logic inside the polling algorithm itself).
As for the polling algorithm, you can design it yourself or simply use a 'TimeTrigger' function.
I notice you added the 'Python' tag, so you can use code like the following and run it inside the polling algorithm:
import requests
import json
from azure.identity import ClientSecretCredential

client_id = 'xxx'
tenant_id = 'xxx'
client_secret = 'xxx'
subscription_id = 'xxx'

credential = ClientSecretCredential(tenant_id=tenant_id, client_id=client_id, client_secret=client_secret)
# get_token returns an AccessToken object; use its .token attribute
# rather than slicing its string representation.
access_token = credential.get_token('https://management.azure.com/.default').token
bearer_token = 'Bearer ' + access_token

r = requests.get(
    "https://management.azure.com/subscriptions/" + subscription_id +
    "/resources?$filter=resourceType eq 'Microsoft.Compute/virtualMachines'&api-version=2020-06-01",
    headers={'Authorization': bearer_token})
items = json.loads(r.text)
print(r.text)

for item in items['value']:
    # Don't just print the name: store it somewhere such as a database,
    # Azure Blob Storage, Azure Table Storage, etc.
    print(item['name'])
    # Check against the stored VM names here. If a VM has been added,
    # post a request to the HttpTrigger function.
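As a minimal sketch of that comparison step, assuming the previous snapshot of VM names is kept in a local JSON file (the file name and the HttpTrigger URL below are placeholders, not from the original answer):

import json
import os
import requests

STATE_FILE = 'vm_names.json'  # hypothetical local store; Blob or Table storage also works
HTTP_TRIGGER_URL = 'https://<function-app>.azurewebsites.net/api/HttpTrigger'  # placeholder

def report_new_vms(current_names):
    # Load the VM names recorded on the previous polling run (empty on the first run).
    previous = set()
    if os.path.exists(STATE_FILE):
        with open(STATE_FILE) as f:
            previous = set(json.load(f))
    # Any name not seen before is treated as a newly created VM.
    for name in sorted(set(current_names) - previous):
        requests.post(HTTP_TRIGGER_URL, json={'vm_name': name})
    # Persist the current snapshot for the next run.
    with open(STATE_FILE, 'w') as f:
        json.dump(sorted(current_names), f)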
If you use an Azure Functions 'Time Trigger' instead of a self-designed polling loop, you can store the client ID, tenant ID, client secret, and subscription ID in Key Vault and have your function app's configuration settings reference the Key Vault; this keeps the secrets safe.
The code above is based on an AAD bearer token, so you need to create an AAD app registration and give it the 'Owner' RBAC role on the subscription.
This works like a 'custom trigger' fired whenever a VM is created in your subscription. And since you presumably won't have many VMs, it won't consume much computing resource.

Related

Flask web app on Cloud Run - google.auth.exceptions.DefaultCredentialsError:

I'm hosting a Flask web app on Cloud Run. I'm also using Secret Manager to store Service Account keys (I previously downloaded a JSON file with the keys).
In my code, I'm accessing the payload and then using os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = payload to authenticate. When I deploy the app and try to visit the page, I get an Internal Server Error. Reviewing the logs, I see:
File "/usr/local/lib/python3.10/site-packages/google/auth/_default.py", line 121, in load_credentials_from_file
raise exceptions.DefaultCredentialsError(
google.auth.exceptions.DefaultCredentialsError: File {"
I can access the secret through gcloud just fine with: gcloud secrets versions access 1 --secret="<secret_id>" while acting as the Service Account.
Here is my Python code:
import os
from flask import Flask
from google.cloud import secretmanager

app = Flask(__name__)

# Grabbing keys from Secret Manager
def access_secret_version():
    # Create the Secret Manager client.
    client = secretmanager.SecretManagerServiceClient()
    # Build the resource name of the secret version.
    # (project_id and secret_id are defined elsewhere.)
    name = f"projects/{project_id}/secrets/{secret_id}/versions/1"
    # Access the secret version.
    response = client.access_secret_version(request={"name": name})
    payload = response.payload.data.decode("UTF-8")
    return payload

@app.route('/page/page_two')
def some_random_func():
    # New way
    payload = access_secret_version()  # <---- calling the payload
    os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = payload
    # Old way
    # os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "service-account-keys.json"
I'm not technically accessing a JSON file like I was before; the payload variable is storing the entire key. Is this why it's not working?
Your approach is incorrect.
When you run on a Google compute service like Cloud Run, the code runs under the identity of the compute service.
In this case, by default, Cloud Run uses the Compute Engine default service account but, it's good practice to create a Service Account for your service and specify it when you deploy it to Cloud Run (see Service accounts).
This mechanism is one of the "legs" of Application Default Credentials. When your code is running on Google Cloud, you don't specify the environment variable (you also don't need to create a key), and the Cloud Run service acquires the credentials from the metadata service:
import google.auth
credentials, project_id = google.auth.default()
See google.auth package
It is bad practice to set an environment variable within code; by their nature, environment variables should be provided by the environment. Doing this with GOOGLE_APPLICATION_CREDENTIALS means that your code always sets this value, when it should only do so when the code is running off Google Cloud.
For completeness, if you need to create Credentials from a JSON string rather than from a file containing a JSON string, you can use from_service_account_info (see google.oauth2.service_account).
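A minimal sketch of that approach, assuming payload holds the service account key as a JSON string (as fetched from Secret Manager in the question):

import json
from google.oauth2 import service_account

# payload is the service account key JSON fetched from Secret Manager.
info = json.loads(payload)
credentials = service_account.Credentials.from_service_account_info(info)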

Calling Google App Engine (IAP Enabled) from Google Cloud Function within the same Project

Context:
A Node server in Google App Engine (GAE) that effectively houses a backend for a frontend that is also served by the same App Engine instance
Hence IAP is enabled (for selected web app users only)
It has various endpoints for the frontend to call via reverse proxy (as I understand it's called)
A Google Cloud Function (GCF) within the same project that (funny enough) is actually called by the Node server; the function then needs to call an endpoint within the GAE Node server.
....k wait, I might've just found another way to solve the problem, but I'll get to that at the end.
I created a VPC Connector for GCF to access a VM instance that I created to talk to external networks. GAE (Flex) is able to do so natively. Not sure if this is relevant but wanted to throw it in the mix.
Short term solution:
Since I need to call the GCF from the GAE node server first, I can just provide it with the relevant data as needed.
Long term solution:
Ideally, the GCF should be called by any other services that might or might not have the data, so it would be ideal to have the GCF call out the GAE endpoint to get the data.
So far:
import urllib.request
import google.auth.transport.requests
import google.oauth2.id_token

req = urllib.request.Request('https://the-gcp-project-id.appspot.com/api/theEndpoint')
auth_req = google.auth.transport.requests.Request()
id_token = google.oauth2.id_token.fetch_id_token(auth_req, 'https://appengine.googleapis.com')
log.info("Authorization: " + f"Bearer {id_token}")
# req.add_header("Authorization", f"Bearer {id_token}")
# response = urllib.request.urlopen(req)
# # return response.read()
# log.info(response.read())

import requests as reqs
response = reqs.post('https://the-gcp-project-id.appspot.com/theEndpoint', json={'test': '123'}, headers={"Authorization": f"Bearer {id_token}"})
log.info(response)
This doesn't seem to actually trigger the endpoint, though. As far as I know, the Cloud Function's service account should have the same permissions as the App Engine service account.
Can anyone point me in the right direction on this?
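For reference, when calling an IAP-protected App Engine app, the ID token audience generally needs to be the OAuth client ID that IAP was configured with, not an API URL like 'https://appengine.googleapis.com'. A minimal sketch of that variant, with the client ID as a placeholder:

import requests
import google.auth.transport.requests
import google.oauth2.id_token

# Placeholder: the OAuth client ID configured for IAP on the App Engine app.
IAP_CLIENT_ID = '123456789-xxxx.apps.googleusercontent.com'

auth_req = google.auth.transport.requests.Request()
token = google.oauth2.id_token.fetch_id_token(auth_req, IAP_CLIENT_ID)
response = requests.post(
    'https://the-gcp-project-id.appspot.com/api/theEndpoint',
    json={'test': '123'},
    headers={'Authorization': f'Bearer {token}'})
print(response.status_code)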

Invoking Cloud Function/Cloud Run with default credentials

I need to call other Cloud Functions/Cloud Run services from my Cloud Function. I would like the calls to be authenticated if possible, so I have been looking into how to create an AuthorizedSession using the credentials I get back from google.auth.default. My current code looks like this:
credentials, _ = google.auth.default(scopes=[
    SERVICE_A_URL,
    SERVICE_B_URL,
    SERVICE_C_URL,
])
return AuthorizedSession(credentials)
When running this I get the following:
google.auth.exceptions.RefreshError: ('No access token in response.', {'id_token': '[ID_TOKEN]'})
Does anyone know how to get the AuthorizedSession to accept my credentials?
AuthorizedSession works well with Google Cloud APIs, which expect an access token. In the case of Cloud Functions and Cloud Run (and also App Engine behind IAP), you need to provide an identity token instead.
So you can't use an authorized session; you need to generate an ID token with the correct audience and add it to the Authorization header of your request.
import google.auth
from google.auth.transport import requests as transport_requests
import requests

credentials, project_id = google.auth.default(scopes=["https://www.googleapis.com/auth/cloud-platform"])
# with_target_audience returns a new credentials object; keep the result.
credentials = credentials.with_target_audience("SERVICE_A_URL")
credentials.refresh(transport_requests.Request())
print(credentials.token)

requests.get("SERVICE_A_URL+Parameters", headers={'Authorization': 'Bearer {}'.format(credentials.token)})
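A shorter variant that works in most Google-managed runtimes is the fetch_id_token helper, which mints an ID token for a given audience directly (a sketch, with "SERVICE_A_URL" as the placeholder audience):

import requests
import google.auth.transport.requests
from google.oauth2 import id_token

auth_req = google.auth.transport.requests.Request()
token = id_token.fetch_id_token(auth_req, "SERVICE_A_URL")  # audience = the target service URL

requests.get("SERVICE_A_URL", headers={"Authorization": f"Bearer {token}"})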

Using Azure Key Vault and Active Directory to Retrieve Secrets

For a Python code base, I would like developers to access application secrets using Azure Key Vault, with the idea that the application should also be able to connect when we deploy. Hence, I'm thinking Active Directory.
However, I cannot find any examples on the interweb that show this with the Python SDK. Initially, I would think to retrieve the CLI user:
from azure.common.credentials import get_azure_cli_credentials
credentials, subscription_id, tenant_id = get_azure_cli_credentials(with_tenant=True)
and then use this retrieved set of credentials to access the key vault:
from azure.keyvault import KeyVaultClient
vault_url = "https://########.vault.azure.net/"
secret_name = "########"
secret_version = "########"
client = KeyVaultClient(credentials)
secret = client.get_secret(vault_url, secret_name, secret_version)
print(secret)
However, I retrieve an error that:
azure.keyvault.v7_0.models.key_vault_error_py3.KeyVaultErrorException: Operation returned an invalid status code 'Unauthorized'
I can confirm that credentials, subscription_id and tenant_id are correct, and that using the CLI I can successfully retrieve the secret content. So it must be some Python SDK-specific thing.
Any ideas?
It looks like this is a bug in the Python SDK.
https://github.com/Azure/azure-sdk-for-python/issues/5096
You can use your own AD username and password with the UserPassCredentials class. It's not the logged-in user, but it's probably as close as you'll get for now.
EG:
from azure.common.credentials import UserPassCredentials
credentials = UserPassCredentials('username','password')
client = KeyVaultClient(credentials)
secret = client.get_secret(vault_url, secret_name, secret_version)
print(secret)
I tried the same thing and had a different error ("...audience is invalid...") until I changed your first function call adding the resource parameter:
credentials, subscription_id, tenant_id = get_azure_cli_credentials(
    resource='https://vault.azure.net', with_tenant=True)
With this change I was able to access secrets using the same code you show.
What about this code snippet? Comparing your code to the example, I don't see where you're setting the client_id or the tenant.
You’ll want to set the access policy for the key vault to allow the authenticated user to access secrets. This can be done in the portal. Bear in mind that key vault has an upper limit of 16 access definitions, so you’ll probably want to grant access to a group and add your users to that group.
As @8forty pointed out, adding a resource='https://vault.azure.net' parameter to your get_azure_cli_credentials call will resolve the issue.
However, there are new packages for working with Key Vault in Python that replace azure-keyvault:
azure-keyvault-certificates (Migration guide)
azure-keyvault-keys (Migration guide)
azure-keyvault-secrets (Migration guide)
azure-identity is also the package that should be used with these for authentication.
If you want to authenticate your Key Vault client with the credentials of the logged in CLI user, you can use the AzureCliCredential class:
from azure.identity import AzureCliCredential
from azure.keyvault.secrets import SecretClient
credential = AzureCliCredential()
vault_url = "https://{vault-name}.vault.azure.net"
secret_name = "secret-name"
client = SecretClient(vault_url, credential)
secret = client.get_secret(secret_name)
print(secret.value)
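Since the question also wants the deployed application to authenticate with the same code, a hedged variant is DefaultAzureCredential, which tries environment variables, managed identity, and the shared CLI login in turn (vault URL and secret name are placeholders as above):

from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# DefaultAzureCredential falls back through several mechanisms:
# environment variables, managed identity, Azure CLI credentials, etc.
credential = DefaultAzureCredential()
client = SecretClient("https://{vault-name}.vault.azure.net", credential)
print(client.get_secret("secret-name").value)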
(I work on the Azure SDK in Python)

Google Admin Directory API - Send a query via apiclient

I am retrieving a ChromeOS device MAC address via the Google Admin Directory API, using the device's Serial Number as reference, and am making my calls through apiclient.
service = discovery.build('admin', 'directory_v1', developerKey=settings.API_KEY)
Here are the calls available for ChromeOS devices; my issue is that I require a Device ID in order to execute the following:
service.chromeosdevices().get(customerId=settings.CID, deviceId=obtained_id, projection=None).execute()
I can send a GET query via the following format:
https://www.googleapis.com/admin/directory/v1/customer/my_customer/devices/chromeos?projection=full&query=id:" + serial + "&orderBy=status&sortOrder=ascending&maxResults=10", "GET")
... but I'm trying to avoid using OAuth2 and just use my API key. Passing the key in a GET request doesn't work either, as it still returns a "Login Required" notice.
How do I squeeze the above query into an apiclient-friendly format? The only option I found via the above calls was to request every device we have (via list), then sift through the mountain of data for the matching Serial number, which seems silly and excessive.
I did notice I could call apiclient.http.HttpRequests, but I couldn't find a way to pass the API key through it either. There's new_batch_http_request, but I can't discern from the docs how to simply pass a URL to it.
Thank you!
Got it!
You can't use just an API key for Directory API queries; you need a Service Account.
I'm using google-auth (see here) since oauth2client is deprecated.
You also need to:
Delegate the necessary permissions for your service account (mine has the role of Viewer and has scope access to https://www.googleapis.com/auth/admin.directory.device.chromeos.readonly)
Delegate API access to it separately in the Admin Console (Security -> Advanced Settings -> Authentication)
Get your json client secret key and place it with your app (don't include it in your VCS)
Obtain your credentials like this:
from google.oauth2 import service_account

credentials = service_account.Credentials.from_service_account_file(
    settings.CLIENT_KEY,
    scopes=settings.SCOPES,
    subject=settings.ADMIN_USER)
where ADMIN_USER is the email address of an authorized Domain admin.
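The AuthorizedSession import and the request_id_url variable used below aren't defined in the original snippet; here is a hedged reconstruction based on the query URL from the question (the serial value is a placeholder):

from google.auth.transport.requests import AuthorizedSession

serial = 'SERIAL_NUMBER'  # placeholder: the device serial number to look up
request_id_url = (
    'https://www.googleapis.com/admin/directory/v1/customer/my_customer/'
    'devices/chromeos?projection=full&query=id:' + serial +
    '&orderBy=status&sortOrder=ascending&maxResults=10')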
Then you send a GET request like so:
authed_session = AuthorizedSession(credentials)
response = authed_session.get(request_id_url)
This returns a requests Response object you can read via response.content.
Hope it helps someone else!
