Is it possible to use Azure Shared Access Signature outside of Azure? - python

Since Azure's Shared Access Signature (SAS) is an open standard, I want to use it in my Django application, which has no links to Azure whatsoever. I basically want to create a piece of code that creates read permission for the next 24 hours on a specific URL I'm serving. So I've watched some videos on SAS and I installed the Python library for it (pip install azure-storage-blob).
I read over the README here on GitHub, but as far as I can see it always requires an Azure account. Is it also possible to use SAS in my own (Python) application? I imagine it creates the hashes based on a pre-defined secret key. If this is possible, does anybody have example code on how to create the URL and how to validate it? Preferably in Python, but example code in other languages would be welcome as well.

While the original blob storage SAS generation code exists here, I find the simplified code below more useful for your general purpose (inspired by this sample); adjust it as needed. It is the client-side SAS generation logic (an HMAC-SHA256 digest) using a secret key. Use similar logic on the server side to re-generate the signature from the extracted URL params (sr, se) and compare it with the signature (sig) passed from the client. Note that the shared secret key on both the client and the server side is the main driver here.
import time
import urllib.parse
import hmac
import hashlib
import base64

def get_auth_token(url_base, resource, sas_name, sas_secret):
    """
    Returns an authorization token dictionary
    for making calls to Event Hubs REST API.
    """
    uri = urllib.parse.quote_plus("https://{}.something.com/{}"
                                  .format(url_base, resource))
    sas = sas_secret.encode('utf-8')
    # token is valid for the next 10000 seconds (~2.8 hours); adjust as needed
    expiry = str(int(time.time() + 10000))
    string_to_sign = (uri + '\n' + expiry).encode('utf-8')
    signed_hmac_sha256 = hmac.HMAC(sas, string_to_sign, hashlib.sha256)
    signature = urllib.parse.quote(base64.b64encode(signed_hmac_sha256.digest()))
    return {"url_base": url_base,
            "resource": resource,
            "token": 'SharedAccessSignature sr={}&sig={}&se={}&skn={}'
                     .format(uri, signature, expiry, sas_name)}
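And a minimal server-side validation sketch under the same assumptions (the helper name is mine, and depending on how your framework decodes query parameters you may need to re-quote the extracted sr and sig before comparing):
import time
import urllib.parse
import hmac
import hashlib
import base64

def is_valid_token(sr, sig, se, sas_secret):
    """Re-compute the signature from sr and se, then compare it to sig."""
    # reject expired tokens first
    if int(se) < int(time.time()):
        return False
    string_to_sign = (sr + '\n' + se).encode('utf-8')
    digest = hmac.HMAC(sas_secret.encode('utf-8'),
                       string_to_sign, hashlib.sha256).digest()
    expected = urllib.parse.quote(base64.b64encode(digest))
    # constant-time comparison to avoid leaking timing information
    return hmac.compare_digest(expected, sig)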

Related

How to pull values from TSI using session ID and tag name

I am trying to pull values from Time Series Insights using the session ID, environment name and tag name, using Python.
The steps that I have figured out are as follows.
I was able to get the session using the access token, with which I am able to reach the TSI environment that has the tag values I want to pull.
To get the session using the access token:
import requests

headers = {"Authorization": auth_token}
tsi_session = requests.Session()
tsi_session.params = params  # params is defined elsewhere
tsi_session.headers.update(headers)
To use the session and access the TSI environment:
tsi_environment = tsi_api_wrapper.get_tsi_environment(session=tsi_session, environment_name="some_name")
print(tsi_environment)
I was able to get the environment properties.
What would be the next step to get the values of a particular tag, without using a client ID and client secret but only the above-mentioned inputs?
Any help would be much appreciated.
To call Azure TSI's REST APIs you will always need to provide an Azure AD JWT token, and the identity retrieving the token will always first be required to authenticate. Thus, there will always need to be some sort of secret, whether it's a user's password, client secret, certificate, etc.
I see you got an auth token, nice. Is the object ID of the token the app's ID? I assume you're looking for samples on how to have your web app facilitate an interactive user login, and that the app will then call TSI as a downstream API? I believe you'll need to find the Python equivalent of this sample. Note that your questions are more about obtaining auth tokens than about TSI specifics, so you might consider tagging "azure-active-directory" instead.
Is this list of users fixed, or would it be dynamically changing? If it's dynamic, that may be problematic, since the object ID within the token must have a role assignment for the TSI environment. In that case, you can instead have the users log into the app, and then the app itself can turn around and call the TSI APIs as a service principal, as sketched below.
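A minimal sketch of that service-principal (client-credentials) flow, assuming the msal package; all placeholder values in angle brackets are assumptions to replace:
import msal

app = msal.ConfidentialClientApplication(
    client_id="<app-registration-id>",
    client_credential="<client-secret>",
    authority="https://login.microsoftonline.com/<tenant-id>",
)

# scope for the TSI data-plane API (note the double slash before .default)
result = app.acquire_token_for_client(
    scopes=["https://api.timeseries.azure.com//.default"])

# use this header on a requests.Session as in the question above
headers = {"Authorization": "Bearer " + result["access_token"]}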
I found this post which seems applicable to your situation https://towardsdatascience.com/how-to-secure-python-flask-web-apis-with-azure-ad-14b46b8abf22

Azure Pipelines - Use System.AccessToken within a Python Script

I am working on a pipeline where the majority of the code is within a Python script that I call in the pipeline. In the script I would like to use the predefined variable System.AccessToken to make a call to the DevOps API that sets the status of a pull request.
However, when I try to get the token using os.environ['System.AccessToken'], I get a KeyError.
Oddly though, it seems that System.AccessToken is set, because in the yaml file for the pipeline I am able to access the API like:
curl -u ":$(System.AccessToken)" URL
and get back a valid response. Is there something additional I need to do in Python to access this variable?
After reviewing the page that Mani posted, I found the answer. For most variables, something like System.AccessToken would have a corresponding SYSTEM_ACCESSTOKEN.
However, with a secret variable this is not the case. I was able to make it accessible to my Python script by adding:
env:
  SYSTEM_ACCESSTOKEN: $(System.AccessToken)
to where the Python script is called in the pipeline's yaml file.
See https://learn.microsoft.com/en-us/azure/devops/pipelines/process/variables?view=azure-devops&tabs=yaml%2Cbatch#secret-variables for more details.
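With that mapping in place, a minimal sketch of reading the token in Python and setting a pull request status, using the same empty-user basic auth as the curl example; the organization, project, repository, pull request ID and api-version below are placeholder assumptions to adapt:
import os

import requests

# populated only because of the env: mapping in the pipeline yaml
token = os.environ["SYSTEM_ACCESSTOKEN"]

url = ("https://dev.azure.com/myOrg/myProject/_apis/git/repositories/"
       "myRepo/pullRequests/123/statuses?api-version=6.0-preview.1")
body = {
    "state": "succeeded",
    "description": "Checks passed",
    "context": {"name": "my-check", "genre": "pipeline"},
}

# empty username, token as password, mirroring: curl -u ":$(System.AccessToken)"
response = requests.post(url, json=body, auth=("", token))
print(response.status_code)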
It can work with this documentation: https://learn.microsoft.com/de-de/azure/developer/python/azure-sdk-authenticate?tabs=cmd (just switch the page language to English).
There must be a Key Vault with the secret (i.e. the SAS token) already present.
And I have to say, your code above is curl, not Python.
import os
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient
# Acquire the resource URL
vault_url = os.environ["KEY_VAULT_URL"]
# Acquire a credential object
credential = DefaultAzureCredential()
# Acquire a client object
secret_client = SecretClient(vault_url=vault_url, credential=credential)
# Attempt to perform an operation
retrieved_secret = secret_client.get_secret("secret-name-01")
Save the file as test.py, change the fields to your vault and secret name, and run it.
If you need the token outside, note that each environment has its own namespace.
So either add it to the local context with export ..., or follow the Unix policy that "everything is a file" and write it to a file.
Good practice here is to use ansible-vault or something similar:
store it encrypted, read it from the file, and use it when you need it.
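For example, continuing from retrieved_secret in the snippet above (the environment variable name and file path are just illustrations):
import os

# option 1: export into this process's environment for child processes
os.environ["SAS_TOKEN"] = retrieved_secret.value

# option 2: "everything is a file" - write it out for other tools to read
with open("sas_token.txt", "w") as f:
    f.write(retrieved_secret.value)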
Can you use os.environ['SYSTEM_ACCESSTOKEN']? As mentioned in https://learn.microsoft.com/en-us/azure/devops/pipelines/process/variables?view=azure-devops&tabs=yaml%2Cbatch#environment-variables, the case/format of the environment variables is different.

How to create Azure Container SAS from Stored Access Policy in Python

I am trying to create a SAS with Python code and use it to create a ContainerClient with only "write" permissions, to upload data to one specific container.
I tried several code snippets but couldn't find anything suitable, only the following code:
from azure.storage.blob import generate_container_sas
generate_container_sas(account_name="<yourAccountName>",
                       container_name="<yourContainerName>",
                       account_key="<yourAccountKey>",
                       policy_id='<yourPolicyId>')
Problem: I don't want to give the user an account_key or delegation_key, but at least one of those is required for this function.
Is there a way to create a SAS only using account_name, container_name and policy_id?
Thank you for the help!
The answer is no.
The account_name and account_key are needed when creating a SAS token. The account_key is used to authenticate; without it, the request to storage will be denied.
A stored access policy, by contrast, is only used to set the start time, expiry time, or permissions (like read/write) of a SAS token. It has nothing to do with authentication.
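That said, the usual pattern may still give you what you want: a minimal sketch where the account key is used once, server-side, to mint a write-only SAS, and the user only ever receives the resulting token (placeholders in angle brackets are assumptions):
from datetime import datetime, timedelta

from azure.storage.blob import (ContainerClient, ContainerSasPermissions,
                                generate_container_sas)

# server side: the account key never leaves here
sas_token = generate_container_sas(
    account_name="<yourAccountName>",
    container_name="<yourContainerName>",
    account_key="<yourAccountKey>",
    permission=ContainerSasPermissions(write=True),
    expiry=datetime.utcnow() + timedelta(hours=24),
)

# user side: only the SAS token is handed over
container_client = ContainerClient(
    account_url="https://<yourAccountName>.blob.core.windows.net",
    container_name="<yourContainerName>",
    credential=sas_token,
)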

How to add custom header when creating bucket using Boto3?

I can create a bucket using these parameters, but none of them is a custom header. It's also said that boto3 will not support it, because S3 does not currently allow setting arbitrary headers on buckets or objects.
But in my case I am using Cloudian as storage, which supports x-gmt-policyid; this policy determines how data in the bucket will be distributed and protected through either replication or erasure coding.
Any idea how to inject a custom header into boto3's bucket creation?
s3_resource.create_bucket(Bucket='foo-1')
My last two options:
1) fork botocore and add this functionality, but I saw they use loaders.py, which reads everything from a JSON file, and it seems a bit complicated for a beginner;
2) or maybe use a pure-Python implementation with the requests module to create the S3 bucket.
Thanks for suggestions.
My current solution is to call the S3-compatible Cloudian API directly. Signing the request is very complicated, so I use the help of the requests-aws4auth library. I tried other libs but failed.
Example to create a bucket with the Cloudian x-gmt-policyid value:
import requests
from requests_aws4auth import AWS4Auth

endpoint = "http://awesome-bucket.my-s3.net"
auth = AWS4Auth(
    "00ac60d1a669fakekey",
    "S2/x9sRvb1Jys9n+fakekey",
    "eu-west-1",
    "s3",
)
headers = {
    "x-gmt-policyid": "9f934425b7f5de611c32fakeid",
    "x-amz-acl": "public-read",
}
response = requests.put(endpoint, auth=auth, headers=headers)
print(response.text)
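Alternatively, it may be possible to stay on boto3 by injecting the header through botocore's event system; the event name and handler signature below reflect my understanding of that system, and the endpoint and policy ID reuse the fake values above:
import boto3

def add_policy_header(request, **kwargs):
    # runs before signing, so the header is covered by the SigV4 signature
    request.headers["x-gmt-policyid"] = "9f934425b7f5de611c32fakeid"

s3 = boto3.client("s3", endpoint_url="http://my-s3.net",
                  region_name="eu-west-1")
s3.meta.events.register("before-sign.s3.CreateBucket", add_policy_header)
s3.create_bucket(Bucket="awesome-bucket")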

Cloudformation wildcard search with boto3

I have been tasked with converting some bash scripting used by my team that performs various CloudFormation tasks into Python using the boto3 library. I am currently stuck on one item: I cannot seem to determine how to do a wildcard-type search where a CloudFormation stack name contains a string.
My bash version using the AWS CLI is as follows:
aws cloudformation --region us-east-1 describe-stacks --query "Stacks[?contains(StackName,'myString')].StackName" --output json > stacks.out
This works on the CLI, outputting the results to a JSON file, but I cannot find any examples online of a similar "contains" search using boto3 with Python. Is it possible?
Thanks!
Yes, it is possible. Note that the CLI's --query option is a client-side JMESPath filter, not a server-side one, so the boto3 equivalent is to list the stacks and filter them in Python:
import boto3

# create a boto3 client first
cloudformation = boto3.client('cloudformation', region_name='us-east-1')

# describe_stacks(StackName=...) only accepts an exact stack name, so
# paginate over all stacks and keep the names that contain the string
paginator = cloudformation.get_paginator('describe_stacks')
stack_names = [stack['StackName']
               for page in paginator.paginate()
               for stack in page['Stacks']
               if 'myString' in stack['StackName']]
print(stack_names)

# as an aside, you'd need a different client to communicate
# with a different service
# ec2 = boto3.client('ec2', region_name='us-east-1')
# regions = ec2.describe_regions()
where stack_names is a Python list of the stack names containing "myString", ready to be written out to stacks.out as JSON.
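As a closer mirror of the CLI, the same JMESPath expression can also be run in Python with the jmespath package (a dependency boto3 already pulls in):
import json

import boto3
import jmespath

cloudformation = boto3.client('cloudformation', region_name='us-east-1')
# note: for accounts with many stacks, paginate as in the example above
response = cloudformation.describe_stacks()

# identical expression to the --query used on the CLI
names = jmespath.search("Stacks[?contains(StackName,'myString')].StackName",
                        response)

# write the result to stacks.out like the bash version did
with open('stacks.out', 'w') as f:
    json.dump(names, f)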
