I'm trying to find out how to create a container object that has SAS permissions applied to it without supplying the account name or account key. I've done this in C#, but I'm looking to do the same in Python.
sasToken = "https://samplestoragehotblob.blob.core.windows.net/samplecontainer?sv=2016-05-31&sr=c&sig=dfdLKJ.....kljsdflkjljsd=3027-09-11T17%3A16%3A57Z&sp=racwdl";
CloudBlobContainer cbContainer = new CloudBlobContainer(new Uri(sasToken));
I can then work in the container with all the necessary permissions without specifying an account and key. Is this possible in Python?
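For context, the C# constructor works because the SAS token travels entirely in the URL's query string; there is no separate credential object. A quick stdlib sketch of how such a URL decomposes (all values below are made up):

```python
from urllib.parse import urlsplit, parse_qs

# A container-level SAS URL (hypothetical values)
sas_url = ("https://samplestoragehotblob.blob.core.windows.net/samplecontainer"
           "?sv=2016-05-31&sr=c&sp=racwdl&se=2027-09-11T17%3A16%3A57Z&sig=abc123")

parts = urlsplit(sas_url)
params = parse_qs(parts.query)

print(parts.netloc)     # the storage endpoint
print(parts.path)       # /samplecontainer
print(params["sr"][0])  # 'c' -> the SAS is scoped to a container
print(params["sp"][0])  # 'racwdl' -> read/add/create/write/delete/list permissions
```

Everything a client needs, including the scope (`sr`) and the granted permissions (`sp`), rides along in the query string, which is why no account key is required.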
I found a way to avoid using the account key; this was an acceptable compromise.
from azure.storage.blob import BlockBlobService

def AccessTest():
    accountName = "Account Name"
    containerName = "Container Name"
    sasToken = "sv=2016-05-31&sr=c&sig=BhhYbf3............................-10-02T15%3A28%3A59Z&sp=racwdl"
    # account_key=None makes the client authenticate with the SAS token instead
    blobService = BlockBlobService(account_name=accountName, account_key=None, sas_token=sasToken)
    for blob in blobService.list_blobs(containerName):
        print(blob.name)
To summarize: if you haven't generated a SAS token yet, you can't avoid using the account key.
You can follow the official tutorial and use generate_shared_access_signature, generate_container_shared_access_signature, or generate_blob_shared_access_signature to generate a SAS token for an Azure storage account, container, or blob in Python.
If you have already generated a SAS token, you can operate on your container with the SAS token instead of the account key.
Code Snippet:
from datetime import datetime, timedelta
from azure.storage.blob import (
    BlockBlobService,
    ContainerPermissions,
)

accountName = "***"
accountKey = "***"
containerName = "***"

def GenerateSasToken():
    blobService = BlockBlobService(account_name=accountName, account_key=accountKey)
    sas_token = blobService.generate_container_shared_access_signature(
        containerName, ContainerPermissions.READ, datetime.utcnow() + timedelta(hours=1))
    # The full URL is useful for sharing; the bare token is what the client needs.
    print('https://' + accountName + '.blob.core.windows.net/' + containerName + '?' + sas_token)
    return sas_token

def AccessTest(sastoken):
    blobService = BlockBlobService(account_name=accountName, account_key=None, sas_token=sastoken)
    for blob in blobService.list_blobs(containerName):
        print(blob.name)

sastoken = GenerateSasToken()
AccessTest(sastoken)
In addition, you could try Azure Key Vault:

The Azure Storage Account (ASA) key feature manages secret rotation for you. It also removes the need for your direct contact with an ASA key by offering Shared Access Signatures (SAS) as a method.

Please refer to the official Azure Key Vault tutorial; it also supports a REST API.
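If the SAS token is stored as a Key Vault secret, the REST API makes retrieval a single authenticated GET. A minimal stdlib sketch (the vault name, secret name, and bearer-token acquisition are placeholders, and the api-version shown is one recent stable version, not the only valid one):

```python
import json
import urllib.request

def secret_url(vault_name: str, secret_name: str, api_version: str = "7.4") -> str:
    """Build the Key Vault REST URL for reading the latest version of a secret."""
    return (f"https://{vault_name}.vault.azure.net"
            f"/secrets/{secret_name}?api-version={api_version}")

def fetch_sas_token(vault_name: str, secret_name: str, bearer_token: str) -> str:
    """GET a secret value (e.g. a stored SAS token) from Key Vault."""
    req = urllib.request.Request(
        secret_url(vault_name, secret_name),
        headers={"Authorization": f"Bearer {bearer_token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["value"]

print(secret_url("myvault", "container-sas"))
# https://myvault.vault.azure.net/secrets/container-sas?api-version=7.4
```

The bearer token itself would come from Azure AD (for example via azure-identity's credential classes), so application code never touches the storage account key directly.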
I'm trying to create a blob container within an Azure storage account with Azure's Python API.
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

def create_storage_container(storageAccountName: str, containerName: str):
    print(
        f"Creating storage container '{containerName}'",
        f"in storage account '{storageAccountName}'"
    )
    credentials = DefaultAzureCredential()
    url = f"https://{storageAccountName}.blob.core.windows.net"
    blobClient = BlobServiceClient(account_url=url, credential=credentials)
    containerClient = blobClient.get_container_client(containerName)
    containerClient.create_container()
On create_container() I get the error:
Exception has occurred: HttpResponseError
This request is not authorized to perform this operation.
RequestId:8a3f8af1-101e-0075-3351-074949000000
Time:2022-12-03T20:00:25.5236364Z
ErrorCode:AuthorizationFailure
Content: <?xml version="1.0" encoding="utf-8"?><Error><Code>AuthorizationFailure</Code><Message>This request is not authorized to perform this operation.
RequestId:8a3f8af1-101e-0075-3351-074949000000
Time:2022-12-03T20:00:25.5236364Z</Message></Error>
The storage account was created like so:
from azure.identity import AzureCliCredential
from azure.mgmt.storage import StorageManagementClient

# Creates a storage account if it does not already exist.
# Returns the name of the storage account.
def create_storage_account(
    resourceGroupName: str, location: str,
    subscriptionId: str, storageAccountName: str
):
    credentials = AzureCliCredential()
    # Why does this have creation powers for storage accounts
    # instead of the ResourceManagementClient?
    storageClient = StorageManagementClient(
        credentials, subscriptionId, "2018-02-01"
    )
    params = {
        "sku": {"name": "Standard_LRS", "tier": "Standard"},
        "kind": "StorageV2",
        "location": location,
        "supportsHttpsTrafficOnly": True,
    }
    result = storageClient.storage_accounts.begin_create(
        resourceGroupName, storageAccountName, params
    )  # type: ignore
    storageAccount = result.result(120)
    print(f"Done creating storage account with name: {storageAccount.name}")
    return storageAccount.name
The storage accounts that are generated like this seem to have completely open network access, so I wouldn't think that would be an issue.
Storage account network settings: (screenshot omitted)
How can I fix this error or create a storage container in another way programmatically?
Thanks
I tried this in my environment and got the same error.
To access data in a storage account, your identity needs a role like Storage Blob Data Contributor or Storage Blob Data Owner.
Go to portal -> Storage accounts -> Access Control (IAM) -> Add -> Add role assignment -> Storage Blob Data Contributor or Storage Blob Data Owner.
After assigning the role on my storage account, I executed the same code and it successfully created the container.
Code:
from azure.storage.blob import BlobServiceClient
from azure.identity import DefaultAzureCredential

storageAccountName = "venkat123"
containerName = "test"

def create_storage_container():
    print(
        f"Creating storage container '{containerName}'",
        f"in storage account '{storageAccountName}'"
    )
    credentials = DefaultAzureCredential()
    url = f"https://{storageAccountName}.blob.core.windows.net"
    blobClient = BlobServiceClient(account_url=url, credential=credentials)
    containerClient = blobClient.get_container_client(containerName)
    containerClient.create_container()
    print("Container created")

create_storage_container()
Check the RBAC roles your user is assigned for the storage account. The default ones don't always let you view data, and it sounds like that's what's causing your problem.
I am new to Python, but I need to invoke the Power BI REST API with Python to publish the .pbix file in my repo to a workspace.
Based on this document, I could successfully authenticate and get the workspace:
import json, requests, pandas as pd

try:
    from azure.identity import ClientSecretCredential
except Exception:
    !pip install azure.identity  # notebook magic; use pip directly outside notebooks
    from azure.identity import ClientSecretCredential

# --------------------------------------------------------------------------------------#
# String variables: Replace with your own
tenant = 'Your-Tenant-ID'
client = 'Your-App-Client-ID'
client_secret = 'Your-Client-Secret-Value'  # See Note 2: Better to use key vault
api = 'https://analysis.windows.net/powerbi/api/.default'

# --------------------------------------------------------------------------------------#
# Generates the access token for the Service Principal
auth = ClientSecretCredential(authority='https://login.microsoftonline.com/',
                              tenant_id=tenant,
                              client_id=client,
                              client_secret=client_secret)
access_token = auth.get_token(api)
access_token = access_token.token
print('\nSuccessfully authenticated.')
But I do not know how to publish my .pbix file to one of my workspaces using the REST API with Python, and, if the .pbix already exists in the workspace, how to pass a parameter to overwrite it.
Any advice would be greatly appreciated, and a sample would be great.
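The publish step maps to the Power BI REST API's "Post Import In Group" endpoint, which takes the .pbix as a multipart upload; its nameConflict=CreateOrOverwrite query parameter covers the overwrite case. A minimal stdlib sketch reusing the access_token from above (the workspace/group id, dataset name, and file path are placeholders, and this is an untested outline rather than a definitive implementation):

```python
import json
import uuid
import urllib.request
from urllib.parse import quote

def build_import_url(group_id: str, dataset_name: str) -> str:
    """Imports endpoint; nameConflict=CreateOrOverwrite replaces an existing dataset."""
    return (f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}/imports"
            f"?datasetDisplayName={quote(dataset_name)}&nameConflict=CreateOrOverwrite")

def publish_pbix(group_id: str, dataset_name: str, pbix_path: str, access_token: str):
    """POST the .pbix file as multipart/form-data to the Imports endpoint."""
    boundary = uuid.uuid4().hex
    with open(pbix_path, "rb") as f:
        payload = f.read()
    body = (f"--{boundary}\r\n"
            f'Content-Disposition: form-data; name="file"; filename="report.pbix"\r\n'
            f"Content-Type: application/octet-stream\r\n\r\n").encode() \
           + payload + f"\r\n--{boundary}--\r\n".encode()
    req = urllib.request.Request(
        build_import_url(group_id, dataset_name), data=body, method="POST",
        headers={"Authorization": f"Bearer {access_token}",
                 "Content-Type": f"multipart/form-data; boundary={boundary}"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)  # contains the import id; poll .../imports/{id} for status
```

The same upload can be done more compactly with the requests library (files={"file": f}); the endpoint and nameConflict parameter are the same either way.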
I am getting this error when I try to list all my VMs on Azure through Python:
Code: AuthorizationFailed
Message: The client "XXXX" with object id "XXXX" does not have authorization to perform action 'Microsoft.Compute/virtualMachines/read' over scope '/subscriptions/XXXXX or the scope is invalid. If access was recently granted, please refresh your credentials.
my code is below:
from azure.mgmt.compute import ComputeManagementClient
from azure.identity import ClientSecretCredential
Subscription_Id = "XXXX"
Tenant_Id = "XXXXX"
Client_Id = "XXXXX"
Secret = "XXXXX"
credential = ClientSecretCredential(
client_id=Client_Id,
client_secret=Secret,
tenant_id=Tenant_Id
)
compute_client = ComputeManagementClient(credential, Subscription_Id)
vm_list = compute_client.virtual_machines.list_all()
pageobject1 = vm_list.by_page(continuation_token=None)
for page in pageobject1:
    for j in page:
        print(j)
When you assign a role such as Virtual Machine Contributor to your service principal, pass the service principal/app registration name instead of its applicationId/objectId, as shown below.
After granting the required access to the service principal/app registration, you will be able to pull the list of virtual machines in your subscription. We have checked the above Python code in our local environment, and it works fine.
Here is a sample output screenshot for reference:
Updated answer: to pull the list of VMs using the ResourceManagementClient:
from azure.mgmt.resource import ResourceManagementClient
from azure.identity import ClientSecretCredential
Subscription_Id = "<subId>"
Tenant_Id = "<tenantid>"
Client_Id = "<appId>"
Secret = "<clientSecret>"
credential = ClientSecretCredential(
client_id=Client_Id,
client_secret=Secret,
tenant_id=Tenant_Id
)
resource_client = ResourceManagementClient(credential=credential, subscription_id=Subscription_Id)
resource_list = resource_client.resources.list()
for item in resource_list:
    if item.type == 'Microsoft.Compute/virtualMachines':
        print(item)
I'm trying to generate a shared access signature link through Python for my files that are already in Blob Storage, but something goes wrong; I receive this message when I open the generated link in a web browser:
"Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature."
I'm generating the key by right-clicking my container name and choosing Get Shared Access Signature, but I can't get any further.
from datetime import datetime, timedelta
from azure.storage.blob import generate_blob_sas, AccountSasPermissions

def generate_link():
    account_name = 'my_account_name_storage'
    container_name = 'container_name'
    blob_name = 'file_name.xsl'
    account_key = '?sv=2019-12-12&ss=bfqt&srt=sco&sp=rwdlacupx&se=2020-09-17T05:49:57Z&st=2020-09-16T21:49:57Z&spr=https&sig=sdfsdhgbjgnbdkfnglfkdnhklfgnhklgf%30'
    url = f"https://{account_name}.blob.core.windows.net/{container_name}/{blob_name}"
    sas_token = generate_blob_sas(
        account_name=account_name,
        account_key=account_key,
        container_name=container_name,
        blob_name=blob_name,
        permission=AccountSasPermissions(read=True),
        expiry=datetime.utcnow() + timedelta(hours=1)
    )
    print(sas_token)
    url_with_sas = f"{url}?{sas_token}"
    print(url_with_sas)

generate_link()
The problem is the account_key in your code: you're passing a SAS token, not an account key.
To find the account_key of your storage account, navigate in the Azure portal to your storage account -> Settings -> Access keys, where you can see the account key, as in the screenshot below:
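A quick way to spot this mix-up: an account key is a standalone base64 string, while a SAS token is a URL query string built from parameters like sv, sp, and sig. A rough illustrative check (the heuristics here are my own assumptions, not an official API):

```python
import base64

def looks_like_sas_token(value: str) -> bool:
    """SAS tokens are query strings (sv=..., sig=..., etc.), not keys."""
    v = value.lstrip("?")
    return any(v.startswith(p + "=") or ("&" + p + "=") in v
               for p in ("sv", "sp", "sig", "se", "st"))

def looks_like_account_key(value: str) -> bool:
    """Account keys are standalone base64 strings (no mid-string '=', fairly long)."""
    try:
        base64.b64decode(value, validate=True)
        return "=" not in value.rstrip("=") and len(value) >= 44
    except Exception:
        return False

sample_key = base64.b64encode(b"\x00" * 64).decode()  # stand-in for a real key
print(looks_like_sas_token("?sv=2019-12-12&ss=bfqt&sig=abc"))    # True
print(looks_like_account_key("?sv=2019-12-12&ss=bfqt&sig=abc"))  # False
print(looks_like_account_key(sample_key))                        # True
```

The string in the question above starts with ?sv=..., which immediately marks it as a SAS token rather than an account key.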
I am trying to download a file from my Azure Blob Storage account. To do so, I checked what the URL is, and I am doing the following:
with urllib.request.urlopen("<url_file>") as resp:
    img = np.asarray(bytearray(resp.read()), dtype="uint8")
But I am getting the following error:
urllib.error.HTTPError: HTTP Error 404: The specified resource does not exist.
I have double-checked that the URL is correct. Could this have something to do with not having passed my subscription keys or any other info about the storage account?
Any idea?
As of Dec 26, 2019, I am unable to import BaseBlobService from the Azure storage package. Neither BlobPermissions nor generate_blob_shared_access_signature worked for me. Below is what I used; it worked in my case, and I hope it helps.
from datetime import datetime, timedelta
from azure.storage.blob import generate_blob_sas, AccountSasPermissions

def scan_product():
    account_name = <account_name>
    container_name = <container_name>
    blob_name = <blob_name>
    account_key = <account_key>
    url = f"https://{account_name}.blob.core.windows.net/{container_name}/{blob_name}"
    sas_token = generate_blob_sas(
        account_name=account_name,
        account_key=account_key,
        container_name=container_name,
        blob_name=blob_name,
        permission=AccountSasPermissions(read=True),
        expiry=datetime.utcnow() + timedelta(hours=1)
    )
    url_with_sas = f"{url}?{sas_token}"
    return url_with_sas
Actually, you can generate a blob URL with a SAS token in the Azure Storage SDK for Python for direct access, as in my sample code below.
from azure.storage.blob.baseblobservice import BaseBlobService
from azure.storage.blob import BlobPermissions
from datetime import datetime, timedelta
account_name = '<account name>'
account_key = '<account key>'
container_name = '<container name>'
blob_name = '<blob name>'
url = f"https://{account_name}.blob.core.windows.net/{container_name}/{blob_name}"
service = BaseBlobService(account_name=account_name, account_key=account_key)
token = service.generate_blob_shared_access_signature(
    container_name,
    blob_name,
    permission=BlobPermissions.READ,
    expiry=datetime.utcnow() + timedelta(hours=1),
)
url_with_sas = f"{url}?{token}"
Then,
import urllib.request
import numpy as np

req = urllib.request.urlopen(url_with_sas)
img = np.asarray(bytearray(req.read()), dtype=np.uint8)
To download using the URL directly, you should put the blob in a public container; if it's in a private container, you should generate a SAS token for the blob (the URL then looks like: https://xxx.blob.core.windows.net/aa1/0116.txt?sp=r&st=2019-06-26T09:47:04Z&se=2019-06-26xxxxx).
I tested your code with a URL containing a SAS token, and the file downloaded successfully.
To solve the issue, all I needed to do was change the Blob Storage access level to Blob (anonymous read access for blobs only). Once this is done, it works.