For a Python code base, I would like developers to access application secrets using Azure Key Vault, with the idea that when we deploy, the application should also be able to connect. Hence, I'm thinking Active Directory.
However, I cannot find any examples on the web that show this with the Python SDK. Initially, I would think to retrieve the CLI user's credentials:
from azure.common.credentials import get_azure_cli_credentials
credentials, subscription_id, tenant_id = get_azure_cli_credentials(with_tenant=True)
and then use this retrieved set of credentials to access the key vault:
from azure.keyvault import KeyVaultClient
vault_url = "https://########.vault.azure.net/"
secret_name = "########"
secret_version = "########"
client = KeyVaultClient(credentials)
secret = client.get_secret(vault_url, secret_name, secret_version)
print(secret)
However, I get this error:
azure.keyvault.v7_0.models.key_vault_error_py3.KeyVaultErrorException: Operation returned an invalid status code 'Unauthorized'
I can confirm that credentials, subscription_id, and tenant_id are correct, and that using the CLI, I can successfully retrieve the secret content. So it must be some Python SDK-specific thing.
Any ideas?
It looks like this is a bug in the Python SDK.
https://github.com/Azure/azure-sdk-for-python/issues/5096
You can use your own AD username and password with the UserPassCredentials class. It's not the logged-in user, but it's probably as close as you'll get for now.
E.g.:
from azure.common.credentials import UserPassCredentials
from azure.keyvault import KeyVaultClient

credentials = UserPassCredentials('username', 'password')
client = KeyVaultClient(credentials)
secret = client.get_secret(vault_url, secret_name, secret_version)
print(secret)
I tried the same thing and had a different error ("...audience is invalid...") until I changed your first function call, adding the resource parameter:
credentials, subscription_id, tenant_id = get_azure_cli_credentials(resource='https://vault.azure.net', with_tenant=True)
By default, get_azure_cli_credentials returns a token whose audience is Azure Resource Manager, which Key Vault rejects; the resource parameter requests a token for Key Vault instead. With this change I was able to access secrets using the same code you show.
What about this code snippet? Comparing your code to the example, I don't see where you're setting the client_id or the tenant.
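If you want to go the service principal route, the old SDK accepts a ServicePrincipalCredentials scoped to the Key Vault resource. A minimal sketch, with placeholder values assumed from your AD app registration:
from azure.common.credentials import ServicePrincipalCredentials
from azure.keyvault import KeyVaultClient

# Placeholder values are assumptions; use your AD app registration's details.
credentials = ServicePrincipalCredentials(
    client_id='<client-id>',
    secret='<client-secret>',
    tenant='<tenant-id>',
    resource='https://vault.azure.net'  # the token audience must be Key Vault, not ARM
)
client = KeyVaultClient(credentials)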
You’ll want to set the access policy for the key vault to allow the authenticated user to access secrets. This can be done in the portal. Bear in mind that a key vault has an upper limit on access policy entries (1,024 per vault), so you’ll probably want to grant access to a group and add your users to that group.
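If you'd rather script this than click through the portal, the management-plane SDK exposes the same operation. A hedged sketch using azure-mgmt-keyvault (check the model and method names against your installed version; all placeholder values are assumptions):
from azure.mgmt.keyvault import KeyVaultManagementClient
from azure.mgmt.keyvault.models import (
    VaultAccessPolicyParameters, VaultAccessPolicyProperties,
    AccessPolicyEntry, Permissions
)

# credentials: any azure.common.credentials object with management-plane access
kv_mgmt = KeyVaultManagementClient(credentials, '<subscription-id>')
policy = AccessPolicyEntry(
    tenant_id='<tenant-id>',
    object_id='<group-object-id>',  # grant to a group, per the note above
    permissions=Permissions(secrets=['get', 'list'])
)
kv_mgmt.vaults.update_access_policy(
    '<resource-group>', '<vault-name>', 'add',
    VaultAccessPolicyParameters(
        properties=VaultAccessPolicyProperties(access_policies=[policy])
    )
)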
As @8forty pointed out, adding a resource='https://vault.azure.net' parameter to your get_azure_cli_credentials call will resolve the issue.
However, there are new packages for working with Key Vault in Python that replace azure-keyvault:
azure-keyvault-certificates (Migration guide)
azure-keyvault-keys (Migration guide)
azure-keyvault-secrets (Migration guide)
azure-identity is also the package that should be used with these for authentication.
If you want to authenticate your Key Vault client with the credentials of the logged in CLI user, you can use the AzureCliCredential class:
from azure.identity import AzureCliCredential
from azure.keyvault.secrets import SecretClient
credential = AzureCliCredential()
vault_url = "https://{vault-name}.vault.azure.net"
secret_name = "secret-name"
client = SecretClient(vault_url, credential)
secret = client.get_secret(secret_name)
print(secret.value)
(I work on the Azure SDK in Python)
Related
I am trying to use the documentation on https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/UsingWithRDS.IAMDBAuth.Connecting.Python.html. Right now I am stuck at session = boto3.Session(profile_name='RDSCreds'). What is profile_name, and how do I find it for my RDS?
import sys
import os
import boto3
ENDPOINT="mysqldb.123456789012.us-east-1.rds.amazonaws.com"
PORT="3306"
USR="jane_doe"
REGION="us-east-1"
os.environ['LIBMYSQL_ENABLE_CLEARTEXT_PLUGIN'] = '1'
#gets the credentials from .aws/credentials
session = boto3.Session(profile_name='RDSCreds')
client = session.client('rds')
token = client.generate_db_auth_token(DBHostname=ENDPOINT, Port=PORT, DBUsername=USR, Region=REGION)
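For context, the doc's example then uses that token as the password when opening the database connection. Roughly, assuming a recent PyMySQL and placeholder values for the database name and CA bundle path, as in the doc's example:
import pymysql

# The IAM auth token stands in for the password; SSL is required for IAM auth.
conn = pymysql.connect(host=ENDPOINT, user=USR, passwd=token, port=int(PORT),
                       database='<db-name>', ssl_ca='<path-to-rds-ca-bundle>')
cur = conn.cursor()
cur.execute("SELECT now()")
print(cur.fetchall())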
session = boto3.Session(profile_name='RDSCreds')
profile_name here means the name of the profile you have configured for your AWS CLI.
Usually, running aws configure creates a default profile. But sometimes users want to manage the AWS CLI with another account's credentials, or direct requests to another region, so they configure a separate profile. See the docs for creating and configuring multiple profiles.
aws configure --profile RDSCreds #enter your access keys for this profile
If you think you have already created the RDSCreds profile, you can check it with:
less ~/.aws/config
The documentation you mentioned for RDS with boto3 also says: "The code examples use profiles for shared credentials. For information about specifying credentials, see Credentials in the AWS SDK for Python (Boto3) documentation."
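If you'd rather check from Python, boto3 can list the profiles it finds in those same files (a small sketch; available_profiles reads the same config the CLI uses):
import boto3

# Profiles discovered in ~/.aws/credentials and ~/.aws/config
print(boto3.session.Session().available_profiles)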
Here is my problem: I am trying to create a linked service using the Python SDK, and I was successful when I provided the storage account name and key. But I would like to create the linked service with a Key Vault reference. The code below runs fine and creates the linked service; however, when I go to Data Factory and test the connection, it fails. Please help!
store = LinkedServiceReference(reference_name='LS_keyVault_Dev')
storage_string = AzureKeyVaultSecretReference(store=store, secret_name='access_key')
ls_azure_storage = AzureStorageLinkedService(connection_string=storage_string)
ls = adf_client.linked_services.create_or_update(rg_name, df_name, ls_name, ls_azure_storage)
Error Message
Invalid storage connection string provided to 'AzureTableConnection'. Check the storage connection string in configuration. No valid combination of account information found.
I tested your code: it created the linked service successfully, and when I navigate to the portal to Test connection, it also works. You can follow the steps below.
1. Navigate to the azure keyvault in the portal -> Secrets -> Create a secret. I'm not sure how you were able to use access_key as the name of the secret; per my test, it is invalid (secret names may contain only alphanumeric characters and dashes). So in my sample, I use accesskey as the name of the secret, then store the connection string of the storage account in it.
2. Navigate to the Access policies of the keyvault, and add the MSI of your data factory with the correct secret permissions. If you have not enabled the MSI of the data factory, follow this link to generate it; it is used by the Azure Key Vault linked service to access your keyvault secret.
3. Navigate to the Azure Key Vault linked service of your data factory, and make sure the connection is successful.
4. Use the code below to create the storage linked service.
Versions of the libraries:
azure-common==1.1.23
azure-mgmt-datafactory==0.9.0
from azure.common.credentials import ServicePrincipalCredentials
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import *
subscription_id = '<subscription-id>'
credentials = ServicePrincipalCredentials(client_id='<client-id>', secret='<client-secret>', tenant='<tenant-id>')
adf_client = DataFactoryManagementClient(credentials, subscription_id)
rg_name = '<resource-group-name>'
df_name = 'joyfactory'
ls_name = 'storageLinkedService'
store = LinkedServiceReference(reference_name='AzureKeyVault1')  # AzureKeyVault1 is the name of the Azure Key Vault linked service
storage_string = AzureKeyVaultSecretReference(store=store, secret_name='accesskey')
ls_azure_storage = AzureStorageLinkedService(connection_string=storage_string)
ls = adf_client.linked_services.create_or_update(rg_name, df_name, ls_name, ls_azure_storage)
print(ls)
5. Go back to the linked service page, refresh, and test the connection; it works fine.
I have been stuck on the following issue for quite some time now. Within Python, I want users to retrieve a token, based upon their username and password, from the AWS Cognito identity pool, making use of SRP authentication. With this token I want the users to upload data to S3.
This is part of the code I use (from the warrant library): https://github.com/capless/warrant
self.client = boto3.client('cognito-idp', region_name="us-east-1")

response = self.client.initiate_auth(
    AuthFlow='USER_SRP_AUTH',
    AuthParameters=auth_params,
    ClientId=self.client_id
)

def get_auth_params(self):
    auth_params = {'USERNAME': self.username,
                   'SRP_A': long_to_hex(self.large_a_value)}
    if self.client_secret is not None:
        auth_params.update({
            "SECRET_HASH":
                self.get_secret_hash(self.username, self.client_id, self.client_secret)})
    return auth_params
However, I keep on getting:
botocore\auth.py", line 352, in add_auth raise NoCredentialsError
botocore.exceptions.NoCredentialsError: Unable to locate credentials
I was able to get rid of this error by adding credentials to the .aws/credentials file, but that is not in line with the purpose of this program. It seems like there is a mistake in the warrant or botocore library, and it keeps attempting to use the AWS Access Key ID and AWS Secret Access Key from the credentials file, rather than using the given credentials (username and password).
Any help is appreciated
I am on the Cognito team. InitiateAuth is an unauthenticated call, so it shouldn't require you to provide AWS credentials. The service endpoint will not validate the SigV4 signature for these calls.
That being said, some client libraries have certain peculiarities in the sense that you need to provide some dummy credentials otherwise the client library will throw an exception. However you can provide anything for the credentials.
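In other words, something like this is enough to satisfy such a client library (the 'dummy' values are deliberate placeholders, not real keys):
import boto3

# Any non-empty strings will do: InitiateAuth is unauthenticated, so the
# service never validates the signature these credentials produce.
client = boto3.client('cognito-idp', region_name='us-east-1',
                      aws_access_key_id='dummy',
                      aws_secret_access_key='dummy')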
I too ran into this, using warrant.
The problem is that the boto3 libraries are trying to sign the request to AWS, but this request is not supposed to be signed. To prevent that, create the identity pool client with a config that specifies no signing.
import boto3
from botocore import UNSIGNED
from botocore.config import Config
client = boto3.client('cognito-idp', region_name='us-east-1', config=Config(signature_version=UNSIGNED))
AWS Access Key ID and AWS Secret Access Key are totally different from username and password.
The boto3 client has to connect to the AWS service endpoint (in your case: cognito-idp.us-east-1.amazonaws.com) to execute any API. Before executing an API, the API credentials (key + secret) have to be provided to authenticate your AWS account. Without authenticating your account, you cannot call cognito-idp APIs.
There is one AWS account (key/secret), but there can be multiple users (username/password).
I need to retrieve secrets from keyvault. This is my code so far:
from azure.mgmt.keyvault import KeyVaultManagementClient
from azure.common.credentials import ServicePrincipalCredentials
subscription_id = 'x'
# See above for details on creating different types of AAD credentials
credentials = ServicePrincipalCredentials(
client_id = 'x',
secret = 'x',
tenant = 'x'
)
kv_client = KeyVaultManagementClient(credentials, subscription_id)
for vault in kv_client.vaults.list():
print(vault)
But I am getting this error:
msrestazure.azure_exceptions.CloudError: Azure Error:
AuthorizationFailed Message: The client 'x' with object id 'x' does
not have authorization to perform action
'Microsoft.Resources/subscriptions/resources/read' over scope
'/subscriptions/x'.
Now, I am able to access the same key vault with the same credentials using C# code/PowerShell, so there is definitely nothing wrong with authorization. Not sure why it isn't working with the SDK. Please help.
If you are looking to access via a ServicePrincipalCredentials instance, you can just use:
from azure.keyvault import KeyVaultClient, KeyVaultAuthentication
from azure.common.credentials import ServicePrincipalCredentials

def auth_callback(server, resource, scope):
    credentials = ServicePrincipalCredentials(
        client_id='',
        secret='',
        tenant='',
        resource="https://vault.azure.net"
    )
    token = credentials.token
    return token['token_type'], token['access_token']

client = KeyVaultClient(KeyVaultAuthentication(auth_callback))
secret_bundle = client.get_secret("https://vault_url", "secret_id", "")
print(secret_bundle.value)
This assumes that you don't want to pass a version. If you do, you can substitute the last parameter for it.
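For example, pinning a specific version (the version string here is a placeholder):
# The third argument pins the secret version; "" means the latest version.
secret_bundle = client.get_secret("https://vault_url", "secret_id", "<secret-version>")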
I ran your code sample above and it is able to list the key vaults without any issue, hence it is not a code issue.
I have assigned the Contributor role to my AD application on the subscription where the key vault is provisioned, and set the Access Policies to allow GET & LIST permissions for Key and Secret to the AD application.
The versions of my Azure Python packages used running under Python 3.6.2 runtime environment:
azure.common (1.1.8)
azure.mgmt.keyvault (0.40.0)
msrestazure(0.4.13)
I recommend you try the Python runtime version and Azure Python package versions that are verified working above.
Addendum:
If the above Python runtime environment version as well as Azure Python packages also does not work for you, you should probably consider creating a new issue in the Azure SDK for Python GitHub as it is working with the same credential with Azure .NET SDK as well as PowerShell.
You can also get a secret by name instead of by ID:
secret_bundle = client.get_secret(<VAULT URL>, "<NAME>", "")
There are some good answers already, but the Azure SDK has since released new packages for working with Key Vault in Python that replace azure-keyvault:
azure-keyvault-certificates (Migration guide)
azure-keyvault-keys (Migration guide)
azure-keyvault-secrets (Migration guide)
azure-identity is also the package that should be used with these for authentication.
Documentation for working with the secrets library can be found on the azure-sdk-for-python GitHub repository, and here's a sample for retrieving secrets as you were doing:
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient
credential = DefaultAzureCredential()
secret_client = SecretClient(
vault_url="https://my-key-vault.vault.azure.net/",
credential=credential
)
secret = secret_client.get_secret("secret-name")
You can provide the same credentials that you used for ServicePrincipalCredentials by setting environment variables corresponding to the client_id, secret, and tenant:
export AZURE_CLIENT_ID="client_id"
export AZURE_CLIENT_SECRET="secret"
export AZURE_TENANT_ID="tenant"
(I work on the Azure SDK in Python)
You can use the ClientSecretCredential class from azure.identity; see the code snippet below:
from azure.identity import ClientSecretCredential
from azure.keyvault.secrets import SecretClient
TENANT = <TenantId-in-string>
CLIENT_ID = <ClientId-in-string>
CLIENT_SECRET = <ClientSecret-in-string>
credential = ClientSecretCredential(TENANT, CLIENT_ID, CLIENT_SECRET)
VAULT_URL = <AzureVault-url-in-string>
client = SecretClient(vault_url=VAULT_URL, credential=credential)
print(client)
example_secret = client.get_secret(<secret_name_in_string>)
print(example_secret.value)
I'm trying to use Apache Libcloud (Web), and reading the documentation on how to use it with Amazon EC2, I'm stuck on a step at the beginning.
On this step:
from libcloud.compute.types import Provider
from libcloud.compute.providers import get_driver
cls = get_driver(Provider.EC2)
driver = cls('temporary access key', 'temporary secret key',
             token='temporary session token', region="us-west-1")
You need to pass in temporary access data, and the page tells you to read the Amazon documentation, but even though I've read that documentation, it is not clear to me what I have to do to get my temporary credentials.
The docs say you can interact with the AWS STS API to connect to the endpoint, but I don't understand how you get the credentials. Moreover, in the example on the Libcloud site they use personal credentials:
ACCESS_ID = 'your access id'
SECRET_KEY = 'your secret key'
So I'm a bit lost. How can I get my temporary credentials to use in my code?
Thanks and regards.
If this code does not run on an EC2 instance, I suggest you go with static credentials:
ACCESS_ID = 'your access id'
SECRET_KEY = 'your secret key'
cls = get_driver(Provider.EC2)
driver = cls(ACCESS_ID, SECRET_KEY, region="us-west-1")
To create access credentials:
Sign in to the Identity and Access Management (IAM) console at https://console.aws.amazon.com/iam/.
In the navigation pane, choose Users.
Choose the name of the desired user, and then choose the Security Credentials tab.
If needed, expand the Access Keys section and do any of the following:
Choose Create Access Key and then choose Download Credentials to save the access key ID and secret access key to a CSV file on your computer. Store the file in a secure location. You will not have access to the secret access key again after this dialog box closes. After you have downloaded the CSV file, choose Close.
If you want to run your code from an EC2 machine, you can get temporary credentials by assuming an IAM role using the AWS SDK for Python (https://boto3.readthedocs.io/en/latest/guide/quickstart.html), calling assume_role() on the STS service (https://boto3.readthedocs.io/en/latest/reference/services/sts.html).
@Aker666, from what I have found on the web, you're still expected to use the regular AWS API to obtain this information.
The basic snippet that works for me is:
import boto3
from libcloud.compute.types import Provider
from libcloud.compute.providers import get_driver
boto3.setup_default_session(aws_access_key_id='somekey',aws_secret_access_key='somesecret',region_name="eu-west-1")
sts_client = boto3.client('sts')
assumed_role_object = sts_client.assume_role(
    RoleArn="arn:aws:iam::701********:role/iTerm_RO_from_TGT",
    RoleSessionName='update-cloud-hosts.aviadraviv#Aviads-MacBook-Pro.local'
)
cls = get_driver(Provider.EC2)
driver = cls(assumed_role_object['Credentials']['AccessKeyId'],
             assumed_role_object['Credentials']['SecretAccessKey'],
             token=assumed_role_object['Credentials']['SessionToken'],
             region="eu-west-1")
nodes = driver.list_nodes()
print(nodes)
Hope this helps anyone.