How can I, using the Azure Python API, create a full set of credentials that can later be used to start and deallocate all VMs in a named resource group, without any other permissions?
I have thoroughly researched the example code and both official and unofficial documentation, but I don't even know where to start...
I know I will need a tenant ID, a client ID, a client secret, and a subscription ID. Which of those can I create using an API, and how would I go about assigning roles to allow starting/deallocating the VMs of an existing resource group?
Sample code is highly sought after, but I will take any hint!
You need the azure-graphrbac package to create a Service Principal:
https://learn.microsoft.com/python/api/overview/azure/activedirectory
The closest thing to a sample might be this unit test:
https://github.com/Azure/azure-sdk-for-python/blob/master/sdk/graphrbac/azure-graphrbac/tests/test_graphrbac.py
For role and permissions, you need azure-mgmt-authorization:
https://learn.microsoft.com/python/api/overview/azure/authorization
The best sample for this one is probably this sub-part of this sample:
https://github.com/Azure-Samples/compute-python-msi-vm#role-assignement-to-the-msi-credentials
"msi_identity" is a synonym of "service principal" in your context.
Note that all of this is supported by the CLI v2.0:
https://learn.microsoft.com/cli/azure/ad/sp
https://learn.microsoft.com/cli/azure/role/assignment
It might be interesting to run the CLI in --debug mode while sniffing around the code repo at the same time:
https://github.com/Azure/azure-cli
(full disclosure, I work at MS in the Azure SDK for Python team)
I am getting an error while deploying my Azure Function from my local system.
I went through some blogs, and they state that my function is unable to connect to the Azure storage account that holds the function's metadata.
The function on the portal is showing the error: Azure Functions runtime is unreachable.
Earlier my function was running, but after integrating the function with an Azure Premium App Service plan it has stopped working. My assumption is that my App Service plan has some restriction on inbound/outbound traffic rules, and due to this it is unable to establish a connection with the function's associated storage account.
Also, I would like to highlight that if a function is using the Premium plan, then we have to add a few other configuration properties.
WEBSITE_CONTENTAZUREFILECONNECTIONSTRING = "DefaultEndpointsProtocol=https;AccountName=blob_container_storage_acc;AccountKey=dummy_value==;EndpointSuffix=core.windows.net"
WEBSITE_CONTENTSHARE = "my-function-name"
For the WEBSITE_CONTENTSHARE property I have added the function app name, but I am not sure about the value.
Following is the Microsoft documentation reference for the function configuration properties:
Microsoft Function configuration properties link
Can you please help me resolve this issue?
Note: I am using python for the Azure functions.
I created a new function app with a Premium plan and selected Python as the runtime. When you select Python, the OS is automatically Linux.
Below is the message shown when creating functions for a Premium plan function app:
Your app is currently in read only mode because Elastic Premium on Linux requires running from a package.
We need to create, deploy, and run function apps from a package; refer to the documentation on how to run functions from a package.
Documentation
Make sure to add all of your local.settings.json configurations to the Application Settings of the function app.
Not sure what kind of Azure Function you are using, but usually when there is a Storage Account associated, we need to specify the AzureWebJobsStorage field in the serviceDependencies.json file inside the Properties folder. When I faced the same error, the cause was that while publishing the Azure Function from local, some settings from local.settings.json were missing in the Application Settings of the app service under the Configuration blade.
There are a few more things you can recheck:
Does the storage account you are trying to use still exist, or has it been deleted by any chance?
While publishing the application from local using the web deploy method, is the publish profile correct, or does it have any issues?
Try disabling the function app and then stopping the app service before redeploying it.
Hope one of the above helps you diagnose and solve the issue.
The thing is that there is a difference in how a function is deployed using a Consumption vs. a Premium service plan.
Consumption - works out of the box.
Premium - you need to add WEBSITE_RUN_FROM_PACKAGE = 1 in the function's Application settings. (See https://learn.microsoft.com/en-us/azure/azure-functions/run-functions-from-deployment-package for full details.)
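You can also flip that setting from Python. A minimal sketch, assuming azure-mgmt-web in its track 1 style (where update_application_settings takes the flattened properties dict); all names are placeholders:

from azure.common.credentials import ServicePrincipalCredentials
from azure.mgmt.web import WebSiteManagementClient

credentials = ServicePrincipalCredentials(
    client_id="<client-id>", secret="<client-secret>", tenant="<tenant-id>"
)
client = WebSiteManagementClient(credentials, "<subscription-id>")

# Read the current settings, add the flag, and write everything back
settings = client.web_apps.list_application_settings("<resource-group>", "<function-app>")
settings.properties["WEBSITE_RUN_FROM_PACKAGE"] = "1"
client.web_apps.update_application_settings(
    "<resource-group>", "<function-app>", properties=settings.properties
)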
I am working on a project where I have been using Python to make API calls to our organization's various technologies to get data, which I then push to Power BI to track metrics over time relating to IT Security.
My boss wants to see info added from Exchange Online Protection such as malware detected in emails, spam blocks etc., essentially replicating some of the email and collaboration reports you'd see in M365 defender > reports > email and collaboration (security.microsoft.com/emailandcollabreport).
I have tried the Defender API and MS Graph API, read through a ton of documentation, and can't seem to find anywhere to pull this info from. Has anyone done something similar, or know where this data can be pulled from?
Thanks in advance.
You can try the Microsoft Graph Security API, with which you can get alerts, information protection, and secure score. You can also refer to the alerts section in the documentation, which lists the providers supported by the Microsoft Graph Security API at this point.
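For example, a minimal sketch of listing alerts from that API, assuming an app registration with the SecurityEvents.Read.All application permission; msal and requests are the assumed client libraries, and all IDs are placeholders:

import msal
import requests

# Acquire an app-only token via the client credentials flow
app = msal.ConfidentialClientApplication(
    "<client-id>",
    authority="https://login.microsoftonline.com/<tenant-id>",
    client_credential="<client-secret>",
)
token = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])

resp = requests.get(
    "https://graph.microsoft.com/v1.0/security/alerts",
    headers={"Authorization": "Bearer " + token["access_token"]},
)
resp.raise_for_status()
for alert in resp.json()["value"]:
    print(alert["title"], alert["severity"])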
In case anyone else runs into this, this is the solution I ended up using (hacky as it may be):
The only way to extract the pertinent info seems to be through PowerShell; you need the ExchangeOnlineManagement and PSWSMan modules, so those will need to be installed.
You need to add an app to your Azure tenant with at minimum the Global Reader role (or something custom), and generate and upload a self-signed certificate to the app.
I then ran the following lines as a .ps1 script:
Connect-ExchangeOnline -CertificateFilePath "<PATH>" -AppID "<APPID>" -Organization "<ORG>.onmicrosoft.com" -CertificatePassword (ConvertTo-SecureString -String '<PASSWORD>' -AsPlainText -Force)
$dte = (Get-Date).AddDays(-30)
Get-MailflowStatusReport -StartDate $dte -EndDate (Get-Date)
Disconnect-ExchangeOnline
I used Python to call the PowerShell script, then extracted the info I needed from the output and pushed it to Power BI.
I'm sure there is a more secure and efficient way to do this, but I was able to accomplish the task this way.
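For reference, a minimal sketch of the Python side, assuming the script above is saved as get_mailflow_report.ps1 (a hypothetical name) and PowerShell 7+ (pwsh) is on the PATH; on Windows PowerShell 5.1 use "powershell" instead:

import subprocess

result = subprocess.run(
    ["pwsh", "-NoProfile", "-File", "get_mailflow_report.ps1"],
    capture_output=True,
    text=True,
    check=True,
)

# Get-MailflowStatusReport writes a table to stdout; parse out the rows
# you need before pushing the values to Power BI.
for line in result.stdout.splitlines():
    print(line)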
I'm working on automated deletion of Azure resources based on tags attached to these resources.
I'm using the Azure SDK for Python (https://github.com/Azure/azure-sdk-for-python) - I found out how to get a list of my resources, and that I can delete them with ResourceManagementClient's resources.delete_by_id method.
However, this method requires two arguments - the resource ID (which I have from the resources listed by ResourceManagementClient) and the API version (which is different for every resource type).
How can I determine which API version should be passed to the method?
I've tried to find something in the docs and the SDK's code, but I couldn't come up with a proper solution.
The API version can even be hardcoded, but it needs to work for all resource types.
When using some API versions (e.g. 2018-05-01) I get an error for some resource types:
Azure Error: NoRegisteredProviderFound
Message: No registered resource provider found for location 'westeurope' and API version '['2018-05-01']' for type 'virtualMachines'. The supported api-versions are '2015-05-01-preview, 2015-06-15, 2016-03-30, 2016-04-30-preview, 2016-08-30, 2017-03-30, 2017-12-01, 2018-04-01, 2018-06-01, 2018-10-01, 2019-03-01'. The supported locations are 'eastus, eastus2, westus, centralus, northcentralus, southcentralus, northeurope, westeurope, eastasia, southeastasia, japaneast, japanwest, australiaeast, australiasoutheast, brazilsouth, southindia, centralindia, westindia, canadacentral, canadaeast, westus2, westcentralus, uksouth, ukwest, koreacentral, koreasouth, francecentral, southafricanorth'.
ERROR: 'CloudError' object has no attribute '__traceback__'
I would recommend the same approach as the CLI implementation: do an initial call to ARM to get the possible mappings from resource provider/resource type to API version(s), and use that to inject the correct API version into your call.
Getting this mapping is a "list providers" call.
Here is a management samples repo; look for ResourceManagementClient:
https://github.com/Azure-Samples/azure-samples-python-management
Edit: I work at MS in the Python SDK team.
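A minimal sketch of that approach with azure-mgmt-resource (track 1 era), resolving the newest stable API version per resource type before calling delete_by_id; it handles top-level resource types only, and the credentials and tag filter are placeholders:

from azure.common.credentials import ServicePrincipalCredentials
from azure.mgmt.resource import ResourceManagementClient

def resolve_api_version(client, resource_id):
    # A resource ID looks like:
    # /subscriptions/<sub>/resourceGroups/<rg>/providers/Microsoft.Compute/virtualMachines/<name>
    parts = resource_id.split("/")
    namespace = parts[parts.index("providers") + 1]
    resource_type = parts[parts.index("providers") + 2]
    provider = client.providers.get(namespace)
    rt = next(
        t for t in provider.resource_types
        if t.resource_type.lower() == resource_type.lower()
    )
    # ARM returns API versions newest first; prefer stable over preview
    stable = [v for v in rt.api_versions if "preview" not in v.lower()]
    return stable[0] if stable else rt.api_versions[0]

credentials = ServicePrincipalCredentials(
    client_id="<client-id>", secret="<client-secret>", tenant="<tenant-id>"
)
client = ResourceManagementClient(credentials, "<subscription-id>")

for resource in client.resources.list(filter="tagName eq 'decommission'"):
    client.resources.delete_by_id(resource.id, resolve_api_version(client, resource.id))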
If I am not mistaken, resources.delete_by_id is a wrapper over the Delete By Id REST API method. Currently the latest API version for this operation is 2018-05-01. You can use that in your method call.
We are using Apache Beam through Airflow. The default GCP service account is set with the environment variable GOOGLE_APPLICATION_CREDENTIALS. We don't want to change the environment variable, as it might affect other processes running at that time. I couldn't find a way to change the Google Cloud Dataflow service account programmatically.
We are creating the pipeline in the following way:
p = beam.Pipeline(argv=self.conf)
Is there any option through argv or options where I can specify the location of the GCP credentials file?
I searched through the documentation, but didn't find much information.
You can specify a service account when you launch the job with a basic flag (in the Beam Python SDK it is --service_account_email):
--service_account_email=my-service-account-name@my-project.iam.gserviceaccount.com
That account will need the Dataflow Worker role attached, plus whatever else you would like (GCS/BQ/etc.). Details here. You don't need the SA to be stored in GCS, or keys locally, to use it.
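A minimal sketch of passing that option from Python, assuming the Beam Python SDK with the Dataflow runner; all names are placeholders:

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(
    runner="DataflowRunner",
    project="my-project",
    region="us-central1",
    temp_location="gs://my-bucket/tmp",
    service_account_email="my-service-account-name@my-project.iam.gserviceaccount.com",
)
p = beam.Pipeline(options=options)
# Or, sticking with argv as in the question:
# p = beam.Pipeline(argv=self.conf + ["--service_account_email=..."])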
I want to use the Python API to create a user and a project, assign that user to the created project, and then allocate a certain quota. I can't find documentation for this. What is the best way to do these tasks using code (I can't do it with the CLI)?
Thanks.
Error: 'Proxy' object has no attribute 'v2'
Code:
from openstack import connection

conn = connection.Connection(
    auth_url="https://example.com/v2.0",
    project_name="admin",
    username="admin",
    password="test",
    verify=False,
)
conn.identity.v2._proxy.create_role()
Documentation URL : https://developer.openstack.org/sdks/python/openstacksdk/users/proxies/identity_v2.html
I have not tried OpenStack user/domain/project creation using Python directly, but the OpenStack CLI in turn uses the Python APIs only.
The CLI command actually imports the Python module 'openstackclient'.
So digging further into this may help you.
The default location where the OpenStack client is installed is
/usr/lib/python2.7/dist-packages/openstackclient/
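For what it's worth, with a recent openstacksdk the whole flow can be sketched roughly like this, using Identity v3 (the v2.0 endpoint in the question is long deprecated); all names and quota numbers are placeholders:

from openstack import connection

conn = connection.Connection(
    auth_url="https://example.com/v3",
    project_name="admin",
    username="admin",
    password="test",
    user_domain_name="Default",
    project_domain_name="Default",
    verify=False,
)

# Create a project and a user in it
project = conn.identity.create_project(name="demo-project", domain_id="default")
user = conn.identity.create_user(
    name="demo-user", password="secret", default_project_id=project.id
)

# Assign an existing role (e.g. "member") to the user on the project
role = conn.identity.find_role("member")
conn.identity.assign_project_role_to_user(project, user, role)

# Set compute quotas for the project via the SDK's cloud layer
conn.set_compute_quotas(project.id, cores=8, instances=4, ram=16384)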