I am new to Azure and Python and would like to ask some questions. Using Azure Python SDK:
Is there a direct way for me to get a list of snapshots of a managed OS disk, ordered by creation date? So far, I can only list all snapshots within a resource group or under a subscription.
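For reference, the approach I'm imagining is to list by resource group and then filter/sort client-side on `creation_data.source_resource_id` and `time_created`. A minimal sketch, assuming `azure-mgmt-compute` and `azure-identity` (the helper names and placeholders are mine, not from the SDK):

```python
def snapshots_for_disk(snapshots, disk_id, newest_first=False):
    """Keep only snapshots created from `disk_id`, ordered by time_created."""
    matching = [
        s for s in snapshots
        if (s.creation_data.source_resource_id or "").lower() == disk_id.lower()
    ]
    return sorted(matching, key=lambda s: s.time_created, reverse=newest_first)


def list_ordered_snapshots(subscription_id, resource_group, disk_id):
    # Hypothetical wiring against the real SDK
    # (pip install azure-identity azure-mgmt-compute); needs valid credentials.
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.compute import ComputeManagementClient

    client = ComputeManagementClient(DefaultAzureCredential(), subscription_id)
    return snapshots_for_disk(
        client.snapshots.list_by_resource_group(resource_group), disk_id
    )
```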
How do I write Python SDK code to create snapshots asynchronously? I am considering multi-threading, and I would also like to be notified when a snapshot is created successfully.
My questions might be confusing since I have little experience with Azure and Python. Any help will be appreciated.
Help!
I have a small Python script that logs in to a website, downloads a bundle of files, and then saves these files to a SharePoint site for use by others. There are multiple files involved and several required Python imports.
I'd like to move this to Azure so I can put it on a schedule to run periodically and forget about it (and have the script send a notification or similar). There are also other scripts I would like to put on a schedule.
I'm somewhat baffled about where to start with this on Azure. I have an Azure account with some free credit, but beyond that I'm confused about which Azure service this should be built on.
Google searching is not helping; all I get is a bundle of buzzwords that don't really help.
Looking for some pointers in the right direction.
Thanks
As mentioned by @Gaurav Mantri, you can use an Azure Functions timer trigger, where the function is run on a schedule. Alternatively, for codeless automated workflows, you can opt for Azure Logic Apps, which provides many connectors (including one for SharePoint) that you assemble into a workflow.
REFERENCES: Connect to SharePoint from Azure Logic Apps - MSFT Docs
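To make the timer-trigger option concrete, here is a minimal sketch assuming the Azure Functions v1 Python programming model, where the code is plain Python and the schedule lives separately in `function.json` (e.g. `"schedule": "0 0 6 * * *"` for 06:00 UTC daily). `download_and_upload` is a hypothetical stand-in for your existing script:

```python
import logging


def download_and_upload() -> int:
    """Placeholder for the real work: log in, fetch the files,
    push them to SharePoint. Returns the number of files handled."""
    files = []  # your existing download logic goes here
    return len(files)


def main(mytimer) -> None:  # mytimer is an azure.functions.TimerRequest
    count = download_and_upload()
    logging.info("Timer run finished, %d file(s) processed", count)
```

The notification you want (success/failure) can then be sent at the end of `main`, or handled by Azure Monitor alerts on the function's logs.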
I'm trying to make a game with a leaderboard, and I have code that writes the high scores to a file, but when multiple people play the game, they each get separate leaderboards. So basically, I'm asking if there's a way to have everyone share a file on Repl.it.
If your code is writing to local files, e.g. via open('some_file.txt'), then those files are limited to each individual user's Repl.it workspace.
If you must use files, then you could use any of the Dropbox, Git, AWS S3, or Google Drive Python libraries to read/write external files.
However, the better solution would be to use a hosted database and connect Repl.it to it. Some popular (free) examples include Firebase, MongoDB Atlas, or DataStax Astra. For a simple leaderboard counter, even Grafana Cloud would do.
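As an illustration of the hosted-database route, here is a sketch of a shared leaderboard backed by MongoDB Atlas, assuming `pymongo`; the connection string, database, and collection names are placeholders:

```python
def top_scores(rows, n=10):
    """Pure helper: highest score first, ties broken by player name."""
    return sorted(rows, key=lambda r: (-r["score"], r["player"]))[:n]


def record_score(uri, player, score):
    """Record a score in the shared collection and return the top 10."""
    from pymongo import MongoClient  # pip install pymongo

    coll = MongoClient(uri).game.scores
    # keep only each player's best score
    coll.update_one({"player": player}, {"$max": {"score": score}}, upsert=True)
    return top_scores(list(coll.find({}, {"_id": 0})))
```

Because every Repl.it user connects to the same Atlas cluster, they all see one leaderboard instead of per-workspace files.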
I've developed a Python script that helps track and organize various tasks in Smartsheet, but I want to assign it to a button inside Smartsheet so that it is available to multiple users. Is there a way to do this, and could someone point me to some documentation? So far I haven't seen anything online.
Thank you,
Sam
Smartsheet doesn't host scripts, nor does it natively have a way to execute your remote scripts. You'd need to either host it yourself or host it on a cloud platform like AWS or Azure. Inside Smartsheet, you could create a dashboard widget that then links to your own server.
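For the self-hosting part, a widget link just needs a URL that triggers your script. A minimal sketch using only the standard library, where `run_tasks` is a hypothetical stand-in for your existing Smartsheet logic:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer


def run_tasks() -> str:
    # your existing task-tracking logic would run here
    return "tasks updated"


class RunHandler(BaseHTTPRequestHandler):
    """Serve GET /: run the script and report the result as plain text."""

    def do_GET(self):
        body = run_tasks().encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)


if __name__ == "__main__":
    HTTPServer(("", 8080), RunHandler).serve_forever()
```

In production you'd put this behind HTTPS and some form of authentication, since anyone with the URL could trigger the script.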
Noob and beginner here. Just trying to learn the basics of GCP.
I have a series of Google Cloud Storage buckets containing text files. I also have a VM instance that I've set up in GCP.
Now, I'm trying to write some code to extract the data from Google buckets and run the script via GCP's command prompt.
How can I download the contents of GCP buckets in Python?
You can use the Cloud Storage Listing Objects and Downloading Objects methods from Python: first list the objects stored in your buckets, then download them onto your VM instance. Keep in mind that the service account you use for these tasks must have the required roles assigned in order to access your buckets. You also need to provide credentials to your application, either through environment variables or by pointing to your service account key file explicitly in code.
Once your code is ready, you can execute your Python program with the python command. Take a look at this link for instructions on installing Python in your new environment.
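Putting the listing and downloading steps together, a sketch assuming the `google-cloud-storage` client library; the bucket name is a placeholder and credentials are expected via `GOOGLE_APPLICATION_CREDENTIALS`:

```python
import os


def local_name(blob_name: str) -> str:
    """Pure helper: flatten an object path into a safe local filename."""
    return blob_name.replace("/", "_")


def download_bucket(bucket_name: str, dest_dir: str) -> list:
    """List every object in the bucket and download it to dest_dir."""
    from google.cloud import storage  # pip install google-cloud-storage

    client = storage.Client()
    downloaded = []
    for blob in client.list_blobs(bucket_name):
        path = os.path.join(dest_dir, local_name(blob.name))
        blob.download_to_filename(path)
        downloaded.append(path)
    return downloaded
```

You would then run it on the VM with e.g. `python download.py` once the service account attached to the instance has the Storage Object Viewer role.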
I have tried searching for a way to download the VHD disk of a VM from Azure, but couldn't find one.
The only way I found is to download it manually using the steps provided in this link:
https://learn.microsoft.com/en-us/azure/virtual-machines/linux/download-vhd
If anyone knows a way to download it using Python, please share.
Thanks in advance.
Essentially, the link you referenced tells you what you need to do to download a .vhd file.
However, if you want to use Python, the azure-storage-blob library makes common tasks like this easier.
See this file especially for some more information on how to read blobs in an Azure Storage Account.
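The steps in the linked article (generate a SAS for the disk, then download the blob) can be scripted. A sketch assuming `azure-mgmt-compute` and `azure-storage-blob`; all names are placeholders, and `is_sas_url` is just a hypothetical sanity-check helper of mine:

```python
def is_sas_url(url: str) -> bool:
    """Sanity check: a grant-access URL should carry a SAS signature."""
    return "?" in url and "sig=" in url.split("?", 1)[1]


def download_vhd(resource_group, disk_name, out_path, subscription_id):
    # pip install azure-identity azure-mgmt-compute azure-storage-blob
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.compute import ComputeManagementClient
    from azure.storage.blob import BlobClient

    compute = ComputeManagementClient(DefaultAzureCredential(), subscription_id)
    # Ask Azure for a temporary read-only SAS URL on the managed disk
    access = compute.disks.begin_grant_access(
        resource_group, disk_name,
        {"access": "Read", "duration_in_seconds": 3600},
    ).result()
    assert is_sas_url(access.access_sas)
    # Stream the underlying page blob to a local .vhd file
    blob = BlobClient.from_blob_url(access.access_sas)
    with open(out_path, "wb") as f:
        blob.download_blob().readinto(f)
    # Release the SAS when finished
    compute.disks.begin_revoke_access(resource_group, disk_name).wait()
```

Note the VM should be deallocated (or the disk detached) before exporting, as the manual steps in the linked doc also require.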