I have a webapp that writes the status of Azure VMs to a DB. I want the DB to update automatically every time a VM is started/stopped/restarted. How do I set up a trigger that POSTs to my webapp when a VM is started/stopped/restarted? I'm using the Azure Python SDK. Thank you.
You don't need to write code to achieve this: you can use Azure Event Grid linked to a Logic App, which can write straight into your database. Here is an example of how to set up Event Grid to send emails when linked to a Logic App. This is an example of how to connect a Logic App to a database. The combination of these two examples will let you use Azure-native technologies to achieve your goal.
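That said, since the question mentions the Azure Python SDK, you can also create the Event Grid subscription in code. A rough sketch, assuming the azure-mgmt-eventgrid and azure-identity packages; the subscription name and endpoint URL are placeholders, and exact method names (begin_create_or_update vs. create_or_update) vary between SDK versions:

from azure.identity import DefaultAzureCredential
from azure.mgmt.eventgrid import EventGridManagementClient
from azure.mgmt.eventgrid.models import (
    EventSubscription,
    EventSubscriptionFilter,
    WebHookEventSubscriptionDestination,
)

subscription_id = "<your-subscription-id>"
client = EventGridManagementClient(DefaultAzureCredential(), subscription_id)

# Subscribe at subscription scope: VM start/stop/restart operations surface
# as ResourceActionSuccess events from Azure Resource Manager.
scope = "/subscriptions/" + subscription_id
poller = client.event_subscriptions.begin_create_or_update(
    scope,
    "vm-status-to-webapp",  # hypothetical subscription name
    EventSubscription(
        destination=WebHookEventSubscriptionDestination(
            endpoint_url="https://<your-webapp>/vm-events"  # hypothetical endpoint
        ),
        filter=EventSubscriptionFilter(
            included_event_types=["Microsoft.Resources.ResourceActionSuccess"]
        ),
    ),
)
poller.result()  # blocks until the subscription is provisioned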
Alternatively, instead of writing straight to the database, you could use the Logic App to POST to your website using one of the connectors.
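If your webapp is the webhook endpoint itself, note that Event Grid sends a one-time validation event when the subscription is created, and your endpoint must echo the validation code back. A minimal sketch of the receiving side, assuming Flask; the /vm-events route and the update_vm_status() helper are hypothetical:

from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/vm-events", methods=["POST"])
def vm_events():
    for event in request.get_json():
        # One-time handshake: echo the validation code back to Event Grid.
        if event["eventType"] == "Microsoft.EventGrid.SubscriptionValidationEvent":
            return jsonify({"validationResponse": event["data"]["validationCode"]})
        # Resource events carry the VM's resource ID in "subject" and the
        # operation (start/deallocate/restart) in the data payload.
        if event["eventType"] == "Microsoft.Resources.ResourceActionSuccess":
            update_vm_status(event["subject"], event["data"].get("operationName"))
    return "", 200

def update_vm_status(resource_id, operation):
    # Hypothetical placeholder: write the status change to your DB here.
    print(resource_id, operation)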
Hope that helps.
There is an AI-based web application which needs to be connected to an Azure SQL DB.
So I have to create a REST API as an interface to connect the web app GUI to the Azure SQL DB.
Please suggest how I can achieve this. I searched other posts, which suggest that you can expose the Azure SQL DB via an OData service, but I'm not sure where to start.
I followed the post
How to expose Azure Sql Server database using OData
but I still need some help on where to start.
I found this guide for you.
Even though it's not for Python, you get the idea: this is how to work with Azure SQL Database and create a GUI interface on top of it.
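If you'd rather start in Python, here is a minimal sketch of one REST endpoint in front of Azure SQL, assuming Flask and pyodbc with the ODBC Driver 18 for SQL Server installed; the server, database, credentials, and Items table are placeholders:

import pyodbc
from flask import Flask, jsonify

app = Flask(__name__)

CONN_STR = (
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=tcp:<your-server>.database.windows.net,1433;"
    "Database=<your-db>;Uid=<user>;Pwd=<password>;Encrypt=yes;"
)

@app.route("/items")
def list_items():
    # One connection per request keeps the sketch simple; pool in production.
    conn = pyodbc.connect(CONN_STR)
    try:
        rows = conn.execute("SELECT id, name FROM Items").fetchall()
    finally:
        conn.close()
    return jsonify([{"id": r.id, "name": r.name} for r in rows])

if __name__ == "__main__":
    app.run()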
I have developed an application in Python that connects to Snowflake and uploads some results of a data model. The point is, if I'm the user running it, there's no problem with the browser option. But eventually we're going to need to roll this model out to other users, and we're going to run into issues.
Is there a way out of this?
If you have configured Snowflake to use single sign-on (SSO), you can configure your client application to use SSO for authentication.
Check the documentation here.
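For reference, this is roughly what it looks like with the Snowflake Python connector. The "externalbrowser" authenticator opens a browser for any SAML 2.0 identity provider; the URL form authenticates natively through Okta with no browser pop-up (Okta only). Account and user values are placeholders:

import snowflake.connector

# Option 1: browser-based SSO (works with any SAML 2.0 identity provider).
conn = snowflake.connector.connect(
    account="<account_identifier>",
    user="user@example.com",
    authenticator="externalbrowser",
)

# Option 2: native SSO through Okta, with no browser pop-up (Okta only).
conn = snowflake.connector.connect(
    account="<account_identifier>",
    user="user@example.com",
    password="<okta_password>",
    authenticator="https://<your_okta_domain>.okta.com",
)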
Problem: Habitica is a habit-tracking app, but its personal data logs are not as detailed as I want. I want to create a local log of when I mark off habits/to-dos in the app. Habitica offers webhooks that trigger when habits/to-dos are checked off, which seems perfect for what I want, but how do I turn these triggers into a local log? I would like to use Python for this.
Ideas: It seems to me that I would need to set up some kind of personal cloud server to receive this data, turn it into a log, and then store it for download. I have previously deployed a Flask app using Heroku, so if this could be done similarly, that would be ideal. However, I don't know much about this, so I would welcome any ideas or advice.
Creating the Habitica webhook receiver as a Flask application is a good approach.
Heroku supports Python/Flask very nicely; however, the file system is ephemeral, so it gets wiped out at every application restart.
In order to persist the data, you can look at various options (see the sketch after this list):
save the file to AWS S3
save the data into a DB (Heroku has a free plan for PostgreSQL)
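A minimal sketch of the PostgreSQL option, assuming Flask and psycopg2 on Heroku, which sets the DATABASE_URL environment variable when you attach Heroku Postgres; the /habitica route and the habit_log table are hypothetical names:

import json
import os

import psycopg2
from flask import Flask, request

app = Flask(__name__)

@app.route("/habitica", methods=["POST"])
def habitica_webhook():
    payload = request.get_json()
    conn = psycopg2.connect(os.environ["DATABASE_URL"])
    try:
        with conn, conn.cursor() as cur:
            # Store the raw payload with a timestamp; parse fields out later.
            cur.execute(
                "INSERT INTO habit_log (received_at, payload) VALUES (now(), %s)",
                (json.dumps(payload),),
            )
    finally:
        conn.close()
    return "", 204

Point the Habitica webhook at the deployed app's /habitica URL, and you can then download or query the log straight from the database.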
I just started using Firebase + React to build a website. One of the planned features of my website is to crawl another site and show users the parsed data (e.g., stock price changes). I already have a Python crawler responsible for parsing the data, but I have no idea how to run this Python crawler (in the background) on my server in Firebase (or whether that is even possible).
Here is an example usage of my system:
users log in and subscribe to the websites/data they are interested in
my crawler parses those websites every hour and updates the data in the database
users can see a summary of the changes to each website from the database
One option I have in mind is running the crawler on my local machine and using the REST API to push the parsed data to the Firebase database. However, that seems a very inefficient/naive approach, because it loses much of the point of deploying my server with a cloud service like Firebase.
Firebase does not have any service/feature that allows you to periodically run Python or any other code. The closest thing to that is Cloud Functions, which can be triggered through an external service like cron-job.org.
For more on this, see:
Firebase Hosting Flask Application (on running Python on Firebase Hosting)
Using google cloud function to spawn a python script (for an elaborate way in which you might be able to run Python on Cloud Functions, though I never have, nor am I likely to ever try this myself)
Cloud Functions for Firebase trigger on time? (for running Cloud Functions periodically, either through App Engine or cron-job.org).
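That said, the local-machine idea from the question does work as a stopgap. A sketch of pushing parsed results into the Firebase Realtime Database over its REST API, assuming the requests library; the project URL, auth token, database path, and the crawl() helper are all placeholders:

import time

import requests

FIREBASE_URL = "https://<your-project>.firebaseio.com"
AUTH_TOKEN = "<token>"  # hypothetical; needed unless your rules allow writes

def crawl(site):
    # Placeholder for your existing crawler.
    return {"price": 123.45, "updated": time.time()}

def push_results(site_id, parsed):
    # PATCH merges the fields under /sites/<site_id> instead of replacing them.
    resp = requests.patch(
        FIREBASE_URL + "/sites/" + site_id + ".json",
        params={"auth": AUTH_TOKEN},
        json=parsed,
    )
    resp.raise_for_status()

while True:
    push_results("example-stock", crawl("https://example.com/stock"))
    time.sleep(3600)  # the hourly schedule from the question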
I am attempting to create a Python application on a Raspberry Pi that can access data stored in a db model on an App Engine application, specifically the latest entry in the datastore.
I have no experience doing this type of remote data access but have a fair bit of experience with App Engine and Python.
I have found very little on the subject of remote data access that I understand.
I would like to access the datastore directly, not text on a web page like this.
ProtoRPC looks like it might work, but I'm not familiar with it, and it looks pretty involved when I just need to access a few strings.
What would make the most sense to accomplish this? If an example is easy to provide, I would appreciate it.
What you're looking for is the App Engine Remote API:
https://cloud.google.com/appengine/docs/python/tools/remoteapi
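A minimal sketch of fetching the latest entry from the Pi, assuming the App Engine Python SDK is installed locally and remote_api is enabled in your app.yaml builtins; the StatusEntry model and its fields are hypothetical stand-ins for your own model:

from google.appengine.ext import ndb
from google.appengine.ext.remote_api import remote_api_stub

class StatusEntry(ndb.Model):  # must mirror the model defined in your app
    value = ndb.StringProperty()
    created = ndb.DateTimeProperty(auto_now_add=True)

remote_api_stub.ConfigureRemoteApiForOAuth(
    "<your-app-id>.appspot.com", "/_ah/remote_api"
)

# Fetch only the newest entry.
latest = StatusEntry.query().order(-StatusEntry.created).get()
print(latest.value if latest else "no entries yet")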
Google App Engine doesn't allow direct access to its databases from your local Python script. Instead, only an application hosted on App Engine's servers can access that data for your application.
Essentially, you're looking for a Google App Engine-compatible, automatic, RESTful API. Several exist and have been discussed here. YMMV with the various frameworks discussed there.
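If you go the RESTful-API route, it can be as small as a single handler. A sketch assuming the webapp2 framework bundled with App Engine; LatestHandler, StatusEntry, and the /api/latest route are hypothetical:

import json

import webapp2
from google.appengine.ext import ndb

class StatusEntry(ndb.Model):
    value = ndb.StringProperty()
    created = ndb.DateTimeProperty(auto_now_add=True)

class LatestHandler(webapp2.RequestHandler):
    def get(self):
        # Return only the newest datastore entry as JSON.
        latest = StatusEntry.query().order(-StatusEntry.created).get()
        self.response.headers["Content-Type"] = "application/json"
        self.response.write(json.dumps({"value": latest.value if latest else None}))

app = webapp2.WSGIApplication([("/api/latest", LatestHandler)])

# On the Pi, the client side is then a one-liner with requests:
#   requests.get("https://<your-app-id>.appspot.com/api/latest").json()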