I am developing a Python Django app for my project. Due to the nature of one of my apps, I need to run a certain script for a long period of time (maybe several hours).
Obviously everything is fine in my local environment. However, when I publish the app to Azure, it crashes after a period of time due to the max. execution time (it does not give an error related to max. execution time; instead it throws an internal server error).
At this point I have 2 questions:
Is it possible to increase the max. execution time for a Python web app in Azure? If yes, how can I do that?
Should I be using some other Azure service rather than a web app for such an operation?
Thank you.
You can try to leverage WebJobs to run your scripts or programs as background tasks on demand, continuously, or on a schedule.
At the same time, note that web apps are unloaded by default if they are idle for some period of time; this lets the system conserve resources. In Basic or Standard mode, you can enable Always On to keep the app loaded all the time. If you need to run a continuous or long-running job or task, you should enable Always On.
You can modify this setting in the management portal; refer to Configure web apps in Azure App Service for details.
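For example, a scheduled WebJob for Python can be as small as a run.py script zipped together with a settings.job file; this is only a minimal sketch, and the schedule below is illustrative:

```python
# run.py - the entry point the WebJobs runtime looks for.
# Deploy it zipped together with a settings.job file containing a
# six-field NCRONTAB schedule, e.g. {"schedule": "0 0 2 * * *"}
# (runs daily at 02:00; adjust to taste).
import sys
import time

def long_task():
    # placeholder for the multi-hour script from the question
    for step in range(10):
        time.sleep(1)
        print(f"step {step} done", flush=True)  # appears in the WebJobs log

if __name__ == "__main__":
    long_task()
    sys.exit(0)  # a zero exit code marks the run as successful
```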
Related
So far when dealing with web scraping projects, I've used GAppsScript, meaning that I can easily trigger the script to be run once a day.
Is there an equivalent service when dealing with Python scripts? I have a Raspberry Pi, so I guess I can keep it on 24/7 and use cron jobs to trigger the script daily. But that seems rather wasteful, since I'm talking about a few small scripts that take only a few seconds to run.
Is there any service that allows me to trigger a Python script once a day (without needing to keep a local machine on 24/7)? The simpler the solution the better; I wouldn't want to overengineer such a basic use case if a ready-made system already exists.
The only service I've found so far that does this is WayScript, and here's a Python example running in the cloud. The free tier should be enough for most simple/hobby-tier use cases.
I have a number of very simple Python scripts that are constantly hitting API endpoints 24/7. The amount of data being pulled is very minimal, and they all query the APIs every few seconds. My question is: is it okay to run multiple simple scripts with tmux on a single-core Lightsail instance, or is it better practice to create a new instance for each Python script?
I don't see any limits mentioned in Lightsail for your use case. As long as the endpoints are owned by you, or you don't get blocked for hitting them continuously, all seems good.
https://aws.amazon.com/lightsail/faq/
You can also set some alarms on Lightsail instance usage to know if you've hit any limits.
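For what it's worth, each script can be as simple as a polite polling loop running in its own tmux window; a minimal sketch (the endpoint and handler below are hypothetical):

```python
import time
import requests

API_URL = "https://api.example.com/data"  # hypothetical endpoint

def handle(payload):
    # placeholder for the minimal processing described in the question
    print(payload, flush=True)

def poll_forever(interval_seconds=5):
    while True:
        try:
            resp = requests.get(API_URL, timeout=10)
            resp.raise_for_status()
            handle(resp.json())
        except requests.RequestException as exc:
            # back off on errors so you don't hammer the endpoint
            print(f"request failed: {exc}", flush=True)
            time.sleep(30)
        time.sleep(interval_seconds)

if __name__ == "__main__":
    poll_forever()
```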
I have a script that downloads larger amounts of data from an API. The script takes around two hours to run. I would like to run the script on GCP and schedule it to run once a week on Sundays, so that we have the newest data in our SQL database (also on GCP) by the next day.
I am aware of cron jobs, but would not like to run an entire server just for this single script. I have taken a look at Cloud Functions and Cloud Scheduler, but because the script takes so long to execute, I cannot run it on Cloud Functions, as the maximum execution time is 9 minutes (from here). Is there any other way I could schedule the Python script to run?
Thank you in advance!
To run a script for more than one hour, you need to use Compute Engine (Cloud Run can live for only one hour).
However, you can still use Cloud Scheduler. Here's how:
Create a Cloud Scheduler job with the frequency that you want
On this scheduler job, call the Compute Engine start API
In the advanced part, select a service account (create one or reuse one) which has the right to start a VM instance
Select OAuth token as the authentication mode (not OIDC)
Create a Compute Engine instance (the one that the Cloud Scheduler job will start)
Add a startup script that triggers your long job (see the sketch below)
At the end of the script, add a line to shut down the VM (with gcloud, for example)
Note: the startup script runs as the root user. Take care with the default home directory and the permissions of the files it creates.
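For example, a minimal sketch of the job the startup script could launch (the file path, API endpoint, and loading step are hypothetical placeholders):

```python
# long_job.py - assume the VM's startup script runs it with:
#   python3 /opt/jobs/long_job.py
import subprocess
import requests  # assumed to be installed on the VM image

def pull_data():
    # stands in for the ~2 hour download from the API
    resp = requests.get("https://api.example.com/export", timeout=60)  # hypothetical endpoint
    resp.raise_for_status()
    return resp.content

def main():
    data = pull_data()
    # ... load `data` into the SQL database on GCP here ...
    # The startup script runs as root (see the note above), so the VM
    # can power itself off once the job is done; a stopped instance
    # stops incurring compute charges.
    subprocess.run(["shutdown", "-h", "now"], check=True)

if __name__ == "__main__":
    main()
```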
I have a Python script that pulls some data from an Azure Data Lake cluster, performs some simple compute, then stores it into a SQL Server DB on Azure. The whole shebang runs in about 20 seconds. It needs sqlalchemy, pandas, and some Azure data libraries. I need to run this script daily. We also have a Service Fabric cluster available to use.
What are my best options? I thought of containerizing it with Docker and making it into an HTTP-triggered API, but then how do I trigger it once per day? I'm not good with Azure or microservices design, so this is where I need the help.
You can use WebJobs in App Service. There are two types of Azure WebJobs to choose from: Continuous and Triggered. From what you describe, you need the Triggered type.
You could refer to the document here for more details. In addition, here is how to run tasks in WebJobs.
Also, you can use a timer-triggered Azure Function in Python, which was made generally available in recent months.
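A minimal timer-triggered function sketch (the schedule and the body are illustrative; the Data Lake/SQL steps are placeholders):

```python
# __init__.py of the function. The accompanying function.json binds a
# timer trigger, for example:
#   {"bindings": [{"name": "mytimer", "type": "timerTrigger",
#                  "direction": "in", "schedule": "0 0 6 * * *"}]}
# ("0 0 6 * * *" = daily at 06:00 UTC; pick your own schedule.)
import datetime
import logging

import azure.functions as func

def main(mytimer: func.TimerRequest) -> None:
    if mytimer.past_due:
        logging.info("The timer is past due!")
    # placeholders: pull from the Data Lake, compute with pandas,
    # then write to the Azure SQL DB via sqlalchemy
    logging.info("Daily job ran at %s", datetime.datetime.utcnow().isoformat())
```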
I have a web application that runs locally only (it does not run on a remote server). The web application is basically just a UI to adjust settings and see some information about the main application. A web UI was used instead of a native application due to portability and ease of development. Now I want to start and stop the main application through a button in the web application. However, I couldn't find a suitable way to start an asynchronous, managed task locally. I saw there is a library called celery, but it seems to be suited to a distributed environment, which mine is not.
My main need is to be able to start/stop the task, as well as to check if the task is running (so I can display that in the UI). Is there any way to achieve this?
celery can work just fine locally. Distributed is just someone else's computer after all :)
You will have to install all the same requirements and the like. You can kick off workers by hand, or as a service, just like in the celery docs.
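A minimal local sketch (assuming a Redis broker on localhost; any broker celery supports would do, and the helper names are illustrative):

```python
# tasks.py - start a local worker with: celery -A tasks worker --loglevel=info
import time

from celery import Celery
from celery.result import AsyncResult

app = Celery("tasks",
             broker="redis://localhost:6379/0",
             backend="redis://localhost:6379/0")
app.conf.task_track_started = True  # report STARTED instead of jumping PENDING -> SUCCESS

@app.task
def main_application():
    # placeholder for the long-running main application
    while True:
        time.sleep(1)

# --- helpers the web UI's views could call (names are illustrative) ---

def start():
    result = main_application.delay()
    return result.id  # persist this id to query or stop the task later

def is_running(task_id):
    return AsyncResult(task_id, app=app).state == "STARTED"

def stop(task_id):
    # terminate=True kills the worker process executing the task
    app.control.revoke(task_id, terminate=True)
```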