Package and schedule a Python folder using Azure

I have a Python folder that contains a main file and its supporting files.
I run the main file and it calls the other files.
I now need to package this and schedule it on Azure so it runs every hour.
Can someone guide me on how to do this?

A great way to run Python code in Azure without any supporting infrastructure to maintain is Azure Functions: you only pay for the execution of your code. Functions can be triggered on a timer, such as every hour.
To get started, see: https://learn.microsoft.com/en-us/azure/azure-functions/functions-reference-python
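As a sketch (not from the linked docs, and with a placeholder entry point), an hourly timer trigger in the Azure Functions Python v2 programming model looks roughly like this:

    import logging
    import azure.functions as func

    app = func.FunctionApp()

    # NCRONTAB format: {second} {minute} {hour} {day} {month} {day-of-week};
    # "0 0 * * * *" fires at the top of every hour.
    @app.timer_trigger(schedule="0 0 * * * *", arg_name="timer")
    def hourly_job(timer: func.TimerRequest) -> None:
        logging.info("Hourly run starting")
        # Placeholder: import and call your existing main entry point here,
        # e.g. from my_package.main import run; run()

Your supporting files deploy alongside this function_app.py, and packages listed in requirements.txt are installed for you when you deploy.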

Related

Run a Python script from Power Automate when a file gets created in OneDrive for Business

I need to run a Python script when an Excel file gets dropped into OneDrive for Business. The Python script needs to read that file and then perform some actions. I have been searching around and I see that an Azure Functions app is the way to go. I wanted to know if there are any other options to get this done. I see that Power Automate Desktop runs Python 2 code, but I need Python 3; is there any way around that? Could I call the Python script (with the file in OneDrive as a variable) using PowerShell?

Can Google Apps Script call a Python script?

I have a Python script that uses a Python API to fetch data from a data provider and, after manipulating the data, writes some of it to Google Sheets (via the Google Sheets Python API). So my workflow to update the data is to open the Python file in VS Code, run it from there, then switch to the spreadsheet to see the updated data.
Is there a way to call this Python script from Google Sheets using Google Apps Script? If so, that would be more efficient; I could link the Apps Script to a macro button on the spreadsheet.
Apps Script runs in the cloud on Google's servers rather than on your computer, and has no access to local resources such as Python scripts on your system.
To call a resource in the cloud, use the URL Fetch Service.
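For illustration (my sketch, not part of the original answer), the cloud resource that URL Fetch calls could be as small as this standard-library Python endpoint; the port and response shape are arbitrary choices:

    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class Handler(BaseHTTPRequestHandler):
        def do_POST(self):
            # Read the JSON body sent by UrlFetchApp.fetch(...) in Apps Script
            length = int(self.headers.get("Content-Length", 0))
            params = json.loads(self.rfile.read(length) or b"{}")
            # ... run your data-fetching / sheet-writing code here ...
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(json.dumps({"ok": True, "received": params}).encode())

    if __name__ == "__main__":
        HTTPServer(("", 8080), Handler).serve_forever()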
Currently, I am using the following hack.
The Apps Script behind the sheet's command button writes parameters into a sheet called Parameters. A Python function on my local machine checks that Parameters sheet in the workbook every 5 seconds. If there are no parameters, it exits; if there are, it executes the main code.
When the code is deployed on the service account, the polling portion stays inactive and the Apps Script calls the Python code on the service account directly.
There are many reasons why I need to call a Python function on my LOCAL machine from the sheet. One reason is that debugging is better on a local machine and cumbersome on a service account. Another is that certain files live on local machines and we do not want to move them to the workspace, yet the sheet needs data from those files.
This is a hack, and I am looking for a better way than this "Python code keeps polling" method.
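A minimal sketch of that polling loop, assuming gspread with service-account credentials already configured; the workbook name and run_main are placeholders:

    import time
    import gspread

    def run_main(params):
        ...  # placeholder for the real main code

    gc = gspread.service_account()          # credentials path assumed configured
    params_ws = gc.open("MyWorkbook").worksheet("Parameters")

    while True:
        params = params_ws.get_all_values() # poll the Parameters sheet
        if params:
            run_main(params)                # parameters found: run the main code
            params_ws.clear()               # consume them so they don't rerun
        time.sleep(5)                       # poll every 5 seconds, as described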

Easiest Way to Pack all Dependencies into a Python Script (NOT EXE)

I was wondering: is there any easy way to pull all the dependencies of my Python script into a file (or files) that I could include with the script in order to run it on Airflow? This would be a huge time saver for me.
To be clear, it CANNOT be an exe; the Airflow scheduler runs Python scripts, not exes.
(My company uses the Airflow scheduler, and the team that supports it is very slow. Every time I need a new package installed on Airflow as a dependency of a script, it takes months of confusion, multiple tickets, and wasted time. I don't have the access level to fix it myself, and they never give that access level to developers.)
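One common approach (my sketch, not from this thread) is to vendor the packages into a folder shipped alongside the script, via pip install -r requirements.txt --target ./vendor, and then put that folder on sys.path before any third-party import:

    import os
    import sys

    # Make the bundled ./vendor folder take precedence over whatever is
    # (or isn't) installed on the Airflow workers.
    sys.path.insert(0, os.path.join(os.path.dirname(os.path.abspath(__file__)), "vendor"))

    # Third-party imports now resolve from ./vendor first, e.g.:
    # import requests

Note this works cleanly for pure-Python packages; compiled dependencies must match the workers' platform and Python version.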

How to schedule longer Python scripts on GCP without Cloud Functions

I have a script that downloads a large amount of data from an API. The script takes around two hours to run. I would like to run it on GCP and schedule it to run once a week on Sundays, so that we have the newest data in our SQL database (also on GCP) by the next day.
I am aware of cron jobs, but I would not like to run an entire server just for this single script. I have looked at Cloud Functions and Cloud Scheduler, but because the script takes so long to execute I cannot run it on Cloud Functions, as the maximum execution time is 9 minutes (from here). Is there any other way to schedule the Python script?
Thank you in advance!
To run a script for more than 1 hour, you need to use Compute Engine (a Cloud Run request can live for at most 1 hour).
However, you can still use Cloud Scheduler. Here is how:
Create a Cloud Scheduler job with the frequency that you want.
On this job, call the Compute Engine start API.
In the advanced part, select a service account (create one or reuse one) that has the right to start a VM instance.
Select OAuth token as the authentication mode (not OIDC).
Create a Compute Engine instance (the one the Cloud Scheduler job will start).
Add a startup script that triggers your long job (see the sketch after this note).
At the end of the script, add a line to shut down the VM (with gcloud, for example).
Note: the startup script runs as the root user. Take care with the default home directory and the permissions of the files it creates.
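A sketch of the job the startup script would launch; the names and the shutdown mechanism are my assumptions, not the answerer's. The point is the final self-shutdown, so the VM only bills while the job runs:

    import subprocess

    def run_long_job():
        ...  # placeholder for the ~2 hour download into the SQL database

    if __name__ == "__main__":
        try:
            run_long_job()
        finally:
            # The startup script runs as root, so the VM can power itself
            # off directly; `gcloud compute instances stop ...` also works.
            subprocess.run(["poweroff"], check=False)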

Where to run a Python 3.8 script online

I have a folder with different folders in it.
In one of those folders I have a Python script.
The script reads an Excel file (which is in the same folder), scrapes information from the internet, updates the Excel file, and creates another Excel file in the main directory.
My question is:
As I can't run my computer non-stop, I imagine it's possible (easy? and free?) to upload all my folders to a website that will let me run my Python (3.8) script. Do you have any suggestions? Which website would be appropriate? Pythonanywhere.com?
Plus, I'd like to run this script every morning at 6 am.
Thank you for your answers! :)
Yes, you could use PythonAnywhere: free accounts allow you to create one scheduled task, which can run once a day. If you have an account, you can set it up on the "Tasks" page.
Some public cloud providers, such as GCP, AWS, and Azure, offer free-tier VMs. Simply run the code on one of those and set up a cron job. Though the network usage will probably still cost you a few cents a month, this is a very cheap way to go. You could also consider a FaaS solution at very low cost.
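On such a VM the schedule is a single crontab line; for example (the interpreter and script paths are placeholders), this runs the script every morning at 06:00:

    0 6 * * * /usr/bin/python3 /home/user/project/main.py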
As @Klaus said, this is not a programming question, but if you are on Linux you can use crontab to schedule your process.
And if you want to run it in the cloud you can use free services like Heroku.
