Schedule a task in Azure to request a Container App - Python

I have a Container App in the Azure cloud which hosts an API.
My goal is to execute a task which sends some specific requests to this API and e-mails the results. I want to run it twice a day. Let's say it's a single Python script, task.py.
What would be the best option to achieve that?
So far I've seen the following options:
Azure Functions
Logic Apps
Container Registry tasks?
Please note that the container image used in the Container App is stored in Container Registry and contains the task.py script too.
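Of the options listed, a timer-triggered Azure Function is usually the simplest fit for a twice-daily job. Below is a minimal sketch using the Python v2 programming model; the API URL, addresses, and SMTP server are placeholders, not values from the question:

# function_app.py - a minimal sketch of a timer-triggered Azure Function
# (Python v2 programming model). URL and mail settings are placeholders.
import smtplib
from email.message import EmailMessage

import azure.functions as func
import requests

app = func.FunctionApp()

# NCRONTAB schedule: at 06:00 and 18:00 UTC, i.e. twice a day.
@app.timer_trigger(schedule="0 0 6,18 * * *", arg_name="timer")
def run_task(timer: func.TimerRequest) -> None:
    # Call the Container App's API (hypothetical endpoint).
    response = requests.get("https://my-container-app.azurecontainerapps.io/api/report")
    response.raise_for_status()

    # E-mail the results (hypothetical SMTP server and addresses).
    msg = EmailMessage()
    msg["Subject"] = "task.py results"
    msg["From"] = "bot@example.com"
    msg["To"] = "team@example.com"
    msg.set_content(response.text)
    with smtplib.SMTP("smtp.example.com") as smtp:
        smtp.send_message(msg)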

Related

Communicate with Docker containers via API

I'm creating an API (FastAPI or Flask) that will be deployed in a Docker container. Its main objective will be to serve as a 'controller' that launches 2-3 Python apps via specific endpoints.
Each Python app is installed in a separate Docker container. Each app does a specific job - extracts data, creates an image, etc.
What options do I have to allow the 'controller' to communicate with the apps living in separate Docker containers?
While exploring, I identified the following routes:
Install the Docker CLI in the 'controller' container, and designate endpoints that run the 'docker start' command to launch each Python app.
Create a separate REST API for each of the app containers, and allow the 'controller' to interact with each app via HTTPS.
Is there anything I'm missing? Maybe there are better routes?
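If the controller container can reach the host's Docker socket, the first route doesn't even need the Docker CLI. Here is a minimal sketch using the Docker SDK for Python; the container names and the socket mount are assumptions:

# controller.py - a minimal sketch of the first route, using the Docker SDK
# for Python instead of the Docker CLI. Assumes /var/run/docker.sock is
# mounted into the controller container and the app containers already exist.
import docker
from flask import Flask

app = Flask(__name__)
client = docker.from_env()  # talks to the mounted Docker socket

@app.route("/run/<name>", methods=["POST"])
def run_app(name: str):
    # "name" would be a hypothetical app container such as "extractor".
    container = client.containers.get(name)
    container.start()
    return {"started": name}

Note that mounting the Docker socket effectively gives the controller root-level control of the host, so the REST-API-per-app route is the safer design.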

Deploy the scheduler application on multiple servers without running all of them

I have a Python app that has a scheduler, and I want to deploy it on multiple servers.
Problem:
If I deploy my app to multiple servers, all the schedulers run, but I only need one of them.
I don't want to define a field in the database and use it to determine whether the scheduler should run or not; I am looking for a solution that doesn't save anything anywhere.
Thanks.
Disable the scheduler and schedule it from outside via a microservice. For example, if you want to do this with open source, you can use Airflow or Prefect. If you are on AWS, you can use EventBridge and a Lambda microservice for this purpose.
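One way to follow that advice without storing any state: expose the job as an HTTP endpoint and let a single external scheduler call it through the load balancer, which routes each tick to exactly one instance. A minimal sketch, with a hypothetical endpoint name:

# A minimal sketch: the app no longer schedules itself; an external
# scheduler (EventBridge + Lambda, Airflow, Prefect, plain cron, ...)
# POSTs to this endpoint via the load balancer, so only one of the
# deployed instances handles each tick.
from flask import Flask

app = Flask(__name__)

@app.route("/jobs/nightly", methods=["POST"])  # hypothetical endpoint
def nightly_job():
    do_the_scheduled_work()
    return {"status": "done"}

def do_the_scheduled_work():
    ...  # the body of your existing scheduled job goes here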

How to list docker images in Google container registry (GCR) through python script?

I'm trying to write a Python script to auto-delete old prerelease Docker images that are taking up space in Google Container Registry. I'm stuck on how to authenticate to Google Cloud in a Python script and list the Docker images in GCR.
Google Container Registry (GCR) implements (the de facto standard) Docker Registry API HTTP V2.
This means that you can use docker, podman, and other tools that implement these APIs to interact with GCR, and it means that you should use a 3rd-party client library (often provided by Docker) to implement interaction with a Docker Registry such as GCR.
Google (!) documents this on the page Docker Registry API
There is no Google API to interact with GCR.
You can demonstrate this to yourself by running e.g. gcloud container images list with the --log-http flag to show that the underlying REST calls are of the form https://gcr.io/v2/{repo}/tags/list. This is documented on Docker's API page for GET tags.
I've not used this but Docker provides a Docker SDK for Python and this should enable you to interact with any registry that implements the Docker Registry API HTTP V2 such as GCR.
You must authenticate to GCR using a suitably permitted Google identity. This is documented in Authenticating using the Docker Registry API.
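Putting those pieces together, here is a minimal sketch that authenticates with Application Default Credentials and calls the tags/list endpoint directly; the repository name is a placeholder:

# A minimal sketch: list the tags of one GCR repository via the Docker
# Registry HTTP API V2. "my-image" is a placeholder repository name.
import google.auth
import google.auth.transport.requests
import requests

credentials, project = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"]
)
credentials.refresh(google.auth.transport.requests.Request())

resp = requests.get(
    f"https://gcr.io/v2/{project}/my-image/tags/list",
    auth=("oauth2accesstoken", credentials.token),  # documented GCR auth form
)
resp.raise_for_status()
print(resp.json())  # e.g. {"name": "...", "tags": [...]}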
You can use the Google Cloud Python clients in your Python script.
Then use a service account and download a JSON key from the GCP IAM page.
You have to give the needed permissions to this service account in the IAM page.
Before running the main of your Python script, you can authenticate to GCP in your bash terminal:
export GOOGLE_APPLICATION_CREDENTIALS=your_path/your_service_account.json
To list your Docker images, you can also use a shell script instead of a Python script. The following command lists all the images of the current project in GCR:
gcloud container images list
As explained before, in order to use gcloud commands, you have to be authenticated with your service account or another identity.
https://cloud.google.com/sdk/gcloud/reference/container/images/list
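If you would rather stay in Python but lean on gcloud, you can also shell out to it and parse its JSON output. A minimal sketch, assuming the gcloud CLI is installed and authenticated as described above:

# A minimal sketch: call gcloud from Python and parse its JSON output.
import json
import subprocess

result = subprocess.run(
    ["gcloud", "container", "images", "list", "--format=json"],
    capture_output=True, text=True, check=True,
)
for image in json.loads(result.stdout):
    print(image["name"])  # e.g. gcr.io/your-project/your-image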

Deploying Python script daily on Azure

I have a Python script that pulls some data from an Azure Data Lake cluster, performs some simple compute, then stores it into a SQL Server DB on Azure. The whole shebang runs in about 20 seconds. It needs sqlalchemy, pandas, and some Azure data libraries. I need to run this script daily. We also have a Service Fabric cluster available to use.
What are my best options? I thought of containerizing it with Docker and making it into an HTTP-triggered API, but then how do I trigger it once per day? I'm not good with Azure or microservices design, so this is where I need the help.
You can use WebJobs in Azure App Service. There are two types of Azure WebJobs for you to choose from: Continuous and Triggered. From your description, you need the Triggered type.
You could refer to the document here for more details; in addition, here is how to run tasks in WebJobs.
Also, you can use a timer-triggered Azure Function in Python, which was made generally available in recent months.
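For the WebJobs route, here is a minimal sketch of what the triggered job's script might look like; every name and connection string is a placeholder. The daily schedule would live in a settings.job file next to the script, e.g. {"schedule": "0 0 8 * * *"} for 08:00 every day:

# run.py - a minimal sketch of a Triggered WebJob script. The Data Lake
# read, connection string, and table name are all placeholders.
import pandas as pd
from sqlalchemy import create_engine

def pull_from_data_lake() -> pd.DataFrame:
    # Placeholder for the Azure Data Lake read described in the question.
    return pd.DataFrame({"value": [1, 2, 3]})

df = pull_from_data_lake()
df["doubled"] = df["value"] * 2  # the "simple compute" step

# Placeholder Azure SQL connection string.
engine = create_engine("mssql+pyodbc://user:pass@server/db?driver=ODBC+Driver+17+for+SQL+Server")
df.to_sql("daily_results", engine, if_exists="replace", index=False)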

Python application logging with Azure Log Analytics

I have a small Python (Flask) application running in a Docker container.
The container orchestrator is Kubernetes, all running in Azure.
What is the best approach to set up centralized logging? (similar to Graylog)
Is it possible to get the application logs over OMS to Azure Log Analytics?
Thank you,
Tibor
I have a similar requirement. I have a continuously running Python application in a Docker container. So far I have found that the Azure SDK for Python supports lots of integration with Azure from Python. This page might be able to help:
https://pypi.org/project/azure-storage-logging/
Here are also a package and a guide on how to set up Blob Storage and enable logging:
https://github.com/Azure/azure-storage-python/tree/master/azure-storage-blob
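A minimal sketch wiring the standard logging module to the azure-storage-logging package linked above; the account name, key, and table are placeholders:

# A minimal sketch using the azure-storage-logging package linked above.
# The storage account name, key, and table name are placeholders.
import logging
from azure_storage_logging.handlers import TableStorageHandler

handler = TableStorageHandler(
    account_name="mystorageaccount",  # placeholder
    account_key="<account-key>",      # placeholder
    table="flasklogs",                # placeholder
)
logger = logging.getLogger("my-flask-app")
logger.setLevel(logging.INFO)
logger.addHandler(handler)

logger.info("application started")  # lands in Azure Table storage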
