Azure Blob storage trigger function using Docker - python

I am making a pipeline using Python and I found that Azure's default container does not support the libsndfile library. So I am trying to use Docker to build a container that includes libsndfile. However, I have not used Docker before, so I need some help.
The function app that I made is a blob-storage-triggered function app.
upload to blob storage (blob triggered) -> Processing (function app) -> copy to another blob storage (output)
The questions are:
Is it possible to make a blob-storage-triggered function app using Docker?
If it is possible, can you give me some hints on how to use Docker?

In cases where your functions require a specific language version, dependency, or configuration that isn't provided by the built-in image, you typically use a custom image. Here, you can create and deploy your code to Azure Functions as a custom Docker container using a Linux base image.
In summary, you can create an Azure Function App from a Docker image using the Azure CLI like below:
az functionapp create --name <app_name> --storage-account <storage_name> --resource-group AzureFunctionsContainers-rg --plan myPremiumPlan --runtime <functions runtime stack> --deployment-container-image-name <docker_id>/azurefunctionsimage:v1.0.0
Do check out the official step-by-step tutorial on creating a function on Linux using a custom container and you are good to go! It also shows you how to create output bindings.
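Inside the custom container, the blob-triggered function itself is plain Azure Functions Python code, with the output binding handling the copy to the second blob storage. A minimal sketch (the binding names inputblob/outputblob are whatever you declare in function.json; the processing step is only a placeholder for your libsndfile-based code):

import logging
import azure.functions as func

def main(inputblob: func.InputStream, outputblob: func.Out[bytes]) -> None:
    # Runs when a blob lands in the input container (blob trigger binding).
    logging.info("Processing blob: %s (%s bytes)", inputblob.name, inputblob.length)
    data = inputblob.read()
    # ... process the audio here, e.g. with the soundfile package, which is
    # what needs libsndfile and therefore the custom image ...
    # Write the result through the blob output binding.
    outputblob.set(data)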

Related

How to list docker images in Google container registry (GCR) through python script?

I'm trying to write a Python script to auto-delete old prerelease Docker images that are taking up space in Google Container Registry. I'm stuck on how to authenticate to Google Cloud in a Python script and list the Docker images in GCR.
Google Container Registry (GCR) implements (the de facto standard) Docker Registry API HTTP V2.
This means that you can use docker, podman, and other tools that implement this API to interact with GCR, and it means that you should use a third-party client library (often provided by Docker) to implement interaction with a Docker registry such as GCR.
Google (!) documents this on the page Docker Registry API
There is no Google API to interact with GCR.
You can demonstrate this to yourself by running e.g. gcloud container images list with the --log-http flag to show that the underlying REST calls are of the form https://gcr.io/v2/{repo}/tags/list. This is documented on Docker's API page for GET tags.
I've not used this, but Docker provides a Docker SDK for Python, and this should enable you to interact with any registry that implements the Docker Registry API HTTP V2, such as GCR.
You must authenticate to GCR using a suitably permitted Google identity. This is documented under Authenticating using the Docker Registry API.
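As a minimal sketch, you can also call the Registry API directly with requests, using an OAuth2 access token from Application Default Credentials (the project and image names below are placeholders):

import google.auth
import google.auth.transport.requests
import requests

PROJECT = "my-project"  # placeholder
IMAGE = "my-image"      # placeholder

# Obtain an OAuth2 access token from Application Default Credentials.
credentials, _ = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"])
credentials.refresh(google.auth.transport.requests.Request())

# GCR accepts the token as the password for the "oauth2accesstoken" user.
resp = requests.get(
    f"https://gcr.io/v2/{PROJECT}/{IMAGE}/tags/list",
    auth=("oauth2accesstoken", credentials.token),
)
resp.raise_for_status()
print(resp.json()["tags"])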
You can use the Google Cloud Python clients in your Python script.
Then use a service account and download a JSON key from the GCP IAM page.
You have to grant the needed permissions to this service account on the IAM page.
Before running the main part of your Python script, you can authenticate to GCP in your bash terminal with:
export GOOGLE_APPLICATION_CREDENTIALS=your_path/your_service_account.json
To list your Docker images, you can also use a shell script instead of a Python script. The following command lists all the images of the current project in GCR:
gcloud container images list
As explained before, in order to use gcloud commands, you have to be authenticated with your service account or another identity.
https://cloud.google.com/sdk/gcloud/reference/container/images/list
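If you prefer to stay in Python, a small sketch that shells out to gcloud and parses its JSON output (it assumes the gcloud CLI is installed and authenticated):

import json
import subprocess

# Ask gcloud for machine-readable output instead of the default table.
result = subprocess.run(
    ["gcloud", "container", "images", "list", "--format=json"],
    capture_output=True, text=True, check=True,
)
for image in json.loads(result.stdout):
    print(image["name"])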

How to host Azure Python API via Azure Data Storage

I have python scripts that I want to host as a web app in azure.
I want to keep the function scripts in azure data storage and host them as a python API in azure.
I want some devs to be able to change the scripts in the azure data storage and reflect the changes live without having to deploy.
How can I go about doing this?
Create a Function App with the Python runtime stack (supported only on Linux) in any hosting plan through the Azure Portal.
Set Continuous Deployment to GitHub under the Deployment Center of the function app in the portal.
After authorizing, provide your GitHub repository details in the same section.
Create the Azure Python Functions in VS Code and push them to the GitHub repository (cloned via Git Clone in the Command Palette).
After that, you can deploy the functions to the Azure function app.
After publishing to Azure, you'll get the Azure Functions Python REST API.
I want some devs to be able to change the scripts in the azure data storage and reflect the changes live without having to deploy.
Whenever you or the devs change the code in GitHub and commit the changes, they are automatically reflected in the function app in the Azure Portal.
For more information, please refer to this article and to GitHub Actions for editing the code/script files.

Deploy FastAPI microservice in Kubernetes via OpenFaaS

I have a big application structured with FastAPI (with many routers) that runs in AWS Lambda. I want to migrate it to a container inside Kubernetes. From my research, OpenFaaS is a great solution.
However, I can't find documentation on how to do this.
Does anyone have references or a better solution?
If you are using Python or Ruby, you can create a Dockerfile, use it to build a Docker image, and simply deploy it on Kubernetes. For example, for Ruby:
FROM ruby:2.7-alpine3.11
WORKDIR /home/app
COPY . .
RUN bundle install
CMD ["ruby", "main.rb"]
For OpenFaaS, they have provided good labs with documentation on how to create async functions, etc.
Labs: https://github.com/openfaas/workshop
If you are looking for examples, you can check out the official repo: https://github.com/openfaas/faas/tree/master/sample-functions
Extra
There are also other good options: Knative and Kubeless.
You can find a Python Kubeless example with a CI/CD example here: https://github.com/harsh4870/kubeless-kubernetes-ci-cd
Try using a template to build an upstream FastAPI application as an OpenFaaS function. This will create a Docker image you can run and deploy in your Kubernetes cluster.
You can see how to do so in the following GitHub repo.

Mirror Docker container image to Google Container Registry using least dependencies/permissions

I need to perform the following from a python program:
docker pull foo/bar:tag
docker tag foo/bar:tag gcr.io/project_id/mirror/foo/bar:tag
gcloud auth configure-docker --quiet
docker push gcr.io/project_id/mirror/foo/bar:tag
I want to accomplish this with the minimal possible footprint - no root, no privileged Docker installation, etc. The Google Cloud SDK is installed.
How to programmatically mirror the image with minimal app footprint?
The Google Cloud Build API can be used to perform all your required steps in one command, or you can use a trigger:
gcloud builds submit --tag gcr.io/$DEVSHELL_PROJECT_ID/$IMAGE_NAME:v0.1 .
You can call the above using the Python Cloud Build API:
https://googleapis.dev/python/cloudbuild/latest/gapic/v1/api.html
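As a rough sketch, the same pull/tag/push mirror can be submitted as a build from Python with the google-cloud-build client (names are placeholders, and the client surface may vary slightly between library versions):

from google.cloud.devtools import cloudbuild_v1

PROJECT_ID = "project_id"  # placeholder
SOURCE = "foo/bar:tag"
TARGET = f"gcr.io/{PROJECT_ID}/mirror/foo/bar:tag"

client = cloudbuild_v1.CloudBuildClient()

build = cloudbuild_v1.Build(
    steps=[
        # Pull the upstream image and retag it for GCR.
        cloudbuild_v1.BuildStep(name="gcr.io/cloud-builders/docker",
                                args=["pull", SOURCE]),
        cloudbuild_v1.BuildStep(name="gcr.io/cloud-builders/docker",
                                args=["tag", SOURCE, TARGET]),
    ],
    # Images listed here are pushed to the registry when the build succeeds.
    images=[TARGET],
)

operation = client.create_build(project_id=PROJECT_ID, build=build)
print(operation.result().status)  # blocks until the build finishes

Because the docker steps run on Cloud Build's workers, no local root or privileged Docker installation is needed.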

Is it possible to create an HTTP triggered Python function on Azure Functions without doing any local coding?

I wonder whether it is possible to create an HTTP-triggered Python function on Azure Functions without doing any local coding? I want to do everything in the Azure cloud. My Python function code is in a GitHub/Azure Repos repository, but I do not have all the extra files of an Azure Functions project (for example, an __init__.py script file that is the HTTP trigger function of the function app). Is it possible to generate those files from Azure (without generating any Azure Functions related files on my local computer)? I noticed that we cannot do in-portal editing for Python function apps.
As far as I know, we can only deploy the Python function from local to the Azure cloud, not create it the way you expected. And I think it will not be too difficult to deploy the Python function from local to Azure.
Since you already have the main Python function code ("__init__.py"), you just need to sign in to Azure in VS Code and create a Python function by following this tutorial. Then use your __init__.py code to replace the new Python function code. After that, run the command below in the "TERMINAL" window to generate the "requirements.txt":
pip freeze > requirements.txt
The "requirements.txt" includes all of the modules which imported in your "init.py" and when the function deployed to azure, azure will install these modules by this "requirements.txt". I saw you mentioned you don't have all the extra files of this function project, if these modules are what you care about, the "requirements.txt" will solve your problem.
Then use this command to deploy it to Azure:
func azure functionapp publish hurypyfunapp --build remote
