How to run docker-compose using the Docker Python SDK

I would like to run docker-compose via the Python Docker SDK.
However, I couldn't find any reference on how to achieve this in the Python SDK documentation. I could also use subprocess, but I'm having other difficulties with that approach; see here: docker compose subprocess

I am working on the same issue and was looking for answers, but found nothing so far. The best approach I can suggest is to replicate the docker-compose logic yourself. For example, if your YAML file defines a network and services, create them separately using the Python Docker SDK and connect the containers to the network, as sketched below.
It gets cumbersome, but eventually you can get things working that way from Python.
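A minimal sketch of that idea with docker-py (the image names, network name, and ports below are placeholder assumptions, not taken from any particular compose file):
import docker

client = docker.from_env()

# Roughly what a "networks:" entry in a compose file would do.
client.networks.create("app_net", driver="bridge")

# Roughly what two "services:" entries would do.
client.containers.run(
    "postgres:13",                      # placeholder image
    detach=True,
    name="db",
    network="app_net",
    environment={"POSTGRES_PASSWORD": "example"},
)
client.containers.run(
    "nginx:latest",                     # placeholder image
    detach=True,
    name="web",
    network="app_net",
    ports={"80/tcp": 8080},
)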

I created a package to make this easy: python-on-whales
Install with
pip install python-on-whales
Then you can do
from python_on_whales import docker
docker.compose.up()
docker.compose.stop()
# and all the other commands.
You can find the documentation and a link to the package's GitHub repository here: https://gabrieldemarmiesse.github.io/python-on-whales/
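If your compose file lives somewhere else, you can also point the client at it explicitly (a minimal sketch; the file path here is just an example):
from python_on_whales import DockerClient

docker = DockerClient(compose_files=["./my-project/docker-compose.yml"])  # example path
docker.compose.up(detach=True)
docker.compose.down()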

Related

Deploy FastAPI microservice in Kubernetes via OpenFaaS

I have a big application structured with FastAPI (with many routers) that runs in AWS Lambda. I want to migrate it to a container inside Kubernetes. From my research, OpenFaaS looks like a great solution.
However, I can't find documentation about how to do this.
Does anyone have references or a better solution?
If you are using Python or Ruby, you can create a Dockerfile, use it to build a Docker image, and simply deploy it on Kubernetes. For example, for Ruby:
FROM ruby:2.7-alpine3.11
WORKDIR /home/app
COPY . .
RUN bundle install
CMD ["ruby", "main.rb"]
For OpenFaaS, they have provided good labs with documentation on creating async functions, etc.
Labs: https://github.com/openfaas/workshop
If you are looking for examples, you can check out the official repo: https://github.com/openfaas/faas/tree/master/sample-functions
Extra
There are also other good options: Knative and Kubeless.
You can find a Python Kubeless example and a CI/CD example here: https://github.com/harsh4870/kubeless-kubernetes-ci-cd
Try using a template to build an upstream FastAPI application as an OpenFaaS function. This will create a Docker image you can run and deploy in your Kubernetes cluster.
You can see how to do so in the following GitHub repo.

Adding Python Libraries to Airflow-Puckel on Docker

I am new to Docker and Airflow and am having trouble figuring out the correct place to add the httplib2 Python library to the container. I am using the Airflow-Puckel image. Do I need to add it to the Dockerfile, the docker-compose.yml file, or both? And once added, do I just need to rebuild the container with up for it to run?
From my own experience while learning Airflow and Docker, I strongly recommend using the official docker-compose file maintained by Airflow. If you are taking your first steps with Docker and Airflow, the guides and docs are comprehensive and come in very handy. There is also the fact that the official images are more likely to be updated to the latest Airflow version.
For example, once you are done with the initialization, you can take a look at this article, where it's explained how to add packages to each of the services run by Compose and how to set it up as production-ready. You could check this answer for an example too.
Good luck!

Running Docker containers on Azure using docker-py

I can find all of the ingredients for what I want to do, but I'm not sure if I can put them together.
Ultimately, I want a Python process to be able to create and manage Docker instances running on Azure.
This link shows that you can use the Docker API to fire up instances on Azure: https://docs.docker.com/engine/context/aci-integration/. It's Beta, but I've been able to run my own container on Azure after logging in, using something like this:
docker --context myacicontext run hello-world
The second half of my problem is to call this from docker-py. The vanilla usage of docker-py is nice and straightforward, but I can't find any reference to the flag "--context" in the docker-py docs (https://docker-py.readthedocs.io/en/stable/).
Is there a way to configure docker-py so that it provides a --context?
EDIT:
Thanks to @CharlesXu for pointing me in the right direction, I have now found that the following docker-py command does have an effect:
docker.context.ContextAPI.set_current_context("myacicontext")
This changes the default context used by the docker command line interface, so
C:\Users\MikeSadler>docker ps -a
will subsequently list the containers running in Azure, and not locally.
However, the docker.from_env().containers.list(all=all) command stubbornly continues to return the local containers. This is true even if you restart the Python session and create a new client from a completely fresh start.
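This behaviour would be consistent with how from_env() builds the client: it reads environment variables such as DOCKER_HOST rather than the CLI context store. A small sketch that only prints what each API currently sees (the context name is the one from above):
import docker
from docker.context import ContextAPI

# The context store the docker CLI uses (changed by set_current_context above).
print(ContextAPI.get_current_context().name)

# The SDK client is built from environment variables such as DOCKER_HOST,
# so it still points at the local daemon.
client = docker.from_env()
print(client.containers.list(all=True))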
CONCLUSION:
Having spoken with the Docker developers, as of October 2020 docker-py officially does not support Cloud connections. They are currently using GRPC protocols to manage Cloud runs, and this may be incorporated into docker-py in the future.
I'm afraid there is no way to do what the command docker --context myacicontext run hello-world does. There is also no parameter like --context in the SDK. As far as I know, you can set the current context with the SDK like this:
import docker
docker.context.config.write_context_name_to_docker_config('context_name')
But when you use the code:
client = docker.from_env()
client.containers.run('xxx')
Then it will set the context back to default. That means you cannot run the containers in ACI. I think it may be a bug that needs to be fixed. I'm not entirely sure, but that's how it stands right now.

How to use python script task in vsts release pipeline

I am new to the CI and CD world. I am using VSTS pipelines to automate my build and release process.
This question is about the release pipeline. I deploy my build drop to an AWS VM. I created a deployment group and ran the script on the VM to generate a deployment agent on the AWS VM.
This works well and I am able to deploy successfully.
I would like to run few automation scripts in python after successful deployment.
I tried using the Python Script task. One of the settings is Python Interpreter; the help information says:
"Absolute path to the Python interpreter to use. If not specified, the task will use the interpreter in PATH.
Run the Use Python Version task to add a version of Python to PATH."
So,
I tried to use the Use Python Version task and specified the version of Python I usually run my scripts with. The prerequisites for the task mention:
"A Microsoft-hosted agent with side-by-side versions of Python installed, or a self-hosted agent with Agent.ToolsDirectory configured (see Q&A)."
reference to Python Version task documentation
I am not sure how or where to set Agent.ToolsDirectory, or how to use a Microsoft-hosted agent in a release pipeline deploying to an AWS VM. I could not find any step-by-step examples for this. Can anyone help me with clear steps on how to run Python scripts in my scenario?
The easiest way of doing this is to add something like the following to your YAML definition:
- script: python xxx
This will run python and pass arguments to it; you can use python2 or python3 (whichever default version is installed on the hosted agent). Another, more reliable, way of achieving this is to use a container inside the hosted agent. That way you can explicitly specify the Python version and guarantee you are getting exactly what you specified. Example:
resources:
  containers:
  - container: my_container # can be anything
    image: python:3.6-jessie # just an example

jobs:
- job: job_name
  container: my_container # has to be the container name from resources
  pool:
    vmImage: 'Ubuntu-16.04'
  steps:
  - checkout: self
    fetchDepth: 1
    clean: true
  - script: python xxx
This will start the python:3.6-jessie container, mount your code inside the container, and run the python command in the root of the repo. Further reading:
https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema?view=azdevops&tabs=schema&viewFallbackFrom=vsts#job
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/container-phases?view=azdevops&tabs=yaml&viewFallbackFrom=vsts
In case you are using your own agent, just install Python on it and make sure it's in the PATH, so it works when you just type python in the console (you'd have to use the script task in this case). If you want to use the Python task, follow these articles:
https://github.com/Microsoft/azure-pipelines-tool-lib/blob/master/docs/overview.md#tool-cache
https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/tool/use-python-version?view=azdevops

Setting up docker container so that I can access python packages on ubuntu server

I'm new to using Docker, so I'm either looking for direct help or a link to a relevant guide. I need to train some deep learning models on my school's Linux server, but I can't manually install PyTorch and other Python packages since I don't have root access (sudo). Another student said that he uses Docker and has everything ready to go in his container.
I'm wondering how to wrap up my code and the relevant packages into a container that I can push to the Linux server and then run.
To address your specific problem, the easiest way I found to get code into a container is to use git:
Start the container in interactive mode, or ssh to it if it's attached to a network.
Run git clone <your awesome deep learning code>. In your git repo, have a requirements.txt file. Change directories into your local clone of the repo and run pip install -r requirements.txt.
Run whatever script you need to run your code. Note that you can easily put your pip install command in one of your run scripts.
It's important to remember that Docker containers are stateless/ephemeral. You should not expect the container or its contents to persist in any durable fashion. This specific issue is addressed by mapping a directory on the host system to a directory in the container, as in the sketch below.
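A minimal sketch of that mapping with the Docker SDK for Python (the image name, host path, and training script are placeholder assumptions):
import docker

client = docker.from_env()

# Map a host directory into the container so anything written to /workspace
# survives after the container exits. All names below are illustrative.
client.containers.run(
    "pytorch/pytorch:latest",    # assumed image
    "python train.py",           # assumed training script
    volumes={"/home/me/project": {"bind": "/workspace", "mode": "rw"}},
    working_dir="/workspace",
    remove=True,
)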
Side note: I first recommend starting with the Docker tutorial. You can easily skip over the installation parts if you are working on a system that already has Docker installed and where you have permissions to build, start, and stop containers.
I don't have root access (sudo). Another student said that he uses docker
I would like to point out that Docker normally requires root (sudo) permissions.
Instead, I think you should look at using something like Google Colab or JupyterLab. This gives you the added benefit of having your code backed up on a remote server.
