For example, a Python file
example.py:
import os
containerId = "XXX"
command = "docker exec -ti " + containerId + " sh"
os.system(command)
When I execute this file using "python example.py", I can enter the docker container, but I also want to execute some other commands inside it.
I tried this:
import os
containerId = "XXX"
command = "docker exec -ti " + containerId + " sh"
os.system(command)
os.system("ps")
but ps is only executed outside docker, after I exit the container; it is not executed inside docker.
So my question is: how can I execute commands inside a docker container from a Python script?
By the way, I am using Python 2.7. Thanks a lot.
If the commands you would like to execute can be defined in advance easily, then you can attach them to a docker run command like this:
docker run --rm ubuntu:18.04 /bin/sh -c "ps"
Now if you already have a running container e.g.
docker run -it --rm ubuntu:18.04 /bin/bash
Then you can do the same thing with docker exec:
docker exec ${CONTAINER_ID} /bin/sh -c "ps"
Now, in python this would probably look something like this:
import os
containerId = "XXX"
in_docker_command = "ps"
command = 'docker exec ' + containerId + ' /bin/sh -c "' + in_docker_command + '"'
os.system(command)
This solution is useful if you do not want to install an external dependency such as docker-py, as suggested by @Szczad.
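If you'd rather avoid shell quoting altogether, the same call can be sketched with subprocess, passing each argument as its own list element (the container id here is a placeholder, as in the question):

```python
import subprocess

container_id = "XXX"  # placeholder: substitute a real container id
in_docker_command = "ps"

# Each argument is a separate list element, so no shell quoting is needed
argv = ["docker", "exec", container_id, "/bin/sh", "-c", in_docker_command]
print(argv)

# With a real container id you would then run:
# result = subprocess.run(argv, stdout=subprocess.PIPE)
# print(result.stdout)
```

This avoids the string concatenation of the os.system approach and works the same on Python 2.7 (with subprocess.check_output) and Python 3.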
Related
I need to run bash shell commands through Python so the same workflow works on PC and Mac/Linux. ./bin/production doesn't work in PowerShell, and putting 'bash' in front of it gives an error that the 'docker' command is not recognized.
./bin/production contents:
#!/bin/bash
docker run --rm -it \
--volume ${PWD}/prime:/app \
$(docker build -q docker/prime) \
npm run build
This is the python script:
import subprocess
from python_on_whales import docker
cmd = docker.run('docker run --rm -it --volume ${PWD}/prime:/app $(docker build -q docker/prime) npm run build')
p = subprocess.Popen(cmd, stdout=subprocess.PIPE)
out, err = p.communicate()
print(out)
This is the error I get when running the python script:
python_on_whales.exceptions.NoSuchImage: The docker command executed was C:\Program Files\Docker\Docker\resources\bin\docker.EXE image inspect docker run --rm -it --volume ${PWD}/prime:/app $(docker build -q docker/prime) npm run build.
It returned with code 1
The content of stdout is '[]
'
The content of stderr is 'Error response from daemon: no such image: docker run --rm -it --volume ${PWD}/prime:/app $(docker build -q docker/prime) npm run build: invalid reference format: repository name must be lowercase
'
Running the command docker run --rm -it --volume ${PWD}/prime:/app $(docker build -q docker/prime) npm run build as one long line in PowerShell works, but we want a universal standard command for both PC and Mac/Linux.
The Python on Whales docker.run() function doesn't take a docker run ... command line. It is a native Python API where you need to express the various Docker options as function parameters.
In principle you could rewrite this Python script using that API:
from pathlib import Path
from python_on_whales import docker
# build the image, returns an Image object
image = docker.build(Path.cwd() / 'docker' / 'prime')
# start the container; like `docker run ...`
docker.run(image,
command=['npm', 'run', 'build'],
volumes=[(Path.cwd() / 'prime', '/app')], # --volume ${PWD}/prime:/app
interactive=True, # -i (required?)
tty=True, # -t (required?)
remove=True) # --rm
The return value from docker.run() (without detach=True) is the container's stdout, and the examples print() that data.
This might not be what you're looking for but you can always try this:
import platform
import subprocess
import os
cur_os = platform.system()
if cur_os == "Windows":
    print("You are on windows")
    os.system('command here')  # for windows
elif cur_os == "Darwin":
    print("You are on mac")
    subprocess.call(['command', 'goes', 'here'])  # for mac
Edit:
I'm intermediate with Python, so don't judge; if I did something wrong, please give me feedback. Thanks.
I am using the docker SDK for Python. How do I pass a file to a container using the exec_run function?
I want to replicate the following docker exec command:
docker exec -i -u postgres <Insert the id find above> pg_restore -C -d postgres < filename
The above command loads a postgres backup. filename is the name of the file, which resides on the host machine from which the exec command is being run.
I am trying this:
exec_log = containers[0].exec_run("/bin/bash -c 'pg_restore -C -d postgres <'" + filename, stdout=True, stderr=True, user='postgres')
print(exec_log[1])
Here the file resides inside another docker container, in which a Python application that uses the Python docker client is running.
I am getting this:
b'/bin/bash: 2019-04-29-postgres_db.dump: No such file or directory\n'
I have looked into put_archive but that would require extracting the file inside the container. Is there a way of doing this using exec_run or any other simpler way?
Thanks
As a workaround, you can mount a volume into your docker container that contains the file. Then you can use it from there.
container = context.client.containers.run(
    image="ipostgres",
    auto_remove=True,
    detach=True,
    volumes={"/host/machine/store": {'bind': '/opt/whatever', 'mode': 'ro'}},
)
Then
container.exec_run("/bin/bash -c 'pg_restore -C -d postgres < /opt/whatever/filename'", stdout=True, stderr=True, user='postgres')
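If mounting a volume is not an option, the file can also be copied into the running container with put_archive, which the question already mentions; it expects a tar stream. A minimal sketch of packing a single file in memory (the destination path and container object are hypothetical):

```python
import io
import os
import tarfile

def make_tar_bytes(src_path):
    """Pack a single file into an in-memory tar archive (the format put_archive expects)."""
    stream = io.BytesIO()
    with tarfile.open(fileobj=stream, mode="w") as tar:
        # arcname drops the host directory prefix inside the archive
        tar.add(src_path, arcname=os.path.basename(src_path))
    stream.seek(0)
    return stream.read()

# Usage with docker-py (container object and paths are hypothetical):
# container.put_archive("/opt/whatever", make_tar_bytes("2019-04-29-postgres_db.dump"))
```

The container extracts the archive itself at the destination directory, so no manual extraction step is needed afterwards.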
I'm new to docker.
I am starting the run command with a script called r, which has the following code
proxy="--build-arg http_proxy=http://wwwcache.open.ac.uk:80 --build-arg https_proxy=http://wwwcache.open.ac.uk:80"
if [ "$http_proxy" == "" ]; then
proxy=
fi
docker build $proxy -t bi-tbcnn docker
docker run -v $(pwd):/e -w /e --entrypoint bash --rm -it bi-tbcnn -c ./run
When I execute r I am getting the following error
bash: ./run: No such file or directory
but when I execute ./run directly in my terminal, it works fine.
I use Docker Toolbox on windows
The project address is https://github.com/bdqnghi/bi-tbcnn
thanks
This is a known issue with Docker for Windows:
https://blogs.msdn.microsoft.com/stevelasker/2016/09/22/running-scripts-in-a-docker-container-from-windows-cr-or-crlf/
It seems you're facing an issue with Carriage Return (CR) and Line Feed (LF) characters; maybe your code editor is changing the newline format automatically.
Can you try opening a bash session in the container and executing the script manually?
docker run -v $(pwd):/e -w /e --entrypoint bash --rm -it bi-tbcnn
root@a83fcd779f8e:/e# ./run
Please paste the output here
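If CRLF line endings turn out to be the cause, stripping the carriage returns from the script usually fixes it. A small sketch using plain sed (no dos2unix needed; the file name matches the question):

```shell
# Simulate a script saved with Windows (CRLF) line endings
printf '#!/bin/bash\r\necho ok\r\n' > run

# Strip the trailing carriage return from every line, in place
sed -i 's/\r$//' run

# The script now executes normally under bash
bash run
```

Configuring the editor (or git's core.autocrlf setting) to keep LF endings prevents the problem from coming back.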
I'll try to explain this as simply as possible.
I have a dockerised python app. Within this python app at some point I try to run a docker command in another (libreoffice) container as such:
import subprocess
file_path = 'path_to_file'
args = ['docker', 'run', '-it', '-v', '/tmp:/tmp',
'lcrea/libreoffice-headless', '--headless', '--convert-to', 'pdf', file_path,
'--outdir', '/tmp']
process = subprocess.run(args,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
timeout=timeout)
I end my python app's Dockerfile with a command which starts the server:
CMD python3 -m app.run_app
What is interesting is when I start the python app like this it works fine:
docker-compose run -p 9090:9090 backend /bin/bash
root@74430c3f1f0c:/src# python3 -m app.run_app
But when I start it just using docker-compose up, the libreoffice container is never called. I am sure of it because when I do docker ps -a in the first instance a libreoffice container has been created while in the second there is none.
What is going on here?
I found the error. I was passing in the -it option, which was failing the process with "the input device is not a TTY". All I had to do was take it out.
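For reference, the corrected argument list simply drops '-it', since no TTY is attached when the app runs under docker-compose up (paths and image name are the ones from the question):

```python
file_path = 'path_to_file'

# No '-it': the -t flag fails when no terminal is attached to the process
args = ['docker', 'run', '-v', '/tmp:/tmp',
        'lcrea/libreoffice-headless', '--headless', '--convert-to', 'pdf',
        file_path, '--outdir', '/tmp']
print(args)
```

The rest of the subprocess.run call from the question stays unchanged.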
Let me clarify what I want to do.
I have a Python script on my local machine that performs a lot of stuff, and at a certain point it has to call another Python script that must be executed inside a docker container. That script takes some input arguments and returns some results.
So I want to figure out how to do that.
Example:
def function():
    # do stuff
    ...
    # do more stuff
    # call another local script that must be executed inside the docker container
    result = execute_python_script_into_a_docker(python_script_arguments)
The docker has been launched in a terminal as:
docker run -it -p 8888:8888 my_docker
You can add your file inside the docker container thanks to the -v option (note the host path must be absolute):
docker run -it -v $(pwd)/myFile.py:/myFile.py -p 8888:8888 my_docker
And execute your Python script inside the container with:
python /myFile.py
or directly from the host:
docker run -it -v $(pwd)/myFile.py:/myFile.py -p 8888:8888 my_docker python /myFile.py
And even if your container is already running:
docker exec -ti docker_name python /myFile.py
docker_name is available from the output of a docker ps command.
Or you can specify the name in the run command like:
docker run -it --name docker_name -v $(pwd)/myFile.py:/myFile.py -p 8888:8888 my_docker
It's like:
-v absoluteHostPath:absoluteRemotePath
You can specify folder too in the same way:
-v myFolder:/customPath/myFolder
More details at docker documentation.
You can use docker's Python SDK library. First you need to move your script into the container; I recommend doing it when you create the container or when you start it, as Callmemath mentioned:
docker run -it -v $(pwd)/myFile.py:/myFile.py -p 8888:8888 my_docker
Then to run the script using the library:
...
client = docker.from_env()
container = client.containers.get(CONTAINER_ID)
exit_code, output = container.exec_run("python your_script.py script_args")
...
You have to use docker exec -it container_name python /filename
Note: to use docker exec, the container must already be running (e.g. started with docker run).