How to run Python inside an Express.js Docker container - python

I am trying to build a container for my Express.js application. The Express.js app uses Python via the npm package python-shell.
I have plenty of Python code, which lives in a subfolder of my Express app, and with npm start everything works perfectly.
However, I am new to Docker and I need to containerize the app. My Dockerfile looks like this:
FROM node:18
WORKDIR /usr/src/app
COPY package*.json ./
RUN npm install
COPY . .
EXPOSE 3001
CMD ["node", "./bin/www"]
I built the image with docker build . -t blahblah-server and ran it with docker run -p 8080:3001 -d blahblah-server.
I use imports at the top of the Python script like this:
import datetime
from pathlib import Path # Used for easier handling of auxiliary file's local path
import pyecma376_2 # The base library for Open Packaging Specifications. We will use the OPCCoreProperties class.
from assi import model
When the Python script is executed (only in the container!!!), I get the following error message:
/usr/src/app/public/javascripts/service/pythonService.js:12
if (err) throw err;
^
PythonShellError: ModuleNotFoundError: No module named 'pyecma376_2'
at PythonShell.parseError (/usr/src/app/node_modules/python-shell/index.js:295:21)
at terminateIfNeeded (/usr/src/app/node_modules/python-shell/index.js:190:32)
at ChildProcess.<anonymous> (/usr/src/app/node_modules/python-shell/index.js:182:13)
at ChildProcess.emit (node:events:537:28)
at ChildProcess._handle.onexit (node:internal/child_process:291:12)
----- Python Traceback -----
File "/usr/src/app/public/pythonscripts/myPython/wtf.py", line 6, in <module>
import pyecma376_2 # The base library for Open Packaging Specifications. We will use the OPCCoreProperties class. {
traceback: 'Traceback (most recent call last):\n' +
' File "/usr/src/app/public/pythonscripts/myPython/wtf.py", line 6, in <module>\n' +
' import pyecma376_2 # The base library for Open Packaging Specifications. We will use the OPCCoreProperties class.\n' +
"ModuleNotFoundError: No module named 'pyecma376_2'\n",
executable: 'python3',
options: null,
script: 'public/pythonscripts/myPython/wtf.py',
args: null,
exitCode: 1
}
If I comment out the first three imports, I get the same error:
PythonShellError: ModuleNotFoundError: No module named 'assi'
Please note that assi actually comes from my own Python code, which is included in the Express app directory.
Python itself seems to be installed in the container correctly. I stepped inside the container via docker exec -it <container id> /bin/bash, and the Python packages are there in the /usr/lib directory.
I really have absolutely no idea how all this works together and why Python doesn't find these modules...

You are trying to use libraries that are not in the Python standard library. It seems you forgot to run pip install when building the Docker image.
Try adding RUN commands to your Dockerfile that do this for you. Example:
RUN pip3 install pyecma376_2
RUN pip3 install /path/to/assi
Maybe that can solve your problem. Don't forget to check whether Python is already installed in your container (it seems that it is). And if you have both Python 2 and Python 3 installed, make sure you use pip3 instead of plain pip.
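If the project's Python dependencies are collected in a requirements.txt (a hypothetical file name here), the questioner's Dockerfile could be extended along these lines. This is a sketch: node:18 is Debian-based and ships python3 but not necessarily pip, and on recent Debian releases pip may additionally need the --break-system-packages flag.

```dockerfile
FROM node:18
WORKDIR /usr/src/app

COPY package*.json ./
RUN npm install

# Install pip, then the Python modules the scripts import at runtime.
COPY requirements.txt ./
RUN apt-get update \
    && apt-get install -y --no-install-recommends python3-pip \
    && pip3 install --no-cache-dir -r requirements.txt \
    && rm -rf /var/lib/apt/lists/*

COPY . .
EXPOSE 3001
CMD ["node", "./bin/www"]
```

Installing the dependencies before COPY . . keeps the pip layer cached across rebuilds that only touch application code.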

Related

How to get the location of an installed Python package into the shell

I want my users to be able to reference a file in my python package (specifically a docker-compose.yml file) directly from the shell.
I couldn't find a way to get only the location from pip show (and grep-ing out "Location" from its output feels ugly), so my current (somewhat verbose) solution is:
docker compose -f $(python3 -c "import locust_plugins; print(locust_plugins.__path__[0])")/timescale/docker-compose.yml up
Is there a better way?
Edit: I solved it by installing a wrapper command I call locust-compose as part of the package. Not perfect, but it gets the job done:
#!/bin/bash
module_location=$(python3 -c "import locust_plugins; print(locust_plugins.__path__[0])")
set -x
docker compose -f $module_location/timescale/docker-compose.yml "$@"
Most of the support you need for this is in the core setuptools suite.
First of all, you need to make sure the data file is included in your package. In a setup.cfg file you can write:
[options.package_data]
timescale = docker-compose.yml
Now if you pip install . or pip wheel, that will include the Compose file as part of the Python package.
Next, you can retrieve this in Python code using the ResourceManager API:
#!/usr/bin/env python3
# timescale/compose_path.py
import pkg_resources

def main():
    print(pkg_resources.resource_filename('timescale', 'docker-compose.yml'))

if __name__ == '__main__':
    main()
And finally, you can take that script and make it a setuptools entry point script (as distinct from the similarly-named Docker concept), so that you can just run it as a single command. Note that an entry point must reference a callable, hence the main() function above:
[options.entry_points]
console_scripts =
    timescale_compose_path = timescale.compose_path:main
Again, if you pip install . into a virtual environment, you should be able to run timescale_compose_path and get the path name out.
Having done all of those steps, you can finally run a simpler
docker-compose -f $(timescale_compose_path) up
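On Python 3.9+, the same lookup can also be done with the standard library's importlib.resources instead of pkg_resources. A sketch, where timescale and docker-compose.yml stand for the hypothetical package and data file from above:

```python
from importlib import resources  # requires Python 3.9+


def package_data_path(package: str, filename: str) -> str:
    """Return the on-disk path of a data file shipped inside an installed package."""
    # resources.files() returns a Traversable rooted at the package directory;
    # joinpath() descends to the data file without importing it.
    return str(resources.files(package).joinpath(filename))


# Hypothetical usage with the package above:
# package_data_path("timescale", "docker-compose.yml")
```

This avoids depending on setuptools at runtime, which matters because pkg_resources is deprecated in newer setuptools releases.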

Run a python script in node-red running on Docker

I'm trying to run a Python script, saved on my local system, from Node-RED, which is running as a Docker container. I copied the Python script into the Docker container, since the exec node was unable to locate the file, using this command:
cat /local/file/path | docker exec -i <running-container-id> sh -c 'cat > /inside/docker/file/path'
But now I'm getting the following error:
Traceback (most recent call last):
File "outlier.py", line 2, in <module>
from pandas import read_csv
ModuleNotFoundError: No module named 'pandas'
I had installed pandas on my local machine, but it's not being found by the exec node. Any help is appreciated, thanks.
When applications run inside a Docker container, they only have access to the libraries/modules included inside the container. They have no access to anything on the host machine.
So if you want to run Python scripts that depend on Python modules, you will need to create a custom Docker container that extends the official Node-RED container and then installs those modules.
Node-RED provides documentation about extending its container here.
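A minimal sketch of such an extension, assuming the official nodered/node-red base image (which is Alpine-based, so system packages come from apk; exact package names can vary between image versions):

```dockerfile
FROM nodered/node-red

# Package installation needs root; the base image runs as an unprivileged user.
USER root
RUN apk add --no-cache python3 py3-pip \
    && pip3 install pandas
USER node-red
```

After building this image, run your Node-RED container from it instead of the stock image, and the exec node's Python calls will find pandas.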

Run pip in python docker

I am completely new to Docker (on a Windows 10 machine). I intend to set up a Python development environment as a Docker container, and most of the reading I did involved the use of a Dockerfile. I want to do it from scratch instead, purely using commands.
What I intend to do is a very basic requirement: to have the Python Docker image present, to be able to install more libraries into that image, and to commit these updates to that image. But I want to do it completely using commands (not via a Dockerfile).
I am using Docker Desktop on a Windows 10 machine. I did docker pull python:latest and it pulled the image like so:
C:\Users\MyHomeDirectory>docker pull python:latest
latest: Pulling from library/python
d960726af2be: Pull complete
e8d62473a22d: Pull complete
8962bc0fad55: Pull complete
65d943ee54c1: Pull complete
532f6f723709: Pull complete
1334e0fe2851: Pull complete
062ada600c9e: Pull complete
aec2e3a89371: Pull complete
1ec7c3bcb4b2: Pull complete
Digest: sha256:65367d1d3eb47f62127f007ea1f74d1ce11be988044042ab45d74adc6cfceb21
Status: Downloaded newer image for python:latest
docker.io/library/python:latest
Then I did docker images and it showed that python latest image is present with a size of 886 MB.
C:\Users\Tejas.Khajanchee>docker images
REPOSITORY TAG IMAGE ID CREATED SIZE
python latest 5b3b4504ff1f 47 hours ago 886MB
I am also able to enter an interactive Python session by doing docker run -it python, which gives the interactive shell:
Python 3.9.5 (default, May 12 2021, 15:26:36)
[GCC 8.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>>
>>> import gc
>>> import numpy
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ModuleNotFoundError: No module named 'numpy'
>>>
>>> import pandas
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ModuleNotFoundError: No module named 'pandas'
>>>
>>> import openpyxl
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ModuleNotFoundError: No module named 'openpyxl'
>>>
But as is evident, some of the libraries are not installed. This is where I get stuck: how do I install libraries into the Python image and have the image updated? Also, if this shell is the only thing I can do so far, what does the 886 MB of content represent? I also want to be able to run scripts using this Docker image. When I attempt this with a very basic hello-world script, the following error comes up:
C:\Users\MyHomeDirectory\Downloads>docker run -it python a.py
docker: Error response from daemon: OCI runtime create failed: container_linux.go:367: starting container process caused: exec: "a.py": executable file not found in $PATH: unknown.
I want to be able to do this purely with commands and not a Dockerfile. Please help.
First, it looks like you are confusing the concepts of image and container:
Docker image: read-only, used as the basis of containers.
Docker container: overlays a writable layer on top of the read-only layers of a Docker image; every container uses an image as its basis.
Second, since you want to install numpy in the image, the best way is to write a custom Dockerfile like this:
Dockerfile:
FROM python
RUN pip install numpy
Then build a new image with docker build -t newpython .
BUT, since you mentioned you don't want to use a Dockerfile, the alternative is:
Install numpy in a container:
docker run -it python /bin/bash
# pip install numpy
Use docker ps -a to get the container id, e.g. 0a6b4df8e2c2, then commit this container, which already has numpy installed, to a new image:
docker commit 0a6b4df8e2c2 newpython
Finally, all new containers need to run based on the newpython image, not the python image, as only the newpython image has numpy installed:
docker run --rm newpython python -c "import numpy; print(numpy.__version__)"
1.20.3
Additionally, regarding docker run -it python a.py: I think you misunderstand the concept. A container command like python a.py is executed inside the container, so a.py has to be inside the container, not on the host machine.
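Alternatively, instead of copying the script into the image, you can bind-mount the host directory at run time so the container sees a.py directly. A sketch, where newpython is the image committed above; the $(pwd) syntax is for a Unix shell, and on Windows cmd you would use %cd% instead:

```shell
# Mount the current host directory at /work and run the script from there
docker run --rm -v "$(pwd)":/work -w /work newpython python a.py
```

This keeps the script editable on the host while still executing inside the container's Python environment.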

model_main.py file is using Python2.7 instead of Python3

I'm currently using python3 to run the model_main.py file. I followed each step to install the object_detection API.
I've made sure that each command is run with the python3 prefix, but after running the command:
python3 model_main.py --logtostderr --train_dir=custom1/training --pipeline_config_path=hand_inference_graph/pipeline.config
I'm getting an error:
ImportError: /home/abrar/.local/lib/python2.7/site-packages/tensorflow/models/research/pycocotools/_mask.so: undefined symbol: _Py_ZeroStruct
model_main.py uses Python 2.7 every time I run the command.
Offhand, I'd say that somehow /home/abrar/.local/lib/python2.7/site-packages/tensorflow/models/research/pycocotools is being added to your path, which, by name, implies a directory full of Python 2.7 stuff. Try adding:
import sys
print(sys.path)
to the top of your script to determine which locations are being searched for modules. If the pycocotools directory is being added somehow, you'll need to either remove it from your path or find out where it's being added and stop it.
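A slightly fuller diagnostic along the same lines also prints the interpreter binary and version, which makes it obvious whether python3 is really the one running (a generic sketch, nothing specific to this setup):

```python
import sys

# Which interpreter binary is actually executing this script?
print(sys.executable)
# Its version, to confirm python3 rather than python2.7
print(sys.version)
# Every directory searched for modules, flagging suspicious python2.7 entries
for p in sys.path:
    marker = "  <-- python2.7!" if "python2.7" in p else ""
    print(p + marker)
```

If a python2.7 site-packages entry shows up, check PYTHONPATH and any .pth files in the user site directory for where it is injected.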

Running containerized PyTest

I am learning how to run containerized pytest tests, and I am failing to run a test with arguments.
My Dockerfile looks like this:
FROM python:2
ADD main.py /
RUN pip install docker
RUN pip install fake_useragent
RUN pip install pytest
RUN pip install requests
CMD ["pytest", "main.py --html=report.html"]
But I have tried all kinds of CMD/RUN variations I found online.
Does anybody have a clue?
The full project is here if it helps:
https://github.com/pavelzag/DockerSDKLearn
"main.py --html=report.html" will be passed in pytest as a single argument and will appear in sys.argv[1] there. Hence pytest is trying to locate a file with the exact same name with stuff like --html in it. You should fully tokenize the command:
CMD ["pytest", "main.py", "--html=report.html"]
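For reference, the exec form of CMD passes each JSON array element as its own argv entry, whereas the shell form hands the whole line to /bin/sh for tokenizing; both of the following spellings are equivalent:

```dockerfile
# Exec form: each array element becomes one argv entry, no shell involved
CMD ["pytest", "main.py", "--html=report.html"]

# Shell form: the whole line is passed to /bin/sh -c, which splits on spaces
# CMD pytest main.py --html=report.html
```

The exec form is generally preferred because signals reach pytest directly rather than the intermediate shell.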
