I want to dockerize my Discord bot written in Python for the development process, but I can't get it done. My docker-compose service currently looks like this:
discord_bot:
  build: ./discord
  volumes:
    - ./discord:/usr/src/discord
  depends_on:
    - mongo
    - node
Is there a way I can hot reload this code while still using discord.py?
If you want it to auto-reload on code change for local development, what you have is mostly correct. The one thing you're missing is launching the main process via some sort of file watcher. You could use nodemon (it works for non-Node programs too) or find a Python-specific equivalent.
Changes you need to make:
Your built image needs to contain some sort of file watcher. You could use nodemon for this (even for Python), or some Python equivalent.
You should override the default command of the image to launch via your file watcher.
discord_bot:
  build: ./discord  # <--- image should include the file watcher executable (nodemon or some Python equivalent)
  command: nodemon --ext py --exec python3 /usr/src/discord/bot.py  # <--- add this line (replace bot.py with your entry script)
  volumes:
    - ./discord:/usr/src/discord
  depends_on:
    - mongo
    - node
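If you'd rather stay entirely in Python, the watchdog package ships a watchmedo helper that restarts a process whenever files change. A sketch of the same service using it (bot.py and the paths are assumptions for illustration; the image built from ./discord would need watchdog installed, e.g. via pip install watchdog):

```yaml
discord_bot:
  build: ./discord
  # watchmedo restarts the bot whenever a .py file under the mounted
  # source directory changes (hypothetical entry point: bot.py)
  command: watchmedo auto-restart --directory=/usr/src/discord --pattern='*.py' --recursive -- python bot.py
  volumes:
    - ./discord:/usr/src/discord
  depends_on:
    - mongo
    - node
```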
PyCharm reports 'unresolved reference' to Python imports with docker-compose interpreter running.
see image attached
unresolved references e.g. in settings.py
I have already read similar questions and tried the suggested solutions on this portal, like marking the folders as source roots in the PyCharm IDE. I have also used the Repair IDE function a lot to rebuild the indexes. Nothing has helped so far.
I'm having this problem with PyCharm because I'm not running my Python installation in a venv and switching the PyCharm interpreter to it, but working with a Docker Compose environment instead.
I have created a Dockerfile and a docker-compose.yml file for this purpose. If I use the terminal command "docker compose up", the container environment runs and my Python/Django application can also be started without errors via the browser. The respective logs of the containers do not show any problems either. So the problem doesn't seem to be with my Docker environment, but rather with how the PyCharm IDE interacts with it.
here is my Dockerfile code:
FROM python:3.10.4-slim-bullseye
# Set environment variables
ENV PIP_DISABLE_PIP_VERSION_CHECK 1
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1
# Set work directory
WORKDIR /cpp_base
# Install dependencies
COPY ./requirements.txt .
RUN pip install -r requirements.txt
# Copy project
COPY . .
and here is my docker-compose.yml:
version: "3.9"
services:
  web:
    build: .
    container_name: python_django
    command: python /cpp_base/manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/cpp_base
    ports:
      - "8000:8000"
    depends_on:
      - db
  db:
    image: postgres:14.5
    container_name: postgres_14.5
    restart: always
    ports:
      - "5432:5432"
    environment:
      POSTGRES_DB: cpp_base
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: postgres
    volumes:
      - postgres_data:/var/lib/postgresql/data
  pgadmin:
    container_name: pgadmin4_container
    image: dpage/pgadmin4
    restart: always
    volumes:
      - pgadmin_data:/var/lib/pgadmin
    environment:
      PGADMIN_DEFAULT_EMAIL: admin@admin.com
      PGADMIN_DEFAULT_PASSWORD: root
    ports:
      - "5050:80"
  blackd:
    restart: always
    image: docker.io/pyfound/black
    command: blackd --bind-host 0.0.0.0 --bind-port 45484
    ports:
      - "45484:45484"
  portainer:
    image: portainer/portainer-ce:latest
    container_name: portainer
    restart: unless-stopped
    security_opt:
      - no-new-privileges:true
    volumes:
      - /etc/localtime:/etc/localtime:ro
      - /var/run/docker.sock:/var/run/docker.sock:ro
      - ./portainer-data:/data
    ports:
      - "9000:9000"
volumes:
  postgres_data:
  pgadmin_data:
In my PyCharm IDE:
1. Connect to the Docker daemon (Settings -> Build, Execution, Deployment; see attached image)
2. Add a new interpreter (Docker Compose interpreter configuration; see attached image)
3. Select the new interpreter and check that all needed packages are there (interpreter selection and package list; see attached image)
4. Configure a Run/Debug configuration (see attached configuration image)
After all these configurations I was able to start the Docker environment inside the IDE with the green triangle play button. The code also seems to run, because I can see the Django default app in the browser. I don't have the slightest idea why the IDE shows the red underlines, though. The funny thing is that if I don't select any interpreter within the IDE, I can still run the application and I don't get any unresolved-reference messages. So the IDE only starts to complain when I set the interpreter to the "web" service in the docker-compose file.
Does anyone know how to help?
Thank you very much.
My Software Versions:
PyCharm 2022.2.2
Windows 11, 10.0.22000
Docker v2.12.0, running on WSL2
Python 3.10.4
Django 4.1
I have found a solution. JetBrains support and the JetBrains YouTrack bug tracker helped me solve the problem. There were two things I had to do:
1. First solution
First of all, the support found an error in my PyCharm log that had to do with the PyCharm Docker interpreter.
The error in the log had the following output:
Error response from daemon: invalid environment variable: =::=::\
To fix this error you can do the same as in this bug report:
https://youtrack.jetbrains.com/issue/PY-24604/Unable-to-create-Docker-Compose-interpreter-InternalServerErrorException-invalid-environment-variable
So when setting up the remote Docker interpreter in PyCharm, uncheck the following option in the environment settings:
include parent environment variables
Unfortunately, this option is quite hard to find, and a lot of users probably won't find it right away and will therefore run into the same error.
2. Second solution
A user on another platform was able to give me a hint about a bug in the current PyCharm and point me to the workaround for it. You can find the workaround here:
https://youtrack.jetbrains.com/issue/PY-55617/Pycharm-doesnt-recognize-any-of-my-installed-packages-on-a-remote-host
I can't say whether the two solutions mentioned depend on each other. However, after the fix in point 1 the error messages were gone from the logs, and after the workaround in point 2 all package dependencies and modules in the code were no longer shown as "unresolved references". This has been my solution.
I have this same issue.
As far as I can tell, JetBrains doesn't support a remote interpreter in a Docker orchestration. And although this is supposed to work, it is broken in 2022.2.
Here's the open issue on it.
I've been scratching my head for a while with this. I have the following Dockerfile for my python application:
# Use an official Python runtime as a parent image
FROM frankwolf/rpi-python3
# Set the working directory to /app
WORKDIR /app
# Copy the current directory contents into the container at /app
COPY . /app
RUN chmod 777 docker-entrypoint.sh
# Install any needed packages specified in requirements.txt
RUN pip3 install --trusted-host pypi.python.org -r requirements.txt
# Run __main__.py when the container launches (not sure if I need sudo here)
CMD ["sudo", "python3", "__main__.py", "-debug"]
docker-compose file:
version: "3"
services:
  mongoDB:
    restart: unless-stopped
    volumes:
      - "/data/db:/data/db"
    ports:
      - "27017:27017"
      - "28017:28017"
    image: "andresvidal/rpi3-mongodb3:latest"
  mosquitto:
    restart: unless-stopped
    ports:
      - "1883:1883"
    image: "mjenz/rpi-mosquitto"
  FG:
    privileged: true
    network_mode: "host"
    depends_on:
      - "mosquitto"
      - "mongoDB"
    volumes:
      - "/home/pi:/home/pi"
    #image: "arkfreestyle/fg:v1.8"
    image: "test:latest"
    entrypoint: /app/docker-entrypoint.sh
    restart: unless-stopped
And this what docker-entrypoint.sh looks like:
#!/bin/sh
if [ ! -f /home/pi/.initialized ]; then
    echo "Initializing..."
    echo "Creating .initialized"
    # Create .initialized hidden file
    touch /home/pi/.initialized
else
    echo "Initialized already!"
    sudo python3 __main__.py -debug
fi
Here's what I am trying to do:
(This stuff already works)
1) I need a docker image which runs my python application when I run it in a container. (this works)
2) I need a docker-compose file which runs 2 services + my python application, BUT before running my python application I need to do some initialization work, for this I created a shell script which is docker-entrypoint.sh. I want to do this initialization work ONLY ONCE when I deploy my application on a machine for the first time. So I'm creating a .initialized hidden file which I'm using as a check in my shell script.
I read that using entrypoint in a docker-compose file overrides any entrypoint/cmd given in the Dockerfile. That's why, in the else portion of my shell script, I manually run my code using "sudo python3 __main__.py -debug"; this else portion works fine.
(This is the main question)
In the if portion, I do not run my application in the shell script. I've tested the shell script itself separately: both the if and else statements work as I expect. But when I run "sudo docker-compose up", the first time my shell script hits the if portion it echoes the two statements, creates the hidden file, and THEN RUNS MY APPLICATION. The console output for the application appears in purple/pink/mauve, while the other two services print their logs in yellow and cyan. I'm not sure if the colors matter, but normally my application logs are always green; in fact, the first two echoes, "Initializing" and "Creating .initialized", are also green, so I thought I'd mention this detail. After those two echoes, my application mysteriously begins and logs console output in purple...
Why/how is my application being invoked in the if statement of the shell script?
(This only happens if I run through docker-compose, not if I just run the shell script with sh docker-entrypoint.sh.)
Problem 1
Using ENTRYPOINT and CMD at the same time has some strange effects: CMD becomes the argument list for ENTRYPOINT, and setting entrypoint in docker-compose also clears the image's default CMD.
Problem 2
This happens to your container:
It is started the first time. The .initialized file does not exist.
The if case is executed. The file is created.
The script and therefore the container ends.
The restart: unless-stopped option restarts the container.
The .initialized file exists now, the else case is run.
python3 __main__.py -debug is executed.
By the way, the USER instruction in the Dockerfile, or the user option in Docker Compose, is a better option than sudo.
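Given the sequence above, a minimal sketch of a corrected entrypoint: it does the one-time init, then always execs the application, so the container keeps running in both cases. Assumptions: the marker path is parameterised here (defaulting to /tmp/.initialized purely so the sketch is self-contained; the original hard-coded /home/pi/.initialized), and the application command is passed in as arguments:

```shell
#!/bin/sh
# One-time initialization, then always start the application.
# MARKER defaults to /tmp/.initialized here; the original script
# used /home/pi/.initialized.
MARKER="${MARKER:-/tmp/.initialized}"

if [ ! -f "$MARKER" ]; then
    echo "Initializing..."
    # ... one-time initialization work goes here ...
    touch "$MARKER"
else
    echo "Initialized already!"
fi

# exec replaces this shell with the app (passed as arguments), so it
# becomes PID 1 and the container stays up after initialization.
exec "$@"
```

In docker-compose you would then keep entrypoint: /app/docker-entrypoint.sh and add e.g. command: python3 __main__.py -debug, so the same command runs on the first start and on every restart.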
I have a Python script which connects to MySQL and inserts data into a database. That is the first container I want to build. I want to build another container which will have a Python script that connects to the database of the first container and executes some queries. I am trying to follow the Docker documentation; however, I find it difficult to write the proper yml file. Any guidance will be very helpful.
It depends on how complex what you want to make is, but the docker-compose.yml file should look similar to this:
version: '3'
services:
  my_database:
    image: mysql
    [... MySQL configs]
  my_python_container:
    build: .
    depends_on:
      - my_database
    links:
      - my_database
I have no knowledge about the configuration of the MySQL database, so I left that part blank ([... MySQL configs]).
The my_python_container service is built from a Dockerfile in the same folder, similar to:
FROM python
COPY script.py script.py
CMD python script.py
This should be enough to get the connection, but you have to consider in your program that the MySQL hostname will be the name given to the database service.
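To make the hostname point concrete, a small sketch of how the second container's script might build its connection settings (the env-var names and defaults are assumptions for illustration; you'd pass the resulting dict to your driver, e.g. pymysql.connect(**connection_params())):

```python
import os

# Inside the compose network, the MySQL host is the service name
# ("my_database" above), not localhost.
DB_HOST = os.environ.get("DB_HOST", "my_database")

def connection_params():
    """Parameters you would pass to e.g. pymysql.connect(...)."""
    return {
        "host": DB_HOST,   # compose service name, resolved by Docker's DNS
        "port": 3306,      # MySQL default port
        "user": os.environ.get("DB_USER", "root"),
        "password": os.environ.get("DB_PASSWORD", ""),
        "database": os.environ.get("DB_NAME", "mydb"),
    }

print(connection_params()["host"])
```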
I tried running Odoo tests using --test-enable, but it won't work. I have a couple of questions.
According to the documentation, tests can only be run during module installation. What happens when we add functionality and then want to run tests?
Is it possible to run tests from an IDE like PyCharm?
This is useful for running Odoo test cases:
./odoo.py -i/-u module_being_tested -d being_used_to_test --test-enable
Common options:
-i INIT, --init=INIT
install one or more modules (comma-separated list, use "all" for all modules), requires -d
-u UPDATE, --update=UPDATE
update one or more modules (comma-separated list, use "all" for all modules). Requires -d.
Database related options:
-d DB_NAME, --database=DB_NAME
specify the database name
Testing Configuration:
--test-enable: Enable YAML and unit tests.
@aftab You need to add the log level, please see below.
./odoo.py -d <dbname> --test-enable --log-level=test
And regarding your question: if you are making changes to installed modules and need to re-run all test cases, then you simply need to restart your server with -u <module_name> (or -u all for all modules) together with the above command.
Here is a REALLY nice plugin to run Odoo unit tests directly with pytest:
https://github.com/camptocamp/pytest-odoo
Here's a result example:
I was able to run Odoo's tests using PyCharm. To achieve this I used docker + pytest-odoo + PyCharm (using remote interpreters).
First you set up a Dockerfile like this:
FROM odoo:14
USER root
RUN apt-get update && \
    apt-get install -y --no-install-recommends \
        python3-pip
RUN pip3 install pytest-odoo coverage pytest-html
USER odoo
And a docker-compose.yml like this:
version: '2'
services:
  web:
    container_name: plusteam-odoo-web
    build:
      context: .
      dockerfile: Dockerfile
    image: odoo:14
    depends_on:
      - db
    ports:
      - "8069:8069"
    volumes:
      - odoo-web-data:/var/lib/odoo
      - ./config:/etc/odoo
      - ./addons:/mnt/extra-addons
    command: --dev all
  db:
    container_name: plusteam-odoo-db
    image: postgres:13
    ports:
      - "5432:5432"
    environment:
      - POSTGRES_DB=postgres
      - POSTGRES_PASSWORD=odoo
      - POSTGRES_USER=odoo
      - PGDATA=/var/lib/postgresql/data/pgdata
    volumes:
      - odoo-db-data:/var/lib/postgresql/data/pgdata
volumes:
  odoo-web-data:
  odoo-db-data:
So we extend an Odoo image with pytest-odoo and packages to generate coverage reports.
Once you have this, you can run docker-compose up -d to get your Odoo instance running. The Odoo container will have pytest-odoo installed; the next part is to tell PyCharm to use a remote interpreter with the modified Odoo image that includes the pytest-odoo package:
Now every time you run a script in PyCharm, it will launch a new container based on the image you provided.
After examining the containers launched by PyCharm, I realized they bind the project's directory to the /opt/project/ directory inside the container. This is useful because you will need to modify the odoo.conf file when you run your tests.
You can customize the database connection for a dedicated testing db, which you should do. The important part is that you need to point the addons_path option to /opt/project/addons, or wherever your custom addons end up inside the containers launched by PyCharm.
With this you can create a PyCharm run configuration for pytest like this:
Notice how we provided the path to the Odoo config with the modifications for testing; this way the Odoo instance in the container launched by PyCharm will know where your custom addon code is located.
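The odoo.conf used for the test run might then look like this (a sketch; the db credentials are assumptions based on the compose file above, and the essential part is addons_path pointing into the bind-mounted project directory):

```
[options]
addons_path = /opt/project/addons
db_host = db
db_port = 5432
db_user = odoo
db_password = odoo
```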
Now we can run the script and even debug it and everything will work as expected.
I go further into this matter (my particular solution) in a Medium article, and I even wrote a repository with a working demo so you can try it out. Hope this helps:
https://medium.com/plusteam/how-to-run-odoo-tests-with-pycharm-51e4823bdc59 https://github.com/JSilversun/odoo-testing-example
Be aware that with remote interpreters you just need to make sure the odoo binary can find the addons folder properly and you will be all set :) Besides, using a Dockerfile to extend an image helps to speed up development.
I'm trying to use Docker for Odoo module development. I have the following docker-compose.yml file:
db:
  image: postgres
  environment:
    POSTGRES_USER: odoo
    POSTGRES_PASSWORD: odoo
  volumes:
    - data:/var/lib/postgresql/data/
odoo:
  image: odoo
  links:
    - db:db
  ports:
    - "127.0.0.1:8069:8069"
  volumes:
    - extra-addons:/mnt/extra-addons
  command: -- --update=tutorial
The module contains only an __openerp__.py file, but Odoo doesn't show the changes I make to it, even with the --update=tutorial option:
{
    'name': "tutorial",
    'summary': """Hello world!!""",
    'description': """
This is the new description
""",
    'author': "ybouhjira",
    'website': "ybouhjira.com",
    'category': 'Technical Settings',
    'version': '0.1',
    'depends': ["base"],
}
This file is in a folder named tutorial located in extra-addons, and I tried stopping and starting the containers, even removing and recreating them.
Like shodowsjedi already said, you need to create an __init__.py file (see the module structure: https://www.odoo.com/documentation/8.0/howtos/backend.html#module-structure).
Also, check permissions in your Odoo containers: your files in the odoo volume will have the uid and gid of your host system inside the container (which can be associated with a different user). To check this you can use docker exec:
docker exec docker_odoo_1 ls -la /mnt/extra-addons
If you don't know the Docker name of your container, you can retrieve it using:
docker-compose ps
Last, and probably the most important: check the Odoo logs using:
docker-compose logs
and update your module in the configuration page of Odoo (or at the startup of the server)
You have to add your own config file. First, in docker-compose.yml, mount /etc/odoo:
odoo:
  image: odoo
  links:
    - db:db
  ports:
    - "127.0.0.1:8069:8069"
  volumes:
    - extra-addons:/mnt/extra-addons
    - ./config:/etc/odoo
Then create "odoo.conf" in ./config and add configuration options like below.
[options]
addons_path = /mnt/extra-addons,/usr/lib/python2.7/dist-packages/odoo/addons
data_dir = /var/lib/odoo
auto_reload = True
Restart Odoo, go to debug mode, then Apps -> Update Apps List.
If it still doesn't work, check the access rights on the addons directories and check whether group and others can read them.
To create a new module you need more than the Odoo manifest file __openerp__.py; you also need the Python descriptor file __init__.py as the minimal structure. Of course you will need more than two files, but that is the minimum for the module to exist. Once you create a module on an existing database, you need to call Update Module List under Settings to load your module correctly, and then you will be able to install it.
Here is the quick guide on module creation.
Here is the detailed guide on the API and framework.
The --update option requires -d, specifying the database name.
Odoo CLI doc
Take into account that the description, icons, and version inside the manifest do not always change immediately. Try Shift+F5 in your browser, but this is not so relevant when you are developing.
Besides having, as a minimum, the manifest and __init__.py file, if you are using docker-compose I recommend having a script to take down, remove, and recreate your container.
./doeall
cat doeall
#!/bin/sh
docker-compose down
docker-compose rm
docker-compose up -d
docker-compose logs -f
For development purposes, it is also convenient to have the db in a separate docker-compose.yml, so that you can reuse the same db container for several Odoo installations.
Take a look at my docker-compose for multi-instance setups here:
https://github.com/bmya/odoo-docker-compose/tree/multi
Anyway, if you still want to use Postgres as the db in the same docker-compose file, you have it in this other branch:
https://github.com/bmya/odoo-docker-compose/blob/uni/docker-compose.yml
Again, regarding your module:
The important thing when you are writing code is:
When you change something in the methods in the Python code, just restart the server.
When you change something in the model inside the Python code, restart the server and reinstall the module.
When you change data files (views, data, etc.), just reinstall the module in order to update the data files.
This fixed my problem. We need to create "odoo.conf" in ./config:
[options]
addons_path = /mnt/extra-addons,/usr/lib/python2.7/dist-packages/odoo/addons
data_dir = /var/lib/odoo
auto_reload = True
First of all, create a directory with the docker-compose.yml file and these directories:
/addons
/volumes/odoo/sessions
/volumes/odoo/filestore
/docker-compose.yml
Put this code in your docker-compose.yml file:
version: '3'
services:
  web:
    image: odoo:12.0
    depends_on:
      - db
    ports:
      - "8069:8069"
    volumes:
      - odoo-web-data:/var/lib/odoo
      - ./volumes/odoo/filestore:/opt/odoo/data/filestore
      - ./volumes/odoo/sessions:/opt/odoo/data/sessions
      - ./addons:/mnt/extra-addons
  db:
    image: postgres:10
    environment:
      - POSTGRES_DB=postgres
      - POSTGRES_PASSWORD=odoo
      - POSTGRES_USER=odoo
      - PGDATA=/var/lib/postgresql/data/pgdata
    volumes:
      - odoo-db-data:/var/lib/postgresql/data/pgdata
volumes:
  odoo-web-data:
  odoo-db-data:
Then, in a terminal, run this to build your environment:
docker-compose up
docker-compose start or docker-compose stop
If you want to add a custom module, just put it in the addons directory, then click on Update Apps List in the Apps module and restart Docker. After this, disable all filters in the search bar. Normally, if you write the module name in the search bar, your custom module will show up below.
My docker-compose file supports running Odoo 15 on Docker:
version: '3'
services:
  postgres:
    image: postgres:13
    container_name: postgres
    restart: always
    ports:
      - "5432:5432"
    environment:
      POSTGRES_USER: ${POSTGRES_USER}
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
      POSTGRES_DB: ${POSTGRES_DB}
      PGDATA: /var/lib/postgresql/data
    volumes:
      - ./data/postgres:/var/lib/postgresql/data
  odoo:
    image: odoo:15
    container_name: odoo
    restart: always
    depends_on:
      - postgres
    ports:
      - "8069:8069"
      - "8072:8072"
    environment:
      HOST: postgres
      USER: ${POSTGRES_USER}
      PASSWORD: ${POSTGRES_PASSWORD}
    volumes:
      - ./etc/odoo:/etc/odoo
      - ./data/addons:/mnt/extra-addons
      - ./data/odoo:/var/lib/odoo
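The ${POSTGRES_USER}-style references are substituted by Compose from the shell environment or, typically, from a .env file next to docker-compose.yml. A placeholder example (the values are assumptions; pick your own credentials):

```
POSTGRES_USER=odoo
POSTGRES_PASSWORD=odoo
POSTGRES_DB=postgres
```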