pip: Cannot perform editable system-wide install of "pyproject.toml" project - python

Consider the following minimal Python project with setuptools packaging and a "pyproject.toml" file (see setuptools Build System Support):
> tree myproject
myproject
|-- myproject
| `-- __init__.py
|-- pyproject.toml
|-- setup.cfg
`-- setup.py
"setup.py" is only a minimal dummy file to enable support for editable installs, as described here:
from setuptools import setup

if __name__ == '__main__':
    setup()
When performing an editable install (pip install -e) to a virtualenv, everything works as expected:
> ls venv/lib/python3.9/site-packages | grep myproject
myproject.egg-link
> cat venv/lib/python3.9/site-packages/easy-install.pth
/myproject
> python3
>>> import myproject
Hello world!
The same is true for a non-editable system-wide install:
> ls /usr/local/lib/python3.9/dist-packages | grep myproject
myproject
myproject-1.0.dist-info
> python3
>>> import myproject
Hello world!
For an editable system-wide install, however, pip succeeds but does not result in a usable module:
> ls /usr/local/lib/python3.9/dist-packages | grep myproject
(No output)
> python3
>>> import myproject
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ModuleNotFoundError: No module named 'myproject'
I'm aware that there were some issues in the past with "pyproject.toml" and editable installs. However, those appear to have been fixed since 2019.
I have a relatively recent Debian Bullseye system with pip 20.3.4 and setuptools 52.0.0.
There is also PEP 660, which setuptools has not implemented yet. However, the dummy "setup.py" file (see above) should work around that limitation.

This is a Debian-specific problem.
Having a "pyproject.toml" file (i.e. a PEP 518 package), enables build isolation by default. On Debian, that results in editable installs ending up in "/usr/lib/python3.9/site-packages" instead of "/usr/local/lib/python3.9/dist-packages":
> ls /usr/lib/python3.9/site-packages
easy-install.pth myproject.egg-link
> cat /usr/lib/python3.9/site-packages/easy-install.pth
/myproject
However, "/usr/lib/python3.9/site-packages" is not part of the default Debian Python module paths.
I reported this behavior as Debian bug #1004149. Until it is fixed, one can work around it by passing --no-build-isolation to pip install -e.
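Before applying the workaround, the failure can be confirmed from inside the interpreter; a minimal sketch, assuming the example package name myproject from above:
import sys
import importlib.util

# On affected Debian systems, /usr/lib/python3.9/site-packages does not
# appear in the module search path...
print('\n'.join(sys.path))

# ...so the editable install cannot be resolved
spec = importlib.util.find_spec('myproject')
print(spec.origin if spec else 'myproject not importable')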

Related

How to correctly install custom package in Docker with docker-compose?

I am building a flask api and want to create a docker image of it. However, when I do docker-compose run (after build), it cannot find the module.
Error:
api_1 | Traceback (most recent call last):
api_1 | File "app.py", line 6, in <module>
api_1 | from api.classify.classify import get_prediction
api_1 | ModuleNotFoundError: No module named 'api'
My folder structure looks like this:
- api
-- classify
--- classify.py
-- app.py
-- Dockerfile
-- requirements.txt
-- setup.py
The setup.py looks like this:
from setuptools import setup, find_packages

setup(
    name='image_api',
    keywords='',
    version='0.1',
    packages=find_packages()
)
And the Dockerfile looks like this:
FROM python:3
WORKDIR /user/src/app
ENV PYTHONPATH=/api
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
RUN python setup.py install
CMD ["python", "app.py"]
How do I fix this and what is best practice to install custom packages when building a Docker image?
You should never overwrite PYTHONPATH like that; instead, append your path, or the system will not find the installed Python packages.
You can make it work by appending in the Dockerfile:
ENV PYTHONPATH="$PYTHONPATH:/api"
(A RUN export PYTHONPATH="$PYTHONPATH:/api" would only affect that single build step, so ENV is the form to use.)
Also, your Dockerfile should sit at the level above api; with the present directory structure, the copied files do not contain an api package directory, so the import cannot be resolved.
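For debugging, a small check run inside the container shows what PYTHONPATH actually contributes; a sketch, not part of the original answer:
import os
import sys

# PYTHONPATH entries are added to sys.path at interpreter startup,
# ahead of the site-packages directory that pip installs into;
# this is why overwriting the variable hides installed packages
print('PYTHONPATH =', os.environ.get('PYTHONPATH', '(unset)'))
for entry in sys.path:
    print(entry)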

single-sourcing package version for setup.cfg Python projects

For traditional Python projects with a setup.py, there are various ways of ensuring that the version string does not have to be repeated throughout the code base. See PyPA's guide on "Single-sourcing the package version" for a list of recommendations.
Many projects are trying to move away from setup.py to setup.cfg (probably under the influence of PEP 517 and PEP 518; setup.py was mostly used declaratively anyway, and when there was logic in setup.py, it was probably for the worse). This means that most of the suggestions won't work anymore, since setup.cfg cannot contain code.
How can I single-source the package version for Python projects that use setup.cfg?
There are a couple of ways to do this (see below for the project structure used in these examples):
1.
setup.cfg
[metadata]
version = 1.2.3.dev4
src/my_top_level_package/__init__.py
import importlib.metadata
__version__ = importlib.metadata.version('MyProject')
2.
setup.cfg
[metadata]
version = file: VERSION.txt
VERSION.txt
1.2.3.dev4
src/my_top_level_package/__init__.py
import importlib.metadata
__version__ = importlib.metadata.version('MyProject')
3.
setup.cfg
[metadata]
version = attr: my_top_level_package.__version__
src/my_top_level_package/__init__.py
__version__ = '1.2.3.dev4'
And more...
There are probably other ways to do this, by playing with different combinations.
References:
https://setuptools.readthedocs.io/en/latest/userguide/declarative_config.html
https://docs.python.org/3/library/importlib.metadata.html
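A variant of the importlib.metadata approach that also works on Python versions before 3.8 via the importlib-metadata backport; a sketch, assuming the distribution name 'MyProject' from the examples:
src/my_top_level_package/__init__.py
try:
    from importlib.metadata import version, PackageNotFoundError
except ImportError:  # Python < 3.8: fall back to the backport
    from importlib_metadata import version, PackageNotFoundError

try:
    __version__ = version('MyProject')
except PackageNotFoundError:
    # not installed, e.g. running from a plain source checkout
    __version__ = '0.0.0.dev0'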
Structure assumed in the previous examples is as follows...
MyProject
├── setup.cfg
├── setup.py
└── src
└── my_top_level_package
└── __init__.py
setup.py
#!/usr/bin/env python3
import setuptools

if __name__ == '__main__':
    setuptools.setup(
        # see 'setup.cfg'
    )
setup.cfg
[metadata]
name = MyProject
# See above for the value of 'version = ...'
[options]
package_dir =
    = src
packages = find:

[options.packages.find]
where = src
$ cd path/to/MyProject
$ python3 setup.py --version
1.2.3.dev4
$ python3 -m pip install .
# ...
$ python3 -c 'import my_top_level_package; print(my_top_level_package.__version__)'
1.2.3.dev4
$ python3 -V
Python 3.6.9
$ python3 -m pip list
Package Version
------------- ----------
MyProject 1.2.3.dev4
pip 20.0.2
pkg-resources 0.0.0
setuptools 45.2.0
wheel 0.34.2
zipp 3.0.0
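Note that a setup.cfg-driven project like this is nowadays typically accompanied by a minimal pyproject.toml declaring the build backend; a sketch, where the version pin is an assumption:
pyproject.toml
[build-system]
requires = ["setuptools>=40.8.0", "wheel"]
build-backend = "setuptools.build_meta"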

python ModuleNotFoundError after running pip install -e .

I'm trying to compile and install the following python package, system-wide:
https://github.com/mathurinm/BlitzL1/
(note that the __init__.py of the module is inside a folder named python)
So I run, at the root of the repo,
pip install -e .
I get:
zongo@zongo-HP-EliteBook-840-G3:~/workspace/BlitzL1$ pip install -e .
Obtaining file:///home/zongo/workspace/BlitzL1
Installing collected packages: blitzl1
Running setup.py develop for blitzl1
Successfully installed blitzl1
zongo@zongo-HP-EliteBook-840-G3:~/workspace/BlitzL1$ ipython
Python 3.6.6 | packaged by conda-forge | (default, Jul 26 2018, 09:53:17)
Type 'copyright', 'credits' or 'license' for more information
IPython 7.0.1 -- An enhanced Interactive Python. Type '?' for help.
In [1]: import blitzl1
---------------------------------------------------------------------------
ModuleNotFoundError Traceback (most recent call last)
<ipython-input-1-8bb5a22c28e9> in <module>
----> 1 import blitzl1
ModuleNotFoundError: No module named 'blitzl1'
after trial and error, I found that renaming the python folder to blitzl1 and replacing, in setup.py:
package_dir = {"blitzl1": "python"},
by
package_dir = {"blitzl1": "blitzl1"},
makes it possible to import the package. Why is the first one not working?
By the way:
zongo@zongo-HP-EliteBook-840-G3:~/workspace/BlitzL1$ which pip
/home/zongo/anaconda3/bin/pip
This is due to a long-standing issue in pip when installing a package in develop mode while the package directory is not in the same folder as setup.py. See here for more info.
To be clearer, if the package name is my_package and the structure of the source is:
|- setup.py
|- src
   |- __init__.py
   |- ...
with package_dir={'my_package':'src'}, installing the package with either pip install -e . or python setup.py develop will raise the error reported by the OP.
A way to mitigate this is to change to package_dir={'':'src'} and change the structure of the repo to
|- setup.py
|- src
   |- my_package
      |- __init__.py
      |- ...
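For completeness, a minimal setup.py matching the restructured layout; a sketch, reusing the name my_package from the example:
from setuptools import setup, find_packages

setup(
    name='my_package',
    version='0.1',
    package_dir={'': 'src'},             # packages live under src/
    packages=find_packages(where='src'),
)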

setuptools very simple (one source file module) configuration

I want to use setuptools to create a package consisting of two files: foo.py (script) and foo.conf.
Then I want to publish the package on my devpi-server and then install the package using pip.
Suppose that initially my current working directory is clean
$ ls -l
total 0
Then I issue a pip install (or download?) command
$ pip install -i http://mydevpi.server foo
And get a dir with my two files created
> tree
.
`-- foo
    |-- foo.py
    `-- foo.conf
So questions are:
what setuptools configuration should I use?
what exact pip command should I use to install the package the way I want? Will pip install -i http://mydevpi.server --target=. do the trick?
First, write a setup.py in the foo directory, something like:
import setuptools

setuptools.setup(
    name='foo_pip',
    version='1',
    packages=[''],
    url='1',
    license='1',
    author='1',
    author_email='1',
    description='1'
)
(You can use distutils or setuptools)
Then run python setup.py bdist_wheel -d TARGET; a .whl file will appear in the target directory. Copy its path.
You can now install it using pip install the_wheel_file_path --prefix="the_path_to_install".
The output looks something like this:
Processing .../TARGET/foo_pip-1-py2-none-any.whl
Installing collected packages: foo-pip
Successfully installed foo-pip-1
Then use it with import foo.
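As an aside, the usual way to ship a single-file module like foo.py is py_modules rather than packages=['']; a sketch, where shipping foo.conf via data_files is an assumption about where the config should land:
import setuptools

setuptools.setup(
    name='foo_pip',
    version='1',
    py_modules=['foo'],                  # ships foo.py as a top-level module
    data_files=[('foo', ['foo.conf'])],  # installs foo.conf under <prefix>/foo
)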

Minimal working example for conda package build

I am trying to look into conda for python package/virtual environments management. However, I can't seem to be able to build my own conda package. Could somebody help me to construct a minimal working example?
First of all some directory structure:
- src/
  |- foo1.py
  |- foo2.py
- conda-build/
  |- meta.yaml
- setup.py
Running python setup.py install installs the package as expected. Now if I cd to the conda-build directory and run conda build ., I get the following output:
Removing old build directory
Removing old work directory
BUILD START: example_pkg-0.5.1-abc
Fetching package metadata: ......
Solving package specifications: .
The following NEW packages will be INSTALLED:
pip: 6.1.1-py34_0
python: 3.4.3-0
setuptools: 15.0-py34_0
Linking packages ...
[ COMPLETE ]|##################################################| 100%
Removing old work directory
Copying C:\some\path\ to C:\Anaconda3\conda-bld\work
Package: example_pkg-0.5.1-abc
source tree in: C:\Anaconda3\conda-bld\work
number of files: 0
Fixing permissions
Fixing permissions
BUILD END: example_pkg-0.5.1-abc
Nothing to test for: example_pkg-0.5.1-abc
# If you want to upload this package to binstar.org later, type:
#
# $ binstar upload C:\Anaconda3\conda-bld\win-64\example_pkg-0.5.1-abc.tar.bz2
#
# To have conda build upload to binstar automatically, use
# $ conda config --set binstar_upload yes
I can indeed find the package in the directory C:\Anaconda3\conda-bld\win-64 but the package does not seem to contain any files. I can install the package using conda install --use-local .\example_pkg-0.5.1-abc.tar.bz2 and then it is listed by conda list, however I cannot import it in Python. This is my meta.yaml:
package:
  name: example_pkg
  version: "0.5.1"

source:
  path: ../src

build:
  number: 1
  string: abc
  script: python setup.py install

requirements:
  build:
    - python
  run:
    - python
Any help is greatly appreciated! :)
It looks like there is an issue where the build/script entry in meta.yaml does not work on Windows. Until that PR is merged, you should just create a bld.bat with
python setup.py install
if errorlevel 1 exit 1
and put it in the conda recipe.
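After rebuilding and reinstalling, a quick sanity check confirms the modules actually made it into the environment; a sketch, assuming the module names foo1 and foo2 from the example source tree:
import importlib.util

# print where each module would be imported from, if anywhere
for name in ('foo1', 'foo2'):
    spec = importlib.util.find_spec(name)
    print(name, '->', spec.origin if spec else 'NOT FOUND')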
