On Ubuntu 16.04.4, I suspect a recent update of some Python system package has broken my Python 2.7 configuration. Whatever package I try to install or reinstall with a basic sudo python setup.py install, it always fails because of .gitignore:
running install
running bdist_egg
running egg_info
[...]
installing library code to build/bdist.linux-x86_64/egg
running install_lib
running build_py
error: Error: setup script specifies an absolute path:
/home/me/some_repo/.gitignore
setup() arguments must *always* be /-separated paths relative to the
setup.py directory, *never* absolute paths.
Lately I had found a temporary workaround: manually cleaning the /usr/local/lib/python2.7/dist-packages/some_package directory before installing some_package. However, tonight I'm facing the same issue on another repository, and it keeps failing no matter what I clean up. I also tried removing all build artifacts (.egg-info/, build/, dist/) without success.
Notes: the setup script does not actually specify an absolute path to .gitignore. An example of a failing repo is https://github.com/philchristensen/python-artnet/blob/master/setup.py. That repo has a setuptools_git entry point, which might be a clue, but other packages without this entry point also fail on .gitignore, whereas a couple of months ago I never hit this issue with the same repos. Deleting .gitignore just makes the setup fail on another non-Python local file.
Any clue?
It looks like some other package that I had previously installed broke my system-wide Python.
Here's how I fixed it in order to install package xyz:
Browsed /usr/local/lib/python2.7/dist-packages looking for occurrences of "gitignore" (a rough sketch of this search follows below)
Deleted the folders of all matching occurrences (including setuptools_git itself, which matches "gitignore")
pip install setuptools_git
In package xyz, rm -rf dist/ build/ *.egg-info/
Reinstalled package xyz, which now installs successfully
Lesson learned: use virtual envs
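For step 1, here is a rough sketch of that search, essentially a recursive grep (the dist-packages path is the one from my setup; treat the rest as illustrative). Deciding which of the matching folders to delete remains a manual judgment call:

```python
# Sketch: grep dist-packages for files mentioning "gitignore".
import os

SITE = "/usr/local/lib/python2.7/dist-packages"

for root, dirs, files in os.walk(SITE):
    for fname in files:
        path = os.path.join(root, fname)
        try:
            with open(path, "rb") as fh:
                if b"gitignore" in fh.read():
                    print(path)
        except (IOError, OSError):
            pass  # unreadable file; skip it
```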
Related
I am attempting to install from a Python setup. I had this working after a month of hellish attempts on 18.04, and now need it on 20.04.
The setup.py I am using is in this repository (https://github.com/flashlight/flashlight/tree/main/bindings/python).
I am using Ubuntu 20.04, fresh install sans a few dependencies I know the system needs.
I have Python 3.10 installed locally.
During the install, at the steps
cd bindings/python
python3.10 setup.py install
I hit the error
CMake Error at cmake/FindMKL.cmake:400. MKL library not found. Please specify the library location by appending the root directory of the MKL installation to the environment variable CMAKE_PREFIX_PATH
This is weird for a couple of reasons:
I did a sudo apt-get install mkl-devel, which succeeded, but also
the setup script claims it defaults to MKL off.
Right before this error, I do get a warning.
Cmake Warning at cmake/FindMKL.cmake:387. MKL libraries files are found, but MKL header files are not. You can get them by 'pip install mkl-devel' (I did this btw) if using pip. If build fails with header files available on the system, please make sure that CMake will search the directory containing them by setting CMAKE_INCLUDE_PATH
That last part I did not do, primarily because I'm not sure how to tell where pip install mkl-devel would have put those header files.
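One way to check that, at least for pip-installed packages, is importlib.metadata (Python 3.8+), which can list the files a distribution recorded at install time. The distribution name mkl-devel is the one from the pip install above; note it may be a meta-package whose headers are actually owned by a separate mkl-include distribution, so this is only a best-effort sketch:

```python
# Sketch: print header files recorded for the mkl-devel distribution.
# mkl-devel may be a meta-package; the .h files may belong to mkl-include.
from importlib.metadata import files

for f in files("mkl-devel") or []:
    if f.suffix == ".h":
        print(f.locate())  # absolute path to the installed header
```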
That said, when I do a find / -name "*mkl*", I notice files in two primary locations:
.h files : /home/myusername/.local/include
.so files: /home/myusername/.local/lib
So I set the following environment variables in my terminal
LIBRARY_PATH=/home/myusername/.local/lib
LD_LIBRARY_PATH=/home/myusername/.local/lib
KENLM_ROOT=/home/myusername/Flashlight/kenlm
USE_MKL=0
CMAKE_INCLUDE_PATH=/home/myusername/.local/include/
CMAKE_PREFIX_PATH=/home/myusername/.local/lib
However, when I try to install again, the process still fails with both of the MKL issues above (warning and error).
At this point I'm baffled as to what I'm supposed to point where to get this to succeed.
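One thing worth double-checking here: plain NAME=value assignments in a shell only affect that shell; a child process such as the build will not see them unless they are exported (export CMAKE_PREFIX_PATH=...). Below is a minimal sketch that drives the build with the variables guaranteed to be in its environment; the paths are the ones above, but pointing CMAKE_PREFIX_PATH at the prefix root rather than lib/ is an assumption:

```python
# Sketch: run the build with the variables explicitly in its environment
# (equivalent to export-ing them in the shell first).
import os
import subprocess

env = os.environ.copy()
env.update({
    "CMAKE_PREFIX_PATH": "/home/myusername/.local",           # prefix root, not lib/
    "CMAKE_INCLUDE_PATH": "/home/myusername/.local/include",
    "LD_LIBRARY_PATH": "/home/myusername/.local/lib",
    "KENLM_ROOT": "/home/myusername/Flashlight/kenlm",
    "USE_MKL": "0",
})

subprocess.run(["python3.10", "setup.py", "install"], env=env, check=True)
```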
I've got a couple of projects here for which I'm currently preparing documentation, hosted at readthedocs.org. All of them use poetry, and I use custom .readthedocs.yml files with this entry:
python:
  install:
    - method: pip
      path: .
It works fine for most projects, but two of them fail for different reasons during installation of the project via pip:
The first one uses PyGObject, which fails like this:
Package gobject-introspection-1.0 was not found in the pkg-config search path.
Perhaps you should add the directory containing `gobject-introspection-1.0.pc'
to the PKG_CONFIG_PATH environment variable
No package 'gobject-introspection-1.0' found
Command '('pkg-config', '--print-errors', '--exists', 'gobject-introspection-1.0 >= 1.56.0')' returned non-zero exit status 1.
Try installing it with: 'sudo apt install libgirepository1.0-dev'
So it seems that PyGObject cannot be installed unless certain system packages are present. I could rearrange the code so that the import is not top-level (a sketch of that follows below), but I would still need it in the dependencies. Can I tell pip install to ignore this single package somehow? Any other ideas?
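On the "not top-level" idea, a guarded import is one way to sketch it (the Gtk/3.0 names are illustrative): the package, and therefore the docs build, can then be imported even when gobject-introspection is missing. The dependency side is a separate issue; PyGObject would still need to be made optional (Poetry supports optional dependencies grouped into extras) for pip to skip installing it:

```python
# Sketch: guard the gi import so the package is importable without
# gobject-introspection installed (module/version names are illustrative).
try:
    import gi
    gi.require_version("Gtk", "3.0")
    from gi.repository import Gtk
except (ImportError, ValueError):
    Gtk = None  # GUI features unavailable; docs can still be built
```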
The second project compiles some C++ code via Cython and fails because a library is missing. I use a custom build script in the pyproject.toml:
[tool.poetry.build]
script = "build.py"
generate-setup-file = false
Is there some flag in pip that I could set and retrieve in build.py to skip the compilation? Or is there a better way?
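Rather than a pip flag, one common approach is an environment variable check inside build.py itself. Read the Docs sets READTHEDOCS=True in its build environment, which the script can test; the build(setup_kwargs) structure and the .pyx path below are assumptions about how your build script is wired up:

```python
# build.py - sketch: skip the Cython/C++ compilation on Read the Docs
# (READTHEDOCS=True there) or when SKIP_EXT is set by hand.
# The build(setup_kwargs) hook and module path are assumptions.
import os

def build(setup_kwargs):
    if os.environ.get("READTHEDOCS") == "True" or os.environ.get("SKIP_EXT"):
        return  # pure-Python install; no extension modules are built
    from Cython.Build import cythonize  # imported lazily, only when compiling
    setup_kwargs["ext_modules"] = cythonize(["mypkg/_native.pyx"])
```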
Question
Is there a way to build a wheel for a package while in a different repository, such that the wheel is built exactly as it would be if you built it inside the repository containing the package?
Example
Consider the following repo:
/repo-containing-your-package
|___ your_module/
|___ setup.py
Build method A
When I run python setup.py bdist_wheel from within repo-containing-your-package, it builds the wheel as expected, including your_module. This means that after I install it with pip install ./dist/your_module-#.#.#-py3-none-any.whl (which succeeds), I can run python -m your_module.foo from the command line.
When the package is building, I get output that verifies that my module has been picked up by the wheel:
creating 'dist/your_module-#.#.#-py3-none-any.whl' and adding 'build/bar' to it
adding 'your_module/__init__.py'
etc...
Build method B
However, if I run python ../repo-containing-your-package/setup.py bdist_wheel from a repository that is a sibling of repo-containing-your-package, it does not build the wheel as expected: it fails to include your_module. This means that after I install it with pip install ./dist/your_module-#.#.#-py3-none-any.whl (which succeeds), attempting python -m your_module.foo fails:
Error while finding module specification for 'your_module.foo' (ModuleNotFoundError: No module named 'your_module')
The fact that the module has not been properly installed with the package is confirmed by the build output, which does not include the adding 'your_module' lines that method A produces.
Two solutions I know of:
change working directory in setup.py
If you can modify the setup script, you can change the working directory programmatically. Add an os.chdir call early enough in the setup script:
import os
from setuptools import setup

# abspath() guards against __file__ being a bare file name, in which case
# dirname() would return '' and os.chdir('') would raise an error
os.chdir(os.path.dirname(os.path.abspath(__file__)))

setup(...)
You can also change the working directory by other means, without modifying the setup script, e.g. in bash:
$ pushd path/to/repo; python setup.py bdist_wheel; popd
Use pip wheel
pip has a subcommand wheel that builds a wheel from the given argument; this is usually the name of a package, but it can also be a directory containing the setup script. Pass -e in that case so the wheel gets the correct name:
$ pip wheel -e path/to/repo
I have run
python setup.py sdist --formats=gztar,zip bdist_wheel
and then
python setup.py install
The result is that the egg files are created in the site-packages directory, but not a <package-name>/ directory containing the package source files:
$ ls /usr/local/lib/python3.7/site-packages/infix*
/usr/local/lib/python3.7/site-packages/infixpy-0.0.3-py3.7.egg
/usr/local/lib/python3.7/site-packages/infixpy.egg-link
/usr/local/lib/python3.7/site-packages/infixpy-0.0.4-py3.7.egg
Notice that the directory infixpy was not created, and thus none of the source code was copied. What am I missing / not understanding about this local installation process?
Update: when I had run
pip3 install infixpy
there was an additional directory infixpy, and the source code was included in that directory. Running the local or develop modes of setup.py install did not cause that code to be updated and - crucially - the stack traces from running any Python code (even in a completely new ipython REPL) showed only the older, pip3-installed code, in particular the file __init__.py. So my observation has been that the source file:
/usr/local/lib/python3.7/site-packages/infixpy/__init__.py
is an accurate reflection of what the Python executable was using. @phd mentions that the source code is already included in the egg. So I do not understand the relationship between the source code in the egg and the source code in that subdirectory - which, in my latest run, is completely missing.
The following commands all yield slightly different results:
pip install .: installed as uncompressed package directories and a XXX.dist-info directory
pip install infixpy: same as previous, but installed from an (remote) index (per default PyPI), not from the local directory
python setup.py install: installed as a zipped file XXX.egg
pip install --editable . or python setup.py develop: not installed, but linked as a XXX.egg-link file
So depending on the commands entered, the content of site-packages is different.
Now this is what you say you have:
$ ls /usr/local/lib/python3.7/site-packages/infix*
/usr/local/lib/python3.7/site-packages/infixpy-0.0.3-py3.7.egg
/usr/local/lib/python3.7/site-packages/infixpy.egg-link
/usr/local/lib/python3.7/site-packages/infixpy-0.0.4-py3.7.egg
This is a bit surprising, since theoretically there are 3 versions of your project that are importable (0.0.3, 0.0.4, and develop/editable). I am not sure which one is used by the Python interpreter in this case. You might want to run pip uninstall infixpy a couple of times to start fresh and alleviate these uncertainties. You can then experiment with the commands mentioned above and see how they impact the content of site-packages along with inspecting the result of pip show infixpy.
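A quick way to see which of those copies the interpreter actually resolves:

```python
# Sketch: check which installed copy of infixpy gets imported.
import infixpy

print(infixpy.__file__)  # the file the import resolved to
print(getattr(infixpy, "__version__", "no __version__ attribute"))
```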
Goals:
Make use of modern Python packaging toolsets to deploy/install proprietary packages into some virtualenv.
The installed packages should include compiled *.pyc (or *.pyo) files only, without source files.
There are a couple of packages, and a vendor name (here we choose dgmx for our studio) is used as the top-level package name. Therefore the installed packages would be something like dgmx/alucard, dgmx/banshee, dgmx/carmilla, ...
The file hierarchy of the installed packages should be like the one produced by python setup.py install --single-version-externally-managed or pip install. Refer to: How come I can't get the exactly result to *pip install* by manually *python setup.py install*?
Question in short:
I would like to deploy proprietary namespaced packages into a virtualenv as compiled *.pyc (or *.pyo) files only, in which the file/directory hierarchy simply reflects the namespace, without polluting sys.path with lots of ooxx.egg paths.
Things I have tried:
python setup.py bdist_egg --exclude-source-files then easy_install ooxx.egg.
pollute "sys.path" for each namespace package.
python setup.py install --single-version-externally-managed.
not *.pyc only.
the "install_requires" got ignored!
need to manually put a ooxx.egg-info/installed-files.txt to make uninstall work correctly.
pip install . in the location of "setup.py".
not *.pyc only.
pysetup install . in the location of "setup.py".
not *.pyc only.
Update:
My current idea is to follow method 2.
python setup.py egg_info --egg-base . # get requires.txt
python setup.py install --single-version-externally-managed --record installed-files.txt # get installed-files.txt
manually install other dependencies through "requires.txt"
manually delete the installed source files (*.py) listed in "installed-files.txt"
remove the source files (*.py) from "installed-files.txt" and put it into the deployed "ooxx.egg-info/installed-files.txt" (a sketch of these last two steps follows below)
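A sketch of those last two steps, driven by the recorded file list. It assumes the Python 2 layout where foo.pyc sits next to foo.py; everything else is illustrative:

```python
# Sketch: delete installed *.py files that have a compiled twin and rewrite
# installed-files.txt so that uninstall still matches what is on disk.
import os

RECORD = "installed-files.txt"

with open(RECORD) as fh:
    paths = [line.strip() for line in fh if line.strip()]

kept = []
for path in paths:
    if path.endswith(".py") and os.path.exists(path + "c"):  # foo.py -> foo.pyc
        os.remove(path)        # drop the source file
    else:
        kept.append(path)      # keep .pyc, data files, and metadata entries

with open(RECORD, "w") as fh:
    fh.write("\n".join(kept) + "\n")
```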
References:
Migrating to pip+virtualenv from setuptools
installing only .pyc (python compiled) with setuptools
Can I deploy Python .pyc files only to Google App Engine?
How come I can't get the exactly result to *pip install* by manually *python setup.py install*?
A trick that may help:
Compile your sources into .pyc files and zip them up in a single .zip file.
Write a simple bootstrap module; all it does is add the .zip to sys.path.
So when you import this module, the .zip is on the path. All that remains is to copy the zip file to the proper place in a custom step in setup.py.
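A sketch of that bootstrap module (the zip name and location are assumptions); Python's zipimport machinery imports .pyc modules straight from an archive once it is on sys.path:

```python
# Sketch: bootstrap module whose only job is to put the compiled-code zip
# on sys.path (file name and layout are assumptions).
import os
import sys

_ZIP = os.path.join(os.path.dirname(os.path.abspath(__file__)), "ooxx_compiled.zip")

if _ZIP not in sys.path:
    sys.path.insert(0, _ZIP)  # zipimport handles imports from the archive
```

Importing the bootstrap once (import ooxx_bootstrap) then makes the zipped packages importable as usual.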