How to include an external library in a Python wheel package

I want to create a Python package that embeds and uses an external library (.so) on Linux via the cffi module.
Is there a standard way to include a .so file in a Python package?
The package will be used only internally and won't be published to PyPI.
I think wheel packages are the best option: they create a platform-specific package with all files ready to be copied, so there will be no need to build anything on the target environments.

You can use auditwheel to inject the external libraries into the wheel:
auditwheel repair: copies these external shared libraries into the wheel itself, and automatically modifies the appropriate RPATH entries such that these libraries will be picked up at runtime. This accomplishes a similar result as if the libraries had been statically linked without requiring changes to the build system. Packagers are advised that bundling, like static linking, may implicate copyright concerns.
You can pre-build the external C/C++ library, typically by executing the following:
./configure && make && make install
This will generate a my_external_library.so file and install it in the appropriate path. However, you'll need to ensure the library path is set so that auditwheel can discover the missing dependency:
export LD_LIBRARY_PATH=/usr/local/lib
You can then build the Python wheel by executing:
python setup.py bdist_wheel
Finally, you can repair the wheel, which will inject my_external_library.so into the package.
auditwheel repair my-python-wheel-1.5.2-cp35-cp35m-linux_x86_64.whl
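Before repairing, you can also inspect which external shared libraries the wheel depends on (and which platform tag it can satisfy) with auditwheel's show subcommand:
auditwheel show my-python-wheel-1.5.2-cp35-cp35m-linux_x86_64.whl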
I successfully applied the above steps to the Python library confluent-kafka-python, which has a required C/C++ dependency on librdkafka.
Note: auditwheel is Linux-only. For macOS, see the delocate tool.
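Since the question mentions cffi, here is a minimal sketch of loading the bundled library at runtime in ABI mode; the library name, its location inside the package, and the declared C signature are all illustrative placeholders:
import os
from cffi import FFI

ffi = FFI()
# declare only the functions you actually call (hypothetical signature)
ffi.cdef("int my_function(int x);")

# assumption: the repaired wheel ships the library next to this module
lib_path = os.path.join(os.path.dirname(__file__), "my_external_library.so")
lib = ffi.dlopen(lib_path)

print(lib.my_function(42))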

Wheels are the standard way of distributing Python packages, but there is a problem when you have extension modules that depend on other .so files. The normal Linux dynamic linker is used to resolve them, and it only looks in system locations such as /usr/lib and /usr/local/lib. This is a problem when installing a wheel into a virtualenv.
As far as I know, you have three options:
Static linking, so the wrapper does not depend on anything else;
using ctypes to wrap the .so directly from Python (see the sketch below);
splitting the distribution into a wheel with the Python code & wrapper, and a separate RPM or DEB to install the .so into either /usr/lib or /usr/local/lib.
A wheel may work if you include the dependent .so as a data file to be stored in /lib and install into the root Python environment (I haven't tried that), but this will break if someone tries to install the wheel into a virtualenv (I did try that).
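For the ctypes option, a minimal sketch; the library path and the function name/signature are hypothetical:
import ctypes

# load the shared library directly (hypothetical path)
lib = ctypes.CDLL("/usr/local/lib/my_external_library.so")

# declare the signature of a hypothetical exported function
lib.my_function.argtypes = [ctypes.c_int]
lib.my_function.restype = ctypes.c_int

print(lib.my_function(42))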

Related

Python3. Setuptools. Adding a local package to a build

There is a locally built package (e.g. main-0.1.tar.gz), and another package (e.g. base-0.1) that requires main-0.1 as a dependency.
When base-0.1 is later installed, main-0.1 must be installed along with it.
That is, install_requires can only name packages available on PyPI; it is not clear how to add local packages to the build.
You can add main-0.1.tar.gz to the base-0.1 archive using MANIFEST.in (include main-0.1.tar.gz), but then dependency_links, for example, does not work correctly.
How do I add a local package to the build of another package and have it installed along with that package, as if it were pulled from PyPI?
You might want to look at:
PEP 440 ("File URLs")
PEP 508
import setuptools

setuptools.setup(
    # [...]
    install_requires=[
        'main @ file:///path/to/main-0.1.tar.gz',
        # [...]
    ],
)
Alternatively (probably better actually), use some combination of pip install options:
pip install --no-index --find-links '/path/to/distributions' main base
Reference:
https://pip.pypa.io/en/stable/user_guide/#installing-from-local-packages
Found a rough solution. I don't know how idiomatic it is, but it works.
Add include main-0.1.tar.gz to MANIFEST.in
In setup.py, at the end of the file (after the setup() call), add the following (this assumes os and sys are imported at the top of setup.py):
if 'sdist' not in sys.argv[1]:
    os.system('pip install main-0.1.tar.gz')
The condition may need to be different if, for example, sdist is not what you use for building (python setup.py sdist). The main thing is to somehow detect that setup.py is running to build the package, not to install it (as it will when someone later runs pip install base-0.1.tar.gz).
This way, the local dependency package is copied into the archive of the package being built, distributed along with it, and installed together with it.
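For completeness, the build-and-install round trip with this approach would look something like the following (names follow the question's example):
python setup.py sdist              # produces dist/base-0.1.tar.gz with main-0.1.tar.gz inside
pip install dist/base-0.1.tar.gz   # runs setup.py, which in turn pip-installs main-0.1.tar.gz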

pip does not install my package dependencies

I have developed a Python package on GitHub that I released on PyPI. It installs with pip install PACKAGENAME, but does not do anything with the dependencies stated in the install_requires of the setup.py file.
Weirdly enough, the zip file of the associated release does install all dependencies. I tried different virtual environments and different computers, but it never installs the dependencies. Any help appreciated.
pip install pythutils downloads a wheel if one is available, and one is available for your package.
When generating a wheel, setuptools runs python setup.py locally but doesn't include setup.py in the wheel. Download your wheel file and unzip it (it's just a zip archive): there is your main package directory pythutils and a metadata directory pythutils-1.1.1.dist-info. In the metadata directory there is a METADATA file that normally lists static dependencies, but yours doesn't list any, because when you generated the wheel all your dependencies had already been installed, so all your dynamic code paths were skipped.
The archive you downloaded from the GitHub release installs dependencies because it's not a wheel, so pip runs python setup.py install and your dynamic dependency code runs.
What can you do? My advice is to avoid dynamic dependencies. Declare static dependencies and let pip decide which versions to install:
install_requires=[
    'numpy==1.16.5; python_version>="2" and python_version<"3"',
    'numpy; python_version>="3"',
],
Another approach would be to create version-specific wheel files — one for Python 2 and another for Python 3 — with fixed dependencies.
Yet another approach is to not publish wheels at all and only publish an sdist (source distribution). Then pip is forced to run python setup.py install on the target machine. That's not the best approach, and it will certainly be problematic for packages with C extensions (the user must have a compiler and developer tools to install from source).
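If you go the sdist-only route, publishing would look something like this (twine shown as the usual upload tool; the file name assumes the project name and version seen above):
python setup.py sdist
twine upload dist/pythutils-1.1.1.tar.gz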
Your setup.py does a series of checks like
try:
    import numpy
except ImportError:
    if sys.version_info[0] == 2:
        install_requires.append('numpy==1.16.5')
    if sys.version_info[0] == 3:
        install_requires.append("numpy")
Presumably the system where you ran it had all the required modules already installed, so you ended up with an empty install_requires list. But this is the wrong way to do it anyway; you should simply make a static list (or two static lists, one each for Python 2 and Python 3, if you really want to support both in the same package).

How to upgrade/uninstall distutils packages (PyYAML) in windows OS

I am working on Windows 10 with Python 2.7.15.
I am trying to install a package, and during the installation process I received the following error:
Cannot uninstall 'PyYAML'. It is a distutils installed project and thus we cannot accurately determine which files belong to it which would lead to only a partial uninstall.
I tried to uninstall with the pip (18.1) command and received the same error:
pip uninstall PyYAML
How can I uninstall/upgrade a distutils package on Windows 10?
Base distutils functionality doesn't record any information about which files belong to a package, so it cannot be reliably uninstalled. That's what the message is telling you. Moreover, it doesn't have dependency metadata, so it can't be "upgraded" reliably either. All those features are additions by setuptools (and some by wheel and pip itself).
This can happen if you installed the package directly from source with setup.py install when setup.py is distutils- rather than setuptools-based, or if you installed it manually from certain types of packages by copying/extracting files.
Unless the way you installed it provides its own uninstaller, you'll have to manually figure out which files belong to the package and delete them from the Python directories.
Usually, these are:
site-packages\<package_name>* directories and/or
site-packages\<package_name>*.py for standalone modules
optionally, a site-packages\<package_name>.pth file
Generally, look for anything that bears the package's name on it.
If you can build the same package from source, you can use the build process to get a hint: build a binary package that you can look into (e.g. with setup.py bdist_wheel; a .whl is a ZIP archive) and see which files it contains.
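For example, a quick way to list a wheel's contents from Python (the wheel filename here is only a placeholder):
import zipfile

# list the files bundled in the wheel to see what would need deleting
with zipfile.ZipFile('PyYAML-3.13-cp27-cp27m-win_amd64.whl') as whl:
    for name in whl.namelist():
        print(name)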

Cannot compile Python 3.6 with Zlib support

I am trying to cross-compile Python for a Raspberry Pi 3, but I've been having a lot of issues. For one, it doesn't come with pip or setuptools, and if I try to install pip via python3.6 get-pip.py I get a
zipimport.ZipImportError: can't decompress data; zlib not available
error. Now, I tried to configure it using the --with-zlib option, but the configure script shows a warning that reads:
configure: WARNING: unrecognized options: --with-zlib
I read a lot of questions regarding this issue and double-checked that all the necessary files are present; I'm on Fedora, and I have installed both zlib and zlib-devel.
I have also just run a plain build (excluding the cross-compilation options and environment variables, and using a prefix for the output so it doesn't mess with my existing installation from the repos) and it doesn't work either: pip and setuptools are not present in site-packages in the output, and configure still doesn't recognize the --with- options.
What am I doing wrong? I'm on Fedora 26 which has python3.6 in the repos, if that matters for some reason.
Edit: Apparently, I can't build other modules either:
Python build finished successfully!
The necessary bits to build these optional modules were not found:
_bz2 _curses _curses_panel
_dbm _gdbm _lzma
_sqlite3 _ssl _tkinter
readline zlib
To find the necessary bits, look in setup.py in detect_modules() for the module's name.
I have checked in setup.py, specifically for the zlib case, and it is apparently ignoring my system's directories when looking for header files:
zlib_inc = find_file('zlib.h', [], inc_dirs)
As you can see, the argument that should be filled with my system's "standard" paths is empty. If this is a generated file, how can I instruct it to fill in the aforementioned paths? --with-zlib-dir=/path and --with-zlib=/path yielded the same results; the configure script didn't recognize them.
What is wrong with the process? How can I have Python find and build those packages?
This is a follow-up to my earlier question about the installation missing pip and setuptools.

Including a Python package dependency as an executable

Currently my Python package does not have a dependency on the wmi package, and it can be easily installed via
pip install mypackage
If I add a dependency on the wmi package, this will likely fail: when I try installing wmi through pip, I encounter errors because I do not have Visual Studio 2008 installed, and I only managed to get it installed using the binary distribution.
Is it possible for me to include and install the binary release of wmi in my package?
The main concern is that if people fail to install my package via the pip command, they will just avoid using it.
The first thing to consider is why you are adding the wmi package: since it is MS-Windows specific, if you use it, or anything depending on it, your package will also be MS-Windows specific.
Are there other ways to achieve what you are trying to do that remain cross-platform? If not, and you really have to use it, then you could include a prerequisite statement in the documentation, and ideally in setup.py, telling people that they need an installed and working copy of wmi, hopefully with a pointer to the binary distributions.
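A minimal sketch of such a check at the top of setup.py; the wording of the message is only an example:
import sys

# fail early with a helpful message if the binary-only dependency is missing
try:
    import wmi  # noqa: F401
except ImportError:
    sys.exit("This package requires the 'wmi' package; please install its "
             "binary distribution first.")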
The other way to go, if you are on a recent enough version of Python, is to build and distribute your package as Python wheels. Wheels allow the inclusion of compiled C elements without relying on the presence of a compiler on the target system; see PEP 427 for some more information.
Creating Wheels:
You need to be running Python 2 (> 2.6) or Python 3, with pip >= 1.4 and setuptools >= 0.8.
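In practice, the bdist_wheel command also needs the wheel package itself to be installed, so it is worth upgrading the build toolchain first:
pip install --upgrade pip setuptools wheel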
Basically, assuming that you have a setup.py that creates your (source) distribution for upload to PyPI with:
python setup.py sdist
then you can create a binary distribution that should contain all the dependencies of your package for your current Python version with:
python setup.py bdist_wheel
This will build a distribution wheel that includes the .pyc files and the binary files from the required packages.
But you need to do this once for each version of Python that you plan to support (virtualenv is magic for this), and on each platform if you also plan to support 64-bit or Mac. Unless, of course, you manage to make a pure-Python package that runs, without 2to3, under both Python 2 and 3, in which case you can build a universal wheel; obviously you cannot do this if you require C extensions.
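If your package does qualify as pure Python, the universal flag can be set once in setup.cfg instead of being passed on every build:
[bdist_wheel]
universal = 1
With that in place, python setup.py bdist_wheel produces a single py2.py3-none-any wheel.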
For more information on wheels see Wheel - Read The Docs.
