Installing shared library with Python package, not separately

I have successfully built a Python package that uses CMake combined with pybind11 to create a shared object file (.so; assuming Linux-only usage at the moment). The implementation works, but I am unable to remove this shared object file using pip uninstall package.
My setup command in the setup.py file looks like this, taken from the pybind/cmake_example repository:
setup(
    name='package',
    version='0.0.1',
    author='-',
    author_email='-',
    description='A test project using pybind11 and CMake',
    long_description='',
    ext_modules=[CMakeExtension('packagebindings')],
    cmdclass=dict(build_ext=CMakeBuild),
    zip_safe=False,
    packages=setuptools.find_packages(),
)
My CMakeLists.txt file has an install instruction that looks like this:
install(TARGETS packagebindings COMPONENT python LIBRARY DESTINATION ${Python_SITELIB})
To summarise, here are the files that are created when running pip install .:
path/to/site-packages/package/* - removed by pip uninstall package
path/to/site-packages/package-0.0.1.dist-info/* - removed by pip uninstall package
path/to/site-packages/packagebindings.cpython-37m-x86_64-linux-gnu.so - still present after pip uninstall package
I would like to know how to make it so that running pip uninstall package also removes the .so file.
If a further MRE is required, I can link to a repository.

Your CMake install target places the .so directly into the Python installation directory (DESTINATION ${Python_SITELIB}). My guess is that this keeps the .so out of the installation record, so it is not removed when uninstalling. I would suggest making CMake place the .so in a distribution directory instead, and then adding the following option to setup():
data_files=[("installation_bin", ["distribution_bin/library.so"])]
This lets pip track the .so. The first string is a directory relative to the installation prefix; the second is the .so file in your distribution, relative to the setup.py script.
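Slotted into the setup() call from the question, that could look like the sketch below; installation_bin and distribution_bin are the illustrative names from above, and CMakeExtension/CMakeBuild are the helpers from the cmake_example repository shown in the question:

import setuptools
from setuptools import setup

setup(
    name='package',
    version='0.0.1',
    ext_modules=[CMakeExtension('packagebindings')],  # helper from pybind/cmake_example
    cmdclass=dict(build_ext=CMakeBuild),              # helper from pybind/cmake_example
    zip_safe=False,
    packages=setuptools.find_packages(),
    # (directory relative to the install prefix, files relative to setup.py)
    data_files=[('installation_bin', ['distribution_bin/library.so'])],
)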

Related

pip does not install my package dependencies

I have developed a Python package on GitHub that I released on PyPI. It installs with pip install PACKAGENAME, but does not do anything with the dependencies that are stated in the install_requires of the setup.py file.
Weirdly enough, the zip file of the associated release does install all dependencies. I have tried with different virtual environments and on different computers, but it never installs the dependencies. Any help appreciated.
pip install pythutils downloads a wheel if one is available, and one is available for your package.
When generating a wheel, setuptools runs python setup.py locally but doesn't include setup.py in the wheel. Download your wheel file and unzip it (it's just a zip archive): inside are your main package directory pythutils and a metadata directory pythutils-1.1.1.dist-info. In the metadata directory there is a file METADATA that usually lists static dependencies, but your file doesn't list any, because when you generated the wheel all your dependencies had already been installed, so all your dynamic code paths were skipped.
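One way to check this without installing anything (the exact wheel file name will vary):

pip download --no-deps pythutils
python -m zipfile -e pythutils-1.1.1-py2.py3-none-any.whl wheel_contents
# then open wheel_contents/pythutils-1.1.1.dist-info/METADATA
# and look for Requires-Dist lines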
The archive that you downloaded from the GitHub release installs dependencies because it's not a wheel, so pip runs python setup.py install and your dynamic dependencies work.
What can you do? My advice is to avoid dynamic dependencies. Declare static dependencies and let pip decide what versions to install:
install_requires=[
    'numpy==1.16.5; python_version>="2" and python_version<"3"',
    'numpy; python_version>="3"',
],
Another approach would be to create version-specific wheel files, one for Python 2 and another for Python 3, with fixed dependencies.
Yet another approach is to not publish wheels at all and only publish an sdist (source distribution); then pip is forced to run python setup.py install on the target machine. That is not the best approach, and it will certainly be problematic for packages with C extensions (the user must have a compiler and developer tools to install from source).
Your setup.py does a series of checks like:
try:
    import numpy
except ImportError:
    if sys.version_info[0] == 2:
        install_requires.append('numpy==1.16.5')
    if sys.version_info[0] == 3:
        install_requires.append("numpy")
Presumably the system where you ran it had all the required modules already installed, so you ended up with an empty install_requires list. But this is the wrong way to do it anyway; you should simply make a static list (or two static lists, one each for Python 2 and Python 3, if you really want to support both in the same package).

How to upgrade/uninstall distutils packages (PyYAML) in windows OS

I am working on Windows 10, with Python 2.7.15.
While trying to install a package, I received the following error:
Cannot uninstall 'PyYAML'. It is a distutils installed project and thus we cannot accurately determine which files belong to it which would lead to only a partial uninstall.
I tried to uninstall with pip (18.1) and received the same error:
pip uninstall PyYAML
How can I uninstall/upgrade a distutils package on Windows 10?
Base distutils functionality doesn't record any information about which files belong to a package, so a package installed that way cannot be reliably uninstalled. That's what the message is telling you. Moreover, it doesn't have dependency metadata, so it can't be reliably "upgraded" either. All those features are additions by setuptools (and some by wheel and pip itself).
This can happen if you installed the package directly from source with setup.py install when setup.py is distutils-based rather than setuptools-based, or if you installed it manually by copying/extracting files from some types of packages.
Unless the way you installed it provides its own uninstaller, you'll have to figure out manually which files belong to the package and delete them from the Python directories.
Usually, these are:
site-packages\<package_name>* directories and/or
site-packages\<package_name>*.py for standalone modules
optionally, a site-packages\<package_name>.pth file
Generally, look for anything that bears the package's name on it; a sketch for PyYAML follows below.
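For PyYAML specifically, the manual cleanup could look like the following minimal sketch, assuming a default CPython 2.7 layout; the paths and file names are illustrative, so verify what actually exists on your system before deleting anything:

import glob
import os
import shutil

site = r'C:\Python27\Lib\site-packages'

# the package directory itself
shutil.rmtree(os.path.join(site, 'yaml'), ignore_errors=True)

# the optional compiled extension, if present
for path in glob.glob(os.path.join(site, '_yaml*.pyd')):
    os.remove(path)

# the metadata, which may be a file or a directory depending on the installer
for path in glob.glob(os.path.join(site, 'PyYAML-*.egg-info')):
    (shutil.rmtree if os.path.isdir(path) else os.remove)(path)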
If you can build the same package from source, you can use the build process to get a hint: build a binary package that you can look into (e.g. with setup.py bdist_wheel; a .whl is a ZIP archive) and see what files it contains.
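For example (the wheel's exact file name depends on the version and platform; <version-and-tags> is a placeholder):

python setup.py bdist_wheel
python -m zipfile -l dist\PyYAML-<version-and-tags>.whl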

Using setuptools to install files to arbitrary locations

Is there a way to install files to arbitrary locations with setuptools? I've used Data Files with setuptools before, but those are typically installed inside the package directory. I need to install a plugin file that will be located in the install directory of another application.
It seems that setuptools has purposely made it difficult to install files outside of the package directory.
I instead included the plugin files as package data and used the Entry Points feature of setuptools to expose the install/uninstall functions for the plugin files I wanted to distribute.
setup(
    ...
    entry_points={
        'console_scripts': [
            'mypackage_install_plugins = mypackage:install_plugins',
            'mypackage_uninstall_plugins = mypackage:uninstall_plugins',
        ],
    },
)
I just added an extra step to the installation instructions: run the following command after installing the Python package:
$> mypackage_install_plugins
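The functions behind those entry points can be small. A minimal sketch, assuming the plugin files ship as package data under mypackage/plugins/; PLUGIN_FILES and TARGET_DIR are hypothetical names:

# mypackage/__init__.py
import os
import pkgutil

PLUGIN_FILES = ['myplugin.dll']      # shipped as package data in mypackage/plugins/
TARGET_DIR = r'C:\OtherApp\plugins'  # install directory of the other application

def install_plugins():
    """Copy the bundled plugin files into the other application's directory."""
    for name in PLUGIN_FILES:
        data = pkgutil.get_data(__name__, 'plugins/' + name)
        with open(os.path.join(TARGET_DIR, name), 'wb') as f:
            f.write(data)

def uninstall_plugins():
    """Remove the files that install_plugins() copied."""
    for name in PLUGIN_FILES:
        path = os.path.join(TARGET_DIR, name)
        if os.path.exists(path):
            os.remove(path)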
The data_files attribute will allow you to specify full paths.
You could also do some shutil.copy magic in your setup.py, except don't.
Check out this answer:
Execute a Python script post install using distutils / setuptools
which shows how to add an arbitrary install script (Python, shell, whatever) that runs at the end of the install. It'll run whether you use setup.py install directly or a package manager like pip install. With this, you can add any files you want, anywhere you want.
Unfortunately, I feel Brendan's pain: setuptools, not being a full package manager itself, does not handle the uninstall. Therefore, there's no way to have an uninstall hook to reverse what you did in the post-install script.
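For reference, the linked approach boils down to subclassing the install command. A minimal sketch, assuming setuptools; the copied file and target directory are hypothetical:

import shutil
from setuptools import setup
from setuptools.command.install import install

class PostInstall(install):
    """Run the standard install, then copy extra files anywhere we like."""
    def run(self):
        install.run(self)
        # hypothetical post-install step: place a plugin next to another application
        shutil.copy('plugins/myplugin.dll', r'C:\OtherApp\plugins')

setup(
    name='mypackage',
    version='0.1',
    packages=['mypackage'],
    cmdclass={'install': PostInstall},
)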

Installing a .tar.bz2 in Windows

I am a newbie to installing Python extensions, working on Windows 7 and running Python 2.6. I need to install the Levenshtein library from
http://code.google.com/p/pylevenshtein/downloads/detail?name=python-Levenshtein-0.10.1.tar.bz2&can=2&q=
When I unzip the downloaded file, it gives me the following list of files:
COPYING
gendoc.sh
Levenshtein.c
Levenshtein.h
MANIFEST
NEWS
PKG-INFO
README
setup.cfg
setup.py
StringMatcher.py
How do I install the Levenshtein library so that I can import and use it in my Python code?
Assuming you already have Python installed and on your PATH, you can do this:
python setup.py install
However, it seems to have a compiled extension, so you will probably also need a complete Windows development environment to build it (it is a source distribution). If you don't have one, it may not work; your best bet would be to find it as an MSI package, if you can.
Here is quite a large section of the documentation, easily found by doing some research:
http://docs.python.org/install/index.html
It appears that you will want to run:
python setup.py install --prefix="\Temp\Python"
to install modules to the \Temp\Python directory on the current drive.
Some more info:
If you don’t choose an installation directory—i.e., if you just run
setup.py install—then the install command installs to the standard
location for third-party Python modules.
The default installation directory on Windows was C:\Program Files\Python under Python 1.6a1, 1.5.2, and earlier.

Any methods to deploy Python packages with 'pip | easy_install' + '*.pyc only' + 'flat namespace packages' + virtualenv?

Goals:
Make use of modern Python packaging toolsets to deploy/install proprietary packages into some virtualenv.
The installed packages should include only compiled *.pyc (or *.pyo) files, without source files.
There are a couple of packages, and a vendor name (here we choose dgmx for our studio) is used as the package names. Therefore, the installed packages would be something like dgmx/alucard, dgmx/banshee, dgmx/carmilla, ...
The file hierarchy of the installed packages should be like the one produced by python setup.py install --single-version-externally-managed or pip install. Refer to: How come I can't get the exactly result to *pip install* by manually *python setup.py install*?
Question in short:
I would like to deploy proprietary namespaced packages into a virtualenv as compiled *.pyc (or *.pyo) files only, with a file/directory hierarchy that simply reflects the namespace, without polluting sys.path with lots of ooxx.egg paths.
Something I have tried:
python setup.py bdist_egg --exclude-source-files, then easy_install ooxx.egg.
    pollutes sys.path with one egg path per namespace package.
python setup.py install --single-version-externally-managed.
    not *.pyc only.
    install_requires gets ignored!
    need to manually put an ooxx.egg-info/installed-files.txt in place to make uninstall work correctly.
pip install . in the location of setup.py.
    not *.pyc only.
pysetup install . in the location of setup.py.
    not *.pyc only.
Update:
My current idea is to follow method 2 (a scripted sketch of steps 4 and 5 follows this list):
python setup.py egg_info --egg-base .  # get requires.txt
python setup.py install --single-version-externally-managed --record installed-files.txt  # get installed-files.txt
manually install the other dependencies listed in requires.txt
manually delete the installed source files (*.py) listed in installed-files.txt
remove the *.py entries from installed-files.txt and ship the pruned file as the deployed ooxx.egg-info/installed-files.txt
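Steps 4 and 5 can be scripted. A minimal sketch, assuming the installed-files.txt from step 2 is in the current directory:

# prune_sources.py -- deletes the recorded *.py sources and rewrites
# installed-files.txt so it only lists what remains on disk
import os

with open('installed-files.txt') as f:
    entries = [line.strip() for line in f if line.strip()]

kept = []
for path in entries:
    if path.endswith('.py'):
        if os.path.exists(path):
            os.remove(path)   # step 4: delete the installed source file
    else:
        kept.append(path)     # keep *.pyc and any other recorded files

with open('installed-files.txt', 'w') as f:
    # step 5: this pruned record is what goes into the deployed ooxx.egg-info/
    f.write('\n'.join(kept) + '\n')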
References:
Migrating to pip+virtualenv from setuptools
installing only .pyc (python compiled) with setuptools
Can I deploy Python .pyc files only to Google App Engine?
How come I can't get the exactly result to *pip install* by manually *python setup.py install*?
This trick may help:
Compile your source into .pyc files and zip them up in a single .zip file.
Write a simple bootstrap module; all it does is add that .zip to sys.path.
So when you import this module, the .zip is on the path. All you then have to do, in a custom step in setup.py, is copy the zip file to the proper place.
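A minimal sketch of such a bootstrap module (the module and archive names are illustrative); Python's zipimport machinery then imports the *.pyc files straight from the archive:

# dgmx_bootstrap.py
import os
import sys

# The zip of compiled *.pyc files is assumed to sit next to this module;
# putting it on sys.path makes its contents importable.
_ZIP = os.path.join(os.path.dirname(os.path.abspath(__file__)), 'dgmx_compiled.zip')
if _ZIP not in sys.path:
    sys.path.insert(0, _ZIP)

Importing this module once (for example from a top-level __init__.py) makes everything inside the archive importable.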
