Installing a Python module to a Python-version-neutral location

I've built a debian package containing a python module.
The problem is that
dpkg-deb -c python-mymodule_1.0_i386.deb
shows that all the files will be installed under
/usr/lib/python2.6/dist-packages/mymodule*
This means that an end user who installs my deb package needs to be running exactly the same version of Python as I am, yet I know that my module works fine on later versions too.
In my Makefile I have the following target:
install:
	python setup.py install --root $(DESTDIR) $(COMPILE) --install-layout=deb
Where setup.py is
from distutils.core import setup

setup(name='mymodule',
      version='1.0',
      description='does my stuff',
      author='Me',
      author_email='myemail@localhost',
      url='http://myurl/',
      packages=['mymodule'],
      )
Is there some way I can edit either the setup.py file or the Makefile so that the resulting module is installed in a Python-version-neutral directory instead of /usr/lib/python2.6?
Thanks,
Alex

I think I have found the answer now (thanks to the pointers from Tshepang):
In debian/rules you need to invoke dh_pysupport.
This grabs all the files installed by setup.py on the build machine in
$(DESTDIR)/usr/lib/pythonX.Y/dist-packages/mymodule*
and puts them into a non python-version specific location in the .deb file, namely
/usr/share/pyshared/mymodule*
Finally, it adds a call to update-python-modules to the postinst script, which makes sure that the module is made available to every version of Python present on the target machine.
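For reference, a minimal debian/rules driving dh_pysupport through the debhelper sequencer might look like the sketch below. This assumes debhelper 7+ together with the python-support package (which provides dh_pysupport and the matching dh addon); note that python-support has long been deprecated in favour of dh-python (dh_python2/dh_python3) on modern Debian and Ubuntu.

#!/usr/bin/make -f
# Minimal sketch; the python-support addon inserts dh_pysupport
# into the binary sequence automatically.
%:
	dh $@ --with python-support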

Related

Question about Packaging in Python with pip

I saw this nice explanation video (link) about packaging using pip, and I have two questions:
The first one is:
I wrote some code that I want to share with my colleagues, but I do not aim to share it via PyPI. I want to share it internally, so everyone can install it within their own environment.
I actually don't need to create a wheel file with python setup.py bdist_wheel, right? I create the setup.py file and I can install it with the command pip install -e . (for editable use), and everyone else can do so as well after cloning the repository. Is this right?
My second question is more technical:
I create the setup.py file:
from setuptools import setup

setup(
    name='helloandbyemate',
    version='0.0.1',
    description="Say hello in slang",
    py_modules=['hellomate'],
    package_dir={"": "src"}
)
To test it, I wrote a file hellomate.py which contains a function that prints hello, mate!. I put this file in src/. In the setup.py file I put only this module in the py_modules list. In src/ there is another module called byemate.py. When I install the package, it installs the module byemate.py as well, although I only put hellomate in the py_modules list. Does anyone have an explanation for this behaviour?
I actually don't need to create a wheel file ... everyone else can do so as well after cloning the repository. Is this right?
This is correct. However, the installation from source is slower, so you may want to publish wheels to an index anyway if you would like faster installs.
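A minimal sketch of the workflow a colleague would follow (the repository URL is hypothetical):

git clone https://git.example.com/yourteam/helloandbyemate.git
cd helloandbyemate
pip install -e .    # editable install, picks up local changes immediately
# or, for a plain (non-editable) install:
pip install .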
When I install the package, it installs the module byemate.py as well, although I only put hellomate in the py_modules list. Does anyone have an explanation for this behaviour?
Yes, this is an artifact of the "editable" installation mode. It works by putting the src directory onto sys.path, via a line in the path configuration file .../lib/pythonX.Y/site-packages/easy-install.pth. This means that the entire source directory is exposed, and everything in there is available to import, whether it is packaged up into a release file or not.
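For illustration, after pip install -e . from a checkout at /home/user/helloandbyemate (a hypothetical path), the relevant line of easy-install.pth would look something like this:

# .../lib/pythonX.Y/site-packages/easy-install.pth (hypothetical path)
/home/user/helloandbyemate/src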
The benefit is that the source code is "editable" without reinstallation (adding/removing/modifying files in src will be reflected in the package immediately).
The drawback is that an editable installation is not exactly the same as a "real" installation, where only the files specified in setup.py are copied into site-packages directly.
If you don't want other files such as byemate.py to be available to import, use a regular install pip install . without the -e option. However, local changes to hellomate.py won't be reflected until the installation step is repeated.
Strict editable installs
It is possible to get a mode of installation where byemate.py is not exposed at all, but live modifications to hellomate.py are still possible. This is the "strict" editable mode of setuptools. However, it is not possible using setup.py; you have to use a modern build-system declaration in pyproject.toml:
[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"

[project]
name = "helloandbyemate"
version = "0.0.1"
description = "Say hello in slang"

[tool.setuptools]
py-modules = ["hellomate"]
include-package-data = false

[tool.setuptools.package-dir]
"" = "src"
Now you can perform a strict install with:
pip install -e . --config-settings editable_mode=strict
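A quick way to see the difference from the command line (a sketch; the second import failing is the expected outcome under strict mode):

python -c "import hellomate"   # works with any install mode
python -c "import byemate"     # ImportError under a strict editable install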

Python is not installing dependencies listed in install_requires of setuptools

I have written a Python module that depends on openpyxl. I want openpyxl to be installed as a dependency automatically using setuptools.
I read that the proper way to do this is to include the following in the setup.py script:
setup(name='methpipe',
      version=find_version("lala", "__init__.py"),
      description='Utilities',
      author='Jonathan T',
      author_email='jt@lala.com',
      url='https://git.com...',
      packages=find_packages(),
      install_requires=[
          'openpxyl = 2.3.3',
      ],
      scripts=["bin/submit_run_full.py"],
      cmdclass=dict(install=my_install)
      )
So I packaged up my module with python setup.py sdist, took the *.tar.gz file, unzipped it, and then ran python setup.py install, and openpyxl is NOT installing!!!
What am I doing wrong here?
Try providing your dependency in both install_requires and setup_requires.
The following is from the setuptools documentation at https://pythonhosted.org/setuptools/setuptools.html
setup_requires
A string or list of strings specifying what other distributions need to be present in order for the setup script to run. setuptools will attempt to obtain these (even going so far as to download them using EasyInstall) before processing the rest of the setup script or commands. This argument is needed if you are using distutils extensions as part of your build process; for example, extensions that process setup() arguments and turn them into EGG-INFO metadata files.
(Note: projects listed in setup_requires will NOT be automatically installed on the system where the setup script is being run. They are simply downloaded to the ./.eggs directory if they're not locally available already. If you want them to be installed, as well as being available when the setup script is run, you should add them to install_requires and setup_requires.)
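Applied to the setup.py from the question, that suggestion would look something like the sketch below. Note also that the requirement string in the question is not a valid version specifier: a pinned version is written with a double equals sign, and the package name is openpyxl, not openpxyl.

from setuptools import setup, find_packages

setup(
    name='methpipe',
    version='1.0',  # placeholder; the question computes this with find_version()
    packages=find_packages(),
    setup_requires=['openpyxl==2.3.3'],
    install_requires=['openpyxl==2.3.3'],
)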
I noticed that you override 'install' with a 'cmdclass' key. The pattern below also left me with uninstalled dependencies.
from setuptools.command.install import install

class Custom_Install(install):
    def run(self):
        # some custom commands
        install.run(self)
Adding the dependencies to setup_requires didn't work for me, so in the end I just did my own pip install in the custom install command:
import subprocess
import sys

def pip_install(package_name):
    # shell out to pip using the same interpreter that is running setup.py
    subprocess.call(
        [sys.executable, '-m', 'pip', 'install', package_name]
    )
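Wired into the custom install command from above, the helper would be used roughly like this (a sketch; the pinned dependency and the package metadata are illustrative):

from setuptools import setup
from setuptools.command.install import install

class Custom_Install(install):
    def run(self):
        pip_install('openpyxl==2.3.3')  # illustrative pinned dependency
        install.run(self)

setup(
    name='methpipe',
    version='1.0',  # placeholder metadata for the sketch
    cmdclass={'install': Custom_Install},
)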
Building with sdist creates a source distribution, so I think it's normal that dependencies are not packaged with your sources.

Shipping *.so and binaries while building RPM package

I have created a Python application in which I would like to ship .so and some binary files in the final RPM package. After much reading, I found a way to add binaries, images, and other data files in setup.py. Now, when I build an RPM with the python setup.py bdist_rpm command, it complains about an architecture dependency:
Arch dependent binaries in noarch package
error: command 'rpmbuild' failed with exit status 1
After googling I found that we can add:
#%define _binaries_in_noarch_packages_terminate_build 0
or remove the line BuildArch: noarch in the packagename.spec file to overcome the rpmbuild failure. However, every time I add or remove a line from build/bdist.linux-i686/rpm/SPECS/packagename.spec, the command python setup.py bdist_rpm overwrites the .spec file.
Is there a way to avoid Arch dependent binaries and ship *.so and other binary files in rpm?
The behavior of bdist_rpm is defined by a bunch of settings in:
/usr/lib/rpm/macros
/etc/rpm/macros
$HOME/.rpmmacros
I'm willing to bet that only /usr/lib/rpm/macros exists on your system. This is normal.
So, in order to prevent the "Arch dependent binaries in noarch package" error you would create /etc/rpm/macros or ~/.rpmmacros and add the following:
%_unpackaged_files_terminate_build 0
%_binaries_in_noarch_packages_terminate_build 0
Do not modify /usr/lib/rpm/macros because that file will be overwritten by the system whenever the rpm-build package is upgraded, downgraded, or re-installed.
If you want to override the behavior for everyone on the system, put the settings in /etc/rpm/macros. If you want to override the behavior for a particular user, add the settings to $HOME/.rpmmacros.
.rpmmacros trumps /etc/rpm/macros which trumps /usr/lib/rpm/macros.
Note: it's useful to examine /usr/lib/rpm/macros to see what settings are available and for syntax examples.
As a side note, the %_unpackaged_files_terminate_build 0 setting prevents the "Installed (but unpackaged) file(s) found" error.
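For example, a one-off way to add both settings for the current user (a shell sketch using a heredoc):

cat >> ~/.rpmmacros <<'EOF'
%_unpackaged_files_terminate_build 0
%_binaries_in_noarch_packages_terminate_build 0
EOF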
.so files are always architecture-dependent as far as I know.
In your case, to avoid having to edit the spec file all the time, you can add --force-arch=<your_arch> to your setup.py bdist_rpm call,
e.g.
python setup.py bdist_rpm --force-arch=x86_64
If you encounter this issue in a .spec file when you try to build a new .rpm package, change the BuildArch from noarch to x86_64 (or whatever you have on the build system):
[root@devel-mga7][~/build/yate-ota]# grep Arch yate-ota.spec
BuildArch: x86_64
[root@devel-mga7][~/build/yate-ota]#

Detect python package installation path from within setup.py

After installation, I would like to make soft-links to some of the configuration & data files created by installation.
How can I determine the location of a new package's files installed from within the package's setup.py?
I initially hard-coded the path "/usr/local/lib/python2.7/dist-packages", but that broke when I tried using a virtual environment. (Created by virtualenv.)
I tried distutils.sysconfig.get_python_lib(), and that works inside the virtualenv. When installed on the real system, however, it returns "/usr/lib/python2.7/dist-packages" (Note the "local" directory isn't present.)
I've also tried site.getsitepackages():
Running a Python shell from the base environment:

>>> import site
>>> site.getusersitepackages()
'/home/sarah/.local/lib/python2.7/site-packages'
>>> site.getsitepackages()
['/usr/local/lib/python2.7/dist-packages', '/usr/lib/python2.7/dist-packages']
Running a Python shell from a virtual environment "testenv":

>>> import site
>>> site.getsitepackages()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
AttributeError: 'module' object has no attribute 'getsitepackages'
I'm running "Python 2.7.3 (default, Aug 1 2012, 05:14:39)" with "[GCC 4.6.3] on linux2" on Ubuntu. I can probably cobble something together with try-except blocks, but it seems like there should be some variable set / returned by distutils / setuptools. (I'm agnostic about which branch to use, as long as it works.)
Thanks.
I haven't found the "correct" way of doing this, but I have found a couple of tricks that seem almost correct. One method only works on install; the other only works if the package is already installed.
For install, I use the object returned by setuptools.setup():
from setuptools import setup
s = setup([...])
installation_path = s.command_obj['install'].install_lib
(This only works during install since you need a valid Distribution object for those attributes to exist. AFAIK, the only way to get such an object is to run setup().)
On uninstall, I use the __file__ attribute of the package, as suggested by @Zhenya above. The only catch is that when I run ./setup.py uninstall to get rid of the package, I usually have the directories ./package/, ./build/, ./dist/, and ./package.egg-info/ lying around. (The "uninstall" option is caught by my code without calling setup(); it runs a manually-created script to delete the package files.) These can redirect the Python interpreter to some place other than the globally-accessible repository I'm trying to get rid of. Here's my hack to handle that:
import imp
import sys
from subprocess import Popen
from os import getcwd

# remove local build artifacts so they can't shadow the installed package
Popen('rm -r build dist *.egg-info', shell=True).wait()
oldpath = list(sys.path)  # copy, so the original path can be restored later
rundir = getcwd()
sys.path.remove(rundir)  # keep imp from finding the local ./package directory
mod = imp.find_module(PACKAGE)  # PACKAGE is the package's name as a string
p = imp.load_module(PACKAGE, mod[0], mod[1], mod[2])
sys.path = oldpath
installation_path = p.__file__
(This doesn't work during install since - I think - Python only inventories modules when it starts, so find_module() won't find the just-installed package unless you exit python and come back in.)
I've tested both install and uninstall on a bare environment and a virtual environment (from virtualenv 1.9.1). I'm running Ubuntu 12.04 LTS, Python 2.7.3, setuptools 0.6c11 (in the bare environment) and setuptools 0.7.4 (in virtualenv).
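As an aside, the standard library's sysconfig module (available since Python 2.7) can also report the installation directories, which may be enough for simple cases; a sketch:

import sysconfig

# 'purelib' is the site-packages directory for pure-Python packages
print(sysconfig.get_paths()['purelib'])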
This will probably not answer your question, but if you need to access the source code of a package you have installed (or any other file within the package), the best way to do it is to install the package in develop mode: download the sources, put them wherever you want, and run python setup.py develop in the base directory of the package sources. This way you know where the package is found.

How to build a Debian package with CPack that executes setup.py?

Until now, my project had only .cpp files that were compiled into different binaries and I managed to configure CPack to build a proper debian package without any problems.
Recently I wrote a couple of python applications and added them to the project, as well as some custom modules that I would also like to incorporate to the package.
After writing a setup.py script, I'm wondering how to add these files to the CPack configuration in a way that setup.py gets executed automatically when the user installs the package on the system with dpkg -i package.deb.
I'm struggling to find relevant information on how to configure CPack to install custom python applications/modules. Has anyone tried this?
I figured out a way to do it, but it's not very simple. I'll do my best to explain the procedure, so please be patient.
The idea of this approach is to use postinst and prerm to install and remove the python application from the system.
In the CMakeLists.txt that defines the project, you need to state that CPack is going to be used to generate a .deb package. There are some variables that need to be filled with info related to the package itself, but one named CPACK_DEBIAN_PACKAGE_CONTROL_EXTRA is very important because it's used to specify the location of postinst and prerm, which are standard scripts of the Debian packaging system that are automatically executed by dpkg when the package is installed/removed.
At some point of your main CMakeLists.txt you should have something like this:
add_subdirectory(name_of_python_app)
set(CPACK_COMPONENTS_ALL_IN_ONE_PACKAGE 1)
set(CPACK_PACKAGE_NAME "fake-package")
set(CPACK_PACKAGE_VENDOR "ACME")
set(CPACK_PACKAGE_DESCRIPTION_SUMMARY "fake-package - brought to you by ACME")
set(CPACK_PACKAGE_VERSION "1.0.2")
set(CPACK_PACKAGE_VERSION_MAJOR "1")
set(CPACK_PACKAGE_VERSION_MINOR "0")
set(CPACK_PACKAGE_VERSION_PATCH "2")
set(CPACK_SYSTEM_NAME "i386")
set(CPACK_GENERATOR "DEB")
set(CPACK_DEBIAN_PACKAGE_MAINTAINER "ACME Technology")
set(CPACK_DEBIAN_PACKAGE_DEPENDS "libc6 (>= 2.3.1-6), libgcc1 (>= 1:3.4.2-12), python2.6, libboost-program-options1.40.0 (>= 1.40.0)")
set(CPACK_DEBIAN_PACKAGE_CONTROL_EXTRA "${CMAKE_SOURCE_DIR}/name_of_python_app/postinst;${CMAKE_SOURCE_DIR}/name_of_python_app/prerm;")
set(CPACK_SET_DESTDIR "ON")
include(CPack)
Some of these variables are optional, but I'm filling them with info for educational purposes.
Now, let's take a look at the scripts:
postinst:
#!/bin/sh
# postinst script for fake_python_app
set -e
cd /usr/share/pyshared/fake_package
sudo python setup.py install
prerm:
#!/bin/sh
# prerm script
#
# Removes all files installed by: ./setup.py install
sudo rm -rf /usr/share/pyshared/fake_package
sudo rm /usr/local/bin/fake_python_app
If you noticed, the postinst script enters /usr/share/pyshared/fake_package and executes the setup.py that is lying there to install the app on the system. Where does this file come from, and how does it end up there? It is created by you and copied to that location when your package is installed on the system. This action is configured in name_of_python_app/CMakeLists.txt:
install(FILES setup.py
        DESTINATION "/usr/share/pyshared/fake_package"
)
install(FILES __init__.py
        DESTINATION "/usr/share/pyshared/fake_package/fake_package"
)
install(FILES fake_python_app
        DESTINATION "/usr/share/pyshared/fake_package/fake_package"
)
install(FILES fake_module_1.py
        DESTINATION "/usr/share/pyshared/fake_package/fake_package"
)
install(FILES fake_module_2.py
        DESTINATION "/usr/share/pyshared/fake_package/fake_package"
)
As you can probably tell, besides the python application I want to install there's also 2 custom python modules that I wrote that also need to be installed. Below I describe the contents of the most important files:
setup.py:
#!/usr/bin/env python
from distutils.core import setup

setup(name='fake_package',
      version='1.0.5',
      description='Python modules used by fake-package',
      py_modules=['fake_package.fake_module_1', 'fake_package.fake_module_2'],
      scripts=['fake_package/fake_python_app']
      )
__init__.py is an empty file.
fake_python_app: your Python application, which will be installed in /usr/local/bin.
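For completeness, fake_python_app could be as small as the sketch below (the main() function in fake_module_1 is an assumption for illustration):

#!/usr/bin/env python
from fake_package import fake_module_1

if __name__ == '__main__':
    fake_module_1.main()  # assumes fake_module_1 defines a main() entry point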
And that's pretty much it!
A setup.py file is the equivalent of the configure && make && make install dance for a standard Unix source distribution, and as such it is inappropriate to run as part of a distribution's package install process. See this discussion of the different ways to include Python modules in a .deb package.
