How to build a Debian package with CPack that executes setup.py? - python

Until now, my project had only .cpp files that were compiled into different binaries, and I managed to configure CPack to build a proper Debian package without any problems.
Recently I wrote a couple of Python applications and added them to the project, as well as some custom modules that I would also like to incorporate into the package.
After writing a setup.py script, I'm wondering how to add these files to the CPack configuration so that setup.py gets executed automatically when the user installs the package on the system with dpkg -i package.deb.
I'm struggling to find relevant information on how to configure CPack to install custom Python applications/modules. Has anyone tried this?

I figured out a way to do it, but it's not very simple, so I'll do my best to explain the procedure; please be patient.
The idea of this approach is to use postinst and prerm scripts to install and remove the Python application from the system.
In the CMakeLists.txt that defines the project, you need to state that CPack is going to be used to generate a .deb package. There are several variables that need to be filled with information about the package itself, but one named CPACK_DEBIAN_PACKAGE_CONTROL_EXTRA is especially important: it specifies the location of postinst and prerm, standard scripts of the Debian packaging system that dpkg executes automatically when the package is installed or removed.
At some point of your main CMakeLists.txt you should have something like this:
add_subdirectory(name_of_python_app)
set(CPACK_COMPONENTS_ALL_IN_ONE_PACKAGE 1)
set(CPACK_PACKAGE_NAME "fake-package")
set(CPACK_PACKAGE_VENDOR "ACME")
set(CPACK_PACKAGE_DESCRIPTION_SUMMARY "fake-package - brought to you by ACME")
set(CPACK_PACKAGE_VERSION "1.0.2")
set(CPACK_PACKAGE_VERSION_MAJOR "1")
set(CPACK_PACKAGE_VERSION_MINOR "0")
set(CPACK_PACKAGE_VERSION_PATCH "2")
set(CPACK_SYSTEM_NAME "i386")
set(CPACK_GENERATOR "DEB")
set(CPACK_DEBIAN_PACKAGE_MAINTAINER "ACME Technology")
set(CPACK_DEBIAN_PACKAGE_DEPENDS "libc6 (>= 2.3.1-6), libgcc1 (>= 1:3.4.2-12), python2.6, libboost-program-options1.40.0 (>= 1.40.0)")
set(CPACK_DEBIAN_PACKAGE_CONTROL_EXTRA "${CMAKE_SOURCE_DIR}/name_of_python_app/postinst;${CMAKE_SOURCE_DIR}/name_of_python_app/prerm;")
set(CPACK_SET_DESTDIR "ON")
include(CPack)
Some of these variables are optional, but I'm filling them with info for educational purposes.
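With all of this in place, the .deb itself is produced by the usual CPack workflow, e.g. from a build directory (assuming the default Makefile generator):
mkdir build && cd build
cmake ..
make
make package   # or invoke CPack directly: cpack -G DEB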
Now, let's take a look at the scripts:
postinst:
#!/bin/sh
# postinst script for fake_python_app
set -e
cd /usr/share/pyshared/fake_package
# maintainer scripts already run as root, so sudo is not needed
python setup.py install
prerm:
#!/bin/sh
# prerm script
#
# Removes all files installed by: ./setup.py install
# maintainer scripts run as root, so sudo is not needed
rm -rf /usr/share/pyshared/fake_package
rm /usr/local/bin/fake_python_app
As you may have noticed, the postinst script changes into /usr/share/pyshared/fake_package and executes the setup.py lying there to install the app on the system. Where does this file come from, and how does it end up there? This file is created by you and is copied to that location when your package is installed on the system. This action is configured in name_of_python_app/CMakeLists.txt:
install(FILES setup.py
        DESTINATION "/usr/share/pyshared/fake_package"
)
install(FILES __init__.py
        DESTINATION "/usr/share/pyshared/fake_package/fake_package"
)
install(FILES fake_python_app
        DESTINATION "/usr/share/pyshared/fake_package/fake_package"
)
install(FILES fake_module_1.py
        DESTINATION "/usr/share/pyshared/fake_package/fake_package"
)
install(FILES fake_module_2.py
        DESTINATION "/usr/share/pyshared/fake_package/fake_package"
)
As you can probably tell, besides the Python application I want to install, there are also two custom Python modules that I wrote that need to be installed as well. Below I describe the contents of the most important files:
setup.py:
#!/usr/bin/env python
from distutils.core import setup

setup(name='fake_package',
      version='1.0.5',
      description='Python modules used by fake-package',
      py_modules=['fake_package.fake_module_1', 'fake_package.fake_module_2'],
      scripts=['fake_package/fake_python_app']
      )
__init__.py: an empty file.
fake_python_app: your Python application that will be installed in /usr/local/bin.
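For completeness, here is a sketch of what fake_python_app itself might look like; it only needs a shebang so that distutils installs it as an executable script (the module names are the placeholder ones used throughout this answer):
#!/usr/bin/env python
# fake_python_app - ends up in /usr/local/bin via the scripts= entry of setup.py
from fake_package import fake_module_1
from fake_package import fake_module_2

def main():
    # call into the custom modules here
    print("fake_python_app is installed and running")

if __name__ == '__main__':
    main()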
And that's pretty much it!

A setup.py file is the equivalent of the configure && make && make install dance for a standard Unix source distribution and, as such, is inappropriate to run as part of a distribution's package install process. See this discussion of the different ways to include Python modules in a .deb package.
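For comparison, the Debian-native route builds the Python module at package build time and ships the resulting files, instead of running setup.py on the target machine. A rough sketch of a debhelper-style debian/rules (the pybuild build system is the current convention; treat the exact flags as assumptions for your tooling generation, and note the recipe line must be indented with a tab):
#!/usr/bin/make -f
%:
	dh $@ --with python3 --buildsystem=pybuild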

Related

Question about Packaging in Python with pip

I saw this nice explanation video (link) about packaging using pip, and I have two questions:
The first one is:
I wrote some code that I want to share with my colleagues, but I do not aim to share it via PyPI. I want to share it internally, so everyone can install it within his/her environment.
I actually don't need to create a wheel file with python setup.py bdist_wheel, right? I create the setup.py file and can install the package with pip install -e . (for editable use), and everyone else can do so as well after cloning the repository. Is this right?
My second question is more technical:
I create the setup.py file:
from setuptools import setup

setup(
    name='helloandbyemate',
    version='0.0.1',
    description="Say hello in slang",
    py_modules=['hellomate'],
    package_dir={"": "src"}
)
To test it, I wrote a file hellomate.py that contains a function printing "hello, mate!". I put this file in src/. In the setup.py file I put only this module in the py_modules list. There is another module in src/ called byemate.py. When I install the package, it installs byemate.py as well, although I only put hellomate in py_modules. Can anyone explain this behaviour?
I actually don't need to create a wheel file ... everyone else can do so as well after cloning the repository. Is this right?
This is correct. However, the installation from source is slower, so you may want to publish wheels to an index anyway if you would like faster installs.
When I install the package, it installs byemate.py as well, although I only put hellomate in py_modules. Can anyone explain this behaviour?
Yes, this is an artifact of the "editable" installation mode. It works by putting the src directory onto sys.path, via a line in the path configuration file .../lib/pythonX.Y/site-packages/easy-install.pth. This means the entire source directory is exposed, and everything in there is available to import, whether it is packaged up into a release file or not.
The benefit is that the source code is "editable" without reinstallation (adding/removing/modifying files in src will be reflected in the package immediately).
The drawback is that an editable installation is not exactly the same as a "real" installation, where only the files specified in setup.py are copied into site-packages directly.
If you don't want other files such as byemate.py to be available to import, use a regular install: pip install . without the -e option. However, local changes to hellomate.py won't be reflected until the installation step is repeated.
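To make the difference concrete, here is roughly what you would observe with each mode (a sketch, using the layout from the question; run from the project root):
$ pip install -e .
$ python -c "import byemate; print('importable')"   # works: all of src/ is on sys.path
$ pip uninstall -y helloandbyemate
$ pip install .
$ python -c "import byemate"                        # ModuleNotFoundError: only hellomate was copied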
Strict editable installs
It is possible to get a mode of installation where byemate.py is not exposed at all but live modifications to hellomate.py are still possible. This is the "strict" editable mode of setuptools. However, it is not possible using setup.py; you have to use a modern build-system declaration in pyproject.toml:
[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"

[project]
name = "helloandbyemate"
version = "0.0.1"
description = "Say hello in slang"

[tool.setuptools]
py-modules = ["hellomate"]
include-package-data = false

[tool.setuptools.package-dir]
"" = "src"
Now you can perform a strict install with:
pip install -e . --config-settings editable_mode=strict
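If strict mode works as documented, the stray module is no longer exposed while live edits to hellomate.py still take effect; a quick check (assuming the same layout as above):
$ python -c "import hellomate"   # works, and reflects edits immediately
$ python -c "import byemate"     # ModuleNotFoundError under a strict editable install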

Python setuptools installs console scripts even if optional dependency are not installed

I am currently working on a package and am confused by setuptools. This package contains many dependencies, and with these dependencies, multiple scripts can be executed via the CLI.
E.g.:
> main_pkg
> main_pkg_which_needs_dep1
> main_pkg_which_needs_dep2
> ...
It is not necessary to have all the scripts available on a system, only the relevant ones. So I thought I could simply modify my setup.py as follows:
...
entry_points=dict(console_scripts=[
    'main_pkg = main_pkg.main_pkg:main',
    'main_pkg_which_needs_dep1 = main_pkg.main_pkg:main_dep1 [dep1]',
    ...
]),
...
extras_require={
    "dep1": ["psycopg"],
    "dep2": ["apsw"],
    "dep3": ["numpy"],
    ...
},
And I assumed that if someone executes pip install main_pkg, only main_pkg would be available on the CLI. (Accordingly, after executing pip install main_pkg[dep1], both main_pkg and main_pkg_which_needs_dep1 would be available.)
However, executing pip install main_pkg also makes all the other console scripts available on the CLI, and they fail when executed (e.g. main_pkg_which_needs_dep1) due to missing dependencies.
Is this behaviour expected with setuptools?
From the documentation, I am reading the following:
It is up to the installer to determine how to handle the situation where PDF was not indicated (e.g. omit the console script, provide a warning when attempting to load the entry point, assume the extras are present and let the implementation fail later).
Also, looking here, the documentation mentions the following:
In this case, the hello-world script is only viable if the pretty-printer extra is indicated, and so a plugin host might exclude that entry point (i.e. not install a console script) if the relevant extra dependencies are not installed.
Am I understanding the documentation correctly that the installer (plugin host? i.e. pip?) has to handle this case, and that this is currently not working?
Or do I have to modify setup.py further to achieve such behaviour?
Thanks in advance!
I ran into this same problem. Based on this thread: https://github.com/pypa/pip/issues/9726, it does not look like you can optionally install console scripts.
However, this comment: https://github.com/pypa/pip/issues/9726#issuecomment-826381705 proposes a solution that may help you. I'll copy-paste it below.
Have myscript with the extra [cli] depend on the myscript-cli package, and have myscript-cli depend on myscript but contain the entry point to the console script in the main package.
If you install myscript[cli], it requires the myscript-cli package, which then gets installed and contains the entry point you wanted. This makes myscript[cli] or myscript-cli install both packages, but permits a plain myscript install that will not require the -cli package and thus will not provide the entry point.
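A minimal sketch of that two-package arrangement (all names here are placeholders taken from the quoted comment; myscript.cli:main is an assumed module path):
# myscript/setup.py - the library; no console script of its own
from setuptools import setup

setup(
    name='myscript',
    version='1.0',
    packages=['myscript'],
    extras_require={
        # 'cli' pulls in the companion package that owns the entry point
        'cli': ['myscript-cli'],
    },
)

# myscript-cli/setup.py - only the entry point; depends on the library
from setuptools import setup

setup(
    name='myscript-cli',
    version='1.0',
    install_requires=['myscript'],
    entry_points={
        'console_scripts': ['myscript = myscript.cli:main'],
    },
)
With this split, pip install myscript installs no scripts, while pip install myscript[cli] (or myscript-cli) installs both packages plus the console script.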

Python: Cannot get imports to work when installed on remote server

(Before responding with a 'see this link' answer, know that I've been searching for hours and have probably read it all. I've done my due diligence; I just can't seem to find the solution.)
That said, I'll start with my general setup and give details after.
Setup: On my desktop, I have a project that I am running in PyCharm with Python 3.4, using a virtual environment. In the cloud (AWS), I have an EC2 instance running Ubuntu. I'm not using a virtual environment in the cloud. The cloud machine has both Python 2.7 and Python 3.5 installed.
[Edit] I've switched to a virtual environment in my cloud environment, and I am installing from a setup distribution (still broken).
Problem: On my desktop, both within PyCharm and from the command line (within the virtual environment, using workon project), I can run a particular file called "do_daily.py" without any issues. However, if I try to run the same file on the cloud server, I get the famous import error.
[Edit] Running directly from the command line on the remote server:
python3 src/do_daily.py
Traceback (most recent call last):
  File "src/do_daily.py", line 3, in <module>
    from src.db_backup import dev0_backup as dev0bk
ImportError: No module named 'src.db_backup'
Folder structure: My folder structure for the specific import is (among other things):
+ project
+ src
- __init__.py
- do_daily.py
+ db_backup
- __init__.py
- dev0_backup.py
Python Path: (echo $PYTHONPATH)
/home/ubuntu/automation/Project/src/tg_servers:/home/ubuntu/automation/Project/src/db_backup:/home/ubuntu/automation/Project/src/aws:/home/ubuntu/automation/Project/src:/home/ubuntu/automation/Project
Other stuff:
print(sys.executable) = /usr/bin/python3
print(sys.path) = gives me all the above plus a bunch of default paths.
I have run out of ideas and would appreciate any help.
Thank you,
SteveJ
SOLUTION
Clearly the accepted answer is the most comprehensive and represents the best approach to the problem. However, for those seeing this later, I was able to solve my specific problem a little more directly.
(From within the virtual environment), both add2virtualenv and creating .pth files worked. What I was missing is that I had to add all the packages: src, db_backup, pkgx,y,z etc...
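For later readers, the two mechanisms look roughly like this (the paths are the ones from the question; the exact virtualenvwrapper layout is an assumption):
(project) $ add2virtualenv /home/ubuntu/automation/Project
(project) $ add2virtualenv /home/ubuntu/automation/Project/src
(project) $ add2virtualenv /home/ubuntu/automation/Project/src/db_backup
# or, equivalently, list the same paths in a .pth file inside the
# virtualenv's site-packages directory, one path per line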
I have created a GitHub repository (https://github.com/thebjorn/pyimport.git) and tested the code on a freshly created AWS/Ubuntu instance.
First, the installs and updates I did (installing and updating pip3):
ubuntu@:~$ sudo apt-get update
ubuntu@:~$ sudo apt install python3-pip
ubuntu@:~$ pip3 install -U pip
Then get the code:
ubuntu@:~$ git clone https://github.com/thebjorn/pyimport.git
My version of do_daily.py imports dev0_backup, contains a function that tells us it was called, and a __main__ section (for calling with -m or by filename):
ubuntu@ip-172-31-29-112:~$ cat pyimport/src/do_daily.py
from __future__ import print_function
from src.db_backup import dev0_backup as dev0bk

def do_daily_fn():
    print("do_daily_fn called")

if __name__ == "__main__":
    do_daily_fn()
The setup.py file points directly to the do_daily_fn:
ubuntu@ip-172-31-29-112:~$ cat pyimport/setup.py
from setuptools import setup

setup(
    name='pyimport',
    version='0.1',
    description='pyimport',
    url='https://github.com/thebjorn/pyimport.git',
    author='thebjorn',
    license='MIT',
    packages=['src'],
    entry_points={
        'console_scripts': """
            do_daily = src.do_daily:do_daily_fn
        """
    },
    zip_safe=False
)
Install the code in development mode:
ubuntu@:~$ pip3 install -e pyimport
I can now call do_daily in a number of ways (notice that I haven't done anything with my PYTHONPATH).
The console_scripts entry in setup.py makes it possible to call do_daily by just typing its name:
ubuntu@:~$ do_daily
do_daily_fn called
Installing the package (in dev mode or otherwise) makes the -m flag work out of the box:
ubuntu@:~$ python3 -m src.do_daily
do_daily_fn called
You can even call the file directly (although this is by far the ugliest way, and I would recommend against it):
ubuntu@:~$ python3 pyimport/src/do_daily.py
do_daily_fn called
Your PYTHONPATH should contain /home/ubuntu/automation/Project and likely nothing below it.
There are good reasons to use a virtualenv in production and to never install packages into the system Python explicitly. The system Python is for running the OS-provided software written in Python; don't mix it with your deployments.
A few questions here:
From which directory are you running your program?
Did you try to import the db_backup module inside src/__init__.py?

Installing lpsolve to work with python in Ubuntu?

I've read the other questions; unfortunately, they were not relevant.
Using this tutorial:
http://lpsolve.sourceforge.net/5.5/Python.htm
Found this file:
lp_solve_5.5.2.0_exe_ux32
That contains these files:
libbfp_etaPFI.so
libbfp_GLPK.so
libbfp_LUSOL.so
libxli_CPLEX.so
libxli_DIMACS.so
libxli_LINDO.so
libxli_MathProg.so
libxli_XPRESS.so
libxli_ZIMPL.so
lp_solve
The tutorial says the needed file is:
lpsolve55.so
How do you get lpsolve working with Python in Ubuntu?
This is how to install lpsolve55 for Python (Anaconda on an Ubuntu distribution), translated from the blog post (which is in Chinese):
1. Go to the downloads at SourceForge and get:
lp_solve_5.5.0.20_dev.tar.gz - where you will find liblpsolve55.so and some other files
lp_solve_5.5.0.20_Python_source.tar.gz - where you will find lpsolve55.so and setup.py
2. Make sure you have installed python-dev (if not, type sudo apt-get install python-dev at the command line).
3. Unzip lp_solve_5.5.0.20_Python_source.tar.gz into anaconda2/lib/python2.7/site-packages/ (this creates the directory lp_solve_5.5).
4. Unzip lp_solve_5.5.0.20_dev.tar.gz and put only liblpsolve55.so at anaconda2/lib, and the rest at anaconda2/lib/python2.7/site-packages/lp_solve_5.5, the directory you get at step 3, in the same layer as extra.
5. Now:
$ cd anaconda2/lib/python2.7/site-packages/lp_solve_5.5/extra/Python/
$ python setup.py install
6. Test by importing lpsolve55.
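A one-liner to confirm that the module is importable from the Anaconda Python in question:
$ python -c "import lpsolve55; print('lpsolve55 imported OK')"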
You will need this file:
lp_solve_5.5.2.0_Python2.5_exe_ux64.tar.gz
Inside it you will find: lpsolve55.so
You need to put that file in a place accessible to the Python path. I had problems doing that, so it's in the project's folder.
You also need this file:
lp_solve_5.5.2.0_dev_ux64.tar.gz
Inside it you will find:
liblpsolve55.so
This file needs to go to /usr/lib.

Installing python module to python-version neutral location

I've built a debian package containing a python module.
The problem is that
dpkg-deb -c python-mymodule_1.0_i386.deb
shows that all the files will be installed under
/usr/lib/python2.6/dist-packages/mymodule*
This means that the end user who installs my deb package will need to be using exactly the same version of Python as I am, yet I know that my module works fine on later versions too.
In my Makefile I have the following target:
install:
	python setup.py install --root $(DESTDIR) $(COMPILE) --install-layout=deb
Where setup.py is:
from distutils.core import setup

setup(name='mymodule',
      version='1.0',
      description='does my stuff',
      author='Me',
      author_email='myemail@localhost',
      url='http://myurl/',
      packages=['mymodule'],
      )
Is there some way I can edit either the setup.py file or the Makefile so that the resulting module is installed in a Python-version-neutral directory instead of /usr/lib/python2.6?
Thanks,
Alex
I think I have found the answer now (thanks to the pointers from Tshepang):
In debian/rules you need to invoke dh_pysupport.
This grabs all the files installed by setup.py on the build machine in
$(DESTDIR)/usr/lib/pythonX.Y/dist-packages/mymodule*
and puts them into a non-version-specific location in the .deb file, namely
/usr/share/pyshared/mymodule*
Finally, it adds a call to update-python-modules to the postinst script, which makes sure that the module is available on every version of Python present on the target machine.
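For reference, with a debhelper-7-style debian/rules the hookup was a one-liner (python-support is obsolete in current Debian releases, so treat this as historical; the addon name is my assumption, and the recipe line must be indented with a tab):
#!/usr/bin/make -f
%:
	dh $@ --with pysupport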
