How to use setuptools to create RPM packages for Linux - Python

I want to build an RPM package for my software. I am only familiar with the classic way of using the rpmbuild tool on Linux, with spec files and a source directory. But I read in the documentation of distutils that it can somehow create an RPM package. Setuptools is based on distutils, so I am guessing it also has some procedure to build RPMs.
Although I have never practically used either of the two modules, I always thought that they build their own standalone packages.
I have two questions. First, what is the exact procedure to create an RPM from setuptools? Second, is this way more organized than the rpmbuild utility?
What I have researched so far on the Internet:
Setuptools is mainly used to create a "wheel" package. A wheel is similar to other package formats like RPM or DEB, except that Linux will not directly understand it the way it understands an RPM.
You need to pass the bdist_rpm flag during the build process to create an RPM package. (link)
I am quite confused by the concepts of building and distributing a package. I need some explanation of what I am getting wrong about setuptools versus RPM.

You can build an RPM package using bdist directly from setup.py: http://jeromebelleman.gitlab.io/posts/devops/setuppy/
Note that this method is easy but can produce only very simple RPM packages. For example, you cannot put Requires (or BuildRequires) in the metadata; you have to remember to put them on the command line every time.
I would say that bdist is suitable only for initial work. If you want to ship and support the package, then creating a SPEC file is a must.
One more example: AFAIK you cannot specify %post or %pre scriptlets using bdist and setup.py.
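To make that concrete, here is a minimal sketch of the bdist route; the package name and the dependency passed on the command line are placeholders:

# setup.py -- a minimal sketch, just enough metadata for bdist_rpm
from setuptools import setup

setup(
    name="mypackage",
    version="1.0",
    packages=["mypackage"],
)

Because the dependencies cannot live in this metadata, they go on the command line each time:

python setup.py bdist_rpm --requires "somelib"

The generated .rpm ends up under dist/; the intermediate .spec file that bdist_rpm writes can be found under build/.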
Here is an example of python SPEC file: https://fedoraproject.org/wiki/Packaging:Python#Example_common_spec_file

Related

How to force a platform wheel using build and pyproject.toml?

I am trying to force a Python3 non-universal wheel I'm building to be a platform wheel, despite not having any native build steps that happen during the distribution-packaging process.
The wheel will include an OS-specific shared library, but that library is built and copied into my package directory by a larger build system that my package knows nothing about. By the time my Python3 package is ready to be built into a wheel, my build system has already built the native shared library and copied it into the package directory.
This SO post details a solution that works for the now-deprecated setup.py approach, but I'm unsure how to accomplish the same result using the new and now-standard build / pyproject.toml system:
mypackage/
    mypackage.py       # Uses platform.system and importlib to load the local OS-specific library
    pyproject.toml
    mysharedlib.so     # Or .dylib on macOS, or .dll on Windows
Based on the host OS performing the build, I would like the resulting wheel to be manylinux, macos, or windows.
I build with python3 -m build --wheel, and that always emits mypackage-0.1-py3-none-any.whl.
What do I have to change to force the build to emit a platform wheel?
OK, after some research and reading of code, I can present a bit of information and a few solutions that might meet other people's needs, summarized here:
Firstly, pyproject.toml is not mutually exclusive from setup.py. setuptools will complain about deprecation if you create a distribution package via python3 setup.py ... and no pyproject.toml file is present.
However, setup.py is still around and available, but it's a mistake to duplicate project configuration values (name, version, etc). So, put as much as your package will allow inside your pyproject.toml file, and use setup.py for things like overriding the Distribution class, or overriding the bdist_wheel module, etc.
As far as creating platform wheels, there are a few approaches that work, with pros and cons:
Override the bdist_wheel command class in setup.py as described here and set self.root_is_pure to False in the finalize_options override. This forces the python tag (e.g. cp39) to be set, along with the platform tag (a sketch of this and the next option follows the list).
Override the Distribution class in setup.py as described here and override has_ext_modules() to simply return True. This also forces the python and platform tags to be set.
Add an unused minimal extension module to your packaging definition, as described here and here. This lengthens the build process and adds a useless "dummy" shared library to be dragged along wherever your wheel goes.
Add the argument -C=--build-option=--plat {your-platform-tag} to the build invocation (for my case that's python -m build -w -n, for example). This leaves the Python tag untouched but you have to supply your own tag; there's no way to say "use whatever the native platform is". You can discover the exact platform tag with the command wheel.bdist_wheel.get_platform(pathlib.Path('.')) after importing the pathlib and wheel.bdist_wheel packages, but that can be cumbersome because wheel isn't a standard library package.
Simply rename your wheel from mypkg-py3-none-any.whl to mypkg-py3-none-macosx_13_0_x86_64.whl: it appears that the platform tag is only encoded into the filename, and not any of the package metadata that's generated during the distribution-packaging process.
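To make options 1 and 2 concrete, here is a minimal sketch of the setup.py they imply; the class names are mine, and it assumes the wheel package still exposes bdist_wheel:

# setup.py -- sketch of options 1 and 2; either override alone is enough
from setuptools import setup
from setuptools.dist import Distribution
from wheel.bdist_wheel import bdist_wheel

class PlatformWheel(bdist_wheel):
    def finalize_options(self):
        super().finalize_options()
        self.root_is_pure = False  # option 1: forces python + platform tags

class BinaryDistribution(Distribution):
    def has_ext_modules(self):
        return True  # option 2: also forces python + platform tags

setup(
    cmdclass={"bdist_wheel": PlatformWheel},
    distclass=BinaryDistribution,
)

Build as usual with python3 -m build --wheel; the resulting filename should then carry the native platform tag.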
In the end I chose option #4 because it required the least amount of work: no setup.py files need to be introduced solely to accomplish this, and the build logs make it clear that a platform wheel (not a pure wheel) is being created.

How to install python modules using cppyy?

I want to package a python module containing python source and a native c++ library. Cppyy is used to dynamically generate the bindings so the library is really just a normal library. The build system for the library is meson and should not be replaced. The whole thing is in a git repository. I only care about Linux.
My question is how to get from this to “pip install url_to_package builds/installs everything” in the least complicated way possible.
What I’ve tried:
Extending setuptools with a custom build command:
…that executes meson compile and copies the result into the right place. But pip install performs its work in some random split-off temporary directory, and I can't find my C++ sources from there (a sketch of this attempt is at the end of the question).
The Meson python module:
…can build my library and install files directly into some python env. Does not work with pip and has very limited functionality.
Wheels:
…are incredibly confusing and overkill for me. I will likely be the only user of this module. Actually, all I want is to easily use the module in projects that live in different directories…
Along the way, I also came across different CMake solutions, but those are disqualified because of my build system choice. What should I do?
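For reference, the custom build command from the first attempt looked roughly like this; all names and paths are hypothetical:

# setup.py -- rough sketch of the custom-build-command attempt
import shutil
import subprocess
from setuptools import setup
from setuptools.command.build_py import build_py

class MesonBuildPy(build_py):
    def run(self):
        # This runs in whatever temporary directory pip copied the sources to,
        # which is why the C++ tree can no longer be found during "pip install url".
        subprocess.run(["meson", "compile", "-C", "builddir"], check=True)
        shutil.copy("builddir/libmylib.so", "mypackage/")  # hypothetical library name
        super().run()

setup(cmdclass={"build_py": MesonBuildPy})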

Why do we need to install Python modules

Let's imagine I created a new module. Why do I need to install it via a setup file? I mean, I can just add my module to the PYTHONPATH variable and that's all. Thanks
For a simple one-file module, sure, that's enough.
But a setup.py file also lets you create a distribution, associate metadata with it (author, homepage, description, etc.), register your package with the Python Package Index, and, most of all, define what other packages might be needed to run your code. setup.py is not just for installing your module.
Installing a module with a setup.py based on setuptools also gives you additional functionality, such as support for namespaced packages (multiple distributions sharing a top-level name) and the ability to install multiple versions of a package side by side.
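A minimal sketch of such a setup.py, with placeholder values throughout:

# setup.py -- every value here is a placeholder
from setuptools import setup

setup(
    name="mymodule",                     # distribution name
    version="0.1",
    author="Jane Doe",                   # metadata associated with the distribution
    description="One-line summary",
    url="https://example.com/mymodule",  # homepage
    py_modules=["mymodule"],             # the single-file module itself
    install_requires=["requests>=2.0"],  # other packages needed to run the code
)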

What is the difference between an 'sdist' .tar.gz distribution and a Python egg?

I am a bit confused. There seem to be two different kinds of Python packages: source distributions (setup.py sdist) and egg distributions (setup.py bdist_egg).
Both seem to be just archives with the same data, the Python source files. One difference is that pip, the most recommended package manager, is not able to install eggs.
What is the difference between the two and what is 'the' way to do distribute my packages?
(Note, I am not wanting to distribute my packages through PyPI, but I want to use a package manager that fetches my dependencies from PyPI)
setup.py sdist creates a source distribution: it contains setup.py, the source files of your module/script (.py files or .c/.cpp for binary modules), your data files, etc. The result is an archive that can then be used to recompile everything on any platform.
setup.py bdist (and bdist_*) creates a built distribution: it includes .pyc files, .so/.dll/.dylib for binary modules, .exe if using py2exe on Windows, your data files... but no setup.py. The result is an archive that is specific to a platform (for example linux-x86_64) and to a version of Python, and that can be installed simply by extracting it into the root of your filesystem (executables are in /usr/bin (or equivalent), data files in /usr/share, modules in /usr/lib/pythonX.X/site-packages/...). You can even build rpm archives that can be directly installed using your package manager.
2021 update: the tools to build and use eggs no longer exist in Python.
There are many more than two different kinds of Python (distribution) packages. This command lists many subcommands:
$ python setup.py --help-commands
Notice the various different bdist types.
An egg was a new package type, introduced by setuptools but later adopted by the standard library. It is meant to be installed as a monolithic unit onto sys.path. This differs from an sdist package, which is meant to have setup.py install run, copying each file into place and perhaps taking other actions as well (building extension modules, running additional arbitrary Python code included in the package).
Eggs are largely obsolete at this point in time. EDIT: eggs are gone; they were used with the easy_install command, which has been removed from Python.
The favored packaging format now is the "wheel" format, notably used by "pip install".
Whether you create an sdist or an egg (or wheel) is independent of whether you'll be able to declare the package's dependencies (to be downloaded automatically at installation time from PyPI). All that's necessary for this dependency feature to work is for you to declare the dependencies using the extra APIs provided by distribute (the successor of setuptools) or distutils2 (the successor of distutils, otherwise known as packaging in the current development version of Python 3.x).
https://packaging.python.org/ is a good resource for further information about packaging. It covers some of the specifics of declaring dependencies (e.g. install_requires but not extras_require, AFAICT).

Can we shed some definitive light on how python packaging and import works?

I have had my fair share of attempts at getting through Python's module management, and every time it is a challenge: packaging is not what people do every day, and it becomes a burden to learn and a burden to remember, even when you actually do it, since it normally happens only once.
I would like to collect here the definitive overview of how import, package management, and distribution work in Python, so that this question becomes the definitive explanation for all the magic that happens under the hood. Although I understand the broad scope of the question, these things are so intertwined that any narrowly focused answer will not solve the main problem: understanding how it all works, what is outdated, what is current, what are just alternatives for the same task, and what the quirks are.
The list of keywords to refer to is the following, but this is just a sample of the bunch. There's a lot more, and you are welcome to add additional details.
PyPI
setuptools / Distribute
distutils
eggs
egg-link
pip
zipimport
site.py
site-packages
.pth files
virtualenv
handling of compiled modules in eggs (with and without installation via easy_install)
use of get_data()
pypm
bento
PEP 376
the cheese shop
eggsecutable
Linking to other answers is probably a good idea. As I said, this question is for the high-level overview.
For the most part, this is an attempt to look at the packaging/distribution side, not the mechanics of import. Unfortunately, packaging is the place where Python provides way more than one way to do it. I'm just trying to get the ball rolling, hopefully others will help fill what I miss or point out mistakes.
First of all, there's some messy terminology here. A directory containing an __init__.py file is a package. However, most of what we're talking about here are specific versions of packages published on PyPI, one of its mirrors, or in a vendor-specific package management system like Debian's Apt, Red Hat's Yum, Fink, MacPorts, Homebrew, or ActiveState's pypm.
These published packages are what folks are trying to call "distributions" going forward, in an attempt to use "package" only for the Python language construct. You can see some of that usage in PEP 376.
Now, your list of keywords relate to several different aspects of the Python Ecosystem:
Finding and publishing python distributions:
PyPI (aka the cheese shop)
PyPI Mirrors
Various package management tools / systems: apt, yum, fink, macports, homebrew
pypm (ActiveState's alternative to PyPI)
The above are all services that provide a place to publish Python distributions in various formats. Some, like PyPI mirrors and apt/yum repositories, can be run on your local machine or within your company's network, but folks typically use the official ones. Most, if not all, provide a tool (or multiple tools, in the case of PyPI) to help find and download distributions.
Libraries used to create and install distributions:
setuptools / Distribute
distutils
Distutils is the standard infrastructure on which Python packages are compiled and built into distributions. There's a ton of functionality in distutils but most folks just know:
from distutils.core import setup

setup(name='Distutils',
      version='1.0',
      description='Python Distribution Utilities',
      author='Greg Ward',
      author_email='gward@python.net',
      url='http://www.python.org/sigs/distutils-sig/',
      packages=['distutils', 'distutils.command'],
     )
And to some extent, that's most of what you need. With the prior nine lines of code you have enough information to install a pure Python package, and also the minimal metadata required to publish that package as a distribution on PyPI.
Setuptools provides the hooks necessary to support the Egg format and all of its features and foibles. Distribute is an alternative to Setuptools that adds some features while trying to be mostly backwards compatible. I believe Distribute is going to be included in Python 3 as the successor to distutils' from distutils.core import setup.
Both Setuptools and Distribute provide a custom version of the distutils setup command that does useful things like support the Egg format.
Python Distribution Formats:
source
eggs
Distributions are typically provided as source archives (tarball or zipfile). The standard way to install a source distribution is by downloading and uncompressing the archive and then running the setup.py file inside.
For example, the following will download, build, and install the Pygments syntax highlighting library:
curl -O -G http://pypi.python.org/packages/source/P/Pygments/Pygments-1.4.tar.gz
tar -zxvf Pygments-1.4.tar.gz
cd Pygments-1.4
python setup.py build
sudo python setup.py install
Alternatively you can download the Egg file and install it. Typically this is accomplished by using easy_install or pip:
sudo easy_install pygments
or
sudo pip install pygments
Eggs were inspired by Java's JAR files, and they have quite a few features you should read about here.
Python Package Formats:
uncompressed directories
zipimport (zip compressed directories)
A normal python package is just a directory containing an __init__.py file and an arbitrary number of additional modules or sub-packages. Python also has support for finding and loading source code within *.zip files as long as they are included on the PYTHONPATH (sys.path).
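For example, a zip archive can be made importable just by putting it on sys.path; the archive name and contents here are hypothetical:

# Assumes mylibs.zip contains mypackage/__init__.py at its top level
import sys
sys.path.insert(0, "/home/myname/mylibs.zip")  # zip archives act like directories on the path
import mypackage  # loaded from inside the zip via the zipimport machinery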
Installing Python Packages:
easy_install: the original egg installation tool, depends on setuptools
pip: currently the most popular way to install python packages. Similar to easy_install but more flexible and has some nice features like requirements files to help document dependencies and reproduce deployments.
pypm, apt, yum, fink, etc
Environment Management / Automated Deployment:
bento
buildout
virtualenv (and virtualenvwrapper)
The above tools are used to help automate and manage dependencies for a Python project. Basically they give you tools to describe what distributions your application requires and automate the installation of those specific versions of your dependencies.
Locations of Packages / Distributions:
site-packages
PYTHONPATH
the current working directory (depends on your OS and environment settings)
By default, installing a python distribution is going to drop it into the site-packages directory. That directory is usually something like /usr/lib/pythonX.Y/site-packages.
A simple programmatic way to find your site-packages directory:
from distutils import sysconfig
print(sysconfig.get_python_lib())
Ways to modify your PYTHONPATH:
Python's import statement will only find packages that are located in one of the directories included in your PYTHONPATH.
You can inspect and change your path from within Python by accessing:
import sys
print(sys.path)
sys.path.append("/home/myname/lib")
Besides that, you can set the PYTHONPATH environment variable like you would any other environment variable on your OS or you could use:
.pth files: *.pth files located in directories that are already on your PYTHONPATH are read and each line of the *.pth file is added to your PYTHONPATH. Basically any time you would copy a package into a directory on your PYTHONPATH you could instead create a mypackages.pth. Read more about *.pth files: site module
egg-link files: described under "Internal structure of python eggs", these are a cross-platform alternative to symbolic links. Creating an egg-link file is similar to creating a pth file.
site.py modifications
To add the above /home/myname/lib to site-packages with a *.pth file, you'd create one. The name of the file doesn't matter, but you should still probably choose something sensible.
Let's create myname.pth:
# myname.pth
/home/myname/lib
That's it. Drop that into sysconfig.get_python_lib() on your system or any other directory in your PYTHONPATH and /home/myname/lib will be added to the path.
For the packaging question, this should help: http://guide.python-distribute.org/
For import, the old article from Fredrik Lundh, http://effbot.org/zone/import-confusion.htm, is still a very good starting point.
I recommend Tarek Ziadé's book on Python. There's a chapter dedicated to packaging and distribution.
I don't think import needs to be explored (Python's namespacing and importing functionality is intuitive IMHO).
I use pip exclusively now. I haven't run into any issues with it.
However, the topic of packaging and distribution is something worth exploring. Instead of giving a lengthy answer, I will say this:
I learned how to package and distribute my own "packages" by simply copying how Pylons and many other open-source packages do it. I then combined that sort of template with reading up on the docs to flesh it out even further, and have come up with a solid distribution method.
When you grok package management and distribution for Python (distutils and PyPI), it's actually quite powerful. I like it a lot.
[edit]
I also wanted to add in a bit about virtualenv. USE IT. I create a virtualenv for every project, and I always use --no-site-packages; I install all the packages I need for that particular project (even if it's something common amongst them all, like lxml) inside the virtualenv. It keeps everything isolated and it's much easier for me to maintain the grouping in my head (rather than trying to keep track of what's where and for which version of Python!).
[/edit]
