How to include a .so file when building an .egg with setuptools - python

How can I build an .egg file that includes a '.so' file?
Here is my file tree:
--setup.py
--__init__.py
--mcpack.py
--mcpack.so
How can I make a package with mcpack.so in it?
I've tried using package_data; the .so file ends up in the package, but it cannot be found from the Python code.

Put it in native_libs.txt and build the egg with bdist_egg; see this section of the documentation. You should, however, switch to wheels instead of eggs.
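For reference, here is a minimal sketch of a setup.py that bundles a prebuilt .so via bdist_egg. It assumes the sources from the question are moved into a package directory (hypothetically named mcpack) so that package_data has a package to attach the library to; the name and version are placeholders.

from setuptools import setup, find_packages

setup(
    name="mcpack",
    version="0.1",
    packages=find_packages(),
    # Ship the prebuilt shared library next to the Python sources.
    package_data={"mcpack": ["*.so"]},
    # Install unzipped so the .so exists as a real file on disk at runtime.
    zip_safe=False,
)

Build it with python setup.py bdist_egg. zip_safe=False may be what makes the difference when the .so is "packaged but not findable", since a zipped egg leaves no real file on disk to load.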
Shared object files are only distributable if you can guarantee that the exact same toolchain used to build your .so is also present on every other system that will run it. That is highly unlikely and would break the distributable nature of egg files, unless you really only want to target a specific version of a Linux distribution.
Your egg, if installed on any distribution other than the one the .so was built on, will not be usable.
If you want to build a distributable package containing native code, have a look at wheel files; for wheels that can be distributed to all Linux OSs, have a look at manylinux.
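As a rough sketch of the wheel route (the project and file names are hypothetical, and auditwheel is normally run inside a manylinux build container):

python -m pip install build auditwheel
python -m build --wheel
# produces dist/*.whl for the current platform; then, on Linux:
auditwheel repair dist/mcpack-0.1-*.whl
# bundles external shared libraries and retags the wheel as manylinux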

Related

What is `build` in the context of python?

I'm currently learning about Python distribution packages, and reading through this article, and it says:
pyproject.toml tells build tools (like pip and build) what is required to build your project. This tutorial uses setuptools, so open pyproject.toml and enter the following content:
It mentions the concept of building Python packages a few more times.
As far as I know, at least when talking about pure Python code, Python distributions (sdists and wheels) only contain .py source files. So what is meant by the author when speaking about building?
build can have two meanings.
In your description, building means putting all the project files into one file with the extension .whl and/or .tar.gz, so that users (using pip) can later download everything as a single .whl (or .tar.gz).
So we could say that this kind of building means packing.
The second kind of build (not the one in your description) can happen when people install a package. Some packages contain C/C++ code (e.g. numpy), and after downloading, it needs to be compiled (for your CPU).
So we could say that this kind of building means compiling.
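To make the first meaning concrete, here is a hedged sketch of "packing" with the build tool (the project name and version are hypothetical):

python -m pip install build
python -m build
# For a pure-Python project this typically produces something like:
#   dist/myproject-0.1.0.tar.gz             (sdist - the source archive)
#   dist/myproject-0.1.0-py3-none-any.whl   (wheel - the ready-to-install archive)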

How to structure and distribute Pybind11 extension with stubs?

I'm trying to create and distribute (with pip) a Python package that has Python code, and C++ code compiled to a .pyd file with Pybind11 (using Visual Studio 2019). I also want to include .pyi stub files, for VScode and other editors. I can't find much documentation on doing this correctly.
I'd like to be able to just install the package via pip as normal, and write from mymodule.mysubmodule import myfunc etc like a normal Python package, including autocompletes, type annotations, VScode intellisense etc using the .pyi files I'd write.
My C++ code is in multiple .cpp and header files. It uses a few standard libraries, and a few external libraries (such as Boost). It defines a single module and 2 submodules. I want to be able to distribute this on Windows and Linux, and for x86 and x64. I am currently targeting Python 3.9 and the C++17 standard.
How should I structure and distribute this package? Do I include the C++ source files and create a setup.py similar to the Pybind11 example? If so, how do I include the external libraries? And how do I structure the .pyi stub files? Does this mean whoever tries to install my package would need a C++ compiler as well?
Or should I compile my C++ to a .pyd/.so file for each platform and architecture? If so, is there a way to specify which one gets installed through pip? And again, how would I structure the .pyi stubs?
Generating .pyi stubs
The pybind11 issue mentions a couple of tools (1, 2) to generate stubs for binary modules. There could be more, but I'm not aware of others. Unfortunately both are far from being perfect, so you probably still need to check and adjust the generated stubs manually.
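I can't tell which two tools the links point to, but two stub generators commonly used with binary modules are mypy's stubgen and pybind11-stubgen; a rough usage sketch (the module name is hypothetical):

python -m pip install mypy pybind11-stubgen
stubgen -m mymodule -o stubs     # mypy's generic stub generator
pybind11-stubgen mymodule        # generator that understands pybind11 signatures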
Distribution of .pyi stubs
After correcting the stubs, you just include those .pyi files in your distribution (e.g. in the wheel or as sources) along with a py.typed marker file, or alternatively distribute them separately as a standalone package (e.g. mypackage-stubs).
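A minimal sketch of the in-package route in setup.py, assuming the .pyi files and a py.typed marker sit inside the package directory (names are hypothetical):

from setuptools import setup, find_packages

setup(
    name="mymodule",
    version="0.1.0",
    packages=find_packages(),
    # Include the stub files and the py.typed marker in the built distribution.
    package_data={"mymodule": ["*.pyi", "py.typed"]},
    zip_safe=False,
)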
Building wheels
Wheels allow users of your library to install it in binary form, i.e. without compilation. Wheel builds typically use older compilers in order to be compatible with a greater number of systems/platforms, so you might face some trouble with a C++17 library. (C++11 is old enough and should pose no problems for wheels.)
Building wheels for various platforms is tedious; pybind11's python_example uses the cibuildwheel package to do that, and I would recommend this route if you are already using CI.
If a wheel is missing for the target platform, pip will attempt to install from source. This requires the compiler and the 3rd-party libraries you are using to be installed already.
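Going back to the cibuildwheel suggestion above, a very rough local sketch (on Linux it needs Docker; in CI the tool is usually driven by the provider's workflow config):

python -m pip install cibuildwheel
cibuildwheel --platform linux    # or: windows / macos, on the matching host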
Maybe conda?
If your setup is complex and requires a number of 3rd-party libraries, it might be worth writing a conda recipe and using conda-forge to generate binary versions of the package. Conda is superior to pip here, since it can manage non-Python dependencies as well.

Building a python wheel that includes svn:externals files on Jenkins

I'm building a package on 32-bit Python 2.7.6 on 32-bit Windows.
The only definitive source of some components of the package is an SVN 'share'. The common practice in this company is to include that in your project using svn:externals.
The normal way to build this package is:
python setup.py bdist_wheel
All appears normal on my workstation (where I checked out the code with TortoiseSVN), but when I run the same process on Jenkins, the bdist_wheel process does not include any .py file that was sourced via svn:externals.
After reading through the documentation, this appears to be because of a feature that identifies which scripts are part of the package based on which files are tracked by SVN. It appears that, as a consequence of how Jenkins checks out the files, bdist_wheel sees that I'm using SVN and assumes it knows how to determine which files are tracked, but gets the answer wrong.
What I need is a way to stop the bdist_wheel command from trying to guess which files I care about (I actually want every .py file in the project to be included, regardless of how it was brought in).
I tried specifying the files I needed using a MANIFEST.in file, but it did not work:
recursive-include externals *.py
In this example, 'externals' is a top-level directory in my source tree which contains an __init__.py file and a bunch of svn:external'd directories. Only the __init__.py file can be seen in the built .whl file.
Unfortunately this makes the .py files behave as if they were data; in the log I can see this:
copying build\lib\externals\security\credentials.py -> build\bdist.win32\wheel\foopackage-0.0.4.data\..\externals\security
That's obviously not a real solution!
Pip, Virtualenv and all relevant tools are at the latest stable versions.
It turns out that this problem is caused by Jenkins using a very old SVN standard (1.4) for its own repositories. Switching to 1.7 corrects this behavior.
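For anyone who cannot change the SVN version Jenkins uses, a hedged workaround sketch is to make module discovery independent of the VCS file finder by letting setuptools walk the tree itself. This assumes every svn:external'd directory is a real package containing an __init__.py; the name and version are taken from the log line in the question:

from setuptools import setup, find_packages

setup(
    name="foopackage",
    version="0.0.4",
    # find_packages() scans the source tree directly and includes any directory
    # with an __init__.py, so the svn:external'd subpackages under externals/
    # are picked up regardless of what SVN reports as tracked.
    packages=find_packages(),
)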

What is the difference between an 'sdist' .tar.gz distribution and an python egg?

I am a bit confused. There seem to be two different kinds of Python packages: source distributions (setup.py sdist) and egg distributions (setup.py bdist_egg).
Both seem to be just archives with the same data, the Python source files. One difference is that pip, the most recommended package manager, is not able to install eggs.
What is the difference between the two and what is 'the' way to do distribute my packages?
(Note: I am not looking to distribute my packages through PyPI, but I do want to use a package manager that fetches my dependencies from PyPI.)
setup.py sdist creates a source distribution: it contains setup.py, the source files of your module/script (.py files or .c/.cpp for binary modules), your data files, etc. The result is an archive that can then be used to recompile everything on any platform.
setup.py bdist (and bdist_*) creates a built distribution: it includes .pyc files, .so/.dll/.dylib for binary modules, .exe if using py2exe on Windows, your data files... but no setup.py. The result is an archive that is specific to a platform (for example linux-x86_64) and to a version of Python, and that can be installed simply by extracting it into the root of your filesystem (executables are in /usr/bin (or equivalent), data files in /usr/share, modules in /usr/lib/pythonX.X/site-packages/...). You can even build rpm archives that can be directly installed using your package manager.
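To make the difference concrete, here is a hedged sketch of what the two commands produce for a hypothetical project containing a compiled module:

python setup.py sdist
#   dist/myproject-1.0.tar.gz                        (source, rebuildable anywhere)
python setup.py bdist_wheel
#   dist/myproject-1.0-cp39-cp39-linux_x86_64.whl    (built for one Python version and platform)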
2021 update: the tools to build and use eggs no longer exist in Python.
There are many more than two different kinds of Python (distribution) packages. This command lists many subcommands:
$ python setup.py --help-commands
Notice the various different bdist types.
An egg was a new package type, introduced by setuptools but later adopted by the standard library. It is meant to be installed as a monolithic unit onto sys.path. This differs from an sdist package, which is meant to have setup.py install run, copying each file into place and perhaps taking other actions as well (building extension modules, running additional arbitrary Python code included in the package).
Eggs are largely obsolete at this point in time. EDIT: eggs are gone; they were used with the "easy_install" command, which has since been removed.
The favored packaging format now is the "wheel" format, notably used by "pip install".
Whether you create an sdist or an egg (or wheel) is independent of whether you'll be able to declare what dependencies the package has (to be downloaded automatically from PyPI at installation time). All that's necessary for this dependency feature to work is for you to declare the dependencies using the extra APIs provided by distribute (the successor of setuptools) or distutils2 (the successor of distutils, otherwise known as packaging in the current development version of Python 3.x).
https://packaging.python.org/ is a good resource for further information about packaging. It covers some of the specifics of declaring dependencies (e.g. install_requires, though not extras_require, as far as I can tell).
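For reference, a minimal sketch of declaring dependencies with setuptools (the project and dependency names are hypothetical):

from setuptools import setup, find_packages

setup(
    name="myproject",
    version="1.0",
    packages=find_packages(),
    # Installed automatically by pip at installation time.
    install_requires=["requests>=2.0"],
    # Optional extras, installed via: pip install myproject[dev]
    extras_require={"dev": ["pytest"]},
)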

How do I turn a python program into an .egg file?

How do I turn a python program into an .egg file?
Setuptools is the software that creates .egg files. It's an extension of the distutils package in the standard library.
The process involves creating a setup.py file; running python setup.py bdist_egg then creates an .egg package.
Also, if you need to build an .egg package from a single .py file app, check this link: EasyInstall - Packaging other projects as eggs.
Python has its own package for creating distributions, called distutils. However, instead of using distutils' setup function, we're using setuptools' setup. We're also using setuptools' find_packages function, which will automatically look for any packages in the current directory and add them to the egg. To create said egg, you'll need to run the following from the command line:
c:\Python34\python.exe setup.py bdist_egg
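A minimal setup.py along those lines might look like this (the package name is hypothetical):

from setuptools import setup, find_packages

setup(
    name="myprogram",
    version="0.1",
    # find_packages() adds every package found in the current directory to the egg.
    packages=find_packages(),
)

Running the bdist_egg command above against a file like this drops the resulting .egg into the dist directory.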
