How to adapt ``pip install -e .`` to build cython extensions - python

With the move to the new pyproject.toml system, I was wondering whether there was a way to install packages in editable mode while compiling extensions (which pip install -e . does not do).
So I want pip to:
- run the build_ext I configured for Cython and generate my .so files
- put them in the local folder
- do the rest of the normal editable install
I found some mentions of build_wheel_for_editable in the pip documentation, but I could not find any actual example of where this hook should be implemented or what it should look like. (To be honest, I'm not even completely sure this is what I'm looking for.)
So would anyone know how to do that?
I'd also be happy about any additional explanation as to why pip install . runs build_ext but the editable command does not.
Details:
I don't have a setup.py file anymore; the pyproject.toml uses setuptools and contains
[build-system]
requires = ["setuptools>=61.0", "numpy>=1.17", "cython>=0.18"]
build-backend = "setuptools.build_meta"
[tool.setuptools]
package-dir = {"" = "."}
[tool.setuptools.packages]
find = {}
[tool.setuptools.cmdclass]
build_ext = "_custom_build.build_ext"
The custom build_ext looks like
from setuptools import Extension
from setuptools.command.build_ext import build_ext as _build_ext
from Cython.Build import cythonize
class build_ext(_build_ext):
    def initialize_options(self):
        super().initialize_options()
        if self.distribution.ext_modules is None:
            self.distribution.ext_modules = []
        extensions = Extension(...)
        self.distribution.ext_modules.extend(cythonize(extensions))

    def build_extensions(self):
        ...
        super().build_extensions()
It builds the .pyx into a .cpp file, then compiles it together with another .cpp file into a .so.

I created a module that looks like this:
$ tree .
.
├── pyproject.toml
├── setup.py
└── test
    └── helloworld.pyx
1 directory, 3 files
My pyproject.toml looks like:
[build-system]
requires = ["setuptools>=61.0", "numpy>=1.17", "cython>=0.18"]
build-backend = "setuptools.build_meta"
[tool.setuptools]
py-modules = ["test"]
[project]
name = "test"
version = "0.0.1"
My setup.py:
from setuptools import setup
from Cython.Build import cythonize
setup(ext_modules=cythonize("test/helloworld.pyx"))
And helloworld.pyx just contains print("Hello world").
When I do pip install -e ., it builds the cython file as expected.
If you really don't want to have a setup.py at all, I think you'll need to override build_py instead of build_ext, but IMO just having the simple setup.py file isn't a big deal.

Related

Using pypa's build on a python project leads to a generic "none-any.whl" wheel, but the package has OS-specific binaries (cython)

I am trying to build a package for distribution which has cython code that I would like to compile into binaries before uploading to PyPI. To do this I am using pypa's build,
python -m build
in the project's root directory. This cythonizes the code and generates the binaries for my system then creates the sdist and wheel in the dist directory. However, the wheel is named "--py3-none-any.whl". When I unzip the .whl I do find the appropriate binaries stored,
(e.g., cycode.cp39-win_amd64.pyd). The problem is that I plan to run this in a GitHub workflow where binaries are built for multiple Python versions and operating systems. That workflow works fine, but uploading to PyPI overwrites (or causes a duplicate version error) because the wheels from the various OSes all share the same name. And if I then install from PyPI on another OS, I get "module can't be found" errors, since the binaries for that OS are not there and, because it was a wheel, the installation did not re-compile the Cython files.
I am working with 64-bit Windows, MacOS, and Ubuntu. Python versions 3.8-3.10. And a small set of other packages which are listed below.
Does anyone see what I am doing wrong here? Thanks!
Simplified Package
Tests\
Project\
    __init__.py
    pycode.py
    cymod\
        __init__.py
        _cycode.pyx
_build.py
pyproject.toml
pyproject.toml
[project]
name = 'Project'
version = '0.1.0'
description = 'My Project'
authors = ...
requires-python = ...
dependencies = ...
[build-system]
requires = [
'setuptools>=64.0.0',
'numpy>=1.22',
'cython>=0.29.30',
'wheel>=0.38'
]
build-backend = "setuptools.build_meta"
[tool.setuptools]
py-modules = ["_build"]
include-package-data = true
packages = ["Project",
"Project.cymod"]
[tool.setuptools.cmdclass]
build_py = "_build._build_cy"
_build.py
import os
from setuptools.extension import Extension
from setuptools.command.build_py import build_py as _build_py
class _build_cy(_build_py):
    def run(self):
        self.run_command("build_ext")
        return super().run()

    def initialize_options(self):
        super().initialize_options()
        import numpy as np
        from Cython.Build import cythonize

        print('!-- Cythonizing')
        if self.distribution.ext_modules is None:
            self.distribution.ext_modules = []
        # Add to ext_modules list
        self.distribution.ext_modules.append(
            Extension(
                'Project.cymod.cycode',
                sources=[os.path.join('Project', 'cymod', '_cycode.pyx')],
                include_dirs=[os.path.join('Project', 'cymod'), np.get_include()]
            )
        )
        # Cythonize ext_modules
        self.distribution.ext_modules = cythonize(
            self.distribution.ext_modules,
            compiler_directives={'language_level': "3"},
            include_path=['.', np.get_include()]
        )
        print('!-- Finished Cythonizing')
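A likely explanation, for what it's worth: bdist_wheel chooses between the generic "py3-none-any" tag and a platform-specific one by calling has_ext_modules() on the Distribution, and with the hook above the ext_modules list is only populated once build_py runs, after the tag has already been chosen. The decision can be reproduced in isolation (names taken from the question):

```python
# bdist_wheel's pure/impure decision boils down to this check on the
# Distribution object; extensions injected later are not seen.
from setuptools.dist import Distribution
from setuptools.extension import Extension

pure = Distribution({"name": "Project"})
print(bool(pure.has_ext_modules()))    # False

binary = Distribution({
    "name": "Project",
    "ext_modules": [Extension("Project.cymod.cycode",
                              ["Project/cymod/_cycode.pyx"])],
})
print(bool(binary.has_ext_modules()))  # True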

How to configure setuptools with setup.cfg to include platform name, python tag and ABI tag?

Prompted by the console message that setup.py install is deprecated, I am in the middle of upgrading my existing setup.py install workflow to the recommended setup.cfg with build.
My existing setup.py looks something like
from setuptools import setup

setup(
    name='pybindsample',
    version='0.1.0',
    packages=[''],
    package_data={'': ['pybindsample.so']},
    has_ext_modules=lambda: True,
)
My current translation looks like:
setup.cfg
[metadata]
name = pybindsample
version = 0.1.0
[options]
packages = .
[options.package_data]
. = pybindsample.so
pyproject.toml
[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"
My question is how to translate has_ext_modules=lambda: True (taken from the solution here). Without it, after executing python3 -m build --wheel, the generated wheel is named pybindsample-0.1.0-py3-none-any.whl, whereas my old python3 setup.py bdist_wheel generated a wheel named pybindsample-0.1.0-cp39-cp39-macosx_11_0_x86_64.whl. I have attempted
setup.cfg
[metadata]
name = pybindsample
version = 0.1.0
[options]
packages = .
has_ext_modules=lambda: True,
[options.package_data]
. = pybindsample.so
but it still generates pybindsample-0.1.0-py3-none-any.whl, I also attempted
setup.cfg
[metadata]
name = pybindsample
version = 0.1.0
[options]
packages = .
[options.package_data]
. = pybindsample.so
[bdist_wheel]
python-tag = cp39
plat-name = macosx_11_0_x86_64
py-limited-api = cp39
this generates pybindsample-0.1.0-cp39-none-macosx_11_0_x86_64.whl, and I couldn't figure out why the abi tag is still none.
What is the right way to configure setuptools with setup.cfg to include platform name, python tag, and ABI tag?

How to build py3-none-any wheels for a project with an optional C extension?

msgpack includes an optional cython
extension. Some users of the package want py3-none-any wheels of msgpack. I'm trying to figure out how to make
it possible to build wheels both with and without the optional extension.
One possible solution is to use an environment variable in setup.py to decide whether to set ext_modules to an empty list or to a list of setuptools.Extension objects.
pyproject.toml
[build-system]
requires = ["setuptools", "wheel", "cython"]
build-backend = "setuptools.build_meta"
setup.py
from setuptools import setup, Extension
import os

if 'ONLY_PURE' in os.environ:
    ext_modules = []
else:
    module1 = Extension('helloworld', sources=['helloworld.pyx'])
    ext_modules = [module1]

setup(ext_modules=ext_modules)
setup.cfg
[metadata]
name = mypackage
version = 0.0.1
[options]
py_modules = mypackage
mypackage.py
try:
    import helloworld
except ImportError:
    print('hello pure python')
helloworld.pyx
print("hello extension")
To build with extension:
$ pip install build
...
$ python -m build
...
$ ls dist/
mypackage-0.0.1-cp39-cp39-linux_x86_64.whl mypackage-0.0.1.tar.gz
To build without extension
$ pip install build
...
$ ONLY_PURE='a nonempty string' python -m build
...
$ ls dist/
mypackage-0.0.1-py3-none-any.whl mypackage-0.0.1.tar.gz

How do I build multiple wheel files from a single setup.py?

In my project, I have a single setup.py file that builds multiple modules using the following namespace pattern:
from setuptools import setup

setup(name="testmoduleserver",
      packages=["testmodule.server", "testmodule.shared"],
      namespace_packages=["testmodule"])

setup(name="testmoduleclient",
      packages=["testmodule.client", "testmodule.shared"],
      namespace_packages=["testmodule"])
I am trying to build wheel files for both packages. However, when I do:
python -m pip wheel .
It only ever builds the package for one of the definitions.
Why does only one package get built?
You cannot call setuptools.setup() more than once in your setup.py, even if you want to create several packages out of one codebase.
Instead you need to separate everything out into separate namespace packages, and have one setup.py for each (they all can reside in one Git repository!):
testmodule/
    testmodule-client/
        setup.py
        testmodule/
            client/
                __init__.py
    testmodule-server/
        setup.py
        testmodule/
            server/
                __init__.py
    testmodule-shared/
        setup.py
        testmodule/
            shared/
                __init__.py
And each setup.py contains something along the lines of
from setuptools import setup

setup(
    name='testmodule-client',
    packages=['testmodule.client'],
    install_requires=['testmodule-shared'],
    ...
)
and
from setuptools import setup

setup(
    name='testmodule-server',
    packages=['testmodule.server'],
    install_requires=['testmodule-shared'],
    ...
)
and
from setuptools import setup

setup(
    name='testmodule-shared',
    packages=['testmodule.shared'],
    ...
)
To build all three wheels you then run
pip wheel testmodule-client
pip wheel testmodule-server
pip wheel testmodule-shared
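If typing the three commands gets tedious, they can be scripted; a tiny sketch, assuming it runs from the repository root (the wheelhouse output directory is an arbitrary choice):

```python
# Build a wheel for each sub-package directory into ./wheelhouse.
import subprocess
import sys

PACKAGES = ("testmodule-client", "testmodule-server", "testmodule-shared")


def build_wheels(packages=PACKAGES, out_dir="wheelhouse"):
    for pkg in packages:
        # Equivalent to: pip wheel --no-deps -w wheelhouse <pkg>
        subprocess.run(
            [sys.executable, "-m", "pip", "wheel", "--no-deps", "-w", out_dir, pkg],
            check=True,
        )
```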

Marking Cython as a Build Dependency?

There is a Python package with a setup.py that reads thusly:
from distutils.core import setup
from distutils.extension import Extension
from Cython.Distutils import build_ext
setup(
    name='fastahack',
    ext_modules=[
        Extension("fastahack.cfastahack",
                  sources=["fastahack/cfastahack.pyx", "lib/Fasta.cpp", "lib/split.cpp"],
                  libraries=["stdc++"],
                  include_dirs=["lib/"],
                  language="c++"),
    ],
    package_data={'lib': ['*.pyx', "*.c", "*.h", "README.rst"]},
    package_dir={"fastahack": "fastahack"},
    cmdclass={'build_ext': build_ext},
    packages=['fastahack', 'fastahack.tests'],
    author="Brent Pedersen",
    author_email="bpederse#gmail.com",
    # test_suite='nose.collector'
)
This setup.py can't be imported if Cython is not installed. As far as I know, importing setup.py is how tools like pip figure out the dependencies of a package. I want to set up this package so that it could be uploaded to PyPI, with the fact that it depends on Cython noted, so that Cython will be downloaded and installed when you try to "pip install fastahack", or when you try to "pip install" directly from the Git repository.
How would I package this module so that it installs correctly from the Internet when Cython is not installed? Always using the latest version of Cython would be a plus.
You can specify Cython as a build dependency using PEP-518 project specification.
In the file pyproject.toml (in the same directory as setup.py) insert:
[build-system]
requires = ["setuptools", "wheel", "Cython"]
Cython will then be installed before building your package.
Note that (currently) you need to pass --no-use-pep517 to pip install if you are installing your package locally as editable (i.e. with --editable or -e); setuptools v64 adds support for editable installs with pyproject.toml builds.
My standard template for setup.py:
have_cython = False
try:
    from Cython.Distutils import build_ext as _build_ext
    have_cython = True
except ImportError:
    from distutils.command.build_ext import build_ext as _build_ext

if have_cython:
    foo = Extension('foo', ['src/foo.pyx'])
else:
    foo = Extension('foo', ['src/foo.c'])

setup(
    ...
    ext_modules=[foo],
    cmdclass={'build_ext': _build_ext},
)
And don't forget to ship the generated .c files with the package; that allows users to build the module without installing Cython.
Use a try and except for the Cython import and modify your setup based on whether or not your import succeeds. Look at the setup.py of Pandas for an example
