Python build CFFI in API mode with Setuptools

I'm trying to learn about creating CFFI modules and packaging them with setuptools. When I run the build script build_foo.py I get an API-mode library, but when I pip install . I get an ABI-mode library.
Creates API mode
$> python build_foo.py
$> ls
build_foo.py _one_cffi.cpython-36m-x86_64-linux-gnu.so
_one_cffi.c _one_cffi.o
Creates ABI mode
$> pip install .
$> ls env/bin/site-packages
cffi pkg_resources
cffi-1.11.5.dist-info pkg_resources-0.0.0.dist-info
_cffi_backend.cpython-36m-x86_64-linux-gnu.so __pycache__
easy_install.py pycparser
Foo-0.1.dist-info pycparser-2.19.dist-info
foopkg setuptools
_one_cffi.abi3.so setuptools-40.6.2.dist-info
pip wheel
pip-18.1.dist-info wheel-0.32.3.dist-info
Files
build_foo.py
#!/usr/bin/env python3
import cffi

ffi = cffi.FFI()
ffi.cdef("int get_one();")
ffi.set_source("_one_cffi",
    """
    int get_one() {
        return 1;
    }
    """
)

if __name__ == '__main__':
    ffi.compile(verbose=True)
setup.py
from setuptools import setup

setup(
    name = 'Foo',
    version = '0.1',
    packages = ['foopkg'],
    cffi_modules=["foopkg/build_foo.py:ffi"],
    install_requires = ['cffi']
)

I think the abi3.so filename suffix relates to Python's PEP 425 ABI tag (here, the stable ABI), not to the ABI-mode option for how CFFI-generated bindings are used. As far as I can tell, the cffi_modules install is simply not adding platform-specific information to the shared object's filename, but I think Python code that calls foopkg can still access get_one() in API mode.
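One way to verify this (a minimal sketch; the ffi and lib names are the attributes that ffi.set_source() generates on the compiled module) is to import the installed extension and call the C function directly:

from _one_cffi import ffi, lib

# In API mode get_one() is compiled into the extension and exposed on
# lib; an ABI-mode setup would instead have to load it via ffi.dlopen().
assert lib.get_one() == 1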

Related

pip install -e . vs setup.py

I have been locally editing (inside a conda env) the package GSTools, cloned from the GitHub repo https://github.com/GeoStat-Framework/GSTools, to adapt it to my own purposes. The package is C++ wrapped in Python (Cython).
I've thus far used pip install -e . in the main package dir for my local changes. But I now want to use their OpenMP support by setting the env variable export GSTOOLS_BUILD_PARALLEL=1. Then, doing pip install -e ., I get among other things in the terminal:
Installing collected packages: gstools
Running setup.py develop for gstools
Successfully installed gstools-1.3.6.dev37
The issue is that nothing actually changed, because setup.py (shown below) is supposed to print "OpenMP=True" if the env variable GSTOOLS_BUILD_PARALLEL is set to 1 in the Linux terminal, and print something else if it is not.
Here is setup.py:
# -*- coding: utf-8 -*-
"""GSTools: A geostatistical toolbox."""
import os

import numpy as np
from Cython.Build import cythonize
from extension_helpers import add_openmp_flags_if_available
from setuptools import Extension, setup

# cython extensions
CY_MODULES = [
    Extension(
        name=f"gstools.{ext}",
        sources=[os.path.join("src", "gstools", *ext.split(".")) + ".pyx"],
        include_dirs=[np.get_include()],
        define_macros=[("NPY_NO_DEPRECATED_API", "NPY_1_7_API_VERSION")],
    )
    for ext in ["field.summator", "variogram.estimator", "krige.krigesum"]
]
# you can set GSTOOLS_BUILD_PARALLEL=0 or GSTOOLS_BUILD_PARALLEL=1
if int(os.getenv("GSTOOLS_BUILD_PARALLEL", "0")):
    added = [add_openmp_flags_if_available(mod) for mod in CY_MODULES]
    print(f"## GSTools setup: OpenMP used: {any(added)}")
else:
    print("## GSTools setup: OpenMP not wanted by the user.")

# setup - do not include package data to ignore .pyx files in wheels
setup(ext_modules=cythonize(CY_MODULES), include_package_data=False)
I tried instead just python setup.py install but that gives
UNKNOWN 0.0.0 is already the active version in easy-install.pth
Installed /global/u1/b/benabou/.conda/envs/healpy_conda_gstools_dev/lib/python3.8/site-packages/UNKNOWN-0.0.0-py3.8-linux-x86_64.egg
Processing dependencies for UNKNOWN==0.0.0
Finished processing dependencies for UNKNOWN==0.0.0
and import gstools no longer works correctly.
So how can I install my edited version of the package with OpenMP support?
Developer of GSTools here.
I guess you don't see the printed message because pip is suppressing output during the setup. So you could try making pip verbose with:
GSTOOLS_BUILD_PARALLEL=1 pip install -v -e .
BTW, we are always interested in enhancements, so maybe you are willing to share your edits on GSTools? :-)
Cheers,
Sebastian
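For reference, the flag can also be checked with pip out of the way entirely, by running the build step directly; the printed line below comes from the setup.py shown above (assuming a working compiler with OpenMP support):

GSTOOLS_BUILD_PARALLEL=1 python setup.py build_ext --inplace
## GSTools setup: OpenMP used: True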

How to create a full wheel with abi tag?

Trying to create a wheel from an empty project, using this setup.py:
setup.py
from setuptools import setup
setup(name='bla', version='1')
I invoke with python setup.py bdist_wheel --python-tag py35 --plat-name linux_x86_64 and get
bla-1-py35-none-linux_x86_64.whl
My machine stats
python -V: Python 3.6.9
uname -p: x86_64
How to enforce abi? (make it bla-1-py35-cp35-linux_x86_64.whl)
How to decide between py35 and cp35 in my python-tag?
After MUCH searching myself, I finally found a working solution in 'pip setup.py bdist_wheel' no longer builds forced non-pure wheels.
Basically, if setup.py believes you have a binary distribution, it will create a wheel with the specific version of Python, the ABI, and the current architecture. You can do that by overriding the has_ext_modules function in the Distribution class, as suggested by https://stackoverflow.com/users/5316090/py-j:
from setuptools import setup
from setuptools.dist import Distribution

DISTNAME = "packagename"
DESCRIPTION = ""
MAINTAINER = ""
MAINTAINER_EMAIL = ""
URL = ""
LICENSE = ""
DOWNLOAD_URL = ""
VERSION = '1.2'
PYTHON_VERSION = (2, 7)

# Tested with wheel v0.29.0
class BinaryDistribution(Distribution):
    """Distribution which always forces a binary package with platform name"""
    def has_ext_modules(foo):
        return True

setup(name=DISTNAME,
      description=DESCRIPTION,
      maintainer=MAINTAINER,
      maintainer_email=MAINTAINER_EMAIL,
      url=URL,
      license=LICENSE,
      download_url=DOWNLOAD_URL,
      version=VERSION,
      packages=["packagename"],
      # Include pre-compiled extension
      package_data={"packagename": ["_precompiled_extension.pyd"]},
      distclass=BinaryDistribution)
Then you run the setup.py file from whatever Python version/architecture you need, and it will create a platform-specific wheel for each.
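For example (a sketch; the exact tags depend on the interpreter, platform, and wheel version), running the same setup.py under two interpreters produces two differently tagged wheels:

$ python3.5 setup.py bdist_wheel
$ python3.6 setup.py bdist_wheel
$ ls dist
packagename-1.2-cp35-cp35m-linux_x86_64.whl
packagename-1.2-cp36-cp36m-linux_x86_64.whl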
The ABI tag depends on your Python version and is added to your wheel file name automatically; the plain command python setup.py bdist_wheel is enough to build the wheel file.
To create wheel packages with different ABI tags, a simple way is to run different Python versions in different Docker containers.
The pattern of the wheel packages I published to pypi.org (package-cp<python version>-cp<python version>m-manylinux1_x86_64.whl) is a little bit different from yours. You can't add cp35 to a package built using Python 3.6.
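A sketch of that Docker approach, using the official python images (the mounted path and commands are illustrative):

$ docker run --rm -v "$PWD":/io python:3.5 sh -c "cd /io && pip install wheel && python setup.py bdist_wheel"
$ docker run --rm -v "$PWD":/io python:3.6 sh -c "cd /io && pip install wheel && python setup.py bdist_wheel"

Each container tags the wheel with its own interpreter and ABI (cp35-cp35m, cp36-cp36m, ...).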

How to install data_files to absolute path?

I use pip with setuptools to install a package.
I want pip to copy some resource files to, say, /etc/my_package.
My setup.py looks like this:
setup(
    ...
    data_files=[('/etc/my_package', ['config.yml'])]
)
When running pip install, the file ends up in
~/.local/lib/python3.5/site-packages/etc/my_package/config.yml
instead of /etc/my_package.
What am I doing wrong?
(pip version 9.0.1)
Short answer: use pip install --no-binary :all: to install your package.
I struggled with this for a while and eventually figured out that there is some weirdness/inconsistency in how data_files are handled between binary wheels and source distributions. Specifically, there is a bug with wheels that makes all paths in data_files relative to the install location (see https://github.com/pypa/wheel/issues/92 for an issue tracking this).
"Thats fine", you might say, "but I'm not using a wheel!". Not so fast! It turns out recent versions of pip (I am working with 9.0.1) will try to compile a wheel even from a source distribution. For example, if you have a package my_package you can see this doing something like
$ python setup.py sdist # create source tarball as dist/my_package.tar.gz
[...]
$ pip install dist/my_package.tar.gz # install the generated source
[...]
Building wheels for collected packages: my_package
Running setup.py bdist_wheel for my_package ... done
pip tries to be helpful and builds a wheel to install from and cache for later. This means you will run into the above bug even though in theory you are not using bdist_wheel yourself. You can get around this by running python setup.py install directly from the package source folder. This avoids the building and caching of built wheels that pip will try to do, but is majorly inconvenient when the package you want is already on PyPI somewhere. Fortunately pip offers an option to explicitly disable binaries.
$ pip install --no-binary :all: my_package
[...]
Skipping bdist_wheel for my_package, due to binaries being disabled for it.
Installing collected packages: my_package
Running setup.py install for my_package ... done
Successfully installed my_package-0.1.0
Using the --no-binary option prevents wheel building and lets us reference absolute paths in our data_files again. For the case where you are installing several packages together and want to selectively disable wheels, you can replace :all: with a comma-separated list of package names.
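For instance, with illustrative package names:

$ pip install --no-binary my_package,other_package my_package other_package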
It seems that data_files does not support absolute paths; it will prepend sys.prefix to "/etc/my_package". If you instead want to put config.yml under .../site-packages/my_package, please try:

import os
import sys
from distutils.sysconfig import get_python_lib

relative_site_packages = get_python_lib().split(sys.prefix + os.sep)[1]
data_files_relative_path = os.path.join(relative_site_packages, "my_package")

setup(
    ...
    data_files=[(data_files_relative_path, ['config.yml'])]
)
I ended up writing an init() function that installs the config file on first run instead of creating it during the installation:

import os
from os import path
from shutil import copyfile

import pkg_resources

# config_dir and config_file are module-level constants defined elsewhere.
def init():
    try:
        if not path.isdir(config_dir):
            os.mkdir(config_dir)
        copyfile(pkg_resources.resource_filename(
            __name__, "default_config.yml"), config_file)
        print("INFO: config file created.")
    except IOError as ex:
        print("ERROR: could not create config directory: " + str(ex))

if __name__ == "__main__":
    init()
    main()
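Note that for pkg_resources.resource_filename to find default_config.yml at runtime, the file has to be shipped inside the package, e.g. via package_data={"my_package": ["default_config.yml"]} in setup.py (package name illustrative).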

'pip setup.py bdist_wheel' no longer builds forced non-pure wheels

I have a project that compiles with C extensions on Linux, but without them on Windows. When I first generated the wheel files on Windows with python setup.py bdist_wheel, they became universal, and I could not upload them to PyPI, as these universal wheels are preferred by pip for installation over the .tar.gz uploads (the result of python setup.py sdist).
The trick around this was to specify in the setup.py:
Distribution.is_pure = lambda *args: False
or by subclassing Distribution:
class BinaryDistribution(Distribution):
    def is_pure(self):
        return False
and calling setup() in setup.py with the extra keyword argument distclass=BinaryDistribution.
This all worked fine on my VM running Windows XP 64 which has 32 and 64 bit versions of Python 2.6/2.7/3.3/3.4 and pypy installed just for this purpose. A simple batch file gives me:
dist/pkg-1.0-cp26-none-win32.whl
dist/pkg-1.0-cp26-none-win_amd64.whl
dist/pkg-1.0-cp27-none-win32.whl
dist/pkg-1.0-cp27-none-win_amd64.whl
dist/pkg-1.0-cp33-none-win32.whl
dist/pkg-1.0-cp33-none-win_amd64.whl
dist/pkg-1.0-cp34-none-win32.whl
dist/pkg-1.0-cp34-none-win_amd64.whl
and the appropriate package gets downloaded and installed by pip when you run pip on Windows. When you run pip on Linux you get the
pkg-1.0.tar.gz
which includes the C sources which are compiled during installation.
The problem started with the fact that I don't have a spare Windows 7 licensed machine where I can install Python 3.5 (it doesn't install on the EOL XP). So I investigated Appveyor and created appveyor.yml:
environment:
  matrix:
    - PYTHON: C:\Python27
    - PYTHON: C:\Python33
    - PYTHON: C:\Python34
    - PYTHON: C:\Python35
    - PYTHON: C:\Python27-x64
    - PYTHON: C:\Python33-x64
      DISTUTILS_USE_SDK: '1'
    - PYTHON: 'C:\Python34-x64'
      DISTUTILS_USE_SDK: '1'
    - PYTHON: 'C:\Python35-x64'

install:
  - |
    %PYTHON%\python.exe -m pip install --upgrade pip
    %PYTHON%\python.exe -m pip install wheel

build: off

test_script:
  - echo Skipped for now

after_test:
  - |
    %PYTHON%\python.exe setup.py bdist_wheel

artifacts:
  - path: dist\*
With the exact same source the result from the above eight calls to python setup.py bdist_wheel are:
pkg-1.0-py2-none-any.whl
pkg-1.0-py3-none-any.whl
And if you upload these to PyPI, Linux prefers them over the .tar.gz leading to non-inclusion of the C extension code.
What causes this, and how can I use Appveyor to build my .whl files (or at least the ones for Python 3.5)?
I've just run into this issue myself with Python v2.7 and wheel v0.29.0 on Windows 7 x64, where I build a Python package with some pre-compiled extensions (complicated VisualStudio setup with SWIG and external DLLs).
After examining the source code I have found that overriding Distribution.has_ext_modules works (automatically includes platform name and ABI tag):
from setuptools import setup
from setuptools.dist import Distribution

DISTNAME = "packagename"
DESCRIPTION = ""
MAINTAINER = ""
MAINTAINER_EMAIL = ""
URL = ""
LICENSE = ""
DOWNLOAD_URL = ""
VERSION = '1.2'
PYTHON_VERSION = (2, 7)

# Tested with wheel v0.29.0
class BinaryDistribution(Distribution):
    """Distribution which always forces a binary package with platform name"""
    def has_ext_modules(foo):
        return True

setup(name=DISTNAME,
      description=DESCRIPTION,
      maintainer=MAINTAINER,
      maintainer_email=MAINTAINER_EMAIL,
      url=URL,
      license=LICENSE,
      download_url=DOWNLOAD_URL,
      version=VERSION,
      packages=["packagename"],
      # Include pre-compiled extension
      package_data={"packagename": ["_precompiled_extension.pyd"]},
      distclass=BinaryDistribution)
The difference, of course, is in the environment: on the correctly working Win XP machine an older version of the wheel package is installed (0.24.0), whereas on Appveyor the latest and greatest (and broken) version 0.26 of wheel gets installed (0.25 is broken as well).
Changing the install stanza in the YAML file to fix the wheel version:

install:
  - |
    %PYTHON%\python.exe -m pip install --upgrade pip
    %PYTHON%\python.exe -m pip install wheel==0.24

is enough to get this to work quickly.
You should however upgrade the wheel package on your Linux box to version 0.28 and then use the new command-line option --plat-name:
python setup.py sdist
python2 setup.py bdist_wheel --plat-name win32
python2 setup.py bdist_wheel --plat-name win_amd64
python3 setup.py bdist_wheel --plat-name win32
python3 setup.py bdist_wheel --plat-name win_amd64
that will generate:
pkg-1.1.tar.gz
dist/pkg-1.1-py2-none-win32.whl
dist/pkg-1.1-py2-none-win_amd64.whl
dist/pkg-1.1-py3-none-win32.whl
dist/pkg-1.1-py3-none-win_amd64.whl
which you can upload to PyPI, resulting in the correct (.tar.gz) file downloading on Linux and the appropriate wheel on Windows. This works by simply making sure that setup() is called with ext_modules=None whenever --plat-name win... is specified. The resulting wheel files have minor differences (line endings in 3 files, and hence their SHA256SUMs), but install normally on Windows.
That way you no longer need to build these packages, which are essentially pure packages, on a Windows machine.
For me this change by Nate Coraor brings my total build time down from 15+ minutes to about 7 seconds.
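In setup.py terms, that conditional can look like the following minimal sketch (the pkg name and _speedups extension are hypothetical, and only the space-separated form --plat-name win... is detected):

import sys
from setuptools import setup, Extension

# Detect a forced Windows platform tag on the command line, e.g.
#   python setup.py bdist_wheel --plat-name win_amd64
forced_win = any(
    arg == '--plat-name' and sys.argv[i + 1].startswith('win')
    for i, arg in enumerate(sys.argv[:-1])
)

setup(
    name='pkg',
    version='1.1',
    packages=['pkg'],
    # Ship a pure wheel for cross-built Windows targets; compile natively otherwise.
    ext_modules=None if forced_win else [
        Extension('pkg._speedups', sources=['pkg/_speedups.c']),
    ],
)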
An alternative that seems to do the same as the accepted answer but more concisely is this:
from setuptools import setup

DISTNAME = "packagename"
DESCRIPTION = ""
MAINTAINER = ""
MAINTAINER_EMAIL = ""
URL = ""
LICENSE = ""
DOWNLOAD_URL = ""
VERSION = '1.2'
PYTHON_VERSION = (2, 7)

setup(name=DISTNAME,
      description=DESCRIPTION,
      maintainer=MAINTAINER,
      maintainer_email=MAINTAINER_EMAIL,
      url=URL,
      license=LICENSE,
      download_url=DOWNLOAD_URL,
      version=VERSION,
      packages=["packagename"],
      # Include pre-compiled extension
      package_data={"packagename": ["_precompiled_extension.pyd"]},
      has_ext_modules=lambda: True)
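This works because setup() forwards keyword arguments that match existing Distribution attributes onto the Distribution instance, so the lambda shadows the Distribution.has_ext_modules method without needing a subclass.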

Why does setuptools not understand git+https URLs?

According to the Dependencies section in the setuptools manual, git repository URLs can be specified in the dependency_links argument to setup with git+URL. Yet,
cd /tmp
mkdir py-test
cd py-test
touch __init__.py
and creation of a setup.py file with
from setuptools import setup, find_packages
from pkg_resources import parse_version

setup(
    name = "py-test",
    version = "1.0",
    packages = ["."],
    dependency_links = [
        "git+https://github.com/wxWidgets/wxPython.git"
    ],
    install_requires = ["wxPython"],
)
causes the error Download error on git+https://github.com/wxWidgets/wxPython.git: unknown url type: git+https -- Some packages may not be found! when I run python setup.py build && sudo python setup.py install.
The installation of the package python-setuptools-git doesn't help.
I'm using setuptools 18.2 with python 2.7 on Ubuntu 15.04.
From the setuptools docs:
In the case of a VCS checkout, you should also append #egg=project-version in order to identify for what package that checkout should be used
So the fix is just to append the #egg=wxPython fragment onto the end:
dependency_links = [
    "git+https://github.com/wxWidgets/wxPython.git#egg=wxPython"
]
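As an aside, newer tooling (pip 18.1+ and recent setuptools) supports PEP 508 direct references in install_requires, which makes dependency_links unnecessary; a sketch:

install_requires = [
    "wxPython @ git+https://github.com/wxWidgets/wxPython.git"
]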
