reading "--plat-name" argument in setup.py - python

The setuptools bdist_wheel/bdist_egg commands have a --plat-name argument that allows for overriding the host platform name. This value gets tacked onto the name of the resulting file, e.g., mypackage-1.2.3-py2.py3-none-manylinux1_x86_64.whl.
How can I read this value in setup.py? Note I'm not asking for the host platform the script is running on, e.g., platform.system(). I want the platform name that setuptools is using.

In bdist_egg (and only there; bdist_wheel just runs bdist_egg), the --plat-name argument is stored in self.plat_name. So you can override bdist_egg with a custom class and read self.plat_name:
from setuptools.command.bdist_egg import bdist_egg as _bdist_egg
from setuptools import setup

class bdist_egg(_bdist_egg):
    def run(self):
        # Use self.plat_name before building an egg…
        _bdist_egg.run(self)
        # …or after

setup(
    …
    cmdclass={
        'bdist_egg': bdist_egg,
    },
    …
)
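With the override in place, whatever is passed on the command line is what you will see in self.plat_name inside run(); for example (a hypothetical invocation):
$ python setup.py bdist_egg --plat-name manylinux1_x86_64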

Related

Change output directory in setup.py

I'm using setup from setuptools to create a setup.py, and I was wondering if it's possible to change the output directory programmatically from the default dist/.
I'm aware that you can do this from the command line using the --dist-dir flag, but I want to do it from within the setup.py file instead.
Anyone have any ideas?
You need to override the code that sets the default directory:
from setuptools import setup
from distutils.command.bdist import bdist as _bdist
from distutils.command.sdist import sdist as _sdist

dist_dir = 'my-dist-dir'

class bdist(_bdist):
    def finalize_options(self):
        _bdist.finalize_options(self)
        self.dist_dir = dist_dir

class sdist(_sdist):
    def finalize_options(self):
        _sdist.finalize_options(self)
        self.dist_dir = dist_dir

setup(
    cmdclass={
        'bdist': bdist,
        'sdist': sdist,
    },
    …
)
Other bdist_* commands copy the value from bdist.
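For example, with the overrides above both of these runs should write into my-dist-dir/ instead of dist/ (hypothetical runs; bdist_wheel picks its dist_dir up from bdist, per the preceding note):
$ python setup.py sdist
$ python setup.py bdist_wheel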

Get Gitlab's Continuous Integration to compile a Python extension written in C

Context
I have a Python project for which I wrap some C/C++ code (using the excellent PyBind library). I have a set of C and Python unit tests and I've configured Gitlab's CI to run them at each push.
The C tests use a minimalist unit test framework called minunit and I use Python's unittest suite.
Before running the C tests, all the C code is compiled and then tested. I'd like to also compile the C/C++ wrapper for Python before running the Python tests, but I'm having a hard time doing it.
Question in a few words
Is there a standard/good way to get Gitlab-CI to build a Python extension using setuptools before running unit-tests?
Question with more words / Description of what I tried
To compile the C/C++ wrapper locally, I use setuptools with a setup.py file including a build_ext command.
I locally compile everything with python setup.py build_ext --inplace (the last arg --inplace will just copy the compiled file to the current directory).
As far as I know, this is quite standard.
What I tried on Gitlab is to have a Python script (code below) that runs a few commands using os.system (which appears to be bad practice...).
The first command runs a script that builds and runs all the C tests. This works, but I'm happy to take recommendations (should I configure Gitlab CI to run the C tests separately?).
Now, the problem comes when I try to build the C/C++ wrapper, with os.system("cd python/ \npython setup.py build_ext --inplace"). This generates the error
File "setup.py", line 1, in <module>
from setuptools import setup, Extension
ImportError: No module named setuptools
So I tried to modify my gitlab's CI configuration file to install python-dev. My .gitlab-ci.yml looks like
test:
  script:
    - apt-get install -y python-dev
    - python run_tests.py
But, not being sudo on the gitlab's server, I get the following error E: Could not open lock file /var/lib/dpkg/lock - open (13: Permission denied).
Anyone knows a way around that, or a better way to tackle this problem?
Any help would be more than welcome!
run_tests.py file
import unittest
import os
from shutil import copyfile
import glob

class AllTests(unittest.TestCase):
    def test_all(self):
        # this automatically loads all tests in current dir
        testsuite = unittest.TestLoader().discover('tests/Python_tests')
        # run tests
        result = unittest.TextTestRunner(verbosity=2).run(testsuite)
        # send/print results
        self.assertEqual(result.failures, [], 'Failure')

if __name__ == "__main__":
    # run C tests
    print(' ------------------------------------------------------ C TESTS')
    os.system("cd tests/C_tests/ \nbash run_all.sh")
    # now python tests
    print(' ------------------------------------------------- PYTHON TESTS')
    # first build the shared library compiled from C++ and copy it into the python test directory
    # build lib
    os.system("cd python/ \npython setup.py build_ext --inplace")
    # copy lib to the right place
    dest_dir = 'tests/Python_tests/'
    for file in glob.glob(r'python/*.so'):
        print('Copying file to test dir : ', file)
        copyfile(file, dest_dir + file.replace('python/', ''))
    # run Python tests
    unittest.main(verbosity=0)
My suggestion would be to move the entire test-running logic into the setup script.
using the test command
First of all, setuptools ships a test command, so you can run the tests via python setup.py test. Even better, test calls the build_ext command under the hood and places the built extensions so that they are accessible in the tests, so there is no need to invoke python setup.py build_ext explicitly:
$ python setup.py test
running test
running egg_info
creating so.egg-info
writing so.egg-info/PKG-INFO
writing dependency_links to so.egg-info/dependency_links.txt
writing top-level names to so.egg-info/top_level.txt
writing manifest file 'so.egg-info/SOURCES.txt'
reading manifest file 'so.egg-info/SOURCES.txt'
writing manifest file 'so.egg-info/SOURCES.txt'
running build_ext
building 'wrap_fib' extension
creating build
creating build/temp.linux-aarch64-3.6
aarch64-unknown-linux-gnu-gcc -pthread -fPIC -I/data/gentoo64/usr/include/python3.6m -c wrap_fib.c -o build/temp.linux-aarch64-3.6/wrap_fib.o
aarch64-unknown-linux-gnu-gcc -pthread -fPIC -I/data/gentoo64/usr/include/python3.6m -c cfib.c -o build/temp.linux-aarch64-3.6/cfib.o
creating build/lib.linux-aarch64-3.6
aarch64-unknown-linux-gnu-gcc -pthread -shared -Wl,-O1 -Wl,--as-needed -L. build/temp.linux-aarch64-3.6/wrap_fib.o build/temp.linux-aarch64-3.6/cfib.o -L/data/gentoo64/usr/lib64 -lpython3.6m -o build/lib.linux-aarch64-3.6/wrap_fib.cpython-36m-aarch64-linux-gnu.so
copying build/lib.linux-aarch64-3.6/wrap_fib.cpython-36m-aarch64-linux-gnu.so ->
test_fib_0 (test_fib.FibonacciTests) ... ok
test_fib_1 (test_fib.FibonacciTests) ... ok
test_fib_10 (test_fib.FibonacciTests) ... ok
----------------------------------------------------------------------
Ran 3 tests in 0.002s
OK
(I used the code from the Cython Book examples repository to play with, but the output should be pretty similar to what PyBind produces).
using the extra keywords
Another feature that may come in handy is the set of extra keywords setuptools adds: test_suite, tests_require, test_loader (docs). Here's an example of embedding a custom test suite like the one in your run_tests.py:
# setup.py
import unittest

from Cython.Build import cythonize
from setuptools import setup, Extension

exts = cythonize([Extension("wrap_fib", sources=["cfib.c", "wrap_fib.pyx"])])

def pysuite():
    return unittest.TestLoader().discover('tests/python_tests')

if __name__ == '__main__':
    setup(
        name='so',
        version='0.1',
        ext_modules=exts,
        test_suite='setup.pysuite'
    )
extending the test command
The last requirement is running the C tests. We can embed them by overriding the test command and invoking some custom code from there. The advantage of that is that distutils offers a command API with many useful functions, like copying files or executing external commands:
# setup.py
import os
import unittest

from Cython.Build import cythonize
from setuptools import setup, Extension
from setuptools.command.test import test as test_orig

exts = cythonize([Extension("wrap_fib", sources=["cfib.c", "wrap_fib.pyx"])])

class test(test_orig):
    def run(self):
        # run python tests
        super().run()
        # run c tests
        self.announce('Running C tests ...')
        pwd = os.getcwd()
        os.chdir('tests/C_tests')
        self.spawn(['bash', 'run_all.sh'])
        os.chdir(pwd)

def pysuite():
    return unittest.TestLoader().discover('tests/python_tests')

if __name__ == '__main__':
    setup(
        name='so',
        version='0.1',
        ext_modules=exts,
        test_suite='setup.pysuite',
        cmdclass={'test': test}
    )
I extended the original test command to run some extra stuff after the python unit tests finish (note the call to an external command via self.spawn). All that is left is replacing the default test command with the custom one by passing cmdclass in the setup function.
Now you have everything collected in the setup script, and python setup.py test will do all the dirty work.
But, not being sudo on the gitlab's server, I get the following error
I don't have any experience with Gitlab CI, but I can't imagine there being no way to install packages on the build server. Maybe this question will be helpful: How to use sudo in build script for gitlab ci?
If there really is no other option, you can bootstrap a local copy of setuptools with ez_setup.py. Note, however, that although this method still works, it was deprecated recently.
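The usual bootstrap pattern (a sketch; it assumes ez_setup.py is shipped next to setup.py) falls back to ez_setup only when setuptools is missing:
try:
    from setuptools import setup
except ImportError:
    # downloads and installs a local setuptools, then retries the import
    from ez_setup import use_setuptools
    use_setuptools()
    from setuptools import setup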
Also, if you happen to use a recent version of Python (3.4 and newer), then you should have pip bundled with the Python distribution, so it should be possible to install setuptools without root permissions with
$ python -m pip install --user setuptools
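Putting that together for Gitlab CI, a minimal .gitlab-ci.yml along these lines avoids apt-get and root entirely (a sketch; the python:2.7 Docker image name is an assumption):
test:
  image: python:2.7
  script:
    - python -m pip install --user setuptools
    - python run_tests.py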

setup_requires only for some commands

I have a distutils-style Python package which requires a specific, and quite large, dependency for its build step. Currently, this dependency is specified under the setup_requires argument to distutils.setup. Unfortunately, this means the dependency will be built for any execution of setup.py, including when running setup.py clean. This creates the rather ironic situation of the clean step sometimes causing a large amount of code to be compiled.
As I said, this setup dependency is only required for the build step. Is there a way to encode this logic in setup.py so that all commands that do not invoke the build command are run without it?
You can always order the Distribution to fetch some packages explicitly, the same way they would be fetched if you defined them in setup_requires. Example with a numpy dependency required for the build command only:
from distutils.command.build import build as build_orig
from setuptools import setup, find_packages

class build(build_orig):
    def run(self):
        self.distribution.fetch_build_eggs(['numpy'])
        # numpy becomes available after this line. Test it:
        import numpy
        print(numpy.__version__)
        super().run()

setup(
    name='spam',
    packages=find_packages(),
    cmdclass={'build': build},
    ...
)
The dependencies are passed the same way they would be defined in the setup_requires arg, so version specs are also ok:
self.distribution.fetch_build_eggs(['numpy>=1.13'])
Although I must note that fetching dependencies via setup_requires is usually much slower than installing them via pip (especially when you have some heavy dependencies that must be built from source first), so if you can be sure you will have pip available (or use Python 3.4 and newer), the approach suggested by phd in his answer will save you time. Fetching eggs via the distribution may, however, come in handy when building for old Python versions or obscure Python installations like the system Python on MacOS.
For reference, that approach checks the requested command in sys.argv and only passes setup_requires when building:
import sys

if 'build' in sys.argv:
    kw = {'setup_requires': [req1, req2, …]}
else:
    kw = {}

setup(
    …,
    **kw
)
Another approach to try is to override the build command with a custom cmdclass:
import subprocess
from distutils.command.build import build as _build

class build(_build):
    def run(self):
        subprocess.call(["pip", "install", req1, req2, …])
        _build.run(self)

setup(
    …,
    cmdclass={'build': build},
)
and avoid setup_requires altogether.

setuptools and the bdist_wheel command

I am working on a python2 package in which the setup.py contains some custom install commands. These commands actually build some Rust code and output some .dylib files that are moved into the python package.
An important point is that the Rust code is outside the python package.
setuptools is supposed to detect automatically if the python package is pure python or platform specific (if it contains some C extensions for instance).
In my case, when I run python setup.py bdist_wheel, the generated wheel is tagged as a pure python wheel: <package_name>-<version>-py2-none-any.whl.
This is problematic because I need to run this code on different platforms, and thus I need to generate one wheel per platform.
Is there a way, when building a wheel, to force the build to be platform-specific?
Here's the code that I usually look at from uwsgi
The basic approach is:
setup.py
# ...
try:
    from wheel.bdist_wheel import bdist_wheel as _bdist_wheel

    class bdist_wheel(_bdist_wheel):
        def finalize_options(self):
            _bdist_wheel.finalize_options(self)
            self.root_is_pure = False
except ImportError:
    bdist_wheel = None

setup(
    # ...
    cmdclass={'bdist_wheel': bdist_wheel},
)
The root_is_pure bit tells the wheel machinery to build a non-purelib wheel instead of a pure (pyX-none-any) one. You can also get fancier by saying there are binary platform-specific components but no CPython ABI-specific components, as sketched below.
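A minimal sketch of that fancier variant, assuming wheel's bdist_wheel.get_tag() hook (the py3 tag choice here is an assumption): keep the platform tag but drop the interpreter and ABI specificity.
from wheel.bdist_wheel import bdist_wheel as _bdist_wheel

class bdist_wheel(_bdist_wheel):
    def finalize_options(self):
        _bdist_wheel.finalize_options(self)
        self.root_is_pure = False

    def get_tag(self):
        # keep the platform tag computed by wheel, but advertise the
        # wheel as usable on any Python 3 and any ABI
        # (e.g. py3-none-linux_x86_64)
        python, abi, plat = _bdist_wheel.get_tag(self)
        return 'py3', 'none', plat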
The modules setuptools, distutils and wheel decide whether a python distribution is pure by checking if it has ext_modules.
If you build an external module on your own, you can still list it in ext_modules so that the building tools know it exists. The trick is to provide an empty list of sources so that setuptools and distutils will not try to build it. For example,
setup(
    ...,
    ext_modules=[
        setuptools.Extension(
            name='your.external.module',
            sources=[]
        )
    ]
)
This solution worked better for me than patching the bdist_wheel command. The reason is that bdist_wheel calls the install command internally, and that command checks again for the existence of ext_modules to decide between a purelib or platlib install. If you don't list the external module, you end up with the lib installed in a purelib subfolder inside the wheel. That causes problems when using auditwheel repair, which complains about the extensions being installed in a purelib folder.
You can also specify/spoof a specific platform name when building wheels by specifying a --plat-name:
python setup.py bdist_wheel --plat-name=manylinux1_x86_64
Neither the root_is_pure trick nor the empty ext_modules trick worked for me, but after MUCH searching myself, I finally found a working solution in 'pip setup.py bdist_wheel' no longer builds forced non-pure wheels
Basically, you override the 'has_ext_modules' function in the Distribution class, and set distclass to point to the overriding class. At that point, setup.py will believe you have a binary distribution, and will create a wheel with the specific version of python, the ABI, and the current architecture. As suggested by https://stackoverflow.com/users/5316090/py-j:
from setuptools import setup
from setuptools.dist import Distribution

DISTNAME = "packagename"
DESCRIPTION = ""
MAINTAINER = ""
MAINTAINER_EMAIL = ""
URL = ""
LICENSE = ""
DOWNLOAD_URL = ""
VERSION = '1.2'
PYTHON_VERSION = (2, 7)

# Tested with wheel v0.29.0
class BinaryDistribution(Distribution):
    """Distribution which always forces a binary package with platform name"""
    def has_ext_modules(self):
        return True

setup(name=DISTNAME,
      description=DESCRIPTION,
      maintainer=MAINTAINER,
      maintainer_email=MAINTAINER_EMAIL,
      url=URL,
      license=LICENSE,
      download_url=DOWNLOAD_URL,
      version=VERSION,
      packages=["packagename"],
      # Include pre-compiled extension
      package_data={"packagename": ["_precompiled_extension.pyd"]},
      distclass=BinaryDistribution)
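With the distclass in place, bdist_wheel should emit a platform-tagged wheel rather than a pure one; the exact interpreter, ABI, and platform tags depend on the build environment:
$ python setup.py bdist_wheel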
I find Anthony Sottile's answer great, but it didn't work for me.
My case is that I have to force the wheel to be created for x86_64 but for any Python 3, and making the root impure actually caused my wheel to be tagged py36-cp36 :(
A better way, IMO, in general is just to use sys.argv:
from setuptools import setup
import sys
sys.argv.extend(['--plat-name', 'x86_64'])
setup(name='example-wheel')

Running custom setuptools build during install

I've tried to implement Compass compilation during setuptools' build, but the following code runs the compilation only during an explicit build command and doesn't run during install.
#!/usr/bin/env python
import os
import setuptools
from distutils.command.build import build

SETUP_DIR = os.path.dirname(os.path.abspath(__file__))

class BuildCSS(setuptools.Command):
    description = 'build CSS from SCSS'
    user_options = []

    def initialize_options(self):
        pass

    def run(self):
        os.chdir(os.path.join(SETUP_DIR, 'django_project_dir', 'compass_project_dir'))
        import platform
        if 'Windows' == platform.system():
            command = 'compass.bat compile'
        else:
            command = 'compass compile'
        import subprocess
        try:
            subprocess.check_call(command.split())
        except (subprocess.CalledProcessError, OSError):
            print 'ERROR: problems with compiling Sass. Is Compass installed?'
            raise SystemExit
        os.chdir(SETUP_DIR)

    def finalize_options(self):
        pass

class Build(build):
    sub_commands = build.sub_commands + [('build_css', None)]

setuptools.setup(
    # Custom attrs here.
    cmdclass={
        'build': Build,
        'build_css': BuildCSS,
    },
)
Any custom instructions in Build.run (e.g. some printing) don't apply during install either, even though the dist instance's commands attribute contains my build command implementation instances. Incredible! I think the trouble lies in the complex relations between setuptools and distutils. Does anybody know how to make a custom build run during install on Python 2.7?
Update: Found that install definitely doesn't call the build command, but it calls bdist_egg, which runs build_ext. Seems like I should implement a "Compass" build extension.
Unfortunately, I haven't found the answer. It seems the ability to run post-install scripts correctly exists only in Distutils 2. For now you can use this work-around:
Update: Because of setuptools' stack checks, we should override install.do_egg_install, not the run method:
from setuptools.command.install import install

class Install(install):
    def do_egg_install(self):
        self.run_command('build_css')
        install.do_egg_install(self)
Update 2: easy_install runs exactly the bdist_egg command, which is used by install too, so the most correct way (especially if you want easy_install to work) is to override the bdist_egg command. Whole code:
#!/usr/bin/env python
import setuptools
from distutils.command.build import build as _build
from setuptools.command.bdist_egg import bdist_egg as _bdist_egg

class bdist_egg(_bdist_egg):
    def run(self):
        self.run_command('build_css')
        _bdist_egg.run(self)

class build_css(setuptools.Command):
    description = 'build CSS from SCSS'
    user_options = []

    def initialize_options(self):
        pass

    def finalize_options(self):
        pass

    def run(self):
        pass  # Here goes CSS compilation.

class build(_build):
    sub_commands = _build.sub_commands + [('build_css', None)]

setuptools.setup(
    # Here your setup args.
    cmdclass={
        'bdist_egg': bdist_egg,
        'build': build,
        'build_css': build_css,
    },
)
You may see how I've used this here.
