Integrating setup.py with Makefile to run tests - python

I used to run the tests for my packages from the Makefile as a way to perform three tasks in one: set up a virtual environment, install the requirements, and call the testing suite with the corresponding arguments:
test: venv
    env/bin/pip install -r test_requirements.txt
    env/bin/nosetests --rednose --with-coverage --cover-package my_module
Then I read that requirements.txt files are discouraged in favor of setup.py, so I modified the setup.py file aiming to get the same result:
setup(
    ...
    tests_require=['nose', 'rednose', 'coverage'],
    test_suite='nose.collector')
Now I could change the Makefile with
test: venv
    coverage run --source=my_module/ setup.py test
But that requires installing the testing dependencies before running the setup.py file. I'm also not sure how to include other arguments such as rednose. What is the best way to do this?

Tox is good and all, but here's how to do it without having to install any other package beforehand.
List the testing dependencies as setup_requires instead of tests_require in the setup.py file:
setup(
    setup_requires=['nose', 'rednose', 'coverage'],
    install_requires=[]  # fill in other arguments as usual
)
Optionally add test parameters to the setup.cfg file.
[nosetests]
rednose=1
detailed-errors=1
with-coverage=1
cover-package=server
cover-xml=1
verbosity=2
Run the tests with the following command
python setup.py nosetests
Source: http://nose.readthedocs.io/en/latest/api/commands.html

Related

Pass command line arguments to nose via "python setup.py test"

Package Settings
I have built a Python package which uses nose for testing. Therefore, setup.py contains:
..
test_suite='nose.collector',
tests_require=['nose'],
..
And python setup.py test works as expected:
running test
...
----------------------------------------------------------------------
Ran 3 tests in 0.065s
OK
Running with XUnit output
Since I'm using Jenkins CI, I would like to output the nose results to JUnit XML format:
nosetests <package-name> --with-xunit --verbose
However, python setup.py test is far more elegant, and it installs the test requirements without having to build a virtual environment.
Is there a way to pass the --with-xunit (or any other parameter) to nose, when calling nose via python setup.py test?
You can set nosetests options using setup.cfg.
For example, in your setup.cfg:
[nosetests]
with-xunit=1
Further information can be found at http://nose.readthedocs.io/en/latest/api/commands.html
Nose provides its own setuptools command (nosetests) which accepts command line arguments:
python setup.py nosetests --with-xunit
More information can be found here:
http://nose.readthedocs.io/en/latest/setuptools_integration.html

Tox can't copy non-python file while installing the module

This is the tree structure of the module I'm writing the setup.py file for:
ls .
LICENSE
README.md
bin
examples
module
scratch
setup.py
tests
tox.ini
I configured my setup.py as follows:
from setuptools import setup, find_packages

setup(
    name="package_name",
    version="0.1",
    packages=find_packages(),
    install_requires=[
        # [...]
    ],
    extras_require={
        # [...]
    },
    tests_require={
        'pytest',
        'doctest'
    },
    scripts=['bin/bootstrap'],
    data_files=[
        ('license', ['LICENSE']),
    ],
    # [...]
    # could also include long_description, download_url, classifiers, etc.
)
If I install the package from my python environment (also a virtualenv)
pip install .
the LICENSE file gets correctly installed.
But running tox:
[tox]
envlist = py27, py35

[testenv]
deps =
    pytest
    git+https://github.com/djc/couchdb-python
    docopt
commands = py.test \
    {posargs}
I get this error:
running install_data
creating build/bdist.macosx-10.11-x86_64/wheel/leafline-0.1.data
creating build/bdist.macosx-10.11-x86_64/wheel/leafline-0.1.data/data
creating build/bdist.macosx-10.11-x86_64/wheel/leafline-0.1.data/data/license
error: can't copy 'LICENSE': doesn't exist or not a regular file
Removing the data_files part from setup.py makes tox run correctly.
Your issue here is that setuptools is not able to find the LICENSE file among the files that are included when building the source distribution. You have two options to tell setuptools to include that file (both have been pointed out here):
Add a MANIFEST.in file (like https://github.com/pypa/sampleproject/)
Use include_package_data=True in your setup.py file.
Using MANIFEST.in is often simpler and easier to verify thanks to https://pypi.org/project/check-manifest/, which makes it possible to automatically check that things are indeed correct (if you use a VCS like Git or SVN).
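For this particular layout, a one-line MANIFEST.in next to setup.py containing include LICENSE is usually all that is needed to get the file into the sdist. The second option is sketched below, assuming the package layout from the question; note that include_package_data mainly covers data files that live inside your packages:
from setuptools import setup, find_packages

setup(
    name="package_name",
    version="0.1",
    packages=find_packages(),
    # Option 2: also ship package-internal data files that MANIFEST.in
    # (or a version-control plugin) knows about.
    include_package_data=True,
    data_files=[
        ('license', ['LICENSE']),
    ],
)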
pip install . builds a wheel using python setup.py bdist_wheel, which is then installed by simply unpacking it appropriately, as defined in the Wheel Specification: https://www.python.org/dev/peps/pep-0427/
tox builds a source distribution using python setup.py sdist, which is then unpacked and installed using python setup.py install.
That might be a reason for the difference in behavior for you.
I have some resource files inside my packages which I use during execution. To make setup store them in the package alongside the Python code, I use include_package_data=True and access them using importlib.resources. For Python versions older than 3.7 you can use the importlib_resources backport or another library.
Before each release I have a script which verifies that all the files I need are placed inside the built wheel, to be sure that everything is in place.
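A minimal sketch of that pattern (the package and file names here are placeholders, not from the original answer), assuming mypackage ships a data.json next to its modules and the file is included via include_package_data plus a MANIFEST.in entry:
from importlib import resources  # on Python < 3.7, use the importlib_resources backport

def load_bundled_config():
    # Read a data file that was packaged inside "mypackage".
    with resources.open_text("mypackage", "data.json") as fh:
        return fh.read()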

Where does the nose.collector look for tests?

I want to use nose.collector as a test suite for setuptools, as described here. My package's source lives in mypackage/src, and I have tests in mypackage/tests. I have a setup.py that looks like this:
import setuptools

setuptools.setup(
    name='mypackage',
    version='1.2.3',
    package_dir={'': 'src'},
    packages=setuptools.find_packages('src'),
    tests_require=['nose'],
    test_suite='nose.collector',
    provides=setuptools.find_packages('src'),
)
However, when I run python setup.py test, it doesn't test anything:
$ python setup.py test
running test
running egg_info
writing src/mypackage.egg-info/PKG-INFO
writing top-level names to src/mypackage.egg-info/top_level.txt
writing dependency_links to src/mypackage.egg-info/dependency_links.txt
reading manifest file 'src/mypackage.egg-info/SOURCES.txt'
writing manifest file 'src/mypackage.egg-info/SOURCES.txt'
running build_ext
----------------------------------------------------------------------
Ran 0 tests in 0.002s
OK
How can I tell nose where to look for tests? Up until now, I've been doing nosetests -d tests, which works fine. But I'd like to change to use setuptools so that I can follow the python setup.py test convention.
From the docs:
When running under setuptools, you can configure nose settings via the
environment variables detailed in the nosetests script usage message,
or the setup.cfg, ~/.noserc or ~/.nose.cfg config files.
http://nose.readthedocs.org/en/latest/setuptools_integration.html
Nose skips test files that are marked executable by default; fix them with chmod -x $(find tests/ -name '*.py')

How to install a dependency from a submodule in Python?

I have a Python project with the following structure (irrelevant source files omitted for simplicity):
myproject/
    mysubmodule/
        setup.py
    setup.py
The file myproject/setup.py uses distutils.core.setup to install the module myproject and the relevant sources. However, myproject requires mysubmodule to be installed (this is a git submodule). So what I am doing right now is:
myproject/$ cd mysubmodule
myproject/mysubmodule/$ python setup.py install
myproject/mysubmodule/$ cd ..
myproject/$ python setup.py install
This is too tedious for customers, especially if the project will be extended by further submodules in the future.
Is there a way to automate the installation of mysubmodule when calling myproject/setup.py?
setuptools.find_packages() is able to discover submodules
Your setup.py should look like:
from setuptools import setup, find_packages

setup(
    packages=find_packages(),
    # ...
)
Create a package for mysubmodule with its own setup.py and let the top-level package depend on that package in its setup.py. This means you only need to make the packages / dependencies available and run python setup.py install on the top-level package.
The question then becomes how to ship the dependencies / packages to your customers but this can be solved by putting them in a directory and configuring setup.py to include that directory when searching for dependencies.
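A rough sketch of that approach (names and paths are illustrative): give mysubmodule its own setup.py, build it into an sdist or wheel that you ship in, say, a deps/ directory, and let the top-level setup.py simply declare the dependency:
from setuptools import setup, find_packages

setup(
    name='myproject',
    version='0.1',
    packages=find_packages(),
    # The actual mysubmodule distribution is shipped separately (e.g. in deps/).
    install_requires=['mysubmodule'],
)
Customers can then install everything in one step with something like pip install --find-links=./deps . so that pip resolves mysubmodule from the local directory.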
The alternative is to "vendor" mysubmodule which simply means including it all in one package (no further questions asked) and having one python setup.py install to install the main package. For example, pip vendors (includes) requests so it can use it without having to depend on that requests package.

What is setup.py?

What is setup.py and how can it be configured or used?
setup.py is a Python file whose presence indicates that the module/package you are about to install has likely been packaged and distributed with Distutils, which is the standard for distributing Python modules.
This allows you to easily install Python packages. Often it's enough to write:
$ pip install .
pip will use setup.py to install your module. Avoid calling setup.py directly.
https://docs.python.org/3/installing/index.html#installing-index
It helps you install a Python package foo on your machine (it can also be in a virtualenv) so that you can import the package foo from other projects and also from [I]Python prompts.
It does a similar job to pip, easy_install, etc.
Using setup.py
Let's start with some definitions:
Package - A folder/directory that contains an __init__.py file.
Module - A valid Python file with a .py extension.
Distribution - How one package relates to other packages and modules.
Let's say you want to install a package named foo. Then you do,
$ git clone https://github.com/user/foo
$ cd foo
$ python setup.py install
Instead, if you don't want to actually install it but would still like to use it, then do:
$ python setup.py develop
This command will create symlinks to the source directory within site-packages instead of copying things. Because of this, it is quite fast (particularly for large packages).
Creating setup.py
If you have your package tree like,
foo
├── foo
│   ├── data_struct.py
│   ├── __init__.py
│   └── internals.py
├── README
├── requirements.txt
└── setup.py
Then, you do the following in your setup.py script so that it can be installed on some machine:
from setuptools import setup

setup(
    name='foo',
    version='1.0',
    description='A useful module',
    author='Man Foo',
    author_email='foomail@foo.example',
    packages=['foo'],  # same as name
    install_requires=['wheel', 'bar', 'greek'],  # external packages as dependencies
)
Instead, if your package tree is more complex like the one below:
foo
├── foo
│   ├── data_struct.py
│   ├── __init__.py
│   └── internals.py
├── README
├── requirements.txt
├── scripts
│   ├── cool
│   └── skype
└── setup.py
Then, your setup.py in this case would be like:
from setuptools import setup

setup(
    name='foo',
    version='1.0',
    description='A useful module',
    author='Man Foo',
    author_email='foomail@foo.example',
    packages=['foo'],  # same as name
    install_requires=['wheel', 'bar', 'greek'],  # external packages as dependencies
    scripts=[
        'scripts/cool',
        'scripts/skype',
    ]
)
Add more stuff to (setup.py) & make it decent:
from setuptools import setup

with open("README", 'r') as f:
    long_description = f.read()

setup(
    name='foo',
    version='1.0',
    description='A useful module',
    license="MIT",
    long_description=long_description,
    author='Man Foo',
    author_email='foomail@foo.example',
    url="http://www.foopackage.example/",
    packages=['foo'],  # same as name
    install_requires=['wheel', 'bar', 'greek'],  # external packages as dependencies
    scripts=[
        'scripts/cool',
        'scripts/skype',
    ]
)
The long_description is used on pypi.org as the README description of your package.
And finally, you're now ready to upload your package to PyPI so that others can install it using pip install yourpackage.
At this point there are two options.
Publish on the temporary test.pypi.org server to familiarize yourself with the procedure, and then publish on the permanent pypi.org server for the public to use your package.
Publish straight away on the permanent pypi.org server, if you are already familiar with the procedure and have your user credentials (username, password, package name).
Once your package name is registered on pypi.org, nobody else can claim or use it. Python packaging suggests the twine package for uploading your package to PyPI. Thus,
the first step is to locally build the distributions using:
# prereq: wheel (pip install wheel)
$ python setup.py sdist bdist_wheel
then using twine for uploading either to test.pypi.org or pypi.org:
$ twine upload --repository testpypi dist/*
username: ***
password: ***
It will take a few minutes for the package to appear on test.pypi.org. Once you're satisfied with it, you can then upload your package to the real & permanent index of pypi.org simply with:
$ twine upload dist/*
Optionally, you can also sign the files in your package with GPG:
$ twine upload dist/* --sign
Bonus Reading:
See a sample setup.py from a real project here: torchvision-setup.py
PEP 517, setuptools
why twine? using twine
setup.py is Python's answer to a multi-platform installer and make file.
If you’re familiar with command line installations, then make && make install translates to python setup.py build && python setup.py install.
Some packages are pure Python, and are only byte compiled. Others may contain native code, which will require a native compiler (like gcc or cl) and a Python interfacing module (like swig or pyrex).
If you downloaded a package that has setup.py in its root folder, you can install it by running
python setup.py install
If you are developing a project and are wondering what this file is useful for, check the Python documentation on writing the setup script.
setup.py is a Python script that is usually shipped with libraries or programs written in that language. Its purpose is the correct installation of the software.
Many packages use the distutils framework in conjunction with setup.py.
http://docs.python.org/distutils/
setup.py can be used in two scenarios: first, you want to install a Python package; second, you want to create your own Python package. Usually a standard Python package has a couple of important files like setup.py, setup.cfg and MANIFEST.in. When you are creating the Python package, these three files determine the name, version, description, other required installations (usually listed in a .txt file) and a few other parameters (this is the content that ends up in PKG-INFO under the egg-info folder). setup.cfg is read by setup.py while the package is created (it could be a tar.gz). MANIFEST.in is where you define what should be included in your package. Anyway, you can do a bunch of stuff using setup.py, like
python setup.py build
python setup.py install
python setup.py sdist <distname> upload [-r urltorepo] (to upload package to pypi or local repo)
There are a bunch of other commands which can be used with setup.py. For help:
python setup.py --help-commands
setup.py is a Python file like any other. It could take any name, but by convention it is named setup.py so that there is not a different procedure for each script.
Most frequently setup.py is used to install a Python module, but it serves other purposes as well:
Modules:
Perhaps the most famous usage of setup.py is for installing modules. Although modules can be installed using pip, old Python versions did not include pip by default and it needed to be installed separately.
If you wanted to install a module but did not want to install pip, just about the only alternative was to install the module from its setup.py file. This could be achieved via python setup.py install, which would install the Python module to the root directory (without pip, easy_install, etc.).
This method is often used when pip will fail, for example when the correct Python version of the desired package is not available via pip, perhaps because it is no longer maintained. Downloading the source and running python setup.py install will usually accomplish the same thing, except where compiled binaries are required (and it will disregard the Python version, unless an error is returned).
Another use of setup.py is to install a package from source. If a module is still under development, the wheel files will not be available and the only way to install it is from the source directly.
Building Python extensions:
When a module has been written, it can be made ready for distribution using a distutils setup script. Once built, it can be installed using the command above.
A setup script is easy to write, and once the file has been properly configured the package can be compiled by running python setup.py build (see the link for all commands).
Once again it is named setup.py for ease of use and by convention, but it can take any name.
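As an illustration (not part of the original answer), a minimal distutils-style script for a hypothetical C extension could look like the following, and python setup.py build will compile it:
from distutils.core import setup, Extension

setup(
    name='spam',
    version='1.0',
    description='A hypothetical C extension module',
    # spam.c is a placeholder for the C source implementing the module.
    ext_modules=[Extension('spam', sources=['spam.c'])],
)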
Cython:
Another famous use of setup.py files is for compiled extensions. These require a setup script with user-defined values. They allow fast (but, once compiled, platform-dependent) execution. Here is a simple example from the documentation:
from distutils.core import setup
from Cython.Build import cythonize

setup(
    name = 'Hello world app',
    ext_modules = cythonize("hello.pyx"),
)
This can be compiled via python setup.py build
Cx_Freeze:
Another module requiring a setup script is cx_Freeze. It converts Python scripts to executables. This allows many options such as descriptions, names, icons, packages to include or exclude, etc., and once run it will produce a distributable application. An example from the documentation:
import sys
from cx_Freeze import setup, Executable

build_exe_options = {"packages": ["os"], "excludes": ["tkinter"]}

base = None
if sys.platform == "win32":
    base = "Win32GUI"

setup(name = "guifoo",
      version = "0.1",
      description = "My GUI application!",
      options = {"build_exe": build_exe_options},
      executables = [Executable("guifoo.py", base=base)])
This can be compiled via python setup.py build.
So what is a setup.py file?
Quite simply, it is a script that builds or configures something in the Python environment.
A package, when distributed, should contain only one setup script, but it is not uncommon to combine several tasks into a single setup script. Notice that this often involves distutils, but not always (as I showed in my last example). The thing to remember is that it just configures a Python package/script in some way.
It takes that name so the same command can always be used when building or installing.
When you download a package with setup.py, open your Terminal (Mac, Linux) or Command Prompt (Windows). Using cd (Tab completion will help), set the path to the folder where you downloaded the file and where setup.py is:
iMac:~ user $ cd path/pakagefolderwithsetupfile/
Press enter, you should see something like this:
iMac:pakagefolderwithsetupfile user$
Then type python setup.py install:
iMac:pakagefolderwithsetupfile user$ python setup.py install
Press enter. Done!
To make it simple, setup.py is run as "__main__" when you call the install functions the other answers mentioned. Inside setup.py, you should put everything needed to install your package.
Common setup.py functions
The following two sections discuss two things many setup.py modules have.
setuptools.setup
This function allows you to specify project attributes like the name of the project, the version, and so on. Most importantly, this function allows you to install other packages if they're packaged properly. See this webpage for an example of setuptools.setup.
These attributes of setuptools.setup enable installing these types of packages:
Packages that are imported by your project and listed on PyPI, using setuptools.find_packages:
packages=find_packages(exclude=["docs","tests", ".gitignore", "README.rst","DESCRIPTION.rst"])
Packages not on PyPI, but which can be downloaded from a URL, using dependency_links:
dependency_links=["http://peak.telecommunity.com/snapshots/",]
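Put together, a minimal sketch using both attributes might look like this (the project name and the requests dependency are placeholders; note also that recent versions of pip no longer honour dependency_links):
from setuptools import setup, find_packages

setup(
    name='myproject',  # placeholder name
    version='0.1',
    packages=find_packages(exclude=["docs", "tests"]),
    install_requires=['requests'],  # an example dependency from PyPI
    dependency_links=["http://peak.telecommunity.com/snapshots/"],
)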
Custom functions
In an ideal world, setuptools.setup would handle everything for you. Unfortunately this isn't always the case. Sometimes you have to do specific things, like installing dependencies with the subprocess command, to get the system you're installing on into the right state for your package. Try to avoid this; these functions get confusing and often differ between operating systems and even distributions.
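If you do need such a hook, one common shape (shown here only as a sketch, with placeholder names and a harmless placeholder command) is a custom command class passed to setup via cmdclass:
import subprocess
import sys
from setuptools import setup
from setuptools.command.install import install

class CustomInstall(install):
    """Run an extra, package-specific step before the normal install."""
    def run(self):
        # Placeholder for "get the system into the right state"; real projects
        # might call out to an OS package manager or a shell script here.
        subprocess.check_call([sys.executable, '-c', "print('preparing system')"])
        install.run(self)

setup(
    name='myproject',  # placeholder name
    version='0.1',
    cmdclass={'install': CustomInstall},
)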
To install a Python package you've downloaded, you extract the archive and run the setup.py script inside:
python setup.py install
To me, this has always felt odd. It would be more natural to point a package manager at the download, as one would do in Ruby and Node.js, e.g. gem install rails-4.1.1.gem
A package manager is more comfortable too, because it's familiar and reliable. On the other hand, each setup.py is novel, because it's specific to the package. It demands faith in convention: "I trust this setup.py takes the same commands as others I have used in the past." That's a regrettable tax on mental willpower.
I'm not saying the setup.py workflow is less secure than a package manager (I understand pip just runs the setup.py inside), but I certainly feel it's awkward and jarring. There's a harmony to commands all going to the same package manager application. You might even grow fond of it.
