Let sphinx use version from setup.py

If I do sphinx-quickstart I get asked about the version of the project.
I want to avoid having two places for the version of my project.
How do I do this in the Python packaging world?

The easiest (and probably cleanest) way is to define __version__ in the __init__.py of your top-level package, and then import that package and read the version in both setup.py and your Sphinx project's conf.py.
So let's say your project is called myproject.
Move your current version out of setup.py, and make it a variable in myproject/__init__.py instead:
myproject/__init__.py:
# import foo
# ...
__version__ = '1.5'
Import myproject in your project's setup.py, and replace the hardcoded version with myproject.__version__:
setup.py:
from setuptools import setup
from myproject import __version__

project = "myproject"

setup(
    name=project,
    version=__version__,
    # ...
)
In your Sphinx project's conf.py, do the same. So edit the generated conf.py along these lines:
docs/conf.py:
from myproject import __version__
# ...
# The short X.Y version.
version = __version__
# The full version, including alpha/beta/rc tags.
release = version
For an example of a library that does this pretty much exactly like this, have a look at the requests module (__init__.py | setup.py | conf.py).
This will take care of the auto-generated texts where the project version is used (like the links to the front page of the documentation). If you want to use your version in specific custom places, you can use the rst_epilog directive to dynamically insert configuration values defined in conf.py.
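For instance, here is a minimal sketch of that rst_epilog approach in conf.py (the substitution name |ProjectVersion| is purely illustrative, not something Sphinx defines):
# docs/conf.py (continued)
# make the version available to reST text via a substitution
rst_epilog = """
.. |ProjectVersion| replace:: {version}
""".format(version=version)
You can then write |ProjectVersion| anywhere in your .rst files and Sphinx will substitute the version string.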

Maybe an even cleaner option is to actually build the Sphinx documentation from the setup.py command, as described in http://www.sphinx-doc.org/en/master/setuptools.html:
setup.py
from setuptools import setup
# this is only necessary when not using setuptools/distribute
from sphinx.setup_command import BuildDoc
cmdclass = {'build_sphinx': BuildDoc}

name = 'My project'
version = '1.2'
release = '1.2.0'

setup(
    name=name,
    author='Bernard Montgomery',
    version=release,
    cmdclass=cmdclass,
    # these are optional and override conf.py settings
    command_options={
        'build_sphinx': {
            'project': ('setup.py', name),
            'version': ('setup.py', version),
            'release': ('setup.py', release),
            'source_dir': ('setup.py', 'doc')}},
)
Then build documentation using
$ python setup.py build_sphinx
Benefits:
Makes setup.py the single source of the version number
Avoids having to make packages out of your project folders unnecessarily

You could have a look at the bumpversion module:
"A small command line tool to simplify releasing software by updating all version strings in your source code by the correct increment"
You may use a configuration file .bumpversion.cfg for complex multi-file operations.

Another way is integrating setuptools_scm into your project. Then, in your conf.py, you can do:
from setuptools_scm import get_version
version = get_version()

Here is a straightforward solution, ironically from the setuptools_scm PyPI page:
# contents of docs/conf.py
from importlib.metadata import version
release = version('myproject')
# for example take major/minor
version = '.'.join(release.split('.')[:2])
Here is their explanation of why using their package from Sphinx is discouraged:
The underlying reason is, that services like Read the Docs sometimes change the working directory for good reasons and using the installed metadata prevents using needless volatile data there.

Extract Information from pyproject.toml
If you use a pyproject.toml, you can also parse it in conf.py with tomli, or use the equivalent standard-library tomllib on Python 3.11+.
This way you can extract the information from pyproject.toml and use it in your Sphinx documentation configuration.
Here is a short, incomplete example using tomli, assuming conf.py is located at <project-root>/docs/source/conf.py:
# conf.py
import tomli

with open("../../pyproject.toml", "rb") as f:
    toml = tomli.load(f)

# -- Project information -----------------------------------------------------
pyproject = toml["tool"]["poetry"]

project = pyproject["name"]
version = pyproject["version"]
release = pyproject["version"]
copyright = ...
author = ...

# and the rest of the configuration
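On Python 3.11+ the same sketch works with the standard-library tomllib, so no extra dependency is needed:
# conf.py
import tomllib

with open("../../pyproject.toml", "rb") as f:
    toml = tomllib.load(f)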

Related

mypy does not detect errors when wrongly using a function imported from an external package

I've run a simple experiment where I've created a very simple python package containing the following files:
In the folder my_package:
# example.py
def foo(number: int, text: str) -> None:
    print(number)
    print(text)
An empty __init__.py
In the root folder, a setup.py file:
from setuptools import setup

setup(
    name='ExamplePackage',
    version='0.1.0',
    author='An Awesome Coder',
    author_email='aac@example.com',
    packages=['my_package'],
    description='An awesome package that does something',
)
After building the package using python ./setup.py bdist_wheel I've copied the .whl file to another python project, ran pip install ExamplePackage-0.1.0-py3-none-any.whl and created the following file main.py:
from my_package.example import foo

# No mypy errors at all!
x = foo()


def internal_foo(number: int, text: str) -> None:
    print(number)
    print(text)


# Missing positional arguments "number", "text" in call to "internal_foo"  [call-arg]
# "internal_foo" does not return a value  [func-returns-value]
y = internal_foo()
My mypy.ini looks like this:
[mypy]
ignore_missing_imports = True
show_error_codes = True
strict = True
check_untyped_defs = True
raise_exceptions = True
I'm struggling to understand why mypy, which could easily infer the signature of the imported foo function, does not show any errors regarding the wrong usage.
Any help will be much appreciated.
According to PEP-561 (a python standard that regulates use of typing data for packages), there are two general ways to declare types for package.
The first (applicable in your case) is the py.typed marker. It is an empty file named py.typed, placed in the package root (the same place where the topmost __init__.py resides; for namespace packages, prefer adding py.typed to every submodule to avoid inconsistent interpretation). It declares that the package contains inline type information.
It is important not to forget to add it as package data. One route is a MANIFEST.in entry plus whatever configuration is needed to enable it: nothing for pyproject.toml-based packages, include_package_data = True for setup.py and setup.cfg, and something similar for non-setuptools build systems (Poetry, flit, hatch or whatever you need for some reason). Alternatively, it can be declared in the [tool.setuptools.package-data] section of pyproject.toml, in package_data in setup.py, in [options.package_data] in setup.cfg, and in something similar for other build systems. There are also other options like VCS integration (from .gitignore and similar files) with appropriate plugins; refer to the setuptools documentation here or the docs for the build system of your choice.
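As a sketch of the setuptools route via package_data (package and file names are the ones from the question; everything else is illustrative):
# setup.py
from setuptools import setup

setup(
    name='ExamplePackage',
    version='0.1.0',
    packages=['my_package'],
    # ship the empty marker file inside the package so type checkers see it
    package_data={'my_package': ['py.typed']},
)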
Another supported solution is using stub files - these are separate files with a .pyi extension, providing only typing for functions/methods and variables/attributes. They have a higher precedence (e.g. if both stubs and py.typed are present, the stub types are used), but that situation should not normally arise (if code can be annotated inline, then there is no need for separate stubs). Stub packages are named <package>-stubs. There is also a "side-by-side" approach that allows putting .py and .pyi files next to each other.
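For comparison, a side-by-side stub for the example.py above would be a file my_package/example.pyi containing just the signatures (purely illustrative):
# my_package/example.pyi
def foo(number: int, text: str) -> None: ...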

Install sphinx compiled doc from setup.py

I want to install some HTML on-line doc that is built (that is, compiled from reST) using sphinx along with the python code into site-packages. I want to do this using setup.py.
To be more specific, I want to compile my *.rst files to HTML and then copy them into site-packages from the setup.py file when the user types python setup.py install.
Does someone know how to do this? I looked into the sphinx and setuptools docs but was not able to find the info.
The reason I want to do it is that my package is a GUI tool and the HTML is its on-line help. It is displayed in the GUI's internal browser.
As already noted you'll need a MANIFEST.in file to include the docs directory:
recursive-include docs *
Now in your setup.py you could programmatically generate the sphinx documentation from your *.rst files.
Here's an example on how to do that automatically when python setup.py install is run:
from distutils.core import setup

# set dependencies, we need sphinx to build doc
dependencies = ['sphinx']

setup(
    name='my_package',
    # rest of your setup
)

# auto-generate documentation
import sphinx

# automatically build html documentation
# For example:
#   format = 'html'
#   sphinx-src-dir = './doc'
#   sphinx-build-dir = './doc/build'
sphinx.build_main(['setup.py', '-b', '<format>', '<sphinx-src-dir>', '<sphinx-build-dir>'])
Like this you can run sphinx-build as needed (Documentation).
Note: It might be a good idea to add a custom command to your setup.py for separating documentation generation (e.g. python setup.py make_doc [params]) from the install routine (python setup.py install). This way the user could easily supply the format, sphinx-src-dir and sphinx-build-dir parameters; a rough sketch follows.
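A rough sketch of such a custom command (the class name, option names and default paths are illustrative; it reuses the same legacy sphinx.build_main call as above):
from distutils.cmd import Command

class MakeDoc(Command):
    """Build the Sphinx HTML docs via `python setup.py make_doc`."""
    description = 'generate HTML documentation with sphinx'
    user_options = [
        ('source-dir=', None, 'sphinx source directory'),
        ('build-dir=', None, 'sphinx output directory'),
    ]

    def initialize_options(self):
        self.source_dir = './doc'
        self.build_dir = './doc/build'

    def finalize_options(self):
        pass

    def run(self):
        import sphinx
        sphinx.build_main(['setup.py', '-b', 'html', self.source_dir, self.build_dir])

# then register it: setup(..., cmdclass={'make_doc': MakeDoc})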
You can add them into your MANIFEST.in and then build the package
recursive-include docs *
Then it will be distributed with your egg

packaging with numpy and test suite

Introduction
Disclaimer: I'm very new to python packaging with distutils. So far I've just stashed everything into modules and packages manually and developed on top of that. I never wrote a setup.py file before.
I have a Fortran module that I want to use in my python code with numpy. I figured the best way to do that would be f2py, since it is included in numpy. To automate the build process I want to use distutils and the corresponding numpy enhancement, which includes convenience functions for f2py wrappers.
I do not understand how I should organize my files, and how to include my test suite.
What I want is the possibility to use ./setup.py for building, installing, testing, and developing.
My directory structure looks as follows:
volterra
├── setup.py
└── volterra
├── __init__.py
├── integral.f90
├── test
│   ├── __init__.py
│   └── test_volterra.py
└── volterra.f90
And the setup.py file contains this:
def configuration(parent_package='', top_path=None):
    from numpy.distutils.misc_util import Configuration
    config = Configuration('volterra', parent_package, top_path)
    config.add_extension('_volterra',
                         sources=['volterra/integral.f90', 'volterra/volterra.f90'])
    return config


if __name__ == '__main__':
    from numpy.distutils.core import setup
    setup(**configuration(top_path='').todict())
After running ./setup.py build I get:
build/lib.linux-x86_64-2.7/
└── volterra
└── _volterra.so
Which includes neither the __init__.py file, nor the tests.
Questions
Is it really necessary to add the path to every single source file of the extension? (I.e. volterra/integral.f90) Can't I give a parameter which says, look for stuff in volterra/? The top_path, and package_dir parameters didn't do the trick.
Currently, the __init__.py file is not included in the build. Why is that?
How can I run my tests in this setup?
What's the best workflow for doing development in such an environment? I don't want to install my package for every single change I do. How do you do development in the source directory when you need to compile some extension modules?
Here is a setup.py that works for me:
# pkg - A fancy software package
# Copyright (C) 2013 author (email)
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see http://www.gnu.org/licenses/gpl.html.
"""pkg: a software suite for
Hey look at me I'm a long description
But how long am I?
"""
from __future__ import division, print_function
#ideas for setup/f2py came from:
# -numpy setup.py: https://github.com/numpy/numpy/blob/master/setup.py 2013-11-07
# -winpython setup.py: http://code.google.com/p/winpython/source/browse/setup.py 2013-11-07
# -needing to use
# import setuptools; from numpy.distutils.core import setup, Extension:
# http://comments.gmane.org/gmane.comp.python.f2py.user/707 2013-11-07
# -wrapping FORTRAN code with f2py: http://www2-pcmdi.llnl.gov/cdat/tutorials/f2py-wrapping-fortran-code 2013-11-07
# -numpy distutils: http://docs.scipy.org/doc/numpy/reference/distutils.html 2013-11-07
# -manifest files in distutils:
# 'distutils doesn't properly update MANIFEST. when the contents of directories change.'
# https://github.com/numpy/numpy/blob/master/setup.py
# -if things are not working try deleting build, sdist, egg directories and try again:
# https://stackoverflow.com/a/9982133/2530083 2013-11-07
# -getting fortran extensions to be installed in their appropriate sub package
# i.e. "my_ext = Extension(name = 'my_pack._fortran', sources = ['my_pack/code.f90'])"
# Note that sources is a list even if one file:
# http://numpy-discussion.10968.n7.nabble.com/f2py-and-setup-py-how-can-I-specify-where-the-so-file-goes-tp34490p34497.html 2013-11-07
# -install fortran source files into their appropriate sub-package
# i.e. "package_data={'': ['*.f95','*.f90']}# Note it's a dict and list":
# https://stackoverflow.com/a/19373744/2530083 2013-11-07
# -Chapter 9 Fortran Programming with NumPy Arrays:
# Langtangen, Hans Petter. 2013. Python Scripting for Computational Science. 3rd edition. Springer.
# -Hitchhikers guide to packaging :
# http://guide.python-distribute.org/
# -Python Packaging: Hate, hate, hate everywhere :
# http://lucumr.pocoo.org/2012/6/22/hate-hate-hate-everywhere/
# -How To Package Your Python Code:
# http://www.scotttorborg.com/python-packaging/
# -install testing requirements:
# https://stackoverflow.com/a/7747140/2530083 2013-11-07
import setuptools
from numpy.distutils.core import setup, Extension
import os
import os.path as osp
def readme(filename='README.rst'):
    with open(filename) as f:
        text = f.read()
    return text


def get_package_data(name, extlist):
    """Return data files for package *name* with extensions in *extlist*"""
    # modified slightly from http://code.google.com/p/winpython/source/browse/setup.py 2013-11-7
    flist = []
    # Workaround to replace os.path.relpath (not available until Python 2.6):
    offset = len(name) + len(os.pathsep)
    for dirpath, _dirnames, filenames in os.walk(name):
        for fname in filenames:
            if not fname.startswith('.') and osp.splitext(fname)[1] in extlist:
                # flist.append(osp.join(dirpath, fname[offset:]))
                flist.append(osp.join(dirpath, fname))
    return flist
DOCLINES = __doc__.split("\n")
CLASSIFIERS = """\
Development Status :: 1 - Planning
License :: OSI Approved :: GNU Lesser General Public License v3 or later (LGPLv3+)
Programming Language :: Python :: 2.7
Topic :: Scientific/Engineering
"""
NAME = 'pkg'
MAINTAINER = "me"
MAINTAINER_EMAIL = "me@me.com"
DESCRIPTION = DOCLINES[0]
LONG_DESCRIPTION = "\n".join(DOCLINES[2:])#readme('readme.rst')
URL = "http://meeeee.mmemem"
DOWNLOAD_URL = "https://github.com/rtrwalker/geotecha.git"
LICENSE = 'GNU General Public License v3 or later (GPLv3+)'
CLASSIFIERS = [_f for _f in CLASSIFIERS.split('\n') if _f]
KEYWORDS=''
AUTHOR = "me"
AUTHOR_EMAIL = "me.com"
PLATFORMS = ["Windows"]#, "Linux", "Solaris", "Mac OS-X", "Unix"]
MAJOR = 0
MINOR = 1
MICRO = 0
ISRELEASED = False
VERSION = '%d.%d.%d' % (MAJOR, MINOR, MICRO)
INSTALL_REQUIRES=[]
ZIP_SAFE=False
TEST_SUITE='nose.collector'
TESTS_REQUIRE=['nose']
DATA_FILES = [(NAME, ['LICENSE.txt','README.rst'])]
PACKAGES=setuptools.find_packages()
PACKAGES.remove('tools')
PACKAGE_DATA={'': ['*.f95','*f90']}
ext_files = get_package_data(NAME,['.f90', '.f95','.F90', '.F95'])
ext_module_names = ['.'.join(osp.splitext(v)[0].split(osp.sep)) for v in ext_files]
EXT_MODULES = [Extension(name=x,sources=[y]) for x, y in zip(ext_module_names, ext_files)]
setup(
    name=NAME,
    version=VERSION,
    maintainer=MAINTAINER,
    maintainer_email=MAINTAINER_EMAIL,
    description=DESCRIPTION,
    long_description=LONG_DESCRIPTION,
    url=URL,
    download_url=DOWNLOAD_URL,
    license=LICENSE,
    classifiers=CLASSIFIERS,
    author=AUTHOR,
    author_email=AUTHOR_EMAIL,
    platforms=PLATFORMS,
    packages=PACKAGES,
    data_files=DATA_FILES,
    install_requires=INSTALL_REQUIRES,
    zip_safe=ZIP_SAFE,
    test_suite=TEST_SUITE,
    tests_require=TESTS_REQUIRE,
    package_data=PACKAGE_DATA,
    ext_modules=EXT_MODULES,
)
To install, at the command line I use:
python setup.py install
python setup.py clean --all
The only issue I seem to have is a minor one. When I look in site-packages for my package, it is installed inside the egg folder C:\Python27\Lib\site-packages\pkg-0.1.0-py2.7-win32.egg\pkg. Most other packages I see there have a C:\Python27\Lib\site-packages\pkg folder separate from the egg folder. Does anyone know how to get that separation?
As for testing, after installing, I type the following at the command line:
nosetests package_name -v
Try investigating python setup.py develop (Python setup.py develop vs install) for not having to install the package after every change.
As I commented in the code I found the following useful:
numpy setup.py: https://github.com/numpy/numpy/blob/master/setup.py 2013-11-07
winpython setup.py: http://code.google.com/p/winpython/source/browse/setup.py 2013-11-07
needing to use
import setuptools; from numpy.distutils.core import setup, Extension:
http://comments.gmane.org/gmane.comp.python.f2py.user/707 2013-11-07
wrapping FORTRAN code with f2py: http://www2-pcmdi.llnl.gov/cdat/tutorials/f2py-wrapping-fortran-code 2013-11-07
numpy distutils: http://docs.scipy.org/doc/numpy/reference/distutils.html 2013-11-07
manifest files in distutils:
'distutils doesn't properly update MANIFEST. when the contents of directories change.'
https://github.com/numpy/numpy/blob/master/setup.py
if things are not working try deleting build, sdist, egg directories and try again:
https://stackoverflow.com/a/9982133/2530083 2013-11-07
getting fortran extensions to be installed in their appropriate sub package
i.e. "my_ext = Extension(name = 'my_pack._fortran', sources = ['my_pack/code.f90'])"
Note that sources is a list even if one file:
http://numpy-discussion.10968.n7.nabble.com/f2py-and-setup-py-how-can-I-specify-where-the-so-file-goes-tp34490p34497.html 2013-11-07
install fortran source files into their appropriate sub-package
i.e. "package_data={'': ['.f95','.f90']}# Note it's a dict and list":
https://stackoverflow.com/a/19373744/2530083 2013-11-07
Chapter 9 Fortran Programming with NumPy Arrays:
Langtangen, Hans Petter. 2013. Python Scripting for Computational Science. 3rd edition. Springer.
Hitchhikers guide to packaging :
http://guide.python-distribute.org/
Python Packaging: Hate, hate, hate everywhere :
http://lucumr.pocoo.org/2012/6/22/hate-hate-hate-everywhere/
How To Package Your Python Code:
http://www.scotttorborg.com/python-packaging/
install testing requirements:
https://stackoverflow.com/a/7747140/2530083 2013-11-07
'python setup.py develop' :
https://stackoverflow.com/a/19048754/2530083
Here is setup.py from a project I made. I have found figuring out setup.py / packaging to be frustrating with no solid answers and definitely not pythonic in the sense of having one and only one obvious way to do something. Hopefully this will help a little.
The points you may find useful are:
find_packages which removes the drudgery of including lots of files or messing around with generating manifest.
package_data which allows you to easily specify non .py files to be included
install_requires / tests_require
You'll need to find the source for distribute_setup.py if you don't have it already.
Is it really necessary to add the path to every single source file of the extension? (I.e. volterra/integral.f90) Can't I give a parameter which says, look for stuff in volterra/? The top_path, and package_dir parameters didn't do the trick.
Currently, the __init__.py file is not included in the build. Why is that?
Hopefully find_packages() will solve both of those. I don't have much experience packaging but I haven't had to go back to manual inclusion yet.
How can I run my tests in this setup?
I think this is probably a different question with many answers depending on how you are doing tests. Maybe you can ask it separately?
As a side note, I am under the impression that the standard is to put your tests directory at the top level. I.e. volterra/volterra and volterra/tests.
What's the best workflow for doing development in such an environment? I don't want to install my package for every single change I do. How do you do development in the source directory when you need to compile some extension modules?
This might be worth another question as well. I don't see why you would need to install your package for every single change. If you are uploading the package, just don't install it on your dev system (except to test installation) and work directly from your development copy. Maybe I'm missing something though since I don't work with compiled extensions.
Here is the example:
try:
    from setuptools import setup, find_packages
except ImportError:
    from distribute_setup import use_setuptools
    use_setuptools()
    from setuptools import setup, find_packages

setup(
    # ... other stuff
    py_modules=['distribute_setup'],
    packages=find_packages(),
    package_data={'': ['*.png']},  # for me to include anything with png
    install_requires=['numpy', 'treenode', 'investigators'],
    tests_require=['mock', 'numpy', 'treenode', 'investigators'],
)

What is the correct way to share package version with setup.py and the package?

With distutils, setuptools, etc. a package version is specified in setup.py:
# file: setup.py
...
setup(
    name='foobar',
    version='1.0.0',
    # other attributes
)
I would like to be able to access the same version number from within the package:
>>> import foobar
>>> foobar.__version__
'1.0.0'
I could add __version__ = '1.0.0' to my package's __init__.py, but I would also like to include additional imports in my package to create a simplified interface to the package:
# file: __init__.py
from foobar import foo
from foobar.bar import Bar
__version__ = '1.0.0'
and
# file: setup.py
from foobar import __version__
...
setup(
    name='foobar',
    version=__version__,
    # other attributes
)
However, these additional imports can cause the installation of foobar to fail if they import other packages that are not yet installed. What is the correct way to share package version with setup.py and the package?
Set the version in setup.py only, and read your own version with pkg_resources, effectively querying the setuptools metadata:
file: setup.py
setup(
    name='foobar',
    version='1.0.0',
    # other attributes
)
file: __init__.py
from pkg_resources import get_distribution
__version__ = get_distribution('foobar').version
To make this work in all cases, where you could end up running this without having installed it, test for DistributionNotFound and the distribution location:
from pkg_resources import get_distribution, DistributionNotFound
import os.path

try:
    _dist = get_distribution('foobar')
    # Normalize case for Windows systems
    dist_loc = os.path.normcase(_dist.location)
    here = os.path.normcase(__file__)
    if not here.startswith(os.path.join(dist_loc, 'foobar')):
        # not installed, but there is another version that *is*
        raise DistributionNotFound
except DistributionNotFound:
    __version__ = 'Please install this project with setup.py'
else:
    __version__ = _dist.version
I don't believe there's a canonical answer to this, but my method (either directly copied or slightly tweaked from what I've seen in various other places) is as follows:
Folder hierarchy (relevant files only):
package_root/
|- main_package/
| |- __init__.py
| `- _version.py
`- setup.py
main_package/_version.py:
"""Version information."""
# The following line *must* be the last in the module, exactly as formatted:
__version__ = "1.0.0"
main_package/__init__.py:
"""Something nice and descriptive."""
from main_package.some_module import some_function_or_class
# ... etc.
from main_package._version import __version__
__all__ = (
some_function_or_class,
# ... etc.
)
setup.py:
from setuptools import setup

setup(
    version=open("main_package/_version.py").readlines()[-1].split()[-1].strip("\"'"),
    # ... etc.
)
... which is ugly as sin ... but it works, and I've seen it or something like it in packages distributed by people who I'd expect to know a better way if there were one.
I agree with @stefano-m's philosophy about:
Having version = "x.y.z" in the source and parsing it within setup.py is definitely the correct solution, IMHO. Much better than (the other way around) relying on run time magic.
And this answer is derived from @zero-piraeus's answer. The whole point is "don't use imports in setup.py, instead, read the version from a file".
I use regex to parse the __version__ so that it does not need to be the last line of a dedicated file at all. In fact, I still put the single-source-of-truth __version__ inside my project's __init__.py.
Folder hierarchy (relevant files only):
package_root/
|- main_package/
| `- __init__.py
`- setup.py
main_package/__init__.py:
# You can have other dependencies if you really need to
from main_package.some_module import some_function_or_class

# Define your version number in the way your mother told you,
# which is so straightforward that even your grandma will understand.
__version__ = "1.2.3"

__all__ = (
    some_function_or_class,
    # ... etc.
)
setup.py:
from setuptools import setup
import re, io

__version__ = re.search(
    r'__version__\s*=\s*[\'"]([^\'"]*)[\'"]',  # It excludes inline comment too
    io.open('main_package/__init__.py', encoding='utf_8_sig').read()
    ).group(1)

# The beautiful part is, I don't even need to check exceptions here.
# If something messes up, let the build process fail noisily, BEFORE my release!
setup(
    version=__version__,
    # ... etc.
)
... which is still not ideal ... but it works.
And by the way, at this point you can test your new toy in this way:
python setup.py --version
1.2.3
PS: This official Python packaging document (and its mirror) describes more options. Its first option is also using regex. (Depending on the exact regex you use, it may or may not handle quotation marks inside the version string. Generally not a big issue though.)
PPS: The fix in ADAL Python is now backported into this answer.
setuptools 46.4.0 added basic abstract syntax tree analysis support, so the setup.cfg attr: directive works without having to import your package's dependencies. This makes it possible to have a single source of truth for the package version, rendering many of the solutions in answers posted prior to the release of setuptools 46.4.0 obsolete.
It's now possible to avoid passing version to the setuptools.setup function in setup.py if __version__ is initialized in yourpackage/__init__.py and the following metadata is added to the setup.cfg file of your package. With this configuration the setuptools.setup function will automatically parse the package version from yourpackage/__init__.py, and you're free to import __version__ where needed in your application.
Example
setup.py without version passed to setup
from setuptools import setup

setup(
    name="yourpackage"
)
yourpackage/__init__.py
__version__ = '0.2.0'
setup.cfg
[metadata]
version = attr: yourpackage.__version__
some module in your app
from yourpackage import __version__ as expected_version
from pkg_resources import get_distribution

installed_version = get_distribution("yourpackage").version
assert expected_version == installed_version
Put __version__ in your_pkg/__init__.py, and parse in setup.py using ast:
import ast
import importlib.util
from pkg_resources import safe_name

PKG_DIR = 'my_pkg'


def find_version():
    """Return value of __version__.

    Reference: https://stackoverflow.com/a/42269185/
    """
    file_path = importlib.util.find_spec(PKG_DIR).origin
    with open(file_path) as file_obj:
        root_node = ast.parse(file_obj.read())
    for node in ast.walk(root_node):
        if isinstance(node, ast.Assign):
            if len(node.targets) == 1 and node.targets[0].id == "__version__":
                return node.value.s
    raise RuntimeError("Unable to find version string.")


setup(name=safe_name(PKG_DIR),
      version=find_version(),
      packages=[PKG_DIR],
      ...
      )
If using Python < 3.4, note that importlib.util.find_spec is not available. Moreover, any backport of importlib of course cannot be relied upon to be available to setup.py. In this case, use:
import os
file_path = os.path.join(os.path.dirname(__file__), PKG_DIR, '__init__.py')
The accepted answer requires that the package has been installed. In my case, I needed to extract the installation params (including __version__) from the source setup.py. I found a direct and simple solution while looking through the tests of the setuptools package. Looking for more info on the _setup_stop_after attribute led me to an old mailing list post which mentioned distutils.core.run_setup, which led me to the actual docs needed. After all that, here's the simple solution:
file setup.py:
from setuptools import setup

setup(name='funniest',
      version='0.1',
      description='The funniest joke in the world',
      url='http://github.com/storborg/funniest',
      author='Flying Circus',
      author_email='flyingcircus@example.com',
      license='MIT',
      packages=['funniest'],
      zip_safe=False)
file extract.py:
from distutils.core import run_setup
dist = run_setup('./setup.py', stop_after='init')
dist.get_version()
It seems like setuptools does not recommend using pkg_resources anymore.
A newer solution using the recommended importlib.metadata, working in Python 3.8+:
>>> from importlib.metadata import version
>>> version('wheel')
'0.32.3'
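For your own package, a sketch of foobar/__init__.py using that API could look like this (the fallback string for the not-installed case is just an assumption):
# foobar/__init__.py
from importlib.metadata import version, PackageNotFoundError

try:
    __version__ = version("foobar")
except PackageNotFoundError:
    # package is not installed, e.g. when running from a source checkout
    __version__ = "0.0.0"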
Based on the accepted answer and comments, this is what I ended up doing:
file: setup.py
setup(
    name='foobar',
    version='1.0.0',
    # other attributes
)
file: __init__.py
from pkg_resources import get_distribution, DistributionNotFound

__project__ = 'foobar'
__version__ = None  # required for initial installation

try:
    __version__ = get_distribution(__project__).version
except DistributionNotFound:
    VERSION = __project__ + '-' + '(local)'
else:
    VERSION = __project__ + '-' + __version__

from foobar import foo
from foobar.bar import Bar
Explanation:
__project__ is the name of the project to install, which may be different than the name of the package
VERSION is what I display in my command-line interfaces when --version is requested
the additional imports (for the simplified package interface) only occur if the project has actually been installed
Very late, I know. But this is working for me.
module/version.py:
__version__ = "1.0.2"
if __name__ == "__main__":
print(__version__)
module/__init__.py:
from . import version
__version__ = version.__version__
setup.py:
import subprocess
out = subprocess.Popen(['python', 'module/version.py'], stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
stdout,stderr = out.communicate()
version = str(stdout)
Main advantage for me is that it requires no hand-crafted parsing or regex, or manifest.in entries. It is also fairly Pythonic, seems to work in all cases (pip -e, etc), and can easily be extended to share docstrings etc by using argparse in version.py. Can anyone see issues with this approach?

Standard way to embed version into Python package?

Is there a standard way to associate version string with a Python package in such way that I could do the following?
import foo
print(foo.version)
I would imagine there's some way to retrieve that data without any extra hardcoding, since minor/major strings are specified in setup.py already. Alternative solution that I found was to have import __version__ in my foo/__init__.py and then have __version__.py generated by setup.py.
Not directly an answer to your question, but you should consider naming it __version__, not version.
This is almost a quasi-standard. Many modules in the standard library use __version__, and this is also used in lots of 3rd-party modules, so it's the quasi-standard.
Usually, __version__ is a string, but sometimes it's also a float or tuple.
As mentioned by S.Lott (Thank you!), PEP 8 says it explicitly:
Module Level Dunder Names
Module level "dunders" (i.e. names with two leading and two trailing
underscores) such as __all__, __author__, __version__, etc.
should be placed after the module docstring but before any import
statements except from __future__ imports.
You should also make sure that the version number conforms to the format described in PEP 440 (PEP 386 is a previous version of this standard).
I use a single _version.py file as the "one canonical place" to store version information:
It provides a __version__ attribute.
It provides the standard metadata version. Therefore it will be detected by pkg_resources or other tools that parse the package metadata (EGG-INFO and/or PKG-INFO, PEP 0345).
It doesn't import your package (or anything else) when building your package, which can cause problems in some situations. (See the comments below about what problems this can cause.)
There is only one place that the version number is written down, so there is only one place to change it when the version number changes, and there is less chance of inconsistent versions.
Here is how it works: the "one canonical place" to store the version number is a .py file, named "_version.py" which is in your Python package, for example in myniftyapp/_version.py. This file is a Python module, but your setup.py doesn't import it! (That would defeat feature 3.) Instead your setup.py knows that the contents of this file is very simple, something like:
__version__ = "3.6.5"
And so your setup.py opens the file and parses it, with code like:
import re

VERSIONFILE = "myniftyapp/_version.py"
verstrline = open(VERSIONFILE, "rt").read()
VSRE = r"^__version__ = ['\"]([^'\"]*)['\"]"
mo = re.search(VSRE, verstrline, re.M)
if mo:
    verstr = mo.group(1)
else:
    raise RuntimeError("Unable to find version string in %s." % (VERSIONFILE,))
Then your setup.py passes that string as the value of the "version" argument to setup(), thus satisfying feature 2.
To satisfy feature 1, you can have your package (at run-time, not at setup time!) import the _version file from myniftyapp/__init__.py like this:
from _version import __version__
Here is an example of this technique that I've been using for years.
The code in that example is a bit more complicated, but the simplified example that I wrote into this comment should be a complete implementation.
Here is example code of importing the version.
If you see anything wrong with this approach, please let me know.
Rewritten 2017-05
After 13+ years of writing Python code and managing various packages, I came to the conclusion that DIY is maybe not the best approach.
I started using the pbr package for dealing with versioning in my packages. If you are using git as your SCM, this will fit into your workflow like magic, saving you weeks of work (you will be surprised how complex the issue can be).
As of today, pbr has 12M monthly downloads, and reaching this level didn't involve any dirty tricks. It was only one thing -- fixing a common packaging problem in a very simple way.
pbr can do more of the package maintenance burden, and is not limited to versioning, but it does not force you to adopt all its benefits.
So to give you an idea about how it looks to adopt pbr in one commit have a look switching packaging to pbr
You have probably observed that the version is not stored in the repository at all. PBR detects it from Git branches and tags.
No need to worry about what happens when you do not have a git repository because pbr does "compile" and cache the version when you package or install the applications, so there is no runtime dependency on git.
Old solution
Here is the best solution I've seen so far and it also explains why:
Inside yourpackage/version.py:
# Store the version here so:
# 1) we don't load dependencies by storing it in __init__.py
# 2) we can import it in setup.py for the same reason
# 3) we can import it into your module
__version__ = '0.12'
Inside yourpackage/__init__.py:
from .version import __version__
Inside setup.py:
exec(open('yourpackage/version.py').read())

setup(
    ...
    version=__version__,
    ...
)
If you know another approach that seems to be better let me know.
Per the deferred [STOP PRESS: rejected] PEP 396 (Module Version Numbers), there is a proposed way to do this. It describes, with rationale, an (admittedly optional) standard for modules to follow. Here's a snippet:
When a module (or package) includes a version number, the version SHOULD be available in the __version__ attribute.
For modules which live inside a namespace package, the module SHOULD include the __version__ attribute. The namespace package itself SHOULD NOT include its own __version__ attribute.
The __version__ attribute's value SHOULD be a string.
There is a slightly simpler alternative to some of the other answers:
__version_info__ = ('1', '2', '3')
__version__ = '.'.join(__version_info__)
(And it would be fairly simple to convert auto-incrementing portions of version numbers to a string using str().)
Of course, from what I've seen, people tend to use something like the previously-mentioned version when using __version_info__, and as such store it as a tuple of ints; however, I don't quite see the point in doing so, as I doubt there are situations where you would perform mathematical operations such as addition and subtraction on portions of version numbers for any purpose besides curiosity or auto-incrementation (and even then, int() and str() can be used fairly easily). (On the other hand, there is the possibility of someone else's code expecting a numerical tuple rather than a string tuple and thus failing.)
This is, of course, my own view, and I would gladly like others' input on using a numerical tuple.
As shezi reminded me, (lexical) comparisons of number strings do not necessarily have the same result as direct numerical comparisons; leading zeroes would be required to provide for that. So in the end, storing __version_info__ (or whatever it would be called) as a tuple of integer values would allow for more efficient version comparisons.
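A quick sketch of that point - integer tuples order the way you would expect, while lexical string comparison does not:
__version_info__ = (1, 2, 10)
__version__ = '.'.join(str(part) for part in __version_info__)

assert (1, 2, 10) > (1, 2, 9)        # integer tuples compare numerically
assert not ('1.2.10' > '1.2.9')      # lexically, '1' < '9' at the first differing character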
Many of these solutions here ignore git version tags which still means you have to track version in multiple places (bad). I approached this with the following goals:
Derive all python version references from a tag in the git repo
Automate git tag/push and setup.py upload steps with a single command that takes no inputs.
How it works:
From a make release command, the last tagged version in the git repo is found and incremented. The tag is pushed back to origin.
The Makefile stores the version in src/_version.py where it will be read by setup.py and also included in the release. Do not check _version.py into source control!
setup.py command reads the new version string from package.__version__.
Details:
Makefile
# remove optional 'v' and trailing hash "v1.0-N-HASH" -> "v1.0-N"
git_describe_ver = $(shell git describe --tags | sed -E -e 's/^v//' -e 's/(.*)-.*/\1/')
git_tag_ver = $(shell git describe --abbrev=0)
next_patch_ver = $(shell python versionbump.py --patch $(call git_tag_ver))
next_minor_ver = $(shell python versionbump.py --minor $(call git_tag_ver))
next_major_ver = $(shell python versionbump.py --major $(call git_tag_ver))
.PHONY: ${MODULE}/_version.py
${MODULE}/_version.py:
	echo '__version__ = "$(call git_describe_ver)"' > $@

.PHONY: release
release: test lint mypy
	git tag -a $(call next_patch_ver)
	$(MAKE) ${MODULE}/_version.py
	python setup.py check sdist upload  # (legacy "upload" method)
	# twine upload dist/*  (preferred method)
	git push origin master --tags
The release target always increments the 3rd version digit, but you can use the next_minor_ver or next_major_ver to increment the other digits. The commands rely on the versionbump.py script that is checked into the root of the repo
versionbump.py
"""An auto-increment tool for version strings."""
import sys
import unittest
import click
from click.testing import CliRunner # type: ignore
__version__ = '0.1'
MIN_DIGITS = 2
MAX_DIGITS = 3
@click.command()
@click.argument('version')
@click.option('--major', 'bump_idx', flag_value=0, help='Increment major number.')
@click.option('--minor', 'bump_idx', flag_value=1, help='Increment minor number.')
@click.option('--patch', 'bump_idx', flag_value=2, default=True, help='Increment patch number.')
def cli(version: str, bump_idx: int) -> None:
    """Bumps a MAJOR.MINOR.PATCH version string at the specified index location or 'patch' digit.
    An optional 'v' prefix is allowed and will be included in the output if found."""
    prefix = version[0] if version[0].isalpha() else ''
    digits = version.lower().lstrip('v').split('.')

    if len(digits) > MAX_DIGITS:
        click.secho('ERROR: Too many digits', fg='red', err=True)
        sys.exit(1)

    digits = (digits + ['0'] * MAX_DIGITS)[:MAX_DIGITS]  # Extend total digits to max.
    digits[bump_idx] = str(int(digits[bump_idx]) + 1)  # Increment the desired digit.

    # Zero rightmost digits after bump position.
    for i in range(bump_idx + 1, MAX_DIGITS):
        digits[i] = '0'

    digits = digits[:max(MIN_DIGITS, bump_idx + 1)]  # Trim rightmost digits.
    click.echo(prefix + '.'.join(digits), nl=False)


if __name__ == '__main__':
    cli()  # pylint: disable=no-value-for-parameter
This does the heavy lifting how to process and increment the version number from git.
__init__.py
The my_module/_version.py file is imported into my_module/__init__.py. Put any static install config here that you want distributed with your module.
from ._version import __version__
__author__ = ''
__email__ = ''
setup.py
The last step is to read the version info from the my_module module.
from setuptools import setup, find_packages
pkg_vars = {}
with open("{MODULE}/_version.py") as fp:
    exec(fp.read(), pkg_vars)

setup(
    version=pkg_vars['__version__'],
    ...
)
Of course, for all of this to work you'll have to have at least one version tag in your repo to start.
git tag -a v0.0.1
I use a JSON file in the package dir. This fits Zooko's requirements.
Inside pkg_dir/pkg_info.json:
{"version": "0.1.0"}
Inside setup.py:
from distutils.core import setup
import json

with open('pkg_dir/pkg_info.json') as fp:
    _info = json.load(fp)

setup(
    version=_info['version'],
    ...
)
Inside pkg_dir/__init__.py:
import json
from os.path import dirname

with open(dirname(__file__) + '/pkg_info.json') as fp:
    _info = json.load(fp)

__version__ = _info['version']
I also put other information in pkg_info.json, like author. I like to use JSON because I can automate management of metadata.
Lots of work toward uniform versioning and in support of conventions has been completed since this question was first asked. Palatable options are now detailed in the Python Packaging User Guide. Also noteworthy is that version number schemes are relatively strict in Python per PEP 440, and so keeping things sane is critical if your package will be released to the Cheese Shop.
Here's a shortened breakdown of versioning options:
Read the file in setup.py (setuptools) and get the version.
Use an external build tool (to update both __init__.py as well as source control), e.g. bump2version, changes or zest.releaser.
Set the value to a __version__ global variable in a specific module.
Place the value in a simple VERSION text file for both setup.py and code to read.
Set the value via a setup.py release, and use importlib.metadata to pick it up at runtime. (Warning, there are pre-3.8 and post-3.8 versions.)
Set the value to __version__ in sample/__init__.py and import sample in setup.py.
Use setuptools_scm to extract versioning from source control so that it's the canonical reference, not code.
NOTE that (7) might be the most modern approach (build metadata is independent of code, published by automation). Also NOTE that if setup is used for package release that a simple python3 setup.py --version will report the version directly.
Also worth noting: as well as __version__ being a semi-standard in Python, so is __version_info__, which is a tuple. In the simple cases you can just do something like:
__version__ = '1.2.3'
__version_info__ = tuple([ int(num) for num in __version__.split('.')])
...and you can get the __version__ string from a file, or whatever.
There doesn't seem to be a standard way to embed a version string in a python package. Most packages I've seen use some variant of your solution, i.e. either
Embed the version in setup.py and have setup.py generate a module (e.g. version.py) containing only version info, that's imported by your package, or
The reverse: put the version info in your package itself, and import that to set the version in setup.py
arrow handles it in an interesting way.
Now (since 2e5031b)
In arrow/__init__.py:
__version__ = 'x.y.z'
In setup.py:
from arrow import __version__
setup(
    name='arrow',
    version=__version__,
    # [...]
)
Before
In arrow/__init__.py:
__version__ = 'x.y.z'
VERSION = __version__
In setup.py:
def grep(attrname):
    pattern = r"{0}\W*=\W*'([^']+)'".format(attrname)
    strval, = re.findall(pattern, file_text)
    return strval

file_text = read(fpath('arrow/__init__.py'))

setup(
    name='arrow',
    version=grep('__version__'),
    # [...]
)
I also saw another style:
>>> django.VERSION
(1, 1, 0, 'final', 0)
After several hours of trying to find the simplest reliable solution, here are the parts:
create a version.py file INSIDE the folder of your package "/mypackage":
# Store the version here so:
# 1) we don't load dependencies by storing it in __init__.py
# 2) we can import it in setup.py for the same reason
# 3) we can import it into your module
__version__ = '1.2.7'
in setup.py:
exec(open('mypackage/version.py').read())

setup(
    name='mypackage',
    version=__version__,
    # ...
)
in the main folder __init__.py:
from .version import __version__
The exec() function runs the script without importing anything, since setup.py is run before the module can be imported. You still only need to manage the version number in one place, but unfortunately it is not in setup.py. (That's the downside, but having no import bugs is the upside.)
pbr with bump2version
This solution was derived from this article
The use case - python GUI package distributed via PyInstaller. Needs to show version info.
Here is the structure of the project packagex
packagex
├── packagex
│   ├── __init__.py
│   ├── main.py
│   └── _version.py
├── packagex.spec
├── LICENSE
├── README.md
├── .bumpversion.cfg
├── requirements.txt
├── setup.cfg
└── setup.py
where setup.py is
# setup.py
import os
import setuptools

about = {}
with open("packagex/_version.py") as f:
    exec(f.read(), about)

os.environ["PBR_VERSION"] = about["__version__"]

setuptools.setup(
    setup_requires=["pbr"],
    pbr=True,
    version=about["__version__"],
)
packagex/_version.py contains just
__version__ = "0.0.1"
and packagex/__init__.py
from ._version import __version__
and for .bumpversion.cfg
[bumpversion]
current_version = 0.0.1
commit = False
tag = False
parse = (?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+)(\-(?P<release>[a-z]+)(?P<build>\d+))?
serialize =
    {major}.{minor}.{patch}-{release}{build}
    {major}.{minor}.{patch}

[bumpversion:part:release]
optional_value = prod
first_value = dev
values =
    dev
    prod

[bumpversion:file:packagex/_version.py]
Using setuptools and pbr
There is not a standard way to manage version, but the standard way to manage your packages is setuptools.
The best solution I've found overall for managing version is to use setuptools with the pbr extension. This is now my standard way of managing version.
Setting up your project for full packaging may be overkill for simple projects, but if you need to manage version, you are probably at the right level to just set everything up. Doing so also makes your package releasable at PyPi so everyone can download and use it with Pip.
PBR moves most metadata out of the setup.py tools and into a setup.cfg file that is then used as a source for most metadata, which can include version. This allows the metadata to be packaged into an executable using something like pyinstaller if needed (if so, you will probably need this info), and separates the metadata from the other package management/setup scripts. You can directly update the version string in setup.cfg manually, and it will be pulled into the *.egg-info folder when building your package releases. Your scripts can then access the version from the metadata using various methods (these processes are outlined in sections below).
When using Git for VCS/SCM, this setup is even better, as it will pull in a lot of the metadata from Git so that your repo can be your primary source of truth for some of the metadata, including version, authors, changelogs, etc. For version specifically, it will create a version string for the current commit based on git tags in the repo.
PyPA - Packaging Python Packages with SetupTools - Tutorial
PBR latest build usage documentation - How to setup an 8-line setup.py and a setup.cfg file with the metadata.
Since PBR will pull version, author, changelog and other info directly from your git repo, some of the metadata in setup.cfg can be left out and auto-generated whenever a distribution is created for your package (using setup.py).
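A minimal sketch of that setup.py/setup.cfg split, assuming a setuptools/pbr project (the metadata values are placeholders):
# setup.py
import setuptools

setuptools.setup(
    setup_requires=['pbr'],
    pbr=True,
)

# setup.cfg would then carry the metadata instead of setup.py, e.g.:
# [metadata]
# name = mypackage
# summary = One-line description
# author = me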
Get the current version in real-time
setuptools will pull the latest info in real-time using setup.py:
python setup.py --version
This will pull the latest version either from the setup.cfg file, or from the git repo, based on the latest commit that was made and tags that exist in the repo. This command doesn't update the version in a distribution though.
Updating the version metadata
When you create a distribution with setup.py (i.e. py setup.py sdist, for example), then all the current info will be extracted and stored in the distribution. This essentially runs the setup.py --version command and then stores that version info into the package.egg-info folder in a set of files that store distribution metadata.
Note on process to update version meta-data:
If you are not using pbr to pull version data from git, then just update your setup.cfg directly with new version info (easy enough, but make sure this is a standard part of your release process).
If you are using git, and you don't need to create a source or binary distribution (using python setup.py sdist or one of the python setup.py bdist_xxx commands) the simplest way to update the git repo info into your <mypackage>.egg-info metadata folder is to just run the python setup.py install command. This will run all the PBR functions related to pulling metadata from the git repo and update your local .egg-info folder, install script executables for any entry-points you have defined, and other functions you can see from the output when you run this command.
Note that the .egg-info folder is generally excluded from being stored in the git repo itself in standard Python .gitignore files (such as from Gitignore.IO), as it can be generated from your source. If it is excluded, make sure you have a standard "release process" to get the metadata updated locally before release, and any package you upload to PyPi.org or otherwise distribute must include this data to have the correct version. If you want the Git repo to contain this info, you can exclude specific files from being ignored (i.e. add !*.egg-info/PKG_INFO to .gitignore)
Accessing the version from a script
You can access the metadata from the current build within Python scripts in the package itself. For version, for example, there are several ways to do this I have found so far:
## This one is a new built-in as of Python 3.8.0 and should become the standard
from importlib.metadata import version
v0 = version("mypackage")
print('v0 {}'.format(v0))
## I don't like this one because the version method is hidden
import pkg_resources # part of setuptools
v1 = pkg_resources.require("mypackage")[0].version
print('v1 {}'.format(v1))
# Probably best for pre v3.8.0 - the output without .version is just a longer string with
# both the package name, a space, and the version string
import pkg_resources # part of setuptools
v2 = pkg_resources.get_distribution('mypackage').version
print('v2 {}'.format(v2))
## This one seems to be slower, and with pyinstaller makes the exe a lot bigger
from pbr.version import VersionInfo
v3 = VersionInfo('mypackage').release_string()
print('v3 {}'.format(v3))
You can put one of these directly in your __init__.py for the package to extract the version info as follows, similar to some other answers:
__all__ = (
    '__version__',
    'my_package_name'
)
import pkg_resources # part of setuptools
__version__ = pkg_resources.get_distribution("mypackage").version
Create a file named _version.txt in the same folder as __init__.py and write the version as a single line:
0.8.2
Read this information from the _version.txt file in __init__.py:
import os


def get_version():
    with open(os.path.join(os.path.abspath(os.path.dirname(__file__)), "_version.txt")) as f:
        return f.read().strip()


__version__ = get_version()
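To keep setup.py in sync with the same file, a sketch along these lines should work (the package path "mypackage" is an assumption):
# setup.py
import os
from setuptools import setup

here = os.path.abspath(os.path.dirname(__file__))
with open(os.path.join(here, "mypackage", "_version.txt")) as f:
    version = f.read().strip()

setup(
    name="mypackage",
    version=version,
)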
I described a standard and modern way here, relying on setuptools_scm.
This pattern has worked successfully for dozens of published packages over the past years, so I can warmly recommend it.
Note that you do not need the getversion package to implement this pattern. It just happens that the getversion documentation hosts this tip.
I prefer to read the package version from the installation environment.
This is my src/foo/_version.py:
from pkg_resources import get_distribution
__version__ = get_distribution('foo').version
Make sure foo is always installed; that's why a src/ layout is required, to prevent foo from being imported without installation.
In the setup.py, I use setuptools-scm to generate the version automatically.
Update 2022-07-05:
There is another way, which is my favourite now. Use setuptools-scm to generate a _version.py file.
setup(
    ...
    use_scm_version={
        'write_to': 'src/foo/_version.py',
        'write_to_template':
            '"""Generated version file."""\n'
            '__version__ = "{version}"\n',
    },
)
)
Using setuptools and pyproject.toml
Setuptools now offers a way to obtain the version dynamically via pyproject.toml.
Reproducing the example here, you can create something like the following in your pyproject.toml:
# ...
[project]
name = "my_package"
dynamic = ["version"]
# ...
[tool.setuptools.dynamic]
version = {attr = "my_package.__version__"}
Use a version.py file containing only a __version__ = <VERSION> parameter. In the setup.py file, import __version__ and put its value in the setup call like this:
version=__version__
Another way is to use just a setup.py file with version=<CURRENT_VERSION> - the CURRENT_VERSION is hardcoded.
Since we don't want to manually change the version in the file every time we create a new tag (ready to release a new package version), we can use the following..
I highly recommend bumpversion package. I've been using it for years to bump a version.
start by adding version=<VERSION> to your setup.py file if you don't have it already.
You should use a short script like this every time you bump a version:
bumpversion (patch|minor|major) - choose only one option
git push
git push --tags
Then add one file per repo called: .bumpversion.cfg:
[bumpversion]
current_version = <CURRENT_TAG>
commit = True
tag = True
tag_name = {new_version}
[bumpversion:file:<RELATIVE_PATH_TO_SETUP_FILE>]
Note:
You can use __version__ parameter under version.py file like it was suggested in other posts and update the bumpversion file like this:
[bumpversion:file:<RELATIVE_PATH_TO_VERSION_FILE>]
You must git commit or git reset everything in your repo, otherwise you'll get a dirty repo error.
Make sure that your virtual environment includes the bumpversion package; without it, this will not work.
For what it's worth, if you're using NumPy distutils, numpy.distutils.misc_util.Configuration has a make_svn_version_py() method that embeds the revision number inside package.__svn_version__ in the variable version.
If you use CVS (or RCS) and want a quick solution, you can use:
__version__ = "$Revision: 1.1 $"[11:-2]
__version_info__ = tuple([int(s) for s in __version__.split(".")])
(Of course, the revision number will be substituted for you by CVS.)
This gives you a print-friendly version and a version info that you can use to check that the module you are importing has at least the expected version:
import my_module
assert my_module.__version_info__ >= (1, 1)
