How to configure OS-specific dependencies in a pyproject.toml file [Maturin] - python

I have a Rust and Python project that I am building using Maturin (https://github.com/PyO3/maturin). It requires a pyproject.toml file for the Python dependencies.
I have a dependency on uvloop, which is not supported on Windows and ARM devices. I have added code that conditionally imports these packages, but I do not know how to conditionally install them. Right now, these packages are installed by default on every OS.
Here is the pyproject.toml file.
[project]
name = "robyn"
dependencies = [
    "watchdog>=2.1.3,<3",
    "uvloop>=0.16.0,<0.16.1",
    "multiprocess>=0.70.12.2,<0.70.12.3"
]
And the GitHub link, in case anyone is interested: https://github.com/sansyrox/robyn/pull/94/files#diff-50c86b7ed8ac2cf95bd48334961bf0530cdc77b5a56f852c5c61b89d735fd711R21

If you don't want to install it on Windows, specify it like this:
# assuming you're using poetry
uvloop = {version = "^0.16.0", markers = 'sys_platform != "win32"'}

As wim indicated in their comment, https://peps.python.org/pep-0508/ specifies how to write constraints on package requirements.
In addition to restricting the package to a range of versions, you can restrict package installations based on various markers, such as sys_platform for the OS, separated from your other requirements with a semicolon.
I haven't tested this with pyproject.toml, but the following works in setup.cfg:
[options]
install_requires =
    uvloop ; sys_platform != "win32"
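The same PEP 508 marker syntax is accepted in a PEP 621 [project] dependencies list, so a hedged sketch of the questioner's pyproject.toml might look like this (the platform_machine guard for ARM is an assumption; adjust it to the machines you want to exclude):
[project]
name = "robyn"
dependencies = [
    "watchdog>=2.1.3,<3",
    "uvloop>=0.16.0,<0.16.1 ; sys_platform != 'win32' and platform_machine != 'aarch64'",
    "multiprocess>=0.70.12.2,<0.70.12.3"
]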


how to reference metadata in a pyproject.toml file?

I was previously using setup.py to package my Python libs. As it seems pyproject.toml is the future way of setuptools I decided to migrate before my next releases.
In setup.py I was using the following string to define the link to the downloadable tarball:
setup(
    version = "2.13.4",
    download_url = "https://github.com/xx/xx/archive/v${metadata:version}.tar.gz"
)
My objective is to set up the version number just once. Is it still possible in pyproject.toml and if yes how ?
I tried the following, but it does not include the version parameter in the URL:
[project]
version = "2.13.4"
[project.urls]
Download = "https://github.com/xx/xx/archive/v${metadata:version}.tar.gz"

Make python script executable via pip using pyproject.toml [duplicate]

I'm trying to add a pyproject.toml to a project that's been using setup.py in order to enable support by pipx. I'd like to specify the command line scripts the project includes in pyproject.toml, but all the guides I can find give instructions for use with poetry, which I am not using.
I also don't want to specify entry points to modules - I already have working command line scripts and just want to specify those.
Is there a proper place in pyproject.toml to specify command line scripts?
Not sure it matters, but the package in question is cutlet.
Update July 2022: if your TOML file uses setuptools as its build system, setuptools will happily create and install a console script. For example, my pyproject.toml file starts like this:
[build-system]
requires = ["setuptools>=61.0"]
build-backend = "setuptools.build_meta"
Extend your pyproject.toml file with an entry like this, naming the package, module and entry-point function names:
[project.scripts]
my-client = "my_package.my_module:main_cli"
Then run the install sequence:
pip3 install .
And setuptools will install a shell script named my-client somewhere appropriate; for me, in my Py3.9 virtual environment's bin directory (~/.virtualenvs/py39/bin).
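For reference, a minimal sketch of the entry-point function that the [project.scripts] table above points at; my_package/my_module.py and main_cli are the hypothetical names from the example, not code from the actual project:
# my_package/my_module.py (hypothetical)
def main_cli() -> None:
    """Entry point that the generated my-client script will call."""
    print("hello from my-client")

if __name__ == "__main__":
    main_cli()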
I was doing a similar thing, upgrading a package that had a setup.py, although I had no existing scripts. With the rewrite to using pyproject.toml I dropped the old setup.py file entirely.
FWIW I realize the OP asked for a way to install existing scripts, which I didn't provide. This answer tells setuptools to create and install new scripts.
Update Feb 2023: thanks for all the votes. If you're cutting corners to meet arbitrary management deadlines, just copy-paste this short pyproject.toml file and adjust:
[build-system]
requires = ["setuptools>=61.0"]
build-backend = "setuptools.build_meta"
[project]
name = "my_client"
version = "1.2.3"
authors = [{name="Ubr Programmer", email="ubr@gmailington.com"}]
description = "Client for my awesome system"
readme = "README.md"
dependencies = ["cachetools","requests"]
requires-python = ">=3.9"
[project.scripts]
my-client = "my_package.my_module:main_cli"
[project.urls]
"Homepage" = "https://github.com/your_name_here/something"
"Bug Tracker" = "https://github.com/your_name_here/something/issues"
Is there a proper place in pyproject.toml to specify command line scripts?
PEP 566 (Metadata 2.1) only defines the core metadata specifications, so the answer depends on your build system (note: PEP 518 defines the build-system concept).
If you use an existing build tool such as setuptools, poetry, or flit, you can only add such options in pyproject.toml if that tool supports command line scripts (console_scripts) there. If you have your own build tool, you will need to implement a parser for the command line scripts in pyproject.toml.
Lastly, you can check the list below to see which major build systems support command line scripts (console_scripts) in pyproject.toml (as of October 2020):
setuptools: not implemented yet according to PEP 621
poetry: yes, here's part of the implementation.
flit: yes.
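For comparison, a hedged sketch of how Poetry (listed as "yes" above) declares a console script in pyproject.toml, reusing the hypothetical names from the earlier example:
[tool.poetry.scripts]
my-client = "my_package.my_module:main_cli"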

How to include local Python dependency in setup.cfg

A similar question has been asked before, but this time I am asking for the newer setuptools config file — setup.cfg.
Consider my use case, where I have a project with multiple Python packages that depends on each other. For simplicity let's say mypkg1 depends on mypkg2:
mypkg1/
    mypkg1/
    setup.cfg
mypkg2/
    mypkg2/
    setup.cfg
How do I write the setup.cfg file for mypkg1 such that it depends on a local copy of mypkg2?
[metadata]
name = mypkg1
version = 0.0.1

[options]
packages = find:
python_requires = >= 3.7
install_requires =
    ../mypkg2  # Does not work
The answer cannot be to distribute mypkg2 to a package repository (e.g., PyPI) or via a VCS release (e.g., a GitHub Release), as these solutions make the package external rather than local.
Related Questions
How to include and install local dependencies in setup.py in Python?
This question is for setup.py which does not work for setup.cfg.
https://github.com/pypa/setuptools/issues/1951
The discussion does not indicate any support for this nor any plans for this to be a feature.
Direct references still work. It looks like this:
install_requires =
    my_package @ file:///home/code/my_package
Note the triple slash in file:/// - the first two are the usual scheme://, the third one is what separates the empty <host> (defaults to localhost) from the path.
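The same direct-reference syntax carries over to a PEP 621 pyproject.toml; a sketch with a hypothetical local path:
[project]
name = "mypkg1"
version = "0.0.1"
dependencies = [
    "mypkg2 @ file:///home/code/mypkg2",
]
Note that distributions containing direct references are generally rejected by PyPI, so this suits local or private installs.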

How can I specify library versions in setup.py?

In my setup.py file, I've specified a few libraries needed to run my project:
setup(
    # ...
    install_requires = [
        'django-pipeline',
        'south'
    ]
)
How can I specify required versions of these libraries?
I'm not sure about buildout, however, for setuptools/distribute, you specify version info with the comparison operators (like ==, >=, or <=).
For example:
install_requires = ['django-pipeline==1.1.22', 'south>=0.7']
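A sketch showing how those operators can be combined to express ranges or exclusions (the version numbers here are illustrative, not recommendations):
from setuptools import setup

setup(
    # ...
    install_requires=[
        'django-pipeline>=1.1,<2.0',   # any release from 1.1 up to, but not including, 2.0
        'south>=0.7,!=0.7.1',          # at least 0.7, skipping one (hypothetical) broken release
    ],
)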
You can add them to your requirements.txt file along with the version.
For example:
django-pipeline==1.1.22
south>=0.7
and then in your setup.py
import os
from setuptools import setup

with open('requirements.txt') as f:
    required = f.read().splitlines()

setup(
    ...
    install_requires=required,
    ...
)
Reading from the docs -
It is not considered best practice to use install_requires to pin
dependencies to specific versions, or to specify sub-dependencies
(i.e. dependencies of your dependencies). This is overly-restrictive,
and prevents the user from gaining the benefit of dependency
upgrades.
https://packaging.python.org/discussions/install-requires-vs-requirements/#id5

Standard way to embed version into Python package?

Is there a standard way to associate version string with a Python package in such way that I could do the following?
import foo
print(foo.version)
I would imagine there's some way to retrieve that data without any extra hardcoding, since minor/major strings are specified in setup.py already. An alternative solution that I found was to import __version__ in my foo/__init__.py and then have __version__.py generated by setup.py.
Not directly an answer to your question, but you should consider naming it __version__, not version.
This is almost a quasi-standard. Many modules in the standard library use __version__, and this is also used in lots of 3rd-party modules, so it's the quasi-standard.
Usually, __version__ is a string, but sometimes it's also a float or tuple.
As mentioned by S.Lott (Thank you!), PEP 8 says it explicitly:
Module Level Dunder Names
Module level "dunders" (i.e. names with two leading and two trailing
underscores) such as __all__, __author__, __version__, etc.
should be placed after the module docstring but before any import
statements except from __future__ imports.
You should also make sure that the version number conforms to the format described in PEP 440 (PEP 386 being a previous version of this standard).
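As a hedged illustration, using the third-party packaging library (not mentioned in the answer above), PEP 440-compliant strings compare the way you would expect:
from packaging.version import Version

assert Version("1.2.3") < Version("1.2.10")    # numeric ordering, not lexical
assert Version("1.2.3rc1") < Version("1.2.3")  # pre-releases sort before the final release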
I use a single _version.py file as the "one canonical place" to store version information:
It provides a __version__ attribute.
It provides the standard metadata version. Therefore it will be detected by pkg_resources or other tools that parse the package metadata (EGG-INFO and/or PKG-INFO, PEP 0345).
It doesn't import your package (or anything else) when building your package, which can cause problems in some situations. (See the comments below about what problems this can cause.)
There is only one place that the version number is written down, so there is only one place to change it when the version number changes, and there is less chance of inconsistent versions.
Here is how it works: the "one canonical place" to store the version number is a .py file, named "_version.py" which is in your Python package, for example in myniftyapp/_version.py. This file is a Python module, but your setup.py doesn't import it! (That would defeat feature 3.) Instead your setup.py knows that the contents of this file is very simple, something like:
__version__ = "3.6.5"
And so your setup.py opens the file and parses it, with code like:
import re

VERSIONFILE = "myniftyapp/_version.py"
verstrline = open(VERSIONFILE, "rt").read()
VSRE = r"^__version__ = ['\"]([^'\"]*)['\"]"
mo = re.search(VSRE, verstrline, re.M)
if mo:
    verstr = mo.group(1)
else:
    raise RuntimeError("Unable to find version string in %s." % (VERSIONFILE,))
Then your setup.py passes that string as the value of the "version" argument to setup(), thus satisfying feature 2.
To satisfy feature 1, you can have your package (at run-time, not at setup time!) import the _version file from myniftyapp/__init__.py like this:
from ._version import __version__
Here is an example of this technique that I've been using for years.
The code in that example is a bit more complicated, but the simplified example that I wrote into this comment should be a complete implementation.
Here is example code of importing the version.
If you see anything wrong with this approach, please let me know.
Rewritten 2017-05
After 13+ years of writing Python code and managing various packages, I came to the conclusion that DIY is maybe not the best approach.
I started using the pbr package for dealing with versioning in my packages. If you are using git as your SCM, this will fit into your workflow like magic, saving you weeks of work (you will be surprised how complex the issue can be).
As of today, pbr has 12M monthly downloads, and reaching this level didn't involve any dirty tricks. It was only one thing -- fixing a common packaging problem in a very simple way.
pbr can take on more of the package maintenance burden and is not limited to versioning, but it does not force you to adopt all its benefits.
So to give you an idea of what it looks like to adopt pbr in one commit, have a look at switching packaging to pbr.
You will probably observe that the version is not stored in the repository at all; PBR detects it from Git branches and tags.
There is no need to worry about what happens when you do not have a git repository, because pbr "compiles" and caches the version when you package or install the application, so there is no runtime dependency on git.
Old solution
Here is the best solution I've seen so far and it also explains why:
Inside yourpackage/version.py:
# Store the version here so:
# 1) we don't load dependencies by storing it in __init__.py
# 2) we can import it in setup.py for the same reason
# 3) we can import it into your module
__version__ = '0.12'
Inside yourpackage/__init__.py:
from .version import __version__
Inside setup.py:
exec(open('yourpackage/version.py').read())

setup(
    ...
    version=__version__,
    ...
)
If you know another approach that seems to be better let me know.
Per the deferred [STOP PRESS: rejected] PEP 396 (Module Version Numbers), there is a proposed way to do this. It describes, with rationale, an (admittedly optional) standard for modules to follow. Here's a snippet:
When a module (or package) includes a version number, the version SHOULD be available in the __version__ attribute.
For modules which live inside a namespace package, the module SHOULD include the __version__ attribute. The namespace package itself SHOULD NOT include its own __version__ attribute.
The __version__ attribute's value SHOULD be a string.
There is a slightly simpler alternative to some of the other answers:
__version_info__ = ('1', '2', '3')
__version__ = '.'.join(__version_info__)
(And it would be fairly simple to convert auto-incrementing portions of version numbers to a string using str().)
Of course, from what I've seen, people tend to use something like the previously-mentioned version when using __version_info__, and as such store it as a tuple of ints; however, I don't quite see the point in doing so, as I doubt there are situations where you would perform mathematical operations such as addition and subtraction on portions of version numbers for any purpose besides curiosity or auto-incrementation (and even then, int() and str() can be used fairly easily). (On the other hand, there is the possibility of someone else's code expecting a numerical tuple rather than a string tuple and thus failing.)
This is, of course, my own view, and I would gladly like others' input on using a numerical tuple.
As shezi reminded me, (lexical) comparisons of number strings do not necessarily have the same result as direct numerical comparisons; leading zeroes would be required to provide for that. So in the end, storing __version_info__ (or whatever it would be called) as a tuple of integer values would allow for more efficient version comparisons.
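A quick sketch of that pitfall:
# Lexical comparison of number strings disagrees with numeric comparison
# once a component reaches two digits:
assert ('1', '10', '0') < ('1', '9', '0')   # lexically True, semantically wrong
assert (1, 10, 0) > (1, 9, 0)               # integer tuples compare as intended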
Many of these solutions here ignore git version tags which still means you have to track version in multiple places (bad). I approached this with the following goals:
Derive all python version references from a tag in the git repo
Automate git tag/push and setup.py upload steps with a single command that takes no inputs.
How it works:
From a make release command, the last tagged version in the git repo is found and incremented. The tag is pushed back to origin.
The Makefile stores the version in src/_version.py where it will be read by setup.py and also included in the release. Do not check _version.py into source control!
setup.py command reads the new version string from package.__version__.
Details:
Makefile
# remove optional 'v' and trailing hash "v1.0-N-HASH" -> "v1.0-N"
git_describe_ver = $(shell git describe --tags | sed -E -e 's/^v//' -e 's/(.*)-.*/\1/')
git_tag_ver = $(shell git describe --abbrev=0)
next_patch_ver = $(shell python versionbump.py --patch $(call git_tag_ver))
next_minor_ver = $(shell python versionbump.py --minor $(call git_tag_ver))
next_major_ver = $(shell python versionbump.py --major $(call git_tag_ver))

.PHONY: ${MODULE}/_version.py
${MODULE}/_version.py:
	echo '__version__ = "$(call git_describe_ver)"' > $@

.PHONY: release
release: test lint mypy
	git tag -a $(call next_patch_ver)
	$(MAKE) ${MODULE}/_version.py
	python setup.py check sdist upload  # (legacy "upload" method)
	# twine upload dist/* (preferred method)
	git push origin master --tags
The release target always increments the 3rd version digit, but you can use the next_minor_ver or next_major_ver to increment the other digits. The commands rely on the versionbump.py script that is checked into the root of the repo
versionbump.py
"""An auto-increment tool for version strings."""
import sys
import unittest
import click
from click.testing import CliRunner # type: ignore
__version__ = '0.1'
MIN_DIGITS = 2
MAX_DIGITS = 3
#click.command()
#click.argument('version')
#click.option('--major', 'bump_idx', flag_value=0, help='Increment major number.')
#click.option('--minor', 'bump_idx', flag_value=1, help='Increment minor number.')
#click.option('--patch', 'bump_idx', flag_value=2, default=True, help='Increment patch number.')
def cli(version: str, bump_idx: int) -> None:
"""Bumps a MAJOR.MINOR.PATCH version string at the specified index location or 'patch' digit. An
optional 'v' prefix is allowed and will be included in the output if found."""
prefix = version[0] if version[0].isalpha() else ''
digits = version.lower().lstrip('v').split('.')
if len(digits) > MAX_DIGITS:
click.secho('ERROR: Too many digits', fg='red', err=True)
sys.exit(1)
digits = (digits + ['0'] * MAX_DIGITS)[:MAX_DIGITS] # Extend total digits to max.
digits[bump_idx] = str(int(digits[bump_idx]) + 1) # Increment the desired digit.
# Zero rightmost digits after bump position.
for i in range(bump_idx + 1, MAX_DIGITS):
digits[i] = '0'
digits = digits[:max(MIN_DIGITS, bump_idx + 1)] # Trim rightmost digits.
click.echo(prefix + '.'.join(digits), nl=False)
if __name__ == '__main__':
cli() # pylint: disable=no-value-for-parameter
This does the heavy lifting of processing and incrementing the version number from git.
__init__.py
The my_module/_version.py file is imported into my_module/__init__.py. Put any static install config here that you want distributed with your module.
from ._version import __version__
__author__ = ''
__email__ = ''
setup.py
The last step is to read the version info from the my_module module.
from setuptools import setup, find_packages

pkg_vars = {}
with open("{MODULE}/_version.py") as fp:
    exec(fp.read(), pkg_vars)

setup(
    version=pkg_vars['__version__'],
    ...
)
Of course, for all of this to work you'll have to have at least one version tag in your repo to start.
git tag -a v0.0.1
I use a JSON file in the package dir. This fits Zooko's requirements.
Inside pkg_dir/pkg_info.json:
{"version": "0.1.0"}
Inside setup.py:
from distutils.core import setup
import json

with open('pkg_dir/pkg_info.json') as fp:
    _info = json.load(fp)

setup(
    version=_info['version'],
    ...
)
Inside pkg_dir/__init__.py:
import json
from os.path import dirname

with open(dirname(__file__) + '/pkg_info.json') as fp:
    _info = json.load(fp)

__version__ = _info['version']
I also put other information in pkg_info.json, like author. I like to use JSON because I can automate management of metadata.
Lots of work toward uniform versioning and in support of conventions has been completed since this question was first asked. Palatable options are now detailed in the Python Packaging User Guide. Also noteworthy is that version number schemes are relatively strict in Python per PEP 440, and so keeping things sane is critical if your package will be released to the Cheese Shop.
Here's a shortened breakdown of versioning options:
1. Read the file in setup.py (setuptools) and get the version.
2. Use an external build tool (to update both __init__.py as well as source control), e.g. bump2version, changes or zest.releaser.
3. Set the value to a __version__ global variable in a specific module.
4. Place the value in a simple VERSION text file for both setup.py and code to read.
5. Set the value via a setup.py release, and use importlib.metadata to pick it up at runtime. (Warning, there are pre-3.8 and post-3.8 versions.)
6. Set the value to __version__ in sample/__init__.py and import sample in setup.py.
7. Use setuptools_scm to extract versioning from source control so that it's the canonical reference, not code.
NOTE that (7) might be the most modern approach (build metadata is independent of code, published by automation). Also NOTE that if setup is used for package release that a simple python3 setup.py --version will report the version directly.
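For option 7, a minimal sketch of a setuptools_scm configuration in pyproject.toml (assuming a git repository with at least one version tag; the project name is a placeholder):
[build-system]
requires = ["setuptools>=64", "setuptools_scm>=8"]
build-backend = "setuptools.build_meta"

[project]
name = "sample"
dynamic = ["version"]

[tool.setuptools_scm]
# the version is derived from the latest git tag at build time; nothing is hardcoded in the source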
Also worth noting is that as well as __version__ being a semi-std. in python so is __version_info__ which is a tuple, in the simple cases you can just do something like:
__version__ = '1.2.3'
__version_info__ = tuple([ int(num) for num in __version__.split('.')])
...and you can get the __version__ string from a file, or whatever.
There doesn't seem to be a standard way to embed a version string in a Python package. Most packages I've seen use some variant of your solution, i.e. either
Embed the version in setup.py and have setup.py generate a module (e.g. version.py) containing only version info, that's imported by your package, or
The reverse: put the version info in your package itself, and import that to set the version in setup.py
arrow handles it in an interesting way.
Now (since 2e5031b)
In arrow/__init__.py:
__version__ = 'x.y.z'
In setup.py:
from arrow import __version__

setup(
    name='arrow',
    version=__version__,
    # [...]
)
Before
In arrow/__init__.py:
__version__ = 'x.y.z'
VERSION = __version__
In setup.py:
def grep(attrname):
    pattern = r"{0}\W*=\W*'([^']+)'".format(attrname)
    strval, = re.findall(pattern, file_text)
    return strval

file_text = read(fpath('arrow/__init__.py'))

setup(
    name='arrow',
    version=grep('__version__'),
    # [...]
)
I also saw another style:
>>> django.VERSION
(1, 1, 0, 'final', 0)
After several hours of trying to find the simplest reliable solution, here are the parts:
create a version.py file INSIDE the folder of your package "/mypackage":
# Store the version here so:
# 1) we don't load dependencies by storing it in __init__.py
# 2) we can import it in setup.py for the same reason
# 3) we can import it into your module
__version__ = '1.2.7'
in setup.py:
exec(open('mypackage/version.py').read())

setup(
    name='mypackage',
    version=__version__,
    ...
)
in the main folder __init__.py:
from .version import __version__
The exec() function runs the version file without importing your package, since setup.py is run before the module can be imported. You still only need to manage the version number in one file, in one place, but unfortunately it is not in setup.py. (That's the downside, but having no import bugs is the upside.)
pbr with bump2version
This solution was derived from this article
The use case - python GUI package distributed via PyInstaller. Needs to show version info.
Here is the structure of the project packagex
packagex
├── packagex
│   ├── __init__.py
│   ├── main.py
│   └── _version.py
├── packagex.spec
├── LICENSE
├── README.md
├── .bumpversion.cfg
├── requirements.txt
├── setup.cfg
└── setup.py
where setup.py is
# setup.py
import os

import setuptools

about = {}
with open("packagex/_version.py") as f:
    exec(f.read(), about)

os.environ["PBR_VERSION"] = about["__version__"]

setuptools.setup(
    setup_requires=["pbr"],
    pbr=True,
    version=about["__version__"],
)
packagex/_version.py contains just
__version__ = "0.0.1"
and packagex/__init__.py
from ._version import __version__
and for .bumpversion.cfg
[bumpversion]
current_version = 0.0.1
commit = False
tag = False
parse = (?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+)(\-(?P<release>[a-z]+)(?P<build>\d+))?
serialize =
    {major}.{minor}.{patch}-{release}{build}
    {major}.{minor}.{patch}

[bumpversion:part:release]
optional_value = prod
first_value = dev
values =
    dev
    prod

[bumpversion:file:packagex/_version.py]
Using setuptools and pbr
There is not a standard way to manage version, but the standard way to manage your packages is setuptools.
The best solution I've found overall for managing version is to use setuptools with the pbr extension. This is now my standard way of managing version.
Setting up your project for full packaging may be overkill for simple projects, but if you need to manage version, you are probably at the right level to just set everything up. Doing so also makes your package releasable at PyPi so everyone can download and use it with Pip.
PBR moves most metadata out of the setup.py tools and into a setup.cfg file that is then used as a source for most metadata, which can include version. This allows the metadata to be packaged into an executable using something like pyinstaller if needed (if so, you will probably need this info), and separates the metadata from the other package management/setup scripts. You can directly update the version string in setup.cfg manually, and it will be pulled into the *.egg-info folder when building your package releases. Your scripts can then access the version from the metadata using various methods (these processes are outlined in sections below).
When using Git for VCS/SCM, this setup is even better, as it will pull in a lot of the metadata from Git so that your repo can be your primary source of truth for some of the metadata, including version, authors, changelogs, etc. For version specifically, it will create a version string for the current commit based on git tags in the repo.
PyPA - Packaging Python Packages with SetupTools - Tutorial
PBR latest build usage documentation - How to setup an 8-line setup.py and a setup.cfg file with the metadata.
As PBR pulls version, author, changelog and other info directly from your git repo, some of the metadata in setup.cfg can be left out and auto-generated whenever a distribution is created for your package (using setup.py).
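As a sketch, the minimal pbr-driven setup.py those docs describe looks roughly like this (all real metadata lives in setup.cfg and the git history):
# setup.py - name, author, version, etc. come from setup.cfg and git via pbr
import setuptools

setuptools.setup(
    setup_requires=['pbr'],
    pbr=True,
)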
Get the current version in real-time
setuptools will pull the latest info in real-time using setup.py:
python setup.py --version
This will pull the latest version either from the setup.cfg file, or from the git repo, based on the latest commit that was made and tags that exist in the repo. This command doesn't update the version in a distribution though.
Updating the version metadata
When you create a distribution with setup.py (i.e. py setup.py sdist, for example), then all the current info will be extracted and stored in the distribution. This essentially runs the setup.py --version command and then stores that version info into the package.egg-info folder in a set of files that store distribution metadata.
Note on process to update version meta-data:
If you are not using pbr to pull version data from git, then just update your setup.cfg directly with new version info (easy enough, but make sure this is a standard part of your release process).
If you are using git, and you don't need to create a source or binary distribution (using python setup.py sdist or one of the python setup.py bdist_xxx commands) the simplest way to update the git repo info into your <mypackage>.egg-info metadata folder is to just run the python setup.py install command. This will run all the PBR functions related to pulling metadata from the git repo and update your local .egg-info folder, install script executables for any entry-points you have defined, and other functions you can see from the output when you run this command.
Note that the .egg-info folder is generally excluded from being stored in the git repo itself in standard Python .gitignore files (such as from Gitignore.IO), as it can be generated from your source. If it is excluded, make sure you have a standard "release process" to get the metadata updated locally before release, and any package you upload to PyPi.org or otherwise distribute must include this data to have the correct version. If you want the Git repo to contain this info, you can exclude specific files from being ignored (i.e. add !*.egg-info/PKG_INFO to .gitignore)
Accessing the version from a script
You can access the metadata from the current build within Python scripts in the package itself. For version, for example, there are several ways to do this I have found so far:
## This one is a new built-in as of Python 3.8.0 and should become the standard
from importlib.metadata import version
v0 = version("mypackage")
print('v0 {}'.format(v0))

## I don't like this one because the version method is hidden
import pkg_resources  # part of setuptools
v1 = pkg_resources.require("mypackage")[0].version
print('v1 {}'.format(v1))

# Probably best for pre-3.8.0 - the output without .version is just a longer string with
# both the package name, a space, and the version string
import pkg_resources  # part of setuptools
v2 = pkg_resources.get_distribution('mypackage').version
print('v2 {}'.format(v2))

## This one seems to be slower, and with pyinstaller makes the exe a lot bigger
from pbr.version import VersionInfo
v3 = VersionInfo('mypackage').release_string()
print('v3 {}'.format(v3))
You can put one of these directly in your __init__.py for the package to extract the version info as follows, similar to some other answers:
__all__ = (
    '__version__',
    'my_package_name'
)

import pkg_resources  # part of setuptools
__version__ = pkg_resources.get_distribution("mypackage").version
Create a file named _version.txt in the same folder as __init__.py and write the version as a single line:
0.8.2
Read this information from the _version.txt file in __init__.py:
import os

def get_version():
    with open(os.path.join(os.path.abspath(os.path.dirname(__file__)), "_version.txt")) as f:
        return f.read().strip()

__version__ = get_version()
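One hedged caveat not covered above: for the text file to ship with an installed package, it usually has to be declared as package data, e.g. in setup.cfg:
[options.package_data]
* = _version.txt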
I described a standard and modern way here, relying on setuptools_scm.
This pattern has worked successfully for dozens of published packages over the past years, so I can warmly recommend it.
Note that you do not need the getversion package to implement this pattern. It just happens that the getversion documentation hosts this tip.
I prefer to read the package version from installation environment.
This is my src/foo/_version.py:
from pkg_resources import get_distribution
__version__ = get_distribution('foo').version
Make sure foo is always already installed; that's why a src/ layout is required, to prevent foo from being imported without installation.
In the setup.py, I use setuptools-scm to generate the version automatically.
Update (2022-07-05):
There is another way, which is my favourite now: use setuptools-scm to generate a _version.py file.
setup(
    ...
    use_scm_version={
        'write_to': 'src/foo/_version.py',
        'write_to_template':
            '"""Generated version file."""\n'
            '__version__ = "{version}"\n',
    },
)
Using setuptools and pyproject.toml
Setuptools now offers a way to dynamically get version in pyproject.toml
Reproducing the example here, you can create something like the following in your pyproject.toml
# ...
[project]
name = "my_package"
dynamic = ["version"]
# ...
[tool.setuptools.dynamic]
version = {attr = "my_package.__version__"}
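A hedged variant of the same mechanism: setuptools can also read the dynamic version from a plain text file instead of an attribute (the filename here is an assumption):
[tool.setuptools.dynamic]
version = {file = "VERSION"}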
Use a version.py file containing only a __version__ = <VERSION> parameter. In the setup.py file, import the __version__ parameter and put its value in the setup.py file like this:
version=__version__
Another way is to use just a setup.py file with version=<CURRENT_VERSION> - the CURRENT_VERSION is hardcoded.
Since we don't want to manually change the version in the file every time we create a new tag (ready to release a new package version), we can use the following..
I highly recommend bumpversion package. I've been using it for years to bump a version.
start by adding version=<VERSION> to your setup.py file if you don't have it already.
You should use a short script like this every time you bump a version:
bumpversion (patch|minor|major) - choose only one option
git push
git push --tags
Then add one file per repo called .bumpversion.cfg:
[bumpversion]
current_version = <CURRENT_TAG>
commit = True
tag = True
tag_name = {new_version}
[bumpversion:file:<RELATIVE_PATH_TO_SETUP_FILE>]
Note:
You can use __version__ parameter under version.py file like it was suggested in other posts and update the bumpversion file like this:
[bumpversion:file:<RELATIVE_PATH_TO_VERSION_FILE>]
You must git commit or git reset everything in your repo, otherwise you'll get a dirty repo error.
Make sure that your virtual environment includes the bumpversion package; without it, this will not work.
For what it's worth, if you're using NumPy distutils, numpy.distutils.misc_util.Configuration has a make_svn_version_py() method that embeds the revision number inside package.__svn_version__ in the variable version.
If you use CVS (or RCS) and want a quick solution, you can use:
__version__ = "$Revision: 1.1 $"[11:-2]
__version_info__ = tuple([int(s) for s in __version__.split(".")])
(Of course, the revision number will be substituted for you by CVS.)
This gives you a print-friendly version and a version info that you can use to check that the module you are importing has at least the expected version:
import my_module
assert my_module.__version_info__ >= (1, 1)
