Python's poetry dependency manager allows specifying optional dependencies via a command:
$ poetry add --optional redis
Which results in this configuration:
[tool.poetry.dependencies]
python = "^3.8"
redis = {version="^3.4.1", optional=true}
However, how do you actually install them? The docs seem to hint at:
$ poetry install -E redis
but that just throws an error:
Installing dependencies from lock file
[ValueError]
Extra [redis] is not specified.
You need to add a tool.poetry.extras group to your pyproject.toml if you want to use the -E flag during install, as described in this section of the docs:
[tool.poetry.extras]
caching = ["redis"]
The key refers to the word that you use with poetry install -E, and the value is a list of packages that were marked as --optional when they were added. There currently is no support for making optional packages part of a specific group during their addition, so you have to maintain this section in your pyproject.toml file by hand.
The reason behind this additional layer of abstraction is that extra-installs usually refer to some optional functionality (in this case caching) that is enabled through the installation of one or more dependencies (in this case just redis). poetry simply mimics setuptools' definition of extra-installs here, which might explain why it's so sparingly documented.
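Putting it together, a minimal working sketch (reusing the names from above) is:
[tool.poetry.dependencies]
python = "^3.8"
redis = {version = "^3.4.1", optional = true}
[tool.poetry.extras]
caching = ["redis"]
which can then be installed with:
$ poetry install -E caching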
I will add that not only do you have to add this extras section by hand, your optional dependencies also cannot be in the dev section.
Example of code that won't work:
[tool.poetry]
name = "yolo"
version = "1.0.0"
description = ""
authors = []
[tool.poetry.dependencies]
python = "2.7"
Django = "*"
[tool.poetry.dev-dependencies]
pytest = "*"
ipdb = {version = "*", optional = true}
[tool.poetry.extras]
dev_tools = ["ipdb"]
But this WILL work:
[tool.poetry]
name = "yolo"
version = "1.0.0"
description = ""
authors = []
[tool.poetry.dependencies]
python = "2.7"
Django = "*"
ipdb = {version = "*", optional = true}
[tool.poetry.dev-dependencies]
pytest = "*"
[tool.poetry.extras]
dev_tools = ["ipdb"]
Up-voted Drachenfels's answer.
A dev dependency cannot be optional; no matter how you tweak it with extras or retry with poetry install -E, it will just never get installed.
This sounds like a bug, but it is somehow by design:
...this is not something I want to add. Extras will be referenced in the distributions metadata when packaging the project but development dependencies do not which will lead to a broken extras.
— as concluded by one maintainer in a comment on Poetry PR #606. See here for detailed context: https://github.com/python-poetry/poetry/pull/606#issuecomment-437943927
I would say I can accept that an optional dev dependency cannot be implemented. But Poetry should at least warn me when I have such a config; then I wouldn't have been confused for so long, reading every corner of the manual and finding nothing helpful.
I found some people did get trapped in this problem (Is poetry ignoring extras or pyproject.toml is misconfigured?), but their questions were closed, marked as duplicates, and re-linked to this question. Thus I decided to answer here and give more details about this problem.
This is now possible (with Poetry version 1.2; perhaps even an earlier version), using the "extras" group:
poetry add redis --group=extras
It will appear in the section
[tool.poetry.group.extras.dependencies]
which is also newer style (compared to [tool.poetry.extras] or [tool.poetry.extras.dependencies]).
See the documentation. Interestingly, it still follows the older style, [tool.poetry.extras], and doesn't show the use of poetry add, but the above result is what I get.
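To install such a group on demand, it can be marked optional and pulled in with --with (a sketch based on the Poetry 1.2 group documentation; the redis version is an assumption):
[tool.poetry.group.extras]
optional = true
[tool.poetry.group.extras.dependencies]
redis = "^4.5"  # hypothetical version
$ poetry install --with extras
Note that dependency groups are an installation-time concept; unlike [tool.poetry.extras], they are not published in the package metadata.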
Related
Let's say I have a package:
from conans import ConanFile

class MainLibraryPackage(ConanFile):
    name = 'main'
    description = 'stub'

    def set_version(self):
        self.version = customFunctionToGetVersion()
    ...
And I have a test package for it:
import os
from conans import ConanFile, CMake, tools  # tools was missing; needed for cross_building below

class MainLibraryTests(ConanFile):
    """
    Conan recipe to run C++ tests for main library
    """
    settings = 'arch', 'build_type', 'os', 'compiler'
    generators = "cmake"

    def requirements(self):
        self.requires("gtest/1.12.1")
        self.requires(<my main library, somehow?>)

    def build(self):
        cmake = CMake(self)
        cmake.configure()
        cmake.build()

    def test(self):
        print("THIS IS A TEST OF THE TEST")
        if not tools.cross_building(self):
            os.chdir("bin")
            self.run(".%smain_tests" % os.sep)
How do I actually add the main package as a requirement? And if I do so, will that properly populate the CONAN_LIBS variable in the CMakeLists.txt for my test package? Thanks!
Welcome to the Conan Community!
The answer depends on which Conan version you are running.
For Conan 1.x, your package is automatically consumed by the test package when you run the conan create command. However, if you want to develop step by step and run conan test test_package <reference>, you need to pass the reference on the command line. Still, you don't need to add your package as a requirement; in both cases, Conan resolves it automatically.
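For illustration, the Conan 1.x flow looks roughly like this (the version, user, and channel in main/1.0.0@user/channel are assumptions; the package name comes from the question):
$ conan create . user/channel
$ conan test test_package main/1.0.0@user/channel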
Conan 2.x, on the other hand, is still in beta, but very close to being released as stable.
On Conan 2.x, the test package will no longer consume your package automatically; you will need to declare it as a requirement, which is exactly what you are doing in your question. This behavior is also compatible with Conan 1.x if you add the attribute test_type = "explicit" to test_package/conanfile.py.
So answering your question, you should use:
self.requires(self.tested_reference_str)
The self.tested_reference_str will resolve your package reference automatically for you, so you don't need to care about the version or name; it will be the same one used by your Conan command.
For further information, I recommend reading the official documentation:
The getting started guide has a section about testing your package: https://docs.conan.io/en/latest/creating_packages/getting_started.html#getting-started
How to prepare your test_package for Conan 2.x: https://docs.conan.io/en/latest/migrating_to_2.0/recipes.html#changes-in-the-test-package-recipe
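A minimal Conan 2.x-style test_package/conanfile.py sketch, adapted from the question's recipe (the executable name main_tests is taken from the question; the layout and generators follow the current Conan docs rather than the question's cmake generator):
import os
from conan import ConanFile
from conan.tools.build import can_run
from conan.tools.cmake import CMake, cmake_layout

class MainLibraryTests(ConanFile):
    settings = 'arch', 'build_type', 'os', 'compiler'
    generators = 'CMakeToolchain', 'CMakeDeps'
    test_type = 'explicit'  # also opts Conan 1.x into the explicit requirement

    def requirements(self):
        self.requires('gtest/1.12.1')
        # resolves to the reference passed to `conan create` / `conan test`
        self.requires(self.tested_reference_str)

    def layout(self):
        cmake_layout(self)

    def build(self):
        cmake = CMake(self)
        cmake.configure()
        cmake.build()

    def test(self):
        if can_run(self):
            self.run(os.path.join(self.cpp.build.bindir, 'main_tests'), env='conanrun')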
I'm using TensorFlow inside an x86_64 environment, but the processor is an Intel Atom. This processor lacks the AVX extension, and since the pre-built wheels for TensorFlow are compiled with AVX, TensorFlow does not work and exits. Hence I had to build my own wheel, and I host it on GitHub as a release file.
The problem I have is to download this pre-built wheel only on Atom-based processors. I was able to achieve this previously using a setup.py file, where this can easily be detected, but I have migrated to pyproject.toml, which is very poor when it comes to customization and scripted-installation support.
Is there anything, in addition to platform_machine=='x86_64', that checks for the processor type? Or has the migration to pyproject.toml killed my flexibility here?
The current requirements.txt is:
confluent-kafka # https://github.com/HandsFreeGadgets/python-wheels/releases/download/v0.1/confluent_kafka-1.9.2-cp38-cp38-linux_aarch64.whl ; platform_machine=='aarch64'
tensorflow # https://github.com/HandsFreeGadgets/python-wheels/releases/download/v0.1/tensorflow-2.8.4-cp38-cp38-linux_aarch64.whl ; platform_machine=='aarch64'
tensorflow-addons # https://github.com/HandsFreeGadgets/python-wheels/releases/download/v0.1/tensorflow_addons-0.17.1-cp38-cp38-linux_aarch64.whl ; platform_machine=='aarch64'
tensorflow-text # https://github.com/HandsFreeGadgets/python-wheels/releases/download/v0.1/tensorflow_text-2.8.2-cp38-cp38-linux_aarch64.whl ; platform_machine=='aarch64'
rasa==3.4.2
SQLAlchemy==1.4.45
phonetics==1.0.5
de-core-news-md # https://github.com/explosion/spacy-models/releases/download/de_core_news_md-3.4.0/de_core_news_md-3.4.0-py3-none-any.whl
For platform_machine=='aarch64' I need something similar for x86_64, but executed only in Atom processor environments.
The old setup.py was:
import platform
import subprocess
import os
from setuptools import setup

def get_requirements():
    requirements = []
    if platform.machine() == 'x86_64':
        command = "cat /proc/cpuinfo"
        all_info = subprocess.check_output(command, shell=True).strip()
        # AVX extension is the missing important information
        if b'avx' not in all_info or ("NO_AVX" in os.environ and os.environ['NO_AVX']):
            requirements.append('tensorflow # file://localhost/' + os.getcwd() + '/pip-wheels/amd64/tensorflow-2.3.2-cp38-cp38-linux_x86_64.whl')
    elif platform.machine() == 'aarch64':
        ...
    requirements.append('rasa==3.3.3')
    requirements.append('SQLAlchemy==1.4.45')
    requirements.append('phonetics==1.0.5')
    requirements.append('de-core-news-md # https://github.com/explosion/spacy-models/releases/download/de_core_news_md-3.4.0/de_core_news_md-3.4.0-py3-none-any.whl')
    return requirements

setup(
    ...
    install_requires=get_requirements(),
    ...
)
The line if b'avx' not in all_info or ("NO_AVX" in os.environ and os.environ['NO_AVX']) does the necessary differentiation.
If a pyproject.toml approach does not fit my needs, what is recommended for Python with more installation power that is not marked as legacy? Maybe there is something for Python similar to what Gradle is for building projects in the Java world (introduced to overcome XML's limitations by providing a complete scripting language) that I'm not aware of?
My recommendation would be to migrate to pyproject.toml as intended. I would declare dependencies such as tensorflow according to the standard specification for dependencies, but I would not use any direct references at all.
Then I would create some requirements.txt files in which I would list the dependencies that need special treatment (no need to list all dependencies), for example those that require a direct reference (and/or a pinned version). I would probably create one requirements file per platform, for example I would create a requirements-atom.txt.
As far as I know it should be possible to instruct pip to install from a remote requirements file via its URL. Something like this:
python -m pip install --requirement 'https://server.tld/path/requirements-atom.txt'
If you need to create multiple requirements.txt files with common parts, then probably a tool like pip-tools can help.
Maybe something like the following (untested):
requirements-common.in:
# Application (or main project)
MyApplication # git+https://github.com/HandsFreeGadgets/MyApplication.git
# Common dependencies
CommonLibrary
AnotherCommonLibrary==1.2.3
requirements-atom.in:
--requirement requirements-common.in
# Atom CPU specific
tensorflow # https://github.com/HandsFreeGadgets/tensorflow-atom/releases/download/v0.1/tensorflow-2.8.4-cp38-cp38-linux_aarch64.whl ; platform_machine=='aarch64'
pip-compile requirements-atom.in > requirements-atom.txt
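At deployment time, the right file can then be selected with a small shell check, mirroring the old setup.py logic (an untested sketch; the file names follow the examples above):
if grep -qw avx /proc/cpuinfo; then
    python -m pip install --requirement requirements-common.txt
else
    python -m pip install --requirement requirements-atom.txt
fi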
I created a Python project "foo" with Poetry.
This is the content of pyproject.toml:
[tool.poetry]
name = "bar"
version = "0.1.0"
description = ""
[tool.poetry.dependencies]
python = ">=3.5"
[tool.poetry.dev-dependencies]
[build-system]
requires = ["poetry>=0.12"]
build-backend = "poetry.masonry.api"
This package is compatible with Python3.5.
I want to use the black formatter, which is not compatible with Python 3.5.
I think there is no problem if I use Python >=3.6 for development, but I cannot install black:
$ poetry add black --dev
[SolverProblemError]
The current project's Python requirement (>=3.5) is not compatible with some of the required packages Python requirement:
- black requires Python >=3.6
Because no versions of black match >19.10b0,<20.0
and black (19.10b0) requires Python >=3.6, black is forbidden.
So, because bar depends on black (^19.10b0), version solving failed.
So I installed black directly with pip:
$ poetry run pip install black
This way doesn't sit well with me; I want to install black with poetry.
How should I do it? (I don't want to change the python dependency to >=3.6.)
It seems a bit late, but you can actually do what you want, even though black supports only Python >=3.6.2.
In your pyproject.toml you can define a restricted dependency, as documented in https://python-poetry.org/docs/dependency-specification/#python-restricted-dependencies:
[tool.poetry.dependencies]
python = ">=3.5"
[tool.poetry.dev-dependencies]
black = {version = "^21.7b0", python = ">=3.6.2"}
Poetry won't complain and you won't have any problems since it is a dev dependency.
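The same constraint can also be added from the command line (a sketch, assuming a Poetry version whose add command supports the --python option):
$ poetry add black@^21.7b0 --dev --python ">=3.6.2"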
You need to edit the python value in your pyproject.toml:
[tool.poetry]
name = "bar"
version = "0.1.0"
description = ""
[tool.poetry.dependencies]
python = ">=3.6"
[tool.poetry.dev-dependencies]
[build-system]
requires = ["poetry>=0.12"]
build-backend = "poetry.masonry.api"
I'm trying to install couchdbkit using the following buildout config:
[buildout]
parts = eggs
include-site-packages = false
versions = versions
[eggs]
recipe = zc.recipe.egg:eggs
eggs =
couchdbkit
[versions]
couchdbkit = 0.6.3
It installs the package successfully, but I get numerous errors like this during setup on some machines:
Download error on http://hg.e-engura.org/couchdbkit/: [Errno -2] Name or service not known -- Some packages may not be found!
By default, buildout should find packages using this index. But I can't understand the source of this weird hostname; nothing here points to that location.
How does it actually work?
The underlying setuptools code also scans for homepage and download links from the simple index, and it does this quite aggressively.
The couchdbkit setup.py file lists http://hg.e-engura.org/couchdbkit/ as the homepage, so all homepage links on the simple index link there.
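Roughly, the relevant part of couchdbkit's setup.py amounts to this (a paraphrased sketch, not the actual file; the version comes from the buildout config above):
from setuptools import setup

setup(
    name='couchdbkit',
    version='0.6.3',
    url='http://hg.e-engura.org/couchdbkit/',  # the homepage link setuptools follows
)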
You can prevent zc.buildout from trying to connect to that host by setting up a whitelist of hosts it can connect to:
[buildout]
# ...
allow-hosts =
*.python.org
*.google.com
*.googlecode.com
*.sourceforge.net
for example.
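Alternatively, you can also pin the index buildout queries (a sketch; the URLs assume today's PyPI, so adjust them to your mirror):
[buildout]
index = https://pypi.org/simple/
allow-hosts =
    pypi.org
    files.pythonhosted.org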
When installing my python package, I want to be able to tell the user about various optional dependencies. Ideally I would also like to print out a message about these optional requirements and what each of them do.
I haven't seen anything about this yet in the docs of either pip or docutils. Do these tools support optional dependencies?
These are called extras; here is how to use them in your setup.py, setup.cfg, or pyproject.toml.
The base support is in pkg_resources. You need to enable distribute in your setup.py. pip will also understand them:
pip install 'package[extras]'
Yes, as stated by @Tobu and explained here. In your setup.py file you can add a line like this:
extras_require = {
'full': ['matplotlib', 'tensorflow', 'numpy', 'tikzplotlib']
}
I have an example of this line here.
Now you can install either the basic/vanilla package with pip install package_name, or the package with all the optional dependencies with pip install 'package_name[full]'.
Here package_name is the name of your package, and full works because we put "full" as the key in the extras_require dictionary; it depends on what you chose as the name.
If someone is interested in how to write a library that can work with or without an optional package, I recommend this answer.
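The usual pattern is to attempt the import and degrade gracefully (a sketch; matplotlib and the 'full' extra name are borrowed from the example above):
try:
    import matplotlib.pyplot as plt
except ImportError:
    plt = None  # optional dependency missing; features below will refuse to run

def plot(data):
    if plt is None:
        raise RuntimeError("plotting requires the optional 'full' extra: pip install 'package_name[full]'")
    plt.plot(data)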
Since PEP 621, this information is better placed in pyproject.toml rather than setup.py. Here's the relevant specification from PEP 621, and here's an example snippet from a pyproject.toml (credit to @GwynBleidD):
[project.optional-dependencies]
test = [
"pytest < 5.0.0",
"pytest-cov[all]"
]
lint = [
"black",
"flake8"
]
ci = [
"pytest < 5.0.0",
"pytest-cov[all]",
"black",
"flake8"
]
A more complete example is found in the PEP.
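Installing with those extras then works exactly like the setuptools variant (a sketch; the project name myproject is hypothetical):
pip install 'myproject[test,lint]'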