I have a Python project that doesn't contain requirements.txt.
But it has a pyproject.toml file.
How can I download the packages (dependencies) required by this Python project and declared in pyproject.toml using the pip package manager (instead of the build tool Poetry)?
So instead of pip download -r requirements.txt, something like pip download -r pyproject.toml.
Here is an example of the .toml file:
[build-system]
requires = [
    "flit_core >=3.2,<4",
]
build-backend = "flit_core.buildapi"

[project]
name = "aedttest"
authors = [
    {name = "Maksim Beliaev", email = "beliaev.m.s@gmail.com"},
    {name = "Bo Yang", email = "boy@kth.se"},
]
readme = "README.md"
requires-python = ">=3.7"
classifiers = ["License :: OSI Approved :: MIT License"]
dynamic = ["version", "description"]
dependencies = [
    "pyaedt==0.4.7",
    "Django==3.2.8",
]

[project.optional-dependencies]
test = [
    "black==21.9b0",
    "pre-commit==2.15.0",
    "mypy==0.910",
    "pytest==6.2.5",
    "pytest-cov==3.0.0",
]
deploy = [
    "flit==3.4.0",
]
To install the core dependencies, run:
pip install .
If you need the test (development) environment (we use test because that is the name defined in the .toml file; you can use any name):
pip install .[test]
To install from a wheel:
pip install C:\git\aedt-testing\dist\aedttest-0.0.1-py3-none-any.whl[test]
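If you need both optional groups at once, pip accepts a comma-separated list of extras; a minimal example combining the two groups defined above:
pip install .[test,deploy]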
pip supports installing pyproject.toml dependencies natively.
As of version 10.0, pip supports projects declaring dependencies that are required at install time using a pyproject.toml file, in the form described in PEP 518. When building a project, pip will install the required dependencies locally, and make them available to the build process. Furthermore, from version 19.0 onwards, pip supports projects specifying the build backend they use in pyproject.toml, in the form described in PEP 517.
From the project's root, use pip's local project install:
python -m pip install .
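If the goal is only to download the dependencies (the pip download -r requirements.txt workflow from the question), pip download can be pointed at the project directory as well; a minimal sketch, assuming the archives should be collected into a local deps/ folder:
python -m pip download . -d deps
pip builds the project's metadata from pyproject.toml and then downloads the project and its declared dependencies into deps/.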
You can export the dependencies to a requirements.txt and use pip download afterwards:
poetry export -f requirements.txt > requirements.txt
pip download -r requirements.txt
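Note that this relies on Poetry being available (and, depending on the Poetry version, on the poetry-plugin-export plugin). A slightly more explicit sketch of the same idea, assuming hashes are not needed and the archives should land in a deps/ folder:
poetry export -f requirements.txt --output requirements.txt --without-hashes
pip download -r requirements.txt -d deps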
Related
I'm using Poetry in my project for dependency management, so my pyproject.toml contains all my dependencies. My goal is to build a wheel from the current project and install it in another virtualenv using pip. But I'm facing an error:
pip._vendor.pkg_resources.RequirementParseError: Invalid URL: artifacts/pp-0.358.0.tar.gz
I guess the problem may be the relative path, but I don't know how to fix it, because when installing dependencies this package is taken from the tar.gz file located at that path.
My pyproject.toml:
[tool.poetry]
name = "package_name"
version = "0.0.0"
description = ""
readme = "README.md"
packages = [
    {include = "package1"},
    {include = "package2"},
    {include = "package3"},
]

[tool.poetry.dependencies]
python = "~3.8"
pp = {file = "artifacts/pp-0.358.0.tar.gz"}
# ...some other deps
To build the wheel I'm using the following command:
poetry build --format wheel
I'm installing it into another virtualenv with the following command:
pip3 install /Users/av/Projects/package_name/dist/package_name-0.0.0-py3-none-any.whl
Is there any way to fix this? If I comment out the pp line in the dependencies, the installation works fine.
My Python package is installed using setuptools, configured with a setup.cfg file. In it, the requirements are specified:
[options]
packages = find:
zip_safe = True
include_package_data = True
install_requires =
    gmsh >= 4.10.5
    matplotlib >= 3.6.1
    numpy >= 1.23.3
When installing the package via pip into a fresh venv, none of the requirements are installed. The output of pip shows no errors or related information. However, once I install them manually, everything works fine. How can I get pip to actually install the requirements?
It is mentioned in the comments that you have a pyproject.toml file. If you use the TOML configuration, you do not need setup.cfg at all. Delete setup.cfg and add to pyproject.toml:
[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"

[project]
name = ...
version = ...
dependencies = [
    "gmsh >= 4.10.5",
    "matplotlib >= 3.6.1",
    "numpy >= 1.23.3",
]
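With the dependencies declared in pyproject.toml, a plain local install should pull them in, for example:
python -m pip install .
or, for development, an editable install (this needs a reasonably recent pip and setuptools when there is no setup.py):
python -m pip install -e .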
How can I publish a package on PyPI such that all dependencies are automatically installed, rather than manually by the user?
I specify the dependencies in setup.py with install_requires as follows:
setuptools.setup(
    name='myPackage',
    version='1.0',
    packages=setuptools.find_packages(),
    include_package_data=True,
    classifiers=[
        'Programming Language :: Python :: 3',
        'Operating System :: OS Independent',
        'Topic :: Scientific/Engineering :: Bio-Informatics',
    ],
    install_requires=['numpy', 'pandas', 'sklearn'],
    python_requires='>=3',
)
And I have a requirements.txt file which is included in my MANIFEST.in:
numpy==1.15.4
sklearn==0.20.1
pandas==0.23.4
However, after publishing on test.pypi when I try to install the package, I get the following error:
Could not find a version that satisfies the requirement numpy (from myPackage==1.0.0) (from versions: )
No matching distribution found for sklearn (from myPackage==1.0.0)
This means that PyPI does not install the numpy dependency.
How do I enable automatic installation of my dependencies?
Should I use a virtual environment when building and publishing the package? How do I do this?
P.S. I am entirely new to this so I will appreciate explicit code or links to simple tutorial pages. Thank you.
You can specify multiple indexes via --extra-index-url. Point it to TestPyPI so that your package is pulled from there and the dependencies from PyPI:
$ pip install myPackage --extra-index-url=https://test.pypi.org/simple/
However, the real root of the issue is that you have included the wrong dist name for the scikit-learn package. Replace sklearn with scikit-learn:
setup(
    ...,
    install_requires=['numpy', 'pandas', 'scikit-learn'],
)
This is an unfortunate (and known) downside to TestPyPI: The issue is that sklearn does not exist on TestPyPI, and by installing your package from there, you are telling pip to look for dependencies there as well.
You should publish to PyPI instead, and use a pre-release version so as not to pollute your versions. You can delete these pre-releases from the project later.
I realized that installing packages from test.PyPI does not install all packages, since some of these packages are hosted on PyPI and not test.PyPI.
When I published the package on PyPI as a pre-release version (1.0a1), instead on test.PyPI, the dependencies were correctly installed. Hence, the problem was purely with test.PyPI.
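For reference, a pre-release is just a version string with a pre-release suffix (PEP 440); a minimal sketch of what that looks like in the setup() call shown above:
setuptools.setup(
    name='myPackage',
    version='1.0a1',  # pre-release: pip skips it by default unless --pre is used or the version is pinned
    install_requires=['numpy', 'pandas', 'scikit-learn'],
)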
This is my approach.
I like to use a requirements.txt file instead of putting dependencies in install_requires because it's easier during dev to run:
$ pip install -r requirements.txt
To have pip install dependencies automatically, I include at the top of setup.py before setuptools.setup():
requirements = []
with open('requirements.txt', 'r') as fh:
    for line in fh:
        requirements.append(line.strip())
Then in setuptools.setup():
install_requires = requirements
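Put together, a minimal sketch of such a setup.py (everything apart from the requirements handling is a placeholder):
import setuptools

# read the pinned requirements from requirements.txt (shipped via MANIFEST.in)
with open('requirements.txt') as fh:
    requirements = [line.strip() for line in fh if line.strip()]

setuptools.setup(
    name='myPackage',
    version='1.0',
    packages=setuptools.find_packages(),
    install_requires=requirements,
)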
To install:
pip install --index-url https://test.pypi.org/simple/ --upgrade --no-cache-dir --extra-index-url=https://pypi.org/simple/ <<package name>>
--index-url tells pip to use the test version of PyPI.
--upgrade forces an upgrade if a previous version is installed.
--no-cache-dir resolves issues with caching when doing a very quick re-release (pip doesn't pick up the new version).
--extra-index-url tells pip to look in the production version of PyPI if it can't find the required package on test (i.e. it solves the problem of dependencies not being available on test).
Your install_requires should be of the form
...
install_requires=["numpy==1.15.4",
"sklearn==0.20.1",
"pandas==0.23.4"]
...
You can also use >= instead of == to allow for more recent versions of those libraries.
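For example, the same list with lower bounds only:
install_requires=[
    "numpy>=1.15.4",
    "sklearn>=0.20.1",
    "pandas>=0.23.4",
]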
In my tox.ini file, the dependencies are installed via the requirements.txt file, which is also used by setup.py.
The requirements.txt file contains the acceptable range of django packages, depending on the python version installed, as follows:
Django>=1.11,<2 ; python_version == '2.7'
Django>=1.11,<3 ; python_version > '3'
For Python 3, I want to make sure the tests run on Django 2.0 as well as the latest Django 2.1+ that will be installed by default, obeying the version constraints specified in the requirements.txt file. To achieve that, I force the installation of the desired Django version in commands, as follows:
[tox]
envlist = {py27,py3}-django111,py3-django{20,21}

[testenv]
deps =
    -r{toxinidir}/requirements.txt
commands =
    django111: pip install 'Django>=1.11,<1.12'
    py3-django20: pip install 'Django>=2.0,<2.1'
    py3-django21: pip install 'Django>=2.1'
    pytest
Ideally I could just add to the deps variable like so:
[testenv]
deps =
    -r{toxinidir}/requirements.txt
    django111: Django>=1.11,<1.12
    py3-django20: Django>=2.0,<2.1
    py3-django21: Django>=2.1
commands =
    pytest
But pip does not support double requirements and will throw an error even though there is no conflict in how the version constraints are specified.
The drawback of using commands to override the installation is that it needs to remove the django package version installed via requirements.txt to install the desired one. Is there a way to avoid that extra step?
One trick is to move the requirement from requirements.txt into setup.py, where it is pinned loosely enough that all your Django versions are possible. For example:
# setup.py
from setuptools import setup, find_packages

setup(
    ...,
    install_requires=[
        "Django>=1.11,<2.1",
    ],
)
and then use your second suggestion in tox.ini
[testenv]
deps =
    -r{toxinidir}/requirements.txt
    django111: Django>=1.11,<1.12
    py3-django20: Django>=2.0,<2.1
    py3-django21: Django>=2.1
commands =
    pytest
... so long as the Django requirement isn't listed in requirements.txt.
This works because the pip install is split in two parts, the first from tox:deps where you specify the hard requirement, and the second from the equivalent of pip install -e . where the setup.py has the looser requirement.
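In other words, requirements.txt then carries only the non-Django dependencies; a hypothetical example (the package names here are placeholders, not taken from the question):
pytest>=4.0
requests>=2.20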
I have package "A" with a setup.py and an extras_requires line like:
extras_require = {
    'ssh': ['paramiko'],
},
And a package "B" that depends on util:
install_requires = ['A[ssh]']
If I run python setup.py install on package B, which uses setuptools.command.easy_install under the hood, the extras_require is correctly resolved, and paramiko is installed.
However, if I run pip install /path/to/B or pip install http://.../b-version.tar.gz, package A is installed, but paramiko is not.
Because pip "installs from source", I'm not quite sure why this isn't working. It should be invoking the setup.py of B, then resolving & installing dependencies of both B and A.
Is this possible with pip?
We use setup.py and pip to manage development dependencies for our packages, though you need a newer version of pip (we're using 1.4.1 currently).
#!/usr/bin/env python
from setuptools import setup
from myproject import __version__

required = [
    'gevent',
    'flask',
    ...
]

extras = {
    'develop': [
        'Fabric',
        'nose',
    ]
}

setup(
    name="my-project",
    version=__version__,
    description="My awesome project.",
    packages=[
        "my_project"
    ],
    include_package_data=True,
    zip_safe=False,
    scripts=[
        'runmyproject',
    ],
    install_requires=required,
    extras_require=extras,
)
To install the package:
$ pip install -e . # only installs "required"
To develop:
$ pip install -e .[develop] # installs develop dependencies
This is supported since pip 1.1, which was released in February 2012 (one year after this question was asked).
The answer from @aaronfay is completely correct, but it may be nice to point out that if you're using zsh, the install command pip install -e .[dev] needs to be replaced by pip install -e ".[dev]".
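For the develop extra used in the answer above, that would look like:
$ pip install -e ".[develop]"
(without the quotes, zsh tries to expand the square brackets as a glob pattern).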