I am using a pyproject.toml file to allow me to use third-party packages during my build. (For example, I'd like to use the toml package in setup.py.) When I add a local package (installed in editable mode) to "requires", the build doesn't see the package. Is there a way to include local packages in my pyproject.toml other than explicitly deploying them to PyPI?
Here's what my pyproject.toml file looks like currently:
[build-system]
requires = ["setuptools", "wheel", "toml", "my_local_package"]
You can do something like:
[build-system]
build-backend = "setuptools.build_meta"
requires = [
"setuptools >= 61.2",
"versioningit # file:///Users/basnijholt/Downloads/versioningit",
"wheel",
]
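The `name @ url` part is a PEP 508 direct reference, so the same pattern works for any local checkout or even a VCS URL. A sketch, where the package name and paths are placeholders:

```toml
[build-system]
build-backend = "setuptools.build_meta"
requires = [
    "setuptools",
    # local checkout, referenced by an absolute file:// URL
    "my_local_package @ file:///home/me/src/my_local_package",
    # the same mechanism accepts VCS URLs:
    # "my_local_package @ git+https://example.com/me/my_local_package.git",
]
```

Note that file:// URLs must be absolute, which makes this awkward to share across machines; for anything you publish, a regular index requirement is more portable.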
Automatic discovery with setuptools.build_meta includes top-level folders in the tarball that shouldn't be included.
We were trying to build a python package with python3 -m build. Our project has a src-layout and we are using setuptools as the backend for the build. According to the documentation, the automatic discovery should figure out the project structure and build the tarball and the wheel accordingly.
For the wheel build it works as expected, i.e. only the files under src/... are included in the package. The tarball build, however, also includes top-level folders that sit at the same hierarchical level as src, for example the tests and the docs folders, which shouldn't be included. Interestingly, only a few files from docs were included, not all of them.
We tried to explicitly exclude the tests and docs folder in the pyproject.toml, the setup.cfg and the MANIFEST.in, following the respective documentations, but none of them helped.
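For reference, a MANIFEST.in exclusion of the kind we attempted looks roughly like this (a sketch; `tests` and `docs` are the folder names from above):

```text
# MANIFEST.in — prune removes whole directory trees from the sdist
prune tests
prune docs
```

Even with this in place, the folders still ended up in the tarball.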
We configured the build backend in the pyproject.toml like so:
[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"
...
We don't have a setup.py and our setup.cfg only contains flake8 specifications.
We are using setuptools-67.1.0, we tried python versions 3.8, 3.9 and 3.10. We tried it with an Ubuntu 20.04 and a Debian 11 running conda 3.8.13.
My Python package is installed using setuptools configured with a setup.cfg file. In it requirements are specified:
[options]
packages = find:
zip_safe = True
include_package_data = True
install_requires =
gmsh >= 4.10.5
matplotlib >= 3.6.1
numpy >= 1.23.3
When installing the package via pip into a fresh venv, none of the requirements are installed. The output of pip shows no errors or related information. However, once I install them manually, everything works fine. How can I get pip to actually install the requirements?
It is mentioned in the comments that you have a pyproject.toml file. If you use the toml configuration then you do not need the setup.cfg at all. Delete the setup.cfg and add in pyproject.toml:
[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"
[project]
name = ...
version = ...
dependencies = [
"gmsh >= 4.10.5",
"matplotlib >= 3.6.1",
"numpy >= 1.23.3",
]
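Filled in with placeholder metadata (the name and version below are examples, not values from your project), a complete minimal file would look like:

```toml
[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"

[project]
name = "my-package"   # placeholder: use your real distribution name
version = "0.1.0"     # placeholder
dependencies = [
    "gmsh >= 4.10.5",
    "matplotlib >= 3.6.1",
    "numpy >= 1.23.3",
]
```

With this in place, pip install . into a fresh venv resolves and installs the three dependencies before the package itself.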
I have a GUI Python app that I'm trying to distribute a desktop entry with. Normally, one would write a setup.py with setuptools that has this in it:
from setuptools import setup
setup(
name = 'myapp',
version = '0.0.1',
packages = ['myapp'],
data_files = [
('share/applications', ['myapp.desktop']),
],
)
This is deprecated, however, and my goal is to use only pyproject.toml in my repo with no setup.py or setup.cfg needed. I have been unable to find any information on how I would go about doing this.
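For the metadata part of that setup() call, the pyproject.toml translation is straightforward (a sketch; note that PEP 621's [project] table has no field corresponding to data_files, so the desktop entry itself is not covered by this part of the conversion):

```toml
[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"

[project]
name = "myapp"
version = "0.0.1"

[tool.setuptools]
packages = ["myapp"]
```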
Because of the console message "setup.py install is deprecated", I am in the middle of migrating my existing setup.py to the recommended setup.cfg with build.
My existing setup.py looks something like
from setuptools import setup
setup(
name='pybindsample',
version='0.1.0',
packages=[''],
package_data={'': ['pybindsample.so']},
has_ext_modules=lambda: True,
)
My current translation looks like:
setup.cfg
[metadata]
name = pybindsample
version = 0.1.0
[options]
packages = .
[options.package_data]
. = pybindsample.so
pyproject.toml
[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"
My question is: how can I translate has_ext_modules=lambda: True? has_ext_modules=lambda: True is from the solution here. Without it, after executing python3 -m build --wheel, the generated wheel is named pybindsample-0.1.0-py3-none-any.whl, whereas my old python3 setup.py bdist_wheel generated a wheel named pybindsample-0.1.0-cp39-cp39-macosx_11_0_x86_64.whl. I have attempted:
setup.cfg
[metadata]
name = pybindsample
version = 0.1.0
[options]
packages = .
has_ext_modules=lambda: True,
[options.package_data]
. = pybindsample.so
but it still generates pybindsample-0.1.0-py3-none-any.whl, I also attempted
setup.cfg
[metadata]
name = pybindsample
version = 0.1.0
[options]
packages = .
[options.package_data]
. = pybindsample.so
[bdist_wheel]
python-tag = cp39
plat-name = macosx_11_0_x86_64
py-limited-api = cp39
This generates pybindsample-0.1.0-cp39-none-macosx_11_0_x86_64.whl, and I can't figure out why the ABI tag is still none.
What is the right way to configure setuptools with setup.cfg to include platform name, python tag, and ABI tag?
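The lambda cannot be expressed in setup.cfg, because cfg values are plain strings. The workaround the linked solution relies on is to override the Distribution class in a small setup.py. A sketch of the mechanism (the class name here is my own choice):

```python
from setuptools.dist import Distribution

class BinaryDistribution(Distribution):
    """A Distribution that always claims to contain extension modules,
    which makes bdist_wheel emit a platform-specific (non-pure) wheel."""
    def has_ext_modules(self):
        return True

# bdist_wheel queries has_ext_modules() to decide between the generic
# py3-none-any tag and a platform tag like cp39-cp39-macosx_11_0_x86_64
print(BinaryDistribution().has_ext_modules())  # → True
```

In a minimal setup.py you would pass this class to setup() as `setup(distclass=BinaryDistribution)`; declarative setup.cfg alone has no way to express it.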
I have a Python project that doesn't contain requirements.txt.
But it has a pyproject.toml file.
How can I download the packages (dependencies) required by this Python project and declared in pyproject.toml using the pip package manager (instead of the build tool Poetry)?
So instead of pip download -r requirements.txt, something like pip download -r pyproject.toml.
Here is an example of .toml file:
[build-system]
requires = [
"flit_core >=3.2,<4",
]
build-backend = "flit_core.buildapi"
[project]
name = "aedttest"
authors = [
{name = "Maksim Beliaev", email = "beliaev.m.s#gmail.com"},
{name = "Bo Yang", email = "boy#kth.se"},
]
readme = "README.md"
requires-python = ">=3.7"
classifiers = ["License :: OSI Approved :: MIT License"]
dynamic = ["version", "description"]
dependencies = [
"pyaedt==0.4.7",
"Django==3.2.8",
]
[project.optional-dependencies]
test = [
"black==21.9b0",
"pre-commit==2.15.0",
"mypy==0.910",
"pytest==6.2.5",
"pytest-cov==3.0.0",
]
deploy = [
"flit==3.4.0",
]
To install the core dependencies you run:
pip install .
If you need the test (develop) environment (we use test because that is the name defined in the .toml file; you can use any):
pip install .[test]
(On zsh, quote the argument: pip install '.[test]'.)
To install from Wheel:
pip install C:\git\aedt-testing\dist\aedttest-0.0.1-py3-none-any.whl[test]
pip supports installing pyproject.toml dependencies natively.
As of version 10.0, pip supports projects declaring dependencies that are required at install time using a pyproject.toml file, in the form described in PEP 518. When building a project, pip will install the required dependencies locally, and make them available to the build process. Furthermore, from version 19.0 onwards, pip supports projects specifying the build backend they use in pyproject.toml, in the form described in PEP 517.
From the project's root, use pip's local project install:
python -m pip install .
You can export the dependencies to a requirements.txt and use pip download afterwards:
poetry export -f requirements.txt > requirements.txt
pip download -r requirements.txt
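Alternatively, for a project with a PEP 517 build backend declared in pyproject.toml (like the flit example above), newer pip can resolve the dependencies straight from the source tree, with no export step. A sketch, run from the project root:

```shell
# Downloads the project's install-time dependencies (plus the built
# project itself) into ./deps without installing anything
python -m pip download . -d ./deps
```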