I have a module that depends on another module (module_a) stored in my private repository (Nexus), which in turn requires another module (module_b) at build time, also stored in the private repository.
I added the repository source in pyproject.toml to register my private repo:
[[tool.poetry.source]]
name = "nexus"
url = "https://my_nexus_url/private_repo/simple"
secondary = true
Then I specify the dependency in the same pyproject.toml:
[tool.poetry.dependencies]
python = ">=3.9.0,<3.11"
module_a = "1.0.0"
When I run poetry install, it downloads module_a and builds it. During the build process I get this error:
ERROR: Could not find a version that satisfies the requirement module_b==1.0.1
ERROR: No matching distribution found for module_b==1.0.1
When I install the module using pip with --extra-index-url <my repo>, everything works fine:
pip install module_a --extra-index-url https://my_nexus_url/private_repo/simple
I guess the problem is related to the pip command executed by Poetry: it does not pass an extra-index-url pointing to my repo, so it tries to download the build dependency (module_b) from PyPI instead of from my repo.
Is there a way to instruct Poetry to use my private repo when a source build is required?
I have already tried this:
[tool.poetry.dependencies]
python = ">=3.9.0,<3.11"
module_b = {version = "1.0.1", source = "nexus"}
module_a = {version = "1.0.0", source = "nexus"}
without any success.
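For reference, pip reads most of its options from environment variables, so one possible workaround (an assumption on my side, not a documented Poetry feature) is to export the extra index before the install, in the hope that the pip subprocess performing the source build inherits it:
export PIP_EXTRA_INDEX_URL=https://my_nexus_url/private_repo/simple
poetry install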
I am trying to build a Python wheel following the instructions in the link below. I am doing this for the first time.
https://packaging.python.org/en/latest/tutorials/packaging-projects/
I set up the folder structure, files, and all, and added this to the pyproject.toml file:
[build-system]
requires = ["setuptools>=57.4.0", "wheel>=0.37.1"]
build-backend = "setuptools.build_meta"
I have installed setuptools and wheel in my virtual environment.
When I run the build command, I get SSL warnings and the error below:
Could not find a version that satisfies the requirement wheel>=0.37.1
Could not fetch from URL https://pypi.org/simple
Even though I have installed setuptools and wheel in my virtual environment, I think the build is hitting PyPI to find and download these packages.
I don't know how the build module resolves the packages listed in "requires", and I cannot find a way to direct it to use the setuptools and wheel already installed on my machine instead of fetching them from PyPI.
Even if it does have to download them again, how can I direct it to use our Artifactory instead of PyPI?
Any help in this is greatly appreciated.
I tried all of the below in different combinations, but nothing worked. Obviously I am missing something.
1.
I added a pip.ini in my virtual environment (Lib\site-packages\pip).
Added the index-url with our organization's Artifactory URL.
Added trusted-host.
Also tried pip.conf.
2.
I downloaded the wheels for setuptools and wheel.
Added another entry in pyproject.toml:
[easy-install]
find-links = c:\wheels
3.
Added the wheels directly in the src folder.
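For what it's worth, two approaches commonly address both points (a sketch, not verified against this setup; the Artifactory URL below is a placeholder): pip honors the PIP_INDEX_URL environment variable even inside the isolated build environment, and python -m build accepts --no-isolation, which reuses the setuptools and wheel already installed in the active virtual environment.
rem Point pip (including the isolated build environment) at the internal index.
set PIP_INDEX_URL=https://my_artifactory_url/api/pypi/pypi-virtual/simple
python -m build
rem Or skip build isolation and use the packages already installed.
python -m build --no-isolation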
Thanks.
I want to achieve behavior similar to the Dask library: with pip it is possible to install dask, dask[dataframe], dask[array], and others. They do this via their setup.py. If I install only dask, the dataframe extra is not installed, and the module warns you about this at runtime.
I found extras in the Poetry documentation, but when I execute poetry build I only get one .whl file with all of the packages inside.
How can I package my module so that specific parts of the library can be installed using Poetry?
Actually, the Dask example does not install subpackages separately; it just installs the optional dependencies separately, as explained in this link.
To accomplish the same behavior using Poetry, you need to use extras (as mentioned by user sinoroc in this comment).
The example pyproject.toml from the poetry extras page is this:
[tool.poetry]
name = "awesome"
[tool.poetry.dependencies]
# These packages are mandatory and form the core of this package’s distribution.
mandatory = "^1.0"
# A list of all of the optional dependencies, some of which are included in the
# below `extras`. They can be opted into by apps.
psycopg2 = { version = "^2.7", optional = true }
mysqlclient = { version = "^1.3", optional = true }
[tool.poetry.extras]
mysql = ["mysqlclient"]
pgsql = ["psycopg2"]
Running poetry build --format wheel creates a single wheel file.
To install a specific set of extra dependencies using pip and the wheel file, use:
pip install "wheel_filename.whl[mysql]"
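As a usage note, the same extras can also be installed during development with Poetry itself, or with pip once the wheel is published to an index (using the example package name from above):
poetry install -E mysql
pip install "awesome[mysql]"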
I've got a couple of projects here for which I'm preparing documentation at the moment, hosted at readthedocs.org. FYI, all of them use poetry and I use custom .readthedocs.yml files with this entry:
python:
  install:
    - method: pip
      path: .
It works fine for most projects, but it fails for two of them, for different reasons, during installation of the project via pip:
The first one uses PyGObject, which fails like this:
Package gobject-introspection-1.0 was not found in the pkg-config search path.
Perhaps you should add the directory containing `gobject-introspection-1.0.pc'
to the PKG_CONFIG_PATH environment variable
No package 'gobject-introspection-1.0' found
Command '('pkg-config', '--print-errors', '--exists', 'gobject-introspection-1.0 >= 1.56.0')' returned non-zero exit status 1.
Try installing it with: 'sudo apt install libgirepository1.0-dev'
So it seems that PyGObject cannot be installed without certain system packages being present. I could rearrange the code so that the import is not top-level, but I would still need it in the dependencies. Can I tell pip install to ignore this single package somehow? Any other ideas?
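One possible avenue (an assumption based on the Read the Docs v2 configuration schema; the build section below is not part of the original setup) is to let Read the Docs install the missing system package itself via apt_packages in .readthedocs.yml:
build:
  os: ubuntu-22.04
  tools:
    python: "3.10"
  apt_packages:
    - libgirepository1.0-dev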
The second project compiles some C++ code via Cython and fails because a library is missing. I use a custom build script via pyproject.toml:
[tool.poetry.build]
script = "build.py"
generate-setup-file = false
Is there some flag in pip that I could set and retrieve in build.py to skip the compilation? Or is there a better way?
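pip itself has no per-package flag for this, but environment variables do reach the build backend, so one sketch (SKIP_CYTHON_BUILD is a made-up variable you would set yourself, e.g. in the Read the Docs project settings; adapt to how Poetry invokes your build.py) is to gate the compilation inside build.py:
# build.py (sketch): skip compilation when SKIP_CYTHON_BUILD is set.
import os

if os.environ.get("SKIP_CYTHON_BUILD"):
    # Docs build: leave the extension uncompiled.
    print("SKIP_CYTHON_BUILD set, skipping Cython build")
else:
    # ... the regular Cython/C++ compilation this build.py already does ...
    pass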
I have a package which I'm pushing to PyPI, and some of the dependencies are not packages but installable Git repositories. My requirements.txt looks like this:
sphinx_bootstrap_theme>=0.6.5
matplotlib>=2.2.0
numpy>=1.15.0
sphinx>=1.7.5
sphinx-argparse>=0.2.2
tensorboardX
tqdm>=4.24.0
Cython>=0.28.5
# git repos
git+git://github.com/themightyoarfish/svcca-gpu.git
Accordingly, my setup.py has this content:
#!/usr/bin/env python
from distutils.core import setup
import setuptools

with open('requirements.txt', mode='r') as f:
    requirements = f.read()

# Split the plain packages from the git repos on the marker comment.
required_pkgs, required_repos = requirements.split('# git repos')
required_pkgs = required_pkgs.split()
required_repos = required_repos.split()

with open('README.md') as f:
    readme = f.read()

setup(name=...,
      ...
      packages=setuptools.find_packages('.', include=[...]),
      install_requires=required_pkgs,
      dependency_links=required_repos,
      zip_safe=False,  # don't install as egg, but as source
      )
But running pip install <package> does not actually install the git dependency. I assume that pip doesn't actually use the setup script. It works when I run python setup.py install manually.
Edit:
I also tried removing dependency_links and just using install_requires with the repository URL, but when installing my repository from GitHub (the project including the above files), I'm met with:
Complete output from command python setup.py egg_info:
error in ikkuna setup command: 'install_requires' must be a string or
list of strings containing valid project/version requirement specifiers; Invalid requirement, parse error at "'+git://g'"
It has been suggested in other answers that one can put something like
git+https://github.com/themightyoarfish/svcca-gpu.git#egg=svcca
into requirements.txt, but that fails with
error in <pkg> setup command: 'install_requires' must be a string or list of strings containing valid project/version requirement specifiers; Invalid requirement, parse error at "'+https:/'
Question: (How) Can I list git repositories as dependencies for a pip package?
Out of the 50 or so different ways to specify Git dependencies for pip, the only one that did what I intended was this one (outlined in PEP 508):
svcca @ git+ssh://git@github.com/themightyoarfish/svcca-gpu
This can be used in install_requires, which solves the issue of dependency_links being ignored by pip.
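In setup.py the direct reference goes straight into install_requires (using the package names from the question):
from setuptools import setup

setup(
    name='ikkuna',
    # ...
    install_requires=[
        # PEP 508 direct reference to the git repository
        'svcca @ git+ssh://git@github.com/themightyoarfish/svcca-gpu',
    ],
)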
An amusing side effect is that the package cannot be uploaded to PyPI with such a dependency:
HTTPError: 400 Client Error: Invalid value for requires_dist. Error: Can't have direct dependency: 'svcca @ git+ssh://git@github.com/themightyoarfish/svcca-gpu' for url: https://upload.pypi.org/legacy/
According to the post How to state in requirements.txt a direct github source:
You can add a package from a remote Git repository with the following syntax:
-e git://github.com/themightyoarfish/svcca-gpu.git
Reference:
Install a project in editable mode (i.e. setuptools “develop mode”) from a local project path or a VCS url with -e.
What I should have:
I want my Yocto project to build a package for my Python project with all dependencies inside. The project has to run out of the box on the resulting read-only SD card image.
It should simply install all requirements, at the required versions, into the package.
What I tried without luck:
Calling pip in do_install():
"pip/pip3 is not found", even though it's in RDEPENDS.
This would actually be my preferred way.
With inherit pypi:
When trying inherit pypi, it also tries to fetch my local sources (my Python project) from PyPI, and I always have to copy the requirements into the recipe. This is not my preferred way.
Calling pip in pkg_postinst():
It tries to install the modules on first boot and fails, because the system has no internet connection and is read-only. The image must run out of the box without any installation at first boot; this does its work too late.
What I want to get around:
There should be no need to change anything in the recipes when something changes in requirements.txt.
Background information
I'm working with Yocto Rocko in a Linux environment.
On the host system, no pip is installed. I want to use the one pulled in via RDEPENDS on the target system.
Building the Package (only this recipe) with:
bitbake myproject
Building the whole sdcard image:
bitbake myProject-image-base
The recipe:
myproject.bb (relevant lines):
RDEPENDS_${PN} = "python3 python3-pip"
APP_SOURCES_DIR := "${@os.path.abspath(os.path.dirname(d.getVar('FILE', True)) + '/../../../../app-sources')}"
FILESEXTRAPATHS_prepend := "${THISDIR}/files:"
SRC_URI = " \
file://${APP_SOURCES_DIR}/myProject \
...
"
inherit allarch # tried also with pypi and setuptools3 for the pypi way.
do_install() { # Line 116
    install -d -m 0755 ${D}/myProject
    cp -R --no-dereference --preserve=mode,links -v ${APP_SOURCES_DIR}/myProject/* ${D}/myProject/
    pip3 install -r ${APP_SOURCES_DIR}/myProject/requirements.txt
    # Tried also: python ${APP_SOURCES_DIR}/myProject/setup.py install
}
# Tried also this, but it's no option because the data MUST be included in the Package:
# pkg_postinst_${PN}() {
# #!/bin/sh -e
# pip3 install -r /myProject/requirements.txt
# }
FILES_${PN} = "/myProject/*"
Resulting errors:
I expected the modules listed in requirements.txt to be installed into the myProject package, so that the Python app runs directly on the resulting read-only SD card image.
With pip, I get:
| /*/tmp/work/*/myProject/0.1.0-r0/temp/run.do_install: 116: pip3: not found
| WARNING: exit code 127 from a shell command.
| ERROR: Function failed: do_install ...
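For context, RDEPENDS only affects the runtime package, not the build environment of do_install, which is why pip3 is not found there. A sketch of a build-time alternative (assuming your layers ship a python3-pip recipe with native support, and that a python class providing PYTHON_SITEPACKAGES_DIR is inherited; note that network access inside do_install goes against Yocto conventions, so this is a stopgap at best):
DEPENDS += "python3-pip-native"

do_install() {
    install -d ${D}${PYTHON_SITEPACKAGES_DIR}
    # Install the requirements straight into the package's site-packages.
    pip3 install -r ${APP_SOURCES_DIR}/myProject/requirements.txt \
        --target=${D}${PYTHON_SITEPACKAGES_DIR}
}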
When using pypi:
404 Not Found
ERROR: myProject-0.1.0-r0 do_fetch: Fetcher failure for URL: 'https://files.pythonhosted.org/packages/source/m/myproject/myproject-0.1.0.tar.gz'. Unable to fetch URL from any source.
=> But it should not fetch myProject at all, since it is already local and exists nowhere remote.
Any ideas? What would be the best way to arrive at a ready-to-use SD card image without the need to change recipes when requirements.txt changes?
You should use RDEPENDS_${PN} in the recipe to take care of your app's dependencies.
For example, assuming your Python app needs the aws-iot-device-sdk-python module, you should add it to RDEPENDS in the recipe. In your case, it would look like this:
RDEPENDS_${PN} = "python3 \
python3-pip \
python3-aws-iot-device-sdk-python \
"
Here's the link listing the Python modules provided by the OpenEmbedded meta-python layer:
https://layers.openembedded.org/layerindex/branch/master/layer/meta-python/
If the modules you need are not there, you will likely need to create recipes for the modules.
My latest findings:
Yocto/BitBake seems to suppress interpreting the requirements, because honoring them would break automatic dependency resolution and could lead to conflicts.
Reason: the modules required by setup.py would not be stored as independent packages but as part of my package. BitBake would therefore not know about these modules, which could conflict with other packages that require the same modules in different versions.
What was in my recipe:
MY_INSTALL_ARGS = "--root=${D} \
    --prefix=${prefix} \
    --install-lib=${PYTHON_SITEPACKAGES_DIR} \
    --install-data=${datadir}"

do_install() {
    PYTHONPATH=${PYTHON_SITEPACKAGES_DIR} \
    ${STAGING_BINDIR_NATIVE}/${PYTHON_PN}-native/${PYTHON_PN} setup.py install ${MY_INSTALL_ARGS}
}
If I execute this outside of BitBake as python3 setup.py install ${MY_INSTALL_ARGS}, everything is installed correctly, but inside the recipe no requirements are installed.
There is a parameter --no-deps, but I didn't find where it is set.
I think there could be one way to get the requirements from setup.py honored:
Find out where --no-deps is set for easy_install in the openembedded/poky layer, and disable it.
Create a separate PYTHON_SITEPACKAGES_DIR.
Install into this separate PYTHON_SITEPACKAGES_DIR, e.g. in the home directory, as a private Python modules dir.
This way, no Python module would trigger a conflict.
Since I do not have the time to experiment with this, I'll now define one recipe per requirement.
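A minimal per-requirement recipe could then look roughly like this (a sketch: module name, license, and checksum are placeholders; it relies on the standard pypi and setuptools3 classes):
SUMMARY = "somemodule from PyPI"
LICENSE = "MIT"
LIC_FILES_CHKSUM = "file://LICENSE;md5=..."

PYPI_PACKAGE = "somemodule"

inherit pypi setuptools3

SRC_URI[sha256sum] = "..."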
Have you tried installing pip?
Debian
apt-get install python-pip
apt-get install python3-pip
CentOS
yum install python-pip