Using an extra python package index url with setup.py - python

Is there a way to use an extra Python package index (à la pip install --extra-index-url https://pypi.example.org mypackage) with setup.py, so that running python setup.py install can find the packages hosted on pypi.example.org?

If you're the package maintainer, and you want to host one or more dependencies for your package somewhere other than PyPI, you can use the dependency_links option of setuptools in your distribution's setup.py file. This lets you provide explicit locations where those dependencies can be found.
For example:
from setuptools import setup

setup(
    name='somepackage',
    install_requires=[
        'somedep'
    ],
    dependency_links=[
        'https://pypi.example.org/pypi/somedep/'
    ]
    # ...
)
If you host your own index server, you'll need to provide links to the pages containing the actual download links for each egg, not the page listing all of the packages (e.g. https://pypi.example.org/pypi/somedep/, not https://pypi.example.org/)
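As an aside, a dependency_links entry can also point straight at a downloadable archive instead of an index page; the URL below is just a placeholder, and the #egg fragment names the distribution and version:
dependency_links=[
    # hypothetical direct link to a source archive for somedep 1.0
    'https://pypi.example.org/packages/somedep-1.0.tar.gz#egg=somedep-1.0',
]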

setuptools uses easy_install under the hood.
It relies on either setup.cfg or ~/.pydistutils.cfg, as documented here.
Extra package locations can be defined in either of these files with the find_links option. You can override the registry URL with index_url, but you cannot supply an extra index URL. Example below, inspired by the docs:
[easy_install]
find_links = http://mypackages.example.com/somedir/
http://turbogears.org/download/
http://peak.telecommunity.com/dist/
index-url = https://mypi.example.com
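For example, with that [easy_install] section saved in a setup.cfg next to setup.py (or in ~/.pydistutils.cfg), a plain install should pick it up:
python setup.py install    # easy_install resolves dependencies using find_links / index_url from the config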

I wanted to post an updated answer, since both of the top answers are obsolete; easy_install has been deprecated by setuptools.
https://setuptools.pypa.io/en/latest/deprecated/easy_install.html
Easy Install is deprecated. Do not use it. Instead use pip. If you think you need Easy Install, please reach out to the PyPA team (a ticket to pip or setuptools is fine), describing your use-case.
Please use pip moving forward. You can do one of the following (each option is sketched below):
provide --index-url flag to pip command
define index-url in pip.conf file
define PIP_INDEX_URL environment variable
https://pip.pypa.io/en/stable/topics/configuration/
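A rough sketch of all three options; pypi.example.org stands in for your own index server:
# 1. flag on the command line
pip install --index-url https://pypi.example.org/simple mypackage

# 2. config file (typically ~/.config/pip/pip.conf on Linux, pip.ini on Windows)
[global]
index-url = https://pypi.example.org/simple

# 3. environment variable
PIP_INDEX_URL=https://pypi.example.org/simple pip install mypackage
If you want to add an index to the default one rather than replace it, the additive counterparts are --extra-index-url, extra-index-url and PIP_EXTRA_INDEX_URL.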

The following worked for me (develop, not install):
$ python setup.py develop --index-url https://x.com/n/r/pypi-proxy/simple
Where https://x.com/n/r/pypi-proxy/simple is a local PyPI repository.

Found solution when using Dockerfile:
RUN cd flask-mongoengine-0.9.5 && \
/bin/echo -e [easy_install]\\nindex-url = https://pypi.tuna.tsinghua.edu.cn/simple >> setup.cfg && \
python setup.py install
The /bin/echo -e command appends the following to setup.cfg:
[easy_install]
index-url = https://pypi.tuna.tsinghua.edu.cn/simple

This worked for me:
PIP_INDEX_URL=<MY CUSTOM PIP INDEX URL> pip install -e .
I use setup.py and setup.cfg

As far as I know, you can't do that with setup.py alone.
You need to tell pip, either by passing a parameter as you mentioned, or by setting it in the user environment.
Check my ~/.pip/pip.conf:
[global]
download_cache = ~/.cache/pip
index-url = http://user:pass@localpypiserver.com:80/simple
timeout = 300
In this case, my local pypiserver also proxies all packages from pypi.python.org, so I don't need to add a second entry.

You can include --extra-index-url in a requirements.txt file. See: http://pip.readthedocs.org/en/0.8.3/requirement-format.html
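For example, a requirements.txt along these lines (the URL and pins are placeholders) mixes index options with package entries:
--extra-index-url https://pypi.example.org/simple
mypackage==1.2.3
someotherpackage>=2.0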

Related

Use setuptools to Install a Python package from a private Gitlab package repository

I created a private package for my employer. Since I'm forbidden to upload it to PyPI (it's proprietary), I uploaded it to the package index for my project on our private GitLab hub. I can install it manually with:
$ pip install my-package --extra-index-url https://__token__:my-token-xxx@gitlab.company-domain.com/api/v4/projects/123456/packages/pypi/simple
Now I also want setuptools to be able to find it when listed in the install_requires argument to setup(). I tried:
setup(
    install_requires=[
        f"my-package @ https://__token__:{API_TOKEN}@gitlab.company-domain.com/api/v4/projects/123456/packages/pypi/simple",
        ...
    ],
    ...
)
pip install -e . results in
ERROR: HTTP error 404 while getting https://__token__:****@gitlab.company-domain.com/api/v4/projects/123456/packages/pypi/simple
This is different from
my-package @ git+https://user:password@gitlab.company-domain.com/..../my-package.git
That works, but I want to be able to download it as a pre-built wheel.
I’m not sure whether this is a setuptools issue or a Gitlab issue. The 404 response tells me that it might be a gitlab issue, yet the same URI works perfectly when used with the pip install CLI command.
This question is similar to Include python packages from gitlab's package registry and other external indexes directly into setup.py dependencies, but I don't think that one got sufficient response. I posted the same question to discuss.python.org, but that discussion is old and I think I might get a quicker response here.
I also found this response to a similar question, which wasn't encouraging. It recommends Poetry or Pipenv. I've tried both, and found each to be excruciatingly slow when resolving dependencies, so I fell back on setuptools.
Only include the package name in install_requires. Then configure your (extra) index URL in your pip configuration (environment variable, pip.conf/.pypirc, or CLI argument), and pip install will work as normal.
For example:
In setup.py:
# ...
install_requires=[
    'my-package-name',
    # ...
],
# ...
Then the install command (assuming the environment variable API_TOKEN exists):
GITLAB_INDEX="https://__token__:${API_TOKEN}@gitlab.company-domain.com/api/v4/projects/123456/packages/pypi/simple"
pip install --extra-index-url "${GITLAB_INDEX}" -e .
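If you would rather not pass the flag every time, the same thing can be expressed via pip's environment variables (a sketch; the token and project ID are placeholders as above):
export PIP_EXTRA_INDEX_URL="https://__token__:${API_TOKEN}@gitlab.company-domain.com/api/v4/projects/123456/packages/pypi/simple"
pip install -e .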

setup.py file using requirements.txt

I've read a discussion where a suggestion was to use the requirements.txt inside the setup.py file to ensure the correct installation is available on multiple deployments without having to maintain both a requirements.txt and the list in setup.py.
However, when I try to install via pip install -e ., I get an error:
Obtaining file:///Users/myuser/Documents/myproject
Processing /home/ktietz/src/ci/alabaster_1611921544520/work
ERROR: Could not install packages due to an OSError: [Errno 2] No such file or directory:
'/System/Volumes/Data/home/ktietz/src/ci/alabaster_1611921544520/work'
It looks like pip is trying to look for packages that are available on PyPI (alabaster) on my local machine. Why? What am I missing here? Why isn't pip looking for the required packages on the PyPI server?
I have done it the other way around before, maintaining the setup file rather than the requirements file. For the requirements file, just save it as:
*
and for setup, do
from distutils.core import setup
from setuptools import find_packages

try:
    from Module.version import __version__
except ModuleNotFoundError:
    exec(open("Module/version.py").read())

setup(
    name="Package Name",
    version=__version__,
    packages=find_packages(),
    package_data={p: ["*"] for p in find_packages()},
    url="",
    license="",
    install_requires=[
        "numpy",
        "pandas"
    ],
    python_requires=">=3.8.0",
    author="First.Last",
    author_email="author@company.com",
    description="Description",
)
For reference, my version.py script looks like:
__build_number__ = "_LOCAL_"
__version__ = f"1.0.{__build_number__}"
Jenkins then replaces the build number placeholder with a tag.
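A minimal sketch of what such a Jenkins step might look like; the sed call and the BUILD_TAG variable are my assumptions, not part of the original setup:
# replace the _LOCAL_ placeholder in version.py with the current build tag
sed -i "s/_LOCAL_/${BUILD_TAG}/" Module/version.py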
This question consists of two separate questions, since the rather philosophical choice of how to arrange setup requirements is actually unrelated to the installation error you are experiencing.
First about the error: It looks like the project you are trying to install depends on another library (alabaster) of which you apparently also did an editable install using pip3 install -e . that points to this directory:
/home/ktietz/src/ci/alabaster_1611921544520/work
What the error tells you is that the directory where the install is supposed to be located does not exist anymore. You should only install your project itself in editable mode; the dependencies should be installed into a classical system directory, i.e. without the option -e.
To clean up, I would suggest that you do the following:
# clean up references to the broken editable install
pip3 uninstall alabaster
# now do a proper non-editable install
pip3 install alabaster
Concerning the question how to arrange setup requirements, you should primarily use the install_requires and extras_require options of setuptools:
# either in setup.py
setuptools.setup(
    install_requires = [
        'dep1>=1.2',
        'dep2>=2.4.1',
    ]
)

# or in setup.cfg
[options]
install_requires =
    dep1>=1.2
    dep2>=2.4.1

[options.extras_require]
extra_deps_a =
    dep3
    dep4>=4.2.3
extra_deps_b =
    dep5>=5.2.1
Optional requirements can be organised in groups. To include such an extra group with the install, you can do pip3 install .[extra_deps_name].
If you wish to define specific dependency environments with exact versions (e.g. for Continuous Integration), you may use requirements.txt files in addition, but the general dependency and version constraint definitions should be done in setup.cfg or setup.py.
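Such a pinned requirements.txt for CI could then look roughly like this; the exact versions and the extra group name are made-up placeholders, and the general constraints still live in setup.cfg/setup.py:
# requirements.txt: reproduce one known-good environment
-e .[extra_deps_a]
dep1==1.2
dep2==2.4.1
dep3==3.0.2
dep4==4.2.3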

pip3 setup.py install_requires PEP 508 git URL for private repo

I am trying to run:
pip3 install -e .
in my Python project where I have the following setup.py:
from setuptools import setup

setup(
    name='mypackage',
    install_requires=[
        "anotherpackage@git+git@bitbucket.org:myorg/anotherpackage.git"
    ]
)
but it fails with:
error in mypackage setup command: 'install_requires' must be a string or list of strings containing valid project/version requirement specifiers; Invalid URL given
I guess pip is correct about the format of my URL, as PEP 508 doesn't allow specifying a git user name in SSH clone URLs.
What is the correct syntax for PEP 508 URLs with git+ssh protocol for install_requires dependency for private git repositories (in this case hosted on BitBucket)? What is the syntax for specifying a specific branch, tag or sha?
More context to avoid XY problem
I have an internal Python project that depends on multiple internally developed Python packages. I would like to avoid the necessity of hosting my own PyPI repository in the organisation, and thus I am trying to use git URLs directly. I need to use the SSH protocol for the git URLs, as all the users have their SSH keys configured and it would be cumbersome to ask all of them to configure app passwords in BitBucket (I have 2FA required and the regular user password doesn't work).
I have already tried to use:
dependency_links
setup(
    name='mypackage',
    install_requires=[
        "anotherpackage==0.0.1"
    ],
    dependency_links=[
        "git+git@bitbucket.org:myorg/anotherpackage.git@0.0.1#egg=anotherpackage-0.0.1"
    ]
)
But dependency_links are deprecated and are ignored by pip3 install -e .. According to the documentation I've found, PEP 508 URLs should be used instead.
requirements.txt file with entries duplicated from install_requires entries
I have a requirements.txt file with:
-e git+git@bitbucket.org:myorg/anotherpackage.git@0.0.1#egg=anotherpackage
and I use pip3 install -r requirements.txt instead of pip3 install -e .. It works, but it is suboptimal as I have to keep both setup.py and requirements.txt in sync.
If there is any other recommended solution for my problem I would like to learn about it :)
After checking pip source code I found the correct syntax for private BitBucket repositories.
The general form for packages with URLs is <package name>@<URI>, and the URI must start with a <scheme>://.
So I fixed it to:
anotherpackage@git+ssh://git@bitbucket.org:myorg/anotherpackage.git
and then I was getting a different error - this time the git command (invoked by pip) was complaining about the repository URL ssh://git@bitbucket.org:myorg/anotherpackage.git.
I checked the git documentation for the ssh:// URL format and found out that the hostname and organisation parts must be separated with / instead of a colon:
ssh://git@bitbucket.org/myorg/anotherpackage.git
This URL works fine. I also learned from the pip source code that the actual revision/branch/tag can be specified by appending @<rev-spec>, so I can specify for example the tag 0.0.1 with the following in install_requires:
anotherpackage@git+ssh://git@bitbucket.org/myorg/anotherpackage.git@0.0.1
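Putting the pieces together, a sketch of the resulting setup.py (organisation, repository name and tag are placeholders from the example above):
from setuptools import setup

setup(
    name='mypackage',
    install_requires=[
        # private dependency pinned to tag 0.0.1 via a PEP 508 direct reference
        "anotherpackage@git+ssh://git@bitbucket.org/myorg/anotherpackage.git@0.0.1",
    ],
)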
The only issue that I still have is that when I change the revision and run pip3 install -e . again it doesn't detect the change (even when run with --upgrade). I have to manually uninstall the package (pip3 uninstall anotherpackage) and run pip3 install -e . again.

How to install dependencies from requirements.txt in a Yocto recipe for a local Python project

What I should have:
I want my Yocto Project to build a package for my Python project with all dependencies inside. The project has to run out of the box on the resulting read-only sdcard image.
It should simply install all requirements, in the required versions, into the package.
What I tried without luck:
Calling pip in do_install():
pip/pip3 is not found, even though it's in RDEPENDS.
Anyway, I would really prefer this way.
With inherit pypi:
When trying with inherit pypi, it also tries to fetch my local sources (my Python project) from PyPI, and I always have to copy the requirements into the recipe. This is not my preferred way.
Calling pip in pkg_postinst():
It tries to install the modules on first start and fails, because the system has no internet connection and it's a read-only system. It must run out of the box without installation at first boot; this approach does its work too late.
What I want to get around:
There should be no need to change anything in the recipes when something changes in requirements.txt.
Background information
I'm working with Yocto Rocko in a Linux environment.
There is no pip installed on the host system. I want to use the pip installed via RDEPENDS on the target system.
Building the Package (only this recipe) with:
bitbake myproject
Building the whole sdcard image:
bitbake myProject-image-base
The recipe:
myproject.bb (relevant lines):
RDEPENDS_${PN} = "python3 python3-pip"
APP_SOURCES_DIR := "${@os.path.abspath(os.path.dirname(d.getVar('FILE', True)) + '/../../../../app-sources')}"
FILESEXTRAPATHS_prepend := "${THISDIR}/files:"
SRC_URI = " \
file://${APP_SOURCES_DIR}/myProject \
...
"
inherit allarch # tried also with pypi and setuptools3 for the pypi way.
do_install() { # Line 116
    install -d -m 0755 ${D}/myProject
    cp -R --no-dereference --preserve=mode,links -v ${APP_SOURCES_DIR}/myProject/* ${D}/myProject/
    pip3 install -r ${APP_SOURCES_DIR}/myProject/requirements.txt
    # Tried also python ${APP_SOURCES_DIR}/myProject/setup.py install
}
# Tried also this, but it's no option because the data MUST be included in the Package:
# pkg_postinst_${PN}() {
# #!/bin/sh -e
# pip3 install -r /myProject/requirements.txt
# }
FILES_${PN} = "/myProject/*"
Resulting Errors:
Expected: the modules listed in requirements.txt are installed into the myProject package, so that the Python app runs directly on the resulting read-only sdcard image.
With pip, I get:
| /*/tmp/work/*/myProject/0.1.0-r0/temp/run.do_install: 116: pip3: not found
| WARNING: exit code 127 from a shell command.
| ERROR: Function failed: do_install ...
When using pypi:
404 Not Found
ERROR: myProject-0.1.0-r0 do_fetch: Fetcher failure for URL: 'https://files.pythonhosted.org/packages/source/m/myproject/myproject-0.1.0.tar.gz'. Unable to fetch URL from any source.
=> But it should not fetch myProject, since it is already local and nowhere remote.
Any ideas? What would be the best way to get a ready-to-use sdcard image without the need to change recipes when requirements.txt changes?
You should use RDEPENDS_${PN} to take care of your dependencies for your app in the recipe.
For example, assuming your Python app needs the aws-iot-device-sdk-python module, you should add it to RDEPENDS in the recipe. In your case, it would be like this:
RDEPENDS_${PN} = "python3 \
python3-pip \
python3-aws-iot-device-sdk-python \
"
Here's the link showing the Python modules provided by the OpenEmbedded meta-python layer:
https://layers.openembedded.org/layerindex/branch/master/layer/meta-python/
If the modules you need are not there, you will likely need to create recipes for the modules.
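If you do end up writing such a recipe, a minimal sketch for a pure-Python module fetched from PyPI could look roughly like this; the module name, license details and checksum are placeholders you would have to fill in:
# python3-somemodule_1.0.0.bb - hypothetical recipe for a missing module
SUMMARY = "Some Python module needed by myProject"
LICENSE = "MIT"
LIC_FILES_CHKSUM = "file://LICENSE;md5=<fill-in>"

PYPI_PACKAGE = "somemodule"
SRC_URI[sha256sum] = "<fill-in>"

inherit pypi setuptools3

RDEPENDS_${PN} += "python3-core"
Then add python3-somemodule to the RDEPENDS of your application recipe, as shown above.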
My newest findings:
Yocto/BitBake seems to suppress interpreting the requirements, because doing so would break automatic dependency resolution and could lead to conflicts.
Reason: the modules required by setup.py would not be stored as independent packages but as part of my package, so BitBake does not know about these modules, which could conflict with other packages that require the same modules in different versions.
What was in my recipe:
MY_INSTALL_ARGS = "--root=${D} \
                   --prefix=${prefix} \
                   --install-lib=${PYTHON_SITEPACKAGES_DIR} \
                   --install-data=${datadir}"

do_install() {
    PYTHONPATH=${PYTHON_SITEPACKAGES_DIR} \
    ${STAGING_BINDIR_NATIVE}/${PYTHON_PN}-native/${PYTHON_PN} setup.py install ${MY_INSTALL_ARGS}
}
If I execute this outside of bitbake as python3 setup.py install ${MY_INSTALL_ARGS}, everything is installed correctly, but in the recipe no requirements are installed.
There is a --no-deps parameter, but I didn't find where it is set.
I think there could be one possibility to get the requirements from setup.py installed:
Find out where to disable --no-deps in the openembedded/poky layer for easy_install.
Create a separate PYTHON_SITEPACKAGES_DIR.
Install this separate PYTHON_SITEPACKAGES_DIR in e.g. the home directory as a private Python modules dir.
This way, no Python module would trigger a conflict.
Since I do not have the time to experiment with this, I'll now define one recipe per requirement.
Have you tried installing pip?
Debian
apt-get install python-pip
apt-get install python3-pip
Centos
yum install python-pip

How to state in requirements.txt a direct github source

I've installed a library using the command
pip install git+git://github.com/mozilla/elasticutils.git
which installs it directly from a Github repository. This works fine and I want to have that dependency in my requirements.txt. I've looked at other tickets like this but that didn't solve my problem. If I put something like
-f git+git://github.com/mozilla/elasticutils.git
elasticutils==0.7.dev
in the requirements.txt file, a pip install -r requirements.txt results in the following output:
Downloading/unpacking elasticutils==0.7.dev (from -r requirements.txt (line 20))
Could not find a version that satisfies the requirement elasticutils==0.7.dev (from -r requirements.txt (line 20)) (from versions: )
No distributions matching the version for elasticutils==0.7.dev (from -r requirements.txt (line 20))
The documentation of the requirements file does not mention links using the git+git protocol specifier, so maybe this is just not supported.
Does anybody have a solution for my problem?
Normally your requirements.txt file would look something like this:
package-one==1.9.4
package-two==3.7.1
package-three==1.0.1
...
To specify a GitHub repo, you do not need the package-name== convention.
The examples below update package-two using a GitHub repo. The text between @ and # denotes the specifics of the package.
Specify commit hash (41b95ec in the context of updated requirements.txt):
package-one==1.9.4
git+https://github.com/path/to/package-two@41b95ec#egg=package-two
package-three==1.0.1
Specify branch name (master):
git+https://github.com/path/to/package-two@master#egg=package-two
Specify tag (0.1):
git+https://github.com/path/to/package-two@0.1#egg=package-two
Specify release (3.7.1):
git+https://github.com/path/to/package-two@releases/tag/v3.7.1#egg=package-two
Note that #egg=package-two is not a comment here; it explicitly states the package name.
This blog post has some more discussion on the topic.
The "editable" package syntax can be used in requirements.txt to import packages from a variety of VCSs (git, hg, bzr, svn):
-e git://github.com/mozilla/elasticutils.git#egg=elasticutils
Also, it is possible to point to particular commit:
-e git://github.com/mozilla/elasticutils.git@000b14389171a9f0d7d713466b32bc649b0bed8e#egg=elasticutils
requirements.txt allows the following ways of specifying a dependency on a package in a git repository as of pip 7.0 [1]:
[-e] git+git://git.myproject.org/SomeProject#egg=SomeProject
[-e] git+https://git.myproject.org/SomeProject#egg=SomeProject
[-e] git+ssh://git.myproject.org/SomeProject#egg=SomeProject
-e git+git@git.myproject.org:SomeProject#egg=SomeProject (deprecated as of Jan 2020)
For Github that means you can do (notice the omitted -e):
git+git://github.com/mozilla/elasticutils.git#egg=elasticutils
Why the extra answer?
I got somewhat confused by the -e flag in the other answers so here's my clarification:
The -e or --editable flag means that the package is installed in <venv path>/src/SomeProject and thus not in the deeply buried <venv path>/lib/pythonX.X/site-packages/SomeProject it would otherwise be placed in [2].
Documentation
[1] https://pip.readthedocs.org/en/stable/reference/pip_install/#git
[2] https://pip.readthedocs.org/en/stable/reference/pip_install/#vcs-support
First, install with git+git or git+https, in any way you know. Example of installing kronok's branch of the brabeion project:
pip install -e git+https://github.com/kronok/brabeion.git@12efe6aa06b85ae5ff725d3033e38f624e0a616f#egg=brabeion
Second, use pip freeze > requirements.txt to get the right thing in your requirements.txt. In this case, you will get
-e git+https://github.com/kronok/brabeion.git@12efe6aa06b85ae5ff725d3033e38f624e0a616f#egg=brabeion-master
Third, test the result:
pip uninstall brabeion
pip install -r requirements.txt
Since pip v1.5 (released Jan 1, 2014: CHANGELOG, PR), you may also specify a subdirectory of a git repo that contains your module. The syntax looks like this:
pip install -e "git+https://git.repo/some_repo.git#egg=my_subdir_pkg&subdirectory=my_subdir_pkg" # install a python package from a repo subdirectory
Note: As a pip module author, ideally you'd probably want to publish your module in its own top-level repo if you can. Yet this feature is helpful for some pre-existing repos that contain Python modules in subdirectories. You might be forced to install them this way if they are not published to PyPI either.
None of these answers worked for me. The only thing that worked was:
git+https://github.com/path_to_my_project.git
No "e", no double "git" and no previous installs necessary.
GitHub has archive (zip/tarball) endpoints that, in my opinion, are preferable to using the git protocol. The advantages are:
You don't have to specify #egg=<project name>
Git doesn't need to be installed in your environment, which is nice for containerized environments
It works much better with pip hashing and caching
The URL structure is easier to remember and more discoverable
You usually want requirements.txt entries to look like this, i.e. without the -e prefix:
https://github.com/org/package/archive/1a58aa586efd4bca37f2cfb9d9348958986aab6c.tar.gz
To install from main branch:
https://github.com/org/package/archive/main.tar.gz
There is also an equivalent .zip endpoint, but it was reported in a comment that always using the .tar.gz endpoint avoids problems with unicode package names.
It seems like this is also a valid format:
gym-tictactoe @ git+https://github.com/haje01/gym-tictactoe.git@84e22fc28fe192ba0040bdd56a697f63d3d4a3d5
If you do a pip install "git+https://github.com/haje01/gym-tictactoe.git", then look at what got installed by running pip freeze, you will see the package described in this format and can copy and paste into requirements.txt.
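A quick sketch of that workflow, using the same package as above:
pip install "git+https://github.com/haje01/gym-tictactoe.git"
pip freeze | grep gym-tictactoe    # copy the printed "name @ git+https://..." line into requirements.txt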
I'm finding that it's kind of tricky to get pip3 (v9.0.1, as installed by Ubuntu 18.04's package manager) to actually install the thing I tell it to install. I'm posting this answer to save anyone's time who runs into this problem.
Putting this into a requirements.txt file failed:
git+git://github.com/myname/myrepo.git@my-branch#egg=eggname
By "failed" I mean that while it downloaded the code from Git, it ended up installing the original version of the code, as found on PyPI, instead of the code in the repo on that branch.
However, installing the commit instead of the branch name works:
git+git://github.com/myname/myrepo.git@d27d07c9e862feb939e56d0df19d5733ea7b4f4d#egg=eggname
For private repositories, I found that these two work fine for me:
pip install https://${GITHUB_TOKEN}@github.com/owner/repo/archive/main.tar.gz
Where main.tar.gz refers to the main branch of your repo and can be replaced with other branch names. For more information on using the more recent GitHub API, see here:
pip install https://${GITHUB_TOKEN}@api.github.com/repos/owner/repo/tarball/master
If you have git installed and available, then
pip install git+https://${GITHUB_TOKEN}@github.com/owner/repo.git@main
achieves the same, and it also allows for some more flexibility by appending @branch, @tag, or @commit-hash. That approach, however, actually clones the repo into a local temp folder, which can take a noticeable amount of time.
You can use the URLs in your requirements.txt, too.
