python2: pip parse_requirements with --trusted-host and --extra-index-url

I am using
install_requires = [str(ir.req) for ir in parse_requirements("requirements.txt", session=PipSession())]
with pip install .. However, this does not seem to work with a requirements.txt that looks like this:
--trusted-host blah
--extra-index-url blah2
...
(Support for --trusted-host was added in pip 8.0.0.) The install from blah fails: pip complains about it being an untrusted host, as if it never processed the first line.
HOWEVER, pip install -r requirements.txt works perfectly, so these options are correct.
This means there is something parse_requirements is not doing. My question is: how do I fix or work around this using only pip install .? I could do something horrendous like:
os.system("pip install -r requirements.txt")
setup(...
in the setup.py file.
The implicit coupling of requirements.txt and setup.py is confusing to me. Nothing in setup reads requirements.txt unless you explicitly parse it yourself, yet requirements.txt is a very standard Python convention.
EDIT: We are using tools (Cloudify and sometimes Chef) that execute a pip install .. We cannot change this. I have to get this working as a pippable package, with the --trusted-host and --extra-index-url options, without using a pip.conf either. Currently we are doing the os.system trick.

There has been much written about setup.py vs. requirements.txt. setup.py is for abstract requirements; requirements.txt is for concrete requirements. In other words, they serve different purposes: whereas requirements.txt describes an environment, setup.py describes a package. So it doesn't make sense for a setup.py to read from a requirements.txt, just like it wouldn't make sense for a deb package to read from a Chef cookbook. See Abstract vs. Concrete Requirements.
Often the reason people do this is that they want to support installing their package with pip install -r requirements.txt from within a checkout, without needing to list their dependencies twice. That's a reasonable thing to want, which is why the requirements file format has a construct that enables it: simply make a requirements.txt file that contains "." or "-e .", and pip will automatically install the project and all of its dependencies (example below).
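For instance, reusing the placeholder hosts from the question, a requirements.txt along these lines keeps the dependency list solely in setup.py's install_requires while still letting pip install -r requirements.txt work from a checkout:
--trusted-host blah
--extra-index-url blah2
-e .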
EDIT:
Since pip is not a library, using the most exposed part of the program is the safest (in my opinion). An alternative to os.system is:
import pip
pip.main(['install', '-r', 'requirements.txt'])
(Note that pip.main was removed in pip 10; on pip >= 10 you would have to shell out to pip in a subprocess instead.)

This answer from goCards is perfectly valid and you should probably import pip from your setup.py if there's no way around pip install .. But I would like to explain what actually happens here. Here's what you need to know:
install_requires is an option only supported by setuptools, a third-party package that improves upon distutils (the standard tool used in setup.py files and distributed with Python).
By design, setuptools only accepts actual requirements in install_requires, so options such as --trusted-host cannot be sent to install_requires.
This is why you're using parse_requirements("requirements.txt", session=PipSession()). This function only yields packages. Option lines such as --trusted-host bla are not returned; they are sent to a PackageFinder, if you gave one to parse_requirements. But you don't want these options to be returned, because setuptools would not know what to do with them!
Long story short, if you want to use pip options, you need to use pip.
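To make this concrete, here is a minimal sketch of the pattern under discussion, assuming pip < 10 (where parse_requirements lives in pip.req and PipSession in pip.download; both were moved under pip._internal later):
from pip.download import PipSession
from pip.req import parse_requirements

# parse_requirements yields InstallRequirement objects for package lines only;
# option lines such as --trusted-host are consumed by pip itself, so they
# never reach install_requires and setuptools never sees them.
install_requires = [str(ir.req)
                    for ir in parse_requirements("requirements.txt",
                                                 session=PipSession())]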

Related

List Python packages that will be installed from requirements.txt

In my requirements.txt I have packages defined in following manner:
Django ~= 2.2.0
It means that when I use pip install -r requirements.txt, pip will find the latest available 2.2.x version and install it along with all dependencies.
What I need is requirements-formatted list of all packages with explicit versions that will be installed but without actually installing any packages. So example output would be something like:
Django==2.2.23
package1==0.2.1
package2==1.4.3
...
So in other words I'm looking for something like pip freeze results but without installing anything.
pip-compile is what you need!
Docs: https://github.com/jazzband/pip-tools
python -m pip install pip-tools
pip-compile requirements.txt --output-file requirements-all.txt
The pip-compile command lets you compile a fully pinned requirements file from your dependencies; this way you can pip install your dependencies and always get the same environment.
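As a side note, the usual pip-tools workflow keeps the loose pins in a requirements.in file and lets pip-compile produce the pinned requirements.txt from it, along these lines:
pip-compile requirements.in --output-file requirements.txt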
TL;DR
Try pipdeptree or pip-tree.
Explanation
pip, contrary to most package managers, doesn't have a big dependency graph to look up. What it does instead is let arbitrary setup code execute, which pulls in the dependencies automatically. This means that, for example, a package could manage its dependencies in some way other than putting them in requirements.txt (see fastai for an example of a project that handles dependencies differently).
So there is, theoretically, no way to see all the dependencies other than actually running an install in an isolated environment, seeing what was pulled in, and then deleting the environment (because the same part of the code that does the installation is potentially what brings in the dependencies). You could actually do that with venv.
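A minimal sketch of that throwaway-environment approach, assuming Python 3.7+ (the function name and the use of pip freeze to capture the result are illustrative choices):
import shutil
import subprocess
import tempfile
import venv

def resolve_requirements(req_file="requirements.txt"):
    # Create a throwaway environment, install into it, capture the result,
    # then delete it -- the environment is only a probe.
    env_dir = tempfile.mkdtemp(prefix="dep-probe-")
    try:
        venv.create(env_dir, with_pip=True)
        pip = env_dir + "/bin/pip"  # on Windows: env_dir + r"\Scripts\pip.exe"
        subprocess.check_call([pip, "install", "-r", req_file])
        return subprocess.check_output([pip, "freeze"], text=True)
    finally:
        shutil.rmtree(env_dir)

print(resolve_requirements())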
In practice, tools like pipdeptree or pip-tree fetch the dependencies based on some standardization of the requirements (most packages separate the dependency declaration from the installation, and actually let pip handle both).
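If you go the pipdeptree route, usage is along these lines (mypackage is a placeholder):
python -m pip install pipdeptree
pipdeptree --packages mypackage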

Python3: install github-based module in setup.py?

Installing with pip, I can write the following requirements.txt file:
git+https://repo@branch#egg=foo&subdirectory=this/bar/foo
numpy
And successfully install the requirements file:
python3 -m pip install -r requirements.txt
However, I have co-located in the directory a setup.py script that lists:
setuptools.setup(
    ...
    install_requires=get_lines('requirements.txt'),
    ...
)
And installing this submodule using pip involves pip running setup.py, which fails to handle the module link:
git+https://github.com/repo@branch#egg=foo&subdirectory=this/bar/foo
I can see a lot of ways around this, but it seems like there should be one non-ambiguous way to do this which changes as little as possible in the setup.py script.
Is there such a way?
You probably need to change the line in requirements.txt to something like:
foo @ git+https://repo@branch#egg=foo&subdirectory=this/bar/foo
References:
https://pip.pypa.io/en/stable/reference/pip_install/#requirement-specifiers
https://www.python.org/dev/peps/pep-0440/#direct-references
I am not entirely sure it will work, though. There might be subtle differences between the notations accepted in requirements.txt files, by pip directly, and by setuptools. In particular, I do not know how well things like egg and subdirectory are supported.
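For what it's worth, the same direct-reference notation can also be placed straight into install_requires; a sketch under the same caveats (all names and URLs are placeholders):
import setuptools

setuptools.setup(
    name="myproject",  # placeholder
    install_requires=[
        # PEP 440 direct reference: "name @ url"
        "foo @ git+https://repo@branch#egg=foo&subdirectory=this/bar/foo",
        "numpy",
    ],
)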
Advice:
Avoid calling python setup.py install or python setup.py develop from now on, and make sure to call python -m pip install . or python -m pip install --editable . instead.
I do consider reading requirements.txt from within setup.py as a red flag (or at least yellow). The contents of install_requires of setuptools and requirements.txt usually serve different purposes.

Running custom code with pip install fails

I have a Python package that I'm distributing with pip. I need to add some custom code to be run at install time:
from setuptools import setup
from setuptools.command.install import install

class CustomInstall(install):
    def run(self):
        install.run(self)
        print "TEST"

setup(
    ...
    cmdclass={'install': CustomInstall},
    ...)
I thought the problem might be pip suppressing stdout: Custom pip install commands not running. But then I replaced print "TEST" with code that creates a file and writes some text, and that didn't happen either.
It appears that my custom run method is only happening when I create and upload my_package to test PyPI:
python setup.py sdist bdist_wheel upload -r https://testpypi.python.org/pypi
and not when I pip install it:
pip install -i https://testpypi.python.org/pypi my_package
Maybe I am fundamentally not understanding how pip and setuptools work, but that is the opposite of the behavior I expected.
My questions are:
How can I get my CustomInstall class to work?
and
What actually happens when you call pip install?
I've looked at the setuptools docs and the PyPI docs, and I haven't been able to figure it out. It seems like other people have had success with this: Run custom task when call `pip install`, so I'm not sure what's going wrong.
So I'm not sure how much this will help, but I recently dealt with a similar issue, and here's what I learned.
Your custom install code appears to be correct. However, there are more methods than just run that can be overridden. Another useful one is finalize_options, because there you can write code that dynamically changes the parameters of your setup.py (example here); see the sketch below.
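A minimal sketch of what overriding both hooks can look like (the class body is illustrative, not the original poster's code; the file write sidesteps pip possibly swallowing stdout):
from setuptools import setup
from setuptools.command.install import install

class CustomInstall(install):
    def finalize_options(self):
        # Runs after options are parsed; a place to adjust them dynamically.
        install.finalize_options(self)

    def run(self):
        install.run(self)
        # Write a file instead of printing, since pip may suppress stdout.
        with open("install_ran.txt", "w") as fh:
            fh.write("CustomInstall.run executed\n")

setup(
    name="my_package",  # placeholder
    cmdclass={'install': CustomInstall},
)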
This is a very good question to ask. pip install does various things depending on various factors. From where are you installing the package: from PyPI or some other package index? How was the package distributed: is it a binary dist (.whl) or a source dist (.gz) file? Are you installing the package via a local directory? A remote repo via a VCS URL? Pip does not necessarily use the same approach for each of these cases; in particular, installing from a wheel never runs setup.py, so custom commands can only fire for source installs. I would recommend using the -vvv flag to see what exactly pip is doing. It may not be running setuptools's install command for whatever reason. Do you have
packages=setuptools.find_packages(),
include_package_data=True
in your setup.py file? Without these lines, pip could be installing your package's metadata but not the package itself.

How do I "pretend" to install a package using pip?

This feels like such a simple question, but I can't find any reference in the pip documentation and the only question that seemed relevant mentions a flag that has apparently been deprecated since version 1.5 (version 8.1 is current at the time of this writing).
How do I "pretend" to install a package or list of packages using pip, without actually installing them? I have two separate use cases for this:
I need to see what packages out of a long (~70 line) requirements.txt are missing, without actually installing them; seeing what requirements are already satisfied without installing the missing requirements would satisfy this for me.
Finding the dependencies for a package that I have not yet installed on my computer, without using something like Portage or Aptitude.
There is also the pretty useful pip-tools package that provides a pip-sync tool which you can execute in a "dry run" mode against your requirements file(s):
$ mkvirtualenv test_so
New python executable in test_so/bin/python
Installing setuptools, pip, wheel...done.
...
(test_so) $ pip install pip-tools
...
Installing collected packages: six, click, first, pip-tools
(test_so) $ echo "Django==1.6.11" > requirements.txt
(test_so) $ pip-sync --dry-run requirements.txt
Would install:
Django==1.6.11
Also, here is a partially relevant thread: Check if requirements are up to date.
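Worth noting for newer environments: pip 22.2 and later also have a native dry-run mode, so something along these lines reports what would be installed without installing it:
pip install --dry-run -r requirements.txt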
Per the pip documentation, the proper way to generate the requirements.txt file is via pip freeze > requirements.txt. Hopefully this is what you wanted.

Installing Python packages from local file system folder to virtualenv with pip

Is it possible to install packages using pip from the local filesystem?
I have run python setup.py sdist for my package, which has created the appropriate tar.gz file. This file is stored on my system at /srv/pkg/mypackage/mypackage-0.1.0.tar.gz.
Now in a virtual environment I would like to install packages either coming from pypi or from the specific local location /srv/pkg.
Is this possible?
PS
I know that I can specify pip install /srv/pkg/mypackage/mypackage-0.1.0.tar.gz. That will work, but I am talking about using the /srv/pkg location as another place for pip to search if I typed pip install mypackage.
What about:
pip install --help
...
-e, --editable <path/url> Install a project in editable mode (i.e. setuptools
"develop mode") from a local project path or a VCS url.
e.g., pip install -e /srv/pkg
where /srv/pkg is the top-level directory where 'setup.py' can be found.
I am pretty sure that what you are looking for is the --find-links option.
You can do
pip install mypackage --no-index --find-links file:///srv/pkg/mypackage
From the installing-packages page you can simply run:
pip install /srv/pkg/mypackage
where /srv/pkg/mypackage is the directory containing setup.py.
Additionally, you can install it from the archive file:
pip install ./mypackage-1.0.4.tar.gz
(Although this was noted in the question, it is included here due to its popularity.)
I was installing pyfuzzy, but it is not in PyPI; it returned the message: No matching distribution found for pyfuzzy.
I tried the accepted answer
pip install --no-index --find-links=file:///Users/victor/Downloads/pyfuzzy-0.1.0 pyfuzzy
But it does not work either and returns the following error:
Ignoring indexes: https://pypi.python.org/simple
Collecting pyfuzzy
Could not find a version that satisfies the requirement pyfuzzy (from versions: )
No matching distribution found for pyfuzzy
At last, I found a simple, good way here: https://pip.pypa.io/en/latest/reference/pip_install.html
Install a particular source archive file.
$ pip install ./downloads/SomePackage-1.0.4.tar.gz
$ pip install http://my.package.repo/SomePackage-1.0.4.zip
So the following command worked for me:
pip install ../pyfuzzy-0.1.0.tar.gz
Hope it can help you.
This is the solution that I ended up using:
import pip

def install(package):
    # Debugging variant:
    # pip.main(["install", "--pre", "--upgrade", "--no-index",
    #           "--find-links=.", package, "--log-file", "log.txt", "-vv"])
    pip.main(["install", "--upgrade", "--no-index", "--find-links=.", package])

if __name__ == "__main__":
    install("mypackagename")
    raw_input("Press Enter to Exit...\n")
I pieced this together from pip install examples as well as from Rikard's answer on another question. The "--pre" argument lets you install non-production versions. The "--no-index" argument avoids searching the PyPI indexes. The "--find-links=." argument searches in the local folder (this can be relative or absolute). I used the "--log-file", "log.txt", and "-vv" arguments for debugging. The "--upgrade" argument lets you install newer versions over older ones.
I also found a good way to uninstall them. This is useful when you have several different Python environments. It's the same basic format, just using "uninstall" instead of "install", with a safety measure to prevent unintended uninstalls:
import pip

def uninstall(package):
    response = raw_input("Uninstall '%s'? [y/n]:\n" % package)
    if "y" in response.lower():
        # Debugging variant:
        # pip.main(["uninstall", package, "-vv"])
        pip.main(["uninstall", package])

if __name__ == "__main__":
    uninstall("mypackagename")
    raw_input("Press Enter to Exit...\n")
The local folder contains these files: install.py, uninstall.py, mypackagename-1.0.zip
The --find-links option does the job, and it works from a requirements.txt file!
You can put package archives in some folder and take the latest one without changing the requirements file; for example, given this layout:
.
├───requirements.txt
└───requirements
    ├───foo_bar-0.1.5-py2.py3-none-any.whl
    ├───foo_bar-0.1.6-py2.py3-none-any.whl
    ├───wiz_bang-0.7-py2.py3-none-any.whl
    ├───wiz_bang-0.8-py2.py3-none-any.whl
    ├───base.txt
    ├───local.txt
    └───production.txt
Now in requirements/base.txt put:
--find-links=requirements
foo_bar
wiz_bang>=0.8
This is a neat way to update proprietary packages: just drop a new one in the folder.
This way you can install packages from a local folder AND PyPI with the same single call: pip install -r requirements/production.txt
PS. See my cookiecutter-djangopackage fork to see how to split requirements and use folder based requirements organization.
Assuming you have virtualenv and a requirements.txt file, you can define inside this file where to get the packages:
# Published pypi packages
PyJWT==1.6.4
email_validator==1.0.3
# Remote GIT repo package, this will install as django-bootstrap-themes
git+https://github.com/marquicus/django-bootstrap-themes#egg=django-bootstrap-themes
# Local GIT repo package, this will install as django-knowledge
git+file:///soft/SANDBOX/python/django/forks/django-knowledge#egg=django-knowledge
To install only from local you need 2 options:
--find-links: where to look for dependencies. There is no need for the file:// prefix mentioned by others.
--no-index: do not look in pypi indexes for missing dependencies (dependencies not installed and not in the --find-links path).
So you could run from any folder the following:
pip install --no-index --find-links /srv/pkg /path/to/mypackage-0.1.0.tar.gz
If your mypackage is set up properly, it will list all its dependencies, and if you used pip download to fetch the cascade of dependencies (i.e. dependencies of dependencies, etc.), everything will work.
If you want to use the PyPI index when it is accessible, but fall back to local wheels when it is not, you can remove --no-index and add --retries 0. You will see pip pause for a bit while it tries to check PyPI for a missing dependency (one not installed), and when it finds it cannot reach it, it will fall back to local. There does not seem to be a way to tell pip to "look for local ones first, then the index"; see the example below.
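With that change, the earlier command becomes something like:
pip install --retries 0 --find-links /srv/pkg /path/to/mypackage-0.1.0.tar.gz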
With requirements in requirements.txt and eggs_dir as a directory, you can build your local cache:
$ pip download -r requirements.txt -d eggs_dir
then, using that "cache" is as simple as:
$ pip install -r requirements.txt --find-links=eggs_dir
What you need is --find-links of pip install.
-f, --find-links If a url or path to an html file, then parse for links to archives. If a local path or
file:// url that's a directory, then look for archives in the directory listing.
In my case, after python -m build, a tar.gz package (and a whl file) are generated in the ./dist directory.
pip install --no-index -f ./dist YOUR_PACKAGE_NAME
Any tar.gz Python package in ./dist can be installed this way.
But if your package has dependencies, this command will raise an error.
To solve this, you can either pip install those deps from the official PyPI source and then add --no-deps, like this:
pip install --no-index --no-deps -f ./dist YOUR_PACKAGE_NAME
or copy your deps' packages to the ./dist directory.
I've been trying to achieve something really simple and failed miserably; probably I'm being stupid.
Anyway, if you have a script/Dockerfile which downloads a python package zip file (e.g. from GitHub) and you then want to install it, you can use the file:/// prefix as shown in the following example:
$ wget https://example.com/mypackage.zip
$ echo "${MYPACKAGE_MD5} mypackage.zip" | md5sum --check -
$ pip install file://$(pwd)/mypackage.zip
NOTE: I know you could install the package straight away using pip install https://example.com/mypackage.zip but in my case I wanted to verify the checksum (never paranoid enough) and I failed miserably when trying to use the various options that pip provides/the #md5 fragment.
It's been surprisingly frustrating to do something so simple directly with pip. I just wanted to pass a checksum and have pip verify that the zip was matching before installing it.
I was probably doing something very stupid but in the end I gave up and opted for this. I hope it helps others trying to do something similar.
In my case, it was because this library depended on another local library, which I had not yet installed. Installing the dependency with pip, and then the dependent library, solved the issue.
If you want to install one local package (package A) to be used inside another local project/package (B), this is quite simple. All you need is to cd into (B) and call:
pip install /path/to/package(A)
Of course you will need to first compile the package (A) with:
sudo python3 ./setup.py install
And, each time you change package A, just run setup.py again in package (A), then run pip install ... again inside the using project/package (B).
Just add the directory to the pip command:
pip install mypackage file:/location/in/disk/mypackagename.filetype
