How to extract pip requirements which do not have a wheel?

Is there a way to list which of my requirements in my requirements.txt don't have a corresponding wheel?
This sort of works, but it doesn't seem like a very clean option and takes a lot longer than I would like:
$ pip install -r requirements.txt --force-reinstall | grep '.tar.gz'
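A lighter-weight sketch (assuming the file only contains plain name==version lines, with no comments or pip options) is to ask pip to download a wheel for each requirement on its own, with dependencies disabled, and report the ones where that fails:
while read -r req; do
  # try to fetch a wheel only; /tmp/wheel-check is just a scratch location
  pip download "$req" --only-binary=:all: --no-deps -d /tmp/wheel-check >/dev/null 2>&1 \
    || echo "no wheel: $req"
done < requirements.txt
Note that this checks for a wheel compatible with the current interpreter and platform, not for wheels in general.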

Related

Python pip rebuild requirements

Using a virtualenv with pip install and pip freeze is quite a nice way to work. All your requirements can be handled at the shell, and at a later date another developer can rebuild the environment:
pip install lib1
pip freeze > requirements.txt
# Do some development
# Oh I need this as well
pip install lib2
pip freeze > requirements.txt
# Another developer comes along
pip install -r requirements.txt
# They can carry on developing
However, if you then want to update your libraries, things are difficult (because you have all the dependencies in the freeze rather than just the packages you use).
Is there a way to work like this but also update your libraries at a later date?
One approach is to use pip-tools and keep a requirements file (pip-tools is used internally by pipenv), but this isn't very "shelly".
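For reference, a rough sketch of the pip-tools flow (lib1 is a placeholder; pip-compile and pip-sync are the commands shipped with the pip-tools package):
pip install pip-tools
echo "lib1" >> requirements.in   # only direct dependencies live in requirements.in
pip-compile requirements.in      # writes a fully pinned requirements.txt
pip-sync requirements.txt        # makes the environment match requirements.txt exactly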
My workflow avoids pip freeze. It goes:
# Oh, I need lib1
echo "lib1~=1.0" >> requirements.txt
pip install -r requirements.txt
# Oh, I need lib2
echo "lib2~=3.0" >> requirements.txt
pip install -r requirements.txt
That way, requirements.txt contains only my direct dependencies, so hopefully it's much easier to maintain.
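And since requirements.txt only lists direct dependencies, updating later should (in the simple case, at least) just be a matter of re-running the install with the upgrade flag:
pip install --upgrade -r requirements.txt
The ~= specifiers above keep each upgrade within a compatible release.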

Pip install requirements.txt with -t for some libraries [duplicate]

When I have, for example, a requirements-dev.txt and a requirements.txt, I know I can have -r requirements.txt inside requirements-dev.txt, for example, and running pip install -r requirements-dev.txt would install packages from both files.
That said, I was certain that any install option would work fine inside a requirements file. It turns out that when I place something like the following inside a requirements file:
mypackage==1.0.0 -t /path/to/local/dir
I get:
pip: error: no such option: -t
while running pip install mypackage==1.0.0 -t /path/to/local/dir works just fine. For complicated reasons, I need to place multiple packages in one requirements file, where some packages must target one directory, others must target another, and so on.
Any solutions to make this work?
As of today (in pip version 21.2.4), the -t, --target <dir> option is not supported in requirements.txt files. The section "Requirements File Format" of pip's User guide lists the currently supported options:
-i, --index-url
--extra-index-url
--no-index
-c, --constraint
-r, --requirement
-e, --editable
-f, --find-links
--no-binary
--only-binary
--prefer-binary
--require-hashes
--pre
--trusted-host
--use-feature
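For contrast, a sketch of a requirements file that sticks to supported options (the extra index URL and the package pin are placeholders):
-i https://pypi.org/simple
--extra-index-url https://example.com/simple
mypackage==1.0.0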
pip install -r requirements.txt -t /path/to/install
This should work; it worked for me.
If you want different modules to be installed to different locations, then as far as I know you have to split them into multiple requirements files.
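A minimal sketch of that split (the file names and target directories are placeholders):
pip install -r requirements-app.txt -t /path/to/app/libs
pip install -r requirements-plugins.txt -t /path/to/plugin/libs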

How can I define multiple requirement files?

How can I define multiple requirements files in my requirements.txt file?
-r base.txt
-r test.txt
The current behavior is that pip only installs packages from test.txt. I'd expect pip to install packages found in both base.txt and test.txt. I could have sworn I've seen someone do this in the past, but I can't find any examples.
pip accepts multiple -r arguments:
pip install -r reqs1.txt -r reqs2.txt
The help for pip install says:
-r, --requirement
Install from the given requirements file. This option can be used multiple times.
You can have one file "include" the other; for example, if you put this in file2.txt:
-r file1.txt
Django
Flask
etc.
Then when you do pip install -r file2.txt, it will also install things from file1.txt.
I often use this strategy to have a "base" requirements file, and then only specify the things that are required at each stage (development, testing, staging, production, etc.).
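For example, a layout along those lines might look like this (the package names below the base file are placeholders):
# base.txt - shared by every stage
Django
# dev.txt
-r base.txt
pytest
# prod.txt
-r base.txt
gunicorn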
I have many requirements files in different directories and solve this problem with:
sudo find . -name "requirement*" -type f -exec pip3 install -r '{}' ';'

pip - installation of sub-dependencies overrides other packages on requirements.txt

My requirements file looks like this:
https://github.com/sontek/pyramid_webassets/archive/38b0b9f9f4e36dc22b3a5c10eabf4d9228d97740.zip#egg=pyramid_webassets-0.0
https://github.com/miracle2k/webassets/archive/334d55c6bcfd091cb2d984777daf943acde0d364.zip#egg=webassets-0.8.dev
When running pip install -r requirements.txt, I want it to install the specific version of pyramid_webassets, and then the specific webassets version (0.8.dev).
The problem is that pyramid_webassets has webassets as a sub-dependency, and pip installs the latest version of that package.
So the output of pip freeze is:
Chameleon==2.14
Mako==0.9.1
MarkupSafe==0.18
PasteDeploy==1.5.2
WebOb==1.3.1
argparse==1.2.1
pyramid==1.4.5
pyramid-webassets==0.0
repoze.lru==0.6
translationstring==1.1
venusian==1.0a8
webassets==0.9
wsgiref==0.1.2
zope.deprecation==4.1.0
zope.interface==4.0.5
You might notice that the webassets version is the latest (0.9) even though I specified the version I want (0.8.dev).
I tried reordering the list and adding the --upgrade flag; nothing helped.
Any idea how I can install it and still have the required version of webassets?
Thanks.
Solution:
I found this command useful:
cat requirements.txt | xargs -L1 pip install
That will install the packages one by one, in order, but we should add --upgrade for the last package so it is upgraded to the pinned version.
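Putting that together, a sketch of the full sequence (the URL is the webassets pin from the requirements file above):
cat requirements.txt | xargs -L1 pip install
# re-apply the pinned webassets version on top of whatever the sub-dependency pulled in
pip install --upgrade "https://github.com/miracle2k/webassets/archive/334d55c6bcfd091cb2d984777daf943acde0d364.zip#egg=webassets-0.8.dev"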
Use pip install's option to not install package dependencies:
$ pip install --no-deps -r requirements.txt
Doing a pip freeze afterwards gives:
gottfried@sascha-Latitude-XT2:~/venv$ bin/pip freeze
argparse==1.2.1
pyramid-webassets==0.0
webassets==0.8.dev
wsgiref==0.1.2
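Note that with --no-deps pip will not pull in anything you did not list yourself, so the requirements file has to name every transitive dependency too. One way to get such a file (a sketch, starting from a known-good environment) is:
pip freeze > requirements.txt               # capture every package, including sub-dependencies
pip install --no-deps -r requirements.txt   # then install exactly that set in a fresh environment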
References
pip cookbook - Ensuring repeatability
What happens when you move webassets higher than pyramid_webassets on the list?

How can I use a pip requirements file to uninstall as well as install packages?

I have a pip requirements file that changes during development.
Can pip be made to uninstall packages that do not appear in the requirements file as well as installing those that do appear? Is there a standard method?
This would allow the pip requirements file to be the canonical list of packages - an 'if and only if' approach.
Update: I suggested it as a new feature at https://github.com/pypa/pip/issues/716
This should uninstall anything not in requirements.txt:
pip freeze | grep -v -f requirements.txt - | grep -v '^#' | xargs pip uninstall -y
This won't work quite right with packages installed with -e (i.e. from a git repository or similar), though. To skip those, just filter out lines starting with the -e flag:
pip freeze | grep -v -f requirements.txt - | grep -v '^#' | grep -v '^-e ' | xargs pip uninstall -y
Then, obviously:
pip install -r requirements.txt
Update for 2016:
You probably don't actually want to use the above approach, though. Check out pip-tools and pip-sync, which accomplish what you are probably looking to do in a much more robust way.
https://github.com/nvie/pip-tools
Update for May, 2016:
You can now also use pip uninstall -r requirements.txt; however, this accomplishes basically the opposite: it uninstalls everything listed in requirements.txt.
Update for May, 2019:
Check out pipenv or Poetry. A lot has happened in the world of package management that makes this sort of question a bit obsolete. I'm actually still quite happily using pip-tools, though.
You can now pass the -r requirements.txt argument to pip uninstall.
pip uninstall -r requirements.txt -y
At least as of pip 8.1.2, pip help uninstall shows:
...
Uninstall Options:
-r, --requirement <file>    Uninstall all the packages listed in the given requirements file. This option can be used multiple times.
...
It's not a feature of pip, no. If you really want such a thing, you could write a script to compare the output of pip freeze with your requirements.txt, but it would likely be more hassle than it's worth.
Using virtualenv, it is easier and more reliable to just create a clean environment and (re)install from requirements.txt, like:
deactivate
rm -rf venv/
virtualenv venv/
source venv/bin/activate
pip install -r requirements.txt
The short answer is no, you can't do that with pip.
Here's a simple solution that works:
pip uninstall $(pip freeze) -y
This is an old question (but a good one), and things have changed substantially since it was asked.
There's an offhand reference to pip-sync in another answer, but it deserves its own answer, because it solves precisely the OP's problem.
pip-sync takes a requirements.txt file as input, and "trues up" your current Python environment so that it matches exactly what's in that requirements.txt. This includes removing any packages that are present in your env but absent from requirements.txt.
Example: Suppose we want our env to contain (only) 3 libraries: libA, libB, and libC, like so:
> cat requirements.txt
libA==1.0
libB==1.1
libC==1.2
But our env currently contains libC and libD:
> pip freeze
libC==1.2
libD==1.3
Running pip-sync will result in this, which was our desired final state:
> pip-sync requirements.txt
> pip freeze
libA==1.0
libB==1.1
libC==1.2
Stephen's proposal is a nice idea, but unfortunately it doesn't work if you include only direct requirements in your file, which sounds cleaner to me.
All dependencies will be uninstalled, including even distribute, breaking pip itself.
Maintaining a clean requirements file while version tracking a virtual environment
Here is how I try to version-track my virtual environment.
I try to maintain a minimal requirements.txt, including only the direct requirements, and not even mentioning version constraints where I'm not sure.
Besides that, I keep the actual state of my virtualenv in a venv.pip file, which I also include in version tracking (say, git).
Here is a sample workflow:
Set up a virtualenv workspace, with version tracking:
mkdir /tmp/pip_uninstalling
cd /tmp/pip_uninstalling
virtualenv venv
. venv/bin/activate
Initialize the version tracking system:
git init
echo venv > .gitignore
pip freeze > venv.pip
git add .gitignore venv.pip
git commit -m "Python project with venv"
Install a package with dependencies, and include it in the requirements file:
echo flask > requirements.txt
pip install -r requirements.txt
pip freeze > venv.pip
Now start building your app, then commit and start a new branch:
vim myapp.py
git commit -am "Simple flask application"
git checkout -b "experiments"
Install an extra package:
echo flask-script >> requirements.txt
pip install -r requirements.txt
pip freeze > venv.pip
... play with it, and then come back to the earlier version:
vim manage.py
git commit -am "Playing with flask-script"
git checkout master
Now uninstall extraneous packages:
pip freeze | grep -v -f venv.pip | xargs pip uninstall -y
I suppose the process can be automated with git hooks, but let's not go off topic.
Of course, it then makes sense to use some package caching system or local repository like pip2pi.
You can create a new file with all installed packages:
pip freeze > uninstall.txt
and then uninstall all of those:
pip uninstall -r uninstall.txt -y
and finally re-install the packages you had in your original requirements.txt file:
pip install -r requirements.txt
Piggybacking off @stephen-j-fuhry's answer, here is a PowerShell equivalent I use:
pip freeze | ? { $_ -notmatch ((gc req.txt) -join "|") }
While this doesn't directly answer the question, a better alternative to requirements.txt nowadays is using a Pipfile. This functions similarly to a Ruby Gemfile. Currently, you need to use the pipenv tool, but hopefully this will eventually be incorporated into pip itself. pipenv provides the pipenv clean command, which does what you want.
(Note that you can import an existing requirements.txt with pipenv install -r requirements.txt. After this you should have a Pipfile and the requirements.txt can be removed.)
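A sketch of that flow from the shell (assuming pipenv is not installed yet):
pip install pipenv
pipenv install -r requirements.txt   # imports the existing requirements into a Pipfile
pipenv clean                         # uninstalls everything not listed in Pipfile.lock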
It is now possible using:
pip uninstall -r requirements.txt
