I am following the Docker tutorial (https://docs.docker.com/language/python/build-images/) to build a simple Python app. Using the pip freeze command, I made a requirements.txt file that contains a lot of packages.
When I try to build the Docker image, I get this error:
Step 4/6 : RUN pip3 install -r requirements.txt
 ---> Running in f92acd21d271
ERROR: Could not find a version that satisfies the requirement apt-clone==0.2.1 (from versions: none)
ERROR: No matching distribution found for apt-clone==0.2.1
The command '/bin/sh -c pip3 install -r requirements.txt' returned a non-zero code: 1
This is my Dockerfile (the same as in the tutorial):
# syntax=docker/dockerfile:1
FROM python:3.8-slim-buster
WORKDIR /app
COPY requirements.txt requirements.txt
RUN pip3 install -r requirements.txt
COPY . .
CMD [ "python3", "-m", "flask","run","--host=0.0.0.0" ]
It's not specific to the apt-clone==0.2.1 package; whatever I try to install in the Docker image fails the same way. I also tried running apt update and installing pip3 in the Dockerfile, but that didn't help either.
What did I miss?
pip3 freeze outputs every package and its version installed in the current environment, regardless of whether the package was installed by pip or by some other method.
In fact, apt-clone==0.2.1 comes from the Debian package repository, not from pypi.org. See the following:
$ pip3 freeze | grep apt-clone
$ apt-get install -y apt-clone
$ dpkg -l apt-clone
Desired=Unknown/Install/Remove/Purge/Hold
| Status=Not/Inst/Conf-files/Unpacked/halF-conf/Half-inst/trig-aWait/Trig-pend
|/ Err?=(none)/Reinst-required (Status,Err: uppercase=bad)
||/ Name Version Architecture Description
+++-==============-============-============-=================================
ii apt-clone 0.4.1 all Script to create state bundles
$ dpkg -L apt-clone
/.
/usr
/usr/share
/usr/share/doc
/usr/share/doc/apt-clone
/usr/share/doc/apt-clone/copyright
/usr/share/doc/apt-clone/changelog.gz
/usr/share/man
/usr/share/man/man8
/usr/share/man/man8/apt-clone.8.gz
/usr/lib
/usr/lib/python3
/usr/lib/python3/dist-packages
/usr/lib/python3/dist-packages/apt_clone.py
/usr/lib/python3/dist-packages/apt_clone-0.2.1.egg-info
/usr/bin
/usr/bin/apt-clone
$ pip3 freeze | grep apt-clone
apt-clone==0.2.1
You can see from the above that apt_clone.py and apt_clone-0.2.1.egg-info were installed by the Debian package apt-clone; 0.4.1 is the Debian package version, while 0.2.1 is the Python package version.
So, for packages like apt-clone, you need to install them with apt even though they show up in pip3 freeze.
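If your image genuinely needs such a package, a minimal sketch of the fix is to install it with apt in the Dockerfile and keep requirements.txt for PyPI packages only (apt-clone is just the example from the question; the cache cleanup is a common image-size optimization, not something the tutorial mandates):
# In the Dockerfile, before the pip3 install step:
RUN apt-get update \
    && apt-get install -y --no-install-recommends apt-clone \
    && rm -rf /var/lib/apt/lists/*
RUN pip3 install -r requirements.txt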
There is no package named apt-clone in the public PyPI repository, so pip3 obviously cannot find it.
If you actually have a Python package named like this, where did it come from?
If you created a package with this name locally, you need to install it too in your Docker image somehow.
If you are inside an organization, maybe you have a local PyPI with different packages than the public one, and then you need to configure pip3 to use it (try pip3 install -i http://your.local.pypi/simple -r requirements.txt where obviously the URL needs to be something else, but we can't guess what).
If pip3 freeze says you have this package, it must have come from somewhere, but we don't know where. You have to figure that out, or supply more details to help us help you.
There is a Debian package named apt-clone, but that obviously isn't something pip can install. Did you actually mean apt-get install -y apt-clone=0.2.1? (I can only find version 0.4.1 in Debian Buster, though.)
@atline's answer explains what happened (you should probably accept it), but the simple fix is to manually populate requirements.txt with your actual requirements. pip3 freeze is a nice shorthand for that if you are in a virtual environment, but it's not a proper replacement for a manually curated requirements.txt or setup.py (or its modern replacements, currently moving from setup.cfg to pyproject.toml). If Flask is the only package your app needs, simply use RUN pip3 install flask in your Dockerfile and don't create a requirements.txt at all.
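For example, a minimal sketch of regenerating requirements.txt from a clean virtual environment, so that pip3 freeze only sees packages that actually came from PyPI (flask stands in for whatever your app really imports):
python3 -m venv venv                 # fresh environment, no system packages visible
. venv/bin/activate
pip3 install flask                   # install only what the app needs
pip3 freeze > requirements.txt       # now lists flask and its pip-installed dependencies only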
Related
I'm having trouble installing the dependencies of a Python wheel from Artifactory on my Jenkins job. It seems to be finding and pulling the correct .whl file, but it doesn't seem to be installing its dependencies. I get this error:
Looking in indexes: https://biv_ci_id:****@docker.repo1.uhc.com/artifactory/api/pypi/pypi-local/simple
Collecting ness_logger_lib
Downloading https://docker.repo1.uhc.com/artifactory/api/pypi/pypi-local/smart-audit/ness_logger_lib-1.0.1-py2.py3-none-any.whl (5.8 kB)
ERROR: Could not find a version that satisfies the requirement avro-python3 (from ness-logger-lib) (from versions: none)
ERROR: No matching distribution found for avro-python3
I'm trying to run unit tests for an application that needs the wheel listed above. I'm doing this during the testing stage of a Jenkins job. The stage that fails on my Jenkinsfile looks like this:
stage('Python Unit Testing') {
    agent { label 'docker-kitchensink-slave' }
    steps {
        sh '''
            export PYTHON_VERSION=${PYTHON_VERSION}
            . /etc/profile.d/jenkins.sh
            python3 -m venv venv
            . venv/bin/activate
            python3 -m ensurepip
            python3 -m pip install --upgrade pip
            python3 -m pip install -r requirements.txt
            python3 -m pip install ness_logger_lib -i https://${ci_id}:${ci_pass}@docker.repo1.uhc.com/artifactory/api/pypi/pypi-local/simple
            pytest ./sa_create_run_job --cov=./sa_create_run_job/server --cov-report=xml
        '''
        archiveArtifacts 'coverage.xml'
        stash includes: 'coverage.xml', name: 'py-coverage'
    }
}
As you can see from the error, it's able to find and pull the wheel from Artifactory just fine, but it doesn't seem to be able to find the dependencies. Manually installing those dependencies first, like so, works around the problem:
steps {
    sh '''
        export PYTHON_VERSION=${PYTHON_VERSION}
        . /etc/profile.d/jenkins.sh
        python3 -m venv venv
        . venv/bin/activate
        python3 -m ensurepip
        python3 -m pip install --upgrade pip
        python3 -m pip install -r requirements.txt
        python3 -m pip install avro-python3
        python3 -m pip install kafka-python
        python3 -m pip install netaddr
        python3 -m pip install pathlib
        python3 -m pip install psutil
        python3 -m pip install requests
        python3 -m pip install ness_logger_lib -i https://${ci_id}:${ci_pass}@docker.repo1.uhc.com/artifactory/api/pypi/pypi-local/simple
        pytest ./sa_create_run_job --cov=./sa_create_run_job/server --cov-report=xml
    '''
    archiveArtifacts 'coverage.xml'
    stash includes: 'coverage.xml', name: 'py-coverage'
}
However, this is pretty inconvenient, especially since we plan to use this wheel for multiple applications.
How can I ensure that the wheel's dependencies are installed automatically when pulling from Artifactory's pypi-local?
TL;DR
You have to use the --extra-index-url option of pip instead of -i / --index-url (see pip help install and look for the Package Index Options: section)
python3 -m pip install ness_logger_lib \
--extra-index-url https://${ci_id}:${ci_pass}@docker.repo1.uhc.com/artifactory/api/pypi/pypi-local/simple
Background info
I have to take a guess here, but I'm pretty sure your local PyPI is not an actual mirror of PyPI and only contains a few extra packages. Moreover, from the rest of your commands we can see that you have access to the default pypi.org index.
If my guess is correct, you are using pip wrongly, i.e. you are replacing the default pypi.org index with your own (and incomplete) index. Pip finds your explicit custom package on your custom index, but then tries to resolve its dependencies on that same index and fails.
When you replace the -i option with --extra-index-url, you instruct pip to look in both indexes for your packages. It will then find your local package in your Artifactory index and the dependencies in the default pypi.org index.
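If you'd rather not repeat the flag everywhere, a sketch of persisting it with pip config (available in pip >= 10; the URL is the one from your question, and the credentials still need to come from somewhere safe such as Jenkins secrets):
pip config set global.extra-index-url \
    "https://${ci_id}:${ci_pass}@docker.repo1.uhc.com/artifactory/api/pypi/pypi-local/simple"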
I would like to install a certain package from a private git repository. This is possible using pip install git+<REPO_LINK>. However, I would like to pip install -r requirements.txt all of my packages at the same time, without having to specify which ones come from PyPI and which from the private repo.
I have tried adding a configuration in ~/.config/pip/pip.conf
[global]
find-links =
    git+<REPO_LINK>
but this happened when running pip install -r requirements.txt:
ERROR: Could not find a version that satisfies the requirement my-package==0.1
Thanks in advance.
I have found a solution for it in this doc:
pip install git+<REPO_LINK>#egg=<PACKAGE_NAME>
When I run pip freeze, the package I have just installed is printed like this:
git+<REPO_LINK>#egg=<PACKAGE_NAME>
Add that line to your requirements.txt, so that pip install -r requirements.txt installs this specific package along with the public ones from PyPI.
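For example, a sketch of a requirements.txt mixing both kinds of entries (the placeholders are kept as above, and the flask pin is purely illustrative):
flask==2.0.1
git+<REPO_LINK>#egg=<PACKAGE_NAME>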
:)
I am a Mac user and used to run pip install with --user, but recently, after a brew update, I found some strange things happening, maybe related.
Whatever I try, the packages are always installed to ~/Library/Python/2.7/lib/python/site-packages.
Here are the commands I run.
$ python -m site --user-site
~/Library/Python/2.7/lib/python/site-packages
$ pip install --user -r requirements.txt
$ PYTHONUSERBASE=. pip install --user -r requirements.txt
So what could be the problem?
I used this for Lambda zip packaging.
Updates:
If you are using Mac OS X and have Python installed via Homebrew, the accepted command will not work. A simple workaround is to add a setup.cfg file in your /path/to/project-dir with the following content:
[install]
prefix=
https://docs.aws.amazon.com/lambda/latest/dg/lambda-python-how-to-create-deployment-package.html
You can use the --target (-t) flag of pip install to specify a target location for installation.
Usage:
pip install -r requirements.txt -t /path/to/directory
Or, to install into the current directory:
pip install -r requirements.txt -t .
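For example, a hedged sketch of the Lambda packaging flow from the AWS docs linked above (the directory and handler names are illustrative, not prescribed):
pip install -r requirements.txt -t ./package     # install the dependencies into ./package
cd package && zip -r ../function.zip . && cd ..  # zip up the dependencies
zip -g function.zip lambda_function.py           # append the handler module to the archive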
How do I upgrade all my Python packages from a requirements.txt file using the pip command?
I tried the command below:
$ pip install --upgrade -r requirements.txt
Since the packages are suffixed with a version number (Django==1.5.1), they don't seem to upgrade. Is there any better approach than manually editing the requirements.txt file?
EDIT
As Andy mentioned in his answer, packages are pinned to a specific version, hence it is not possible to upgrade them through the plain pip command.
But we can achieve this with pip-review using the following command:
$ pip-review --auto
This will automatically upgrade all packages from requirements.txt (make sure to install it first with pip install pip-review; pip-review was originally part of pip-tools but is now a separate package).
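pip-review also has an interactive mode if you prefer to approve each upgrade individually; a short sketch:
pip install pip-review     # pip-review is distributed as its own package these days
pip-review --interactive   # ask for confirmation before upgrading each package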
I already answered this question here. Here's my solution:
Because there was no easy way to upgrade packages one by one while also updating the requirements.txt file, I wrote pip-upgrader, which also updates the versions in your requirements.txt file for the packages chosen (or all packages).
Installation
pip install pip-upgrader
Usage
Activate your virtualenv (important, because it will also install the new versions of upgraded packages in the current virtualenv).
cd into your project directory, then run:
pip-upgrade
Advanced usage
If the requirements are placed in a non-standard location, send them as arguments:
pip-upgrade path/to/requirements.txt
If you already know which packages you want to upgrade, simply send them as arguments:
pip-upgrade -p django -p celery -p dateutil
If you need to upgrade to a pre-release / post-release version, add the --prerelease argument to your command.
Full disclosure: I wrote this package.
You can try:
pip install --upgrade --force-reinstall -r requirements.txt
You can also ignore the installed packages and install the new ones:
pip install --ignore-installed -r requirements.txt
No. Your requirements file has been pinned to specific versions. If your requirements are pinned to those versions, you should not be trying to upgrade beyond them. If you need to upgrade, switch to unpinned version specifiers in your requirements file.
Example:
lxml>=2.2.0
This would upgrade lxml to any version newer than 2.2.0
lxml>=2.2.0,<2.3.0
This would upgrade lxml to the most recent version that is at least 2.2.0 but below 2.3.0.
I suggest freezing all of your dependencies in order to have predictable builds.
When doing that, you can update all dependencies at once like this:
sed -i '' 's/[~=]=/>=/' requirements.txt
pip install -U -r requirements.txt
pip freeze | sed 's/==/~=/' > requirements.txt
Note that -i '' is the BSD/macOS form of sed's in-place flag; on GNU sed, use -i alone. Having done the above, test your project with the new set of packages and finally commit the requirements.txt file to the repository, while still allowing for installing hot-fixes.
Another solution is to use the upgrade-requirements package
pip install upgrade-requirements
And then run :
upgrade-requirements
It will upgrade all the packages that are not at their latest versions, and also create an updated requirements.txt at the end.
Fixing dependencies to a specific version is the recommended practice.
Here's another solution using pur to keep the dependencies fresh!
Give pur your requirements.txt file and it will auto update all your high-level packages to the latest versions, keeping your original formatting and comments in-place.
For example, running pur on the example requirements.txt updates the packages to the currently available latest versions:
$ pur -r requirements.txt
Updated flask: 0.9 -> 0.10.1
Updated sqlalchemy: 0.9.10 -> 1.0.12
Updated alembic: 0.8.4 -> 0.8.6
All requirements up-to-date.
As pur never modifies your environment or installed packages, it's extremely fast and you can safely run it without fear of corrupting your local virtual environment. Pur separates updating your requirements.txt file from installing the updates. So you can use pur, then install the updates in separate steps.
I've just had to do the same... used this small one-liner to do the job:
packages=$(cat requirements.txt | sed 's/==.*//g'); echo $packages | xargs pip3 install -U; freeze=$(pip3 freeze); for p in $packages; do echo "$freeze" | grep -E "^${p}==" >> requirements.new; done
which:
packages=$(cat requirements.txt | sed 's/==.*//g') creates a list of the current package names in requirements.txt (removing the versions).
echo $packages | xargs pip3 install -U then passes all of the packages as arguments to pip3 to upgrade them.
freeze=$(pip3 freeze) gets all of the current package versions in the format required for requirements.txt.
for p in $packages then iterates through the package names.
echo "$freeze" | grep -E "^${p}==" >> requirements.new picks the line matching the package from the pip freeze output (quoting "$freeze" preserves its line breaks, so ^ anchors each package line) and appends it to the new requirements.txt.
This has the added benefit of preserving the ordering of the original requirements.txt. :)
Hope this helps!
The second answer is the most useful, but what I wanted to do is lock some packages while keeping others at the latest version (e.g. youtube-dl).
An example requirements.txt would look like this (~= means "compatible release"):
Pillow==6.2.2
requests~=2.22.0
youtube_dl
Then in the terminal, use the command pip install --upgrade -r requirements.txt
This ensures that Pillow will stay at 6.2.2, requests will be upgraded to the latest 2.22.x (if available), and the latest version of youtube-dl will be installed if not already.
Since I couldn't do that using bash, I wrote a small Python script to create a new requirements file with no versions and use it:
# Strip the "==version" pins; lines without a pin are kept unchanged
with open('requirements-prod.pip') as src, \
        open('requirements-prod-no-version.pip', 'w') as dst:
    for line in src:
        dst.write(line.split('==')[0].rstrip('\n') + '\n')
Then install the libs from the new file: pip install -U -r requirements-prod-no-version.pip
Finally, freeze the versions back to the original file: pip freeze > requirements-prod.pip
A more robust solution, IMO, is to use a dependency manager such as Poetry (https://python-poetry.org), which comes with an exhaustive dependency resolver.
I guess the simplest solution is creating the requirements.txt with:
pip freeze | sed 's/==/>=/' > requirements.txt
You can use the command below on Linux and Mac:
cat requirements.txt | cut -f1 -d= | xargs pip install -U
1) To upgrade pip-installed packages from requirements.txt, replace == with >=.
This tells pip to install a version greater than or equal to the one you request, in effect installing the most up-to-date version of each requested library.
1.a) By adding py -m pip install -r requirements.txt to a daily restart, or something of that nature, you can keep your installed libraries up to date.
Summed up by Andy perfectly.
My reason for entering this thread was to find out how to update the base pip of a virtual environment (usually 10.0.1 for me), in hopes of fixing an issue for which I derived one of two solutions: A. recreation of the venv, or B. installation of the required libs.
Thanks to Andy, I satisfied need B by adding pip>=<requested version> to requirements.txt.
Upon instantiation of a new virtual environment (or re-instantiation of a previous one):
py -m venv devenv
to set up a new dev env
devenv\Scripts\activate.bat
to activate the dev env
python -m pip install -r requirements.txt
to install the base libs
This yields the output:
Collecting pip >= 20.0.2 (from -r requirements.txt (line 1))
Using cached https://files.pythonhosted.org/packages/54/0c/d01aa759fdc501a58f431eb594a17495f15b88da142ce14b5845662c13f3/pip-20.0.2-py2.py3-none-any.whl
Found existing installation: pip 10.0.1
Uninstalling pip-10.0.1:
Successfully uninstalled pip-10.0.1
Successfully installed pip-20.0.2
Sorry for the brain dump; hope this helps someone :)
🤳 Austin 👨🎤🚀🥊
If you install anything in your Django project and, after installation, you want to update your requirements file, this command updates your requirements.txt file:
pip freeze > requirements.txt
If the requirements file does not exist in your project yet, you can use the same command to make a new requirements.txt file:
pip freeze > requirements.txt
With pip-tools you have a basic requirements.in with desired dependencies and a requirements.txt file with pinned versions. pip-tools then generates the pinned versions automatically, which makes handling the whole process including upgrading your dependencies a lot easier.
# requirements.in
django
and the autogenerated requirements.txt (to pin all dependencies)
$ pip-compile requirements.in
#
# This file is autogenerated by pip-compile
# To update, run:
#
# pip-compile requirements.in
#
asgiref==3.2.3
# via django
django==3.0.3
# via -r requirements.in
pytz==2019.3
# via django
sqlparse==0.3.0
# via django
If you use that workflow, which I can highly recommend, it's
pip-compile --upgrade
which generates the requirements.txt with the latest versions.
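As a side note, pip-compile can also bump a single dependency, and pip-sync from the same package applies the pinned file to your environment; a sketch assuming pip-tools is installed:
pip-compile --upgrade-package django requirements.in   # re-pin only django (and what it pulls in)
pip-sync requirements.txt                              # make the current env match the pinned file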
I edit the requirements.txt as below and run sh ./requirements.txt:
pip install -U amqp;
pip install -U appdirs;
pip install -U arrow;
pip install -U Babel;
pip install -U billiard;
pip install -U celery;
pip install -U Django;
pip install -U django-cors-headers;
pip install -U django-crispy-forms;
pip install -U django-filter;
pip install -U django-markdown-deux;
pip install -U django-pagedown;
pip install -U django-timezone-field;
pip install -U djangorestframework;
pip install -U fcm-django;
pip install -U flower;
pip install -U gunicorn;
pip install -U kombu;
pip install -U Markdown;
pip install -U markdown2;
pip install -U packaging;
I have a pip requirements file that changes during development.
Can pip be made to uninstall packages that do not appear in the requirements file as well as installing those that do appear? Is there a standard method?
This would allow the pip requirements file to be the canonical list of packages - an 'if and only if' approach.
Update: I suggested it as a new feature at https://github.com/pypa/pip/issues/716
This should uninstall anything not in requirements.txt:
pip freeze | grep -v -f requirements.txt - | grep -v '^#' | xargs pip uninstall -y
Although this won't work quite right with packages installed with -e, i.e. from a git repository or similar. To skip those, just filter out packages starting with the -e flag:
pip freeze | grep -v -f requirements.txt - | grep -v '^#' | grep -v '^-e ' | xargs pip uninstall -y
Then, obviously:
pip install -r requirements.txt
Update for 2016:
You probably don't really want to actually use the above approach, though. Check out pip-tools and pip-sync which accomplish what you are probably looking to do in a much more robust way.
https://github.com/nvie/pip-tools
Update for May, 2016:
You can now also use pip uninstall -r requirements.txt, however this accomplishes basically the opposite - it uninstalls everything in requirements.txt
Update for May, 2019:
Check out pipenv or Poetry. A lot has happened in the world of package management that makes this sort of question a bit obsolete. I'm actually still quite happily using pip-tools, though.
You can now pass the -r requirements.txt argument to pip uninstall.
pip uninstall -r requirements.txt -y
At least as of pip 8.1.2, pip help uninstall shows:
...
Uninstall Options:
  -r, --requirement <file>   Uninstall all the packages listed in the given
                             requirements file. This option can be used multiple times.
...
It's not a feature of pip, no. If you really want such a thing, you could write a script to compare the output of pip freeze with your requirements.txt, but it would likely be more hassle than it's worth.
Using virtualenv, it is easier and more reliable to just create a clean environment and (re)install from requirements.txt, like:
deactivate
rm -rf venv/
virtualenv venv/
source venv/bin/activate
pip install -r requirements.txt
The short answer is no, you can't do that with pip.
Here's a simple solution that works:
pip uninstall $(pip freeze) -y
after which you can reinstall everything you actually want with pip install -r requirements.txt.
This is an old question (but a good one), and things have changed substantially since it was asked.
There's an offhand reference to pip-sync in another answer, but it deserves its own answer, because it solves precisely the OP's problem.
pip-sync (installed as part of pip-tools) takes a requirements.txt file as input and "trues up" your current Python environment so that it matches exactly what's in that requirements.txt. This includes removing any packages that are present in your env but absent from requirements.txt.
Example: Suppose we want our env to contain (only) 3 libraries: libA, libB, and libC, like so:
> cat requirements.txt
libA==1.0
libB==1.1
libC==1.2
But our env currently contains libC and libD:
> pip freeze
libC==1.2
libD==1.3
Running pip-sync will result in this, which was our desired final state:
> pip-sync requirements.txt
> pip freeze
libA==1.0
libB==1.1
libC==1.2
Stephen's proposal is a nice idea, but unfortunately it doesn't work if you include only direct requirements in your file, which sounds cleaner to me: all dependencies will be uninstalled, including even distribute, breaking pip itself.
Maintaining a clean requirements file while version tracking a virtual environment
Here is how I try to version-track my virtual environment.
I try to maintain a minimal requirements.txt, including only the direct requirements, and not even mentioning version constraints where I'm not sure.
But besides that, I keep, and include in version tracking (say git), the actual status of my virtualenv in a venv.pip file.
Here is a sample workflow:
setup virtualenv workspace, with version tracking:
mkdir /tmp/pip_uninstalling
cd /tmp/pip_uninstalling
virtualenv venv
. venv/bin/activate
initialize version tracking system:
git init
echo venv > .gitignore
pip freeze > venv.pip
git add .gitignore venv.pip
git commit -m "Python project with venv"
install a package with dependencies, include it in requirements file:
echo flask > requirements.txt
pip install -r requirements.txt
pip freeze > venv.pip
Now start building your app, then commit and start a new branch:
vim myapp.py
git commit -am "Simple flask application"
git checkout -b "experiments"
install an extra package:
echo flask-script >> requirements.txt
pip install -r requirements.txt
pip freeze > venv.pip
Play with it, and then come back to the earlier version:
vim manage.py
git commit -am "Playing with flask-script"
git checkout master
Now uninstall extraneous packages:
pip freeze | grep -v -f venv.pip | xargs pip uninstall -y
I suppose the process can be automated with git hooks, but let's not go off topic.
Of course, it makes sense then to use some package caching system or a local repository like pip2pi.
You can create a new file listing all installed packages:
pip freeze > uninstall.txt
then uninstall all of those:
pip uninstall -r uninstall.txt -y
and finally re-install the packages from your original requirements.txt file:
pip install -r requirements.txt
Piggybacking off @stephen-j-fuhry, here is a PowerShell equivalent I use:
pip freeze | ? { $_ -notmatch ((gc req.txt) -join "|") }
While this doesn't directly answer the question, a better alternative to requirements.txt nowadays is a Pipfile. This functions similarly to a Ruby Gemfile. Currently you need the pipenv tool, but hopefully this will eventually be incorporated into pip. It provides the pipenv clean command, which does what you want.
(Note that you can import an existing requirements.txt with pipenv install -r requirements.txt. After this you should have a Pipfile, and the requirements.txt can be removed.)
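A short sketch of that flow, using pipenv's documented commands:
pipenv install -r requirements.txt   # import the existing requirements into a Pipfile
pipenv clean                         # uninstall everything not listed in Pipfile.lock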
It is possible now using:
pip uninstall -r requirements.txt