Docker pip3 not installing packages - python

I have the following Dockerfile and requirements.txt file. The requirements.txt appears to be processed, but I never see any "Installing collected packages" output like I do when I install the packages on my system outside Docker. The docker build ends with an error complaining that a package listed earlier in requirements.txt has not been installed.
Dockerfile
FROM alpine:3.8
ADD . /code
RUN apk add alpine-sdk python3-dev
WORKDIR /code
RUN sudo apk update
RUN pip3 install --trusted-host pypi.python.org -r requirements.txt
CMD ["python3", "linqcmd"]
requirements.txt
boto3
click
python-levenshtein
python-dateutil
cython
# pip3 install git+https://github.com/izderadicka/pdfparser
-e git://github.com/izderadicka/pdfparser.git#egg=pdfparser
docker-compose up --build output
...
Step 6/7 : RUN pip3 install --trusted-host pypi.python.org -r requirements.txt
---> Running in 515dd716aa7c
Collecting boto3 (from -r requirements.txt (line 4))
Downloading https://files.pythonhosted.org/packages/a8/45/810f786ce144bfd19d9f2f700a8cd4358435559a2b88b2c235f7bb3f29df/boto3-1.8.6-py2.py3-none-any.whl (128kB)
Collecting click (from -r requirements.txt (line 5))
Downloading https://files.pythonhosted.org/packages/34/c1/8806f99713ddb993c5366c362b2f908f18269f8d792aff1abfd700775a77/click-6.7-py2.py3-none-any.whl (71kB)
Collecting python-levenshtein (from -r requirements.txt (line 6))
Downloading https://files.pythonhosted.org/packages/42/a9/d1785c85ebf9b7dfacd08938dd028209c34a0ea3b1bcdb895208bd40a67d/python-Levenshtein-0.12.0.tar.gz (48kB)
Collecting python-dateutil (from -r requirements.txt (line 7))
Downloading https://files.pythonhosted.org/packages/cf/f5/af2b09c957ace60dcfac112b669c45c8c97e32f94aa8b56da4c6d1682825/python_dateutil-2.7.3-py2.py3-none-any.whl (211kB)
Collecting cython (from -r requirements.txt (line 8))
Downloading https://files.pythonhosted.org/packages/21/89/ca320e5b45d381ae0df74c4b5694f1471c1b2453c5eb4bac3449f5970481/Cython-0.28.5.tar.gz (1.9MB)
Obtaining pdfparser from git+git://github.com/izderadicka/pdfparser.git#egg=pdfparser (from -r requirements.txt (line 10))
Cloning git://github.com/izderadicka/pdfparser.git to ./src/pdfparser
Complete output from command python setup.py egg_info:
You need to install cython first - sudo pip install cython
----------------------------------------
Command "python setup.py egg_info" failed with error code 1 in /code/src/pdfparser/
You are using pip version 10.0.1, however version 18.0 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
ERROR: Service 'web' failed to build: The command '/bin/sh -c pip3 install --trusted-host pypi.python.org -r requirements.txt' returned a non-zero code: 1

By design, pip doesn't install any packages until it has collected and built wheels for everything it is going to install; this prevents a failure in the middle of installation from leaving only some of the packages installed. In your case, cython is therefore not yet installed when pip tries to build a wheel for pdfparser, which needs cython at build time, so the installation fails. You need to install cython and pdfparser in two separate steps.
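One way to do that (a sketch only, not tested against this project) is to drop the pdfparser line from requirements.txt and split the installation into separate RUN steps, so cython is already installed when pdfparser's setup.py runs. Note that the original RUN sudo apk update is replaced with plain apk update, since sudo is not present in the alpine base image:
FROM alpine:3.8
ADD . /code
RUN apk update && apk add alpine-sdk python3-dev
WORKDIR /code
# build dependency needed by pdfparser's setup.py
RUN pip3 install --trusted-host pypi.python.org cython
# remaining requirements (requirements.txt with the pdfparser line removed)
RUN pip3 install --trusted-host pypi.python.org -r requirements.txt
# finally the package that needs cython at build time
RUN pip3 install git+https://github.com/izderadicka/pdfparser
CMD ["python3", "linqcmd"]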

Related

Docker: Install python packages offline

What am I trying to do?
Install all dependencies listed in requirements.txt from downloaded wheel files, i.e. an offline installation of packages in Docker.
What have I done?
Following this thread, I downloaded all my wheels into a wheelhouse folder using mkdir wheelhouse && pip download -r requirements.txt -d wheelhouse, and created a compressed tarball wheelhouse.tar.gz containing the downloaded .whl files along with a copy of requirements.txt.
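For reference, the download step described above amounts to roughly the following (a sketch of the commands run outside Docker):
mkdir wheelhouse
pip download -r requirements.txt -d wheelhouse
cp requirements.txt wheelhouse/
tar -zcf wheelhouse.tar.gz wheelhouse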
When I try to install the wheels locally (outside Docker) using pip install -r wheelhouse/requirements.txt --no-index --find-links wheelhouse, it works!
But when I run the same commands in Docker, it fails with the following error:
Processing ./wheelhouse/beautifulsoup4-4.8.2-py3-none-any.whl
ERROR: Could not find a version that satisfies the requirement blis==0.4.1 (from -r ./wheelhouse/requirements.txt (line 2)) (from versions: none)
ERROR: No matching distribution found for blis==0.4.1 (from -r ./wheelhouse/requirements.txt (line 2))
Yet the wheel for blis 0.4.1 is in fact present in my wheelhouse directory.
Can anyone help me identify why this works locally but fails in Docker?
Dockerfile
FROM python:3
COPY . /app
WORKDIR /app
RUN tar -zxf ./wheelhouse.tar.gz
RUN pip install -r ./wheelhouse/requirements.txt --no-index --find-links ./wheelhouse
(Screenshot of the wheelhouse directory omitted.)

Error specifying Python pip requirements.txt greater than, but less than

In my requirements.txt, I have the following:
tensorflow>=1.13.1,<2.0.0
When I run pip3 install -r requirements.txt, I am getting the following error:
ERROR: Could not find a version that satisfies the requirement tensorflow<2.0.0,>=1.13.1 (from -r requirements.txt (line 2)) (from versions: none)
ERROR: No matching distribution found for tensorflow<2.0.0,>=1.13.1 (from -r requirements.txt (line 2))
The command '/bin/sh -c pip3 install -r requirements.txt && pip3 install src/' returned a non-zero code: 1
How can I specify that I need tensorflow less than 2.0.0, but greater than 1.13.1?
I'm running this inside a docker build, so I start from a fresh environment each time, and I'm building from the python:latest base image.

How to cache Python dependencies properly

I am trying to build a Python project in a Dockerfile. I want to cache dependencies, and then use that cache later, something like this:
RUN pip3 download -d "/pth/to/downloaded/files" -r /temp/requirements.txt -c /temp/constraints.txt
# much later on in the Dockerfile:
RUN pip3 install --download-cache="/pth/to/downloaded/files" -r requirements.txt -c constraints.txt
Question:
Assuming the pip3 download command is correct: I no longer see a --download-cache option in the pip3 --help output. Is there a newer option I can use with pip3 install to reference the dependency cache generated by pip3 download?
Right now I am getting this error:
Usage: pip install [options]
no such option: --download-cache
The --download-cache option was removed in pip version 8, because pip now uses a cache by default, so you don't need to specify this option at all. I'm not sure what the purpose of the pip download -d <dir> option is, but apparently it does not create a cache in the destination directory, so you can leave the -d <dir> option out as well. The following Dockerfile works:
FROM python:3.7
COPY constraints.txt requirements.txt ./
RUN pip3 download -r requirements.txt -c constraints.txt
COPY test.txt ./
RUN pip3 install -r requirements.txt -c constraints.txt
If you add --cache-dir <dir> to both the download and install commands, it will work as well. So the following Dockerfile also works:
FROM python:3.7
COPY constraints.txt requirements.txt ./
RUN pip3 download --cache-dir ./tmp/pipcache -r requirements.txt -c constraints.txt
COPY test.txt ./
RUN pip3 install --cache-dir ./tmp/pipcache -r requirements.txt -c constraints.txt
Example output (with only pep8 and pylint in the requirements.txt):
First run:
Sending build context to Docker daemon 5.632kB
Step 1/5 : FROM python:3.7
---> a4cc999cf2aa
Step 2/5 : COPY constraints.txt requirements.txt ./
---> 411eaa3d36ff
Step 3/5 : RUN pip3 download -r requirements.txt -c constraints.txt
---> Running in 6b489df74137
Collecting pep8==1.7.1 (from -c constraints.txt (line 17))
Downloading https://files.pythonhosted.org/packages/42/3f/669429ce58de2c22d8d2c542752e137ec4b9885fff398d3eceb1a7f5acb4/pep8-1.7.1-py2.py3-none-any.whl (41kB)
Saved /pep8-1.7.1-py2.py3-none-any.whl
Collecting pylint==2.3.1 (from -c constraints.txt (line 22))
Downloading https://files.pythonhosted.org/packages/60/c2/b3f73f4ac008bef6e75bca4992f3963b3f85942e0277237721ef1c151f0d/pylint-2.3.1-py3-none-any.whl (765kB)
Saved /pylint-2.3.1-py3-none-any.whl
Collecting mccabe==0.6.1 (from -c constraints.txt (line 14))
Downloading https://files.pythonhosted.org/packages/87/89/479dc97e18549e21354893e4ee4ef36db1d237534982482c3681ee6e7b57/mccabe-0.6.1-py2.py3-none-any.whl
Saved /mccabe-0.6.1-py2.py3-none-any.whl
Collecting astroid==2.2.5 (from -c constraints.txt (line 2))
Downloading https://files.pythonhosted.org/packages/d5/ad/7221a62a2dbce5c3b8c57fd18e1052c7331adc19b3f27f1561aa6e620db2/astroid-2.2.5-py3-none-any.whl (193kB)
Saved /astroid-2.2.5-py3-none-any.whl
Collecting isort==4.3.19 (from -c constraints.txt (line 10))
Downloading https://files.pythonhosted.org/packages/ae/ae/5ef4b57e15489754b73dc908b656b02ab0e6d37b190ac78dd498be8b577d/isort-4.3.19-py2.py3-none-any.whl (42kB)
Saved /isort-4.3.19-py2.py3-none-any.whl
Collecting lazy-object-proxy==1.4.1 (from -c constraints.txt (line 12))
Downloading https://files.pythonhosted.org/packages/43/a5/1b19b094ad19bce55b5b6d434020f5537b424fd2b3cff0fbef23d7bb5a95/lazy_object_proxy-1.4.1-cp37-cp37m-manylinux1_x86_64.whl (49kB)
Saved /lazy_object_proxy-1.4.1-cp37-cp37m-manylinux1_x86_64.whl
Collecting wrapt==1.11.1 (from -c constraints.txt (line 39))
Downloading https://files.pythonhosted.org/packages/67/b2/0f71ca90b0ade7fad27e3d20327c996c6252a2ffe88f50a95bba7434eda9/wrapt-1.11.1.tar.gz
Saved /wrapt-1.11.1.tar.gz
Collecting six==1.12.0 (from -c constraints.txt (line 28))
Downloading https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Saved /six-1.12.0-py2.py3-none-any.whl
Collecting typed-ast==1.3.5 (from -c constraints.txt (line 37))
Downloading https://files.pythonhosted.org/packages/17/9e/00918af7bdd616decb5b7ad06a9cd0a4a247d2fccaa630ab448a57e68b98/typed_ast-1.3.5-cp37-cp37m-manylinux1_x86_64.whl (736kB)
Saved /typed_ast-1.3.5-cp37-cp37m-manylinux1_x86_64.whl
Successfully downloaded pep8 pylint mccabe astroid isort lazy-object-proxy wrapt six typed-ast
Removing intermediate container 6b489df74137
---> 8ac3be432c58
Step 4/5 : COPY test.txt ./
---> 5cac20851967
Step 5/5 : RUN pip3 install -r requirements.txt -c constraints.txt
---> Running in 394847f09e9b
Collecting pep8==1.7.1 (from -c constraints.txt (line 17))
Using cached https://files.pythonhosted.org/packages/42/3f/669429ce58de2c22d8d2c542752e137ec4b9885fff398d3eceb1a7f5acb4/pep8-1.7.1-py2.py3-none-any.whl
Collecting pylint==2.3.1 (from -c constraints.txt (line 22))
Using cached https://files.pythonhosted.org/packages/60/c2/b3f73f4ac008bef6e75bca4992f3963b3f85942e0277237721ef1c151f0d/pylint-2.3.1-py3-none-any.whl
Collecting astroid==2.2.5 (from -c constraints.txt (line 2))
Using cached https://files.pythonhosted.org/packages/d5/ad/7221a62a2dbce5c3b8c57fd18e1052c7331adc19b3f27f1561aa6e620db2/astroid-2.2.5-py3-none-any.whl
Collecting mccabe==0.6.1 (from -c constraints.txt (line 14))
Using cached https://files.pythonhosted.org/packages/87/89/479dc97e18549e21354893e4ee4ef36db1d237534982482c3681ee6e7b57/mccabe-0.6.1-py2.py3-none-any.whl
Collecting isort==4.3.19 (from -c constraints.txt (line 10))
Using cached https://files.pythonhosted.org/packages/ae/ae/5ef4b57e15489754b73dc908b656b02ab0e6d37b190ac78dd498be8b577d/isort-4.3.19-py2.py3-none-any.whl
Collecting lazy-object-proxy==1.4.1 (from -c constraints.txt (line 12))
Using cached https://files.pythonhosted.org/packages/43/a5/1b19b094ad19bce55b5b6d434020f5537b424fd2b3cff0fbef23d7bb5a95/lazy_object_proxy-1.4.1-cp37-cp37m-manylinux1_x86_64.whl
Collecting six==1.12.0 (from -c constraints.txt (line 28))
Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting wrapt==1.11.1 (from -c constraints.txt (line 39))
Using cached https://files.pythonhosted.org/packages/67/b2/0f71ca90b0ade7fad27e3d20327c996c6252a2ffe88f50a95bba7434eda9/wrapt-1.11.1.tar.gz
Collecting typed-ast==1.3.5 (from -c constraints.txt (line 37))
Using cached https://files.pythonhosted.org/packages/17/9e/00918af7bdd616decb5b7ad06a9cd0a4a247d2fccaa630ab448a57e68b98/typed_ast-1.3.5-cp37-cp37m-manylinux1_x86_64.whl
Building wheels for collected packages: wrapt
Building wheel for wrapt (setup.py): started
Building wheel for wrapt (setup.py): finished with status 'done'
Stored in directory: /root/.cache/pip/wheels/89/67/41/63cbf0f6ac0a6156588b9587be4db5565f8c6d8ccef98202fc
Successfully built wrapt
Installing collected packages: lazy-object-proxy, six, wrapt, typed-ast, astroid, isort, mccabe, pep8, pylint
Successfully installed astroid-2.2.5 isort-4.3.19 lazy-object-proxy-1.4.1 mccabe-0.6.1 pep8-1.7.1 pylint-2.3.1 six-1.12.0 typed-ast-1.3.5 wrapt-1.11.1
Removing intermediate container 394847f09e9b
---> 68e65a214a32
Successfully built 68e65a214a32
Successfully tagged test:latest
Second run (after changing test.txt to trigger a rebuild of Docker layers 4 and 5):
Sending build context to Docker daemon 5.632kB
Step 1/5 : FROM python:3.7
---> a4cc999cf2aa
Step 2/5 : COPY constraints.txt requirements.txt ./
---> Using cache
---> 411eaa3d36ff
Step 3/5 : RUN pip3 download -r requirements.txt -c constraints.txt
---> Using cache
---> 8ac3be432c58
Step 4/5 : COPY test.txt ./
---> 7ab5814153b7
Step 5/5 : RUN pip3 install -r requirements.txt -c constraints.txt
---> Running in 501da787ab07
Collecting pep8==1.7.1 (from -c constraints.txt (line 17))
Using cached https://files.pythonhosted.org/packages/42/3f/669429ce58de2c22d8d2c542752e137ec4b9885fff398d3eceb1a7f5acb4/pep8-1.7.1-py2.py3-none-any.whl
Collecting pylint==2.3.1 (from -c constraints.txt (line 22))
Using cached https://files.pythonhosted.org/packages/60/c2/b3f73f4ac008bef6e75bca4992f3963b3f85942e0277237721ef1c151f0d/pylint-2.3.1-py3-none-any.whl
Collecting astroid==2.2.5 (from -c constraints.txt (line 2))
Using cached https://files.pythonhosted.org/packages/d5/ad/7221a62a2dbce5c3b8c57fd18e1052c7331adc19b3f27f1561aa6e620db2/astroid-2.2.5-py3-none-any.whl
Collecting mccabe==0.6.1 (from -c constraints.txt (line 14))
Using cached https://files.pythonhosted.org/packages/87/89/479dc97e18549e21354893e4ee4ef36db1d237534982482c3681ee6e7b57/mccabe-0.6.1-py2.py3-none-any.whl
Collecting isort==4.3.19 (from -c constraints.txt (line 10))
Using cached https://files.pythonhosted.org/packages/ae/ae/5ef4b57e15489754b73dc908b656b02ab0e6d37b190ac78dd498be8b577d/isort-4.3.19-py2.py3-none-any.whl
Collecting typed-ast==1.3.5 (from -c constraints.txt (line 37))
Using cached https://files.pythonhosted.org/packages/17/9e/00918af7bdd616decb5b7ad06a9cd0a4a247d2fccaa630ab448a57e68b98/typed_ast-1.3.5-cp37-cp37m-manylinux1_x86_64.whl
Collecting six==1.12.0 (from -c constraints.txt (line 28))
Using cached https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting wrapt==1.11.1 (from -c constraints.txt (line 39))
Using cached https://files.pythonhosted.org/packages/67/b2/0f71ca90b0ade7fad27e3d20327c996c6252a2ffe88f50a95bba7434eda9/wrapt-1.11.1.tar.gz
Collecting lazy-object-proxy==1.4.1 (from -c constraints.txt (line 12))
Using cached https://files.pythonhosted.org/packages/43/a5/1b19b094ad19bce55b5b6d434020f5537b424fd2b3cff0fbef23d7bb5a95/lazy_object_proxy-1.4.1-cp37-cp37m-manylinux1_x86_64.whl
Building wheels for collected packages: wrapt
Building wheel for wrapt (setup.py): started
Building wheel for wrapt (setup.py): finished with status 'done'
Stored in directory: /root/.cache/pip/wheels/89/67/41/63cbf0f6ac0a6156588b9587be4db5565f8c6d8ccef98202fc
Successfully built wrapt
Installing collected packages: typed-ast, six, wrapt, lazy-object-proxy, astroid, isort, mccabe, pep8, pylint
Successfully installed astroid-2.2.5 isort-4.3.19 lazy-object-proxy-1.4.1 mccabe-0.6.1 pep8-1.7.1 pylint-2.3.1 six-1.12.0 typed-ast-1.3.5 wrapt-1.11.1
Removing intermediate container 501da787ab07
---> b377fe561e97
Successfully built b377fe561e97
Successfully tagged test:latest
NB: The official documentation was quite helpful.
As wovano already mentioned, as of pip 8.0.0 the --download-cache option was removed; pip uses a cache by default.
To reuse the downloaded/cached packages, you can use the following strategy:
Download the packages to a custom destination dir, using --dest as described here.
$ pip3 download --dest "$DEST_DIR" ...
Install the previously downloaded packages from local directory $DEST_DIR, using --find-links as described here.
$ pip3 install --find-links "file://${DEST_DIR}" ...
Wrapping up:
FROM python:3
ENV PIP_DOWNLOAD_CACHE "/var/custom-pip-download-dir"
COPY requirements1.txt ./requirements1.txt
RUN pip3 download -r requirements1.txt --dest "$PIP_DOWNLOAD_CACHE"
COPY requirements2.txt ./requirements2.txt
RUN pip3 install -r requirements2.txt --find-links "file://${PIP_DOWNLOAD_CACHE}"
(pip maintainer here)
With modern pip versions (> 6.0), pip automatically caches the packages that it is installing, provided the default cache directory is accessible (see documentation).
As long as you can somehow re-populate this cache, pip will use it when it can (though it does not fall back to it when network availability is poor).
The other option (which I would personally prefer) is to download the files using pip download in one step, so that Docker can cache things on its own. The next stage would be an "offline" install using pip install with --no-index and --find-links.
$ pip download --dest=downloaded-packages project
$ pip install --no-index --find-links=downloaded-packages project
edit:
The Python Packaging User Guide contains a guide on this too. It makes a similar suggestion (though it recommends pip wheel instead of pip download, which is a good suggestion): https://packaging.python.org/guides/index-mirrors-and-caches/#caching-with-pip
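For completeness, the pip wheel variant of that two-step approach looks roughly like this (a sketch; the wheelhouse directory name is arbitrary):
$ pip wheel --wheel-dir=wheelhouse -r requirements.txt
$ pip install --no-index --find-links=wheelhouse -r requirements.txt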
Caching Python dependencies properly means that rebuilding the Dockerfile re-uses the already-installed Python dependencies. I tried building the image using the Dockerfile below.
requirements.txt
Django==2.1.7
Dockerfile
FROM python:3
ENV PYTHONUNBUFFERED 1
ADD requirements.txt ./requirements.txt
RUN pip3 install -r requirements.txt
Build docker image
docker build .
Building the image the first time downloads and installs the dependencies.
Building it a second time uses the cached layer by default, so the dependencies are not installed again.
If you make any change to requirements.txt (for example, a library version change), the cache is not used and the fresh library versions are installed.
Thanks.
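A common refinement of this pattern (a sketch, assuming the application source sits next to the Dockerfile) is to copy only requirements.txt before installing and copy the rest of the source afterwards, so that code changes don't invalidate the cached dependency layer:
FROM python:3
ENV PYTHONUNBUFFERED 1
WORKDIR /app
# copy only the requirements first so this layer stays cached across code changes
COPY requirements.txt ./
RUN pip3 install -r requirements.txt
# copy the application code last; changes here don't re-run pip
COPY . ./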

How do I specify to pip that it should only build wheels if they don't exist in the wheelhouse?

So currently we have a build process that works like so:
# copy wheelhouse if it exists
if [[ -d $WHEELHOUSE ]]; then
    sudo cp -aTr $WHEELHOUSE $WORKSPACE/wheelhouse
fi
cd $WORKSPACE
pip install --find-links=$WORKSPACE/wheelhouse --use-wheel -r requirements-meta.txt
pip install --find-links=$WORKSPACE/wheelhouse --use-wheel -r requirements.txt
pip install --find-links=$WORKSPACE/wheelhouse --use-wheel -r requirements-dev.txt
echo "Now building wheels"
pip wheel --wheel-dir=$WORKSPACE/wheelhouse --find-links $WORKSPACE/wheelhouse -r requirements-meta.txt
pip wheel --wheel-dir=$WORKSPACE/wheelhouse --find-links $WORKSPACE/wheelhouse -r requirements-dev.txt
pip wheel --wheel-dir=$WORKSPACE/wheelhouse --find-links $WORKSPACE/wheelhouse -r requirements.txt
This allows us to then copy over the wheelhouse with the rest of the workspace onto our production servers, and use those wheels as the install base for pip on those servers.
For some reason, this always builds wheels for numpy and scipy (as well as pycrypto and a few others, but those are less time-consuming), even if they already exist in $WORKSPACE/wheelhouse.
What we'd like to have happen is for pip wheel to skip building these wheels if they already exist in $WORKSPACE/wheelhouse. Is there something I'm missing on how to do this?

Python equivalent of node.js's npm link to use local development versions of requirements?

In Node.js, I'm used to using npm link to get a project to use a custom version of a dependency. From the Node documentation:
First, npm link in a package folder will create a globally-installed symbolic link from prefix/package-name to the current folder.
Next, in some other location, npm link package-name will create a symlink from the local node_modules folder to the global symlink.
Is it kosher to do something similar by symlinking into site-packages?
The exact analogue is pip install -e . or python setup.py develop.
https://pip.pypa.io/en/latest/reference/pip_install.html#editable-installs
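As a rough illustration of the parallel with npm link (a sketch; the paths and names here are made-up examples), you install your working checkout of the dependency in editable mode from the consuming project's environment:
$ source myenv/bin/activate                  # the consuming project's environment
(myenv)$ pip install -e ~/src/somepackage    # like `npm link somepackage`: use the local checkout
pip links to the checkout instead of copying files, so changes in ~/src/somepackage are picked up without reinstalling.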
Perhaps, but what you probably want to do is use virtualenv. Virtualenv allows you to create a python environment isolated from any others:
$ virtualenv myenv
New python executable in myenv/bin/python
Installing setuptools............done.
Installing pip...............done.
$ source myenv/bin/activate
You can then install specific versions of Python packages as you please, say version 0.1.0 of a random toolz package I just found, when the latest version is 0.2.1:
(myenv)$ pip install toolz==0.1.0
Downloading/unpacking toolz==0.1.0
Downloading toolz-0.1.tar.gz
Running setup.py egg_info for package toolz
Downloading/unpacking itertoolz>=0.5 (from toolz==0.1.0)
Downloading itertoolz-0.5.tar.gz
Running setup.py egg_info for package itertoolz
Downloading/unpacking functoolz>=0.4 (from toolz==0.1.0)
Downloading functoolz-0.4.tar.gz
Running setup.py egg_info for package functoolz
Installing collected packages: toolz, itertoolz, functoolz
Running setup.py install for toolz
Running setup.py install for itertoolz
Running setup.py install for functoolz
Successfully installed toolz itertoolz functoolz
Cleaning up...
As you can see it also installs dependencies. You can also generate a requirements file:
(myenv)$ pip freeze
functoolz==0.4
itertoolz==0.5
toolz==0.1
wsgiref==0.1.2
You can then use this file to duplicate the same dependencies in another virtualenv:
(myenv)$ pip freeze > reqs.txt
(myenv)$ deactivate
$ source env2/bin/activate
(env2)$ pip freeze
wsgiref==0.1.2
(env2)$ pip install -r reqs.txt
Downloading/unpacking functoolz==0.4 (from -r reqs.txt (line 1))
Downloading functoolz-0.4.tar.gz
Running setup.py egg_info for package functoolz
Downloading/unpacking itertoolz==0.5 (from -r reqs.txt (line 2))
Downloading itertoolz-0.5.tar.gz
Running setup.py egg_info for package itertoolz
Downloading/unpacking toolz==0.1 (from -r reqs.txt (line 3))
Downloading toolz-0.1.tar.gz
Running setup.py egg_info for package toolz
Requirement already satisfied (use --upgrade to upgrade): wsgiref==0.1.2 in /Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6 (from -r reqs.txt (line 4))
Installing collected packages: functoolz, itertoolz, toolz
Running setup.py install for functoolz
Running setup.py install for itertoolz
Running setup.py install for toolz
Successfully installed functoolz itertoolz toolz
Cleaning up...
