How do I prevent pip automatically installing supporting packages? - python

I am using Python 3.6.0 within a venv. I would like to "pip install" matplotlib==2.0.0; however, when I do this, pip automatically grabs the newest versions of all the other supporting packages matplotlib requires, e.g. cycler==0.11.0, pyparsing==3.0.7, etc. These latest supporting package versions do not seem to work with the older version of matplotlib, and it throws errors when I attempt to import matplotlib.
How do I install matplotlib without pip attempting to install all its supporting packages automatically?
My current temporary solution is to go back and manually install each package before installing matplotlib but I'm sure I will run into this issue again so would like to find a better solution.

Pip has a built-in flag for this:
pip install matplotlib --no-deps
To skip dependencies only for specific packages, you can list them in a requirements file and pass it:
pip install --no-deps -r requirements.txt
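A constraints file is another built-in option that keeps dependency resolution on but pins which versions pip is allowed to pick. This is only a sketch: constraints.txt is just a name used here, and the pins below are placeholders, so check matplotlib 2.0.0's actual requirements for the versions that really match. A constraints.txt might contain:
cycler==0.10.0
pyparsing==2.2.0
and then you install against it with:
pip install -c constraints.txt matplotlib==2.0.0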

Related

How to install opencv-python with no import cv2 error?

I am trying to install opencv-python on my Mac and I have used the following:
$pip install opencv-python
which gave me the following error:
$pip install opencv-python
Collecting opencv-python
Using cached opencv_python-3.4.0.12-cp27-cp27m-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl
Collecting numpy>=1.11.1 (from opencv-python)
Using cached numpy-1.14.2-cp27-cp27m-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl
matplotlib 1.3.1 requires nose, which is not installed.
matplotlib 1.3.1 requires tornado, which is not installed.
Installing collected packages: numpy, opencv-python
Found existing installation: numpy 1.8.0rc1
Cannot uninstall 'numpy'. It is a distutils installed project and thus we cannot accurately determine which files belong to it which would lead to only a partial uninstall.
Then I tried pip install --upgrade matplotlib, which did not change anything. It just showed me:
matplotlib 2.2.2 requires backports.functools-lru-cache, which is not installed.
matplotlib 2.2.2 has requirement numpy>=1.7.1, but you'll have numpy 1.8.0rc1 which is incompatible.
I found many ways to install opencv-python on the internet, like:
https://www.pyimagesearch.com/2015/06/15/install-opencv-3-0-and-python-2-7-on-osx/
I used that guide on my other Mac, but I ran into import cv2 problems a lot in my code.
I would be more than happy if anyone has a good solution or recommendation for installing opencv-python.
Thanks
In summary, macOS comes with Python preinstalled, and you should not mess with the packages installed alongside it, as some system utilities depend on them.
https://docs.python.org/3.7/using/mac.html
The Apple-provided build of Python is installed in /System/Library/Frameworks/Python.framework and /usr/bin/python, respectively. You should never modify or delete these, as they are Apple-controlled and are used by Apple- or third-party software. Remember that if you choose to install a newer Python version from python.org, you will have two different but functional Python installations on your computer, so it will be important that your paths and usages are consistent with what you want to do.
You should take a look at either venv or virtualenv.
You can read this answer: https://stackoverflow.com/a/41972262/4796844, which will walk you through the basics.
In a nutshell, to solve your problem:
$ python3 -m venv ./project-name
$ . ./project-name/bin/activate
$ pip install opencv-python
And to leave the virtual environment, simply:
$ deactivate
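A quick sanity check inside the activated environment (not part of the original answer, just a one-liner to confirm the installed wheel imports cleanly):
$ python -c "import cv2; print(cv2.__version__)"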

How to prevent pip from replacing numpy+mkl with just numpy when installing packages that require numpy?

When I pip install (or pip install --upgrade) packages that require numpy, they have a tendency to uninstall my existing numpy+mkl (which has a high enough version to satisfy the numpy version requirement). Afterwards, they install numpy without +mkl, which causes problems for other packages that do require MKL. An example for which this happens is gym (which has 'numpy>=1.10.4' in its install_requires in setup.py).
I understand that this is related to the +mkl suffix that probably somehow messes with the versions, and understand I can fix it afterwards by downloading and installing numpy+mkl from https://www.lfd.uci.edu/~gohlke/pythonlibs/, but it gets annoying to manually do this every time over again when upgrading a package like gym to a new version. Is there any way to prevent numpy+mkl from getting uninstalled during the pip install --upgrade?
For me, this is happening on Windows 10 with Python 3.6. I have not yet checked whether the same happens on Linux, but I would be interested in an answer for that too if it's different there.
My currently installed version of numpy+mkl (which often gets automatically uninstalled) is 1.13.3+mkl.
Using --upgrade-strategy, as suggested by cgohlke in a comment, addresses this problem. So, taking the example where we want to install gym from scratch without it replacing our existing numpy+mkl installation with regular numpy, the full command to run is:
pip install --upgrade-strategy only-if-needed gym
Or, if we just want to upgrade an existing installation, we also add --upgrade:
pip install --upgrade --upgrade-strategy only-if-needed gym
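As an optional follow-up check (my suggestion, not part of the answer above), you can confirm that the MKL build survived the install: pip should still report the +mkl suffix in the version, and numpy's build configuration should list MKL rather than OpenBLAS:
pip show numpy
python -c "import numpy; numpy.show_config()"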
I have found that some packages force pip to reinstall numpy. The safest (and, for me, only reliable) way to ensure that numpy is installed with MKL (from conda) is to uninstall it using both conda and pip and then reinstall it using conda:
conda uninstall numpy
pip uninstall numpy
conda install numpy

Different ways to install numpy, scipy, and matplotlib on macOS via homebrew

Today I decided to install Python and the SciPy stack manually, instead of using Anaconda (or Canopy) as I had previously done. I use Homebrew on my Mac and have python2 and python3 (2.7 and 3.6) installed via Homebrew. But reading through the documentation, there are multiple ways to install the SciPy stack, and I want to know what the differences are. I have tested them independently and they all work.
From the Homebrew documentation:
python2 -m pip install numpy scipy matplotlib
python3 -m pip install numpy scipy matplotlib
These are the same two commands that the Matplotlib installation documentation lists for installing matplotlib through Homebrew. Why does this use pip (the system Python 2.7.x's pip) instead of pip2 and pip3, respectively? Is it because you call python2/python3 first?
However, the SciPy documentation for installing these modules when using homebrew is different:
brew tap homebrew/science && brew install numpy scipy matplotlib
(NOTE: the matplotlib formula is located in the homebrew/science repository, which is why you need to use brew tap.)
Finally, from the command line readout when installing python2 and python3 via homebrew:
pip2 install numpy scipy matplotlib
pip3 install numpy scipy matplotlib
which are based on the following readouts:
Pip and setuptools have been installed. To update them
pip2 install --upgrade pip setuptools
You can install Python packages with
pip2 install <package>
They will install into the site-package directory
/usr/local/lib/python2.7/site-packages
See: https://docs.brew.sh/Homebrew-and-Python.html
...
Pip, setuptools, and wheel have been installed. To update them
pip3 install --upgrade pip setuptools wheel
You can install Python packages with
pip3 install <package>
They will install into the site-package directory
/usr/local/lib/python3.6/site-packages
See: https://docs.brew.sh/Homebrew-and-Python.html
So, between four sources of documentation, there are three different ways to install the SciPy stack when using Homebrew, and they all work; but how is each different, and should one be preferred?
From what I can tell, the first and third methods, which both invoke pip (pip2/pip3), are functionally equivalent: both invoke Homebrew's Python X.X.X's pip, one implicitly and the other explicitly. I assume this means both methods install the pre-built binary packages from PyPI in the form of wheels. For the second method, I think it installs Homebrew's own formulae for these packages (i.e. packages maintained separately by Homebrew in its repository).
If this is true, then I assume one should use the second method if you are using a version of Python that is maintained by Homebrew (i.e. installed via brew install python or python3). My reasoning is that if you later decide to install another Homebrew formula that has any of the SciPy stack as a dependency, Homebrew will install those modules again from its own repository if you previously installed them using pip.
As mentioned, I am not sure if my understanding is correct and I have not been able to find any answers, so any insights or confirmations would be appreciated.
Your analysis seems correct: variants 1 and 3 will install numpy/scipy from the Python Package Index (PyPI) and will use pre-built wheels (if available for your platform, which they most likely are).
Variant 2 installs the brew formula.
As mentioned by @Evhz, the conda packages for numpy and scipy use the Intel Math Kernel Library, which can provide significant speedups (not just on Intel processors) versus the packages installed from PyPI or brew, both of which are linked against OpenBLAS.
Concerning which method to prefer: it's not entirely straightforward.
Yes, on the surface, using brew to manage both the python interpreter and the python packages would seem consistent.
However, homebrew only provides formulae for a handful of python packages, so you'll end up needing to mix with pip in any case.
If you want performance, go with conda, which will manage both the interpreter and the Python packages.
However, Anaconda / conda-forge still have some catching up to do relative to PyPI, so you'll likely need to mix in pip again.
In the end, there is no perfect solution, but as long as you knowingly decide on one, you're unlikely to run into issues.
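If you want to confirm which interpreter a given pip invocation is actually bound to (a quick check, not specific to any of the variants above), each of these prints the pip version together with the location it is installed in and the Python version it belongs to:
pip --version
pip2 --version
pip3 --version
python3 -m pip --version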

Downgrading python package installed locally

On the server that I work on (as do many other people), the "global" Python has a certain version of a package, say 1.0.0.
I recently used pip to upgrade it to 1.0.2 locally for my user with pip install --user package==1.0.2, which worked. However, now I want to uninstall my locally installed version and keep only the global one.
I've tried pip uninstall --user package==1.0.2, pip uninstall --user package, and a few other options, but nothing seems to work. I always get this error:
Usage:
pip <command> [options]
no such option: --user
I also tried pip install --user package==1.0.0, but now I have both versions installed locally and Python uses the most recent one.
How can I do what I want?
Apparently this cannot be done with pip directly. I ended up solving it just by removing the package from ~/.local/lib/python3.5/site-packages/. A bit more manual than I was hoping for.
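If you want to double-check what you are about to remove before deleting it by hand (a sketch, with package standing in for the real distribution name), you can ask where the per-user site-packages directory is and which files belong to the package:
python -m site --user-site
pip show -f package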
pip uninstall does not accept a --user option (pip install does), but --user is still available when installing with setuptools.
So if you want the --user behaviour, what you can do is use pip download, which downloads the .whl file. You then need to extract it using wheel unpack. I then ran python setup.py install --user (this worked for numpy), and it installed the package into my home directory under .local.
I followed the documentation here.

PIP uninstall not looking into /usr/local

I installed several packages (among them patsy and statsmodels) with pip 1.3.1 on Kubuntu 13.04. They were put into /usr/local/lib instead of /usr/lib. When using pip freeze or pip list, these packages appear fine and are usable in Python. However, when I use pip uninstall I get "Can't uninstall 'statsmodels'. No files were found to uninstall." The structure of the installed packages in /usr/local/lib/python2.7/dist-packages seems correct, and installed-files.txt has everything listed. How do I make pip see these files and uninstall them?
I do not really have a solution for the pip path lookup, but deleting /usr/local/lib/python2.7/dist-packages/_PACKAGE_NAME did the trick for me. At the very least it allowed me to install anew.
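Before deleting by hand, it may also help to confirm which copy Python actually imports and where pip believes the package lives (statsmodels is used here as the example from the question):
python -c "import statsmodels; print(statsmodels.__file__)"
pip show statsmodels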
