Pip (python) differences between `--install-option='--prefix'` and `--root` and `--target` - python

The pip documentation is, to my eyes, far too terse about the parameters that deal with sources and destinations.
I've experienced strange things installing Sphinx with pip3 and playing with the options that seemingly allow me to install it precisely where I want (for various reasons, I want each thing in its own directory). I say “playing” not because I didn't read the docs or try --help, but because pip3 help install did not help, and the official pip install documentation page is too short on this topic and actually says no more than pip3 help install does.
Here are the experiments done and the observations.
First case with --root
I downloaded the current Sphinx repository tarball, unpacked it, changed into the newly created directory and ran:
pip3 install --root /home/<user-name>/apps/sphinx -e .
I thought this would be the same as --prefix, as no --prefix option was visibly available. To my surprise, it installed the commands in the bin directory of Python 3 (which is also installed locally in its own directory) along with some things in its library directory, and, strangely, instead of a /home/<user-name>/apps/sphinx directory, I got /home/<user-name>/apps/sphinx/home/<user-name>/apps/sphinx/…: it appended the specified path to itself.
How does the last point in particular make sense? What's the purpose of --root?
Second case with --target
Then I thought, if it's not --root, it may be --target, so I did (after a clean-up):
pip3 install --target /home/<user-name>/apps/sphinx -e .
It did not work, complaining about an unrecognized --home option.
What is this --home (which I did not specify) it complains about, and what exactly is --target?
Third case with --install-option='--prefix=…'
After some web‑searching and a thread on StackOverflow, I tried this:
pip3 install --install-option='--prefix=/home/<user-name>/apps/sphinx' -e .
It just complained that it could not install a .pth file and that something was wrong with my PYTHONPATH, which I could address by rerunning the same command after defining a variable:
export PYTHONPATH=/home/<user-name>/apps/sphinx/lib/python3.4/site-packages
pip3 install --install-option='--prefix=/home/<user-name>/apps/sphinx' -e .
I just had to set PYTHONPATH even before the directory existed and before anything was installed in it, but this one was OK (whether pip should update PYTHONPATH itself during the process and remind you to set it permanently is a debatable question).
This option, which turned out to be the right one, was also the least visible one.
Another last related one:
What's the difference between --editable and --src?
Update #1
I can't tell if it's Sphinx related, but I noticed two additional things.
Doing
pip3 install --install-option='--prefix=<install-dir>' -e <repository-dir>
where repository-dir is a local checkout of Sphinx, Sphinx gets installed in install-dir, is listed by pip3 list, but can't be uninstalled.
Conversely, doing
pip3 install --install-option='--prefix=<install-dir>' Sphinx
that is, letting pip3 retrieve an archive, Sphinx is not installed in install-dir but in the Python directory instead, is listed by pip3 list, and can be uninstalled.
So depending on whether the source is a local repository or a remote archive, the package is installed in a different location and is or is not uninstallable.
Dependencies were not affected and were handled the same way in both cases (installed where expected, listed, and uninstallable).
Update #2
The behaviour with --root makes me think of a kind of fake root (like the one you get when building a Debian package or when cross-compiling). If it's intended to work that way, then the path that surprised me is, on the contrary, expected.

First and obvious question: why don't you just install the package from PyPI?
sudo pip install sphinx
If you want to install anything that has a setup.py file with pip you can use the --editable flag:
-e, --editable <path/url>
Install a project in editable mode (i.e. setuptools “develop mode”) from a local project path or a VCS url.
So you can just issue the command (prefix with sudo if necessary):
pip3 install -e /path/to/pkg
where /path/to/pkg is the directory where setup.py can be found (where you extracted the files).
To answer the other questions:
--root <dir> is used to change the root directory of the file system where pip should install package resources, not to change where to find the package.
--target is used to tell pip in which folder to install the package.
--install-option is used to set some variables that will be used by setup.py, not to change where pip should look for the file.
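As a rough sketch of how those destination options differ (paths and package name are illustrative, --prefix is only available directly on newer pips, and exact layouts can vary between pip versions):
pip install --root=/tmp/stage somepkg      # files land under /tmp/stage/<the usual absolute install paths>
pip install --prefix=/opt/mytools somepkg  # files land under /opt/mytools/lib/pythonX.Y/site-packages
pip install --target=/opt/flat somepkg     # the package is dropped directly into /opt/flat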

Related

Can't uninstall project with no packages

While trying to build an MCVE for another question, I created an example directory with one file in it, a setup.py with the following contents:
from setuptools import setup

setup(
    name='example',
)
and installed it with
python3.6 setup.py sdist
python3.6 -m pip install --user dist/example-0.0.0.tar.gz
No actual packages or modules, but something got installed:
redacted:~/example> python3.6 -m pip list | grep example
DEPRECATION: The default format will switch to columns in the future. You can use --format=(legacy|columns) (or define a format=(legacy|columns) in your pip.conf under the [list] section) to disable this warning.
example (0.0.0)
Now I can't uninstall it:
redacted:~/example> python3.6 -m pip uninstall example
Can't uninstall 'example'. No files were found to uninstall.
Other posts suggest there might be a .pth file I have to remove from my site-packages directory, but I don't see any:
redacted:~/example> find ~/.local/lib/python3.6/site-packages/ -name '*.pth'
redacted:~/example>
What did I just do to my system, and how can I undo it?
The steps shown in the question will actually create and install a real package. It won't create any importable files, but it will create metadata in a site-packages directory. Exactly where it has been installed depends on your USER_SITE configuration, which you can check with python3.6 -m site, but it's probably going to be at ~/.local/lib/python3.6/site-packages/example-0.0.0-py3.6.egg-info.
Path files (.pth) are unrelated.
The reason it can't uninstall, saying:
Can't uninstall 'example'. No files were found to uninstall.
is because the build command executed earlier will have created an example.egg-info in the current directory, and using python3.6 -m pip means the empty string is in sys.path, so the current directory is also considered a package location. Since the current working directory, at sys.path[0], comes before the user site, the example.egg-info metadata is found there instead of in site-packages.
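A quick way to see this (a minimal check, not part of the original steps): run Python from the same directory and look at the front of sys.path; the first entry (an empty string or the current directory) is searched before the user site:
python3.6 -c "import sys; print(sys.path[:2])"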
The command python3.6 -m pip uninstall also finds this build artifact first, for the same reason, and does not find the metadata in site-packages, which holds the record of the files that should be removed during an uninstall. To correctly uninstall this package you could:
rm -rf example.egg-info # first prevent pip from getting confused by the temporary build artifact in cwd
python3.6 -m pip uninstall example # uninstall it from the user site
Or, you could change directory before uninstalling, so that pip finds the package metadata for example in the user site instead of in the working directory.
Note 1: These workarounds are not required for pip >= 20.1. Since April 2020, using python -m pip now ejects the cwd from sys.path and it will uninstall successfully from the user site in the first place without getting confused (#7731)
Note 2: some details are slightly different if this python3.6 environment has a wheel installation in it. In that case the install command will first create a wheel file from the sdist and then install the wheel, which results in an example-0.0.0.dist-info subdirectory for the metadata instead of an egg-info subdirectory. The important details are the same whether you have an .egg-info or .dist-info style install in the user site. It is not possible to determine from the details in the question whether the python3.6 environment had a wheel installation available.
Since you didn't specify any files, there was nothing to be installed. So you can't uninstall anything either.

"yum install package" or "python setup.py install" in CentOS?

I was wondering how the above "yum install package" and "python setup.py install" are used differently in CentOS. I use yum install ... all the time. However, when I try python setup.py install, I always get an error that the setup.py file couldn't be found, even though its directory shows up under echo $PATH, unless I run it from its own directory or use the absolute path.
When you type python setup.py install, your shell checks your $PATH for the python command and runs that. Then python examines its arguments, which are setup.py install. It knows it can be given the name of a script, so it looks for a file called setup.py to run. Python doesn't use your $PATH to find scripts, though, so that argument needs to be a real path to a file. If you just give it the name setup.py, it will only look in your current directory.
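For example (assuming the sources were extracted to ~/src/somepackage, an illustrative path):
cd ~/src/somepackage
python setup.py install
Many setup.py scripts expect to be run from their own directory, so changing into it first is safer than passing an absolute path to the script from somewhere else.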
The source directory for a python module should not, ideally, be in your $PATH.
yum install is a command that will go to a package repository, download all the files needed to install something, and then put them in the right place. yum (and equivalents on other distributions, like apt for Debian systems) will also fetch and install any other packages you need, including any that aren't python modules.
Python has a package manager, too. You may also find using pip install modulename or pip install --user modulename (if you don't have administrative rights) easier than downloading and installing the module by hand. You can often get more recent versions of modules this way, as the ones provided by an operating system (through yum) tend to be older, more stable versions. Sometimes the module is not available through yum at all. pip can't install any extra packages that aren't python modules, though.
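To make the contrast concrete (requests is just an illustrative module; the yum package name may differ on your system):
sudo yum install python-requests     # distro-packaged, usually older but integrated with the OS
pip install --user requests          # from PyPI, usually newer, installed per user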
If you don't have pip already (it comes with Python3, but might need installing separately for Python2, depending on how it was set up), then you can install it by following the instructions here: https://pip.pypa.io/en/stable/installing/

Can I have my pip user-installed package be preferred over system?

I would like to figure out a "fool-proof" installation instruction to put in the README of a Python project, call it footools, such that other people in our group can install the newest SVN version of it on their laptops and their server accounts.
The problem is getting the user-installed libs to be used by Python when they call the scripts installed by pip. E.g., we're using a server that has an old version of footools in /opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/.
If I do python2.7 setup.py install --user and run the main entry script, it uses the files in /Users/unhammer/Library/Python/2.7/lib/python/site-packages/. This is what I want, but setup.py alone doesn't install dependencies.
If I (revert the installation and) instead do pip-2.7 install --user . and run the main entry script, it uses the old files in /opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/ – that's not what I want.
If I (revert the installation and) instead do pip-2.7 install --user -e . and run the main entry script, it uses the files in . – that's not what I want, the user should be able to remove the source dir (and be able to svn up without that affecting their install).
I could use (and recommend other people to use) python2.7 setup.py install --user – but then they have to first do
pip-2.7 install -U --user -r requirements.txt -e .
pip-2.7 uninstall -y footools
in order to get the dependencies installed (since pip has no install --only-deps option). That's rather verbose though.
What is setup.py doing that pip is not doing here?
(Edited to make it clear I'm looking for simpler+safer installation instructions.)
Install virtualenvwrapper. It allows setting up separate Python environments to alleviate any conflicts you might be having. Here is a tutorial for installing and using virtualenv.
Related:
https://virtualenvwrapper.readthedocs.org/en/latest/
Console scripts generated by pip in the process of installation should use user-installed versions of libraries, according to PEP 370:
The user site directory is added before the system site directories
but after Python's search paths and PYTHONPATH. This setup allows the
user to install a different version of a package than the system
administrator but it prevents the user from accidently overwriting a
stdlib module. Stdlib modules can still be overwritten with
PYTHONPATH.
Sidenote
Setuptools uses a hack: it inserts code into the easy-install.pth file that is placed in the site-packages directory. This code makes packages installed with setuptools come before other packages in sys.path, so they shadow other packages with the same name. This is referred to as sys.path modification in the table comparing setuptools and pip. It is the reason console scripts use user-installed libraries when you install with setup.py install instead of with pip.
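To check whether that is happening on your system, you can locate and inspect the file (the site-packages path below is only an example; python -m site prints the real locations):
python2.7 -m site
cat /usr/lib/python2.7/site-packages/easy-install.pth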
Taking all of the above into account, what you observe might be caused by:
PYTHONPATH pointing to directories with system-wide installed libraries
Having system-wide libraries installed using sudo python setup.py install (...)
The OS influencing the construction of sys.path in some way
In the first case, either clearing PYTHONPATH or adding the path of the user-installed library to the beginning of PYTHONPATH should help.
In the second case, uninstalling the system-wide libraries and installing them with the distro's package manager instead might help (note that you should never use sudo with pip or setup.py to install Python packages).
In the third case, it's necessary to find out how the OS influences the construction of sys.path and whether there's some way of placing user-installed libraries before the system ones.
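For the first case, something along these lines should do it (the user-site path depends on your platform and Python version; the one below matches the macOS layout from the question):
export PYTHONPATH="$HOME/Library/Python/2.7/lib/python/site-packages:$PYTHONPATH"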
You might be interested in reading issue pip list reports wrong version of package installed both in system site and user site where I asked basically the same question as you:
Does it mean that having system wide Python packages installed with easy_install thus having them use sys.path manipulation breaks scripts from user bin directory? If so is there any workaround?
A last-resort solution would be to manually place the directory (or directories) with user-installed libraries at the beginning of sys.path from your scripts, before importing those libraries.
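A minimal sketch of that last resort (footools stands in for your actual import; this assumes you control the entry-point script):
import site
import sys

# make user-installed packages shadow system-wide ones
sys.path.insert(0, site.getusersitepackages())

import footools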
Having said that, if your users do not need direct access to the source code, I would propose packaging your app together with all of its dependencies into a self-contained bundle using a tool like pex or Platter.

Installing Python packages from local file system folder to virtualenv with pip

Is it possible to install packages using pip from the local filesystem?
I have run python setup.py sdist for my package, which has created the appropriate tar.gz file. This file is stored on my system at /srv/pkg/mypackage/mypackage-0.1.0.tar.gz.
Now in a virtual environment I would like to install packages either coming from pypi or from the specific local location /srv/pkg.
Is this possible?
PS
I know that I can specify pip install /srv/pkg/mypackage/mypackage-0.1.0.tar.gz. That will work, but I am talking about using the /srv/pkg location as another place for pip to search if I typed pip install mypackage.
What about:
pip install --help
...
-e, --editable <path/url> Install a project in editable mode (i.e. setuptools
"develop mode") from a local project path or a VCS url.
e.g., pip install -e /srv/pkg
where /srv/pkg is the top-level directory where 'setup.py' can be found.
I am pretty sure that what you are looking for is the --find-links option.
You can do
pip install mypackage --no-index --find-links file:///srv/pkg/mypackage
From the installing-packages page you can simply run:
pip install /srv/pkg/mypackage
where /srv/pkg/mypackage is the directory containing setup.py.
Additionally1, you can install it from the archive file:
pip install ./mypackage-1.0.4.tar.gz
1
Although noted in the question, due to its popularity, it is also included.
I am installing pyfuzzy but it is not on PyPI; it returns the message: No matching distribution found for pyfuzzy.
I tried the accepted answer
pip install --no-index --find-links=file:///Users/victor/Downloads/pyfuzzy-0.1.0 pyfuzzy
But it does not work either and returns the following error:
Ignoring indexes: https://pypi.python.org/simple
Collecting pyfuzzy
Could not find a version that satisfies the requirement pyfuzzy (from versions: )
No matching distribution found for pyfuzzy
At last, I found a simple, good way here: https://pip.pypa.io/en/latest/reference/pip_install.html
Install a particular source archive file.
$ pip install ./downloads/SomePackage-1.0.4.tar.gz
$ pip install http://my.package.repo/SomePackage-1.0.4.zip
So the following command worked for me:
pip install ../pyfuzzy-0.1.0.tar.gz
Hope it can help you.
This is the solution that I ended up using:
import pip

def install(package):
    # Debugging
    # pip.main(["install", "--pre", "--upgrade", "--no-index",
    #           "--find-links=.", package, "--log-file", "log.txt", "-vv"])
    pip.main(["install", "--upgrade", "--no-index", "--find-links=.", package])

if __name__ == "__main__":
    install("mypackagename")
    raw_input("Press Enter to Exit...\n")
I pieced this together from pip install examples as well as from Rikard's answer on another question. The "--pre" argument lets you install non-production versions. The "--no-index" argument avoids searching the PyPI indexes. The "--find-links=." argument searches in the local folder (this can be relative or absolute). I used the "--log-file", "log.txt", and "-vv" arguments for debugging. The "--upgrade" argument lets you install newer versions over older ones.
I also found a good way to uninstall them. This is useful when you have several different Python environments. It's the same basic format, just using "uninstall" instead of "install", with a safety measure to prevent unintended uninstalls:
import pip

def uninstall(package):
    response = raw_input("Uninstall '%s'? [y/n]:\n" % package)
    if "y" in response.lower():
        # Debugging
        # pip.main(["uninstall", package, "-vv"])
        pip.main(["uninstall", package])

if __name__ == "__main__":
    uninstall("mypackagename")
    raw_input("Press Enter to Exit...\n")
The local folder contains these files: install.py, uninstall.py, mypackagename-1.0.zip
The --find-links option does the job, and it works from a requirements.txt file!
You can put package archives in some folder and always pick up the latest one without changing the requirements file, for example with this layout:
.
├───requirements.txt
└───requirements
    ├───foo_bar-0.1.5-py2.py3-none-any.whl
    ├───foo_bar-0.1.6-py2.py3-none-any.whl
    ├───wiz_bang-0.7-py2.py3-none-any.whl
    ├───wiz_bang-0.8-py2.py3-none-any.whl
    ├───base.txt
    ├───local.txt
    └───production.txt
Now in requirements/base.txt put:
--find-links=requirements
foo_bar
wiz_bang>=0.8
A neat way to update proprietary packages: just drop the new one in the folder.
This way you can install packages from the local folder AND PyPI with the same single call: pip install -r requirements/production.txt
PS. See my cookiecutter-djangopackage fork to see how to split requirements and use folder based requirements organization.
Assuming you have virtualenv and a requirements.txt file, then you can define inside this file where to get the packages:
# Published pypi packages
PyJWT==1.6.4
email_validator==1.0.3
# Remote GIT repo package, this will install as django-bootstrap-themes
git+https://github.com/marquicus/django-bootstrap-themes#egg=django-bootstrap-themes
# Local GIT repo package, this will install as django-knowledge
git+file:///soft/SANDBOX/python/django/forks/django-knowledge#egg=django-knowledge
To install only from local you need 2 options:
--find-links: where to look for dependencies. There is no need for the file:// prefix mentioned by others.
--no-index: do not look in pypi indexes for missing dependencies (dependencies not installed and not in the --find-links path).
So you could run from any folder the following:
pip install --no-index --find-links /srv/pkg /path/to/mypackage-0.1.0.tar.gz
If your mypackage is set up properly, it will list all its dependencies, and if you used pip download to fetch the cascade of dependencies (i.e. dependencies of dependencies, etc.), everything will work.
If you want to use the PyPI index when it is accessible, but fall back to local wheels when it is not, you can remove --no-index and add --retries 0. You will see pip pause for a bit while it tries to check PyPI for a missing dependency (one not installed), and when it finds it cannot reach it, it will fall back to local. There does not seem to be a way to tell pip to "look for local ones first, then the index".
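A sketch of that fallback invocation (package name and path are illustrative):
pip install --retries 0 --find-links /srv/pkg mypackage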
With your requirements in requirements.txt and eggs_dir as a directory, you can build your local cache:
$ pip download -r requirements.txt -d eggs_dir
then, using that "cache" is as simple as:
$ pip install -r requirements.txt --find-links=eggs_dir
What you need is --find-links of pip install.
-f, --find-links If a url or path to an html file, then parse for links to archives. If a local path or
file:// url that's a directory, then look for archives in the directory listing.
In my case, after python -m build, the tar.gz package (and whl file) are generated in the ./dist directory.
pip install --no-index -f ./dist YOUR_PACKAGE_NAME
Any tar.gz Python package in ./dist can be installed this way.
But if your package has dependencies, this command will raise an error.
To solve this, you can either pip install those dependencies from the official PyPI source and then add --no-deps, like this:
pip install --no-index --no-deps -f ./dist YOUR_PACKAGE_NAME
or copy your dependency packages to the ./dist directory.
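For the second approach, a minimal sketch of pre-filling ./dist with the dependencies (assuming a requirements.txt that lists them):
pip download -r requirements.txt -d ./dist
pip install --no-index -f ./dist YOUR_PACKAGE_NAME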
I've been trying to achieve something really simple and failed miserably, probably I'm stupid.
Anyway, if you have a script/Dockerfile which downloads a Python package zip file (e.g. from GitHub) and you then want to install it, you can use the file:/// prefix to install it as shown in the following example:
$ wget https://example.com/mypackage.zip
$ echo "${MYPACKAGE_MD5} mypackage.zip" | md5sum --check -
$ pip install file:///.mypackage.zip
NOTE: I know you could install the package straight away using pip install https://example.com/mypackage.zip but in my case I wanted to verify the checksum (never paranoid enough) and I failed miserably when trying to use the various options that pip provides/the #md5 fragment.
It's been surprisingly frustrating to do something so simple directly with pip. I just wanted to pass a checksum and have pip verify that the zip was matching before installing it.
I was probably doing something very stupid but in the end I gave up and opted for this. I hope it helps others trying to do something similar.
In my case, it was because this library depended on another local library, which I had not yet installed. Installing the dependency with pip, and then the dependent library, solved the issue.
If you want to install one local package (package A) to be used inside another local project/package (B), this is quite simple. All you need is to cd to (B) and call:
pip install /path/to/package(A)
Of course you will need to first compile the package (A) with:
sudo python3 ./setup.py install
And each time you change package A, just run setup.py again in package (A), then pip install ... inside the consuming project/package (B).
Just add the directory to the pip command:
pip install mypackage file:/location/in/disk/mypackagename.filetype

Install a Python package into a different directory using pip?

I know the obvious answer is to use virtualenv and virtualenvwrapper, but for various reasons I can't/don't want to do that.
So how do I modify the command
pip install package_name
to make pip install the package somewhere other than the default site-packages?
The --target switch is the thing you're looking for:
pip install --target=d:\somewhere\other\than\the\default package_name
But you still need to add d:\somewhere\other\than\the\default to PYTHONPATH to actually use them from that location.
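For example (path as in the command above), on Windows that would look like:
set PYTHONPATH=d:\somewhere\other\than\the\default;%PYTHONPATH%
or, on Linux/macOS with an equivalent path:
export PYTHONPATH=/somewhere/other/than/the/default:$PYTHONPATH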
-t, --target <dir>
Install packages into <dir>. By default this will not replace existing files/folders in <dir>.
Use --upgrade to replace existing packages in <dir> with new versions.
Upgrade pip if the --target switch is not available:
On Linux or OS X:
pip install -U pip
On Windows (this works around an issue):
python -m pip install -U pip
Use:
pip install --install-option="--prefix=$PREFIX_PATH" package_name
You might also want to use --ignore-installed to force all dependencies to be reinstalled using this new prefix. You can use --install-option multiple times to add any of the options you can use with python setup.py install (--prefix is probably what you want, but there are a bunch of other options you could use).
Instead of the --target or --install-option options, I have found that setting the PYTHONUSERBASE environment variable works well (from a discussion on a bug regarding this very thing):
PYTHONUSERBASE=/path/to/install/to pip install --user
(Or set the PYTHONUSERBASE directory in your environment before running the command, using export PYTHONUSERBASE=/path/to/install/to)
This uses the very useful --user option but tells it to make the bin, lib, share and other directories you'd expect under a custom prefix rather than $HOME/.local.
Then you can add this to your PATH, PYTHONPATH and other variables as you would a normal installation directory.
Note that you may also need to specify the --upgrade and --ignore-installed options if any packages upon which this depends require newer versions to be installed in the PYTHONUSERBASE directory, to override the system-provided versions.
A full example
PYTHONUSERBASE=/opt/mysterypackage-1.0/python-deps pip install --user --upgrade numpy scipy
…to install the most recent versions of the scipy and numpy packages into a directory which you can then include in your PYTHONPATH, like so (using bash and Python 2.6 on CentOS 6 for this example):
export PYTHONPATH=/opt/mysterypackage-1.0/python-deps/lib64/python2.6/site-packages:$PYTHONPATH
export PATH=/opt/mysterypackage-1.0/python-deps/bin:$PATH
Using virtualenv is still a better and neater solution!
To pip install a library exactly where I wanted it, I navigated in the terminal to the location where I wanted the directory, then used
pip install mylibraryName -t .
the logic of which I took from this page: https://cloud.google.com/appengine/docs/python/googlecloudstorageclient/download
Installing a Python package often only includes some pure Python files. If the package includes data, scripts and or executables, these are installed in different directories from the pure Python files.
Assuming your package has no data/scripts/executables, and that you want your Python files to go into /python/packages/package_name (and not some subdirectory a few levels below /python/packages as when using --prefix), you can use the one time command:
pip install --install-option="--install-purelib=/python/packages" package_name
If you want all (or most) of your packages to go there, you can edit your ~/.pip/pip.conf to include:
[install]
install-option=--install-purelib=/python/packages
That way you don't have to remember to specify it again and again.
Any executables/data/scripts included in the package will still go to their default places unless you specify additional install options (--prefix/--install-data/--install-scripts, etc.; for details look at the custom installation options).
Tested these options with python3.5 and pip 9.0.3:
pip install --target /myfolder [packages]
Installs ALL packages, including dependencies, under /myfolder. Does not take into account that dependent packages may already be installed elsewhere in Python. You will find packages in /myfolder/[package_name]. In case you have multiple Python versions, this doesn't take that into account either (no Python version in the package folder name).
pip install --prefix /myfolder [packages]
Checks if dependencies are already installed. Will install packages into /myfolder/lib/python3.5/site-packages/[packages]
pip install --root /myfolder [packages]
Checks dependencies like --prefix but install location will be /myfolder/usr/local/lib/python3.5/site-packages/[package_name].
pip install --user [packages]
Will install packages into $HOME:
/home/[USER]/.local/lib/python3.5/site-packages
Python searches this .local path automatically, so you don't need to add it to your PYTHONPATH.
=> In most cases --user is the best option to use.
If the home folder can't be used for some reason, then --prefix.
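To see exactly where --user installs go on your system, you can ask Python directly (a quick check, not part of the original comparison):
python3.5 -m site --user-site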
pip3 install "package_name" -t "target_dir"
source - https://pip.pypa.io/en/stable/reference/pip_install/
-t switch = target
Nobody seems to have mentioned the -t option, but that is the easiest:
pip install -t <direct directory> <package>
pip install packageName -t pathOfDirectory
or
pip install packageName --target pathOfDirectorty
Just to add one point to Ian Bicking's answer:
Using the --user option to specify the install directory also works if one wants to install a Python package into one's home directory (without sudo rights) on a remote server.
E.g.,
pip install --user python-memcached
The command will install the package into one of the directories listed in your PYTHONPATH.
Newer versions of pip (8 or later) can directly use the --prefix option:
pip install --prefix=$PREFIX_PATH package_name
where $PREFIX_PATH is the installation prefix where lib, bin and other top-level folders are placed.
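For example (prefix path, package name, and Python version below are illustrative), the site-packages directory under the prefix then needs to be on PYTHONPATH to be importable:
pip install --prefix=/opt/myprefix some_package
export PYTHONPATH=/opt/myprefix/lib/python3.6/site-packages:$PYTHONPATH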
To add to the already good advice: I had an issue installing IPython when I didn't have write permissions to /usr/local.
pip uses distutils to do its install and this thread discusses how that can cause a problem as it relies on the sys.prefix setting.
My issue happened when the IPython install tried to write to '/usr/local/share/man/man1' with Permission denied. As the install failed it didn't seem to write the IPython files in the bin directory.
Using "--user" worked and the files were written to ~/.local. Adding ~/.local/bin to the $PATH meant I could use "ipython" from there.
However I'm trying to install this for a number of users and had been given write permission to the /usr/local/lib/python2.7 directory. I created a "bin" directory under there and set directives for distutils:
vim ~/.pydistutils.cfg
[install]
install-data=/usr/local/lib/python2.7
install-scripts=/usr/local/lib/python2.7/bin
then (-I is used to force the install despite previous failures/.local install):
pip install -I ipython
Then I added /usr/local/lib/python2.7/bin to $PATH.
I thought I'd include this in case anyone else has similar issues on a machine they don't have sudo access to.
If you are using brew with Python, unfortunately, pip/pip3 ships with very limited options. You do not have the --install-option, --target, or --user options mentioned above.
Note on pip install --user
The normal pip install --user is disabled for brewed Python. This is because of a bug in distutils, because Homebrew writes a distutils.cfg which sets the package prefix.
A possible workaround (which puts executable scripts in ~/Library/Python/./bin) is:
python -m pip install --user --install-option="--prefix=" <package-name>
You might find this line very cumbersome. I suggest using pyenv for management.
If you are using
brew upgrade python python3
you are, ironically, actually downgrading pip functionality.
(I post this answer simply because pip on my Mac OS X does not have the --target option, and I have spent hours fixing it.)
With pip v1.5.6 on Python v2.7.3 (GNU/Linux), the --root option allows specifying a global installation prefix, (apparently) irrespective of a specific package's options. Try, for instance:
$ pip install --root=/alternative/prefix/path package_name
I suggest following the documentation and creating a ~/.pip/pip.conf file. Note that the documentation omits the headers directory, which leads to the following error:
error: install-base or install-platbase supplied, but installation scheme is incomplete
The full working content of the conf file is:
[install]
install-base=$HOME
install-purelib=python/lib
install-platlib=python/lib.$PLAT
install-scripts=python/scripts
install-headers=python/include
install-data=python/data
Unfortunately I can install, but when I try to uninstall, pip tells me there is no such package for the uninstallation process... so something is still wrong, but the package goes to its predefined location.
pip install /path/to/package/
is now possible.
The difference between this and using the -e or --editable flag is that -e links to where the package is saved (e.g. your downloads folder), rather than installing it into your Python path.
This means if you delete/move the package to another folder, you won't be able to use it.
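To illustrate the difference (the path is illustrative):
pip install /path/to/package       # copies the package into site-packages; the source dir can be removed afterwards
pip install -e /path/to/package    # only links to /path/to/package; deleting or moving it breaks the install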
On some distributions (e.g. Debian/Ubuntu-patched pip) there is a --system option that will install the package's binaries to /usr/local/bin, which is accessible to all users. Installing without this option may not work for all users, as things go to a user-specific directory like $HOME/.local/bin; the install is then user-specific and has to be repeated for every user, and if that directory is not on the users' PATH, the binaries won't work. So if you are installing for all users, you need sudo access:
sudo su -
python3 -m pip install --system <module>
logout
log back in
which <module-bin> --> it should be installed in /usr/local/bin/
Sometimes it only works with the cache argument:
python -m pip install -U pip --target=C:\xxx\python\lib\site-packages Pillow --cache-dir C:\tmp
