I am trying to edit a Python library and hence am building it from source. Can someone explain what the following command does, and how this method differs from a normal
'pip install package-name'?
pip install --verbose --no-build-isolation --editable
You can read all the usage options here: https://pip.pypa.io/en/stable/cli/pip_install/
-v, --verbose
Give more output. Option is additive, and can be used up to 3 times.
--no-build-isolation
Disable isolation when building a modern source distribution. Build dependencies specified by PEP 518 must be already installed if this option is used.
It means pip won't install the build dependencies for you, so you have to install them yourself first (e.g. setuptools, plus anything listed under [build-system] requires in pyproject.toml) or the build will fail.
-e, --editable <path/url>
Install a project in editable mode (i.e. setuptools “develop mode”) from a local project path or a VCS url.
Here you have to supply a path or URL argument, such as . for the current directory.
This information is from the official pip documentation; please refer to it for details.
When making build requirements available, pip does so in an isolated environment. That is, pip does not install those requirements into the user’s site-packages, but rather installs them in a temporary directory which it adds to the user’s sys.path for the duration of the build. This ensures that build requirements are handled independently of the user’s runtime environment. For example, a project that needs a recent version of setuptools to build can still be installed, even if the user has an older version installed (and without silently replacing that version).
In certain cases, projects (or redistributors) may have workflows that explicitly manage the build environment. For such workflows, build isolation can be problematic. If this is the case, pip provides a --no-build-isolation flag to disable build isolation. Users supplying this flag are responsible for ensuring the build environment is managed appropriately (including ensuring that all required build dependencies are installed).
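Putting the three options together, a typical build-from-source workflow might look like the sketch below. The repository URL is a placeholder, and the exact build dependencies depend on the project (check its pyproject.toml):

```shell
# Clone the library you want to modify (URL is a placeholder)
git clone https://example.com/package-name.git
cd package-name

# With --no-build-isolation, pip will NOT fetch build dependencies
# for you, so install them first (setuptools and wheel are typical;
# check the project's [build-system] requires list)
python -m pip install setuptools wheel

# Verbose, non-isolated, editable install of the current directory.
# Note the trailing "." -- --editable requires a path or VCS URL.
python -m pip install --verbose --no-build-isolation --editable .
```

After this, edits to the source tree are picked up on the next import without reinstalling, which is why editable mode is preferred when hacking on a library.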
Thank you
Do I understand correctly that the best way to make sure is to go through module’s code and see for myself?
Yes.
When you install Python modules through pip, you are installing any code contained within those modules. Note that Python modules allow code execution not only at run time, but also at install time. To prevent this, install only binary-distribution Python wheels using the --only-binary :all: flag. This avoids arbitrary code execution on installation (by avoiding setup.py).
In addition to this, you can also help mitigate against malicious packages by:
Installing packages into the local user's site-packages using the --user flag.
And installing packages in hash-checking mode using the --require-hashes flag.
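A sketch of an install command combining those mitigations. The requirements file and the hash shown are placeholders, not real values:

```shell
# Only accept prebuilt wheels (so setup.py never runs), install into
# the user's site-packages, and verify every download against a pinned
# hash. requirements.txt must pin versions with hashes, e.g.:
#   requests==2.31.0 --hash=sha256:<hash value goes here>
python -m pip install --only-binary :all: --user --require-hashes -r requirements.txt
```

You can generate a hash for a downloaded file with pip hash <file>, then paste it into the requirements file.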
I am running an automated test suite, and one of the tests needs to install several Python packages with pip to make sure project scaffolds operate correctly.
However, fetching packages from PyPI is quite a slow operation and burns unnecessary time during the test run. It is also a great source of random failures due to network connectivity errors. My plan was to create a cache tarball of the known Python packages which are going to be installed. Then pip could consume packages directly from this tarball, or extract it into a virtualenv for the test run.
Also the goal is to make this repeatable, so that the same cache (tarball) would be available on CI and local development.
Do any tools or processes exist to create redistributable Python package caches for pip?
Any other ideas on how to do this in a platform-agnostic way? I assume relocatable virtual environments are specific to the target platform?
Use wheel:
pip wheel -r requirements.txt -w wheelhouse
All requirements are built as wheels into the wheelhouse folder,
so on each test-suite run you can install them with pip install wheelhouse/*
Your second option is devpi, which works as a PyPI cache.
I would like to figure out a "fool-proof" installation instruction to put in the README of a Python project, call it footools, such that other people in our group can install the newest SVN version of it on their laptops and their server accounts.
The problem is getting the user-installed libs to be used by Python when they call the scripts installed by pip. E.g., we're using a server that has an old version of footools in /opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/.
If I do python2.7 setup.py install --user and run the main entry script, it uses the files in /Users/unhammer/Library/Python/2.7/lib/python/site-packages/. This is what I want, but setup.py alone doesn't install dependencies.
If I (revert the installation and) instead do pip-2.7 install --user . and run the main entry script, it uses the old files in /opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/ – that's not what I want.
If I (revert the installation and) instead do pip-2.7 install --user -e . and run the main entry script, it uses the files in . – that's not what I want, the user should be able to remove the source dir (and be able to svn up without that affecting their install).
I could use (and recommend other people to use) python2.7 setup.py install --user – but then they have to first do
pip-2.7 install -U --user -r requirements.txt -e .
pip-2.7 uninstall -y footools
in order to get the dependencies installed (since pip has no install --only-deps option). That's rather verbose though.
What is setup.py doing that pip is not doing here?
(Edited to make it clear I'm looking for simpler+safer installation instructions.)
Install virtualenvwrapper. It allows setting up separate Python environments to alleviate any conflicts you might be having. Here is a tutorial for installing and using virtualenv.
Related:
https://virtualenvwrapper.readthedocs.org/en/latest/
Console scripts generated by pip during installation should use user-installed versions of libraries, according to PEP 370:
The user site directory is added before the system site directories
but after Python's search paths and PYTHONPATH. This setup allows the
user to install a different version of a package than the system
administrator but it prevents the user from accidentally overwriting a
stdlib module. Stdlib modules can still be overwritten with
PYTHONPATH.
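You can check where the user site directory lands relative to the system site-packages directly from Python; the paths printed will of course differ per machine:

```python
import site
import sys

# Per PEP 370, the user site directory (when enabled) is inserted
# into sys.path before the system site-packages directories.
user_site = site.getusersitepackages()
print("user site:", user_site)
print("user site enabled:", site.ENABLE_USER_SITE)

# Show the positions of all site-packages entries on sys.path;
# the user site, when present, should come before any system one.
for i, path in enumerate(sys.path):
    if "site-packages" in path:
        print(i, path)
```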
Sidenote
Setuptools uses a hack: it inserts code into the easy_install.pth file, which is placed in the site-packages directory. This code makes packages installed with setuptools come before other packages on sys.path, so they shadow other packages with the same name. This is referred to as sys.path modification in the table comparing setuptools and pip. This is the reason console scripts use user-installed libraries when you install with setup.py install instead of with pip.
Taking all of the above into account the reason for what you observe might be caused by:
PYTHONPATH pointing to directories with system-wide installed libraries
Having system-wide libraries installed using sudo python setup.py install (...)
Having OS influence sys.path construction in some way
In the first case either clearing PYTHONPATH or adding path to user installed library to the beginning of PYTHONPATH should help.
In the second case uninstalling system-wide libraries and installing them with distro package manager instead might help (please note that you never should use sudo with pip or setup.py to install Python packages).
In the third case it's necessary to find out how the OS influences sys.path construction and whether there's some way of placing user-installed libraries before system ones.
You might be interested in reading the issue "pip list reports wrong version of package installed both in system site and user site", where I asked basically the same question as you:
Does it mean that having system wide Python packages installed with easy_install thus having them use sys.path manipulation breaks scripts from user bin directory? If so is there any workaround?
Last resort solution would be to manually place directory/directories with user installed libraries in the beginning of sys.path from your scripts before importing these libraries.
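A minimal sketch of that last-resort approach. Here the user site directory is computed with the stdlib site module rather than hard-coded, and the final import is a placeholder for your own library:

```python
import site
import sys

# Force the user site-packages directory to the front of sys.path
# before importing anything from it, so user-installed copies win.
user_site = site.getusersitepackages()
if user_site in sys.path:
    sys.path.remove(user_site)
sys.path.insert(0, user_site)

# Imports below this point resolve against the user site first.
# import footools
```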
That said, if your users do not need direct access to the source code, I would propose packaging your app together with all its dependencies into a self-contained bundle using a tool like pex or Platter.
I am running a local pypi server. I can install packages from this server by either specifying it with the -i option of the pip command or by setting the PIP_INDEX_URL environment variable. When I install a package that has prerequisites, setup.py has historically honored the PIP_INDEX_URL environment variable, pulling the additional packages from my local server.
However, on a couple of recently installed systems, it is behaving differently. Running, for instance, python setup.py develop fails because it tries to install prerequisite packages from pypi.python.org.
I have updated all of the related python packages (python, distribute, virtualenv, pip, etc...) on all the systems I'm testing on and continue to see this discrepancy. On my "original" system, setup.py downloads prerequisites from the pypi server specified in my PIP_INDEX_URL environment variable. On the newer systems, I can't seem to make it honor this variable.
What am I missing?
Create a setup.cfg in the same folder as your setup.py with the following content:
[easy_install]
allow_hosts = *.myintranet.example.com
From: http://pythonhosted.org/setuptools/easy_install.html#restricting-downloads-with-allow-hosts
You can use the --allow-hosts (-H) option to restrict what domains EasyInstall will look for links and downloads on.
--allow-hosts=None prevents downloading altogether.
I ran into the same issue. Fundamentally, setup.py uses setuptools, which leverages easy_install, not pip. Thus it ignores any pip-related environment variables you set.
Rather than use python setup.py develop you can run pip (from the top of the package) pip install -e . to produce the same effect.
In order to install matplotlib to a non default location, I change the file setup.cfg, setting the variable basedirlist.
I do
python setup.py build
and then
python setup.py install
but the last one fails because:
copying build/lib.linux-x86_64-2.6/mpl_toolkits/axes_grid1/colorbar.py -> /opt/python/2.6.4/lib/python2.6/site-packages/mpl_toolkits/axes_grid1
error: could not delete '/opt/python/2.6.4/lib/python2.6/site-packages/mpl_toolkits/axes_grid1/colorbar.py': Read-only file system
I am not root, so how can I install matplotlib? Is there any other variable I have to set?
Try with an unmodified version of setup.cfg and run python setup.py install --help. There are several options for controlling where the files are installed; the important part of the help message is:
Options for 'install' command:
  --prefix          installation prefix
  --exec-prefix     (Unix only) prefix for platform-specific files
  --home            (Unix only) home directory to install under
  --user            install in user site-package
                    '/home/yannpaul/.local/lib/python2.6/site-packages'
  --install-base    base installation directory (instead of --prefix or --home)
Read over those options and choose which one suits you best.
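For a non-root install, the two most common choices sketch out like this; the custom prefix path is only an example, and the site-packages subpath under it depends on your Python version:

```shell
# Option 1: per-user install -- no extra configuration needed,
# since the user site is already on sys.path per PEP 370
python setup.py build
python setup.py install --user

# Option 2: custom prefix; you must then put the resulting
# site-packages directory on PYTHONPATH yourself
python setup.py install --prefix=$HOME/opt/matplotlib
export PYTHONPATH=$HOME/opt/matplotlib/lib/python2.6/site-packages:$PYTHONPATH
```

The PYTHONPATH export is what makes the prefix variant visible to the interpreter; without it, imports of the freshly installed package will fail.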
I recommend, however, using virtualenv. This sets up, in a directory of your choice, a custom library location and a copy of Python. All the other libraries (installed by a system admin, for example) remain available until you install your own copy of a library in this virtualenv.
Virtualenv is also a good option if you want to play around with the development version of a library, matplotlib for example. Set up a virtualenv for these development libraries, then use the Python executable associated with that virtualenv to get access to the development version of the library.
Check out What's the proper way to install pip, virtualenv, and distribute for Python? to get setup with virtualenv.
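On Python 3, the stdlib venv module gives the same isolation without installing anything extra; a minimal sketch (the environment directory name is arbitrary, and the install step needs network access):

```shell
# Create an isolated environment in ./mpl-dev
python3 -m venv mpl-dev

# Use its interpreter and pip directly, or "activate" it first
./mpl-dev/bin/pip install matplotlib
./mpl-dev/bin/python -c "import sys; print(sys.prefix)"
```

Anything installed with the environment's pip lands inside mpl-dev/ and never touches the read-only system site-packages.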