Can I have my pip user-installed package be preferred over the system one?

I would like to figure out a "fool-proof" installation instruction to put in the README of a Python project, call it footools, such that other people in our group can install the newest SVN version of it on their laptops and their server accounts.
The problem is getting the user-installed libs to be used by Python when they call the scripts installed by pip. E.g., we're using a server that has an old version of footools in /opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/.
If I do python2.7 setup.py install --user and run the main entry script, it uses the files in /Users/unhammer/Library/Python/2.7/lib/python/site-packages/. This is what I want, but setup.py alone doesn't install dependencies.
If I (revert the installation and) instead do pip-2.7 install --user . and run the main entry script, it uses the old files in /opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/ – that's not what I want.
If I (revert the installation and) instead do pip-2.7 install --user -e . and run the main entry script, it uses the files in . – that's not what I want, the user should be able to remove the source dir (and be able to svn up without that affecting their install).
I could use (and recommend other people to use) python2.7 setup.py install --user – but then they have to first do
pip-2.7 install -U --user -r requirements.txt -e .
pip-2.7 uninstall -y footools
in order to get the dependencies installed (since pip has no install --only-deps option). That's rather verbose though.
What is setup.py doing that pip is not doing here?
(Edited to make it clear I'm looking for simpler+safer installation instructions.)

Install virtualenvwrapper. It allows you to set up separate Python environments, which alleviates the kind of conflicts you're having. Here is a tutorial for installing and using virtualenv.
Related:
https://virtualenvwrapper.readthedocs.org/en/latest/

Console scripts generated by pip during installation should use user-installed versions of libraries, according to PEP 370:
The user site directory is added before the system site directories but after Python's search paths and PYTHONPATH. This setup allows the user to install a different version of a package than the system administrator but it prevents the user from accidently overwriting a stdlib module. Stdlib modules can still be overwritten with PYTHONPATH.
Sidenote
Setuptools uses a hack: it inserts code into the easy-install.pth file, which is placed in the site-packages directory. This code makes packages installed with setuptools come before other packages on sys.path, so they shadow other packages with the same name. This is referred to as sys.path modification in the table comparing setuptools and pip, and it is the reason console scripts use user-installed libraries when you install with setup.py install instead of with pip.
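To make this concrete, an easy-install.pth file looks roughly like the following (paraphrased from what setuptools generates; the egg filename is a made-up example):

import sys; sys.__plen = len(sys.path)
./footools-1.0-py2.7.egg
import sys; new = sys.path[sys.__plen:]; del sys.path[sys.__plen:]; p = getattr(sys, '__egginsert', 0); sys.path[p:p] = new; sys.__egginsert = p + len(new)

The two import lines bracket the egg paths: site.py executes them while processing the .pth file, and the second one moves the entries that were just appended to the end of sys.path up toward the front, which is what gives setuptools-installed eggs their precedence.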
Taking all of the above into account, what you observe might be caused by:
PYTHONPATH pointing to directories with system-wide installed libraries
Having system-wide libraries installed using sudo python setup.py install (...)
Having OS influence sys.path construction in some way
In the first case, either clearing PYTHONPATH or adding the path of the user-installed library at the beginning of PYTHONPATH should help.
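For example (a sketch; the user site path shown matches the Mac layout from the question and depends on your Python version and platform, and your-entry-script stands for whatever console script the package installs):

env -u PYTHONPATH python2.7 your-entry-script    # clear PYTHONPATH for one run, or:
export PYTHONPATH="$HOME/Library/Python/2.7/lib/python/site-packages:$PYTHONPATH"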
In the second case, uninstalling the system-wide libraries and installing them with the distro package manager instead might help (please note that you should never use sudo with pip or setup.py to install Python packages).
In the third case, it's necessary to find out how the OS influences sys.path construction and whether there's some way of placing user-installed libraries before the system ones.
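Whichever case applies, it helps to start by inspecting the actual search order. Both of these are standard:

python2.7 -c "import sys; print('\n'.join(sys.path))"
python2.7 -m site    # also prints USER_BASE and USER_SITE and whether they are enabled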
You might be interested in reading issue pip list reports wrong version of package installed both in system site and user site where I asked basically the same question as you:
Does it mean that having system wide Python packages installed with easy_install thus having them use sys.path manipulation breaks scripts from user bin directory? If so is there any workaround?
A last-resort solution would be to manually place the directory (or directories) with user-installed libraries at the beginning of sys.path in your scripts, before importing those libraries.
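A minimal sketch of that last resort, assuming footools (the package from the question) is what should shadow the system copy:

import site
import sys

# Move the per-user site-packages directory to the front of sys.path
# so it wins over the system-wide installation.
user_site = site.getusersitepackages()
if user_site in sys.path:
    sys.path.remove(user_site)
sys.path.insert(0, user_site)

import footools  # now resolves to the user-installed copy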
Having said that, if your users do not need direct access to the source code, I would propose packaging your app together with all its dependencies into a self-contained bundle using a tool like pex or Platter.
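With pex that could look something like this (a sketch; footools is assumed to be both the project and the name of its console script, and the exact flags may vary between pex versions):

pip install --user pex
pex . -r requirements.txt -c footools -o footools.pex
# users can copy footools.pex anywhere and run it directly;
# no site-packages directory is involved at all
./footools.pex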

Related

pip3 installs modules to location python3 can't find

I have pip3, installed via the yum install of python3-pip.
I've done a pip3 global install of some modules I need, but python3 can't find them to import. After a little investigation I see that pip3 installed the modules to /usr/lib/python3.6/site-packages/pip/_vendor/
The problem is that python3 doesn't seem to know to look in pip/_vendor; it only finds modules installed directly under site-packages. If I just copy the modules from .../site-packages/pip/_vendor to .../site-packages, everything works fine.
The issue doesn't appear to be related to file permissions or ability to read the modules.
I'm wondering how I configure either pip to install directly to site-package or python3 to understand how to look in the pip/_vendor location.
I'm configuring this all with ansible and would like as modular an option as possible. For instance, I could manually pass an argument to tell pip3 to install to the folder I want, but I don't want to hardcode the exact site-packages directory if I don't have to.
I recommend starting over with pip by downloading and running get-pip.py. This not only installs the latest version of pip, it also ties pip to the Python interpreter you run get-pip.py with, so packages end up somewhere that interpreter can actually import from.
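For example (the bootstrap URL is pip's standard one; use whichever interpreter you want pip bound to):

curl -O https://bootstrap.pypa.io/get-pip.py
python3 get-pip.py           # system-wide (needs root), or:
python3 get-pip.py --user    # per-user, no root required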
As an aside, I would avoid installing packages system-wide unless there is a specific need for them. At the very least, you should be installing them as a regular user, and even better you should be using a virtualenv.

Installing numpy, keras and theano without root privileges on linux

I have been given access to a University Data Center to deploy an Image Analysis python project. The server has Python 2.7 and 3.5 installed and I can see that it is missing packages like numpy, theano and keras which I have used in my code as additional libraries.
The problem at hand is that I do not have access to install anything or run commands like pip install or apt-get install, and I cannot copy anything to the original site-packages location on my server.
But I can copy files into my userspace, and I tried to:
- clone numpy and its prerequisites, and all the additional packages I need into a folder called site-packages.
- add this path to my sys.path, but it gives me errors like "cannot import multiarray"
I'm new to Linux, and my question is: can I copy package files into a Linux system and add this path to my PYTHONPATH to run the code?
I believe you are looking for:
pip install --user package_name
You might also need to investigate compiling some packages from their source code, but this will depend on the package.
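For the packages from the question that would be (assuming pip is available on the server; numpy ships compiled C extensions, which is why copying its files between systems tends to fail with errors like the "cannot import multiarray" one you saw):

pip install --user numpy theano keras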
From the user guide more on pip install --user:
pip install --user follows four rules:
- When globally installed packages are on the python path, and they conflict with the installation requirements, they are ignored, and not uninstalled.
- When globally installed packages are on the python path, and they satisfy the installation requirements, pip does nothing, and reports that requirement is satisfied (similar to how global packages can satisfy requirements when installing packages in a --system-site-packages virtualenv).
- pip will not perform a --user install in a --no-site-packages virtualenv (i.e. the default kind of virtualenv), due to the user site not being on the python path. The installation would be pointless.
- In a --system-site-packages virtualenv, pip will not install a package that conflicts with a package in the virtualenv site-packages. The --user installation would lack sys.path precedence and be pointless.
Edit: If pip itself is not installed then you can read up here: https://pip.pypa.io/en/stable/installing/

When install external python packages global, when local? pip or system package-manager?

I'm confused about the possibilities of installing external python packages:
install a package locally with pip into /home/chris/.local/lib/python3.4/site-packages
$ pip install --user packagename
install a package globally with pip into /usr/local/lib/python3.4/site-packages
(superuser permission required)
$ pip install packagename
install a package globally with zypper into /usr/lib/python3.4/site-packages
(superuser permission required)
$ zypper install packagename
I use OpenSuse with package-manager zypper and have access to user root.
What I (think I) know about pip is that:
- pip just downloads the latest version.
- It won't check whether newer versions of already-installed packages are available.
- Own packages can be installed in a virtual env.
- Takes more time to download and install than zypper.
- Local or global installation possible.
The package-manager of my system:
- Downloads and installs faster.
- Installs the package only globally.
My question is when and why should I do the installation: pip (local, global) or with zypper?
I've read a lot about this issue but could not answer this question clearly...
The stuff under /usr/lib consists of system packages considered part of the OS. It's likely that OS scripts and services will depend on these components, so I'd recommend not touching them yourself, nor using or depending on them for user scripts, as this will make your app OS- or even OS-version-dependent. Use these only when writing scripts that run at system level, such as maintenance or admin tasks, although even for those I'd seriously consider using...
Stuff under /usr/local/lib is installed locally for use by any user. System scripts and the like won't depend on these (I don't know SuSE myself, though), but other users' scripts might well do, so that needs to be borne in mind when making changes here; it's a shared resource. If you're writing scripts that other users might need to run, develop against this to ensure they will have access to all the required dependencies.
Stuff in your home directory is all yours, so do as thou wilt. Use this if you're writing something just for yourself and especially if you might need the scripts to be portable to other boxes/OSes.
There might well be other options that make sense, such as if you're part of a team developing application software, in which case install your team's base dev packages in a shared location but perhaps not /usr/local.
In terms of using zypper or pip, I'd suggest using zypper to update /usr/lib for sure, as it's the specific tool for OS configuration updates. Probably the same goes for /usr/local/lib too, as that's really part of the 'system', but it's up to you and whichever method makes most sense, e.g. if you needed to replicate the config on another host. For stuff in your homedir it's up to you, but if you decide to move to a new host on a new OS, pip will still be available, so that environment will be easier to recreate.
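A quick way to see which tool manages what on an openSUSE box (both commands are standard; the search term is just an example):

zypper search --installed-only python3    # distro-managed packages
pip list --user                           # per-user pip installs under ~/.local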

Installing python packages with no installation directory access and no pip/easy_install/virtualenv

At work we have python installed, but no additional modules. I want to import some scipy modules but I have no access to the python directory for installation.
Similar questions have been asked on StackOverflow, but the answers always assumed easy install, pip or virtualenv were installed. At my workplace, none of these packages are installed. It's just the plain python installation and nothing else.
Is there still an option for me for installing modules in my local folder and calling them from python? If so, how do I go about it?
Not exactly installing modules in your local folder, but a solution nonetheless:
I used to work for a company that used Windows and didn't give admin access, so I ended up using Portable Python.
It seems Portable Python is no longer maintained, but you can find some other portable Python solutions on their site, most of which you can run straight from a USB drive.
You can download get-pip.py as described at http://pip.readthedocs.org/en/stable/installing/ and install pip without root privileges by typing:
python get-pip.py --user
This installs under the prefix $HOME/.local, so the pip executable will be at $HOME/.local/bin/pip. For convenience you can add this directory to $PATH by appending the following line to your .bashrc file:
export PATH=$HOME/.local/bin/:$PATH
After this you can install any packages by typing
pip install package --user
Alternatively, you can compile the Python distribution from source and install it into your home directory: under $HOME/.local, $HOME/opt, or any subfolder of $HOME you prefer; let's call this path $PREFIX. To do this, download the Python source code from the official site, unpack it, and run:
./configure --prefix=$PREFIX --enable-shared
make
make install
Then add the Python binary to $PATH and the Python libraries to $LD_LIBRARY_PATH by adding the following lines to the end of your $HOME/.bashrc file:
export PATH=$PREFIX/bin:$PATH
export LD_LIBRARY_PATH=$PREFIX/lib
After restarting bash you can then run
python get-pip.py
and pip will be installed automatically into your $PREFIX directory, as will every other package you subsequently install with pip. This way is more involved, but it gives you the latest version of Python.
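To verify that the shell now picks up the home-built interpreter and its pip first (standard commands):

which python      # should print $PREFIX/bin/python
python --version
which pip         # should print $PREFIX/bin/pip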

Proper permissions for python installation directory?

I'm trying to use a python app on a server for the first time. I started by adding setuptools as root:
[root@server mydirectory]# yum install python-setuptools
Cool. Then I try setup.py:
[user@server mydirectory]$ python setup.py install
running install
error: can't create or remove files in install directory
The following error occurred while trying to add or remove files in the
installation directory:
[Errno 13] Permission denied: '/usr/lib/python2.4/site-packages/test-easy-install-25752.write-test'
This directory /usr/lib/python2.4/site-packages is owned by root, so that makes sense.
My question is, should I chmod the site-packages directory, or should I be running setup.py as root?
The traditional way to install stuff system-wide as a non-root user is to use sudo. Which is why you see things like this all over the Python docs:
sudo python setup.py install
Some people prefer to instead make the site-packages group-writable by some "dev" group so you don't need to do this. (This is effectively what the Mac package manager Homebrew does.)
Alternatively, you can install into per-user site packages. Not every project can do this, but anything based on modern setuptools should be able to do so.
And, while we're at it, if you're installing stuff based on modern setuptools, it's probably better to pip install . instead of python setup.py install anyway. That will, among other benefits, create egg-info files so the package can be detected as a dependency, uninstalled, etc.
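For example, with footools (the project from the first question) or any other setuptools-based project:

pip install --user .     # records egg-info/metadata under ~/.local
pip show footools        # the install is now visible as a dependency
pip uninstall footools   # and clean removal works too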
See the Python Packaging User Guide for more information.
Finally, you may want to consider using a virtual environment. With Python 3.3+, this is built in as venv, although it doesn't have its own pip until 3.4. With earlier versions of Python, you can install virtualenv off PyPI.
Many hosted server environments for Python (2.x or 3.x) come with virtualenv pre-installed. If not, installing it system-wide will of course require you to be root… but after that, you will be able to install (most) other packages into per-project virtual environments instead of system-wide.
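A minimal sketch with the built-in venv (Python 3.4+, so that pip comes included; the environment name is arbitrary):

python3 -m venv myenv
source myenv/bin/activate
pip install somepackage    # lands inside the venv, no root needed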
Installing packages with pip/easy_install, or running setup.py files directly, requires root privileges because they read/write in those restricted folders.
Usually hosts like www.openshift.com support a virtualenv for you so you just activate it and you have your own per-user environment. Affecting the global site-packages is usually forbidden since it may be a shared host.
In my experience, on a local Ubuntu laptop, I have two options:
Run installs as sudo
Run installs in a virtualenv
Perhaps your host, if shared, supports virtualenv too. If it doesn't seem to, try asking them.
