I'm trying to use a python app on a server for the first time. I started by adding setuptools as root:
[root@server mydirectory]# yum install python-setuptools
Cool. Then I try setup.py:
[user@server mydirectory]$ python setup.py install
running install
error: can't create or remove files in install directory
The following error occurred while trying to add or remove files in the
installation directory:
[Errno 13] Permission denied: '/usr/lib/python2.4/site-packages/test-easy-install-25752.write-test'
This directory /usr/lib/python2.4/site-packages is owned by root, so that makes sense.
My question is, should I chmod the site-packages directory, or should I be running setup.py as root?
The traditional way to install stuff system-wide as a non-root user is to use sudo. Which is why you see things like this all over the Python docs:
sudo python setup.py install
Some people prefer to instead make the site-packages group-writable by some "dev" group so you don't need to do this. (This is effectively what the Mac package manager Homebrew does.)
Alternatively, you can install into per-user site packages. Not every project can do this, but anything based on modern setuptools should be able to do so.
And, while we're at it, if you're installing stuff based on modern setuptools, it's probably better to pip install . instead of python setup.py install anyway. That will, among other benefits, create egg-info files so the package can be detected as a dependency, uninstalled, etc.
See the Python Packaging User Guide for more information.
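For example, from the project's source directory, a per-user pip install that needs no root access looks like:
pip install --user .
The package lands in your user site-packages directory (typically ~/.local/lib/pythonX.Y/site-packages on Linux, per PEP 370).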
Finally, you may want to consider using a virtual environment. With Python 3.3+, this is built in as venv, although it doesn't have its own pip until 3.4. With earlier versions of Python, you can install virtualenv off PyPI.
Many hosted server environments for Python (2.x or 3.x) come with virtualenv pre-installed. If not, installing it system-wide will of course require you to be root… but after that, you will be able to install (most) other packages into per-project virtual environments instead of system-wide.
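For example, with Python 3.4+ (where venv-created environments come with their own pip; myenv and somepackage are placeholder names):
python3 -m venv myenv
source myenv/bin/activate
pip install somepackage   # installs into myenv, not system-wide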
Installing packages with pip/easy_install, or running setup.py files directly, requires root privileges because they read/write inside those restricted folders.
Usually hosts like www.openshift.com support a virtualenv for you so you just activate it and you have your own per-user environment. Affecting the global site-packages is usually forbidden since it may be a shared host.
In my experience, on my local Ubuntu laptop, I have two options:
Run installs as sudo
Run installs in a virtualenv
Perhaps your host, if shared, supports virtualenv. If it doesn't seem to, try asking them.
Related
I am facing problems installing pyobjc on my Mac.
Basically I have to install pyobjc on a new Mac system, in the system default Python. So far I have tried easy_install, pip, and downloading the pkg file and installing it. All give me an error in different ways: some say certain Safari files are missing, others fail with permission denied even though I am running them through sudo su.
I then found a fix.
pip install pyobjc --user
This worked and I could access all the modules I required, but then if I try running python through sudo, I can't access those modules.
Can anyone suggest a fix for this?
NOTE: I don't mind a different installation method either. Also, I have not tried brew, due to some previous difficulties with it.
NOTE 2: I need to be able to access those modules from all users on the computer, both the root user and me (the non-root user).
I had to (temporarily) move (using sudo) /Library/Python/2.7/site-packages/Extras.pth to another name before I could install the current pyobjc.
This is what works for me:
sudo mv /Library/Python/2.7/site-packages/Extras.pth /Library/Python/2.7/site-packages/Extras.pth_orig
pip install --upgrade pyobjc
sudo mv /Library/Python/2.7/site-packages/Extras.pth_orig /Library/Python/2.7/site-packages/Extras.pth
It appears that something in the .pth file interferes with the install, but does not impede running pyobjc.
but then if I try running python through sudo, I can't access those modules.
Because sudo python basically means run python as some other user (root by default). That user may have a different set of environment variables, including $PATH.
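One way to see the mismatch is to compare the interpreter and search path your shell uses with the ones sudo uses:
python -c 'import sys; print(sys.executable); print(sys.path)'
sudo python -c 'import sys; print(sys.executable); print(sys.path)'
If the outputs differ, packages installed with pip install --user for your account won't be visible when running under sudo.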
Some Linux distributions, like CentOS, use an older Python version for the root user. If the Python version you're running with sudo isn't the one you installed modules into, you can't access the modules installed by pip.
So in my opinion, if you don't have permission issues, you don't need sudo; using sudo might bring unexpected mistakes (mostly environment-variable issues), and chown or chmod can often fix permission problems instead.
So here are my plans:
Plan A: The best way is to try to use virtualenv.
Plan B: Install modules without sudo; if you get permission errors (not very common), try --user (see the sketch after Plan C):
Install to the Python user install directory for your platform.
Typically ~/.local/, or %APPDATA%\Python on Windows.
In most cases, you should modify your PYTHONPATH; see the question How do I access packages installed by pip --user for details.
Plan C: Execute all related commands with sudo: sudo pip install (for all modules) and sudo python script.py. Not a good idea.
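As a sketch of Plan B (requests is just an example package; ~/.local/bin is the usual location for user-installed scripts on Linux):
pip install --user requests
export PATH="$HOME/.local/bin:$PATH"   # so console scripts installed with --user are found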
I'm confused about the possibilities of installing external python packages:
install a package locally with pip into /home/chris/.local/lib/python3.4/site-packages
$ pip install --user packagename
install a package globally with pip into /usr/local/lib/python3.4/site-packages
(superuser permission required)
$ pip install packagename
install a package globally with zypper into /usr/lib/python3.4/site-packages
(superuser permission required)
$ zypper install packagename
I use OpenSuse with package-manager zypper and have access to user root.
What I (think I) know about pip:
- pip just downloads the latest version.
- It doesn't check whether newer versions of already-installed packages are available.
- Own packages can be installed in a virtual env.
- Takes more time to download and install than zypper.
- Local or global installation possible.
The package-manager of my system:
- Does download and installation faster.
- Installs the package only globally.
My question is when and why should I do the installation: pip (local, global) or with zypper?
I've read a lot about this issue but could not answer this question clearly...
The stuff under /usr/lib consists of system packages considered part of the OS. It's likely/possible that OS scripts and services will have dependencies on these components. I'd recommend not touching these yourself, and not using or depending on them for user scripts either, as this will make your app OS or even OS-version dependent. Use these if writing scripts that run at system level, such as maintenance or admin tasks, although even for these I'd seriously consider using...
Stuff under /usr/local/lib is installed locally for use by any user. System scripts and such won't depend on these (I don't know SuSE myself though), but other users' scripts might well do, so that needs to be borne in mind when making changes here. It's a shared resource. If you're writing scripts that other users might need to run, develop against this to ensure they will have access to all required dependencies.
Stuff in your home directory is all yours, so do as thou wilt. Use this if you're writing something just for yourself and especially if you might need the scripts to be portable to other boxes/OSes.
There might well be other options that make sense, such as if you're part of a team developing application software, in which case install your team's base dev packages in a shared location but perhaps not /usr/local.
In terms of using zypper or pip, I'd suggest using zypper to update /usr/lib for sure, as it's the specific tool for OS configuration updates. Probably the same goes for /usr/local/lib too, as that's really part of the 'system', but it's up to you and which method makes most sense, e.g. if you needed to replicate the config on another host. For stuff in your home directory it's up to you, but if you decide to move to a new host on a new OS, pip will still be available, so that environment will be easier to recreate.
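To make the distinction concrete (requests is just an example package, and the zypper package name below is an assumption about OpenSuse's naming):
sudo zypper install python3-requests   # system-managed, under /usr/lib
pip install --user requests            # only for your account, under ~/.local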
I would like to figure out a "fool-proof" installation instruction to put in the README of a Python project, call it footools, such that other people in our group can install the newest SVN version of it on their laptops and their server accounts.
The problem is getting the user-installed libs to be used by Python when they call the scripts installed by pip. E.g., we're using a server that has an old version of footools in /opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/.
If I do python2.7 setup.py install --user and run the main entry script, it uses the files in /Users/unhammer/Library/Python/2.7/lib/python/site-packages/. This is what I want, but setup.py alone doesn't install dependencies.
If I (revert the installation and) instead do pip-2.7 install --user . and run the main entry script, it uses the old files in /opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/ – that's not what I want.
If I (revert the installation and) instead do pip-2.7 install --user -e . and run the main entry script, it uses the files in . – that's not what I want, the user should be able to remove the source dir (and be able to svn up without that affecting their install).
I could use (and recommend other people to use) python2.7 setup.py install --user – but then they have to first do
pip-2.7 install -U --user -r requirements.txt -e .
pip-2.7 uninstall -y footools
in order to get the dependencies installed (since pip has no install --only-deps option). That's rather verbose though.
What is setup.py doing that pip is not doing here?
(Edited to make it clear I'm looking for simpler+safer installation instructions.)
Install virtualenvwrapper. It allows setting up separate Python environments to alleviate any conflicts you might be having. Here is a tutorial for installing and using virtualenv.
Related:
https://virtualenvwrapper.readthedocs.org/en/latest/
Console scripts generated by pip in the process of installation should use user-installed versions of libraries, according to PEP 370:
The user site directory is added before the system site directories but after Python's search paths and PYTHONPATH. This setup allows the user to install a different version of a package than the system administrator, but it prevents the user from accidentally overwriting a stdlib module. Stdlib modules can still be overwritten with PYTHONPATH.
Sidenote
Setuptools uses a hack: it inserts code into the easy_install.pth file, which is placed in the site-packages directory. This code makes packages installed with setuptools come before other packages on sys.path, so they shadow other packages with the same name. This is referred to as sys.path modification in the table comparing setuptools and pip, and it is the reason console scripts use user-installed libraries when you install with setup.py install instead of using pip.
Taking all of the above into account, what you observe might be caused by:
PYTHONPATH pointing to directories with system-wide installed libraries
Having system-wide libraries installed using sudo python setup.py install (...)
Having the OS influence sys.path construction in some way
In the first case, either clearing PYTHONPATH or adding the path to the user-installed library to the beginning of PYTHONPATH should help (a sketch follows these three cases).
In the second case, uninstalling the system-wide libraries and installing them with the distro package manager instead might help (please note that you should never use sudo with pip or setup.py to install Python packages).
In the third case, it's necessary to find out how the OS influences sys.path construction and whether there's some way of placing user-installed libraries before system ones.
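For the first case, a minimal sketch of the fix, using the user site-packages layout mentioned in the question:
export PYTHONPATH="$HOME/Library/Python/2.7/lib/python/site-packages:$PYTHONPATH"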
You might be interested in reading the issue pip list reports wrong version of package installed both in system site and user site, where I asked basically the same question as you:
Does it mean that having system wide Python packages installed with easy_install thus having them use sys.path manipulation breaks scripts from user bin directory? If so is there any workaround?
A last-resort solution would be to manually place the directory or directories with user-installed libraries at the beginning of sys.path in your scripts, before importing those libraries.
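A minimal sketch of that last resort, using the user site-packages path from the question:
import sys
sys.path.insert(0, '/Users/unhammer/Library/Python/2.7/lib/python/site-packages')
import footools  # now resolved from the user site, not the old system copy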
Having said that, if your users do not need direct access to the source code, I would propose packaging your app together with all dependencies using a tool like pex or Platter into a self-contained bundle.
I'm trying to install setuptools for Python 2.7 on a CentOS 6 VPS with Digital Ocean, using this tutorial. When I reach the step where you "Install setuptools using the Python we've installed (2.7.6)", I get this error:
[username@hostname setuptools-1.4.2]$ python2.7 setup.py install
running install
error: can't create or remove files in install directory
The following error occurred while trying to add or remove files in the
installation directory:
[Errno 13] Permission denied: '/usr/local/lib/python2.7/site-packages/test-easy-install-1111.write-test'
The installation directory you specified (via --install-dir, --prefix, or
the distutils default setting) was:
/usr/local/lib/python2.7/site-packages/
Perhaps your account does not have write access to this directory? If the
installation directory is a system-owned directory, you may need to sign in
as the administrator or "root" account. If you do not have administrative
access to this machine, you may wish to choose a different installation
directory, preferably one that is listed in your PYTHONPATH environment
variable.
For information on other options, you may wish to consult the
documentation at:
https://pythonhosted.org/setuptools/easy_install.html
Please make the appropriate changes for your system and try again.
Now, I previously followed instructions on the same Digital Ocean community site to give the user executing the above root privileges. When I try to use sudo to do this task, I get:
[username@hostname setuptools-1.4.2]$ sudo python2.7 setup.py install
[sudo] password for username:
sudo: python2.7: command not found
So I'm a little confused. I feel like I'm probably missing something simple. Digital Ocean was unable to provide further support on this. I've worked with virtualenv for a long time and am familiar with what to do once I get it installed; I'm just stuck here, as it's my first time setting up a CentOS host. Can you tell what I'm missing?
Thank you!
Changing to root user did the trick. Thanks CasualDemon.
Nowadays, if you'd like to install setuptools & pip easily, you can run this file with your python interpreter:
get-pip.py
You may need administrator (root) privileges to install it into your system Python (e.g. sudo python get-pip.py).
Afterwards you can upgrade pip and/or setuptools, e.g.:
$ pip install -U setuptools
$ pip install -U pip
I recommend that for most Python development you install only setuptools, pip, and virtualenv as root (or purely virtualenv, if you're being conservative). After that, you can use virtualenv to create isolated Python environments that don't install into the system Python or affect its installed packages. That way no other python (and/or pip) invocation needs to be run as root.
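A sketch of that workflow (myenv and requests are placeholder names):
sudo pip install virtualenv   # the only step that needs root
virtualenv myenv              # per-project environment, as a normal user
source myenv/bin/activate
pip install requests          # installs into myenv, no sudo required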
I am using numpy / scipy / pynest to do some research computing on Mac OS X. For performance, we rent a 400-node cluster (running Linux) from our university so that tasks can be run in parallel. The problem is that we are NOT allowed to install any extra packages on the cluster (no sudo or any installation tool); they only provide the raw Python itself.
How can I run my scripts on the cluster then? Is there any way to integrate the modules (numpy and scipy also have some compiled binaries I think) so that it could be interpreted and executed without installing packages?
You don't need root privileges to install packages in your home directory. You can do that with a command such as
pip install --user numpy
or from source
python setup.py install --user
See https://stackoverflow.com/a/7143496/284795
The first alternative is much more convenient, so if the server doesn't have pip or easy_install, you should politely ask the admins to add it, explaining the benefit to them (they won't be bothered anymore by requests for individual packages).
You could create a virtual environment through the virtualenv package.
This creates a folder (say venv) with a new copy of the Python executable and a new site-packages directory, into which you can "install" any number of packages without needing any kind of administrative access at all. Thus, activating the environment through source venv/bin/activate will give Python an environment that's equivalent to having those packages installed.
I know this works for SGE clusters, although how the virtual environment is activated might depend on your cluster's configuration.
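A minimal sketch, assuming virtualenv is already available on the cluster (venv is a placeholder name):
virtualenv venv
source venv/bin/activate
pip install numpy scipy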
You can try installing virtualenv on your cluster within your own site-packages directory using the following steps:
Download virtualenv from here, put it on your cluster
Install it using setup.py to a specific, local directory to serve as your own site-packages:
python setup.py build
python setup.py install --install-base /path/to/local-site-packages
Add that directory to your PYTHONPATH:
export PYTHONPATH="/path/to/local-site-packages:${PYTHONPATH}"
Create a virtualenv:
virtualenv venv
You can also import a module from an arbitrary path by appending that path at runtime, before the import:
import sys
sys.path.append('/path/to/local-site-packages')
The Python distribution Anaconda solves many of the issues discussed in this question. Anaconda does not require admin or root access and can be installed in your home directory. It comes with many of the packages in question (scipy, numpy, sklearn, etc.) as well as the conda installer for adding more packages should they be necessary.
It can be downloaded from https://www.continuum.io/downloads
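For example, the installer can be run entirely in your home directory (the filename below is a placeholder for whichever installer you download; -b runs it non-interactively and -p sets the install prefix):
bash Anaconda-latest-Linux-x86_64.sh -b -p $HOME/anaconda
export PATH="$HOME/anaconda/bin:$PATH"
conda install numpy scipy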