I have installed many Python modules, plugins, and libraries on my CentOS system.
Now I don't want to install each thing again on separate computers.
Is there any way I can make a package, like an RPM or anything else, so that when I install it in a new location all the modules and dependencies get installed too, and every time I install something new I can update the install package?
If you have installed the packages with pip (an improved easy_install), you can just do pip freeze > my-reqs.txt to get a list of the installed packages and their versions. The counterpart, pip install -r, installs from such a requirements file.
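A minimal sketch of the round trip, assuming pip is available on both the old and the new machine:
$ pip freeze > my-reqs.txt      # on the old machine: record packages and versions
$ pip install -r my-reqs.txt    # on the new machine: reinstall them all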
pip is meant to be used together with virtualenv, which can handle per-project requirements for dependent packages.
If you are not using pip, then you should check what packages you have in your global and/or user site-packages directories.
The solutions below are from:
How do I find the location of my Python site-packages directory?
Global site packages:
python -m site
or more concisely:
python -c "import site; print(site.getsitepackages())"
User site packages:
python -m site --user-site
The latter does not, however, show the site-packages of the current virtual environment – use distutils.sysconfig.get_python_lib() for that:
python -c "from distutils.sysconfig import get_python_lib; print(get_python_lib())"
Note!
You cannot simply copy over packages that contain C/C++ extensions or other compiled binaries. These have to be reinstalled on the other machines.
Related
My software needs to install packages on arbitrary container images that have Python 3.5+. The module must then be loadable by running python3 -c 'import my_module'.
I plan to use the python3 -m pip install my-package --user command to install the packages. The --user flag works around containers where the active user is not root.
Unfortunately, I've vaguely heard about cases where the user directory that pip installs packages into is not on the search path on some systems, or something like that.
Would that affect the ability to import the installed package?
What command-line should I use to install packages reliably?
As long as it's on the relevant search path, you can use it as if it were under /usr: scripts installed to ~/.local/bin need that directory on PATH, while modules in the user site-packages under ~/.local are found via Python's own sys.path, which includes the user site by default.
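A quick sketch to verify this on a given image (my-package and my_module stand in for your real names, as in the question):
$ python3 -m pip install --user my-package
$ python3 -m site --user-site    # prints the user site-packages directory
$ python3 -c 'import my_module'  # succeeds if the user site is on sys.path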
I have been given access to a University Data Center to deploy an Image Analysis Python project. The server has Python 2.7 and 3.5 installed, and I can see that it is missing packages like numpy, theano and keras, which I have used in my code as additional libraries.
The problem at hand is that I do not have access to install anything or run commands like pip install or apt-get install, and I cannot copy anything into the original site-packages location on the server.
But I can copy files into my userspace, and I tried to:
- clone numpy and its prerequisites, and all the additional packages I need into a folder called site-packages.
- add this path to my sys.path, but it gives me errors like "cannot import multiarray"
I'm new to Linux, and my question is: can I copy package files onto a Linux system and add that path to my PYTHONPATH to run the code?
I believe you are looking for:
pip install --user package_name
You might also need to investigate compiling some packages from their source code, but this will depend on the package.
From the user guide, more on pip install --user:
pip install --user follows four rules:
1. When globally installed packages are on the python path, and they conflict with the installation requirements, they are ignored, and not uninstalled.
2. When globally installed packages are on the python path, and they satisfy the installation requirements, pip does nothing, and reports that requirement is satisfied (similar to how global packages can satisfy requirements when installing packages in a --system-site-packages virtualenv).
3. pip will not perform a --user install in a --no-site-packages virtualenv (i.e. the default kind of virtualenv), due to the user site not being on the python path. The installation would be pointless.
4. In a --system-site-packages virtualenv, pip will not install a package that conflicts with a package in the virtualenv site-packages. The --user installation would lack sys.path precedence and be pointless.
Edit: If pip itself is not installed then you can read up here: https://pip.pypa.io/en/stable/installing/
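For the situation in the question (no root access, only your own userspace writable), a minimal sketch, assuming pip is available for the interpreter you will run and reusing the site-packages folder name from the question:
$ python3 -m pip install --user numpy theano keras
# or, to install into an arbitrary directory you control:
$ python3 -m pip install --target=$HOME/site-packages numpy theano keras
$ export PYTHONPATH=$HOME/site-packages:$PYTHONPATH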
I would like to figure out a "fool-proof" installation instruction to put in the README of a Python project, call it footools, such that other people in our group can install the newest SVN version of it on their laptops and their server accounts.
The problem is getting the user-installed libs to be used by Python when they call the scripts installed by pip. E.g., we're using a server that has an old version of footools in /opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/.
If I do python2.7 setup.py install --user and run the main entry script, it uses the files in /Users/unhammer/Library/Python/2.7/lib/python/site-packages/. This is what I want, but setup.py alone doesn't install dependencies.
If I (revert the installation and) instead do pip-2.7 install --user . and run the main entry script, it uses the old files in /opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/ – that's not what I want.
If I (revert the installation and) instead do pip-2.7 install --user -e . and run the main entry script, it uses the files in . – that's not what I want; the user should be able to remove the source dir (and to svn up without that affecting their install).
I could use (and recommend other people to use) python2.7 setup.py install --user – but then they have to first do
pip-2.7 install -U --user -r requirements.txt -e .
pip-2.7 uninstall -y footools
in order to get the dependencies installed (since pip has no install --only-deps option). That's rather verbose though.
What is setup.py doing that pip is not doing here?
(Edited to make it clear I'm looking for simpler+safer installation instructions.)
Install virtualenvwrapper. It allows setting up separate Python environments to alleviate any conflicts you might be having. Here is a tutorial for installing and using virtualenv.
Related:
https://virtualenvwrapper.readthedocs.org/en/latest/
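A minimal sketch of the workflow (the path to virtualenvwrapper.sh varies between systems, and footools-env is just a hypothetical environment name):
$ pip install --user virtualenvwrapper
$ export WORKON_HOME=~/.virtualenvs
$ source ~/.local/bin/virtualenvwrapper.sh   # location varies by install
$ mkvirtualenv footools-env                  # creates and activates the env
(footools-env)$ pip install -r requirements.txt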
Console scripts generated by pip in the process of installation should use user-installed versions of libraries, according to PEP 370:
The user site directory is added before the system site directories but after Python's search paths and PYTHONPATH. This setup allows the user to install a different version of a package than the system administrator, but it prevents the user from accidentally overwriting a stdlib module. Stdlib modules can still be overwritten with PYTHONPATH.
Sidenote
Setuptools uses a hack: it inserts code into the easy_install.pth file, which is placed in the site-packages directory. This code makes packages installed with setuptools come before other packages on sys.path, so they shadow other packages with the same name. This is referred to as sys.path modification in the table comparing setuptools and pip, and it is the reason console scripts use user-installed libraries when you install with setup.py install instead of with pip.
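Roughly, an easy_install.pth looks like the following (an illustrative reconstruction with a hypothetical footools egg; the exact code varies between setuptools versions – the "import" lines are executed by site.py when it processes .pth files):
import sys; sys.__plen = len(sys.path)
./footools-0.1-py2.7.egg
import sys; new = sys.path[sys.__plen:]; del sys.path[sys.__plen:]; p = getattr(sys, '__egginsert', 0); sys.path[p:p] = new; sys.__egginsert = p + len(new)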
Taking all of the above into account, what you observe might be caused by:
- PYTHONPATH pointing to directories with system-wide installed libraries
- having system-wide libraries installed using sudo python setup.py install (...)
- the OS influencing sys.path construction in some way
In the first case, either clearing PYTHONPATH or adding the path of the user-installed library to the beginning of PYTHONPATH should help, as shown below.
In the second case, uninstalling the system-wide libraries and installing them with the distro package manager instead might help (please note that you should never use sudo with pip or setup.py to install Python packages).
In the third case, it's necessary to find out how the OS influences sys.path construction and whether there's some way of placing user-installed libraries before system ones.
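For the first case, for example (using the user site path from the question; adjust for your Python version and OS):
$ unset PYTHONPATH
# or, alternatively, put the user site first:
$ export PYTHONPATH=$HOME/Library/Python/2.7/lib/python/site-packages:$PYTHONPATH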
You might be interested in reading the issue pip list reports wrong version of package installed both in system site and user site, where I asked basically the same question as you:
Does it mean that having system-wide Python packages installed with easy_install, and thus having them use sys.path manipulation, breaks scripts from the user bin directory? If so, is there any workaround?
A last-resort solution would be to manually place the directory or directories with user-installed libraries at the beginning of sys.path in your scripts, before importing those libraries.
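A minimal sketch of that last resort (site.getusersitepackages() is in the standard library from Python 2.7 on; footools is the package from the question):
import site
import sys

# Prepend the user site-packages so it shadows system-wide installs.
sys.path.insert(0, site.getusersitepackages())

import footools  # now resolves to the user-installed version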
Having said that, if your users do not need direct access to the source code, I would propose packaging your app together with all its dependencies into a self-contained bundle using a tool like pex or Platter.
For work-related reasons, I'm trying to install Python 2.7.8 directly onto my machine (Mac OS X 10.9).
I am currently running Python 2.7.6 in Enthought Canopy, and I really don't want to touch the existing libraries there.
My problem is that I'd like to use pip to install packages for the new instantiation of Python, but currently pip is bundled up with Enthought Canopy, so it only installs packages in the site-packages path for Enthought Canopy.
I first tried the following:
pip install --install-option="--prefix=$PREFIX_PATH/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages" scikit-learn
But got the following error:
Requirement already satisfied (use --upgrade to upgrade): scikit-learn in ./Library/Enthought/Canopy_64bit/User/lib/python2.7/site-packages
Then I tried to add the existing Enthought folder to the path for the newly installed Python 2.7.8 by entering the following line at the end of my .bash_profile:
PYTHONPATH=$PYTHONPATH:Users/***/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages
This led to errors when trying to import some of the packages, probably for this reason:
Cannot import Scikit-Learn
I would really prefer just to install a new version of scikit-learn in a separate folder. Anyone have any suggestions?
You can use virtualenv to create a self-contained python environment that can be configured and used separately from your regular python installation.
Create the virtualenv (for oldish versions of virtualenv you'd want to include --no-site-packages right after virtualenv):
$ virtualenv limitedenv
Using base prefix '/usr/local/Cellar/python3/3.3.3/Frameworks/Python.framework/Versions/3.3'
New python executable in limitedenv/bin/python3
Also creating executable in limitedenv/bin/python
Installing setuptools, pip...done.
Move into the virtualenv and activate it:
$ cd limitedenv/
$ source bin/activate
(limitedenv)$
Install the packages you're after with pip as you'd do globally:
(limitedenv)$ pip install scikit-learn
Downloading/unpacking scikit-learn
Downloading scikit-learn-0.15.0.tar.gz (7.0MB): ...
scikit-learn will now be installed just inside limitedenv, and as long as you have that environment active, invoking python or pip behaves as if this were your very own, secluded Python installation.
You can exit from the virtual environment by calling deactivate:
(limitedenv)$ deactivate
$
This allows you to have different versions of Python side by side, different versions of libraries per project, and different configurations based on what your project requires. virtualenv is a very useful tool!
I have installed virtualenv for Python. After installing some packages such as "nose", I decided to try installing some other packages without affecting the former environment.
I typed the command:
virtualenv --system-site-packages --always-copy some_new_env
It responded:
New python executable in some_new_env\Scripts\python.exe
Installing setuptools, pip...done.
Then I looked into the folder some_new_env\lib\site-packages\; it still contained only the following files and folders:
<_markerlib>
<pip>
<pip-1.5.6.dist-info>
<setuptools>
<setuptools-3.6.dist-info>
easy_install.py
easy_install.pyc
pkg_resources.py
pkg_resources.pyc
Packages installed earlier, such as nose, were not copied into this folder. Was the command I typed incorrect? What is the right command to get the packages installed in the outer environment copied into the new environment?
Ideally, you should not copy virtualenvs - instead, you should track the packages you need and install them in the new virtualenv.
This is made much easier if you use pip:
$ env1/bin/pip freeze > requirements.txt
$ env2/bin/pip install -r requirements.txt
As a bonus, you can check requirements.txt into source control, so that you always know what packages you need to install to get a certain version to work.
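The generated requirements.txt is just a plain list of pinned versions, along the lines of (version numbers hypothetical):
nose==1.3.7
numpy==1.11.0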
Here's the relevant documentation for pip freeze