Using a user folder for python3 packages with pip3 - python

I am trying to install and use python3 packages to /home/myname/pp folder. I should be able to run python3 from anywhere. Also, pip3 should be able to update the packages in this folder. I should also be able to copy this folder to a new Linux system and it should work there as well (by changing PYTHONPATH there).
I searched and found following options:
pip install -t <target directory> <package> # I prefer this.
and
pip install --install-option="--prefix=$PREFIX_PATH" package_name
or use:
virtualenv
and then I need to do:
echo 'export PYTHONPATH="/home/myname/pp:$PYTHONPATH"' >> ~/.bash_profile
What should be my approach for these requirements? Thanks for your help.

pip install -t <target directory> <package>
Will install the package (and its dependencies) into the given directory.
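For the setup described in the question, a minimal sketch could look like this (the package name requests is only an example; /home/myname/pp is the folder from the question):
pip3 install -t /home/myname/pp requests
echo 'export PYTHONPATH="/home/myname/pp:$PYTHONPATH"' >> ~/.bash_profile
python3 -c "import requests; print(requests.__file__)"   # in a new shell, should print a path under /home/myname/pp
Note that a -t folder is not tracked like a normal site-packages directory, so to update a package there you rerun pip3 install -t /home/myname/pp -U requests.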
pip install --install-option="--prefix=$PREFIX_PATH" package_name
Will run the package's setup.py with the given options, as described in pip's help:
--install-option    Extra arguments to be supplied to the setup.py install command (use like --install-option="--install-scripts=/usr/local/bin"). Use multiple --install-option options to pass multiple options to setup.py install. If you are using an option with a directory path, be sure to use an absolute path.
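If you go the --prefix route, also note that Python will not search a custom prefix automatically; a hedged sketch of what would additionally be needed (the python3.X segment is a placeholder for your interpreter version):
pip3 install --install-option="--prefix=/home/myname/pp" package_name
export PYTHONPATH="/home/myname/pp/lib/python3.X/site-packages:$PYTHONPATH"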
The recommended way to install packages is to use a virtual environment. It keeps your global packages clean, for example when two different projects need different versions of the same package.
virtualenv basically creates a folder in which to store the installed packages.
On a Linux-based system, you would run the virtualenv command to create the folder and then "activate" it:
virtualenv my_virtual_environment
source my_virtual_environment/bin/activate
You will notice that the environment's name appears at the start of your shell prompt. What activate does is simply change your PATH environment variable so that it points to your virtual environment's folder first.
It will still use the system's Python interpreter, but when your program imports packages, it will look in the virtual environment's folder first.
To return to the global python packages, just type deactivate.
If you want the environment you're using to be active right when you start the terminal, add the source command to your .bash_profile or .bashrc. I recommend using the absolute path to the python virtual environment.
If you're working on multiple projects and want to keep their packages separate from each other, create multiple virtual environments and just switch between them. You could take a look at virtualenvwrapper, which makes activating a virtual environment when you open the terminal and switching between environments really easy; a rough sketch of its workflow follows.
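A sketch of the virtualenvwrapper workflow (the environment names are just examples, and the script path assumes the --user install shown here):
pip install --user virtualenvwrapper
export WORKON_HOME=$HOME/.virtualenvs            # where the environments will live
source $HOME/.local/bin/virtualenvwrapper.sh     # path assumes the --user install above
mkvirtualenv projectA        # create and activate projectA
mkvirtualenv projectB        # create and activate projectB
workon projectA              # switch back to projectA
deactivate                   # back to the system Python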

Related

Installing python packages with no installation directory access and no pip/easy_install/virtual_env

At work we have python installed, but no additional modules. I want to import some scipy modules but I have no access to the python directory for installation.
Similar questions have been asked on StackOverflow, but the answers always assume easy_install, pip or virtualenv is already installed. At my workplace, none of these are installed; it's just the plain Python installation and nothing else.
Is there still an option for me for installing modules in my local folder and calling them from python? If so, how do I go about it?
Not exactly installing modules on your local folder, but a solution nonetheless:
I used to work for a company that used Windows and didn't grant admin access, so I ended up using Portable Python.
It seems Portable Python is no longer maintained, but you can find other portable Python distributions on their site, most of which you can run straight from a USB drive.
You can download get-pip.py from http://pip.readthedocs.org/en/stable/installing/ and install pip without root privileges by typing:
python get-pip.py --user
This installs pip under the prefix $HOME/.local, so the pip executable will be at $HOME/.local/bin/pip. For convenience, you can add that directory to your $PATH by appending this line to the end of your .bashrc file:
export PATH=$HOME/.local/bin/:$PATH
After this you can install any package by typing:
pip install package --user
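To see where --user installs end up on your system, you can ask Python directly:
python -m site --user-base    # usually $HOME/.local
python -m site --user-site    # usually $HOME/.local/lib/pythonX.Y/site-packages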
Alternatively, you can compile the Python distribution from source and install it into your home directory, under $HOME/.local, $HOME/opt, or any other subfolder of $HOME you prefer; let's call this path $PREFIX. To do this, download the Python source code from the official site, unpack it, and then run:
./configure --prefix=$PREFIX --enable-shared
make
make install
Then add the Python binary to $PATH and the Python libraries to $LD_LIBRARY_PATH by appending these lines to the end of your $HOME/.bashrc file:
export PATH=$PREFIX/bin:$PATH
export LD_LIBRARY_PATH=$PREFIX/lib
After restarting bash you can then run
python get-pip.py
and pip will be installed automatically into your $PREFIX directory, as will any other packages you later install with pip. This way is more involved, but it lets you run the latest version of Python.
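Putting the source-build route together, the whole sequence might look like this (the release number 3.11.9 is only an example; substitute whatever version you want, and keep the LD_LIBRARY_PATH export above because of --enable-shared):
export PREFIX=$HOME/.local
wget https://www.python.org/ftp/python/3.11.9/Python-3.11.9.tgz
tar xzf Python-3.11.9.tgz
cd Python-3.11.9
./configure --prefix=$PREFIX --enable-shared
make
make install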

How to set site-packages directory for local Python 2.7 install

I'm trying to run a certain script in Python, but it requires some other modules (setuptools) - I don't have write permissions for our /usr/ directory to install them, so I'm trying to install a local version of Python 2.7 to run it (not in /usr/).
When I try to run the ez_setup script for setuptools, it tries to access:
/usr/local/lib/python2.7/site-packages/
But that's not the installation I want (I get write permission error). I can point the ez_setup script to wherever, but I'm not sure how to get Python to use it. In my 2.7 install I ran:
site.getsitepackages()
['/usr/local/lib/python2.7/site-packages', '/usr/local/lib/site-python'].
Is there a way to change the default site-packages directory so I can do local installs? Like in the Python installation directory itself?
Thanks,
Kaleb
You really want to read up on Python virtual environments, which allow you to create a "local" Python tree into which you can install your own packages and such without requiring root access.
For example, assuming that you have the virtualenv command available (you may need to install this first; it's available as a package for most major distributions), you can create a new virtual environment like this:
virtualenv --system-site-packages myenv
The --system-site-packages option will make any packages in your system site-packages directory visible to this environment, as well as anything you install locally. Then activate the environment:
. myenv/bin/activate
And install stuff:
pip install apackage
NB For no good reason, pydoc will stop working for you in this configuration. You can just drop a new pydoc script into myenv/bin/pydoc that looks like this:
#!/usr/bin/env python
import pydoc
if __name__ == '__main__':
    pydoc.cli()
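To confirm that installs now land inside the environment rather than under /usr/local, you can check from the activated environment (setuptools is used here simply because the virtualenv ships with it by default):
python -c "import sys; print(sys.prefix)"                   # should point inside myenv
python -c "import setuptools; print(setuptools.__file__)"   # should resolve under myenv/lib/.../site-packages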

How do I change the default directory that pip installs to?

When I run the command
pip install virtualenv
I get:
Installing virtualenv script to /usr/local/share/python
But my default virtualenv is in a different place:
which virtualenv
/usr/local/bin/virtualenv
I'd like pip to install to the /usr/local/bin directory by default. Any help would be greatly appreciated.
If you want to manually decide where you want packages to reside, you could always download the source distribution to a directory of your choice with the following:
pip install -d <path_to_my_directory>
But when you install, I think you probably want to put the executable console scripts (as defined in the package's setup.py file; virtualenv, for example) in a directory included in your $PATH environment variable.
You can specify this manually by doing the following:
sudo python setup.py install --install-scripts /usr/bin/
or
sudo python setup.py install --install-scripts /usr/local/bin/
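If you would rather stay within pip instead of calling setup.py directly, the same setup.py option can be passed through --install-option (this mirrors the example from pip's own help quoted earlier; virtualenv is just the example package):
pip install --install-option="--install-scripts=/usr/local/bin" virtualenv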
Let me know if you have any other questions...
/usr/local/bin is for executable programs. /usr/local/share is to store data that is independent of the architecture.
So, in your case, you are installing by default into /usr/local, where the executable programs live in /usr/local/bin, the architecture-independent data lives in /usr/local/share, configuration files live in /usr/local/etc, and so on.
You can check the Filesystem Hierarchy Standard to get an overview of the topic.

System PIP instead of virtualenv PIP by default?

After using virtualenv with pip on and off for a couple of days, I've found that the pip used after the virtualenv is activated is the global pip instead of the one belonging to that environment; so unless you set the shell environment variable export PIP_RESPECT_VIRTUALENV=true, pip will install any new package (e.g. pip install argparse) into the global scope instead of only into the virtualenv.
I would expect PIP to install to the virtualenv by default, if that virtualenv is activated.
Is there a reasoning behind it not working that way by default?
See explanation here for how PIP_RESPECT_VIRTUALENV works.
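A related safeguard, offered here as an assumption rather than something from the question: pip also honours PIP_REQUIRE_VIRTUALENV, which makes it refuse to install anything when no virtualenv is active, so a stray pip install can never touch the global site-packages:
export PIP_REQUIRE_VIRTUALENV=true   # e.g. in ~/.bashrc; assumes a pip version that supports --require-virtualenv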
When you create a virtualenv, the activate file hardcodes the variable VIRTUAL_ENV to the location in which you first created the root directory. This variable is then exported when you source <your-venv>/bin/activate.
Consequently, if you move the virtualenv directory subsequent to its creation, the hardcoded file path will be incorrect.
Just open <your-venv>/bin/activate in a text editor and make sure VIRTUAL_ENV is set to the new path of your virtualenv directory:
VIRTUAL_ENV="/Full/path/to/<your-venv>"
export VIRTUAL_ENV
before running source <your-venv>/bin/activate again.
Then of course you can check which pip is being used with which pip, which should produce:
/Full/path/to/<your-venv>/bin/pip
rather than /usr/bin/pip or /bin/pip etc.
It is not the first time I have seen someone report the same issue. I don't know what is happening, but some people discourage the use of source /path/to/venv/bin/activate because it can mess up your $PATH.
There is a way pip will always respect your virtualenv: don't rely on $PATH. Use:
/path/to/venv/bin/pip install MYPACKAGE
It would be nice to find out what is happening to you and share your solution with others. Meanwhile, it may be ok to use the absolute path to pip.
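As a quick sanity check of the absolute-path approach (the path is a placeholder for your own environment, and requests is just an example package):
/path/to/venv/bin/pip --version        # should report a location inside the venv
/path/to/venv/bin/pip install requests
/path/to/venv/bin/python -c "import requests; print(requests.__file__)"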

Renaming a virtualenv folder without breaking it

I've created a folder and initialized a virtualenv instance in it.
$ mkdir myproject
$ cd myproject
$ virtualenv env
When I run (env)$ pip freeze, it shows the installed packages as it should.
Now I want to rename myproject/ to project/.
$ mv myproject/ project/
However, now when I run
$ . env/bin/activate
(env)$ pip freeze
it says pip is not installed. How do I rename the project folder without breaking the environment?
You need to adjust your install to use relative paths. virtualenv provides for this with the --relocatable option. From the docs:
Normally environments are tied to a specific path. That means that you cannot move an environment around or copy it to another computer. You can fix up an environment to make it relocatable with the command:
$ virtualenv --relocatable ENV
NOTE: ENV is the name of the virtual environment and you must run this from outside the ENV directory.
This will make some of the files created by setuptools or distribute use relative paths, and will change all the scripts to use activate_this.py instead of using the location of the Python interpreter to select the environment.
Note: you must run this after you've installed any packages into the environment. If you make an environment relocatable, then install a new package, you must run virtualenv --relocatable again.
I believe "knowing why" matters more than "knowing how". So, here is another approach to fix this.
When you run . env/bin/activate, it actually executes the following commands (using /tmp for example):
VIRTUAL_ENV="/tmp/myproject/env"
export VIRTUAL_ENV
However, you have just renamed myproject to project, so VIRTUAL_ENV now points to a path that no longer exists.
That is why it says pip is not installed: you haven't installed pip in the system's global environment, and the virtualenv's own pip is no longer resolved correctly.
If you want to fix this manually, this is the way:
With your favorite editor (e.g. Vim), modify /tmp/project/env/bin/activate, usually around line 42:
VIRTUAL_ENV='/tmp/myproject/env' => VIRTUAL_ENV='/tmp/project/env'
Modify /tmp/project/env/bin/pip in line 1:
#!/tmp/myproject/env/bin/python => #!/tmp/project/env/bin/python
After that, activate your virtual environment env again, and you will see your pip has come back again.
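If you prefer not to edit the files by hand, the two fixes above can be applied in one pass; this is only a sketch using this answer's example paths, so back up env/bin first:
grep -rIl '/tmp/myproject/env' /tmp/project/env/bin | xargs sed -i 's|/tmp/myproject/env|/tmp/project/env|g'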
NOTE: As #jb. points out, this solution only applies to virtualenvs that are easy to (re)create. If an environment takes several hours to install, this solution is not recommended.
Virtualenvs are great because they are easy to make and switch around; they keep you from getting locked into a single configuration. If you know the project requirements, or can get them, make a new virtualenv:
Create a requirements.txt file
(env)$ pip freeze > requirements.txt
If you can't create the requirements.txt file, check env/lib/pythonX.X/site-packages before removing the original env.
Delete the existing (env)
deactivate && rm -rf env
Create a new virtualenv, activate it, and install requirements
virtualenv env && . env/bin/activate && pip install -r requirements.txt
Alternatively, use virtualenvwrapper to make things a little easier as all virtualenvs are kept in a centralized location
$(old-venv) pip freeze > temp-reqs.txt
$(old-venv) deactivate
$ mkvirtualenv new-venv
$(new-venv) pip install -r temp-reqs.txt
$(new-venv) rmvirtualenv old-venv
I always install virtualenvwrapper to help out. From the shell prompt:
pip install virtualenvwrapper
There is a way documented in the virtualenvwrapper documentation: cpvirtualenv.
This is what you do. Make sure you are out of your environment and back to the shell prompt. Type in this with the names required:
cpvirtualenv oldenv newenv
And then, if necessary:
rmvirtualenv oldenv
To go to your newenv:
workon newenv
You can fix your issue by following these steps:
rename your directory
rerun this: $ virtualenv ..\path\renamed_directory
virtualenv will correct the directory associations while leaving your packages in place
$ scripts/activate
$ pip freeze to verify your packages are in place
An important caveat, if you have any static path dependencies in script files in your virtualenv directory, you will have to manually change those.
Yet another way to do it that worked for me many times without problems is virtualenv-clone:
pip install virtualenv-clone
virtualenv-clone old-dir/env new-dir/env
Run this inside your project folder:
cd bin
sed -i 's/old_dir_name/new_dir_name/g' *
Don't forget to deactivate and activate.
In Python 3.3+ with built-in venv
As of Python 3.3, the virtualenv package is built into Python as the venv module. There are a few minor differences, one of which is that the --relocatable option has been removed. As a result, it is normally best to recreate a virtual environment rather than attempt to move it. See this answer for more information on how to do that.
What is the purpose behind wanting to move rather than just recreate a virtual environment? A virtual environment exists to manage the dependencies of a module/package, so that it can rely on different and specific versions of the packages or modules it depends on, and to give those things a place to be installed locally.
As a result, a package should provide a way to recreate the venv from scratch. Typically this is done with a requirements.txt file and sometimes also a requirements-dev.txt file, and even a script to recreate the venv in the setup/install of the package itself.
One part that may give headaches is that you may need a particular version of Python as the executable, which is difficult to automate, if not already present. However, when recreating an existing virtual environment, one can simply run python from the existing venv when creating the new one. After that it is typically just a matter of using pip to reinstall all dependencies from the requirements.txt file:
From Git Bash on Windows:
python -m venv mynewvenv
source mynewvenv/Scripts/activate
pip install -r requirements.txt
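On Linux or macOS the activation script lives in bin/ rather than Scripts/, so the second line would instead be:
source mynewvenv/bin/activate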
It can get a bit more involved if you have several local dependencies from other locally developed packages, because you may need to update local absolute paths, etc. - though if you set them up as proper Python packages, you can install from a git repo, and thus avoid this issue by having a static URL as the source.
virtualenv --relocatable ENV is not a desirable solution. I assume most people want the ability to rename a virtualenv without any long-term side effects.
So I've created a simple tool to do just that. The project page for virtualenv-mv outlines it in a bit more detail, but essentially you can use virtualenv-mv just like you'd use a simple implementation of mv (without any options).
For example:
virtualenv-mv myproject project
Please note however that I just hacked this up. It could break under unusual circumstances (e.g. symlinked virtualenvs) so please be careful (back up what you can't afford to lose) and let me know if you encounter any problems.
An even easier solution that worked for me: just copy the site-packages folder of your old virtual environment into the new one.
Using Visual Studio Code (vscode), I just opened the ./env folder in my project root, and did a bulk find/replace to switch to my updated project name. This resolved the issue.
Confirm with which python
If you are using a conda env:
conda create --name new_name --clone old_name
conda remove --name old_name --all # or its alias: `conda env remove --name old_name`
