How do I change the default directory that pip installs to? - python

When I run the command
pip install virtualenv
I get:
Installing virtualenv script to /usr/local/share/python
But my default virtualenv is in a different place:
which virtualenv
/usr/local/bin/virtualenv
I'd like pip to install to the /usr/local/bin directory by default. Any help would be greatly appreciated.

If you want to manually decide where you want packages to reside, you could always download the source distribution to a directory of your choice with the following:
pip install -d <path_to_my_directory> <package>
But when you install, I think you probably want to put the executable console scripts (as defined in the package's setup.py file; like virtualenv, for example) in a directory included in your $PATH environment variable.
You can specify this manually by doing the following:
sudo python setup.py install --install-scripts /usr/bin/
or
sudo python setup.py install --install-scripts /usr/local/bin/
Let me know if you have any other questions...
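If you would rather stay within pip than run setup.py yourself, older pip releases also accept an --install-option flag that forwards arguments to setup.py install; a minimal sketch (recent pip versions have dropped this flag):
# forwards --install-scripts to setup.py install; only works on older pip releases
sudo pip install --install-option="--install-scripts=/usr/local/bin" virtualenv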

/usr/local/bin is for executable programs. /usr/local/share is to store data that is independent of the architecture.
So, in your case, you are installing by default under /usr/local, where executable programs live in /usr/local/bin, architecture-independent data lives in /usr/local/share, configuration files live in /usr/local/etc, and so on.
You can check the Filesystem Hierarchy Standard to get an overview of the topic.
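For illustration, listing /usr/local on a typical system shows that split (the exact contents vary by distribution):
$ ls /usr/local
bin  etc  include  lib  share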

Related

virtualenv - Birds Eye View of Understanding

Using Windows
Learning about virtualenv. Here is my understanding of it and a few questions that I have. Please correct me if my understanding is incorrect.
A virtualenv is an environment where your pip dependencies, at their selected versions, are stored for a particular project. A folder is made for your project and the dependencies live inside it.
I was told you would not want to save your .py scripts inside the virtual env. If that's the case, how do I access the virtual env when I want to run that project? Open the command line, run source ENV/bin/activate, then cd my way to where my script is stored?
Does running pip freeze create a requirements.txt file in that project folder that is just a text copy of the dependencies of that virtual env?
If I'm in a second virtualenv, how do I import another virtualenv's requirements? I've been to the documentation but I still don't get it.
$ env1/bin/pip freeze > requirements.txt
$ env2/bin/pip install -r requirements.txt
Guess I'm confused by the "requirements" description. Isn't it best practice to always call our requirements file requirements.txt? If that's the case, how does env2 know I want env1's requirements?
Thank you for any info or suggestions. Really appreciate the assistance.
I created a virtualenv:
C:\Users\admin\Documents\Enviorments>virtualenv django_1
Using base prefix 'c:\\users\\admin\\appdata\\local\\programs\\python\\python37-32'
New python executable in C:\Users\admin\Documents\Enviorments\django_1\Scripts\python.exe
Installing setuptools, pip, wheel...done.
How do I activate it? source django_1/bin/activate doesn't work?
I've tried: source C:\Users\admin\Documents\Enviorments\django_1/bin/activate. Every time I get: 'source' is not recognized as an internal or external command, operable program or batch file.
Disclaimer: I mainly use conda environments instead of virtualenv, but I believe most of this is the same across both of them and applies to your case.
You should be able to access your scripts from any environment you are in. If you have virtenvA and virtenvB then you can access your script from inside either of your environments. All you would do is activate one of them and then run python /path/to/my/script.py, but you need to make sure any dependent libraries are installed.
Correct, but for clarity: the requirements file contains a list of the dependencies by name only. It doesn't contain any actual code or packages. You can print out a requirements file, but it should just be a list of package names and their version numbers, like pandas==1.0.1, numpy==1.0.1, scipy==1.0.1, etc.
In the lines of code you have here, you would export the dependency list of env1 and then install those dependencies in env2. If env2 was empty, it will now just be a copy of env1; otherwise it will be the same but with all the packages of env1 added, and if it had different versions of some of the same packages, those would be overwritten.
virtualenv simply creates a new Python environment for your project. Think of it as another copy of Python that you have on your system. A virtual environment is helpful for development, especially if you need different versions of the same libraries.
The answer to your first question is yes: for each project where you use virtualenv, you need to activate it first. After activating, any python script you run (not just your project's scripts) will use the dependencies and configuration of the active Python environment.
For the second question: pip freeze > requirements.txt creates the requirements file in the current folder, not necessarily in your project folder. So if in your cmd/terminal you are in C:\Desktop, the requirements file will be created there; if you're in C:\Desktop\myproject, it will be created there instead. The requirements file will contain the packages installed in the active virtualenv.
The answer to the 3rd question is related to the second. Simply put, you need to write the full path of the other requirements file. So if you want to install the first project's packages into the second virtualenv, you run something like env2/bin/pip install -r /path/to/my/first/requirements.txt. If your terminal's current folder does not have a requirements.txt file, then plain pip install -r requirements.txt will give you an error. Running the command does not guess which requirements file you want to use; you specify it.
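As a concrete sketch with hypothetical paths, exporting the first environment's package list and installing it into the second looks like this:
# /path/to/... are placeholder paths for the two environments and the first project folder
/path/to/env1/bin/pip freeze > /path/to/project1/requirements.txt
/path/to/env2/bin/pip install -r /path/to/project1/requirements.txt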
Yes, keeping the virtualenv separate from your project files is one approach; virtualenvwrapper and pipenv work like that. But personally, if I use virtualenv in its simplest form, I just create a directory with the same name inside the virtualenv's directory (next to bin/) and keep the project files there.
pip freeze prints to the console the packages (and their versions) you've installed inside your virtualenv using pip. If you want to save those requirements to a file, you should do something like pip freeze > requirements.txt
There are a few possibilities:
you can activate one virtualenv, then go (cd /path/to/venv2) to another virtualenv.
you can copy your requirements.txt file from one virtualenv and install those requirements in your second virtualenv

Using a user folder for python3 packages with pip3

I am trying to install and use python3 packages in the /home/myname/pp folder. I should be able to run python3 from anywhere. Also, pip3 should be able to update the packages in this folder. I should also be able to copy this folder to a new Linux system and have it work there as well (by changing PYTHONPATH there).
I searched and found the following options:
pip install -t <direct directory> <package> # I prefer this.
and
pip install --install-option="--prefix=$PREFIX_PATH" package_name
or use:
virtualenv
and then I need to do:
echo 'export PYTHONPATH="/home/myname/pp:$PYTHONPATH"' >> ~/.bash_profile
What should be my approach for these requirements? Thanks for your help.
pip install -t <direct directory> <package>
Will install the package globally in the given directory.
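As a rough sketch of that first option for the folder named in the question (the package name is only an example):
# requests is just an example package; /home/myname/pp is the folder from the question
pip3 install -t /home/myname/pp requests
export PYTHONPATH="/home/myname/pp:$PYTHONPATH"
python3 -c "import requests; print(requests.__file__)"   # should print a path under /home/myname/pp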
pip install --install-option="--prefix=$PREFIX_PATH" package_name
Will run the package's setup.py with the given parameters, as mentioned in pip's help:
--install-option   Extra arguments to be supplied to the setup.py install
                   command (use like --install-option="--install-scripts=/usr/local/bin").
                   Use multiple --install-option options to pass multiple
                   options to setup.py install. If you are using an option
                   with a directory path, be sure to use an absolute path.
The recommended way to install packages is to use a virtual environment. It keeps your global packages clean, in case you want the same package at different versions in two different projects, for example.
virtualenv basically creates a folder for where to store the installed packages.
On a Linux-based system, you would have to run the virtualenv command to create the folder and after that "activate" it.
virtualenv my_virtual_environment
source my_virtual_environment/bin/activate
You will notice that the environment's name appears in your shell prompt. What activate does is simply prepend the virtual environment's bin folder to your PATH environment variable, so paths resolve to your current virtual environment folder first.
It will still use the system's Python interpreter, but when your program imports packages, it will look in the virtual environment's folder first.
To return to the global python packages, just type deactivate.
If you want the environment you're using to be active right when you start the terminal, add the source command to your .bash_profile or .bashrc. I recommend using the absolute path to the python virtual environment.
If you're working on multiple projects and want to keep the packages separate from each other, create multiple virtual environments and just switch between them. You could take a look at virtualenvwrapper, which makes activating a virtual environment when you open the terminal, and switching between environments, really easy.
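For example, the line added to .bashrc or .bash_profile might look like this (assuming the environment was created under your home directory):
# adjust the path to wherever you created the environment
source /home/myname/my_virtual_environment/bin/activate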

Installing python packages with no installation directory access and no pip/easy_install/virtual_env

At work we have python installed, but no additional modules. I want to import some scipy modules but I have no access to the python directory for installation.
Similar questions have been asked on StackOverflow, but the answers always assumed easy install, pip or virtualenv were installed. At my workplace, none of these packages are installed. It's just the plain python installation and nothing else.
Is there still an option for me for installing modules in my local folder and calling them from python? If so, how do I go about it?
Not exactly installing modules on your local folder, but a solution nonetheless:
I used to work for a company that used windows and didn't have admin access, so I ended up using Portable python.
It seems Portable Python is no longer maintained, but you can see some other portable Python solutions on their site, most of which you can run straight from a USB drive.
You can download pip from here http://pip.readthedocs.org/en/stable/installing/ and install it without root privileges by typing:
python get-pip.py --user
This installs under the prefix $HOME/.local, so the pip executable will be at $HOME/.local/bin/pip. For convenience you can add this directory to $PATH by appending the following line to the end of your .bashrc file:
export PATH=$HOME/.local/bin/:$PATH
After this you can install any packages by typing
pip install package --user
Alternatively, you can compile the Python distribution from source and install it into your home directory, for example into $HOME/.local, $HOME/opt, or any subfolder of $HOME you prefer; let's call this path $PREFIX. To do this, download the Python source code from the official site, unpack it, and then run
./configure --prefix=$PREFIX --enable-shared
make install
Then add the python binary to $PATH and the python libraries to $LD_LIBRARY_PATH by adding these lines to the end of your $HOME/.bashrc file:
export PATH=$PREFIX/bin:$PATH
export LD_LIBRARY_PATH=$PREFIX/lib
After restarting bash, you can also run
python get-pip.py
and pip will be installed automatically into your $PREFIX directory, and all other packages that you install with pip will automatically go into the $PREFIX directory as well. This way is more involved, but it allows you to have the latest version of Python.
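Putting the source-build route together, a minimal sketch (the version number and prefix are only examples):
# 3.8.10 is just an example release; any current version works the same way
PREFIX=$HOME/.local
wget https://www.python.org/ftp/python/3.8.10/Python-3.8.10.tgz
tar xzf Python-3.8.10.tgz
cd Python-3.8.10
./configure --prefix=$PREFIX --enable-shared
make && make install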

Override default installation directory for Python bdist Windows installer

Is it possible to specify during the installer generation (or during the actual installation) a custom path for Python modules? By way of example, let's say I have 5 modules for which I generate an installer using:
c:\>python setup.py bdist
Everything gets packaged up correctly, but when I install, I am forced to install into site-packages. I need to be able to specify a custom directory of my (or the installer's) choosing. At a minimum, I need to be able to override the default so my custom path appears as the default.
Is this possible using a built distribution?
You should write a setup.cfg in which you specify installation options (see the python setup.py install --help output) and then run python setup.py bdist. When creating the binary distro, Python will perform a throwaway installation under the "build" subdirectory with these options and create the installer from that installation. For example, if you want to create a bdist which installs libraries to /some/lib/path and scripts to /some/bin/path, create the following setup.cfg:
[install]
prefix=/
install_lib=/some/lib/path
install_scripts=/some/bin/path
And then run python setup.py bdist
From running python setup.py --help install:
Options for 'install' command:
  --prefix             installation prefix
  --exec-prefix        (Unix only) prefix for platform-specific files
  --home               (Unix only) home directory to install under
  --user               install in user site-package
                       ('/home/jterrace/.local/lib/python2.7/site-packages')
  --install-base       base installation directory (instead of --prefix or --home)
  --install-platbase   base installation directory for platform-specific files
                       (instead of --exec-prefix or --home)
  --root               install everything relative to this alternate root directory
I do believe that MaxSin's answer was somewhat correct. But to use his answer for the command python setup.py bdist_wininst you would have to do it like this:
[bdist_wininst]
prefix=/
install_lib=/some/lib/path
install_scripts=/some/bin/path
Seeing as the syntax here is:
[command]
option=value
...
Edit: It looks like this doesn't work :( Not sure of another possible solution.

Renaming a virtualenv folder without breaking it

I've created a folder and initialized a virtualenv instance in it.
$ mkdir myproject
$ cd myproject
$ virtualenv env
When I run (env)$ pip freeze, it shows the installed packages as it should.
Now I want to rename myproject/ to project/.
$ mv myproject/ project/
However, now when I run
$ . env/bin/activate
(env)$ pip freeze
it says pip is not installed. How do I rename the project folder without breaking the environment?
You need to adjust your install to use relative paths. virtualenv provides for this with the --relocatable option. From the docs:
Normally environments are tied to a specific path. That means that you cannot move an environment around or copy it to another computer. You can fix up an environment to make it relocatable with the command:
$ virtualenv --relocatable ENV
NOTE: ENV is the name of the virtual environment and you must run this from outside the ENV directory.
This will make some of the files created by setuptools or distribute use relative paths, and will change all the scripts to use activate_this.py instead of using the location of the Python interpreter to select the environment.
Note: you must run this after you've installed any packages into the environment. If you make an environment relocatable, then install a new package, you must run virtualenv --relocatable again.
I believe "knowing why" matters more than "knowing how". So, here is another approach to fix this.
When you run . env/bin/activate, it actually executes the following commands (using /tmp for example):
VIRTUAL_ENV="/tmp/myproject/env"
export VIRTUAL_ENV
However, you have just renamed myproject to project, so VIRTUAL_ENV now points to a path that no longer exists.
That is why it says pip is not installed: you haven't installed pip in the system's global environment, and your virtualenv's pip is no longer found.
If you want to fix this manually, this is the way:
With your favorite editor, such as Vim, modify /tmp/project/env/bin/activate (usually around line 42):
VIRTUAL_ENV='/tmp/myproject/env' => VIRTUAL_ENV='/tmp/project/env'
Modify /tmp/project/env/bin/pip in line 1:
#!/tmp/myproject/env/bin/python => #!/tmp/project/env/bin/python
After that, activate your virtual environment env again, and you will see that your pip is back.
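Both edits are the same path substitution, so you can also apply them in one command; a minimal sketch using the example paths above:
# rewrites the old path in the two files shown above; back them up first if unsure
sed -i 's|/tmp/myproject/env|/tmp/project/env|g' /tmp/project/env/bin/activate /tmp/project/env/bin/pip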
NOTE: As #jb. points out, this solution only applies to easily (re)created virtualenvs. If an environment takes several hours to install, this solution is not recommended.
Virtualenvs are great because they are easy to make and switch around; they keep you from getting locked into a single configuration. If you know the project requirements, or can get them, make a new virtualenv:
Create a requirements.txt file
(env)$ pip freeze > requirements.txt
If you can't create the requirements.txt file, check env/lib/pythonX.X/site-packages before removing the original env.
Delete the existing (env)
deactivate && rm -rf env
Create a new virtualenv, activate it, and install requirements
virtualenv env && . env/bin/activate && pip install -r requirements.txt
Alternatively, use virtualenvwrapper to make things a little easier as all virtualenvs are kept in a centralized location
$(old-venv) pip freeze > temp-reqs.txt
$(old-venv) deactivate
$ mkvirtualenv new-venv
$(new-venv) pip install -r temp-reqs.txt
$(new-venv) rmvirtualenv old-venv
I always install virtualenvwrapper to help out. From the shell prompt:
pip install virtualenvwrapper
There is a way documented in the virtualenvwrapper documentation: cpvirtualenv.
This is what you do. Make sure you are out of your environment and back at the shell prompt. Type this in with the required names:
cpvirtualenv oldenv newenv
And then, if necessary:
rmvirtualenv oldenv
To go to your newenv:
workon newenv
You can fix your issue by following these steps:
rename your directory
rerun this: $ virtualenv ..\path\renamed_directory
virtualenv will correct the directory associations while leaving your packages in place
$ scripts/activate
$ pip freeze to verify your packages are in place
An important caveat: if you have any static path dependencies in script files in your virtualenv directory, you will have to change those manually.
Yet another way to do it that worked for me many times without problems is virtualenv-clone:
pip install virtualenv-clone
virtualenv-clone old-dir/env new-dir/env
Run this inside your project folder:
cd bin
sed -i 's/old_dir_name/new_dir_name/g' *
Don't forget to deactivate and activate.
In Python 3.3+ with built-in venv
As of Python 3.3, the virtualenv package is built into Python as the venv module. There are a few minor differences, one of which is that the --relocatable option has been removed. As a result, it is normally best to recreate a virtual environment rather than attempt to move it. See this answer for more information on how to do that.
What is the purpose behind wanting to move rather than just recreate a virtual environment? A virtual environment is intended to manage the dependencies of a module/package so that it can rely on different, specific versions of the packages it depends on, and to provide a location where those things can be installed locally.
As a result, a package should provide a way to recreate the venv from scratch. Typically this is done with a requirements.txt file and sometimes also a requirements-dev.txt file, and even a script to recreate the venv in the setup/install of the package itself.
One part that may give headaches is that you may need a particular version of Python as the executable, which is difficult to automate, if not already present. However, when recreating an existing virtual environment, one can simply run python from the existing venv when creating the new one. After that it is typically just a matter of using pip to reinstall all dependencies from the requirements.txt file:
From Git Bash on Windows:
python -m venv mynewvenv
source mynewvenv/Scripts/activate
pip install -r requirements.txt
It can get a bit more involved if you have several local dependencies from other locally developed packages, because you may need to update local absolute paths, etc. - though if you set them up as proper Python packages, you can install from a git repo, and thus avoid this issue by having a static URL as the source.
virtualenv --relocatable ENV is not a desirable solution. I assume most people want the ability to rename a virtualenv without any long-term side effects.
So I've created a simple tool to do just that. The project page for virtualenv-mv outlines it in a bit more detail, but essentially you can use virtualenv-mv just like you'd use a simple implementation of mv (without any options).
For example:
virtualenv-mv myproject project
Please note however that I just hacked this up. It could break under unusual circumstances (e.g. symlinked virtualenvs) so please be careful (back up what you can't afford to lose) and let me know if you encounter any problems.
An even easier solution that worked for me: just copy the site-packages folder of your old virtual environment into the new one.
Using Visual Studio Code (vscode), I just opened the ./env folder in my project root, and did a bulk find/replace to switch to my updated project name. This resolved the issue.
Confirm with which python
If you are using a conda env,
conda create --name new_name --clone old_name
conda remove --name old_name --all # or its alias: `conda env remove --name old_name`
