I know that virtualenv, if not passed the --no-site-packages argument when creating a new virtual environment, will link the packages in /usr/local/lib/python2.7/site-packages (for Python 2.7) with a newly-created virtual environment. On Ubuntu 12.04 LTS, I have three locations where Python 2.7 packages can be installed (using the default, Ubuntu-supplied Python 2.7 installation):
/usr/lib/python2.7/dist-packages: this has my global installation of ipython, scipy, numpy, matplotlib – packages that I would find difficult and time-consuming to install individually (along with all their dependencies) if they were not available via the scipy stack.
/usr/local/lib/python2.7/site-packages: this is empty, and I think it will stay that way on Ubuntu unless I install a package from source.
/usr/local/lib/python2.7/dist-packages: this has very important local packages for astronomy, notably those related to PyRAF, STScI, etc., and they are extremely difficult and time-consuming to install individually.
Note that a global directory such as /usr/lib/python2.7/site-packages does not exist on my system. Note also that my global installation of ipython, scipy, etc. lets me use those packages on-the-fly without having to source/activate a virtual environment every time.
Naturally, I now want to use virtualenv to create one virtual environment in my user home directory which I will source/activate for my future projects. However, I would like this virtual environment, while being created, to link/copy all of my packages in locations (1) and (3) in the list above. The main reason for this is that I don't want to go through the pip install process (if it is even possible) to re-install ipython, scipy, the astro-packages, etc. for this (and maybe other) virtual environments.
Here are my questions:
Is there a way for me to specify to virtualenv that I would like it to link/copy packages in these two dist-packages directories for virtual environments that are created in the future?
When I eventually update my global installation of scipy, ipython, etc. in the two dist-packages directories, will this also update/change the packages that my virtual environment uses (and which it originally got during virtualenv creation)?
If I ever install a package from source on Ubuntu, will it go in /usr/local/lib/python2.7/dist-packages, or /usr/local/lib/python2.7/site-packages?
Thanks in advance for your help!
This might be a legitimate use of PYTHONPATH, an environment variable that virtualenv doesn't touch. It uses the same syntax as the environment variable PATH; in bash, that means putting PYTHONPATH=/usr/lib/python2.7/dist-packages:/usr/local/lib/python2.7/dist-packages in a .bashrc or similar. If you followed this path, then:
1. You don't have to tell your virtual environment about this at all; it won't try to change it.
2. No relinking will be required.
3. Packages built from source will still go wherever they would have gone (pip install always uses /usr/local/lib/python2.7/dist-packages/ on my Ubuntu) if you install them outside of your virtual environment. If you install them from within your virtual environment (while it's activated), then of course they'll be put in the virtual environment.
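If you go this route, a quick sanity check helps; here is a minimal sketch you can run with any interpreter that should see the variable:

import sys

# True means the directory made it onto the search path via PYTHONPATH.
for path in ("/usr/lib/python2.7/dist-packages",
             "/usr/local/lib/python2.7/dist-packages"):
    print(path in sys.path)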
I'm just getting my head around virtualenv, but there seems to be an easier way than mentioned so far.
Since virtualenv 1.7, --no-site-packages has been the default behavior.
Therefore, using the --system-site-packages flag to virtualenv is all that is needed to get dist-packages in your path - if you use the tweaked virtualenv shipped by Ubuntu. (This answer and this one give some useful history.) I've tested this and it does work.
$ virtualenv --system-site-packages .
I agree with Thomas here - I can't see any action required in virtualenv to see the effect of updates in dist-packages.
Having tested that with python setup.py install, it does (again as Thomas said) still go to dist-packages. You could change that by building your own python, but that's a bit extreme.
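If you ever want to check where installs will land for a given interpreter, here is a minimal sketch; run it with the system python and with a virtualenv's python to compare:

# Prints the default install location for this interpreter:
# .../dist-packages for Ubuntu's system Python,
# .../site-packages inside a virtualenv.
from distutils.sysconfig import get_python_lib
print(get_python_lib())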
PYTHONPATH works for me.
vim ~/.bashrc
add this line below:
export PYTHONPATH=$PYTHONPATH:/usr/lib/python2.7/dist-packages:/usr/local/lib/python2.7/dist-packages
source ~/.bashrc
In your virtualenv's site-packages directory, create a file named dist.pth.
In the file dist.pth, put the following:
../dist-packages
Now deactivate and activate your virtualenv. You should be set.
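If you'd rather script those steps, here is a minimal sketch; it assumes the virtualenv is currently activated (so VIRTUAL_ENV is set) and the lib/python2.7 layout described above:

import os

# Write dist.pth into the active virtualenv's site-packages; its single
# line "../dist-packages" adds the sibling dist-packages directory to
# the search path.
site_packages = os.path.join(os.environ["VIRTUAL_ENV"],
                             "lib", "python2.7", "site-packages")
with open(os.path.join(site_packages, "dist.pth"), "w") as f:
    f.write("../dist-packages\n")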
What you want to achieve here is essentially to add a specific folder (dist-packages) to the Python search path. You have a number of options for this:
Use a path configuration (.pth) file; its entries will be appended to the system path.
Modify PYTHONPATH (entries from it go to the beginning of the system path).
Modify sys.path directly from your Python script, i.e. append the required folders to it.
I think that for this particular case (enabling the global dist-packages folders) the third option is better, because with the first option you have to create a .pth file for every virtualenv you'll be working in (with some external shell script?), and it's easy to forget when you distribute your package. The second option requires run-time setup (adding an environment variable), which is, again, easy to miss.
Only the third option requires no prerequisites at configure- or run-time and can be distributed without issues (on a system of the same type, of course).
You can use a function like this:
def enable_global_distpackages():
    import sys
    sys.path.append('/usr/lib/python2.7/dist-packages')
    sys.path.append('/usr/local/lib/python2.7/dist-packages')
And then in the __init__.py file of your package:
enable_global_distpackages()
Related
I'm confused as to where I should put my virtualenvs.
With my first django project, I created the project with the command
django-admin.py startproject djangoproject
I then cd'd into the djangoproject directory and ran the command
virtualenv env
which created the virtual environment directory at the same level as the inner djangoproject directory.
Is this the wrong place in which to create the virtualenv for this particular project?
I'm getting the impression that most people keep all their virtualenvs together in an entirely different directory, e.g. ~/virtualenvs, and then use virtualenvwrapper to switch back and forth between them.
Is there a correct way to do this?
Many people use the virtualenvwrapper tool, which keeps all virtualenvs in the same place (the ~/.virtualenvs directory) and allows shortcuts for creating and keeping them there. For example, you might do:
mkvirtualenv djangoproject
and then later:
workon djangoproject
It's probably a bad idea to keep the virtualenv directory in the project itself, since you don't want to distribute it (it might be specific to your computer or operating system). Instead, keep a requirements.txt file using pip:
pip freeze > requirements.txt
and distribute that. This will allow others using your project to reinstall all the same requirements into their virtualenv with:
pip install -r requirements.txt
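For reference, the generated requirements.txt is just a plain list of pinned versions, one package per line; the names and versions below are purely illustrative:

Django==1.5.1
requests==2.0.0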
Changing the location of the virtualenv directory breaks it
This is one advantage of putting the directory outside of the repository tree, e.g. under ~/.virtualenvs with virtualenvwrapper.
Otherwise, if you keep it in the project tree, moving the project location will break the virtualenv.
See: Renaming a virtualenv folder without breaking it
There is --relocatable, but it is known not to be perfect.
Another minor advantage: you don't have to .gitignore it.
The advantages of keeping it gitignored in the project tree itself are:
it keeps related stuff close together.
you will likely never reuse a given virtualenv across projects, so putting it somewhere else does not give much of an advantage.
This is an annoying design flaw in my opinion. They should implement virtualenv in a way where it does not matter where the directory is, as storing it in-tree is just simpler and more isolated. Node.js' NPM package manager does it without any problem. And while we are at it: pip should just use local directories by default, just like NPM. Having this separate virtualenv layer is wonky. Node.js just has NPM, which does it all without extra typing. I can't believe I'm praising the JavaScript ecosystem on a Python post, but it's true.
The generally accepted place to put them is the same place that the default installation of virtualenvwrapper puts them: ~/.virtualenvs
Related: virtualenvwrapper is an excellent tool that provides shorthands for the common virtualenv commands. http://www.doughellmann.com/projects/virtualenvwrapper/
If you use pyenv to install Python, then pyenv-virtualenv is a best practice. If you set a .python-version file, it can automatically activate or deactivate the virtual env when you change working directories. pyenv-virtualenv also puts all virtual envs into the $HOME/.pyenv/versions folder.
From my personal experience, I would recommend organizing all virtual environments in one single directory, unless you have an extremely sharp memory and can remember files/folders scattered across the file system.
I'm not a big fan of using other tools just to manage virtual environments. In VS Code, if I configure python.venvPath to the directory containing all my virtual environments, it can automatically recognize all of them.
For Anaconda installations of Python, the "conda create" command puts new environments in a directory within the anaconda3 folder by default. Specifically (on Windows):
C:\Users\username\anaconda3\envs
This allows other conda commands to work without specifying the path. One advantage, not noted above, is that putting environments in the project folder allows you to use the same name for all of them (but that is not much of an advantage for me). For more info, see:
https://conda.io/projects/conda/en/latest/user-guide/tasks/manage-environments.html
Why does pip need to use virtual environments to isolate packages per-project, instead of just installing them in a default directory in the project? This seems like added complexity without benefit.
NPM, for example, installs packages in the <project_root>\node_modules by default. No virtual environment necessary, and packages are still installed in a project-independent way.
Edit: To be clear, I'm interested in the practical advantages to pip's use of virtual environments over package management systems like NPM, Nuget, and Webpack, which all use directories in the project. Otherwise, if this is just a limitation of Python's modules system, then I'd be interested to know that too.
Because Python's module system doesn't work that way. If pip were to install, say, requests by just downloading it to a python_modules directory, that wouldn't be enough for import requests to work; it would have to be import python_modules.requests, but then we'd still have problems whenever requests tried to import one of its dependencies, as that would need python_modules prepended too, and it'd just be a big mess. The solution that virtual environments use is to give each environment its own site-packages directory on the interpreter's search path, plus some extra handling for executable scripts and for not importing packages from outside the virtualenv.
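To make that concrete, here is a minimal sketch of the sys.path fix, where python_modules is the hypothetical per-project directory from the NPM analogy above:

import os
import sys

# Put the hypothetical project-local directory at the front of the
# search path, so `import requests` resolves to python_modules/requests
# without any python_modules. prefix -- essentially the effect a
# virtualenv's own site-packages directory gives you automatically.
sys.path.insert(0, os.path.join(os.path.dirname(os.path.abspath(__file__)),
                                "python_modules"))

import requests  # found in python_modules/ if it was downloaded there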
I think maybe you don't know what a virtual environment actually is.
If you were to put some module in a project-specific directory, like myproj/modules, then you would have to add myproj/modules to the search path that Python uses so that your module can be found. One way to do that is to define or modify the environment variable PYTHONPATH. Any directories listed in that variable will be searched for modules, in addition to some hard-coded set of directories.
$ export PYTHONPATH=./myproj/modules
However, that's really all a virtual environment is. The directory contains the desired version of Python, along with whatever modules you want to use. The activate script you run to "enable" a virtual environment does little more than adjust PATH so that the environment's own python is found first; that interpreter then uses your project-specific set of modules in place of any global library.
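You can see this from Python's side with a small sketch; run it once with the system interpreter and once with a virtualenv's interpreter and compare the output:

import sys

print(sys.prefix)    # points inside the virtualenv when its python runs
for path in sys.path:
    print(path)      # includes the env's own site-packages directory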
I'm aware of how to use virtualenv for isolating Python dependencies in a long-running script, like a Flask or Twisted app. But I've been sort of puzzled about how you're supposed to go about this for a script intended to be invoked from the command line.
Suppose I wanted to make a CLI tool for interacting with some API, perhaps using Click or docopt. Obviously you don't want to have to source venv/bin/activate every time you want to use this tool. But I'd assume that it's still best to use virtualenv to keep a clean environment even beyond development.
Sorry for the newbie question, but...what are you supposed to do to package up a script so it can be cleanly used in this manner? (I'm more used to RubyGems, and am still figuring out Pip and VirtualEnv.)
In general, if a package you've installed in a virtualenv provides a command-line script, say in ~/.virtualenv/bin/, you can symlink it into ~/bin/ (or wherever on your path you'd like to put local scripts).
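For example, a minimal sketch of that symlinking step ("mytool" and both paths are hypothetical; adjust to your setup):

import os

# Expose a script installed in the virtualenv's bin/ on your PATH by
# symlinking it into ~/bin/.
src = os.path.expanduser("~/.virtualenv/bin/mytool")
dst = os.path.expanduser("~/bin/mytool")
os.symlink(src, dst)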
There are a couple projects that aim to solve this issue:
pipsi, the pip script installer -- amounts to doing the virtualenv creation and symlinking for you
pipx -- pip for executable binaries
An excellent article on virtualenv by DabApps should make it clear for you:
http://www.dabapps.com/blog/introduction-to-pip-and-virtualenv-python/
As for invoking it from a CLI script:
1. cd to your project root
2. env/bin/python your_main_file.py (Assuming your virtualenv is named env)
This way you don't need to source the virtualenv every time.
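A related trick, sketched below: point the script's shebang at the env's interpreter (shebang paths must be absolute, so the path shown is hypothetical), make the file executable, and you can run it directly with the virtualenv's packages:

#!/home/you/project/env/bin/python
# Hypothetical absolute path to the virtualenv's interpreter; chmod +x
# this file and run it directly -- the virtualenv's packages are used
# without activating anything.
import sys

print(sys.prefix)  # confirms the virtualenv's interpreter is running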
Each virtualenv has its own site-packages, builtin modules, and Python interpreter. So virtualenv is meant to be used at the project level, not at the "package by package" level. It isolates a collection of Python modules and their dependencies. Each virtualenv has its own location where pip will install packages. In theory, virtualenv should not be necessary, but in practice it's nice to have a way to maintain different "environments" with different versions of Python modules and Python interpreters. I don't know whether Ruby has something similar that would allow you to have different "sets" of gems for different projects.
People who use straight virtualenv will add aliases to their .bashrc, for example:
alias workonawesomeproject="source ~/venv/awesomeproject/bin/activate"
They would activate the virtualenv with the alias
workonawesomeproject
To leave a virtualenv you use the command deactivate
An easier way to deal with virtualenvs is to use virtualenvwrapper
pip install virtualenvwrapper
Add these lines to your .bashrc (or other shell initialization file)
export WORKON_HOME=$HOME/venv # this directory is your choice
export PROJECT_HOME=$HOME/src # this directory is your choice
source /usr/local/bin/virtualenvwrapper.sh # leave this alone
If you just modified your .bashrc, make sure to source it:
source ~/.bashrc
Then to make a new virtualenv you simply run
mkvirtualenv awesomeproject
To use that virtualenv
workon awesomeproject
To deactivate that virtualenv
deactivate
Virtualenvwrapper Docs:
http://virtualenvwrapper.readthedocs.org/en/latest/install.html
I've installed a Python based package in my site-packages directory. However, I'm trying to learn how the code works so I'd like to hack it a bunch by putting in lots of print statements so I can understand what the code is doing. But at the end of the day I want a clean installation without all my hacks in it.
Of course I could just copy the original files to something else, make some hacks, and then at the end copy all the original files back over. But that's really tedious. At the very least, I'd like to install a local copy of the Python package and then have the python script use this copy preferentially (perhaps by suitable statements at the top of the script). But perhaps this isn't even the best way to do python development/hacking.
What's the best solution for my problem? I want to be able to hack on the package (and use that package) but without messing up my clean version.
Take a look at virtualenv. You can basically set up a local Python environment in which you can install anything you like without having to mess around with the system environment.
The virtualenv advice given is correct; depending on your actual package, you can even go beyond that and not mess with the site-packages inside the virtualenv at all.
If the package is setuptools-based, a simple
$ python setup.py develop
from within a working copy of its source won't install the package; instead it will just be hooked into the virtualenv, pointing at the working copy. Advantage: you can edit (and e.g. roll back using Git or whatever SCM the package maintainer uses) files in a well-defined and non-volatile location.
This is what the Python virtualenv tool is for. It allows you to create a local Python environment with a set of packages distinct from your system installation. For example, I could do something like this:
$ virtualenv myenv
$ . myenv/bin/activate
$ pip install nifty-module
The activate script modifies your PATH so that any script that starts with:
#!/usr/bin/env python
will use the Python from your virtual environment, rather than the system Python, and will see the modules installed in that environment.
I have a local git repository on my machine, let's say under /develop/myPackage.
I'm currently developing it as a python package (a Django app) and I would like to access it from my local virtualenv. I've tried to include its path in my PYTHONPATH (I'm on a Mac)
export PATH="$PATH:/develop/myPackage"
The directory already contains an __init__.py within its root and within each subdirectory.
No matter what I do, I can't get it to work; Python won't see my package.
The alternatives are:
Push my local change to github and install the package within my virtualenv from there with pip
Activate my virtualenv and install the package manually with python setup.py install
Since I often need to make changes to my code, the last two solutions would require too much work all the time, even for a small change.
Am I doing something wrong? Would you suggest a better solution?
Install it in editable mode from your local path:
pip install -e /develop/MyPackage
This links the package into your virtualenv (via an egg-link back to your working copy) so you can keep on developing and testing.
The example you show above uses PATH, not PYTHONPATH. Generally, the search path used by Python is partially predicated on the PYTHONPATH environment variable (PATH has little use for this case).
Try this:
export PYTHONPATH=$PYTHONPATH:/develop/myPackage
Though in reality, you likely want it pointing to the directory that contains your package (so you can do 'import myPackage' rather than importing things from within the package). That being said, you likely want:
export PYTHONPATH=$PYTHONPATH:/develop/
Reference the python docs here for more information about Python's module/package search path: http://docs.python.org/2/tutorial/modules.html#the-module-search-path
By default, Python uses the packages that it was installed with as its default path, and as a result PYTHONPATH is unset in the environment.