Python venv and virtualenvwrapper combined - python

In Python 3.5 the recommended way to create virtual environments is with the venv module, instead of virtualenv. Still, the Python packaging tutorial mentions both tools.
However, virtualenvwrapper is a recommended wrapper tool for use with virtualenv.
My questions are then:
Is there a way to use virtualenvwrapper with venv?
Or could one even consider virtualenvwrapper not needed because of venv? (I cannot see how this could be true, since it is a wrapper solving another problem.)
Edit: I can see that there is some confusion in the answers to my question. venv is Python's official equivalent of virtualenv, as explained in the links above. Multiple Stack Overflow questions suggest that venv should be used. As mentioned in the "duplicate" you suggested:
the introduction of venv is in part a response to that mess. If you want to help improve the situation, I suggest you use venv and encourage others to do the same
So it is encouraged to use venv. But as this question implies, if one is to use venv, how does one use a wrapper like virtualenvwrapper?

Here is a custom but still clean and clear solution. Append this script to your .bashrc / .bash_profile / .zshrc; it will give you basic management of venvs.
In addition, you can extend the script by adding the following function so that it also lists the existing venvs.
lsvenv() {
    ls "$VENV_HOME"
}
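The full script is not reproduced in this answer, but a minimal sketch of what such helper functions might look like is below. The VENV_HOME location and the function names mkvenv/venv/rmvenv are assumptions for illustration, not part of the original script:

```shell
# Assumed layout: all venvs live under $VENV_HOME, one directory per venv.
export VENV_HOME="$HOME/.venvs"
mkdir -p "$VENV_HOME"

# Create a new venv: mkvenv myproject
mkvenv() {
    python3 -m venv "$VENV_HOME/$1"
}

# Activate an existing venv: venv myproject
venv() {
    source "$VENV_HOME/$1/bin/activate"
}

# Delete a venv: rmvenv myproject
rmvenv() {
    rm -rf "${VENV_HOME:?}/$1"
}

# List all existing venvs, as in the extension above
lsvenv() {
    ls "$VENV_HOME"
}
```

After sourcing your shell config, `mkvenv myproject` followed by `venv myproject` gives a workflow roughly analogous to virtualenvwrapper's mkvirtualenv/workon.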

Is there a way to use virtualenvwrapper with venv?
Yes. Simply point WORKON_HOME at your venvs directory. Here's what I do in my ~/.zshrc; I use a mix of virtualenv (rare now, just for a few legacy Python 2 needs) and venv (most common). I renamed the default directory to .venvs to make it clear these are mostly Python 3 venvs and not virtualenvs.
# Python Environment Handling
export WORKON_HOME=$HOME/.venvs # Default name changed from virtualenv to highlight I am using python3 -m venv (aka pyvenv)
export PROJECT_HOME=$HOME/dev
source /usr/local/bin/virtualenvwrapper.sh # symlinked to /Library/Frameworks/Python.framework/Versions/3.7/bin/virtualenvwrapper.sh
Or could one even consider virtualenvwrapper not needed due to venv? (I cannot see how this could be true since it is a wrapper solving another problem)
venv == virtualenv (in short). venv does not supersede virtualenvwrapper, for the same reason that virtualenv never did. Your hunch on this is correct.
How is this better than just making an alias that sources the activate script?
Well, it's up to you to determine what you need, but I find virtualenvwrapper, together with the zsh plugins virtualenv and virtualenvwrapper, to be pretty great and better than raw aliases.
workon to list all venvs is very nice, and then workon speech_analyzer to jump right into one.
Other solutions?
You can also set up hooks to activate venvs on directory change, but if that's what you are after, and only that, then that's essentially pipenv. Pipenv is great if that's all you want to do. Pipenv also has an interesting and promising lockfile feature, but it is too slow for development and still too immature for production for me to comment further at this time.
But I've never liked the 1:1 environment-per-project workflow, for the same reasons given here: https://chriswarrick.com/blog/2018/07/17/pipenv-promises-a-lot-delivers-very-little/. In particular, the multi-project, single-environment needs described there line up with mine: https://chriswarrick.com/blog/2018/07/17/pipenv-promises-a-lot-delivers-very-little/#nikola
I have six environments on my machine and about 20 projects. Pipenv doesn't extend to that situation: it insists on 20 environments for 20 projects. It just doesn't work and creates more problems than it solves. If you do currently have a 1:1 workflow, though, then pipenv may be the tool you'd like. One word of caution: you can *only* do that workflow in pipenv, unfortunately.

[EDIT, 8 Jul 19: Readers will probably find that this answer gives a fuller description of the different tools for handling virtual environments in Python. They all have their issues, as does conda, which has a somewhat more sophisticated concept of "environment".]
As its name suggests, virtualenvwrapper was specifically designed to wrap virtualenv, on which it depends. I don't believe that anyone has similarly wrapped venv yet.
venv is intended to do the basic work of creating virtual environments, but environment management has to be done with scripts. Although shell scripting is often people's first resort, the venv module has an extensive API to help you with those tasks.
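As a minimal sketch of that API (the temporary environment path here is just an example), a program can create an environment directly through venv.EnvBuilder:

```python
import sys
import tempfile
import venv
from pathlib import Path

# Create a throwaway environment using the stdlib API rather than the CLI.
# with_pip=False keeps the example fast; use with_pip=True for a usable env.
env_dir = Path(tempfile.mkdtemp()) / "demo-env"
builder = venv.EnvBuilder(with_pip=False, clear=True)
builder.create(env_dir)

# The new interpreter lives under bin/ (Scripts\ on Windows).
bin_dir = "Scripts" if sys.platform == "win32" else "bin"
print("marker file present:", (env_dir / "pyvenv.cfg").exists())
print("interpreter present:", any((env_dir / bin_dir).glob("python*")))
```

EnvBuilder also exposes overridable hooks (e.g. post_setup) that a management tool could use to install packages into the freshly created environment.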
Nowadays there are many options for creating Python virtual environments. Besides those you mention, Anaconda allows the creation and management of environments, and even works pretty well with pip most of the time.
The tools in the virtual environment space have been designed to work together with standard Python distribution tools as far as possible, but the arrival of venv in Python 3.5 did not invalidate either virtualenv or virtualenvwrapper, which should both still work fine.
The venv module is principally a simple toolkit to allow the creation of virtualenvs inside programs, and is by no means intended to replace the convenience of virtualenvwrapper. It simply meets a rather different set of needs.

Related

What directory do I install a virtualenvironment? [duplicate]

I'm confused as to where I should put my virtualenvs.
With my first django project, I created the project with the command
django-admin.py startproject djangoproject
I then cd'd into the djangoproject directory and ran the command
virtualenv env
which created the virtual environment directory at the same level as the inner djangoproject directory.
Is this the wrong place in which to create the virtualenv for this particular project?
I'm getting the impression that most people keep all their virtualenvs together in an entirely different directory, e.g. ~/virtualenvs, and then use virtualenvwrapper to switch back and forth between them.
Is there a correct way to do this?
Many people use the virtualenvwrapper tool, which keeps all virtualenvs in the same place (the ~/.virtualenvs directory) and allows shortcuts for creating and keeping them there. For example, you might do:
mkvirtualenv djangoproject
and then later:
workon djangoproject
It's probably a bad idea to keep the virtualenv directory in the project itself, since you don't want to distribute it (it might be specific to your computer or operating system). Instead, keep a requirements.txt file using pip:
pip freeze > requirements.txt
and distribute that. This will allow others using your project to reinstall all the same requirements into their virtualenv with:
pip install -r requirements.txt
Changing the location of the virtualenv directory breaks it
This is one advantage of putting the directory outside of the repository tree, e.g. under ~/.virtualenvs with virtualenvwrapper.
Otherwise, if you keep it in the project tree, moving the project location will break the virtualenv.
See: Renaming a virtualenv folder without breaking it
There is virtualenv --relocatable, but it is known not to be perfect.
Another minor advantage: you don't have to .gitignore it.
The advantages of putting it gitignored in the project tree itself are:
keeps related stuff close together.
you will likely never reuse a given virtualenv across projects, so putting it somewhere else does not give much advantage
This is an annoying design flaw in my opinion. They should implement virtualenv in a way where it does not matter where the directory is, since storing it in-tree is simpler and more isolated. Node.js' npm package manager does it without any problem. And while we are at it: pip should just use local directories by default, just like npm. Having this separate virtualenv layer is wonky. Node.js just has npm, which does it all without extra typing. I can't believe I'm praising the JavaScript ecosystem on a Python post, but it's true.
The generally accepted place to put them is the same place that the default installation of virtualenvwrapper puts them: ~/.virtualenvs
Related: virtualenvwrapper is an excellent tool that provides shorthands for the common virtualenv commands. http://www.doughellmann.com/projects/virtualenvwrapper/
If you use pyenv to install Python, then pyenv-virtualenv is a best practice. If you set a .python-version file, it can automatically activate or deactivate the virtual env when you change working directory. pyenv-virtualenv also puts all virtual envs into the $HOME/.pyenv/versions folder.
From my personal experience, I would recommend organizing all virtual environments in one single directory, unless you have an extremely sharp memory and can remember files/folders scattered across the file system.
I'm not a big fan of using other tools just to manage virtual environments. In VSCode, if I configure the directory containing all virtual environments (python.venvPath), it can automatically recognize all of them.
For Anaconda installations of Python, the conda create command puts environments in a directory within the anaconda3 folder by default. Specifically (for Windows):
C:\Users\username\anaconda3\envs
This allows other conda commands to work without specifying the path. One advantage, not noted above, is that putting environments in the project folder allows you to use the same name for all of them (but that is not much of an advantage for me). For more info, see:
https://conda.io/projects/conda/en/latest/user-guide/tasks/manage-environments.html

PyCharm's venv relies on a physical python installation?

I trusted PyCharm to let it create a venv for me, and then shipped the code project with the venv folder. Inside my code, whenever calling a separate script, I use the venv Python under venv/Scripts/.
To my surprise, it breaks when running on another machine, where the venv Python actually points to my local installation of Python at C:\Python\Python37. This defeats the purpose of creating a venv!
What have I done wrong? My project structure is like
code/*.py
venv
And this is what I shipped to another machine.
Yes, venv creates a link to your original Python installation.
The best practice for shipping "venv requirements" is to create a requirements file:
pip freeze > requirements.txt
Then just add this file to your project folder.
P.S.
There is a better alternative called Conda. You can specify the Python version in conda envs.
Thanks to @AndreyG's answer. In addition, a bunch of other similar questions were helpful, like:
Copy complete virtualenv to another pc
Port Python virtualenv to another system
Especially after reading this tutorial: https://realpython.com/python-virtual-environments-a-primer/,
I realized my venv misconception, which was partly due to my previous experience shipping among Mac machines, where Python was usually preinstalled.
Now, combined with the pip-freeze approach to shipping site-packages, I also ship the Python Windows installer and automate the installation, to make sure that the venv symlinks point to the exact same Python installation on all machines. Reference:
https://docs.python.org/3/using/windows.html
The automation code:
python-<version>-<arch>.exe /quiet InstallAllUsers=1 PrependPath=1 TargetDir="C:\Python\Python<version>"

Virtual environment and python versions for different projects

Let me first outline my desired solution and then elaborate on a specific question how to achieve this state.
I'm soon starting two coding projects in Python. I've used Python before, but never on such big projects. My ideal scenario would be a setup where I can run virtual environments and different Python versions for various projects. Some research pointed me to virtualenv / virtualenvwrapper and pyenv. It seems that using pyenv-virtualenv or pyenv-virtualenvwrapper there is a nice way to specify the virtual environment and Python version for a specific project.
Question: Once I've set up a virtual environment and Python version for a specific project, how easily could I switch at a later stage to a newer Python version? Let's say I've started project A with Python 3.4, and in one year I would like to move everything to Python 3.6. Is this possible in a neat way?
Sure:
$ rm -r my-python-3.4-env
$ virtualenv -p python3.6 my-python-3.6-env
$ source my-python-3.6-env/bin/activate
In other words, each virtual environment is just a folder with the necessary files in it. You "activate" an environment with the source .../activate command (in case of virtualenv) and you leave it just as easily. To switch to a different environment you simply create a new one with a specific Python executable and activate it.
What you want to be careful about is to keep your installation repeatable, meaning if you depend on external modules (which modern projects typically do), you don't want to install each dependency by hand and instead automate that. For instance, you create a setuptools setup.py file which lists your dependencies, and then have it install them into your new environment automatically:
$ source my-python-3.6-env/bin/activate
(my-python-3.6-env) $ python setup.py develop

Python, virtualenv: telling IDE to run the debug session in virtualenv

My project's running fine in virtualenv. Unfortunately, I can't yet run it via my IDE (Eric) because of the import troubles. It stands to reason, as I never told the IDE anything about virtualenv.
I know the drill ($ source project/bin/activate etc.), but lack general understanding. What constitutes "running inside virtualenv"? What IDE options might be relevant?
I think the only required setting to run or debug code is the path to the Python interpreter.
Relevant IDE options could be SDK or Interpreter settings.
Note that you should run not the default Python (e.g. /usr/bin/python) but the python binary in your virtual environment (e.g. /path/to/virtualenv/bin/python).
Also, there are some environment variables set by activate, but I think they aren't needed when you point to the virtualenv's python binary directly.
So, again, what activate does is only environment variable setup: at least, it modifies the system $PATH so that the python and pip commands point to the executables under the path/to/virtualenv/bin directory.
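In fact, you can check this from inside Python itself: a script launched via the virtualenv's interpreter reports a different sys.prefix even if activate was never run. A minimal sketch:

```python
import sys

# Outside a virtual environment, sys.prefix equals the base installation
# prefix. Inside one, sys.prefix points at the environment directory,
# while base_prefix (or real_prefix, set by old virtualenv versions)
# still points at the original installation.
base = getattr(sys, "real_prefix", None) or getattr(sys, "base_prefix", sys.prefix)
in_venv = sys.prefix != base
print("interpreter:", sys.executable)
print("running inside a virtual environment:", in_venv)
```

Running this with /path/to/virtualenv/bin/python should report True with no activation step; an IDE pointed at that binary gets the same effect.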
As far as I know, it is possible to run your script/project using your virtualenv simply by calling /path/to/your/venv/bin/python your_script.py. To install new packages in your venv, for example, you would run /path/to/your/venv/bin/pip install some_package.
I guess the main advantage of "running inside a virtualenv" is not having to spell out the location of the Python packages/executable every time you want to run a script. But I lack general understanding too.
I usually create a virtualenv with the --no-site-packages option in order to have a "clean" installation of Python.
--- EDIT ---
The second answer of this discussion has a nice explanation.
Simply look at project/bin/activate; everything you need to set up the appropriate search path is there.
Usually the most important variable is PYTHONPATH, which should point to the site-packages directory.

How to get virtualenv to use dist-packages on Ubuntu?

I know that virtualenv, if not passed the --no-site-packages argument when creating a new virtual environment, will link the packages in /usr/local/lib/python2.7/site-packages (for Python 2.7) with a newly-created virtual environment. On Ubuntu 12.04 LTS, I have three locations where Python 2.7 packages can be installed (using the default, Ubuntu-supplied Python 2.7 installation):
/usr/lib/python2.7/dist-packages: this has my global installation of ipython, scipy, numpy, matplotlib – packages that I would find difficult and time-consuming to install individually (with all their dependencies) if they were not available via the scipy stack.
/usr/local/lib/python2.7/site-packages: this is empty, and I think it will stay that way on Ubuntu unless I install a package from source.
/usr/local/lib/python2.7/dist-packages: this has very important local packages for astronomy, notably those related to PyRAF, STScI, etc., and they are extremely difficult and time-consuming to install individually.
Note that a global directory such as /usr/lib/python2.7/site-packages does not exist on my system. Note also that my global installation of ipython, scipy, etc. lets me use those packages on-the-fly without having to source/activate a virtual environment every time.
Naturally, I now want to use virtualenv to create one virtual environment in my user home directory which I will source/activate for my future projects. However, I would like this virtual environment, while being created, to link/copy all of my packages in locations (1) and (3) in the list above. The main reason for this is that I don't want to go through the pip install process (if it is even possible) to re-install ipython, scipy, the astro-packages, etc. for this (and maybe other) virtual environments.
Here are my questions:
Is there a way for me to specify to virtualenv that I would like it to link/copy packages in these two dist-packages directories for virtual environments that are created in the future?
When I eventually update my global installation of scipy, ipython, etc. in the two dist-packages directories, will this also update/change the packages that my virtual environment uses (and which it originally got during virtualenv creation)?
If I ever install a package from source on Ubuntu, will it go in /usr/local/lib/python2.7/dist-packages, or /usr/local/lib/python2.7/site-packages?
Thanks in advance for your help!
This might be a legitimate use of PYTHONPATH - an environment variable that virtualenv doesn't touch and which uses the same syntax as the PATH environment variable, e.g. in bash: PYTHONPATH=/usr/lib/python2.7/dist-packages:/usr/local/lib/python2.7/dist-packages in a .bashrc or similar. If you followed this path:
You don't have to tell your virtual environment about this at all, it won't try to change it.
No relinking will be required, and
That will still go wherever it would have gone (pip install always uses /usr/local/lib/python2.7/dist-packages/ on my Ubuntu) if you install them outside of your virtual environment. If you install them from within your virtual environment (while it's activated), then of course they'll be put in the virtual environment.
I'm just getting my head around virtualenv, but there seems to be an easier way than those mentioned so far.
Since virtualenv 1.7, --no-site-packages has been the default behavior.
Therefore, using the --system-site-packages flag to virtualenv is all that is needed to get dist-packages on your path - if you use the tweaked virtualenv shipped by Ubuntu. (This answer and this one give some useful history.) I've tested this and it does work.
$ virtualenv --system-site-packages .
I agree with Thomas here - I can't see any action required in virtualenv to see the effect of updates in dist-packages.
Having tested that with python setup.py install, it does (again, as Thomas said) still go to dist-packages. You could change that by building your own Python, but that's a bit extreme.
PYTHONPATH works for me.
vim ~/.bashrc
add this line below:
export PYTHONPATH=$PYTHONPATH:/usr/lib/python2.7/dist-packages:/usr/local/lib/python2.7/dist-packages
source ~/.bashrc
In the site-packages directory of your virtualenv, create a file dist.pth
In the file dist.pth, put the following:
../dist-packages
Now deactivate and activate your virtualenv. You should be set.
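Those steps can be sketched from the shell as follows. This uses python3 -m venv and a temporary path purely for illustration (the original answer targets an Ubuntu Python 2.7 virtualenv, where site-packages would live under ~/yourenv/lib/python2.7/site-packages):

```shell
# Create a throwaway environment (placeholder path, for illustration only)
VENV="$(mktemp -d)/demo"
python3 -m venv "$VENV"

# Locate the environment's site-packages directory by asking its interpreter
SITE=$("$VENV/bin/python" -c "import sysconfig; print(sysconfig.get_paths()['purelib'])")

# Drop a .pth file whose entry is appended to that environment's sys.path
echo "../dist-packages" > "$SITE/dist.pth"
cat "$SITE/dist.pth"
```

Each line of a .pth file is treated as an extra sys.path entry, resolved relative to the directory containing the file, which is why the relative ../dist-packages works here.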
What you want to achieve here is essentially add specific folder (dist-packages) to Python search path. You have a number of options for this:
Use path configuration (.pth) file, entries will be appended to the system path.
Modify PYTHONPATH (entries from it go to the beginning of system path).
Modify sys.path directly from your Python script, i.e. append required folders to it.
I think that for this particular case (enabling the global dist-packages folder) the third option is better, because with the first option you have to create a .pth file for every virtualenv you'll be working in (with some external shell script?), and it's easy to forget when you distribute your package. The second option requires run-time setup (adding an env var), which is, again, easy to miss.
Only the third option doesn't require any prerequisites at configure- or run-time and can be distributed without issues (on the same type of system, of course).
You can use function like this:
def enable_global_distpackages():
    import sys
    sys.path.append('/usr/lib/python2.7/dist-packages')
    sys.path.append('/usr/local/lib/python2.7/dist-packages')
And then in the __init__.py file of your package:
enable_global_distpackages()
