In this question on S/O:
Can existing virtualenv be upgraded gracefully?
The accepted answer says that you can:
use the Python 2.6 virtualenv to "revirtual" the existing directory
I cannot seem to find any details on how to "revirtual" an existing virtualenv. I know how to manually install python, but I am looking very specifically for the "python / virtualenv" way to upgrade python inside a specific virtualenv.
In my very specific situation, I have become an administrator in someone else's stead. The system Python is 2.6.6; however, the virtualenvs are all using 2.7.4, with an interpreter inside the virtualenv at a path something like:
/home/user_name/.virtualenvs/application_name/bin/python2.7
Yet there is no Python 2.7.x on the system. I cannot find any evidence of Python having been manually installed, and I cannot find any details online about using pip, apt-get/yum, or anything else to install different versions of Python within a virtualenv.
So, my very specific questions are:
How does one "revirtual" a virtualenv?
How does one upgrade python within a virtualenv the "python" way?
Are there alternative ways to manage python versions within virtualenvs?
Thanks, and please let me know if I can clarify my questions in any way!
-S
Usual caveat: make backup copies first. However, the structure created by virtualenv is not that complex, so it should be possible to either find what you need or create a new one and migrate. The path with ~/.virtualenvs in it means it was probably created with virtualenvwrapper so you could read up on that too.
virtualenv makes copies of the executables in the bin directory, which means they don't have to exist elsewhere. In the lib and include directories, however, there will by default be symlinks to items from the "source" Python (unless someone changed that setting). You could run ls -l in those directories and maybe you will find where Python 2.7 is installed, or maybe some broken symlinks.
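For example, using the path from the question (a hypothetical session; your paths will differ):

$ ls -l /home/user_name/.virtualenvs/application_name/bin/
$ ls -l /home/user_name/.virtualenvs/application_name/lib/

Any symlink targets that point outside the virtualenv tell you where that Python 2.7 actually lives; broken targets tell you where it used to live.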
Under lib should be one or more python-<version> directories, with some symlinks (probably) to python standard library stuff and a site-packages directory, which is where your installed packages are.
Now, upgrading. If you try to update the virtualenv "in-place" (i.e. you just run virtualenv --python=python<version> <existing-directory>), you'll probably run into two issues: (1) the bin/python symlink will not be replaced unless you delete/move it, and (2) the directories under lib are per Python version, so if you don't stay on 2.7 the site-packages directory won't automatically carry over; you'll have to reinstall packages, or go in and move it, which will probably work unless you have compiled binary extensions, are migrating to 3.x, or something else weird happens.
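If you do want to try the in-place route, a minimal sketch might look like this (assuming Python 2.7 is the target and you have a backup):

$ cd /home/user_name/.virtualenvs/application_name
$ mv bin/python bin/python.orig          # move the old symlink aside so it gets recreated
$ virtualenv --python=python2.7 .        # re-run virtualenv over the existing directory

Afterwards, check that bin/python points at the interpreter you expect, and be prepared to reinstall packages as described above.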
An alternative, cleaner approach is to create a new, fresh virtualenv and then reinstall the necessary packages. Find your own app's package(s) first and figure out whether they have a setup.py that works or are just code sitting in a directory. If they don't have a setup.py that correctly brings in all dependencies, then in your existing virtualenv directory you can run ./bin/pip freeze to get a listing of installed packages. Save this to a file and use pip install -r <filename> on it later.
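A minimal sketch of that migration, assuming the old environment still has a working pip (the application_name_new name is just for illustration):

$ cd /home/user_name/.virtualenvs/application_name
$ ./bin/pip freeze > /tmp/requirements.txt
$ virtualenv --python=python2.7 ~/.virtualenvs/application_name_new
$ ~/.virtualenvs/application_name_new/bin/pip install -r /tmp/requirements.txt

Once the new environment works, you can retire the old one.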
Related
I'm confused as to where I should put my virtualenvs.
With my first django project, I created the project with the command
django-admin.py startproject djangoproject
I then cd'd into the djangoproject directory and ran the command
virtualenv env
which created the virtual environment directory at the same level as the inner djangoproject directory.
Is this the wrong place in which to create the virtualenv for this particular project?
I'm getting the impression that most people keep all their virtualenvs together in an entirely different directory, e.g. ~/virtualenvs, and then use virtualenvwrapper to switch back and forth between them.
Is there a correct way to do this?
Many people use the virtualenvwrapper tool, which keeps all virtualenvs in the same place (the ~/.virtualenvs directory) and provides shortcuts for creating them and switching between them. For example, you might do:
mkvirtualenv djangoproject
and then later:
workon djangoproject
It's probably a bad idea to keep the virtualenv directory in the project itself, since you don't want to distribute it (it might be specific to your computer or operating system). Instead, keep a requirements.txt file using pip:
pip freeze > requirements.txt
and distribute that. This will allow others using your project to reinstall all the same requirements into their virtualenv with:
pip install -r requirements.txt
Changing the location of the virtualenv directory breaks it
This is one advantage of putting the directory outside of the repository tree, e.g. under ~/.virtualenvs with virtualenvwrapper.
Otherwise, if you keep it in the project tree, moving the project location will break the virtualenv.
See: Renaming a virtualenv folder without breaking it
There is --relocatable, but it is known not to be perfect.
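You can see why moving breaks things by looking at the scripts virtualenv generates: their shebang lines hardcode the absolute path of the environment's interpreter. For example (illustrative output; the exact path depends on your machine):

$ head -1 ~/.virtualenvs/djangoproject/bin/pip
#!/home/user/.virtualenvs/djangoproject/bin/python

Move the directory and that interpreter path no longer exists, so every script in bin breaks.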
Another minor advantage: you don't have to .gitignore it.
The advantages of putting it gitignored in the project tree itself are:
keeps related stuff close together.
you will likely never reuse a given virtualenv across projects, so putting it somewhere else does not give much advantage
This is an annoying design flaw in my opinion. virtualenv should be implemented in a way where it does not matter where the directory is, since storing it in-tree is simpler and more isolated. Node.js' npm package manager does this without any problem. And while we are at it: pip should just use local directories by default, just like npm. Having this separate virtualenv layer is wonky. Node.js just has npm, which does it all without extra typing. I can't believe I'm praising the JavaScript ecosystem on a Python post, but it's true.
The generally accepted place to put them is the same place that the default installation of virtualenvwrapper puts them: ~/.virtualenvs
Related: virtualenvwrapper is an excellent tool that provides shorthands for the common virtualenv commands. http://www.doughellmann.com/projects/virtualenvwrapper/
If you use pyenv to install Python, then pyenv-virtualenv is a best practice. If you set a .python-version file, it can automatically activate or deactivate the virtual environment when you change working directories. pyenv-virtualenv also puts all virtual environments into the $HOME/.pyenv/versions folder.
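A typical pyenv-virtualenv session might look like this (version number and project path are just for illustration):

$ pyenv install 2.7.4                       # install the interpreter itself
$ pyenv virtualenv 2.7.4 application_name   # create a virtualenv based on it
$ cd ~/projects/application_name
$ pyenv local application_name              # writes .python-version; auto-activates here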
From my personal experience, I would recommend organizing all virtual environments in one single directory, unless you have an extremely sharp memory and can remember files and folders scattered across the file system.
I'm not a big fan of using other tools just to manage virtual environments. In VSCode, if I configure the python.venvPath setting to point at the directory containing all my virtual environments, it can automatically recognize all of them.
For Anaconda installations of Python, the "conda create" command puts environments in a directory within the anaconda3 folder by default. Specifically (for Windows):
C:\Users\username\anaconda3\envs
This allows other conda commands to work without specifying the path. One advantage, not noted above, of putting environments in the project folder is that you can use the same name for all of them (though that is not much of an advantage for me). For more info, see:
https://conda.io/projects/conda/en/latest/user-guide/tasks/manage-environments.html
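As a rough sketch (environment name and version are just for illustration):

$ conda create --name myenv python=3.9      # lands under anaconda3/envs/myenv by default
$ conda activate myenv

If you do want a project-local environment instead, conda create --prefix ./envs python=3.9 puts it in the project folder, and you activate it with conda activate ./envs.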
There is a Python library that I want to help out with by fixing some issues, but I don't know how to test my changes, given the complexity of how Python/pip installs libraries.
I have the library installed with pip, and I can run Python code against it by doing a "from <library> import *". But now that I want to make changes to it, I pulled the code with git and plan to branch to work on my changes. That's fine; I will then submit a pull request to merge my changes, provided the tests pass.
But after I make a change, how do I integrate it into Python to test out the changes I made to the library? Can pip install my custom/modified version of the library?
I have looked around and haven't successfully found an answer to this but perhaps I'm not looking in the right spot.
Can pip install my custom/modified version of the library?
Yes.
There are various ways of approaching this question. A common solution is the use of Python virtual environments. This allows you to create an isolated Python environment that does not share the same packages as your system Python install. You can then install things into it (such as your modified Python library) to test it out.
To get started, you need the virtualenv tool. This is probably available as a package for your distribution, but you can also install it using pip. Once you have it, you can run in the same directory as your code:
virtualenv .venv
This creates a virtualenv named .venv. You can call it anything you want, but naming it .venv (or anything starting with a .) means it won't clutter up the output of ls in your workspace.
Next, you need to activate the virtualenv:
. .venv/bin/activate
This modifies your $PATH to place the virtualenv at the front of the list of directories. Now when you type python or pip, you'll be using the virtualenv version.
If your code has a setup.py file, you can install it like this:
pip install -e .
The -e means you want to perform an "editable" install, which means python will use the code "in place", and any changes you make will be immediately visible to the code you use for testing.
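A quick way to convince yourself the editable install worked, assuming a hypothetical package named mylib:

$ pip show mylib      # the Location field should point at your working copy, not site-packages
$ python -c "import mylib; print(mylib.__file__)"

Edit a file in your working copy, rerun the import, and the change is picked up immediately.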
When you're done, you can run:
deactivate
This will remove the changes that activate made to your environment.
For more information:
Pipenv & Virtual Environments discusses a higher level tool for managing virtual environments.
Virtualenvwrapper is another take on a higher level management tool.
I've installed a Python-based package in my site-packages directory. However, I'm trying to learn how the code works, so I'd like to hack it a bunch by putting in lots of print statements so I can understand what the code is doing. But at the end of the day I want a clean installation without all my hacks in it.
Of course I could just copy the original files to something else, make some hacks, and then at the end copy all the original files back over. But that's really tedious. At the very least, I'd like to install a local copy of the Python package and then have the python script use this copy preferentially (perhaps by suitable statements at the top of the script). But perhaps this isn't even the best way to do python development/hacking.
What's the best solution for my problem? I want to be able to hack on the package (and use that package) but without messing up my clean version.
Take a look at virtualenv. You can basically set up a local Python environment in which you can install anything you like without having to mess around with the system environment.
The virtualenv advice given is correct; depending on your actual package, you can even go beyond that and not mess with the site-packages inside the virtualenv at all.
If the package is setuptools-based, a simple
$ python setup.py develop
from within a working copy of its source means it won't be copied in; instead it is just hooked into the virtualenv, pointing at the working copy. Advantage: you can edit (and e.g. roll back using Git or whatever SCM the package maintainer uses) files in a well-defined and non-volatile location.
This is what the Python virtualenv tool is for. It allows you to create a local Python environment with a set of packages distinct from your system installation. For example, I could do something like this:
$ virtualenv myenv
$ . myenv/bin/activate
$ pip install nifty-module
The activate script modifies your PATH so that any script that starts with:
#!/usr/bin/env python
will use the Python from your virtual environment, rather than the system Python, and will see the modules installed in that environment.
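You can check this with which; the paths below are illustrative:

$ which python
/usr/bin/python
$ . myenv/bin/activate
(myenv) $ which python
/home/user/myenv/bin/python

The point is simply that after activation, python resolves inside the virtualenv.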
I know that virtualenv, if not passed the --no-site-packages argument when creating a new virtual environment, will link the packages in /usr/local/lib/python2.7/site-packages (for Python 2.7) with a newly-created virtual environment. On Ubuntu 12.04 LTS, I have three locations where Python 2.7 packages can be installed (using the default, Ubuntu-supplied Python 2.7 installation):
1. /usr/lib/python2.7/dist-packages: this has my global installation of ipython, scipy, numpy, matplotlib – packages that I would find difficult and time-consuming to install individually (with all their dependencies) if they were not available via the scipy stack.
2. /usr/local/lib/python2.7/site-packages: this is empty, and I think it will stay that way on Ubuntu unless I install a package from source.
3. /usr/local/lib/python2.7/dist-packages: this has very important local packages for astronomy, notably those related to PyRAF, STScI, etc., and they are extremely difficult and time-consuming to install individually.
Note that a global directory such as /usr/lib/python2.7/site-packages does not exist on my system. Note also that my global installation of ipython, scipy, etc. lets me use those packages on-the-fly without having to source/activate a virtual environment every time.
Naturally, I now want to use virtualenv to create one virtual environment in my user home directory which I will source/activate for my future projects. However, I would like this virtual environment, while being created, to link/copy all of my packages in locations (1) and (3) in the list above. The main reason for this is that I don't want to go through the pip install process (if it is even possible) to re-install ipython, scipy, the astro-packages, etc. for this (and maybe other) virtual environments.
Here are my questions:
Is there a way for me to specify to virtualenv that I would like it to link/copy packages in these two dist-packages directories for virtual environments that are created in the future?
When I eventually update my global installation of scipy, ipython, etc. in the two dist-packages directories, will this also update/change the packages that my virtual environment uses (and which it originally got during virtualenv creation)?
If I ever install a package from source on Ubuntu, will it go in /usr/local/lib/python2.7/dist-packages, or /usr/local/lib/python2.7/site-packages?
Thanks in advance for your help!
This might be a legitimate use of PYTHONPATH, an environment variable that virtualenv doesn't touch. It uses the same syntax as the PATH environment variable; in bash, you could put PYTHONPATH=/usr/lib/python2.7/dist-packages:/usr/local/lib/python2.7/dist-packages in a .bashrc or similar. If you followed this path, then to answer your three questions:
1. You don't have to tell your virtual environment about this at all; it won't try to change it.
2. No relinking will be required.
3. Packages installed from source will still go wherever they would have gone (pip install always uses /usr/local/lib/python2.7/dist-packages/ on my Ubuntu) if you install them outside of your virtual environment. If you install them from within your virtual environment (while it's activated), then of course they'll be put in the virtual environment.
I'm just getting my head around virtualenv, but there seems to be an easier way than mentioned so far.
Since virtualenv 1.7, --no-site-packages has been the default behavior.
Therefore using the --system-site-packages flag to virtualenv is all that is needed to get dist-packages in your path - if you use the tweaked virtualenv shipped by Ubuntu. (This answer and this one give some useful history). I've tested this and it does work.
$ virtualenv --system-site-packages .
I agree with Thomas here - I can't see any action required in virtualenv to see the effect of updates in dist-packages.
Having tested that with python setup.py install, it does (again as Thomas said) still go to dist-packages. You could change that by building your own python, but that's a bit extreme.
PYTHONPATH works for me.
vim ~/.bashrc
add this line below:
export PYTHONPATH=$PYTHONPATH:/usr/lib/python2.7/dist-packages:/usr/local/lib/python2.7/dist-packages
source ~/.bashrc
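You can confirm the directories were picked up by printing the effective search path:

$ python -c "import sys; print('\n'.join(sys.path))"

The two dist-packages directories should appear near the front of the list.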
In the directory site-packages, create a file dist.pth
In the file dist.pth, put the following:
../dist-packages
Now deactivate and activate your virtualenv. You should be set.
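As a one-line sketch of the same idea, using an absolute entry so it doesn't depend on the virtualenv's internal layout (the lib path varies with your Python version, and $VIRTUAL_ENV is only set while the environment is activated):

$ echo '/usr/local/lib/python2.7/dist-packages' > $VIRTUAL_ENV/lib/python2.7/site-packages/dist.pth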
What you want to achieve here is essentially to add a specific folder (dist-packages) to the Python search path. You have a number of options for this:
Use a path configuration (.pth) file; its entries will be appended to the system path.
Modify PYTHONPATH (entries from it go to the beginning of system path).
Modify sys.path directly from your Python script, i.e. append required folders to it.
I think that for this particular case (enabling the global dist-packages folder) the third option is better, because with the first option you have to create a .pth file for every virtualenv you'll be working in (with some external shell script?). It's easy to forget when you distribute your package. The second option requires run-time setup (adding an environment variable), which is, again, easy to miss.
Only the third option requires no prerequisites at configuration or run time, and it can be distributed without issues (to the same type of system, of course).
You can use a function like this:
def enable_global_distpackages():
    import sys
    # Append the system-wide dist-packages directories to the module search path.
    sys.path.append('/usr/lib/python2.7/dist-packages')
    sys.path.append('/usr/local/lib/python2.7/dist-packages')
And then in the __init__.py file of your package:
enable_global_distpackages()
I have a local git repository on my machine, let's say under /develop/myPackage.
I'm currently developing it as a python package (a Django app) and I would like to access it from my local virtualenv. I've tried to include its path in my PYTHONPATH (I'm on a Mac)
export PATH="$PATH:/develop/myPackage"
The directory already contains a __init__.py within its root and within each subdirectory.
No matter what I do, I can't get it to work; Python won't see my package.
The alternatives are:
Push my local change to github and install the package within my virtualenv from there with pip
Activate my virtualenv and install the package manually with python setup.py install
Since I often need to make changes to my code, the last two solutions would require too much work all the time, even for a small change.
Am I doing something wrong? Would you suggest a better solution?
Install it in editable mode from your local path:
pip install -e /develop/myPackage
This actually symlinks the package within your virtualenv so you can keep on devving and testing.
The example you show above uses PATH, not PYTHONPATH. Generally, the search path used by Python is partially predicated on the PYTHONPATH environment variable (PATH is of little use in this case).
Try this:
export PYTHONPATH=$PYTHONPATH:/develop/myPackage
Though in reality, you likely want it to point at the directory that contains your package (so you can do 'import myPackage' rather than importing things from within the package). That being said, you likely want:
export PYTHONPATH=$PYTHONPATH:/develop/
Reference the python docs here for more information about Python's module/package search path: http://docs.python.org/2/tutorial/modules.html#the-module-search-path
By default, Python uses the packages that it was installed with as its default path, and as a result PYTHONPATH is unset in the environment.
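Once PYTHONPATH is set as shown above, a quick sanity check that the package resolves from the right place (using the names from the question):

$ export PYTHONPATH=$PYTHONPATH:/develop/
$ python -c "import myPackage; print(myPackage.__file__)"

If this prints a path under /develop/myPackage, the search path is set up correctly.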