I'm confused as to where I should put my virtualenvs.
With my first django project, I created the project with the command
django-admin.py startproject djangoproject
I then cd'd into the djangoproject directory and ran the command
virtualenv env
which created the virtual environment directory at the same level as the inner djangoproject directory.
Is this the wrong place in which to create the virtualenv for this particular project?
I'm getting the impression that most people keep all their virtualenvs together in an entirely different directory, e.g. ~/virtualenvs, and then use virtualenvwrapper to switch back and forth between them.
Is there a correct way to do this?
Many people use the virtualenvwrapper tool, which keeps all virtualenvs in one place (the ~/.virtualenvs directory) and provides shortcuts for creating and working on them. For example, you might do:
mkvirtualenv djangoproject
and then later:
workon djangoproject
It's probably a bad idea to keep the virtualenv directory in the project itself, since you don't want to distribute it (it might be specific to your computer or operating system). Instead, keep a requirements.txt file using pip:
pip freeze > requirements.txt
and distribute that. This will allow others using your project to reinstall all the same requirements into their virtualenv with:
pip install -r requirements.txt
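For illustration, a generated requirements.txt is just a list of pinned package versions; the entries below are hypothetical, not taken from the question's project:
Django==1.11.3
pytz==2017.2
Anyone cloning the project can then rebuild an identical environment from that one file.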
Changing the location of the virtualenv directory breaks it
This is one advantage of putting the directory outside of the repository tree, e.g. under ~/.virtualenvs with virtualenvwrapper.
Otherwise, if you keep it in the project tree, moving the project location will break the virtualenv.
See: Renaming a virtualenv folder without breaking it
There is a --relocatable option, but it is known not to be perfect.
Another minor advantage: you don't have to .gitignore it.
The advantages of keeping it, gitignored, in the project tree itself are (a sample ignore entry is sketched after this list):
keeps related stuff close together.
you will likely never reuse a given virtualenv across projects, so putting it somewhere else does not give much advantage
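If you do go the in-tree route, the ignore rule is a single line; assuming the environment directory is named env, as in the question, the .gitignore entry might look like:
# .gitignore
env/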
This is an annoying design flaw in my opinion. They should implement virtualenv in a way where it does not matter where the directory is, since storing it in-tree is simpler and more isolated. Node.js' NPM package manager does it without any problem. And while we are at it: pip should just use local directories by default, just like NPM. Having this separate virtualenv layer is wonky. Node.js just has NPM, which does it all without extra typing. I can't believe I'm praising the JavaScript ecosystem on a Python post, but it's true.
The generally accepted place to put them is the same place that the default installation of virtualenvwrapper puts them: ~/.virtualenvs
Related: virtualenvwrapper is an excellent tool that provides shorthands for the common virtualenv commands. http://www.doughellmann.com/projects/virtualenvwrapper/
If you use pyenv to install Python, then pyenv-virtualenv is a good practice. If you set a .python-version file, it can automatically activate or deactivate the virtual environment when you change working directories. pyenv-virtualenv also puts all virtual environments into the $HOME/.pyenv/versions folder.
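A minimal sketch of that workflow, assuming pyenv-virtualenv is installed and its shell integration (pyenv virtualenv-init) is enabled; the Python version and project name here are illustrative:
pyenv virtualenv 3.11.4 djangoproject    # env is stored under ~/.pyenv/versions
cd ~/projects/djangoproject
pyenv local djangoproject                # writes .python-version; env auto-activates on cd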
From my personal experience, I would recommend organizing all virtual environments in one single directory, unless you have an extremely sharp memory and can remember files/folders scattered across the file system.
I am not a big fan of using other tools just to manage virtual environments. In VSCode, if I configure python.venvPath to point at the directory containing all my virtual environments, it can automatically recognize all of them.
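For illustration, the corresponding entry in a VSCode user settings.json might look like this (the path is an assumption; point it at wherever you keep your environments):
{
    "python.venvPath": "~/virtualenvs"
}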
For Anaconda installations of Python, the "conda create" command puts the new environment in a directory within the anaconda3 folder by default. Specifically (for Windows):
C:\Users\username\anaconda3\envs
This allows other conda commands to work without specifying the path. One advantage, not noted above, is that putting environments in the project folder allows you to use the same name for all of them (but that is not much of an advantage for me). For more info, see:
https://conda.io/projects/conda/en/latest/user-guide/tasks/manage-environments.html
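A minimal sketch of that default behavior (the environment name and Python version are illustrative):
conda create --name myproject python=3.9    # lands under anaconda3\envs by default
conda activate myproject                    # no path needed; conda knows its envs directory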
My question is: do I have to install Django every single time in my virtual environment in order to run my Python files? And is this taking up a bunch of space on my machine? My project also uses matplotlib, and every virtual environment I create also requires me to install the matplotlib module. It's getting annoying. Do I have to do this every time?
I'm new to Django. I wanted to run some Python files in Django, but they weren't working, so after some research I found out I needed to run my PyCharm project in a virtual environment in order to run these Python files.
My folders look like this: pycharmProjects -> my project
I enter pycharmProjects and set up a virtual environment using "pipenv shell". Then I run "python3 manage.py runserver". It turns out I must install Django in the virtual environment before the files run.
Short answer is no, you don't have to use a virtual environment at all and can install your dependencies globally instead. However, you will soon find that it causes a lot of issues. The main reason to create a virtual environment is to take control of your dependencies and prevent bugs caused by them getting their wires crossed between projects.
Short answer: yes.
If you create a virtualenv, you have to install all the packages that your program needs.
Long answer:
You could install Django system-wide and then create a virtualenv with the option --system-site-packages; then Django would be used from your globally installed Python.
(Or you could install everything just in your global Python, but I personally don't think this is good practice.)
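As a sketch of that first option (the directory name is illustrative):
virtualenv --system-site-packages env
source env/bin/activate
python -c "import django"    # resolves against the system-wide install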
If you work with many different projects, I think you will avoid a lot of trouble if you use one virtualenv per project.
Trouble meaning that one project breaks because a pip install for another project changed the version of some package, and the first project can't handle the newer version.
I would recommend creating a requirements.txt file for each project that lists the dependencies; then, after creating the virtualenv, you can install them all with the following command:
pip install -r requirements.txt
If you have requirements.txt files, you can recreate virtualenvs rather quickly when going back to an old project, and you can delete virtualenvs whenever you run out of disk space. If you want to be on the safe side, run pip freeze > pipfreeze.txt prior to deleting the virtualenv, and use pip install -r pipfreeze.txt if you want to create one with the same modules and the same versions.
You also might want to look at direnv or autoenv if working on a Linux-like system.
These will automatically switch to the required virtualenv when changing to a project's working directory.
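As a sketch of the direnv approach, assuming the virtualenv lives in env/ inside the project:
# .envrc in the project root
source env/bin/activate
After running direnv allow once in that directory, direnv activates the environment when you cd in and unloads it when you cd out.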
I am wondering whether having an (untracked) virtual environment folder inside of your local Git clone is considered bad directory structure.
It seems cleaner to place the repository and the virtual environment in a single folder, but that is also more awkward and bulky.
Here are the two options I am considering:
A.
git_clone/
    virtual-environment/
B.
name_of_project/
    git_clone/
    virtual-environment/
This question is similar to this one, but for users/contributors instead of maintainers.
Is it bad to have my virtualenv directory inside my git repository?
To decide, I answer the question "will I reuse my virtual env in multiple projects?" If yes, then I place the virtual env outside the working tree. If no, then I place the virtual env inside the working tree.
As for reuse, projects often do not depend on the same set of libraries. Even when they do, they depend on specific versions of those libraries. So, having a dedicated virtual env inside the working tree is a simple way to avoid issues due to incorrect or extraneous dependencies in these situations.
In terms of cost, while dedicated virtual envs can lead to duplicates, most of my virtual envs are on the order of a few tens of MBs; a small price in space to avoid the hassles of incorrect dependencies. If extra space is indeed an issue, then virtual envs can be created, used, and deleted as and when required, given how easy it is to create them (e.g., via pip and requirements.txt).
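A sketch of that create-use-delete cycle (the directory name is illustrative):
python3 -m venv env                        # create
env/bin/pip install -r requirements.txt    # populate from the pinned list
# ... work on the project ...
rm -rf env                                 # delete; recreate the same way later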
Copy the requirements.txt out into a parent folder and use pip there. pip is like npm: it installs dependencies directly into the OS (unlike snapd), and while it shouldn't matter, better to be safe than sorry!
I am starting my first actual Python project. I followed "Learn Python the Hard Way" to make an initial Python skeleton, and I am using virtualenv too.
Now I want to use git for version control. According to some previous questions on SO, I should not commit any virtualenv files. Instead, I can use pip freeze > requirements.txt and a .gitignore entry to ignore the virtualenv directories.
However, both virtualenv and the Python project skeleton require a /bin directory; should I commit it as well? (Actually I don't really know what role /bin plays in a Python project.)
Any suggestions are appreciated, if there is something wrong with my process to set up a Python project, please correct me.
yourproject/bin is distinct from yourproject/env/bin, where yourproject/env is the virtual environment's directory (and neither of them is /bin in the root directory). You should ignore everything in env, and indeed, your project should work for someone who isn't using a virtual environment, or is managing it differently. Otherwise, you lose the benefits.
Let's imagine for a second that you finish your project, and I want to use it for a new task. I start a new project with a virtualenv of its own, and install some other components I want to use, and then yours. Oops, now I have an older version of Python than I started out with, and that bug in deactivate from two years ago somehow got resurrected. Imagine debugging that, let alone the annoyance of finding that your project replaced some files of mine.
(As these things go, the bin directory is a pretty static part of the virtual environment; zapping other parts of my private env would be much more destructive. If you committed lib you would have prevented me from installing any other components before yours.)
I have a Python project which contains three components: main executable scripts, modules which those scripts rely on, and data (sqlite3 databases, flat files, etc.) which those scripts manipulate. The top level has an __init__.py file so that other programs can also borrow from the modules if needed.
The question is, is it more "Pythonic" or "correct" to move my project into the default site-packages directory, or to modify PYTHONPATH to include one directory above my project (so that the project can be imported from)? On the one hand, what I've described is not strictly a "package", but a "project" with data that can be treated as a package. So I'm leaning in the direction of modifying PYTHONPATH (after all, PYTHONPATH must exist for a reason, right?)
Definitely do not add your project to site-packages; this will spoil your system Python installation and will backfire the moment some other app arrives there or you need to install something.
There are at least two popular options for installing Python apps in an isolated manner:
Using virtualenv
See the virtualenv project. It allows:
creation of a new isolated Python environment: the Python for this environment is different from the system one and has its own PYTHONPATH setup, which keeps all installed packages private to it.
activation and deactivation of a given virtualenv from the command line. After activating, you can run pip install etc. and it will affect only the given virtualenv.
calling any Python script by starting it with the virtualenv's Python copy: this will use the related virtualenv (note that there is no need to call activate); see the example just below.
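As an illustration of that last point (the paths and script name are assumptions, not from the answer):
~/.virtualenvs/myproject/bin/python myscript.py    # runs inside that virtualenv, no activate needed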
Using zc.buildout
This package provides the buildout command. With it, you can use a special configuration file to create a local Python environment with all packages and scripts.
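A minimal buildout.cfg sketch, with the part and egg names being illustrative rather than taken from any real project:
[buildout]
parts = app

[app]
recipe = zc.recipe.egg
eggs = myproject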
Conclusions
virtualenv seems more popular today and I find it much easier to learn and use
zc.buildout might be also working for you, but be prepared for a bit longer learning time
installing into system Python directories should always be reserved for very special cases (pip, easy_install); better to avoid it.
installing into private directories and manipulating PYTHONPATH is also an option, but you would be repeating what virtualenv already provides.
I know that virtualenv, if not passed the --no-site-packages argument when creating a new virtual environment, will link the packages in /usr/local/lib/python2.7/site-packages (for Python 2.7) with a newly-created virtual environment. On Ubuntu 12.04 LTS, I have three locations where Python 2.7 packages can be installed (using the default, Ubuntu-supplied Python 2.7 installation):
/usr/lib/python2.7/dist-packages: this has my global installation of ipython, scipy, numpy, matplotlib – packages that I would find difficult and time-consuming to install individually (and all their dependences) if they were not available via the scipy stack.
/usr/local/lib/python2.7/site-packages: this is empty, and I think it will stay that way on Ubuntu unless I install a package from source.
/usr/local/lib/python2.7/dist-packages: this has very important local packages for astronomy, notably those related to PyRAF, STScI, etc., and they are extremely difficult and time-consuming to install individually.
Note that a global directory such as /usr/lib/python2.7/site-packages does not exist on my system. Note also that my global installation of ipython, scipy, etc. lets me use those packages on-the-fly without having to source/activate a virtual environment every time.
Naturally, I now want to use virtualenv to create one virtual environment in my user home directory which I will source/activate for my future projects. However, I would like this virtual environment, while being created, to link/copy all of my packages in locations (1) and (3) in the list above. The main reason for this is that I don't want to go through the pip install process (if it is even possible) to re-install ipython, scipy, the astro-packages, etc. for this (and maybe other) virtual environments.
Here are my questions:
Is there a way for me to specify to virtualenv that I would like it to link/copy packages in these two dist-packages directories for virtual environments that are created in the future?
When I eventually update my global installation of scipy, ipython, etc. in the two dist-packages directories, will this also update/change the packages that my virtual environment uses (and which it originally got during virtualenv creation)?
If I ever install a package from source on Ubuntu, will it go in /usr/local/lib/python2.7/dist-packages, or /usr/local/lib/python2.7/site-packages?
Thanks in advance for your help!
This might be a legitimate use of PYTHONPATH, an environment variable that virtualenv doesn't touch and which uses the same syntax as the environment variable PATH. In bash, put something like this in a .bashrc or similar:
PYTHONPATH=/usr/lib/python2.7/dist-packages:/usr/local/lib/python2.7/dist-packages
If you followed this path:
1. You don't have to tell your virtual environment about this at all; it won't try to change it.
2. No relinking will be required.
3. A package installed from source will still go wherever it would have gone (pip install always uses /usr/local/lib/python2.7/dist-packages/ for my Ubuntu) if you install it outside of your virtual environment. If you install it from within your virtual environment (while it's activated), then of course it'll be put in the virtual environment.
I'm just getting my head around virtualenv, but there seems to be an easier way than mentioned so far.
Since virtualenv 1.7, --no-site-packages has been the default behavior.
Therefore, using the --system-site-packages flag with virtualenv is all that is needed to get dist-packages on your path - if you use the tweaked virtualenv shipped by Ubuntu. (This answer and this one give some useful history.) I've tested this and it does work.
$ virtualenv --system-site-packages .
I agree with Thomas here - I can't see any action required in virtualenv to see the effect of updates in dist-packages.
Having tested that with python setup.py install, it does (again as Thomas said) still go to dist-packages. You could change that by building your own python, but that's a bit extreme.
PYTHONPATH works for me.
vim ~/.bashrc
add this line below:
export PYTHONPATH=$PYTHONPATH:/usr/lib/python2.7/dist-packages:/usr/local/lib/python2.7/dist-packages
source ~/.bashrc
In your virtualenv's site-packages directory, create a file named dist.pth.
In the file dist.pth, put the following:
../dist-packages
Now deactivate and activate your virtualenv. You should be set.
What you want to achieve here is essentially adding a specific folder (dist-packages) to the Python search path. You have a number of options for this:
Use a path configuration (.pth) file; its entries will be appended to the system path.
Modify PYTHONPATH (entries from it go to the beginning of the system path).
Modify sys.path directly from your Python script, i.e. append the required folders to it.
I think that for this particular case (enabling the global dist-packages folder) the third option is better, because with the first option you would have to create a .pth file for every virtualenv you work in (with some external shell script?), and it's easy to forget when you distribute your package. The second option requires run-time setup (adding an environment variable), which is, again, easy to miss.
Only the third option doesn't require any prerequisites at configure- or run-time and can be distributed without issues (on the same type of system, of course).
You can use a function like this:
def enable_global_distpackages():
    import sys
    sys.path.append('/usr/lib/python2.7/dist-packages')
    sys.path.append('/usr/local/lib/python2.7/dist-packages')
And then in the __init__.py file of your package:
enable_global_distpackages()