virtualenv and CLI tools - python

I'm aware of how to use virtualenv for isolating Python dependencies in a long-running script, like a Flask or Twisted app. But I've been sort of puzzled about how you're supposed to go about this for a script intended to be invoked from the command line.
Suppose I wanted to make a CLI tool for interacting with some API, perhaps using Click or docopt. Obviously you don't want to have to source venv/bin/activate every time you want to use this tool. But I'd assume it's still best to use virtualenv to keep a clean environment even beyond development.
Sorry for the newbie question, but...what are you supposed to do to package up a script so it can be cleanly used in this manner? (I'm more used to RubyGems, and am still figuring out Pip and VirtualEnv.)

In general, if a package you've installed in a virtualenv provides a command-line script, say in ~/.virtualenv/bin/, you can symlink it into ~/bin/ (or wherever on your path you'd like to put local scripts).
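For completeness, the way a package gets such a script into the virtualenv's bin/ in the first place is a console_scripts entry point in its setup.py. A minimal sketch, assuming a hypothetical single-module package named mytool with a main() function:

from setuptools import setup

setup(
    name="mytool",                   # hypothetical package name
    version="0.1",
    py_modules=["mytool"],           # assumes a single mytool.py module
    install_requires=["click"],
    entry_points={
        "console_scripts": [
            "mytool = mytool:main",  # pip generates bin/mytool, calling mytool.main()
        ],
    },
)

After pip install, the generated bin/mytool script is exactly the kind of file you can symlink as described above.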
There are a couple of projects that aim to solve this issue:
pipsi, the pip script installer -- amounts to doing the virtualenv creation and symlinking for you
pipx -- pip for executable binaries
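With pipx, for example, each tool gets its own virtualenv and its scripts land on your PATH automatically (the package name here is just an illustration):

pipx install httpie
http --version

pipx puts the generated scripts in ~/.local/bin, so no manual symlinking is needed.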

An excellent article on virtualenv by DabApps should make this clear for you:
http://www.dabapps.com/blog/introduction-to-pip-and-virtualenv-python/
As for invoking your script from the CLI:
1. cd to your project root
2. run env/bin/python your_main_file.py (assuming your virtualenv is named env)
This way you don't need to source the virtualenv every time.
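It also helps to know why this (and the symlink approach) works: scripts that pip generates in env/bin/ hard-code the environment's interpreter in their shebang line, so they run without activation. For example (paths and the script name are illustrative):

$ head -1 env/bin/mytool
#!/home/you/project/env/bin/python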

Each virtualenv has its own Python site-packages, built-in modules, and Python interpreter. So virtualenv is meant to be used at the project level, not at the package-by-package level. It isolates a collection of Python modules and their dependencies. Each virtualenv has its own location where pip will install packages. In theory, virtualenv should not be necessary, but in practice, it's nice to have a way to have different "environments" with different versions of Python modules and Python interpreters. I don't know if Ruby has something similar that would allow you to have different "sets" of gems for different projects.
People who use plain virtualenv will add aliases to their .bashrc, for example:
alias workonawesomeproject="source ~/venv/awesomeproject/bin/activate"
They would activate the virtualenv with the alias
workonawesomeproject
To leave a virtualenv you use the command deactivate
An easier way to deal with virtualenvs is to use virtualenvwrapper
pip install virtualenvwrapper
Add these lines to your .bashrc (or other shell initialization file)
export WORKON_HOME=$HOME/venv # this directory is your choice
export PROJECT_HOME=$HOME/src # this directory is your choice
source /usr/local/bin/virtualenvwrapper.sh # leave this alone
If you just modified your .bashrc make sure to source it
source ~/.bashrc
Then to make a new virtualenv you simply run
mkvirtualenv awesomeproject
To use that virtualenv
workon awesomeproject
To deactivate that virtualenv
deactivate
Virtualenvwrapper Docs:
http://virtualenvwrapper.readthedocs.org/en/latest/install.html

How to install python module local to a single project

I've been looking around but was not able to find a definitive answer...
So here's my question.
I come from a JavaScript background. I'm trying to pick up Python now.
In JavaScript, the basic practice would be to npm install (or use yarn).
This would install the required modules for a specific project.
Now, for Python, I've figured out that pip is the package manager.
I can't seem to figure out how to install packages specific to a project (like JavaScript does it).
Instead, it's all global. I've found the --user flag, but that's not really what I'm looking for.
I've come to the conclusion that this is just a completely different scheme and I shouldn't try to approach it the way I would in JavaScript.
However, I can't really find a good document on why this method was favored.
It may be just my problem, but I can't help thinking about how I'm constantly bloating my global pip folder with modules that I'm only ever gonna use once for some single project.
Thanks.
A.) Anaconda (the simplest): just download “Anaconda”, which comes with lots of Python modules pre-installed and ready to use, and it also includes code editors. You can create multiple module collections (environments) with the GUI.
B.) venv = virtual environments, if you need something light and specific that contains particular packages for each project.
macOS terminal commands:
Install virtualenv (optional here: the venv module used below is built into Python 3.3+, so nothing needs installing for it; this command installs the separate third-party virtualenv tool):
pip install virtualenv
Set up the venv (inside the base project folder):
python3 -m venv thenameofyourvirtualenvironment
Activate the venv:
source thenameofyourvirtualenvironment/bin/activate
Deactivate the venv:
deactivate
While it is activated, you can install packages specific to the environment, e.g.:
pip -q install bcrypt
C.) Use “Docker”: it is great if you want to go in depth and have a solid, reproducible setup, but it can get complicated.
pip is a program used to manage packages in a Python distribution. You usually have one system distribution, which is managed by pip by default. When you do pip install scipy, you install the package scipy into your system Python. Every time you try to import scipy afterwards, it will work because your system Python has it.
Project-specific distributions are accomplished by using virtual environments. python -m venv env creates a copy of the system Python interpreter, pip, setuptools and a couple of other essential tools; in other words, a virtual environment created this way is otherwise empty.
To use the created virtual environment, run source env/bin/activate. After that, every time you invoke the python command it will use the activated interpreter. When you install packages using pip, they will be installed into the virtual environment rather than into your system Python. To use the system Python again, run deactivate.
Such usage is actually preferred for projects, because some user applications may rely on the system Python and its packages, and installing or updating things there can be potentially dangerous.
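To get the npm-like, per-project workflow the question asks about, the usual pattern is a virtual environment plus a requirements file; a minimal sketch (requests is just an example package):

python3 -m venv env                 # per-project environment, like node_modules
source env/bin/activate
pip install requests                # installs into env, not globally
pip freeze > requirements.txt       # records exact versions, roughly package.json's role
pip install -r requirements.txt     # recreate the environment elsewhere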
Further reading: venv documentation

How do I test out changes to a python library pulled through git?

I have a Python library that I want to help out with by fixing some issues. I just don't know how to test my changes, given the complexity of how Python/pip installs libraries.
I have the library installed with pip and I can run Python code against it with a from ... import *. But now that I want to make changes to it, I pulled the code with git and plan to branch to work on my changes. That's fine. I will then do a pull request to merge any changes, given tests pass.
But after I make a change, how do I integrate my changes into python to test out the changes I made with the library? Can pip install my custom/modified version of the library?
I have looked around and haven't successfully found an answer to this but perhaps I'm not looking in the right spot.
Can pip install my custom/modified version of the library?
Yes.
There are various ways of approaching this question. A common solution is the use of Python virtual environments. This allows you to create an isolated Python environment that does not share the same packages as your system Python install. You can then install things into it (such as your modified Python library) to test it out.
To get started, you need the virtualenv tool. This is probably available as a package for your distribution, but you can also install it using pip. Once you have it, you can run in the same directory as your code:
virtualenv .venv
This creates a virtualenv named .venv. You can call it anything you want, but naming it .venv (or anything starting with a .) means it won't clutter up the output of ls in your workspace.
Next, you need to activate the virtualenv:
. .venv/bin/activate
This modifies your $PATH to place the virtualenv at the front of the list of directories. Now when you type python or pip, you'll be using the virtualenv version.
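You can confirm the switch with:

which python

It should print a path inside .venv (e.g. /path/to/your/project/.venv/bin/python).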
If your code has a setup.py file, you can install it like this:
pip install -e .
The -e means you want to perform an "editable" install, which means python will use the code "in place", and any changes you make will be immediately visible to the code you use for testing.
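To sanity-check that the editable install is the one being imported (mylib here is a placeholder for your library's import name):

python -c "import mylib; print(mylib.__file__)"

The printed path should point into your git checkout rather than into site-packages.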
When you're done, you can run:
deactivate
This will remove the changes that activate made to your environment.
For more information:
Pipenv & Virtual Environments discusses a higher level tool for managing virtual environments.
Virtualenvwrapper is another take on a higher level management tool.

Virtual environment and python versions for different projects

Let me first outline my desired solution and then elaborate on a specific question how to achieve this state.
I'm soon starting two coding projects in Python. I've used Python before but never on such big projects. My ideal scenario would be a setup where I can run virtual environments and different Python versions for various projects. Some research pointed me to virtualenv/virtualenvwrapper and pyenv. It seems that using pyenv-virtualenv or pyenv-virtualenvwrapper there is a nice way to specify the virtual environment and Python version for a specific project.
Question: Once I've set up a virtual environment and Python version for a specific project, how easily could I switch to a newer Python version at a later stage? Let's say I've started project A with Python 3.4 and a year in the future I would like to move everything to Python 3.6. Is this possible in a neat way?
Sure:
$ rm -r my-python-3.4-env
$ virtualenv -p python3.6 my-python-3.6-env
$ source my-python-3.6-env/bin/activate
In other words, each virtual environment is just a folder with the necessary files in it. You "activate" an environment with the source .../activate command (in case of virtualenv) and you leave it just as easily. To switch to a different environment you simply create a new one with a specific Python executable and activate it.
What you want to be careful about is to keep your installation repeatable, meaning if you depend on external modules (which modern projects typically do), you don't want to install each dependency by hand and instead automate that. For instance, you create a setuptools setup.py file which lists your dependencies, and then have it install them into your new environment automatically:
$ source my-python-3.6-env/bin/activate
(my-python-3.6-env) $ python setup.py develop
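A minimal setup.py along these lines might look like the following; the project name and dependencies are placeholders:

from setuptools import setup, find_packages

setup(
    name="my-project",          # hypothetical project name
    version="0.1",
    packages=find_packages(),
    install_requires=[
        "requests>=2.0",        # example dependency pins
        "click",
    ],
)

Alternatively, running pip freeze > requirements.txt in the old environment and pip install -r requirements.txt in the new one achieves the same repeatability without a setup.py.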

How to get virtualenv to use dist-packages on Ubuntu?

I know that virtualenv, if not passed the --no-site-packages argument when creating a new virtual environment, will link the packages in /usr/local/lib/python2.7/site-packages (for Python 2.7) with a newly-created virtual environment. On Ubuntu 12.04 LTS, I have three locations where Python 2.7 packages can be installed (using the default, Ubuntu-supplied Python 2.7 installation):
1. /usr/lib/python2.7/dist-packages: this has my global installation of ipython, scipy, numpy, matplotlib – packages that I would find difficult and time-consuming to install individually (along with all their dependencies) if they were not available via the SciPy stack.
2. /usr/local/lib/python2.7/site-packages: this is empty, and I think it will stay that way on Ubuntu unless I install a package from source.
3. /usr/local/lib/python2.7/dist-packages: this has very important local packages for astronomy, notably those related to PyRAF, STScI, etc., and they are extremely difficult and time-consuming to install individually.
Note that a global directory such as /usr/lib/python2.7/site-packages does not exist on my system. Note also that my global installation of ipython, scipy, etc. lets me use those packages on-the-fly without having to source/activate a virtual environment every time.
Naturally, I now want to use virtualenv to create one virtual environment in my user home directory which I will source/activate for my future projects. However, I would like this virtual environment, while being created, to link/copy all of my packages in locations (1) and (3) in the list above. The main reason for this is that I don't want to go through the pip install process (if it is even possible) to re-install ipython, scipy, the astro-packages, etc. for this (and maybe other) virtual environments.
Here are my questions:
1. Is there a way for me to specify to virtualenv that I would like it to link/copy packages in these two dist-packages directories for virtual environments that are created in the future?
2. When I eventually update my global installation of scipy, ipython, etc. in the two dist-packages directories, will this also update/change the packages that my virtual environment uses (and which it originally got during virtualenv creation)?
3. If I ever install a package from source on Ubuntu, will it go in /usr/local/lib/python2.7/dist-packages or /usr/local/lib/python2.7/site-packages?
Thanks in advance for your help!
This might be a legitimate use of PYTHONPATH, an environment variable that virtualenv doesn't touch and which uses the same syntax as PATH. Put something like this in .bashrc or similar:

export PYTHONPATH=/usr/lib/python2.7/dist-packages:/usr/local/lib/python2.7/dist-packages

If you followed this path:
1. You don't have to tell your virtual environment about this at all; it won't try to change it.
2. No relinking will be required.
3. Packages installed from source will still go wherever they would have gone (pip install always uses /usr/local/lib/python2.7/dist-packages/ on my Ubuntu) if you install them outside of your virtual environment. If you install them from within your virtual environment (while it's activated), then of course they'll be put in the virtual environment.
I'm just getting my head around virtualenv, but there seems to be an easier way than mentioned so far.
Since virtualenv 1.7, --no-site-packages has been the default behavior.
Therefore, using the --system-site-packages flag to virtualenv is all that is needed to get dist-packages in your path - if you use the tweaked virtualenv shipped by Ubuntu. (This answer and this one give some useful history.) I've tested this and it does work.
$ virtualenv --system-site-packages .
I agree with Thomas here - I can't see any action required in virtualenv to see the effect of updates in dist-packages.
Having tested that with python setup.py install, it does (again as Thomas said) still go to dist-packages. You could change that by building your own python, but that's a bit extreme.
PYTHONPATH works for me.
vim ~/.bashrc
add this line below:
export PYTHONPATH=$PYTHONPATH:/usr/lib/python2.7/dist-packages:/usr/local/lib/python2.7/dist-packages
source ~/.bashrc
In your virtualenv's site-packages directory, create a file dist.pth
In the file dist.pth, put the following:
../dist-packages
Now deactivate and activate your virtualenv. You should be set.
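The same thing as a one-liner, assuming the virtualenv is currently active (so $VIRTUAL_ENV is set) and a Python 2.7 layout:

echo '../dist-packages' > $VIRTUAL_ENV/lib/python2.7/site-packages/dist.pth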
What you want to achieve here is essentially to add a specific folder (dist-packages) to the Python search path. You have a number of options for this:
1. Use a path configuration (.pth) file; its entries will be appended to the system path.
2. Modify PYTHONPATH (entries from it go to the beginning of the system path).
3. Modify sys.path directly from your Python script, i.e. append the required folders to it.
I think that for this particular case (enabling the global dist-packages folder) the third option is better, because with the first option you have to create a .pth file for every virtualenv you'll be working in (with some external shell script?), which is easy to forget when you distribute your package. The second option requires run-time setup (adding an environment variable), which is, again, easy to miss.
Only the third option doesn't require any prerequisites at configure- or run-time and can be distributed without issues (on the same type of system, of course).
You can use function like this:
def enable_global_distpackages():
    import sys
    sys.path.append('/usr/lib/python2.7/dist-packages')
    sys.path.append('/usr/local/lib/python2.7/dist-packages')
And then in the __init__.py file of your package:
enable_global_distpackages()

Including global package into a virtualenv that has been created with --no-site-packages

I'd usually prefer to create virtualenvs with the --no-site-packages option for more isolation, and also because the default set of global Python packages is quite large and usually most of them are not needed.
However I'd still want to keep a few select packages in global, like PIL or psycopg2. Is there a good way to include them into the virtualenv, that can also be automated easily?
If you're using virtualenvwrapper, you might be able to use the postmkvirtualenv script to automatically create symlinks in the new virtualenv's site-packages directory.
#!/bin/sh
# postmkvirtualenv hook: runs right after a new virtualenv is created
cdsitepackages      # cd into the new env's site-packages directory
ln -s /path/to/system/site-packages/package-name    # link the global package in
cdvirtualenv        # return to the env's root
If you are using virtualenvwrapper, the shell command add2virtualenv should be present in an active virtualenv. Use:
add2virtualenv /path/to/package
to add an entry to the .pth file _virtualenv_path_extensions.pth in your virtualenv's site-packages.
The benefit of using add2virtualenv rather than creating symlinks yourself is that you can stop the package being importable by commenting out its line in the .pth file. This makes it easier to check your code's validity against several versions of a library it depends on.
I haven't actually tried this with those specific packages, but I would guess that a simple symlink from the global site-packages into the virtualenv's site-packages might work, and this is easily scriptable.
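A sketch of that symlink, using PIL from the question (this assumes a Python 2.7 layout, an active virtualenv so $VIRTUAL_ENV is set, and that the global package lives in a PIL directory; adjust the paths to your system):

ln -s /usr/lib/python2.7/dist-packages/PIL $VIRTUAL_ENV/lib/python2.7/site-packages/PIL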
