why did virtualenv not include all of the system paths? - python

I created a virtualenv, and while it has many system paths, it doesn't have others. Specifically, pyshared and dist-packages don't seem to be included. As a result, my system-wide MySQLdb and psycopg2 aren't available. Any ideas why?

This seems to be related to Ubuntu's messing with Python and virtualenv.

The only possible cause I'm aware of is that you created your virtualenv with the argument --no-site-packages:
From here:
If you build with virtualenv --no-site-packages ENV it will not inherit any packages from /usr/lib/python2.5/site-packages (or wherever your global site-packages directory is). This can be used if you don't have control over site-packages and don't want to depend on the packages there, or you just want more isolation from the global system.
Here is an example to make this clearer.
First I will create a virtualenv normally (without --no-site-packages), and you will see that I can still access the django that is installed in my system site-packages (dist-packages on Ubuntu):
$ virtualenv A
New python executable in A/bin/python
Installing setuptools............done
$ source A/bin/activate
(A)$ python
Python 2.6.5 (r265:79063, Apr 16 2010, 13:57:41)
[GCC 4.4.3] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import django
>>> django.__file__
'/usr/local/lib/python2.6/dist-packages/django/__init__.pyc'
But now I will create the virtualenv using --no-site-packages:
$ virtualenv B --no-site-packages
New python executable in B/bin/python
Installing setuptools............done.
$ source B/bin/activate
(B)$ python
Python 2.6.5 (r265:79063, Apr 16 2010, 13:57:41)
[GCC 4.4.3] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import django
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ImportError: No module named django
Now you can see that the virtualenv created with --no-site-packages was not able to access django from the system dist-packages (Ubuntu) on my machine.
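If you want to see exactly what a given env inherits, printing sys.path from inside the activated env is the quickest check. A minimal sketch, assuming nothing beyond a standard virtualenv layout:
import sys
print(sys.prefix)        # points inside the virtualenv (e.g. .../A or .../B)
for entry in sys.path:
    print(entry)         # global dist-packages entries appear only when the
                         # env inherits the system packages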
Hope this will help :)

Related

pip prints trash messages in stdout

When I try to use pip commands, annoying messages are coming out in stdout:
~# pip -V
Platform: linu
pip 1.5.4 from /usr/lib/python2.7/dist-packages (python 2.7)
~# pip install
Platform: linu
You must give at least one requirement to install (see "pip help install")
Python commands are working normally.
OS - Ubuntu 14.04
I tried to reinstall pip and all dependencies, but it didn't help.
What is this, and where does it come from?
The problem was noticed when I tried to use the ec2.py dynamic inventory script for AWS. I faced the same problem as described here:
https://github.com/ansible/ansible/issues/14667
ec2.py generates JSON that starts with "Platform: linu", and therefore Ansible can't work with it.
I also looked through the boto library configuration (boto is used by ec2.py) and the pip configs, but they are blank.
Any suggestions?
Python
~# python
Python 2.7.6 (default, Jun 22 2015, 17:58:13)
[GCC 4.8.2] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import os
>>> print os.name
posix
>>> import platform
>>> platform.system()
'Linux'
Found one more way to reproduce the issue:
:/usr/bin# python
Python 2.7.6 (default, Jun 22 2015, 17:58:13)
[GCC 4.8.2] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import sys
>>> import os
>>> import boto
Platform: linu
>>>
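One way to track down where that stray output comes from is to search the installed boto package for the literal text printed at import time. A rough sketch; it assumes the stray print lives somewhere under the boto package directory, which may not be the case:
import os
import boto              # this import itself prints "Platform: linu"

needle = "Platform:"
pkg_dir = os.path.dirname(boto.__file__)
for root, dirs, files in os.walk(pkg_dir):
    for name in files:
        if name.endswith(".py"):
            path = os.path.join(root, name)
            if needle in open(path).read():
                print(path)    # candidate file containing the stray print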
This is a common mistake. Most of these weird problems (whether on a plain PC, in a VM, or inside a cloud instance such as EC2; note that AWS EC2 ships a different, locked-down pip version) come from running Python in sudo mode.
DO NOT run pip in sudo mode!
My suggestion: set up virtualenv and install your packages inside a virtualenv; mkvirtualenv yourenv (from virtualenvwrapper) will create a custom one for you.
To run a Python script automatically inside the designated virtualenv, you just need to add one extra line to your bash script before it invokes your Python package/modules:
source <virtualenv_folder>/<virtualenv_name>/bin/activate
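If the script is launched from another Python process rather than from a shell, classic virtualenv also ships an activate_this.py that switches the running interpreter onto the env. A small sketch; the path below is a placeholder:
activate_this = "/home/user/yourenv/bin/activate_this.py"   # placeholder path
execfile(activate_this, dict(__file__=activate_this))
import boto    # now resolved from the virtualenv instead of the system packages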

Importing third-party modules within a virtualenv

Following the trouble I described here trying to install an older wxPython inside a virtualenv, I then downloaded the dmg directly in a browser, and (after needing to right-click on the .pkg) it got installed successfully.
When I open an interpreter, I confirm that I can import wx.
Yet when I recreate my virtualenv, activate it, and open its interpreter:
(venv)MacBook-Pro-de-Pyderman:Projet Pyderman$ python
Python 2.6.6 (r266:84374, Aug 31 2010, 11:00:51)
[GCC 4.0.1 (Apple Inc. build 5493)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import wx
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ImportError: No module named wx
Is there a step I am missing with respect to being able to import third-party modules within a virtualenv?
You should try creating the virtualenv with the --system-site-packages option. This will allow you to access the packages installed at system level, like the ones you install from a dmg image.
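Once the env is recreated that way, a quick check from inside it shows whether wx is visible and where it is actually loaded from. A minimal sketch:
import importlib
try:
    wx = importlib.import_module("wx")
    print(wx.__file__)   # should point at the system-level install
except ImportError:
    print("wx is not visible; recreate the env with --system-site-packages")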

How to install libgit2/pygit2 into virtualenv? (Ubuntu)

I have tried dulwich and GitPython, neither of which seems mature. Now I am trying to install libgit2/pygit2. I have successfully installed them into the host's system packages, but now I need to get them installed in the virtualenv of the app I am building.
Python 2.7.3 (default, Sep 26 2013, 20:03:06)
[GCC 4.6.3] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import pygit2
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/lib/python2.7/dist-packages/pygit2/__init__.py", line 32, in <module>
import _pygit2
ImportError: libgit2.so.0: cannot open shared object file: No such file or directory
>>>
As you can see, the current issue seems to be that the globally installed libgit2.so.0 cannot be found from within the virtualenv. Fair enough, that is what virtualenv is about, after all: sandboxing. So how do I go about linking/symlinking/building a version/copy of libgit2.so for the virtualenv?
I found the following script in a Gist that, when run with a virtualenv activated, will install current libgit2 and pygit2 together into that venv.
https://gist.github.com/olivier-m/5755638
One note of caution: update the version numbers for both libraries to the same, most recent version (0.20.0 at the time of writing).
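Whichever route you take, you can check whether the dynamic loader can resolve libgit2 before trying to import pygit2 again. A small sketch; the soname libgit2.so.0 is taken from the traceback above, and if the library was built under the virtualenv prefix rather than /usr/lib, the loader may additionally need LD_LIBRARY_PATH to include $VIRTUAL_ENV/lib:
import ctypes
try:
    ctypes.CDLL("libgit2.so.0")
    print("libgit2.so.0 is loadable")
except OSError as exc:
    print("loader cannot find libgit2: %s" % exc)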

Virtualenv won't work on the second machine

I have built my virtualenv with this command:
virtualenv env --distribute --no-site-packages
Then I installed several modules (django etc.) into the env with pip. The problem is that when I wanted to run the code on the second machine it would not work. Here is what I have done:
visgean#rewitaqia:~/scripty/project_name$ source ./env/bin/activate
(env)visgean#rewitaqia:~/scripty/project_name$ python
Python 2.7.1+ (r271:86832, Sep 27 2012, 21:12:17)
[GCC 4.5.2] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import django
>>> django.__file__
'/home/visgean/scripty/project_name/env/lib/python2.7/site-packages/django/__init__.pyc'
but when I want to run it on the second machine:
(env)user#debian:~/project_name$ python
Python 2.6.6 (r266:84292, Dec 27 2010, 00:02:40)
[GCC 4.4.5] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import django
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ImportError: No module named django
>>>
A wild error appears! The first machine is Ubuntu, the second one is Debian. There seem to be some broken links in /home/user/project_name/env/lib/python2.7 - is that the problem? And if so, how should I prevent or repair it?
If you are just copying the virtualenv to the second machine you may encounter some issues. From the virtualenv site:
Normally environments are tied to a specific path. That means that you cannot move an environment around or copy it to another computer. You can fix up an environment to make it relocatable with the command:
$ virtualenv --relocatable ENV
This will make some of the files created by setuptools or distribute use relative paths, and will change all the scripts to use activate_this.py instead of using the location of the Python interpreter to select the environment.
Note: you must run this after you've installed any packages into the environment. If you make an environment relocatable, then install a new package, you must run virtualenv --relocatable again.
I have just noticed that I had the wrong version of Python on the second machine - Debian does not have Python 2.7; installing 2.7 and pip for it is the solution.
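A quick sanity check on the second machine would have shown the mismatch, since a virtualenv's lib/pythonX.Y/site-packages directory is tied to the interpreter version. A minimal sketch:
import sys
print(sys.version)   # the env's site-packages directory is tied to this version
print(sys.prefix)    # should point inside the virtualenv, not /usr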

How does Python keep track of modules installed with eggs?

If I have a module, foo, in Lib/site-packages, I can just import foo and it will work. However, when I install stuff from eggs, I get something like blah-4.0.1-py2.7-win32.egg as a folder, with the module contents inside, yet I still only need to do import foo, not anything more complicated. How does Python keep track of eggs? It is not just directory-name matching: if I drop that folder into a Python installation without going through distutils, it does not find the module.
To be clearer: I just installed zope. The folder name is "zope.interface-3.3.0-py2.7-win32.egg". This works:
Python 2.7.1 (r271:86832, Nov 27 2010, 18:30:46) [MSC v.1500 32 bit (Intel)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import zope.interface
>>>
I create a "blah-4.0.1-py2.7-win32.egg" folder with an empty module "haha" in it (and __init__.py). This does not work:
Python 2.7.1 (r271:86832, Nov 27 2010, 18:30:46) [MSC v.1500 32 bit (Intel)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import blah.haha
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ImportError: No module named blah.haha
>>>
This does, though:
Python 2.7.1 (r271:86832, Nov 27 2010, 18:30:46) [MSC v.1500 32 bit (Intel)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> from pkg_resources import require
>>> require("blah>=1.0")
[blah 4.0.1 (c:\python27\lib\site-packages\blah-4.0.1-py2.7-win32.egg)]
>>> import haha
>>>
So how do I make it work without a require?
If you use the easy_install script provided by setuptools (or the Distribute fork of it) to install packages as eggs, you will see that, by default, it creates a file named easy-install.pth in the site-packages directory of your Python installation. Path configuration files are a standard feature of Python:
A path configuration file is a file whose name has the form package.pth and exists in one of the four directories mentioned above; its contents are additional items (one per line) to be added to sys.path.
easy_install makes heavy use of this Python feature. When you use easy_install to add or update a distribution, it modifies easy-install.pth to add the egg directory or zip file. In this way, easy_install maintains control of the module searching order and ensures that the eggs it installs appear early in the search order. Here is an example of the contents of an easy-install.pth:
import sys; sys.__plen = len(sys.path)
./appscript-0.21.1-py2.6-macosx-10.5-ppc.egg
./yolk-0.4.1-py2.6.egg
./Elixir-0.7.1-py2.6.egg
./Fabric-0.9.0-py2.6.egg
import sys; new=sys.path[sys.__plen:]; del sys.path[sys.__plen:]; p=getattr(sys,'__egginsert',0); sys.path[p:p]=new; sys.__egginsert = p+len(new)
As you can see here, and if you examine the code in setuptools, you will find it resorts to some trickery to bootstrap itself and then cover its tracks, which can make debugging problems with site.py and interpreter startup a bit interesting. (That is one of the reasons some developers are not fond of using it.)
If you use the -m parameter of easy_install to install a distribution as multi-version, the easy-install.pth entry for it is not added or is removed if it already exists. This is why the easy_install documentation tells you to use -m before deleting an installed egg.
When you run easy_install it copies the egg into site-packages and puts the path to that egg on your sys.path. (Note that sys.path is not your PATH environment variable; it is constructed from PYTHONPATH, .pth files such as easy-install.pth, and other sources. So the .egg you install with easy_install gets recorded in easy-install.pth, and Python adds it to sys.path when the interpreter starts.)
To get blah.haha to work in your example, either run easy_install blah-4.0.1-py2.7-win32.egg and then you can import haha from within python, or just put the haha module directly in site-packages.
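To see the mechanism in isolation: putting the egg directory on sys.path by hand is roughly what the easy-install.pth entry does for you. A sketch using the hypothetical egg path from the question:
import sys
sys.path.insert(0, r"c:\python27\lib\site-packages\blah-4.0.1-py2.7-win32.egg")
import haha    # now resolves to the module inside the egg directory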
