how to set different PYTHONPATH variables for python3 and python2 respectively - python

I want to add a specific library path only to python2. After adding export PYTHONPATH="/path/to/lib/" to my .bashrc, however, running python3 produces the error: Your PYTHONPATH points to a site-packages dir for Python 2.x but you are running Python 3.x!
I think this is because python2 and python3 share the same PYTHONPATH variable.
So, can I set different PYTHONPATH variables for python2 and python3 respectively? If not, how can I add a library path exclusively to a particular version of Python?

PYTHONPATH is somewhat of a hack as far as package management is concerned. A "pretty" solution would be to package your library and install it.
This may sound trickier than it is, so let me show you how it works.
Let us assume your "package" has a single file named wow.py and you keep it in /home/user/mylib/wow.py.
Create the file /home/user/mylib/setup.py with the following content:
from setuptools import setup

setup(
    name="WowPackage",
    packages=["."],
)
That's it, now you can "properly install" your package into the Python distribution of your choice without the need to bother about PYTHONPATH. As far as "proper installation" is concerned, you have at least three options:
"Really proper". Will copy your code to your python site-packages directory:
$ python setup.py install
"Development". Will only add a link from the Python site-packages to /home/user/mylib. This means that changes to the code in your directory take effect immediately.
$ python setup.py develop
"User". If you do not want to write to the system directories, you can install the package (either "properly" or "in development mode") to /home/user/.local directory, where Python will also find them on its own. For that, just add --user to the command.
$ python setup.py install --user
$ python setup.py develop --user
To remove a package installed in development mode, do
$ python setup.py develop -u
or
$ python setup.py develop -u --user
To remove a package installed "properly", do
$ pip uninstall WowPackage
If your package is more interesting than a single file (e.g. you have subdirectories and such), just list those in the packages parameter of the setup function (you will need to list everything recursively, hence you will want a helper function for larger libraries). Once you get the hang of it, make sure to read a more detailed manual as well.
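One such helper is setuptools.find_packages, which discovers packages recursively. A minimal sketch, using a throwaway layout in a temp directory to show what it returns:

```python
import os
import tempfile

from setuptools import find_packages

# build a toy project layout: wow/ and wow/sub/ each with an __init__.py
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "wow", "sub"))
for pkg in ("wow", os.path.join("wow", "sub")):
    open(os.path.join(root, pkg, "__init__.py"), "w").close()

# find_packages walks the tree and returns dotted package names
print(sorted(find_packages(root)))  # ['wow', 'wow.sub']
```

In setup.py you would then pass packages=find_packages() instead of listing every subpackage by hand.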
In the end, go and contribute your package to PyPI -- it is as simple as calling python setup.py sdist register upload (you'll need a PyPI username, though).

You can create a configuration file mymodule.pth under lib/site-packages (on Windows) or lib/pythonX.Y/site-packages (on Unix and Macintosh), then add one line containing the directory to add to the Python path.
From docs.python2 and docs.python3:
A path configuration file is a file whose name has the form name.pth and exists in one of the four directories mentioned above; its contents are additional items (one per line) to be added to sys.path. Non-existing items are never added to sys.path, and no check is made that the item refers to a directory rather than a file. No item is added to sys.path more than once. Blank lines and lines beginning with # are skipped. Lines starting with import (followed by space or tab) are executed.
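The mechanism can be demonstrated without touching a real site-packages directory: site.addsitedir processes .pth files the same way the interpreter does at startup. The paths below are throwaway temp directories, not real install locations:

```python
import os
import site
import sys
import tempfile

# a fake "site-packages" dir containing a .pth file, and a lib dir to add
sitedir = tempfile.mkdtemp()
libdir = tempfile.mkdtemp()
with open(os.path.join(sitedir, "mymodule.pth"), "w") as f:
    f.write(libdir + "\n")

# addsitedir reads the .pth file and appends the listed (existing) dirs
site.addsitedir(sitedir)
print(libdir in sys.path)  # True
```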

I found that there is no way to modify PYTHONPATH that is only for python2 or only for python3. I had to use a .pth file.
What I had to do was:
make sure directory is created in my home: $HOME/.local/lib/python${MAJOR_VERSION}.${MINOR_VERSION}/site-packages
create a .pth file in that directory
test that your .pth file works
done
For more info on .pth file syntax and how such files work, please see: python2 docs and python3 docs.
(.pth files in a nutshell: when your Python interpreter starts, it looks in certain directories, finds any .pth files there, parses them, and adds the listed directories to your sys.path (i.e. the same behavior as PYTHONPATH), making any Python modules located in those directories available for normal importing.)

If you don't want to bother with moving/adding files in lib/site-packages, try adding two lines of code to the python2.7 script you would like to run (below).
import sys
sys.path = [p for p in sys.path if p.startswith(r'C:\Python27')]
This way, sys.path will be filtered (all Python 3.x package directories are dropped) every time you run your code.
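A more portable sketch of the same filtering idea (not from the original answer) keeps only the entries that belong to the running interpreter's own installation, instead of hardcoding C:\Python27:

```python
import sys

# keep only sys.path entries under the interpreter's own prefix
# (sys.base_prefix covers the stdlib when running inside a virtualenv);
# empty strings (the current directory) are kept as well
sys.path = [
    p for p in sys.path
    if not p or p.startswith((sys.prefix, sys.base_prefix))
]
```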

Related

Python can import a module that isn't installed

So, I'm playing around with packaging a python script I've written, and it has a submodule, let's call it submodule. The folder structure looks like this:
cool_script/
setup.py
cool_script.py
submodule/
__init__.py
implementation.py
Now, after many pip install . and pip install -e . calls, I have the situation where submodule can be imported globally. No matter where on my system, this will always work:
$ python3
[...]
>>> import submodule
>>> submodule.__file__
'/home/me/fake/path/cool_script/submodule/__init__.py'
But I don't know why.
The package I made was uninstalled again, and pip cannot find the submodule in its index. There's nothing in dist-packages either, I manually deleted the cool_script.egg-link that was still sitting around there:
$ ls /usr/local/lib/python3.4/dist-packages | ack cool
$ ls /usr/local/lib/python3.4/dist-packages | ack submodule
$
The PYTHONPATH is empty as well:
$ echo $PYTHONPATH
$
Why does Python know the location of submodule? How can I find out?
First run python -c "import site; print(site.getsitepackages())". It will print a list like this:
['/XXX/something/site-packages']
Normally there is a single path in this list, and it points to the directory where pip installs your packages. You can ls into it if you're curious: ls /XXX/something/site-packages/.
More interestingly, though, pip puts a "link" file in that directory when you're using developer installs (a.k.a. pip install -e). The "link" file is named after the original project with a .egg-link extension at the end.
So you probably have a cool_script.egg-link file in that directory. And if you try to print it out you should find that its contents list the original filesystem location of your module. Something like:
$ cat /XXX/something/site-packages/cool_script.egg-link
/home/me/fake/path/cool_script/
.
This is how pip records that it has installed something in developer mode, but it isn't how Python actually knows how to find your module (that would have been too easy, right? :-)).
Python doesn't know about .egg-link files, but it reads all .pth files in the site-packages directory to get additional paths for sys.path (*). So, for Python to be able to import developer-mode installs, pip writes all their paths into a single .pth file conventionally called easy-install.pth (because the old easy-install tool actually pioneered that technique). And if you print out that file, you'll get the list of all project paths installed in developer mode:
$ cat /XXX/something/site-packages/easy-install.pth
/home/me/fake/path/cool_script/
/home/me/another/project/
And you can check that all the paths listed in easy-install.pth do indeed get added to your sys.path.
(*) Technically, the part of Python that reads those .pth files is the site module, which is normally imported automatically at startup. There is an option to disable the site module, though, for example by using python -S. In that case, you'll see that sys.path contains neither the site-packages directory nor the developer install paths.
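You can observe the effect of disabling the site module with a quick subprocess check (a sketch; the exact counts depend on your installation):

```python
import subprocess
import sys

cmd = "import sys; print(len(sys.path))"
with_site = int(subprocess.check_output([sys.executable, "-c", cmd], text=True))
no_site = int(subprocess.check_output([sys.executable, "-S", "-c", cmd], text=True))

# with -S, site-packages (and any .pth additions) never enter sys.path,
# so the path list can only shrink
print(no_site <= with_site)  # True
```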

Idiom for script directory in python application?

I have a python application (Django based), and I have a couple of standalone maintenance scripts that go along with the application, that I have to call every now and then. They have to import parts of my application (sub-packages). Currently, I just put them in my toplevel directory:
application/
djangoproject/
djangoapp/
otherpackage/
tool1.py
tool2.py
Where tool1.py would do
from djangoproject import wsgi
from djangoapp.models import Poll
I've accumulated quite a few of these tools and would like to move them to a scripts subdirectory. Then I would like to be able to call them via python scripts/tool1.py or maybe cd scripts; python tool1.py.
I understand (and sometimes lament) how Python's imports work, and I know that I can add some lines to each script to add the parent directory to PYTHONPATH. I am wondering if there is a widespread pattern for handling such a collection of assorted scripts. Maybe one could put the path manipulation into another file, and have every script start with import mainproject?
I am using a virtualenv, and installing dependencies with pip. But the application itself currently doesn't use a setup.py, and I think it wouldn't help to move the scripts to a separate package installed via pip, since I change them a lot during development, and there are lots of one-offs.
The ways of organizing source code vary from project to project. In my experience, the best and most pythonic way is to always have a setup.py.
In that case, you can run pip install -e ., and the editable version from the . dir will be pseudo-installed into the virtualenv. It is not really installed (i.e. copied), but "linked": the source code dir is added to sys.path via a .pth file, so you can edit and try it without any special copying/installing steps afterward.
What is more, you can extend setup.py with extra dependencies for e.g. development purposes, and install them with pip install -e .[dev]. More of a fancy bonus, though.
The rest depends on the nature of the scripts.
If the scripts are part of the application, they should be installed via the entry-points in setup.py.
# setup.py:
setup(
    entry_points={
        'console_scripts': [
            'tool1 = mytools.tool1:main',
            'tool2 = mytools.tool2:main',
        ],
    },
)
In that case, after pip install -e ., they will be in the bin folder of the virtualenv, or in /usr/local/bin or similar if the system Python is used. You can execute them like this:
source .venv/bin/activate
tool1 ...
# OR:
~/path/to/venv/bin/tool2
The scripts installed this way are fully aware of the virtualenv into which they were installed, so no activation and no explicit python binary are needed.
If the scripts are for code maintenance and not semantically part of the application, then they are usually put into a ./scripts/ directory (or any other, e.g. ./ci/), with a shebang at the top (#!/usr/bin/env python). E.g., tool1.py:
#!/usr/bin/env python
def main():
    pass

if __name__ == '__main__':
    main()
And executed in the current virtualenv due to this shebang as follows:
source .venv/bin/activate
./scripts/tool1.py ...
# OR:
~/path/to/venv/bin/python ./scripts/tool1.py
Unlike the scripts installed via entry points, these scripts do not know about their own virtualenv in any way, so the virtualenv must be activated or the proper python binary used explicitly.
This approach is also used when the scripts are not Python, e.g. bash scripts.
In both cases, a requirements.txt file is sometimes used to pin the application's and dependencies' versions (with pip freeze), so that deployments are persistent and predictable. But that is another story, about deploying the application rather than packaging and maintaining it.
The requirements.txt file is regenerated from time to time to satisfy the new unpinned (i.e. flexible) requirements in setup.py and the new package versions available. But usually it is generated content (despite being committed to the repo), not content maintained by hand.
If you strictly do not want to have setup.py for any reason, then either execute those scripts with the modified env var:
PYTHONPATH=. python scripts/tool1.py
Or hack the sys.path from inside:
# tool1.py
import sys
import os

# make the project root (the parent of this script's directory) importable
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
This is exactly what pip install -e . does, just done manually on every call rather than once via a .pth file in the virtualenv. It also looks hacky.
However, as we know, neither hacky solutions nor solutions that duplicate the standard toolkit are considered "pythonic".

How do I write a setup.py and other installation files in Python that set path?

I have several packages in folder Top. The path is set at the command prompt such that each package contains some python files that use other packages' modules.
I have the following files in Top directory: setup.py, MANIFEST, MANIFEST.in, README. I wish to modify the setup files such that the path is set during installation. Does PYTHONPATH set it, and does it need to go into a new file?
The appropriate approach here is:
Packages do not mess with PYTHONPATH. Ever.
Instead, you write a setup.py entry point for your command line scripts.
When the user installs the package using pip install, the command line script is automatically added to the user's PATH, which is hopefully inside a virtualenv.
This command line script is generated during the install so that it points to the Python of the virtualenv or of the system-wide installation: the path to the current Python interpreter is hardcoded at the head of the script.
More info
https://packaging.python.org/en/latest/distributing/

Access a Python Package from local git repository

I have a local git repository on my machine, let's say under /develop/myPackage.
I'm currently developing it as a python package (a Django app) and I would like to access it from my local virtualenv. I've tried to include its path in my PYTHONPATH (I'm on a Mac)
export PATH="$PATH:/develop/myPackage"
The directory already contains a __init__.py within its root and within each subdirectory.
No matter what I do, I can't get it to work; Python won't see my package.
The alternatives are:
Push my local change to github and install the package within my virtualenv from there with pip
Activate my virtualenv and install the package manually with python setup.py install
Since I often need to make changes to my code, both solutions would require too much work every time, even for a small change.
Am I doing something wrong? Would you suggest a better solution?
Install it in editable mode from your local path:
pip install -e /develop/MyPackage
This effectively links the package into your virtualenv, so you can keep on developing and testing.
The example you show above uses PATH, not PYTHONPATH. The search path used by Python is partly determined by the PYTHONPATH environment variable (PATH is of little use here).
Try this:
export PYTHONPATH=$PYTHONPATH:/develop/myPackage
Though in reality, you likely want it to point to the directory that contains your package (so you can do import myPackage, rather than importing things from within the package). That being said, you likely want:
export PYTHONPATH=$PYTHONPATH:/develop/
Reference the python docs here for more information about Python's module/package search path: http://docs.python.org/2/tutorial/modules.html#the-module-search-path
By default, Python uses the packages that it was installed with as its default path, and as a result PYTHONPATH is unset in the environment.
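The effect of PYTHONPATH can be verified with a small self-contained check (a sketch using a throwaway package name, mypkg, created in a temp directory):

```python
import os
import subprocess
import sys
import tempfile

# create a throwaway package "mypkg" in a temp directory
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "mypkg"))
open(os.path.join(root, "mypkg", "__init__.py"), "w").close()

# a child interpreter started with PYTHONPATH=<temp dir> can import it
env = dict(os.environ, PYTHONPATH=root)
out = subprocess.check_output(
    [sys.executable, "-c", "import mypkg; print(mypkg.__name__)"],
    env=env, text=True,
)
print(out.strip())  # mypkg
```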

PYTHONPATH conflict

I am trying to import ZipCodeDatabase in helloworld.py.
helloworld.py exists at /google-app-engine/helloworld
ZipCodeDatabase module exists /usr/local/lib/python/python2.7/dist-packages
PYTHONPATH = /usr/local/lib/python/python2.7/dist-packages;/usr/local/lib/python/
When running helloworld I still get "ZipCodeDatabase module not found". Why isn't it being picked up from the PYTHONPATH?
I highly doubt you've got a module called ZipCodeDatabase. That naming convention is typically reserved for a class that resides within a module. Modules are usually lowercase or lower_snake_case, to represent the file containing the module. I'm assuming you've installed pyzipcode here, but it may be a different module.
# assuming pyzipcode.py in the dist-packages directory
$ python -c 'from pyzipcode import ZipCodeDatabase'
If I'm wrong above, then are you sure you're running the version of python that has the ZipCodeDatabase module installed?
Some troubleshooting steps:
$ which python
$ python --version
$ python -c 'import ZipCodeDatabase'
$ ls -l /usr/local/lib/python2.7/dist-packages/ | grep -i zip
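The same checks can be done from inside Python, which removes any ambiguity about which interpreter the shell is picking up:

```python
import sys

# which interpreter is running, and where does it look for modules?
print(sys.version)
print(sys.executable)
for p in sys.path:
    print(p)
```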
Also, is it really necessary for you to specify the PYTHONPATH line? Typically, the site-packages folder (and, by extension, I assume the dist-packages folder on Ubuntu) is included in the default sys.path, along with the directory of the python module you're running.
How did you install the ZipCodeDatabase? Did you just drop the file in there? Try putting it alongside your helloworld.py file and try importing it then. Also, a full stack trace is useful information here, especially when others are trying to diagnose the problem you're having.
Edit:
Ok, now that I know you're using google app engine (should have been obvious from your use of paths - I'm sorry), it looks like it doesn't use the site-packages or dist-packages to load modules. You should create a sub-directory in your project with the relevant third party libraries, and add that sub-directory to your path. Disclaimer: I've never used GAE so I might be missing the mark with this.
Check out this answer for how to structure your project and add the extra directory to your path from within the application.
