Python can import a module that isn't installed

So, I'm playing around with packaging a python script I've written, and it has a submodule, let's call it submodule. The folder structure looks like this:
cool_script/
    setup.py
    cool_script.py
    submodule/
        __init__.py
        implementation.py
Now, after many pip install . and pip install -e . calls, I've ended up in a situation where submodule can be imported globally. No matter where I am on my system, this always works:
$ python3
[...]
>>> import submodule
>>> submodule.__file__
'/home/me/fake/path/cool_script/submodule/__init__.py'
But I don't know why.
The package I made has since been uninstalled, and pip no longer lists it. There's nothing in dist-packages either; I manually deleted the cool_script.egg-link that was still sitting around there:
$ ls /usr/local/lib/python3.4/dist-packages | ack cool
$ ls /usr/local/lib/python3.4/dist-packages | ack submodule
$
The PYTHONPATH is empty as well:
$ echo $PYTHONPATH
$
Why does Python know the location of submodule? How can I find out?

First run python -c "import site; print(site.getsitepackages())". It will print a list like this:
['/XXX/something/site-packages']
Normally there is a single path in this list, and it points to the directory where pip installs packages. You can ls into it if you're curious: ls /XXX/something/site-packages/.
More interestingly, though, pip puts a "link" file in that directory when you do a developer install (a.k.a. pip install -e). The "link" file is named after the original project, with a .egg-link extension.
So you probably have a cool_script.egg-link file in that directory, and if you print it out you should find that its contents list the original filesystem location of your module. Something like:
$ cat /XXX/something/site-packages/cool_script.egg-link
/home/me/fake/path/cool_script/
.
This is how pip records that it has installed something in developer mode, but it isn't how Python actually knows how to find your module (that would have been too easy, right? :-)).
Python doesn't know about .egg-link files, but it reads all .pth files in the site-packages directory to get additional paths for sys.path (*). So, for Python to be able to import developer-mode installs, pip writes all their paths into a single .pth file conventionally called easy-install.pth (because the old easy_install tool pioneered that technique). If you print out that file, you'll get the list of all project paths installed in developer mode:
$ cat /XXX/something/site-packages/easy-install.pth
/home/me/fake/path/cool_script/
/home/me/another/project/
And you can check that all the paths listed in easy-install.pth do indeed get added to your sys.path.
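For example, a quick check from inside the interpreter (a minimal sketch; the exact entries depend on your interpreter and what you have installed):
import sys

# Every directory Python searches when importing; with a developer install
# active, the path recorded in easy-install.pth appears in this list.
for path in sys.path:
    print(path)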
(*) Technically, the part of Python that reads those .pth files is the site module, which is normally imported automatically at startup. There is an option to disable the site module, though, for example by using python -S. In that case, you'll see that sys.path contains neither the site-packages directory nor the developer install paths.
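You can see the difference for yourself (a sketch; the second command skips the site module, so its output should be missing both the site-packages directory and the developer install paths):
$ python3 -c "import sys; print('\n'.join(sys.path))"
$ python3 -S -c "import sys; print('\n'.join(sys.path))"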

Related

(Dumb noob) Will changing directory in cmd.exe prior to installing a third-party module cause that module to be stored in that directory?

I want to install Python, and then some additional modules, on my computer (Windows).
I've been told that the best way to install Python and packages is to make a folder on your drive somewhere BEFORE installing anything (e.g. 'Python'), then point all your downloads (using cd in the command line) at this folder so that everything is in one place and you don't have to go on a wild goose chase to make sure everything you want access to is on your PATH when you go to import modules, etc.
Do I have the right idea?
I have had trouble importing modules in the past because they were not in the Path.
Will changing the directory in the command line before typing:
pip install somemodule
cause that module to be saved to where I just changed the directory to?
pip always installs libraries into a fixed directory determined by the interpreter, not by your current working directory; usually that is the interpreter's site-packages or your per-user site-packages folder. You can check this with the command pip show <installed-package-name>
So you can use any package you have already installed to find the pip directory. The location will vary with your Python version and environment name.
Example: c:\users\<user>\appdata\roaming\python\python37\site-packages
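If you prefer to check from Python itself, the site module reports the same locations (a minimal sketch; the printed paths will differ per machine and per interpreter):
import site
import sys

print(sys.prefix)                  # base of the interpreter's installation
print(site.getsitepackages())      # system-wide site-packages directories
print(site.getusersitepackages())  # per-user location used by pip install --user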

how to set different PYTHONPATH variables for python3 and python2 respectively

I want to add a specific library path only to python2. After adding export PYTHONPATH="/path/to/lib/" to my .bashrc, however, executing python3 gets the error: Your PYTHONPATH points to a site-packages dir for Python 2.x but you are running Python 3.x!
I think this is because python2 and python3 share the same PYTHONPATH variable.
So, can I set different PYTHONPATH variables for python2 and python3 respectively? If not, how can I add a library path exclusively to a particular version of Python?
PYTHONPATH is somewhat of a hack as far as package management is concerned. A "pretty" solution would be to package your library and install it.
This may sound trickier than it is, so let me show you how it works.
Let us assume your "package" has a single file named wow.py and you keep it in /home/user/mylib/wow.py.
Create the file /home/user/mylib/setup.py with the following content:
from setuptools import setup

setup(
    name="WowPackage",
    py_modules=["wow"],  # a single-file module; larger projects list packages instead
)
That's it, now you can "properly install" your package into the Python distribution of your choice without the need to bother about PYTHONPATH. As far as "proper installation" is concerned, you have at least three options:
"Really proper". Will copy your code to your python site-packages directory:
$ python setup.py install
"Development". Will only add a link from the python site-packages to /home/user/mylib. This means that changes to code in your directory will have effect.
$ python setup.py develop
"User". If you do not want to write to the system directories, you can install the package (either "properly" or "in development mode") to /home/user/.local directory, where Python will also find them on its own. For that, just add --user to the command.
$ python setup.py install --user
$ python setup.py develop --user
To remove a package installed in development mode, do
$ python setup.py develop -u
or
$ python setup.py develop -u --user
To remove a package installed "properly", do
$ pip uninstall WowPackage
If your package is more interesting than a single file (e.g. you have subdirectories and such), just list those in the packages parameter of the setup function. You will need to list every package recursively, so for larger libraries you will typically use the setuptools.find_packages() helper. Once you get the hang of it, make sure to read a more detailed manual as well.
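For the multi-package case, a sketch might look like this (find_packages is the setuptools helper; the name and layout are placeholders):
from setuptools import setup, find_packages

setup(
    name="WowPackage",
    packages=find_packages(),  # finds every directory that contains an __init__.py
)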
In the end, go and contribute your package to PyPI -- it is as simple as calling python setup.py sdist register upload (you'll need a PyPI username, though).
You can create a configuration file mymodule.pth under lib/site-packages (on Windows) or lib/pythonX.Y/site-packages (on Unix and Macintosh), then add one line containing the directory to add to python path.
From docs.python2 and docs.python3:
A path configuration file is a file whose name has the form name.pth and exists in one of the four directories mentioned above; its contents are additional items (one per line) to be added to sys.path. Non-existing items are never added to sys.path, and no check is made that the item refers to a directory rather than a file. No item is added to sys.path more than once. Blank lines and lines beginning with # are skipped. Lines starting with import (followed by space or tab) are executed.
I found that there is no way to modify PYTHONPATH for only python2 or only python3. I had to use a .pth file.
What I had to do was:
make sure the directory $HOME/.local/lib/python${MAJOR_VERSION}.${MINOR_VERSION}/site-packages exists in my home
create a .pth file in that directory (a minimal sketch follows below)
test that your .pth file works
done
For more info on .pth file syntax and how these files work, please see the python2 docs and python3 docs.
(.pth files in a nutshell: when your Python interpreter starts, it looks in certain directories, finds any .pth files there, parses them, and adds the directories they list to your sys.path (i.e. the same behavior as PYTHONPATH), making any Python modules located in those directories available for normal importing.)
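A minimal sketch of those steps in Python (run it with the interpreter version you want to affect; /path/to/lib and mylib.pth are placeholders):
import os
import site

# Per-user site-packages for this interpreter version,
# e.g. ~/.local/lib/python2.7/site-packages
user_site = site.getusersitepackages()
if not os.path.isdir(user_site):
    os.makedirs(user_site)

# Each line of a .pth file is an extra directory to add to sys.path.
with open(os.path.join(user_site, "mylib.pth"), "w") as f:
    f.write("/path/to/lib\n")
After restarting that interpreter, /path/to/lib shows up in its sys.path while the other Python version is untouched.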
If you don't want to bother with adding files under lib/site-packages, try adding two lines of code to the python2.7 script you would like to run (below):
import sys
# Keep only the Python 2.7 entries on the module search path, dropping
# anything that points at a Python 3.x installation.
sys.path = [p for p in sys.path if p.startswith(r'C:\Python27')]
This way, sys.path is filtered (ignoring all Python 3.x packages) every time you run your code.

Python: Export all used modules

I'm using a lot of modules installed from the Internet.
Is it possible to write a script that automatically copies all of these modules into a folder?
I don't know where these modules are; I only write:
import module1
import module2
I simply want module1 and module2 to be copied into a folder so that I can use my file.py on another PC without installing any software except Python.
pip and virtualenvs. You develop locally in a virtualenv, installing and uninstalling whatever you want. When your code is ready to export, make a list of requirements with the command pip freeze. Then you carry your code to another computer, with nothing extra except for the output of pip freeze in a file called requirements.txt. Do a pip install -r requirements.txt and... magic! Everything is installed in its proper path.
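The whole round trip is just two commands (a sketch; run the first inside your virtualenv and the second on the target machine):
$ pip freeze > requirements.txt
$ pip install -r requirements.txt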
If you are interested in where those modules are, find your Python installation and look for a "site-packages" folder (on Windows it is usually "C:\PythonX.X\Lib\site-packages" or something like that). But I'm 100% sure you will regret copying modules manually from here to there.
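You can also ask a module directly where it lives, just as the very first question above did with submodule.__file__ (a sketch; module1 stands for whatever module you import):
import module1

print(module1.__file__)  # filesystem path of the imported module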

PYTHONPATH conflict

I am trying to import ZipCodeDatabase in helloworld.py.
helloworld.py exists at /google-app-engine/helloworld
The ZipCodeDatabase module exists at /usr/local/lib/python/python2.7/dist-packages
PYTHONPATH = /usr/local/lib/python/python2.7/dist-packages;/usr/local/lib/python/
When running helloworld I still get "ZipCodeDatabase module not found". Why isn't it being picked up from the PYTHONPATH?
I highly doubt you've got a module called ZipCodeDatabase. That naming convention is typically reserved for a class that resides within a module. Modules are usually lowercase or lower_snake_case, to represent the file containing the module. I'm assuming you've installed pyzipcode here, but it may be a different module.
# assuming pyzipcode.py in the dist-packages directory
$ python -c 'from pyzipcode import ZipCodeDatabase'
If I'm wrong above, then are you sure you're running the version of python that has the ZipCodeDatabase module installed?
Some troubleshooting steps:
$ which python
$ python --version
$ python -c 'import ZipCodeDatabase'
$ ls -l /usr/local/lib/python2.7/dist-packages/ | grep -i zip
Also, is it really necessary for you to specify the PYTHONPATH line? Typically, the site-packages folder (and, by extension, I assume the dist-packages folder on Ubuntu) is already on Python's default search path (sys.path), along with the directory of the script you're running.
How did you install the ZipCodeDatabase? Did you just drop the file in there? Try putting it alongside your helloworld.py file and importing it from there. Also, a full stack trace is useful information here, especially when others are trying to diagnose the problem you're having.
Edit:
Ok, now that I know you're using google app engine (should have been obvious from your use of paths - I'm sorry), it looks like it doesn't use the site-packages or dist-packages to load modules. You should create a sub-directory in your project with the relevant third party libraries, and add that sub-directory to your path. Disclaimer: I've never used GAE so I might be missing the mark with this.
Check out this answer for how to structure your project and add the extra directory to your path from within the application.
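That approach usually boils down to a couple of lines at the top of your entry point (a sketch; the lib directory name is an assumption, and newer App Engine runtimes handle third-party packages differently):
import os
import sys

# Make a bundled third-party directory importable before the other imports run.
sys.path.insert(0, os.path.join(os.path.dirname(__file__), "lib"))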

How to put the "build" module in python search path

After much hassle I built lxml from source. I performed the following steps:
Downloaded lxml.tar.gz and extracted its contents
Built it using
python2.7 setup.py build_ext -i -I /usr/include/libxml2 --with-xslt-config=/opt/xslt/bin/xslt-config
I went into the Python shell and tried import lxml. It didn't work.
Then I went into the directory
/home/user/tmp/(extracted lxml directory)/
and on the Linux command prompt I typed
PYTHONPATH=src python27
then I tried import lxml and it worked.
The src folder contains a folder named lxml.
So I want to know: now that I have built lxml, do I always need that directory to use it, or can I delete it? If I need to keep it, in which location do I have to put that folder so that I can import it when I run Python the normal way?
Are modules that we build ourselves not installed into the Python folder?
Can I make a Python egg from it?
You told it to build_ext, so it just compiled the extension and didn't install anything. If you told it to install, it would install into the system-wide directory (but you need write permissions for that) or into whatever directory you specify (with --home for installing as a user, or --prefix for installing as root to a non-standard directory such as under /opt).
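For example, building with the same flags and installing just for your user can be done in one chained call (a sketch; --user puts the package under ~/.local so no root access is needed):
$ python2.7 setup.py build_ext -I /usr/include/libxml2 --with-xslt-config=/opt/xslt/bin/xslt-config install --user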
When you set PYTHONPATH, you gave it a relative path, so it will only work from that folder. If you specify an absolute path, like:
export PYTHONPATH=/home/user/tmp/extracted_whatever
It will work regardless of the folder you're in now.
