After much hassle I built lxml from source. I performed the following steps:
Downloaded the lxml.tar.gz and extracted its contents
Built it using:
python2.7 setup.py build_ext -i -I /usr/include/libxml2 --with-xslt-config=/opt/xslt/bin/xslt-config
I then opened a Python shell and tried import lxml, but it didn't work.
Then I went into the directory
/home/user/tmp/<extracted lxml directory>/
and at the Linux command prompt I typed
PYTHONPATH=src python2.7
then I tried import lxml again, and this time it worked.
The src folder contains a folder named lxml.
So I want to know: when I build lxml this way, do I always need to keep that directory in order to use it, or can I delete it? If I must keep it, in which location do I need to put that folder so that I can import it when running Python the normal way?
Do the modules we build ourselves not get installed into the Python folder?
Can I make a Python egg from it?
You told it to build_ext, so it only compiled the extension and didn't install anything. If you told it to install, it would install into the system-wide directory (but you need write permission for that) or into whatever directory you specify, using --home (for installing as a user) or --prefix (for installing as root to a non-standard directory, such as one under /opt).
When you set PYTHONPATH, you gave it a relative path, so it will only work from that folder. If you specify an absolute path, like:
export PYTHONPATH=/home/user/tmp/extracted_whatever
it will work regardless of which folder you are currently in.
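What PYTHONPATH actually does can be sketched in pure Python: its entries simply end up at the front of sys.path, and a relative entry like src is resolved against the current working directory, which is why it breaks once you cd elsewhere. A minimal sketch (the mypkg package is fabricated just for the demo):

```python
import os
import sys
import tempfile

# Fabricate a package directory to stand in for the extracted source tree
# (the name "mypkg" is hypothetical).
src = tempfile.mkdtemp()
os.makedirs(os.path.join(src, "mypkg"))
with open(os.path.join(src, "mypkg", "__init__.py"), "w") as f:
    f.write("VALUE = 42\n")

# PYTHONPATH entries are prepended to sys.path at interpreter startup;
# an absolute entry works no matter what the current directory is.
sys.path.insert(0, src)
import mypkg

print(mypkg.VALUE)  # 42
```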
Related
I want to install python, and then some additional modules, to my computer (windows).
I've been told that the best way to install Python and packages is to make a folder somewhere on your drive BEFORE installing anything (e.g. 'Python'), then pointing all your downloads (using cd in the command line) to this folder, so that everything is in one place and you don't have to go on a wild goose chase to make sure everything you want access to is on your PATH when you go to import modules, etc.
Do I have the right idea?
I have had trouble importing modules in the past because they were not in the Path.
Will changing the directory in the command line before typing:
pip install somemodule
cause that module to be saved to where I just changed the directory to?
pip installs libraries into a fixed directory, often under the user folder. You can check this with the command pip show <installed-package-name>
So you can use any package you already installed to get the pip directory. The location might vary based on your python version and env name.
Example: c:\users\<user>\appdata\roaming\python\python37\site-packages
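If you don't have any package installed yet, the interpreter can report the same directories directly (a sketch; the exact paths vary by system, Python version, and environment):

```python
import site
import sysconfig

# Where normal (system or virtualenv) installs go:
print(sysconfig.get_paths()["purelib"])

# Where `pip install --user` installs go, e.g. on Windows:
# c:\users\<user>\appdata\roaming\python\python37\site-packages
print(site.getusersitepackages())
```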
So, I'm playing around with packaging a python script I've written, and it has a submodule, let's call it submodule. The folder structure looks like this:
cool_script/
setup.py
cool_script.py
submodule/
__init__.py
implementation.py
Now, after many pip install . and pip install -e . calls, I have the situation where submodule can be imported globally. No matter where on my system, this will always work:
$ python3
[...]
>>> import submodule
>>> submodule.__file__
'/home/me/fake/path/cool_script/submodule/__init__.py'
But I don't know why.
The package I made was uninstalled again, and pip cannot find the submodule in its index. There's nothing in dist-packages either, I manually deleted the cool_script.egg-link that was still sitting around there:
$ ls /usr/local/lib/python3.4/dist-packages | ack cool
$ ls /usr/local/lib/python3.4/dist-packages | ack submodule
$
The PYTHONPATH is empty as well:
$ echo $PYTHONPATH
$
Why does Python know the location of submodule? How can I find out?
First run python -c "import site; print(site.getsitepackages())". It will print a list like this:
['/XXX/something/site-packages']
Normally there is a single path in this list, and it points to a directory where pip installs your scripts. You can ls into it if you're curious: ls /XXX/something/site-packages/.
More interestingly, though, pip puts a "link" file in that directory when you're using developer installs (a.k.a. pip install -e). The "link" file is named after the original project with a .egg-link extension at the end.
So you probably have a cool_script.egg-link file in that directory. And if you try to print it out you should find that its contents list the original filesystem location of your module. Something like:
$ cat /XXX/something/site-packages/cool_script.egg-link
/home/me/fake/path/cool_script/
.
This is how pip records that it has installed something in developer mode, but it isn't how Python actually knows how to find your module (that would have been too easy, right? :-)).
Python doesn't know about .egg-link files, but it reads all .pth files in the site-packages directory to get additional paths for sys.path (*). So, for Python to be able to import developer mode installs, pip writes all their paths into a single .pth file conventionally called easy-install.pth (because the old easy-install tool actually pioneered that technique). And if you print out that file, you'll get the list of all project paths installed in developer mode:
$ cat /XXX/something/site-packages/easy-install.pth
/home/me/fake/path/cool_script/
/home/me/another/project/
And you can check that all the paths listed in easy-install.pth do indeed get added to your sys.path.
(*) Technically, the part of Python that reads those .pth files is the site module, which is normally imported automatically at startup. There is an option to disable the site module, though, for example by using python -S. In that case, you'll see that sys.path contains neither the site-packages directory nor the developer install paths.
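The .pth mechanism is easy to reproduce with the site module itself (a sketch using throwaway temporary directories):

```python
import os
import site
import sys
import tempfile

site_dir = tempfile.mkdtemp()   # stands in for site-packages
extra_dir = tempfile.mkdtemp()  # stands in for a developer-mode project

# A .pth file: each line is a path to add to sys.path.
with open(os.path.join(site_dir, "demo.pth"), "w") as f:
    f.write(extra_dir + "\n")

# site.addsitedir() processes .pth files the same way startup does.
site.addsitedir(site_dir)
print(extra_dir in sys.path)  # True
```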
I want to add a specific library path only to python2. After adding export PYTHONPATH="/path/to/lib/" to my .bashrc, however, executing python3 gets the error: Your PYTHONPATH points to a site-packages dir for Python 2.x but you are running Python 3.x!
I think this is because python2 and python3 share the same PYTHONPATH variable.
So, can I set different PYTHONPATH variables for python2 and python3 respectively? If not, how can I add a library path to only one particular version of Python?
PYTHONPATH is somewhat of a hack as far as package management is concerned. A "pretty" solution would be to package your library and install it.
This could sound more tricky than it is, so let me show you how it works.
Let us assume your "package" has a single file named wow.py and you keep it in /home/user/mylib/wow.py.
Create the file /home/user/mylib/setup.py with the following content:
from setuptools import setup
setup(name="WowPackage",
      py_modules=["wow"],  # the single module file, wow.py
      )
That's it, now you can "properly install" your package into the Python distribution of your choice without the need to bother about PYTHONPATH. As far as "proper installation" is concerned, you have at least three options:
"Really proper". Will copy your code to your python site-packages directory:
$ python setup.py install
"Development". Will only add a link from the python site-packages to /home/user/mylib. This means that changes to code in your directory will have effect.
$ python setup.py develop
"User". If you do not want to write to the system directories, you can install the package (either "properly" or "in development mode") to /home/user/.local directory, where Python will also find them on its own. For that, just add --user to the command.
$ python setup.py install --user
$ python setup.py develop --user
To remove a package installed in development mode, do
$ python setup.py develop -u
or
$ python setup.py develop -u --user
To remove a package installed "properly", do
$ pip uninstall WowPackage
If your package is more interesting than a single file (e.g. you have subdirectories and such), list those in the packages parameter of the setup function instead (you will need to list everything recursively, hence for larger libraries you will want a helper function). Once you get the hang of it, make sure to read a more detailed manual as well.
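That helper already exists in setuptools as find_packages; a small self-contained sketch (the wow/wow.sub tree here is fabricated on the fly):

```python
import os
import tempfile

from setuptools import find_packages

# Fabricate a nested package tree: wow/ and wow/sub/, each with __init__.py.
root = tempfile.mkdtemp()
for pkg in ("wow", os.path.join("wow", "sub")):
    os.makedirs(os.path.join(root, pkg))
    open(os.path.join(root, pkg, "__init__.py"), "w").close()

# find_packages() walks the tree and lists every package it finds,
# so you don't have to enumerate subpackages by hand.
print(sorted(find_packages(where=root)))  # ['wow', 'wow.sub']
```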
In the end, go and contribute your package to PyPI -- it is as simple as calling python setup.py sdist register upload (you'll need a PyPI username, though).
You can create a configuration file mymodule.pth under lib/site-packages (on Windows) or lib/pythonX.Y/site-packages (on Unix and Macintosh), then add one line containing the directory to add to python path.
From docs.python2 and docs.python3:
A path configuration file is a file whose name has the form name.pth and exists in one of the four directories mentioned above; its contents are additional items (one per line) to be added to sys.path. Non-existing items are never added to sys.path, and no check is made that the item refers to a directory rather than a file. No item is added to sys.path more than once. Blank lines and lines beginning with # are skipped. Lines starting with import (followed by space or tab) are executed.
I found that there is no way to modify PYTHONPATH for only python2 or only python3; I had to use a .pth file.
What I had to do was:
make sure directory is created in my home: $HOME/.local/lib/python${MAJOR_VERSION}.${MINOR_VERSION}/site-packages
create a .pth file in that directory
test that your .pth file works
done
For more info on .pth file syntax and how these files work, please see the python2 docs and python3 docs.
(.pth files in a nutshell: when your Python interpreter starts, it looks in certain directories, finds any .pth files there, parses them, and adds the directories they list to your sys.path (i.e. the same behavior as PYTHONPATH), making any Python modules located in those directories available for normal importing.)
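The versioned directory name above can be computed instead of hard-coded; a sketch assuming the Linux ~/.local layout used in the steps:

```python
import os
import sys

# The .pth directory encodes the interpreter version, which is exactly
# why a file placed there affects only that Python version.
major, minor = sys.version_info[:2]
user_site = os.path.expanduser(
    "~/.local/lib/python{}.{}/site-packages".format(major, minor)
)
print(user_site)
```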
If you don't want to bother with moving/adding documents in lib/site-packages, try adding two lines of code in the python2.7 script you would like to run (below.)
import sys
sys.path = [p for p in sys.path if p.startswith(r'C:\Python27')]
This way, sys.path is filtered (dropping all Python 3.x package directories) every time you run your code.
I need to run a Python module as an ordinary Linux command, without worrying about the location of the file. Can I do it with Distutils?
To be specific about what I need: a user should be able to do a simple installation and then use it:
python setup.py install
mymodule --arg value
How can I do that?
If you specify a list of scripts in setup(), distutils will automatically install them into an appropriate directory:
from distutils.core import setup
setup(
name='somemodule',
scripts=['mymodule']
)
Note that mymodule here will need to be a runnable Python script with that name; it probably shouldn't be your actual module, but rather a small script that imports your module as necessary.
If you want to be able to run the command by specifying just its name, it needs to be installed in a directory listed in the PATH environment variable. For user programs, the usual locations are /bin and /usr/bin (with /usr/local/bin being fairly common too).
So set up your package to install your script into one of those locations. Note, however, that this will require root privileges.
If you want to be able to install without root privileges, the alternative is to ensure that the script's directory is listed in the PATH variable.
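For completeness: if you can depend on setuptools rather than plain distutils, a console_scripts entry point generates the wrapper script for you and installs it on the PATH for both system and --user installs (the module and function names below are illustrative):

```python
from setuptools import setup

setup(
    name="somemodule",
    py_modules=["mymodule"],
    entry_points={
        "console_scripts": [
            # installs a `mymodule` command that calls mymodule.main()
            "mymodule = mymodule:main",
        ],
    },
)
```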
I'm trying to create a deb-package from distribution in a tarball. It has setup.py file.
My actions are:
python setup.py --command-packages=stdeb.command sdist_dsc
cd deb_dist/<pkgname>
debuild -uc -us -i -b
Everything works fine. But when I do
dpkg -i <pkgname>.deb
all of the package's module files are installed into the /usr/share/pyshared/<pkgname> directory, and I want to change that.
Is that possible? How?
Thanks.
That's the right directory for installation of Python system libraries, according to the Debian Python Policy. The generated deb source ought to arrange for those files to be symlinked into the appropriate /usr/lib/python2.*/dist-packages directories, based on which Python versions are installed. That would normally be taken care of by the dh_python2 tool during the package build; it should put calls to update-python-modules in the generated postinst.
That behavior can be changed, but the right way to change it depends on the reason you want to change it. What part of this process isn't working for you?