Change install path for python package when creating .deb-package

I'm trying to create a deb package from a distribution tarball. It has a setup.py file.
My steps are:
python setup.py --command-packages=stdeb.command sdist_dsc
cd deb_dist/<pkgname>
debuild -uc -us -i -b
Everything works fine. But when I do
dpkg -i <pkgname>.deb
all of the package's module files are installed into the /usr/share/pyshared/<pkgname> directory, and I want to change that.
Is it possible? How?
Thanks.

That's the right directory for installation of Python system libraries, according to the Debian Python Policy. The generated deb source ought to be arranging for those files to be symlinked into the appropriate /usr/lib/python2.*/dist-packages directories, based on which Python versions are installed. That would normally be taken care of by the dh_python2 tool during the package build; it should put calls to update-python-modules in the generated postinst.
That behavior can be changed, but the right way to change it depends on the reason you want to change it. What part of this process isn't working for you?
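If the goal is simply a different on-disk location, one hedged approach (assuming a plain distutils/setuptools setup.py; the path below is a made-up example) is to override the library directory at install time:
$ python setup.py install --install-lib=/usr/lib/mypackage
Note that a package built this way no longer follows the policy layout described above, so the symlinking machinery won't manage those files for you.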

Related

how to set different PYTHONPATH variables for python3 and python2 respectively

I want to add a specific library path only to python2. After adding export PYTHONPATH="/path/to/lib/" to my .bashrc, however, executing python3 produces the error: Your PYTHONPATH points to a site-packages dir for Python 2.x but you are running Python 3.x!
I think this is because python2 and python3 share the same PYTHONPATH variable.
So, can I set different PYTHONPATH variables for python2 and python3 respectively? If not, how can I add a library path exclusively to a particular version of Python?
PYTHONPATH is somewhat of a hack as far as package management is concerned. A "pretty" solution would be to package your library and install it.
This may sound trickier than it is, so let me show you how it works.
Let us assume your "package" has a single file named wow.py and you keep it in /home/user/mylib/wow.py.
Create the file /home/user/mylib/setup.py with the following content:
from setuptools import setup

setup(name="WowPackage",
      py_modules=["wow"],  # a single .py file is declared via py_modules, not packages
      )
That's it, now you can "properly install" your package into the Python distribution of your choice without the need to bother about PYTHONPATH. As far as "proper installation" is concerned, you have at least three options:
"Really proper". Will copy your code to your python site-packages directory:
$ python setup.py install
"Development". Will only add a link from the python site-packages to /home/user/mylib. This means that changes to code in your directory will have effect.
$ python setup.py develop
"User". If you do not want to write to the system directories, you can install the package (either "properly" or "in development mode") to /home/user/.local directory, where Python will also find them on its own. For that, just add --user to the command.
$ python setup.py install --user
$ python setup.py develop --user
To remove a package installed in development mode, do
$ python setup.py develop -u
or
$ python setup.py develop -u --user
To remove a package installed "properly", do
$ pip uninstall WowPackage
If your package is more interesting than a single file (e.g. you have subdirectories and such), just list those in the packages parameter of the setup function. You would need to list everything recursively, so for larger libraries use a helper such as setuptools.find_packages, as shown below. Once you get the hang of it, make sure to read a more detailed manual as well.
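For instance, a minimal sketch using setuptools.find_packages (the name WowPackage is carried over from the example above; the directory layout is assumed):
from setuptools import setup, find_packages

setup(
    name="WowPackage",
    # find_packages() walks the source tree and lists every directory
    # containing an __init__.py, so subpackages are picked up recursively
    packages=find_packages(),
)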
In the end, go and contribute your package to PyPI -- it is as simple as calling python setup.py sdist register upload (you'll need a PyPI username, though).
You can create a configuration file mymodule.pth under lib/site-packages (on Windows) or lib/pythonX.Y/site-packages (on Unix and Macintosh), then add one line containing the directory to add to the Python path.
From docs.python2 and docs.python3:
A path configuration file is a file whose name has the form name.pth and exists in one of the four directories mentioned above; its contents are additional items (one per line) to be added to sys.path. Non-existing items are never added to sys.path, and no check is made that the item refers to a directory rather than a file. No item is added to sys.path more than once. Blank lines and lines beginning with # are skipped. Lines starting with import (followed by space or tab) are executed.
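For example, a hypothetical mymodule.pth consisting of a single line:
/home/user/mylib
Once the interpreter starts, /home/user/mylib is appended to sys.path and its modules become importable.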
I found that there is no way to modify PYTHONPATH that is only for python2 or only for python3. I had to use a .pth file.
What I had to do was:
make sure the directory exists in your home: $HOME/.local/lib/python${MAJOR_VERSION}.${MINOR_VERSION}/site-packages
create a .pth file in that directory (a sketch follows this list)
test that your .pth file works
done
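A minimal sketch of step 2, assuming Python 2.7 and a hypothetical library directory /path/to/lib (run it with the interpreter you want to affect):
import os
import site

# per-user site-packages of the *running* interpreter,
# e.g. ~/.local/lib/python2.7/site-packages
site_dir = site.getusersitepackages()
if not os.path.isdir(site_dir):
    os.makedirs(site_dir)

# each line of a .pth file is one directory to append to sys.path
with open(os.path.join(site_dir, "mylibs.pth"), "w") as f:
    f.write("/path/to/lib\n")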
For more info on .pth file syntax and how these files work, please see the python2 docs and python3 docs.
(.pth files in a nutshell: when your Python interpreter starts, it looks in certain directories for .pth files, opens them, parses them, and adds the listed directories to your sys.path (i.e. the same behavior as PYTHONPATH), making any Python modules located in those directories available for normal importing.)
If you don't want to bother with moving/adding files in lib/site-packages, try adding two lines of code to the python2.7 script you would like to run (below):
import sys
# keep only the Python 2.7 entries on sys.path, dropping any Python 3.x package dirs
sys.path = [p for p in sys.path if p.startswith(r'C:\Python27')]
This way, sys.path will be filtered (ignoring all python3.x packages) every time you run your code.

Using a local module instead of system installed module (Python 2.x.x)

I have a cluster system at work with Python and some modules installed on it. However, I wanted to use the most up-to-date version of a module, since it has several methods not present in older versions, so I built it and its deps locally in the area I have access to:
# From my home directory: /gpfs/env/yrq12edu
# Get the source I need for the up to date version of the module I want to install locally.
svn co svn://svn.code.sf.net/p/simupop/code/trunk simuPOP
# Install PCRE stuff...
cd pcre-8.34
./configure --prefix=/gpfs/env/yrq12edu/pcre_install
make
make install
export PATH=/gpfs/env/yrq12edu/pcre_install/bin:$PATH
export LD_LIBRARY_PATH=/gpfs/env/yrq12edu/pcre_install/lib:$LD_LIBRARY_PATH
cd ..
# Install Swig Stuff...
cd swig-3.0.0
./configure --prefix=/gpfs/env/yrq12edu/swig_install
make
make install
export PATH=/gpfs/env/yrq12edu/swig_install/bin:$PATH
cd ..
export PYTHONPATH=/gpfs/env/yrq12edu/PythonModules/lib/python2.7/site-packages
# Build the up to date simuPOP module I need locally...
cd simuPOP
python setup.py install --prefix=/gpfs/env/yrq12edu/PythonModules
How can I ensure that when I execute my Python scripts on the cluster they will use my local module rather than the system one? I have obviously changed PYTHONPATH during the build process, which I know should allow modules to be loaded locally, but I wondered which one will be loaded when there is a choice between the old system-installed version and my new locally installed version. Will Python just know to favour the local one, or do I have to specify some option to force it?
Thanks,
Ben W.
According to the docs, Python will load a built-in module of that name if one is available. If not, it then looks in each path in sys.path (which starts with the current directory).
However, if I'm reading it correctly, standard modules are different from built-in modules. Standard modules are found by looking in sys.path, so if you put your path at the start of sys.path, Python will get your module instead of the standard one.
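One way to check (a small sketch reusing the paths from the question) is to import the module and print where it was loaded from:
import sys
# putting the local site-packages first guarantees it wins the search
sys.path.insert(0, "/gpfs/env/yrq12edu/PythonModules/lib/python2.7/site-packages")

import simuPOP
print(simuPOP.__file__)  # should point into PythonModules, not the system location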

Why does easy install want access to my rootfs for a "develop" install?

I'm looking at a Python application server and I wanted to play around with the code. I'm led to believe that passing "develop" to setup.py should leave everything in place without installing anything. However, when I run it, it attempts to create directories in my rootfs.
./setup.py develop
Gives:
running develop
Checking .pth file support in /usr/local/lib/python2.7/dist-packages/
error: can't create or remove files in install directory
I thought this might be something to do with package checking, but surely attempting to write into the rootfs is wrong?
The develop command wants to add a .pth entry for your project so that it can be imported as an egg. See the Development mode documentation, as well as the develop command docs.
The default is to put that entry in site-packages. Set a different library path with the --install-dir switch.
You can use the --user option (see "Alternate installation: the user scheme").
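For example (hedged; ~/pylib is a made-up location, and develop will only accept it if it is already on your PYTHONPATH):
$ mkdir -p ~/pylib
$ export PYTHONPATH=~/pylib
$ ./setup.py develop --install-dir=~/pylib
or simply:
$ ./setup.py develop --user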

How to put the "build" module in python search path

After much hassle I built lxml from source. I performed the following steps:
Downloaded lxml.tar.gz and extracted its contents
Built it using
python2.7 setup.py build_ext -i -I /usr/include/libxml2 --with-xslt-config=/opt/xslt/bin/xslt-config
I tried going into the Python shell and running import lxml. It didn't work.
Then I went into the directory
/home/user/tmp/(extracted lxml directory)/
and at the Linux command prompt I typed
PYTHONPATH=src python27
then I tried import lxml and it worked.
The src folder contains a folder named lxml.
So I want to know: now that I've built lxml, do I always need that directory in order to use it, or can I delete it? If not, in which location do I need to put that folder so that I can access it when running Python the normal way?
Are modules we build ourselves not installed into the Python folder?
Can I make a Python egg from it?
You told it to build_ext, so it just compiled it and didn't install it. If you told it to install, it would install into a system-wide directory (for which you need write permissions) or into whatever directory you specify (with --home for installing as a user, or --prefix for installing as root to a non-standard location like under /opt).
When you set PYTHONPATH, you gave it a relative path, so it will only work from that folder. If you specify an absolute path, like:
export PYTHONPATH=/home/user/tmp/extracted_whatever
It will work regardless of which folder you're currently in.
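Putting it together, a hedged sketch of a per-user install that survives directory changes (reusing the question's build options; --home and the resulting lib/python layout follow the distutils "home scheme"):
$ python2.7 setup.py install --home=$HOME/pylibs
$ export PYTHONPATH=$HOME/pylibs/lib/python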

Developing Python Module

I'd like to start developing an existing Python module. It has a source folder and a setup.py script to build and install it. The build script just copies the source files, since they're all Python scripts.
Currently, I have put the source folder under version control, and whenever I make a change I re-build and re-install. This seems a little slow, and it doesn't sit well with me to "commit" my changes to my Python install each time I make a modification. How can I make my import statements resolve to my development directory?
Use a virtualenv and run python setup.py develop to link your module into the virtual Python environment. This will make your project's packages/modules show up on sys.path without having to run install.
Example:
% virtualenv ~/virtenv
% . ~/virtenv/bin/activate
(virtenv)% cd ~/myproject
(virtenv)% python setup.py develop
Virtualenv was already mentioned.
And since your files are already under version control, you could go one step further and use pip to install your repo (or a specific branch or tag) into your working environment.
See the docs for Pip's editable option:
-e VCS+REPOS_URL[#REV]#egg=PACKAGE, --editable=VCS+REPOS_URL[#REV]#egg=PACKAGE
Install a package directly from a checkout. Source
will be checked out into src/PACKAGE (lower-case) and
installed in-place (using setup.py develop).
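For example (a hypothetical repository URL):
$ pip install -e git+https://example.com/myproject.git#egg=myproject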
Now you can work on the files that pip automatically checked out for you, and when you feel like it, commit your changes and push them back to the originating repository.
To get a good, general overview concerning Pip and Virtualenv see this post: http://www.saltycrane.com/blog/2009/05/notes-using-pip-and-virtualenv-django
Install the distribute package, then use developer mode. Just run python setup.py develop --user and that will place path pointers in your user directory pointing to your workspace.
Change the PYTHONPATH to your source directory. A good option is to work with an IDE like Eclipse that overrides the default PYTHONPATH.
