Integration of manually installed Python libs into the system?

I have manually built numpy, scipy, matplotlib, etc. without root privileges (I needed a fresh matplotlib). All the libs are installed in the standard place:
~/.local/lib/python2.7
Now when I try to install anything related, Synaptic suggests installing all the libs system-wide. Is there a way I can tell Synaptic to use the locally installed libs?
I tried linking ~/.local/lib/python2.7/site-packages to /usr/lib/python2.7, but that didn't help.
Edit:
If I clone a Python package, change the name in setup.py to the name of the Ubuntu package, and then build:
python setup.py bdist --format=rpm
and then convert it to deb with alien:
sudo alien -k my.rpm
and then install the deb:
sudo dpkg -i my.deb
then Synaptic does recognise it as a package (remember I've tweaked the name in setup.py).
But I can't find a way to make Synaptic aware of locally installed Python libs.

How can a package manager that manages packages at the system level know anything about something installed in a user directory, which is the very opposite of the system level?
A package manager resolves dependencies based on meta-information stored in a package (be it rpm, deb, whatever) and/or package repository.
To achieve your goal, you can take either of two options.
The first is to build a system-specific package from your sources and then install it via your package manager. See the Creating Built Distributions docs for that. It would look something like this:
$ python setup.py bdist --format=rpm
$ rpm -i dist/$PACKAGE.rpm
That would make your package manager aware of the fact that some dependencies are already provided.
This approach may or may not work.
The other, preferred option is to use a Python package manager such as pip and install all your packages in a virtual environment (a minimal sketch follows the list below). There are several advantages to this method:
You can have several distinct package sets, with different versions of packages installed.
You can optionally isolate your virtual environment from the packages installed system-wide.
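For example, a minimal sketch of such a virtualenv + pip setup (the environment location and the package names are only placeholders, not anything prescribed here):
$ pip install --user virtualenv                    # if virtualenv isn't available yet
$ virtualenv ~/envs/plotting                       # add --system-site-packages to keep seeing system libs
$ source ~/envs/plotting/bin/activate
(plotting) $ pip install numpy scipy matplotlib    # installed into the env, no root needed
(plotting) $ deactivate                            # leave the environment when done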

Related

Manage non pip dependencies with poetry

We develop our projects with poetry. In one of them we implement QR-code decoding. As the pip package's page states, zbar needs to be installed on the system.
Is it somehow possible for poetry/pip to install the zbar dependency while installing our package?
Unfortunately not. poetry is essentially a wrapper of pip and can only manage Python dependencies from PyPI.
You could consider something like Nix for this use case: https://github.com/NixOS/nix#nix
Or you could consider a Makefile that runs the appropriate brew or apt-get equivalent based on the operating system.
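As a rough shell sketch of that idea (the Debian package name libzbar0 is an assumption; check your distro's actual package name):
# run the OS-appropriate installer for zbar
if [ "$(uname)" = "Darwin" ]; then
    brew install zbar
elif command -v apt-get >/dev/null 2>&1; then
    sudo apt-get install -y libzbar0
else
    echo "please install zbar with your system's package manager" >&2
fi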

When to install external Python packages globally, when locally? pip or system package manager?

I'm confused about the options for installing external Python packages:
install a package locally with pip into /home/chris/.local/lib/python3.4/site-packages
$ pip install --user packagename
install a package globally with pip into /usr/local/lib/python3.4/site-packages
(superuser permission required)
$ pip install packagename
install a package globally with zypper into /usr/lib/python3.4/site-packages
(superuser permission required)
$ zypper install packagename
I use openSUSE with the package manager zypper and have root access.
What I (think I) know about pip:
- pip just downloads the latest version.
- Installed packages are not checked for newer versions.
- My own packages can be installed in a virtual env.
- It takes more time to download and install than zypper.
- Local or global installation is possible.
The package manager of my system:
- Downloads and installs faster.
- Installs packages only globally.
My question is: when and why should I install with pip (locally or globally), and when with zypper?
I've read a lot about this issue but could not answer this question clearly...
The stuff under /usr/lib consists of system packages considered part of the OS. It's likely/possible that OS scripts and services will have dependencies on these components. I'd recommend not touching these yourself, and not using or depending on them for user scripts either, as this will make your app OS- or even OS-version-dependent. Use these if you're writing scripts that run at the system level, such as maintenance or admin tasks, although I'd seriously consider even these using...
Stuff under /usr/local/lib is installed locally for use by any user. System scripts and such won't depend on these (I don't know SuSE myself though), but other users' scripts might well do, so that needs to be borne in mind when making changes here. It's a shared resource. If you're writing scripts that other users might need to run, develop against this to ensure they will have access to all required dependencies.
Stuff in your home directory is all yours, so do as thou wilt. Use this if you're writing something just for yourself and especially if you might need the scripts to be portable to other boxes/OSes.
There might well be other options that make sense, such as if you're part of a team developing application software, in which case install your team's base dev packages in a shared location but perhaps not /usr/local.
In terms of using zypper or pip, I'd suggest using zypper to update /usr/lib for sure, as it's the specific tool for OS configuration updates. The same probably goes for /usr/local/lib too, as that's really part of the 'system', but it's up to you and whichever method makes the most sense, e.g. if you needed to replicate the config on another host. For stuff in your home directory it's up to you, but if you decide to move to a new host on a new OS, pip will still be available, so that environment will be easier to recreate.
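One way to see which of those locations a given interpreter actually picks up (numpy is only an example module name; the paths in the comments are the ones from the question):
$ python3 -c "import numpy; print(numpy.__file__)"
# e.g. /home/chris/.local/lib/python3.4/site-packages/... if the pip --user copy wins,
# or /usr/lib/python3.4/site-packages/... if the zypper-installed copy is picked up
$ python3 -c "import sys; print('\n'.join(sys.path))"    # shows the search order itself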

How to make a debian package which includes several python packages

I want to create a Debian package that, when installed, will install several Python packages with pip. I can think of two ways:
install the Python packages into a directory, then make a Debian package from that directory. But this will confuse the building host (e.g. its pip metadata), especially if the host already has some of those packages installed.
make a Debian package with all the Python packages, and during Debian install and uninstall run some scripts to install/uninstall the Python packages. But this needs two more scripts to be maintained, and somewhere to hold all the Python packages on the target machine.
Is there any other solution, and what's the best way to solve this problem?
In my opinion, if you want to create a Debian package you should avoid referencing external distribution systems.
Here are the guidelines about creating Python packages under Debian.
EDIT: Sorry, I see now that the Debian wiki page about Python Packaging could be outdated. You could read:
the guide for pybuild
and possibly the Python page about building packages
If you want to create a meta package which depends on python-<packagename> packages in the repositories, that is easy and I think you already know it (if not, google the equivs package). I assume you would like to have recent versions of the Python packages installed, or some packages are missing from the Debian repositories, so the Debian repositories will not be used.
pip is a good tool, however you can break dependencies if you uninstall a Python package which may be required by another package that apt installs after your meta package. apt is your friend; you should be careful. To overcome this, my suggestion is to add the appropriate package names to your meta package's control file's Provides, Conflicts and Replaces fields, whether you dynamically install the Python packages via pip or bundle them in your main package. I quickly searched for "bundling multiple debian packages into one package" and found no solution.
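As a hypothetical sketch of that control-file approach using equivs (every package name and version below is a placeholder):
$ sudo apt-get install equivs
$ equivs-control my-python-stack           # writes a template control file named my-python-stack
# edit the file so it contains roughly:
#   Package: my-python-stack
#   Version: 1.0
#   Provides: python-requests, python-six
#   Conflicts: python-requests, python-six
#   Replaces: python-requests, python-six
#   Description: meta package marking pip-installed libraries as provided
$ equivs-build my-python-stack             # produces my-python-stack_1.0_all.deb
$ sudo dpkg -i my-python-stack_1.0_all.deb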
If you want to completely separate your Python packages from your system-wide Python packages, virtualenv is the best choice I know of.
And if you want to build Debian-compliant packages using pip, stdeb can do that easily.
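For what it's worth, the stdeb invocation looks roughly like this (a sketch only; see the stdeb documentation for the exact options on your version):
$ pip install stdeb
$ python setup.py --command-packages=stdeb.command bdist_deb
# the resulting .deb ends up under deb_dist/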
Moreover, as far as I remember, I saw some packages in Kali Linux (Debian-based) dynamically installing Python packages during install or during startup; however, Debian policies may not allow this kind of flexibility, so as not to break dependencies (if you want to build an official package). I hope this answer guides you in the right direction.

Can I have my pip user-installed package be preferred over system?

I would like to figure out a "fool-proof" installation instruction to put in the README of a Python project, call it footools, such that other people in our group can install the newest SVN version of it on their laptops and their server accounts.
The problem is getting the user-installed libs to be used by Python when they call the scripts installed by pip. E.g., we're using a server that has an old version of footools in /opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/.
If I do python2.7 setup.py install --user and run the main entry script, it uses the files in /Users/unhammer/Library/Python/2.7/lib/python/site-packages/. This is what I want, but setup.py alone doesn't install dependencies.
If I (revert the installation and) instead do pip-2.7 install --user . and run the main entry script, it uses the old files in /opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/ – that's not what I want.
If I (revert the installation and) instead do pip-2.7 install --user -e . and run the main entry script, it uses the files in . – that's not what I want, the user should be able to remove the source dir (and be able to svn up without that affecting their install).
I could use (and recommend other people to use) python2.7 setup.py install --user – but then they have to first do
pip-2.7 install -U --user -r requirements.txt -e .
pip-2.7 uninstall -y footools
in order to get the dependencies installed (since pip has no install --only-deps option). That's rather verbose though.
What is setup.py doing that pip is not doing here?
(Edited to make it clear I'm looking for simpler+safer installation instructions.)
Install virtualenvwrapper. It allows setting up separate Python environments to alleviate any conflicts you might be having. Here is a tutorial for installing and using virtualenv.
Related:
https://virtualenvwrapper.readthedocs.org/en/latest/
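A rough sketch of that setup (the virtualenvwrapper.sh path is an assumption about where pip --user puts scripts on your machine; footools comes from the question):
$ pip-2.7 install --user virtualenvwrapper
$ export WORKON_HOME=~/.virtualenvs                      # default location, adjust if you like
$ source ~/Library/Python/2.7/bin/virtualenvwrapper.sh   # adjust to your --user scripts directory
$ mkvirtualenv footools
(footools) $ pip install -r requirements.txt .           # dependencies plus the package itself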
Console scripts generated by pip during installation should use user-installed versions of libraries, according to PEP 370:
The user site directory is added before the system site directories but after Python's search paths and PYTHONPATH. This setup allows the user to install a different version of a package than the system administrator, but it prevents the user from accidentally overwriting a stdlib module. Stdlib modules can still be overwritten with PYTHONPATH.
Sidenote
Setuptools uses a hack: it inserts code into the easy_install.pth file placed in the site-packages directory. This code makes packages installed with setuptools come before other packages on sys.path, so they shadow other packages with the same name. This is referred to as sys.path modification in the table comparing setuptools and pip. This is the reason console scripts use user-installed libraries when you install with setup.py install instead of using pip.
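If you want to check whether that is what is happening on your server, one way (the long path below is the system site-packages directory quoted in the question):
$ python2.7 -c "import sys; print('\n'.join(sys.path))"    # shows what shadows what
$ cat /opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/easy_install.pth
# if the old footools (or its install directory) is listed there, setuptools is pushing it
# ahead of the --user copy on sys.path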
Taking all of the above into account, what you observe might be caused by:
PYTHONPATH pointing to directories with system-wide installed libraries
Having system-wide libraries installed using sudo python setup.py install (...)
Having the OS influence sys.path construction in some way
In the first case, either clearing PYTHONPATH or adding the path to the user-installed library to the beginning of PYTHONPATH should help.
In the second case, uninstalling the system-wide libraries and installing them with the distro package manager instead might help (please note that you should never use sudo with pip or setup.py to install Python packages).
In the third case, it's necessary to find out how the OS influences sys.path construction and whether there's some way of placing user-installed libraries before system ones.
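As an illustration of the fix for the first case, a minimal sketch (the path is the user site directory from the question; footools-script is a hypothetical console script name):
$ export PYTHONPATH="$HOME/Library/Python/2.7/lib/python/site-packages:$PYTHONPATH"
$ footools-script    # hypothetical entry point; it should now pick up the --user copy first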
You might be interested in reading the issue "pip list reports wrong version of package installed both in system site and user site", where I asked basically the same question as you:
Does it mean that having system wide Python packages installed with easy_install thus having them use sys.path manipulation breaks scripts from user bin directory? If so is there any workaround?
A last-resort solution would be to manually place the directory/directories with user-installed libraries at the beginning of sys.path in your scripts, before importing these libraries.
Having said that, if your users do not need direct access to the source code, I would propose packaging your app together with all its dependencies into a self-contained bundle using a tool like pex or Platter.
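For example, a sketch of a pex-based bundle (this assumes footools defines a console_script entry point also named footools; adjust to your actual entry point):
$ pip install pex
$ pex . -c footools -o footools.pex    # "." is the package directory; -c names the console script
$ ./footools.pex                       # runs against the dependencies bundled inside the .pex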

setup.py not honoring PIP_INDEX_URL

I am running a local PyPI server. I can install packages from this server either by specifying it with the -i option of the pip command or by setting the PIP_INDEX_URL environment variable. When I install a package that has prerequisites, setup.py has historically honored the PIP_INDEX_URL environment variable, pulling the additional packages from my local server.
However, on a couple of systems that have been installed recently, it is behaving differently. Running, for instance, python setup.py develop fails because it tries to install prerequisite packages from pypi.python.org.
I have updated all of the related python packages (python, distribute, virtualenv, pip, etc...) on all the systems I'm testing on and continue to see this discrepancy. On my "original" system, setup.py downloads prerequisites from the pypi server specified in my PIP_INDEX_URL environment variable. On the newer systems, I can't seem to make it honor this variable.
What am I missing?
Create a setup.cfg in the same folder as your setup.py with the following content:
[easy_install]
allow_hosts = *.myintranet.example.com
From: http://pythonhosted.org/setuptools/easy_install.html#restricting-downloads-with-allow-hosts
You can use the --allow-hosts (-H) option to restrict what domains EasyInstall will look for links and downloads on.
--allow-hosts=None prevents downloading altogether.
I ran into the same issue. Fundamentally, setup.py is using setuptools which leverages easy_install, not pip. Thus, it ignores any pip-related environment variables you set.
Rather than using python setup.py develop, you can run pip from the top of the package: pip install -e . produces the same effect.
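Putting the two together, a sketch of that route (the index URL is a placeholder for your local server):
$ export PIP_INDEX_URL=http://pypi.example.internal/simple/
$ cd /path/to/your/package
$ pip install -e .    # pip resolves the prerequisites against PIP_INDEX_URL, unlike setup.py develop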
