Manage non-pip dependencies with poetry (Python)

We develop our projects with poetry. In one of them we implement QR-code decoding. As the pip package's page states, zbar needs to be installed on the system.
Is it somehow possible to have poetry/pip install the zbar dependency while installing our package?

Unfortunately not. poetry is essentially a wrapper around pip and can only manage Python dependencies from PyPI.
You could consider something like Nix for this use case: https://github.com/NixOS/nix#nix
Or you could consider a Makefile that runs the appropriate brew, apt-get, or equivalent command based on the operating system.
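The Makefile approach can be complemented by a check in the package itself that fails with an actionable message when the native library is missing. A minimal sketch in Python; the library name (zbar) and the per-platform package names (libzbar0 for apt, zbar for brew) are assumptions to adapt to your targets:

```python
# Sketch: detect a missing native zbar library at startup and print a
# platform-appropriate install hint. Package names below are assumptions.
import ctypes.util
import sys

INSTALL_HINTS = {
    "linux": "sudo apt-get install libzbar0   # or your distro's equivalent",
    "darwin": "brew install zbar",
}

def install_hint(platform: str) -> str:
    """Return an install suggestion for the given sys.platform value."""
    return INSTALL_HINTS.get(platform, "install zbar with your system package manager")

def ensure_zbar() -> None:
    """Raise a readable error if the zbar shared library cannot be found."""
    if ctypes.util.find_library("zbar") is None:
        raise RuntimeError(
            "zbar shared library not found; try: " + install_hint(sys.platform)
        )

if __name__ == "__main__":
    try:
        ensure_zbar()
        print("zbar found")
    except RuntimeError as err:
        print(err)
```

Running this at import time turns a confusing decode failure into a clear message for users who installed only the Python package.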

Related

Use pip or dnf to install python packages in Fedora?

When I install some python packages in Fedora, there are two ways:
use dnf install python-package
use pip install package
I notice that even if I use dnf update to bring my Fedora fully up to date,
when I use pip, it still tells me something like
pip is an old version, please update pip
I guess dnf package management is different from python-pip package management.
So which one is more recommended for installing python packages?
Quoted from Gentoo Wiki:
It is important to understand that packages installed using pip will not be tracked by Portage. This is the case for installing any package through means other than the emerge command. Possible conflicts can be created when installing a Python package that is available in the Portage tree, then installing the same package using pip.
Decide which package manager will work best for the use case: either use emerge or pip for Python packages, but not both. Sometimes a certain Python package will not be available in the Portage tree; in these cases the only option is to use pip. Be wise and make good choices!
This is true for almost any modern package manager. If you are using packages or package versions that only exist in pip, use it, but don't try to install them from dnf as well. Doing so will not only cause file collisions but will also (most likely) break the package manager's knowledge of the system, which usually leads to major package-management issues.
Another solution would be to use pip in user mode, without root permissions, which installs everything into your home directory.
So again, it's fine to use either pip or dnf, just don't mix the two package managers for the same packages.
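To see what user mode actually does, a short sketch that prints the user install location (where pip install --user puts things) versus the system-wide locations that dnf manages:

```python
# Sketch: show where `pip install --user` installs packages, versus the
# system-wide site-packages directories that dnf/rpm manage.
import site

user_site = site.getusersitepackages()   # e.g. ~/.local/lib/pythonX.Y/site-packages
system_sites = site.getsitepackages()    # system-wide, package-manager territory

print("pip --user installs into:", user_site)
print("system packages live in:", system_sites)
# The user site is searched before the system site on sys.path, so
# user-installed versions shadow dnf-installed ones without touching them.
```

Because the two locations are disjoint, user-mode pip never collides with files dnf owns, which is exactly why it avoids the problems described above.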

How to make a debian package which includes several python packages

I want to create a debian package that, when installed, will install several python packages with pip. I can think of two ways:
install the python packages into a directory, then make a debian package from that directory. But this will confuse the build host (such as its pip metadata), especially if the host already has some of those packages installed.
make a debian package containing all the python packages and, during debian install and uninstall, run scripts that install/uninstall the python packages. But this requires two more scripts to maintain, and some place to hold all the python packages on the target machine.
Is there any other solution, and what's the best way to solve this problem?
In my opinion, if you want to create a debian package, you should avoid referencing external distribution systems.
Here are the guidelines about creating python packages under debian.
EDIT: Sorry, I see now that the Debian wiki page about Python Packaging could be outdated. You could read:
the guide for pybuild
eventually the python page about building packages
If you want to create a meta package which depends on the python-<packagename> packages in the repositories, that is easy, and I think you already know it (if not, google the equivs package). I assume you would like to have recent versions of the python packages installed, or some packages are missing from the debian repositories, so the debian repositories will not be used.
pip is a good tool, but you can break dependencies if you uninstall a python package that may be required by another package installed by apt after your meta package. apt is your friend; you should be careful. To overcome this, my suggestion is to add the appropriate package names to the Provides, Conflicts and Replaces fields of your meta package's control file, whether you dynamically install the python packages via pip or bundle them in your main package. I quickly searched for "bundling multiple debian packages into one package" and found no solution.
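As a sketch, the relevant fields in such a meta package's debian/control could look like this (all package names here are hypothetical examples):

```
Package: my-python-bundle
Version: 1.0
Architecture: all
Depends: python, python-pip
Provides: python-autobahn, python-twisted
Conflicts: python-autobahn, python-twisted
Replaces: python-autobahn, python-twisted
Description: Meta package that supplies python-autobahn and python-twisted via pip
```

With these fields, apt treats the meta package as the owner of those package names, so a later apt-installed package that depends on them will not pull in conflicting copies.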
If you want to completely separate your python packages from your system-wide python packages, virtualenv is the best choice I know.
And if you want to build debian compliant packages using pip, stdeb can do that easily.
Moreover, as far as I remember, I saw some packages in Kali Linux (Debian-based) dynamically installing python packages during install or during startup; however, Debian policies may not allow this kind of flexibility, so as not to break dependencies (if you want to build an official package). I hope this answer guides you in the right direction.

Difference between 'python setup.py install' and 'pip install'

I have an external package I want to install into my python virtualenv from a tar file.
What is the best way to install the package?
I've discovered 2 ways that can do it:
Extract the tar file, then run python setup.py install inside of the extracted directory.
pip install packagename.tar.gz, as in example #7 of https://pip.pypa.io/en/stable/reference/pip_install/#examples
Is there any difference between these two ways?
On the surface, both do the same thing: doing either python setup.py install or pip install <PACKAGE-NAME> will install your python package for you, with a minimum amount of fuss.
However, using pip offers some additional advantages that make it much nicer to use.
pip will automatically download all dependencies for a package for you. In contrast, if you use setup.py, you often have to manually search out and download dependencies, which is tedious and can become frustrating.
pip keeps track of various metadata that lets you easily uninstall and update packages with a single command: pip uninstall <PACKAGE-NAME> and pip install --upgrade <PACKAGE-NAME>. In contrast, if you install a package using setup.py, you have to manually delete and maintain a package by hand if you want to get rid of it, which could be potentially error-prone.
You no longer have to manually download your files. If you use setup.py, you have to visit the library's website, figure out where to download it, extract the file, run setup.py... In contrast, pip will automatically search the Python Package Index (PyPI) to see if the package exists there, and will automatically download, extract, and install the package for you. With a few exceptions, almost every genuinely useful Python library can be found on PyPI.
pip will let you easily install wheels, the newer standard for Python distribution. More info about wheels.
pip offers additional benefits that integrate well with using virtualenv, which is a program that lets you run multiple projects that require conflicting libraries and Python versions on your computer. More info.
pip is bundled by default with Python as of Python 2.7.9 on the Python 2.x series, and as of Python 3.4.0 on the Python 3.x series, making it even easier to use.
So basically, use pip. It only offers improvements over using python setup.py install.
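The metadata point above is easy to see from Python itself. A small sketch using importlib.metadata (Python 3.8+), which reads the same per-distribution records that pip maintains:

```python
# Sketch: the per-package metadata pip records (which a bare
# `setup.py install` largely lacks) is what makes `pip uninstall` possible.
from importlib import metadata

# List a few installed distributions and how many files pip recorded for each.
for dist in list(metadata.distributions())[:5]:
    name = dist.metadata["Name"]
    files = dist.files or []  # files can be None if no RECORD exists
    print(f"{name} {dist.version}: {len(files)} recorded files")

# `pip uninstall` works by deleting exactly the files listed in each
# distribution's RECORD; without that record, removal is manual guesswork.
```

This is also the mechanism behind pip list and pip show: they simply read these records rather than scanning the filesystem.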
If you're using an older version of Python, can't upgrade, and don't have pip installed, you can find more information about installing pip at the following links:
Official instructions on installing pip for all operating systems
Instructions on installing pip on Windows (including solutions to common problems)
Instructions on installing pip for Mac OS X
pip, by itself, doesn't really require a tutorial. 90% of the time, the only command you really need is pip install <PACKAGE-NAME>. That said, if you're interested in learning more about the details of what exactly you can do with pip, see:
Quickstart guide
Official documentation.
It is also commonly recommended that you use pip and virtualenv together. If you're a beginner to Python, I personally think it'd be fine to start off with just using pip and installing packages globally, but eventually I do think you should transition to using virtualenv as you tackle more serious projects.
If you'd like to learn more about using pip and virtualenv together, see:
Why you should be using pip and virtualenv
A non-magical introduction to Pip and Virtualenv for Python beginners
Virtual Environments
python setup.py install is the analog of make install: it’s a limited way to compile and copy files to destination directories. This doesn’t mean that it’s the best way to really install software on your system.
pip is a package manager, which can install, upgrade, list and uninstall packages, like familiar package managers such as dpkg, apt, yum, urpmi, ports, etc. Under the hood, it will run python setup.py install, but with specific options to control how and where things end up installed.
In summary: use pip.
The question is about the preferred method to install a local tarball containing a python package, NOT about the advantage of uploading a package to an indexing service like PyPI.
As far as I know, some software distributors do not upload their packages to PyPI, instead asking developers to download the package from their website and install it.
python setup.py install
This can work, but it is not recommended. It's not necessary to unpack the tarball and go into it to run the setup.py file.
pip install ../path/to/packagename.tar.gz
This is the designed and preferred way: concise and aligned with PyPI-style packages.
More information about pip install can be found here: https://pip.readthedocs.io/en/stable/reference/pip_install/

How to add PyPi dependencies to DEB package

I created a python app using autobahn and packaged it using bazaar builddeb. In the python setup.py file I added a requires tag with all the required dependencies. Is it possible to tell the debian package installer to install these packages?
I added some of the deps to debian/control > Depends, but:
dpkg -i my_package does not install the dependencies; it just shows an error, and I need to install the deps manually.
some packages do not exist in the standard Ubuntu repos, for example autobahn. And in general I'd like to have all python dependencies installed by pip/easy_install.
I am using DistUtilsExtra.auto.setup with a personalized install action, so I think I could run easy_install for the packages there. Is that a good idea?
Thank you.
Create debian packages from pypi using the python-stdeb package, then depend on them like any other package.
See http://pypi.python.org/pypi/stdeb

Integration of manually installed python libs into the system?

I have manually built numpy, scipy, matplotlib etc. without root privileges (I needed a fresh matplotlib). All libs are installed in the standard place:
~/.local/lib/python2.7
Now when I try to install anything related, synaptic suggests that I install all the libs system-wide. Is there a way I can tell synaptic to use the locally installed libs?
I tried linking ~/.local/lib/python2.7/site-packages to /usr/lib/python2.7, but it didn't help.
Edit:
If I clone a python package, and change the name in the setup.py to the name of the ubuntu package, and then build:
python setup.py bdist --format=rpm
and then convert it to deb with alien:
sudo alien -k my.rpm
and then install the deb:
sudo dpkg -i my.deb
then synaptic does recognise it as a package (remember I've tweaked the name in setup.py).
But I can't find a way to make synaptic aware of locally installed python libs.
How can a package manager, that manages packages at a system level, know anything about something that is installed in a user directory, something that is the opposite of the system level?
A package manager resolves dependencies based on meta-information stored in a package (be it rpm, deb, whatever) and/or package repository.
To achieve your goal you can go either of two options.
First is to build a system-specific package from your sources and then install it via your package manager. See Creating Built Distributions docs for that. It would look something like this:
$ python setup.py bdist --format=rpm
$ rpm -i dist/$PACKAGE.rpm
That would make your package manager aware of the fact that some dependencies are already provided.
This approach may or may not work.
The other, preferred, option is to use a python package manager such as pip and install all your packages in a virtual environment. There are several advantages to this method:
You can have several distinct package sets, with different versions of packages installed.
You can optionally isolate your virtual environment from the packages installed system-wide.
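Both properties can be demonstrated with the standard library's venv module (the stdlib successor to the virtualenv tool mentioned above). A minimal sketch; the system_site_packages flag controls the optional isolation from system-wide packages:

```python
# Sketch: create a virtual environment programmatically; the
# system_site_packages flag decides whether the env can see packages
# installed system-wide, or is fully isolated from them.
import tempfile
import venv
from pathlib import Path

def make_env(path: Path, see_system: bool) -> Path:
    """Create a venv at `path` and return the path to its pyvenv.cfg."""
    builder = venv.EnvBuilder(system_site_packages=see_system, with_pip=False)
    builder.create(path)
    return path / "pyvenv.cfg"

with tempfile.TemporaryDirectory() as tmp:
    cfg = make_env(Path(tmp) / "env", see_system=False)
    # pyvenv.cfg records the isolation choice, among other settings.
    print(cfg.read_text())
```

Creating one environment per project gives you the distinct, independently versioned package sets described in the first point, without any of them touching the package-manager-owned system directories.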
