The difference between opencv-python and opencv-contrib-python - python

I was looking at the Python Package Index (PyPI) and noticed two very similar packages, opencv-python and opencv-contrib-python, and wondered what the difference was. They have the exact same description and version numbers.

As per the PyPI documentation:

There are four different packages (see options 1, 2, 3 and 4 below):

Packages for standard desktop environments:
Option 1 - Main modules package: pip install opencv-python
Option 2 - Full package (contains both main modules and contrib/extra modules): pip install opencv-contrib-python (check the contrib/extra modules listing in the OpenCV documentation)

Packages for server (headless) environments:
Option 3 - Headless main modules package: pip install opencv-python-headless
Option 4 - Headless full package (contains both main modules and contrib/extra modules): pip install opencv-contrib-python-headless

Do not install multiple different packages in the same environment.

OpenCV has two builds of each version: the "regular" one, which is functional and well tested, and the build with the extra components (the contrib package). On their GitHub page they state:
This repository is intended for the development of so-called "extra" modules, contributed functionality. New modules quite often do not have stable API, and they are not well-tested. Thus, they shouldn't be released as a part of the official OpenCV distribution, since the library maintains binary compatibility, and tries to provide decent performance and stability.
The contrib package also includes several non-free computer vision algorithms (for feature detection and description) such as SURF, BRIEF, CenSurE, FREAK, LUCID, DAISY, BEBLID and TEBLID.
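A quick way to tell which build is installed is to probe for a contrib-only module such as xfeatures2d (a sketch; the helper name is mine):

```python
# Sketch: detect whether a cv2 module is the contrib build by probing for
# a contrib-only submodule (xfeatures2d, home of SURF, FREAK, etc.).
def cv2_flavor(cv2_module):
    return "contrib" if hasattr(cv2_module, "xfeatures2d") else "main"

# Usage (uncomment with OpenCV installed):
# import cv2
# print(cv2.__version__, cv2_flavor(cv2))
```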

Related

Match versions between packages in setup.py install_requires

I have a package with a setup.py that depends on two packages: tensorflow and tensorflow-probability.
Q:
How do I make sure the versions of the above two packages will always match when I install my package? My package currently has the requirement of tensorflow>=2.6.2 and I will be adding tensorflow-probability==tensorflow_MAJOR_MINOR_version.
The tensorflow-probability website has the following clause:
https://www.tensorflow.org/probability/install
Note: Since TensorFlow is not included as a dependency of the TensorFlow Probability package (in setup.py), you must explicitly install the TensorFlow package (tensorflow or tensorflow-gpu). This allows us to maintain one package instead of separate packages for CPU and GPU-enabled TensorFlow.
This means it's up to the installer to install the correct tensorflow package version, but what if the installer is a setup.py script instead of a human?
I cannot find syntax for this in PEP 440 – Version Identification and Dependency Specification.
Dependency Management in Setuptools: Platform specific dependencies shows that it's possible to have conditional dependencies based on operating system platform.
My best guess is to augment setup.py with logic to:
* Get the currently installed version of `tensorflow` on the local system, or the latest version that would be installed from PyPI.
* Isolate the MAJOR.MINOR component of the fetched version.
* Use that version in a string replacement for:
install_requires=[
    'tensorflow>=2.6.2',
    f'tensorflow-probability=={tensorflow_version}.*'
]
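The string replacement step can be sketched as a small helper (the function name is mine, and this follows the question's own assumption that the two packages' MAJOR.MINOR versions are meant to track each other):

```python
def tfp_requirement(tf_version):
    """Build a tensorflow-probability pin from a TensorFlow version string,
    keeping only the MAJOR.MINOR component (e.g. '2.6.2' -> '2.6.*')."""
    major, minor = tf_version.split(".")[:2]
    return f"tensorflow-probability=={major}.{minor}.*"

# In setup.py one could then do (sketch):
# import tensorflow
# install_requires = ["tensorflow>=2.6.2",
#                     tfp_requirement(tensorflow.__version__)]
```

Note one caveat: in practice tensorflow-probability uses its own 0.x version scheme, so a real mapping between the two would likely need a compatibility lookup table rather than copying MAJOR.MINOR directly.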

Extra packages in setuptools

I created a package which has two modes:
* a basic mode with basic functionality only
* an extended mode which adds additional modules and needs extra requirements.
For example:
MyPackageName
core
cyber_analyzer
parsing
Where "parsing" is the extension and needs "pandas" as a requirement.
Then, I want my package to have 2 modes:
pip install mypackage
pip install mypackage[parsing]
I found out I can use extras_require to install "pandas". Yet installing the wheel file installs all three modules: core, cyber_analyzer and parsing. I would like "parsing" to be installed only if the extra flag "parsing" was specified.
Is it possible to do so? How can I achieve it? Or should I always ship "parsing" and let users who don't need it simply ignore it?
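For reference, the extras_require wiring looks like this (a sketch using the question's names). Note that extras control which *dependencies* get installed, not which modules go into the wheel, so the parsing module is packaged either way; only pandas is conditional:

```python
# setup.py sketch (package/module names are from the question).
from setuptools import setup, find_packages

setup(
    name="mypackage",
    version="0.1",
    packages=find_packages(),   # packages core, cyber_analyzer AND parsing
    install_requires=[],        # the base install pulls in nothing extra
    extras_require={
        # `pip install mypackage[parsing]` additionally installs pandas;
        # a plain `pip install mypackage` does not.
        "parsing": ["pandas"],
    },
)
```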

GDAL libraries, who does what

I'm struggling to install GDAL on Ubuntu 16.04 to work with GeoDjango (Django 2.1, Python 3), so I need to understand what I'm actually installing.
What is the role of each library/package/module?
apt
gdal-bin (a C library containing the actual functions?)
python-gdal (the same in Python, or just some kind of bridge?)
python3-gdal (see above, but for Python 3. Does it need python-gdal?)
pip
gdal
pygdal
What is the link between pip modules and apt packages here ?
Every piece of info is available, if one is willing to search for it.
DEBs (installed system-wide):
gdal-bin ([Ubtu]: Package: gdal-bin) - a collection of gdal related binaries (tools and utilities)
python3-gdal ([Ubtu]: Package: python3-gdal) - Python 3 bindings, extensions (.sos) and some wrapper scripts, which enable gdal usage from Python
python-gdal - the same thing, but for Python 2 (a separate package, independent of the previous item)
WHLs (installed as Python modules to the interpreter used to launch pip):
GDAL ([PyPI]: GDAL) - the sources (.tar.gz) for the Python bindings above. During the pip install phase, they are built and installed for the current Python
pygdal ([PyPI]: pygdal) - the same thing as the previous item, but virtualenv-friendly. It seems to be a lighter version (it doesn't contain the scripts)
But, all of the above depend on libgdal ([Ubtu]: Package: libgdal1i), which is the gdal library.
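The practical link between the pip modules and the apt packages is the version: the pip-built bindings must match the system libgdal exactly, which is usually done by pinning to the output of `gdal-config --version`. A small sketch (the helper name is mine):

```python
def gdal_requirement(libgdal_version):
    """Build a pip requirement matching the system libgdal version,
    e.g. the string printed by `gdal-config --version`."""
    return f"pygdal=={libgdal_version}.*"

# Shell equivalent (sketch):
#   pip install "pygdal==$(gdal-config --version).*"
```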

Including a python package dependency as an executable

Currently my python package does not have a dependency on the wmi package and it can be easily installed via
pip install mypackage
If I add a dependency on the wmi package, this will likely fail: when I try installing wmi through pip, I encounter errors because I do not have Visual Studio 2008 installed, and I only managed to get it installed using the binary distribution.
Is it possible for me to include and install the binary release of wmi in my package?
The main concern is that if people fail to install my package via the pip command, they just avoid using my package.
The first thing to consider is why you are adding the wmi package: since it is MS-Windows specific, if you use it, or anything depending on it, your package will also be MS-Windows specific.
Are there other ways to achieve what you are trying to do that remain cross platform? If not and you really have to use it then you could include a prerequisite statement in the documentation, and ideally in setup.py, telling people that they need to have an installed & working copy of wmi, hopefully with a pointer to the binary distributions.
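The "prerequisite in setup.py" idea can also be expressed as a platform-conditional dependency using a PEP 508 environment marker, so the package stays installable on non-Windows systems. A sketch, assuming a sufficiently recent setuptools/pip:

```python
# setup.py sketch: only require wmi when installing on MS-Windows.
from setuptools import setup

setup(
    name="mypackage",
    version="0.1",
    install_requires=[
        # PEP 508 environment marker: skipped entirely on other platforms.
        'wmi; sys_platform == "win32"',
    ],
)
```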
The other way to go, if you are on a late enough version of Python, is to build and distribute your package as Python wheels. Wheels allow the inclusion of compiled C extension components without relying on the presence of a compiler on the target system; see PEP 427 for more information.
Creating Wheels:
You need to be running Python 2 > 2.6 or Python 3, with pip >= 1.4 and setuptools >= 0.8.
Basically, assuming that you have a setup.py that will create your (source) distribution for upload with:
python setup.py sdist
then you can create a binary distribution that should contain all the dependencies of your package for your current python version with:
python setup.py bdist_wheel
This will build a distribution wheel that includes the .pyc files and the binary files from the required packages.
But you need to do this once for each version of Python that you plan to support (virtualenv is magic for this), and on each platform if you also plan to support 64-bit or Mac. Unless, of course, you manage to make a pure Python package that will run, without 2to3, under both Python 2 and 3, in which case you can build a universal wheel; obviously you cannot do this if you require C extensions.
For more information on wheels see Wheel - Read The Docs.

Make SetupTools/easy_install aware of installed Debian Packages?

I'm installing an egg with easy_install which requires ruledispatch. It isn't available on PyPI, and when I use PEAK's version it fails to build from source (FTBFS). There is, however, a python-dispatch package which provides the same functionality as ruledispatch. How can I get easy_install to stop trying to install ruledispatch, and have it recognize that ruledispatch is already installed as python-ruledispatch?
Running Debian etch with Python 2.4.
The least fiddly path is likely:
* easy_install --no-deps the egg
* Look at the egg-info of what you just installed
* Install all dependencies except ruledispatch by hand
* Optionally, prod the people responsible to list their stuff on PyPI, to avoid dependencies that the package installer can't possibly satisfy, to use dependency_links, to use a custom package index, or similar.
If the python-ruledispatch from the .deb is the same as, or compatible with, what the egg depends on, this should work.
