rebuilding the opencv-python wheel installer - python

I am using the opencv-python project here. What I would like to do is recreate the wheel file. So what I did was something like:
python setup.py bdist_wheel
This creates a dist directory and adds the wheel file there, which I then take and try to install in an Anaconda environment as follows:
pip install ~/opencv_python-3.4.2+5b36c37-cp36-cp36m-linux_x86_64.whl
This seems to install fine. But when I try to use it and do
import cv2
I get the error:
ImportError: libwebp.so.5: cannot open shared object file: No such file or directory
I thought that creating the wheel file would take care of all the dependencies, but do I have to do something else before generating the wheel to make sure everything is packaged correctly?
EDIT
I compared the wheel archive from the official sources with the one I generated, and I see that the third-party libraries are not included. So, my zip file contents are:
['cv2/LICENSE-3RD-PARTY.txt',
'cv2/LICENSE.txt', 'cv2/__init__.py',
'cv2/cv2.cpython-36m-x86_64-linux-gnu.so']
I have omitted some XML files which are not relevant. Meanwhile, the official archive has:
['cv2/__init__.py',
'cv2/cv2.cpython-36m-i386-linux-gnu.so',
'cv2/.libs/libswresample-08248319.so.3.2.100',
'cv2/.libs/libavformat-d485f70f.so.58.17.101',
'cv2/.libs/libvpx-1b5256ac.so.5.0.0',
'cv2/.libs/libz-83853723.so.1.2.3',
'cv2/.libs/libQtGui-55070e59.so.4.8.7',
'cv2/.libs/libavcodec-3b67922d.so.58.21.104',
'cv2/.libs/libswscale-3bf29a6c.so.5.2.100',
'cv2/.libs/libQtTest-0cf8861e.so.4.8.7',
'cv2/.libs/libQtCore-ccf6d197.so.4.8.7',
'cv2/.libs/libavutil-403a4871.so.56.18.102']
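The cv2/.libs directory and the hash-suffixed filenames in the official wheel are characteristic of auditwheel, the tool used to produce manylinux wheels: it copies external shared libraries (such as libwebp) into the wheel and patches the extension module to load them from there. A sketch of that step, assuming a Linux build machine and using the wheel filename from above:
auditwheel repair dist/opencv_python-3.4.2+5b36c37-cp36-cp36m-linux_x86_64.whl -w wheelhouse/
The repaired wheel written to wheelhouse/ should then bundle its third-party libraries the way the official one does.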

Related

How to change the package path inside an archive created by `python setup.py bdist`?

I am creating a .zip package with the command python setup.py bdist.
After the archive is created, the main folder which I packaged is placed at path <archive_name>.zip/<Python_path>/Lib/site-packages/<packaged_folder>.
What I want is for the folder path inside the archive to stay fixed, because when I run python setup.py bdist for the same code in different environments, the path inside the archive changes based on the Python version and platform.
Edit: I am not talking about --dist-dir option.
To do this you need to explicitly call the install and bdist_dumb commands with some options:
(yes, this is a real command)
python setup.py install --root=build\bdist --install-lib=. bdist_dumb --bdist-dir=build\bdist
For the install command you need to override the following options:
--install-lib installation directory for all module distributions
--install-headers installation directory for C/C++ headers
--install-scripts installation directory for Python scripts
--install-data installation directory for data files
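For example, a sketch that overrides all four locations in one call (the relative directory names include, scripts, and data are hypothetical; choose whatever layout you want inside the archive):
python setup.py install --root=build\bdist --install-lib=. --install-headers=include --install-scripts=scripts --install-data=data bdist_dumb --bdist-dir=build\bdist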
Note that the bdist command is obsolete and what you want is basically what the wheel format is for:
A wheel is a ZIP-format archive with a specially formatted file name and the .whl extension. It contains a single distribution nearly as it would be installed according to PEP 376 with a particular installation scheme. Although a specialized installer is recommended, a wheel file may be installed by simply unpacking into site-packages with the standard unzip tool while preserving enough information to spread its contents out onto their final paths at any later time.
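As a quick illustration that a wheel is a plain zip with packages at the archive root (the filename here is hypothetical), you can list its contents with the standard library:
import zipfile
with zipfile.ZipFile('dist/example_pkg-1.0-py3-none-any.whl') as whl:  # hypothetical path
    print('\n'.join(whl.namelist()))  # package files sit at the root, not under Lib/site-packages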
See the following links for up-to-date information on packaging Python projects:
Tutorial: Packaging Python Projects (more recent);
Guide: Packaging and distributing projects.

How do I add all the include .h files in the Python directory?

I am trying to install PyHook using pip. When I run pip install pyhook3 on cmd, I get a C1080 error telling me there is no such .h file in my directory. I traced the directory, downloaded the missing file, and it showed me another one. I kept doing this until I noticed that there seems to be no end: a lot of .h files appear to be missing from the includes folder C:\Users\User\AppData\Local\Programs\Python\Python38-32\include. I don't want to have to download or copy and paste source code for each of these files. Is there any way to get all of them at once, or am I missing the plot entirely?
pyhook is a Python package with binary dependencies.
When running pip install pyhook3, you download the source and ask your computer to build it so it can be installed. This requires a compiler and a set of header files that are apparently missing for you.
A workaround may be to manually download a compiled version of this package and install it.
On this page you can find a set of binary wheels for pyhook (not pyhook3) for Python 3 (32- or 64-bit). Once you have downloaded the correct .whl, you can install it with pip install the_filename_you_have_downloaded.whl

pip does not install my package dependencies

I have developed a Python package on GitHub that I released on PyPI. It installs with pip install PACKAGENAME, but does not do anything with the dependencies stated in the install_requires of the setup.py file.
Weirdly enough, the zip file of the associated release does install all dependencies. I tried with different virtual environments and on different computers, but it never installs the dependencies. Any help appreciated.
pip install pythutils downloads a wheel if one is available, and one is available for your package.
When generating a wheel, setuptools runs setup.py locally but doesn't include setup.py in the wheel. Download your wheel file and unzip it (it's just a zip archive): there is your main package directory pythutils and a directory with metadata, pythutils-1.1.1.dist-info. In the metadata directory there is a file METADATA that usually lists static dependencies, but your file doesn't list any, because when you generated the wheel all your dependencies had already been installed, so all your dynamic code paths were skipped.
The archive that you downloaded from the GitHub release installs dependencies because it's not a wheel, so pip runs python setup.py install and your dynamic dependencies work.
What can you do? My advice is to avoid dynamic dependencies. Declare static dependencies and let pip decide what versions to install:
install_requires=[
    'numpy==1.16.5; python_version>="2" and python_version<"3"',
    'numpy; python_version>="3"',
],
Another approach would be to create version-specific wheel files — one for Python 2 and another for Python 3 — with fixed dependencies.
Yet another approach is to not publish wheels at all and only publish an sdist (source distribution). Then pip is forced to run python setup.py install on the target machine. That's not the best approach, and it will certainly be problematic for packages with C extensions (the user must have a compiler and developer tools to install from source).
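A sketch of that sdist-only flow, so that setup.py always runs on the target machine:
python setup.py sdist
twine upload dist/*.tar.gz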
Your setup.py does a series of checks like:
try:
    import numpy
except ImportError:
    if sys.version_info[0] == 2:
        install_requires.append('numpy==1.16.5')
    if sys.version_info[0] == 3:
        install_requires.append("numpy")
Presumably the system where you ran it already had all the required modules installed, so you ended up with an empty list in install_requires. But this is the wrong way to do it anyway; you should simply make a static list (or two static lists, one each for Python 2 and Python 3, if you really want to support both in the same package).
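A minimal sketch of that static approach, using environment markers so pip resolves the right dependency on the target interpreter (the package name is taken from the question; the other fields are assumed):
from setuptools import setup, find_packages

setup(
    name='pythutils',
    version='1.1.1',
    packages=find_packages(),
    install_requires=[
        'numpy==1.16.5; python_version < "3"',
        'numpy; python_version >= "3"',
    ],
)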

install a pre-built wheel file as part of setup requirements

I have a Python project where I am using the maskrcnn-benchmark project from Facebook Research. The problem is that the setup file for the Facebook project depends on pytorch, i.e. the setup file has an import line like:
import torch
So, I need to have pytorch pre-installed and this is causing me some problems. For me, the cleanest solution would be if I could prebuild the maskrcnn-benchmark project as a wheel with all its dependencies like pytorch and then add this wheel as a requirement in my setup.py file.
However, I could not find an easy way to do so. Is there some way to add a wheel file as an install_requires step in the setup file of a Python project?
The maskrcnn-benchmark project should have torch==1.0.1 (or whichever version) in install_requires= (along with any other requirements).
Then, you can use
pip wheel . --wheel-dir /tmp/deps
to have pip gather up the wheels (for your current architecture!) in /tmp/deps. Then, to install the dependencies from the wheel dir,
pip install --find-links=/tmp/deps -e .
This technique works for other target types too, like -r requirements.txt.
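For example, a sketch of the same flow driven by a requirements file (paths assumed):
pip wheel -r requirements.txt --wheel-dir /tmp/deps
pip install --no-index --find-links=/tmp/deps -r requirements.txt
The --no-index flag forces pip to resolve everything from the local wheel directory instead of PyPI.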
EDIT: If you also want to build a wheel for the project itself, that'd be python setup.py bdist_wheel, but that won't look for dependencies.

Import error when using twine and wheel to upload a Python package to PyPI

I have a Python package ready for distribution on PyPI. To do this I am using twine, as recommended in the Python docs. I have my setup.py file, and this previously worked using the setup.py register upload command for my previous release.
To upload to PyPI I am using:
python setup.py sdist
python setup.py bdist_wheel
twine upload dist\PyCoTools-2.1.2-py2-none-any.whl #this was created in the previous line
Now, on another computer I try using:
pip install PyCoTools
and it installs but then:
>>> import PyCoTools
Gives an import error. I go to Lib/site-packages and all I see is the PyCoTools-2.1.2.dist-info directory, i.e. no folder called PyCoTools, just the dist info, and inside that only the metadata files, which (obviously) don't include the files that are in my package. Could anybody give me some pointers as to what I'm doing wrong?
Thanks
Did you forget to put __init__.py inside your PyCoTools directory? I had the same issue and resolved it by adding this file.
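For reference, setuptools only treats a directory as a package if it contains __init__.py, so find_packages() will silently skip it otherwise. A minimal sketch of the relevant part of setup.py (fields other than the name are assumed):
from setuptools import setup, find_packages

setup(
    name='PyCoTools',
    version='2.1.2',
    packages=find_packages(),  # only finds directories that contain __init__.py
)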
