Context
I have a wheel file of a Python CLI app (call it A) in an S3 bucket. Not the repo, not the source code, just the wheel, because "reasons". I also have access to the source code of a wrapper for A (call it B).
What's Needed
I would like to create a wheel file that will ideally install both A and B along with other dependencies available via PyPI, and distribute said wheel to multiple people, many of whom may not (and need not) have access to the S3 bucket. I'm wondering if it's possible to package A within B's wheel such that when someone pip installs B.whl, A is picked up automatically.
What I have tried
I tried including a reference to A in B's setup.py under install_requires and the relative path to A (./deps/A.whl) under dependency_links, but that didn't work. The error I get is that pip could not find a version that satisfies the requirement for package A. I wasn't sure whether that would work in the first place; I just tried using a path instead of a URL.
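For reference, a sketch of roughly what B's setup.py contained (simplified, with placeholder metadata):

from setuptools import setup, find_packages

setup(
    name='B',
    version='0.1.0',
    packages=find_packages(),
    # the dependency pip fails to resolve
    install_requires=['A'],
    # relative path to the wheel shipped alongside the package
    dependency_links=['./deps/A.whl'],
)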
Build command: python setup.py bdist_wheel
Related
I am trying to build a Python wheel following the instructions described in the link below. I am doing this for the first time.
https://packaging.python.org/en/latest/tutorials/packaging-projects/
I set up the folder structure, files and all. I have added this to the pyproject.toml file:
[build-system]
requires = ["setuptools>=57.4.0", "wheel>=0.37.1"]
build-backend = "setuptools.build_meta"
I have installed setuptools and wheel in my virtual environment.
When I try to run the build command, I get SSL warnings and the error below.
Could not find a version that satisfies the requirement wheel>=0.37.1
Could not fetch from URL https://pypi.org/simple
Even though I have installed setuptools and wheel in my virtual environment, I think it is hitting PyPI to find and download these packages.
I don't know how the build module finds the modules/packages listed in "requires". I cannot find a way to direct it to use the setuptools and wheel already installed on my machine instead of fetching them from PyPI.
Even if it does have to download them again, how can I direct it to use our Artifactory instead of PyPI?
Any help in this is greatly appreciated.
I tried all of the below in different combinations, but nothing worked. Obviously I am missing something.
1. I added a pip.ini in my virtual environment (Lib\site-packages\pip), added the index-url with our organization's Artifactory URL, added trusted-host, and also tried pip.config. (The pip.ini contents are sketched just after this list.)
2. I downloaded the wheels for setuptools and wheel and added another entry in pyproject.toml:
[easy-install]
find-links = c:\wheels
3. I added the wheels directly to the src folder.
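For reference, the pip.ini contents looked roughly like this (the URL is a placeholder for our internal one):

[global]
index-url = https://artifactory.example.com/artifactory/api/pypi/pypi-remote/simple
trusted-host = artifactory.example.com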
Thanks.
I have developed a Python package on GitHub that I released on PyPI. It installs with pip install PACKAGENAME, but does not do anything with the dependencies that are stated in the "install_requires" of the setup.py file.
Weirdly enough, the zip file of the associated release does install all dependencies. I tried with different virtual environments and on different computers, but it never installs the dependencies. Any help appreciated.
pip install pythutils downloads a wheel if one is available, and one is available for your package.
When generating a wheel, setuptools runs python setup.py locally but doesn't include setup.py in the wheel. Download your wheel file and unzip it (it's just a zip archive): inside are your main package directory pythutils and a directory with metadata, pythutils-1.1.1.dist-info. In the metadata directory there is a file METADATA that usually lists the static dependencies, but your file doesn't list any, because when you generated the wheel all your dependencies were already installed, so all your dynamic code paths were skipped.
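A quick way to check this without fully unpacking the archive (the exact wheel filename is an assumption here; declared dependencies would show up as Requires-Dist lines):

unzip -p pythutils-1.1.1-py3-none-any.whl pythutils-1.1.1.dist-info/METADATA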
The archive that you downloaded from the GitHub release installs dependencies because it's not a wheel, so pip runs python setup.py install and your dynamic dependencies kick in.
What can you do? My advice is to avoid dynamic dependencies. Declare static dependencies and allow pip to decide which versions to install:
install_requires=[
    'numpy==1.16.5; python_version>="2" and python_version<"3"',
    'numpy; python_version>="3"',
],
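For context, a minimal sketch of how this sits in the full setup() call (name and version taken from your wheel above, the rest assumed):

from setuptools import setup, find_packages

setup(
    name='pythutils',
    version='1.1.1',
    packages=find_packages(),
    # static, declarative dependencies: these end up as Requires-Dist
    # entries in the wheel's METADATA, so pip can resolve them
    install_requires=[
        'numpy==1.16.5; python_version>="2" and python_version<"3"',
        'numpy; python_version>="3"',
    ],
)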
Another approach would be to create version-specific wheel files, one for Python 2 and another for Python 3, with fixed dependencies.
Yet another approach is to not publish wheels at all and only publish an sdist (source distribution). Then pip is forced to run python setup.py install on the target machine. That's not the best approach, and it will certainly be problematic for packages with C extensions (the user must have a compiler and developer tools to install from source).
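If you do go the sdist-only route, the build and upload would look roughly like this (assuming twine for the upload):

python setup.py sdist
twine upload dist/*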
Your setup.py does a series of checks like
import sys

install_requires = []
try:
    import numpy
except ImportError:
    # only requested when numpy is missing at build time
    if sys.version_info[0] == 2:
        install_requires.append('numpy==1.16.5')
    if sys.version_info[0] == 3:
        install_requires.append('numpy')
Presumably the system where you ran it had all the required modules already installed, so you ended up with an empty install_requires list. But this is the wrong way to do it anyway; you should simply make a static list (or two static lists, one each for Python 2 and Python 3, if you really want to support both in the same package).
I have a Python project where I am using the maskrcnn-benchmark project from Facebook Research. The problem is that the setup file for the Facebook project depends on PyTorch, i.e. the setup file has an import line like:
import torch
So I need to have PyTorch pre-installed, and this is causing me some problems. For me, the cleanest solution would be to prebuild the maskrcnn-benchmark project as a wheel with all its dependencies, like PyTorch, and then add this wheel as a requirement in my setup.py file.
However, I could not find an easy way to do so. Is there some way to add a wheel file as an install_requires step in the setup file of a Python project?
The maskrcnn-benchmark project should have torch==1.0.1 (or whichever version) in install_requires= (along with any other requirements).
Then, you can use
pip wheel . --wheel-dir /tmp/deps
to have pip gather up the wheels (for your current architecture!) in /tmp/deps. Then, to install the dependencies from the wheel dir:
pip install --find-links=/tmp/deps -e .
This technique works for other target types too, like -r requirements.txt.
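For example, with a requirements file the same two steps become:

pip wheel -r requirements.txt --wheel-dir /tmp/deps
pip install --find-links=/tmp/deps -r requirements.txt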
EDIT: If you also want to build a wheel for the project itself, that'd be python setup.py bdist_wheel, but that won't look for dependencies.
I am using the opencv-python project here. What I would like to do is recreate the wheel file again. So what I did was something like:
python setup.py bdist_wheel
This creates a dist directory and adds the wheel file there, which I then take and try to install in an Anaconda environment as follows:
pip install ~/opencv_python-3.4.2+5b36c37-cp36-cp36m-linux_x86_64.whl
This seems to install fine. But when I try to use it and do
import cv2
I get the error:
ImportError: libwebp.so.5: cannot open shared object file: No such file or directory
I thought that creating the wheel file would take care of all the dependencies, but I wonder if I have to do something else before generating the wheel to make sure everything is packaged correctly?
EDIT
I compared the wheel archive from the official sources with the one I generated, and I see that the third-party libraries are not included. My zip file contents are:
['cv2/LICENSE-3RD-PARTY.txt',
 'cv2/LICENSE.txt',
 'cv2/__init__.py',
 'cv2/cv2.cpython-36m-x86_64-linux-gnu.so']
I have omitted some XML files which are not relevant. Meanwhile, the official archive has:
['cv2/__init__.py',
'cv2/cv2.cpython-36m-i386-linux-gnu.so',
'cv2/.libs/libswresample-08248319.so.3.2.100',
'cv2/.libs/libavformat-d485f70f.so.58.17.101',
'cv2/.libs/libvpx-1b5256ac.so.5.0.0',
'cv2/.libs/libz-83853723.so.1.2.3',
'cv2/.libs/libQtGui-55070e59.so.4.8.7',
'cv2/.libs/libavcodec-3b67922d.so.58.21.104',
'cv2/.libs/libswscale-3bf29a6c.so.5.2.100',
'cv2/.libs/libQtTest-0cf8861e.so.4.8.7',
'cv2/.libs/libQtCore-ccf6d197.so.4.8.7',
'cv2/.libs/libavutil-403a4871.so.56.18.102']
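Possibly relevant: the .libs directory and the hash-suffixed sonames in the official archive look like the output of auditwheel, which copies external shared libraries into the wheel and patches the extension module to load them from there. The repair step would be something like this (using my wheel from above, on a suitable Linux build machine):

auditwheel repair ~/opencv_python-3.4.2+5b36c37-cp36-cp36m-linux_x86_64.whl -w wheelhouse/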
Newbie here. I've created my first Python package and I managed to register it on PyPI, as well as upload the tar.gz file. Now whenever I run:
pip install myPackage
I get this error in the console:
Could not find a version that satisfies the requirement myPackage (from versions: 1.0dev)
No distributions matching the version for flashCardStudy
Storing debug log for failure in /Users/xxx/Library/Logs/pip.log
I believe this is because my version is a development version, I guess? So yes, I can install it by adding the --pre argument, but what I'd really like is to turn it into a normal version, so to speak.
I've tried figuring out how to do it by looking at some docs, but I still can't work it out. In my setup.py the version is set to '1.0', so I don't see where the problem is. If anyone wants to have a look at the file, here it is.
So I found the problem. I used a utility called Paster, which generates a package structure including setup.py and setup.cfg files, among others. The utility hasn't been updated in a while, and meanwhile the submission rules for PyPI have changed. PyPI now expects a certain setup.py structure, and unless the package conforms, the release is labeled as a development version, which pip does not install without the --pre argument.
So I just went to the PyPI pages, looked at the setup.py tutorial, did it their way, and now it works.
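In case it helps anyone else: with old Paster-generated projects, the dev suffix typically comes from a setup.cfg section like the following (an assumption here, since I no longer have the original file):

[egg_info]
tag_build = dev
tag_svn_revision = true

With that in place, setuptools appends the tag to the declared version, turning 1.0 into something like 1.0dev, which pip treats as a pre-release. Deleting the section makes the release a plain 1.0.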