I am creating a Python package that depends on PyTorch. PyTorch's installation command is as follows (from https://pytorch.org/):
pip3 install torch==1.8.2+cu102 torchvision==0.9.2+cu102 torchaudio==0.8.2 -f https://download.pytorch.org/whl/lts/1.8/torch_lts.html
Question 1: How should I prepare setup.py?
I read Equivalent for `--find-links` in `setup.py`, which says we could add the link to the `dependency_links` list, but that mechanism has been unsupported since 2019.
Question 2: How can I programmatically decide which version to install (CPU, GPU, CUDA version)?
The command above is for a GPU-enabled machine with CUDA 10.2. But if the machine doesn't have GPU, one would use:
pip3 install torch==1.8.2+cpu torchvision==0.9.2+cpu torchaudio==0.8.2 -f https://download.pytorch.org/whl/lts/1.8/torch_lts.html
Is it possible for my package's setup.py to automatically identify the user's system and modify the install_requires list accordingly?
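For what it's worth, any such detection can only run when the package is installed from an sdist; a wheel freezes its metadata at build time (see the wheel discussion further down). A minimal sketch of the idea, where both the nvidia-smi heuristic and the package name are assumptions, not an official method:

import shutil

from setuptools import find_packages, setup

def torch_requirements():
    # If nvidia-smi is on PATH, assume a CUDA 10.2 machine; otherwise use CPU
    # wheels. This heuristic cannot distinguish CUDA versions, and it only
    # executes when installing from an sdist.
    variant = "cu102" if shutil.which("nvidia-smi") else "cpu"
    return [
        f"torch==1.8.2+{variant}",
        f"torchvision==0.9.2+{variant}",
        "torchaudio==0.8.2",
    ]

setup(
    name="mypackage",  # hypothetical package name
    version="0.1.0",
    packages=find_packages(),
    install_requires=torch_requirements(),
)

Even then, the +cu102/+cpu local version suffixes are not resolvable from PyPI, so users would still have to pass -f https://download.pytorch.org/whl/lts/1.8/torch_lts.html themselves; since dependency_links is gone, there is no supported way to embed that URL in setup.py.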
Related
I have a package with setup.py which will depend on two packages, tensorflow and tensorflow-probability.
Q:
How do I make sure the versions of the above two packages will always match when I install my package? My package currently has the requirement of tensorflow>=2.6.2 and I will be adding tensorflow-probability==tensorflow_MAJOR_MINOR_version.
The tensorflow-probability website has the following clause:
https://www.tensorflow.org/probability/install
Note: Since TensorFlow is not included as a dependency of the TensorFlow Probability package (in setup.py), you must explicitly install the TensorFlow package (tensorflow or tensorflow-gpu). This allows us to maintain one package instead of separate packages for CPU and GPU-enabled TensorFlow.
This means it's up to the installer to install the correct tensorflow package version, but what if this is a setup.py script instead of a human?
I am not finding the syntax to do this in PEP 440 – Version Identification and Dependency Specification.
Dependency Management in Setuptools: Platform specific dependencies shows that it's possible to declare conditional dependencies based on the operating system platform.
My best guess is to augment setup.py with logic to:
* Get the currently installed version of `tensorflow` on the local system, or the latest version that would be installed from PyPI.
* Isolate the MAJOR.MINOR component of the fetched version.
* Use that version in a string replacement for:
install_requires=[
    'tensorflow>=2.6.2',
    f'tensorflow-probability=={tensorflow_version}.*'
]
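That guess can be made concrete with importlib.metadata (Python 3.8+). A minimal sketch, where the MAJOR.MINOR pairing scheme is this question's premise rather than a guarantee from either project, and the fallback pin is an assumption:

# Derive the tensorflow-probability pin from the installed tensorflow.
from importlib.metadata import PackageNotFoundError, version

try:
    tf_version = version("tensorflow")                 # e.g. "2.6.2"
    major_minor = ".".join(tf_version.split(".")[:2])  # e.g. "2.6"
    tfp_requirement = f"tensorflow-probability=={major_minor}.*"
except PackageNotFoundError:
    # tensorflow is not installed in the build environment; fall back to an
    # unpinned requirement (an assumption, chosen to avoid a hard failure).
    tfp_requirement = "tensorflow-probability"

install_requires = [
    "tensorflow>=2.6.2",
    tfp_requirement,
]

Keep in mind that, as the next question shows, this code only runs for sdist installs; a wheel bakes in whatever the list contained when the wheel was built.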
I have developed a Python package on GitHub that I released on PyPI. It installs with pip install PACKAGENAME, but it does nothing with the dependencies stated in the "install_requires" of the setup.py file.
Weirdly enough, the zip file of the associated release does install all dependencies. I tried with different virtual environments and on different computers, but it never installs the dependencies. Any help appreciated.
pip install pythutils downloads a wheel if it's available — and it's available for your package.
When generating a wheel, setuptools runs python setup.py locally but doesn't include setup.py in the wheel. Download your wheel file and unzip it (it's just a zip archive): inside there is your main package directory pythutils and a metadata directory pythutils-1.1.1.dist-info. In the metadata directory there is a file METADATA that usually lists static dependencies, but your file doesn't list any, because when you generated the wheel all your dependencies had already been installed, so all your dynamic code paths were skipped.
The archive that you downloaded from the GitHub release installs dependencies because it's not a wheel, so pip runs python setup.py install and your dynamic dependencies work.
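If you want to check this yourself without unzipping by hand, here is a minimal sketch that prints a wheel's declared dependencies; the file name is the one from your release and is assumed to sit in the current directory:

# Print the Requires-Dist entries from a wheel's METADATA file.
import zipfile

wheel_path = "pythutils-1.1.1-py3-none-any.whl"  # assumed local file name
with zipfile.ZipFile(wheel_path) as whl:
    metadata_name = next(n for n in whl.namelist()
                         if n.endswith(".dist-info/METADATA"))
    for line in whl.read(metadata_name).decode("utf-8").splitlines():
        if line.startswith("Requires-Dist:"):
            print(line)

For your wheel this prints nothing, which confirms the diagnosis above.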
What can you do? My advice is to avoid dynamic dependencies. Declare static dependencies with environment markers and allow pip to decide what versions to install:
install_requires=[
    'numpy==1.16.5; python_version>="2" and python_version<"3"',
    'numpy; python_version>="3"',
],
Another approach would be to create version-specific wheel files — one for Python 2 and another for Python 3 — with fixed dependencies.
Yet another approach is to not publish wheels at all and only publish an sdist (source distribution). Then pip is forced to run python setup.py install on the target machine. That's not the best approach, and it will certainly be problematic for packages with C extensions (the user must have a compiler and developer tools to install from source).
Your setup.py does a series of checks like
import sys

install_requires = []
try:
    import numpy
except ImportError:
    if sys.version_info[0] == 2:
        install_requires.append('numpy==1.16.5')
    if sys.version_info[0] == 3:
        install_requires.append("numpy")
Presumably the system where you ran it had all the required modules already installed, so you ended up with an empty install_requires list. But this is the wrong way to do it anyway; you should simply make a static list (or two static lists, one each for Python 2 and Python 3, if you really want to support both in the same package).
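Put differently, the whole dynamic block above collapses into one static declaration that pip evaluates on the target machine; a minimal sketch, reusing the name and version from the question:

# Static declaration: pip evaluates the environment markers at install time,
# so the same distribution works for both Python 2 and Python 3.
from setuptools import find_packages, setup

setup(
    name="pythutils",   # name and version taken from the question above
    version="1.1.1",
    packages=find_packages(),
    install_requires=[
        'numpy==1.16.5; python_version < "3"',
        'numpy; python_version >= "3"',
    ],
)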
My tensorflow 2.0.0beta1 runs normally, but I cannot install tensorflow-text using the command pip install tensorflow-text (as described on the TensorFlow page). I can find it using pip search tensorflow-text, but installing it gives an error:
ERROR: Could not find a version that satisfies the requirement tensorflow-text (from versions: none)
There are no stated requirements for this package (e.g. a specific Python version).
I am running on Windows, using conda, with Python 3.6.9.
Update
The first release candidate of 2.4.0, which features Windows wheels for the first time, was published today: 2.4.0rc0 on PyPI. Note that only the wheels for Python 3.6 and 3.7 work properly at the moment. Install it via e.g.
> py -3.7 -m pip install tensorflow-text==2.4.0rc0
Original answer
At the time of writing this, tensorflow-text is not available for Windows yet.
Windows is something we do wish to add. We've had some difficulties getting a working package though, which is why it is not available yet. The difference between this library and tensorflow-probability is that we make use of custom ops written in C++, and building those shared libraries to work well with TensorFlow on Windows has had issues; plus, the lengthy build times on Windows have made iterating on these issues slow. While the next beta release (this week) will not include Windows, we would like for the next release to include it.
Source.
I'm trying to build a multistage Docker image with some Python packages. For some reason, the pip wheel command still downloads the .tar.gz source files for a few packages even though .whl files exist on PyPI; it does this for pandas and numpy, for example.
Here is my requirements.txt:
# REST client
requests
# ETL
pandas
# SFTP
pysftp
paramiko
# LDAP
ldap3
# SMB
pysmb
First stage of the Dockerfile:
ARG IMAGE_TAG=3.7-alpine
FROM python:${IMAGE_TAG} as python-base
COPY ./requirements.txt /requirements.txt
RUN mkdir /wheels && \
    apk add build-base openssl-dev pkgconfig libffi-dev
RUN pip wheel --wheel-dir=/wheels --requirement /requirements.txt
ENTRYPOINT tail -f /dev/null
The output below shows that it downloads a source package for pandas but gets a wheel for requests. Also, surprisingly, it takes a lot of time (I really mean a lot of time) to download and build these packages!
Step 5/11 : RUN pip wheel --wheel-dir=/wheels --requirement /requirements.txt
---> Running in d7bd8b3bd471
Collecting requests (from -r /requirements.txt (line 4))
Downloading https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl (57kB)
Saved /wheels/requests-2.22.0-py2.py3-none-any.whl
Collecting pandas (from -r /requirements.txt (line 7))
Downloading https://files.pythonhosted.org/packages/0b/1f/8fca0e1b66a632b62cc1ae38e197befe48c5cee78f895edf4bf8d340454d/pandas-0.25.0.tar.gz (12.6MB)
I would like to know how I can force it to get a wheel file for all the required packages, and also for the dependencies those packages list. I observed that some dependencies get a wheel file while others get source packages.
NOTE: the code above is a combination of multiple online sources.
Any help to make this build process easier is greatly appreciated.
Thanks in Advance.
You are using Alpine Linux. It is somewhat unique in that it uses musl as the underlying libc implementation, as opposed to most other Linux distros, which use glibc.
If a Python project implements C extensions (as e.g. numpy and pandas do), it has two options: either
offer a source dist (.tar.gz, .tar.bz2 or .zip) so that the C extensions are compiled using the C compiler/library found on the target system, or
offer a wheel that contains compiled C extensions. If the extensions are compiled against glibc, they will be unusable on systems using musl, and AFAIK vice versa too.
Now, Python defines the manylinux1 platform tag, which is specified in PEP 513 and updated in PEP 571. Basically, the name says it all: wheels with compiled C extensions should be built against glibc and thus will work on many distros (those that use glibc), but not on some (Alpine being one of them).
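If you want to see which wheel tags your own interpreter accepts, the packaging library (pip vendors its own copy; it can also be installed with pip install packaging) can enumerate them; a minimal sketch:

# List the wheel tags this interpreter accepts, most preferred first.
from packaging.tags import sys_tags

for tag in sys_tags():
    print(tag)  # e.g. cp37-cp37m-manylinux1_x86_64 on a glibc-based distro

On a glibc distro the list includes manylinux tags; on Alpine it does not, which is exactly why pip falls back to source archives there.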
For you, this means you have two possibilities: either build packages from source dists (which is what pip already does), or install prebuilt packages via Alpine's package manager. E.g. for py3-pandas that would mean:
# echo "@edge http://dl-cdn.alpinelinux.org/alpine/edge/testing" >> /etc/apk/repositories
# apk update
# apk add py3-pandas@edge
However, I don't see a big issue with building packages from source. When done right, you capture it in a separate layer placed as high as possible in the image, so it is cached and not rebuilt each time.
You might ask why there's no platform tag analogous to manylinux1 for musl-based distros. Because no one has yet written a PEP similar to PEP 513 that defines a musllinux platform tag. If you are interested in the current state of this, take a look at issue #37.
Update
PEP 656, which defines a musllinux platform tag, is now accepted, so hopefully it won't be long before prebuilt wheels for Alpine start to ship. You can track the current implementation state in auditwheel#305.
For Python 3, your packages will be installed from wheels with an ordinary pip call:
pip install pandas numpy
From the docs:
Pip prefers Wheels where they are available. To disable this, use the --no-binary flag for pip install.
If no satisfactory wheels are found, pip will default to finding source archives.
I am just starting with virtualenv, and I am trying to install gevent within a virtualenv environment (I am running Windows). When I use pip from the virtualenv, I get this error:
MyEnv>pip install gevent
Downloading/unpacking gevent
Running setup.py egg_info for package gevent
Please provide path to libevent source with --libevent DIR
The package index has MSIs and EXEs for installing on Windows (http://pypi.python.org/pypi/gevent/0.13.7), but I don't know how to install those into a virtualenv environment (or if that is even possible). When I try pip install gevent-0.13.7.win32-py2.7.exe from the virtualenv prompt, I get an error as well:
ValueError: ('Expected version spec in', 'D:\\Downloads\\gevent-0.13.7.win32-py2.7.exe', 'at', ':\\Downloads\\gevent-0.13.7.win32-py2.7.exe')
Does anyone know how to do this?
Pip doesn't support installing binary packages yet. If you want to install from a binary package, you have to use easy_install: easy_install gevent-0.13.7.win32-py2.7.exe
Microsoft Windows XP [Wersja 5.1.2600]
(C) Copyright 1985-2001 Microsoft Corp.
Z:\>virtualenv z:\venv\gevent-install
New python executable in z:\venv\gevent-install\Scripts\python.exe
Installing distribute..................................................................................................
............................................................................................done.
Installing pip.................done.
Z:\>venv\gevent-install\Scripts\activate
(gevent-install) Z:\>easy_install c:\python\packages\gevent-0.13.7.win32-py2.7.exe
Processing gevent-0.13.7.win32-py2.7.exe
creating 'c:\docume~1\pdobro~1\ustawi~1\temp\easy_install-b5nj3i\gevent-0.13.7-py2.7-win32.egg' and adding 'c:\docume~1
pdobro~1\ustawi~1\temp\easy_install-b5nj3i\gevent-0.13.7-py2.7-win32.egg.tmp' to it
creating z:\venv\gevent-install\lib\site-packages\gevent-0.13.7-py2.7-win32.egg
Extracting gevent-0.13.7-py2.7-win32.egg to z:\venv\gevent-install\lib\site-packages
Adding gevent 0.13.7 to easy-install.pth file
Installed z:\venv\gevent-install\lib\site-packages\gevent-0.13.7-py2.7-win32.egg
Processing dependencies for gevent==0.13.7
Searching for greenlet
Reading http://pypi.python.org/simple/greenlet/
Reading http://bitbucket.org/ambroff/greenlet
Reading https://github.com/python-greenlet/greenlet
Best match: greenlet 0.3.4
Downloading http://pypi.python.org/packages/2.7/g/greenlet/greenlet-0.3.4-py2.7-win32.egg#md5=9941aa246358c586bb274812e
130629
Processing greenlet-0.3.4-py2.7-win32.egg
creating z:\venv\gevent-install\lib\site-packages\greenlet-0.3.4-py2.7-win32.egg
Extracting greenlet-0.3.4-py2.7-win32.egg to z:\venv\gevent-install\lib\site-packages
Adding greenlet 0.3.4 to easy-install.pth file
Installed z:\venv\gevent-install\lib\site-packages\greenlet-0.3.4-py2.7-win32.egg
Finished processing dependencies for gevent==0.13.7
(gevent-install) Z:\>
See Can I install Python windows packages into virtualenvs? Another option is to install from source; you can do this with pip, but it requires setting up a compiler and build environment, which is much harder than the simple command above.
From the error message, it would appear you need libevent source code. I would imagine you need to go a step further and compile/install libevent system-wide so pip can find it.
I would start by downloading the latest stable source from http://libevent.org/.
Compile and install it using instructions in the README: https://github.com/libevent/libevent#readme
To compile it on Windows, you'll need to use GNU-style build utilities like make and autoconf. I recommend http://www.mingw.org/.
Once you've installed libevent system-wide, I imagine pip will find it and proceed with gevent installation.
In the MSI for gevent-0.13.7 there's an option to select an alternate installation point. Point it to the root directory of your particular virtual environment (just above where /Lib and /Scripts are located). That should install it correctly.
You also need to make sure greenlet is installed. For that you can use Piotr's suggested method of running easy_install on the .exe.