Currently my python package does not have a dependency on the wmi package and it can be easily installed via
pip install mypackage
If I add a dependency on the wmi package, this will likely fail: when I try installing wmi through pip I get errors because I do not have Visual Studio 2008 installed, and I only managed to get wmi installed by using its binary distribution.
Is it possible for me to include and install the binary release of wmi in my package?
The main concern is that if people fail to install my package via the pip command, they just avoid using my package.
The first thing to consider is why you are adding the wmi package at all: since it is MS-Windows specific, if you use it (or anything depending on it) your package will also be MS-Windows specific.
Are there other ways to achieve what you are trying to do that remain cross-platform? If not, and you really have to use it, then you could include a prerequisite statement in the documentation, and ideally in setup.py, telling people that they need an installed and working copy of wmi, hopefully with a pointer to the binary distributions.
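If it turns out you do need wmi but only on MS-Windows, one option (this is an assumption on my part that your pip/setuptools are recent enough to understand PEP 508 environment markers) is to declare it as a platform-conditional dependency, so installs on other platforms simply skip it:

install_requires=[
    'wmi; sys_platform == "win32"',
],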
The other way to go, if you are on a late enough version of Python, is to build and distribute your package as Python wheels. Wheels allow the inclusion of compiled C extension modules without relying on the presence of a compiler on the target system; see PEP 427 for more information.
Creating Wheels:
You need to be running Python 2 > 2.6 or Python 3, pip >= 1.4, and setuptools >= 0.8.
Basically, assuming that you have a setup.py that will create your (source) distribution for upload to PyPI with:
python setup.py sdist
then you can create a binary distribution of your package for your current Python version with:
python setup.py bdist_wheel
This will build a distribution wheel that includes your package's Python modules and any compiled binary extensions, so nothing needs to be built on the target system.
But you need to do this once for each version of Python that you plan to support (virtualenv is magic for this), and on each platform if you also plan to support 64-bit or Mac. Unless, of course, you manage to make a pure Python package that will run, without 2to3, under both Python 2 & 3, in which case you can build a universal wheel; obviously you cannot do this if you require C extensions.
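If your package does qualify as pure Python, a minimal setup.cfg entry (assuming a reasonably recent wheel/setuptools) to mark the wheel as universal is:

[bdist_wheel]
universal = 1

With that in place, python setup.py bdist_wheel produces a single py2.py3-none-any wheel instead of one per Python version.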
For more information on wheels see Wheel - Read The Docs.
Related
I have developed a python package on github that I released on PyPi. It installs with pip install PACKAGENAME, but does not do anything with the dependencies that are stated in the "install_requires" of the setup.py file.
Weirdly enough, the zip file of the associated release does install all dependencies. I tried with different virtual environments and on different computers, but it never installs the dependencies. Any help appreciated.
pip install pythutils downloads a wheel if it's available — and it's available for your package.
When generating a wheel, setuptools runs python setup.py locally but doesn't include setup.py in the wheel. Download your wheel file and unzip it (it's just a zip archive): there is your main package directory pythutils and a metadata directory pythutils-1.1.1.dist-info. In the metadata directory there is a file METADATA that usually lists static dependencies, but your file doesn't list any, because when you were generating the wheel all your dependencies had already been installed, so all your dynamic code paths were skipped.
The archive that you downloaded from the GitHub release installs dependencies because it's not a wheel, so pip runs python setup.py install and your dynamic dependencies work.
What can you do? My advice is to avoid dynamic dependencies. Declare static dependencies and let pip decide what versions to install:
install_requires=[
    'numpy==1.16.5; python_version>="2" and python_version<"3"',
    'numpy; python_version>="3"',
],
Another approach would be to create version-specific wheel files — one for Python 2 and another for Python 3 — with fixed dependencies.
Yet another approach is to not publish wheels at all and only publish an sdist (source distribution). Then pip is forced to run python setup.py install on the target machine. That's not the best approach, and it will certainly be problematic for packages with C extensions (the user must have a compiler and developer tools to install from source).
Your setup.py does a series of checks like
try:
    import numpy
except ImportError:
    if sys.version_info[0] == 2:
        install_requires.append('numpy==1.16.5')
    if sys.version_info[0] == 3:
        install_requires.append("numpy")
Presumably the system where you ran it had all the required modules already installed, and so ended up with an empty list in install_requires. But this is the wrong way to do it anyway; you should simply make a static list (or two static lists, one each for Python 2 and Python 3 if you really want to support both in the same package).
I am working in Windows 10, with Python 2.7.15.
I am trying to install a package, and during the installation process I received the following error.
Cannot uninstall 'PyYAML'. It is a distutils installed project and thus we cannot accurately determine which files belong to it which would lead to only a partial uninstall.
I tried to uninstall with the pip (18.1) command and I received the same error.
pip uninstall PyYAML
How can I uninstall/upgrade a distutils package on Windows 10?
Base distutils functionality doesn't leave any information about which files belong to a package -- thus it cannot be reliably uninstalled. That's what the message is telling you. Moreover, it doesn't have dependency metadata, so it can't be "upgraded" reliably, either. All those features are additions by setuptools (and some by wheel and pip itself).
This can happen if you installed the package directly from source with setup.py install when setup.py is distutils-based rather than setuptools-based, or if you installed it manually from some types of packages by copying/extracting files.
Unless the way you installed it provides an own uninstaller, you'll have to manually figure out which files belong to the package and delete them from Python directories.
Usually, these are:
site-packages\<package_name>* directories and/or
site-packages\<package_name>*.py for standalone modules
optionally, a site-packages\<package_name>.pth file
Generally, look for anything that bears the package's name on it.
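If you want a quick inventory before deleting anything, a short Python snippet along these lines can help (the name patterns shown for PyYAML are assumptions on my part; adjust them for whatever package you are removing):

import glob
import os
import sysconfig

site_packages = sysconfig.get_paths()["purelib"]

# PyYAML typically installs a "yaml" package plus a "_yaml" extension module;
# change these patterns to match the package you are trying to remove.
for pattern in ("PyYAML*", "yaml*", "_yaml*"):
    for path in glob.glob(os.path.join(site_packages, pattern)):
        print(path)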
If you can build the same package from source, you can use the build process to get a hint: build a binary package that you can look into (e.g. setup.py bdist_wheel; a .whl is a ZIP archive) and see what files it contains.
I use Python 3.4
I am trying to install Cython and Numba but keep getting "Unable to find vcvarsall.bat".
I googled for the solution and found that I need Microsoft Visual C++ 2010 installed (for Python 3.4).
So I installed it.
And tried installing Cython and Numba ---> fail.
And then they say I must type "SET VS90COMNTOOLS=%VS100COMNTOOLS%" in the command prompt, which I did, like C:\Users\Dorky>set vs90comntools=%vs100comntools%.
And tried installing Cython and Numba again ---> fail.
Not enough with that, I also went to the environment variables to set this VS90 to VS100 thing manually.
And tried installing Cython and Numba again ---> fail.
So how exactly can I solve this special "Unable to find vcvarsall.bat" problem?
What the heck is so special about this vcvarsall.bat that the user must install Microsoft's programs in order to use it?
Why not just extract this vcvarsall.bat as an independent file and copy & paste it into any directory that needs it, and not bother with the rest of the software package?
Why doesn't the Python team just extract this vcvarsall.bat and ship it with Python, so whenever a user installs Python they also get vcvarsall.bat, and Python would know where to look for it when it needs to install Cython or Numba or anything else?
If you're using the python.org version of Python, there's a much easier way to go about things: grab the packages you're interested in from Christoph Gohlke's Python Extension Packages for Windows repository. He has a very large selection of mainly scientific-computing Python modules, including Cython and numba (you'll need numpy, compiled with Intel's MKL, and llvmlite, which has an additional prerequisite of its own, as well). Everything is precompiled into .whl packages that can be installed with an up-to-date version of pip, and most modules are kept updated with the latest versions on PyPI or other repositories.
This is definitely my go-to site for installing packages on Windows, and if what I'm looking for isn't there, then I'll install via pip or the package source.
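For example, once you have downloaded the wheel that matches your Python version and architecture (the filename below is only illustrative), installing it is just:

pip install Cython-0.22-cp34-none-win_amd64.whl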
I want to create package for python that embeds and uses an external library (.so) on Linux using the cffi module.
Is there standard way to include .so file into python package?
The package will be used only internally and won't be published to pypi.
I think wheel packages are the best option: they create a platform-specific package with all files ready to be copied, so there is no need to build anything on the target environments.
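As a rough sketch of that approach (all names here are hypothetical), the prebuilt .so can be shipped as package data so that bdist_wheel picks it up:

from setuptools import setup, find_packages

setup(
    name="mypackage",                        # hypothetical project name
    version="0.1.0",
    packages=find_packages(),
    # ship the prebuilt shared library inside the package directory
    package_data={"mypackage": ["libexternal.so"]},
)

At runtime the library can then be loaded with a path relative to the package's __file__, so it keeps working inside a virtualenv.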
You can use auditwheel to inject the external libraries into the wheel:
auditwheel repair: copies these external shared libraries into the wheel itself, and automatically modifies the appropriate RPATH entries such that these libraries will be picked up at runtime. This accomplishes a similar result as if the libraries had been statically linked without requiring changes to the build system. Packagers are advised that bundling, like static linking, may implicate copyright concerns.
You can pre-build the external C++ library, typically by executing the following:
./configure && make && make install
This will generate a my_external_library.so file and install it in the appropriate path. However, you'll need to ensure that the library path is properly set in order for auditwheel to discover the missing dependency:
export LD_LIBRARY_PATH=/usr/local/lib
You can then build the python wheel by executing:
python setup.py bdist_wheel
Finally, you can repair the wheel, which will inject the my_external_library.so into the package.
auditwheel repair my-python-wheel-1.5.2-cp35-cp35m-linux_x86_64.whl
I successfully applied the above steps to the Python library confluent-kafka-python, which has a required C/C++ dependency on librdkafka.
Note: auditwheel is Linux-only. For MacOS, see the delocate tool.
Wheels are the standard way of distributing Python packages, but there is a problem when you have extension modules that depend on other .so files: the normal Linux dynamic linker is used, and that only looks in /usr/lib or /usr/local/lib. This is a problem when installing a wheel into a virtualenv.
As far as I know, you have three options:
Static linking, so the 'wrapper' does not depend on anything else;
Using ctypes to wrap the .so directly from Python (a sketch follows at the end of this answer);
Splitting the distribution into a wheel with the Python code & wrapper, and a separate RPM or DEB to install the .so into either /usr/lib or /usr/local/lib.
A wheel may work if you include the dependent so as a data file to be stored in /lib and install into the root Python environment (haven't tried that), but this will break if someone tries to install the wheel into a virtualenv (did try that).
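For the ctypes route mentioned above, here is a minimal sketch (the library name and the exported add function are hypothetical) that loads a .so shipped inside the package directory:

import ctypes
import os

# load the shared library from the same directory as this module,
# so it is found even inside a virtualenv
_here = os.path.dirname(os.path.abspath(__file__))
_lib = ctypes.CDLL(os.path.join(_here, "libexternal.so"))

# assume the library exports: int add(int, int)
_lib.add.argtypes = (ctypes.c_int, ctypes.c_int)
_lib.add.restype = ctypes.c_int

def add(a, b):
    return _lib.add(a, b)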
I need to use scikits.bvp_solver in python.
I currently use Canopy as my standard Python interface, where this package isn't available. Is there another available package for solving boundary value problems? I have also tried installing it with MacPorts, but the procedure gets stuck when it tries to build the gcc48 dependency.
You can try to download the package tar.gz and use easy_install, or you can unpack the package and use the standard python setup.py install. I believe both ways require a Fortran compiler.