I am working on Windows 10, with Python 2.7.15.
I am trying to install a package, and during the installation process I received the following error:
Cannot uninstall 'PyYAML'. It is a distutils installed project and thus we cannot accurately determine which files belong to it which would lead to only a partial uninstall.
I tried to uninstall with the pip (18.1) command and received the same error:
pip uninstall PyYAML
How can I uninstall/upgrade a distutils package on Windows 10?
Base distutils functionality doesn't leave any information about which files belong to a package -- thus it cannot be reliably uninstalled. That's what the message is telling you. Moreover, it doesn't have dependency metadata, so it can't be "upgraded" reliably, either. All those features are additions by setuptools (and some by wheel and pip itself).
This can happen if you installed the package directly from source with setup.py install when setup.py is distutils- rather than setuptools-based, or if you installed it manually from some types of packages by copying/extracting files.
Unless the way you installed it provides its own uninstaller, you'll have to figure out manually which files belong to the package and delete them from the Python directories.
Usually, these are:
site-packages\<package_name>* directories and/or
site-packages\<package_name>*.py for standalone modules
optionally, a site-packages\<package_name>.pth file
Generally, look for anything that bears the package's name on it.
If you can build the same package from source, you can use the build process to get a hint: build a binary package that you can look into (e.g. setup.py bdist_wheel -- a .whl is a ZIP archive) and see what files it contains.
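For example, a quick way to list what a wheel would install (a sketch; the wheel file name below is only an illustration, use your actual file):

import zipfile

# a .whl is a ZIP archive; print every file it would install
with zipfile.ZipFile('PyYAML-3.13-cp27-cp27m-win32.whl') as whl:
    for name in whl.namelist():
        print(name)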
Related
There is a locally built package (e.g. main-0.1.tar.gz). There is another package (for example base-0.1) that requires main-0.1 as a dependency.
When the base-0.1 package is subsequently installed, the main-0.1 package must be installed as well.
That is, install_requires only lets you specify packages from PyPI; it is not clear how to add local packages to the build.
You can add the main-0.1.tar.gz archive to the base-0.1 archive using MANIFEST.in (include main-0.1.tar.gz). But beyond that, dependency_links, for example, does not work correctly.
How do I add a local package to the build of another package and then install it along with that package, as if it were pulled from PyPI?
You might want to look at:
PEP 440 ("File URLs")
PEP 508
import setuptools

setuptools.setup(
    # [...]
    install_requires=[
        'main @ file:///path/to/main-0.1.tar.gz',
        # [...]
    ],
)
Alternatively (probably better actually), use some combination of pip install options:
pip install --no-index --find-links '/path/to/distributions' main base
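With --no-index pip won't contact PyPI at all, and --find-links makes it resolve main, base, and their dependencies from the files in /path/to/distributions instead.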
Reference:
https://pip.pypa.io/en/stable/user_guide/#installing-from-local-packages
Found a rough solution. I don't know how idiomatic it is, but it works.
Add include main-0.1.tar.gz to MANIFEST.in
In setup.py, at the end of the file (after the call to setup()), add:
import os
import sys

if 'sdist' not in sys.argv[1]:  # skip when building the sdist itself
    os.system('pip install main-0.1.tar.gz')
The condition may need to be different if, for example, sdist is not what you use for building (python setup.py sdist). The main thing is to somehow determine that setup.py is being run to build the package, and not to install it (pip install base-0.1.tar.gz later on).
This way, the local dependency is copied into the archive of the package being built, so it is distributed along with it and installed the same way.
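A slightly more defensive variant of the same idea (still a sketch, assuming the bundled archive is named main-0.1.tar.gz and sits next to setup.py):

import os
import subprocess
import sys

if 'sdist' not in sys.argv[1]:
    # resolve the archive relative to setup.py rather than the current
    # working directory, and use the running interpreter's pip
    archive = os.path.join(os.path.dirname(os.path.abspath(__file__)), 'main-0.1.tar.gz')
    subprocess.check_call([sys.executable, '-m', 'pip', 'install', archive])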
I have developed a Python package on GitHub that I released on PyPI. It installs with pip install PACKAGENAME, but does not do anything with the dependencies that are stated in the install_requires of the setup.py file.
Weirdly enough, the zip file of the associated release does install all dependencies. I tried with different virtual environments and on different computers, but it never installs the dependencies. Any help appreciated.
pip install pythutils downloads a wheel if it's available — and it's available for your package.
When generating a wheel, setuptools runs python setup.py locally but doesn't include setup.py in the wheel. Download your wheel file and unzip it (it's just a zip archive) — there is your main package directory pythutils and a directory with metadata, pythutils-1.1.1.dist-info. In the metadata directory there is a file METADATA that usually lists static dependencies, but your file doesn't list any: when you were generating the wheel, all your dependencies had already been installed, so all your dynamic code paths were skipped.
The archive that you downloaded from the GitHub release installs dependencies because it's not a wheel, so pip runs python setup.py install and your dynamic dependencies work.
What can you do? My advice is to avoid dynamic dependencies. Declare static dependencies and let pip decide which versions to install:
install_requires=[
    'numpy==1.16.5; python_version>="2" and python_version<"3"',
    'numpy; python_version>="3"',
],
Another approach would be to create version-specific wheel files — one for Python 2 and another for Python 3 — with fixed dependencies.
Yet another approach is to not publish wheels at all and only publish an sdist (source distribution). Then pip is forced to run python setup.py install on the target machine. That's not the best approach, and it will certainly be problematic for packages with C extensions (the user must have a compiler and developer tools to install from source).
Your setup.py does a series of checks like
import sys

install_requires = []  # filled in dynamically below
try:
    import numpy
except ImportError:
    if sys.version_info[0] == 2:
        install_requires.append('numpy==1.16.5')
    if sys.version_info[0] == 3:
        install_requires.append('numpy')
Presumably the system where you ran it had all the required modules already installed, so you ended up with an empty install_requires list. But this is the wrong way to do it anyway; you should simply make a static list (or two static lists, one each for Python 2 and Python 3, if you really want to support both in the same package).
Is it actually possible to install data files from a sdist to arbitrary locations outside the sys.prefix directory using setuptools?
The documentation implies that it is possible to do so, using an absolute path in the data_files directory part, but in practice the leading slash seems to be ignored.
For instance with data_files=[('/opt/foo/bar', [...])], the files should be installed into the directory /opt/foo/bar. But they all end up in /usr/local/lib/python3.4/dist-packages/opt/foo/bar, which is no use to man nor beast.
I suspect that it used to work - has it been changed/broken recently?
(Using pip 8.1.1, Python 3.4, setuptools 20.9.0)
Best bet would be to override install with your own cmdclass and move the files yourself.
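A minimal sketch of that idea (the source file data/foo.conf and the target /opt/foo/bar are placeholders):

import shutil
from setuptools import setup
from setuptools.command.install import install

class InstallAndCopy(install):
    def run(self):
        install.run(self)  # do the normal install first
        # then put the data file at its absolute destination
        shutil.copy('data/foo.conf', '/opt/foo/bar/foo.conf')

setup(
    # [...]
    cmdclass={'install': InstallAndCopy},
)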
Ah, it's all pip's fault. Given a sdist, pip helpfully creates a wheel before installing that. But wheels can't install files outside dist-packages.
Luckily you can knock some sense into pip by chucking an error, causing it to fall back to installing the sdist you wanted in the first place.
# in setup.py
import sys

if 'bdist_wheel' in sys.argv:
    raise RuntimeError("This setup.py does not support wheels")
(Thanks to Benjamin Bach)
Is there a way to install files to arbitrary locations with setuptools? I've used Data Files with setuptools before, but those are typically installed inside the package directory. I need to install a plugin file that will be located in the install directory of another application.
It seems that setuptools has purposely made it difficult to install files outside of the package directory.
I instead included the plugin files as package data and used the Entry Points feature of setuptools to expose the install/uninstall functions for the plugin files I wanted to distribute.
setup(
    ...
    entry_points={
        'console_scripts': [
            'mypackage_install_plugins = mypackage:install_plugins',
            'mypackage_uninstall_plugins = mypackage:uninstall_plugins',
        ],
    },
)
I just added an additional step to the installation instructions to run the following command after installing the Python package:
$> mypackage_install_plugins
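For reference, the two entry-point functions might look roughly like this (a sketch; the plugin file name and the target directory are placeholders, not from my actual package):

import os
import shutil

PLUGIN_FILE = 'myplugin.dat'          # shipped as package data
TARGET_DIR = '/opt/otherapp/plugins'  # install dir of the other application

def install_plugins():
    src = os.path.join(os.path.dirname(os.path.abspath(__file__)), PLUGIN_FILE)
    shutil.copy(src, os.path.join(TARGET_DIR, PLUGIN_FILE))

def uninstall_plugins():
    target = os.path.join(TARGET_DIR, PLUGIN_FILE)
    if os.path.exists(target):
        os.remove(target)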
The data_files attribute will allow you to specify full paths.
You could also do some shutil.copy magic in your setup.py, except don't.
Check out this answer:
Execute a Python script post install using distutils / setuptools
which shows how to add an arbitrary install script (Python, shell, whatever) that runs at the end of the install. It'll run whether you use setup.py install directly or a package manager like pip install. With this, you can add any files you want, anywhere you want.
Unfortunately, I feel Brendan's pain - setuptools, not being a full package manager itself, does not handle uninstallation. Therefore, there's no way to have an uninstall hook to reverse what you did in the post-install script.
I want to create a package for Python that embeds and uses an external library (.so) on Linux, using the cffi module.
Is there a standard way to include a .so file in a Python package?
The package will be used only internally and won't be published to PyPI.
I think wheel packages are the best option: they create a platform-specific package with all files ready to be copied, so there is no need to build anything on target environments.
You can use auditwheel to inject the external libraries into the wheel:
auditwheel repair: copies these external shared libraries into the wheel itself, and automatically modifies the appropriate RPATH entries such that these libraries will be picked up at runtime. This accomplishes a similar result as if the libraries had been statically linked without requiring changes to the build system. Packagers are advised that bundling, like static linking, may implicate copyright concerns.
You can pre-build the external C++ library, typically by executing:
./configure && make && make install
This will generate a my_external_library.so file and install it in the appropriate path. However, you'll need to ensure that the library path is properly set for auditwheel to discover the missing dependency:
export LD_LIBRARY_PATH=/usr/local/lib
You can then build the Python wheel by executing:
python setup.py bdist_wheel
Finally, you can repair the wheel, which will inject my_external_library.so into the package:
auditwheel repair my-python-wheel-1.5.2-cp35-cp35m-linux_x86_64.whl
I successfully applied the above steps to the Python library confluent-kafka-python, which has a required C/C++ dependency on librdkafka.
Note: auditwheel is Linux-only. For macOS, see the delocate tool.
Wheels are the standard way of distributing Python packages, but there is a problem when you have extension modules that depend on other .so files: the normal Linux dynamic linker is used, and that only looks in the standard system locations such as /usr/lib or /usr/local/lib. This is a problem when installing a wheel into a virtualenv.
As far as I know, you have three options:
static linking, so the 'wrapper' does not depend on anything else;
using ctypes to wrap the .so directly from Python;
splitting the distribution into a wheel with the Python code & wrapper, and a separate RPM or DEB that installs the .so into /usr/lib or /usr/local/lib.
A wheel may work if you include the dependent .so as a data file stored in /lib and install into the root Python environment (I haven't tried that), but this will break if someone tries to install the wheel into a virtualenv (I did try that).
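One way around the linker search path problem is to load the bundled .so by an explicit path relative to the package, which also works inside a virtualenv. A sketch with cffi (the library name and the declared function are made up for illustration):

import os
from cffi import FFI

ffi = FFI()
ffi.cdef("int foo_add(int a, int b);")  # hypothetical signature

# load by absolute path, bypassing the dynamic linker's search directories
_here = os.path.dirname(os.path.abspath(__file__))
lib = ffi.dlopen(os.path.join(_here, 'libfoo.so'))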