I want to include a Python package dependency (installed using pip3 install) in an rpm package. I cannot install it using dnf because the packaged version is out of date. If I install the dependency using pip3 install instead, rpm reports the following error when installing my package:
error: Failed dependencies:
    python3.6dist(dependency-package)
Any suggestions on how to include a Python package inside of an rpm?
OK. Some of your packages require python3-somepackage or python3dist(somepackage). For rpm, that is just a string; rpm does not know that the Python module has been installed using pip. There must be some package which provides that string.
You have two options.
Preferred is the use of pyp2rpm --srpm somepackage. That will download the latest version of the module from PyPI and produce a src.rpm, which you can then build using mock -r epel-8-x86_64 somepackage.src.rpm
The other option is to fake the provides. You can install the module using pip and then run create-fake-rpm --build python3-somepackage 'python3dist(somepackage)'. This will generate the file fake-python3-somepackage-0-0.noarch.rpm, which you can install using rpm. Then you can proceed with the install of your application. Be warned that this is cheating: a future dnf upgrade will not update this module, so you will have to take care of that yourself.
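For reference, here is the first option end to end, as a rough sketch (somepackage and the EPEL 8 target are placeholders from the answer above; substitute your module and distribution):

pyp2rpm --srpm somepackage                 # fetch the latest release from PyPI, produce a src.rpm
mock -r epel-8-x86_64 somepackage.src.rpm  # rebuild it into a binary rpm in a clean chroot
# the resulting rpm provides python3dist(somepackage), so installing it
# satisfies the failed dependency and your own rpm will then install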
I have a private PyPI registry, and I keep uploading new versions of a package called framework to it.
I want to install the latest version in a virtualenv.
Command used:
pip install -i https://user:pass@registry framework
OUTPUT:
Collecting framework:
... downloads many versions
ERROR: Cannot install framework==0.25.13, framework==0.25.14 and framework==0.26.0 because these package versions have conflicting dependencies.
The conflict is caused by:
framework 0.26.0 depends on toolz<0.12.0 and >=0.11.1
framework 0.25.14 depends on toolz<0.12.0 and >=0.11.1
framework 0.25.13 depends on toolz<0.12.0 and >=0.11.1
To fix this you could try to:
1. loosen the range of package versions you've specified
2. remove package versions to allow pip attempt to solve the dependency conflict
ERROR: ResolutionImpossible: for help visit https://pip.pypa.io/en/latest/user_guide/#fixing-conflicting-dependencies
I want only the latest version to be downloaded. I cannot hard-code a version like framework==0.26.0 in the pip install command, because the command is used in a script and I would have to modify the script every time a new framework version is uploaded.
pip version: pip 21.1.2
The solution that worked for this problem was using --extra-index-url instead of -i. With -i, the private registry replaces PyPI as the only index, so pip also has to find toolz there; with --extra-index-url, the private registry is searched in addition to the default index, so transitive dependencies like toolz can still be resolved from PyPI.
Command to use:
pip install --extra-index-url https://user:pass@registry framework
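If the command lives in a script, the same fix can be expressed in a requirements file instead (a sketch; the registry URL is the same placeholder as above):

# requirements.txt
--extra-index-url https://user:pass@registry
framework

The script then only ever runs pip install -r requirements.txt, whatever the latest version is.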
I'm developing an application based on django==1.7.x.
The problem I have is that the setup.py of one of my dependencies (let's call it foo) specifies Django>=1.3 as one of its requirements, but when foo is installed, pip tries to install the latest version of django, which as of now is 1.8.3.
I thought that when dependencies are specified like package>=min_version in the setup.py file, pip would see that package is already installed, that the installed version satisfies the minimum required version, and would thus respect the existing installation of package.
Why is pip trying to install the latest version, and how can I force it to respect my currently installed version?
Update: FYI, I'm using pip==7.1.0
Update: This only happens when installing manually, as in pip install foo==X.Y. When the dependency is listed in a requirements file and installed via pip install -r requirements.txt, pip respects the already-installed versions of required packages.
Thanks!
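One common workaround, sketched here on the assumption that foo's other requirements are already satisfied in your environment: install foo with dependency resolution switched off, so pip cannot touch your django==1.7.x.

pip install --no-deps foo==X.Y  # install foo only; leave installed dependencies alone

The trade-off is that pip then does not verify the requirements at all, so a genuinely missing dependency would only surface at runtime.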
Many Python packages have build dependencies on non-Python packages. I'm specifically thinking of lxml and cffi, but this dilemma applies to a lot of packages on PyPI. Both of these packages have unadvertised build dependencies on non-Python packages like libxml2-dev, libxslt-dev, zlib1g-dev, and libffi-dev. The websites for lxml and cffi declare some of these dependencies, but it appears that there is no way to figure this out from the command line.
As a result, there are hundreds of questions on SO that take this general form:
pip install foo fails with an error: "fatal error: bar.h: No such file or directory". How do I fix it?
Is this a misuse of pip or is this how it is intended to work? Is there a sane way to know what build dependencies to install before running pip? My current approach is:
1. I want to install a package called foo.
2. pip install foo
3. foo has a dependency on a Python package bar.
4. If the build of bar fails, look at the error message and guess/google which non-Python dependency I need to install.
5. sudo apt-get install libbaz-dev
6. sudo pip install bar
7. Repeat until bar succeeds.
8. sudo pip uninstall foo
9. Repeat the entire process until there are no error messages.
Step 4 is particularly annoying. Apparently pip (version 1.5.4) installs the requested package first, before any of its dependencies. So if any dependency fails to build, you can't just ask pip to install the package again, because it thinks it's already installed. There is also no option to install just the dependencies, so you must uninstall the package and then reinstall it.
Is there some more intelligent process for using pip?
This is actually a comment on the answer suggesting apt-get, but I don't have enough reputation points to leave one.
If you use virtualenv a lot, installing Python packages through apt-get can become a pain, as you can get mysterious errors when the Python packages installed system-wide and the Python packages installed in your virtualenv try to interact with each other. One thing that I have found does help is the build-dep feature. To install the build dependencies of matplotlib, for example:
sudo apt-get build-dep python-matplotlib
Then activate your virtual environment and run pip install matplotlib. It will still go through the build process, but many of the dependencies will already be taken care of for you.
This is sort of what the CRAN repositories suggest when installing R packages on Ubuntu.
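Putting the pieces together, the flow looks roughly like this (a sketch; apt-get build-dep requires deb-src lines to be enabled in your APT sources):

sudo apt-get build-dep python-matplotlib   # install the C libraries and headers matplotlib builds against
virtualenv env && source env/bin/activate  # isolated environment, as described above
pip install matplotlib                     # still builds from source, but the headers are now present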
For most popular packages, there is a workaround on recent Ubuntu systems. For example, I want to install matplotlib. When you run pip install matplotlib, it usually fails because of a missing dependency.
You can use apt-get install python-matplotlib instead. For Python 3, use apt-get install python3-matplotlib.
I have a Python package A which depends on another private package named godot (hosted at Bitbucket, to be accessed over the git+ssh protocol). In package A's setup.py, I have the following code:
...
install_requires=['godot'],
dependency_links=['git+ssh://git@bitbucket.org/xxx/godot.git#egg=godot']
...
I have two questions here:
Now setuptools 1.4 (the latest stable version) does not support the 'git+ssh' protocol; only code in the development branch handles it (see Python setuptools: How can I list a private repository under install_requires?). I have installed the development version via:
pip install --upgrade --force-reinstall hg+https://bitbucket.org/pypa/setuptools#egg=setuptools
I have almost solved this part, but I wonder if any other approach is available, such as invoking pip install -r requirements.txt (with git+ssh://git@bitbucket.org/xxx/godot.git#egg=godot listed in requirements.txt)?
The second question is a name conflict. There is another package on PyPI also named godot, so when I install package A using the following command, pip installs the godot from the PyPI index:
pip install git+ssh://git@pypi.corp.com/xxx/A.git#egg=A
How can I force pip (setup.py) to install the private godot package rather than the one on the PyPI index?
For part 1: you can install packages via pip by specifying a direct URL, for example:
$ pip install http://my.package.repo/SomePackage-1.0.4.zip
To keep it simple and avoid spending undue time on this, I would just download the .zip source file and install it via pip as above.
For part 2: pip has a --no-deps switch. Install all of the dependencies manually first (including your private godot), then install the package itself with --no-deps.
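Concretely, for the godot case that could look like the following rough sketch (URLs taken from the question above):

pip install git+ssh://git@bitbucket.org/xxx/godot.git#egg=godot    # install the private dependency first
pip install --no-deps git+ssh://git@pypi.corp.com/xxx/A.git#egg=A  # then A itself, with resolution off

Because dependency resolution is skipped for A, pip never consults the PyPI index looking for godot, so the name conflict does not arise.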
I've got a Python module which is distributed on PyPI, and therefore installable using easy_install. It depends on lxml, which in turn depends on libxslt1-dev. I'm unable to install libxslt1-dev with easy_install, so it doesn't work to put it in install_requires. Is there any way I can get setuptools to install it instead of resorting to apt-get?
setuptools can only install Python packages that are in the package index you are using: either the default index, or the one you specify with easy_install -i http://myindex.site/index.
Any non-Python dependencies have to be installed using the standard package manager for the platform (apt-get on Debian-based Linux distros). libxml2 and libxslt fall into this category, so you should install them in the standard way.
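For lxml specifically, that means installing the development headers first (package names as on Debian/Ubuntu; they may differ on other distros):

sudo apt-get install libxml2-dev libxslt1-dev  # headers the lxml C extension compiles against
easy_install lxml                              # now the build can find the libxml2/libxslt headers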
It's better to use apt-get to install lxml (or other Python packages that have C extensions) and then pull pure-Python packages from PyPI. Also, I generally try to avoid using easy_install for top-level installs; I would rather create a virtual environment using virtualenv and then use the easy_install created by virtualenv, to keep my setups clean.
This strategy has worked successfully for me in a couple of production environments.
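As a sketch of that setup (Debian-style package names assumed; some-pure-python-package is a stand-in for whatever you actually need):

sudo apt-get install python-lxml               # the C-extension package, from the distro
virtualenv --system-site-packages env          # a virtualenv that can see the distro package
env/bin/easy_install some-pure-python-package  # pure-Python packages from PyPI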