I created a Python app using Autobahn and packaged it with bzr builddeb (Bazaar). In setup.py I listed all the required dependencies under the requires keyword. Is it possible to tell the Debian package installer to install these packages?
I added some of the dependencies to the Depends field in debian/control, but:
dpkg -i my_package does not install the dependencies; it just reports an error, and I have to install them manually.
some packages do not exist in the standard Ubuntu repos, for example autobahn. In general, I'd like all Python dependencies to be installed via pip/easy_install.
I am using DistUtilsExtra.auto.setup with a customized install action, so I think I could run easy_install for those packages there. Is that a good idea?
Thank you.
Create Debian packages from PyPI using the python-stdeb package, then depend on them like any other package.
See http://pypi.python.org/pypi/stdeb
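A rough sketch of the workflow (the resulting package name python-autobahn is an assumption; stdeb derives it from the source package's name):
pip install stdeb
# in the unpacked source directory of the dependency:
python setup.py --command-packages=stdeb.command bdist_deb
# the generated .deb ends up in deb_dist/
sudo dpkg -i deb_dist/python-autobahn_*.deb
Once the dependency exists as a .deb, list its package name in the Depends field of your own debian/control, and dpkg/apt can resolve it like any other Debian dependency.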
We develop our projects with poetry. In one of them we implement QR-code decoding, and as the pip package's page states, zbar needs to be installed on the system.
Is it somehow possible for poetry/pip to install the zbar dependency while installing our package?
Unfortunately not. poetry is essentially a wrapper around pip and can only manage Python dependencies from PyPI.
You could consider something like Nix for this use case: https://github.com/NixOS/nix#nix
Or you could consider a Makefile or shell script that runs the appropriate brew or apt-get equivalent based on the operating system.
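A rough sketch of that idea in shell form (the package names libzbar0 and zbar are assumptions; check the right ones for your platform):
#!/bin/bash
# Install the zbar shared library with whichever package manager is present.
if command -v apt-get >/dev/null 2>&1; then
    sudo apt-get install -y libzbar0   # assumed Debian/Ubuntu package name
elif command -v brew >/dev/null 2>&1; then
    brew install zbar                  # assumed Homebrew formula name
else
    echo "please install zbar manually" >&2
    exit 1
fi
You would run this once per machine, before or alongside poetry install; poetry itself never manages the system library.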
I'm getting this error: AttributeError: module 'rpm' has no attribute 'TransactionSet'. I installed the rpm module with pip3 install rpm. Do I need to install any other modules?
I don't know this module, but I hope I can help.
I imagine you installed this module via pip install rpm.
The PyPI page (https://pypi.org/project/rpm/) says:
Placeholder package to make the RPM Python API available through PyPI.
Right now, this package just reserves the rpm name on PyPI to avoid the potential for a name conflict with the python2-rpm and python3-rpm Python bindings on RPM-based Linux distros.
Unlike libsolv and libdnf (which use CMake, and are hence amenable to PyPI compatible build automation with scikit-build), rpm itself still uses autotools, so creating usable pip-installable Python bindings for it may be a bit trickier than doing so for the other libraries.
So uninstall this package (pip uninstall rpm) and then install the real bindings at system scope (apt-get install python3-rpm).
You may also need to recreate your virtualenv with virtualenv --system-site-packages <env-dir> so that rpm is visible inside it, because apt-get install makes rpm available to the system Python only, not to an isolated virtualenv.
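A minimal sketch of the whole fix (the environment name myenv is a placeholder):
pip uninstall rpm                         # remove the PyPI placeholder
sudo apt-get install python3-rpm          # real bindings, system scope
virtualenv --system-site-packages myenv   # fresh env that can see them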
I am extremely new to Python, so I can't figure out how to install a standalone Python module hosted by a third party, e.g. https://github.com/cvangysel/gitexd-drupalorg/tree/master/drupalorg
How do I install this specific Python module, DrupalHash? Should I use pip? I tried to read the "Installing Python Modules" documentation but could not quite get it.
Any help?
Seeing as the linked repo doesn't contain a setup.py, which would be required to install it with pip or easy_install, and the last commit was over 4 years ago, I'd just copy the drupalpass.py file into your local project and use it with a simple from drupalpass import DrupalHash.
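For example (the raw URL below is reconstructed from the repo layout, so verify it before relying on it):
curl -O https://raw.githubusercontent.com/cvangysel/gitexd-drupalorg/master/drupalorg/drupalpass.py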
The module you are linking to doesn't seem to be available on PyPI; you will need to clone the repository to use the library.
For packages that may be available on PyPI, you can install pip:
easy_install pip
Then, you will be able to install your packages:
pip install <package-name>
I currently have a setup.py file for an application I've written. Using setuptools, I've easily been able to install pip-available requirements like so:
install_requires = [
'argparse',
'multiprocessing',
'requests',
'numpy',
'termcolor',
'prettytable'
]
The problem is that I also need to install MySQLdb, which is not installed via pip. When setting up locally, I had to download the tarball, uncompress it, install, symlink, etc. In short, it was a PITA.
Is there any way to automate this within my setup.py file, rather than downloading the tarball and including it as a package? Even then, how would I run a setup.py from within my own setup.py?
You can actually install MySQLdb via pip, but the package is named MySQL-python.
Now, your users will still need the package's C dependencies installed (libmysqlclient), but this is easily installed with a package manager. It would also be reasonably easy to provide a non-setup.py install script (e.g. a bash script) that installs the appropriate system dependencies (libmysqlclient) and calls your setup.py:
#!/bin/bash
apt-get install -y libmysqlclient-dev # Improve me! Check the platform first
python setup.py install
Just don't try to do too much in your setup.py. No one expects a setup.py script to install system packages, so you should refrain from doing that in yours.
Now, if requiring users to install a system package is too much (if they don't have root access, it can be), you should use a pure-Python MySQL client instead.
One such client is pymysql, which of course can be installed via pip.
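The switch is as simple as:
pip install pymysql
If your code already imports MySQLdb, pymysql also provides an install_as_MySQLdb() helper that lets it stand in for the C-based driver without further code changes.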
I have manually built numpy, scipy, matplotlib, etc. without root privileges (I needed a fresh matplotlib). All the libs are installed in the standard per-user location:
~/.local/lib/python2.7
Now when I try to install anything related, Synaptic suggests installing all those libs system-wide. Is there a way I can tell Synaptic to use the locally installed libs?
I tried symlinking ~/.local/lib/python2.7/site-packages into /usr/lib/python2.7, but that didn't help.
Edit:
If I clone a Python package, change the name in setup.py to the name of the Ubuntu package, and then build:
python setup.py bdist --format=rpm
and then convert it to deb with alien:
sudo alien -k my.rpm
and then install the deb:
sudo dpkg -i my.deb
then Synaptic does recognise it as a package (remember, I tweaked the name in setup.py).
But I can't find a way to make Synaptic aware of locally installed Python libs.
How could a package manager that manages packages at the system level know anything about something installed in a user directory, which is the opposite of the system level?
A package manager resolves dependencies based on meta-information stored in a package (be it rpm, deb, whatever) and/or a package repository.
To achieve your goal, you can take either of two approaches.
The first is to build a system-specific package from your sources and then install it via your package manager; see the "Creating Built Distributions" docs. It would look something like this:
$ python setup.py bdist --format=rpm
$ rpm -i dist/$PACKAGE.rpm
That would make your package manager aware of the fact that some dependencies are already provided.
This approach may or may not work.
The other, preferred option is to use a Python package manager such as pip and install all your packages in a virtual environment (see the sketch after this list). There are several advantages to this method:
You can have several distinct package sets, with different versions of packages installed.
You can optionally isolate your virtual environment from the packages installed system-wide.
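A minimal sketch of that workflow (the path ~/venvs/science is arbitrary; add --system-site-packages if you want the env to see the distro's packages):
virtualenv ~/venvs/science
source ~/venvs/science/bin/activate
pip install numpy scipy matplotlib
Everything lands inside ~/venvs/science, needs no root, and never collides with what Synaptic manages.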