How to make a Debian package which includes several Python packages

I want to create a Debian package that, when installed, will install several Python packages with pip. I can think of two ways:
Install the Python packages into a directory and then build a Debian package from that directory. But this will confuse the build host (e.g. its pip metadata), especially if the host already has some of those packages installed.
Make a Debian package containing all the Python packages and, during the package's install and uninstall, run scripts that install/uninstall the Python packages. But this means two more scripts to maintain, plus some place on the target machine to hold all the Python packages.
Is there any other solution, and what's the best way to solve this problem?

In my opinion, if you want to create a Debian package you should avoid referencing external distribution systems.
Here are the guidelines for creating Python packages under Debian.
EDIT: Sorry, I see now that the Debian wiki page about Python packaging could be outdated. You could read:
the guide for pybuild
or, alternatively, the Python page about building packages

If you want to create a meta package that depends on the python-<packagename> packages in the repositories, that is easy, and I think you already know how (if not, look up the equivs package). I assume you want recent versions of the Python packages installed, or some packages are missing from the Debian repositories, so the Debian repositories will not be used.
pip is a good tool, but you can break dependencies if you uninstall a Python package that is required by another package installed by apt after your meta package; apt is your friend, so be careful. To overcome this, my suggestion is to add the appropriate package names to the Provides, Conflicts and Replaces fields of your meta package's control file, whether you dynamically install the Python packages via pip or bundle them in your main package. I quickly searched for "bundling multiple debian packages into one package" and found no solution.
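As a rough sketch, the control file you would feed to equivs-build for such a meta package could look like this (all package names here are placeholders, not taken from the question):
# metapkg.control - hypothetical input for: equivs-build metapkg.control
Section: python
Priority: optional
Standards-Version: 3.9.2
Package: my-python-stack
Version: 1.0
Depends: python
Provides: python-requests, python-numpy
Conflicts: python-requests, python-numpy
Replaces: python-requests, python-numpy
Description: meta package for a pip-installed Python stack
 Placeholder long description for the bundled Python modules.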
If you want to completely separate your Python packages from your system-wide Python packages, virtualenv is the best choice I know of.
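For example, a minimal sketch (the /opt/myapp path and the package names are placeholders; your package's maintainer scripts could own this directory):
virtualenv /opt/myapp/venv
/opt/myapp/venv/bin/pip install requests numpy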
And if you want to build Debian-compliant packages from Python packages, stdeb can do that easily.
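A minimal sketch of the usual stdeb workflow, run from a project containing a setup.py (the exact name of the resulting .deb depends on your package):
pip install stdeb
python setup.py --command-packages=stdeb.command bdist_deb
sudo dpkg -i deb_dist/*.deb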
Moreover, as far as I remember, I have seen some packages in Kali Linux (which is Debian-based) dynamically install Python packages during installation or at startup; however, Debian policy may not allow this kind of flexibility, in order not to break dependencies (relevant if you want to build an official package). I hope this answer guides you in the right direction.

Related

How to reinstall all user packages after updating Python version in Windows?

I have a Windows 7 machine running Python 3.8.5 with a very large number of physics/electronics/data analysis/simulation packages. As it turned out, I must have - for some inexplicable reason - installed the 32-bit version of Python instead of the 64-bit one despite having a 64-bit system. And I didn't notice until very recently when I was trying to install some packages that require 64-bit Python. Hence I've now downloaded and installed the latest Python version that is supported by Windows 7, which seems to be 3.8.10.
Question: What is the easiest and also fail-safe way to reinstall all the user packages - that I currently have under 3.8.5 - to 3.8.10?
For some reason, I couldn't find any "canonical" solution for this online. As it seems, Python does not come with any built-in support for updating or system migration and I'm honestly wondering why...
Anyway, my first idea was to get a list of all user (= "local"?) packages currently installed under 3.8.5, but I don't know how. Reason: Doing help('modules') inside the interpreter will list all packages and I don't see a way to "selectively apply" pip to a specific Python version, e.g. something like python-3.8.5 -m pip list --local is not supported.
After getting a list of the user packages, I was thinking of packing it into a batch command pip install package_1 package_2 <...> package_N, thus reinstalling everything on Python 3.8.10, and afterwards uninstalling Python 3.8.5 and removing all its environment variables from the system PATH.
Is this the proper way to do this?
Anyway, my first idea was to get a list of all user (= "local"?) packages currently installed under 3.8.5, but I don't know how.
Create a list of installed packages with pip freeze > pkglist.txt or pip list --format=freeze. If you already have one, that's great.
Then uninstall 32-bit Python 3.8.5 and clean your path for all Python related variables. Now, install 64-bit Python 3.8.10.
After reinstalling, you can install back all the packages with pip install -r pkglist.txt and it will restore the exact versions of the packages.
If you insist on having both 32-bit and 64-bit versions installed and also have the Python Launcher installed, you could invoke 32 and 64 bit versions separately with py -3.8-64 -m pip and py -3.8-32 -m pip.
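Putting those steps together with the Py Launcher while both versions are still installed would look roughly like this (pkglist.txt is just an example filename):
py -3.8-32 -m pip freeze > pkglist.txt
py -3.8-64 -m pip install -r pkglist.txt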
I don't see a way to "selectively apply" pip to a specific Python version.
This is possible with the Python Launcher on Windows, but only between major/minor versions, not patch versions, according to its help message.
I would also recommend creating a virtual environment this time before installing the packages and leaving the root environment alone. You can create one named venv with just python -m venv venv, activate it with ./venv/Scripts/activate and proceed with the installation of packages.
Nope, that doesn't work. After installing the packages with the newer Python version in PATH, Jupyter, for example, won't start.
If the Jupyter error persists, you could try pinning packages to their most recent patch/minor versions to update them and yet not break your code.
As a last resort, you could try installing Python 3.10 alongside your current Python installation (without uninstall or editing the PATH) and then installing the absolute latest versions of the packages in a 3.10 virtual environment to see if it works for you. You would invoke the two versions with Py Launcher, e.g. py -3.10 and py -3.8.
If I understood correctly, you have multiple packages like NumPy, pandas etc. installed on your machine, and you want to reinstall them "automatically" on a fresh installation of python.
The method I use to perform such an operation is to create a file named setup.py that includes a list of all the packages.
Below, I am attaching an example of such a file that I use in one of my projects:
from setuptools import setup, find_packages

setup(
    name='surface_quality_tools',
    version='0.1',
    # "scikit-learn" is the correct distribution name; "sklearn" is a deprecated alias
    install_requires=["matplotlib", "psutil", "numpy", "scipy", "pandas", "trimesh",
                      "pyglet", "networkx", "protobuf", "numpy-stl", "scikit-learn",
                      "opencv-python", "seaborn", "scikit-image", "flask", "tqdm", "pytest"],
    package_data={'': ['*.json']},
    # include=[] matches no local packages: this file only pulls in the dependencies
    packages=find_packages(include=[]),
)
To run the installation, open a command prompt inside the project directory and run:
pip install -e .
You can find a nice example on this blog page.
One common way of handling packages in Python is via virtual environments. You can use Anaconda (conda), venv or any of several other solutions. For example, see this post:
https://towardsdatascience.com/virtual-environments-104c62d48c54#:~:text=A%20virtual%20environment%20is%20a,a%20system%2Dwide%20Python).
The way this works is by keeping the Python interpreter separate from the virtual environment that contains all the necessary packages.
Probably the main reason Python doesn't feature migration tools (at least as part of the standard library) is that pip, the main package tool, doesn't handle conflict resolution all too well. When you update a version of Python it might so happen (especially with niche packages) that some of them won't work any more and pip often won't be able to resolve the dependencies. This is why it's a good idea to keep a separate venv for different Python versions and different projects.
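For instance, with the Py Launcher you could keep one environment per interpreter, roughly like this (the folder names are arbitrary):
py -3.8 -m venv env38
py -3.10 -m venv env310
env310\Scripts\activate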
The other tool you could use for easy migration is Docker, a semi-virtual machine working on top of your host OS, usually containing some Linux distribution along with Python and the packages necessary for running and development.
It takes a bit of time to set up a container image initially, but afterwards setting everything up on a new machine or in the cloud becomes a breeze.
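A minimal sketch of such an image (the base image tag and the requirements file are assumptions, not something from the question):
FROM python:3.8-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .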
Listing currently installed packages is done via the pip freeze command, whose output you can pipe into a file to keep a record of project requirements, for example pip freeze > requirements.txt.

Installing Python packages that need to be compiled (like SciPy) gives me trouble

At my work we have some restrictions. Let me present the setup.
Setup
The computers for the average user have the following installation:
Win7
Python 2.7.12 via Anaconda 4.1.1 64 bit.
Including SciPy 0.17.1.
Goal
I have developed a package hanzo that depends on packages which are not available in the bare Anaconda installation. I want my package to be installed with all its dependencies via pip.
Challenges
We are behind a firewall and not allowed to use PyPi.
The python installation is located in a folder where the user doesn't have write rights. Hence packages have to be installed in a separate folder.
On my developer computer I have access to PyPi and installed all the necessary dependencies. I have uploaded the dependencies (whl/zip/tar.gz) into my own PyPi-repository. This overcomes the first problem.
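For reference, on a machine with PyPi access, pip can collect a package together with all of its dependencies into a folder for later upload; a sketch (the directory name and package names are examples):
pip download -d ./wheelhouse scipy pandas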
My own thoughts
Now on a user computer I'd run the following (a hanzo wheel has been uploaded to the PyPi-server as well)
pip install --index-url <My PyPi-URL> --target C:\py_packages hanzo
but a compiler problem occurs if a package like SciPy is required (similar to this). This leads to two questions:
(Fatal): Why does the error occur when the user has SciPy 0.17.1 installed and no specific version is required by my package (or by its dependencies)?
(Not fatal): Is there any way to install new versions of packages that need to be compiled, like SciPy, numpy and pandas, in the current user setup? Say I want to add a package that needs a specific version of SciPy; then I'd be in trouble.
Edit
Are there possibilities with virtualenv, or with using conda instead of pip?

Use pip or dnf to install python packages in Fedora?

When I install some Python packages in Fedora, there are two ways:
use dnf install python-package
use pip install package
I notice that even after I use dnf update to bring my Fedora fully up to date,
when I use pip it still tells me something like
pip is a old version, please use pip update
I guess dnf's package management is separate from pip's package management.
So which one is more recommended for installing Python packages?
Quoted from Gentoo Wiki:
It is important to understand that packages installed using pip will not be tracked by Portage. This is the case for installing any package through means other than the emerge command. Possible conflicts can be created when installing a Python package that is available in the Portage tree, then installing the same package using pip.
Decide which package manager will work best for the use case: either use emerge or pip for Python packages, but not both. Sometimes a certain Python package will not be available in the Portage tree; in these cases the only option is to use pip. Be wise and make good choices!
This is true for almost any package manager nowadays. If you are using packages or certain package versions that only exist in pip, use it, but don't try to install the same thing from dnf. Doing so will not only cause file collisions but will also (most likely) break the package manager's knowledge of the system, which usually leads to major package management issues.
Another solution would be using pip in user mode, without root permissions, which installs the relevant things into your home directory.
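For example (the package name is a placeholder):
pip install --user somepackage
This installs under ~/.local and stays out of dnf's way.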
So again, it's both okay to use pip or dnf, but just don't mix these two package managers together.

Python installation in Mac OS X virtual environment that includes a framework that I can include into Xcode?

I like to use Python with numpy, scipy and some other packages. I am an absolute Python beginner and have some issues with the installation under Mac OS X.
I am following these two tutorials to install python: 1 and 2.
Here, Homebrew is used to install Python (with pip) and virtualenv. I do not have an opinion about which is better, MacPorts, Homebrew, Fink... I just found this tutorial to inspire confidence.
If I understand things correctly, the OS X system Python, which I should never touch, is under /System/Library/Frameworks/Python.Framework. And I cannot use this one in Xcode because it does not have the packages I want. The Homebrew Python will be installed somewhere in /usr/local/. I found a framework there, but like the system framework it does not have the additional packages. The tutorial explains that it might be better to install additional packages only in virtual environments, which is done via pip. But I cannot find a framework there.
So my question is: How can I get a Python installation in a virtual environment that includes a framework that I can include into Xcode?
The Apple Python is functional and the normal site-packages folder is /Library/Python/2.7/site-packages (and not /System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages). You can use it without problem.
I have never had any problem installing all the modules I wanted (numpy, scipy, matplotlib, pandas, shapely, and others...), whether as frameworks, with pip or easy_install, including virtualenv (simply install them in the conventional way in Python), or by creating virtual environments.
When you install a framework module, it is placed in the normal site-packages folder.
The only problem is possibly the "old" Python version (not a problem for me, as I use the 2.6.x, 2.7.x and 3.3.x versions).
But if you want, you can install other versions of Python (64-bit, not 32!):
a) the way prescribed by Apple: as a framework
the official Python.org versions are installed in /Library/Frameworks/Python.framework, with the site-packages folder in /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages
the same goes for the Enthought scientific version of Python (a scientific distribution with many modules preinstalled: numpy, scipy, matplotlib, ...)
(you can also install the Homebrew Python version as a framework, see below)
You must change the PATH of the Python executable in /usr/bin (usually this is done automatically by the distribution, via symlinks or in the /Users/me/.bash_profile file).
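For instance, the Python.org installer typically prepends a line like the following to ~/.bash_profile (a sketch; adjust the version to yours):
export PATH="/Library/Frameworks/Python.framework/Versions/2.7/bin:$PATH"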
The modules installed in /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages do not interfere with those installed in /Library/Python/2.7/site-packages if you use the appropriate Python executable, and vice versa.
b) the package management system way
MacPorts installs its own version of Python in the /opt/local/ folder:
sudo port -v install python27
Fink installs its own version of Python in the /sw/ folder:
fink install Python27
Homebrew installs Python in /usr/local/Cellar with symlinks in /usr/local/bin.
brew install python
or
brew install python --framework
To use them, you must add /sw/bin and /sw/lib/, or /opt/local/bin and /opt/local/lib/, to the PATH and change the PATH of the Python executable.
For me, the main problem with Fink and MacPorts is that they do not take into account what is already installed and install everything in their respective folders, which can create real problems in the management of library paths.
The Homebrew solution is "cleaner" (in /usr/local) and uses existing libraries if they are up to date; otherwise it installs its own versions of the libraries.
c) the "autonomous" way
the perfect solution is Anaconda (another scientific distribution with many modules preinstalled):
it installs cleanly into a single directory (wherever you want, e.g. /Users/me/anaconda)
doesn’t require root privileges
doesn’t affect other Python installs on your system, or interfere with OS X Frameworks
switch to/from Anaconda just by setting $PATH or creating an alias in /Users/me/.bash_profile
alias anaconda='/Users/me/anaconda/bin/python'
alias anaconda3='/Users/me/anaconda/envs/py33/bin/python3'
you can install Python versions from 2.6.x to 3.3.x
it has an innovative package and environment manager for Python, named conda, but you can use pip or easy_install without problems
for me, it is now the best solution for installing virtual environments (such as /Users/me/anaconda/envs/py33)
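Creating and switching to such an environment looks roughly like this (a sketch using conda's syntax of that era; the name py33 is arbitrary):
conda create -n py33 python=3.3
source activate py33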
d) the "hard" way
you can compile your own version of Python in the classical form (which ends up in /usr/local/ by default) or as a framework. It takes time but it is not difficult.
So your question:
How can I get a Python installation in a virtual environment that includes a framework that I can include into Xcode?
Unless you are a Unix specialist (PATH management), you should use Apple's recommended solution: a framework distribution (including the Apple Python).

Integration of manually installed python libs into the system?

I have manually built numpy, scipy, matplotlib, etc. without root privileges (I needed a fresh matplotlib). All the libs are installed in the standard place:
~/.local/lib/python2.7
Now when I try to install anything related, Synaptic suggests that I install all the libs system-wide. Is there a way I can tell Synaptic to use the locally installed libs?
I tried linking ~/.local/lib/python2.7/site-packages to /usr/lib/python2.7 - it did not help.
Edit:
If I clone a Python package, change the name in setup.py to the name of the Ubuntu package, and then build:
python setup.py bdist --format=rpm
and then convert it to deb with alien:
sudo alien -k my.rpm
and then install the deb:
sudo dpkg -i my.deb
then synaptic does recognise it as a package (remember I've tweaked the name in setup.py).
But I can't find a way to make synaptic aware of locally installed python libs.
How can a package manager that manages packages at the system level know anything about something that is installed in a user directory, i.e. the opposite of the system level?
A package manager resolves dependencies based on meta-information stored in a package (be it rpm, deb, whatever) and/or package repository.
To achieve your goal you can take either of two options.
The first is to build a system-specific package from your sources and then install it via your package manager. See the Creating Built Distributions docs for that. It would look something like this:
$ python setup.py bdist --format=rpm
$ rpm -i dist/$PACKAGE.rpm
That would make your package manager aware of the fact that some dependencies are already provided.
This approach may or may not work.
The other, preferred option is to use a Python package manager such as pip and install all your packages in a virtual environment. There are several advantages to this method:
You can have several distinct package sets, with different versions of packages installed.
You can optionally isolate your virtual environment from the packages installed system-wide.
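A minimal sketch of that workflow (the environment path is arbitrary; the --system-site-packages flag is what controls the optional isolation mentioned above):
virtualenv ~/envs/myproject
source ~/envs/myproject/bin/activate
pip install numpy scipy matplotlib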
