Installing numpy RPM with older Python version

I'm trying to install numpy 1.7 via an RPM on an older Linux machine with Python 2.4. The numpy release notes and the RPM page say it is supposed to be compatible with Python 2.4 through 2.7, but when I try to install it on the machine with the command
rpm -i /tmp/python-numpy-1.7.0-2.1.i586.rpm
I get a number of missing-dependency errors, including:
libc.so.6(GLIBC_2.11) is needed by python-numpy-1.7.0-2.1.i586
libc.so.6(GLIBC_2.4) is needed by python-numpy-1.7.0-2.1.i586
liblapack.so.3 is needed by python-numpy-1.7.0-2.1.i586
libpython2.7.so.1.0 is needed by python-numpy-1.7.0-2.1.i586
python >= 2.7 is needed by python-numpy-1.7.0-2.1.i586
python = 2.7 is needed by python-numpy-1.7.0-2.1.i586
python(abi) = 2.7 is needed by python-numpy-1.7.0-2.1.i586
rpmlib(PayloadIsLzma) <= 4.4.6-1 is needed by python-numpy-1.7.0-2.1.i586
So this package apparently needs at least Python 2.7, rather than anything up to 2.7. Is this a real discrepancy, or am I using rpm incorrectly? I'm used to higher-level Linux package managers that resolve dependencies and install them automatically, so I'm unsure how to proceed here.
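(As a side note, one way to check what a prebuilt RPM actually requires, before installing it, is to query its metadata; a small sketch using the filename from above:)
rpm -qp --requires /tmp/python-numpy-1.7.0-2.1.i586.rpm   # list the declared dependencies without installing
rpm -qp --info /tmp/python-numpy-1.7.0-2.1.i586.rpm       # shows which distribution the package was built for
The python(abi) = 2.7 line means this particular build is tied to a Python 2.7 installation, regardless of which Python versions numpy itself supports.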

Are you sure your distribution does not provide numpy already? It looks like numpy is part of EPEL.
If for some reason you are unwilling to use the version in the distribution, you're likely going to have to build the RPM yourself. I was able to build 1.7.1 on CentOS 5.7 like so:
sudo yum install rpm-build gcc python-devel
wget 'https://pypi.python.org/packages/source/n/numpy/numpy-1.7.1.tar.gz'
tar -xf numpy-1.7.1.tar.gz
cd numpy-1.7.1/
python setup.py bdist_rpm
sudo yum localinstall dist/numpy-1.7.1-1.i386.rpm
The generated RPM (in ./dist) should be usable without rebuilding on all machines with similar hardware and OS.
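Before copying the rebuilt RPM around, it may be worth confirming that it depends on the Python you actually have; a quick, hedged check (the exact filename and architecture will vary):
rpm -qp --requires dist/numpy-1.7.1-1.i386.rpm | grep -i python   # should match the interpreter used for the build (e.g. 2.4 here)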

If all the machines have identical versions of Python, glibc, etc., then it would probably be easier to get the numpy source and build it yourself, assuming you have gcc (and perhaps gfortran) installed, along with dependencies like BLAS and LAPACK. Once it's installed on one machine, you can copy the numpy folder (and any .egg file) from /usr/lib/python2.4/site-packages (or whichever directory applies) and distribute it to the other machines. Make sure to build static libraries so you don't need all the dependencies everywhere.
I'd also get numpy 1.7.1, as it fixes some issues with 1.7.0.
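A rough sketch of that build-and-copy approach, assuming gcc (plus gfortran if you want full LAPACK/BLAS support) and the LAPACK development files are already present; the paths and host name are examples:
tar -xf numpy-1.7.1.tar.gz && cd numpy-1.7.1/
python2.4 setup.py build                  # builds the C (and Fortran) extensions
sudo python2.4 setup.py install           # installs into /usr/lib/python2.4/site-packages
# then copy the installed package to the other machines, e.g.:
scp -r /usr/lib/python2.4/site-packages/numpy otherhost:/usr/lib/python2.4/site-packages/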

Related

Python version mismatch even though CMake reports having found the correct version

I'm building some C++ Python extensions for Python 3.10 (using PyBind11) but I'm finding that when trying to import these extensions I get: ImportError: Python version mismatch: module was compiled for Python 3.8, but the interpreter version is incompatible: 3.10.5.
I have find_package(Python3 3.10 REQUIRED) in my CMakeLists.txt and I use -DPYTHON_EXECUTABLE:FILEPATH=$(which python) when running cmake. I can confirm that which python points to a Python 3.10 executable. When running cmake part of the output says:
Found Python3: /PATH/TO/venv/bin/python3 (found suitable version "3.10.5", minimum required is "3.10") found components: Interpreter
and there are no other mentions of finding some other Python version. The compiled files look like module.cpython-310-x86_64-linux-gnu.so. Because of the 310, they can only be imported into a Python 3.10 interpreter session.
BUT, when I try importing them I get the ImportError I mentioned above. On the other hand, if I manually rename the compiled files to module.cpython-38-x86_64-linux-gnu.so and open up a Python 3.8 interpreter, I'm able to import.
How can I fix this? Why are all the clues suggesting that I have built the files correctly when I somehow haven't?
Note that I have already tried the solutions from the answers here.
There may have been other factors at play here (like including 3rd-party CMake projects), so the first step in fixing my problem was to remove those. Then I:
changed my find_package(Python3 3.10 REQUIRED) to find_package(Python3 3.10 REQUIRED COMPONENTS Interpreter Development) (see here). User @Tsyvarev also somewhat alluded to this in his comment on my question.
installed python3.10-dev (sudo apt install python3.10-dev); I only had the development package for Python 3.8. If you're like me and don't understand the difference between sudo apt install python and sudo apt install python-dev, see this answer. A sketch of the clean rebuild follows below.
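After those two changes it is worth reconfiguring from a clean build directory so that no cached Python 3.8 paths survive. A hedged sketch (the module name and paths are placeholders; Python3_EXECUTABLE is the hint variable read by CMake's FindPython3 module):
sudo apt install python3.10-dev                        # headers matching the target interpreter
rm -rf build && mkdir build && cd build                # drop the stale CMake cache
cmake .. -DPython3_EXECUTABLE="$(command -v python3)"
cmake --build .
python3 -c "import module"                             # run from the directory containing the built .so;
                                                       # it should now load module.cpython-310-x86_64-linux-gnu.so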

GDAL libraries, who does what

I'm struggling to install GDAL on Ubuntu 16.04 to work with GeoDjango (Django 2.1, Python 3), so I need to understand what I'm actually installing.
What is the rôle of each library/package/module?
apt
gdal-bin (A 'C' library containing the actual functions?)
python-gdal (The same in Python, or just some kind of bridge?)
python3-gdal (See above, but for Python 3. Does it need python-gdal?)
pip
gdal
pygdal
What is the link between pip modules and apt packages here ?
Every piece of info is available, if one is willing to search for it.
DEBs (installed system-wide):
gdal-bin ([Ubtu]: Package: gdal-bin) - a collection of gdal-related binaries (tools and utilities)
python3-gdal ([Ubtu]: Package: python3-gdal) - Python 3 bindings: extension modules (.so files) and some wrapper scripts, which enable gdal usage from Python
python-gdal - the same thing, but for Python 2 (a separate package; python3-gdal does not need it)
WHLs (installed as Python modules to the interpreter used to launch pip):
GDAL ([PyPI]: GDAL) - the sources (.tar.gz) for the Python bindings above (python3-gdal and/or python-gdal). During the pip install phase, they are built and installed for the current Python
pygdal ([PyPI]: pygdal) - the same thing as the previous item, but aimed at virtual environments. It seems to be a lighter version (it doesn't contain the scripts)
But all of the above depend on libgdal ([Ubtu]: Package: libgdal1i), which is the gdal library itself.
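For the GeoDjango use case specifically, a hedged sketch of one common route on Ubuntu 16.04 (the version number is an example; pygdal releases track the system libgdal version, so the two must match):
sudo apt-get install gdal-bin libgdal-dev python3-gdal
# or, inside a virtualenv, install the matching pip package instead of python3-gdal:
gdal-config --version              # prints the system libgdal version, e.g. 1.11.3
pip install "pygdal==1.11.3.*"     # choose the pygdal release that matches it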

Cannot install python packages from source-code

I need to install PIL (the Python Imaging Library) on my Ubuntu 10.04 32-bit (EDIT: 64-bit) machine, for my 32-bit Python 2.5.4.
This question is also relevant to any other source package, I guess (among those I need are RPyC, psyco and numpy).
I downloaded the source code, since I can't find any neat package to do the job, and ran:
sudo python2.5 setup.py install
output:
Could not find platform dependent libraries <exec_prefix>
Consider setting $PYTHONHOME to <prefix>[:<exec_prefix>]
Traceback (most recent call last):
File "setup.py", line 9, in <module>
import glob, os, re, struct, string, sys
File "/usr/lib/python2.5/struct.py", line 30, in <module>
from _struct import Struct, error
ImportError: No module named _struct
but
echo $PYTHONHOME
/usr
Well, in the file struct.py there's the line from _struct import Struct, error.
This is part of the Python standard library itself, so I really wonder what's wrong with the Python installation, since its own code fails to import the module.
I installed py2.5.4 by doing:
./configure --prefix=/usr
make altinstall
(using make altinstall since I need Python 2.6 as the default interpreter)
EDIT: This issue might have arisen from mistakenly using a 32-bit Python 2.5 on a 64-bit platform :). Anyhow, the problem was solved by reducing unnecessary complexities: switching to a 32-bit machine and porting the app to Python 2.6.
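As an aside, a mismatch like this can usually be spotted by comparing the architecture of the machine, the interpreter and the extension module that fails to import; a rough sketch (the paths are typical for a --prefix=/usr build but may differ):
uname -m                                          # machine architecture, e.g. x86_64
file /usr/bin/python2.5                           # is the interpreter a 32- or 64-bit build?
file /usr/lib/python2.5/lib-dynload/_struct.so    # the extension that failed to import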
In short:
Try using the Ubuntu repository first. If the package isn't there, use easy_install. If all else fails, download the package directly to your source folder.
Ubuntu repository (the apt-get approach)
Ubuntu (10.04 and newer) makes most mainstream packages available through apt-get. The naming convention is python-NAME, e.g. python-imaging or python-scipy.
This is the best way to go, since the native package manager will handle any dependency and update issues.
Run apt-cache search python | grep "^python-" | less to see a list of packages available for your system (I have over 1,200 on my 10.04 machine).
Setuptools
For packages that are not part of the Ubuntu repository, you can use the Python easy_install tool. First, install setuptools:
sudo apt-get install python-setuptools
Then you can install any Python package, e.g. colorworld, using easy_install:
sudo easy_install colorworld
This gives you some degree of protection (e.g., it handles dependencies), but updates are generally manual, and it's a real pain to reinstall all these packages on a new computer.
Manual download
You can always download the source code to some directory and add it to your PYTHONPATH. It's the best approach when you just need to evaluate a package or apply some quick-and-dirty solution.
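A rough sketch of that manual approach for a pure-Python package (the package name and paths are placeholders; PIL itself contains C extensions, so it still needs a python setup.py build step rather than just a PYTHONPATH entry):
mkdir -p ~/src && cd ~/src
tar -xzf somepackage-1.0.tar.gz                           # a source tarball downloaded by hand
export PYTHONPATH="$HOME/src/somepackage-1.0:$PYTHONPATH"
python2.5 -c "import somepackage"                         # check that the interpreter can find it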
sudo aptitude install python-imaging
That will install PIL. I'm not really sure how to help with your other packages, though; maybe try searching for them in Synaptic.

Mercurial for Windows - Python version?

What version of Python is needed to run Mercurial?
I see that the website says it requires 2.4. Does that mean exactly 2.4, or 2.x, or anything higher than 2.4, i.e. could I install 3.x?
I installed Mercurial without reading the requirements, and hg.exe executes fine.
Looking in the directory where hg.exe lives (C:\Program Files\Mercurial\), there is a python26.dll in there. Does that mean I won't have to install Python, i.e. it's bundled with Mercurial?
Yes, it comes bundled. If you install Mercurial using the Windows installer, then you don't need to worry about which version of Python you are using. Mercurial uses py2exe to create an executable that runs without a Python installation.
Python 3.x is not compatible with 2.x.
If Mercurial supports 2.4 and above, then you are better off installing Python 2.6.x.
Yes, there are installers available that come bundled with Python.
Run the following on the command line; if you do not get any errors, then you are on your way to using Mercurial:
> hg version
> hg debuginstall
> hg init test_mercurial
> cd test_mercurial

Purge complete Python installation on OS X

I’m working on a recently upgraded OS X Snow Leopard machine with MacPorts, and I’m running into problems at every corner.
The first problem is the sheer number of installed Python versions: altogether, there are four:
2.5, 2.6 and 3.0 in /Library/Frameworks/Python.framework
2.6 in /opt/local/Library/Frameworks/Python.framework/ (MacPorts installation)
So there are at least two useless/redundant versions: 2.5 and one of the 2.6 installations.
Additionally, the pre-installed Python is giving me severe problems because some of the pre-installed libraries (in particular, scipy, numpy and matplotlib) don’t work properly.
I am sorely tempted to purge the complete /Library/Frameworks/Python.framework path, as well as the MacPorts Python installation. After that, I’ll start from a clean slate by installing a properly configured Python, e.g. that from Enthought.
Am I running headlong into trouble? Or is this a sane undertaking?
(In particular, I need a working Python in the next few days and if I end up with a non-working Python this would be a catastrophe of medium proportions. On the other hand, some features I need from matplotlib aren’t working now.)
MacPorts only installs into /opt/local (for Python and related packages).
Apple's Python uses /Library/Frameworks/Python.framework/2.x (2.5 from Leopard and 2.6 for Snow Leopard), but the OS just puts a site-packages install in there.
Thus I think you can get rid of /Library/Frameworks/Python.framework.
I would then use the MacPorts Python and install numpy etc. through that, as I find that the easiest way to install packages that have C dependencies.
An alternative is to install Python from python.org and install numpy etc. against that.
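Before deleting anything, it may help to take stock of what is installed where and which interpreter your shell actually resolves to; a rough sketch (the paths follow the question and the MacPorts defaults):
ls /Library/Frameworks/Python.framework/Versions/             # framework installs (python.org / Enthought style)
ls /opt/local/Library/Frameworks/Python.framework/Versions/   # the MacPorts Python
which -a python python2.5 python2.6                           # which interpreters are on the PATH, in order
port installed | grep -i python                               # MacPorts' own record of installed ports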
