I normally use Python 2.7.3, installed in the traditional location /usr/local/bin, but I needed to build Python 2.6.6 (which I did without using virtualenv) under another directory, ~/usr/local/, and to rebuild numpy, scipy, and every other library for which I needed a version different from the one I have for Python 2.7.3.
But for all the other packages that I want exactly as they are (meaning the same version) in my default installation, I don't know how to use them from Python 2.6.6 without downloading the tarballs and building and installing them with --prefix=/home/myself/usr/local/bin.
Is there a faster or simpler way of "re-using" those packages with my "local" Python 2.6.6?
Reinstall them. It may seem like a no-brainer to reuse modules (and in a lot of cases you can), but for modules that have compiled code, this can be an utter nightmare for long-term systems administration.
Consider supporting multiple versions of Python for multiple versions / architectures of Linux. Some modules will reference libraries in /usr/local/lib, but those libraries can be the wrong arch or wrong version.
You're better off making a requirements.txt file and using pip to install them from source.
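A minimal sketch of that approach, assuming pip is available for both interpreters and that the 2.6.6 build lives under ~/usr/local (paths are illustrative, and --no-binary needs a reasonably recent pip; drop it if yours lacks the flag):
# capture the exact versions installed for the default Python 2.7.3
/usr/local/bin/python2.7 -m pip freeze > requirements.txt
# install the same versions, built from source, for the locally built Python 2.6.6
~/usr/local/bin/python2.6 -m pip install --no-binary :all: -r requirements.txt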
Related
I'm trying to understand Linux OS library dependencies so that Python 3.9 and the pip packages I import will work. Is there a requirement for GCC to be installed for pip modules with C extension modules to run? What system libraries does Python's interpreter (CPython) depend on?
I'm trying to understand Linux OS library dependencies so that Python 3.9 and the pip packages I import will work.
Your questions may have pretty broad answers and depend on a bunch of input factors you haven't mentioned.
Is there a requirement for GCC to be installed for pip modules with C extension modules to run?
It depends how the package is built and shipped. If it is available only as a source distribution (sdist), then yes. Obviously a compiler is needed to take the .c files and produce a loadable binary extension (ELF or DLL). Some packages ship binary distributions, where the publisher does the compilation for you. Of course this is more of a burden on the publisher, as they must support many possible target machines.
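One way to find out up front whether a compiler will be needed is to ask pip to refuse source builds; a sketch, with numpy as a stand-in package name:
# fail instead of compiling if no pre-built wheel exists for this platform
pip install --only-binary :all: numpy
# conversely, force a build from the source distribution (needs gcc and the Python headers)
pip install --no-binary :all: numpy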
What system libraries does Python's interpreter depend on?
It depends on a number of things, including which interpreter (there are multiple!) and how it was built and packaged. Even constraining the discussion to CPython (the canonical interpreter), this may vary widely.
The simplest thing to do is whatever your Linux distro has decided for you; just apt install python3 or whatever, and don't think too hard about it. Most distros ship dynamically-linked packages; these will depend on a small number of "common" libraries (e.g. libc, libz, etc). Some distros will statically-link the Python library into the interpreter -- IOW the python3 executable will not depend on libpython3.so. Other distros will dynamically link against libpython.
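If you want to see this for yourself on a given machine, ldd lists the shared libraries an executable is linked against; the output will vary by distro and build options:
# show the dynamic library dependencies of the interpreter binary itself
ldd "$(command -v python3)"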
What dependencies will external modules (e.g. from PyPI) have? Well that completely depends on the package in question!
Hopefully this helps you understand the limitations of your question. If you need more specific answers, you'll need to either do your own research, or provide a more specific question.
Python depends on compilers and a lot of other tools if you're going to compile the source (from the repository). This is from the official repository, telling you what you need to compile it from source; check it out.
1.4. Install dependencies
This section explains how to install additional extensions (e.g. zlib) on Linux and macOS/OS X. On Windows, extensions are already included and built automatically.
1.4.1. Linux
For UNIX based systems, we try to use system libraries whenever available. This means optional components will only build if the relevant system headers are available. The best way to obtain the appropriate headers will vary by distribution, but the appropriate commands for some popular distributions are below.
However, if you just want to run Python programs, all you need is the python binary (and the libraries your script wants to use). The binary is usually at /usr/bin/python3 or /usr/bin/python3.9.
Python GitHub Repository
For individual packages, it depends on the package.
Further reading:
What is PIP?
Official: Managing application dependencies
I need to support some software that is using an old Python version (2.4). So I have downloaded and compiled Python 2.4 and installed it in a virtualenv. So far, all OK and normal procedure.
But the software is trying to import an rpm module. And I cannot find a source for that module (it is not part of the standard Python library, afaict).
Typically, once the virtualenv is enabled (source env/bin/activate) I can install required software using easy_install. But easy_install rpm is failing to find anything. There is a pyrpm module, but it is not the same thing (it installs a module called "pyrpm"). And google searches are useless, as they all link to articles on how to build rpms...
If I were using the system python (on Ubuntu) I could install the python-rpm package. But that is for Python 2.7. How do I install the equivalent for Python 2.4?
[My impression is that the rpm libraries, used by many Linux systems, include a Python library, which is packaged as python-dev by the distro. But I can't see how to access that for an arbitrary python version.]
I AM NOT LOOKING FOR AN RPM THAT CONTAINS PYTHON 2.4. I AM LOOKING FOR A MODULE NAMED rpm THAT IS USED BY SOFTWARE WRITTEN FOR PYTHON 2.4.
It's right there, in the python-rpm RPM package:
http://rpmfind.net/linux/rpm2html/search.php?query=python-rpm
You will probably want to download the package contents, extract them, and then use
python setup.py install
from your active environment.
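If you prefer to pull the files out of the RPM without installing it system-wide, rpm2cpio is the usual trick; a sketch, with an illustrative package filename:
# unpack the RPM's payload into the current directory without installing it
rpm2cpio python-rpm-4.4.2.3.rpm | cpio -idmv
# the extracted lib/python*/site-packages content can then be copied into the virtualenv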
Of course, as it's pre-compiled, you might have trouble getting the C extension to run.
I'm not familiar enough with RPM's to know whether you can get the source from there.
No guarantees the package will work with your python version though.
There's no simple way to do this; the Python library is part of the system rpm package and interfaces to C code, so it is closely tied to the rpm package installed on your machine.
Instead, it's much simpler to install an old OS that uses Python 2.4 (e.g. CentOS 5) in a VM. Then everything is consistent and works.
The sources for the rpm module can be found here: http://www.rpm.org/wiki/Download
After you download the version you want, read and follow the INSTALL instructions to compile it on your target OS. Afterwards, make sure you add the path of the 'site-packages' folder the installation chose to your PYTHONPATH environment variable.
To test, start your Python interpreter and run 'import rpm'.
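A sketch of those last two steps, assuming the build installed its bindings under ~/usr/local (substitute whatever site-packages path the INSTALL procedure actually reported):
# point Python 2.4 at the site-packages directory the rpm build installed into
export PYTHONPATH=$HOME/usr/local/lib/python2.4/site-packages:$PYTHONPATH
# smoke test: this should print the module's path rather than an ImportError
python2.4 -c "import rpm; print rpm.__file__"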
HTH,
Ran
I have two installations of Python 2.7.2 -- from MacPorts and Enthought -- on my Mac. I use the Enthought Python as the primary one; however, the MacPorts distribution has several additional packages like pymacs, rope etc., which I would like to make available to the Enthought Python. (I'm actually trying to use Emacs w/ Enthought Python, but also make use of the MacPorts-installed Rope, Pymacs for code completion in Emacs).
Is there a clean way to make the MacPorts packages available to the Enthought Python without breaking anything?
It's risky trying to combine the two distributions, as you are likely to get conflicts (especially for C-extensions linked to slightly different versions of shared libraries). This is a common cause of problems with EPD:
https://support.enthought.com/entries/22094157-OS-X-Conflict-with-installed-packages-in-earlier-Python-installation
The recommended way to install new packages in EPD is with the enpkg tool. You can find out more about enpkg with enpkg --help or in this article:
https://support.enthought.com/entries/22415022-Using-enpkg-to-update-EPD-packages
If your package isn't available through enpkg (in your case it looks like rope is while pymacs is not, assuming you have an appropriate subscription), EPD is a very standard python distribution, and you can install packages in it through normal means such as pip or by grabbing the source and running python setup.py install. See:
https://support.enthought.com/entries/22914233-Using-non-EPD-package-installers-such-as-pip
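For the non-enpkg route, a minimal sketch, assuming the EPD interpreter is the first python on your PATH and has pip available:
# confirm which installation you are about to install into
python -c "import sys; print(sys.prefix)"
# install a PyPI package (rope, for example) into that interpreter
python -m pip install rope
# or, from inside an unpacked source tree such as Pymacs
python setup.py install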
It would be cleaner to install the additional packages once more for the Enthought Python. Trying to reuse packages from another installation seems neither clean nor safe to me.
I'm trying to compile and use the PyGRIB module. There is no binary distribution of the module, so I have compiled using Cygwin. I would really like to be able to use the module in my windows python installation -- I already have numpy, matplotlib, and a development environment setup for my windows installation. How do I do this?
It looks like the Cygwin install creates the following two files:
pygrib-1.9.3-py2.6.egg-info
pygrib.dll
in my c:\cygwin\lib\python2.6\site-packages directory.
I have tried copying these to: C:\Python27\Lib\site-packages but that doesn't seem to do the trick.
If I can't do this, can I get IPython in Cygwin? I haven't seen it in the setup utility.
Honestly, the easiest way to compile Python extensions on Windows is to just use the free Visual Studio Express distribution. I've installed many different packages that way and never had an issue. Normally, the installer will put the compiler on your PATH, but you will need to verify that.
You need to make sure to use 2008 though, and not 2010.
You can retrieve it from here:
http://msdn.microsoft.com/en-us/express/future/bb421473
Do note, if you go this way, it means you will have to reinstall any other compiled Python binary packages (numpy, scipy, etc.).
That said, I notice you are copying a 2.6 egg into a 2.7 distro. Off the top of my head, I'm not certain that 2.6 and 2.7 were compiled using the same compiler, but I believe they were. In any event, that could be your problem: either the package doesn't support 2.7, or 2.6 isn't compiled with the same compiler as 2.7.
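A quick way to settle the compiler question: CPython's version banner includes the compiler it was built with, so you can compare the two interpreters directly. A sketch to run from a Cygwin shell (the Windows path is illustrative):
# the banner ends with the compiler, e.g. "[MSC v.1500 ...]" for the python.org build or "[GCC ...]" for Cygwin
/cygdrive/c/Python27/python.exe -c "import sys; print(sys.version)"
python2.6 -c "import sys; print(sys.version)"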
Various software installations on my laptop each require their own particular version of Python. ViewVC requires Python 2.5 and Blender requires Python 2.6. Mercurial (thankfully) comes with its Python interpreter packaged in a DLL in the Mercurial installation itself.
How do I get by without having to install the entire Python environment each time? Is there some minimal installer which will install the bare minimum without affecting other programs? Can I modify the Blender and ViewVC installations so that they too use their own Python-in-a-DLL?
It's hard to know which "bare minimum" the Blender scripts you'll want to use in the future may be counting on (short of the full Python standard library, which isn't all that large in terms of disk space, after all). Why not install both Python 2.5 and 2.6? They can coexist nicely (if your scriptable apps use hashbangs like #!/usr/bin/env python instead of specifically mentioning python2.5 or python2.6, you may need to trick out their PATHs just a little bit, as sketched below).
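A minimal sketch of that PATH trick, assuming a Unix-like shell and an illustrative install prefix for the 2.5 interpreter:
# make scripts that start with "#!/usr/bin/env python" pick up 2.5 for this session only
export PATH=/opt/python2.5/bin:$PATH
python -V   # should now report Python 2.5.x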
You should be able to get away with installing the Python binaries in the same tree as the specific application, I believe (totally untested hunch, though).