Getting PyPy to recognize third-party modules - python

Just a quick question: how do I get PyPy to recognize third-party modules that I have installed for Python? For instance, I get the following error:
from tables import *
ImportError: No module named tables
which basically says that it cannot find the PyTables library that I use in the script I am trying to run.

For pure Python modules, just add the directory containing the modules to your sys.path, using something like:
import os
import sys

sys.path.insert(0, '/usr/local/lib')
sys.path.insert(0, os.path.expanduser('~/lib'))
This works for CPython, PyPy and Jython.
For C extension modules, you can try PyPy's cpyext, but it won't run everything you might hope for, because some CPython C extension modules wander into dark corners of CPython's C-based runtime:
http://morepypy.blogspot.com/2010/04/using-cpython-extension-modules-with.html
I sometimes write code that uses ctypes to interface with a C .so, and then use it on both CPython and PyPy, because both do pretty well with ctypes - though ctypes can be kinda slow on CPython:
http://docs.python.org/library/ctypes.html
Last I checked, Jython had the beginnings of ctypes, but it wasn't far enough along to use, at least not for my purposes.
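As a minimal sketch of that ctypes approach, here is a binding against the system C math library rather than a custom .so (the library lookup assumes a Unix-like system with libm available):

```python
import ctypes
import ctypes.util

# Locate and load the C math library (libm); this works the same way on
# CPython and PyPy, since both ship the ctypes module.
libm = ctypes.CDLL(ctypes.util.find_library("m"))

# Declare the signature so ctypes converts arguments and results correctly.
libm.cos.argtypes = [ctypes.c_double]
libm.cos.restype = ctypes.c_double

print(libm.cos(0.0))  # 1.0
```

The same pattern applies to your own shared library: point CDLL at its path and declare argtypes/restype for each function you call.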
There's also a newer interface, cffi, that requires a C compiler at runtime. It'll likely be less brittle (i.e., less prone to segfaults) than ctypes. It's described here:
http://morepypy.blogspot.com/2012/06/release-01-of-cffi.html
It comes from the PyPy project, I believe, but it was made to work on CPython first. AFAIK, it doesn't yet run on PyPy.

PyPy has a separate install space, so any modules you want to install from PyPI should be installed into that space. For instance, I have PyPy installed in /usr/local/pypy-1.9-32bit
I recommend using pip or easy_install. Here's how to install pip:
curl https://bootstrap.pypa.io/get-pip.py | /usr/local/pypy-1.9-32bit/bin/pypy
or
curl https://raw.github.com/pypa/pip/master/contrib/get-pip.py | /usr/local/pypy-1.9-32bit/bin/pypy
Then, just use the newly installed pip to get the module:
sudo /usr/local/pypy-1.9-32bit/bin/pip install tables
In this case, it failed, with the following error:
bminton@bminton:/tmp$ sudo /usr/local/pypy-1.9-32bit/bin/pip install tables
Downloading/unpacking tables
  Downloading tables-2.4.0.tar.gz (8.9Mb): 8.9Mb downloaded
  Running setup.py egg_info for package tables
    .. ERROR:: You need numpy 1.4.1 or greater to run PyTables!
Complete output from command python setup.py egg_info:
.. ERROR:: You need numpy 1.4.1 or greater to run PyTables!
Installation failed in this case because Tables depends on NumPy, which is not yet supported by PyPy (although they are working on it; see http://pypy.org/numpydonate.html). However, for many Python modules this method works great. For instance, I successfully installed the logilab constraint package this way.

As pointed out in other answers, PyPy has a separate space for installed modules. I find the easiest way to add a module to PyPy is the following:
1. Download the source (e.g. as a *.tar.gz file).
2. Extract it and cd into the extracted directory.
3. Run pypy setup.py install (sometimes you need to prepend sudo).

Copy the folder for the module from C:\Python27\Lib to C:\pypy-2.3.1-win32\lib-python, or the equivalent of wherever you have them installed.
Obviously, this will only work on Windows.

Actually, there is pip_pypy when you install PyPy.
Then install third-party modules with pip_pypy.

Related

How to install python modules using cppyy?

I want to package a Python module containing Python source and a native C++ library. Cppyy is used to dynamically generate the bindings, so the library is really just a normal library. The build system for the library is meson and should not be replaced. The whole thing is in a git repository. I only care about Linux.
My question is how to get from this to “pip install url_to_package builds/installs everything.” in the least complicated way possible.
What I’ve tried:
Extending setuptools with a custom build command:
…that executes meson compile and copies the result into the right place. But pip install will perform its work in some random split-off temporary directory, and I can't find my C++ sources from there.
The Meson python module:
…can build my library and install files directly into some python env. Does not work with pip and has very limited functionality.
Wheels:
…are incredibly confusing and overkill for me. I will likely be the only user of this module. Actually, all I want is to easily use the module in projects that live in different directories…
Along the way, I also came across different CMake solutions, but those are disqualified because of my build system choice. What should I do?

How to update Python Standard Library packages with pip?

I am creating a requirements.txt file for my Python project. When someone installs my project via pip, I want pip to also download the Tkinter and time Python modules, both of which are modules in the Python Standard Library.
I used the command pip freeze to list all of the currently installed Python packages, but when I scrolled down to the T section, I could not find Tkinter or time in the list. I then tried to install Tkinter via pip using the command pip install Tkinter but got the following error:
Could not find a version that satisfies the requirement Tkinter (from versions: )
No matching distribution found for Tkinter
I also did the same with time and got the same error. (I am running Python 3.6 and Windows 10.)
Is it possible to get pip to install those modules while pip is installing my package, or do I just have to trust that because the user has Python installed they will also have Tkinter and time installed?
All help is welcome and appreciated!
Python's Standard Library is called the Standard Library because it is part of the Python standard. In other words, if the Standard Library is not installed, the environment is not Python at all.
The Standard Library is tested and released together with each Python release as part of this release (not as an addition or extension).
So, YES, you can expect these libraries to exist if the user has Python installed. Just by definition.
Regarding upgrades of the built-in libraries: NO, you cannot do this, because they are part of the Python installation, not of the application environment. Python is very tightly bound to the specific code in those libraries. All Python apps & libs expect the same behavior from those libraries, even if they are buggy.
In addition to that, you cannot install a module/package with the same name as one of Python's builtins, because it will create ambiguity on import and can confuse/break all other libraries that depend on it (or worse, the system applications, if you install it into the system Python).
However, in some cases you can find backports of some of the libraries. Usually they are backported from py3 to py2 and, of course, their names are changed.
As an example, you can look at the concurrent.futures library, which is a handy builtin in py3.2+ but was absent in py2.7.
UPD: Though, as @JulienPalard hints in the comments, some OS distributions split this standard library to simplify the binary dependencies: e.g., on Debian, Tkinter is installable separately as python3-tk.
This makes sense, indeed, from the point of view of binary OS packaging: it is not worth installing the UI parts of the Python library if you have no UI at all and want to save disk space.
However, you are still unable to install it via pip, because this package is not available separately on PyPI. This standard library separation is made by the selected OS distributions and is resolved with the means of that OS distribution only.
pip installs packages from pypi, which does not expose the standard library, which is bundled into Python.
So in theory you should trust your users' environment: if they have Python, they should have the whole standard library.
But some distributions are splitting Python in a few packages for reasons (A minimal Debian already depends on Python, but they don't want Python to pull tk to pull libx11 to pull the life, the universe, and everything).
So in practice some packages will always be there, and you can trust the distributions for this; time, for example, will always be present. And some packages may not be there, like tkinter, in which case you may want to surround the import tkinter with a try/except that raises a nice error: "Looks like your distribution does not provide tk by default, please install it."
Be reassured, you won't have to surround every import with try statements just in case some distribution split the stdlib; just tkinter.
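That guarded import can be sketched like this (the wording of the message is just an example):

```python
# tkinter is part of the stdlib, but some distributions (e.g. Debian) ship it
# as a separate OS package (python3-tk), so the import can legitimately fail.
try:
    import tkinter
    HAVE_TK = True
except ImportError:
    HAVE_TK = False
    print("Looks like your distribution does not provide tk by default, "
          "please install it (e.g. 'apt install python3-tk' on Debian).")
```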

setup.py / pypi - Catching errors during installation

I am currently developing a command line application in Python, which I will be uploading to PyPI for end users to pip install and use. I am taking advantage of the extras functionality in setup.py to support 2 versions of my application: one is a basic functionality version with minimal dependencies, and the other is more feature-rich but has a large number of dependencies (numpy, pandas, networkx, matplotlib, etc.).
So briefly:
pip install app # simple, no deps
pip install app[all] # all the deps
Now the problem is that one of the dependencies of the feature-rich version has what has been described as "flakey" PyPI support. Basically, it cannot be installed unless one of its dependencies is already installed before the whole installation process starts. Luckily (or not), my application (which pulls in this flakey module) also depends on the module that the flakey module needs. Let's refer to the module that the flakey module needs pre-installed as fixer-module.
So:
pip install app[all] # triggers the installation of flakey module
Installation will fail here if fixer-module is not installed, even though it will be installed before flakey module is. One must basically do this:
pip install fixer-module
pip install app[all]
Now what I would like to do is include some kind of checking code that accomplishes the following:
Runs only when the app[all] distribution is being installed
Does a try: import fixer_module / except ImportError check and prints a message explaining the situation.
Stops and cleans up the installation process before it fails
I have been researching this for quite some time. I found some examples of checking the input args to setup.py and whatnot but none of them seem to cover how to handle stuff during the end user install process. Any pointers are much appreciated!
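A minimal sketch of the availability check described above; fixer_module is the placeholder name from the question, so this is illustrative only:

```python
def fixer_module_available():
    """Return True if the troublesome dependency's prerequisite is importable."""
    try:
        import fixer_module  # placeholder name from the question
        return True
    except ImportError:
        return False

# In setup.py, one could call this early and fail fast with a clear
# message instead of letting the flakey module produce a confusing error.
if not fixer_module_available():
    print("app[all] needs fixer-module installed first: "
          "run 'pip install fixer-module', then retry.")
```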

Building NumPy on RedHat

I installed a local version of Python 2.7 in my home directory (Linux RedHat) under ~/opt using the --prefix flag.
More specifically, Python was placed in ~/home/opt/bin.
Now, I want to install NumPy, but I am not really sure how I would achieve this. All I found in the INSTALL.txt and the online documentation was the command for specifying the compiler.
I tried gfortran, and it worked without any error message:
python setup.py build --fcompiler=gnu95
However, I am not sure how to install it for my local version of Python.
Also, I have to admit that I don't really understand how this whole approach works in general. E.g., what is the setup.py build doing? Is it creating module files that I have to move to a specific folder?
I hope anyone can give me some help here, and I would also appreciate a few lines of information how this approach works, or maybe some resources where I can read it up (I didn't find anything on the NumPy pages).
Your local version of Python should keep all of its files somewhere in ~/opt (presumably). As long as this is the Python installation that gets used when you issue the command
python setup.py build --fcompiler=gnu95
you should be all set, because the setup script uses a bunch of constants in the sys module to determine where to put the modules once they are built.
So -- running python setup.py build issues all of the necessary commands to build the module (compiling the C/Fortran code into shared object libraries that python can load dynamically and copying the pure python code to create the proper directory structure). The module is actually built somewhere in the build subdirectory which gets created during the process if it doesn't already exist. Once the library has been built (successfully), installing it should be as simple as:
python setup.py install
(You might need to sudo if you don't have write privileges in the install directory).
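To see where that install step will place files for a given interpreter, you can inspect the interpreter's own path constants (a quick diagnostic, not part of the build):

```python
import sys
import sysconfig

# The setup script installs relative to the interpreter that runs it, so
# these tell you which python is in use and where modules will land.
print(sys.executable)                 # path of the interpreter itself
print(sys.prefix)                     # its installation prefix, e.g. ~/opt
print(sysconfig.get_path("purelib"))  # where pure-Python modules are installed
```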

Installing rpm module for (non-system) Python

I need to support some software that is using an old Python version (2.4). So I have downloaded and compiled Python 2.4 and installed it in a virtualenv. So far, all OK and normal procedure.
But the software is trying to import an rpm module. And I cannot find a source for that module (it is not part of the standard Python library, afaict).
Typically, once the virtualenv is enabled (source env/bin/activate) I can install required software using easy_install. But easy_install rpm is failing to find anything. There is a pyrpm module, but it is not the same thing (it installs a module called "pyrpm"). And google searches are useless, as they all link to articles on how to build rpms...
If I were using the system python (on Ubuntu) I could install the python-rpm package. But that is for Python 2.7. How do I install the equivalent for Python 2.4?
[My impression is that the rpm libraries, used by many Linux systems, include a Python library, which is packaged as python-dev by the distro. But I can't see how to access that for an arbitrary python version.]
I AM NOT LOOKING FOR AN RPM THAT CONTAINS PYTHON 2.4. I AM LOOKING FOR A MODULE NAMED rpm THAT IS USED BY SOFTWARE WRITTEN FOR PYTHON 2.4.
It's right there, in the python-rpm RPM package:
http://rpmfind.net/linux/rpm2html/search.php?query=python-rpm
You will probably want to download the package contents, extract them, and then run
python setup.py install
from your active environment.
Of course, as it's precompiled, you might have trouble getting the C extension to run.
I'm not familiar enough with RPM's to know whether you can get the source from there.
No guarantees the package will work with your python version though.
There's no simple way to do this; the Python library is part of the system rpm package and interfaces with C code, so it is closely tied to the rpm package installed on your machine.
Instead, it's much simpler to install an old OS in a VM (e.g. CentOS 5) that uses Python 2.4. Then everything is consistent and works.
The sources for the rpm module can be found here: http://www.rpm.org/wiki/Download
After you download the wanted version, read and follow the INSTALL instructions to compile it on your target OS. Afterwards, make sure you add the correct path to the 'site-packages' folder the installation chose to your PYTHONPATH environment variable.
To test, start your Python interpreter and run 'import rpm'.
HTH,
Ran
