When/where should I check for the minimum Python version?

This question tells me how to check the version of Python. For my package I require at least Python 3.3:
MIN_VERSION_INFO = (3, 3)

import sys

if not sys.version_info >= MIN_VERSION_INFO:
    exit("Python {}.{}+ is required.".format(*MIN_VERSION_INFO))
but where/when should this check occur?
I want to produce the clearest possible error message for users installing via pip (sdist and wheel) or python setup.py install. Something like:
$ pip -V
pip x.x.x from ... (Python 3.2)
$ pip install MyPackage
Python 3.3+ is required.
$ python -V
Python 3.2
$ python setup.py install
Python 3.3+ is required.

The primary point of a compatibility check is to have this check either elegantly handle a compatibility issue or gracefully exit with an explanation before the incompatibility causes problems.
I'd put it near the top of setup.py, or of the first script of yours that gets called. Best practice is not to put code in __init__.py (unless you're building a major package, in my opinion), but if you already have a lot of code there, it's fine.
The quickest point of reference I can find: the old Python 2.6 unittest module had such a check near the top, naturally, after the module docstring, imports, and __all__.
Another point of reference shows a compatibility check in an __init__.py, again, near the top, though here it is immediately after the docstring and an import of sys, required for the check. There are other similar examples of this usage in __init__.py in this same set of libraries.
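As a hedged sketch of that placement (the package docstring is hypothetical), a check at the top of an __init__.py, immediately after the docstring and the sys import it needs, might look like:

```python
"""Example package docstring (hypothetical)."""
import sys

MIN_VERSION_INFO = (3, 3)

# Fail fast, before any 3.3-only syntax or stdlib features are touched
if sys.version_info < MIN_VERSION_INFO:
    sys.exit("Python {}.{}+ is required.".format(*MIN_VERSION_INFO))
```

On a supported interpreter this is a no-op; on an older one the user sees the one-line message instead of a confusing SyntaxError or ImportError later.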

Related

How do I manage python versions in source control for application?

We have an application that uses pyenv/virtualenv to manage python dependencies. We want to ensure that everyone who works on the application will have the same python version. Coming from ruby, the analog is Gemfile. To a certain degree, .ruby-version.
What's the equivalent in python? Is it .python-version? I've seen quite a few .gitignore that have that in it and usually under a comment ".pyenv". What's the reason for that? And what's the alternative?
Recent versions of setuptools (24.2.0+) allow you to control Python version at the distribution level.
For example, suppose you wanted to allow installation only on a (compatible) version of Python 3.6, you could specify:
# in setup.py
from setuptools import setup
setup(
    ...
    python_requires='~=3.6',
    ...
)
The distribution built by this setup will have associated metadata that prevents installation on incompatible Python versions. Your users need a current version of pip for this feature to work properly; older pip (< 9.0.0) will not check this metadata.
If you must extend the requirement to people using older versions of pip, you can put an explicit check on sys.version_info somewhere at module level in setup.py. Note that with this workaround the package will still be downloaded by pip; it only fails later, during the pip install attempt with an incorrect interpreter version.
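Putting the two mechanisms together, a hedged sketch of such a setup.py (the package name, version, and 3.6 floor are all hypothetical) might look like:

```python
# setup.py -- sketch combining both mechanisms
import sys

# Explicit check: covers users whose pip (< 9.0.0) ignores python_requires
if sys.version_info < (3, 6):
    sys.exit("MyPackage requires Python 3.6+.")

from setuptools import setup

setup(
    name="MyPackage",          # hypothetical name
    version="0.1",
    python_requires="~=3.6",   # metadata enforced by pip >= 9.0.0
)
```

The module-level check runs no matter how setup.py is invoked, so even an ancient pip or a bare python setup.py install produces the clear message.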

Python 2 & 3 compatible namespace modules (using pip)

How is it possible to build multiple Python modules sharing the same namespace, compatible with both Python 2.7+ and 3.3+?
Let's call the namespace test. Now I want to have two separate modules, one called test.foo and another called test.bar. However, I'm currently developing test.helloworld, which depends on both test.foo and test.bar. Both are listed in the requirements.txt file.
The modules test.foo and test.bar are currently using the Python 2 solution for namespace packages:
import pkg_resources
pkg_resources.declare_namespace(__name__)
Running the suggested pip command for development mode, pip install -e ., results in ImportError: No module named 'test.helloworld', while importing test.foo or test.bar works.
The Python 3 solution for namespace packages is implicit namespace packages, where the namespace package has no __init__.py file. Sadly, this does not work for Python 2.
How can I design a solution for both Python 2 and 3 (which allows me to use pip install -e .)? The --egg solution does not work for me since it is already deprecated.
I recently had a similar issue, where I had to install a package for Python 2 and 3. I ended up having to download the code from GitHub, then ran the setup.py by calling
sudo python setup.py install
and
sudo python3 setup.py install
This results in the package being installed for both Python 2 and 3, even though the code itself was meant for Python 2. This allows me to work with the package whether I use Python 2 or 3, without any namespace conflicts.
You'd want to use pkgutil-style namespace packages.
From https://packaging.python.org/guides/packaging-namespace-packages/:
pkgutil-style namespace packages
Python 2.3 introduced the pkgutil module and the extend_path function. This can be used to declare namespace packages that need to be compatible with both Python 2.3+ and Python 3. This is the recommended approach for the highest level of compatibility.
A table listing out all the possible ways of dealing with namespace packages, and which ways would work together: https://github.com/pypa/sample-namespace-packages/blob/master/table.md
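To make the mechanics concrete, here is a self-contained sketch (all names hypothetical; the namespace is called test_ns to avoid clashing with the stdlib test package). It builds two "distributions" on disk, each shipping an identical pkgutil-style __init__.py plus one submodule, then shows both submodules importing under the shared namespace:

```python
import os
import sys
import tempfile

# The one line every distribution's test_ns/__init__.py must contain
NS_INIT = "__path__ = __import__('pkgutil').extend_path(__path__, __name__)\n"

root = tempfile.mkdtemp()
for dist, mod in [("dist_foo", "foo"), ("dist_bar", "bar")]:
    pkg = os.path.join(root, dist, "test_ns")
    os.makedirs(pkg)
    with open(os.path.join(pkg, "__init__.py"), "w") as f:
        f.write(NS_INIT)  # identical file in every distribution
    with open(os.path.join(pkg, mod + ".py"), "w") as f:
        f.write("NAME = %r\n" % mod)
    sys.path.insert(0, os.path.join(root, dist))

# Both halves of the namespace are importable, from different directories
import test_ns.foo
import test_ns.bar
print(test_ns.foo.NAME, test_ns.bar.NAME)  # foo bar
```

extend_path scans sys.path for every directory named test_ns and merges them into the package's __path__, which is what lets the two submodules live in separate installed distributions.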
See the answer to a similar question for complete instructions that work on both Python 2 and 3.
In short, setup.py needs a unique name for each module and a common namespace_packages definition, in addition to an __init__.py declaring the namespace set in namespace_packages.
If you are still having issues, please post your setup.py and __init__.py for each module.

Enforcing python version in setup.py

Currently, we set up/install some packages on the system by listing their versions and dependencies in setup.py under the install_requires attribute. Our system requires Python 2.7. Some users have multiple versions of Python on their systems, say 2.6.x and 2.7; setup reports some packages as already available when they are actually installed under the 2.6 site-packages list. Some users have only 2.6. How can we enforce from setup.py (or in some other way) that only Python 2.7 is acceptable, and that all packages setup.py updates are for 2.7 only? We require a minimum of 2.7 on the machine to run our code.
Thanks!
Santhosh
The current best practice (as of this writing in March 2018) is to add a python_requires argument directly to the setup() call in setup.py:
from setuptools import setup
[...]
setup(name="my_package_name",
      python_requires='>3.5.2',
      [...]
Note that this requires setuptools>=24.2.0 and pip>=9.0.0; see the documentation for more information.
As the setup.py file is run via pip (and pip itself is run by the Python interpreter), it is not possible to specify which Python version to use from within the setup.py file.
Instead have a look at this answer to setup.py: restrict the allowable version of the python interpreter which has a basic workaround to stop the install.
In your case the code would be:
import sys

if sys.version_info < (2, 7):
    sys.exit('Sorry, Python < 2.7 is not supported')

Getting Pypy to recognize third party modules

Just a quick question: how do I get pypy to recognize third-party modules that I have in Python? For instance, I get the following error.
from tables import *
ImportError: No Module named tables
Which is basically saying that it cannot find the pytables library that I use in the script I am trying to run.
For pure python modules, just add the directory containing the modules to your sys.path, using something like:
import os
import sys

sys.path.insert(0, '/usr/local/lib')
sys.path.insert(0, os.path.expanduser('~/lib'))
This works for CPython, Pypy and Jython.
For C extension modules, you can try Pypy's cpyext, but it won't run everything you might hope for, because some CPython C extension modules wander into dark corners of CPython's C-based runtime:
http://morepypy.blogspot.com/2010/04/using-cpython-extension-modules-with.html
I sometimes write code that uses ctypes to interface with a C .so, and then use that on both CPython and Pypy, because they both do pretty well with ctypes - but ctypes can be kinda slow on CPython:
http://docs.python.org/library/ctypes.html
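As a small sketch of that ctypes approach, calling a function from the C standard library (whose exact name ctypes.util resolves per platform); this same code runs unmodified on CPython and PyPy:

```python
import ctypes
import ctypes.util

# Locate and load the C standard library (e.g. libc.so.6 on Linux)
libc = ctypes.CDLL(ctypes.util.find_library("c"))

# Declare strlen's signature so ctypes converts arguments correctly
libc.strlen.restype = ctypes.c_size_t
libc.strlen.argtypes = [ctypes.c_char_p]

print(libc.strlen(b"pypy"))  # 4
```

For your own .so, replace find_library("c") with the path to the shared object; the restype/argtypes declarations are what keep the call safe across interpreters.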
Last I checked, Jython had the beginnings of ctypes, but it wasn't far enough along to use, at least not for my purposes.
There's also a newer interface that requires a C compiler at runtime. It'll likely be less brittle than ctypes (which is prone to segfaults). It's described here:
http://morepypy.blogspot.com/2012/06/release-01-of-cffi.html
It comes from the Pypy project I believe, but it was made to work first on CPython. AFAIK, it doesn't yet run on Pypy.
Pypy has a separate install space. Therefore, any modules you want to install from pypi should be installed into its space. So, for instance, I have pypy installed in /usr/local/pypy-1.9-32bit
I recommend using pip or easy_install. Here's how to install pip:
curl https://bootstrap.pypa.io/get-pip.py | /usr/local/pypy-1.9-32bit/bin/pypy
or
curl https://raw.github.com/pypa/pip/master/contrib/get-pip.py | /usr/local/pypy-1.9-32bit/bin/pypy
Then, just use the newly installed pip to get the module:
sudo /usr/local/pypy-1.9-32bit/bin/pip install tables
In this case, it failed, with the following error:
bminton@bminton:/tmp$ sudo /usr/local/pypy-1.9-32bit/bin/pip install tables
Downloading/unpacking tables
  Downloading tables-2.4.0.tar.gz (8.9Mb): 8.9Mb downloaded
  Running setup.py egg_info for package tables
    .. ERROR:: You need numpy 1.4.1 or greater to run PyTables!
Complete output from command python setup.py egg_info:
.. ERROR:: You need numpy 1.4.1 or greater to run PyTables!
Installation failed in this case, because Tables depends on Numpy, which is not yet supported by PyPy (although they are working on it, see http://pypy.org/numpydonate.html). However, for many python modules, this method works great. For instance, I successfully installed the logilab constraint package this way.
As pointed out in other answers, pypy has a separate space for installed modules. I find the easiest way to add a module to pypy is the following:
download the source (e.g. as a *.tar.gz file)
extract, cd into the extracted directory
run pypy setup.py install (sometimes you need to prepend a sudo)
Copy the folder for the module from C:\Python27\Lib to C:\pypy-2.3.1-win32\lib-python, or the equivalent of wherever you have them installed.
Obviously, this will only work on Windows.
Actually, pip_pypy is included when you install pypy.
Then install the third-party module with pip_pypy.

Why can't I import pygtk?

I followed the instructions in this post. Everything installed successfully. However, when I run python I cannot import pygtk. Specifically, it says this:
>>> import pygtk
ImportError: No module named pygtk
I'm guessing I have to do some commands like make or something, but I can't find anywhere where it says what to do. Please help, I am getting very frustrated.
Edit: I should probably mention I'm on Mac OS X
How are you running python? Is it the one that comes with OSX (/usr/bin/python) or the MacPorts version (/opt/local/bin/python)?
The page you linked to has the instructions for installing pygtk using MacPorts, so it should run with that installation of Python. See the MacPorts wiki for help on how to configure your PATH variable to use the appropriate Python installation.
EDIT: Try running the macports python explicitly: "/opt/local/bin/python" and then import pygtk. Also, check under the macports python site-packages directory on the filesystem to see if pygtk exists there (usually something like /opt/local/lib/python2.5/site-packages).
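A quick diagnostic sketch for this kind of confusion: print which interpreter is actually running and where it looks for third-party packages (the MacPorts path in the comment is just an example):

```python
import site
import sys

# Which interpreter is this, exactly? (e.g. /opt/local/bin/python for MacPorts)
print(sys.executable)
print(sys.version.split()[0])

# Where does it look for third-party packages?
# pygtk must live in one of these directories to be importable.
for path in site.getsitepackages():
    print(path)
```

If the site-packages directories printed here don't contain pygtk, you're running a different Python than the one MacPorts installed it into.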
If you are running git mergetool from a virtual environment, the Python interpreter cannot find pygtk. Fix your Python path for the virtualenv, or deactivate the virtualenv first.
Below worked for me - assumes you have HomeBrew installed
I was trying to install meld and was getting the same error - this fixed it
brew install python
It is likely that you've installed pip separately (rather than through macports). So, your packages are being installed in a location that is not readable by macports-installed python. For example, in my OS X, the following code works:
[user]$ /usr/bin/python
>>> import pip
>>> for package in pip.get_installed_distributions():
...     print package, package.location
But if I start /opt/local/bin/python (which is my default python) and say "import pip", then it gives an ImportError stating there is no module named pip.
There might be two things that work for you:
1) Install pip using macports, and set that as your default (port install pip). Install pygtk again with this pip
2) Launch python explicitly with /usr/bin/python, and write your codes there.
There may be a way to have the /opt python (from macports) read modules installed by non-macports pip, but I am not aware of it.
a) pip install pygtk (Windows only)
b) brew install python
Not sure why the first option is said to work only on Windows. The second one works fine for me.
A very general view on the problem, this is what I do if python complains about not finding a module that I know exists:
(This is very general, rather basic stuff, so apologies if this is stuff you already tried even before posting here... in that case I hope it'll be useful to someone else.)
1: Go to the Python installation directory and make sure the module is actually there (or figure out where exactly it is; I have some modules that are part of a project and thus not in the main directory). Sometimes this uncovers that the module is not actually installed, even though it looked like it was.
2: Make sure you're spelling it correctly (capital/lowercase letters are a likely source of frustration; the import statement needs to match the module's directory name).
3: If it isn't located in the Python path, either setting the $PYTHONPATH environment variable or putting something like this at the beginning of your script will help:
import sys
sys.path.append('\\path\\to_the_directory\\containing_themodule')
(double backslashes are required so they aren't read as escape characters)
In this example, pygtk would be in \path\to_the_directory\containing_themodule\pygtk.
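An equivalent sketch using a raw string, which sidesteps the doubled backslashes entirely (the path itself is the same hypothetical one as above):

```python
import sys

# A raw string leaves backslashes alone, so no doubling is needed
sys.path.append(r'\path\to_the_directory\containing_themodule')
```

Raw strings are generally the more readable choice for Windows paths in Python source.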
