How is it possible to build multiple Python modules sharing the same namespace, compatible with both Python 2.7+ and 3.3+?
Let's call the namespace test. I want to have two separate modules called test.foo and test.bar. I'm currently developing test.helloworld, which depends on both test.foo and test.bar; both are listed in the requirements.txt file.
The modules test.foo and test.bar are currently using the Python 2 solution for namespace packages:
import pkg_resources
pkg_resources.declare_namespace(__name__)
Running the suggested pip command for development mode, pip install -e ., results in ImportError: No module named 'test.helloworld', while importing test.foo or test.bar works.
The Python 3 solution for namespace packages is implicit namespace packages, where the namespace package has no __init__.py file. Sadly, this does not work for Python 2 versions.
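For illustration, here is a minimal, self-contained sketch (the ns namespace and module names are invented for the demo) of what implicit namespace packages give you on Python 3.3+: two separate directories on sys.path each contribute a piece of the same namespace, with no __init__.py anywhere.

```python
import importlib
import os
import sys
import tempfile

# Two simulated install locations, each contributing one module
# to the shared (hypothetical) "ns" namespace -- note: no __init__.py.
site_a = tempfile.mkdtemp()
site_b = tempfile.mkdtemp()
for site, mod in ((site_a, "foo"), (site_b, "bar")):
    os.makedirs(os.path.join(site, "ns"))
    with open(os.path.join(site, "ns", mod + ".py"), "w") as f:
        f.write("WHO = %r\n" % mod)

sys.path[:0] = [site_a, site_b]
foo = importlib.import_module("ns.foo")  # found in site_a
bar = importlib.import_module("ns.bar")  # found in site_b
print(foo.WHO, bar.WHO)
```

Both portions import under the single ns namespace because Python 3 merges every matching directory on sys.path into the namespace package's __path__.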
How can I design a solution that works for both Python 2 and 3 (and still allows me to use pip install -e .)? The --egg solution does not work for me, since it is already deprecated.
I recently had a similar issue, where I had to install a package for both Python 2 and 3. I ended up downloading the code from GitHub and then running setup.py by calling
sudo python setup.py install
and
sudo python3 setup.py install
This results in the package being installed for both Python 2 and 3, even though the code itself was written for Python 2. It lets me work with the package whether I use Python 2 or 3, without any namespace conflicts.
You'd want to use pkgutil-style namespace packages.
From https://packaging.python.org/guides/packaging-namespace-packages/:
pkgutil-style namespace packages
Python 2.3 introduced the pkgutil module and the extend_path function. This can be used to declare namespace packages that need to be compatible with both Python 2.3+ and Python 3. This is the recommended approach for the highest level of compatibility.
A table listing out all the possible ways of dealing with namespace packages, and which ways would work together: https://github.com/pypa/sample-namespace-packages/blob/master/table.md
See the answer to a similar question for complete instructions that work on both Python 2 and 3.
In short, setup.py needs a unique name for each module and a common namespace_packages definition, in addition to an __init__.py that declares the namespace listed in namespace_packages.
If you are still having issues, please post your setup.py and __init__.py for each module.
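As a hedged sketch (the names come from the question; details will need adapting to your project), each distribution's setup.py could look like this for the pkg_resources style the question already uses:

```python
# setup.py for the test.foo distribution -- a sketch, not a drop-in file.
# The sibling test.bar distribution is identical except for its name.
from setuptools import setup, find_packages

setup(
    name="test.foo",                  # unique per distribution
    version="0.1",
    packages=find_packages(),
    namespace_packages=["test"],      # the shared namespace, declared in every part
)
```

In this style, the test/__init__.py shipped by every distribution must contain only the pkg_resources namespace declaration shown in the question, nothing else.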
I have read that there is no longer a need to add an __init__.py file in the latest versions of Python to treat a folder as a package. However, the official Python documentation does not say this; for example, the page below still shows examples and documentation using the __init__.py file.
The __init__.py files are required to make Python treat directories
containing the file as packages.
https://docs.python.org/3/tutorial/modules.html#packages
Do we still need to use the __init__.py file to make Python treat a folder as a package? And are there any advantages or disadvantages to adding or removing this file?
That is true, but only for namespace packages.
There are currently three different approaches to creating namespace packages:
Use native namespace packages. This type of namespace package is defined in PEP 420 and is available in Python 3.3 and later. This is recommended if packages in your namespace only ever need to support Python 3 and installation via pip.
Use pkgutil-style namespace packages. This is recommended for new packages that need to support Python 2 and 3 and installation via both pip and python setup.py install.
Use pkg_resources-style namespace packages. This method is recommended if you need compatibility with packages already using this method or if your package needs to be zip-safe.
You are probably thinking of native namespace packages.
What is the purpose of the namespace_packages argument in setup.py when working with PEP420 namespace packages (the ones without __init__.py)?
I played with it and saw no difference whether I declared the namespace packages or not.
"setup.py install" and "pip install ." worked in any case.
I am building an automatic setup.py code generator and would be happy not to handle this if this is not necessary.
As long as you:
aim for Python 3.3 and newer, or Python 2.7 with the importlib2 dependency installed (a backport of importlib for Python 2),
use a recent version of setuptools for packaging (I think it should be 28.8 or newer)
and use a recent pip version for installing (9.0 and newer will be fine, 8.1.2 will probably also work, but you should test that yourself),
you are on the safe side and can omit the namespace_packages keyword arg in your setup scripts.
There is an official PyPA repository named sample-namespace-packages on GitHub that contains a suite of tests for the different possible scenarios of installed distributions containing namespace packages of each kind. As you can see, the sample packages using implicit namespace packages don't use the namespace_packages arg in their setup scripts (here is one of the scripts), and all of the tests of types pep420 and cross_pep420_pkgutil pass on Python 3; here is the complete results table.
Namespace packages are separate packages that are installed under one top-level name.
Usually two different packages (for example SQLObject and Cheetah3) install two (or more) different top-level packages (sqlobject and Cheetah in my examples).
But what if I have a library that I want to split into parts, allowing users to install those parts without the rest of the library? I use namespace packages. Example: these two packages are two parts of one library: m_lib and m_lib.defenc. One installs m_lib/defenc.py, which can be used separately; the other installs the rest of the m_lib library. To install the entire library at once I also provide m_lib.full.
PS. All the mentioned packages are mine. Source code is available on GitHub or my personal git hosting.
I have a package that I'm working on (LDB_Algebra). It has an extra that depends on another package that I created (LDB_LAPACK). I have created a virtualenv and installed each of these packages as shown:
$ virtualenv -p pypy ve_pypy
$ . ve_pypy/bin/activate
(ve_pypy) $ pip install LDB_LAPACK
...
(ve_pypy) $ python setup.py install
... (Installs LDB_Algebra)
Each has the following for its __init__.py file under the ldb package:
__import__('pkg_resources').declare_namespace(__name__)
Problem:
The trouble is that when I try to import ldb.algebra, Python reports that it can't find the package. Just to make sure it hasn't completely lost everything, I import ldb.lapack, and that works fine. This suggests to me that I'm having a namespace package problem. It seems a similar question has been asked here (with no answer, sadly). Upon investigating the directory structure of my virtualenv, I find that under ve_pypy/site-packages/ there is a folder for the ldb namespace package which includes the lapack package but not the algebra package. I also see an egg file, LDB_Algebra-0.3.2-py2.7.egg. Inside this egg, in the ldb directory, is an __init__.py file with the appropriate namespace declaration (as above). Presumably this is where the ldb.algebra package should come from, but Python is not looking there.
Questions:
Can anyone confirm with a reference that what I'm seeing is a known issue (i.e., that I'm not just doing something slightly wrong that's causing all these troubles)? Are eggs and whatever the pip install method created (the ldb package directory under site-packages) fundamentally incompatible?
Assuming the answer to the first question is that my method of package installing is fundamentally flawed, is there an easier way of installing the LDB_LAPACK package from PyPI and the LDB_Algebra package from my local directory? I'm not a setuptools wiz or anything, so the answer may be very straightforward (don't overlook the obvious).
Apparently this is a well known problem. The solution that was suggested to me and seems to work fine is to use pip install . instead of python setup.py install.
I would like to have several Python submodules inside a main module, but I want to distribute them as separate Python packages. So package A should provide 'my_data.source_a', package B should provide 'my_data.source_b', and so on.
I found out that I have to use a namespace package for this, but while trying to figure out the details, I found multiple PEPs covering the problem. PEP 420 seems to be the latest one, building upon PEP 402 and PEP 382.
It's not clear to me what the status of the different PEPs and implementations is. So my question is: is http://pythonhosted.org/distribute/setuptools.html#namespace-packages still the way to go, or how should I build my namespace package?
The Python documentation has a good description of the three ways of creating namespace packages in Python, including guidelines for when to use each of the three methods. Furthermore, this topic is discussed in great depth in a different StackOverflow thread which has a good accepted answer. Finally, if you are someone who would rather read code than documentation, the sample-namespace-packages repo contains examples of namespace packages created using each of the three available methods.
In brief, if you intend your packages to work with Python versions 3.3 and above, you should use the native namespace packages method. If you intend your packages to work with older versions of Python, you should use the pkgutil method. If you intend to add a namespace package to a namespace that is already using the pkg_resources method, you should continue to use that method.
With native namespace packages, we can remove __init__.py from both packages and modify our setup.py files to look as follows:
# setup.py file for my_data.source_a
from setuptools import setup, find_namespace_packages

setup(
    name="my_data.source_a",
    version="0.1",
    packages=find_namespace_packages(include=['my_data.*'])
)

# setup.py file for my_data.source_b
from setuptools import setup, find_namespace_packages

setup(
    name="my_data.source_b",
    version="0.1",
    packages=find_namespace_packages(include=['my_data.*'])
)
We need to add the include=['my_data.*'] argument because, by default, find_namespace_packages() is rather lenient about the folders it includes as namespace packages, as described here.
This is the recommended approach for packages supporting Python 3.3 and above.
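To see the effect of the include filter, here is a small self-contained sketch (the temporary directory and folder names are invented for the demo; find_namespace_packages requires a reasonably recent setuptools, roughly 40.1 or newer) showing that stray top-level folders are kept out of the result:

```python
import os
import tempfile
from setuptools import find_namespace_packages

root = tempfile.mkdtemp()
# one real namespace portion ...
os.makedirs(os.path.join(root, "my_data", "source_a"))
# ... and a stray folder that would otherwise be picked up too
os.makedirs(os.path.join(root, "build", "junk"))

found = find_namespace_packages(root, include=["my_data.*"])
print(found)
```

Only my_data.source_a survives the filter; without include=, the build/junk directory would also be reported as a namespace package.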
With pkgutil-style namespace packages, we need to add the following line to the my_data/__init__.py file in each of our packages:
__path__ = __import__('pkgutil').extend_path(__path__, __name__)
This is the approach used by the backports namespace, and by different packages in the google-cloud-python repo, and it is the recommended approach for supporting older versions of Python.
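A small self-contained sketch (the install locations and package name are invented) of what that line does: extend_path scans sys.path and collects every directory named after the package, so all installed portions end up on one shared __path__.

```python
import os
import pkgutil
import sys
import tempfile

# Two simulated install locations, each shipping its own copy of "mypkg"
loc_a = tempfile.mkdtemp()
loc_b = tempfile.mkdtemp()
for loc in (loc_a, loc_b):
    pkg_dir = os.path.join(loc, "mypkg")
    os.makedirs(pkg_dir)
    open(os.path.join(pkg_dir, "__init__.py"), "w").close()

sys.path[:0] = [loc_a, loc_b]
# Equivalent to what extend_path(__path__, __name__) computes inside
# each portion's __init__.py: every "mypkg" directory found on sys.path.
combined = pkgutil.extend_path([], "mypkg")
print(combined)
```

Because every portion's __init__.py runs the same line, it does not matter which portion is imported first: the resulting __path__ always spans all installed pieces.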
The latest version of Python, Python 3.7, uses the native namespace packages approach, defined in PEP 420, to create namespace packages.
There are currently three different approaches to creating namespace packages:
Use native namespace packages. This type of namespace package is defined in PEP 420 and is available in Python 3.3 and later. This is recommended if packages in your namespace only ever need to support Python 3 and installation via pip.
Use pkgutil-style namespace packages. This is recommended for new packages that need to support Python 2 and 3 and installation via both pip and python setup.py install.
Use pkg_resources-style namespace packages. This method is recommended if you need compatibility with packages already using this method or if your package needs to be zip-safe.
Reference: Packaging namespace packages
Just a quick question: how do I get pypy to recognize third-party modules that I have in Python? For instance, I get the following error.
from tables import *
ImportError: No Module named tables
This basically says that it cannot find the pytables library that I use in the script I am trying to run.
For pure python modules, just add the directory containing the modules to your sys.path, using something like:
import os
import sys

sys.path.insert(0, '/usr/local/lib')
sys.path.insert(0, os.path.expanduser('~/lib'))
This works for CPython, Pypy and Jython.
For C extension modules, you can try Pypy's cpyext, but it won't run everything you might hope for, because some CPython C extension modules wander into dark corners of CPython's C-based runtime:
http://morepypy.blogspot.com/2010/04/using-cpython-extension-modules-with.html
I sometimes write code that uses ctypes to interface with a C .so, and then use that on both CPython and Pypy, because they both do pretty well with ctypes - but ctypes can be kinda slow on CPython:
http://docs.python.org/library/ctypes.html
Last I checked, Jython had the beginnings of ctypes, but it wasn't far enough along to use, at least not for my purposes.
There's also a newer interface that requires a C compiler at runtime. It'll likely be less brittle (read: less prone to segfaults) than ctypes. It's described here:
http://morepypy.blogspot.com/2012/06/release-01-of-cffi.html
It comes from the Pypy project I believe, but it was made to work first on CPython. AFAIK, it doesn't yet run on Pypy.
Pypy has a separate install space. Therefore, any modules you want to install from PyPI should be installed into that space. So, for instance, I have pypy installed in /usr/local/pypy-1.9-32bit
I recommend using pip or easy_install. Here's how to install pip:
curl https://bootstrap.pypa.io/get-pip.py | /usr/local/pypy-1.9-32bit/bin/pypy
or
curl https://raw.github.com/pypa/pip/master/contrib/get-pip.py | /usr/local/pypy-1.9-32bit/bin/pypy
Then, just use the newly installed pip to get the module:
sudo /usr/local/pypy-1.9-32bit/bin/pip install tables
In this case, it failed, with the following error:
bminton@bminton:/tmp$ sudo /usr/local/pypy-1.9-32bit/bin/pip install tables
Downloading/unpacking tables
Downloading tables-2.4.0.tar.gz (8.9Mb): 8.9Mb downloaded
Running setup.py egg_info for package tables
.. ERROR:: You need numpy 1.4.1 or greater to run PyTables!
Complete output from command python setup.py egg_info:
.. ERROR:: You need numpy 1.4.1 or greater to run PyTables!
Installation failed in this case because Tables depends on Numpy, which is not yet supported by PyPy (although they are working on it; see http://pypy.org/numpydonate.html). However, for many Python modules this method works great. For instance, I successfully installed the logilab constraint package this way.
As pointed out in other answers, pypy has a separate space for installed modules. I find the easiest way to add a module to pypy is the following:
download the source (e.g. as a *.tar.gz file)
extract, cd into the extracted directory
run pypy setup.py install (sometimes you need to prepend a sudo)
Copy folder for the module from C:\Python27\Lib to C:\pypy-2.3.1-win32\lib-python or the equivalent of where you have them installed.
Obviously, this will only work on Windows.
Actually, there is a pip_pypy available when you install pypy. Install third-party modules with that pip_pypy.