I have a Cython extension module that I want installed into the same namespace as its parent Python package. When I specify that the extension should live inside the package itself, it can't be found and imported. If I specify that the extension goes in the root of the Python namespace, it works fine, but then it isn't in the namespace I want. How do I make the extension module importable from the same namespace as the package itself?
I've made a simple test case for this question.
Folder Structure:
mypkg
├── foo
│ ├── __init__.py
│ └── barCy.pyx
└── setup.py
The barCy.pyx file:
cpdef long bar():
    return 0
The setup.py code:
import distutils.extension
from Cython.Build import cythonize
from setuptools import setup, find_packages

extensions = [distutils.extension.Extension("foo.bar", ['foo/barCy.pyx'])]

setup(
    name='foo',
    packages=find_packages(),
    ext_modules=cythonize(extensions),
)
The __init__.py is empty.
I want to be able to do:
>>> import foo.bar as bar
>>> bar.bar()
0
Instead, I get
>>> import foo.bar as bar
ModuleNotFoundError: No module named 'foo.bar'
If I were to change Extension("foo.bar",... to Extension("bar",..., then I could do import bar as if it were a top-level module. Although that is the expected behavior, it's not what I want: I only want this extension module to be accessible through the foo namespace.
My Python interpreter was being run from the same folder as the setup.py script, so import foo was importing the local source package rather than the one installed in the interpreter's site-packages directory. Because the two have the same folder structure, the local directory shadowed the installed package.
The fix: do your testing and running from a different folder than the source.
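A quick way to confirm which copy of the package won is to check its __file__ attribute (a hypothetical session; the path will differ on your machine):
>>> import foo
>>> foo.__file__
'/home/user/mypkg/foo/__init__.py'
If the path points into your source tree rather than site-packages, the local directory is shadowing the installed package.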
When building a Python package whose source tree looks like this:
src -\
  +- module -\
       <stuff>
  +- setup.py
everything is straightforward.
Is it possible to build a package where the module source doesn't reside in the same location as setup.py? The specific use case: the code for the module is partially or fully autogenerated in a location other than src.
E.g.
src -\
  +- setup.py
generated -\
  +- module -\
       <module code>
You can control the directory where packages reside by using the package_dir argument to setup(...). While this does appear to build a proper source distribution when package_dir is a relative path starting with .., pip will refuse to install it -- I'd suggest instead nesting your generated code inside the src directory and then using package_dir to select it.
Here's an example which moves all modules inside a generated subdir:
from setuptools import setup, find_packages

setup(
    name='mypkg',
    package_dir={'': 'generated'},
    packages=find_packages('generated'),
)
Using a setup like:
$ tree .
.
├── generated
│ ├── mod1
│ │ └── __init__.py
│ └── mod2
│ └── __init__.py
└── setup.py
This would make the following succeed after install: import mod1; import mod2
If you wanted to make those modules available under a different prefix, you would do:
from setuptools import setup, find_packages

setup(
    name='mypkg',
    package_dir={'hello': 'generated'},
    packages=[f'hello.{mod}' for mod in find_packages('generated')],
)
This would make import hello.mod1; import hello.mod2 succeed after installation.
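A quick interpreter check confirms the new prefix (a hypothetical session):
>>> import hello.mod1, hello.mod2
>>> hello.mod1.__name__
'hello.mod1'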
You can use relative paths in package lookup configuration. Examples:
all distributed sources are in generated
from setuptools import setup, find_packages

setup(
    ...
    package_dir={'': '../generated'},
    packages=find_packages(where='../generated'),
)
selected packages should be included from generated
In this example, only packages spam and eggs from generated will be included:
import pathlib
from setuptools import setup, find_packages

setup(
    name='so',
    package_dir={'spam': '../generated/spam', 'eggs': '../generated/eggs'},
    packages=find_packages(where='../generated'),  # or just ['spam', 'eggs']
)
Or implement a dynamic lookup, e.g.:
package_dir={p.name: p.resolve() for p in pathlib.Path('..', 'generated').iterdir()}
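A fuller version of that dynamic lookup might look like this (a sketch, assuming each immediate subdirectory of ../generated is a top-level package with no subpackages; otherwise combine it with find_packages):
import pathlib
from setuptools import setup

# map each top-level package name to its generated source directory
gen = pathlib.Path('..', 'generated')
pkg_dirs = {p.name: str(p.resolve()) for p in gen.iterdir() if p.is_dir()}

setup(
    name='so',
    package_dir=pkg_dirs,
    packages=list(pkg_dirs),
)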
better implementation by resolving all paths relative to the setup.py file
Resolving all paths relative to the setup.py script allows you to run the script from any directory, not just src, e.g. you can run python src/setup.py bdist_wheel. You may or may not need this, depending on your use case. Nevertheless, the recipe is the usual one: resolve all paths relative to __file__, e.g.
import pathlib
from setuptools import setup, find_packages

src_base = pathlib.Path(__file__, '..', '..', 'generated').resolve()

setup(
    ...
    package_dir={'': str(src_base)},
    packages=find_packages(where=str(src_base)),
)
I have the following package structure:
+ repo/
  + setup.py
  + package/
    + module1/
      + submodule1.py
      + submodule2.pyx
    + module2/
      + submodule3.py
I would like to use submodule2.pyx from submodule1.py with something like:
import submodule2
but I have absolutely no idea how to do this. I tried adding the following lines to my setup.py:
from distutils.core import setup
from setuptools import setup
from setuptools.extension import Extension
from Cython.Build import cythonize
from Cython.Distutils import build_ext

ext_modules = cythonize(Extension(
    "zindex",
    sources=["ndmg/graph/zindex.pyx"],
    language="c",
))

for e in ext_modules:
    e.pyrex_directives = {"boundscheck": False}

setup(
    name='ndmg',
    ext_modules=ext_modules,
    packages=[
        'package',
        'package.module1',
        ...
    ],
)
but was unsuccessful. All of the tutorials I could find had very simplified examples, so I am not sure how to include Cython modules in my Python package when the rest of the package is all just normal Python code. Does anybody have a good example I could follow, or can somebody tell me what I'm doing wrong?
Thanks in advance!
The name given to cythonize is what Cython will use to name the module and what it will have to be imported as.
The setup.py above will generate a native extension called zindex, and it will have to be imported as import zindex, even from Python files inside the ndmg.graph package.
Here is an example of how to do this:
from setuptools import setup
from setuptools.extension import Extension
from Cython.Build import cythonize
from Cython.Distutils import build_ext

ext_modules = cythonize(Extension(
    "ndmg.graph.zindex",
    sources=["ndmg/graph/zindex.pyx"],
    language="c",
))
<..>
Build and install the extension. In a Python file under ndmg/graph/py_index.py you can then do:
from ndmg.graph.zindex import <..>
to import from the Cython module.
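Since py_index.py lives in the same package as the compiled module, a relative import is equivalent (a sketch, assuming the extension was built under the full dotted name above):
# ndmg/graph/py_index.py
from .zindex import <..>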
I have a package named mypack which contains a module mymod.py and an __init__.py. For a reason that is not up for debate, I need to distribute this module compiled (no .py or .pyc files are allowed). That is, the __init__.py is the only source file allowed in the distributed archive.
The folder structure is:
.
│
├── mypack
│ ├── __init__.py
│ └── mymod.py
├── setup.py
I find that Cython is able to do this by converting each .py file into a .so library that can be directly imported by Python.
The question is: what must the setup.py file look like in order to allow easy packaging and installation?
The target system has a virtualenv where the package must be installed with whatever method allows easy install and uninstall (easy_install, pip, etc. are all welcome).
I tried everything within my reach. I read the setuptools and distutils documentation and all related Stack Overflow questions, and tried all kinds of commands (sdist, bdist, bdist_egg, etc.) with lots of combinations of setup.cfg and MANIFEST.in entries.
The closest I got was with the setup file below, which subclasses the bdist_egg command to also remove .pyc files, but that breaks the installation.
A solution that installs the files "manually" into the venv is also good, provided that all ancillary files included in a proper installation are covered (I need to run pip freeze in the venv and see mymod==0.0.1).
Run it with:
python setup.py bdist_egg --exclude-source-files
and (try to) install it with
easy_install mymod-0.0.1-py2.7-linux-x86_64.egg
As you may notice, the target is linux 64 bits with python 2.7.
from Cython.Distutils import build_ext
from setuptools import setup, find_packages
from setuptools.extension import Extension
from setuptools.command import bdist_egg
from setuptools.command.bdist_egg import walk_egg, log
import os

class my_bdist_egg(bdist_egg.bdist_egg):
    def zap_pyfiles(self):
        log.info("Removing .py files from temporary directory")
        for base, dirs, files in walk_egg(self.bdist_dir):
            for name in files:
                if not name.endswith('__init__.py'):
                    if name.endswith('.py') or name.endswith('.pyc'):
                        # original 'if' only has name.endswith('.py')
                        path = os.path.join(base, name)
                        log.info("Deleting %s", path)
                        os.unlink(path)

ext_modules = [
    Extension("mypack.mymod", ["mypack/mymod.py"]),
]

setup(
    name='mypack',
    cmdclass={'build_ext': build_ext,
              'bdist_egg': my_bdist_egg},
    ext_modules=ext_modules,
    version='0.0.1',
    description='This is mypack compiled lib',
    author='Myself',
    packages=['mypack'],
)
UPDATE.
Following @Teyras's answer, it was possible to build a wheel as requested in the answer. The setup.py file contents are:
import os
import shutil
from setuptools.extension import Extension
from setuptools import setup
from Cython.Build import cythonize
from Cython.Distutils import build_ext

class MyBuildExt(build_ext):
    def run(self):
        build_ext.run(self)
        build_dir = os.path.realpath(self.build_lib)
        root_dir = os.path.dirname(os.path.realpath(__file__))
        target_dir = build_dir if not self.inplace else root_dir
        self.copy_file('mypack/__init__.py', root_dir, target_dir)

    def copy_file(self, path, source_dir, destination_dir):
        if os.path.exists(os.path.join(source_dir, path)):
            shutil.copyfile(os.path.join(source_dir, path),
                            os.path.join(destination_dir, path))

setup(
    name='mypack',
    cmdclass={'build_ext': MyBuildExt},
    ext_modules=cythonize([Extension("mypack.*", ["mypack/*.py"])]),
    version='0.0.1',
    description='This is mypack compiled lib',
    author='Myself',
    packages=[],
    include_package_data=True,
)
The key point was to set packages=[]. Overriding the run method of the build_ext class was needed to get the __init__.py file inside the wheel.
Unfortunately, the answer suggesting packages=[] is wrong and may break a lot of stuff, as can be seen e.g. in this question. Don't use it. Instead of excluding all packages from the dist, you should exclude only the Python files that will be cythonized and compiled to shared objects.
Below is a working example; it uses my recipe from the question Exclude single source file from python bdist_egg or bdist_wheel. The example project contains package spam with two modules, spam.eggs and spam.bacon, and a subpackage spam.fizz with one module spam.fizz.buzz:
root
├── setup.py
└── spam
    ├── __init__.py
    ├── bacon.py
    ├── eggs.py
    └── fizz
        ├── __init__.py
        └── buzz.py
The module lookup is done in the build_py command, so that is the one you need to subclass with custom behaviour.
Simple case: compile all source code, make no exceptions
If you are going to compile every .py file (including the __init__.pys), it is sufficient to override the build_py.build_packages method, making it a no-op. Because build_packages then does nothing, no .py file is collected at all and the dist will include only the cythonized extensions:
from setuptools import find_packages, setup, Extension
from setuptools.command.build_py import build_py as build_py_orig
from Cython.Build import cythonize

extensions = [
    # example of an extension built from a glob pattern
    Extension('spam.*', ['spam/*.py']),
    # example of an extension with a single source file
    Extension('spam.fizz.buzz', ['spam/fizz/buzz.py']),
]

class build_py(build_py_orig):
    def build_packages(self):
        pass

setup(
    name='...',
    version='...',
    packages=find_packages(),
    ext_modules=cythonize(extensions),
    cmdclass={'build_py': build_py},
)
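To sanity-check the result, build a wheel with python setup.py bdist_wheel and list its contents; a minimal sketch (the exact .so filenames depend on your platform and Python version):
import glob
import zipfile

# assumes a wheel was already built into dist/
whl = glob.glob('dist/*.whl')[0]
for name in zipfile.ZipFile(whl).namelist():
    print(name)  # expect only .so extensions and dist-info metadata, no .py files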
Complex case: mix cythonized extensions with source modules
If you want to compile only selected modules and leave the rest untouched, you will need a bit more complex logic; in this case, you need to override the module lookup. In the example below, I still compile spam.bacon, spam.eggs and spam.fizz.buzz to shared objects, but leave the __init__.py files untouched, so they will be included as source modules:
import fnmatch
from setuptools import find_packages, setup, Extension
from setuptools.command.build_py import build_py as build_py_orig
from Cython.Build import cythonize

extensions = [
    Extension('spam.*', ['spam/*.py']),
    Extension('spam.fizz.buzz', ['spam/fizz/buzz.py']),
]
cython_excludes = ['**/__init__.py']

def not_cythonized(tup):
    (package, module, filepath) = tup
    return any(
        fnmatch.fnmatchcase(filepath, pat=pattern) for pattern in cython_excludes
    ) or not any(
        fnmatch.fnmatchcase(filepath, pat=pattern)
        for ext in extensions
        for pattern in ext.sources
    )

class build_py(build_py_orig):
    def find_modules(self):
        modules = super().find_modules()
        return list(filter(not_cythonized, modules))

    def find_package_modules(self, package, package_dir):
        modules = super().find_package_modules(package, package_dir)
        return list(filter(not_cythonized, modules))

setup(
    name='...',
    version='...',
    packages=find_packages(),
    ext_modules=cythonize(extensions, exclude=cython_excludes),
    cmdclass={'build_py': build_py},
)
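As a quick illustration of how the filter behaves, here is what not_cythonized returns for a few (package, module, filepath) tuples of the kind build_py produces (hypothetical inputs, runnable alongside the definitions above):
print(not_cythonized(('spam', '__init__', 'spam/__init__.py')))    # True: kept as a source module
print(not_cythonized(('spam', 'eggs', 'spam/eggs.py')))            # False: compiled instead
print(not_cythonized(('spam.fizz', 'buzz', 'spam/fizz/buzz.py')))  # False: compiled instead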
While packaging as a wheel is definitely what you want, the original question was about excluding .py source files from the package. This is addressed in Using Cython to protect a Python codebase by @Teyras, but his solution uses a hack: it removes the packages argument from the call to setup(). This prevents the build_py step from running, which does indeed exclude the .py files, but it also excludes any data files you want included in the package. (For example, my package has a data file called VERSION which contains the package version number.) A better solution is to replace the build_py setup command with a custom command which copies only the data files.
You also need the __init__.py file, as described above, so the custom build_py command should create it. I found that the compiled __init__.so runs when the package is imported, so all that is needed is an empty __init__.py file to tell Python that the directory is a package which is OK to import.
Your custom build_py class would look like:
import os
from setuptools.command.build_py import build_py

class CustomBuildPyCommand(build_py):
    def run(self):
        # package the data files but not the .py files
        build_py.build_package_data(self)
        # create an empty __init__.py in each target dir
        for pdir in self.packages:
            open(os.path.join(self.build_lib, pdir, '__init__.py'), 'a').close()
And configure setup to override the original build_py command:
setup(
    ...
    cmdclass={'build_py': CustomBuildPyCommand},
)
I suggest you use the wheel format (as suggested by fish2000). Then, in your setup.py, set the packages argument to []. Your Cython extension will still build, and the .so files will be included in the resulting wheel.
If your __init__.py is not included in the wheel, you can override the run method of build_ext class shipped by Cython and copy the file from your source tree to the build folder (the path can be found in self.build_lib).
This was exactly the sort of problem the Python wheels format – described in PEP 427 – was developed to address.
Wheels are a replacement for Python eggs (which were/are problematic for a bunch of reasons) – they are supported by pip, can contain architecture-specific private binaries (here is one example of such an arrangement) and are accepted generally by the Python communities who have stakes in these kind of things.
Here is one setup.py snippet from the aforelinked Python on Wheels article, showing how one sets up a binary distribution:
from setuptools import setup
from setuptools.dist import Distribution

class BinaryDistribution(Distribution):
    def is_pure(self):
        return False

setup(
    ...,
    include_package_data=True,
    distclass=BinaryDistribution,
)
… in lieu of the older (but probably somehow still canonically supported) setuptools classes you are using. It's very straightforward to make wheels for your distribution purposes, as outlined – as I recall from experience, either the wheel module's build process is somewhat cognizant of virtualenv, or it's very easy to use one within the other.
In any case, trading in the setuptools egg-based APIs for wheel-based tooling should save you some serious pain, I should think.
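With a BinaryDistribution wired up like that, a binary wheel can be built with:
python setup.py bdist_wheel
(assuming the wheel package is installed in the environment). The resulting .whl file lands in dist/ and can be installed into a virtualenv with pip install.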
I have the following setup.py:
from setuptools import setup
from distutils.core import setup

setup(
    name="foobar",
    version="0.1.0",
    author="Batman",
    author_email="batman@gmail.com",
    packages=["foobar"],
    include_package_data=True,
    install_requires=[
        "asyncio",
    ],
    entry_points={
        'console_scripts': [
            'foobar = foobar.__main__:main'
        ]
    },
)
Now, the __main__.py file gets installed and is callable as foobar from the console after installation, which is what I wanted. The problem is that __main__.py has an import at line 3, and that import does not work.
So my folder structure is as follows
dummy/setup.py
dummy/requirements.txt
dummy/foobar/__init__.py
dummy/foobar/__main__.py
dummy/foobar/wont_be_imported_one.py
I run python3 setup.py bdist from the dummy directory.
Upon running foobar after installation, I get an error:
File "/usr/local/bin/foobar", line 9, in <module>
    load_entry_point('foobar==0.1.0', 'console_scripts', 'foobar')()
[...]
ImportError: No module named 'wont_be_imported_one'
UPDATE.
__init__.py has the content:
from wont_be_imported_one import wont_be_imported_one
and wont_be_imported_one.py has the wont_be_imported_one function which I actually need to import.
In Python 3, imports are absolute by default, and so from wont_be_imported_one import ... inside of foobar will be interpreted as a reference to some module named wont_be_imported_one outside of foobar. You need to use a relative import instead:
from .wont_be_imported_one import wont_be_imported_one
# ^ Add this
See PEP 328 for more information.
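For completeness, an import that is absolute from the package root works as well (an equivalent sketch):
from foobar.wont_be_imported_one import wont_be_imported_one
Either form lets the foobar entry point resolve the module once the package is installed.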