Python project with C extensions - structure, imports and tests

I have a project written in Python with some C extensions (not using SWIG etc.). I am trying to figure out how to structure my project so that:
1. Imports work, including importing the shared objects.
2. I don't need to change PYTHONPATH (I tried figuring that out and failed).
3. Distributing the project package will be easiest in the future.
Current structure is, as suggested here:
Project\
    docs\               # mainly documentation, right?
    bin\                # empty
    setup.py            # setup for the project, as suggested in the above link
    project\
        __init__.py
        module.py
        tests\
            bla_test.py
        C\              # the package of C files
            file.c
            file.so
            other_c_stuff.c
            header.h
            setup.py    # setup to compile the C files and create .so files
    build\              # contains a bunch of (hopefully) irrelevant stuff
It worked from PyDev but not from the shell. An ideal answer would address the following:
1. A suggested structure for the project.
2. How an import would be performed (from, say, the modules in tests).
3. Should (can) I keep all the C files in a separate library?
4. Which of the setup.py files builds the C files (should I post them here)?
5. Is it possible to automatically build when necessary? How?
I tried relative imports - they don't work for me for some reason.
I saw the accepted answer to this question. He says - do whatever. But I can't get the imports to work. I read this answer, but have no clue what all the stuff he has (and I don't have) is. The accepted answer doesn't help me because, again, the imports fail. This blog post gives good advice but, again, the imports!

I don't want to go into detail for a general answer, since you linked to good ones already.
A structure that should work for you could look like this:
Project\
    build\              # directory used by setup.py
    docs\               # mainly documentation, right?
    setup.py            # setup for the project, as suggested in the above link
    project\
        __init__.py
        module.py
        c_package\
            __init__.py
            file.c
            file.so
            other_c_stuff.c
            header.h
        tests\
            __init__.py
            test_bla.py
Within the project package and its subpackages you can then use relative imports, provided you build the C extensions in place:
python setup.py build_ext --inplace
or create a setup.cfg containing
[build_ext]
inplace=True
but only use this for development and don't release it, since installation will fail.
Automating the build is possible, but I don't know of any approach other than calling setup.py directly whenever the C sources have changed.
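To make that concrete, a single top-level setup.py for the layout above could declare the C sources as an extension module. This is only a minimal sketch; the module and file names are taken from the tree above and may need adjusting:

# setup.py -- minimal sketch for the layout shown above
from setuptools import setup, find_packages, Extension

c_extension = Extension(
    # import path of the compiled module inside the package
    "project.c_package.file",
    sources=[
        "project/c_package/file.c",
        "project/c_package/other_c_stuff.c",
    ],
    include_dirs=["project/c_package"],  # so header.h is found
)

setup(
    name="project",
    version="0.1",
    packages=find_packages(),
    ext_modules=[c_extension],
)

Running python setup.py build_ext --inplace then drops the compiled .so next to the C sources, so something like from . import file works inside the package.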

Related

Splitting setup.py into multiple modules

I have inherited a complex setup.py which uses various Python modules inside a separate folder called buildchain. The entire repository is basically structured like this:
- setup.py
- buildchain
    - __init__.py
    - extension_generator.py
- src
    - my_package
        - __init__.py
        - my_module.py
with setup.py stating
import setuptools
from buildchain.extension_generator import CppExtensionGenerator

setuptools.setup(
    ...
    ext_modules=CppExtensionGenerator().create_ext_modules()
)
This structure works fine when running python setup.py build_ext and similar things, but when installing the package using pip, for example in a tox environment, I get a ModuleNotFoundError: No module named 'buildchain'. Understandable, but frustrating.
buildchain/ is explicitly excluded in the manifest, because it is only necessary for compiling the extensions and not otherwise needed after the installation is finished, so I don't want it in the wheel. The working solution is sticking all the code inside setup.py, but that's not easy to navigate. I'm looking for something with the equivalent effect of still not adding all this code to the users' Python environment permanently.
How can I have a setup.py split into different modules with imports between them?
#sinoroc and #9769953 pointed out that I do not want buildchain/ inside the wheel, but I do want it inside the sdist. Therefore it should be included in the manifest (which informs the sdist), but not constitute package data.
How to achieve that is answered, e.g., in Include files in sdist but not wheel.
[I think. I intend to try it out and then amend this answer describing what exactly I needed to do.]
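For the record, the combination the linked answer describes boils down to something like the sketch below (not tested against the original buildchain code): keep buildchain/ in the sdist via MANIFEST.in (e.g. a graft buildchain line), but keep it out of the wheel by not listing it among the installed packages.

# setup.py -- sketch: buildchain sits next to setup.py in the sdist, so it is
# importable at build time, but it is not installed as part of the wheel
import setuptools
from buildchain.extension_generator import CppExtensionGenerator

setuptools.setup(
    name="my_package",
    package_dir={"": "src"},
    packages=setuptools.find_packages("src"),  # finds src/my_package only
    ext_modules=CppExtensionGenerator().create_ext_modules(),
)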

use and install python personal libraries

I have two Python projects. One contains useful packages for file manipulation and such; they are useful because they can be reused in any other kind of project. The second is one of these other projects, which requires the use of my useful packages. Here is my projects' file structure:
Python-projects/
    Useful/
        package_parsing/
        package_file_manipulation/
        package_blabla/
    Super-Application/
        pkg1/
            __init__.py
            module1.py
            module2.py
        pkg2/
        setup.py
        MANIFEST.in
        README.rst
First off, I would like to use the Useful/package_parsing package in my module Super-Application/pkg1/module1.py. Is there a more convenient way to do this than copying package_parsing into the Super-Application project?
Depending on the first answer, that is, if there is a way to link a module from a different project, how could I include such an external module in a release package of my Super-Application project? I am not sure that making use of install_requires in setup.py will do.
My main idea here is not to duplicate the Useful/package_parsing package in all of my other development projects, especially when I would like to do modifications to this useful package. I wouldn't like to update all the outdated copies in each project.
=============
EDIT 1
It appears the first part of my question can be dealt with by appending to the path:
import sys
sys.path.insert(0, "path/to/Useful/package_parsing")
Moreover I can simply check the available paths using:
for p in sys.path:
    print(p)
Now for the second part: how could I include such an external module in a release package, possibly using the setup.py installation file?
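One common answer to the second part, sketched here under the assumption that Useful gets its own setup.py (all names below are just the ones from the tree above), is to make Useful an installable distribution and declare it as a dependency of Super-Application instead of copying it around:

# Useful/setup.py -- sketch: make the shared packages installable
# (assumes each package_* directory contains an __init__.py)
from setuptools import setup, find_packages

setup(
    name="useful",
    version="0.1",
    packages=find_packages(),  # package_parsing, package_file_manipulation, ...
)

During development, pip install -e path/to/Useful puts it on sys.path without copying anything, and Super-Application's setup.py can list install_requires=["useful"] so that release packages pull it in as a normal dependency.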

Python dynamic import path

G'day,
Being a total python noob when it comes to packaging and module organisation..
Given the following (simplified) structure:
.
├── bin
│   └── fos.py
└── lib
    └── drac.py
And given the fact that, when installed, the contents of the lib folder will go somewhere into /usr/local/share/pyshared and the contents of the bin folder somewhere into /usr/bin, how do I persuade this whole thing to import my modules from ../lib when in VCS mode, and to work like it should (i.e. from modulename.drac import bla) when installed, while preferably keeping the imports the same?
Yes, I've read the Python docs on module organisation and structure; I just can't seem to wrap my head around some best practices. Asking for best practices on SO is stupid, hence this concrete example, which I run into on a daily basis more or less.
Is this structure acceptable, if so, how do I organise the imports? If not, what would be the pythonic way to redo it?
Thanks!
I think you are bucking the idiom here. What you are describing is similar to the old C ld_lib paradigm.
There is nothing wrong with a Python project sourcing modules out of its own local file tree. Alternatively, if your code is really that separate and your lib has a well-defined API, then you should package it separately and import/install it using easy_install, pip, or a setup.py.
Generally, if the code appears to be evolving together, it's best to just leave it together. Install it wherever you install your Python code (/opt etc.) and symbolically link the executables into /usr/local/bin.
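If you do go the separately-packaged route, one common pattern is to let setup.py install both the library and the executable, so the script imports the package the same way during development (via an editable install) and after installation. The sketch below assumes lib/ becomes a package called modulename and the logic of bin/fos.py moves into modulename/fos.py with a main() function; those names are assumptions, not part of the original layout:

# setup.py -- sketch: install the library and expose fos as a console script
from setuptools import setup, find_packages

setup(
    name="modulename",
    version="0.1",
    packages=find_packages(),  # finds the package containing drac.py
    entry_points={
        "console_scripts": [
            # installs a 'fos' command that calls main() in modulename/fos.py
            "fos = modulename.fos:main",
        ],
    },
)

With this, from modulename import drac works identically whether the package is installed system-wide or as an editable development install (pip install -e .).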

How to deal with relative imports in a Python package

I'm working on a Python project with approximately the following layout
project/
    foo/
        __init__.py
        useful.py
        test/
            __init__.py
            test_useful.py
test_useful.py tries to import project.foo.useful so it can test it, but it doesn't work when I say "python project/foo/test/test_useful.py", but it does work if I copy it into my current directory and run "python test_useful.py".
What is the correct way to handle these imports while developing? It seems like this won't be an issue once installed, because it will be in PYTHONPATH. Should I use distutils to make a build/ folder and add it to my PYTHONPATH?
First of all you need to set up your PYTHONPATH to either include "project" or the parent of "project". This is important while you're developing too :-)
Then you should be able to use an absolute import:
from project.foo import useful
Secondly, I would suggest that instead of running tests by executing the module, you install py.test (pip install pytest). Then you'll be able to use relative imports, as long as your py.test invocation is generic enough (i.e. "py.test foo" will work, but "py.test foo/test/test_useful.py" will not). I would still recommend that you not use relative imports in tests.
Please consider using distutils/setuptools to make your project installable in the standard Python way. (Hint: you'll need to create a setup.py file parallel to the 'foo' directory, which is also known as a package.)
Doing so will also allow you to use a number of common Python testing frameworks (nose, py.test, etc.) to collect and run tests; most such frameworks automatically ensure 'foo' is an importable package before running the tests. Your test_useful.py tests can then import 'foo.useful' without a problem.
Also worth noting from your example directory structure: it is generally recommended that your tests directory NOT be a Python package, i.e. delete the test/__init__.py file. The framework will ensure the tests are runnable, and not having it as a package will help ensure it only gets distributed in source distributions and not in binary ones (where it likely isn't wanted).
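A minimal setup.py of the kind suggested above could be as small as the following sketch (the name and version are placeholders):

# setup.py -- placed next to the foo/ package
from setuptools import setup, find_packages

setup(
    name="project",
    version="0.1",
    packages=find_packages(exclude=["*.test", "*.test.*"]),  # ship foo, not the tests
)

After pip install -e ., from foo import useful resolves from anywhere, and the test runners mentioned above can collect the tests against the installed package.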

Correct Structure for Python Project Containing a Package and Scripts?

I'm currently working on a project that has a custom-designed Python package, along with a bunch of scripts that use that package. What's the best way of structuring this so that I can run the scripts from anywhere without getting "package not found" errors? I'd like to build tests for the package as well, so I was thinking of having something like:
project/
|--src
|   |--some_package
|--test
|--scripts
But then I'm not sure how to have the scripts import my custom package such that I can run/reference the scripts from anywhere without "package can't be found" errors. Any help is appreciated!
There is documentation for this in the Hitchhiker's Guide to Packaging.
One way to install and run these scripts on other machines is to package them up using distutils. See http://docs.python.org/library/distutils.html.
Otherwise, if you put an __init__.py file into a directory on your tree, Python will see the directory as a package and will allow you to import modules from it. For example, if you have this structure:
project/
|--some_script.py
|--some_package
|   |--__init__.py
|   |--some_module.py
|--test
|   |--__init__.py
|--scripts
|   |--__init__.py
In some_script.py you could do this:
import some_package.some_module
That will allow you to do imports from subdirectories without an elaborate installation to put your modules somewhere in the Python path. The same thing can be done for the 'test' and 'scripts' directories. (You probably already know this, but __init__.py can just be an empty file.)
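Note that this only works when the script is run from the project/ directory itself. To run the scripts under scripts/ from anywhere (the original question), one low-tech workaround, shown here purely as an illustrative sketch, is to put the project root on sys.path before importing the package:

# scripts/some_script.py -- sketch: make some_package importable no matter
# where the script is launched from
import os
import sys

# project/ is the directory one level above scripts/
PROJECT_ROOT = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
sys.path.insert(0, PROJECT_ROOT)

import some_package.some_module  # now resolves regardless of the working directory

if __name__ == "__main__":
    print(some_package.some_module.__file__)  # proves the import resolved

The cleaner long-term option is still the distutils/setuptools route mentioned above, which installs the package onto the path for you.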
