Splitting setup.py into multiple modules - python

I have inherited a complex setup.py which uses various Python modules inside a separate folder called buildchain. The repository is structured roughly like this:
- setup.py
- buildchain
  - __init__.py
  - extension_generator.py
- src
  - my_package
    - __init__.py
    - my_module.py
with setup.py stating:

import setuptools
from buildchain.extension_generator import CppExtensionGenerator

setuptools.setup(
    ...
    ext_modules=CppExtensionGenerator().create_ext_modules(),
)
This structure works fine when running python setup.py build_ext and similar things, but when installing the package using pip, for example in a tox environment, I get a ModuleNotFoundError: No module named 'buildchain'. Understandable, but frustrating.
buildchain/ is explicitly excluded in the manifest, because it is only needed for compiling the extensions and serves no purpose after installation, so I don't want it in the wheel. The working solution is to stick all the code inside setup.py, but that is not easy to navigate. I'm looking for something with the equivalent effect: the build code should still not end up permanently in the user's Python environment.
How can I have a setup.py split into different modules with imports between them?

#sinoroc and #9769953 pointed out that I do not want buildchain/ inside the wheel, but I do want it inside the sdist. Therefore it should be included in the manifest (which informs the sdist), but not constitute package data.
How to achieve that is answered, e.g., in Include files in sdist but not wheel.
[I think. I intend to try it out and then amend this answer describing what exactly I needed to do.]
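
A minimal sketch of that approach, based on the layout above (the package_dir/find_packages arguments are my assumption of how the real setup.py locates src/): MANIFEST.in puts buildchain/ into the sdist, while the wheel only ever contains the packages found under src/:

# MANIFEST.in -- controls what goes into the sdist
recursive-include buildchain *.py

# setup.py -- the wheel only contains packages found under src/,
# so buildchain/ ships in the sdist but never in the wheel
import setuptools
from buildchain.extension_generator import CppExtensionGenerator

setuptools.setup(
    ...
    package_dir={'': 'src'},
    packages=setuptools.find_packages(where='src'),
    ext_modules=CppExtensionGenerator().create_ext_modules(),
)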

Related

Python setup.py: a lot of additional folders in project

I had some issues with importing my custom package into a project.
I've created a very simple setup.py file.
It looks like this:
from setuptools import setup
import custom_package

setup(
    name='custom_package',
    version=custom_package.__version__,
    packages=['custom_package'],
)
Then I'm installing it into my system:
python setup.py install
custom_package becomes available, but there are a lot of additional folders in my project after this command:
build/
dist/
custom_package.egg-info/
Is this expected, or should I avoid them somehow?
Yes, that is to be expected, because you don't limit the files that go into the package by using the setup argument package_dir. Without it, setup will take everything from the directory it is in, which includes the directories it creates for staging the structure that goes into the installable archive (build/), as well as the destination directory for the final .tar.gz (dist/).
This is one of the reasons why many package layouts keep all sources under a src subdirectory. You can list all the .py files that have to go in explicitly, or use a helper function to generate that list (which is easier if everything already lives under one directory that is "clean").
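
A minimal sketch of the src-layout variant described above (name and version are placeholders):

from setuptools import setup, find_packages

setup(
    name='custom_package',
    version='0.1',
    package_dir={'': 'src'},            # all sources live under src/
    packages=find_packages(where='src'),
)

With this, build/, dist/ and the .egg-info/ directory still appear next to setup.py, but nothing outside src/ can accidentally end up inside the package.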

use and install python personal libraries

I have two Python projects. One contains useful packages for file manipulation and the like; they are useful because they can be reused in any other kind of project. The second project is one of these other projects, requiring the use of my useful packages. Here is my projects' file structure:
Python-projects/
    Useful/
        package_parsing/
        package_file_manipulation/
        package_blabla/
    Super-Application/
        pkg1/
            __init__.py
            module1.py
            module2.py
        pkg2/
        setup.py
        MANIFEST.in
        README.rst
First off, I would like to use the Useful/package_parsing package in my module Super-Application/pkg1/module1.py. Is there a more convenient way to do this than copying package_parsing into the Super-Application project?
Depending on the answer to the first question: if there is a way to link a module from a different project, how could I include such an external module in a release package of my Super-Application project? I am not sure that making use of install_requires in setup.py will do.
My main idea here is to not duplicate the Useful/package_parsing package in all of my other development projects, especially since I would like to keep modifying this useful package; I wouldn't want to update outdated copies in each project.
=============
EDIT 1
It appears the first part of my question can be dealt with by appending to the path:
import sys
sys.path.insert(0, "path/to/Useful/package_parsing")
Moreover, I can simply check the available paths using:

for p in sys.path:
    print(p)
Now for the second part, how could I include such external module in a release package, possibly using the setup.py installation file?
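
The usual answer to that second part (a sketch, assuming Useful gets its own minimal setup.py) is to make Useful installable and then depend on it:

# Useful/setup.py -- hypothetical minimal packaging for the shared code
from setuptools import setup, find_packages

setup(
    name='useful',
    version='0.1',
    packages=find_packages(),
)

During development, pip install -e path/to/Useful links the single working copy into your environment, so edits are picked up everywhere. For releases, Super-Application's setup.py can then declare install_requires=['useful'], provided the useful package is published somewhere pip can find it.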

Python project with C extensions - structure, imports and tests

I have a project written in Python with some C extensions (not using SWIG etc.). I am trying to figure out how to structure the project so that:
imports work, including importing the shared objects;
I don't need to change PYTHONPATH (I tried figuring that out and failed);
distributing the project as a package will be easiest in the future.
Current structure is, as suggested here:

Project\
    docs\                   # mainly documentation, right?
    bin\                    # empty
    setup.py                # setup for the project, as suggested in the above link
    project\
        __init__.py
        module.py
        tests\
            bla_test.py
        C\                  # the package of C files
            file.c
            file.so
            other_c_stuff.c
            header.h
            setup.py        # setup to compile the C files and create .so files
            build\          # contains a bunch of (hopefully) irrelevant stuff
It worked from PyDev but not from the shell. An ideal answer would address the following:
A suggested structure for the project.
How an import would be performed (from, say, the modules in tests).
Should (can) I keep all the C files in a separate library?
In which of the setup.py files is the build of the C files done (should I post them here)?
Is it possible to automatically build when necessary? How?
I tried relative imports, but they don't work for me for some reason.
I saw the accepted answer to this question. He says: do whatever. But I can't get the imports to work. I read this answer, but have no clue what all the stuff he has (and I don't have) is. The accepted answer doesn't help me because, again, the imports fail. This blog post gives good advice but, again, the imports!
I don't want to go into detail for a general answer, since you linked to good ones already.
A structure which should work for you could look like this:
Project\
    build\              # directory used by setup.py
    docs\               # mainly documentation, right?
    setup.py            # setup for the project, as suggested in the above link
    project\
        __init__.py
        module.py
        c_package\
            __init__.py
            file.c
            file.so
            other_c_stuff.c
            header.h
        tests\
            __init__.py
            test_bla.py
So within the project package and its subpackages you can use relative imports, if you build the C extensions in place:

python setup.py build_ext --inplace

or create a setup.cfg containing:

[build_ext]
inplace=True

But only use this for development and don't release it this way, since installation will fail.
Build automation is possible, but I don't know of any way other than calling setup.py directly whenever the C sources have changed.
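
For completeness, a minimal sketch of the single top-level setup.py that builds such an extension (module and source names are taken from the layout above; how the sources group into one extension is my assumption):

from setuptools import setup, find_packages, Extension

setup(
    name='project',
    version='0.1',
    packages=find_packages(),
    ext_modules=[
        Extension(
            'project.c_package.file',   # import path of the resulting .so
            sources=[
                'project/c_package/file.c',
                'project/c_package/other_c_stuff.c',
            ],
        ),
    ],
)

With this in place, python setup.py build_ext --inplace drops file.so next to the C sources, and import project.c_package.file works from within the package.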

How to deal with relative imports in a Python package

I'm working on a Python project with approximately the following layout
project/
    foo/
        __init__.py
        useful.py
        test/
            __init__.py
            test_useful.py
test_useful.py tries to import project.foo.useful so it can test it. That doesn't work when I say "python project/foo/test/test_useful.py", though it does work if I copy it into my current directory and run "python test_useful.py".
What is the correct way to handle these imports while developing? It seems like this won't be an issue once installed, because it will be in PYTHONPATH. Should I use distutils to make a build/ folder and add it to my PYTHONPATH?
First of all you need to set up your PYTHONPATH to either include "project" or the parent of "project". This is important while you're developing too :-)
Then you should be able to use an absolute import:
from project.foo import useful
Secondly, I would suggest that instead of running tests by executing the module, you install py.test (pip install pytest). Then you'll be able to use relative imports, as long as your py.test invocation is generic enough (i.e. "py.test foo" will work, but "py.test foo/test/test_useful.py" will not). I would still recommend that you not use relative imports in tests.
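
For instance, a test written with the absolute import might look like this (a sketch, assuming the parent of project/ is on PYTHONPATH and project/ itself is a package; the assert is just a placeholder for real test logic):

# project/foo/test/test_useful.py
from project.foo import useful

def test_useful_importable():
    assert useful is not None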
Please consider using distutils/setuptools to make your project installable in the standard Python way. (Hint: you'll need to create a setup.py file parallel to the 'foo' directory, which is also known as a package.)
Doing so will also allow you to use a number of common Python testing frameworks (nose, py.test, etc.) to collect and run tests; most such frameworks automatically ensure 'foo' is an importable package before running the tests. Your test_useful.py tests can then import 'foo.useful' without a problem.
Also worth noting from your example directory structure: it is generally recommended that your tests directory NOT be a Python package, i.e. delete the test/__init__.py file. The framework will ensure the tests are runnable, and not having it as a package helps ensure it only gets distributed in source distributions and not binary ones (where it likely isn't wanted).
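
A minimal sketch of the setup.py that answer describes, placed parallel to the foo/ directory (name, version and the exclude patterns are placeholders):

from setuptools import setup, find_packages

setup(
    name='project',
    version='0.1',
    packages=find_packages(exclude=['*.test', '*.test.*']),
)

After pip install -e ., 'foo' is importable from anywhere, and test runners like py.test can find it without PYTHONPATH tweaks.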

How can I make setuptools ignore subversion inventory?

When packaging a Python package with a setup.py that uses setuptools:
from setuptools import setup
...
the source distribution created by:
python setup.py sdist
not only includes, as usual, the files specified in MANIFEST.in, but it also, gratuitously, includes all of the files that Subversion lists as being version controlled beneath the package directory. This is vastly annoying. Not only does it make it difficult to exercise any sort of explicit control over what files get distributed with my package, but it means that when I build my package following an "svn export" instead of an "svn checkout", the contents of my package might be quite different, since without the .svn metadata setuptools will make different choices about what to include.
My question: how can I turn off this terrible behavior, so that "setuptools" treats my project the same way whether I'm using Subversion, or version control it's never heard of, or a bare tree created with "svn export" that I've created at the end of my project to make sure it builds cleanly somewhere besides my working directory?
The best I have managed so far is an ugly monkey-patch:
from setuptools.command import sdist
del sdist.finders[:]
But this is Python, not the jungle, so of course I want a better solution that involves no monkeys at all. How can I tame setuptools, turn off its magic, and have it behave sensibly by looking only at the visible, predictable rules in my MANIFEST.in instead?
I know you know much of this, Brandon, but I'll try to give as complete an answer as I can (although I'm no setuptools guru), for the benefit of others.
The problem here is that setuptools itself involves quite a lot of black magic, including an entry point called setuptools.file_finders where you can add plugins that find files to include. I am, however, at a complete loss as to how to REMOVE plugins from it...
Quick workaround: svn export your package to a temporary directory and run the setup.py from there. That means you have no .svn metadata, so the svn finder finds no files to include. :)
Longer workaround: do you really need setuptools? Setuptools has a lot of features, so the answer is likely yes. Mainly those features are dependencies (so your dependencies get installed by easy_install), namespace packages (foo.bar), and entry points. Namespace packages can actually be created without setuptools as well. If you use none of these, you might get away with just using distutils.
Ugly workaround: the monkeypatch you gave to sdist in your question, which simply makes the plugin not have any finders, so it exits quickly.
So as you see, this answer, although as complete as I can make it, is still embarrassingly incomplete. I can't actually answer your question, though I think the answer is "you can't".
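
The quick workaround in commands (paths are placeholders):

svn export . /tmp/mypackage-export
cd /tmp/mypackage-export
python setup.py sdist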
Create a MANIFEST.in file with:

recursive-exclude . *
# other MANIFEST.in commands go here
# to explicitly include whatever files you want
See http://docs.python.org/distutils/commandref.html#sdist-cmd for the MANIFEST.in syntax.
Simple solution: do not use setuptools for creating the source distribution, downgrade to distutils for that command:

from distutils.command.sdist import sdist
from setuptools import setup

setup(
    # ... all the usual setup arguments ...
    cmdclass={'sdist': sdist},
)
Probably the answer is in your setup.py. Do you use find_packages? This function by default uses the VCS (e.g. subversion, hg, ...). If you don't like it, just write a different Python function which collects only the things you want.
I would argue that the default sdist behavior is correct. When you are building a source distribution, I would expect it to contain everything that is checked into Subversion. Of course it would be nice to be able to override it cleanly in special circumstances.
Compare sdist to bdist_egg; I bet only the files that are specified explicitly get included.
I did a simple test with three files, all in svn: an empty dummy.lkj, foobar.py, and a setup.py looking like this:

import setuptools

setuptools.setup(name='foobar', version='0.1', py_modules=['foobar'])
sdist creates a tarball that includes dummy.lkj. bdist_egg creates an egg that does not include dummy.lkj.
You probably want something like this:

from distutils.core import setup

def packages():
    import os
    packages = []
    for path, dirs, files in os.walk("yourprogram"):
        if ".svn" in dirs:
            dirs.remove(".svn")
        if "__init__.py" in files:
            packages.append(path.replace(os.sep, "."))
    return packages

setup(
    # name, version, description, etc...
    packages=packages(),
    # package_data, data_files, etc...
)
