Python packaging - Proper usage of the tests folder with unittest

I'm experimenting with packaging some Python projects and have followed this guide. The anonymized file tree can be seen below. The TOML file is a barebones one from the tutorial, modified appropriately. Building and uploading work well. So far so good.
.
├── LICENSE
├── pyproject.toml
├── README.md
├── src
│   └── mymodule
│       ├── __init__.py
│       └── main.py
└── tests
My next intended step is to package an older, smaller, well-behaved project which includes a test suite written with unittest. Simplified structure below.
.
├── mymodule
│   ├── submoduleA
│   │   ├── __init__.py
│   │   └── foo.py
│   ├── submoduleB
│   │   ├── __init__.py
│   │   └── bar.py
│   ├── baz.py
│   └── __init__.py
└── tests
    ├── test_submoduleA.py
    └── test_submoduleB.py
This is where my progress grinds to a halt.
There are many different ways to skin a cat, but as far as I can tell none of them involves unittest directly. I have opted to go ahead by using tox to invoke unittest.
Similarly, when I look at different Python project repos, the structure under tests seems to differ a bit.
End intent/wish: convert said older project to a packageable one, editing the tests as little as possible, and using the tests both for testing while developing and for basic checks on the target device later.
Questions:
What is the purpose of the tests folder? E.g. to run tests while developing files in src, to test the built package, and/or to verify that a package works once installed?
Is it possible to use the pyproject.toml file with unittest?
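For reference, unittest itself needs nothing in pyproject.toml: once the package is installed into an environment (e.g. via pip install -e .), python -m unittest discover can simply be pointed at the tests folder. A minimal tox configuration doing exactly that might look like the sketch below (the env name py311 is illustrative); tox builds and installs the package from pyproject.toml before running the commands.
[tox]
envlist = py311

[testenv]
commands = python -m unittest discover -s tests -v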

Python package-namespace: common test/docs/setup.py or one per namespace - which is the better pattern?

(In the interest of transparency, this is a follow-up to a question asked here.)
I'm dealing with related files for which a namespace package seems a good fit. I'm following the guide from the Python Packaging Authority, which places a setup.py in each namespace package:
mynamespace-subpackage-a/
    setup.py
    mynamespace/
        subpackage_a/
            __init__.py

mynamespace-subpackage-b/
    setup.py
    mynamespace/
        subpackage_b/
            __init__.py
            module_b.py
In my tests, I created a similar project. Apart from setup.py, I placed my unit tests, docs, and other stuff per namespace (I left out some of the directories for compactness). I used PyScaffold to generate the namespaces.
├── namespace-package-test.package1
│   ├── LICENSE.txt
│   ├── README.md
│   ├── setup.cfg
│   ├── setup.py
│   ├── src
│   │   └── pkg1
│   │       ├── cli
│   │       │   ├── __init__.py
│   │       │   └── pkg1_cli.py
│   │       └── __init__.py
│   └── tests
├── namespace-package-test.package2
│   ├── AUTHORS.rst
However, I then noticed that PyScaffold has the option to create namespace packages in the putup command.
(venv) steve@PRVL10SJACKSON:~/Temp$ putup --force my-package -p pkg1 --namespace namespace1
(venv) steve@PRVL10SJACKSON:~/Temp$ putup --force my-package -p pkg1 --namespace namespace2
This creates a folder structure like this:
├── AUTHORS.rst
├── CHANGELOG.rst
├── LICENSE.txt
├── README.rst
├── requirements.txt
├── setup.cfg
├── setup.py
├── src
│   ├── namespace1
│   │   ├── __init__.py
│   │   └── pkg1
│   │       ├── __init__.py
│   │       └── skeleton.py
│   └── namespace2
│       ├── __init__.py
│       └── pkg1
│           ├── __init__.py
│           └── skeleton.py
└── tests
    ├── conftest.py
    └── test_skeleton.py
So I'm conflicted; I trust the team at PyScaffold, but this goes against the example from the Packaging Authority.
Are both approaches valid?
Is there a reason to choose one approach over the other?
The idea behind the namespace option in PyScaffold is to share/reuse namespaces across projects (as opposed to having more than one namespace inside a single project). In other words, it is meant to split a larger project into independently maintained/developed projects.
To the best of my understanding, a structure like the one you showed in the 4th code block will not work. Using putup --force twice with two different namespaces on the same root folder is not the intended/supported usage.
The approach of PyScaffold is the same as the Packaging Authority's; the only difference is that PyScaffold assumes you have only one package contained in a single project and git repository (PyScaffold also uses a src directory, for the reasons explained in Ionel's blog post).
The reason for adopting one setup.py per namespace+package is that it is required for building separate distribution files (i.e. you need one setup.py per *.whl).
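For concreteness, a minimal sketch of one such setup.py, e.g. for mynamespace-subpackage-a/ from the guide above (the version is illustrative, and a recent setuptools is assumed for find_namespace_packages):
from setuptools import setup, find_namespace_packages

setup(
    name="mynamespace-subpackage-a",
    version="0.1.0",
    # picks up mynamespace/subpackage_a even though mynamespace/ has no
    # __init__.py (PEP 420 implicit namespace package)
    packages=find_namespace_packages(include=["mynamespace.*"]),
)
Each sub-project carries its own copy of this file, which is what makes one wheel per sub-project possible.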

Importing Python modules and packages in different sub-directories of the same project

I'd like to figure out the cleanest, and preferably self-contained, way to use my packages in scripts that are in a different directory to the package itself.
The example problem is as follows:
The modules in lib need to both be importable and runnable as scripts.
My project directory is as below and I'm having two issues:
in lib/api.py, I want to read in data_files/key.txt correctly when api.py is called or imported
in testing_script.py, I want to import and use lib/get_data.py
I can't seem to find a clean way to do this; does this mean my project is structured in a non-Pythonic way?
Thanks for the help.
my-project-git
├── LICENSE
├── README.md
├── my_project
│   ├── data_files
│   │   ├── key.txt
│   │   └── mappings.csv
│   ├── lib
│   │   ├── __init__.py
│   │   ├── api.py
│   │   └── get_data.py
│   └── test
│       ├── __init__.py
│       └── testing_script.py
├── requirements.txt
└── setup.py
As far as I know, there isn't one official Pythonic way to structure your project.
This is what Kenneth Reitz recommended in 2013, and it's what I use: https://www.kennethreitz.org/essays/repository-structure-and-python.
README.rst
LICENSE
setup.py
requirements.txt
sample/__init__.py
sample/core.py
sample/helpers.py
docs/conf.py
docs/index.rst
tests/test_basic.py
tests/test_advanced.py
Inside sample (my_project in your case) you can separate the code into categories as you like, e.g. utils (common functions), database (read, write), view (user commands), etc. It depends on your project.
As for importing modules at the same level, you should expose them in the __init__.py file of the top-level package, which is sample in this case.
For example:
__init__.py in sample (my_project in your case):
from sample.core import a_Class
from sample.core import a_function
from sample.core import anything
then from /test/test_basic.py you do:
from sample import a_Class
# or import sample
a = a_Class() # use the class from core.py
# or a = sample.a_Class()
Take a look at the sample module repository: https://github.com/navdeep-G/samplemod
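As a side note on the first issue in the question (reading data_files/key.txt whether api.py is imported or run as a script), a common pattern, independent of the layout chosen, is to resolve the path relative to the module file itself; a minimal sketch for lib/api.py:
from pathlib import Path

# data_files/ sits next to lib/ inside my_project/, one level up from this file
DATA_DIR = Path(__file__).resolve().parent.parent / "data_files"

def read_key():
    return (DATA_DIR / "key.txt").read_text()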

Packaging Python with a Cython extension

I'm trying to build a package that uses both Python and Cython modules. The problem I'm having concerns imports after building and installing: I'm not sure how to make files import from the .so file generated by the build process.
Before building, my folder structure looks like this:
root/
├── c_integrate.c
├── c_integrate.pyx
├── cython_builder.py
├── __init__.py
├── integrator_class.py
├── integrator_modules
│   ├── cython_integrator.py
│   ├── __init__.py
│   ├── integrator.py
│   ├── numba_integrator.py
│   ├── numpy_integrator.py
│   ├── quadratic_error.png
│   ├── report3.txt
│   ├── report4.txt
│   └── report5.txt
├── report6.txt
├── setup.py
└── test
    ├── __init__.py
    └── test_integrator.py
Building with python3.5 setup.py build gives this new folder in root
root/build/
├── lib.linux-x86_64-3.5
│   ├── c_integrate.cpython-35m-x86_64-linux-gnu.so
│   ├── integrator_modules
│   │   ├── cython_integrator.py
│   │   ├── __init__.py
│   │   ├── integrator.py
│   │   ├── numba_integrator.py
│   │   └── numpy_integrator.py
│   └── test
│       ├── __init__.py
│       └── test_integrator.py
The setup.py file looks like this
from setuptools import setup, Extension, find_packages
import numpy

setup(
    name="integrator_package",
    author="foo",
    packages=find_packages(),
    ext_modules=[Extension("c_integrate", ["c_integrate.c"])],
    include_dirs=[numpy.get_include()],
)
My question is then: how do I write import statements for the functions from the .so file in integrator_class.py in root, and in cython_integrator and test_integrator located in the build directory? Appending to sys.path seems like a quick and dirty solution that I don't much like.
EDIT:
As pointed out in the comments, I haven't installed the package. This is because I don't know what to write to import from the .so file.
In no specific order:
The file setup.py is typically located at the root of a project, next to the package directory. Example:
library_name/
    __init__.py
    file1.py
setup.py
README
Then, the build directory appears alongside the project's source and not in the project source.
To import the file c_integrate.cpython-35m-x86_64-linux-gnu.so in Python, just import c_integrate. The rest of the naming is taken care of automatically, as it is just platform information. See PEP 3149.
A valid module is one of:
a directory with a modulename/__init__.py file
a file named modulename.py
a file named modulename.PLATFORMINFO.so
of course located in the Python path. So there is no need for an __init__.py file for a compiled Cython module.
For your situation, move the Cython code into the project directory and either do a relative import (from . import c_integrate) or a full from integrator_modules import c_integrate, where the latter only works when your package is installed.
Some of this information can be found in my blog post on Cython modules: http://pdebuyl.be/blog/2017/cython-module.html
I believe that this should let you build a proper package; comment below if not.
EDIT: to complete the configuration (see comments below), the poster also:
Fixed the module path in the setup.py file so that it is the full module name starting from the PYTHONPATH: Extension("integrator_package.integrator_modules.c_integrator", ["integrator_package/integrator_modules/c_integrator.c"]) instead of Extension("c_integrate", ["c_integrate.c"]).
Cythonized the module, built it and used it with the same Python interpreter.
Further comment: the setup.py file can cythonize the file as well; include the .pyx file instead of the .c file as the source.
from Cython.Build import cythonize

cythonize(Extension('integrator_package.integrator_modules.c_integrator',
                    ["integrator_package/integrator_modules/c_integrator.pyx"],
                    include_dirs=[numpy.get_include()]))
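Putting those fixes together, the whole setup.py might end up looking like the sketch below (assuming the sources have been moved under integrator_package/integrator_modules/ as described):
from setuptools import setup, Extension, find_packages
from Cython.Build import cythonize
import numpy

setup(
    name="integrator_package",
    author="foo",
    packages=find_packages(),
    # full dotted name so the compiled .so lands inside the package
    ext_modules=cythonize(
        Extension(
            "integrator_package.integrator_modules.c_integrator",
            ["integrator_package/integrator_modules/c_integrator.pyx"],
            include_dirs=[numpy.get_include()],
        )
    ),
)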

Versioning multiple projects with versioneer within a single git repository

I have a single, large git repo with many different projects (not submodules). A few of these projects are Python projects for which I'd like to track versioning with Python's versioneer; others may be completely independent projects (say, in Haskell). I.e., the directory structure looks like this:
myrepository
├── .git
├── README.md
├── project1/
│   ├── project1/
│   │   ├── __init__.py
│   │   └── _version.py
│   ├── versioneer.py
│   ├── setup.cfg
│   └── setup.py
├── project2/
├── project3/
└── project4/
    ├── project4/
    │   ├── __init__.py
    │   └── _version.py
    ├── versioneer.py
    ├── setup.cfg
    └── setup.py
This doesn't play well with versioneer because it can't discover the .git directory at the project root level, so I get a version of 0+unknown.
Questions:
Is there a suggested way to use versioneer with a single monolithic repo with multiple projects?
Depending on the answer to the above, is it recommended that my git tags read like project1-2.1.0, project2-1.3.1, or should I unify the git tags like 1.2.1, 1.2.2?
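For what it's worth on the second question, versioneer's per-project setup.cfg does support a tag prefix, which is the usual mechanism for per-project tags like project1-2.1.0 in a shared repo; a sketch of the [versioneer] section for project1 (values illustrative, and whether this alone fixes the 0+unknown issue depends on the versioneer version):
[versioneer]
VCS = git
style = pep440
versionfile_source = project1/_version.py
versionfile_build = project1/_version.py
tag_prefix = project1-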

Pip install ignores files in MANIFEST.in - how to structure the project correctly?

I read a lot about this problem but could not find any solution, so I'll ask yet another question about it, since I'm not even sure whether I'm using the correct folder structure for my Python package.
So basically I'm developing an application which uses the Tornado web-server framework and I want to package it, so the users can install it via pip and get access to a basic script to start the web server.
The directory structure is the following:
├── MANIFEST.in
├── README.md
├── config
│   └── default.cfg
├── docs
│   ├── Makefile
│   ├── _build
│   ├── _static
│   ├── _templates
│   ├── conf.py
│   └── index.rst
├── foopackage
│   ├── __init__.py
│   ├── barmodule.py
│   └── bazmodule.py
├── setup.py
├── static
│   ├── css
│   │   ├── menu.css
│   │   └── main.css
│   ├── img
│   │   └── logo.png
│   ├── js
│   │   ├── ui.js
│   │   └── navigation.js
│   └── lib
│       ├── d3.v3.min.js
│       └── jquery-1.11.0.min.js
└── templates
    ├── index.html
    └── whatever.html
As you can see, the Python code is in the package foopackage.
The MANIFEST.in file recursively includes the directories config, static, templates, and docs.
This is my setup.py (only the relevant parts):
from setuptools import setup

setup(
    name='foo',
    version='0.1.0',
    packages=['foopackage'],
    include_package_data=True,
    install_requires=[
        'tornado>=3.2.2',
    ],
    entry_points={
        'console_scripts': [
            'foo=foopackage.barmodule:main',
        ],
    },
)
If I run python setup.py sdist, everything gets packaged nicely; the docs, templates, config files etc. are included. However, if I run pip install ..., only foopackage gets installed and everything else is ignored.
How do I include those additional files in the install procedure? Is my directory structure OK? I also read about "faking a package", i.e. putting everything in a directory and touching an __init__.py file, but that seems pretty odd to me :-\
I solved the problem by moving the static directory to the actual Python module directory (foopackage). It seems that top level "non-package" folders are ignored otherwise.
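For illustration, the working layout then looks roughly like the sketch below, with MANIFEST.in pointing inside the package (only static is shown moved; templates and config would follow the same pattern):
foopackage/
    __init__.py
    barmodule.py
    bazmodule.py
    static/
        css/
        img/
        js/
        lib/
with MANIFEST.in containing, e.g.:
recursive-include foopackage/static *
With include_package_data=True, files matched by MANIFEST.in are only installed when they live inside a package, which is why moving the directory fixes the pip install.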
