Run all tests from subdirectories in Python

I am at my wits' end trying to get all my unit tests to run in Python. I have searched about 30 different posts and the unittest documentation but still cannot figure it out.
First I have two test classes that I can run each individually and all the tests pass:
File: unittest.subfolder1.TestObject1.py
class TestObject1(unittest.TestCase):
    def test_case1(self):
        ...some code...
        ...some assertions...

if __name__ == '__main__':
    unittest.main()
File: unittest.subfolder2.TestObject2.py
class TestObject2(unittest.TestCase):
    def test_case1(self):
        ...some code...
        ...some assertions...

if __name__ == '__main__':
    unittest.main()
Starting in the top-level directory above 'unittest', I am trying to use unittest discovery to find and run all my tests:
import unittest
loader = unittest.TestLoader()
suite = loader.discover('unittest')
unittest.TextTestRunner().run(suite)
When I do this I get the error `ModuleNotFoundError: No module named 'subfolder1.TestObject1'`
What am I doing wrong?

A good approach is to run all the tests in a subdirectory from the command line. To find files such as "TestObject1.py, TestObject2.py, ..." in subdirectories, you can run the following command:
python -m unittest discover -p 'Test*.py'
Additionally, an __init__.py file is required within the import and module directories: Python unittest discovery with subfolders
The import unittest is required in the files unittest.subfolder1.TestObject1.py and unittest.subfolder2.TestObject2.py
It is also possible to explicitly define the directory where discovery starts with the -s parameter:
python -m unittest discover [options]
-s directory Directory to start discovery ('.' default)
-p pattern Pattern to match test files ('test*.py' default)
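For example, both options can be combined for the layout in the question (the start directory here is the OP's 'unittest' folder, though a different name is advisable, as another answer below notes):
python -m unittest discover -s unittest -p 'Test*.py'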
In case you are using unittest2, it comes with a script unit2. The command line usage is:
unit2 discover
unit2 -v test_module

Do not name your directory unittest; it may conflict with the standard library module of the same name.
You also need to create a file named __init__.py in all of your directories (subfolder1, etc.) so they become packages and their contents can be imported.
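A possible layout after applying both suggestions might look like the following; the directory name tests is an assumption here, not from the original question:
project_root/
    tests/
        __init__.py
        subfolder1/
            __init__.py
            TestObject1.py
        subfolder2/
            __init__.py
            TestObject2.py
With that layout, the OP's original Python snippet should work by pointing discovery at the renamed directory and the OP's file name pattern:
import unittest

loader = unittest.TestLoader()
suite = loader.discover('tests', pattern='Test*.py')
unittest.TextTestRunner().run(suite)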

So I had to do my own workaround, but at least I can get them all to run with the above file structure. It requires reinstantiating the TestLoader and the TestSuite each time I give it a new file path, so first I need to collect all relevant file paths in the unittest directory.
import os
import unittest
import traceback

class UnitTestLauncher(object):

    def runTests(self):
        #logging.INFO("Running unit tests...")
        lsPaths = []
        #Find all relevant subdirectories that contain unit tests
        #Exclude the 'unittest' directory itself, but include its subdirectories, with `path != 'unittest'`
        for path, subdirs, files in os.walk('unittest'):
            if "__pycache__" not in path and path != 'unittest':
                lsPaths.append(path)

        #loop through subdirectories and run individually
        for path in lsPaths:
            loader = unittest.TestLoader()
            suite = unittest.TestSuite()
            suite = loader.discover(path)
            unittest.TextTestRunner().run(suite)
This solution is not perfect, and each directory comes out as a separate block of output, so you have to look through each one manually for failed tests.
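A possible refinement of the same idea (a sketch, assuming the layout above) is to merge the discovered suites into one TestSuite so there is a single summary at the end:
import os
import unittest

# collect every discovered suite into one combined suite
combined = unittest.TestSuite()
for path, subdirs, files in os.walk('unittest'):
    if "__pycache__" not in path and path != 'unittest':
        combined.addTests(unittest.TestLoader().discover(path))

# one runner, one pass/fail summary
result = unittest.TextTestRunner(verbosity=2).run(combined)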

Old question but, oh, so current. I am new to Python, coming from strongly typed languages, and while the language itself is OK(ish), the conventions, tools and workarounds needed to make everything work in the ecosystem can drive you nuts. I struggled myself with running unit tests from separate subdirectories, and this is how I solved it.
First, package the code you are testing. Organize your directories like this:
Work
|
+---PkToTest
| |
| +--- __init__.py
| +--- a.py
| +--- <other modules>.py
|
+---Tests (for PKToTest)
|
+--- test_a.py
PkToTest becomes a package due to the __init__.py file. In test_a.py make sure your sys.path will contain the path to PkToTest (absolute path, not relative). I did that by:
import sys
sys.path.insert(0, "<absolute path to parent of PkToTest directory>")

import unittest
from PkToTest import a

class aTestSuite(unittest.TestCase):
    def test1(self):
        self.assertEqual(a.fnToTest(), ...)

Testing All Subdirectories
Given a structure of:
my_package
|
|-- controller
|   |-- validator.py
|
|-- validator
|   |-- controller.py
|
|-- test
|   |-- controller
|   |   |-- __init__.py  (MAKE SURE THIS EXISTS, OR THE unittest MODULE WILL NOT FIND THE TESTS)
|   |   |-- test_controller.py
|   |
|   |-- validator
|       |-- __init__.py  (MAKE SURE THIS EXISTS, OR THE unittest MODULE WILL NOT FIND THE TESTS)
|       |-- test_validator.py
then just run
python -m unittest discover -s my_package/test
What this does is run the tests, where -s tells discovery to use my_package/test as the starting directory.
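If the test packages import the application code by its top-level package name, it can also help to pass the top-level directory explicitly; a sketch, assuming the same layout:
python -m unittest discover -s my_package/test -t my_package
Here -t (--top-level-directory) tells unittest which directory is the project's top level, so imports inside the test modules resolve the same way regardless of where the command is run from.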

In my project all folders are plain folders (not packages) and they have the structure:
Folder > Subfolder > Subfolder > Tests > test_xxxx.py
Folder > Subfolder > Subfolder > xxxx.py
So I modified the answer from here, also took a part from How do I run all Python unit tests in a directory?, and came up with this:
import os, unittest

testFolderPaths = []
for path, subdirs, files in os.walk(os.getcwd()):
    for file in files:
        if file.startswith("test_") and file.endswith(".py"):
            if path not in testFolderPaths:  # avoid running the same folder twice
                testFolderPaths.append(path)

for path in testFolderPaths:
    print(f"Running tests from {path}...")
    loader = unittest.TestLoader()
    suite = loader.discover(path)
    runner = unittest.TextTestRunner()
    result = runner.run(suite)
    print(f"RUN {result.testsRun} Tests. PASSED? {result.wasSuccessful()}")
If any tests fail, it will throw an error showing exactly which one failed.
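For use in CI, a small extension of the same loop (a sketch, continuing from the snippet above) can track overall success and exit non-zero when any suite fails:
import sys

all_passed = True
for path in testFolderPaths:
    result = unittest.TextTestRunner().run(unittest.TestLoader().discover(path))
    all_passed = all_passed and result.wasSuccessful()

# non-zero exit code makes the failure visible to CI systems
sys.exit(0 if all_passed else 1)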

Related

pytest configuration problem (transition from nosetests (71 sec) to pytest (1536 sec))

The problem:
pytest (adopted by policy) takes 1536 seconds to run the same test suite (585 tests) that nosetests runs in 71 seconds.
The pytest.ini file is:
[pytest]
python_files = tests_*.py *_tests.py
norecursedirs = .idea (pycharm env).
testpaths = tests
And the file is placed at the root of the project:
root
|-+ mod1
| |-- core.py
| |-- utils.py
|-+ mod2
| |-- core.py
| |-- utils2.py
|-+ tests
| |-- test_mod1
| |-- test_mod2
|-+ utils (don't test).
| |-- u1.py
| |-- u2.py
|- pytest.ini
|- readme.md
Things I've checked (following advice from 14 other SO posts):
The number of Pass/Fails is the same.
When running the tests individually with pytest they take ~20 ms.
When running a folder with pytest, 10-20 tests take 14-15 seconds.
The test suite has one environment, there's no env or os magic. Just lots of technical logic.
Each test_xyz.py file has its own isolated def setup and def teardown that creates/drops an SQLite database. The tests interact with the database by adding new transactions and checking the additions. Example:
global db

def setup():
    db = get_new_db()

def teardown():
    pass

def test_01():
    w = Widget(db)  # create widget instance.
    w.add_friend('a@b.com')
    assert 'a@b.com' in w.friends()
Questions:
Do I really have to plaster @pytest.fixture(scope='module') on the setup and teardown of every one of the 585 tests? I hope not.
What can I do to get the runtime of pytest to be similar to nosetests?
I'm not sure why pytest chose to invoke the module setup function in a pytest_runtest_setup hook that runs once per test, instead of as a module-scoped autouse fixture, but here it is:
@hookimpl(trylast=True)
def pytest_runtest_setup(item):
    if is_potential_nosetest(item):
        if not call_optional(item.obj, "setup"):
            # call module level setup if there is no object level one
            call_optional(item.parent.obj, "setup")
        # XXX this implies we only call teardown when setup worked
        item.session._setupstate.addfinalizer((lambda: teardown_nose(item)), item)
This means you'll need to rename the setup/teardown functions to setup_module()/teardown_module(). If you're on Linux/MacOS, you can use sed combined with grep for batch renaming:
$ grep -lr "\(def setup():\|def teardown():\)" | \
xargs sed -i 's/def setup():/def setup_module():/g;s/def teardown():/def teardown_module():/g'
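As an alternative to renaming, the same per-module behaviour can be expressed with a single module-scoped autouse fixture; this is a sketch, assuming the db handle and get_new_db() from the question:
import pytest

@pytest.fixture(scope='module', autouse=True)
def db():
    # runs once per test module, replacing setup()/teardown()
    database = get_new_db()
    yield database
    # teardown code (if any) goes after the yield

Tests that need the database then take db as an argument (def test_01(db): ...) instead of relying on the module-level global.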

Application says module not found after moving to directory

I'm rather new to python and have this odd issue, which I can't seem to find an answer for.
When both app.py and mod_db.py were in the root directory everything worked, but it broke when I moved them into a subdirectory.
My directory structure:
demo_api
|
|-- demo-api
|
|-- __init__.py
|-- app.py
|-- mod_db.py
My main module
import json
from flask import Flask, request, Response
from .db_mod import insert_and_score

app = Flask(__name__)

@app.route('/')
def hello_world():
    return 'Hello World!'

@app.route('/emotional_scoring', methods=['POST'])
def get_scoring():
    json_obj = request.json
    ret_json = insert_and_score(json_obj)
    resp = Response(json.dumps(ret_json), mimetype='application/json', status=200)
    return resp

if __name__ == '__main__':
    app.run(host='0.0.0.0', debug=False)
The error message points to this line:
from .mod_db import insert_and_score
I've tried changing the sub-directory name. I've tried using a full path, i.e. from demo_app.mod_db import insert_and_score, and the error is ModuleNotFoundError: No module named 'demo_app'.
The issue is that it works fine in my IDE (PyCharm), but when I run it from the command line these are the errors I encounter.
As stated by mfrackowiak, I just had to change it to
from db_mod import insert_and_score
and it worked. But PyCharm doesn't like this. So I think it might be a PyCharm issue.
Fix by removing relative import
It seems you want both the import and PyCharm to be happy. Like mfrackowiak said, you want to use absolute importing:
from db_mod import insert_and_score
Tell PyCharm what the new "Sources Root" is
Now to make PyCharm happy, you will want to tell it that the demo_api subdirectory is a "Sources Root". You can do this by right-clicking the directory and going to Mark Directory as > Sources Root. You can also find it in Preferences > Project Structure. You can do this for each subdirectory, as needed.
Why this is good to do
This is useful when you have a Python project as a subdirectory of a repo with many configs, scripts, and other files and directories. You often don't want the Python app to be mixed in with those, so you move it to an app/ folder. This confuses PyCharm because, by default, it uses the Content Root as the Sources Root as well. You can fix this by simply telling PyCharm explicitly what the Sources Root is.
Example:
my-awesome-project/ <---- Content Root
|--.venv/
| |--<venv stuff>
|--scripts/
| |--build.sh
| |--run.sh
|--docker/
| |--dev.Dockerfile
| |--prod.Dockerfile
|--app/ <---------------- Sources Root
| |--sub-mod1/
| | |--foo.py
| | |--bar.py
| |--sub-mod2/
| |--baz.py
|--.gitignore
|--.python-version
|--requirements.txt
|--dev.env
|--docker-compose.yml

How do you properly integrate unit tests for file parsing with pytest?

I'm trying to test file parsing with pytest. I have a directory tree that looks something like this for my project:
project
    project/
        cool_code.py
    setup.py
    setup.cfg
    test/
        test_read_files.py
        test_files/
            data_file1.txt
            data_file2.txt
My setup.py file looks something like this:
from setuptools import setup

setup(
    name = 'project',
    description = 'The coolest project ever!',
    setup_requires = ['pytest-runner'],
    tests_require = ['pytest'],
)
My setup.cfg file looks something like this:
[aliases]
test=pytest
I've written several unit tests with pytest to verify that files are properly read. They work fine when I run pytest from within the "test" directory. However, if I execute any of the following from my project directory, the tests fail because they cannot find data files in test_files:
>> py.test
>> python setup.py pytest
The test seems to be sensitive to the directory from which pytest is executed.
How can I get pytest unit tests to discover the files in "data_files" for parsing when I call it from either the test directory or the project root directory?
One solution is to define a rootdir fixture with the path to the test directory, and reference all data files relative to this. This can be done by creating a test/conftest.py (if not already created) with some code like this:
import os
import pytest

@pytest.fixture
def rootdir():
    return os.path.dirname(os.path.abspath(__file__))
Then use os.path.join in your tests to get absolute paths to test files:
import os

def test_read_favorite_color(rootdir):
    test_file = os.path.join(rootdir, 'test_files/favorite_color.csv')
    data = read_favorite_color(test_file)
    # ...
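A pathlib-based variant of the same fixture (a sketch under the same layout assumptions, reusing the question's read_favorite_color) avoids the os.path.join calls in the tests:
from pathlib import Path
import pytest

@pytest.fixture
def rootdir() -> Path:
    # directory containing conftest.py, i.e. the test/ directory
    return Path(__file__).parent.resolve()

def test_read_favorite_color(rootdir):
    test_file = rootdir / 'test_files' / 'favorite_color.csv'
    data = read_favorite_color(str(test_file))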
One solution is to try multiple paths to find the files.
#!/usr/bin/env python
# -*- coding: utf-8 -*-
from coolprogram import *
import os

def test_file_locations():
    """Possible locations where test data could be found."""
    return(['./test_files',
            './tests/test_files',
            ])

def find_file(filename):
    """ Searches for a data file to use in tests """
    for location in test_file_locations():
        filepath = os.path.join(location, filename)
        if os.path.exists(filepath):
            return(filepath)
    raise IOError('Could not find test file.')

def test_read_favorite_color():
    """ Test that favorite color is read properly """
    filename = 'favorite_color.csv'
    test_file = find_file(filename)
    data = read_favorite_color(test_file)
    assert(data['first_name'][1] == 'King')
    assert(data['last_name'][1] == 'Arthur')
    assert(data['correct_answers'][1] == 2)
    assert(data['cross_bridge'][1] == True)
    assert(data['favorite_color'][1] == 'green')
One way is to pass a dictionary mapping command names to custom command classes to the cmdclass argument of the setup function.
Another way is described here; posted for quick reference.
pytest-runner will install itself on every invocation of setup.py. In some cases, this causes delays for invocations of setup.py that will never invoke pytest-runner. To help avoid this contingency, consider requiring pytest-runner only when pytest is invoked:
import sys

needs_pytest = {'pytest', 'test', 'ptr'}.intersection(sys.argv)
pytest_runner = ['pytest-runner'] if needs_pytest else []
# ...
setup(
    #...
    setup_requires=[
        #... (other setup requirements)
    ] + pytest_runner,
)
Make sure any data paths you use in your test module are relative to the directory containing setup.py.
In the OP's case the data file path would be test/test_files/data_file1.txt.
I made a project with the same structure, read data_file1.txt (containing some text), and it works for me.

Python - Get path of root project structure

I've got a python project with a configuration file in the project root.
The configuration file needs to be accessed in a few different files throughout the project.
So it looks something like: <ROOT>/configuration.conf
<ROOT>/A/a.py, <ROOT>/A/B/b.py (where both a.py and b.py need to access the configuration file).
What's the best / easiest way to get the path to the project root and the configuration file without depending on which file inside the project I'm in, i.e. without using ../../? It's okay to assume that we know the project root's name.
You can do this how Django does it: define a variable to the Project Root from a file that is in the top-level of the project. For example, if this is what your project structure looks like:
project/
configuration.conf
definitions.py
main.py
utils.py
In definitions.py you can define (this requires import os):
ROOT_DIR = os.path.dirname(os.path.abspath(__file__)) # This is your Project Root
Thus, with the Project Root known, you can create a variable that points to the location of the configuration (this can be defined anywhere, but a logical place would be to put it in a location where constants are defined - e.g. definitions.py):
CONFIG_PATH = os.path.join(ROOT_DIR, 'configuration.conf') # requires `import os`
Then, you can easily access the constant (in any of the other files) with the import statement (e.g. in utils.py): from definitions import CONFIG_PATH.
Other answers advise using a file in the top level of the project. This is not necessary if you use pathlib.Path and parent (Python 3.4 and up). Consider the following directory structure, where all files except README.md and utils.py have been omitted.
project
│   README.md
│
└───src
│   │   utils.py
│   │   ...
│   ...
In utils.py we define the following function.
from pathlib import Path

def get_project_root() -> Path:
    return Path(__file__).parent.parent
In any module in the project we can now get the project root as follows.
from src.utils import get_project_root
root = get_project_root()
Benefits: Any module which calls get_project_root can be moved without changing program behavior. Only when the module utils.py itself is moved do we have to update get_project_root and the imports (refactoring tools can be used to automate this).
All the previous solutions seem to be overly complicated for what I think you need, and often didn't work for me. The following one-line command does what you want:
import os
ROOT_DIR = os.path.abspath(os.curdir)
The code below returns the path up to your project root:
import sys
print(sys.path[1])
To get the path of the "root" module, you can use:
import os
import sys
os.path.dirname(sys.modules['__main__'].__file__)
More interestingly, if you have a config "object" in your top-most module, you could read from it like so:
app = sys.modules['__main__']
stuff = app.config.somefunc()
Try:
ROOT_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
A standard way to achieve this would be to use the pkg_resources module which is part of the setuptools package. setuptools is used to create an install-able python package.
You can use pkg_resources to return the contents of your desired file as a string and you can use pkg_resources to get the actual path of the desired file on your system.
Let's say that you have a package called stackoverflow.
stackoverflow/
|-- app
| `-- __init__.py
`-- resources
|-- bands
| |-- Dream\ Theater
| |-- __init__.py
| |-- King's\ X
| |-- Megadeth
| `-- Rush
`-- __init__.py
3 directories, 7 files
Now let's say that you want to access the file Rush from a module app.run. Use pkg_resources.resource_filename to get the path to Rush and pkg_resources.resource_string to get the contents of Rush; thusly:
import pkg_resources

if __name__ == "__main__":
    print pkg_resources.resource_filename('resources.bands', 'Rush')
    print pkg_resources.resource_string('resources.bands', 'Rush')
The output:
/home/sri/workspace/stackoverflow/resources/bands/Rush
Base: Geddy Lee
Vocals: Geddy Lee
Guitar: Alex Lifeson
Drums: Neil Peart
This works for all packages in your python path. So if you want to know where lxml.etree exists on your system:
import pkg_resources

if __name__ == "__main__":
    print pkg_resources.resource_filename('lxml', 'etree')
output:
/usr/lib64/python2.7/site-packages/lxml/etree
The point is that you can use this standard method to access files that are installed on your system (e.g pip install xxx or yum -y install python-xxx) and files that are within the module that you're currently working on.
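Note that pkg_resources is part of setuptools and is considered legacy on newer Python versions; a roughly equivalent sketch using the standard-library importlib.resources (Python 3.9+), assuming the same resources.bands package from above:
from importlib.resources import files

rush = files('resources.bands').joinpath('Rush')
print(rush)              # filesystem path, for packages installed as regular directories
print(rush.read_text())  # contents of the Rush resource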
Simple and Dynamic!
This solution works on any OS and at any directory level:
Assuming your project folder name is my_project
from pathlib import Path
current_dir = Path(__file__)
project_dir = [p for p in current_dir.parents if p.parts[-1]=='my_project'][0]
I've recently been trying to do something similar and I have found these answers inadequate for my use cases (a distributed library that needs to detect project root). Mainly I've been battling different environments and platforms, and still haven't found something perfectly universal.
Code local to project
I've seen this example mentioned and used in a few places, Django, etc.
import os
print(os.path.dirname(os.path.abspath(__file__)))
Simple as this is, it only works when the file the snippet lives in is actually part of the project; we do not retrieve the project directory, but the snippet's directory.
Similarly, the sys.modules approach breaks down when called from outside the entrypoint of the application. Specifically, I've observed that a child thread cannot determine this without a relation back to the 'main' module. I've explicitly put the import inside a function to demonstrate an import from a child thread; moving it to the top level of app.py would fix it.
app/
|-- config
| `-- __init__.py
| `-- settings.py
`-- app.py
app.py
#!/usr/bin/env python
import threading

def background_setup():
    # Explicitly importing this from the context of the child thread
    from config import settings
    print(settings.ROOT_DIR)

# Spawn a thread to background preparation tasks
t = threading.Thread(target=background_setup)
t.start()

# Do other things during initialization

t.join()
# Ready to take traffic
settings.py
import os
import sys

ROOT_DIR = None

def setup():
    global ROOT_DIR
    ROOT_DIR = os.path.dirname(sys.modules['__main__'].__file__)
    # Do something slow
Running this program produces an attribute error:
>>> import main
>>> Exception in thread Thread-1:
Traceback (most recent call last):
  File "C:\Python2714\lib\threading.py", line 801, in __bootstrap_inner
    self.run()
  File "C:\Python2714\lib\threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "main.py", line 6, in background_setup
    from config import settings
  File "config\settings.py", line 34, in <module>
    ROOT_DIR = get_root()
  File "config\settings.py", line 31, in get_root
    return os.path.dirname(sys.modules['__main__'].__file__)
AttributeError: 'module' object has no attribute '__file__'
...hence a threading-based solution
Location independent
Using the same application structure as before but modifying settings.py
import os
import sys
import inspect
import platform
import threading

ROOT_DIR = None

def setup():
    main_id = None
    for t in threading.enumerate():
        if t.name == 'MainThread':
            main_id = t.ident
            break
    if not main_id:
        raise RuntimeError("Main thread exited before execution")
    current_main_frame = sys._current_frames()[main_id]
    base_frame = inspect.getouterframes(current_main_frame)[-1]
    if platform.system() == 'Windows':
        filename = base_frame.filename
    else:
        filename = base_frame[0].f_code.co_filename
    global ROOT_DIR
    ROOT_DIR = os.path.dirname(os.path.abspath(filename))
Breaking this down:
First we want to accurately find the thread ID of the main thread. In Python 3.4+ the threading library has threading.main_thread(); however, not everybody uses 3.4+, so we search through all threads looking for the main thread and save its ID. If the main thread has already exited, it won't be listed in threading.enumerate(), and we raise a RuntimeError() in that case until I find a better solution.
main_id = None
for t in threading.enumerate():
    if t.name == 'MainThread':
        main_id = t.ident
        break
if not main_id:
    raise RuntimeError("Main thread exited before execution")
Next we find the very first stack frame of the main thread. Using the cPython specific function sys._current_frames() we get a dictionary of every thread's current stack frame. Then utilizing inspect.getouterframes() we can retrieve the entire stack for the main thread and the very first frame.
current_main_frame = sys._current_frames()[main_id]
base_frame = inspect.getouterframes(current_main_frame)[-1]
Finally, the differences between Windows and Linux implementations of inspect.getouterframes() need to be handled. Using the cleaned up filename, os.path.abspath() and os.path.dirname() clean things up.
if platform.system() == 'Windows':
    filename = base_frame.filename
else:
    filename = base_frame[0].f_code.co_filename

global ROOT_DIR
ROOT_DIR = os.path.dirname(os.path.abspath(filename))
So far I've tested this on Python2.7 and 3.6 on Windows as well as Python3.4 on WSL
I settled on the following approach for myself.
I need to get the path to 'MyProject/drivers' from the main file.
MyProject/
├─── RootPackge/
│ ├── __init__.py
│ ├── main.py
│ └── definitions.py
│
├─── drivers/
│ └── geckodriver.exe
│
├── requirements.txt
└── setup.py
definitions.py
It is put not in the root of the project, but in the root of the main package.
from pathlib import Path
ROOT_DIR = Path(__file__).parent.parent
Use ROOT_DIR:
main.py
# imports must be relative,
# not from the root of the project,
# but from the root of the main package.
# Not this way:
# from RootPackge.definitions import ROOT_DIR
# But like this:
from definitions import ROOT_DIR
# Here we use ROOT_DIR
# get path to MyProject/drivers
drivers_dir = ROOT_DIR / 'drivers'
# Thus, you can get the path to any directory
# or file from the project root
driver = webdriver.Firefox(drivers_dir)
driver.get('http://www.google.com')
Then PYTHONPATH is not needed to access the definitions.py file.
Works in PyCharm:
run file 'main.py' (ctrl + shift + F10 in Windows)
Works in CLI from project root:
$ py RootPackge/main.py
Works in CLI from RootPackge:
$ cd RootPackge
$ py main.py
Works from directories above project:
$ cd ../../../../
$ py MyWork/PythoProjects/MyProject/RootPackge/main.py
Works from anywhere if you give an absolute path to the main file.
Doesn't depend on venv.
Here is a package that solves that problem: from-root
pip install from-root
from from_root import from_root, from_here
# path to config file at the root of your project
# (no matter from what file of the project the function is called!)
config_path = from_root('config.json')
# path to the data.csv file at the same directory where the callee script is located
# (has nothing to do with the current working directory)
data_path = from_here('data.csv')
Check out the link above and read the readme to see more use cases
I struggled with this problem too until I came to this solution.
This is the cleanest solution in my opinion.
In your setup.py add "packages"
setup(
    name='package_name',
    version='0.0.1',
    .
    .
    .
    packages=['package_name'],
    .
    .
    .
)
In your python_script.py
import pkg_resources
import os
resource_package = pkg_resources.get_distribution(
'package_name').location
config_path = os.path.join(resource_package,'configuration.conf')
This worked for me using a standard PyCharm project with my virtual environment (venv) under the project root directory.
The code below isn't the prettiest, but it consistently gets the project root. It returns the full directory path to venv from the VIRTUAL_ENV environment variable, e.g. /Users/NAME/documents/PROJECT/venv
It then splits the path at the last /, giving an array with two elements. The first element will be the project path, e.g. /Users/NAME/documents/PROJECT
import os
print(os.path.split(os.environ['VIRTUAL_ENV'])[0])
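An equivalent using pathlib, under the same assumption that the virtual environment lives directly under the project root:
import os
from pathlib import Path

# parent of the venv directory is the project root
print(Path(os.environ['VIRTUAL_ENV']).parent)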
Just an example: I want to run runio.py from within helper1.py
Project tree example:
myproject_root
- modules_dir/helpers_dir/helper1.py
- tools_dir/runio.py
Get project root:
import os
rootdir = os.path.dirname(os.path.realpath(__file__)).rsplit(os.sep, 2)[0]
Build path to script:
runme = os.path.join(rootdir, "tools_dir", "runio.py")
execfile(runme)
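Note that execfile() exists only in Python 2; on Python 3, the standard-library runpy module gives a similar effect (a sketch, reusing the runme path built above):
import runpy

# run_name="__main__" makes any `if __name__ == '__main__':` block in runio.py execute
runpy.run_path(runme, run_name="__main__")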
I used the ../ method to fetch the current project path.
Example:
Project1 -- D:\projects
    src
        ConfigurationFiles
            Configuration.cfg

Path = "../src/ConfigurationFiles/Configuration.cfg"
I had to implement a custom solution because it's not as simple as you might think.
My solution is based on stack trace inspection (inspect.stack()) plus sys.path, and it works fine regardless of the location of the Python module in which the function is invoked or of the interpreter (I tried running it in PyCharm, in a poetry shell, and elsewhere). This is the full implementation with comments:
import inspect
import os
import sys
from inspect import FrameInfo
from pathlib import Path


def get_project_root_dir() -> str:
    """
    Returns the name of the project root directory.
    :return: Project root directory name
    """

    # stack trace history related to the call of this function
    frame_stack: [FrameInfo] = inspect.stack()

    # get info about the module that has invoked this function
    # (index=0 is always this very module, index=1 is fine as long as this function is not called by some other
    # function in this module)
    frame_info: FrameInfo = frame_stack[1]

    # if there are multiple calls in the stacktrace of this very module, we have to skip those and take the first
    # one which comes from another module
    if frame_info.filename == __file__:
        for frame in frame_stack:
            if frame.filename != __file__:
                frame_info = frame
                break

    # path of the module that has invoked this function
    caller_path: str = frame_info.filename

    # absolute path of the module that has invoked this function
    caller_absolute_path: str = os.path.abspath(caller_path)

    # get the top most directory path which contains the invoker module
    paths: [str] = [p for p in sys.path if p in caller_absolute_path]
    paths.sort(key=lambda p: len(p))
    caller_root_path: str = paths[0]

    if not os.path.isabs(caller_path):
        # file name of the invoker module (eg: "mymodule.py")
        caller_module_name: str = Path(caller_path).name

        # this piece represents a subpath in the project directory
        # (eg. if the root folder is "myproject" and this function has been called from myproject/foo/bar/mymodule.py
        # this will be "foo/bar")
        project_related_folders: str = caller_path.replace(os.sep + caller_module_name, '')

        # fix root path by removing the undesired subpath
        caller_root_path = caller_root_path.replace(project_related_folders, '')

    dir_name: str = Path(caller_root_path).name
    return dir_name
Here's my take on this issue.
I have a simple use-case that bugged me for a while. I tried a few solutions, but none of them felt flexible enough.
So here's what I figured out:
1. Create a blank Python file in the root dir -> I call this beacon.py (assuming that the project root is on the PYTHONPATH so the file can be imported).
2. Add a few lines to the module/class that needs the root path, which I call not_in_root.py here. This imports the beacon.py module and gets the path to that module.
Here's an example project structure
this_project
├── beacon.py
├── lv1
│   ├── __init__.py
│   └── lv2
│   ├── __init__.py
│   └── not_in_root.py
...
The content of not_in_root.py:
import os
from pathlib import Path


class Config:
    try:
        import beacon
        print(f"'import beacon' -> {os.path.dirname(os.path.abspath(beacon.__file__))}")  # only for demo purposes
        print(f"'import beacon' -> {Path(beacon.__file__).parent.resolve()}")  # only for demo purposes
    except ModuleNotFoundError as e:
        print(f"ModuleNotFoundError: import beacon failed with {e}. "
              f"Please create a file called beacon.py and place it in the project root directory.")

    project_root = Path(beacon.__file__).parent.resolve()
    input_dir = project_root / 'input'
    output_dir = project_root / 'output'


if __name__ == '__main__':
    c = Config()
    print(f"Config.project_root: {c.project_root}")
    print(f"Config.input_dir: {c.input_dir}")
    print(f"Config.output_dir: {c.output_dir}")
The output would be
/home/xyz/projects/this_project/venv/bin/python /home/xyz/projects/this_project/lv1/lv2/not_in_root.py
'import beacon' -> /home/xyz/projects/this_project
'import beacon' -> /home/xyz/projects/this_project
Config.project_root: /home/xyz/projects/this_project
Config.input_dir: /home/xyz/projects/this_project/input
Config.output_dir: /home/xyz/projects/this_project/output
Of course, it doesn't need to be called beacon.py, nor does it need to be empty; essentially any importable Python file would do, as long as it's in the root directory.
Using an empty .py file sort of guarantees that it will not be moved elsewhere due to some future refactoring.
Cheers
If you are working with anaconda-project, you can query the PROJECT_ROOT environment variable with os.getenv('PROJECT_ROOT'). This works only if the script is executed via anaconda-project run.
If you do not want your script run by anaconda-project, you can query the absolute path of the executable binary of the Python interpreter you are using and strip the path string up to (but excluding) the envs directory. For example, the Python interpreter of my conda env is located at:
/home/user/project_root/envs/default/bin/python
import os
import sys

# You can first retrieve the env variable PROJECT_DIR.
# If not set, get the python interpreter location and strip off the string from envs onwards...
if os.getenv('PROJECT_DIR'):
    PROJECT_DIR = os.getenv('PROJECT_DIR')
else:
    PYTHON_PATH = sys.executable
    path_rem = os.path.join('envs', 'default', 'bin', 'python')
    PROJECT_DIR = PYTHON_PATH.split(path_rem)[0]
This works only for an anaconda-project with the fixed project structure described above.
I ended up needing to do this in various situations where some of the other answers worked correctly, others didn't, or only did with modifications, so I made this package to work for most situations:
pip install get-project-root
from get_project_root import root_path
project_root = root_path(ignore_cwd=False)
# >> "C:/Users/person/source/some_project/"
https://pypi.org/project/get-project-root/
This is not exactly an answer to the question, but it might help someone. If you know the names of the folders, you can do this.
import os
import sys

TMP_DEL = '×'
PTH_DEL = '\\'


def cleanPath(pth):
    pth = pth.replace('/', TMP_DEL)
    pth = pth.replace('\\', TMP_DEL)
    return pth


def listPath():
    return sys.path


def getPath(__file__):
    return os.path.abspath(os.path.dirname(__file__))


def getRootByName(__file__, dirName):
    return getSpecificParentDir(__file__, dirName)


def getSpecificParentDir(__file__, dirName):
    pth = cleanPath(getPath(__file__))
    dirName = cleanPath(dirName)
    candidate = f'{TMP_DEL}{dirName}{TMP_DEL}'
    if candidate in pth:
        pth = (pth.split(candidate)[0] + TMP_DEL +
               dirName).replace(TMP_DEL * 2, TMP_DEL)
        return pth.replace(TMP_DEL, PTH_DEL)
    return None


def getSpecificChildDir(__file__, dirName):
    for x in [x[0] for x in os.walk(getPath(__file__))]:
        dirName = cleanPath(dirName)
        x = cleanPath(x)
        if TMP_DEL in x:
            if x.split(TMP_DEL)[-1] == dirName:
                return x.replace(TMP_DEL, PTH_DEL)
    return None
List available folders:
print(listPath())
Usage:
#Directories
#ProjectRootFolder/.../CurrentFolder/.../SubFolder
print(getPath(__file__))
# c:\ProjectRootFolder\...\CurrentFolder
print(getRootByName(__file__, 'ProjectRootFolder'))
# c:\ProjectRootFolder
print(getSpecificParentDir(__file__, 'ProjectRootFolder'))
# c:\ProjectRootFolder
print(getSpecificParentDir(__file__, 'CurrentFolder'))
# None
print(getSpecificChildDir(__file__, 'SubFolder'))
# c:\ProjectRootFolder\...\CurrentFolder\...\SubFolder
One-line solution
Hi all! I have been having this issue forever as well and none of the solutions worked for me, so I used an approach similar to what here::here() does in R.
Install the groo package: pip install groo-ozika
Place a hidden file in your root directory, e.g. .my_hidden_root_file.
Then, from anywhere lower in the directory hierarchy (i.e. within the root), run the following:
from groo.groo import get_root
root_folder = get_root(".my_hidden_root_file")
That's it!
It just executes the following function:
def get_root(rootfile):
    import os
    from pathlib import Path
    d = Path(os.getcwd())
    found = 0
    while found == 0:
        if os.path.isfile(os.path.join(d, rootfile)):
            found = 1
        else:
            d = d.parent
    return d
The project root directory does not have an __init__.py, so I solved this problem by looking for an ancestor directory that does not have one.
from functools import lru_cache
from pathlib import Path

@lru_cache()
def get_root_dir() -> str:
    path = Path().cwd()
    while Path(path, "__init__.py").exists():
        path = path.parent
    return str(path)
There are many answers here but I couldn't find something simple that covers all cases so allow me to suggest my solution too:
import pathlib
import os

def get_project_root():
    """
    There is no direct way in Python to get the project root. This function uses a trick.
    We know that the function that is currently running is in the project.
    We know that the project root path is in the PYTHONPATH list.
    Look for any path in the PYTHONPATH list that is contained in this function's path.
    Lastly we filter and take the shortest match, because we are looking for the root.
    :return: path to project root
    """
    apth = str(pathlib.Path().absolute())
    ppth = os.environ['PYTHONPATH'].split(':')
    matches = [x for x in ppth if x in apth]
    project_root = min(matches, key=len)
    return project_root
Important: This solution requires you to run the file as a module with python -m pkg.file and not as a script like python file.py.
import sys
import os.path as op
root_pkg_dirname = op.dirname(sys.modules[__name__.partition('.')[0]].__file__)
Other answers have requirements like depending on an environment variable or the position of another module in the package structure.
As long as you run the script as python -m pkg.file (with the -m), this approach is self-contained and will work in any module of the package, including in the top-level __init__.py file.
import sys
import os.path as op

root_pkg_name, _, _ = __name__.partition('.')
root_pkg_module = sys.modules[root_pkg_name]
root_pkg_dirname = op.dirname(root_pkg_module.__file__)

config_path = op.join(root_pkg_dirname, 'configuration.conf')
It works by taking the first component in the dotted string contained in __name__ and using it as a key in sys.modules which returns the module object of the top-level package. Its __file__ attribute contains the path we want after trimming off /__init__.py using os.path.dirname().

Optimal file structure organization of Python module unittests?

Sadly, I have observed that there are too many ways to keep your unit tests in Python, and they are not usually well documented.
I am looking for an "ultimate" structure, one would accomplish most of the below requirements:
be discoverable by test frameworks, including:
pytest
nosetests
tox
the tests should be outside the module files, in a different directory than the module itself (for maintainability), probably in a tests/ directory at package level.
it should be possible to just execute a test file on its own (the test must be able to find the module it is supposed to test)
Please provide a sample test file that does a fake test, specify filename and directory.
Here's the approach I've been using:
Directory structure
# All __init__.py files are empty in this example.
app
    package_a
        __init__.py
        module_a.py
    package_b
        __init__.py
        module_b.py
    test
        __init__.py
        test_app.py
    __init__.py
    main.py
main.py
# This is the application's front-end.
#
# The import will succeed if Python can find the `app` package, which
# will occur if the parent directory of app/ is in sys.path, either
# because the user is running the script from within that parent directory
# or because the user has included the parent directory in the PYTHONPATH
# environment variable.
from app.package_a.module_a import aaa
print aaa(123, 456)
module_a.py
# We can import a sibling module like this.
from app.package_b.module_b import bbb

def aaa(s, t):
    return '{0} {1}'.format(s, bbb(t))

# We can also run module_a.py directly, using Python's -m option, which
# allows you to run a module like a script.
#
# python -m app.package_a.module_a
if __name__ == '__main__':
    print aaa(111, 222)
    print bbb(333)
module_b.py
def bbb(s):
    return s + 1
test_app.py
import unittest

# From the point of view of testing code, our working modules
# are siblings. Imports work accordingly, as seen in module_a.
from app.package_a.module_a import aaa
from app.package_a.module_a import bbb

class TestApp(unittest.TestCase):

    def test_aaa(self):
        self.assertEqual(aaa(77, 88), '77 89')

    def test_bbb(self):
        self.assertEqual(bbb(99), 100)

# Similarly, we can run our test modules directly as scripts using the -m option,
# or using nose.
#
# python -m app.test.test_app
# nosetests app/test/test_app.py
if __name__ == '__main__':
    unittest.main()
