Python: Run all functions in all python scripts in a folder dynamically - python

main_folder/
    __init__.py
    sub_folder1/
        __init__.py
        run_all_functions.py
    sub_folder2/
        __init__.py
        script1.py
        script2.py
        script3.py

script1.py
def something():
    ....
def something2():
    ....

script2.py
def example():
    ....
def example2():
    ....

script3.py
def example3():
    ....
I would like to check whether there is a way to run all the functions in the different scripts in sub_folder2 and consolidate them in run_all_functions.py in sub_folder1, dynamically. I might add more scripts with functions to sub_folder2 in the near future.

Provided that you add a run_all() entry-point function to each .py file in the directory with your scripts, you can load all of these files and call their respective run_all() functions.
Here is a little demonstration; adjust DIR.
import importlib.util
import pathlib

DIR = '/tmp/test'

for pyfile in pathlib.Path(DIR).glob('*.py'):  # or perhaps 'script*.py'
    spec = importlib.util.spec_from_file_location(f"{__name__}.imported_{pyfile.stem}", pyfile)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    module.run_all()
Without a per-module run_all(), you must use some kind of introspection to find all the functions to be run.
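For example, here is a minimal sketch of that introspection approach using inspect.getmembers; the directory path and the assumption that every function takes no arguments are mine, not part of the question:
import importlib.util
import inspect
import pathlib

DIR = '/tmp/test'  # hypothetical path; point this at sub_folder2

for pyfile in pathlib.Path(DIR).glob('*.py'):
    spec = importlib.util.spec_from_file_location(f"imported_{pyfile.stem}", pyfile)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    # collect only functions defined in this module, not ones it imported
    for name, func in inspect.getmembers(module, inspect.isfunction):
        if func.__module__ == module.__name__:
            print(f"calling {pyfile.stem}.{name}")
            func()  # assumes every function can be called with no arguments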

Related

Why doesn't my __init__.py work in the module?

I am trying to add 'project_root' to __init__.py so that all modules can use it, but it doesn't work.
Environment: Python 3.7.0, macOS Mojave
file structure
.
├── __init__.py
└── a.py
the code in the __init__.py file:
import os, sys
project_root = os.path.dirname(os.path.dirname(os.path.realpath(__file__)))
sys.path.append(project_root)
and in the other file, a.py:
print(project_root)
If I run python a.py in the same dir, or run python a.py from outside the dir, the error is the same:
Traceback (most recent call last):
File "a.py", line 1, in <module>
print(project_root)
NameError: name 'project_root' is not defined
My question is why it doesn't work and how to fix it.
Another question: if you want to share some variables with other modules in the same package, how do you do it?
Let us try to understand by example.
Code and directory explanation:
Suppose we have the following directory and file structure:
dir_1
├── __init__.py
└── a.py
b.py
__init__.py contains:
import sys,os
# Just to make things clear
print("Print statement from init")
project_root = os.path.dirname(os.path.dirname(os.path.realpath(__file__)))
sys.path.append(project_root)
a.py contains:
def func():
    print("func from a.py")
Let us start importing things:
Suppose you start with having the following code in b.py:
from dir_1.a import func
func()
Executing the above will give the following output:
Print statement from init
func from a.py
So, from the above, we understand that the print statement from __init__.py is being executed. Now, let's add print(project_root) in b.py:
from dir_1.a import func
func()
print(project_root)
Executing the above will result in an error saying:
...
NameError: name 'project_root' is not defined
This happens because the print statement in __init__.py does not need to be imported; it simply gets executed when the package is imported. But that is not the case for a variable.
Let's try to import the variable and see what happens:
from dir_1.a import func
from dir_1 import project_root
func()
print(project_root)
Executing the above file will give the following output:
Print statement from init
func from a.py
/home/user/some/directory/name/dir_1
Long story short, you need to import the variable defined in __init__.py
or anywhere else in order to use it.
Hope this helps : )
You have to import the variable:
from dir import project_root
where dir is the name of the package directory you are in.
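A minimal sketch of how that looks with the layout above; the package name pkg is hypothetical (the question does not show it), and a.py has to be run as part of the package, e.g. python -m pkg.a from the parent directory, so that __init__.py actually executes:
# pkg/__init__.py
import os, sys

project_root = os.path.dirname(os.path.dirname(os.path.realpath(__file__)))
sys.path.append(project_root)

# pkg/a.py
from pkg import project_root  # or: from . import project_root

print(project_root)
Running python a.py directly still fails, because then the package (and its __init__.py) is never imported at all.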

How can I get module name automatically from __init__.py or conftest.py?

I am running multiple tests in a tests package, and I want to print each module name in the package, without duplicating code.
So, I wanted to insert some code into __init__.py or conftest.py that would give me the executing module name.
Let's say my test modules are called: checker1, checker2, etc...
My directory structure is like this:
tests_dir/
├── __init__.py
├── conftest.py
├── checker1
├── checker2
└── checker3
So, inside __init__.py I tried inserting:
import os

def module_name():
    return os.path.splitext(__file__)[0]
But it still gives me __init__.py from each file when I call it.
I also tried using a fixture inside conftest.py, like:
@pytest.fixture(scope='module')
def module_name(request):
    return request.node.name
But it seems as if I still need to define a function inside each module to get module_name as a parameter.
What is the best method of getting this to work?
Edit:
In the end, what I did is explained here:
conftest.py
@pytest.fixture(scope='module', autouse=True)
def module_name(request):
    return request.node.name
An example of a test file with a test function; the same needs to be added to each file and every function:
checker1.py
from conftest import *
def test_columns(expected_res, actual_res, module_name):
    expected_cols = expected_res.columns
    actual_cols = actual_res.columns
    val = expected_cols.difference(actual_cols)  # verify all expected cols are in actual_cols
    if not val.empty:
        log.error('[{}]: Expected columns are missing: {}'.format(module_name, val.values))
    assert val.empty
Notice the module_name fixture I added to the function's parameters.
expected_res and actual_res are pandas DataFrames read from an Excel file.
log is a Logger object from the logging package.
In each module (checker1, checker2, checker3, conftest.py), at module level, execute
print(__name__)
When the __init__.py file imports those modules, it should print each module's name.
Based on your comment, you can perhaps modify the behaviour in the __init__.py file for local imports.
__init__.py
import sys, os
sys.path.append(os.path.split(__file__)[0])

def my_import(module):
    print("Module name is {}".format(module))
    exec("import {}".format(module))
testerfn.py
print(__name__)
print("Test")
Directory structure
tests_dir/
├── __init__.py
└── testerfn.py
Command to test
import tests_dir
tests_dir.my_import("testerfn")
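As an alternative sketch (my own, not from the question): since pytest injects conftest.py fixtures by name, the from conftest import * line is unnecessary, and request.module gives the test module object directly, so its name is available to every test without adding code to each file:
# conftest.py
import pytest

@pytest.fixture(scope='module')
def module_name(request):
    # request.module is the module object the current test was collected from
    return request.module.__name__

# checker1.py
def test_something(module_name):
    print(module_name)  # e.g. 'checker1' (possibly package-qualified)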

How to access symbols in __init__.py from __main__.py?

I have a module - let's call it foo - and I want to make it usable via a python -m foo call. My program looks like this:
my_project
├── foo
│   └── __init__.py
└── my_program.py
In __init__.py I have some code which I run when calling python -m foo:
def bar(name):
    print(name)

# -- code used to 'run' the module
def main():
    bar("fritz")

if __name__ == "__main__":
    main()
Since I have a fair amount of execution code in __init__.py now (argparse stuff and some logic) I want to separate it into a __main__.py:
my_project
├── foo
│   ├── __init__.py
│   └── __main__.py
└── my_program.py
Although this looks very simple to me, I haven't managed to import the stuff located in __init__.py from __main__.py yet.
I know that if foo is located in site-packages or is accessible via PYTHONPATH, I can just import foo.
But in case I want to execute __main__.py directly (e.g. from some IDE) with foo located anywhere (i.e. not a folder where Python looks for packages) - is there a way to import foo (__init__.py from the same directory)?
I tried import . and import foo, but both approaches fail (because they just mean something else, of course).
What I can do - at least to explain my goal - is something like this:
sys.path.append(os.path.join(os.path.dirname(__file__), ".."))
import foo
It works, but it is ugly and a bit dangerous, since I don't even know if I really import foo from the same directory.
You can manually set the module import state as if __main__.py were executed with -m:
# foo/__main__.py
import os
import sys

if __package__ is None and __name__ == "__main__":  # executed without -m
    # set special attributes as if part of the package
    __file__ = os.path.abspath(__file__)
    __package__ = os.path.basename(os.path.dirname(__file__))
    # replace import path for __main__ with path for package
    main_path = os.path.dirname(__file__)
    try:
        index = sys.path.index(main_path)
        if index not in (0, 1):
            raise ValueError('expected script directory after current directory or matching it')
    except ValueError:
        raise RuntimeError('sys.path does not include script directory as expected')
    else:
        sys.path[index] = os.path.dirname(main_path)

# import regularly
from . import bar
This exploits the fact that python3 path/to/foo/__main__.py executes __main__ as a standalone script: __package__ is None and __name__ does not include the package either. The search path in this case is <current directory>, <__main__ directory>, ..., though the two entries get collapsed if they are the same: the index is either 0 or 1.
As with all trickery on internals, there is some transient state where invariants are violated. Do not perform any imports before the module is patched!
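For comparison, when foo is importable normally (installed, or run as python -m foo from my_project), the split the question asks for could look roughly like this; bar comes from the question, the rest is a sketch:
# foo/__init__.py
def bar(name):
    print(name)

# foo/__main__.py
from foo import bar  # or: from . import bar

def main():
    bar("fritz")

if __name__ == "__main__":
    main()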

Optimal file structure organization of Python module unittests?

Sadly, I have observed that there are too many ways to keep your unit tests in Python, and they are not usually well documented.
I am looking for an "ultimate" structure, one would accomplish most of the below requirements:
be discoverable by test frameworks, including:
pytest
nosetests
tox
the tests should live outside the module files, in a different directory than the module itself (for maintainability), probably in a tests/ directory at package level.
it should be possible to just execute a test file (the test must be able to find the module it is supposed to test)
Please provide a sample test file that does a fake test, specify filename and directory.
Here's the approach I've been using:
Directory structure
# All __init__.py files are empty in this example.
app/
    package_a/
        __init__.py
        module_a.py
    package_b/
        __init__.py
        module_b.py
    test/
        __init__.py
        test_app.py
    __init__.py
    main.py
main.py
# This is the application's front-end.
#
# The import will succeed if Python can find the `app` package, which
# will occur if the parent directory of app/ is in sys.path, either
# because the user is running the script from within that parent directory
# or because the user has included the parent directory in the PYTHONPATH
# environment variable.
from app.package_a.module_a import aaa

print(aaa(123, 456))
module_a.py
# We can import a sibling module like this.
from app.package_b.module_b import bbb

def aaa(s, t):
    return '{0} {1}'.format(s, bbb(t))

# We can also run module_a.py directly, using Python's -m option, which
# allows you to run a module like a script.
#
# python -m app.package_a.module_a
if __name__ == '__main__':
    print(aaa(111, 222))
    print(bbb(333))
module_b.py
def bbb(s):
    return s + 1
test_app.py
import unittest

# From the point of view of testing code, our working modules
# are siblings. Imports work accordingly, as seen in module_a.
from app.package_a.module_a import aaa
from app.package_a.module_a import bbb

class TestApp(unittest.TestCase):

    def test_aaa(self):
        self.assertEqual(aaa(77, 88), '77 89')

    def test_bbb(self):
        self.assertEqual(bbb(99), 100)

# Similarly, we can run our test modules directly as scripts using the -m option,
# or using nose.
#
# python -m app.test.test_app
# nosetests app/test/test_app.py
if __name__ == '__main__':
    unittest.main()

How to import classes defined in __init__.py

I am trying to organize some modules for my own use. I have something like this:
lib/
    __init__.py
    settings.py
    foo/
        __init__.py
        someobject.py
    bar/
        __init__.py
        somethingelse.py
In lib/__init__.py, I want to define some classes to be used if I import lib. However, I can't seem to figure out how to do it without separating the classes into their own files and importing them in __init__.py.
Rather than say:
lib/
    __init__.py
    settings.py
    helperclass.py
    foo/
        __init__.py
        someobject.py
    bar/
        __init__.py
        somethingelse.py
from lib.settings import Values
from lib.helperclass import Helper
I want something like this:
lib/
    __init__.py  # Helper defined in this file
    settings.py
    foo/
        __init__.py
        someobject.py
    bar/
        __init__.py
        somethingelse.py
from lib.settings import Values
from lib import Helper
Is it possible, or do I have to separate the class into another file?
EDIT
OK, if I import lib from another script, I can access the Helper class. How can I access the Helper class from settings.py?
The example here describes Intra-Package References. I quote: "submodules often need to refer to each other". In my case, lib/settings.py needs Helper and lib/foo/someobject.py needs access to Helper, so where should I define the Helper class?
'lib/'s parent directory must be in sys.path.
Your 'lib/__init__.py' might look like this:
from . import settings  # or just 'import settings' on old Python versions

class Helper(object):
    pass
Then the following example should work:
from lib.settings import Values
from lib import Helper
Answer to the edited version of the question:
__init__.py defines how your package looks from outside. If you need to use Helper in settings.py then define Helper in a different file e.g., 'lib/helper.py'.
.
|-- import_submodule.py
`-- lib
    |-- __init__.py
    |-- foo
    |   |-- __init__.py
    |   `-- someobject.py
    |-- helper.py
    `-- settings.py

2 directories, 6 files
The command:
$ python import_submodule.py
Output:
settings
helper
Helper in lib.settings
someobject
Helper in lib.foo.someobject
# ./import_submodule.py
import fnmatch, os
from lib.settings import Values
from lib import Helper

print
for root, dirs, files in os.walk('.'):
    for f in fnmatch.filter(files, '*.py'):
        print "# %s/%s" % (os.path.basename(root), f)
        print open(os.path.join(root, f)).read()
        print

# lib/helper.py
print 'helper'

class Helper(object):
    def __init__(self, module_name):
        print "Helper in", module_name

# lib/settings.py
print "settings"
import helper

class Values(object):
    pass

helper.Helper(__name__)

# lib/__init__.py
#from __future__ import absolute_import
import settings, foo.someobject, helper

Helper = helper.Helper

# foo/someobject.py
print "someobject"
from .. import helper

helper.Helper(__name__)

# foo/__init__.py
import someobject
If lib/__init__.py defines the Helper class, then in settings.py you can use:
from . import Helper
This works because . refers to the package the current module belongs to; it acts as a synonym for the lib package from the point of view of the settings module. Note that it is not necessary to export Helper via __all__.
(Confirmed with Python 2.7.10, running on Windows.)
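A minimal sketch of that layout (the file contents are mine); one thing to watch out for is ordering: if lib/__init__.py also imports settings, Helper must be defined before that import runs, otherwise the name does not exist yet when settings asks for it:
# lib/__init__.py
class Helper(object):
    pass

from . import settings  # safe: Helper is already defined at this point

# lib/settings.py
from . import Helper

class Values(object):
    pass

print(Helper)  # the class defined in lib/__init__.py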
You just put them in __init__.py.
So with test/classes.py being:
class A(object): pass
class B(object): pass
... and test/__init__.py being:
from .classes import *
class Helper(object): pass
You can import test and have access to A, B and Helper
>>> import test
>>> test.A
<class 'test.classes.A'>
>>> test.B
<class 'test.classes.B'>
>>> test.Helper
<class 'test.Helper'>
Add something like this to lib/__init__.py
from .helperclass import Helper
now you can import it directly:
from lib import Helper
Edit, since I misunderstood the question:
Just put the Helper class in __init__.py. That's perfectly Pythonic. It just feels strange coming from languages like Java.
Yes, it is possible. You might also want to define __all__ in __init__.py files. It's a list of the names (submodules or other objects) that will be imported when you do
from lib import *
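For instance, a sketch of what that could look like in lib/__init__.py, reusing the names from above:
# lib/__init__.py
class Helper(object):
    pass

# names pulled in by 'from lib import *'; submodules listed here are imported too
__all__ = ['Helper', 'settings', 'foo']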
Maybe this could work:
import __init__ as lib
