Python package structure with base classes

I am wondering if there is a way to do what I am trying, best explained with an example:
Contents of a.py:
class A(object):
    def run(self):
        print('Original')

class Runner(object):
    def run(self):
        a = A()
        a.run()
Contents of b.py:
import a

class A(a.A):
    def run(self):
        # Do something project-specific
        print('new class')

class Runner(a.Runner):
    def other_fcn_to_do_things(self):
        pass
Basically, I have a file with some base classes that I would like to use for a few different projects. What I would like is for b.Runner.run() to use the class A defined in b.py, without needing to override the run method. In the example above, I would like the code
import b
r = b.Runner()
print(r.run())
to print "new class". Is there any way to do that?

This seems a little convoluted. The Runner classes are probably unnecessary, unless there's something else more complex going on that was left out of your example. If you're set on not overriding the original run(), you could call it in another method in B. Please take a look at this post and this post on super().
It would probably make more sense to do something like this:
a.py:

class A(object):
    def run(self):
        # stuff
        print('Original')

b.py:

import a

class B(a.A):
    def run(self):
        return super(B, self).run()
        # can also do: return a.A.run(self)

    def run_more(self):
        super(B, self).run()
        # other stuff
        print('new class')
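
As a quick sanity check of the version above (assuming the module names a.py and b.py from the question):

import b

obj = b.B()
obj.run()        # prints 'Original' via the base class
obj.run_more()   # prints 'Original', then 'new class'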

Related

How does Python handle this common problem related to run-time polymorphism?

I am trying to execute the code below but I get errors.
class base:
    def callme(data):
        print(data)

class A(base):
    def callstream(self):
        B.stream(self)
    def callme(data):
        print("child ", data)

class B:
    def stream(data):
        # The statement below doesn't work, but I want it to, to achieve run-time
        # polymorphism, where the method call is not hardcoded to a certain class
        # reference.
        (base)data.callme("streaming data")
        # The statement below works, but it won't call the child class's overridden
        # method. I can use A.callme() to call the child class method, but then it
        # is hardcoded to A, which defeats the purpose. Any class (A, B, or XYZ)
        # that inherits from base should be able to read stream data from the
        # stream class. How do I achieve this in Python? Any class should be able
        # to read the stream data as long as it inherits from the base class; that
        # gives my stream class the generic ability to be used by any client class
        # as long as it inherits from base.
        #base.callme("streaming data")

def main():
    ob = A()
    ob.callstream()

if __name__ == "__main__":
    main()
I got the output you say you're looking for (in a comment rather than the question -- tsk, tsk) with the following code, based on the code in your question:
class base:
    def callme(self, data):
        print(data)

class A(base):
    def callstream(self):
        B.stream(self)
    def callme(self, data):
        print("child", data)

class B:
    @classmethod
    def stream(cls, data):
        data.callme("streaming data")

def main():
    ob = A()
    ob.callstream()

if __name__ == "__main__":
    main()
Basically, I just made sure the instance methods had self parameters, and since you seem to be using B.stream() as a class method, I declared it as such.
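For reference, running the corrected script prints:

child streaming data

because B.stream() receives the A instance as data, and data.callme("streaming data") dispatches to the overridden method on A.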

Python: Using import to add common functions into a class

I have an existing Python (v2.7) application that imports external .py files on the fly; these contain specifically named classes that process data. The external file loaded is chosen based on the type of post-processing of the data that is needed.
So I have this collection of classes, each in their own file. The files are named in a specific fashion based on the type of processing so that the main program knows what file to import from the upstream request.
Keep in mind that I and others are always tweaking these class files, but we cannot change the code of the main application.
What I would like to do is import a "template" of the common functions into the class scope, which can provide the standard set of controls that the main program expects without my needing to copy/paste them into each file. I hate it when I find a bug and make a correction in one of these main class I/O functions, which I then have to replicate in thirty-some other files.
Now, I understand from googling that my import here is bad... I get the message:
TestClassFile.py:5: SyntaxWarning: import * only allowed at module level
But this method is the only way I have found to import the functions so that they come into the namespace of the class itself. I have an example below...
What method (if any) is the appropriate way to do this in Python?
Example
main.py

import TestClassFile

print "New TestClass - Init"
oTest = TestClassFile.TestClass("foo")
print "Should run... Function A"
oTest.funcA()
print "Should run... Function b"
oTest.funcB()

TestClassFile.py

class TestClass:
    from TestClassImport import *

    def __init__(self, str):
        print "INIT! and be ... ", str

    def funcA(self):
        print "Function A"

TestClassImport.py

def funcB(self):
    print "Function B"
Much appreciated!
Update
Many thanks to everyone for the contributions. From researching mixins, these appear to be the proper Python way to extend a class.
TestClassImport.py

class ImportClass:
    def funcB(self):
        print "Function B"

TestClassFile.py

from TestClassImport import ImportClass

class TestClass(ImportClass):
    def __init__(self, str):
        print "INIT! and be ... ", str

    def funcA(self):
        print "Function A"
It sounds like you should make the imported functions into mixins, which you can inherit from. So:
TestClassImport.py

class ClassB(object):
    def funcB(self):
        print "Function B"

TestClassFile.py

from TestClassImport import ClassB
from OtherClassImport import ClassX

class TestClass(ClassB, ClassX):
    ...
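
With this arrangement funcB is found on ClassB through normal attribute lookup, so nothing needs to be copied in. A quick illustration, assuming TestClass keeps the __init__ from the question (OtherClassImport/ClassX are the hypothetical names from this answer):

oTest = TestClass("foo")
oTest.funcB()   # resolved on the ClassB mixin via the MRO, prints "Function B"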
This appears to work:
import types
from TestClassImport import funcB

class TestClass:
    def __init__(self, str):
        print "INIT! and be ... ", str
        setattr(self, 'funcB', types.MethodType(funcB, self, TestClass))

    def funcA(self):
        print "Function A"
When I run it I get the following output:
INIT! and be ... foo
Should run... Function A
Function A
Should run... Function b
Function B
I don't know if this is by any means a good solution, but you can write a function to construct a metaclass to dynamically add properties to your classes.
def make_meta(*import_classes):
    class TestMeta(type):
        def __new__(meta, name, bases, dct):
            new_class = super(TestMeta, meta).__new__(meta, name, bases, dct)
            for import_class in import_classes:
                for name in vars(import_class):
                    if not name.startswith('__'):
                        prop = getattr(import_class, name)
                        setattr(new_class, name, prop)
            return new_class
    return TestMeta

class TestClass:
    import TestClassImport
    __metaclass__ = make_meta(TestClassImport)

    # other functions defined as normal...
This will add everything in the global scope of TestClassImport.py that doesn't start with '__' as a property on TestClass.
Or, you can use a class decorator to add properties dynamically in the same fashion.
def add_imports(*import_classes):
    def augment_class(cls):
        for import_class in import_classes:
            for name in vars(import_class):
                if not name.startswith('__'):
                    prop = getattr(import_class, name)
                    setattr(cls, name, prop)
        return cls
    return augment_class

import TestClassImport

@add_imports(TestClassImport)
class TestClass:
    # normal class body
    pass
But mixins do seem like a better approach.
You can use importlib for this, e.g.:
import importlib
import types

class TestClass:
    def __init__(self, module_name):
        _tmp = importlib.import_module(module_name)
        for elem in dir(_tmp):
            if not elem.startswith('_'):
                prop = getattr(_tmp, elem)
                # bind functions to this instance so they receive self when called
                if callable(prop):
                    prop = types.MethodType(prop, self)
                setattr(self, elem, prop)

    def funcA(self):
        print("function A")

tc = TestClass('some_module')
tc.funcB()   # prints "function B"
With this approach, you can create a load_module(module_name) method instead of doing the work in __init__(), so that modules can be loaded independently of each other (e.g. to prevent name collisions).
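A minimal sketch of that variant (load_module is a hypothetical name, some_module is the same placeholder as above, and the same instance-binding caveat applies):

import importlib
import types

class TestClass:
    def load_module(self, module_name):
        # attach every public name from the module to this instance
        mod = importlib.import_module(module_name)
        for elem in dir(mod):
            if not elem.startswith('_'):
                prop = getattr(mod, elem)
                if callable(prop):
                    prop = types.MethodType(prop, self)
                setattr(self, elem, prop)

tc = TestClass()
tc.load_module('some_module')
tc.funcB()   # prints "function B"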

Passing an object through multiple classes neatly

I have a logfile object that I would like to effectively be omnipresent in the rest of my code so that it can pick up comments throughout the code. How can I go about coding this into a nice OOP format?
At the moment I sort of instantiate it in the first class it's used in and then pass it to every other class when they are instantiated (I haven't even started to think about trying to hand the log back once I've finished using a class). This clearly seems unnecessarily messy!
class LogFile(object):
    def __init__(self):
        self.log = []

    def write_log(self, data):
        self.log.append(data)

class A(object):
    def __init__(self):
        self.logger = LogFile()
        self.do_some_stuff(stuff)

    def do_some_stuff(self, stuff):
        ...
        b = B(self.logger)
        b.do_some_more_stuff(stuff)
        self.logger.write_log(stuff)

class B(object):
    def __init__(self, logger):
        self.logger = logger

    def do_some_more_stuff(self, stuff):
        ...
        ...
        self.logger.write_log(more_stuff)

item = A()
item.do_some_stuff(stuff)
Thanks
Although I usually don't like to use singletons, for things like logging they are the easiest solution I know of.
The simplest way to implement one in Python is a module-level variable. In the module mylogger.py:
class LogFile(object):
    def __init__(self):
        self.log = []

    def write_log(self, data):
        self.log.append(data)

logger = LogFile()
And in another module:
from mylogger import logger

class A(object):
    def do_some_stuff(self, stuff):
        ...
        logger.write_log(data)
Python's standard logging module uses a similar approach: the root logger is defined as a module-level variable.
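A sketch of the same pattern using the standard logging module instead of the hand-rolled LogFile (the logger name and message are illustrative):

import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

class A(object):
    def do_some_stuff(self, stuff):
        # every module that calls getLogger(...) shares the same logging tree
        logger.info("did some stuff: %s", stuff)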

partial function overload in python

I wasn't sure how to name the title, so if anyone knows the specific name, please correct me.
Here is my situation:
class Abs(object):
    def run(self, var):
        raise NotImplementedError

class Sub1(Abs):
    def run(self, var):
        var = get_var()
        # Sub1 RUN

class Sub2(Abs):
    def run(self, var):
        var = get_var()
        # Sub2 RUN
So as you can see, I have two classes that inherit from the "interface" class, and each has its own run function. Even though the run functions differ between the two, there is some similar code, as you can see in the example. Is there any way to write the common part in the "interface" class in order to not repeat it twice?
Write the common part and put it in a new method in the base class. Consider giving it a name beginning with an underscore (the Python equivalent of protected access). Call it from each place that needs it.
class Abs(object):
    def run(self, var):
        raise NotImplementedError

    def _common_things(self, var):
        pass

class Sub1(Abs):
    def run(self, var):
        self._common_things(var)
        # etc

class Sub2(Abs):
    def run(self, var):
        self._common_things(var)
        # etc
Interfaces are not supposed to implement code, but there are no real interfaces in Python, so it doesn't matter.
You can put var = get_var() in the base class (Abs) and then the two subclasses would look like this:
class Abs(object):
    def run(self, var):
        var = get_var()

class Sub1(Abs):
    def run(self, var):
        super(Sub1, self).run(var)
        # Sub1 run

class Sub2(Abs):
    def run(self, var):
        super(Sub2, self).run(var)
        # Sub2 run
super() will call the code in the base class (the "interface").

Passing different values to decorators for unit tests in Python

I have a situation where I'm trying to modify the arguments passed to a decorator on one of my class methods. The code looks something like this:
class MyClass(object):
    @tryagain(retries=3)
    def mymethod(self, arg):
        ... do stuff ...
My problem is I'd like to alter the "retries" variable to something less than 3 when running my unit tests, but keep it at "3" for the production code. Unfortunately, it doesn't look like I can do something like this:
@tryagain(retries=self.retries)
def mymethod(self, arg):
    ... do stuff ...
or
@tryagain(retries=MyClass.retries)
def mymethod(self, arg):
    ... do stuff ...
because the class isn't defined at the point the arguments are passed to the decorator (as near as I can tell).
I also tried to add the variable within the module like so:
retries = 1

class MyClass(object):
    @tryagain(retries=retries)
    def mymethod(self, arg):
        ... do stuff ...
but then I can't seem to modify the value of "retries" from within my unit tests. Is there another way to accomplish what I'm trying to do?
I assume you are trying to reduce the number of retries to increase test speed.
If so, modifying the retries variable doesn't seem to be the best approach. Instead, you could unit test the function mymethod without the decorator first, and then create a mock function of mymethod. Let's call it mock_mymethod: decorate it with @tryagain and test whether the logic of tryagain actually works.
Check the mock module to see how to create a mock instance; this article about mock is also worth reading.
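A rough sketch of that idea, assuming a tryagain that simply invokes the wrapped function retries times (like the simple example at the end of this thread); Mock and call_count come from the mock library:

from mock import Mock  # unittest.mock in Python 3

mock_mymethod = Mock()
decorated = tryagain(retries=3)(mock_mymethod)
decorated("some arg")

# the decorator, not the test, is now responsible for the repetition
assert mock_mymethod.call_count == 3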
You could use an environment variable, set from your calling code (it might be good to provide a default here):
import os
# ...

class MyClass(object):
    @tryagain(retries=int(os.environ['project_num_retries']))
    def mymethod(self, arg):
        print("mymethod")
Or use a "globals"-type module, for example: project_settings.py containing:
num_retries = 3
Then
import project_settings

class MyClass(object):
    @tryagain(retries=project_settings.num_retries)
    def mymethod(self, arg):
        print("mymethod")
But I'm not sure decorating your code with test information is how you really should go about it -- what about:
class MyClass(object):
    def mymethod(self, arg):
        print("mymethod")
Then in something like unittests.py:
DEV_TESTS = True  # Change to False for production
num_retries = 3 if not DEV_TESTS else 1

import <your class>

class UnitTests():
    def __init__(self):
        self.c = <your_class>.MyClass()

    @tryagain(retries=num_retries)
    def test_mymethod(self):
        self.c.mymethod("Foo")

t = UnitTests()
t.test_mymethod()
If you were so inclined, this unittests.py could be used with something like python's unittest package with:
DEV_TESTS = True  # Change to False for production
num_retries = 3 if not DEV_TESTS else 1

import unittest
import <your class>

class UnitTests(unittest.TestCase):
    def setUp(self):
        self.c = <your class>.MyClass()

    @tryagain(retries=num_retries)
    def test_mymethod(self):
        self.c.mymethod("Foo")
Note, I used the following simple example of a @tryagain decorator; yours may be more complicated and require some tuning of the examples:
def tryagain(retries):
    def wrap(f):
        def wrapped_f(*args, **kwargs):
            for _ in xrange(retries):
                f(*args, **kwargs)
        return wrapped_f
    return wrap
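
A retry decorator in real use would usually swallow failures between attempts and re-raise only after the last one; a hedged sketch along those lines (Python 2, to match the example above):

def tryagain(retries):
    def wrap(f):
        def wrapped_f(*args, **kwargs):
            for attempt in xrange(retries):
                try:
                    return f(*args, **kwargs)  # stop at the first success
                except Exception:
                    if attempt == retries - 1:
                        raise  # out of retries, propagate the failure
        return wrapped_f
    return wrap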