I wasn't sure what to title this question, so if anyone knows the specific term for it, please correct me.
Here is my situation:
class Abs(object):
    def run(self, var):
        raise NotImplementedError

class Sub1(Abs):
    def run(self, var):
        var = get_var()
        # Sub1 RUN

class Sub2(Abs):
    def run(self, var):
        var = get_var()
        # Sub2 RUN
So as you can see, I have two classes that inherit from the "interface" class, and each has a different run method. Even though the run methods differ between the two, they share some code (var = get_var() in the example). Is there any way to write the common part in the "interface" class so that I don't repeat it twice?
Write the common part and put it in a new method in the base class. Consider giving it a name beginning with an underscore (the Python equivalent of protected access). Call it from each place that needs it.
class Abs(object):
    def run(self, var):
        raise NotImplementedError

    def _common_things(self, var):
        pass

class Sub1(Abs):
    def run(self, var):
        self._common_things(var)
        # etc

class Sub2(Abs):
    def run(self, var):
        self._common_things(var)
        # etc
Interfaces are not supposed to implement code, but Python has no real interfaces, so it doesn't matter.
You can put var = get_var() in the base class (Abs), and then the other two methods would look like this:
def run(self, var):
    super(Sub1, self).run(var)
    # Sub1 run

def run(self, var):
    super(Sub2, self).run(var)
    # Sub2 run
super() will call the code in the base class (the "interface")
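Put together, the base-class approach might look like the sketch below (get_var() here is just a hypothetical stand-in for however the value is really obtained, and the return values are only there so the example runs):
def get_var():
    # hypothetical helper standing in for the real lookup
    return 42

class Abs(object):
    def run(self, var):
        var = get_var()  # the shared code lives here
        return var

class Sub1(Abs):
    def run(self, var):
        var = super(Sub1, self).run(var)
        # Sub1-specific work goes here
        return var

class Sub2(Abs):
    def run(self, var):
        var = super(Sub2, self).run(var)
        # Sub2-specific work goes here
        return var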
I'm writing a feature generation class that should be extendable. For example, in the following snippet any method whose name starts with generate is a feature generation method:
class FeatureGenerator:
    def __init__(self):
        self.generate_a()
        self.generate_b()

    def method_a(self): pass
    def generate_a(self): pass  # do stuff
    def generate_b(self): pass  # do stuff
I want to execute every method with the generate prefix from within __init__, but I don't want to add each one manually every time I write a new method. One solution could be a decorator that adds the method to a list and then executes everything in that list inside __init__, but I'm not sure that's a good idea. Is there a pythonic way to do this?
Using dir() on the instance:
class FeatureGenerator:
    def __init__(self):
        for name in dir(self):
            attr = getattr(self, name)
            if callable(attr) and name.startswith("generate"):
                attr()

    def method_a(self): pass
    def generate_a(self): print("gen a")
    def generate_b(self): print("gen b")
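So simply constructing the object runs every generator; dir() returns names in sorted order, so generate_a runs before generate_b:
fg = FeatureGenerator()
# gen a
# gen b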
I'm trying to pass a function dynamically to another class, as shown below:
class simulator(object):
    def __init__(self, fn_):
        print(self.test(fn_))

    def test(self, fn):
        return fn(self, 20)

class t(object):
    s = 'def get_fitness(x, y):\n return x+y'
    exec(s)

    def fnGetFitness(self, genes):
        return get_fitness(genes, 10)

    simulator(fnGetFitness)

t()
but I get the error below:
File "N:/Job/GA/mine/dyn.py", line 25, in fnGetFitness
return get_fitness(genes, 10)
NameError: name 'get_fitness' is not defined
I guess it's something related to scopes, but I can't figure it out. Any ideas?
EDIT:
Here is a simpler piece of code showing the problem:
class t(object):
    def __init__(self):
        exec('def get_fitness(x, y):\n return x+y')
        print(get_fitness(2,3))

t()
This has nothing to do with exec. What you're doing is equivalent (with the safety removed) to:
class t(object):
    def get_fitness(x, y):
        return x+y
Your method definition is at class level (inside t), but not on the simulator class. simulator(fnGetFitness) calls fnGetFitness outside of the t class context, so it doesn't know about your new function. That cannot work (also, get_fitness should be decorated as @staticmethod, because it doesn't take a self parameter).
What does work is to define the function, dynamically or not, at global level so the class can call it:
s = 'def get_fitness(x, y):\n return x+y'
exec(s)

class t(object):
    def fnGetFitness(self, genes):
        return get_fitness(genes, 10)

    simulator(fnGetFitness)

t()
That fixed it, but honestly I'm puzzled about the purpose (it already took me a while to figure out how to make something run from your code).
EDIT: a simpler, somewhat different (and exec-related) piece of code has been posted in the comments:
class t(object):
    def __init__(self):
        exec('def get_fitness(x, y):\n return x+y')
        print(get_fitness(2,3))

t()
This raises NameError: name 'get_fitness' is not defined.
Now this does have to do with exec. When __init__ is compiled, get_fitness isn't known, because the compiler didn't see it as a local variable, even though at execution time it is set in the locals() dictionary by exec (related: why is 'ord' seen as an unassigned variable here?).
A workaround is to fetch the function from the local variables, like this:
class t(object):
    def __init__(self):
        exec('def get_fitness(x, y):\n return x+y')
        print(locals()["get_fitness"](2,3))

t()
This works and prints 5.
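Another workaround (just a sketch, not from the original answer) is to give exec an explicit namespace dictionary and look the function up there, which avoids relying on locals():
class t(object):
    def __init__(self):
        ns = {}
        exec('def get_fitness(x, y):\n return x+y', ns)  # the definition lands in ns
        print(ns["get_fitness"](2, 3))  # also prints 5

t()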
I am wondering if there is a way to do what I am trying to do, which is best explained with an example:
Contents of a.py:
class A(object):
    def run(self):
        print('Original')

class Runner(object):
    def run(self):
        a = A()
        a.run()
Contents of b.py:
import a

class A(a.A):
    def run(self):
        # Do something project-specific
        print('new class')

class Runner(a.Runner):
    def other_fcn_to_do_things(self):
        pass
Basically, I have a file with some base classes that I would like to use for a few different projects. What I would like is for b.Runner.run() to use the class A defined in b.py, without needing to override the run method. In the example above, I would like the following
import b

r = b.Runner()
print(r.run())
to print "new class". Is there any way to do that?
This seems a little convoluted. The Runner classes are probably unnecessary, unless there's something more complex going on that was left out of your example. If you're set on not overriding the original run(), you could call it from another method in B. Please take a look at this post and this post on super().
It would probably make more sense to do something like this:
a.py:
class A(object):
    def run(self):
        # stuff
        print('Original')
b.py:
import a

class B(a.A):
    def run(self):
        return super(B, self).run()
        # can also do: return a.A.run(self)

    def run_more(self):
        super(B, self).run()
        # other stuff
        print('new class')
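With those two files in place, a quick check might look like this (a sketch, assuming b.py defines B as above):
import b

obj = b.B()
obj.run()       # prints 'Original' via the base class
obj.run_more()  # prints 'Original', then 'new class'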
I'm having a problem passing a value/object from one class to another. My code is below. When I invoke mainFunction(), all the calls work fine, but it fails at self.stub.call_method(), and I'm not sure why. When I declare the stub as a global variable and use that instead, it works perfectly fine. Bar() is another class that has remote invocation methods, one of which is call_method(). Any help with this will be greatly appreciated.
To clarify the failure: the call reaches call_method(), which performs an HTTP invocation, and that HTTP invocation fails and throws an HTTP exception.
class Command(object):
    def dosomething(self):
        return 0

class Connect(Command):
    def __init__(self, value1, stub):
        self.value1 = value1
        self.stub = stub

    def dosomething(self):
        self.stub.call_method()

class Osclass(object):
    def __init__(self, val1):
        self.val1 = val1
        stub = Bar(val1)  # Bar provides the remote invocation methods
        self.stub = stub

    def activemethod(self):
        return Connect(self.val1, self.stub)

def mainFunction(val1, Osclass):
    ret_value = Osclass.activemethod()
    ret_value.dosomething()
I have a situation where I'm trying to modify the arguments passed to a decorator on one of my class methods. The code looks something like this:
class MyClass(object):
    @tryagain(retries=3)
    def mymethod(self, arg):
        ...  # do stuff
My problem is I'd like to alter the "retries" variable to something less than 3 when running my unit tests, but keep it at "3" for the production code. Unfortunately, it doesn't look like I can do something like this:
@tryagain(retries=self.retries)
def mymethod(self, arg):
    ...  # do stuff
or
@tryagain(retries=MyClass.retries)
def mymethod(self, arg):
    ...  # do stuff
because the class isn't defined at the point the arguments are passed to the decorator (as near as I can tell).
I also tried to add the variable within the module like so:
retries = 1

class MyClass(object):
    @tryagain(retries=retries)
    def mymethod(self, arg):
        ...  # do stuff
but then I can't seem to modify the value of "retries" from within my unit tests. Is there another way to accomplish what I'm trying to do?
I assume you are trying to reduce the number of retries to speed up your tests.
If so, modifying the retries variable doesn't seem to be the best approach. Instead, you could unit test the function mymethod without the decorator first, then create a mock of mymethod (call it mock_mymethod), decorate it with @tryagain, and test whether the logic of tryagain actually works.
Check the mock module to see how to create a mock instance; this article about mock is also worth reading.
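A minimal sketch of that idea, assuming your tryagain retries when the wrapped function raises (the decorator below is only a stand-in for your real one):
from unittest import mock

def tryagain(retries):
    # stand-in decorator: retry until the call stops raising or attempts run out
    def wrap(f):
        def wrapped_f(*args, **kwargs):
            for attempt in range(retries):
                try:
                    return f(*args, **kwargs)
                except Exception:
                    if attempt == retries - 1:
                        raise
        return wrapped_f
    return wrap

# mock_mymethod fails once, then succeeds
mock_mymethod = mock.Mock(side_effect=[ValueError("boom"), "ok"])
decorated = tryagain(retries=2)(mock_mymethod)

assert decorated() == "ok"
assert mock_mymethod.call_count == 2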
You could use an environment variable, set from your calling code (it might be good to provide a default here too; a sketch of that follows below):
import os
# ...

class MyClass(object):
    @tryagain(retries=int(os.environ['project_num_retries']))
    def mymethod(self, arg):
        print("mymethod")
Or use a "globals"-type module, for example: project_settings.py containing:
num_retries = 3
Then
import project_settings

class MyClass(object):
    @tryagain(retries=project_settings.num_retries)
    def mymethod(self, arg):
        print("mymethod")
But I'm not sure decorating your code with test information is how you really should go about it -- what about:
class MyClass(object):
    def mymethod(self, arg):
        print("mymethod")
Then in something like unittests.py:
DEV_TESTS = True  # Change to False for production
num_retries = 3 if not DEV_TESTS else 1

import <your class>

class UnitTests():
    def __init__(self):
        self.c = <your_class>.MyClass()

    @tryagain(retries=num_retries)
    def test_mymethod(self):
        self.c.mymethod("Foo")

t = UnitTests()
t.test_mymethod()
If you were so inclined, this unittests.py could be used with something like python's unittest package with:
DEV_TESTS = True  # Change to False for production
num_retries = 3 if not DEV_TESTS else 1

import unittest
import <your class>

class UnitTests(unittest.TestCase):
    def setUp(self):
        self.c = <your class>.MyClass()

    @tryagain(retries=num_retries)
    def test_mymethod(self):
        self.c.mymethod("Foo")
Note, I used the following simple example of a @tryagain decorator; yours may be more complicated and require some tuning of the examples:
def tryagain(retries):
    def wrap(f):
        def wrapped_f(*args, **kwargs):
            for _ in range(retries):  # xrange(retries) on Python 2
                f(*args, **kwargs)
        return wrapped_f
    return wrap
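To see it in action (a quick sketch; note this simple version just calls the wrapped function retries times, unconditionally):
@tryagain(retries=3)
def greet():
    print("hello")

greet()
# hello
# hello
# hello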