I have the following code:
import time

class output_letter():
    def output_a(self):
        print('a')
    def output_b(self):
        print('b')
    def output_c(self):
        print('c')
    .............
    def output_z(self):
        print('z')

class wait_time():
    def sleep_2(self):
        time.sleep(2)

out = output_letter()
out.output_a()
I would like to tack the functionality of wait_time.sleep_2 onto the beginning of every method of output_letter. As an example, if I were to create a hypothetical class combined, I could do
c = combined()
c.output_a()  # would wait 2 seconds and then print 'a'
The problem is that I don't want to rewrite all the functions of output_letter; I just want to add the same functionality to all of them.
It's not clear to me how to do this, or whether it's even possible with inheritance or composition.
I see two separate goals here, in the spirit of the DRY and SOLID principles:
It should be really easy to apply wait_time in many other places in the future.
It would be nice to keep output_letter as clean and untouched as possible.
So here's my idea:
1. Create a module that lets you apply a decorator to all the methods of a class:
def for_all_methods(decorator):
    def decorate(cls):
        for attr in cls.__dict__:
            if callable(getattr(cls, attr)):
                setattr(cls, attr, decorator(getattr(cls, attr)))
        return cls
    return decorate
2. Isolate wait_time in a module as well:
from time import sleep

class wait_time():
    @staticmethod
    def sleep_decorator(function):
        def wrapper(*args, **kwargs):
            sleep(2)  # this line was commented out in the original demo so it ran instantly
            print("sleeping")
            return function(*args, **kwargs)
        return wrapper
3. This is how you can use it
@for_all_methods(wait_time.sleep_decorator)
class output_letter():
    def output_a(self):
        print('a')
    def output_b(self):
        print('b')
    def output_c(self):
        print('c')
    def output_z(self):
        print('z')
So the benefit: we are not touching constructors or changing the inheritance hierarchy. All we did was add a decorator above the class, which is easy to enable or disable when necessary.
If you want to go even further and avoid dealing with the original class directly, you can create a child class combined that inherits from the original and put the decorator above its definition; a sketch of that variant follows.
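A minimal sketch of that variant, assuming the output_letter class from the question is in scope. Note one adaptation: to pick up inherited methods, the helper has to walk dir(cls) rather than cls.__dict__, so this version differs slightly from the one above:
from time import sleep

def for_all_methods(decorator):
    """Apply `decorator` to every non-dunder method reachable on the class."""
    def decorate(cls):
        for attr in dir(cls):              # dir() also sees inherited methods
            if attr.startswith('__'):      # skip dunders such as __init__
                continue
            value = getattr(cls, attr)
            if callable(value):
                setattr(cls, attr, decorator(value))
        return cls
    return decorate

def sleep_decorator(function):
    def wrapper(*args, **kwargs):
        sleep(2)
        return function(*args, **kwargs)
    return wrapper

@for_all_methods(sleep_decorator)
class combined(output_letter):   # output_letter itself stays untouched
    pass

c = combined()
c.output_a()  # waits 2 seconds, then prints 'a'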
The code below accomplishes what you want. You can set class variables in the combined class that hold instances of output_letter and wait_time; then, inside the combined class, you have access to all the attributes and methods of the other classes.
import time

class output_letter():
    def output_a(self):
        print('a')
    def output_b(self):
        print('b')
    def output_c(self):
        print('c')

class wait_time():
    def sleep_2(self):
        time.sleep(2)

class combined():
    ol = output_letter()
    wt = wait_time()

    def output_a(self):
        self.wt.sleep_2()
        self.ol.output_a()

def main():
    c = combined()
    c.output_a()

if __name__ == '__main__':
    main()
You can either create an instance of wait_time as an attribute of output_letter, or call time.sleep in each method itself:
Option 1:
import time

class wait_time():
    def sleep_2(self):
        time.sleep(2)

class output_letter():
    def __init__(self):
        self.wait = wait_time()
    def output_a(self):
        self.wait.sleep_2()
        print('a')
    def output_b(self):
        self.wait.sleep_2()
        print('b')
    def output_c(self):
        self.wait.sleep_2()
        print('c')
Option 2:
import time

class output_letter():
    def output_a(self):
        time.sleep(2)
        print('a')
    def output_b(self):
        time.sleep(2)
        print('b')
    def output_c(self):
        time.sleep(2)
        print('c')
Edit: regarding your recent comment and edits, you may want to merely create two instances and call each:
a = output_letter()
t = wait_time()
t.sleep_2()
a.output_a()
Also, it seems that each method simply prints the letter at the end of its own name. To shorten your code and add the wait_time behaviour as well, you can use __getattr__:
class Output_Letter:
    def __init__(self):
        self.t = wait_time()
    def __getattr__(self, name):
        def wrapper():
            self.t.sleep_2()
            print(name.split('_')[-1])
        return wrapper

a = Output_Letter()
a.output_c()
Output:
c
You can subclass multiple classes. In Python 3:
Given
import time

class Letter():
    """Print a letter."""
    def output_a(self):
        print('a')
    def output_b(self):
        print('b')
    def output_c(self):
        print('c')

class WaitTime():
    """Set a delay."""
    def sleep_2(self):
        time.sleep(2)
Code
class Combined(Letter, WaitTime):
    """Combine methods."""
    def output_a(self):
        super().sleep_2()
        super().output_a()

combined = Combined()
combined.output_a()
# 'a'  # delayed output
This allows you to keep your original classes. You simply call them in the new Combined class.
Details
The Combined class mixes the other classes together. The super() call delegates to these parent classes, looking up their methods according to the method resolution order (MRO):
Combined.__mro__
# (__main__.Combined, __main__.Letter, __main__.WaitTime, object)
In order to work in Python 2, some adjustments are required:
class Letter(object):
    ...

class WaitTime(object):
    ...

class Combined(Letter, WaitTime):
    def output_a(self):
        super(Combined, self).sleep_2()
        super(Combined, self).output_a()
Minor note: for readability, use CamelCase names for classes. Class names are typically nouns; functions are usually verbs.
None of the classes you've shown in your example are very good OOP design. There's no need for a class when your methods don't refer to any state. If you just want a namespace, usually a module is better.
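As a quick illustration of that point (letters.py is just a hypothetical module name):
# letters.py -- a plain module already gives you a namespace; no class required
def output_a():
    print('a')

def output_b():
    print('b')

# elsewhere:
#   import letters
#   letters.output_a()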
But that doesn't matter too much if your question is really about how to combine the effects of multiple functions. There are many ways to do that. Functions are first-class objects in Python, so you can put them in lists or dictionaries, or pass them as arguments to other functions if you want. You can also make "function factories", functions that return new functions.
Here's how you might be able to use that to build the delayed letter writing functions you want. I'm storing the function results in a couple of dictionaries, but you could do something different with them if you need to.
import itertools
import time

def letter_writer_factory(letter):
    """Return a function that prints the provided letter when it is called"""
    def func():
        print(letter)
    return func

def delay_factory(seconds):
    """Return a function that, when called, waits for the specified time"""
    def delay():
        time.sleep(seconds)
    return delay

def function_sequencer_factory(*functions):
    """Return a function that calls each function argument in turn"""
    def caller():
        for func in functions:
            func()
    return caller

letters = 'abc'
delay_times = [2, 5, 10]

output_funcs = {c: letter_writer_factory(c) for c in letters}
delay_funcs = {t: delay_factory(t) for t in delay_times}
delayed_output_funcs = {(c, t): function_sequencer_factory(df, of)
                        for (c, of), (t, df) in itertools.product(output_funcs.items(),
                                                                  delay_funcs.items())}

# print 'c' after waiting for 10 seconds:
delayed_output_funcs['c', 10]()
This is of course a pretty silly example, but these kinds of functional programming techniques can be used to do some actually useful things in some contexts.
Related
I'm currently working on redesigning a class to be under an abstract base class. The current class has a method func that does some logic for two things, say A and B.
(note that all the code below is very simplified. There's a lot more functionality than what is shown)
class current_class:
    def func(self):
        # does stuff for A
        # does stuff for B
During logic A, it loads a large dataset into a dictionary, say, dataset and later dataset.keys() is used for logic B, but other than that, A and B are independent of each other.
I will create an alternate class, say, another_class that is similar to current_class, but this class doesn't need B and only needs A. So something like
class another_class:
    def func(self):
        # does stuff for A
Both will then sit under an abstract base class base. Since both subclasses involve A, I plan on creating a method in the base class that does A, say func_A. But I'm having trouble figuring out the best way to approach this so that the method signatures conform, without having to reload dataset for B.
If another_class also needed the logic for B, I think we can just return dataset.keys() from func_A and use it in func_B, but another_class doesn't.
So I don't know if there's a good way to conform this without having different signatures for the methods.
So in code, I have the following two ideas:
1)
class base:
    @abstractmethod
    def func(self):
        pass

    def func_A(self):
        # does stuff for A and gets the dataset
        return dataset.keys()

class current_class(base):
    def func_B(self, keys):
        # does stuff for B

    def func(self):
        keys = self.func_A()
        self.func_B(keys)

class another_class(base):
    def func(self):
        _ = self.func_A()  # the return value is unused...
2)
class base:
    @abstractmethod
    def func(self):
        pass

class current_class(base):
    def func_A(self):
        # does stuff for A and gets the dataset
        return dataset.keys()

    def func_B(self, keys):
        # does stuff for B

    def func(self):
        keys = self.func_A()
        self.func_B(keys)

class another_class(base):
    def func_A(self):
        # does the same stuff as current_class.func_A, but doesn't return anything

    def func(self):
        self.func_A()
I don't like the first design because func_A only needs to return something for one of the subclasses and not for all of them. I also don't like the second design because we have to separately implement func_A in each inherited class even though they're identical methods, except one needs to return something and the other doesn't.
It's not a big deal to ignore the return value of a function that is primarily called for its side effects. Just define func_A once in the base class and let both child classes use it as appropriate to their needs.
from abc import ABC, abstractmethod

class Base(ABC):
    @abstractmethod
    def func(self):
        pass

    def func_A(self):
        # does stuff for A and gets the dataset
        return dataset.keys()

class Child1(Base):
    def func_B(self, keys):
        # does stuff for B

    def func(self):
        keys = self.func_A()
        self.func_B(keys)

class Child2(Base):
    def func(self):
        self.func_A()
If there is more in func_A that isn't necessary for Child2, then it should of course be split up to avoid doing unnecessary work in Child2.func (see the sketch after this paragraph). But simply returning a value is not in any way time- or space-intensive, and should not be a concern.
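For example, here is a minimal sketch of such a split; the helper name _load_dataset and the placeholder data are my own illustration, not from the original code:
from abc import ABC, abstractmethod

class Base(ABC):
    @abstractmethod
    def func(self):
        ...

    def _load_dataset(self):
        # the part of A that both children need (placeholder data here)
        return {"x": 1, "y": 2}

    def func_A(self):
        # the extra part of A that only Child1 cares about: expose the keys
        return self._load_dataset().keys()

class Child1(Base):
    def func_B(self, keys):
        print("B sees keys:", list(keys))

    def func(self):
        self.func_B(self.func_A())

class Child2(Base):
    def func(self):
        self._load_dataset()   # only needs the loading side effect

Child1().func()   # prints: B sees keys: ['x', 'y']
Child2().func()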
I'm learning Python design patterns from the GitHub repo faif/python-patterns and found that the chain_of_responsibility example implements the abstract method check_range as a staticmethod.
My question is: is there any benefit other than not having to type self?
Simplified code:
from abc import ABC, abstractmethod

class A(ABC):
    @abstractmethod
    def foo(self, x):
        pass

class B(A):
    @staticmethod
    def foo(x):
        print("B.foo", x)

# both of these work
B.foo(1)
b = B()
b.foo(2)
There's no particular benefit beyond that. None of the check_range methods needs an instance of the handler class that defines it, so they are declared as static methods.
But there's no reason any of the classes in your linked page need to exist in the first place, because in Python you can just store a function itself in a list, rather than storing some other dummy object that has an equivalent method.
Here's how you can implement "chain of responsibility" in Python idiomatically.
def check_range0(request):
    return request in range(10)

# A closure can be used in place of a class
def make_check_range1():
    start = 10
    stop = 20
    return lambda request: request in range(start, stop)

# Another way of using a closure in place of a class
def make_check_range2(start, stop):
    return lambda request: request in range(start, stop)

def fallback(request):
    print(f"No handler for {request}")

handlers = [check_range0, make_check_range1(), make_check_range2(20, 30)]

requests = [2, 5, 14, 22, 18, 3, 35, 27, 20]  # example input; not defined in the original snippet
for request in requests:
    if any((handler := f)(request) for f in handlers):
        print(f"{request} handled by {handler.__name__}")
    else:
        fallback(request)
A simple list takes the place of the linked list implied by Handler. Each subclass of Handler is replaced by a regular function (or a function that returns a closure, just to emphasize that a class is not necessary to provide or store state). The any function implements the iteration provided by Handler.handle.
If you really want a handler class, you can define it more simply than the example.
from abc import ABC

class Handler(ABC):
    def __init__(self, handlers=None):
        if handlers is None:
            handlers = []
        self.handlers = handlers
        self.fallback = lambda request: None

    def add_handler(self, f):
        self.handlers.append(f)

    # Barely necessary; you can set the fallback
    # attribute on a Handler instance yourself.
    def set_fallback(self, f):
        self.fallback = f

    def handle(self, request):
        # An alternative to any()
        for h in self.handlers:
            if h(request):
                break
        else:
            self.fallback(request)
h = Handler([check_range0])
h.add_handler(make_check_range1())
h.add_handler(make_check_range2(20, 30))

def fallback(request):
    print(f"No handler for {request}")

h.fallback = fallback
# h.set_fallback(fallback)

for request in requests:
    h.handle(request)
I have a class with a static method which is called multiple times by other methods. For example:
class A:
    def __init__(self):
        return

    @staticmethod
    def one():
        return 1

    def two(self):
        return 2 * A.one()

    def three(self):
        return 3 * A.one()
Method one is a utility function that belongs inside the class but isn't logically an attribute of the class or the class instance.
If the name of the class were to be changed from A to B, do I have to explicitly change every call to method one from A.one() to B.one()? Is there a better way of doing this?
I pondered this question once upon a time and, while I agree that using a refactoring utility is probably the best way to go, as far as I can tell it is technically possible to achieve this behaviour in two ways:
Declare the calling method a classmethod, so it can use cls.one().
Use the __class__ attribute, i.e. self.__class__.one(). This leads to rather messy code, and may well be deemed unsafe or inefficient for reasons I am not aware of.
class A:
    def __init__(self):
        return

    @staticmethod
    def one():
        return 1

    @classmethod
    def two(cls):
        return 2 * cls.one()

    def three(self):
        return 3 * self.__class__.one()

a = A()
print(a.two())
print(a.three())
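As a small side note (my own illustration, continuing from the class A above): both forms also pick up an override in a subclass, because neither hard-codes the class name:
class B(A):
    @staticmethod
    def one():
        return 10   # overridden utility

b = B()
print(b.two())    # 20, since cls is B and B.one() is used
print(b.three())  # 30, since self.__class__ is B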
I have a class which maintains a list of functions. These functions are just objects sitting in a queue and every so often the class pops one off and executes it. However, there are times when I would like to print out this list, and I'm imagining code as follows:
for function in self.control_queue:
    print(function.summarize())
    if function.ready():
        function()
In other words, I would like to call methods named summarize() and ready(), which I want to define somewhere, on these function objects. Also, I would like to be able to toss anonymous functions onto this queue, i.e., generate everything dynamically.
You can make it a class and define __call__:
class MyClass():
    def summarize(self):
        # summarize stuff
        pass
    def ready(self):
        # ready stuff
        pass
    def __call__(self):
        # put the code here, for when you call the instance
        pass
How you run it:
function = MyClass()
print(function.summarize())
if function.ready():
    function()
You have a couple possible approaches.
You could add the definitions to functions.
def foo():
    pass

# later..
foo.summarize = lambda: "To pair with bar"
foo.ready = lambda: True
You could create class objects to wrap the function operation.
class Func():
    def summarize(self):
        return "Function!"
    def ready(self):
        return True  # or whatever readiness state applies
    def __call__(self):
        # Act as a function
        pass
Or you can have helper functions that inspect the function's name or attributes:
def summarize_func(func):
    return func.__name__  # Or branch here on specific names/attributes

def ready_func(func):
    return True  # Or branch on names/attributes
Finally, to accommodate anonymous functions, you can check for the presence of these attributes and fall back gracefully if they are absent. Then you can combine the above approaches into something that will work on any function.
def summarize_func(func):
    if hasattr(func, 'summarize'):
        return func.summarize()
    else:
        # Note this will just be '<lambda>' for anonymous funcs
        return func.__name__

def ready_func(func):
    if hasattr(func, 'ready'):
        return func.ready()
    else:
        return True
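Putting the helpers together with the loop from the question, here is a small sketch that assumes foo from above and uses an anonymous lambda as a second queue entry:
control_queue = [foo, lambda: print("anonymous work")]

for function in control_queue:
    print(summarize_func(function))   # uses func.summarize() if present, else func.__name__
    if ready_func(function):
        function()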
One option is to implement function as a class instance:
class Function(object):
    def summarize(self): pass  # some relevant code here
    def __call__(self): pass   # and there
and use it later with
function = Function()
With the __call__ magic method implemented, this object becomes callable, just like a function.
For sure, you can assign attributes to functions, but it is rather obscure and counterintuitive:
>>> def summ(a): return sum(a)
...
>>> def function(a): return a
...
>>> function.sum=summ
>>> function.sum([1,2,3])
6
I have a few dozen classes. Here are two of them:
class Class_A(ClassABC):
    def __init__(self):
        super().__init__()

    def from_B(self, b):
        # do stuff

    def from_C(self, c):
        # do stuff

    # ...

    def to_B(self):
        rt = Class_B()
        rt.from_A(self)
        return rt

    def to_C(self):
        rt = Class_C()
        rt.from_A(self)
        return rt

    # ...

class Class_B(ClassABC):
    def __init__(self):
        super().__init__()

    def from_A(self, a):
        # do stuff

    def from_C(self, c):
        # do stuff

    def to_A(self):
        rt = Class_A()
        rt.from_B(self)
        return rt

    def to_C(self):
        rt = Class_C()
        rt.from_B(self)
        return rt

    # ...

# class Class_C, Class_D, Class_E, etc.
and here is the ABC:
import abc

class ClassABC(metaclass=abc.ABCMeta):
    @abc.abstractmethod
    def __init__(self):
        # do stuff
The problem I have is that all the to_* methods in the subclasses follow the same exact pattern, and it becomes tedious to implement them. I would like to automatically generate them in ClassABC if possible, but so far I have failed. I also tried creating a class decorator for the subclasses, but that didn't work either. I have, however, managed to auto-generate the methods in each subclass using exec(), but I'd rather have the ABC generate them or use class decorators. Is there a way to do this?
Note: all the classes are in their own separate module
First of all, make sure your to_* methods explicitly include self at the beginning of the argument list so they can use it in the method body. Secondly, I'd go with something akin to JBernardo's suggestion:
def to_class(self, cls):
    rt = cls()
    rt.from_class(self)
    return rt

def from_class(self, other):
    # do stuff, with different things happening based on what class other is;
    # this may be a good place to use a dict of functions keyed by class (or class name)
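A minimal sketch of that dict-based dispatch, showing how Class_B from the question might look; the converters dict and the payload attribute are hypothetical names used only for illustration:
class Class_B(ClassABC):
    def __init__(self):
        super().__init__()
        self.payload = None

    def _from_A(self, a):
        # convert from a Class_A instance
        self.payload = ('converted from A', a.payload)

    def _from_C(self, c):
        # convert from a Class_C instance
        self.payload = ('converted from C', c.payload)

    def from_class(self, other):
        # dict of converter functions keyed by the source class
        converters = {Class_A: self._from_A, Class_C: self._from_C}
        converters[type(other)](other)

    def to_class(self, cls):
        rt = cls()
        rt.from_class(self)
        return rt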