Python abstract factory: two classes use different method params

I need to know whether this situation is correct.
We have an abstract class with one method that requires 2 parameters:
from abc import ABC, abstractmethod

class Base(ABC):
    @abstractmethod
    def class_method(self, param1, param2):
        pass
Then I have to implement 3 classes. Two of them use param1 and param2 in their method, but one of them only uses param1; I don't need param2 inside it at all!
class ClassA(Base):
    def class_method(self, param1, param2):
        return param1 + param2

class ClassB(Base):
    def class_method(self, param1, param2):
        return param1 + param2

class ClassC(Base):
    def class_method(self, param1, param2):
        return param1
Is this a correct implementation? What's the best way to manage the unused param2 in the ClassC method?
I also tried defining param2 as optional:
class Base(ABC):
    @abstractmethod
    def class_method(self, param1, param2=None):
        pass
But the question is still the same.

Your first example is correct. Whether or not ClassC uses the required parameter is irrelevant, as long as it accepts an argument. Consider this example:
lst: list[Base] = [ClassA(), ClassB(), ClassC()]
for obj in lst:
    obj.class_method(x, y)
From a static typing perspective, you don't know what the actual runtime values in lst are, only that they are instances of (subclasses of) Base, and so obj.class_method must accept two arguments.
If you make an argument optional as in your second example, then a subclass cannot turn around and require it, for the same reasons. The following is correct given the static type of lst: no use of class_method should require a second argument.
lst: list[Base] = [ClassA(), ClassB(), ClassC()]
for obj in lst:
    obj.class_method(x)
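To make this concrete, here is a hypothetical subclass (not from the question) that turns around and requires the second argument; the one-argument call above would then fail for that element of the list:
class BadSubclass(Base):
    def class_method(self, param1, param2):  # param2 required again: breaks substitutability
        return param1 + param2

# BadSubclass().class_method(x) raises
# TypeError: class_method() missing 1 required positional argument: 'param2'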
Note this is in the context of conforming to the Liskov substitution principle. For all ABC cares about, the following is "correct":
class ClassD(Base):
    class_method = None
You can assign anything you want to class_method, as long as it's not another abstract method, and you'll be able to instantiate ClassD without a problem. Whether that instance behaves properly doesn't matter.
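A quick sketch of that last point, using the Base from the question:
d = ClassD()            # instantiates fine: ABCMeta only checks that the abstract name was overridden
# d.class_method(1, 2)  # would fail at call time: TypeError: 'NoneType' object is not callable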

Related

What is the difference between self.function and function?

Assume I have a class CLASS and I create a method fnc in it:
def CLASS():
    def __init__(self, arg):
        # initialize
        self.arg = arg

    def fnc(self):
        print(self.arg)
If I want (in my class) to call fnc in a method prnt, I can do it in two different ways: one using self.fnc and one using just fnc:
def CLASS():
    def __init__(self, arg):
        # initialize
        self.arg = arg

    def fnc(self):
        print(self.arg)

    def prnt(self):
        self.fnc()  # Method one
        fnc()       # Method two
Both seem to work. I do know about the self argument and how it works, but I do not understand the difference between the two method/function calls.
This should not work. It only works because CLASS in your code is not actually a class: you used the keyword def instead of class to define it.
What you actually did is define a function CLASS which, when executed, defines some other functions.
To correct this, declare your class like this:
class CLASS:
Then your second call will raise a NameError, because fnc does not exist in the scope of your method.
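A minimal sketch of the corrected version, using the names from the question:
class CLASS:
    def __init__(self, arg):
        self.arg = arg

    def fnc(self):
        print(self.arg)

    def prnt(self):
        self.fnc()  # method one: attribute lookup on the instance, works
        fnc()       # method two: plain name lookup, raises NameError

CLASS('hello').prnt()  # prints 'hello', then raises NameError: name 'fnc' is not defined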

Checking of **kwargs in concrete implementation of abstract class method. Interface issue?

I am trying to implement the Strategy design pattern to create an interface for an underlying algorithm to be implemented in a modular fashion.
Currently, as per code below, I have one top-level/parent abstract class (ParentAbstractStrategy) that defines the base interface for the strategy method.
I also have a one-level-down from this abstract class (ChildAbstractStrategy).
The reason I have two abstract classes is because of the attributes they need to hold; see the __init__ methods.
ChildAbstractStrategy is a special case of ParentAbstractStrategy in that it stores an additional attribute: attr2.
Otherwise its interface is identical, as seen by identical strategy method signatures.
Sometimes, I want to be able to directly subclass ParentAbstractStrategy and implement the strategy method (see ConcreteStrategyA), but other times I want to be able to subclass ChildAbstractStrategy, because the extra attribute is required (see ConcreteStrategyB).
An additional complication is that in some subclasses of either abstract class I want to be able to handle additional arguments in the strategy method. This is why I have added **kwargs to all signatures of the strategy method, so that I can pass in whatever additional arguments I want to a subclass, on a case-by-case basis.
This creates the last problem: these extra arguments are not optional in the subclasses. E.g. in the strategy method of ConcreteStrategyB I want to be certain that the caller passed in a third argument.
I'm basically abusing **kwargs to provide what probably should be positional arguments (since I can't give them sane defaults and need their existence to be enforced).
This current solution of using **kwargs for "method overloading" in subclasses feels really messy, and I'm not sure if this means there is a problem with the class inheritance scheme or interface design, or both.
Is there a way I can achieve these design goals in a cleaner fashion? It feels like I'm missing something big-picture here, and maybe the class/interface design is bad. Should I perhaps create two disjoint abstract classes with different signatures for the strategy method?
import abc


class ParentAbstractStrategy(metaclass=abc.ABCMeta):

    @abc.abstractmethod
    def __init__(self, attr1):
        self.attr1 = attr1

    @abc.abstractmethod
    def strategy(self, arg1, arg2, **kwargs):
        raise NotImplementedError


class ChildAbstractStrategy(ParentAbstractStrategy, metaclass=abc.ABCMeta):

    @abc.abstractmethod
    def __init__(self, attr1, attr2):
        super().__init__(attr1)
        self.attr2 = attr2

    @abc.abstractmethod
    def strategy(self, arg1, arg2, **kwargs):
        raise NotImplementedError


class ConcreteStrategyA(ParentAbstractStrategy):

    def __init__(self, attr1):
        super().__init__(attr1)

    def strategy(self, arg1, arg2, **kwargs):
        print(arg1, arg2)


class ConcreteStrategyB(ChildAbstractStrategy):

    def __init__(self, attr1, attr2):
        super().__init__(attr1, attr2)

    def strategy(self, arg1, arg2, **kwargs):
        print(arg1, arg2)
        arg3 = kwargs.get("arg3", None)
        if arg3 is None:
            raise ValueError("Missing arg3")
        else:
            print(arg3)
Here's an interpreter session demonstrating how it's currently working:
>>> a = ConcreteStrategyA(1)
>>> a.attr1
1
>>> a.strategy("a", "b")
a b
>>> b = ConcreteStrategyB(1, 2)
>>> b.attr1
1
>>> b.attr2
2
>>> b.strategy("a", "b")
a b
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/space/strategy.py", line 42, in strategy
    raise ValueError("Missing arg3")
ValueError: Missing arg3
>>> b.strategy("a", "b", arg3="c")
a b
c
Answering my own question.
My usage of **kwargs is 'bad' in this scenario. Why?
As far as I can tell, **kwargs is typically used for:
Wrapper functions, e.g. decorators.
Collecting extra keyword arguments to a function that the function knows about (e.g. see usage in https://matplotlib.org/api/_as_gen/matplotlib.pyplot.plot.html?highlight=plot#matplotlib.pyplot.plot). In this scenario the **kwargs are optional arguments that can be passed into the function and they have sane default values.
Making **kwargs required in a function call defeats their purpose; positional arguments that must be explicitly supplied should be used instead. That way the caller has to explicitly satisfy the interface provided by the function.
There is another problem with using **kwargs in an interface as I have: it involves the LSP (Liskov Substitution Principle, see https://en.wikipedia.org/wiki/Liskov_substitution_principle). The current implementation abuses **kwargs in an attempt to define a variable interface for the strategy method among subclasses. Although the function signatures of all the strategy methods match syntactically, semantically the interfaces are different. This violates the LSP, which requires that any descendant of ParentAbstractStrategy can be treated the same when considering its interface, e.g. that I can treat the strategy methods of ConcreteStrategyA and ConcreteStrategyB the same.
What was my solution?
I have changed the interface of the strategy method to no longer include **kwargs and instead use a mix of positional arguments and keyword arguments with default values.
E.g. if ConcreteStrategyB still needs a third argument arg3 but ConcreteStrategyA does not, I could change the classes to look like this:
class ConcreteStrategyA(ParentAbstractStrategy):

    def __init__(self, attr1):
        super().__init__(attr1)

    def strategy(self, arg1, arg2, arg3=None):
        print(arg1, arg2)


class ConcreteStrategyB(ChildAbstractStrategy):

    def __init__(self, attr1, attr2):
        super().__init__(attr1, attr2)

    def strategy(self, arg1, arg2, arg3=None):
        print(arg1, arg2)
        assert arg3 is not None
        print(arg3)
With the interfaces of both parent classes changed to match.
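A quick usage sketch under the revised signatures (assuming both abstract strategy declarations were likewise changed to strategy(self, arg1, arg2, arg3=None)):
a = ConcreteStrategyA(1)
b = ConcreteStrategyB(1, 2)

a.strategy("a", "b")            # arg3 is simply ignored by A
b.strategy("a", "b", arg3="c")  # arg3 is now an explicit, discoverable parameter
b.strategy("a", "b")            # fails B's assert: arg3 is still effectively required there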

What would be a more pythonic solution to this problem?

I have the following structure for my classes:
class foo(object):
    def __call__(self, param1):
        pass

class bar(object):
    def __call__(self, param1, param2):
        pass
I have many classes of this type, and I am using these callable classes as follows.
classes = [foo(), bar()]
for C in classes:
    res = C(param1)
    '''here I want to put a condition: if the class takes 1 argument just pass 1
    parameter, otherwise pass two.'''
I have thought of one pattern like this:
class abc():
    def __init__(self):
        self.param1 = 'xyz'
        self.param2 = 'pqr'

    def something(self, classes):  # classes = [foo(), bar()]
        for C in classes:
            if C.__class__.__name__ in ['bar']:
                res = C(self.param1, self.param2)
            else:
                res = C(self.param2)
But in the above solution I have to maintain a list of the classes that take two arguments, and as I add more classes to the file this will become messy.
I don't know whether this is the correct (pythonic) way to do it.
One more idea I have in mind is to check how many arguments the class takes: if it's 2, pass an additional argument, otherwise pass 1 argument. I have looked at this solution: How can I find the number of arguments of a Python function? But I am not confident that this is the best-suited solution to my problem.
A few things about this:
There are only two types of classes in my use case: one taking 1 argument and one taking 2.
Both classes take the same first argument, so param1 is the same argument I am passing in both cases. In the case of the class with two required parameters, I am passing an additional argument (param2) containing some data.
PS: Any help or new ideas for this problem are appreciated.
UPD : Updated the code.
Basically, you want to use polymorphism on your object's __call__() method, but you have an issue with your callables signature not being the same.
The plain simple answer to this is: you can only use polymorphism on compatible types, which in this case means that your callables MUST have compatible signatures.
Fortunately, there's a quick and easy way to solve this: just modify your method signatures so they accept varargs and kwargs:
class Foo(object):
    def __call__(self, param1, *args, **kw):
        pass

class Bar(object):
    def __call__(self, param1, param2, *args, **kw):
        pass
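With that change, the calling loop from the question no longer needs any type checks; the extra argument is simply swallowed by *args in Foo (a sketch, with param1 and param2 standing in for the question's values):
param1, param2 = 'xyz', 'pqr'
for C in [Foo(), Bar()]:
    res = C(param1, param2)  # Foo ignores param2 via *args, Bar receives both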
For the case where you can't change the callable's signature, there's still a workaround: use a lambda as proxy:
def func1(y, z):
    pass

def func2(x):
    pass

callables = [func1, lambda y, z: func2(y)]
for c in callables:
    c(42, 1138)
Note that this last example is actually known as the adapter pattern
Unrelated: this:
if C.__class__.__name__ in ['bar']:
is an inefficient and convoluted way to write:
if C.__class__.__name__ == 'bar':
which is itself an inefficient, convoluted AND brittle way to write:
if type(C) is bar:
which, by itself, is a possible design smell (there are legit use cases for checking the exact type of an object, but most often this is really a design issue).

What is the recommended approach to initialize a class with many optional properties in Python?

I'm a newbie to Python and I'm looking for a way to write a class with many optional properties that can be initialized in a single line of code.
What I've found so far is the approach of optional parameters to the __init__ method, which can be assigned by name:
class MyClass():
    def __init__(self, param1=None, param2=None, param3=None, param4=None):
        self.param1 = param1
        self.param2 = param2
        self.param3 = param3
        self.param4 = param4
And the initialization can look like:
o1 = MyClass(param2="someVal", param4="someOtherVal")
This approach seems fine, plus I think an IDE like IntelliJ will be able to supply some code completion in this style.
However, I wonder if this is the right approach in case I create a complex class hierarchy. Let's say that MyClass inherits from MyBaseClass and MyOtherClass, and I wish to initialize all the properties from all the classes through the __init__ method of MyClass (or in some other way?).
What's the best way to accomplish this, hopefully with still helping the IDE to be able to provide code completion?
Thanks for the help.
A class with so many optional arguments in its constructor is probably a "code smell" and a redesign is in order. I suggest writing some parameter group classes:
class ParamSetA:
    def __init__(self, param2="someVal", param4="someOtherVal"):
        self.param2 = param2
        self.param4 = param4

class ParamSetB:
    def __init__(self, param3, param4):
        self.param3 = param3
        self.param4 = param4

class MyClass():
    def __init__(self, params):
        self.params = params
Now it is obvious how to pass the parameters to other related classes: just pass the entire object which encapsulates one valid set of parameters.
This also makes validation easier: it is more difficult to construct MyClass with a nonsensical set of parameters (e.g. mutually exclusive parameters, or related parameters).
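A brief usage sketch of this approach (hypothetical values):
obj_a = MyClass(ParamSetA(param2="x"))          # one coherent parameter grouping
obj_b = MyClass(ParamSetB(param3=1, param4=2))  # another
print(obj_a.params.param2)                      # -> x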
How about this?
class Base:
    def __init__(self, param1=None, param2=None, param3=None, param4=None):
        self.param1, self.param2, self.param3, self.param4 = param1, param2, param3, param4

class Derived(Base):
    def __init__(self, param1=None, param2=None, param3=None, param4=None):
        kwargs = locals()
        kwargs.pop('self')  # locals() also contains self, which super() already supplies
        super().__init__(**kwargs)
Ref: Get a list/tuple/dict of the arguments passed to a function? - you may need to use inspect.getargspec() as well.
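A quick check of that pattern, assuming Base stores the parameters as sketched above:
d = Derived(param2="someVal", param4="someOtherVal")
print(d.param2, d.param4)  # -> someVal someOtherVal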

Python Abstract class with concrete methods

I'm wondering if in python (3) an abstract class can have concrete methods.
Although this seems to work I'm not sure this is the proper way to do this in python:
from abc import ABCMeta, abstractclassmethod, abstractmethod


class MyBaseClass:
    __metaclass__ = ABCMeta

    @property
    @abstractmethod
    def foo_prop(self):
        """Define me"""
        pass

    @abstractclassmethod
    def get_something(cls, param1, param2, param3):
        """This is a class method, override it with @classmethod"""
        pass

    @classmethod
    def get(cls, param1, param2):
        """Concrete method calling an abstract class method and an abstract property"""
        if param1 < cls.foo_prop:
            raise Exception()
        param3 = param1 + 42
        item = cls.get_something(param1, param2, param3)
        return item


class MyConcreteClassA(MyBaseClass):
    """Implementation"""
    foo_prop = 99

    @classmethod
    def get_something(cls, param1, param2, param3):
        return cls.foo_prop + param1 + param2 + param3


class MyConcreteClassB(MyBaseClass):
    """Implementation"""
    foo_prop = 255

    @classmethod
    def get_something(cls, param1, param2, param3):
        return cls.foo_prop - param1 - param2 - param3
In the example the abstract class MyBaseClass has:
an abstract property foo_prop that will be defined in the subclasses
the only way I could find to declare this was to create an abstract "property method"
an abstract class method get_something that will be implemented in the subclasses
a concrete method get that in turn uses the (not yet defined) abstract method and property mentioned above.
Questions:
Is there a better way to define an abstract property? Would it make more sense to define a concrete property in MyBaseClass set to None and just redefine it in the subclasses?
Can I mix abstract and concrete methods in an abstract class as shown in the example?
If yes, does it always make sense to declare the class abstract, or can a concrete class have abstract methods (in which case it should never be instantiated directly anyway)?
Thanks
According to the docs (text in brackets and code formatting mine):
[@abstractproperty is] Deprecated since version 3.3: It is now possible to use property, property.getter(), property.setter() and property.deleter() with abstractmethod(), making this decorator redundant.
so I think you're doing it right.
You can do this, or at least in my experience it has not been an issue. Maybe somebody else can offer other advice, but that advice will probably take the form of "inheritance is bad". Although I don't see anything explicitly about it in the docs, this section shows an example wherein an abstract method, concrete method, and class method are all defined within an ABC.
I believe you still have to declare the class abstract in order to use @abstractmethod and similar decorators. By setting the metaclass as ABC, the class cannot be instantiated until all abstract methods are defined, which sounds like the behavior that you want, so I think you need to declare it abstract unless you just want to rely on documentation to enforce the "you shall not instantiate" rule on this class. As an aside, you can declare your abstract class with:
from abc import ABC

class MyBaseClass(ABC):
    # ...
inheriting from ABC instead of manually setting the metaclass. I think this construction is preferred.
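A minimal sketch of that recommendation applied to the question's class (note that in Python 3 the __metaclass__ attribute is ignored, so inheriting from ABC, or passing metaclass=ABCMeta, is what actually enforces the abstract members; the deprecated abstractclassmethod is replaced by stacking @classmethod over @abstractmethod, and the body of get is trimmed):
from abc import ABC, abstractmethod


class MyBaseClass(ABC):

    @property
    @abstractmethod
    def foo_prop(self):
        """Define me"""

    @classmethod
    @abstractmethod
    def get_something(cls, param1, param2, param3):
        """Override with a concrete @classmethod"""

    @classmethod
    def get(cls, param1, param2):
        # concrete method freely mixed with the abstract ones
        return cls.get_something(param1, param2, param1 + 42)


# MyBaseClass()  # TypeError: Can't instantiate abstract class MyBaseClass ...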
