Python: Testing abstract class with concrete implementation details

I have a class containing a mixture of @abstractmethod methods and normal implementation methods, and I'm wondering how I should go about testing the normal implementations.
Quick example: I'd like to test the zoo_str method, even though it depends on the abstract description method. If I have 100 animals, it seems like overkill to write a test in the Lion class, the Antelope class, the Hippo class, etc. What's the best way to do this -- my intuition says I should try to mock description, but I can't instantiate the class, and this falls apart if the abstract method is private (_description).
from abc import ABC, abstractmethod

class Animal(ABC):
    @abstractmethod
    def description(self) -> str:
        pass

    def zoo_str(self) -> str:
        return self.description() + "Get more info at zoo.com!"

Just create a subclass.
class TestAnimal(Animal):
    def description(self):
        return "foo"

assert TestAnimal().zoo_str() == "fooGet more info at zoo.com!"

You can simply use multiple inheritance:
# test_animals.py
import unittest

from animals import Animal

class TestAnimal(unittest.TestCase, Animal):
    def description(self) -> str:
        return "Unittest"

    def test_zoo_str(self) -> None:
        assert self.zoo_str() == "UnittestGet more info at zoo.com!"

Here is a mock-using variant (based on https://stackoverflow.com/a/63777635) showing how to test against all Animal subclasses:
import pytest

@pytest.mark.parametrize("cls", Animal.__subclasses__())
def test_animals(mocker, cls):  # "mocker" is the pytest-mock fixture
    # Clearing __abstractmethods__ lets the class be instantiated even if it is still abstract
    mocker.patch.multiple(cls, __abstractmethods__=set())
    inst = cls()
    assert inst.zoo_str() == f"{inst.description()}Get more info at zoo.com!"
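If pytest-mock isn't available, the same idea should also work with the standard library's unittest.mock.patch.multiple; a minimal sketch, assuming the same Animal hierarchy as above:

import unittest.mock as mock
import pytest

@pytest.mark.parametrize("cls", Animal.__subclasses__())
def test_animals_stdlib(cls):
    # Temporarily empty the abstract-method set so cls() is allowed
    with mock.patch.multiple(cls, __abstractmethods__=set()):
        inst = cls()
        assert inst.zoo_str() == f"{inst.description()}Get more info at zoo.com!"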

Related

How to create a class with an abstract `__init__` method?

I want to create an abstract base class in Python where part of the contract is how instances can be created. The different concrete implementations represent various algorithms that can be used interchangeably. Below is a simplified example (usual disclaimer - the real use-case is more complex):
from abc import ABC, abstractmethod
from typing import Type

class AbstractAlgorithm(ABC):
    @abstractmethod
    def __init__(self, param: int):
        pass

    @abstractmethod
    def get_result(self) -> int:
        pass

class ConcreteAlgorithm(AbstractAlgorithm):
    def __init__(self, param: int):
        self._param = param

    def get_result(self) -> int:
        return self._param * 2

def use_algorithm(algorithm: Type[AbstractAlgorithm]) -> int:
    a = algorithm(10)
    return a.get_result()
The above works, but has the drawback that I can't call super().__init__(...) in ConcreteAlgorithm.__init__, which might break certain inheritance scenarios, I think (correct me if I'm wrong here, but calling super is important for multiple inheritance, right?). (Strictly speaking __init__ can be called, but with the same signature as the subclass __init__, which doesn't make sense).
Python classes are callables, so I could also express it like this:
from abc import ABC, abstractmethod
from typing import Callable

class AbstractAlgorithm(ABC):
    @abstractmethod
    def get_result(self) -> int:
        pass

class ConcreteAlgorithm(AbstractAlgorithm):
    def __init__(self, param: int):
        self._param = param

    def get_result(self) -> int:
        return self._param * 2

def use_algorithm(algorithm: Callable[[int], AbstractAlgorithm]) -> int:
    a = algorithm(10)
    return a.get_result()

print(use_algorithm(ConcreteAlgorithm))
This works and doesn't have the drawback mentioned above, but I do like having the __init__-signature in the abstract base class for documentation purposes.
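Not in the original post, but as an illustration of that flexibility, a plain function can be dropped in wherever the Callable is expected; a minimal sketch assuming the definitions above (choose_algorithm is a hypothetical helper):

def choose_algorithm(param: int) -> AbstractAlgorithm:
    # Hypothetical factory function: pick an implementation based on the parameter value
    return ConcreteAlgorithm(abs(param))

print(use_algorithm(choose_algorithm))  # a function satisfies Callable[[int], AbstractAlgorithm] too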
Finally, it is possible to have abstract classmethods, so this approach works as well:
from abc import ABC, abstractmethod
from typing import Type

class AbstractAlgorithm(ABC):
    @classmethod
    @abstractmethod
    def initialize(cls, param: int) -> "AbstractAlgorithm":
        pass

    @abstractmethod
    def get_result(self) -> int:
        pass

class ConcreteAlgorithm(AbstractAlgorithm):
    @classmethod
    def initialize(cls, param: int) -> "ConcreteAlgorithm":
        return cls(param)

    def __init__(self, param: int):
        self._param = param

    def get_result(self) -> int:
        return self._param * 2

def use_algorithm(algorithm: Type[AbstractAlgorithm]) -> int:
    a = algorithm.initialize(10)
    return a.get_result()

print(use_algorithm(ConcreteAlgorithm))
This works, but I lose the nice property of using algorithm like a callable (it's just more flexible, in case someone actually wants to drop in a function, for example to decide which algorithm to use based on certain parameter values).
So, is there an approach that satisfies all three requirements:
Full documentation of the interface in the abstract base class.
Concrete implementations usable as callables.
No unsafe behavior like not being able to call the base-class __init__.
Strictly speaking __init__ can be called, but with the same signature as the subclass __init__, which doesn't make sense.
No, it makes perfect sense.
You're prescribing the signature because you require each child class to implement it exactly. That means you need to call it exactly like that as well. Each child class needs to call its super().__init__ exactly according to the abstract definition, passing all defined parameters along.
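For illustration, a minimal sketch of what that call looks like, assuming the first AbstractAlgorithm variant above (the one whose __init__ is abstract):

class ConcreteAlgorithm(AbstractAlgorithm):
    def __init__(self, param: int):
        # Call the base class with the full signature prescribed by the abstract __init__
        super().__init__(param)
        self._param = param

    def get_result(self) -> int:
        return self._param * 2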

`mypy` doesn't recognize inherited dataclass members

I'm trying to design my code as follows - i.e., I'd like each subclass that implements my functionality to have, as a member, a collection of fields, which can itself inherit from a base dataclass.
from dataclasses import dataclass
from abc import ABC, abstractmethod

@dataclass
class baseFields:
    pass

@dataclass
class moreFields(baseFields):
    name: str = "john"

class A(ABC):
    def __init__(self) -> None:
        super().__init__()
        self.fields: baseFields = baseFields()

    @abstractmethod
    def say_hi(self) -> None:
        pass

class B(A):
    def __init__(self) -> None:
        super().__init__()
        self.fields = moreFields()

    def say_hi(self) -> None:
        print(f"Hi {self.fields.name}!")

if __name__ == "__main__":
    b = B()
    b.say_hi()
When I run it, I get Hi john! as output, as expected.
But mypy doesn't seem to recognize it:
❯ mypy dataclass_inheritence.py
dataclass_inheritence.py:25: error: "baseFields" has no attribute "name"
Found 1 error in 1 file (checked 1 source file)
I looked and found this github issue, and it links to another one, but doesn't seem like it offers a solution.
I should also note that if I remove the @dataclass decorators and implement the Fields classes as plain ol' classes, with __init__ - I still get the same mypy error.
My motivation (as you may tell) is to reference composite members within the implemented methods of the functional subclasses. Those members are constants, as in the example, so I might use some form of Enum inheritance, but looking at this question it's not a popular design choice (will have to use some 3rd party module which I'm not keen on doing).
Has anyone encountered something similar? Do you have suggestions for a design that could achieve my goal?
The type of self.fields is declared as baseFields in A.__init__, and is not narrowed implicitly by assigning a moreFields to it in B.__init__ -- after all, you might want to be able to re-assign it to another baseFields instance, and it is therefore never assumed to be anything more specific than baseFields.
If you explicitly annotate it as moreFields in B.__init__, the error goes away:
class B(A):
    def __init__(self) -> None:
        super().__init__()
        self.fields: moreFields = moreFields()

    def say_hi(self) -> None:
        print(f"Hi {self.fields.name}!")  # ok!
although this actually feels like a bug in mypy, because now you can do this, violating the LSP:
if __name__ == "__main__":
    b: A = B()
    b.fields = baseFields()  # no mypy error, because b is an A, right?
    b.say_hi()  # runtime AttributeError because b is actually a B!
If I want a subclass to be able to narrow the type of an attribute, I make it a property backed by private attributes:
class A(ABC):
    def __init__(self) -> None:
        super().__init__()
        self.__baseFields = baseFields()

    @property
    def fields(self) -> baseFields:
        return self.__baseFields

    @abstractmethod
    def say_hi(self) -> None:
        pass

class B(A):
    def __init__(self) -> None:
        super().__init__()
        self.__moreFields = moreFields()

    @property
    def fields(self) -> moreFields:
        return self.__moreFields

    def say_hi(self) -> None:
        print(f"Hi {self.fields.name}!")  # ok!
You can use a generic base class to parameterize the type of the fields attribute. I would also have the fields value passed to the base-class constructor. There are some subtle tricks to get the signature of __init__ working, but this should work.
Some imports you'll want:
from __future__ import annotations

from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import Generic, TypeVar, overload
Rename the classes with more pythonic names, and define a generic TypeVar to represent which fields we are using.
@dataclass
class BaseFields:
    pass

@dataclass
class MoreFields(BaseFields):
    name: str = "john"

Fields = TypeVar('Fields', bound=BaseFields)
For defining the base class, we want to allow the fields param to be anything satisfying the TypeVar. We also need to add some overloads to handle the case where a default is used or not.
class A(Generic[Fields], ABC):
    fields: Fields

    @overload
    def __init__(self: A[BaseFields]) -> None:
        ...

    @overload
    def __init__(self: A[Fields], fields: Fields) -> None:
        ...

    def __init__(self, fields=None):
        self.fields = fields or BaseFields()

    @abstractmethod
    def say_hi(self) -> None:
        pass
Now we can run our test:
class B(A[MoreFields]):
    def __init__(self) -> None:
        super().__init__(MoreFields())

    def say_hi(self) -> None:
        print(f"Hi {self.fields.name}!")

if __name__ == "__main__":
    b = B()
    b.say_hi()

$ mypy test.py
Success: no issues found in 1 source file

Python - Explicit Implementation of two abstract classes with same abstract method name

I'm trying to implement two abstract classes in one class, but the two abstract classes contain abstract methods with the same name. In C#, I would be able to explicitly implement the abstract methods allowing them to be called on the context of the type. Is there a way to do something similar in python to allow for both abstract classes to be implemented?
from abc import ABC, abstractmethod
from builtins import str

class AbstractConfig1(ABC):
    @property
    @abstractmethod
    def unique_prop(self) -> str:
        pass

    @property
    @abstractmethod
    def output_filepath(self) -> str:  # same name in AbstractConfig2
        pass

class AbstractConfig2(ABC):
    @property
    @abstractmethod
    def other_unique_prop(self) -> str:
        pass

    @property
    @abstractmethod
    def output_filepath(self) -> str:  # same name in AbstractConfig1
        pass

class Config(AbstractConfig1, AbstractConfig2):
    def __init__(self,
                 unique_prop: str,
                 other_unique_prop: str,
                 config1_output_filepath: str,
                 config2_output_filepath: str
                 ):
        self._unique_prop = unique_prop
        self._other_unique_prop = other_unique_prop
        self._config1_output_filepath = config1_output_filepath
        self._config2_output_filepath = config2_output_filepath

    @property
    def unique_prop(self) -> str:
        return self._unique_prop

    @property
    def other_unique_prop(self) -> str:
        return self._other_unique_prop

    @property
    def AbstractConfig1.output_filepath(self) -> str:  # how I would explicitly implement this in C# (not valid Python)
        return self._config1_output_filepath

    @property
    def AbstractConfig2.output_filepath(self) -> str:  # how I would explicitly implement this in C# (not valid Python)
        return self._config2_output_filepath
Here is a link to what I'm attempting in terms of C#
https://learn.microsoft.com/en-us/dotnet/csharp/programming-guide/interfaces/explicit-interface-implementation
Edit to clear things up a little more:
I simplified this a little compared to what my code is actually doing: instead of just passing through a string, the abstract methods I named output_filepath return objects built in the Config class. But I will continue using str in the example to keep it simple.
Essentially the Config class is acting as a facade to multiple AbstractConfig classes. This way, the facade Config can be configured and then passed to initialize other objects. This would look a bit like below:
class ClassUsingAbstractConfig1:
    def __init__(self, config: AbstractConfig1):
        self.config = config

    def output_file(self):
        path = self.config.output_filepath
        # this object outputs to one filepath

class ClassUsingAbstractConfig2:
    def __init__(self, config: AbstractConfig2):
        self.config = config

    def output_file(self):
        path = self.config.output_filepath
        # this object outputs to another filepath

config = Config("prop",
                "prop2",
                "filepath1",
                "filepath2")
class1 = ClassUsingAbstractConfig1(config)
class2 = ClassUsingAbstractConfig2(config)
class1.output_file()  # outputs to filepath1
class2.output_file()  # outputs to filepath2
And it may just be that python won't allow this and I need to take a different approach.
You have to ask yourself: what is the signature of Config.output_filepath(self) -> str? There can only be one method with that name on Config.
What you're running into is the diamond problem of multiple inheritance. Basically, the class Config can have only one implementation for a method with a given name: you can think of each method's name and signature as the key into the class's method lookup table, and that key is how a call is dispatched from one class to another. In your case both base classes define exactly the same key.
As a consequence, I think you'll have something like:
from abc import ABC, abstractmethod
from builtins import str

class AbstractConfig1(ABC):
    @property
    @abstractmethod
    def unique_prop(self) -> str:
        pass

    @property
    @abstractmethod
    def output_filepath(self) -> str:  # same name in AbstractConfig2
        pass

class AbstractConfig2(ABC):
    @property
    @abstractmethod
    def other_unique_prop(self) -> str:
        pass

    @property
    @abstractmethod
    def output_filepath(self) -> str:  # same name in AbstractConfig1
        pass

class Config(AbstractConfig1, AbstractConfig2):
    def __init__(self,
                 unique_prop: str,
                 other_unique_prop: str,
                 config1_output_filepath: str,
                 config2_output_filepath: str
                 ):
        self._unique_prop = unique_prop
        self._other_unique_prop = other_unique_prop
        self._config1_output_filepath = config1_output_filepath
        self._config2_output_filepath = config2_output_filepath

    @property
    def unique_prop(self) -> str:
        return self._unique_prop

    @property
    def other_unique_prop(self) -> str:
        return self._other_unique_prop

    @property
    def output_filepath(self) -> str:
        # or whatever implementation you want it to be
        return self._config1_output_filepath + self._config2_output_filepath
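As a consequence, both consumer classes from the question's edit end up reading the same, single property; a small usage sketch reusing the question's ClassUsingAbstractConfig1/ClassUsingAbstractConfig2 and the merged Config above:

config = Config("prop", "prop2", "filepath1", "filepath2")

class1 = ClassUsingAbstractConfig1(config)
class2 = ClassUsingAbstractConfig2(config)

# Both consumers now see the single merged property ("filepath1filepath2"),
# rather than one filepath each as the question intended
class1.output_file()
class2.output_file()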
This problem exists in virtually all high-level languages. It comes down to how classes and functions are translated by the language runtime (whether C#, Java, Python, etc.), all the way down to the kernel and CPU, to be loaded and executed (other answer).

Encapsulate the decision which child class to initialize

I have a parent class and several child classes. I want to encapsulate, in the initialization itself, the decision of which child class to instantiate.
A simple example:
class Person:
    def __init__(self, name):
        if self.name_is_male(name):
            real_instance = Male(name)
        else:
            real_instance = Female(name)
        return real_instance

    def name_is_male(self, name):
        if name == 'Donald':
            return True
        elif name == 'Daisy':
            return False
        else:
            raise ValueError('unknown name!')

class Male(Person):
    def __init__(self, name):
        ...

class Female(Person):
    def __init__(self, name):
        ...
This simple example doesn't work, but it illustrates my question: how do I encapsulate the decision of which child class to instantiate in the initialization of a parent class? Or is this altogether a stupid idea?
Though the use case is not very clear, I would use the factory design pattern to achieve something like this. A basic example:
class Person(object):
    # Create objects based on some name:
    @staticmethod
    def factory(name):
        if name == "Male":
            return Male()
        elif name == "Female":
            return Female()
        else:
            return None

class Male(Person):
    pass

class Female(Person):
    pass

person = Person.factory('Male')
Another example on factory method design pattern
__init__ is not supposed to return anything (or rather: it has to return None). Imo it's not the best way of writing it, or as you put it "altogether a stupid idea". Is there a particular reason why it can't be an attribute?

Python Classes: turn all inherited methods private

Class Bar inherits from Foo:
class Foo(object):
    def foo_meth_1(self):
        return 'foometh1'

    def foo_meth_2(self):
        return 'foometh2'

class Bar(Foo):
    def bar_meth(self):
        return 'bar_meth'
Is there a way of turning all methods inherited from Foo private?
class Bar(Foo):
    def bar_meth(self):
        return 'bar_meth'

    def __foo_meth_1(self):
        return 'foometh1'

    def __foo_meth_2(self):
        return 'foometh2'
Python doesn't have private methods, only obfuscated (name-mangled) ones. But I suppose you could iterate over the methods of the superclass when creating the instance, hiding them from the instance and creating new, obfuscated names for those functions; setattr and getattr could be useful if you use a function to generate the obfuscated names.
With that said, it's a pretty cthulhu-oid thing to do. You mention the intent is to keep the namespace cleaner, but this is more like mixing ammonia and chlorine. If the method needs to be hidden, hide it in the superclass. Then don't create instances of the superclass -- instead create a specific class that wraps the hidden methods in public ones, which you could name the same thing but with the leading underscores stripped.
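For illustration, here is a minimal sketch of the setattr idea described above; note it only adds obfuscated aliases on the instance (the inherited public names stay reachable), so truly hiding them needs something like the metaclass approach shown further down:

import inspect

class Foo(object):
    def foo_meth_1(self):
        return 'foometh1'

    def foo_meth_2(self):
        return 'foometh2'

class Bar(Foo):
    def __init__(self):
        # Alias every public method inherited from Foo under a name-mangled attribute
        for name, meth in inspect.getmembers(Foo, predicate=inspect.isfunction):
            if not name.startswith('_'):
                setattr(self, '_Bar__' + name, meth.__get__(self))

    def bar_meth(self):
        return 'bar_meth ' + self.__foo_meth_1()

assert Bar().bar_meth() == 'bar_meth foometh1'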
Assuming I understand your intent correctly, I would suggest doing something like this:
class BaseFoo(object):
    def __init__(self):
        raise NotImplementedError('No instances of BaseFoo please.')

    def _foo(self):
        return 'Foo.'

    def _bar(self):
        return 'Bar.'

class HiddenFoo(BaseFoo):
    def __init__(self): pass

class PublicFoo(BaseFoo):
    def __init__(self): pass

    foo = BaseFoo._foo
    bar = BaseFoo._bar

def try_foobar(instance):
    print('Trying ' + instance.__class__.__name__)
    try:
        print('foo: ' + instance.foo())
        print('bar: ' + instance.bar())
    except AttributeError as e:
        print(e)

foo_1 = HiddenFoo()
foo_2 = PublicFoo()
try_foobar(foo_1)
try_foobar(foo_2)
And if PublicFoo.foo needed to do something more than BaseFoo._foo, you would write a wrapper that does whatever is needed and then calls _foo from the superclass.
This is only possible with Python's metaclasses. But it is quite sophisticated and I am not sure it is worth the effort. For details, have a look here
Why would you like to do so?
Since foo() and __foo() are completely different methods with no link between them, Python is unable to understand what you want to do. So you have to explain to it step by step, meaning (like sapth said) to remove the old methods and add new ones.
This is an Object Oriented Design flaw and a better approach would be through delegation:
class Basic:
    def meth_1(self):
        return 'meth1'

    def meth_2(self):
        return 'meth2'

class Foo(Basic):
    # Nothing to do here
    pass

class Bar:
    def __init__(self):
        self.dg = Basic()

    def bar_meth(self):
        return 'bar_meth ' + self.__meth_1()

    def __meth_1(self):
        return self.dg.meth_1()

    def __meth_2(self):
        return self.dg.meth_2()
While Foo inherits from Basic because it wants Basic's public methods, Bar only delegates the job to a Basic instance because it doesn't want to integrate Basic's interface into its own.
You can use metaclasses, but Bar will no longer be an actual subclass of Foo, unless you want Foo's methods to be both 'private' and 'public' in instances of Bar (you cannot selectively inherit names or delattr members inherited from parent classes). Here is a very contrived example:
from inspect import getmembers, isfunction

class TurnPrivateMetaclass(type):
    def __new__(cls, name, bases, d):
        # Copy every function of the first base class under an obfuscated name
        private = {'__%s' % i: j for i, j in getmembers(bases[0]) if isfunction(j)}
        d.update(private)
        # Drop the bases, so the original public names are not inherited
        return type.__new__(cls, name, (), d)

class Foo:
    def foo_meth_1(self): return 'foometh1'
    def foo_meth_2(self): return 'foometh2'

class Bar(Foo, metaclass=TurnPrivateMetaclass):
    def bar_meth(self): return 'bar_meth'

b = Bar()
assert b.__foo_meth_1() == 'foometh1'
assert b.__foo_meth_2() == 'foometh2'
assert b.bar_meth() == 'bar_meth'
If you wanted to get attribute access working, you could create a new Foo base class in __new__ with all renamed methods removed.
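For illustration, a sketch of what that variant could look like, assuming the metaclass rebuilds a stripped copy of the base class so the original public names disappear while other attributes remain inherited (just as contrived as the example above):

from inspect import getmembers, isfunction

class TurnPrivateMetaclass(type):
    def __new__(cls, name, bases, d):
        base = bases[0]
        funcs = {i: j for i, j in getmembers(base) if isfunction(j)}
        # Re-expose the base's methods under obfuscated names on the new class
        d.update({'__%s' % i: j for i, j in funcs.items()})
        # Build a stripped copy of the base without the original public methods
        stripped = {k: v for k, v in vars(base).items()
                    if k not in funcs and k not in ('__dict__', '__weakref__')}
        new_base = type(base.__name__, base.__bases__, stripped)
        return type.__new__(cls, name, (new_base,), d)

class Foo:
    class_attr = 'still inherited'
    def foo_meth_1(self): return 'foometh1'

class Bar(Foo, metaclass=TurnPrivateMetaclass):
    def bar_meth(self): return 'bar_meth'

b = Bar()
assert b.__foo_meth_1() == 'foometh1'
assert b.class_attr == 'still inherited'   # non-method attributes still accessible
assert not hasattr(b, 'foo_meth_1')        # original public name is gone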
