I want to create an abstract base class in Python where part of the contract is how instances can be created. The different concrete implementations represent various algorithms that can be used interchangeably. Below is a simplified example (usual disclaimer - the real use-case is more complex):
from abc import ABC, abstractmethod
from typing import Type
class AbstractAlgorithm(ABC):
    @abstractmethod
    def __init__(self, param: int):
        pass

    @abstractmethod
    def get_result(self) -> int:
        pass

class ConcreteAlgorithm(AbstractAlgorithm):
    def __init__(self, param: int):
        self._param = param

    def get_result(self) -> int:
        return self._param * 2

def use_algorithm(algorithm: Type[AbstractAlgorithm]) -> int:
    a = algorithm(10)
    return a.get_result()
The above works, but has the drawback that I can't meaningfully call super().__init__(...) in ConcreteAlgorithm.__init__, which might break certain inheritance scenarios, I think (correct me if I'm wrong here, but calling super is important for multiple inheritance, right?). (Strictly speaking, __init__ can be called, but only with the same signature as the subclass __init__, which doesn't make sense to me.)
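You're right that super().__init__ matters mainly for cooperative multiple inheritance. Here is a minimal sketch of what can go wrong; AuditMixin and AuditedAlgorithm are made up for illustration, and the abstract __init__ also chains to super so the whole MRO cooperates:
from abc import ABC, abstractmethod

class AbstractAlgorithm(ABC):
    @abstractmethod
    def __init__(self, param: int):
        super().__init__()  # keep the cooperative chain going

    @abstractmethod
    def get_result(self) -> int:
        pass

class AuditMixin:
    # relies on __init__ being chained all the way to it
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.audit_log = []

class ConcreteAlgorithm(AbstractAlgorithm):
    def __init__(self, param: int):
        super().__init__(param)  # drop this line and AuditMixin.__init__ below never runs
        self._param = param

    def get_result(self) -> int:
        return self._param * 2

class AuditedAlgorithm(ConcreteAlgorithm, AuditMixin):
    pass

a = AuditedAlgorithm(10)
print(a.audit_log, a.get_result())  # [] 20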
Python classes are callables, so I could also express it like this:
from abc import ABC, abstractmethod
from typing import Callable
class AbstractAlgorithm(ABC):
    @abstractmethod
    def get_result(self) -> int:
        pass

class ConcreteAlgorithm(AbstractAlgorithm):
    def __init__(self, param: int):
        self._param = param

    def get_result(self) -> int:
        return self._param * 2

def use_algorithm(algorithm: Callable[[int], AbstractAlgorithm]) -> int:
    a = algorithm(10)
    return a.get_result()
print(use_algorithm(ConcreteAlgorithm))
This works and doesn't have the drawback mentioned above, but I do like having the __init__-signature in the abstract base class for documentation purposes.
Finally, it is possible to have abstract classmethods, so this approach works as well:
from abc import ABC, abstractmethod
from typing import Type
class AbstractAlgorithm(ABC):
    @classmethod
    @abstractmethod
    def initialize(cls, param: int) -> "AbstractAlgorithm":
        pass

    @abstractmethod
    def get_result(self) -> int:
        pass

class ConcreteAlgorithm(AbstractAlgorithm):
    @classmethod
    def initialize(cls, param: int) -> "ConcreteAlgorithm":
        return cls(param)

    def __init__(self, param: int):
        self._param = param

    def get_result(self) -> int:
        return self._param * 2

def use_algorithm(algorithm: Type[AbstractAlgorithm]) -> int:
    a = algorithm.initialize(10)
    return a.get_result()
print(use_algorithm(ConcreteAlgorithm))
This works, but I lose the nice property of being able to use algorithm like a callable (that is simply more flexible, in case someone actually wants to drop in a plain function, for example to decide which algorithm to use based on certain parameter values).
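For instance, with the Callable-based use_algorithm from the second snippet, a plain function can be dropped in as the factory (a small sketch; choose_algorithm and SlowButAccurateAlgorithm are made up for illustration):
def choose_algorithm(param: int) -> AbstractAlgorithm:
    # pick an implementation based on the parameter value
    if param < 100:
        return ConcreteAlgorithm(param)
    return SlowButAccurateAlgorithm(param)  # hypothetical second implementation

print(use_algorithm(choose_algorithm))  # use_algorithm only needs a callable, so this works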
So, is there an approach that satisfies all three requirements:
Full documentation of the interface in the abstract base class.
Concrete implementations usable as callables.
No unsafe behavior like not being able to call the base-class __init__.
Strictly speaking __init__ can be called, but with the same signature as the subclass __init__, which doesn't make sense.
No, it makes perfect sense.
You're prescribing the signature because you require each child class to implement it exactly. That means you need to call it exactly like that as well. Each child class needs to call its super().__init__ exactly according to the abstract definition, passing all defined parameters along.
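For example, with the abstract __init__ from the question's first snippet, the concrete class simply forwards the parameter (a minimal sketch):
class ConcreteAlgorithm(AbstractAlgorithm):
    def __init__(self, param: int):
        super().__init__(param)  # call the base __init__ exactly as the abstract definition prescribes
        self._param = param

    def get_result(self) -> int:
        return self._param * 2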
Could you help me understand why I am getting the TypeError: 'type' object is not subscriptable error with the code below?
Maybe I'm getting this wrong, but as I understood it, the Color type annotation in the filter() call says that the call will result in an Iterable of Color, which is exactly what I want. But when I try to add the annotation I get the error. (By the way, how come a type annotation is preventing the program from running? I thought type hints in Python only mattered inside your IDE, not at runtime.)
Any light on this would be much appreciated.
# -*- coding: utf-8 -*-
from __future__ import annotations
from typing import TypeVar, Any, Generic, Iterator, Iterable, Optional
from abc import ABC, abstractmethod
from dataclasses import dataclass
T = TypeVar('T', bound=Any)
I = TypeVar('I', bound=Any)
class AbstractGenerator(ABC, Iterator[T], Generic[T, I]):
    def __init__(self):
        super().__init__()
        self._items = None
        self._next_item = None

    @property
    def items(self) -> Any:
        return self._items

    @items.setter
    def items(self, items: Any) -> AbstractGenerator:
        self._items = items
        return self

    @property
    def next_item(self) -> Any:
        return self._next_item

    @next_item.setter
    def next_item(self, next_item: Any) -> AbstractGenerator:
        self._next_item = next_item
        return self

    @abstractmethod
    def __len__(self) -> int:
        pass

    @abstractmethod
    def __iter__(self) -> Iterable[T]:
        pass

    @abstractmethod
    def __next__(self) -> Iterable[T]:
        pass

    @abstractmethod
    def __getitem__(self, id: I) -> Iterable[T]:
        pass
ColorId = int
@dataclass(frozen=True)
class Color:
    id: ColorId
    name: str
class MyColorsGenerator(AbstractGenerator[Color, int]):
    def __init__(self):
        super().__init__()
        self._colors: list[Color] = []
        self._next_color_index: int = 0  # None

    @property
    def colors(self) -> list[Color]:
        return self._colors

    @colors.setter
    def colors(self, colors: list[Color]) -> MyColorsGenerator:
        self._colors = colors
        return self

    @property
    def next_color_index(self) -> int:
        return self._next_color_index

    @next_color_index.setter
    def next_color_index(self, next_color_index: int) -> MyColorsGenerator:
        self._next_color_index = next_color_index
        return self

    def add_color(self, color: Color) -> MyColorsGenerator:
        self.colors.append(color)
        return self

    def __len__(self) -> int:
        return len(self.colors)

    def __iter__(self) -> Iterable[Color]:
        return self

    def __next__(self) -> Iterable[Color]:
        if self.next_color_index < len(self.colors):
            self.next_color_index += 1
            return self.colors[self.next_color_index - 1]
        else:
            raise StopIteration

    def __getitem__(self, id: ColorId) -> Iterable[Color]:
        return list(filter[Color](lambda color: color.id == id, self.colors))
colors_generator: MyColorsGenerator = MyColorsGenerator()
colors_generator \
    .add_color(Color(id=0, name="Blue")) \
    .add_color(Color(id=1, name="Red")) \
    .add_color(Color(id=2, name="Yellow")) \
    .add_color(Color(id=3, name="Green")) \
    .add_color(Color(id=4, name="White")) \
    .add_color(Color(id=5, name="Black"))
# This results in: TypeError: 'type' object is not subscriptable
#colors: Optional[list[Color]] = list(filter[Color](lambda color: color.id == 4, colors_generator))
# This works, notice the only thing I did was to remove the type annotation for the expected generic type ([Color])
colors: Optional[list[Color]] = list(filter(lambda color: color.id == 4, colors_generator))
print(colors)
The issue is that generics aren't a language-level addition, but a library one. Specifying the generic type parameters actually employs the same [] operator you use for item access in collections, except it is defined on the metaclass. For this reason the generics syntax originally only worked with specific classes in the typing module (typing.List[int], typing.Dict[str, str], etc.). Since Python 3.9, however, some common classes from the standard library have been extended to support the same operation, for brevity, like list[int] and dict[str, str]. This is still NOT a language feature, and most classes in the standard library do not implement it.
Moreover, as you've rightfully noticed, these annotations carry (almost) no meaning for the interpreter and are (mostly) just there for the IDE. Among other things, that implies that you don't instantiate generic classes as specialized generics (list() is correct; list[int]() is legal, but pointless and considered bad practice).
filter is a class in the standard library which does not provide the generic-aliasing [] operation, so you get the error that applying it is not implemented ("'type' object is not subscriptable": filter is an instance of type, and [] is the subscription operator). Python as a language does not understand the concept of a generic, so it cannot give you a better error message like "'filter' is not a generic class". Even if it were, however, you shouldn't have invoked it this way.
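A quick way to see the difference at the REPL (a sketch assuming Python 3.9+, where list supports the class-level [] but filter does not):
from typing import List

print(List[int])    # typing's generic aliases have always supported []
print(list[int])    # works since Python 3.9 (PEP 585): a types.GenericAlias
print(list[int]())  # legal but pointless: it still just builds an empty list

try:
    filter[int]     # filter does not implement class-level [], hence the error
except TypeError as exc:
    print(exc)      # on the asker's version: 'type' object is not subscriptable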
A special note should be made about generic functions. They CANNOT be explicitly supplied with generic parameters. So, if instead of filter we were talking about some function like:
import typing

T = typing.TypeVar("T")

def my_filter(f: typing.Callable[[T], bool], seq: list[T]) -> list[T]:
    ...
there would be no way to explicitly say that you're interested in my_filter[Color].
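In practice you let the type checker infer T from the arguments, or annotate the target variable; a small sketch using the my_filter function above (Python 3.9+ for the list[T] annotation):
import typing

T = typing.TypeVar("T")

def my_filter(f: typing.Callable[[T], bool], seq: list[T]) -> list[T]:
    return [x for x in seq if f(x)]

# T is inferred as int from the arguments; no my_filter[int] is needed (or possible).
evens: list[int] = my_filter(lambda n: n % 2 == 0, [1, 2, 3, 4])
print(evens)  # [2, 4]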
TL;DR: filter is not a generic class in terms of type annotations, so it does not support the [] operation
I have a use-case where I have two classes: Foo and Bar.
I want to write a function that, given one of these classes, dynamically creates a new class that is a subclass of the given class. This is what I currently have:
from typing import Type, TypeVar
class Foo:
    ...

class Bar:
    ...

T = TypeVar("T")

def generate_child_class(base_class: Type[T]) -> Type[T]:
    class GeneratedClassPlaceholder(base_class):
        def __init__(self, **kwargs) -> None:
            super(GeneratedClassPlaceholder, self).__init__(**kwargs)
            # some additional logic unique to the subclass
            ...

    return GeneratedClassPlaceholder
I intend to use that function as following:
FooChildClass = generate_child_class(Foo)
BarChildClass = generate_child_class(Bar)
However, the type-checker complains about the function: "Incompatible return type [7]: Expected Type[Variable[T]] but got Type[GeneratedFixtureClass]."
How do I correctly type-annotate this code? Could a metaclass be useful here?
Thank you!
I'm trying to design my code as follows - i.e., I'd like each subclass that implements my functionality to have, as a member, a collection of fields, which can also inherit from a base dataclass.
from dataclasses import dataclass
from abc import ABC, abstractmethod
@dataclass
class BaseFields:
    pass

@dataclass
class MoreFields(BaseFields):
    name: str = "john"

class A(ABC):
    def __init__(self) -> None:
        super().__init__()
        self.fields: BaseFields = BaseFields()

    @abstractmethod
    def say_hi(self) -> None:
        pass

class B(A):
    def __init__(self) -> None:
        super().__init__()
        self.fields = MoreFields()

    def say_hi(self) -> None:
        print(f"Hi {self.fields.name}!")

if __name__ == "__main__":
    b = B()
    b.say_hi()
When I run it, I get Hi john! as output, as expected.
But mypy doesn't seem to recognize it:
❯ mypy dataclass_inheritence.py
dataclass_inheritence.py:25: error: "BaseFields" has no attribute "name"
Found 1 error in 1 file (checked 1 source file)
I looked and found this github issue, and it links to another one, but it doesn't seem to offer a solution.
I should also note that if I remove the @dataclass decorators and implement the Fields classes as plain ol' classes with __init__ - I still get the same mypy error.
My motivation (as you may tell) is to reference composite members within the implemented methods of the functional subclasses. Those members are constants, as in the example, so I might use some form of Enum inheritance, but looking at this question that's not a popular design choice (I'd have to use some 3rd-party module, which I'm not keen on doing).
Has anyone encountered something similar? Do you have suggestions for a design that could achieve my goal?
The type of self.fields is declared as BaseFields in A.__init__, and is not narrowed implicitly by assigning a MoreFields to it in B.__init__ -- after all, you might want to be able to re-assign it to another BaseFields instance, and it is therefore never assumed to be anything more specific than BaseFields.
If you explicitly annotate it as MoreFields in B.__init__, the error goes away:
class B(A):
    def __init__(self) -> None:
        super().__init__()
        self.fields: MoreFields = MoreFields()

    def say_hi(self) -> None:
        print(f"Hi {self.fields.name}!")  # ok!
although this actually feels like a bug in mypy, because now you can do this, violating the LSP:
if __name__ == "__main__":
    b: A = B()
    b.fields = BaseFields()  # no mypy error, because b is an A, right?
    b.say_hi()  # runtime AttributeError because b is actually a B!
If I want a subclass to be able to narrow the type of an attribute, I make it a property backed by private attributes:
class A(ABC):
    def __init__(self) -> None:
        super().__init__()
        self.__baseFields = BaseFields()

    @property
    def fields(self) -> BaseFields:
        return self.__baseFields

    @abstractmethod
    def say_hi(self) -> None:
        pass

class B(A):
    def __init__(self) -> None:
        super().__init__()
        self.__moreFields = MoreFields()

    @property
    def fields(self) -> MoreFields:
        return self.__moreFields

    def say_hi(self) -> None:
        print(f"Hi {self.fields.name}!")  # ok!
You can use a generic base class to define the class. I would also have the fields attribute be passed to the base class constructor. There are some subtle tricks to get the signature of the __init__ method working, but this should work.
Some imports you'll want:
from __future__ import annotations
from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import Generic, TypeVar, overload
Keep the dataclasses from before, and define a generic TypeVar to represent which fields we are using.
@dataclass
class BaseFields:
    pass

@dataclass
class MoreFields(BaseFields):
    name: str = "john"
Fields = TypeVar('Fields', bound=BaseFields)
For defining the base class, we want to allow the fields param to be anything satisfying the TypeVar. We also need to add some overloads to handle the case where a default is used or not.
class A(Generic[Fields], ABC):
    fields: Fields

    @overload
    def __init__(self: A[BaseFields]) -> None:
        ...

    @overload
    def __init__(self: A[Fields], fields: Fields) -> None:
        ...

    def __init__(self, fields=None):
        self.fields = fields or BaseFields()

    @abstractmethod
    def say_hi(self) -> None:
        pass
Now we can run our test:
class B(A[MoreFields]):
    def __init__(self) -> None:
        super().__init__(MoreFields())

    def say_hi(self) -> None:
        print(f"Hi {self.fields.name}!")

if __name__ == "__main__":
    b = B()
    b.say_hi()
$ mypy test.py
Success: no issues found in 1 source file
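As a side note on why the zero-argument overload is there: a concrete subclass that is happy with the default BaseFields can call super().__init__() without arguments and still type-check (C is a hypothetical subclass, just for illustration):
class C(A[BaseFields]):
    def __init__(self) -> None:
        super().__init__()  # matches the first overload; fields defaults to BaseFields()

    def say_hi(self) -> None:
        print("Hi!")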
I'm trying to implement two abstract classes in one class, but the two abstract classes contain abstract methods with the same name. In C#, I would be able to explicitly implement the abstract methods, allowing them to be called in the context of the declared type. Is there a way to do something similar in Python to allow both abstract classes to be implemented?
from abc import ABC, abstractmethod
from builtins import str
class AbstractConfig1(ABC):
    @property
    @abstractmethod
    def unique_prop(self) -> str:
        pass

    @property
    @abstractmethod
    def output_filepath(self) -> str:  ## same name in AbstractConfig2
        pass

class AbstractConfig2(ABC):
    @property
    @abstractmethod
    def other_unique_prop(self) -> str:
        pass

    @property
    @abstractmethod
    def output_filepath(self) -> str:  ## same name in AbstractConfig1
        pass

class Config(AbstractConfig1, AbstractConfig2):
    def __init__(self,
                 unique_prop: str,
                 other_unique_prop: str,
                 config1_output_filepath: str,
                 config2_output_filepath: str
                 ):
        self._unique_prop = unique_prop
        self._other_unique_prop = other_unique_prop
        self._config1_output_filepath = config1_output_filepath
        self._config2_output_filepath = config2_output_filepath

    @property
    def unique_prop(self) -> str:
        return self._unique_prop

    @property
    def other_unique_prop(self) -> str:
        return self._other_unique_prop

    @property
    def AbstractConfig1.output_filepath(self) -> str:  ## How I would explicitly implement this in C#
        return self._config1_output_filepath

    @property
    def AbstractConfig2.output_filepath(self) -> str:  ## How I would explicitly implement this in C#
        return self._config2_output_filepath
Here is a link to what I'm attempting in terms of C#
https://learn.microsoft.com/en-us/dotnet/csharp/programming-guide/interfaces/explicit-interface-implementation
Edit to clear things up a little more:
I simplified this a little compared to what my code is actually doing: instead of just passing through a string, the abstract methods I named output_filepath return objects built in the Config class. But I will continue using str in the example to keep things simple.
Essentially the Config class is acting as a facade to multiple AbstractConfig classes. This way, the facade Config can be configured and then passed to initialize other objects. This would look a bit like below:
class ClassUsingAbstractConfig1:
    def __init__(self, config: AbstractConfig1):
        self.config = config

    def output_file(self):
        path = self.config.output_filepath
        # this object outputs to one filepath

class ClassUsingAbstractConfig2:
    def __init__(self, config: AbstractConfig2):
        self.config = config

    def output_file(self):
        path = self.config.output_filepath
        # this object outputs to another filepath

config = Config("prop",
                "prop2",
                "filepath1",
                "filepath2")

class1 = ClassUsingAbstractConfig1(config)
class2 = ClassUsingAbstractConfig2(config)

class1.output_file()  # outputs to filepath1
class2.output_file()  # outputs to filepath2
And it may just be that python won't allow this and I need to take a different approach.
You have to ask yourself: "what is the signature of Config::output_filepath(self) -> str?"
What you're referring to is the multiple-inheritance "diamond problem". Basically, the class Config can have only one implementation for a method with a given signature. You have to imagine that each function uses its signature as the index into the class's function call table; that is how a call is dispatched from one class to another, especially when methods share the same name. But in your case they also share the same signature.
As a consequence, I think you'll have something like:
from abc import ABC, abstractmethod
from builtins import str
class AbstractConfig1(ABC):
    @property
    @abstractmethod
    def unique_prop(self) -> str:
        pass

    @property
    @abstractmethod
    def output_filepath(self) -> str:  ## same name in AbstractConfig2
        pass

class AbstractConfig2(ABC):
    @property
    @abstractmethod
    def other_unique_prop(self) -> str:
        pass

    @property
    @abstractmethod
    def output_filepath(self) -> str:  ## same name in AbstractConfig1
        pass

class Config(AbstractConfig1, AbstractConfig2):
    def __init__(self,
                 unique_prop: str,
                 other_unique_prop: str,
                 config1_output_filepath: str,
                 config2_output_filepath: str
                 ):
        self._unique_prop = unique_prop
        self._other_unique_prop = other_unique_prop
        self._config1_output_filepath = config1_output_filepath
        self._config2_output_filepath = config2_output_filepath

    @property
    def unique_prop(self) -> str:
        return self._unique_prop

    @property
    def other_unique_prop(self) -> str:
        return self._other_unique_prop

    @property
    def output_filepath(self) -> str:
        # or whatever the implementation you want to be.
        return self._config1_output_filepath + self._config2_output_filepath
This problem exists in virtually all high-level languages and is related to the concepts of class and function and how they end up being translated by the language's virtual machine (be it C#, Java, Python, etc.) all the way down to the kernel and CPU to be loaded and executed (other answer).
I'm trying to call a classmethod on a generic class:
from typing import List, Union, TypeVar, Generic
from enum import IntEnum
class Gender(IntEnum):
    MALE = 1
    FEMALE = 2
    DIVERS = 3

T = TypeVar('T')

class EnumAggregate(Generic[T]):
    def __init__(self, value: Union[int, str, List[T]]) -> None:
        if value == '':
            raise ValueError(f'Parameter "value" cannot be empty!')
        if isinstance(value, list):
            self._value = ''.join([str(x.value) for x in value])
        else:
            self._value = str(value)

    def __contains__(self, item: T) -> bool:
        return item in self.to_list

    @property
    def to_list(self) -> List[T]:
        return [T(int(character)) for character in self._value]

    @property
    def value(self) -> str:
        return self._value

    @classmethod
    def all(cls) -> str:
        return ''.join([str(x.value) for x in T])
Genders = EnumAggregate[Gender]
But if I call
Genders.all()
I get the error TypeError: 'TypeVar' object is not iterable. So the TypeVar T isn't properly matched with the Enum Gender.
How can I fix this? The expected behavior would be
>>> Genders.all()
'123'
Any ideas? Or is this impossible?
Python's type hinting system is there for a static type checker to validate your code and T is just a placeholder for the type system, like a slot in a template language. It can't be used as an indirect reference to a specific type.
You need to subclass your generic type if you want to produce a concrete implementation. And because Gender is a class and not an instance, you'd need to tell the type system how you plan to use a Type[T] somewhere, too.
Because you also want to be able to use T as an Enum() (calling it with EnumSubclass(int(character))), I'd also bind the typevar; that way the type checker will understand that all concrete forms of Type[T] are callable and will produce individual T instances, but also that those T instances will always have a .value attribute:
from typing import ClassVar, List, Union, Type, TypeVar, Generic
from enum import IntEnum

T = TypeVar('T', bound=IntEnum)  # only IntEnum subclasses

class EnumAggregate(Generic[T]):
    # Concrete implementations can reference `enum` *on the class itself*,
    # which will be an IntEnum subclass.
    enum: ClassVar[Type[T]]

    def __init__(self, value: Union[int, str, List[T]]) -> None:
        if not value:
            raise ValueError('Parameter "value" cannot be empty!')
        if isinstance(value, list):
            self._value = ''.join([str(x.value) for x in value])
        else:
            self._value = str(value)

    def __contains__(self, item: T) -> bool:
        return item in self.to_list

    @property
    def to_list(self) -> List[T]:
        # the concrete implementation needs to use self.enum here
        return [self.enum(int(character)) for character in self._value]

    @property
    def value(self) -> str:
        return self._value

    @classmethod
    def all(cls) -> str:
        # the concrete implementation needs to reference cls.enum here
        return ''.join([str(x.value) for x in cls.enum])
With the above generic class you can now create a concrete implementation, using your Gender IntEnum fitted into the T slot and as a class attribute:
class Gender(IntEnum):
    MALE = 1
    FEMALE = 2
    DIVERS = 3

class Genders(EnumAggregate[Gender]):
    enum = Gender
To be able to access the IntEnum subclass as a class attribute, we needed to use typing.ClassVar[]; otherwise the type checker has to assume the attribute is only available on instances.
And because the Gender IntEnum subclass is itself a class, we need to tell the type checker about that too, hence the use of typing.Type[].
Now the Genders concrete subclass works; the use of EnumAggregate[Gender] as a base class tells the type checker to substitute T for Gender everywhere, and because the implementation uses enum = Gender, the type checker sees that this is indeed correctly satisfied and the code passes all checks:
$ bin/mypy so65064844.py
Success: no issues found in 1 source file
and you can call Genders.all() to produce a string:
>>> Genders.all()
'123'
Note that I'd not store the enum values as strings, but rather as integers. There is little value in converting them back and forth here, and you are limiting yourself to enums with values between 0 and 9 (single digits).
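To see the single-digit limitation, consider a hypothetical IntEnum with a two-digit value (names and values made up for illustration):
class Status(IntEnum):
    OK = 1
    PENDING = 10

class Statuses(EnumAggregate[Status]):
    enum = Status

s = Statuses([Status.OK, Status.PENDING])
print(s.value)    # '110' -- ambiguous once values leave the 0-9 range
print(s.to_list)  # raises ValueError: 0 is not a valid Status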
The other answer does not work anymore, at least in Python 3.10: the type annotation ClassVar[Type[T]] now results in the mypy error "ClassVar cannot contain type variables". This is because ClassVar should only be used in a Protocol and for structural subtyping, which is not the best answer for the problem at hand.
The following modification of the other answer works:
class EnumAggregate(Generic[T]):
    enum: type[T]

    [...]

class Genders(EnumAggregate[Gender]):
    enum = Gender
Abstract class variables
I would also recommend making enum abstract in some way, so instantiating EnumAggregate[Gender] instead of Genders will raise an error at the time of instantiation, not only when to_list or all() is used.
This can be done in two ways: Either check the implementation in __init__:
class EnumAggregate(Generic[T]):
    enum: type[T]

    def __init__(self, ...):
        [...]
        if not hasattr(type(self), 'enum'):
            raise NotImplementedError("Implementations must define the class variable 'enum'")
Or use an abstract class property, see this discussion. This makes mypy happy in several situations, but not Pylance (see here):
class EnumAggregate(Generic[T]):
    @property
    @classmethod
    @abstractmethod
    def enum(cls) -> type[T]: ...

    [...]

class Genders(EnumAggregate[Gender]):
    enum = Gender
However, there are unresolved problems with mypy and decorators, so right now there are spurious errors which might disappear in the future. For reference:
mypy issue 1
mypy issue 2
Discussion whether to deprecate chaining classmethod decorators