Mypy: Generic container with some methods only valid if extra protocols apply

In mypy, how would you specify that a type Generic over T has methods that are only valid if T meets certain conditions?
For example, if we made a custom collection class with a min method, returning the smallest element in the collection:
from typing import Generic, List, TypeVar

T = TypeVar("T")

class MyCollection(Generic[T]):
    def __init__(self, some_list: List[T]):
        self._storage = some_list

    def min(self) -> T:  # This requires that T implements __lt__
        "Get the smallest element in the collection"
        return min(self._storage)
How can you tell the type system that calling min on MyCollection of T is only allowed if T implements __lt__?
So basically I'd like to have some methods of a generic container only be valid if extra protocols are met.
-- Useful links --
You can see from the type hints in the standard library stubs for min that a protocol is defined there to enforce that __lt__ is implemented:
class SupportsLessThan(Protocol):
    def __lt__(self, __other: Any) -> bool: ...

SupportsLessThanT = TypeVar("SupportsLessThanT", bound=SupportsLessThan)  # noqa: Y001

In the same stub file you linked, look at the type hints for list.sort:
class list(MutableSequence[_T], Generic[_T]):
    ...
    @overload
    def sort(self: List[SupportsLessThanT], *, key: None = ..., reverse: bool = ...) -> None: ...
    @overload
    def sort(self, *, key: Callable[[_T], SupportsLessThan], reverse: bool = ...) -> None: ...
By type hinting self, you can specify that a method is only applicable for certain specializations of a generic class. You can also see this documented in the mypy docs.
Your min would thus be annotated as
def min(self: 'MyCollection[SupportsLessThanT]') -> SupportsLessThanT:
    ...
with a suitable definition of SupportsLessThanT.
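Putting the pieces together, a minimal self-contained sketch (the protocol mirrors the stub shown above):

from typing import Any, Generic, List, Protocol, TypeVar

class SupportsLessThan(Protocol):
    def __lt__(self, __other: Any) -> bool: ...

T = TypeVar("T")
SupportsLessThanT = TypeVar("SupportsLessThanT", bound=SupportsLessThan)

class MyCollection(Generic[T]):
    def __init__(self, some_list: List[T]):
        self._storage = some_list

    def min(self: "MyCollection[SupportsLessThanT]") -> SupportsLessThanT:
        return min(self._storage)

MyCollection([3, 1, 2]).min()   # OK: int implements __lt__
MyCollection([3j, 1j]).min()    # mypy error: complex does not implement __lt__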

Related

Annotating type with class and instance

I'm making a semi-singleton class Foo that can have (also semi-singleton) subclasses. The constructor takes one argument, let's call it a slug, and each (sub)class is supposed to have at most one instance for each value of slug.
Let's say I have a subclass of Foo called Bar. Here is an example of calls:
Foo("a slug") -> returns a new instance of Foo, saved with key (Foo, "a slug").
Foo("some new slug") -> returns a new instance Foo, saved with key (Foo, "some new slug").
Foo("a slug") -> we have the same class and slug from step 1, so this returns the same instance that was returned in step 1.
Bar("a slug") -> we have the same slug as before, but a different class, so this returns a new instance of Bar, saved with key (Bar, "a slug").
Bar("a slug") -> this returns the same instance of Bar that we got in step 4.
I know how to implement this: a class-level dictionary associating a tuple of type and str to an instance, overriding __new__, etc. Simple stuff.
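For readers who want the runtime side spelled out, a minimal sketch of that implementation (illustrative only; the typing of _instances is what the question below is about):

class Foo:
    _instances: dict[tuple[type["Foo"], str], "Foo"] = {}

    def __new__(cls, slug: str) -> "Foo":
        key = (cls, slug)
        if key not in cls._instances:
            cls._instances[key] = super().__new__(cls)
        return cls._instances[key]

    def __init__(self, slug: str) -> None:
        # NB: __init__ runs again even when a cached instance is returned.
        self.slug = slug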
My question is how to type annotate this dictionary?
What I tried to do was something like this:
FooSubtype = TypeVar("FooSubtype", bound="Foo")

class Foo:
    _instances: Final[dict[tuple[Type[FooSubtype], str], FooSubtype]] = dict()
So, the idea is "whatever type is in the first element of the key ("assigning" it to FooSubtype type variable), the value needs to be an instance of that same type".
This fails with Type variable "FooSubtype" is unbound, and I kinda see why.
I get the same error if I split it like this:
FooSubtype = TypeVar("FooSubtype", bound="Foo")
InstancesKeyType: TypeAlias = tuple[Type[FooSubtype], str]

class Foo:
    _instances: Final[dict[InstancesKeyType, FooSubtype]] = dict()
The error points to the last line in this example, meaning it's the value type, not the key one, that is the problem.
mypy also suggests using Generic, but I don't see how to do it in this particular example, because the value's type should somehow relate to the key's type, not be a separate generic type.
This works:
class Foo:
    _instances: Final[dict[tuple[Type["Foo"], str], "Foo"]] = dict()
but it allows _instance[(Bar1, "x")] to be of type Bar2 (Bar1 and Bar2 here being different subclasses of Foo). It's not a big problem and I'm ok with leaving it like this, but I'm wondering if there is a better (stricter) approach.
This is a really great question. At first I looked it over and said "no, you can't do this at all", because you can't express any relation between a dict's key and value types. However, I then realised that your suggestion is almost possible to implement.
First, let's define a protocol that describes your desired behavior:
from typing import TypeAlias, TypeVar, Protocol

_T = TypeVar("_T", bound="Foo")

# Avoid repetition, it's just a generic alias
_KeyT: TypeAlias = tuple[type[_T], str]

class _CacheDict(Protocol):
    def __getitem__(self, __key: _KeyT[_T]) -> _T: ...
    def __delitem__(self, __key: _KeyT['Foo']) -> None: ...
    def __setitem__(self, __key: _KeyT[_T], __value: _T) -> None: ...
How does it work? It defines an arbitrary data structure with item access, such that cache_dict[(Foo1, 'foo')] resolves to type Foo1. It looks very much like a sub-part of dict (or collections.abc.MutableMapping), but with slightly different typing. Dunder argument names are almost equivalent to positional-only arguments (with /). If you need other methods (e.g. get or pop), add them to this definition as well (you may want to use overload). You'll almost certainly need __contains__, which should take the same key argument as __delitem__ but return bool.
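For example, that extra method would look like this (a sketch following the same convention as the protocol methods above):

    def __contains__(self, __key: _KeyT['Foo']) -> bool: ...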
So, now
class Foo:
    _instances: Final[_CacheDict] = cast(_CacheDict, dict())

class Foo1(Foo): pass
class Foo2(Foo): pass

reveal_type(Foo._instances[(Foo, 'foo')])   # N: Revealed type is "__main__.Foo"
reveal_type(Foo._instances[(Foo1, 'foo')])  # N: Revealed type is "__main__.Foo1"
wow, we have properly inferred value types! We cast dict to the desired type, because our typing is different from dict definitions.
It still has a problem: you can do
Foo._instances[(Foo1, 'foo')] = Foo2()
because _T just resolves to Foo here. However, this problem is completely unavoidable: even if we had some infer keyword or an Infer special form to spell def __setitem__(self, __key: _KeyT[Infer[_T]], __value: _T) -> None, it wouldn't work properly:
foo1_t: type[Foo] = Foo1 # Ok, upcasting
foo2: Foo = Foo2() # Ok again
Foo._instances[(foo1_t, 'foo')] = foo2 # Ough, still allowed, _T is Foo again
Note that we don't use any casts above, so this code is type-safe, but certainly conflicts with our intent.
So we probably have to live with the loose __setitem__, but at least we get proper types from item access.
Finally, the protocol class is not generic in _T, because otherwise all values would be inferred to the declared type instead of being resolved per method call (you can try using Protocol[_T] as a base class and watch what happens; it's pretty instructive for a deeper understanding of mypy's approach to type inference).
Here's a link to playground with full code.
Also, you can subclass a MutableMapping[_KeyT['Foo'], 'Foo'] to get more methods instead of defining them manually. It will deal with __delitem__ and __contains__ out of the box, but __setitem__ and __getitem__ still need your implementation.
Here's an alternative solution with MutableMapping and get (because get was tricky and fun to implement) (playground):
from collections.abc import MutableMapping
from abc import abstractmethod
from typing import TypeAlias, TypeVar, Final, TYPE_CHECKING, cast, overload

_T = TypeVar("_T", bound="Foo")
_Q = TypeVar("_Q")

_KeyT: TypeAlias = tuple[type[_T], str]

class _CacheDict(MutableMapping[_KeyT['Foo'], 'Foo']):
    @abstractmethod
    def __getitem__(self, __key: _KeyT[_T]) -> _T: ...
    @abstractmethod
    def __setitem__(self, __key: _KeyT[_T], __value: _T) -> None: ...

    @overload  # No-default version
    @abstractmethod
    def get(self, __key: _KeyT[_T]) -> _T | None: ...
    # Ooops, a `mypy` bug, try to replace with `__default: _T | _Q`
    # and check Foo._instances.get((Foo1, 'foo'), Foo2())
    # The type gets broader, but resolves to the more specific one in a wrong way
    @overload  # Some default
    @abstractmethod
    def get(self, __key: _KeyT[_T], __default: _Q) -> _T | _Q: ...
    # Need this because of https://github.com/python/mypy/issues/11488
    @abstractmethod
    def get(self, __key: _KeyT[_T], __default: object = None) -> _T | object: ...

class Foo:
    _instances: Final[_CacheDict] = cast(_CacheDict, dict())

class Foo1(Foo): pass
class Foo2(Foo): pass

reveal_type(Foo._instances)
reveal_type(Foo._instances[(Foo, 'foo')])   # N: Revealed type is "__main__.Foo"
reveal_type(Foo._instances[(Foo1, 'foo')])  # N: Revealed type is "__main__.Foo1"
reveal_type(Foo._instances.get((Foo, 'foo')))   # N: Revealed type is "Union[__main__.Foo, None]"
reveal_type(Foo._instances.get((Foo1, 'foo')))  # N: Revealed type is "Union[__main__.Foo1, None]"
reveal_type(Foo._instances.get((Foo1, 'foo'), Foo1()))  # N: Revealed type is "__main__.Foo1"
reveal_type(Foo._instances.get((Foo1, 'foo'), Foo2()))  # N: Revealed type is "Union[__main__.Foo1, __main__.Foo2]"

(Foo1, 'foo') in Foo._instances  # We get this for free

Foo._instances[(Foo1, 'foo')] = Foo1()
Foo._instances[(Foo1, 'foo')] = object()  # E: Value of type variable "_T" of "__setitem__" of "_CacheDict" cannot be "object"  [type-var]
Note that we don't use a Protocol now (because it needs MutableMapping to be a protocol as well) and use abstract methods instead.
Trick, don't use it!
When I was writing this answer, I discovered a mypy bug that you can abuse in a very interesting way here. We started with something like this, right?
from collections.abc import MutableMapping
from abc import abstractmethod
from typing import TypeAlias, TypeVar, Final, TYPE_CHECKING, cast, overload
_T = TypeVar("_T", bound="Foo")
_Q = TypeVar("_Q")
_KeyT: TypeAlias = tuple[type[_T], str]
class _CacheDict(MutableMapping[_KeyT['Foo'], 'Foo']):
#abstractmethod
def __getitem__(self, __key: _KeyT[_T]) -> _T: ...
#abstractmethod
def __setitem__(self, __key: _KeyT[_T], __value: _T) -> None: ...
class Foo:
_instances: Final[_CacheDict] = cast(_CacheDict, dict())
class Foo1(Foo): pass
class Foo2(Foo): pass
Foo._instances[(Foo1, 'foo')] = Foo1()
Foo._instances[(Foo1, 'foo')] = Foo2()
Now let's change the __setitem__ signature to a very weird thing. Warning: this is a bug, don't rely on this behavior! If we type __value as _T | _Q, we magically get "proper" typing with strict narrowing to the type of the first argument.
    @abstractmethod
    def __setitem__(self, __key: _KeyT[_T], __value: _T | _Q) -> None: ...
Now:
Foo._instances[(Foo1, 'foo')] = Foo1() # Ok
Foo._instances[(Foo1, 'foo')] = Foo2() # E: Incompatible types in assignment (expression has type "Foo2", target has type "Foo1") [assignment]
It is simply wrong, because the _Q part of the union can resolve to anything and is in fact unused (moreover, it shouldn't be a typevar at all, because it is used only once in the signature).
Also, this allows another invalid assignment, when right side is not a Foo subclass:
Foo._instances[(Foo1, 'foo')] = object() # passes
I'll report this soon and link the issue to this question.

Type annotation in a filter() function over a custom generator

Could you help me understand why I am getting the TypeError: 'type' object is not subscriptable error with the code below?
Maybe I'm getting this wrong, but as I understood it, the Color type annotation in the filter() function says that the function will result in an Iterable of Color, which is exactly what I want. But when I try to annotate the function, I get the error. (By the way, how come a type annotation is preventing the program from running? I thought type hints in Python only mattered inside your IDE, not at runtime.)
Any light on this would be much appreciated.
# -*- coding: utf-8 -*-
from __future__ import annotations

from typing import TypeVar, Any, Generic, Iterator, Iterable, Optional
from abc import ABC, abstractmethod
from dataclasses import dataclass

T = TypeVar('T', bound=Any)
I = TypeVar('I', bound=Any)

class AbstractGenerator(ABC, Iterator[T], Generic[T, I]):
    def __init__(self):
        super().__init__()
        self._items = None
        self._next_item = None

    @property
    def items(self) -> Any:
        return self._items

    @items.setter
    def items(self, items: Any) -> AbstractGenerator:
        self._items = items
        return self

    @property
    def next_item(self) -> Any:
        return self._next_item

    @next_item.setter
    def next_item(self, next_item: Any) -> AbstractGenerator:
        self._next_item = next_item
        return self

    @abstractmethod
    def __len__(self) -> int:
        pass

    @abstractmethod
    def __iter__(self) -> Iterable[T]:
        pass

    @abstractmethod
    def __next__(self) -> Iterable[T]:
        pass

    @abstractmethod
    def __getitem__(self, id: I) -> Iterable[T]:
        pass

ColorId = int

@dataclass(frozen=True)
class Color:
    id: ColorId
    name: str

class MyColorsGenerator(AbstractGenerator[Color, int]):
    def __init__(self):
        super().__init__()
        self._colors: list[Color] = []
        self._next_color_index: int = 0  # None

    @property
    def colors(self) -> list[Color]:
        return self._colors

    @colors.setter
    def colors(self, colors: list[Color]) -> MyColorsGenerator:
        self._colors = colors
        return self

    @property
    def next_color_index(self) -> int:
        return self._next_color_index

    @next_color_index.setter
    def next_color_index(self, next_color_index: int) -> MyColorsGenerator:
        self._next_color_index = next_color_index
        return self

    def add_color(self, color: Color) -> MyColorsGenerator:
        self.colors.append(color)
        return self

    def __len__(self) -> int:
        return len(self.colors)

    def __iter__(self) -> Iterable[Color]:
        return self

    def __next__(self) -> Iterable[Color]:
        if self.next_color_index < len(self.colors):
            self.next_color_index += 1
            return self.colors[self.next_color_index - 1]
        else:
            raise StopIteration

    def __getitem__(self, id: ColorId) -> Iterable[Color]:
        return list(filter[Color](lambda color: color.id == id, self.colors))

colors_generator: MyColorsGenerator = MyColorsGenerator()

colors_generator \
    .add_color(Color(id=0, name="Blue")) \
    .add_color(Color(id=1, name="Red")) \
    .add_color(Color(id=2, name="Yellow")) \
    .add_color(Color(id=3, name="Green")) \
    .add_color(Color(id=4, name="White")) \
    .add_color(Color(id=5, name="Black"))

# This results in: TypeError: 'type' object is not subscriptable
# colors: Optional[list[Color]] = list(filter[Color](lambda color: color.id == 4, colors_generator))

# This works; notice the only thing I did was to remove the type annotation
# for the expected generic type ([Color])
colors: Optional[list[Color]] = list(filter(lambda color: color.id == 4, colors_generator))
print(colors)
The issue is that generics aren't a language-level addition, but a library one. Specifying generic type parameters actually employs the same [] operator you use for item access in collections, except it is defined on the metaclass. For this reason the generics syntax originally only worked with specific classes in the typing module (typing.List[int], typing.Dict[str, str], etc.). Since Python 3.9, some common classes from the standard library have been extended to support the same operation for brevity, like list[int] and dict[str, str]. This is still NOT a language feature, and most classes in the standard library do not implement it.
Moreover, as you've rightfully noticed, these annotations carry (almost) no meaning for the interpreter and are (mostly) just there for the IDE. Among other things, that implies you don't instantiate generic classes as specialized generics (list() is correct; list[int]() is legal, but pointless and considered bad practice).
filter is a class in the standard library that does not provide the generic-aliasing [] operation, so you get the error that applying it is not implemented ("'type' object is not subscriptable": filter is an instance of type, and [] is the subscription operator). Python as a language does not understand the concept of a generic, so it cannot give you a better error message like "'filter' is not a generic class". Even if it could, however, you shouldn't have invoked it this way.
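A quick interpreter session (Python 3.9+) makes the distinction visible; whether [] works is decided per class, not by the language:

>>> list[int]       # list defines __class_getitem__ since Python 3.9
list[int]
>>> filter[Color]   # filter does not
Traceback (most recent call last):
  ...
TypeError: 'type' object is not subscriptable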
A special note should be made about generic functions. They CANNOT be explicitly supplied with generic parameters. So, if instead of filter we were talking about some function like:
T = typing.TypeVar("T")
def my_filter(f: typing.Callable[[T], bool], seq: list[T]) -> list[T]:
...
then there would have been no way to explicitly say that you're interested in my_filter[Color]; the type parameter can only be inferred from the arguments.
TL;DR: filter is not a generic class in terms of type annotations, so it does not support the [] operation
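If you do want a filter-style call that type checkers can specialize per element type, one workaround (a sketch, not part of the original answer; typed_filter is a hypothetical helper) is to wrap the builtin in an annotated function and let inference fill in T:

from typing import Callable, Iterable, Iterator, TypeVar

T = TypeVar("T")

def typed_filter(pred: Callable[[T], bool], seq: Iterable[T]) -> Iterator[T]:
    # Delegates to the builtin; the annotations tell the checker the element type.
    return filter(pred, seq)

# Inferred as list[Color], no subscripting needed:
colors = list(typed_filter(lambda color: color.id == 4, colors_generator))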

Classmethods on generic classes

I'm trying to call a classmethod on a generic class:
from typing import List, Union, TypeVar, Generic
from enum import IntEnum

class Gender(IntEnum):
    MALE = 1
    FEMALE = 2
    DIVERS = 3

T = TypeVar('T')

class EnumAggregate(Generic[T]):
    def __init__(self, value: Union[int, str, List[T]]) -> None:
        if value == '':
            raise ValueError(f'Parameter "value" cannot be empty!')
        if isinstance(value, list):
            self._value = ''.join([str(x.value) for x in value])
        else:
            self._value = str(value)

    def __contains__(self, item: T) -> bool:
        return item in self.to_list

    @property
    def to_list(self) -> List[T]:
        return [T(int(character)) for character in self._value]

    @property
    def value(self) -> str:
        return self._value

    @classmethod
    def all(cls) -> str:
        return ''.join([str(x.value) for x in T])
Genders = EnumAggregate[Gender]
But if I call
Genders.all()
I get the error TypeError: 'TypeVar' object is not iterable. So the TypeVar T isn't properly matched with the Enum Gender.
How can I fix this? The expected behavior would be
>>> Genders.all()
'123'
Any ideas? Or is this impossible?
Python's type hinting system is there for a static type checker to validate your code and T is just a placeholder for the type system, like a slot in a template language. It can't be used as an indirect reference to a specific type.
You need to subclass your generic type if you want to produce a concrete implementation. And because Gender is a class and not an instance, you'd need to tell the type system how you plan to use a Type[T] somewhere, too.
Because you also want to be able to use T as an Enum() (calling it with EnumSubclass(int(character))), I'd also bind the typevar; that way the type checker will understand that all concrete forms of Type[T] are callable and will produce individual T instances, but also that those T instances will always have a .value attribute:
from typing import ClassVar, List, Union, Type, TypeVar, Generic
from enum import IntEnum

T = TypeVar('T', bound=IntEnum)  # only IntEnum subclasses

class EnumAggregate(Generic[T]):
    # Concrete implementations can reference `enum` *on the class itself*,
    # which will be an IntEnum subclass.
    enum: ClassVar[Type[T]]

    def __init__(self, value: Union[int, str, List[T]]) -> None:
        if not value:
            raise ValueError('Parameter "value" cannot be empty!')
        if isinstance(value, list):
            self._value = ''.join([str(x.value) for x in value])
        else:
            self._value = str(value)

    def __contains__(self, item: T) -> bool:
        return item in self.to_list

    @property
    def to_list(self) -> List[T]:
        # the concrete implementation needs to use self.enum here
        return [self.enum(int(character)) for character in self._value]

    @property
    def value(self) -> str:
        return self._value

    @classmethod
    def all(cls) -> str:
        # the concrete implementation needs to reference cls.enum here
        return ''.join([str(x.value) for x in cls.enum])
With the above generic class you can now create a concrete implementation, using your Gender IntEnum fitted into the T slot and as a class attribute:
class Gender(IntEnum):
    MALE = 1
    FEMALE = 2
    DIVERS = 3

class Genders(EnumAggregate[Gender]):
    enum = Gender
To be able to access the IntEnum subclass as a class attribute, we needed to use typing.ClassVar[]; otherwise the type checker has to assume the attribute is only available on instances.
And because the Gender IntEnum subclass is itself a class, we need to tell the type checker about that too, hence the use of typing.Type[].
Now the concrete Genders subclass works; the use of EnumAggregate[Gender] as a base class tells the type checker to substitute T for Gender everywhere, and because the implementation uses enum = Gender, the type checker sees that this is indeed correctly satisfied and the code passes all checks:
$ bin/mypy so65064844.py
Success: no issues found in 1 source file
and you can call Genders.all() to produce a string:
>>> Genders.all()
'123'
Note that I'd not store the enum values as strings, but rather as integers. There is little value in converting it back and forth here, and you are limiting yourself to enums with values between 0 and 9 (single digits).
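A minimal sketch of that suggestion (an assumption of this edit, not code from the original answer: callers pass enum members, and the rest of the class stays as above):

    def __init__(self, members: List[T]) -> None:
        # Store the integer values directly: no digit-string encoding,
        # so enum values above 9 work fine.
        self._values: List[int] = [m.value for m in members]

    @property
    def to_list(self) -> List[T]:
        return [self.enum(v) for v in self._values]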
The other answer does not work anymore, at least in Python 3.10: the type annotation ClassVar[Type[T]] results in the mypy error "ClassVar cannot contain type variables". This is because ClassVar should only be used in a Protocol with structural subtyping, which is not the best answer for the problem at hand.
The following modification of the other answer works:
class EnumAggregate(Generic[T]):
    enum: type[T]

    [...]

class Genders(EnumAggregate[Gender]):
    enum = Gender
Abstract class variables
I would also recommend making enum abstract in some way, so instantiating EnumAggregate[Gender] instead of Genders will raise an error at the time of instantiation, not only at calls of to_list() or all().
This can be done in two ways: Either check the implementation in __init__:
class EnumAggregate(Generic[T]):
enum: type[T]
def __init__
[...]
if not hasattr(type(self), 'enum'):
raise NotImplementedError("Implementations must define the class variable 'enum'")
Or use an abstract class property, see this discussion. This makes mypy happy in several situations, but not Pylance (see here):
class EnumAggregate(Generic[T]):
    @property
    @classmethod
    @abstractmethod
    def enum(cls) -> type[T]: ...

    [...]

class Genders(EnumAggregate[Gender]):
    enum = Gender
However, there are unresolved problems with mypy and decorators, so right now there are spurious errors which might disappear in the future. For reference:
mypy issue 1
mypy issue 2
Discussion whether to deprecate chaining classmethod decorators
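A third enforcement option, not mentioned above (a sketch; it uses the standard __init_subclass__ hook to move the check to class-definition time and sidesteps the decorator-chaining issues entirely):

class EnumAggregate(Generic[T]):
    enum: type[T]

    def __init_subclass__(cls, **kwargs: object) -> None:
        super().__init_subclass__(**kwargs)
        # The bare `enum: type[T]` annotation creates no attribute, so this
        # triggers as soon as a subclass forgets to assign `enum`.
        if not hasattr(cls, 'enum'):
            raise NotImplementedError("Implementations must define the class variable 'enum'")

    [...]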

Python type hints: What does Callable followed by a TypeVar mean?

I am trying to understand the type hint Getter[T] in the following piece of code:
Simplified example
T = TypeVar('T')
Getter = Callable[[T, str], str]

class AbstractClass(abc.ABC):
    @abc.abstractmethod
    def extract(
        self,
        get_from_carrier: Getter[T],  # <---- See here
        ...
    ) -> Context:
Help much appreciated since I have been breaking my head over this.
Original source code
The original source code is from the OpenTelemetry project file "textmap.py":
import abc
import typing

from opentelemetry.context.context import Context

TextMapPropagatorT = typing.TypeVar("TextMapPropagatorT")
Setter = typing.Callable[[TextMapPropagatorT, str, str], None]
Getter = typing.Callable[[TextMapPropagatorT, str], typing.List[str]]

class TextMapPropagator(abc.ABC):
    """This class provides an interface that enables extracting and injecting
    context into headers of HTTP requests.
    ...
    """

    @abc.abstractmethod
    def extract(
        self,
        get_from_carrier: Getter[TextMapPropagatorT],
        carrier: TextMapPropagatorT,
        context: typing.Optional[Context] = None,
    ) -> Context:
A Callable followed by a type variable means that the callable is a generic function that takes one or more arguments of generic type T.
The type variable T is a parameter for any generic type.
The line:
Getter = Callable[[T, str], str]
defines Getter as a type alias for a callable function whose arguments are of generic type T and string, and whose return type is string.
Therefore, the line:
get_from_carrier: Getter[T]
defines an argument (get_from_carrier) that is a generic function. And the first argument of the generic function is of generic type T.
Concrete Example
This can be better understood by looking at a concrete example. See propagators.extract below, from "instrumentation/opentelemetry-instrumentation-asgi/src/opentelemetry/instrumentation/asgi/__init__.py":
In the call propagators.extract, the function get_header_from_scope is a callable function whose first argument is of type dict, and this dict is serving as a TextMapPropagatorT.
def get_header_from_scope(scope: dict, header_name: str) -> typing.List[str]:
    """Retrieve a HTTP header value from the ASGI scope.

    Returns:
        A list with a single string with the header value if it exists, else an empty list.
    """
    headers = scope.get("headers")
    return [
        value.decode("utf8")
        for (key, value) in headers
        if key.decode("utf8") == header_name
    ]

...

class OpenTelemetryMiddleware:
    """The ASGI application middleware.
    ...
    """
    ...

    async def __call__(self, scope, receive, send):
        """The ASGI application ..."""
        if scope["type"] not in ("http", "websocket"):
            return await self.app(scope, receive, send)
        token = context.attach(
            propagators.extract(get_header_from_scope, scope)
        )
tl;dr: _C[_T] is a generic type alias that's equivalent to Callable[[_T, int], int].
Here you're defining _C to be a type alias of Callable[[_T, int], int]. When a type alias contains a TypeVar (in this case _T), it becomes a generic type alias. You can use it the same way as you'd use built-in generic types like List[T] or Dict[K, V]; for example, _C[str] would be equivalent to Callable[[str, int], int].
Then, the type annotations of get_from_elem define it as a generic function. What this means is that the same type variable used within the entire function signature should be bound to the same class. To see what this means in practice, take a look at these function calls:
_T = typing.TypeVar('_T')
_C = typing.Callable[[_T, int], int]

def get_from_elem(get: _C[_T], elem: _T):
    ...

def foo_str(a: str, b: int) -> int:
    # This function matches `_C[str]`, i.e. `Callable[[str, int], int]`
    ...

def foo_float(a: float, b: int) -> int:
    # This function matches `_C[float]`, i.e. `Callable[[float, int], int]`
    ...

def foo_generic(a: _T, b: int) -> int:
    # This function matches `_C[_T]`; it is also a generic function
    ...

_T2 = typing.TypeVar('_T2', str, bytes)

def foo_str_like(a: _T2, b: int) -> int:
    # A generic function with constraints: type of first argument must be `str` or `bytes`
    ...

get_from_elem(foo_str, "abc")       # Correct: `_T` is bound to `str`
get_from_elem(foo_float, 1.23)      # Correct: `_T` is bound to `float`
get_from_elem(foo_str, 1.23)        # Wrong: `_T` bound to two different types `str` and `float`
get_from_elem(foo_float, [1.23])    # Wrong: `_T` bound to two different types `float` and `List[float]`
get_from_elem(foo_generic, 1.45)    # Correct: `_T` is only bound to `float`
get_from_elem(foo_str_like, 1.45)   # Wrong: `_T` is only bound to `float`, but doesn't satisfy `foo_str_like` constraints
In the last two cases, the first argument is a generic function, which does not bind the type variable, so the type variable is only bound by the second argument. However, in the last case, foo_str_like has an additional constraint on its first argument type, and the bound type float does not satisfy that constraint, so it fails type checking.

Annotating return types for methods returning self in mixins

I am using a builder pattern where most methods on a (big) class return their identity (self) and are thus annotated to return the type of the class they're a member of:
from typing import Callable, List

class TextBuilder:
    parts: List[str]  # omitted
    render: Callable[[], str]  # for brevity

    def text(self, val: str) -> "TextBuilder":
        self.parts.append(val)
        return self

    def bold(self, val: str) -> "TextBuilder":
        self.parts.append(f"<b>{val}</b>")
        return self
...
Example usage:
joined_text = TextBuilder().text("a ").bold("bold").text(" text").render()
# a <b>bold</b> text
Now as this class is growing large I would like to split and group related methods up into mixins:
class BaseBuilder:
    parts: List[str]  # omitted
    render: Callable[[], str]  # for brevity

class TextBuilder(BaseBuilder):
    def text(self, val: str):
        self.parts.append(val)
        return self
    ...

class HtmlBuilder(BaseBuilder):
    def bold(self, val: str):
        self.parts.append(f"<b>{val}</b>")
        return self
    ...

class FinalBuilder(TextBuilder, HtmlBuilder):
    pass
However, I do not see a way to annotate the mixin classes' return types so that, on the resulting class FinalBuilder, mypy believes the methods return FinalBuilder and not one of the mixin classes. All that of course assumes I want to annotate self and the return types at all, since they may not be inferable from what goes on inside those methods.
I have tried making the mixin classes generic and marking them explicitly as returning a type T bound to BaseBuilder, but that did not satisfy mypy. Any ideas? For now I am just going to skip these shenanigans and omit the return types everywhere, as they should be properly inferred when using FinalBuilder, but I'm still curious whether there is a general way to approach this.
If you want the return type to always be what self is, just annotate the self parameter like so:
from typing import List, Callable, TypeVar

T = TypeVar('T', bound='BaseBuilder')  # string forward reference: BaseBuilder is defined below

class BaseBuilder:
    parts: List[str]  # omitted
    render: Callable[[], str]  # for brevity

class TextBuilder(BaseBuilder):
    def text(self: T, val: str) -> T:
        self.parts.append(val)
        return self
    ...

class HtmlBuilder(BaseBuilder):
    def bold(self: T, val: str) -> T:
        self.parts.append(f"<b>{val}</b>")
        return self
    ...

class FinalBuilder(TextBuilder, HtmlBuilder):
    pass

# Type checks
f = FinalBuilder().text("foo").bold("bar")

# Mypy states this is type 'FinalBuilder'
reveal_type(f)
A few notes:
If we don't annotate self, mypy will normally assume it's the type of whatever class we're currently contained in. However, it's actually fine to give it a custom type hint if you want, so long as that type hint is compatible with the class. (For example, it wouldn't be legal to add a def foo(self: int) -> None to HtmlBuilder since int isn't a supertype of HtmlBuilder.)
We take advantage of this by making self generic so we can specify a more specific return type.
See the mypy docs for more details: https://mypy.readthedocs.io/en/stable/generics.html#generic-methods-and-generic-self
I bound the TypeVar to BaseBuilder so that both functions can see the parts and render fields. If you want your text(...) and bold(...) functions to also see fields defined within TextBuilder and HtmlBuilder respectively, you'll need to create two TypeVars bound to these more specific child classes.
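On Python 3.11+ (or earlier with typing_extensions), the same effect can be had with typing.Self instead of a hand-written TypeVar; a sketch of the equivalent, assuming the class layout stays as above:

from typing import Self  # Python 3.11+; earlier: from typing_extensions import Self

class TextBuilder(BaseBuilder):
    def text(self, val: str) -> Self:
        self.parts.append(val)
        return self

class HtmlBuilder(BaseBuilder):
    def bold(self, val: str) -> Self:
        self.parts.append(f"<b>{val}</b>")
        return self

# FinalBuilder().text("foo").bold("bar") is still inferred as FinalBuilder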
