Consider:
from __future__ import annotations
class A:
@classmethod
def get(cls) -> A:
return cls()
class B(A):
pass
def func() -> B: # Line 12
return B.get()
Running mypy on this we get:
$ mypy test.py
test.py:12: error: Incompatible return value type (got "A", expected "B")
Found 1 error in 1 file (checked 1 source file)
Additionally, I have checked to see if old-style recursive annotations work. That is:
# from __future__ import annotations
class A:
@classmethod
def get(cls) -> "A":
# ...
...to no avail.
Of course one could do:
from typing import cast
def func() -> B: # Line 12
return cast(B, B.get())
every time this case pops up, but I would like to avoid doing that.
How should one go about typing this?
The cls and self parameters are usually inferred by mypy to avoid a lot of redundant code, but when required they can be specified explicitly with annotations.
In this case the explicit type for the class method would look like the following:
class A:
@classmethod
def get(cls: Type[A]) -> A:
return cls()
So what we really need here is a way to make Type[A] generic, so that when the class method is called from a child class, it refers to that child class instead. Luckily, TypeVar gives us exactly that.
Working this into your existing example, we get the following:
from __future__ import annotations
from typing import TypeVar, Type
T = TypeVar('T')
class A:
@classmethod
def get(cls: Type[T]) -> T:
return cls()
class B(A):
pass
def func() -> B:
return B.get()
Now mypy should be your friend again! 😎
Related
Python's typing system allows for generics in classes:
class A(Generic[T]):
def get_next(self) -> T: ...
which is very handy. However, even in 3.11 with the Self type, I cannot find a way to change the type argument (the T) without specifying the class name. Here's the recommended usage from PEP 673 (Self Type): https://peps.python.org/pep-0673/
class Container(Generic[T]):
def foo(
self: Container[T],
) -> Container[str]:
# maybe implementing something like:
return self.__class__([str(x) for x in self])
The problem is if I want to subclass container:
class SuperContainer(Container[T]):
def time_travel(self): ...
And then if I have an instance of SuperContainer and call foo on it, the typing will be wrong, and think that it's a Container not SuperContainer.
sc = SuperContainer([1, 2, 3])
sc2 = sc.foo()
reveal_type(sc2) # mypy: Container[str]
sc2.time_travel() # typing error: only SuperContainers can time-travel
isinstance(sc2, SuperContainer) # True
Is there an accepted way to allow a program to change the type argument in the superclass that preserves the typing of the subclass?
To solve this, you need a second generic type argument, to represent the return type of foo.
SelfStr = TypeVar("SelfStr", bound="Container[str, Any]", covariant=True)
The Any is okay. We'll see that later.
So far so good. Let's define the Container:
class Container(Generic[T, SelfStr]):
def __init__(self, contents: list[T]):
self._contents = contents
def __iter__(self):
return iter(self._contents)
def foo(self) -> SelfStr:
reveal_type(type(self))
# Mypy is wrong here: it thinks that type(self) is already annotated, but in fact the type parameters are erased.
return type(self)([str(x) for x in self]) # type: ignore
def __repr__(self):
return type(self).__name__ + "(" + repr(self._contents) + ")"
Note that we had to ignore the types in foo. This is because mypy has inferred the type of type(self) incorrectly. It thinks that type(self) returns Container[...] (or a subclass), but in fact it returns Container (or a subclass). You'll see that when we get to running this code.
Next, we need some way of creating a container. We want the type to look like Container[T, Container[str, Container[str, ...]]].
In the first line of the class declaration, we made the second type-parameter of the class be a SelfStr, which is itself Container[str, Any]. This means that definition of SelfStr should become bounded to Container[str, SelfStr], so we should get an upper bound of Container[str, Container[str, ...]] as we wanted. This works: it will only allow our recursive type (or subclasses) or Any. Unfortunately, mypy won't do inference on recursive generic types, reporting test.Container[builtins.int, <nothing>], so we have to do the heavy lifting. Time for some ✨ magic ✨.
_ContainerStr: TypeAlias = Container[str, "_ContainerStr"]
ContainerComplete: TypeAlias = Container[T, _ContainerStr]
The _ContainerStr alias will give us the recursive part of the signature. We then expose ContainerComplete, which we can use as a constructor, for example:
ContainerComplete[int]([1,2,3])
Awesome! But what about subclasses? We just have to do the same thing again, for our subclass:
class SuperContainer(Container[T, SelfStr]):
def time_travel(self):
return "magic"
_SuperContainerStr: TypeAlias = SuperContainer[str, "_SuperContainerStr"]
SuperContainerComplete: TypeAlias = SuperContainer[T, _SuperContainerStr]
All done! Now let's demonstrate:
sc = SuperContainerComplete[int]([3, 4, 5])
reveal_type(sc)
sc2 = sc.foo()
reveal_type(sc2)
print(sc2.time_travel())
Putting everything together, we get:
from typing import TypeVar, Generic, Any, TypeAlias, TYPE_CHECKING
if not TYPE_CHECKING:
reveal_type = print
T = TypeVar('T')
SelfStr = TypeVar("SelfStr", bound="Container[str, Any]", covariant=True)
class Container(Generic[T, SelfStr]):
def __init__(self, contents: list[T]):
self._contents = contents
def __iter__(self):
return iter(self._contents)
def foo(self) -> SelfStr:
reveal_type(type(self))
# Mypy is wrong here: it thinks that type(self) is already annotated, but in fact the type parameters are erased.
return type(self)([str(x) for x in self]) # type: ignore
def __repr__(self):
return type(self).__name__ + "(" + repr(self._contents) + ")"
_ContainerStr: TypeAlias = Container[str, "_ContainerStr"]
ContainerComplete: TypeAlias = Container[T, _ContainerStr]
class SuperContainer(Container[T, SelfStr]):
def time_travel(self):
return "magic"
_SuperContainerStr: TypeAlias = SuperContainer[str, "_SuperContainerStr"]
SuperContainerComplete: TypeAlias = SuperContainer[T, _SuperContainerStr]
sc = SuperContainerComplete[int]([3, 4, 5])
reveal_type(sc)
sc2 = sc.foo()
reveal_type(sc2)
print(sc2.time_travel())
And the output looks like this (you need a recent version of mypy):
$ mypy test.py
test.py:17: note: Revealed type is "Type[test.Container[T`1, SelfStr`2]]"
test.py:33: note: Revealed type is "test.SuperContainer[builtins.int, test.SuperContainer[builtins.str, ...]]"
test.py:36: note: Revealed type is "test.SuperContainer[builtins.str, test.SuperContainer[builtins.str, ...]]"
Success: no issues found in 1 source file
$ python test.py
<__main__.SuperContainer object at 0x7f30165582d0>
<class '__main__.SuperContainer'>
<__main__.SuperContainer object at 0x7f3016558390>
magic
$
You can remove a lot of the boilerplate using metaclasses. This has the added advantage that it's inherited. If you override __call__, you can even get isinstance working properly (it doesn't work with the generic type aliases *Complete, it still works fine for the classes themselves).
Note that this only partially works in PyCharm.
I have no clue how this works, but I made it work with Type[T] too. I genuinely cannot explain this code, so I'm just going to copy and paste it; better people than me can tell you why.
from typing import TypeVar, Generic, Any, TypeAlias, TYPE_CHECKING, Type
if not TYPE_CHECKING:
reveal_type = print
T = TypeVar('T')
SelfStr = TypeVar("SelfStr", bound="Container[str, Any, Any]", covariant=True)
SelfTypeT = TypeVar("SelfTypeT", bound="Container[Type[Any], Any, Any]", covariant=True)
class Container(Generic[T, SelfStr, SelfTypeT]):
def __init__(self, contents: list[T]):
self._contents = contents
def __iter__(self):
return iter(self._contents)
def foo(self) -> SelfStr:
reveal_type(type(self))
# Mypy is wrong here: it thinks that type(self) is already annotated, but in fact the type parameters are erased.
return type(self)([str(x) for x in self]) # type: ignore
def get_types(self) -> SelfTypeT:
return type(self)([type(x) for x in self]) # type: ignore
def __repr__(self):
return type(self).__name__ + "(" + repr(self._contents) + ")"
_ContainerStr: TypeAlias = Container[str, "_ContainerStr", "ContainerComplete[Type[str]]"]
_ContainerTypeT: TypeAlias = Container[Type[T], "_ContainerStr", "_ContainerTypeT[Type[type]]"]
ContainerComplete: TypeAlias = Container[T, _ContainerStr, _ContainerTypeT[T]]
class SuperContainer(Container[T, SelfStr, SelfTypeT]):
def time_travel(self):
return "magic"
_SuperContainerStr: TypeAlias = SuperContainer[str, "_SuperContainerStr", "SuperContainerComplete[Type[str]]"]
_SuperContainerTypeT: TypeAlias = SuperContainer[Type[T], "_SuperContainerStr", "_SuperContainerTypeT[Type[type]]"]
SuperContainerComplete: TypeAlias = SuperContainer[T, _SuperContainerStr, _SuperContainerTypeT[T]]
sc = SuperContainerComplete[int]([3, 4, 5])
reveal_type(sc)
sc2 = sc.foo()
reveal_type(sc2)
sc3 = sc.get_types()
reveal_type(sc3)
class Base:
pass
class Impl1(Base):
pass
class Impl2(Base):
pass
sc4 = SuperContainerComplete[Base]([Impl1(), Impl2()])
sc5 = sc4.foo()
reveal_type(sc5)
sc6 = sc4.get_types()
reveal_type(sc6)
print(sc2.time_travel())
Mypy and Python are both happy with this code so I guess It Works (TM).
I'm trying to design my code as follows: each subclass that implements my functionality has as a member a collection of fields, which can also inherit from a base dataclass.
from dataclasses import dataclass
from abc import ABC, abstractmethod
@dataclass
class baseFields:
pass
@dataclass
class moreFields(baseFields):
name: str = "john"
class A(ABC):
def __init__(self) -> None:
super().__init__()
self.fields: baseFields = baseFields()
@abstractmethod
def say_hi(self) -> None:
pass
class B(A):
def __init__(self) -> None:
super().__init__()
self.fields = moreFields()
def say_hi(self) -> None:
print(f"Hi {self.fields.name}!")
if __name__ == "__main__":
b = B()
b.say_hi()
When I run it, I get Hi john! as output, as expected.
But mypy doesn't seem to recognize it:
❯ mypy dataclass_inheritence.py
dataclass_inheritence.py:25: error: "baseFields" has no attribute "name"
Found 1 error in 1 file (checked 1 source file)
I looked and found this github issue, and it links to another one, but doesn't seem like it offers a solution.
I should also note that if I remove the @dataclass decorators and implement the fields classes as plain ol' classes with __init__, I still get the same mypy error.
My motivation (as you may tell) is to reference composite members within the implemented methods of the functional subclasses. Those members are constants, as in the example, so I might use some form of Enum inheritance, but looking at this question it's not a popular design choice (will have to use some 3rd party module which I'm not keen on doing).
Has anyone encountered something similar? Do you have suggestions for a design that could achieve my goal?
The type of self.fields is declared as baseFields in A.__init__, and is not narrowed implicitly by assigning a moreFields to it in B.__init__ -- after all, you might want to be able to re-assign it to another baseFields instance, and it is therefore never assumed to be anything more specific than baseFields.
If you explicitly annotate it as moreFields in B.__init__, the error goes away:
class B(A):
def __init__(self) -> None:
super().__init__()
self.fields: moreFields = moreFields()
def say_hi(self) -> None:
print(f"Hi {self.fields.name}!") # ok!
although this actually feels like a bug in mypy, because now you can do this, violating the LSP:
if __name__ == "__main__":
b: A = B()
b.fields = baseFields() # no mypy error, because b is an A, right?
b.say_hi() # runtime AttributeError because b is actually a B!
If I want a subclass to be able to narrow the type of an attribute, I make it a property backed by private attributes:
class A(ABC):
def __init__(self) -> None:
super().__init__()
self.__baseFields = baseFields()
@property
def fields(self) -> baseFields:
return self.__baseFields
@abstractmethod
def say_hi(self) -> None:
pass
class B(A):
def __init__(self) -> None:
super().__init__()
self.__moreFields = moreFields()
@property
def fields(self) -> moreFields:
return self.__moreFields
def say_hi(self) -> None:
print(f"Hi {self.fields.name}!") # ok!
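To see why the read-only property closes the LSP hole from above, here is a self-contained sketch (dataclass names as in the question): assigning to fields now fails for every A, at runtime just as under mypy.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass

@dataclass
class baseFields:
    pass

@dataclass
class moreFields(baseFields):
    name: str = "john"

class A(ABC):
    def __init__(self) -> None:
        self.__fields = baseFields()

    @property
    def fields(self) -> baseFields:  # read-only: no setter is defined
        return self.__fields

    @abstractmethod
    def say_hi(self) -> None: ...

class B(A):
    def __init__(self) -> None:
        super().__init__()
        self.__fields = moreFields()

    @property
    def fields(self) -> moreFields:  # narrowing the return type of a getter is safe
        return self.__fields

    def say_hi(self) -> None:
        print(f"Hi {self.fields.name}!")

b: A = B()
b.say_hi()  # Hi john!
```

Now b.fields = baseFields() raises AttributeError at runtime (and mypy rejects it), which is exactly the violation the plain attribute allowed.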
You can use a generic base class to parameterize the type of fields. I would also have the fields attribute be passed to the base class constructor. There are some subtle tricks to get the signature of the __init__ method working, but the following should work.
Some imports you'll want:
from __future__ import annotations
from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import Generic, TypeVar, overload
Rename the classes with more Pythonic names, and define a TypeVar to represent which fields class we are using.
@dataclass
class BaseFields:
pass
@dataclass
class MoreFields(BaseFields):
name: str = "john"
Fields = TypeVar('Fields', bound=BaseFields)
For defining the base class, we want to allow the fields param to be anything satisfying the TypeVar. We also need to add some overloads to handle the case where a default is used or not.
class A(Generic[Fields], ABC):
fields: Fields
@overload
def __init__(self: A[BaseFields]) -> None:
...
@overload
def __init__(self: A[Fields], fields: Fields) -> None:
...
def __init__(self, fields=None):
self.fields = fields or BaseFields()
@abstractmethod
def say_hi(self) -> None:
pass
Now we can run our test:
class B(A[MoreFields]):
def __init__(self) -> None:
super().__init__(MoreFields())
def say_hi(self) -> None:
print(f"Hi {self.fields.name}!")
if __name__ == "__main__":
b = B()
b.say_hi()
$ mypy test.py
Success: no issues found in 1 source file
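As a quick runtime sanity check (my own sketch, not part of the original answer), a subclass that uses the no-argument overload falls back to the BaseFields default:

```python
from __future__ import annotations
from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import Generic, TypeVar, overload

@dataclass
class BaseFields:
    pass

@dataclass
class MoreFields(BaseFields):
    name: str = "john"

Fields = TypeVar('Fields', bound=BaseFields)

class A(Generic[Fields], ABC):
    fields: Fields

    @overload
    def __init__(self: A[BaseFields]) -> None: ...
    @overload
    def __init__(self: A[Fields], fields: Fields) -> None: ...
    def __init__(self, fields=None):
        # The no-argument call falls through to the BaseFields default
        self.fields = fields or BaseFields()

    @abstractmethod
    def say_hi(self) -> None: ...

class C(A[BaseFields]):  # hypothetical subclass relying on the default
    def say_hi(self) -> None:
        print("Hi!")

c = C()
print(type(c.fields).__name__)  # BaseFields
```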
Consider the following code:
from typing import Union
class A:
def function_in_a(self) -> str:
return 'a'
class B:
def function_in_b(self) -> str:
return "b"
class C(A, B):
pass
def call_functions(c: Union[A, B]) -> None:
print(c.function_in_a())
print(c.function_in_b())
if __name__=="__main__":
c = C()
call_functions(c)
Note that the function call_functions relies on definitions contained in both classes A and B. It expects objects that inherit from both of these classes.
This code runs fine under python test.py, but mypy --strict test.py throws errors:
test.py:15: note: Revealed type is "Union[test.A, test.B]"
test.py:16: error: Item "B" of "Union[A, B]" has no attribute "function_in_a"
test.py:17: error: Item "A" of "Union[A, B]" has no attribute "function_in_b"
Found 2 errors in 1 file (checked 1 source file)
This makes sense to me. Union means that c can be a subclass of either A or B, but not both. I saw mention of an Intersection type in PEP483 but a quick perusal of the typing module docs showed that this type was never implemented.
How can I get mypy to recognize that parameters of call_functions are objects which inherit from both A and B using type hinting?
Use typing.Protocol (new in Python 3.8) to define a type that must implement both methods invoked in the function.
from typing import Protocol
class A:
def function_in_a(self) -> str:
return 'a'
class B:
def function_in_b(self) -> str:
return "b"
class C(A, B):
pass
class D(B):
pass
class ProtoAB(Protocol):
def function_in_a(self) -> str:
...
def function_in_b(self) -> str:
...
def call_functions(obj: ProtoAB) -> None:
print(obj.function_in_a())
print(obj.function_in_b())
def main() -> None:
c = C()
call_functions(c)
d = D()
call_functions(d)
if __name__ == "__main__":
main()
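One more detail worth knowing (not covered in the answer above): decorating the Protocol with typing.runtime_checkable also makes isinstance work structurally, by checking for the methods by name. A sketch:

```python
from typing import Protocol, runtime_checkable

@runtime_checkable
class ProtoAB(Protocol):
    def function_in_a(self) -> str: ...
    def function_in_b(self) -> str: ...

class C:  # note: no inheritance from ProtoAB is needed
    def function_in_a(self) -> str:
        return 'a'
    def function_in_b(self) -> str:
        return "b"

print(isinstance(C(), ProtoAB))  # True: both methods are present
```

Be aware that runtime_checkable only checks for the presence of the members, not their signatures.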
Another solution is to make A and B Protocols themselves. Protocols can be used as normal classes:
from typing import Protocol
class A(Protocol):
def function_in_a(self) -> str:
return 'a'
class B(Protocol):
def function_in_b(self) -> str:
return "b"
class AB(A, B, Protocol):
pass
def call_functions(c: AB) -> None:
print(c.function_in_a())
print(c.function_in_b())
class C(A, B):
pass
call_functions(C())
I want to use multiple generic protocols and ensure they're compatible:
from typing import TypeVar, Protocol, Generic
from dataclasses import dataclass
# checking fails as below and with contravariant=True or covariant=True:
A = TypeVar("A")
class C(Protocol[A]):
def f(self, a: A) -> None: pass
class D(Protocol[A]):
def g(self) -> A: pass
# Just demonstrates my use case; doesn't have errors:
@dataclass
class CompatibleThings(Generic[A]):
c: C[A]
d: D[A]
Mypy gives the following error:
Invariant type variable 'A' used in protocol where contravariant one is expected
Invariant type variable 'A' used in protocol where covariant one is expected
I know this can be done by making C and D generic ABC classes, but I want to use protocols.
The short explanation is that your approach breaks subtype transitivity; see this section of PEP 544 for more information. It gives a pretty clear explanation of why your D protocol (and, implicitly, your C protocol) runs into this problem, and why solving it requires a different kind of variance for each. You can also look on Wikipedia for info on type variance.
Here's the workaround: use covariant and contravariant protocols, but make your generic dataclass invariant. The big hurdle here is inheritance, which you have to handle in order to use Protocols, but is kind of tangential to your goal. I'm going to switch naming here to highlight the inheritance at play, which is what this is all about:
A = TypeVar("A") # Invariant type
A_cov = TypeVar("A_cov", covariant=True) # Covariant type
A_contra = TypeVar("A_contra", contravariant=True) # Contravariant type
# Give Intake its contravariance
class Intake(Protocol[A_contra]):
def f(self, a: A_contra) -> None: pass
# Give Output its covariance
class Output(Protocol[A_cov]):
def g(self) -> A_cov: pass
# Just tell IntakeOutput that the type needs to be the same
# Since a is invariant, it doesn't care that
# Intake and Output require contra / covariance
@dataclass
class IntakeOutput(Generic[A]):
intake: Intake[A]
output: Output[A]
You can see that this works with the following tests:
class Animal:
...
class Cat(Animal):
...
class Dog(Animal):
...
class IntakeCat:
def f(self, a: Cat) -> None: pass
class IntakeDog:
def f(self, a: Dog) -> None: pass
class OutputCat:
def g(self) -> Cat: pass
class OutputDog:
def g(self) -> Dog: pass
compat_cat: IntakeOutput[Cat] = IntakeOutput(IntakeCat(), OutputCat())
compat_dog: IntakeOutput[Dog] = IntakeOutput(IntakeDog(), OutputDog())
# This is gonna error in mypy
compat_fail: IntakeOutput[Dog] = IntakeOutput(IntakeDog(), OutputCat())
which gives the following error:
main.py:48: error: Argument 2 to "IntakeOutput" has incompatible type "OutputCat"; expected "Output[Dog]"
main.py:48: note: Following member(s) of "OutputCat" have conflicts:
main.py:48: note: Expected:
main.py:48: note: def g(self) -> Dog
main.py:48: note: Got:
main.py:48: note: def g(self) -> Cat
So what's the catch? What are you giving up? Namely, inheritance in IntakeOutput. Here's what you can't do:
class IntakeAnimal:
def f(self, a: Animal) -> None: pass
class OutputAnimal:
def g(self) -> Animal: pass
# Ok, as expected
ok1: IntakeOutput[Animal] = IntakeOutput(IntakeAnimal(), OutputAnimal())
# Ok, because Output is covariant
ok2: IntakeOutput[Animal] = IntakeOutput(IntakeAnimal(), OutputDog())
# Both fail, because Intake is contravariant
fails1: IntakeOutput[Animal] = IntakeOutput(IntakeDog(), OutputDog())
fails2: IntakeOutput[Animal] = IntakeOutput(IntakeDog(), OutputAnimal())
# Ok, because Intake is contravariant
ok3: IntakeOutput[Dog] = IntakeOutput(IntakeAnimal(), OutputDog())
# This fails, because Output is covariant
fails3: IntakeOutput[Dog] = IntakeOutput(IntakeAnimal(), OutputAnimal())
fails4: IntakeOutput[Dog] = IntakeOutput(IntakeDog(), OutputAnimal())
So. There it is.
Please check the below code
import typing
import abc
class A(abc.ABC):
@abc.abstractmethod
def f(self) -> typing.NamedTuple[typing.Union[int, str], ...]:
...
class NT(typing.NamedTuple):
a: int
b: str
class B(A):
def f(self) -> NT:
return NT(1, "s")
print(B().f())
I get an error. In the parent class A I want to define the method f so as to indicate that any child class should override it by returning a NamedTuple made up of int or str fields only.
But I get an error saying:
TypeError: 'NamedTupleMeta' object is not subscriptable
Changing the signature as below helps, but then how will I tell the typing system that the child class can return NamedTuples that have only int and str fields?
class A(abc.ABC):
@abc.abstractmethod
def f(self) -> typing.NamedTuple:
...
The issue is that fundamentally typing.NamedTuple is not a proper type. It essentially allows you to use the class factory collections.namedtuple using the syntax of inheritance and type annotations. It's sugar.
This is misleading. Normally, we expect:
class Foo(Bar):
pass
foo = Foo()
print(isinstance(foo, Bar))
to always print True. But typing.NamedTuple actually, through metaclass machinery, just makes something a descendant of tuple, exactly like collections.namedtuple. Indeed, practically its only reason to exist is to use the NamedTupleMetaclass to intercept class creation. Perhaps the following will be illuminating:
>>> from typing import NamedTuple
>>> class Employee(NamedTuple):
... """Represents an employee."""
... name: str
... id: int = 3
...
>>> isinstance(Employee(1,2), NamedTuple)
False
>>>
>>> isinstance(Employee(1,2), tuple)
True
Some may find this dirty, but as stated in the Zen of Python, practicality beats purity.
And note, people often get confused about collections.namedtuple, which is itself not a class but a class factory. So:
>>> import collections
>>> Point = collections.namedtuple("Point", "x y")
>>> p = Point(0, 0)
>>> isinstance(p, collections.namedtuple)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
TypeError: isinstance() arg 2 must be a type or tuple of types
Although note, the classes generated by namedtuple/NamedTuple do act as expected when you inherit from them.
Note, your solution:
import typing
import abc
class A(abc.ABC):
@abc.abstractmethod
def f(self) -> typing.Tuple:
...
class NT(typing.NamedTuple):
a: int
b: str
class B(A):
def f(self) -> NT:
return NT(1, "s")
print(B().f())
Doesn't pass mypy:
(py38) juan$ mypy test_typing.py
test_typing.py:18: error: Return type "NT" of "f" incompatible with return type "NamedTuple" in supertype "A"
Found 1 error in 1 file (checked 1 source file)
However, using Tuple does:
class A(abc.ABC):
@abc.abstractmethod
def f(self) -> typing.Tuple[typing.Union[str, int],...]:
...
Although, that may not be very useful.
What you really want is some sort of structural typing, but I can't think of any way to use typing.Protocol for this. Basically, it can't express "any type with a variadic number of attributes, all of which are typing.Union[int, str]".
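Since the constraint cannot be expressed statically, one pragmatic workaround (my own sketch, not something the typing module offers) is to keep the Tuple[Union[int, str], ...] signature and enforce the field types at runtime in the base class:

```python
import abc
import typing

class A(abc.ABC):
    @abc.abstractmethod
    def f(self) -> typing.Tuple[typing.Union[int, str], ...]:
        ...

    def checked_f(self) -> typing.Tuple[typing.Union[int, str], ...]:
        # Validate at runtime what the type system cannot express statically
        result = self.f()
        if not all(isinstance(v, (int, str)) for v in result):
            raise TypeError("f() must return only int or str fields")
        return result

class NT(typing.NamedTuple):
    a: int
    b: str

class B(A):
    def f(self) -> NT:  # fine: NT is a subtype of Tuple[Union[int, str], ...]
        return NT(1, "s")

print(B().checked_f())  # NT(a=1, b='s')
```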