In a class Foo I store a static method in a class variable so that I can subclass Foo later and simply swap that function for another one. The class contains some methods which call this exchangeable function. The code below does not produce any mypy issues.
from typing import ClassVar

def some_function(text: str) -> None:
    print(text)

class Foo:
    _some_func: ClassVar[staticmethod] = staticmethod(some_function)

    def some_method(self, text: str) -> None:
        self._some_func(text)

if __name__ == "__main__":
    Foo().some_method("Hello World!")
Now, I am trying to improve my typing, so I want to use a callback protocol to actually add typing for Foo._some_func. I have created the following protocol class:
from typing import Protocol

class SomeFuncProtocol(Protocol):
    def __call__(self, __text: str) -> None:
        ...
It does work as long as I use _some_func: ClassVar[SomeFuncProtocol] = some_function, but I can't find a way to use staticmethod and the protocol class for typing. I wish for something like the following, but mypy tells me that staticmethod does not expect a type argument.
class Foo:
    _some_func: ClassVar[staticmethod[SomeFuncProtocol]] = staticmethod(some_function)
    ...
Does anybody know how to do it?
I was stuck on a similar thing for a while; here's what worked for me:
from typing import ClassVar, Protocol

def some_function(text: str) -> None:
    print(text)

class SomeFuncProtocol(Protocol):
    def __call__(self, __text: str) -> None:
        return

class Foo:
    _some_func: ClassVar[SomeFuncProtocol] = staticmethod(some_function)

Foo._some_func('a')
Foo()._some_func('a')
Foo._some_func = some_function
Foo()._some_func = some_function  # E: Cannot assign to class variable "_some_func" via instance
The code above typechecks (except for the last line, which is intentionally incorrect).
You don't need staticmethod in the type annotation: it's a function that (simplified) takes a callable as an argument and returns another callable with the same signature, but explicitly marked as not accepting self. So the return type of staticmethod is the same callable, and we can express it like this:
from typing import Any, Callable, TypeVar
_C = TypeVar('_C', bound=Callable[..., Any])
def staticmethod(func: _C) -> _C: ...
You can try it in the playground.
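To connect this back to the original goal of swapping the function in a subclass, here is a minimal sketch of my own (another_function and Bar are illustrative names, and I haven't verified it against every mypy/typeshed combination):

def another_function(text: str) -> None:
    print(text.upper())

class Bar(Foo):
    # Re-declaring the ClassVar with another conforming callable; mypy checks the
    # assignment against SomeFuncProtocol, exactly as it does in Foo above.
    _some_func: ClassVar[SomeFuncProtocol] = staticmethod(another_function)

Bar._some_func('a')  # should still typecheck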
Given a class with a helper method for initialization:
class TrivialClass:
    def __init__(self, str_arg: str):
        self.string_attribute = str_arg

    @classmethod
    def from_int(cls, int_arg: int) -> ?:
        str_arg = str(int_arg)
        return cls(str_arg)
Is it possible to annotate the return type of the from_int method?
I've tried both cls and TrivialClass, but PyCharm flags them as unresolved references, which sounds reasonable at that point in time.
Starting with Python 3.11 you can use the new typing.Self object. For older Python versions you can get the same object by using the typing-extensions project:
try:
    from typing import Self
except ImportError:
    from typing_extensions import Self

class TrivialClass:
    # ...

    @classmethod
    def from_int(cls, int_arg: int) -> Self:
        # ...
        return cls(...)
Note that you don't need to annotate cls in this case.
Warning: mypy support for the Self type has not yet been released; you'll need to wait for the next version after 0.991. Pyright already supports it.
If you can't wait for Mypy support, then you can use a generic type to indicate that you'll be returning an instance of cls:
from typing import Type, TypeVar

T = TypeVar('T', bound='TrivialClass')

class TrivialClass:
    # ...

    @classmethod
    def from_int(cls: Type[T], int_arg: int) -> T:
        # ...
        return cls(...)
Any subclass overriding the class method but then returning an instance of a parent class (TrivialClass or a subclass that is still an ancestor) would be detected as an error, because the factory method is defined as returning an instance of the type of cls.
The bound argument specifies that T has to be a (subclass of) TrivialClass; because the class doesn't yet exist when you define the generic, you need to use a forward reference (a string with the name).
See the Annotating instance and class methods section of PEP 484.
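To make the effect concrete, here is a small sketch of my own (SubClass is a hypothetical name; reveal_type is a mypy-only helper):

class SubClass(TrivialClass):
    pass

reveal_type(SubClass.from_int(3))  # T binds to the calling class, so the revealed type is SubClass, not TrivialClass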
Note: The first revision of this answer advocated using a forward reference
naming the class itself as the return value, but issue 1212 made it possible to use generics instead, a better solution.
As of Python 3.7, you can avoid having to use forward references in annotations when you start your module with from __future__ import annotations, but creating a TypeVar() object at module level is not an annotation. This is still true even in Python 3.10, which defers all type hint resolution in annotations.
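To illustrate that last point with a sketch of my own: the bound= argument is an ordinary runtime expression, so even with the __future__ import you still need the string form when the TypeVar is created before the class exists:

from __future__ import annotations
from typing import TypeVar

# T = TypeVar('T', bound=TrivialClass)   # NameError: TrivialClass is not defined yet
T = TypeVar('T', bound='TrivialClass')   # the forward-reference string is still required

class TrivialClass:
    ...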
From Python 3.7 you can use __future__.annotations:
from __future__ import annotations

class TrivialClass:
    # ...

    @classmethod
    def from_int(cls, int_arg: int) -> TrivialClass:
        # ...
        return cls(...)
Edit: you can't subclass TrivialClass without overriding the classmethod, but if you don't require this then I think it's neater than a forward reference.
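To see that caveat in practice, here is a quick sketch of my own (SubClass is a hypothetical name; reveal_type is a mypy-only helper):

class SubClass(TrivialClass):
    pass

reveal_type(SubClass.from_int(3))  # typed as TrivialClass, not SubClass, unless from_int is overridden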
A simple way to annotate the return type is to use a string as the annotation for the return value of the class method:
# test.py
class TrivialClass:
    def __init__(self, str_arg: str) -> None:
        self.string_attribute = str_arg

    @classmethod
    def from_int(cls, int_arg: int) -> 'TrivialClass':
        str_arg = str(int_arg)
        return cls(str_arg)
This passes mypy 0.560 with no errors from Python:
$ mypy test.py --disallow-untyped-defs --disallow-untyped-calls
$ python test.py
In Python 3.11 there is a nicer way to do this using the new Self type:
from typing import Self

class TrivialClass:
    def __init__(self, str_arg: str):
        self.string_attribute = str_arg

    @classmethod
    def from_int(cls, int_arg: int) -> Self:
        str_arg = str(int_arg)
        return cls(str_arg)
This also works correctly with subclasses.
class TrivialSubClass(TrivialClass):
    ...

TrivialSubClass.from_int(42)
The IDE shows return type TrivialSubClass and not TrivialClass.
This is described in PEP 673.
I have a decorator that creates an abstractmethod from a simple method. It works as I'd expect; however, if I run mypy, it tells me this:
mypy_try.py:20: error: Missing return statement [empty-body]
mypy_try.py:20: note: If the method is meant to be abstract, use @abc.abstractmethod
Found 1 error in 1 file (checked 1 source file)
My code:
import abc
from functools import wraps

import pytest

def make_it_abstract(method_to_decorate):
    @wraps(method_to_decorate)
    def decorated_method(*method_args, **method_kwargs):
        return method_to_decorate(*method_args, **method_kwargs)
    return abc.abstractmethod(decorated_method)

class MyInterfaceClass(abc.ABC):
    @make_it_abstract
    # @abc.abstractmethod
    def my_method(self, value: int) -> int:
        ...

def test_abstract_method():
    class MyImplementationClass(MyInterfaceClass):
        pass

    with pytest.raises(
        TypeError,
        match="Can't instantiate abstract class MyImplementationClass with abstract method my_method"
    ):
        MyImplementationClass()

    class MyImplementationClass(MyInterfaceClass):
        def my_method(self, value: int) -> int:
            return value + 1

    assert 43 == MyImplementationClass().my_method(42)
If I use the abc.abstractmethod decorator, it works fine.
What am I doing wrong?
You're doing everything fine, but mypy is not smart enough to figure out that your decorator calls abc.abstractmethod (and this is in fact almost impossible, even if you had typed the decorator).
According to the code in typeshed, abstractmethod is a no-op for type checkers. So mypy just detects the usage of abc.abstractmethod as a decorator directly, as can be seen here. The refers_to_fullname method expands aliases and basically checks whether the node name is equal to one of the requested names.
So even the following raises the same error:
ab = abc.abstractmethod

class MyInterfaceClass(abc.ABC):
    @ab
    def my_method(self, value: int) -> int:  # E: Missing return statement [empty-body]
        ...
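One way to silence the error while keeping the decorator (my own suggestion, not part of the original answer) is to give the interface method a real body, since a method that raises is no longer considered empty:

class MyInterfaceClass(abc.ABC):
    @make_it_abstract
    def my_method(self, value: int) -> int:
        raise NotImplementedError  # non-empty body, so the [empty-body] check no longer fires

Alternatively, a # type: ignore[empty-body] comment on the method definition should suppress just this check.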
from typing import Protocol

class MyObj:
    def my_method(self, name: str):
        pass

class Proto(Protocol):
    def __call__(self, obj: MyObj, name: str):
        pass

def my_fn(obj: MyObj, name: str):
    pass

def caller(fn: Proto):
    fn(MyObj(), 'some name')

caller(my_fn)  # passes type check
caller(MyObj.my_method)  # fails type check
I'm using mypy 0.971 for type checking. I have trouble understanding why the second call is illegal according to mypy. Is it in fact incorrect according to Python static typing rules?
Interestingly, if I remove the "name" parameter from all the signatures, the type check passes:
from typing import Protocol

class MyObj:
    def my_method(self):
        pass

class Proto(Protocol):
    def __call__(self, obj: MyObj):
        pass

def my_fn(obj: MyObj):
    pass

def caller(fn: Proto):
    fn(MyObj())

caller(my_fn)  # passes
caller(MyObj.my_method)  # passes
EDIT:
As per @Wombatz's explanation, if I modify the protocol to be:
class Proto(Protocol):
    def __call__(self, obj: MyObj, /, name: str):
        pass
it works, since the name of the first parameter no longer matters: it must be passed as a positional argument.
The problem is that the Protocol is more restrictive than you think.
class Proto(Protocol):
    def __call__(self, obj: MyObj, name: str) -> None:
        pass

def incompatible(what: MyObj, name: str) -> None:
    pass
The function incompatible is also not compatible with the protocol, because the protocol requires a callable whose first argument is a MyObj named obj and whose second argument is a str named name.
So in theory, the protocol could be used like this:
def caller(p: Proto) -> None:
    p(obj=MyObj(), name="Hello")
This works for my_fn but fails for the method, because the name of the first parameter of the method is self and not obj. So mypy is correct here!
You can define your protocol differently to only require a callable with two positional-only arguments of type MyObj and str:
class Proto(Protocol):
    def __call__(self, obj: MyObj, name: str, /) -> None:
        pass
Now you cannot pass those arguments by keyword, and thus both the method and my incompatible function are compatible with the protocol.
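With that protocol in place, both of the original calls should be accepted (a quick sketch, not from the original answer):

caller(my_fn)            # still passes: named parameters can satisfy positional-only ones
caller(MyObj.my_method)  # now passes too: only positional compatibility is checked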
Interestingly, if I remove the "name" parameter from all the signatures, the type check passes.
I cannot reproduce that. It should fail, and it does.
For full context, I was hoping to make some decorators that did better static analysis for tests. In an ideal world it would work something like this:
class SomeTest(unittest.TestCase):
    @login_decorate
    def test_login(self):
        reveal_type(self.user)  # type: User

    @anonymous_decorate
    def test_anonymous(self):
        reveal_type(self.user)  # type: None
And just to get started, I was trying to create a decorator that looked something like this:
def login_decorate(func: Callable[[unittest.TestCase], None]):
    def decorated_function(self: unittest.TestCase):
        self.user = User()
        return func(self)
    return decorated_function
But then when I ran mypy I got this error:
error: Argument 1 to "login_decorate" has incompatible type "Callable[[SomeTest], None]";
expected "Callable[[TestCase], None]"
After thinking on it a bit, I agree that this is the correct behavior for mypy due to contravariance, but that doesn't help me solve my problem.
Is there any way to get the decorator to work elegantly without explicitly hacking the type with Any?
You are right in that the mypy check fails because Callable is contravariant in its argument types.
It can be fixed by using type variables.
import unittest
from typing import Callable, TypeVar

T = TypeVar('T', bound=unittest.TestCase)

def login_decorate(func: Callable[[T], None]):
    def decorated_function(self: T):
        ...
        return func(self)
    return decorated_function
class SomeTest(unittest.TestCase):
    @login_decorate
    def test_login(self):
        ...
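One further tweak of my own (not part of the original answer): annotating the decorator's return type keeps the wrapped test method typed from the caller's point of view, instead of falling back to an implicit Any:

def login_decorate(func: Callable[[T], None]) -> Callable[[T], None]:
    def decorated_function(self: T) -> None:
        ...  # set up whatever state the decorator needs on self
        func(self)
    return decorated_function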
While trying to update my code to be PEP-484 compliant (I'm using mypy 0.610) I've run into the following report:
$ mypy mymodule --strict-optional --ignore-missing-imports --disallow-untyped-calls --python-version 3.6
myfile.py:154: error: Signature of "deliver" incompatible with supertype "MyClass"
MyClass:
from abc import abstractmethod
from typing import Any

class MyClass(object):
    @abstractmethod
    def deliver(self, *args: Any, **kwargs: Any) -> bool:
        raise NotImplementedError
myfile.py:
class MyImplementation(MyClass):
    [...]

    def deliver(self, source_path: str,
                dest_branches: list,
                commit_msg: str = None,
                exclude_files: list = None) -> bool:
        [...]
        return True
I'm definitely doing something wrong here, but I can't quite understand what :)
Any pointers would be much appreciated.
@abstractmethod
def deliver(self, *args: Any, **kwargs: Any) -> bool:
    raise NotImplementedError
This declaration doesn't mean subclasses can give deliver any signature they want. Subclass deliver methods must be ready to accept any arguments the superclass deliver method would accept, so your subclass deliver has to be ready to accept arbitrary positional or keyword arguments:
# omitting annotations
def deliver(self, *args, **kwargs):
    ...
Your subclass's deliver does not have that signature.
If all subclasses are supposed to have the same deliver signature you've written for MyImplementation, then you should give MyClass.deliver that same signature too. If your subclasses are going to have different deliver signatures, then maybe this method shouldn't really be in the superclass, or maybe you need to rethink your class hierarchy.
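As a sketch of that first option (my own wording, not from the answer, with Optional spelled out explicitly; the implementation's signature would need to match):

from abc import abstractmethod
from typing import Optional

class MyClass(object):
    @abstractmethod
    def deliver(self,
                source_path: str,
                dest_branches: list,
                commit_msg: Optional[str] = None,
                exclude_files: Optional[list] = None) -> bool:
        raise NotImplementedError

With the base class and MyImplementation declaring the same signature, the incompatible-supertype error goes away.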
You can solve this by using Callable[..., bool] and a type: ignore comment, as shown below.
from abc import abstractmethod
from typing import Callable

class MyClass(object):
    deliver: Callable[..., bool]

    @abstractmethod
    def deliver(self, *args, **kwargs):  # type: ignore
        raise NotImplementedError
Maybe you should work around it this way:
Define the abstract method without arguments:
from abc import abstractmethod

class MyClass:
    @abstractmethod
    def deliver(self) -> bool:
        raise NotImplementedError
In the implementations, get all your data from self:
from typing import Optional

class MyImplementation(MyClass):
    def __init__(
        self,
        source_path: str,
        dest_branches: list,
        commit_msg: Optional[str] = None,
        exclude_files: Optional[list] = None
    ) -> None:
        super().__init__()
        self.source_path = source_path
        self.dest_branches = dest_branches
        self.commit_msg = commit_msg
        self.exclude_files = exclude_files

    def deliver(self) -> bool:
        # some logic
        if self.source_path and self.commit_msg:
            return True
        return False
This way you will have completely compatible method declarations and can still implement the methods as you want.