Callback protocol with unbound method - python

from typing import Protocol

class MyObj:
    def my_method(self, name: str):
        pass

class Proto(Protocol):
    def __call__(self, obj: MyObj, name: str):
        pass

def my_fn(obj: MyObj, name: str):
    pass

def caller(fn: Proto):
    fn(MyObj(), 'some name')

caller(my_fn)            # passes type check
caller(MyObj.my_method)  # fails type check
I'm using mypy 0.971 for type checking. I have trouble understanding why the second call is illegal according to mypy. Is it in fact incorrect according to Python static typing rules?
Interestingly, if I remove the "name" parameter from all the signatures, the type check passes:
from typing import Protocol

class MyObj:
    def my_method(self):
        pass

class Proto(Protocol):
    def __call__(self, obj: MyObj):
        pass

def my_fn(obj: MyObj):
    pass

def caller(fn: Proto):
    fn(MyObj())

caller(my_fn)            # passes
caller(MyObj.my_method)  # passes
EDIT:
As per @Wombatz's explanation, if I modify the protocol to be:
class Proto(Protocol):
    def __call__(self, obj: MyObj, /, name: str):
        pass
it works, since the name of the first parameter no longer matters: it must be passed positionally.

The problem is that the Protocol is more restrictive than you think.
class Proto(Protocol):
    def __call__(self, obj: MyObj, name: str) -> None:
        pass

def incompatible(what: MyObj, name: str) -> None:
    pass
The function incompatible is also incompatible with the protocol, because the protocol requires a callable whose first parameter is a MyObj named obj and whose second parameter is a str named name.
So in theory, the protocol could be used like this:
def caller(p: Proto) -> None:
    p(obj=MyObj(), name="Hello")
This works for my_fn but fails for the method, because the name of the first parameter of the method is self, not obj. So mypy is correct here!
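To make the mismatch concrete, here is a minimal illustration (the call sites are mine, not from the original question) of the keyword call the protocol permits:

my_fn(obj=MyObj(), name="Hello")            # fine: parameter names match the protocol
MyObj.my_method(obj=MyObj(), name="Hello")  # TypeError at runtime: my_method has no parameter named 'obj'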
You can define your protocol differently so that it only requires a callable with two positional parameters of type MyObj and str:
class Proto(Protocol):
    def __call__(self, obj: MyObj, name: str, /) -> None:
        pass
Now you cannot use named parameters and thus the method and my incompatible function are compatible with the protocol.
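Putting the pieces together, a minimal sketch of the positional-only variant (the call sites below are mine, not from the original question); mypy should accept all three callables:

from typing import Protocol

class MyObj:
    def my_method(self, name: str) -> None: ...

class Proto(Protocol):
    def __call__(self, obj: MyObj, name: str, /) -> None: ...

def my_fn(obj: MyObj, name: str) -> None: ...

def incompatible(what: MyObj, name: str) -> None: ...

def caller(fn: Proto) -> None:
    fn(MyObj(), "some name")  # only positional calls are possible now

caller(my_fn)            # OK
caller(MyObj.my_method)  # OK: parameter names no longer matter
caller(incompatible)     # OK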
Interestingly, if I remove the "name" parameter from all the signatures, the type check passes.
I cannot reproduce that; it should fail, and it does.


Given a class with a helper method for initialization:
class TrivialClass:
    def __init__(self, str_arg: str):
        self.string_attribute = str_arg

    @classmethod
    def from_int(cls, int_arg: int) -> ?:
        str_arg = str(int_arg)
        return cls(str_arg)
Is it possible to annotate the return type of the from_int method?
I've tried both cls and TrivialClass, but PyCharm flags them as unresolved references, which sounds reasonable at that point in time.
Starting with Python 3.11 you can use the new typing.Self object. For older Python versions you can get the same object by using the typing-extensions project:
try:
    from typing import Self
except ImportError:
    from typing_extensions import Self

class TrivialClass:
    # ...

    @classmethod
    def from_int(cls, int_arg: int) -> Self:
        # ...
        return cls(...)
Note that you don't need to annotate cls in this case.
Warning: mypy support for the Self type has not yet been released; you'll need to wait for the next version after 0.991. Pyright already supports it.
If you can't wait for Mypy support, then you can use a generic type to indicate that you'll be returning an instance of cls:
from typing import Type, TypeVar

T = TypeVar('T', bound='TrivialClass')

class TrivialClass:
    # ...

    @classmethod
    def from_int(cls: Type[T], int_arg: int) -> T:
        # ...
        return cls(...)
Any subclass that overrides the class method but returns an instance of a parent class (TrivialClass, or a subclass that is still an ancestor) is detected as an error, because the factory method is declared to return an instance of the type of cls.
The bound argument specifies that T has to be a (subclass of) TrivialClass; because the class doesn't yet exist when you define the generic, you need to use a forward reference (a string with the name).
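As a quick check (SubClass is a hypothetical name, not from the answer), the inferred return type follows whatever class the method is called on:

class SubClass(TrivialClass):
    pass

reveal_type(TrivialClass.from_int(3))  # mypy: TrivialClass
reveal_type(SubClass.from_int(3))      # mypy: SubClass, not TrivialClass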
See the Annotating instance and class methods section of PEP 484.
Note: the first revision of this answer advocated using a forward reference naming the class itself as the return value, but issue 1212 made it possible to use generics instead, which is a better solution.
As of Python 3.7 you can avoid forward references inside annotations by starting your module with from __future__ import annotations, but creating a TypeVar() object at module level is a runtime expression, not an annotation, so its bound still needs the string forward reference; deferred evaluation of annotations (PEP 563) doesn't change that.
From Python 3.7 you can use from __future__ import annotations:
from __future__ import annotations

class TrivialClass:
    # ...

    @classmethod
    def from_int(cls, int_arg: int) -> TrivialClass:
        # ...
        return cls(...)
Edit: you can't subclass TrivialClass without also overriding the classmethod (the inferred return type otherwise stays TrivialClass, as sketched below), but if you don't need subclassing I think this is neater than a forward reference.
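For illustration (SubClass is a hypothetical name), this is the limitation: with the plain class name as the return type, the inferred type does not follow the subclass:

class SubClass(TrivialClass):
    pass

reveal_type(SubClass.from_int(3))  # mypy: TrivialClass, not SubClass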
A simple way to annotate the return type is to use a string as the annotation for the return value of the class method:
# test.py
class TrivialClass:
    def __init__(self, str_arg: str) -> None:
        self.string_attribute = str_arg

    @classmethod
    def from_int(cls, int_arg: int) -> 'TrivialClass':
        str_arg = str(int_arg)
        return cls(str_arg)
This passes mypy 0.560 with no errors from Python:
$ mypy test.py --disallow-untyped-defs --disallow-untyped-calls
$ python test.py
In Python 3.11 there is a nicer way to do this using the new Self type:
from typing import Self

class TrivialClass:
    def __init__(self, str_arg: str):
        self.string_attribute = str_arg

    @classmethod
    def from_int(cls, int_arg: int) -> Self:
        str_arg = str(int_arg)
        return cls(str_arg)
This also works correctly with subclasses:

class TrivialSubClass(TrivialClass):
    ...

TrivialSubClass.from_int(42)
The IDE shows return type TrivialSubClass and not TrivialClass.
This is described in PEP 673.

How to mock inner method's default parameter in Pytest?

I am having problems trying to mock/patch a default parameter of a method that is called inside another method being unit tested with pytest. In general the code looks like this:
class Repository:
    DEFAULT_VERSION = "0.1.10"
    ...

    @classmethod
    def _get_metadata(cls, id: str, version: str = DEFAULT_VERSION) -> Dict[str, str]:
        return ...

    def write(self, df: DataFrame, id: str) -> None:
        ...
        metadata = self._get_metadata(id)


class TestRepository:
    def test_write(self, ...):
        assert df.write(df=test_df, id="1").count() > 1


TEST_DEFAULT_VERSION = "0.2.20"
Now, I would like to mock the value of the DEFAULT_VERSION parameter to be the value of TEST_DEFAULT_VERSION. How can I do that in pytest?
You can do this by either modifying the class method to allow monkeypatching of the constant, or monkeypatching the class method directly. Have a look at this solution (basically the same question): pytest - monkeypatch keyword argument default
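The core issue is that the default is evaluated once, when _get_metadata is defined, so patching Repository.DEFAULT_VERSION afterwards does not change the already-bound default. A rough sketch of the first option (names taken from the question; the dict body is made up), where the default is resolved at call time so the monkeypatch takes effect:

from typing import Dict, Optional

class Repository:
    DEFAULT_VERSION = "0.1.10"

    @classmethod
    def _get_metadata(cls, id: str, version: Optional[str] = None) -> Dict[str, str]:
        if version is None:
            version = cls.DEFAULT_VERSION  # looked up at call time, so it can be patched
        return {"id": id, "version": version}


TEST_DEFAULT_VERSION = "0.2.20"

def test_get_metadata_uses_patched_default(monkeypatch):
    # pytest's monkeypatch fixture restores the attribute after the test
    monkeypatch.setattr(Repository, "DEFAULT_VERSION", TEST_DEFAULT_VERSION)
    assert Repository._get_metadata("1")["version"] == TEST_DEFAULT_VERSION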

Add typing for a staticmethod in a python class

In a class Foo I store a static method in a class variable so that I can subclass it later and simply exchange that function for another. The class contains some methods which call this exchangeable function. The code below does not produce any mypy issues.
from typing import ClassVar

def some_function(text: str) -> None:
    print(text)

class Foo:
    _some_func: ClassVar[staticmethod] = staticmethod(some_function)

    def some_method(self, text: str) -> None:
        self._some_func(text)

if __name__ == "__main__":
    Foo().some_method("Hello World!")
Now, I am trying to improve my typing, so I want to use a callback protocol to actually add typing for Foo._some_func. I have created the following protocol class:
class SomeFuncProtocol(Protocol):
    def __call__(self, __text: str) -> None:
        ...
It does work as long as I use _some_func: ClassVar[SomeFuncProtocol] = some_function, but I can't find a way to use staticmethod and the protocol class for typing. I wish for something like the following, but mypy tells me that staticmethod does not expect a type argument.
class Foo:
    _some_func: ClassVar[staticmethod[SomeFuncProtocol]] = staticmethod(some_function)
    ...
Does anybody know how to do it?
I was stuck with a similar thing for a while; here's what worked for me:
from typing import ClassVar, Protocol

def some_function(text: str) -> None:
    print(text)

class SomeFuncProtocol(Protocol):
    def __call__(self, __text: str) -> None:
        return

class Foo:
    _some_func: ClassVar[SomeFuncProtocol] = staticmethod(some_function)

Foo._some_func('a')
Foo()._some_func('a')
Foo._some_func = some_function
Foo()._some_func = some_function  # E: Cannot assign to class variable "_some_func" via instance
The code above type checks (except for the last line, which is intentionally incorrect).
You don't need staticmethod in the type annotation: it is, simplified, a function that takes a callable as its argument and returns another callable with the same signature, but explicitly marked as not receiving self. So the return type of staticmethod is the same callable, which we can express like this:
from typing import Any, Callable, TypeVar

_C = TypeVar('_C', bound=Callable[..., Any])

def staticmethod(func: _C) -> _C: ...
You can try it in the mypy playground.

mypy: Untyped decorator makes function "my_method" untyped

When I try using a decorator that I defined in another package, mypy fails with the error message Untyped decorator makes function "my_method" untyped. How should I define my decorator to make sure this passes?
from mypackage import mydecorator

@mydecorator
def my_method(date: int) -> str:
    ...
The mypy documentation contains a section describing how to declare decorators for functions with an arbitrary signature.
An example from there:
from typing import Any, Callable, TypeVar, Tuple, cast

F = TypeVar('F', bound=Callable[..., Any])

# A decorator that preserves the signature.
def my_decorator(func: F) -> F:
    def wrapper(*args, **kwds):
        print("Calling", func)
        return func(*args, **kwds)
    return cast(F, wrapper)

# A decorated function.
@my_decorator
def foo(a: int) -> str:
    return str(a)

a = foo(12)
reveal_type(a)  # str
foo('x')        # Type check error: incompatible type "str"; expected "int"
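On Python 3.10+ (or with typing_extensions on older versions) you can get the same effect without the cast by using ParamSpec; a minimal sketch, assuming the same my_decorator name as above:

from typing import Callable, TypeVar
from typing_extensions import ParamSpec  # `from typing import ParamSpec` on Python 3.10+

P = ParamSpec("P")
R = TypeVar("R")

def my_decorator(func: Callable[P, R]) -> Callable[P, R]:
    # The wrapper keeps the decorated function's parameters and return type.
    def wrapper(*args: P.args, **kwargs: P.kwargs) -> R:
        print("Calling", func)
        return func(*args, **kwargs)
    return wrapper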

Python 3.6: Signature of {method} incompatible with super type {Class}

While trying to update my code to be PEP 484 compliant (I'm using mypy 0.610), I've run into the following report:
$ mypy mymodule --strict-optional --ignore-missing-imports --disallow-untyped-calls --python-version 3.6
myfile.py:154: error: Signature of "deliver" incompatible with supertype "MyClass"
MyClass:
from abc import abstractmethod
from typing import Any

class MyClass(object):
    @abstractmethod
    def deliver(self, *args: Any, **kwargs: Any) -> bool:
        raise NotImplementedError
myfile.py:
class MyImplementation(MyClass):
    [...]

    def deliver(self, source_path: str,
                dest_branches: list,
                commit_msg: str = None,
                exclude_files: list = None) -> bool:
        [...]
        return True
I'm definitely doing something wrong here, but I can't quite understand what :)
Any pointers would be much appreciated.
@abstractmethod
def deliver(self, *args: Any, **kwargs: Any) -> bool:
    raise NotImplementedError
This declaration doesn't mean subclasses can give deliver any signature they want. Subclass deliver methods must be ready to accept any arguments the superclass deliver method would accept, so your subclass deliver has to be ready to accept arbitrary positional or keyword arguments:
# omitting annotations
def deliver(self, *args, **kwargs):
    ...
Your subclass's deliver does not have that signature.
If all subclasses are supposed to have the same deliver signature you've written for MyImplementation, then you should give MyClass.deliver that signature too (a sketch follows). If your subclasses are going to have different deliver signatures, maybe this method shouldn't be in the superclass at all, or maybe you need to rethink your class hierarchy.
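A sketch of that first option (the parameter list is copied from MyImplementation; Optional is added because the defaults are None):

from abc import abstractmethod
from typing import Optional

class MyClass:
    @abstractmethod
    def deliver(self,
                source_path: str,
                dest_branches: list,
                commit_msg: Optional[str] = None,
                exclude_files: Optional[list] = None) -> bool:
        raise NotImplementedError

class MyImplementation(MyClass):
    # Same signature as the base class, so the override is compatible.
    def deliver(self,
                source_path: str,
                dest_branches: list,
                commit_msg: Optional[str] = None,
                exclude_files: Optional[list] = None) -> bool:
        return True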
You can also work around this by declaring deliver as Callable[..., bool] and adding a type: ignore comment, like below.
from abc import abstractmethod
from typing import Callable

class MyClass(object):
    deliver: Callable[..., bool]

    @abstractmethod
    def deliver(self, *args, **kwargs):  # type: ignore
        raise NotImplementedError
Maybe you should work around it this way:
Define the abstract method without arguments:
class MyClass:
    @abstractmethod
    def deliver(self) -> bool:
        raise NotImplementedError
In implementations, get all your data from self:
class MyImplementation(MyClass):
    def __init__(
        self,
        source_path: str,
        dest_branches: list,
        commit_msg: str = None,
        exclude_files: list = None
    ) -> None:
        super().__init__()
        self.source_path = source_path
        self.dest_branches = dest_branches
        self.commit_msg = commit_msg
        self.exclude_files = exclude_files

    def deliver(self) -> bool:
        # some logic
        if self.source_path and self.commit_msg:
            return True
        return False
This way you will have completely compatible method declarations and can still implement the methods as you want.
