Problem
How can I type only the first positional parameter of a Protocol method and leave the others untyped?
For example, given a protocol named MyProtocol with a method my_method that requires only its first positional parameter to be an int, while leaving the rest untyped, the following class would implement it correctly without error:
class Imp1(MyProtocol):
    def my_method(self, first_param: int, x: float, y: float) -> int:
        return int(first_param - x + y)
However, the following implementation would not implement it correctly, since the first parameter is a float:
class Imp2(MyProtocol):
    def my_method(self, x: float, y: float) -> int:  # Error: the method must have an int as the first parameter after self
        return int(x + y)
I thought I would be able to do that by combining *args and **kwargs with Protocol, like so:
from typing import Protocol, Any

class MyProtocol(Protocol):
    def my_method(self, first_param: int, /, *args: Any, **kwargs: Any) -> int:
        ...
But (in mypy) this makes both Imp1 and Imp2 fail, because it forces the implementing method to actually declare *args and **kwargs, like so:
class Imp3(MyProtocol):
    def my_method(self, first_param: int, /, *args: Any, **kwargs: Any) -> int:
        return first_param
But this does not solve what I am trying to achieve, which is to let the implementing class declare any typed or untyped parameters except for the first one.
Workaround
I managed to circumvent the issue by using an abstract class with a setter set_first_param, like so:
from abc import ABC, abstractmethod
from typing import Any

class MyAbstractClass(ABC):
    _first_param: int

    def set_first_param(self, first_param: int):
        self._first_param = first_param

    @abstractmethod
    def my_method(self, *args: Any, **kwargs: Any) -> int:
        ...
class AbcImp1(MyAbstractClass):
    def my_method(self, x: float, y: float) -> int:
        return int(self._first_param + x - y)  # now I can access the first parameter with self._first_param
But this completely changes the initial API that I am trying to achieve and, in my opinion, makes it less clear to the implementing method that this parameter will be set before my_method is called.
Note
This example was tested using Python 3.9.13 and mypy 0.991.
If your MyProtocol can accept any number of arguments, you cannot have a subtype (or implementation) that accepts a fixed number; this breaks the Liskov substitution principle, as the subtype would accept only a subset of the calls accepted by the supertype.
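For illustration, here is a minimal sketch (with hypothetical names) of that violation as mypy sees it: if the protocol promises to accept arbitrary extra arguments, an implementation that only accepts two floats narrows the contract and is flagged as an incompatible override.

from typing import Any, Protocol

class TakesAnything(Protocol):
    def my_method(self, first_param: int, /, *args: Any, **kwargs: Any) -> int:
        ...

class TakesTwoFloats(TakesAnything):
    # mypy: signature incompatible with the supertype, because callers of
    # TakesAnything may pass argument combinations this override cannot accept
    def my_method(self, first_param: int, x: float, y: float) -> int:
        return int(first_param - x + y)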
[original paragraph]
Then, if you keep on inheriting from Protocol, you keep on making protocols. Protocols are different from ABCs: they use structural subtyping (not nominal subtyping), meaning that as long as an object implements all the methods/properties of a protocol, it is an instance of that protocol (see PEP 544 for more details).
[end original paragraph]
[edit upon further reading]
In my opinion, protocols should only be inherited by other protocols which will be used with structural subtyping. For nominal subtyping (which, for instance, allows default implementations), I would use ABCs.
[edit upon further reading]
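To make the structural/nominal distinction concrete, here is a small, hypothetical sketch: the class below never inherits from the protocol, yet it is accepted wherever the protocol is expected, and with @runtime_checkable even isinstance() agrees.

from typing import Protocol, runtime_checkable

@runtime_checkable
class HasMyMethod(Protocol):
    def my_method(self, first_param: int) -> int:
        ...

class Standalone:
    # no inheritance from HasMyMethod, only a matching structure
    def my_method(self, first_param: int) -> int:
        return first_param

def use(obj: HasMyMethod) -> int:
    return obj.my_method(1)

use(Standalone())                             # accepted by mypy: structural match
print(isinstance(Standalone(), HasMyMethod))  # True, thanks to @runtime_checkable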
Without more detail on the implementations you'd want to use, @blhsing's solution is probably the most flexible, because it does not type the Callable's call signature.
Here is a set of implementations around a generic protocol with contravariant types (bound to float, as it is at the top of the numeric tower), which would allow any numeric type for the x and y arguments.
from typing import Any, Generic, Protocol, TypeVar

T = TypeVar("T", contravariant=True, bound=float)
U = TypeVar("U", contravariant=True, bound=float)

class MyProtocol(Protocol[T, U]):
    def my_method(self, first_param: int, x: T, y: U) -> int:
        ...

class ImplementMyProtocol1(Generic[T, U]):
    """Generic implementation, needs typing"""
    def my_method(self, first_param: int, x: T, y: U) -> int:
        return int(first_param - x + y)

class ImplementMyProtocol2:
    """Float implementation, ignores the first argument"""
    def my_method(self, _: int, x: float, y: float) -> int:
        return int(x + y)

class ImplementMyProtocol3:
    """Another float implementation, with an extension"""
    def my_method(self, first_param: int, x: float, y: float, *args: float) -> int:
        return int(first_param - x + y + sum(args))

def use_MyProtocol(inst: MyProtocol[T, U], n: int, x: T, y: U) -> int:
    return inst.my_method(n, x, y)
use_MyProtocol(ImplementMyProtocol1[float, float](), 1, 2.0, 3.0) # OK MyProtocol[float, float]
use_MyProtocol(ImplementMyProtocol1[int, int](), 1, 2, 3) # OK MyProtocol[int, int]
use_MyProtocol(ImplementMyProtocol2(), 1, 2.0, 3.0) # OK MyProtocol[float, float]
use_MyProtocol(ImplementMyProtocol3(), 1, 2.0, 3.0) # OK MyProtocol[float, float]
One reasonable workaround would be to make the method take just the typed arguments, and leave the untyped arguments to a callable that the method returns. Since you can declare the return type of a callable without specifying the call signature by using an ellipsis, it solves your problem of leaving those additional arguments untyped:
from typing import Protocol, Callable

class MyProtocol(Protocol):
    def my_method(self, first_param: int) -> Callable[..., int]:
        ...

class Imp1(MyProtocol):
    def my_method(self, first_param: int) -> Callable[..., int]:
        def _my_method(x: float, y: float) -> int:
            return int(first_param - x + y)
        return _my_method

print(Imp1().my_method(5)(1.5, 2.5))  # outputs 6
Demo of the code passing mypy:
https://mypy-play.net/?mypy=latest&python=3.12&gist=677569f73f6fc3bc6e44858ef37e9faf
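To show what this buys you, here is a hypothetical second implementation (not part of the linked demo, reusing MyProtocol and Callable from the snippet above): each implementation's inner callable is free to take whatever typed or untyped parameters it likes, because Callable[..., int] only constrains the return type.

class Imp2(MyProtocol):
    def my_method(self, first_param: int) -> Callable[..., int]:
        # completely different inner parameters, still compatible with Callable[..., int]
        def _my_method(label: str, *values: float) -> int:
            return first_param + int(sum(values))
        return _my_method

print(Imp2().my_method(5)("total", 1.5, 2.5))  # outputs 9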
The error is: Signature of method 'Imp1.my_method()' does not match signature of the base method in class 'MyProtocol'. I suppose it must be:
class Imp1(MyProtocol):
    def my_method(self, first_param: int, *args: Any, **kwargs: Any) -> int:
        ...
Your Imp2 has the same problem as Imp1, but it does not even have the first named parameter.
I had a method like this in Python:
def method(a, b, c: int = 0):
    return a + b + c
When I called method(5, 2), it returned 7.
However, when I want to use multiple dispatch:
from multipledispatch import dispatch

@dispatch(int, int, int)
def method(a, b, c=0):
    return a + b + c
Now method(5, 2) understandably gives an error. Is there any way to make one of the values in dispatch not required, like a ref statement in C#?
This will work (you need to specify the names of arguments with default values when using @dispatch):
from multipledispatch import dispatch

@dispatch(int, int, c=int)
def method(a, b, c=0):
    return a + b + c

method(2, 7)
# Out[58]: 9
I am creating a custom container that returns an instance of itself when sliced:
from __future__ import annotations  # allows the forward reference to CustomContainer in the annotation below

from typing import Union, List

class CustomContainer:
    def __init__(self, values: List[int]):
        self.values = values

    def __getitem__(self, item: Union[int, slice]) -> Union[int, CustomContainer]:
        if isinstance(item, slice):
            return CustomContainer(self.values[item])
        return self.values[item]
This works but comes with the following problem:
a = CustomContainer([1, 2])
b = a[0] # is always int, but recognized as both int and CustomContainer
c = a[:] # is always CustomContainer, but recognized as both int and CustomContainer
# Non-scalable solution: Forced type hint
d: int = a[0]
e: CustomContainer = a[:]
If I change the return type of __getitem__ to only int (my original approach), then a[0] correctly shows type int, but a[:] is considered a list instead of a CustomContainer.
As far as I understand, there used to be a special method in Python 2 for defining how slices are handled (__getslice__), but it was removed in Python 3.
Is there a way to give the proper type hint without having to force the type hint every time I use my container?
You want to use typing.overload, which allows you to register multiple different signatures of a function with a type checker. Functions decorated with @overload are ignored at runtime, so you'll typically just fill the body with a literal ellipsis ..., pass, or a docstring. This also means that you have to keep at least one version of the function that isn't decorated with @overload, which will be the actual function used at runtime.
If you take a look at typeshed, the repository of stub files used by most major type-checkers for checking the standard library, you'll see this is the technique they use for annotating __getitem__ methods in custom containers such as collections.UserList. In your case, you'd annotate your method like this:
from __future__ import annotations  # allows the forward reference to CustomContainer in the annotations below

from typing import overload, Union, List

class CustomContainer:
    def __init__(self, values: List[int]):
        self.values = values

    @overload
    def __getitem__(self, item: int) -> int:
        """Signature when the function is passed an int"""

    @overload
    def __getitem__(self, item: slice) -> CustomContainer:
        """Signature when the function is passed a slice"""

    def __getitem__(self, item: Union[int, slice]) -> Union[int, CustomContainer]:
        """Actual runtime implementation"""
        if isinstance(item, slice):
            return CustomContainer(self.values[item])
        return self.values[item]

a = CustomContainer([1, 2])
b = a[0]
c = a[:]

reveal_type(b)
reveal_type(c)
Run it through MyPy, and it tells us:
main.py:24: note: Revealed type is "builtins.int"
main.py:25: note: Revealed type is "__main__.CustomContainer"
Further reading
The mypy docs for @overload can be found here.
from typing import Callable

def f() -> Callable[[  # how to show that there can be any number of ints?
], float]:
    def g(*args):
        assert all(type(x) == int for x in args)
        return 0.1
    return g
I read the typing docs, and Callable (i.e. Callable[..., ReturnType]) is not what I need.
I know about Tuple[int, ...], but Callable[[int, ...], float] gives the error '"..." not allowed in this context' from Pylance.
You can do this by defining a Protocol with a __call__ whose function signature has the desired typing:
from typing import Protocol

class IntCallable(Protocol):
    def __call__(self, *args: int) -> float: ...

def f() -> IntCallable:
    def g(*args: int) -> float:
        assert all(type(x) == int for x in args)
        return 0.1
    return g
Testing it out with mypy:
f()(1, 2, 3) # fine
f()("foo") # error: Argument 1 to "__call__" of "IntCallable" has incompatible type "str"; expected "int"
The other option is to have your function take a single Iterable[int] argument instead of an arbitrary number of int arguments, which lets you use a simple Callable typing instead of having to go the more complex Protocol route.
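A rough sketch of that alternative (assuming a single Iterable[int] parameter is acceptable for your use case):

from typing import Callable, Iterable

def f() -> Callable[[Iterable[int]], float]:
    def g(values: Iterable[int]) -> float:
        assert all(type(x) == int for x in values)
        return 0.1
    return g

f()([1, 2, 3])    # fine
f()(["foo"])      # rejected by mypy: the list elements are str, not int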
I'm trying to create a function that can be used as an annotation (to later inspect the function and do stuff with the expected return values...)
def WillReturn(**kwargs):
    # dark magic
    return kwargs
Question is: how do I annotate WillReturn in such a way that the type hinter will give the following warnings?
def MyFunction(a, b, c) -> WillReturn(int=1, str='yo'):
    return (123,
            1.2,   # error! expects a str at position #1
            None)  # error! expects two values
Idea 1:
This would do the trick but is not quite scalable:
A, B, C, D, E, F = map(typing.TypeVar, 'ABCDEF')

@typing.overload
def WillReturn(a: A) -> A: ...
@typing.overload
def WillReturn(a: A, b: B) -> typing.Tuple[A, B]: ...
@typing.overload
def WillReturn(a: A, b: B, c: C) -> typing.Tuple[A, B, C]: ...
# and so on and so forth
Idea 2:
subclass from typing.Tuple or use _VariadicGenericAlias directly, but I'm not entirely sure if this is the intended usage of either object.