from typing import Callable

def f() -> Callable[[  # how to show there can be any number of int?
], float]:
    def g(*args):
        assert all(type(x) == int for x in args)
        return 0.1
    return g
I read the typing docs, and Callable (i.e. Callable[..., ReturnType]) is not what I need: the bare ellipsis means "any call signature", not "any number of ints".
I know about Tuple[int, ...], but Callable[[int, ...], float] gives the Pylance error "..." is not allowed in this context.
You can do this by defining a Protocol with a __call__ whose function signature has the desired typing:
from typing import Protocol

class IntCallable(Protocol):
    def __call__(self, *args: int) -> float: ...

def f() -> IntCallable:
    def g(*args: int) -> float:
        assert all(type(x) == int for x in args)
        return 0.1
    return g
Testing it out with mypy:
f()(1, 2, 3) # fine
f()("foo") # error: Argument 1 to "__call__" of "IntCallable" has incompatible type "str"; expected "int"
The other option is to have your function take a single Iterable[int] argument instead of an arbitrary number of int arguments, which lets you use a simple Callable typing instead of having to go the more complex Protocol route.
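For instance, a minimal sketch of that alternative (the function bodies are just for illustration):

from typing import Callable, Iterable

def f() -> Callable[[Iterable[int]], float]:
    def g(numbers: Iterable[int]) -> float:
        assert all(type(x) == int for x in numbers)
        return 0.1
    return g

f()([1, 2, 3])  # fine: a single iterable of ints
f()(1, 2, 3)    # rejected: g takes one iterable, not separate ints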
Problem
How do you type only the first positional parameter of a Protocol method and leave the others untyped?
For example, take a protocol named MyProtocol with a method named my_method that requires only its first positional parameter to be an int, while letting the rest be untyped;
the following class would then implement it correctly, without error:
class Imp1(MyProtocol):
    def my_method(self, first_param: int, x: float, y: float) -> int:
        return int(first_param - x + y)
However, the following implementation wouldn't be correct, since its first parameter is a float:
class Imp2(MyProtocol):
    def my_method(self, x: float, y: float) -> int:  # Error: method must have an int parameter as the first argument after self
        return int(x + y)
I thought I would be able to do that with *args and **kwargs combined with Protocol, like so:
from typing import Protocol, Any

class MyProtocol(Protocol):
    def my_method(self, first_param: int, /, *args: Any, **kwargs: Any) -> int:
        ...
But (in mypy) this makes both Imp1 and Imp2 fail, because it forces the method contract to literally have *args and **kwargs, like so:
class Imp3(MyProtocol):
    def my_method(self, first_param: int, /, *args: Any, **kwargs: Any) -> int:
        return first_param
But this does not solve what I am trying to achieve, which is to let the implementing class have any typed/untyped parameters except for the first one.
Workaround
I managed to circumvent the issue by using an abstract class with a setter set_first_param, like so:
from abc import ABC, abstractmethod
from typing import Any

class MyAbstractClass(ABC):
    _first_param: int

    def set_first_param(self, first_param: int):
        self._first_param = first_param

    @abstractmethod
    def my_method(self, *args: Any, **kwargs: Any) -> int:
        ...

class AbcImp1(MyAbstractClass):
    def my_method(self, x: float, y: float) -> int:
        return int(self._first_param + x - y)  # now I can access the first parameter via self._first_param
But this totally changes the initial API I am trying to achieve, and in my opinion it makes it less clear to the implementing method that this parameter will be set before my_method is called.
Note
This example was tested using python version 3.9.13 and mypy version 0.991.
If your MyProtocol can accept any number of arguments, you cannot have a subtype (or implementation) that accepts a set number; this breaks the Liskov substitution principle, as the subtype would accept only a subset of the calls accepted by the supertype.
[original paragraph]
Then, if you keep inheriting from Protocol, you keep making protocols. Protocols are different from ABCs: they use structural subtyping (not nominal subtyping), meaning that as long as an object implements all the methods/properties of a protocol, it is an instance of that protocol (see PEP 544 for more details).
[end original paragraph]
[edit upon further reading]
In my opinion, protocols should only be inherited by other protocols which will be used with structural subtyping. For nominal subtyping (which for instance allows default implementation) I would use ABCs.
[edit upon further reading]
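To illustrate structural subtyping, here is a minimal sketch (the names are mine, not from the question):

from typing import Protocol

class HasClose(Protocol):
    def close(self) -> None: ...

class Resource:  # note: no inheritance from HasClose
    def close(self) -> None:
        print("closed")

def shutdown(obj: HasClose) -> None:
    obj.close()

shutdown(Resource())  # accepted: Resource matches HasClose structurally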
Without more detail on the implementations you'd want to use, @blhsing's solution is probably the most open, because it does not type the Callable's call signature.
Here is a set of implementations around a generic protocol with contravariant types (bound to float as it is the top of the numeric tower), which would allow any numeric type for the two x and y arguments.
from typing import Generic, Protocol, TypeVar

T = TypeVar("T", contravariant=True, bound=float)
U = TypeVar("U", contravariant=True, bound=float)

class MyProtocol(Protocol[T, U]):
    def my_method(self, first_param: int, x: T, y: U) -> int:
        ...

class ImplementMyProtocol1(Generic[T, U]):
    """Generic implementation, needs typing"""

    def my_method(self, first_param: int, x: T, y: U) -> int:
        return int(first_param - x + y)

class ImplementMyProtocol2:
    """Float implementation, ignores the first argument"""

    def my_method(self, _: int, x: float, y: float) -> int:
        return int(x + y)

class ImplementMyProtocol3:
    """Another float implementation, with an extension"""

    def my_method(self, first_param: int, x: float, y: float, *args: float) -> int:
        return int(first_param - x + y + sum(args))

def use_MyProtocol(inst: MyProtocol[T, U], n: int, x: T, y: U) -> int:
    return inst.my_method(n, x, y)
use_MyProtocol(ImplementMyProtocol1[float, float](), 1, 2.0, 3.0) # OK MyProtocol[float, float]
use_MyProtocol(ImplementMyProtocol1[int, int](), 1, 2, 3) # OK MyProtocol[int, int]
use_MyProtocol(ImplementMyProtocol2(), 1, 2.0, 3.0) # OK MyProtocol[float, float]
use_MyProtocol(ImplementMyProtocol3(), 1, 2.0, 3.0) # OK MyProtocol[float, float]
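For contrast, a sketch (my addition, not from the original answer) of an implementation that the checker should reject, since its first parameter is not compatible with int:

class BadImplementation:
    def my_method(self, first_param: str, x: float, y: float) -> int:
        return int(x + y)

use_MyProtocol(BadImplementation(), 1, 2.0, 3.0)  # error: does not match MyProtocol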
One reasonable workaround would be to make the method take just the typed arguments, and leave the untyped arguments to a callable that the method returns. Since you can declare the return type of a callable without specifying the call signature by using an ellipsis, it solves your problem of leaving those additional arguments untyped:
from typing import Protocol, Callable

class MyProtocol(Protocol):
    def my_method(self, first_param: int) -> Callable[..., int]:
        ...

class Imp1(MyProtocol):
    def my_method(self, first_param: int) -> Callable[..., int]:
        def _my_method(x: float, y: float) -> int:
            return int(first_param - x + y)
        return _my_method

print(Imp1().my_method(5)(1.5, 2.5))  # outputs 6
Demo of the code passing mypy:
https://mypy-play.net/?mypy=latest&python=3.12&gist=677569f73f6fc3bc6e44858ef37e9faf
The error here is: Signature of method 'Imp1.my_method()' does not match signature of the base method in class 'MyProtocol'.
I suppose the implementation must be:
class Imp1(MyProtocol):
    def my_method(self, first_param: int, *args: Any, **kwargs: Any) -> int:
        ...
Your Imp2 is in the same situation as Imp1, but it does not even have the first named parameter.
I have two variables a and b that are either int or str.
I write an assertion that insists a and b are either both ints or strs.
If I type-narrow a to an int, is there a way for mypy to infer that b is also an int?
Here is some sample code.
Mypy version:
mypy 0.980+dev.0f17aff06ac1c05c442ba989e23655a2c6adbfbf (compiled: no)
Thanks for your help.
def my_func(a: int | str, b: int | str):
    # We assert one of the statements is true: 1) a and b are ints, or 2) a and b are strings.
    # In the case of an int a and a string b, or vice versa, this assertion will fail.
    assert isinstance(a, int) == isinstance(b, int)
    # Another alternative assertion
    assert type(a) == type(b)
    if isinstance(a, int):
        reveal_type(b)  # mypy still thinks b is int or str
Using typing.TypeVar:
from typing import TypeVar

T = TypeVar('T', int, str)

def reveal_type(a: int):
    pass

def foo(a: str):
    pass

def my_func(a: T, b: T):
    if isinstance(a, int):
        reveal_type(b)  # pass
    else:
        foo(b)  # pass
If we simply swap the positions of the two calls, mypy flags both as wrong and gives two errors:
def my_func(a: T, b: T):
    if isinstance(a, int):
        foo(b)  # Argument 1 to "foo" has incompatible type "int"; expected "str" (16:12)
    else:
        reveal_type(b)  # Argument 1 to "reveal_type" has incompatible type "str"; expected "int" (18:20)
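As a side note (my addition): with a value-constrained TypeVar, mypy also rejects mixed calls at the call site, which is what makes the narrowing inside the body sound:

my_func(1, 2)      # OK, T is solved as int
my_func("a", "b")  # OK, T is solved as str
my_func(1, "b")    # error: no single constraint of T fits both arguments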
I would like to change the input arguments of a function. This will also lead to changes within the function body.
What's a pythonic way to mark an input argument "deprecated" and maintain backward compatibility at the same time? Here's a toy example:
from typing import List

# original function
def sum_numbers(numbers: List[int]):
    return sum(numbers)

# function with changed input arguments and function body
def sum_numbers(a: int, b: int) -> int:
    return a + b
The user should be able to call sum_numbers either with a numbers: List[int] argument or with a: int, b: int. However, I want to emit a DeprecationWarning when the user uses the original call style.
One option is to overload the function using the multipledispatch module:
from multipledispatch import dispatch

@dispatch(int, int)
def sum_numbers(a, b):
    print("Deprecated")
    return a + b

@dispatch(list)
def sum_numbers(numbers):
    return sum(numbers)
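Usage would then look like this (a sketch; multipledispatch selects the implementation from the runtime argument types):

sum_numbers(1, 2)       # prints "Deprecated", returns 3
sum_numbers([1, 2, 3])  # returns 6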
An alternative to multipledispatch is to take in *args or have an optional arg and dispatch internally:
import warnings

def sum_numbers(a, b=None):
    # dispatch internally on the runtime type of the first argument
    if isinstance(a, list):
        warnings.warn("...", DeprecationWarning, stacklevel=2)
        return sum(a)
    return a + b
then for typing purposes you can use typing.overload:
import typing

@typing.overload
def sum_numbers(numbers: list[int]) -> int:
    """deprecated"""

@typing.overload
def sum_numbers(a: int, b: int) -> int:
    ...
(note that, as documented, the overloads must come first and the actual implementation last)
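Putting the two snippets together, a sketch of the full module (my assembly of the pieces above; the warning text is illustrative):

import typing
import warnings

@typing.overload
def sum_numbers(numbers: list[int]) -> int:
    """deprecated"""

@typing.overload
def sum_numbers(a: int, b: int) -> int:
    ...

def sum_numbers(a, b=None):
    if isinstance(a, list):
        warnings.warn("passing a list is deprecated", DeprecationWarning, stacklevel=2)
        return sum(a)
    return a + b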
I have a function
from typing import Tuple

def function(input: str) -> Tuple[str, str]:
    return tuple(x for x in input.split(","))

The input is always 'value1, value2'; however, I get the error message: Incompatible return value type (got "Tuple[str, ...]", expected "Tuple[str, str]").
Is there any way to force the return type to the expected one using just one line of code?
The type checker has no idea how many elements x for x in input.split(",") will produce; from its perspective, the split can yield any number of strings. Accordingly, it classifies the returned type as Tuple[str, ...].
To get it to use a return type of Tuple[str, str], which is what you annotated the function with, you need to explicitly return exactly two elements:
def function(input: str) -> Tuple[str, str]:
    tup = input.split(',')
    return (tup[0], tup[1])
or otherwise, annotate the method differently.
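For instance (my sketch), you could loosen the annotation to match what the checker infers, or keep the two-element annotation and assert it with typing.cast if you are certain of the input format:

from typing import Tuple, cast

def function_loose(input: str) -> Tuple[str, ...]:
    return tuple(input.split(','))

def function_cast(input: str) -> Tuple[str, str]:
    # cast performs no runtime check; it only silences the checker
    return cast(Tuple[str, str], tuple(input.split(',')))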
as a one-liner you could use a slice:

def function(input: str) -> Tuple[str, str]:
    return tuple(input.split(','))[:2]
I have a function:
def dummy_func(a, b: float, c: int = 1) -> float:
    ...  # some code
I want to write a generic functor (dummy_prepender) that takes the function and constructs a new one whose only difference from the original is that an extra unused parameter is prepended to the signature:
dummy_method1 = dummy_prepender(dummy_func)

# dummy_method1 should behave exactly like the following code:
def dummy_method(dummy, a, b: float, c: int = 1) -> float:
    ...  # some code
so that for any set of arguments and any dummy_arg, dummy_method(dummy_arg, arguments...) behaves exactly the same as dummy_func(arguments...).
I thought that @staticmethod would do the trick, but there are places where the dispatch happens differently and applying staticmethod has no effect.
Some behavior tests:
help(dummy_method) should show the correct signature
dummy_method(dummy='', a=1) should fail
dummy_method(dummy='', a=1, b=2, c=3, z=4) should fail
Update:
Here are two examples of staticmethod not working as expected:
Example 1:
import inspect

factory_type = type('Typ1', (), {
    '__init__': staticmethod(dummy_func),
})

inspect.signature(factory_type)
# <Signature (b: float, c: int = 1) -> float>
# BTW, in this case it's just inspect.signature that is broken; the call behavior seems to be correct.
Example 2:
factory_type2 = type('Typ1', (), {
    '__new__': staticmethod(dummy_func),
})

# Call behavior is broken:
factory_type2(a=1, b=2)
# TypeError: dummy_func() got multiple values for argument 'a'

factory_type2(7, 13)
# locals()={'c': 13, 'b': 7, 'a': <class '__main__.Typ1'>}
# See how the class got passed as `a` instead of `dummy`
The result of staticmethod(func) is not a function but a descriptor object, and sometimes its behavior is not the same as the wrapped function's.
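A quick sketch (my addition) showing the difference:

def dummy_func(a, b: float, c: int = 1) -> float:
    return float(a + b + c)

sm = staticmethod(dummy_func)
print(type(sm))                   # <class 'staticmethod'>
print(sm.__func__ is dummy_func)  # True: the descriptor wraps the function
# sm(1, 2.0)  # TypeError before Python 3.10: 'staticmethod' object is not callable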
My first try would be something like this:
def dummy_method(*args, **kwargs) -> float:
    return dummy_func(*args[1:], **kwargs)
The upcoming Python 3.8 implements PEP 570, which allows the following syntax:
# 'dummy' is a positional-only argument, disallowing 'dummy=...'
def dummy_method(dummy, /, *args, **kwargs) -> float:
    return dummy_func(*args, **kwargs)
Here is another version for Python 3.8, which keeps the signature:

def dummy_method(dummy, /, a, b: float, c: int = 1) -> float:
    return dummy_func(a, b, c)
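For a generic dummy_prepender, here is one possible sketch (my own, built on functools and inspect, not from the answers above) that forwards calls and also makes help() report the prepended parameter:

import functools
import inspect

def dummy_prepender(func):
    @functools.wraps(func)
    def wrapper(dummy, /, *args, **kwargs):
        # Drop the first positional argument and forward the rest.
        return func(*args, **kwargs)
    # Rebuild the reported signature so help() shows the extra parameter;
    # __signature__ takes precedence over the __wrapped__ chain.
    sig = inspect.signature(func)
    dummy_param = inspect.Parameter("dummy", inspect.Parameter.POSITIONAL_ONLY)
    wrapper.__signature__ = sig.replace(parameters=[dummy_param, *sig.parameters.values()])
    return wrapper

dummy_method1 = dummy_prepender(dummy_func)
dummy_method1("ignored", 1, 2.0)  # forwards to dummy_func(1, 2.0)
help(dummy_method1)               # shows (dummy, /, a, b: float, c: int = 1) -> float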