I have a function that takes a callback function as a parameter:
def function(arg1: int, callback):
    ...
I am trying to add a type hint for the callback function. Now, the callback functions that are passed to this function have the same positional args but the kwargs can be completely different (both in type and name).
def function1(arg1: int, arg2: str, **kwargs1):
    ...

def function2(arg1: int, arg2: str, **kwargs2):
    ...
I am trying to write a Callback Protocol that fits these 2 functions, but so far I don't see a way to make this work when the kwargs are different. Is there a way to create a unified Callback protocol in this case?
You can create this Callback protocol:
from typing_extensions import Protocol

class Callback(Protocol):
    def __call__(self, arg1: int, arg2: str, **kwargs): ...
and type hint callback as Callback:
def function(arg1: int, callback: Callback): ...
Now, given these three functions and the corresponding calls:
def function1(arg1: int, arg2: str, **kwargs1): ...
def function2(arg1: int, arg2: str, **kwargs2): ...
def function3(arg1: int): ...
function(0, function1) # Success.
function(0, function2) # Success.
function(0, function3) # Error.
mypy will report success for the first two, but report an error for the third one because of the missing arg2.
The key thing to realize here is that the name of the keyword-arguments parameter does not actually matter. If I take function2 and rename arg1 to arg_1, mypy would complain about that. This is because the public interface of function2 has changed in a backwards-incompatible way. For example, I would have to modify arg1=0 at any call site to arg_1=0.
But if I take the same function and rename kwargs2 to kwargs_2, the public interface does not actually change! This is because it was impossible to ever explicitly refer to the kwargs / kwargs1 / kwargs2 parameter at a call site in the first place. Take this example:
function2(arg1=0, arg2="", kwargs2={})
If I were to dump kwargs2 from inside the function, it would actually have the key "kwargs2" mapped to {}, instead of being an empty dictionary itself. This shows that it is impossible to rely on the name of the keyword-arguments parameter. mypy can therefore allow different names when checking if the public interface of function2 matches that of a Callback.
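A quick way to see this in practice (a minimal runnable sketch of the call above):

def function2(arg1: int, arg2: str, **kwargs2):
    print(kwargs2)

function2(arg1=0, arg2="", kwargs2={})
# prints {'kwargs2': {}}: the name "kwargs2" arrives as an ordinary keyword
# argument, not as the dict bound to the **kwargs2 parameter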
As for the type: kwargs, kwargs1 and kwargs2 all have the type Dict[str, Any] when left unannotated. Since you're consistent across all three, mypy reports success here, while also letting you take advantage of Any for keyword arguments specific to function1 or function2.
Related
Suppose I have a fully type-hinted method with only keyword arguments:
class A:
    def func(self, a: int, b: str, c: SomeIntricateTypeHint) -> OutputClass:
        ...
Now suppose I have a function that takes variable keyword arguments and passes them on entirely to that method:
def outer_func(n_repeat: int, **kwargs: ???) -> OtherOutputClass:
    a = A()
    for _ in range(n_repeat):
        a.func(**kwargs)
In doing so, I have lost the benefits of the type hints for func. How do I type hint kwargs in outer_func such that I recover those benefits?
For extra detail, in my case, I don't personally define func. It's actually a method from a boto3 client object. As a result I'm looking for a solution that dynamically creates the type hint, rather than having to manually create the TypedDict.
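For reference, the manual TypedDict approach mentioned above would look roughly like the sketch below, using Unpack from PEP 692 (via typing_extensions). The names FuncKwargs, a and b are illustrative, not the real boto3 signature, and this is exactly the hand-written work the question hopes to avoid:

from typing_extensions import TypedDict, Unpack

class FuncKwargs(TypedDict):
    a: int
    b: str

def func(a: int, b: str) -> str:
    return f"{a}-{b}"

def outer_func(n_repeat: int, **kwargs: Unpack[FuncKwargs]) -> None:
    for _ in range(n_repeat):
        func(**kwargs)

outer_func(2, a=1, b="x")  # mypy flags missing keys or wrong value types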
I have the following code:
from typing import Callable

def func1(f: Callable):
    def decorator(*args, **kwargs):
        # do something
        return f(*args, **kwargs)
    return decorator

@func1
def func2(parameter: str):
    # do something else
    ...
I want to specify that func1 takes a Callable that has either one parameter of a certain type (in this case, a str) or no parameters at all, so that I can use it not just with func2 but also with a function that takes no parameters, like the following:
@func1
def func3():
    # this function doesn't take any parameters
    ...
Obviously, even if there is a solution, it wouldn't actually be enforced at runtime because type hints are ignored anyway. However, I would like to add actual validation using Pydantic, which is why I want to specify that the function must have a parameter of a certain type or no parameters at all.
I want to specify that func1 takes a Callable that has either 1 parameter of a certain type (in this case, a str), or no parameters at all
Use the union of the two signatures:
func: Callable[[str], Any] | Callable[[], Any]
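A minimal sketch of how that union could be applied. The alias name StrOrNoArg is illustrative; the | syntax assumes Python 3.10+ (use typing.Union on older versions), and the decorator body is reduced to returning f unchanged to keep the focus on the annotation:

from typing import Any, Callable

StrOrNoArg = Callable[[str], Any] | Callable[[], Any]

def func1(f: StrOrNoArg) -> StrOrNoArg:
    # do something, then hand the callable back unchanged
    return f

@func1
def func2(parameter: str) -> None: ...  # accepted: matches Callable[[str], Any]

@func1
def func3() -> None: ...                # accepted: matches Callable[[], Any]

def func4(n: int) -> None: ...
func1(func4)                            # rejected: matches neither union member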
I am testing mypy in one of my projects to see whether I will like it or not. I know only the basics so far. Here is a little problem I am unable to solve.
I need a function with positional-only parameters. In Python 3.8+:
def func(self, arg: int, /, **data):
    ...
(This allows self=something and arg=something to be passed in data, if you're curious why.)
To make it work also in Python 3.7 I had to write:
def func(*args, **data):
    self, arg = args
And it works fine, but it seems to confuse mypy: it complains about parameter types in calls of this func.
How can I annotate that args[1] is an int?
Update:
I had some partial success with typing.overload.
A single @overload generates this error:
"Single overload definition, multiple required"
and when I write it twice:
"Overloaded function signature 2 will never be matched"
but the calls to that method are now checked OK.
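For context, the overload attempt described in the update was probably shaped roughly like this (a guess at the structure, not the asker's actual code; the class name Example is illustrative):

from typing import Any, overload

class Example:
    @overload
    def func(self, arg: int, **data: Any) -> None: ...
    # With only this one @overload, mypy reports
    # "Single overload definition, multiple required".
    # Duplicating the @overload line instead triggers
    # "Overloaded function signature 2 will never be matched".
    def func(*args: Any, **data: Any) -> None:
        self, arg = args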
I wrote a higher-order Python function (let's call it parent) whose function parameter (let's call it child) is a variadic function.
I don't know how to type hint it.
child takes a first argument that is always a str, plus a variable number of parameters that can be anything. It returns Any.
The closest I can get is Callable[..., Any], but then I "lose" the fact that the first argument is a str.
I would like something like Callable[[str, ...], Any], but this is not a valid type hint.
Is there a way to type hint my function?
Using a Protocol does not require you to manually wrap values in a "type-hinting wrapper", but unfortunately it doesn't help you here.
For a function to match a protocol, its signature must exactly match that of the __call__ method in the protocol. However (if I'm not mistaken), you want to match any function with a string as the first argument, which could be any of the following:
def fn1(x: str) -> Any: ...
def fn2(x: str, arg1: int, arg2: float) -> Any: ...
def fn3(x: str, *args: Any, **kwargs: Any) -> Any: ...
These all have different signatures, and thus cannot all be matched by a single protocol (mypy-play):
from typing import Any, Protocol

# This is the protocol you might try, but unfortunately it doesn't work
# for all of the functions above.
class StrCallable(Protocol):
    def __call__(self, x: str, *args, **kwargs) -> Any:
        ...

def my_higher_order_fn(fn: StrCallable) -> StrCallable:
    return fn  # fill this in with your actual implementation
my_higher_order_fn(fn1) # fails
my_higher_order_fn(fn2) # fails
my_higher_order_fn(fn3) # this passes, though
PEP 612 introduced ParamSpec, which is what you'll need here. It's kinda like TypeVar, but for function signatures. You can put a ParamSpec where you'd put the first list argument in Callable:
from typing import Callable, Concatenate, ParamSpec, TypeVar
P = ParamSpec("P")
TRet = TypeVar("TRet")
StrCallable = Callable[Concatenate[str, P], TRet]
where Concatenate prepends types to an existing parameter specification. Concatenate[str, P] is exactly what you need: it matches any function signature whose first argument is a str.
Unfortunately, PEP 612 won't be available until Python 3.10, and mypy does not yet fully support it either. Until then, you might have to just use Callable[..., TRet].
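For completeness, here is a rough sketch of how the ParamSpec version could be wired into the higher-order function once it is supported. parent and child follow the naming in the question, fn2 is adapted from the earlier example with a concrete return type, and this assumes Python 3.10+ (or the typing_extensions backports):

from typing import Callable, Concatenate, ParamSpec, TypeVar

P = ParamSpec("P")
TRet = TypeVar("TRet")

StrCallable = Callable[Concatenate[str, P], TRet]

def parent(child: StrCallable[P, TRet], *args: P.args, **kwargs: P.kwargs) -> TRet:
    # Supply the leading str ourselves and forward the rest untouched.
    return child("some string", *args, **kwargs)

def fn2(x: str, arg1: int, arg2: float) -> float:
    return arg1 + arg2

parent(fn2, 1, 2.0)      # OK: P binds to (arg1: int, arg2: float)
parent(fn2, "one", 2.0)  # flagged: "one" is not an int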
What's the proper type hint for functools.partial? I have a function that returns a partial and I want to type hint it so mypy doesn't throw any error:
def my_func() -> ?:
    return partial(foo, bar="baz")
Ideally something more specific than typing.Callable.
You have a couple of options here, depending on exactly what you're going for.
I'm going to assume, for example, that foo is defined as:
def foo(qux: int, thud: float, bar: str) -> str:
    # Does whatever
    return "Hi"
If we use reveal_type, we find that partial(foo, bar="baz") is identified as functools.partial[builtins.str*]. That roughly translates to a function-ish thing which takes anything and returns a string. So, you could annotate it with exactly that, and you'd at least get the return type in your annotation.
def my_func() -> partial[str]:
    ...

a: str = my_func()(2, 2.5)            # Works fine
b: int = my_func()(2, 2.5)            # Correctly fails; we know we don't get an int
c: str = my_func()("Hello", [12, 13]) # Incorrectly passes; we don't know to reject those inputs
We can be more specific, which takes a bit of care when writing the function, and allows MyPy to better help us later. In general, there are two main options for annotating functions and function-like things. There's Callable and there's Protocol.
Callable is generally more concise and works when you're dealing with positional arguments. Protocol is a bit more verbose, and works with keyword arguments too.
So, you can annotate your function as
def my_func() -> Callable[[int, float], str]:
That is, it returns a function which takes an int (for qux) and a float (for thud) and returns a string. Now, note that MyPy doesn't know what the input type is going to be, so it can't verify that bit. partial[str] would be just as compatible with Callable[[spam, ham, eggs], str]. It does, however, pass without errors, and it will then helpfully warn you if you try to pass in the wrong parameters to your Callable. That is,
my_func()(7, 2.6) # This will pass
my_func()("Hello", [12,13]) # This will now correctly fail.
Now, let's suppose instead that foo were defined as follows.
def foo(qux: int, bar: str, thud: float) -> str:
    # Does whatever
    return "Hi"
Once we've got a partial passing bar as a keyword argument, there is no way to get thud in as a positional argument. That means there's no way to use Callable to annotate this one. Instead, we have to use a Protocol.
The syntax is a bit weird. It works as follows.
class PartFoo(Protocol):
    def __call__(fakeSelf, qux: int, *, thud: float) -> str:
        ...
Teasing apart that __call__ line, we first have the fakeSelf entry. That's just notation: since __call__ is a method, the first parameter gets swallowed.
Next, we have qux, annotated as an int as before. We then have the * marker to indicate that everything following is keyword only, because we can no longer reach thud in the real method positionally. Then we have thud with its annotation, and finally we have the -> str to give the return type.
Now, if you define my_func with -> PartFoo as its return annotation, you get the behaviour we want:
my_func()(7, thud=1.5) # Works fine, qux is passed positionally, thud is a kwarg
my_func()(qux=7, thud=1.5) # Works fine, both qux and thud are kwargs
my_func()(7) # Correctly fails because thud is missing
my_func()(7, 1.5) # Correctly fails because thud can't be a positional arg.
The final situation that you could run into is where your original method has optional parameters. So, let's say
def foo(qux: int, bar: str, thud: float = 0.5) -> str:
    # Does whatever
    return "Hi"
Once again, we can't handle this precisely with Callable but Protocol is just fine. We simply ensure that the PartFoo protocol also specifies a default value for thud. Here I am using an ellipsis literal, as a gentle reminder that the actual value of that default may differ between implementations and the Protocol.
class PartFoo(Protocol):
    def __call__(fakeSelf, qux: int, *, thud: float = ...) -> str:
        ...
Now our behavior is
my_func()(7, thud=1.5) # Works fine, qux is passed positionally, thud is a kwarg
my_func()(qux=7, thud=1.5) # Works fine, both qux and thud are kwargs
my_func()(7) # Works fine because thud is optional
my_func()(7, 1.5) # Correctly fails because thud can't be a positional arg.
To recap, the partial function returns a fairly vague functionish type, which you can use directly but would lose checking against the inputs. You can annotate it with something more specific, using Callable in simple cases and Protocol in more complicated ones.
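For reference, here is a self-contained sketch of the Protocol approach, using the optional-thud variant from the last example (exact diagnostics may vary a little between mypy versions):

from functools import partial
from typing import Protocol

def foo(qux: int, bar: str, thud: float = 0.5) -> str:
    # Does whatever
    return "Hi"

class PartFoo(Protocol):
    def __call__(fakeSelf, qux: int, *, thud: float = ...) -> str:
        ...

def my_func() -> PartFoo:
    return partial(foo, bar="baz")

my_func()(7, thud=1.5)  # works fine
my_func()(7, 1.5)       # correctly fails: thud can't be positional here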