Typehint a variadic function in Python

I wrote a higher-order Python function (let's call it parent) whose parameter function (let's call it child) is a variadic function.
I don't know how to typehint it.
child takes a first argument that is always a str, followed by a variable number of parameters that can be anything. It returns Any.
The closest I can get is Callable[..., Any], but then I "lose" the fact that the first argument is a str.
I would like something like Callable[[str, ...], Any], but that is not a valid type hint.
Is there a way to typehint my function?

Using a Protocol does not require you to manually wrap values in a "type-hinting wrapper", but unfortunately it doesn't help you here.
For a function to match a protocol, its signature must be compatible with that of the protocol's __call__ method. However, (if I'm not mistaken) you want to match any function with a string as the first argument, which could be any of the following:
def fn1(x: str) -> Any: ...
def fn2(x: str, arg1: int, arg2: float) -> Any: ...
def fn3(x: str, *args: Any, **kwargs: Any) -> Any: ...
These all have different signatures, and thus cannot all be matched by a single protocol (mypy playground):
from typing import Any, Protocol

# This is the protocol you might try, but unfortunately it doesn't cover every case.
class StrCallable(Protocol):
    def __call__(self, x: str, *args, **kwargs) -> Any:
        ...

def my_higher_order_fn(fn: StrCallable) -> StrCallable:
    return fn  # fill this with your actual implementation

my_higher_order_fn(fn1)  # fails
my_higher_order_fn(fn2)  # fails
my_higher_order_fn(fn3)  # this passes, though
PEP 612 introduced ParamSpec, which is what you'll need here. It's kind of like a TypeVar, but for function signatures. You can put a ParamSpec where you'd put the list of argument types in Callable:
from typing import Callable, Concatenate, ParamSpec, TypeVar
P = ParamSpec("P")
TRet = TypeVar("TRet")
StrCallable = Callable[Concatenate[str, P], TRet]
where Concatenate prepends types to an existing parameter specification. Concatenate[str, P] is exactly what you need: any function signature whose first argument is a str.
Unfortunately, PEP 612 isn't available until Python 3.10, and mypy does not yet fully support it either. Until then, you might have to just use Callable[..., TRet].
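For illustration, here is a minimal sketch of how this could be applied to the higher-order function from the examples above, writing the alias out inline; importing from typing_extensions is an assumption for running on Python versions before 3.10:
from typing import Any, Callable, TypeVar
from typing_extensions import Concatenate, ParamSpec  # use typing on 3.10+

P = ParamSpec("P")
TRet = TypeVar("TRet")

def my_higher_order_fn(fn: Callable[Concatenate[str, P], TRet]) -> Callable[Concatenate[str, P], TRet]:
    # A real implementation would wrap fn; returning it unchanged keeps the sketch short.
    return fn

def fn1(x: str) -> Any: ...
def fn3(x: str, *args: Any, **kwargs: Any) -> Any: ...

my_higher_order_fn(fn1)  # accepted: the first parameter is a str
my_higher_order_fn(fn3)  # accepted as well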

Related

How to ignore decorator typing [duplicate]

It is not a big problem, but I wondered how to solve this.
Since I am new to using function annotations in Python, I am not very familiar with them, and I have a question below.
When you write a decorator and want to annotate it, how do you do that?
For example, code like the below:
def decorator(func: Callable[[*args, **kwargs], <what type should be here?>]) -> <??>:
    def new_func(*args, **kwargs):
        return func(*args, **kwargs)
    return new_func
Note that PEP 612 (which is implemented in Python 3.10) introduces ParamSpec, which solves your problem like this:
from typing import Callable, TypeVar, ParamSpec

T = TypeVar('T')
P = ParamSpec('P')

def decorator(func: Callable[P, T]) -> Callable[P, T]:
    def new_func(*args: P.args, **kwargs: P.kwargs) -> T:
        return func(*args, **kwargs)
    return new_func
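As a quick illustration (the add function below is a made-up example), the decorated function keeps its parameter and return types:
@decorator
def add(x: int, y: int) -> int:
    return x + y

result: int = add(1, 2)  # type checks: the original signature is preserved
# add("a", y=2)          # would be flagged by the type checker: wrong argument type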
Update
I actually think @Nikola Benes has the correct answer instead of me, namely:
PEP 612 introduced ParamSpec, which provides the ability to define dependencies between the parameters of callables.
Below is one way you could have tried to do it before ParamSpec, but ParamSpec is the way to go.
For those using Python < 3.10, you should be able to get ParamSpec from typing_extensions:
from typing_extensions import ParamSpec
but I've not experimented with it. It might also depend on whether your static type checker (e.g. mypy, pyright, etc.), and the version of that checker, has implemented support for it.
The first part of the PyCon 2022 Typing Summit video recording shows ParamSpec in action.
Old Workaround:
Use Any for the return type and return another Callable of return type Any. Per PEP 484 and the Python standard library, the first parameter to Callable must be the types of the callable's arguments, not the arguments themselves. Hence, your use of *args and **kwargs inside Callable is not accepted. Instead, you must use an ellipsis ... (which permits any number of positional and keyword argument types).
Decorator functions are more cleanly expressed using generic types (typing.TypeVar). In layman's terms, a generic is something that allows a type to be a parameter.
Paraphrasing from the mypy docs (FYI: mypy is a static type checker for Python):
Decorator functions can be expressed using generic types. Generics can be restricted to values that are subtypes of specific types with the keyword argument bound=.... An upper bound can be used to preserve the signature of the function a decorator decorates.
Hence, your example becomes this:
from typing import Any, Callable, TypeVar, cast

F = TypeVar('F', bound=Callable[..., Any])

def decorator(func: F) -> F:
    def new_func(*args, **kwargs):
        return func(*args, **kwargs)
    return cast(F, new_func)
Also paraphrasing from the mypy docs and PEP 484:
The bound on F is used so that calling the decorator on a non-function will be rejected. Also, the wrapper function (new_func) is not type-checked, because there is (currently) no support for specifying callback signatures with a variable number of arguments of a specific type, so we must cast the type at the end.
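As a quick illustration (greet is a made-up function), the decorated function keeps its signature for callers, while the bound rejects non-callables:
@decorator
def greet(name: str) -> str:
    return f"Hello, {name}"

greeting: str = greet("world")  # OK: greet still appears as (name: str) -> str
# decorator("not a function")   # rejected by the type checker: str is not a Callable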

In Python, how do you annotate a function that accepts a fixed argument as well as any number of other arguments?

In Python, from my understanding, the ellipsis lets you annotate a function that takes any number of arguments (see the typing documentation). Here's an example of what I'd like to do:
from typing import Callable, Any

def foo(first: str, *rest: Any):
    print(rest)
    return first

def call(f: Callable[[str, ...], str]):
    f("Hello", 1, None, True)

print(call(foo))
Python (or, at least, Pylance) doesn't like the ellipsis, reporting that "..." is not allowed in this context.
I've tried to use Python 3.10's ParamSpecs, but the documentation on them (including PEP 612) seems to say they're used for other purposes, and I can't tell what those purposes are. Here's what I've tried:
from typing import Any, Callable, Concatenate, ParamSpec

P = ParamSpec("P")

def foo(first: str, *rest: Any):
    print(rest)
    return first

def call(f: Callable[Concatenate[str, P], str]):
    f("Hello", 1, None, True)

print(call(foo))
Python (or, at least, Pylance) seems to reflect that they aren't meant to be used this way.
How do I annotate a function like this, that knows the type of one or more of its arguments and accepts any number of other arguments anyway?
You can type arbitrary function signatures using __call__ on a Protocol:
from typing import Any, Protocol

class Foo(Protocol):
    def __call__(self, first: str, *rest: Any) -> str:
        ...

def call(f: Foo):
    f("Hello", 1, None, True)

Python type annotation for identical, generic function signatures

typing.Callable takes two "arguments": the argument type(s) and the return type. The argument type should be either ..., for arbitrary arguments, or a list of explicit types (e.g., [str, str, int]).
Is there a way of representing Callables that have exactly the same, albeit arbitrary, signatures for generics?
For example, say I wanted a function that took functions and returned a function with the same signature, I could do this if I knew the function signature upfront:
def fn_combinator(*fn: Callable[[Some, Argument, Types], ReturnType]) -> Callable[[Some, Argument, Types], ReturnType]:
    ...
However, I don't know the argument types upfront and I want my combinator to be suitably general. I had hoped that this would work:
ArgT = TypeVar("ArgT")
RetT = TypeVar("RetT")
FunT = Callable[ArgT, RetT]
def fn_combinator(*fn:FunT) -> FunT:
...
However, the parser (at least in Python 3.7) doesn't like ArgT in the first position. Is Callable[..., RetT] the best I can do?
Prior to Python 3.10
If you don't need to change the function signature at all, you should define FuncT as a TypeVar:
FuncT = TypeVar("FuncT", bound=Callable[..., object])
def fn_combinator(*fn: FuncT) -> FuncT:
...
Is there a way of representing Callables that have exactly the same, albeit arbitrary, signatures for generics?
Unlike a type alias (e.g.: FuncT = Callable[..., RetT]), TypeVar allows the type checker to infer a dependency between the parameters and the return value, ensuring that the function signatures will be exactly the same.
However, this approach is utterly limited. Using FuncT makes it difficult to properly type the returned function (See this mypy issue).
def fn_combinator(*fn: FuncT) -> FuncT:
    def combined_fn(*args: Any, **kwargs: Any) -> Any:
        ...
    # return combined_fn  # Won't work. `combined_fn` is not guaranteed to be `FuncT`
    return cast(FuncT, combined_fn)
This is the best we can do as of Python 3.7 due to the limitation of Callable introduced in PEP 484.
... only a list of parameter arguments ([A, B, C]) or an ellipsis (signifying "undefined parameters") were acceptable as the first "argument" to typing.Callable. --- PEP 612
Python 3.10+
Fortunately, type annotations for callables become more flexible in Python 3.10 with typing.ParamSpec (the so-called "parameter specification variable") and typing.Concatenate, proposed in PEP 612. This extends Callable to support annotating more complicated callables.
This means that you will be able to do the following:
P = ParamSpec("P")
RetT = TypeVar("RetT")
def fn_combinator(*fn: Callable[P, RetT]) -> Callable[P, RetT]:
...
It also allows us to fully type-check the returned callable without using cast:
def fn_combinator(*fn: Callable[P, RetT]) -> Callable[P, RetT]:
    def combined_fn(*args: P.args, **kwargs: P.kwargs) -> RetT:
        ...
    return combined_fn
See the Python 3.10 release notes.
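For illustration, a complete (hypothetical) combinator that calls each supplied function with the same arguments and returns the last result could look like this; importing ParamSpec from typing_extensions is an assumption for Python versions before 3.10:
from typing import Callable, TypeVar
from typing_extensions import ParamSpec  # use typing on 3.10+

P = ParamSpec("P")
RetT = TypeVar("RetT")

def fn_combinator(*fn: Callable[P, RetT]) -> Callable[P, RetT]:
    def combined_fn(*args: P.args, **kwargs: P.kwargs) -> RetT:
        # Call every function with the same arguments and return the last
        # result (assumes at least one function was supplied).
        results = [f(*args, **kwargs) for f in fn]
        return results[-1]
    return combined_fn

def double(x: int) -> int:
    return 2 * x

def negate(x: int) -> int:
    return -x

combined = fn_combinator(double, negate)
value: int = combined(21)  # all inputs (and the result) share one signature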

Specifying *args for a Callable type hint

What's the best way to specify that the Callable variable fn takes *my_args as arguments? Like this:
def test(fn: Callable([Tuple[any]], None), *my_args: any) -> None:
    fn(*my_args)
From the documentation on typing.Callable:
There is no syntax to indicate optional or keyword arguments; such function types are rarely used as callback types. Callable[..., ReturnType] (literal ellipsis) can be used to type hint a callable taking any number of arguments and returning ReturnType.
So in your case where *args is optional and ReturnType is None, use
fn: Callable[..., None]
P.s. I don't use type hints so please let me know if I've misunderstood anything.
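Concretely, a minimal sketch of the signature this suggestion gives you (using Any for the pass-through arguments):
from typing import Any, Callable

def test(fn: Callable[..., None], *my_args: Any) -> None:
    fn(*my_args)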
Now with PEP 612 in Python 3.10, you can write this:
from typing import Callable, ParamSpec

P = ParamSpec("P")

def test(fn: Callable[P, None], *my_args: P.args, **my_kwargs: P.kwargs) -> None:
    fn(*my_args, **my_kwargs)
Then any calls to test will be properly type checked.
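For example (log_pair is a hypothetical callback), the arguments passed to test are checked against fn's own parameter list:
def log_pair(key: str, value: int) -> None:
    print(key, value)

test(log_pair, "answer", 42)    # OK: matches log_pair's parameters
# test(log_pair, 42, "answer")  # would be flagged: argument types are swapped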
