I am testing mypy in one of my projects to see whether I will like it or not. I know only the basics so far. Here is a little problem I am unable to solve.
I need a function with positional-only parameters. In Python 3.8+:
def func(self, arg: int, /, **data):
(This allows self=something and arg=something to be passed in data, if you're curious why.)
To make it work also in Python 3.7 I had to write:
def func(*args, **data):
    self, arg = args
It works fine, but it seems to confuse mypy: it complains about parameter types in calls to this func.
How can I annotate that args[1] is an int?
Update:
I had some partial success with typing.overload.
A single @overload generates this error:
"Single overload definition, multiple required"
and when I write it twice:
"Overloaded function signature 2 will never be matched"
but the calls to that method are now checked OK.
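For illustration, the workaround described above looks roughly like this (a reconstruction, not the exact original code; mypy's messages may vary by version):

from typing import Any, overload

class Example:
    @overload
    def func(self, arg: int, **data: Any) -> None: ...
    @overload
    def func(self, arg: int, **data: Any) -> None: ...  # duplicate; mypy flags it as never matched
    def func(*args: Any, **data: Any) -> None:
        self, arg = args
        ...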
Related
How does mypy apply the Liskov substitution principle to *args, **kwargs parameters?
I thought the following code should fail a mypy check since some calls to f allowed by the Base class are not allowed by C, but it actually passed. Are there any reasons for this?
from abc import ABC, abstractmethod
from typing import Any

class Base(ABC):
    @abstractmethod
    def f(self, *args: Any, **kwargs: Any) -> int:
        pass

class C(Base):
    def f(self, batch: int, train: bool) -> int:
        return 1
I also tried to remove either *args or **kwargs, both failed.
Contrary to what Daniil says in the currently accepted answer, the reason is exactly the (*args: Any, **kwargs: Any) part of the signature.
Please check the corresponding discussion on the mypy issue tracker:
I actually like this idea, I have seen this confusion several times, and although it is a bit unsafe, most of the time when people write (*args, **kwargs) it means "don't care", rather than "should work for all calls".
[GVR] Agreed, this is a case where practicality beats purity.
So, mypy gives special treatment to functions of the form
# _T is an arbitrary type
class _:
    def _(self, *args, **kwargs) -> _T: ...
and considers them fully equivalent to Callable[..., _T].
Yes, this actually violates LSP, of course, but it was designed this way deliberately, to allow declaring functions with the signature "just ignore my parameters".
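To see this equivalence in action, here is a small sketch (mine, not from the tracker; the names are made up). Without the special-casing, two_params could not satisfy a protocol that claims to accept arbitrary arguments:

from typing import Any, Protocol

class EatsAnything(Protocol):
    def __call__(self, *args: Any, **kwargs: Any) -> int: ...

def two_params(batch: int, train: bool) -> int:
    return 1

# Accepted: the protocol's __call__ is treated as Callable[..., int],
# so any int-returning callable satisfies it.
cb: EatsAnything = two_params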
To declare the broadest possible function that really accepts arbitrary positional and keyword arguments, you should use object in the signature instead.
This has nothing to do with *args or **kwargs per se. The reason for this is strictly the fact that you used typing.Any for both annotations.
The Any annotation is basically a Jedi mind trick for the type checker to the effect of:
These are the types you were looking for.
No matter what, it will always pass.
For this reason, the typing documentation specifically recommends using object instead of Any as much as possible, when you mean to say something like "the broadest possible type". Any should be reserved as a last resort, for when you bump against the limits of the Python typing system.
The mypy docs also have a section explaining the difference between Any and object.
If you change even one of those Any annotations to object, you will be rightfully chastised by mypy with an [override] error for C.f.
Example:
from typing import Any

class Base:
    def f(self, *args: object, **kwargs: Any) -> int:
        return 2

class C(Base):
    def f(self, batch: int, train: bool) -> int:  # now this is an error
        return 1
Whereas the combination of saying "any number of positional and keyword arguments" together with "each argument will always pass the type check" essentially translates to "no override will ever be wrong" (in terms of arguments).
So I would suggest using object instead of Any everywhere, unless you cannot avoid using the latter.
These confusions are one of the reasons I think the choice to name this construct Any is so unfortunate.
PS
My first paragraph was not well worded. As @SUTerliakov explained more clearly, the reason this override does not cause an error is specifically the combination of the *args/**kwargs parameters and their being annotated with Any. Only if both conditions are met does mypy make this exception.
I want to restrict the scope of functions that can be passed as a parameter to another function: for example, to only one of two specified functions, or to functions from a particular module, or by signature. I tried the code below, but it imposes no restriction: any function can be passed as the parameter.
Is this possible in Python?
def func_as_param():
    print("func_as_param called")

def other_func():
    print("other_func called")

def func_with_func_arg(func: func_as_param):  # this is not giving restrictions
# def func_with_func_arg(func: type(func_as_param)):  # this is also not giving restrictions
    print("func_with_func_arg called")
    func()

def test_func_with_func_arg():
    print("test_func_with_func_arg")
    func_with_func_arg(func_as_param)
    func_with_func_arg(other_func)  # <- here the IDE should complain that only func_as_param is expected
Frameworks expecting callback functions of specific signatures might be type hinted using Callable[[Arg1Type, Arg2Type], ReturnType]
You might want to use the Callable type. This might help: https://docs.python.org/3/library/typing.html#callable
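For example, a sketch restricting the parameter by signature, based on the functions from the question (note that Callable constrains the signature, not which specific function is passed):

from typing import Callable

def func_as_param() -> None:
    print("func_as_param called")

def func_with_func_arg(func: Callable[[], None]) -> None:
    print("func_with_func_arg called")
    func()

func_with_func_arg(func_as_param)  # OK: a zero-argument callable returning None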
NOTE
Type annotations in Python are not make-or-break like in C. They’re optional chunks of syntax that we can add to make our code more explicit.
Erroneous type annotations will do nothing more than highlight the incorrect annotation in our code editor — no errors are ever raised due to annotations.
If that's necessary, you must do the checking yourself.
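For instance, a minimal sketch of such a hand-rolled runtime check using a whitelist (the set name is made up):

def func_as_param():
    print("func_as_param called")

ALLOWED_FUNCS = {func_as_param}  # explicit whitelist of accepted callbacks

def func_with_func_arg(func):
    if func not in ALLOWED_FUNCS:
        raise TypeError(f"{func!r} is not an allowed callback")
    print("func_with_func_arg called")
    func()

func_with_func_arg(func_as_param)  # runs
# func_with_func_arg(print)       # would raise TypeError at runtime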
Normally this is done by creating your own type (class); any other function can then inherit from it and will be of the same "type".
class my_functions:
    pass

class func_as_param_class(my_functions):
    @staticmethod
    def __call__():
        print("func_as_param called")

func_as_param = func_as_param_class()  # create the callable function ....

def other_func():
    print("other_func called")

def func_with_func_arg(func: my_functions):  # only my_functions are accepted.
# def func_with_func_arg(func: type(func_as_param)):
    print("func_with_func_arg called")
    func()

def test_func_with_func_arg():
    print("test_func_with_func_arg")
    func_with_func_arg(func_as_param)
    func_with_func_arg(other_func)  # <- here the IDE should complain that only func_as_param is expected
In the above code, my IDE (PyCharm) does complain that other_func is of the wrong type. This, however, doesn't impose any restriction at runtime; it only allows the IDE linter and mypy to issue warnings on violations.
Edit: removed the arguments by declaring the __call__ function as static.
I don't know if it's just me, but as far as I can tell Python itself doesn't do anything with type annotations.
Syntax:
def greeting(name: str) -> str:
    return 'Hello ' + name
You can find more here: https://docs.python.org/3/library/typing.html
Is adding an argument to a function through a wrapper a Python anti-pattern? I want to add a wrapper that saves the output of many functions to a location, so a wrapper seems to make sense. However, PyCharm fails to autocomplete the arguments of a decorated function (https://intellij-support.jetbrains.com/hc/en-us/community/posts/360002754060-Autocomplete-with-arguments-for-decorated-functions),
and some of the discussion related to changing a function's signature with a wrapper seems to indicate that it is a bad practice (https://youtrack.jetbrains.com/issue/PY-33688#focus=Comments-27-3268273.0-0).
Some decorators can change a function's signature, so to correctly process such cases PyCharm would have to read the decorator's body, which cannot be done for performance reasons.
So would doing something like the following be an anti-pattern:
from functools import wraps
from typing import Callable

my_dict = {"very": {"deeply": {"nested": {"filepath": "hidden_filepath"}}}}

def decorator(function: Callable):
    @wraps(function)
    def wrapper(extra_arg: str, function_arg: str) -> str:
        file_path: str = my_dict["very"]["deeply"]["nested"][extra_arg]
        print(f"saving to: {file_path}")
        result: str = function(function_arg)
        print(f"result: {result}")
        return result
    wrapper.__doc__ += "\n:param extra_arg: an extra argument"
    return wrapper

@decorator
def my_function(an_arg: str) -> str:
    """
    my docstring
    :param an_arg:
    :return:
    """
    print(f"my_function arg: {an_arg}")
    return an_arg * 2

my_function("filepath", "cool_str")
I'm also not in love with appending to a docstring in the function, but found that as a solution here: Signature-changing decorator: properly documenting additional argument. Would it make more sense to just change the docstring of the decorated function?
Edit:
The only other reasonable solution I can think of is to create a function that takes the other function as an argument, which is what a wrapper is supposed to solve, e.g.:
def decorator(extra_arg: str, function: Callable, **kwargs) -> str:
    file_path: str = my_dict["very"]["deeply"]["nested"][extra_arg]
    print(f"saving to: {file_path}")
    result: str = function(**kwargs)
    print(f"result: {result}")
    return result

def my_function(an_arg: str) -> str:
    """
    my docstring
    :param an_arg:
    :return:
    """
    print(f"my_function arg: {an_arg}")
    return an_arg * 2

decorator("filepath", my_function, an_arg="cool_str")
Think about maintainability.
Someone else maintaining your code will see that my_function has just one arg. PyCharm and mypy will scream that calls to my_function with more than one arg are wrong. That someone else then goes in, 'fixing' all the 'bugs'.
Your program breaks.
Hours and hours of troubleshooting before finding out that your decorator changed the signature of the function.
Heck, doesn't need to be someone else... leave your code for a month or two, and when you go back, you'll likely have forgotten that your decorator mangled your function...
So yeah, it's a bad practice, an anti-pattern, a code smell, a <insert your favorite negative-connotation buzzword here>.
I am working within a Python web framework that uses Python 3 type annotations for validation and dependency injection.
So I am looking for a way to generate functions with type annotations from parameters given to the generating function:
def gen_fn(args: Dict[str, Any]) -> Callable:
    def new_fn(???):
        pass
    return new_fn
so that
inspect.signature(gen_fn({'a': int}))
will return
<Signature (a:int)>
Is there something I can put instead of the ??? that will do what I need?
I also looked at Signature.replace() in the inspect module, but did not find a way to attach the new signature to a new or existing function.
I am hesitant to use ast because:
The abstract syntax itself might change with each Python release
So my question is: What (if any) is a reasonable way to generate a function with Python 3 type annotation based on a dict passed to the generating function?
Edit: while @Aran-Fey's solution answers my question correctly, it appears that my assumption was wrong. Changing the signature doesn't allow calling new_fn using the new signature. That is, gen_fn({'a': int})(a=42) raises TypeError: ... got an unexpected keyword argument 'a'.
Instead of creating a function with annotations, it's easier to create a function and then set the annotations manually.
inspect.signature looks for the existence of a __signature__ attribute before it looks at the function's actual signature, so we can craft an appropriate inspect.Signature object and assign it there:
params = [inspect.Parameter(param,
                            inspect.Parameter.POSITIONAL_OR_KEYWORD,
                            annotation=type_)
          for param, type_ in args.items()]
new_fn.__signature__ = inspect.Signature(params)
typing.get_type_hints does not respect __signature__, so we should update the __annotations__ attribute as well:
new_fn.__annotations__ = args
Putting them both together:
def gen_fn(args: Dict[str, Any]) -> Callable:
    def new_fn():
        pass

    params = [inspect.Parameter(param,
                                inspect.Parameter.POSITIONAL_OR_KEYWORD,
                                annotation=type_)
              for param, type_ in args.items()]
    new_fn.__signature__ = inspect.Signature(params)
    new_fn.__annotations__ = args
    return new_fn

print(inspect.signature(gen_fn({'a': int})))  # (a:int)
print(get_type_hints(gen_fn({'a': int})))  # {'a': <class 'int'>}
Note that this doesn't make your function callable with these arguments; all of this is just smoke and mirrors that makes the function look like it has those parameters and annotations. Implementing the function is a separate issue.
You can define the function with varargs to aggregate all the arguments into a tuple and a dict:
def new_fn(*args, **kwargs):
    ...
But that still leaves you with the problem of implementing the function body. You haven't said what the function should do when it's called, so I can't help you with that. You can look at this question for some pointers.
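That said, if the goal is only to make calls such as gen_fn({'a': int})(a=42) stop raising TypeError (as in the question's edit), one option is to give new_fn a varargs body and validate calls against the crafted signature with Signature.bind. This is my own sketch, not part of the accepted answer, and the body is a placeholder:

import inspect
from typing import Any, Callable, Dict

def gen_fn(args: Dict[str, Any]) -> Callable:
    def new_fn(*call_args, **call_kwargs):
        # Bind the actual call to the crafted signature; this raises TypeError
        # on mismatch, just like a real function with these parameters would.
        bound = new_fn.__signature__.bind(*call_args, **call_kwargs)
        return bound.arguments  # placeholder body: return the bound arguments

    params = [inspect.Parameter(name,
                                inspect.Parameter.POSITIONAL_OR_KEYWORD,
                                annotation=type_)
              for name, type_ in args.items()]
    new_fn.__signature__ = inspect.Signature(params)
    new_fn.__annotations__ = args
    return new_fn

print(gen_fn({'a': int})(a=42))  # {'a': 42}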
Some decorators should only be used in the outermost layer.
A decorator that augments the original function and adds a configuration parameter is one example.
from functools import wraps

def special_case(f):
    @wraps(f)
    def _(a, b, config_x=False):
        if config_x:
            print("Special case here")
            return
        return f(a, b)
    return _
How can I avoid decorators like this getting decorated by another decorator?
EDIT
It is really unpleasant to make everyone who wants to apply a new decorator worry about the application order.
So, is it possible to avoid this kind of situation? Is it possible to add a config option without introducing a new parameter?
There isn't any way to stop it from being decorated. You just have to document that it needs to be applied last, and tell people not to use it inside another decorator.
Edit responding to your edit: In Python 3 you can give your function a keyword-only argument. This drastically reduces the impact that the change will have on existing uses of the function. Unfortunately this only works in Python 3.
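For example, a sketch of the decorator from the question with config_x made keyword-only:

from functools import wraps

def special_case(f):
    @wraps(f)
    def _(a, b, *, config_x=False):  # config_x can now only be passed by keyword
        if config_x:
            print("Special case here")
            return None
        return f(a, b)
    return _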
Ultimately, applying a decorator to a function just means passing the decorated function as an argument to another function. There's no way for a function (or any object) to even know that it's being passed as an argument, let alone what it's being passed to. The reason you can't know about later decorators is the same reason that in an ordinary function call like f(g(x)), the function g can't know that it will later be called by f.
This is one reason writing decorators is tricky. Code that relies on heavy use of decorators that pass explicit arguments to their wrapped functions (as yours passes a and b) is inherently going to be fragile. Fortunately, a lot of the time you can write a decorator that uses *args and **kwargs so it can pass all the arguments it doesn't use along to the decorated function.
If someone takes the code you provide, and writes another decorator that explicitly accepts only a and b as arguments, and then calls the decorated function as f(a, b, True), it's their own fault if it fails. They should have known that other decorators they used might have changed the function signature.
Normally, when one writes a decorator to be used generically, one does not restrict the number or names of the arguments of the function it wraps.
Most decorators out there accept a list of positional arguments and a mapping of keyword arguments as parameters for their wrapper, and pass those, as received, to the decorated function:
def deco(func):
    def wrapper(*args, **kwargs):
        # ... decorator stuff here ...
        return func(*args, **kwargs)
    return wrapper
Therefore, if the decorator is to receive a parameter that it should "consume", like the config_x you mention, all you have to do is document it, accept it as a keyword parameter, and pick it out of kwargs. To avoid name clashes between parameters, one can, for example, prefix the parameter name with the decorator's own name or another distinctive name:
def deco(func):
    def wrapper(*args, **kwargs):
        if "deco_config_x" in kwargs:
            config_x = kwargs.pop("deco_config_x")
        # ... decorator stuff here ...
        return func(*args, **kwargs)
    return wrapper
This way, the decorator may be put anywhere in a "decorator stack": it will pick up the parameter(s) addressed to it, and those below it won't get any stray parameters. The only requirement is that your functions and decorators as a whole let keyword parameters they don't know about pass through.
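A short usage sketch of this convention with two stacked decorators (the names and the prefixing scheme are illustrative):

def deco_a(func):
    def wrapper(*args, **kwargs):
        if kwargs.pop("deco_a_verbose", False):  # consume only our own flag
            print("deco_a active")
        return func(*args, **kwargs)
    return wrapper

def deco_b(func):
    def wrapper(*args, **kwargs):
        if kwargs.pop("deco_b_trace", False):
            print("deco_b tracing call")
        return func(*args, **kwargs)
    return wrapper

@deco_a
@deco_b
def add(x, y):
    return x + y

print(add(1, 2, deco_a_verbose=True, deco_b_trace=True))  # prints both messages, then 3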