Is adding an argument to a function through a wrapper a Python anti-pattern? I want to add a wrapper that saves the output of many functions to a location, so a wrapper seems to make sense. However, PyCharm fails to autocomplete the arguments of a decorated function (https://intellij-support.jetbrains.com/hc/en-us/community/posts/360002754060-Autocomplete-with-arguments-for-decorated-functions), and some of the discussion related to changing a function's signature with a wrapper seems to indicate that it is a bad practice (https://youtrack.jetbrains.com/issue/PY-33688#focus=Comments-27-3268273.0-0):
"Some decorators could change functions' signatures, so to correctly process such cases PyCharm would have to read the decorator's body, which could not be done due to performance reasons."
So would doing something like the following be an anti-pattern:
from functools import wraps
from typing import Callable

my_dict = {"very": {"deeply": {"nested": {"filepath": "hidden_filepath"}}}}

def decorator(function: Callable):
    @wraps(function)
    def wrapper(extra_arg: str, function_arg: str) -> str:
        file_path: str = my_dict["very"]["deeply"]["nested"][extra_arg]
        print(f"saving to: {file_path}")
        result: str = function(function_arg)
        print(f"result: {result}")
        return result
    wrapper.__doc__ += "\n:param extra_arg: an extra argument"
    return wrapper
@decorator
def my_function(an_arg: str) -> str:
    """
    my docstring
    :param an_arg:
    :return:
    """
    print(f"my_function arg: {an_arg}")
    return an_arg * 2

my_function("filepath", "cool_str")
I'm also not in love with appending to a docstring in the function, but found that as a solution here: Signature-changing decorator: properly documenting additional argument. Would it make more sense to just change the docstring of the decorated function?
Edit:
The only other reasonable solution I can think of is to create a function that takes the other function as an argument, which is what a wrapper is supposed to solve, e.g.:
def decorator(extra_arg: str, function: Callable, **kwargs) -> str:
    file_path: str = my_dict["very"]["deeply"]["nested"][extra_arg]
    print(f"saving to: {file_path}")
    result: str = function(**kwargs)
    print(f"result: {result}")
    return result

def my_function(an_arg: str) -> str:
    """
    my docstring
    :param an_arg:
    :return:
    """
    print(f"my_function arg: {an_arg}")
    return an_arg * 2

decorator("filepath", my_function, an_arg="cool_str")
Think about maintainability.
Someone else maintaining your code will see that my_function has just one arg. PyCharm and mypy will scream that calls to my_function with more than one arg are wrong. That someone else then goes in 'fixing' all the 'bugs'.
Your program breaks.
Hours and hours of troubleshooting before finding out that your decorator changed the signature of the function.
Heck, doesn't need to be someone else... leave your code for a month or two, and when you go back, you'll likely have forgotten that your decorator mangled your function...
So yeah, it's a bad practice, an anti-pattern, a code smell, a <insert your favorite negative-connotation buzzword here>.
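If you do need this behaviour, a signature-preserving alternative is a decorator factory: fix the extra argument at decoration time instead of at call time. Here is a minimal sketch reusing the dictionary from the question (the factory name saves_output is mine):

from functools import wraps
from typing import Callable

my_dict = {"very": {"deeply": {"nested": {"filepath": "hidden_filepath"}}}}

def saves_output(extra_arg: str) -> Callable:
    """Decorator factory: the extra argument is fixed at decoration time,
    so the decorated function keeps its original signature."""
    def decorator(function: Callable) -> Callable:
        @wraps(function)
        def wrapper(*args, **kwargs):
            file_path = my_dict["very"]["deeply"]["nested"][extra_arg]
            print(f"saving to: {file_path}")
            result = function(*args, **kwargs)
            print(f"result: {result}")
            return result
        return wrapper
    return decorator

@saves_output("filepath")
def my_function(an_arg: str) -> str:
    """my docstring"""
    return an_arg * 2

my_function("cool_str")  # the signature my_function(an_arg) is unchanged

Because the wrapper accepts *args, **kwargs and is wrapped with @wraps, tools see the original one-argument signature, which avoids the maintainability trap described above.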
I have some code I want to improve. I got a suggestion to use a complex solution using functools which I could not understand.
Code explanation: I am trying to create a converter for strings. What I want to do here is run some fixed code before a convert function is executed. However, that fixed code depends on variable arguments such as the country code and the valid lengths for that string.
This is what I implemented, taking inspiration from: https://www.scaler.com/topics/python/python-decorators/
I don't understand why we need three levels of function nesting here just to implement a decorator that requires the arguments country_code and valid_lengths.
import functools
from collections.abc import Callable

class Number:
    def prevalidate(country_code: str, valid_lengths: list[int]):  # type: ignore
        def decorator(func: Callable):
            @functools.wraps(func)
            def wrapper(num: str, validate=False):
                if num.startswith(country_code):
                    num = num[2:]
                if validate and len(num) not in valid_lengths:
                    raise ValueError(f"{num} is not valid {country_code} number")
                return func(num, validate)
            return wrapper
        return decorator

    @staticmethod
    @prevalidate(country_code="DZ", valid_lengths=[13])
    def convert_dz(num: str, validate=False) -> str:
        return num[4:6] + num[-4:]

    ...  # other similar methods

num = Number.convert_dz("W/2011/012346")  # => 1012346
Let me explain each level first.
The outer level is the decorator factory, which produces your decorator based on some input values.
The second level is the decorator which is a function taking a function as argument and returns a new function which wraps the original function.
The inner level is the wrapper, which is the function which will replace the original function.
Now, you may wonder why levels 1 and 2 are not merged. Indeed, they can be merged, but the three layers are motivated by the shortcut provided by the @ symbol. Applying @deco to a function func is equivalent to overwriting the name of the function with func = deco(func), and @deco_factory(args) is equivalent to deco = deco_factory(args); func = deco(func). So it is the @ syntax that passes the function as the single argument. You can still decorate functions manually, but you may confuse other Python developers who are already used to the three-layer design.
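To make the equivalence concrete, here is a minimal sketch (the names deco_factory, deco and wrapper are illustrative):

def deco_factory(prefix):                 # level 1: called with the decorator arguments
    def deco(func):                       # level 2: called with the function
        def wrapper(*args, **kwargs):     # level 3: replaces the original function
            print(prefix, func.__name__)
            return func(*args, **kwargs)
        return wrapper
    return deco

@deco_factory("calling")  # sugar for: greet = deco_factory("calling")(greet)
def greet(name):
    return f"hi {name}"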
Edit:
I have not yet commented on your code example, only explained the title question. Note that every time you call the decorator factory with the same arguments, you create a new decorator; it would be better to reuse a single instance of the decorator. Moreover, if the input values of the decorator factory change the way your class Number behaves, it would be better to add those values to the class constructor, i.e. the __init__ method, and work with instances of Number.
Now, the implementation may not require a decorator at all, because adding self.prevalidate(num) at the beginning of each function is just a one-liner and is more explicit than the decorator, though there might be other ways to achieve it.
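For instance, here is a sketch of that constructor-based design (my adaptation of the example from the question, not the original code):

class Number:
    def __init__(self, country_code: str, valid_lengths: list[int]):
        self.country_code = country_code
        self.valid_lengths = valid_lengths

    def prevalidate(self, num: str, validate: bool = False) -> str:
        # the fixed pre-processing, now an explicit one-liner at each call site
        if num.startswith(self.country_code):
            num = num[2:]
        if validate and len(num) not in self.valid_lengths:
            raise ValueError(f"{num} is not a valid {self.country_code} number")
        return num

    def convert(self, num: str, validate: bool = False) -> str:
        num = self.prevalidate(num, validate)
        return num[4:6] + num[-4:]

dz = Number("DZ", [13])
print(dz.convert("W/2011/012346"))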
I want to restrict which functions can be passed as a parameter to another function: for example, to restrict the function to be only one of two specified ones, or one from a particular module, or one matching a signature. I tried the code below, but there are no restrictions in it: any function can be passed as the parameter.
Is this possible in Python?
def func_as_param():
    print("func_as_param called")

def other_func():
    print("other_func called")

def func_with_func_arg(func: func_as_param):  # this is not giving restrictions
# def func_with_func_arg(func: type(func_as_param)):  # this is also not giving restrictions
    print("func_with_func_arg called")
    func()

def test_func_with_func_arg():
    print("test_func_with_func_arg")
    func_with_func_arg(func_as_param)
    func_with_func_arg(other_func)  # <- here the IDE must complain that only func_as_param is expected
Frameworks expecting callback functions of specific signatures can be type hinted using Callable[[Arg1Type, Arg2Type], ReturnType]. You might want to use the Callable type; this might help: https://docs.python.org/3/library/typing.html#callable
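For example, a callback parameter can be constrained by signature like this (the names here are illustrative):

from typing import Callable

def apply_twice(func: Callable[[str], str], value: str) -> str:
    # mypy/PyCharm will flag callables that don't take one str and return a str
    return func(func(value))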
NOTE
Type annotations in Python are not make-or-break like in C. They're optional chunks of syntax that we can add to make our code more explicit. Erroneous type annotations will do nothing more than highlight the incorrect annotation in our code editor; no errors are ever raised due to annotations. If that's necessary, you must do the checking yourself.
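A minimal sketch of such a manual runtime check, reusing the functions from the question (the whitelist approach is just one option):

ALLOWED_FUNCS = {func_as_param}  # explicit whitelist of accepted callbacks

def func_with_func_arg(func):
    if func not in ALLOWED_FUNCS:
        raise TypeError(f"{func.__name__} is not an allowed callback")
    print("func_with_func_arg called")
    func()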
Normally this is done by creating your own type (class); then any other function can inherit from it and will be of the same "type".
class my_functions:
    pass

class func_as_param_class(my_functions):
    @staticmethod
    def __call__():
        print("func_as_param called")

func_as_param = func_as_param_class()  # create the callable function ....

def other_func():
    print("other_func called")

def func_with_func_arg(func: my_functions):  # only my_functions are accepted.
# def func_with_func_arg(func: type(func_as_param)):
    print("func_with_func_arg called")
    func()

def test_func_with_func_arg():
    print("test_func_with_func_arg")
    func_with_func_arg(func_as_param)
    func_with_func_arg(other_func)  # <- here the IDE must complain that only func_as_param is expected
In the above code, my IDE (PyCharm) does complain that other_func is of the wrong type. This, however, doesn't impose any restriction at runtime; it only allows the IDE linter and mypy to issue warnings on violations.
Edit: removed the arguments by declaring the call function as static.
I don't know if it's just me, but note that Python itself doesn't do anything with type annotations.
Syntax:
def greeting(name: str) -> str:
    return 'Hello ' + name
you can find more here: https://docs.python.org/3/library/typing.html
I am testing mypy in one of my projects to see if I will like it or not. I know only the basics so far. Here is a little problem I am unable to solve.
I need a function with positional-only parameters. In Python 3.8+:
def func(self, arg: int, /, **data):
    ...
(This allows self=something and arg=something to be used in the data, if you're curious why.)
To make it work also in Python 3.7 I had to write:
def func(*args, **data):
    self, arg = args
    ...
And it works fine, but it seems to confuse mypy: it complains about parameter types in calls of this func.
How can I annotate that args[1] is an int?
Update:
I had some partial success with typing.overload.
A single @overload generates this error:
"Single overload definition, multiple required"
and when I write it twice:
"Overloaded function signature 2 will never be matched"
but the calls to that method are now checked OK.
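An alternative that avoids overloads entirely: PEP 484 specifies that type checkers treat parameters whose names begin with two underscores (and do not also end with two) as positional-only, which works on Python 3.7. A minimal sketch:

class C:
    def func(self, __arg: int, **data: object) -> None:
        # mypy treats __arg as positional-only, so callers may still pass
        # arg=... or self=... inside **data without a type error
        ...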
I'm making the transition over to Python 3 and have been exploring some of the functionality of the stdlib. functools.singledispatch caught my eye and I've been playing around with it a little bit. However, at the point where I tried using it in a class I ran into some problems.
It doesn't appear to work with functions registered inside the class. You can make it work by directly calling fun.dispatch(type(arg))(argname=arg), and I was wondering if there is a better way to do it.
I tried using @classmethod and @staticmethod as decorators above and below the registration, but that didn't work.
Here's a contrived example that registers handlers to convert the input argument when creating a class, to ensure that it will always be a list.
from functools import singledispatch

class UrlDispatcher(object):
    @singledispatch
    def url_input(self, input):
        print('input wasn\'t dispatched', input)

    @url_input.register(str)
    def _(self, input):
        print('input is a str', input)
        self.input = [input]

    @url_input.register(list)
    def _(self, input):
        print('input is a list', input)
        self.input = input

    def __init__(self, arg):
        # Works, albeit clunkily
        self.url_input.dispatch(type(arg))(self, input=arg)
        # Always uses the base dispatcher
        self.url_input(input=arg)

a = "http://www.cnn.com"
b = ["http://www.google.com", "http://www.slashdot.org"]
s1 = UrlDispatcher(a)
s2 = UrlDispatcher(b)
The following should work. Whether or not it's the best solution, I don't know.
import functools

class Foo:
    def method(self, arg):
        return _method(arg, self)

# module-level generic function; dispatch happens on the first argument
@functools.singledispatch
def _method(arg, self):
    ...

# register type-specific implementations as usual, e.g.:
@_method.register(str)
def _(arg, self):
    ...
As of Python 3.8 there is a stdlib decorator for method dispatching:
https://docs.python.org/3/library/functools.html#functools.singledispatchmethod
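A sketch of the question's example ported to functools.singledispatchmethod, which dispatches on the first argument after self:

from functools import singledispatchmethod

class UrlDispatcher:
    @singledispatchmethod
    def url_input(self, input):
        print("input wasn't dispatched", input)

    @url_input.register
    def _(self, input: str):
        print("input is a str", input)
        self.input = [input]

    @url_input.register
    def _(self, input: list):
        print("input is a list", input)
        self.input = input

    def __init__(self, arg):
        self.url_input(arg)  # now dispatches on type(arg) as intended

s1 = UrlDispatcher("http://www.cnn.com")
s2 = UrlDispatcher(["http://www.google.com", "http://www.slashdot.org"])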
I found the answer - you don't.
http://code.activestate.com/lists/python-dev/122554/
Quoting from a post I found at the above URL, I think it's explained there; the short answer is that 'generic functions' are for stateless algorithms. I was unaware of that definition.
Correct. OO and generic functions are different development paradigms,
and there are limitations on mixing them. Generic functions are for
stateless algorithms, which expect to receive all required input
through their arguments. By contrast, class and instance methods
expect to receive some state implicitly - in many respects, they
already are generic functions.
Thus, this is really a request for dual dispatch in disguise: you want
to first dispatch on the class or instance (through method dispatch)
and then dispatch on the second argument (through generic function
dispatch).
Dual dispatch is much harder than single dispatch and
"functools.singledispatch" does not and should not support it (it's in
the name). As PJE noted, you can use singledispatch with
staticmethods, as that eliminates the dual dispatch behaviour by
removing the class and instance based dispatch step. You can also
register already bound class and instance methods as implementations
for a generic function, as that also resolves the dual dispatch in a
way that means the single dispatch implementation doesn't even need to
be aware it is happening.
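For example, registering an already-bound method as an implementation for a generic function (a sketch; the names are illustrative):

from functools import singledispatch

@singledispatch
def process(arg):
    print("no specific handler for", arg)

class Collector:
    def __init__(self):
        self.items = []

    def add_str(self, arg):
        self.items.append(arg)

collector = Collector()
process.register(str, collector.add_str)  # the bound method carries its instance state

process("hello")        # dispatches to collector.add_str
print(collector.items)  # ['hello']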
Here is my solution; it works on any Python version (3.8 or less):
class MyClass:
    def over_ride_func(self, arg):
        if type(arg) is list:
            self.over_ride_func_list(arg)
        if type(arg) is int:
            self.over_ride_func_int(arg)
        if type(arg) is float:
            self.over_ride_func_float(arg)
        if type(arg) is str:
            self.over_ride_func_string(arg)

    def over_ride_func_list(self, arg):
        print("List arg: ", arg, " with length", len(arg), " first item", arg[0])

    def over_ride_func_int(self, arg):
        print("int arg: ", arg)

    def over_ride_func_float(self, arg):
        print("Float arg: ", arg)

    def over_ride_func_string(self, arg):
        print("String arg ", arg)

obj_myclass = MyClass()
obj_myclass.over_ride_func(665)
obj_myclass.over_ride_func(444.31)
obj_myclass.over_ride_func([3, 5, 6])
obj_myclass.over_ride_func("Hello over ride function")
I'm trying to make sure running help() at the Python 2.7 REPL displays the __doc__ for a function that was wrapped with functools.partial. Currently running help() on a functools.partial 'function' displays the __doc__ of the functools.partial class, not my wrapped function's __doc__. Is there a way to achieve this?
Consider the following callables:
import functools

def foo(a):
    """My function"""
    pass

partial_foo = functools.partial(foo, 2)
Running help(foo) will result in showing foo.__doc__. However, running help(partial_foo) results in the __doc__ of a Partial object.
My first approach was to use functools.update_wrapper, which correctly replaces the partial object's __doc__ with foo.__doc__. However, this doesn't fix the 'problem' because of how pydoc works.
I've investigated the pydoc code, and the issue seems to be that partial_foo is actually a Partial object, not a typical function/callable; see this question for more information on that detail.
By default, pydoc will display the __doc__ of the object's type, not of the instance, if the object it was passed is determined to be a class by inspect.isclass. See the render_doc function for more information about the code itself.
So, in my scenario above pydoc is displaying the help of the type, functools.partial NOT the __doc__ of my functools.partial instance.
Is there any way to alter my call to help(), or the functools.partial instance that's passed to help(), so that it will display the __doc__ of the instance, not the type?
I found a pretty hacky way to do this. I wrote the following function to override the __builtins__.help function:
import functools
import pydoc

def partialhelper(object=None):
    if isinstance(object, functools.partial):
        return pydoc.help(object.func)
    else:
        # Preserve the ability to go into interactive help if user calls
        # help() with no arguments.
        if object is None:
            return pydoc.help()
        else:
            return pydoc.help(object)
Then just replace it in the REPL with:
__builtins__.help = partialhelper
This works and doesn't seem to have any major downsides, yet. However, with the above naive implementation there isn't a way to still show the __doc__ of some functools.partial objects. It's all or nothing, but one could probably attach an attribute to the wrapped (original) function to indicate whether or not the original __doc__ should be shown. However, in my scenario I never want to do this.
Note the above does NOT work when using IPython and the embed functionality. This is because IPython directly sets the shell's namespace with references to the 'real' __builtin__, see the code and old mailing list for information on why this is.
So, after some investigation there's another way to hack this into IPython. We must override the site._Helper class, which is used by IPython to explicitly setup the help system. The following code will do just that when called BEFORE IPython.embed:
import site
site._Helper.__call__ = lambda self, *args, **kwargs: partialhelper(*args, **kwargs)
Are there any other downsides I'm missing here?
How about implementing your own?

def partial_foo(*args):
    """ some doc string """
    return foo(*((2,) + args))

Not a perfect answer, but if you really want this, I suspect it's the only way to do it.
You identified the issue: partial functions aren't typical functions, and the dunder variables don't carry over. This applies not just to __doc__, but also to __name__, __module__, and more. I'm not sure if this solution existed when the question was asked, but you can achieve this more elegantly ("elegantly" up to interpretation) by re-writing partial() as a decorator factory. Since decorators (and factories) do not automatically copy over dunder variables, you also need to use @wraps(func):
from functools import wraps

def wrapped_partial(*args, **kwargs):
    def foo(func):
        @wraps(func)
        def bar(*fargs, **fkwargs):
            return func(*args, *fargs, **kwargs, **fkwargs)
        return bar
    return foo
Usage example:
@wrapped_partial(3)
def multiply_triple(x, y=1, z=0):
    """Multiplies three numbers"""
    return x * y * z

# Without decorator syntax: multiply_triple = wrapped_partial(3)(multiply_triple)
With output:
>>> print(multiply_triple())
0
>>> print(multiply_triple(3, z=3))
27
>>> help(multiply_triple)
Help on function multiply_triple in module __main__:

multiply_triple(x, y=1, z=0)
    Multiplies three numbers
A thing that didn't work, but which is informative when using multiple decorators:
You might think, as I first did, that based upon the stacking syntax of decorators in PEP 318, you could put the wrapping and the partial function definition in separate decorators, e.g.
from functools import wraps

def partial_func(*args, **kwargs):
    def foo(func):
        def bar(*fargs, **fkwargs):
            return func(*args, *fargs, **kwargs, **fkwargs)
        return bar
    return foo

def wrapped(f):
    @wraps(f)
    def wrapper(*args, **kwargs):
        return f(*args, **kwargs)
    return wrapper
@wrapped
@partial_func(z=3)
def multiply_triple(x, y=1, z=0):
    """Multiplies three numbers"""
    return x * y * z
Here the decorators are applied one at a time, from the bottom up, and @partial_func interrupts the wrapping: @wraps(f) inside wrapped copies the metadata of the undecorated bar rather than that of multiply_triple. This means that if you are trying to use any decorator that you want to wrap, you need to rewrite that decorator as a factory where the decorator's returned function is itself decorated by @wraps(func). If you are using multiple decorators, they all have to be turned into wrapped factories.
Alternate method to have decorators "wrap"
Since decorators are just functions, you can write a copy_dunder_vars(obj1, obj2) function that returns obj1 but with all the dunder variables copied over from obj2. Call it as:
def foo():
    pass

foo = copy_dunder_vars(decorator(foo), foo)
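A minimal sketch of such a helper (copy_dunder_vars is hypothetical; it mirrors the attribute list that functools.update_wrapper copies):

from functools import WRAPPER_ASSIGNMENTS

def copy_dunder_vars(wrapped, original):
    # copy __module__, __name__, __qualname__, __annotations__ and __doc__
    # from the original function onto the wrapped one, then return it
    for attr in WRAPPER_ASSIGNMENTS:
        try:
            setattr(wrapped, attr, getattr(original, attr))
        except AttributeError:
            pass
    return wrapped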
This goes against the preferred syntax, but practicality beats purity. I think "not forcing you to rewrite decorators that you're borrowing from elsewhere and leaving largely unchanged" fits into that category. After all that wrapping, don't forget ribbon and a bow ;)