Documenting function params when wrapping a function - Python

I've written some code to wrap shutil.copyfile like so (this is a greatly simplified example):

from functools import wraps
from shutil import copyfile

def my_wrapper(f):
    @wraps(f)
    def wrapper(*args, **kwargs):
        return f(*args, **kwargs)
    return wrapper

@my_wrapper
def mycopyfile(*args, **kwargs):
    """Wrap :func:`shutil.copyfile`"""
    return copyfile(*args, **kwargs)

In PyCharm, if I type mycopyfile( it suggests *args and **kwargs as the params. How can I make PyCharm and other IDEs suggest the params of shutil.copyfile?
In addition, the quick docs in PyCharm show the docstring for mycopyfile rather than the docs for shutil.copyfile, even though mycopyfile.__doc__ returns the docs correctly (as determined by the @wraps decorator).
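One hedged sketch of a way to get the parameters suggested: spell out the parameters on the wrapping function instead of using *args/**kwargs (the explicit src/dst/follow_symlinks signature below mirrors shutil.copyfile and is shown for illustration). With @wraps setting __wrapped__, inspect.signature and signature-aware IDEs can then report the real parameters:

```python
import inspect
from functools import wraps
from shutil import copyfile

def my_wrapper(f):
    @wraps(f)  # copies __doc__, __name__ and sets __wrapped__
    def wrapper(*args, **kwargs):
        return f(*args, **kwargs)
    return wrapper

@my_wrapper
def mycopyfile(src, dst, *, follow_symlinks=True):
    """Wrap :func:`shutil.copyfile`"""
    return copyfile(src, dst, follow_symlinks=follow_symlinks)

# inspect.signature follows __wrapped__ by default, so it reports the
# explicit parameters rather than (*args, **kwargs)
print(inspect.signature(mycopyfile))  # (src, dst, *, follow_symlinks=True)
```

Tools that call inspect.signature (including PyCharm's quick docs in recent versions) pick up the explicit parameters this way; completion engines that only look at the wrapper's own parameter list may still show *args, **kwargs.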

Related

Calling a decorated python function from robotframework script resulting in infinite recursion

I'm trying to invoke a Python function from a Robot Framework keyword. The Python function has been decorated to be invoked via run_keyword from the BuiltIn library, because robot logs appear well structured if library functions are invoked via run_keyword rather than called directly. However, this results in an infinite loop. Is there a solution to gracefully accomplish the goal?
Robot keyword:

do something
    # creates a user by calling a function from a Python-based library
    create user

Python function:

@wrap_with_run_keyword
def create_user():
    pass

def wrap_with_run_keyword(func):
    def func_wrapper(*args, **kwargs):
        return run_keyword(func, *args, **kwargs)
    return func_wrapper
I couldn't solve the problem using partial application.
However, I broke the recursive loop by setting and unsetting an attribute, as given below:
def wrap_with_run_keyword(func):
    def func_wrapper(*args, **kwargs):
        if not hasattr(func, 'second'):
            setattr(func, 'second', True)
            return run_keyword(func, *args, **kwargs)
        else:
            delattr(func, 'second')
            return func(*args, **kwargs)
    return func_wrapper
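A minimal runnable sketch of the set/unset-attribute trick, with a stand-in run_keyword that re-enters the wrapper the way Robot Framework would (the stand-in and the names below are illustrative, not Robot Framework's actual implementation):

```python
calls = []

def run_keyword(func, *args, **kwargs):
    # Stand-in: Robot Framework re-enters the registered keyword,
    # i.e. the wrapper, not the raw function
    return wrapped_create_user(*args, **kwargs)

def wrap_with_run_keyword(func):
    def func_wrapper(*args, **kwargs):
        if not hasattr(func, 'second'):
            # first entry: route through run_keyword, flag the re-entry
            setattr(func, 'second', True)
            return run_keyword(func, *args, **kwargs)
        else:
            # second entry: clear the flag and call the real function
            delattr(func, 'second')
            return func(*args, **kwargs)
    return func_wrapper

def create_user():
    calls.append('real call')
    return 'ok'

wrapped_create_user = wrap_with_run_keyword(create_user)
print(wrapped_create_user())  # 'ok': the second entry hits the real function
```

Note that flagging the function object with an attribute is not thread-safe; two concurrent calls could interleave the set/delete steps.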
I have, however, run into another problem. I defined create_user as follows:

def create_user(properties):
    # some code
    pass

On calling this function in the way below:

create_user("name=abc")

I'm getting the following error: got an unexpected keyword argument 'name'
I ran into the same issue, but solved it. I'm only wondering if I can detect the caller, i.e. whether the call is made from Robot Framework or from Python; if the call is made by RF, it should do only the second call.
@wraps(function)
def wrapper(self, *args, **kwargs):
    if not hasattr(function, 'second'):
        setattr(function, 'second', True)
        ar = list(args)
        for key, value in kwargs.items():
            ar.append(value)
        return BuiltIn().run_keyword('Mylib.' + function.__name__, ar)
    else:
        delattr(function, 'second')
        return function(self, *args[0])
return wrapper
Take a look at the partial class from the functools module; I think this might help you.
Or take a look at how decorators work in Python.

Decorator not work with argument suggestions?

decorator code:

from functools import wraps

def wrap2(func):
    @wraps(func)
    def wrap(*args, **kwargs):
        return func(*args, **kwargs)
    return wrap

test function:

@wrap2
def f2(x='', y=''):
    return 1

def f3(x='', y=''):
    return 1
problem: cannot use argument suggestions with the tab key on the decorated function.
(screenshot omitted)
great thanks
Even if you use functools.wraps, the decorator only copies metadata such as the docstring and __name__ onto the wrapper; the wrapper function itself still has the (*args, **kwargs) signature. functools.wraps does set __wrapped__, which lets inspect.signature recover the original signature, but many completion tools only look at the wrapper's own parameters.

Pass a decorator's function into Python RQ

How do I pass a decorator's function into a job?
I have a decorator that would run a job using the function.
@job
def queueFunction(passedFunction, *args, **kwargs):
    # Do some stuff
    passedFunction(*args, **kwargs)

def myDecorator(async=True):
    def wrapper(function):
        def wrappedFunc(*args, **kwargs):
            data = DEFAULT_DATA
            if async:
                queueFunction.delay(function, *args, **kwargs)
            else:
                data = queueFunction(function, *args, **kwargs)
            return data
        return wrappedFunc
    return wrapper
I get an error when trying to use it.
Can't pickle <function Model.passedFunction at 0x7f410ad4a048>: it's not the same object as modelInstance.models.Model.passedFunction
Using Python 3.4
What happens is that you are passing the original function (or method) to the queueFunction.delay() function, but that's not the same function its qualified name says it is.
In order to run functions in a worker, Python RQ uses the pickle module to serialise both the function and its arguments. But functions (and classes) are serialised as importable names; when deserialising, the pickle module simply imports the recorded name. It does first check that this will result in the right object, so when pickling, the qualified name is tested to double-check it'll produce the exact same object.
If we use pickle.loads as a sample function, then what roughly happens is this:
>>> import pickle
>>> import sys
>>> sample_function = pickle.loads
>>> module_name = sample_function.__module__
>>> function_name = sample_function.__qualname__
>>> recorded_name = f"{module_name}.{function_name}"
>>> recorded_name
'_pickle.loads'
>>> parent, obj = sys.modules[module_name], None
>>> for name in function_name.split("."):  # traverse a dotted path of names
...     obj = getattr(parent, name)
...
>>> obj is sample_function
True
Note that pickle.loads is really _pickle.loads; that doesn't matter all that much, but what does matter is that _pickle can be accessed and it has an object that can be found by using the qualified name, and it is the same object still. This will work even for methods on classes (modulename.ClassName.method_name).
But when you decorate a function, you are potentially replacing that function object:
>>> def decorator(f):
...     def wrapper(*args, **kwargs):
...         return f, f(*args, **kwargs)
...     return wrapper
...
>>> @decorator
... def foo(): pass
...
>>> foo.__qualname__
'decorator.<locals>.wrapper'
>>> foo()[0].__qualname__ # original function
'foo'
Note that the decorator result has a very different qualified name from the original! Pickle won't be able to map that back to either the decorator result or to the original function.
You are passing the original, undecorated function to queueFunction.delay(), and its qualified name will not match that of the wrappedFunc() function you replaced it with; when pickle tries to import the fully qualified name found on that function object, it'll find the wrappedFunc object, and that's not the same object.
There are several ways around this, but the easiest is to store the original function as an attribute on the wrapper, and rename its qualified name to match. This makes the original function available under an importable name again.
You'll have to use the @functools.wraps() utility decorator here to copy various attributes from the original, decorated function over to your wrapper function. This includes the original name.
Here is a version that alters the original function qualified name:
from functools import wraps

def myDecorator(async_=True):
    def wrapper(function):
        @wraps(function)
        def wrappedFunc(*args, **kwargs):
            data = DEFAULT_DATA
            if async_:
                queueFunction.delay(function, *args, **kwargs)
            else:
                data = queueFunction(function, *args, **kwargs)
            return data
        # make the original available to the pickle module as "<name>.original"
        wrappedFunc.original = function
        wrappedFunc.original.__qualname__ += ".original"
        return wrappedFunc
    return wrapper
The @wraps(function) decorator makes sure that wrappedFunc.__qualname__ is set to that of function, so if function was named foo, so now is the wrappedFunc function object. The wrappedFunc.original.__qualname__ += ".original" statement then sets the qualified name of wrappedFunc.original to foo.original, and that's exactly where pickle can find it again!
Note: I renamed async to async_ to make the above code work on Python 3.7 and above; as of Python 3.7 async is a reserved keyword.
I also see that you are deciding whether to run something synchronously or asynchronously at decoration time. In that case I'd rewrite it to not check the async_ boolean flag each time you call the function. Just return different wrappers:
from functools import wraps

def myDecorator(async_=True):
    def decorator(function):
        if async_:
            @wraps(function)
            def wrapper(*args, **kwargs):
                queueFunction.delay(wrapper.original, *args, **kwargs)
                return DEFAULT_DATA
            # make the original available to the pickle module as "<name>.original"
            wrapper.original = function
            wrapper.original.__qualname__ += ".original"
        else:
            @wraps(function)
            def wrapper(*args, **kwargs):
                return queueFunction(function, *args, **kwargs)
        return wrapper
    return decorator
I also renamed the various inner functions; myDecorator is a decorator factory that returns the actual decorator, and the decorator returns the wrapper.
Either way, the result is that now the .original object can be pickled:
>>> import pickle
>>> @myDecorator(True)
... def foo(): pass
...
>>> foo.original
<function foo.original at 0x10195dd90>
>>> pickle.dumps(foo.original, pickle.HIGHEST_PROTOCOL)
b'\x80\x04\x95\x1d\x00\x00\x00\x00\x00\x00\x00\x8c\x08__main__\x94\x8c\x0cfoo.original\x94\x93\x94.'

Run-time-patch python module

I am searching for a way to run a module while replacing imports. This would be the missing magic to implement run_patched in the following pseudocode:
from argparse import ArgumentParser

class ArgumentCounter(ArgumentParser):
    def __init__(self, *args, **kwargs):
        # initialise the counter before super().__init__, which may itself
        # call add_argument (e.g. for the automatic -h/--help option)
        self.arg_counter = 0
        super().__init__(*args, **kwargs)

    def add_argument(self, *args, **kwargs):
        super().add_argument(*args, **kwargs)
        self.arg_counter += 1

    def parse_args(self, *args, **kwargs):
        super().parse_args(*args, **kwargs)
        print(self.arg_counter)

run_patched('test.test_argparse', ArgumentParser=ArgumentCounter)
I know that single methods can be replaced by assignment, for example stating ArgumentParser.parse_args = print, so I was tempted to mess with globals such as sys.modules and then execute the module via runpy.run_module.
Unfortunately, the whole strategy needs to work in a multithreaded scenario, so the change should only affect the module being executed, while other parts of the program continue to use the unpatched module(s) as if they were never touched.

Decorating an instance method and calling it from the decorator

I am using the nose test generators feature to run the same test with different contexts. It requires the following boilerplate for each test:

class TestSample(TestBase):
    def test_sample(self):
        for context in contexts:
            yield self.check_sample, context

    def check_sample(self, context):
        """The real test logic is implemented here"""
        pass
I decided to write the following decorator:

from functools import wraps

def with_contexts(contexts=None):
    if contexts is None:
        contexts = ['twitter', 'linkedin', 'facebook']
    def decorator(f):
        @wraps(f)
        def wrapper(self, *args, **kwargs):
            for context in contexts:
                yield f, self, context  # The line which causes the error
        return wrapper
    return decorator
The decorator is used in the following manner:

class TestSample(TestBase):
    @with_contexts()
    def test_sample(self, context):
        """The real test logic is implemented here"""
        var1 = self.some_valid_attribute
When the tests are executed, an error is thrown stating that the attribute being accessed is not available. However, if I change the line which yields the method to the following, it works fine:

yield getattr(self, f.__name__), service

I understand that the above snippet creates a bound method, whereas in the first one self is passed manually to the function. However, as far as my understanding goes, the first snippet should work fine too. I would appreciate it if anyone could clarify the issue.
The title of the question is related to calling instance methods in decorators in general but I have kept the description specific to my context.
You can use functools.partial to tie the wrapped function to self, just like a method would be:

from functools import partial, wraps

def decorator(f):
    @wraps(f)
    def wrapper(self, *args, **kwargs):
        for context in contexts:
            yield partial(f, self), context
    return wrapper
Now you are yielding partials instead, which, when called as yieldedvalue(context), will call f(self, context).
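A small sketch of that equivalence, using illustrative class and attribute names: calling partial(f, self) with a context is the same as calling f(self, context), so the attribute lookup on self works just as it would in a bound method.

```python
from functools import partial

class TestBase:
    some_valid_attribute = 'value'

def check(self, context):
    # self is supplied by the partial, context by the caller
    return (self.some_valid_attribute, context)

inst = TestBase()
bound = partial(check, inst)
print(bound('twitter'))  # ('value', 'twitter')
```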
As far as I can tell, some things don't fit together. First, your decorator goes like:

def with_contexts(contexts=None):
    if contexts is None:
        contexts = ['twitter', 'linkedin', 'facebook']
    def decorator(f):
        @wraps(f)
        def wrapper(self, *args, **kwargs):
            for context in contexts:
                yield f, self, context  # The line which causes the error
        return wrapper
    return decorator
but you use it like

@with_contexts
def test_sample(self, context):
    """The real test logic is implemented here"""
    var1 = self.some_valid_attribute

This is wrong: this calls with_contexts(test_sample), but you need with_contexts()(test_sample). So do

@with_contexts()
def test_sample(self, context):
    """The real test logic is implemented here"""
    var1 = self.some_valid_attribute

even if you don't provide the contexts argument.
Second, you decorate the wrong function: your usage shows that the test function yields the check function for each context. The function you want to wrap does the job of the check function, but it has to be named after the test function.
Applying self to a method can be done with partial, as Martijn writes, but it can just as well be done the way Python does it under the hood: with

method.__get__(self, None)

or, maybe better,

method.__get__(self, type(self))

you can achieve the same. (Maybe your original version works as well, yielding the function to be called and the arguments to use; it was not clear to me that this is how it works.)
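A quick sketch of that descriptor mechanism, with illustrative names: plain functions are descriptors, and calling __get__ on one produces the same bound method that normal attribute lookup would.

```python
class Greeter:
    def __init__(self, name):
        self.name = name

def greet(self, greeting):
    return "%s, %s" % (greeting, self.name)

g = Greeter("world")

# Functions implement __get__, so this yields a bound method,
# exactly what Python builds when you access a method on an instance.
bound = greet.__get__(g, type(g))
print(bound("hello"))  # hello, world
```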
