Decorator inside a class in Python

Sorry for my English. I want to create a decorator method that can check each step method and write its status to the database.
This is my method:
class Test:
    @StepStatusManager.logger_steps("GET_LIST")  # TypeError: logger_steps() missing 1 required positional argument: 'type'
    def get_mails(self):
        print("GET_MAIL")
This is my decorator class:
import functools

class StepStatusManager:
    def __init__(self):
        self.db = DB()

    def logger_steps(self, type):
        def logger_steps(func):
            @functools.wraps(func)
            def wrapper(*args):
                try:
                    func(*args)
                    self.db.setStatus(type)
                except BaseException as e:
                    print(e)
            return wrapper
        return logger_steps

You are trying to call the instance method logger_steps directly on the class StepStatusManager, so Python binds the value "GET_LIST" to the self parameter instead of type. You should create an instance of StepStatusManager and make the decorator call the method on that instance instead. It can be as simple as:
manager = StepStatusManager()

class Test:
    @manager.logger_steps("GET_LIST")
    def get_mails(self):
        print("GET_MAIL")
This is now creating an instance of the class and then calling the method on the instance, instead of trying to call the method directly from the class. You can now use manager to decorate as many methods as you want. Also, this would make all decorated methods use the same StepStatusManager, but if you want you can create different instances and use them to decorate different methods; that would allow you to use different self.db for different methods, if you need it.
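To make the fix concrete, here is a minimal, self-contained sketch of the instance-based approach; the DB class below is a stand-in stub (the real DB class is not shown in the question) that simply records statuses in a list:

```python
class DB:
    """Hypothetical stub standing in for the question's DB class."""
    def __init__(self):
        self.statuses = []

    def setStatus(self, step_type):
        self.statuses.append(step_type)

class StepStatusManager:
    def __init__(self):
        self.db = DB()

    def logger_steps(self, step_type):
        def decorator(func):
            def wrapper(*args):
                try:
                    func(*args)
                    self.db.setStatus(step_type)
                except BaseException as e:
                    print(e)
            return wrapper
        return decorator

# Decorate with the *instance*, not the class
manager = StepStatusManager()

class Test:
    @manager.logger_steps("GET_LIST")
    def get_mails(self):
        print("GET_MAIL")

Test().get_mails()
print(manager.db.statuses)  # ['GET_LIST']
```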
Another approach could be having the db variable in the class, and making logger_steps a class method instead:
import functools

class StepStatusManager:
    db = DB()

    @classmethod
    def logger_steps(cls, type):
        def logger_steps(func):
            @functools.wraps(func)
            def wrapper(*args):
                try:
                    func(*args)
                    cls.db.setStatus(type)
                except BaseException as e:
                    print(e)
            return wrapper
        return logger_steps

class Test:
    @StepStatusManager.logger_steps("GET_LIST")
    def get_mails(self):
        print("GET_MAIL")
Note however that this is less flexible, in that it will not allow you to have methods decorated with different managers, should you ever need that. Also, this is mostly equivalent to having, instead of a class, a StepStatusManager module, where db is a module variable and logger_steps is a module function; that would probably be clearer if this is all the functionality you need:
# StepStatusManager.py
# ...
import functools

db = DB()

def logger_steps(type):
    def logger_steps(func):
        @functools.wraps(func)
        def wrapper(*args):
            try:
                func(*args)
                db.setStatus(type)
            except BaseException as e:
                print(e)
        return wrapper
    return logger_steps
# test.py
import StepStatusManager

class Test:
    @StepStatusManager.logger_steps("GET_LIST")
    def get_mails(self):
        print("GET_MAIL")
Again, this is maybe more straightforward but less flexible than your first, class-based solution.
EDIT:
Just for completeness and comparison, here is yet another version, similar to the one with @classmethod, but using @staticmethod instead (to understand the subtle difference between these two decorators, check one of the many SO questions about it, e.g. What is the difference between @staticmethod and @classmethod? or Meaning of @classmethod and @staticmethod for beginner?):
import functools

class StepStatusManager:
    db = DB()

    @staticmethod
    def logger_steps(type):
        def logger_steps(func):
            @functools.wraps(func)
            def wrapper(*args):
                try:
                    func(*args)
                    StepStatusManager.db.setStatus(type)
                except BaseException as e:
                    print(e)
            return wrapper
        return logger_steps

class Test:
    @StepStatusManager.logger_steps("GET_LIST")
    def get_mails(self):
        print("GET_MAIL")
As it frequently happens with @classmethod and @staticmethod, the difference is quite minimal. Their behavior might differ if you are using inheritance, a metaclass, a decorator, or something like that, but otherwise they are pretty much the same.
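For illustration, here is a small sketch (hypothetical Base/Child classes, not from the question) of where the two decorators actually diverge under inheritance: cls inside a classmethod resolves to the subclass, while the staticmethod version is hard-wired to whatever class name it spells out:

```python
class Base:
    label = "base"

    @classmethod
    def via_classmethod(cls):
        # cls is whichever class the call went through
        return cls.label

    @staticmethod
    def via_staticmethod():
        # always reads the attribute off Base, regardless of the caller
        return Base.label

class Child(Base):
    label = "child"

print(Child.via_classmethod())   # child
print(Child.via_staticmethod())  # base
```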

Related

Adding a decorator to an abstract method

I'm trying to add a decorator on an abstractmethod such that when the method is called in the subclasses, the decorator function is also called. This is for a framework, so I'm trying to limit the amount of extra code later users have to code. The decorator is pretty specific to the framework and isn't meant to be used by the general users of the framework - hope that makes sense.
I know there are a few other ways of doing this, but most of them involve the user copying some boilerplate code when they create their own subclasses. The decorator helps prevent having to copy the boilerplate.
from abc import abstractmethod, ABC

def prepost(fn):
    # The new function the decorator returns
    def wrapper(*args, **kwargs):
        print("In wrap 1")
        fn(*args, **kwargs)
        print("In wrap 2")
        return
    wrapper.__isabstractmethod__ = True
    return wrapper

class Base(ABC):
    pass

class Foo(Base):
    @prepost
    @abstractmethod
    def dosomething(self):
        raise NotImplementedError

class Bar(Foo):
    def dosomething(self):
        print("I'm doing something")

Test = Bar()
Test.dosomething()
When I try this I just get:
I'm doing something
rather than the extra output from the decorator.

Call another method in a class when the given method does not exist?

Say I have a class which contains several functions.
class Person:
    def __init__(self): pass
    def say(self, speech): pass
    def walk(self, destination): pass
    def jump(self): pass
When the user instantiates a Person, I'd like them to be able to call any method of the class. If the requested method does not exist (e.g. Person.dance()), a default function should be called instead.
I imagine that this could be done via a theoretical magic method -
class Person:
    def __init__(self): pass
    def say(self, speech): pass
    def walk(self, destination): pass
    def jump(self): pass
    def sleep(self): print("Zzz")

    def __method__(self, func):
        if func.__name__ not in ['say', 'walk', 'jump']:
            return self.sleep
        else:
            return func

billy = Person()
billy.dance()
>> "Zzz"
However, I know of no such magic method.
Is there a way to make non-existent methods within a class redirect to another class?
It's important that the end-user doesn't have to do anything - from their perspective, it should just work.
The standard way to catch an undefined attribute is to use __getattr__:
# But see the end of the answer for an afterthought
def __getattr__(self, attr):
    return self.sleep
Python does not differentiate between "regular" attributes and methods; a method call starts with an ordinary attribute lookup, whose result just happens to be callable. That is,
billy.run()
is the same as
f = billy.run
f()
This means that __getattr__ will be invoked for any undefined attribute; there is no way to tell at lookup time whether the result is going to be called or not.
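A tiny sketch (hypothetical Person class) makes this visible: the same __getattr__ fires whether or not the looked-up attribute ends up being called:

```python
class Person:
    def sleep(self):
        return "Zzz"

    def __getattr__(self, attr):
        # invoked only for attributes not found the normal way
        return self.sleep

billy = Person()
print(billy.dance())             # Zzz  -- looked up via __getattr__, then called
print(callable(billy.anything))  # True -- same lookup, but never called
```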
However, if all you want is to define "aliases" for a common method, you can do that with a loop after the class statement.
class Person:
    def __init__(self): pass
    def say(self, speech): pass
    def walk(self, destination): pass
    def jump(self): pass
    def do_all(self): pass

for alias in ["something", "something_else", "other"]:
    setattr(Person, alias, Person.do_all)
You can also make hard-coded assignments in the class statement, but that would be unwieldy if there are, as you mention, hundreds of such cases:
class Person:
    def do_all(self): pass

    something = do_all
    something_else = do_all
(I did not experiment with using exec to automate such assignments in a loop; it might be possible, though not recommended.)
You can also embed the list of aliases in the definition of __getattr__, come to think of it:
def __getattr__(self, attr):
    if attr in ["something", "something_else", "other"]:
        return self.sleep
    else:
        raise AttributeError(f"type object 'Person' has no attribute '{attr}'")
Your users might find the API behavior confusing. However, if you're sure you need this pattern, you can try something like
# getattr will look the attribute up by its string name;
# None is returned if no attribute is found
my_func = getattr(self, 'my_func', None)

# callable ensures `my_func` is actually a function and not a generic attribute
# Add your if-else logic here
if callable(my_func):
    my_func(args)
else:
    ...
You could nest your "default" function inside __getattr__ in order to gain access to the called non-existent method's name and arguments.
class Test:
    def __getattr__(self, attr):
        def default(*args, **kwargs):
            return attr, args, kwargs
        return default

test = Test()
print(test.test('test'))
# ('test', ('test',), {})

How to use decorators on overridable class methods

I have a custom class with multiple methods that all return a code. I would like standard logic that checks the returned code against a list of acceptable codes for that method and raises an error if it was not expected.
I thought a good way to achieve this was with a decorator:
from functools import wraps

def expected_codes(codes):
    def decorator(f):
        @wraps(f)
        def wrapper(*args, **kwargs):
            code = f(*args, **kwargs)
            if code not in codes:
                raise Exception(f"{code} not allowed!")
            else:
                return code
        return wrapper
    return decorator
then I have a class like so:
class MyClass:
    @expected_codes(["200"])
    def return_200_code(self):
        return "200"

    @expected_codes(["300"])
    def return_300_code(self):
        return "301"  # Exception: 301 not allowed!
This works fine, however if I override the base class:
class MyNewClass(MyClass):
    @expected_codes(["300", "301"])
    def return_300_code(self):
        return super().return_300_code()  # Exception: 301 not allowed!
I would have expected the above overridden method to return correctly instead of raising an Exception, because of the overriding decorator.
From what I've gathered through reading, my desired approach won't work because the decorator is evaluated at class definition time; however, I'm surprised there's not a way to achieve what I wanted. This is all in the context of a Django application, and I thought Django's method_decorator might have taken care of this for me, but I think I have a fundamental misunderstanding of how that works.
TL;DR
Use the __wrapped__ attribute to ignore the parent's decorator:
class MyNewClass(MyClass):
    @expected_codes(["300", "301"])
    def return_300_code(self):
        return super().return_300_code.__wrapped__(self)  # No exception raised
Explanation
The @decorator syntax is equivalent to:
def f():
    pass

f = decorator(f)
Therefore you can stack up decorators:
def decorator(f):
    @wraps(f)
    def wrapper(*args, **kwargs):
        print(f"Calling {f.__name__}")
        f(*args, **kwargs)
    return wrapper

@decorator
def f():
    print("Hi!")

@decorator
def g():
    f()

g()
# Calling g
# Calling f
# Hi!
But if you want to avoid stacking up, the __wrapped__ attribute is your friend:
@decorator
def g():
    f.__wrapped__()

g()
# Calling g
# Hi!
In short, if you call one of the decorated parent's method in a decorated method of the child class, decorators will stack up, not override one another.
So when you call super().return_300_code() you are calling the decorated method of the parent class which doesn't accept 301 as a valid code and will raise its own exception.
If you want to reuse the original parent's method, the one that simply returns 301 without checking, you can use the __wrapped__ attribute which gives access to the original function (before it was decorated):
class MyNewClass(MyClass):
    @expected_codes(["300", "301"])
    def return_300_code(self):
        return super().return_300_code.__wrapped__(self)  # No exception raised
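Putting the pieces together, here is a self-contained sketch of the whole approach, reusing the question's class names with the expected_codes decorator from above:

```python
from functools import wraps

def expected_codes(codes):
    def decorator(f):
        @wraps(f)  # wraps() also sets wrapper.__wrapped__ = f
        def wrapper(*args, **kwargs):
            code = f(*args, **kwargs)
            if code not in codes:
                raise Exception(f"{code} not allowed!")
            return code
        return wrapper
    return decorator

class MyClass:
    @expected_codes(["300"])
    def return_300_code(self):
        return "301"

class MyNewClass(MyClass):
    @expected_codes(["300", "301"])
    def return_300_code(self):
        # bypass the parent's decorator; only the child's list is checked
        return super().return_300_code.__wrapped__(self)

print(MyNewClass().return_300_code())  # 301
```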

What's the proper way of defining or documenting calls handled by __getattr__?

I have a class whose job is to wrap another class (code I don't control), intercept all calls to the wrapped class, perform some logic, and pass the call along to the underlying class. Here's an example:
class GithubRepository(object):
    def get_commit(self, sha):
        return 'Commit {}'.format(sha)

    def get_contributors(self):
        return ['bobbytables']

class LoggingGithubRepositoryWrapper(object):
    def __init__(self, github_repository):
        self._github_repository = github_repository

    def __getattr__(self, name):
        base_func = getattr(self._github_repository, name)
        def log_wrap(*args, **kwargs):
            print("Calling {}".format(name))
            return base_func(*args, **kwargs)
        return log_wrap

if __name__ == '__main__':
    git_client = LoggingGithubRepositoryWrapper(GithubRepository())
    print(git_client.get_commit('abcdef1245'))
    print(git_client.get_contributors())
As you can see, the way that I do this is by implementing __getattr__ on the wrapping class and delegating to the underlying class. The downside to this approach is that users of LoggingGithubRepositoryWrapper don't know which attributes/methods the underlying GithubRepository actually has.
This leads me to my question: is there a way to define or document the calls handled by __getattr__? Ideally, I'd like to be able to autocomplete on git_client. and be provided a list of supported methods. Thanks for your help in advance!
You can do this a few different ways, but they won't involve the use of __getattr__.
What you really need to do is dynamically create your class, or at least dynamically create the wrapped functions on your class. There are a few ways to do this in python.
You could build the class definition using type() or metaclasses, or build it on class instantiation using the __new__ method.
Every time you call LoggingGithubRepositoryWrapper(), the __new__ method will be called. Here, it looks at all the attributes on the github_repository argument and finds all the non-private methods. It then creates a function on the instantiated LoggingGithubRepositoryWrapper class instance that wraps the repo call in a logging statement.
At the end, it passes back the modified class instance. Then __init__ is called.
from types import MethodType

class LoggingGithubRepositoryWrapper(object):
    def __new__(cls, github_repository):
        self = super(LoggingGithubRepositoryWrapper, cls).__new__(cls)
        for name in dir(github_repository):
            if name.startswith('__'):
                continue
            func = getattr(github_repository, name)
            if isinstance(func, MethodType):
                setattr(self, name, cls.log_wrap(func))
        return self

    @staticmethod
    def log_wrap(func):
        def wrap(*args, **kwargs):
            print('Calling {0}'.format(func.__name__))
            return func(*args, **kwargs)
        return wrap

    def __init__(self, github_repository):
        ...  # this is all the same
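As a quick sanity check, here is the wrapper run against a trimmed-down GithubRepository from the question; unlike the __getattr__ version, the wrapped methods become real instance attributes, so they show up in dir() and are discoverable by tooling:

```python
from types import MethodType

class GithubRepository(object):
    def get_commit(self, sha):
        return 'Commit {}'.format(sha)

class LoggingGithubRepositoryWrapper(object):
    def __new__(cls, github_repository):
        self = super(LoggingGithubRepositoryWrapper, cls).__new__(cls)
        # copy every non-private bound method onto the instance, wrapped
        for name in dir(github_repository):
            if name.startswith('__'):
                continue
            func = getattr(github_repository, name)
            if isinstance(func, MethodType):
                setattr(self, name, cls.log_wrap(func))
        return self

    @staticmethod
    def log_wrap(func):
        def wrap(*args, **kwargs):
            print('Calling {0}'.format(func.__name__))
            return func(*args, **kwargs)
        return wrap

    def __init__(self, github_repository):
        self._github_repository = github_repository

git_client = LoggingGithubRepositoryWrapper(GithubRepository())
print('get_commit' in dir(git_client))      # True: now a real attribute
print(git_client.get_commit('abcdef1245'))  # logs "Calling get_commit" first
```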

Python - decorator - trying to access the parent class of a method

This doesn't work:
def register_method(name=None):
    def decorator(method):
        # The next line assumes the decorated method is bound (which of course it isn't at this point)
        cls = method.im_class
        cls.my_attr = 'FOO BAR'
        def wrapper(*args, **kwargs):
            method(*args, **kwargs)
        return wrapper
    return decorator
Decorators are like the movie Inception; the more levels in you go, the more confusing they are. I'm trying to access the class that defines a method (at definition time) so that I can set an attribute (or alter an attribute) of the class.
Version 2 also doesn't work:
def register_method(name=None):
    def decorator(method):
        # The next line assumes the decorated method is bound (of course it isn't bound at this point).
        cls = method.__class__  # I don't really understand this.
        cls.my_attr = 'FOO BAR'
        def wrapper(*args, **kwargs):
            method(*args, **kwargs)
        return wrapper
    return decorator
The point of putting my broken code above when I already know why it's broken is that it conveys what I'm trying to do.
I don't think you can do what you want to do with a decorator (quick edit: with a decorator of the method, anyway). The decorator gets called when the method gets constructed, which is before the class is constructed. The reason your code isn't working is because the class doesn't exist when the decorator is called.
jldupont's comment is the way to go: if you want to set an attribute of the class, you should either decorate the class or use a metaclass.
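For comparison, here is a minimal sketch of the class-decorator route (register_class is a hypothetical name): since a class decorator runs after the class object exists, it can set the attribute directly, with none of the timing problem the method decorator has:

```python
def register_class(cls):
    # the class object already exists by the time this runs
    cls.my_attr = 'FOO BAR'
    return cls

@register_class
class Foo:
    def bar(self):
        pass

print(Foo.my_attr)  # FOO BAR
```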
EDIT: okay, having seen your comment, I can think of a two-part solution that might work for you. Use a decorator of the method to set an attribute of the method, and then use a metaclass to search for methods with that attribute and set the appropriate attribute of the class:
def TaggingDecorator(method):
    "Decorate the method with an attribute to let the metaclass know it's there."
    method.my_attr = 'FOO BAR'
    return method  # No need for a wrapper, we haven't changed
                   # what method actually does; your mileage may vary

class TaggingMetaclass(type):
    "Metaclass to check for tags from TaggingDecorator and add them to the class."
    def __new__(cls, name, bases, dct):
        # Check for tagged members
        has_tag = False
        for member in dct.itervalues():
            if hasattr(member, 'my_attr'):
                has_tag = True
                break
        if has_tag:
            # Set the class attribute
            dct['my_attr'] = 'FOO BAR'
        # Now let 'type' actually allocate the class object and go on with life
        return type.__new__(cls, name, bases, dct)
That's it. Use as follows:
class Foo(object):
    __metaclass__ = TaggingMetaclass
    pass

class Baz(Foo):
    "It's enough for a base class to have the right metaclass"
    @TaggingDecorator
    def Bar(self):
        pass

>> Baz.my_attr
'FOO BAR'
Honestly, though? Use the supported_methods = [...] approach. Metaclasses are cool, but people who have to maintain your code after you will probably hate you.
Rather than use a metaclass, in Python 2.6+ you should use a class decorator. You can wrap the function and class decorators up as methods of a class, like this real-world example.
I use this example with djcelery; the important aspects for this problem are the "task" method and the line "args, kw = self.marked[klass.__dict__[attr]]", which implicitly checks for "klass.__dict__[attr] in self.marked". If you want to use @methodtasks.task instead of @methodtasks.task() as a decorator, you could remove the nested def and use a set instead of a dict for self.marked. The use of self.marked, instead of setting a marking attribute on the function as the other answer did, allows this to work for classmethods and staticmethods which, because they use slots, won't allow setting arbitrary attributes. The downside of doing it this way is that the function decorator MUST go above other decorators, and the class decorator MUST go below, so that the functions are not modified / re-wrapped between one and the other.
class DummyClass(object):
    """Just a holder for attributes."""
    pass

class MethodTasksHolder(object):
    """Register tasks with class AND method decorators, then use as a dispatcher, like so:

    methodtasks = MethodTasksHolder()

    @methodtasks.serve_tasks
    class C:
        @methodtasks.task()
        # other decorators come below
        def some_task(self, *args):
            pass

        @methodtasks.task()
        @classmethod
        def classmethod_task(self, *args):
            pass

        def not_a_task(self):
            pass

    # ..later
    methodtasks.C.some_task.delay(c_instance, *args)  # always treat as unbound
    # analogous to c_instance.some_task(*args) (or C.some_task(c_instance, *args))
    # ...
    methodtasks.C.classmethod_task.delay(C, *args)  # treat as unbound classmethod!
    # analogous to C.classmethod_task(*args)
    """
    def __init__(self):
        self.marked = {}

    def task(self, *args, **kw):
        def mark(fun):
            self.marked[fun] = (args, kw)
            return fun
        return mark

    def serve_tasks(self, klass):
        setattr(self, klass.__name__, DummyClass())
        for attr in klass.__dict__:
            try:
                args, kw = self.marked[klass.__dict__[attr]]
                # task() here is djcelery's task decorator, imported elsewhere
                setattr(getattr(self, klass.__name__), attr,
                        task(*args, **kw)(getattr(klass, attr)))
            except KeyError:
                pass
        # reset for next class
        self.marked = {}
        return klass
