For some given class in Python like:
class Foo:
    def __init__(self, ...):
        ...
        pass

    def not_raising_1(self, ...):
        ...
        pass

    def not_raising_2(self, ...):
        ...
        pass
is it possible to enforce that the user has to call not_raising_1() or not_raising_2() after creating an object of type Foo? So I'm thinking of a behavior like this:
foo = Foo(...)                     # will raise an Exception saying you need to call not_raising_1 or not_raising_2
foo = Foo(...).not_raising_1(...)  # will NOT raise an Exception
foo = Foo(...).not_raising_2(...)  # will NOT raise an Exception
I know that a pragmatic solution would obviously be to put whatever should happen in not_raising_1() or not_raising_2() behind some parameter in the constructor of Foo. But I'm not asking for a pragmatic solution here; I'm just curious whether someone can think of a creative solution to get the described behavior.
First, for the record: if the two methods must be called before the object is ready to be used, that means that calling them is part of the initialization of the object, and so they should be called by __init__ itself:
class Foo:
    def __init__(self, ...):
        ...
        self.not_raising_1()
        self.not_raising_2()

    def not_raising_1(self, ...):
        ...

    def not_raising_2(self, ...):
        ...
But, moving on to the question as asked...
The problem is not well defined.
Unless you call the methods inside __init__ itself, it is trivially true that neither method has been called the instant __init__ exits.
Further, once __init__ exits, the class Foo has no visibility into what happens outside its own definition. You need some sort of explicit state that maintains what happens after Foo.__init__ exits. Foo.not_raising_1 could examine that state to determine if anything else happened before it was called.
But that raises another problem: who will update that state? Every single bit of code would have to cooperate with Foo. Is this illegal?
x = Foo()
y = 3
x.not_raising_1()
Then how are you going to make Python update your state when it executes y = 3? The hooks just aren't there.
And finally, who is going to raise the exception if x.not_raising_1 is never called?
Refine the problem.
Rather than ask if the functions are never called, you can ensure they are called inside a with statement using an appropriately defined context manager. This context manager can ensure that not_raising_1 and not_raising_2 are called before the with statement completes, as well as ensure that they are only used inside a with statement. You can't enforce that the object is used as a context manager, but you can ensure that it is only used in a with statement.
class Foo:
    def __init__(self, ...):
        ...
        self._in_with_statement = False
        self._r1_called = False
        self._r2_called = False

    def not_raising_1(self, ...):
        self._r1_called = True
        if not self._in_with_statement:
            raise RuntimeError("Foo instance must be used with context manager")

    def not_raising_2(self, ...):
        self._r2_called = True
        if not self._in_with_statement:
            raise RuntimeError("Foo instance must be used with context manager")

    def something_else(self):
        if not self._r1_called or not self._r2_called:
            raise RuntimeError("Failed to call not_raising_1 and/or not_raising_2")
        ...

    def __enter__(self):
        self._in_with_statement = True
        return self  # so "with Foo() as foo:" binds the instance

    def __exit__(self, exc_type, exc_value, traceback):
        self._in_with_statement = False
        if not self._r1_called or not self._r2_called:
            raise RuntimeError("Failed to call not_raising_1 and/or not_raising_2")
        self._r1_called = False
        self._r2_called = False
Here, __init__ sets the condition that neither method has yet been called, nor are we yet executing in a with statement. The instance itself acts as the external state that monitors how the instance is used.
The two required methods require themselves to be executed inside a with statement (by checking if __enter__ has been called).
Every other method can check if the required methods have been called.
The __enter__ method simply marks the object as now being in a with statement, allowing the required methods to be called.
The __exit__ method ensures that the required methods were eventually called, and resets the state of the object as being outside a context manager.
I think this is as strong a guarantee as you can enforce, short of a class that uses the inspect module to examine the script's source code looking for violations.
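A short usage sketch of the class above (assuming a no-argument constructor and that the two methods need no extra arguments):

with Foo() as foo:
    foo.not_raising_1()
    foo.not_raising_2()
    foo.something_else()
# leaving the with block without having called both methods raises RuntimeError in __exit__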
You could use a classmethod like this:
class CustomError(Exception):
    pass

class Foo:
    def __init__(self, flag=True):
        if flag:
            raise CustomError()

    @classmethod
    def not_raising_1(cls):
        return cls(flag=False)
Thus foo = Foo() or foo = Foo(...).not_raising_1(...) would still raise the exception, but foo = Foo.not_raising_1(...) would work.
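A short sketch of the resulting behavior:

foo = Foo()                # raises CustomError
foo = Foo.not_raising_1()  # works: the alternate constructor passes flag=False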
It's not possible. You could use a workaround such as the ones suggested by Ajay Singh Rana or chepner, but I personally would not recommend it, as it is hard to grasp when reading the code.
Your goal should be to increase the readability and usability of the class for yourself and other programmers that use it. Use well-known patterns and concepts whenever possible and applicable.
Reading your question, I understand that the object is not ready to use until one of the other methods is invoked. You could consider Julian Fock's answer and use a class method.
Or use any of the other Creational Design Patterns:
https://refactoring.guru/design-patterns/creational-patterns
Depending on the reason why you want to achieve this behaviour, you could consider implementing the Builder pattern (a minimal sketch follows the links below):
https://refactoring.guru/design-patterns/builder/python/example
https://stackoverflow.com/a/26193004/42659
Builder pattern equivalent in Python
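For illustration, a minimal sketch of the Builder idea (all names here are hypothetical; it assumes Foo is redesigned so that only a fully configured builder creates it):

class FooBuilder:
    def __init__(self):
        self._step = None

    def not_raising_1(self):
        self._step = 1
        return self  # fluent style, so calls can be chained

    def not_raising_2(self):
        self._step = 2
        return self

    def build(self):
        if self._step is None:
            raise RuntimeError("call not_raising_1 or not_raising_2 before build()")
        return Foo(chosen_step=self._step)  # hypothetical constructor argument

foo = FooBuilder().not_raising_1().build()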
A third alternative would be, as you mention yourself, to pass some parameters along when invoking the constructor and, depending on the parameters, call either of the other methods within the constructor.
Which approach is usable and applicable for your situation depends on your needs and on a bigger picture than the example in your question. You should choose the approach that suits your needs best and is most readable.
I did get your question, but as others suggested, it cannot be done directly. But if you want an exception to be raised when neither function has been called, you can create another function that checks whether the previous functions were called, and raise the exception there if they were not.
I would approach this problem by creating a variable that changes its value based on the function calls; if the functions haven't been called, we can determine that as well.
Try:
class SomeError(Exception):
    pass

class Foo:
    def __init__(self, ...):
        self.flag = False  # set the flag to False for each object initially
        ...
        pass

    def not_raising_1(self, ...):
        self.flag = True  # set it to True once the function has been called
        ...
        pass

    def not_raising_2(self, ...):
        self.flag = True  # repeat for this one too
        ...
        pass

    def raise_exception(self):
        if not self.flag:
            raise SomeError
obj1 = Foo()
obj1.not_raising_1()
obj1.raise_exception() # won't do anything
obj2 = Foo()
obj2.raise_exception() # raises SomeError, as neither of the two functions was called
As others have suggested, it's not something that you should consider in actual code. But just as an exercise, I tried doing something similar:
class NoError(Exception):
    pass

class Foo:
    def __init__(self):
        pass

    def not_raising_1(self):
        raise NoError()

    def not_raising_2(self):
        raise NoError()
How to use:
try:
    Foo()
    raise Exception('please use either not_raising_1 or not_raising_2')
except NoError:
    print('No error')
    # actual code
Related
I need a way to defer the initialization of a global variable until the first access to it; the overall idea is expressed in the following Python pseudocode:
FOO = bar
FOO.some_method_on_bar() # Init bar: bar = Bar(); bar.some_method_on_bar()
FOO.some_method_on_bar() # Use cached bar: bar.some_method_on_bar()
So far I'm thinking of somehow telling Python to call a special class method every time its instance is evaluated, but I can't seem to google it up:
class LazyGetter:
    def __init__(self, get_value) -> None:
        self.get_value = get_value

    def __class__instance__access__(self):
        return self.get_value()
FOO = LazyGetter(get_value=lambda: Bar())
FOO # = LazyGetter.__class__instance__access__()
FOO.some_method_on_bar() # = LazyGetter.__class__instance__access__().some_method_on_bar()
So, basically I need to know if there's something equivalent to the madeup __class__instance__access__ method.
If you have to defer initialization, you may be doing too much in the __init__ method. But if you don't control that code, then you seem to be needing something like a proxy class, so you can do:
proxied_bar = Proxy(Bar)
...
proxied_bar.some_bar_method() # this would initialize Bar, if it isn't yet initialized, and then call the some_bar_method
For one way to do so, see: Python proxy class
In that answer an instantiated object is proxied (rather than the class), so you have to make some modifications if you want to defer the __init__ call.
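For illustration, a minimal sketch of such a deferring proxy (Proxy here is a hypothetical name, not a library class):

class Proxy:
    def __init__(self, cls, *args, **kwargs):
        self._cls = cls
        self._args = args
        self._kwargs = kwargs
        self._obj = None

    def __getattr__(self, name):
        # only called for attributes not found on the proxy itself
        if self._obj is None:
            self._obj = self._cls(*self._args, **self._kwargs)  # deferred __init__
        return getattr(self._obj, name)

proxied_bar = Proxy(Bar)
proxied_bar.some_bar_method()  # Bar() is constructed here, on first use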
Since Python 3.7, one can define a module __getattr__ method to programmatically provide "global" attributes. Earlier and more generally, one can define a custom module type to provide such a method.
Assuming that Bar() is needed to initialise the global FOO, the following __getattr__ at module scope can be used.
# can type-annotate to "hint" that FOO will exist at some point
FOO: Bar

# called if module.<item> fails
def __getattr__(item: str):
    if item == "FOO":
        global FOO  # add FOO to global scope
        FOO = Bar()
        return FOO
    raise AttributeError(f"module {__name__!r} has no attribute {item!r}")
This makes FOO available programmatically when accessed as an attribute, i.e. as module.FOO or an import. It is only available in the global scope after the first such access.
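For example, from another module (the module name is hypothetical):

import mymodule

obj = mymodule.FOO  # first access: the module __getattr__ runs, Bar() is created and cached
obj = mymodule.FOO  # now a plain attribute lookup; __getattr__ is not called again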
If the access to FOO is expected to happen inside the module first, it is easier to provide a "getter" function instead.
def get_FOO() -> Bar:
    global _FOO
    try:
        return _FOO
    except NameError:
        _FOO = Bar()
        return _FOO
You might want to consider just having an actual global variable and accessing it with global <variable> but I can't say if that fits the use-case. It should work fine if you're just looking for some caching logic.
You might be able to do this with metaclasses, which are a way of modifying a class when it's instantiated. Whether this is useful depends on what you're trying to achieve.
If you control the class code, you can use __getattribute__ to delay initialization until the first time you access an attribute.
class Bar:
    def __init__(self, *args):
        self._args = args

    def __getattribute__(self, name):
        args = super().__getattribute__('_args')
        if args is not None:
            # Initialize the object here, on the first attribute access.
            self.data = args[0]
            self._args = None  # mark as initialized so later accesses skip this branch
        return super().__getattribute__(name)

    def some_method_on_bar(self):
        return self.data
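A short sketch of the resulting behavior:

bar = Bar('expensive payload')   # cheap: only the arguments are stored
print(bar.some_method_on_bar())  # the first attribute access runs the real initialization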
As far as I understand, __init__() and __enter__() methods of the context manager are called exactly once each, one after another, leaving no chance for any other code to be executed in between. What is the purpose of separating them into two methods, and what should I put into each?
Edit: sorry, wasn't paying attention to the docs.
Edit 2: actually, the reason I got confused is that I was thinking of the @contextmanager decorator. A context manager created using @contextmanager can only be used once (the generator will be exhausted after the first use), so often they are written with the constructor call inside the with statement; and if that were the only way to use a with statement, my question would have made sense. Of course, in reality, context managers are more general than what @contextmanager can create; in particular, context managers can, in general, be reused. I hope I got it right this time?
As far as I understand, __init__() and __enter__() methods of the context manager are called exactly once each, one after another, leaving no chance for any other code to be executed in between.
And your understanding is incorrect. __init__ is called when the object is created, __enter__ when it is entered with a with statement, and these are two quite distinct things. Often the constructor is called directly in the with statement's initialization, with no intervening code, but this doesn't have to be the case.
Consider this example:
class Foo:
    def __init__(self):
        print('__init__ called')

    def __enter__(self):
        print('__enter__ called')
        return self

    def __exit__(self, *a):
        print('__exit__ called')

myobj = Foo()

print('\nabout to enter with 1')
with myobj:
    print('in with 1')

print('\nabout to enter with 2')
with myobj:
    print('in with 2')
myobj can be initialized separately and entered in multiple with blocks:
Output:
__init__ called
about to enter with 1
__enter__ called
in with 1
__exit__ called
about to enter with 2
__enter__ called
in with 2
__exit__ called
Furthermore, if __init__ and __enter__ weren't separated, it wouldn't even be possible to use the following:
import os

def open_etc_file(name):
    return open(os.path.join('/etc', name))

with open_etc_file('passwd'):
    ...
since the initialization (within open) is clearly separate from with entry.
The managers created by contextlib.contextmanager are single-entrant, but they again can be constructed outside the with block. Take the example:
from contextlib import contextmanager

@contextmanager
def tag(name):
    print("<%s>" % name)
    yield
    print("</%s>" % name)
you can use this as:
def heading(level=1):
    return tag('h{}'.format(level))

my_heading = heading()
print('Below be my heading')
with my_heading:
    print('Here be dragons')
output:
Below be my heading
<h1>
Here be dragons
</h1>
However, if you try to reuse my_heading (and, consequently, tag), you will get
RuntimeError: generator didn't yield
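One way to sidestep that limitation (a common idiom, not part of the answer above) is simply to construct a fresh manager for each use:

with heading():
    print('first use')
with heading():
    print('second use works too, since each call builds a new manager')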
Antti Haapala's answer is perfectly fine. I just wanted to elaborate a bit on the usage of arguments (like myClass(*args)), since that was somewhat unclear to me (in retrospect I ask myself why...).
Using arguments to initialize your class in a with statement is no different from using the class the usual way.
The calls will happen in the following order:
__init__ (construction of the instance)
__enter__ (enter context)
__exit__ (leaving context)
Simple Example:
class Foo:
    def __init__(self, i):
        print('__init__ called: {}'.format(i))
        self.i = i

    def __enter__(self):
        print('__enter__ called')
        return self

    def do_something(self):
        print('do something with {}'.format(self.i))

    def __exit__(self, *a):
        print('__exit__ called')

with Foo(42) as bar:
    bar.do_something()
Output:
__init__ called: 42
__enter__ called
do something with 42
__exit__ called
If you want to make sure that your calls can (almost) only be used in a context (e.g. to force the call to __exit__), see the Stack Overflow post here. In the comments you will also find an answer to the question of how to use arguments even then.
I'm writing a decorator, and for various annoying reasons[0] it would be expedient to check if the function it is wrapping is being defined stand-alone or as part of a class (and further which classes that new class is subclassing).
For example:
def my_decorator(f):
    defined_in_class = ??
    print "%r: %s" % (f, defined_in_class)

@my_decorator
def foo(): pass

class Bar(object):
    @my_decorator
    def bar(self): pass
Should print:
<function foo …>: False
<function bar …>: True
Also, please note:
At the point decorators are applied the function will still be a function, not an unbound method, so testing for instance/unbound method (using typeof or inspect) will not work.
Please only offer suggestions that solve this problem — I'm aware that there are many similar ways to accomplish this end (ex, using a class decorator), but I would like them to happen at decoration time, not later.
[0]: specifically, I'm writing a decorator that will make it easy to do parameterized testing with nose. However, nose will not run test generators on subclasses of unittest.TestCase, so I would like my decorator to be able to determine if it's being used inside a subclass of TestCase and fail with an appropriate error. The obvious solution - using isinstance(self, TestCase) before calling the wrapped function doesn't work, because the wrapped function needs to be a generator, which doesn't get executed at all.
Take a look at the output of inspect.stack() when you wrap a method. When your decorator's execution is underway, the current stack frame is the function call to your decorator; the next stack frame down is the @ wrapping action that is being applied to the new method; and the third frame will be the class definition itself, which merits a separate stack frame because the class definition is its own namespace (that is wrapped up to create a class when it is done executing).
I suggest, therefore:
import inspect

frames = inspect.stack()
defined_in_class = (len(frames) > 2 and
                    frames[2][4][0].strip().startswith('class '))
If all of those crazy indexes look unmaintainable, then you can be more explicit by taking the frame apart piece by piece, like this:
import inspect

frames = inspect.stack()
defined_in_class = False
if len(frames) > 2:
    maybe_class_frame = frames[2]
    statement_list = maybe_class_frame[4]
    first_statement = statement_list[0]
    if first_statement.strip().startswith('class '):
        defined_in_class = True
Note that I do not see any way to ask Python about the class name or inheritance hierarchy at the moment your wrapper runs; that point is "too early" in the processing steps, since the class creation is not yet finished. Either parse the line that begins with class yourself and then look in that frame's globals to find the superclass, or else poke around the frames[1] code object to see what you can learn — it appears that the class name winds up being frames[1][0].f_code.co_name in the above code, but I cannot find any way to learn what superclasses will be attached when the class creation finishes up.
A little late to the party here, but this has proven to be a reliable means of determining if a decorator is being used on a function defined in a class:
import inspect

frames = inspect.stack()
className = None
for frame in frames[1:]:
    if frame[3] == "<module>":
        # At module level, go no further
        break
    elif '__module__' in frame[0].f_code.co_names:
        className = frame[0].f_code.co_name
        break
The advantage of this method over the accepted answer is that it works with e.g. py2exe.
Some hacky solution that I've got:
import inspect

def my_decorator(f):
    args = inspect.getargspec(f).args
    defined_in_class = bool(args and args[0] == 'self')
    print "%r: %s" % (f, defined_in_class)
But it relies on the presence of the self argument in the function.
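For what it's worth, the same check can be written with the non-deprecated inspect.signature API (a sketch in Python 3 syntax):

import inspect

def my_decorator(f):
    params = list(inspect.signature(f).parameters)
    defined_in_class = bool(params and params[0] == 'self')
    print("%r: %s" % (f, defined_in_class))
    return f

It has the same caveat: it relies on the convention that the first parameter is named self.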
You can use the package wrapt to check for:
- instance/class methods
- classes
- freestanding functions/static methods
See the project page of wrapt: https://pypi.org/project/wrapt/
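A sketch of the documented wrapt idiom for this (note that, unlike the question's requirement, the dispatch happens when the wrapped callable is invoked, not at decoration time):

import inspect
import wrapt

@wrapt.decorator
def my_decorator(wrapped, instance, args, kwargs):
    # wrapt passes the bound instance separately, which makes the cases easy to tell apart
    if instance is None:
        kind = 'class' if inspect.isclass(wrapped) else 'function or staticmethod'
    elif inspect.isclass(instance):
        kind = 'classmethod'
    else:
        kind = 'instance method'
    print('%r called as %s' % (wrapped, kind))
    return wrapped(*args, **kwargs)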
You could check if the decorator itself is being called at the module level or nested within something else.
defined_in_class = inspect.currentframe().f_back.f_code.co_name != "<module>"
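A quick sketch of that check in context (note it also reports True for a function defined inside another function, since only "<module>" level counts as False):

import inspect

def my_decorator(f):
    # f_back is the frame executing the "@my_decorator" line:
    # the module frame at top level, or the class-body frame inside a class
    defined_in_class = inspect.currentframe().f_back.f_code.co_name != "<module>"
    print("%r: %s" % (f, defined_in_class))
    return f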
I think the functions in the inspect module will do what you want, particularly isfunction and ismethod:
>>> import inspect
>>> def foo(): pass
...
>>> inspect.isfunction(foo)
True
>>> inspect.ismethod(foo)
False
>>> class C(object):
...     def foo(self):
...         pass
...
>>> inspect.isfunction(C.foo)
False
>>> inspect.ismethod(C.foo)
True
>>> inspect.isfunction(C().foo)
False
>>> inspect.ismethod(C().foo)
True
You can then follow the Types and Members table to access the function inside the bound or unbound method:
>>> C.foo.im_func
<function foo at 0x1062dfaa0>
>>> inspect.isfunction(C.foo.im_func)
True
>>> inspect.ismethod(C.foo.im_func)
False
How to get all methods of a given class A that are decorated with the @decorator2?
class A():
    def method_a(self):
        pass

    @decorator1
    def method_b(self, b):
        pass

    @decorator2
    def method_c(self, t=5):
        pass
Method 1: Basic registering decorator
I already answered this question here: Calling functions by array index in Python =)
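The gist of that linked answer, as a minimal sketch (names illustrative): the decorator records every function it is applied to, so no later inspection is needed.

registry = []

def decorator2(func):
    registry.append(func)  # remember each decorated function
    return func            # otherwise leave it untouched

class A(object):
    @decorator2
    def method_c(self, t=5):
        pass

print(registry)  # [<function method_c at 0x...>]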
Method 2: Sourcecode parsing
If you do not have control over the class definition, which is one interpretation of what you'd like, this is impossible (without code-reading reflection), since for example the decorator could be a no-op decorator (like in my linked example) that merely returns the function unmodified. (Nevertheless, if you allow yourself to wrap/redefine the decorators, see Method 3: Converting decorators to be "self-aware", and you will find an elegant solution.)
It is a terrible terrible hack, but you could use the inspect module to read the sourcecode itself, and parse it. This will not work in an interactive interpreter, because the inspect module will refuse to give sourcecode in interactive mode. However, below is a proof of concept.
#!/usr/bin/python3

import inspect

def deco(func):
    return func

def deco2():
    def wrapper(func):
        pass
    return wrapper

class Test(object):
    @deco
    def method(self):
        pass

    @deco2()
    def method2(self):
        pass

def methodsWithDecorator(cls, decoratorName):
    sourcelines = inspect.getsourcelines(cls)[0]
    for i, line in enumerate(sourcelines):
        line = line.strip()
        if line.split('(')[0].strip() == '@' + decoratorName:  # leaving a bit out
            nextLine = sourcelines[i + 1]
            name = nextLine.split('def')[1].split('(')[0].strip()
            yield(name)
It works!:
>>> print(list( methodsWithDecorator(Test, 'deco') ))
['method']
Note that one has to pay attention to parsing and the Python syntax, e.g. @deco and @deco(... are valid results, but @deco2 should not be returned if we merely ask for 'deco'. We notice that according to the official Python syntax at http://docs.python.org/reference/compound_stmts.html decorators are as follows:
decorator ::= "@" dotted_name ["(" [argument_list [","]] ")"] NEWLINE
We breathe a sigh of relief at not having to deal with cases like @(deco). But note that this still doesn't really help you if you have really really complicated decorators, such as @getDecorator(...), e.g.
def getDecorator():
    return deco
Thus, this best-that-you-can-do strategy of parsing code cannot detect cases like this. Though if you are using this method, what you're really after is what is written on top of the method in the definition, which in this case is getDecorator.
According to the spec, it is also valid to have @foo1.bar2.baz3(...) as a decorator. You can extend this method to work with that. You might also be able to extend this method to return a <function object ...> rather than the function's name, with lots of effort. This method however is hackish and terrible.
Method 3: Converting decorators to be "self-aware"
If you do not have control over the decorator definition (which is another interpretation of what you'd like), then all these issues go away because you have control over how the decorator is applied. Thus, you can modify the decorator by wrapping it, to create your own decorator, and use that to decorate your functions. Let me say that yet again: you can make a decorator that decorates the decorator you have no control over, "enlightening" it, which in our case makes it do what it was doing before but also append a .decorator metadata property to the callable it returns, allowing you to keep track of "was this function decorated or not? let's check function.decorator!". And then you can iterate over the methods of the class, and just check to see if the decorator has the appropriate .decorator property! =) As demonstrated here:
def makeRegisteringDecorator(foreignDecorator):
    """
    Returns a copy of foreignDecorator, which is identical in every
    way(*), except also appends a .decorator property to the callable it
    spits out.
    """
    def newDecorator(func):
        # Call to newDecorator(method)
        # Exactly like old decorator, but output keeps track of what decorated it
        R = foreignDecorator(func)  # apply foreignDecorator, like call to foreignDecorator(method) would have done
        R.decorator = newDecorator  # keep track of decorator
        # R.original = func         # might as well keep track of everything!
        return R

    newDecorator.__name__ = foreignDecorator.__name__
    newDecorator.__doc__ = foreignDecorator.__doc__
    # (*) We can be somewhat "hygienic", but newDecorator still isn't
    # signature-preserving, i.e. you will not be able to get a runtime list of
    # parameters. For that, you need hackish libraries... but in this case, the
    # only argument is func, so it's not a big issue
    return newDecorator
Demonstration for @decorator:
deco = makeRegisteringDecorator(deco)

class Test2(object):
    @deco
    def method(self):
        pass

    @deco2()
    def method2(self):
        pass

def methodsWithDecorator(cls, decorator):
    """
    Returns all methods in CLS with DECORATOR as the
    outermost decorator.

    DECORATOR must be a "registering decorator"; one
    can make any decorator "registering" via the
    makeRegisteringDecorator function.
    """
    for maybeDecorated in cls.__dict__.values():
        if hasattr(maybeDecorated, 'decorator'):
            if maybeDecorated.decorator == decorator:
                print(maybeDecorated)
                yield maybeDecorated
It works!:
>>> print(list( methodsWithDecorator(Test2, deco) ))
[<function method at 0x7d62f8>]
However, a "registered decorator" must be the outermost decorator, otherwise the .decorator attribute annotation will be lost. For example in a train of
@decoOutermost
@deco
@decoInnermost
def func(): ...
you can only see metadata that decoOutermost exposes, unless we keep references to "more-inner" wrappers.
sidenote: the above method can also build up a .decorator that keeps track of the entire stack of applied decorators and input functions and decorator-factory arguments. =) For example if you consider the commented-out line R.original = func, it is feasible to use a method like this to keep track of all wrapper layers. This is personally what I'd do if I wrote a decorator library, because it allows for deep introspection.
There is also a difference between @foo and @bar(...). While they are both "decorator expressions" as defined in the spec, note that foo is a decorator, while bar(...) returns a dynamically-created decorator, which is then applied. Thus you'd need a separate function makeRegisteringDecoratorFactory, that is somewhat like makeRegisteringDecorator but even MORE META:
def makeRegisteringDecoratorFactory(foreignDecoratorFactory):
    def newDecoratorFactory(*args, **kw):
        oldGeneratedDecorator = foreignDecoratorFactory(*args, **kw)
        def newGeneratedDecorator(func):
            modifiedFunc = oldGeneratedDecorator(func)
            modifiedFunc.decorator = newDecoratorFactory  # keep track of decorator
            return modifiedFunc
        return newGeneratedDecorator
    newDecoratorFactory.__name__ = foreignDecoratorFactory.__name__
    newDecoratorFactory.__doc__ = foreignDecoratorFactory.__doc__
    return newDecoratorFactory
Demonstration for @decorator(...):
def deco2():
    def simpleDeco(func):
        return func
    return simpleDeco

deco2 = makeRegisteringDecoratorFactory(deco2)

print(deco2.__name__)
# RESULT: 'deco2'

@deco2()
def f():
    pass
This generator-factory wrapper also works:
>>> print(f.decorator)
<function deco2 at 0x6a6408>
Bonus: let's even try the following with Method 3:
def getDecorator():  # let's do some dispatching!
    return deco

class Test3(object):
    @getDecorator()
    def method(self):
        pass

    @deco2()
    def method2(self):
        pass
Result:
>>> print(list( methodsWithDecorator(Test3, deco) ))
[<function method at 0x7d62f8>]
As you can see, unlike method2, @deco is correctly recognized even though it was never explicitly written in the class. Unlike method2, this will also work if the method is added at runtime (manually, via a metaclass, etc.) or inherited.
Be aware that you can also decorate a class, so if you "enlighten" a decorator that is used to both decorate methods and classes, and then write a class within the body of the class you want to analyze, then methodsWithDecorator will return decorated classes as well as decorated methods. One could consider this a feature, but you can easily write logic to ignore those by examining the argument to the decorator, i.e. .original, to achieve the desired semantics.
To expand upon @ninjagecko's excellent answer in Method 2: Source code parsing, you can use the ast module introduced in Python 2.6 to perform self-inspection as long as the inspect module has access to the source code.
def findDecorators(target):
    import ast, inspect
    res = {}
    def visit_FunctionDef(node):
        res[node.name] = [ast.dump(e) for e in node.decorator_list]
    V = ast.NodeVisitor()
    V.visit_FunctionDef = visit_FunctionDef
    V.visit(compile(inspect.getsource(target), '?', 'exec', ast.PyCF_ONLY_AST))
    return res
I added a slightly more complicated decorated method:
@x.y.decorator2
def method_d(self, t=5): pass
Results:
>>> findDecorators(A)
{'method_a': [],
 'method_b': ["Name(id='decorator1', ctx=Load())"],
 'method_c': ["Name(id='decorator2', ctx=Load())"],
 'method_d': ["Attribute(value=Attribute(value=Name(id='x', ctx=Load()), attr='y', ctx=Load()), attr='decorator2', ctx=Load())"]}
If you do have control over the decorators, you can use decorator classes rather than functions:
class awesome(object):
    def __init__(self, method):
        self._method = method

    def __call__(self, obj, *args, **kwargs):
        return self._method(obj, *args, **kwargs)

    @classmethod
    def methods(cls, subject):
        def g():
            for name in dir(subject):
                method = getattr(subject, name)
                if isinstance(method, awesome):
                    yield name, method
        return {name: method for name, method in g()}

class Robot(object):
    @awesome
    def think(self):
        return 0

    @awesome
    def walk(self):
        return 0

    def irritate(self, other):
        return 0
and if I call awesome.methods(Robot) it returns
{'think': <mymodule.awesome object at 0x000000000782EAC8>, 'walk': <mymodule.awesome object at 0x000000000782EB00>}
For those of us who just want the absolute simplest possible case - namely, a single-file solution where we have total control over both the class we're working with and the decorator we're trying to track, I've got an answer. ninjagecko linked to a solution for when you have control over the decorator you want to track, but I personally found it to be complicated and really hard to understand, possibly because I've never worked with decorators until now. So, I've created the following example, with the goal of being as straightforward and simple as possible. It's a decorator, a class with several decorated methods, and code to retrieve+run all methods that have a specific decorator applied to them.
# our decorator
def cool(func, *args, **kwargs):
    def decorated_func(*args, **kwargs):
        print("cool pre-function decorator tasks here.")
        return_value = func(*args, **kwargs)
        print("cool post-function decorator tasks here.")
        return return_value
    # add is_cool property to function so that we can check for its existence later
    decorated_func.is_cool = True
    return decorated_func

# our class, in which we will use the decorator
class MyClass:
    def __init__(self, name):
        self.name = name

    # this method isn't decorated with the cool decorator, so it won't show up
    # when we retrieve all the cool methods
    def do_something_boring(self, task):
        print(f"{self.name} does {task}")

    @cool
    # thanks to *args and **kwargs, the decorator properly passes method parameters
    def say_catchphrase(self, *args, catchphrase="I'm so cool you could cook an egg on me.", **kwargs):
        print(f"{self.name} says \"{catchphrase}\"")

    @cool
    # the decorator also properly handles methods with return values
    def explode(self, *args, **kwargs):
        print(f"{self.name} explodes.")
        return 4

    def get_all_cool_methods(self):
        """Get all methods decorated with the "cool" decorator."""
        cool_methods = {name: getattr(self, name)
                        # get all attributes, including methods, properties, and builtins
                        for name in dir(self)
                        # but we only want methods
                        if callable(getattr(self, name))
                        # and we don't need builtins
                        and not name.startswith("__")
                        # and we only want the cool methods
                        and hasattr(getattr(self, name), "is_cool")
                        }
        return cool_methods

if __name__ == "__main__":
    jeff = MyClass(name="Jeff")
    cool_methods = jeff.get_all_cool_methods()

    for method_name, cool_method in cool_methods.items():
        print(f"{method_name}: {cool_method} ...")
        # you can call the decorated methods you retrieved, just like normal,
        # but you don't need to reference the actual instance to do so
        return_value = cool_method()
        print(f"return value = {return_value}\n")
Running the above example gives us the following output:
explode: <bound method cool.<locals>.decorated_func of <__main__.MyClass object at 0x00000220B3ACD430>> ...
cool pre-function decorator tasks here.
Jeff explodes.
cool post-function decorator tasks here.
return value = 4
say_catchphrase: <bound method cool.<locals>.decorated_func of <__main__.MyClass object at 0x00000220B3ACD430>> ...
cool pre-function decorator tasks here.
Jeff says "I'm so cool you could cook an egg on me."
cool post-function decorator tasks here.
return value = None
Note that the decorated methods in this example have different types of return values and different signatures, so the practical value of being able to retrieve and run them all is a bit dubious. However, in cases where there are many similar methods, all with the same signature and/or type of return value (like if you're writing a connector to retrieve unnormalized data from one database, normalize it, and insert it into a second, normalized database, and you have a bunch of similar methods, e.g. 15 read_and_normalize_table_X methods), being able to retrieve (and run) them all on the fly could be more useful.
Maybe, if the decorators are not too complex (but I don't know if there is a less hacky way).
def decorator1(f):
    def new_f():
        print "Entering decorator1", f.__name__
        f()
    new_f.__name__ = f.__name__
    return new_f

def decorator2(f):
    def new_f():
        print "Entering decorator2", f.__name__
        f()
    new_f.__name__ = f.__name__
    return new_f

class A():
    def method_a(self):
        pass

    @decorator1
    def method_b(self, b):
        pass

    @decorator2
    def method_c(self, t=5):
        pass

print A.method_a.im_func.func_code.co_firstlineno
print A.method_b.im_func.func_code.co_firstlineno
print A.method_c.im_func.func_code.co_firstlineno
I don't want to add much, just a simple variation of ninjagecko's Method 2. It works wonders.
Same code, but using list comprehension instead of a generator, which is what I needed.
import inspect

def methodsWithDecorator(cls, decoratorName):
    sourcelines = inspect.getsourcelines(cls)[0]
    return [sourcelines[i + 1].split('def')[1].split('(')[0].strip()
            for i, line in enumerate(sourcelines)
            if line.split('(')[0].strip() == '@' + decoratorName]
A simple way to solve this problem is to put code in the decorator that adds each function/method, that is passed in, to a data set (for example a list).
e.g.
functions = []  # module-level registry

def deco(foo):
    functions.append(foo)
    return foo
Now every function decorated with @deco will be added to functions.
I'd like to do something like this:
class SillyWalk(object):
    @staticmethod
    def is_silly_enough(walk):
        return (False, "It's never silly enough")

    def walk(self, appraisal_method=is_silly_enough):
        self.do_stuff()
        (was_good_enough, reason) = appraisal_method(self)
        if not was_good_enough:
            self.execute_self_modifying_code(reason)
        return appraisal_method

    def do_stuff(self):
        pass

    def execute_self_modifying_code(self, problem):
        from __future__ import deepjuju
        deepjuju.kiss_booboo_better(self, problem)
with the idea being that someone can do
>>> silly_walk = SillyWalk()
>>> appraise = silly_walk.walk()
>>> is_good_walk = appraise(silly_walk)
and also get some magical machine learning happening; this last bit is not of particular interest to me, it was just the first thing that occurred to me as a way to exemplify the use of the static method in both an in-function context and from the caller's perspective.
Anyway, this doesn't work, because is_silly_enough is not actually a function: it is an object whose __get__ method will return the original is_silly_enough function. This means that it only works in the "normal" way when it's referenced as an object attribute. The object in question is created by the staticmethod() function that the decorator puts in between SillyWalk's is_silly_enough attribute and the function that's originally defined with that name.
This means that in order to use the default value of appraisal_method from within either SillyWalk.walk or its caller, we have to either
call appraisal_method.__get__(instance, owner)(...) instead of just calling appraisal_method(...)
or assign it as the attribute of some object, then reference that object property as a method that we call as we would call appraisal_method.
Given that neither of these solutions seem particularly Pythonic™, I'm wondering if there is perhaps a better way to get this sort of functionality. I essentially want a way to specify that a method should, by default, use a particular class or static method defined within the scope of the same class to carry out some portion of its daily routine.
I'd prefer not to use None, because I'd like to allow None to convey the message that that particular function should not be called. I guess I could use some other value, like False or NotImplemented, but it seems a) hackety b) annoying to have to write an extra couple of lines of code, as well as otherwise-redundant documentation, for something that seems like it could be expressed quite succinctly as a default parameter.
What's the best way to do this?
Maybe all you need is to use the function (and not the method) in the first place?
class SillyWalk(object):
    def is_silly_enough(walk):
        return (False, "It's never silly enough")

    def walk(self, appraisal_function=is_silly_enough):
        self.do_stuff()
        (was_good_enough, reason) = appraisal_function(self)
        if not was_good_enough:
            self.execute_self_modifying_code(reason)
        return appraisal_function

    def do_stuff(self):
        pass

    def execute_self_modifying_code(self, problem):
        deepjuju.kiss_booboo_better(self, problem)
Note that the default for appraisal_function will now be a function and not a method, even though is_silly_enough will be bound as a method once the class is created (at the end of the code).
This means that
>>> SillyWalk.is_silly_enough
<unbound method SillyWalk.is_silly_enough>
but
>>> SillyWalk.walk.im_func.func_defaults[0] # the default argument to .walk
<function is_silly_enough at 0x0000000002212048>
And you can call is_silly_enough with a walk argument, or call a walk instance with .is_silly_enough().
If you really wanted is_silly_enough to be a static method, you could always add
is_silly_enough = staticmethod(is_silly_enough)
anywhere after the definition of walk.
I ended up writing an (un)wrapper function, to be used within function definition headers, eg
def walk(self, appraisal_method=unstaticmethod(is_silly_enough)):
This actually seems to work, at least it makes my doctests that break without it pass.
Here it is:
def unstaticmethod(static):
    """Retrieve the original function from a `staticmethod` object.

    This is intended for use in binding class method default values
    to static methods of the same class.

    For example:
    >>> class C(object):
    ...     @staticmethod
    ...     def s(*args, **kwargs):
    ...         return (args, kwargs)
    ...     def m(self, args=[], kwargs={}, f=unstaticmethod(s)):
    ...         return f(*args, **kwargs)
    >>> o = C()
    >>> o.s(1, 2, 3)
    ((1, 2, 3), {})
    >>> o.m((1, 2, 3))
    ((1, 2, 3), {})
    """
    # TODO: Technically we should be passing the actual class of the owner
    # instead of `object`, but I don't know if there's a way to get that info
    # dynamically, since the class is not actually declared when this function
    # is called during class method definition.
    # I need to figure out if passing `object` instead is going to be an issue.
    return static.__get__(None, object)
update:
I wrote doctests for the unstaticmethod function itself; they pass too. I'm still not totally sure that this is an actual smart thing to do, but it does seem to work.
Not sure if I get exactly what you're after, but would it be cleaner to use getattr?
>>> class SillyWalk(object):
...     @staticmethod
...     def ise(walk):
...         return (False, "boo")
...     def walk(self, am="ise"):
...         wge, r = getattr(self, am)(self)
...         print wge, r
>>> sw = SillyWalk()
>>> sw.walk("ise")
False boo