How to get variable names of a function call - Python

I want to write a decorator which evaluates the actual names (not the values) of the variables that are passed to the function call.
Below, you find a skeleton of the code which makes it a bit clearer what I want to do.
import functools

def check_func(func):
    # How to get the variable names of the function call,
    # s.t. a call like func(arg1, arg2, arg3)
    # returns a dictionary {'a': 'arg1', 'b': 'arg2', 'c': 'arg3'}?
    pass

def my_decorator(func):
    @functools.wraps(func)
    def call_func(*args, **kwargs):
        check_func(func)
        return func(*args, **kwargs)
    return call_func

@my_decorator
def my_function(a, b, c):
    pass

arg1 = 'foo'
arg2 = 1
arg3 = [1, 2, 3]

my_function(arg1, arg2, arg3)

You can't really have what you are asking for.
There are many ways of calling a function where you won't even get variable names for individual values. For example, what would the names be when you use literal values in the call:

my_function('foo', 10 - 9, [1] + [2, 3])
or when you use a list with values for argument expansion with *:
args = ['foo', 1, [1, 2, 3]]
my_function(*args)
Or when you use a functools.partial() object to bind some argument values to a callable object:
from functools import partial

func_partial = partial(my_function, arg1, arg2)
func_partial(arg3)
Functions are passed objects (values), not variables. Expressions consisting of just names may have been used to produce the objects, but those objects are independent of the variables.
Python objects can have many different references, so just because the call used arg1, doesn't mean that there won't be other references to the object elsewhere that would be more interesting to your code.
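The many-references point can be demonstrated directly; a minimal sketch (the names data and alias are made up for illustration):

```python
# Two names bound to one object; the object itself carries no name.
data = [1, 2, 3]
alias = data

# Both names refer to the very same object...
assert alias is data
# ...and id() confirms it: one identity, two names.
assert id(alias) == id(data)
```

So even if a call site spells the argument `data`, the object may be known as `alias` (or anything else) everywhere else in the program.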
You could try to analyse the code that called the function (the inspect module can give you access to the call stack), but that presumes the source code is available. The calling code could be in a C extension, or the interpreter may only have access to .pyc bytecode files, not the original source. You would still have to trace back and analyse the call expression (not always straightforward: functions are objects too, and can be stored in containers and retrieved later to be called dynamically), and from there find the variables involved, if there are any at all.
For the trivial case, where only direct positional argument names were used for the call and the whole call was limited to a single line of source code, you could use a combination of inspect.stack() and the ast module to parse the source into something useful enough to analyse:
import inspect, ast

class CallArgumentNameFinder(ast.NodeVisitor):
    def __init__(self, functionname):
        self.name = functionname
        self.params = []
        self.kwargs = {}

    def visit_Call(self, node):
        if not isinstance(node.func, ast.Name):
            return  # not a name(...) call
        if node.func.id != self.name:
            return  # different name being called
        self.params = [n.id for n in node.args if isinstance(n, ast.Name)]
        self.kwargs = {
            kw.arg: kw.value.id for kw in node.keywords
            if isinstance(kw.value, ast.Name)
        }

def check_func(func):
    caller = inspect.stack()[2]  # caller of our caller
    try:
        tree = ast.parse(caller.code_context[0])
    except SyntaxError:
        # not a complete Python statement
        return None
    visitor = CallArgumentNameFinder(func.__name__)
    visitor.visit(tree)
    return inspect.signature(func).bind_partial(
        *visitor.params, **visitor.kwargs)
Again, for emphasis: this only works with the most basic of calls, where the call consists of a single line only, and the called name matches the function name. It can be expanded upon but this takes a lot of work.
For your specific example, this produces <BoundArguments (a='arg1', b='arg2', c='arg3')>, an inspect.BoundArguments instance. Use .arguments to get an OrderedDict mapping the parameter names to the argument names, or pass that to dict() to turn it into a regular dictionary.
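To see the shape of that BoundArguments result without the stack inspection, here is a minimal standalone sketch that calls bind_partial directly with the argument names as strings, the way check_func() does:

```python
import inspect

def my_function(a, b, c):
    pass

# Bind the *names* (as strings) to the parameters, as check_func()
# does with the names it found in the AST.
bound = inspect.signature(my_function).bind_partial('arg1', 'arg2', 'arg3')

# .arguments maps each parameter name to the bound value...
assert bound.arguments['a'] == 'arg1'
# ...and dict() turns the mapping into a plain dictionary.
assert dict(bound.arguments) == {'a': 'arg1', 'b': 'arg2', 'c': 'arg3'}
```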
You'll have to think about your specific problem differently. Decorators are not meant to act upon the calling code; they act upon the decorated object. There are many other powerful features in the language that can help you deal with the calling context, but decorators are not it.

Related

Passing same argument names for a function in Python

In Python, I have a class with functions printing certain outputs, each of which has default parameters that can be changed, like

def func1(self, a='a', b='b'):
    return something

def func2(self, c='c', d='d'):
    return something
Lots of other functions of a similar kind too.
I created another function that can take those functions with parameters and do something with them, i.e.
def foo(self, default, *args, **kwargs):
    df = self.df
    if default == True:
        args = self.list1
    else:
        for fname in args:
            getattr(self, fname)(*kwargs)
    df['appending_array'] = df.apply(lambda...
In the effect I'd like to be able to call something like
object.foo(False, func2, func11, d='z')
Unfortunately in the loop, when I change d to 'z', it changes the first argument of each function that is iterated, instead of the actual parameter d from the function I want.
Is there a possibility to either rearrange it so I can pass the original parameters of each passed function, or configure **kwarg so it can refer to the original parameters' names of each function?
Many thanks for any help/advice
So, if I understand correctly, you want to write it such that if, for example, d is present in the foo call, it is solely passed to all functions in args that have d as an input argument (in this case, func2)?
If so, then you want to determine all input arguments that each function can take.
Luckily, Python has a function that allows you to do just that: inspect.signature.
So, you could rewrite it like this (I am assuming you wanted to pass the names of the functions as args, not the actual functions themselves):
from inspect import signature

def foo(self, default, *args, **kwargs):
    df = self.df
    if default:
        args = self.list1
    else:
        for fname in args:
            # Obtain the function
            func = getattr(self, fname)
            # Obtain the signature of this function
            f_sig = signature(func)
            # Create a dict with values for this function, taking values
            # from kwargs where given; make sure to skip 'self'
            f_kwargs = {argname: kwargs.get(argname, argpar.default)
                        for argname, argpar in f_sig.parameters.items()
                        if argname != 'self'}
            # Call this function
            f_out = func(**f_kwargs)
            # Perform some operations on the output of this function
            df['appending_array'] = df.apply(lambda...
Keep in mind that this only works if every function that is ever passed as args solely takes optional arguments.
If one argument is mandatory, it must be passed to kwargs.
PS: I am, however, unsure what the purpose of the args = self.list1 branch for if default is, because that part seems incompatible with everything else.
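The signature-filtering idea can also be shown outside the class; a minimal sketch with made-up functions func1 and func2 matching the question's shapes:

```python
from inspect import signature

def func1(a='a', b='b'):
    return a + b

def func2(c='c', d='d'):
    return c + d

def call_with_matching_kwargs(func, **kwargs):
    # Keep only keyword arguments that appear in func's signature,
    # falling back to each parameter's default otherwise.
    params = signature(func).parameters
    f_kwargs = {name: kwargs.get(name, p.default)
                for name, p in params.items()}
    return func(**f_kwargs)

# d='z' only affects func2, which actually has a parameter named d.
assert call_with_matching_kwargs(func1, d='z') == 'ab'
assert call_with_matching_kwargs(func2, d='z') == 'cz'
```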
You need to unpack the kwargs with dictionary unpacking, using two stars:

getattr(self, fname)(**kwargs)

Note that getattr(self, fname) looks the name up on the instance, so it already returns a bound method and self is passed automatically. You would only need to supply self explicitly if you looked the function up on the class instead:

getattr(type(self), fname)(self, **kwargs)

Customize how a Python object is processed as a function argument?

A Python class's __call__ method lets us specify how a class instance should behave as a function. Can we do the "opposite", i.e. specify how a class instance should behave as an argument to an arbitrary other function?
As a simple example, suppose I have a WrappedList class that wraps lists, and when I call a function f on an instance of this class, I want f to be mapped over the wrapped list. For instance:
x = WrappedList([1, 2, 3])
print(x + 1)   # should be WrappedList([2, 3, 4])

d = {1: "a", 2: "b", 3: "c"}
print(d[x])    # should be WrappedList(["a", "b", "c"])
Calling this hypothetical __call__ analogue __arg__, we could imagine something like this:
class WrappedList(object):
    def __init__(self, to_wrap):
        self.wrapped = to_wrap

    def __arg__(self, func):
        return WrappedList(map(func, self.wrapped))
Now, I know that (1) __arg__ doesn't exist in this form, and (2) it's easy to get the behavior in this simple example without any tricks. But is there a way to approximate the behavior I'm looking for in the general case?
You can't do this in general.*
You can do something equivalent for most of the builtin operators (like your + example), and a handful of builtin functions (like abs). They're implemented by calling special methods on the operands, as described in the reference docs.
Of course that means writing a whole bunch of special methods for each of your types—but it wouldn't be too hard to write a base class (or decorator or metaclass, if that doesn't fit your design) that implements all those special methods in one place, by calling the subclass's __arg__ and then doing the default thing:
class ArgyBase:
    def __add__(self, other):
        return self.__arg__() + other

    def __radd__(self, other):
        return other + self.__arg__()

    # ... and so on
And if you want to extend that to a whole suite of functions that you create yourself, you can give them all special-method protocols similar to the builtin ones, and expand your base class to cover them. Or you can just short-circuit that and use the __arg__ protocol directly in those functions. To avoid lots of repetition, I'd use a decorator for that.
import functools

def argify(func):
    def _arg(arg):
        try:
            return arg.__arg__()
        except AttributeError:
            return arg
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        args = [_arg(arg) for arg in args]
        kwargs = {kw: _arg(arg) for kw, arg in kwargs.items()}
        return func(*args, **kwargs)
    return wrapper

@argify
def spam(a, b):
    return a + 2 * b
And if you really want to, you can go around wrapping other people's functions:
sin = argify(math.sin)
… or even monkeypatching their modules:
requests.get = argify(requests.get)
… or monkeypatching a whole module dynamically a la early versions of gevent, but I'm not going to even show that, because at this point we're getting into don't-do-this-for-multiple-reasons territory.
You mentioned in a comment that you'd like to do this to a bunch of someone else's functions without having to specify them in advance, if possible. Does that mean every function that ever gets constructed in any module you import? Well, you can even do that if you're willing to create an import hook, but that seems like an even worse idea. Explaining how to write an import hook and either AST-patch each function creation node or insert wrappers around the bytecode or the like is way too much to get into here, but if your research abilities exceed your common sense, you can figure it out. :)
As a side note, if I were doing this, I wouldn't call the method __arg__, I'd call it either arg or _arg. Besides being reserved for future use by the language, the dunder-method style implies things that aren't true here (special-method lookup instead of a normal call, you can search for it in the docs, etc.).
* There are languages where you can, such as C++, where a combination of implicit casting and typed variables instead of typed values means you can get a method called on your objects just by giving them an odd type with an implicit conversion operator to the expected type.

Decorator which conditionally activates another decorator?

I have some functions which, under normal circumstances, are called with arguments provided by user input. It is, however, valid to call some of these functions with certain series of arguments which are determined at runtime based on some system state.
I would like for the user to be able to optionally instruct my program to call these functions with all valid input and return the results of each call. I think a decorator which would work something like an activation switch for functions which have another decorator which indicates which series of arguments to use would work well.
Additionally, I need to preserve the function signature and metadata. It's vital to the operation of my program.
This is what I've tried, but it doesn't work. It is based upon this example.
>>> from decorator import decorator
>>> def the_list():
...     return ["foo", "bar", "baz"]
...
>>> import itertools
>>> @decorator
... def do_all(func):
...     # this will do nothing (besides wrap in a tuple) unless func is decorated with @gets_arg_from
...     if hasattr(func, 'get_from'):
...         return tuple(func(*args) for args in itertools.product(*(list_fun() for list_fun in func.get_from)))
...     else:
...         return (func(),)
...
>>> def gets_arg_from(*list_funcs):
...     # this will do nothing to func unless decorated with @do_all
...     def gets_arg_from(func, *args, **kwargs):
...         func.get_from = list_funcs
...         return func(*args, **kwargs)
...     return decorator(gets_arg_from)
...
>>> @gets_arg_from(the_list)
... def print_func(word):
...     # this will print its argument unless decorated with @do_all
...     # at that point it will print every element returned by the_list()
...     print word
...
>>> print_func("foo")
foo
>>> all = decorator(do_all, print_func)
>>> all()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: print_func() takes exactly 1 argument (0 given)
>>> print_func.get_from
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
AttributeError: 'function' object has no attribute 'get_from'
What I expected was:
>>> all()
("foo", "bar", "baz")
What I've noticed is wrong:
gets_arg_from doesn't add the get_from attribute to func.
Something about my using the notation @gets_arg_from(the_list) is wrong. It thinks I am trying to pass two arguments (but why would that be a problem anyway?)
As for my motivation, I think of decorators for this because there are literally hundreds of these routines, their implementation details (as well as their functional correctness) is subject to frequent change, and I don't want to use inspect to reason what to do based on their argument names nor do I want to hard-code the do_all functionality for each function for which it makes sense. Class methods might work, but for my purpose, they're semantically contrived. Furthermore, for the sake of others who may have to maintain my code, I think it is easier to ask them to apply a decorator and not to worry about the rest rather than to use certain argument names or place the function in a certain class or whatever. I realize this question may sound strange, so I figured this footnote might help make me look less like a madman.
Isn't this doing what you want?
import functools
from itertools import product

def the_list():
    return ["foo", "bar", "baz"]

def do_all(func):
    if hasattr(func, 'get_from'):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            return tuple(func(*args) for args in
                         product(*(lf() for lf in func.get_from)))
        return wrapper
    return func

def gets_arg_from(*list_funcs):
    def decorator(func):
        func.get_from = list_funcs
        return func
    return decorator

@gets_arg_from(the_list)
def print_func(word):
    return word

print print_func('foo')

all = do_all(print_func)
print all()
Edit: Explanation
These two code segments are identical:
@deco
def func(...):
    some code

is the same as

func = deco(lambda ...: some code)

@something is just syntactic sugar for the function call and anonymous function creation...
I'll explain what happens in the next piece of code step by step:
@gets_arg_from(the_list)
def print_func(word):
    return word
First, Python creates an anonymous function that receives a parameter word and has a body that just returns this word (or does whatever the function body does).
Then the function gets_arg_from gets called, and the_list gets passed to it as an argument.
gets_arg_from creates a decorator function and returns it.
The decorator function returned from gets_arg_from is called (this is the syntactic sugar thing), passing the anonymous function created in step 1 as the argument func.
decorator just assigns the list_funcs tuple to the get_from attribute of the anonymous function and returns the anonymous function.
The return value of the decorator function is assigned to the variable print_func.
Similar effect can be achieved with:
def __anonymous(word):
    return word

__decorator = gets_arg_from(the_list)
print_func = __decorator(__anonymous)
So basically gets_arg_from is not a decorator; it's a function that returns a decorator.
do_all, on the other hand, is a decorator: it receives a function as an argument and returns either the original function (if the function doesn't have the attribute get_from) or a wrapper function which replaces the original function (if it has the get_from attribute).
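A minimal runnable sketch of that distinction (make_deco and f are made-up names for illustration):

```python
def make_deco(tag):
    # Not a decorator itself: a factory that returns one,
    # just like gets_arg_from above.
    def deco(func):
        func.tag = tag
        return func
    return deco

@make_deco('hello')
def f(word):
    return word

# The @ line is sugar for: f = make_deco('hello')(f)
assert f.tag == 'hello'
assert f('x') == 'x'
```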
You can find more examples here.

"self" inside plain function?

I've got a bunch of functions (outside of any class) where I've set attributes on them, like funcname.fields = 'xxx'. I was hoping I could then access these variables from inside the function with self.fields, but of course it tells me:
global name 'self' is not defined
So... what can I do? Is there some magic variable I can access? Like __this__.fields?
A few people have asked "why?". You will probably disagree with my reasoning, but I have a set of functions that all must share the same signature (accept only one argument). For the most part, this one argument is enough to do the required computation. However, in a few limited cases, some additional information is needed. Rather than forcing every function to accept a long list of mostly unused variables, I've decided to just set them on the function so that they can easily be ignored.
Although, it occurs to me now that you could just use **kwargs as the last argument if you don't care about the additional args. Oh well...
Edit: Actually, some of the functions I didn't write, and would rather not modify to accept the extra args. By "passing in" the additional args as attributes, my code can work both with my custom functions that take advantage of the extra args, and with third party code that don't require the extra args.
Thanks for the speedy answers :)
self isn't a keyword in Python, it's just a normal variable name. When creating instance methods, you can name the first parameter whatever you want; self is just a convention.
You should almost always prefer passing arguments to functions over setting properties for input, but if you must, you can do so using the actual functions name to access variables within it:
def a():
    if a.foo:
        # blah
        pass

a.foo = False
a()
see python function attributes - uses and abuses for when this comes in handy. :D
def foo():
    print(foo.fields)

foo.fields = [1, 2, 3]

foo()
# [1, 2, 3]
There is nothing wrong with adding attributes to functions. Many memoizers use this to cache results in the function itself.
For example, notice the use of func.cache:
from decorator import decorator

@decorator
def memoize(func, *args, **kw):
    # Author: Michele Simoniato
    # Source: http://pypi.python.org/pypi/decorator
    if not hasattr(func, 'cache'):
        func.cache = {}
    if kw:  # frozenset is used to ensure hashability
        key = args, frozenset(kw.iteritems())
    else:
        key = args
    cache = func.cache  # attribute added by memoize
    if key in cache:
        return cache[key]
    else:
        cache[key] = result = func(*args, **kw)
        return result
You can't do that "function accessing its own attributes" correctly for all situations - see for details here how can python function access its own attributes? - but here is a quick demonstration:
>>> def f(): return f.x
...
>>> f.x = 7
>>> f()
7
>>> g = f
>>> g()
7
>>> del f
>>> g()
Traceback (most recent call last):
  File "<interactive input>", line 1, in <module>
  File "<interactive input>", line 1, in f
NameError: global name 'f' is not defined
Basically, most methods directly or indirectly rely on accessing the function object through lookup by name in globals; if the original function name is deleted, this stops working. There are other kludgey ways of accomplishing this, like defining a class or a factory, but thanks to your explanation it is clear you don't really need that.
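One of those kludges, for completeness: have the function refer to itself through an enclosing scope (a closure cell) rather than through globals, so the reference survives rebinding or deleting the outer name. A sketch, with made-up names:

```python
def make_self_aware():
    # f's body references f, which is a local of make_self_aware,
    # so the lookup goes through a closure cell, not through globals.
    def f():
        return f.x
    return f

g = make_self_aware()
g.x = 7
assert g() == 7

h = g
del g            # the original name is gone...
assert h() == 7  # ...but the closure reference still works
```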
Just do the mentioned keyword catch-all argument, like so:
def fn1(oneArg):
    # do the due
    pass

def fn2(oneArg, **kw):
    if 'option1' in kw:
        print 'called with option1=', kw['option1']
    # do the rest

fn2(42)
fn2(42, option1='something')
Not sure what you mean in your comment about handling TypeError; that won't arise when using **kw. This approach works very well for some Python builtin functions: check min(), max(), sorted(). Recently sorted(dct, key=dct.get, reverse=True) came in very handy for me in a CodeGolf challenge :)
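The sorted(dct, key=dct.get, reverse=True) trick mentioned above, spelled out with example data:

```python
dct = {'a': 3, 'b': 1, 'c': 2}

# Iterating a dict yields its keys; dct.get maps each key to its value,
# so the keys come back ordered by value, descending.
result = sorted(dct, key=dct.get, reverse=True)

assert result == ['a', 'c', 'b']
```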
Example:
>>> def x(): pass
...
>>> x
<function x at 0x100451050>
>>> x.hello = "World"
>>> x.hello
'World'
You can set attributes on functions, as these are just plain objects, but I have actually never seen something like this in real code.
Plus, self is not a keyword, just another variable name, which happens to be the particular instance of the class. self is passed implicitly, but received explicitly.
If you want globally set parameters for a callable 'thing', you could always create a class and implement the __call__ method.
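A sketch of that suggestion; the Greeter class and its fields attribute are made up for illustration:

```python
class Greeter(object):
    def __init__(self, fields):
        # The "globally set" parameters live on the instance...
        self.fields = fields

    def __call__(self, name):
        # ...and self is available inside the call,
        # unlike inside a plain function.
        return '%s (%s)' % (name, self.fields)

greet = Greeter('xxx')
# The instance is used exactly like a function:
assert greet('hello') == 'hello (xxx)'
```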
There is no special way, within a function's body, to refer to the function object whose code is executing. Simplest is just to use funcname.field (with funcname being the function's name within the namespace it's in, which you indicate is the case -- it would be harder otherwise).
This isn't something you should do. I can't think of any way to do what you're asking except some walking around on the call stack and some weird introspection -- which isn't something that should happen in production code.
That said, I think this actually does what you asked:
import inspect

_code_to_func = dict()

def enable_function_self(f):
    _code_to_func[f.func_code] = f
    return f

def get_function_self():
    f = inspect.currentframe()
    code_obj = f.f_back.f_code
    return _code_to_func[code_obj]

@enable_function_self
def foo():
    me = get_function_self()
    print me

foo()
While I agree with the rest that this is probably not good design, the question did intrigue me. Here's my first solution, which I may update once I get decorators working. As it stands, it relies pretty heavily on being able to read the stack, which may not be possible in all implementations (something about sys._getframe() not necessarily being present...).
import sys, inspect

def cute():
    this = sys.modules[__name__].__dict__.get(inspect.stack()[0][3])
    print "My face is..." + this.face

cute.face = "very cute"
cute()
What do you think? :3
You could use the following (hideously ugly) code:
class Generic_Object(object):
    pass

def foo(a1, a2, self=Generic_Object()):
    self.args = (a1, a2)
    print "len(self.args):", len(self.args)
    return None
... as you can see it would allow you to use "self" as you described. You can't use an "object()" directly because you can't "monkey patch(*)" values into an object() instance. However, normal subclasses of object (such as the Generic_Object() I've shown here) can be "monkey patched"
If you wanted to always call your function with a reference to some object as the first argument that would be possible. You could put the defaulted argument first, followed by a *args and optional **kwargs parameters (through which any other arguments or dictionaries of options could be passed during calls to this function).
This is, as I said hideously ugly. Please don't ever publish any code like this or share it with anyone in the Python community. I'm only showing it here as a sort of strange educational exercise.
An instance method is like a function in Python. However, it exists within the namespace of a class (thus it must be accessed via an instance, e.g. myobject.foo()) and it is called with a reference to "self" (analogous to the "this" pointer in C++) as the first argument. Also, there's a method resolution process which causes the interpreter to search the namespace of the instance, then its class, and then each of the parent classes and so on, up through the inheritance tree.
An unbound function is called with whatever arguments you pass to it. There can't be any sort of automatically pre-pended object/instance reference in the argument list. Thus, writing a function with an initial argument named "self" is meaningless. (It's legal because Python doesn't place any special meaning on the name "self", but meaningless because callers to your function would have to manually supply some sort of object reference to the argument list, and it's not at all clear what that should be. Just some bizarre "Generic_Object" which then floats around in the global variable space?)
I hope that clarifies things a bit. It sounds like you're suffering from some very fundamental misconceptions about how Python and other object-oriented systems work.
("Monkey patching" is a term used to describe the direct manipulation of an objects attributes -- or "instance variables" by code that is not part of the class hierarchy of which the object is an instance).
As another alternative, you can make the functions into bound class methods like so:
class _FooImpl(object):
    a = "Hello "

    @classmethod
    def foo(cls, param):
        return cls.a + param

foo = _FooImpl.foo

# later...
print foo("World")  # yes, Hello World

# and if you have to change an attribute:
foo.im_self.a = "Goodbye "
If you want functions to share an attribute namespace, you just make them part of the same class. If not, give each its own class.
What exactly are you hoping "self" would point to, if the function is defined outside of any class? If your function needs some global information to execute properly, you need to send this information to the function in the form of an argument.
If you want your function to be context aware, you need to declare it within the scope of an object.

Using class/static methods as default parameter values within methods of the same class

I'd like to do something like this:
class SillyWalk(object):
    @staticmethod
    def is_silly_enough(walk):
        return (False, "It's never silly enough")

    def walk(self, appraisal_method=is_silly_enough):
        self.do_stuff()
        (was_good_enough, reason) = appraisal_method(self)
        if not was_good_enough:
            self.execute_self_modifying_code(reason)
        return appraisal_method

    def do_stuff(self):
        pass

    def execute_self_modifying_code(self, problem):
        from __future__ import deepjuju
        deepjuju.kiss_booboo_better(self, problem)
with the idea being that someone can do
>>> silly_walk = SillyWalk()
>>> appraise = silly_walk.walk()
>>> is_good_walk = appraise(silly_walk)
and also get some magical machine learning happening; this last bit is not of particular interest to me, it was just the first thing that occurred to me as a way to exemplify the use of the static method in both an in-function context and from the caller's perspective.
Anyway, this doesn't work, because is_silly_enough is not actually a function: it is an object whose __get__ method will return the original is_silly_enough function. This means that it only works in the "normal" way when it's referenced as an object attribute. The object in question is created by the staticmethod() function that the decorator puts in between SillyWalk's is_silly_enough attribute and the function that's originally defined with that name.
This means that in order to use the default value of appraisal_method from within either SillyWalk.walk or its caller, we have to either
call appraisal_method.__get__(instance, owner)(...) instead of just calling appraisal_method(...)
or assign it as the attribute of some object, then reference that object property as a method that we call as we would call appraisal_method.
Given that neither of these solutions seem particularly Pythonic™, I'm wondering if there is perhaps a better way to get this sort of functionality. I essentially want a way to specify that a method should, by default, use a particular class or static method defined within the scope of the same class to carry out some portion of its daily routine.
I'd prefer not to use None, because I'd like to allow None to convey the message that that particular function should not be called. I guess I could use some other value, like False or NotImplemented, but it seems a) hackety b) annoying to have to write an extra couple of lines of code, as well as otherwise-redundant documentation, for something that seems like it could be expressed quite succinctly as a default parameter.
What's the best way to do this?
Maybe all you need is to use the function (and not the method) in the first place?
class SillyWalk(object):
    def is_silly_enough(walk):
        return (False, "It's never silly enough")

    def walk(self, appraisal_function=is_silly_enough):
        self.do_stuff()
        (was_good_enough, reason) = appraisal_function(self)
        if not was_good_enough:
            self.execute_self_modifying_code(reason)
        return appraisal_function

    def do_stuff(self):
        pass

    def execute_self_modifying_code(self, problem):
        deepjuju.kiss_booboo_better(self, problem)
Note that the default for appraisal_function will now be a function and not a method, even though is_silly_enough will be bound as a class method once the class is created (at the end of the code).
This means that
>>> SillyWalk.is_silly_enough
<unbound method SillyWalk.is_silly_enough>
but
>>> SillyWalk.walk.im_func.func_defaults[0] # the default argument to .walk
<function is_silly_enough at 0x0000000002212048>
And you can call is_silly_enough with a walk argument, or call a walk instance with .is_silly_enough().
If you really wanted is_silly_enough to be a static method, you could always add
is_silly_enough = staticmethod(is_silly_enough)
anywhere after the definition of walk.
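A trimmed-down, runnable version of the approach above (with do_stuff and the self-modification left out for brevity):

```python
class SillyWalk(object):
    def is_silly_enough(walk):
        return (False, "It's never silly enough")

    def walk(self, appraisal_function=is_silly_enough):
        # Inside the class body, is_silly_enough is still a plain
        # function, so it can serve directly as a default value.
        was_good_enough, reason = appraisal_function(self)
        return was_good_enough, reason

silly_walk = SillyWalk()
assert silly_walk.walk() == (False, "It's never silly enough")
```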
I ended up writing an (un)wrapper function, to be used within function definition headers, eg
def walk(self, appraisal_method=unstaticmethod(is_silly_enough)):
This actually seems to work, at least it makes my doctests that break without it pass.
Here it is:
def unstaticmethod(static):
    """Retrieve the original function from a `staticmethod` object.

    This is intended for use in binding class method default values
    to static methods of the same class.

    For example:
        >>> class C(object):
        ...     @staticmethod
        ...     def s(*args, **kwargs):
        ...         return (args, kwargs)
        ...     def m(self, args=[], kwargs={}, f=unstaticmethod(s)):
        ...         return f(*args, **kwargs)
        >>> o = C()
        >>> o.s(1, 2, 3)
        ((1, 2, 3), {})
        >>> o.m((1, 2, 3))
        ((1, 2, 3), {})
    """
    # TODO: Technically we should be passing the actual class of the
    # owner instead of `object`, but I don't know if there's a way to
    # get that info dynamically, since the class is not actually
    # declared when this function is called during class method
    # definition. I need to figure out if passing `object` instead
    # is going to be an issue.
    return static.__get__(None, object)
Update: I wrote doctests for the unstaticmethod function itself; they pass too. I'm still not totally sure that this is actually a smart thing to do, but it does seem to work.
Not sure if I get exactly what you're after, but would it be cleaner to use getattr?
>>> class SillyWalk(object):
...     @staticmethod
...     def ise(walk):
...         return (False, "boo")
...     def walk(self, am="ise"):
...         wge, r = getattr(self, am)(self)
...         print wge, r
...
>>> sw = SillyWalk()
>>> sw.walk("ise")
False boo
