How can I dynamically extend arguments of a method? - python

The methods should be callable both with explicitly passed parameters and with the class's attributes taken into account.
So what I'd like to achieve is overriding a method's arguments with preset attributes.
E.g.:
class Obj:
    def __init__(self, cfg=None):
        self.cfg = {}
        if cfg:
            self.cfg = cfg
        return

    def _actual_f_1(self, x, **kwargs):
        return x + 100

    def f_1(self, x):
        new = {"x": x, **self.cfg}
        return self._actual_f_1(**new)
o = Obj()
o.f_1(1)
which prints
101
OR using the overriding approach:
o = Obj({"x": 100})
o.f_1(1)
which now gives
200
The defined class Obj looks pretty clumsy. Especially if several methods of the class should use the described logic.
How can I generalize the logic of f_1 which basically only alters the parameters before calling the actual method?

You can use __init_subclass__ in a base class to decorate all methods in a class, in a transparent way, to pick the fitting named parameters from a .cfg dict if they are not passed.
So, first let's think about the code for such a decorator. Applying arguments can be rather complicated, because with positional and named parameters, plus "positional only" and "keyword only" in the mix, the number of combinations explodes.
Python's stdlib has inspect.signature, which returns a Signature object with enough functionality to cover all the corner cases. I use it here, but only along the common path - so as long as the arguments you want to apply automatically are normal positional-or-keyword or keyword-only parameters, it should work:
import inspect
from functools import wraps

def extender(method):
    sig = inspect.signature(method)
    @wraps(method)
    def wrapper(self, *args, **kwargs):
        # bind self as well, so positional args line up with the signature
        bound = sig.bind_partial(self, *args, **kwargs)
        for parameter in sig.parameters.keys():
            if parameter not in bound.arguments and parameter in getattr(self, "cfg", {}):
                kwargs[parameter] = self.cfg[parameter]
        return method(self, *args, **kwargs)
    return wrapper
Here we can see it working:
In [78]: class A:
    ...:     def __init__(self):
    ...:         self.cfg = {"b": 5}
    ...:     @extender
    ...:     def a(self, a, b, c=10):
    ...:         print(a, b, c)
    ...:
In [79]: a = A()
In [80]: a.a(1)
1 5 10
In [81]: a.a(1, c=2)
1 5 2
In [82]: a.a(1, c=2, b=3)
1 3 2
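For reference, here is the bind_partial machinery in isolation (a standalone illustration with a throwaway function of my own, not part of the answer's code):

```python
import inspect

def f(x, y, z=3):
    pass

sig = inspect.signature(f)
bound = sig.bind_partial(1)  # bind only the first positional argument

print(bound.arguments)       # {'x': 1} -- only what was actually passed
print(list(sig.parameters))  # ['x', 'y', 'z']
```

The decorator relies on exactly this: parameters absent from bound.arguments are the ones it is allowed to fill in from cfg.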
With only this decorator, your code could be rewritten as:
class Obj:
    def __init__(self, cfg=None):
        self.cfg = cfg if cfg is not None else {"extra": 10}

    @extender
    def f_1(self, x, extra):
        return x + 100
And if you want a base-class that will transparently wrap all methods in all subclasses with the extender, you can use this:
class Base:
    def __init_subclass__(cls, *args, **kwargs):
        super().__init_subclass__(*args, **kwargs)
        for name, value in cls.__dict__.items():
            if callable(value):
                setattr(cls, name, extender(value))
And now one can use simply:
In [84]: class B(Base):
    ...:     def __init__(self):
    ...:         self.cfg = {"c": 10}
    ...:     def b(self, a, c):
    ...:         print(a, c)
    ...:
In [85]: B().b(1)
1 10
This decorator, unlike your example, takes care to just inject the arguments the function expects to receive, and not all of the keys from self.cfg in every function call.
If you want that behavior instead, you just have to take care to bind the passed arguments first and then update the result with the cfg dict, so that cfg values override the passed arguments (the "overriding" behaviour from your example). The decorator code for that would be:
import inspect
from functools import wraps

def extender_kw(method):
    sig = inspect.signature(method)
    @wraps(method)
    def wrapper(self, *args, **kwargs):
        bound = sig.bind_partial(self, *args, **kwargs)
        merged = dict(bound.arguments)
        merged.pop("self", None)  # self is passed explicitly below
        merged |= getattr(self, "cfg", {})
        return method(self, **merged)
    return wrapper
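For instance, applied to the Obj class from the question (a self-contained sketch; I bind self explicitly so the positional arguments line up with the signature):

```python
import inspect
from functools import wraps

def extender_kw(method):
    sig = inspect.signature(method)
    @wraps(method)
    def wrapper(self, *args, **kwargs):
        # bind everything that was passed, then let cfg override it
        bound = sig.bind_partial(self, *args, **kwargs)
        merged = dict(bound.arguments)
        merged.pop("self", None)  # self is passed explicitly below
        merged |= getattr(self, "cfg", {})
        return method(self, **merged)
    return wrapper

class Obj:
    def __init__(self, cfg=None):
        self.cfg = cfg if cfg is not None else {}

    @extender_kw
    def f_1(self, x):
        return x + 100

print(Obj().f_1(1))            # 101
print(Obj({"x": 100}).f_1(1))  # 200 -- cfg overrides the passed x
```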

I am interpreting "generalize" as "write this with fewer lines of code."
You wrote
self.cfg = {}
if cfg:
    self.cfg = cfg
return
The 4th line is superfluous and can be elided.
The first 3 lines could be a simple assignment of ... = cfg or {}
Or we could inherit from a utility class, and make a super().__init__ call.
So now we're presumably down to DRYing up and condensing these two lines:
new = {"x": x, **self.cfg}
return self._actual_f_1(**new)
Maybe write them as
return self._actual_f_1(self.args())
where the parent abstract class offers an args helper that knows about self.cfg.
It would inspect the stack to find the caller's arg signature, and merge it with cfg.
Alternatively you could phrase it as
return self.call(self._actual_f_1)
though that seems less convenient.
Do let us know the details of how you wind up resolving this.
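A minimal sketch of that idea (the names ConfigMixin and args are my own, the merge order makes cfg win as in the question's example, and I unpack the result with ** rather than passing the dict positionally; frame introspection like this is fragile and only illustrative):

```python
import inspect

class ConfigMixin:
    def args(self):
        # frame of the method that called args()
        caller = inspect.currentframe().f_back
        # the caller's declared argument names, skipping 'self'
        names = caller.f_code.co_varnames[1:caller.f_code.co_argcount]
        merged = {name: caller.f_locals[name] for name in names}
        merged.update(self.cfg)  # cfg wins, as in the question
        return merged

class Obj(ConfigMixin):
    def __init__(self, cfg=None):
        self.cfg = cfg or {}

    def _actual_f_1(self, x, **kwargs):
        return x + 100

    def f_1(self, x):
        return self._actual_f_1(**self.args())

print(Obj().f_1(1))            # 101
print(Obj({"x": 100}).f_1(1))  # 200
```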

Related

Wrap an arbitrary class with a new method dynamically

I have a class A.
I have another class B. Instances of class B should function exactly like class A, except for one caveat: I want another method available called special_method(self, args, kwargs).
So the following should work:
instance_A = classA(args, kwargs)
instance_B = classB(instance_A)
method_result = instance_B.special_method(args, kwargs)
How do I write class B to accomplish this?
Note: If I only wanted to do this for ONE class A, I could just have class B inherit from class A. But I want to be able to add special_method to classes C, D, E, F... etc.
So, you are describing a proxy object. Doing this for non-special methods is trivial in Python: you can use the __getattr__ special method.
In [1]: class A:
   ...:     def foo(self):
   ...:         return "A"
   ...:
In [2]: class B:
   ...:     def __init__(self, instance):
   ...:         self._instance = instance
   ...:     def special_method(self, *args, **kwargs):
   ...:         # do something special
   ...:         return 42
   ...:     def __getattr__(self, name):
   ...:         return getattr(self._instance, name)
   ...:
In [3]: a = A()
In [4]: b = B(a)
In [5]: b.foo()
Out[5]: 'A'
In [6]: b.special_method()
Out[6]: 42
However, there is one caveat here: this won't work with special ("dunder") methods, because special methods skip this part of attribute resolution and are looked up directly on the class __dict__.
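A quick demonstration of the caveat (class names are mine):

```python
class A:
    def __len__(self):
        return 3

class B:
    def __init__(self, instance):
        self._instance = instance

    def __getattr__(self, name):
        return getattr(self._instance, name)

b = B(A())
print(b.__len__())  # 3 -- explicit attribute lookup still goes through __getattr__
try:
    len(b)  # but len() looks __len__ up on type(b) and bypasses __getattr__
except TypeError as exc:
    print(exc)
```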
Alternatively, you can simply add the method to all the classes you need. Something like:
def special_method(self, *args, **kwargs):
    # do something special
    return 42

for klass in [A, C, D, E, F]:
    klass.special_method = special_method
Of course, this would affect all instances of these classes (since you are simply dynamically adding a method to the class).
If you really need special methods, your best bet would be to create a subclass - but you can do this dynamically with a simple helper function, e.g.:
def special_method(self, *args, **kwargs):
    # do something special
    return 42

_SPECIAL_MEMO = {}

def dynamic_mixin(klass, *init_args, **init_kwargs):
    if klass not in _SPECIAL_MEMO:
        child = type(f"{klass.__name__}Special", (klass,), {"special_method": special_method})
        _SPECIAL_MEMO[klass] = child
    return _SPECIAL_MEMO[klass](*init_args, **init_kwargs)

class Foo:
    def __init__(self, foo):
        self.foo = foo

    def __len__(self):
        return 88

    def bar(self):
        return self.foo * 2

special_foo = dynamic_mixin(Foo, 10)

print("calling len", len(special_foo))
print("calling bar", special_foo.bar())
print("calling special method", special_foo.special_method())
The above script prints:
calling len 88
calling bar 20
calling special method 42

Get arguments that an object's __init__ was called with

Is there a way to get an object's __init__ argument values in Python 2.7? I'm able to get the defaults through getargspec, but I would like to access the passed-in values:
import inspect

class AnObject(object):
    def __init__(self, kw='', *args, **kwargs):
        print 'Hello'

anobj = AnObject(kw='a keyword arg')
print inspect.getargspec(anobj.__init__)
Returns
Hello
ArgSpec(args=['self', 'kw'], varargs='args', keywords='kwargs', defaults=('',))
__init__ is treated no differently than any other function. So, like with any other function, its arguments are discarded once it returns -- unless you save them somewhere before that.
The standard approach is to save what you need later in attributes of the instance:
class Foo:
    def __init__(self, a, b, *args, **kwargs):
        self.a = a
        self.b = b
        <etc>
Dataclasses, introduced in Python 3.7, streamline this process, but require type annotations:
import dataclasses

@dataclasses.dataclass
class Foo:
    a: int
    b: str
is equivalent to:
class Foo:
    def __init__(self, a: int, b: str):
        self.a = a
        self.b = b
Though see Python decorator to automatically define __init__ variables for why this streamlining is not very useful in practice.
You can store them as attributes.
class AnObject(object):
    def __init__(self, kw='', *args, **kwargs):
        self.kw = kw
        self.args = args
        self.kwargs = kwargs
then just print them:
anobj = AnObject(kw='a keyword arg')
print anobj.kw
print anobj.args
print anobj.kwargs
If you want to see them all, you could take a look at its __dict__ attribute.
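For example (in Python 3 syntax; vars(obj) is equivalent to reading obj.__dict__):

```python
class AnObject:
    def __init__(self, kw='', *args, **kwargs):
        self.kw = kw
        self.args = args
        self.kwargs = kwargs

anobj = AnObject('a keyword arg', 1, 2, extra=True)
print(vars(anobj))
# {'kw': 'a keyword arg', 'args': (1, 2), 'kwargs': {'extra': True}}
```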

Using Python decorators to add a method to a method

I want to be able to call a method according to some standard format:
outputs = obj.meth(in_0, in_1, ...)
, where outputs is a tuple of arrays, and each input is an array.
However, in most instances, I only return one array, and don't want to be forced to return a tuple of length 1 just for the sake of the standard format. (My actual formatting problem is more complicated, but let's stick with this explanation for now.)
I want to be able to define a class like:
class _SomeClass(object):
    def __init__(self):
        self._amount_to_add = 1

    @single_return_format
    def add_one(self, x):
        return x + self._amount_to_add
And then be able to call it as follows:
obj = _SomeClass()
assert obj.add_one(3) == 4
assert obj.add_one.standard_format(3) == (4, )
Question is: how do I define the decorator to allow this behaviour?
I tried:
def single_return_format(fcn):
    fcn.standard_format = lambda *args: (fcn(*args), )
    return fcn
, but it fails on the line with the second assert with:
TypeError: add_one() takes exactly 2 arguments (1 given)
Because add_one requires "self" as an argument, and the object has not even been created yet at the time the decorator modifies the function.
So Stack, how can I do this?
Notes:
1) I know I could do this with base-classes and inheritance instead, but that becomes a problem when you have more than one method in the class that you want to decorate this way.
2) The actual problem comes from using theano - the standard format is outputs, updates = fcn(*inputs), but most functions don't return any updates, so you want to be able to define those functions in a natural way, but still have the option of calling them according to this standard interface.
That's indeed a problem, because of the way the "bound" method is retrieved from the function - that mechanism doesn't take the extra attribute into account.
I see two ways:
You could just wrap the function:
def single_return_format(fcn):
    # TODO Do some functools.wraps here...
    return lambda *args, **kwargs: (fcn(*args, **kwargs), )
No fooling around with .standard_format, but a mere replacement of the function. So the function can define itself as returning one value, but can only be called as returning the tuple.
If this is not what you want, you can define a class for decorating methods which overrides __get__ and does the wrapping in a "live fashion". Of course, it can as well redefine __call__ so that it is usable for (standalone, non-method) functions as well.
To get exactly what you want you'd have to write a non-data descriptor and a set of wrapper classes for your functions. The reason for this is that the process of getting functions from objects as methods is highly optimised and it's not possible to hijack this mechanism. Instead you have to write your own classes that simulate this mechanism -- which will slow down your code if you are making lots of small method calls.
The very best way I can think of to get the desired functionality is not to use any of the methods that you describe, but rather to write a wrapper function that you use when needed to call a normal function in the standard format, e.g.:
def vectorise(method, *args, **kwargs):
    return tuple(method(arg, **kwargs) for arg in args)

obj = _SomeClass()
result = vectorise(obj.add_one, 1, 2, 3)
Indeed, this is how numpy takes functions that operate on one argument and turns them into a function that works on arrays.
import numpy

def add_one(x):
    return x + 1

arr = numpy.vectorize(add_one)([1, 2, 3])
If you really, really want to use non-data descriptors, then the following will work. Be warned: these method calls are considerably slower. On my computer a normal method call takes 188 nanoseconds versus 1.53 microseconds for a "simple" method call -- a ten-fold difference. And a vectorise call takes half the time a standard_form call does. The vast majority of that time is the lookup of the methods; the actual method calls are quite fast.
class simple_form:
    """Allows a simple function to be called in a standard way."""
    def __init__(self, func):
        self.func = func

    def __get__(self, instance, owner):
        if instance is None:
            return self.func
        return SimpleFormMethod(self.func, instance)


class MethodBase:
    """Provides support for getting the string representation of methods."""
    def __init__(self, func, instance):
        self.func = func
        self.instance = instance

    def _format(self):
        return "<bound {method_class} {obj_class}.{func} of {obj}>".format(
            method_class=self.__class__.__name__,
            obj_class=self.instance.__class__.__name__,
            func=self.func.__name__,
            obj=self.instance)

    def __str__(self):
        return self._format()

    def __repr__(self):
        return self._format()


class SimpleFormMethod(MethodBase):
    def __call__(self, *args, **kwargs):
        return self.func(self.instance, *args, **kwargs)

    @property
    def standard_form(self):
        return StandardFormMethod(self.func, self.instance)


class StandardFormMethod(MethodBase):
    def __call__(self, *args, **kwargs):
        return tuple(self.func(self.instance, arg, **kwargs) for arg in args)


class Number(object):
    def __init__(self, value):
        self.value = value

    def add_to(self, *values):
        return tuple(val + self.value for val in values)

    @simple_form
    def divide_into(self, value):
        return value / self.value


num = Number(2)
print("normal method access:", num.add_to, sep="\n")
print("simple form method access:", num.divide_into, sep="\n")
print("standard form method access:", num.divide_into.standard_form, sep="\n")
print("access to underlying function:", Number.divide_into, sep="\n")
print("simple example usage:", num.divide_into(3))
print("standard example usage:", num.divide_into.standard_form(*range(3)))
Dunes gave the correct answer. I've stripped it down to bare bones so that it solves the problem in the question. The stripped-down code is here:
class single_return_format(object):
    def __init__(self, func):
        self._func = func

    def __get__(self, instance, owner):
        return SimpleFormMethod(instance, self._func)


class SimpleFormMethod(object):
    def __init__(self, instance, func):
        self._instance = instance
        self._func = func

    def __call__(self, *args, **kwargs):
        return self._func(self._instance, *args, **kwargs)

    @property
    def standard_format(self):
        return lambda *args, **kwargs: (self._func(self._instance, *args, **kwargs), )


class _SomeClass(object):
    def __init__(self):
        self._amount_to_add = 1

    @single_return_format
    def add_one(self, x):
        return x + self._amount_to_add


obj = _SomeClass()
assert obj.add_one(3) == 4
assert obj.add_one.standard_format(3) == (4, )

Python 3 bound methods subscription

To begin with, I know that bound-method attributes do not exist in Python 3 (according to this topic: Why does setattr fail on a bound method).
I'm trying to write a pseudo 'reactive' Python framework. Maybe I'm missing something, and maybe what I'm trying to do is somehow doable. Let's look at the code:
from collections import defaultdict

class Event:
    def __init__(self):
        self.funcs = []

    def bind(self, func):
        self.funcs.append(func)

    def __call__(self, *args, **kwargs):
        for func in self.funcs:
            func(*args, **kwargs)
def bindable(func):
    events = defaultdict(Event)

    def wrapper(self, *args, **kwargs):
        func(self, *args, **kwargs)
        # I'm doing it this way, because we need an event PER class instance
        events[self]()

    def bind(func):
        # Is it possible to somehow implement this method "in a proper way"?
        # To capture "self" somehow - it has to be implemented another way than now,
        # because now it is a simple function not connected to an instance.
        print('TODO')

    wrapper.bind = bind
    return wrapper
class X:
    # this method should be bindable - you should be able to attach a callback to it
    @bindable
    def test(self):
        print('test')

# sample usage:
def f():
    print('calling f')

a = X()
b = X()

# binding callback
a.test.bind(f)

a.test()  # should call f
b.test()  # should NOT call f
Of course all classes, like Event were simplified for this example. Is there any way to fix this code to work? I want simply to be able to use bindable decorator to make a method (not a function!) bindable and be able to later "bind" it to a callback - in such way, that if somebody calls the method, the callback will be called automatically.
Is there any way in Python 3 to do it?
Oh yeah! :D I've found an answer - a little crazy, but working fast. If somebody has a comment or a better solution, I would be very interested in seeing it. The following code works for methods AND functions:
# ----- test classes -----
class Event:
    def __init__(self):
        self.funcs = []

    def bind(self, func):
        self.funcs.append(func)

    def __call__(self, *args, **kwargs):
        message = type('EventMessage', (), kwargs)
        for func in self.funcs:
            func(message)

# ----- implementation -----
class BindFunction:
    def __init__(self, func):
        self.func = func
        self.event = Event()

    def __call__(self, *args, **kwargs):
        out = self.func(*args, **kwargs)
        self.event(source=None)
        return out

    def bind(self, func):
        self.event.bind(func)

class BindMethod(BindFunction):
    def __init__(self, instance, func):
        super().__init__(func)
        self.instance = instance

    def __call__(self, *args, **kwargs):
        out = self.func(self.instance, *args, **kwargs)
        self.event(source=self.instance)
        return out

class Descriptor(BindFunction):
    methods = {}

    def __get__(self, instance, owner):
        if instance not in Descriptor.methods:
            Descriptor.methods[instance] = BindMethod(instance, self.func)
        return Descriptor.methods[instance]

def bindable(func):
    return Descriptor(func)

# ----- usage -----
class list:
    def __init__(self, seq=()):
        self.__list = [el for el in seq]

    @bindable
    def append(self, p_object):
        self.__list.append(p_object)

    def __str__(self):
        return str(self.__list)

@bindable
def x():
    print('calling x')

# ----- tests -----
def f(event):
    print('calling f')
    print('source type: %s' % type(event.source))

def g(event):
    print('calling g')
    print('source type: %s' % type(event.source))

a = list()
b = list()

a.append.bind(f)
b.append.bind(g)

a.append(5)
print(a)
b.append(6)
print(b)

print('----')

x.bind(f)
x()
and the output:
calling f
source type: <class '__main__.list'>
[5]
calling g
source type: <class '__main__.list'>
[6]
----
calling x
calling f
source type: <class 'NoneType'>
The trick is to use Python's descriptors to store the current instance pointer.
As a result, we are able to bind a callback to any Python function. The execution overhead is not too big - an empty function executes 5-6 times slower than without this decorator. This overhead is caused by the needed function chain and by the event handling.
When using a "proper" event implementation (using weak references), like this one: Signal slot implementation, we get an overhead of 20-25 times the base function execution, which is still good.
EDIT:
Following Hyperboreus' question, I updated the code so that the callback methods can read the source object from which the callbacks were called. It is now accessible via the event.source variable.
To be honest, I do not have an answer to your question, just another question in return:
Wouldn't monkey-patching your instances create the behaviour you intend?
#! /usr/bin/python3.2

import types

class X:
    def __init__(self, name): self.name = name
    def test(self): print(self.name, 'test')

def f(self): print(self.name, '!!!')

a = X('A')
b = X('B')
b.test = types.MethodType(f, b)  # "binding"
a.test()
b.test()

Preventing a class's function attributes from being passed self as the first arg

Okay, so I've got a class where one of the attributes is a callback function. The problem is, whenever I call it from within the class (e.g. as self.function_attr()), it gets passed self as the first argument. Here's an idea of what I'm working with:
def callback(a, b):
    # do something with a, b

class A:
    def __init__(self, callback):
        self.callback = callback
        self.callback(1, 2)  # Raises a TypeError: takes exactly 2 arguments (3 given)
I'm not willing to write each callback function to take self as a first argument. I wrote a decorator that works around the issue:
def callback_decorator(func):
    def newfunc(self, *args, **kw):
        return func(*args, **kw)
    return newfunc
but I'm wondering if there's anything better.
Basically, my question is, how can I call instance attributes of my class which are functions without them being passed self as the first argument?
You just need to make it a staticmethod when you bind it to the class.
def callback(a, b):
    # do something with a, b

class A:
    def __init__(self, callback):
        # now it won't get passed self
        self.callback = staticmethod(callback)
        self.callback(1, 2)
or
class A:
    def __init__(self, callback):
        self.callback(1, 2)

    # now it won't get passed self
    callback = staticmethod(callback)
As far as I know, a wrapper (like your decorator) is the simplest way to go. Since you already have an object in which to store the function, I wouldn't bother with a decorator. (Note I've inherited from object, which is something you should probably be doing unless you specifically want old-style class behaviour.)
class A(object):
    def __init__(self, callback):
        self._callback = callback
        self.callback(1, 2)

    def callback(self, *args, **kwargs):
        return self._callback(*args, **kwargs)
This behaves as you'd expect:
>>> def f(x, y):
...     print "X: %s, Y: %s" % (x, y)
...
>>> mya = A(f)
X: 1, Y: 2
