If I want to pass an identical list of args from one function to another, I can use locals(). However this falls over if I'm in an instance method, as self cannot be passed as a kwarg. I can create a dict of locals and then delete self. I was wondering if there's a more elegant way of doing this though - it seems to come up quite a lot for me.
class MyObj:
    def test(self, a=1, b=2, c=3, d=4):
        return [a, b, c, d]

class MyVerboseObj(MyObj):
    def test(self, a=1, b=2, c=3, d=4):
        print "RUNNING TEST a=%d" % a
        kwargs = locals()
        del kwargs["self"]
        return MyObj.test(self, **kwargs)

MyVerboseObj().test()
Is there a better way to pass the identical list of args from MyVerboseObj.test to MyObj.test?
** UPDATE **
This is a simplified example to demonstrate my question. A more practical example might be this:
class GenericShow:
    def __init__(self, path):
        self.path = path

    def getRenderPath(self, extn="jpg", colspace="rgb"):
        return "%s/render_%s.%s" % (self.path, colspace, extn)

    def hasGenericShowHandler(self):
        try:
            import GenericShowHandler
            return True
        except ImportError:
            return False

class UkShow(GenericShow):
    def getRenderPath(self, extn="jpg", colspace="rgb"):
        if self.hasGenericShowHandler():
            kwargs = locals()
            kwargs.pop("self")
            return GenericShow.getRenderPath(self, **kwargs)
        else:
            return "%s/blah/render_%s.%s" % (self.path, colspace, extn)

show = UkShow("/jobs/test")
show.hasGenericShowHandler()
print show.getRenderPath()
I have a lot of show objects with different vars defined by different modules and different naming conventions - I'm trying to create show objects with common functionality.
If you don't want to override the parent's function (because you want to use it) - simply avoid overriding it by giving it a different name!
class MyObj:
    def base_test(self, a=1, b=2, c=3, d=4):
        return [a, b, c, d]

class MyVerboseObj(MyObj):
    def test(self, a=1, b=2, c=3, d=4):
        print "RUNNING TEST a=%d" % a
        kwargs = locals()
        del kwargs["self"]
        return self.base_test(**kwargs)

print MyVerboseObj().test()
Using locals() is rarely a good idea. This approach is confusing and fragile under modification: introducing any new local variable before the locals() call will silently change the arguments you pass on.
I would go for the more explicit:
class MyObj:
    def test(self, a=1, b=2, c=3, d=4):
        return [a, b, c, d]

class MyVerboseObj(MyObj):
    def test(self, a=1, b=2, c=3, d=4):
        print "RUNNING TEST a=%d" % a
        return MyObj.test(self, a, b, c, d)

MyVerboseObj().test()
I know it is tedious, but it avoids some trouble. Plus it is more efficient.
Alternatively, if you are OK with losing the signature, you could just accept *args, **kwargs as parameters to your overriding function:
class MyObj:
    def test(self, a=1, b=2, c=3, d=4):
        return [a, b, c, d]

class MyVerboseObj(MyObj):
    def test(self, *args, **kwargs):
        print "RUNNING TEST"
        return MyObj.test(self, *args, **kwargs)

MyVerboseObj().test()
Accessing a in this version would be a bit of a hassle, but since you are not really interested in the arguments in your second example this might be the easier solution.
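If you do occasionally need a in the *args/**kwargs version, one option (a Python 3 sketch, not from the original answer) is to bind the received arguments against the parent's signature and apply its defaults:

```python
import inspect

class MyObj:
    def test(self, a=1, b=2, c=3, d=4):
        return [a, b, c, d]

class MyVerboseObj(MyObj):
    def test(self, *args, **kwargs):
        # Recover 'a' by binding the received args to the parent's signature.
        bound = inspect.signature(MyObj.test).bind(self, *args, **kwargs)
        bound.apply_defaults()
        print("RUNNING TEST a=%d" % bound.arguments["a"])
        return MyObj.test(self, *args, **kwargs)
```

This keeps the forwarding trivial while still letting the override inspect individual arguments.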
As a side note: as your inheritance structure becomes more complicated (i.e. multiple inheritance) you might want to start using super():
class MyObj:
    def test(self, a=1, b=2, c=3, d=4):
        return [a, b, c, d]

class MyVerboseObj(MyObj):
    def test(self, *args, **kwargs):
        print "RUNNING TEST"
        return super(MyVerboseObj, self).test(*args, **kwargs)

MyVerboseObj().test()
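On Python 3, the zero-argument form of super() avoids naming the class at all; a minimal sketch of the same idea:

```python
class MyObj:
    def test(self, a=1, b=2, c=3, d=4):
        return [a, b, c, d]

class MyVerboseObj(MyObj):
    def test(self, *args, **kwargs):
        print("RUNNING TEST")
        # Zero-argument super() fills in the class and instance automatically.
        return super().test(*args, **kwargs)
```

This form also keeps working unchanged if the class is renamed.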
The methods should be callable with explicitly passed parameters, while also taking the class's preset attributes into account. So what I'd like to achieve is overriding a method's arguments with preset attributes.
E.g.:
class Obj:
    def __init__(self, cfg=None):
        self.cfg = {}
        if cfg:
            self.cfg = cfg
        return

    def _actual_f_1(self, x, **kwargs):
        return x + 100

    def f_1(self, x):
        new = {"x": x, **self.cfg}
        return self._actual_f_1(**new)

o = Obj()
print(o.f_1(1))
which prints
101
OR using the overriding approach:
o = Obj({"x": 100})
print(o.f_1(1))
which now gives
200
The defined class Obj looks pretty clumsy. Especially if several methods of the class should use the described logic.
How can I generalize the logic of f_1 which basically only alters the parameters before calling the actual method?
You can use __init_subclass__ in a base class to decorate all methods in a class, in a transparent way, to pick the fitting named parameters from a .cfg dict if they are not passed.
So, first let's think about the code for such a decorator. Applying arguments can be rather complicated: between positional and named parameters, with "positional only" and "keyword only" in the mix, the number of combinations explodes.
Python's stdlib has inspect.signature, which returns a Signature object with enough functionality to cover all the corner cases. I use it here and cover just the common path - so as long as the arguments you want to apply automatically are normal positional_or_keyword or keyword_only parameters, it should work:
import inspect
from functools import wraps

def extender(method):
    sig = inspect.signature(method)
    @wraps(method)
    def wrapper(self, *args, **kwargs):
        # Bind self as well, so positional args line up with the signature.
        bound = sig.bind_partial(self, *args, **kwargs)
        for parameter in sig.parameters.keys():
            if parameter not in bound.arguments and parameter in getattr(self, "cfg", {}):
                kwargs[parameter] = self.cfg[parameter]
        return method(self, *args, **kwargs)
    return wrapper
Here we can see that working:
In [78]: class A:
    ...:     def __init__(self):
    ...:         self.cfg = {"b": 5}
    ...:     @extender
    ...:     def a(self, a, b, c=10):
    ...:         print(a, b, c)
    ...:

In [79]: a = A()

In [80]: a.a(1)
1 5 10

In [81]: a.a(1, c=2)
1 5 2

In [82]: a.a(1, c=2, b=3)
1 3 2
With only this decorator, your code could be rewritten as:
class Obj:
    def __init__(self, cfg=None):
        self.cfg = cfg if cfg is not None else {"extra": 10}

    @extender
    def f_1(self, x, extra):
        return x + 100
And if you want a base-class that will transparently wrap all methods in all subclasses with the extender, you can use this:
class Base:
    def __init_subclass__(cls, *args, **kwargs):
        super().__init_subclass__(*args, **kwargs)
        for name, value in cls.__dict__.items():
            if callable(value):
                setattr(cls, name, extender(value))
And now one can use simply:
In [84]: class B(Base):
    ...:     def __init__(self):
    ...:         self.cfg = {"c": 10}
    ...:     def b(self, a, c):
    ...:         print(a, c)
    ...:

In [85]: B().b(1)
1 10
This decorator, unlike your example, takes care to just inject the arguments the function expects to receive, and not all of the keys from self.cfg in every function call.
If you want that behavior instead, you just have to take care to bind the passed arguments first and then update them with the cfg dict, so that the cfg values override the passed arguments (matching the behavior in your question). The decorator code for that would be:
import inspect
from functools import wraps

def extender_kw(method):
    sig = inspect.signature(method)
    @wraps(method)
    def wrapper(self, *args, **kwargs):
        # Bind self as well, so positional args line up with the signature.
        bound = sig.bind_partial(self, *args, **kwargs)
        kwargs = dict(bound.arguments)
        kwargs |= getattr(self, "cfg", {})
        return method(**kwargs)
    return wrapper
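A quick self-contained sketch (not from the original answer) showing the cfg-overrides-arguments behavior of this decorator reproducing the question's 101/200 outputs:

```python
import inspect
from functools import wraps

def extender_kw(method):
    sig = inspect.signature(method)
    @wraps(method)
    def wrapper(self, *args, **kwargs):
        # Bind self as well, so positional args line up with the signature.
        bound = sig.bind_partial(self, *args, **kwargs)
        merged = dict(bound.arguments)
        merged.update(getattr(self, "cfg", {}))  # cfg wins, as in the question
        return method(**merged)
    return wrapper

class Obj:
    def __init__(self, cfg=None):
        self.cfg = cfg or {}

    @extender_kw
    def f_1(self, x):
        return x + 100

print(Obj().f_1(1))            # 101
print(Obj({"x": 100}).f_1(1))  # 200
```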
I am interpreting "generalize" as "write this with fewer lines of code."

You wrote

self.cfg = {}
if cfg:
    self.cfg = cfg
return

The bare return is superfluous and can be elided. The first three lines could be a simple assignment of self.cfg = cfg or {}. Or we could inherit from a utility class and make a super().__init__() call.

So now we're presumably down to DRYing up and condensing these two lines:

new = {"x": x, **self.cfg}
return self._actual_f_1(**new)

Maybe write them as

return self._actual_f_1(**self.args())

where a parent (abstract) class offers an args() helper that knows about self.cfg. It would inspect the stack to find the caller's arg signature and merge it with cfg. Alternatively you could phrase it as

return self.call(self._actual_f_1)

though that seems less convenient. Do let us know the details of how you wind up resolving this.
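A hedged sketch of such an args() helper, using frame inspection (the class name ConfigBase is hypothetical, and this relies on CPython frame support, so it is fragile by design):

```python
import inspect

class ConfigBase:
    """Hypothetical helper base: merges the caller's named args with self.cfg."""
    cfg = {}

    def args(self):
        # Look at the calling method's frame to recover its named arguments,
        # then let self.cfg override them.
        frame = inspect.currentframe().f_back
        info = inspect.getargvalues(frame)
        merged = {name: info.locals[name] for name in info.args if name != "self"}
        merged.update(self.cfg)
        return merged

class Obj(ConfigBase):
    def __init__(self, cfg=None):
        self.cfg = cfg or {}

    def _actual_f_1(self, x, **kwargs):
        return x + 100

    def f_1(self, x):
        return self._actual_f_1(**self.args())

print(Obj().f_1(1))            # 101
print(Obj({"x": 100}).f_1(1))  # 200
```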
Is there a way to get an object's __init__ argument values in Python 2.7? I'm able to get the defaults through getargspec but I would like to access the passed-in values.
import inspect

class AnObject(object):
    def __init__(self, kw='', *args, **kwargs):
        print 'Hello'

anobj = AnObject(kw='a keyword arg')
print inspect.getargspec(anobj.__init__)
Returns
Hello
ArgSpec(args=['self', 'kw'], varargs='args', keywords='kwargs', defaults=('',))
__init__ is treated no differently than any other function. So, like with any other function, its arguments are discarded once it returns -- unless you save them somewhere before that.
The standard approach is to save what you need later in attributes of the instance:
class Foo:
    def __init__(self, a, b, *args, **kwargs):
        self.a = a
        self.b = b
        <etc>
"Dataclasses" introduced in 3.7 streamline this process but require data annotations:
import dataclasses

@dataclasses.dataclass
class Foo:
    a: int
    b: str
is equivalent to:
class Foo:
    def __init__(self, a: int, b: str):
        self.a = a
        self.b = b
Though see Python decorator to automatically define __init__ variables for why this streamlining is not very useful in practice.
You can store them as attributes.
class AnObject(object):
    def __init__(self, kw='', *args, **kwargs):
        self.kw = kw
        self.args = args
        self.kwargs = kwargs
then just print them:
anobj = AnObject(kw='a keyword arg')
print anobj.kw
print anobj.args
print anobj.kwargs
If you want to see them all, you can take a look at the instance's __dict__ attribute.
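For instance, vars() returns the same dict as __dict__; a small sketch reusing the class above (the sample arguments here are illustrative):

```python
class AnObject(object):
    def __init__(self, kw='', *args, **kwargs):
        self.kw = kw
        self.args = args
        self.kwargs = kwargs

anobj = AnObject('positional', kw_only=True)
print(vars(anobj))  # the instance's __dict__: kw, args and kwargs attributes
```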
Wrapping a class's method in a "boilerplate" Python decorator will treat that method as a regular function and make it lose its __self__ attribute that refers to the class instance object. Can this be avoided?
Take the following class:
class MyClass(object):
    def __init__(self, a=1, b=2):
        self.a = a
        self.b = b

    def meth(self):
        pass
If meth() is undecorated, MyClass().meth.__self__ refers to the instance the method is bound to, and enables something like setattr(my_class_object.meth.__self__, 'a', 5).
But when wrapping anything in a decorator, only the function object is passed; the object to which it is actually bound is not passed on along with it. (See this answer.)
import functools

def decorate(method):
    @functools.wraps(method)
    def wrapper(*args, **kwargs):
        # Do something to method.__self__ such as setattr
        print(hasattr(method, '__self__'))
        result = method(*args, **kwargs)
        return result
    return wrapper

class MyClass(object):
    def __init__(self, a=1, b=2):
        self.a = a
        self.b = b

    @decorate
    def meth(self):
        pass

MyClass().meth()
# False <--------
Can this be overridden?
Your main misunderstanding here is order of operation.
When the decorate() decorator is called, meth is not a method yet - it is still a plain function. Functions only become bound methods when they are looked up on an instance, via the descriptor protocol - that's why there is no __self__ (yet).
In other words, to decorate methods you have to ignore the fact that they are methods and treat them as normal functions - because that's what they are when the decorator is called.
In fact, the original meth function will never turn into a method - instead the function wrapper you returned from the decorator will be part of the class and will be the one that will get the __self__ attribute later.
If you decorate a method of the class, the first argument is always the instance (you can access it with args[0]):
import functools

def decorate(method):
    @functools.wraps(method)
    def wrapper(*args, **kwargs):
        print(hasattr(args[0], 'a'))
        result = method(*args, **kwargs)
        return result
    return wrapper

class MyClass(object):
    def __init__(self, a=1, b=2):
        self.a = a
        self.b = b

    @decorate
    def meth(self):
        pass

MyClass().meth()
Prints:
True
Edit:
You can also name self explicitly in your wrapper function's signature (based on the comments):
import functools

def decorate(method):
    @functools.wraps(method)
    def wrapper(self, *args, **kwargs):
        print(hasattr(self, 'a'))
        result = method(self, *args, **kwargs)
        return result
    return wrapper

class MyClass(object):
    def __init__(self, a=1, b=2):
        self.a = a
        self.b = b

    @decorate
    def meth(self):
        pass

MyClass().meth()
This also prints:
True
Let me clarify the process of decorating:
When you decorate meth with decorate in class MyClass, you are doing:
class MyClass(object):
    ...  # body omitted
    meth = decorate(meth)  # the "meth" inside the parentheses is the original function
As you can see, decorate takes method, which is a function, as a parameter and returns wrapper, which is another function. The original meth in MyClass is now replaced by the new wrapper. So when you call myclass_instance.meth(), you are calling the new wrapper function.
There isn't any black magic here, so self can definitely be passed into wrapper, and it is safe to accept it explicitly with wrapper(self, *args, **kwargs).
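Putting the two answers together, a small sketch (not from the original answers) of a decorator that actually mutates the instance, which is what the question tried to do via __self__:

```python
import functools

def decorate(method):
    @functools.wraps(method)
    def wrapper(self, *args, **kwargs):
        # The instance is the first argument, so we can setattr on it here.
        setattr(self, 'a', 5)
        return method(self, *args, **kwargs)
    return wrapper

class MyClass(object):
    def __init__(self, a=1, b=2):
        self.a = a
        self.b = b

    @decorate
    def meth(self):
        return self.a

print(MyClass().meth())  # 5
```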
How can I modify a self variable with a decorator?
Ex.
class Foo:
    def __init__(self, a):
        self.a = a
        self.li = []

    def afunction(self):
        pass
I want to add the function object afunction to the list self.li, so that I can keep a list of functions defined by the class and call them from it. How would I do that?
Thanks
I don't think you need a decorator. Functions are first-class objects in Python:
class Foo:
    def __init__(self, a):
        self.a = a
        self.li = [self.afunction]

    def afunction(self):
        pass
If your intention is to mark certain functions of a class as a special type so that you can identify them later for some other purpose, you could use a decorator, or you could just use a naming convention.
def marked(function):
    function.marked = 1
    return function

class MarkAware(object):
    def run_marked(self, *args, **kwargs):
        for name in dir(self):
            meth = getattr(self, name)
            if hasattr(meth, 'marked'):
                meth(*args, **kwargs)

    def foo(self):
        pass

    @marked
    def bar(self):
        pass
Alternative:
class NameConvention(object):
    def run_batchable(self, *args, **kwargs):
        for name in dir(self):
            if name.startswith('batchable_'):
                getattr(self, name)(*args, **kwargs)

    def foo(self):
        pass

    def batchable_bar(self):
        pass
As Lattyware explains in a comment to unutbu's answer, you can't directly do what you're asking, because any decorator on afunction will be run while the class itself is being created, not when each instance is created.
If all you really want is "a list of functions defined by the class", you don't need anything fancy at all for that. Just create that list in __init__:
import inspect

def __init__(self, a):
    self.a = a
    # dir() yields attribute names, so look each one up before testing it
    self.li = [getattr(self, name) for name in dir(self)
               if inspect.ismethod(getattr(self, name))]
If you want a list of certain specific functions, the easiest way is the way unutbu suggests, which still doesn't require a decorator.
If you want the decorator just to mark "this method should go into li", see sr2222's answer.
None of these are what you asked for, but they are probably what you want. There are a few ways to actually use a decorator to add the function to self.li, but they're all pretty horrible, and you probably don't want them. For example:
import new

class Foo:
    def __init__(self, a):
        self.a = a
        self.li = []

        def mydecorator(f):
            self.li.append(f)
            return f

        @mydecorator
        def afunction(self):
            print('a')

        self.afunction = new.instancemethod(afunction, self, Foo)
I have a class:
class A(object):
    def __init__(self, a, b, c, d, e, f, g, ..........., x, y, z):
        # do some init stuff
And I have a subclass which needs one extra arg (the last W)
class B(A):
    def __init__(self, a, b, c, d, e, f, g, ..........., x, y, z, W):
        A.__init__(self, a, b, c, d, e, f, g, ..........., x, y, z)
        self.__W = W
It seems dumb to write all this boilerplate code, e.g. passing all the args from B's ctor through to the call to A's ctor, since then every change to A's ctor must be applied to two other places in B's code.
I am guessing python has some idiom to handle such cases which I am unaware of. Can you point me in the right direction?
My best hunch, is to have a sort of Copy-Ctor for A and then change B's code into
class B(A):
    def __init__(self, instanceOfA, W):
        A.__copy_ctor__(self, instanceOfA)
        self.__W = W
This would suit my needs, since I always create the subclass when given an instance of the parent class, though I am not sure whether it's possible...
Considering that arguments could be passed either by name or by position, I'd code:
class B(A):
    def __init__(self, *a, **k):
        if 'W' in k:
            w = k.pop('W')
        else:
            a = list(a)  # *a arrives as a tuple, which has no pop()
            w = a.pop()
        A.__init__(self, *a, **k)
        self._W = w
Edit: based on Matt's suggestion, and to address gnibbler's concern about a positional-argument approach, you might check to make sure that the additional subclass-specific argument is being specified, similar to Alex's answer:
class B(A):
    def __init__(self, *args, **kwargs):
        try:
            self._w = kwargs.pop('w')
        except KeyError:
            pass
        super(B, self).__init__(*args, **kwargs)

>>> b = B(1, 2, w=3)
>>> b.a
1
>>> b.b
2
>>> b._w
3
Original answer:
Same idea as Matt's answer, using super() instead.
Use super() to call superclass's __init__() method, then continue initialising the subclass:
class A(object):
    def __init__(self, a, b):
        self.a = a
        self.b = b

class B(A):
    def __init__(self, w, *args):
        super(B, self).__init__(*args)
        self.w = w
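A quick usage sketch (not from the original answer) to make the argument order explicit: the subclass-specific argument comes first, and the rest flow through to the parent:

```python
class A(object):
    def __init__(self, a, b):
        self.a = a
        self.b = b

class B(A):
    def __init__(self, w, *args):
        super(B, self).__init__(*args)
        self.w = w

b = B(3, 1, 2)  # w=3 is consumed by B; 1 and 2 are forwarded to A
print(b.a, b.b, b.w)  # 1 2 3
```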
In situations where some or all of the arguments passed to __init__ have default values, it can be useful to avoid repeating the __init__ method signature in subclasses.
In these cases, __init__ can pass any extra arguments to another method, which subclasses can override:
class A(object):
    def __init__(self, a=1, b=2, c=3, d=4, *args, **kwargs):
        self.a = a
        self.b = b
        # …
        self._init_extra(*args, **kwargs)

    def _init_extra(self):
        """
        Subclasses can override this method to support extra
        __init__ arguments.
        """
        pass

class B(A):
    def _init_extra(self, w):
        self.w = w
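A small usage sketch (not from the original answer) showing how the leftover arguments flow through to the hook; a trimmed two-parameter A is used here for brevity:

```python
class A(object):
    def __init__(self, a=1, b=2, *args, **kwargs):
        self.a = a
        self.b = b
        # Any arguments not consumed above are handed to the hook.
        self._init_extra(*args, **kwargs)

    def _init_extra(self):
        pass

class B(A):
    def _init_extra(self, w):
        self.w = w

b = B(10, 20, w=30)  # a=10 and b=20 go to A; w flows through to B._init_extra
print(b.a, b.b, b.w)  # 10 20 30
```

Note that B never repeats A's signature, which was the point of the pattern.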
Are you wanting something like this?
class A(object):
    def __init__(self, a, b, c, d, e, f, g):
        # do stuff
        print a, d, g

class B(A):
    def __init__(self, *args):
        args = list(args)
        self.__W = args.pop()
        A.__init__(self, *args)