I'd like to automate unpacking init variables once and for all. My thought process is to unpack all and use setattr with the name of the variable as the attribute name.
class BaseObject(object):
    def __init__(self, *args):
        for arg in args:
            setattr(self, argname, arg)

class Sub(BaseObject):
    def __init__(self, argone, argtwo, argthree, argfour):
        super(Sub, self).__init__(args*)
Then you should be able to do:
In [3]: mysubclass = Sub('hi', 'stack', 'overflow', 'peeps')
In [4]: mysubclass.argtwo
Out[4]: 'stack'
Based on the param for 'stack' having been named argtwo.
This way you'll automatically have access but you could still override like below:
class Sub(BaseObject):
    def __init__(self, argone, argtwo, argthree, argfour):
        super(Sub, self).__init__(arglist)
        clean_arg_three = process_arg_somehow(argthree)
        self.argthree = clean_arg_three
Clearly I'm stuck on how to pass the actual name of each param (argone, argtwo, etc.) as a string to setattr, and also how to properly pass the args into the super __init__ (the args* in super(Sub, self).__init__(args*) above).
Thank you
Use **kwargs instead of *args:
class BaseObject(object):
    def __init__(self, **kwargs):
        for argname, arg in kwargs.iteritems():  # kwargs.items() in Python 3
            setattr(self, argname, arg)

class Sub(BaseObject):
    def __init__(self, argone, argtwo, argthree, argfour):
        super(Sub, self).__init__(argone=argone, argtwo=argtwo,
                                  argthree=argthree, argfour=argfour)
Then you can do:
s = Sub('a', 'b', 'c', 'd')
BaseObject.__init__() needs to discover the names of the arguments to Sub.__init__() somehow. The explicit approach would be to pass a dictionary of arguments to BaseObject.__init__(), as Uri Shalit suggested. Alternatively, you can use the inspect module to do this magically, with no extra code in your subclass. That comes with the usual downsides of magic ("how'd that happen?").
import inspect

class BaseObject(object):
    def __init__(self):
        # Grab the caller's frame (the subclass __init__) and copy
        # its named arguments onto the instance.
        frame = inspect.stack()[1][0]
        args, _, _, values = inspect.getargvalues(frame)
        for key in args:
            setattr(self, key, values[key])

class Sub(BaseObject):
    def __init__(self, argone, argtwo, argthree, argfour):
        super(Sub, self).__init__()
This works fine as written, but if you don't define Sub.__init__(), this will grab the arguments from the wrong place (i.e., from the function where you call Sub() to create an object). You might be able to use inspect to doublecheck that the caller is an __init__() function of a subclass of BaseObject. Or you could just move this code to a separate method, e.g., set_attributes_from_my_arguments() and call that from your subclasses' __init__() methods when you want this behavior.
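A minimal sketch of that separate-method variant (the name set_attributes_from_my_arguments is the illustrative name suggested above; the frame handling is the same inspect trick, with 'self' skipped so the instance doesn't end up with a self attribute):

```python
import inspect

class BaseObject(object):
    def set_attributes_from_my_arguments(self):
        # Inspect the caller's frame -- expected to be a subclass
        # __init__ -- and copy its named arguments onto self.
        frame = inspect.stack()[1][0]
        args, _, _, values = inspect.getargvalues(frame)
        for key in args:
            if key != 'self':
                setattr(self, key, values[key])

class Sub(BaseObject):
    def __init__(self, argone, argtwo):
        self.set_attributes_from_my_arguments()

s = Sub('hi', 'stack')  # s.argone == 'hi', s.argtwo == 'stack'
```

Because the method is called explicitly, there is no ambiguity about which frame holds the arguments.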
import inspect

class BaseObject(object):
    def __init__(self, args):
        del args['self']
        self.__dict__.update(args)

class Sub(BaseObject):
    def __init__(self, argone, argtwo, argthree, argfour):
        args = inspect.getargvalues(inspect.currentframe()).locals
        super(Sub, self).__init__(args)
s = Sub(1, 2, 3, 4)
Related
Is there a way to get an object's __init__ argument values in Python 2.7? I'm able to get the defaults through getargspec, but I would like to access the passed-in values.
import inspect

class AnObject(object):
    def __init__(self, kw='', *args, **kwargs):
        print 'Hello'

anobj = AnObject(kw='a keyword arg')
print inspect.getargspec(anobj.__init__)
Returns
Hello
ArgSpec(args=['self', 'kw'], varargs='args', keywords='kwargs', defaults=('',))
__init__ is treated no differently than any other function. So, like with any other function, its arguments are discarded once it returns -- unless you save them somewhere before that.
The standard approach is to save what you need later in attributes of the instance:
class Foo:
    def __init__(self, a, b, *args, **kwargs):
        self.a = a
        self.b = b
        # <etc>
"Dataclasses", introduced in Python 3.7, streamline this process but require type annotations:

import dataclasses

@dataclasses.dataclass
class Foo:
    a: int
    b: str
is equivalent to:
class Foo:
    def __init__(self, a: int, b: str):
        self.a = a
        self.b = b
Though see *Python decorator to automatically define __init__ variables* for why this streamlining is not always useful in practice.
You can store them as attributes.
class AnObject(object):
    def __init__(self, kw='', *args, **kwargs):
        self.kw = kw
        self.args = args
        self.kwargs = kwargs
then just print them:
anobj = AnObject(kw='a keyword arg')
print anobj.kw
print anobj.args
print anobj.kwargs
If you want to see them all at once, take a look at the instance's __dict__ attribute.
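For example, using the attribute-storing version of AnObject from above (the extra positional and keyword arguments are illustrative; Python 3 print shown):

```python
class AnObject(object):
    def __init__(self, kw='', *args, **kwargs):
        self.kw = kw
        self.args = args
        self.kwargs = kwargs

anobj = AnObject('a keyword arg', 1, 2, extra=True)
# __dict__ maps every instance attribute name to its value
print(anobj.__dict__)
# {'kw': 'a keyword arg', 'args': (1, 2), 'kwargs': {'extra': True}}
```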
class AppendiveDict(c.OrderedDict):
    def __init__(self, func, *args):
        c.OrderedDict.__init__(args)

    def __setitem__(self, key, value):
        if key in self:
            self[key] = func(self[key])
        else:
            c.OrderedDict.__setitem__(self, key, value)
I have this class that is supposed to apply the func function to items already in the dictionary. If I do c.OrderedDict.__init__(args) it tells me descriptor __init__ needs an ordereddict and not a tuple.
If I change c.OrderedDict.__init__(self) then an empty ordereddict shows up in the representation.
The solution is to pass both arguments, because c.OrderedDict.__init__ needs to know on which instance it's supposed to operate and with which arguments:
import collections as c

class AppendiveDict(c.OrderedDict):
    def __init__(self, func, *args):
        c.OrderedDict.__init__(self, args)  # pass self AND args
Or use super with args only, because super() returns a bound method that already knows which instance it operates on (zero-argument super() is Python 3 syntax):

class AppendiveDict(c.OrderedDict):
    def __init__(self, func, *args):
        super().__init__(args)
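A fuller runnable sketch, under the assumption that func should be stored on the instance (the original snippet references func inside __setitem__ without ever saving it, and assigning self[key] there would recurse):

```python
import collections as c

class AppendiveDict(c.OrderedDict):
    def __init__(self, func, *args):
        self.func = func  # save it before any pairs are inserted
        super().__init__(args)  # args is a tuple of (key, value) pairs

    def __setitem__(self, key, value):
        if key in self:
            # Replace the old value with func(old value); call the
            # parent's __setitem__ directly to avoid infinite recursion.
            c.OrderedDict.__setitem__(self, key, self.func(self[key]))
        else:
            c.OrderedDict.__setitem__(self, key, value)

d = AppendiveDict(lambda v: v + 1, ('a', 1), ('b', 2))
d['a'] = 99  # 'a' already exists, so func bumps the old value instead
```

After the assignment, d['a'] is 2 (func applied to the stored 1), matching the class's stated "apply func to existing items" behavior.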
I have a boiler platey class that delegates some actions to a reference class. It looks like this:
class MyClass():
    def __init__(self, someClass):
        self.refClass = someClass

    def action1(self):
        self.refClass.action1()

    def action2(self):
        self.refClass.action2()

    def action3(self):
        self.refClass.action3()
This is the refClass:
class RefClass():
    def __init__(self):
        self.myClass = MyClass(self)

    def action1(self):
        # Stuff to execute action1

    def action2(self):
        # Stuff to execute action2

    def action3(self):
        # Stuff to execute action3
I'd like to use Python Metaprogramming to make this more elegant and readable, but I'm not sure how.
I've heard of setattr and getattr, and I think I could do something like
class MyClass():
    def __init__(self, someClass):
        self.refClass = someClass
        for action in ['action1', 'action2', 'action3']:
            def _delegate(self):
                getattr(self.refClass, action)()
And then I know I need to do this from somewhere, I guess:
MyClass.setattr(action, delegate)
I just can't totally grasp this concept. I understand the basics about not repeating code and generating the methods in a for loop, but then I don't know how to call these methods from elsewhere. Heeeelp!
Python already includes support for generalized delegation to a contained class. Just change the definition of MyClass to:
class MyClass:
    def __init__(self, someClass):
        # Note: you call this someClass, but in your example it's
        # actually some object, not some class
        self.refClass = someClass

    def __getattr__(self, name):
        return getattr(self.refClass, name)
When defined, __getattr__ is called on the instance with the name of the accessed attribute any time an attribute is not found on the instance itself. You then delegate to the contained object by calling getattr to look up the attribute on the contained object and return it. This costs a little each time to do the dynamic lookup, so if you want to avoid it, you can lazily cache attributes when they're first requested by __getattr__, so subsequent access is direct:
def __getattr__(self, name):
    attr = getattr(self.refClass, name)
    # Cache on the instance so future lookups skip __getattr__
    setattr(self, name, attr)
    return attr
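A minimal usage sketch (the stand-in RefClass here just returns a string so the delegation is visible; the question's version wires itself to MyClass in its own __init__ instead):

```python
class RefClass(object):
    def action1(self):
        return 'ref action1'

class MyClass(object):
    def __init__(self, someClass):
        self.refClass = someClass

    def __getattr__(self, name):
        # Only called when normal attribute lookup fails,
        # so real attributes like refClass are found first.
        return getattr(self.refClass, name)

m = MyClass(RefClass())
result = m.action1()  # resolved on the contained object
```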
Personally, for delegating things I usually do something like this:
def delegate(prop_name, meth_name):
    def proxy(self, *args, **kwargs):
        prop = getattr(self, prop_name)
        meth = getattr(prop, meth_name)
        return meth(*args, **kwargs)
    return proxy

class MyClass(object):
    def __init__(self, someClass):
        self.refClass = someClass

    action1 = delegate('refClass', 'action1')
    action2 = delegate('refClass', 'action2')
This will create all delegate methods you need :)
For some explanation: the delegate function just creates a "proxy" function that acts as a class method (see the self argument?) and passes all arguments given to it on to the referenced object's method via *args and **kwargs.
You can create these with a list too, but I prefer the first form because it's more explicit to me :)
class MyClass(object):
    delegated_methods = ['action1', 'action2']

    def __init__(self, someClass):
        self.refClass = someClass
        for meth_name in self.delegated_methods:
            setattr(self, meth_name, delegate('refClass', meth_name))
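Either variant behaves the same from the caller's side; with a minimal stand-in for the referenced object:

```python
def delegate(prop_name, meth_name):
    def proxy(self, *args, **kwargs):
        # Look up the referenced object, then its method, then call it
        return getattr(getattr(self, prop_name), meth_name)(*args, **kwargs)
    return proxy

class Ref(object):
    def action1(self):
        return 'did action1'

class MyClass(object):
    def __init__(self, someClass):
        self.refClass = someClass

    action1 = delegate('refClass', 'action1')

m = MyClass(Ref())
result = m.action1()  # forwards to Ref.action1 via the proxy
```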
I have a base class, a bunch of subclasses, and for each of these subclasses, I have another set of sub-subclasses. For example:
class BaseClass(object):
    def __init__(self):
        with open('config.txt') as f:
            self.config_array = f.readlines()

class FirstOrderSubClass(BaseClass):
    def __init__(self, name):
        self.name = name

class SecondOrderSubClass(FirstOrderSubClass):
    def __init__(self, name, version):
        self.name = name
        self.version = version
        super(SecondOrderSubClass, self).__init__(self.name)
        # needed to access self.config_array
        print self.config_array
I need to get the __init__() method of the SecondOrderSubClass to make the following assignment: self.lines = self.config_array.
EDIT: added line print self.config_array. If I run the code I get:
TypeError: __getattr__() takes exactly 1 argument (2 given)
You cannot access self.config_array until BaseClass.__init__() has run to set the attribute.
Either fix FirstOrderSubClass to also invoke the base class __init__ or call it directly.
Fixing the FirstOrderSubClass is probably the best way to do so:
class FirstOrderSubClass(BaseClass):
    def __init__(self, name):
        super(FirstOrderSubClass, self).__init__()
        self.name = name
However, your __init__ method signatures do not match, so you cannot rely on cooperative behaviour here; as soon as you add a mix-in class to the hierarchy, things can and probably will break. See *Python's super() considered super!* by Raymond Hettinger, or its follow-up PyCon presentation, for why you want your signatures to match.
Calling the BaseClass.__init__ unbound method directly (passing in self explicitly) would also work:
class SecondOrderSubClass(FirstOrderSubClass):
    def __init__(self, name, version):
        super(SecondOrderSubClass, self).__init__(name)
        self.version = version
        BaseClass.__init__(self)
Note that there is no point in assigning to self.name there if you are going to ask FirstOrderSubClass.__init__ to do the exact same thing.
The proper way to use super() is for all your methods to at least accept all the same arguments. Since object.__init__() never does, this means you need a sentinel class that does not use super(); BaseClass will do nicely here. You can use *args and **kw to capture any additional arguments and just ignore those to make cooperative subclassing work:
class BaseClass(object):
    def __init__(self, *args, **kw):
        with open('config.txt') as f:
            self.config_array = f.readlines()

class FirstOrderSubClass(BaseClass):
    def __init__(self, name, *args, **kw):
        super(FirstOrderSubClass, self).__init__(*args, **kw)
        self.name = name

class SecondOrderSubClass(FirstOrderSubClass):
    def __init__(self, name, version, *args, **kw):
        super(SecondOrderSubClass, self).__init__(name, *args, **kw)
        self.version = version
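Driving that chain from the most derived class then runs every __init__ exactly once. A sketch with the config.txt read stubbed out so it runs standalone:

```python
class BaseClass(object):
    def __init__(self, *args, **kw):
        # Stand-in for the config.txt read in the original
        self.config_array = ['Hello world']

class FirstOrderSubClass(BaseClass):
    def __init__(self, name, *args, **kw):
        super(FirstOrderSubClass, self).__init__(*args, **kw)
        self.name = name

class SecondOrderSubClass(FirstOrderSubClass):
    def __init__(self, name, version, *args, **kw):
        super(SecondOrderSubClass, self).__init__(name, *args, **kw)
        self.version = version

s = SecondOrderSubClass('example', '1.0')
# s.name, s.version and s.config_array are all set:
# each __init__ in the chain ran exactly once.
```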
You have to call the FirstOrderSubClass super method:
class BaseClass(object):
    def __init__(self):
        with open("config.example.txt", 'w') as f:
            f.write("Hello world")
        with open("config.example.txt") as f:
            self.config_array = f.readlines()

class FirstOrderSubClass(BaseClass):
    def __init__(self, name):
        super(FirstOrderSubClass, self).__init__()
        self.name = name

class SecondOrderSubClass(FirstOrderSubClass):
    def __init__(self, name, version):
        self.name = name
        self.version = version
        super(SecondOrderSubClass, self).__init__(self.name)
        # needed to access self.config_array

grandchild = SecondOrderSubClass("peter", 2.0)
print grandchild.config_array
##>>>
##['Hello world']
If I write an inheritance relationship as follows:
class Pet(object):
    def __init__(self, n):
        print n

class Dog(Pet):
    def __init__(self, **kwargs):
        Pet.__init__(self, 5)
Then the output is 5. If, however, I wanted to do:
class Pet(object):
    def __init__(self, **kwargs):
        if "n" not in kwargs:
            raise ValueError("Please specify the number I am to print")
        print kwargs["n"]

class Dog(Pet):
    def __init__(self, **kwargs):
        Pet.__init__(self, kwargs)
Then I get the error TypeError: __init__() takes exactly one argument (two given)
How can I pass the extra arguments up the inheritance chain in this way?
Simply pass the arguments down as keyword arguments:
class Pet(object):
    def __init__(self, **kwargs):
        if "n" not in kwargs:
            raise ValueError("Please specify the number I am to print")
        print kwargs["n"]

class Dog(Pet):
    def __init__(self, **kwargs):
        Pet.__init__(self, **kwargs)
However, you should use super rather than hardcoding the superclass.
Change your definition of Dog to (Python 2.X):
class Dog(Pet):
    def __init__(self, **kwargs):
        super(Dog, self).__init__(**kwargs)
And in Python 3.X it's nicer, just:
class Dog(Pet):
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
Figured it out: use Pet.__init__(self, **kwargs).
It's called "unpacking", and it transforms the dictionary kwargs into a sequence of argument=value pairs.
The **kwargs in the parent constructor then collects them again. Just passing the dictionary does not work because **kwargs in the constructor expects keyword arguments, but I was passing it a single positional value: the dictionary kwargs from the child.
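A minimal illustration of the difference between passing the dict and unpacking it:

```python
def parent(**kwargs):
    # collects keyword arguments back into a dict
    return kwargs

d = {'n': 5}
# parent(d) would raise TypeError: d is one positional argument,
# and parent() accepts only keyword arguments.
result = parent(**d)  # unpacks to parent(n=5)
# result == {'n': 5}
```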