If I have a class instance with some attributes defined, how do I access them indirectly? My first thought was to put them in a dict and then access them with the keywords but that doesn't work as I expect - example below:
class Test:
    def __init__(self):
        self.testval = 0

test = Test()
testfuncs = {'A': test.testval}
print(test.testval)
testfuncs['A'] = 1
print(test.testval)
This prints '0' and then '0', because I have not modified the attribute on the instance; I've just replaced the dictionary value with the integer 1.
So I want to be able to access and modify the attribute testval indirectly. The reason is that in a larger program there are some instance attributes that I want to assign once and then reuse throughout; by just updating the dict they should change everywhere (they are voltage channels that may change as the application changes).
Use the getattr() function to get an attribute of an object if you have its name in a variable, and setattr() to set it in similar circumstances.
class Test:
    def __init__(self):
        self.testval = 0

test = Test()
A = "testval"
print(test.testval)
setattr(test, A, 1)
print(test.testval)
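For the read side, getattr() takes the same name string; continuing the snippet above:
print(getattr(test, A))  # prints 1, the value just set through setattr()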
You can also define your class to have a __setitem__ method; then you can use dictionary-like syntax to set attributes.
class Test:
    def __init__(self):
        self.testval = 0
    def __setitem__(self, key, value):
        setattr(self, key, value)

test = Test()
A = "testval"
print(test.testval)
test[A] = 1
print(test.testval)
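If you also want dictionary-style reads, the natural counterpart is a __getitem__ that forwards to getattr(); a minimal sketch along the same lines:
class Test:
    def __init__(self):
        self.testval = 0
    def __setitem__(self, key, value):
        setattr(self, key, value)
    def __getitem__(self, key):
        # forward dictionary-style reads to the attribute of the same name
        return getattr(self, key)

test = Test()
test['testval'] = 1
print(test['testval'])  # prints 1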
Finally (there are other ways to handle this, but I'll only mention one more), you could make a class that holds a reference to an object and an attribute name. This is convenient when you want to pass such references around.
class Test:
    def __init__(self):
        self.testval = 0

class IndirectAttribute:
    def __init__(self, obj, attr):
        self.obj = obj
        self.attr = attr
    def set(self, value):
        setattr(self.obj, self.attr, value)

test = Test()
A = IndirectAttribute(test, "testval")
print(test.testval)
A.set(1)
print(test.testval)
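The class above only writes through the reference; if you also want to read through it, a get() method is the natural complement. A minimal sketch, reusing Test from above:
class IndirectAttribute:
    def __init__(self, obj, attr):
        self.obj = obj
        self.attr = attr
    def set(self, value):
        setattr(self.obj, self.attr, value)
    def get(self):
        # read the referenced attribute back from the target object
        return getattr(self.obj, self.attr)

test = Test()
A = IndirectAttribute(test, "testval")
A.set(1)
print(A.get())  # prints 1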
You can set the value in the dictionary to be your test object:
class Test:
    def __init__(self):
        self.testval = 0

test = Test()
testfuncs = {'A': test}
print(test.testval)  # prints 0
testfuncs['A'].testval = 1
print(test.testval)  # prints 1
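For the voltage-channel use case in the question, the same pattern scales to several named instances; a sketch, with channel names invented purely for illustration:
class Channel:
    def __init__(self, voltage):
        self.voltage = voltage

# one shared registry, assigned once and reused throughout the program
channels = {'VCC': Channel(3.3), 'VREF': Channel(1.8)}

channels['VCC'].voltage = 5.0    # update through the dict...
print(channels['VCC'].voltage)   # ...and every holder of that Channel object sees 5.0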
Related
Is there a way in Python to pass a function call through to an inner object, maybe through a decorator or wrapper? In the example below, class A holds a list of class B objects, and one of the class B objects is selected as the active object. I want class A to function as a passthrough, just identifying which of the class B objects the call goes to.

However, class A doesn't know beforehand what type of class it is going to hold, so I can't just add a set_var function to class A. It has to work for any generic function that class B has. It will only ever hold one type of class in its objects list, so it could take class B as an input when it is instantiated and dynamically create functions, if that's a possibility. The client wouldn't know whether it's dealing with class A or class B. The code below is as far as I got.
class A:
    def __init__(self):
        self.objects = []
        self.current_object = 0
    def add_object(self, object):
        self.objects.append(object)

class B:
    def __init__(self):
        self.var = 10
    def set_var(self, new_var):
        self.var = new_var

a_obj = A()
b_obj1 = B()
b_obj2 = B()
a_obj.add_object(b_obj1)
a_obj.add_object(b_obj2)
a_obj.set_var(100)
You could use the generic __getattr__ to delegate to the wrapped object.
class A:
    def __init__(self):
        self.objects = []
        self.current_object = 0
    def add_object(self, obj):
        self.objects.append(obj)
        self.current_object = obj
    def __getattr__(self, name):
        return getattr(self.current_object, name)

class B:
    def __init__(self):
        self.var = 10
    def set_var(self, new_var):
        self.var = new_var

a_obj = A()
b_obj1 = B()
b_obj2 = B()
a_obj.add_object(b_obj1)
a_obj.add_object(b_obj2)
a_obj.set_var(100)
print(b_obj2.var)
That will print "100".
You will still get an AttributeError if the wrapped object doesn't have the expected method.
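For example, continuing the snippet above (missing_method is just a made-up name):
try:
    a_obj.missing_method()  # neither A nor the wrapped B defines this
except AttributeError as e:
    print(e)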
It was interesting to look at this; it is intentionally rough, but it does indeed allow you to call one of the B instances' set_var methods.
The code below uses sets as a quick and dirty way to find the callable methods that B has but A lacks, and for each such name it sets an attribute on the A instance, binding the B method to it.
This would only bind set_var once from the first object given.
def add_object(self, object):
    self.objects.append(object)
    B_methods = set([m for m in dir(object) if callable(getattr(object, m))])
    A_methods = set([m for m in dir(self) if callable(getattr(self, m))])
    to_set = B_methods.difference(A_methods)
    for method in to_set:
        setattr(self, method, getattr(object, method))
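For completeness, a rough sketch of how that add_object could be dropped into the A class from the question; only the first B added contributes its methods:
class A:
    def __init__(self):
        self.objects = []
        self.current_object = 0
    def add_object(self, object):
        self.objects.append(object)
        # bind every callable B has that A lacks directly onto this A instance
        B_methods = set([m for m in dir(object) if callable(getattr(object, m))])
        A_methods = set([m for m in dir(self) if callable(getattr(self, m))])
        for method in B_methods.difference(A_methods):
            setattr(self, method, getattr(object, method))

class B:
    def __init__(self):
        self.var = 10
    def set_var(self, new_var):
        self.var = new_var

a_obj = A()
b_obj1 = B()
a_obj.add_object(b_obj1)
a_obj.set_var(100)   # calls the bound set_var of b_obj1
print(b_obj1.var)    # prints 100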
Consider the following code snippet:
class super1():
    def __init__(self):
        self.variable = ''
    def setVariable(self, value):
        self.variable = value

class child(super1):
    def __init__(self):
        super1.__init__(self)
        self.setSuperVariable()
    def setSuperVariable(self):
        # according to this, variable should have the value 10
        self.setVariable(10)

super_instance = super1()
child1 = child()
print super_instance.variable
# prints nothing
super_instance.setVariable(20)
print super_instance.variable
As you can see, I have a base class and a derived class. I wanted the derived class to set "variable" so that it can be used elsewhere in the program too. For example, the child class performs some complex task and sets the variable, which is then used by other classes and functions.
But as of now, since the child class has its own instance, the change is not reflected outside its scope.
Is there a workaround for this problem?
class super1():
    def __init__(self):
        self.variable = ''
    def setVariable(self, value):
        self.variable = value

class child():
    def __init__(self, instance_of_super):
        self.handle = instance_of_super
        self.setSuperVariable()
    def setSuperVariable(self):
        # according to this, variable should have the value 10
        self.handle.setVariable(10)

super_instance = super1()
child1 = child(super_instance)
print super_instance.variable
# prints 10
super_instance.setVariable(20)
print super_instance.variable
This will set the variable, though I am not using inheritance. :)
The variable in the instance of super1 does not change when you modify the child instance because inheritance works at the class level. Once you create an instance, it has everything from its own class and its parents, but each instance is completely independent of the others; changes in one will not be reflected in another.
You could get that kind of side effect with class attributes, and if that is all you want, you don't need inheritance at all:
class MyClass:
    class_attribute = None

    @classmethod
    def set(cls, value):
        cls.class_attribute = value

    def do_computation(self):
        self.set(10)

a = MyClass()
b = MyClass()
print a.class_attribute
print b.class_attribute
a.do_computation()
print a.class_attribute
print b.class_attribute
The output is:
None
None
10
10
Currently __setattr__ only works for instances. Is there any similar method for a class? I am asking because I want to collect the list of defined attributes, in order, as the user defines them in a class, as below:
class CfgObj(object):
    _fields = []
    def __setattr__(self, name, value):
        self._fields.append([name, value])
        object.__setattr__(self, name, value)

class ACfg(CfgObj):
    setting1 = Field(str, default='set1', desc='setting1 ...')
    setting2 = Field(int, default=5, desc='setting2...')
I know the above code will not work as expected, because __setattr__ is only called for instance attribute assignments, as below:
acfg = ACfg()
acfg.c = 1
acfg._fields == [['c', 1]]
So, is there an equivalent of __setattr__ for a Python class? The main purpose is to collect the defined attributes, in order, as the user defines them in the class body.
Yes, but that's not how you want to do it.
class MC(type):
    def __init__(cls, name, bases, dct):
        print dct
        super(MC, cls).__init__(name, bases, dct)

class C(object):
    __metaclass__ = MC
    foo = 42
If you define __setattr__() on the metaclass of a class, it will be called when setting attributes on the class, but only after creating the class:
>>> class Meta(type):
...     def __setattr__(cls, name, value):
...         print "%s=%r" % (name, value)
...
>>> class A(object):
...     __metaclass__ = Meta
...
>>> A.a = 1
a=1
But it won't work at the time of class definition, so it's probably not what you want.
Getting the class attributes in the metaclass __init__() works, but you lose the order of definition (and multiple definitions as well).
What I would do to solve your problem (but not your question) is to create a counter of Field objects, rather than a timestamp of field creation, and stamp each created field with the current value of the counter:
class Field(object):
    count = 0
    def __init__(self, value, default=None, desc=None):
        self.value = value
        self.default = default
        self.desc = desc
        # Here comes the magic
        self.nth = Field.count
        Field.count += 1
        # self.created_at = time.time()
Then I would create a method that returns all fields ordered by their counter value:
class CfgObj(object):
    def params(self):
        ns = dir(self)
        fs = [getattr(self, field)
              for field in ns
              if isinstance(getattr(self, field), Field)]
        # fs = sorted(fs, key=lambda f: f.created_at)
        fs = sorted(fs, key=lambda f: f.nth)
        return fs
Its usage is intuitive:
class ACfg(CfgObj):
    setting1 = Field(str, default='set1', desc='setting1 ...')
    setting2 = Field(int, default=5, desc='setting2...')

print ACfg().params()
Clearly the fields are ordered by time of object creation, not field creation, but it can be enough for you. Is it?
I have this (Py2.7.2):
class MyClass(object):
    def __init__(self, dict_values):
        self.values = dict_values
        self.changed_values = {}  # this should track changes done to the values{}
    ....
I can use it like this:
var = MyClass()
var.values['age'] = 21
var.changed_values['age'] = 21
But I want to use it like this:
var.age = 21
print var.changed_values #prints {'age':21}
I suspect I can use properties to do that, but how?
UPDATE:
I don't know the dict contents at design time; they will only be known at run time, and the dict will likely not be empty.
You can create a class that inherits from dict and overrides the needed methods:
class D(dict):
    def __init__(self):
        self.changed_values = {}
        self.__initialized = True

    def __setitem__(self, key, value):
        self.changed_values[key] = value
        super(D, self).__setitem__(key, value)

    def __getattr__(self, item):
        """Maps values to attributes.
        Only called if there *isn't* an attribute with this name
        """
        try:
            return self.__getitem__(item)
        except KeyError:
            raise AttributeError(item)

    def __setattr__(self, item, value):
        """Maps attributes to values.
        Only if we are initialised
        """
        if not self.__dict__.has_key('_D__initialized'):  # this test allows attributes to be set in the __init__ method
            return dict.__setattr__(self, item, value)
        elif self.__dict__.has_key(item):  # any normal attributes are handled normally
            dict.__setattr__(self, item, value)
        else:
            self.__setitem__(item, value)

a = D()
a['hi'] = 'hello'
print a.hi
print a.changed_values
a.hi = 'wow'
print a.hi
print a.changed_values
a.test = 'test1'
print a.test
print a.changed_values
output
>>hello
>>{'hi': 'hello'}
>>wow
>>{'hi': 'wow'}
>>test1
>>{'hi': 'wow', 'test': 'test1'}
Properties (descriptors, really) will only help if the set of attributes to monitor is bounded. Simply file the new value away in the __set__() method of the descriptor.
If the set of attributes is arbitrary or unbounded then you will need to override MyClass.__setattr__() instead.
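A minimal sketch of the __setattr__ route, assuming every assignment other than the two bookkeeping dicts should be recorded in changed_values:
class MyClass(object):
    def __init__(self, dict_values):
        # bypass our own __setattr__ while creating the bookkeeping dicts
        object.__setattr__(self, 'values', dict_values)
        object.__setattr__(self, 'changed_values', {})

    def __setattr__(self, name, value):
        # every other assignment is routed into both dicts
        self.values[name] = value
        self.changed_values[name] = value

    def __getattr__(self, name):
        # only called when normal lookup fails; fall back to the values dict
        try:
            return self.values[name]
        except KeyError:
            raise AttributeError(name)

var = MyClass({'age': 20})
var.age = 21
print(var.changed_values)  # {'age': 21}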
You can use the property() built-in function.
This is preferred to overriding __getattr__ and __setattr__, as explained here.
class MyClass(object):
    def __init__(self):
        self.values = {}
        self.changed_values = {}

    def set_age(self, nr):
        self.values['age'] = nr
        self.changed_values['age'] = nr

    def get_age(self):
        return self.values['age']

    age = property(get_age, set_age)
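Usage then matches the attribute access asked for in the question:
m = MyClass()
m.age = 21                # goes through set_age()
print(m.changed_values)   # {'age': 21}
print(m.age)              # 21, via get_age()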
A python descriptor that I'm working with is sharing its value across all instances of its owner class. How can I make each instance's descriptor contain its own internal values?
class Desc(object):
    def __init__(self, initval=None, name='val'):
        self.val = initval
        self.name = name
    def __get__(self, obj, objtype):
        return self.val
    def __set__(self, obj, val):
        self.val = val
    def __delete__(self, obj):
        pass

class MyClass(object):
    desc = Desc(10, 'varx')

if __name__ == "__main__":
    c = MyClass()
    c.desc = 'max'
    d = MyClass()
    d.desc = 'sally'
    print(c.desc)
    print(d.desc)
The output is this; the last call set the value for both objects:
localhost $ python descriptor_testing.py
sally
sally
There is only one descriptor object, stored on the class object, so self is always the same. If you want to store data per-object and access it through the descriptor, you either have to store the data on each object (probably the better idea) or in some data-structure keyed by each object (an idea I don't like as much).
I would save data on the instance object:
class Desc(object):
    default_value = 10
    def __init__(self, name):
        self.name = name
    def __get__(self, obj, objtype):
        return obj.__dict__.get(self.name, self.default_value)
        # alternatively the following; but won't work with shadowing:
        # return getattr(obj, self.name, self.default_value)
    def __set__(self, obj, val):
        obj.__dict__[self.name] = val
        # alternatively the following; but won't work with shadowing:
        # setattr(obj, self.name, val)
    def __delete__(self, obj):
        pass

class MyClass(object):
    desc = Desc('varx')
In this case, the data will be stored in the obj's 'varx' entry in its __dict__. Because of how data descriptor lookup works though, you can "shadow" the storage location with the descriptor:
class MyClass(object):
varx = Desc('varx')
In this case, when you do the lookup:
MyClass().varx
The descriptor object gets called and can do its lookup, but when the lookup goes like this:
MyClass().__dict__['varx']
The value is returned directly. Thus the descriptor is able to store its data in a 'hidden' place, so to speak.
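Putting the shadowing variant to work with the Desc class above, a short demonstration of where the values end up:
class MyClass(object):
    varx = Desc('varx')  # descriptor name matches the instance __dict__ key it shadows

c = MyClass()
d = MyClass()
c.varx = 'max'
d.varx = 'sally'
print(c.varx)              # max
print(d.varx)              # sally
print(c.__dict__['varx'])  # 'max', the raw per-instance storage behind the descriptor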