The example below is from a REST database driver on Python 2.7.
In the __setattr__ method below, using the commented-out getattr() line instead of the __dict__.get() line drops object-instantiation performance from about 600 requests per second to 230.
Why is getattr() so much slower than self.__dict__.get() in this case?
class Element(object):
    def __init__(self, client):
        self._client = client
        self._data = {}
        self._initialized = True

    def __setattr__(self, key, value):
        # _initialized = getattr(self, "_initialized", False)
        _initialized = self.__dict__.get("_initialized", False)
        if key in self.__dict__ or _initialized is False:
            # set the attribute normally
            object.__setattr__(self, key, value)
        else:
            # set the attribute as a data property
            self._data[key] = value
In short: because getattr(foo, 'bar') does the same thing as foo.bar, which is not the same thing as a single dictionary lookup. For a start, getattr has to select the right __dict__ by walking the type hierarchy and honouring descriptors, and when the attribute is missing, the three-argument form has to raise and then catch an AttributeError before returning the default. There is a whole lot more going on than self.__dict__.get().
An example for illustration:
>>> class A:
...     a = 1
...
>>> class B(A):
...     b = 2
...
>>> dir(B)
['__doc__', '__module__', 'a', 'b']
>>> B.a
1
>>> B.__dict__
{'__module__': '__main__', 'b': 2, '__doc__': None}
>>> B.__dict__['a']
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
KeyError: 'a'
>>> B.__dict__.get('a')
>>>
Details contained in, or linked to here: http://docs.python.org/reference/datamodel.html (search for "getattr").
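A quick way to see the gap yourself is a micro-benchmark along these lines (a rough sketch; the exact numbers depend on your machine and interpreter, and the bare Element class here is just a stand-in):

import timeit

setup = """
class Element(object):
    pass

e = Element()
"""

# getattr() with a default goes through the full attribute lookup and, when
# the attribute is missing, raises and swallows an AttributeError internally.
print(timeit.timeit('getattr(e, "_initialized", False)', setup=setup))

# __dict__.get() is a single method call on a plain dict.
print(timeit.timeit('e.__dict__.get("_initialized", False)', setup=setup))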
Related
I want to create an object with read-only attributes, and it needs to be initialized dynamically. Here is the situation I want:
readOnlyObject = ReadOnlyClass({'name': 'Tom', 'age': 24})
print(readOnlyObject.name)
>> 'Tom'
print(readOnlyObject.age)
>> 24
readOnlyObject.age = 14
>> AttributeError: can't set attribute
I found a way using the property function, but I think property only works on attributes that are pre-declared. Here is my code, where property doesn't work:
class ReadOnlyClass:
    _preDeclaredVar = "Read-Only!"
    preDeclaredVar = property(lambda self: self._preDeclaredVar)

    def __init__(self, data: dict):
        for attr in data:
            setattr(self, '_' + attr, data[attr])
            setattr(self, attr, property(lambda self: getattr(self, '_' + attr)))
readOnlyObject = ReadOnlyClass({'name': 'Tom', 'age': 24})
print(readOnlyObject.preDeclaredVar)
>> "Read-Only!"
readOnlyObject.preDeclaredVar = "Can write?"
>> AttributeError: can't set attribute 'preDeclaredVar'
print(readOnlyObject.name)
>> <property object at 0x016C62A0> # I think it is weird.. property func only work on pre-declared variable?
What happened? I want to know whether there is a way to create a read-only object dynamically.
Consider starting with __setattr__:
>>> class ReadOnlyClass:
...     def __init__(self, **kwargs):
...         self.__dict__.update(kwargs)
...
...     def __setattr__(self, key, value):
...         raise AttributeError("can't set attribute")
...
>>> readonly_object = ReadOnlyClass(name='Tom', age=24)
>>> readonly_object.name
'Tom'
>>> readonly_object.age
24
>>> readonly_object.age = 10
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<stdin>", line 6, in __setattr__
AttributeError: can't set attribute
However, this may not fully meet your expectations. You can still set the attributes through __dict__:
>>> readonly_object.__dict__['age'] = 10
>>> readonly_object.age
10
You can use named tuples:
>>> import collections
>>> def ReadOnlyClass(data):
...     class_ = collections.namedtuple('ReadOnlyClass', data.keys())
...     return class_(**data)
...
>>> readOnlyObject = ReadOnlyClass({'name': 'Tom', 'age': 24})
>>> print(readOnlyObject.name)
Tom
>>> print(readOnlyObject.age)
24
>>> readOnlyObject.age = 14
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
AttributeError: can't set attribute
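On Python 3.7+ there is a comparable dynamic option using dataclasses; this is only a sketch, assuming you are free to require that version (the factory name mirrors the question's):

import dataclasses

def ReadOnlyClass(data):
    # build a frozen dataclass whose fields are the dict's keys
    cls = dataclasses.make_dataclass('ReadOnlyClass', data.keys(), frozen=True)
    return cls(**data)

readOnlyObject = ReadOnlyClass({'name': 'Tom', 'age': 24})
print(readOnlyObject.name)   # Tom
print(readOnlyObject.age)    # 24
readOnlyObject.age = 14      # raises dataclasses.FrozenInstanceError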
Assume that I have a class as follows:
class MyClass(object):
    def __init__(self, value=None):
        self.attr = value

    @property
    def attr(self):
        # This acts as a getter?
        # Let's call the function "attr_1" as alias
        return self.__attr

    @attr.setter
    def attr(self, value):
        # This acts as a setter?
        # Let's call the function "attr_2" as alias
        self.__attr = value

inst = MyClass(1)
I read the Documentation on Descriptor and looked at the implementation of property class.
As far as I know, when I type inst.attr, the following happens:
1. The first attr (whose alias is attr_1) is found; attr is an instance of the property class, which is a data descriptor. Therefore, it overrides the instance dictionary, which means type(inst).__dict__['attr'].__get__(inst, type(inst)) is invoked.
2. attr.__get__(inst, type(inst)) invokes attr.fget(inst), where fget() is in fact the attr(self) (the "raw" attr_1 function).
3. Finally, attr.fget(inst) returns inst.__attr.
Here comes the first question: the class MyClass does not have an attribute __attr, so how should inst.__attr in step 3 be interpreted?
Similarly, in the emulated setter, how does Python find an attribute inst.__attr to assign the value?
And a trivial question: since property is a class, why not Property instead of property?
Your question is not actually about properties or the way they work as data descriptors. It is just the way Python fakes private attributes for names that start with two underscores.
>>> inst.__attr
Traceback (most recent call last):
File "<pyshell#4>", line 1, in <module>
inst.__attr
AttributeError: 'MyClass' object has no attribute '__attr'
Consider what happens if you write your code using an internal variable with a single underscore (the usual convention for "you shouldn't touch this, but I won't enforce it; do so at your own risk"):
>>> class MyClass2(object):
        def __init__(self, value=None):
            self.attr = value
        @property
        def attr(self):
            # This acts as a getter?
            # Let's call the function "attr_1" as alias
            return self._attr
        @attr.setter
        def attr(self, value):
            # This acts as a setter?
            # Let's call the function "attr_2" as alias
            self._attr = value
>>> inst2 = MyClass2(1)
>>> inst2._attr
1
And you can see the trick by peeking at the object's __dict__:
>>> inst2.__dict__
{'_attr': 1}
>>> inst.__dict__
{'_MyClass__attr': 1}
Just some more to convince you that this has nothing to do with properties:
>>> class OtherClass(object):
        def __init__(self, value):
            self.__attr = value
        def get_attr(self):
            return self.__attr
        def set_attr(self, value):
            self.__attr = value
>>> other_inst = OtherClass(1)
>>> other_inst.get_attr()
1
>>> other_inst.__attr
Traceback (most recent call last):
File "<pyshell#17>", line 1, in <module>
other_inst.__attr
AttributeError: 'OtherClass' object has no attribute '__attr'
>>> other_inst.__dict__
{'_OtherClass__attr': 1}
>>> other_inst._OtherClass__attr
1
>>> other_inst._OtherClass__attr = 24
>>> other_inst.get_attr()
24
>>> inst._MyClass__attr = 23
>>> inst.attr
23
Concerning your last question, I just don't think there is such a convention in Python that classes must start with an uppercase letter. property is not an isolated case (datetime, itemgetter, csv.reader, ...).
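To make the compile-time nature of the mangling explicit, here is a minimal sketch (the class and attribute names are made up for illustration): any identifier of the form __name written inside a class body is rewritten to _ClassName__name before the code ever runs.

class Demo(object):
    def __init__(self):
        self.__secret = 1        # actually stored under the key '_Demo__secret'

    def show(self):
        return self.__secret     # compiled as self._Demo__secret

d = Demo()
print(d._Demo__secret)           # 1 (the mangled name is the real attribute name)
d._Demo__secret = 99
print(d.show())                  # 99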
I'm adding callable objects to an instance of a class A at runtime using the __dict__ attribute. At some point I want to remove all of the added objects from my instance. I thought about storing the initial __dict__ in a member _orgDict and then executing self.__dict__ = self._orgDict later. I'm wondering:
Does this work at all?
Are the removed objects really deleted, or just no longer reachable through my instance?
Do you mean the del statement?
del instance.attribute
A quick test shows that reassigning an instance __dict__ seems to work:
>>> class B(object):
        pass
>>> b = B()
>>> b.b = 6
>>> b.b
6
>>> b.__dict__ = {}
>>> b.b
Traceback (most recent call last):
File "<pyshell#57>", line 1, in <module>
b.b
AttributeError: 'B' object has no attribute 'b'
However, I'm not sure whether this is guaranteed or whether it just happens to work. Especially if you need to support non-CPython implementations, you may want to be careful.
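If you would rather not rely on replacing __dict__ wholesale, a snapshot-and-delete approach avoids the question entirely; this is only a sketch with made-up names. Once the added callables are removed and nothing else references them, CPython reclaims them through ordinary reference counting.

class A(object):
    pass

a = A()
baseline = set(a.__dict__)               # attribute names present initially

a.helper = lambda: "added at runtime"    # callables added later
a.other = lambda: "another one"

# delete everything that was added after the snapshot
for name in set(a.__dict__) - baseline:
    delattr(a, name)

print(a.__dict__)                        # {}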
Yes, it is possible to override (delete) the objects by assignment. Here is an example.
>>> class callable_objects:
        def __init__(self, name, fame=None):
            self.name = name
            self.fame = fame
        def _name(self):
            if self.name[0] in ["a", "b", "c", "d", "e"]:
                self._fame("1")
            else:
                self._fame("2")
        def _fame(self, ifame):
            if ifame == "1":
                print "Ur fame is bad"
            else:
                print "Ur fame is very bad"
>>> c = callable_objects("ameet")
>>> callable_objects.__dict__
{'__module__': '__main__', '_fame': <function _fame at 0x02B5C370>, '__doc__': None, '__init__': <function __init__ at 0x02B5C330>, '_name': <function _name at 0x02B5C2F0>}
>>> c.__dict__
{'name': 'ameet', 'fame': None}
>>> callable_objects.__dict__ = c.__dict__
>>> callable_objects.__dict__
{'name': 'ameet', 'fame': None}
Let spam be an instance of some class Spam, and suppose that spam.ham is an object of some built-in type, say dict. Even though Spam is not a subclass of dict, I would like its instances to have the same API as a regular dict (i.e. the same methods with the same signatures), but I want to avoid typing out a bazillion boilerplate methods of the form:
def apimethod(self, this, that):
    return self.ham.apimethod(this, that)
I tried the following:
class Spam(object):
    def __init__(self):
        self.ham = dict()

    def __getattr__(self, attr):
        return getattr(self.ham, attr)
...but this only works for "regular" methods, like keys and items, not for special methods like __setitem__, __getitem__, and __len__:
>>> spam = Spam()
>>> spam.keys()
[]
>>> spam['eggs'] = 42
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
TypeError: 'Spam' object does not support item assignment
>>> spam.ham['eggs'] = 42
>>> spam.items()
[('eggs', 42)]
>>> spam['eggs']
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
TypeError: 'Spam' object is not subscriptable
>>> len(spam)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
TypeError: 'Spam' object has no len()
All the special methods I tried produced similar errors.
How can I automate the definition of special methods (so that they get forwarded to the delegate)?
Clarification: I'm not necessarily looking for solutions that leverage the standard method lookup sequence. My goal here is to minimize boilerplate code.
Thanks!
This may not be helpful if you need a solution that prohibits metaclasses as well, but here is the solution I came up with:
def _wrapper(func):
    def _wrapped(self, *args, **kwargs):
        return getattr(self.ham, func)(*args, **kwargs)
    return _wrapped

class DictMeta(type):
    def __new__(cls, name, bases, dct):
        default_attrs = dir(object)
        for attr in dir(dict):
            if attr not in default_attrs:
                dct[attr] = _wrapper(attr)
        return type.__new__(cls, name, bases, dct)

class Spam(object):
    __metaclass__ = DictMeta

    def __init__(self):
        self.ham = dict()
Seems to do what you're looking for:
>>> spam = Spam()
>>> spam['eggs'] = 42
>>> spam.items()
[('eggs', 42)]
>>> len(spam)
1
>>> spam.ham
{'eggs': 42}
On Python 3.x, use class Spam(object, metaclass=DictMeta) and remove the __metaclass__ line from the body of Spam.
This looks like a job for ... a metaclass!
def make_method(p, m):
    def method(self, *a, **k):
        return getattr(getattr(self, p), m)(*a, **k)
    return method

class Proxier(type):
    def __new__(cls, name, bases, dict):
        objs = dict.get('proxyobjs', [])
        if objs:
            old_init = dict.get('__init__', lambda self: None)
            def new_init(self, *a, **k):
                for (n, v) in objs.iteritems():
                    setattr(self, n, v())
                old_init(self, *a, **k)
            dict['__init__'] = new_init
        meths = dict.get('proxymethods', {})
        for (proxyname, methnames) in meths.iteritems():
            for methname in methnames:
                dict[methname] = make_method(proxyname, methname)
        return super(Proxier, cls).__new__(cls, name, bases, dict)

class Spam(object):
    __metaclass__ = Proxier
    proxyobjs = {'ham': dict,
                 'eggs': list,
                 }
    proxymethods = {'ham': ('__setitem__', '__getitem__', '__delitem__'),
                    'eggs': ('__contains__', 'append')
                    }
It works!
In [28]: s = Spam()
In [29]: s[4] = 'hi'
In [30]: s.append(3)
In [31]: 3 in s
Out[31]: True
In [32]: 4 in s
Out[32]: False
In [33]: s[4]
Out[33]: 'hi'
Note that you have to specify what parts of the interface you're using (otherwise, why not just inherit?). So we have __contains__ from list, and __getitem__ from dict, and the __iter__ from neither. (And only one way to mutate the underlying list, using append but not extend or __delitem__.) So (like Martian) I'm not sure how useful this will be.
Attribute access for special methods doesn't follow the normal attribute lookup rules; those methods MUST exist at the class level. See http://docs.python.org/reference/datamodel.html#special-method-lookup-for-new-style-classes
So you need to add all those methods either manually or programmatically, and the best way to do that programmatically is through a metaclass. Also note that I am not adding all of dict's methods, only the special methods, because the rest can easily be redirected through __getattr__.
def redirect(methodname):
    def _redirect(self, *args, **kwargs):
        print "redirecting", methodname
        method = getattr(self.ham, methodname)
        return method(*args, **kwargs)
    return _redirect

class DictRedirect(object):
    def __new__(cls, name, bases, attrs):
        # re-create all special methods from dict
        dict_attr_names = set(dir(dict))
        common_names = set(dir(cls))
        for methodname in dict_attr_names - common_names:
            if not methodname.startswith('__'):
                continue
            attrs[methodname] = redirect(methodname)
        return type(name, bases, attrs)

class Spam(object):
    __metaclass__ = DictRedirect

    def __init__(self):
        self.ham = dict()

    def __getattr__(self, name):
        return getattr(self.ham, name)

spam = Spam()
spam['eggs'] = 'yolk'
print 'keys =', spam.keys()
print spam['eggs']
output:
redirecting __setitem__
keys = ['eggs']
redirecting __getitem__
yolk
Disclaimer: IMO this is too much magic and should be avoided except for having fun :)
Not sure __getattribute__ will help; the reason is that special methods are looked up on the class, not on the instance: http://docs.python.org/reference/datamodel.html#special-method-lookup-for-new-style-classes (after all, special methods like __getattr__ and __getattribute__ themselves have to be looked up somewhere).
Proxying like this seems to be asking for trouble without careful thought: for example, how should things like __dict__ and __class__ behave, and what about possible method conflicts if your wrapper happens to have methods of its own? There are surely other problems too.
Re: is-a vs. has-a:
If you just duplicate the whole interface of the contained member, that seems like an anti-pattern to me; that is what inheritance is for. And what if you have has-a relations to two dict objects?
In a has-a relation, one usually picks the useful methods, often exporting them under different names to make a sensible API. So instead of Spam.append(item) you would have Spam.addBot(bot).
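One more option that sits between full delegation and full inheritance, offered here only as a sketch: derive from collections.MutableMapping (collections.abc.MutableMapping on Python 3), implement the five abstract methods against the contained dict, and let the mixin supply keys, items, get, and friends.

import collections

class Spam(collections.MutableMapping):   # collections.abc.MutableMapping on Python 3
    def __init__(self):
        self.ham = dict()

    def __getitem__(self, key):
        return self.ham[key]

    def __setitem__(self, key, value):
        self.ham[key] = value

    def __delitem__(self, key):
        del self.ham[key]

    def __iter__(self):
        return iter(self.ham)

    def __len__(self):
        return len(self.ham)

spam = Spam()
spam['eggs'] = 42
print(spam.items())   # the mixin methods (keys, items, get, ...) come for free
print(len(spam))      # 1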
Is there some way to make a class-level read-only property in Python? For instance, if I have a class Foo, I want to say:
x = Foo.CLASS_PROPERTY
but prevent anyone from saying:
Foo.CLASS_PROPERTY = y
EDIT:
I like the simplicity of Alex Martelli's solution, but not the syntax that it requires. Both his and ~unutbu's answers inspired the following solution, which is closer to the spirit of what I was looking for:
class const_value(object):
    def __init__(self, value):
        self.__value = value

    def make_property(self):
        return property(lambda cls: self.__value)

class ROType(type):
    def __new__(mcl, classname, bases, classdict):
        class UniqeROType(mcl):
            pass
        for attr, value in classdict.items():
            if isinstance(value, const_value):
                setattr(UniqeROType, attr, value.make_property())
                classdict[attr] = value.make_property()
        return type.__new__(UniqeROType, classname, bases, classdict)

class Foo(object):
    __metaclass__ = ROType
    BAR = const_value(1)
    BAZ = 2

class Bit(object):
    __metaclass__ = ROType
    BOO = const_value(3)
    BAN = 4
Now, I get:
Foo.BAR
# 1
Foo.BAZ
# 2
Foo.BAR=2
# Traceback (most recent call last):
# File "<stdin>", line 1, in <module>
# AttributeError: can't set attribute
Foo.BAZ=3
#
I prefer this solution because:
The members get declared inline instead of after the fact, as with type(X).foo = ...
The members' values are set in the actual class's code as opposed to in the metaclass's code.
It's still not ideal because:
I have to set the __metaclass__ in order for const_value objects to be interpreted correctly.
The const_values don't "behave" like plain values. For example, I couldn't use one as a default value for a parameter to a method in the class.
The existing solutions are a bit complex -- what about just ensuring that each class in a certain group has a unique metaclass, then setting a normal read-only property on the custom metaclass. Namely:
>>> class Meta(type):
...     def __new__(mcl, *a, **k):
...         uniquemcl = type('Uniq', (mcl,), {})
...         return type.__new__(uniquemcl, *a, **k)
...
>>> class X: __metaclass__ = Meta
...
>>> class Y: __metaclass__ = Meta
...
>>> type(X).foo = property(lambda *_: 23)
>>> type(Y).foo = property(lambda *_: 45)
>>> X.foo
23
>>> Y.foo
45
>>>
This is really much simpler, because it is based on nothing more than the fact that when you get an instance's attribute, descriptors are looked up on the class (so of course, when you get a class's attribute, descriptors are looked up on the metaclass), and making the class/metaclass unique isn't terribly hard.
Oh, and of course:
>>> X.foo = 67
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
AttributeError: can't set attribute
just to confirm it IS indeed read-only!
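For completeness, a sketch of the same recipe with Python 3 class syntax (only the way the metaclass is attached changes; nothing else is assumed):

class Meta(type):
    def __new__(mcl, *a, **k):
        uniquemcl = type('Uniq', (mcl,), {})
        return type.__new__(uniquemcl, *a, **k)

class X(metaclass=Meta):
    pass

type(X).foo = property(lambda *_: 23)
print(X.foo)    # 23
X.foo = 67      # AttributeError: the property on the metaclass has no setter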
The ActiveState solution that Pynt references makes instances of ROClass have read-only attributes. Your question seems to ask if the class itself can have read-only attributes.
Here is one way, based on Raymond Hettinger's comment:
#!/usr/bin/env python
def readonly(value):
    return property(lambda self: value)

class ROType(type):
    CLASS_PROPERTY = readonly(1)

class Foo(object):
    __metaclass__ = ROType

print(Foo.CLASS_PROPERTY)
# 1
Foo.CLASS_PROPERTY = 2
# AttributeError: can't set attribute
The idea is this: Consider first Raymond Hettinger's solution:
class Bar(object):
    CLASS_PROPERTY = property(lambda self: 1)

bar = Bar()
bar.CLASS_PROPERTY = 2
It shows a relatively simple way to give bar a read-only property.
Notice that you have to add the CLASS_PROPERTY = property(lambda self: 1)
line to the definition of the class of bar, not to bar itself.
So, if you want the class Foo to have a read-only property, then the class of Foo has to have CLASS_PROPERTY = property(lambda self: 1) defined. The class of a class is its metaclass. Hence we define ROType as the metaclass:
class ROType(type):
    CLASS_PROPERTY = readonly(1)
Then we make ROType the metaclass of Foo:
class Foo(object):
    __metaclass__ = ROType
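The same recipe under Python 3 syntax, as a sketch, simply replaces the __metaclass__ attribute with the metaclass keyword:

def readonly(value):
    return property(lambda self: value)

class ROType(type):
    CLASS_PROPERTY = readonly(1)

class Foo(metaclass=ROType):
    pass

print(Foo.CLASS_PROPERTY)   # 1
Foo.CLASS_PROPERTY = 2      # AttributeError: the property has no setter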
Found this on ActiveState:
# simple read only attributes with meta-class programming

# method factory for an attribute get method
def getmethod(attrname):
    def _getmethod(self):
        return self.__readonly__[attrname]
    return _getmethod

class metaClass(type):
    def __new__(cls, classname, bases, classdict):
        readonly = classdict.get('__readonly__', {})
        for name, default in readonly.items():
            classdict[name] = property(getmethod(name))
        return type.__new__(cls, classname, bases, classdict)

class ROClass(object):
    __metaclass__ = metaClass
    __readonly__ = {'a': 1, 'b': 'text'}

if __name__ == '__main__':
    def test1():
        t = ROClass()
        print t.a
        print t.b

    def test2():
        t = ROClass()
        t.a = 2

    test1()
Note that if you try to set a read-only attribute (t.a = 2), Python will raise an AttributeError.
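For example, running the second test above would end in a traceback roughly like this (a sketch of the expected session):

test2()
# Traceback (most recent call last):
#   ...
#   File "...", line ..., in test2
#     t.a = 2
# AttributeError: can't set attribute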