Let spam be an instance of some class Spam, and suppose that spam.ham is an object of some built-in type, say dict. Even though Spam is not a subclass of dict, I would like its instances to have the same API as a regular dict (i.e. the same methods with the same signatures), but I want to avoid typing out a bazillion boilerplate methods of the form:
def apimethod(self, this, that):
    return self.ham.apimethod(this, that)
I tried the following:
class Spam(object):
    def __init__(self):
        self.ham = dict()

    def __getattr__(self, attr):
        return getattr(self.ham, attr)
...but while it works for "regular" methods, like keys and items, it fails for special methods, like __setitem__, __getitem__, and __len__:
>>> spam = Spam()
>>> spam.keys()
[]
>>> spam['eggs'] = 42
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
TypeError: 'Spam' object does not support item assignment
>>> spam.ham['eggs'] = 42
>>> spam.items()
[('eggs', 42)]
>>> spam['eggs']
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
TypeError: 'Spam' object is not subscriptable
>>> len(spam)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
TypeError: 'Spam' object has no len()
All the special methods I tried produced similar errors.
How can I automate the definition of special methods (so that they get referred to the delegate)?
Clarification: I'm not necessarily looking for solutions that leverage the standard method lookup sequence. My goal here is to minimize boilerplate code.
Thanks!
This may not be helpful if you need a solution that prohibits metaclasses as well, but here is the solution I came up with:
def _wrapper(func):
    def _wrapped(self, *args, **kwargs):
        return getattr(self.ham, func)(*args, **kwargs)
    return _wrapped

class DictMeta(type):
    def __new__(cls, name, bases, dct):
        default_attrs = dir(object)
        for attr in dir(dict):
            if attr not in default_attrs:
                dct[attr] = _wrapper(attr)
        return type.__new__(cls, name, bases, dct)

class Spam(object):
    __metaclass__ = DictMeta

    def __init__(self):
        self.ham = dict()
Seems to do what you're looking for:
>>> spam = Spam()
>>> spam['eggs'] = 42
>>> spam.items()
[('eggs', 42)]
>>> len(spam)
1
>>> spam.ham
{'eggs': 42}
If you are on Python 3.x, use class Spam(object, metaclass=DictMeta) and remove the __metaclass__ line from the body of Spam.
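For completeness, a minimal sketch of the Python 3 spelling (assuming the same _wrapper and DictMeta as above):

class Spam(object, metaclass=DictMeta):
    def __init__(self):
        self.ham = dict()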
This looks like a job for ... a metaclass!
def make_method(p, m):
    def method(self, *a, **k):
        return getattr(getattr(self, p), m)(*a, **k)
    return method

class Proxier(type):
    def __new__(cls, name, bases, dict):
        objs = dict.get('proxyobjs', [])
        if objs:
            old_init = dict.get('__init__', lambda self: None)
            def new_init(self, *a, **k):
                for (n, v) in objs.iteritems():
                    setattr(self, n, v())
                old_init(self, *a, **k)
            dict['__init__'] = new_init
        meths = dict.get('proxymethods', {})
        for (proxyname, methnames) in meths.iteritems():
            for methname in methnames:
                dict[methname] = make_method(proxyname, methname)
        return super(Proxier, cls).__new__(cls, name, bases, dict)

class Spam(object):
    __metaclass__ = Proxier
    proxyobjs = {'ham': dict,
                 'eggs': list,
                 }
    proxymethods = {'ham': ('__setitem__', '__getitem__', '__delitem__'),
                    'eggs': ('__contains__', 'append'),
                    }
It works!
In [28]: s = Spam()
In [29]: s[4] = 'hi'
In [30]: s.append(3)
In [31]: 3 in s
Out[31]: True
In [32]: 4 in s
Out[32]: False
In [33]: s[4]
Out[33]: 'hi'
Note that you have to specify which parts of the interface you're using (otherwise, why not just inherit?). So we have __contains__ from list, __getitem__ from dict, and __iter__ from neither. (And only one way to mutate the underlying list, using append but not extend or __delitem__.) So (like Martian) I'm not sure how useful this will be.
Attribute access for special methods doesn't obey the normal attribute lookup rules; those methods MUST exist at the class level. See http://docs.python.org/reference/datamodel.html#special-method-lookup-for-new-style-classes
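A quick demonstration of this class-level lookup (a minimal sketch of my own, not taken from the linked page):

>>> class C(object):
...     pass
...
>>> c = C()
>>> c.__len__ = lambda: 3  # attached to the instance; len() ignores it
>>> len(c)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: object of type 'C' has no len()
>>> C.__len__ = lambda self: 3  # attached to the class, where len() looks
>>> len(c)
3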
So you need to add all those methods, either manually or programmatically, and the best way to do the latter is through a metaclass. Also note that I am not adding all of dict's methods, only the special methods, because the rest can easily be redirected through __getattr__:
def redirect(methodname):
    def _redirect(self, *args, **kwargs):
        print "redirecting", methodname
        method = getattr(self.ham, methodname)
        return method(*args, **kwargs)
    return _redirect

class DictRedirect(object):
    def __new__(cls, name, bases, attrs):
        # re-create all special methods from dict
        dict_attr_names = set(dir(dict))
        common_names = set(dir(cls))
        for methodname in dict_attr_names - common_names:
            if not methodname.startswith('__'):
                continue
            attrs[methodname] = redirect(methodname)
        return type(name, bases, attrs)

class Spam(object):
    __metaclass__ = DictRedirect

    def __init__(self):
        self.ham = dict()

    def __getattr__(self, name):
        return getattr(self.ham, name)

spam = Spam()
spam['eggs'] = 'yolk'
print 'keys =', spam.keys()
print spam['eggs']
output:
redirecting __setitem__
keys = ['eggs']
redirecting __getitem__
yolk
Disclaimer: IMO this is too much magic and should be avoided except for having fun :)
Not sure __getattribute__ will help: the reason is that special methods are looked up on the class, not on the instance (see http://docs.python.org/reference/datamodel.html#special-method-lookup-for-new-style-classes), since special methods like __getattr__ and __getattribute__ themselves have to be looked up somewhere.
Proxying like this seems to be asking for trouble without careful thought: for example, how should things like __dict__ and __class__ behave, and what about possible method conflicts if your wrapper happens to define any methods of its own? And surely there are other problems.
Re: is-a vs. has-a:
If you just duplicate the whole interface of a contained member, that seems like an anti-pattern to me, as that's what inheritance is for. And what if you have has-a relations to two dict objects?
In a has-a relation, one usually picks the useful methods, often exporting them under different names to make a sensible API. So instead of Spam.append(item) you would have Spam.addBot(bot), as in the sketch below.
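For illustration, a minimal sketch of such a narrow has-a API (the names addBot and _bots here are hypothetical):

class Spam(object):
    def __init__(self):
        self._bots = []  # has-a list, but the list API is not exposed

    def addBot(self, bot):
        # export only the operations that make sense for Spam
        self._bots.append(bot)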
Related
My class has a dict, for example:
class MyClass(object):
    def __init__(self):
        self.data = {'a': 'v1', 'b': 'v2'}
Then I want to use the dict's key with MyClass instance to access the dict, for example:
ob = MyClass()
v = ob.a  # Here I expect ob.a to return 'v1'
I know this should be implemented with __getattr__, but I'm new to Python and don't know exactly how to implement it.
class MyClass(object):
    def __init__(self):
        self.data = {'a': 'v1', 'b': 'v2'}

    def __getattr__(self, attr):
        return self.data[attr]
>>> ob = MyClass()
>>> v = ob.a
>>> v
'v1'
Be careful when implementing __setattr__ though, you will need to make a few modifications:
class MyClass(object):
    def __init__(self):
        # Set 'data' via the parent's __setattr__ to prevent infinite
        # recursion: now that we define __setattr__, a plain
        # `self.data = {...}` would call it, and the `self.data[k]` inside
        # would look up self.data before it exists. That lookup falls
        # through to __getattr__, whose own `self.data[k]` triggers
        # __getattr__ again, and so on. So we set data manually, once.
        super(MyClass, self).__setattr__('data', {'a': 'v1', 'b': 'v2'})

    def __setattr__(self, k, v):
        self.data[k] = v

    def __getattr__(self, k):
        # No special call to super is needed here because __getattr__ is
        # only called when an attribute is NOT found in the instance's
        # dictionary.
        try:
            return self.data[k]
        except KeyError:
            raise AttributeError(k)
>>> ob = MyClass()
>>> ob.c = 1
>>> ob.c
1
If you don't need to set attributes just use a namedtuple
eg.
>>> from collections import namedtuple
>>> MyClass = namedtuple("MyClass", ["a", "b"])
>>> ob = MyClass(a=1, b=2)
>>> ob.a
1
If you want the default arguments you can just write a wrapper class around it:
class MyClass(namedtuple("MyClass", ["a", "b"])):
    def __new__(cls, a="v1", b="v2"):
        return super(MyClass, cls).__new__(cls, a, b)
or maybe it looks nicer as a function:
def MyClass(a="v1", b="v2", cls=namedtuple("MyClass", ["a", "b"])):
    return cls(a, b)
>>> ob = MyClass()
>>> ob.a
'v1'
Late to the party, but found two really good resources that explain this better (IMHO).
As explained here, you should use self.__dict__ to access fields from within __getattr__, in order to avoid infinite recursion. The example provided is:
def __getattr__(self, attrName):
    if not self.__dict__.has_key(attrName):
        value = self.fetchAttr(attrName)  # computes the value
        self.__dict__[attrName] = value
    return self.__dict__[attrName]
Note: a more Pythonic way to write the second line above would be (has_key was removed in Python 3):
if attrName not in self.__dict__:
The other resource explains that __getattr__ is invoked only when the attribute is not found on the object, and that hasattr always returns True if there is an implementation for __getattr__ that never raises. It provides the following example to demonstrate:
class Test(object):
    def __init__(self):
        self.a = 'a'
        self.b = 'b'

    def __getattr__(self, name):
        return 123456
t = Test()
print 'object variables: %r' % t.__dict__.keys()
#=> object variables: ['a', 'b']
print t.a
#=> a
print t.b
#=> b
print t.c
#=> 123456
print getattr(t, 'd')
#=> 123456
print hasattr(t, 'x')
#=> True
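If you want hasattr to answer truthfully, raise AttributeError from __getattr__ for the names you don't handle (a small sketch of my own, not from the cited resource):

class Test2(object):
    def __getattr__(self, name):
        if name == 'c':
            return 123456
        # raising AttributeError restores the usual hasattr semantics
        raise AttributeError(name)

t = Test2()
print hasattr(t, 'c')
#=> True
print hasattr(t, 'x')
#=> False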
class A(object):
    def __init__(self):
        self.data = {'a': 'v1', 'b': 'v2'}

    def __getattr__(self, attr):
        try:
            return self.data[attr]
        except KeyError:
            return "not found"
>>> a = A()
>>> print a.a
v1
>>> print a.c
not found
Therefore, this is the approach I like to use.
I took it from somewhere, but I don't remember where.
class A(dict):
    def __init__(self, *a, **k):
        super(A, self).__init__(*a, **k)
        self.__dict__ = self
This makes the __dict__ of the object the same as itself, so that attribute and item access map to the same dict:
a = A()
a['a'] = 2
a.b = 5
print a.a, a['b'] # prints 2 5
I figured out an extension to #glglgl's answer that handles nested dictionaries and dictionaries insides lists that are in the original dictionary:
class d(dict):
    def __init__(self, *a, **k):
        super(d, self).__init__(*a, **k)
        self.__dict__ = self
        for k in self.__dict__:
            if isinstance(self.__dict__[k], dict):
                self.__dict__[k] = d(self.__dict__[k])
            elif isinstance(self.__dict__[k], list):
                for i in range(len(self.__dict__[k])):
                    if isinstance(self.__dict__[k][i], dict):
                        self.__dict__[k][i] = d(self.__dict__[k][i])
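For example, with a hypothetical nested input:

conf = d({'x': {'y': 1}, 'l': [{'z': 2}]})
print conf.x.y     # prints 1
print conf.l[0].z  # prints 2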
A simple approach to solving your __getattr__()/__setattr__() infinite recursion woes
Implementing one or the other of these magic methods can usually be easy. But when overriding them both, it becomes trickier. This post's examples apply mostly to this more difficult case.
When implementing both of these magic methods, it's not uncommon to get stuck figuring out a strategy to get around recursion in the __init__() constructor of classes. This is because variables need to be initialized for the object, but every attempt to read or write those variables goes through __get/set/attr__(), which could touch more unset variables, incurring more futile recursive calls.
Up front, a key point to remember is that __getattr__() only gets called by the runtime if the attribute can't be found on the object already. The trouble is to get attributes defined without tripping these functions recursively.
Another point is that __setattr__() gets called no matter what. That's an important distinction between the two functions, and it is why implementing both attribute methods can be tricky.
This is one basic pattern that solves the problem.
class AnObjectProxy:
    _initialized = False  # *Class* variable 'constant'.

    def __init__(self):
        self._any_var = "Able to access instance vars like usual."
        self._initialized = True  # *Instance* variable.

    def __getattr__(self, item):
        if self._initialized:
            pass  # Provide the caller attributes in whatever ways interest you.
        else:
            try:
                return self.__dict__[item]  # Transparent access to instance vars.
            except KeyError:
                raise AttributeError(item)

    def __setattr__(self, key, value):
        if self._initialized:
            pass  # Provide caller ways to set attributes in whatever ways.
        else:
            self.__dict__[key] = value  # Transparent access.
While the class is initializing and creating its instance vars, the code in both attribute functions permits access to the object's attributes via the __dict__ dictionary transparently: your code in __init__() can create and access instance attributes normally. When the attribute methods are called, they only access self.__dict__, which is already defined, thus avoiding recursive calls.
In the case of self._any_var, once it's assigned, __get/set/attr__() won't be called to find it again.
Stripped of extra code, these are the two pieces that are most important.
...     def __getattr__(self, item):
...         try:
...             return self.__dict__[item]
...         except KeyError:
...             raise AttributeError(item)
...
...     def __setattr__(self, key, value):
...         self.__dict__[key] = value
Solutions can build around these lines accessing the __dict__ dictionary. To implement an object proxy, two modes were implemented in the code before this, initialization and post-initialization; a more detailed example of the same is below.
There are other examples in answers that may have differing levels of effectiveness in dealing with all aspects of recursion. One effective approach is accessing __dict__ directly in __init__() and other places that need early access to instance vars. This works but can be a little verbose. For instance,
self.__dict__['_any_var'] = "Setting..."
would work in __init__().
My posts tend to get a little long-winded; everything after this point is just extra. You should already have the idea from the examples above.
A drawback to some other approaches can be seen with debuggers in IDEs. They can be overzealous in their use of introspection and produce warning and error-recovery messages as you're stepping through code. You can see this happening even with solutions that work fine standalone. When I say all aspects of recursion, this is what I'm talking about.
The examples in this post only use a single class variable to support 2-modes of operation, which is very maintainable.
But please NOTE: the proxy class required two modes of operation to set up and proxy for an internal object. You don't have to have two modes of operation.
You could simply incorporate the code to access the __dict__ as in these examples in whatever ways suit you.
If your requirements don't include two modes of operation, you may not need to declare any class variables at all. Just take the basic pattern and customize it.
Here's a closer to real-world (but by no means complete) example of a 2-mode proxy that follows the pattern:
>>> class AnObjectProxy:
...     _initialized = False  # This class var is important. It is always False.
...                           # The instances will override this with their own,
...                           # set to True.
...     def __init__(self, obj):
...         # Because __getattr__ and __setattr__ access __dict__, we can
...         # initialize instance vars without infinite recursion, and
...         # refer to them normally.
...         self._obj = obj
...         self._foo = 123
...         self._bar = 567
...
...         # This instance var overrides the class var.
...         self._initialized = True
...
...     def __setattr__(self, key, value):
...         if self._initialized:
...             setattr(self._obj, key, value)  # Proxying call to wrapped obj.
...         else:
...             # this block facilitates setting vars in __init__().
...             self.__dict__[key] = value
...
...     def __getattr__(self, item):
...         if self._initialized:
...             attr = getattr(self._obj, item)  # Proxying.
...             return attr
...         else:
...             try:
...                 # this block facilitates getting vars in __init__().
...                 return self.__dict__[item]
...             except KeyError:
...                 raise AttributeError(item)
...
...     def __call__(self, *args, **kwargs):
...         return self._obj(*args, **kwargs)
...
...     def __dir__(self):
...         return dir(self._obj) + list(self.__dict__.keys())
The 2-mode proxy only needs a bit of "bootstrapping" to access vars in its own scope at initialization, before any of its vars are set. After initialization, the proxy has no reason to create more vars for itself, so it will fare fine by deferring all attribute calls to its wrapped object.
Any attribute the proxy itself owns will still be accessible to itself and other callers since the magic attribute functions only get called if an attribute can't be found immediately on the object.
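For example, a quick sketch of the proxy in use (the wrapped list is arbitrary):

>>> p = AnObjectProxy([1, 2, 3])
>>> p.append(4)  # not found on the proxy, so delegated to the wrapped list
>>> p._obj       # the proxy's own instance vars are found in __dict__ as usual
[1, 2, 3, 4]
>>> p._foo
123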
Hopefully this approach can be of benefit to anyone who appreciates a direct approach to resolving their __get/set/attr__() __init__() frustrations.
You can initialize your class dictionary through the constructor:
def __init__(self, **data):
And call it as follows:
f = MyClass(**{'a': 'v1', 'b': 'v2'})
All of the instance attributes that are accessed (read) inside __setattr__ need to be declared via the parent (super) method, only once:
super().__setattr__('NewVarName1', InitialValue)
Or
super().__setattr__('data', dict())
Thereafter, they can be accessed or assigned to in the usual manner:
self.data = data
And instance attributes not being accessed in __setattr__, can be declared in the usual manner:
self.x = 1
The overridden __setattr__ method must now call the parent method inside itself, for new variables to be declared:
super().__setattr__(key, value)
A complete class would look as follows:
class MyClass(object):
    def __init__(self, **data):
        # The variable self.data is used by method __setattr__
        # inside this class, so we need to declare it first,
        # using the parent __setattr__ method:
        super().__setattr__('data', dict())
        self.data = data
        # These declarations will jump to
        # super().__setattr__(key, value)
        # inside method __setattr__ of this class:
        self.x = 1
        self.y = 2

    def __getattr__(self, name):
        # This callback will never be called for instance variables
        # that have been declared before being accessed.
        if name in self.data:
            # Return a valid dictionary item:
            return self.data[name]
        else:
            # When an instance variable is accessed that has not been
            # declared before, and is not contained in the dictionary
            # 'data', an AttributeError needs to be raised.
            raise AttributeError

    def __setattr__(self, key, value):
        if key in self.data:
            # Assign valid dictionary items here:
            self.data[key] = value
        else:
            # Assign anything else as an instance attribute:
            super().__setattr__(key, value)
Test:
f = MyClass(**{'a': 'v1', 'b': 'v2'})
print("f.a = ", f.a)
print("f.b = ", f.b)
print("f.data = ", f.data)
f.a = 'c'
f.d = 'e'
print("f.a = ", f.a)
print("f.b = ", f.b)
print("f.data = ", f.data)
print("f.d = ", f.d)
print("f.x = ", f.x)
print("f.y = ", f.y)
# Should raise AttributeError
print("f.g = ", f.g)
Output:
f.a = v1
f.b = v2
f.data = {'a': 'v1', 'b': 'v2'}
f.a = c
f.b = v2
f.data = {'a': 'c', 'b': 'v2'}
f.d = e
f.x = 1
f.y = 2
Traceback (most recent call last):
File "MyClass.py", line 49, in <module>
print("f.g = ", f.g)
File "MyClass.py", line 25, in __getattr__
raise AttributeError
AttributeError
I think this implementation is cooler:
class MyClass(object):
    def __init__(self):
        self.data = {'a': 'v1', 'b': 'v2'}

    def __getattr__(self, key):
        return self.data.get(key, None)
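Note, though, that returning None as a default means a missing name never raises AttributeError, so hasattr reports True for everything (a quick check):

>>> ob = MyClass()
>>> print(ob.missing)
None
>>> hasattr(ob, 'anything_at_all')
True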
It is fairly easy to use the __getattr__ special method on Python classes to handle either missing properties or functions, but seemingly not both at the same time.
Consider this example which handles any property requested which is not defined explicitly elsewhere in the class...
class Props:
    def __getattr__(self, attr):
        return 'some_new_value'
>>> p = Props()
>>> p.prop # Property get handled
'some_new_value'
>>> p.func('an_arg', kw='keyword') # Function call NOT handled
Traceback (most recent call last):
File "<console>", line 1, in <module>
TypeError: 'str' object is not callable
Next, consider this example which handles any function call not defined explicitly elsewhere in the class...
class Funcs:
    def __getattr__(self, attr):
        def fn(*args, **kwargs):
            # Do something with the function name and any passed arguments or keywords
            print attr
            print args
            print kwargs
            return
        return fn
>>> f = Funcs()
>>> f.prop # Property get NOT handled
<function fn at 0x10df23b90>
>>> f.func('an_arg', kw='keyword') # Function call handled
func
('an_arg',)
{'kw': 'keyword'}
The question is how to handle both types of missing attributes in the same __getattr__? How to detect if the attribute requested was in property notation or in method notation with parentheses and return either a value or a function respectively? Essentially I want to handle SOME missing property attributes AND SOME missing function attributes and then resort to default behavior for all the other cases.
Advice?
How to detect if the attribute requested was in property notation or in method notation with parentheses and return either a value or a function respectively?
You can't. You also can't tell whether a requested method is an instance, class, or static method, etc. All you can tell is that someone is trying to retrieve an attribute for read access. Nothing else is passed into the getattribute machinery, so nothing else is available to your code.
So, you need some out-of-band way to know whether to create a function or some other kind of value. This is actually pretty common—you may actually be proxying for some other object that does have a value/function distinction (think of ctypes or PyObjC), or you may have a naming convention, etc.
However, you could always return an object that can be used either way. For example, if your "default behavior" is to return attributes that are integers, or functions that return an integer, you can return something like this:
class Integerizer(object):
    def __init__(self, value):
        self.value = value

    def __int__(self):
        return self.value

    def __call__(self, *args, **kw):
        return self.value
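For instance (a quick sketch):

>>> n = Integerizer(42)
>>> int(n)       # used as a value
42
>>> n('an_arg')  # used as a function
42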
There is no way to detect how the returned attribute is intended to be used. Everything on Python objects is an attribute, including the methods:
>>> class Foo(object):
...     def bar(self): print 'bar called'
...     spam = 'eggs'
...
>>> Foo.bar
<unbound method Foo.bar>
>>> Foo.spam
'eggs'
Python first looks up the attribute (bar or spam), and if you meant to call it (added parentheses) then Python invokes the callable after looking up the attribute:
>>> foo = Foo()
>>> fbar = foo.bar
>>> fbar()
bar called
In the above code I separated the lookup of bar from calling bar.
Since there is no distinction, you cannot detect in __getattr__ what the returned attribute will be used for.
__getattr__ is called whenever normal attribute access fails; in the following example monty is defined on the class, so __getattr__ is not called; it is only called for bar.eric and bar.john:
>>> class Bar(object):
...     monty = 'python'
...     def __getattr__(self, name):
...         print 'Attribute access for {0}'.format(name)
...         if name == 'eric':
...             return 'idle'
...         raise AttributeError(name)
...
>>> bar = Bar()
>>> bar.monty
'python'
>>> bar.eric
Attribute access for eric
'idle'
>>> bar.john
Attribute access for john
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<stdin>", line 7, in __getattr__
AttributeError: john
Note that functions are not the only objects that you can invoke (call); any custom class that implements the __call__ method will do:
>>> class Baz(object):
...     def __call__(self, name):
...         print 'Baz sez: "Hello {0}!"'.format(name)
...
>>> baz = Baz()
>>> baz('John Cleese')
Baz sez: "Hello John Cleese!"
You could use that: return objects from __getattr__ that can both be called and be used as a value in different contexts.
I'm trying to intercept calls to Python's double underscore magic methods in new style classes. This is a trivial example but it shows the intent:
class ShowMeList(object):
    def __init__(self, it):
        self._data = list(it)

    def __getattr__(self, name):
        attr = object.__getattribute__(self._data, name)
        if callable(attr):
            def wrapper(*a, **kw):
                print "before the call"
                result = attr(*a, **kw)
                print "after the call"
                return result
            return wrapper
        return attr
If I use that proxy object around list I get the expected behavior for non-magic methods but my wrapper function is never called for magic methods.
>>> l = ShowMeList(range(8))
>>> l #call to __repr__
<__main__.ShowMeList object at 0x9640eac>
>>> l.append(9)
before the call
after the call
>>> len(l._data)
9
If I don't inherit from object (first line class ShowMeList:) everything works as expected:
>>> l = ShowMeList(range(8))
>>> l #call to __repr__
before the call
after the call
[0, 1, 2, 3, 4, 5, 6, 7]
>>> l.append(9)
before the call
after the call
>>> len(l._data)
9
How do I accomplish this intercept with new style classes?
For performance reasons, Python always looks in the class (and parent classes') __dict__ for magic methods and does not use the normal attribute lookup mechanism. A workaround is to use a metaclass to automatically add proxies for magic methods at the time of class creation; I've used this technique to avoid having to write boilerplate call-through methods for wrapper classes, for example.
class Wrapper(object):
    """Wrapper class that provides proxy access to some internal instance."""

    __wraps__ = None
    __ignore__ = "class mro new init setattr getattr getattribute"

    def __init__(self, obj):
        if self.__wraps__ is None:
            raise TypeError("base class Wrapper may not be instantiated")
        elif isinstance(obj, self.__wraps__):
            self._obj = obj
        else:
            raise ValueError("wrapped object must be of %s" % self.__wraps__)

    # provide proxy access to regular attributes of wrapped object
    def __getattr__(self, name):
        return getattr(self._obj, name)

    # create proxies for wrapped object's double-underscore attributes
    class __metaclass__(type):
        def __init__(cls, name, bases, dct):
            def make_proxy(name):
                def proxy(self, *args):
                    return getattr(self._obj, name)
                return proxy
            type.__init__(cls, name, bases, dct)
            if cls.__wraps__:
                ignore = set("__%s__" % n for n in cls.__ignore__.split())
                for name in dir(cls.__wraps__):
                    if name.startswith("__"):
                        if name not in ignore and name not in dct:
                            setattr(cls, name, property(make_proxy(name)))
Usage:
class DictWrapper(Wrapper):
    __wraps__ = dict

wrapped_dict = DictWrapper(dict(a=1, b=2, c=3))

# make sure it worked....
assert "b" in wrapped_dict                        # __contains__
assert wrapped_dict == dict(a=1, b=2, c=3)        # __eq__
assert "'a': 1" in str(wrapped_dict)              # __str__
assert wrapped_dict.__doc__.startswith("dict()")  # __doc__
Using __getattr__ and __getattribute__ is the last resort of a class to respond to getting an attribute.
Consider the following:
>>> class C:
...     x = 1
...     def __init__(self):
...         self.y = 2
...     def __getattr__(self, attr):
...         print(attr)
>>> c = C()
>>> c.x
1
>>> c.y
2
>>> c.z
z
The __getattr__ method is only called when nothing else works (it will not work on operators, and you can read about that here).
In your example, __repr__ and many other magic methods are already defined in the object class.
One thing can be done, though: define those magic methods yourself and make them call the __getattr__ method. Check this other question by me and its answers (link) to see some code doing that.
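For example, a minimal sketch of that idea applied to the question's ShowMeList (assuming its __init__ and __getattr__ stay exactly as written there):

class ShowMeList(object):
    # ... __init__ and __getattr__ as in the question ...

    def __repr__(self):
        # route the magic method through __getattr__ explicitly
        return self.__getattr__('__repr__')()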
As per the answers to Asymmetric behavior for __getattr__, newstyle vs oldstyle classes (see also the Python docs), modifying access to "magic" methods with __getattr__ or __getattribute__ is just not possible with new-style classes. This restriction makes the interpreter much faster.
Copied from the documentation:
For old-style classes, special methods are always looked up in exactly the same way as any other method or attribute.
For new-style classes, implicit invocations of special methods are only guaranteed to work correctly if defined on an object’s type, not in the object’s instance dictionary.
According to the docs, super(cls, obj) returns
a proxy object that delegates method calls to a parent or sibling
class of type cls
I understand why super() offers this functionality, but I need something slightly different: I need to create a proxy object that delegates methods calls (and attribute lookups) to class cls itself; and as in super, if cls doesn't implement the method/attribute, my proxy should continue looking in the MRO order (of the new not the original class). Is there any function I can write that achieves that?
Example:
class X:
    def act():
        #...

class Y:
    def act():
        #...

class A(X, Y):
    def act():
        #...

class B(X, Y):
    def act():
        #...

class C(A, B):
    def act():
        #...
c = C()
b = some_magic_function(B, c)
# `b` needs to delegate calls to `act` to B, and look up attribute `s` in B
# I will pass `b` somewhere else, and have no control over it
Of course, I could do b = super(A, c), but that relies on knowing the exact class hierarchy and the fact that B follows A in the MRO. It would silently break if any of these two assumptions change in the future. (Note that super doesn't make any such assumptions!)
If I just needed to call b.act(), I could use B.act(c). But I am passing b to someone else, and have no idea what they'll do with it. I need to make sure it doesn't betray me and start acting like an instance of class C at some point.
A separate question, the documentation for super() (in Python 3.2) only talks about its method delegation, and does not clarify that attribute lookups for the proxy are also performed the same way. Is it an accidental omission?
EDIT
The updated Delegate approach works in the following example as well:
class A:
    def f(self):
        print('A.f')

    def h(self):
        print('A.h')
        self.f()

class B(A):
    def g(self):
        self.f()
        print('B.g')

    def f(self):
        print('B.f')

    def t(self):
        super().h()
a_true = A()
# instance of A ends up executing A.f
a_true.h()
b = B()
a_proxy = Delegate(A, b)
# *unlike* super(), the updated `Delegate` implementation would call A.f, not B.f
a_proxy.h()
Note that the updated class Delegate is closer to what I want than super() for two reasons:
super() only does its proxying for the first call; subsequent calls happen as normal, since by then the object itself is used, not its proxy.
super() does not allow attribute access.
Thus, my question as asked has a (nearly) perfect answer in Python.
It turns out that, at a higher level, I was trying to do something I shouldn't (see my comments here).
This class should cover the most common cases:
class Delegate:
    def __init__(self, cls, obj):
        self._delegate_cls = cls
        self._delegate_obj = obj

    def __getattr__(self, name):
        x = getattr(self._delegate_cls, name)
        if hasattr(x, "__get__"):
            return x.__get__(self._delegate_obj)
        return x
Use it like this:
b = Delegate(B, c)
(with the names from your example code.)
Restrictions:
You cannot retrieve some special attributes like __class__ etc. from the class you pass in the constructor via this proxy. (This restriction also applies to super.)
This might behave weirdly if the attribute you want to retrieve is some weird kind of descriptor.
Edit: If you want the code in the update to your question to work as desired, you can use the following code:
class Delegate:
    def __init__(self, cls):
        self._delegate_cls = cls

    def __getattr__(self, name):
        x = getattr(self._delegate_cls, name)
        if hasattr(x, "__get__"):
            return x.__get__(self)
        return x
This passes the proxy object as self parameter to any called method, and it doesn't need the original object at all, hence I deleted it from the constructor.
If you also want instance attributes to be accessible you can use this version:
class Delegate:
    def __init__(self, cls, obj):
        self._delegate_cls = cls
        self._delegate_obj = obj

    def __getattr__(self, name):
        if name in vars(self._delegate_obj):
            return getattr(self._delegate_obj, name)
        x = getattr(self._delegate_cls, name)
        if hasattr(x, "__get__"):
            return x.__get__(self)
        return x
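With the classes A and B from the EDIT in the question, this version behaves as the comments there describe (a quick check):

b = B()
a_proxy = Delegate(A, b)
a_proxy.h()  # prints 'A.h' then 'A.f' -- the proxy, not b, is bound as self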
A separate question, the documentation for super() (in Python 3.2)
only talks about its method delegation, and does not clarify that
attribute lookups for the proxy are also performed the same way. Is it
an accidental omission?
No, this is not accidental. super() does nothing for attribute lookups. The reason is that attributes on an instance are not associated with a particular class, they're just there. Consider the following:
class A:
    def __init__(self):
        self.foo = 'foo set from A'

class B(A):
    def __init__(self):
        super().__init__()
        self.bar = 'bar set from B'

class C(B):
    def method(self):
        self.baz = 'baz set from C'

class D(C):
    def __init__(self):
        super().__init__()
        self.foo = 'foo set from D'
        self.baz = 'baz set from D'

instance = D()
instance.method()
instance.bar = 'not set from a class at all'
Which class "owns" foo, bar, and baz?
If I wanted to view instance as an instance of C, should it have a baz attribute before method is called? How about afterwards?
If I view instance as an instance of A, what value should foo have? Should bar be invisible because it was only added in B, or visible because it was set to a value outside the class?
All of these questions are nonsense in Python. There's no possible way to design a system with the semantics of Python that could give sensible answers to them. __init__ isn't even special in terms of adding attributes to instances of the class; it's just a perfectly ordinary method that happens to be called as part of the instance creation protocol. Any method (or indeed code from another class altogether, or not from any class at all) can create attributes on any instance it has a reference to.
In fact, all of the attributes of instance are stored in the same place:
>>> instance.__dict__
{'baz': 'baz set from C', 'foo': 'foo set from D', 'bar': 'not set from a class at all'}
There's no way to tell which of them were originally set by which class, or were last set by which class, or whatever measure of ownership you want. There's certainly no way to get at "the A.foo being shadowed by D.foo", as you would expect from C++; they're the same attribute, and any writes to it by one class (or from elsewhere) will clobber a value left in it by the other class.
The consequence of this is that super() does not perform attribute lookups the same way it does method lookups; it can't, and neither can any code you write.
In fact, from running some experiments, neither super nor Sven's Delegate actually support direct attribute retrieval at all!
class A:
    def __init__(self):
        self.spoon = 1
        self.fork = 2

    def foo(self):
        print('A.foo')

class B(A):
    def foo(self):
        print('B.foo')

b = B()
d = Delegate(A, b)
s = super(B, b)
Then both work as expected for methods:
>>> d.foo()
A.foo
>>> s.foo()
A.foo
But:
>>> d.fork
Traceback (most recent call last):
File "<pyshell#43>", line 1, in <module>
d.fork
File "/tmp/foo.py", line 6, in __getattr__
x = getattr(self._delegate_cls, name)
AttributeError: type object 'A' has no attribute 'fork'
>>> s.spoon
Traceback (most recent call last):
File "<pyshell#45>", line 1, in <module>
s.spoon
AttributeError: 'super' object has no attribute 'spoon'
So they both only really work for calling some methods on, not for passing to arbitrary third party code to pretend to be an instance of the class you want to delegate to.
They don't behave the same way in the presence of multiple inheritance unfortunately. Given:
class Delegate:
    def __init__(self, cls, obj):
        self._delegate_cls = cls
        self._delegate_obj = obj

    def __getattr__(self, name):
        x = getattr(self._delegate_cls, name)
        if hasattr(x, "__get__"):
            return x.__get__(self._delegate_obj)
        return x

class A:
    def foo(self):
        print('A.foo')

class B:
    pass

class C(B, A):
    def foo(self):
        print('C.foo')

c = C()
d = Delegate(B, c)
s = super(C, c)
Then:
>>> d.foo()
Traceback (most recent call last):
File "<pyshell#50>", line 1, in <module>
d.foo()
File "/tmp/foo.py", line 6, in __getattr__
x = getattr(self._delegate_cls, name)
AttributeError: type object 'B' has no attribute 'foo'
>>> s.foo()
A.foo
Because Delegate ignores the full MRO of whatever class _delegate_obj is an instance of, only using the MRO of _delegate_cls. Whereas super does what you asked in the question, but the behaviour seems quite strange: it's not wrapping an instance of C to pretend it's an instance of B, because direct instances of B don't have foo defined.
Here's my attempt:
class MROSkipper:
    def __init__(self, cls, obj):
        self.__cls = cls
        self.__obj = obj

    def __getattr__(self, name):
        mro = self.__obj.__class__.__mro__
        i = mro.index(self.__cls)
        if i == 0:
            # It's at the front anyway, just behave as getattr
            return getattr(self.__obj, name)
        else:
            # Check __dict__ not getattr, otherwise we'd find methods
            # on classes we're trying to skip
            try:
                return self.__obj.__dict__[name]
            except KeyError:
                return getattr(super(mro[i - 1], self.__obj), name)
I rely on the __mro__ attribute of classes to properly figure out where to start from, then I just use super. You could walk the MRO chain from that point yourself checking class __dict__s for methods instead if the weirdness of going back one step to use super is too much.
I've made no attempt to handle unusual attributes: those implemented with descriptors (including properties), or those magic methods looked up behind the scenes by Python, which often start at the class rather than the instance directly. But this behaves as you asked moderately well (with the caveat expounded on ad nauseam in the first part of my post; looking up attributes this way will not give you any different results than looking them up directly in the instance).
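For example, reusing the A and B classes from the experiments above (a quick sketch):

>>> b = B()
>>> a_view = MROSkipper(A, b)
>>> a_view.foo()
A.foo
>>> a_view.spoon  # instance attributes come straight from b's __dict__
1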
Is there some way to make a class-level read-only property in Python? For instance, if I have a class Foo, I want to say:
x = Foo.CLASS_PROPERTY
but prevent anyone from saying:
Foo.CLASS_PROPERTY = y
EDIT:
I like the simplicity of Alex Martelli's solution, but not the syntax that it requires. Both his and ~unutbu's answers inspired the following solution, which is closer to the spirit of what I was looking for:
class const_value(object):
    def __init__(self, value):
        self.__value = value

    def make_property(self):
        return property(lambda cls: self.__value)

class ROType(type):
    def __new__(mcl, classname, bases, classdict):
        class UniqeROType(mcl):
            pass
        for attr, value in classdict.items():
            if isinstance(value, const_value):
                setattr(UniqeROType, attr, value.make_property())
                classdict[attr] = value.make_property()
        return type.__new__(UniqeROType, classname, bases, classdict)

class Foo(object):
    __metaclass__ = ROType
    BAR = const_value(1)
    BAZ = 2

class Bit(object):
    __metaclass__ = ROType
    BOO = const_value(3)
    BAN = 4
Now, I get:
Foo.BAR
# 1
Foo.BAZ
# 2
Foo.BAR = 2
# Traceback (most recent call last):
#   File "<stdin>", line 1, in <module>
# AttributeError: can't set attribute
Foo.BAZ = 3
#
I prefer this solution because:
The members get declared inline instead of after the fact, as with type(X).foo = ...
The members' values are set in the actual class's code as opposed to in the metaclass's code.
It's still not ideal because:
I have to set the __metaclass__ in order for const_value objects to be interpreted correctly.
The const_values don't "behave" like the plain values. For example, I couldn't use one as a default value for a parameter to a method in the class.
The existing solutions are a bit complex -- what about just ensuring that each class in a certain group has a unique metaclass, then setting a normal read-only property on the custom metaclass. Namely:
>>> class Meta(type):
...     def __new__(mcl, *a, **k):
...         uniquemcl = type('Uniq', (mcl,), {})
...         return type.__new__(uniquemcl, *a, **k)
...
>>> class X: __metaclass__ = Meta
...
>>> class Y: __metaclass__ = Meta
...
>>> type(X).foo = property(lambda *_: 23)
>>> type(Y).foo = property(lambda *_: 45)
>>> X.foo
23
>>> Y.foo
45
>>>
This is really much simpler, because it's based on nothing more than the fact that when you get an instance's attribute, descriptors are looked up on the class (so of course when you get a class's attribute, descriptors are looked up on the metaclass), and making the class/metaclass unique isn't terribly hard.
Oh, and of course:
>>> X.foo = 67
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
AttributeError: can't set attribute
just to confirm it IS indeed read-only!
The ActiveState solution that Pynt references makes instances of ROClass have read-only attributes. Your question seems to ask if the class itself can have read-only attributes.
Here is one way, based on Raymond Hettinger's comment:
#!/usr/bin/env python
def readonly(value):
    return property(lambda self: value)

class ROType(type):
    CLASS_PROPERTY = readonly(1)

class Foo(object):
    __metaclass__ = ROType
print(Foo.CLASS_PROPERTY)
# 1
Foo.CLASS_PROPERTY = 2
# AttributeError: can't set attribute
The idea is this: Consider first Raymond Hettinger's solution:
class Bar(object):
    CLASS_PROPERTY = property(lambda self: 1)

bar = Bar()
bar.CLASS_PROPERTY = 2
# AttributeError: can't set attribute
It shows a relatively simple way to give bar a read-only property.
Notice that you have to add the CLASS_PROPERTY = property(lambda self: 1)
line to the definition of the class of bar, not to bar itself.
So, if you want the class Foo itself to have a read-only property, then the class of Foo (its metaclass) has to have CLASS_PROPERTY = property(lambda self: 1) defined.
The class of a class is its metaclass. Hence we define ROType as the metaclass:
class ROType(type):
    CLASS_PROPERTY = readonly(1)
Then we make ROType the metaclass of Foo:
class Foo(object):
    __metaclass__ = ROType
Found this on ActiveState:
# simple read only attributes with meta-class programming

# method factory for an attribute get method
def getmethod(attrname):
    def _getmethod(self):
        return self.__readonly__[attrname]
    return _getmethod

class metaClass(type):
    def __new__(cls, classname, bases, classdict):
        readonly = classdict.get('__readonly__', {})
        for name, default in readonly.items():
            classdict[name] = property(getmethod(name))
        return type.__new__(cls, classname, bases, classdict)

class ROClass(object):
    __metaclass__ = metaClass
    __readonly__ = {'a': 1, 'b': 'text'}

if __name__ == '__main__':
    def test1():
        t = ROClass()
        print t.a
        print t.b

    def test2():
        t = ROClass()
        t.a = 2

    test1()
Note that if you try to set a read-only attribute (t.a = 2) python will raise an AttributeError.
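For Python 3, where the __metaclass__ attribute is ignored, the same idea might be written with the metaclass= syntax (a sketch, assuming the readonly helper shown earlier):

def readonly(value):
    return property(lambda self: value)

class ROType(type):
    CLASS_PROPERTY = readonly(1)

class Foo(object, metaclass=ROType):
    pass

print(Foo.CLASS_PROPERTY)
# 1
Foo.CLASS_PROPERTY = 2
# AttributeError: can't set attribute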