This question shows how to implement __getattr__ for static/class attributes: by using a metaclass. However, I would like to implement __getattr__ and __getattribute__ for a class generated by type(), and, to make things even more interesting, the class inherits from a class that has a custom metaclass, which must still be executed properly.
The code summarizing the paragraph above:
class Inherited(metaclass=SomeFancyMetaclass):
...
generated_class = type("GeneratedClass", (Inherited,), {})
def __class_getattr__(cls, name):  # __getattr__ for the class; not sure exactly what the code should look like
return getattr(cls, name)
setattr(generated_class, "__getattr__", __class_getattr__) # similarly for __getattribute__
The question: is this possible, and if so, how? Could someone provide a minimal working example?
Just make your metaclass inherit from SomeFancyMetaclass, implement the __getattr__ (and __getattribute__) there properly, and use this metaclass, rather than a plain call to type, to generate your inherited, dynamic class.
Although you are using a lot of seldom-used machinery, there are no special mechanisms in the way; it should be plain Python.
Of course, you did not say what you want to do in the metaclass special methods; there might be some black magic to be performed there. And if you are implementing __getattribute__, you always have to be extra careful and redirect all attributes that you don't care about to the super() call, otherwise nothing works.
Also, keep in mind that the attribute-access customization possible with both methods won't work to "create magic dunder methods"; that is, your class won't magically have an __add__ or __dir__ method just because your metaclass __getattribute__ generates one. Rather, these are fixed in special slots by the Python runtime, and their checking and calling bypasses normal attribute lookup in Python.
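To see that caveat in action, here is a minimal sketch (Meta and C are made-up names, just for illustration):

class Meta(type):
    def __getattribute__(cls, name):
        if name == "__add__":
            # pretend to "generate" a dunder method on the fly
            return lambda self, other: "added"
        return super().__getattribute__(name)

class C(metaclass=Meta):
    pass

print(C.__add__)   # explicit lookup goes through Meta.__getattribute__ and finds the lambda
try:
    C + C          # but the + operator consults type(C)'s real slots and bypasses the hook
except TypeError as exc:
    print(exc)     # unsupported operand type(s) for +: 'Meta' and 'Meta'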
Otherwise:
class Inherited(metaclass=SomeFancyMetaclass):
...
class MagicAttrsMeta(Inherited.__class__):
def __getattr__(self, attr):
if attr in ("flying", "circus", "brian", "king_arthur"):
return "coconut"
raise AttributeError()
generated_class = MagicAttrsMeta("GeneratedClass", (Inherited,), {})
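Putting it together as a minimal working example (using a do-nothing stand-in for SomeFancyMetaclass, since the real one isn't shown in the question):

class SomeFancyMetaclass(type):  # stand-in; your real metaclass goes here
    pass

class Inherited(metaclass=SomeFancyMetaclass):
    pass

class MagicAttrsMeta(Inherited.__class__):
    def __getattr__(self, attr):
        if attr in ("flying", "circus", "brian", "king_arthur"):
            return "coconut"
        raise AttributeError(attr)

generated_class = MagicAttrsMeta("GeneratedClass", (Inherited,), {})

print(generated_class.flying)   # coconut -- served by the metaclass __getattr__
print(type(generated_class))    # <class '__main__.MagicAttrsMeta'>
print(generated_class.__mro__)  # (GeneratedClass, Inherited, object)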
'Everything in Python is an object.'
So, do all objects have to have attributes and methods?
I read the statement below on a tutorial site; could you give an example of a pre-defined object in Python that has neither attributes nor methods?
Some objects have neither attributes nor methods
Everything in Python is indeed an object, this is true. Even classes themselves are considered to be objects, and they are the product of the builtin class type, which is not surprisingly an object too. But in most circumstances objects will almost certainly inherit attributes, including methods or data.
So, should all objects have to have attributes and methods?
Not all objects necessarily have their own attributes. For instance, an object can inherit an attribute from its class and its class's superclasses, but that attribute or method doesn't necessarily live in the instance's namespace dictionary. In fact, an instance's namespace can be empty, like the following:
class Foo:
pass
a = Foo()
print(a.__dict__)
a here doesn't have any attributes aside from those inherited from its class, so if you check its namespace through the builtin attribute __dict__ you'll find it to be an empty dictionary. But you might wonder: isn't a.__dict__ an attribute of a? Make a distinction between class-level attributes (attributes inherited from the class or its superclasses) and instance attributes (attributes that belong to the instance and usually live in its namespace __dict__).
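To make the distinction concrete, here is a small sketch (class_attr and own_attr are illustrative names):

class Foo:
    class_attr = 42            # lives in Foo.__dict__, not in any instance

a = Foo()
print(a.class_attr)            # 42 -- found on the class during lookup
print(a.__dict__)              # {} -- the instance namespace is still empty

a.own_attr = "hello"           # now the instance gets an attribute of its own
print(a.__dict__)              # {'own_attr': 'hello'}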
Could you give an example of a pre-defined object in Python that has neither attributes nor methods?
If by pre-defined object you mean a builtin object, I can't imagine such a scenario. Again, even if there are no attributes on the object itself, there would in most cases be attributes inherited from its class or the class's superclasses, if there is any superclass. Probably (and I'm guessing here) the tutorial is asking you to create a class that assigns no attributes to its objects, just like the code I included above.
And this already answers your question better: Is everything an object in python like ruby?
There's a hackish way to emulate a Python object with no attributes.
class NoAttr(object):
def __getattribute__(self, attr):
raise AttributeError("no attribute: %s" % attr)
def __setattr__(self, attr, value):
raise AttributeError("can't set attribute: %s" % attr)
def __delattr__(self, attr):
raise AttributeError("no attribute: %s" % attr)
a = NoAttr()
This instance a, for all intents and purposes, in pure Python, behaves like an object with no attributes (you can try hasattr on it).
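For example:

print(hasattr(a, "anything"))   # False -- the lookup raised AttributeError internally
try:
    a.x = 1
except AttributeError as exc:
    print(exc)                  # can't set attribute: x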
There may be a low-level way to do this in a C extension by implementing a type in C that pathologically stops Python's object implementation from working. Anyway the margin here is too small for writing one.
A pre-defined object with no attributes would defeat the purpose of pre-defining it.
In Python 2.x, take the following class:
class Person:
def __init__(self, name):
self.name = name
def myrepr(self):
return str(self.name)
def __getattr__(self, attr):
print('Fetching attr: %s' % attr)
if attr=='__repr__':
return self.myrepr
Now if you create an instance and echo it in the shell (to call __repr__), like
p = Person('Bob')
p
You get
Fetching attr: __repr__
Bob
Without the __getattr__ overload you'd have just got the default <__main__.Person instance at 0x7fb8800c6e18> kind of thing.
My question is: why is __getattr__ even capable of handling calls to __repr__ (and other builtins like that) if they are not defined elsewhere? These are not ordinary attributes; they are operators, or rather special methods.
(This no longer works in Python 3.x, so I guess the handling of these builtins by __getattr__ was removed.)
It's not a Python 2 vs 3 thing, it's a new-style vs old-style class thing. In old-style classes these methods had no special meaning and were treated like simple attributes.
In new-style classes the special methods are always looked up on the class (for implicit lookups), not on the instance:
For new-style classes, implicit invocations of special methods are only guaranteed to work correctly if defined on an object’s type, not in the object’s instance dictionary.
Old-style classes:
For old-style classes, special methods are always looked up in exactly the same way as any other method or attribute. This is the case regardless of whether the method is being looked up explicitly as in x.__getitem__(i) or implicitly as in x[i].
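You can see the new-style behaviour directly in Python 3, using a class along the lines of the one in the question:

class Person:
    def __init__(self, name):
        self.name = name

p = Person("Bob")
p.__repr__ = lambda: "Bob"   # ends up in p.__dict__ ...
print(repr(p))               # ... but is ignored: <__main__.Person object at 0x...>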
Related: Overriding special methods on an instance
Following this answer it seems that a class' metaclass may be changed after the class has been defined by using the following*:
class MyMetaClass(type):
# Metaclass magic...
class A(object):
pass
A = MyMetaClass(A.__name__, A.__bases__, dict(A.__dict__))
Defining a function
def metaclass_wrapper(cls):
return MyMetaClass(cls.__name__, cls.__bases__, dict(cls.__dict__))
allows me to apply a decorator to a class definition like so,
@metaclass_wrapper
class B(object):
pass
It seems that the metaclass magic is applied to B, yet B has no __metaclass__ attribute. Is the above method a sensible way to apply metaclasses to class definitions, even though I am defining and re-defining a class, or would I be better off simply writing
class B(object):
__metaclass__ = MyMetaClass
pass
I presume there are some differences between the two methods.
*Note, the original answer in the linked question, MyMetaClass(A.__name__, A.__bases__, A.__dict__), returns a TypeError:
TypeError: type() argument 3 must be a dict, not dict_proxy
It seems that the __dict__ attribute of A (the class definition) has a type dict_proxy, whereas the type of the __dict__ attribute of an instance of A has a type dict. Why is this? Is this a Python 2.x vs. 3.x difference?
Admittedly, I am a bit late to the party. However, I felt this was worth adding.
This is completely doable. That being said, there are plenty of other ways to accomplish the same goal. However, the decoration solution, in particular, allows for delayed evaluation (obj = dec(obj)), which using __metaclass__ inside the class does not. In typical decorator style, my solution is below.
There is a tricky thing that you may run into if you just construct the class without changing the dictionary or copying its attributes. Any attributes that the class had previously (before decorating) will appear to be missing. So, it is absolutely essential to copy these over and then tweak them as I have in my solution.
Personally, I like to be able to keep track of how an object was wrapped. So, I added the __wrapped__ attribute, which is not strictly necessary. It also makes it more like functools.wraps in Python 3 for classes. However, it can be helpful with introspection. Also, __metaclass__ is added to act more like the normal metaclass use case.
def metaclass(meta):
def metaclass_wrapper(cls):
__name = str(cls.__name__)
__bases = tuple(cls.__bases__)
__dict = dict(cls.__dict__)
for each_slot in __dict.get("__slots__", tuple()):
__dict.pop(each_slot, None)
__dict["__metaclass__"] = meta
__dict["__wrapped__"] = cls
return(meta(__name, __bases, __dict))
return(metaclass_wrapper)
For a trivial example, take the following.
class MetaStaticVariablePassed(type):
def __new__(meta, name, bases, dct):
dct["passed"] = True
return(super(MetaStaticVariablePassed, meta).__new__(meta, name, bases, dct))
@metaclass(MetaStaticVariablePassed)
class Test(object):
pass
This yields the nice result...
|1> Test.passed
|.> True
Using the decorator in the less usual, but identical way...
class Test(object):
pass
Test = metaclass(MetaStaticVariablePassed)(Test)
...yields, as expected, the same nice result.
|1> Test.passed
|.> True
The class has no __metaclass__ attribute set... because you never set it!
Which metaclass to use is normally determined by a name __metaclass__ set in a class block. The __metaclass__ attribute isn't set by the metaclass. So if you invoke a metaclass directly rather than setting __metaclass__ and letting Python figure it out, then no __metaclass__ attribute is set.
In fact, normal classes are all instances of the metaclass type, so if the metaclass always set the __metaclass__ attribute on its instances then every class would have a __metaclass__ attribute (most of them set to type).
I would not use your decorator approach. It obscures the fact that a metaclass is involved (and which one), is still one line of boilerplate, and it's just messy to create a class from the 3 defining features of (name, bases, attributes) only to pull those 3 bits back out from the resulting class, throw the class away, and make a new class from those same 3 bits!
When you do this in Python 2.x:
class A(object):
__metaclass__ = MyMeta
def __init__(self):
pass
You'd get roughly the same result if you'd written this:
attrs = {}
attrs['__metaclass__'] = MyMeta
def __init__(self):
pass
attrs['__init__'] = __init__
A = attrs.get('__metaclass__', type)('A', (object,), attrs)
In reality calculating the metaclass is more complicated, as there actually has to be a search through all the bases to determine whether there's a metaclass conflict, and if one of the bases doesn't have type as its metaclass and attrs doesn't contain __metaclass__ then the default metaclass is the ancestor's metaclass rather than type. This is one situation where I expect your decorator "solution" will differ from using __metaclass__ directly. I'm not sure exactly what would happen if you used your decorator in a situation where using __metaclass__ would give you a metaclass conflict error, but I wouldn't expect it to be pleasant.
Also, if there are any other metaclasses involved, your method would result in them running first (possibly modifying what the name, bases, and attributes are!) and then pulling those out of the class and using it to create a new class. This could potentially be quite different than what you'd get using __metaclass__.
As for the __dict__ not giving you a real dictionary, that's just an implementation detail; I would guess for performance reasons. I doubt there is any spec that says the __dict__ of a (non-class) instance has to be the same type as the __dict__ of a class (which is also an instance btw; just an instance of a metaclass). The __dict__ attribute of a class is a "dictproxy", which allows you to look up attribute keys as if it were a dict but still isn't a dict. type is picky about the type of its third argument; it wants a real dict, not just a "dict-like" object (shame on it for spoiling duck-typing). It's not a 2.x vs 3.x thing; Python 3 behaves the same way, although it gives you a nicer string representation of the dictproxy. Python 2.4 (which is the oldest 2.x I have readily available) also has dictproxy objects for class __dict__ objects.
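You can see the difference directly (Python 3 spells the proxy mappingproxy rather than dictproxy):

class A(object):
    pass

a = A()
print(type(A.__dict__))   # <class 'mappingproxy'> -- read-only proxy over the class namespace
print(type(a.__dict__))   # <class 'dict'>

# type() insists on a real dict as its third argument, so copy the proxy first:
B = type("B", A.__bases__, dict(A.__dict__))
print(B)                  # <class '__main__.B'>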
My summary of your question: "I tried a new tricky way to do a thing, and it didn't quite work. Should I use the simple way instead?"
Yes, you should do it the simple way. You haven't said why you're interested in inventing a new way to do it.
This one seems a bit tricky to me. Some time ago I already managed to overwrite an instance's method with something like:
def my_method(self, attr):
pass
instancemethod = type(self.method_to_overwrite)
self.method_to_overwrite = instancemethod(my_method, self, self.__class__)
which worked very well for me; but now I'm trying to overwrite an instance's __getattribute__() method, which doesn't work, apparently because that method is of type
<type 'method-wrapper'>
Is it possible to do anything about that? I couldn't find any decent Python documentation on method-wrapper.
You want to override the attribute lookup algorithm on a per-instance basis? Without knowing why you are trying to do this, I would hazard a guess that there is a cleaner, less convoluted way of doing what you need. If you really need to, then as Aaron said, you'll have to install a redirecting __getattribute__ handler on the class, because Python looks up special methods only on the class, ignoring anything defined on the instance.
You also have to be extra careful about not getting into infinite recursion:
class FunkyAttributeLookup(object):
def __getattribute__(self, key):
try:
# Lookup the per instance function via objects attribute lookup
# to avoid infinite recursion.
getter = object.__getattribute__(self, 'instance_getattribute')
return getter(key)
except AttributeError:
return object.__getattribute__(self, key)
f = FunkyAttributeLookup()
f.instance_getattribute = lambda attr: attr.upper()
print(f.foo) # FOO
Also, if you are overriding methods on your instance, you don't need to instantiate the method object yourself; you can either use the descriptor protocol of functions, which generates the bound method, or just curry the self argument.
#descriptor protocol
self.method_to_overwrite = my_method.__get__(self, type(self))
# or curry
from functools import partial
self.method_to_overwrite = partial(my_method, self)
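For completeness, a small self-contained sketch of both binding approaches (Widget and my_method are made-up names):

from functools import partial

class Widget:
    pass

def my_method(self, x):
    return self, x

w = Widget()
w.bound_a = my_method.__get__(w, Widget)   # descriptor protocol: functions bind via __get__
w.bound_b = partial(my_method, w)          # currying self with functools.partial

print(w.bound_a(1))   # (<__main__.Widget object at ...>, 1)
print(w.bound_b(2))   # (<__main__.Widget object at ...>, 2)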
You can't overwrite special methods at instance level. For new-style classes, implicit invocations of special methods are only guaranteed to work correctly if defined on an object’s type, not in the object’s instance dictionary.
There are a couple of methods which you can't overwrite on the instance, and __getattribute__() is one of them.
I believe method-wrapper is a wrapper around a method written in C.
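You can see that type directly:

class C(object):
    pass

print(C().__getattribute__)        # <method-wrapper '__getattribute__' of C object at 0x...>
print(type(C().__getattribute__))  # <class 'method-wrapper'> (on Python 2: <type 'method-wrapper'>)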
I have read posts like these:
What is a metaclass in Python?
What are your (concrete) use-cases for metaclasses in Python?
Python's Super is nifty, but you can't use it
But somehow I am still confused. Among my confusions:
When and why would I have to do something like the following?
# Refer link1
return super(MyType, cls).__new__(cls, name, bases, newattrs)
or
# Refer link2
return super(MetaSingleton, cls).__call__(*args, **kw)
or
# Refer link2
return type(self.__name__ + other.__name__, (self, other), {})
How does super work exactly?
What are the class registry and unregistry in link1, and how exactly do they work? (I thought it had something to do with singletons. I may be wrong, coming from a C background. My coding style is still a mix of functional and OO.)
What is the flow of class instantiation (subclass, metaclass, super, type) and method invocation (metaclass->__new__, metaclass->__init__, super->__new__, subclass->__init__ inherited from metaclass) with well-commented working code? The first link is quite close, but it does not talk about the cls keyword, super(..) or the registry. Preferably an example with multiple inheritance.
P.S.: I made the last part as code because Stack Overflow formatting was converting the text metaclass->__new__
to metaclass->new
OK, you've thrown quite a few concepts into the mix here! I'm going to pull out a few of the specific questions you have.
In general, understanding super, the MRO and metaclasses is made much more complicated because there have been lots of changes in this tricky area over the last few versions of Python.
Python's own documentation is a very good reference, and completely up to date. There is an IBM developerWorks article which is fine as an introduction and takes a more tutorial-based approach, but note that it's five years old, and spends a lot of time talking about the older-style approaches to meta-classes.
super is how you access an object's super-classes. It's more complex than (for example) Java's super keyword, mainly because of multiple inheritance in Python. As Super Considered Harmful explains, using super() can result in you implicitly using a chain of super-classes, the order of which is defined by the Method Resolution Order (MRO).
You can see the MRO for a class easily by invoking mro() on the class (not on an instance). Note that meta-classes are not in an object's super-class hierarchy.
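For example:

class A: pass
class B(A): pass

print(B.mro())          # [<class '__main__.B'>, <class '__main__.A'>, <class 'object'>]
print(type(B).__mro__)  # (<class 'type'>, <class 'object'>) -- the metaclass hierarchy is separate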
Thomas' description of meta-classes here is excellent:
A metaclass is the class of a class. Like a class defines how an instance of the class behaves, a metaclass defines how a class behaves. A class is an instance of a metaclass.
In the examples you give, here's what's going on:
1. The call to __new__ is bubbled up to the next thing in the MRO. In this case, super(MyType, cls) would resolve to type; calling type.__new__ lets Python complete its normal instance creation steps.
2. This example is using metaclasses to enforce a singleton. He's overriding __call__ in the metaclass so that whenever a class instance is created, he intercepts that, and can bypass instance creation if there already is one (stored in cls.instance). Note that overriding __new__ in the metaclass won't be good enough, because that's only called when creating the class. Overriding __new__ on the class would work, however.
3. This shows a way to dynamically create a class. Here he's appending the supplied class's name to the created class name, and adding it to the class hierarchy too.
I'm not exactly sure what sort of code example you're looking for, but here's a brief one showing meta-classes, inheritance and method resolution:
print('>>> # Defining classes:')
class MyMeta(type):
def __new__(cls, name, bases, dct):
print("meta: creating %s %s" % (name, bases))
return type.__new__(cls, name, bases, dct)
def meta_meth(cls):
print("MyMeta.meta_meth")
__repr__ = lambda c: c.__name__
class A(metaclass=MyMeta):
def __init__(self):
super(A, self).__init__()
print("A init")
def meth(self):
print("A.meth")
class B(metaclass=MyMeta):
def __init__(self):
super(B, self).__init__()
print("B init")
def meth(self):
print("B.meth")
class C(A, B, metaclass=MyMeta):
def __init__(self):
super(C, self).__init__()
print("C init")
print('>>> c_obj = C()')
c_obj = C()
print('>>> c_obj.meth()')
c_obj.meth()
print('>>> C.meta_meth()')
C.meta_meth()
print('>>> c_obj.meta_meth()')
c_obj.meta_meth()
Example output (using Python >= 3.6):
>>> # Defining classes:
meta: creating A ()
meta: creating B ()
meta: creating C (A, B)
>>> c_obj = C()
B init
A init
C init
>>> c_obj.meth()
A.meth
>>> C.meta_meth()
MyMeta.meta_meth
>>> c_obj.meta_meth()
Traceback (most recent call last):
File "metatest.py", line 41, in <module>
c_obj.meta_meth()
AttributeError: 'C' object has no attribute 'meta_meth'
Here's the more pragmatic answer.
It rarely matters
"What is a metaclass in Python". Bottom line, type is the metaclass of all classes. You have almost no practical use for this.
class X(object):
pass
type(X) == type
"What are your (concrete) use cases for metaclasses in Python?". Bottom line. None.
"Python's Super is nifty, but you can't use it". Interesting note, but little practical value. You'll never have a need for resolving complex multiple inheritance networks. It's easy to prevent this problem from arising by using an explicity Strategy design instead of multiple inheritance.
Here's my experience over the last 7 years of Python programming.
A class has 1 or more superclasses forming a simple chain from my class to object.
The concept of "class" is defined by a metaclass named type. I might want to extend the concept of "class", but so far, it's never come up in practice. Not once. type always does the right thing.
Using super works out really well in practice. It allows a subclass to defer to its superclass. It happens to show up in these metaclass examples because they're extending the built-in metaclass, type.
However, in all subclass situations, you'll make use of super to extend a superclass.
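For example, the everyday use of super in a subclass looks something like this:

class Base(object):
    def save(self):
        print("Base.save")

class Audited(Base):
    def save(self):
        print("writing audit entry")
        super(Audited, self).save()   # defer to the superclass implementation

Audited().save()   # prints "writing audit entry" and then "Base.save"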
Metaclasses
The metaclass issue is this:
Every object has a reference to its type definition, or "class".
A class is, itself, also an object.
Therefore an object of type "class" has a reference to its type, or "class". The "class" of a "class" is a metaclass.
Since a "class" isn't a C++ run-time object, this doesn't happen in C++. It does happen in Java, Smalltalk and Python.
A metaclass defines the behavior of a class object.
90% of your interaction with a class is to ask the class to create a new object.
10% of the time, you'll be using class methods or class variables ("static" in C++ or Java parlance.)
I have found a few use cases for class-level methods. I have almost no use cases for class variables. I've never had a situation to change the way object construction works.