Python: dereferencing weakproxy

Is there any way to get the original object from a weakproxy pointed to it? eg is there the inverse to weakref.proxy()?
A simplified example (Python 2.7):
import weakref

class C(object):
    def __init__(self, other):
        self.other = weakref.proxy(other)

class Other(object):
    pass

others = [Other() for i in xrange(3)]
my_list = [C(others[i % len(others)]) for i in xrange(10)]
I need to get the list of unique other members from my_list. The way I prefer for such tasks is to use a set:
unique_others = {x.other for x in my_list}
Unfortunately this throws TypeError: unhashable type: 'weakproxy'
I have managed to solve the specific problem in an imperative way (slow and dirty):
unique_others = []
for x in my_list:
    if x.other in unique_others:
        continue
    unique_others.append(x.other)
but the general problem stated in the title remains open.
What if I have only my_list under control, the others are buried in some library and someone may delete them at any time, and I want to prevent the deletion by collecting non-weak references in a list?
Or I may want to get the repr() of the object itself, not <weakproxy at xx to Other at xx>
I guess there should be something like weakref.unproxy that I'm not aware of.

I know this is an old question but I was looking for an answer recently and came up with something. Like others said, there is no documented way to do it and looking at the implementation of weakproxy type confirms that there is no standard way to achieve this.
My solution uses the fact that all Python objects have a set of standard methods (like __repr__) and that bound method objects contain a reference to the instance (in the __self__ attribute).
Therefore, by dereferencing the proxy to get the method object, we can get a strong reference to the proxied object from the method object.
Example:
>>> import weakref
>>> def func():
...     pass
...
>>> weakfunc = weakref.proxy(func)
>>> f = weakfunc.__repr__.__self__
>>> f is func
True
Another nice thing is that it will work for strong references as well:
>>> func.__repr__.__self__ is func
True
So there's no need for type checks if either a proxy or a strong reference could be expected.
Edit:
I just noticed that this doesn't work for proxies of classes. This is not universal then.

Basically there is something like weakref.unproxy, but it's just named weakref.ref(x)().
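For instance, a quick sketch of what that dereferencing looks like (using a throwaway Other class, not the question's code):
import weakref

class Other(object):
    pass

o = Other()
r = weakref.ref(o)      # a weak reference, not a proxy
assert r() is o         # calling the ref is the "unproxy": it returns the original object,
                        # or None once the referent has been collected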
The proxy object is only there for delegation and the implementation is rather shaky...
The == operator doesn't work as you would expect:
>>> weakref.proxy(object) == object
False
>>> weakref.proxy(object) == weakref.proxy(object)
True
>>> weakref.proxy(object).__eq__(object)
True
However, I see that you don't want to call weakref.ref objects all the time. A good working proxy with dereference support would be nice.
But at the moment, this is just not possible. If you look into the Python built-in source code, you see that you would need something like PyWeakref_GetObject, but there is just no call to this function at all (and it raises PyErr_BadInternalCall if the argument is wrong, so it seems to be an internal function). PyWeakref_GET_OBJECT is used much more, but there is no method in weakref.py that would be able to do that.
So, sorry to disappoint you, but weakref.proxy is just not what most people would want for their use cases. You can, however, make your own proxy implementation. It isn't too hard: just use weakref.ref internally and override __getattr__, __repr__, etc.
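For illustration, here is a minimal sketch of such a proxy; the class name DerefProxy and the deref() helper are my own inventions, not part of any library:
import weakref

class DerefProxy(object):
    """A tiny proxy built on weakref.ref; deref() returns the real object."""

    def __init__(self, obj):
        object.__setattr__(self, "_ref", weakref.ref(obj))

    def deref(self):
        obj = object.__getattribute__(self, "_ref")()
        if obj is None:
            raise ReferenceError("weakly-referenced object no longer exists")
        return obj

    def __getattr__(self, name):
        # only called when normal lookup fails, so "_ref" and "deref" never loop
        return getattr(self.deref(), name)

    def __repr__(self):
        return repr(self.deref())
Usage is then p = DerefProxy(obj); p.deref() is obj returns True, and ordinary attribute access still forwards to the referent.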
As a little side note on how PyCharm is able to produce the normal repr output (because you mentioned that in a comment):
>>> class A(object): pass
>>> a = A()
>>> weakref.proxy(a)
<weakproxy at 0x7fcf7885d470 to A at 0x1410990>
>>> weakref.proxy(a).__repr__()
'<__main__.A object at 0x1410990>'
>>> type( weakref.proxy(a))
<type 'weakproxy'>
As you can see, calling the original __repr__ can really help!

weakref.ref is hashable, whereas weakref.proxy is not. The API doesn't say anything about how you can actually get a handle on the object a proxy points to. With weakref.ref, it's easy: you can just call it. As such, you can roll your own proxy-like class... Here's a very basic attempt:
import weakref

class C(object):
    def __init__(self, obj):
        self.object = weakref.ref(obj)

    def __getattr__(self, key):
        if key == "object":
            return object.__getattribute__(self, "object")
        elif key == "__init__":
            return object.__getattribute__(self, "__init__")
        else:
            obj = object.__getattribute__(self, "object")()  # dereference the weakref
            return getattr(obj, key)

class Other(object):
    pass

others = [Other() for i in range(3)]
my_list = [C(others[i % len(others)]) for i in range(10)]
unique_list = {x.object for x in my_list}
Of course, now unique_list contains refs, not proxies, which is fundamentally different...
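Continuing that snippet, the refs in unique_list can be turned back into the original Other objects simply by calling them (a small sketch; refs whose referent has died return None and are skipped):
unique_others = {ref() for ref in unique_list if ref() is not None}
assert all(isinstance(o, Other) for o in unique_others)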

I know that this is an old question, but I've been bitten by it (so, there's no real 'unproxy' in the standard library) and wanted to share my solution...
The way I solved it to get the real instance was just creating a property which returns it (although I suggest using weakref.ref instead of weakref.proxy, as code should really check whether the referent is still alive before accessing it, instead of having to remember to catch an exception whenever any attribute is accessed).
Anyways, if you still must use a proxy, the code to get the real instance is:
import weakref
class MyClass(object):
#property
def real_self(self):
return self
instance = MyClass()
proxied = weakref.proxy(instance)
assert proxied.real_self is instance
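For comparison, a hedged sketch of the weakref.ref-plus-property approach recommended above (the Owner class and attribute names are made up for the example):
import weakref

class Owner(object):
    def __init__(self, other):
        self._other_ref = weakref.ref(other)

    @property
    def other(self):
        other = self._other_ref()          # dereference the weak reference
        if other is None:                  # the referent has been garbage collected
            raise ReferenceError("referenced object is no longer alive")
        return other
Callers then get either the real object or an explicit error, instead of a proxy that may blow up on any attribute access.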

Related

How to get the default __repr__() of an object whose __repr__() has been overridden?

If someone writes a class in Python and fails to specify their own __repr__() method, then a default one is provided for them. However, suppose we want to write a function which has the same (or similar) behavior to the default __repr__(), even if the actual __repr__() for the class has been overloaded. That is, suppose we want to write a function which behaves like the default __repr__() regardless of whether someone overloaded the __repr__() method. How might we do it?
class DemoClass:
    def __init__(self):
        self.var = 4

    def __repr__(self):
        return str(self.var)

def true_repr(x):
    # [magic happens here]
    s = "I'm not implemented yet"
    return s

obj = DemoClass()
print(obj.__repr__())
print(true_repr(obj))
Desired Output:
print(obj.__repr__()) prints 4, but print(true_repr(obj)) prints something like:
<__main__.DemoClass object at 0x0000000009F26588>
You can use object.__repr__(obj). This works because the default repr behavior is defined in object.__repr__.
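For example, applied to the DemoClass above (a sketch of an interactive session; the exact memory address will differ):
>>> obj = DemoClass()
>>> repr(obj)                  # the overridden __repr__
'4'
>>> object.__repr__(obj)       # the default repr
'<__main__.DemoClass object at 0x...>'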
Note, the best answer is probably just to use object.__repr__ directly, as the others have pointed out. But one could implement that same functionality roughly as:
>>> def true_repr(x):
...     type_ = type(x)
...     module = type_.__module__
...     qualname = type_.__qualname__
...     return f"<{module}.{qualname} object at {hex(id(x))}>"
...
So, given a class A whose __repr__ has been overridden (for example, to return 'hahahahaha'):
>>> class A:
...     def __repr__(self):
...         return 'hahahahaha'
...
>>> A()
hahahahaha
>>> true_repr(A())
'<__main__.A object at 0x106549208>'
>>>
Typically we can use object.__repr__ for that, but this will do the "object repr" for every item, so:
>>> object.__repr__(4)
'<int object at 0xa6dd20>'
Since an int is an object, but with __repr__ overridden.
If you want to go up one level of overriding, we can use super(..):
>>> super(type(4), 4).__repr__() # going up one level
'<int object at 0xa6dd20>'
For an int this again means that we will print <int object at ...>, but if we were, for instance, to subclass int, then it would use the __repr__ of int again, like:
class special_int(int):
    def __repr__(self):
        return 'Special int'
Then it will look like:
>>> s = special_int(4)
>>> super(type(s), s).__repr__()
'4'
What we do here is create a proxy object with super(..). super will walk the method resolution order (MRO) of the object and will try to find the first class (a superclass of s) that has overridden the function. With single inheritance, that is the closest parent that overrides the function, but if there is some multiple inheritance involved, then this is more tricky. We thus select the __repr__ of that parent, and call that function.
This is also a rather weird application of super since usually the class (here type(s)) is a fixed one, and does not depend on the type of s itself, since otherwise multiple such super(..) calls would result in an infinite loop.
But it is usually a bad idea to break overriding anyway. The reason a programmer overrides a function is to change the behavior. Not respecting this can of course sometimes yield a useful function, but frequently it will mean that the code's contracts are no longer satisfied. For example, if a programmer overrides __eq__, they will usually also override __hash__; if you then use the hash of another class together with the real __eq__, things will start breaking.
Calling magic functions directly is also frequently seen as an antipattern, so you'd better avoid that as well.

How to get all instances of a certain class in python?

Someone asked a similar question: Printing all instances of a class.
While I am less concerned about printing them, I'd rather know how many instances are currently "live".
The reason for this instance capture is more like setting up a scheduled job: every hour, check these "live" unprocessed instances and enrich the data. After that, either a flag in the instance is set, or the instance is simply deleted.
Torsten Marek's answer in Printing all instances of a class, using weakrefs, needs a call to the base class constructor for every class of this type; is it possible to automate this? Or can we get all instances with some other method?
You can either track it on your own (see the other answers) or ask the garbage collector:
import gc

class Foo(object):
    pass

foo1, foo2 = Foo(), Foo()
foocount = sum(1 for o in gc.get_referrers(Foo) if o.__class__ is Foo)
This can be kinda slow if you have a lot of objects, but it's generally not too bad, and it has the advantage of being something you can easily use with someone else's code.
Note: Used o.__class__ rather than type(o) so it works with old-style classes.
If you only want this to work for CPython, and your definition of "live" can be a little lax, there's another way to do this that may be useful for debugging/introspection purposes:
>>> import gc
>>> class Foo(object): pass
>>> spam, eggs = Foo(), Foo()
>>> foos = [obj for obj in gc.get_objects() if isinstance(obj, Foo)]
>>> foos
[<__main__.Foo at 0x1153f0190>, <__main__.Foo at 0x1153f0210>]
>>> del spam
>>> foos = [obj for obj in gc.get_objects() if isinstance(obj, Foo)]
>>> foos
[<__main__.Foo at 0x1153f0190>, <__main__.Foo at 0x1153f0210>]
>>> del foos
>>> foos = [obj for obj in gc.get_objects() if isinstance(obj, Foo)]
>>> foos
[<__main__.Foo at 0x1153f0190>]
Note that deleting spam didn't actually make it non-live, because we still have a reference to the same object in foos. And reassigning foos didn't help, because apparently the call to get_objects happened before the old version was released. But eventually it went away once we stopped referring to it.
And the only way around this problem is to use weakrefs.
Of course this will be horribly slow in a large system, with or without weakrefs.
Sure, store the count in a class attribute:
class CountedMixin(object):
    count = 0

    def __init__(self, *args, **kwargs):
        type(self).count += 1
        super().__init__(*args, **kwargs)

    def __del__(self):
        type(self).count -= 1
        try:
            super().__del__()
        except AttributeError:
            pass
You could make this slightly more magical with a decorator or a metaclass than with a base class, or simpler if it can be a bit less general (I've attempted to make this fit in anywhere in any reasonable multiple-inheritance hierarchy, which you usually don't need to worry about…), but basically, this is all there is to it.
If you want to have the instances themselves (or, better, weakrefs to them), rather than just a count of them, just replace count=0 with instances=set(), then do instances.add(self) instead of count += 1, etc. (Again, though, you probably want a weakref to self, rather than self.)
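As a hedged sketch of that weakref variant (the mixin name is my own, and weakref.WeakSet is used so that tracking does not keep instances alive):
import weakref

class TrackedMixin(object):
    instances = weakref.WeakSet()          # weak refs: tracking won't keep objects alive

    def __init__(self, *args, **kwargs):
        type(self).instances.add(self)
        super().__init__(*args, **kwargs)

class Foo(TrackedMixin):
    pass

f1, f2 = Foo(), Foo()
assert len(Foo.instances) == 2
del f1                                     # once collected, f1 drops out of the set automatically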
I cannot comment on the answer of kindall, so I'll write my comment as an answer:
The solution with gc.get_referrers(<ClassName>) does not work with inherited classes in Python 3. gc.get_referrers(<ClassName>) does not return any instances of a class that inherits from <ClassName>.
Instead you need to use gc.get_objects(), which is much slower, since it returns a full list of objects. But in the case of unit tests, where you simply want to ensure your objects get deleted after the test (no circular references), it should be sufficient and fast enough.
Also do not forget to call gc.collect() before checking the number of your instances, to ensure all unreferenced instances are really deleted.
I also saw an issue with weak references, which are also counted this way. The problem with weak references is that the referenced object might not exist any more, so isinstance(Instance, Class) might fail with an error about a non-existent weak reference.
Here is a simple code example:
import gc

def getInstances(Class):
    gc.collect()
    Number = 0
    InstanceList = gc.get_objects()
    for Instance in InstanceList:
        if 'weakproxy' not in str(type(Instance)):  # avoid weak references
            if isinstance(Instance, Class):
                Number += 1
    return Number

How to get actual list of names of object if custom __dir__ implemented?

Official docs says:
If the object has a method named __dir__(), this method will be called
and must return the list of attributes. This allows objects that
implement a custom __getattr__() or __getattribute__() function to
customize the way dir() reports their attributes.
If a custom __dir__ is implemented, the results returned by another function, inspect.getmembers(), are also affected.
For example:
import inspect

class C(object):
    __slots__ = ['atr']

    def __dir__(self):
        return ['nothing']

    def method(self):
        pass

    def __init__(self):
        self.atr = 'string'

c = C()
print dir(c)                 # If we try this, we get ['nothing'], returned by the custom __dir__()
print inspect.getmembers(c)  # Here we get []
print c.__dict__             # And here an exception will be raised because of __slots__
How, in this case, can the list of names of the object be obtained?
Answer to the original question: does inspect.getmembers() use __dir__() like dir() does?
Here's the source code for inspect.getmembers() so we can see what it's really doing:
def getmembers(object, predicate=None):
    """Return all members of an object as (name, value) pairs sorted by name.
    Optionally, only return members that satisfy a given predicate."""
    results = []
    for key in dir(object):
        try:
            value = getattr(object, key)
        except AttributeError:
            continue
        if not predicate or predicate(value):
            results.append((key, value))
    results.sort()
    return results
From this we see that it is using dir() and just filtering the results a bit.
How to get attributes with an overridden __dir__()?
According to this answer, it isn't possible to always get a complete list of attributes, but we can still definitely get them in some cases/get enough to be useful.
From the docs:
If the object does not provide __dir__(), the function tries its best
to gather information from the object’s __dict__ attribute, if
defined, and from its type object. The resulting list is not
necessarily complete, and may be inaccurate when the object has a
custom __getattr__().
So if you are not using __slots__, you could look at your object's __dict__ (and its type object's) to get basically the same info that dir() would normally give you. So, just like with dir(), you would have to use a more rigorous method to get metaclass methods.
If you are using __slots__, then getting class attributes is, in a way, a bit more simple. Yes, there's no dict, but there is __slots__ itself, which contains the names of all of the attributes. For example, adding print c.__slots__ to your example code yields ['atr']. (Again, a more rigorous approach is needed to get the attributes of superclasses as well.)
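As a rough illustration of that idea, here is a hedged sketch (the helper name attr_names is mine) that gathers names from the instance __dict__, from __slots__ declarations, and from the class dicts along the MRO, ignoring any custom __dir__:
def attr_names(obj):
    # Approximate dir(obj) without consulting a custom __dir__.
    names = set(getattr(obj, '__dict__', {}))          # instance attributes, if any
    for cls in type(obj).__mro__:
        names.update(getattr(cls, '__slots__', ()))    # assumes __slots__ is a sequence of names
        names.update(cls.__dict__)                     # methods and class attributes
    return sorted(names)
For the C class in the question this reports 'atr' and 'method' (plus the usual dunder names), even though the custom __dir__ only advertises ['nothing'].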
How to get methods
You might need a different solution depending on the use case, but if you just want to find out the methods easily, you can simply use the builtin help().
Modified PyPy dir()
Here's an alternative to some of the above: To get a version of dir() that ignores user-defined __dir__ methods, you could just take PyPy's implementation of dir() and delete the parts that reference __dir__ methods.
As Matthew pointed out in the other answer, getmembers apparently returns the subset of dir results that are actual attributes.
>>> class C:
...     def foo(self):
...         pass
...     def __dir__(self):
...         return ['test']
...
>>> import inspect
>>> c = C()
>>> dir(c)
['test']
>>> inspect.getmembers(c)
[]

Python: Assign a method from one class to an instance of another class

I want to use new.instancemethod to assign a function (aFunction) from one class (A) to an instance of another class (B). I'm not sure how I can get aFunction to allow itself to be applied to an instance of B - currently I am getting an error because aFunction is expecting to be executed on an instance of A.
[Note: I can't cast instance to A using the __class__ attribute as the class I'm using (B in my example) doesn't support typecasting]
import new

class A(object):
    def aFunction(self):
        pass

# Class which doesn't support typecasting
class B(object):
    pass

b = B()
b.aFunction = new.instancemethod(A.aFunction, b, B.__class__)
b.aFunction()
This is the error:
TypeError: unbound method aFunction() must be called with A instance as first argument (got B instance instead)
new.instancemethod takes a function. A.aFunction is an unbound method. That's not the same thing. (You may want to try adding a bunch of print statements to display all of the things involved—A.aFunction(), A().aFunction, etc.—and their types, to help understanding.)
I'm assuming you don't know how descriptors work. If you don't want to learn all of the gory details, what's actually going on is that your declaration of aFunction within a class definition creates a non-data descriptor out of the function. This is a magic thing that means that, depending on how you access it, you get an unbound method (A.aFunction) or a bound method (A().aFunction).
If you modify aFunction to be a @staticmethod, this will actually work, because for a static method both A.aFunction and A().aFunction are just functions. (I'm not sure that's guaranteed to be true by the standard, but it's hard to see how else anyone would ever implement it.) But if you want "aFunction to allow itself to be applied to an instance of B", a static method won't help you.
If you actually want to get the underlying function, there are a number of ways to do it; I think this is the clearest as far as helping you understand how descriptors works:
f = object.__getattribute__(A, 'aFunction')
On the other hand, the simplest is probably:
f = A.aFunction.im_func
Then, calling new.instancemethod is how you turn that function into a descriptor that can be called as a regular method for instances of class B:
b.aFunction = new.instancemethod(f, b, B)
Printing out a bunch of data makes things a lot clearer:
import new

class A(object):
    def aFunction(self):
        print self, type(self), type(A)

# Class which doesn't support typecasting
class B(object):
    pass

print A.aFunction, type(A.aFunction)
print A().aFunction, type(A().aFunction)
print A.aFunction.im_func, type(A.aFunction.im_func)
print A().aFunction.im_func, type(A().aFunction.im_func)
A.aFunction(A())
A().aFunction()
f = object.__getattribute__(A, 'aFunction')
b = B()
b.aFunction = new.instancemethod(f, b, B)
b.aFunction()
You'll see something like this:
<unbound method A.aFunction> <type 'instancemethod'>
<bound method A.aFunction of <__main__.A object at 0x108b82d10>> <type 'instancemethod'>
<function aFunction at 0x108b62a28> <type 'function'>
<function aFunction at 0x108b62a28> <type 'function'>
<__main__.A object at 0x108b82d10> <class '__main__.A'> <type 'type'>
<__main__.A object at 0x108b82d10> <class '__main__.A'> <type 'type'>
<__main__.B object at 0x108b82d10> <class '__main__.B'> <type 'type'>
The only thing this doesn't show is the magic that creates the bound and unbound methods. For that, you need to look into A.aFunction.im_func.__get__, and at that point you're better off reading the descriptor howto than trying to dig it apart yourself.
One last thing: As brandizzi pointed out, this is something you almost never want to do. Alternatives include: Write aFunction as a free function instead of a method, so you just call b.aFunction(); factor out a function that does the real work, and have A.aFunction and B.aFunction just call it; have B aggregate an A and forward to it; etc.
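To make those last alternatives concrete, here is a hedged sketch (the do_the_work name is made up) of factoring the real work into a free function that both classes, or an aggregated A, can call:
def do_the_work(target):
    # the factored-out function that does the real work
    print "working on", target

class A(object):
    def aFunction(self):
        do_the_work(self)          # A's method just delegates

class B(object):
    def __init__(self):
        self._a = A()              # B aggregates an A...

    def aFunction(self):
        do_the_work(self)          # ...and can call the shared function directly,
                                   # or forward to the aggregate: self._a.aFunction()

b = B()
b.aFunction()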
I already posted an answer to the question you asked. But you shouldn't have had to get down to the level of having to understand new.instancemethod to solve your problem. The only reason that happened is because you asked the wrong question. Let's look at what you should have asked, and why.
In PySide.QtGui, I want the list widget items to have methods to set the font and colors, and they don't seem to.
This is what you really want. There may well be an easy way to do this. And if so, that's the answer you want. Also, by starting off with this, you avoid all the comments about "What do you actually want to do?" or "I doubt this is appropriate for whatever you're trying to do" (which often come with downvotes—or, more importantly, with potential answerers just ignoring your question).
Of course I could just write a function that takes a QListWidgetItem and call that function, instead of making it a method, but that won't work for me because __.
I assume there's a reason that won't work for you. But I can't really think of a good one. Whatever line of code said this:
item.setColor(color)
would instead say this:
setItemColor(item, color)
Very simple.
Even if you need to, e.g., pass around a color-setting delegate with the item bound into it, that's almost as easy with a function as with a method. Instead of this:
delegate = item.setColor
it's:
delegate = lambda color: setItemColor(item, color)
So, if there is a good reason you need this to be a method, that's something you really should explain. And if you're lucky, it'll turn out you were wrong, and there's a much simpler way to do what you want than what you were trying.
The obvious way to do this would be to get PySide to let me specify a class or factory function, so I could write a QListWidgetItem subclass, and every list item I ever deal with would be an instance of that subclass. But I can't find any way to do that.
This seems like something that should be a feature of PySide. So maybe it is, in which case you'd want to know. And if it isn't, and neither you nor any of the commenters or answerers can think of a good reason it would be bad design or hard to implement, you should go file a feature request and it might be in the next version. (Not that this helps you if you need to release code next week against the current version, but it's still worth doing.)
Since I couldn't find a way to do that, I tried to find some way to add my setColor method to the QListWidgetItem class, but couldn't think of anything.
Monkey-patching classes is very simple:
QListWidgetItem.setColor = setItemColor
If you didn't know you could do this, that's fine. If people knew that's what you were trying to do, this would be the first thing they'd suggest. (OK, maybe not many Python programmers know much about monkey-patching, but it's still a lot more than the number who know about descriptors or new.instancemethod.) And again, besides being an answer you'd get faster and with less hassle, it's a better answer.
Even if you did know this, some extension modules won't let you do that. If you tried and it failed, explain what didn't work instead:
PySide wouldn't let me add the method to the class; when I try monkey-patching, __.
Maybe someone knows why it didn't work. If not, at least they know you tried.
So I have to add it to every instance.
Monkey-patching instances looks like this:
myListWidgetItem.setColor = setItemColor
So again, you'd get a quick answer, and a better one.
But maybe you knew that, and tried it, and it didn't work either:
PySide also wouldn't let me add the method to each instance; when I try, __.
So I tried patching out the __class__ of each instance, to make it point to a custom subclass, because that works in PyQt4, but in PySide it __.
This probably won't get you anything useful, because it's just the way PySide works. But it's worth mentioning that you tried.
So, I decided to create that custom subclass, then copy its methods over, like so, but __.
And all the way down here is where you'd put all the stuff you put in your original question. If it really were necessary to solving your problem, the information would be there. But if there were an easy solution, you wouldn't get a bunch of confused answers from people who were just guessing at how new.instancemethod works.
If possible, you can make aFunction unbound by using the @staticmethod decorator:
import new

class A(object):
    @staticmethod
    def aFunction(B):  # modified to take the appropriate parameter
        pass

class B(object):
    pass

b = B()
b.aFunction = new.instancemethod(A.aFunction, b, B.__class__)
b.aFunction()
Note: you need to modify the method to take the appropriate parameter.
I'm amazed that none of these answers gives what seems to be the simplest solution, specifically where you want an existing instance to have a method x replaced by a method x (same name) from another class (e.g. a subclass) by being "grafted" onto it.
I had this issue in the very same context, i.e. with PyQt5. Specifically, using a QTreeView class, the problem is that I have subclassed QStandardItem (call it MyItem) for all the tree items involved in the tree structure (first column) and, among other things, adding or removing such an item involves some extra stuff, specifically adding a key to a dictionary or removing the key from this dictionary.
Overriding appendRow (and removeRow, insertRow, takeRow, etc.) presents no problem:
class MyItem( QStandardItem ):
    ...
    def appendRow( self, arg ):
        # NB QStandardItem.appendRow can be called either with an individual item
        # as "arg", or with a list of items
        if isinstance( arg, list ):
            for element in arg:
                assert isinstance( element, MyItem )
                self._on_adding_new_item( element )
        elif isinstance( arg, MyItem ):
            self._on_adding_new_item( arg )
        else:
            raise Exception( f'arg {arg}, type {type(arg)}' )
        super().appendRow( self, arg )
... where _on_adding_new_item is a method which populates the dictionary.
The problem arises when you want to add a "level-0" row, i.e. a row whose parent is the "invisible root" of the QTreeView. Naturally you want the items in this new "level-0" row to cause a dictionary entry to be created for each, but how do you get this invisible root item, of class QStandardItem, to do this?
I tried overriding the model's method invisibleRootItem() to deliver not super().invisibleRootItem(), but instead a new MyItem. This didn't seem to work, probably because QStandardItemModel.invisibleRootItem() does things behind the scenes, e.g. setting the item's model() method to return the QStandardItemModel.
The solution was quite simple:
class MyModel( QStandardItemModel ):
    def __init__( self ):
        self.root_item = None
        ...
    ...
    def invisibleRootItem( self ):
        # if this method has already been called we don't need to do our modification
        if self.root_item != None:
            return self.root_item
        self.root_item = super().invisibleRootItem()
        # now "graft" the method from MyItem class on to the root item instance
        def append_row( row ):
            MyItem.appendRow( self.root_item, row )
        self.root_item.appendRow = append_row
        return self.root_item  # return the (now patched) root item
... this is not quite enough, however: super().appendRow( self, arg ) in MyItem.appendRow will then raise an Exception when called on the root item, because a QStandardItem has no super(). Instead, MyItem.appendRow is changed to this:
def appendRow( self, arg ):
    if isinstance( arg, list ):
        for element in arg:
            assert isinstance( element, MyItem )
            self._on_adding_new_item( element )
    elif isinstance( arg, MyItem ):
        self._on_adding_new_item( arg )
    else:
        raise Exception( f'arg {arg}, type {type(arg)}' )
    QStandardItem.appendRow( self, arg )
Thanks Rohit Jain - this is the answer:
import new

class A(object):
    @staticmethod
    def aFunction(self):
        pass

# Class which doesn't support typecasting
class B(object):
    pass

b = B()
b.aFunction = new.instancemethod(A.aFunction, b, B.__class__)
b.aFunction()

Wrapping a Python Object

I'd like to serialize Python objects to and from the plist format (this can be done with plistlib). My idea was to write a class PlistObject which wraps other objects:
def __init__(self, anObject):
    self.theObject = anObject
and provides a "write" method:
def write(self, pathOrFile):
    plistlib.writePlist(self.theObject.__dict__, pathOrFile)
Now it would be nice if the PlistObject behaved just like the wrapped object itself, meaning that all attributes and methods are somehow "forwarded" to the wrapped object. I realize that the methods __getattr__ and __setattr__ can be used for complex attribute operations:
def __getattr__(self, name):
    return self.theObject.__getattr__(name)
But then of course I run into the problem that the constructor now produces an infinite recursion, since also self.theObject = anObject tries to access the wrapped object.
How can I avoid this? If the whole idea seems like a bad one, tell me too.
Unless I'm missing something, this will work just fine:
def __getattr__(self, name):
    return getattr(self.theObject, name)
Edit: for those thinking that the lookup of self.theObject will result in an infinite recursive call to __getattr__, let me show you:
>>> class Test:
...     a = "a"
...     def __init__(self):
...         self.b = "b"
...     def __getattr__(self, name):
...         return 'Custom: %s' % name
...
>>> Test.a
'a'
>>> Test().a
'a'
>>> Test().b
'b'
>>> Test().c
'Custom: c'
__getattr__ is only called as a last resort. Since theObject can be found in __dict__, no issues arise.
But then of course I run into the problem that the constructor now produces an infinite recursion, since also self.theObject = anObject tries to access the wrapped object.
That's why the manual suggests that you do this for all "real" attribute accesses.
theobj = object.__getattribute__(self, "theObject")
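Putting that advice together, a hedged sketch of the whole wrapper (the write method mirrors the question's code; the rest is an assumption about how one might wire it up, not the original implementation):
import plistlib

class PlistObject(object):
    def __init__(self, anObject):
        # bypass our own attribute hooks when storing the wrapped object
        object.__setattr__(self, "theObject", anObject)

    def write(self, pathOrFile):
        plistlib.writePlist(self.theObject.__dict__, pathOrFile)   # Python 2 plistlib API

    def __getattr__(self, name):
        theobj = object.__getattribute__(self, "theObject")
        return getattr(theobj, name)

    def __setattr__(self, name, value):
        theobj = object.__getattribute__(self, "theObject")
        setattr(theobj, name, value)
Since "theObject" is stored via object.__setattr__, looking it up later never re-enters the overridden hooks, so the constructor no longer recurses.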
I'm glad to see others have been able to help you with the recursive call to __getattr__. Since you've asked for comments on the general approach of serializing to plist, I just wanted to chime in with a few thoughts.
Python's plist implementation handles basic types only, and provides no extension mechanism for you to instruct it on serializing/deserializing complex types. If you define a custom class, for example, writePlist won't be able to help, as you've discovered since you're passing the instance's __dict__ for serialization.
This has a couple implications:
You won't be able to use this to serialize any objects that contain other objects of non-basic type without converting them to a __dict__, and so on recursively for the entire object graph.
If you roll your own object-graph walker to serialize every non-basic object that can be reached, you'll have to worry about cycles in the graph, where one object holds another in a property, which in turn holds a reference back to the first, and so on.
Given that, you may wish to look at pickle instead, as it can handle all of these cases and more. If you need the plist format for other reasons, and you're sure you can stick to "simple" object dicts, then you may wish to just use a simple function... trying to have the PlistObject mock every possible function of the contained object is an onion with potentially many layers, as you need to handle all the possibilities of the wrapped instance.
Something as simple as this may be more pythonic, and keep the usability of the wrapped object simpler by not wrapping it in the first place:
def to_plist(obj, f_handle):
    writePlist(obj.__dict__, f_handle)
I know that doesn't seem very sexy, but it is a lot more maintainable in my opinion than a wrapper given the severe limits of the plist format, and certainly better than artificially forcing all objects in your application to inherit from a common base class when there's nothing in your business domain that actually indicates those disparate objects are related.
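And since pickle was suggested above as the more capable alternative, a minimal sketch of that route (the Point class is a made-up example, not from the question):
import pickle

class Point(object):
    def __init__(self, x, y):
        self.x, self.y = x, y

p = Point(1, 2)
p.twin = p                              # even a cyclic reference is handled
data = pickle.dumps(p)
restored = pickle.loads(data)
assert (restored.x, restored.y) == (1, 2)
assert restored.twin is restored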
