Generate methods dynamically given the keys of a dictionary - python

Looking to find a solution (not sure if one exists!) to the following situation:
The starting point is a dictionary dict = {k1: v1, k2: v2, ..., kn: vn} where n is not fixed.
Is there a way to write a generic class that will have n methods generated dynamically, which can be called as in the following example:
class example(dict):
    ...

example.k1()
example.k2()
...
example.kn()

Each example.ki(), where 1 <= i <= n, should return the corresponding vi.

Instead of creating so many methods dynamically, it is better to override the __getattr__ method of your class and return a callable from there:
class Example(dict):
    def __getattr__(self, k):
        if k in self:
            return lambda: self[k]
        raise TypeError('Example object has no attribute {!r}'.format(k))
Note that for keys such as keys, items, etc., __getattr__ won't be called, because those names are found on the class by __getattribute__ itself. So it is better not to name any of your keys after them.
Demo:
>>> d = Example(a=1, b=2, c=3)
>>> d.a()
1
>>> d.b()
2
>>> d.foo()
Traceback (most recent call last):
  File "<pyshell#14>", line 1, in <module>
    d.foo()
  File "/home/ashwini/py/so.py", line 7, in __getattr__
    raise TypeError('Example object has no attribute {!r}'.format(k))
TypeError: Example object has no attribute 'foo'
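To illustrate the caveat about colliding names, here is a small sketch of my own (the key name keys is a hypothetical example, reusing the Example class above):
d = Example(keys=99, a=1)
print d.a()      # 1
print d.keys()   # the real dict.keys() runs (e.g. ['a', 'keys']); the stored 99 is never returned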

What you want is to override the __getattr__ function described here.
To take your example:
class example(dict):
    def __getattr__(self, name):
        return lambda: self[name]
This allows you to do:
e = example()
e["foo"] = 1
print e.foo()
==> 1
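One caveat (my observation, not part of the answer above): since this __getattr__ never raises AttributeError, every attribute lookup appears to succeed, and a missing key only fails later, with KeyError, when the returned lambda is called:
e = example()
e["foo"] = 1
f = e.missing   # no error here; hasattr(e, 'missing') is always True
f()             # raises KeyError: 'missing'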

I think adding a method to a class dynamically can help you.
class example(object):
    dict = {'k1': 'v1', 'k2': 'v2', 'k3': 'v3', 'kn': 'vn'}

    def getvalue(self, key):
        return self.dict[key]

if __name__ == "__main__":
    e = example()
    e.method1 = e.getvalue  # this adds a callable attribute to the instance dynamically
    print e.method1('k1')
    e.method2 = e.getvalue
    print e.method2('k2')
    e.method3 = e.getvalue
    print e.method3('k3')
    e.methodn = e.getvalue
    print e.methodn('kn')
This outputs:
v1
v2
v3
vn
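For what the question literally asks (one callable per key), here is a minimal sketch of my own; the class name, the _data attribute, and the assumption that every key is a valid Python identifier are mine:
import functools

class Example(object):
    def __init__(self, mapping):
        self._data = dict(mapping)
        for key in self._data:
            # one zero-argument callable per key, attached to the instance
            setattr(self, key, functools.partial(self._data.__getitem__, key))

e = Example({'k1': 'v1', 'k2': 'v2'})
print e.k1()   # v1
print e.k2()   # v2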

Related

override __getattr__ for methods and not variables

I want the following code to work:
class A(object):
    def __getattr__(self, item):
        print item
        return self.item

    def x(self):
        print 4

a = A()
a.x()
and the output to be:
x
4
I know it's not working because x is a class attribute (like a static member) and not an instance attribute.
I saw this: __getattr__ for static/class variables in python, and it doesn't seem to work in my case.
How can it be done?
Thanks.
There are a couple of obvious problems with your code:
class A(object):
    def __getattr__(self, item):  # 1
        print item
        return self.item          # 2

    def x(self):                  # 1 again
        print 4
__getattr__ will only be invoked if item cannot be found the normal way. For item == 'x', therefore, it is never invoked.
Which is probably just as well, since self.item looks for an attribute literally named item, not the attribute whose name is stored in the item parameter. That attribute doesn't exist, so the lookup would invoke __getattr__ again. If you try A().y() you'll get RuntimeError: maximum recursion depth exceeded while calling a Python object.
Instead, I think you want to use __getattribute__, which is always invoked. You need to be careful not to get the same runtime error, though; here I avoid it by calling the superclass implementation of __getattribute__, since the naïve approach of calling getattr(self, item) would fail:
class A(object):
    def __getattribute__(self, item):
        print item
        return super(A, self).__getattribute__(item)

    def x(self):
        print 4
Which gives:
>>> A().x()
x
4
>>> A().y()
y
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 4, in __getattribute__
AttributeError: 'A' object has no attribute 'y'
Note that both __getattr__ and __getattribute__ apply equally to attributes and methods (which are, more or less, just callable attributes).

Questions about details of @property in Python

Assume that I have a class like the following:
class MyClass(object):
    def __init__(self, value=None):
        self.attr = value

    @property
    def attr(self):
        # This acts as a getter?
        # Let's call the function "attr_1" as alias
        return self.__attr

    @attr.setter
    def attr(self, value):
        # This acts as a setter?
        # Let's call the function "attr_2" as alias
        self.__attr = value

inst = MyClass(1)
I read the documentation on descriptors and looked at the implementation of the property class.
As far as I know, when I type inst.attr, the following happens:
1. The first attr (whose alias is attr_1) is found, and attr is now an instance of the property class, which is a data descriptor.
2. Therefore, it will override the instance dictionary, which means type(inst).__dict__['attr'].__get__(inst, type(inst)) is invoked.
3. attr.__get__(inst, type(inst)) invokes attr.fget(inst), where fget() is in fact the attr(self) (the "raw" attr_1 function).
4. Finally, attr.fget(inst) returns inst.__attr.
Here comes the first question: the class MyClass does not have an attribute __attr, so how should inst.__attr in step 3 be interpreted?
Similarly, in the emulated setter, how does Python find an attribute inst.__attr to assign the value to?
And a trivial question: since property is a class, why is it not named Property instead of property?
Your question is not directly related to properties, actually, or to the way they work as data descriptors. It's just the way Python fakes private attributes: names that start with two underscores get mangled.
>>> inst.__attr
Traceback (most recent call last):
  File "<pyshell#4>", line 1, in <module>
    inst.__attr
AttributeError: 'MyClass' object has no attribute '__attr'
Consider that you wrote your code using an internal variable with a single underscore (usually the convention for "you shouldn't touch this, but I won't enforce it; do so at your own risk"):
>>> class MyClass2(object):
...     def __init__(self, value=None):
...         self.attr = value
...     @property
...     def attr(self):
...         # This acts as a getter?
...         # Let's call the function "attr_1" as alias
...         return self._attr
...     @attr.setter
...     def attr(self, value):
...         # This acts as a setter?
...         # Let's call the function "attr_2" as alias
...         self._attr = value
...
>>> inst2 = MyClass2(1)
>>> inst2._attr
1
And you can see the trick by peeking at the object's __dict__
>>> inst2.__dict__
{'_attr': 1}
>>> inst.__dict__
{'_MyClass__attr': 1}
Just some more code to convince you that this has nothing to do with properties:
>>> class OtherClass(object):
...     def __init__(self, value):
...         self.__attr = value
...     def get_attr(self):
...         return self.__attr
...     def set_attr(self, value):
...         self.__attr = value
...
>>> other_inst = OtherClass(1)
>>> other_inst.get_attr()
1
>>> other_inst.__attr
Traceback (most recent call last):
  File "<pyshell#17>", line 1, in <module>
    other_inst.__attr
AttributeError: 'OtherClass' object has no attribute '__attr'
>>> other_inst.__dict__
{'_OtherClass__attr': 1}
>>> other_inst._OtherClass__attr
1
>>> other_inst._OtherClass__attr = 24
>>> other_inst.get_attr()
24
>>> inst._MyClass__attr = 23
>>> inst.attr
23
Concerning your last question: I just don't think there is a convention in Python that class names must start with an uppercase letter. property is not an isolated case (datetime, itemgetter, csv.reader, ...).
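As a side note on steps 1-4 from the question, here is a small sketch of my own (reusing inst2 and MyClass2 from above) that spells out the descriptor lookup the property performs:
prop = type(inst2).__dict__['attr']       # the property object stored on the class
print(prop.__get__(inst2, type(inst2)))   # what inst2.attr does -> 1
print(prop.fget(inst2))                   # calling the raw getter directly -> 1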

Why can't I create a default, ordered dict by inheriting OrderedDict and defaultdict?

My first attempt to combine the features of two dictionaries in the collections module was to create a class that inherits them:
from collections import OrderedDict, defaultdict

class DefaultOrderedDict(defaultdict, OrderedDict):
    def __init__(self, default_factory=None, *a, **kw):
        super().__init__(default_factory, *a, **kw)
However, I cannot assign an item to this dictionary:
d = DefaultOrderedDict(lambda: 0)
d['a'] = 1
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/lib64/python3.3/collections/__init__.py", line 64, in __setitem__
    self.__map[key] = link = Link()
AttributeError: 'DefaultOrderedDict' object has no attribute '_OrderedDict__map'
Indeed, this question about how to create a similar object has answers that achieve it by extending the OrderedDict class and manually re-implementing the additional methods provided by defaultdict. Using multiple inheritance would be cleaner. Why doesn't it work?
The reason is that defaultdict's __init__, instead of calling the __init__ of the next class in the MRO, calls the __init__ of PyDict_Type directly, so attributes like __map that are set in OrderedDict's __init__ are never initialized, hence the error.
>>> DefaultOrderedDict.mro()
[<class '__main__.DefaultOrderedDict'>,
<class 'collections.defaultdict'>,
<class 'collections.OrderedDict'>,
<class 'dict'>, <class 'object'>]
And defaultdict doesn't define its own __setitem__ method:
>>> defaultdict.__setitem__
<slot wrapper '__setitem__' of 'dict' objects>
>>> dict.__setitem__
<slot wrapper '__setitem__' of 'dict' objects>
>>> OrderedDict.__setitem__
<unbound method OrderedDict.__setitem__>
So, when you call d['a'] = 1, the search for __setitem__ reaches OrderedDict's __setitem__, and there the access to the uninitialized __map attribute raises the error.
A fix is to call __init__ of both defaultdict and OrderedDict explicitly:
class DefaultOrderedDict(defaultdict, OrderedDict):
    def __init__(self, default_factory=None, *a, **kw):
        for cls in DefaultOrderedDict.mro()[1:-2]:
            cls.__init__(self, *a, **kw)
        # the loop above never passes the factory along, so set it explicitly
        self.default_factory = default_factory
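A quick sanity check of the fixed class (a sketch; the 'missing' key is my own example and the expected outputs are shown as comments):
d = DefaultOrderedDict(lambda: 0)
d['a'] = 1
print(d['a'])        # 1
print(d['missing'])  # 0, created by the default factory
print(list(d))       # ['a', 'missing'], insertion order preserved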
Perhaps you are coming from a Java background, but multiple inheritance doesn't do what you might expect in Python. Calling super() from the __init__ of DefaultOrderedDict runs the __init__ of defaultdict and never the __init__ of OrderedDict. The __map attribute is first defined in the __init__ method of OrderedDict. The implementation is the following (from the source):
def __init__(self, *args, **kwds):
    '''Initialize an ordered dictionary. The signature is the same as
    regular dictionaries, but keyword arguments are not recommended because
    their insertion order is arbitrary.
    '''
    if len(args) > 1:
        raise TypeError('expected at most 1 arguments, got %d' % len(args))
    try:
        self.__root
    except AttributeError:
        self.__root = root = []  # sentinel node
        root[:] = [root, root, None]
        self.__map = {}
    self.__update(*args, **kwds)
Note that this doesn't have to do with the attribute being private. A minimal example with multiple inheritance can illustrate this:
class Foo:
    def __init__(self):
        self.foo = 2

class Bar:
    def __init__(self):
        self.bar = 1

class FooBar(Foo, Bar):
    def __init__(self):
        super().__init__()

fb = FooBar()
fb.foo
>> 2
fb.bar
>> AttributeError: 'FooBar' object has no attribute 'bar'
So the constructor of Bar was never called. Python's method resolution order goes from left to right until it finds a class that defines the method it is looking for (in this case __init__), and since Foo.__init__ does not itself call super().__init__(), the classes to its right (in this case Bar) are never reached.
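For contrast, a cooperative version (a sketch, assuming Python 3): when every class in the chain calls super().__init__(), both initializers run and both attributes exist:
class Foo:
    def __init__(self):
        super().__init__()   # keep forwarding along the MRO
        self.foo = 2

class Bar:
    def __init__(self):
        super().__init__()
        self.bar = 1

class FooBar(Foo, Bar):
    pass

fb = FooBar()
print(fb.foo, fb.bar)   # 2 1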

How to handle & return both properties AND functions missing in a Python class using the __getattr__ function?

It is fairly easy to use the __getattr__ special method on Python classes to handle either missing properties or functions, but seemingly not both at the same time.
Consider this example which handles any property requested which is not defined explicitly elsewhere in the class...
class Props:
    def __getattr__(self, attr):
        return 'some_new_value'
>>> p = Props()
>>> p.prop # Property get handled
'some_new_value'
>>> p.func('an_arg', kw='keyword') # Function call NOT handled
Traceback (most recent call last):
  File "<console>", line 1, in <module>
TypeError: 'str' object is not callable
Next, consider this example which handles any function call not defined explicitly elsewhere in the class...
class Funcs:
    def __getattr__(self, attr):
        def fn(*args, **kwargs):
            # Do something with the function name and any passed arguments or keywords
            print attr
            print args
            print kwargs
            return
        return fn
>>> f = Funcs()
>>> f.prop # Property get NOT handled
<function fn at 0x10df23b90>
>>> f.func('an_arg', kw='keyword') # Function call handled
func
('an_arg',)
{'kw': 'keyword'}
The question is how to handle both types of missing attributes in the same __getattr__. How can I detect whether the attribute was requested in property notation or in method notation with parentheses, and return either a value or a function respectively? Essentially I want to handle SOME missing property attributes AND SOME missing function attributes, and then fall back to the default behavior for all the other cases.
Advice?
How to detect if the attribute requested was in property notation or in method notation with parentheses and return either a value or a function respectively?
You can't. You also can't tell whether a requested method is an instance, class, or static method, etc. All you can tell is that someone is trying to retrieve an attribute for read access. Nothing else is passed into the getattribute machinery, so nothing else is available to your code.
So, you need some out-of-band way to know whether to create a function or some other kind of value. This is actually pretty common: you may be proxying for some other object that does have a value/function distinction (think of ctypes or PyObjC), or you may have a naming convention, etc.
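For instance, here is a minimal sketch of the naming-convention idea (the call_ prefix and the Dynamic class are purely hypothetical): names starting with the chosen prefix get a function, everything else gets a plain value:
class Dynamic(object):
    def __getattr__(self, attr):
        if attr.startswith('call_'):
            def fn(*args, **kwargs):
                # hypothetical handler for "method-like" names
                print attr, args, kwargs
            return fn
        return 'some_new_value'

d = Dynamic()
print d.prop                          # some_new_value
d.call_func('an_arg', kw='keyword')   # call_func ('an_arg',) {'kw': 'keyword'}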
However, you could always return an object that can be used either way. For example, if your "default behavior" is to return attributes that are integers, or functions that return an integer, you can return something like this:
class Integerizer(object):
    def __init__(self, value):
        self.value = value

    def __int__(self):
        return self.value

    def __call__(self, *args, **kw):
        return self.value
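A hypothetical usage sketch of that idea (the Defaults class and the value 42 are mine; it reuses the Integerizer class above): every missing attribute comes back as an object that works both as a value and as a call:
class Defaults(object):
    def __getattr__(self, attr):
        return Integerizer(42)

d = Defaults()
print int(d.anything)   # 42, used like a value
print d.anything()      # 42, used like a function call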
There is no way to detect how the returned attribute was intended to be used. Everything on Python objects is an attribute, including the methods:
>>> class Foo(object):
...     def bar(self): print 'bar called'
...     spam = 'eggs'
...
>>> Foo.bar
<unbound method Foo.bar>
>>> Foo.spam
'eggs'
Python first looks up the attribute (bar or spam), and if you meant to call it (you added parentheses) then Python invokes the callable after looking up the attribute:
>>> foo = Foo()
>>> fbar = foo.bar
>>> fbar()
bar called
In the above code I separated the lookup of bar from calling bar.
Since there is no distinction, you cannot detect in __getattr__ what the returned attribute will be used for.
__getattr__ is called whenever normal attribute access fails; in the following example monty is defined on the class, so __getattr__ is not called; it is only called for bar.eric and bar.john:
>>> class Bar(object):
...     monty = 'python'
...     def __getattr__(self, name):
...         print 'Attribute access for {0}'.format(name)
...         if name == 'eric':
...             return 'idle'
...         raise AttributeError(name)
...
>>> bar = Bar()
>>> bar.monty
'python'
>>> bar.eric
Attribute access for eric
'idle'
>>> bar.john
Attribute access for john
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 7, in __getattr__
AttributeError: john
Note that functions are not the only objects that you can invoke (call); any custom class that implements the __call__ method will do:
>>> class Baz(object):
...     def __call__(self, name):
...         print 'Baz sez: "Hello {0}!"'.format(name)
...
>>> baz = Baz()
>>> baz('John Cleese')
Baz sez: "Hello John Cleese!"
You could use that to return objects from __getattr__ that can both be called and be used as a plain value in different contexts.

How to automate the delegation of __special_methods__ in Python?

Let spam be an instance of some class Spam, and suppose that spam.ham is an object of some built-in type, say dict. Even though Spam is not a subclass of dict, I would like its instances to have the same API as a regular dict (i.e. the same methods with the same signatures), but I want to avoid typing out a bazillion boilerplate methods of the form:
def apimethod(self, this, that):
    return self.ham.apimethod(this, that)
I tried the following:
class Spam(object):
    def __init__(self):
        self.ham = dict()

    def __getattr__(self, attr):
        return getattr(self.ham, attr)
...but it works for "regular" methods, like keys and items, but not for special methods, like __setitem__, __getitem__, and __len__:
>>> spam = Spam()
>>> spam.keys()
[]
>>> spam['eggs'] = 42
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: 'Spam' object does not support item assignment
>>> spam.ham['eggs'] = 42
>>> spam.items()
[('eggs', 42)]
>>> spam['eggs']
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: 'Spam' object is not subscriptable
>>> len(spam)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: 'Spam' object has no len()
All the special methods I tried produced similar errors.
How can I automate the definition of special methods (so that they are forwarded to the delegate)?
Clarification: I'm not necessarily looking for solutions that leverage the standard method lookup sequence. My goal here is to minimize boilerplate code.
Thanks!
This may not be helpful if you need a solution that prohibits metaclasses as well, but here is the solution I came up with:
def _wrapper(func):
    def _wrapped(self, *args, **kwargs):
        return getattr(self.ham, func)(*args, **kwargs)
    return _wrapped

class DictMeta(type):
    def __new__(cls, name, bases, dct):
        default_attrs = dir(object)
        for attr in dir(dict):
            if attr not in default_attrs:
                dct[attr] = _wrapper(attr)
        return type.__new__(cls, name, bases, dct)

class Spam(object):
    __metaclass__ = DictMeta

    def __init__(self):
        self.ham = dict()
Seems to do what you're looking for:
>>> spam = Spam()
>>> spam['eggs'] = 42
>>> spam.items()
[('eggs', 42)]
>>> len(spam)
1
>>> spam.ham
{'eggs': 42}
If on Python 3.x use class Spam(object, metaclass=DictMeta) and remove the __metaclass__ line from the body of Spam.
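A Python 3 sketch of the same idea, for reference (I restate the helpers so the snippet stands on its own; details such as the setdefault call are my choice, not part of the original answer):
def _wrapper(name):
    def _wrapped(self, *args, **kwargs):
        return getattr(self.ham, name)(*args, **kwargs)
    return _wrapped

class DictMeta(type):
    def __new__(cls, clsname, bases, dct):
        for attr in set(dir(dict)) - set(dir(object)):
            dct.setdefault(attr, _wrapper(attr))   # don't clobber methods defined in the class body
        return super().__new__(cls, clsname, bases, dct)

class Spam(metaclass=DictMeta):
    def __init__(self):
        self.ham = dict()

spam = Spam()
spam['eggs'] = 42
print(len(spam), dict(spam.items()))   # 1 {'eggs': 42}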
This looks like a job for ... a metaclass!
def make_method(p, m):
    def method(self, *a, **k):
        return getattr(getattr(self, p), m)(*a, **k)
    return method

class Proxier(type):
    def __new__(cls, name, bases, dict):
        objs = dict.get('proxyobjs', [])
        if objs:
            old_init = dict.get('__init__', lambda self: None)
            def new_init(self, *a, **k):
                for (n, v) in objs.iteritems():
                    setattr(self, n, v())
                old_init(self, *a, **k)
            dict['__init__'] = new_init
        meths = dict.get('proxymethods', {})
        for (proxyname, methnames) in meths.iteritems():
            for methname in methnames:
                dict[methname] = make_method(proxyname, methname)
        return super(Proxier, cls).__new__(cls, name, bases, dict)

class Spam(object):
    __metaclass__ = Proxier
    proxyobjs = {'ham': dict,
                 'eggs': list,
                 }
    proxymethods = {'ham': ('__setitem__', '__getitem__', '__delitem__'),
                    'eggs': ('__contains__', 'append'),
                    }
It works!
In [28]: s = Spam()
In [29]: s[4] = 'hi'
In [30]: s.append(3)
In [31]: 3 in s
Out[31]: True
In [32]: 4 in s
Out[32]: False
In [33]: s[4]
Out[33]: 'hi'
Note that you have to specify which parts of the interface you're using (otherwise, why not just inherit?). So we have __contains__ from list, __getitem__ from dict, and __iter__ from neither. (And only one way to mutate the underlying list, using append but not extend or __delitem__.) So (like Martian) I'm not sure how useful this will be.
Attribute access for special methods doesn't obey the normal attribute access rules; basically, those methods MUST exist at class level. Read http://docs.python.org/reference/datamodel.html#special-method-lookup-for-new-style-classes
So you need to add all those methods either manually, or you can add them to the class programmatically, and the best way to do that is through a metaclass. Also note that I am not adding all of dict's methods, only the special methods, because the rest can easily be redirected through __getattr__.
def redirect(methodname):
    def _redirect(self, *args, **kwargs):
        print "redirecting", methodname
        method = getattr(self.ham, methodname)
        return method(*args, **kwargs)
    return _redirect

class DictRedirect(object):
    def __new__(cls, name, bases, attrs):
        # re-create all special methods from dict
        dict_attr_names = set(dir(dict))
        common_names = set(dir(cls))
        for methodname in dict_attr_names - common_names:
            if not methodname.startswith('__'):
                continue
            attrs[methodname] = redirect(methodname)
        return type(name, bases, attrs)

class Spam(object):
    __metaclass__ = DictRedirect

    def __init__(self):
        self.ham = dict()

    def __getattr__(self, name):
        return getattr(self.ham, name)

spam = Spam()
spam['eggs'] = 'yolk'
print 'keys =', spam.keys()
print spam['eggs']
output:
redirecting __setitem__
keys = ['eggs']
redirecting __getitem__
yolk
Disclaimer: IMO this is too much magic and should be avoided except for having fun :)
Not sure __getattribute__ will help; the reason is that the special methods are looked up in the class, not in the instance: http://docs.python.org/reference/datamodel.html#special-method-lookup-for-new-style-classes (after all, the special methods __getattr__ and __getattribute__ themselves have to be looked up somewhere).
Proxying like this seems to be asking for trouble without careful thought: for example, how should things like __dict__ and __class__ behave, and what about possible method conflicts if your wrapper happens to have methods of its own? There are surely other problems.
Re: is-a vs. has-a:
If you just duplicate the whole interface of the contained member, that seems like an anti-pattern to me, as that's what inheritance is for. What if you have two has-a relations to two dict objects?
In a has-a relation, one usually picks the useful methods, often exporting them under different names to make a sensible API. So instead of Spam.append(item) you would have Spam.addBot(bot).
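To make that concrete, a small sketch (addBot and the underlying list are hypothetical names): delegate explicitly and expose only the parts of the contained object that belong in the wrapper's API:
class Spam(object):
    def __init__(self):
        self._bots = []                  # has-a: a contained list, not a base class

    def addBot(self, bot):
        self._bots.append(bot)           # list.append, re-exported under a clearer name

    def __contains__(self, bot):
        return bot in self._bots         # only the operators that make sense for Spam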
