How can I protect a class so that attributes cannot be added to its instances like this:
class foo(object):
    pass

x = foo()
x.someRandomAttr = 3.14
If you want an immutable object, use the collections.namedtuple() factory to create a class for you:
from collections import namedtuple
foo = namedtuple('foo', ('bar', 'baz'))
Demo:
>>> from collections import namedtuple
>>> foo = namedtuple('foo', ('bar', 'baz'))
>>> f = foo(42, 38)
>>> f.someattribute = 42
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
AttributeError: 'foo' object has no attribute 'someattribute'
>>> f.bar
42
Note that the whole object is immutable; you cannot change f.bar after the fact either:
>>> f.bar = 43
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
AttributeError: can't set attribute
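Note also that, since the tuple itself can never change, the usual way to get a modified value is to build a new instance; namedtuple generates a _replace() helper for exactly this:
>>> f = foo(42, 38)
>>> f._replace(bar=43)   # returns a new instance; f itself is unchanged
foo(bar=43, baz=38)
>>> f
foo(bar=42, baz=38)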
Override the __setattr__ method:
>>> class Foo(object):
...     def __setattr__(self, var, val):
...         raise TypeError("You're not allowed to do this")
...
>>> Foo().x = 1
Traceback (most recent call last):
File "<ipython-input-31-be77d2b3299a>", line 1, in <module>
Foo().x = 1
File "<ipython-input-30-cb58a6713335>", line 3, in __setattr__
raise TypeError("You're not allowed to do this")
TypeError: You're not allowed to do this
Even Foo's subclasses will raise the same error:
>>> class Bar(Foo):
...     pass
...
>>> Bar().x = 1
Traceback (most recent call last):
File "<ipython-input-35-35cd058c173b>", line 1, in <module>
Bar().x = 1
File "<ipython-input-30-cb58a6713335>", line 3, in __setattr__
raise TypeError("You're not allowed to do this")
TypeError: You're not allowed to do this
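If the class still needs to assign its own attributes (for example in __init__), a common variation is to freeze the instance only once initialization has finished. A minimal sketch of that idea, with an illustrative _frozen flag and class name (not part of the answer above):

class FrozenFoo(object):
    def __init__(self, bar):
        self.bar = bar        # goes through __setattr__, but _frozen is not set yet
        self._frozen = True   # every assignment after this point is rejected

    def __setattr__(self, var, val):
        if getattr(self, '_frozen', False):
            raise TypeError("You're not allowed to do this")
        object.__setattr__(self, var, val)

FrozenFoo(42).someRandomAttr = 3.14 now raises the same TypeError, while the assignments made inside __init__ still go through.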
I am trying to access a function stored inside an Enum using its name, but I get a KeyError:
from enum import Enum

def f():
    pass

class MyEnum(Enum):
    function = f

print MyEnum.function     # <unbound method MyEnum.f>
print MyEnum['function']  # KeyError: 'function'
But it works if the Enum doesn't store a function:
from enum import Enum

class MyEnum(Enum):
    a = "toto"

print MyEnum.a           # MyEnum.a
print MyEnum.a.value     # "toto"
print MyEnum['a']        # MyEnum.a
print MyEnum['a'].value  # "toto"
I know I could use a dict instead of an Enum, but I want to know why Enum behaves differently.
Assigning a function is the same as defining it, and if you define a function in an Enum it becomes a method of the Enum rather than being taken as a member value.
The following enums A and B are completely equivalent:
>>> from enum import Enum
>>>
>>> class A(Enum):
...     a = 1
...     def f(self):
...         print('Hello')
...
>>> def x(self):
...     print('Hello')
...
>>> class B(Enum):
...     a = 1
...     f = x
...
>>> B.f
<unbound method B.x>
>>> A.f
<unbound method A.f>
>>> A['a']
<A.a: 1>
>>> A['f']
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/lib/python2.7/dist-packages/enum/__init__.py", line 384, in __getitem__
return cls._member_map_[name]
KeyError: 'f'
>>> B['a']
<B.a: 1>
>>> B['f']
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/lib/python2.7/dist-packages/enum/__init__.py", line 384, in __getitem__
return cls._member_map_[name]
KeyError: 'f'
Functions are treated differently because otherwise it would be impossible to define custom methods in an enum.
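If you do want a member whose value is a callable, a common workaround is to wrap the function in something that is not a descriptor, for example functools.partial, so the Enum machinery stores it as a value rather than a method (a sketch based on the question's f):

from enum import Enum
from functools import partial

def f():
    print('Hello')

class MyEnum(Enum):
    function = partial(f)   # partial has no __get__, so it is kept as a member value

MyEnum['function'].value()  # looks up the member by name and calls f: prints 'Hello'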
I have a problem with AttributeErrors raised inside a @property in combination with __getattr__() in Python:
Example code:
>>> def deeply_nested_factory_fn():
...     a = 2
...     return a.invalid_attr
...
>>> class Test(object):
...     def __getattr__(self, name):
...         if name == 'abc':
...             return 'abc'
...         raise AttributeError("'Test' object has no attribute '%s'" % name)
...     @property
...     def my_prop(self):
...         return deeply_nested_factory_fn()
...
>>> test = Test()
>>> test.my_prop
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<stdin>", line 5, in __getattr__
AttributeError: 'Test' object has no attribute 'my_prop'
In my case, this is a highly misleading error message, because it hides the fact that deeply_nested_factory_fn() has a mistake.
Based on the idea in Tadhg McDonald-Jensen's answer, my current best solution is the following. Any hints on how to get rid of the __main__. prefix on AttributeError and the reference to attributeErrorCatcher in the traceback would be much appreciated.
>>> def catchAttributeErrors(func):
...     AttributeError_org = AttributeError
...     def attributeErrorCatcher(*args, **kwargs):
...         try:
...             return func(*args, **kwargs)
...         except AttributeError_org as e:
...             import sys
...             class AttributeError(Exception):
...                 pass
...             etype, value, tb = sys.exc_info()
...             raise AttributeError(e).with_traceback(tb.tb_next) from None
...     return attributeErrorCatcher
...
>>> def deeply_nested_factory_fn():
...     a = 2
...     return a.invalid_attr
...
>>> class Test(object):
...     def __getattr__(self, name):
...         if name == 'abc':
...             # computing some other attributes
...             return 'abc'
...         raise AttributeError("'Test' object has no attribute '%s'" % name)
...     @property
...     @catchAttributeErrors
...     def my_prop(self):
...         return deeply_nested_factory_fn()
...
>>> class Test1(object):
...     def __init__(self):
...         test = Test()
...         test.my_prop
...
>>> test1 = Test1()
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<stdin>", line 4, in __init__
File "<stdin>", line 11, in attributeErrorCatcher
File "<stdin>", line 10, in my_prop
File "<stdin>", line 3, in deeply_nested_factory_fn
__main__.AttributeError: 'int' object has no attribute 'invalid_attr'
If you're willing to exclusively use new-style classes, you could overload __getattribute__ instead of __getattr__:
class Test(object):
    def __getattribute__(self, name):
        if name == 'abc':
            return 'abc'
        else:
            return object.__getattribute__(self, name)

    @property
    def my_prop(self):
        return deeply_nested_factory_fn()
Now your stack trace will properly mention deeply_nested_factory_fn.
Traceback (most recent call last):
File "C:\python\myprogram.py", line 16, in <module>
test.my_prop
File "C:\python\myprogram.py", line 10, in __getattribute__
return object.__getattribute__(self, name)
File "C:\python\myprogram.py", line 13, in my_prop
return deeply_nested_factory_fn()
File "C:\python\myprogram.py", line 3, in deeply_nested_factory_fn
return a.invalid_attr
AttributeError: 'int' object has no attribute 'invalid_attr'
Just in case others find this: the problem with the example on top is that an AttributeError is raised inside __getattr__. Instead, one should call self.__getattribute__(attr) to let that raise.
Example
def deeply_nested_factory_fn():
    a = 2
    return a.invalid_attr

class Test(object):
    def __getattr__(self, name):
        if name == 'abc':
            return 'abc'
        return self.__getattribute__(name)

    @property
    def my_prop(self):
        return deeply_nested_factory_fn()

test = Test()
test.my_prop
This yields
AttributeError Traceback (most recent call last)
Cell In [1], line 15
12 return deeply_nested_factory_fn()
14 test = Test()
---> 15 test.my_prop
Cell In [1], line 9, in Test.__getattr__(self, name)
7 if name == 'abc':
8 return 'abc'
----> 9 return self.__getattribute__(name)
Cell In [1], line 12, in Test.my_prop(self)
10 @property
11 def my_prop(self):
---> 12 return deeply_nested_factory_fn()
Cell In [1], line 3, in deeply_nested_factory_fn()
1 def deeply_nested_factory_fn():
2 a = 2
----> 3 return a.invalid_attr
AttributeError: 'int' object has no attribute 'invalid_attr'
You can create a custom Exception that appears to be an AttributeError but will not trigger __getattr__ since it is not actually an AttributeError.
UPDATED: the traceback message is greatly improved by reassigning the .__traceback__ attribute before re-raising the error:
import functools

class AttributeError_alt(Exception):
    @classmethod
    def wrapper(err_type, f):
        """wraps a function to reraise an AttributeError as the alternate type"""
        @functools.wraps(f)
        def alt_AttrError_wrapper(*args, **kw):
            try:
                return f(*args, **kw)
            except AttributeError as e:
                new_err = err_type(e)
                new_err.__traceback__ = e.__traceback__.tb_next
                raise new_err from None
        return alt_AttrError_wrapper
Then when you define your property as:
@property
@AttributeError_alt.wrapper
def my_prop(self):
    return deeply_nested_factory_fn()
and the error message you will get will look like this:
Traceback (most recent call last):
File ".../test.py", line 34, in <module>
test.my_prop
File ".../test.py", line 14, in alt_AttrError_wrapper
raise new_err from None
File ".../test.py", line 30, in my_prop
return deeply_nested_factory_fn()
File ".../test.py", line 20, in deeply_nested_factory_fn
return a.invalid_attr
AttributeError_alt: 'int' object has no attribute 'invalid_attr'
Notice there is a line for raise new_err from None, but it is above the lines from within the property call. There would also be a line for return f(*args, **kw), but that is omitted thanks to .tb_next.
I am fairly sure the best solution to your problem has already been suggested; see the previous revision of my answer for why I think it is the best option. Although honestly, if an error is incorrectly being suppressed, then raise a bloody RuntimeError chained to the one that would otherwise be hidden:
def assert_no_AttributeError(f):
    @functools.wraps(f)
    def assert_no_AttrError_wrapper(*args, **kw):
        try:
            return f(*args, **kw)
        except AttributeError as e:
            e.__traceback__ = e.__traceback__.tb_next
            raise RuntimeError("AttributeError was incorrectly raised") from e
    return assert_no_AttrError_wrapper
Then if you decorate your property with this, you will get an error like this:
Traceback (most recent call last):
File ".../test.py", line 27, in my_prop
return deeply_nested_factory_fn()
File ".../test.py", line 17, in deeply_nested_factory_fn
return a.invalid_attr
AttributeError: 'int' object has no attribute 'invalid_attr'
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File ".../test.py", line 32, in <module>
x.my_prop
File ".../test.py", line 11, in assert_no_AttrError_wrapper
raise RuntimeError("AttributeError was incorrectly raised") from e
RuntimeError: AttributeError was incorrectly raised
Although if you expect more than just one thing to raise an AttributeError, you might want to overload __getattribute__ to check for unexpected errors on all lookups:
def __getattribute__(self, attr):
    try:
        return object.__getattribute__(self, attr)
    except AttributeError as e:
        if str(e) == "{0.__class__.__name__!r} object has no attribute {1!r}".format(self, attr):
            raise  # normal case of "attribute not found"
        else:  # if the error message was anything else then it *causes* a RuntimeError
            raise RuntimeError("Unexpected AttributeError") from e
This way, when something goes wrong that you are not expecting, you will know right away!
I am curious how I can assign a variable from outside a function object. Before I tried it, I thought I knew how it could be done.
>>> def f():
...     print(x)
...
>>> f.x=2
>>> f()
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<stdin>", line 2, in f
NameError: name 'x' is not defined
>>>
I then tried:
>>> class c:
...     def f(self):
...         print(x)
...
>>> y=c();y.x=2;y.f()
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<stdin>", line 3, in f
NameError: name 'x' is not defined
The same error. Now, I thought, this just has to work:
>>> class c:
...     def makef(self):
...         return lambda x=x: print(x)
...
>>> y = c();y.x = 2;y.makef()()
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<stdin>", line 3, in makef
NameError: name 'x' is not defined
Alas, it did not. How can I assign a variable accessible to a function after the function has been defined? This is just a curiosity. There's really no reason (that I can think of) for not just passing a parameter.
class Name:
    def __init__(self):
        self.x = None

    def f(self):
        print self.x

a = Name()
a.x = 'Value'
a.f()
Output:
Value
I discovered a way of doing what I was trying to accomplish: modify the function's globals dictionary. (Note that f.__dict__ only holds function attributes, which name lookups inside the body never see; the __globals__ mapping is what the body actually reads.)
>>> def f():
...     print(x)
...
>>> f.__globals__['x'] = 2
>>> f()
2
Basically, if you define your variable in the main program, you can use the global keyword inside the function to assign to it:
bah = 1

def function():
    global bah
    bah = 2
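Note that the global statement is only needed when the function assigns to the name; for a plain read it is enough that the name exists in the module globals by the time the function is called, which is essentially what the original f needed. A quick sketch:

def f():
    print(x)   # x is looked up in the module's global namespace at call time

x = 2          # bound after f was defined, but before the call
f()            # prints 2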
Can someone explain to me why the following code produces the exception it does?
>>> class CallableKlass(object):
...     def __init__(self, callible):
...         self.callible = callible
...     def __call__(self, arg):
...         return self.callible(arg)
...
>>> class Klass(object):
...     d = {'foo': 'bar'}
...     m = CallableKlass(lambda x: d[x])
...
>>> Klass.m('foo')
Traceback (most recent call last):
File "<pyshell#10>", line 1, in <module>
Klass.m('foo')
File "<pyshell#5>", line 5, in __call__
return self.callible(arg)
File "<pyshell#9>", line 3, in <lambda>
m = CallableKlass(lambda x: d[x])
NameError: global name 'd' is not defined
The class namespace (stuff defined directly in the class body) is not accessible from within functions defined in that namespace. A lambda is just a function, so this applies to lambdas as well. Your CallableKlass is a red herring. The behavior is the same in this simpler case:
>>> class Foo(object):
...     d = {'foo': 'bar'}
...     (lambda stuff: d[stuff])('foo')
Traceback (most recent call last):
File "<pyshell#3>", line 1, in <module>
class Foo(object):
File "<pyshell#3>", line 3, in Foo
(lambda stuff: d[stuff])('foo')
File "<pyshell#3>", line 3, in <lambda>
(lambda stuff: d[stuff])('foo')
NameError: global name 'd' is not defined
>>> class Foo(object):
...     d = {'foo': 'bar'}
...     def f(stuff):
...         d[stuff]
...     f('foo')
Traceback (most recent call last):
File "<pyshell#4>", line 1, in <module>
class Foo(object):
File "<pyshell#4>", line 5, in Foo
f('foo')
File "<pyshell#4>", line 4, in f
d[stuff]
NameError: global name 'd' is not defined
You should use Klass.d inside the lambda, because variables declared inside a class become attributes of that class. That is why your program raised that error: at call time it could not find anything named d among the global variables:
class Klass(object):
    d = {'foo': 'bar'}
    m = CallableKlass(lambda x: Klass.d[x])
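Another workaround (a sketch, assuming the CallableKlass from the question) is to capture d as a default argument; default values are evaluated while the class body is executing, where d is still visible:

class Klass(object):
    d = {'foo': 'bar'}
    # d=d is evaluated here, in the class body, so the lambda keeps its own reference
    m = CallableKlass(lambda x, d=d: d[x])

Klass.m('foo')   # returns 'bar'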
Usually, you can set an arbitrary attribute on a custom object, for instance:

>>> class A(object):
...     pass
...
>>> a = A()
>>> a.foo = 42
>>> a.__dict__
{'foo': 42}
>>>
On the other hand, you can't do the same with a string object:

>>> a = str("bar")
>>> a.foo = 42
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
AttributeError: 'str' object has no attribute 'foo'
>>> a.__dict__
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
AttributeError: 'str' object has no attribute '__dict__'
>>>
Why?
Because str is a type which does not have a __dict__ attribute. From the docs, "Classes" section:
A class has a namespace implemented by a dictionary object.
Class attribute references are translated to lookups in this
dictionary, e.g., C.x is translated to C.__dict__["x"]
You can also enforce something similar on custom objects:
>>> class X(object):
...     __slots__ = ('a',)
...
>>> a = X()
>>> a.a = 2
>>> a.foo = 2
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
AttributeError: 'X' object has no attribute 'foo'
In general, you should not set or modify fields of objects that you are not supposed to. The documentation of the specific data type should tell you which fields are available for public modification.
For example, a ReadOnlyPoint object, where the x and y coordinates are set only at construction time:
>>> class ReadOnlyPoint(object):
...     __slots__ = ('_x', '_y')
...     def __init__(self, x, y):
...         self._x = x
...         self._y = y
...     def getx(self):
...         return self._x
...     def gety(self):
...         return self._y
...     x = property(getx)
...     y = property(gety)
...
>>> p = ReadOnlyPoint(2, 3)
>>> print p.x, p.y
2 3
>>> p.x = 9
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
AttributeError: can't set attribute
>>> p._x = 9
>>> print p.x, p.y
9 3
While the x and y properties are read-only, accessing the object internals allows you to alter the object's state.
The inability to add a new field to a str object is an implementation detail of the built-in type, specific to the Python implementation you are using.
http://docs.python.org/reference/datamodel.html
If the class has a __setattr__() or __delattr__() method, this is called instead of updating the instance dictionary directly.
http://docs.python.org/reference/datamodel.html#object.__setattr__
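As a small illustration of that hook (the Logged class name is just for this sketch), a __setattr__ that intercepts every assignment and delegates to object.__setattr__ to actually update the instance dictionary:

class Logged(object):
    def __setattr__(self, name, value):
        print('setting %s = %r' % (name, value))
        object.__setattr__(self, name, value)   # performs the normal instance-dict update

obj = Logged()
obj.foo = 42          # prints: setting foo = 42
print(obj.__dict__)   # {'foo': 42}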