Python __getattribute__ (or __getattr__) to emulate PHP __call

I would like to create a class that effectively does this (mixing a little PHP with Python)
class Middle(object):
    # self.apply is a function that applies a function to a list
    # e.g. self.apply = [] ... self.apply.append(foobar)
    def __call(self, name, *args):
        self.apply(name, *args)
Thus allowing for code to say:
m = Middle()
m.process_foo(a, b, c)
In this case __call() is the PHP __call() method which is invoked when a method is not found on an object.

You need to define __getattr__; it is called if an attribute is not otherwise found on your object.
Notice that __getattr__ is called for any failed attribute lookup, and that it isn't invoked like a function call, so you have to return the method that will be called.
def __getattr__(self, attr):
    def default_method(*args):
        self.apply(attr, *args)
    return default_method
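For completeness, here is a minimal runnable sketch of the Middle idea built on __getattr__. The body of apply is an assumption, since the question only hints at what it should do:

class Middle(object):
    def __init__(self):
        self.calls = []

    def apply(self, name, *args):
        # hypothetical stand-in for whatever dispatch the question intends
        self.calls.append((name, args))

    def __getattr__(self, attr):
        def default_method(*args):
            self.apply(attr, *args)
        return default_method

m = Middle()
m.process_foo(1, 2, 3)
print(m.calls)  # [('process_foo', (1, 2, 3))]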

Consider passing arguments to your methods as arguments, not encoded into the method name which will then be magically used as an argument.
Where are you writing code that doesn't know what methods it will be calling?
Why call c.do_Something(x) and then unpack the method name instead of just calling c.do('Something', x) ?
In any case it's easy enough to handle unfound attributes:
class Dispatcher(object):
    def __getattr__(self, key):
        # __getattr__ is only invoked after normal attribute lookup has
        # already failed, so there is nothing to re-check here;
        # dispatch directly.
        return self.dispatch(key)

    def default(self, *args, **kw):
        print "Assuming default method"
        print args, kw

    def dispatch(self, key):
        print 'Looking for method: %s' % (key,)
        return self.default
A test:
>>> d = Dispatcher()
>>> d.hello()
Looking for method: hello
Assuming default method
() {}
This seems to be fraught with "gotchas" - the thing returned by __getattr__ is going to be presumed to be not just a function, but a bound method on that instance. A closure that captures self (like default_method above) behaves the same way, so be sure to return one or the other.

I actually did this recently. Here's an example of how I solved it:
class Example:
    def FUNC_1(self, arg):
        return arg - 1

    def FUNC_2(self, arg):
        return arg - 2

    def decode(self, func, arg):
        try:
            exec("result = self.FUNC_%s(arg)" % (func))
        except AttributeError:
            # Call your default method here
            result = self.default(arg)
        return result

    def default(self, arg):
        return arg
and the output:
>>> dude = Example()
>>> print dude.decode(1, 0)
-1
>>> print dude.decode(2, 10)
8
>>> print dude.decode(3, 5)
5
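As an aside, the same dispatch can be done without exec by looking the method up with getattr and falling back to the default; a sketch under the same hypothetical FUNC_ naming scheme:

class Example:
    def FUNC_1(self, arg):
        return arg - 1

    def FUNC_2(self, arg):
        return arg - 2

    def decode(self, func, arg):
        # getattr returns the bound method if it exists, else the default
        method = getattr(self, "FUNC_%s" % func, self.default)
        return method(arg)

    def default(self, arg):
        return arg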


How can I return self and another variable in a python class method while method chaining?

I understand what I am asking here is probably not the best code design, but the reason for me asking is strictly academic. I am trying to understand how to make this concept work.
Typically, I will return self from a class method so that the following methods can be chained together. My understanding is by returning self, I am simply returning an instance of the class, for the following methods to work on.
But in this case, I am trying to figure out how to return both self and another value from the method. The idea is if I do not want to chain, or I do not call any class attributes, I want to retrieve the data from the method being called.
Consider this example:
class Test(object):
    def __init__(self):
        self.hold = None

    def methoda(self):
        self.hold = 'lol'
        return self, 'lol'

    def newmethod(self):
        self.hold = self.hold * 2
        return self, 2

t = Test()
t.methoda().newmethod()
print(t.hold)
In this case, I will get an AttributeError: 'tuple' object has no attribute 'newmethod' which is to be expected because the methoda method is returning a tuple which does not have any methods or attributes called newmethod.
My question is not about unpacking multiple returns, but more about how can I continue to chain methods when the preceding methods are returning multiple values. I also understand that I can control the methods return with an argument to it, but that is not what I am trying to do.
As mentioned previously, I do realize this is probably a bad question, and I am happy to delete the post if the question doesn't make any sense.
Following the suggestion by @JohnColeman, you can return a special tuple with attribute lookup delegated to your object if it is not a normal tuple attribute. That way it acts like a normal tuple except when you are chaining methods.
You can implement this as follows:
class ChainResult(tuple):
    def __new__(cls, *args):
        return super(ChainResult, cls).__new__(cls, args)

    def __getattribute__(self, name):
        try:
            return getattr(super(), name)
        except AttributeError:
            return getattr(super().__getitem__(0), name)

class Test(object):
    def __init__(self):
        self.hold = None

    def methoda(self):
        self.hold = 'lol'
        return ChainResult(self, 'lol')

    def newmethod(self):
        self.hold = self.hold * 2
        return ChainResult(self, 2)
Testing:
>>> t = Test()
>>> t.methoda().newmethod()
>>> print(t.hold)
lollol
The returned result does indeed act as a tuple:
>>> t, res = t.methoda().newmethod()
>>> print(res)
2
>>> print(isinstance(t.methoda().newmethod(), tuple))
True
You could imagine all sorts of semantics with this, such as forwarding the returned values to the next method in the chain using a closure:
class ChainResult(tuple):
    def __new__(cls, *args):
        return super(ChainResult, cls).__new__(cls, args)

    def __getattribute__(self, name):
        try:
            return getattr(super(), name)
        except AttributeError:
            attr = getattr(super().__getitem__(0), name)
            if callable(attr):
                chain_results = super().__getitem__(slice(1, None))
                return lambda *args, **kw: attr(*(chain_results + args), **kw)
            else:
                return attr
For example,
class Test:
    ...
    def methodb(self, *args):
        print(*args)
would produce
>>> t = Test()
>>> t.methoda().methodb('catz')
lol catz
It would be nice if you could make ChainResult invisible. You can almost do it by initializing the tuple base class with the normal results and saving your object in a separate attribute used only for chaining. Then use a class decorator that wraps every method with ChainResult(self, self.method(*args, **kw)); see the sketch below. It will work okay for methods that return a tuple, but a single value return will act like a length-1 tuple, so you will need something like obj.method()[0] or result, = obj.method() to work with it. I played a bit with delegating to the tuple for a multiple return, or to the value itself for a single return; maybe it could be made to work, but it introduces so many ambiguities that I doubt it could work well.
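A rough sketch of that class-decorator idea, assuming the first ChainResult class from above is in scope; the decorator name and the choice to skip underscore-prefixed methods are my own:

import functools

def chainable(cls):
    # wrap each public method so its return value is packaged as
    # ChainResult(self, result)
    for name, attr in list(vars(cls).items()):
        if callable(attr) and not name.startswith('_'):
            def make_wrapper(method):
                @functools.wraps(method)
                def wrapper(self, *args, **kw):
                    return ChainResult(self, method(self, *args, **kw))
                return wrapper
            setattr(cls, name, make_wrapper(attr))
    return cls

@chainable
class Test:
    def methoda(self):
        return 'lol'

t = Test()
res = t.methoda().methoda()  # chaining works
print(res[1])                # 'lol' (a single return acts like a length-1 tuple)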

python3 - try to get method-variable with function-reference using decorator

I am using Python 3.5 and I am trying to print the number of function calls using a Python decorator. Here is my example:
import inspect

def logWrapper(func):
    def wrapper_func(self, *args, **kwargs):
        wrapper_func.calls += 1
        self.logger.info('Func: {}'.format(func.__name__))
        return func(self, *args, **kwargs)
    wrapper_func.calls = 0
    return wrapper_func

class A:
    def __init__(self):
        print('created')

    @logWrapper
    def myfunction1(self, var1):
        print('var1: {}'.format(var1))

    @logWrapper
    def myfunction2(self, var2):
        print('var2: {}'.format(var2))

if __name__ == "__main__":
    Pal1 = A()
    Pal1.myfunction1('1')
    Pal1.myfunction1('2')
    Pal1.myfunction1('3')
    Pal1.myfunction2('A')
    function_list = inspect.getmembers(Pal1, predicate=inspect.ismethod)
    for func in function_list:
        method_to_call = getattr(A, func[0])
        print('Function: {}; Calls: {}'.format(func[0], method_to_call.calls))
When I call Pal1.myfunction1.calls and Pal1.myfunction2.calls I get the correct results, 3 and 1. But now I would like to iterate through every function of class A. When I try Pal1.func[0].calls I get the error message *** AttributeError: 'A' object has no attribute 'func'; I also tried A.method_to_call.calls and got the same result.
What am I doing wrong?
The problem is that you are calling instance methods as class methods.
For example, you cannot call
A.myfunction1('3')
However you can call,
inst = A()
inst.myfunction1('3')
When this is translated into the getattr syntax, your code becomes
method_to_call = getattr(Pal1, func[0])
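Putting it together, the loop can read the counter off each bound method directly; a sketch (guarding with getattr, since unwrapped methods like __init__ have no calls attribute):

for name, method in inspect.getmembers(Pal1, predicate=inspect.ismethod):
    calls = getattr(method, 'calls', None)  # only decorated methods have it
    if calls is not None:
        print('Function: {}; Calls: {}'.format(name, calls))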

How to understand lazy function in Django utils functional module

I am learning from the Django source code.
When I read the functional module in Django, I don't know how to understand it: what is the lazy function for, and how does its implementation work?
This is my first time using Stack Overflow. If I missed any rules here, please remind me. Thanks.
The code:
class Promise(object):
    """
    This is just a base class for the proxy class created in
    the closure of the lazy function. It can be used to recognize
    promises in code.
    """
    pass

def lazy(func, *resultclasses):
    """
    Turns any callable into a lazy evaluated callable. You need to give result
    classes or types -- at least one is needed so that the automatic forcing of
    the lazy evaluation code is triggered. Results are not memoized; the
    function is evaluated on every access.
    """

    @total_ordering
    class __proxy__(Promise):
        """
        Encapsulate a function call and act as a proxy for methods that are
        called on the result of that function. The function is not evaluated
        until one of the methods on the result is called.
        """
        __dispatch = None

        def __init__(self, args, kw):
            self.__args = args
            self.__kw = kw
            if self.__dispatch is None:
                self.__prepare_class__()

        def __reduce__(self):
            return (
                _lazy_proxy_unpickle,
                (func, self.__args, self.__kw) + resultclasses
            )

        @classmethod
        def __prepare_class__(cls):
            cls.__dispatch = {}
            for resultclass in resultclasses:
                cls.__dispatch[resultclass] = {}
                for type_ in reversed(resultclass.mro()):
                    for (k, v) in type_.__dict__.items():
                        # All __promise__ return the same wrapper method, but
                        # they also do setup, inserting the method into the
                        # dispatch dict.
                        meth = cls.__promise__(resultclass, k, v)
                        if hasattr(cls, k):
                            continue
                        setattr(cls, k, meth)
            cls._delegate_bytes = bytes in resultclasses
            cls._delegate_text = six.text_type in resultclasses
            assert not (cls._delegate_bytes and cls._delegate_text), "Cannot call lazy() with both bytes and text return types."
            if cls._delegate_text:
                if six.PY3:
                    cls.__str__ = cls.__text_cast
                else:
                    cls.__unicode__ = cls.__text_cast
            elif cls._delegate_bytes:
                if six.PY3:
                    cls.__bytes__ = cls.__bytes_cast
                else:
                    cls.__str__ = cls.__bytes_cast

        @classmethod
        def __promise__(cls, klass, funcname, method):
            # Builds a wrapper around some magic method and registers that
            # magic method for the given type and method name.
            def __wrapper__(self, *args, **kw):
                # Automatically triggers the evaluation of a lazy value and
                # applies the given magic method of the result type.
                res = func(*self.__args, **self.__kw)
                for t in type(res).mro():
                    if t in self.__dispatch:
                        return self.__dispatch[t][funcname](res, *args, **kw)
                raise TypeError("Lazy object returned unexpected type.")

            if klass not in cls.__dispatch:
                cls.__dispatch[klass] = {}
            cls.__dispatch[klass][funcname] = method
            return __wrapper__

        def __text_cast(self):
            return func(*self.__args, **self.__kw)

        def __bytes_cast(self):
            return bytes(func(*self.__args, **self.__kw))

        def __cast(self):
            if self._delegate_bytes:
                return self.__bytes_cast()
            elif self._delegate_text:
                return self.__text_cast()
            else:
                return func(*self.__args, **self.__kw)

        def __ne__(self, other):
            if isinstance(other, Promise):
                other = other.__cast()
            return self.__cast() != other

        def __eq__(self, other):
            if isinstance(other, Promise):
                other = other.__cast()
            return self.__cast() == other

        def __lt__(self, other):
            if isinstance(other, Promise):
                other = other.__cast()
            return self.__cast() < other

        def __hash__(self):
            return hash(self.__cast())

        def __mod__(self, rhs):
            if self._delegate_bytes and six.PY2:
                return bytes(self) % rhs
            elif self._delegate_text:
                return six.text_type(self) % rhs
            return self.__cast() % rhs

        def __deepcopy__(self, memo):
            # Instances of this class are effectively immutable. It's just a
            # collection of functions. So we don't need to do anything
            # complicated for copying.
            memo[id(self)] = self
            return self

    @wraps(func)
    def __wrapper__(*args, **kw):
        # Creates the proxy object, instead of the actual value.
        return __proxy__(args, kw)

    return __wrapper__
This function takes a function and any number of classes.
Simplifying a little, it returns a wrapper (let's say a "lazy function") instead of that function. At that point we can say that we have turned the function into a lazy function.
After that we can call this lazy function. Once called, it returns an instance of the proxy class instead of the result of the initial function, without calling the initial function.
The initial function is called only after we invoke a method on that result (the proxy instance).
*resultclasses here are the classes whose instances are expected as results of the initial function.
For example:
def func(text):
    return text.title()

lazy_func = lazy(func, str)
# lazy function, prepared to dispatch any method of a str instance

res = lazy_func('test')  # instance of the __proxy__ class instead of the 'Test' string
res.find('T')  # only at that point do we call the initial function
I'll try to explain how it works overall:
def lazy(func, *resultclasses):  # On decorate

    @total_ordering
    class __proxy__(Promise):
        __dispatch = None

        def __init__(self, args, kw):  # On call
            # 3) The __proxy__ instance stores the original call's args and
            # kwargs. args = ('Test', ) for our example
            self.__args = args
            self.__kw = kw
            if self.__dispatch is None:
                self.__prepare_class__()
                # 4) If it's the first call of the lazy function, we should
                # prepare the __proxy__ class. Class preparation in this case
                # means that we'll fill the __dispatch class attribute with
                # links to all methods of each result class. We need to
                # prepare the class only on the first call.

        @classmethod
        def __prepare_class__(cls):
            cls.__dispatch = {}
            for resultclass in resultclasses:
                # 5) Looping through the resultclasses. In our example it's
                # only str
                cls.__dispatch[resultclass] = {}
                for type_ in reversed(resultclass.mro()):
                    # 6) Looping through each superclass of each resultclass
                    # in reversed order, so that'll be (object, str) for our
                    # example
                    for (k, v) in type_.__dict__.items():
                        # 7) Looping through each attribute of each
                        # superclass. For example k = 'find', v = str.find
                        meth = cls.__promise__(resultclass, k, v)
                        if hasattr(cls, k):
                            continue
                        setattr(cls, k, meth)
                        # 9) If the __proxy__ class doesn't have an attribute
                        # 'find', for example, we set __wrapper__ as that
                        # attribute. So class __proxy__ will have the
                        # __wrapper__ method in __proxy__.__dict__['find'],
                        # and so on for all methods.

        @classmethod
        def __promise__(cls, klass, funcname, method):
            # Builds a wrapper around some magic method and registers that
            # magic method for the given type and method name.
            def __wrapper__(self, *args, **kw):  # On call of each method of the result class (str)
                # Automatically triggers the evaluation of a lazy value and
                # applies the given magic method of the result type.
                res = func(*self.__args, **self.__kw)
                # 10) Finally we call the original function
                for t in type(res).mro():
                    # 11) We're looping through all the superclasses of the
                    # result's class from the bottom to the top; that'll be
                    # (str, object) for our example
                    if t in self.__dispatch:
                        # 12) If the class is dispatched, we pass the result
                        # with args and kwargs to
                        # __proxy__.__dispatch[str]['find'], which is the
                        # unbound method 'find' of the str class.
                        # For our example res = 'Test', args = ('T', )
                        return self.__dispatch[t][funcname](res, *args, **kw)
                raise TypeError("Lazy object returned unexpected type.")

            if klass not in cls.__dispatch:
                cls.__dispatch[klass] = {}
            cls.__dispatch[klass][funcname] = method
            # 7) Adds __proxy__.__dispatch[str]['find'] = str.find, for
            # example, which is the unbound method 'find' of the str class,
            # and so on with each method of each superclass of each
            # resultclass.
            # 8) Returns a new __wrapper__ method for each method of each
            # resultclass. This wrapper method has the funcname variable in
            # its closure.
            return __wrapper__

    @wraps(func)  # makes the lazy function look like the initial one
    def __wrapper__(*args, **kw):
        # Creates the proxy object, instead of the actual value.
        return __proxy__(args, kw)
        # 2) On call of the lazy function we get a __proxy__ instance
        # instead of the actual value

    return __wrapper__
    # 1) As the result of the lazy(func, *resultclasses) call we get the
    # __wrapper__ function, which looks like the initial function because
    # of the @wraps decorator
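To distill the mechanism, here is a tiny sketch of the same evaluate-on-attribute-access idea. This is not Django's code: it has no dispatch table, and unlike the real thing it won't handle magic methods, which bypass __getattr__ and are exactly why Django pre-populates the proxy class:

def simple_lazy(func):
    class Proxy:
        def __init__(self, *args, **kw):
            self._args, self._kw = args, kw

        def __getattr__(self, name):
            # evaluate the wrapped function only when the result is used
            return getattr(func(*self._args, **self._kw), name)
    return Proxy

lazy_title = simple_lazy(str.title)
res = lazy_title('hello world')  # nothing evaluated yet
print(res.count('H'))            # evaluates now and prints 1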

How to intercept a method call which doesn't exist?

I want to create a class that doesn't give an AttributeError on a call to any method, whether or not it exists:
My class:
class magic_class:
    ...
    # How to override method calls?
    ...
Expected Output:
ob = magic_class()
ob.unknown_method()
# Prints 'unknown_method' was called
ob.unknown_method2()
# Prints 'unknown_method2' was called
Now, unknown_method and unknown_method2 don't actually exist in the class, but how can we intercept the method call in Python?
Override the __getattr__() magic method:
class MagicClass(object):
    def __getattr__(self, name):
        def wrapper(*args, **kwargs):
            print "'%s' was called" % name
        return wrapper
ob = MagicClass()
ob.unknown_method()
ob.unknown_method2()
prints
'unknown_method' was called
'unknown_method2' was called
Just in case someone is trying to delegate the unknown method to an object, here's the code:
class MagicClass():
    def __init__(self, obj):
        self.an_obj = obj

    def __getattr__(self, method_name):
        def method(*args, **kwargs):
            print("Handling unknown method: '{}'".format(method_name))
            if kwargs:
                print("It had the following key word arguments: " + str(kwargs))
            if args:
                print("It had the following positional arguments: " + str(args))
            return getattr(self.an_obj, method_name)(*args, **kwargs)
        return method
This is super useful when you need to apply the Proxy pattern.
Moreover, considering both args and kwargs allows you to present a completely user-friendly interface, since callers of MagicClass can treat it as if it were the real object.
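A quick usage sketch, delegating to a plain list (the printed lines come from the method wrapper above):

real = []
proxy = MagicClass(real)
proxy.append(42)
# Handling unknown method: 'append'
# It had the following positional arguments: (42,)
print(real)  # [42]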
Override __getattr__; see http://docs.python.org/reference/datamodel.html

How to make an immutable object in Python?

Although I have never needed this, it just struck me that making an immutable object in Python could be slightly tricky. You can't just override __setattr__, because then you can't even set attributes in the __init__. Subclassing a tuple is a trick that works:
class Immutable(tuple):
    def __new__(cls, a, b):
        return tuple.__new__(cls, (a, b))

    @property
    def a(self):
        return self[0]

    @property
    def b(self):
        return self[1]

    def __str__(self):
        return "<Immutable {0}, {1}>".format(self.a, self.b)

    def __setattr__(self, *ignored):
        raise NotImplementedError

    def __delattr__(self, *ignored):
        raise NotImplementedError
But then you have access to the a and b variables through self[0] and self[1], which is annoying.
Is this possible in Pure Python? If not, how would I do it with a C extension?
(Answers that work only in Python 3 are acceptable).
Update:
As of Python 3.7, the way to go is to use the @dataclass decorator, see the newly accepted answer.
Yet another solution I just thought of: The simplest way to get the same behaviour as your original code is
Immutable = collections.namedtuple("Immutable", ["a", "b"])
It does not solve the problem that attributes can be accessed via [0] etc., but at least it's considerably shorter and provides the additional advantage of being compatible with pickle and copy.
namedtuple creates a type similar to what I described in this answer, i.e. derived from tuple and using __slots__. It is available in Python 2.6 or above.
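A quick check of the namedtuple behaviour:

import collections

Immutable = collections.namedtuple("Immutable", ["a", "b"])
obj = Immutable(1, 2)
obj.a      # 1
obj.a = 3  # AttributeError: can't set attribute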
The easiest way to do this is using __slots__:
class A(object):
    __slots__ = []
Instances of A are immutable now, since you can't set any attributes on them.
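For instance, a minimal check:

a = A()
a.x = 1  # AttributeError: 'A' object has no attribute 'x'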
If you want the class instances to contain data, you can combine this with deriving from tuple:
from operator import itemgetter

class Point(tuple):
    __slots__ = []

    def __new__(cls, x, y):
        return tuple.__new__(cls, (x, y))

    x = property(itemgetter(0))
    y = property(itemgetter(1))

p = Point(2, 3)
p.x
# 2
p.y
# 3
Edit: If you want to get rid of indexing as well, you can override __getitem__():
class Point(tuple):
    __slots__ = []

    def __new__(cls, x, y):
        return tuple.__new__(cls, (x, y))

    @property
    def x(self):
        return tuple.__getitem__(self, 0)

    @property
    def y(self):
        return tuple.__getitem__(self, 1)

    def __getitem__(self, item):
        raise TypeError
Note that you can't use operator.itemgetter for the properties in this case, since it would rely on Point.__getitem__() instead of tuple.__getitem__(). Furthermore, this won't prevent the use of tuple.__getitem__(p, 0), but I can hardly imagine how that should constitute a problem.
I don't think the "right" way of creating an immutable object is writing a C extension. Python usually relies on library implementers and library users being consenting adults, and instead of really enforcing an interface, the interface should be clearly stated in the documentation. This is why I don't consider the possibility of circumventing an overridden __setattr__() by calling object.__setattr__() a problem. If someone does this, it's at their own risk.
Using a Frozen Dataclass
For Python 3.7+ you can use a Data Class with a frozen=True option, which is a very pythonic and maintainable way to do what you want.
It would look something like this:
from dataclasses import dataclass
from typing import Any

@dataclass(frozen=True)
class Immutable:
    a: Any
    b: Any
As type hinting is required for dataclasses' fields, I have used Any from the typing module.
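A quick check of the frozen behaviour:

obj = Immutable(a=1, b=2)
obj.a      # 1
obj.a = 3  # dataclasses.FrozenInstanceError: cannot assign to field 'a'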
Reasons NOT to use a Namedtuple
Before Python 3.7 it was frequent to see namedtuples being used as immutable objects. It can be tricky in many ways, one of them is that the __eq__ method between namedtuples does not consider the objects' classes. For example:
from collections import namedtuple

ImmutableTuple = namedtuple("ImmutableTuple", ["a", "b"])
ImmutableTuple2 = namedtuple("ImmutableTuple2", ["a", "c"])

obj1 = ImmutableTuple(a=1, b=2)
obj2 = ImmutableTuple2(a=1, c=2)

obj1 == obj2  # will be True
As you see, even if the types of obj1 and obj2 are different, and even if their fields' names are different, obj1 == obj2 still gives True. That's because the __eq__ method used is the tuple's one, which compares only the values of the fields given their positions. That can be a huge source of errors, especially if you are subclassing these classes.
..howto do it "properly" in C..
You could use Cython to create an extension type for Python:
cdef class Immutable:
    cdef readonly object a, b
    cdef object __weakref__  # enable weak referencing support

    def __init__(self, a, b):
        self.a, self.b = a, b
It works with both Python 2.x and 3.x.
Tests
# compile on-the-fly
import pyximport; pyximport.install()  # $ pip install cython
from immutable import Immutable

o = Immutable(1, 2)
assert o.a == 1, str(o.a)
assert o.b == 2

try: o.a = 3
except AttributeError:
    pass
else:
    assert 0, 'attribute must be readonly'

try: o[1]
except TypeError:
    pass
else:
    assert 0, 'indexing must not be supported'

try: o.c = 1
except AttributeError:
    pass
else:
    assert 0, 'no new attributes are allowed'

o = Immutable('a', [])
assert o.a == 'a'
assert o.b == []

o.b.append(3)  # attribute may contain mutable object
assert o.b == [3]

try: o.c
except AttributeError:
    pass
else:
    assert 0, 'no c attribute'

o = Immutable(b=3, a=1)
assert o.a == 1 and o.b == 3

try: del o.b
except AttributeError:
    pass
else:
    assert 0, "can't delete attribute"

d = dict(b=3, a=1)
o = Immutable(**d)
assert o.a == d['a'] and o.b == d['b']

o = Immutable(1, b=3)
assert o.a == 1 and o.b == 3

try: object.__setattr__(o, 'a', 1)
except AttributeError:
    pass
else:
    assert 0, 'attributes are readonly'

try: object.__setattr__(o, 'c', 1)
except AttributeError:
    pass
else:
    assert 0, 'no new attributes'

try: Immutable(1, c=3)
except TypeError:
    pass
else:
    assert 0, 'accept only a,b keywords'

for kwd in [dict(a=1), dict(b=2)]:
    try: Immutable(**kwd)
    except TypeError:
        pass
    else:
        assert 0, 'Immutable requires exactly 2 arguments'
If you don't mind indexing support, then collections.namedtuple suggested by @Sven Marnach is preferable:
Immutable = collections.namedtuple("Immutable", "a b")
Another idea would be to completely disallow __setattr__ and use object.__setattr__ in the constructor:
class Point(object):
    def __init__(self, x, y):
        object.__setattr__(self, "x", x)
        object.__setattr__(self, "y", y)

    def __setattr__(self, *args):
        raise TypeError

    def __delattr__(self, *args):
        raise TypeError
Of course you could use object.__setattr__(p, "x", 3) to modify a Point instance p, but your original implementation suffers from the same problem (try tuple.__setattr__(i, "x", 42) on an Immutable instance).
You can apply the same trick in your original implementation: get rid of __getitem__(), and use tuple.__getitem__() in your property functions.
You could create a @immutable decorator that either overrides the __setattr__ or changes the __slots__ to an empty list, then decorate the __init__ method with it.
Edit: As the OP noted, changing the __slots__ attribute only prevents the creation of new attributes, not the modification.
Edit2: Here's an implementation:
Edit3: Using __slots__ breaks this code, because it stops the creation of the object's __dict__. I'm looking for an alternative.
Edit4: Well, that's it. It's a bit hackish, but works as an exercise :-)
class immutable(object):
    def __init__(self, immutable_params):
        self.immutable_params = immutable_params

    def __call__(self, new):
        params = self.immutable_params

        def __set_if_unset__(self, name, value):
            if name in self.__dict__:
                raise Exception("Attribute %s has already been set" % name)
            if not name in params:
                raise Exception("Cannot create attribute %s" % name)
            self.__dict__[name] = value

        def __new__(cls, *args, **kws):
            cls.__setattr__ = __set_if_unset__
            return super(cls.__class__, cls).__new__(cls, *args, **kws)

        return __new__

class Point(object):
    @immutable(['x', 'y'])
    def __new__(): pass

    def __init__(self, x, y):
        self.x = x
        self.y = y

p = Point(1, 2)
p.x = 3  # Exception: Attribute x has already been set
p.z = 4  # Exception: Cannot create attribute z
I don't think it is entirely possible except by using either a tuple or a namedtuple. No matter what, if you override __setattr__() the user can always bypass it by calling object.__setattr__() directly. Any solution that depends on __setattr__ is guaranteed not to work.
The following is about the nearest you can get without using some sort of tuple:
class Immutable:
    __slots__ = ['a', 'b']

    def __init__(self, a, b):
        object.__setattr__(self, 'a', a)
        object.__setattr__(self, 'b', b)

    def __setattr__(self, *ignored):
        raise NotImplementedError

    __delattr__ = __setattr__
but it breaks if you try hard enough:
>>> t = Immutable(1, 2)
>>> t.a
1
>>> object.__setattr__(t, 'a', 2)
>>> t.a
2
but Sven's use of namedtuple is genuinely immutable.
Update
Since the question has been updated to ask how to do it properly in C, here's my answer on how to do it properly in Cython:
First immutable.pyx:
cdef class Immutable:
    cdef object _a, _b

    def __init__(self, a, b):
        self._a = a
        self._b = b

    property a:
        def __get__(self):
            return self._a

    property b:
        def __get__(self):
            return self._b

    def __repr__(self):
        return "<Immutable {0}, {1}>".format(self.a, self.b)
and a setup.py to compile it (using the command setup.py build_ext --inplace):
from distutils.core import setup
from distutils.extension import Extension
from Cython.Distutils import build_ext

ext_modules = [Extension("immutable", ["immutable.pyx"])]

setup(
    name='Immutable object',
    cmdclass={'build_ext': build_ext},
    ext_modules=ext_modules
)
Then to try it out:
>>> from immutable import Immutable
>>> p = Immutable(2, 3)
>>> p
<Immutable 2, 3>
>>> p.a = 1
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
AttributeError: attribute 'a' of 'immutable.Immutable' objects is not writable
>>> object.__setattr__(p, 'a', 1)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
AttributeError: attribute 'a' of 'immutable.Immutable' objects is not writable
>>> p.a, p.b
(2, 3)
>>>
I've made immutable classes by overriding __setattr__, and allowing the set if the caller is __init__:
import inspect

class Immutable(object):
    def __setattr__(self, name, value):
        if inspect.stack()[2][3] != "__init__":
            raise Exception("Can't mutate an Immutable: self.%s = %r" % (name, value))
        object.__setattr__(self, name, value)

This isn't quite enough yet, since it allows anyone's __init__ to change the object, but you get the idea.
Here's an elegant solution:
class Immutable(object):
    def __setattr__(self, key, value):
        if not hasattr(self, key):
            super().__setattr__(key, value)
        else:
            raise RuntimeError("Can't modify immutable object's attribute: {}".format(key))

Inherit from this class, initialize your fields in the constructor, and you're all set.
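A usage sketch, which also shows a caveat of the hasattr check: names that were never set can still be added after construction, because hasattr is False for them:

class Point(Immutable):
    def __init__(self, x, y):
        self.x = x  # allowed: attribute not set yet
        self.y = y

p = Point(1, 2)
# p.x = 5  # RuntimeError: Can't modify immutable object's attribute: x
p.z = 3    # still allowed: z did not exist before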
In addition to the excellent other answers I'd like to add a method for Python 3.4 (or maybe 3.3). This answer builds upon several previous answers to this question.
In Python 3.4, you can use properties without setters to create class members that cannot be modified. (Assigning to a property that has no setter raises an AttributeError.)
class A:
    __slots__ = ['_A__a']

    def __init__(self, aValue):
        self.__a = aValue

    @property
    def a(self):
        return self.__a
You can use it like this:
instance=A("constant")
print (instance.a)
which will print "constant"
But calling instance.a=10 will cause:
AttributeError: can't set attribute
Explanation: if you try to assign to a property that has no setter, an AttributeError is raised.
Using __slots__, I restrict the member variables to _A__a (which is the mangled name of __a).
Problem: assigning to _A__a is still possible (instance._A__a = 2). But if you assign to a private variable, it is your own fault...
This answer, among others, however, discourages the use of __slots__. Using other ways to prevent attribute creation might be preferable.
So, I am writing this with respect to Python 3:
I) With the help of the dataclass decorator and frozen=True
we can create immutable objects in Python.
For this you need to import dataclass from the dataclasses library and set frozen=True.
Example:
from dataclasses import dataclass

@dataclass(frozen=True)
class Location:
    name: str
    longitude: float = 0.0
    latitude: float = 0.0
Output:
>>> l = Location("Delhi", 112.345, 234.788)
>>> l.name
'Delhi'
>>> l.longitude
112.345
>>> l.latitude
234.788
>>> l.name = "Kolkata"
dataclasses.FrozenInstanceError: cannot assign to field 'name'
>>>
Source: https://realpython.com/python-data-classes/
If you are interested in objects with behavior, then namedtuple is almost your solution.
As described at the bottom of the namedtuple documentation, you can derive your own class from namedtuple; and then, you can add the behavior you want.
For example (code taken directly from the documentation):
class Point(namedtuple('Point', 'x y')):
    __slots__ = ()

    @property
    def hypot(self):
        return (self.x ** 2 + self.y ** 2) ** 0.5

    def __str__(self):
        return 'Point: x=%6.3f y=%6.3f hypot=%6.3f' % (self.x, self.y, self.hypot)

for p in Point(3, 4), Point(14, 5/7):
    print(p)
This will result in:
Point: x= 3.000 y= 4.000 hypot= 5.000
Point: x=14.000 y= 0.714 hypot=14.018
This approach works for both Python 3 and Python 2.7 (tested on IronPython as well).
The only downside is that the inheritance tree is a bit weird; but this is not something you usually play with.
As of Python 3.7, you can use the @dataclass decorator in your class and it will be immutable like a struct! Though, it may or may not add a __hash__() method to your class. Quote:
__hash__() is used by built-in hash(), and when objects are added to hashed collections such as dictionaries and sets. Having a __hash__() implies that instances of the class are immutable. Mutability is a complicated property that depends on the programmer's intent, the existence and behavior of __eq__(), and the values of the eq and frozen flags in the dataclass() decorator.
By default, dataclass() will not implicitly add a __hash__() method unless it is safe to do so. Neither will it add or change an existing explicitly defined __hash__() method. Setting the class attribute __hash__ = None has a specific meaning to Python, as described in the __hash__() documentation.
If __hash__() is not explicitly defined, or if it is set to None, then dataclass() may add an implicit __hash__() method. Although not recommended, you can force dataclass() to create a __hash__() method with unsafe_hash=True. This might be the case if your class is logically immutable but can nonetheless be mutated. This is a specialized use case and should be considered carefully.
Here the example from the docs linked above:
@dataclass
class InventoryItem:
    '''Class for keeping track of an item in inventory.'''
    name: str
    unit_price: float
    quantity_on_hand: int = 0

    def total_cost(self) -> float:
        return self.unit_price * self.quantity_on_hand
Just Like a dict
I have an open source library where I'm doing things in a functional way, so moving data around in an immutable object is helpful. However, I don't want to have to transform my data objects for the client to interact with them. So, I came up with this - it gives you a dict-like object that's immutable, plus some helper methods.
Credit to Sven Marnach in his answer for the basic implementation of restricting property updating and deleting.
import json
# ^^ optional - if you don't care whether it prints like a dict,
# then rip this and __str__ and __repr__ out

class Immutable(object):
    def __init__(self, **kwargs):
        """Sets all values once given
        whatever is passed in kwargs
        """
        for k, v in kwargs.items():
            object.__setattr__(self, k, v)

    def __setattr__(self, *args):
        """Disables setting attributes via
        item.prop = val or item['prop'] = val
        """
        raise TypeError('Immutable objects cannot have properties set after init')

    def __delattr__(self, *args):
        """Disables deleting properties"""
        raise TypeError('Immutable objects cannot have properties deleted')

    def __getitem__(self, item):
        """Allows for dict like access of properties
        val = item['prop']
        """
        return self.__dict__[item]

    def __repr__(self):
        """Print to repl in a dict like fashion"""
        return self.pprint()

    def __str__(self):
        """Convert to a str in a dict like fashion"""
        return self.pprint()

    def __eq__(self, other):
        """Supports equality operator
        immutable({'a': 2}) == immutable({'a': 2})"""
        if other is None:
            return False
        return self.dict() == other.dict()

    def keys(self):
        """Paired with __getitem__ supports **unpacking
        new = { **item, **other }
        """
        return self.__dict__.keys()

    def get(self, *args, **kwargs):
        """Allows for dict like property access
        item.get('prop')
        """
        return self.__dict__.get(*args, **kwargs)

    def pprint(self):
        """Helper method used for printing that
        formats in a dict like way
        """
        return json.dumps(self,
                          default=lambda o: o.__dict__,
                          sort_keys=True,
                          indent=4)

    def dict(self):
        """Helper method for getting the raw dict value
        of the immutable object"""
        return self.__dict__
Helper methods
def update(obj, **kwargs):
    """Returns a new instance of the given object with
    all key/val in kwargs set on it
    """
    return immutable({
        **obj,
        **kwargs
    })

def immutable(obj):
    return Immutable(**obj)
Examples
obj = immutable({
    'alpha': 1,
    'beta': 2,
    'dalet': 4
})

obj.alpha         # 1
obj['alpha']      # 1
obj.get('beta')   # 2

del obj['alpha']  # TypeError
obj.alpha = 2     # TypeError

new_obj = update(obj, alpha=10)
new_obj is not obj          # True
new_obj.get('alpha') == 10  # True
This way doesn't stop object.__setattr__ from working, but I've still found it useful:
class A(object):
    def __new__(cls, *args, **kwargs):
        self = super(A, cls).__new__(cls)
        self._frozen = False  # allow mutation from here to end of __init__
        # other stuff you need to do in __new__ goes here
        return self

    def __init__(self, *args, **kwargs):
        super(A, self).__init__()
        self._frozen = True  # prevent future mutation

    def __setattr__(self, name, value):
        # need to special case setting _frozen.
        if name != '_frozen' and self._frozen:
            raise TypeError('Instances are immutable.')
        else:
            super(A, self).__setattr__(name, value)

    def __delattr__(self, name):
        if self._frozen:
            raise TypeError('Instances are immutable.')
        else:
            super(A, self).__delattr__(name)

You may need to override more stuff (like __setitem__) depending on the use case.
Classes which inherit from the following Immutable class are immutable, as are their instances, after their __init__ method finishes executing. Since it's pure python, as others have pointed out, there's nothing stopping someone from using the mutating special methods from the base object and type, but this is enough to stop anyone from mutating a class/instance by accident.
It works by hijacking the class-creation process with a metaclass.
"""Subclasses of class Immutable are immutable after their __init__ has run, in
the sense that all special methods with mutation semantics (in-place operators,
setattr, etc.) are forbidden.
"""
# Enumerate the mutating special methods
mutation_methods = set()
# Arithmetic methods with in-place operations
iarithmetic = '''add sub mul div mod divmod pow neg pos abs bool invert lshift
rshift and xor or floordiv truediv matmul'''.split()
for op in iarithmetic:
mutation_methods.add('__i%s__' % op)
# Operations on instance components (attributes, items, slices)
for verb in ['set', 'del']:
for component in '''attr item slice'''.split():
mutation_methods.add('__%s%s__' % (verb, component))
# Operations on properties
mutation_methods.update(['__set__', '__delete__'])
def checked_call(_self, name, method, *args, **kwargs):
"""Calls special method method(*args, **kw) on self if mutable."""
self = args[0] if isinstance(_self, object) else _self
if not getattr(self, '__mutable__', True):
# self told us it's immutable, so raise an error
cname= (self if isinstance(self, type) else self.__class__).__name__
raise TypeError('%s is immutable, %s disallowed' % (cname, name))
return method(*args, **kwargs)
def method_wrapper(_self, name):
"Wrap a special method to check for mutability."
method = getattr(_self, name)
def wrapper(*args, **kwargs):
return checked_call(_self, name, method, *args, **kwargs)
wrapper.__name__ = name
wrapper.__doc__ = method.__doc__
return wrapper
def wrap_mutating_methods(_self):
"Place the wrapper methods on mutative special methods of _self"
for name in mutation_methods:
if hasattr(_self, name):
method = method_wrapper(_self, name)
type.__setattr__(_self, name, method)
def set_mutability(self, ismutable):
"Set __mutable__ by using the unprotected __setattr__"
b = _MetaImmutable if isinstance(self, type) else Immutable
super(b, self).__setattr__('__mutable__', ismutable)
class _MetaImmutable(type):
'''The metaclass of Immutable. Wraps __init__ methods via __call__.'''
def __init__(cls, *args, **kwargs):
# Make class mutable for wrapping special methods
set_mutability(cls, True)
wrap_mutating_methods(cls)
# Disable mutability
set_mutability(cls, False)
def __call__(cls, *args, **kwargs):
'''Make an immutable instance of cls'''
self = cls.__new__(cls)
# Make the instance mutable for initialization
set_mutability(self, True)
# Execute cls's custom initialization on this instance
self.__init__(*args, **kwargs)
# Disable mutability
set_mutability(self, False)
return self
# Given a class T(metaclass=_MetaImmutable), mutative special methods which
# already exist on _MetaImmutable (a basic type) cannot be over-ridden
# programmatically during _MetaImmutable's instantiation of T, because the
# first place python looks for a method on an object is on the object's
# __class__, and T.__class__ is _MetaImmutable. The two extant special
# methods on a basic type are __setattr__ and __delattr__, so those have to
# be explicitly overridden here.
def __setattr__(cls, name, value):
checked_call(cls, '__setattr__', type.__setattr__, cls, name, value)
def __delattr__(cls, name, value):
checked_call(cls, '__delattr__', type.__delattr__, cls, name, value)
class Immutable(object):
"""Inherit from this class to make an immutable object.
__init__ methods of subclasses are executed by _MetaImmutable.__call__,
which enables mutability for the duration.
"""
__metaclass__ = _MetaImmutable
class T(int, Immutable): # Checks it works with multiple inheritance, too.
"Class for testing immutability semantics"
def __init__(self, b):
self.b = b
#classmethod
def class_mutation(cls):
cls.a = 5
def instance_mutation(self):
self.c = 1
def __iadd__(self, o):
pass
def not_so_special_mutation(self):
self +=1
def immutabilityTest(f, name):
"Call f, which should try to mutate class T or T instance."
try:
f()
except TypeError, e:
assert 'T is immutable, %s disallowed' % name in e.args
else:
raise RuntimeError('Immutability failed!')
immutabilityTest(T.class_mutation, '__setattr__')
immutabilityTest(T(6).instance_mutation, '__setattr__')
immutabilityTest(T(6).not_so_special_mutation, '__iadd__')
The third party attr module provides this functionality.
Edit: Python 3.7 has adopted this idea into the stdlib with @dataclass.
$ pip install attrs
$ python
>>> import attr
>>> @attr.s(frozen=True)
... class C(object):
...     x = attr.ib()
>>> i = C(1)
>>> i.x = 2
Traceback (most recent call last):
...
attr.exceptions.FrozenInstanceError: can't set attribute
attr implements frozen classes by overriding __setattr__ and has a minor performance impact at each instantiation time, according to the documentation.
If you're in the habit of using classes as datatypes, attr may be especially useful as it takes care of the boilerplate for you (but doesn't do any magic). In particular, it writes nine dunder (__X__) methods for you (unless you turn any of them off), including repr, init, hash and all the comparison functions.
attr also provides a helper for __slots__.
You can override __setattr__ and still use __init__ to set the variables, by calling the superclass's __setattr__. Here is the code:
class Immutable:
    __slots__ = ('a', 'b')

    def __init__(self, a, b):
        super().__setattr__('a', a)
        super().__setattr__('b', b)

    def __str__(self):
        return "<Immutable {0}, {1}>".format(self.a, self.b)

    def __setattr__(self, *ignored):
        raise NotImplementedError

    def __delattr__(self, *ignored):
        raise NotImplementedError
I found a way to do it without subclassing tuple, namedtuple, etc. All you need to do is to disable setattr and delattr (and also setitem and delitem, if you want to make a collection immutable) after initialization:
def __init__(self, *args, **kwargs):
    # something here
    self.lock()

where lock can look like this:

@classmethod
def lock(cls):
    def raiser(*a):
        raise TypeError('this instance is immutable')

    cls.__setattr__ = raiser
    cls.__delattr__ = raiser
    if hasattr(cls, '__setitem__'):
        cls.__setitem__ = raiser
        cls.__delitem__ = raiser
So you can create class Immutable with this method and use it the way I showed.
If you don't want to write self.lock() in every single __init__, you can do it automatically with a metaclass:
class ImmutableType(type):
    @classmethod
    def change_init(mcs, original_init_method):
        def __new_init__(self, *args, **kwargs):
            if callable(original_init_method):
                original_init_method(self, *args, **kwargs)
            cls = self.__class__

            def raiser(*a):
                raise TypeError('this instance is immutable')

            cls.__setattr__ = raiser
            cls.__delattr__ = raiser
            if hasattr(cls, '__setitem__'):
                cls.__setitem__ = raiser
                cls.__delitem__ = raiser

        return __new_init__

    def __new__(mcs, name, parents, kwargs):
        kwargs['__init__'] = mcs.change_init(kwargs.get('__init__'))
        return type.__new__(mcs, name, parents, kwargs)

class Immutable(metaclass=ImmutableType):
    pass
Test
class SomeImmutableClass(Immutable):
    def __init__(self, some_value: int):
        self.important_attr = some_value

    def some_method(self):
        return 2 * self.important_attr

ins = SomeImmutableClass(3)
print(ins.some_method())  # 6
ins.important_attr += 1   # TypeError
ins.another_attr = 2      # TypeError
The basic solution below addresses the following scenario:
__init__() can be written accessing the attributes as usual.
AFTER that, the object is frozen against attribute changes only.
The idea is to override the __setattr__ method and replace its implementation each time the object's frozen status is changed.
So we need some method (_freeze) which stores those two implementations and switches between them when requested.
This mechanism may be implemented inside the user class or inherited from a special Freezer class, as shown below:
class Freezer:
    def _freeze(self, do_freeze=True):
        def raise_sa(*args):
            raise AttributeError("Attributes are frozen and can not be changed!")
        super().__setattr__('_active_setattr', (super().__setattr__, raise_sa)[do_freeze])

    def __setattr__(self, key, value):
        return self._active_setattr(key, value)

class A(Freezer):
    def __init__(self):
        self._freeze(False)
        self.x = 10
        self._freeze()
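A quick usage sketch:

a = A()
print(a.x)  # 10
a.x = 20    # AttributeError: Attributes are frozen and can not be changed!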
I needed this a little while ago and decided to make a Python package for it. The initial version is on PyPI now:
$ pip install immutable
To use:
>>> from immutable import ImmutableFactory
>>> MyImmutable = ImmutableFactory.create(prop1=1, prop2=2, prop3=3)
>>> MyImmutable.prop1
1
Full docs here: https://github.com/theengineear/immutable
Hope it helps, it wraps a namedtuple as has been discussed, but makes instantiation much simpler.
An alternative approach is to create a wrapper which makes an instance immutable.
class ImmutableError(Exception):  # defined here so the example is self-contained
    pass

class Immutable(object):
    def __init__(self, wrapped):
        super(Immutable, self).__init__()
        object.__setattr__(self, '_wrapped', wrapped)

    def __getattribute__(self, item):
        return object.__getattribute__(self, '_wrapped').__getattribute__(item)

    def __setattr__(self, key, value):
        raise ImmutableError('Object {0} is immutable.'.format(self._wrapped))

    __delattr__ = __setattr__

    def __iter__(self):
        return object.__getattribute__(self, '_wrapped').__iter__()

    def next(self):
        return object.__getattribute__(self, '_wrapped').next()

    def __getitem__(self, item):
        return object.__getattribute__(self, '_wrapped').__getitem__(item)

immutable_instance = Immutable(my_instance)
This is useful in situations where only some instances have to be immutable (like default arguments of function calls).
Can also be used in immutable factories like:
@classmethod
def immutable_factory(cls, *args, **kwargs):
    return Immutable(cls(*args, **kwargs))
This also protects from object.__setattr__, but is vulnerable to other tricks due to Python's dynamic nature.
I used the same idea as Alex: a meta-class and an "init marker", but in combination with over-writing __setattr__:
>>> from abc import ABCMeta
>>> _INIT_MARKER = '_#_in_init_#_'
>>> class _ImmutableMeta(ABCMeta):
...
...     """Meta class to construct Immutable."""
...
...     def __call__(cls, *args, **kwds):
...         obj = cls.__new__(cls, *args, **kwds)
...         object.__setattr__(obj, _INIT_MARKER, True)
...         cls.__init__(obj, *args, **kwds)
...         object.__delattr__(obj, _INIT_MARKER)
...         return obj
...
>>> def _setattr(self, name, value):
...     if hasattr(self, _INIT_MARKER):
...         object.__setattr__(self, name, value)
...     else:
...         raise AttributeError("Instance of '%s' is immutable."
...                              % self.__class__.__name__)
...
>>> def _delattr(self, name):
...     raise AttributeError("Instance of '%s' is immutable."
...                          % self.__class__.__name__)
...
>>> _im_dict = {
...     '__doc__': "Mix-in class for immutable objects.",
...     '__copy__': lambda self: self,  # self is immutable, so just return it
...     '__setattr__': _setattr,
...     '__delattr__': _delattr}
...
>>> Immutable = _ImmutableMeta('Immutable', (), _im_dict)
Note: I'm calling the meta-class directly to make it work both for Python 2.x and 3.x.
>>> class T1(Immutable):
...
...     def __init__(self, x=1, y=2):
...         self.x = x
...         self.y = y
...
>>> t1 = T1(y=8)
>>> t1.x, t1.y
(1, 8)
>>> t1.x = 7
AttributeError: Instance of 'T1' is immutable.
It also works with __slots__ ...:
>>> class T2(Immutable):
...
...     __slots__ = 's1', 's2'
...
...     def __init__(self, s1, s2):
...         self.s1 = s1
...         self.s2 = s2
...
>>> t2 = T2('abc', 'xyz')
>>> t2.s1, t2.s2
('abc', 'xyz')
>>> t2.s1 += 'd'
AttributeError: Instance of 'T2' is immutable.
... and multiple inheritance:
>>> class T3(T1, T2):
...
...     def __init__(self, x, y, s1, s2):
...         T1.__init__(self, x, y)
...         T2.__init__(self, s1, s2)
...
>>> t3 = T3(12, 4, 'a', 'b')
>>> t3.x, t3.y, t3.s1, t3.s2
(12, 4, 'a', 'b')
>>> t3.y -= 3
AttributeError: Instance of 'T3' is immutable.
Note, however, that mutable attributes remain mutable:
>>> t3 = T3(12, [4, 7], 'a', 'b')
>>> t3.y.append(5)
>>> t3.y
[4, 7, 5]
One thing that's not really covered here is total immutability... not just the parent object, but all the children as well. tuples/frozensets may be immutable, for instance, but the objects they contain may not be. Here's a small (incomplete) version that does a decent job of enforcing immutability all the way down:
# Initialize lists
a = [1, 2, 3]
b = [4, 5, 6]
c = [7, 8, 9]

l = [a, b]
# We can reassign in a list
l[0] = c

# But not in a tuple
t = (a, b)
# t[0] = c -> Throws exception
# But elements can be modified
t[0][1] = 4
t
# ([1, 4, 3], [4, 5, 6])

# Fix it back
t[0][1] = 2

li = ImmutableObject(l)
li
# [[1, 2, 3], [4, 5, 6]]

# Can't assign
# li[0] = c will fail

# Can reference
li[0]
# [1, 2, 3]

# But immutability is conferred on the returned object too
# li[0][1] = 4 will throw an exception

# A full solution should wrap all the comparison methods, e.g. with
# decorators. Also, you'd usually want to add a hash function; I didn't
# put in an interface for that.
class ImmutableObject(object):
    def __init__(self, inobj):
        self._inited = False
        self._inobj = inobj
        self._inited = True

    def __repr__(self):
        return self._inobj.__repr__()

    def __str__(self):
        return self._inobj.__str__()

    def __getitem__(self, key):
        return ImmutableObject(self._inobj.__getitem__(key))

    def __iter__(self):
        return self._inobj.__iter__()

    def __setitem__(self, key, value):
        raise AttributeError, 'Object is read-only'

    def __getattr__(self, key):
        x = getattr(self._inobj, key)
        if callable(x):
            return x
        else:
            return ImmutableObject(x)

    def __hash__(self):
        return self._inobj.__hash__()

    def __eq__(self, second):
        return self._inobj.__eq__(second)

    def __setattr__(self, attr, value):
        if attr not in ['_inobj', '_inited'] and self._inited == True:
            raise AttributeError, 'Object is read-only'
        object.__setattr__(self, attr, value)
You can just override __setattr__ in the final statement of __init__. Then you can construct but not change. Obviously you can still get around this by using object.__setattr__, but in practice most languages have some form of reflection, so immutability is always a leaky abstraction. Immutability is more about preventing clients from accidentally violating the contract of an object. I use:
=============================
The original solution offered was incorrect, this was updated based on the comments using the solution from here
The original solution is wrong in an interesting way, so it is included at the bottom.
===============================
class ImmutablePair(object):

    __initialised = False  # a class level variable that should always stay false.

    def __init__(self, a, b):
        try:
            self.a = a
            self.b = b
        finally:
            self.__initialised = True  # an instance level variable

    def __setattr__(self, key, value):
        if self.__initialised:
            self._raise_error()
        else:
            super(ImmutablePair, self).__setattr__(key, value)

    def _raise_error(self, *args, **kw):
        raise NotImplementedError("Attempted To Modify Immutable Object")
if __name__ == "__main__":
    immutable_object = ImmutablePair(1, 2)

    print immutable_object.a
    print immutable_object.b

    try:
        immutable_object.a = 3
    except Exception as e:
        print e

    print immutable_object.a
    print immutable_object.b
Output :
1
2
Attempted To Modify Immutable Object
1
2
======================================
Original Implementation:
It was pointed out in the comments, correctly, that this does not in fact work, as it prevents the creation of more than one object: you are overriding the class __setattr__ method, which means a second object cannot be created, since self.a = a will fail on the second initialization.
class ImmutablePair(object):
    def __init__(self, a, b):
        self.a = a
        self.b = b
        ImmutablePair.__setattr__ = self._raise_error

    def _raise_error(self, *args, **kw):
        raise NotImplementedError("Attempted To Modify Immutable Object")
I've created a small class decorator to make a class immutable (except inside __init__). As part of https://github.com/google/etils.
from etils import epy

@epy.frozen
class A:
    def __init__(self):
        self.x = 123  # Inside `__init__`, attributes can be assigned

a = A()
a.x = 456  # AttributeError
This supports inheritance too.
Implementation:
import functools
from typing import TypeVar

_Cls = TypeVar('_Cls')

def frozen(cls: _Cls) -> _Cls:
    """Class decorator which prevents mutating attributes after `__init__`."""
    if not isinstance(cls, type):
        raise TypeError(f'{cls.__name__} is not a class.')

    cls.__init__ = _wrap_init(cls.__init__)
    cls.__setattr__ = _wrap_setattr(cls.__setattr__)
    return cls

def _wrap_init(init_fn):
    """`__init__` wrapper."""

    @functools.wraps(init_fn)
    def new_init(self, *args, **kwargs):
        if hasattr(self, '_epy_is_init_done'):
            # `_epy_is_init_done` already created, so it means we're
            # a `super().__init__` call.
            return init_fn(self, *args, **kwargs)
        object.__setattr__(self, '_epy_is_init_done', False)
        init_fn(self, *args, **kwargs)
        object.__setattr__(self, '_epy_is_init_done', True)

    return new_init

def _wrap_setattr(setattr_fn):
    """`__setattr__` wrapper."""

    @functools.wraps(setattr_fn)
    def new_setattr(self, name, value):
        if not hasattr(self, '_epy_is_init_done'):
            raise ValueError(
                'Child of `@epy.frozen` class should be `@epy.frozen` too. (Error'
                f' raised by {type(self)})'
            )
        if not self._epy_is_init_done:  # pylint: disable=protected-access
            return setattr_fn(self, name, value)
        else:
            raise AttributeError(
                f'Cannot assign {name!r} in `@epy.frozen` class {type(self)}'
            )

    return new_setattr
