I have a class A.
I have another class B. Instances of class B should function exactly like class A, except for one caveat: I want another method available, called special_method(self, args, kwargs).
So the following should work:
instance_A = classA(args, kwargs)
instance_B = classB(instance_A)
method_result = instance_B.special_method(args, kwargs)
How do I write class B to accomplish this?
Note: If I only wanted to do this for ONE class A, I could just have class B inherit from class A, but I want to be able to add special_method to classes C, D, E, F... etc.
So, you are describing a proxy object. Doing this for regular (non-special) methods is trivial in Python: you can use __getattr__:
In [1]: class A:
   ...:     def foo(self):
   ...:         return "A"
   ...:

In [2]: class B:
   ...:     def __init__(self, instance):
   ...:         self._instance = instance
   ...:     def special_method(self, *args, **kwargs):
   ...:         # do something special
   ...:         return 42
   ...:     def __getattr__(self, name):
   ...:         return getattr(self._instance, name)
   ...:

In [3]: a = A()

In [4]: b = B(a)

In [5]: b.foo()
Out[5]: 'A'

In [6]: b.special_method()
Out[6]: 42
However, there is one caveat here: this won't work with special ("dunder") methods, because implicit special method invocation skips this part of attribute resolution; special methods are looked up directly on the class's __dict__.
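A quick demonstration of that caveat, reusing the proxy class B from the transcript above (a minimal sketch; the HasLen name is just illustrative):

class HasLen:
    def __len__(self):
        return 3

proxied = B(HasLen())
print(proxied.__len__())  # 3 -- explicit lookup is routed through __getattr__
len(proxied)              # TypeError: implicit special-method lookup skips the proxy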
As an alternative, you can simply add the method to all the classes you need. Something like:
def special_method(self, *args, **kwargs):
    # do something special
    return 42

for klass in [A, C, D, E, F]:
    klass.special_method = special_method
Of course, this would affect all instances of these classes (since you are simply dynamically adding a method to the class).
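For example, an instance created before the class is patched picks up the method as well (a quick check, reusing the A and special_method defined above):

a = A()                      # created before the patch
A.special_method = special_method
print(a.special_method())    # 42 -- existing instances see the new method too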
If you really need special methods, your best bet would be to create a subclass. But you can do this dynamically with a simple helper function, e.g.:
def special_method(self, *args, **kwargs):
    # do something special
    return 42

_SPECIAL_MEMO = {}

def dynamic_mixin(klass, *init_args, **init_kwargs):
    if klass not in _SPECIAL_MEMO:
        child = type(f"{klass.__name__}Special", (klass,), {"special_method": special_method})
        _SPECIAL_MEMO[klass] = child
    return _SPECIAL_MEMO[klass](*init_args, **init_kwargs)
class Foo:
    def __init__(self, foo):
        self.foo = foo
    def __len__(self):
        return 88
    def bar(self):
        return self.foo * 2

special_foo = dynamic_mixin(Foo, 10)
print("calling len", len(special_foo))
print("calling bar", special_foo.bar())
print("calling special method", special_foo.special_method())
The above script prints:
calling len 88
calling bar 20
calling special method 42
Related
The methods should be callable with explicitly passed parameters, while also taking the class's attributes into account. So what I would like to achieve is overriding a method's arguments with preset attributes.
E.g.:
class Obj:
    def __init__(self, cfg=None):
        self.cfg = {}
        if cfg:
            self.cfg = cfg
        return

    def _actual_f_1(self, x, **kwargs):
        return x + 100

    def f_1(self, x):
        new = {"x": x, **self.cfg}
        return self._actual_f_1(**new)
o = Obj()
o.f_1(1)
which prints
101
OR using the overriding approach:
o = Obj({"x": 100})
o.f_1(1)
which now gives
200
The defined class Obj looks pretty clumsy, especially if several methods of the class should use the described logic.
How can I generalize the logic of f_1 which basically only alters the parameters before calling the actual method?
You can use __init_subclass__ in a base class to decorate all methods in a class, in a transparent way, so that they pick the fitting named parameters from a .cfg dict if those are not passed explicitly.
So, first let's think of the code for such a decorator. Applying arguments can be rather complicated: between positional and named parameters, with "positional only" and "keyword only" in the mix, the number of combinations explodes.
Python's stdlib has inspect.signature, which returns a Signature object with enough functionality to cover all the corner cases. I use it here, taking just the common path, so that if the arguments you want to apply automatically are ordinary positional-or-keyword or keyword-only parameters, it should work:
import inspect
from functools import wraps

def extender(method):
    sig = inspect.signature(method)
    @wraps(method)
    def wrapper(self, *args, **kwargs):
        # bind "self" as well, so positional arguments line up with the signature
        bound = sig.bind_partial(self, *args, **kwargs)
        for parameter in sig.parameters.keys():
            if parameter not in bound.arguments and parameter in getattr(self, "cfg", {}):
                kwargs[parameter] = self.cfg[parameter]
        return method(self, *args, **kwargs)
    return wrapper
Here we can see it working:
In [78]: class A:
    ...:     def __init__(self):
    ...:         self.cfg = {"b": 5}
    ...:     @extender
    ...:     def a(self, a, b, c=10):
    ...:         print(a, b, c)
    ...:

In [79]: a = A()

In [80]: a.a(1)
1 5 10

In [81]: a.a(1, c=2)
1 5 2

In [82]: a.a(1, c=2, b=3)
1 3 2
With only this decorator, your code could be rewritten as:
class Obj:
    def __init__(self, cfg=None):
        self.cfg = cfg if cfg is not None else {"extra": 10}

    @extender
    def f_1(self, x, extra):
        return x + 100
And if you want a base-class that will transparently wrap all methods in all subclasses with the extender, you can use this:
class Base:
    def __init_subclass__(cls, *args, **kwargs):
        super().__init_subclass__(*args, **kwargs)
        for name, value in cls.__dict__.items():
            if callable(value):
                setattr(cls, name, extender(value))
And now one can use simply:
In [84]: class B(Base):
    ...:     def __init__(self):
    ...:         self.cfg = {"c": 10}
    ...:     def b(self, a, c):
    ...:         print(a, c)
    ...:

In [85]: B().b(1)
1 10
This decorator, unlike your example, takes care to inject only the arguments the function expects to receive, rather than all of the keys from self.cfg on every function call.
If you want that behavior instead, you just have to take care to expand the passed arguments first and then update them with the cfg dict, so that the values in cfg override explicitly passed arguments (matching your second example). The decorator code for that would be:
import inspect
from functools import wraps

def extender_kw(method):
    sig = inspect.signature(method)
    @wraps(method)
    def wrapper(self, *args, **kwargs):
        # bind "self" as well, so positional arguments line up with the signature
        bound = sig.bind_partial(self, *args, **kwargs)
        merged = dict(bound.arguments)
        merged.pop("self", None)
        merged |= getattr(self, "cfg", {})  # values in cfg win
        return method(self, **merged)
    return wrapper
I am interpreting "generalize" as "write this with fewer lines of code."
You wrote
self.cfg = {}
if cfg:
self.cfg = cfg
return
The 4th line is superfluous and can be elided.
The first 3 lines could be a simple assignment of ... = cfg or {}
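That is, something like:

self.cfg = cfg or {}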
Or we could inherit from a utility class, and make a super().__init__ call.
So now we're presumably down to DRYing up and condensing these two lines:
new = {"x": x, **self.cfg}
return self._actual_f_1(**new)
Maybe write them as

return self._actual_f_1(self.args())

where the parent abstract class offers an args helper that knows about self.cfg. It would inspect the stack to find the caller's arg signature, and merge it with cfg. Alternatively you could phrase it as

return self.call(self._actual_f_1)

though that seems less convenient.
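Here is a rough sketch of what such an args helper might look like; purely illustrative (the Helper name and the reliance on CPython frame introspection are my assumptions, not tested guidance):

import inspect

class Helper(object):
    cfg = {}

    def args(self):
        # The caller's locals are exactly its arguments when args()
        # is invoked at the top of the calling method.
        caller_locals = dict(inspect.stack()[1].frame.f_locals)
        caller_locals.pop('self', None)
        return {**caller_locals, **self.cfg}  # cfg wins, as in your example

With a helper like this, f_1 reduces to return self._actual_f_1(**self.args()) (note the ** unpacking).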
Do let us know the details of how you wind up resolving this.
The get_calling_class function must pass the following tests by returning the class of the method that called the A.f method:
class A:
    def f(self): return get_calling_class()

class B(A):
    def g(self): return self.f()

class C(B):
    def h(self): return self.f()

c = C()
assert c.g() == B
assert c.h() == C
Walking the stack should give the answer; ideally, the answer would be found in the caller's stack frame. The problem is that stack frames only record function names (like so: 'f', 'g', 'h', etc.); any information about classes is lost. Trying to reverse-engineer the lost info by navigating the class hierarchy (in parallel with the stack frames) did not get me very far, and got complicated. So, here is a different approach: inject the class info into the stack frame (e.g. with local variables), and read that from the called function.
import inspect

class A:
    def f(self):
        frame = inspect.currentframe()
        callerFrame = frame.f_back
        callerLocals = callerFrame.f_locals
        return callerLocals['cls']

class B(A):
    def g(self):
        cls = B
        return self.f()
    def f(self):
        cls = B
        return super().f()

class C(B):
    def h(self):
        cls = C
        return super(B, self).f()
    def f(self):
        cls = C
        return super().f()
c = C()
assert c.h() == C
assert c.g() == B
assert c.f() == B
Related:
get-fully-qualified-method-name-from-inspect-stack
Without modifying the definition of the subclasses: I added an "external" decorator to wrap the class methods (at least as a temporary solution).
import inspect

class Injector:
    def __init__(self, nameStr, valueStr):
        self.nameStr = nameStr
        self.valueStr = valueStr

    # Should inject directly in f's local scope / stack frame.
    # As is, it just adds another stack frame on top of f.
    def injectInLocals(self, f):
        def decorate(*args, **kwargs):
            exec(f'{self.nameStr} = {self.valueStr}')
            return f(*args, **kwargs)
        return decorate
class A:
    def f(self):
        frame = inspect.currentframe()
        callerDecoratorFrame = frame.f_back.f_back  # Note: twice
        callerDecoratorLocals = callerDecoratorFrame.f_locals
        return callerDecoratorLocals['cls']

class B(A):
    def g(self): return self.f()
    def f(self): return super().f()

class C(B):
    def h(self): return super(B, self).f()
    def f(self): return super().f()
bInjector = Injector('cls', B.__name__)
B.g = bInjector.injectInLocals(B.g)
B.f = bInjector.injectInLocals(B.f)
cInjector = Injector('cls', C.__name__)
C.h = cInjector.injectInLocals(C.h)
C.f = cInjector.injectInLocals(C.f)
c = C()
assert c.h() == C
assert c.g() == B
assert c.f() == B
I found this link very interesting (but didn't take advantage of metaclasses here): what-are-metaclasses-in-python

Maybe someone could even replace the function definitions*, with functions whose code is a duplicate of the original, but with added locals/information directly in their scope.

* Maybe after the class definitions have completed; maybe during class creation (using a metaclass).
In python, is there a way to prevent adding new class variables after defining the object?
For example:
class foo:
    def __init__(self):
        self.a = 1
        self.b = 2
        self.c = 3

bar = foo()
try:
    bar.d = 4
except Exception, e:
    print "I want this to always print"
Alternatively, is there a way to count the number of variables in an object?
class foo:
    def __init__(self):
        self.a = 1
        self.b = 2
        self.c = 3
    def count(self):
        ...

bar = foo()
if bar.count() == 3:
    print "I want this to always print"
The only way I thought of doing this was using a dictionary or list:
class foo:
    def __init__(self):
        self.dict = {'foo': 1, 'bar': 2}
        self.len = 2
    def chk(self):
        return self.len == len(self.dict)

However, doing this feels rather cumbersome for Python (obj.dict['foo']); I'd prefer just obj.foo if possible.
I want to have this so that I never accidentally declare a variable when I mean to change an existing one.
f = foo()
f.somename = 3
...
f.simename = 4 #this is a typo
if f.somename == 3:
    solve_everything()
I suggest using __setattr__ to avoid the oddities of __slots__.
You always have to be careful when messing with __setattr__, since it takes care of setting all instance attributes, including those you set in __init__. Therefore it has to have some way of knowing when to allow the setting of an attribute, and when to deny it. In this solution I've designated a special attribute that controls whether new attributes are allowed or not:
class A(object):
    def __init__(self):
        self.a = 1
        self.b = 2
        self.c = 3
        self.freeze = True

    def __setattr__(self, attr, value):
        if getattr(self, "freeze", False) and not hasattr(self, attr):
            raise AttributeError("You shall not set attributes!")
        super(A, self).__setattr__(attr, value)
Testing:
a = A()
try:
    a.d = 89
except AttributeError:
    print "It works!"
else:
    print "It doesn't work."
a.c = 42
print a.a
print a.c
a.freeze = False
a.d = 28
a.freeze = True
print a.d
Result:
It works!
1
42
28
Also see gnibbler's answer, which wraps this concept up neatly in a class decorator, so it doesn't clutter the class definition and can be reused in several classes without duplicating code.
EDIT:
Coming back to this answer a year later, I realize a context manager might solve this problem even better. Here's a modified version of gnibbler's class decorator:
from contextlib import contextmanager

@contextmanager
def declare_attributes(self):
    self._allow_declarations = True
    try:
        yield
    finally:
        self._allow_declarations = False

def restrict_attributes(cls):
    cls.declare_attributes = declare_attributes

    def _setattr(self, attr, value):
        disallow_declarations = not getattr(self, "_allow_declarations", False)
        if disallow_declarations and attr != "_allow_declarations":
            if not hasattr(self, attr):
                raise AttributeError("You shall not set attributes!")
        super(cls, self).__setattr__(attr, value)
    cls.__setattr__ = _setattr

    return cls
And here's how to use it:
@restrict_attributes
class A(object):
    def __init__(self):
        with self.declare_attributes():
            self.a = 1
            self.b = 2
            self.c = 3
So whenever you want to set new attributes, just use the with statement as above. It can also be done from outside the instance:
a = A()
try:
    a.d = 89
except AttributeError:
    print "It works!"
else:
    print "It doesn't work."
a.c = 42
print a.a
print a.c
with a.declare_attributes():
    a.d = 28
print a.d
In python, is there a way to prevent adding new class variables after defining the object?
Yes. __slots__. But do carefully read the notes.
How about a class decorator based on lazyr's answer?
def freeze(cls):
    _init = cls.__init__
    def init(self, *args, **kw):
        _init(self, *args, **kw)
        self.freeze = True
    cls.__init__ = init

    def _setattr(self, attr, value):
        if getattr(self, "freeze", None) and (attr == "freeze" or not hasattr(self, attr)):
            raise AttributeError("You shall not set attributes!")
        super(cls, self).__setattr__(attr, value)
    cls.__setattr__ = _setattr

    return cls
@freeze
class foo(object):
    def __init__(self):
        self.a = 1
        self.b = 2
        self.c = 3

bar = foo()
try:
    bar.d = 4
except Exception, e:
    print "I want this to always print"
Preventing the addition of new attributes using the __slots__ class attribute:
class foo(object):
    __slots__ = ['a', 'b', 'c']
    def __init__(self):
        self.a = 1
        self.b = 2
        self.c = 3

bar = foo()
try:
    bar.d = 4
except Exception as e:
    print(e, "I want this to always print")
Counting attributes:
print(len([attr for attr in dir(bar) if attr[0] != '_']))
Use this to count the number of attributes of an instance:
>>> class foo:
...     def __init__(self):
...         self.a = 1
...         self.b = 2
...         self.c = 3
...
>>> bar = foo()
>>> bar.__dict__
{'a': 1, 'c': 3, 'b': 2}
>>> len(bar.__dict__)  # returns no. of attributes of bar
3
Do you mean new class variables or new instance variables? The latter looks like what you mean and is much easier to do.
Per Ignacio Vazquez-Abrams's answer, __slots__ is probably what you want. Just do __slots__ = ('a', 'b', 'c') inside of your class and that will prevent any other attributes from being created. Note that this only applies to instances of your class -- class-level attributes can still be set, and subclasses can add whatever attributes they please. And he is right -- there are some oddities, so read the linked documentation before you start sprinkling slots everywhere.
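A quick illustration of both caveats (a minimal sketch; Foo and Sub are made-up names):

class Foo(object):
    __slots__ = ('a', 'b', 'c')

f = Foo()
try:
    f.d = 4              # blocked: 'd' is not a slot
except AttributeError:
    pass
Foo.d = 4                # class-level attributes can still be set

class Sub(Foo):          # a subclass without __slots__ gets a __dict__ back
    pass

s = Sub()
s.e = 5                  # so its instances accept new attributes again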
If you aren't using slots, return len(vars(self)) works as a body for your suggested count method.
As an alternative to slots, you could define a __setattr__ that rejects any attribute not on a "known good" list, or to reject any new attributes after a frozen attribute is set to True at the end of __init__, etc. This is harder to get right, but more flexible.
If you actually want your instances to be completely read-only after initialization, and you are using a recent version of Python, consider defining a namedtuple or subclass thereof. Tuple subclasses also have some limitations though; if you need to go this route I can expand on it, but I'd stick with slots unless you have a reason to do otherwise.
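For example, a minimal sketch with collections.namedtuple:

from collections import namedtuple

Foo = namedtuple('Foo', ['a', 'b', 'c'])
bar = Foo(1, 2, 3)
try:
    bar.a = 9            # namedtuple instances are completely read-only
except AttributeError:
    print("attributes cannot be rebound")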
Suppose you now want your class to have a fixed set of both mutable and immutable attributes? I've hacked gnibbler's answer to make class attributes immutable after init:
def frozenclass(cls):
    """ Modify a class to permit no new attributes after instantiation.
        Class attributes are immutable after init.
        The passed class must have a superclass (e.g., inherit from 'object').
    """
    _init = cls.__init__
    def init(self, *args, **kw):
        _init(self, *args, **kw)
        self.freeze = True
    cls.__init__ = init

    def _setattr(self, attr, value):
        if getattr(self, "freeze", None):
            if attr == "freeze" or not hasattr(self, attr):
                raise AttributeError("You shall not create attributes!")
            if hasattr(type(self), attr):
                raise AttributeError("You shall not modify immutable attributes!")
        super(cls, self).__setattr__(attr, value)
    cls.__setattr__ = _setattr

    return cls
And an example:
@frozenclass
class myClass(object):
    """ A demo class."""
    # The following are immutable after init:
    a = None
    b = None
    c = None
    def __init__(self, a, b, c, d=None, e=None, f=None):
        # Set the immutable attributes (just this once, only during init)
        self.a = a
        self.b = b
        self.c = c
        # Create and set the mutable attributes (modifiable after init)
        self.d = d
        self.e = e
        self.f = f
In Python, is there a way, when initializing a class, to change its superclass depending on the value of a class attribute? Here's an example of what I want to do. First, I have these classes:
class A(object):
    pass

class B(A):
    # extend and override class A
    pass

class C(A or B):
    # extend and override class A
    pass
Secondly, I want to create other classes that inherit from class C, but in some cases I want C to inherit from A, and in other cases to inherit from B:
class D(C):
    # C inherits only from A
    from_B = False

class E(C):
    # C inherits from B because attribute from_B = True
    from_B = True
I tried with a metaclass, but that was setting the base class of C (to A or B) for all subclasses (D, E, ...) at the initialization of the first subclass. So, if the first subclass to be initialized had from_B = True, all subclasses of C had C(B) as parent, whatever their from_B was set to. My code was something like this:
class MetaC(type):
    def __new__(cls, name, bases, attrs):
        if C in bases and getattr(attrs, 'from_B', False):
            C.__bases__[C.__bases__.index(A)] = B
        return super(MetaC, cls).__new__(cls, name, bases, attrs)

class C(A):
    __metaclass__ = MetaC
My goal is to avoid duplicating the code of the C class while keeping the option to have, or not have, the added functionality of the B class. I should mention that I don't have control over the A and B classes.
UPDATE
I think I got it with this metaclass (code is a bit rough at the moment):
class MetaC(type):
    def __new__(cls, name, bases, attrs):
        for base in bases:
            if base.__name__ == 'C':
                if attrs.has_key('from_B'):
                    list_bases = list(base.__bases__)
                    list_bases[list_bases.index(A)] = B
                    base.__bases__ = tuple(list_bases)
                elif B in base.__bases__:
                    list_bases = list(base.__bases__)
                    list_bases[list_bases.index(B)] = A
                    base.__bases__ = tuple(list_bases)
                break
        return super(MetaC, cls).__new__(cls, name, bases, attrs)
UPDATE 2
This solution doesn't work because I'm always modifying the base class C. So, when a subclass is instantiated, it will use the C class in its current state.
I ended up using cooperative multiple inheritance. It works fine. The only drawback is that, for methods that need to be called on several parent classes (like methods that are present in A, B and C), we need to be sure there's a super() call in each method definition of each class, and that they have the same calling signature in every case. Fortunately for me, my B classes respect this.
Example:
class A(object):
    def some_method(self, arg1, arg2, karg1=None):
        do_some_stuff(arg1, arg2, karg1)

class B(A):
    # extend and override class A
    def some_method(self, arg1, arg2, karg1=None):
        super(B, self).some_method(arg1, arg2, karg1)
        do_more_stuff(arg1, arg2, karg1)

class C(B, A):
    # extend and override class A
    def some_method(self, arg1, arg2, karg1=None):
        do_other_stuff(arg1, arg2, karg1)
        super(C, self).some_method(arg1, arg2, karg1)
This way, when some_method is called from C or C's children, all these calls will be made in this order:
C.some_method
A.some_method
B.some_method
Check The wonders of cooperative inheritance for more info on the subject.
This looks so painful, you should consider composition/delegation instead of contorting inheritance this way. What do you think of something like this?
class A(object):
    def from_B(self):
        return False

class B(object):
    def from_B(self):
        return True

class C(object):
    pass

class PolyClass(object):
    def __init__(self, *args):
        self.delegates = [c() for c in args[::-1]]

    def __getattr__(self, attr):
        for d in self.delegates:
            if hasattr(d, attr):
                return getattr(d, attr)
        raise AttributeError(attr + "? what the heck is that?")

    def __repr__(self):
        return "<instance of (%s)>" % ','.join(d.__class__.__name__
                                               for d in self.delegates[::-1])
pc1 = PolyClass(A,B)
pc2 = PolyClass(A,C)
pc3 = PolyClass(B,C)

for p in (pc1, pc2, pc3):
    print p, p.from_B()
print pc1.from_C()
Prints:
<instance of (A,B)> True
<instance of (A,C)> False
<instance of (B,C)> True
Traceback (most recent call last):
  File "varying_delegation.py", line 33, in <module>
    print pc1.from_C()
  File "varying_delegation.py", line 21, in __getattr__
    raise AttributeError(attr + "? what the heck is that?")
AttributeError: from_C? what the heck is that?
EDIT:
Here's how to take the not-in-your-control classes A and B, and create custom C classes that look like they extend either an A or a B:
# Django admin classes
class A(object):
    def from_B(self):
        return False

class B(A):
    def from_B(self):
        return True

# Your own class, which might get created with an A or B instance
class C(object):
    def __init__(self, obj):
        self.obj = obj
    def __getattr__(self, attr):
        return getattr(self.obj, attr)
# these are instantiated some way, not in your control
a,b = A(), B()
# now create different C's
c1 = C(a)
c2 = C(b)
print c1.from_B()
print c2.from_B()
prints:
False
True
And to create your subclasses D and E, create an interim subclass of C (I called it SubC cause I lack imagination), which will auto-init the C superclass with a specific global variable, either a or b.
# a subclass of C for subclasses pre-wired to delegate to a specific
# global object
class SubC(C):
    c_init_obj = None
    def __init__(self):
        super(SubC, self).__init__(self.c_init_obj)

class D(SubC): pass
class E(SubC): pass

# assign globals to C subclasses so they build with the correct contained
# global object
D.c_init_obj = a
E.c_init_obj = b
d = D()
e = E()
print d.from_B()
print e.from_B()
Again, prints:
False
True
I have a class:
class A(object):
    def __init__(self, a, b, c, d, e, f, g, ..........., x, y, z):
        # do some init stuff
And I have a subclass which needs one extra arg (the last W)
class B(A):
    def __init__(self, a, b, c, d, e, f, g, ..........., x, y, z, W):
        A.__init__(self, a, b, c, d, e, f, g, ..........., x, y, z)
        self.__W = W
It seems dumb to write all this boilerplate code, e.g. passing all the args from B's ctor into the call to A's ctor, since then every change to A's ctor must be applied in two other places in B's code.
I am guessing python has some idiom to handle such cases which I am unaware of. Can you point me in the right direction?
My best hunch, is to have a sort of Copy-Ctor for A and then change B's code into
class B(A):
    def __init__(self, instanceOfA, W):
        A.__copy_ctor__(self, instanceOfA)
        self.__W = W
This would suit my needs, since I always create the subclass when given an instance of the parent class, though I am not sure whether it's possible...
Considering that arguments could be passed either by name or by position, I'd code:
class B(A):
    def __init__(self, *a, **k):
        if 'W' in k:
            w = k.pop('W')
        else:
            a = list(a)
            w = a.pop()
        A.__init__(self, *a, **k)
        self._W = w
Edit: based on Matt's suggestion, and to address gnibbler's concern about a positional-argument approach, you might check to make sure that the additional subclass-specific argument is being specified, similar to Alex's answer:
class B(A):
    def __init__(self, *args, **kwargs):
        try:
            self._w = kwargs.pop('w')
        except KeyError:
            pass
        super(B, self).__init__(*args, **kwargs)
>>> b = B(1,2,w=3)
>>> b.a
1
>>> b.b
2
>>> b._w
3
Original answer:
Same idea as Matt's answer, using super() instead.
Use super() to call superclass's __init__() method, then continue initialising the subclass:
class A(object):
    def __init__(self, a, b):
        self.a = a
        self.b = b

class B(A):
    def __init__(self, w, *args):
        super(B, self).__init__(*args)
        self.w = w
In situations where some or all of the arguments passed to __init__ have default values, it can be useful to avoid repeating the __init__ method signature in subclasses.
In these cases, __init__ can pass any extra arguments to another method, which subclasses can override:
class A(object):
    def __init__(self, a=1, b=2, c=3, d=4, *args, **kwargs):
        self.a = a
        self.b = b
        # …
        self._init_extra(*args, **kwargs)

    def _init_extra(self):
        """
        Subclasses can override this method to support extra
        __init__ arguments.
        """
        pass

class B(A):
    def _init_extra(self, w):
        self.w = w
Are you wanting something like this?
class A(object):
    def __init__(self, a, b, c, d, e, f, g):
        # do stuff
        print a, d, g

class B(A):
    def __init__(self, *args):
        args = list(args)
        self.__W = args.pop()
        A.__init__(self, *args)