Hello!
I need each child class to have its own set of constants. I've found a "proper" way using properties and overridden setter methods, but:
1. I have to define a constructor in the child classes (which I don't otherwise need) and assign the values there;
2. Every instance of the class gets its own copy of these constants in memory (a pointless use of resources);
3. It looks weird to define a setter, a getter and a property at all just to use them as constants.
I've done something like this:
class BaseClass:
    def get_a(self):
        raise NotImplementedError("Oooops")

    def get_b(self):
        raise NotImplementedError("Oooops")

class FirstClass(BaseClass):
    def get_a(self):
        return "a"

    def get_b(self):
        return "b"

class SecondClass(BaseClass):
    def get_a(self):
        return "A"

    def get_b(self):
        return "B"

class SomeClass:
    def some_method(self, class_param):
        return "{}-{}".format(class_param.get_a(), class_param.get_b())
This approach is more compact, but it still has the same problems as the property-based one (except the last). There's another way, which I don't consider good:
class BaseClass:
    pass

class FirstClass(BaseClass):
    A_CONST = "a"
    B_CONST = "b"

class SecondClass(BaseClass):
    A_CONST = "A"
    B_CONST = "B"

class SomeClass:
    def some_method(self, class_param):
        return "{}-{}".format(class_param.A_CONST, class_param.B_CONST)
In fact, it solves all the problems and is pretty compact, BUT doesn't it violate the rules of inheritance?
Question:
What is the proper way to do this?
P.S. The provided code is a simplified example; the base class contains methods which I use in the child classes, so please don't tell me the base class is useless here.
If you want your base class to indicate that it needs to be subclassed with certain attributes, you can make it an abstract base class.
from abc import ABC, abstractmethod

class Base(ABC):
    @property
    @abstractmethod
    def a(self):
        raise NotImplementedError

    @property
    @abstractmethod
    def b(self):
        raise NotImplementedError
You will then not be allowed to instantiate Base or its subclasses unless they override the abstract methods. You can do either
class First(Base):
    a = 1
    b = 2
to assign class attributes with those names, or
class Second(Base):
    @Base.a.getter
    def a(self):
        return 3

    @Base.b.getter
    def b(self):
        return 4
The benefit of the second approach is that it will raise an error if you try to assign to the property
Second().a = 5 # AttributeError
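As a quick check of the "not allowed to instantiate" claim (my own snippet, reusing the Base and First classes from this answer):

class Broken(Base):
    a = 1  # b is still abstract

for cls in (Base, Broken):
    try:
        cls()
    except TypeError as err:
        print(err)  # Can't instantiate abstract class ... with abstract method(s) ...

print(First().a)  # 1 -- both abstract names are overridden, so instantiation works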
Your second version looks fine to me… each language has its own conventions around what a "class" or "object" means, and this looks reasonably "Pythonic".
One minor comment about the first version: Python doesn't care about "overloading", so you don't need to include:
class BaseClass:
    def get_a(self):
        raise NotImplementedError("Oooops")
at all, i.e. it's fine to have:
class BaseClass:
    pass
as well in your first version.
Another potentially useful tool here is the property decorator, e.g.:
class FirstClass(BaseClass):
    @property
    def a(self):
        return "a"

print(FirstClass().a)
would output "a"
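A small follow-up (not part of the original answer): because no setter is defined, such a property is effectively read-only, which suits the constant use case. This assumes the FirstClass definition just above.

obj = FirstClass()
print(obj.a)  # a
try:
    obj.a = "x"
except AttributeError as err:
    print("read-only:", err)  # a property with no setter rejects assignment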
If the attribute names (A_CONST, B_CONST) remain the same across the child classes, super() will take care of all your concerns (1., 2., 3.).
A 'Pythonic' solution would be to remove any duplicated setters and getters from the child classes and let BaseClass() handle these common tasks.
class BaseClass(object):
    def __init__(self, a, b):
        self._a_const = a
        self._b_const = b

    @property
    def A_CONST(self):
        return self._a_const

    @property
    def B_CONST(self):
        return self._b_const

class FirstClass(BaseClass):
    def __init__(self, _aconst, _bconst):
        # Let the BaseClass object hold the constants, but the FirstClass
        # constructor sets the values. See SecondClass as well.
        super(FirstClass, self).__init__(_aconst, _bconst)

class SecondClass(BaseClass):
    def __init__(self, _aconst, _bconst):
        # Magic happens here
        super(SecondClass, self).__init__(_aconst, _bconst)

class SomeClass:
    def some_method(self, class_param):
        return "{}-{}".format(class_param.A_CONST, class_param.B_CONST)

firstobj = FirstClass("a", "b")
secondobj = SecondClass("A", "B")
print(SomeClass().some_method(firstobj))
print(SomeClass().some_method(secondobj))
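For reference, with the arguments used above, the two print calls produce:

a-b
A-B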
Related
Suppose I have:
class Super:
    def __init__(self, a):
        self.a = a

    @classmethod
    def from_b(cls, b):
        return cls(b.to_a())

class Regular(Super):
    def __init__(self, b):
        # how to set my super to the output of
        super = super.from_b(b)
How do I correctly initialize the super class with the output of the super class method rather than __init__()?
My OOP background is in C++ and I am continually getting into these scenarios due to the ability to overload constructors in C++, so a workaround for this would be awesome.
@shx2's answer works but wastefully/awkwardly creates a throw-away Super object just to initialize the new Regular object with its a attribute.
If you have control over the source of Super, you can make the from_b method create an instance of the given subclass, and have the subclass call the from_b method in its __new__ method instead, so that a Regular object can be both created and initialized directly:
class Super:
    def __init__(self, a):
        self.a = a

    @classmethod
    def from_b(cls, b):
        obj = super().__new__(cls)
        cls.__init__(obj, b.to_a())
        return obj

class Regular(Super):
    def __new__(cls, b):
        return super().from_b(b)
so that the following assertions will pass:
from unittest.mock import Mock

b = Mock()
obj = Regular(b)
assert type(obj) is Regular
assert b.to_a.called  # from_b() consumed b by calling to_a()
This is slightly awkward (since what you're trying to do is slightly awkward), but it would work:
class Super:
    def __init__(self, a):
        self.a = a

    @classmethod
    def from_b(cls, b):
        return cls(b.to_a())

class Regular(Super):
    def __init__(self, b):
        a = Super.from_b(b).a
        super().__init__(a)
By the way, it might help to keep in mind that a "constructor" method such as from_b() (typically) returns a new object, while __init__() only initializes an object after it has been created.
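A minimal sketch of that distinction (my own illustration; the Point and from_tuple names are hypothetical):

class Point:
    def __init__(self, x, y):       # only fills in an already-created instance
        self.x, self.y = x, y

    @classmethod
    def from_tuple(cls, pair):      # alternate constructor: creates and returns a new instance
        return cls(pair[0], pair[1])

p = Point.from_tuple((1, 2))
assert (p.x, p.y) == (1, 2)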
In the case of multiple inheritance in Python, is there a way to identify which super class a class-level variable is obtained from?
All my attempts at googling this overwhelmingly turn up results about how to get the attribute, not how to find out where it came from:
https://www.google.com/search?q=pythin+which+super+class+defines+attr
https://www.google.com/search?q=python+which+super+class+has+attribute&oq=python+which+super+class+has+attr
https://www.google.com/search?q=python+which+super+class+attribute+obtained+from
I suppose I can manually step through the MRO using inspect.getmro(cls). But I couldn't find any more elegant solutions. Just wondering if anyone knows of one.
EDIT
For a concrete example:
class Super1(object):
    __class_attribute__ = "Foo"

class Super2(object):
    pass

class Derived(Super1, Super2):
    pass

d = Derived()
parent_cls = some_function_to_get_defining_class(d.__class_attribute__)  # <-- should return `Super1`
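Since the question already mentions stepping through the MRO, here is a minimal sketch of that manual approach (my own code; the helper name find_defining_class is hypothetical, it takes the attribute name as a string, and it reuses the Derived and Super1 classes above):

import inspect

def find_defining_class(cls, attr_name):
    # walk the MRO front to back and return the first class whose own
    # __dict__ defines the attribute
    for klass in inspect.getmro(cls):
        if attr_name in vars(klass):
            return klass
    raise AttributeError(attr_name)

assert find_defining_class(Derived, "__class_attribute__") is Super1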
The __qualname__ attribute gives an indication of which class a method was inherited from. However, this only returns a string, not the superclass itself. If you need the superclass for metaprogramming, I think you are going to have to dig into the MRO.
class A:
    def a(self):
        return 1

    def b(self):
        return 2

class B:
    def b(self):
        return 2.5

    def c(self):
        return 3

class C(A, B):
    pass
Using:
C.b.__qualname__
# returns:
'A.b'
However, this does not apply when using abstract methods to define an interface, since the method has to be overridden.
from abc import abstractmethod

class A:
    def a(self):
        return 1

    @abstractmethod
    def b(self):
        pass

class C(A):
    def b(self):
        return 100

C.b.__qualname__
# returns:
'C.b'
I have a class that is a super-class to many other classes. I would like to know (in the __init__() of my super-class) if the subclass has overridden a specific method.
I tried to accomplish this with a class method, but the results were wrong:
class Super:
    def __init__(self):
        if self.method == Super.method:
            print 'same'
        else:
            print 'different'

    @classmethod
    def method(cls):
        pass

class Sub1(Super):
    def method(self):
        print 'hi'

class Sub2(Super):
    pass

Super()  # should be same
Sub1()   # should be different
Sub2()   # should be same
>>> same
>>> different
>>> different
Is there any way for a super-class to know if a sub-class has overridden a method?
It seems simplest and sufficient to do this by comparing the common subset of the dictionaries of an instance and the base class itself, e.g.:
def detect_overridden(cls, obj):
    common = cls.__dict__.keys() & obj.__class__.__dict__.keys()
    diff = [m for m in common if cls.__dict__[m] != obj.__class__.__dict__[m]]
    print(diff)

def f1(self):
    pass

class Foo:
    def __init__(self):
        detect_overridden(Foo, self)

    def method1(self):
        print("Hello foo")

    method2 = f1

class Bar(Foo):
    def method1(self):
        print("Hello bar")

    method2 = f1  # This is pointless but not an override
    # def method2(self):
    #     pass

b = Bar()
f = Foo()
Runs and gives:
['method1']
[]
If you want to check for an overridden instance method in Python 3, you can do this using the type of self:
class Base:
    def __init__(self):
        if type(self).method == Base.method:
            print('same')
        else:
            print('different')

    def method(self):
        print('Hello from Base')

class Sub1(Base):
    def method(self):
        print('Hello from Sub1')

class Sub2(Base):
    pass
Now Base() and Sub2() should both print "same" while Sub1() prints "different". The classmethod decorator in the original code causes the method to be bound to the type of self, and since the type of a subclass is by definition different from its base class, the two bound class methods compare as not equal even when nothing is overridden. By making the method an instance method and using the type of self, you're comparing a plain function against another plain function, and assuming functions (or unbound methods, if you're using Python 2) compare equal to themselves (which they do in the CPython implementation), the desired behavior is produced.
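A small check of that explanation (my own snippet, mirroring the question's Super and Sub2 names):

class Super:
    @classmethod
    def method(cls):
        pass

class Sub2(Super):
    pass

print(Sub2.method == Super.method)                    # False: bound to different classes
print(Sub2.method.__func__ is Super.method.__func__)  # True: same underlying function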
You can use your own decorator. But this is a trick and will only work on classes where you control the implementation.
def override(method):
    method.is_overridden = True
    return method

class Super:
    def __init__(self):
        if hasattr(self.method, 'is_overridden'):
            print 'different'
        else:
            print 'same'

    @classmethod
    def method(cls):
        pass

class Sub1(Super):
    @override
    def method(self):
        print 'hi'

class Sub2(Super):
    pass

Super()  # should be same
Sub1()   # should be different
Sub2()   # should be same
>>> same
>>> different
>>> same
In reply to the answer https://stackoverflow.com/a/9437273/1258307 (since I don't have enough reputation yet to comment on it): it will not work under Python 3 unless you replace im_func with __func__, and it will also not work in Python 3.4 (and most likely onward), since plain functions no longer have the __func__ attribute, only bound methods do.
EDIT: Here's the solution to the original question (which worked on 2.7 and 3.4, and I assume all other versions in between):
class Super:
    def __init__(self):
        if self.method.__code__ is Super.method.__code__:
            print('same')
        else:
            print('different')

    @classmethod
    def method(cls):
        pass

class Sub1(Super):
    def method(self):
        print('hi')

class Sub2(Super):
    pass

Super()  # should be same
Sub1()   # should be different
Sub2()   # should be same
And here's the output:
same
different
same
You can compare whatever is in the class's __dict__ with the function inside the method you retrieve from the object. The detect_overriden function below does that. The trick is to pass the parent class by name, just as one does in a call to super(); otherwise it is not easy to retrieve attributes from the parent class itself rather than those of the subclass:
# -*- coding: utf-8 -*-
from types import FunctionType

def detect_overriden(cls, obj):
    res = []
    for key, value in cls.__dict__.items():
        if isinstance(value, classmethod):
            value = getattr(cls, key).im_func
        if isinstance(value, (FunctionType, classmethod)):
            meth = getattr(obj, key)
            if not meth.im_func is value:
                res.append(key)
    return res

# Test and example
class A(object):
    def __init__(self):
        print detect_overriden(A, self)
    def a(self): pass
    @classmethod
    def b(self): pass
    def c(self): pass

class B(A):
    def a(self): pass
    # @classmethod
    def b(self): pass
Edit: changed the code to work fine with classmethods as well;
if it detects a classmethod on the parent class, it extracts the underlying function before proceeding.
--
Another way of doing this, without having to hard-code the class name, would be to follow the instance's class (self.__class__) method resolution order (given by the __mro__ attribute) and search for duplicates of the methods and attributes defined in each class along the inheritance chain.
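A sketch of what that MRO-based variant could look like (my own Python 3 code; the helper name detect_overridden_mro is hypothetical). It reports the names that the instance's class redefines relative to any ancestor in its __mro__:

from types import FunctionType

def detect_overridden_mro(obj):
    cls = obj.__class__
    overridden = set()
    for parent in cls.__mro__[1:]:  # every ancestor, skipping cls itself
        for name, value in vars(parent).items():
            if isinstance(value, (FunctionType, classmethod)):
                if name in vars(cls) and vars(cls)[name] is not value:
                    overridden.add(name)
    return sorted(overridden)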
I'm using the following method to determine if a given bound method is overridden or originates from the parent class
class A():
    def bla(self):
        print("Original")

class B(A):
    def bla(self):
        print("Overridden")

class C(A):
    pass

def isOverriddenFunc(func):
    obj = func.__self__
    prntM = getattr(super(type(obj), obj), func.__name__)
    return func.__func__ != prntM.__func__

b = B()
c = C()
b.bla()
c.bla()
print(isOverriddenFunc(b.bla))
print(isOverriddenFunc(c.bla))
Result:
Overridden
Original
True
False
Of course, for this to work, the method must be defined in the base class.
You can also check if something is overridden from its parents, without knowing any of the classes involved, using super:
class A:
    def fuzz(self):
        pass

class B(A):
    def fuzz(self):
        super().fuzz()

class C(A):
    pass
>>> b = B(); c = C()
>>> b.__class__.fuzz is super(b.__class__, b).fuzz.__func__
False
>>> c.__class__.fuzz is super(c.__class__, c).fuzz.__func__
True
See this question for some more nuggets of information.
A general function:
def overrides(instance, function_name):
    return (getattr(instance.__class__, function_name)
            is not getattr(super(instance.__class__, instance), function_name).__func__)
>>> overrides(b, "fuzz")
True
>>> overrides(c, "fuzz")
False
You can check to see if the function has been overridden by seeing if the function handle points to the Super class function or not. The function handler in the subclass object points either to the Super class function or to an overridden function in the Subclass. For example:
class Test:
    def myfunc1(self):
        pass

    def myfunc2(self):
        pass

class TestSub(Test):
    def myfunc1(self):
        print('Hello World')
>>> test = TestSub()
>>> test.myfunc1.__func__ is Test.myfunc1
False
>>> test.myfunc2.__func__ is Test.myfunc2
True
If the function handle does not point to the function in the Super class, then it has been overridden.
Not sure if this is what you're looking for but it helped me when I was looking for a similar solution.
class A:
    def fuzz(self):
        pass

class B(A):
    def fuzz(self):
        super().fuzz()

assert 'super' in B.__dict__['fuzz'].__code__.co_names
The top-trending answer and several others use some form of Sub.method == Base.method. However, this comparison can return a false negative if Sub and Base do not share the same import syntax. For example, see the discussion here explaining a scenario where issubclass(Sub, Base) -> False.
This subtlety is not apparent when running many of the minimal examples here, but it can show up in a more complex code base. The more reliable approach is to compare the method defined in the Sub.__bases__ entry corresponding to Base, because __bases__ is guaranteed to use the same import path as Sub:
import inspect

def method_overridden(cls, base, method):
    """Determine if a class overrides the implementation of a specific base class method.

    :param type cls: Subclass inheriting (and potentially overriding) the method
    :param type base: Base class where the method is inherited from
    :param str method: Name of the inherited method
    :return bool: Whether ``cls.method != base.method`` regardless of the import
        syntax used to create the two classes
    :raises NameError: If ``base`` is not in the MRO of ``cls``
    :raises AttributeError: If ``base.method`` is undefined
    """
    # Figure out which base class from the MRO to compare against
    base_cls = None
    for parent in inspect.getmro(cls):
        if parent.__name__ == base.__name__:
            base_cls = parent
            break
    if base_cls is None:
        raise NameError(f'{base.__name__} is not in the MRO for {cls}')
    # Compare the method implementations
    return getattr(cls, method) != getattr(base_cls, method)
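A hypothetical usage example of the helper above (the class names are my own):

class Parent:
    def method(self):
        pass

class Child(Parent):
    def method(self):
        pass

class Other(Parent):
    pass

print(method_overridden(Child, Parent, "method"))  # True
print(method_overridden(Other, Parent, "method"))  # False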
I have python class trees, each made up of an abstract base class and many deriving concrete classes. I want all concrete classes to be accessible through a base-class method, and I do not want to specify anything during child-class creation.
This is what my imagined solution looks like:
class BaseClassA(object):
    # <some magic code around here>

    @classmethod
    def getConcreteClasses(cls):
        # <some magic related code here>
        pass

class ConcreteClassA1(BaseClassA):
    pass  # no magic-related code here

class ConcreteClassA2(BaseClassA):
    pass  # no magic-related code here
As much as possible, I'd prefer to write the "magic" once as a sort of design pattern. I want to be able to apply it to different class trees in different scenarios (i.e. add a similar tree with "BaseClassB" and its concrete classes).
Thanks Internet!
You can use metaclasses for that:
class AutoRegister(type):
    def __new__(mcs, name, bases, classdict):
        new_cls = type.__new__(mcs, name, bases, classdict)
        # print mcs, name, bases, classdict
        for b in bases:
            if hasattr(b, 'register_subclass'):
                b.register_subclass(new_cls)
        return new_cls

class AbstractClassA(object):
    __metaclass__ = AutoRegister
    _subclasses = []

    @classmethod
    def register_subclass(klass, cls):
        klass._subclasses.append(cls)

    @classmethod
    def get_concrete_classes(klass):
        return klass._subclasses

class ConcreteClassA1(AbstractClassA):
    pass

class ConcreteClassA2(AbstractClassA):
    pass

class ConcreteClassA3(ConcreteClassA2):
    pass

print AbstractClassA.get_concrete_classes()
I'm personally very wary of this kind of magic. Don't put too much of it in your code.
Here is a simple solution using modern Python's (3.6+) __init_subclass__, defined in PEP 487. It allows you to avoid using a metaclass.
class BaseClassA(object):
    _subclasses = []

    @classmethod
    def get_concrete_classes(cls):
        return list(cls._subclasses)

    def __init_subclass__(cls):
        BaseClassA._subclasses.append(cls)

class ConcreteClassA1(BaseClassA):
    pass  # no magic-related code here

class ConcreteClassA2(BaseClassA):
    pass  # no magic-related code here

print(BaseClassA.get_concrete_classes())
You should know that part of the answer you're looking for is built-in. New-style classes automatically keep a weak reference to all of their child classes which can be accessed with the __subclasses__ method:
@classmethod
def getConcreteClasses(cls):
    return cls.__subclasses__()
This won't return sub-sub-classes. If you need those, you can create a recursive generator to get them all:
@classmethod
def getConcreteClasses(cls):
    for c in cls.__subclasses__():
        yield c
        for c2 in c.getConcreteClasses():
            yield c2
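A hypothetical end-to-end usage of that recursive variant, attaching the method to the question's BaseClassA (my own assembly of the pieces above):

class BaseClassA(object):
    @classmethod
    def getConcreteClasses(cls):
        for c in cls.__subclasses__():
            yield c
            for c2 in c.getConcreteClasses():
                yield c2

class ConcreteClassA1(BaseClassA):
    pass

class ConcreteClassA2(BaseClassA):
    pass

class ConcreteClassA3(ConcreteClassA2):  # a sub-subclass, picked up by the recursion
    pass

print(list(BaseClassA.getConcreteClasses()))
# e.g. [<class '__main__.ConcreteClassA1'>, <class '__main__.ConcreteClassA2'>, <class '__main__.ConcreteClassA3'>]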
Another way to do this, with a decorator, if your subclasses are either not defining __init__ or are calling their parent's __init__:
def lister(cls):
    cls.classes = list()
    cls._init = cls.__init__

    def init(self, *args, **kwargs):
        cls = self.__class__
        if cls not in cls.classes:
            cls.classes.append(cls)
        cls._init(self, *args, **kwargs)
    cls.__init__ = init

    @classmethod
    def getclasses(cls):
        return cls.classes
    cls.getclasses = getclasses

    return cls

@lister
class A(object): pass

class B(A): pass

class C(A):
    def __init__(self):
        super(C, self).__init__()

b = B()
c = C()
c2 = C()
print 'Classes:', c.getclasses()
It will work whether or not the base class defines __init__.
Is it possible, when instantiating an object, to pass in a class which the object should derive from?
For instance:
class Red(object):
    def x(self):
        print '#F00'

class Blue(object):
    def x(self):
        print '#00F'

class Circle(object):
    def __init__(self, parent):
        # here, we set Bar's parent to `parent`
        self.x()

class Square(object):
    def __init__(self, parent):
        # here, we set Bar's parent to `parent`
        self.x()
        self.sides = 4

red_circle = Circle(parent=Red)
blue_circle = Circle(parent=Blue)
blue_square = Square(parent=Blue)
Which would have similar effects as:
class Circle(Red):
    def __init__(self):
        self.x()
without, however, affecting other instances of Circle.
Perhaps what you are looking for is a class factory:
#!/usr/bin/env python
class Foo(object):
    def x(self):
        print('y')

def Bar(parent=Foo):
    class Adoptee(parent):
        def __init__(self):
            self.x()
    return Adoptee()

obj = Bar(parent=Foo)
I agree with @AntsAasma. You should probably consider using dependency injection. At least in the example given (which I'm sure is greatly simplified to illustrate your problem), the color of a shape is better represented via a has-a relationship than an is-a relationship.
You could implement this by passing the desired color object to the constructor, storing a reference to it, and delegating the function call to this object. This greatly simplifies the implementation while still retaining the desired behavior. See an example here:
class Red(object):
    def x(self):
        print '#F00'

class Blue(object):
    def x(self):
        print '#00F'

class Shape(object):
    def __init__(self, color):
        self._color = color

    def x(self):
        return self._color.x()

class Circle(Shape):
    def __init__(self, color):
        Shape.__init__(self, color)
        self.x()

class Square(Shape):
    def __init__(self, color):
        Shape.__init__(self, color)
        self.x()
        self.sides = 4

red_circle = Circle(color=Red())
blue_circle = Circle(color=Blue())
blue_square = Square(color=Blue())
Edit: Fixed names of constructor arguments in sample code
It sounds like you are trying to use inheritance for something that it isn't meant for. If you would explain why you want to do this, maybe a more idiomatic and robust way to achieve your goals can be found.
If you really need it, then you could use the type constructor, e.g. within a factory function (or inside a __new__ method, but the factory is probably the safer approach):
class Foo(object):
    def x(self):
        print 'y'

class Bar(object):
    def __init__(self):
        self.x()

def magic(cls, parent, *args, **kwargs):
    new = type(cls.__name__, (parent,), cls.__dict__.copy())
    return new(*args, **kwargs)

obj = magic(Bar, parent=Foo)
As everybody else says, that's a pretty weird usage, but, if you really want it, it's surely feasible (except for the mysterious Bar that you pull out of thin air in comments;-). For example:
class Circle(object):
    def __init__(self, parent):
        self.__class__ = type('Circle', (self.__class__, parent), {})
        self.x()
This gives each instance of Circle its own personal class (all named Circle, but all different) -- this part is actually the key reason this idiom is sometimes very useful (when you want a "per-instance customized special method" with new-style classes: since the special method always gets looked up on the class, to customize it per-instance you need each instance to have a distinct class!-). If you'd rather do as much class-sharing as feasible you may want a little memoizing factory function to help:
_memo = {}
def classFor(*bases):
    if bases in _memo:
        return _memo[bases]
    name = '_'.join(c.__name__ for c in bases)
    c = _memo[bases] = type(name, bases, {})
    return c
(here I'm also using a different approach to the resulting class's name, using class names such as Circle_Red and Circle_Blue for your examples rather than just Circle). Then:
class Circle(object):
    def __init__(self, parent):
        self.__class__ = classFor(Circle, parent)
        self.x()
So the technique is smooth and robust, but I still don't see it as a good match to the use case you exemplify with. However, it might be useful in other use cases, so I'm showing it.
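As a quick check of the class-sharing behaviour (my own Python 3 snippet, redefining Red and Blue from the question and assuming the classFor and Circle definitions just above are in scope):

class Red:
    def x(self):
        print('#F00')

class Blue:
    def x(self):
        print('#00F')

red1 = Circle(Red)
red2 = Circle(Red)
blue = Circle(Blue)
assert type(red1) is type(red2)      # both instances share the memoized Circle_Red class
assert type(red1) is not type(blue)  # Circle_Red vs Circle_Blue
print(type(red1).__name__, type(blue).__name__)  # Circle_Red Circle_Blue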