Python: Should I use delegation or inheritance here?

I am pondering if I should use inheritance or delegation to implement a kind of wrapper class. My problem is like this: Say I have a class named Python.
class Python:
    def __init__(self):
        ...

    def snake(self):
        """ Make python snake through the forest"""
        ...

    def sleep(self):
        """ Let python sleep """
        ...
... and much more behavior. Now I have existing code which expects an Anaconda, which is almost like a Python, but slightly different: Some members have slightly different names and parameters, other members add new functionality. I really want to reuse the code in Python. Therefore I could do this with inheritance:
class Anaconda(Python):
    def __init__(self):
        Python.__init__(self)

    def wriggle(self):
        """Different name, same thing"""
        Python.snake(self)

    def devourCrocodile(self, croc):
        """ Python can't do this"""
        ...
Of course I can also call Anaconda().sleep(). But here is the problem: There is a PythonFactory which I need to use.
class PythonFactory:
    def makeSpecialPython(self):
        """ Do a lot of complicated work to produce a special python"""
        …
        return python
I want it to make a Python and then I should be able to convert it to an Anaconda:
myAnaconda = Anaconda(PythonFactory().makeSpecialPython())
In this case, delegation would be the way to go. (I don't know whether this can be done using inheritance):
class Anaconda:
    def __init__(self, python):
        self.python = python

    def wriggle(self):
        self.python.snake()

    def devourCrocodile(self, croc):
        ...
But with delegation, I cannot call Anaconda().sleep().
So, if you're still with me, my questions are:
A) In a case similar to this, where I need to
add some functionality
rename some functionality
use "base class" functionality otherwise
convert "base class" object to "subclass" object
should I use inheritance or delegation? (Or something else?)
B) An elegant solution would be to use delegation plus some special method that forwards all attribute and method accesses which Anaconda does not respond to to its instance of Python.

B) An elegant solution would be to use delegation plus some special method that forwards all attribute and method accesses which Anaconda does not respond to to its instance of Python.
This is simple in Python, just define __getattr__:
class Anaconda:
    def __init__(self, python):
        self.python = python

    def wriggle(self):
        self.python.snake()

    def devourCrocodile(self, croc):
        ...

    def __getattr__(self, name):
        return getattr(self.python, name)
See the Python docs on __getattr__
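With that in place, any attribute Anaconda does not define itself is forwarded to the wrapped Python instance. A small usage sketch, reusing the factory from the question:

    myAnaconda = Anaconda(PythonFactory().makeSpecialPython())
    myAnaconda.wriggle()  # defined on Anaconda, delegates to python.snake()
    myAnaconda.sleep()    # not defined on Anaconda, so __getattr__ forwards it to python.sleep()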

Related

How can I specialise instances of objects when I don't have access to the instantiation code?

Let's assume I am using a library which gives me instances of classes defined in that library when calling its functions:
>>> from library import find_objects
>>> result = find_objects("name = any")
[SomeObject(name="foo"), SomeObject(name="bar")]
Let's further assume that I want to attach new attributes to these instances. For example a classifier to avoid running this code every time I want to classify the instance:
>>> from library import find_objects
>>> result = find_objects("name = any")
>>> for row in result:
...     row.item_class = my_classifier(row)
Note that this is contrived but illustrates the problem: I now have instances of the class SomeObject but the attribute item_class is not defined in that class and trips up the type-checker.
So when I now write:
print(result[0].item_class)
I get a typing error. It also trips up auto-completion in editors as the editor does not know that this attribute exists.
Not to mention that this way of implementing it is quite ugly and hacky.
One thing I could do is create a subclass of SomeObject:
class ExtendedObject(SomeObject):
    item_class = None

    def classify(self):
        cls = do_something_with(self)
        self.item_class = cls
This now makes everything explicit, I get a chance to properly document the new attributes and give it proper type-hints. Everything is clean. However, as mentioned before, the actual instances are created inside library and I don't have control over the instantiation.
Side note: I ran into this issue in flask for the Response class. I noticed that flask actually offers a way to customise the instantiation using Flask.response_class. But I am still interested how this could be achieved in libraries that don't offer this injection seam.
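For reference, that Flask seam looks roughly like this (a small sketch; TaggedResponse is an illustrative name):

    from flask import Flask, Response

    class TaggedResponse(Response):
        item_class = None  # the extra attribute is declared on our own subclass

    app = Flask(__name__)
    app.response_class = TaggedResponse  # Flask now instantiates this subclass for responses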
One thing I could do is write a wrapper that does something like this:
class WrappedObject(SomeObject):
    item_class = None
    wrapped = None

    @classmethod
    def from_original(cls, wrapped):
        instance = cls()
        instance.wrapped = wrapped
        instance.item_class = do_something_with(wrapped)
        return instance

    def __getattribute__(self, key):
        # look everything up on the wrapped instance instead of self
        return getattr(object.__getattribute__(self, 'wrapped'), key)
But this seems rather hacky and will not work in other programming languages.
Or try to copy the data:
from copy import deepcopy
class CopiedObject(SomeObject):
item_class = None
#staticmethod
def from_original(wrapped):
for key, value in vars(wrapped):
setattr(self, key, deepcopy(value))
self.item_class = do_something_with(wrapped)
but this feels equally hacky, and is risky when the objects use properties and/or descriptors.
Are there any known "clean" patterns for something like this?
I would go with a variant of your WrappedObject approach, with the following adjustments:
I would not extend SomeObject: this is a case where composition feels more appropriate than inheritance
With that in mind, from_original is unnecessary: you can have a proper __init__ method
item_class should be an instance variable and not a class variable. It should be initialized in your WrappedObject class constructor
Think twice before implementing __getattribute__ and forwarding everything to the wrapped object. If you need only a few methods and attributes of the original SomeObject class, it might be better to implement them explicitly as methods and properties
class WrappedObject:
    def __init__(self, wrapped):
        self.wrapped = wrapped
        self.item_class = do_something_with(wrapped)

    def a_method(self):
        return self.wrapped.a_method()

    @property
    def a_property(self):
        return self.wrapped.a_property
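Tying this back to the question, the wrapper would be applied to whatever the library hands back (a small usage sketch; find_objects, a_method and a_property are the hypothetical names used above, and it assumes SomeObject actually provides them):

    from library import find_objects

    rows = [WrappedObject(row) for row in find_objects("name = any")]
    print(rows[0].item_class)   # documented, type-checkable attribute on the wrapper
    print(rows[0].a_property)   # explicit forwarding to the wrapped SomeObject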

How to set functions for a property()?

This is a snippet for registers for an emulator I'm working on:
class registers(object):
    def __init__(self):
        self._AF = registerpair()

    def _get_AF(self):
        return self._AF.getval()

    def _set_AF(self, val):
        self._AF.setval(val)

    AF = property(_get_AF, _set_AF)
The registerpair() class has an increment() method. I would like to know if there is any way I could do the following:
r = registers()
r.AF.increment()
rather than having to do:
r._AF.increment()
As is, no. You have set the fget method to return getval() of your registerpair() instance.
Since the property is for the _AF attribute which is a registerpair() instance, I believe it would be more reasonable to change your fget (and fset for that matter) to actually return it, and maybe create an auxiliary function to actually get the value with getval() or access it directly.
So if your _get_AF looked something like:
def _get_AF(self):
    return self._AF
you can then call r.AF.increment() just fine. Then you could move the getval() call to another function in your class:
def getAFval(self):
    return self._AF.getval()
Or just make direct calls like r.AF.getval() which seems like the most clear way to do things.
You are effectively modifying the interface to the registerpair class with this wrapper class, and in doing so hiding the original interface. In your new interface the property() refers to the value stored in the registerpair, not to the registerpair itself, since it reimplements the getval() and setval() interface of the registerpair.
So, a couple of suggestions. Firstly, if this wrapper class is just reimplementing the interface to the registerpair, should you not simply inherit from registerpair? That way the original interface would still be available.
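A minimal sketch of that first suggestion (it assumes registerpair defines getval(), setval() and increment()):

    class registers(registerpair):
        # Inheriting keeps the whole registerpair interface available directly.
        pass

    r = registers()
    r.increment()        # no property indirection needed
    print(r.getval())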
Alternatively you could implement the remainder of the registerpair interface, using for example a method such as registers.increment_AF():
class registers(object):
    def __init__(self):
        self._AF = registerpair()

    def _get_AF(self):
        return self._AF.getval()

    def _set_AF(self, val):
        self._AF.setval(val)

    AF = property(_get_AF, _set_AF)

    def increment_AF(self):
        self._AF.increment()
If I understand you correctly
You can call r._AF.increment(), which reaches the registerpair() instance directly, but _AF is marked as internal by its leading underscore, and the AF property returns getval() rather than the registerpair itself, so
r.AF.increment()
does not work as written.
For further information, check PEP 8:
https://www.python.org/dev/peps/pep-0008/
An extract from that document:
_single_leading_underscore : weak "internal use" indicator. E.g. from M import * does not import objects whose name starts with an underscore.
single_trailing_underscore_ : used by convention to avoid conflicts with Python keyword, e.g. tkinter.Toplevel(master, class_='ClassName')

Creating a class-based reusable application

I am trying to create a reusable application in Python 2.6.
I am developing server-side scripts for listening to GPS tracking devices. The scripts use sockets.
I have a base class that defines the basic methods for handling the data sent by device.
class MyDevice(object):
    def __init__(self, db):
        self.db = db  # This is the class that defines methods for connecting/using database

    def initialize_db(self):
        ...

    def handle_data(self):
        self.initialize_db()
        ...
        self.process_data()

    def process_data(self):
        ...
        self.categorize_data()

    def categorize_data(self):
        ...
        self.save_data()

    def save_data(self):
        ...
This base class serves for many devices since there are only some minor differences between the devices. So I create a class for each specific device type and make arrangements which are specific to that device.
class MyDeviceType1(MyDevice):
    def __init__(self, db):
        super(MyDeviceType1, self).__init__(db)

    def categorize_data(self):
        super(MyDeviceType1, self).categorize_data()
        self.prepopulate_data()
        ...  # Do other operations specific to device

    def prepopulate_data(self):
        """this is specific to Type1 devices"""
        ...

class MyDeviceType2(MyDevice):
    def __init__(self, db):
        super(MyDeviceType2, self).__init__(db)

    def categorize_data(self):
        super(MyDeviceType2, self).categorize_data()
        self.unpopulate_data()
        ...  # Do other operations specific to device

    def unpopulate_data(self):
        """this is specific to Type2 devices"""
        ...
And I have socket listeners that listens specific sockets, and call related class (MyDeviceType1 or MyDeviceType2) like:
conn, address = socket.accept()
...
thread.start_new_thread(MyDeviceType1(db_connector).handle_data, ())
That structure is all fine and useful to me. One device (MyDevice) may have many subtypes (MyDeviceType1, MyDeviceType2) which inherits the base class.
And there are more than one type of base devices. So there is OtherDevice with subtypes OtherDeviceType1 etc.
MyDevice and OtherDevice works quite differently, so they are the base types and underlying code is quite different in all of them.
I also have some add-on functionalities. These functionalities are usable by one or two subtypes of nearly all device base types.
So I want to prepare a single reusable (plug-able) class that can be inherited by any subtype that needs those functionalities.
class MyAddOn(object):
    def remove_unusable_data(self):
        ...

    def categorize_data(self):
        super ???
        self.remove_unusable_data()
And here is the part where I am stuck. Since this is an independent module, it should not inherit from MyDevice or OtherDevice etc.; and because not all device subtypes use these functionalities, I cannot make MyDevice inherit from MyAddOn either.
The only logical approach seems to be inheriting the subtype MyDeviceType1 from both MyDevice and MyAddOn:
class MyDeviceType1(MyDevice, MyAddOn):
    def __init__(self, db):
        super(MyDeviceType1, self).__init__(db)

    def categorize_data(self):
        super(MyDeviceType1, self).categorize_data()  # << the problem part
        self.prepopulate_data()
        ...  # Do other operations specific to device

    def prepopulate_data(self):
        """this is specific to Type1 devices"""
super(MyDeviceType1, self).categorize_data() is the problem part: super triggers MyDevice.categorize_data but not MyAddOn.categorize_data.
Is there any way to trigger the MyAddOn methods through the super call, or in such a fashion that I do not need to call that class's method separately? Both MyDevice.categorize_data and MyAddOn.categorize_data should be called.
This is called cooperative multiple inheritance in Python and works just fine.
What you refer to as an "Addon" class, is generally called a "Mixin".
Just call the super method in your Mixin class:
class MyAddOn(object):
    def remove_unusable_data(self):
        ...

    def categorize_data(self):
        super(MyAddOn, self).categorize_data()
        self.remove_unusable_data()
I'd like to note some things:
The method resolution order is left to right
You have to call super
You should be using **kwargs for cooperative inheritance
It seems counterintuitive to call super here, as the parent of MyAddOn does not have an attribute called categorize_data, and you would expect this notation to fail.
This is where the super function comes into play. Some consider this behaviour to be the best thing about Python.
Unlike in C++ or Java, the super function does not necessarily call the class's parent class. In fact it is impossible to know in advance which function will be called by super, because it is decided at run-time based on the method resolution order.
super in python should really be called next because it will call the next method in the inheritance tree.
For Mixins it is especially important to call super, even if you're inheriting from object.
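To illustrate the **kwargs point, here is a minimal cooperative __init__ sketch (the class and argument names are made up for illustration; the mixin is listed first so the MRO visits it before the base class):

    class GpsMixin(object):
        def __init__(self, **kwargs):
            self.fix_quality = kwargs.pop('fix_quality', None)  # consume what we need
            super(GpsMixin, self).__init__(**kwargs)             # pass the rest along

    class Device(object):
        def __init__(self, **kwargs):
            self.name = kwargs.pop('name', 'unknown')
            super(Device, self).__init__(**kwargs)

    class TrackerDevice(GpsMixin, Device):
        pass

    t = TrackerDevice(name='tracker-1', fix_quality=3)  # both __init__ methods run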
For further information I advise watching Raymond Hettinger's excellent talk "Super considered super!" from PyCon 2015.
It's an excellent pattern to use in python. Here is a pattern I encounter often when programming structured applications obeying the open-closed principle:
I have this library class which is used in production:
class BaseClassA(object):
    def __init__(self, **kwargs):
        ...  # Do something that's important for many modules
    def ...

class BaseClassB(object):
    def __init__(self, **kwargs):
        ...  # Do something that's important for many modules
    def ...
Now you get a feature request that in a particular case both BaseClassA and BaseClassB should implement feature X.
According to open-closed you shouldn't have to touch existing code to implement the feature, and according to DRY you shouldn't repeat the code.
The solution is to create a FeatureMixin and create empty child classes which inherit from the base class and the mixin:
class FeatureMixin(object):
    def __init__(self, **kwargs):
        ...  # do something specific
        return super(FeatureMixin, self).__init__(**kwargs)

class ExtendedA(FeatureMixin, BaseClassA):
    pass

class ExtendedB(FeatureMixin, BaseClassB):
    pass
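A brief usage note (some_option is a made-up keyword argument): because the mixin is listed first, ExtendedA's MRO is ExtendedA -> FeatureMixin -> BaseClassA -> object, so FeatureMixin.__init__ runs first and hands the remaining keyword arguments on to BaseClassA.__init__ via super:

    a = ExtendedA(some_option=1)  # FeatureMixin.__init__ runs, then BaseClassA.__init__
    print([c.__name__ for c in ExtendedA.__mro__])
    # ['ExtendedA', 'FeatureMixin', 'BaseClassA', 'object']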

Changing API to many utility classes

I have a lot of classes that implement some general tasks for different sites:
class AbstractCalculator:
    pass  # ... abstract methods lying here

class Realization1(AbstractCalculator):
    @classmethod
    def calculate_foo(...):
        # ...
    @classmethod
    def calculate_bar(...):
        # ...

class Realization2(AbstractCalculator):
    @classmethod
    def calculate_foo(...):
        # ...
    @classmethod
    def calculate_bar(...):
        # ...
Then I aggregate all those classes in one dictionary.
Now I introduce a new, different API:
class NewAbstractClass:
    # ... introducing new API ...
    @staticmethod
    def adopt(old_class):
        # ... converting AbstractCalculator to NewAbstractClass
And then I use the adopt() method as a decorator to convert all the old realizations to the new API.
But this all feels very strange and complicated. Is there any better way to do it?
UPD @ColinMcGrath:
No, I am asking something different.
My adopt() decorator works, and I have no problems with its functioning (its body is not related to my question, so I have not provided it).
I think that hardcoding the decoration of several tens of different classes right in their source code is not the best idea, and I am looking for a canonical solution.
In general, there is no magic way for code to know that one API is equivalent to another.
However, the mechanisms in Python for procedural generation of classes are metaclasses and class decorators. You could use a metaclass which uses some information you give it to generate another class.
This is a good introduction: http://eli.thegreenplace.net/2011/08/14/python-metaclasses-by-example/
Because of Python's "duck typing", abstract classes are rarely necessary, and something of a code smell. You should probably just redesign your code so that renaming and abstract classes are unnecessary. In particular, if your code needs to change, it needs to change. I would only do something like this if you are wrapping external code to make several disparate APIs compatible.
Something like:
class renamer(type):
    def __init__(cls, classname, bases, dct):
        super(renamer, cls).__init__(classname, bases, dct)
        if '__rename__' in dct:
            # attach each listed function to the new class under its new name
            for newname, func in dct['__rename__']:
                setattr(cls, newname, func)

class orig(object):
    def foo(*params): pass

class newclass(object):
    __metaclass__ = renamer
    __rename__ = (('bar', orig.foo),)
Or, you could do a similar thing with a decorator:
def renamer(subs):
    def thedecorator(inclass):
        meta = type(inclass)
        dct = inclass.__dict__.copy()
        dct.update(subs)
        return meta(inclass.__name__, inclass.__bases__, dct)
    return thedecorator

@renamer((('bar', orig.foo),))
class newclass(object): pass

How can I ensure that one of my class's methods is always called even if a subclass overrides it?

For example, I have a
class BaseHandler(object):
    def prepare(self):
        self.prepped = 1
I do not want everyone that subclasses BaseHandler and also wants to implement prepare to have to remember to call
super(SubBaseHandler, self).prepare()
Is there a way to ensure the superclass method is run even if the subclass also implements prepare?
I have solved this problem using a metaclass.
Using a metaclass allows the implementer of the BaseHandler to be sure that all subclasses will call the superclass's prepare() with no adjustment to any existing code.
The metaclass looks for an implementation of prepare on both classes and then overwrites the subclass prepare with one that calls superclass.prepare followed by subclass.prepare.
class MetaHandler(type):
    def __new__(cls, name, bases, attrs):
        instance = type.__new__(cls, name, bases, attrs)
        super_instance = super(instance, instance)
        if hasattr(super_instance, 'prepare') and hasattr(instance, 'prepare'):
            super_prepare = getattr(super_instance, 'prepare')
            sub_prepare = getattr(instance, 'prepare')
            def new_prepare(self):
                super_prepare(self)
                sub_prepare(self)
            setattr(instance, 'prepare', new_prepare)
        return instance
class BaseHandler(object):
    __metaclass__ = MetaHandler
    def prepare(self):
        print 'BaseHandler.prepare'

class SubHandler(BaseHandler):
    def prepare(self):
        print 'SubHandler.prepare'
Using it looks like this:
>>> sh = SubHandler()
>>> sh.prepare()
BaseHandler.prepare
SubHandler.prepare
Tell your developers to define prepare_hook instead of prepare, but
tell the users to call prepare:
class BaseHandler(object):
    def prepare(self):
        self.prepped = 1
        self.prepare_hook()

    def prepare_hook(self):
        pass

class SubBaseHandler(BaseHandler):
    def prepare_hook(self):
        pass

foo = SubBaseHandler()
foo.prepare()
If you want more complex chaining of prepare calls from multiple subclasses, then your developers should really use super as that's what it was intended for.
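For that case, a minimal sketch of chaining prepare through super (the extra subclass level is made up for illustration):

    class BaseHandler(object):
        def prepare(self):
            self.prepped = 1

    class SubBaseHandler(BaseHandler):
        def prepare(self):
            super(SubBaseHandler, self).prepare()  # keep the base preparation
            self.sub_prepped = True

    class SubSubHandler(SubBaseHandler):
        def prepare(self):
            super(SubSubHandler, self).prepare()   # chains through every level
            self.subsub_prepped = True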
Just accept that you have to tell people subclassing your class to call the base method when overriding it. Every other solution either requires you to explain something else to them, or involves un-Pythonic hacks which could be circumvented too.
Python's object inheritance model was designed to be open, and any attempt to go another way will just overcomplicate a problem which does not really exist anyway. Just tell everybody using your code to follow your "rules", or the program will misbehave.
One explicit solution without too much magic going on would be to maintain a list of prepare call-backs:
class BaseHandler(object):
    def __init__(self):
        self.prepare_callbacks = []

    def register_prepare_callback(self, callback):
        self.prepare_callbacks.append(callback)

    def prepare(self):
        # Do BaseHandler preparation
        for callback in self.prepare_callbacks:
            callback()

class MyHandler(BaseHandler):
    def __init__(self):
        BaseHandler.__init__(self)
        self.register_prepare_callback(self._prepare)

    def _prepare(self):
        # whatever
        pass
In general you can try using __getattribute__ to achieve something like this (until the moment someone overrides that method too), but it goes against Python's ideas. There is a reason you are able to access private object members in Python; the reason is mentioned in import this.
