Pythonic way to encapsulate method arguments of a class - python

Objects of my class A are similar to network connections, i.e. characterized by a handle per connection opened. That is, one calls different methods with a handle (a particular connection) as argument. My class A (python 2.7) looks like:
class A(object):
    def __init__(self, *args):
        ... some init
    def my_open(self, *args):
        handle = ... some open
        return handle
    def do_this(self, handle, *args):
        foo_this(handle, *args)
    def do_that(self, handle, *args):
        foo_that(handle, *args)
A typical usage is:
a = A(args)
handle = a.my_open(args2)
a.do_this(handle, args3)
Now, in a particular situation, there is only one connection to take care of, i.e. one handle in play. So, it is reasonable to hide this handle but keep class A for the more general situation. Thus, my first thoughts on a class B
which "is a" kind of class A (usage stays the same but hides handle) are:
class B(A):
    def __init__(self, *args):
        super(B, self).__init__(*args)
        self.handle = None
    def my_open(self, *args):
        self.handle = super(B, self).my_open(*args)
    def do_this(self, *args):
        super(B, self).do_this(self.handle, *args)
    def do_that(self, *args):
        super(B, self).do_that(self.handle, *args)
Unfortunately, in my opinion, it seems very convoluted. Any better ideas?

Objects of my class A are similar to network connections, i.e. characterized by a handle per connection opened. That is, one calls different methods with a handle (a particular connection) as argument.
You have inverted the responsibility. The handle object holds the state the methods operate on, so those methods should live on the handle, not the factory.
Move your methods to the handle object, so the API becomes:
a = A(args)
handle = a.my_open(args2)
handle.do_this(args3)
The class implementing the handle could retain a reference to a if so required; that's an implementation detail that the users of the API don't need to worry about.
You then return new handles, or a singleton handle, as needed.
By moving responsibility to the handle object, you can also make your factory produce handles of entirely different types, depending on the arguments. A(args).my_open(args2) could also produce the singleton handle that you now have class B for, for example.
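A minimal sketch of that shape (the names and the singleton logic here are illustrative, not from the original post):

class Handle(object):
    def __init__(self, owner, *args):
        self.owner = owner  # optional back-reference to the factory
        # ... open the underlying connection here ...

    def do_this(self, *args):
        pass  # operate on this connection

class A(object):
    def __init__(self, *args):
        self._shared = None

    def my_open(self, *args):
        return Handle(self, *args)  # a new handle per call

    def my_open_shared(self, *args):
        # hand out one shared handle instead, replacing class B
        if self._shared is None:
            self._shared = Handle(self, *args)
        return self._shared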

How about a class for the handle itself?

class Handle(object):
    def __init__(self, *args):
        # init ...
        self._handle = low_level_handle  # however the low-level handle is obtained

    def do_this(self, *args):
        # do_this ...
        pass

    def do_that(self, *args):
        # do_that ...
        pass

class A(object):
    def __init__(self, *args):
        # init ...
        pass

    def my_open(self, *args):
        handle = Handle(*args)
        # handle post-processing (if any)
        return handle
e.g.:
a = A(args)
handle = a.my_open(args2)
handle.do_this(args3)

Related

Do factory class methods break the Liskov substitution principle?

I was wondering if factory class methods break the Liskov substitution principle.
For instance in the following Python code, does the Response.from_request factory class method break it?
import abc

class BaseResponse(abc.ABC):
    @abc.abstractmethod
    def get_headers(self):
        raise NotImplementedError

    @abc.abstractmethod
    def get_body(self):
        raise NotImplementedError

class Response(BaseResponse):
    def __init__(self, headers, body):
        self.__headers = headers
        self.__body = body

    def get_headers(self):
        return self.__headers

    def get_body(self):
        return self.__body

    @classmethod
    def from_request(cls, request, payload):
        headers = request.get_headers()
        headers["meta_data"] = payload["meta_data"]
        body = payload["data"]
        return cls(headers, body)
The substitution principle says that you need to be able to substitute an object with another object of a compatible type (i.e. a subtype), and it must still behave the same. You need to see this from the perspective of a function that type hints for a specific object:
def func(foo: BaseResponse):
    ...
This function expects an argument that behaves like BaseResponse. What does that behave like?
get_headers()
get_body()
These are the only two methods of BaseResponse. As long as the object you pass to func has these two characteristics, it passes the duck test of BaseResponse. If it further implements any additional methods, that's of no concern.
So, no, class methods don't break the LSP.
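To make that concrete, here is a small sketch using the classes from the question (FakeRequest is invented for this example):

class FakeRequest:
    def get_headers(self):
        return {"content-type": "application/json"}

def func(foo: BaseResponse):
    return foo.get_headers(), foo.get_body()

# An object built by the factory still passes the duck test:
resp = Response.from_request(FakeRequest(), {"meta_data": "m", "data": "d"})
print(func(resp))  # ({'content-type': 'application/json', 'meta_data': 'm'}, 'd')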

Trivial context manager in Python

My resource can be of type R1, which requires locking, or of type R2, which does not require it:
class MyClass(object):  # broken
    def __init__(self, ...):
        if ...:
            self.resource = R1(...)
            self.lock = threading.Lock()
        else:
            self.resource = R2(...)
            self.lock = None

    def foo(self):  # there are many locking methods
        with self.lock:
            operate(self.resource)
The above obviously fails if self.lock is None.
My options are:

1. if:

       def foo(self):
           if self.lock:
               with self.lock:
                   operate(self.resource)
           else:
               operate(self.resource)

   cons: too verbose
   pro: does not create an unnecessary threading.Lock

2. always set self.lock to threading.Lock:

   pro: code is simplified
   cons: with self.lock appears to be relatively expensive (comparable to disk I/O! see the timing sketch after this list)

3. define a trivial lock class:

       class TrivialLock(object):
           def __enter__(self): pass
           def __exit__(self, _a, _b, _c): pass
           def acquire(self): pass
           def release(self): pass

   and use it instead of None for R2.

   pro: simple code
   cons: I have to define TrivialLock
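If you want to check option 2's cost yourself, here is a rough micro-benchmark sketch (numbers are machine-dependent and not from the original post):

import timeit

# Measure an uncontended acquire/release through the with statement.
# With number=10**6, the total in seconds equals the per-iteration cost
# in microseconds.
total = timeit.timeit(
    'with lock: pass',
    setup='import threading; lock = threading.Lock()',
    number=10**6,
)
print('per with-block: {:.2f} us'.format(total))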
Questions

1. What method is preferred by the community?
2. Regardless of (1), does anyone actually define something like TrivialLock? (I actually expected that something like that would be in the standard library...)
3. Is my observation that locking cost is comparable to that of a write in line with expectations?
I would define TrivialLock. It can be even more trivial, though, since you just need a context manager, not a lock.
class TrivialLock(object):
    def __enter__(self):
        pass
    def __exit__(self, *args):
        pass
You can make this even more trivial using contextlib:
import contextlib

@contextlib.contextmanager
def TrivialLock():
    yield

self.lock = TrivialLock()
And since yield can be an expression, you can define TrivialLock inline instead:

self.lock = contextlib.contextmanager(lambda: (yield))()

Note the parentheses; lambda: yield is invalid. However, this creates a single-use context manager: if you try to use the same value in a second with statement, you get a RuntimeError because the underlying generator is exhausted.
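As an aside on question 2: newer Pythons did add this to the standard library. On Python 3.7+ you can use contextlib.nullcontext instead of defining TrivialLock at all (a small sketch, with the question's condition stubbed out as a parameter):

import contextlib
import threading

def make_lock(needs_locking):
    # contextlib.nullcontext() (Python 3.7+) is a built-in no-op
    # context manager, so no hand-rolled TrivialLock is needed.
    return threading.Lock() if needs_locking else contextlib.nullcontext()

with make_lock(False):
    print('operated without a real lock')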

Class Decorator when Inheriting from another class

I've been on a tear of writing some decorators recently.
One of the ones I just wrote allows you to put the decorator just before a class definition, and it will cause every method of the class to print some logging info when it's run (more for debugging/initial super basic speed tests during a build):
import inspect
import time

def class_logit(cls):
    class NCls(object):
        def __init__(self, *args, **kwargs):
            self.instance = cls(*args, **kwargs)

        @staticmethod
        def _class_logit(original_function):
            def arg_catch(*args, **kwargs):
                start = time.time()
                result = original_function(*args, **kwargs)
                print('Called: {0} | From: {1} | Args: {2} | Kwargs: {3} | Run Time: {4}'
                      ''.format(original_function.__name__, str(inspect.getmodule(original_function)),
                                args, kwargs, time.time() - start))
                return result
            return arg_catch

        def __getattribute__(self, s):
            try:
                x = super(NCls, self).__getattribute__(s)
            except AttributeError:
                pass
            else:
                return x
            x = self.instance.__getattribute__(s)
            if type(x) == type(self.__init__):
                return self._class_logit(x)
            else:
                return x
    return NCls
This works great when applied to a very basic class I create.
Where I start to encounter issues is when I apply it to a class that inherits from another - for instance, using Qt:
@scld.class_logit
class TestWindow(QtGui.QDialog):
    def __init__(self):
        print self
        super(TestWindow, self).__init__()

a = TestWindow()
I'm getting the following error, and I'm not entirely sure what to do about it!
self.instance = cls(*args, **kwargs)
File "<string>", line 15, in __init__
TypeError: super(type, obj): obj must be an instance or subtype of type
Any help would be appreciated!
You are being a bit too intrusive with your decorator.
Profiling methods defined by the Qt framework itself does call for a somewhat aggressive approach, but your decorator replaces the entire class with a proxy.
Qt bindings are somewhat complicated, and it is hard to tell exactly why instantiation fails in this case.
So, first things first: if your intent is to apply the decorator to a class hierarchy defined by yourself, or at least one defined in pure Python, a good approach is a metaclass. With a metaclass you can decorate each method once, when the class is created, and avoid meddling at runtime every time a method is retrieved (a sketch of this metaclass approach appears after the example below).
But Qt, like some other libraries, has its methods and classes defined in native code, which will prevent you from wrapping the existing methods in a new class. So wrapping the methods on attribute retrieval, in __getattribute__, can work.
Here is a simpler approach that, instead of using a proxy, just plugs a foreign __getattribute__ into the class to do the wrap-with-logger step you want.
Your mileage may vary. In particular, the logging won't be triggered when one method of the class is called by another method from native code, as that call won't go through Python's attribute retrieval mechanism (it uses C++ method dispatch directly).
from PyQt5 import QtWidgets, QtGui

def log_dec(func):
    def wrapper(*args, **kwargs):
        print(func.__name__, args, kwargs)
        return func(*args, **kwargs)
    return wrapper

def decorate(cls):
    def __getattribute__(self, attr):
        attr = super(cls, self).__getattribute__(attr)
        if callable(attr):
            return log_dec(attr)
        return attr
    cls.__getattribute__ = __getattribute__
    return cls

@decorate
class Example(QtGui.QWindow):
    pass

app = QtWidgets.QApplication([])
w = Example()
w.show()
(Of course, just replace the basic logger with your fancy logger above.)
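For completeness, here is a rough sketch of the metaclass route mentioned above, which works for pure-Python class hierarchies (all names here are illustrative):

import time

def log_dec(func):
    def wrapper(*args, **kwargs):
        start = time.time()
        result = func(*args, **kwargs)
        print('Called: {0} | Run Time: {1}'.format(func.__name__, time.time() - start))
        return result
    return wrapper

class LoggingMeta(type):
    def __new__(mcs, name, bases, namespace):
        # Wrap plain functions once, at class-creation time, instead of
        # intercepting every attribute lookup at runtime.
        for attr, value in list(namespace.items()):
            if callable(value) and not attr.startswith('__'):
                namespace[attr] = log_dec(value)
        return super().__new__(mcs, name, bases, namespace)

class MyClass(metaclass=LoggingMeta):
    def work(self):
        return 42

MyClass().work()  # prints the timing line, returns 42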

Python decorator to determine order of execution of methods

I have a basic class Framework with 3 methods that can be set by the user: initialize, handle_event and finalize.
These methods are executed by the method run:
class Framework(object):
    def initialize(self):
        pass
    def handle_event(self, event):
        pass
    def finalize(self):
        pass
    def run(self):
        self.initialize()
        for event in range(10):
            self.handle_event(event)
        self.finalize()
I would like to create 3 decorators, on_initialize, on_event and on_finalize, so that I could write a class like this:
class MyFramework(Framework):
    # The following methods will be executed once, in this order:
    @on_initialize(precedence=-1)
    def say_hi(self):
        print('say_hi')

    @on_initialize(precedence=0)
    def initialize(self):
        print('initialize')

    @on_initialize(precedence=1)
    def start_processing_events(self):
        print('start_processing_events')

    # The following methods will be executed on each event, in this order:
    @on_event(precedence=-1)
    def before_handle_event(self, event):
        print('before_handle_event:', event)

    @on_event(precedence=0)
    def handle_event(self, event):
        print('handle_event:', event)

    @on_event(precedence=1)
    def after_handle_event(self, event):
        print('after_handle_event:', event)

    # The following methods will be executed once at the end, in this order:
    @on_finalize(precedence=-1)
    def before_finalize(self):
        print('before_finalize')

    @on_finalize(precedence=0)
    def finalize(self):
        print('finalize')

    @on_finalize(precedence=1)
    def after_finalize(self):
        print('after_finalize')

if __name__ == '__main__':
    f = MyFramework()
    f.run()
These decorators determine the order of execution of the optional methods the user may add to the class. I think that by default, initialize, handle_event and finalize should take precedence=0. Then the user can add any method with the right decorator and know when it gets executed in the simulation run.
I honestly have no idea how to get started with this problem. Any help pushing me in the right direction will be very welcome. Many thanks.
If you are using Python 3.6, this is a case that can take advantage of the new __init_subclass__ method. It is called on the superclass by subclasses when they are created.
Without Python 3.6 you have to resort to a metaclass.
The decorator itself can just mark each method with the needed data.
def on_initialize(precedence=0):
    def marker(func):
        func._initializer = precedence
        return func
    return marker

def on_event(precedence=0):
    def marker(func):
        func._event_handler = precedence
        return func
    return marker

def on_finalize(precedence=0):
    def marker(func):
        func._finalizer = precedence
        return func
    return marker

class Framework:
    def __init_subclass__(cls, *args, **kw):
        super().__init_subclass__(*args, **kw)
        handlers = dict(_initializer=[], _event_handler=[], _finalizer=[])
        for name, method in cls.__dict__.items():
            for handler_type in handlers:
                if hasattr(method, handler_type):
                    handlers[handler_type].append((getattr(method, handler_type), name))
        for handler_type in handlers:
            setattr(cls, handler_type,
                    [handler[1] for handler in sorted(handlers[handler_type])])

    def initialize(self):
        for method_name in self._initializer:
            getattr(self, method_name)()

    def handle_event(self, event):
        for method_name in self._event_handler:
            getattr(self, method_name)(event)

    def finalize(self):
        for method_name in self._finalizer:
            getattr(self, method_name)()

    def run(self):
        self.initialize()
        for event in range(10):
            self.handle_event(event)
        self.finalize()
If you will have a complex class hierarchy that should inherit the action methods properly, you will have to merge the lists in the handlers dictionary with the ones in the superclass (get the superclass as cls.__mro__[1]) before applying them as class attributes.
Also, if you are using any Python < 3.6, you will need to move the logic in __init_subclass__ to a metaclass. Just put the code as it is into the __init__ method of a metaclass (and adjust the incoming parameters and the super call as appropriate), and it should work just the same; a rough sketch follows.
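A rough sketch of that pre-3.6 metaclass variant, reusing the decorators above (Python 3 metaclass syntax; treat it as a starting point rather than tested code):

class FrameworkMeta(type):
    def __init__(cls, name, bases, namespace):
        super(FrameworkMeta, cls).__init__(name, bases, namespace)
        # Same collection logic as __init_subclass__, run at class creation.
        handlers = dict(_initializer=[], _event_handler=[], _finalizer=[])
        for attr_name, method in namespace.items():
            for handler_type in handlers:
                if hasattr(method, handler_type):
                    handlers[handler_type].append(
                        (getattr(method, handler_type), attr_name))
        for handler_type in handlers:
            setattr(cls, handler_type,
                    [handler[1] for handler in sorted(handlers[handler_type])])

class Framework(metaclass=FrameworkMeta):
    # initialize/handle_event/finalize/run exactly as in the version above
    ...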
My idea is to use class-based decorators, which are simple and give intermediate context to share between decorated functions. So the decorator would look like this (I am using Python 3.5):
class OnInitialize:
    methods = {}

    def __init__(self, precedence):
        self.precedence = precedence

    def __call__(self, func):
        self.methods[self.precedence] = func
        def wrapper(*a, **kw):
            for precedence in sorted(self.methods.keys()):
                self.methods[precedence](*a, **kw)
        return wrapper
On decoration, __init__ runs first and stores the precedence value for later use. Then __call__ runs, which adds the target function to the methods dictionary. (Note that __call__ and the methods structure could be customized to allow multiple methods with the same precedence.)
On the other hand, the target class and methods would look like this:
class Target:
    @OnInitialize(precedence=-1)
    def say_hi(self):
        print("say_hi")

    @OnInitialize(precedence=0)
    def initialize(self):
        print("initialize")

    @OnInitialize(precedence=1)
    def start_processing_events(self):
        print("start_processing_events")
which ensures that, if any one of those methods is called, all of the decorated methods are called in the given order.
target = Target()
target.initialize()
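which prints, in precedence order:

say_hi
initialize
start_processing_events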
Hope it helps; please comment below if you are interested in something else.

Proxy class can't call methods on child

I'm writing a Python class to wrap/decorate/enhance another class from a package called petl, a framework for ETL (data movement) workflows. Due to design constraints I can't just subclass it; every method call has to be sent through my own class so I can control what kind of objects are being passed back. So in principle this is a proxy class, but I'm having some trouble using existing answers/recipes out there. This is what my code looks like:
from functools import partial

class PetlTable(object):
    """not really how we construct petl tables, but for illustrative purposes"""
    def hello(name):
        print('Hello, {}!'.format(name))

class DatumTable(object):
    def __init__(self, petl_tbl):
        self.petl_tbl = petl_tbl

    def __getattr__(self, name):
        """this returns a partial referencing the child method"""
        petl_attr = getattr(self.petl_tbl, name, None)
        if petl_attr and callable(petl_attr):
            return partial(self.call_petl_method, func=petl_attr)
        raise NotImplementedError('Not implemented')

    def call_petl_method(self, func, *args, **kwargs):
        func(*args, **kwargs)
Then I try to instantiate a table and call something:
# create a petl table
pt = PetlTable()
# wrap it with our own class
dt = DatumTable(pt)
# try to run the petl method
dt.hello('world')
This gives a TypeError: call_petl_method() got multiple values for argument 'func'.
This only happens with positional arguments; kwargs seem to be fine. I'm pretty sure it has to do with self not being passed in, but I'm not sure what the solution is. Can anyone think of what I'm doing wrong, or a better solution altogether?
This seems to be a common issue with mixing positional and keyword args:
TypeError: got multiple values for argument
To get around it, I took the positional arg func out of call_petl_method and put it in a kwarg that's unlikely to overlap with the kwargs of the child function. A little hacky, but it works.
I ended up writing a Proxy class to do all this generically:
from functools import partial

class Proxy(object):
    def __init__(self, child):
        self.child = child

    def __getattr__(self, name):
        child_attr = getattr(self.child, name)
        return partial(self.call_child_method, __child_fn__=child_attr)

    @classmethod
    def call_child_method(cls, *args, **kwargs):
        """
        This calls a method on the child object and wraps the response as an
        object of its own class.

        Takes a kwarg `__child_fn__` which points to a method on the child
        object.

        Note: this can't take any positional args or they get clobbered by the
        keyword args we're trying to pass to the child. See:
        https://stackoverflow.com/questions/21764770/typeerror-got-multiple-values-for-argument
        """
        # get the child method
        fn = kwargs.pop('__child_fn__')
        # call the child method
        r = fn(*args, **kwargs)
        # wrap the response as an object of the same class
        r_wrapped = cls(r)
        return r_wrapped
This will also solve the problem. It doesn't use partial at all.
class PetlTable(object):
    """not really how we construct petl tables, but for illustrative purposes"""
    def hello(name):
        print('Hello, {}!'.format(name))

class DatumTable(object):
    def __init__(self, petl_tbl):
        self.petl_tbl = petl_tbl

    def __getattr__(self, name):
        """Looks up the named attribute in the class of the petl_tbl object."""
        petl_attr = self.petl_tbl.__class__.__dict__.get(name, None)
        if petl_attr and callable(petl_attr):
            return petl_attr
        raise NotImplementedError('Not implemented')

if __name__ == '__main__':
    # create a petl table
    pt = PetlTable()

    # wrap it with our own class
    dt = DatumTable(pt)

    # try to run the petl method
    dt.hello('world')  # -> Hello, world!
