How do I decorate an __init__ method with Click?

I frequently use the following pattern, where I create an instance of a Main object, typically for a batch, then call process() on it. All the necessary options are passed into the __init__. This means that, provided I create a Main with the appropriate parameters, I can run it from anywhere (celery being an example).
(I have learned to minimize in-celery debugging to the greatest extent to keep my sanity - the scripts are debugged fully from the command line first, and only then are Mains created and launched from celery)
This is my working Click approach:
Pretty simple: the decorated run function defines its arguments and passes them to Main.__init__.
Conceptually, however, run and its docstring are far removed from that __init__. Looking at __init__, I have to read run to understand what gets passed, and when I --help the command line, I get the docstring for run.
import click

@click.command()
@click.option('--anoption')
def run(**kwargs):
    mgr = Main(**kwargs)
    mgr.process()

class Main:
    def __init__(self, **kwargs):
        self.__dict__.update(**kwargs)

    def process(self):
        print(f"{self}.process({vars(self)})")

if __name__ == "__main__":
    run()
output:
<__main__.Main object at 0x10d5e42b0>.process({'anoption': None})
This is what I would like instead:
import click

class Main:
    @click.command()
    @click.option('--anoption')
    def __init__(self, **kwargs):
        self.__dict__.update(**kwargs)

    def process(self):
        print(f"{self}.process({vars(self)})")

if __name__ == "__main__":
    main = Main(standalone_mode=False)
    main.process()
But that gets me TypeError: __init__() missing 1 required positional argument: 'self'
Using a factory method
The closest I've come is a factory staticmethod, but that's not great. At least the factory sits on the class, but there's still not much of a link to __init__.
import click

class Main:
    @staticmethod
    @click.command()
    @click.option('--anoption')
    def factory(**kwargs):
        return Main(**kwargs)

    def __init__(self, **kwargs):
        self.__dict__.update(**kwargs)

    def process(self):
        print(f"{self}.process({vars(self)})")

if __name__ == "__main__":
    main = Main.factory(standalone_mode=False)
    main.process()
output:
<__main__.Main object at 0x10ef59128>.process({'anoption': None})
What's a simple Pythonic way to bring click and that Main.__init__ closer together?
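For what it's worth, the spirit of the question (keep the CLI definition next to __init__) can be sketched without Click at all, by deriving the options from __init__'s signature with inspect and argparse. All names here (from_argv, the option names) are illustrative, not part of any library API:

```python
import argparse
import inspect

class Main:
    def __init__(self, anoption=None, verbose=False):
        self.anoption = anoption
        self.verbose = verbose

    def process(self):
        return vars(self)

    @classmethod
    def from_argv(cls, argv=None):
        # Build the parser from __init__'s signature, so the CLI
        # definition lives next to the constructor it feeds.
        parser = argparse.ArgumentParser(description=cls.__init__.__doc__)
        for name, param in inspect.signature(cls.__init__).parameters.items():
            if name == "self":
                continue
            parser.add_argument(f"--{name}", default=param.default)
        args = parser.parse_args(argv)
        return cls(**vars(args))

main = Main.from_argv(["--anoption", "x"])
print(main.process())  # {'anoption': 'x', 'verbose': False}
```

This only handles keyword parameters with defaults; a real version would also need to map annotations to argparse types.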

Related

Finding parameters of `__init__()` or parameters needed to construct an object in python

I have a scenario where I am given a file name and need to check whether any class in that file takes start as a constructor argument; if it does, I have to create an instance of that class.
Consider the example where I have a file named test.py containing three classes, A, B and C. Only class A has a start parameter; the others have different or extra parameters.
# test.py
class A:
    def __init__(self, start=""):
        pass

class B:
    def __init__(self, randomKeyword, start=""):
        pass

class C:
    def __init__(self):
        pass
Now I want to write a script that takes test.py as an argument and creates an instance of A. My progress so far:
import importlib.util

detail = importlib.util.spec_from_file_location('test.py', '/path/to/test.py')
module = importlib.util.module_from_spec(detail)
detail.loader.exec_module(module)
Basically, I need to write a program that finds the __init__ arguments of every class in the file and creates an instance of each class whose only __init__ argument is start.
As mentioned by @deceze, it's not a good idea to instantiate a class on the basis of its __init__ parameters, since we can't be sure what is there. But it is possible, so I am posting this answer just to show how it can be done.
One possibility is:
# init.py
import importlib.util
from inspect import getmembers, isclass, signature

detail = importlib.util.spec_from_file_location('test.py', '/path/to/test.py')
module = importlib.util.module_from_spec(detail)
detail.loader.exec_module(module)

for name, cls in getmembers(module, isclass):
    parameter = signature(cls.__init__).parameters.keys()
    # exactly 'self' plus 'start' means the class qualifies
    if len(parameter) == 2 and 'start' in parameter:
        obj = cls(start="Whatever you want")
Of course, it's not the best approach, so more answers are welcome; if you find yourself in this scenario, consider @deceze's comment and define a builder instead.
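A quick standalone check of what signature(...).parameters actually returns for these classes; note that self is included when inspecting an unbound __init__, which is why the code above compares against a length of 2:

```python
import inspect

class A:
    def __init__(self, start=""):
        pass

class B:
    def __init__(self, randomKeyword, start=""):
        pass

# 'self' counts as a parameter of the unbound __init__
params_a = inspect.signature(A.__init__).parameters.keys()
params_b = inspect.signature(B.__init__).parameters.keys()
print(list(params_a))  # ['self', 'start']
print(list(params_b))  # ['self', 'randomKeyword', 'start']

# so "len(...) == 2 and 'start' in ..." selects A but not B
print(len(params_a) == 2 and 'start' in params_a)  # True
print(len(params_b) == 2 and 'start' in params_b)  # False
```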

Python 3 : Inheritance and the assignment of "self"

Why does this script require "self" as an argument to mssg() in line 3? PyCharm flags "self" in line 3 with: expected type "Another", got "Main" instead. This warning makes sense to me (although the code works). When "self" is omitted, Python throws an error:
TypeError: mssg() missing 1 required positional argument: 'self'
class Main():
    def __init__(self):
        print(Another.mssg(self))

class Another():
    def __init__(self):
        pass

    def mssg(self):
        return "Hello World"

_foo = Main()
Using your guidance, here are three different ways to prevent the TypeError:
class Main():
    def __init__(self):
        print(Another.mssg('asdasdsa'))
        print(Another().mssg())
        print(_bar.mssg())

class Another():
    def __init__(self):
        pass

    def mssg(self):
        return "Hello World"

_bar = Another()
_foo = Main()
If you use Another.mssg(self), you are calling the method through the class, so it is not bound to an instance: self is treated as an ordinary parameter and you must supply exactly one argument yourself. Try print(Another.mssg('asdasdsa')) and you will see that it works.
If your intention was to use mssg(self) as an instance method, you should call it as print(Another().mssg()): create the instance, then call its method.
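The distinction can be shown in a few lines: accessed on the class, mssg is a plain function whose first parameter happens to be named self; accessed on an instance, it is a bound method and self is filled in automatically:

```python
class Another:
    def mssg(self):
        return "Hello World"

# Through the class: a plain function, so any object can stand in for self.
print(Another.mssg("anything"))  # Hello World

# Through an instance: a bound method, self is supplied automatically.
print(Another().mssg())          # Hello World
```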

Python decorator to determine order of execution of methods

I have a basic class Framework with 3 methods that can be set by the user: initialize, handle_event and finalize.
These methods are executed by the method run:
class Framework(object):
    def initialize(self):
        pass

    def handle_event(self, event):
        pass

    def finalize(self):
        pass

    def run(self):
        self.initialize()
        for event in range(10):
            self.handle_event(event)
        self.finalize()
I would like to create 3 decorators: on_initialize, on_event and on_finalize so that I could write such a class:
class MyFramework(Framework):
    # The following methods will be executed once, in this order
    @on_initialize(precedence=-1)
    def say_hi(self):
        print('say_hi')

    @on_initialize(precedence=0)
    def initialize(self):
        print('initialize')

    @on_initialize(precedence=1)
    def start_processing_events(self):
        print('start_processing_events')

    # The following methods will be executed on each event, in this order
    @on_event(precedence=-1)
    def before_handle_event(self, event):
        print('before_handle_event:', event)

    @on_event(precedence=0)
    def handle_event(self, event):
        print('handle_event:', event)

    @on_event(precedence=1)
    def after_handle_event(self, event):
        print('after_handle_event:', event)

    # The following methods will be executed once at the end, in this order
    @on_finalize(precedence=-1)
    def before_finalize(self):
        print('before_finalize')

    @on_finalize(precedence=0)
    def finalize(self):
        print('finalize')

    @on_finalize(precedence=1)
    def after_finalize(self):
        print('after_finalize')

if __name__ == '__main__':
    f = MyFramework()
    f.run()
These decorators determine the order of execution of the optional methods the user may add to the class. By default, initialize, handle_event and finalize should take precedence=0. The user can then add any method with the right decorator and know when it gets executed in the simulation run.
I have honestly no idea how to get started with this problem. Any help to push me in the right direction will be very welcome! Many thanks.
If you are using Python 3.6, this is a case that can take advantage of the new __init_subclass__ method. It is called on the superclass when a subclass is created.
Without Python 3.6, you have to resort to a metaclass.
The decorator itself can just mark each method with the needed data.
def on_initialize(precedence=0):
    def marker(func):
        func._initializer = precedence
        return func
    return marker

def on_event(precedence=0):
    def marker(func):
        func._event_handler = precedence
        return func
    return marker

def on_finalize(precedence=0):
    def marker(func):
        func._finalizer = precedence
        return func
    return marker

class Framework:
    def __init_subclass__(cls, *args, **kw):
        super().__init_subclass__(*args, **kw)
        handlers = dict(_initializer=[], _event_handler=[], _finalizer=[])
        for name, method in cls.__dict__.items():
            for handler_type in handlers:
                if hasattr(method, handler_type):
                    handlers[handler_type].append((getattr(method, handler_type), name))
        for handler_type in handlers:
            setattr(cls, handler_type,
                    [handler[1] for handler in sorted(handlers[handler_type])])

    def initialize(self):
        for method_name in self._initializer:
            getattr(self, method_name)()

    def handle_event(self, event):
        for method_name in self._event_handler:
            getattr(self, method_name)(event)

    def finalize(self):
        for method_name in self._finalizer:
            getattr(self, method_name)()

    def run(self):
        self.initialize()
        for event in range(10):
            self.handle_event(event)
        self.finalize()
If you have a complex class hierarchy that should inherit the action methods properly, you will have to merge the lists in the handlers dictionary with the ones in the superclass (get the superclass as cls.__mro__[1]) before applying them as class attributes.
Also, on any Python < 3.6 you will need to move the logic from __init_subclass__ into a metaclass. Just put the code as it is in the __init__ method of a metaclass (adjusting the incoming parameters and super call as appropriate), and it should work just the same.
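As a minimal runnable check of the collection mechanism, here is just the on_initialize path from the answer above, with the user's step renamed to initialize_step so it does not shadow the dispatching initialize (in the full example, a subclass method literally named initialize overrides the dispatcher, so in practice such a collision needs care):

```python
def on_initialize(precedence=0):
    def marker(func):
        func._initializer = precedence
        return func
    return marker

class Framework:
    def __init_subclass__(cls, **kw):
        super().__init_subclass__(**kw)
        # collect (precedence, name) pairs for all marked methods
        marked = [(m._initializer, name)
                  for name, m in cls.__dict__.items()
                  if hasattr(m, "_initializer")]
        cls._initializer = [name for _, name in sorted(marked)]

    def initialize(self):
        for name in self._initializer:
            getattr(self, name)()

class MyFramework(Framework):
    @on_initialize(precedence=1)
    def start_processing_events(self):
        print('start_processing_events')

    @on_initialize(precedence=-1)
    def say_hi(self):
        print('say_hi')

    @on_initialize(precedence=0)
    def initialize_step(self):
        print('initialize')

MyFramework().initialize()
# say_hi
# initialize
# start_processing_events
```

Even though the methods are defined out of order in the class body, sorting by precedence restores the intended sequence.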
My idea is to use class-based decorators, which are simple and give the decorated functions a shared context. The decorator would look like this (I am using Python 3.5):
class OnInitialize:
    methods = {}

    def __init__(self, precedence):
        self.precedence = precedence

    def __call__(self, func):
        self.methods[self.precedence] = func

        def wrapper(*a, **kw):
            for precedence in sorted(self.methods.keys()):
                self.methods[precedence](*a, **kw)
        return wrapper
On decoration, __init__ executes first and stores the precedence value for later use. Then __call__ executes, which adds the target function to the methods dictionary. (Note that __call__ and the methods structure could be customized to allow multiple methods with the same precedence.)
The target class and its methods would look like this:
class Target:
    @OnInitialize(precedence=-1)
    def say_hi(self):
        print("say_hi")

    @OnInitialize(precedence=0)
    def initialize(self):
        print("initialize")

    @OnInitialize(precedence=1)
    def start_processing_events(self):
        print("start_processing_events")
This ensures that if any of the decorated methods is called, all of them are called in the given order.
target = Target()
target.initialize()
Hope it helps; please comment below if you are interested in something else.
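To verify the "calling any of them calls all of them" behavior, here are the decorator and target combined into one snippet (repeated from above so it runs standalone); note that target.say_hi() triggers the same full sequence as target.initialize():

```python
class OnInitialize:
    methods = {}

    def __init__(self, precedence):
        self.precedence = precedence

    def __call__(self, func):
        self.methods[self.precedence] = func

        def wrapper(*a, **kw):
            for precedence in sorted(self.methods.keys()):
                self.methods[precedence](*a, **kw)
        return wrapper

class Target:
    @OnInitialize(precedence=-1)
    def say_hi(self):
        print("say_hi")

    @OnInitialize(precedence=0)
    def initialize(self):
        print("initialize")

    @OnInitialize(precedence=1)
    def start_processing_events(self):
        print("start_processing_events")

target = Target()
# Any decorated method dispatches the whole ordered sequence:
target.say_hi()
# say_hi
# initialize
# start_processing_events
```

One caveat: methods is a single dict on the OnInitialize class, so two unrelated classes decorated this way would share it.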

Proxy class can't call methods on child

I'm writing a Python class to wrap/decorate/enhance another class from a package called petl, a framework for ETL (data movement) workflows. Due to design constraints I can't just subclass it; every method call has to be sent through my own class so I can control what kind of objects are being passed back. So in principle this is a proxy class, but I'm having some trouble using existing answers/recipes out there. This is what my code looks like:
from functools import partial

class PetlTable(object):
    """not really how we construct petl tables, but for illustrative purposes"""
    def hello(name):
        print('Hello, {}!'.format(name))

class DatumTable(object):
    def __init__(self, petl_tbl):
        self.petl_tbl = petl_tbl

    def __getattr__(self, name):
        """this returns a partial referencing the child method"""
        petl_attr = getattr(self.petl_tbl, name, None)
        if petl_attr and callable(petl_attr):
            return partial(self.call_petl_method, func=petl_attr)
        raise NotImplementedError('Not implemented')

    def call_petl_method(self, func, *args, **kwargs):
        func(*args, **kwargs)
Then I try to instantiate a table and call something:
# create a petl table
pt = PetlTable()
# wrap it with our own class
dt = DatumTable(pt)
# try to run the petl method
dt.hello('world')
This gives a TypeError: call_petl_method() got multiple values for argument 'func'.
This only happens with positional arguments; kwargs seem to be fine. I'm pretty sure it has to do with self not being passed in, but I'm not sure what the solution is. Can anyone think of what I'm doing wrong, or a better solution altogether?
This seems to be a common issue with mixing positional and keyword args:
TypeError: got multiple values for argument
To get around it, I took the positional arg func out of call_petl_method and put it in a kwarg that's unlikely to overlap with the kwargs of the child function. A little hacky, but it works.
I ended up writing a Proxy class to do all this generically:
from functools import partial

class Proxy(object):
    def __init__(self, child):
        self.child = child

    def __getattr__(self, name):
        child_attr = getattr(self.child, name)
        return partial(self.call_child_method, __child_fn__=child_attr)

    @classmethod
    def call_child_method(cls, *args, **kwargs):
        """
        This calls a method on the child object and wraps the response as an
        object of its own class.

        Takes a kwarg `__child_fn__` which points to a method on the child
        object.

        Note: this can't take any positional args or they get clobbered by the
        keyword args we're trying to pass to the child. See:
        https://stackoverflow.com/questions/21764770/typeerror-got-multiple-values-for-argument
        """
        # get child method
        fn = kwargs.pop('__child_fn__')
        # call the child method
        r = fn(*args, **kwargs)
        # wrap the response as an object of the same class
        r_wrapped = cls(r)
        return r_wrapped
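A small usage sketch of that Proxy, with a hypothetical Child class standing in for a petl table (petl itself isn't needed to show the mechanics): each call goes through call_child_method and the result comes back re-wrapped as a Proxy.

```python
from functools import partial

class Proxy(object):
    def __init__(self, child):
        self.child = child

    def __getattr__(self, name):
        child_attr = getattr(self.child, name)
        return partial(self.call_child_method, __child_fn__=child_attr)

    @classmethod
    def call_child_method(cls, *args, **kwargs):
        fn = kwargs.pop('__child_fn__')
        # call the child method and wrap the result as a Proxy
        return cls(fn(*args, **kwargs))

# Hypothetical stand-in for a petl table: methods return new Child objects.
class Child:
    def __init__(self, value=0):
        self.value = value

    def add(self, n):
        return Child(self.value + n)

p = Proxy(Child(1))
p2 = p.add(2)  # positional args now work: no clash with __child_fn__
print(type(p2).__name__, p2.child.value)  # Proxy 3
```

Because call_child_method takes no positional parameters of its own, the positional args flow straight through to the child method.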
This will also solve the problem. It doesn't use partial at all.
class PetlTable(object):
    """not really how we construct petl tables, but for illustrative purposes"""
    def hello(name):
        print('Hello, {}!'.format(name))

class DatumTable(object):
    def __init__(self, petl_tbl):
        self.petl_tbl = petl_tbl

    def __getattr__(self, name):
        """Looks up the named attribute in the class of the petl_tbl object."""
        petl_attr = self.petl_tbl.__class__.__dict__.get(name, None)
        if petl_attr and callable(petl_attr):
            return petl_attr
        raise NotImplementedError('Not implemented')

if __name__ == '__main__':
    # create a petl table
    pt = PetlTable()

    # wrap it with our own class
    dt = DatumTable(pt)

    # try to run the petl method
    dt.hello('world')  # -> Hello, world!

What is causing "unbound method __init__() must be called with instance as first argument" from this Python code?

I have this class:
from threading import Thread
import time

class Timer(Thread):
    def __init__(self, interval, function, *args, **kwargs):
        Thread.__init__()
        self.interval = interval
        self.function = function
        self.args = args
        self.kwargs = kwargs
        self.start()

    def run(self):
        time.sleep(self.interval)
        return self.function(*self.args, **self.kwargs)
and am calling it with this script:
import timer

def hello():
    print("hello, world")

t = timer.Timer(1.0, hello)
t.run()
and get this error and I can't figure out why: unbound method __init__() must be called with instance as first argument
You are doing:
Thread.__init__()
Use:
Thread.__init__(self)
Or, rather, use super()
This is a frequently asked question on SO, but in brief: the way to call your superclass's constructor is:
super(Timer, self).__init__()
First, the reason you must use:
Thread.__init__(self)
instead of
Thread.__init__()
is that you are calling the method through the class name rather than through an instance, so it is not bound and you must pass the instance explicitly.
Second, if you are using Python 3, the recommended style for invoking a superclass method from a subclass is:
super().method_name(parameters)
Although in Python 3 it is also possible to use:
SuperClassName.method_name(self, parameters)
this is an older style of syntax and not the preferred one.
You just need to pass self as an argument to Thread.__init__. After that, it works on my machine.
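Putting the fixes together, a corrected sketch of the Timer class (this also moves start() out of __init__, since starting the thread in the constructor and then calling run() again would execute the function twice):

```python
from threading import Thread
import time

class Timer(Thread):
    def __init__(self, interval, function, *args, **kwargs):
        super().__init__()  # bound call: no need to pass self explicitly
        self.interval = interval
        self.function = function
        self.args = args
        self.kwargs = kwargs

    def run(self):
        time.sleep(self.interval)
        self.function(*self.args, **self.kwargs)

results = []

def hello():
    results.append("hello, world")
    print("hello, world")

t = Timer(0.1, hello)
t.start()  # schedules run() on the new thread
t.join()   # wait for the timer to fire
```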
