In Python, I'd like to be able to create a function that behaves both as a class method and as an instance method, but with the ability to change behavior depending on how it is called. The use case for this is a set of serializable objects and types. As an example:
>>> class Thing(object):
...     # ...
>>> Thing.to_json()
'A'
>>> Thing().to_json()
'B'
I know that, given the definition of classmethod() in funcobject.c in the Python source, this looks like it'd be simple with a C module. Is there a way to do this from within Python?
Thanks!
With the hint of descriptors, I was able to do it with the following code:
import functools

class combomethod(object):
    def __init__(self, method):
        self.method = method

    def __get__(self, obj=None, objtype=None):
        @functools.wraps(self.method)
        def _wrapper(*args, **kwargs):
            if obj is not None:
                return self.method(obj, *args, **kwargs)
            else:
                return self.method(objtype, *args, **kwargs)
        return _wrapper
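For completeness, a minimal usage sketch of the descriptor above (the to_json body returning 'A'/'B' is made up to mirror the example at the top):

class Thing(object):
    @combomethod
    def to_json(cls_or_self):
        # receives the class when called as Thing.to_json(),
        # and the instance when called as Thing().to_json()
        return 'A' if isinstance(cls_or_self, type) else 'B'

print(Thing.to_json())    # 'A'
print(Thing().to_json())  # 'B'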
Thank you Alex!
Sure, you just need to define your own descriptor type. There's an excellent tutorial on Python descriptors here.
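For reference, a bare-bones descriptor is just a class with a __get__ method; a pure-Python sketch of classmethod (roughly what the descriptor tutorial walks through) looks like this:

class my_classmethod(object):
    """A rough pure-Python stand-in for the built-in classmethod."""
    def __init__(self, method):
        self.method = method

    def __get__(self, obj, objtype=None):
        if objtype is None:
            objtype = type(obj)
        def bound(*args, **kwargs):
            # always pass the class as the first argument
            return self.method(objtype, *args, **kwargs)
        return bound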
Related: I'm trying to inherit from numpy.ndarray, but I need to resize the array inside a method, with no answer so far: Resize inherited ndarray, inside a method?
So my question is how to redirect methods to a local object. For attributes you can use __getattribute__ or __getattr__; is there a similar option for methods?
Something like this:
class Blah():
    def __init__(...):
        ...
        self.obj = np.zeros(...)

    def __get_method__(self, method):
        return self.obj.method()  # ????
Can you not just inherit from numpy.ndarray?
class Blah(numpy.ndarray):
    def __init__(self, *args, **kwargs):
        numpy.ndarray.__init__(self, *args, **kwargs)
        ...

    def resize(self, ...):
        ...
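Note that numpy arrays are normally subclassed through __new__ and __array_finalize__ rather than __init__; a minimal sketch along the lines of the numpy subclassing docs looks roughly like this:

import numpy as np

class Blah(np.ndarray):
    def __new__(cls, shape, dtype=float):
        # the ndarray is created (and typed) in __new__, not __init__
        return np.zeros(shape, dtype=dtype).view(cls)

    def __array_finalize__(self, obj):
        # called for every new instance, including views and slices
        if obj is None:
            return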
As a hack that might sometimes be more convenient, you can alternatively do
class Blah:
    ...
    def __getattr__(self, name):
        return getattr(self.obj, name)
But this is normally something you would regret later :-}.
Edit: To come back to the first (better) way: You can resize in-place. See here: https://numpy.org/doc/stable/reference/generated/numpy.ndarray.resize.html#numpy.ndarray.resize
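For example, a plain in-place resize looks like this; refcheck=False skips the reference-count check, which is often necessary when other references to the array may exist (e.g. in an interactive session):

import numpy as np

a = np.zeros(4)
a.resize(8, refcheck=False)  # grows the array in place, padding with zeros
print(a.shape)               # (8,)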
So I've created a module, inspired heavily by amoffat's sh module, where I can import shell programs as functions. Unlike sh, my module can do something like git(C = path).commit(m = message) directly, by returning the module class itself as a partial: return partial(bakery, self.program). However, I've lost the ability to run something like ls() without a placeholder method such as ls._(), which doesn't look as good. The code in the latter case is: return output_as_list(args, kwargs).
from functools import partial

def __getattr__(name):
    if name == "__path__":
        raise AttributeError
    return bakery(name)

class bakery:
    def __init__(self, program):
        self.program = program

    def __getattr__(self, subcommand):
        return subcommand

    @property
    def __call__(self):
        return partial(bakery, self.program)
My question is this:
Is there a way to tell __call__ that you're accessing a dynamic attribute (by using a __getattr__ inside it, for example), so that both the git(C = path).commit(m = message) and ls() scenarios can be implemented? Or to conditionally return either a partial or an output list, depending on whether an attribute of __call__ is being accessed?
Edit:
I was wondering if something similar to this might work?
def __call__(self, *args, **kwargs):
    class inner_class:
        def __init__(self, *args, **kwargs):
            self.args = args
            self.kwargs = kwargs

        def __getattr__(self, subcommand):
            return partial(bakery, self.program)

        def __call__(self, *args, **kwargs):
            return bakery(*self.args, **self.kwargs)._get_output_as_list(*args, **kwargs)

    return inner_class(self.program, *args, **kwargs)
Edit 2:
I suppose I could just convert the individual __call__ functions to subclasses, and import from whichever is necessary.
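One pattern that can resolve the ambiguity is to never run the command inside __call__ itself: calling only bakes options in and stays chainable, and the command runs when the result is actually consumed. This is only a rough sketch, not a drop-in replacement for the module above: it assumes subprocess as the backend, flattens options naively after all subcommands, and skips error handling entirely.

import subprocess

class bakery:
    def __init__(self, *parts, **options):
        self.parts = list(parts)
        self.options = dict(options)

    def __getattr__(self, subcommand):
        # chaining: git(C=path).commit -> a new bakery for the subcommand
        return bakery(*self.parts, subcommand, **self.options)

    def __call__(self, **options):
        # never run the command here; just bake the options in and
        # return another chainable bakery
        return bakery(*self.parts, **{**self.options, **options})

    def _argv(self):
        # naive flattening: every option ends up after all subcommands,
        # which is not always what a real CLI (e.g. git -C) expects
        args = list(self.parts)
        for key, value in self.options.items():
            args += [("-" if len(key) == 1 else "--") + key, str(value)]
        return args

    def __iter__(self):
        # the command only runs when the result is actually consumed
        out = subprocess.run(self._argv(), capture_output=True, text=True)
        return iter(out.stdout.splitlines())

# list(bakery("ls")())                      # runs "ls" and yields its output lines
# bakery("git", C=path).commit(m=message)   # stays lazy and chainable until consumed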
So I have the following decorator code
class Factory:
    def __init__(self, cls):
        self.cls = cls

    def __instancecheck__(self, inst):
        return isinstance(inst, self.cls)

    def Produce(self):
        return self.cls()
And the following class code
@Factory
class Foo:
    def __init__(self, arg):
        self.arg = arg

    def method(self): pass
Which works great and allows me to do stuff like
Foo.Produce().method()
Instead of
instance = Foo()
instance.method()
But now I can't use the class constructor normally:
Foo(arg)
This gives the exception 'Factory object is not callable'. My question is the following: how can I make a decorator that allows me to instantiate the decorated class using its constructor, but also lets me call a function defined in the decorator?
Alternative ways I'd rather not use:
Skip the constructor and always use <Class>.Produce() (and use *args/**kwargs to make it abstract/reusable).
Use setters in all the classes, and make them return self so they can be chained.
Make a class containing the produce method and extend this class.
The exception is telling you all you need to know, just add a __call__ method:
class Factory:
    # ...
    def __call__(self, *args, **kwargs):
        return self.cls(*args, **kwargs)
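With __call__ defined, construction goes through the decorator again; a quick check (the argument string is arbitrary):

foo = Foo("some arg")        # routed through Factory.__call__ to Foo.__init__
foo.method()
print(isinstance(foo, Foo))  # True, courtesy of __instancecheck__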
If all you want to do is to add a Produce function to the class, you can rewrite your decorator like this:
def Factory(cls):
    def Produce():
        return cls()
    cls.Produce = Produce  # add the function to the class
    return cls
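Usage would then look like this (Bar is a made-up example class with a no-argument constructor, since this Produce passes no arguments through):

@Factory
class Bar:
    def method(self):
        print("hello from Bar")

Bar.Produce().method()  # factory-style construction
Bar().method()          # the plain constructor still works, since the class itself is returned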
I was trying to have my class-based decorator keep the repr() behavior of the original wrapped function (to match the way the functools.wraps decorator works on functions). I am using Python 3.3.
First I tried functools:
import functools

class ClassBasedDecorator():
    def __init__(self, fn):
        self.fn = fn
        functools.update_wrapper(self, fn)

    def __call__(self, *args, **kwargs):
        return self.fn(*args, **kwargs)

@ClassBasedDecorator
def wrapped(text):
    pass
But when I call repr() on the decorated function, I get:
>>> repr(wrapped)
'<__main__.ClassBasedDecorator object at 0x2d8860b6850>'
Very well, so I tried to customize the __repr__ method of my decorator, which is supposed to be called by repr().
Using functools again:
class ClassBasedDecorator():
    def __init__(self, fn):
        self.fn = fn
        functools.update_wrapper(
            self, fn,
            assigned=functools.WRAPPER_ASSIGNMENTS + ('__repr__',)
        )

    def __call__(self, *args, **kwargs):
        return self.fn(*args, **kwargs)
Doesn't change the output, but something interesting happens:
>>> repr(wrapped)
'<__main__.ClassBasedDecorator object at 0x2d8860b69d0>'
>>> wrapped.__repr__()
'<function wrapped at 0x2d8860a9710>'
Explicitly setting the __repr__ method of the decorator instance has the same effect.
After a few more tests I deduced that repr(instance) actually calls instance.__class__.__repr__(instance). Thus the overridden __repr__ method of the instance is never called.
So here are my questions:
Why does repr(instance) call instance.__class__.__repr__(instance) instead of instance.__repr__()? Or have I missed something else?
How would you fully reproduce what functools.wraps does with function-based decorators to class-based decorators (including altering the result of repr() calls on the decorated function)?
Special methods are always looked up on the type of the instance (here the class object), not on the instance. Otherwise a __repr__ on a class would be used when you tried to print the representation of the class itself; type(class).__repr__(class) would use the correct magic method, while class.__repr__() would raise an exception because self was not provided.
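You can see that lookup rule in isolation with a tiny experiment (the class name is arbitrary):

class C:
    def __repr__(self):
        return "class-level __repr__"

c = C()
c.__repr__ = lambda: "instance-level __repr__"

print(repr(c))       # class-level __repr__    -- repr() looks on type(c), not on c
print(c.__repr__())  # instance-level __repr__ -- explicit attribute lookup finds the instance one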
Implement your own __repr__ hooks:
class ClassBasedDecorator():
    def __init__(self, fn):
        self.fn = fn
        functools.update_wrapper(self, fn)

    def __call__(self, *args, **kwargs):
        return self.fn(*args, **kwargs)

    def __repr__(self):
        return repr(self.fn)
e.g. still copy over the __module__, __name__ and __doc__ attributes, and copy over the attributes from the function __dict__, but make any special methods a proxy.
I guess that's how they are called, but I will give examples just in case.
Decorator class:
class decorator(object):
    def __init__(self, func):
        self.func = func

    def __call__(self, *args, **kwargs):
        print('something')
        return self.func(*args, **kwargs)
Decorator function:
def decorator(func):
    def wrapper(*args, **kwargs):
        print('something')
        return func(*args, **kwargs)
    return wrapper
Is using one or the other just a matter of taste? Is there any practical difference?
If you can write a function to implement your decorator, you should prefer it. But not all decorators can easily be written as a function - for example, when you want to store some internal state:
class counted(object):
    """Counts how often a function is called."""
    def __init__(self, func):
        self.func = func
        self.counter = 0

    def __call__(self, *args, **kwargs):
        self.counter += 1
        return self.func(*args, **kwargs)

@counted
def something():
    pass

something()
print(something.counter)
I've seen people (including myself) go through ridiculous efforts to write decorators only with functions. I still have no idea why; the overhead of a class is usually totally negligible.
It is generally just a matter of taste. Most Python programs use duck typing and don't really care whether the thing they're calling is a function or an instance of some other type, so long as it is callable. And anything with a __call__() method is callable.
There are a few advantages to using function-style decorators:
Much cleaner when your decorator doesn't return a wrapper function (i.e., it returns the original function after doing something to it, such as setting an attribute; see the sketch after this list).
No need to explicitly save the reference to the original function, as this is done by the closure.
Most of the tools that help you make decorators, such as functools.wraps() or Michele Simionato's signature-preserving decorator module, work with function-style decorators.
There may be some programs out there somewhere which don't use duck typing, but actually expect a function type, so returning a function to replace a function is theoretically "safer."
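For the first point above, a decorator that just annotates the function and returns it unchanged might look like this (the attribute name is made up):

def exposed(func):
    # no wrapper function at all: mark the original and hand it back
    func.is_exposed = True
    return func

@exposed
def handler():
    pass

print(handler.is_exposed)  # True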
For these reasons, I use function-style decorators most of the time. As a counterexample, however, here is a recent instance in which the class-style decorator was more natural for me.
The proposed class decorator implementation has a slight difference from the function implementation: it will fail on methods.
class Decorator(object):
    def __init__(self, func):
        self.func = func

    def __call__(self, *args, **kwargs):
        print('something')
        return self.func(*args, **kwargs)

class A:
    @Decorator
    def mymethod(self):
        print("method")

A().mymethod()
will raise TypeError: mymethod() missing 1 required positional argument: 'self'
To add support for methods, you need to implement __get__:
import types

class Decorator2(object):
    def __init__(self, func):
        self.func = func

    def __call__(self, *args, **kwargs):
        print('something')
        return self.func(*args, **kwargs)

    def __get__(self, instance, owner):
        if instance is None:
            return self
        return types.MethodType(self, instance)

class B:
    @Decorator2
    def mymethod(self):
        print("method")

B().mymethod()
will output
something
method
The reason it works is that when you access B().mymethod, __get__ is called first and returns the bound method; then __call__ is called.
To conclude: provided you define __get__, the class and function implementations can be used in the same way. See Python Cookbook recipe 9.9 for more information.