I'm trying to inherit from numpy.ndarray, but I need to resize the array inside a method, and my earlier question has had no answer so far:
Resize inherited ndarray, inside a method?
So my question now is how to redirect method calls to a local object.
For attributes you can use __getattribute__ or __getattr__; is there a similar option for methods?
Something like this:
class Blah():
    def __init__(...):
        ......
        self.obj = np.zeros(...)

    def __get_method__(self, method):
        return self.obj.method() ????
Can you not just inherit from numpy.ndarray? (numpy.array is a factory function, not a class, so it cannot be subclassed.)

class Blah(numpy.ndarray):
    def __new__(cls, *args, **kwargs):
        # ndarray allocates its memory in __new__, not __init__,
        # so construction is customized here
        return numpy.ndarray.__new__(cls, *args, **kwargs)

    def resize(self, ...):
        ....
As a hack that might sometimes be more convenient, you can alternatively do:

class Blah:
    ...
    def __getattr__(self, name):
        return getattr(self.obj, name)
But this is normally something you would regret later :-}.
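Still, for completeness, here is a minimal runnable sketch of that delegation (the n parameter and the forwarded calls are illustrative assumptions, not from the question):

import numpy as np

class Blah:
    def __init__(self, n):
        self.obj = np.zeros(n)

    def __getattr__(self, name):
        # __getattr__ fires only when normal lookup fails, so unknown
        # attributes and methods all fall through to the wrapped array
        return getattr(self.obj, name)

b = Blah(5)
print(b.shape)  # (5,)  -- forwarded attribute
print(b.sum())  # 0.0   -- forwarded method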
Edit: To come back to the first (better) way: You can resize in-place. See here: https://numpy.org/doc/stable/reference/generated/numpy.ndarray.resize.html#numpy.ndarray.resize
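For example:

import numpy as np

a = np.zeros(5)
a.resize(10, refcheck=False)  # grows in place; refcheck=False skips the check
                              # that no other references to the array exist
print(a.shape)  # (10,)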
The specific use case I need it for is to deprecate class names.
Suppose we have class A in an earlier version and we want to deprecate its name but keep backwards compatibility:
class A(B):
    def __init__(self, *args, **kwargs):
        warnings.warn('deprecation!')
        super(A, self).__init__(*args, **kwargs)
... and B now has the correct implementation.
When we create an instance of class A, we run into the deprecation warning here. We could also use the deprecated module to decorate __init__.
However, I want to skip this process and write less code, and hopefully achieve something like:
@deprecated_alias('A')
class B:
    # ... do something
Can I somehow inject the classname into the module-level namespace so that I can use A like this?
Can I somehow inject the classname into the module-level namespace so that I can use A like this?
Yes. The class decorator should:
- create a new type, with an overridden __init__ method, using the 3-argument invocation of type
- get the module of the original class, sys.modules[original_class.__module__]
- bind the new class in the module namespace, using setattr
- return the original class unchanged
Example:
import sys
import warnings

def deprecated_alias(name):
    def decorator(class_):
        mod = sys.modules[class_.__module__]
        if hasattr(mod, name):
            raise Exception('uhoh, name collision')
        def __init__(self, *args, **kwargs):
            # the overridden __init__ warns, then defers to the real class
            warnings.warn('deprecation!')
            class_.__init__(self, *args, **kwargs)
        NewClass = type(name, (class_,), {'__init__': __init__})
        setattr(mod, name, NewClass)
        return class_
    return decorator

@deprecated_alias('A')
class B:
    pass
I don't recommend this approach - too much magic. It will confuse IDEs and break autocompletion.
A less magical approach, perhaps? This could also be made into a decorator, and use __subclasscheck__/__subclasshook__ if you need to control the finer details of inheritance.
class A(B):
    def __init__(self, *args, **kwargs):
        warnings.warn('deprecation!')
        super().__init__(*args, **kwargs)  # __init__ must not return a value
While this is not exactly what you asked for, it is substantially less magical and ultimately the same number of lines of code. It is also far more explicit:
import warnings

def deprecated(DeprecatedByClass):
    class Deprecated(DeprecatedByClass):
        def __new__(cls, *args, **kwargs):
            warnings.warn("deprecation!")
            return super(Deprecated, cls).__new__(cls, *args, **kwargs)
    return Deprecated
You can then use this like so:
class B:
    pass

A = deprecated(B)
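A quick check of the behavior under these definitions:

import warnings

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    obj = A()                      # instantiating the alias warns...
assert isinstance(obj, B)          # ...but still yields a B (subclass) instance
assert "deprecation" in str(caught[0].message)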
I have some Python objects with methods in which I would like to do a check at the beginning; depending on the result of this check, the method's code would run, or an exception would be raised. Instead of replicating the check code at the beginning of every method, I thought of writing a decorator. I also want the decorator to be embedded inside the class itself, since it is closely related to it. So basically:
instead of this
class A(object):
    def a_method(self):
        if self.check_var is True:
            (some_code)
        else:
            raise Exception
I would like to have this
class A(object):
    def decorator(function):
        def function_wrapper(self, *args, **kwargs):
            if self.check_var is True:
                return function(self, *args, **kwargs)
            else:
                raise Exception
        return function_wrapper

    @decorator
    def a_method(self):
        (some_code)
My first question is: am I going about this right, or is there a better way? Many methods of the A class need this check, which is why I don't want to replicate the code unnecessarily.
My second question is: if I go about it the way I described, I run into a problem when I want to derive a class from class A and perform the same decorator checks. Again, I don't want to replicate the code, so I want to reuse the decorator from the base class A to perform checks in the derived class. I read about turning the decorator into a @classmethod; however, when I do this I am able to use the decorator in the derived class, but not in the base class anymore!
So basically I would like something like this:
class A(object):
    @classmethod  # maybe
    def decorator(function):
        def function_wrapper(self, *args, **kwargs):
            if self.check_var is True:
                return function(self, *args, **kwargs)
            else:
                raise Exception
        return function_wrapper

    @decorator
    def a_method(self):
        (some_code)

class B(A):
    @decorator
    def b_method(self):
        (some_code)
Does anybody know of any clean way to do this?
Since you would prefer to put the decorator inside the class (rather than outside both of them, as I suggested in a comment), the code below shows a way to do it. It makes the decorator a staticmethod instead of a classmethod, and requires using it in a slightly unusual manner, but only within the class.
For more information regarding the necessity of using the decorator like this, see my question Calling class staticmethod within the class body?
class A(object):
    @staticmethod
    def decorator(function):
        def function_wrapper(*args, **kwargs):
            print('in function_wrapper')
            return function(*args, **kwargs)
        return function_wrapper

    @decorator.__func__  #### Note unusual decorator usage inside defining class
    def a_method(self):
        print('in a_method')

class B(A):
    @A.decorator  #### Normal decorator usage outside defining class
    def b_method(self):
        print('in b_method')
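Exercising both classes shows the wrapper firing in each case:

>>> A().a_method()
in function_wrapper
in a_method
>>> B().b_method()
in function_wrapper
in b_method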
One way to avoid having to use __func__ and still keep the definition in the first class would be to postpone turning it into a staticmethod until the very end of the class definition:
class A(object):
    def decorator(function):
        def function_wrapper(*args, **kwargs):
            print('in function_wrapper')
            return function(*args, **kwargs)
        return function_wrapper

    @decorator
    def a_method(self):
        print('in a_method')

    decorator = staticmethod(decorator)  #### convert for use outside this class

class B(A):
    @A.decorator
    def b_method(self):
        print('in b_method')
Yet another way to avoid the __func__ is something like this:
class A(object):
    class Check:
        @staticmethod
        def decorator(function):
            def function_wrapper(*args, **kwargs):
                print('in function_wrapper')
                return function(*args, **kwargs)
            return function_wrapper

    @Check.decorator
    def a_method(self):
        print('in a_method')

class B(A):
    Check = A.Check

    @Check.decorator
    def b_method(self):
        print('in b_method')
Which has the additional advantage of making usage of the decorator very uniform.
My first question is, am I going about this right?
As martineau said, good practice is to put a classic decorator outside the class.
def get_decorator(function, argument):
    def function_wrapper(*args, **kwargs):
        if argument is True:
            return function(*args, **kwargs)
        else:
            raise Exception
    return function_wrapper

class A(object):
    def __init__(self):
        self.check_var = True
        self.a_method = get_decorator(self.a_method, self.check_var)

    def a_method(self):
        (whatever)

class B(A):
    def __init__(self):
        super(B, self).__init__()
        self.b_method = get_decorator(self.b_method, self.check_var)

    def b_method(self):
        (whatever)
A classic decorator is applied at class-creation time, which is long before any instance is created.
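A small demonstration of that timing, with a hypothetical trace decorator:

def trace(function):
    # this print runs while the class body is being executed
    print('decorating', function.__name__)
    return function

class C:
    @trace
    def method(self):
        pass

# "decorating method" has already been printed at this point,
# before any C() instance exists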
So I have the following decorator code
class Factory:
    def __init__(self, cls):
        self.cls = cls

    def __instancecheck__(self, inst):
        return isinstance(inst, self.cls)

    def Produce(self):
        return self.cls()
And the following class code
@Factory
class Foo:
    def __init__(self, arg):
        self.arg = arg

    def method(self): pass
This works great and allows me to do things like:
Foo.Produce().method()
Instead of
instance = Foo()
instance.method()
But now I can't use the class constructor normally:
Foo(arg)
This raises the exception TypeError: 'Factory' object is not callable. My question is the following: how can I make a decorator that allows me to instantiate the decorated class using its constructor, but also allows me to use a function defined in the decorator?
Alternative ways I'd rather not use:
Skip the constructor. Always use <Class>.Produce() (and use *args/**kwargs to make it abstract/reusable).
Use setters in all the classes, and make them return self so they can be chained.
Make a class containing the produce method and extend this class.
The exception is telling you all you need to know: just add a __call__ method that forwards to the wrapped class.

class Factory:
    # ...
    def __call__(self, *args, **kwargs):
        return self.cls(*args, **kwargs)
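With __call__ in place, both spellings work again. A sketch (arg is given a default here, as an assumption, so the no-argument Produce() path also succeeds):

@Factory
class Foo:
    def __init__(self, arg=None):
        self.arg = arg

f = Foo('hello')           # routed through Factory.__call__
g = Foo.Produce()          # the factory helper still works
assert isinstance(f, Foo)  # __instancecheck__ delegates to the wrapped class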
If all you want to do is to add a Produce function to the class, you can rewrite your decorator like this:
def Factory(cls):
    def Produce():
        return cls()
    cls.Produce = staticmethod(Produce)  # attach as a staticmethod so it can
                                         # also be called through an instance
    return cls
I had a class called CacheObject, and many classes extend from it.
Now I need to add something common to all the classes derived from it, so I wrote this:

class CacheObject(object):
    def __init__(self):
        self.updatedict = dict()

But the child classes didn't obtain the updatedict attribute. I know calling the super init function is optional in Python, but is there an easy way to force all of them to call it, rather than walking through all the classes and modifying them one by one?
I was in a situation where I wanted classes to always call their base classes' constructor in order before they call their own. The following is Python3 code that should do what you want:
class meta(type):
    def __init__(cls, name, bases, dct):
        def auto__call__init__(self, *a, **kw):
            for base in cls.__bases__:
                base.__init__(self, *a, **kw)
            cls.__init__child_(self, *a, **kw)
        cls.__init__child_ = cls.__init__
        cls.__init__ = auto__call__init__

class A(metaclass=meta):
    def __init__(self):
        print("Parent")

class B(A):
    def __init__(self):
        print("Child")
To illustrate, it will behave as follows:
>>> B()
Parent
Child
<__main__.B object at 0x000001F8EF251F28>
>>> A()
Parent
<__main__.A object at 0x000001F8EF2BB2B0>
I suggest a non-code fix:
Document that super().__init__() should be called by your subclasses before they use any other methods defined in it.
This is not an uncommon restriction. See, for instance, the documentation for threading.Thread in the standard library, which says:
If the subclass overrides the constructor, it must make sure to invoke the base class constructor (Thread.__init__()) before doing anything else to the thread.
There are probably many other examples, I just happened to have that doc page open.
You can override __new__. As long as your base classes don't override __new__ without calling super().__new__, you'll be fine.

class CacheObject(object):
    def __new__(cls, *args, **kwargs):
        # pass only cls to object.__new__; it rejects extra arguments
        instance = super().__new__(cls)
        instance.updatedict = {}
        return instance

class Foo(CacheObject):
    def __init__(self):
        pass
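A quick check that the attribute appears even though Foo.__init__ never calls super():

>>> Foo().updatedict
{}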
However, as some commenters said, the motivation for this seems a little shady. You should perhaps just add the super calls instead.
This isn't what you asked for, but how about making updatedict a property, so that it doesn't need to be set in __init__:
class CacheObject(object):
    @property
    def updatedict(self):
        try:
            return self._updatedict
        except AttributeError:
            self._updatedict = dict()
            return self._updatedict
Hopefully this achieves the real goal, that you don't want to have to touch every subclass (other than to make sure none uses an attribute called updatedict for something else, of course).
There are some odd gotchas, though, because it is different from setting updatedict in __init__ as in your question. For example, the content of CacheObject().__dict__ is different. It has no key updatedict because I've put that key in the class, not in each instance.
Regardless of motivation, another option is to use __init_subclass__() (Python 3.6+) to get this kind of behavior. (For example, I'm using it because I want users not familiar with the intricacies of Python to be able to inherit from a class to create specific engineering models, and I'm trying to keep the structure of the class they have to define very basic.)
In the case of your example,
from functools import wraps

class CacheObject:
    def __init__(self) -> None:
        self.updatedict = dict()

    def __init_subclass__(cls) -> None:
        orig_init = cls.__init__

        @wraps(orig_init)
        def __init__(self, *args, **kwargs):
            orig_init(self, *args, **kwargs)
            super(self.__class__, self).__init__()

        cls.__init__ = __init__
What this does is any class that subclasses CacheObject will now, when created, have its __init__ function wrapped by the parent class—we're replacing it with a new function that calls the original, and then calls super() (the parent's) __init__ function. So now, even if the child class overrides the parent __init__, at the instance's creation time, its __init__ is then wrapped by a function that calls it and then calls its parent.
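For instance, with one level of inheritance (note that the super(self.__class__, self) trick recurses infinitely for deeper hierarchies, so treat this as a sketch):

class Child(CacheObject):
    def __init__(self):
        print("child init")   # deliberately no super() call

c = Child()          # prints "child init", then the wrapper calls CacheObject.__init__
print(c.updatedict)  # {} -- the attribute exists anyway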
You can add a decorator to your classes:

def my_decorator(cls):
    old_init = cls.__init__
    def new_init(self):
        self.updatedict = dict()
        old_init(self)
    cls.__init__ = new_init
    return cls

@my_decorator
class SubClass(CacheObject):
    pass
If you want to add the decorator to all the subclasses automatically, use a metaclass:

class myMeta(type):
    def __new__(cls, name, parents, dct):
        return my_decorator(super().__new__(cls, name, parents, dct))

class CacheObject(object, metaclass=myMeta):
    pass
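A quick check under these definitions (subclasses here take no constructor arguments, since new_init above accepts only self):

class SubClass(CacheObject):
    def __init__(self):
        pass  # no explicit super() call needed

assert SubClass().updatedict == {}  # injected by the decorated __init__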
In Python, I'd like to be able to create a function that behaves both as a class method and an instance method, but with different behavior in each case. The use case for this is a set of serializable objects and types. As an example:
>>> class Thing(object):
...     # ...
...
>>> Thing.to_json()
'A'
>>> Thing().to_json()
'B'
I know that given the definition of classmethod() in funcobject.c in the Python source, this looks like it'd be simple with a C module. Is there a way to do this from within python?
Thanks!
With the hint about descriptors, I was able to do it with the following code:

import functools

class combomethod(object):
    def __init__(self, method):
        self.method = method

    def __get__(self, obj=None, objtype=None):
        @functools.wraps(self.method)
        def _wrapper(*args, **kwargs):
            if obj is not None:
                return self.method(obj, *args, **kwargs)
            else:
                return self.method(objtype, *args, **kwargs)
        return _wrapper
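For example, this reproduces the behavior from the question (the to_json body is an assumed illustration):

class Thing(object):
    @combomethod
    def to_json(receiver):
        # receiver is the class when called as Thing.to_json(),
        # and the instance when called as Thing().to_json()
        return 'A' if isinstance(receiver, type) else 'B'

print(Thing.to_json())    # 'A'
print(Thing().to_json())  # 'B'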
Thank you Alex!
Sure, you just need to define your own descriptor type. There's an excellent tutorial on Python descriptors here.