I'm using Flask-Classy to write a Flask app using class based views.
My base class is called SlugView. It catches URLs like example.com/124/catchy-article-name:
class SlugView(FlaskView):
    @route('/<id>')
    @route('/<id>/<slug>')
    def get(self, id, slug=None):
        raise NotImplementedError
My second class is called ArticleView:
class ArticleView(SlugView):
    def get(self, id, slug=None):
        return render_template('article.html', article=get_article_by_id(id))
What decorator magic can I use to have the subclassed function inherit the same decorators as the parent class?
Magic? Yes. Decorator magic? No. Do you object to metaclass magic?
class InheritableRoutesMeta(type):
    def __new__(cls, cls_name, bases, attributes):
        for name, value in attributes.items():
            if not callable(value):
                continue
            for base in bases:
                super_method = getattr(base, name, None)
                if super_method and hasattr(super_method, "_rule_cache"):
                    value._rule_cache = super_method._rule_cache
                    break
        return super(InheritableRoutesMeta, cls).__new__(cls, cls_name,
                                                         bases, attributes)
Then you should be able to do something like this:
class ArticleView(SlugView, metaclass=InheritableRoutesMeta):
    # Use the keyword argument metaclass for Python 3.
    # For Python 2, remove the argument and uncomment the line below:
    # __metaclass__ = InheritableRoutesMeta
    def get(self, id, slug=None):
        return render_template('article.html', article=get_article_by_id(id))
Warning: This is based on an internal property. If Flask-Classy chooses to change how it stores these decorators the above code will break (assuming that it works in the first place). If you really need this, it is worth filing an issue with the creator(s) to either make this property part of the public API or to provide another way of doing what you are doing. They may choose not to do either, but at least then they are aware of the use case.
The specific use case I need it for is to deprecate class names.
Suppose we have class A in an earlier version and we want to deprecate its name but keep backwards compatibility:
class A(B):
    def __init__(self, *args, **kwargs):
        warnings.warn('deprecation!')
        super(A, self).__init__(*args, **kwargs)
... and B now has the correct implementation.
When we instantiate A, we get a deprecation warning. We could also use the deprecated module to decorate __init__.
However, I want to skip this process and write less code, and hopefully achieve something like:
@deprecated_alias('A')
class B:
    # ... do something
Can I somehow inject the classname into the module-level namespace so that I can use A like this?
Can I somehow inject the classname into the module-level namespace so that I can use A like this?
Yes. The class decorator should:
create a new type, with overridden __init__ method, using the 3-argument invocation of type
get the module of the original class, sys.modules[original_class.__module__]
bind the new class in the module namespace, using setattr
return the original class unchanged
Example:
import sys

def deprecated_alias(name):
    def decorator(class_):
        mod = sys.modules[class_.__module__]
        if hasattr(mod, name):
            raise Exception('uh oh, name collision')
        NewClass = type(name, (class_,), {'__init__': ...})
        setattr(mod, name, NewClass)
        return class_
    return decorator
@deprecated_alias('A')
class B:
    pass
I don't recommend this approach - too much magic. It will confuse IDEs and break autocompletion.
A less magical approach, perhaps? This could also be made into a decorator, and use __subclasscheck__/__subclasshook__ if you need to control the finer details of inheritance.
class A(B):
    def __init__(self, *args, **kwargs):
        warnings.warn('deprecation!')
        super().__init__(*args, **kwargs)
While this is not exactly what you asked for, it is substantially less magical and ultimately the same number of lines of code. It is also far more explicit:
import warnings

def deprecated(DeprecatedByClass):
    class Deprecated(DeprecatedByClass):
        def __new__(cls, *args, **kwargs):
            warnings.warn("deprecation!")
            # object.__new__ rejects extra arguments, so don't forward them
            return super(Deprecated, cls).__new__(cls)
    return Deprecated
You can then use this like so:
class B:
pass
A = deprecated(B)
I want to register classes to a manager after a class is loaded, the way HTTP handlers register with a handler manager.
It can be done in other ways, such as defining the relationship in a map or calling a register function after the class definition.
But is there a better way to do this automatically?
update:
While Dunes's answer filled my need, I'm trying to improve the question and make it more useful for others who run into the same problem.
Here are the examples.
handler/__init__.py
handler/basehandler.py - classBase, HandlerManager
handler/handlerA.py - classA(classBase)
handler/handlerB.py - classB(classBase)
handlerA and handlerB contain classA and classB, which are subclasses of classBase.
classA handles requests from /a/, classB handles /b/.
I need to register them to HandlerManager automatically at the first time the handler module is imported.
If "being loaded" here means "being imported" (for the first time), then a class decorator is a solution. The sample code below is copied from this page:
registry = {}

def register(cls):
    registry[cls.__clsid__] = cls
    return cls

@register
class Foo(object):
    __clsid__ = "123-456"

    def bar(self):
        pass
Seems like a possible use for metaclasses. It's rare to need a metaclass for anything -- they're overkill for pretty much everything, and most things that can be achieved using a metaclass can be more easily achieved using decorators. However, this way you can ensure that any subclass of your base handler will automatically be registered too (unless it asks not to be registered).
class HandlerManager:
    handlers = []

    @classmethod
    def register(cls, handler):
        print("registering", handler)
        cls.handlers.append(handler)

class HandlerRegisterer(type):
    def __init__(self, name, bases, attrs, register=True):
        super().__init__(name, bases, attrs)
        if register:
            HandlerManager.register(self)

    def __new__(metaclass, name, bases, attrs, register=True):
        return super().__new__(metaclass, name, bases, attrs)

class BaseHandler(metaclass=HandlerRegisterer, register=False):
    # not actually a real handler, so don't register this class
    pass

class MyHandler(BaseHandler):
    # only has to inherit from another handler to make sure this class
    # gets registered
    pass
class MyHandler(BaseHandler):
# only have to inherit from another handler to make sure this class
# gets registered.
pass
print(HandlerManager.handlers)
assert BaseHandler not in HandlerManager.handlers
assert MyHandler in HandlerManager.handlers
If you need to use abstract classes, then you will need to make your metaclass a subclass of ABCMeta. This is because abstract classes are implemented using a metaclass, and Python only allows a class to have one metaclass. Subclassing ABCMeta makes the two metaclasses compatible (there's no code in either one that conflicts with the other).
from abc import ABC, ABCMeta, abstractmethod

class HandlerRegisterer(ABCMeta):
    # subclass ABCMeta rather than type
    def __init__(self, name, bases, attrs, register=True):
        super().__init__(name, bases, attrs)
        if register:
            HandlerManager.register(self)

    def __new__(metaclass, name, bases, attrs, register=True):
        return super().__new__(metaclass, name, bases, attrs)

class AbstractSubHandler(MyHandler, ABC, register=False):
    # not strictly necessary to subclass ABC, but nice to know it won't
    # screw things up
    @abstractmethod
    def some_method(self):
        pass
try:
    AbstractSubHandler()
except TypeError:
    print("Can not instantiate abstract class")
print(HandlerManager.handlers)
assert AbstractSubHandler not in HandlerManager.handlers
I am not sure what you mean by "loaded". Classes are usually initialized or called,
in which case the __init__ method or the __call__ method is used.
Both can be defined and can include calls to register with a manager.
Classes, and specifically the __init__ method, are described better here.
Small example:
class test:
    def __init__(self):
        print('I have started!')
>>> x = test()
I have started!
(Just replace the print with your registration code.)
Yes, there is a way to do this automatically.
As Inbar suggests, the __init__ method is the place to register an object creation.
Here is an example that you can use to wrap existing classes, rather than overwriting __init__. In this case I have made a wrapper for the list class. By calling super you can reuse the initialising code from the original class.
class nlist(list):
    def __init__(self, *args, **kwargs):
        print('making a new list!')  # overwrite with a call to a manager
        super().__init__(*args)
How this looks:
>>> list('hello')
['h', 'e', 'l', 'l', 'o']
>>> nlist('hello')
making a new list!
['h', 'e', 'l', 'l', 'o']
In general, I'm not familiar with Python's way of overriding methods and using super().
The question is: can I override get_FOO_display()?
class A(models.Model):
    unit = models.IntegerField(choices=something)

    def get_unit_display(self, value):
        ...  # use super(A, self).get_unit_display()
I want to override get_FOO_display() because I want to pluralize my display.
But super(A, self).get_unit_display() doesn't work.
Normally you would just override a method as you have shown. But the trick here is that the get_FOO_display method is not present on the superclass, so calling the super method will do nothing at all. The method is added dynamically by the field class when it is added to the model by the metaclass - see the source here (EDIT: outdated link as permalink).
One thing you could do is define a custom Field subclass for your unit field, and override contribute_to_class so that it constructs the method you want. It's a bit tricky unfortunately.
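To see the mechanism without pulling in Django, here is a toy sketch (all names hypothetical, not Django code) of how a field-like object can inject a get_FOO_display-style method into its owning class via a contribute_to_class hook:

```python
class Field:
    """Toy stand-in for a Django field (hypothetical)."""
    def contribute_to_class(self, cls, name):
        # Mimics how Django adds get_FOO_display to the model dynamically.
        def display(instance):
            return 'display of %s' % name
        setattr(cls, 'get_%s_display' % name, display)

class ModelMeta(type):
    """Toy model metaclass: give each Field a chance to modify the new class."""
    def __new__(mcls, name, bases, attrs):
        cls = super().__new__(mcls, name, bases, attrs)
        for attr, value in attrs.items():
            if isinstance(value, Field):
                value.contribute_to_class(cls, attr)
        return cls

class A(metaclass=ModelMeta):
    unit = Field()

print(A().get_unit_display())  # prints: display of unit
```

The method exists on A even though A never defined it, which is why a plain super() call in the model body can't find anything to override.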
(I don't understand your second question. What exactly are you asking?)
Now in Django > 2.2.7:
Restored the ability to override get_FOO_display() (#30931).
You can override:
class FooBar(models.Model):
    foo_bar = models.CharField(_("foo"), choices=[(1, 'foo'), (2, 'bar')])

    def get_foo_bar_display(self):
        return "something"
You could do it this way:
Override the Django IntegerField to make a copy of your get_FOO_display function:
class MyIntegerField(models.IntegerField):
    def contribute_to_class(self, cls, name, private_only=False):
        super(MyIntegerField, self).contribute_to_class(cls, name, private_only)
        if self.choices is not None:
            display_override = getattr(cls, 'get_%s_display' % self.name)
            setattr(cls, 'get_%s_display_override' % self.name, display_override)
In your class, replace your choice field with MyIntegerField:
class A(models.Model):
    unit = MyIntegerField(choices=something)
Finally, use the copy function to return the super value:
def get_unit_display(self, value):
    if your_condition:
        return your_value
    return self.get_unit_display_override()
You can't directly call super() because the original method doesn't "exist" yet on the parent model.
Instead, call self._get_FIELD_display() with the field object as its input. The field object is accessible through the self._meta.get_field() method.
def get_unit_display(self):
    singular = self._get_FIELD_display(self._meta.get_field('unit'))
    return singular + 's'
You should be able to override any method on a super class by creating a method with the same name on the subclass. The argument signature is not considered. For example:
class A(object):
    def method(self, arg1):
        print("Method A", arg1)

class B(A):
    def method(self):
        print("Method B")

A().method(True)  # "Method A True"
B().method()      # "Method B"
In the case of get_unit_display(), you do not have to call super() at all, if you want to change the display value, but if you want to use super(), ensure that you're calling it with the correct signature, for example:
class A(models.Model):
    unit = models.IntegerField(choices=something)

    def get_unit_display(self, value):
        display = super(A, self).get_unit_display(value)
        if value > 1:
            display = display + "s"
        return display
Note that we are passing value to the super()'s get_unit_display().
This doesn't work:
def register_method(name=None):
    def decorator(method):
        # The next line assumes the decorated method is bound
        # (which of course it isn't at this point)
        cls = method.im_class
        cls.my_attr = 'FOO BAR'
        def wrapper(*args, **kwargs):
            method(*args, **kwargs)
        return wrapper
    return decorator
Decorators are like the movie Inception; the more levels in you go, the more confusing they are. I'm trying to access the class that defines a method (at definition time) so that I can set an attribute (or alter an attribute) of the class.
Version 2 also doesn't work:
def register_method(name=None):
    def decorator(method):
        # The next line assumes the decorated method is bound
        # (of course it isn't bound at this point).
        cls = method.__class__  # I don't really understand this.
        cls.my_attr = 'FOO BAR'
        def wrapper(*args, **kwargs):
            method(*args, **kwargs)
        return wrapper
    return decorator
The point of putting my broken code above when I already know why it's broken is that it conveys what I'm trying to do.
I don't think you can do what you want to do with a decorator (quick edit: with a decorator of the method, anyway). The decorator gets called when the method gets constructed, which is before the class is constructed. The reason your code isn't working is because the class doesn't exist when the decorator is called.
jldupont's comment is the way to go: if you want to set an attribute of the class, you should either decorate the class or use a metaclass.
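For instance, a class decorator (hypothetical names) avoids the timing problem entirely, because the class object already exists by the time the decorator runs:

```python
def tag_class(cls):
    # The class object exists at this point, so setting an
    # attribute on it is straightforward.
    cls.my_attr = 'FOO BAR'
    return cls

@tag_class
class Foo:
    def bar(self):
        pass

print(Foo.my_attr)  # prints: FOO BAR
```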
EDIT: okay, having seen your comment, I can think of a two-part solution that might work for you. Use a decorator of the method to set an attribute of the method, and then use a metaclass to search for methods with that attribute and set the appropriate attribute of the class:
def TaggingDecorator(method):
    "Decorate the method with an attribute to let the metaclass know it's there."
    method.my_attr = 'FOO BAR'
    return method  # No need for a wrapper; we haven't changed
                   # what the method actually does. Your mileage may vary.

class TaggingMetaclass(type):
    "Metaclass to check for tags from TaggingDecorator and add them to the class."
    def __new__(cls, name, bases, dct):
        # Check for tagged members
        has_tag = False
        for member in dct.values():
            if hasattr(member, 'my_attr'):
                has_tag = True
                break
        if has_tag:
            # Set the class attribute
            dct['my_attr'] = 'FOO BAR'
        # Now let 'type' actually allocate the class object and go on with life
        return type.__new__(cls, name, bases, dct)

That's it. Use as follows:

class Foo(metaclass=TaggingMetaclass):
    # In Python 2, inherit from object and set __metaclass__ = TaggingMetaclass instead
    pass

class Baz(Foo):
    "It's enough for a base class to have the right metaclass"
    @TaggingDecorator
    def Bar(self):
        pass

>>> Baz.my_attr
'FOO BAR'
Honestly, though? Use the supported_methods = [...] approach. Metaclasses are cool, but people who have to maintain your code after you will probably hate you.
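For reference, the supported_methods approach is nothing more than an explicit class attribute (a sketch with made-up names):

```python
class Handler:
    # Explicit registry of exposed methods: boring, but obvious
    # to whoever maintains the code after you.
    supported_methods = ['get', 'post']

    def get(self):
        return 'GET response'

    def post(self):
        return 'POST response'

    def _helper(self):
        # Not listed in supported_methods, so not exposed.
        pass
```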
Rather than use a metaclass, in python 2.6+ you should use a class decorator. You can wrap the function and class decorators up as methods of a class, like this real-world example.
I use this example with djcelery; the important aspects for this problem are the "task" method and the line "args, kw = self.marked[klass.__dict__[attr]]", which implicitly checks for "klass.__dict__[attr] in self.marked". If you want to use @methodtasks.task instead of @methodtasks.task() as a decorator, you could remove the nested def and use a set instead of a dict for self.marked. The use of self.marked, instead of setting a marking attribute on the function as the other answer did, allows this to work for classmethods and staticmethods which, because they use slots, won't allow setting arbitrary attributes. The downside of doing it this way is that the function decorator MUST go above other decorators, and the class decorator MUST go below, so that the functions are not modified / re-wrapped between one and the other.
class DummyClass(object):
    """Just a holder for attributes."""
    pass

class MethodTasksHolder(object):
    """Register tasks with class AND method decorators, then use as a dispatcher, like so:

    methodtasks = MethodTasksHolder()

    @methodtasks.serve_tasks
    class C:
        @methodtasks.task()
        # @other_decorators_come_below
        def some_task(self, *args):
            pass

        @methodtasks.task()
        @classmethod
        def classmethod_task(self, *args):
            pass

        def not_a_task(self):
            pass

    # ...later
    methodtasks.C.some_task.delay(c_instance, *args)  # always treat as unbound
    # analogous to c_instance.some_task(*args) (or C.some_task(c_instance, *args))
    # ...
    methodtasks.C.classmethod_task.delay(C, *args)  # treat as unbound classmethod!
    # analogous to C.classmethod_task(*args)
    """
    def __init__(self):
        self.marked = {}

    def task(self, *args, **kw):
        def mark(fun):
            self.marked[fun] = (args, kw)
            return fun
        return mark

    def serve_tasks(self, klass):
        setattr(self, klass.__name__, DummyClass())
        for attr in klass.__dict__:
            try:
                args, kw = self.marked[klass.__dict__[attr]]
                # `task` here is the celery task decorator, assumed imported
                setattr(getattr(self, klass.__name__), attr,
                        task(*args, **kw)(getattr(klass, attr)))
            except KeyError:
                pass
        # reset for next class
        self.marked = {}
        return klass
I'm using the base class constructor as a factory, changing the class in this constructor/factory to select the appropriate subclass -- is this approach good Python practice, or are there more elegant ways?
I've tried to read the help on metaclasses, but without much success.
Here is an example of what I'm doing.
class Project(object):
    "Base class and factory."
    def __init__(self, url):
        if is_url_local(url):
            self.__class__ = ProjectLocal
        else:
            self.__class__ = ProjectRemote
        self.url = url
class ProjectLocal(Project):
    def do_something(self):
        # do the stuff locally in the dir pointed to by self.url
        ...

class ProjectRemote(Project):
    def do_something(self):
        # do the stuff communicating with the remote server pointed to by self.url
        ...
With this code, I can create an instance of ProjectLocal/ProjectRemote via the base class Project:
project = Project('http://example.com')
project.do_something()
I know that an alternative is to use a factory function that returns the class object based on the url; then the code looks similar:
def project_factory(url):
    if is_url_local(url):
        return ProjectLocal(url)
    else:
        return ProjectRemote(url)

project = project_factory(url)
project.do_something()
Is my first approach just a matter of taste, or does it have hidden pitfalls?
You shouldn't need metaclasses for this. Take a look at the __new__ method. This will allow you to take control of the creation of the object, rather than just the initialisation, and so return an object of your choosing.
class Project(object):
    "Base class and factory."
    def __new__(cls, url):
        # object.__new__ takes no extra arguments, so pass only the class
        if is_url_local(url):
            return super(Project, cls).__new__(ProjectLocal)
        else:
            return super(Project, cls).__new__(ProjectRemote)

    def __init__(self, url):
        self.url = url
I would stick with the factory function approach. It's very standard python and easy to read and understand. You could make it more generic to handle more options in several ways such as by passing in the discriminator function and a map of results to classes.
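A generic version of that idea might look like this (a sketch; is_url_local and the class bodies are assumptions standing in for the question's code):

```python
def make_factory(discriminator, class_map):
    """Build a factory from a discriminator function and a key -> class map."""
    def factory(url):
        return class_map[discriminator(url)](url)
    return factory

class ProjectLocal:
    def __init__(self, url):
        self.url = url

class ProjectRemote:
    def __init__(self, url):
        self.url = url

def is_url_local(url):  # assumed helper from the question
    return url.startswith('file://')

project_factory = make_factory(
    lambda url: 'local' if is_url_local(url) else 'remote',
    {'local': ProjectLocal, 'remote': ProjectRemote},
)
```

Adding a new project type is then just another entry in the map, with no changes to the factory itself.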
If the first example works, it's more by luck than by design. What if you wanted to have an __init__ defined in your subclass?
The following links may be helpful:
http://www.suttoncourtenay.org.uk/duncan/accu/pythonpatterns.html#factory
http://code.activestate.com/recipes/86900/
In addition, as you are using new style classes, using __new__ as the factory function (and not in a base class, a separate class is better) is what is usually done (as far as I know).
A factory function is generally simpler (as other people have already posted)
In addition, it isn't a good idea to set the __class__ attribute the way you have done.
I hope you find the answer and the links helpful.
All the best.
Yeah, as mentioned by @scooterXL, a factory function is the best approach in that case, but I'd like to note a case for factories as classmethods.
Consider the following class hierarchy:
class Base(object):
    def __init__(self, config):
        """Initialize Base object with config as dict."""
        self.config = config

    @classmethod
    def from_file(cls, filename):
        config = read_and_parse_file_with_config(filename)
        return cls(config)

class ExtendedBase(Base):
    def behaviour(self):
        pass  # do something specific to ExtendedBase
Now you can create Base objects from config dict and from config file:
>>> Base({"k": "v"})
>>> Base.from_file("/etc/base/base.conf")
But also, you can do the same with ExtendedBase for free:
>>> ExtendedBase({"k": "v"})
>>> ExtendedBase.from_file("/etc/extended/extended.conf")
So, this classmethod factory can be also considered as auxiliary constructor.
I usually have a separate factory class for this. That way you don't have to use metaclasses or assignments to self.__class__.
I also try to avoid putting the knowledge about which classes are available for creation into the factory. Rather, I have all the available classes register themselves with the factory during module import. They give their class, plus some information about when to select it, to the factory (this could be a name, a regex, or a callable, e.g. a class method of the registering class).
This works very well for me and also implements things like encapsulation and information hiding.
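A minimal sketch of that self-registration pattern (the names are mine, not from the answer) could look like:

```python
class ProjectFactory:
    """Factory that knows nothing about the concrete classes up front."""
    def __init__(self):
        self._registry = []  # list of (predicate, class) pairs

    def register(self, predicate):
        # Used as a class decorator: records (predicate, class) at import time.
        def decorator(cls):
            self._registry.append((predicate, cls))
            return cls
        return decorator

    def create(self, url):
        for predicate, cls in self._registry:
            if predicate(url):
                return cls(url)
        raise ValueError('no class registered for %r' % url)

factory = ProjectFactory()

# Each class registers itself when its module is imported.
@factory.register(lambda url: url.startswith('file://'))
class ProjectLocal:
    def __init__(self, url):
        self.url = url

@factory.register(lambda url: url.startswith('http'))
class ProjectRemote:
    def __init__(self, url):
        self.url = url
```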
I think the second approach using a factory function is a lot cleaner than making the implementation of your base class depend on its subclasses.
Adding to #Brian's answer, the way __new__ works with *args and **kwargs would be as follows:
class Animal:
    def __new__(cls, subclass: str, name: str, *args, **kwargs):
        if subclass.upper() == 'CAT':
            return super(Animal, cls).__new__(Cat)
        elif subclass.upper() == 'DOG':
            return super(Animal, cls).__new__(Dog)
        raise NotImplementedError(f'Unsupported subclass: "{subclass}"')

class Dog(Animal):
    def __init__(self, name: str, *args, **kwargs):
        self.name = name
        print(f'Created Dog "{self.name}"')

class Cat(Animal):
    def __init__(self, name: str, *args, num_whiskers: int = 5, **kwargs):
        self.name = name
        self.num_whiskers = num_whiskers
        print(f'Created Cat "{self.name}" with {self.num_whiskers} whiskers')
sir_meowsalot = Animal(subclass='Cat', name='Sir Meowsalot')
shadow = Animal(subclass='Dog', name='Shadow')