Python metaprogramming paradigm - is there a 'standard'?

I am trying to get my arms around the idiom - if there is one - related to Python metaprogramming. This comes up, for example, with serializers.
For example, the Django REST framework points out you would be tempted to do something like this, which seems straightforward:
from datetime import datetime

class Comment(object):
    def __init__(self, email, content, created=None):
        self.email = email
        self.content = content
        self.created = created or datetime.now()

comment = Comment(email='leila@example.com', content='foo bar')
However, with little explanation, they then show how it is done in their 'idiom', as I will call it, and it looks like this:
from rest_framework import serializers

class CommentSerializer(serializers.Serializer):
    email = serializers.EmailField()
    content = serializers.CharField(max_length=200)
    created = serializers.DateTimeField()
I get lost on why one would not have the code look more like this, as it is purely a design choice:
class CommentSerializer(serializers.Serializer):
    def __init__(self):
        self.email = serializers.EmailField()
        self.content = serializers.CharField(max_length=200)
        self.created = serializers.DateTimeField()
with some appropriate getter/setter functions.
I have actually looked at the source code and can't identify whether there is a method to the madness. When instantiated, a 'comment' based on this approach involves endless introspection using magic methods (to see what was declared in the class) and lots of byzantine code shaped like:
class ListSerializer(BaseSerializer):
    child = None
    many = True
    default_error_messages = {
        'not_a_list': _('Expected a list of items but got type "{input_type}".'),
        'empty': _('This list may not be empty.')
    }

    def __init__(self, *args, **kwargs):
        self.child = kwargs.pop('child', copy.deepcopy(self.child))
        self.allow_empty = kwargs.pop('allow_empty', True)
        assert self.child is not None, '`child` is a required argument.'
        assert not inspect.isclass(self.child), '`child` has not been instantiated.'
        super(ListSerializer, self).__init__(*args, **kwargs)
        self.child.bind(field_name='', parent=self)

    def bind(self, field_name, parent):
        super(ListSerializer, self).bind(field_name, parent)
        self.partial = self.parent.partial

    def get_initial(self):
        if hasattr(self, 'initial_data'):
            return self.to_representation(self.initial_data)
        return []
This code is so dense as to be nearly impossible to decipher.
My question is really: is this just a one-off that works, or is there a pattern here that is useful to master and replicate? I have a hard time seeing a larger paradigm or idiom. If anything, it seems to be a very byzantine use of introspection, metaclasses, and dynamic methods to accomplish a stylized outcome, and it is hard to see how to apply the approach more generally.
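For reference, the declarative style DRF uses is the same 'collect class attributes at class-creation time' idiom found in Django's own forms and models. A minimal sketch of the idea - my own illustration, not DRF's actual implementation - looks like this:

from itertools import count

class Field(object):
    # A shared counter lets us recover the order fields were declared in.
    _creation_counter = count()

    def __init__(self):
        self._order = next(Field._creation_counter)

class DeclarativeMeta(type):
    def __new__(mcs, name, bases, namespace):
        cls = super().__new__(mcs, name, bases, namespace)
        # Pull every Field instance out of the class body, in declaration order.
        cls._declared_fields = sorted(
            ((key, value) for key, value in namespace.items()
             if isinstance(value, Field)),
            key=lambda item: item[1]._order)
        return cls

class Serializer(metaclass=DeclarativeMeta):
    pass

class CommentSerializer(Serializer):
    email = Field()
    content = Field()

print([name for name, _ in CommentSerializer._declared_fields])
# prints ['email', 'content']

The payoff is that a subclass describes its schema as data, and the base class can drive validation and serialization from _declared_fields without the subclass writing any imperative code.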

Related

Use outer class instance as self in inner class?

I'm writing a wrapper for the Gmail API. In this wrapper, I am trying to include subattributes in the "main class" so that it more closely follows the structure of the API (e.g. users.messages.get).
Previously, I used methods such as:
class Foo:
    def __init__(self, ...):
        # add some attributes

    def get_method(self, ...):
        return some_stuff
This allows me to do foo.get_method(...). To follow the Gmail API, I try to do:
class Foo:
    def __init__(self, ...):
        # add some attributes

    @property
    def method(self):
        class _Method:
            @staticmethod
            def get(self, ...):
                return some_stuff
        return _Method()
Which allows me to do foo.method.get(...). The above has some problems: it redefines the class on every access, and I have to add @staticmethod above every method. I do realise that I could create the class at the outer class level and set a hidden variable for each, which .method then returns or creates, but this seems like too much of a workaround.
tldr: Is it possible to make the instance passed to the inner class as self be the instance of the outer class (I do not wish to have to pass the attributes of the outer class to each inner class).
Instead of sharing the self parameter between classes, you are probably better off just passing the things you need to the constructor of the class you instantiate.
class Messages:
    def __init__(self, name):
        self.name = name

    def method(self, other_arg):
        return self.name + other_arg

class Test:
    name = "hi"

    def __init__(self):
        self.messages = Messages(name=self.name)
If you need to pass a lot of information to the constructor and it starts becoming unwieldy, you can do something like split the shared code into a third class, and then pass that between the Test and Messages classes as a single object.
In Python there are all sorts of clever things that you can do with metaclasses and magic methods, but in 99% of cases just refactoring things into different classes and functions will get you more readable and maintainable code.
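That said, if you do want attribute access itself to hand the outer instance to the inner object, the descriptor protocol is the mechanism Python provides for exactly this kind of binding. A rough sketch - my own illustration, not anything from the Gmail client libraries:

class Namespace:
    # Descriptor that instantiates inner_cls bound to the outer instance.
    def __init__(self, inner_cls):
        self.inner_cls = inner_cls

    def __get__(self, instance, owner):
        # Attribute access on an instance passes that instance to the inner class.
        return self.inner_cls(instance)

class Messages:
    def __init__(self, outer):
        self.outer = outer

    def get(self):
        return "messages for " + self.outer.name

class Foo:
    messages = Namespace(Messages)

    def __init__(self, name):
        self.name = name

foo = Foo("leila")
print(foo.messages.get())  # messages for leila

This avoids redefining the class on every access, though it still builds a fresh Messages wrapper per lookup; caching it on the instance, as the lazy messages() method in the answer below does, avoids that too.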
Users should have an instance of Messages, which exposes a get method. A sketch of the code:
class Messages:
    ...
    def get(self):
        ...

class Users:
    ...
    messages = Messages(...)

which allows:

users = Users()
users.messages.get()
The bad thing about this API is the plural names, which are a bad sign for a class. If done from scratch you would rather have classes User and Message, which make more sense.
If you take a closer look at the GET/POST calls in the API you linked, you will notice the URLs look like UserId/settings, another hint to implement a User class, not Users.
self in the methods reference the self of the outer class
Maybe this is what you want: factory-method.
Although the example code I'll provide below might be similar to the answers already provided, and the factory-method link above might satisfy your wish, it is formed slightly differently, so I'll still offer my take on what you asked. The code is self-explanatory.
class User:
    def __init__(self, pk, name):
        self.pk = pk
        self.name = name
        self._messages = None

    def messages(self):
        # Create the Messages object lazily, on first access.
        if self._messages is None:
            self._messages = Messages(self.pk)
        return self._messages

class Messages:
    def __init__(self, usr):
        self.usr = usr

    def get(self):
        return self._grab_data()

    def _grab_data(self):
        # grab the data from DB
        if self.usr == 1:
            print('All messages of usr 1')
        elif self.usr == 2:
            print('All messages of usr 2')
        elif self.usr == 3:
            print('All messages of usr 3')

one = User(1, 'One')
two = User(2, 'Two')
three = User(3, 'Three')

one.messages().get()
two.messages().get()
three.messages().get()
The messages method approach would work the same way for labels, history, etc.
Edit: I'll give it one more try to understand what you want to achieve, even though you said:
I have tried numerous things with defining the classes outside of the container class [...]
I don't know whether you tried inheritance; your inner class me doesn't quite represent anything here, but it looks like you want to make use of its functionality somehow. You said as well:
self in the methods reference the self of the outer class
This sounds to me like you want inheritance in the end. Then the way to go would be (an approximate idea using inheritance):
class me(object):
    def __init__(self):
        self.__other_arg = None  # private and hidden variable

    # setter and getter methods
    def set_other_arg(self, new_other_arg):
        self.__other_arg = new_other_arg

    def get_other_arg(self):
        return self.__other_arg

class Test(me):
    name = 'Class Test'

    @property
    def message(self):
        other_arg = self.get_other_arg()
        if other_arg is not None:
            return '{} {}'.format(self.name, other_arg)
        else:
            return self.name

t = Test()
t.set_other_arg('said Hello')
print(t.message)
# output >>> Class Test said Hello
I think this could be preferable to your inner-class approach; that's my opinion, you'll decide. One side note: look up getters and setters in Python; it might help if you want to stick with the inheritance idea.

Django inheritance and polymorphism with proxy models

I'm working on a Django project that I did not start and I am facing a problem of inheritance.
I have a big model (simplified in the example) called MyModel that is supposed to represents different kind of items.
All the instance objects of MyModel should have the same fields, but the behaviour of the methods varies a lot depending on the item type.
Up to this moment this has been designed using a single MyModel field called item_type.
Then methods defined in MyModel check this field and perform different logic using multiple if branches:
def example_method(self):
    if self.item_type == TYPE_A:
        do_this()
    if self.item_type == TYPE_B1:
        do_that()
Even more, some of the sub-types have many things in common, so let's say the subtypes B and C represent a first level of inheritance.
These types then have sub-types, e.g. B1, B2, C1, C2 (better explained in the example code below).
I would say that's not the best approach to polymorphism.
Now I want to change these models to use real inheritance.
Since all submodels have the same fields, I think multi-table inheritance is unnecessary. I was thinking of using proxy models, because only their behaviour should change depending on their type.
This is a pseudo-solution I came up with:
ITEM_TYPE_CHOICES = (
    (TYPE_A, _('Type A')),
    (TYPE_B1, _('Type B1')),
    (TYPE_B2, _('Type B2')),
    (TYPE_C1, _('Type C1')),
    (TYPE_C2, _('Type C2')))

class MyModel(models.Model):
    item_type = models.CharField(max_length=12, choices=ITEM_TYPE_CHOICES)

    def common_thing(self):
        pass

    def do_something(self):
        pass

class ModelA(MyModel):
    class Meta:
        proxy = True

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.item_type = TYPE_A

    def do_something(self):
        return 'Hola'

class ModelB(MyModel):
    class Meta:
        proxy = True

    def common_thing(self):
        pass

class ModelB1(ModelB):
    class Meta:
        proxy = True

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.item_type = TYPE_B1

    def do_something(self):
        pass

class ModelB2(ModelB):
    class Meta:
        proxy = True

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.item_type = TYPE_B2

    def do_something(self):
        pass
This might work if we already know the type of the object we are working on.
Let's say we want to instantiate a MyModel object of type C1 then we could simply instantiate a ModelC1 and the item_type would be set up correctly.
The problem is: how do we get the correct proxy model from generic MyModel instances?
The most common case is when we get a queryset result: MyModel.objects.all(), all these objects are instances of MyModel and they don't know anything about the proxies.
I've seen different solutions around, like django-polymorphic, but as I understand it that relies on multi-table inheritance, doesn't it?
Several SO answers and custom solutions I've seen:
https://stackoverflow.com/a/7526676/1191416
Polymorphism in Django
http://anthony-tresontani.github.io/Python/2012/09/11/django-polymorphism/
https://github.com/craigds/django-typed-models
Creating instances of Django proxy models from their base class
but none of them convinced me 100%.
Considering this might be a common scenario, has anyone come up with a better solution?
When you use django-polymorphic in your base model, you'll get this casting behavior for free:
class MyModel(PolymorphicModel):
    pass

Each model that extends it (proxy model or concrete model) will be cast back to that model when you do MyModel.objects.all().
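Hypothetical usage, assuming django-polymorphic is installed and the subclasses extend MyModel:

# Each row comes back as its most specific subclass, not as a bare MyModel.
for obj in MyModel.objects.all():
    print(type(obj).__name__, obj.do_something())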
I have little experience with model proxies so I can't tell whether this would work properly (without breaking anything, I mean) nor how complicated it might be, but you could use an item_type: ProxyClass mapping and override your model's queryset (or provide a second manager with a custom queryset, etc.) that looks up this mapping and instantiates the correct proxy model.
BTW you may want to look at django.db.models.base.Model.from_db, which (from a very quick glance at the source code) seems to be the method the queryset machinery calls to instantiate models. Just overriding this method might be enough to solve the problem - but here again it might also break something...
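For illustration, a hedged sketch of that from_db idea (untested; PROXY_CLASS_MAP stands for a hypothetical item_type -> proxy class dict like the one built in the answer below):

class MyModel(models.Model):
    item_type = models.CharField(max_length=12, choices=ITEM_TYPE_CHOICES)

    @classmethod
    def from_db(cls, db, field_names, values):
        instance = super().from_db(db, field_names, values)
        # Swap in the registered proxy class for this row's item_type, if any.
        proxy_class = PROXY_CLASS_MAP.get(instance.item_type)
        if proxy_class is not None:
            instance.__class__ = proxy_class
        return instance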
I came up with a custom solution inspired by this SO answer and this blog post:
from django.db import models
from django.dispatch.dispatcher import receiver
from django.utils.translation import ugettext_lazy as _

ITEM_TYPE_CHOICES = (
    (TYPE_A, _('type_a')),
    (TYPE_B1, _('type_b1')),
    (TYPE_B2, _('type_b2')),
    (TYPE_C1, _('type_c1')),
    (TYPE_C2, _('type_c2')),
)

class MyModel(models.Model):
    item_type = models.CharField(max_length=12, choices=ITEM_TYPE_CHOICES)
    description = models.TextField(blank=True, null=True)

    def common_thing(self):
        pass

    def do_something(self):
        pass

    # ****************
    # Hacking Django *
    # ****************
    PROXY_CLASS_MAP = {}  # We don't know this yet

    @classmethod
    def register_proxy_class(cls, item_type):
        """Class decorator for registering subclasses."""
        def decorate(subclass):
            cls.PROXY_CLASS_MAP[item_type] = subclass
            return subclass
        return decorate

    def get_proxy_class(self):
        return self.PROXY_CLASS_MAP.get(self.item_type, MyModel)

# REGISTER SUBCLASSES

@MyModel.register_proxy_class(TYPE_A)
class ModelA(MyModel):
    class Meta:
        proxy = True

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.item_type = TYPE_A

    def do_something(self):
        pass

# No need to register this, it's never instantiated directly
class ModelB(MyModel):
    class Meta:
        proxy = True

    def common_thing(self):
        pass

@MyModel.register_proxy_class(TYPE_B1)
class ModelB1(ModelB):
    class Meta:
        proxy = True

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.item_type = TYPE_B1

    def do_something(self):
        pass

@MyModel.register_proxy_class(TYPE_B2)
class ModelB2(ModelB):
    class Meta:
        proxy = True

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.item_type = TYPE_B2

    def do_something(self):
        pass

# USING SIGNAL TO CHANGE `__class__` AT RUNTIME

@receiver(models.signals.post_init, sender=MyModel)
def update_proxy_object(sender, **kwargs):
    instance = kwargs['instance']
    if hasattr(instance, "get_proxy_class") and not instance._meta.proxy:
        proxy_class = instance.get_proxy_class()
        if proxy_class is not None:
            instance.__class__ = proxy_class
I'm using the register_proxy_class decorator to register each subclass after MyModel has been declared; otherwise I would have needed to explicitly declare a {type: subclass} map inside MyModel.
This would have been bad because:
- at declaration time we can't reference any of the proxy subclasses from MyModel (we could solve this with string names)
- the parent would be aware of its subclasses, which breaks OOP principles.
How it works:
Using the @register_proxy_class(type) decorator, each subclass registers itself, creating an entry in the MyModel.PROXY_CLASS_MAP dict when the module is loaded.
Then update_proxy_object is executed whenever MyModel dispatches a post_init signal. It changes the __class__ of MyModel instances at runtime to select the right proxy subclass.
So basically:
# a1: MyModel dispatches a post_init signal -> update_proxy_object sets the instance's __class__ = ModelA.
# ModelA.__init__ is NOT called.
a1 = MyModel(item_type=TYPE_A)
isinstance(a1, MyModel)  # True
isinstance(a1, ModelA)   # True

# a2: calls ModelA.__init__, which calls the parent MyModel.__init__ and then sets up the item_type for us.
a2 = ModelA()  # <- no need to pass item_type
isinstance(a2, MyModel)  # True
isinstance(a2, ModelA)   # True

# Using a custom manager of MyModel that returns all objects having item_type == 'TYPE_B1':
b1 = MyModel.objects.b1()[0]  # get the first one
isinstance(b1, ModelB1)  # True
isinstance(b1, ModelB)   # True
isinstance(b1, MyModel)  # True
isinstance(b1, ModelA)   # False
It seems to work so far, but I will experiment a bit more for possible problems I haven't thought about.
Cool!

python code optimization: creating dynamic variables in a loop inside a class for django / wagtail

I am trying to optimize internationalization for my Django (Wagtail) site.
I have a model which creates fields for the CMS that allow translation into different languages:
from internationalization.translatedfield import TranslatedField

class BlogPage(Page):
    body_en = RichTextField(blank=True)
    body_fr = RichTextField(blank=True)
    body = TranslatedField(
        'body_en',
        'body_fr'
    )

    content_panels = Page.content_panels + [
        FieldPanel('body_en', classname="full"),
        FieldPanel('body_fr', classname="full"),
    ]
The TranslatedField import just allows us to use a single variable name ('body') in our Django template:
from django.utils import translation

class TranslatedField(object):
    def __init__(self, en_field, fr_field):
        self.en_field = en_field
        self.fr_field = fr_field

    def __get__(self, instance, owner):
        if translation.get_language() == 'fr':
            return getattr(instance, self.fr_field)
        else:
            return getattr(instance, self.en_field)
And in the template:
{{ page.body|richtext }}
All this allows the CMS user to input text for 'body_fr' and 'body_en' fields, and, based on the URL the visitor was at, output either the french or english translations (e.g. site.com/en/blog or site.com/fr/blog).
The problem is that there could potentially be dozens of languages and dozens of fields, so you can see how this model could get very large. What I'd like to do is dynamically create those per-language fields in a loop.
So the TranslatedField import might look something like:
class TranslatedField(object):
    def __init__(self, field, languages):
where field would be, in this example, body and languages would be an array of languages we want to support for this field: ['en','fr'].
Then I would loop through that array (for language in languages) and somehow set a <language>_field attribute for each one, and in __get__(self, instance, owner) somehow loop through the languages again and say: if translation.get_language() == language, return getattr(instance, <language>_field) (the above being pseudocode).
How do I restructure the existing model and import to more effectively render the fields that I need? I'm guessing the answer might involve using a combination of loops and dictionaries, but I haven't had a ton of success with dynamically creating and implementing these so far.
This isn't a complete answer but hopefully it will get you on the right track.
The first thing you could simplify is the construction of the TranslatedField and its __get__ method, like so:
class TranslatedField(object):
    def __init__(self, field_name, languages):
        self.field_name = field_name
        self.languages = languages

    def __get__(self, instance, owner):
        required_lang = translation.get_language()
        lang = required_lang if required_lang in self.languages else self.languages[0]
        translated_field_name = '%s_%s' % (self.field_name, lang)
        return getattr(instance, translated_field_name)
You will still need to create body_en, body_fr and so on but that's a start.
In order not to have to create all the body_xxx fields yourself, and instead write something like body = TranslatedField(field_type=RichText, field_args=None, fields_kwargs={'blank': True}, languages=['en', 'fr']), you'll have to look at contribute_to_class.
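For illustration, a hedged sketch of that contribute_to_class direction (untested, and the constructor signature is invented): Django calls contribute_to_class on any attribute of a model class body that defines it, so the descriptor can add one real column per language and then install itself under the bare name.

from django.utils import translation

class TranslatedField(object):
    def __init__(self, field_type, field_kwargs, languages):
        self.field_type = field_type
        self.field_kwargs = field_kwargs or {}
        self.languages = languages

    def contribute_to_class(self, cls, name):
        self.field_name = name
        # Add one concrete model field per language, e.g. body_en, body_fr.
        for lang in self.languages:
            cls.add_to_class('%s_%s' % (name, lang),
                             self.field_type(**self.field_kwargs))
        # Re-attach this object as a plain descriptor under the bare name.
        setattr(cls, name, self)

    def __get__(self, instance, owner):
        lang = translation.get_language()
        if lang not in self.languages:
            lang = self.languages[0]
        return getattr(instance, '%s_%s' % (self.field_name, lang))

With that, the model could shrink to body = TranslatedField(RichTextField, {'blank': True}, ['en', 'fr']), although the Wagtail content_panels would still need one FieldPanel per generated column.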
Btw, there are some translation libraries out there for Wagtail which might be worth a look.

Creating a tastypie resource for a "singleton" non-model object

I'm using tastypie and I want to create a Resource for a "singleton" non-model object.
For the purposes of this question, let's assume what I want the URL to represent is some system settings that exist in an ini file.
What this means is that:
- The fields I return for this URL will be custom created for this Resource - there is no model that contains this information.
- I want a single URL that will return the data, e.g. a GET request on /api/v1/settings.
- The data should be returned in a format similar to a detail URL - i.e., it should not have meta and objects parts. It should just contain the fields from the settings.
- It should not be possible to GET a list of such objects, nor to perform POST, DELETE or PUT (this part I know how to do, but I'm adding it here for completeness).
- Optional: it should play well with tastypie-swagger for API exploration purposes.
I got this to work, but I think my method is kind of ass-backwards, so I want to know what the common wisdom is here. What I tried so far is to override dehydrate and do all the work there. This requires me to override obj_get but leave it empty (which is kind of ugly), and also to remove the need for an id in the detail URL by overriding override_urls.
Is there a better way of doing this?
You should be able to achieve this with the following. Note I haven't actually tested this, so some tweaking may be required. A more rich example can be found in the Tastypie Docs
class SettingsResource(Resource):
    value = fields.CharField(attribute='value', help_text='setting value')

    class Meta:
        resource_name = 'setting'
        fields = ['value']
        allowed_methods = ['get']

    def detail_uri_kwargs(self, bundle_or_obj):
        kwargs = {}
        return kwargs

    def get_object_list(self, request):
        return [self.obj_get()]

    def obj_get_list(self, request=None, **kwargs):
        return [self.obj_get()]

    def obj_get(self, request=None, key=None, **kwargs):
        setting = SettingObject()
        setting.value = 'whatever value'
        return setting
The SettingObject must support the __getattr__ and __setattr__ methods. You can use this as a template:
class SettingObject(object):
    def __init__(self, initial=None):
        self.__dict__['_data'] = {}
        if initial:
            self.update(initial)

    def __getattr__(self, name):
        return self._data.get(name, None)

    def __setattr__(self, name, value):
        self.__dict__['_data'][name] = value

    def update(self, other):
        for k in other:
            self.__setattr__(k, other[k])

    def to_dict(self):
        return self._data
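For example, the stub behaves like this:

setting = SettingObject({'value': 'whatever value'})
setting.value      # 'whatever value'
setting.missing    # None - unknown attributes fall back to None
setting.to_dict()  # {'value': 'whatever value'}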
This sounds like something completely outside of Tastypie's wheelhouse. Why not have a single view somewhere decorated with @require_GET, if you want to control headers, and return an HttpResponse object with the desired payload as application/json?
The fact that your object is a singleton and all other RESTful interactions with it are prohibited suggests that a REST library is the wrong tool for this job.
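A minimal sketch of that suggestion, using JsonResponse rather than hand-rolling an HttpResponse (read_settings_ini is a hypothetical helper that parses the ini file into a dict):

from django.http import JsonResponse
from django.views.decorators.http import require_GET

@require_GET
def settings_view(request):
    # JsonResponse sets the application/json content type for us.
    return JsonResponse(read_settings_ini())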

Django polymorphism hack

I'm trying to build a sort of "single table inheritance" a.k.a. "table per hierarchy" model in Django.
Here's what I'd like to do:
class PolymorphicModel(models.Model):
    content_type = models.ForeignKey(ContentType)

    class Meta:
        abstract = True

    def __init__(self, *args, **kwargs):
        super(PolymorphicModel, self).__init__(*args, **kwargs)
        # Dynamically switch the class to the actual one
        self.__class__ = self.content_type.model_class()

    def save(self, *args, **kwargs):
        if not self.content_type:
            # Save the actual class name for the future.
            self.content_type = ContentType.objects.get_for_model(self.__class__)
        super(PolymorphicModel, self).save(*args, **kwargs)
And then the actual hierarchy:
class Base(PolymorphicModel):
    a = models.IntegerField()
    b = models.IntegerField()

    @abstractmethod
    def something(self): pass

class DerivedA(Base):
    def something(self):
        return self.a

class DerivedB(Base):
    def something(self):
        return self.b
Unfortunately I get a DoesNotExist error when constructing DerivedA(). It complains about content_type not existing.
EDIT:
Concerning my questions:
Why do I get the exception, and how do I fix it?
See my answer below: content_type is apparently not a viable name.
Is the thing that I'm trying to achieve doable this way?
Yes it is! And it works beautifully. Using class names instead of content types is also possible, which has the added value of handling proxy = True appropriately.
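A hedged sketch of that class-name variant (my own illustration, untested; it only resolves direct subclasses):

class PolymorphicModel(models.Model):
    class_name = models.CharField(max_length=100, editable=False)

    class Meta:
        abstract = True

    def __init__(self, *args, **kwargs):
        super(PolymorphicModel, self).__init__(*args, **kwargs)
        if self.class_name:
            # Map stored names to subclasses; proxy classes show up in
            # __subclasses__() like any other subclass.
            subclasses = {c.__name__: c for c in type(self).__subclasses__()}
            self.__class__ = subclasses.get(self.class_name, type(self))

    def save(self, *args, **kwargs):
        if not self.class_name:
            self.class_name = type(self).__name__
        super(PolymorphicModel, self).save(*args, **kwargs)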
Oops, well apparently content_type is a reserved name. I changed the property name to ct and it works now.
I've published my solution here:
http://djangosnippets.org/snippets/2408/
