Overriding a function in Python

I have the following base class (code shortened):
class SignupForm(GroupForm):
    username = forms.CharField(
        label = _("Username"),
        max_length = 30,
        widget = forms.TextInput()
    )

    def __init__(self, *args, **kwargs):
        super(SignupForm, self).__init__(*args, **kwargs)
        if REQUIRED_EMAIL or EMAIL_VERIFICATION or EMAIL_AUTHENTICATION:
            self.fields["email"].label = ugettext("Email")
            self.fields["email"].required = True
        else:
            self.fields["email"].label = ugettext("Email (optional)")
            self.fields["email"].required = False

    def after_signup(self, user, **kwargs):
        """
        An extension point for subclasses.
        """
        pass
What I want to do is override the after_signup() function and the username field, like so:
class CompanySignupForm(SignupForm):
    # TODO: override fields for company signup form
    username = forms.CharField(
        label = _("Username TEST"),
        max_length = 30,
        widget = forms.TextInput()
    )

    def after_signup(self, user, **kwargs):
        """
        An extension point for subclasses.
        """
        print str('after_signup has been overwritten')
My problem:
Only the username field shows the desired behavior. The overridden after_signup() function never gets called; instead, the after_signup() function of the base class SignupForm runs. What am I doing wrong?
EDIT:
the imports:
from django import forms
from django.contrib.auth.models import User
from django.utils.translation import ugettext_lazy as _, ugettext
instantiating CompanySignupForm:
url(r"^signup/$", CompanySignupForm.as_view(), name="acct_signup")
after_signup() is being called from a function in the base class:
def save(self, request=None):
    # more code here
    # ...
    self.after_signup(new_user)

Use isinstance() to check an instance’s type and .__class__ to make sure you are instantiating CompanySignupForm.
Also, you might want to create an __init__ method on CompanySignupForm to make sure it's not just the superclass being instantiated.
Note: reading your edit more closely, you're not calling after_signup directly; the base class's save() calls it, right? That will then dispatch to the instance's own after_signup method if it exists. I'd take that function out of the base class and force it to call the inherited versions.
To check which class is actually being used, run:
signup = CompanySignupForm.as_view()
print signup.__class__
url(r"^signup/$", signup, name="acct_signup")

Related

How do I pass **kwargs when instantiating an object and access it in the class's __init__() method

I'd like to pass a **kwargs dictionary when instantiating my PlayerForm objects and be able to access it in the __init__() method. This is what I've done below, but it's not working.
This is somewhere in my views.py file:
context = {'player_form': PlayerForm(kwargs={'user': request.user})}
This is in my forms.py file
from .models import Game, Player

class PlayerForm(forms.ModelForm):
    class Meta:
        model = Player
        fields = ['game', 'username']

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        if kwargs.get('user'):
            self.fields['game'].queryset = Game.objects.exclude(player__user=user)
You can unpack a dict into keyword arguments with the ** operator. Try the code below:
context = {'player_form': PlayerForm(**{'user': request.user})}
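Note that ModelForm.__init__ does not itself accept a user keyword argument, so a common pattern is to pop the custom kwarg in __init__ before calling super(). A sketch of that pattern (not the poster's exact code):

from django import forms
from .models import Game, Player  # the poster's models

class PlayerForm(forms.ModelForm):
    class Meta:
        model = Player
        fields = ['game', 'username']

    def __init__(self, *args, **kwargs):
        # Remove the custom kwarg before ModelForm's __init__ sees it.
        user = kwargs.pop('user', None)
        super().__init__(*args, **kwargs)
        if user is not None:
            self.fields['game'].queryset = Game.objects.exclude(player__user=user)

# In the view:
# context = {'player_form': PlayerForm(user=request.user)}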

How to pass username into function inside a form in django?

I have a form with a variable that calls a function to get a list of names. I need to pass the currently logged-in user into this function as a dynamic parameter.
I have spent about two days trying every solution I can think of and cannot find anything that works. I have tried to initialize a request object but cannot get that to work either.
class ManagerForm(forms.Form):
    names = get_employee_names(<<dynamic username goes here>>)
    manager = forms.ChoiceField(choices=names, widget=forms.RadioSelect)
The expected result is to pass the username as a string into the function as a parameter.
A form by itself doesn't have access to the request object and therefore can't identify which user is currently logged in. Your view should pass the current user's username instead:
views.py:
def index(request):
    # ...
    form = ManagerForm(request.POST or None, current_user_username=request.user.username)
    # ...
forms.py:
def get_employee_names(username):
    # assuming it constructs correct choices tuples, like:
    # choices = ((username, username), ('noname', 'noname'))
    return choices

class ManagerForm(forms.Form):
    manager = forms.ChoiceField(choices=[], widget=forms.RadioSelect)

    def __init__(self, *args, **kwargs):
        username = kwargs.pop('current_user_username')
        super().__init__(*args, **kwargs)
        self.fields['manager'].choices = get_employee_names(username)
See the Django documentation for a description of what Django expects choices to be.
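For completeness, get_employee_names is only stubbed above; assuming a hypothetical Employee model with a username field, it might look something like this:

# Hypothetical Employee model with a `username` field is assumed here.
from .models import Employee

def get_employee_names(username):
    # Put the current user first, then everyone else, as (value, label) tuples.
    names = [(username, username)]
    others = Employee.objects.exclude(username=username).values_list('username', flat=True)
    names += [(name, name) for name in others]
    return names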

Django inheritance and polymorphism with proxy models

I'm working on a Django project that I did not start, and I am facing an inheritance problem.
I have a big model (simplified in the example) called MyModel that is supposed to represent different kinds of items.
All instances of MyModel have the same fields, but the methods' behaviour varies a lot depending on the item type.
So far this has been handled with a single MyModel field called item_type.
Methods defined on MyModel then check this field and branch on it with multiple ifs:
def example_method(self):
    if self.item_type == TYPE_A:
        do_this()
    if self.item_type == TYPE_B1:
        do_that()
On top of that, some of the sub-types have a lot in common, so let's say the subtypes B and C represent a first level of inheritance.
These types then have sub-types, for example B1, B2, C1, C2 (better explained in the example code below).
I would say that's not the best approach to polymorphism.
Now I want to change these models to use real inheritance.
Since all submodels have the same fields, I think multi-table inheritance is not necessary. I was thinking of using proxy models, because only the behaviour should change depending on the type.
This is a pseudo-solution I came up with:
ITEM_TYPE_CHOICES = (
    (TYPE_A, _('Type A')),
    (TYPE_B1, _('Type B1')),
    (TYPE_B2, _('Type B2')),
    (TYPE_C1, _('Type C1')),
    (TYPE_C2, _('Type C2')),
)

class MyModel(models.Model):
    item_type = models.CharField(max_length=12, choices=ITEM_TYPE_CHOICES)

    def common_thing(self):
        pass

    def do_something(self):
        pass

class ModelA(MyModel):
    class Meta:
        proxy = True

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.item_type = TYPE_A

    def do_something(self):
        return 'Hola'

class ModelB(MyModel):
    class Meta:
        proxy = True

    def common_thing(self):
        pass

class ModelB1(ModelB):
    class Meta:
        proxy = True

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.item_type = TYPE_B1

    def do_something(self):
        pass

class ModelB2(ModelB):
    class Meta:
        proxy = True

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.item_type = TYPE_B2

    def do_something(self):
        pass
This might work if we already know the type of the object we are working on.
Let's say we want to instantiate a MyModel object of type C1; then we could simply instantiate ModelC1 and item_type would be set correctly.
The problem is how to get the correct proxy model from generic MyModel instances.
The most common case is a queryset result: with MyModel.objects.all(), all the returned objects are instances of MyModel and know nothing about the proxies.
I've seen various solutions around, like django-polymorphic, but as I understand it that relies on multi-table inheritance, doesn't it?
Several SO answers and custom solutions I've seen:
https://stackoverflow.com/a/7526676/1191416
Polymorphism in Django
http://anthony-tresontani.github.io/Python/2012/09/11/django-polymorphism/
https://github.com/craigds/django-typed-models
Creating instances of Django proxy models from their base class
but none of them convinced me 100%.
Considering this might be a common scenario, has anyone come up with a better solution?
When you use django-polymorphic in your base model, you'll get this casting behavior for free:
class MyModel(PolymorphicModel):
    pass
Each model that extends it (proxy or concrete) will be cast back to that model when you do MyModel.objects.all().
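A minimal sketch of that behaviour, assuming django-polymorphic is installed (note that it relies on concrete, multi-table subclasses rather than proxies, as the question mentions):

from django.db import models
from polymorphic.models import PolymorphicModel

class MyModel(PolymorphicModel):
    description = models.TextField(blank=True)

class ModelA(MyModel):
    pass

# After ModelA.objects.create(description='a'),
# MyModel.objects.all() yields ModelA instances rather than plain MyModel ones.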
I have little experience with model proxies, so I can't tell whether this would work properly (without breaking anything, I mean) nor how complicated it might be, but you could use an item_type -> ProxyClass mapping and override your model's queryset (or provide a second manager with a custom queryset, etc.) that looks up this mapping and instantiates the correct proxy model.
BTW, you may want to look at django.db.models.base.Model.from_db, which (from a very quick glance at the source code) seems to be called by the queryset machinery to instantiate models. Just overriding this method might be enough to solve the problem, but here again it might also break something...
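A rough sketch of that from_db idea (untested, as the answer says; PROXY_CLASS_MAP and ITEM_TYPE_CHOICES are assumed to be the mapping and choices from the question):

class MyModel(models.Model):
    item_type = models.CharField(max_length=12, choices=ITEM_TYPE_CHOICES)

    @classmethod
    def from_db(cls, db, field_names, values):
        instance = super().from_db(db, field_names, values)
        # PROXY_CLASS_MAP is assumed to map item_type values to proxy classes.
        proxy_class = PROXY_CLASS_MAP.get(instance.item_type)
        if proxy_class is not None:
            instance.__class__ = proxy_class
        return instance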
I came up with a custom solution inspired by this SO answer and this blog post:
from django.db import models
from django.dispatch.dispatcher import receiver
from django.utils.translation import ugettext_lazy as _

ITEM_TYPE_CHOICES = (
    (TYPE_A, _('type_a')),
    (TYPE_B1, _('type_b1')),
    (TYPE_B2, _('type_b2')),
    (TYPE_C1, _('type_c1')),
    (TYPE_C2, _('type_c2')),
)

class MyModel(models.Model):
    item_type = models.CharField(max_length=12, choices=ITEM_TYPE_CHOICES)
    description = models.TextField(blank=True, null=True)

    def common_thing(self):
        pass

    def do_something(self):
        pass

    # ****************
    # Hacking Django *
    # ****************
    PROXY_CLASS_MAP = {}  # We don't know this yet

    @classmethod
    def register_proxy_class(cls, item_type):
        """Class decorator for registering subclasses."""
        def decorate(subclass):
            cls.PROXY_CLASS_MAP[item_type] = subclass
            return subclass
        return decorate

    def get_proxy_class(self):
        return self.PROXY_CLASS_MAP.get(self.item_type, MyModel)

# REGISTER SUBCLASSES

@MyModel.register_proxy_class(TYPE_A)
class ModelA(MyModel):
    class Meta:
        proxy = True

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.item_type = TYPE_A

    def do_something(self):
        pass

# No need to register this, it's never instantiated directly
class ModelB(MyModel):
    class Meta:
        proxy = True

    def common_thing(self):
        pass

@MyModel.register_proxy_class(TYPE_B1)
class ModelB1(ModelB):
    class Meta:
        proxy = True

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.item_type = TYPE_B1

    def do_something(self):
        pass

@MyModel.register_proxy_class(TYPE_B2)
class ModelB2(ModelB):
    class Meta:
        proxy = True

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.item_type = TYPE_B2

    def do_something(self):
        pass

# USING SIGNAL TO CHANGE `__class__` AT RUNTIME
@receiver(models.signals.post_init, sender=MyModel)
def update_proxy_object(sender, **kwargs):
    instance = kwargs['instance']
    if hasattr(instance, "get_proxy_class") and not instance._meta.proxy:
        proxy_class = instance.get_proxy_class()
        if proxy_class is not None:
            instance.__class__ = proxy_class
I'm using the register_proxy_class decorator to register each subclass after MyModel has been declared; otherwise I would have needed to explicitly declare a {type: subclass} map inside MyModel.
That would have been bad:
because at declaration time we can't reference any of the proxy subclasses from MyModel (we could work around this with string names), and
because the parent would be aware of its subclasses, which breaks OOP principles.
How it works:
Using the @register_proxy_class(type) decorator, each subclass registers itself, creating an entry in the MyModel.PROXY_CLASS_MAP dict when the module is loaded.
Then update_proxy_object is executed whenever MyModel dispatches a post_init signal; it changes the __class__ of MyModel instances at runtime to select the right proxy subclass.
So basically:
# a1: MyModel dispatches a post_init signal -> `update_proxy_object` sets the instance's __class__ = ModelA
# It does NOT call ModelA.__init__
a1 = MyModel(item_type=TYPE_A)
isinstance(a1, MyModel)  # True
isinstance(a1, ModelA)   # True

# a2: calls ModelA.__init__, which calls the parent MyModel.__init__ and then sets item_type for us
a2 = ModelA()  # <- no need to pass item_type
isinstance(a2, MyModel)  # True
isinstance(a2, ModelA)   # True

# Using a custom manager of MyModel, return all objects having item_type == TYPE_B1
b1 = MyModel.objects.b1()[0]  # get the first one
isinstance(b1, ModelB1)  # True
isinstance(b1, ModelB)   # True
isinstance(b1, MyModel)  # True
isinstance(b1, ModelA)   # False
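The custom manager behind MyModel.objects.b1() isn't shown in the post; a minimal sketch of what such a manager might look like (the b1 name and queryset filter are assumptions based on the usage above):

class MyModelQuerySet(models.QuerySet):
    def b1(self):
        # Hypothetical shortcut returning only TYPE_B1 items.
        return self.filter(item_type=TYPE_B1)

class MyModel(models.Model):
    item_type = models.CharField(max_length=12, choices=ITEM_TYPE_CHOICES)
    objects = MyModelQuerySet.as_manager()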
It seems to work so far, but I will experiment a bit more for possible problems I haven't thought about.
Cool!

simplest way to override Django admin inline to request formfield_for_dbfield for each instance

I would like to provide different widgets to input form fields for the same type of model field in a Django admin inline.
I have implemented a version of the Entity-Attribute-Value paradigm in my shop application (I tried eav-django and it wasn't flexible enough). In my model it is Product-Parameter-Value (see Edit below).
Everything works as I want except that when including an admin inline for the Parameter-Value pair, the same input formfield is used for every value. I understand that this is the default Django admin behaviour because it uses the same formset for each Inline row.
I have a callback on my Parameter that I would like to use (get_value_formfield). I currently have:
class SpecificationValueAdminInline(admin.TabularInline):
    model = SpecificationValue
    fields = ('parameter', 'value')
    readonly_fields = ('parameter',)
    max_num = 0

    def get_formset(self, request, instance, **kwargs):
        """Take a copy of the instance"""
        self.parent_instance = instance
        return super().get_formset(request, instance, **kwargs)

    def formfield_for_dbfield(self, db_field, **kwargs):
        """Override admin function for requesting the formfield"""
        if self.parent_instance and db_field.name == 'value':
            # Notice first() on the end -->
            sv_instance = SpecificationValue.objects.filter(
                product=self.parent_instance).first()
            formfield = sv_instance.parameter.get_value_formfield()
        else:
            formfield = super().formfield_for_dbfield(db_field, **kwargs)
        return formfield
formfield_for_dbfield is only called once for each admin page.
How would I override the default behaviour so that formfield_for_dbfield is called once for each SpecificationValue instance, preferably passing the instance in each time?
Edit:
Here is the model layout:
class Product(Model):
    specification = ManyToManyField('SpecificationParameter',
                                    through='SpecificationValue')

class SpecificationParameter(Model):
    """Other normal model fields here"""
    type = models.PositiveSmallIntegerField(choices=TUPLE)

    def get_value_formfield(self):
        """
        Return the type of form field for this parameter instance
        with the correct widget for the value
        """

class SpecificationValue(Model):
    product = ForeignKey(Product)
    parameter = ForeignKey(SpecificationParameter)
    # To store and retrieve all types of value, overrides CharField
    value = CustomValueField()
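get_value_formfield is only stubbed in the model above; purely for illustration, with hypothetical type constants standing in for the real TUPLE choices, it could look something like this:

# Illustrative sketch only; assumes `from django import forms` and hypothetical constants.
TYPE_TEXT, TYPE_INTEGER, TYPE_BOOLEAN = 1, 2, 3

def get_value_formfield(self):
    if self.type == TYPE_INTEGER:
        return forms.IntegerField(required=False)
    if self.type == TYPE_BOOLEAN:
        return forms.BooleanField(required=False)
    return forms.CharField(required=False)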
The way I eventually solved this is by using the form = attribute of the admin inline, which skips the ModelAdmin's form generation code:
class SpecificationValueForm(ModelForm):
    class Meta:
        model = SpecificationValue

    def __init__(self, instance=None, **kwargs):
        super().__init__(instance=instance, **kwargs)
        if instance:
            self.fields['value'] = instance.parameter.get_value_formfield()
        else:
            self.fields['value'].disabled = True

class SpecificationValueAdminInline(admin.TabularInline):
    form = SpecificationValueForm
Using standard forms like this, widgets with choices (e.g. RadioSelect and CheckboxSelectMultiple) have list bullets next to them in the admin interface because the <ul> doesn't have the radiolist class. You can almost fix the RadioSelect by using AdminRadioSelect(attrs={'class': 'radiolist'}) but there isn't an admin version of the CheckboxSelectMultiple so I preferred consistency. Also there is an aligned class missing from the <fieldset> wrapper element.
Looks like I'll have to live with that!

Django Form inheritance problem

Why can't I do this?
from django import forms
from django.forms import widgets
class UserProfileConfig(forms.Form):
    def __init__(self, *args, **kwargs):
        super(UserProfileConfig, self).__init__(*args, **kwargs)
        self.tester = 'asdf'
    username = forms.CharField(label='Username', max_length=100, initial=self.tester)
More specifically, why can't the forms.CharField grab the variable tester that I set during construction?
I feel like I am missing something about the way Python handles this sort of thing...
edit :
What I am actually trying to do is this:
class UserProfileConfig(forms.Form):
    def __init__(self, request, *args, **kwargs):
        super(UserProfileConfig, self).__init__(*args, **kwargs)
        self.tester = request.session['some_var']
    username = forms.CharField(label='Username', max_length=100, initial=self.tester)
In other words, I need to grab a session variable and then set it to an initial value...
Is there any way to handle this through the __init__ or otherwise?
What you've got doesn't work because your CharField gets created, and pointed to by UserProfileConfig.username, when the class is created, not when an instance is created. self.tester doesn't exist until you call __init__ at instance creation time.
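A tiny plain-Python illustration of that timing (nothing Django-specific):

class Demo:
    # This line runs once, when the class statement itself is executed;
    # there is no instance (and no `self`) at that point.
    created_at_class_time = 'set at class definition time'

    def __init__(self):
        # Code here only runs when an instance is created.
        self.tester = 'set when Demo() is called'

print(Demo.created_at_class_time)  # available before any instance exists
print(Demo().tester)               # only exists after __init__ has run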
You can just do it this way (declare tester before the field and reference it directly, since there is no self at class level):
from django import forms
from django.forms import widgets

class UserProfileConfig(forms.Form):
    tester = 'asdf'
    username = forms.CharField(label='Username', max_length=100, initial=tester)
You could do this:
class UserProfileConfig(forms.Form):
    username = forms.CharField(label='Username', max_length=100)

def view(request):
    user_form = UserProfileConfig(initial={'username': request.session['username']})
That is the generally accepted method, but you can also do this:
class UserProfileConfig(forms.Form):
    username = forms.CharField(label='Username', max_length=100)

    def __init__(self, request, *args, **kwargs):
        super(UserProfileConfig, self).__init__(*args, **kwargs)
        self.fields['username'].initial = request.session['some_var']

def view(request):
    user_form = UserProfileConfig(request=request)
