Difference between Model and Form validation - python

I'm currently working on a model that has already been built and I need to add some validation management (accessing two fields and checking data, nothing too dramatic).
I was wondering about the exact difference between models and forms from a validation point of view, and whether I would be able to just make a clean method raising errors, as in a form view, in a model view?
For extra knowledge, why are those two things separated?
And finally, what would you do? There are already some methods written for the model and I don't know yet if I would rewrite it to morph it into a form and simply add the clean() method, and I don't exactly know how they work.
Oh, and everything is in the admin interface; I haven't yet worked a lot on it since I started Django not so long ago.
Thanks in advance,

You should use model (field) validation to make sure the returning datatype meets your database's requirements. Usually you won't need this because django's builtin fields do this for you, so unless you've built some custom field or know what you are doing you shouldn't change things.
Form validation is where you clean the user's input, you can add a clean method for every form field by adding a clean_FIELD(self) method, e.g.
    from django import forms

    class ContactForm(forms.Form):
        # Everything as before.
        ...

        def clean_recipients(self):
            data = self.cleaned_data['recipients']
            if "fred@example.com" not in data:
                raise forms.ValidationError("You have forgotten about Fred!")

            # Always return the cleaned data, whether you have changed it or
            # not.
            return data

Before a Form's main clean method is run, it checks for a field-level clean for each of its fields.

Generally models represent business entities which may be stored in some persistent storage (usually a relational DB). Forms are used to render HTML forms which may retrieve data from users.
Django supports creating forms on the basis of models (using the ModelForm class). Forms may be used to fetch data which should be saved in persistent storage, but that's not the only case - one may use forms just to get data to be searched in persistent storage or passed to an external service, feed some application counters, test web browser engines, render some text on the basis of data entered by the user (e.g. "Hello USERNAME"), log in a user etc.
Calling save() on a model instance should guarantee that data will be saved in persistent storage if and only if the data is valid - that will provide a consistent mechanism of validation of data before saving to persistent storage, regardless of whether the business entity is to be saved after the user clicks a "Save me" button on a web page or the user executes the save() method of a model instance in the Django interactive shell.

Related

How to know if a model instance was modified by another user

I am writing a Django application where users manipulate a Model. A typical session for users is this :
they get the model ;
they do some actions (visualise the model's data, change this data...);
then, they save the model if they modified something.
But if two users are manipulating the same instance of the model, and one saves his modification after the second loaded it, I want to be able to "notify" the second that the model has changed, so he can reload it.
I could perform a get to check if there was a modification in the database every time a view is called, but it doesn't seem optimal.
I have looked at Django's signals too, but I don't know how to send a signal to users manipulating a specific instance of the model.
Do you have any ideas on how I can do it ?

Is there pyramid package for storing form related information serverside (avoid mirroring hidden fields)?

Background: I'm currently creating a project in pyramid. It uses beaker sessions and SQLAlchemy as DB-backend.
Some forms contain information in hidden fields, with the only purpose of supplying it to the view that processes the post, the user never sees them and doesn't need to. An example:
A DB-entity can be edited by the user. Since all data fields of the entity, including the name, can be edited, the ID of the entity is put in a hidden field, so the view can query the object and update it. This approach has some flaws:
The ID of my entities is no concern of my users. They should not even be aware of it.
Data being resubmitted by the client can be tinkered with. Someone might try to get access to other entities by forging a different id here.
In other scenarios there could be more mirrored data than just an id (maybe the return to url? Maybe much more somewhere). Using hidden fields for that would transmit the data to the client and back, needlessly (bandwidth) and makes it necessary to validate it.
Transmitting data over insecure channels (the client) without need is just wrong. The solution is not that complicated: store that information on the server (in session or DB), and make it accessible with a key (form-id?), similar to a session, with a session-id. Put that totally anonymous token into a hidden field. That will be the only hidden field needed in the form. Everything else would be stored on the server and be restored from the view responding to the post-request (well - I would still have my CSRF-token in there, because it's in all my post requests). This would also make it easy to give forms a timeout, since you can make the form-id expire after some hours or so.
If I recall correctly, Drupal does supply something like this by default. I would not really expect Pyramid itself to have support for this, but would imagine there must be a package doing this, using the Pyramid-supplied session object; still, I could not find any. I'm sure I could write something usable myself, but why do so if there might be something awesome out there already?
Does anyone know of such packages?
If I understand you, you want to update a model without using hidden fields on forms.
Here is how to do it
Using pyramid_simpleform and Urldispatch
    from pyramid.httpexceptions import HTTPFound
    from pyramid.view import view_config
    from pyramid_simpleform import Form
    from pyramid_simpleform.renderers import FormRenderer

    # MyModel, MyModelSchema and DBSession come from your own application
    @view_config(route_name="my_route", renderer="myrenderer.mako")
    def update(request):
        id = request.matchdict['id']
        # assuming you have an SQLAlchemy model called MyModel which you imported
        # your model has a method that gets by id
        toupdate = MyModel.get_by_id(id)
        form = Form(request, schema=MyModelSchema, obj=toupdate)
        if not toupdate:
            # you can flash a message here and redirect wherever you want
            return HTTPFound(location=request.route_url('home'))
        if 'submit' in request.POST and form.validate():
            form.bind(toupdate)
            DBSession.add(toupdate)
            DBSession.flush()
            # redirect
            return HTTPFound(location=request.route_url('home'))
        return dict(form=FormRenderer(form))
At the view, just draw your form fields without any hidden field using the form object.
    # configure your route as below
    config.add_route('my_route', '/myroute/{id}/edit')
UPDATE
To use the webhelpers tool, follow as below (using a Mako template):
    <%
    from webhelpers.html.tools import js_obfuscate
    %>
    ${js_obfuscate("<input type='hidden' name='check' value='valid' />")}
to obfuscate the data in a JavaScript tag.
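The server-side form state that the question describes, as an added example (not code from the posts above or from any Pyramid package). `FormStateStore`, the `_form_state` session key and the two helper functions are hypothetical names; the only assumption about the session is that it is dict-like, as Pyramid's `request.session` is.

```python
import secrets
import time

class FormStateStore:
    """Server-side form state: the client only ever sees an opaque token."""
    KEY = "_form_state"  # hypothetical session key

    def __init__(self, session, ttl=3600, clock=time.time):
        self.session = session  # dict-like, e.g. Pyramid's request.session
        self.ttl = ttl
        self.clock = clock

    def issue(self, **data):
        """Store data under a fresh random token and return the token."""
        forms = self.session.setdefault(self.KEY, {})
        now = self.clock()
        for old in [t for t, e in forms.items() if e["expires"] <= now]:
            del forms[old]  # drop abandoned forms so the session stays small
        token = secrets.token_urlsafe(32)
        forms[token] = {"data": data, "expires": now + self.ttl}
        self._mark_changed()
        return token

    def redeem(self, token):
        """Return the stored data once; None if unknown, reused or expired."""
        forms = self.session.get(self.KEY, {})
        entry = forms.pop(token, None) if isinstance(token, str) else None
        if entry is None:
            return None
        self._mark_changed()
        return entry["data"] if entry["expires"] > self.clock() else None

    def _mark_changed(self):
        # Pyramid sessions expose changed() for nested mutations; a dict does not.
        changed = getattr(self.session, "changed", None)
        if callable(changed):
            changed()

def edit_form_fields(session, entity_id):
    """GET side: authorise access to entity_id first, then emit one hidden field."""
    return {"form_id": FormStateStore(session).issue(entity_id=entity_id)}

def entity_id_from_post(session, post):
    """POST side: the id comes from the server, never from the request body."""
    state = FormStateStore(session).redeem(post.get("form_id"))
    return state["entity_id"] if state else None
```

Because the token lives in the requesting user's own session, a token copied from another user's form redeems to nothing, and a forged `id` parameter in the POST body is never read.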

Django - Using different models in different environments

We have a django application that is, at its core, a series of webpages with Forms which our users fill out in order. (We'll call the particular series of pages with forms on them a "flow".)
We will be white-labeling this application for a Partner -- the Partner will want to add some fields and even add some webpages with their own new Forms. This may result in a new order in which the Forms are filled out. (A new "flow", in addition to changes to existing Forms/Models or new Forms/Models.)
What is the best way to extend our existing, simple Forms-and-Models structure to use different Forms and Models depending on the running instance of the app (e.g. an environment variable)? Some things we thought about:
implement something like get_user_model for every Model and Form use in the app, which would look at the current environment
implement a more generic key-value store so that we're not bound by the current implementation's field types (i.e., have the data field name be part of the data as well)
a data model which tracks this particular environment's "flow" and which models it needs to use
subclass existing Models and Forms for each new white-label implementation
Model field injection may be what you are looking for; take a look at this article.
The approach boils down to three concepts:
Dynamically adding fields to model classes
Ensuring Django’s model system respects the new fields
Getting the load ordering correct for the above to work
Mezzanine has done a beautiful job implementing this model field injection with dynamic extra models via EXTRA_MODEL_FIELDS.

Django Save Incomplete Progress on Form

I have a Django webapp with multiple users logging in and filling in a form.
Some users may start filling in a form and lack some required data (e.g., a grant #) needed to validate the form (and before we can start working on it). I want them to be able to fill out the form and have an option to save the partial info (so another day they can log back in and complete it) or submit the full info undergoing validation.
Currently I'm using ModelForm for all the forms I use, and the Model has constraints to ensure valid data (e.g., the grant # has to be unique). However, I want them to be able to save this intermediary data without undergoing any validation.
The solution I've thought of seems rather inelegant and un-django-ey: create a "Save Partial Form" button that saves the POST dictionary, converts it to a shelf file and creates a "SavedPartialForm" model connecting the user to partial forms saved in the shelf. Does this seem sensible? Is there a better way to save the POST dict directly into the db? Or is there an add-on module that does this partial-save of a form (which seems to be a fairly common activity with webforms)?
My biggest concern with my method is I want to eventually be able to do this form-autosave automatically (say every 10 minutes) in some ajax/jquery method without actually pressing a button and sending the POST request (e.g., so the user isn't redirected off the page when autosave is triggered). I'm not that familiar with jquery and am wondering if it would be possible to do this.
Before saving:
    for field in form.fields:
        form.fields[field].required = False
then:
    form.save()
The issue is that you have multiple Forms.
Partial. Incomplete. Complete. Ready for this. Ready for that.
Indeed, you have a Form-per-stage of a workflow.
Nothing wrong with this at all.
Figure out where in the workflow you are.
Populate and present the form for the next stage.
Forms can inherit from each other to save repeating validation methods.
Place the following into your form __init__:
    for field in form.fields:
        form.fields[field].required = False
For example:
    class MySexyForm(Form):
        def __init__(self, *args, **kwargs):
            super(MySexyForm, self).__init__(*args, **kwargs)
            for field in self.fields:
                self.fields[field].required = False
Then call:
    form = MySexyForm(...)
    form.save()
However, you'll need to make sure your clean() method can handle any missing attributes by conditionally checking if they exist in cleaned_data. For example, if another form field validation relies on customer_id but your partial form has not specified one, then customer_id would not be in cleaned_data.
If this is for a model form, you could check if the value was in cleaned_data, and fall back onto instance.field if it was missing, for example:
    def clean(self):
        inst = self.instance
        # value submitted with this (possibly partial) form, if any
        customer_id_new = self.cleaned_data.get('customer_id', None)
        # value already stored on the model instance, if any
        customer_id_old = getattr(inst, 'customer_id', None) if inst else None
        customer_id = customer_id_new if customer_id_new else customer_id_old
        # ... validate using customer_id here ...
        return self.cleaned_data
Remember that the new value will almost certainly not be in the same format as the old value; for example, customer_id could actually be a RelatedField on the model instance but a pk int in the form data. Again, you'll need to handle these type differences within your clean.
This is one area where Django Forms really are lacking sadly.

Separation of ORM and validation

I use django and I wonder in what cases where model validation should go. There are at least two variants:
Validate in the model's save method and raise IntegrityError or another exception if business rules were violated
Validate data using forms and built-in clean_* facilities
From one point of view, answer is obvious: one should use form-based validation. It is because ORM is ORM and validation is completely another concept. Take a look at CharField: forms.CharField allows min_length specification, but models.CharField does not.
Ok cool, but what the hell all that validation features are doing in django.db.models? I can specify that CharField can't be blank, I can use EmailField, FileField, SlugField, validation of which is performed here, in Python, not in the RDBMS. Furthermore there is the URLField, which checks the existence of a URL involving some really complex logic.
From another side, if I have an entity I want to guarantee that it will not be saved in inconsistent state whether it came from a form or was modified/created by some internal algorithms. I have a model with name field, I expect it should be longer than one character. I have a min_age and a max_age fields also, it makes not much sense if min_age > max_age. So should I check such conditions in save method?
What are the best practices of model validation?
I am not sure if this is best practise but what I do is that I tend to validate both client side and server side before pushing the data to the database. I know it requires a lot more effort but this can be done by setting some values before use and then maintaining them.
You could also try pushing in size constraints with **kwargs into a validation function that is called before the put() call.
Your two options are two different things.
Form-based validation can be regarded as syntactic validation + convert HTTP request parameters from text to Python types.
Model-based validation can be regarded as semantic validation, sometimes using context not available at the HTTP/form layer.
And of course there is a third layer at the DB where constraints are enforced, and may not be checkable anywhere else because of concurrent requests updating the database (e.g. uniqueness constraints, optimistic locking).
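The three layers described in this answer, as an added framework-free example (plain Python and sqlite3, not Django code and not from the original posts). The `grp` table, the function names and the return labels are assumptions for illustration; the rules come from the question's name and min_age/max_age fields.

```python
import sqlite3

class ValidationError(Exception):
    pass

def form_clean(post):
    """Form layer: syntactic checks, text parameters -> Python types."""
    try:
        return {"name": post["name"].strip(),
                "min_age": int(post["min_age"]),
                "max_age": int(post["max_age"])}
    except (KeyError, ValueError) as exc:
        raise ValidationError(f"malformed input: {exc}") from exc

def model_clean(row):
    """Model layer: semantic rules that hold whoever the writer is."""
    if len(row["name"]) < 2:
        raise ValidationError("name must be longer than one character")
    if row["min_age"] > row["max_age"]:
        raise ValidationError("min_age must not exceed max_age")
    return row

def open_db():
    """Database layer: constraints that survive skipped checks and races."""
    db = sqlite3.connect(":memory:")
    db.execute("""CREATE TABLE grp (
        name    TEXT    NOT NULL UNIQUE,
        min_age INTEGER NOT NULL,
        max_age INTEGER NOT NULL,
        CHECK (min_age <= max_age))""")
    return db

def save(db, row, validate=True):
    """Return 'saved', or the name of the layer that rejected the row."""
    try:
        if validate:
            model_clean(row)
        db.execute("INSERT INTO grp VALUES (:name, :min_age, :max_age)", row)
        return "saved"
    except ValidationError:
        return "model"
    except sqlite3.IntegrityError:
        return "database"

def submit(db, post):
    """HTTP path: form layer first, then the same save() every writer uses."""
    try:
        row = form_clean(post)
    except ValidationError:
        return "form"
    return save(db, row)
```

Calling `save(db, row, validate=False)` stands in for an import script or shell session that skips the application checks: only the database constraints are left to stop a bad row.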
"but what the hell all that validation features are doing in django.db.models? "
One word: Legacy. Early versions of Django had less robust forms and the validation was scattered.
"So should I check such conditions in save method?"
No, you should use a form for all validation.
"What are the best practices of model validation?"
Use a form for all validation.
"whether it came from a form or was modified/created by some internal algorithms"
What? If your algorithms suffer from psychotic episodes or your programmers are sociopaths, then -- perhaps -- you have to validate internally-generated data.
Otherwise, internally-generated data is -- by definition -- valid. Only user data can be invalid. If you don't trust your software, what's the point of writing it? Are your unit tests broken?
There's an ongoing Google Summer of Code project that aims to bring validation to the Django model layer. You can read more about it in this presentation from the GSoC student (Honza Kral). There's also a github repository with the preliminary code.
Until that code finds its way into a Django release, one recommended approach is to use ModelForms to validate data, even if the source isn't a form. It's described in this blog entry from one of the Django core devs.
DB/Model validation
The data stored in the database must always be in a certain form/state. For example: required first name, last name, foreign key, unique constraint. This is where the logic of your app resides. No matter where you think the data comes from - it should be "validated" here and an exception raised if the requirements are not met.
Form validation
Data being entered should look right. It is ok if this data is entered differently through some other means (through admin or api calls).
Examples: length of person's name, proper capitalization of the sentence...
Example1: Object has a StartDate and an EndDate. StartDate must always be before EndDate. Where do you validate this? In the model of course! Consider a case when you might be importing data from some other system - you don't want this to go through.
Example2: Password confirmation. You have a field for storing the password in the db. However you display two fields: password1 and password2 on your form. The form, and only the form, is responsible for comparing those two fields to see that they are the same. After form is valid you can safely store the password1 field into the db as the password.
