Best way to handle complex data sent to tornado by ajax - python

This is a design question, not a code question. The title sucks; the problem isn't easy to describe in a few words.
I have a user database (MongoDB). I built a front-end table in JavaScript that lets an admin see all users, edit their information, add new ones, and delete them. All changes are stored in a JavaScript object until the admin clicks a Save button, at which point they are ajaxed to the server. The object looks like this:
{
    "new": [<ids of new users>],
    "deleted": [<ids to be deleted from database>],
    "edited": {<edited fields of existing users and all fields of new users>}
}
I need to send this to the server and write the changes into the database. There are several ways to do this:
Option 1: I send the whole object to the server, where it is handled by one RequestHandler. I also need separate handlers for the individual "add", "delete" and "edit" operations, because they will be needed elsewhere, so I put the actual functionality into functions that several handlers can call, to avoid repeating myself too much.
Option 2: I send three ajax requests, one each for new, deleted and edited. In this case "new" would carry the full data, not only the ids. Each request is handled by its own separate handler. This is easier on the backend, because the one big handler from option 1 doesn't have to exist, but I don't know whether making multiple ajax requests like this is a good idea.
Option 3: I ajax the one big object to one handler and then, using HTTPClient, send requests from that handler to the separate handlers so they can do their thing. This saves ajax requests and the backend implementations are a bit cleaner, but having the server send requests to itself feels dirty to me.
Option 4: Another way I haven't thought of.
What do you think?
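
For what it's worth, a minimal sketch of option 1 in Tornado could look roughly like the following; the helper functions and the way the Mongo client is reached are assumptions, not code from the question:

import json
import tornado.web

# Hypothetical shared helpers, callable from this handler and from the
# separate add/delete/edit handlers mentioned in option 1.
def add_user(db, user_id, fields):
    db.users.insert_one(dict(fields, _id=user_id))

def delete_user(db, user_id):
    db.users.delete_one({"_id": user_id})

def edit_user(db, user_id, fields):
    db.users.update_one({"_id": user_id}, {"$set": fields})

class SaveUsersHandler(tornado.web.RequestHandler):
    # Receives the one big object described above.
    def post(self):
        changes = json.loads(self.request.body)
        db = self.settings["db"]  # assumes the Mongo client was put in the app settings
        new_ids = set(changes.get("new", []))
        for user_id in changes.get("deleted", []):
            delete_user(db, user_id)
        for user_id, fields in changes.get("edited", {}).items():
            if user_id in new_ids:
                add_user(db, user_id, fields)   # "edited" holds all fields of new users
            else:
                edit_user(db, user_id, fields)  # only the changed fields
        self.write({"ok": True})

The same add_user/delete_user/edit_user functions can then back the separate handlers, so nothing is repeated.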

Related

How to call REST APIs from Django?

Not new to programming, but new to Django
Where do I place the code in my Django project that calls an external REST API?
In the simplest example: I have a form where the user enters their ZIP code and presses a "Check" button which tells if the customer's address can be serviced by the lawn care company.
Upon getting a response, I have to parse the JSON to determine whether the address is serviceable, what kind of services are available, etc.
So I need to be able to send, and read responses.
Where do I do this?
There is no database involved, so I presume I don't use a model.
I don't think the "View" is a good spot.
Where?!
Thanks!
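
One common pattern is to keep the HTTP call in a small helper module and call that helper from the view. A minimal sketch, where the service URL and the JSON field names are assumptions:

# services.py - a hypothetical helper module; URL and field names are made up
import requests

SERVICEABILITY_URL = "https://api.example.com/serviceability"

def check_zip(zip_code):
    # Ask the external API whether this ZIP code can be serviced.
    resp = requests.get(SERVICEABILITY_URL, params={"zip": zip_code}, timeout=5)
    resp.raise_for_status()
    data = resp.json()
    return data.get("serviceable", False), data.get("services", [])

# views.py - the form view just calls the helper and renders the result
from django.shortcuts import render
from .services import check_zip

def check_view(request):
    context = {}
    if request.method == "POST":
        serviceable, services = check_zip(request.POST.get("zip", ""))
        context = {"serviceable": serviceable, "services": services}
    return render(request, "check.html", context)

Keeping the call in its own module means the view stays thin, and since nothing is stored, no model is involved.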

Loading the HTML while still working in Django (probably through asynchronous functions)

I have this project that I’ve been working on for a while:
In the views.py file, I scrape a lot of information from IMDB, with each call taking around 0.3 seconds. Meanwhile, my page sits idle. I want it to load the page first and then finish up the calls.
For instance, I want it to load the recommended movies after already showing the actors and actresses that played in both. Or in an index, I want to allow the user to type and then show the options to click on.
I’ve tried Celery with Redis, but Django can’t display asynchronous tasks.
How could I do this?
As you said, Django can't do that. My advice would be to divide and conquer:
First, write a view that just displays some HTML to load a JS script that can load your data asynchronously, using the render shortcut. A front-end framework like Vue can help you to hide the parts of your layout while your data loads.
Next, write views that just return the data using the JsonResponse object in Django, for example: one view to load the recommendations, one view to load the actor list.
Use XHR requests to call your views and retrieve the information, using the Promise methods to make everything appear in sync.
Bonus: If you already have Celery in place, you can define a task that grabs all the data you need on the server, and create a view that polls the status of your task, and then call it using XHR every few milliseconds until your data (or part of it, that really depends on how you define your task) is ready.
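
A minimal sketch of the first two steps, with the slow IMDB scraping reduced to placeholder functions:

# views.py - a sketch; the scraping helpers stand in for the real IMDB calls
from django.http import JsonResponse
from django.shortcuts import render

def get_recommendations(movie_id):
    ...  # placeholder for the ~0.3 s IMDB scrape

def get_actors(movie_id):
    ...  # placeholder for another slow scrape

def movie_page(request, movie_id):
    # Step 1: return the HTML shell right away; the data arrives later via XHR
    return render(request, "movie.html", {"movie_id": movie_id})

def recommendations(request, movie_id):
    # Step 2: small data-only views
    return JsonResponse({"recommendations": get_recommendations(movie_id)})

def actors(request, movie_id):
    return JsonResponse({"actors": get_actors(movie_id)})

The page's JavaScript then fires one XHR per data view and fills in each section as its response arrives.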

Checking login status at every page load in CherryPy

I am in the midst of writing a web app in CherryPy. I have set it up to use OpenID auth, and can successfully get the user's ID/email address.
I would like to have it set so that whenever a page loads, it checks to see if the user is logged in, and if so displays some information about their login.
As I see it, the basic workflow should be like this:
Is there a userid stored in the current session? If so, we're golden.
If not, does the user have cookies with a userid and login token? If so, process them, invalidate the current token and assign a new one, and add the user information to the session. Once again, we're good.
If neither condition holds, display a "Login" link directing to my OpenID form.
Obviously, I could just include code (or a decorator) in every public page that would handle this. But that seems very... irritating.
I could also set up a default index method in each class, which would do this and then use a (page-by-page) helper method to display the rest of the content. But this seems like a nightmare when it comes to the occasional exposed method other than index.
So, my hope is this: is there a way in CherryPy to set some code to be run whenever a request is received? If so, I could use this to have it set up so that the current session always includes all the information I need.
Alternatively, is it safe to create a wrapper around the cherrypy.expose decorator, so that every exposed page also runs this code?
Or, failing either of those: I'm also open to suggestions of a different workflow. I haven't written this kind of system before, and am always open to advice.
Edit: I have included an answer below on how to accomplish what I want. However, if anybody has any workflow change suggestions, I would love the advice! Thanks all.
Never mind, folks. Turns out that this isn't so bad to do; it is simply a matter of doing the following:
Write a function that does what I want.
Make the function in to a custom CherryPy Tool, set to the before_handler hook.
Enable that tool globally in my config.
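
A minimal sketch of those three steps, with the cookie/token handling reduced to a placeholder:

import cherrypy

def check_login():
    # 1. The function that does what I want: make sure the session knows the user.
    if cherrypy.session.get('userid'):
        return  # already logged in
    userid = _user_from_login_cookie()  # placeholder: validate the cookie, rotate the token
    if userid:
        cherrypy.session['userid'] = userid
    # otherwise the templates fall back to showing the "Login" link

def _user_from_login_cookie():
    return None  # the real implementation reads the cookies and checks the token

# 2. Make the function into a custom Tool on the before_handler hook.
cherrypy.tools.check_login = cherrypy.Tool('before_handler', check_login)

# 3. Enable the tool (and sessions) globally in the config.
cherrypy.config.update({'tools.sessions.on': True,
                        'tools.check_login.on': True})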

requests using django session table

I have an application that uses the requests library to make calls to a web service. On the Django side, I keep sessions in a database table using the standard django_session backend.
I've noticed that I get a new record in that table every time the requests library fires off a call. I find this quite bizarre, since I'm not using requests sessions (explicitly) and most of my calls are single GET calls that shouldn't need any kind of persistence.
Has anybody else had this problem?
django.contrib.sessions.middleware.SessionMiddleware will create a new session for each request.
If you want to disable this behavior, consider replacing django.contrib.sessions.middleware.SessionMiddleware with a custom middleware that extends django.contrib.sessions.middleware.SessionMiddleware but prevents sessions from being created in instances where you don't want them.
This answer provides an example of how to do so. It parses request.path_info to determine whether a session should be created, but you could use a number of different techniques, such as adding a custom header to your request, including a post variable, etc.
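
A minimal sketch of such a middleware, using a path prefix as the example condition (the '/api/' prefix is an assumption):

# middleware.py - a sketch; adjust the condition to whatever marks session-free requests
from django.contrib.sessions.middleware import SessionMiddleware

class SelectiveSessionMiddleware(SessionMiddleware):
    # Skip session handling entirely for requests that never need one.
    def process_request(self, request):
        if not request.path_info.startswith('/api/'):
            super().process_request(request)

    def process_response(self, request, response):
        if not hasattr(request, 'session'):
            return response  # process_request skipped it, so there is nothing to save
        return super().process_response(request, response)

Then point the MIDDLEWARE setting at this class instead of django.contrib.sessions.middleware.SessionMiddleware.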
Turns out I had this setting turned on:
SESSION_SAVE_EVERY_REQUEST = True
When I started with sessions I wanted to be safe and make sure nothing fell through the cracks. Turns out this was writing empty session information to the db on every request.

Submitting Multiple Forms At The Same Time (Edit Profile Page)

My question I suppose is rather simple. Basically, I have a profile with many fields: name, username, profile picture, and quite a few others, each updated by its own respective page. So one page is used to update the profile picture; its form submits the data to a handler, which put()s it to the database. What I'm trying to do here is put all of the forms used to edit the profile on one single page.
Would I need one huge handler to deal with that page? When I hit 'save' at the bottom of the page, how do I avoid overwriting data that hasn't been modified? Currently, say I have 5 profile variables; they map to 5 handlers and 5 separate pages, each containing its own respective form.
Thanks.
I've used Django on most of my web apps, but the concept should be the same; I use ajax to send the data to the backend whenever the user hits submit (and the form returns false), so the user can keep editing. With ajax, you can send the data to different handlers on the backend. Also, using jQuery, you can set flags to see whether fields have been changed, to avoid sending the ajax request in the first place. Ajax requests behave almost exactly like standard HTTP requests, but I believe the header indicates AJAX.
If you're looking at strictly backend, then you will need to do multiple "if" statements on the backend and check one field at a time to see if it has been changed. On the backend you should still be able to call other handlers (passing them the same request).
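
As a rough sketch of that backend-only approach, assuming a webapp2-style handler and an ndb model (both names are made up):

import webapp2
from google.appengine.ext import ndb

class Profile(ndb.Model):
    name = ndb.StringProperty()
    username = ndb.StringProperty()
    picture_url = ndb.StringProperty()

EDITABLE_FIELDS = ('name', 'username', 'picture_url')

class EditProfileHandler(webapp2.RequestHandler):
    # One handler for the whole edit page.
    def post(self):
        profile = Profile.get_by_id(int(self.request.get('profile_id')))
        changed = False
        for field in EDITABLE_FIELDS:
            value = self.request.get(field, default_value=None)
            # Only touch fields that were actually submitted and differ
            if value is not None and value != getattr(profile, field):
                setattr(profile, field, value)
                changed = True
        if changed:
            profile.put()  # untouched fields keep their stored values
        self.redirect('/profile')

Because put() writes the whole entity, loading the existing entity and only changing the submitted fields is what keeps unmodified data from being overwritten.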
