Access HttpRequest object in request_finished callback in Django - python

I am trying to call a function after a particular view has finished sending the response to the user, so the user does not have to wait for the function to execute.
I am trying to use the request_finished signal of Django's signals framework, but I do not know how to access the HttpRequest object in the kwargs that the signal sends to my callback.
It looks like the Signal object does not contain any useful information about the request.
Also, is this the best way to execute a function outside the request-response cycle? I do not want to use a heavyweight solution like Celery at this point.

That signal doesn't do what you think it does. As you can see from the handler code, the request_finished signal is sent when the request has been processed, but before the response is returned to the user. So any receiver you attach to that signal will still run before the user sees any of the response.
Because of the way web servers work, there's no way to run code after the response is returned to the user. Really, the only thing to do is use something like Celery - you could knock together your own version that simulates a task queue using a db table, then have a cron job pick up items from the table, but it'll be a whole lot easier just to use Celery.
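To make the db-table idea concrete, here is a minimal, illustrative sketch (plain sqlite3 with an invented schema - not Celery, and not production-ready): the request handler only INSERTs a row, and a cron-driven worker later picks up and runs the pending tasks.

```python
import sqlite3

# Illustrative db-backed task queue (invented schema; not Celery):
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE tasks (id INTEGER PRIMARY KEY, name TEXT, done INTEGER DEFAULT 0)"
)

def enqueue(conn, name):
    # called from the request handler: a cheap INSERT, no actual work yet
    conn.execute("INSERT INTO tasks (name) VALUES (?)", (name,))
    conn.commit()

def run_pending(conn, handlers):
    # called from a cron job: claim and run each unfinished task
    rows = conn.execute("SELECT id, name FROM tasks WHERE done = 0").fetchall()
    for task_id, name in rows:
        handlers[name]()  # the long-running work happens here, not in the request
        conn.execute("UPDATE tasks SET done = 1 WHERE id = ?", (task_id,))
    conn.commit()
    return len(rows)
```

A real version would also need locking (so two workers don't claim the same row) and error handling - which is exactly the bookkeeping Celery does for you.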

The crosstown_traffic API of hendrix, which uses Twisted to serve Django, is designed specifically to defer logic until immediately after the Response has gone out over the wire to the client.
http://hendrix.readthedocs.org/en/latest/crosstown_traffic/

Related

Django - repeatedly send API call result via websocket on events (REST Framework + Channels)

I ran into a problem while integrating Django REST Framework with Django Channels.
I have a viewset with a retrieve (GET) method that prepares information from several different models in a tricky way and sends this "complex" result to the frontend. So when a client sends a GET request with an entity primary key to this endpoint (like /complex_entity/1), they instantly receive everything they need.
And now the guys on the frontend side want another feature - the backend should be able to send the result of this complex request to the frontend each time any of the relevant underlying models changes. Like this: the browser subscribes to changes of the ComplexEntity with primary key 1, and when ComplexEntity 1 is changed (or its linked entities, which is not a problem) the server sends the result of this complex request via websocket. So the request can be executed many times during one websocket connection (on each model-change signal).
I see two intuitive ways to provide this behaviour:
Good(?): somehow execute requests to this viewset's retrieve method from Django itself - either by calling the method internally or by executing a "loopback" HTTP request.
Bad/ugly: copy all the complex logic from the viewset's retrieve method into the websocket consumer.
I've also found Django Channels REST Framework, which allows subscribing to a model entity, but the problem is that I need to return not just a model instance but this "custom" result glued together from several models. As far as I understand, DCRF lacks that feature.
For now I don't really know the best way to solve my problem - calling the method internally looks OK, but how do I do it?
A loopback HTTP request is OK too (I think), but it should be untied from the site hostname, and sanity says it's better to forward the originator's cookies with such a request to prevent unauthorized access to entities. The question is, again, how to do that right.
So does anybody know the best way to execute the same complex request several times during one websocket connection?
The proper way would be to move the common logic into a reusable method and use it in both the DRF view and the Channels consumer.
That method will receive some arguments (I guess the ComplexEntity's ID) and return the result data in the format you need.
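As a sketch of that refactoring (the function name, fields, and inputs are made up for illustration - a real helper would query the ORM by primary key), the shared method could look like:

```python
# Hypothetical shared helper; both the DRF view and the Channels
# consumer call this instead of duplicating the "glue" logic.
def build_complex_entity_payload(entity, related_items):
    """Glue data from several models into the 'complex' response dict."""
    return {
        "id": entity["id"],
        "name": entity["name"],
        "items": [{"label": item["label"]} for item in related_items],
        "item_count": len(related_items),
    }

# In the DRF retrieve():   return Response(build_complex_entity_payload(...))
# In the consumer:         await self.send_json(build_complex_entity_payload(...))
```

The serializer/consumer code then stays thin: each transport only fetches its inputs and hands them to the one function that knows the response shape.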

Flask/Python: from flask import request

I read the Flask docs, which say that whenever you need to access the GET variables in the URL, you can just import the request object into your current Python file.
My question: if two users are hitting the same Flask app with the same URL and GET variables, how does Flask differentiate the request objects? Can someone tell me what is under the hood?
From the docs:
In addition to the request object there is also a second object called
session which allows you to store information specific to a user from
one request to the next. This is implemented on top of cookies for you
and signs the cookies cryptographically. What this means is that the
user could look at the contents of your cookie but not modify it,
unless they know the secret key used for signing.
This means every user is associated with a Flask session object, which distinguishes them from each other.
I just wanted to highlight one more fact about the request object.
As the documentation puts it, it is a kind of proxy to objects that are local to a specific context.
Imagine the context being the handling thread. A request comes in and the web server decides to spawn a new thread (or something else, the underlying object is capable of dealing with concurrency systems other than threads). When Flask starts its internal request handling it figures out that the current thread is the active context and binds the current application and the WSGI environments to that context (thread). It does that in an intelligent way so that one application can invoke another application without breaking.
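You can see the core of the context-local idea in plain Python with threading.local - this is only an analogy, since Werkzeug's actual Local objects also handle greenlets and other concurrency systems, but it shows how two concurrent handlers can read "the same" global name and get different data:

```python
import threading

# Thread-local sketch of Flask's context-local idea (an analogy,
# not Flask's implementation): each thread sees its own attributes.
_ctx = threading.local()

def handle_request(user, seen):
    _ctx.request = {"args": {"user": user}}   # bound only for this thread
    seen[user] = _ctx.request["args"]["user"]

seen = {}
threads = [threading.Thread(target=handle_request, args=(u, seen))
           for u in ("alice", "bob")]
for t in threads:
    t.start()
for t in threads:
    t.join()
# each "request" saw only its own data: seen == {"alice": "alice", "bob": "bob"}
```

In Flask, `from flask import request` imports one of these proxies; which underlying request it resolves to depends on which context (thread/greenlet) is executing.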

What’s the correct way to run a long-running task in Django whilst returning a page to the user immediately?

I’m writing a tiny Django website that’s going to provide users with a way to delete all their contacts on Flickr.
It’s mainly an exercise to learn about Selenium, rather than something actually useful — because the Flickr API doesn’t provide a way to delete contacts, I’m using Selenium to make an actual web browser do the actual deleting of contacts.
Because this might take a while, I’d like to present the user with a message saying that the deleting is being done, and then notify them when it’s finished.
In Django, what’s the correct way to return a web page to the user immediately, whilst performing a task on the server that continues after the page is returned?
Would my Django view function use the Python threading module to make the deleting code run in another thread whilst it returns a page to the user?
Consider using a task queue - the solution most favored by the Django community is Celery with RabbitMQ.
Once, when I needed this, I set up another Python process that communicated with Django via XML-RPC. That other process took care of the long requests and could report the status of each. The Django views called it (via XML-RPC) to queue jobs and query job status. I made a couple of proper JSON views in Django to query the XML-RPC process, and updated the HTML page using asynchronous JavaScript calls to those views (aka Ajax).
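As a sketch of the threading approach from the question (fine for an exercise; a task queue like Celery is more robust, since a thread dies with the process and its errors go unnoticed), the view would hand the slow work to a daemon thread and return immediately:

```python
import threading
import time

def start_background(task, *args):
    """Fire-and-forget: run task in a daemon thread and return at once.
    A Django view would call this, then return its HttpResponse
    immediately while the work continues in the background."""
    worker = threading.Thread(target=task, args=args, daemon=True)
    worker.start()
    return worker

done = threading.Event()

def slow_task(evt):
    time.sleep(0.2)        # stands in for the Selenium contact deletion
    evt.set()

worker = start_background(slow_task, done)
finished_immediately = done.is_set()   # False: the "view" did not wait
worker.join()
```

The "notify them when it's finished" part then needs the task to record its completion somewhere (a db row, the cache) that a status view or Ajax poll can read.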

Django - execute method after sending output to user

I have a long (approx. 1 sec) diagnostic method to call for every user's request. It is perfectly fine to call this method after rendering and sending the output to the user. How can I do that in Django? Any guidelines on where I can find info about it? Unfortunately, using any queue system cannot be considered.
You could use the request_finished signal and run your function at the end of every request. See https://docs.djangoproject.com/en/dev/ref/signals/#django.core.signals.request_finished

Bind arbitrary Python objects to CherryPy sessions

I'm using CherryPy to make a web-based frontend for SymPy that uses an asynchronous process library on the server side, allowing multiple requests to be processed at once without waiting for each one to complete. To let the frontend function as expected, I use one process for the entirety of each session. The client-side JavaScript sends the session id from the cookie to the server when the user submits a request. The server side currently uses a pair of lists, storing instances of a controller class in one and the corresponding session ids in the other, and creates a new interpreter proxy (and sends it the input) when a non-existent session id is submitted. The only problem is that the proxy objects are not deleted when their corresponding sessions expire. Also, I can't see any way to retrieve the session id for which the current request is being served.
My questions: is there any way to "connect" an arbitrary object to a CherryPy session so that it gets deleted upon session expiration; is there something I am overlooking here that would greatly simplify things; and does CherryPy's multi-threading negate the problem of synchronous reading of the stdout filehandle from the child process?
You can create your own session type, derived from CherryPy's base session. Use its clean_up method to do your cleanup.
Look at cherrypy/lib/sessions.py for details and sample session implementations.
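Here is a pure-Python sketch of the pattern (illustrative names, not CherryPy's actual API): keep the per-session proxy objects in one place keyed by session id, and tear them down in a clean_up hook - which is what overriding clean_up in a session class derived from those in cherrypy/lib/sessions.py amounts to.

```python
# Illustrative sketch of "bind an object to a session and clean it up
# on expiration"; names are invented, not CherryPy's API.
class InterpreterProxy:
    def __init__(self):
        self.alive = True

    def terminate(self):
        self.alive = False   # a real proxy would kill its child process

class SessionBoundStore:
    def __init__(self):
        self._proxies = {}   # session-id -> InterpreterProxy

    def proxy_for(self, session_id):
        # create the per-session interpreter proxy on first use
        return self._proxies.setdefault(session_id, InterpreterProxy())

    def clean_up(self, expired_ids):
        # called when sessions expire: delete the bound objects too
        for sid in expired_ids:
            proxy = self._proxies.pop(sid, None)
            if proxy is not None:
                proxy.terminate()
```

A single dict keyed by session id also replaces the parallel-lists arrangement from the question, which removes the need to keep two lists in sync by index.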
