Accessing web request globally in Tornado - python

I need to be able to access the currently executing web request in Tornado deep within my application, without passing it down through all of my methods. When a request is first received I want to assign it a trace id, and then every time a message gets logged I want to include that trace id in the log output.
Is there some global information somewhere that I can use in Tornado to identify the current request that's being processed?
Thanks!

Tornado's StackContext is the way to do this.
Here's an example: https://gist.github.com/simon-weber/7755289.
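A minimal sketch of that approach, assuming a pre-6.0 Tornado where tornado.stack_context still exists; TracedHandler, trace_context and log_with_trace are illustrative names, not Tornado APIs:

```python
import contextlib
import functools
import logging
import uuid

import tornado.web
from tornado.stack_context import StackContext  # removed in Tornado 6

# Module-level holder for the trace id of the request currently running.
_current_trace_id = None


@contextlib.contextmanager
def trace_context(trace_id):
    """Expose trace_id to any code running inside this context."""
    global _current_trace_id
    old, _current_trace_id = _current_trace_id, trace_id
    try:
        yield
    finally:
        _current_trace_id = old


class TracedHandler(tornado.web.RequestHandler):
    def _execute(self, transforms, *args, **kwargs):
        trace_id = uuid.uuid4().hex
        # Callbacks scheduled while this context is active re-enter it,
        # so _current_trace_id stays correct across async boundaries.
        with StackContext(functools.partial(trace_context, trace_id)):
            return super(TracedHandler, self)._execute(transforms, *args, **kwargs)


def log_with_trace(message):
    # Call this from anywhere deep in the application; no request is passed in.
    logging.info("[trace=%s] %s", _current_trace_id, message)
```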

Related

Separate Thread to handle specific API request in cherrypy

I am using a single-threaded CherryPy server (threadpool = 1) and I am trying to prioritise a specific request. That is, I want a particular API request to be served even if the main thread is busy. What are the possibilities to achieve this?
Any help is appreciated.

Flask/Python: from flask import request

I read the Flask docs, which say that whenever you need to access the GET variables in the URL, you can just import the request object in your current Python file.
My question is: if two users hit the same Flask app with the same URL and GET variables, how does Flask differentiate the request objects? Can someone tell me what is under the hood?
From the docs:
In addition to the request object there is also a second object called session which allows you to store information specific to a user from one request to the next. This is implemented on top of cookies for you and signs the cookies cryptographically. What this means is that the user could look at the contents of your cookie but not modify it, unless they know the secret key used for signing.
This means every user is associated with a Flask session object, which distinguishes users from each other.
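For illustration, a minimal sketch of that session object; the /visits route and the counter are invented for the example:

```python
from flask import Flask, session

app = Flask(__name__)
app.secret_key = "change-me"  # used to sign the session cookie


@app.route("/visits")
def visits():
    # Each user carries their own signed cookie, so this counter is per user
    # even though every request runs through the same view function.
    session["count"] = session.get("count", 0) + 1
    return "You have visited %d times" % session["count"]
```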
Just wanted to highlight one more fact about the request object.
As per the documentation, it is a kind of proxy to objects that are local to a specific context.
Imagine the context being the handling thread. A request comes in and the web server decides to spawn a new thread (or something else, the underlying object is capable of dealing with concurrency systems other than threads). When Flask starts its internal request handling it figures out that the current thread is the active context and binds the current application and the WSGI environments to that context (thread). It does that in an intelligent way so that one application can invoke another application without breaking.
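To make that concrete, a minimal sketch; the /greet route and the name parameter are invented for the example:

```python
from flask import Flask, request

app = Flask(__name__)


@app.route("/greet")
def greet():
    # `request` is a context-local proxy: each concurrent request (thread,
    # greenlet, etc.) resolves it to its own underlying request object, so
    # two users hitting the same URL never see each other's GET variables.
    name = request.args.get("name", "world")
    return "Hello, %s!" % name
```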

Access HttpRequest object in request_finished callback in Django

I am trying to call a function after a particular view has finished sending the response object to the user, so that the user does not have to wait for the function to be executed.
I am trying to use request_finished of the Django Signals Framework but I do not know how to access the HttpRequest object in the kwargs that Django signal sends to my callback.
Looks like the Signal object does not contain any useful information about the request.
Also, is this the best way to execute a function outside the request-response cycle? I do not want to use an advanced solution like Celery at this point in time.
That signal doesn't do what you think it does. As you can see from the handler code, the request_finished signal is sent when the request has been processed, but before the response is returned to the user. So anything that you add to that signal will still happen before the user sees any of the response.
Because of the way web servers work, there's no way to run code after the response is returned to the user. Really, the only thing to do is use something like Celery. You could knock up your own version that simulates a task queue using a DB table and have a cron job pick up items from it, but it'll be a whole lot easier just to use Celery.
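For reference, a minimal sketch of hooking request_finished; as described above, the signal's kwargs carry no HttpRequest, only the sender, and the handler still runs before the user sees the response:

```python
from django.core.signals import request_finished
from django.dispatch import receiver


@receiver(request_finished)
def on_request_finished(sender, **kwargs):
    # `sender` is the handler class; the HttpRequest itself is not passed,
    # and this still executes before the response reaches the user.
    print("request finished, sender=%r, extra kwargs=%r" % (sender, kwargs))
```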
The crosstown_traffic API of hendrix, which uses Twisted to serve Django, is designed specifically to defer logic until immediately after the Response has gone out over the wire to the client.
http://hendrix.readthedocs.org/en/latest/crosstown_traffic/

Bind arbitrary Python objects to CherryPy sessions

I'm using CherryPy to make a web-based frontend for SymPy that uses an asynchronous process library on the server side to allow for processing multiple requests at once without waiting for each one to complete. So that the frontend functions as expected, I am using one process for the entirety of each session. The client-side JavaScript sends the session id from the cookie to the server when the user submits a request. The server side currently uses a pair of lists, storing instances of a controller class in one and the corresponding session ids in the other, and it creates a new interpreter proxy and sends it the input if a non-existent session id is submitted. The only problem with this is that the proxy classes are not deleted upon the expiration of their corresponding sessions. Also, I can't see any way to retrieve the session id for which the current request is being served.
My questions about all this are: is there any way to "connect" an arbitrary object to a CherryPy session so that it gets deleted upon session expiration; is there something I am overlooking here that would greatly simplify things; and does CherryPy's multi-threading negate the problem of synchronous reading of the stdout filehandle from the child process?
You can create your own session type, derived from CherryPy's base session. Use its clean_up method to do your cleanup.
Look at cherrypy/lib/sessions.py for details and sample session implementations.
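A rough sketch of that idea, using the in-memory RamSession as the base; the interpreter_proxy session key and its terminate() call are hypothetical stand-ins for the per-session worker object:

```python
from cherrypy.lib import sessions


class ProcessAwareSession(sessions.RamSession):
    """RamSession that releases a per-session worker when the session expires."""

    def clean_up(self):
        now = self.now()
        # Release the worker bound to each session that is about to expire,
        # then let RamSession delete the expired entries as usual.
        for _id, (data, expiration_time) in list(self.cache.items()):
            if expiration_time <= now:
                proxy = data.get("interpreter_proxy")  # hypothetical key
                if proxy is not None:
                    proxy.terminate()                  # hypothetical cleanup
        super(ProcessAwareSession, self).clean_up()


# How you enable it depends on the CherryPy version: older releases resolve
# tools.sessions.storage_type by class name inside cherrypy.lib.sessions,
# newer ones accept the class directly via tools.sessions.storage_class.
sessions.ProcessAwareSession = ProcessAwareSession
```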

Request-Aware Code in Google App Engine -- os.environ?

In GAE, you can say users.get_current_user() to get the currently logged-in user implicit to the current request. This works even if multiple requests are being processed simultaneously -- the users module is somehow aware of which request the get_current_user function is being called on behalf of. I took a look into the code of the module in the development server, and it seems to be using os.environ to get the user email and other values associated to the current request.
Does this mean that every request gets an independent os.environ object?
I need to implement a service similar to users.get_current_user() that would return different values depending on the request being handled by the calling code. Assuming os.environ is the way to go, how do I know which variable names are already being used (or reserved) by GAE?
Also, is there a way to add a hook (or event handler) that gets called before every request?
As the docs say,
A Python web app interacts with the App Engine web server using the CGI protocol.
This basically means exactly one request is being served at one time within any given process (although, differently from real CGI, one process can be serially reused for multiple requests, one after the other, if it defines main functions in the various modules to which app.yaml dispatches). See also this page, and this one for documentation of the environment variables CGI defines and uses.
The hooks App Engine defines are around calls at the RPC layer, not the HTTP requests. To intercept each request before it gets served, you could use app.yaml to redirect all requests to a single .py file and perform your interception in that file's main function before redirecting (or, you could call your hook at the start of the main in every module you're using app.yaml to dispatch to).
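A rough sketch of that interception pattern under the old CGI-style webapp runtime; the MainPage handler, the URL pattern and the MYAPP_TRACE_ID variable are invented for the example:

```python
# main.py, with app.yaml routing every URL to this script.
import os
import uuid

from google.appengine.ext import webapp
from google.appengine.ext.webapp.util import run_wsgi_app


class MainPage(webapp.RequestHandler):
    def get(self):
        self.response.out.write('trace id: %s' % os.environ.get('MYAPP_TRACE_ID'))


application = webapp.WSGIApplication([('/.*', MainPage)], debug=True)


def main():
    # The "hook": runs before the request is dispatched. Since only one
    # request is served at a time per process, a per-request value can be
    # stashed in os.environ; pick a name CGI/App Engine does not already use.
    os.environ['MYAPP_TRACE_ID'] = uuid.uuid4().hex
    run_wsgi_app(application)


if __name__ == '__main__':
    main()
```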
