Calling a Tornado handler in another handler - python

I want to split a Tornado handler into two handlers. In the first handler, I want to send a command to a sensor. In the second handler, I want to wait for a response from the sensor.
Is this possible, or can't I call one handler from another? If it is possible, how can I call this second handler?
Thank you very much.

I know the original post was asked years ago, but the accepted answer doesn't solve the question asked, and I believe I have the proper solution, as I had a need for it myself. Also, if some poor soul is Googling the same issue, hopefully they will find this.
from tornado.web import RequestHandler

class OneHandler(RequestHandler):
    def get(self, id):
        AnotherHandler(self.application, self.request).get(id)

Here you can call "AnotherHandler" from within "OneHandler".

Sounds like you've got a variant of the "chat" application. Your best bet is to take a look at the basic chat demo app.
The basic operating idea is to have a mixin (or globals, if you prefer) that holds a list of connections waiting on a response; when that response arrives, it triggers the callbacks on the original handlers.
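As a rough sketch of that pattern (not the demo app itself; SensorBroker, send_command_to_sensor and the handler names are made up for illustration, and a Future per waiting request is used instead of the demo's explicit callbacks):

from tornado.concurrent import Future
from tornado.web import RequestHandler

class SensorBroker(object):
    # Shared list of Futures for requests waiting on the next sensor reading.
    waiters = []

    @classmethod
    def wait_for_reading(cls):
        future = Future()
        cls.waiters.append(future)
        return future

    @classmethod
    def new_reading(cls, reading):
        # Call this when the sensor responds; it wakes every waiting request.
        waiters, cls.waiters = cls.waiters, []
        for future in waiters:
            future.set_result(reading)

class CommandHandler(RequestHandler):
    def post(self):
        send_command_to_sensor(self.get_argument("cmd"))  # placeholder for your sensor code
        self.write("command sent")

class ResponseHandler(RequestHandler):
    async def get(self):
        reading = await SensorBroker.wait_for_reading()
        self.write({"reading": reading})

Both handlers are registered in the Application's URL routes as usual; the waiting request simply stays open until new_reading() is called with the sensor's response.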

Related

How do I create an asynchronous socket in Python?

I've created a socket object for Telnet communication, and I'm using it to communicate with an API, sending and receiving data. I need to configure it in such a way that I can send and receive data at the same time. By that, I mean data should be sent as soon as the application tries to send it, and data should be processed immediately on receipt. Currently, I have a configuration which allows receipt to be instant, and sending to be second priority with a very short delay.
Currently, the best way I have found to do this is to have an event queue into which I push data to send, and a response queue into which I put messages from the server. A thread polls the buffer every 0.1 seconds to check for new data; if there isn't any, it checks the request queue and processes anything there, and it runs in a continuous loop. I then have threads insert data into the request queue and read data from the response queue. Everything is just about linear enough that this works fine.
This is not "asynchronous", in the sense that I've had to make it as asynchronous as possible without actually achieving it. Is there a proper way to do this? Or is anything under the hood going to be doing exactly the same thing I am?
Other things I have investigated as a solution to this problem:
A callback system, where I might call socket.on_receipt(handle_message, args) to call the method handle_message with args as a parameter, passing the received data into the method. The only way I could find to achieve this is by implementing what I already have and then registering a callback for it (in fact, this is very close to what I already have).
Please note: I am approaching this as a learning exercise to understand better how asynchronous systems work, not to understand how to use a particular library, so please do not suggest an existing library unless it contains very clear code which is simple to understand and answers the question fully and concisely.
This seems like a pretty straightforward use case for asyncio. I wouldn't consider using asyncio as "using a particular library" since socket programming paired with asyncio's event loop is pretty low-level and the concept is very transparent if you have experience with other languages and just want to see how async programming works in Python.
You can use this async chat as an example: https://gist.github.com/gregvish/7665915
Essentially, you create a non-blocking socket; see the standard library reference on socket.setblocking(0):
https://docs.python.org/3/library/socket.html#socket.socket.setblocking
I'd also suggest this amazing session by David Beazley as a must-see for async Python programming. He explains the concurrency concepts in Python using sockets, exactly what you need: https://www.youtube.com/watch?v=MCs5OvhV9S4
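Here is a minimal sketch of the idea using asyncio streams (the host, port and handle_message are placeholders, not taken from the gist or the talk): one coroutine processes incoming data the moment it arrives, while another sends anything pushed onto an outgoing queue.

import asyncio

def handle_message(data):
    print("received:", data)            # stand-in for your real processing

async def telnet_client(host, port, outgoing):
    reader, writer = await asyncio.open_connection(host, port)

    async def receive():
        # Data is processed immediately on receipt; no polling loop.
        while True:
            data = await reader.read(4096)
            if not data:
                break
            handle_message(data)

    async def send():
        # Data goes out as soon as the application queues it.
        while True:
            message = await outgoing.get()
            writer.write(message)
            await writer.drain()

    await asyncio.gather(receive(), send())

async def main():
    outgoing = asyncio.Queue()
    outgoing.put_nowait(b"hello\r\n")   # example of queueing something to send
    await telnet_client("example.com", 23, outgoing)

# asyncio.run(main())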

threading.local from a different thread

I'm trying to make a threaded CGI webserver similar to this; however, I'm stuck on how to set local data in the handler for a different thread. Is it possible to set threading.local data, such as a dict, for a thread other than the handler? To be more specific, I want to have the request parameters, headers, etc. available from a CGI file that was started with subprocess.run. The bottom of the do_GET in this file on GitHub is what I use now, but that can only serve one client at a time. I want to replace this part because I want multiple connections/threads at once, and I need different data in each connection/thread.
Is there a way to edit/set threading.local data from a different thread? Or, if there is a better way to achieve what I am trying to do, please let me know. If you know that this is definitely impossible, say so.
Thanks in advance!
Without seeing what test code you have, and knowing what you've tried so far, I can't tell you exactly what you need to succeed. That said, I can tell you that trying to edit information in a threading.local() object from another thread is not the cleanest path to take.
Generally, the best way to send calls to other threads is through threading.Event() objects. Usually, a thread listens to an Event() object and does an action based on that. In this case, I could see having a handler set an event in the case of a GET request.
Then, in the thread that is writing the cgi file, have a function that, when the Event() object is set, records the data you need and unsets the Event() object.
So, in pseudo-code:
import threading

evt = threading.Event()

def noteTaker(evt):
    # Wait for the handler to signal, record the data, then reset the event.
    while True:
        evt.wait()
        data = modifyDataYouNeed()          # placeholder for your own logic
        with open("notes.txt", "a") as f:   # placeholder file for the original f.open()/f.write()/f.close()
            f.write("%s\n" % data)
        evt.clear()

def do_GET(evt):
    print("so, a query hit your webserver")
    evt.set()
    print("and noteTaker was just called")
So, while I couldn't answer your question directly, I hope this helps some on how threads communicate and will help you infer what you need :)
threading information (as I'm sure you've read already, but for the sake of diligence) is here

Returning (Passing Around) A Function Call in Python/Tornado?

So I'm creating the back end for a web-based game in Python. Currently it works like this...
WebSocket Handler receives message...
WebSocket Handler calls message handler...
Message Handler calls Game class functions...
Game class calls other classes to update information.
This is highly coupled, and probably should be in model-view-controller format. So I'm finally going to change that.
Basically I want it to work like this.
Controller opens a WebSocket Handler.
WebSocket Handler returns to Controller a (1?).
Controller uses (1?) to call Message Handler.
Message Handler returns to Controller a (2?).
Controller uses (2?) to call Model.
Model sends updates to appropriate places.
So there's two problems here.
First of all, when I'm getting a message in the WebSocket Handler, it is an instance of Tornado's WebSocketHandler, and I'm not sure how I can return anything to the Controller. Is it simply the case that this is not possible? Do I have to keep a small amount of coupling between the WebSocket Handler and the Message Handler? I know I could always call a function in the Controller, but that doesn't seem like an actual fix, just more function calls.
Is there a way to pass a function call around in Python, keeping track of its parameters while doing so? That would be the optimal way to go about doing this, but I don't think it's built into the Python language. Otherwise, I feel like the best way to do it would be to return a dictionary with a field for the function name to be called, and fields for the parameters. That, of course, is a lot more code. If it can be avoided, I'd like to, but I'm not sure which direction to take this in.
Thanks for the tips guys, this is a big refactoring and I'm really nervous about where to start.
For the second part of your question, I believe you want to use partial functions.
Check out: http://docs.python.org/2/library/functools.html
Basically you would go:
from functools import partial
function_call(partial(future_function_call, future_argument))
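To make that concrete, here is a slightly fuller sketch (handle_move and its arguments are made up for illustration): partial packages a function and its parameters into one callable that can be passed around and invoked later.

from functools import partial

def handle_move(player_id, move):
    print("applying", move, "for", player_id)

# Package the call and its parameters without executing it yet.
deferred = partial(handle_move, "player-42", {"x": 3, "y": 5})

# Later, whoever receives the callable decides when to run it.
deferred()      # prints: applying {'x': 3, 'y': 5} for player-42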

Advice on backgrounding a task with variables?

I have a Python webapp which accepts some data via POST. The method that is called can take a while to complete (30-60 s), so I would like to "background" the method so I can respond to the user with a "processing" message.
The data is quite sensitive, so I'd prefer not to use any queue-based solutions. I also want to ensure that the backgrounded method doesn't get interrupted should the webapp fail in any way.
My first thought is to fork a process; however, I'm unsure how I can pass variables to a process.
I've used Gevent before, which has a handy method: gevent.spawn(function, *args, **kwargs). Is there anything like this that I could use at the process-level?
Any other advice?
The simplest approach would be to use a thread. Pass data to and from a thread with a Queue.
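A minimal sketch of that approach (do_processing stands in for your real 30-60 s method): the request handler starts a worker thread with the POSTed data and returns immediately, and results come back through a Queue.

import threading
from queue import Queue

results = Queue()

def do_processing(payload):
    return payload.upper()              # stand-in for the real long-running work

def worker(payload):
    results.put(do_processing(payload))

def handle_post(data):
    # Start the work in the background and respond to the user right away.
    threading.Thread(target=worker, args=(data,)).start()
    return "processing"

# Elsewhere, collect finished results when they are ready:
# finished = results.get()              # blocks until the worker puts a result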

Merging two event loops (Cherrypy and Wxpython)

Okay, I have an application written with CherryPy, and I want to build a wxPython GUI for it. The problem is that both modules use a closed loop for event handling, which (I assume) means that while one is running, the other will be blocked.
I asked for some advice, and it was suggested that I merge the two event loops rather than using the stock entry points (quickloop() for cherrypy and MainLoop() for wx).
The problem is I have no idea how to do this. Any advice would be greatly appreciated.
You already asked the same question here: cherrypy and wxpython, and I gave you the best response you're going to find anywhere there, which was voted up and you approved, apparently. Why are you asking again?
In the case of cherrypy, you have the source. Look at what quickloop() does in the code and then try to merge that code with the MainLoop() of WX.
Both loops will probably look like this:
while (true) {
    if (pendingEvents()) processEvents();
    else waitForEvents();
}
You must find a way to merge the two waiting calls into one (so the code continues if either event source had pending events). For WX, look at Dispatch(), Pending() and ProcessIdle().
Or you can look at wxIdleEvent (see the docs) and process all cherrypy events in there.
Another solution might be to run the two loops in different threads. In this case, you can't call WX methods from cherrypy code and vice versa. To solve this, you must find a way to send messages to the other thread with all the information about which method to call. This makes sure that WX methods get executed in the WX thread and cherrypy methods get executed in the cherrypy thread.
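Here is a sketch of that two-thread variant (the Root app, window title and status text are made up for illustration): CherryPy serves from a background thread via engine.start(), wx keeps the main thread, and wx.CallAfter() is the message-passing mechanism that makes sure GUI calls run in the wx thread.

import threading
import cherrypy
import wx

class Root(object):
    @cherrypy.expose
    def index(self):
        # Never touch wx objects directly here; marshal the call to the wx thread.
        wx.CallAfter(frame.SetStatusText, "request handled")
        return "ok"

def start_cherrypy():
    cherrypy.config.update({"engine.autoreload.on": False})
    cherrypy.tree.mount(Root(), "/")
    cherrypy.engine.start()             # non-blocking; the GUI thread stays free

app = wx.App(False)
frame = wx.Frame(None, title="CherryPy + wx")
frame.CreateStatusBar()
frame.Show()

threading.Thread(target=start_cherrypy, daemon=True).start()
app.MainLoop()                          # wx owns the main thread
cherrypy.engine.exit()                  # stop the server when the GUI closes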
