Using gen.coroutine’s callback argument in Tornado - python

Looking for a simple example demonstrating use of tornado.gen.coroutine’s callback argument. The docs say:
Functions with [the gen.coroutine] decorator return a Future. Additionally, they may be called with a callback keyword argument, which will be invoked with the future’s result when it resolves.
Adapting an example from the docs’ User’s guide, I would think I could do:
from tornado import gen
@gen.coroutine
def divide(x, y):
    return x / y

@gen.coroutine
def good_call():
    yield divide(1, 2)
good_call(callback=print)
I’d expect this to print 0.5, but there’s no output.
I’ve found copious examples demonstrating the deprecated gen.engine decorator, but there doesn’t seem to be as much out there on gen.coroutine. Running on Python 3.5.1 and Tornado 4.3.

You still need to start the IOLoop. If you add tornado.ioloop.IOLoop.current().start() at the end of your script you'll see the output printed (and then the IOLoop will run forever; if you want it to stop, you'll need to stop it from your callback after printing).
Note that in general it is possible (and encouraged) to write Tornado applications using only coroutines and yield, without passing any callbacks directly.
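For reference, here is a minimal runnable sketch along those lines (the on_done callback and the explicit return of the result are assumptions added for illustration, not part of the original question):

from tornado import gen
from tornado.ioloop import IOLoop

@gen.coroutine
def divide(x, y):
    return x / y

@gen.coroutine
def good_call():
    result = yield divide(1, 2)
    raise gen.Return(result)  # make good_call's Future resolve to 0.5

def on_done(result):
    print(result)            # prints 0.5
    IOLoop.current().stop()  # stop the loop once the callback has fired

good_call(callback=on_done)
IOLoop.current().start()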

Related

Python Asyncio - Eventloop in a contextmanager

Since I don't like the approach of using the canonical asyncio.run() for various reasons, I wanted to code a contextual loop, since the docs state on several occasions that if you don't go with the canonical run() you have to prevent memory leaks yourself (i.e). After a bit of research it seems the Python devs' answer to this feature request is "We don't need it!". Yet context managers seem perfectly fine in general when you are using the lower-level API of asyncio, see PEP 343 - The "with" Statement, example 10:
This can be used to deterministically close anything with a close
method, be it file, generator, or something else. It can even be used
when the object isn’t guaranteed to require closing (e.g., a function
that accepts an arbitrary iterable)
So can we do it anyway?
Related links:
https://bugs.python.org/issue24795
https://bugs.python.org/issue32875
https://groups.google.com/g/python-tulip/c/8bRLexUzeU4
https://bugs.python.org/issue19860#msg205062
https://github.com/python/asyncio/issues/261
Yes, we can have a context manager for our event loop, even though there seems to be no good practice via subclassing due to the C implementations (i.e). Basically the idea sketched out below is the following:
TL;DR
Create an object with __enter__ and __exit__ so the with syntax works.
Instead of returning the object itself, as usual, return the loop that asyncio serves up.
Wrap asyncio's loop.close() so that calling it only stops the loop; the real close happens later, after our __exit__ cleanup has run.
In __exit__, close everything that could otherwise leak (async generators, the default executor, other connections) and then close the loop.
Side-note
The implementation uses a wrapper object that returns a new loop into an anonymous block statement. Be aware that loop.stop() will finalize the loop and no further actions should be scheduled after it. Overall the code below is just a small helper and more of a styling choice in my opinion, especially since it is not a real subclass. But I think if someone wants to use the lower-level API without having to mind finalizing everything themselves, here is a possibility.
import asyncio

class OpenLoop:
    def close(self, *args, **kwargs):
        self._loop.stop()

    def _close_wrapper(self):
        self._close = self._loop.close
        self._loop.close = self.close

    def __enter__(self):
        self._loop = asyncio.new_event_loop()
        self._close_wrapper()
        return self._loop

    def __exit__(self, *exc_info):
        asyncio.run(self._loop.shutdown_asyncgens())
        asyncio.run(self._loop.shutdown_default_executor())
        # close other services
        self._close()

if __name__ == '__main__':
    with OpenLoop() as loop:
        loop.call_later(1, loop.close)
        loop.run_forever()
    assert loop.is_closed()
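As a usage note, the same wrapper should also work with run_until_complete instead of run_forever (a hypothetical variation, assuming the OpenLoop class above):

async def work():
    await asyncio.sleep(0.1)
    return 42

with OpenLoop() as loop:
    result = loop.run_until_complete(work())

assert loop.is_closed()
print(result)  # 42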

How to use the "result" and "loop" arguments in asyncio.sleep?

Extracted from python 3.6.8 documentation.
coroutine asyncio.sleep(delay, result=None, *, loop=None)
Create a coroutine that completes after a given time (in seconds). If result is provided, it is produced to the caller when the coroutine completes.
Question 1: What does the 2nd sentence mean, i.e. "If result is provided, ....."? I don't understand how to use the result argument. Can an example be provided to illustrate its use?
Question 2: When should the loop argument be used? Can an example also be given to illustrate its use?
I don't understand how to use the result argument.
result is simply the value that will be returned by asyncio.sleep once the specified time elapses. This is useful if you replace something that returns actual data with sleep(), e.g. for testing purposes, because you can immediately specify a return value. For example:
data = await read_from_database()
...

if mocking:
    read_from_database = functools.partial(
        asyncio.sleep, 0.1, result='no data')
else:
    async def read_from_database():
        ... real implementation ...
When should the loop argument be used?
The loop argument is, as of Python 3.7, deprecated and scheduled for removal. It was useful in Python 3.5 and earlier, when the return value of asyncio.get_event_loop() wasn't guaranteed to be the currently running event loop, but merely an event loop associated with the thread. Since one can run multiple event loops during the lifetime of a thread, correct code had to propagate an explicit loop everywhere. If you were running in a non-default event loop, you had to specify the loop to asyncio.sleep and to most other asyncio functions and constructors. This style is often encountered in old tutorials and is nowadays actively discouraged.
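For completeness, here is a minimal runnable sketch of the result argument on its own (not from the original answer):

import asyncio

async def main():
    # asyncio.sleep returns the value passed as result= once the delay elapses
    value = await asyncio.sleep(0.1, result='no data')
    print(value)  # prints 'no data'

asyncio.run(main())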

How to run a code after 10 seconds for a single time in python?

I've seen a few multithreading posts about running code every 10 seconds, but how do you run code only once after x seconds?
Specifically, I am trying to create a class method that executes some code and calls another method once after 10 seconds, but still allows other methods to be called in the meantime.
I suggest using Timer
E.g.:
from threading import Timer

class Test:
    def start(self):
        Timer(10, self.some_method, ()).start()

    def some_method(self):
        print("called some_method after 10 seconds")

t = Test()
t.start()
In C++ I use Boost for similar tasks. Look at deadline_timer, http://www.boost.org/doc/libs/1_66_0/doc/html/boost_asio/reference/deadline_timer.html, for example. The page has some examples as well to get you started. Unfortunately you need to use io_service to make use of deadline_timer.
Python has the Twisted framework to do the same: http://twistedmatrix.com/documents/13.1.0/api/twisted.internet.interfaces.IReactorTime.html#callLater (a minimal callLater sketch follows below).
In the systems-programming world you could use timer_create, http://man7.org/linux/man-pages/man2/timer_create.2.html, for example (depending on the OS you are using), but I would not go that route unless you have a good reason to do so.
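Here is that Twisted variant as a minimal sketch (assuming Twisted is installed):

from twisted.internet import reactor

def some_method():
    print("called some_method after 10 seconds")
    reactor.stop()  # shut the reactor down after the one-shot call has run

reactor.callLater(10, some_method)  # schedule a single call 10 seconds from now
reactor.run()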
Implement a @timeout decorator and put it before the function you want to set up.
Use signal on a Unix-like OS; refer to timeout_decorator. Since Windows doesn't implement signals at the system level, you have to use InterruptableThread.
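A minimal sketch of such a signal-based decorator (Unix only, main thread only; the names and details here are assumptions for illustration, not the timeout_decorator package itself):

import signal
import functools

def timeout(seconds):
    # Raise TimeoutError if the wrapped function runs longer than `seconds`.
    # Relies on SIGALRM, so it only works on Unix-like systems and in the main thread.
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            def handler(signum, frame):
                raise TimeoutError("{} timed out after {}s".format(func.__name__, seconds))
            old_handler = signal.signal(signal.SIGALRM, handler)
            signal.alarm(seconds)
            try:
                return func(*args, **kwargs)
            finally:
                signal.alarm(0)                         # cancel the pending alarm
                signal.signal(signal.SIGALRM, old_handler)
        return wrapper
    return decorator

@timeout(10)
def slow_operation():
    ...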

How monkey_patch(time=True) affects eventlet.spawn?

Normally, when using green threads, I can write code like this:
import eventlet

def myfunc():
    print("in my func")

eventlet.spawn(myfunc)
eventlet.sleep(0)  # then myfunc() is triggered as a green thread
But after using monkey_patch(time=True), I can change the code to:
eventlet.monkey_patch(time=True)
eventlet.spawn(myfunc) # now myfunc is called immediately
Why don't I need to call eventlet.sleep(0) this time?
And after I write my own sleep function:
def my_sleep(seconds):
    print("oh, my god, my own...")
and set the sleep attribute of the built-in time module to my_sleep, I find that my_sleep gets called many times, producing a lot of output.
But I can only see one debug thread in Eclipse, and it does not call my_sleep.
So the conclusion is that the sleep function is called continually by default, and I think the authors of eventlet knew this, which is why they developed the monkey_patch() function. Is that right?
According to the answer from @temoto, CPython cannot reproduce the same result, so I think I should add some background on how I found this interesting thing. (I did not add it at first because typing so many words is not easy, and my English is not that good. ^^)
I found this while remote-debugging OpenStack code using Eclipse.
In nova/network/model.py, there is code written like this:
class NetworkInfoAsyncWrapper(NetworkInfo):
    """Wrapper around NetworkInfo that allows retrieving NetworkInfo
    in an async manner.

    This allows one to start querying for network information before
    you know you will need it. If you have a long-running
    operation, this allows the network model retrieval to occur in the
    background. When you need the data, it will ensure the async
    operation has completed.

    As an example:

    def allocate_net_info(arg1, arg2)
        return call_neutron_to_allocate(arg1, arg2)

    network_info = NetworkInfoAsyncWrapper(allocate_net_info, arg1, arg2)
    [do a long running operation -- real network_info will be retrieved
    in the background]
    [do something with network_info]
    """

    def __init__(self, async_method, *args, **kwargs):
        self._gt = eventlet.spawn(async_method, *args, **kwargs)
        methods = ['json', 'fixed_ips', 'floating_ips']
        for method in methods:
            fn = getattr(self, method)
            wrapper = functools.partial(self._sync_wrapper, fn)
            functools.update_wrapper(wrapper, fn)
            setattr(self, method, wrapper)
When I first debugged into this function, after executing
    self._gt = eventlet.spawn(async_method, *args, **kwargs)
the callback function async_method was executed at once. But please remember, this is a green thread; it should be triggered by eventlet.sleep(0).
But I didn't find any code calling sleep(0), so if sleep is perhaps called by Eclipse, then in the real (non-debug) world, who triggers it?
TL;DR: The Eventlet API does not require sleep(0) to start a green thread. spawn(fun) will start the function some time in the future, possibly right now. You should only call sleep(0) to ensure that it starts right now, and even then it's better to use explicit synchronization, like an Event or a Semaphore.
I can't reproduce this behavior using CPython 2.7.6 or 3.4.3 with eventlet 0.17.4, in either IPython or a pure Python console. So it's probably Eclipse that is calling time.sleep in the background.
monkey_patch was introduced as a shortcut for going through all of your (and third-party) code and replacing time.sleep with eventlet.sleep, and similarly for the os, socket, etc. modules. It's not related to Eclipse (or anything else) repeating time.sleep calls.
But this is an interesting observation, thank you.
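To illustrate the monkey-patching point with a minimal sketch (not from the original answer): after monkey_patch(time=True), the standard library's time.sleep is eventlet's green sleep, so any sleep call, even sleep(0), yields to the hub and lets spawned green threads run:

import eventlet
eventlet.monkey_patch(time=True)  # time.sleep is now eventlet's cooperative sleep

import time

def myfunc():
    print("in my func")

eventlet.spawn(myfunc)
time.sleep(0)  # the patched sleep yields to the hub, so myfunc runs here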

Using Tornado and Twisted at the same time

I am in a weird situation where I have to use Twisted in a system built completely out of Tornado.
They can share the same IOLoop so I know they can work together. My question is: can I safely use their coroutine decorators in the same function? For example:
import tornado.platform.twisted
tornado.platform.twisted.install()
...

@gen.engine
@defer.inlineCallbacks
def get(self):
    ...
    a = yield gen.Task(getA)          # tornado
    b = yield proxy.callRemote(getB)  # twisted
    ...
    defer.returnValue(a + b)          # twisted
They do work on the same IOLoop so I am thinking this should be fine. Would there be any unforeseen consequences? Thanks in advance.
Looks like what you want is Cyclone, a web server framework for Python that implements the Tornado API as a Twisted protocol.
No, this wouldn't work. In your case inlineCallbacks is wrapped directly around your generator and gen.engine is wrapped outside. The problem is that inlineCallbacks does not know anything about gen.Task and it will yield it immediately (it has no way of passing it along to gen.engine).
To elaborate: if you yield obj inside an inlineCallbacks-wrapped generator, two things can happen:
obj is a Deferred in which case control is returned to the reactor until that Deferred fires.
obj is something else, in which case it is immediately sent back into your generator.
In your case, the result would be:
a = yield gen.Task(getA) # continues right through
# a is of type gen.Task here
b = yield proxy.callRemote(getB) # waits for result of proxy.callRemote
See here for how inlineCallbacks is implemented.
What is the right way to do this? Try to use either inlineCallbacks or gen.engine (but not both). Wrap the alien gen.Task (or Deferred) into the "native" form. I am not familiar with Tornado but maybe this question helps.
Alternatively, write your own decorator like inlineCallbacks that handles gen.Task as well.
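For instance, a hypothetical helper along those lines (deferred_to_future is an illustration name, not an existing API; getA and proxy are the names from the question; assuming Tornado 4.x):

from tornado import gen
from tornado.concurrent import Future

def deferred_to_future(deferred):
    # Bridge a Twisted Deferred into a Tornado Future that a coroutine can yield.
    future = Future()
    deferred.addCallbacks(future.set_result,
                          lambda failure: future.set_exception(failure.value))
    return future

@gen.coroutine
def get(self):
    a = yield gen.Task(getA)                              # Tornado-style task
    b = yield deferred_to_future(proxy.callRemote(getB))  # wrapped Twisted call
    raise gen.Return(a + b)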
