I'm studying yield in Python and have found that yield is not only the way generators produce values but also a way to push values into a generator. For example, the following code
def f():
    print (yield),
    print 0,
    print (yield),
    print 1
g = f()
g.send(None)
g.send('x')
g.send('y')
At the global scope it sends the values 'x' and 'y' into the generator, so in f it will output x 0 y 1. But I cannot understand:
There are 2 yields but 3 sends. Why does the first send have to be None?
It throws a StopIteration at the last send. Is there any way to avoid this exception?
Could anyone please explain that? Thanks in advance.
From the documentation:
When send() is called to start the generator, it must be called with None as the argument, because there is no yield expression that could receive the value.
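In other words, the priming call just advances the generator to its first yield; next(g) does the same job as g.send(None). A minimal sketch of my own, not from the documentation:

g = f()
next(g)        # equivalent to g.send(None); runs f up to the first yield
g.send('x')    # the first yield expression now evaluates to 'x'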
As for the exception, you can't really avoid it. The generator throws this exception when it's done iterating, so instead of avoiding it, just catch it:
g = f()
try:
    g.send(None)
    g.send('x')
    g.send('y')
except StopIteration:
    print 'Done'
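If you'd rather not wrap every call site in try/except, a small helper can swallow the final StopIteration. safe_send is a hypothetical name of my own, not a standard function:

def safe_send(gen, value, default=None):
    try:
        return gen.send(value)
    except StopIteration:
        return default

g = f()
safe_send(g, None)
safe_send(g, 'x')
safe_send(g, 'y')  # the generator is done; this returns default instead of raising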
Update: I've started a thread on python-ideas to propose additional syntax or a stdlib function for this purpose (i.e. specifying the first value sent by yield from). So far 0 replies... :/
How do I intercept the first yielded value of a subgenerator but delegate the rest of the iteration to the latter using yield from?
For example, suppose we have an arbitrary bidirectional generator subgen, and we want to wrap this in another generator gen. The purpose of gen is to intercept the first yielded value of subgen and delegate the rest of the generation—including sent values, thrown exceptions, .close(), etc.—to the sub-generator.
The first thing that might come to mind could be this:
def gen():
    g = subgen()
    first = next(g)
    # do something with first...
    yield "intercepted"
    # delegate the rest
    yield from g
But this is wrong, because when the caller .sends something back to the generator after getting the first value, it will end up as the value of the yield "intercepted" expression, which is ignored, and instead g will receive None as the first .send value, as part of the semantics of yield from.
So we might think to do this:
def gen():
    g = subgen()
    first = next(g)
    # do something with first...
    received = yield "intercepted"
    g.send(received)
    # delegate the rest
    yield from g
But what we've done here is just move the problem back by one step: as soon as we call g.send(received), the generator resumes its execution and doesn't stop until it reaches the next yield expression, whose value becomes the return value of the .send call. So we'd also have to intercept that and re-send it. And then send that, and that again, and so on... So this won't do.
Basically, what I'm asking for is a yield from with a way to customize what the first value sent to the generator is:
def gen():
    g = subgen()
    first = next(g)
    # do something with first...
    received = yield "intercepted"
    # delegate the rest
    yield from g start with received  # pseudocode; not valid Python
...but without having to re-implement all of the semantics of yield from myself. That is, the laborious and poorly maintainable solution would be:
def adaptor(generator, init_send_value=None):
    send = init_send_value
    try:
        while True:
            send = yield generator.send(send)
    except StopIteration as e:
        return e.value
which is basically a bad re-implementation of yield from (it's missing handling of throw, close, etc.). Ideally I would like something more elegant and less redundant.
If you're trying to implement this generator wrapper as a generator function using yield from, then your question basically boils down to whether it is possible to specify the first value sent to the "yielded from" generator. Which it is not.
If you look at the formal specification of the yield from expression in PEP 380, you can see why. The specification contains a (surprisingly complex) piece of sample code that behaves the same as a yield from expression. The first few lines are:
_i = iter(EXPR)
try:
    _y = next(_i)
except StopIteration as _e:
    _r = _e.value
else:
    ...
You can see that the first thing that is done to the iterator is to call next() on it, which is basically equivalent to .send(None). There is no way to skip that step and your generator will always receive another None whenever yield from is used.
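You can watch that extra None arrive with a small demonstration of my own (not from PEP 380):

def sub():
    while True:
        received = yield
        print('sub got:', repr(received))

def wrapper():
    g = sub()
    next(g)        # advance sub() to its first yield
    yield from g   # yield from begins with next(g), i.e. another send(None)

w = wrapper()
next(w)            # prints: sub got: None   <- the unavoidable extra None
w.send('hello')    # prints: sub got: 'hello'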
The solution I've come up with is to implement the generator protocol using a class instead of a generator function:
class Intercept:
    def __init__(self, generator):
        self._generator = generator
        self._intercepted = False

    def __next__(self):
        return self.send(None)

    def send(self, value):
        yielded_value = self._generator.send(value)
        # Intercept the first value yielded by the wrapped generator and
        # replace it with a different value.
        if not self._intercepted:
            self._intercepted = True
            print(f'Intercepted value: {yielded_value}')
            yielded_value = 'intercepted'
        return yielded_value

    def throw(self, type, *args):
        return self._generator.throw(type, *args)

    def close(self):
        self._generator.close()
__next__(), send(), throw(), close() are described in the Python Reference Manual.
The class wraps the generator passed to it at creation and mimics its behavior. The only thing it changes is that the first value yielded by the generator is replaced by a different value before it is returned to the caller.
We can test the behavior with an example generator f() which yields two values and a function main() which sends values into the generator until the generator terminates:
def f():
    y = yield 'first'
    print(f'f(): {y}')
    y = yield 'second'
    print(f'f(): {y}')

def main():
    value_to_send = 0
    gen = f()
    try:
        x = gen.send(None)
        while True:
            print(f'main(): {x}')
            # Send incrementing integers to the generator.
            value_to_send += 1
            x = gen.send(value_to_send)
    except StopIteration:
        print('main(): StopIteration')

main()
When run, this example produces the following output, showing which values arrive in the generator and which are returned by the generator:
main(): first
f(): 1
main(): second
f(): 2
main(): StopIteration
Wrapping the generator f() by changing the statement gen = f() to gen = Intercept(f()) produces the following output, showing that the first yielded value has been replaced:
Intercepted value: first
main(): intercepted
f(): 1
main(): second
f(): 2
As all other generator API calls are forwarded directly to the wrapped generator, it should behave equivalently to the wrapped generator itself.
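One caveat worth adding (my own observation, not from the answer above): as written, the class cannot be used directly in a for loop or as the target of a yield from, because it lacks an __iter__ method. A minimal subclass fixes that:

class IterableIntercept(Intercept):
    def __iter__(self):
        # returning self lets the wrapper be used as an iterable
        return self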
If I understand the question, I think this works? Meaning, I ran this script and it did what I expected, which was to print all but the first line of the input file. As long as the object passed as the argument to the skip_first function can be iterated over, it should work.
def skip_first(thing):
    _first = True
    for _result in thing:
        if _first:
            _first = False
            continue
        yield _result
inp = open("/var/tmp/test.txt")
for line in skip_first(inp):
    print(line, end="")
I have read about yield at "What does the "yield" keyword do in Python?", but I have one question: how does the system identify that the generator has already been iterated once?
def test_yield():
    name = 'Hello World!'
    for i in name:
        yield i

y = test_yield()
print "first yield", y
for c in y:
    print c
print "second yield", y
for c in y:
    print c
Output
first yield <generator object test_yield at 0x7f4c2bc10be0>
H
e
l
l
o
W
o
r
l
d
!
second yield <generator object test_yield at 0x7f4c2bc10be0>
In the output, the generator object is printed the second time, but the loop does not iterate. So how does the program identify that it has already been executed once?
Looping through an iterator "uses it up". So your first yield loop iterates through it and reaches the end. Just as if you opened a file and read all of the lines until EOF. After that, no matter how many times you call read(), you won't get more data, only EOF. Similarly, once your iterator has reached its last element, calling .next on it just raises StopIteration.
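You can see this exhaustion in isolation with a couple of lines (my own snippet):

it = iter('ab')
print(list(it))  # ['a', 'b'] -- list() consumes the iterator
print(list(it))  # []         -- a second pass finds it already exhausted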
When a generator function calls yield, the "state" of the generator function is frozen; the values of all variables are saved and the next line of code to be executed is recorded until next() is called again. Once it is, the generator function simply resumes where it left off. If next() is never called again, the state recorded during the yield call is (eventually) discarded.
When the generator function reaches its end, a StopIteration exception is raised and the generator is exhausted, so you have to re-create it. That is the reason it didn't yield any values during the second loop.
Note: for gets values by calling next() implicitly
The first time the for loop calls the generator object created from your function, it will run the code in your function from the beginning until it hits yield, and then it'll return the first value of the loop. Each subsequent call runs the loop you have written in the function one more time and returns the next value, until there is no value to return. In your second for loop there are no more values to retrieve, because you have already consumed them all.
For more insight as to what's happening behind the scenes, the for loop can be rewritten to this:
iterator = some_func()
try:
    while 1:
        print iterator.next()
except StopIteration:
    pass
When you wrote
y = test_yield()
you initialized the iterator, and when you finished your first iteration with
for c in y:
the iterator was exhausted. You need to initialize it once more:
def test_yield():
    name = 'Hello World!'
    for i in name:
        yield i

y = test_yield()
print "first yield", y
for c in y:
    print c

# once again
y = test_yield()
print "second yield", y
for c in y:
    print c
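Alternatively (my suggestion, not from the answer above), itertools.tee can split one generator into two independent iterators, so you don't have to call test_yield() twice:

import itertools

y1, y2 = itertools.tee(test_yield())
for c in y1:
    print(c)
for c in y2:  # replays the same values from tee's internal buffer
    print(c)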
I'm trying to generate an endless stream of results given a function f and an initial value x,
so the first call should give the initial value, the second call should give f(x), the third call f(x2), where x2 is the previous result of f(x), and so on.
What I have come up with:
import itertools

def generate(f, x):
    return itertools.repeat(lambda x: f(x))
which does not seem to work. Any ideas? (I can't use yield in my code.) Also, I can't use more than one line of code for this problem. Any help would be appreciated.
Also note that in a previous exercise I was asked to use yield, with no problems:
while True:
    yield x
    x = f(x)
This works fine, but now I have no clue how to do it without yield.
In Python 3.3, you can use itertools.accumulate:
import itertools

def generate(f, x):
    return itertools.accumulate(itertools.repeat(x), lambda v, _: f(v))

for i, val in enumerate(generate(lambda x: 2*x, 3)):
    print(val)
    if i == 10:
        break
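As an aside, if I remember correctly Python 3.8 added an initial keyword to accumulate, which makes the seed value explicit; a sketch under that assumption:

import itertools

def generate(f, x):
    # repeat(None) merely drives the iteration; each step applies f to the total
    return itertools.accumulate(itertools.repeat(None), lambda v, _: f(v), initial=x)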
I think this works:
import itertools as it

def g(f, x):
    return it.chain([x], (setattr(g, 'x', f(getattr(g, 'x', x))) or getattr(g, 'x') for _ in it.count()))

def f(x):
    return x + 1

gen = g(f, 1)
print next(gen)
print next(gen)
print next(gen)
print next(gen)
Of course, it relies on some sketchy behavior: I actually add an attribute to the function itself to keep the state. Basically, this function will only work the first time you call it. After that, all bets are off.
If we want to relax that restriction, we can use a temporary namespace. The problem is that to get a temporary namespace we need a unique class instance (or a class, but an instance is cleaner and only requires one extra set of parentheses). To make that happen in one line, we need to create a new function inline and use that as a default argument:
import itertools as it

def g(f, x):
    return (lambda f, x, ns=type('foo', (object,), {})():
                it.chain([x],
                         (setattr(ns, 'x', f(getattr(ns, 'x', x))) or getattr(ns, 'x')
                          for _ in it.count()))
            )(f, x)
def f(x):
    return x + 1

gen = g(f, 1)
print next(gen) == 1
print next(gen) == 2
print next(gen) == 3
print next(gen) == 4
print "first worked?"
gen2 = g(f, 2)
print next(gen2) == 2
print next(gen2) == 3
print next(gen2) == 4
I've broken it into a few lines, for readability, but it's a 1-liner at heart.
A version without any imports (and the most robust one yet, I believe):
def g(f, x):
    return iter(lambda f=f, x=x, ns=type('foo', (object,), {'x': x}): ((getattr(ns, 'x'), setattr(ns, 'x', f(getattr(ns, 'x'))))[0]), object())
One trick here is the same as before: we create a lambda function with a mutable default argument to keep the state. Inside the function, we build a tuple. The first item is what we actually want; the second item is the return value of the setattr call, which is used to update the state. In order to get rid of the itertools.chain, we set the initial value on the namespace to the value of x, so the class is already initialized to have the starting state.

The second trick is that we use the two-argument form of iter to get rid of it.count(), which was only used to create an infinite iterable before. iter keeps calling the function you give it as the first argument until the return value is equal to the second argument. However, since my second argument is an instance of object, nothing returned from our function will ever be equal to it, so we've effectively created an infinite iterable without itertools or yield!

Come to think of it, I believe this last version is the most robust too. Previous versions had a bug where they relied on the truthiness of the return value of f; I think they might have failed if f returned 0. This last version fixes that bug.
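The two-argument form of iter is worth seeing on its own; here is a standalone illustration of mine, unrelated to the generator problem:

import random

# call the lambda repeatedly until it returns the sentinel value 6
rolls = iter(lambda: random.randint(1, 6), 6)
for value in rolls:
    print(value)  # prints each roll before the first 6; the 6 itself is swallowed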
I'm guessing this is some sort of homework or assignment? As such, I'd say you should take a look at generator expressions. Though I agree with the other commenters that this seems an exercise of dubious value...
I'm trying to define a generator function mycount() that can be reset with the generator method send(0), as in the example below. Everything works fine except when I use send(0) on a new generator object that hasn't started yet; in that case it raises a TypeError. Is there any function that checks whether the generator has started, or do I have to catch the TypeError and create a new generator object with mycount(0) in that case?
def mycount(value):
    while True:
        v = yield value
        if v == None:
            value = value + 1
        else:
            value = v
g = mycount(3)
print(next(g)) # prints 3
print(next(g)) # prints 4
print(g.send(0)) # prints 0
print(next(g)) # prints 1
print(next(g)) # prints 2
g2 = mycount(3)
g2.send(0)
# TypeError: can't send non-None value to a just-started generator
To avoid sending a non-None value to a just-started generator, you need to call next or send(None) first. I agree with the others that David Beazley's coroutine decorator is a great option (on Python 3.x, call __next__() instead of next()). Though that particular decorator is simple, I've also successfully used the copipes library, which is a nice implementation of many of the utilities from Beazley's presentations, including coroutine.
Regarding whether one can check if a generator is started - in Python 3, you can use inspect.getgeneratorstate. This isn't available in Python 2, but the CPython implementation is pure python and doesn't rely on anything new to Python 3, so you can check yourself in the same way:
if generator.gi_running:
    return GEN_RUNNING
if generator.gi_frame is None:
    return GEN_CLOSED
if generator.gi_frame.f_lasti == -1:
    return GEN_CREATED
return GEN_SUSPENDED
Specifically, g2 is started if inspect.getgeneratorstate(g2) != inspect.GEN_CREATED.
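A quick usage sketch on Python 3 (mine):

import inspect

def gen():
    yield 1

g = gen()
print(inspect.getgeneratorstate(g))  # GEN_CREATED
next(g)
print(inspect.getgeneratorstate(g))  # GEN_SUSPENDED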
As your error implies, the send function must be called with None on a just-started generator (docs-link).
You could catch the TypeError and roll from there:
#...
try:
    g2.send(0)
except TypeError:
    # Now you know it hasn't started, etc.
    g2.send(None)
Either way it can't be used to 'reset' the generator; it just has to be remade.
Great overview of generator concepts and syntax here, covering chaining of generators and other advanced topics.
In particular, you might find a way to use the consumer decorator described on p. I-131 of David Beazley's "Generator Tricks," to which J. Gwyn provided a link:
def consumer(func):
    def start(*args, **kwargs):
        c = func(*args, **kwargs)
        c.next()
        return c
    return start
I use something similar in my code.
Note that if v is None is preferred over if v == None.
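Applied to the asker's mycount, the wiring might look like this (my own sketch; the decorator above is Python 2, so on Python 3 replace c.next() with next(c)):

@consumer
def mycount(value):
    while True:
        v = yield value
        value = value + 1 if v is None else v

g = mycount(3)
print(g.send(0))  # prints 0; the decorator has already primed the generator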
Here's a complete Python 2-compatible implementation of the routine getgeneratorstate(gtor), with test code.
import unittest
import enum

class GtorState(enum.Enum):
    GEN_RUNNING = 'GEN_RUNNING'
    GEN_CLOSED = 'GEN_CLOSED'
    GEN_CREATED = 'GEN_CREATED'
    GEN_SUSPENDED = 'GEN_SUSPENDED'

    @staticmethod
    def getgeneratorstate(gtor):
        if gtor.gi_running:
            return GtorState.GEN_RUNNING
        if gtor.gi_frame is None:
            return GtorState.GEN_CLOSED
        if gtor.gi_frame.f_lasti == -1:
            return GtorState.GEN_CREATED
        return GtorState.GEN_SUSPENDED
    #end-def

def coro000():
    """ a coroutine that does little
    """
    print('-> coroutine started')
    x = yield
    print('-> coroutine received ', x)

class Test_Coro(unittest.TestCase):
    def test_coro000(self):
        my_coro000 = coro000()
        self.assertEqual(GtorState.getgeneratorstate(my_coro000), GtorState.GEN_CREATED)
        next(my_coro000)  # prints '-> coroutine started'
        self.assertEqual(GtorState.getgeneratorstate(my_coro000), GtorState.GEN_SUSPENDED)
        try:
            my_coro000.send(42)  # prints '-> coroutine received 42'
            self.fail('should have raised StopIteration')
        except StopIteration:
            self.assertTrue(True, 'On exit a coroutine will throw StopIteration')
        self.assertEqual(GtorState.getgeneratorstate(my_coro000), GtorState.GEN_CLOSED)
I have a function
def f():
    while True:
        blah
I want to alter f in such a way that the caller can control the number of times the while loop in f runs, without altering much of the code in f (especially not adding a counter in f). Something like
def f(num_executions=True):
    while num_executions:
        blah()
f() will run an infinite loop
but f(an_expression_that_evaluates_to_true_n_times) will run the while loop n times.
What could such an expression be?
UPDATE:
I know there are plenty of ways to control how many times a loop will run, but the real question here is:
Can an expression in Python evaluate to True a configurable number of times?
Some ideas I am toying with:
- making an expression out of list = list[:-1]
- modifying default parameters of a function within a function
No need for a while-loop. Use a for-loop:
>>> def f(n):
...     for _ in range(n):
...         dostuff()
_ is conventionally used as a placeholder variable name in a for loop when the value isn't needed. This loop runs n times, so f(5) would loop five times.
While I agree with the others that this is a bad idea, it is entirely (and easily) possible:
class BoolChange:
    def __init__(self):
        self.count = 0

    def __bool__(self):
        self.count += 1
        return self.count <= 5

x = BoolChange()

while x:
    print("Running")
This outputs Running five times, then exits.
The main reason this is a bad idea is that it means checking the state of the object modifies it, which is weird behaviour people won't expect. I can't imagine a good use case for this.
You can't do exactly what you describe. What is passed in Python is not an expression but a value, an object. An immutable object always evaluates to either True or False; its truth value will not change during the loop. A mutable object can change its truth value, but you can't make an arbitrary object change during a general loop (which does not touch it in any way). In general, as has been said here, you really need to use a for statement, or pass in a callable object (say, a function):
def f(is_true=lambda: True):
    while is_true():
        blah()
Note that the reason the callable solution is acceptable, while the "hiding in boolean" @Lattyware demonstrated is not, is that the additional computation here is explicit: the () tells the reader that almost anything can happen here, depending on the object passed, and you don't need to know that the __bool__ method of that object is silently called and expected to have a side effect.
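A usage sketch of that idea (mine, with blah replaced by a print so it runs standalone):

import itertools

def f(is_true=lambda: True):
    while is_true():
        print('blah')

counter = itertools.count()
f(lambda: next(counter) < 3)  # the predicate is True exactly three times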
def f(c=-1):
    while c:
        print 'blah'
        if c > 0: c -= 1
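With this version, f() loops forever, since c stays at -1 (which is truthy), while f(3) runs the loop exactly three times before c reaches 0.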
How about using a generator as a coroutine?
def blah():
    print 'hi'

def f():
    while True:
        blah()
        yield

x = f()
next(x)  # "hi"
The generator here isn't being used for what it yields; you get to control how many times it blahs externally, because it yields control every time it blahs.
for i in range(3):
    next(x)  # blah blah blah
This will also work -
def foo(n=[1, 2, 3]):
    foo.func_defaults = tuple([foo.func_defaults[0][:-1]])
    return n

while foo():
    blah()
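For reference (my note, not part of the answer above): func_defaults is the Python 2 spelling; on Python 3 the same trick uses __defaults__:

def foo(n=[1, 2, 3]):
    # shrink the default list by one element on every call
    foo.__defaults__ = (foo.__defaults__[0][:-1],)
    return n

while foo():
    print('blah')  # runs three times, then foo() returns [] (falsy)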