Python chained `yield from`

I was reading through this article about async and await in Python, and saw this sample code:
def bottom():
    # Returning the yield lets the value that goes up the call stack come
    # right back down.
    return (yield 42)

def middle():
    return (yield from bottom())

def top():
    return (yield from middle())

# Get the generator.
gen = top()
value = next(gen)
print(value)  # Prints '42'.
try:
    value = gen.send(value * 2)
except StopIteration as exc:
    value = exc.value
print(value)  # Prints '84'.
I can understand how chaining the generators returns 42, but I can't get my head around gen.send(value * 2) returning 84. I would have thought the initial next(gen) had already exhausted the generator, as in the following experiment:
def bottom():
    # Returning the yield lets the value that goes up the call stack come
    # right back down.
    return (yield 42)

def middle():
    return (yield from bottom())

def top():
    return (yield from middle())

# Get the generator.
gen = top()
value = next(gen)
print(value)  # Prints '42'.
value = next(gen)
print(value)

Traceback (most recent call last):
  File "a.py", line 16, in <module>
    value = next(gen)
StopIteration
Could someone please explain?
PS: Not the most well-thought-out title; someone please help fix it...

As @Steven Rumbalski already explained in the comments: the generator yields only one value, the 42. On the second call, the iterator raises StopIteration, which is caught by the except StopIteration as exc: clause. So you are completely right that the initial next(gen) has already exhausted the generator. The same holds true in your second example, but there you are not catching the StopIteration exception. For further reading, I will quote from PEP 342 and PEP 380 (Syntax for Delegating to a Subgenerator).
Add a new send() method for generator-iterators, which resumes the generator and sends a value that becomes the result of the current yield-expression. The send() method returns the next value yielded by the generator, or raises StopIteration if the generator exits without yielding another value.
(from PEP 342)
So why do you get 84 back from gen.send(value * 2)? The value of value at this point is still 42 from the previous value = next(gen) call. So you are just getting back the 84 that you sent into the iterator. Why is that?
Consider the following simplified examples to understand better what that definition means. First we only yield a single value, with no return. This results in a bare StopIteration whose value attribute is None.
def generator():
    yield 1

gen = generator()
value = next(gen)
print(value)  # Prints '1'.
try:
    value = gen.send(3)
except StopIteration as exc:
    value = exc.value
print(value)  # Prints 'None'.
return expr in a generator causes StopIteration(expr) to be raised upon exit from the generator.
(from PEP 380)
Next we return 2 after the generator has yielded its last value. In this case the rule from PEP 380 comes into play: since 2 was the returned value, the value attribute of the StopIteration is 2.
def generator():
    yield 1
    return 2

gen = generator()
value = next(gen)
print(value)  # Prints '1'.
try:
    value = gen.send(3)
except StopIteration as exc:
    value = exc.value
print(value)  # Prints '2'.
Now we return (yield 1). According to the rule from PEP 342, the 3 passed to gen.send(3) becomes the result of the current yield expression. But since the generator is already exhausted, the StopIteration exception is raised, with that 3 as its value.
def generator():
    return (yield 1)

gen = generator()
value = next(gen)
print(value)  # Prints '1'.
try:
    value = gen.send(3)
except StopIteration as exc:
    value = exc.value
print(value)  # Prints '3'.
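To make the round trip visible, here is an instrumented variant of the chained example (my own sketch, not from the article) that prints the value at each level as the 84 travels back up the stack:

def bottom():
    sent = yield 42                   # next(gen) pauses the whole chain here
    print(f'bottom received {sent}')  # resumed later by gen.send(84)
    return sent

def middle():
    result = yield from bottom()      # forwards send() straight down to bottom()
    print(f'middle returning {result}')
    return result

def top():
    result = yield from middle()
    print(f'top returning {result}')
    return result

gen = top()
value = next(gen)        # runs down to the yield in bottom(); value == 42
try:
    gen.send(value * 2)  # 84 goes down to bottom() and bubbles up as a return value
except StopIteration as exc:
    print(exc.value)     # Prints '84'.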

Related

function is still a generator if "yield" is used in an if clause [duplicate]

In Python 2 it was an error to have return together with yield in a function definition. But for this code in Python 3.3:
def f():
    return 3
    yield 2

x = f()
print(x.__next__())
there is no error about return being used in a function with yield. However, when __next__ is called, a StopIteration exception is raised. Why isn't the value 3 simply returned? Is the return somehow ignored?
This is a new feature in Python 3.3. Much like return in a generator has long been equivalent to raise StopIteration(), return <something> in a generator is now equivalent to raise StopIteration(<something>). For that reason, the exception you're seeing should be printed as StopIteration: 3, and the value is accessible through the value attribute of the exception object. If the generator is delegated to using the (also new) yield from syntax, that value becomes the result of the yield from expression. See PEP 380 for details.
def f():
    return 1
    yield 2

def g():
    x = yield from f()
    print(x)

# g is still a generator so we need to iterate to run it:
for _ in g():
    pass
This prints 1, but not 2.
The return value is not ignored, but generators only yield values; a return simply ends the generator, in this case early. Advancing the generator never reaches the yield statement in that case.
Whenever an iterator reaches the 'end' of the values to yield, a StopIteration must be raised. Generators are no exception. As of Python 3.3, however, any return expression becomes the value of the exception:
>>> def gen():
...     return 3
...     yield 2
...
>>> try:
...     next(gen())
... except StopIteration as ex:
...     e = ex
...
>>> e
StopIteration(3,)
>>> e.value
3
Use the next() function to advance iterators, instead of calling .__next__() directly:
print(next(x))
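As an aside (my addition, not part of the answer above), next() also accepts a default that suppresses the StopIteration, although the return value carried by the exception is then lost:

x = f()
# f() returns 3 before yielding anything, so a plain next(x) would raise
# StopIteration; with a default, we get the fallback value instead.
print(next(x, 'exhausted'))  # Prints 'exhausted'.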

How to intercept the first value of a generator and transparently yield from the rest

Update: I've started a thread on python-ideas to propose additional syntax or a stdlib function for this purpose (i.e. specifying the first value sent by yield from). So far 0 replies... :/
How do I intercept the first yielded value of a subgenerator but delegate the rest of the iteration to the latter using yield from?
For example, suppose we have an arbitrary bidirectional generator subgen, and we want to wrap this in another generator gen. The purpose of gen is to intercept the first yielded value of subgen and delegate the rest of the generation—including sent values, thrown exceptions, .close(), etc.—to the sub-generator.
The first thing that might come to mind could be this:
def gen():
    g = subgen()
    first = next(g)
    # do something with first...
    yield "intercepted"
    # delegate the rest
    yield from g
But this is wrong, because when the caller .sends something back to the generator after getting the first value, it will end up as the value of the yield "intercepted" expression, which is ignored, and instead g will receive None as the first .send value, as part of the semantics of yield from.
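A quick demonstration of that problem (my own sketch, using a hypothetical two-step subgen) shows the caller's first send() being swallowed by the wrapper, so the subgenerator sees None:

def subgen():
    received = yield 'sub-first'
    print(f'subgen received: {received}')  # Prints 'subgen received: None'.
    yield 'sub-second'

def gen():
    g = subgen()
    first = next(g)
    yield 'intercepted'  # the caller's send() value lands here and is dropped
    yield from g         # yield from resumes g with the equivalent of send(None)

w = gen()
print(next(w))          # Prints 'intercepted'.
print(w.send('hello'))  # Prints 'sub-second'; subgen saw None, not 'hello'.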
So we might think to do this:
def gen():
    g = subgen()
    first = next(g)
    # do something with first...
    received = yield "intercepted"
    g.send(received)
    # delegate the rest
    yield from g
But what we've done here is just move the problem back one step: as soon as we call g.send(received), the generator resumes its execution and doesn't stop until it reaches the next yield statement, whose value becomes the return value of the .send call. So we'd also have to intercept that and re-send it. And then send that, and that again, and so on... So this won't do.
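To make that leak concrete, here is a sketch (again my own, with a hypothetical three-step subgen) in which the value returned by g.send(received) inside the wrapper never reaches the caller:

def subgen():
    r = yield 'a'
    r = yield 'b'
    yield 'c'

def gen():
    g = subgen()
    first = next(g)
    received = yield 'intercepted'
    g.send(received)  # returns 'b', which is silently discarded here
    yield from g

w = gen()
print(next(w))      # Prints 'intercepted'.
print(w.send('X'))  # Prints 'c'; the caller never sees 'b'.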
Basically, what I'm asking for is a yield from with a way to customize what the first value sent to the generator is:
def gen():
    g = subgen()
    first = next(g)
    # do something with first...
    received = yield "intercepted"
    # delegate the rest
    yield from g start with received  # pseudocode; not valid Python
...but without having to re-implement all of the semantics of yield from myself. That is, the laborious and poorly maintainable solution would be:
def adaptor(generator, init_send_value=None):
    send = init_send_value
    try:
        while True:
            send = yield generator.send(send)
    except StopIteration as e:
        return e.value
which is basically a bad re-implementation of yield from (it's missing handling of throw, close, etc.). Ideally I would like something more elegant and less redundant.
If you're trying to implement this generator wrapper as a generator function using yield from, then your question basically boils down to whether it is possible to specify the first value sent to the "yielded from" generator. Which it is not.
If you look at the formal specification of the yield from expression in PEP 380, you can see why. The specification contains a (surprisingly complex) piece of sample code that behaves the same as a yield from expression. The first few lines are:
_i = iter(EXPR)
try:
    _y = next(_i)
except StopIteration as _e:
    _r = _e.value
else:
    ...
You can see that the first thing that is done to the iterator is to call next() on it, which is basically equivalent to .send(None). There is no way to skip that step and your generator will always receive another None whenever yield from is used.
The solution I've come up with is to implement the generator protocol using a class instead of a generator function:
class Intercept:
    def __init__(self, generator):
        self._generator = generator
        self._intercepted = False

    def __next__(self):
        return self.send(None)

    def send(self, value):
        yielded_value = self._generator.send(value)
        # Intercept the first value yielded by the wrapped generator and
        # replace it with a different value.
        if not self._intercepted:
            self._intercepted = True
            print(f'Intercepted value: {yielded_value}')
            yielded_value = 'intercepted'
        return yielded_value

    def throw(self, type, *args):
        return self._generator.throw(type, *args)

    def close(self):
        self._generator.close()
__next__(), send(), throw(), close() are described in the Python Reference Manual.
The class wraps the generator passed to it on creation and mimics its behavior. The only thing it changes is that the first value yielded by the generator is replaced with a different value before it is returned to the caller.
We can test the behavior with an example generator f() which yields two values and a function main() which sends values into the generator until the generator terminates:
def f():
    y = yield 'first'
    print(f'f(): {y}')
    y = yield 'second'
    print(f'f(): {y}')

def main():
    value_to_send = 0
    gen = f()
    try:
        x = gen.send(None)
        while True:
            print(f'main(): {x}')
            # Send incrementing integers to the generator.
            value_to_send += 1
            x = gen.send(value_to_send)
    except StopIteration:
        print('main(): StopIteration')

main()
When run, this example produces the following output, showing which values arrive in the generator and which are returned by it:
main(): first
f(): 1
main(): second
f(): 2
main(): StopIteration
Wrapping the generator f() by changing the statement gen = f() to gen = Intercept(f()) produces the following output, showing that the first yielded value has been replaced:
Intercepted value: first
main(): intercepted
f(): 1
main(): second
f(): 2
As all other calls to any of the generator API are forwarded directly to the wrapped generator, it should behave equivalently to the wrapped generator itself.
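As a quick sanity check (my own, not part of the answer), throw() really is forwarded through the wrapper into the paused generator:

gen = Intercept(f())
print(next(gen))  # Prints 'Intercepted value: first', then 'intercepted'.
try:
    gen.throw(ValueError('boom'))  # raised inside f() at its paused yield
except ValueError as e:
    print(f'propagated: {e}')  # f() doesn't catch it, so it propagates out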
If I understand the question, I think this works? Meaning, I ran this script and it did what I expected, which was to print all but the first line of the input file. As long as the generator passed as the argument to the skip_first function can be iterated over, it should work.
def skip_first(thing):
    _first = True
    for _result in thing:
        if _first:
            _first = False
            continue
        yield _result

inp = open("/var/tmp/test.txt")
for line in skip_first(inp):
    print(line, end="")

Generator with return statement [duplicate]

While running coverage on my code, I scratched my head over the following case (Python 3.4):
def simple_gen_function(str_in, sep=""):
    if sep == "":
        yield str_in[0]
        for c in str_in[1:]:
            yield c
    else:
        return str_in
        # yield from str_in

str_in = "je teste "
t = "".join(simple_gen_function(str_in))
p = "".join(simple_gen_function(str_in, "\n"))
print("%r %r" % (t, p))
# 'je teste ' ''
Using return in the generator, the return is apparently never "reached", while using yield from str_in gives me the expected result. The question seems simple, but I believed the return in a generator would indeed be reached.
The presence of yield in a function body turns it into a generator function instead of a normal function. And in a generator function, using return is a way of saying "the generator has ended; there are no more elements." By having the first statement executed in a generator function be return str_in, you are guaranteed to get a generator that yields no elements.
As a comment mentions, the return value is used as an argument to the StopIteration exception that gets raised when the generator has ended. See:
>>> gen = simple_gen_function("hello", "foo")
>>> next(gen)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
StopIteration: hello
If there's a yield anywhere in your def, it's a generator!
In the comments, the asker mentions they thought the function turned into a generator dynamically, when the yield statement is executed. But this is not how it works! The decision is made before the code is ever executed. If Python finds a yield anywhere at all under your def, it turns that def into a generator function.
See this ultra-condensed example:
>>> def foo():
...     if False:
...         yield "bar"
...     return "baz"
...
>>> foo()
<generator object foo at ...>
>>> # The return value "baz" is only exposed via StopIteration.
>>> # You probably shouldn't use this behavior.
>>> next(foo())
Traceback (most recent call last):
  ...
StopIteration: baz
>>> # Nothing is ever yielded from the generator, so it generates no values.
>>> list(foo())
[]
