Why does the yield expression collapse?

I was messing around and noticed that the following code yields the value once, while I was expecting it to return a generator object.
def f():
    yield (yield 1)

f().next()  # returns 1

def g():
    yield (yield (yield 1))

g().next()  # returns 1
My question is: what is the value of a yield expression, and why are we allowed to nest yield expressions if they collapse like this?

The value of the yield expression after resuming depends on the method which resumed the execution. If __next__() is used (typically via either a for loop or the next() builtin) then the result is None. Otherwise, if send() is used, then the result will be the value passed in to that method.
So this:
def f():
    yield (yield 1)
Is equivalent to this:
def f():
    x = yield 1
    yield x
Which in this case (since you're not using generator.send()) is equivalent to this:
def f():
    yield 1
    yield None
Your code is only looking at the first item yielded by the generator. If you instead call list() to consume the whole sequence, you'll see what I describe:
def f():
    yield (yield 1)

def g():
    yield (yield (yield 1))
print(list(f()))
print(list(g()))
Output:
$ python3 yield.py
[1, None]
[1, None, None]
If we iterate the generator manually (as you have), but .send() it values, then you can see that yield "returns" this value:
gen = f()
print(next(gen))
print(gen.send(42))
Output:
$ python3 yield_manual.py
1
42
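The same manual driving works with the triple-nested g(). A minimal sketch continuing the example above: each value you send comes back out of the enclosing yield:
gen = g()
print(next(gen))      # 1: runs to the innermost yield
print(gen.send('a'))  # 'a': the sent value becomes the middle yield's output
print(gen.send('b'))  # 'b': and then the outer yield's output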

Related

function is still a generator if "yield" used in an if clause [duplicate]

In Python 2, it was an error to have return together with yield in a function definition. But for this code in Python 3.3:
def f():
    return 3
    yield 2

x = f()
print(x.__next__())
there is no error about return being used in a function with yield. However, when __next__ is called, a StopIteration exception is raised. Why isn't the value 3 simply returned? Is the return somehow ignored?
This is a new feature in Python 3.3. Much like return in a generator has long been equivalent to raise StopIteration(), return <something> in a generator is now equivalent to raise StopIteration(<something>). For that reason, the exception you're seeing should be printed as StopIteration: 3, and the value is accessible through the value attribute on the exception object. If the generator is delegated to using the (also new) yield from syntax, that value becomes the result of the yield from expression. See PEP 380 for details.
def f():
    return 1
    yield 2

def g():
    x = yield from f()
    print(x)

# g is still a generator so we need to iterate to run it:
for _ in g():
    pass
This prints 1, but not 2.
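To make both channels visible at once, here is a small sketch of my own that moves the return after a yield: the yielded value still passes through g to its caller, while the return value becomes the result of the yield from expression:
def f():
    yield 2
    return 1

def g():
    x = yield from f()
    print('f returned:', x)

print(list(g()))  # prints "f returned: 1", then [2]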
The return value is not ignored, but generators only yield values; a return simply ends the generator, in this case early. Advancing the generator never reaches the yield statement in that case.
Whenever an iterator reaches the 'end' of the values to yield, StopIteration must be raised. Generators are no exception. As of Python 3.3, however, any return expression becomes the value of the exception:
>>> def gen():
...     return 3
...     yield 2
...
>>> try:
...     next(gen())
... except StopIteration as ex:
...     e = ex
...
>>> e
StopIteration(3,)
>>> e.value
3
Use the next() function to advance iterators, instead of calling .__next__() directly:
print(next(x))

How does "yield from" work in recursive functions?

I'm writing some python code where I need to use generators inside recursive functions. Here is some code I wrote to mimic what I am trying to do. This is attempt 1:
def f():
    def f2(i):
        if i > 0:
            yield i
            f2(i - 1)
    yield f2(10)

for x in f():
    for y in x:
        print(y)
This only prints 10. Attempt 2 uses this yield from construct I found online:
def f():
    def f2(i):
        if i > 0:
            yield i
            yield from f2(i - 1)
    yield from f2(10)

for x in f():
    print(x)
This does what I want, but I don't understand what is happening. What is yield from doing behind the scenes, and why doesn't my first attempt work?
You can think of yield from as a for loop which yields every item:
for i in f(10):
    yield i
is the same as yield from f(10). In other words, it yields the items from the given iterable, which in this case is another generator.
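Applied to the question's first attempt, a minimal sketch of that equivalence (re-yielding the recursive results with an explicit loop instead of yield from) looks like this; the first attempt fails precisely because f2(i - 1) creates the recursive generator but nothing ever iterates it:
def f():
    def f2(i):
        if i > 0:
            yield i
            for x in f2(i - 1):  # same effect as: yield from f2(i - 1)
                yield x
    for x in f2(10):  # same effect as: yield from f2(10)
        yield x

print(list(f()))  # [10, 9, 8, 7, 6, 5, 4, 3, 2, 1]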
yield from g() delegates into the new generator g, yielding each value that g's yield statements produce.
so
def g1():
    yield from g2()

def g2():
    for i in range(10):
        yield i * 2
You can think of yield from in g1 as unrolling g2 inside of it, expanding to something like this:
def g1():
    for i in range(10):
        yield i * 2
This is not literally what happens, because each generator keeps its own scope, but during the execution of yield from g2() in g1 the interpreter descends into g2 and yields each value that it yields, possibly recursing into yet another generator.
Now consider this generator
def flatten(maybe_it):
    try:
        for i0 in maybe_it:
            for i1 in flatten(i0):
                yield i1
    except TypeError:
        yield maybe_it
With yield from, it can be rewritten as:
def flatten(maybe_it):
    try:
        for i0 in maybe_it:
            yield from flatten(i0)
    except TypeError:
        yield maybe_it
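A quick usage check (my own sketch; note that this TypeError-based flatten recurses forever on strings, because iterating a one-character string yields that same string):
print(list(flatten([1, [2, [3, 4]], 5])))  # [1, 2, 3, 4, 5]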

Check whether the generator is awaiting "send" or yielding a value

How can I check whether some blackbox generator is awaiting a value or producing one? I mean, managing the following generator:
def gen():
    a = yield
    yield a
    yield a + 1
    yield a + 2
can look like the following:
g = gen()
g.next()
print g.send(5)
print g.next()
print g.next()
and for a different generator, for example:
def gen():
    a = yield
    b = yield
    yield a + b
the driving code needs to be different as well, for example:
g = gen()
g.next()
g.send(1)
print g.send(2)
So the question is: how can I choose between sending a value into the generator and getting results from it, when the generator is a blackbox (third-party) one? I need to write the following code:
values = [1, 2, 3]
results = list()
g = gen()
g.next()
for v in values:
    # needs this magic
    if g.__awaits__:  # in case of "x = yield" expression
        results.append(g.send(v))
    elif g.__yields__:  # in case of "yield x" expression
        results.append(g.next())
You can't, because there isn't any difference in terms of the allowable operations on those two "kinds" of generators. You can always send a value to the generator as soon as you have called next on it once, even if the generator doesn't do anything with the sent value:
>>> def gen():
...     yield 1
...     yield 2
...     yield 3
>>> x = gen()
>>> next(x)
1
>>> x.send('hello')
2
>>> x.send('hello')
3
There's no way to tell whether the generator "needs" you to send a value except by reading the documentation for whatever generator you're using.
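For what it's worth, the closest thing to introspection here is inspect.getgeneratorstate(), and it only reports whether a generator is suspended, not whether the next resumption will use a sent value. A small sketch illustrating the point:
import inspect

def wants_a_value():
    a = yield  # expects send()
    yield a

def just_yields():
    yield 1
    yield 2

for make in (wants_a_value, just_yields):
    g = make()
    next(g)
    # Both report GEN_SUSPENDED; the state cannot tell the two apart.
    print(make.__name__, inspect.getgeneratorstate(g))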
