Why does this recursive `yield from` function not raise an error? - python

def prefixes(s):
    if s:
        yield from prefixes(s[:-1])
        yield s

t = prefixes('both')
next(t)
The next(t) call returns 'b'. I'm confused as to why, because if we follow the yield from statements down, we eventually reach yield from prefixes(''), which would return None. In all my other tests, yield from None raises a TypeError. Instead, it seems to just be ignored, and prefixes('b') moves on to its next yield statement (why does it do that?) to yield 'b'...
Any ideas as to why? Would greatly appreciate an explanation.

prefixes is a generator function: calling it produces a generator that raises StopIteration when the function body returns. When passed an empty string, prefixes skips its yields, reaches the end of its code block and returns, causing StopIteration. The return value doesn't matter; it is discarded:
>>> next(prefixes(""))
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
StopIteration
yield from suppresses the inner generator's StopIteration and lets the outer generator continue.
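A minimal sketch of that suppression (the generator names inner and outer are my own, not from the question):

```python
def inner():
    # An "empty" generator: it returns immediately, which raises
    # StopIteration inside the generator machinery.
    return
    yield  # unreachable, but its presence makes this a generator function

def outer():
    yield from inner()  # inner's StopIteration is swallowed here...
    yield "after"       # ...and execution simply continues

print(list(outer()))  # ['after']
```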

Generators are lazy (on-demand) objects; you didn't exhaust your generator t. To exhaust your generator, you can use:
list(t)
# ['b', 'bo', 'bot', 'both']
Now if you call next(t), you will get the expected StopIteration:
StopIteration Traceback (most recent call last)
<ipython-input-25-680ef78477e2> in <module>
6 t = prefixes('both')
7 list(t)
----> 8 next(t)
StopIteration:
The if statement guarantees that the recursion has a base case: prefixes('') skips the if-block and yields nothing, so the chain of recursive calls always bottoms out at the empty string and you never get a TypeError.
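To see the base case directly, here is a small check using the same prefixes function as in the question:

```python
def prefixes(s):
    if s:
        yield from prefixes(s[:-1])
        yield s

# The empty string skips the if-block, so the generator finishes
# without yielding anything -- that is the base case of the recursion.
print(list(prefixes('')))      # []
print(list(prefixes('both')))  # ['b', 'bo', 'bot', 'both']
```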

Related

What does a yield inside a yield do?

Consider the following code:
def mygen():
    yield (yield 1)

a = mygen()
print(next(a))
print(next(a))
The output is:
1
None
What does the interpreter do at the "outside" yield exactly?
a is a generator object. The first time you call next on it, the body is evaluated up to the first yield expression (that is, the first to be evaluated: the inner one). That yield produces the value 1 for next to return, then blocks until the next entry into the generator. That is produced by the second call to next, which does not send any value into the generator. As a result, the first (inner) yield evaluates to None. That value is used as the argument for the outer yield, which becomes the return value of the second call to next. If you were to call next a third time, you would get a StopIteration exception.
Compare the use of the send method (instead of next) to change the return value of the first yield expression.
>>> a = mygen()
>>> next(a)
1
>>> a.send(3) # instead of next(a)
3
>>> next(a)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
StopIteration
A more explicit way of writing the generator would have been
def mygen():
    x = yield 1
    yield x

a = mygen()
print(a.send(None))  # outputs 1, from yield 1
print(a.send(5))     # makes yield 1 == 5, then gets 5 back from yield x
print(a.send(3))     # raises StopIteration, as there's nothing after yield x
Prior to Python 2.5, the yield statement provided one-way communication between a caller and a generator; a call to next would execute the generator up to the next yield statement, and the value provided by the yield keyword would serve as the return value of next. The generator would also suspend at the point of the yield statement, waiting for the next call to next to resume.
In Python 2.5, the yield statement was replaced* with the yield expression, and generators acquired a send method. send works very much like next, except it can take an argument. (For the rest of this, assume that next(a) is equivalent to a.send(None).) A generator starts execution after a call to send(None), at which point it executes up to the first yield, which returns a value as before. Now, however, the expression blocks until the next call to send, at which point the yield expression evaluates to the argument passed to send. A generator can now receive a value when it resumes.
* Not quite replaced; kojiro's answer goes into more detail about the subtle difference between a yield statement and yield expression.
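The two-way protocol described above can be sketched with a small echo-style generator (the name echo is my own illustration, not from the answer):

```python
def echo():
    received = None
    while True:
        # Yield the last received value, then pause; when the caller
        # resumes us, the yield expression evaluates to the sent value.
        received = yield received

gen = echo()
print(next(gen))     # None -- runs up to the first yield
print(gen.send(10))  # 10 -- yield evaluates to 10, which is yielded back
print(gen.send(20))  # 20
```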
yield has two forms, expressions and statements. They're mostly the same, but I most often see them in the statement form, where the result would not be used.
def f():
    yield a_thing
But in the expression form, yield has a value:
def f():
    y = yield a_thing
In your question, you're using both forms:
def f():
    yield (      # statement
        yield 1  # expression
    )
When you iterate over the resulting generator, you get first the result of the inner yield expression
>>> x=f()
>>> next(x)
1
At this point, the inner expression has also produced a value that the outer statement can use
>>> next(x)
>>> # None
and now you've exhausted the generator
>>> next(x)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
StopIteration
To understand more about statements vs expressions, there are good answers in other stackoverflow questions: What is the difference between an expression and a statement in Python?
>>> def mygen():
... yield (yield 1)
...
>>> a = mygen()
>>>
>>> a.send(None)
1
>>> a.send(5)
5
>>> a.send(2)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
StopIteration
>>>
>>>
>>>
>>> def mygen():
... yield 1
...
>>> def mygen2():
... yield (yield 1)
...
>>> def mygen3():
... yield (yield (yield 1))
...
>>> a = mygen()
>>> a2 = mygen2()
>>> a3 = mygen3()
>>>
>>> a.send(None)
1
>>> a.send(0)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
StopIteration
>>> a2.send(None)
1
>>> a2.send(0)
0
>>> a2.send(1)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
StopIteration
>>> a3.send(None)
1
>>> a3.send(0)
0
>>> a3.send(1)
1
>>> a3.send(2)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
StopIteration
>>>
Every other yield simply waits for a value to be passed in; generators don't only produce data, they also receive it.
>>> def mygen():
... print('Wait for first input')
... x = yield # this is what we get from send
... print(x, 'is received')
...
>>> a = mygen()
>>> a.send(None)
Wait for first input
>>> a.send('bla')
bla is received
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
StopIteration
>>>
yield hands out the next value when the generator resumes, and when it is not being used to give out the next value, it is being used to receive the next one:
>>> def mygen():
... print('Wait for first input')
... x = yield # this is what we get from send
... yield x*2 # this is what we give
...
>>> a = mygen()
>>> a.send(None)
Wait for first input
>>> a.send(5)
10
>>>
A generator hands out elements until it runs out of them.
In the 2-level nested example below, the first next gives us the element from the innermost yield, which is 1; the second next just returns None, since there is no value left to return; and if you call next again, it will raise StopIteration.
def mygen():
    yield (yield 1)

a = mygen()
print(next(a))
print(next(a))
print(next(a))  # raises StopIteration
You can expand this case to include more nested yields, and you will see that after n calls to next, a StopIteration exception is thrown; below is an example with 5 nested yields.
def mygen():
    yield (yield (yield (yield (yield 1))))

a = mygen()
print(next(a))
print(next(a))
print(next(a))
print(next(a))
print(next(a))
print(next(a))  # raises StopIteration
Note that this answer is just based on my observation, and might not be technically correct in the nitty-gritties, all updates and suggestions are welcome

Python chained `yield from`

I was reading through this article about async and await in python, and saw this sample code:
def bottom():
    # Returning the yield lets the value that goes up the call stack
    # come right back down.
    return (yield 42)

def middle():
    return (yield from bottom())

def top():
    return (yield from middle())

# Get the generator.
gen = top()
value = next(gen)
print(value)  # Prints '42'.
try:
    value = gen.send(value * 2)
except StopIteration as exc:
    value = exc.value
print(value)  # Prints '84'.
I can understand chaining the generators to return 42, but I can't get my head around gen.send(value * 2) giving back 84. I would have thought the initial next(gen) would have already exhausted the generator, as in the following experiment:
def bottom():
    # Returning the yield lets the value that goes up the call stack
    # come right back down.
    return (yield 42)

def middle():
    return (yield from bottom())

def top():
    return (yield from middle())

# Get the generator.
gen = top()
value = next(gen)
print(value)  # Prints '42'.
value = next(gen)
print(value)
Traceback (most recent call last):
File "a.py", line 16, in <module>
value = next(gen)
StopIteration
Could someone please explain?
PS: Not the most well thought out title, someone please help fix it...
As @Steven Rumbalski already explained in the comments: the generator yields only one value -- the 42. On the second call, the iterator raises StopIteration, which is caught by the except StopIteration as exc:. So you are completely right that the initial next(gen) has already exhausted the generator. The same holds true in your second example, but in that case you are not catching the StopIteration exception. For further reading, I will quote from PEP 342 and from PEP 380 -- Syntax for Delegating to a Subgenerator.
Add a new send() method for generator-iterators, which resumes the generator and sends a value that becomes the result of the current yield-expression. The send() method returns the next value yielded by the generator, or raises StopIteration if the generator exits without yielding another value.
from PEP342
So why do you get 84 back from gen.send(value * 2)? The value of value at this point is still 42, from the previous value = next(gen) call. So you are just getting back the 84 that you wanted to send to the iterator. Why is that?
Consider the following simplified examples to understand better what that definition means. First we only yield a single value. No returns. This results in an empty StopIteration with value attribute None.
def generator():
    yield 1

gen = generator()
value = next(gen)
print(value)  # Prints '1'.
try:
    value = gen.send(3)
except StopIteration as exc:
    value = exc.value
print(value)  # Prints 'None'.
return expr in a generator causes StopIteration(expr) to be raised upon exit from the generator.
from PEP380
Then we return 2 once the generator has no further values to yield. In this case, the rule from PEP 380 comes into play. As 2 is the returned value, the value attribute of StopIteration is 2.
def generator():
    yield 1
    return 2

gen = generator()
value = next(gen)
print(value)  # Prints '1'.
try:
    value = gen.send(3)
except StopIteration as exc:
    value = exc.value
print(value)  # Prints '2'.
Now we return (yield 1). According to the rule from PEP342 the 3 as the value of gen.send(3) becomes the result of the current yield-expression. But since the generator is already exhausted the StopIteration exception is raised. Which leads to the 3 as the value of the raised StopIteration.
def generator():
    return (yield 1)

gen = generator()
value = next(gen)
print(value)  # Prints '1'.
try:
    value = gen.send(3)
except StopIteration as exc:
    value = exc.value
print(value)  # Prints '3'.

Stop Iteration error when using next()

I am not able to clarify to myself the use of next() in Python 3.
I have this data:
chr pos ms01e_PI ms01e_PG_al ms02g_PI ms02g_PG_al ms03g_PI ms03g_PG_al ms04h_PI ms04h_PG_al
2 15881989 4 C|C 6 A|C 7 C|C 7 C|C
2 15882091 4 A|T 6 A|T 7 T|A 7 A|A
2 15882148 4 T|T 6 T|T 7 T|T 7 T|G
and I read it like:
Works fine
c = csv.DictReader(io.StringIO(data), dialect=csv.excel_tab)
print(c)
print(list(c))
Works fine
c = csv.DictReader(io.StringIO(data), dialect=csv.excel_tab)
print(c)
keys = next(c)
print('keys:', keys)
But, now there is a problem.
c = csv.DictReader(io.StringIO(data), dialect=csv.excel_tab)
print(c)
print(list(c))
keys = next(c)
print('keys:', keys)
Error message:
Traceback (most recent call last):
  File "/home/everestial007/Test03.py", line 24, in <module>
    keys = next(c)
  File "/home/everestial007/anaconda3/lib/python3.5/csv.py", line 110, in __next__
    row = next(self.reader)
StopIteration
Why does keys = next(c) after print(list(c)) give StopIteration? I read the documentation, but I am not clear on this particular example.
The error isn't with the print statement. It's with the keys = next(c) line. Consider a simpler example which reproduces your issue.
a = (i ** 2 for i in range(5))
a # `a` is a generator object
<generator object <genexpr> at 0x150e286c0>
list(a) # calling `list` on `a` will exhaust the generator
[0, 1, 4, 9, 16]
next(a) # calling `next` on an exhausted generator object raises `StopIteration`
---------------------------------------------------------------------------
StopIteration Traceback (most recent call last)
<ipython-input-2076-3f6e2eea332d> in <module>()
----> 1 next(a)
StopIteration:
What happens is that c is an iterator object (very similar to the generator a above), and is meant to be iterated over once until it is exhausted. Calling list on this object will exhaust it, so that the elements can be collected into a list.
Once the object has been exhausted, it will not produce any more elements. At this point, the iterator mechanism is designed to raise a StopIteration if you attempt to iterate over it after it has been exhausted. Constructs such as for loops listen for this error and swallow it silently; a bare next, however, propagates the exception as soon as it is raised.
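The difference between a for loop and a bare next call can be seen directly (a small sketch; next's optional default argument is standard Python):

```python
it = iter([1, 2])
for x in it:
    pass  # the for loop silently catches the final StopIteration

# Iterating the exhausted iterator again simply does nothing:
for x in it:
    print("never printed")

# A bare next() would raise StopIteration; passing a default avoids that:
print(next(it, 'exhausted'))  # 'exhausted'
```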

Generator with return statement [duplicate]

This question already has answers here:
Return in generator together with yield
(2 answers)
Closed 6 years ago.
During a coverage run, I scratched my head over the following case (Python 3.4):
def simple_gen_function(str_in, sep=""):
    if sep == "":
        yield str_in[0]
        for c in str_in[1:]:
            yield c
    else:
        return str_in
        # yield from str_in

str_in = "je teste "
t = "".join(simple_gen_function(str_in))
p = "".join(simple_gen_function(str_in, "\n"))
print("%r %r" % (t, p))
# 'je teste ' ''
Using return in the generator, the return was not "reached", while using yield from str_in I got the expected result.
The question seems simple, but I believed that using return in a generator, it would indeed be reached.
The presence of yield in a function body turns it into a generator function instead of a normal function. And in a generator function, using return is a way of saying "the generator has ended, there are no more elements." Since the else branch makes return str_in the first statement your generator executes, you are guaranteed to get a generator that yields no elements.
As a comment mentions, the return value is used as an argument to the StopIteration exception that gets raised when the generator has ended. See:
>>> gen = simple_gen_function("hello", "foo")
>>> next(gen)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
StopIteration: hello
If there's a yield anywhere in your def, it's a generator!
In the comments, the asker mentions they thought the function turned into a generator dynamically, when the yield statement is executed. But this is not how it works! The decision is made before the code is ever executed. If Python finds a yield anywhere at all under your def, it turns that def into a generator function.
See this ultra-condensed example:
>>> def foo():
... if False:
... yield "bar"
... return "baz"
>>> foo()
<generator object foo at ...>
>>> # The return value "baz" is only exposed via StopIteration
>>> # You probably shouldn't use this behavior.
>>> next(foo())
Traceback (most recent call last):
...
StopIteration: baz
>>> # Nothing is ever yielded from the generator, so it generates no values.
>>> list(foo())
[]

yield from a generator that has return <value> statement in it

I have a generator with a return value statement in it.
If I use next on it, I get the StopIteration: value from it, as expected.
However, when I use yield from, the value is lost.
In [1]: def test():
...: return 1
...: yield 2
...:
In [2]: t = test()
In [3]: t
Out[3]: <generator object test at 0x000000000468F780>
In [4]: next(t)
---------------------------------------------------------------------------
StopIteration Traceback (most recent call last)
<ipython-input-4-9494367a8bed> in <module>()
----> 1 next(t)
StopIteration: 1
In [5]: def new():
...: yield from test()
...:
In [6]: n = new()
In [7]: n
Out[7]: <generator object new at 0x00000000050F23B8>
In [8]: next(n)
---------------------------------------------------------------------------
StopIteration Traceback (most recent call last)
<ipython-input-8-1c47c7af397e> in <module>()
----> 1 next(n)
StopIteration:
Is there a way to preserve the value when using yield from ?
Is this working as intended or maybe it is a bug ?
By capturing the value returned by the sub-generator through the yield from expression.
Taking a quote from PEP 380 -- Syntax for Delegating to a Subgenerator:
The value of the yield from expression is the first argument to the StopIteration exception raised by the iterator when it terminates.
So with a small tweak, res in the new generator will contain the value of StopIteration raised from the test subgenerator:
def new():
res = yield from test()
return res
Now when next(n) is executed you'll get the value in the Exception message:
n = new()
next(n)
---------------------------------------------------------------------------
StopIteration Traceback (most recent call last)
<ipython-input-39-1c47c7af397e> in <module>()
----> 1 next(n)
StopIteration: 1
Oh, and as an addendum, you can of course get the 'return' value without it being encapsulated in the StopIteration object by using yield again:
def new():
res = yield from test()
yield res
Now calling next(new()) will return the value returned from test():
next(new())
Out[20]: 1
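A third option, if you don't want the extra yield, is to catch the StopIteration yourself and read its value attribute (using the same test generator as above):

```python
def test():
    return 1
    yield 2

gen = test()
try:
    next(gen)
except StopIteration as exc:
    # The return value rides along on the exception's value attribute.
    result = exc.value

print(result)  # 1
```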
