I'm using yield from, but I don't understand how wrapping it in a while loop affects yield. If I put the yield from inside a while loop it works fine, but when I drop the loop an exception occurs.
final_result = {}

def sales_sum(pro_name):
    total = 0
    nums = []
    while True:
        x = yield
        print(pro_name + " Sales volume: ", x)
        if not x:
            break
        total += x
        nums.append(x)
    return total, nums

def middle(key):
    while True:
        final_result[key] = yield from sales_sum(key)

def middle2(key):
    final_result[key] = yield from sales_sum(key)

def main(fun):
    data_sets = {"A": [1200, 1500], "B": [28, 55, 98]}
    for key, data_set in data_sets.items():
        m = fun(key)
        m.send(None)
        for value in data_set:
            m.send(value)
        m.send(None)

if __name__ == '__main__':
    main(middle)   # work well
    main(middle2)  # StopIteration
I expected main(middle2) to work just like main(middle), but it raises a StopIteration exception.
The cause of the unexpected StopIteration exception in main is that your final m.send(None) call causes your middle2 generator to be exhausted (after the sub-generator sales_sum breaks out of its loop in response to the falsey value it received). When a generator is exhausted, it raises StopIteration. Normally that's invisible because you consume iterators in for loops, but in this case it breaks your code.
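Here is a stripped-down illustration of that behaviour (not from the question, just a minimal demo):

def demo():
    x = yield   # the priming send(None) pauses here
    # when the next value is sent in, the body ends and the generator is exhausted

d = demo()
d.send(None)  # prime the generator; it pauses at the yield
d.send(None)  # the generator runs off the end, so this call raises StopIteration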
There are a few ways you could fix this. One would be to use a two-argument call to next instead of using m.send(None):
next(m, None)
This does the same thing as m.send(None), but has the added benefit of suppressing the StopIteration. Note that the None in the call to next is not really the same as the one in send. It's the default return value in the case of an exhausted iterator, not the value that gets sent in (which is always None when using next).
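Applied to the question's main, it would look roughly like this (a minimal sketch; only the last call in the inner loop changes):

def main(fun):
    data_sets = {"A": [1200, 1500], "B": [28, 55, 98]}
    for key, data_set in data_sets.items():
        m = fun(key)
        m.send(None)      # prime the generator
        for value in data_set:
            m.send(value)
        next(m, None)     # like m.send(None), but swallows StopIteration if the generator ends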
Another approach would be to change middle2 so that it doesn't end when the sales_sum generator does. You could add an extra yield statement at the end, so that it yields control one last time after doing its assignment to final_result when its sub-generator returns.
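For example (a sketch of that variant, not the original code):

def middle2(key):
    final_result[key] = yield from sales_sum(key)
    yield  # pause once more so the caller's final send(None) doesn't exhaust the generator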
A final idea would be to replace m.send(None) with m.close(). This would require some changes in sales_sum, as the close call will throw a GeneratorExit exception into the generator. If you expect it, you could use that as your signal to be done instead of looking for a falsey value:
def sales_sum(pro_name):
    total = 0
    nums = []
    while True:
        try:
            x = yield
        except GeneratorExit:
            return total, nums
        print(pro_name + " Sales volume: ", x)
        total += x
        nums.append(x)
With this change, middle2 would not need any modification.
The generator returned by sales_sum is finite. middle2 iterates over it exactly once; middle creates and iterates over a fresh one each time around its loop.
I was looking into this article by RealPython about Python generators. It uses a palindrome generator as an example to explain the .send, .throw and .close methods. Building a program following their instructions gives the following code:
def infinite_palindromes():
    num = 0
    while True:
        if is_palindrome(num):
            i = (yield num)
            if i is not None:
                num = i
        num += 1

def is_palindrome(num):
    # Skip single-digit inputs
    if num // 10 == 0:
        return False
    temp = num
    reversed_num = 0
    while temp != 0:
        reversed_num = (reversed_num * 10) + (temp % 10)
        temp = temp // 10
    if num == reversed_num:
        return True
    else:
        return False

pal_gen = infinite_palindromes()
for e in pal_gen:
    digits = len(str(e))
    pal_gen.send(10**(digits))
I watched the flow of the program in a debugger, and what happens is basically this: when the function finds 11, the first palindrome, execution enters the for e loop body, the digits variable is defined, and 10 to the power of that value is sent back to the generator. 100 is sent back, and since it is different from None, num is set to i and becomes 100, then it is incremented by 1 and becomes 101, the next palindrome. Here is what I don't get: when that number is yielded and correctly identified as a palindrome, the code goes back to the for e in pal_gen: line, but it doesn't enter the body. Only the next palindrome, 111, is sent back down and enters the body, where the digits variable is defined again, and so on; this pattern repeats, with only every other palindrome entering the body. Why is that?
Can someone please explain why that happens?
To understand the behaviour, we need to understand how for loops are implemented in Python. This code:
for e in pal_gen:
    digits = len(str(e))
    pal_gen.send(10**(digits))
is basically equivalent to (except that no name _iter is actually generated):
_iter = iter(pal_gen)
while True:
    try:
        e = next(_iter)
        digits = len(str(e))
        pal_gen.send(10**(digits))
    except StopIteration:
        break
However, because pal_gen is a generator object (created by calling the generator function infinite_palindromes), it is already an iterator; calling iter on it returns the same object unchanged. Thus, we effectively have:
while True:
    try:
        e = next(pal_gen)
        digits = len(str(e))
        pal_gen.send(10**(digits))
    except StopIteration:
        break
The other thing to note here is that calling .send on a generator advances the generator, and so does passing it to next (i.e., using it like any other iterator). It's just that the code was written to ignore the value retrieved using .send.
In fact, next on a generator is equivalent to calling .send and passing None (by convention; Python defines the behaviour this way so that the assignment in the generator code, i = (yield num), has something to assign).
So, each time through the loop, two values are sent to the generator:
while True:
    try:
        e = pal_gen.send(None)
        digits = len(str(e))
        pal_gen.send(10**(digits))
    except StopIteration:
        break
The control flow is like so:
1. None is sent to the generator. Because the generator hasn't reached a yield yet, this value is discarded (Python requires it to be None, since it can't be used by the generator).
2. The generator iterates its own loop until 11 is found and yielded. The execution of the generator pauses here: i is not assigned yet.
3. The main loop receives 11 as the value for e, computes 100 as the next value to work with, and sends that to the generator.
4. The generator receives 100 and assigns it to i, then proceeds with its logic. The next step is if i is not None:, and that is indeed the case - thus, 100 is assigned to num. num is incremented at the end of the loop.
5. The loop inside the generator runs into the next iteration, and immediately finds that 101 is a palindrome. Thus, the .send call returns a value of 101, but this isn't assigned anywhere.
Similarly, in the second iteration of the main loop, 111 will be found and yielded (as the next palindrome after 101) and assigned to e in the first call; then 1000 is sent explicitly, causing 1001 to be yielded and ignored. In the third iteration, 1111 is assigned to e, then 10001 is ignored. In the fourth iteration, 10101 (not 11111! This is the next palindrome after 10001) is assigned, and 100001 is ignored. Etc.
If the code is tested at the REPL, the values returned from the explicit .send will be displayed. This is just a quirk of the REPL.
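To make the skipped values visible, one can capture what .send returns (a small sketch reusing the question's infinite_palindromes; the skipped variable, the print and the break are additions, not part of the original code):

pal_gen = infinite_palindromes()
for e in pal_gen:
    digits = len(str(e))
    skipped = pal_gen.send(10 ** digits)  # the palindrome the for loop never sees
    print("for loop received", e, "| send() returned", skipped)
    if digits > 4:
        break  # stop the otherwise infinite loop for demonstration purposes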
What exactly happens when yield and return are used in the same function in Python, like this?
def find_all(a_str, sub):
    start = 0
    while True:
        start = a_str.find(sub, start)
        if start == -1: return
        yield start
        start += len(sub)  # use start += 1 to find overlapping matches
Is it still a generator?
Yes, it's still a generator. The return is (almost) equivalent to raising StopIteration.
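You can confirm it behaves like any ordinary generator (a quick check, not part of the original answer):

matches = find_all("spam spam spam", "spam")
print(matches)        # a generator object
print(list(matches))  # [0, 5, 10]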
PEP 255 spells it out:
Specification: Return
A generator function can also contain return statements of the form:
"return"
Note that an expression_list is not allowed on return statements in
the body of a generator (although, of course, they may appear in the
bodies of non-generator functions nested within the generator).
When a return statement is encountered, control proceeds as in any
function return, executing the appropriate finally clauses (if any
exist). Then a StopIteration exception is raised, signalling that the
iterator is exhausted. A StopIteration exception is also raised if
control flows off the end of the generator without an explicit return.
Note that return means "I'm done, and have nothing interesting to
return", for both generator functions and non-generator functions.
Note that return isn't always equivalent to raising StopIteration:
the difference lies in how enclosing try/except constructs are
treated. For example,
>>> def f1():
... try:
... return
... except:
... yield 1
>>> print list(f1())
[]
because, as in any function, return simply exits, but
>>> def f2():
... try:
... raise StopIteration
... except:
... yield 42
>>> print list(f2())
[42]
because StopIteration is captured by a bare "except", as is any
exception.
Yes, it is still a generator. An empty return or return None can be used to end a generator function. It is equivalent to raising a StopIteration (see @NPE's answer for details).
Note that a return with a non-None argument is a SyntaxError in Python versions prior to 3.3.
As pointed out by @BrenBarn in the comments, starting from Python 3.3 the return value is now passed along with the StopIteration.
From PEP 380:
In a generator, the statement
return value
is semantically equivalent to
raise StopIteration(value)
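For example (a small sketch, not from the question, showing the two ways the return value can be retrieved in Python 3.3+):

def gen():
    yield 1
    return 'finished'   # becomes StopIteration('finished')

# the return value is attached to the StopIteration exception...
g = gen()
print(next(g))          # 1
try:
    next(g)
except StopIteration as exc:
    print(exc.value)    # finished

# ...and it is also what a yield from expression evaluates to
def delegate():
    result = yield from gen()
    print('sub-generator returned', result)

print(list(delegate()))  # prints the message above, then [1]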
There is a way to combine yield and return in the same function so that it returns either a plain value or a generator.
It probably is not as clean as you would want, but it does do what you expect.
Here's an example:
def six(how_many=None):
    if how_many is None or how_many < 1:
        return None  # returns value

    if how_many == 1:
        return 6  # returns value

    def iter_func():
        for count in range(how_many):
            yield 6

    return iter_func()  # returns generator
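A possible way to exercise it (the values in the comments are what I would expect, not output taken from the answer):

print(six())          # None
print(six(1))         # 6
print(list(six(3)))   # [6, 6, 6]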
Note: you don't get a StopIteration exception with the example below.
def odd(max):
    n = 0
    while n < max:
        yield n
        n = n + 1
    return 'done'

for x in odd(3):
    print(x)
The for loop catches it; that's its signal to stop.
But you can catch it yourself in this way:
g = odd(3)
while True:
    try:
        x = next(g)
        print(x)
    except StopIteration as e:
        print("g return value:", e.value)
        break
Why must I use a variable to obtain next() values from a Python generator?
def my_gen():
    i = 0
    while i < 4:
        yield 2 * i
        i += 1

#p = my_gen()
#for i in range(4):
#    print(next(p))

##for i in range(4):
##    print(next(my_gen()))
In the above, the # block works, while the ## block prints 4 copies of the first yielded value.
print(next(my_gen()))
Each time this runs, you're calling next() on a separate (and brand-new) generator returned by my_gen().
If you want to call next() on the same generator more than once, you'll need some way to keep that generator around for long enough to reuse it.
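For example (a short sketch reusing the question's my_gen; not from the original answer):

gen = my_gen()       # create the generator once and keep a reference to it
print(next(gen))     # 0
print(next(gen))     # 2
print(next(gen))     # 4
print(next(gen))     # 6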
just gonna add my $0.02...
this may help clear up some misconceptions:
def function_that_creates_a_generator(n):
    while n > 0:
        yield n
        n -= 1

# this
for x in function_that_creates_a_generator(5):
    print(x)

# is almost exactly the same as this
generator = function_that_creates_a_generator(5)
while True:
    try:
        x = next(generator)  # same as: x = generator.__next__()
    except StopIteration:
        break
    print(x)
For loops are really just a prettier way of writing a while loop that continually asks an iterator for its next element. When you ask an iterator for another element and it has none left, it raises a StopIteration exception, which is automatically caught in for loops and signals a break from the loop. In the while loop we simply spell these steps out individually instead of having them hidden by the interpreter.
Having a bit of trouble with while loops. I understand this basic for loop runs through whatever is passed into the function, but how would I change the for loop to a while loop? I thought it would be as easy as changing the for to while, but apparently not.
def print_data(items):
    for item in items:
        print(item)
You can do something like this to have the same printing functionality with a while loop:
def print_data(items):
    i = 0
    while i < len(items):
        print(items[i])
        i += 1
Here is a while loop version that works by constructing an iterator and manually walking it. It works regardless of whether the input is a generator or a list.
def print_data(items):
    it = iter(items)
    while True:
        try:
            print(next(it))
        except StopIteration:
            break

print_data([1, 2, 3])
One option, that works on both lists and generators, is to construct an iterator and then use Python's built-in next function. When the iterator reaches the end, the next function will raise a StopIteration exception, which you can use to break the loop:
def print_data(items):
    it = iter(items)
    while True:
        try:
            print(next(it))
        except StopIteration:
            break

print_data(['a', 'b', 'c'])
print_data(('a', 'b', 'c'))
There's more about the next built-in function and iterators in the docs.
If you are learning Python:
If you need to iterate over an iterable (a list, a generator, a string, etc. - roughly speaking, something that contains things and can "give" them to you one by one), you are better off using for.
In Python, for was made for iterables, so you don't need a while.
If you are learning programming in general:
while needs a condition to be satisfied to keep looping. You create your own condition by making a counter that is incremented on every pass, and letting the while loop run while this counter is less than the length of your items, as shown in Mathias711's answer.
The for loop you are using iterates through a so-called iterator.
That means it walks through an iterable object (a list, a generator, a dict, ...) and fetches the next item via the built-in function next(). If there is no item left, calling this function raises a so-called StopIteration error, which stops the iteration.
So the Pythonic way to iterate through iterable objects is in fact the for loop you provided in your question. However, if you really want to use a while loop (which in general is not recommended here), you have to iterate using a try/except block and handle the StopIteration error raised when no item is left.
def iterate_manually(items):
    # convert the items list into an iterator
    iterator = iter(items)
    while True:
        try:
            print(next(iterator))
        # handle the StopIteration error and exit the while loop
        except StopIteration:
            break

iterate_manually(['foo', 'bar', 'baz'])
You can try this
def print_data(items):
    i = 0
    while items:
        if i < len(items):
            print(items[i])
            i = i + 1
        else:
            break