I'm trying to write a function that returns the next element of a generator and, if the generator is exhausted, resets it and returns the next result. The expected output of the code below would be:
1
2
3
1
2
However, that is obviously not what I get. What am I doing that is incorrect?
a = '123'

def convert_to_generator(iterable):
    return (x for x in iterable)

ag = convert_to_generator(a)

def get_next_item(gen, original):
    try:
        return next(gen)
    except StopIteration:
        gen = convert_to_generator(original)
        get_next_item(gen, original)

for n in range(5):
    print(get_next_item(ag, a))
1
2
3
None
None
Is itertools.cycle(iterable) a possible alternative?
You need to return the result of your recursive call:
return get_next_item(gen, original)
which still does not make this a working approach.
The generator ag used in your for-loop is not changed by the rebinding of the local variable gen in your function. It will stay exhausted...
As has been mentioned in the comments, check out itertools.cycle.
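For reference, here is a minimal sketch of what that could look like with the data from the question (using plain itertools.cycle, no custom helpers; the variable names are just for illustration):

import itertools

a = '123'
cycled = itertools.cycle(a)  # repeats '1', '2', '3' forever

for n in range(5):
    print(next(cycled))      # prints 1 2 3 1 2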
The easy way is to just use itertools.cycle. Otherwise you would need to remember the elements of the iterable yourself, because if the iterable is an iterator (e.g. a generator) it can't be reset; if it is not an iterator, you can reuse it many times.
The documentation includes an example implementation:
def cycle(iterable):
    # cycle('ABCD') --> A B C D A B C D A B C D ...
    saved = []
    for element in iterable:
        yield element
        saved.append(element)
    while saved:
        for element in saved:
            yield element
Or, for example, to reuse the iterable directly when it is not an iterator:
def cycle(iterable):
    # cycle('ABCD') --> A B C D A B C D A B C D ...
    if iter(iterable) is iter(iterable):  # it is an iterator
        saved = []
        for element in iterable:
            yield element
            saved.append(element)
    else:
        saved = iterable
    while saved:
        for element in saved:
            yield element
Example use:

test = cycle("123")
for i in range(5):
    print(next(test))
Now, about your code: the problem is simple, it doesn't remember its state.
def get_next_item(gen, original):
    try:
        return next(gen)
    except StopIteration:
        gen = convert_to_generator(original)  # <-- the problem is here
        get_next_item(gen, original)  # and you should return something here
In the marked line a new generator is built, but you would need to update your ag variable outside the function to get the desired behavior. There are ways to do that, such as changing your function to return both the element and the generator. There are other ways, too, but they are either not recommended or more complicated, such as building a class so it remembers its state.
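To illustrate the first of those options, here is a rough sketch (not from the original post) of a version that returns both the element and the possibly rebuilt generator, so the caller can rebind its own variable. It assumes a and convert_to_generator as defined in the question, and a non-empty input:

def get_next_item(gen, original):
    try:
        return next(gen), gen
    except StopIteration:
        gen = convert_to_generator(original)  # rebuild once the old one is exhausted
        return next(gen), gen                 # assumes original is non-empty

ag = convert_to_generator(a)
for n in range(5):
    item, ag = get_next_item(ag, a)  # rebinding ag is what makes the reset stick
    print(item)                      # prints 1 2 3 1 2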
get_next_item is a generator function: it returns an iterator, which gives you the values it yields via the __next__ method. For that reason, your bare call statement doesn't do anything.
What you want to do is this:
def get_next_item(gen, original):
    try:
        return next(gen)
    except StopIteration:
        gen = convert_to_generator(original)
        for i in get_next_item(gen, original):
            return i
Or shorter, and completely equivalent (as long as gen has an __iter__ method, which it probably does):
def get_next_item(gen, original):
    for i in gen:
        yield i
    for i in get_next_item(convert_to_generator(original), original):
        yield i
Or without recursion (which is a big problem in Python, as it is 1. limited in depth and 2. slow):
def get_next_item(gen, original):
    for i in gen:
        yield i
    while True:
        for i in convert_to_generator(original):
            yield i
If convert_to_generator is just a call to iter, it is even shorter:
def get_next_item(gen, original):
    for i in gen:
        yield i
    while True:
        for i in original:
            yield i
or, with itertools:
import itertools

def get_next_item(gen, original):
    return itertools.chain(gen, itertools.cycle(original))
and get_next_item is equivalent to itertools.cycle if gen is guaranteed to be an iterator for original.
Side note: with Python 3.3 or higher you can replace for i in x: yield i with yield from x (where x is some expression).
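For instance, the non-recursive version above could be written as follows on Python 3.3+ (just a sketch of the same idea, shown with the question's data; it assumes original is re-iterable, such as a string or list, rather than an iterator):

def get_next_item(gen, original):
    yield from gen             # drain whatever is left in the current iterator
    while True:
        yield from original    # then restart from the original iterable forever

test = get_next_item(iter('123'), '123')
for n in range(5):
    print(next(test))          # prints 1 2 3 1 2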
Related question:
Since Python 3.3, if a generator function returns a value, that becomes the value for the StopIteration exception that is raised. This can be collected a number of ways:
The value of a yield from expression, which implies the enclosing function is also a generator.
Wrapping a call to next() or .send() in a try/except block.
However, if I simply want to iterate over the generator in a for loop - the easiest way - there doesn't appear to be a way to collect the value of the StopIteration exception, and thus the return value. I'm using a simple example where the generator yields values and returns some kind of summary at the end (running totals, averages, timing statistics, etc.).
for i in produce_values():
    do_something(i)

values_summary = ....??
One way is to handle the loop myself:
values_iter = produce_values()
try:
    while True:
        i = next(values_iter)
        do_something(i)
except StopIteration as e:
    values_summary = e.value
But this throws away the simplicity of the for loop. I can't use yield from since that requires the calling code to be, itself, a generator. Is there a simpler way than the roll-one's-own loop shown above?
You can think of the value attribute of StopIteration (and arguably StopIteration itself) as implementation details, not designed to be used in "normal" code.
Have a look at PEP 380, which specifies the yield from feature of Python 3.3: it discusses some alternatives to using StopIteration to carry the return value that were considered.
Since you are not supposed to get the return value in an ordinary for loop, there is no syntax for it. The same way as you are not supposed to catch the StopIteration explicitly.
A nice solution for your situation would be a small utility class (might be useful enough for the standard library):
class Generator:
    def __init__(self, gen):
        self.gen = gen

    def __iter__(self):
        self.value = yield from self.gen
This wraps any generator and catches its return value to be inspected later:
>>> def test():
...     yield 1
...     return 2
...
>>> gen = Generator(test())
>>> for i in gen:
...     print(i)
...
1
>>> print(gen.value)
2
You could make a helper wrapper, that would catch the StopIteration and extract the value for you:
from functools import wraps

class ValueKeepingGenerator(object):
    def __init__(self, g):
        self.g = g
        self.value = None

    def __iter__(self):
        self.value = yield from self.g

def keep_value(f):
    @wraps(f)
    def g(*args, **kwargs):
        return ValueKeepingGenerator(f(*args, **kwargs))
    return g
@keep_value
def f():
    yield 1
    yield 2
    return "Hi"

v = f()
for x in v:
    print(x)

print(v.value)
A light-weight way to handle the return value (one that doesn't involve instantiating an auxiliary class) is to use dependency injection.
Namely, one can pass in the function to handle / act on the return value using the following wrapper / helper generator function:
def handle_return(generator, func):
    returned = yield from generator
    func(returned)
For example, the following--
def generate():
    yield 1
    yield 2
    return 3

def show_return(value):
    print('returned: {}'.format(value))

for x in handle_return(generate(), show_return):
    print(x)
results in--
1
2
returned: 3
The most obvious method I can think of for this would be a user-defined type that remembers the summary for you.
>>> import random
>>> class ValueProducer:
...     def produce_values(self, n):
...         self._total = 0
...         for i in range(n):
...             r = random.randrange(n*100)
...             self._total += r
...             yield r
...         self.value_summary = self._total/n
...         return self.value_summary
...
>>> v = ValueProducer()
>>> for i in v.produce_values(3):
...     print(i)
...
25
55
179
>>> print(v.value_summary)
86.33333333333333
>>>
Another lightweight way that is sometimes appropriate is to yield the running summary at every generator step, in a tuple together with your primary value. The loop stays simple, with an extra binding that is still available afterwards:
for i, summary in produce_values():
    do_something(i)

show_summary(summary)
This is especially useful if someone could use more than just the last summary value, e.g. for updating a progress view.
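A hypothetical produce_values along those lines might look like this (the names and the choice of a running average as the summary are assumptions for illustration only):

import random

def produce_values(n=3):
    total = 0
    for i in range(1, n + 1):
        r = random.randrange(100)
        total += r
        yield r, total / i   # primary value plus the running average so far

for value, summary in produce_values():
    print(value)

print(summary)   # the last running summary is still bound after the loop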