I'm writing some python code where I need to use generators inside recursive functions. Here is some code I wrote to mimic what I am trying to do. This is attempt 1.
def f():
    def f2(i):
        if i > 0:
            yield i
            f2(i - 1)
    yield f2(10)

for x in f():
    for y in x:
        print(y)
This only prints 10. Attempt 2, using this yield from construct I found online:
def f():
    def f2(i):
        if i > 0:
            yield i
            yield from f2(i - 1)
    yield from f2(10)

for x in f():
    print(x)
This does what I want, but I don't understand what is happening. What is yield from doing behind the scenes, and why doesn't my first attempt work?
You can think of yield from as a for loop which yields every item:

for i in f(10):
    yield i

is the same as yield from f(10). In other words, it yields the items from the given iterable, which in this case is another generator. This is also why your first attempt fails: the bare call f2(i - 1) creates a new generator object and immediately discards it without ever iterating it, so only the first value is printed.
yield from g() steps into the new generator g and yields each value that g yields, one by one.
So:

def g1():
    yield from g2()

def g2():
    for i in range(10):
        yield i * 2
You can think of yield from in g1 as if it unrolled g2 inside of it, expanding to something like this:

def g1():
    for i in range(10):
        yield i * 2

That is not literally what happens, because each generator keeps its own scope, but during the execution of yield from g2() in g1 the interpreter steps into g2 and yields each value that g2 yields, possibly recursing into yet another generator.
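A quick way to convince yourself of this unrolling (the g1_unrolled name is mine, just a hand-expanded copy for comparison):

```python
def g2():
    for i in range(10):
        yield i * 2

def g1():
    yield from g2()

def g1_unrolled():
    # hand-expanded version of g1, as described above
    for i in range(10):
        yield i * 2

print(list(g1()) == list(g1_unrolled()))  # True
```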
Now consider this generator:

def flatten(maybe_it):
    try:
        for i0 in maybe_it:
            for i1 in flatten(i0):
                yield i1
    except TypeError:
        yield maybe_it
With yield from it can be rewritten as:

def flatten(maybe_it):
    try:
        for i0 in maybe_it:
            yield from flatten(i0)
    except TypeError:
        yield maybe_it
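For instance, a sketch with a nested list of numbers (note that strings would recurse forever here, since iterating a one-character string yields that same string, so this assumes non-string elements):

```python
def flatten(maybe_it):
    try:
        for i0 in maybe_it:
            yield from flatten(i0)
    except TypeError:
        # maybe_it is not iterable, so it is a plain value
        yield maybe_it

print(list(flatten([1, [2, [3, 4]], 5])))  # [1, 2, 3, 4, 5]
```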
Related
I'm trying to iterate over multiple generators randomly, and skip those that are exhausted by removing them from the list of available generators. However, the CombinedGenerator doesn't call itself like it should to switch generators. Instead it throws a StopIteration when the smaller generator is exhausted. What am I missing?
The following works:
gen1 = (i for i in range(0, 5, 1))
gen2 = (i for i in range(100, 200, 1))
list_of_gen = [gen1, gen2]
print(list_of_gen)
list_of_gen.remove(gen1)
print(list_of_gen)
list_of_gen.remove(gen2)
print(list_of_gen)
where each generator is removed by its reference.
But here it doesn't:

import random

gen1 = (i for i in range(0, 5, 1))
gen2 = (i for i in range(100, 200, 1))
total = 105

class CombinedGenerator:
    def __init__(self, generators):
        self.generators = generators

    def __call__(self):
        generator = random.choice(self.generators)
        try:
            yield next(generator)
        except StopIteration:
            self.generators.remove(generator)
            if len(self.generators) != 0:
                self.__call__()
            else:
                raise StopIteration

c = CombinedGenerator([gen1, gen2])

for i in range(total):
    print(f"iter {i}")
    print(f"yielded {next(c())}")
As @Tomerikoo mentioned, you are basically creating your own generator, so it is better to implement __next__, which is the cleaner and more Pythonic way.
The above code can be fixed with the lines below:

def __call__(self):
    generator = random.choice(self.generators)
    try:
        yield next(generator)
    except StopIteration:
        self.generators.remove(generator)
        if len(self.generators) != 0:
            # yield your self.__call__() result as well
            yield next(self.__call__())
        else:
            raise StopIteration
First of all, in order to fix your current code, you just need to match the pattern you created by changing the line:
self.__call__()
to:
yield next(self.__call__())
Then, I would make a few small changes to your original code:
Instead of implementing __call__ and calling the object, it seems more reasonable to implement __next__ and simply call next on the object.
Instead of choosing the generator, I would choose the index. This mainly serves to avoid the use of remove, which is less efficient than deleting by index when you can access the object directly.
Personally I prefer to avoid recursion where possible, so I changed the check that there are still generators left into a while loop:
class CombinedGenerator:
    def __init__(self, generators):
        self.generators = generators

    def __next__(self):
        while self.generators:
            i = random.choice(range(len(self.generators)))
            try:
                return next(self.generators[i])
            except StopIteration:
                del self.generators[i]
        raise StopIteration
c = CombinedGenerator([gen1, gen2])

for i in range(total):
    print(f"iter {i}")
    print(f"yielded {next(c)}")
A nice bonus can be to add this to your class:

def __iter__(self):
    return self
which then allows you to iterate directly on the object itself, without needing the total variable:

for i, num in enumerate(c):
    print(f"iter {i}")
    print(f"yielded {num}")
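Putting these suggestions together, here is a self-contained sketch (the small ranges are mine, shrunk from the question's so the result is easy to check; any interleaving order is possible, so we only look at the sorted output):

```python
import random

class CombinedGenerator:
    def __init__(self, generators):
        self.generators = generators

    def __iter__(self):
        return self

    def __next__(self):
        # keep picking a random generator until one yields a value
        while self.generators:
            i = random.choice(range(len(self.generators)))
            try:
                return next(self.generators[i])
            except StopIteration:
                del self.generators[i]  # drop the exhausted generator
        raise StopIteration

gen1 = (i for i in range(0, 5))
gen2 = (i for i in range(100, 105))
c = CombinedGenerator([gen1, gen2])

# Every value comes out exactly once, in some random interleaving:
print(sorted(c))  # [0, 1, 2, 3, 4, 100, 101, 102, 103, 104]
```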
I was messing around and noticed that the following code yields the value once, while I was expecting it to return a generator object.

def f():
    yield (yield 1)

f().next()  # returns 1

def g():
    yield (yield (yield 1))

g().next()  # returns 1

My question is: what is the value of the yield expression, and why are we allowed to nest yield expressions if the yield expression collapses?
The value of the yield expression after resuming depends on the method which resumed the execution. If __next__() is used (typically via either a for or the next() builtin) then the result is None. Otherwise, if send() is used, then the result will be the value passed in to that method.
So this:

def f():
    yield (yield 1)

is equivalent to this:

def f():
    x = yield 1
    yield x

which in this case (since you're not using generator.send()) is equivalent to this:

def f():
    yield 1
    yield None
Your code is only looking at the first item yielded by the generator. If you instead call list() to consume the whole sequence, you'll see what I describe:

def f():
    yield (yield 1)

def g():
    yield (yield (yield 1))

print(list(f()))
print(list(g()))

Output:

$ python3 yield.py
[1, None]
[1, None, None]
If we iterate the generator manually (as you have), but .send() it values, then you can see that yield "returns" this value:
gen = f()
print(next(gen))
print(gen.send(42))
Output:
$ python3 yield_manual.py
1
42
Consider the following example:

def generator(iterable):
    print('Start')
    for item in iterable:
        yield item
    print('Stop')

for x in generator(range(10)):
    print(x)
    if x == 3: break
    print(x)
whose output is
Start
0
0
1
1
2
2
3
Python of course does exactly what it's being told. The generator is not called again after x==3 and so "Stop" is never printed. The generator abstraction is reasonable. However, in this case I'm actually trying to achieve a subtly different kind of abstraction, kind of like decorating a loop to make it a customized loop. Some code should run before the loop, some for each iteration, and some after the loop, even in case of break.
Of course I do not rely on this exact abstraction to make my program work, but it would be nice. Does anyone have any good ideas for this case?
Kind regards.
A generator is a good abstraction for a loop, but for before-and-after code Python has another abstraction: context managers and the 'with' statement. You can use those two together, e.g. this way:
class generator_context:
    def __init__(self, iterable):
        self.iterable = iterable

    def __enter__(self):
        print('Start')
        for item in self.iterable:
            yield item

    def __exit__(self, e_type, e_value, e_traceback):
        if e_type is None:
            print('Stop')

with generator_context(range(10)) as generator:
    for x in generator:
        print(x)
        if x == 3: break
        print(x)
Well, you could put your generator body inside a try/finally:

def generator(iterable):
    try:
        print('Start')
        for item in iterable:
            yield item
    finally:
        print('Stop')

This makes whatever is inside finally always execute.
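One caveat worth knowing: the finally block runs when the generator is finalized, which happens on an explicit close() or when the generator is garbage-collected, not at the exact moment of the break. Calling close() yourself makes the timing deterministic:

```python
def generator(iterable):
    try:
        print('Start')
        for item in iterable:
            yield item
    finally:
        print('Stop')

gen = generator(range(10))
for x in gen:
    if x == 3:
        break
gen.close()  # raises GeneratorExit inside the generator, running the finally block
```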
As I mentioned in the comments, a context manager will probably do what you are looking for. Here's one that auto-closes the generator (which makes sense because "stop" was printed as an indication that there are no more values).
class my_generator:
    def __init__(self, iterable):
        self.g = (x for x in iterable)

    def __iter__(self):
        return self.g

    def __enter__(self):
        print('start')
        return self

    def __exit__(self, exc_type, exc_value, trace):
        self.g.close()
        # do whatever cleanup is necessary in case of exception
        print('stop')
Usage:
>>> with my_generator([1,2,3]) as g:
...     for x in g:
...         print(x)
...         if x == 2:
...             break
...
start
1
2
stop
>>> list(g)
[]
I'm trying to write a function that returns the next element of a generator and if it is at the end of the generator it resets it and returns the next result. The expected output of the code below would be:
1
2
3
1
2
However that is not what I get obviously. What am I doing that is incorrect?
a = '123'

def convert_to_generator(iterable):
    return (x for x in iterable)

ag = convert_to_generator(a)

def get_next_item(gen, original):
    try:
        return next(gen)
    except StopIteration:
        gen = convert_to_generator(original)
        get_next_item(gen, original)

for n in range(5):
    print(get_next_item(ag, a))
1
2
3
None
None
Is itertools.cycle(iterable) a possible alternative?
You need to return the result of your recursive call:
return get_next_item(gen, original)
which still does not make this a working approach.
The generator ag used in your for-loop is not changed by the rebinding of the local variable gen in your function. It will stay exhausted...
As has been mentioned in the comments, check out itertools.cycle.
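For reference, a minimal sketch of itertools.cycle doing exactly what the question asks for:

```python
import itertools

ag = itertools.cycle('123')
for n in range(5):
    print(next(ag))  # prints 1, 2, 3, 1, 2
```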
The easy way is to just use itertools.cycle. Otherwise you would need to remember the elements of the iterable, because if it is an iterator (e.g. a generator) it can't be reset; if it's not an iterator, you can reuse it many times.
The documentation includes an example implementation:
def cycle(iterable):
    # cycle('ABCD') --> A B C D A B C D A B C D ...
    saved = []
    for element in iterable:
        yield element
        saved.append(element)
    while saved:
        for element in saved:
            yield element
Or, for example, to reuse the iterable when possible:

def cycle(iterable):
    # cycle('ABCD') --> A B C D A B C D A B C D ...
    if iter(iterable) is iter(iterable):  # it is an iterator
        saved = []
        for element in iterable:
            yield element
            saved.append(element)
    else:
        saved = iterable
    while saved:
        for element in saved:
            yield element
Example use:

test = cycle("123")
for i in range(5):
    print(next(test))
Now, about your code: the problem is simple, it doesn't remember its state.
def get_next_item(gen, original):
    try:
        return next(gen)
    except StopIteration:
        gen = convert_to_generator(original)  # <-- the problem is here
        get_next_item(gen, original)  # and you should return something here

On the marked line a new generator is built, but you would need to update your ag variable outside this function to get the desired behavior. There are ways to do that, such as changing your function to return both the element and the generator, or building a class so it remembers its state; the alternatives are either not recommended or more complicated.
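As a sketch of that class-based route (Cycler is a made-up name; the point is that the rebinding happens on the object's own attribute, so the state survives between calls):

```python
class Cycler:
    def __init__(self, iterable):
        self.original = iterable
        self.gen = iter(iterable)

    def get_next_item(self):
        try:
            return next(self.gen)
        except StopIteration:
            # rebind our own attribute, so the fresh generator is remembered
            self.gen = iter(self.original)
            return next(self.gen)

c = Cycler('123')
for n in range(5):
    print(c.get_next_item())  # prints 1, 2, 3, 1, 2
```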
get_next_item should be a generator: it returns an iterator that gives you the values it yields via the __next__ method. For that reason, calling it as a bare statement doesn't do anything.
What you want to do is this:

def get_next_item(gen, original):
    try:
        while True:
            yield next(gen)
    except StopIteration:
        gen = convert_to_generator(original)
        for i in get_next_item(gen, original):
            yield i
or shorter, and completely equivalent (as long as gen has an __iter__ method, which it probably has):

def get_next_item(gen, original):
    for i in gen:
        yield i
    for i in get_next_item(convert_to_generator(original), original):
        yield i
Or without recursion (which is a problem in Python, as it is 1. limited in depth and 2. slow):

def get_next_item(gen, original):
    for i in gen:
        yield i
    while True:
        for i in convert_to_generator(original):
            yield i
If convert_to_generator is just a call to iter, it is even shorter:

def get_next_item(gen, original):
    for i in gen:
        yield i
    while True:
        for i in original:
            yield i
Or, with itertools:

import itertools

def get_next_item(gen, original):
    return itertools.chain(gen, itertools.cycle(original))
and get_next_item is equivalent to itertools.cycle if gen is guaranteed to be an iterator for original.
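Applied to the question's data, a quick check of the itertools version:

```python
import itertools

a = '123'
ag = iter(a)  # an iterator over the original iterable
stream = itertools.chain(ag, itertools.cycle(a))
for n in range(5):
    print(next(stream))  # prints 1, 2, 3, 1, 2
```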
Side note: You can exchange for i in x: yield i for yield from x (where x is some expression) with Python 3.3 or higher.
So, I defined a simple generator:

def gen1(x):
    if x <= 10:
        yield x
        for v in gen1(x + 1):
            yield v
Basically, I want to decorate it so it returns all the values but the last:

def dec(gen):
    def new_gen(x):
        g = gen(x)
        value = g.next()
        for v in g:
            yield value
            value = v
    return new_gen
Now, if I redefine gen1:

@dec
def gen1(x):
    ...

for i in gen1(1):
    print i  # Nothing printed
but if I use:

some_gen = dec(gen1)

for i in some_gen(1):
    print i  # Prints 1 to 9, as needed
Why my decorator doesn't work and how can I fix it?
The recursive invocation of your gen1 is also subject to your decorator, so everything gets consumed by the decorator.
The simplest fix is to write the generator in non-recursive style, or to encapsulate the recursion:
Encapsulated:

@dec
def gen1(x):
    def inner(x):
        if x <= 10:
            yield x
            for v in inner(x + 1):
                yield v
    return inner(x)
Non-recursive:

@dec
def gen1(x):
    for v in range(x, 11):
        yield v
It doesn't work due to the interaction between the decorator and recursion. Since your generator is recursive, it relies on a certain recurrence relation. By injecting a modifying decorator between the generator and the sub-generator, you are breaking that recurrence relation.
As long as @dec drops the last element, you can't make it compatible with gen1() by changing @dec alone.
You could, however, change gen1() to make it compatible with #dec:
def dec(gen):
    def new_gen(x):
        g = gen(x)
        value = g.next()
        for v in g:
            yield value
            value = v
    return new_gen

@dec
def gen1(x):
    def gen2(x):
        if x <= 10:
            yield x
            for v in gen2(x + 1):
                yield v
    for v in gen2(x):
        yield v

for i in gen1(1):
    print i  # Prints 1 to 9, as needed
The trick here is to make gen1() non-recursive, and to delegate all the work to another, undecorated, generator. The latter can be recursive.
My solution when I had to do something like that was to create a generator on top of the generator! This is actually the idea of a decorated call. So you do:
def funca():
    while True:
        print "in funca"
        yield True

def dec(func):
    while True:
        print "in funcb"
        func.next()
        yield True

decfa = dec(funca())
decfa.next()

>>
"in funcb"
"in funca"
As for exactly your problem (yielding only the last value), I would do something like:

def funca():
    for i in range(1, 5):
        yield i

def dec2(ff):
    try:
        while True:
            val = ff.next()
    except StopIteration:
        yield val

>>> dec2(funca()).next()
4
Yet a simpler solution.
Save a pointer to the initial generator as an attribute of the final one:
def dec(gen):
    def new_gen(x):
        g = gen(x)
        value = next(g)
        for v in g:
            yield value
            value = v
    new_gen.gen = gen
    return new_gen
Use the pointer to the initial generator in place of the unexpectedly decorated recursive call:

@dec
def gen1(x):
    if x <= 10:
        yield x
        for v in gen1.gen(x + 1):
            yield v
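For completeness, here is the same trick in Python 3 syntax (next(g) instead of g.next(), print() as a function); the lookup gen1.gen happens at call time, so the recursion finds the undecorated generator:

```python
def dec(gen):
    def new_gen(x):
        g = gen(x)
        value = next(g)
        for v in g:
            yield value  # always one step behind, so the last value is dropped
            value = v
    new_gen.gen = gen  # keep a pointer to the undecorated generator
    return new_gen

@dec
def gen1(x):
    if x <= 10:
        yield x
        for v in gen1.gen(x + 1):  # recurse via the undecorated version
            yield v

print(list(gen1(1)))  # [1, 2, 3, 4, 5, 6, 7, 8, 9]
```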