Python: next in for loop

I want to use next to skip one or more items returned from a generator. Here is a simplified example designed to skip one item per loop (in actual use, I'd test n and, depending on the result, might repeat the next() call; the generator comes from a package I don't control):
def gen():
    for i in range(10):
        yield i

for g in gen():
    n = next(gen())
    print(g, n)
I expected the result to be
0 1
2 3
etc.
Instead I got
0 0
1 0
etc.
What am I doing wrong?

You're making a new generator each time you call gen(). Each new generator starts from 0.
Instead, you can call it once and capture the return value.
def gen():
    for i in range(10):
        yield i

x = gen()
for g in x:
    n = next(x)
    print(g, n)
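If the number of skips depends on a test, as in the question, it can also help to pass a default to next() so the loop doesn't die with StopIteration when the generator runs out mid-skip. A minimal sketch of that pattern (the skip condition here is made up for illustration):

def gen():
    for i in range(10):
        yield i

x = gen()
for g in x:
    n = next(x, None)  # None means the generator is exhausted
    while n is not None and n % 3 == 0:  # made-up test: skip again on multiples of 3
        n = next(x, None)
    print(g, n)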

Related

Can a current generator value interact with the value generated before it?

I'm aware that yield generates a value on the fly; my understanding is that it doesn't keep the value in memory, and therefore the current value shouldn't be able to interact with previous values.
But I just want to be sure that's the case. Could someone confirm whether it's possible or not?
I'm going to use 5 as the value in number.
Example without generator:
def factorial(number):
    result = number
    if number <= 1:
        return 1
    else:
        for x in reversed(range(1, number)): # (4,1) reversed
            result *= x # 5*4*3*2*1
        return result # returns 120
Is it possible to do the same thing using yield? How?
Thank you
Generators can be stateful:
def fibs():
    a, b = 1, 1
    while True:
        yield b
        a, b = b, a + b

g = fibs()
for i in range(10):
    print(next(g))
Here the state lives in the local variables. They are kept alive for as long as the generator object returned by fibs() is alive.
EDIT: I'm blind, it was a factorial:
def factorials():
    i = 1
    a = 1
    while True:
        yield a
        i += 1
        a *= i
Or, if you need a single value rather than a stream of them, here's a one-liner:
from functools import reduce
print(reduce(lambda a, b: a*b, range(1, 10+1)))
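If you want one specific factorial out of that stream, say 5!, a small sketch using the factorials() generator above:

g = factorials()
for _ in range(5):  # advance to the fifth value in the stream
    result = next(g)
print(result)  # 120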

How to apply a function to every item in a range

I get a list of functions when I call the function count() below. I want to know how to execute these functions other than the way I do it in my code.
def count():
    fs = []
    for i in range(1, 4):
        def f():
            return i*i
        fs.append(f)
    return fs
print(count()[0](), count()[1](), count()[2]())
You can use map to apply a function to every item in an iterable. In Python 3.x the result is a lazy iterator rather than a list.
There are several different ways you can then extract results. Below is an example.
I have also corrected some errors in your logic. Your function f should take a parameter, presumably i, as this is used in your function. Similarly, it seems you want the function count to take a parameter n to determine your range of inputs.
def count(n):
    def f(i):
        return i*i
    return map(f, range(1, n))

## iterate using next
res = count(5)
print(next(res)) # 1
print(next(res)) # 4
print(next(res)) # 9

## iterate using for loop
for k in count(4):
    print(k)
# 1
# 4
# 9

## build list and exhaust all results
print(list(count(4)))
# [1, 4, 9]
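As an aside, the original count() that returns a list of functions can also be made to work; as written, every f closes over the same i, so all three calls return 9 once the loop has finished. A rough sketch of the usual default-argument fix, keeping the asker's structure:

def count():
    fs = []
    for i in range(1, 4):
        def f(i=i):  # bind the current value of i at definition time
            return i*i
        fs.append(f)
    return fs

fs = count()
print(fs[0](), fs[1](), fs[2]()) # 1 4 9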

Can a generator be used more than once?

This is my piece of code with two generators defined:
one_line_gen = (x for x in range(3))
def three_line_gen():
    yield 0
    yield 1
    yield 2
When I execute:
for x in one_line_gen:
    print x
for x in one_line_gen:
    print x
The result is as expected:
0
1
2
However, if I execute:
for x in three_line_gen():
    print x
for x in three_line_gen():
    print x
The result is:
0
1
2
0
1
2
Why? I thought any generator can be used only once.
three_line_gen is not a generator, it's a function. What it returns when you call it is a generator, a brand new one each time you call it. Every time you write parentheses like this:
three_line_gen()
you get a brand new generator to be iterated over. If, however, you were to first do
mygen = three_line_gen()
and iterate over mygen twice, the second pass would produce nothing, as you expect.
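For example:

mygen = three_line_gen()
for x in mygen:
    print(x)  # 0, 1, 2
for x in mygen:
    print(x)  # prints nothing: mygen is already exhausted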
No, you cannot iterate over a generator twice; a generator is exhausted once you have iterated over it. You can make a copy of a generator with itertools.tee, though:
from itertools import tee
one_line_gen = (x for x in range(3))
gen1, gen2 = tee(one_line_gen)
# or:
# gen1, gen2 = tee(x for x in range(3))
for item in gen1:
    print(item)
for item in gen2:
    print(item)
For the other issues, see Ofer Sadan's answer.
Yes, a generator can be used only once, but here you have two generator objects.
# Python 3
def three_line_gen():
    yield 0
    yield 1
    yield 2

iterator = three_line_gen()
print(iterator)
for x in iterator:
    print(id(iterator), x)

iterator2 = three_line_gen()
print(iterator2)
for x in iterator2:
    print(id(iterator2), x)
And the result is:
<generator object three_line_gen at 0x1020401b0>
4328784304 0
4328784304 1
4328784304 2
<generator object three_line_gen at 0x1020401f8>
4328784376 0
4328784376 1
4328784376 2
Why? I thought any generator can be used only once.
Because every call to three_line_gen() creates a new generator.
Otherwise, you're correct that generators only run forward until exhausted.
Can a generator be used more than once?
Yes, it is possible if the results are buffered outside the generator. One easy way is to use itertools.tee():
>>> from itertools import tee
>>> def three_line_gen():
...     yield 0
...     yield 1
...     yield 2
>>> t1, t2 = tee(three_line_gen())
>>> next(t1)
0
>>> next(t2)
0
>>> list(t1)
[1, 2]
>>> list(t2)
[1, 2]
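Note that tee() has to buffer the items one copy has consumed but the other has not, so if one copy runs far ahead of the other the buffered data can grow; when the values are cheap to regenerate, simply calling three_line_gen() again may be the simpler option.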
Because the one-liner is a generator object, while the three-liner is a function.
They are meant to be different.
These two are similar:
def three_line_gen_fun():
    yield 0
    yield 1
    yield 2
three_line_gen = three_line_gen_fun()
one_line_gen = (x for x in range(3))
type(three_line_gen) == type(one_line_gen)

Python function that produces both generator and aggregate results

What is the Pythonic way to make a generator that also produces aggregate results? In meta code, something like this (but not for real, as my Python version does not support mixing yield and return):
def produce():
    total = 0
    for item in find_all():
        total += 1
        yield item
    return total
As I see it, I could:
Not make produce() a generator, but pass it a callback function to call on every item.
With every yield, also yield the aggregate results up until now. I'd rather not calculate the intermediate results with every yield, only when finishing.
Send a dict as argument to produce() that will be populated with the aggregate results.
Use a global to store aggregate results.
None of them seems very attractive...
NB. total is a simple example, my actual code requires complex aggregations. And I need intermediate results before produce() finishes, hence a generator.
Maybe you shouldn't use a generator but an iterator.
def findall(): # no idea what your "find_all" does so I use this instead. :-)
    yield 1
    yield 2
    yield 3

class Produce(object):
    def __init__(self, iterable):
        self._it = iterable
        self.total = 0

    def __iter__(self):
        return self

    def __next__(self):
        value = next(self._it)  # raises StopIteration before counting, so total stays accurate
        self.total += 1
        return value

    next = __next__ # only necessary for python2 compatibility
Maybe it's better to see this with an example:
>>> it = Produce(findall())
>>> it.total
0
>>> next(it)
1
>>> next(it)
2
>>> it.total
2
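After a full pass the counter holds the number of items that were produced; a short sketch of that usage, reusing the findall() and Produce defined above:

it = Produce(findall())
for value in it:
    print(value)
print(it.total) # 3, once the underlying iterator is exhausted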
You can use enumerate to count things, for example:
i = 0
for i, v in enumerate(range(10), 1):
    print(v)
print("total", i)
(notice the start value of the enumerate)
For more complex aggregations you can use the same principle: make produce a generator that yields both the value and the running aggregate, ignore the aggregate while iterating, and read it when you're finished, as in the sketch below.
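A rough sketch of that idea, yielding (value, running aggregate) pairs and only reading the aggregate at the end:

def produce():
    total = 0
    for x in range(10):
        total += 1
        yield x, total  # the item together with the running aggregate

total = 0
for x, total in produce():
    print(x)  # ignore the aggregate while iterating
print("total", total)  # after the loop, total holds the final aggregate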
Another alternative is passing a mutable object, for example:
def produce(mem):
    t = 0
    for x in range(10):
        t += 1
        yield x
    mem.append(t)

aggregate = []
for x in produce(aggregate):
    print(x)
print("total", aggregate[0])
In each case the result is the same for this example:
0
1
2
3
4
5
6
7
8
9
total 10
Am I missing something? Why not:
def produce():
    total = 0
    for item in find_all():
        total += 1
        yield item
    yield total
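The catch is that the caller then has to know that the last value yielded is the aggregate rather than a regular item, roughly:

results = list(produce())      # assumes the asker's find_all() is available
*items, total = results        # the final element is the aggregate
print(items)
print("total", total)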

Check whether a generator is awaiting "send" or yielding a value

How can I check whether some black-box generator is awaiting a value via send or simply yielding a value? I mean that driving the following generator:
def gen():
    a = yield
    yield a
    yield a+1
    yield a+2
can look like this:
g = gen()
g.next()
print g.send(5)
print g.next()
print g.next()
while for a different generator, for example:
def gen():
    a = yield
    b = yield
    yield a+b
the driving code needs to be different as well, for example:
g = gen()
g.next()
g.send(1)
print g.send(2)
So the question is: how can I choose between sending a value into the generator and just advancing it, when the generator is a black box (third-party)? I need to write something like the following code:
values = [1, 2, 3]
results = list()
g = gen()
g.next()
for v in values:
    # needs this magic
    if g.__awaits__: # in case of "x = yield" expression
        results.append(g.send(v))
    elif g.__yields__: # in case of "yield x" expression
        results.append(g.next())
You can't, because there isn't any difference in terms of the allowable operations on those two "kinds" of generators. You can always send a value to the generator as soon as you have called next on it once, even if the generator doesn't do anything with the sent value:
>>> def gen():
...     yield 1
...     yield 2
...     yield 3
>>> x = gen()
>>> next(x)
1
>>> x.send('hello')
2
>>> x.send('hello')
3
There's no way to tell whether the generator "needs" you to send a value except by reading the documentation for whatever generator you're using.
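For what it's worth, inspect.getgeneratorstate() can tell you whether a generator has been started, is suspended at a yield, or has finished, but it cannot tell you whether that yield expects a value to be sent:
>>> import inspect
>>> y = gen()
>>> inspect.getgeneratorstate(y)
'GEN_CREATED'
>>> next(y)
1
>>> inspect.getgeneratorstate(y)
'GEN_SUSPENDED'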
