How does yield identify that its iteration is done? - python

I have read about yield at What does the "yield" keyword do in Python? but I have one question: how does the system identify that the yield has already been iterated once?
def test_yield():
    name = 'Hello World!'
    for i in name:
        yield i

y = test_yield()
print "first yield", y
for c in y:
    print c
print "second yield", y
for c in y:
    print c
Output
first yield <generator object test_yield at 0x7f4c2bc10be0>
H
e
l
l
o
W
o
r
l
d
!
second yield <generator object test_yield at 0x7f4c2bc10be0>
In the output, the generator object is printed the second time, but the loop does not iterate. So how does the program identify that it has already been executed once?

Looping through an iterator "uses it up". So your first for loop iterates through it and reaches the end, just as if you opened a file and read all of the lines until EOF. After that, no matter how many times you call read(), you won't get more data, only EOF. Similarly, once your iterator has reached its last element, calling .next() on it just raises StopIteration.
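For instance, here is a minimal sketch of that behaviour (letters is a made-up example generator, shown in Python 3 syntax):

def letters():
    for ch in "Hi":
        yield ch

g = letters()
print(next(g))  # H
print(next(g))  # i
# The generator is now exhausted; any further next() raises StopIteration,
# much like read() returning EOF forever after the end of a file.
try:
    next(g)
except StopIteration:
    print("exhausted")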

When a generator function calls yield, the "state" of the generator function is frozen; the values of all variables are saved and the next line of code to be executed is recorded until next() is called again. Once it is, the generator function simply resumes where it left off. If next() is never called again, the state recorded during the yield call is (eventually) discarded.
Given that a StopIteration exception is raised when the generator function reaches its end, making it exhausted, you have to reload it; that is the reason it didn't iterate over any values during the second call.
Note: for gets its values by calling next() implicitly.
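As a small illustration of that saved state (counter is a hypothetical example, written in Python 3 syntax):

def counter():
    n = 0              # local state, saved at every yield
    while n < 3:
        yield n        # frozen here; resumes on the next call to next()
        n += 1

g = counter()
print(next(g))  # 0
print(next(g))  # 1 -- n was remembered between the calls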

The first time the for loop calls the generator object created from your function, it will run the code in your function from the beginning until it hits yield, then it will return the first value of the loop. Each subsequent call will run the loop you have written in the function one more time and return the next value, until there is no value to return. In your second for loop, there's no more value to retrieve because you have already consumed them all.
For more insight into what's happening behind the scenes, the for loop can be rewritten as this:
iterator = some_func()
try:
    while 1:
        print iterator.next()
except StopIteration:
    pass
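In Python 3 the method is spelled __next__, so the same desugaring would use the next() built-in instead:

iterator = some_func()
try:
    while True:
        print(next(iterator))
except StopIteration:
    pass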

When you wrote
y = test_yield()
you initialized the iterator, and when you finished your iteration after the first
for c in y:
the iterator was exhausted. You need to initialize it once more, like:
def test_yield():
    name = 'Hello World!'
    for i in name:
        yield i

y = test_yield()
print "first yield", y
for c in y:
    print c

# once again
y = test_yield()
print "second yield", y
for c in y:
    print c
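Alternatively, if you need to traverse the values twice, here is a sketch that materializes the generator into a list first; a list is a reusable iterable, unlike a generator:

values = list(test_yield())
for c in values:
    print(c)
# a second pass works, because the list can be iterated again
for c in values:
    print(c)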

Related

returning value without breaking a loop

I intend to make a while loop inside a defined function. In addition, I want it to return a value on every iteration. Yet it doesn't let me iterate over the loop.
Here is the plan:
def func(x):
    n = 3
    while n > 0:
        x = x + 1
        return x

print(func(6))
I know the reason for this issue: return breaks out of the loop. Yet I insist on using a defined function. Is there a way to somehow return a value on each iteration, given that this script is inside a defined function?
When you want to return a value and continue the function in the next call at the point where you returned, use yield instead of return.
Technically this produces a so-called generator, which gives you the return values one by one. With next() you can iterate over the values. You can also convert it into a list or some other data structure.
Your original function would look like this:
def foo(n):
    for i in range(n):
        yield i
And to use it:
gen = foo(100)
print(next(gen))
or
gen = foo(100)
l = list(gen)
print(l)
Keep in mind that the generator calculates the results 'on demand', so it does not allocate too much memory to store results. When converting it into a list, all results are calculated and stored in memory, which causes problems for large n.
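For example, you can consume just the beginning of a huge generator without ever materializing it; a sketch using itertools.islice:

from itertools import islice

def foo(n):
    for i in range(n):
        yield i

gen = foo(10**9)                   # nothing is computed yet
first_five = list(islice(gen, 5))  # only five values are ever produced
print(first_five)                  # [0, 1, 2, 3, 4]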
Depending on your use case, you may simply use print(x) inside the loop and then return the final value.
If you actually need to return intermediate values to a caller function, you can use yield.
You can create a generator for that, so you could yield values from your generator.
Example:
def func(x):
    n = 3
    while n > 0:
        x = x + 1
        yield x

func_call = func(6)  # create generator
print(next(func_call))  # 7
print(next(func_call))  # 8
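Note that n is never decremented here, so this generator yields values forever; if you want to loop over it, bound the iteration, for example with itertools.islice (a sketch):

from itertools import islice

for value in islice(func(6), 3):
    print(value)  # 7, 8, 9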

How to intercept the first value of a generator and transparently yield from the rest

Update: I've started a thread on python-ideas to propose additional syntax or a stdlib function for this purpose (i.e. specifying the first value sent by yield from). So far 0 replies... :/
How do I intercept the first yielded value of a subgenerator but delegate the rest of the iteration to the latter using yield from?
For example, suppose we have an arbitrary bidirectional generator subgen, and we want to wrap this in another generator gen. The purpose of gen is to intercept the first yielded value of subgen and delegate the rest of the generation—including sent values, thrown exceptions, .close(), etc.—to the sub-generator.
The first thing that might come to mind could be this:
def gen():
    g = subgen()
    first = next(g)
    # do something with first...
    yield "intercepted"
    # delegate the rest
    yield from g
But this is wrong, because when the caller .sends something back to the generator after getting the first value, it will end up as the value of the yield "intercepted" expression, which is ignored, and instead g will receive None as the first .send value, as part of the semantics of yield from.
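To see the problem concretely, here is a sketch with a toy subgen (hypothetical, purely for illustration):

def subgen():
    received = yield "one"
    yield f"subgen got: {received}"

def gen():
    g = subgen()
    first = next(g)   # intercept the first value ("one")
    yield "intercepted"
    yield from g      # delegate the rest

w = gen()
print(w.send(None))     # -> intercepted
print(w.send("hello"))  # -> subgen got: None ("hello" was swallowed)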
So we might think to do this:
def gen():
    g = subgen()
    first = next(g)
    # do something with first...
    received = yield "intercepted"
    g.send(received)
    # delegate the rest
    yield from g
But what we've done here is just moving the problem back by one step: as soon as we call g.send(received), the generator resumes its execution and doesn't stop until it reaches the next yield statement, whose value becomes the return value of the .send call. So we'd also have to intercept that and re-send it. And then send that, and that again, and so on... So this won't do.
Basically, what I'm asking for is a yield from with a way to customize what the first value sent to the generator is:
def gen():
    g = subgen()
    first = next(g)
    # do something with first...
    received = yield "intercepted"
    # delegate the rest
    yield from g start with received  # pseudocode; not valid Python
...but without having to re-implement all of the semantics of yield from myself. That is, the laborious and poorly maintainable solution would be:
def adaptor(generator, init_send_value=None):
    send = init_send_value
    try:
        while True:
            send = yield generator.send(send)
    except StopIteration as e:
        return e.value
which is basically a bad re-implementation of yield from (it's missing handling of throw, close, etc.). Ideally I would like something more elegant and less redundant.
If you're trying to implement this generator wrapper as a generator function using yield from, then your question basically boils down to whether it is possible to specify the first value sent to the "yielded from" generator. Which it is not.
If you look at the formal specification of the yield from expression in PEP 380, you can see why. The specification contains a (surprisingly complex) piece of sample code that behaves the same as a yield from expression. The first few lines are:
_i = iter(EXPR)
try:
    _y = next(_i)
except StopIteration as _e:
    _r = _e.value
else:
    ...
You can see that the first thing that is done to the iterator is to call next() on it, which is basically equivalent to .send(None). There is no way to skip that step and your generator will always receive another None whenever yield from is used.
The solution I've come up with is to implement the generator protocol using a class instead of a generator function:
class Intercept:
    def __init__(self, generator):
        self._generator = generator
        self._intercepted = False

    def __next__(self):
        return self.send(None)

    def send(self, value):
        yielded_value = self._generator.send(value)
        # Intercept the first value yielded by the wrapped generator and
        # replace it with a different value.
        if not self._intercepted:
            self._intercepted = True
            print(f'Intercepted value: {yielded_value}')
            yielded_value = 'intercepted'
        return yielded_value

    def throw(self, type, *args):
        return self._generator.throw(type, *args)

    def close(self):
        self._generator.close()
__next__(), send(), throw(), close() are described in the Python Reference Manual.
The class wraps the generator passed to it when created and mimics its behavior. The only thing it changes is that the first value yielded by the generator is replaced by a different value before it is returned to the caller.
We can test the behavior with an example generator f() which yields two values and a function main() which sends values into the generator until the generator terminates:
def f():
    y = yield 'first'
    print(f'f(): {y}')
    y = yield 'second'
    print(f'f(): {y}')

def main():
    value_to_send = 0
    gen = f()
    try:
        x = gen.send(None)
        while True:
            print(f'main(): {x}')
            # Send incrementing integers to the generator.
            value_to_send += 1
            x = gen.send(value_to_send)
    except StopIteration:
        print('main(): StopIteration')

main()
When run, this example produces the following output, showing which values arrive in the generator and which are returned by it:
main(): first
f(): 1
main(): second
f(): 2
main(): StopIteration
Wrapping the generator f(), by changing the statement gen = f() to gen = Intercept(f()), produces the following output, showing that the first yielded value has been replaced:
Intercepted value: first
main(): intercepted
f(): 1
main(): second
f(): 2
As all other calls to any of the generator API are forwarded directly to the wrapped generator, it should behave equivalently to the wrapped generator itself.
If I understand the question, I think this works? Meaning, I ran this script and it did what I expected, which was to print all but the first line of the input file. As long as the generator passed as the argument to the skip_first function can be iterated over, it should work.
def skip_first(thing):
    _first = True
    for _result in thing:
        if _first:
            _first = False
            continue
        yield _result

inp = open("/var/tmp/test.txt")
for line in skip_first(inp):
    print(line, end="")

Is calling next(iter) inside for i in iter supported in python?

Question: Python list iterator behavior and next(iterator) documents the fact that calling
for i in iter:
    ...
    next(iter)
has the effect of skipping ahead in the for loop. Is this defined behavior that I can rely on (e.g. in some PEP), or merely an accident of implementation that could change without warning?
It depends on what iter is. If it’s an iterator created with a call of iter() on an iterable, then yes, calling next(iter) will also advance the iterator:
>>> it = iter(range(4))
>>> for x in it:
...     print(x)
...     _ = next(it)
...
0
2
It’s important that it is an iterator (instead of just any iterable) since the for loop will internally also call iter() on that object. Since iterators (usually) return themselves when iter(some_iterator) is called, this works fine.
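You can check that detail directly:

it = iter(range(3))
print(iter(it) is it)    # True: an iterator returns itself from iter()

lst = [1, 2, 3]
print(iter(lst) is lst)  # False: a list creates a fresh iterator each time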
Is this a good idea though? Generally not since it can easily break:
>>> it = iter(range(3))
>>> for x in it:
...     print(x)
...     _ = next(it)
...
0
2
Traceback (most recent call last):
  File "<pyshell#21>", line 3, in <module>
    _ = next(it)
StopIteration
You would have to add exception handling for the StopIteration exception manually inside the loop; and that gets messy very easily. Of course you could also use a default value, e.g. next(it, None) instead, but that becomes confusing quickly, especially when you’re not actually using the value for anything.
This whole approach also breaks as soon as someone later decides to use some other iterable instead of an iterator in the for loop (for example because they refactor the code to use a list there).
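Concretely, with a list the pattern fails on the very first iteration, because next() only accepts iterators (a sketch):

lst = [0, 1, 2, 3]
for x in lst:
    print(x)
    _ = next(lst)  # TypeError: 'list' object is not an iterator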
You should attempt to let the for loop be the only one that consumes the iterator. Change your logic so that you can easily determine whether or not an iteration should be skipped: If you can, filter the iterable first, before starting the loop. Otherwise, use conditionals to skip iterations within the loop (using continue):
>>> it = filter(lambda x: x % 2 == 0, range(3))
>>> for x in it:
...     print(x)
...
0
2
>>> it = range(3)  # no need for an iterator here
>>> for x in it:
...     if x % 2 == 1:
...         continue
...     print(x)
...
0
2
A for loop always calls next() on the object it is iterating over. If you call next() yourself before the for loop does, it has the effect of skipping the (next) item. That said, you can use this in your program, as long as the for loop keeps obtaining values through the same method (next()). Even in 3.6 it is implemented the same way, so there is no need to worry.

generator is yielded but the yielded variable is not printed

While using generators, we yield a variable whose value is saved, and execution resumes with that saved value on the next next() call. Is there a way to do this without actually printing the value of the yielded variable?
def foo():
    n = 0
    print("This is where we start")
    yield n
    n += 1
    print("This is first")
    yield n
    n += 1
    print("This is second")
    yield n

>>> a = foo()
>>> next(a)
This is where we start
0
>>> next(a)
This is first
1
This is a very naive way of using generators (implementing them) and doesn't show their effectiveness.
I know that this can also be done using iterators, where the value won't be printed, but I am wondering if it can be done with generators.
You are using the Python interactive interpreter to call next(), and it is a function of that shell to print return values. What you are seeing has nothing to do with generators.
Simply assign the return value of the next() call to a variable to not have it echoed:
ignored = next(a)
or run your code as a script.
Note that generators are paused immediately; no code inside is run until you call next() on the generator. At that point the code runs until a yield expression is reached; its value is returned and the generator is paused again. At no point is the yield result 'saved'.

Python - Understanding the send function of a generator

I'm studying yield in Python and have found that yield is not only the way in which generators output a return value but also a way to put values into a generator. For example, consider the following code:
def f():
    print (yield),
    print 0,
    print (yield),
    print 1

g = f()
g.send(None)
g.send('x')
g.send('y')
In the global scope it sends the values 'x' and 'y' to the generator, and thus in f it will output x 0 y 1. But I cannot understand:
1. There are 2 yields but 3 sends. Why must None be sent the first time?
2. It throws a StopIteration at the last send. Is there any way to avoid this exception?
Could anyone please explain that? Thanks in advance.
From the documentation:
When send() is called to start the generator, it must be called with None as the argument, because there is no yield expression that could receive the value.
As for the exception, you can't really avoid it. The generator throws this exception when it's done iterating, so instead of avoiding it, just catch it:
g = f()
try:
    g.send(None)
    g.send('x')
    g.send('y')
except StopIteration:
    print 'Done'
