Can this function be expressed with a generator comprehension? - python

I have the following function:
def infinite_sequence(starting_value, function):
    value = starting_value
    while True:
        yield value
        value = function(value)
Is it possible to express this as a generator comprehension? If we were dealing with a fixed range, instead of an infinite sequence, it could be dealt with like so: (Edit: actually that's wrong)
(function(value) for value in range(start, end))
But since we're dealing with an infinite sequence, is it possible to express this using a generator comprehension?

You would need some sort of recursive generator expression:
infinite_sequence = itertools.imap(f, itertools.chain(x, __currentgenerator__))
where __currentgenerator__ is a hypothetical magic reference to the generator expression it occurs in. (Note that the issue is not that you want an infinite sequence, but that the sequence is defined recursively in terms of itself.)
Unfortunately, Python does not have such a feature. Haskell is an example of a language that does, due to its lazy argument evaluation:
infinite_sequence = map f (x : infinite_sequence)
You can, however, achieve something similar in Python 3, while still using a def statement, by defining a recursive generator:
import itertools

def infinite_sequence(f, sv):
    x = f(sv)
    yield from itertools.chain([x], infinite_sequence(f, x))
(itertools.chain isn't strictly necessary; you could use
def infinite_sequence(f, sv):
    x = f(sv)
    yield x
    yield from infinite_sequence(f, x)
but I was attempting to preserve the flavor of the Haskell expression x:infinite_sequence.)
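As a quick usage check (my addition, using doubling as f):
from itertools import islice

doubling = infinite_sequence(lambda x: x * 2, 1)
print(list(islice(doubling, 5)))  # [2, 4, 8, 16, 32]
Note that, like the Haskell version above, this sequence starts at f(sv) rather than sv.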

This is itertools.accumulate, with a binary function that ignores all but its first argument (the previously accumulated value):
from itertools import accumulate, repeat

def function(x):
    return x * 2

start = 1
seq = accumulate(repeat(start), lambda last, _: function(last))
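A quick check with itertools.islice (my addition):
from itertools import islice

print(list(islice(seq, 5)))  # [1, 2, 4, 8, 16]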
Just write it out in full, though.

Yes, just use an infinite iterator such as itertools.count:
(function(value) for value in itertools.count(starting_value))
Note, though, that this applies function to successive integers starting_value, starting_value + 1, ..., which is not the same as the question's repeated application value, function(value), function(function(value)), ...
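A concrete comparison (my example, taking function(x) = x * 2 and starting_value = 1):
import itertools

gen = (x * 2 for x in itertools.count(1))
print([next(gen) for _ in range(4)])  # [2, 4, 6, 8]
# the question's generator would instead yield 1, 2, 4, 8, ...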

Related

returning value without breaking a loop

I intend to make a while loop inside a defined function, and I want to return a value on every iteration. Yet the return doesn't allow the loop to keep iterating.
Here is the plan:
def func(x):
    n = 3
    while n > 0:
        x = x + 1
        return x
print(func(6))
I know the reason for this issue: the return statement breaks the loop.
Yet I insist on using a defined function. So, is there a way to hand back a value on every iteration, given that the loop is inside a defined function?
When you want to return a value and continue the function in the next call at the point where you returned, use yield instead of return.
Technically this produces a so-called generator, which gives you the return values one by one. You can iterate over the values with next(), or convert the generator into a list or some other data structure.
Your original function would look like this:
def foo(n):
    for i in range(n):
        yield i
And to use it:
gen = foo(100)
print(next(gen))
or
gen = foo(100)
l = list(gen)
print(l)
Keep in mind that the generator calculates the results 'on demand', so it does not allocate too much memory to store results. When converting it into a list, all results are calculated and stored in memory, which causes problems for large n.
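If you only need a bounded number of values, itertools.islice lets you take a prefix without materializing everything (a sketch, my addition):
from itertools import islice

gen = foo(100000000)
print(list(islice(gen, 10)))  # only the first 10 values are ever computed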
Depending on your use case, you may simply use print(x) inside the loop and then return the final value.
If you actually need to return intermediate values to a caller function, you can use yield.
You can create a generator for that, so you could yield values from your generator.
Example:
def func(x):
    n = 3
    while n > 0:
        x = x + 1
        yield x
func_call = func(6) # create generator
print(next(func_call)) # 7
print(next(func_call)) # 8
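Note that n never changes inside the loop above, so this generator yields values forever; if three values were the intent, decrement n as well (my assumption about the intent):
def func(x):
    n = 3
    while n > 0:
        x = x + 1
        n = n - 1
        yield x

print(list(func(6)))  # [7, 8, 9]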

Is a for/while loop in Python a generator?

In an interview, the interviewer asked me about generators being used in Python. I know a generator is like a function which yields values instead of returning them.
So can anyone tell me whether a for/while loop is an example of a generator?
Short answer: No, but there are other forms of generators.
A for/while loop is a loop structure: it does not emit values and thus is not a generator.
Nevertheless, there are other ways to construct generators.
Your example with yield is for instance a generator:
def some_generator(xs):
    for x in xs:
        if x:
            yield x
But there are also generator expressions, like:
(x for x in xs if x)
Furthermore, in python-3.x the range(..) construct returns a lazy sequence, and map(..) and filter(..) return lazy iterators; these are not technically generators, but they are lazy in the same spirit.
And of course you can make an iterator by hand, by implementing the iterator protocol:
class some_generator(object):
    def __init__(self, xs):
        self.xs = xs
        self.num = 0

    def __iter__(self):
        return self

    def __next__(self):
        return self.next()

    def next(self):
        # skip over falsy elements
        while self.num < len(self.xs) and not self.xs[self.num]:
            self.num += 1
        if self.num < len(self.xs):
            res = self.xs[self.num]
            self.num += 1
            return res
        else:
            raise StopIteration()
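A quick usage check (my addition):
print(list(some_generator([0, 1, 0, 2, 3])))  # [1, 2, 3]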
Neither while nor for are themselves generators or iterators. They are control constructs that perform iteration. Certainly, you can use for or while to iterate over the items yielded by a generator, and you can use for or while to perform iteration inside the code of a generator. But neither of those facts make for or while generators.
The first line in the python wiki for generators:
Generator functions allow you to declare a function that behaves like an iterator, i.e. it can be used in a for loop.
So in the context of your interview, I believe they were looking for you to talk about creating an iterable.
The wiki page for the for loop says:
In Python this is controlled instead by generating the appropriate sequence.
So you could get pedantic but generally, no, a for loop isn't a generator.
for and while are loop structures, and you can use them to iterate over generators. You can take certain elements of a generator by converting it to a list.

SICP "streams as signals" in Python

I have found some nice examples (here, here) of implementing SICP-like streams in Python. But I am still not sure how to handle an example like the integral found in SICP 3.5.3 "Streams as signals."
The Scheme code found there is
(define (integral integrand initial-value dt)
  (define int
    (cons-stream initial-value
                 (add-streams (scale-stream integrand dt)
                              int)))
  int)
What is tricky about this one is that the returned stream int is defined in terms of itself (i.e., the stream int is used in the definition of the stream int).
I believe Python could have something similarly expressive and succinct... but not sure how. So my question is, what is an analogous stream-y construct in Python? (What I mean by a stream is the subject of 3.5 in SICP, but briefly, a construct (like a Python generator) that returns successive elements of a sequence of indefinite length, and can be combined and processed with operations such as add-streams and scale-stream that respect streams' lazy character.)
There are two ways to read your question. The first is simply: How do you use Stream constructs, perhaps the ones from your second link, but with a recursive definition? That can be done, though it is a little clumsy in Python.
In Python you can represent looped data structures but not directly. You can't write:
l = [l]
but you can write:
l = [None]
l[0] = l
Similarly you can't write:
def integral(integrand, initial_value, dt):
    int_rec = cons_stream(initial_value,
                          add_streams(scale_stream(integrand, dt),
                                      int_rec))
    return int_rec
but you can write:
def integral(integrand, initial_value, dt):
    placeholder = Stream(initial_value, lambda: None)
    int_rec = cons_stream(initial_value,
                          add_streams(scale_stream(integrand, dt),
                                      placeholder))
    placeholder._compute_rest = lambda: int_rec
    return int_rec
Note that we need to clumsily pre-compute the first element of placeholder and then only fix up the recursion for the rest of the stream. But this does all work (alongside appropriate definitions of all the rest of the code - I'll stick it all at the bottom of this answer).
However, the second part of your question seems to be asking how to do this naturally in Python. You ask for an "analogous stream-y construct in Python". Clearly the answer to that is exactly the generator. The generator naturally provides the lazy evaluation of the stream concept. It differs by not being naturally expressed recursively but then Python does not support that as well as Scheme, as we will see.
In other words, the strict stream concept can be expressed in Python (as in the link and above) but the idiomatic way to do it is to use generators.
It is more or less possible to replicate the Scheme example by a kind of direct mechanical transformation of stream to generator (but avoiding the built-in int):
def integral_rec(integrand, initial_value, dt):
    def int_rec():
        for x in cons_stream(initial_value,
                             add_streams(scale_stream(integrand, dt), int_rec())):
            yield x
    for x in int_rec():
        yield x

def cons_stream(a, b):
    yield a
    for x in b:
        yield x

def add_streams(a, b):
    while True:
        yield next(a) + next(b)

def scale_stream(a, b):
    for x in a:
        yield x * b
The only tricky thing here is to realise that you need to eagerly call the recursive use of int_rec as an argument to add_streams. Calling it doesn't start it yielding values - it just creates the generator ready to yield them lazily when needed.
This works nicely for small integrands, though it's not very pythonic. The Scheme version works by optimising the tail recursion - the Python version will exceed the max stack depth if your integrand is too long. So this is not really appropriate in Python.
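For example, truncating it with itertools.islice (my quick check, assuming the integrand is a re-iterable sequence such as a list):
from itertools import islice

print(list(islice(integral_rec([1, 2, 3, 4, 5], 8, 0.5), 5)))
# [8, 8.5, 9.5, 11.0, 13.0]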
A direct and natural pythonic version would look something like this, I think:
def integral(integrand, initial_value, dt):
    value = initial_value
    yield value
    for x in integrand:
        value += dt * x
        yield value
This works efficiently and correctly treats the integrand lazily as a "stream". However, it uses iteration rather than recursion to unpack the integrand iterable, which is more the Python way.
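A quick check on the same input (my addition):
print(list(integral([1, 2, 3, 4, 5], 8, 0.5)))
# [8, 8.5, 9.5, 11.0, 13.0, 15.5]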
In moving to natural Python I have also removed the stream combination functions - for example, replaced add_streams with +=. But we could still use them if we wanted a sort of halfway house version:
def accum(initial_value, a):
    value = initial_value
    yield value
    for x in a:
        value += x
        yield value

def integral_hybrid(integrand, initial_value, dt):
    for x in accum(initial_value, scale_stream(integrand, dt)):
        yield x
This hybrid version uses the stream combinators from the Scheme version and avoids only the tail recursion. It is still pythonic, and Python includes various other nice ways to work with iterables in the itertools module. They all "respect streams' lazy character" as you ask.
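For instance, itertools.accumulate can replace the hand-written accum (my sketch, not part of the original answer):
from itertools import accumulate, chain

def integral_accumulate(integrand, initial_value, dt):
    # accumulate sums the scaled stream lazily, seeded with the initial value
    return accumulate(chain([initial_value], (dt * x for x in integrand)))

print(list(integral_accumulate([1, 2, 3, 4, 5], 8, 0.5)))
# [8, 8.5, 9.5, 11.0, 13.0, 15.5]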
Finally here is all the code for the first recursive stream example, much of it taken from the Berkeley reference:
class Stream(object):
    """A lazily computed recursive list."""
    def __init__(self, first, compute_rest, empty=False):
        self.first = first
        self._compute_rest = compute_rest
        self.empty = empty
        self._rest = None
        self._computed = False

    @property
    def rest(self):
        """Return the rest of the stream, computing it if necessary."""
        assert not self.empty, 'Empty streams have no rest.'
        if not self._computed:
            self._rest = self._compute_rest()
            self._computed = True
        return self._rest

    def __repr__(self):
        if self.empty:
            return '<empty stream>'
        return 'Stream({0}, <compute_rest>)'.format(repr(self.first))
Stream.empty = Stream(None, None, True)
def cons_stream(a, b):
    return Stream(a, lambda: b)

def add_streams(a, b):
    if a.empty or b.empty:
        return Stream.empty
    def compute_rest():
        return add_streams(a.rest, b.rest)
    return Stream(a.first + b.first, compute_rest)

def scale_stream(a, scale):
    if a.empty:
        return Stream.empty
    def compute_rest():
        return scale_stream(a.rest, scale)
    return Stream(a.first * scale, compute_rest)

def make_integer_stream(first=1):
    def compute_rest():
        return make_integer_stream(first + 1)
    return Stream(first, compute_rest)

def truncate_stream(s, k):
    if s.empty or k == 0:
        return Stream.empty
    def compute_rest():
        return truncate_stream(s.rest, k - 1)
    return Stream(s.first, compute_rest)

def stream_to_list(s):
    r = []
    while not s.empty:
        r.append(s.first)
        s = s.rest
    return r

def integral(integrand, initial_value, dt):
    placeholder = Stream(initial_value, lambda: None)
    int_rec = cons_stream(initial_value,
                          add_streams(scale_stream(integrand, dt),
                                      placeholder))
    placeholder._compute_rest = lambda: int_rec
    return int_rec
a = truncate_stream(make_integer_stream(),5)
print(stream_to_list(integral(a,8,.5)))

Lazy evaluation in Python

What is lazy evaluation in Python?
One website said :
In Python 3.x the range() function returns a special range object which computes elements of the list on demand (lazy or deferred evaluation):
>>> r = range(10)
>>> print(r)
range(0, 10)
>>> print(r[3])
3
What is meant by this?
The object returned by range() (or xrange() in Python 2.x) is known as a lazy iterable.
Instead of storing the entire range [0, 1, 2, ..., 9] in memory, it stores a definition (start=0, stop=10, step=1) and computes each value only when it is needed (AKA lazy evaluation).
Essentially, a generator allows you to return a list like structure, but here are some differences:
A list stores all elements when it is created. A generator generates the next element when it is needed.
A list can be iterated over as many times as you need; a generator can only be iterated over exactly once (see the sketch after this list).
A list can get elements by index; a generator cannot, because it only produces its values once, from start to end.
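A quick illustration of the one-shot behaviour (my addition):
gen = (x / 2 for x in range(3))
print(list(gen))  # [0.0, 0.5, 1.0]
print(list(gen))  # [] -- the generator is already exhausted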
A generator can be created in two ways:
(1) Very similar to a list comprehension:
# this is a list: it creates all 5000000 x/2 values immediately, uses []
lis = [x/2 for x in range(5000000)]
# this is a generator: it creates each x/2 value only when it is needed, uses ()
gen = (x/2 for x in range(5000000))
(2) As a function, using yield to return the next value:
# this is also a generator, it will run until a yield occurs, and return that result.
# on the next call it picks up where it left off and continues until a yield occurs...
def divby2(n):
    num = 0
    while num < n:
        yield num / 2
        num += 1

# same as (x/2 for x in range(5000000))
print(divby2(5000000))  # prints the generator object itself, not its values
Note: Even though range(5000000) is lazy in Python3.x, [x/2 for x in range(5000000)] is still a list. range(...) does its job and generates x one at a time, but the entire list of x/2 values is computed when the list is created.
In a nutshell, lazy evaluation means that the object is evaluated when it is needed, not when it is created.
In Python 2, range will return a list - this means that if you give it a large number, it will calculate the whole range and return it at the time of creation:
>>> i = range(100)
>>> type(i)
<type 'list'>
In Python 3, however, you get a special range object:
>>> i = range(100)
>>> type(i)
<class 'range'>
Only when you consume it will it actually be evaluated - in other words, it returns the numbers in the range only when you actually need them.
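A quick illustration (my addition): creating a huge range is effectively free, and indexing and membership tests are computed arithmetically rather than by iteration:
r = range(10 ** 12)   # instant; no trillion-element list is built
print(r[10 ** 6])     # 1000000, computed on demand
print(10 ** 9 in r)   # True, computed arithmetically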
The GitHub repo python-patterns and Wikipedia tell us what lazy evaluation is:
Delays the eval of an expr until its value is needed and avoids repeated evals.
range in Python 3 is not complete lazy evaluation, because it doesn't avoid repeated evaluation.
A more classic example for lazy evaluation is cached_property:
import functools

class cached_property(object):
    def __init__(self, function):
        self.function = function
        functools.update_wrapper(self, function)

    def __get__(self, obj, type_):
        if obj is None:
            return self
        # compute once, then store the result on the instance so the
        # descriptor is bypassed on subsequent attribute lookups
        val = self.function(obj)
        obj.__dict__[self.function.__name__] = val
        return val
cached_property (a.k.a. lazy_property) is a decorator which converts a function into a lazily evaluated property. The first time the property is accessed, the function is called to compute the result; the cached value is then used on every subsequent access.
eg:
class LogHandler:
    def __init__(self, file_path):
        self.file_path = file_path

    @cached_property
    def load_log_file(self):
        with open(self.file_path) as f:
            # the file is so big that reading it all takes 2s
            return f.read()

log_handler = LogHandler('./sys.log')
# only the first access costs 2s
print(log_handler.load_log_file)
# the return value is cached on the log_handler object
print(log_handler.load_log_file)
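Incidentally, Python 3.8+ ships functools.cached_property, which does the same job, so the hand-rolled descriptor is only needed on older versions (a sketch):
from functools import cached_property

class LogHandler:
    def __init__(self, file_path):
        self.file_path = file_path

    @cached_property
    def load_log_file(self):
        with open(self.file_path) as f:
            return f.read()  # computed once, then cached on the instance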
To use the proper term, a Python object like range is designed along the lines of the call-by-need pattern, rather than lazy evaluation in the strict sense.

Is there a generally accepted pythonic syntax for modifying a generator's values?

Suppose I have a generator gen. Is there a more pythonic or simpler way of modifying gen's values than the example I have provided?
def genwrap(gen):
    for value in gen:
        yield(somefunc(value))

gen = somegenerator
for x in genwrap(gen):
    print(x)
If it's actually applying a function that already exists, use map. Otherwise, this is fine, and can be shortened to a generator expression if it's simple enough (e.g. (x + 1 for x in somegenerator)).
def genwrap(gen):
    return (somefunc(val) for val in gen)
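Using map as suggested (Python 3, where map is lazy; somefunc and somegenerator stand in for your own names):
gen = map(somefunc, somegenerator)
for x in gen:
    print(x)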
The parentheses are not needed in the yield statement:
def genwrap(gen):
    for value in gen:
        yield somefunc(value)
