Since Python 3.3, if a generator function returns a value, it becomes the value of the StopIteration exception that is raised. This can be collected in a number of ways:
The value of a yield from expression, which implies the enclosing function is also a generator.
Wrapping a call to next() or .send() in a try/except block.
However, if I'm simply wanting to iterate over the generator in a for loop - the easiest way - there doesn't appear to be a way to collect the value of the StopIteration exception, and thus the return value. I'm using a simple example where the generator yields values and returns some kind of summary at the end (running totals, averages, timing statistics, etc.).
for i in produce_values():
    do_something(i)

values_summary = ....??
One way is to handle the loop myself:
values_iter = produce_values()
try:
    while True:
        i = next(values_iter)
        do_something(i)
except StopIteration as e:
    values_summary = e.value
But this throws away the simplicity of the for loop. I can't use yield from since that requires the calling code to be, itself, a generator. Is there a simpler way than the roll-one's-own for loop shown above?
You can think of the value attribute of StopIteration (and arguably StopIteration itself) as implementation details, not designed to be used in "normal" code.
Have a look at PEP 380, which specifies the yield from feature of Python 3.3: it discusses some alternatives to using StopIteration to carry the return value that were considered.
Since you are not supposed to get the return value in an ordinary for loop, there is no syntax for it, in the same way that you are not supposed to catch StopIteration explicitly.
A nice solution for your situation would be a small utility class (might be useful enough for the standard library):
class Generator:
    def __init__(self, gen):
        self.gen = gen

    def __iter__(self):
        self.value = yield from self.gen
This wraps any generator and catches its return value to be inspected later:
>>> def test():
...     yield 1
...     return 2
...
>>> gen = Generator(test())
>>> for i in gen:
...     print(i)
...
1
>>> print(gen.value)
2
You could make a helper wrapper that catches the StopIteration and extracts the value for you:
from functools import wraps

class ValueKeepingGenerator(object):
    def __init__(self, g):
        self.g = g
        self.value = None

    def __iter__(self):
        self.value = yield from self.g

def keep_value(f):
    @wraps(f)
    def g(*args, **kwargs):
        return ValueKeepingGenerator(f(*args, **kwargs))
    return g

@keep_value
def f():
    yield 1
    yield 2
    return "Hi"
v = f()
for x in v:
    print(x)

print(v.value)
A light-weight way to handle the return value (one that doesn't involve instantiating an auxiliary class) is to use dependency injection.
Namely, one can pass in the function to handle / act on the return value using the following wrapper / helper generator function:
def handle_return(generator, func):
    returned = yield from generator
    func(returned)
For example, the following--
def generate():
    yield 1
    yield 2
    return 3

def show_return(value):
    print('returned: {}'.format(value))

for x in handle_return(generate(), show_return):
    print(x)
results in--
1
2
returned: 3
The most obvious method I can think of for this would be a user-defined type that remembers the summary for you:
>>> import random
>>> class ValueProducer:
...     def produce_values(self, n):
...         self._total = 0
...         for i in range(n):
...             r = random.randrange(n*100)
...             self._total += r
...             yield r
...         self.value_summary = self._total/n
...         return self.value_summary
...
>>> v = ValueProducer()
>>> for i in v.produce_values(3):
...     print(i)
...
25
55
179
>>> print(v.value_summary)
86.33333333333333
>>>
Another lightweight way, sometimes appropriate, is to yield the running summary in every generator step, in addition to your primary value, as a tuple. The loop stays simple, with an extra binding that is still available afterwards:
for i, summary in produce_values():
    do_something(i)

show_summary(summary)
This is especially useful if someone could use more than just the last summary value, e.g. for updating a progress view.
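For example, a produce_values along these lines would fit that loop (the generator body, the sample data, and the running-average summary are illustrative assumptions):

def produce_values(data=(10, 20, 30)):
    # yields each value together with a running summary (here, the average)
    total = 0
    for n, value in enumerate(data, start=1):
        total += value
        yield value, total / n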
So I'm trying to make a function that keeps track of how many times a method is called.
For example:
a = [1,2,3,4]
a.pop()
I want to know how many times a.pop() was called so far, so for this example, I would get 1.
Is there a way to do this?
This doesn't work for builtin functions, but an interesting approach would be:
def myfunction():
    myfunction.counter += 1

myfunction.counter = 0
You're giving the function an attribute, so the count is updated on every call. No global variables needed.
Built-ins are read-only. They cannot be modified.
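For example, CPython rejects any attempt to patch a built-in type's methods:

try:
    list.pop = lambda self: None
except TypeError as e:
    print(e)  # can't set attributes of built-in/extension type 'list'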
You could use a decorator that tracks how many times the function is called. Since list is a built-in, you can't decorate or replace its pop method so you'd have to use your own list class, for example.
def counted(f):
    def wrapped(*args, **kwargs):
        wrapped.calls += 1
        return f(*args, **kwargs)
    wrapped.calls = 0
    return wrapped

class MyList(list):
    @counted
    def pop(self, *args, **kwargs):
        return list.pop(self, *args, **kwargs)
x = MyList([1, 2, 3, 4, 5])
for i in range(3):
    x.pop()

print(x.pop.calls)  # prints 3
I used the following little trick to track how many times the function was called:
def myfun(s, i=[0]):
    print(s)
    i[0] += 1  # the mutable default argument is evaluated only ONCE
    return i[0]
>>> myfun('aaa')
aaa
1
>>> myfun('bbb')
bbb
2
Here is a simple and elegant solution for a self counting function, without any decorators, global variables, etc:
def hello():
    hello.counter += 1
    print(hello.counter)

hello.counter = 0
Each time you call hello(), it will print 1, 2, etc.
Let's not forget that, in Python, a function is a first-class citizen and it has rights. And one of them is to have attributes!
If you are willing to include your method call in a function, it can be easy:
def pop_counted(a):
    pop_counted.counter += 1
    return a.pop()

pop_counted.counter = 0
Voilà!
This works because a Python function "knows" itself (this is a necessary feature, so that functions can call themselves recursively if desired).
If you wish to keep some information about a function, it might be better to keep it where it belongs: in an attribute of the function.
The advantage of not using a global variable is scope:
no risk of name collisions in the global namespace
the information you were keeping will vanish as soon as the function is taken off the stack, which is what you want -- no garbage left.
A bonus is that this approach will work in cases where a global variable is not really a good option, typically for nested functions where you can't declare a "global" in the outer function.
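As a sketch of that bonus case (all names here are illustrative), a counter attribute works on a nested function where a module-level global would be awkward:

def make_clicker():
    def click():
        click.calls += 1
        return click.calls
    click.calls = 0
    return click

click = make_clicker()
click()  # 1
click()  # 2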
For kicks, I wrote up an answer using a decorator:
class counter:
    # wraps a function, to keep a running count of how many
    # times it's been called
    def __init__(self, func):
        self.func = func
        self.count = 0

    def __call__(self, *args, **kwargs):
        self.count += 1
        return self.func(*args, **kwargs)
To use it, simply decorate a function. You can then check how many times that function has been run by examining the "count" attribute. Doing it this way is nice because:
1.) No global variables. The count is associated directly with the function.
2.) You can wrap builtin functions easily, by calling the class directly:
sum_wrapped = counter(sum)
sum_wrapped([1, 2, 3, 4])  # returns 10
print(sum_wrapped.count)   # outputs 1
Of course, this could be improved by using the Decorators module to keep the docstrings and other good things intact. Also, for an excellent overview of what decorators are, and how they work, check out this stackoverflow answer.
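As a sketch (assuming functools is what's meant), functools.update_wrapper can copy the wrapped function's metadata onto the class-based decorator above:

import functools

class counter:
    def __init__(self, func):
        self.func = func
        self.count = 0
        # copy __name__, __doc__, etc. from the wrapped function
        functools.update_wrapper(self, func)

    def __call__(self, *args, **kwargs):
        self.count += 1
        return self.func(*args, **kwargs)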
One approach is to create a proxy of the instance for which you want to count attribute access:
from collections import Counter

class CountingProxy():
    def __init__(self, instance):
        self._instance = instance
        self.count = Counter()

    def __getattr__(self, key):
        if hasattr(self._instance, key):
            self.count[key] += 1
        return getattr(self._instance, key)
>>> l = [1,2,3,4,5]
>>> cl = CountingProxy(l)
>>> cl.pop()
5
>>> cl.append(10)
>>> cl.index(3)
2
>>> cl.reverse()
>>> cl.reverse()
>>> cl.count
Counter({'reverse': 2, 'pop': 1, 'append': 1, 'index': 1})
A simple way to do this is to increment a global variable each time you call the function.
counter = 0
a = [1,2,3,4]
a.pop()
counter += 1
I guess the following code will be helpful to you. You just need to declare the variable global in order to access it from a function:
MYGLOBAL = 5

def func1():
    global MYGLOBAL
    MYGLOBAL += 10

def func2():
    print(MYGLOBAL)

func1()  # each call to func1 increases MYGLOBAL by 10
func1()
func1()
func2()  # prints 5 + 3*10 = 35
counter = 0

def pop():
    global counter
    counter += 1
    print(counter)
    # other function code

a = [1, 2, 3, 4]
a.pop()
This should solve your issue, and you should be able to see what's being counted: every time you call the function, the counter is incremented and printed.
IF IT'S BUILT IN:
counter = 0

def newfunction():
    global counter
    a = [1, 2, 3, 4]
    a.pop()
    counter += 1
    print(counter)
The logic here is that calling your new function steps into the premade built-in function, steps back out, and then increments and prints the counter.
Just use a global statement in your function.
count = 1

def your_func():
    global count
    print(count)
    count = count + 1
Just define a global variable and increment it inside function.
a = 0

def some_function():
    global a
    a += 1
    <..Your code.>
This will be incremented automatically as the function is used, and you can access it globally.
I did it by copying the way the JavaScript console.count() method works. That's my code:
class Terminal(object):
    __count_names = []

    def count(self, name='default'):
        # check if name already exists
        i = next((self.__count_names.index(item) for item in self.__count_names if item['name'] == name), None)
        # if name argument does not exist, register it
        if i is None:
            dictionary = {'name': name, 'count': 1}
            self.__count_names.append(dictionary)
        # if it exists, increment 'count'
        else:
            dictionary = self.__count_names[i]
            dictionary['count'] += 1
            self.__count_names[i] = dictionary
        # finally print name and count
        print(f"{dictionary['name']} : {dictionary['count']}")
Your code should look like this:
terminal = Terminal()

def myFunc():
    terminal.count("myFunc")

myFunc()
myFunc()
myFunc()
myFunc()
Output:
myFunc : 1
myFunc : 2
myFunc : 3
myFunc : 4
An example from Datacamp, using decorators:
def counter(func):
    def wrapper(*args, **kwargs):
        wrapper.count += 1
        # Call the function being decorated and return the result
        return func(*args, **kwargs)
    wrapper.count = 0
    # Return the new decorated function
    return wrapper

# Decorate foo() with the counter() decorator
@counter
def foo():
    print('calling foo()')

foo()
foo()
print('foo() was called {} times.'.format(foo.count))
# output
calling foo()
calling foo()
foo() was called 2 times.
I solved this with a closure. This is a generic function:
def counter(fn):
    cnt = 0
    def inner(*args, **kwargs):
        nonlocal cnt
        cnt += 1
        print('{0} has been called {1} times'.format(fn.__name__, cnt))
        return fn(*args, **kwargs)
    return inner

a = [1, 2, 3, 4]
a_pop = counter(a.pop)
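A session with the wrapped method would then look something like this (output assuming the list above):

>>> a_pop()
pop has been called 1 times
4
>>> a_pop()
pop has been called 2 times
3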
I need to get consecutive numbers while an input number doesn't change.
So I get give(5) -> 1, give(5) -> 2, and so on, but then give(6) -> 1 again, restarting the count.
So far I have solved it with an iterator function count() and a function give(num) like this:
def count(start=1):
    n = start
    while True:
        yield n
        n += 1

def give(num):
    global last
    global a
    if num == last:
        ret = next(a)
    else:
        a = count()
        ret = next(a)
    last = num
    return ret
It works, but it's ugly: I have two globals and have to set them before I call give(num). I'd like to be able to call give(num) without previously setting the 'a = count()' and 'last = 999' variables. I'm positive there's a better way to do this...
edit: Thank you all for the incredibly fast and varied responses; I've got a lot to study here.
The obvious thing to do is to make give into an object rather than a function.* Any object can be made callable by defining a __call__ method.
While we're at it, your code can be simplified quite a bit, so let's do that.
class Giver(object):
    def __init__(self):
        self.last, self.a = object(), count()

    def __call__(self, num):
        if num != self.last:
            self.a = count(1)
            self.last = num
        return next(self.a)

give = Giver()
So:
>>> give(5)
1
>>> give(5)
2
>>> give(6)
1
>>> give(5)
1
This also lets you create multiple separate givers, each with its own, separate current state, if you have any need to do that.
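For example (a hypothetical session with two independent instances):

>>> g1, g2 = Giver(), Giver()
>>> g1(5)
1
>>> g1(5)
2
>>> g2(5)
1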
If you want to expand it with more state, the state just goes into the instance variables. For example, you can replace last and a with a dictionary mapping previously-seen values to counters:
from collections import defaultdict

class Giver(object):
    def __init__(self):
        self.counters = defaultdict(count)

    def __call__(self, num):
        return next(self.counters[num])
And now:
>>> give(5)
1
>>> give(5)
2
>>> give(6)
1
>>> give(5)
3
* I sort of skipped a step here. You can always remove globals by putting the variables and everything that uses them (which may just be one function) inside a function or other scope, so they end up as free variables in the function's closure. But in your case, I think this would just make your code look "uglier" (in the same sense you thought it was ugly). But remember that objects and closures are effectively equivalent in what they can do, but different in what they look like—so when one looks horribly ugly, try the other.
Just keep track of the last returned value for each input. You can do this with an ordinary dict:
_counter = {}

def give(n):
    _counter[n] = _counter.get(n, 0) + 1
    return _counter[n]
The standard library has a Counter class that makes things a bit easier:
import collections

_counter = collections.Counter()

def give(n):
    _counter[n] += 1
    return _counter[n]
collections.defaultdict(int) works too.
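A minimal sketch of that variant, for completeness:

import collections

_counter = collections.defaultdict(int)

def give(n):
    _counter[n] += 1
    return _counter[n]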
You can achieve this with something like this:
def count(start=1):
    n = start
    while True:
        yield n
        n += 1

def give(num):
    if num not in give.memo:
        give.memo[num] = count()
    return next(give.memo[num])

give.memo = {}
Which produces:
>>> give(5)
1
>>> give(5)
2
>>> give(5)
3
>>> give(6)
1
>>> give(5)
4
>>>
The two key points are using a dict to keep track of multiple iterators simultaneously, and setting a variable on the function itself. You can do this because functions are themselves objects in Python. This is the equivalent of a static local variable in C.
You can basically get what you want via a combination of defaultdict and itertools.count:
from collections import defaultdict
from itertools import count
_counters = defaultdict(count)
next(_counters[5])
Out[116]: 0
next(_counters[5])
Out[117]: 1
next(_counters[5])
Out[118]: 2
next(_counters[5])
Out[119]: 3
next(_counters[6])
Out[120]: 0
next(_counters[6])
Out[121]: 1
next(_counters[6])
Out[122]: 2
If you need the counter to start at one, you can get that via functools.partial:
from functools import partial
_counters = defaultdict(partial(count,1))
next(_counters[5])
Out[125]: 1
next(_counters[5])
Out[126]: 2
next(_counters[5])
Out[127]: 3
next(_counters[6])
Out[128]: 1
Adding a second answer because this is rather radically different from my first.
What you are basically trying to accomplish is a coroutine - a generator that preserves state and into which values can be sent at arbitrary times. PEP 342 gives us a way to do that with the "yield expression". I'll jump right into how it looks:
from collections import defaultdict
from itertools import count
from functools import partial

def gen(x):
    _counters = defaultdict(partial(count, 1))
    while True:
        out = next(_counters[x])
        sent = yield out
        if sent:
            x = sent
If the _counters line is confusing, see my other answer.
With a coroutine, you can send data into the generator. So you can do something like the following:
g = gen(5)
next(g)
Out[159]: 1
next(g)
Out[160]: 2
g.send(6)
Out[161]: 1
next(g)
Out[162]: 2
next(g)
Out[163]: 3
next(g)
Out[164]: 4
g.send(5)
Out[165]: 3
Notice how the generator preserves state and can switch between counters at will.
In my first answer, I suggested that one solution was to transform the closure into an object. But I skipped a step—you're using global variables, not a closure, and that's part of what you didn't like about it.
Here's a simple way to transform any global state into encapsulated state:
def make_give():
    last, a = None, None
    def give(num):
        nonlocal last
        nonlocal a
        if num != last:
            a = count()
            last = num
        return next(a)
    return give

give = make_give()
Or, adapting my final version of Giver:
def make_giver():
    counters = defaultdict(count)
    def give(num):
        return next(counters[num])
    return give
If you're curious how this works:
>>> give.__closure__
(<cell at 0x10f0e2398: NoneType object at 0x10b40fc50>, <cell at 0x10f0e23d0: NoneType object at 0x10b40fc50>)
>>> give.__code__.co_freevars
('a', 'last')
Those cell objects are essentially references into the stack frame of the make_give call that created the give function.
This doesn't always work quite as well in Python 2.x as in 3.x. While closure cells work the same way, if you assign to a variable inside the function body and there's no global or nonlocal statement, it automatically becomes local, and Python 2 had no nonlocal statement. So, the second version works fine, but for the first version, you'd have to do something like state = {'a': None, 'last': None} and then write state['a'] = count instead of a = count.
This trick—creating a closure just to hide local variables—is very common in a few other languages, like JavaScript. In Python (partly because of the long history without the nonlocal statement, and partly because Python has alternatives that other languages don't), it's less common. It's usually more idiomatic to stash the state in a mutable default parameter value, or an attribute on the function—or, if there's a reasonable class to make the function a method of, as an attribute on the class instances. There are plenty of cases where a closure is pythonic, this just isn't usually one of them.
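For instance, a minimal sketch of the mutable-default stash, reusing the question's count generator (the _state name is an illustrative choice):

def give(num, _state={'last': None, 'a': None}):
    # the default dict is created once, so it persists across calls
    if num != _state['last']:
        _state['a'] = count()
        _state['last'] = num
    return next(_state['a'])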
I have a function
def f():
    while True:
        blah
I want to alter f in such a way that the caller can control the number of times the while loop in f runs, without altering much of the code in f (especially not adding a counter in f). Something like
def f(num_executions=True):
    while num_executions:
        blah()
f() will run an infinite loop
but f(an_expression_that_evaluates_to_true_n_times) will run the while loop n times.
What could such an expression be?
UPDATE:
I know there are plenty of ways to control how many times a loop will run, but the real question here is -
Can an expression in python evaluate to True for configurable number of times?
Some ideas I am toying with:
- making an expression out of list = list[:-1]
- modifying default parameters of a function within a function
No need for a while-loop. Use a for-loop:
>>> def f(n):
...     for _ in range(n):
...         dostuff()
_ is normally used as a placeholder variable name in a for loop. This loop runs n times, so f(5) would loop five times.
While I agree with the others that this is a bad idea, it is entirely (and easily) possible:
class BoolChange:
    def __init__(self):
        self.count = 0

    def __bool__(self):
        self.count += 1
        return self.count <= 5

x = BoolChange()

while x:
    print("Running")
This outputs Running five times, then exits.
The main reason this is a bad idea is that it means checking the state of the object modifies it, which is weird behaviour people won't expect. I can't imagine a good use case for this.
You can't do exactly what you describe. What is passed in Python is not an expression, but a value: an object. An immutable object in general evaluates to either True or False, and will not change during the loop. A mutable object can change its truth value, but you can't make an arbitrary object change during a general loop (which does not touch it in any way). In general, as has been said here, you really need to use a for statement, or pass in a callable object (say, a function):
def f(is_true=lambda: True):
    while is_true():
        blah()
Note that the reason the callable solution is acceptable, while the "hiding in boolean" @Lattyware demonstrated is not, is that the additional computation here is explicit - the () tells the reader that almost anything can happen here, depending on the object passed, whereas otherwise you would need to know that the __bool__ method of that object is silently called and is expected to have side effects.
def f(c=-1):
    while c:
        print('blah')
        if c > 0:
            c -= 1
How about using a generator as a coroutine?
def blah():
    print('hi')

def f():
    while True:
        blah()
        yield
x = f()
next(x) # "hi"
The generator here isn't being used for what it yields, but you get to control how many times it blahs externally, because it yields control every time it blahs.
for i in range(3):
    next(x)  # blah blah blah
This will also work - each call shrinks the default list until it is empty, and an empty list is falsy:
def foo(n=[1, 2, 3]):
    foo.__defaults__ = (foo.__defaults__[0][:-1],)
    return n

while foo():
    blah()