Python class attributes as default arguments in a class method

I don't seem to be able to do this, but it would make sense that you could.
So maybe I just made a mistake.
class Foobar:
    def __init__(self):
        self.myatr = 0
    def add(self, someinput=self.myatr):  # <-- someinput=self.myatr???
        return someinput += 1
but you get the error:
NameError: name 'self' is not defined
But it would be logical if this was the way it worked:
f = Foobar()
f.add() # returns 1
f.add(1) # returns 2

Instance methods are plain functions stored as class attributes, defined when the class is defined, before any instance exists. Likewise, the default value is evaluated once, at definition time, not on demand when the method is called without an explicit argument.
As such, you need a sentinel (typically None) which signals that no argument was passed.
def add(self, someinput=None):
    if someinput is None:
        someinput = self.myatr
    return someinput + 1
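With that change, the behaviour you expected works (a quick check, assuming the Foobar class above with the fixed add method):
f = Foobar()
print(f.add())   # 1 -- no argument passed, so self.myatr (0) is used
print(f.add(1))  # 2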

Default arguments are evaluated at function definition. Moreover, the names of the arguments defined earlier (like self in your function) aren't available during function definition. So when you refer to self.myatr, there's no self yet.
For example, consider this function:
>>> def test(thing=print('hello')):
...     ...
...
hello
>>>
The expression print('hello') was evaluated right when the function was defined, and it won't be re-evaluated when you call test.
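Calling test afterwards prints nothing, because the default is not re-evaluated (a quick check in the same session):
>>> test()
>>> test()
>>>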
Also, return someinput += 1 is an error too because assignment is not an expression.
Furthermore, integers are immutable, and x += 1 merely rebinds a name, so if you do this:
def test(x):
    x += 1
    return x
a = 6
test(a)
a will still be equal to 6, since x += 1 only rebinds the local name x inside test; the object that a refers to is never changed.

Related

How to count function calls using decorators?

I'm refreshing my memory about some Python features that I didn't fully get yet. I'm learning from this Python tutorial and there's an example that I don't fully understand. It's about a decorator counting calls to a function; here's the code:
def call_counter(func):
    def helper(x):
        helper.calls += 1
        return func(x)
    helper.calls = 0
    return helper

@call_counter
def succ(x):
    return x + 1

if __name__ == '__main__':
    print(succ.calls)
    for i in range(10):
        print(succ(i))
    print(succ.calls)
What I don't get here is why we increment the calls attribute of the wrapper (helper.calls += 1) instead of the calls of the function itself, and why does it actually work?
The important thing to remember about decorators is that a decorator is a function that takes a function as an argument, and returns yet another function. The returned value - yet another function - is what will be called when the name of the original function is invoked.
This model can be very simple:
def my_decorator(fn):
    print("Decorator was called")
    return fn
In this case, the returned function is the same as the passed-in function. But that's usually not what you do. Usually, you return either a completely different function, or you return a function that somehow chains or wraps the original function.
In your example, which is a very common model, you have an inner function that is returned:
def helper(x):
    helper.calls += 1
    return func(x)
This inner function calls the original function (return func(x)) but it also increments the calls counter.
This inner function is being inserted as a "replacement" for whatever function is being decorated. So when your module's succ function is looked up, the result is a reference to the inner helper function returned by the decorator. That function increments the call counter and then calls the originally defined succ function.
When you decorate a function you "substitute" your function with the wrapper.
In this example, after the decoration, when you call succ you are actually calling helper. So if you are counting calls you have to increase the helper calls.
You can check that once you decorate a function, its name is bound to the wrapper, by checking the __name__ attribute of the decorated function:
def call_counter(func):
    def helper(*args, **kwargs):
        helper.calls += 1
        print(helper.calls)
        return func(*args, **kwargs)
    helper.calls = 0
    return helper

@call_counter
def succ(x):
    return x + 1

succ(0)
>>> 1
succ(1)
>>> 2
print(succ.__name__)
>>> 'helper'
print(succ.calls)
>>> 2
Example with Class Decorator
When you decorate a function with a class decorator, every decorated function gets its own call_count. That is the simplicity of OOP: every time a CallCountDecorator object is called, it increments its own call_count attribute and prints it.
class CallCountDecorator:
    """
    A decorator that will count and print how many times the decorated function was called
    """
    def __init__(self, inline_func):
        self.call_count = 0
        self.inline_func = inline_func

    def __call__(self, *args, **kwargs):
        self.call_count += 1
        self._print_call_count()
        return self.inline_func(*args, **kwargs)

    def _print_call_count(self):
        print(f"The {self.inline_func.__name__} called {self.call_count} times")

@CallCountDecorator
def function():
    pass

@CallCountDecorator
def function2(a, b):
    pass

if __name__ == "__main__":
    function()
    function2(1, b=2)
    function()
    function2(a=2, b=3)
    function2(0, 1)

# OUTPUT
# --------------
# The function called 1 times
# The function2 called 1 times
# The function called 2 times
# The function2 called 2 times
# The function2 called 3 times
What I don't get here is why we increment the calls attribute of the wrapper (helper.calls += 1) instead of the calls of the function itself, and why does it actually work?
I think it's to make it a generically useful decorator. You could do this:
def succ(x):
    succ.calls += 1
    return x + 1

if __name__ == '__main__':
    succ.calls = 0
    print(succ.calls)
    for i in range(10):
        print(succ(i))
    print(succ.calls)
which works just fine, but you would need to put the .calls += 1 in every function you wanted to apply this to, and initialise it to 0 before you ran any of them. If you had a whole bunch of functions you wanted to count, the decorator is definitely nicer. Plus it initialises the counters to 0 at definition, which is nice.
As I understand it, it works because the decorator replaces the function succ with the helper function from within the decorator (which is redefined every time it decorates a function), so succ = helper and succ.calls = helper.calls (although of course the name helper is only defined within the namespace of the decorator).
Does that make sense?
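For illustration, here is a minimal sketch (the function names double and negate are made up, and it assumes the call_counter decorator from the question) showing that each decorated function gets its own independent counter:
@call_counter
def double(x):
    return 2 * x

@call_counter
def negate(x):
    return -x

double(3)
double(4)
negate(5)
print(double.calls)  # 2 -- each decorated function carries its own counter
print(negate.calls)  # 1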
As I understand this (correct me if I'm wrong), the order your program executes in is:
Register call_counter.
Register succ.
While registering succ, the interpreter finds a decorator, so it executes call_counter.
Your decorator returns an object which is a function (helper), and adds the field calls to that object.
Now your name succ has been assigned to helper. So when you call your function, you're actually calling the helper function, wrapped within the decorator. So every field you add to your helper function is accessible from outside by addressing succ, because those two names refer to the same thing.
So when you call succ() it's basically the same as if you did helper(*args, **kwargs).
Check this out:
def helper(x):
helper.calls += 1
return 2
helper.calls = 0
def call_counter(func):
return helper
#call_counter
def succ(x):
return x + 1
if __name__ == '__main__':
print(succ == helper) # prints true.

Unexpected behavior while importing python Modules with Class Definitions

Consider the code in the file my_module.py:
class A(object):
def __init__(self, x=er()):
self.x = x
Now, when I import this module
import my_module
I get the error:
NameError: name 'er' is not defined
While I understand that my_module does not have er defined, I am never creating an instance of class A, so it is puzzling that Python tries to execute part of __init__ when simply importing the module. However, the __init__ body itself is not executed, as shown by the example below:
class A(object):
    def __init__(self, x=5):
        self.x = x
        print('I am here')
Now, when I import the module - nothing is printed and this is expected behavior.
I am puzzled why the function er is called in the first example when I do not instantiate an object of class A. Any pointers to the documentation that explains this?
Because in Python, default argument values are evaluated at definition time. See, for example this question, or this notorious question.
This is documented here
The default values are evaluated at the point of function definition in the defining scope, so that
i = 5

def f(arg=i):
    print(arg)

i = 6
f()

will print 5.
Important warning: The default value is evaluated only once. This makes a difference when the default is a mutable object such as a list, dictionary, or instances of most classes. For example, the following function accumulates the arguments passed to it on subsequent calls:
def f(a, L=[]):
    L.append(a)
    return L

print(f(1))
print(f(2))
print(f(3))
This will print
[1]
[1, 2]
[1, 2, 3]
If you don't want the default to be shared between subsequent calls, you can write the function like this instead:
def f(a, L=None):
    if L is None:
        L = []
    L.append(a)
    return L
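With that version, each call that omits L gets a fresh list (a quick check):
>>> f(1)
[1]
>>> f(2)
[2]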

Behavioural difference between decorated function and method in Python

I use the following workaround for "Pythonic static variables":
def static_vars(**kwargs):
    """decorator for functions that sets static variables"""
    def decorate(func):
        for k, v in kwargs.items():
            setattr(func, k, v)
        return func
    return decorate

@static_vars(var=1)
def global_foo():
    _ = global_foo
    print _.var
    _.var += 1

global_foo() # >>> 1
global_foo() # >>> 2
It works just as it's supposed to. But when I move such a decorated function inside a class I get a strange change:
class A(object):
    @static_vars(var=1)
    def foo(self):
        bound = self.foo
        unbound = A.foo
        print bound.var   # OK, prints 1 at first call
        bound.var += 1    # AttributeError: 'instancemethod' object has no attribute 'var'
    def check(self):
        bound = self.foo
        unbound = A.foo
        print 'var' in dir(bound)
        print 'var' in dir(unbound)
        print bound.var is unbound.var # it doesn't make much sense but anyway

a = A()
a.check() # >>> True
          # >>> True
          # >>> True
a.foo() # ERROR
I cannot see what causes such behaviour. It seems to me that it has something to do with the Python descriptor protocol, all that bound vs unbound method stuff. Somehow the foo.var attribute is accessible but is not writable.
Any help is appreciated.
P.S. I understand that static function variables are essentially class variables and this decorator is unnecessary in the second case but the question is more for understanding the Python under the hood than to get any working solution.
a.foo doesn't return the actual function you defined; it returns a bound method of it, which wraps up the function and has self assigned to a.
https://docs.python.org/3/howto/descriptor.html#functions-and-methods
That guide is out of date a little, though, since unbound methods just return the function in Python 3.
So, to access the attributes on the function, you need to go through A.foo (or a.foo.__func__) instead of a.foo. Going through A.foo will only work in Python 3; in Python 2, I think A.foo.__func__ will work.
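A minimal Python 3 sketch of the difference (assuming the static_vars decorator from the question):
class A:
    @static_vars(var=1)
    def foo(self):
        A.foo.var += 1   # write through the underlying function object
        print(A.foo.var)

a = A()
print(a.foo.var)  # 1 -- reads are forwarded from the bound method to __func__
a.foo()           # 2 -- but writes must go through A.foo (or a.foo.__func__)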

python TypeError: 'int' object is not callable

I have homework and we need to do something like an iterator. The function works great, but the teacher told me he runs the function with t=Make_iterator() like this. What did I do wrong? Thanks!
global x
x = -1

def Make_iterator(fn):
    global x
    x += 1
    return fn(x)

fn = lambda y: y*2
t = Make_iterator(fn)
print(t())
I think you want a closure, which is a function defined within the local namespace of another function, so that it can access the outer function's variables:
def make_iterator(func):
    x = -1
    def helper():
        nonlocal x
        x += 1
        return func(x)
    return helper
The nonlocal statement allows the inner function to modify the variable declared in the outer function (otherwise you'd either get an error, or you'd bind your own local variable without changing the outer one). It was only added in Python 3, so if you're still using Python 2, you'll need to wrap the x value in a mutable data structure, like a list.
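For instance, a rough Python 2 compatible variant of the closure above (the name state is just illustrative) that stores the counter in a list instead of using nonlocal:
def make_iterator(func):
    state = [-1]              # mutable container instead of nonlocal
    def helper():
        state[0] += 1
        return func(state[0])
    return helper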
Another approach to the same idea is to write class, rather than a function. An instance of a class can be callable (just like a function) if the class defines a __call__ method:
class MyIterator(object):
    def __init__(self, func):
        self.index = -1
        self.func = func
    def __call__(self):
        self.index += 1
        return self.func(self.index)
This can be useful if the state you need to keep track of is more complicated (or should change in more complicated ways) than the simple integer index used in this example. It also works in Python 2 without annoying workarounds.
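Usage looks the same as calling the closure version (a quick check, using the doubling function from the question):
t = MyIterator(lambda y: y * 2)
print(t())  # 0
print(t())  # 2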
I think he wants your Make_iterator function to return a function that acts as an iterator. So you could wrap the contents of your current Make_iterator function within an inner function f and return that:
def Make_iterator(fn):
    def f():
        global x
        x += 1
        return fn(x)
    return f
Now if you do t = Make_iterator(fn), every time you call t() it will return the next value of the iterator, in your case 0, 2, 4, 6, 8, etc...
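A quick check of that (x must still exist as a global, as in your original code):
x = -1
fn = lambda y: y * 2
t = Make_iterator(fn)
print(t())  # 0
print(t())  # 2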

Default value of a member function in Python

class foo():
    def __init__(self, a):
        self.A = a
    def foo2(self, aa=self.A):
        return "la"
    @classmethod
    def test(cls):
        d = foo()
        d.foo2()
The member function foo2 cannot find self.A nor A. Is it because A is not globally set inside the class?
Keyword arguments, such as aa, cannot take on a default value from self. Keyword arguments are evaluated when the method is defined, not when it is called. Typically one would achieve what you're trying by setting the default of aa to None:
class foo():
    def __init__(self, a):
        self.A = a
    def foo2(self, aa=None):
        if aa is None:
            aa = self.A
        return 'la'
Note also that since keyword argument defaults are evaluated at definition, not execution, all invocations of foo2 share their default argument even if called from different instances of foo. This often trips up new Python programmers when working with methods such as:
def footoo(a=list()):
    a.append(1)
    return a
All calls to footoo will get the same list object; not a new one at each call. So calling footoo repeatedly will result in the following:
>>> footoo()
[1]
>>> footoo()
[1, 1]
>>> footoo()
[1, 1, 1]
You are right, you can't do that.
For example, in Python, you have to be very careful about this because it will use the same list over and over (in this case, it will continue appending to the same list):
def foo2(self, aa=[]):
    aa.append('foo')
    return "la"
Instead, a very common approach is to assign None as the default value, and then have an if-statement to set it inside the function:
def foo2(self, aa=None):
    if not aa:
        aa = self.A
    return "la"
The error occurs when the default value is evaluated. This happens when the class is being defined (not when the method is called). At that point in time, self has no meaning.
One way to fix this is like so:
...
def foo2(self, aa=None):
    if aa is None:
        aa = self.A
    return "la"
...
