I want to count how many times each function or variable is invoked in existing Python code.
What I thought of is overriding the object's __getattribute__ method, like below:
acc = {}

class object(object):
    def __getattribute__(self, p):
        acc.update({str(self) + p: acc.get(str(self) + p, 0) + 1})
        return super(object, self).__getattribute__(p)

class A(object):
    def a(self):
        pass

class B(A):
    def b(self):
        pass

def main():
    a = A()
    a.a()
    b = B()
    b.b()
    b.a = 'a'
    b.a
    print acc

if __name__ == '__main__':
    main()
But this only counts functions and variables accessed on objects; how can I count ordinary functions or variables, such as:
def fun1():
    pass

fun1()
fun1()
I want to get the result 2; is there any tool or method to do it?
Sorry for my poor English. What I really need is the number of times a function is invoked, not its running time.
As above, fun1() is invoked two times.
Use a decorator.
>>> def timestamp(container, get_timestamp):
...     def timestamp_decorator(func):
...         def decorated(*args, **kwargs):
...             container[func.func_name] = get_timestamp()
...             return func(*args, **kwargs)
...         return decorated
...     return timestamp_decorator
...
And you use it like this:
>>> import datetime
>>> def get_timestamp():
...     return datetime.datetime.now()
...
>>> timestamps = {}
>>> @timestamp(timestamps, get_timestamp)
... def foo(a):
...     return a * 2
...
>>> x = foo(2)
>>> print x, timestamps
4 {'foo': datetime.datetime(2012, 2, 14, 9, 55, 15, 789893)}
There is a way to create a counter decorator for a function (not a timestamp decorator), and to automatically wrap all functions in a given module with this decorator.
So, if the module in which you want to count the function calls is named "mymodule", you can write:
class function_counter(object):
    def __init__(self, func):
        self.counter = 0
        self.func = func
    def __call__(self, *args, **kw):
        self.counter += 1
        return self.func(*args, **kw)
And:
>>> @function_counter
... def bla():
...     pass
...
>>>
>>> bla()
>>> bla()
>>> bla()
>>> bla.counter
3
To apply this to all the functions in a module, you can write something like:
import mymodule
from types import FunctionType, BuiltinFunctionType

# define the "function_counter" class as above here (or import it)

for key, value in mymodule.__dict__.items():
    if isinstance(value, (FunctionType, BuiltinFunctionType)):
        mymodule.__dict__[key] = function_counter(value)
That would do for counting function usage.
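For the original fun1 example, here is a minimal sketch of my own (not from the answers above; the call_counts name is mine) using a closure plus functools.wraps so the wrapped function keeps its name:

import functools

call_counts = {}  # maps function name -> number of invocations

def count_calls(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        call_counts[func.__name__] = call_counts.get(func.__name__, 0) + 1
        return func(*args, **kwargs)
    return wrapper

@count_calls
def fun1():
    pass

fun1()
fun1()
print(call_counts)   # {'fun1': 2}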
If you want to count module-level variable usage, though, it is not that easy, because
you can't just override the attribute-retrieval mechanism of a module object the way you did for a class in your example.
The way to go there is to substitute a class instance for your module, one that implements the attribute-counting scheme as in your example, after you import your module, and assign all module attributes to instance attributes of this class.
This is not a tested example (unlike the above), but try something along these lines:
import mymodule
from types import FunctionType

class Counter(object):
    # counting __getattribute__ just as you did above
    pass

c = Counter()
for key, value in mymodule.__dict__.items():
    setattr(c, key, staticmethod(value) if isinstance(value, FunctionType) else value)
mymodule = c
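Filling in that untested sketch, one possible version (my own; the _access_counts name is mine, and it skips the staticmethod wrapping because functions stored as instance attributes are not bound anyway):

import mymodule

class Counter(object):
    _access_counts = {}

    def __getattribute__(self, name):
        # count only "public" accesses so the lookup of _access_counts itself is not counted
        if not name.startswith('_'):
            counts = object.__getattribute__(self, '_access_counts')
            counts[name] = counts.get(name, 0) + 1
        return object.__getattribute__(self, name)

c = Counter()
for key, value in mymodule.__dict__.items():
    if not key.startswith('_'):
        setattr(c, key, value)
mymodule = c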
Related
Suppose I have some function A.foo() that instantiates and uses an instance of B, calling the member function bar on it.
How can I set return_value on a mocked instance of B when I'm testing my A class, given that I don't have access to the instance of B? Maybe some code would illustrate this better:
import unittest
import unittest.mock
import pandas

class A:
    def foo(self):
        b = B()
        return b.bar()

class B:
    def bar(self):
        return 1

@unittest.mock.patch("__main__.B")
class MyTestCase(unittest.TestCase):
    def test_case_1(self, MockB):
        MockB.bar.return_value = 2
        a = A()
        self.assertEqual(a.foo(), 2)

test_case = MyTestCase()
test_case.test_case_1()
This fails with:
AssertionError: <MagicMock name='B().bar()' id='140542513129176'> != 2
Apparently the line MockB.bar.return_value = 2 didn't modify the return value of the method.
I think you are not instantiating MockB. You can directly mock "__main__.B.bar":
#unittest.mock.patch("__main__.B.bar")
class MyTestCase(unittest.TestCase):
def test_case_1(self, MockB):
MockB.return_value = 2
a = A()
self.assertEqual(a.foo(), 2)
You have just one mistake in your code. Replace this line:
MockB.bar.return_value = 2
with:
MockB.return_value.bar.return_value = 2
and it will work.
I assume the piece of code you pasted is just a toy example. If classes A and B live in another file, e.g. src/somedir/somefile.py, don't forget to patch the full path:
#unittest.mock.patch("src.somedir.somefile.B")
class MyTestCase(unittest.TestCase):
...
Update
To further expand on this, you can see some usage in the docs:
>>> class Class:
...     def method(self):
...         pass
...
>>> with patch('__main__.Class') as MockClass:
...     instance = MockClass.return_value
...     instance.method.return_value = 'foo'
...     assert Class() is instance
...     assert Class().method() == 'foo'
...
So in your case:
MockB.bar.return_value is like calling a static method e.g. print(MockB.bar())
MockB.return_value.bar.return_value is like calling a class/instance method e.g. print(MockB().bar())
To visualize this:
import unittest.mock

class SomeClass:
    def method(self):
        return 1

@unittest.mock.patch("__main__.SomeClass")
def test_mock(mock_class):
    print(mock_class)
    print(mock_class.return_value)
    mock_class.method.return_value = -10
    mock_class.return_value.method.return_value = -20
    print(SomeClass.method())
    print(SomeClass().method())

test_mock()
$ python3 test_src.py
<MagicMock name='SomeClass' id='140568144584128'>
<MagicMock name='SomeClass()' id='140568144785952'>
-10
-20
As you can see, mock_class.return_value is the one used for instance operations such as SomeClass().method().
You can solve this without mock.patch. Change the foo method to accept a factory for the dependency it should construct (dependency injection).
class A:
    def foo(self, b_factory: 'Callable[[], B]' = B):
        b = b_factory()
        return b.bar()

def normal_code():
    a = A()
    assert a.foo() == ...

def test():
    dummy_b = ...  # build a dummy object here however you like
    a = A()
    assert a.foo(b_factory=lambda: dummy_b) == 2
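To make that test concrete, one possibility (the StubB name is my own, not from the answer) is a tiny hand-written stand-in instead of a mock:

class StubB:
    """Minimal stand-in for B, used only by the test."""
    def bar(self):
        return 2

def test():
    a = A()
    assert a.foo(b_factory=StubB) == 2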
I'm struggling to pickle a wrapped function when I use a custom callable class as a wrapper.
I have a callable class "Dependee" that keeps track of dependencies for a wrapped function with a member variable "depends_on". I'd like to use a decorator to wrap functions and also be able to pickle the resulting wrapped function.
So I define my dependee class. Note the use of functools.update_wrapper.
>>> import functools
>>> class Dependee:
...
...     def __init__(self, func, depends_on=None):
...         self.func = func
...         self.depends_on = depends_on or []
...         functools.update_wrapper(self, func)
...
...     def __call__(self, *args, **kwargs):
...         return self.func(*args, **kwargs)
...
Then I define my decorator such that it will return an instance of the Dependee wrapper class.
>>> class depends:
...
...     def __init__(self, on=None):
...         self.depends_on = on or []
...
...     def __call__(self, func):
...         return Dependee(func, self.depends_on)
...
Here's an example of a wrapped function.
>>> #depends(on=["foo", "bar"])
... def sum(x, y): return x+y
...
The member variable seems to be accessible.
>>> print(sum.depends_on)
['foo', 'bar']
I can call the function as expected.
>>> print(sum(1,2))
3
But I can't pickle the wrapped instance.
>>> print(pickle.dumps(sum))
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
_pickle.PicklingError: Can't pickle <function sum at 0x7f543863fbf8>: it's not the same object as __main__.sum
What am I missing? How can I give pickle a more appropriately qualified name so that it can find the instance rather than the original function? Note that manual wrapping works just fine.
>>> def sum2_func(x,y): return x+y
...
>>> sum2 = Dependee(sum2_func, depends_on=["foo", "bar"])
>>> print(sum2.depends_on)
['foo', 'bar']
>>> print(sum2(1,2))
3
>>> print(pickle.loads(pickle.dumps(sum2)).depends_on)
['foo', 'bar']
You just need a better serializer, like dill. As for how it works: dill registers a lot of Python types with the equivalent of copy_reg, it treats __main__ similarly to a module, and it can serialize either by reference or by object. The last bit is relevant if you want to serialize a function or class and take the class/function definition along with the pickle. It's a slightly bigger pickle than serializing by reference, but it's more robust.
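As a rough illustration of what that copy_reg-style registration means (a generic sketch of my own, not dill's actual internals; copyreg is the Python 3 spelling of copy_reg):

import copyreg
import pickle

class Point(object):
    def __init__(self, x, y):
        self.x, self.y = x, y

def pickle_point(p):
    # tell pickle how to rebuild a Point: call Point(x, y) again on load
    return Point, (p.x, p.y)

copyreg.pickle(Point, pickle_point)

restored = pickle.loads(pickle.dumps(Point(1, 2)))
print(restored.x, restored.y)   # 1 2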
Here's your code exactly:
>>> import dill
>>> import functools
>>> class Dependee:
...     def __init__(self, func, depends_on=None):
...         self.func = func
...         self.depends_on = depends_on or []
...         functools.update_wrapper(self, func)
...     def __call__(self, *args, **kwargs):
...         return self.func(*args, **kwargs)
...
>>>
>>> class depends:
...     def __init__(self, on=None):
...         self.depends_on = on or []
...     def __call__(self, func):
...         return Dependee(func, self.depends_on)
...
>>> @depends(on=['foo','bar'])
... def sum(x,y): return x+y
...
>>> print(sum.depends_on)
['foo', 'bar']
>>> print(sum(1,2))
3
>>> _sum = dill.dumps(sum)
>>> sum_ = dill.loads(_sum)
>>> print(sum_(1,2))
3
>>> print(sum_.depends_on)
['foo', 'bar']
>>>
Get dill here: https://github.com/uqfoundation
Yep, a well-known pickle problem: pickle can't handle functions or classes that can't simply be retrieved by their name in the module. See e.g. https://code.google.com/p/modwsgi/wiki/IssuesWithPickleModule for clear examples (specifically on how this affects modwsgi, but also of the issue in general).
In this case since all you're doing is adding attributes to the function, you can get away with a simplified approach:
class depends:
    def __init__(self, on=None):
        self.depends_on = on or []
    def __call__(self, func):
        func.func = func
        func.depends_on = self.depends_on or []
        return func
The return func is the key idea: return the same object that's being decorated (possibly after decorating it, as here, with additional attributes), but not a different object, or the name-vs-identity issue crops up again.
Now this will work (just your original code, only changing depends as above):
$ python d.py
['foo', 'bar']
3
c__main__
sum
p0
.
Of course, this isn't a general-purpose solution (it only works if it's feasible for the decorator to return the same object it's decorating), just one that works in your example.
I am not aware of any serialization approach able to serialize and de-serialize Python objects without this limitation, alas.
In Python a function is a first-class object. A class can be called, so you can replace a function with a class. But can you make a function behave like a class? Can you add and remove attributes, or call inner functions (then called methods) of a function?
I found a way to do this via code inspection.
import inspect

class AddOne(object):
    """class definition"""
    def __init__(self, num):
        self.num = num

    def getResult(self):
        """
        class method
        """
        def addOneFunc(num):
            "inner function"
            return num + 1
        return addOneFunc(self.num)

if __name__ == '__main__':
    two = AddOne(1)
    two_src = '\n'.join([line[4:] for line in inspect.getsource(AddOne.getResult).split('\n')])
    one_src = '\n'.join([line[4:] for line in two_src.split('\n')
                         if line[:4] == '    ' and line[4:8] == '    ' or line[4:8] == 'def '])
    one_co = compile(one_src, '<string>', 'exec')
    exec one_co
    print addOneFunc(5)
    print addOneFunc.__doc__
But is there a way to access the local variables and functions defined in a class in a more direct way?
EDIT
The question is about how to access the inner structure of Python to get a better understanding. Of course I wouldn't do this in normal programming. The question arose when we had a discussion about private variables in Python. My opinion was that these are against the philosophy of the language. So someone came up with the example above, and at the moment it seems he is right: you cannot access the function inside a function without the inspect module, which renders this function private. With co_varnames we are awfully close, because we already have the name of the function. But where is the namespace dictionary that holds the name? If you try to use
getResult.__dict__
it is empty. What I would like to have is an answer from Python like
function addOneFunc at <0xXXXXXXXXX>
You can consider a function to be an instance of a class that only implements __call__, i.e.
def foo(bar):
    return bar
is roughly equivalent to
class Foo(object):
    def __call__(self, bar):
        return bar

foo = Foo()
Function instances have a __dict__ attribute, so you can freely add new attributes to them.
Adding an attribute to a function can be used, for example, to implement a memoization decorator, which caches previous calls to a function:
import functools

def memo(f):
    @functools.wraps(f)
    def func(*args):
        if args not in func.cache:  # access attribute
            func.cache[args] = f(*args)
        return func.cache[args]
    func.cache = {}  # add attribute
    return func
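For instance, a quick usage sketch of that memo decorator (the fib example is mine):

@memo
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(30))           # 832040
print(fib.cache[(30,)])  # same value, read straight from the cache attribute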
Note that this attribute can also be accessed inside the function, although it can't be assigned until after the function has been defined.
You could therefore do something like:
>>> def foo(baz):
...     def multiply(x, n):
...         return x * n
...     return multiply(foo.bar(baz), foo.n)
...
>>> def bar(baz):
...     return baz
...
>>> foo.bar = bar
>>> foo.n = 2
>>> foo('baz')
'bazbaz'
>>> foo.bar = len
>>> foo('baz')
6
(although it's possible that nobody would thank you for it!)
Note, however, that multiply, which was not made an attribute of foo, is not accessible from outside the function:
>>> foo.multiply(1, 2)
Traceback (most recent call last):
File "<pyshell#20>", line 1, in <module>
foo.multiply(1, 2)
AttributeError: 'function' object has no attribute 'multiply'
The other question addresses exactly what you're trying to do:
>>> import inspect
>>> import new
>>> class AddOne(object):
...     """Class definition."""
...     def __init__(self, num):
...         self.num = num
...     def getResult(self):
...         """Class method."""
...         def addOneFunc(num):
...             "inner function"
...             return num + 1
...         return addOneFunc(self.num)
...
>>> two = AddOne(1)
>>> for c in two.getResult.func_code.co_consts:
...     if inspect.iscode(c):
...         print new.function(c, globals())
...
<function addOneFunc at 0x0321E930>
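For reference (my own addition, not part of the answer): the new module is Python 2 only; on Python 3 the same trick can be written with types.FunctionType and __code__:

import inspect
import types

class AddOne(object):
    def __init__(self, num):
        self.num = num
    def getResult(self):
        def addOneFunc(num):
            "inner function"
            return num + 1
        return addOneFunc(self.num)

two = AddOne(1)
for c in two.getResult.__code__.co_consts:
    if inspect.iscode(c):
        inner = types.FunctionType(c, globals())
        print(inner)      # prints something like <function ...addOneFunc at 0x...>
        print(inner(5))   # 6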
Not sure if the following is what you're thinking about, but you can do this:
>>> def f(x):
...     print(x)
...
>>> f.a = 1
>>> f.a
1
>>> f(54)
54
>>> f.a = f
>>> f
<function f at 0x7fb03579b320>
>>> f.a
<function f at 0x7fb03579b320>
>>> f.a(2)
2
So you can assign attributes to a function, and those attributes can be variables or functions (note that f.a = f was chosen for simplicity; you can assign f.a to any function of course).
If you want to access the local variables inside the function, I think it's more difficult, and you may indeed need to resort to introspection. The example below uses the func_code attribute:
>>> def f(x):
...     a = 1
...     return x * a
...
>>> f.func_code.co_nlocals
2
>>> f.func_code.co_varnames
('x', 'a')
>>> f.func_code.co_consts
(None, 1)
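For completeness (my own note, not from the answer): on Python 3 func_code is spelled __code__, so the same inspection looks like this:

def f(x):
    a = 1
    return x * a

print(f.__code__.co_nlocals)    # 2
print(f.__code__.co_varnames)   # ('x', 'a')
print(f.__code__.co_consts)     # (None, 1)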
What I want to do is something like:
class Foo(object):
    def __init__(self):
        pass
    def f(self):
        print "f"
    def g(self):
        print "g"

# programmatically set the "default" operation
fer = Foo()
fer.__call__ = fer.f

# a different instance does something else as its
# default operation
ger = Foo()
ger.__call__ = ger.g

fer()  # invoke different functions on different
ger()  # objects depending on how they were set up.
But as of 2.7 (which I'm currently using) I can't do this; the attempt at fer()
raises an exception.
Is there a way to, in effect, set a per instance __call__ method?
The normal stuff with types.MethodType unfortunately doesn't work here since __call__ is a special method.
From the data model:
Class instances are callable only when the class has a __call__() method; x(arguments) is a shorthand for x.__call__(arguments).
This is slightly ambiguous as to what is actually called, but it's clear that your class needs to have a __call__ method.
You'll need to create some sort of hack:
class Foo(object):
    def __init__(self):
        pass
    def f(self):
        print "f"
    def g(self):
        print "g"
    def __call__(self):
        return self.__call__()

f = Foo()
f.__call__ = f.f
f()

g = Foo()
g.__call__ = g.g
g()
Careful with this though, it'll result in an infinite recursion if you don't set a __call__ on an instance before you try to call it.
Note that I don't actually recommend naming the attribute that you rebind __call__. The point here is to demonstrate that Python translates f() into f.__class__.__call__(f), so there's nothing you can do to change that on a per-instance basis: the class's __call__ will be called no matter what you do. You just need to do something to change the behavior of the class's __call__ per instance, which is easily achieved.
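A quick sketch of that lookup rule (my own illustration): assigning __call__ on an instance does not change what f() does, because the special method is looked up on the type.

class Foo(object):
    def __call__(self):
        return "class __call__"

f = Foo()
f.__call__ = lambda: "instance __call__"
print(f())           # class __call__   (the instance attribute is ignored)
print(f.__call__())  # instance __call__ (explicit attribute access still sees it)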
You could use a setter-style method to actually create bound methods on your instances (rather than plain functions), and of course that could be turned into a property:
import types

class Foo(object):
    def __init__(self):
        pass
    def f(self):
        print "f"
    def g(self):
        print "g"
    def set_func(self, f):
        self.func = types.MethodType(f, self)
    def __call__(self, *args, **kwargs):
        self.func(*args, **kwargs)

f = Foo()
f.set_func(Foo.f)
f()

def another_func(self, *args):
    print args

f.set_func(another_func)
f(1, 2, 3, "bar")
You might be trying to solve the wrong problem.
Since Python allows procedural creation of classes, you could write code like this:
>>> def create_class(cb):
...     class Foo(object):
...         __call__ = cb
...     return Foo
...
>>> Foo1 = create_class(lambda self: 42)
>>> foo1 = Foo1()
>>> foo1()
>>> Foo2 = create_class(lambda self: self.__class__.__name__)
>>> foo2 = Foo2()
>>> foo2()
Please note though that Foo1 and Foo2 do not have a common base class in this case, so isinstance and issubclass will not work. If you need them to have a common base class, I would go for the following code:
>>> class Foo(object):
...     @classmethod
...     def create_subclass(cls, cb):
...         class SubFoo(cls):
...             __call__ = cb
...         return SubFoo
...
>>> Foo1 = Foo.create_subclass(lambda self: 42)
>>> foo1 = Foo1()
>>> foo1()
>>> Foo2 = Foo.create_subclass(lambda self: self.__class__.__name__)
>>> foo2 = Foo2()
>>> foo2()
'SubFoo'
>>> issubclass(Foo1, Foo)
True
>>> issubclass(Foo2, Foo)
True
I really like the second way as it provides a clean class hierarchy and looks quite clean to me.
Possible solution:
class Foo(object):
def __init__(self):
self._callable = lambda s: None
def f(self):
print "f"
def set_callable(self, func):
self._callable = func
def g(self):
print "g"
def __call__(self):
return self._callable()
d = Foo()
d.set_callable(d.g)
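Used like this (a short sketch continuing the code above):

d()              # prints "g"

e = Foo()
e.set_callable(e.f)
e()              # prints "f"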
I'm using the mock library and unittest2 in order to test different aspects of my software project.
At the moment I have the following question: is it possible to mock a function so that the default keyword argument is different, but the functionality remains?
Say I have the following code
class C():
    def fun(self, bool_arg=True):
        if bool_arg:
            return True
        else:
            return False
What if I want to mock C.fun:
C.fun = mock.Mock(???)
so that every instance of C replaces the default of the keyword argument 'bool_arg' with False instead of True, and the result of:
c = C()
c.fun()
returns:
False
You can also try to wrap your function, something along the lines of:
def wrapper(func, bool_arg):
    def inner(*args, **kwargs):
        kwargs['bool_arg'] = bool_arg
        return func(*args, **kwargs)
    return inner
and
class C():
    def fun(...):
        ...

c = C()
c.fun = wrapper(c.fun, False)

should work.
Edit
If you want to change the default for the class and not for a particular instance, you can create a derived class and redefine fun, wrapping the method of C. Something along these lines (I don't have the time to test it now):
class D(C):
    def fun(self, *args, **kwargs):
        f = wrapper(C.fun, False)
        return f(self, *args, **kwargs)
Then, about the suggestion of @Ber: you can define def wrapper(func, **wrapkwargs) and then, instead of kwargs['bool_arg'] = bool_arg, do
for i in wrapkwargs.iteritems():  # wrapkwargs is a dictionary
    kwargs[i[0]] = i[1]
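A related sketch of my own (not part of the answers above): functools.partial can pin the keyword argument on a single instance without writing the wrapper by hand.

import functools

class C(object):
    def fun(self, bool_arg=True):
        return bool_arg

c = C()
c.fun = functools.partial(c.fun, bool_arg=False)
print(c.fun())   # False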
You can try to use this code:
>>> import mock
>>>
>>> class C():
...     def fun(self, bool_arg = True):
...         if bool_arg:
...             print "True"
...         else:
...             print "False"
...
>>> c = C()
>>> funCopy = c.fun
>>> c.fun = mock.Mock(side_effect=lambda bool_arg=False: funCopy(bool_arg=bool_arg))
>>> c.fun()
False
Hope this helps