I am mocking a method. I want it to raise an exception on the first call; the code under test catches the exception and calls the method again with different parameters, and I want that second call to be processed normally. What do I need to do?
Code
Try 1
with patch('xblock.runtime.Runtime.construct_xblock_from_class', Mock(side_effect=Exception)):
Try 2
with patch('xblock.runtime.Runtime.construct_xblock_from_class', Mock(side_effect=[Exception, some_method])):
On the second call, some_method is returned as an object rather than being called, so the data is not processed with the different parameters.
import unittest
from unittest import mock


class Foo(object):
    def Method1(self, arg):
        pass

    def Method2(self, arg):
        if not arg:
            raise ValueError(arg)
        self.Method1(arg)

    def Method3(self, arg):
        try:
            self.Method2(arg)
        except Exception:
            self.Method2('some default value')


class FooTest(unittest.TestCase):
    def setUp(self):
        self.helper = Foo()

    def test_foo_method3(self):
        with mock.patch.object(self.helper, 'Method2',
                               side_effect=[Exception, self.helper.Method1]
                               ) as mock_object:
            self.helper.Method3('fake_arg')
            mock_object.assert_has_calls([mock.call('fake_arg'),
                                          mock.call('some default value')])
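If the goal is for the second call to actually run the real Method2, rather than just hand back a value from the side_effect list, a callable side_effect works: raise on the first call and delegate to the real bound method afterwards. A minimal sketch of that idea (the fail_once_then_passthrough helper and the FooTestCallable name are illustrative, not from the original post):
import unittest
from unittest import mock


class FooTestCallable(unittest.TestCase):
    def test_method3_second_call_runs_for_real(self):
        helper = Foo()                      # Foo as defined above
        real_method2 = helper.Method2       # keep a reference to the real bound method
        calls = {'count': 0}

        def fail_once_then_passthrough(arg):
            calls['count'] += 1
            if calls['count'] == 1:
                raise Exception('boom')     # first call: blow up
            return real_method2(arg)        # later calls: run the real code

        with mock.patch.object(helper, 'Method2',
                               side_effect=fail_once_then_passthrough) as mock_object:
            helper.Method3('fake_arg')
            mock_object.assert_has_calls([mock.call('fake_arg'),
                                          mock.call('some default value')])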
Related
I have a Python 2.7.x Tornado application that, when run, serves a handful of RESTful API endpoints.
My project folder includes numerous test cases that rely on the Python mock module, such as the one shown below.
from tornado.testing import AsyncHTTPTestCase
from mock import Mock, patch
import json

from my_project import my_model


class APITestCases(AsyncHTTPTestCase):
    def setUp(self):
        pass

    def tearDown(self):
        pass

    @patch('my_project.my_model.my_method')
    def test_something(
            self,
            mock_my_method
    ):
        response = self.fetch(
            path='http://localhost/my_service/my_endpoint',
            method='POST',
            headers={'Content-Type': 'application/json'},
            body=json.dumps({'hello': 'world'})
        )
The RESTful endpoint http://localhost/my_service/my_endpoint makes two internal calls to my_method: my_method(my_arg=1) and my_method(my_arg=2).
I want to mock out my_method in this test-case such that it returns 0 if it is called with my_arg==2, but otherwise it should return what it would always normally return. How can I do it?
I know that I should do something like this:
mock_my_method.return_value = SOMETHING
But I don't know how to properly specify that something so that its behavior is conditional on the arguments my_method is called with. Can someone show me an example or point me to one?
I want to mock out my_method in this test-case such that it returns 0 if it is called with my_arg==2, but otherwise it should return what it would always normally return. How can I do it?
Write your own mock method that calls the original one conditionally:
from my_project import my_model

my_method_orig = my_model.my_method  # keep a reference to the real implementation

def my_method_mocked(self, *args, my_arg=1, **kwargs):
    if my_arg == 2:  # fake call
        return 0
    # otherwise, dispatch to the real method
    return my_method_orig(self, *args, my_arg=my_arg, **kwargs)
For patching: if you don't need to assert how often the mocked method was called and with which arguments, it is sufficient to pass the replacement via the new argument:
@patch('my_project.my_model.my_method', new=my_method_mocked)
def test_something(self):
    response = self.fetch(...)
    # with new=..., patch does not pass a mock object into the test,
    # so assertions like assert_called_with are not available here
If you want to invoke the whole mock assertion machinery, use side_effect as suggested in the other answer. Example:
@patch('my_project.my_model.my_method', side_effect=my_method_mocked, autospec=True)
def test_something(self, mock_my_method):
    response = self.fetch(...)
    # mock is assertable here
    mock_my_method.assert_called_with(my_arg=2)
You could use side_effect to change the return value dynamically:
from unittest.mock import patch


class C:
    def foo(self):
        pass


def drive():
    o = C()
    print(o.foo(my_arg=1))
    print(o.foo(my_arg=2))


def mocked_foo(*args, **kwargs):
    # return 0 only when called with my_arg == 2
    if kwargs.get('my_arg') == 2:
        return 0
    else:
        return 1


@patch('__main__.C.foo')
def test(mock):
    mock.side_effect = mocked_foo
    drive()
Update: since you want to run the original my_method code under some conditions, you may need a method proxy; a Mock cannot get back at the real function object that was patched.
from unittest.mock import patch


class MyClass:
    def my_method(self, my_arg):
        return 10000


def func_wrapper(func):
    def wrapped(*args, **kwargs):
        my_arg = kwargs.get('my_arg')
        if my_arg == 2:
            return 0
        return func(*args, **kwargs)
    return wrapped


def drive(o, my_arg):
    print('my_arg', my_arg, 'ret', o.my_method(my_arg=my_arg))


def test():
    with patch.object(MyClass, 'my_method', new=func_wrapper(MyClass.my_method)):
        o = MyClass()
        drive(o, 1)
        drive(o, 2)


test()
This will output:
my_arg 1 ret 10000
my_arg 2 ret 0
I am new to decorators, but ideally I want to use them simply to define a bunch of methods within class OptionClass, each representing a particular option with a name, a description, and whether it is required. I don't want to modify the operation of the decorated function at all, if that makes sense; I only want to use the decorator to define the name, description, and required flag.
Problem 1: I construct an OptionClass() and I want to call its option_1. When I do this I receive a TypeError because the decorator's __call__ is not receiving the instance of OptionClass. Why is this? When I call option_1 and pass the instance of OptionClass() explicitly, it works. How do I call option_1 without always needing to pass the instance as self?
The error I receive is:
Traceback (most recent call last):
File "D:/OneDrive_P/OneDrive/projects/python/examples/dec_ex.py", line 110, in <module>
print(a.option_1("test")) # TypeError: option1() missing 1 required positional argument: 'test_text'
File "D:/OneDrive_P/OneDrive/projects/python/examples/dec_ex.py", line 80, in __call__
return self.function_ptr(*args, **kwargs)
TypeError: option_1() missing 1 required positional argument: 'test_text'
Problem 2: How would I call methods on the decorator such as set_name, set_description, and set_required?
Problem 3: Although this is a sample, I intend to write an option class using async functions and decorate them. Do I need to make the decorator's call an async def __call__(), or is it fine since it is just returning the function?
class option_decorator(object):
    def __init__(self, function_pt):
        self.function_ptr = function_pt
        self.__required = True
        self.__name = ""
        self.__description = ""

    def set_name(self, text):
        self.__name = text

    def set_description(self, text):
        self.__description = text

    def set_required(self, flag: bool):
        self.__required = flag

    def __bool__(self):
        """returns if required"""
        return self.__required

    def __call__(self, *args, **kwargs):
        return self.function_ptr(*args, **kwargs)

    def __str__(self):
        """prints a description and name of the option """
        return "{} - {}".format(self.__name, self.__description)
class OptionClass(object):
    """defines a bunch of options"""

    @option_decorator
    def option_1(self, test_text):
        return "option {}".format(test_text)

    @option_decorator
    def option_2(self):
        print("option 2")

    def get_all_required(self):
        """would return a list of option functions within the class that have their decorator required flag set to true"""
        pass

    def get_all_available(self):
        """would return all options regardless of required flag set"""
        pass

    def print_all_functions(self):
        """would call str(option_1) and print {} - {} for example"""
        pass


a = OptionClass()
print(a.option_1("test"))     # TypeError: option_1() missing 1 required positional argument: 'test_text'
print(a.option_1(a, "test"))  # Prints: option test
Problem 1
You implemented the method wrapper as a custom callable object instead of as a normal function. This means that you must implement yourself the __get__() descriptor method that turns a function into a bound method. (If you had used a plain function, this would already be present.)
from types import MethodType


class Dec:
    def __init__(self, f):
        self.f = f

    def __call__(self, *a, **kw):
        return self.f(*a, **kw)

    def __get__(self, obj, objtype=None):
        return self if obj is None else MethodType(self, obj)


class Foo:
    @Dec
    def opt1(self, text):
        return 'foo' + text
>>> Foo().opt1('two')
'footwo'
See the Descriptor HowTo Guide
Problem 2
The callable option_decorator instance replaces the function in the OptionClass dict. That means that mutating the callable instance affects all instances of OptionClass that use that callable object. Make sure that's what you want to do, because if you want to customize the methods per-instance, you'll have to build this differently.
You could access it in the class definition like this:
class OptionClass(object):
    """defines a bunch of options"""

    @option_decorator
    def option_1(self, test_text):
        return "option {}".format(test_text)

    option_1.set_name('foo')
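As a quick demonstration of the shared-state point above (my own snippet, reusing the names from the question): the single option_decorator object lives in the class dict, so a change made through any route is visible from every instance.
opt = OptionClass.__dict__['option_1']   # the one shared option_decorator object
opt.set_name('option one')
opt.set_description('first option')

a, b = OptionClass(), OptionClass()
print(str(OptionClass.__dict__['option_1']))  # option one - first option
# a and b both see that same object; there is no per-instance copy to customize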
Problem 3
The __call__ method in your example isn't returning a function. It's returning the result of the function_ptr invocation. But that will be a coroutine object if you define your options using async def, which you would have to do anyway if you're using the async/await syntax in the function body. This is similar to the way that yield transforms a function into a function that returns a generator object.
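To make that concrete, here is a minimal sketch (mine, not from the question) that combines the __get__ fix from Problem 1 with a hypothetical async option; __call__ can stay a plain def because it just returns the coroutine object, which the caller then awaits:
import asyncio
from types import MethodType


class option_decorator:
    def __init__(self, function_pt):
        self.function_ptr = function_pt

    def __call__(self, *args, **kwargs):
        # for an async function this returns a coroutine object, not a result
        return self.function_ptr(*args, **kwargs)

    def __get__(self, obj, objtype=None):
        # descriptor hookup from Problem 1 so the option binds like a method
        return self if obj is None else MethodType(self, obj)


class OptionClass:
    @option_decorator
    async def option_async(self, text):   # hypothetical async option
        await asyncio.sleep(0)
        return "async option {}".format(text)


async def main():
    a = OptionClass()
    print(await a.option_async("test"))   # prints: async option test

asyncio.run(main())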
Suppose I have this decorator:
def decorator(f):
    def f_wrap(*args):
        for item in args:
            print(args)
        return f(*args)
    return f_wrap
When used as "permanent" decorators with the # syntax, args retrieves the arguments of the wrapped function. For example, when used with the class below, I receive the instance of MyObject.
class MyObject(object):
    def __init__(self):
        pass

    @decorator
    def function(self):
        return
How can I achieve the same result using a "fluid" decorator, i.e. a decorator that is not permanently bound to the function it is decorating? For example:
def decorator(f):
    def f_wrap(*args):
        if (not args):
            print("Nothing in args")
        return f(*args)
    return f_wrap


class MyClass(object):
    def __init__(self):
        pass

    def function(self):
        return


if __name__ == "__main__":
    myobj = MyClass()
    myobj.function = decorator(myobj.function)
    myobj.function()
In this case, the args tuple is always empty (I always get "Nothing in args"), even though I anticipated that it would contain the instance myobj.
EDIT:
In case it was not clear from @AChampion's post, the solution is simply to decorate the "unbound" function (MyClass.function) and re-bind the result to the instance. E.g.,
from types import MethodType

def decorator(f):
    def f_wrap(*args):
        # I replaced this with an iteration through
        # args. It's a bit more demonstrative.
        for item in args:
            print(item)
        return f(*args)
    return f_wrap


class MyClass(object):
    def __init__(self):
        pass

    def function(self):
        return


if __name__ == "__main__":
    myobj = MyClass()
    myobj.function = MethodType(decorator(MyClass.function), myobj)
    myobj.function()
The reason for the difference is that you are wrapping different things: an unbound method vs. a bound method:
class MyObject(object):
    @decorator
    def function(self):
        pass
Is equivalent to:
import types

class MyObject(object):
    def function(self):
        pass

m = MyObject()
m.function = types.MethodType(decorator(MyObject.function), m)
Not:
m.function = decorator(m.function)
The first being an unbound method, the second being a bound method.
You aren't using all properly. all returns a bool indicating whether every element of what you pass it is truthy. In your case, the check isn't really doing anything: the way you are using all, it will always evaluate to True.
I believe what you are looking for is simply this:
if not args:
Now, ultimately what this checks is whether the method you are executing received anything in *args. For the function you have, you aren't passing any arguments, therefore, with the if not args check, you will actually get:
"Nothing in args"
However, if you add an argument to your method as such:
def function(self, x):
    return
Then call: myobj.function(1)
You will not get "Nothing in args".
To answer your last question about not getting your instance: if you print out f when you apply your decorator this way:
myobj.function = decorator(myobj.function)
myobj.function()
You will get a bound method:
<bound method MyClass.function of <__main__.MyClass object at 0x102002390>>
Now, set up your decorator as such:
@decorator
def function(self):
    return
You will see that you get a function attached to your class object:
<function MyClass.function at 0x102001620>
This shows that the two forms aren't doing the same thing, even though you might expect them to. Hope this helps clarify a bit.
Here is my python code. I have a class MyClass with two static methods: my_method1 and my_method2. Both methods are wrapped with a decorator called exception_handler.
from functools import wraps
import sys


def exception_handler(function):
    @wraps(function)
    def decorator(self, *args, **kwargs):
        try:
            return function(self, *args, **kwargs)
        except Exception, e:
            print "EXCEPTION!: %s" % e
            sys.exit(-1)
    return decorator


class MyClass:
    @staticmethod
    @exception_handler
    def my_method1(a, b, c,):
        return "X"

    @staticmethod
    @exception_handler
    def my_method2(e, f, g,):
        print "Y"
        return MyClass.my_method1(a=e, b=f, c=g)


print "Trying my_method1"
print MyClass.my_method1(1, 2, 3)
print ""
print "Trying my_method2"
print MyClass.my_method2(1, 2, 3)
When I run this code, I get the following:
Trying my_method1
X
Trying my_method2
Y
EXCEPTION!: decorator() takes at least 1 argument (0 given)
Why does the decorator fail in the second instance and how can I get around it?
It seems like the decorator fails when the decorated method is a static method being called by another static method. But why this happens makes no sense to me.
The problem is that staticmethods do not take self as an argument. I am not sure why it works on the first two calls and not the third. However, removing self from the decorator fixes it.
Here is the refactored code:
from functools import wraps
import sys


def exception_handler(function):
    @wraps(function)
    def decorator(*args, **kwargs):
        try:
            return function(*args, **kwargs)
        except Exception as e:
            print "EXCEPTION!: {}".format(e)
            sys.exit(-1)
    return decorator


class MyClass(object):
    @staticmethod
    @exception_handler
    def my_method1(a, b, c, ):
        return "X"

    @staticmethod
    @exception_handler
    def my_method2(e, f, g, ):
        print "Y"
        return MyClass.my_method1(a=e, b=f, c=g)


print "Trying my_method1"
print MyClass.my_method1(1, 2, 3)
print
print "Trying my_method2"
print MyClass.my_method2(1, 2, 3)
Doing so gives these results:
Trying my_method1
X
Trying my_method2
Y
X
I think your code is failing without you noticing: try printing a, b, c and *args inside the decorator, and you will find that a is actually self! It fails silently by assigning the wrong arguments.
So why does it raise an exception on the second call, at MyClass.my_method1(a=e, b=f, c=g)? Because all the arguments are passed as keywords there, so *args is empty and there is no positional argument left to fill self as before.
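To illustrate the point, here is a standalone sketch (simplified to a module-level function rather than the original static methods) showing how the first positional argument silently fills self:
def exception_handler(function):
    def decorator(self, *args, **kwargs):
        # `self` here is just whatever positional argument the caller passed first
        print("self=%r args=%r kwargs=%r" % (self, args, kwargs))
        return function(self, *args, **kwargs)
    return decorator


@exception_handler
def my_method1(a, b, c):
    return "X"


my_method1(1, 2, 3)        # self=1 args=(2, 3) kwargs={} -- works, but `a` has filled `self`
try:
    my_method1(a=1, b=2, c=3)
except TypeError as e:
    print("TypeError: %s" % e)   # nothing positional is left to fill `self`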
I want to create a class that doesn't give an AttributeError when any method is called on it, whether or not that method exists:
My class:
class magic_class:
    ...
    # How to over-ride method calls
    ...
Expected Output:
ob = magic_class()
ob.unknown_method()
# Prints 'unknown_method' was called
ob.unknown_method2()
# Prints 'unknown_method2' was called
Now, unknown_method and unknown_method2 don't actually exist in the class, but how can we intercept the method call in Python?
Override the __getattr__() magic method:
class MagicClass(object):
    def __getattr__(self, name):
        def wrapper(*args, **kwargs):
            print("'%s' was called" % name)
        return wrapper


ob = MagicClass()
ob.unknown_method()
ob.unknown_method2()
prints
'unknown_method' was called
'unknown_method2' was called
Just in case someone is trying to delegate the unknown method to an object, here's the code:
class MagicClass():
    def __init__(self, obj):
        self.an_obj = obj

    def __getattr__(self, method_name):
        def method(*args, **kwargs):
            print("Handling unknown method: '{}'".format(method_name))
            if kwargs:
                print("It had the following key word arguments: " + str(kwargs))
            if args:
                print("It had the following positional arguments: " + str(args))
            return getattr(self.an_obj, method_name)(*args, **kwargs)
        return method
This is super useful when you need to apply the Proxy pattern.
Moreover, handling both args and kwargs lets you present a completely transparent interface, since code using MagicClass can treat it as if it were the real object.
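For example, a hypothetical usage that delegates to a plain list:
proxied = MagicClass([3, 1, 2])
proxied.append(0)            # reports the call, then delegates to list.append
proxied.sort(reverse=True)   # keyword arguments are reported and forwarded
print(proxied.an_obj)        # [3, 2, 1, 0]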
Override __getattr__; see http://docs.python.org/reference/datamodel.html