Python nose: setting description attribute on generated tests

I'm using test generators with nose. I'd like to have a custom description for each generated test.
Nose documentation says:
By default, the test name output for a generated test in verbose mode
will be the name of the generator function or method, followed by the
args passed to the yielded callable. If you want to show a different
test name, set the description attribute of the yielded callable.
However, this doesn't work:
class TestGraphics:
    def test_all(self):
        for i, tc in enumerate(testcases):
            self.run_sim.description = str(i)
            yield (self.run_sim, tc[0], tc[1])
I get:
AttributeError: 'method' object has no attribute 'description'
How do I set the description attribute on the callable here?

The workarounds below, described here and here, appear to work.
from nose.tools import assert_true

class TestEven:
    def test_evens(self):
        for i in range(0, 5):
            yield self.check_even("check_even with {}".format(i)), i

    def check_even(self, desc):
        func = lambda n: assert_true(n % 2 == 0)
        func.description = desc
        return func
class TestEven:
    def test_evens(self):
        for i in range(0, 5):
            yield CheckEven(), i

class CheckEven:
    def __call__(self, n):
        self.description = "check_even with {}".format(n)
        assert n % 2 == 0
from functools import partial

class TestEven:
    def test_evens(self):
        for i in range(0, 5):
            f = partial(self.check_even, i)
            f.description = 'check_even with {}'.format(i)
            yield (f,)

    def check_even(self, n):
        assert n % 2 == 0

Not sure why MyMock.env["key1"].search.side_effect=["a", "b"] works but MyMock.env["key1"] = ["a"] with MyMock.env["key2"] = ["b"] does not work

I created a simple example to illustrate my issue. First, the setup, say mydummy.py:
class TstObj:
    def __init__(self, name):
        self.name = name

    def search(self):
        return self.name

MyData = {}
MyData["object1"] = TstObj("object1")
MyData["object2"] = TstObj("object2")
MyData["object3"] = TstObj("object3")

def getObject1Data():
    return MyData["object1"].search()

def getObject2Data():
    return MyData["object2"].search()

def getObject3Data():
    return MyData["object3"].search()

def getExample():
    res = f"{getObject1Data()}{getObject2Data()}{getObject3Data()}"
    return res
Here is the test that failed.
def test_get_dummy1():
    mydummy.MyData = MagicMock()
    mydummy.MyData["object1"].search.side_effect = ["obj1"]
    mydummy.MyData["object2"].search.side_effect = ["obj2"]
    mydummy.MyData["object3"].search.side_effect = ["obj3"]
    assert mydummy.getExample() == "obj1obj2obj3"
The above failed with a runtime error:
/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/unittest/mock.py:1078: StopIteration
Here is the test that passed:
def test_get_dummy2():
    mydummy.MyData = MagicMock()
    mydummy.MyData["object1"].search.side_effect = ["obj1", "obj2", "obj3"]
    assert mydummy.getExample() == "obj1obj2obj3"
Am I missing something? I would have expected test_get_dummy1() to work and test_get_dummy2() to fail and not vice versa. Where and how can I find/learn more information about mocking to explain what is going on...
MyData["object1"] is converted to this function call: MyData.__getitem__("object1"). When you call your getExample method, the __getitem__ method is called 3 times with 3 parameters ("object1", "object2", "object3").
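The key consequence: on a MagicMock, every subscript access returns the same object, namely __getitem__.return_value, regardless of the key. That is why the three side_effect assignments in the failing test overwrite one another, leaving a single one-item iterator to serve three calls. A quick demonstration:

```python
from unittest.mock import MagicMock

m = MagicMock()

# every key hits the shared __getitem__.return_value
assert m["object1"] is m["object2"]
assert m["object1"] is m.__getitem__.return_value

m["object1"].search.side_effect = ["obj1"]
m["object2"].search.side_effect = ["obj2"]  # overwrites the line above
print(m["object3"].search())  # obj2 -- and a second call raises StopIteration
```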
To mock the behavior you could have written your test like so:
def test_get_dummy_alternative():
    mydummy.MyData = MagicMock()
    mydummy.MyData.__getitem__.return_value.search.side_effect = ["obj1", "obj2", "obj3"]
    assert mydummy.getExample() == "obj1obj2obj3"
Note the small change from your version: mydummy.MyData["object1"]... became mydummy.MyData.__getitem__.return_value.... This is regular MagicMock syntax: we want to change the return value of the __getitem__ method.
BONUS:
I often struggle with mock syntax and understanding what's happening under the hood. This is why I wrote a helper library: the pytest-mock-generator. It can show you the actual calls made to the mock object.
To use it in your case you could have added this "exploration test":
def test_get_dummy_explore(mg):
    mydummy.MyData = MagicMock()
    mydummy.getExample()
    mg.generate_asserts(mydummy.MyData, name='mydummy.MyData')
When you execute this test, the following output is printed to the console, which contains all the asserts to the actual calls to the mock:
from mock import call
mydummy.MyData.__getitem__.assert_has_calls(calls=[call('object1'),call('object2'),call('object3'),])
mydummy.MyData.__getitem__.return_value.search.assert_has_calls(calls=[call(),call(),call(),])
mydummy.MyData.__getitem__.return_value.search.return_value.__str__.assert_has_calls(calls=[call(),call(),call(),])
You can easily derive from here what has to be mocked.

"Hide" a parameter from a parametrized pytest function

I'm trying to create a pytest plugin that runs a test multiple times to see if it ever passes, skipping on failures unless it's the last run. I use a custom marker for those tests, then pytest_generate_tests to parametrize the test, then modify the result in pytest_pyfunc_call. With this approach, however, the test function needs to actually specify the parameter name. Is there some way to hide this parameter from the actual function?
Here's the code for the plugin:
from typing import Optional

import pytest
from _pytest.python import Metafunc, Function
from pluggy.callers import _Result

def pytest_configure(config):
    config.addinivalue_line(
        "markers", "brute_force: Brute force the test multiple times."
    )

class Memory:
    def __init__(self, attempts: int):
        self.attempts = attempts
        self.attempt = 0
        self.has_passed = False

def pytest_generate_tests(metafunc: Metafunc):
    marker = metafunc.definition.get_closest_marker("brute_force")
    if not marker:
        return
    num_attempts = marker.args and marker.args[0] or 3
    memory = Memory(num_attempts)
    metafunc.parametrize("memory", [memory for _ in range(num_attempts)])

@pytest.hookimpl(hookwrapper=True)
def pytest_pyfunc_call(pyfuncitem: Function):
    memory: Optional[Memory] = pyfuncitem.callspec.params.pop('memory', None)
    if memory is not None:
        memory.attempt += 1
        if memory.has_passed:
            raise pytest.skip("already passed")
    outcome: _Result = yield
    if memory is not None:
        if outcome.excinfo is None:
            memory.has_passed = True
        elif memory.attempt != memory.attempts:
            pytest.skip("may pass later")
And here is an example test; I don't want the function to have to specify memory as an argument:
import pytest

num = -1

@pytest.mark.brute_force(3)
def test_simple(memory):
    global num
    num += 1
    assert 10 / num

python ray AttributeError : 'function' has no attribute 'remote'

I'm trying to use the ray module on existing code, conditionally, based on whether an env variable is set.
This is what I've done so far. The code structure is similar to mine, but not exactly, due to its size.
import os

if os.getenv("PARALLEL"):
    import ray
    ray.init()

class A(object):
    def __init__(self, attr):
        self.attr = attr

    def may_be_remote(func):
        return ray.remote(func) if os.getenv("PARALLEL") else func

    @may_be_remote
    def do_work(self):
        # work code
        ...

    def execute(self, n):
        for _ in range(n):
            do_work.remote()
Then, I call the execute function of class A:
a = A()
a.execute(7)
I get AttributeError: 'function' has no attribute 'remote' on that line.
Where did I go wrong with this code, please?
You are accessing remote() on the name do_work, which is not defined in that scope.
Did you mean to just call do_work()?
Unfortunately, ray makes it hard to write transparent code that switches easily, as you intend.
Following https://docs.ray.io/en/latest/ray-overview/index.html#parallelizing-python-classes-with-ray-actors, the somewhat strange insert-.remote syntax looks like this:
import os

use_ray = os.getenv("PARALLEL") is not None

if use_ray:
    import ray
    ray.init()

def maybe_remote(cls):
    return ray.remote(cls) if use_ray else cls

@maybe_remote
class A:
    def __init__(self, attr):
        self.attr = attr

    def do_work(self, foo):  # do something
        self.attr += foo

    def get_attr(self):  # return value, maybe from a remote worker
        return self.attr

if __name__ == '__main__':
    n = 7
    if use_ray:
        a = A.remote(0)
        for i in range(1, n + 1):
            a.do_work.remote(i)
        result = ray.get(a.get_attr.remote())
    else:
        a = A(0)
        for i in range(1, n + 1):
            a.do_work(i)
        result = a.get_attr()
    expect = int((n / 2) * (n + 1))
    assert expect == result
I'm not sure there is an equally easy (decorator) solution for the differences in the method calls.
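One way to paper over the method-call asymmetry is a small dispatch helper at the call sites. This is only a sketch under the same use_ray flag as above; the helper name call and the Counter class are illustrative, not part of ray's API:

```python
import os

use_ray = os.getenv("PARALLEL") is not None  # assumption: same env flag as above

def call(method, *args):
    """Hypothetical helper: hides the .remote()/ray.get() difference from callers."""
    if use_ray:
        import ray
        return ray.get(method.remote(*args))
    return method(*args)

class Counter:
    def __init__(self):
        self.total = 0

    def add(self, x):
        self.total += x
        return self.total

c = Counter()
call(c.add, 3)
call(c.add, 4)
print(call(c.add, 0))  # 7 when PARALLEL is unset
```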

How to use two helper functions in main script from another script

TypeError: _slow_trap_ramp() takes 1 positional argument but 2 were given
def demag_chip(self):
    coil_probe_constant = float(514.5)
    field_sweep = [50 * i * (-1)**(i + 1) for i in range(20, 0, -1)]  # print as list
    for j in field_sweep:
        ramp = self._slow_trap_ramp(j)

def _set_trap_ramp(self):
    set_trap_ramp = InstrumentsClass.KeysightB2962A.set_trap_ramp
    return set_trap_ramp

def _slow_trap_ramp(self):
    slow_trap_ramp = ExperimentsSubClasses.FraunhoferAveraging.slow_trap_ramp
    return slow_trap_ramp
The error is straightforward.
ramp = self._slow_trap_ramp(j)
You are calling this method with an argument j, but the method doesn't take an argument (other than self, which is used to pass the object).
Re-define your method to accept an argument if you want to pass it one:
def _slow_trap_ramp(self, j):
It looks like your code extract contains methods of some class, whose full definition is not shown, and you are calling one method from another method (self._slow_trap_ramp(j)). When you call a method, Python automatically passes self before any other arguments. So you need to change def _slow_trap_ramp(self) to def _slow_trap_ramp(self, j).
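The mismatch is easy to reproduce in isolation; this minimal class (names are illustrative) triggers the same TypeError:

```python
class Demo:
    def ramp(self):  # accepts only self
        return "ramping"

d = Demo()
d.ramp()       # fine: Python passes d as self
try:
    d.ramp(5)  # self plus 5 -> two arguments for a one-argument method
except TypeError as e:
    print(e)   # e.g. "ramp() takes 1 positional argument but 2 were given"
```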
Update in response to comment
To really help, we would need to see more of the class you are writing, and also some info on the other objects you are calling. But I am going to go out on a limb and guess that your code looks something like this:
InstrumentsClass.py
class KeysightB2962A:
    def __init__(self):
        ...

    def set_trap_ramp(self):
        ...
ExperimentsSubClasses.py
class FraunhoferAveraging:
    def __init__(self):
        ...

    def slow_trap_ramp(self, j):
        ...
Current version of main.py
import InstrumentsClass, ExperimentsSubClasses

class MyClass:
    def __init__(self):
        ...

    def demag_chip(self):
        coil_probe_constant = float(514.5)
        field_sweep = [50 * i * (-1)**(i + 1) for i in range(20, 0, -1)]  # print as list
        for j in field_sweep:
            ramp = self._slow_trap_ramp(j)

    def _set_trap_ramp(self):
        set_trap_ramp = InstrumentsClass.KeysightB2962A.set_trap_ramp
        return set_trap_ramp

    def _slow_trap_ramp(self):
        slow_trap_ramp = ExperimentsSubClasses.FraunhoferAveraging.slow_trap_ramp
        return slow_trap_ramp

if __name__ == "__main__":
    my_obj = MyClass()
    my_obj.demag_chip()
If this is the case, then these are the main problems:
1. Python passes self and j to MyClass._slow_trap_ramp, but you've only defined it to accept self (noted above);
2. you are using class methods from KeysightB2962A and FraunhoferAveraging directly, instead of instantiating the classes and using the instances' methods; and
3. you are returning references to the methods instead of calling the methods.
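The third problem can be seen in isolation: naming a method without parentheses yields the method object itself, not its result (Greeter here is an illustrative stand-in):

```python
class Greeter:
    def hello(self):
        return "hi"

g = Greeter()
ref = g.hello       # a bound method object; nothing runs
result = g.hello()  # the parentheses actually execute the method
print(callable(ref), result)  # True hi
```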
You can fix all of these by changing the code to look like this (see embedded comments):
New version of main.py
import InstrumentsClass, ExperimentsSubClasses

class MyClass:
    def __init__(self):
        # create instances of the relevant classes (note parentheses at end)
        self.keysight = InstrumentsClass.KeysightB2962A()
        self.fraun_averaging = ExperimentsSubClasses.FraunhoferAveraging()

    def demag_chip(self):
        coil_probe_constant = float(514.5)
        field_sweep = [50 * i * (-1)**(i + 1) for i in range(20, 0, -1)]  # print as list
        for j in field_sweep:
            ramp = self._slow_trap_ramp(j)

    def _set_trap_ramp(self):
        # call instance method (note parentheses at end)
        return self.keysight.set_trap_ramp()

    def _slow_trap_ramp(self, j):  # accept both self and j
        # call instance method (note parentheses at end)
        return self.fraun_averaging.slow_trap_ramp(j)

if __name__ == "__main__":
    my_obj = MyClass()
    my_obj.demag_chip()

Parallel Python - create objects into another class in parallel execution

I'm trying to do something pretty simple with Parallel Python. I would like to create an object from a class I've written, inside a method of another class that is used to run a job in parallel.
Here is a basic example of what I would like to get working:
import pp

class TestClass(object):
    def __init__(self):
        pass

    def doSomething(self, number):
        print number**2

class PPTask(object):
    def __init__(self):
        pass

    def ppTask(self, number=1):
        sum = 0
        sum += number
        tc = TestClass()
        tc.doSomething(sum)
        return sum

if __name__ == '__main__':
    job_server = pp.Server()
    job_list = []
    results = []
    for i in xrange(10):
        pt = PPTask()
        job_list.append(job_server.submit(pt.ppTask, (1,), globals=globals()))
    for job in job_list:
        results.append(job())
    for result in results:
        print result
This raises NameError: global name 'TestClass' is not defined, and I haven't found any way to pass the class in or reuse it in the ppTask method.
Any help would be greatly appreciated.
Thank you in advance.
One solution would be to tell the job server to import the source module itself for each job you submit. For example, if your script above were called pptest.py, you could create the jobs like so:
job_list.append(job_server.submit(pt.ppTask, (1,), modules=('pptest',)))
And within ppTask, you could instantiate TestClass like so:
tc = pptest.TestClass()
So overall, the code would look like this:
import pp

class TestClass(object):
    def __init__(self):
        pass

    def doSomething(self, number):
        print number**2

class PPTask(object):
    def __init__(self):
        pass

    def ppTask(self, number=1):
        sum = 0
        sum += number
        tc = pptest.TestClass()  # resolved on the worker via modules=('pptest',)
        tc.doSomething(sum)
        return sum

if __name__ == '__main__':
    job_server = pp.Server()
    job_list = []
    results = []
    for i in xrange(10):
        pt = PPTask()
        job_list.append(job_server.submit(pt.ppTask, (1,), modules=('pptest',)))
    for job in job_list:
        results.append(job())
    for result in results:
        print result
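For what it's worth, pp is Python 2-era, but the same principle applies to the stdlib multiprocessing module in Python 3: anything a worker process uses must be importable at module level. A rough analogue of the example above (names are illustrative):

```python
# Python 3 analogue with multiprocessing: classes and functions used by a
# worker must live at module level so child processes can resolve them.
from multiprocessing import Pool

class TestClass:
    def do_something(self, number):
        return number ** 2

def pp_task(number):
    tc = TestClass()  # resolvable in the child because it is module-level
    return tc.do_something(number)

if __name__ == "__main__":
    with Pool(4) as pool:
        print(pool.map(pp_task, range(10)))
```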