I have some blocks of code which need to be wrapped by a function.
try:
    if config.DEVELOPMENT == True:
        # do_some_stuff
except:
    logger.info("Config is not set for development")
Then I'll do it again:
try:
    if config.DEVELOPMENT == True:
        # do_some_another_stuff
except:
    logger.info("Config is not set for development")
So, how can I wrap these "do_some_stuff" and "do_some_another_stuff" blocks?
I'm trying to write a function with contextmanager:
@contextmanager
def try_dev_config(name):
    try:
        if name is not None:
            yield
    except Exception as e:
        print("not dev config")
with try_dev_config("config.DEVELOPMENT"):
    # do_some_stuff
And I got an error:
RuntimeError: generator didn't yield
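For reference, a generator decorated with @contextmanager must reach its yield exactly once on every code path; in the attempt above the yield sits behind a condition inside the try, so some paths finish without yielding and Python raises this error. A minimal sketch of a variant that always yields and instead catches whatever the with-block raises (config, logger and do_some_stuff are placeholders taken from the question):
from contextlib import contextmanager

@contextmanager
def try_dev_config(logger):
    try:
        yield  # the body of the with-block runs here
    except Exception:
        # an exception raised in the body (e.g. a missing config attribute) is logged and suppressed
        logger.info("Config is not set for development")

with try_dev_config(logger):
    if config.DEVELOPMENT:
        do_some_stuff()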
You could pass in a function.
boolean = True

def pass_this_in():
    print("I just did some stuff")

def the_try_except_bit(function):
    try:
        if boolean:
            function()
    except:
        print("Excepted")

# Calling the above code
the_try_except_bit(pass_this_in)
If you want to reduce the "pass_this_in" definition bit, then you can use lambda function definitions:
pass_this_in = lambda : print("I just did some stuff")
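Put together, the call can then be written in one line (a small sketch reusing the_try_except_bit defined above):
the_try_except_bit(lambda: print("I just did some stuff"))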
I am not sure that a context manager is the right tool to achieve what you want. The goal of a context manager is to provide a mechanism to open/instantiate a resource, give access to it (or not), and close/clean it automatically when you no longer need it.
IMHO, what you need is a decorator.
A decorator aims at executing code around a function call. It would force you to put each block of code in a function but I don't think it is so difficult. You can implement it like this:
class Config(object):
    """for demonstration purpose only: used to have a config.DEVELOPMENT value"""
    DEVELOPMENT = True

class Logger(object):
    """for demonstration purpose only: used to have a logger.info method"""
    @staticmethod
    def info(msg):
        print("Logged: {}".format(msg))

def check_dev_config(config, logger):
    def dev_config_checker(func):
        def wrapper(*args, **kwargs):
            try:
                if config.DEVELOPMENT:
                    func(*args, **kwargs)
            except Exception as err:
                logger.info(
                    "Config is not set for development: {}".format(err))
        return wrapper
    return dev_config_checker
@check_dev_config(Config, Logger)
def do_stuff_1():
    print("stuff 1 done")

@check_dev_config(Config, Logger)
def do_stuff_2():
    raise Exception("stuff 2 failed")

do_stuff_1()
do_stuff_2()
This code prints
stuff 1 done
Logged: Config is not set for development: stuff 2 failed
Explanations:
The check_dev_config function is actually a decorator generator which accepts the config and the logger as arguments.
It returns the dev_config_checker function, which is an actual (and parameterised) decorator and which accepts a function to decorate as an argument.
This decorator returns a wrapper function which will actually run code around the decorated function call. In this function, the decorated function is called inside a try/except structure and only if config.DEVELOPMENT evaluates to True. In case of an exception, the logger is used to log a message.
Each block of code to decorate is put into a function (do_stuff_1, do_stuff_2) and decorated with the check_dev_config decorator generator, giving it the config and the logger.
When decorated functions are called, they are called via their decorator and not directly. As you can see, the do_stuff_2 exception has been caught and a message has been logged.
Related
I have a number of functions that are RPC calls and all have to prepare several things before the main task and do some cleanup afterwards. In particular, their first two parameters are logname and password, which are needed to set up user data (properties, rights etc.).
So I decided to build a decorator called @rpc_wrapper which performs these tasks. This wrapper has now been working fine for more than a year. The only drawback is that PyCharm's code inspection – I think other linters will behave similarly – complains about the parameters logname and password, which are never used inside the function but only by the decorator.
How can I get rid of these complaints?
One way might be to insert a line
assert (logname, password)
at the start of each function. But I think that's not the real deal, because decorators are made to avoid repetition.
Is there a cleaner solution?
Edited
Here is the (simplified) decorator:
import logging
from functools import wraps

def rpc_wrapper(func):
    @wraps(func)
    def wrapper(login=None, pwd=None, *args, **kwargs):
        doCleanup = True
        try:
            doCleanup = setupUser(login, pwd)
            return func(login, pwd, *args, **kwargs)
        except Exception:
            logging.exception('**<#>')
            raise Exception('999: internal error')
        finally:
            if doCleanup:
                cleanup()
    return wrapper
And a decorated function:
@rpc_wrapper
def ping(login: str = None, pwd: str = None):
    return 'pong'
or another simple one:
@rpc_wrapper
def echo(login: str = None, pwd: str = None, msg: Any = 'message'):
    return msg
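For illustration, a call to a decorated function then goes through the wrapper, so setupUser and cleanup run around the body (the credential values below are made up):
result = ping('some_user', 'some_password')  # wrapper calls setupUser(), then the body, then cleanup()
print(result)  # prints: pong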
I was writing a test using the pytest library where I need to test a method which takes another method as an argument.
from typing import Callable

class Certificate:
    def upload(self, upload_fn: Callable):
        try:
            if self.file_name:
                upload_fn(self.file_name)
                return
            raise ValueError("File name doesn't exist")
        except Exception as e:
            raise e
Now I created a dummy mock function which I am passing when calling the upload method, but I am not sure how to verify that upload_fn was called.
I am trying to achieve something like this
def test_certificate_upload(certificate):
    certificate.upload(some_mock_fn)
    assert some_mock_fn.called_once() == True
EDIT: so currently I am testing it in the following way but I think there can be a better approach.
def mock_upload(f_name):
    """just an empty mock method"""

def mock_upload_raise_error(f_name):
    raise Exception

def test_certificate_upload_raise_exception(certificate):
    with pytest.raises(Exception) as e:
        certificate.upload(mock_upload_raise_error)
PS: the limitation of this approach is that we can't assert whether the method was called, how many times it was called, or with what params it was called.
Also, we have to create extra dummy mock methods for different scenarios.
You can mock:
def mock_get(self, *args):
    return "Result I want"

@mock.patch(upload, side_effect=mock_get)
def test_certificate_upload(certificate):
    certificate.upload(some_mock_fn)
    assert function_name() == Return_data
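If the goal is specifically the called-once and called-with assertions from the question, unittest.mock.MagicMock records its calls, so a sketch along those lines (the certificate fixture is assumed to provide an instance with file_name set):
from unittest.mock import MagicMock
import pytest

def test_certificate_upload_calls_upload_fn(certificate):
    mock_fn = MagicMock()                # records every call made to it
    certificate.upload(mock_fn)
    mock_fn.assert_called_once_with(certificate.file_name)

def test_certificate_upload_raises_without_file_name(certificate):
    certificate.file_name = None
    with pytest.raises(ValueError):
        certificate.upload(MagicMock())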
Hi, I would like to mock my decorator since I don't want to actually call/execute this function, but I can't seem to find a solution for this. Below is my code:
# This is the decorator located in my project
# This is located in custom.mydecorators.decorator_file.custom_decorator
Base = declarative_base()

def custom_decorator(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        print("This message is still printed even when I try to patch this function")
        try:
            # code here
            my_var = CoolClass()
            retval = func(*args, **kwargs)
        except Exception as e:
            # rollback code here
            raise e
        return retval
    return wrapper
Now I'm trying to patch this using this code
patch('custom.mydecorators.decorator_file.custom_decorator', lambda x: x).start()
class TestMockDecoratorsCallingClass(unittest.TestCase):
    def test_should_return_success_if_decorators_are_mocked(self):
        # Code here
My decorators work properly in a non-unittest file, but if I mock this decorator it fails, saying local variable 'my_var' referenced before assignment.
Note: my_var is inside the decorator I'm trying to mock/patch; also, the print message is still executed even when I try to patch it.
I have a script in Python which works as shown below. Each function performs a completely different task and is not related to the others. My problem is that if function2() has an issue during execution, then function3(), function4() and function5() will not execute. I know you will say to handle this by catching the exception (try..except), but then I have to catch every exception, which is not what I am looking for. In a nutshell, how do I code this so that my other functions are not impacted if any one function has an issue? Ideally it should skip the problematic function and let the other functions execute.
def function1():
    some code

def function2():
    some code

def function3():
    some code

def function4():
    some code

def function5():
    some code

if __name__ == '__main__':
    function1()
    function2()
    function3()
    function4()
    function5()
No need to write multiple try/except blocks. Create a list of your functions and execute them. For example, your code could look like:
if __name__ == '__main__':
    func_list = [function1, function2, function3, function4, function5]
    for my_func in func_list:
        try:
            my_func()
        except:
            pass
Or, create a decorator and add it to each of your functions. Check A guide to Python's function decorators. For example, your decorator could look like:
def wrap_error(func):
    def func_wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except:
            pass
    return func_wrapper
Now add this decorator to your function definitions:
@wrap_error
def function1():
    some code
Functions with this decorator added to them won't raise any exception.
As of Python 3.4, a new context manager, contextlib.suppress, has been added, which as per the docs:
Return a context manager that suppresses any of the specified exceptions if they occur in the body of a with statement and then resumes execution with the first statement following the end of the with statement.
In order to suppress all the exceptions, you may use it as:
from contextlib import suppress

if __name__ == '__main__':
    func_list = [function1, function2, function3, function4, function5]
    for my_func in func_list:
        with suppress(Exception):  # `Exception` to suppress all the exceptions
            my_func()  # Any exception raised by `my_func()` will be suppressed
You can use try/except and catch all sorts of exceptions like this:
if __name__ == '__main__':
    try:
        function1()
    except:
        pass
    try:
        function2()
    except:
        pass
    try:
        function3()
    except:
        pass
    try:
        function4()
    except:
        pass
For a large number of functions you can use a dictionary mapping each function to its parameters:
func_dict = {
    function1: {
        'param1': val,
        'param2': val,
    },
    function2: {
        'param1': val,
        'param2': val,
    },
}
Thus you can iterate over the keys of the dictionary for the functions and over the values for their parameters, as sketched below.
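A minimal sketch of that loop, reusing the try/except pattern from above (function names and parameter values are the placeholders used earlier):
for my_func, params in func_dict.items():
    try:
        my_func(**params)  # pass the stored parameters as keyword arguments
    except Exception:
        pass  # ignore the failure and continue with the next function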
I want to force object instantiation via a class context manager, so as to make it impossible to instantiate the object directly.
I implemented this solution, but technically the user can still instantiate the object.
class HessioFile:
    """
    Represents a pyhessio file instance
    """
    def __init__(self, filename=None, from_context_manager=False):
        if not from_context_manager:
            raise HessioError('HessioFile can only be used with a context manager')
And the context manager:
@contextmanager
def open(filename):
    """
    ...
    """
    hessfile = HessioFile(filename, from_context_manager=True)
Any better solution?
If you consider that your clients will follow basic Python coding principles, then you can guarantee that no method of your class will be called if you are not inside the context.
Your client is not supposed to call __enter__ explicitly; therefore, if __enter__ has been called, you know your client used a with statement and is therefore inside the context (__exit__ will be called).
You just need a boolean variable that helps you remember whether you are inside or outside the context.
class Obj:
    def __init__(self):
        self._inside_context = False

    def __enter__(self):
        self._inside_context = True
        print("Entering context.")
        return self

    def __exit__(self, *exc):
        print("Exiting context.")
        self._inside_context = False

    def some_stuff(self, name):
        if not self._inside_context:
            raise Exception("This method should be called from inside context.")
        print("Doing some stuff with", name)

    def some_other_stuff(self, name):
        if not self._inside_context:
            raise Exception("This method should be called from inside context.")
        print("Doing some other stuff with", name)

with Obj() as inst_a:
    inst_a.some_stuff("A")
    inst_a.some_other_stuff("A")

inst_b = Obj()
with inst_b:
    inst_b.some_stuff("B")
    inst_b.some_other_stuff("B")

inst_c = Obj()
try:
    inst_c.some_stuff("c")
except Exception:
    print("Instance C couldn't do stuff.")
try:
    inst_c.some_other_stuff("c")
except Exception:
    print("Instance C couldn't do some other stuff.")
This will print:
Entering context.
Doing some stuff with A
Doing some other stuff with A
Exiting context.
Entering context.
Doing some stuff with B
Doing some other stuff with B
Exiting context.
Instance C couldn't do stuff.
Instance C couldn't do some other stuff.
Since you'll probably have many methods that you want to "protect" from being called outside the context, you can write a decorator to avoid repeating the same code that tests your boolean:
def raise_if_outside_context(method):
    def decorator(self, *args, **kwargs):
        if not self._inside_context:
            raise Exception("This method should be called from inside context.")
        return method(self, *args, **kwargs)
    return decorator
Then change your methods to:
@raise_if_outside_context
def some_other_stuff(self, name):
    print("Doing some other stuff with", name)
I suggest the following approach:
class MainClass:
    def __init__(self, *args, **kwargs):
        self._class = _MainClass(*args, **kwargs)

    def __enter__(self):
        print('entering...')
        return self._class

    def __exit__(self, exc_type, exc_val, exc_tb):
        # Teardown code
        print('running exit code...')
        pass

# This class should not be instantiated directly!!
class _MainClass:
    def __init__(self, attribute1, attribute2):
        self.attribute1 = attribute1
        self.attribute2 = attribute2
        ...

    def method(self):
        # execute code
        if self.attribute1 == "error":
            raise Exception
        print(self.attribute1)
        print(self.attribute2)

with MainClass('attribute1', 'attribute2') as main_class:
    main_class.method()

print('---')

with MainClass('error', 'attribute2') as main_class:
    main_class.method()
This will output:
entering...
attribute1
attribute2
running exit code...
---
entering...
running exit code...
Traceback (most recent call last):
  File "scratch_6.py", line 34, in <module>
    main_class.method()
  File "scratch_6.py", line 25, in method
    raise Exception
Exception
None that I am aware of. Generally, if it exists in python, you can find a way to call it. A context manager is, in essence, a resource management scheme... if there is no use-case for your class outside of the manager, perhaps the context management could be integrated into the methods of the class? I would suggest checking out the atexit module from the standard library. It allows you to register cleanup functions much in the same way that a context manager handles cleanup, but you can bundle it into your class, such that each instantiation has a registered cleanup function. Might help.
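For illustration, a rough sketch of the atexit idea (the class name and cleanup method here are made up, not from the question):
import atexit

class ManagedResource:
    """Hypothetical class that cleans itself up at interpreter exit."""
    def __init__(self):
        self._open = True
        # register this instance's cleanup, similar in spirit to a context manager's __exit__
        atexit.register(self.close)

    def close(self):
        if self._open:
            print("cleaning up")
            self._open = False
Note that, unlike a with block, the registered callable only runs at interpreter exit, so this is a looser guarantee than __exit__.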
It is worth noting that no amount of effort will prevent people from doing stupid things with your code. Your best bet is generally to make it as easy as possible for people to do smart things with your code.
You can think of hacky ways to try to enforce this (like inspecting the call stack to forbid direct calls to your object, or a boolean attribute set in __enter__ and checked before allowing other actions on the instance), but that will eventually become a mess to understand and explain to others.
Regardless, you should also be certain that people will always find ways to bypass it if they want to. Python doesn't really tie your hands down; if you want to do something silly, it lets you do it. Responsible adults, right?
If you need enforcement, you'd be better off supplying it as a documentation notice. That way, if users opt to instantiate directly and trigger unwanted behavior, it's their fault for not following the guidelines for your code.