Python - Parameter checking with exception raising

I am attempting to write exception-raising code blocks into my Python code in order to ensure that the parameters passed to a function meet appropriate conditions (i.e. making parameters mandatory, type-checking parameters, establishing boundary values for parameters, etc...). I have a satisfactory understanding of how to manually raise exceptions, as well as how to handle them.
from numbers import Number

def foo(self, param1=None, param2=0.0, param3=1.0):
    if param1 is None:
        raise ValueError('This parameter is mandatory')
    elif not isinstance(param2, Number):
        raise ValueError('This parameter must be a valid Numerical value')
    elif param3 <= 0.0:
        raise ValueError('This parameter must be a Positive Number')
    ...
This is an acceptable (tried and true) way of parameter checking in Python, but I have to wonder: since Python has no switch-case construct beyond chains of if-then-else statements, is there a more efficient or more proper way to perform this task? Or is writing long stretches of if-then-else statements my only option?
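For example, one way to flatten such a chain is to make the checks data-driven, pairing each failure condition with its message (a minimal sketch; the lambdas keep evaluation lazy, so later conditions are not evaluated once an earlier one fires):
from numbers import Number

def foo(param1=None, param2=0.0, param3=1.0):
    # each rule pairs a lazily evaluated failure test with its message
    checks = [
        (lambda: param1 is None, 'This parameter is mandatory'),
        (lambda: not isinstance(param2, Number), 'This parameter must be a valid Numerical value'),
        (lambda: param3 <= 0.0, 'This parameter must be a Positive Number'),
    ]
    for failed, message in checks:
        if failed():
            raise ValueError(message)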

You could create a decorator function and pass the expected types and (optional) ranges as parameters. Something like this:
def typecheck(types, ranges=None):
    def __f(f):
        def _f(*args, **kwargs):
            for a, t in zip(args, types):
                if not isinstance(a, t):
                    raise TypeError("Expected %s, got %r" % (t, a))
            for a, r in zip(args, ranges or []):
                if r and not r[0] <= a <= r[1]:
                    raise ValueError("Should be in range %r: %r" % (r, a))
            return f(*args, **kwargs)
        return _f
    return __f
Instead of if ...: raise you could also invert the conditions and use assert, but note that assert statements are not always executed: they are stripped when Python runs with optimizations enabled (-O).
You could also extend this to allow e.g. open ranges (like (0., None)) or to accept arbitrary (lambda) functions for more specific checks.
Example:
@typecheck(types=[int, float, str], ranges=[None, (0.0, 1.0), ("a", "f")])
def foo(x, y, z):
    print("called foo with", x, y, z)

foo(10, .5, "b")         # called foo with 10 0.5 b
foo([1, 2, 3], .5, "b")  # TypeError: Expected <class 'int'>, got [1, 2, 3]
foo(1, 2., "e")          # ValueError: Should be in range (0.0, 1.0): 2.0
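A sketch of that extension (same decorator shape, with ranges generalized to a checks parameter whose entries may be None, an open (low, high) pair, or an arbitrary predicate):
def typecheck(types, checks=None):
    def deco(f):
        def wrapper(*args, **kwargs):
            for a, t in zip(args, types):
                if not isinstance(a, t):
                    raise TypeError("Expected %s, got %r" % (t, a))
            for a, c in zip(args, checks or []):
                if c is None:
                    continue  # no constraint on this argument
                if callable(c):
                    if not c(a):
                        raise ValueError("Check failed for %r" % (a,))
                else:
                    lo, hi = c  # either bound may be None (open range)
                    if lo is not None and a < lo or hi is not None and a > hi:
                        raise ValueError("Should be in range %r: %r" % (c, a))
            return f(*args, **kwargs)
        return wrapper
    return deco

@typecheck(types=[float, str], checks=[(0., None), lambda s: s in "abcdef"])
def g(y, z):
    return y, z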

I think you can use a decorator to check the parameters.
import collections.abc

def parameterChecker(input, output):
    def wrapper(f):
        assert len(input) == f.__code__.co_argcount
        def newfun(*args, **kwds):
            for (a, t) in zip(args, input):
                assert isinstance(a, t), "arg {} need to match {}".format(a, t)
            result = f(*args, **kwds)
            # wrap a scalar result so the output check below can iterate it
            res = result if isinstance(result, collections.abc.Iterable) else [result]
            for (r, t) in zip(res, output):
                assert isinstance(r, t), "output {} need to match {}".format(r, t)
            return result
        newfun.__name__ = f.__name__
        return newfun
    return wrapper
example:
@parameterChecker((int, int), (int,))
def func(arg1, arg2):
    return '1'

func(1, 2)
# AssertionError: output 1 need to match <class 'int'>
func(1, 'e')
# AssertionError: arg e need to match <class 'int'>

This has been bugging me for a while about Python: there is no standard way to report which provided parameter is None or missing a value, nor to handle a JSON/dict object gracefully.
for example I want to output the actual parameter name in the error message,
username = None
if not username:
    log.error("some parameter is missing value")
There is no way to pass the actual parameter name, unless you do it artificially and messily by hard-coding the parameter name into the error message, i.e.,
if not username:
    log.error("username is missing value")
but this is messy, prone to errors, and a pain to maintain.
For this reason, I wrote up a "Dictator" function,
https://medium.com/@mike.reider/python-dictionaries-get-nested-value-the-sane-way-4052ab99356b
If you add your parameters into a dict, or read your parameters from a YAML or JSON config file, you can tell Dictator to raise a ValueError if a param is null,
for example,
config.yaml
skills:
  sports:
    - hockey
    - baseball
Now your program reads in this config file and gets the parameters as a dict:
with open(conf_file, 'r') as f:
    config = yaml.safe_load(f)
Now set your parameters and also check whether they're None:
sports = dictator(config, "skills.sports", checknone=True)
If sports is None, it will raise a ValueError telling you exactly which parameter is missing:
ValueError("missing value for ['skills']['sports']")
You can also provide a fallback value for your parameter, so that if it is None it gets a default value:
sports = dictator(config, "skills.sports", default="No sports found")
This avoids ugly Index/Value/Key error exceptions. It's a flexible, graceful way to handle large dictionary data structures, and it also gives you the ability to check your program's parameters for None values and output the actual parameter names in error messages.
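For reference, a minimal sketch of such a helper (not the actual Dictator implementation; names and behavior are simplified):
def dictator(data, path, default=None, checknone=False):
    # walk a nested dict along a dotted path, e.g. "skills.sports"
    keys = path.split(".")
    current = data
    for key in keys:
        if not isinstance(current, dict) or key not in current:
            current = None
            break
        current = current[key]
    if current is None:
        if checknone:
            raise ValueError("missing value for " +
                             "".join("[{!r}]".format(k) for k in keys))
        return default
    return current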

Related

How to process wrapped function arguments in the same order as visible for original function?

I am trying to examine the types of a function's arguments before the call (in this example, foo). I am using Python decorators to achieve this. I don't see how I can get the arguments in the same order as they are visible to the function foo. In the following example, I get two different orderings for what is essentially the same function call.
def wrapper(func):
    def f(*args, **kwargs):
        print([type(x) for x in args] + [type(v) for v in kwargs.values()])
        return func(*args, **kwargs)
    return f

@wrapper
def foo(a, b, c, d):
    print(f"{a} {b} {c} {d}")

foo(10, 12.5, 14, 5.2)      # all good: int, float, int, float
foo(10, 12.5, d=5.2, c=14)  # not what I want: int, float, float, int
Is it possible to get arguments in a consistent order? If not, then is it at least possible to get them all keyed by argument name? Something that looks like this:
def wrapper(func):
    def f(**kwargs):
        # kwargs = {'a': 10, 'b': 12.5, 'c': 14, 'd': 5.2}
        print([type(v) for v in kwargs.values()])
        return func(**kwargs)
    return f

foo(10, 12.5, 14, 5.2)  # obviously doesn't work
Type checking this way is a bit weak: annotations work only as long as you annotate your code. A more robust approach uses inspect from the standard library, which provides full access to frames, signatures, and everything else you may need. In this case, inspect.signature can be used to fetch the signature of the original function and recover the original order of the parameters; then the parameters are regrouped and the final group is passed back to the original function. More details in the comments.
from inspect import signature

def wrapper(func):
    def f(*args, **kwargs):
        # signature object
        sign = signature(func)
        # use the order of the function's signature as the reference
        order = dict.fromkeys(sign.parameters)
        # update first with the keyword arguments
        order.update(**kwargs)
        # then fill the remaining slots with the positionals
        free_pars = (k for k, v in order.items() if v is None)
        order.update(zip(free_pars, args))
        return func(**order)
    return f
@wrapper
def foo(a, b, c, d):
    print(f"{a} {b} {c} {d}")

foo(10, 12.5, 14, 5.2)
# 10 12.5 14 5.2
foo(10, 12.5, d=5.2, c=14)
# 10 12.5 14 5.2
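As an aside, inspect.Signature can do this mapping directly: its bind() method maps positionals and keywords onto parameter names in signature order and validates the call. A sketch of the same wrapper built on it:
from inspect import signature

def wrapper(func):
    def f(*args, **kwargs):
        # bind() maps positionals and keywords onto parameter names in
        # signature order, raising TypeError for missing/unexpected arguments
        bound = signature(func).bind(*args, **kwargs)
        bound.apply_defaults()
        print({name: type(value) for name, value in bound.arguments.items()})
        return func(*bound.args, **bound.kwargs)
    return f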
The code is also compatible with annotations:
@wrapper
def foo(a: int, b: float, c: int, d: float) -> None:
    print(f"{a} {b} {c} {d}")
The annotations way, no imports required:
It is a copy-paste of the above code, but it uses the __annotations__ attribute to get the signature. Remember that a function may or may not have an annotation for its return value.
def wrapper(func):
    def f(*args, **kwargs):
        if not func.__annotations__:
            raise Exception('No clue... inspect or annotate properly')
        params = func.__annotations__
        # set return flag
        return_has_annotation = False
        if 'return' in params:
            return_has_annotation = True
        # remove possible return value
        return_ = params.pop('return', None)
        order = dict.fromkeys(params)
        order.update(**kwargs)
        free_pars = (k for k, v in order.items() if v is None)
        order.update(zip(free_pars, args))
        # restore the return annotation (dict union requires Python 3.9+)
        if return_has_annotation:
            func.__annotations__ = params | {'return': return_}
        return func(**order)
    return f

@wrapper
def foo(a: int, b: float, c: int, d: float) -> None:
    print(f"{a} {b} {c} {d}")
The first thing to be careful of is that keyword arguments exist precisely because order does not matter for them; they are intended to map a value to a specific argument by name at call time. So enforcing any specific order on kwargs does not make much sense (or at least would be confusing to anyone trying to use your decorator). You will probably want to check which kwargs are specified and remove the corresponding argument types.
Next, if you want to be able to check the argument types, you will need a way to tell your decorator what types are expected, by passing it an argument (you can see more about this here). A straightforward way to do this is to pass a dictionary mapping each argument name to its expected type:
@wrapper({'a': int, 'b': int, 'c': float, 'd': int})
def f(a, b, c=6.0, d=5):
    pass
def wrapper(types):
    def inner(func):
        def wrapped_func(*args, **kwargs):
            # be careful here: this relies on dicts preserving insertion order,
            # so for Python < 3.6 this portion will not work
            expected_types = [v for k, v in types.items() if k not in kwargs]
            actual_types = [type(arg) for arg in args]
            # substitute these if you are dead set on checking keyword arguments as well:
            # expected_types = list(types.values())
            # actual_types = [type(arg) for arg in args] + [type(v) for v in kwargs.values()]
            if expected_types != actual_types:
                raise TypeError(f"bad argument types:\n\tE: {expected_types}\n\tA: {actual_types}")
            return func(*args, **kwargs)
        return wrapped_func
    return inner
@wrapper({'a': int, 'b': float, 'c': int})
def f(a, b, c):
    print('good')

f(10, 2.0, 10)
f(10, 2.0, c=10)
f(10, c=10, b=2.0)
f(10, 2.0, 10.0)  # will raise an exception
Now, after all of this, I want to point out that this functionality is probably largely unwanted and unnecessary in Python code. Python was designed to be dynamically typed, so anything resembling strong types goes against the grain and won't be expected by most.
Next, since Python 3.5 we have had access to the built-in typing module. It lets you specify the types that you expect to receive in a function call:
def f(a: int, b: float, c: int) -> int:
    return a + int(b) + c
Now this won't actually do any type assertions for you, but it will make it plainly obvious what types you are expecting, and most (if not all) IDEs will give you visual warnings that you are passing the wrong type to a function.

How can I type convert many arguments of a function in place?

Context
I use CherryPy to serve a simple webpage that shows different content based on the URL parameters. Specifically it takes the sum of the parameters and shows a different message based on that.
In CherryPy webpages can be defined as functions, and URL parameters are passed as an argument to that function.
As explained in this tutorial, URL parameters are passed as strings, so to calculate the sum I want to convert each argument to a float. I will have many URL parameters, so doing this one by one seems cumbersome.
How can I type convert (a large number of) arguments in place?
What I've tried
Dumb
The "dumb" approach would be to simply take each argument and re-assign it as a float:
def dumb(a="0", b="0", c="0", d="0", e="0", f="0", g="0"):
a = float(a)
b = float(b)
c = float(c)
d = float(d)
e = float(e)
f = float(f)
g = float(g)
return print(sum([a, b, c, d, e, f, g]))
It's readable, but rather repetitive and not very "pythonic".
Loop over locals()
Another option I found is to re-assign the locals to a dictionary, then loop over it and call the values from the dict.
def looping_dict(a="0", b="0", c="0", d="0", e="0", f="0", g="0"):
    args = locals()
    for key in args:
        if key in ["a", "b", "c", "d", "e", "f", "g"]:
            args[key] = float(args[key])
    return print(sum([args["a"], args["b"], args["c"], args["d"], args["e"], args["f"], args["g"]]))
This is a bit annoying as I have to reference the dictionary every time, so a simple reference d becomes args["d"]. It doesn't help code readability either.
This is only documented in the changelog, but since 2016, with cherrypy >= 6.2.0, there is a @cherrypy.tools.params tool doing exactly what you want (provided that you use a Python 3 version supporting type annotations):
import cherrypy

@cherrypy.tools.params()
def your_http_handler(
    a: float = 0, b: float = 0,
    c: float = 0, d: float = 0,
    e: float = 0, f: float = 0,
    g: float = 0,
):
    return str(a + b + c + d + e + f + g)
The PR that added it is PR #1442 — you can explore the usage by looking at the tests there.
If your Python is old for some reason, you could do:
import cherrypy

def your_http_handler(**kwargs):
    # Validate that the right query args are present in the HTTP request:
    if kwargs.keys() ^ {'a', 'b', 'c', 'd', 'e', 'f', 'g'}:
        raise cherrypy.HTTPError(400, message='Got invalid args!')
    # The generator expression won't raise conversion errors here:
    numbers = (float(num) for num in kwargs.values())
    try:
        # sum() actually runs the `float()` conversions, so catch `ValueError` here:
        return str(sum(numbers))
    except ValueError as val_err:
        raise cherrypy.HTTPError(
            400,
            message='All args should be valid numbers: {exc!s}'.format(exc=val_err),
        )
P.S. In your initial post you use return print(...), which is wrong: print() always returns None, so you'd be sending "None" back to the HTTP client, while the argument of print(arg) would just be printed out in the terminal where you run the server.
Here's a @convert decorator I've used before for something similar (originally inspired by https://stackoverflow.com/a/28268292/4597523):
import functools, inspect

def convert(*to_convert, to):
    def actual_convert(fn):
        arg_names = inspect.signature(fn).parameters.keys()
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            args_converted = [to(arg) if arg_name in to_convert else arg
                              for arg, arg_name in zip(args, arg_names)]
            kwargs_converted = {kw_name: to(val) if kw_name in to_convert else val
                                for kw_name, val in kwargs.items()}
            return fn(*args_converted, **kwargs_converted)
        return wrapper
    return actual_convert

@convert('a', 'c', 'd', to=str)
def f(a, b, c=5, *, d, e=0):
    return a, b, c, d, e

print(f(1, 2, d=7))
# Output: ('1', 2, 5, '7', 0)
# The passed params `a` and `d` got converted to `str`;
# `c` kept its default value without conversion.
It uses inspect.signature to get the non-keyword arg names. I am not sure how CherryPy passes the params or how it gets the names, but this might be a solid start. Using functools.wraps is important: it makes sure the original function signature is preserved, which seems to matter for CherryPy.

Python Typing: Validation Decorator for Literal typed Arguments

Often I encounter the scenario of functions which accept a finite set of values only. I know how to reflect this behavior in the type annotations, using typing.Literal like so:
import typing

def func(a: typing.Literal['foo', 'bar']):
    pass
I would like to have a decorator @validate_literals which validates that the parameters passed to the function are consistent with their types:
@validate_literals
def picky_typed_function(
    binary: typing.Literal[0, 1],
    char: typing.Literal['a', 'b']
) -> None:
    pass
so that the input is validated against the restrictions defined by the arguments' types, and a ValueError is raised in case of a violation:
picky_typed_function(0, 'a')  # should pass
picky_typed_function(2, 'a')  # should raise "ValueError: binary must be one of (0, 1)"
picky_typed_function(0, 'c')  # should raise "ValueError: char must be one of ('a', 'b')"
picky_typed_function(0, char='c')  # should raise "ValueError: char must be one of ('a', 'b')"
picky_typed_function(binary=2, char='c')  # should raise "ValueError: binary must be one of (0, 1)"
typing checks are designed to be static and do not happen at runtime. How can I leverage the typing definitions for runtime validation?
We can inspect the decorated (validated) function's signature with inspect.signature, check which of the function's parameters are typed as a Literal alias by getting the "origin" of each parameter's annotation through typing.get_origin() (or, for Python versions < 3.8, __origin__), and retrieve the valid values from the Literal alias with [typing.get_args()](https://stackoverflow.com/a/64522240/3566606), iterating recursively over nested Literal definitions.
With that in place, all that is left to do is figure out which parameters were passed positionally and map the corresponding values to the parameters' names, so each value can be compared against its parameter's valid values.
Finally, we build the decorator using the standard recipe with functools.wraps. In the end, this is the code:
import functools
import inspect
import typing

def args_to_kwargs(func: typing.Callable, *args, **kwargs) -> dict:
    args_dict = {
        list(inspect.signature(func).parameters.keys())[i]: arg
        for i, arg in enumerate(args)
    }
    return {**args_dict, **kwargs}

def valid_args_from_literal(annotation) -> set:
    valid_values = []
    for arg in typing.get_args(annotation):
        # Literal aliases can be nested, e.g. Literal[Literal[0, 1], 2]
        if typing.get_origin(arg) is typing.Literal:
            valid_values += valid_args_from_literal(arg)
        else:
            valid_values += [arg]
    return set(valid_values)

def validate_literals(func: typing.Callable) -> typing.Callable:
    @functools.wraps(func)
    def validated(*args, **kwargs):
        kwargs = args_to_kwargs(func, *args, **kwargs)
        for name, parameter in inspect.signature(func).parameters.items():
            # use parameter.annotation.__origin__ for Python versions < 3.8
            if typing.get_origin(parameter.annotation) is typing.Literal:
                valid_values = valid_args_from_literal(parameter.annotation)
                if kwargs[name] not in valid_values:
                    raise ValueError(
                        f"Argument '{name}' must be one of {valid_values}"
                    )
        return func(**kwargs)
    return validated
This gives the results specified in the question.
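For instance, applied to the function from the question:
@validate_literals
def picky_typed_function(
    binary: typing.Literal[0, 1],
    char: typing.Literal['a', 'b']
) -> None:
    pass

picky_typed_function(0, 'a')  # passes
picky_typed_function(2, 'a')  # ValueError: Argument 'binary' must be one of {0, 1}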
I have also published the alpha version of a Python package, runtime-typing, to perform runtime typechecking: https://pypi.org/project/runtime-typing/ (documentation: https://runtime-typing.readthedocs.io). It handles more cases than just typing.Literal, such as typing.TypeVar and typing.Union.
from typing import Literal
from valdec.dec import validate

@validate
def func(a: Literal["foo", "bar"]) -> str:
    return a

assert func("bar") == "bar"

@validate("return", exclude=True)
def func(binary: Literal[0, 1], char: Literal["a", "b"]):
    return binary, char

assert func(0, "a") == (0, "a")

func(2, "x")
# valdec.utils.ValidationArgumentsError: Validation error <class
# 'valdec.val_pydantic.ValidationError'>: 2 validation errors for argument
# with the name of:
# binary
#   unexpected value; permitted: 0, 1 (type=value_error.const; given=2;
#   permitted=(0, 1))
# char
#   unexpected value; permitted: 'a', 'b' (type=value_error.const; given=x;
#   permitted=('a', 'b')).
valdec: https://github.com/EvgeniyBurdin/valdec

Complex variable arguments checking at runtime

Let's consider a function like this:
def f(*args, **kwargs):
    ...
*args are a variable number of arguments that must be:
(an object, a float, a float), possibly repeated N times,
followed by 0 to N' objects.
For example this is a valid call:
f(my_obj, 0, 1, other_obj, 2, 3, obj3, obj4)
but this is invalid:
f(my_obj, other_obj, 2, 3)
This function is exposed to users through a Python shell.
So, there is value in checking user input -- I am using
the typeguard library that works with type annotations
(like mypy).
I am trying to use typing module to write the proper annotations...
I thought I could at least express the constraint on the groups of 3 args
like this:
@typeguard.typechecked
def f(*args: Tuple[Any, float, float]): ...
But it does not work.
And anyway, I have no idea how to add the constraint on the trailing objects.
Of course I can craft some code myself to check the arguments, but I am sure something better exists for cases of complex variable argument sequences (either a clever use of the typing module or another Python lib?).
What I meant by doing the validation manually:
def _validate(a, b, c):
    assert isinstance(b, float), f"{b} is not a float!"
    assert isinstance(c, float), f"{c} is not a float"

def _validate_args(args):
    if len(args) % 3 != 0:  # wrong number of args
        raise ValueError("Arguments must be passed in packs of 3")
    for idx in range(0, len(args), 3):
        a, b, c = args[idx: idx + 3]
        _validate(a, b, c)

def func(*args, **kwargs):
    _validate_args(args)

func(1, 2.0, 3, 1, 2, 3)
# AssertionError: 3 is not a float
You can make any message you want.
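If you also need to allow the trailing objects from the question (0 to N' of them after the triples), here is a sketch of one possible greedy strategy: consume (object, float, float) triples while the pattern holds, then require that no stray floats remain among the leftovers:
def _validate_args(args):
    idx = 0
    # greedily consume (object, float, float) triples
    while (idx + 3 <= len(args)
           and isinstance(args[idx + 1], float)
           and isinstance(args[idx + 2], float)):
        idx += 3
    # whatever remains must be the trailing objects: no stray floats allowed
    for trailing in args[idx:]:
        if isinstance(trailing, float):
            raise ValueError(f"unexpected float {trailing!r} outside an (obj, float, float) group")

_validate_args(("my_obj", 0.0, 1.0, "other_obj", 2.0, 3.0, "obj3", "obj4"))  # OK
_validate_args(("my_obj", "other_obj", 2.0, 3.0))  # ValueError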

Automatically Type Cast Parameters In Python

Background:
I mostly run python scripts from the command line in pipelines and so my arguments are always strings that need to be type casted to the appropriate type. I make a lot of little scripts each day and type casting each parameter for every script takes more time than it should.
Question:
Is there a canonical way to automatically type cast parameters for a function?
My Way:
I've developed a decorator to do what I want if there isn't a better way. The decorator is the autocast fxn below. The decorated fxn is fxn2 in the example. Note that at the end of the code block I passed 1 and 2 as strings and if you run the script it will automatically add them. Is this a good way to do this?
def estimateType(var):
    # first test bools
    if var == 'True':
        return True
    elif var == 'False':
        return False
    else:
        # int
        try:
            return int(var)
        except ValueError:
            pass
        # float
        try:
            return float(var)
        except ValueError:
            pass
        # string
        try:
            return str(var)
        except ValueError:
            raise NameError('Something Messed Up Autocasting var %s (%s)'
                            % (var, type(var)))
def autocast(dFxn):
    '''Still need to figure out if you pass a variable with kw args!!!
    I guess I can just pass the dictionary to the fxn **args?'''
    def wrapped(*c, **d):
        print(c, d)
        t = [estimateType(x) for x in c]
        return dFxn(*t)
    return wrapped

@autocast
def fxn2(one, two):
    print(one + two)

fxn2('1', '2')
EDIT: For anyone who comes here and wants the updated and concise working version, go here:
https://github.com/sequenceGeek/cgAutoCast
And here is also a quick working version based on the above:
def boolify(s):
    if s == 'True' or s == 'true':
        return True
    if s == 'False' or s == 'false':
        return False
    raise ValueError('Not Boolean Value!')

def estimateType(var):
    '''guesses the str representation of the variable's type'''
    var = str(var)  # important if the parameters aren't strings...
    for caster in (boolify, int, float):
        try:
            return caster(var)
        except ValueError:
            pass
    return var

def autocast(dFxn):
    def wrapped(*c, **d):
        cp = [estimateType(x) for x in c]
        dp = dict((i, estimateType(j)) for (i, j) in d.items())
        return dFxn(*cp, **dp)
    return wrapped

###### usage ######
@autocast
def randomFunction(firstVar, secondVar):
    print(firstVar + secondVar)

randomFunction('1', '2')
If you want to auto-convert values:
def boolify(s):
    if s == 'True':
        return True
    if s == 'False':
        return False
    raise ValueError("huh?")

def autoconvert(s):
    for fn in (boolify, int, float):
        try:
            return fn(s)
        except ValueError:
            pass
    return s
You can adjust boolify to accept other boolean values if you like.
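For instance, the fallback chain behaves like this:
print(autoconvert('True'))   # True
print(autoconvert('42'))     # 42
print(autoconvert('3.14'))   # 3.14
print(autoconvert('hello'))  # 'hello' (falls through unchanged)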
You could just use plain eval to evaluate the input string, if you trust the source:
>>> eval("3.2", {}, {})
3.2
>>> eval("True", {}, {})
True
But if you don't trust the source, you can use literal_eval from the ast module.
>>> ast.literal_eval("'hi'")
'hi'
>>> ast.literal_eval("(5, 3, ['a', 'b'])")
(5, 3, ['a', 'b'])
Edit:
As Ned Batchelder commented, literal_eval won't accept unquoted strings, so I added a workaround, plus an example of the autocaste decorator with keyword arguments.
import ast

def my_eval(s):
    try:
        return ast.literal_eval(s)
    except ValueError:  # maybe it's a string; literal_eval failed, return it anyway
        return s  # thanks gnibbler

def autocaste(func):
    def wrapped(*c, **d):
        cp = [my_eval(x) for x in c]
        dp = {i: my_eval(j) for i, j in d.items()}  # dict comprehension, Python 2.7+
        # use dict((i, my_eval(j)) for i, j in d.items()) for older versions
        return func(*cp, **dp)
    return wrapped

@autocaste
def f(a, b):
    return a + b

print(f("3.4", "1"))  # 4.4
print(f("s", "sd"))  # ssd
print(my_eval("True"))  # True
print(my_eval("None"))  # None
print(my_eval("[1, 2, (3, 4)]"))  # [1, 2, (3, 4)]
I'd imagine you can make a type signature system with a function decorator, much like you have, only one that takes arguments. For example:
@signature(int, str, int)
def func(x, y, z):
    ...
Such a decorator can be built rather easily. Something like this (EDIT -- works!):
def signature(*args, **kwargs):
    def decorator(fn):
        def wrapped(*fn_args, **fn_kwargs):
            new_args = [t(raw) for t, raw in zip(args, fn_args)]
            new_kwargs = dict([(k, kwargs[k](v)) for k, v in fn_kwargs.items()])
            return fn(*new_args, **new_kwargs)
        return wrapped
    return decorator
And just like that, you can now imbue functions with type signatures!
@signature(int, int)
def foo(x, y):
    print(type(x))
    print(type(y))
    print(x + y)

>>> foo('3', '4')
<class 'int'>
<class 'int'>
7
Basically, this is a type-explicit version of @utdemir's method.
If you're parsing arguments from the command line, you should use the argparse module (if you're using Python 2.7).
Each argument can have an expected type, so knowing what to do with it should be relatively straightforward. You can even define your own types (see the sketch after the example below).
...quite often the command-line string should instead be interpreted as another type, like a float or int. The type keyword argument of add_argument() allows any necessary type-checking and type conversions to be performed. Common built-in types and functions can be used directly as the value of the type argument:
>>> parser = argparse.ArgumentParser()
>>> parser.add_argument('foo', type=int)
>>> parser.add_argument('bar', type=file)
>>> parser.parse_args('2 temp.txt'.split())
Namespace(bar=<open file 'temp.txt', mode 'r' at 0x...>, foo=2)
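Your own type is just a callable that takes the argument string and returns the converted value; by convention it raises argparse.ArgumentTypeError (or ValueError) on bad input. A small sketch (Python 3 syntax; the probability type here is hypothetical):
import argparse

def probability(value):
    # a custom argparse "type": convert and constrain to [0, 1]
    x = float(value)  # raises ValueError on non-numeric input
    if not 0.0 <= x <= 1.0:
        raise argparse.ArgumentTypeError("%r is not in [0, 1]" % value)
    return x

parser = argparse.ArgumentParser()
parser.add_argument('threshold', type=probability)
print(parser.parse_args(['0.75']))  # Namespace(threshold=0.75)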
There are a couple of problems in your snippet.
# first test bools
if var == 'True':
    return True
elif var == 'False':
    return False
This only ever matches the exact strings 'True' and 'False', because you are testing against those string literals.
There is no automatic coercion of types in Python. The arguments you receive via *args and **kwargs can be anything: the first collects the positional values (each of which can be any datatype, primitive or complex) and the second collects a mapping. So if you write a decorator, you will end up with a long list of error checks.
Normally, if you want a str, just typecast the value to string via str() when the function is invoked.
I know I arrived late at this game, but how about eval?
def my_cast(a):
    try:
        return eval(a)
    except Exception:
        return a
or alternatively (and more safely):
from ast import literal_eval

def mycast(a):
    try:
        return literal_eval(a)
    except (ValueError, SyntaxError):
        return a
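For example:
print(mycast('3'))       # 3 (int)
print(mycast('[1, 2]'))  # [1, 2] (list)
print(mycast('hello'))   # 'hello' (an unquoted string falls back unchanged)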
