Is there a way to check if function is recursive in python? - python

I want to write a testing function for an exercise, to make sure a function is implemented correctly.
So I got to wonder, is there a way, given a function "foo", to check if it is implemented recursively?
If it encapsulates an inner recursive function and uses it, that also counts. For example:
def foo(n):
    def inner(n):
        # more code
        inner(n - 1)
    return inner(n)
This should also be considered recursive.
Note that I want to use an external test function to perform this check, without altering the original code of the function.

Solution:
from bdb import Bdb
import sys

class RecursionDetected(Exception):
    pass

class RecursionDetector(Bdb):
    def do_clear(self, arg):
        pass

    def __init__(self, *args):
        Bdb.__init__(self, *args)
        self.stack = set()

    def user_call(self, frame, argument_list):
        code = frame.f_code
        if code in self.stack:
            raise RecursionDetected
        self.stack.add(code)

    def user_return(self, frame, return_value):
        self.stack.remove(frame.f_code)

def test_recursion(func):
    detector = RecursionDetector()
    detector.set_trace()
    try:
        func()
    except RecursionDetected:
        return True
    else:
        return False
    finally:
        sys.settrace(None)
Example usage/tests:
def factorial_recursive(x):
    def inner(n):
        if n == 0:
            return 1
        return n * factorial_recursive(n - 1)
    return inner(x)

def factorial_iterative(n):
    product = 1
    for i in xrange(1, n+1):
        product *= i
    return product

assert test_recursion(lambda: factorial_recursive(5))
assert not test_recursion(lambda: factorial_iterative(5))
assert not test_recursion(lambda: map(factorial_iterative, range(5)))
assert factorial_iterative(5) == factorial_recursive(5) == 120
Essentially test_recursion takes a callable with no arguments, calls it, and returns True if at any point during the execution of that callable the same code appeared twice in the stack, False otherwise. I think it's possible that it'll turn out this isn't exactly what OP wants. It could be modified easily to test if, say, the same code appears in the stack 10 times at a particular moment.
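For illustration, here is a hedged sketch of that modification; it reuses the RecursionDetected exception and the Bdb machinery from above, and the threshold of 10 plus the Counter bookkeeping are my illustrative choices, not part of the original answer:

from collections import Counter
from bdb import Bdb

class DeepRecursionDetector(Bdb):
    """Raise RecursionDetected once the same code object is on the stack `limit` times."""
    def __init__(self, limit=10, *args):
        Bdb.__init__(self, *args)
        self.counts = Counter()
        self.limit = limit

    def do_clear(self, arg):
        pass

    def user_call(self, frame, argument_list):
        code = frame.f_code
        self.counts[code] += 1
        if self.counts[code] >= self.limit:
            raise RecursionDetected

    def user_return(self, frame, return_value):
        self.counts[frame.f_code] -= 1

Plugging this detector into test_recursion in place of RecursionDetector would then report True only for callables that put the same code object on the stack at least 10 times.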

from inspect import stack

already_called_recursively = False

def test():
    global already_called_recursively
    function_name = stack()[1].function
    if not already_called_recursively:
        already_called_recursively = True
        print(test())  # One recursive call, leads to "Recursion detected!"
    if function_name == test.__name__:
        return "Recursion detected!"
    else:
        return "Called from {}".format(function_name)

print(test())  # Not recursion, caller name: "<module>"

def xyz():
    print(test())  # Not recursion, caller name: "xyz"

xyz()
The output is
Recursion detected!
Called from <module>
Called from xyz
I use the global variable already_called_recursively to make sure the recursive call happens only once. As you can see, the recursive call reports "Recursion detected!", since the caller's name is the same as the current function's, which means the function was called from itself, i.e. recursion.
The other prints are the module-level call and the call inside xyz.
Hope it helps :D

I have not yet verified for myself that Alex's answer works (though I assume it does, and far better than what I'm about to propose), but if you want something a little simpler (and smaller), you can lean on sys.getrecursionlimit() to force a recursion error and then check for it within a function. For example, this is what I wrote for a recursion verification of my own:
import sys

def is_recursive(function, *args):
    try:
        # Call the function with an argument large enough to exceed the recursion limit
        function(sys.getrecursionlimit() + 1, *args)
    # Catch RecursionError instances (means the function is recursive)
    except RecursionError:
        return True
    # Catch everything else (may not mean the function isn't recursive,
    # but it means we probably have a bug somewhere else in the code)
    except:
        return False
    # Return False if it didn't error out (means the function isn't recursive)
    return False
While it may be less elegant (and more error-prone in some cases), this is far smaller than Alex's code and works reasonably well most of the time. The main drawback is that with this approach you make your computer work through every recursive call the function makes until the recursion limit is reached. I suggest temporarily lowering the recursion limit with sys.setrecursionlimit() while using this code, to minimize the time spent working through the recursions, like so:
sys.setrecursionlimit(10)
if is_recursive(my_func, ...):
    # do stuff
else:
    # do other stuff
sys.setrecursionlimit(1000)  # 1000 is the default recursion limit
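One possible refinement (my suggestion, not part of the answer): since the limit may not be 1000 on every setup, it is safer to save the current limit and restore it in a finally block. A minimal sketch, where my_func stands in for the function under test:

old_limit = sys.getrecursionlimit()
sys.setrecursionlimit(10)
try:
    found_recursion = is_recursive(my_func)
finally:
    sys.setrecursionlimit(old_limit)  # restore whatever the limit was before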

Related

Regarding good coding practices, what is meant by "Useless return at end of function and method"?

I'm using Spyder to create a web scraper, and things are moving smoothly so far. As a rookie, Spyder's Code Analysis function is something I find useful for improving the standard of my code. However, while I usually understand its instructions/recommendations, I've recently run into a bit of a blip. I'll post some sample code first:
def payments():  #### This is line 59 in the editor. Preceding it is another function with a format similar to this one.
    """Obtains data on the weekly payments in the country"""
    html = get(source["Payments"]).text
    html = bs(html, "lxml")
    location = "/home/oduduwa/Desktop/Python Projects/Financial Analyser/CBN Data/Payments.csv"

    def file_check():
        headings = [i.text for i in html.find_all(width="284")][:10]
        headings.insert(0, "Date")
        if isfile(location) is False:
            with open(location, "w") as file_obj:
                writer(file_obj).writerow(headings)
        return

    file_check()
    file = open(location, "r").read()
    dates = [i.text.strip()[8:] for i in html.find_all("th", colspan="2")]
    values = [i.text.strip()[4:] for i in html.find_all(width="149") if i.text.strip()[4:] != ""]
    values = array(values).reshape(int(len(values)/10), 10)
    values = insert(values, 0, array(dates).transpose(), axis=1)[::-1]
    for i in range(len(values)):
        if values[i][0] not in file:
            with open(location, "a+", newline=("\n")) as file_obj:
                writer(file_obj).writerow(values[i])
    return
The code runs fine and does everything it should. What I don't really understand, however, is Spyder's statement that there's a useless return call in the code block. Here's what it says specifically (the warning quoted in the title above).
But from what I gather, every function call is necessary in this code block. What could I have missed? Thanks for your time!
Python functions implicitly return None by default. The following function definitions are equivalent.
def foo():
    pass

def foo():
    return

def foo():
    return None
In my opinion, it is good practice to either:
1. have no return statement at all - this indicates that you are not supposed to assign a name to the return value when calling the function, or
2. explicitly return None, to indicate "no result" for a function that could return a meaningful value, or
3. use just return to make a function that returns no meaningful value stop execution.
Example for situation 1:
def switch_first_last_in_place(lst):
    'switch the first and last elements of a list in-place'
    lst[0], lst[-1] = lst[-1], lst[0]
This function implicitly returns None and you are not supposed to issue
result = switch_first_last_in_place([1, 2, 3])
Example for situation 2:
def get_user_info_from_database(username):
    'fetches user info, returns None if user does not exist'
    if user_exist(username):
        return query_db(username)
    else:
        return None
This function explicitly returns None to indicate a user was not found. Assignments like
result = get_user_info_from_database('Bob')
are expected. The part
else:
    return None
is unnecessary but I like being explicit in cases where None is a meaningful return value.
Example for situation 3:
def assert_exists(thing, *containers):
    'raise AssertionError if thing cannot be found in any container'
    for container in containers:
        if thing in container:
            return
    raise AssertionError
Here, return is merely used to break out of the function.
I don't like the bare return at the end of the functions in your example. It is not used to end execution and those functions cannot return other values. I would remove it.
You've misunderstood. The warning isn't talking about any of the functions you are calling. It's referring to your use of the return keyword.
This function:
def print_hello():
    print("Hello")
    return
Is equivalent to this function:
def print_hello():
    print("Hello")
    return None
Which is equivalent to this function:
def print_hello():
    print("Hello")
The warning is saying that your return statements are useless and are not required.

How to implement a decorator function

I'm brand-new to decorators and closures, I'm trying to practice with a simple example. When executed it raises an error of:
NameError: name 'congratulate' is not defined
What do I need to change?
"""
A recursive function to check if a string is a palindrome.
"""
#congratulate
def palindrome(phrase):
characters = [char.lower() for char in phrase if char.isalpha()]
chars_len = len(characters)
out1 = characters[0]
out2 = characters[-1]
if chars_len <= 2:
return out1 == out2
else:
if out1 == out2:
return palindrome(characters[1:-1])
else:
return False
def congratulate(func):
if func:
print('Congratulations, it\'s a palindrome!')
if __name__ == '__main__':
print(palindrome('Rats live on no evil star'))
"""
A recursive function to check if a string is a palindrome.
"""
def congratulate(func):
def wrapper(*argv, **kargs):
result = func(*argv, **kargs)
if result:
print('Congratulations, it\'s a palindrome!')
return result
return wrapper
#congratulate
def palindrome(phrase):
characters = [char.lower() for char in phrase if char.isalpha()]
chars_len = len(characters)
out1 = characters[0]
out2 = characters[-1]
if chars_len <= 2:
return out1 == out2
else:
if out1 == out2:
return palindrome(characters[1:-1])
else:
return False
if __name__ == '__main__':
print(palindrome('Rats live on no evil star'))
The essence of understanding decorators is that

@f
def g(args):
    ...

is equivalent to g = f(g), so a later call g(args) really means f(g)(args).
I know I'm late to the party, but I want to expand.
As noted, the NameError in this case is caused by the fact that you use a name before you actually create one. Moving congratulate() to the top remedies this.
Apart from the NameError, you have two implicit logic errors relating to decorator/function functionality:
First Issue:
Your if clause in congratulate always evaluates to True; you aren't exactly congratulating when a string is a palindrome.
This is caused by the fact that function objects always evaluate to True, so a condition of the form if func: will always execute:
def f():
    pass

if f:
    print("I'm true!")

# Prints: I'm true!
This is thankfully trivial and can easily be fixed by actually calling the function: if func("test string"):
Second Issue:
The second issue here is less trivial and probably caused by the fact that decorators can be confusing. You aren't actually using congratulate() the way decorators are supposed to be used.
A decorator is a callable that returns a callable (callables are things like functions, or classes that overload __call__). What your 'decorator' is doing here is simply accepting a function object, evaluating whether the object is truthy and then printing the congratulations.
Worst part? It is also implicitly rebinding the name palindrome to None.
Again, you can see this indirect effect (+1 for rhyming) in this next snippet:
def decor(f):
    if f: print("Decorating can be tricky")

@decor
def f():
    print("Do I even Exist afterwards?")

# When executed, this prints:
Decorating can be tricky
Cool, our function f has been decorated, but look what happens when we try calling it:
f()
TypeError Traceback (most recent call last)
<ipython-input-31-0ec059b9bfe1> in <module>()
----> 1 f()
TypeError: 'NoneType' object is not callable
Yes, the name f has now been bound to None, the return value of our decor function.
This happens because, as pointed out, the @ syntax is directly equivalent to the following:

@decor
def f(): pass

# similar to
f = decor(f)  # we re-assign the name f!
Because of this we must make sure the return value of a decorator is an object that can afterwards be called again, ergo, a callable object.
So what do you do? One option you might consider would be simply returning the function you passed:
def congratulate(func):
    if func("A test Phrase!"):
        print('Congratulations, it\'s a palindrome!')
    return func
This will guarantee that after the decorator runs on your palindrome() function, the name palindrome is still going to map to a callable object.
The problem? This turns out to be a one-time ride. When Python encounters your decorator and your function, it's going to execute congratulate once and, as a result, run your if clause only once.
But you need this if to run every time your function is called! What can you do in order to accomplish this? Return a function that executes the decorated function (so-called nested function decorators).
By doing this you create a new function for the name palindrome and this function contains your original function which you make sure is executed each time palindrome() is called.
def congratulate(func):  # grabs your decorated function
    # a new function that uses the original decorated function
    def newFunc():
        # Use the function
        if func("Test string"):
            print('Congratulations, it\'s a palindrome!')
    # Return the function that uses the original function
    return newFunc
newFunc is now a function that issues calls to your original function.
The decoration process now assigns the palindrome name to the newFunc object (notice how we returned it with return newFunc).
As a result, each time you execute a call of the form palindrome(), this is translated to newFunc(), which in turn calls func() in its body. (If you're still with me, I commend you.)
What's the final issue here? We've hard-coded the parameters for func. As is, every time you call palindrome(), newFunc() will call your original function func with a call signature of func("Test string"), which is not what we want; we need to be able to pass parameters.
What's the solution? Thankfully, this is simple: Pass an argument to newFunc() which will then pass the argument to func():
def congratulate(func):  # grabs your decorated function
    # a new function that uses the original decorated function
    # we pass the required argument <phrase>
    def newFunc(phrase):
        # Use the function
        # we use the argument <phrase>
        if func(phrase):
            print('Congratulations, it\'s a palindrome!')
    # Return the function that uses the original function
    return newFunc
Now, every time you call palindrome('Rats live on no evil star'), this translates to a call of newFunc('Rats live on no evil star'), which then transfers that call to your func as func('Rats live on no evil star') in the if clause.
After execution, this works wonderfully and gets you the result you wanted:
palindrome('Rats live on no evil star')
Congratulations, it's a palindrome!
I hope you enjoy reading, I believe I'm done (for now)!
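As a further refinement (my addition, not part of the answer above), the wrapper can forward arbitrary arguments with *args/**kwargs and return the wrapped function's result, so that print(palindrome(...)) still shows True or False. A minimal sketch:

import functools

def congratulate(func):
    @functools.wraps(func)              # keep the original name and docstring
    def wrapper(*args, **kwargs):
        result = func(*args, **kwargs)  # works for any call signature
        if result:
            print('Congratulations, it\'s a palindrome!')
        return result                   # forward the result to the caller
    return wrapper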
Move the congratulate() function above the function it's decorating (palindrome).

Python - multiple functions - output of one to the next

I know this is super basic and I have been searching everywhere, but I am still very confused by everything I'm seeing; I'm not sure of the best way to do this and am having a hard time wrapping my head around it.
I have a script with multiple functions. I would like the first function to pass its output to the second, then the second to pass its output to the third, etc. Each performs its own step in an overall process on the starting dataset.
For example, very simplified with bad names but this is to just get the basic structure:
#!/usr/bin/python
# script called process.py

import sys

infile = sys.argv[1]

def function_one():
    # do things
    return function_one_output

def function_two():
    # take output from function_one, and do more things
    return function_two_output

def function_three():
    # take output from function_two, do more things
    return/print function_three_output
I want this to run as one script and print the output/write it to a new file or whatever, which I know how to do. I'm just unclear on how to pass the intermediate output of each function to the next.
infile -> function_one -> (intermediate1) -> function_two -> (intermediate2) -> function_three -> final result/outfile
I know I need to use return, but I am unsure how to call this at the end to get my final output
Individually?
function_one(infile)
function_two()
function_three()
or within each other?
function_three(function_two(function_one(infile)))
or within the actual function?
def function_one():
    # do things
    return function_one_output

def function_two():
    input_for_this_function = function_one()
    # etc etc etc
Thank you friends, I am over complicating this and need a very simple way to understand it.
You could define a data-streaming helper function:

from functools import reduce

def flow(seed, *funcs):
    return reduce(lambda arg, func: func(arg), funcs, seed)

flow(infile, function_one, function_two, function_three)

# for example
flow('HELLO', str.lower, str.capitalize, str.swapcase)
# returns 'hELLO'
Edit: I would now suggest that a more "pythonic" way to implement the flow function above is:

def flow(seed, *funcs):
    for func in funcs:
        seed = func(seed)
    return seed
As ZdaR mentioned, you can run each function and store the result in a variable then pass it to the next function.
def function_one(file):
    # do things on file
    return function_one_output

def function_two(myData):
    # do things on myData
    return function_two_output

def function_three(moreData):
    # do more things on moreData
    return/print function_three_output

def Main():
    firstData = function_one(infile)
    secondData = function_two(firstData)
    function_three(secondData)
This is assuming your function_three would write to a file or doesn't need to return anything. Another method, if these three functions will always run together, is to call them inside function_three. For example...
def function_three(file):
    firstStep = function_one(file)
    secondStep = function_two(firstStep)
    # do things on secondStep
    return/print to file
Then all you have to do is call function_three in your main and pass it the file.
For safety, readability and debugging ease, I would temporarily store the results of each function.
def function_one():
    # do things
    return function_one_output

def function_two(function_one_output):
    # take function_one_output and do more things
    return function_two_output

def function_three(function_two_output):
    # take function_two_output and do more things
    return/print function_three_output

result_one = function_one()
result_two = function_two(result_one)
result_three = function_three(result_two)
The added benefit here is that you can then check that each function is correct. If the end result isn't what you expected, just print the results you're getting or perform some other check to verify them. (Also, if you're running in the interpreter, they will stay in the namespace after the script ends so you can test them interactively.)
result_one = function_one()
print result_one
result_two = function_two(result_one)
print result_two
result_three = function_three(result_two)
print result_three
Note: I used multiple result variables, but as PM 2Ring notes in a comment you could just reuse the name result over and over. That'd be particularly helpful if the results would be large variables.
It's always better (for readability, testability and maintainability) to keep your functions as decoupled as possible, and to write them so the output only depends on the input whenever possible.
So in your case, the best way is to write each function independently, ie:
def function_one(arg):
    do_something()
    return function_one_result

def function_two(arg):
    do_something_else()
    return function_two_result

def function_three(arg):
    do_yet_something_else()
    return function_three_result
Once you're there, you can of course directly chain the calls:
result = function_three(function_two(function_one(arg)))
but you can also use intermediate variables and try/except blocks if needed for logging / debugging / error handling etc:
r1 = function_one(arg)
logger.debug("function_one returned %s", r1)
try:
    r2 = function_two(r1)
except SomePossibleException as e:
    logger.exception("function_two raised %s for %s", e, r1)
    # either return, re-raise, ask the user what to do, etc.
    return 42  # when in doubt, always return 42 !
else:
    r3 = function_three(r2)
    print "Yay ! result is %s" % r3
As an extra bonus, you can now reuse these three functions anywhere, each on its own and in any order.
NB : of course there ARE cases where it just makes sense to call a function from another function... Like, if you end up writing:
result = function_three(function_two(function_one(arg)))
everywhere in your code AND it's not an accidental repetition, it might be time to wrap the whole in a single function:
def call_them_all(arg):
    return function_three(function_two(function_one(arg)))
Note that in this case it might be better to decompose the calls, as you'll find out when you'll have to debug it...
I'd do it this way:
def function_one(x):
    # do things
    output = x ** 1
    return output

def function_two(x):
    output = x ** 2
    return output

def function_three(x):
    output = x ** 3
    return output
Note that I have modified the functions to accept a single argument, x, and added a basic operation to each.
This has the advantage that each function is independent of the others (loosely coupled) which allows them to be reused in other ways. In the example above, function_two() returns the square of its argument, and function_three() the cube of its argument. Each can be called independently from elsewhere in your code, without being entangled in some hardcoded call chain such as you would have if called one function from another.
You can still call them like this:
>>> x = function_one(3)
>>> x
3
>>> x = function_two(x)
>>> x
9
>>> x = function_three(x)
>>> x
729
which lends itself to error checking, as others have pointed out.
Or like this:
>>> function_three(function_two(function_one(2)))
64
if you are sure that it's safe to do so.
And if you ever wanted to calculate the square or cube of a number, you can call function_two() or function_three() directly (but, of course, you would name the functions appropriately).
With d6tflow you can easily chain together complex data flows and execute them. You can quickly load input and output data for each task. It makes your workflow very clear and intuitive.
import d6tflow

class Function_one(d6tflow.tasks.TaskCache):
    def run(self):
        function_one_output = do_things()
        self.save(function_one_output)  # instead of return

@d6tflow.requires(Function_one)
class Function_two(d6tflow.tasks.TaskCache):
    def run(self):
        output_from_function_one = self.inputLoad()  # load function input
        function_two_output = do_more_things()
        self.save(function_two_output)

@d6tflow.requires(Function_two)
class Function_three(d6tflow.tasks.TaskCache):
    def run(self):
        output_from_function_two = self.inputLoad()
        function_three_output = do_more_things()
        self.save(function_three_output)

d6tflow.run(Function_three())  # executes all functions
function_one_output = Function_one().outputLoad()  # get function output
function_three_output = Function_three().outputLoad()
It has many more useful features like parameter management, persistence, intelligent workflow management. See https://d6tflow.readthedocs.io/en/latest/
This way, function_three(function_two(function_one(infile))) would be the best: you do not need global variables, and each function is completely independent of the others.
Edited to add:
I would also say that function_three should not print anything; if you want to print the returned result, use:
print function_three(function_two(function_one(infile)))
or something like:
output = function_three(function_two(function_one(infile)))
print output
Use parameters to pass the values:
def function1():
    foo = do_stuff()
    return function2(foo)

def function2(foo):
    bar = do_more_stuff(foo)
    return function3(bar)

def function3(bar):
    baz = do_even_more_stuff(bar)
    return baz

def main():
    thing = function1()
    print thing

Python lazy evaluator

Is there a Pythonic way to encapsulate a lazy function call, whereby on first use of the function f(), it calls a previously bound function g(Z) and on the successive calls f() returns a cached value?
Please note that memoization might not be a perfect fit.
I have:
f = g(Z)
if x:
    return 5
elif y:
    return f
elif z:
    return h(f)
The code works, but I want to restructure it so that g(Z) is only called if the value is used. I don't want to change the definition of g(...), and Z is a bit big to cache.
EDIT: I assumed that f would have to be a function, but that may not be the case.
I'm a bit confused about whether you seek caching or lazy evaluation. For the latter, check out the module lazy.py by Alberto Bertogli.
Try using this decorator:
class Memoize:
    def __init__(self, f):
        self.f = f
        self.mem = {}

    def __call__(self, *args, **kwargs):
        if (args, str(kwargs)) in self.mem:
            return self.mem[args, str(kwargs)]
        else:
            tmp = self.f(*args, **kwargs)
            self.mem[args, str(kwargs)] = tmp
            return tmp
(extracted from dead link: http://snippets.dzone.com/posts/show/4840 / https://web.archive.org/web/20081026130601/http://snippets.dzone.com/posts/show/4840)
(Found here: Is there a decorator to simply cache function return values? by Alex Martelli)
EDIT: Here's another in form of properties (using __get__) http://code.activestate.com/recipes/363602/
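For reference, a minimal usage sketch of the Memoize class above; slow_square is just an illustrative stand-in for an expensive function:

@Memoize
def slow_square(n):
    print("computing", n)
    return n * n

slow_square(4)  # prints "computing 4" and returns 16
slow_square(4)  # returns 16 straight from the cache, nothing is printed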
You can employ a cache decorator; let's see an example:

from functools import wraps

class FuncCache(object):
    def __init__(self):
        self.cache = {}

    def __call__(self, func):
        @wraps(func)
        def callee(*args, **kwargs):
            key = (args, str(kwargs))
            # see if there is already a result in the cache
            if key in self.cache:
                result = self.cache.get(key)
            else:
                result = func(*args, **kwargs)
                self.cache[key] = result
            return result
        return callee
With the cache decorator, here you can write
my_cache = FuncCache()

@my_cache
def foo(n):
    """Expensive calculation
    """
    sum = 0
    for i in xrange(n):
        sum += i
    print 'called foo with result', sum
    return sum

print foo(10000)
print foo(10000)
print foo(1234)
As you can see from the output
called foo with result 49995000
49995000
49995000
The foo will be called only once. You don't have to change any line of your function foo. That's the power of decorators.
There are quite a few decorators out there for memoization:
http://wiki.python.org/moin/PythonDecoratorLibrary#Memoize
http://code.activestate.com/recipes/498110-memoize-decorator-with-o1-length-limited-lru-cache/
http://code.activestate.com/recipes/496879-memoize-decorator-function-with-cache-size-limit/
Coming up with a completely general solution is harder than you might think. For instance, you need to watch out for non-hashable function arguments and you need to make sure the cache doesn't grow too large.
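For what it's worth (an addition, not part of the answer above), Python 3.2+ ships a bounded memoizing decorator in the standard library, functools.lru_cache, which takes care of the cache-size concern, though it still requires hashable arguments:

from functools import lru_cache

@lru_cache(maxsize=128)  # least-recently-used entries are evicted beyond 128
def expensive(n):
    return sum(range(n))

expensive(10000)  # computed
expensive(10000)  # served from the cache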
If you're really looking for a lazy function call (one where the function is only actually evaluated if and when the value is needed), you could probably use generators for that.
EDIT: So I guess what you want really is lazy evaluation after all. Here's a library that's probably what you're looking for:
http://pypi.python.org/pypi/lazypy/0.5
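Picking up the generator suggestion above, here is a minimal sketch of a lazy, cached call built from a generator (my illustration, not taken from lazypy; lazy_call and expensive are made-up names):

def lazy_call(fn, *args):
    # nothing runs until the generator is first advanced
    value = fn(*args)  # evaluated exactly once, on the first next()
    while True:
        yield value    # every later next() returns the cached value

def expensive(z):
    print("computing...")
    return z * 2

f = lazy_call(expensive, 21)  # expensive() has not been called yet
print(next(f))  # prints "computing..." then 42
print(next(f))  # prints 42 only; the cached value is reused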
Just for completeness, here is a link to my lazy-evaluator decorator recipe:
https://bitbucket.org/jsbueno/metapython/src/f48d6bd388fd/lazy_decorator.py
Here's a pretty brief lazy decorator, though it lacks @functools.wraps (and actually returns an instance of Lazy, plus some other potential pitfalls):
class Lazy(object):
    def __init__(self, calculate_function):
        self._calculate = calculate_function

    def __get__(self, obj, _=None):
        if obj is None:
            return self
        value = self._calculate(obj)
        setattr(obj, self._calculate.func_name, value)
        return value

# Sample use:
class SomeClass(object):
    @Lazy
    def someprop(self):
        print 'Actually calculating value'
        return 13

o = SomeClass()
o.someprop
o.someprop
Curious why you don't just use a lambda in this scenario?
f = lambda: g(z)
if x:
    return 5
if y:
    return f()
if z:
    return h(f())
Even after your edit, and the series of comments with detly, I still don't really understand. In your first sentence, you say the first call to f() is supposed to call g(), but subsequently return cached values. But then in your comments, you say "g() doesn't get called no matter what" (emphasis mine). I'm not sure what you're negating: Are you saying g() should never be called (doesn't make much sense; why does g() exist?); or that g() might be called, but might not (well, that still contradicts that g() is called on the first call to f()). You then give a snippet that doesn't involve g() at all, and really doesn't relate to either the first sentence of your question, or to the comment thread with detly.
In case you go editing it again, here is the snippet I am responding to:
I have:

a = f(Z)
if x:
    return 5
elif y:
    return a
elif z:
    return h(a)

The code works, but I want to restructure it so that f(Z) is only called if the value is used. I don't want to change the definition of f(...), and Z is a bit big to cache.
If that is really your question, then the answer is simply
if x:
    return 5
elif y:
    return f(Z)
elif z:
    return h(f(Z))
That is how to achieve "f(Z) is only called if the value is used".
I don't fully understand "Z is a bit big to cache". If you mean there will be too many different values of Z over the course of program execution for memoization to be useful, then maybe you have to resort to precalculating all the values of f(Z) and just looking them up at run time. If you can't do this (because you can't know the values of Z that your program will encounter), then you are back to memoization. If that's still too slow, then your only real option is to use something faster than Python (try Psyco, Cython, ShedSkin, or a hand-coded C module).
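If precalculation is viable, a minimal sketch of that idea (possible_zs and f are illustrative placeholders for the known inputs and the expensive function):

lookup = {z: f(z) for z in possible_zs}  # pay the cost of f once, up front

def cheap_f(z):
    return lookup[z]  # constant-time dictionary lookup at run time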

Shortening an oft-used code segment for testing a return value in Python

Consider this Python segment:
def someTestFunction():
    if someTest:
        return value1
    elif someOtherTest:
        return value2
    elif yetSomeOtherTest:
        return value3
    return None

def SomeCallingFunction():
    a = someTestFunction()
    if a != None:
        return a
    ... normal execution continues
Now, the question: the three-line segment at the beginning of SomeCallingFunction, which gets the value of the test function and bails out if it's not None, is repeated very often in many other functions. Three lines is too long. I want to shorten it to one. How do I do that?
I can freely restructure this code and the contents of someTestFunction however needed. I thought of using exceptions, but those don't seem to help in cutting down the calling code length.
(I've read a bit about Python decorators, but haven't used them. Would this be the place? How would it work?)
If you want to use a decorator, it would look like this:
def testDecorator(f):
    def _testDecorator():
        a = someTestFunction()
        if a is None:
            return f()
        else:
            return a
    return _testDecorator

@testDecorator
def SomeCallingFunction():
    ... normal execution
When the module is first imported, it runs testDecorator, passing it your original SomeCallingFunction as a parameter. A new function is returned, and that gets bound to the SomeCallingFunction name. Now, whenever you call SomeCallingFunction, it runs that other function, which does the check, and returns either a, or the result of the original SomeCallingFunction.
I often use a hash table in place of a series of elifs:
def someTestFunction(decorated_test):
    options = {
        'val1': return_val_1,
        'val2': return_val_2
    }
    return options[decorated_test]
You can set up options as a defaultdict(lambda: None) (or simply use options.get(decorated_test)) to default to None if a key isn't found.
If you can't get your tests in that form, then a series of if statements might actually be the best thing to do.
One small thing you can do to shorten your code is to use this:
if a: return a
There may be other ways to shorten your code, but these are the ones I can come up with on the spot.
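One caveat worth noting (my addition, not part of the answer): if a: treats falsy results such as 0, '' or False as "no result", which is not quite the same as the original if a != None: check. Assuming that's acceptable, the shortened caller looks like this:

def SomeCallingFunction():
    a = someTestFunction()
    if a: return a  # bail out on any truthy result, not just non-None
    # ... normal execution continues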
I think this would do it:
UPDATE Fixed!
Sorry for yesterday, I rushed and didn't test the code!
def test_decorator(test_func):
    def tester(normal_function):
        def tester_inner():
            a = test_func()
            if a is not None:
                return a
            return normal_function()
        return tester_inner
    return tester

# usage:
@test_decorator(my_test_function)
def my_normal_function():
    # ... normal execution continues ...
It's similar to DNS's answer but allows you to specify which test function you want to use
