Python - Proper way to call nested functions?

Let's say you have the following code:

def set_args():
    # Set arguments
    parser = argparse.ArgumentParser()
    parser.add_argument("-test", help='test')
    return parser

def run_code():
    def fileCommand():
        print("the file type is\n")
        os.system("file " + filename).read().strip()

def main():
    parser = set_args()
    args = parser.parse_args()
What is the best way to call that fileCommand() function from main()?
I have the following code, which of course doesn't work:

def main():
    parser = set_args()
    args = parser.parse_args()
    # If -test was given
    if args.test:
        filename = args.test
        fileCommand()
So if you were to run python test.py -test test.txt, it should run the file command on it to get the file type.
I know if I keep messing around I'll get it working one way or another, but I typically start to overcomplicate things, which makes everything harder down the line. So what is the proper way to call nested functions?
Thanks in advance!

Python inner functions (or nested functions) have many use cases, but in none of them is accessing the nested function from outside the parent an option. A nested function is created to have access to the parent function's variables; calling it from outside the parent function conflicts with that principle.
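A minimal sketch of that principle (the names outer and inner are just illustrative):

def outer():
    def inner():
        print("inner() can see outer's variables")
    inner()   # fine: called from inside the parent

outer()       # prints the message
inner()       # NameError: name 'inner' is not defined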
You can call the function from the parent function or make it a decorator:
Calling the nested function from the parent would be:
def run_code(filename):
    def fileCommand():
        print("the file type is\n")
        # os.system() only returns an exit status, so use os.popen() to read the command's output
        print(os.popen("file " + filename).read().strip())

    fileCommand()
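A minimal sketch of how main() could then tie it together, reusing set_args() from the question (only the call wiring is new):

def main():
    parser = set_args()
    args = parser.parse_args()
    # pass the filename down to the parent function, which calls the nested fileCommand()
    if args.test:
        run_code(args.test)

if __name__ == "__main__":
    main()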
If you describe more about your use case and the reason why you want to run the code like this, I can give more details about how you can implement your code.

Related

Find and call any function by function name in a string

It is a tricky question; I need to know one thing...
I have two functions with different functionality, and a third function that decides which of the two to use. That decision is passed as an argument. The code below should make it clearer.
# Present in project/testing/local/funtion_one.py
def testing_function_one(par1, par2, par3):
    """do something, maybe add all param values"""
    sum_params = par1 + par2 + par3
    return sum_params

# Present in project/testing/local/funtion_two.py
def testing_function_two(par1, par2, par3, par4, par5):
    """do something, maybe add all param values"""
    sum_params = par1 + par2 + par3 + par4 + par5
    return sum_params

# Present in project/testing/function_testing.py
def general_function_testing(function_name, function_path, funtion_params, extra_params):
    """
    function_name: would be any function, testing_function_one or testing_function_two
    function_path: path to where the function is located.
    funtion_params: arguments for that calling function.
    """
Now, based on the details above, how do I call the required function using its path, pass the params to it, and handle passing the right number of params for that particular function?
I am looking for something like:

funt_res = function_name(funtion_params)
# After getting the result, do something with the other params.
new_res = funt_res * extra_params

if __name__ == "__main__":
    function_name = "testing_function_two"
    function_path = "project/testing/local/funtion_two.py"
    # values to pass to the testing_function_two function, e.g.:
    funtion_params = {"par1": 2, "par2": 2, "par3": 4, "par4": 6, "par5": 8}
    extra_params = 50
    res = general_function_testing(function_name, function_path,
                                   funtion_params, extra_params)
Tried:

# This part works only when calling_funtion_name is present in the same file,
# otherwise it gives an error. For me it should check the whole project or a specified path.
f_res = globals()["calling_funtion_name"](*args, **kwargs)
print('f_ress', f_res)
Anyone can try this one...
If the above is not clear, let me know and I will try to explain with other examples.
Though it is possible, in Python one rarely needs to pass a function by its name as a string, especially if the desired result is for the function to be called at its destination. The reason is that functions are themselves "first class objects" in Python, and can be assigned to new variable names (which will simply reference the function) and be passed as arguments to other functions.
So, if one wants to pass sin from the module math to be used as a numeric function inside some other code, instead of general_function_testing('sin', 'math', ...) one can simply write:
import math
general_function_testing(math.sin, ...)
And the function called with this parameter can simply use whatever name it has for the parameter to call the passed function:
def general_function_testing(target_func, ...):
    ...
    result = target_func(argument)
    ...
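Putting the two pieces together, a minimal runnable sketch (here general_function_testing simply calls the function it receives with a single argument; the names are placeholders):

import math

def general_function_testing(target_func, argument):
    # target_func is an ordinary function object, so it can be called directly
    return target_func(argument)

print(general_function_testing(math.sin, 0.0))  # prints 0.0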
While it is possible to retrieve a function from its name and module name as strings, it's much more cumbersome due to nested packages: the code retrieving the function would have to take care of any "."s in the "function path" as you call it, make careful use of the built-in __import__ (which allows one to import a module given its name as a string, though it has a weird API), and then retrieve the function from the module using a getattr call. And all this just to get a reference to the function object itself, which could have been passed as a parameter from the very first moment.
The example above doing it via strings could be:
import sys

def general_function_testing(func_name, func_path, ...):
    ...
    __import__(func_path)  # imports the module where the function lives, func_path being a string
    module = sys.modules[func_path]  # retrieves the module object itself
    target_func = getattr(module, func_name)
    result = target_func(argument)
    ...
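As a side note, importlib from the standard library offers a friendlier spelling of the same string-based lookup; a sketch, assuming func_path is a dotted module name such as 'project.testing.local.funtion_two' rather than a file path (call_by_name is just an illustrative helper name):

import importlib

def call_by_name(func_name, func_path, *args, **kwargs):
    module = importlib.import_module(func_path)  # import the module from its dotted name
    target_func = getattr(module, func_name)     # look the function up by name
    return target_func(*args, **kwargs)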

Dynamic function creation using globals() in python

I am trying to call a dynamic function created using exec(). After calling it through globals(), pytest treats the params as fixtures and returns the error fixture 'Template_SI' not found.
Can someone help with how to pass dynamic parameters using globals()[function_name](params)?
import pytest
import input_csv

datalist = input_csv.csvdata()

def display():
    return 10 + 5

for data in datalist:
    functionname = data['TCID']
    parameters = [data['Template_name'], data['File_Type']]
    body = 'print(display())'

    def createfunc(name, *params, code):
        exec('''
@pytest.mark.regression
def {}({}):
    {}'''.format(name, ', '.join(params), code), globals(), globals())

    createfunc(functionname, data['Template_name'], data['File_Type'], code=body)
    templateName = data['Template_name']
    fileType = data['File_Type']
    globals()[functionname](templateName, fileType)
It looks like you're trying to automate the generation of lots of different tests based on different input data. If that's the case, using exec is probably not the best way to go.
pytest provides parameterized tests: https://docs.pytest.org/en/6.2.x/example/parametrize.html
which accomplish test generation by hiding all the details inside Metafunc.parametrize().
If you really want to generate the tests yourself, consider adapting Metafunc to your own purposes, or alternatively look into the unittest framework.
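For reference, a minimal parametrize sketch along those lines, reusing the CSV columns from your snippet (the test body is only a placeholder):

import pytest
import input_csv

datalist = input_csv.csvdata()

@pytest.mark.regression
@pytest.mark.parametrize(
    "template_name, file_type",
    [(data['Template_name'], data['File_Type']) for data in datalist],
)
def test_template(template_name, file_type):
    # each (Template_name, File_Type) pair from the CSV becomes its own test case
    assert template_name and file_type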

How python calls functions from main [duplicate]

This question already has answers here:
Calling functions by array index in Python
(5 answers)
Closed 1 year ago.
I am new to Python and still exploring it. I have many different functions in one file and want to expose those functions to a client.
This is my app.py file:

import sys

def get_data(e_id, t_id):
    # some code

def get_2_data(e_id, t_id):
    # some code

def get_3_data(t_id):
    # some code

if __name__ == '__main__':
    get_data(sys.argv[1], sys.argv[2])
Here I want to get the data from a specific function. Currently I am running python app.py 1234 1.
The function that is called under main gets executed.
But I want the get_3_data() data, or someone else may want to fetch get_2_data(). How do I call a particular function, and how do I expose those functions? Any suggestion? I don't want any HTTP call or API; I want to call the function by name.
The clean way to do this is by using the argparse module.
Here is how to do it:
import argparse

def get_data(args):
    e_id = args.e_id
    t_id = args.t_id
    print(f'Function get_data. e_id: {e_id}. t_id: {t_id}')

def get_data_2(args):
    e_id = args.e_id
    t_id = args.t_id
    print(f'Function get_data_2. e_id: {e_id}. t_id: {t_id}')

def get_data_3(args):
    t_id = args.t_id
    print(f'Function get_data_3. t_id: {t_id}')

if __name__ == '__main__':
    # Create the arguments parser
    argparser = argparse.ArgumentParser()
    # Create the subparsers
    subparsers = argparser.add_subparsers()

    # Add a parser for the first function
    get_data_parser = subparsers.add_parser('get_data')
    # Set the function to call
    get_data_parser.set_defaults(func=get_data)
    # Add its arguments
    get_data_parser.add_argument('e_id')
    get_data_parser.add_argument('t_id')

    # Add a parser for the second function
    get_data_2_parser = subparsers.add_parser('get_data_2')
    # Set the function to call
    get_data_2_parser.set_defaults(func=get_data_2)
    # Add its arguments
    get_data_2_parser.add_argument('e_id')
    get_data_2_parser.add_argument('t_id')

    # Add a parser for the third function
    get_data_3_parser = subparsers.add_parser('get_data_3')
    # Set the function to call
    get_data_3_parser.set_defaults(func=get_data_3)
    # Add its arguments
    get_data_3_parser.add_argument('t_id')

    # Get the arguments from the command line
    args = argparser.parse_args()
    # Call the selected function
    args.func(args)
As shown in this example, you will have to change your functions a little bit:
They will take only one argument, called args (created in the main block with args = argparser.parse_args()).
Then, in each function, you get the needed parameters by their names (the ones added with add_argument, as in get_data_3_parser.add_argument('t_id')). So, to get the argument called t_id, you write t_id = args.t_id.
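Assuming the code above is saved as app.py, running python app.py get_data_3 1234 would then call get_data_3 with args.t_id set to '1234', and python app.py get_data 1234 1 would call get_data with e_id='1234' and t_id='1'.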
argparse documentation: https://docs.python.org/3/library/argparse.html
argparse tutorial: https://docs.python.org/3/howto/argparse.html
From the documentation:
Many programs split up their functionality into a number of
sub-commands, for example, the svn program can invoke sub-commands
like svn checkout, svn update, and svn commit. Splitting up
functionality this way can be a particularly good idea when a program
performs several different functions which require different kinds of
command-line arguments. ArgumentParser supports the creation of such
sub-commands with the add_subparsers() method.

Calling function through string

I have a Python main program that imports another module (called actions) with multiple functions. The main program should run some things, get a string (e.g. goto(114)) and then run actions.goto(114), in which 114 is the argument to the function goto(x) in actions.
I've tried the obvious, which was just trying to run the string, but that did not work. I've also found the globals() method, which would work if goto(x) were inside my main module, and I've also found the getattr method, but in this case I haven't found any example in which I pass both the function name and the argument, so I'm kind of lost here.
# main.py
import actions

def main():
    getc = 'goto(114)'
    result = actions.getc  # this would be actions.goto(114)
    print result

# actions.py
def goto(x):
    # code
    return something
The actual program gets the string from a .txt file that another program wrote; I just made the example that way so that it's simple to understand.
One option you could use is __getattribute__ on the action module to get the function goto, then call it with the parsed argument. You'd need to parse the string like so:
import re
import action

getc = 'goto(114)'
func, arg = re.search(r'(\w+)\((\d+)\)', getc).groups()

# __getattribute__ allows you to look up a module attribute by a string name;
# f is the function action.goto, still uncalled
f = action.__getattribute__(func)

# now you can just call it with the arg converted to int
result = f(int(arg))
The regex might need to be refined a bit, but it's looking for the name of the calling function, and the arguments wrapped in parentheses. The __getattribute__ will get the function object from action and return it uncalled so you can call it later.
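Note that getattr(action, func) is the more idiomatic spelling of action.__getattribute__(func); both perform the same attribute lookup, and the second example below uses getattr.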
For multiple arguments you can leverage the ast library:
import re
import ast

# I'm going to use a sample class as a stand-in for action
class action:
    def goto(*args):
        print(args)

getc = 'goto(114, "abc", [1,2,3])'
func, args = re.search(r'(\w+)\((.*)\)', getc).groups()

# ast.literal_eval will convert this into Python data structures
# safely, and will fail if its argument is not a Python literal
args = ast.literal_eval('(%s)' % args)

f = getattr(action, func)
f(*args)
# (114, 'abc', [1, 2, 3])
The easier option (proceed with caution) would be to use eval:
cmd = 'action.%s' % getc
result = eval(cmd)
Note that this is considered bad practice in the Python community, though there are examples in the standard library that use it. It is not safe for unvalidated input and is easily exploited if you don't control the source file.

How to set argparse arguments from python script

I have a main function specified as the entry point in my package's setup.py, which uses the argparse package to parse command-line arguments (see discussion here):
# file with main routine specified as entry point in setup.py
import argparse

def main():
    parser = argparse.ArgumentParser()
    parser.add_argument('a', type=str, help='mandatory argument a')
    args = parser.parse_args()
Ideally, I would like to use the same main function in the package's tests as suggested here. In the latter context, I would like to call the main function from within the test class and set (some of) the command line arguments prior to the function call (which otherwise will fail, due to missing arguments).
# file in the tests folder calling the above main function
class TestConsole(TestCase):
    def test_basic(self):
        set_value_of_a()
        main()
Is that possible?
The argparse module actually reads its input from a special variable called ARGV (short for ARGument Vector). This variable is usually accessed by reading sys.argv from the sys module.
This variable is an ordinary list, so you can append your command-line parameters to it like this:

import sys

sys.argv.append(SOME_VALUE)  # 'a' is a positional argument, so appending its value is enough
main()
However, messing with sys.argv at runtime is not a good way of testing.
A much cleaner way to replace sys.argv for some limited scope is to use the unittest.mock.patch context manager, like this:

with unittest.mock.patch('sys.argv', ['prog', SOME_VALUE]):
    main()

Read more about unittest.mock.patch in the documentation.
Also, check this SO question:
How do I set sys.argv so I can unit test it?
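For completeness, a small sketch of how that patch could look inside the test class, assuming main() is importable from your package (the module name my_package is just a placeholder) and that 'a' is the positional argument from the question:

import unittest
from unittest import mock

from my_package import main  # placeholder import path

class TestConsole(unittest.TestCase):
    def test_basic(self):
        # argv[0] is normally the program name, so keep a dummy value in that slot
        with mock.patch('sys.argv', ['prog', 'some_value']):
            main()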
@William Fernandes: Just for the sake of completeness, I'll post the full solution in the way that was suggested (checking for an empty dict, not kwargs is None):
def main(**kwargs):
    a = None
    if not kwargs:
        parser = argparse.ArgumentParser()
        parser.add_argument('a', type=str, help='mandatory argument a')
        args = parser.parse_args()
        a = args.a
    else:
        a = kwargs.get('a')
    print(a)
From within the test class the main function can then be called with arguments:
# file in the tests folder calling the above main function
class TestConsole(TestCase):
    def test_basic(self):
        main(a=42)
A call from the command line without kwargs then requires the command-line argument a to be specified.
Add kwargs to main, and if they're empty, fall back to parse_args to set them.
