How can I call the custom-defined function in a python script from a bash shell?
I tried to use sys.argv[1], but it is not working properly.
For example:
import sys

if __name__ == '__main__':
    try:
        func = sys.argv[1]
    except IndexError:
        func = None

def function1():
    # ~~~~~~~~
    return a

def function2():
    # ~~~~~~~~
    return b
Here, I want to call function1 or function2 by typing something like:
$ script.py function1
$ script.py function2
You are getting the name of the function, but you are not running it. You should first check whether the name is one of your functions, then execute it:
if __name__ == '__main__':
    try:
        func = sys.argv[1]
    except IndexError:
        func = None

    functions = {
        "function1": function1,
        "function2": function2,
    }

    if func in functions:
        functions[func]()
A simpler solution:
if func == "function1":
    function1()
elif func == "function2":
    function2()
I suggest using the argparse module: https://docs.python.org/3/library/argparse.html#module-argparse
You will thank yourself later.
For your case - since you need to call only 1 function at the time - you can use positional arguments:
import argparse

def function1():
    print("1")

def function2():
    print("2")

parser = argparse.ArgumentParser()

F_MAP = {'function1': function1,
         'function2': function2}

parser.add_argument('function', choices=F_MAP.keys())
args = parser.parse_args()
F_MAP[args.function]()
As a bonus you get a nice help page when calling with the -h argument :)
@bigOTHER's answer is good and correct, but if you're looking to build a relatively complex text UI, maybe have a look at something like Click?
You can refer to Display a list of user defined functions in the Python IDLE session:
import types
list_function = [f for f in globals().values() if type(f) == types.FunctionType]
This returns a list of the available functions. You can then check whether any of their names matches sys.argv[1], and if so call the function:
list_function[index]()
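A minimal sketch of that approach, dispatching on the function's name rather than its index (the function names and return values here are placeholders):

```python
import sys
import types

def function1():
    return "one"

def function2():
    return "two"

if __name__ == '__main__':
    # Collect every plain function defined at module level, keyed by name
    funcs = {name: f for name, f in globals().items()
             if isinstance(f, types.FunctionType)}
    if len(sys.argv) > 1 and sys.argv[1] in funcs:
        print(funcs[sys.argv[1]]())
    else:
        print("No such function")
```

Matching on names instead of positions avoids depending on the (unspecified) order of globals().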
I would like to set
sys.argv
so I can unit test passing in different combinations. The following doesn't work:
#!/usr/bin/env python
import argparse, sys

def test_parse_args():
    global sys.argv
    sys.argv = ["prog", "-f", "/home/fenton/project/setup.py"]
    setup = get_setup_file()
    assert setup == "/home/fenton/project/setup.py"

def get_setup_file():
    parser = argparse.ArgumentParser()
    parser.add_argument('-f')
    args = parser.parse_args()
    return args.file

if __name__ == '__main__':
    test_parse_args()
Then running the file:
pscripts % ./test.py
File "./test.py", line 4
global sys.argv
^
SyntaxError: invalid syntax
pscripts %
Changing sys.argv at runtime is a pretty fragile way of testing. You should use mock's patch functionality, which can be used as a context manager to substitute one object (or attribute, method, function, etc.) with another, within a given block of code.
The following example uses patch() to effectively "replace" sys.argv with the specified return value (testargs).
import sys

try:
    # Python 3.4+ should use the builtin unittest.mock, not the mock package
    from unittest.mock import patch
except ImportError:
    from mock import patch

def test_parse_args():
    testargs = ["prog", "-f", "/home/fenton/project/setup.py"]
    with patch.object(sys, 'argv', testargs):
        setup = get_setup_file()
        assert setup == "/home/fenton/project/setup.py"
test_argparse.py, the official argparse unittest file, uses several means of setting/using argv:
parser.parse_args(args)
where args is a list of 'words', e.g. ['--foo', 'test'] or '--foo test'.split().
old_sys_argv = sys.argv
sys.argv = [old_sys_argv[0]] + args
try:
    return parser.parse_args()
finally:
    sys.argv = old_sys_argv
This pushes the args onto sys.argv.
I just came across a case (using mutually_exclusive_groups) where ['--foo','test'] produces different behavior than '--foo test'.split(). It's a subtle point involving the id of strings like test.
global only exposes global variables within your module, and sys.argv is in sys, not your module. Rather than using global sys.argv, use import sys.
You can avoid having to change sys.argv at all, though, quite simply: just let get_setup_file optionally take a list of arguments (defaulting to None) and pass that to parse_args. When get_setup_file is called with no arguments, that argument will be None, and parse_args will fall back to sys.argv. When it is called with a list, it will be used as the program arguments.
I like to use unittest.mock.patch(). The difference from patch.object() is that you don't need a direct reference to the object you want to patch; you use a string instead.
import sys
from unittest.mock import patch

with patch("sys.argv", ["file.py", "-h"]):
    print(sys.argv)
It doesn't work because you're not actually calling get_setup_file. Your code should read:
import argparse
import sys

def test_parse_args():
    sys.argv = ["prog", "-f", "/home/fenton/project/setup.py"]
    setup = get_setup_file()  # << You need the parentheses
    assert setup == "/home/fenton/project/setup.py"
I achieved this by creating a context manager that sets the args of my choice and removes them upon exit:
import sys
import unittest

class add_resume_flag(object):
    def __enter__(self):
        sys.argv.append('--resume')

    def __exit__(self, typ, value, traceback):
        sys.argv = [arg for arg in sys.argv if arg != '--resume']

class MyTestClass(unittest.TestCase):
    def test_something(self):
        with add_resume_flag():
            ...
Very good question.
The trick to setting up unit tests is all about making them repeatable. This means that you have to eliminate the variables, so that the tests are repeatable. For example, if you are testing a function that must perform correctly given the current date, then force it to work for specific dates, where the date chosen does not matter, but the chosen dates match in type and range to the real ones.
Here, sys.argv will be a list of length at least one. So create a "fakemain" that is called with a list, then test it with the various likely list lengths and contents. You can then call your fake main from the real one, passing sys.argv, knowing that fakemain works; or alter the "if __name__..." part to perform the normal function under non-unit-testing conditions.
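A sketch of that "fakemain" split (the function name and its behaviour are placeholders):

```python
import sys

def fakemain(argv):
    # All the real logic lives here and takes argv explicitly,
    # so tests can call it with any argument list they like.
    if len(argv) > 1:
        return "got {} arguments".format(len(argv) - 1)
    return "no arguments"

def test_fakemain():
    # Repeatable: the tests control argv completely
    assert fakemain(["prog"]) == "no arguments"
    assert fakemain(["prog", "-f", "setup.py"]) == "got 2 arguments"

if __name__ == '__main__':
    test_fakemain()
    # The real entry point just forwards the real sys.argv
    print(fakemain(sys.argv))
```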
You'll normally have command arguments. You need to test them. Here is how to unit test them.
Assume the program may be run like: % myprogram -f setup.py
We create a list to mimic this behaviour. See line (4).
Then our method that parses args takes an array as an argument, defaulting to None. See line (7).
Then on line (11) we pass this into parse_args, which uses the array if it isn't None. If it is None, it defaults to using sys.argv.
 1: #!/usr/bin/env python
 2: import argparse
 3: def test_parse_args():
 4:     my_argv = ["-f", "setup.py"]
 5:     setup = get_setup_file(my_argv)
 6:     assert setup == "setup.py"
 7: def get_setup_file(argv=None):
 8:     parser = argparse.ArgumentParser()
 9:     parser.add_argument('-f')
10:     # if argv is None then it will default to looking at sys.argv
11:     args = parser.parse_args(argv)
12:     return args.f
13: if __name__ == '__main__':
14:     test_parse_args()
You can attach a wrapper around your function, which prepares sys.argv before calling and restores it when leaving:
import sys

def run_with_sysargv(func, sys_argv):
    """Prepare the call with the given sys_argv and clean up afterwards."""
    def patched_func(*args, **kwargs):
        old_sys_argv = list(sys.argv)
        sys.argv = list(sys_argv)
        try:
            return func(*args, **kwargs)
        finally:
            # restore sys.argv whether or not func raised
            sys.argv = old_sys_argv
    return patched_func
Then you can simply do
def test_parse_args():
    _get_setup_file = run_with_sysargv(get_setup_file,
                                       ["prog", "-f", "/home/fenton/project/setup.py"])
    setup = _get_setup_file()
    assert setup == "/home/fenton/project/setup.py"
Because errors are propagated correctly, the wrapper should not interfere with testing tools such as pytest.
What's a good way to handle lots of parameters, using standard Python modules and techniques, when creating a function in a module that can be called from the command line or imported and called programmatically?
For example:
# my_thing.py
import argparse

def my_thing(
        param1=None, param2=None,
        param3=None, param4=None,
        param5=None, param6=None,
        param7=None, param8=None):
    # Do something with all those parameters
    pass

def main():
    parser = argparse.ArgumentParser()
    # add arguments
    args = parser.parse_args()
    my_thing(
        param1=args.param1, param2=args.param2,
        param3=args.param3, param4=args.param4,
        param5=args.param5, param6=args.param6,
        param7=args.param7, param8=args.param8)

if __name__ == "__main__":
    main()
or maybe this...
# my_thing.py
import argparse

def my_thing(params):
    # Do something with all those parameters
    pass

def main():
    parser = argparse.ArgumentParser()
    # add arguments
    args = parser.parse_args()
    params = {
        "param1": args.param1, "param2": args.param2,
        "param3": args.param3, "param4": args.param4,
        "param5": args.param5, "param6": args.param6,
        "param7": args.param7, "param8": args.param8}
    my_thing(params)

if __name__ == "__main__":
    main()
It may not be the BEST way, but you could store all the parameters in a dictionary, or order them in a list.
# dictionary (renamed the parameters to avoid shadowing the dict/list builtins)
def my_thing(param_dict):
    param_1 = param_dict['param_1']
    param_i = param_dict['param_i']

# list
def my_thing(param_list):
    param_1 = param_list[0]
    # ...
    param_i = param_list[i]
A better way would be to create a wrapper object to encapsulate the parameters, but none of those really help for ease of creating new instances.
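One way to sketch that wrapper-object idea is with a dataclass; the field names and types below are purely illustrative:

```python
from dataclasses import dataclass

@dataclass
class Params:
    # Illustrative fields; replace with the real parameter names
    param1: str = ""
    param2: int = 0
    param3: bool = False

def my_thing(params: Params):
    # The function receives one object instead of many arguments
    return "{}-{}-{}".format(params.param1, params.param2, params.param3)

# Unspecified fields fall back to the dataclass defaults
p = Params(param1="a", param2=2)
print(my_thing(p))
```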
To create new instances quickly, it may help to store the parameters in a .txt or .csv file and parse the file for the different parameters. This also makes it easy to run from the command line, because you can simply pass the file as one of the arguments.
python my_script.py my_parameters.txt
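For instance, with the parameters stored one per line as key=value pairs, a small loader could look like this (the file format is just one possible convention, and load_params is a hypothetical helper):

```python
def load_params(path):
    # Parse simple "key=value" lines into a dict; skip blanks and comments
    params = {}
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if line and not line.startswith("#"):
                key, _, value = line.partition("=")
                params[key.strip()] = value.strip()
    return params

# Usage: python my_script.py my_parameters.txt
#   where my_parameters.txt contains lines like "param1 = foo"
```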
You can actually use a third option: passing the __dict__ attribute of the Namespace object that parser.parse_args() returns into my_thing. object.__dict__ accesses the underlying dictionary that all objects use to store their attributes. In this case, the attributes of the Namespace object are the command line arguments that are provided to the script.
# my_thing.py
import argparse

def my_thing(params):
    print(params)

def main():
    parser = argparse.ArgumentParser()
    # add arguments
    args = parser.parse_args()
    my_thing(args.__dict__)

if __name__ == "__main__":
    main()
How about using keyword-only arguments?
For example:
import argparse

def my_thing(*, param1, param2, param3):
    # Do something with all those parameters
    pass

def main():
    parser = argparse.ArgumentParser()
    # add arguments
    args = parser.parse_args()
    # see https://stackoverflow.com/q/16878315/5220128
    my_thing(**vars(args))

if __name__ == "__main__":
    main()
For example, I have two python files, 'test1.py' and 'test2.py'. I want to import test2 into test1, so that when I run test1, it also runs test2.
However, in order to run properly, test2 requires an input argument. Normally when I run test2 from outside of test1, I just type the argument after the file call in the command line. How do I accomplish this when calling test2 from within test1?
Depending on whether you can edit test2.py, there are two options:
(Possible to edit) Pack the test2.py content into a class and pass the args in __init__.
in test1.py:

from test2 import test2class

t2c = test2class(neededArgumentGoHere)
t2c.main()

in test2.py:

class test2class:
    def __init__(self, neededArgumentGoHere):
        self.myNeededArgument = neededArgumentGoHere

    def main(self):
        # do stuff here
        pass

# to run it from the console like a simple script, use
if __name__ == "__main__":
    t2c = test2class(neededArgumentGoHere)
    t2c.main()
(Not possible to edit test2.py) Run test2.py as a subprocess. Check the subprocess docs for more info on how to use it.
test1.py
from subprocess import call

call(['path/to/python', 'test2.py', 'neededArgumentGoHere'])
Assuming you can define your own test1 and test2 and that you are OK with using argparse (which is a good idea anyway):
The nice thing with using argparse is that you can let test2 define a whole bunch of default parameter values that test1 doesn't have to worry about. And, in a way, you have a documented interface for test2's calling.
Cribbed off https://docs.python.org/2/howto/argparse.html
test2.py
import argparse

def get_parser():
    "separate out parser definition in its own function"
    parser = argparse.ArgumentParser()
    parser.add_argument("square", help="display a square of a given number")
    return parser

def main(args):
    "define a main as the test1 => test2 entry point"
    print(int(args.square)**2)

if __name__ == '__main__':
    "standard test2 from command line call"
    parser = get_parser()
    args = parser.parse_args()
    main(args)
audrey:explore jluc$ python test2.py 3
9
test1.py
import test2
import sys

# ask test2 for its parser
parser = test2.get_parser()

try:
    # you can use sys.argv here if you want
    square = sys.argv[1]
except IndexError:
    # argparse expects strings, not ints
    square = "5"

# parse the args for test2 based on what test1 wants to do
# by default parse_args uses sys.argv, but you can provide a list
# of strings yourself.
args = parser.parse_args([square])

# call test2 with the parsed args
test2.main(args)
audrey:explore jluc$ python test1.py 6
36
audrey:explore jluc$ python test1.py
25
You can use the call function or the Popen class from the subprocess module. Both take the command as a list of strings:

from subprocess import call, Popen

call([file] + args)
Popen([file] + args)
I am trying to understand the usage of the @main decorator in Python.
With the below python program,
def cube(x):
    return x * x * x

def run_tests():
    printf("Should be 1:", cube(1))
    printf("Should be 8:", cube(2))
    printf("Should be 27:", cube(3))

@main
def main():
    print("Starting")
    run_tests()
    print("Ending.")
I get the following error:
PS C:\Users\MOHET01\Desktop> python.exe -i .\cube.py
Traceback (most recent call last):
  File ".\cube.py", line 9, in <module>
    @main
NameError: name 'main' is not defined
>>>
The main function that is imported from ucb is shown below:
def main(fn):
    """Call fn with command line arguments. Used as a decorator.

    The main decorator marks the function that starts a program. For example,

    interact()

    @main
    def my_run_function():
        # function body

    Use this instead of the typical __name__ == "__main__" predicate.
    """
    if inspect.stack()[1][0].f_locals['__name__'] == '__main__':
        args = sys.argv[1:]  # Discard the script name from command line
        print(args)
        print(*args)
        print(fn)
        fn(*args)  # Call the main function
    return fn
My question: despite defining a function with the name main, why do I see this error?
I should use this:
def main():
    # Do something
    pass

if __name__ == "__main__":
    # Here call the function that acts as the main entry point
    main()
I hope this helps
The @main decorator is implemented in a file your course provides, but you have not imported it. The page you linked says to use
from ucb import main, interact
to import the ucb.py features in your program.
As for why the error says name 'main' is not defined, that's because the function definition doesn't actually finish defining anything until the decorators execute. The reuse of the name main for both the decorator and the decorated function is confusing; the main in @main is a different function from the main you're defining in def main(): .... The main in @main is defined to run the decorated function if the file is run as a script, while the main in def main(): ... is the function to be run.
I would strongly recommend not using anything like this decorator when you don't have to. The standard way to perform the task the decorator performs is to write
if __name__ == '__main__':
    whatever_function_you_would_have_put_the_decorator_on()
or if you want to handle command line arguments like the decorator would,
if __name__ == '__main__':
    import sys
    whatever_function_you_would_have_put_the_decorator_on(*sys.argv[1:])
The decorator is an attempt to hide the issues of sys.argv and __name__ so you don't have to know about them, but it has a problem. If you try to write something like this:
@main
def hello():
    print(hello_string)

hello_string = 'Hi there.'
you'll get a NameError, because hello_string won't be assigned until after the decorator runs. If you continue to write Python beyond this course, you'll find that using if __name__ == '__main__' is less bug-prone and more understandable to other programmers than using a decorator for this.
You are using the function before it is defined. In other words, you need to define the main function higher up in the file than where you use it as a decorator:
def main():
    pass

@main
def somefunction():
    pass
The @main notation means the main function is being used to "decorate", or modify, another function. There are various articles on Python decorators:
http://simeonfranklin.com/blog/2012/jul/1/python-decorators-in-12-steps/
http://www.artima.com/weblogs/viewpost.jsp?thread=240808
http://www.jeffknupp.com/blog/2013/11/29/improve-your-python-decorators-explained/
You can only use a decorator on a different function. Example:
def foo(f):
    def inner():
        print("before")
        f()
        print("after")
    return inner

@foo
def bar():
    print("bar")

if __name__ == "__main__":
    bar()
Output:
before
bar
after
I want to do the following:
I have a class which should provide several functions, which need different inputs. And I would like to use these functions from within other scripts, or solely from commandline.
E.g. I have the class "test". It has a function "quicktest" (which basically just prints something). From the command line, I want to be able to do:
$ python test.py quicktest "foo" "bar"
where quicktest is the name of the function, and "foo" and "bar" are the arguments.
Also (from within another script) I want to
from test import test
# this
t = test()
t.quicktest(["foo1", "bar1"])
# or this
test().quicktest(["foo2", "bar2"])
I just can't get that to work. I managed to write a class for the first requirement and one for the second, but not for both. The problem is that I sometimes have to call the functions via (self) and sometimes not, and I also have to provide the given parameters every time, which is kind of complicated.
So, does anybody have an idea for that?
This is what I already have:
Works only from commandline:
class test:
    def quicktest(params):
        pprint(params)

    if (__name__ == '__main__'):
        if (sys.argv[1] == "quicktest"):
            quicktest(sys.argv)
        else:
            print "Wrong call."
Works only from within other scripts:
class test:
    _params = sys.argv

    def quicktest(self, params):
        pprint(params)
        pprint(self._params)

    if (__name__ == '__main__'):
        if (sys.argv[1] == "quicktest"):
            quicktest()
        else:
            print "Wrong call"
Try the following (note the different indentation; the if __name__ part is no longer part of class test):
class test:
    def quicktest(self, params):
        pprint(params)

if __name__ == '__main__':
    if sys.argv[1] == "quicktest":
        testObj = test()
        testObj.quicktest(sys.argv)
    else:
        print "Wrong call."
from other scripts:
from test import test
testObj = test()
testObj.quicktest(...)
The if __name__ == '__main__': block needs to be at the top level:
class Test(object):  # Python class names are capitalized and should inherit from object
    def __init__(self, *args):
        # parse args here so you can import and call with options too
        self.args = args

    def quicktest(self):
        return 'ret_value'

if __name__ == '__main__':
    test = Test(sys.argv[1:])
You can parse the command line with the help of argparse, and associate each subcommand with a method of your class.
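A sketch of that idea, using argparse subcommands to dispatch to methods of the class (the class and method names follow the question; the dispatch via getattr is one possible approach, and requires Python 3.7+ for required=True on add_subparsers):

```python
import argparse

class Test:
    def quicktest(self, params):
        # Placeholder body; the question's version pretty-prints the params
        return "quicktest called with {}".format(params)

def main(argv=None):
    parser = argparse.ArgumentParser()
    sub = parser.add_subparsers(dest="command", required=True)

    # One subparser per public method; "quicktest foo bar" parses cleanly
    quick = sub.add_parser("quicktest")
    quick.add_argument("params", nargs="*")

    args = parser.parse_args(argv)

    # Dispatch the chosen subcommand to the method of the same name
    t = Test()
    return getattr(t, args.command)(args.params)

if __name__ == "__main__":
    print(main())
```

Because main takes an optional argv list, the same entry point works from the command line (python test.py quicktest foo bar) and from another script (main(["quicktest", "foo"])), while Test().quicktest([...]) remains directly importable.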