Python - passing parameters in a command line app

I need to write a command line application, like a shell. So it will include commands etc. The thing is, I don't know how to pass parameters to the functions in a module. For example:
User writes: function1 folder1
The program should now pass the 'folder1' parameter to the function1 function and run it. But it also has to support other functions with different parameters, e.g.:
User input: function2 folder2 --exampleparam
How do I make this work? I mean, I could just write a module, import it in Python and just use the Python console, but this is not the case. I need a script that takes command input and runs it.
I tried to use eval(), but that doesn't solve the problem with params. Or maybe it does but I don't see it?

The first part of your problem -- parsing the command line -- can be solved with argparse.
The second part -- converting the string name of a function into a function call -- can be done with exec or with a dispatching dict which maps strings to function objects.
I would recommend NOT using exec for this, since allowing a user to call arbitrary Python functions from the command line might be dangerous. Instead, make a whitelist of allowable functions:
import argparse

def foo(path):
    print('Running foo(%r)' % (path,))

def bar(path):
    print('Running bar(%r)' % (path,))

dispatch = {
    'foo': foo,
    'bar': bar,
}

parser = argparse.ArgumentParser()
parser.add_argument('function')
parser.add_argument('arguments', nargs='*')
args = parser.parse_args()
dispatch[args.function](*args.arguments)
% test.py foo 1
Running foo('1')
% test.py bar 2
Running bar('2')
% test.py baz 3
KeyError: 'baz'
The above works when the command is typed into the command-line itself. If the command is being typed into stdin, then we'll need to do something a bit different.
A simple way would be to call raw_input to grab the string from stdin. We could then parse the string with argparse, as we did above:
shmod.py:
import argparse

def foo(path):
    print('Running foo(%r)' % (path,))

def bar(path):
    print('Running bar(%r)' % (path,))

dispatch = {
    'foo': foo,
    'bar': bar,
}

def parse_args(cmd):
    parser = argparse.ArgumentParser()
    parser.add_argument('function')
    parser.add_argument('arguments', nargs='*')
    args = parser.parse_args(cmd.split())
    return args
main.py:
import shmod

while True:
    cmd = raw_input('> ')
    args = shmod.parse_args(cmd)
    try:
        shmod.dispatch[args.function](*args.arguments)
    except KeyError:
        print('Invalid input: {!r}'.format(cmd))
Another, more sophisticated way to handle this is to use the cmd module, as @chepner mentioned in the comments.
from cmd import Cmd

class MyInterpreter(Cmd):
    prompt = '> '

    def do_prompt(self, line):
        "Change the interactive prompt"
        self.prompt = line + ': '

    def do_EOF(self, line):
        return True

    def do_foo(self, line):
        print('Running foo {l}'.format(l=line))

    def do_bar(self, line):
        print('Running bar {l}'.format(l=line))

if __name__ == '__main__':
    MyInterpreter().cmdloop()
For more information on how to use the cmd module, see Doug Hellmann's excellent tutorial.
Running the code above yields a result like this:
% test.py
> foo 1
Running foo 1
> foo 1 2 3
Running foo 1 2 3
> bar 2
Running bar 2
> baz 3
*** Unknown syntax: baz 3

optparse has been deprecated since Python 2.7, and argparse is much more flexible anyway.
unutbu's approach is safe, but if you go with a whitelist, I would suggest letting the user know which functions are accepted:
dispatch = {
    'foo': foo,
    'bar': bar,
}

parser = argparse.ArgumentParser()
parser.add_argument('function', choices=dispatch.keys())
FYI: if the parsing is not too complicated, docopt looks like a very nice package
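For illustration, a minimal sketch of what a docopt version might look like for the original question; the usage text and the <folder> argument name are made up here, and docopt needs to be installed separately:
"""Usage:
  shell.py function1 <folder>
  shell.py function2 <folder> [--exampleparam]
"""
from docopt import docopt

if __name__ == '__main__':
    # docopt parses sys.argv against the usage string above and returns a dict,
    # e.g. {'function1': True, '<folder>': 'folder1', '--exampleparam': False, ...}
    arguments = docopt(__doc__)
    print(arguments)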

How about sys.argv? For more advanced stuff check out argparse. optparse seems deprecated now, but there are plenty of answers about it here.

Take a look at the optparse module in Python. It's exactly what you need:
http://docs.python.org/2/library/optparse.html
Or you can write your own custom opt parser (a minimalistic one, though):
def getopts(argv):
    opts = {}
    while argv:
        if argv[0][0] == '-':        # find "-name value" pairs
            opts[argv[0]] = argv[1]  # dict key is "-name" arg
            argv = argv[2:]
        else:
            argv = argv[1:]
    return opts

if __name__ == '__main__':
    from sys import argv   # example client code
    myargs = getopts(argv)
    # DO something based on your logic here
But in case your script needs to run on Python 3 and beyond, you should consider the argparse module.
Hope that helps.

Take a look at optparse. It helps with passing and receiving shell-style parameters in Python scripts.
Update:
Apparently optparse is deprecated now and argparse is the preferred option for parsing command line arguments.
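For reference, a minimal optparse sketch along the lines of the original question (the --exampleparam flag simply mirrors the question; keep the deprecation note above in mind):
from optparse import OptionParser

parser = OptionParser(usage="usage: %prog [options] folder")
parser.add_option("-e", "--exampleparam", action="store_true",
                  default=False, help="an example flag")
(options, args) = parser.parse_args()
# options.exampleparam holds the flag, args holds the positional values
print(options.exampleparam, args)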

import sys

def main(arg):
    return arg

print main(sys.argv[1])
Here sys.argv[0] is the .py file you're running, and all the elements after it are your arguments. You could check the length of the list, then iterate through it, parse the items as necessary and pass the correct things to each function.
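As a rough sketch of that idea (the function1 name and the folder argument are just placeholders taken from the question):
import sys

def function1(folder):
    print('function1 called with', folder)

if len(sys.argv) > 1:
    name, params = sys.argv[1], sys.argv[2:]
    if name == 'function1':
        function1(*params)
    else:
        print('unknown command:', name)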

Related

SystemExit 2 error: loading arguments from text file [duplicate]

I'm trying to use the fromfile-prefix-chars feature of argparse in Python to load all my command line arguments from a file, but it keeps complaining that I haven't specified some argument.
The code:
import argparse

def go():
    parser = argparse.ArgumentParser(fromfile_prefix_chars='#')
    parser.add_argument("--option1")
    parser.add_argument("--option2", type=int, required=True)
    args = parser.parse_args()

if __name__ == "__main__":
    go()
The argument file:
--option1 foo
--option2 1234
The command line and output:
$ python testargparse.py #testargs
usage: testargparse.py [-h] [--option1 OPTION1] --option2 OPTION2
testargparse.py: error: argument --option2 is required
You can see that I'm providing the required argument in the file, but argparse isn't seeing it.
From the documentation:
Arguments read from a file must by default be one per line ... and are treated as if they were in the same place as the original file referencing argument on the command line. So in the example above, the expression ['-f', 'foo', '#args.txt'] is considered equivalent to the expression ['-f', 'foo', '-f', 'bar'].
In the example:
fp.write('-f\nbar')
So the file contains:
-f
bar
In other words, each of the file lines corresponds to one 'word' (blank separated) in the command line. --option1=foo is one word. --option1 foo is interpreted just as though it was quoted in the command line, e.g. prog.py '--option1 foo' '--option2 1234'
The documentation at https://docs.python.org/dev/library/argparse.html#argparse.ArgumentParser.convert_arg_line_to_args shows a custom function that splits lines on spaces. Experiment with that if you want to stick with the argument file.
import argparse

with open('args.txt', 'w') as fp:
    fp.write('--option1 foo\n--option2 1234')      # error
    # but works with a modified 'convert...'
    #fp.write('--option1=foo\n--option2=1234')     # works
    #fp.write('--option1\nfoo\n--option2\n1234')   # works

def convert_arg_line_to_args(arg_line):
    for arg in arg_line.split():
        if not arg.strip():
            continue
        yield arg

"""
default line converter:
def convert_arg_line_to_args(self, arg_line):
    return [arg_line]
"""

def go():
    parser = argparse.ArgumentParser(fromfile_prefix_chars='#')
    parser.convert_arg_line_to_args = convert_arg_line_to_args
    parser.add_argument("--option1")
    parser.add_argument("--option2", type=int, required=True)
    args = parser.parse_args(['#args.txt'])
    print args

if __name__ == "__main__":
    go()
@toes suggested using shlex to parse the file. shlex has a nice feature in that it strips off unnecessary quotes.
shlex can be used to split individual lines of the file:
def sh_split(arg_line):
    for arg in shlex.split(arg_line):
        yield arg

parser.convert_arg_line_to_args = sh_split
Or it can replace the whole #file read method (_read_args_from_files). This should behave the same as @toes's answer, except that the #file string can appear anywhere on the command line (or even be repeated).
def at_read_fn(arg_strings):
    # expand arguments referencing files
    new_arg_strings = []
    for arg_string in arg_strings:
        if not arg_string or not arg_string.startswith('#'):
            new_arg_strings.append(arg_string)
        else:
            with open(arg_string[1:]) as args_file:
                arg_strings = shlex.split(args_file.read())
                new_arg_strings.extend(arg_strings)
    return new_arg_strings

parser._read_args_from_files = at_read_fn
Obviously a cleaner production version would modify these methods in an ArgumentParser subclass.
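Something along these lines, for example (the ShlexArgumentParser name is made up for illustration; convert_arg_line_to_args is the documented hook being overridden):
import shlex
import argparse

class ShlexArgumentParser(argparse.ArgumentParser):
    # split each line of an args file shell-style instead of
    # treating the whole line as a single token
    def convert_arg_line_to_args(self, arg_line):
        return shlex.split(arg_line)

parser = ShlexArgumentParser(fromfile_prefix_chars='#')
parser.add_argument("--option1")
parser.add_argument("--option2", type=int, required=True)
print(parser.parse_args(['#args.txt']))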
The problem is that, when specified in a file, each argument must have an '=' between it and the option name. While argparse is somewhat more flexible on that format when run from the command line (where space or = is ok), when run from the file it must have an '='.
So, a working argument file would be:
--option1=foo
--option2=1234
Something else to be aware of: be sure you don't have any extra whitespace at the end of the lines, or that whitespace will get included with the option when argparse reads the file.
I think there's a better answer to this: use shlex.
if sys.argv[1].startswith('#'):
    args = parser.parse_args(shlex.split(open(sys.argv[1][1:]).read()))
else:
    args = parser.parse_args()
This allows you to specify args in a file in a more natural way, e.g. it allows using spaces or an equals sign to put a flag and its value on a single line, as in:
arg1
arg2
--opt1 'foo'
--opt2='bar'
shlex.split splits this as you would expect:
['arg1', 'arg2', '--opt1', 'foo', '--opt2=bar']
The only limitation of this method is that it expects the #file.txt to be the first argument.
Try it this way:
# encoding: utf-8
import imp
import argparse

class LoadConfigAction(argparse._StoreAction):
    NIL = object()

    def __init__(self, option_strings, dest, **kwargs):
        super(self.__class__, self).__init__(option_strings, dest)
        self.help = "Load configuration from file"

    def __call__(self, parser, namespace, values, option_string=None):
        super(LoadConfigAction, self).__call__(parser, namespace, values, option_string)
        config = imp.load_source('config', values)
        for key in (set(map(lambda x: x.dest, parser._actions)) & set(dir(config))):
            setattr(namespace, key, getattr(config, key))
Usage
parser.add_argument("-C", "--config", action=LoadConfigAction, default=None)
parser.add_argument("-H", "--host")
Example config (it is a real Python file):
# Config example /etc/my.conf
import os
# Parameter definition
host = os.getenv("HOST", "127.0.0.1")

How to print the first N lines of a file in python with N as argument

How would I go about getting the first N lines of a text file in Python, with N given as an argument?
usage:
python file.py datafile -N 10
My code
import sys
from itertools import islice

args = sys.argv
print(args)

if args[1] == '-h':
    print("-N for printing the number of lines: python file.py datafile -N 10")

if args[-2] == '-N':
    datafile = args[1]
    number = int(args[-1])
    with open(datafile) as myfile:
        head = list(islice(myfile, number))
    head = [item.strip() for item in head]
    print(head)
    print('\n'.join(head))
I wrote the program; can you let me know of a better way than this code?
Assuming that the print_head logic you've implemented need not be altered, here's the script I think you're looking for:
import sys
from itertools import islice

def print_head(file, n):
    if not file or not n:
        return
    with open(file) as myfile:
        head = [item.strip() for item in islice(myfile, n)]
    print(head)

def parse_args():
    result = {'script': sys.argv[0]}
    args = iter(sys.argv)
    for arg in args:
        if arg == '-F':
            result['filename'] = next(args)
        if arg == '-N':
            result['num_lines'] = int(next(args))
    return result

if __name__ == '__main__':
    script_args = parse_args()
    print_head(script_args.get('filename', ''), script_args.get('num_lines', 0))
Running the script:
python file.py -F datafile -N 10
Note: the best way to implement this would be to use the argparse library.
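For example, an argparse version might look roughly like this (the -F/-N flag names simply mirror the hand-rolled parser above):
import argparse
from itertools import islice

parser = argparse.ArgumentParser(description='Print the first N lines of a file')
parser.add_argument('-F', dest='filename', required=True, help='file to read')
parser.add_argument('-N', dest='num_lines', type=int, default=10, help='number of lines')
args = parser.parse_args()

with open(args.filename) as f:
    print([line.strip() for line in islice(f, args.num_lines)])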
You can access arguments passed to the script through sys:
sys.argv
The list of command line arguments passed to a Python script. argv[0] is the script name (it is operating system dependent whether this is a full pathname or not). If the command was executed using the -c command line option to the interpreter, argv[0] is set to the string '-c'. If no script name was passed to the Python interpreter, argv[0] is the empty string.
So in code it would look like this:
import sys
print("All of argv")
print(sys.argv)
print("Last element every time")
print(sys.argv[-1])
Reading the documentation you'll see that the first values stored in sys.argv vary according to how the user calls the script. If you print the code I pasted with different types of calls, you can see for yourself the kind of values stored.
For a basic first approach: access n through sys.argv[-1], which returns the last element every time. You still have to do a try and beg for forgiveness to make sure the argument passed is a number. For that you would have:
import sys

try:
    n = int(sys.argv[-1])
except ValueError as v_e:
    print(f"Please pass a valid number as argument, not {sys.argv[-1]}")
That's pretty much it. Obviously, it's quite basic; you can improve this even more by having users pass values with flags, like --skip-lines 10 (that would be your n), and the flag could be in any place when executing the script. I'd create a function in charge of translating sys.argv into a key,value dictionary for easy access within the script.
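A rough illustration of that flag-to-dictionary idea (the --skip-lines name comes from the example above; this is only a sketch, not a replacement for argparse):
import sys

def argv_to_dict(argv):
    opts = {}
    key = None
    for token in argv[1:]:
        if token.startswith('--'):
            key = token.lstrip('-')
            opts[key] = True          # bare flags default to True
        elif key is not None:
            opts[key] = token         # value following a flag
            key = None
    return opts

# python file.py datafile --skip-lines 10  ->  {'skip-lines': '10'}
print(argv_to_dict(sys.argv))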
Arguments are available via the sys package.
Example 1: ./file.py datafile 10
#!/usr/bin/env python3
import sys

filename = sys.argv[1]
N = int(sys.argv[2])

with open(filename) as myfile:
    head = myfile.readlines()[0:N]
print(head)
Example 2: ./file.py datafile --N 10
If you want to pass multiple optional arguments you should have a look at the argparse package.
#!/usr/bin/env python3
import argparse

parser = argparse.ArgumentParser(description='Read head of file.')
parser.add_argument('file', help='Textfile to read')
parser.add_argument('--N', type=int, default=10, help='Number of lines to read')
args = parser.parse_args()

with open(args.file) as myfile:
    head = myfile.readlines()[0:args.N]
print(head)

Python program, how to use console input on file execution?

Let's say I execute a Python file like a program on Ubuntu:
python filename.py --input1 --input2
How can I use those 2 inputs in my code? (If that is even possible with Python.)
BTW I would like to do this on Windows, not Linux.
For example, my code contains a function that takes 1 argument, in the form of a string.
I could just take that argument via input() while the code is running, but I would like to specify it when I execute the code.
I recommend you take a look at argparse. https://docs.python.org/3.7/howto/argparse.html
Or
$ python
>>> import argparse
>>> help(argparse)
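A small sketch of what that might look like for the invocation in the question (the flag names --input1/--input2 are taken straight from it; here they are treated as options that take a value):
import argparse

parser = argparse.ArgumentParser()
parser.add_argument('--input1', help='first value to pass to your function')
parser.add_argument('--input2', help='second value')
args = parser.parse_args()

# e.g. python filename.py --input1 foo --input2 bar
print(args.input1, args.input2)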
This is certainly possible, and in fact it is bread and butter in Python and other scripting languages.
In Python there is even the getopt module that helps you with this if you are familiar with the C implementation.
Copy-paste from the official Python documentation:
import getopt, sys

def main():
    try:
        opts, args = getopt.getopt(sys.argv[1:], "ho:v", ["help", "output="])
    except getopt.GetoptError as err:
        # print help information and exit:
        print str(err)  # will print something like "option -a not recognized"
        usage()
        sys.exit(2)
    output = None
    verbose = False
    for o, a in opts:
        if o == "-v":
            verbose = True
        elif o in ("-h", "--help"):
            usage()
            sys.exit()
        elif o in ("-o", "--output"):
            output = a
        else:
            assert False, "unhandled option"
    # ...

if __name__ == "__main__":
    main()
Official documentation is here: https://docs.python.org/2/library/getopt.html
For tutorials, see for example: https://www.tutorialspoint.com/python/python_command_line_arguments.htm
On the other hand, argparse is easier if you prefer to get it done in a simpler, less C-like way. For that, see the other answer.
For a simple case, you can just use sys.argv as follows:
# in your source.py
from sys import argv

arg1 = argv[1]  # If your input is a "numeric type", apply the appropriate
arg2 = argv[2]  # conversion by using int or float
...
Then you can execute your code, by:
python source.py arg1 arg2

How do I add sys.argv to a function opening a text file in Python

I need to use sys.argv to check for an argument from the command line, which would be the filename in my case. My code is as follows. I'm not allowed to import argparse, only allowed to use sys. I know I'm doing something wrong here. Appreciate any help.
def get_inputfile_object():
    '''
    Check the command line for an argument. If one was there, use it as the
    filename. Otherwise, use DEFAULT_INPUT_FILENAME. Open the file.
    If file is successfully opened:
        print MSG_OPENING_FILE
        Return: a file object for that file
    If the file cannot be opened:
        print MSG_ERROR_OPENNING_FILE
        Return: True
    '''
    if sys.argv > 1:
        pass
    else:
        input_filename = DEFAULT_INPUT_FILENAME
    input_filename = DEFAULT_INPUT_FILENAME
    if os.path.isfile(input_filename) and os.access(input_filename, os.R_OK):
        # Prints the opening file message, and the name of the file
        print(MSG_OPENING_FILE, input_filename)
        return open(input_filename, 'r')
    else:
        print(MSG_ERROR_OPENING_FILE)
        return True
sys.argv is a list of arguments.
You need to check the length of the list:
if len(sys.argv) > 1:
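Roughly, that check slots into your function like this (DEFAULT_INPUT_FILENAME is assumed to be defined elsewhere, as in your original code):
import sys

if len(sys.argv) > 1:
    input_filename = sys.argv[1]
else:
    input_filename = DEFAULT_INPUT_FILENAME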
You should check out argparse.
The argparse module also automatically generates help and usage
messages and issues errors when users give the program invalid
arguments.
Haven't tested it, but you can try something similar to this:
import argparse
# setup the parser
parser = argparse.ArgumentParser(description='Describe script')
# add positional argument
parser.add_argument('filename', type=str, help='filename description')
# parse the args
args = parser.parse_args()
print(args.filename)

Parsing cmd args like typical filter programs

I spent a few hours reading tutorials about argparse and managed to learn to use normal parameters. The official documentation is not very readable to me. I'm new to Python. I'm trying to write a program that can be invoked in the following ways:
cat inFile | program [options] > outFile -- If no inFile or outFile is specified, read from stdin and output to stdout.
program [options] inFile outFile
program [options] inFile > outFile -- If only one file is specified, it is the input, and output should go to stdout.
cat inFile | program [options] - outFile -- If '-' is given in place of inFile, read from stdin.
program [options] /path/to/folder outFile -- Process all files from /path/to/folder and its subdirectories.
I want it to behave like a regular CLI program under GNU/Linux.
It would also be nice if the program could be invoked as:
program [options] inFile0 inFile1 ... inFileN outFile -- the first path/file is always interpreted as input, the last one always as output, and any additional ones as inputs.
I could probably write dirty code that would accomplish this, but it is going to be used by others, so someone will end up maintaining it (and he will know where I live...).
Any help/suggestions are much appreciated.
Combining answers and some more knowledge from the Internet, I've managed to write this (it does not accept multiple inputs, but it is enough):
import sys, argparse, os.path, glob

def inputFile(path):
    if path == "-":
        return [sys.stdin]
    elif os.path.exists(path):
        if os.path.isfile(path):
            return [path]
        else:
            return [y for x in os.walk(path) for y in glob.glob(os.path.join(x[0], '*.dat'))]
    else:
        exit(2)

def main(argv):
    cmdArgsParser = argparse.ArgumentParser()
    cmdArgsParser.add_argument('inFile', nargs='?', default='-', type=inputFile)
    cmdArgsParser.add_argument('outFile', nargs='?', default='-', type=argparse.FileType('w'))
    cmdArgs = cmdArgsParser.parse_args()
    print cmdArgs.inFile
    print cmdArgs.outFile

if __name__ == "__main__":
    main(sys.argv[1:])
Thank you!
You need a positional argument (name not starting with a dash), optional arguments (nargs='?'), a default argument (default='-'). Additionally, argparse.FileType is a convenience factory to return sys.stdin or sys.stdout if - is passed (depending on the mode).
All together:
#!/usr/bin/env python
import argparse

# default argument is sys.argv[0]
parser = argparse.ArgumentParser('foo')
parser.add_argument('in_file', nargs='?', default='-', type=argparse.FileType('r'))
parser.add_argument('out_file', nargs='?', default='-', type=argparse.FileType('w'))

def main():
    # default argument is sys.argv[1:]
    args = parser.parse_args(['bar', 'baz'])
    print(args)
    args = parser.parse_args(['bar', '-'])
    print(args)
    args = parser.parse_args(['bar'])
    print(args)
    args = parser.parse_args(['-', 'baz'])
    print(args)
    args = parser.parse_args(['-', '-'])
    print(args)
    args = parser.parse_args(['-'])
    print(args)
    args = parser.parse_args([])
    print(args)

if __name__ == '__main__':
    main()
I'll give you a starter script to play with. It uses optionals rather than positionals, and only one input file, but it should give you a taste of what you can do.
import argparse

parser = argparse.ArgumentParser()
inarg = parser.add_argument('-i', '--infile', type=argparse.FileType('r'), default='-')
outarg = parser.add_argument('-o', '--outfile', type=argparse.FileType('w'), default='-')
args = parser.parse_args()
print(args)

cnt = 0
for line in args.infile:
    print(cnt, line)
    args.outfile.write(line)
    cnt += 1
When called without arguments, it just echoes your input (after ^D). I'm a little bothered that it doesn't exit until I issue another ^D.
FileType is convenient, but it has a major fault: it opens the files, and you have to close them yourself, or let Python do so when exiting. There's also the complication that you don't want to close stdin/stdout.
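One way to deal with that, assuming args comes from the script above, is a small helper that skips the standard streams:
import sys

def close_if_real_file(f):
    # close files opened by FileType, but leave stdin/stdout/stderr alone
    if f not in (sys.stdin, sys.stdout, sys.stderr):
        f.close()

close_if_real_file(args.infile)
close_if_real_file(args.outfile)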
The best argparse questions include a basic script and specific questions on how to correct or improve it. Your specs are reasonably clear, but it would be nice if you gave us more to work with.
To handle the subdirectories option, I would skip the FileType bit. Use argparse to get two lists of strings (or a list and a name), and then do the necessary chdir and/or glob to find and iterate over files. Don't expect argparse to do the actual work. Use it to parse the command-line strings. Here is a sketch of such a script, leaving most details for you to fill in.
import argparse
import os
import sys   # for stdin/out
....

def open_output(outfile):
    # function to open a file for writing
    # should handle '-'
    # return a file object

def glob_dir(adir):
    # function to glob a dir
    # return a list of files ready to open

def open_forread(afilename):
    # function to open file for reading
    # be sensitive to '-'

def walkdirs(alist):
    outlist = []
    for name in alist:
        if <name is file>:
            outlist.append(name)
        elif <name is a dir>:
            glist = glob(dir)
            outlist.extend(glist)
        else:
            <error>
    return outlist

def cat(infile, outfile):
    <do your thing here>

def main(args):
    # handle args options
    filelist = walkdirs(args.inlist)
    fout = open_output(args.outfile)
    for name in filelist:
        fin = open_forread(name)
        cat(fin, fout)
        if <fin not stdin>: fin.close()
    if <fout not stdout>: fout.close()

if __name__ == '__main__':
    parser = argparse.ArgumentParser()
    parser.add_argument('inlist', nargs='*')
    parser.add_argument('outfile')
    # add options
    args = parser.parse_args()
    main(args)
The parser here requires you to give it an outfile name, even if it is '-'. I could define its nargs='?' to make it optional, but that does not play nicely with the inlist '*'.
Consider
myprog one two three
Is that
namespace(inlist=['one','two','three'], outfile=default)
or
namespace(inlist=['one','two'], outfile='three')
With both a * and ? positional, the identity of the last string is ambiguous - is it the last entry for inlist, or the optional entry for outfile? argparse chooses the former, and never assigns the value to outfile.
With --infile, --outfile definitions, the allocation of these strings is clear.
In one sense this problem is too complex for argparse -- there's nothing in it to handle things like directories. In another sense it is too simple: you could just as easily split sys.argv[1:] between inlist and outfile without the help of argparse.
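For instance, the no-argparse split mentioned above could be as simple as this minimal sketch (no option handling at all):
import sys

args = sys.argv[1:]
if not args:
    sys.exit('usage: myprog inFile [inFile ...] outFile')
inlist, outfile = args[:-1], args[-1]   # last string is the output name
print(inlist, outfile)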
