How to use pdoc on a script that takes command line args? - python

I'm trying to use pdoc to document my python script, but I'm running into an issue because my program requires command line args.
My program is called as follows: myprogram.py -p arg1 -s arg2
And I try to run pdoc as follows: pdoc --html myprogram
But I get an error saying pdoc: error: the following arguments are required: -p
But if I try to run pdoc like this: pdoc --html myprogram.py -p arg1 -s arg2, I get this error:
pdoc: error: unrecognized arguments: -p arg1 -s arg2
Is there a way to run pdoc on a module that requires command line args?

Since pdoc imports the documented modules, you need to nest your CLI program execution under a main guard:
def main():
    from argparse import ArgumentParser
    parser = ArgumentParser(...)
    args = parser.parse_args()
    ...

if __name__ == '__main__':
    # Run the main entry point only if the script was run as a program
    # and not if it was merely imported
    main()
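For the question's program, a minimal sketch of the restructured myprogram.py might look like this (the -p and -s option names are taken from the question; everything else about the parser is assumed):

"""Module docstring that pdoc will render."""
from argparse import ArgumentParser

def main():
    # Building and running the parser only inside main() means that
    # importing the module (which is what pdoc does) never touches sys.argv
    parser = ArgumentParser()
    parser.add_argument('-p', required=True)  # hypothetical option, as in the question
    parser.add_argument('-s', required=True)  # hypothetical option, as in the question
    args = parser.parse_args()
    print(args.p, args.s)  # placeholder for the real program logic

if __name__ == '__main__':
    # Runs only when executed as: python myprogram.py -p arg1 -s arg2
    main()

With that layout, pdoc --html myprogram imports the module without the parser ever seeing pdoc's own arguments.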

Related

Pass command line arguments to python script that is called through unix executable

I have a file named "uscf" in /usr/local/bin:
#! /bin/sh
python3 ~/Desktop/path/to/uscf.py
I have already run chmod +x on this file so that I can run it from my terminal with the command "uscf". How can I run it with command line arguments so that the arguments are accessible through sys.argv in uscf.py?
EDIT: Added below example for clarification:
The uscf.py file:
import sys
if len(sys.argv) > 1:
    print(sys.argv)
Running it from the command line:
Abraham$ uscf these are arguments
Expected output:
these are arguments
In sh, "$@" expands to all of the positional arguments passed to the script (while "$#" holds only their count). You can use it to forward them to your python script:
#!/bin/sh
python3 "$HOME/Desktop/path/to/uscf.py" "$@"
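With the quoted "$@" in place, the words typed after uscf show up in sys.argv inside uscf.py. A small sketch of what the script sees (sys.argv[0] is the path to uscf.py itself):

import sys

# After running: uscf these are arguments
# sys.argv[0] is the path to uscf.py; the remaining entries are the forwarded words
print(sys.argv[1:])  # prints ['these', 'are', 'arguments']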

How to pass arguments to a python gdb script launched from command line

I'd like to pass some command line arguments to a python script that I run via gdb, but inside gdb the sys module has no argv attribute. How do I access arg1 and arg2 within my python script, shown in the example below?
Command line execution:
$ gdb -x a.py --args python -arg1 -arg2
a.py:
#!/usr/bin/env python
import gdb
import sys
print('The args are: {0}'.format(sys.argv))
gdb.execute('quit')
Error raised:
AttributeError: 'module' object has no attribute 'argv'
Versions:
GNU gdb (GDB) 7.2
Python 2.6.6
Edit:
The end target I'll be debugging is a C executable that is already running, so I'll be attaching to it later in the script. That means gdb -x a.py --args python -arg1 -arg2 is not correct either, since the python part makes gdb print: Reading symbols from /usr/bin/python...(no debugging symbols found)...done....
-ex py
This is a possibility:
argv.py
print(arg0)
print(arg1)
Invocation:
gdb --batch -ex 'py arg0 = 12;arg1 = 3.4' -x argv.py
or equivalently:
gdb --batch -ex 'py arg0 = 12' -ex 'py arg1 = 3.4' -x argv.py
Output:
12
3.4
Then you could wrap that in the following script:
#!/usr/bin/env bash
doc="
Pass parameters to a python script.
Usage:
$0 script.py 1 2
Where script.py uses the arguments like:
print(arg0)
print(arg1)
"
py="$1"
shift
cli=''
i=0
for arg in "$@"; do
    cli="$cli -ex 'py arg$i = $arg'"
    i=$(($i+1))
done
eval gdb --batch $cli -x "$py"
Tested on Python 3.8.6, GDB 9.2, Ubuntu 20.10.
It's not totally clear to me what you are trying to do.
An invocation of the form:
gdb --args something arg arg
Tells gdb to use something as the program to be debugged, with arg arg as the initial command-line arguments for the run command. You can inspect these inside gdb with show args.
So, your command is saying "I want to debug the python executable, passing it some arguments".
However, later you say you plan to attach to some already-running executable.
So, I think you're probably trying to script gdb in Python -- not debug the python executable.
The good news is that this is possible, just not the way you've written it. Instead you have a couple of choices:
Make a .py file holding your script and tell gdb to source it, e.g., with gdb -x myscript.py (which you've already done...)
Use -ex to invoke some Python explicitly, like -ex 'python print 23'.
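A related sketch that stays close to the original a.py: since gdb's embedded interpreter never sets sys.argv, you can assign it yourself with -ex before sourcing the script (the argument values are the ones from the question):

gdb --batch -ex 'python import sys; sys.argv = ["a.py", "-arg1", "-arg2"]' -x a.py

With that in place, the script's print('The args are: {0}'.format(sys.argv)) line runs unchanged.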

Passing parameters to shell script from a python program

I would like to call a shell script from my python script. I need to pass 3 parameters/arguments to the shell script. I am able to call the shell script (it is in the same directory as the python script), but I'm having an issue with the parameter passing.
from subprocess import call
# other code here.
line = "Hello"
# Here is how I call the shell command
call(["./myscript.sh", "/usr/share/file1.txt", "/usr/share/file2.txt", line], shell=True)
In my shell script I have this
#!/bin/sh
echo "Parameters are $1 $2 $3"
...
Unfortunately parameters are not getting passed correctly.
I get this message:
Parameters are
None of the parameter values are passed to the script.
call ("./myscript.sh /usr/share/file1.txt /usr/share/file2.txt "+line, shell=True)
When you use shell=True, you can pass the command directly as a single string, just as you would type it on the shell.
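If line can ever contain spaces or shell metacharacters, it is safer to quote it before splicing it into the command string; a small sketch using shlex.quote (Python 3):

from shlex import quote
from subprocess import call

line = "Hello"
cmd = "./myscript.sh /usr/share/file1.txt /usr/share/file2.txt " + quote(line)
call(cmd, shell=True)  # the quoted value arrives intact as $3 in myscript.sh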
Drop shell=True (you might need to make myscript.sh executable: $ chmod +x myscript.sh):
#!/usr/bin/env python
from subprocess import check_call
line = "Hello world!"
check_call(["./myscript.sh", "/usr/share/file1.txt", "/usr/share/file2.txt",
line])
Do not use a list argument and shell=True together.

Can't run a process with subprocess.Popen()

I have a command to run from a python program:
python test.py arg1 arg2
If I run it using
os.system("python test.py arg1 arg2")
it runs.
I want to use subprocess.Popen() but when I do that
subprocess.Popen("python test.py arg1 arg2")
It gives the following error
raise child_exception
OSError: [Errno 2] No such file or directory
Any ideas?
Thanks a lot
If the argument to Popen is a string and shell=True is not given, the whole string is treated as the name of the program to execute, so Popen looks for an executable literally called 'python test.py arg1 arg2', which obviously does not exist. You can pass it a list (as twall suggested), or specify the shell=True parameter. The following should work:
subprocess.Popen("python test.py arg1 arg2", shell=True)
try putting all your arguments in a list:
subprocess.Popen(["python", "test.py", "arg1", "arg2"])
see: http://docs.python.org/library/subprocess.html#subprocess.Popen
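A rough variant of the list form uses sys.executable, so the child runs under the same interpreter as the parent instead of whatever python happens to be first on PATH (test.py and its arguments are the ones from the question):

import subprocess
import sys

# Launch test.py with the current interpreter and wait for it, much like os.system did
proc = subprocess.Popen([sys.executable, "test.py", "arg1", "arg2"])
proc.wait()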

Fabfiles With Command Line Arguments

Is there a clean way to have your fabfile take command line arguments? I'm writing an installation script for a tool that I want to be able to specify an optional target directory via the command line.
I wrote some code to test what would happen if I passed in some command line arguments:
# fabfile.py
import sys

def install():
    _get_options()

def _get_options():
    print repr(sys.argv[1:])
A couple of runs:
$ fab install
['install']
Done.
$ fab install --electric-boogaloo
Usage: fab [options] <command>[:arg1,arg2=val2,host=foo,hosts='h1;h2',...] ...
fab: error: no such option: --electric-boogaloo
I ended up using the per-task arguments. It seems like a better idea than relying on unattached command line arguments.
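For reference, per-task arguments in fab look roughly like this; the target_dir name and default are made up for illustration, and the invocation syntax matches the <command>[:arg1,arg2=val2,...] usage line quoted above:

# fabfile.py
def install(target_dir='/usr/local/mytool'):
    # target_dir is a hypothetical optional argument with a default value
    print('Installing into %s' % target_dir)

# Invocation examples:
#   fab install                          (uses the default directory)
#   fab install:target_dir=/opt/mytool   (overrides it from the command line)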
