I have a script (test.py) that I run as a command from the terminal, which calls another script (main.py). I want to store the arguments entered by the user as variables and then pass them on to the second script.
E.g. when I run 'test -t foo' I want to save 'foo' as 'test=foo', and then when I call 'os.system("python main.py")' at the end of test.py I want main.py to print 'foo'.
This is what I have so far:
test.py
import os, argparse
parser = argparse.ArgumentParser()
parser.add_argument("-t", "--test", action="store", help="Store argument as variable")
args = parser.parse_args()
#I'm not sure how to save the argument as a variable
os.system("python main.py") #I need to keep this line - please see the comments below
terminal commands
chmod +x test.py
mv test.py test
mkdir -p ~/bin
cp test ~/bin
echo 'export PATH=$PATH":$HOME/bin"' >> ~/.profile
main.py
from __main__ import * #this does not work
if args.test:
    print(#variable)
In case it is helpful for anyone, I have found a way around the problem:
test.py
import os, argparse
parser = argparse.ArgumentParser()
parser.add_argument("-t", "--test", action="store", type=str, default="None", help="Store argument as variable")
args = parser.parse_args()
with open("variables.py", 'w') as variables:
    variables.writelines(args.test)
os.system("python main.py")
terminal commands
chmod +x test.py
mv test.py test
mkdir -p ~/bin
cp test ~/bin
echo 'export PATH=$PATH":$HOME/bin"' >> ~/.profile
main.py
with open("variables.py", 'r') as variables:
    test = variables.read()
print(test)
It's probably not very pythonic but it does the trick.
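A tidier alternative (a sketch, not from the original post) is to skip the temporary file and forward the value to the child script as an ordinary command-line argument, using subprocess instead of os.system. Here the inline "-c" script stands in for main.py so the example is self-contained; a real main.py would simply read sys.argv[1]:

```python
import subprocess
import sys

value = "foo"  # in test.py this would come from args.test

# Forward the value as an argv entry of the child process.
# The inline "-c" script plays the role of main.py, which would
# just do: import sys; print(sys.argv[1])
result = subprocess.run(
    [sys.executable, "-c", "import sys; print(sys.argv[1])", value],
    capture_output=True, text=True,
)
print(result.stdout.strip())  # -> foo
```

This also avoids shell-quoting issues that os.system would introduce if the value contained spaces.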
Eventually I understood this, and it works.
bash script:
#!/bin/bash
#$ -V
#$ -cwd
#$ -o $HOME/sge_jobs_output/$JOB_ID.out -j y
#$ -S /bin/bash
#$ -l mem_free=4G
c=$SGE_TASK_ID
cd /home/xxx/scratch/test/
FILENAME=`head -$c testlist|tail -1`
python testpython.py $FILENAME
python script:
#!/usr/bin/env python
import sys, os
path = '/home/xxx/scratch/test/'
name1 = sys.argv[1]
job_id = os.path.join(path, name1)
f = open(job_id, 'r').readlines()
print(f[1])
Thanks.
Exported bash variables are actually environment variables. You get at them through the os.environ object with a dictionary-like interface. Note that there are two types of variables in Bash: those local to the current process, and those that are inherited by child processes. Your Python script is a child process, so you need to make sure that you export the variable you want the child process to access.
To answer your original question, you need to first export the variable and then access it from within the python script using os.environ.
#!/bin/bash
#$ -V
#$ -cwd
#$ -o $HOME/sge_jobs_output/$JOB_ID.out -j y
#$ -S /bin/bash
#$ -l mem_free=4G
c=$SGE_TASK_ID
cd /home/xxx/scratch/test/
export FILENAME=`head -$c testlist|tail -1`
chmod +x testpython.py
./testpython.py
#!/usr/bin/env python
import sys
import os
for arg in sys.argv:
    print(arg)
f = open('/home/xxx/scratch/test/' + os.environ['FILENAME'], 'r').readlines()
print(f[1])
Alternatively, you may pass the variable as a command line argument, which is what your code is doing now. In that case, you must look in sys.argv, which is the list of arguments passed to your script. They appear in sys.argv in the same order you specified them when invoking the script. sys.argv[0] always contains the name of the program that's running. Subsequent entries contain other arguments. len(sys.argv) indicates the number of arguments the script received.
#!/usr/bin/env python
import sys
import os
if len(sys.argv) < 2:
    print('Usage: ' + sys.argv[0] + ' <filename>')
    sys.exit(1)
print('This is the name of the python script: ' + sys.argv[0])
print('This is the 1st argument: ' + sys.argv[1])
f = open('/home/xxx/scratch/test/' + sys.argv[1], 'r').readlines()
print(f[1])
Use this inside your script (edited per Aaron's suggestion):
def main(args):
    do_something(args[0])

if __name__ == "__main__":
    import sys
    main(sys.argv[1:])
Take a look at parsing Python arguments. Your bash code is fine; you just need to edit your Python script to take the argument.
Command-line arguments to the script are available as the sys.argv list.
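For example, a minimal sketch (show_args.py is a hypothetical file name) that echoes whatever it receives:

```python
import sys

def show_args(argv):
    # argv[0] is the script path; the remaining entries are the
    # arguments in the order they were given on the command line
    lines = ["script: " + argv[0]]
    for i, arg in enumerate(argv[1:], start=1):
        lines.append("arg %d: %s" % (i, arg))
    return "\n".join(lines)

if __name__ == "__main__":
    print(show_args(sys.argv))
```

Running python show_args.py foo bar would print the script path followed by "arg 1: foo" and "arg 2: bar".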
I have a file named "uscf" in /usr/local/bin:
#! /bin/sh
python3 ~/Desktop/path/to/uscf.py
I have already chmod +x this file so that I can run it from my terminal with the command "uscf". How can I run this with command line arguments so that the arguments are accessible through sys.argv in uscf.py?
EDIT: Added below example for clarification:
The uscf.py file:
import sys
if len(sys.argv) > 1:
    print(sys.argv)
Running it from the command line:
Abraham$ uscf these are arguments
Expected output:
these are arguments
In sh the "$@" variable expands to all the positional arguments passed to the script. You can use it to forward them to your python script:
#!/bin/sh
python3 "$HOME/Desktop/path/to/uscf.py" "$@"
I'm new to Python and programming in general. I want to assign options to a variable using the optparse module in IPython. My code is as follows:
import sys
import optparse

parser = optparse.OptionParser()
parser.add_option('-v', action="store_true", dest='verbose', default=False)
(options, others) = parser.parse_args()

print(options.verbose)
if options.verbose:
    print("Not yet")
else:
    print("Done")
I saved them in a file and I can run it in bash like this:
$ python filename.py -verbose
Now I want to assign the whole code to a variable. I hope it can be run like this:
$ myvar -verbose
How can I do that? Thanks.
One way to do this would be to make your python script executable with a shebang. Add this as the first line in your filename.py file
#!/usr/bin/env python
Next you need to change the file permissions to be executable. At the command line, execute:
$ chmod +x filename.py
Then you can execute filename.py directly:
$ ./filename.py -verbose
You could of course rename filename.py to myvar, or make a symbolic link like this:
$ ln -s filename.py myvar
Now you can do:
$ ./myvar -verbose
If you don't want to type the ./ (which just tells the shell that the executable is in the current directory), or want to be able to use the myvar command from anywhere, you can add your working directory to the PATH environment variable:
export PATH=$PATH:$PWD
You might want to read a good tutorial on command line usage for more on this sort of thing. A little time invested can be really rewarding!
Aliases are your friend:
alias myvar="python filename.py"
Just assign to the shell variable:
myvar='python filename.py'
$myvar -verbose
Note the leading $, which would not be required if you used an alias (Nick Edward's solution).
By the way, -verbose will set options v, e, r, b, o, s, and e. Maybe you mean a long option, like --verbose?
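For instance (a sketch using the same optparse API as the question), declaring both the short and the long form so that --verbose is accepted as a single option:

```python
import optparse

parser = optparse.OptionParser()
# -v is the short form, --verbose the long form; both set options.verbose
parser.add_option("-v", "--verbose", action="store_true",
                  dest="verbose", default=False)

# parse_args also accepts an explicit argument list, handy for testing
options, remaining = parser.parse_args(["--verbose"])
print(options.verbose)  # -> True
```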
There is a tool I wrote in Python that analyzes a PDF file passed to it on the command line:
c:\python "my_tool.py" -s "my_pdf.pdf"
I want to test the tool on 1000 files. How can I run the tool on all 1000 files?
I used this
for /f %%f in ('dir /b C:\Users\Test\Desktop\CVE_2010-2883_PDF_25files') do echo %%f
but how can I specify the tool and the -s argument?
Try like this :
@echo off
for /f "delims=" %%f in ('dir /a-d /b /s C:\Users\Test\Desktop\CVE_2010-2883_PDF_25files\*.pdf') do (
    "c:\python\my_tool.py" -s "%%f")
Or, in a Unix-like shell, you can loop over the .pdf files and pass each one to the script:
for i in *.pdf
do
    python "my_tool.py" -s "$i"
done
If you have the Unix find command you can use
find . -type f -name "*.pdf" -exec python "my_tool.py" -s {} \;
This runs your command on each of the .pdf files in the current directory and its subdirectories.
You can make your life a lot easier, by making sure the tool can just search a directory for all pdf files:
import glob
import os

def get_files(directory):
    for i in glob.iglob(os.path.join(directory, '*.pdf')):
        do_something(i)

if __name__ == '__main__':
    import argparse
    parser = argparse.ArgumentParser()
    parser.add_argument('-f', '--file', help='enter filename')
    parser.add_argument('-d', '--directory', help='enter directory of pdf files')
    args = parser.parse_args()
    if args.directory:
        get_files(args.directory)
    if args.file:
        do_something(args.file)
I'm trying to use pdoc to document my python script but I'm having an issue due to the fact that my program requires the use of command line args.
My program is called as follows: myprogram.py -p arg1 -s arg2
And I try to run pdoc as follows: pdoc --html myprogram
But I get an error saying pdoc: error: the following arguments are required: -p
But if I try to run pdoc like such: pdoc --html myprogram.py -p arg1 -s arg2, I get this error:
pdoc: error: unrecognized arguments: -p arg1 -s arg2
Is there a way to run pdoc on a module that requires command-line args?
Since pdoc imports the documented modules, you need to nest your CLI program execution under a main guard:
def main():
    from argparse import ArgumentParser
    parser = ArgumentParser(...)
    args = parser.parse_args()
    ...

if __name__ == '__main__':
    # Run the main entry point only if the script was run as a program
    # and not if it was merely imported
    main()