I am unable to get this working.
I have something like this:
python executor.py arg1 arg2 arg3
And then I have another Python script which is not called directly by executor.py, but by some other script file that executor.py calls. Let's call it
script.py
In script.py there is a variable named argument in which I want to catch arg1.
How do I do it?
I am assuming you are doing import script within executor.py. If that is the case, you need to add something like this to executor.py:
import script
import sys
argument_1 = sys.argv[1] # since [0] is the script name
...
script.yourfunction(argument_1)
Assuming script.py and executor.py are in the same folder:
script.py:
def function1(somearg, otherarg):
    pass

def function2(moreargs):
    pass
and executor.py
import sys
import script
# assign input args; check for valid arguments etc..
arg1 = sys.argv[1]
arg2 = sys.argv[2]
# etc...
# call system functions
script.function1(arg1, arg2)
script.function2(arg1)
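Since script.py is only reached through an intermediate file, the value usually has to be handed down the call chain as a function argument. A minimal sketch of that chain, assuming the intermediate file is called middle.py and using illustrative function names (neither is from the question):
# executor.py
import sys
import middle

arg1 = sys.argv[1]        # first command-line argument
middle.run(arg1)          # hand it to the intermediate script

# middle.py
import script

def run(argument):
    script.catch(argument)    # forward it, unchanged, to the final script

# script.py
def catch(argument):          # 'argument' now holds arg1
    print("script.py received:", argument)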
I'm looking to pass variables from a Python script into variables of a Powershell script without using arguments.
var_pass_test.py
import subprocess, sys
setup_script = 'C:\\Users\\user\\Desktop\\Code\\Creation\\var_pass_test.ps1'
test1 = "Hello"
p = subprocess.run(["powershell.exe",
                    setup_script], test1,
                   stdout=sys.stdout)
var_pass_test.ps1
Write-Host $test1
How would one go about doing this such that the Powershell script receives the value of test1 from the Python script? Is this doable with the subprocess library?
To pass arguments verbatim to the PowerShell CLI, use the -File option: pass the script-file path first, followed by the arguments to pass to the script.
In Python, pass all arguments that make up the PowerShell command line as part of the first, array-valued argument:
import subprocess, sys
setup_script = 'C:\\Users\\user\\Desktop\\Code\\Creation\\var_pass_test.ps1'
test1 = "Hello"
p = subprocess.run(
    ["powershell.exe",
     "-File",
     setup_script,
     test1],
    stdout=sys.stdout)
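On the PowerShell side the value then arrives as an ordinary positional script argument, so var_pass_test.ps1 would need a param() block (or $args[0]) rather than a bare $test1; that detail is an assumption about the intent, since only the Python side is shown above. A quick way to check from Python that the value got through (capture_output is used here instead of stdout=sys.stdout purely for the check):
import subprocess

setup_script = 'C:\\Users\\user\\Desktop\\Code\\Creation\\var_pass_test.ps1'
test1 = "Hello"

# Run the script via -File and capture its output so it can be inspected.
result = subprocess.run(
    ["powershell.exe", "-File", setup_script, test1],
    capture_output=True,
    text=True,
)
print(result.stdout)  # should contain "Hello" if the .ps1 echoes its first argument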
I have a Python function that I want to run from the context-menu. I am using the Windows Registry to call the function when a file is right clicked, and I want the function to receive the file selected.
The context menu item looks something like the Random Commands option.
The function I wish to run is foo1() in the file test1:
# in script test1.py
def foo1():
    import sys
    print(sys.argv)
I tried approaching the problem using the python -c switch to directly call the function:
python -c "import test1; test1.foo1()" %1
However, the value of sys.argv after selecting the file and executing is ['-c'], which doesn't include the file selected.
I also tried using an argument parser to dynamically import and run the function:
# in file registry_parser.py
import argparse
import os
import sys
if __name__ == '__main__':
    parser = argparse.ArgumentParser()
    parser.add_argument('-n', '--func_name')
    parser.add_argument('-f', '--file_name')
    parser.add_argument('-p', '--file_path')
    args = vars(parser.parse_args())
    sys.path.insert(0, args['file_path'])
    exec(f"import {args['file_name']}")
    exec(f"{args['file_name']}.{args['func_name']}(sys.argv)")
And the function I would call from the registry is (paths simplified):
And the command I would put in the registry is (paths simplified):
registry_parser.py --func_name foo1 --file_name test1 --file_path "[PATH TO registry_parser.py]" %1
However, I either get an error registry_parser.py: error: unrecognized arguments: %1 or the "%1" is parsed quite literally and ends up as a value in sys.argv.
Any help is appreciated.
So on all the forums I asked I never received a correct answer, but I found this Reddit Post that has an example of a solution. Hope this helps someone else.
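For reference, one common pattern (a sketch of the general idea, not the exact solution from that post) is to point the registry command at a small launcher script and let it receive the selected file through sys.argv; the launcher name run_foo.py and the paths are assumptions:
# run_foo.py -- launched from the registry as:
#   python "C:\path\to\run_foo.py" "%1"
import sys

sys.path.insert(0, r"C:\path\to")   # assumed location of test1.py
import test1

if __name__ == "__main__":
    # sys.argv is ['...run_foo.py', '<selected file>'], so foo1(), which prints
    # sys.argv in the question's example, now sees the selected file path.
    test1.foo1()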
May I know what the best practice is to debug an argparse-based script?
Say I have a .py file test_file.py with the following lines:
# Script start
import argparse
import os
parser = argparse.ArgumentParser()
parser.add_argument("--output_dir", type=str, default="/data/xx")
args = parser.parse_args()
os.makedirs(args.output_dir)
# Script stop
The above script can be executed from terminal by:
python test_file.py --output_dir data/xx
However, for the debugging process, I would like to avoid using the terminal. Thus the workaround would be:
# other lines were commented out for debugging
# Thus, the active lines are
# Script start
import os
args = {"output_dir": "data/xx"}
os.makedirs(args.output_dir)
#Script stop
However, I am unable to execute the modified script. May I know what I have missed?
When used as a script, parse_args will produce a Namespace object, which displays as:
argparse.Namespace(output_dir='data/xx')
then
args.output_dir
will be the value of that attribute
In the test you could do one of several things:
args = parser.parse_args([....]) # a 'fake' sys.argv[1:] list
args = argparse.Namespace(output_dir= 'mydata')
and use args as before. Or simply call makedirs directly:
os.makedirs('data/xx')
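For example, a minimal sketch of the 'fake argv' option, assuming the parser definition from the question:
import argparse
import os

parser = argparse.ArgumentParser()
parser.add_argument("--output_dir", type=str, default="/data/xx")

# Pass an explicit list instead of letting argparse read sys.argv; handy when
# stepping through the code in an IDE or debugger.
args = parser.parse_args(["--output_dir", "data/xx"])
os.makedirs(args.output_dir, exist_ok=True)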
I would recommend organizing the script as:
# Script start
import argparse
import os
# this parser definition could be in a function
parser = argparse.ArgumentParser()
parser.add_argument("--output_dir", type=str, default="/data/xx")

def main(args):
    os.makedirs(args.output_dir)

if __name__ == '__main__':
    args = parser.parse_args()
    main(args)
That way the parse_args step isn't run when the file is imported. Whether you pass the args Namespace to main or pass values like args.output_dir, or a dictionary, etc. is your choice.
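With that layout, debugging no longer needs the terminal at all. Assuming the file is importable as test_file, something like this works from a test or an interactive session:
import argparse
import test_file

# Build the Namespace by hand ...
test_file.main(argparse.Namespace(output_dir="data/xx"))

# ... or reuse the module's parser with a fake argv list.
test_file.main(test_file.parser.parse_args(["--output_dir", "data/yy"]))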
You can write a shell script to do what you want.
bash:
#!/bin/bash
cd /path/to/my          # the directory that contains script.py
python script.py --output_dir data/xx
If that is insufficient, you can store your args in a json config file
configs.json
{"output_dir": "data/xx"}
To grab them:
import json
with open('configs.json', 'rb') as fh:
    args = json.loads(fh.read())

output_dir = args.get('output_dir')
# 'data/xx'
Do take note of the double quotes around your keys and values in the json file
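If you still want argparse in the picture, the JSON values can also be fed into the parser as defaults (a sketch; set_defaults is standard argparse):
import argparse
import json

with open('configs.json') as fh:
    config = json.load(fh)

parser = argparse.ArgumentParser()
parser.add_argument("--output_dir", type=str, default="/data/xx")
parser.set_defaults(**config)      # values from the JSON file override the defaults

args = parser.parse_args([])       # empty list: nothing is read from the terminal
print(args.output_dir)             # 'data/xx'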
I am working on a script under which some sub-programs will run.
For example, test.py is the main program, and under it test1.py, test2.py, and test3.pl will run. I need to pass the arguments from the main program (test.py) to the test1.py and test2.py programs as well. The arguments should remain unchanged while being passed to another program.
code: test.py
import argparse
import subprocess
import os
commandLineArgumentParser = argparse.ArgumentParser()
commandLineArgumentParser.add_argument("-fname", "--fname",help="first name")
commandLineArgumentParser.add_argument("-lname","--lname", help="last name")
commandLineArgumentParser.add_argument("-age","--age", help="age")
commandLineArguments = commandLineArgumentParser.parse_args()
fname = commandLineArguments.fname
lname = commandLineArguments.lname
age = commandLineArguments.age
print "%s%s%s" %(fname,lname,age)
I am running the program with the below commands:
python test.py -fname=abc -age=22 -lname='test a'
or
python test.py -fname="abc test" lname='val' -age=30
or
python test.py -age=45 -lname='abc aa' fname="abc"
or
python test.py -age=45 -lname="test"
Now I want to grab the argument part unchanged and put it in one variable, so that the arguments can easily be passed on to the other programs.
For the first command the variable will hold
-fname=abc -age=22 -lname='test a'
For the second command it will hold
-fname="abc test" lname='val' -age=30
I was trying to grab the arguments using the below code, but the quotes are stripped by the script.
my_argu = ''
if len(sys.argv) > 1:
    for x in sys.argv:
        if x == sys.argv[0]:
            continue
        if x == sys.argv[len(sys.argv)-1]:
            my_argu = my_argu + x
        else:
            my_argu = my_argu + x + ' '
print "%s" %(my_argu)
for the
python test.py -fname="abc test" lname='val' -age=30
the output is:
abc testval30
-fname=abc test lname=val -age=30
As you can see, the quotes are missing, so I need help to solve it.
Seems like you should pull these all together in one wrapper and call that instead.
# wrapper.py
import test, test1, test2, test3
import argparse
parser = argparse.ArgumentParser()
parser.add_argument("-fname", "--fname",help="first name")
parser.add_argument("-lname","--lname", help="last name")
parser.add_argument("-age","--age", help="age")
cli_args = parser.parse_args()
test.run(cli_args)
test1.run(cli_args)
test2.run(cli_args)
test3.run(cli_args)
Then in each of your testN.pys...
# test.py
def run(cli_args):
    print("Your name is {} {}".format(cli_args.fname, cli_args.lname))
    # or whatever your program does....
Then do:
$ python wrapper.py --fname Adam --lname Smith
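If the sub-programs really do have to run as separate processes with the arguments untouched, another option (an alternative sketch, not part of the wrapper above) is to forward sys.argv[1:] as a list; because no shell re-parsing happens, spaces survive and no quoting is needed:
# forward.py -- re-invoke each sub-program with exactly the same arguments
import subprocess
import sys

for sub in ("test1.py", "test2.py"):
    # Passing a list keeps every argument intact, spaces and all.
    subprocess.call([sys.executable, sub] + sys.argv[1:])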
Let's say I have this simple Python script, named MyScript.py:
def MyFunction(someInput):
    # Do something with input
    pass
I would like to write a batch file that specifically calls MyFunction from MyScript with someInput.
Now, I could do some Python-foo and add:
import sys

def MyFunction(someInput):
    # Do something with input
    pass

if __name__ == "__main__":
    eval(sys.argv[1])
Then I can use a batch like this:
python MyScript.py MyFunction('awesomeInput')
pause
But I have a feeling there's a more obvious solution here that doesn't involve me retrofitting __name__ == "__main__" logic in each of my scripts.
If you are in the same folder as the script you can do:
python -c "import MyScript; MyScript.MyFunction('SomeInput')"
Indeed. You can use the -c (command) argument.
python -c "import MyScript; MyScript.MyFunction('someInput')"
You could write a batch file using this trick:
@echo off
rem = """
rem Do any custom setup like setting environment variables etc if required here ...
python -x "%~f0" %*
goto endofPython """
# Your python code goes here ..
from MyScript import MyFunction
MyFunction('awesomeInput')
rem = """
:endofPython """
my_function.py:
import sys

def MyFunction(someInput):
    print 'MyFunction:', someInput
    # Do something with input

if __name__ == "__main__":
    pass
you call it from shell with:
python -c 'from my_function import MyFunction ; MyFunction("hello")'
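If you would rather not hard-code the value in the command, the same one-liner can read it from sys.argv, so a batch or shell script could forward its own first argument (a sketch, assuming the my_function.py module above):
python -c "import sys, my_function; my_function.MyFunction(sys.argv[1])" hello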