Python: Execute a file with arguments in the Python shell [duplicate] - python

I would like to run a command in Python Shell to execute a file with an argument.
For example: execfile("abc.py"), but how do I add 2 arguments?

Actually, wouldn't we want to do this?
import sys
sys.argv = ['abc.py','arg1', 'arg2']
execfile('abc.py')

execfile runs a Python file, but by loading it, not as a script. You can only pass in variable bindings, not arguments.
If you want to run a program from within Python, use subprocess.call. E.g.
import subprocess
subprocess.call(['./abc.py', arg1, arg2])

try this:
import sys
sys.argv = ['abc.py', 'arg1', 'arg2']  # argv[0] is conventionally the script name
execfile('abc.py')
Note that when abc.py finishes, control returns to the calling program. Note too that abc.py can call quit() when it is done.
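A quick caveat: execfile only exists on Python 2. On Python 3 the closest stand-in is exec on the file's contents, as in this sketch (abc_demo.py is a made-up file name, written on the fly so the demo is self-contained):

```python
import sys

# Throwaway target script, created here just for the demo.
with open('abc_demo.py', 'w') as f:
    f.write('import sys\ncaptured = list(sys.argv)\nprint(captured)\n')

sys.argv = ['abc_demo.py', 'arg1', 'arg2']
# Python 3 removed execfile; exec on the file's text is the usual replacement.
namespace = {}
with open('abc_demo.py') as f:
    exec(f.read(), namespace)
print(namespace['captured'])
```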

import sys
import subprocess
subprocess.call([sys.executable, 'abc.py', 'argument1', 'argument2'])

For more interesting scenarios, you could also look at the runpy module. Since Python 2.7, it has the run_path function. E.g.:
import runpy
import sys
# argv[0] will be replaced by runpy
# You could also skip this if you get sys.argv populated
# via other means
sys.argv = ['', 'arg1', 'arg2']
runpy.run_path('./abc.py', run_name='__main__')
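One detail worth knowing: run_path returns the executed module's globals, so you can read results back out of the script afterwards. A self-contained sketch (abc_demo.py is a made-up script written on the fly):

```python
import runpy
import sys

# Hypothetical target script, written here so the example runs anywhere.
with open('abc_demo.py', 'w') as f:
    f.write('import sys\nresult = sys.argv[1:]\n')

sys.argv = ['abc_demo.py', 'arg1', 'arg2']
# run_path returns the module's globals dict, so results can be read back.
module_globals = runpy.run_path('abc_demo.py', run_name='__main__')
print(module_globals['result'])
```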

You're confusing loading a module into the current interpreter process and calling a Python script externally.
The former can be done by importing the file you're interested in. execfile is similar to importing, but it simply evaluates the file rather than creating a module out of it. Similar to "sourcing" in a shell script.
The latter can be done using the subprocess module. You spawn off another instance of the interpreter and pass whatever parameters you want to that. This is similar to shelling out in a shell script using backticks.

You can't pass command line arguments with execfile(). Look at subprocess instead.

If you set PYTHONINSPECT in the Python file you want to execute,
[repl.py]
import os
import sys
from time import time
os.environ['PYTHONINSPECT'] = 'True'
t = time()
argv = sys.argv[1:]
there is no need to use execfile, and you can directly run the file with arguments as usual in the shell:
python repl.py one two 3
>>> t
1513989378.880822
>>> argv
['one', 'two', '3']

If you want to run the scripts in parallel and give them different arguments, you can do it like below.
import os
os.system("python script.py arg1 arg2 & python script.py arg11 arg22")
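A shell-free alternative is subprocess.Popen, which starts each child without waiting, so they run in parallel, and wait() then collects the exit codes. A sketch, with the interpreter's -c flag standing in for a real script.py:

```python
import subprocess
import sys

# Start both children without waiting; the '-c' one-liner stands in
# for a real script.py that takes two arguments.
procs = [
    subprocess.Popen([sys.executable, '-c',
                      'import sys; print(sys.argv[1], sys.argv[2])',
                      a, b])
    for a, b in [('arg1', 'arg2'), ('arg11', 'arg22')]
]
# wait() blocks until each child exits and returns its exit code.
exit_codes = [p.wait() for p in procs]
print(exit_codes)
```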

Besides subprocess.call, you can also use subprocess.Popen. Like the following
subprocess.Popen(['./script', arg1, arg2])

This works:
subprocess.call("python abc.py arg1 arg2", shell=True)
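If you start from a single command string like that, shlex.split can turn it into a list so you can skip shell=True entirely; unlike str.split, it respects quoting. A sketch (abc.py is not actually run here, since it doesn't exist in the demo):

```python
import shlex
import subprocess
import sys

command = 'python abc.py "first arg" arg2'
argv = shlex.split(command)   # respects the quotes around "first arg"
argv[0] = sys.executable      # portable path to the running interpreter
print(argv[1:])
# subprocess.call(argv) would then run abc.py without invoking a shell
```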

If you are in Spyder, whose console provides runfile:
runfile('abc.py', ['arg1', 'arg2'])

This works for me :
import subprocess
subprocess.call(['python.exe', './abc.py', arg1, arg2])

Related

Calling a python script with args from another python script

I am still a newbie to Python, so apologies in advance. I have read related topics on this but didn't find the best solution. (Run a python script from another python script, passing in args)
Basically, I have a python script (scriptB.py) that takes in a config file as argument and does some stuff. I need to call this script from another python script (scriptA.py).
If I had no arguments to pass, I could have just done
import scriptB
However, things got a little complicated because we need to pass the config file (myconfig.yml) as an argument.
One of the suggestions was to use;
os.system("python scriptB.py myconfig.yml")
But, it is often reported as not a recommended approach and that it often does not work.
Another suggestion was to use:
import subprocess
subprocess.Popen("scriptB.py myconfig.yaml", shell=True)
I am not very sure if this is a common practice.
Just want to point out that both scripts don't have any main inside the script.
Please advise on the best way to handle this.
Thanks,
this should work just fine
subprocess.Popen(['python', '/full_path/scriptB.py', 'myconfig.yaml'], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
See https://docs.python.org/3/library/subprocess.html#replacing-os-popen-os-popen2-os-popen3
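One thing to watch: if you pass stdout=PIPE you should also read the pipe, e.g. via communicate(), or a chatty child can block once the pipe buffer fills. A self-contained sketch, with a -c one-liner standing in for scriptB.py:

```python
import subprocess
import sys

# The '-c' snippet stands in for /full_path/scriptB.py reading its argument.
proc = subprocess.Popen(
    [sys.executable, '-c', 'import sys; print("config:", sys.argv[1])',
     'myconfig.yaml'],
    stdout=subprocess.PIPE, stderr=subprocess.PIPE,
)
# communicate() reads both pipes to the end and waits for the child.
out, err = proc.communicate()
print(out.decode().strip())
```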
If you really need to run a separate process, using the multiprocessing library is probably best. I would make an actual function inside scriptB.py that does the work. In the below example I consider config_handler to be a function inside scriptB.py that actually takes the config file path argument.
1.) Create a function that will handle calling your external Python script, and import your script and the function inside it that takes arguments:
scriptA.py: importing config_handler from scriptB
import multiprocessing
from scriptB import config_handler

def other_process(*args, **kwargs):
    p = multiprocessing.Process(*args, **kwargs)
    p.start()
2.) Then just call the process and feed your arguments to it:
scriptA.py: calling scriptB.py function, config_handler
other_process(name="config_process_name", target=config_handler, args=("myconfig.yml",))
Opinion:
From the information you have provided, I imagine you could manage to do this without separate processes. Just do things all in sequence and make scriptB.py a library with a function you use in scriptA.py.
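For the record, the in-sequence version is just an import plus a call. A minimal sketch, assuming scriptB.py exposes a config_handler function (the body here is a made-up stand-in for its real work):

```python
# In real code this function would live in scriptB.py:
def config_handler(config_path):
    # hypothetical stand-in for scriptB's actual config parsing
    return 'processed ' + config_path

# scriptA.py would then simply do:
#   from scriptB import config_handler
result = config_handler('myconfig.yml')
print(result)
```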
It seems you got all your answers in the old thread, but if you really want to run it through the OS, not through Python, this is what I do:
from subprocess import run, PIPE, DEVNULL
your_command = './scriptB.py myconfig.yaml'
run(your_command.split(), stdout=PIPE, stderr=DEVNULL)
In case you need the output:
output = run(your_command.split(), stdout=PIPE, stderr=DEVNULL).stdout.decode('utf-8')
If scriptB.py has a shebang line telling the shell it's a Python script, it should run correctly.
Path can be both relative and absolute.
It is for Python 3.x

Forward a shell command using python

So to be more precise, what I am trying to do is :
read a full shell command as argument of my python script like : python myPythonScript.py ls -Fl
Call that command within my python script when I'd like to (Make some loops on some folders and apply the command etc ...)
I tried this :
import subprocess
from optparse import OptionParser
from subprocess import call

def execCommand(cmd):
    call(cmd)

if __name__ == '__main__':
    parser = OptionParser()
    (options, args) = parser.parse_args()
    print args
    execCommand(args)
The result is that now I can do python myPythonScript.py ls , but I don't know how to add options. I know I can use parser.add_option , but don't know how to make it work for all options as I don't want to make only specific options available, but all possible options depending on the command I am running.
Can I use something like parser.add_option('-*') ? How can I parse the options then and call the command with its options ?
EDIT
I need my program to parse all types of commands passed as arguments: python myScript.py ls -Fl, python myScript.py git pull, python myScript.py rm -rf *, etc.
OptionParser is useful when your own program wants to process the arguments: it helps you turn string arguments into booleans or integers or list items or whatever. In your case, you just want to pass the arguments on to the program you're invoking, so don't bother with OptionParser. Just pass the arguments as given in sys.argv.
subprocess.call(sys.argv[1:])
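Putting that together, a tiny forwarder looks like this sketch (the interpreter itself stands in for ls or git so the demo runs on any platform):

```python
import subprocess
import sys

def forward(argv):
    # hand the words straight to subprocess.call; no option parsing needed
    return subprocess.call(argv)

# Demo: forward a complete command line and report its exit status.
rc = forward([sys.executable, '-c', 'print("forwarded ok")'])
print(rc)
```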
Depending on how much your program depends on command line arguments, you can go with the simple route.
Simple way of reading command line arguments
Use sys to obtain all the arguments to python command line.
import sys
print sys.argv[1:]
Then you can use subprocess to execute it.
from subprocess import call
# e.g. call(["ls", "-l"])
call(sys.argv[1:])
This sample below works fine for me.
import sys
from subprocess import call
print(sys.argv[1:])
call(sys.argv[1:])

Running a script from another python

I just want to have some ideas to know how to do that...
I have a python script that parses log files. I give it the log name as an argument, so when I want to run the script it's like this: ( python myscript.py LOGNAME )
What I'd like to do is have two scripts: one that contains the functions and another that has only the main function. I don't know how to give the argument when I run it from the second script.
here's my second script's code:
import sys
import os
path = "/myscript.py"
sys.path.append(os.path.abspath(path))
import myscript
mainFunction()
The error I have is:
script, name = argv
ValueError: need more than 1 value to unpack
Python (like most languages) shares parameters across imports and includes.
Meaning that if you run:
python mysecondscript.py heeey, those arguments will flow down into myscript.py as well.
So, check your arguments that you pass.
Script one
myscript = __import__('myscript')
myscript.mainfunction()
script two
import sys

def mainfunction():
    print sys.argv
And do:
python script_one.py parameter
You should get:
["script_one.py", "parameter"]
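To see the sharing in action without juggling two files, here is a self-contained sketch that writes the imported module on the fly (argv_probe.py is a made-up name); the imported module reads the very same sys.argv as the caller:

```python
import importlib
import sys

# Probe module, created at runtime just for this demo.
with open('argv_probe.py', 'w') as f:
    f.write('import sys\nseen = list(sys.argv)\nprint(seen)\n')
sys.path.insert(0, '.')       # make the current directory importable
importlib.invalidate_caches()

sys.argv = ['script_one.py', 'parameter']
argv_probe = importlib.import_module('argv_probe')
print(argv_probe.seen)
```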
You have several ways of doing it.
>>> execfile('filename.py')
Check the following link:
How to execute a file within the python interpreter?

execute python script with argparse inside python script

I got two different python script, the first one is using argparse to get some additional argument (I will call it arg.py) and the second one is my main script (main.py).
I want to call arg.py inside main.py but I don't know how to do so. I took a look at the execfile command but I didn't manage to make it work.
I tried to put
execfile('arg.py -h')
in main.py, but Python tries to find the file 'arg.py -h', which doesn't exist. Does anyone know how to do that?
Thanks in advance
You can run it as a separate process using subprocess.call or subprocess.Popen. If you don't want to run it as a child process, then it starts to get more complicated (depending on the structure of your arg.py script).
execfile takes the file you pass to it and includes it in the current script -- much like a #include pre-processor directive in C/C++ (although this is done dynamically). One really ugly option using execfile would be:
# completely untested.
import sys

def run_script(script_name, *args):
    _argv = sys.argv[:]
    sys.argv = [script_name] + list(args)
    execfile(script_name)
    sys.argv = _argv
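For Python 3, where execfile is gone, the same wrapper can be written with runpy; a try/finally also makes sure sys.argv is restored even if the script raises. A sketch (arg_demo.py is a throwaway file created just for the demo):

```python
import runpy
import sys

def run_script(script_name, *args):
    # Temporarily swap sys.argv, then restore it no matter what.
    saved_argv = sys.argv[:]
    sys.argv = [script_name] + list(args)
    try:
        # run_path executes the file and returns its globals dict.
        return runpy.run_path(script_name, run_name='__main__')
    finally:
        sys.argv = saved_argv

# Demo with a throwaway script that just records its arguments.
with open('arg_demo.py', 'w') as f:
    f.write('import sys\nseen = sys.argv[1:]\nprint(seen)\n')
result = run_script('arg_demo.py', '-h')
print(result['seen'])
```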

Best way to call a python script from within a python script multiple times

I need to execute a python script from within another python-script multiple times with different arguments.
I know this sounds horrible but there are reasons for it.
Problem is however that the callee-script does not check if it is imported or executed (if __name__ == '__main__': ...).
I know I could use subprocess.Popen("python.exe callee.py -arg") but that seems to be much slower than it should be, and I guess that's because python.exe is being started and terminated multiple times.
I can't import the script as a module regularly because of its design as described in the beginning - upon import it will be executed without args because it's missing a main() method.
I can't change the callee script either.
As I understand it I can't use execfile() either because it doesn't take arguments.
Found the solution for you. You can reload a module in Python, and you can patch sys.argv.
Imagine echo.py is the callee script you want to call multiple times:
#!/usr/bin/env python
# file: echo.py
import sys
print sys.argv
You can do as your caller script :
#!/usr/bin/env python
# file: test.py
import sys
sys.argv[1] = 'test1'
import echo
sys.argv[1] = 'test2'
reload(echo)
And call it for example with: python test.py place_holder (the extra argument is needed so that sys.argv[1] exists before it is overwritten).
it will printout :
['test.py', 'test1']
['test.py', 'test2']
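On Python 3, reload lives in importlib but the trick is the same. A self-contained sketch (echo_demo.py is written on the fly to play the role of echo.py):

```python
import importlib
import sys

# echo_demo.py plays the role of echo.py from the answer above.
with open('echo_demo.py', 'w') as f:
    f.write('import sys\nseen = list(sys.argv)\nprint(seen)\n')
sys.path.insert(0, '.')       # make the current directory importable
importlib.invalidate_caches()

sys.argv = ['test.py', 'test1']
echo_demo = importlib.import_module('echo_demo')
sys.argv = ['test.py', 'test2']
importlib.reload(echo_demo)   # re-executes the module with the new argv
```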
