I want to invoke a specific command shell in Python to execute some scripts. For example, in R, I use:
system2(os_shell = "C:/Program Files (x86)/pgAdmin III/1.18/pg_dump.exe", args = "very long argument")
Thanks to this code, I can back up my PostgreSQL tables with a for loop.
Problem: I haven't found an equivalent in Python.
If you just want to execute an application with arguments in a shell, you can do that very easily with os.system:
import os
os.system('"C:/Program Files (x86)/pgAdmin III/1.18/pg_dump.exe" very long argument')
Though I would recommend subprocess.call or subprocess.run instead:
import subprocess
subprocess.call(["C:/Program Files (x86)/pgAdmin III/1.18/pg_dump.exe", "argument1", "argument2"])  # and you can add as many arguments as you want
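For the original pg_dump use case, a minimal sketch with subprocess.run might look like this; the table names, the database name mydb, and the output file names are hypothetical, so adjust them to your database:
import subprocess

PG_DUMP = "C:/Program Files (x86)/pgAdmin III/1.18/pg_dump.exe"
tables = ["customers", "orders"]  # hypothetical table names

for table in tables:
    # Each argument is its own list element, so no shell quoting is needed.
    subprocess.run([PG_DUMP, "--table", table,
                    "--file", "backup_" + table + ".sql", "mydb"],
                   check=True)  # raises CalledProcessError if pg_dump fails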
Related
I'm having some trouble understanding the subprocess module in Python 2.7. I have some commands in a shell script that I'm trying to convert into Python, svn export -r 5 ... for example, but I don't want to depend on a library such as pysvn to do this. The solution to that (to my understanding) is to use subprocess and just run each individual command that would be in the shell script. Should this be achieved by subprocess.call("svn export -r 5", shell=True)? Or is Popen what I should be looking at? I know it's been said you should avoid shell=True, but there is no security concern or possible user error in my case. Any advice would be appreciated.
subprocess.call is just a thin wrapper around subprocess.Popen that waits for the process to complete:
def call(*args, **kwargs):
    return Popen(*args, **kwargs).wait()
The only reason to use the shell to run your command is if you want to run some more or less complicated shell command. With a single simple command and its arguments, it is better to pass a single list of strings consisting of the command name and its arguments.
subprocess.call(["svn", "export", "-r", "5"])
If you were writing a function that could, for example, take a revision number as an argument, you can pass that to svn export as long as you ensure that it is a string:
def svn_export(r):
    subprocess.call(["svn", "export", "-r", str(r)])
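Called with an integer revision number, str() takes care of the conversion; as in the question, the repository URL and target path that svn export normally needs are omitted:
svn_export(5)  # effectively runs: svn export -r 5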
So to be more precise, what I am trying to do is:
read a full shell command as an argument of my Python script, like: python myPythonScript.py ls -Fl
call that command within my Python script whenever I'd like (make some loops over some folders and apply the command, etc.)
I tried this :
import subprocess
from optparse import OptionParser
from subprocess import call
def execCommand(cmd):
    call(cmd)

if __name__ == '__main__':
    parser = OptionParser()
    (options, args) = parser.parse_args()
    print args
    execCommand(args)
The result is that now I can do python myPythonScript.py ls, but I don't know how to add options. I know I can use parser.add_option, but I don't know how to make it work for all options, as I don't want to make only specific options available but all possible options, depending on the command I am running.
Can I use something like parser.add_option('-*')? How can I parse the options then and call the command with its options?
EDIT
I need my program to parse all types of commands passed as arguments: python myScript.py ls -Fl, python myScript.py git pull, python myScript.py rm -rf *, etc.
OptionParser is useful when your own program wants to process the arguments: it helps you turn string arguments into booleans or integers or list items or whatever. In your case, you just want to pass the arguments on to the program you're invoking, so don't bother with OptionParser. Just pass the arguments as given in sys.argv.
import sys, subprocess
subprocess.call(sys.argv[1:])
Depending on how much your program depends on command line arguments, you can go with the simple route.
Simple way of reading command line arguments
Use sys to obtain all the arguments passed on the Python command line.
import sys
print(sys.argv[1:])
Then you can use subprocess to execute it.
from subprocess import call
# e.g. call(["ls", "-l"])
call(sys.argv[1:])
The sample below works fine for me.
import sys
from subprocess import call
print(sys.argv[1:])
call(sys.argv[1:])
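Saved as, say, myPythonScript.py, an invocation such as
python myPythonScript.py ls -Fl
prints ['ls', '-Fl'] and then executes ls -Fl in the current directory.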
As a personal project to improve my python skills I created a script that retrieves weather data. It takes multiple command line arguments to specify the location and what specific information is wanted.
I'd like to make a second file to run it with specific command line arguments using a double click. I already learned how to make it into an executable/make a second file execute it. However, I don't know how to run it with command line arguments.
Currently my secondary file (wrapper?.. unsure of terminology) looks like this:
#! /usr/bin/env python
import weather
weather.main()
This runs but I don't know how to create command line arguments for it without running from the shell. I'd like to have a simple executable to run the weather for where I am quickly.
Well, you can call a shell process using os.system or the subprocess module.
os.system takes a string and passes it as a command to a shell.
import os
os.system("ls -1")
Whereas subprocess takes a list of all the arguments (the program itself being the first argument) and passes it as a command.
import subprocess
# Simple command; the program and each argument are separate list items
subprocess.call(['ls', '-1'])
Seeing these examples, it's easy to tell that you want the executable program to call either one of these (os.system or subprocess). I recommend using the latter, as it offers more flexibility.
If you want more information, I suggest you read the review of subprocess on Python Module of the Week.
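Applied to the weather script, the double-clickable launcher could run it as a child process. The script name and the --city flag below are hypothetical; substitute whatever arguments weather.py actually accepts:
#! /usr/bin/env python
import sys
import subprocess

# Hypothetical flag and value; replace with the arguments your script expects.
subprocess.call([sys.executable, 'weather.py', '--city', 'Boston'])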
Add to your wrapper script:
import sys
sys.argv[1:] = ['what', 'ever', 'u', 'want']
before the call to weather.main().
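Put together, the whole wrapper might look like this (the placeholder arguments stand in for whatever weather.main() expects to see):
#! /usr/bin/env python
import sys
import weather

# Placeholder arguments; weather.main() will see these in sys.argv.
sys.argv[1:] = ['what', 'ever', 'u', 'want']
weather.main()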
Is it possible to send a list of related commands using os.system() in Python? I mean, if I want to change the current directory to a specific directory and then list its contents, how can I do it? (I don't want to use dir "path"; I want to do both: change the current dir and list the directories.)
Note: it was just an example; I want to know how I can send multiple commands (some related commands in a row).
Note: Python 3.2
os.system uses the local system shell. You can do it as @Rwaing suggests on many unixy shells, but not in other places like Windows. A better option is subprocess.call and the cwd (current working directory) parameter:
import subprocess
subprocess.call('dir', shell=True, cwd='somepath')
As others have mentioned, if all you really want to do is get a list of the files, the existing Python API does it quite well.
Edit: sending multiple commands
One way to send multiple commands is to pump them into the child shell's stdin. It's shell-dependent, but here's a Windows example:
import os
import subprocess as subp

# Start cmd.exe with a pipe attached to its stdin.
p = subp.Popen('cmd.exe', stdin=subp.PIPE)
p.stdin.write(b"""dir
cd "\\program files"
dir
""")
p.stdin.write(b'exit' + os.linesep.encode())
p.stdin.close()  # flush the pipe so cmd.exe sees the commands
p.wait()
print('done')
No need for system calls here. The os functions chdir and listdir will change your current directory and list the files in a directory, respectively.
have a look at os.listdir(path):
https://docs.python.org/3/library/os.html#os.listdir
example:
import os
entries = os.listdir('/home/foo')
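Combining the two, a minimal sketch of the pure-Python equivalent of cd followed by dir (the path here is just an example):
import os

os.chdir('/home/foo')     # equivalent of: cd /home/foo
print(os.listdir('.'))    # equivalent of: dir (or ls)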
I have two different Python scripts: the first one uses argparse to get some additional arguments (I will call it arg.py) and the second one is my main script (main.py).
I want to call arg.py inside main.py but I don't know how to do so. I took a look at the execfile function but I didn't manage to make it work.
I tried to put
execfile('arg.py -h')
in main.py, but Python tries to find the file 'arg.py -h', which doesn't exist. Does anyone know how to do that?
Thanks in advance
You can run it as a separate process using subprocess.call or subprocess.Popen. If you don't want to run it as a child process, then it starts to get more complicated (depending on the structure of your arg.py script).
execfile takes the file you pass to it and includes it in the current script, much like a #include pre-processor directive in C/C++ (although this is done dynamically). One really ugly option using execfile would be:
# completely untested.
import sys

def run_script(script_name, *args):
    _argv = sys.argv[:]                    # save the real argv
    sys.argv = [script_name] + list(args)  # argparse reads sys.argv[1:]
    execfile(script_name)
    sys.argv = _argv                       # restore it afterwards
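With that helper, the question's arg.py -h example becomes (note that execfile only exists in Python 2; it was removed in Python 3):
run_script('arg.py', '-h')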