My problem might be very easy for some experienced Python programmers (I'm not one of them).
I'm trying to start a server with an argument.
Currently I'm doing:
def startServer(host):
    host.cmd('python server.py &')
Is there a way to pass an argument to server.py?
def startServer(host):
    host.cmd('python server.py an_argument_to_server &')
Is that really what you are looking for?
In order to run server.py from the command line, we can use the subprocess module. Documentation can be found here: https://docs.python.org/2/library/subprocess.html.
import subprocess
def startServer(host, argument):
    subprocess.call(['python', 'server.py', argument])
When you call the startServer function with some argument (say an_argument_to_server), it will run python server.py an_argument_to_server in the command line. Note that subprocess.call blocks until the command finishes; it does not run in the background. If you need the server to keep running while your script continues, use subprocess.Popen instead of appending the & character. The subprocess module also works for Python 3.x.
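For example, a minimal sketch of the non-blocking variant (same server.py as above):

import subprocess

def startServer(host, argument):
    # Popen returns immediately; the server keeps running in the background
    p = subprocess.Popen(['python', 'server.py', argument])
    return p  # keep the handle if you later want p.terminate() or p.wait()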
So I'm messing around with the "cmd" module for Python. I want a command where you can type "python" and it opens a Python command line, sort of like an actual command line would.
Here's my current code.
import cmd

class pythonCmd(cmd.Cmd):
    def do_(self, args):  # <--- I want this command to have it so you don't type a key word
        exec(args)

class cmdLine(cmd.Cmd):
    def do_python(self, args):
        prompt = pythonCmd()
        prompt.prompt = 'python> '
        prompt.cmdloop('Python 3.8.2')

prompt = cmdLine()
prompt.prompt = '> '
prompt.cmdloop('Command line starting . . .')
I don't know whether you have to use the cmd module or not, but there are better modules for this sort of thing, such as subprocess and os.
I recently used the subprocess module; try it.
How about this:
Instead of running your program so that it opens a shell that can take both Python commands and potentially your own commands,
run the Python shell and import your program module: you get a native Python shell that can run Python code.
Add support for additional commands by implementing a function like cmd(args) which you can call from the shell. You may need to work on your module to simplify using it in an interactive Python shell, e.g. by providing "aliases" to existing functions.
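A hedged sketch of that idea using the stdlib code module (the cmd helper here is a made-up stand-in for your module's functions):

import code

def cmd(args):
    # made-up placeholder for one of your module's commands
    print('handling:', args)

# drops into a real Python REPL where cmd(...) is already defined
code.interact(banner='Python shell with extra commands', local={'cmd': cmd})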
With a do_shell method you can use the "!" syntax. For example:
> !print("Henlo world")
This would print it; you can use other commands too.
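A minimal sketch of how that hooks in, assuming a cmd.Cmd subclass like the one above:

import cmd

class cmdLine(cmd.Cmd):
    def do_shell(self, args):
        # cmd.Cmd routes any input line starting with '!' to do_shell,
        # so `!print("Henlo world")` arrives here as args = 'print("Henlo world")'
        exec(args)

cmdLine().cmdloop()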
I am still a newbie to python, so apologies in advance. I have read related topics on this but didn't find the best solution. (Run a python script from another python script, passing in args)
Basically, I have a python script (scriptB.py) that takes in a config file as argument and does some stuff. I need to call this script from another python script (scriptA.py).
If I had no arguments to pass, I could have just done
import scriptB
However, things got a little complicated because we need to pass the config file (myconfig.yml) as an argument.
One of the suggestions was to use;
os.system('python scriptB.py myconfig.yml')
But it is often reported as not a recommended approach, and that it often does not work.
Another suggestion was to use:
import subprocess
subprocess.Popen("scriptB.py myconfig.yaml", shell=True)
I am not very sure if this is a common practice.
Just want to point out that neither script has a main inside it.
Please advise on the best way to handle this.
Thanks,
This should work just fine:
from subprocess import Popen, PIPE

p = Popen(['python', '/full_path/scriptB.py', 'myconfig.yaml'], stdout=PIPE, stderr=PIPE)
See https://docs.python.org/3/library/subprocess.html#replacing-os-popen-os-popen2-os-popen3
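If you also want to read what the script printed, a small follow-up using the p handle from above:

out, err = p.communicate()  # waits for scriptB.py to exit and collects both streams
print(out.decode())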
If you really need to run a separate process, using the multiprocessing library is probably best. I would make an actual function inside scriptB.py that does the work. In the example below, I consider config_handler to be a function inside scriptB.py that takes the config file path as an argument.
1.) Create a function that will handle calling your external python script; also, import your script and the method inside it that takes arguments:
scriptA.py: importing config_handler from scriptB
import multiprocessing
from scriptB import config_handler
def other_process(**kwargs):
    # keyword arguments are passed straight through to multiprocessing.Process
    p = multiprocessing.Process(**kwargs)
    p.start()
2.) Then just call the process and feed your arguments to it:
scriptA.py: calling scriptB.py function, config_handler
other_process(name="config_process_name", target=config_handler, args=("myconfig.yml",))
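If scriptA should wait for the config work to finish before continuing, a small variant under the same assumptions:

p = multiprocessing.Process(target=config_handler, args=("myconfig.yml",))
p.start()
p.join()  # block until config_handler returns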
Opinion:
From the information you have provided, I imagine you could manage to do this without separate processes. Just do things in sequence and make scriptB.py a library with a function you use in scriptA.py.
It seems you got all your answers in the old thread, but if you really want to run it through the OS, not through Python, this is what I do:
from subprocess import run, PIPE, DEVNULL
your_command = './scriptB.py myconfig.yaml'
run(your_command.split(), stdout=PIPE, stderr=DEVNULL)
In case you need the output:
output = run(your_command.split(), stdout=PIPE, stderr=DEVNULL).stdout.decode('utf-8')
If scriptB.py has a shebang header telling bash that it's a Python script, it should run correctly.
The path can be either relative or absolute.
This is for Python 3.x.
To launch a python script (it is needed for running an OLED display) from the terminal, I have to use the following bash command: python demo_oled_v01.py --display ssd1351 --width 128 --height 128 --interface spi --gpio-data-command 20. Those parameters after .py are important; otherwise, the script will run with default settings and, in my case, will not launch. Thus, those parameters are needed.
The problem arises when I need to launch my script from another python script, instead of using bash commands in the terminal. To launch one of my python scripts from a parent script, I have used:
import subprocess # to use subprocess
p = subprocess.Popen(['python', 'demo_oled_v01.py --display ssd1351 --width 128 --height 128 --interface spi --gpio-data-command 20'])
in my parent script but I got an error stating:
python: can't open file 'demo_oled_v01.py --display ssd1351 --width 128 --height 128 --interface spi --gpio-data-command 20': [Errno 2] No such file or directory
I suspect that adding the parameters --display ssd1351 --width 128 --height 128 --interface spi --gpio-data-command 20 after .py may be causing the difficulty in launching the script. As mentioned, these parameters are otherwise essential to include when launching with bash commands in the terminal. How can I use subprocess with the required parameters to launch this script?
The subprocess library is interpreting your whole string, demo_oled_v01.py --display ssd1351 ... --gpio-data-command 20, as a single argument to python. That's why python is complaining that it cannot locate a file with that name. Try running it as:
p = subprocess.Popen(['python', 'demo_oled_v01.py', '--display', 'ssd1351',
                      '--width', '128', '--height', '128', '--interface', 'spi',
                      '--gpio-data-command', '20'])
See more information on Popen in the subprocess documentation: https://docs.python.org/3/library/subprocess.html#subprocess.Popen
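If you would rather keep the command as one readable string, the stdlib shlex module can build the argument list for you; a sketch under that assumption:

import shlex
import subprocess

cmd = ('python demo_oled_v01.py --display ssd1351 --width 128 '
       '--height 128 --interface spi --gpio-data-command 20')
# shlex.split turns the string into the list form that Popen expects
p = subprocess.Popen(shlex.split(cmd))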
This started as a comment thread, but got too long and complex.
Calling Python as a subprocess of Python is an antipattern. You can often fruitfully avoid this by refactoring your Python code so that your program can call the other program as a simple library (or module, or package, or what have you -- there is a bit of terminology here which you'll want to understand more properly ... eventually).
Having said that, there are scenarios where the subprocess needs to be a subprocess (perhaps it is designed to do its own signal handling, for example) so don't apply this blindly.
If you have a script like demo.py which contains something like
def really_demo(something, other, message='No message'):
    # .... some functionality here ...
    pass

def main():
    import argparse
    parser = argparse.ArgumentParser(description='Basic boilerplate, ignore the details.')
    parser.add_argument('--something', dest='something')  # store argument in args.something
    parser.add_argument('--other', dest='other')  # ends up in args.other
    parser.add_argument('--message', dest='message')  # ends up in args.message
    # ... etc etc etc more options
    args = parser.parse_args()
    # This is the beef: once the arguments are parsed, pass them on
    really_demo(args.something, args.other, message=args.message)

if __name__ == '__main__':
    main()
Observe how when you run the script from the command line, __name__ will be '__main__' and so it will plunge into the main() function, which picks apart the command line and then calls some other function -- in this case, really_demo(). Now, if you are calling this code from an already running Python, there is no need to collect the arguments into a list and pass them to a new process. Just have your Python script import the function you want to call from the script, and call it with your arguments.
In other words, if you are currently doing
subprocess.call(['demo.py', '--something', 'foo', '--other', value, '--message', 'whatever'])
you can replace the subprocess call with
from demo import really_demo

really_demo('foo', value, message='whatever')
Notice how you are bypassing the main() function and all the ugly command-line parsing, and simply calling another Python function. (Pay attention to the order and names of the arguments; they may be quite different from what the command-line parser accepts.) The fact that it is defined in a different file is a minor detail which import handles for you, and the fact that the file contains other functions is something you can ignore (or perhaps exploit more fully if, for example, you want to access internal functions which are not exposed via the command-line interface in a way which is convenient for you).
As an optimization, Python won't import the same module twice, so any top-level code in the imported file runs only once; make sure the functionality you need is wrapped in functions rather than run at import time. Commonly, you import once, at the beginning of your script (though technically you can do it inside the def which needs it, for example if there is only one place in your code which depends on the import), and then call the functions you got from the import as many or as few times as you need them.
This is a lightning recap of a very common question. If this doesn't get you started in the right direction, you should be able to find many existing questions on Stack Overflow about various aspects of this refactoring task.
Add the full path to the python script & separate all parameters.
EX:
import subprocess
p = subprocess.Popen(['python', 'FULL_PATH_TO_FILE/demo_oled_v01.py', '--display', 'ssd1351',
                      '--width', '128', '--height', '128', '--interface', 'spi',
                      '--gpio-data-command', '20'])
For Windows and Python 3.x, you could:
Use a Windows shell (cmd.exe, most probably the default on Windows):
import subprocess

result = subprocess.Popen('cd C:\\Users\\PathToMyPythonScript '
                          '&& python myPythonScript.py value1ofParam1 value2ofParam2',
                          shell=True, universal_newlines=True,
                          stdout=subprocess.PIPE, stderr=subprocess.PIPE)
output, error = result.communicate()
print(output)
Or stay in your Python environment (without shell=True); you can write:
# /c runs the command and exits (cmd /k would keep the shell alive
# and communicate() would never return)
result = subprocess.Popen(["C:\\Windows\\System32\\cmd.exe", "/c",
                           "cd", "C:\\Users\\PathToMyPythonScript",
                           "&&", "dir", "&&", "python", "myPythonScript.py",
                           "value1ofParam1", "value2ofParam2"], universal_newlines=True,
                          stdout=subprocess.PIPE, stderr=subprocess.PIPE)
output, error = result.communicate()
print(output)
Example of a script file you can call for a try (the "MyPythonScript.py"):
# =============================================================================
# This script just outputs the arguments you've passed to it
import sys
print('Number of arguments:', len(sys.argv), 'arguments.')
print('Argument List:', str(sys.argv))
# =============================================================================
So to be more precise, what I am trying to do is:
read a full shell command as the arguments of my python script, like: python myPythonScript.py ls -Fl
call that command within my python script whenever I'd like to (make some loops over some folders and apply the command, etc ...)
I tried this:
import subprocess
from optparse import OptionParser
from subprocess import call

def execCommand(cmd):
    call(cmd)

if __name__ == '__main__':
    parser = OptionParser()
    (options, args) = parser.parse_args()
    print args
    execCommand(args)
The result is that now I can do python myPythonScript.py ls, but I don't know how to add options. I know I can use parser.add_option, but I don't know how to make it work for all options, as I don't want to make only specific options available, but all possible options depending on the command I am running.
Can I use something like parser.add_option('-*')? How can I parse the options then and call the command with its options?
EDIT
I need my program to parse all types of commands passed as arguments: python myScript.py ls -Fl, python myScript.py git pull, python myScript.py rm -rf *, etc ...
OptionParser is useful when your own program wants to process the arguments: it helps you turn string arguments into booleans or integers or list items or whatever. In your case, you just want to pass the arguments on to the program you're invoking, so don't bother with OptionParser. Just pass the arguments as given in sys.argv.
subprocess.call(sys.argv[1:])
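Put together, a minimal sketch of what myScript.py can be under this approach:

import subprocess
import sys

# everything after the script name is the command to run,
# e.g. `python myScript.py ls -Fl` executes `ls -Fl` with its options intact
subprocess.call(sys.argv[1:])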
Depending on how much your program depends on command line arguments, you can go the simple route.
Simple way of reading command line arguments
Use sys to obtain all the arguments passed on the command line.
import sys
print sys.argv[1:]
Then you can use subprocess to execute it.
from subprocess import call
# e.g. call(["ls", "-l"])
call(sys.argv[1:])
The sample below works fine for me (Python 3):
import sys
from subprocess import call
print(sys.argv[1:])
call(sys.argv[1:])
As a personal project to improve my python skills I created a script that retrieves weather data. It takes multiple command line arguments to specify the location and what specific information is wanted.
I'd like to make a second file to run it with specific command line arguments using a double click. I already learned how to make it into an executable/make a second file execute it. However, I don't know how to run it with command line arguments.
Currently my secondary file (wrapper?.. unsure of terminology) looks like this:
#! /usr/bin/env python
import weather
weather.main()
This runs, but I don't know how to create command line arguments for it without running from the shell. I'd like to have a simple executable that quickly runs the weather for where I am.
Well, you can call a shell process using the os.system function or the subprocess module.
os.system takes a string and passes it as a command to a shell.
import os
os.system("ls -1")
Whereas subprocess takes a list of all the arguments (the program itself being the first argument) and passes it as a command.
import subprocess

# Simple command (no shell=True here: with an argument list,
# shell=True would hand '-1' to the shell instead of to ls)
subprocess.call(['ls', '-1'])
Seeing these examples, it's easy to tell that you want the executable program to call one of these (os.system or subprocess). I recommend using the latter, as it offers more flexibility.
If you want more information, I suggest you read the review of subprocess on Python Module of the Week.
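For this particular weather script, a hedged sketch of what the double-clickable file could do (the --location flag is made up; substitute whatever arguments weather.py actually accepts):

import subprocess
import sys

# sys.executable is the Python interpreter running this wrapper
subprocess.call([sys.executable, 'weather.py', '--location', 'here'])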
Add to your wrapper script:
import sys
sys.argv[1:] = ['what', 'ever', 'u', 'want']
before the call to weather.main().
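So the whole wrapper might look like this (the placeholder arguments are the same ones shown above; replace them with whatever weather.main() expects to find in sys.argv):

#! /usr/bin/env python
import sys
import weather

# fake the command line before weather.main() reads sys.argv
sys.argv[1:] = ['what', 'ever', 'u', 'want']
weather.main()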