I want to send commands to run a python script to the Linux terminal. I have a list of python files which I want to run and I want to run them one after the other as we read the list sequentially. Once the first file is finished, it should send the second one to run and so on.
You can run scripts sequentially using the following command:
python script1.py && python script2.py && python script3.py
&& runs the next script only if the previous has run successfully.
You can iterate through the list using the subprocess module:
import subprocess
script_list = ['script1.py', 'script2.py']
for script in script_list:
    args = ['python', script]
    p = subprocess.check_call(args)
You can use the check_call function of the subprocess module, which is a blocking call. When you iterate through the list, the scripts will run one after another.
import subprocess
files = ['script1.py', 'script2.py']
for _file in files:
    call_output = subprocess.check_call(['python', _file])
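If one of the scripts can fail and you still want the rest of the list to run (check_call raises CalledProcessError on a non-zero exit status, which would otherwise abort the loop), a small sketch of handling that, reusing the file names from above:

import subprocess

files = ['script1.py', 'script2.py']
for _file in files:
    try:
        subprocess.check_call(['python', _file])
    except subprocess.CalledProcessError as e:
        # check_call raises when the script exits with a non-zero status;
        # report it and move on to the next script in the list.
        print('{} failed with exit code {}'.format(_file, e.returncode))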
I downloaded a script written in Python, called let's say 'myScript.py', but I don't know how to run it.
It has the following structure:
import numpy as np
import argparse
parser = argparse.ArgumentParser(description='manual to this script')
parser.add_argument('--file', type=str, default = None)
parser.add_argument('--timeCor', type=bool, default = False)
parser.add_argument('--iteration', type=str, default = 'Newton')
args = parser.parse_args()
def func1(file):
    ...

def func2(file):
    ...

def calculate(data, timeCor=False, iteration='Newton'):
    ...

if __name__ == "__main__":
    print('\n--- Calculating ---\nN file:',args.file)
    print('Time correction =',args.timeCor,
          '\nIteration strategy =',args.iteration,'\n')
    rawdata,data = func2(args.file)
    pos = calculate(data,timeCor=args.timeCor,iteration=args.iteration)
    for each in pos:
        print('Pos:',format(np.uint8(each[0]),'2d'),each[1:-1])
    np.savetxt('pos.csv',pos,delimiter=',')
    print('--- Save file as "pos.csv" in the current directory ---')
How can I run it from command line? And from another script?
Thank you in advance!
If I understood your question correctly, you are asking about executing this python program from another python script. This can be done by making use of subprocess.
import subprocess
process = subprocess.run(["python3", "path-to-myScript.py", "command line arguments here"], capture_output=True)
output = process.stdout.decode() # stdout is bytes
print(output)
If you do not want to provide the command as a list, you can add shell=True as an argument to subprocess.run(), or you can use shlex.split() to build the list from a single command string.
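For example, a small sketch of the shlex.split() variant (the script path and arguments here are placeholders):

import shlex
import subprocess

# shlex.split turns a single command string into the argument list
# that subprocess.run expects, so you don't have to build it by hand.
cmd = shlex.split("python3 path-to-myScript.py --file data.txt --iteration Newton")
process = subprocess.run(cmd, capture_output=True)
print(process.stdout.decode())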
If you are on Windows, replace python3 with python.
However, this solution is not very portable. On a more general note, if you do not strictly need to run the script as a separate process, I would recommend importing it and calling its functions directly instead.
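A sketch of that approach, assuming myScript.py sits in the current directory (or on sys.path); note that the module-level parse_args() in the script runs on import, so the calling script's own command-line arguments need to be compatible with it:

import myScript  # the module-level argparse code runs at import time

# Call the script's functions directly instead of launching a new process.
# 'measurements.dat' is a made-up input file name used only for illustration.
rawdata, data = myScript.func2('measurements.dat')
pos = myScript.calculate(data, timeCor=True, iteration='Newton')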
To run the python script from command line:
python myScript.py --arguments
And as #treuss has said, if you are on a Unix system (macOS or Linux) you can add a shebang to the top of the script, but I recommend using the following instead:
#!/usr/bin/env python3
for portability's sake, as the actual path of python3 may vary.
To run the edited program:
chmod +x myScript # to make file executable, and note that there is no longer a .py extension as it is not necessary
./myScript --arguments
Run it by executing:
python myScript.py
Alternatively, on systems which support it (Unix-like systems), you can add what's called a shebang to the first line of the file:
#!/usr/bin/python
Note that /usr/bin/python should be the actual path where your python is located. After that, make the file executable:
chmod u+x myScript.py
and run it either from the current directory:
./myScript.py
or from a different directory with full or relative path:
/path/to/python/scripts/myScript.py
I'm trying to write GPS data into a CSV through Python 3 on a Raspberry Pi. Writing the file works when the commands are run directly through the console, but when they are run from Python, the file opens and then an error is returned (usually that another process is running). We wrote in another line to kill the process, but it's still not writing to the CSV. Any tips?
import math
import time
import os
os.system('sudo fuser -k /dev/ttyAMAO')
os.system('stty -F /dev/ttyAMAO 9600')
os.system('sudo gpsd /dev/ttyAMAO -F /var/run/gpsd.sock')
os.system('sudo gpsmon /dev/ttyAMAO -l /home/pi/Desktop/GPSDATA.txt')
Please note that os.system() executes the command in a subshell. This means that the PID of the shell executing the command will change with each command.
A simple solution is to chain the commands in your call to os.system().
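For instance, the commands from the question could be chained into a single os.system() call like this (a sketch; && stops the chain if any step fails):

import os

# One subshell for all three commands instead of one subshell per command.
os.system(
    'sudo fuser -k /dev/ttyAMAO && '
    'stty -F /dev/ttyAMAO 9600 && '
    'sudo gpsd /dev/ttyAMAO -F /var/run/gpsd.sock'
)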
In my main Python script, I want to call another python script to run, as follows:
python2 ~/script_location/my_side_script.py \
    --input-dir folder1/in_folder \
    --output-dir folder1/out_folder/ \
    --image-ext jpg
From inside my Python script, how exactly can I do this?
I will be using both Windows and Ubuntu, but primarily the latter. Ideally would like to be able to do on both.
You could import the script in your main file.
Suppose you have two files: myscript.py and main.py
# myscript.py
print('this is my script!')
# main.py
print('this is my main file')
import myscript
print('end')
The output if you run main.py would be:
this is my main file
this is my script!
end
EDIT: If you literally just want to call python2 my_side_script.py --options asdf, you could use the subprocess python module:
import subprocess
stdout = subprocess.check_output(['python2', 'my_side_script.py', '--options', 'asdf'])
print(stdout) # will print any output from your sidescript
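Applied to the exact command from the question, that might look like this (note that ~ is not expanded inside an argument list, so it is expanded explicitly here):

import os
import subprocess

script = os.path.expanduser('~/script_location/my_side_script.py')
stdout = subprocess.check_output([
    'python2', script,
    '--input-dir', 'folder1/in_folder',
    '--output-dir', 'folder1/out_folder/',
    '--image-ext', 'jpg',
])
print(stdout.decode())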
Not sure if this is possible. I have a set of python scripts and have modified the linux PATH in ~/.bashrc so that whenever I open a terminal, the python scripts are available to run as a command.
export PATH=$PATH:/home/user/pythonlib/
my_command.py resides in the above path.
I can run my_command.py (args) from anywhere in terminal and it will run the python scripts.
I'd like to control this functionality from a different python script as this will be the quickest solution to automating my processing routines. So I need it to open a terminal and run my_command.py (args) from within the python script I'm working on.
I have tried subprocess:
import subprocess
test = subprocess.Popen(["my_command.py"], stdout=subprocess.PIPE)
output = test.communicate()[0]
While my_command.py is typically available in any terminal I launch, here I have no access to it; subprocess returns "file not found".
I can start a new terminal using os, then type my_command.py manually, and it works:
os.system("x-terminal-emulator -e /bin/bash")
So, is there a way to get the second method to accept a script you want to run from python with args?
Ubuntu 16
Thanks :)
Popen does not pick up the PATH additions you made in ~/.bashrc, because those only apply to interactive shells. You have to modify the PATH for the process you create so that it includes the directory containing your scripts, like so:
import os
import shlex
import subprocess

someterminalcommand = "my_command.py (args)"

my_env = os.environ.copy()
my_env["PATH"] = "/home/usr/mypythonlib/:" + my_env["PATH"]
combine = subprocess.Popen(shlex.split(someterminalcommand), env=my_env)
combine.wait()
This allows me to run my "my_command.py" file from a different python session just like I had a terminal window open.
If you're using Gnome, the gnome-terminal command is rather useful in this situation.
As an example of very basic usage, the following code will spawn a terminal, and run a Python REPL in it:
import subprocess
subprocess.Popen(["gnome-terminal", "-e", "python"])
Now, if you want to run a specific script, you will need to concatenate its path with python, as the last element of that list is the command line that will be executed in the new terminal.
For instance:
subprocess.Popen(["gnome-terminal", "-e", "python my_script.py"])
If your script is executable, you can omit python:
subprocess.Popen(["gnome-terminal", "-e", "my_script.py"])
If you want to pass parameters to your script, simply add them to the python command:
subprocess.Popen(["gnome-terminal", "-e", "python my_script.py var1 var2"])
Note that if you want to run your script with a particular version of Python, you should specify it, by explicitly calling "python2" or "python3".
A small example:
# my_script.py
import sys
print(sys.argv)
input()
# main.py
import subprocess
subprocess.Popen(["gnome-terminal", "-e", "python3 my_script.py hello world"])
Running python3 main.py will spawn a new terminal with ['my_script.py', 'hello', 'world'] printed, waiting for input.
I execute several commands in Python on Windows using subprocess.call(), but for each one I need to run a batch file that sets up the environment before calling the proper command. It looks like this:
subprocess.call(precommand + command)
Is there a way to "create" a shell in Python that executes the batch file only once, and in which the commands can then be executed several times?
Write commands to a bat-file (tempfile.NamedTemporaryFile())
Run the bat-file (subprocess.check_call(bat_file.name))
(not tested):
#!/usr/bin/env python
from __future__ import print_function
import os
import subprocess
import tempfile
with tempfile.NamedTemporaryFile('w', suffix='.bat', delete=False) as bat_file:
    print(precommand, file=bat_file)
    print(command, file=bat_file)

rc = subprocess.call(bat_file.name)
os.remove(bat_file.name)
if rc != 0:
    raise subprocess.CalledProcessError(rc, bat_file.name)
Do you need to get the output from every command separately? If not, you can chain these commands using &&, || or ;:
cd dir && cp test1 test2 && cd -
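In Python, the same idea means passing the whole chained command to a single subprocess.call() with shell=True, so the environment set up by the batch file is still in effect when the following command runs. A sketch with made-up names ('setup_env.bat' and 'build.exe' are placeholders for your batch file and command):

import subprocess

# Chaining with && keeps both steps in one cmd.exe session, so environment
# variables set by the batch file are visible to the command that follows.
rc = subprocess.call('setup_env.bat && build.exe --flag', shell=True)
if rc != 0:
    raise RuntimeError('command chain failed with exit code {}'.format(rc))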