Run command in CMD via Python and extract the data - python

I am trying to use the below code to run a command and extract the data from the cmd.
The file with the commands and data is a txt file (let me know if I should change it or use Excel if that is better).
The commands look something like this: ping "host name", which produces some data in the cmd. There is a list of these in the file, so it would ping "hostname1", then on line two ping "hostname2", etc.
THE QUESTION: I want it to run every line individually, extract the results from the cmd, and store them in a txt or Excel file. Ideally I want all the results in the same file. Is this possible, and how?
Here is the code so far:
import pathlib
import subprocess

root_dir = pathlib.Path(r"path to file here")
cmds_file = root_dir.joinpath('actual file here with commands and data')
#fail = []
cmds = cmds_file.read_text().splitlines()
try:
    for cmd in cmds:
        args = cmd.split()
        print(f"\nRunning: {args[0]}")
        output = subprocess.check_output(args)
        print(output.decode("utf-8"))
        out_file = root_dir.joinpath("Name of file where I want results printed in")
        out_file.write_text(output.decode("utf-8"))
except:
    pass

You can use the subprocess module (import subprocess).
Then you can run a command like this:
run = subprocess.run(command_to_execute, capture_output=True, text=True)
After that you can do print(run.stdout) to print the command output (text=True makes stdout a string instead of raw bytes).
If you want to write it to a file, you can do this after you run the above code:
with open("PATH TO YOUR FILE", "w") as file:
    file.write(run.stdout)
This should write a file which contains the output of your command.
To add further results to the same file, reopen it in "a" (append) mode:
with open("PATH TO YOUR FILE", "a") as file:
    file.write("\n" + run.stdout)
This should append the data to your file.
The with statement closes the file for you when the block ends; remember to close files yourself if you ever open them without it, I have some bad memories about forgetting to do that :D
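Putting those pieces together for your use case, here is a minimal sketch (file names are placeholders) that runs every line of the commands file and collects all the results in one output file:
import subprocess

# Placeholder file names - change to your actual paths
cmds_path = "commands.txt"
results_path = "results.txt"

with open(cmds_path, encoding="utf-8") as cmds, open(results_path, "w", encoding="utf-8") as results:
    for line in cmds:
        line = line.strip()
        if not line:
            continue  # skip blank lines
        run = subprocess.run(line.split(), capture_output=True, text=True)
        results.write("# " + line + "\n")   # note which command produced the output below
        results.write(run.stdout)
        results.write(run.stderr)           # keep any error output as well
Because the output file is opened once and written to inside the loop, every command's result ends up in the same file.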

My plan is simple:
Open the input and output files
Read the input file line by line
Execute each command and direct its output to the output file
#!/usr/bin/env python3
import pathlib
import shlex
import subprocess

cmds_file = pathlib.Path(__file__).with_name("cmds.txt")
output_file = pathlib.Path(__file__).with_name("out.txt")

with open(cmds_file, encoding="utf-8") as commands, open(output_file, "w", encoding="utf-8") as output:
    for command in commands:
        command = shlex.split(command)
        output.write(f"\n# {shlex.join(command)}\n")
        output.flush()
        subprocess.run(command, stdout=output, stderr=subprocess.STDOUT, encoding="utf-8")
Notes
Use shlex.split() to simulate the bash shell's command splitting.
The line output.write(...) is optional; you can remove it.
With subprocess.run(...), the stdout=output argument redirects the command's output to the file. You don't have to do anything else.
Update
I updated the subprocess.run line to redirect stderr to stdout, so errors will show as well.
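For reference, the cmds.txt sitting next to the script would simply contain one command per line, e.g. (hostnames are placeholders from the question):
ping hostname1
ping hostname2
ping hostname3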

Related

Extracting output from .exe in real-time and storing it

I have a .exe programme that produces real-time data. I want to extract the output while running the programme in real time; however, it's my first time trying this out, so I wanted help in approaching this.
I have opened it with the following:
cmd = r'/Applications/StockSpy Realtime Stocks Quote.app/Contents/MacOS/StockSpy Realtime Stocks Quote'

import subprocess
with open('output.txt', 'wb') as f:
    subprocess.check_call(cmd, stdout=f)

# to read line by line
with open('output.txt') as f:
    for line in f:
        print(line)

# output = qx(cmd)
with the aim of storing the output. However, it does not save any of the output; I get a blank text file.
I managed to save the output by following this code:
import os
from subprocess import STDOUT, check_call as x

with open(os.devnull, 'rb') as DEVNULL, open('output.txt', 'wb') as f:
    x(cmd, stdin=DEVNULL, stdout=f, stderr=STDOUT)
from How do I get all of the output from my .exe using subprocess and Popen?
What you are trying to do can be achieved with Python using something like this:
import subprocess

with subprocess.Popen(['/path/to/executable'], stdout=subprocess.PIPE) as proc:
    data = proc.stdout.read()  # data will contain what would usually
                               # be printed to the terminal as output
    """Do something with data..."""

Python Tkinter and Subprocess Running multiple command lines from a file

I am trying to write a script that opens a text file that contains specific lines which can be entered into the command window. Instead of having the user copy and paste these lines, I am trying to use subprocess to automatically make the commands run.
Currently my code is this:
import sys
from Tkinter import *
import tkFileDialog
import subprocess

master = Tk()
master.withdraw()

file_path = tkFileDialog.askopenfilename(title="Open command file")
print file_path

with open(file_path, "r") as infile:
    cmd1 = infile.readline()
    cmd2 = infile.readline()
    cmd3 = infile.readline()
    cmd4 = infile.readline()

p = subprocess.Popen("{}; {}; {}; {}".format(cmd1, cmd2, cmd3, cmd4), stdout=subprocess.PIPE)
for line in p.stdout:
    print line
p.wait()
print p.returncode
print 'complete'
The text file that the user selects has four commands which are all similar to the following:
program.exe commanduniquetoprogram "C:\\...." "C:\\...."
I don't know much about how the command line works, but if the line is run in a command window, it does what it should without even opening the program. As it is now, when I run my script, the only thing that seems to work is the first part, program.exe: it opens the program and then throws a strange error. When the same line is pasted into a command window, the program does not open at all and just does its job, which leads me to believe that subprocess doesn't like the spaces in the command line.
Any help is appreciated as I have no experience with the subprocess module!
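One way to sidestep the quoting problem, sketched here on the assumption that the four lines are independent Windows commands (it reuses file_path from the code above), is to run each line separately and let the shell do the parsing:
import subprocess

with open(file_path, "r") as infile:
    for raw in infile:
        cmd = raw.strip()
        if not cmd:
            continue
        # shell=True lets cmd.exe parse the quoted paths, just as if the
        # line had been pasted into a command window
        p = subprocess.Popen(cmd, shell=True,
                             stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
        out, _ = p.communicate()
        print(out)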

python subprocess is overwriting file used for stdout - I need it to append to the file (windows)

I want to append the STDOUT of subprocess.call() to an existing file. My code below overwrites the file -
log_file = open(log_file_path, 'r+')
cmd = r'echo "some info for the log file"'
subprocess.call(cmd, shell=True, stdout=log_file, stderr=STDOUT)
log_file.close()
I'm looking for the equivalent of >> in subprocess.call() or subprocess.Popen(). It's driving me crazy trying to find it..
UPDATE:
Following the answers so far I've updated my code to
import subprocess
log_file = open('test_log_file.log', 'a+')
cmd = r'echo "some info for the log file\n"'
subprocess.call(cmd, shell=True, stdout=log_file, stderr=subprocess.STDOUT)
log_file.close()
I'm running this code from the command line in Windows -
C:\users\aidan>test_subprocess.py
This adds the text to the log file. When I run the script again, nothing new is added. It still seems to be overwriting the file.
Use the 'a' append mode instead:
log_file = open(log_file_path, 'a+')
If you still see previous content overwritten, perhaps Windows needs you to explicitly seek to the end of the file; open as 'r+' and seek to the end of the file:
import os
log_file = open(log_file_path, 'r+')
log_file.seek(0, os.SEEK_END)
Modify how you open log_file_path. You are opening the file for reading and writing 'r+'. Use the 'a' append mode instead of 'r+':
log_file = open(log_file_path, 'a+')
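For completeness, here is a minimal self-contained sketch of the append approach (file name taken from the update above); running it repeatedly keeps adding to the same log instead of truncating it:
import subprocess

# 'a' opens for appending, so repeated runs add to the end of the file
with open('test_log_file.log', 'a') as log_file:
    subprocess.call('echo "some info for the log file"', shell=True,
                    stdout=log_file, stderr=subprocess.STDOUT)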

Using subprocess to log print statements of a command

I am running a Transporter command, which prints a log of what is happening to the prompt.
How would I redirect all the print statements to a separate file called transporter_log.txt in the same folder the script is running from? Something like -
log_file = open(PATH, 'w')
subprocess.call(shlex.split("/usr/local//iTMSTransporter -m verify..."))
log_file.write(...)
You could specify the file as stdout parameter:
with open(PATH, 'wb') as log_file:
    subprocess.check_call(cmd, stdout=log_file)
The output of cmd is written to log_file.
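Applied to the Transporter command from the question, a sketch might look like this (the command string is a placeholder; fill in your real -m verify arguments):
import shlex
import subprocess

# Placeholder: append the real -m verify arguments from your setup
transporter_command = "/usr/local//iTMSTransporter -m verify"

with open('transporter_log.txt', 'wb') as log_file:
    subprocess.check_call(shlex.split(transporter_command),
                          stdout=log_file, stderr=subprocess.STDOUT)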
What about using shell redirection (on Unix)?
your_python.py > /path/to/transporter_log.txt

How to call a cmd file within python script

This is what the Python script will do. The question is how to call the external cmd file within the function.
Read a CSV file in the directory.
If the content in the 6th column is equal to 'approved', then call an external Windows script 'TransferProd.cmd'.
def readCSV(x):
    #csvContents is a list in the global scope that will contain lists of the
    #items on each line of the specified CSV file
    try:
        global csvContents
        file = open(csvDir + x + '.csv', 'r') #Opens the CSV file
        csvContents = file.read().splitlines() #Appends each line of the CSV file to csvContents
        #----------------------------------------------------------------------
        #This takes each item in csvContents and splits it at "," into a list.
        #The list created replaces the item in csvContents
        for y in range(0, len(csvContents)):
            csvContents[y] = csvContents[y].lower().split(',')
            if csvContents[y][6] == 'approved':
                ***CALL TransferProd.cmd***
        file.close()
        return
    except Exception as error:
        log(logFile, 'An error has occurred in the readCSV function: ' + str(error))
        raise
Take a look at the subprocess module.
import subprocess
p = subprocess.Popen(['TransferProd.cmd'])
You can specify where you want output/errors to go (directly to a file or to a file-like object), pipe in input, etc.
import os
os.system('TransferProd.cmd')
This works in both Unix/Windows flavors as it sends the command to the shell. There are some variations in the returned values though! Check here.
If you don't need the output of the command you could use os.system(cmd).
The better solution is to use:
from subprocess import Popen, PIPE

proc = Popen(cmd, shell=True, close_fds=True, stdout=PIPE, stderr=PIPE)
stdout, stderr = proc.communicate()
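Tying it back to the question, here is a sketch of a helper you could drop in at the ***CALL TransferProd.cmd*** placeholder (the function name is hypothetical, and it assumes TransferProd.cmd is on PATH or in the working directory):
import subprocess

def transfer_prod():
    # shell=True lets the Windows command interpreter locate and run the .cmd file;
    # check the return code so failures are not silently ignored
    return_code = subprocess.call('TransferProd.cmd', shell=True)
    if return_code != 0:
        raise RuntimeError('TransferProd.cmd exited with code %d' % return_code)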
