I have a .exe program that produces real-time data. I want to extract the output while the program is running in real time. It's my first time trying this, so I wanted help in approaching it.
I have opened it with the following:
cmd = r'/Applications/StockSpy Realtime Stocks Quote.app/Contents/MacOS/StockSpy Realtime Stocks Quote'

import subprocess

with open('output.txt', 'wb') as f:
    subprocess.check_call(cmd, stdout=f)

# to read line by line
with open('output.txt') as f:
    for line in f:
        print(line)

# output = qx(cmd)
with the aim of storing the output. However, it does not save any of the output; I get a blank text file.
I managed to save the output with the following code:
import os
from subprocess import STDOUT, check_call as x

with open(os.devnull, 'rb') as DEVNULL, open('output.txt', 'wb') as f:
    x(cmd, stdin=DEVNULL, stdout=f, stderr=STDOUT)
from the question "How do I get all of the output from my .exe using subprocess and Popen?"
What you are trying to do can be achieved with Python using something like this:
import subprocess

with subprocess.Popen(['/path/to/executable'], stdout=subprocess.PIPE) as proc:
    data = proc.stdout.read()  # data now holds what would usually be the output
    """Do something with data..."""
I am trying to use the code below to run a command and extract the data from the cmd window.
The file with the commands and data is a .txt file (let me know if I should change it or if an Excel file would be better).
The commands look something like this: ping "hostname", which produces some output in the cmd window. There is a list of these in the file, so it would ping "hostname1", then on line two ping "hostname2", and so on.
THE QUESTION: I want it to run every line individually, extract the results from the cmd window, and store them in a .txt or Excel file. Ideally I want all the results in the same file. Is this possible, and how?
Here is the code so far:
import pathlib
import subprocess

root_dir = pathlib.Path(r"path to file here")
cmds_file = root_dir.joinpath('actual file here with commands and data')
#fail = []
cmds = cmds_file.read_text().splitlines()

try:
    for cmd in cmds:
        args = cmd.split()
        print(f"\nRunning: {args[0]}")
        output = subprocess.check_output(args)
        print(output.decode("utf-8"))
        out_file = root_dir.joinpath(f"Name of file where I want results printed in")
        out_file.write_text(output.decode("utf-8"))
except:
    pass
You can use the subprocess module: import subprocess
Then you can define a variable like this:
run = subprocess.run(command_to_execute, capture_output=True)
After that you can do print(run.stdout) to print the command output.
If you want to write it to a file, you can do this after running the above code:
with open("PATH TO YOUR FILE", "w") as file:
file.write(run.stdout)
This should write a file which contains the output of your command
After that, the with block closes the file for you; if you want to add more output later, reopen the file in "a" (append) mode:
with open("PATH TO YOUR FILE", "a") as file:
    file.write("\n" + run.stdout.decode())
This should append data to your file.
Remember to close your files, just as best practice; I have some bad memories of not closing a file after opening it :D
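Putting this together for the original question, a minimal sketch that loops over a commands file and appends every result to one output file (commands.txt and results.txt are placeholder names, and capture_output/text need Python 3.7+):

import subprocess

with open("commands.txt") as cmds, open("results.txt", "w") as out:
    for line in cmds:
        args = line.split()
        if not args:  # skip blank lines
            continue
        run = subprocess.run(args, capture_output=True, text=True)
        out.write(run.stdout + "\n")  # all results end up in the same file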
My plan is simple:
Open the input and output files
Read the input file line by line
Execute each command and direct its output to the output file
#!/usr/bin/env python3

import pathlib
import shlex
import subprocess

cmds_file = pathlib.Path(__file__).with_name("cmds.txt")
output_file = pathlib.Path(__file__).with_name("out.txt")

with open(cmds_file, encoding="utf-8") as commands, open(output_file, "w", encoding="utf-8") as output:
    for command in commands:
        command = shlex.split(command)
        output.write(f"\n# {shlex.join(command)}\n")
        output.flush()
        subprocess.run(command, stdout=output, stderr=subprocess.STDOUT, encoding="utf-8")
Notes
Use shlex.split() to simulate how the bash shell splits a command line.
The output.write(...) line is optional; it writes a header with the command so you can tell the outputs apart. You can remove it.
With subprocess.run(...), stdout=output redirects the command's output to the file; you don't have to do anything else.
Update
I updated the subprocess.run line to redirect stderr to stdout, so errors will show up in the output file as well.
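For example (the host names are only placeholders), a cmds.txt like this:

ping hostname1
ping hostname2

would produce a single out.txt in which each ping's output is preceded by the "# ping hostname..." header line written by output.write(...).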
At the moment, I have code that unzips a file and then zips it back up into another file using Popen:
with open('./file.gz', 'rb') as in_file:
    g_unzip_process = Popen(['gunzip', '-c'], stdin=in_file, stdout=PIPE)
    with open('./outfile.gz', 'wb+') as out_file:
        Popen(['gzip'], stdin=g_unzip_process.stdout, stdout=out_file)
Now, I am trying to insert something in between.
It should look like this:
with open('./file.gz', 'rb') as in_file:
    g_unzip_process = Popen(['gunzip', '-c'], stdin=in_file, stdout=PIPE)
    transform_process = Popen(['python', 'transform.py'], stdin=g_unzip_process.stdout, stdout=PIPE)
    with open('./outfile.gz', 'wb+') as out_file:
        Popen(['gzip'], stdin=transform_process.stdout, stdout=out_file)
My transform.py code looks like this:
import sys

for line in sys.stdin:
    sys.stdout.write(line)

sys.stdin.close()
sys.stdout.close()
After running it, why is it that my outfile.gz file is empty?
I have also tried reading each line by running:
for line in transform_process.stdout:
    print(line)
How do I make it so that those lines will be written inside outfile.gz?
I was able to get this working by calling Popen.wait() on the process writing to outfile.gz.
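For reference, a minimal sketch of how that fix could look, based on the snippets above (the key change is the final wait(); closing the intermediate pipe ends in the parent is also common practice):

from subprocess import Popen, PIPE

with open('./file.gz', 'rb') as in_file, open('./outfile.gz', 'wb+') as out_file:
    g_unzip_process = Popen(['gunzip', '-c'], stdin=in_file, stdout=PIPE)
    transform_process = Popen(['python', 'transform.py'], stdin=g_unzip_process.stdout, stdout=PIPE)
    gzip_process = Popen(['gzip'], stdin=transform_process.stdout, stdout=out_file)
    # let the child processes own the pipe ends
    g_unzip_process.stdout.close()
    transform_process.stdout.close()
    # block until gzip has finished writing, so outfile.gz is complete
    gzip_process.wait()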
I have a Windows command whose output I want to write to stdout and to a file. For now, I only get the string 0 written to my file:
#!/usr/bin/env python3
# -*- coding: utf-8 -*-

import subprocess

with open('auto_change_ip.txt', 'w') as f:
    print(subprocess.call(['netsh', 'interface', 'show', 'interface']), file=f)
subprocess.call returns an int (the return code), and that's why you have 0 written in your file.
If you want to capture the output, why don't you use subprocess.run instead?
import subprocess

cmd = ['netsh', 'interface', 'show', 'interface']
p = subprocess.run(cmd, stdout=subprocess.PIPE)

with open('my_file.txt', 'wb') as f:
    f.write(p.stdout)
In order to capture the output in p.stdout, you'll have to redirect stdout to subprocess.PIPE.
Now p.stdout holds the output (in bytes), which you can save to file.
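On Python 3.7 and later you could also let run() capture text directly; capture_output=True and text=True are shorthand for the PIPE redirection and decoding above (a sketch, not from the original answer):

import subprocess

cmd = ['netsh', 'interface', 'show', 'interface']
p = subprocess.run(cmd, capture_output=True, text=True)

with open('my_file.txt', 'w') as f:
    f.write(p.stdout)  # p.stdout is already a str here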
Another option for Python versions < 3.5 is subprocess.Popen. The main difference for this case is that .stdout is a file object, so you'll have to read it.
import subprocess

cmd = ['netsh', 'interface', 'show', 'interface']
p = subprocess.Popen(cmd, stdout=subprocess.PIPE)
out = p.stdout.read()
#print(out.decode())

with open('my_file.txt', 'wb') as f:
    f.write(out)
I am running a Python script using subprocess and want to save the output to a file as well as show the live logs on the terminal.
I have written the code below; it saves the logs to a file but does not show the live script execution logs on the terminal.
import subprocess
import sys

TCID = sys.argv[1]

if TCID == "5_2_5_3":
    output = subprocess.check_output([sys.executable, './script.py'])
    with open('scriptout.log', 'wb') as outfile:
        outfile.write(output)
I think this will fix your issue:
import subprocess

outputfile = open('scriptout.log', 'a')
process = subprocess.Popen(["ping", "127.0.0.1"], stdout=subprocess.PIPE, stderr=subprocess.STDOUT)

while True:
    output = process.stdout.readline()
    if output == b'' and process.poll() is not None:
        break
    if output:
        out = output.decode()
        outputfile.write(out)
        print(out, end="")

outputfile.close()
I also tried:
import subprocess

output = subprocess.check_output(["ping", "127.0.0.1"])

with open('scriptout.log', 'wb') as outfile:
    print(output)
    outfile.write(output)
but it only produces output after the command execution ends. I also want to try the logging module, but I don't know how to use it, sorry :(
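For completeness, here is a minimal sketch of how the logging module could be used for this, with a StreamHandler for the terminal and a FileHandler for scriptout.log (this setup is an assumption, not from the original post):

import logging
import subprocess

logging.basicConfig(level=logging.INFO, format="%(message)s",
                    handlers=[logging.FileHandler("scriptout.log"),
                              logging.StreamHandler()])

process = subprocess.Popen(["ping", "127.0.0.1"],
                           stdout=subprocess.PIPE, stderr=subprocess.STDOUT,
                           text=True)
for line in process.stdout:      # lines arrive as the command produces them
    logging.info(line.rstrip())
process.wait()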
I am trying to redirect all stdout to a file, out.txt, but the first command's output is displayed on the terminal and only the rest is fed to the file. I am not sure what's wrong in the piece of code below.
import os
import sys
import subprocess
orig_stdout = sys.stdout
f = file('out.txt', 'w')
sys.stdout = f
os.system("date") #First command
cmd = ["ls", "-al"]
exe_cmd = subprocess.Popen(cmd, stdout=subprocess.PIPE)
output, err = exe_cmd.communicate()
print output #Second command
sys.stdout = orig_stdout
Assigning a file object to sys.stdout redirects Python code that uses sys.stdout, but it doesn't redirect code that uses the underlying file descriptor.
os.system("date")
spawns a new process that uses the underlying file descriptor, so it's not redirected.
exe_cmd = subprocess.Popen(cmd, stdout=subprocess.PIPE)
output, err = exe_cmd.communicate()
print output #Second command
spawns a new process with a pipe that is read by the parent process. print uses the parent's sys.stdout, so it is redirected.
A standard way to redirect is to hand the file object to one of the subprocess calls. The child writes directly to the file without parent interaction.
with open('out.txt', 'w') as f:
    cmd = ["ls", "-al"]
    subprocess.call(cmd, stdout=f)
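The same approach covers the date command that was run through os.system; a sketch that redirects both commands into the same file, so nothing appears on the terminal:

import subprocess

with open('out.txt', 'w') as f:
    subprocess.call(['date'], stdout=f)       # replaces os.system("date")
    subprocess.call(['ls', '-al'], stdout=f)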