Windows Python: redirect stdout and stderr to two different files

I would like to send stdout and stderr to two different log files.
I have tried the code below, but it only writes errors to error.log; regular output is never redirected to runtime.log.
The code runs on Windows and mostly calls robocopy.
saveerr = sys.stderr
fsock = open('error.log', 'a')
sys.stderr = fsock
saveout = sys.stdout
fsock1 = open('runtime.log', 'a')
sys.stdout = fsock1
The sys.stdout part is not working. Please let me know what needs to be corrected in this code.
Here is my entire code:
import sys
import subprocess
saveerr = sys.stderr
fsock = open('error.log', 'a')
sys.stderr = fsock
saveout = sys.stdout
fsock1 = open('runtime.log', 'a')
sys.stdout = fsock1
##For site AUCB-NET-01 from source folder AUDC-RSTOR-01 E:\Canberra
exit_code1 = subprocess.call('robocopy \\\\aucb-net-01\\d$ \\\\nasaudc01\\remote_site_sync\\aucb-net-01 /E /MIR /W:2 /R:1', shell=True)
print ("exitcoode1=", exit_code1)
thanks to everyone for reading my post.

As mentioned in my comment, your code should divert your own stdout to the file. To get robocopy's stdout to go there too, read the process's pipe and echo each line to your stdout, as shown in this answer:
https://stackoverflow.com/a/28319191/6550457
from subprocess import Popen, PIPE
import sys
saveerr = sys.stderr
fsock = open('error.log', 'a')
sys.stderr = fsock
saveout = sys.stdout
fsock1 = open('runtime.log', 'a')
sys.stdout = fsock1
cmd = 'robocopy \\\\aucb-net-01\\d$ \\\\nasaudc01\\remote_site_sync\\aucb-net-01 /E /MIR /W:2 /R:1'
p = Popen(cmd, stdout=sys.stdout, stderr=sys.stderr, bufsize=1, universal_newlines=True)
exit_code1 = p.wait()
See @eryksun's comments about robocopy and its exit codes: http://ss64.com/nt/robocopy-exit.html
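If the goal is to capture robocopy's own output in runtime.log while keeping it visible on the console, here is a minimal tee-style sketch along the lines of the linked answer; the paths and file names come from the question, but the loop itself is my own sketch, not the linked code verbatim:
from subprocess import Popen, PIPE
import sys

cmd = 'robocopy \\\\aucb-net-01\\d$ \\\\nasaudc01\\remote_site_sync\\aucb-net-01 /E /MIR /W:2 /R:1'

with open('runtime.log', 'a') as runtime_log, open('error.log', 'a') as error_log:
    p = Popen(cmd, stdout=PIPE, stderr=error_log, universal_newlines=True)
    for line in p.stdout:           # echo each line as robocopy produces it
        runtime_log.write(line)     # copy into the runtime log
        sys.stdout.write(line)      # and keep it visible on the console
    exit_code1 = p.wait()           # robocopy exit codes 0-7 indicate success variants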

Related

Python subprocess display logs on terminal and save in file

I am running a Python script with subprocess and want to save the output to a file as well as show live logs on the terminal.
I have written the code below; it saves the logs to the file but does not show the live execution logs on the terminal.
import sys
import subprocess

TCID = sys.argv[1]
if TCID == "5_2_5_3":
    output = subprocess.check_output([sys.executable, './script.py'])
    with open('scriptout.log', 'wb') as outfile:
        outfile.write(output)
I think this will fix your issue
import subprocess

outputfile = open('scriptout.log', 'a')
process = subprocess.Popen(["ping", "127.0.0.1"],
                           stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
while True:
    output = process.stdout.readline()
    if output == b'' and process.poll() is not None:
        break
    if output:
        out = output.decode()
        outputfile.write(out)
        print(out, end="")
outputfile.close()
I also tried:
import subprocess

output = subprocess.check_output(["ping", "127.0.0.1"])
with open('scriptout.log', 'wb') as outfile:
    print(output)
    outfile.write(output)
but it only prints the output after the command has finished. I would also like to try the logging module, but I don't know how to use it, sorry :(
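For the logging part mentioned above, here is a rough sketch (assuming Python 3) that streams the subprocess output line by line to both the terminal and scriptout.log by attaching two handlers to the root logger:
import logging
import subprocess

# One logger, two handlers: console and file
logging.basicConfig(
    level=logging.INFO,
    format='%(message)s',
    handlers=[logging.StreamHandler(), logging.FileHandler('scriptout.log')],
)

process = subprocess.Popen(["ping", "127.0.0.1"],
                           stdout=subprocess.PIPE,
                           stderr=subprocess.STDOUT,
                           universal_newlines=True)
for line in process.stdout:      # lines appear while the command is still running
    logging.info(line.rstrip())  # handler fan-out: terminal + file
process.wait()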

Redirect output to two files in python

I am writing code where I want to redirect output to two different files in Python: one for logging and another for Fabric code generation.
Below is the code that redirects output to the two files:
import sys

print('################################### Creating storage commands variables ##########################')

# storage log
storage_file = open("Storage-output.txt", 'w')
sys.stdout = storage_file
print('network port show')
print('Storage Commands are completed')
sys.stdout = sys.__stdout__
storage_file.close()

# write fabric code
fabric_file = open("Fabric_code.py", 'w')
sys.stdout = fabric_file
print("from fabric.api import run")
print("def host_type():")
print("    run('rows 0', shell=False)")
print("    run('network port show', shell=False)")
sys.stdout = sys.__stdout__
fabric_file.close()

# storage log again
storage_file = open("Storage-output.txt", 'a')
sys.stdout = storage_file
print('############### Generating aggregate command #############')
print('storage aggregate create -aggregate aggr_fab -diskcount 6 -raidtype raid_dp')
sys.stdout = sys.__stdout__
storage_file.close()

# write fabric code again
fabric_file = open("Fabric_code.py", 'a')
sys.stdout = fabric_file
print("    run('storage aggregate create -aggregate aggr_fab -diskcount 6 -raidtype raid_dp', shell=False)")
sys.stdout = sys.__stdout__
fabric_file.close()
In the code above, Storage-output.txt is kept as a log for records and Fabric_code.py holds the generated Fabric code.
My real code generates thousands of commands, and I do not want to open and close the two files again and again throughout the script.
Is there a solution where I can redirect print output to the two files without repeatedly opening and closing them?
You should refactor your code so the files are opened once at the beginning, written to as needed, and closed once at the end. From the above example we can do the following:
import sys

# Open both files once
storage_file = open("Storage-output.txt", 'w')
fabric_file = open("Fabric_code.py", 'w')

# ===================
# Write Block
# ===================
print('################################### Creating storage commands variables ##########################')

# storage log
sys.stdout = storage_file
print('network port show')
print('Storage Commands are completed')

# fabric code
sys.stdout = fabric_file
print("from fabric.api import run")
print("def host_type():")
print("    run('rows 0', shell=False)")
print("    run('network port show', shell=False)")

# storage log
sys.stdout = storage_file
print('############### Generating aggregate command #############')
print('storage aggregate create -aggregate aggr_fab -diskcount 6 -raidtype raid_dp')

# fabric code
sys.stdout = fabric_file
print("    run('storage aggregate create -aggregate aggr_fab -diskcount 6 -raidtype raid_dp', shell=False)")

# restore stdout and close the files once at the end
sys.stdout = sys.__stdout__
storage_file.close()
fabric_file.close()
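If reassigning sys.stdout by hand feels error-prone, an alternative sketch (not part of the answer above, and assuming Python 3.4+) uses contextlib.redirect_stdout to point print at whichever of the two already-open files a given block should write to:
import contextlib

with open("Storage-output.txt", 'w') as storage_file, \
     open("Fabric_code.py", 'w') as fabric_file:

    with contextlib.redirect_stdout(storage_file):
        print('network port show')
        print('Storage Commands are completed')

    with contextlib.redirect_stdout(fabric_file):
        print("from fabric.api import run")
        print("def host_type():")
        print("    run('rows 0', shell=False)")

    with contextlib.redirect_stdout(storage_file):
        print('storage aggregate create -aggregate aggr_fab -diskcount 6 -raidtype raid_dp')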

How to read stdout and stderr and save it all at once with subprocess Popen?

I have seen several questions on this topic already, but none of the answers has worked for me. What I need is to get hold of the complete stdout and stderr of a subprocess.Popen([...], stdout=subprocess.PIPE) call and write it all at once to a file, for example:
import tempfile
import subprocess

statusfile = tempfile.NamedTemporaryFile(suffix=".log", prefix="status", delete=False)
proc = subprocess.Popen([mycommand, myparams], stdout=subprocess.PIPE)
while proc.poll() is None:
    output = proc.stdout.readline()
    statusfile.write(output)
output = proc.communicate()[0]
statusfile.write(output)
statusfile.close()
In this example I only get the first line of the standard output and nothing else.
To save both stdout and stderr of a subprocess to a file:
import subprocess

with open('filename', 'wb', 0) as file:
    subprocess.check_call(cmd, stdout=file, stderr=subprocess.STDOUT)
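If the two streams need to stay separate but should still be written in one go, Popen.communicate() collects the complete stdout and stderr after the process exits. A small sketch, using the ping command from an earlier question as a stand-in for the real command and status.log as a placeholder file name:
import subprocess

proc = subprocess.Popen(["ping", "127.0.0.1"],
                        stdout=subprocess.PIPE, stderr=subprocess.PIPE)
out, err = proc.communicate()        # waits for exit, returns complete stdout and stderr

with open('status.log', 'wb') as statusfile:   # write everything at once
    statusfile.write(out)
    statusfile.write(err)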

Not all the output is redirected into file in Python

I am trying to redirect all stdout to a file, out.txt, but the first command's output is displayed on the terminal and only the rest goes to the file. I am not sure what's wrong in the piece of code below.
import os
import sys
import subprocess
orig_stdout = sys.stdout
f = file('out.txt', 'w')
sys.stdout = f
os.system("date") #First command
cmd = ["ls", "-al"]
exe_cmd = subprocess.Popen(cmd, stdout=subprocess.PIPE)
output, err = exe_cmd.communicate()
print output #Second command
sys.stdout = orig_stdout
Assigning a file object to sys.stdout redirects Python code that uses sys.stdout, but it doesn't redirect code that writes to the underlying file descriptor.
os.system("date")
spawns a new process that uses the underlying file descriptor, so it's not redirected.
exe_cmd = subprocess.Popen(cmd, stdout=subprocess.PIPE)
output, err = exe_cmd.communicate()
print output #Second command
spawns a new process with a pipe that is read by the parent process. print uses the parent's sys.stdout, so it is redirected.
A standard way to redirect is to hand the file object to one of the subprocess calls. The child writes directly to the file without parent interaction.
with open('out.txt', 'w') as f:
    cmd = ["ls", "-al"]
    subprocess.call(cmd, stdout=f)
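If even os.system-style commands have to land in the file, the redirection must happen on the underlying file descriptor rather than on the sys.stdout object. A rough sketch using os.dup2 (my own illustration, not part of the answer above; file name and command taken from the question):
import os

with open('out.txt', 'w') as f:
    saved_fd = os.dup(1)            # keep a copy of the real stdout descriptor
    os.dup2(f.fileno(), 1)          # point file descriptor 1 at the file
    try:
        os.system("date")           # the child inherits fd 1, so its output lands in out.txt
    finally:
        os.dup2(saved_fd, 1)        # restore the terminal
        os.close(saved_fd)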

Why the following python code does not print to file

from sys import stdout
stdout = open('file', 'w')
print 'test'
stdout.close()
does create the file, but it contains nothing.
I had to use
import sys
sys.stdout = open('file', 'w')
print 'test'
sys.stdout.close()
But wouldn't the from ... import ... automatically make the name available? Why do I still have to use sys.stdout instead of stdout?
The problem is this: print writes to sys.stdout (it is effectively sys.stdout.write()), and it looks up sys.stdout at the moment it runs.
So when you do from sys import stdout and then rebind that name to a new file, sys.stdout itself is unchanged and print won't use your variable.
But when you do
import sys
sys.stdout = open('file', 'w')
print 'test'
it actually writes to sys.stdout, which now points to the file you opened.
Analysis
from sys import stdout
stdout = open('file', 'w')
print 'test' # calls sys.stdout.write('test'), which prints to the terminal
stdout.close()
import sys
sys.stdout = open('file', 'w')
print 'test' # calls sys.stdout.write('test'), which prints to the file
sys.stdout.close()
Conclusion
This works...
from sys import stdout
stdout = open('file', 'w')
stdout.write('test')
stdout.close()
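Another option that works with the imported name is to hand the file to print explicitly instead of rebinding anything (Python 2 syntax, to match the question; in Python 3 the equivalent would be print('test', file=out)):
out = open('file', 'w')
print >> out, 'test'    # Python 2: direct this print at a specific file object
out.close()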
