How to get the output of a grep command in Python

I have an input file test.txt as:
host:dc2000
host:192.168.178.2
I want to get all the addresses of those machines by using:
grep "host:" /root/test.txt
And so I get the command output via Python:
import subprocess
file_input='/root/test.txt'
hosts=subprocess.Popen(['grep','"host:"',file_input], stdout= subprocess.PIPE)
print hosts.stdout.read()
But the result is an empty string.
I don't know what the problem is. Can you suggest how to solve it?

Try this:
import subprocess
hosts = subprocess.check_output("grep 'host:' /root/test.txt", shell=True)
print hosts

Your code should work; are you sure the user has the right to read the file?
Also, are you certain there is a "host:" in the file? Note that your pattern '"host:"' includes literal double quotes, so grep searches for "host:" with the quotes included. You probably mean this instead:
hosts_process = subprocess.Popen(['grep', 'host:', file_input], stdout=subprocess.PIPE)
hosts_out, hosts_err = hosts_process.communicate()
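For completeness, here is the corrected call end to end; a minimal sketch assuming Python 3, where the captured output is bytes and needs decoding before printing:
import subprocess

file_input = '/root/test.txt'
# No extra quotes around the pattern; no shell is involved here
hosts_process = subprocess.Popen(['grep', 'host:', file_input], stdout=subprocess.PIPE)
hosts_out, hosts_err = hosts_process.communicate()
print(hosts_out.decode())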

Another solution: try the Plumbum package (https://plumbum.readthedocs.io/):
from plumbum.cmd import grep
print(grep("host:", "/root/test.txt"))
print(grep("-n", "host:", "/root/test.txt")) #'-n' option

Related

How to pass output of a command in cmd to a txt file using python?

Is there a way to pass the output of a CMD command to a .txt file using Python?
Something like this:
import os
os.system('cmd /c "netsh wlan show profiles"')
#output=OUTPUT
output_file=open(outputfile+'.txt','w')
output_file.write(OUTPUT)
output_file.close()
Is there a way to do this?
import subprocess
out = subprocess.getoutput("ls -l")
print(out)
Now you can just write this output to a txt file using open() and the file.write() method.
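For example, a minimal sketch building on the snippet above (the output.txt filename is just a placeholder):
import subprocess

out = subprocess.getoutput("ls -l")
# Write the captured output to a text file
with open("output.txt", "w") as f:
    f.write(out)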
You can just redirect the output to a temporary file. The following is a simple example:
import os
os.system('cmd /c "netsh wlan show profiles" > tmp.txt')
Then you can just read the tmp.txt file and see the output. This solution is based on your code.
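For example, a short sketch of reading it back (based on the code above):
# Read the redirected output from the temporary file
with open("tmp.txt") as f:
    output = f.read()
print(output)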
There are other options such as using subprocess, you can read more in the following link: https://docs.python.org/3/library/subprocess.html#using-the-subprocess-module
There is already a similar question here: Running shell command and capturing the output
import subprocess
output = subprocess.check_output(["echo", "Hello, World!"])
Try the subprocess library: https://docs.python.org/3/library/subprocess.html
The subprocess.run() method should do the job. It can capture stdout and stderr when called with capture_output=True.
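A minimal sketch of that approach (capture_output requires Python 3.7+; text=True decodes the bytes for you):
import subprocess

result = subprocess.run(["echo", "Hello, World!"], capture_output=True, text=True)
print(result.stdout)  # captured standard output
print(result.stderr)  # captured standard error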

How can I run a binary executable with input file (bash command) in Python?

I have a binary executable named "abc" and an input file called "input.txt". I can run these with the following bash command:
./abc < input.txt
How can I run this bash command in Python? I tried some ways but got errors.
Edit:
I also need to store the output of the command.
Edit2:
I solved it this way, thanks for the help.
import subprocess

# input_path is the path of the input.txt file
out = subprocess.Popen(["./abc"], stdin=open(input_path), stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
stdout, stderr = out.communicate()
print(stdout)
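For what it's worth, the same thing can be written a bit more compactly with subprocess.run; a sketch assuming Python 3 and the same input_path as above:
import subprocess

with open(input_path) as infile:
    result = subprocess.run(["./abc"], stdin=infile, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
print(result.stdout)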
Use os.system:
import os
os.system("echo test from shell")
Using subprocess is the best way to invoke system commands and executables. It provides better control than os.system() and is intended to replace it. The Python documentation link below provides additional information.
https://docs.python.org/3/library/subprocess.html
Here is a bit of code that uses subprocess to read output from head to return the first 100 rows from a txt file and process it row by row. It gives you the output (out) and any errors (err).
import subprocess

mycmd = 'head -100 myfile.txt'
(out, err) = subprocess.Popen(mycmd, stdout=subprocess.PIPE, shell=True).communicate()
myrows = out.decode("utf-8").split("\n")
for myrow in myrows:
    # do something with myrow
    print(myrow)
This can be done with the os module. The following code works fine:
import os
path = "path of the executable 'abc' and 'input.txt' file"
os.chdir(path)
os.system("./abc < input.txt")
Hope this works :)

Python subprocess with ffmpeg gives no output

I want to extract the scene change timestamps using the scene change detection from ffmpeg. I have to run it on a few hundred videos, so I wanted to use a Python subprocess to loop over all the contents of a folder.
My problem is that the command I was using to get these values on a single video involves piping the output to a file, which seems not to be an option from inside a subprocess call.
This is my code:
p=subprocess.check_output(["ffmpeg", "-i", sourcedir+"/"+name+".mpg","-filter:v", "select='gt(scene,0.4)',showinfo\"","-f","null","-","2>","output"])
This one tells me ffmpeg needs an output.
output = "./result/"+name
p=subprocess.check_output(["ffmpeg", "-i", sourcedir+"/"+name+".mpg","-filter:v", "select='gt(scene,0.4)',metadata=print:file=output","-an","-f","null","-"])
This one gives me no error but doesn't create the file.
This is the original command that I use directly with ffmpeg:
ffmpeg -i input.flv -filter:v "select='gt(scene,0.4)',showinfo" -f null - 2> ffout
I just need the output of this command to be written to a file; does anyone see how I could make it work?
Is there a better way than subprocess? Or just another way? Would it be easier in C?
You can redirect the stderr output directly from Python without any need for shell=True, which can lead to shell injection.
It's as simple as:
with open(output_path, 'w') as f:
    subprocess.check_call(cmd, stderr=f)
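Put together for this case, it might look like the following sketch; the filter string and the ffout filename are taken from the question's original command:
import subprocess

cmd = ['ffmpeg', '-i', 'input.flv',
       '-filter:v', "select='gt(scene,0.4)',showinfo",
       '-f', 'null', '-']
# ffmpeg writes showinfo output to stderr, so redirect stderr to the file
with open('ffout', 'w') as f:
    subprocess.check_call(cmd, stderr=f)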
Things are easier in your case if you use the shell argument of the subprocess call, and it should behave the same. With shell=True, you can pass in a string as the command rather than a list of args.
cmd = "ffmpeg -i {0} -filter:v \"select='gt(scene,0.4)',showinfo\" -f {1} - 2> ffout".format(inputName, outputFile)
p=subprocess.check_output(cmd, shell=True)
If you want to pass arguments, you can easily format your string.

Python subprocess library: Running grep command from Python

I am trying to run a grep command from my Python module using the subprocess library. Since I am doing this operation on a doc file, I am using the Catdoc third-party library to get the content into a plain text file. I want to store the content in a file. I don't know where I am going wrong, but the program fails to generate a plain text file and eventually to get the grep result. I have gone through the error log, but it's empty. Thanks for all the help.
def search_file(name, keyword):
    # Extract and save the text from the doc file
    catdoc_cmd = ['catdoc', '-w', name, '>', 'testing.txt']
    catdoc_process = subprocess.Popen(catdoc_cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
    output = catdoc_process.communicate()[0]
    grep_cmd = []
    # Search for the keyword in the text file
    grep_cmd.extend(['grep', '%s' % keyword, 'testing.txt'])
    print grep_cmd
    p = subprocess.Popen(grep_cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
    stdoutdata = p.communicate()[0]
    print stdoutdata
On UNIX, specifying shell=True will cause the first argument to be treated as the command to execute, with all subsequent arguments treated as arguments to the shell itself. Thus, the > won't have any effect (since with /bin/sh -c, all arguments after the command are ignored).
Therefore, you should actually use
catdoc_cmd = ['catdoc -w "%s" > testing.txt' % name]
A better solution, though, would probably be to just read the text out of the subprocess' stdout, and process it using re or Python string operations:
catdoc_cmd = ['catdoc', '-w', name]
catdoc_process = subprocess.Popen(catdoc_cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
for line in catdoc_process.stdout:
    if keyword in line:
        print line.strip()
I think you're trying to pass the > to the shell, but that's not going to work the way you've done it. If you want to spawn a process, you should arrange for its standard output to be redirected. Fortunately, that's really easy to do: open the file you want the output to go to for writing and pass it to Popen using the stdout keyword argument instead of PIPE (PIPE attaches stdout to a pipe that you can read with communicate()).
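A minimal sketch of that approach, reusing the name and keyword variables from the question's function:
import subprocess

# Redirect catdoc's stdout straight into the file; no shell needed
with open('testing.txt', 'w') as outfile:
    subprocess.check_call(['catdoc', '-w', name], stdout=outfile)
# Then grep the file as a separate step
# (note: grep exits non-zero when nothing matches, raising CalledProcessError)
output = subprocess.check_output(['grep', keyword, 'testing.txt'])
print(output)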

Sftp in Python by invoking unix shell commands

How can we sftp a file from a source host to a destination server in Python by invoking unix shell commands in a Python script using os.system? Please help.
I have tried the following code:
dstfilename="hi.txt"
host="abc.com"
user="sa"
os.system("echo cd /tmp >sample.txt)
os.system("echo put %(dstfilename)s" %locals()) // line 2
os.system("echo bye >>sample.txt")
os.system("sftp -B /var/tmp/sample.txt %(user)s#%(host)s)
How do I append the result of line 2 to sample.txt? The following seems syntactically incorrect:
os.system("echo put %(dstfilename)s %locals()) >>sample.txt"
sample.txt should look like this:
cd /tmp
put /var/tmp/hi.txt
bye
Any help?
Thank you.
You should pipe your commands into sftp. Try something like this:
import os
import subprocess
dstfilename="/var/tmp/hi.txt"
samplefilename="/var/tmp/sample.txt"
target="sa#abc.com"
sp = subprocess.Popen(['sftp', target], shell=False, stdin=subprocess.PIPE)
sp.stdin.write("cd /tmp\n")
sp.stdin.write("put %s\n" % dstfilename)
sp.stdin.write("bye\n")
# ... do other stuff ...
sp.stdin.write("put %s\n" % otherfilename)
# ... and finally ...
sp.stdin.write("bye\n")
sp.stdin.close()
But, in order to answer your question about this line:
os.system("echo put %(dstfilename)s %locals()) >>sample.txt"
Indeed, it is not syntactically correct. You want to pass a string to os.system, so it has to look like
os.system(<string expression>)
with a ) at the end.
The string expression consists of a string literal with an applied % formatting:
"string literal" % locals()
And the string literal contains the redirection for the shell:
"echo put %(dstfilename)s >>sample.txt"
And together:
os.system("echo put %(dstfilename)s >>sample.txt" % locals())
But as said, this is the worst solution I can imagine; better to write directly to a temp file, or even better, pipe directly into the subprocess.
Well, I think the literal solution to your question would look something like this:
import os
dstfilename="/var/tmp/hi.txt"
samplefilename="/var/tmp/sample.txt"
host="abc.com"
user="sa"
with open(samplefilename, "w") as fd:
    fd.write("cd /tmp\n")
    fd.write("put %s\n" % dstfilename)
    fd.write("bye\n")
os.system("sftp -B %s %s@%s" % (samplefilename, user, host))
As @larsks says, use a proper file handler to make the tmp file for you; my personal preference is not to do string formatting using locals().
However, depending on the use case, I don't think this is a particularly suitable approach; how does the password for the sftp site get entered, for example?
I think you'd get a more robust solution if you took a look at the SFTPClient in Paramiko, or failing that, you might need something like pexpect to help with ongoing automation.
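For reference, a minimal Paramiko sketch; the host and user are taken from the question, but the password and remote path are placeholders:
import paramiko

# Connect and open an SFTP session
transport = paramiko.Transport(("abc.com", 22))
transport.connect(username="sa", password="secret")
sftp = paramiko.SFTPClient.from_transport(transport)
# Upload the local file to the remote /tmp directory
sftp.put("/var/tmp/hi.txt", "/tmp/hi.txt")
sftp.close()
transport.close()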
If you want a non-zero return code if any of the sftp commands fail, you should write the commands to a file, then run an sftp batch on them. In this fashion, you can then retrieve the return code to check if the sftp commands had any failure.
Here's a quick example:
import subprocess

def execute_sftp_commands(sftp_command_list):
    # Write each command to a batch file, ending with 'quit'
    with open('batch.txt', 'w') as sftp_file:
        for sftp_command in sftp_command_list:
            sftp_file.write("%s\n" % sftp_command)
        sftp_file.write('quit\n')
    # Run sftp in batch mode; any failed command yields a non-zero return code
    sftp_process = subprocess.Popen(['sftp', '-b', 'batch.txt', user_host], shell=False)
    sftp_process.communicate()
    if sftp_process.returncode != 0:
        print("sftp failed on one or more commands: {0}".format(sftp_command_list))

host = "abc.com"
user = "sa"
user_host = "%s@%s" % (user, host)
execute_sftp_commands(['put hi.txt', 'put myfile.txt'])
Quick disclaimer: I did not run this in a shell so a typo might be present. If so, send me a comment and I will correct.
