For example, let's say I want to access a PostgreSQL database from the shell and run the statement select *;. This would require me to run:
psql -U postgres
<insert the password> (optional)
select *;
and ideally get the intermediate output after each step. This is only a toy example; for this, SQLAlchemy would be a better pick, but it should still be possible through Python's subprocess module.
What I have tried based on this post:
start = f"psql -U postgres"
fw = open("tmpout.txt", "wb")
fr = open("tmpout.txt", "r")
p = subprocess.Popen(start, stdin=subprocess.PIPE, stdout=fw, stderr=fw, bufsize=1,
shell=True)
p.stdin.write(bytes("select *;", 'utf-8'))
out = fr.read() # Here i would expect the result of the select, but it doesn't terminate..
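For a single statement there is a simpler pattern that does terminate: let psql read the SQL from stdin and collect everything with communicate(). A minimal sketch (the table name is a placeholder, and it assumes passwordless authentication, e.g. via ~/.pgpass, since psql prompts for passwords on the terminal rather than on stdin):

import subprocess

# Feed one SQL statement to psql via stdin; communicate() closes stdin,
# so psql runs the statement, prints the result, and exits.
p = subprocess.Popen(["psql", "-U", "postgres"],
                     stdin=subprocess.PIPE,
                     stdout=subprocess.PIPE,
                     stderr=subprocess.PIPE)
out, err = p.communicate(b"SELECT * FROM mytable;")  # "mytable" is a placeholder
print(out.decode("utf-8"))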
Related
I want to execute mysqldump from Python and provide the password when mysqldump requests it.
Adding the password on the command line is not an option; it must be provided via stdin.
This is what I've done so far:
command = [
    'mysqldump',
    '-h', mysqlhost,
    '-P', mysqlport,
    '-u', mysqluser,
    '-p',
    mysqldb
]
mysqlfile = mysqlpath + "/" + mysqldb + ".sql"
with open(mysqlfile, "w+") as file:
    p = subprocess.Popen(command, stdin=subprocess.PIPE, stdout=file)
    p.communicate(input=mysqlpass)
    p.wait()
But when I execute the code the terminal hangs requesting the password.
Thank you.
You can use pexpect for that. This is modified code as I had to test it, but you get the idea:
import pexpect
command2 = 'mysqldump -h localhost -u root -p xyzzy'
mysqlfile = "/tmp/foo.sql"
with open(mysqlfile, "w+") as file:
    p = pexpect.spawn(command2)
    p.expect("Enter password: ")
    p.sendline("foobar")
    q = p.read()
    p.wait()
    file.write(q)
here "foobar" is my database password.
Hannu
For me, the accepted answer did not solve the problem. Presumably it is related to the Python version I am using, which is 3.5.
The difficulties I had:
p.read() was blocking the process (I always killed the script at some point)
The chunk approach by David Rojo did not block, but .read(1024) returned integers where strings were expected by file.write(...). I assume this is related to differences in the way unicode is handled in Python 2 and 3, since adding the parameter encoding='utf-8' to pexpect.spawn() gave me the proper results. However, I then had to adapt the writing of the file so that it supports unicode as well.
Another problem with the for chunk in p.read(1024): approach is that the reading finished before mysqldump finished writing the dump to stdout. I guess that in this case mysqldump was too slow to deliver. I changed my solution so that it waits for EOF.
Note: I just started learning python a couple of days ago, please correct me if my assumptions or conclusions are wrong or misleading.
Code example
The script below is my minimal working example for calling mysqldump and providing the password when mysqldump asks for it:
#!/usr/bin/env python3
import pexpect
import io

cmd = 'mysqldump -u MYSQL_USER -p DATABASES(S)'
sqlfile = "/home/user/test-database-dump.sql"
password = 'secret'

with io.open(sqlfile, 'w', encoding="utf-8") as file:
    print('Calling mysqldump...')
    p = pexpect.spawn(cmd, encoding='utf-8')
    p.expect("Enter password: ")
    # Send password to mysqldump
    p.sendline(password)
    # Capture the dump
    print('Reading dump from process and writing it to file...')
    while not p.eof():
        chunk = p.readline()
        file.write(chunk)
    print('Finished.')
    p.close()
    print(p.exitstatus, p.signalstatus)
Using the subprocess module how do I get the following command to work?
isql -v -b -d, DSN_NAME "DOMAIN\username" password <<<
"SELECT column_name, data_type
FROM database_name.information_schema.columns
WHERE table_name = 'some_table';"
This command works perfectly when I run it in a bash shell but I can't get it to work when running from within Python. I'm trying to do this from within Python because I need to be able to modify the query and get different result sets back and then process them in Python. I can't use one of the nice Python database connectors for various reasons which leaves me trying to pipe output from isql.
My code currently looks similar to the following:
bash_command = '''
isql -v -b -d, DSN_NAME "DOMAIN\username" password <<<
"SELECT column_name, data_type
FROM database_name.information_schema.columns
WHERE table_name = 'some_table';"
'''
process = subprocess.Popen(bash_command,
shell=True,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE)
output, error = process.communicate()
However I have tried lots of variations:
Using the entire command as a string, or as a list of strings.
Using check_output vs Popen.
Using communicate() to try to send the query to the isql command, or having the query be part of the command string using a heredoc.
Using shell=True or not.
Specifying /bin/bash or using the default /bin/sh.
Lots of different quoting and escaping patterns.
And pretty much every permutation of the above.
In no case do I receive the output of the query that I'm looking for. I'm pretty sure that the command isn't being sent to the shell as is but I can't tell what is being sent to the shell.
I feel like this should be pretty simple, send a command to the shell and get the output back, but I just can't make it work. I can't even see what command is being sent to the shell, even using pdb.
shell=True makes subprocess use /bin/sh by default. <<< "here-string" is a bash-ism; pass executable='/bin/bash':
>>> import subprocess
>>> subprocess.call(u'cat <<< "\u0061"', shell=True)
/bin/sh: 1: Syntax error: redirection unexpected
2
>>> subprocess.call(u'cat <<< "\u0061"', shell=True, executable='/bin/bash')
a
0
You should also use raw-string literals to avoid escaping backslashes: "\\u0061" == r"\u0061" != u"\u0061":
>>> subprocess.call(r'cat <<< "\u0061"', shell=True, executable='/bin/bash')
\u0061
0
Though you don't need shell=True here. You could pass the input as a string using process.communicate(input=input_string):
>>> process = subprocess.Popen(['cat'], stdin=subprocess.PIPE, stdout=subprocess.PIPE)
>>> process.communicate(br"\u0061")
('\\u0061', None)
The result could look like:
#!/usr/bin/env python
import shlex
from subprocess import Popen, PIPE
cmd = shlex.split(r'isql -v -b -d, DSN_NAME "DOMAIN\username" password')
process = Popen(cmd, stdin=PIPE, stdout=PIPE, stderr=PIPE)
output, errors = process.communicate(
b"SELECT column_name, data_type "
b"FROM database_name.information_schema.columns "
b"WHERE table_name = 'some_table';")
Try giving this a shot:
import shlex
from subprocess import Popen, PIPE, STDOUT
sql_statement = '''SELECT column_name, data_type
FROM database_name.information_schema.columns
WHERE table_name = 'some_table';'''
isqlcommand = r'isql -v -b -d, DSN_NAME "DOMAIN\username" password'
isqlcommand_args = shlex.split(isqlcommand)
process = Popen(isqlcommand_args, stdin=PIPE, stdout=PIPE, stderr=STDOUT)
output = process.communicate(input=sql_statement)[0]
print output
The idea here is to separate the here-string redirection from the isql command execution. This example will pipe the here-string into the stdin of process via process.communicate(). I'm also using shlex.split() to tokenize the command and its arguments.
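For reference, this is roughly what shlex.split() produces here, keeping the quoted login as a single argument (an illustrative check; note that a raw string keeps the backslash literal):

>>> import shlex
>>> shlex.split(r'isql -v -b -d, DSN_NAME "DOMAIN\username" password')
['isql', '-v', '-b', '-d,', 'DSN_NAME', 'DOMAIN\\username', 'password']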
Edit: Removed shell=True after reviewing the comment from J.F. Sebastian.
I am trying to use python in a unix style pipe.
For example, in unix I can use a pipe such as:
$ samtools view -h somefile.bam | python modifyStdout.py | samtools view -bh - > processed.bam
I can do this by using a for line in sys.stdin: loop in the Python script, and that appears to work without problems.
However I would like to internalise this unix command into a python script. The files involved will be large so I would like to avoid blocking behaviour, and basically stream between processes.
At the moment I am trying to use Popen to manage each command, and pass the stdout of the first process to the stdin of the next process, and so on.
In a separate Python script (sep_process.py) I have:
import sys

f = open("sentlines.txt", 'w')
f.write("hi")
for line in sys.stdin:
    print line
    f.write(line)
f.close()
And in my main python script I have this:
import sys
from subprocess import Popen, PIPE
# Generate an example file to use
f = open('sees.txt', 'w')
f.write('somewhere over the\nrainbow')
f.close()
if __name__ == "__main__":
# Use grep as an example command
p1 = Popen("grep over sees.txt".split(), stdout=PIPE)
# Send to sep_process.py
p2 = Popen("python ~/Documents/Pythonstuff/Bam_count_tags/sep_process.py".split(), stdin=p1.stdout, stdout=PIPE)
# Send to final command
p3 = Popen("wc", stdin=p2.stdout, stdout=PIPE)
# Read output from wc
result = p3.stdout.read()
print result
The p2 process, however, fails with [Errno 2] No such file or directory, even though the file exists.
Do I need to implement a Queue of some kind and/or open the python function using the multiprocessing module?
The tilde ~ is a shell expansion. You are not using a shell, so it is looking for a directory called ~.
You could read the environment variable HOME and insert that. Use
os.environ['HOME']
Alternatively you could use shell=True if you can't be bothered to do your own expansion.
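A third option is os.path.expanduser, which performs the same ~ expansion without involving a shell; a small sketch using the path from the question:

import os.path

# Expand the leading ~ the way a shell would.
script = os.path.expanduser("~/Documents/Pythonstuff/Bam_count_tags/sep_process.py")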
Thanks @cdarke, that solved the problem for simple commands like grep, wc, etc. However, I could not get subprocess.Popen to work when using an executable such as samtools to provide the data stream.
To fix the issue, I created a string containing the pipe exactly as I would write it in the command line, for example:
sam = '/Users/me/Documents/Tools/samtools-1.2/samtools'
home = os.environ['HOME']
inpath = "{}/Documents/Pythonstuff/Bam_count_tags".format(home)
stream_in = "{s} view -h {ip}/test.bam".format(s=sam, ip=inpath)
pyscript = "python {ip}/bam_tags.py".format(ip=inpath)
stream_out = "{s} view -bh - > {ip}/small.bam".format(s=sam, ip=inpath)
# Absolute paths, written as a pipe
fullPipe = "{inS} | {py} | {outS}".format(inS=stream_in,
py=pyscript,
outS=stream_out)
print fullPipe
# Translates to >>>
# samtools view -h test.bam | python ./bam_tags.py | samtools view -bh - > small.bam
I then used popen from the os module instead and this worked as expected:
os.popen(fullPipe)
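If you prefer to stay within the subprocess module, the same pipeline string can be handed to a shell there too. A sketch reusing fullPipe from above; shell=True is what provides the | plumbing and the > redirection:

import subprocess

# Run the whole pipeline under the default shell; wait() returns its exit status.
proc = subprocess.Popen(fullPipe, shell=True)
proc.wait()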
I am looking for a way to capture (and verify) the user login output (i.e. the MOTD located in /etc/issue.net) from an ssh session using the subprocess module in Python. I've tried several variations of the following but have yet to find a way that captures the desired output without either hanging the session or returning only the passed (i.e. "ls -la") command's output. I'm using Python 2.6 and have a requirement to use only the native libraries available in this installation (Red Hat 6.5), so modules such as pexpect are currently unavailable to me.
The code below only returns the "ls -la" output, and not the desired ssh login message. NOTE: "testUser" utilizes a PKI, thus obviating the need for handling passwords.
loginStr = ['ssh', testUser#someHost, "ls -la"]
p = subprocess.Popen(loginStr, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
while True:
    line = p.stdout.readline()
    if not line: break
    print line
I've also tried this, with similar outcomes:
loginStr = ['ssh', 'testUser@someHost', 'ls', '-la']
p = subprocess.Popen(loginStr, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
(stdout, stderr) = p.communicate()
print stdout
Might queues and threads be a solution? Any ideas would be greatly appreciated.
You could use Python expect (pexpect).
Something as follows (replace host, user, and passwd appropriately):
(Also adjust the regular expression for the shell prompt as needed.)
import pexpect
cmd = "ssh -o StrictHostKeyChecking=no %s -l %s" % (<host>, <user>)
exp = pexpect.spawn(cmd, timeout=7)
idx = exp.expect('assword:')
nc = exp.sendline(<passwd>)
idx = exp.expect('[\n\r](#|\$) ')
if idx == 0:
    before = exp.before
    print before
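For the question's use case, exp.before is the interesting part: pexpect stores everything the session printed before the matched prompt there, which is where the login banner/MOTD ends up.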
I am trying to run several commands in a single ssh session and save the output of each command in a different file.
The code that works (but it saves all the output in a single file)
conn = Popen(['ssh', host, "ls;uname -a;pwd"], stdin=PIPE, stdout=open('/output.txt', 'w'))
mypassword = conn.communicate('password')
Codes that I am trying to work but not working...
cmd = ['ls', 'pwd', 'uname']
conn = Popen(['ssh', host, "{};{};{}".format(cmd[0], cmd[1], cmd[2])],
             stdin=PIPE, stdout=output.append('a'))
mypassword = conn.communicate('password')
print (output)
length = range(len(output))
print length
for i in output:
    open("$i", 'w')
and
cmd = ['ls', 'pwd', 'uname']
conn = Popen(['ssh', host, "{};{};{}".format(cmd[0], cmd[1], cmd[2])],
             stdin=PIPE, stdout=output())
mypassword = conn.communicate('password')

def output():
    for i in cmd:
        open(i, 'w')
    return
Not sure what the best way of doing it is. Should I save the output in an array and then save each item to a separate file, or should I call a function that will do it?
NOTE that the commands I want to run do not have small output like the examples given here (uname, pwd); the output is big, as with tcpdump, lsof, etc.
A single ssh session runs a single command e.g., /bin/bash on the remote host -- you can pass input to that command to emulate running multiple commands in a single ssh session.
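A minimal sketch of that idea (reusing host from your code, and assuming key-based authentication so ssh never prompts for a password; see the first point below): one ssh session running /bin/sh on the remote host, with all three commands fed in via stdin and each one redirecting its own output to a remote file that can be copied back later:

from subprocess import Popen, PIPE

# One ssh session; the remote shell reads the commands from stdin.
conn = Popen(['ssh', host, '/bin/sh'], stdin=PIPE)
conn.communicate(b'ls > ls.out\nuname -a > uname.out\npwd > pwd.out\n')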
Your code won't run even a single command. There are multiple issues in it:
ssh may read the password directly from the terminal (not its stdin stream). conn.communicate('password') in your code writes to ssh's stdin, so ssh won't get the password.
There are multiple ways to authenticate via ssh e.g., use ssh keys (passwordless login).
stdout = output.append('a') doesn't redirect ssh's stdout because .append list method returns None.
It won't help you to save output of several commands to different files. You could redirect the output to remote files and copy them back later: ls >ls.out; uname -a >uname.out; pwd >pwd.out.
A (hacky) alternative is to use in-stream markers (echo <GUID>) to differentiate the output of different commands. If the output can be unlimited, learn how to read a subprocess' output incrementally (without calling the .communicate() method).
for i in cmd: open(i,'w') is pointless. It opens (and immediately closes on CPython) multiple files without using them.
To avoid such basic mistakes, write several Python scripts that operate on local files.
import subprocess

SYS_STATS = {"Number of CPU Cores": "cat /proc/cpuinfo|grep -c 'processor'\n",
             "CPU MHz": "cat /proc/cpuinfo|grep 'cpu MHz'|head -1|awk -F':' '{print $2}'\n",
             "Memory Total": "cat /proc/meminfo|grep 'MemTotal'|awk -F':' '{print $2}'|sed 's/ //g'|grep -o '[0-9]*'\n",
             "Swap Total": "cat /proc/meminfo|grep 'SwapTotal'|awk -F':' '{print $2}'|sed 's/ //g'|grep -o '[0-9]*'\n"}

def get_system_details(self, ipaddress, user, key):
    values = []
    sshProcess = subprocess.Popen(['ssh', '-T', '-o', 'StrictHostKeyChecking=no',
                                   '-i', '%s' % key, '%s@%s' % (user, ipaddress),
                                   "sudo", "su"],
                                  stdin=subprocess.PIPE, stdout=subprocess.PIPE,
                                  universal_newlines=True, bufsize=0)
    # Write an END marker before each command so the outputs can be told apart.
    for name in self.SYS_STATS:
        sshProcess.stdin.write("echo END\n")
        sshProcess.stdin.write(self.SYS_STATS[name])
    sshProcess.stdin.close()
    # Collect every output line that is not a marker.
    for line in sshProcess.stdout:
        if line.rstrip('\n') != "END":
            values.append(line.rstrip('\n'))
    mapObj = {k: v for k, v in zip(self.SYS_STATS_KEYS, values)}
    return mapObj