ssh not recognized as a command when executed from python using subprocess? - python

This is my code -
import subprocess
import sys
HOST="xyz3511.uhc.com"
# Ports are handled in ~/.ssh/config since we use OpenSSH
COMMAND="uptime"
ssh = subprocess.Popen(["ssh", "%s" % HOST, COMMAND],
                       shell=True,
                       stdout=subprocess.PIPE,
                       stderr=subprocess.PIPE)
result = ssh.stdout.readlines()
if result == []:
    error = ssh.stderr.readlines()
    print(sys.stderr, "ERROR: %s" % error)
else:
    print(result)
and this is the error I'm getting-
ERROR:
[b"'ssh' is not recognized as an internal or external command,\r\n",
b'operable program or batch file.\r\n'].
Not sure what I'm doing wrong here. Also, I haven't specified any port. All I want is to use subprocess to connect to a remote server and execute a simple command like ls. The Python version is 3.x.

Apparently this happens in Python 3.
A workaround was found at this link:
https://gist.github.com/bortzmeyer/1284249
import os
import platform
from subprocess import Popen, PIPE

# 32-bit Python on 64-bit Windows sees a redirected System32; SysNative reaches the real one
system32 = os.path.join(os.environ['SystemRoot'],
                        'SysNative' if platform.architecture()[0] == '32bit' else 'System32')
ssh_path = os.path.join(system32, 'OpenSSH/ssh.exe')
out1, err1 = Popen([ssh_path, "pi@%s" % self.host, "%s" % cmd],
                   shell=False,
                   stdout=PIPE,
                   stderr=PIPE).communicate()
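The underlying failure is that shell=True on Windows routes the argument list through cmd.exe, which then cannot find ssh on the Python process's PATH (the message in the question is cmd.exe's). If ssh is on PATH, shutil.which can locate it without hardcoding System32; this is a minimal sketch, not from the linked gist, and it assumes Python 3.7+ for capture_output:

import shutil
import subprocess

HOST = "xyz3511.uhc.com"
COMMAND = "uptime"

ssh_path = shutil.which("ssh")  # searches this process's PATH, like cmd.exe would
if ssh_path is None:
    raise RuntimeError("ssh not found on PATH for this Python process")

# shell=False: the list goes straight to ssh.exe instead of through cmd.exe
result = subprocess.run([ssh_path, HOST, COMMAND],
                        capture_output=True, text=True)
print(result.stdout or result.stderr)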

Related

Python: subprocess.call and variants fail for a particular application from executed .py but not from python in CLI

I have a strange issue here: I have an application that I'm attempting to launch from Python, but all attempts to launch it from within a .py script fail without any discernible output. I'm testing from within the VSCode debugger. Here are some additional oddities:
When I swap notepad.exe into the .py instead of my target application's path, notepad launches fine.
When I run the script line by line from the CLI (start by launching python, then type out the next 4-5 lines of Python), the script works as expected.
Examples:
#This works in the .py, and from the CLI
import subprocess
cmd = ['C:\\Windows\\system32\\notepad.exe', 'C:\\temp\\myfiles\\test_24.xml']
pipe = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
pipe.wait()
print(pipe)
#This fails in the .py, but works ok when pasted in line by line from the CLI
import subprocess
cmd = ['C:\\temp\\temp_app\\target_application.exe', 'C:\\temp\\myfiles\\test_24.xml']
pipe = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
pipe.wait()
print(pipe)
The result is no output when running the .py
I've tried several other variants, including the following:
import subprocess
tup = 'C:\\temp\\temp_app\\target_application.exe C:\\temp\\test\\test_24.xml'
proc = subprocess.Popen(tup)
proc.wait()
(stdout, stderr) = proc.communicate()
print(stdout)
if proc.returncode != 0:
    print("The error is: " + str(stderr))
else:
    print("Executed: " + str(tup))
Result:
None
The error is: None
1.082381010055542
Now this method indicates there is an error, because the return code is something other than 0 and we print "The error is: None"; that happens because stderr is None. So, is it throwing an error without giving an error?
stdout is also reporting None.
So, let's try check_call and see what happens:
print("Trying check_call")
try:
    subprocess.check_call('C:\\temp\\temp_app\\target_application.exe C:\\temp\\test\\test_24.xml', shell=True)
except subprocess.CalledProcessError as error:
    print(error)
Results:
Trying check_call
Command 'C:\temp\temp_app\target_application.exe C:\temp\test\test_24.xml' returned non-zero exit status 1.
I've additionally tried subprocess.run, although it is missing the wait procedure I was hoping to use.
import subprocess
tup = 'C:\\temp\\temp_app\\target_application.exe C:\\temp\\test\\test_24.xml'
proc = subprocess.run(tup, check=True)
proc.wait()
(stdout, stderr) = proc.communicate()
print(stdout)
if proc.returncode != 0:
    print("The error is: " + str(stderr))
else:
    print("Executed: " + str(tup))
What reasons might be worth chasing, or what other ways of trying to catch an error might work here? I don't know how to interpret "None" as an error result.
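One more way to surface whatever the program reports, as a hedged sketch using the same hypothetical paths as above (capture_output requires Python 3.7+). In the Popen variant earlier, stdout and stderr print as None simply because no pipes were requested; subprocess.run can ask for them explicitly:

import subprocess

# request both pipes so a non-zero exit comes with its diagnostics;
# a list with the default shell=False avoids the cmd.exe quoting layer
proc = subprocess.run(
    ['C:\\temp\\temp_app\\target_application.exe', 'C:\\temp\\myfiles\\test_24.xml'],
    capture_output=True, text=True)
print("return code:", proc.returncode)
print("stdout:", repr(proc.stdout))
print("stderr:", repr(proc.stderr))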

Conditionally run subprocess over ssh, while appending output to a (potentially remote) file

I have a script which can run on my host machine and several other servers. I want to launch this script as a background process on my host machine and on the remote machines over ssh, with stdout/stderr going to a log file on whichever machine each background process runs on: the host machine for the local process, and the remote machines for the remote ones.
I tried
subprocess.check_output(['python', 'script.py', 'arg_1', ' > file.log ', ' & echo -ne $! '])
but it doesn't work: it gives me neither the PID nor writes into the file. It works with shell=True, but I've read that shell=True is bad for security reasons.
Then I tried
p = subprocess.Popen(['python', 'script.py', 'arg_1', ' > file.log '])
Now I can get the process PID, but the output is not written to the remote log file.
Using the stdout/stderr arguments as suggested in the question below opens the log file on my host machine, not the remote machine; I want to log on the remote machine instead.
append subprocess.Popen output to file?
Could someone please suggest a single command that works on my host machine, and that also ssh's to the remote server, launches the background process there, and writes to the output file?
<HOW_TO_GET_PID> = subprocess.<WHAT>( ([] if 'localhost' else ['ssh','<remote_server>']) + ['python', 'script.py', 'arg_1' <WHAT>] )
Could someone please finish the above pseudocode?
Thanks,
You're not going to get something that's safe and correct in a one-liner without making it unreadable; better not to try.
Note that we're using a shell here: in the local case we explicitly pass shell=True, whereas in the remote case ssh always implicitly starts a shell.
import shlex
import subprocess

def startBackgroundCommand(argv, outputFile, remoteHost=None, andGetPID=False):
    cmd_str = ' '.join(shlex.quote(word) for word in argv)
    if outputFile is not None:
        cmd_str += ' >%s' % (shlex.quote(outputFile),)
    if andGetPID:
        cmd_str += ' & echo "$!"'
    if remoteHost is not None:
        p = subprocess.Popen(['ssh', remoteHost, cmd_str], stdout=subprocess.PIPE)
    else:
        p = subprocess.Popen(cmd_str, stdout=subprocess.PIPE, shell=True)
    return p.communicate()[0]
# Run your command locally
startBackgroundCommand(['python', 'script.py', 'arg_1'],
                       outputFile='file.log', andGetPID=True)

# Or run your command remotely
startBackgroundCommand(['python', 'script.py', 'arg_1'],
                       remoteHost='foo.example.com', outputFile='file.log', andGetPID=True)
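Since the PID printed by echo "$!" is the only thing written to the captured stdout (the script's own output goes to the log file), the caller can parse it from the return value. A small hedged example; the decoding and int() parsing are an addition, not part of the original answer:

pid_output = startBackgroundCommand(['python', 'script.py', 'arg_1'],
                                    outputFile='file.log',
                                    remoteHost='foo.example.com',
                                    andGetPID=True)
remote_pid = int(pid_output.decode().strip())  # PID of the background process on the remote host
print(remote_pid)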
# At the beginning you can even program automatic daemonizing
# using os.fork(); otherwise, run it with something like:
#   nohup python run_my_script.py &
# This ensures it keeps running even if the SSH connection breaks.
from subprocess import Popen, PIPE, STDOUT

p = Popen(["python", "yourscript.py"], stdout=PIPE, stderr=STDOUT, stdin=PIPE)
p.stdin.close()
log = open("logfile.log", "wb")
log.write(b"PID: %i\n\n" % p.pid)
while True:
    line = p.stdout.readline()
    if not line:
        break
    log.write(line)
    log.flush()
p.stdout.close()
log.write(b"\nExit status: %i" % p.wait())  # wait() guarantees the exit status is set
log.close()

Capturing Login Output from an SSH Session in Python

I am looking for a way to capture (and verify) the user login output (i.e. the MOTD banner from issue.net) from an ssh session using the subprocess module in Python. I've tried several variations of the following but have yet to find one that captures the desired output without either hanging the session or returning only the passed command's output (i.e. "ls -la"). I'm using Python 2.6 and am required to use only the native libraries available on this installation (Red Hat 6.5), so modules such as pexpect are currently unavailable to me.
The code below only returns the "ls -la" output, and not the desired ssh login message. NOTE: "testUser" utilizes a PKI, thus obviating the need for handling passwords.
loginStr = ['ssh', 'testUser@someHost', "ls -la"]
p = subprocess.Popen(loginStr, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
while True:
    line = p.stdout.readline()
    if not line:
        break
    print line
I’ve also tried this with similar outcomes:
loginStr = ['ssh', 'testUser@someHost', 'ls', '-la']
p = subprocess.Popen(loginStr, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
(stdout, stderr) = p.communicate()
print stdout
Might queues and threads be a solution? Any ideas would be greatly appreciated.
You could use Python expect (pexpect).
Something as follows (replace host, user, and passwd appropriately, and adjust the regular expression for the shell prompt):
import pexpect

cmd = "ssh -o StrictHostKeyChecking=no %s -l %s" % (<host>, <user>)
exp = pexpect.spawn(cmd, timeout=7)
idx = exp.expect('assword:')
nc = exp.sendline(<passwd>)
idx = exp.expect('[\n\r](#|\$) ')
if idx == 0:
    before = exp.before
    print before
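If pexpect really is unavailable, one detail worth knowing: the OpenSSH client writes the server's pre-authentication banner (the issue.net text configured via sshd's Banner option) to stderr, not stdout, so plain subprocess can capture it. A minimal sketch under that assumption; the /etc/motd message only appears for interactive shells and may still be absent:

import subprocess

# run a trivial remote command; the login banner arrives on stderr
p = subprocess.Popen(['ssh', 'testUser@someHost', 'true'],
                     stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout, stderr = p.communicate()
print stderr  # banner text, plus any ssh diagnostics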

python subprocess communicate freezes

I have the following Python code that hangs:
cmd = ["ssh", "-tt", "-vvv"] + self.common_args
cmd += [self.host]
cmd += ["cat > %s" % (out_path)]
p = subprocess.Popen(cmd, stdin=subprocess.PIPE,
                     stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout, stderr = p.communicate(in_string)
It is supposed to save a string (in_string) into a remote file over ssh.
The file is correctly saved, but then the process hangs. If I use
cmd += ["echo"]
instead of
cmd += ["cat > %s" % (out_path)]
the process does not hang, so I am pretty sure that I misunderstand something about the way communicate decides that the process has exited.
Do you know how I should write the command so that the "cat > file" does not make communicate hang?
The -tt option allocates a tty, which prevents the child process from exiting when .communicate() closes p.stdin (the EOF is ignored). This works:
import pipes
from subprocess import Popen, PIPE
cmd = ["ssh", self.host, "cat > " + pipes.quote(out_path)] # no '-tt'
p = Popen(cmd, stdin=PIPE, stdout=PIPE, stderr=PIPE)
stdout, stderr = p.communicate(in_string)
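Note that the pipes module is Python 2; on Python 3 the same quoting helper lives in shlex. A hedged equivalent (self.host, out_path, and in_string come from the question's context, and communicate() needs bytes unless text mode is enabled):

import shlex
from subprocess import Popen, PIPE

cmd = ["ssh", self.host, "cat > " + shlex.quote(out_path)]  # still no '-tt'
p = Popen(cmd, stdin=PIPE, stdout=PIPE, stderr=PIPE)
stdout, stderr = p.communicate(in_string.encode())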
You could use paramiko, a pure-Python ssh library, to write data to a remote file via ssh:
#!/usr/bin/env python
import os
import posixpath
import sys
from contextlib import closing
from paramiko import SSHConfig, SSHClient

hostname, out_path, in_string = sys.argv[1:]  # get from command line

# load parameters to set up the ssh connection
config = SSHConfig()
with open(os.path.expanduser('~/.ssh/config')) as config_file:
    config.parse(config_file)
d = config.lookup(hostname)

# connect
with closing(SSHClient()) as ssh:
    ssh.load_system_host_keys()
    ssh.connect(d['hostname'], username=d.get('user'))
    with closing(ssh.open_sftp()) as sftp:
        makedirs_exists_ok(sftp, posixpath.dirname(out_path))
        with sftp.open(out_path, 'wb') as remote_file:
            remote_file.write(in_string)
where the makedirs_exists_ok() function mimics os.makedirs():
from functools import partial
from stat import S_ISDIR

def isdir(ftp, path):
    try:
        return S_ISDIR(ftp.stat(path).st_mode)
    except EnvironmentError:
        return None

def makedirs_exists_ok(ftp, path):
    def exists_ok(mkdir, name):
        """Don't raise an error if name is already a directory."""
        try:
            mkdir(name)
        except EnvironmentError:
            if not isdir(ftp, name):
                raise

    # from os.makedirs()
    head, tail = posixpath.split(path)
    if not tail:
        assert path.endswith(posixpath.sep)
        head, tail = posixpath.split(head)
    if head and tail and not isdir(ftp, head):
        exists_ok(partial(makedirs_exists_ok, ftp), head)  # recursive call

    # do create directory
    assert isdir(ftp, head)
    exists_ok(ftp.mkdir, path)
It makes sense that the cat command hangs: it is waiting for an EOF. I tried sending an EOF in the string but couldn't get it to work. Upon researching this question, I found a great module for streamlining the use of SSH for command-line tasks like your cat example. It might not be exactly what you need for your use case, but it does do what your question asks.
Install fabric with
pip install fabric
Inside a file called fabfile.py put
from fabric.api import run

def write_file(in_string, path):
    run('echo {} > {}'.format(in_string, path))
And then run this from the command prompt with:
fab -H username@host write_file:in_string=test,path=/path/to/file

Not able to execute sql command through a session created using POPEN in python

I'm trying to connect to SQL Server using the code below, and I'm getting an "invalid argument" error. I'm trying to read from a SQL file and then run the query on the session created by Popen, using sqlcmd.
My SQL file contains this code:
select @@version;
GO
This is my Python code to make the connection and run the command. I'm getting "[Errno 22] Invalid argument":
import os
import subprocess
from subprocess import Popen, PIPE, STDOUT

def ms_sql_session():
    ip_addr = "xxx.xxx.xxx.xxx,1433"
    user = 'sa'
    password = 'password'
    connection_string = 'sqlcmd -S %s -U %s -P %s' % (ip_addr, user, password)
    try:
        session = Popen(connection_string, stdin=PIPE, stdout=PIPE, stderr=PIPE)
        f = open('abc.sql', 'r')
        str_cmd = f.read()
        session.stdin.write(str_cmd)
        stdout, stderr = session.communicate()
        print stdout
        print stderr
        return True
    except Exception, e:
        print str(e)
        return False

ms_sql_session()
How can I debug this kind of situation? I'm pretty much a novice at SQL, so I'm not sure whether the problem is with my code or with the way stdin works.
I'm able to run commands using the sqlcmd utility at the command prompt. I'm using the sqlcmd for SQL 2005.
try this (add shell=True):
session = Popen(connection_string, stdin=PIPE, stdout=PIPE, stderr=PIPE, shell=True)
or you can add the explicit path of sqlcmd.exe to your string ...
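For the second suggestion, a minimal sketch; the install path below is a guess based on the SQL Server 2005 tools layout, so adjust it to the actual machine. Passing a list also removes the need for shell=True:

# hypothetical install path for the SQL Server 2005 client tools
sqlcmd_exe = r'C:\Program Files\Microsoft SQL Server\90\Tools\Binn\SQLCMD.EXE'
session = Popen([sqlcmd_exe, '-S', ip_addr, '-U', user, '-P', password],
                stdin=PIPE, stdout=PIPE, stderr=PIPE)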
