Failing to run 'powershell' over a telnet connection using pexpect (Python)

I'm trying to send the 'powershell' command through telnet (from Linux to Windows), and it fails with a timeout.
Other commands I send through telnet, such as 'dir', work fine.
This is the relevant part of the code I'm using:
p = host.pobject()
p.cmd = cmd
child = self.connection or self.OpenTelnetConnection()
t = stopwatch.Timer()
try:
    child.sendline('{0}\r'.format(cmd))
    child.expect(self.prompt, timeout=timeout)
    # output = child.before
    output = child.after
    if stdout:
        sys.stdout.write(child.after)
        sys.stdout.flush()
    child.sendline('echo %errorlevel%\r')
    child.expect(self.prompt)
    p.rc = int(child.after.split("\r\n")[1])
    p.runtime = t.stop()
    if p.rc:
        p.stderr = output.split("\r\n")[1:-1]
    else:
        p.stdout = output.split("\r\n")[1:-1]
    return p
except Exception, e:
    self.report.Error("Failed to run command {0}. {1}".format(cmd, e),
                      exception=["TestFailure"], testName="WindowsHost")

The workaround I found is to pass the PowerShell command as an argument instead of starting an interactive PowerShell session.
For example, if I want to run the 'host' command in PowerShell, I send:
'powershell host'
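For illustration, a minimal sketch of that workaround using the same objects as the snippet above (child, self.prompt and timeout are assumed to be the ones already defined there):

# Sketch: run the PowerShell command non-interactively by passing it as an
# argument, instead of starting an interactive 'powershell' session that
# never returns the telnet prompt and therefore times out.
ps_cmd = 'host'                                      # the PowerShell command to run
child.sendline('powershell {0}\r'.format(ps_cmd))    # sends: powershell host
child.expect(self.prompt, timeout=timeout)
output = child.after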

Python subprocess - command execution on remote machine via ssh without shell interpreting semicolon or ampersand

I have a requirement to execute some commands via SSH on a remote host, for which I have to use only subprocess.
For security reasons I have whitelisted the commands that can be executed on the remote host, but the arguments to those commands are user provided.
For example, the ls command will be whitelisted, but the -l argument will come from the user via an API.
Currently my implementation will not stop a user from executing an arbitrary command on the remote host by using ; or & in the user-supplied arguments.
For example, a user can pass -l ; cat /etc/passwd to the ls command, and it will be executed on the remote host.
I have now added a function to check whether there are any special characters in the user-supplied input, but this looks insecure, because blacklisting only a few characters may still leave a loophole.
Is there any other, safer solution to this issue, with the restriction that I can only use subprocess and user input cannot be restricted?
Please help.
My current code:
import subprocess
import traceback

def executeCommand(cmd, log):
    try:
        resultobj = subprocess.run(cmd, capture_output=True, check=True, universal_newlines=True)
        if not resultobj.stdout.strip() == '':
            log.info("Command output: %s", resultobj.stdout)
            return resultobj.returncode, resultobj.stdout
        log.error("Command Execution returned None: %s", cmd)
        return -1, resultobj.stdout + "\n" + resultobj.stderr
    except subprocess.CalledProcessError as e:
        log.error("Command Execution Failed for Command: %s with error %s", cmd, e.stderr)
        return e.returncode, e.stderr
    except subprocess.SubprocessError as e:
        log.error("Command Execution Failed for Command: %s with error %s", cmd, traceback.format_exc())
        return -1, traceback.format_exc()
The command passed to the above function will look like:
ssh_cmd = ['ssh', '-oConnectTimeout=10', '-oBatchMode=yes', '-oStrictHostKeyChecking=no', '-q', '1.1.1.1', 'ls -l ; cat /etc/passwd']
retcode, cmd_result = executeCommand(ssh_cmd, log)
The special-character blacklist function is pasted below:
def isBlacklistedChar(str):
    filter_chars = "&;|"
    if any(c in filter_chars for c in str):
        return True
    else:
        return False
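One safer pattern than blacklisting, sketched below under the assumption that only the arguments (never the command itself) come from the user, is to quote each user-supplied argument with shlex.quote before appending it to the whitelisted command, so the remote shell treats characters like ; and & as literal text. The helper name buildSshCommand is made up for this sketch.

import shlex

def buildSshCommand(host, whitelisted_cmd, user_args):
    # Quote every user-supplied argument so ';', '&', '|' etc. lose their
    # special meaning on the remote shell.
    remote_cmd = whitelisted_cmd + ' ' + ' '.join(shlex.quote(a) for a in user_args)
    return ['ssh', '-oConnectTimeout=10', '-oBatchMode=yes',
            '-oStrictHostKeyChecking=no', '-q', host, remote_cmd]

# The injection attempt '-l ; cat /etc/passwd' is now passed to ls as literal
# arguments and fails harmlessly instead of dumping /etc/passwd.
ssh_cmd = buildSshCommand('1.1.1.1', 'ls', ['-l', ';', 'cat', '/etc/passwd'])
retcode, cmd_result = executeCommand(ssh_cmd, log)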

Background process suspended (tty output)

I have a Python script that calls a bash script to connect to a VPN account. If I run the Python script from a console, I can do 2FA and connect to my VPN account seamlessly. However, if I run the script as a background process (with nohup, etc.), the Python process becomes suspended (+ suspended (tty output)) whenever I try to connect to the VPN, and the program stops responding (it looks like it is stuck waiting for input).
vpn_manager.py:
connection_command = 'sh {}'.format(os.path.join(base_path, 'scripts', 'vpn.sh'))
response = subprocess.run(connection_command, shell=True, stdout=subprocess.PIPE)
stdout = response.stdout.decode('utf-8')
if 'state: Connected' in stdout:
    update_icon(environment)
Shell script vpn.sh:
printf "1\nUSERNAME\nPASSWORD\n2\n" | /opt/cisco/anyconnect/bin/vpn -s connect VPN_HOST
Normally, this VPN command asks for a username and password, then waits for me to verify the login from my 2FA app on my phone.
How can I make this Python code work as a background process without being interrupted by the VPN prompts?
Using pexpect is a better approach for this interaction, as @CharlesDuffy suggested.
A solution with pexpect will look similar to the example below.
import pexpect

failed = False
vpn = pexpect.spawn('/opt/cisco/anyconnect/bin/vpn -s connect {}'.format(host))

ret = vpn.expect([pexpect.TIMEOUT, CONNECT_SUCCESS, CONNECT_ERR_1, CONNECT_ERR_2, ...])
if ret != 1:
    failed = True

if not failed:
    vpn.sendline('1')
    ret = vpn.expect([pexpect.TIMEOUT, SELECT_GROUP_SUCCESS, SELECT_GROUP_ERR_1, SELECT_GROUP_ERR_2, ...])
    if ret != 1:
        failed = True

if not failed:
    vpn.sendline(USER_NAME)
    ret = vpn.expect([pexpect.TIMEOUT, USER_NAME_SUCCESS, USER_NAME_ERR_1, USER_NAME_ERR_2, ...])
    if ret != 1:
        failed = True

if not failed:
    vpn.sendline(PASSWORD)
    ret = vpn.expect([pexpect.TIMEOUT, PASSWORD_SUCCESS, PASSWORD_ERR_1, PASSWORD_ERR_2, ...])
    if ret != 1:
        failed = True

if not failed:
    vpn.sendline(AUTHENTICATION_METHOD)
    ret = vpn.expect([pexpect.TIMEOUT, AUTHENTICATION_SUCCESS, AUTHENTICATION_ERR_1, AUTHENTICATION_ERR_2, ...])
    if ret != 1:
        failed = True

if not failed:
    print('Connected!')
else:
    print('Failed to connect!')
Send the output to a file (or named pipe)
printf "1\nUSERNAME\nPASSWORD\n2\n" | /opt/cisco/anyconnect/bin/vpn -s connect VPN_HOST > /tmp/vpn.out
Then, in your Python script, you can poll that file until you read the expected content.
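For example, a minimal polling loop for that approach might look like the sketch below (the file path and the expected string are taken from the snippets above; the timeout value is an assumption):

import time

def wait_for_vpn_output(path='/tmp/vpn.out', expected='state: Connected', timeout=60):
    # Poll the redirected output file until the expected text shows up
    # or the timeout expires.
    deadline = time.time() + timeout
    while time.time() < deadline:
        try:
            with open(path) as f:
                if expected in f.read():
                    return True
        except FileNotFoundError:
            pass  # the file may not have been created yet
        time.sleep(1)
    return False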

How to skip lines when printing output from Paramiko SSH

So I built a program that prints the login logs of my Ubuntu server using tail -f.
The program uses Paramiko to connect via SSH and runs the command to tail the logs.
The program works, but it also prints the MOTD from the server, which is unnecessary.
I've tried slicing the output using itertools.
I've tried using next().
It still doesn't work.
Here's my code:
import yaml, paramiko, getpass, traceback, time, itertools
from paramiko_expect import SSHClientInteraction

with open("config.yaml", "r") as yamlfile:
    cfg = yaml.load(yamlfile, Loader=yaml.FullLoader)

def main():
    command = "sudo tail -f /var/log/auth.log"
    try:
        ssh = paramiko.SSHClient()
        ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        server_pw = getpass.getpass("Enter the password for your account %s on %s:" % (cfg['ssh_config']['username'], cfg['ssh_config']['host']))
        sudo_pw = getpass.getpass("Enter the sudo password for %s on %s: " % (cfg['ssh_config']['username'], cfg['ssh_config']['host']))
        ssh.connect(hostname=cfg['ssh_config']['host'], username=cfg['ssh_config']['username'], port=cfg['ssh_config']['port'], password=server_pw)
        interact = SSHClientInteraction(ssh, timeout=10, display=False)
        interact.send(command)
        interact.send(sudo_pw + "\n")
        with open(interact.tail(line_prefix=cfg['ssh_config']['servername'] + ': ')) as tail:
            for line in itertools.islice(tail, 17, None):
                print(line)
    except KeyboardInterrupt:
        print('Ctrl+C interruption detected, stopping tail')
    except Exception:
        traceback.print_exc()
    finally:
        try:
            ssh.close()
        except:
            pass

if __name__ == '__main__':
    main()
You get MOTD because you are opening an interactive shell session. I do not think you need that, quite on the contrary.
Use SSHClient.exec_command instead:
stdin, stdout, stderr = ssh.exec_command(command, get_pty=True)
stdin.write(sudo_pw + "\n")
stdin.flush()
for line in iter(stdout.readline, ""):
    print(line, end="")
Related questions:
Get output from a Paramiko SSH exec_command continuously
Pass input/variables to command/script over SSH using Python Paramiko
What is the difference between exec_command and send with invoke_shell() on Paramiko?
Obligatory warning: Do not use AutoAddPolicy – You are losing a protection against MITM attacks by doing so. For a correct solution, see Paramiko "Unknown Server".

How to run commands on a REMOTE machine with the subprocess approach using passwordless authentication in python3?

I am trying to write a script that performs some commands (defined by payload) on an SSH-enabled remote computer. I want a passwordless connection, so I can use public and private key authentication. I know how to do it with Paramiko, and that works. Is there any way to do it with subprocess and get the output? Is there any sample code for that?
My sample code is something like this. For example, I want to execute more connections later on.
import subprocess

def __init__(type, options):
    if type == "ssh":
        ssh(options)
    elif type == "fsexec":
        fsexec(options)

def ssh(self, ip, user, sshkey_file, payload):
    command = "ssh "
    prog = subprocess.call(["ssh -i sshkey_file -t user@ip 'payload'"])
    print(prog)
    print("Returncode:", prog)

def fsexec(self, ip, user, sshkey_file, payload):
    command = "ssh "
    prog = subprocess.call(["fsexec -t user@ip 'payload'"])
    print(prog)
    print("Returncode:", prog)
You should use the Paramiko library to log in with SSH and the key file.
Here is an example adapted from a gist (https://gist.github.com/batok/2352501):
import paramiko

k = paramiko.RSAKey.from_private_key_file("/Users/whatever/Downloads/mykey.pem")
c = paramiko.SSHClient()
c.set_missing_host_key_policy(paramiko.AutoAddPolicy())
print("connecting")
c.connect(hostname="www.acme.com", username="ubuntu", pkey=k)
print("connected")
commands = ["/home/ubuntu/firstscript.sh", "/home/ubuntu/secondscript.sh"]
for command in commands:
    print("Executing {}".format(command))
    stdin, stdout, stderr = c.exec_command(command)
    print(stdout.read())
    print("Errors")
    print(stderr.read())
c.close()
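If you really must stick with subprocess, a rough sketch of a passwordless call could look like the following. It assumes the private key in sshkey_file is already authorized on the remote host; the helper name and example values are made up for illustration.

import subprocess

def ssh_run(ip, user, sshkey_file, payload):
    # Build the argument list explicitly so no local shell is involved.
    prog = subprocess.run(
        ['ssh', '-i', sshkey_file, '-o', 'BatchMode=yes',
         '{0}@{1}'.format(user, ip), payload],
        capture_output=True, universal_newlines=True)
    return prog.returncode, prog.stdout, prog.stderr

rc, out, err = ssh_run('192.0.2.10', 'ubuntu', '/path/to/mykey.pem', 'uname -a')
print("Returncode:", rc)
print(out)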

Paramiko can't get output for grep command

I am using the Paramiko library.
I am using shell = ssh.invoke_shell() and shell.send().
The command I am sending is cmd = """grep -i "Lost LLUS websocket" /var/log/debesys/cme.log\n""".
But I am not getting any output.
I am using shell.recv() to read the output, but I always get back an empty result. I have tested the command manually and it works fine. Does anybody have any idea how to get the output?
I'm not sure if this works exactly the way you want it, but only going off the question title, here's how I would do it if you wanted to send the grep command itself:
def run_cmd(sshClient, command):
    channel = sshClient.get_transport().open_session()
    channel.get_pty()
    channel.exec_command(command)
    out = channel.makefile().read()
    err = channel.makefile_stderr().read()
    returncode = channel.recv_exit_status()
    channel.close()  # channel is closed, but not the client
    return out, err, returncode
client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(hostname, username = user, password = pw)
out, err, rc = run_cmd(client, 'grep -i "Lost LLUS websocket" /var/log/debesys/cme.log')
# Do whatever with the output
# Run any other commands
client.close()
Of course, there are a number of other options. You could scp the file to the local machine, open it in Python, and search it there. You could even cat the file over an SSH call and search the output in Python, as sketched below.
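A rough sketch of that last alternative (client is the connected SSHClient from the example above; the log path comes from the question):

# Read the remote log over SSH and do the "grep" on the client side.
stdin, stdout, stderr = client.exec_command('cat /var/log/debesys/cme.log')
matching_lines = [line for line in stdout.read().decode().splitlines()
                  if 'lost llus websocket' in line.lower()]
for line in matching_lines:
    print(line)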
