Run bash script from Windows PC on Linux machine in Python [duplicate]

I am using the Python paramiko module to run the built-in paramiko function SSH.execute on a remote server. I want to run a script on the server which requires responding to 4 prompts. I was planning to do a more complex version of this:
ExpectedString = 'ExpectedOutput'
Output = SSH.execute('./runScript')
if Output == ExpectedString:
    SSH.execute('Enter this')
else:
    raise SomeException
The problem is that nothing comes back for Output, because the server is waiting for a number to be entered, and the script gets stuck at this SSH.execute call. So even if another SSH.execute command is run after it, it never gets run! Should I be looking to use something other than paramiko?

You need to interact with the remote script. Actually, SSH.execute doesn't exist; I assume you're talking about exec_command. Instead of just returning the output, it gives you wrappers for the stdin, stdout and stderr streams. You can use these directly to communicate with the remote script.
Basically, this is how you run a command and pass data over stdin (and receive output using stdout):
ssh.connect('127.0.0.1', username='foo', password='bar')
stdin, stdout, stderr = ssh.exec_command("some_script")
stdin.write('expected_input\n')
stdin.flush()
data = stdout.read().splitlines()
You should check for the prompts, of course, instead of relying on good timing.
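One hedged way to do that prompt check is to poll the underlying channel, accumulate the output, and answer only when a known prompt appears. The host, credentials, script name and prompt text below are all hypothetical stand-ins, not anything from the original question.

```python
# Sketch of prompt-driven interaction over exec_command's streams.
# Host, credentials, script name and prompt texts are hypothetical.

def match_prompt(buffer, responses):
    """Return the response for the first known prompt found at the
    end of `buffer`, or None if no known prompt has appeared yet."""
    for prompt, response in responses.items():
        if buffer.rstrip().endswith(prompt):
            return response
    return None

def run_script_with_prompts(host, user, password, responses):
    import paramiko  # deferred so match_prompt() stays stdlib-only

    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect(host, username=user, password=password)
    stdin, stdout, stderr = ssh.exec_command('./runScript')

    buffer = ''
    while not stdout.channel.exit_status_ready():
        if stdout.channel.recv_ready():
            buffer += stdout.channel.recv(1024).decode()
            answer = match_prompt(buffer, responses)
            if answer is not None:
                stdin.write(answer + '\n')
                stdin.flush()
                buffer = ''  # reset so the next prompt can match
    ssh.close()

# Example (hypothetical host/credentials):
# run_script_with_prompts('example.com', 'foo', 'bar',
#                         {'Enter a number:': '42'})
```

This avoids relying on timing: nothing is written to stdin until the expected prompt text has actually arrived on the channel.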

@leoluk - yep, I understand your problem; both those recommended solutions won't work. The problem with exec_command, as you said, is that you can only read the output once the command completes. So, if you wanted to remotely run the command rm -i *, you wouldn't be able to read which file is to be deleted before responding with a "yes" or a "no". The key here is to use invoke_shell. This YouTube link helped and got me going: https://www.youtube.com/watch?v=lLKdxIu3-A4
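To illustrate the invoke_shell idea with that rm -i * example, here is a hedged sketch: the shell channel can be read before the command finishes, so each confirmation question can be answered as it appears. The host, credentials and the crude one-second sleep are placeholder choices, not a definitive recipe.

```python
# Sketch of the invoke_shell approach: one interactive channel that can
# be read mid-command. Host and credentials are hypothetical.
import time

def is_confirmation_prompt(text):
    """True if the remote output ends with an rm -i style question,
    e.g. "rm: remove regular file 'a.txt'?"."""
    return text.rstrip().endswith('?')

def interactive_rm(host, user, password):
    import paramiko  # deferred so the helper above stays stdlib-only

    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect(host, username=user, password=password)
    shell = ssh.invoke_shell()

    shell.send('rm -i *\n')
    time.sleep(1)  # crude; poll recv_ready() in real code
    output = ''
    while shell.recv_ready():
        output += shell.recv(4096).decode()
    if is_confirmation_prompt(output):
        shell.send('y\n')  # confirm deletion of the named file
    ssh.close()
```

Unlike exec_command, the send/recv pair here works against one live shell session, so the question from rm is visible before the command has exited.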

Related

Run remote long running command with Paramiko, feed it with some input and put it to the background

I have a long running program on my remote server, which I want to start with Paramiko.
I know how to start a program in the background (with nohup for example) but my program first needs a few user inputs. In a normal SSH session I would pass those interactively over the terminal and then detach (Ctrl+Z, bg, detach) but I can't do this with Paramiko.
Here is what I've been trying so far:
stdin, stdout, stderr = ssh.exec_command('MyCommand', get_pty=True)
stdin.channel.send(b'MyData\r')
stdin.channel.send(b'\x1A') # This is Ctrl+Z
ssh.exec_command('bg; disown')
ssh.close()
but when the SSH connection closes, the remote program also stops running. How can I send user input to a program, that I want to continue running in the background?
You are currently executing MyCommand and bg; disown in two separate shell instances. That's why the bg; disown has no effect on MyCommand.
If you really want to emulate the interactive shell features this way, then you need to execute both commands in one real shell instance. For that you will need to use SSHClient.invoke_shell and not SSHClient.exec_command.
Though in general, that's not a good idea. See also:
What is the difference between exec_command and send with invoke_shell() on Paramiko?
If the program is yours, modify it to allow getting input from command-line.
If you cannot modify it, use shell constructs to provide the input, like:
ssh.exec_command('nohup echo MyData | MyCommand >/dev/null 2>&1 &')
You may need to add quotes around the command; I'm not sure what the "operator precedence" is here.
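One way to settle that quoting question is to wrap the pipeline in sh -c '...', so that nohup and the redirections unambiguously apply to the whole pipeline rather than just echo. A small sketch, using the hypothetical MyCommand/MyData names from the question:

```python
# Build the remote one-liner with explicit quoting. shlex.quote guards
# against spaces or quotes in the input line and the wrapped pipeline.
import shlex

def background_command(command, input_line):
    """Build a one-shot remote command that feeds `input_line` to
    `command` and leaves it running after the SSH session closes."""
    pipeline = 'echo {} | {}'.format(shlex.quote(input_line), command)
    return 'nohup sh -c {} >/dev/null 2>&1 &'.format(shlex.quote(pipeline))

# e.g. ssh.exec_command(background_command('MyCommand', 'MyData'))
```

For the example above this produces `nohup sh -c 'echo MyData | MyCommand' >/dev/null 2>&1 &`, which runs in a single exec_command call with no interactive shell needed.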

Is there a way to send (python) code to a server to compile and run but display the results on the computer that sent them?

I just bought a server and was wondering if there was a way to run the code remotely but store/display the results locally. For example, I write some code to display a graph, the positions on the graph are computed by the (remote) server, but the graph is displayed on the (local) tablet.
I would like to do this because the tablet I carry around with me on a day-to-day basis is very slow for computational physics simulations. I understand that I can setup some kind of communications protocol that allows the server to compute things and then sends the computations to my tablet for a script on my tablet to handle the data. However, I would like to avoid writing a possibly new set of communications scripts (to handle different formats of data) every single time I run a new simulation.
This is a complete "The Russians used a pencil" solution, but have you considered running a VNC server on the machine that is doing the computations?
You could install a VNC client onto your tablet/phone/PC and view it that way, there are tons of them available. No need to go about creating anything from scratch.
With ssh, you can do this with a python script or a shell script.
ssh machine_name "python" < ~/script/path/script.py
Since the OP indicated in the comments that he wants to interact with the script on the remote machine, I have made some changes here.
Copy the python or shell script to the remote machine. This can be done in several ways. For example with scp. But also, with ssh, like here:
ssh machine_name bash -c "cat > /tmp/script.py" < ~/script/path/script.py
Interact with the script on the remote machine
ssh machine_name python -u /tmp/script.py
You should be able to interact with your script running in the remote machine now!
Notice the use of -u to set stdin/stdout of python in unbuffered mode. This is needed to be able to interact with the script.
-u  Force stdin, stdout and stderr to be totally unbuffered. On systems where it matters, also put stdin, stdout and stderr in binary mode. Note that there is internal buffering in xreadlines(), readlines() and file-object iterators ("for line in sys.stdin") which is not influenced by this option. To work around this, you will want to use "sys.stdin.readline()" inside a "while 1:" loop.
Here is an example.
The code, which was copied to the server:
#!/usr/bin/env python3
while True:
    value = input("Please enter the value: ")
    if value != "bye":
        print("Input received from the user is: ", value)
    else:
        print("Good bye!!")
        break
Interactive session:
$ ssh machine_name python -u python/pyecho.py
Please enter the value: 123
Input received from the user is: 123
Please enter the value: bye
Good bye!!
REF:
https://unix.stackexchange.com/questions/87405/how-can-i-execute-local-script-on-remote-machine-and-include-arguments

Python Thread Breaking Terminal

Hello minds of stackoverflow,
I've run into a perplexing bug. I have a Python script that creates a new thread that ssh's into a remote machine and starts a process. However, this process does not return on its own (and I want it to keep running throughout the duration of my script). In order to force the thread to return, at the end of my script I ssh into the machine again and kill -9 the process. This is working well, except for the fact that it breaks the terminal.
To start the thread I run the following code:
t = threading.Thread(target=run_vUE_rfal, args=(vAP.IP, vUE.IP))
t.start()
The function run_vUE_rfal is as follows:
cmd = "sudo ssh -ti ~/.ssh/my_key.pem user@%s 'sudo /opt/company_name/rfal/bin/vUE-rfal -l 3 -m -d %s -i %s'" % (vUE_IP, vAP_IP, vUE_IP)
output = commands.getstatusoutput(cmd)  # commands module is Python 2 only
return
It seems that when the command is run, it somehow breaks my terminal. It is broken in that instead of creating a new line for each print, it appends the WIDTH of my terminal in whitespace to the end of each line and prints it as seemingly one long string. Also, I am unable to see my keyboard input in that terminal, but it is still successfully read. My terminal looks something like this:
normal formatted output
normal formatted output
running vUE-rfal
print1
print2
print3_extra_long
print4
If I replace the body of the run_vUE_rfal function with some simple prints, the terminal does not break. I have many other ssh's and telnets in this script that work fine. However, this is the only one I'm running in a separate thread as it is the only one that does not return. I need to maintain the ability to close the process of the remote machine when my script is finished.
Any explanations to the cause and idea for a fix are much appreciated.
Thanks in advance.
It seems the process you control is changing terminal settings. These changes bypass stderr and stdout, for good reasons. E.g. ssh itself needs this to ask users for passwords even when its output is being redirected.
A way to solve this could be to use the Python module pexpect (a third-party library) to launch your process, as it will create its own fake tty that you don't care about.
BTW, to "repair" your terminal, use the reset command. As you already noticed, you can still enter commands. reset will restore the terminal to its default settings.
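A minimal pexpect sketch along those lines, assuming the key path, user, host and remote binary from the question are stand-ins: the remote process then manipulates pexpect's pseudo-tty rather than the terminal the script was launched from.

```python
# Run the ssh command under pexpect so terminal-mode changes land on
# pexpect's own pty. Paths, host and binary name are hypothetical.

def build_ssh_command(key, user, host, remote_cmd):
    """Assemble the ssh invocation as one string for pexpect.spawn."""
    return "ssh -t -i {} {}@{} '{}'".format(key, user, host, remote_cmd)

def run_in_pty(command):
    import pexpect  # third-party; deferred so the helper stays stdlib-only

    child = pexpect.spawn(command, timeout=None)
    child.expect(pexpect.EOF)      # wait for the remote process to exit
    return child.before.decode()   # everything the process printed

# e.g., inside the thread:
# run_in_pty(build_ssh_command('~/.ssh/my_key.pem', 'user', vUE_IP,
#                              'sudo /opt/company_name/rfal/bin/vUE-rfal ...'))
```

Because spawn allocates its own pty, the kill -9 at the end of the script no longer leaves the calling terminal in a raw state.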

Writing and reading stdout unbuffered to a file over SSH

I'm using Node to execute a Python script. The Python script SSH's into a server, and then runs a Pig job. I want to be able to get the standard out from the Pig job, and display it in the browser.
I'm using the Pexpect library to make the SSH calls, but this will not print the output of the Pig call until it has totally completed (at least the way I have it written). Any tips on how to restructure it?
child.sendline(command)
child.expect(COMMAND_PROMPT)
print(child.before)
I know I shouldn't be expecting the command prompt (cause that will only show up when the process ends), but I'm not sure what I should be expecting.
Repeating my comment as an answer, since it solved the issue:
If you set child.logfile_read to a writable file-like object (e.g. sys.stdout), Pexpect will then forward the output there as it reads it.
child.logfile_read = sys.stdout
child.sendline(command)
child.expect(COMMAND_PROMPT)
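Since logfile_read accepts any object with write() and flush(), a small tee can stream the output live and keep a copy for the browser at the same time. A sketch, where the spawn call and prompt are hypothetical:

```python
# Tee pexpect's read log to several file-like targets at once: stream
# live to stdout while also capturing a copy in a StringIO buffer.
import io
import sys

class Tee:
    """Forward everything written to all of the given file-like targets."""
    def __init__(self, *targets):
        self.targets = targets

    def write(self, data):
        for t in self.targets:
            t.write(data)

    def flush(self):
        for t in self.targets:
            t.flush()

captured = io.StringIO()
tee = Tee(sys.stdout, captured)

# Hypothetical usage with pexpect:
# child = pexpect.spawn('ssh user@host', encoding='utf-8')
# child.logfile_read = tee          # live output + captured copy
# child.sendline(command)
# child.expect(COMMAND_PROMPT)
# pig_output = captured.getvalue()  # e.g. hand this to Node for the browser
```

The captured buffer then holds everything the Pig job printed, without blocking the live stream.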

Read remote output and respond using paramiko (SSH.execute)

