Remote sh script executed in Python (Paramiko) never ends [duplicate]

I've got a Python program sitting on a remote server which uploads a file to an AWS bucket when run. If I ssh onto the server and run it with the command sudo python3 /path/to/backup.py, it works as expected.
I'm writing a Python program to automate a bigger process which includes running backup.py. I created a function to do this using the paramiko library. This is where the command gets run:
ssh_stdin, ssh_stdout, ssh_stderr = self.ssh.exec_command('sudo python3 /path/to/backup.py', 1800)
logging.debug(f'ssh_stdout: {ssh_stdout.readline()}')
logging.debug(f'ssh_stderr: {ssh_stderr.readline()}')
My automation gives me this output:
ssh_stdout: Tue, 19 May 2020 14:36:43 INFO The COS endpoint is 9.11.200.206, writing to vault: SD_BACKUP_4058
The program doesn't do anything after that. When I log onto the server and check the logs of backup.py, I can see that it is still running and seems to be sitting at the file upload. This is the code it's stuck at:
s3_client.upload_file(
    Filename=BACKUP,
    Bucket=BUCKET_NAME,
    Key=SPLIT_FILE_NAME,
    Callback=pp(BACKUP),
    Config=config)
I can't understand why it gets stuck here when started by my automation program, but not when I run it from a command line in the terminal. I can't see anything in the logs that helps me; it just seems to be stuck at that point in its execution. Could it be something to do with the callback not getting returned?

You read only one line of the output.
logging.debug(f'ssh_stdout: {ssh_stdout.readline()}')
If the remote program produces a lot of output, then as soon as its output buffer fills up, the program hangs on the next attempt to write output.
If you want the program to finish, you have to keep reading the output.
The simplest way is to use readlines or read:
print(stdout.read())
But that's inefficient for large outputs like yours.
Instead you can read the output line by line:
for line in stdout:
    print(line.strip())
It gets more complicated when the command also produces error output, as then you have to read both output streams.
See Paramiko ssh die/hang with big output.
And you should check the error output in any case, for good error handling. See also:
Command executed with Paramiko does not produce any output
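Putting this together, here is a minimal sketch that drains both streams until the command exits, assuming self.ssh is the connected paramiko.SSHClient from the question. Note that in the question, 1800 is passed as the second positional argument of exec_command, which Paramiko interprets as bufsize; a timeout has to be passed by keyword.
import logging
import time

ssh_stdin, ssh_stdout, ssh_stderr = self.ssh.exec_command(
    'sudo python3 /path/to/backup.py', timeout=1800)

channel = ssh_stdout.channel  # stdout and stderr share one underlying channel

# Keep draining both streams so the remote process never blocks
# on a full output buffer.
while not channel.exit_status_ready():
    got_data = False
    if channel.recv_ready():
        logging.debug('ssh_stdout: %s',
                      channel.recv(4096).decode(errors='replace'))
        got_data = True
    if channel.recv_stderr_ready():
        logging.debug('ssh_stderr: %s',
                      channel.recv_stderr(4096).decode(errors='replace'))
        got_data = True
    if not got_data:
        time.sleep(0.1)  # avoid a busy loop while waiting for output

# Drain anything still buffered after the command has exited.
logging.debug('remaining stdout: %s', ssh_stdout.read().decode(errors='replace'))
logging.debug('remaining stderr: %s', ssh_stderr.read().decode(errors='replace'))
logging.debug('exit status: %d', channel.recv_exit_status())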

Related

No "nohup" python command results

I'm developing a real-time website. It's a map where the color of each city changes based on the current emotion.
I have the Python part, which is connected to my database.
Whenever I run the Python code, a new record is added to the database. It's streaming code, so it never ends.
The command that suits my Python code is nohup, since I want it to keep running.
I'm using Bluehost as the hosting server (VPS package).
I opened my SSH command line and ran the command:
So this means it's working? It created an out file, but no record is added to the database!
What's the problem?
Thank you
The line Exit 2 means that there is a problem. You'll find a description in nohup.out (see the message that says ignoring input and appending output to nohup.out).
To be a bit clearer: the line that says Exit ... means the process called through nohup has terminated. The integer exit code generally has meaning, but you need to look at the actual nohup.out file before you'll learn anything.
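For example, a tiny sketch like this, run from the directory where the command was launched and assuming the default nohup.out name, prints the tail of the log, which is usually where the Python traceback ends up:
# Print the last 20 lines of nohup.out, where the error message or
# traceback from the crashed script usually ends up.
with open('nohup.out') as f:
    for line in f.readlines()[-20:]:
        print(line.rstrip())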

Python Thread Breaking Terminal

Hello minds of stackoverflow,
I've run into a perplexing bug. I have a Python script that creates a new thread which ssh's into a remote machine and starts a process. However, this process does not return on its own (and I want it to keep running throughout the duration of my script). In order to force the thread to return, at the end of my script I ssh into the machine again and kill -9 the process. This is working well, except for the fact that it breaks the terminal.
To start the thread I run the following code:
t = threading.Thread(target=run_vUE_rfal, args=(vAP.IP, vUE.IP))
t.start()
The function run_vUE_rfal is as follows:
cmd = "sudo ssh -ti ~/.ssh/my_key.pem user@%s 'sudo /opt/company_name/rfal/bin/vUE-rfal -l 3 -m -d %s -i %s'" % (vUE_IP, vAP_IP, vUE_IP)
output = commands.getstatusoutput(cmd)
return
It seems that when the command is run, it somehow breaks my terminal. It is broken in that instead of creating a new line for each print, it appends the width of my terminal in whitespace to the end of each line and prints it as seemingly one long string. Also, I am unable to see my keyboard input in that terminal, but it is still successfully read. My terminal looks something like this:
normal formatted output
normal formatted output
running vUE-rfal
print1
print2
print3_extra_long
print4
If I replace the body of the run_vUE_rfal function with some simple prints, the terminal does not break. I have many other ssh's and telnets in this script that work fine. However, this is the only one I'm running in a separate thread, as it is the only one that does not return. I need to maintain the ability to kill the process on the remote machine when my script is finished.
Any explanations to the cause and idea for a fix are much appreciated.
Thanks in advance.
It seems the process you control is changing terminal settings. These changes bypass stdout and stderr, for good reasons: e.g., ssh itself needs this to ask users for passwords even when its output is being redirected.
A way to solve this could be to use the Python module pexpect (it's a third-party library) to launch your process, as it will create its own fake tty that you don't care about.
BTW, to "repair" your terminal, use the reset command. As you already noticed, you can still enter commands; reset will restore the terminal to its default settings.
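For instance, a minimal sketch with pexpect, reusing the command and the vUE_IP/vAP_IP variables from the question; pexpect allocates its own pseudo-terminal, so whatever terminal settings the remote process changes never touch yours:
import pexpect

cmd = ("sudo ssh -ti ~/.ssh/my_key.pem user@%s "
       "'sudo /opt/company_name/rfal/bin/vUE-rfal -l 3 -m -d %s -i %s'"
       % (vUE_IP, vAP_IP, vUE_IP))

# Run the command under a pseudo-terminal owned by pexpect, not by us.
child = pexpect.spawn('/bin/bash', ['-c', cmd], timeout=None)

child.expect(pexpect.EOF)                      # run until the process exits
print(child.before.decode(errors='replace'))   # everything it printed
child.close()
print('exit status:', child.exitstatus)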

Read remote output and respond using paramiko (SSH.execute)

I am using the Python paramiko module to run a built-in paramiko function SSH.execute on a remote server. I want to run a script on the server which will require 4 prompts. I was planning to do a more complex version of this:
ExpectedString = 'ExpectedOutput'
Output = SSH.execute('./runScript')
if Output == ExpectedString:
    SSH.execute('Enter this')
else:
    raise SomeException
The problem is that nothing comes back for Output, as the server is waiting for a number to be entered, and the script gets stuck at this SSH.execute command. So even if another SSH.execute command is run after it, it never gets run! Should I be looking to use something other than paramiko?
You need to interact with the remote script. Actually, SSH.execute doesn't exist; I assume you're talking about exec_command. Instead of just returning the output, it gives you wrappers for the stdin, stdout, and stderr streams, and you can use these directly to communicate with the remote script.
Basically, this is how you run a command and pass data over stdin (and receive output using stdout):
ssh.connect('127.0.0.1', username='foo', password='bar')
stdin, stdout, stderr = ssh.exec_command("some_script")
stdin.write('expected_input\n')
stdin.flush()
data = stdout.read().splitlines()
You should check for the prompts, of course, instead of relying on good timing.
@leoluk - yep, I understand your problem; both of those recommended solutions won't work. The problem with exec_command, as you said, is that you can only read the output once the command completes. So if you wanted to remotely run the command rm -i *, you wouldn't be able to read which file is to be deleted before responding with a "yes" or a "no". The key here is to use invoke_shell. See this YouTube link - https://www.youtube.com/watch?v=lLKdxIu3-A4 - it helped and got me going.
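For the record, a minimal invoke_shell sketch along those lines, assuming an already-connected paramiko.SSHClient named ssh and reusing the hypothetical prompt strings from the question:
import time

channel = ssh.invoke_shell()

def wait_for(chan, text, timeout=30):
    # Read from the interactive shell until `text` shows up in the output.
    buf = ''
    deadline = time.time() + timeout
    while text not in buf:
        if chan.recv_ready():
            buf += chan.recv(4096).decode(errors='replace')
        elif time.time() > deadline:
            raise TimeoutError('never saw %r, got %r so far' % (text, buf))
        else:
            time.sleep(0.1)
    return buf

channel.send(b'./runScript\n')
wait_for(channel, 'ExpectedOutput')   # wait for the first prompt
channel.send(b'Enter this\n')         # answer it before the script finishes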
