java command not found with Paramiko - Python

I'm trying to run a .sh file using Paramiko, with this code:
import paramiko
cmd = "cd path ; ./file.sh"
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(server, username=username, password=password)
stdin, stdout, stderr = ssh.exec_command(cmd)
print(stdout.readlines())
ssh.close()
but I got this error:
java: command not found
file.sh passes parameters to loadtestrunner.sh, and the error refers to a line in loadtestrunner.sh, which is:
java $JAVA_OPTS -cp $SOAPUI_CLASSPATH com.eviware.soapui.SoapUIProLoadTestRunner "$@"
Java is installed on the server, and loadtestrunner.sh runs successfully when started directly on the server.

You may have a shell profile that adds Java to PATH. If not, create one. Mine is ~/.bash_profile and it contains the line export PATH=${PATH}:/usr/java/jdk1.6.0_21/bin.
Simply prepend source ~/.bash_profile followed by a newline (or a semicolon) to your command and you should be able to use java from Paramiko's SSH session. (Note that your profile may be ~/.bashrc or a non-bash alternative.)
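For example, applied to the question's code (a sketch; the exact profile file depends on your shell setup):
cmd = "source ~/.bash_profile ; cd path ; ./file.sh"
stdin, stdout, stderr = ssh.exec_command(cmd)
print(stdout.readlines())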


Execute local script on remote server using non-default shell with Python Paramiko

I am trying to run my local bash script on a remote server without copying it to the remote server. For test purposes it is as simple as the following. There are more than a few servers where it runs perfectly, but on some servers running tcsh there is an issue. How do I invoke bash if the following does not work? Below is the dummy test.sh:
#!/bin/bash
a=test
echo $a
echo $SHELL
I am using Python Paramiko's exec_command for remote execution, as follows:
my_script = open("test.sh").read()
stdin, stdout, stderr = ssh.exec_command(my_script, timeout=15)
print(stdout.read().decode())
err = stderr.read().decode()
if err:
    print(err)
The connection works, and the same script works on other servers where bash is the default shell.
This is the output that I get:
/bin/tcsh
printing from errors
a=test: Command not found.
a: Undefined variable.
The #!/bin/bash shebang is just a comment; sending the script's text to a remote shell as a command has no effect on which shell interprets it, so here it runs under tcsh.
You have to execute /bin/bash on the server and send your script to it:
stdin, stdout, stderr = ssh.exec_command("/bin/bash", timeout=15)
stdin.write(my_script)
Also, you have to exit the shell at the end of your script, otherwise it will never end.
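Putting it together, a minimal sketch (assuming ssh is the already connected SSHClient and my_script holds the script text, as in the question):
stdin, stdout, stderr = ssh.exec_command("/bin/bash", timeout=15)
stdin.write(my_script)
stdin.write("\nexit\n")  # make the remote bash terminate
stdin.flush()
print(stdout.read().decode())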
Related question:
Pass arguments to a bash script stored locally and needs to be executed on a remote machine using Python Paramiko

SCP through Python is not transferring the file

I have two Raspberry Pis. I am trying to transfer files from one Pi to the other using scp. I am trying to do this through Python because the program that will be transferring files is a Python file.
Below is the shell script I have for the SCP part (password and IP blurred out):
#!/bin/sh
sshpass -p ######## scp test.txt pi@IP:/home/pi
And below is the Python script that launches that shell script.
import subprocess
subprocess.call(['./ssh.sh'])
print("DONE")
For some reason the Python script doesn't kick back any errors and hits the print line, but the file is not transferred. When I run the scp command outside of Python, the file transfers just fine. Am I doing something incorrect here?
****EDIT****
I can't even get subprocess to work with this, which is why I ended up using a shell script. Here is my attempt with subprocess:
import subprocess
subprocess.call("sshpass -p ######## scp test.txt pi#IP:/home/pi")
print"DONE"
Again I get no errors, but the file is not transferred
****EDIT #2****
So I found out that because sshpass is being used, scp isn't prompting me to add the IP to known hosts; as a result the file simply isn't transferred at all. I need a way to add this acceptance into the script, i.e. I get the following if I launch the command without sshpass:
The authenticity of host 'IP (IP)' can't be established.
ECDSA key fingerprint is 13:91:24:8e:6f:21:98:1f:5b:3a:c8:42:7a:88:e9:91.
Are you sure you want to continue connecting (yes/no)?
I want to pass "yes\n" to this prompt, as well as the password afterwards. Is this possible?
For the first query
You can use subprocess.Popen to get the output (stdout) and error (stderr) of the executed command.
import subprocess
cmd = 'sshpass -p ****** scp dinesh.txt root@256.219.210.135:/root'
p = subprocess.Popen(cmd.split(), stdout=subprocess.PIPE, stderr=subprocess.PIPE)
out, err = p.communicate()
print "Output is ",out
print "Error is ",err
If you execute the above code with a wrong password, then you will get the output below:
[root@centos /]# python code.py
Output is
Error is Permission denied, please try again.
In this case, if the file is successfully transferred, then there is no output.
If you execute a command like 'ls -l', then its output will be printed.
For your second query (****EDIT #2****)
Options are:
Passwordless SSH using SSH keys.
Pexpect (see the sketch after this list).
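A minimal Pexpect sketch (an illustration only, reusing the placeholder address and blurred-out password from the question):
import pexpect

child = pexpect.spawn("scp test.txt pi@IP:/home/pi")
i = child.expect(["yes/no", "password:"])
if i == 0:  # first connection: accept the host key
    child.sendline("yes")
    child.expect("password:")
child.sendline("########")  # the password blurred out in the question
child.expect(pexpect.EOF)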
I found a much easier way of tackling all of this:
sshpass -p ###### scp -o StrictHostKeyChecking=no test.txt pi@IP:/home/pi
The -o switch makes scp store the host key in known_hosts automatically, so I do not need to interact with the prompt at all. The Python-to-shell approach works with that addition; doing this solely through subprocess also works.
If you don't mind trying other approaches, it is worth using SCPClient from the scp module.
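For example, a minimal sketch using Paramiko with SCPClient (assumes the third-party scp package is installed via pip install scp; the host and credentials are the question's placeholders):
from paramiko import SSHClient, AutoAddPolicy
from scp import SCPClient

ssh = SSHClient()
ssh.set_missing_host_key_policy(AutoAddPolicy())  # same effect as StrictHostKeyChecking=no
ssh.connect("IP", username="pi", password="########")
with SCPClient(ssh.get_transport()) as scp:
    scp.put("test.txt", "/home/pi")  # copy the local file to the Pi
ssh.close()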

execute a program on a remote machine python

I can execute a script locally from the Python environment using subprocess, but due to cross-platform issues I have to execute it on a remote server and get the results back on my local machine.
The directory parserpath contains some third-party modules that can be executed using a script run.sh present in the parserpath directory. However, this parserpath directory is present on a remote server.
This is what I have, but this will work only if parserpath is a local directory. How can I ssh to a remote directory and run the script run.sh?
import os
import subprocess

def run_parser(filename):
    current_dir = os.getcwd()
    parser_path = "/parserpath"
    os.chdir(parser_path)
    subprocess.call("./run.sh " + filename, shell=True)
    os.chdir(current_dir)
With most Linux shells, you can run a command in a different working directory by executing a subshell, as in:
/home/usr> (cd /usr/local/bin;pwd)
/usr/local/bin
/home/usr>
You can do the same thing through SSH to the remote system. Depending on which SSH client you use, you may be able to trim that up a bit. For instance, with Paramiko's exec_command, a new remote shell is created for each command, so cd /path/on/remote/machine; ./run.sh is sufficient.
A minimalist example for Paramiko on Python 2.x is:
import sys
import paramiko

try:
    hostname, username, password, targetpath = sys.argv[1:5]
except ValueError:
    print("Failed, call with hostname username password targetpath")
    sys.exit(1)  # without this, the names below would be undefined

command = "cd {};pwd".format(targetpath)
print("Command to send: {}".format(command))
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(hostname=hostname, username=username, password=password)
stdin, stdout, stderr = ssh.exec_command(command)
print(stdout.read())
ssh.close()
Python 3 should be similar. There are other options, like the libssh2 bindings for Python, Pexpect's SSH support, and so on.
Use SSH keys to automate the process of logging in via SSH. Here is code to execute a script remotely:
# HOST and COMMAND are placeholders for the remote host and the command to run
ssh = subprocess.Popen(["ssh", "%s" % HOST, COMMAND],
                       shell=False,
                       stdout=subprocess.PIPE,
                       stderr=subprocess.PIPE)
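To actually collect the remote command's output (an addition for completeness, not part of the original answer), read from the pipes:
out, err = ssh.communicate()
print(out.decode())
if err:
    print(err.decode())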
Try: ssh user@host sh path/run.sh

How to execute remote pysftp commands under a specific shell

I use the Python module pysftp to connect to a remote server. Below you can see the Python code:
import pysftp
import sys
import sqr_common

srv = pysftp.Connection(host="xxxxxx", username="xxxx",
                        password="xxxxx")
command = "/usr/bin/bash"
command2 = "APSHOME=/all/aps/msc_2012; export APSHOME; "
srv.execute(command)
srv.execute(command2)
srv.close()
The problem is that the command /usr/bin/bash starts an interactive process that never exits, so my second command will never be executed. Can anyone tell me how to choose a shell on the remote server (for example bash) and execute a command in bash there? Is there any pysftp function that allows me to choose the shell?
Try this:
/usr/bin/bash -c "APSHOME=/all/aps/msc_2012; export APSHOME; "
This problem is not specific to Python; it is more about how to execute commands under a specific shell.
If you need to run only a single command, you can run it using bash's -c switch:
bash -c "echo 123"
You can run multiple commands, separated by ;:
bash -c "echo 123 ; echo 246"
If you need to run many commands under a specific shell, create a shell script file (a .bash file) on the remote machine and execute it:
bash myscript.bash
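Applied to the question's pysftp code, a minimal sketch (the echo of $APSHOME is an assumption, added only to show that the variable was set):
import pysftp

srv = pysftp.Connection(host="xxxxxx", username="xxxx", password="xxxxx")
# Single quotes keep $APSHOME from being expanded by the remote login shell (tcsh here);
# bash -c runs the commands and exits instead of starting an interactive shell.
output = srv.execute("/usr/bin/bash -c 'APSHOME=/all/aps/msc_2012; export APSHOME; echo $APSHOME'")
print(output)
srv.close()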
