I want to send text files via SSH to two servers. My servers have the same name and IP but different ports.
I can do it with one server, but not with two. How do I do this? (Normally there should be a port after -p.)
import subprocess

with open("hosts_and_ports.txt") as hp_fh:
    hp_contents = hp_fh.readlines()

for hp_pair in hp_contents:
    with open("commands.txt") as fh:
        completed = subprocess.run("ssh ubuntussh@127.0.0.1 -p ",
                                   capture_output=True, text=True,
                                   shell=True, stdin=hp_pair)
My text file hosts_and_ports.txt contains the ports of my servers:
2222;
2224;
exit;
My text file commands.txt contains the commands I want to run via SSH:
touch demofile1.txt;
touch demofile2.txt;
exit;
ssh always connects to one single port. In your scenario you need to specify the port with -p 2222 or -p 2224, e.g. `ssh user@192.168.1.1 -p 2224` for one connection, and the same again with the other port for the other connection.
ssh user@192.168.1.1 -p 2224 "command1 && command2"   # executes a remote command
To send a local file (note that scp takes the port as capital -P): scp -P 2224 local_file user@192.168.1.1:/remote/directory
Your attempt obviously doesn't pass in the port number at all.
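To handle both of your servers, just loop over the two ports. A minimal sketch, assuming the user ubuntussh, localhost, and one of the demo file names from the question:

import subprocess

# Both servers share host and user; only the port differs (2222 / 2224).
for port in ["2222", "2224"]:
    # scp takes the port as capital -P (lowercase -p means "preserve times").
    subprocess.run(
        ["scp", "-P", port, "demofile1.txt", "ubuntussh@127.0.0.1:~/"],
        capture_output=True, text=True, check=True)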
As a simplification, I'll assume that you can remove the silly exit; line from both files, and just keep on reading as long as there are lines in both files. Also, trim the semicolon from the end of each line; it is simply in the way. (It's not hard to ignore in the Python program, either, but why put such chaff in the file in the first place?)
import subprocess

with open("commands.txt") as cmd:
    cmds = cmd.readlines()

with open("hosts_and_ports.txt") as hp_fh:
    for line in hp_fh:
        port = line.rstrip('\n')
        for cmd in cmds:
            completed = subprocess.run(
                ["ssh", "ubuntussh@127.0.0.1", "-p", port, cmd],
                capture_output=True, text=True, check=True)
We don't need a shell here, and we are better off without it.
Actually probably also rename the file which only contains port numbers, as its name is currently misleading.
Tangentially, touch demofile1.txt demofile2.txt will create both files with a single remote SSH command. I'm guessing you may want to add other commands to the file later on, so this runs all the commands in one file on all the servers in the other file. Generally speaking, you will want to minimize the number of remote connections, because there is a fair bit of overhead with each login; so in fact it makes more sense to send the entire commands.txt to each server in one go:
import subprocess

with open("commands.txt") as cmd:
    cmds = cmd.read()

with open("hosts_and_ports.txt") as hp_fh:
    for line in hp_fh:
        port = line.rstrip('\n')
        completed = subprocess.run(
            ["ssh", "ubuntussh@127.0.0.1", "-p", port, cmds],
            capture_output=True, text=True, check=True)
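If you'd rather keep the original files exactly as they are (semicolons, exit; lines and all), ignoring that chaff in Python is indeed not hard. A small sketch of such a filter (the helper name is mine):

def read_entries(path):
    """Yield non-empty lines with the trailing semicolon stripped,
    skipping the 'exit' sentinel lines."""
    with open(path) as fh:
        for line in fh:
            entry = line.strip().rstrip(';')
            if entry and entry != "exit":
                yield entry

# e.g. ports = list(read_entries("hosts_and_ports.txt"))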
Related
I have a script which can run on my host machine and on several other servers. I want to launch this script as a background process, both on my host machine and, via ssh, on the remote machines. The stdout/stderr of the local background process should go to the host machine, and that of the remote background tasks should stay on the remote machines.
I tried with
subprocess.check_output(['python', 'script.py', 'arg_1', ' > file.log ', ' & echo -ne $! '])
but it doesn't work. It doesn't give me the PID, nor does it write into the file. It works with shell=True, but then I read that it is not good to use shell=True for security reasons.
Then I tried
p = subprocess.Popen(['python', 'script.py', 'arg_1', ' > file.log '])
Now I can get the process PID, but the output is not written to the remote log file.
Using the stdout/stderr arguments as suggested in the question linked below opens the log file on my host machine, not on the remote machine. I want to log on the remote machine instead.
append subprocess.Popen output to file?
Could someone please suggest a single command that works both on my host machine and, via ssh, on the remote server: one that launches the background process there and writes to the output file?
<HOW_TO_GET_PID> = subprocess.<WHAT>( ([] if 'localhost' else ['ssh','<remote_server>']) + ['python', 'script.py', 'arg_1' <WHAT>] )
Could someone please finish the above pseudocode?
Thanks,
You're not going to get something that's safe and correct in a one-liner without making it unreadable; better not to try.
Note that we're using a shell here: in the local case we explicitly pass shell=True, whereas in the remote case ssh always implicitly starts a shell.
import shlex
import subprocess

def startBackgroundCommand(argv, outputFile, remoteHost=None, andGetPID=False):
    cmd_str = ' '.join(shlex.quote(word) for word in argv)
    if outputFile is not None:
        cmd_str += ' >%s' % (shlex.quote(outputFile),)
    if andGetPID:
        cmd_str += ' & echo "$!"'
    if remoteHost is not None:
        p = subprocess.Popen(['ssh', remoteHost, cmd_str], stdout=subprocess.PIPE)
    else:
        p = subprocess.Popen(cmd_str, stdout=subprocess.PIPE, shell=True)
    return p.communicate()[0]
# Run your command locally
startBackgroundCommand(['python', 'script.py', 'arg_1'],
                       outputFile='file.log', andGetPID=True)

# Or run your command remotely
startBackgroundCommand(['python', 'script.py', 'arg_1'],
                       remoteHost='foo.example.com',
                       outputFile='file.log', andGetPID=True)
# At the beginning you can even program automatic daemonizing
# using os.fork(); otherwise, you run it with something like:
#   nohup python run_my_script.py &
# This ensures that it continues running even if the SSH connection breaks.
from subprocess import Popen, PIPE, STDOUT

p = Popen(["python", "yourscript.py"], stdout=PIPE, stderr=STDOUT, stdin=PIPE)
p.stdin.close()

log = open("logfile.log", "wb")
log.write(b"PID: %i\n\n" % p.pid)
while 1:
    line = p.stdout.readline()
    if not line:
        break
    log.write(line)
    log.flush()
p.stdout.close()
log.write(b"\nExit status: %i" % p.poll())
log.close()
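Note that this logs on the host machine. If the log should be written on the remote machine instead (as the question asks), a minimal sketch, assuming the host name foo.example.com and the file names from the question, is to let the remote shell that ssh starts do the redirection:

from subprocess import Popen, PIPE

# The redirection and '&' are interpreted by the remote shell, so
# file.log is created on the remote machine; echo $! reports the
# PID of the remote background process.
p = Popen(["ssh", "foo.example.com",
           "nohup python script.py arg_1 > file.log 2>&1 & echo $!"],
          stdout=PIPE, universal_newlines=True)
remote_pid = p.communicate()[0].strip()
print("remote PID: %s" % remote_pid)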
I have a command-line program that prompts for a password:
> cwrsync root@NN.NN.NN.NN:/src /cygdrive/c/dst
Output (when I run it from the cmd.exe command line):
root@NN.NN.NN.NN's password:
When I input the password manually, everything is OK. Output:
skipping directory src
I want to provide the password to it automatically, from the command line or from a Python script.
I tried:
One. From the command line:
> echo pass|cwrsync -r root@NN.NN.NN.NN:/src /cygdrive/c/dst
Not working. Output:
root@NN.NN.NN.NN's password:
Two. From a Python script, test.py:
import subprocess
cmd = "cwrsync -r root@NN.NN.NN.NN:/src /cygdrive/c/dst"
proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE, stdin=subprocess.PIPE, shell=True)
std1, std2 = proc.communicate("pass")
print std1
print std2
Not working. Output:
Permission denied, please try again.
Permission denied, please try again.
Permission denied (publickey,password).
rsync: connection unexpectedly closed (0 bytes received so far) [Receiver]
rsync error: unexplained error (code 255) at io.c(235) [Receiver=3.1.1]
It is common for security-oriented programs to ask for the password on the terminal directly instead of reading stdin. And since
echo pass|cwrsync -r root@NN.NN.NN.NN:/src /cygdrive/c/dst
still asked for the password, I presume that cwrsync reads directly from the console.
In that case you cannot automate it without some work and low-level programming, because you would have to simulate keyboard actions. You should instead search the documentation: since it appears to use ssh underneath, it is likely to accept a public key pair. If it accepts one without a passphrase, you should be able to automate it.
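For example, a minimal sketch of the key-based route, assuming a passphrase-less key pair whose public half is installed on the server (cwrsync wraps rsync, which accepts the ssh command via -e; the key path is illustrative):

import subprocess

# With a passphrase-less key authorized on the server, no password
# prompt appears and the transfer can run unattended.
subprocess.call(["cwrsync", "-r",
                 "-e", "ssh -i /cygdrive/c/Users/me/.ssh/id_rsa",
                 "root@NN.NN.NN.NN:/src", "/cygdrive/c/dst"])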
Try sending a newline in the string you pass to communicate, like so:
import subprocess

cmd = ['cwrsync', '-r', 'root@NN.NN.NN.NN:/src', '/cygdrive/c/dst']
proc = subprocess.Popen(cmd,
                        stdout=subprocess.PIPE,
                        stderr=subprocess.PIPE,
                        stdin=subprocess.PIPE,
                        shell=True)
std1, std2 = proc.communicate("pass\r\n\r\n")
print std1
print std2
You should also see if it works with shell=False (from subprocess docs):
Using shell=True can be a security hazard. See the warning under Frequently Used Arguments for details.
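That would be the same call with shell=True simply dropped; as a side benefit, the list form of cmd then behaves as intended on POSIX systems:

# A sketch of the suggestion above: same call, no shell.
proc = subprocess.Popen(cmd,
                        stdout=subprocess.PIPE,
                        stderr=subprocess.PIPE,
                        stdin=subprocess.PIPE)
std1, std2 = proc.communicate("pass\r\n\r\n")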
I am trying to run several commands in a single ssh session and save the output of each command in a different file.
The code that works (but it saves all the output in a single file):
conn = Popen(['ssh', host, "ls;uname -a;pwd"], stdin=PIPE, stdout=open('/output.txt', 'w'))
mypassword = conn.communicate('password')
Code that I am trying to get to work, but which isn't working:
cmd = ['ls', 'pwd', 'uname']
conn = Popen(['ssh', host, "{};{};{}".format(cmd[0], cmd[1], cmd[2])], stdin=PIPE, stdout=output.append('a'))
mypassword = conn.communicate('password')
print (output)
length = range(len(output))
print length
for i in output:
    open("$i", 'w')
and
cmd = ['ls', 'pwd', 'uname']
conn = Popen(['ssh', host, "{};{};{}".format(cmd[0], cmd[1], cmd[2])], stdin=PIPE, stdout=output())
mypassword = conn.communicate('password')

def output():
    for i in cmd:
        open(i, 'w')
    return
Not sure what the best way of doing it is. Should I save the output in an array and then write each item to a separate file, or should I call a function that does it?
NOTE that the commands I want to run do not have small output like the examples given here (uname, pwd); the output is big, as with tcpdump, lsof etc.
A single ssh session runs a single command e.g., /bin/bash on the remote host -- you can pass input to that command to emulate running multiple commands in a single ssh session.
Your code won't run even a single command. There are multiple issues in it:
ssh may read the password directly from the terminal (not its stdin stream). conn.communicate('password') in your code writes to ssh's stdin, therefore ssh won't get the password.
There are multiple ways to authenticate via ssh e.g., use ssh keys (passwordless login).
stdout=output.append('a') doesn't redirect ssh's stdout, because the .append list method returns None.
It won't help you to save the output of several commands to different files anyway. You could redirect the output to remote files and copy them back later: ls >ls.out; uname -a >uname.out; pwd >pwd.out (see the sketch after this list).
A (hacky) alternative is to use markers inside the stream (echo <GUID>) to differentiate the output of different commands. If the output can be unlimited, learn how to read the subprocess' output incrementally (without calling the .communicate() method).
for i in cmd: open(i,'w') is pointless. It opens (and, on CPython, immediately closes) multiple files without using them.
To avoid such basic mistakes, write several Python scripts that operate on local files.
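A minimal sketch of the redirect-and-copy idea from the list above (the host name is illustrative, and key-based authentication is assumed so that no password prompt appears):

from subprocess import check_call

host = "user@example.com"
# One ssh session; each command's output goes to its own remote file.
check_call(["ssh", host, "ls >ls.out; uname -a >uname.out; pwd >pwd.out"])
# Copy the result files back in a single scp call.
check_call(["scp",
            "%s:ls.out" % host, "%s:uname.out" % host, "%s:pwd.out" % host,
            "."])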
import subprocess

SYS_STATS = {"Number of CPU Cores": "cat /proc/cpuinfo|grep -c 'processor'\n",
             "CPU MHz": "cat /proc/cpuinfo|grep 'cpu MHz'|head -1|awk -F':' '{print $2}'\n",
             "Memory Total": "cat /proc/meminfo|grep 'MemTotal'|awk -F':' '{print $2}'|sed 's/ //g'|grep -o '[0-9]*'\n",
             "Swap Total": "cat /proc/meminfo|grep 'SwapTotal'|awk -F':' '{print $2}'|sed 's/ //g'|grep -o '[0-9]*'\n"}

def get_system_details(ipaddress, user, key):
    values = []
    sshProcess = subprocess.Popen(
        ['ssh', '-T', '-o', 'StrictHostKeyChecking=no', '-i', key,
         '%s@%s' % (user, ipaddress), "sudo", "su"],
        stdin=subprocess.PIPE, stdout=subprocess.PIPE,
        universal_newlines=True, bufsize=0)
    # Write an END marker before each command so the outputs can be told apart.
    for command in SYS_STATS.values():
        sshProcess.stdin.write("echo END\n")
        sshProcess.stdin.write(command)
    sshProcess.stdin.close()
    # Collect every output line that is not a marker.
    for element in sshProcess.stdout:
        if element.rstrip('\n') != "END":
            values.append(element.rstrip('\n'))
    # Pair each stat name with its value (same iteration order as above).
    return {k: v for k, v in zip(SYS_STATS.keys(), values)}
Does anyone know how to make environment variables registered for
exec_command calls when using SSHClient?
I'm using a basic script that instantiates the SSHClient class, connects to another computer using the connect method, then sends out commands using the exec_command method. However, none of the environment variables seem to be registered when I try to issue commands. I can do basic things like 'ls' and see the stdout, but when trying to run installed programs, the fact that the environment variables are missing makes it impossible to run them. Using ssh in the command line to do the same thing works, as the environment variables for the user are set.
#!/usr/bin/python
import paramiko

ssh = paramiko.SSHClient()
ssh.connect('mymachine', username='myname', password='pass')
stdin, stdout, stderr = ssh.exec_command('cd /myfolder/path')
stdin, stdout, stderr = ssh.exec_command('ls')
....
....
ssh.close()
Note: I can't change the directory in paramiko, so I append the cd command to the follow-up command in a single call: ssh.exec_command('cd /dddd/ddd;ls'). I have given ls as an example, but my actual follow-up command is different.
Since release 2.1.0 (2016-12-09), you can pass a dictionary of environment variables to exec_command:
import paramiko

paramiko.util.log_to_file("paramiko.log")
ssh = paramiko.SSHClient()
k = paramiko.RSAKey.from_private_key_file("<private_key_file>")
ssh.connect(<hostname>, username=<username>, pkey=k)
env_dict = {"LC_TELEPHONE": "ET_HOME", "LC_MEASUREMENT": "MILES_APART"}
stdin, stdout, stderr = ssh.exec_command('echo $LC_TELEPHONE; echo "..."; echo $LC_MEASUREMENT', environment=env_dict)
print stdout.read()
output:
ET_HOME
...
MILES_APART
But why did I choose LC_TELEPHONE and LC_MEASUREMENT? Because those are two of the few environment variables that the target host's ssh config allows me to set:
grep AcceptEnv /etc/ssh/sshd_config
output:
AcceptEnv LANG LC_CTYPE LC_NUMERIC LC_TIME LC_COLLATE LC_MONETARY LC_MESSAGES
AcceptEnv LC_PAPER LC_NAME LC_ADDRESS LC_TELEPHONE LC_MEASUREMENT
AcceptEnv LC_IDENTIFICATION LC_ALL
In other words, this doesn't work:
env_dict = {"HELLO": "WORLD", "GOODBYE": "CRUEL_WORLD"}
stdin, stdout, stderr = ssh.exec_command("echo $HELLO; echo '...'; echo $GOODBYE", environment=env_dict)
print stdout.read()
output:
...
As the documentation warns, the environment variables are silently rejected
http://docs.paramiko.org/en/2.1/api/client.html
http://docs.paramiko.org/en/2.1/api/channel.html#paramiko.channel.Channel.set_environment_variable
If you cannot control the target server's sshd config, putting the environment variables into a file and sourcing it works:
stdin, stdout, stderr = ssh.exec_command("cat .set_env; source .set_env; echo $HELLO; echo '...'; echo $GOODBYE")
print stdout.read()
output:
# begin .set_env
HELLO="WORLD"
GOODBYE="CRUEL_WORLD"
# end .set_env
WORLD
...
CRUEL_WORLD
#!/usr/bin/python
import paramiko

client = paramiko.SSHClient()
client.load_system_host_keys()
client.set_missing_host_key_policy(paramiko.WarningPolicy)
client.connect(myhostname, theport, myuser, thepass)
stdin, stdout, stderr = client.exec_command('cd /tmp;pwd;ls -al')
# returns your output
print stdout.read()
which all works fine for me. If you have special environment variables, you might have to set them on the remote command prompt. Maybe it helps if you write the variables into a myEnv file and then call
stdin, stdout, stderr = client.exec_command('source ./myEnv')
Have you tried something like that?
You can do: client.exec_command(..., get_pty=True).
This will make paramiko allocate a pseudo terminal, similar to ssh.
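For example (a short sketch; get_pty is a documented parameter of SSHClient.exec_command):

# With a pty allocated, the remote side behaves more like an
# interactive ssh login; the command itself is unchanged.
stdin, stdout, stderr = client.exec_command('echo $PATH', get_pty=True)
print stdout.read()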
I ran into this problem too. Besides the approaches above, I also fixed it in the following way:
e.g.,
...
bin_paths = '/usr/local/bin:/bin:/usr/bin:/usr/local/sbin:/usr/sbin'
path_prefix = 'PATH=$PATH:%s && ' % bin_paths
command = path_prefix + command
ssh.exec_command(command=command)
What I'd like to achieve is the launch of the following shell command:
mysql -h hostAddress -u userName -p userPassword databaseName < fileName
From within a python 2.4 script with something not unlike:
cmd = ["mysql", "-h", ip, "-u", mysqlUser, dbName, "<", file]
subprocess.call(cmd)
This pukes due to the use of the redirect symbol (I believe): mysql doesn't receive the input file.
I've also tried:
subprocess.call(cmd, stdin=subprocess.PIPE)
No go there either.
Can someone specify the syntax to make a shell call such that I can feed in a file redirection?
Thanks in advance.
You have to feed the file into mysql's stdin yourself. This should do it:
import subprocess
...
filename = ...
cmd = ["mysql", "-h", ip, "-u", mysqlUser, dbName]
f = open(filename)
subprocess.call(cmd, stdin=f)
The symbol < has this meaning (i.e. reading a file to stdin) only in the shell. In Python you should use either of the following:
1) Read file contents in your process and push it to stdin of the child process:
fd = open(filename, 'rb')
try:
    subprocess.call(cmd, stdin=fd)
finally:
    fd.close()
2) Read file contents via shell (as you mentioned), but redirect stdin of your process accordingly:
# In file myprocess.py
subprocess.call(cmd, stdin=subprocess.PIPE)
# In shell command line
$ python myprocess.py < filename
As Andrey correctly noticed, the < redirection operator is interpreted by shell. Hence another possible solution:
import os
os.system("mysql -h " + ip + " -u " + mysqlUser + " " + dbName + " < " + filename)
It works because os.system passes its argument to the shell.
Note that I assumed that all used variables come from a trusted source, otherwise you need to validate them in order to prevent arbitrary code execution. Also those variables should not contain whitespace (default IFS value) or shell special characters.
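If those variables cannot be fully trusted, a small hedge that works even on old Pythons is to shell-quote them first, e.g. with pipes.quote (renamed shlex.quote in Python 3); a sketch:

import os
import pipes  # on Python 3: from shlex import quote

# Quote each interpolated value so whitespace and shell
# metacharacters in it cannot break out of the command.
cmd = "mysql -h %s -u %s %s < %s" % (
    pipes.quote(ip), pipes.quote(mysqlUser),
    pipes.quote(dbName), pipes.quote(filename))
os.system(cmd)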