Constantly ping an IP and write the output into a txt file - Python

I have this code:
import subprocess
import threading

def ping_google(command):
    with open('google.txt', 'a') as f:
        f.write(subprocess.check_output(command))

t1 = threading.Thread(target=ping_google, args=("ping -t 8.8.8.8",))
and I would like to save the infinite pinging to Google in a txt file. Is it possible?

command = "ping -i20 8.8.8.8 > ping.txt"
Ask subprocess to execute that with , shell=True,
and you're done.
It will append three status reports each minute, forever,
or until you hit CTRL/C.
(I expect ping
to require a time-to-live value after a -t switch,
so I'm not sure what the intent of your command was.)
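A minimal sketch of that (my illustration, not part of the original answer; -i20 sets a 20-second interval on Linux ping):
import subprocess

# The ">" redirection is shell syntax, hence shell=True; this call blocks
# forever (or until Ctrl-C) while ping writes a report every 20 seconds.
subprocess.run("ping -i20 8.8.8.8 > ping.txt", shell=True)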

Python SSH to multiple servers

I want to send text files via ssh to 2 servers. My servers have the same name and IP but different ports.
I can do it with 1 server but not with 2; how do I do this (normally there should be a port next to -p)?
import subprocess

with open("hosts_and_ports.txt") as hp_fh:
    hp_contents = hp_fh.readlines()
for hp_pair in hp_contents:
    with open("commands.txt") as fh:
        completed = subprocess.run("ssh ubuntussh@127.0.0.1 -p ", capture_output=True, text=True, shell=True, stdin=hp_pair)
My text file hosts_and_ports.txt contains the ports of my servers:
2222;
2224;
exit;
My text file commands.txt contains the commands I want to run via ssh:
touch demofile1.txt;
touch demofile2.txt;
exit;
ssh always connects to one (1) single port only. In your scenario you need to define the port with -p 2222 OR -p 2224, e.g. ssh user@192.168.1.1 -p 2224 for one (1) connection, and the same again for the other connection.
ssh user@192.168.1.1 -p 2224 "command1 && command2" # executes a remote command
To send a local file: scp -P 2224 local_file user@192.168.1.1:/remote/directory (note that scp takes the port as a capital -P).
Your attempt obviously doesn't pass in the port number at all.
As a simplification, I'll assume that you can remove the silly exit; line from both files, and just keep on reading as long as there are lines in both files. Also, trim the semicolon from the end of each line; it is simply in the way. (It's not hard to ignore in the Python program, either, but why put such chaff in the file in the first place?)
import subprocess

with open("commands.txt") as cmd:
    cmds = cmd.readlines()
with open("hosts_and_ports.txt") as hp_fh:
    for line in hp_fh:
        port = line.rstrip('\n')
        for cmd in cmds:
            completed = subprocess.run(
                ["ssh", "ubuntussh@127.0.0.1", "-p", port, cmd],
                capture_output=True, text=True, check=True)
We don't need a shell here, and we are better off without it.
Actually probably also rename the file which only contains port numbers, as its name is currently misleading.
Tangentially, touch demofile1.txt demofile2.txt will create both files with a single remote SSH command. I'm guessing maybe you will have other commands you want to add to the file later on, so this runs all commands in the file on all the servers in the other file. Generally speaking, you will probably want to minimize the number of remote connections because there is a fair bit of overhead with each login ... so in fact it would make more sense to send the entire commands.txt to each server in one go:
import subprocess

with open("commands.txt") as cmd:
    cmds = cmd.read()
with open("hosts_and_ports.txt") as hp_fh:
    for line in hp_fh:
        port = line.rstrip('\n')
        completed = subprocess.run(
            ["ssh", "ubuntussh@127.0.0.1", "-p", port, cmds],
            capture_output=True, text=True, check=True)

Conditionally run subprocess over ssh, while appending output to a (potentially remote) file

I have a script which can run on my host machine and several other servers. I want to launch this script as a background process on my host machine as well as on the remote machines over ssh, and have the stdout/stderr go to the host machine for the local background process and to the remote machines for the remote background tasks.
I tried with
subprocess.check_output(['python', 'script.py', 'arg_1', ' > file.log ', ' & echo -ne $! '])
but it doesn't work: it doesn't give me the PID, nor does it write into the file. It works with shell=True, but then I read that it is not good to use shell=True for security reasons.
Then I tried
p = subprocess.Popen(['python', 'script.py', 'arg_1', ' > file.log '])
Now I can get the process PID, but the output is not written to the remote log file.
Using the stdout/stderr arguments as suggested in the linked question below opens the log file on my host machine, not the remote machine; I want to log on the remote machine instead:
append subprocess.Popen output to file?
Could someone please suggest a single command that works on my host machine and also ssh's to the remote server, launches the background process there, and writes to the output file?
<HOW_TO_GET_PID> = subprocess.<WHAT>( ([] if 'localhost' else ['ssh','<remote_server>']) + ['python', 'script.py', 'arg_1' <WHAT>] )
Could someone please finish the above pseudocode?
Thanks,
You're not going to get something that's safe and correct in a one-liner without making it unreadable; better not to try.
Note that we're using a shell here: in the local case we explicitly pass shell=True, whereas in the remote case ssh always implicitly starts a shell.
import shlex
import subprocess

def startBackgroundCommand(argv, outputFile, remoteHost=None, andGetPID=False):
    cmd_str = ' '.join(shlex.quote(word) for word in argv)
    if outputFile is not None:
        cmd_str += ' >%s' % (shlex.quote(outputFile),)
    if andGetPID:
        cmd_str += ' & echo "$!"'
    if remoteHost is not None:
        p = subprocess.Popen(['ssh', remoteHost, cmd_str], stdout=subprocess.PIPE)
    else:
        p = subprocess.Popen(cmd_str, stdout=subprocess.PIPE, shell=True)
    return p.communicate()[0]

# Run your command locally
startBackgroundCommand(['python', 'script.py', 'arg_1'],
                       outputFile='file.log', andGetPID=True)

# Or run your command remotely
startBackgroundCommand(['python', 'script.py', 'arg_1'],
                       remoteHost='foo.example.com', outputFile='file.log', andGetPID=True)
# At the beginning you can even program automatic daemonizing
# using os.fork(); otherwise, you run it with something like:
#   nohup python run_my_script.py &
# This will ensure that it continues running even if the SSH connection breaks.
from subprocess import Popen, PIPE, STDOUT

p = Popen(["python", "yourscript.py"], stdout=PIPE, stderr=STDOUT, stdin=PIPE)
p.stdin.close()
log = open("logfile.log", "wb")
log.write(b"PID: %i\n\n" % p.pid)
while 1:
    line = p.stdout.readline()
    if not line:
        break
    log.write(line)
    log.flush()
p.stdout.close()
log.write(b"\nExit status: %i" % p.poll())
log.close()
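The snippet above logs on the host machine. For the remote half of the question, a minimal sketch (my assumption, not part of either answer; remote.example.com is a placeholder and passwordless SSH is assumed) that keeps both the process and its log on the remote side:
import subprocess

# nohup keeps the remote process alive after ssh exits; the redirection
# creates file.log on the remote machine, and echo $! reports the remote PID.
p = subprocess.Popen(
    ["ssh", "remote.example.com",
     "nohup python script.py arg_1 > file.log 2>&1 & echo $!"],
    stdout=subprocess.PIPE, text=True)
remote_pid = p.communicate()[0].strip()
print("remote PID:", remote_pid)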

Real time output of wget command by using exec_command in Paramiko

I am trying to download a file on all the machines, and for that I have created a Python script. It uses the module paramiko.
Just a snippet from the code:
from paramiko import SSHClient, AutoAddPolicy
ssh = SSHClient()
ssh.load_system_host_keys()
ssh.set_missing_host_key_policy(AutoAddPolicy())
ssh.connect(args.ip, username=args.username, password=args.password)
stdin, stdout, stderr = ssh.exec_command("wget xyz")
print(stdout.read())
The output is printed only after the download has completed!
Is there a way I can print the output in real time?
Edit: I have looked at this answer and applied something like this:
def line_buffered(f):
    line_buf = ""
    print("start")
    print(f.channel.exit_status_ready())
    while not f.channel.exit_status_ready():
        print("ok")
        line_buf += f.read(size=1)
        if line_buf.endswith('\n'):
            yield line_buf
            line_buf = ''

stdin, out, err = ssh.exec_command("wget xyz")
for l in line_buffered(out):
    print(l)
But it is not printing the data in real time! It waits for the file to download and then prints the whole status of the download.
Also, I have tried this command: echo one && sleep 5 && echo two && sleep 5 && echo three, and the output is real-time with the line_buffered function. But for the wget command, it is not working.
wget's progress information outputs to stderr, so you should write line_buffered(err).
Or you can use exec_command("wget xyz", get_pty=True), which would combine stdout and stderr.
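A minimal sketch of the second suggestion (my assembly, reusing the ssh client from the question; the 1024-byte chunk size is an arbitrary choice):
# get_pty=True merges stderr (where wget writes its progress) into stdout.
stdin, stdout, stderr = ssh.exec_command("wget xyz", get_pty=True)
while not stdout.channel.exit_status_ready():
    if stdout.channel.recv_ready():
        # recv() returns whatever bytes have arrived, so progress shows live.
        print(stdout.channel.recv(1024).decode(), end="")
print(stdout.read().decode(), end="")  # drain anything buffered after exit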

Pexpect not waiting for whole output - Ubuntu

I have a simple script to SSH into a network switch, run commands, and save the output into a file. It works fine for output that is displayed instantly, but when I run "show iproute" it does not capture any output. The reason is that when I run the same command on the switch directly, it thinks for 5-6 seconds, shows a bunch of lines, thinks again, shows a couple more lines, and then ends. The script is not waiting properly for the whole command to finish, and that is the issue I am having trouble fixing:
import pexpect

str_prompt = ' # '
command = "sh iproute"
device_name = "switch1.test.com"

# Spawn SSH session
ssh_command = 'ssh {}@{}'.format(username, device_name)
session = pexpect.spawn(ssh_command, timeout=5)

# Send the password
session.sendline(password)

# Expect the switch prompt (successful login)
expect_index = session.expect([pexpect.TIMEOUT, str_prompt])

# Success
if expect_index == 1:
    # Disable clipaging so that all the output is shown (not in pages) | same as "term len 0" on Cisco
    session.sendline('disable clip')

    # Expect the switch prompt if command is successful
    expect_index = session.expect([pexpect.TIMEOUT, str_prompt])

    # Send show iproute command
    session.sendline(command)

    # < This is where it needs to wait >
    #session.expect(pexpect.EOF) - Tried this and wait() but that broke the script
    #session.wait()

    # Expect the switch prompt if command is successful
    session.expect(str_prompt)

    # Save output of "sh iproute" to a variable
    output = session.before

    # Save results to a file
    fp = open(device_name + '-route.txt', "w")
    fp.write(output)
    fp.close()
Here is a sample output. The output does have "#" but not " # ".
#oa 10.10.10.0/24 10.0.0.1 4 UG-D---um--f- V-BB1 99d:0h:14m:49s
#oa 10.10.20.0/24 10.0.0.1 4 UG-D---um--f- V-BB2 99d:0h:14m:49s
#oa 10.10.30.0/24 10.0.0.1 4 UG-D---um--f- V-BB3 99d:0h:14m:49s
#oa 10.10.40.0/24 10.0.0.1 4 UG-D---um--f- V-BB4 99d:0h:14m:49s
and many more lines ...
Any help will be appreciated. Thanks
Edit:
I added sleep(60) and that seems to do the trick, but I do not want to use it, as I am sending multiple commands and some are super fast. I do not want to wait 1 min for each command; the script would take forever to run.
So you need to associate a timeout with each command.
The way I do it today is to have an XML file which my code parses; the XML tag has an attribute for the command, another for the timeout, another for the end prompt of the command, and so on.
My code reads the command and its timeout and sets the respective variables accordingly before sending the command.
session.sendline(command)  # command is read from an XML file
session.expect(end_prompt, timeout=int(tmout))  # end_prompt and tmout for the command are read from the same file
For your case, if you don't want to parse a file to get the command and its related params, you can have them as a dictionary in your script and use it:
command_details_dict = {"cmd_details": [
    {'cmd': 'pwd',
     'timeout': 5,
     },
    {'cmd': 'iproute',
     'timeout': 60,
     }
]}
Inside the dictionary, cmd_details is a list of dictionaries so that you can maintain the order of your commands while iterating; each command is a dictionary with the relevant details, and you can add more keys to the command dictionary like prompts, unique identifiers, etc.
But if you have the time, I would suggest using a config file instead.
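A minimal sketch of that iteration (my illustration, assuming the session and str_prompt from the question's code):
for cmd_details in command_details_dict["cmd_details"]:
    session.sendline(cmd_details["cmd"])
    # Each command waits up to its own timeout instead of a single global one.
    session.expect(str_prompt, timeout=cmd_details["timeout"])
    output = session.before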

Run multiple commands in a single ssh session using popen and save the output in separate files

I am trying to run several commands in a single ssh session and save the output of each command in a different file.
The code that works (but saves all the output in a single file):
conn = Popen(['ssh', host, "ls;uname -a;pwd"], stdin=PIPE, stdout=open('/output.txt', 'w'))
mypassword = conn.communicate('password')
Code that I am trying to get to work, but which is not working...
cmd = ['ls', 'pwd', 'uname']
conn = Popen(['ssh', host, "{};{};{}".format(cmd[0], cmd[1], cmd[2])], stdin=PIPE, stdout=output.append('a'))
mypassword = conn.communicate('password')
print(output)
length = range(len(output))
print(length)
for i in output:
    open("$i", 'w')
and
cmd = ['ls', 'pwd', 'uname']
conn = Popen(['ssh', host, "{};{};{}".format(cmd[0], cmd[1], cmd[2])], stdin=PIPE, stdout=output())
mypassword = conn.communicate('password')

def output():
    for i in cmd:
        open(i, 'w')
    return
Not sure what the best way of doing it is. Should I save it in an array and then save each item in a separate file, or should I call a function that will do it?
NOTE that the commands I want to run do not have small output like the examples given here (uname, pwd); the output is big, as with tcpdump, lsof, etc.
A single ssh session runs a single command, e.g., /bin/bash on the remote host; you can pass input to that command to emulate running multiple commands in a single ssh session.
Your code won't run even a single command. There are multiple issues in it:
ssh may read the password directly from the terminal (not its stdin stream). conn.communicate('password') in your code writes to ssh's stdin, therefore ssh won't get the password.
There are multiple ways to authenticate via ssh, e.g., use ssh keys (passwordless login).
stdout=output.append('a') doesn't redirect ssh's stdout, because the .append list method returns None.
It won't help you to save the output of several commands to different files. You could redirect the output to remote files and copy them back later: ls >ls.out; uname -a >uname.out; pwd >pwd.out (see the sketch after this list).
A (hacky) alternative is to use inline stream markers (echo <GUID>) to differentiate the output from different commands. If the output can be unlimited, learn how to read a subprocess' output incrementally (without calling the .communicate() method).
for i in cmd: open(i,'w') is pointless. It opens (and, on CPython, immediately closes) multiple files without using them.
To avoid such basic mistakes, write several Python scripts that operate on local files.
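A minimal sketch of the redirect-and-copy-back idea mentioned above (my illustration; host comes from the question's code, passwordless ssh is assumed, and the .out file names are placeholders):
import subprocess

# Run all commands in one ssh session, each redirected to its own remote file.
subprocess.run(["ssh", host, "ls >ls.out; uname -a >uname.out; pwd >pwd.out"],
               check=True)
# Copy the per-command output files back to the local machine.
for name in ["ls.out", "uname.out", "pwd.out"]:
    subprocess.run(["scp", "{}:{}".format(host, name), name], check=True)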
import subprocess

SYS_STATS = {"Number of CPU Cores": "cat /proc/cpuinfo|grep -c 'processor'\n",
             "CPU MHz": "cat /proc/cpuinfo|grep 'cpu MHz'|head -1|awk -F':' '{print $2}'\n",
             "Memory Total": "cat /proc/meminfo|grep 'MemTotal'|awk -F':' '{print $2}'|sed 's/ //g'|grep -o '[0-9]*'\n",
             "Swap Total": "cat /proc/meminfo|grep 'SwapTotal'|awk -F':' '{print $2}'|sed 's/ //g'|grep -o '[0-9]*'\n"}

def get_system_details(ipaddress, user, key):
    values = []
    sshProcess = subprocess.Popen(
        ['ssh', '-T', '-o', 'StrictHostKeyChecking=no', '-i', key,
         '%s@%s' % (user, ipaddress), "sudo", "su"],
        stdin=subprocess.PIPE, stdout=subprocess.PIPE,
        universal_newlines=True, bufsize=0)
    # Write an END marker before each command so the outputs stay separable.
    for command in SYS_STATS.values():
        sshProcess.stdin.write("echo END\n")
        sshProcess.stdin.write(command)
    sshProcess.stdin.close()
    for element in sshProcess.stdout:
        if element.rstrip('\n') != "END":
            values.append(element.rstrip('\n'))
    mapObj = {k: v for k, v in zip(SYS_STATS.keys(), values)}
    return mapObj
