I have a script that loops over a dozen hosts and executes several functions on each host. The functions take the SSHClient() as a parameter and then execute commands over it.
I could simply set some attribute on the SSHClient(), but before I do that: is there already a way to determine, from an instance of SSHClient(), which host it is currently connected to?
for host in hosts:
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect(host, username=USERNAME)
    f1(ssh)
    f2(ssh)
    ...
Using the ssh variable for the Client, as you've done, there is:
ssh.get_transport().getpeername()
which returns a tuple of ('ip address', port_number).
Will that be enough? Looking at the source of paramiko/client.py, it doesn't seem to keep a record of connect()'s hostname parameter: it resolves the address with socket.getaddrinfo and passes the result on to the transport, which is what getpeername() queries.
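For example, one of the functions from the loop above could log the peer it is talking to (a minimal sketch; `f1` here stands in for your own function):

```python
def f1(ssh):
    # `ssh` is a connected paramiko.SSHClient; its transport knows the peer.
    ip, port = ssh.get_transport().getpeername()
    print("f1 running against %s:%d" % (ip, port))
    return ip, port
```

Note that because the client resolved the hostname before connecting, this gives you the IP address, not the name you originally passed to connect().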
The following is probably what you are after...
stdin, stdout, stderr = ssh.exec_command('hostname')
hostname = stdout.read().decode("utf-8").strip('\n')
print(hostname)
How it works:
On both Windows and Linux, typing hostname in a terminal prints the server's name. We execute this command on the remote host, read the returned output, decode it from bytes to a string, and finally strip the trailing newline character.
This is how I solved the issue:
getpeername() returns an IP address, which is not very user-friendly for a log message, so I wrapped it with socket.gethostbyaddr().
This returns the FQDN as a string:
socket.gethostbyaddr(ssh.get_transport().getpeername()[0])[0]
And this splits it to just the hostname itself:
socket.gethostbyaddr(ssh.get_transport().getpeername()[0])[0].split('.')[0]
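Putting the two together, here is a small helper (the name `peer_hostname` is my own) that falls back to the raw IP address when there is no reverse-DNS record:

```python
import socket

def peer_hostname(ssh, short=True):
    """Best-effort hostname for a connected paramiko.SSHClient."""
    ip = ssh.get_transport().getpeername()[0]
    try:
        fqdn = socket.gethostbyaddr(ip)[0]
    except OSError:
        return ip  # no reverse DNS record; fall back to the address
    return fqdn.split('.')[0] if short else fqdn
```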
I am using PAM authentication to authenticate with my Linux server. I have created a view on my website through Apache2 where I can use Python to manually validate each login through a web shell with facial recognition and two-factor authentication. This works, but I can't seem to recover the IP address of the incoming connection. I need a way to find the IP address of the connection to the server before SSH is connected, in the PAM module which is running Python. I would like to use bash for this.
I tried executing commands to recover the IP address: "who" and other commands that show incoming SSH connections, to no avail. I also tried "echo $PAM_RHOST", "$SSH_CLIENT" and "$SSH_CONNECTION", with no success.
I ended up using auth.log, which seems to work perfectly. All I had to do was reverse the log and take the last IP. The code below also collects unique IPs in order of most recent login.
import re

output = run_command('sudo tail -500 /var/log/auth.log')  # run_command is my own shell helper
op = output.split('\n')
op.reverse()                  # newest entries first
output = '\n'.join(op)

def unique(thelist):
    u = []
    for i in thelist:
        if i not in u:
            u.append(i)
    return u

ips = unique(re.findall(r'Accepted publickey for user from (\d+\.\d+\.\d+\.\d+)', output))
ip = ips[0]
print(ip)  # The last IP
I am trying to connect to a server over SSH through a jump server, using the Paramiko library. When I connect through a terminal, the jump server opens a shell and asks for a server number from a list of available servers it provides, followed by a username and password.
My code:
import paramiko
client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(
    hostname="server_ip",
    username="user",
    password="pass",
    look_for_keys=False,
    allow_agent=False
)
com='1'
stdin, stdout, stderr = client.exec_command(com)
data = stdout.read() + stderr.read()
print(data.decode('utf-8'))
I get message:
Invalid target.
My shell on the jump server looks like this:
Your jump server probably shows the selection in an interactive shell session only. So you will have to use SSHClient.invoke_shell, which is otherwise not a good thing to do when automating a connection.
See also What is the difference between exec_command and send with invoke_shell() on Paramiko?
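A rough sketch of driving such a menu with invoke_shell (the '1' selection, the buffer sizes, and the fixed sleeps are assumptions based on your description; a robust version would wait for the actual prompt text instead):

```python
import time

def select_target(client, choice='1'):
    # Open an interactive shell on the jump server and answer its menu.
    shell = client.invoke_shell()
    time.sleep(1)                      # crude: wait for the menu to appear
    print(shell.recv(4096).decode('utf-8', 'replace'))
    shell.send(choice + '\n')          # pick an entry from the server list
    time.sleep(1)
    print(shell.recv(4096).decode('utf-8', 'replace'))
    return shell
```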
I am trying to use Paramiko to make an SSH communication between 2 servers on a private network. The client server is a web server and the host server is going to be a "worker" server. The idea was to not open up the worker server to HTTP connections. The only communication that needs to happen, is the web server needs to pass strings to a script on the worker server. For this I was hoping to use Paramiko and pass the information to the script via SSH.
I set up a new user and created a test script in Python 3, which works when I run it from the command line in my own user's SSH session. I put the same code into my Django web app, thinking that it should work since it tests OK from the command line, but I get the following error:
Server 'worker-server' not found in known_hosts
Now, I think I understand this error. When running the test script, I was accessing the server as a certain user, and the known-hosts information was saved to that user's ~/.ssh/known_hosts (even though the user is actually a third-party user created just for this one job). The Django app runs under a different user, which can't find the saved known-hosts info because it has no access to that folder; as far as I can tell, the user Apache uses to execute the Django scripts doesn't even have a home directory.
Is there a way I can add this known host in a way that the Django process can see it?
Script:
import paramiko

client = paramiko.SSHClient()
client.load_system_host_keys()
client.connect('worker-server', 22, 'workeruser', 'workerpass')
code = "123wfdv"
survey_id = 111
stdin, stdout, stderr = client.exec_command('python3 /path/to/test_script/test.py %s %s' % (code, survey_id))
print("ssh successful. Closing connection")
stdout = stdout.readlines()
client.close()
print("Connection closed")
output = ""
for line in stdout:
    output = output + line
if output != "":
    print(output)
else:
    print("There was no output for this command")
You can hard-code the host key in your Python code, using HostKeys.add:
import paramiko
from base64 import decodebytes
keydata = b"""AAAAB3NzaC1yc2EAAAABIwAAAQEA0hV..."""
key = paramiko.RSAKey(data=decodebytes(keydata))
client = paramiko.SSHClient()
client.get_host_keys().add('example.com', 'ssh-rsa', key)
client.connect(...)
This is based on my answer to:
Paramiko "Unknown Server".
To see how to obtain the fingerprint for use in the code, see my answer to:
Verify host key with pysftp.
If using pysftp, instead of Paramiko directly, see:
PySFTP failing with "No hostkey for host X found" when deploying Django/Heroku
Or, as you are connecting within a private network, you can give up on verifying host key altogether, using AutoAddPolicy:
client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(...)
(This can be done only if you really do not need the connection to be secure)
Hi!
Situation: connect to destination.host over jump.host and run a command on destination.host which connects in the background to another.host (on that host my SSH key is needed).
Scheme: client --> jump.host --> destination.host --- remote command needing the SSH key on the other host --> another.host
#!/usr/bin/python
import paramiko

sshKey = paramiko.RSAKey.from_private_key_file('path.to.key/file', password='the.passphrase')

jumpHost = paramiko.SSHClient()
jumpHost.set_missing_host_key_policy(paramiko.AutoAddPolicy())
jumpHost.connect('jump.hostname', username='foo', pkey=sshKey)

jumpHostTransport = jumpHost.get_transport()
dest_addr = ('destination.hostname', 22)
local_addr = ('jump.hostname', 22)
jumpHostChannel = jumpHostTransport.open_channel("direct-tcpip", dest_addr, local_addr)

destHost = paramiko.SSHClient()
destHost.set_missing_host_key_policy(paramiko.AutoAddPolicy())
destHost.connect('destination.hostname', username='foo', sock=jumpHostChannel, pkey=sshKey)

# request agent forwarding on the destination host's session
destHostAgentSession = destHost.get_transport().open_session()
paramiko.agent.AgentRequestHandler(destHostAgentSession)

# note: exec_command returns (stdin, stdout, stderr) in that order
stdin, stdout, stderr = destHost.exec_command("my.command.which.connects.to.another.host")
print(stdout.read())
print(stderr.read())

destHost.close()
jumpHost.close()
The above code works well if I run "local" commands on destination.host, e.g. uname, whoami, hostname, ls and so on. But if I run a command which connects in the background to another host where my SSH key is needed, the code raises this error:
raise AuthenticationException("Unable to connect to SSH agent")
paramiko.ssh_exception.AuthenticationException: Unable to connect to SSH agent
If I connect via PuTTY along the same chain, it works well.
Can anyone give me a hint to resolve my problem?
Thanks in advance.
Assumption: Your keys work across jump host and destination host.
Creating a local agent in that case will work. You can create it manually via the shell first and test it via IPython.
eval `ssh-agent`; ssh-add <my-key-file-path>
Programmatically this can be done -
# Using shell=True is not a great idea because it is a security risk.
# Refer this post - https://security.openstack.org/guidelines/dg_avoid-shell-true.html
subprocess.check_output("eval `ssh-agent`; ssh-add <my-key-file-path>", shell=True)
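One caveat: environment variables set by eval `ssh-agent` inside that subprocess do not propagate back to the Python process, so paramiko.Agent still cannot find the agent socket. A sketch (the helper names are my own) that parses ssh-agent's output and exports SSH_AUTH_SOCK itself:

```python
import os
import re
import subprocess

def parse_agent_output(out):
    # ssh-agent -s prints Bourne-shell commands like:
    #   SSH_AUTH_SOCK=/tmp/ssh-XXXX/agent.123; export SSH_AUTH_SOCK;
    #   SSH_AGENT_PID=456; export SSH_AGENT_PID;
    sock = re.search(r'SSH_AUTH_SOCK=([^;]+);', out).group(1)
    pid = re.search(r'SSH_AGENT_PID=(\d+);', out).group(1)
    return sock, pid

def start_agent_and_add_key(key_path):
    out = subprocess.check_output(['ssh-agent', '-s'], text=True)
    sock, pid = parse_agent_output(out)
    os.environ['SSH_AUTH_SOCK'] = sock  # now paramiko.Agent() can connect
    os.environ['SSH_AGENT_PID'] = pid
    subprocess.check_call(['ssh-add', key_path])
```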
I am trying to do something similar and came across this post, I will update if I find a better solution.
EDIT: I have posted the implementation over here - https://adikrishnan.in/2018/10/25/agent-forwarding-with-paramiko/
I am just starting off with paramiko, and I'm having some issue with load_system_host_keys().
When I try:
client = SSHClient()
client.load_system_host_keys(filename='/home/barashe/.ssh/known_hosts')
client.connect(hostname='lvs.cs.bgu.ac.il')
stdin, stdout, stderr = client.exec_command('ls -l')
I get
SSHException: Server 'lvs.cs.bgu.ac.il' not found in known_hosts
And it seems like the hostkeys instance is empty
list(client.get_host_keys())
[]
If I use load_host_keys() instead of load_system_host_keys(), I still get the same error, but the host keys instance is no longer empty, and it includes the server I'm trying to connect to:
list(client.get_host_keys())
['lvs.cs.bgu.ac.il',
'132.72.41.50']
Which seems rather odd...
I know that by using
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
I can avoid this situation, but I prefer doing it the "right" way.
What I'm trying to understand is:
Why am I getting the same error when using load_host_keys() even though the server appears in the hostkeys?
What is the difference between load_host_keys() and load_system_host_keys() in this context?
Cheers!
If this is a personal known-hosts file in your home directory, you should not use load_system_host_keys but load_host_keys.
Just out of curiosity, where did you get the host key for that particular host if you did not use set_missing_host_key_policy? If you copied it from your .ssh directory, it is possible that the key file format is different. There are several.
You can test this by setting the AutoAddPolicy missing-host-key policy once and pointing to an empty known-hosts file. Your login should now succeed (assuming authentication succeeds), and the file should then contain the host key in the correct format. You can verify this works by removing the missing-host-key policy setting and running the script again. It should not complain about missing host keys anymore.
This works for me:
from paramiko import SSHClient
import paramiko
client = SSHClient()
client.load_host_keys(filename='/home/test/stest/kknown_hosts')
# client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(hostname='137.xx.x.x')
stdin, stdout, stderr = client.exec_command('ls -l')
Hope this helps,
Hannu