I am just starting off with paramiko, and I'm having some issue with load_system_host_keys().
When I try:
client = SSHClient()
client.load_system_host_keys(filename='/home/barashe/.ssh/known_hosts')
client.connect(hostname='lvs.cs.bgu.ac.il')
stdin, stdout, stderr = client.exec_command('ls -l')
I get
SSHException: Server 'lvs.cs.bgu.ac.il' not found in known_hosts
And it seems like the hostkeys instance is empty
list(client.get_host_keys())
[]
If I use load_host_keys() instead of load_system_host_keys(), I still get the same error, but now the hostkeys instance is not empty, and it includes the server I'm trying to connect to:
list(client.get_host_keys())
['lvs.cs.bgu.ac.il',
'132.72.41.50']
Which seems rather odd...
I know that by using
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
I can avoid this situation, but I prefer doing it the "right" way.
What I'm trying to understand is:
Why am I getting the same error when using load_host_keys() even though the server appears in the hostkeys?
What is the difference between load_host_keys() and load_system_host_keys() in this context?
Cheers!
If this is your own per-user host keys file (known_hosts) in your home directory, you should not use load_system_host_keys but load_host_keys.
Just out of curiosity, where did you get the host key for that particular host if you did not use set_missing_host_key_policy? If you copied it from your .ssh directory, it is possible that the key file format is different. There are several formats.
You can test this by adding the AutoAddPolicy missing-host-key policy once and pointing load_host_keys at an empty host keys file. Your login should succeed now (assuming authentication succeeds). Either way, the host keys file should then contain the host key in the correct format. You can verify this by removing the missing-host-key policy setting and running the script again; it should not complain about missing host keys anymore.
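For example, a one-time bootstrap run along these lines should populate an empty per-user host keys file in the correct format (a minimal sketch; the known_hosts path is a placeholder, the hostname is the one from your question):
import paramiko
from paramiko import SSHClient
known_hosts = '/home/barashe/.ssh/my_known_hosts'  # placeholder path
open(known_hosts, 'a').close()  # make sure the file exists before loading it
client = SSHClient()
client.load_host_keys(filename=known_hosts)
# One-time only: accept the unknown host key and save it into the file above.
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(hostname='lvs.cs.bgu.ac.il')
client.close()
# On later runs, drop the AutoAddPolicy line; the saved key will be verified.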
This works for me:
from paramiko import SSHClient
import paramiko
client = SSHClient()
client.load_host_keys(filename='/home/test/stest/kknown_hosts')
# client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(hostname='137.xx.x.x')
stdin, stdout, stderr = client.exec_command('ls -l')
Hope this helps,
Hannu
Related
I have a Python script I am trying to run in a Docker container to send a file that is on this container to an SFTP server.
I tried the following:
import paramiko
ssh=paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
session = ssh.connect(hostname="X", port=X, username='X', password="X")
stdin, stdout, stderr = ssh.exec_command('sftp -P X ldzdl@hostname', get_pty=True)
I also tried the Paramiko Transport method, but it didn't work from the remote (Docker container) to the remote SFTP server either.
I get the following error: paramiko.ssh_exception.AuthenticationException: Authentication failed.
How can I do this? I don't know if my method is okay or if there is a better way to solve it (sending data from the container to an SFTP server).
The argument given to the exec_command function is not a command you would run in the local (client) host's shell, but rather one that is attempted to be run on the remote (server) host. While attempting to run remote commands is unlikely to produce an AuthenticationException, you did not post a full traceback in the question, so it is hard to tell for sure.
I suggest checking the following code:
import paramiko
ssh=paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
session = ssh.connect(hostname="X", port=X, username='X', password="X")
### so far, same code as in the question ###
print("Auth OK")
sftp_connection = ssh.open_sftp()
sftp_connection.put("local/file/path", "remote/file/path")
If you see the "Auth OK" print, you should be good to go; just replace the file path arguments of the sftp_connection.put() method with actual local and remote file paths.
Otherwise, there is an actual authentication issue that should be resolved.
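If it turns out to be an authentication problem, one optional aid (my suggestion, not something the answer above relies on) is to enable Paramiko's debug log before connecting, so you can see which authentication methods were offered and attempted:
import paramiko
# Write Paramiko's transport-level debug output to a local file.
paramiko.util.log_to_file("paramiko_debug.log")
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
# Placeholders below; fill in your real host, port, and credentials.
ssh.connect(hostname="X", port=22, username="X", password="X")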
I am trying to use Paramiko to make an SSH communication between 2 servers on a private network. The client server is a web server and the host server is going to be a "worker" server. The idea was to not open up the worker server to HTTP connections. The only communication that needs to happen, is the web server needs to pass strings to a script on the worker server. For this I was hoping to use Paramiko and pass the information to the script via SSH.
I set up a new user and created a test script in Python 3, which works when I run it from the command line from my own user's SSH session. I put the same code into my Django web app, thinking that it should work, since it tests OK from the command line, and I get the following error:
Server 'worker-server' not found in known_hosts
Now, I think I understand this error. When running the test script, I was accessing the server as a particular user, and the known hosts information was saved to that user's ~/.ssh/known_hosts, even though the connecting account is actually a third-party user created just for this one job. The Django app runs as a different user, which can't find the saved known hosts info because it doesn't have access to that folder. As far as I can tell, the user Apache uses to execute the Django scripts doesn't have a home directory.
Is there a way I can add this known host in a way that the Django process can see it?
Script:
import paramiko
client = paramiko.SSHClient()
client.load_system_host_keys()
client.connect('worker-server', 22, 'workeruser', 'workerpass')
code = "123wfdv"
survey_id = 111
stdin, stdout, stderr = client.exec_command('python3 /path/to/test_script/test.py %s %s' % (code, survey_id))
print( "ssh successful. Closing connection" )
stdout = stdout.readlines()
client.close()
print ( "Connection closed" )
output = ""
for line in stdout:
    output = output + line
if output != "":
    print ( output )
else:
    print ( "There was no output for this command" )
You can hard-code the host key in your Python code, using HostKeys.add:
import paramiko
from base64 import decodebytes
keydata = b"""AAAAB3NzaC1yc2EAAAABIwAAAQEA0hV..."""
key = paramiko.RSAKey(data=decodebytes(keydata))
client = paramiko.SSHClient()
client.get_host_keys().add('example.com', 'ssh-rsa', key)
client.connect(...)
This is based on my answer to:
Paramiko "Unknown Server".
To see how to obtain the fingerprint for use in the code, see my answer to:
Verify host key with pysftp.
If using pysftp, instead of Paramiko directly, see:
PySFTP failing with "No hostkey for host X found" when deploying Django/Heroku
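For illustration only (this is my own sketch, not necessarily what the linked answers do): the base64 blob used as keydata can be copied from the matching line of a known_hosts file, or from the output of ssh-keyscan, both of which use the format hostname keytype base64key:
import paramiko
from base64 import decodebytes
# Hypothetical example line, e.g. from "ssh-keyscan -t rsa example.com"
# (the key data is a truncated placeholder, as in the snippet above):
line = "example.com ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA0hV..."
hostname, keytype, keydata = line.split()[:3]
key = paramiko.RSAKey(data=decodebytes(keydata.encode()))
client = paramiko.SSHClient()
client.get_host_keys().add(hostname, keytype, key)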
Or, as you are connecting within a private network, you can give up on verifying host key altogether, using AutoAddPolicy:
client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(...)
(This can be done only if you really do not need the connection to be secure)
I'm learning Fabric to automatically connect to an EC2 instance which has already been created. I set up an ssh_config in the .ssh folder:
Home myhostname
Hostname 52.62.207.113
User ubuntu
UserKnownHostsFile /dev/null
StrictHostKeyChecking no
PasswordAuthentication no
IdentityFile ~/.ssh/mykey-pem
And I wrote a Python file to test it:
from fabric import Connection
c = Connection('52.62.207.113')
result = c.run('uname -s')
The terminal responds with:
paramiko.ssh_exception.SSHException: No authentication methods available.
I'm not sure what is happening. If I try manually:
ssh -i mykey.pem ubuntu@52.62.207.113
it successfully connects to the EC2 instance.
Home myhostname
Hostname 52.62.207.113
...
c = Connection('52.62.207.113')
I'm not a fabric user, but I guess you're expecting fabric to make use of the entry from your ssh_config file here? I can see two likely problems:
You have Home myhostname. The correct keyword here is Host, not Home:
Host myhostname
Hostname 52.62.207.113
If you want fabric to use the Host section for myhostname, you probably have to tell it to connect to myhostname:
c = Connection('myhostname')
You're telling it to connect to an IP address, and it probably wouldn't relate that to the Host section.
The actual error that you're getting, "No authentication methods available", is probably because fabric didn't apply the Host section from ssh_config, and it doesn't know of any key files that it should use for the session.
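If you would rather not rely on the ssh_config entry at all, Fabric also lets you pass the key file directly; a minimal sketch based on Fabric 2's Connection API (the key path is a placeholder):
from fabric import Connection
# connect_kwargs is handed straight to Paramiko's SSHClient.connect().
c = Connection(
    '52.62.207.113',
    user='ubuntu',
    connect_kwargs={'key_filename': '/home/me/.ssh/mykey.pem'},
)
result = c.run('uname -s')
print(result.stdout.strip())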
I think you missed the PreferredAuthentications option.
Also, you typed your key file name incorrectly (mykey-pem instead of mykey.pem).
Change the config file as shown below and try connecting again.
Host myhostname
Hostname 52.62.207.113
User ubuntu
PreferredAuthentications publickey
IdentityFile ~/.ssh/mykey.pem
Hi!
Situation: connect to the destination.host over the jump.host and run a command on the destination.host, which connects in the background to another.host (on this host my SSH key is needed).
Scheme: client --> jump.host --> destination.host --- remote_command with ssh key needed on the other host --> another.host
#!/usr/bin/python
import paramiko
jumpHost=paramiko.SSHClient()
sshKey = paramiko.RSAKey.from_private_key_file('path.to.key/file', password = 'the.passphrase')
jumpHost.set_missing_host_key_policy(paramiko.AutoAddPolicy())
jumpHost.connect('jump.hostname',username='foo', pkey = sshKey)
jumpHostTransport = jumpHost.get_transport()
dest_addr = ('destination.hostname', 22)
local_addr = ('jump.hostname', 22)
jumpHostChannel = jumpHostTransport.open_channel("direct-tcpip", dest_addr, local_addr)
destHost=paramiko.SSHClient()
destHost.set_missing_host_key_policy(paramiko.AutoAddPolicy())
destHost.connect('destination.hostname', username='foo', sock=jumpHostChannel, pkey=sshKey)
destHostAgentSession = destHost.get_transport().open_session()
paramiko.agent.AgentRequestHandler(destHostAgentSession)
stdin, stdout, stderr = destHost.exec_command("my.command.which.connects.to.another.host")
print(stdout.read())
print(stderr.read())
destHost.close()
jumpHost.close()
The above code works well if I run "local" commands on destination.host, e.g. uname, whoami, hostname, ls, and so on. But if I run a command which connects in the background to another host where my SSH key is needed, the code raises this error:
raise AuthenticationException("Unable to connect to SSH agent")
paramiko.ssh_exception.AuthenticationException: Unable to connect to SSH agent
If I connect via PuTTY through the same chain, it works fine.
Can anyone give me a hint to resolve my problem?
Thx in advance.
Assumption: Your keys work across jump host and destination host.
Creating a local agent in that case will work. You could manually create it via shell first and test it via iPython.
eval `ssh-agent`; ssh-add <my-key-file-path>
Programmatically, this can be done as follows:
# Using shell=True is not a great idea because it is a security risk.
# Refer this post - https://security.openstack.org/guidelines/dg_avoid-shell-true.html
subprocess.check_output("eval `ssh-agent`; ssh-add <my-key-file-path>", shell=True)
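One caveat worth noting (my observation, not part of the answer above): with check_output, the eval exports SSH_AUTH_SOCK only inside that short-lived shell, so the Python process itself still cannot reach the agent. A rough sketch that also echoes the variables back and exports them into the current process, so Paramiko's agent code can find the socket:
import os
import subprocess
# <my-key-file-path> is the same placeholder as above.
out = subprocess.check_output(
    "eval `ssh-agent -s`; ssh-add <my-key-file-path>; "
    "echo SSH_AUTH_SOCK=$SSH_AUTH_SOCK; echo SSH_AGENT_PID=$SSH_AGENT_PID",
    shell=True,
).decode()
for line in out.splitlines():
    if line.startswith(("SSH_AUTH_SOCK=", "SSH_AGENT_PID=")):
        name, _, value = line.partition("=")
        os.environ[name] = value  # make the agent visible to this process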
I am trying to do something similar and came across this post, I will update if I find a better solution.
EDIT: I have posted the implementation over here - https://adikrishnan.in/2018/10/25/agent-forwarding-with-paramiko/
I have a script that loops over a dozen hosts and executes several functions in each host. The functions take as a parameter the SSHClient() and then execute commands on it.
I could simply set some attribute on the SSHClient(), but before I do that, is there already a way to determine from an instance of SSHClient() which host is currently being connected to?
for host in hosts:
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect(host, username=USERNAME)
    f1(ssh)
    f2(ssh)
    ...
Using the ssh variable for the Client, as you've done, there is:
ssh.get_transport().getpeername()
Which will return a tuple of ('ip address', portnumber)
Will that be enough? Looking at the source for paramiko/client.py, it doesn't seem to keep a record of the value of connect()'s hostname parameter; it looks up the address with socket.getaddrinfo and then passes the result on to the transport, which is what getpeername() is reporting.
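If the exact string passed to connect() is what you need (for example a DNS name rather than the resolved IP), the workaround you already hinted at, setting an attribute yourself, is trivial; a minimal sketch with a hypothetical attribute name:
for host in hosts:
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect(host, username=USERNAME)
    ssh.connected_host = host  # hypothetical attribute; any name will do
    f1(ssh)
    f2(ssh)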
The following is probably what you are after...
stdin, stdout, stderr = ssh.exec_command('hostname')
hostname = stdout.read().decode("utf-8").strip('\n')
print(hostname)
How it works:
On both Windows and Linux, typing hostname in a terminal returns the server's name. We execute this command on the remote host, read back the returned output, decode it from bytes to a string, and finally strip the trailing newline character.
This is how I solved the issue:
getpeername() returns an IP address, which is not very user friendly for a log message. So I wrapped it with socket.gethostbyaddr().
This returns the FQDN as a string:
socket.gethostbyaddr(ssh.get_transport().getpeername()[0])[0]
And this splits it to just the hostname itself:
socket.gethostbyaddr(ssh.get_transport().getpeername()[0])[0].split('.')[0]
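A quick usage sketch (my own illustration, assuming ssh is a connected SSHClient):
import socket
peer_ip = ssh.get_transport().getpeername()[0]
fqdn = socket.gethostbyaddr(peer_ip)[0]
short_name = fqdn.split('.')[0]
print("Connected to %s (%s)" % (short_name, peer_ip))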