I'm learning Fabric, trying to automatically connect to an EC2 instance that has already been created. I set up an ssh_config in my ~/.ssh folder:
Home myhostname
Hostname 52.62.207.113
User ubuntu
UserKnownHostsFile /dev/null
StrictHostKeyChecking no
PasswordAuthentication no
IdentityFile ~/.ssh/mykey-pem
And I wrote a Python file to test:
from fabric import Connection
c = Connection('52.62.207.113')
result = c.run('uname -s')
The terminal responds with:
paramiko.ssh_exception.SSHException: No authentication methods available.
I'm not sure what is happening. I tried connecting manually:
ssh -i mykey.pem ubuntu@52.62.207.113
It successfully connects to the EC2 instance.
Home myhostname
Hostname 52.62.207.113
...
c = Connection('52.62.207.113')
I'm not a fabric user, but I guess you're expecting fabric to make use of the entry from your ssh_config file here? I can see two likely problems:
You have Home myhostname. The correct keyword here is Host, not Home:
Host myhostname
Hostname 52.62.207.113
If you want fabric to use the Host section for myhostname, you probably have to tell it to connect to myhostname:
c = Connection('myhostname')
You're telling it to connect to an IP address, and it probably won't relate that to the Host section.
The actual error that you're getting, "No authentication methods available", is probably because fabric didn't apply the Host section from ssh_config, and it doesn't know of any key files that it should use for the session.
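If you would rather not rely on ssh_config at all, a minimal sketch (assuming Fabric 2.x; the key path comes from the question) is to hand the key file to the connection explicitly:
import os
from fabric import Connection

# Pass the key explicitly so Paramiko has an authentication method even
# when no ssh_config entry is applied. The key path is the one from the
# question; adjust as needed.
c = Connection(
    host='52.62.207.113',
    user='ubuntu',
    connect_kwargs={'key_filename': os.path.expanduser('~/.ssh/mykey.pem')},
)
result = c.run('uname -s')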
I think you missed the PreferredAuthentications option.
And you typed your key file name incorrectly.
Change the config file as shown below and try connecting again.
Host myhostname
Hostname 52.62.207.113
User ubuntu
PreferredAuthentications publickey
IdentityFile ~/.ssh/mykey.pem
I have a Python script I am trying to run in a Docker container, to send a file from the container to an SFTP server.
I tried the following:
import paramiko
ssh=paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
session = ssh.connect(hostname="X", port=X, username='X', password="X")
stdin, stdout, stderr = ssh.exec_command('sftp -P X ldzdl@hostname', get_pty=True)
I also tried the Paramiko transport method, but it didn't work from remote (the Docker container) to the remote SFTP server.
But I get the following error: paramiko.ssh_exception.AuthenticationException: Authentication failed.
How can I do this? I don't know if my method is okay or if there is a better way to solve it (sending data from the container to an SFTP server).
The argument given to the exec_command function is not a command to run in the local (client) host's shell, but one to be run on the remote (server) host. While attempting to run remote commands is unlikely to produce an AuthenticationException, it is hard to tell for sure since you did not post a full traceback in the question.
I suggest checking the following code:
import paramiko
ssh=paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
session = ssh.connect(hostname="X", port=X, username='X', password="X")
### so far, same code as in the question ###
print("Auth OK")
sftp_connection = ssh.open_sftp()
sftp_connection.put("local/file/path", "remote/file/path")
If you see the "Auth OK" print, you should be good to go; just replace the file path arguments of the sftp_connection.put() method with actual local and remote file paths.
Otherwise - there is an actual authentication issue which should be resolved.
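If authentication succeeds, it's also worth closing both handles once the transfer finishes; a minimal sketch:
# Tidy up once the transfer is done.
sftp_connection.close()
ssh.close()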
I am trying to use Paramiko to make an SSH communication between 2 servers on a private network. The client server is a web server and the host server is going to be a "worker" server. The idea was to not open up the worker server to HTTP connections. The only communication that needs to happen, is the web server needs to pass strings to a script on the worker server. For this I was hoping to use Paramiko and pass the information to the script via SSH.
I set up a new user and created a test script in Python 3, which works when I run it from the command line in my own user's SSH session. I put the same code into my Django web app, thinking it should work since it tests OK from the command line, but I get the following error:
Server 'worker-server' not found in known_hosts
Now, I think I understand this error. When testing the script I was accessing the server as a certain user, and the known hosts information was saved to that user's ~/.ssh/known_hosts, even though the connecting user is actually a third-party user created just for this one job. The Django app runs as a different user that can't find the saved known hosts info because it doesn't have access to that folder. As far as I can tell, the user Apache uses to execute the Django scripts doesn't have a home directory.
Is there a way I can add this known host in a way that the Django process can see it?
Script:
import paramiko
client = paramiko.SSHClient()
client.load_system_host_keys()
client.connect('worker-server', 22, 'workeruser', 'workerpass')
code = "123wfdv"
survey_id = 111
stdin, stdout, stderr = client.exec_command('python3 /path/to/test_script/test.py %s %s' % ( code, survey_id ))
print( "ssh successful. Closing connection" )
stdout = stdout.readlines()
client.close()
print ( "Connection closed" )
output = ""
for line in stdout:
    output = output + line
if output != "":
    print( output )
else:
    print( "There was no output for this command" )
You can hard-code the host key in your Python code, using HostKeys.add:
import paramiko
from base64 import decodebytes
keydata = b"""AAAAB3NzaC1yc2EAAAABIwAAAQEA0hV..."""
key = paramiko.RSAKey(data=decodebytes(keydata))
client = paramiko.SSHClient()
client.get_host_keys().add('example.com', 'ssh-rsa', key)
client.connect(...)
This is based on my answer to:
Paramiko "Unknown Server".
To see how to obtain the fingerprint for use in the code, see my answer to:
Verify host key with pysftp.
If using pysftp, instead of Paramiko directly, see:
PySFTP failing with "No hostkey for host X found" when deploying Django/Heroku
Or, as you are connecting within a private network, you can give up on verifying host key altogether, using AutoAddPolicy:
client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(...)
(This can be done only if you really do not need the connection to be secure)
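A middle ground, as a sketch (the known_hosts path is an assumption; use wherever your app stores it), is to ship a known_hosts file with the app and point Paramiko at it explicitly, so the Django user needs no home directory:
import paramiko

client = paramiko.SSHClient()
# Load host keys from a file the Apache/Django user can read;
# the path here is hypothetical.
client.load_host_keys('/srv/myapp/known_hosts')
client.connect('worker-server', 22, 'workeruser', 'workerpass')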
Hi!
Situation: connect to destination.host over jump.host and run a command on destination.host which, in the background, connects to another.host (my SSH key is needed on that host).
Scheme: client --> jump.host --> destination.host --- remote_command with ssh key needed on the other host --> another.host
#!/usr/bin/python
import paramiko
jumpHost=paramiko.SSHClient()
sshKey = paramiko.RSAKey.from_private_key_file('path.to.key/file', password = 'the.passphrase')
jumpHost.set_missing_host_key_policy(paramiko.AutoAddPolicy())
jumpHost.connect('jump.hostname',username='foo', pkey = sshKey)
jumpHostTransport = jumpHost.get_transport()
dest_addr = ('destination.hostname', 22)
local_addr = ('jump.hostname', 22)
jumpHostChannel = jumpHostTransport.open_channel("direct-tcpip", dest_addr, local_addr)
destHost=paramiko.SSHClient()
destHost.set_missing_host_key_policy(paramiko.AutoAddPolicy())
destHost.connect('destination.hostname', username='foo', sock=jumpHostChannel, pkey=sshKey)
destHostAgentSession = destHost.get_transport().open_session()
paramiko.agent.AgentRequestHandler(destHostAgentSession)
stdin, stdout, stderr = destHost.exec_command("my.command.which.connects.to.another.host")
print(stdout.read())
print(stderr.read())
destHost.close()
jumpHost.close()
The above code works well if I run "local" commands on destination.host - e.g. uname, whoami, hostname, ls and so on. But if I run a command which connects in the background to another host where my SSH key is needed, the code raises this error:
raise AuthenticationException("Unable to connect to SSH agent")
paramiko.ssh_exception.AuthenticationException: Unable to connect to SSH agent
If I connect via PuTTY along the same chain, it works fine.
Can anyone give me a hint to resolve my problem?
Thanks in advance.
Assumption: your keys work across the jump host and the destination host.
Creating a local agent in that case will work. You could create it manually via a shell first and test it via IPython:
eval `ssh-agent`; ssh-add <my-key-file-path>
Programmatically, this can be done like so:
import subprocess

# Using shell=True is not a great idea because it is a security risk.
# Refer to this post - https://security.openstack.org/guidelines/dg_avoid-shell-true.html
subprocess.check_output("eval `ssh-agent`; ssh-add <my-key-file-path>", shell=True)
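One caveat: the agent's environment variables die with that subshell, so the Python process that runs Paramiko may still not see the agent. A sketch of capturing them (the key path is a placeholder, and the parsing assumes ssh-agent's usual output format):
import os
import re
import subprocess

# Start an agent, add the key, then print the SSH_* variables so this
# process can copy them into its own environment.
out = subprocess.check_output(
    "eval `ssh-agent`; ssh-add /path/to/key; env | grep ^SSH_",
    shell=True, text=True)
for name, value in re.findall(r"^(SSH_AUTH_SOCK|SSH_AGENT_PID)=(\S+)$", out, re.M):
    os.environ[name] = value  # Paramiko's Agent looks up SSH_AUTH_SOCK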
I am trying to do something similar and came across this post, I will update if I find a better solution.
EDIT: I have posted the implementation over here - https://adikrishnan.in/2018/10/25/agent-forwarding-with-paramiko/
I have a Python script that I'm trying to run on a Google Cloud Compute Engine instance. I have Jupyter Notebooks set up and running on the Compute Engine, and when I connect to it and run the SFTP connection script from a Notebook window on my laptop connected to the Compute Engine instance, it runs OK.
When I run the same script through the command line directly on the Compute Engine, I get an SSH error:
(2018-06-15 12:39:52; transport.py:1636) DEBUG:paramiko.transport: Adding ecdsa-sha2-nistp521 host key for sftp.############.com: b'ff97f###################'
(2018-06-15 12:39:52; transport.py:1636) DEBUG:paramiko.transport: Trying SSH agent key b'bc574#######################'
(2018-06-15 12:39:52; transport.py:1636) DEBUG:paramiko.transport: userauth is OK
(2018-06-15 12:39:52; transport.py:1636) INFO:paramiko.transport: Authentication (publickey) failed.
(2018-06-15 12:39:52; transport.py:1636) INFO:paramiko.transport: Disconnect (code 7): Bad service id
(2018-06-15 12:39:53; get_file_sftp.py:29) ERROR:__main__: SFTP Connection: Connection Error
I have tried generating a key using
ssh-keygen hostname
and it created an ECDSA key, but now when I do a regular ssh connection from the command line (which was working before I started messing with keygen) I get the error "shell request failed on channel 0", so I've made things worse and the code still doesn't run.
Doing an ssh-keyscan for the hostname returns two records, one RSA and one ECDSA.
Here is my Python code
import paramiko
from parse_config import fn_parse_config
from configparser import ConfigParser
import logging
import logging.config
import datetime
def fn_get_file_sftp():
    # initialise logging function
    logging.config.fileConfig('logging.conf', disable_existing_loggers=False)
    logger = logging.getLogger(__name__)
    # get details from config files
    sftp_params = fn_parse_config("etl_process.ini", "remote_sftp")
    sftp_folders = fn_parse_config("etl_process.ini", "sftp_folders")
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    #print('start')
    try:
        logger.info("SFTP Connection:Send credentials")
        ssh.connect(**sftp_params)
        logger.info("SFTP Connection:SUCCESS")
    except paramiko.SSHException:
        logger.error("SFTP Connection: Connection Error")
    sftp = ssh.open_sftp()
    #sftp.chdir("/vivobarefoot/")
    print(sftp.listdir("/"))
    for file in sftp.listdir("/"):
        sftp.get(remotepath=sftp_folders.get('remote_path') + file, localpath=sftp_folders.get('local_path') + file)
        logger.info("RETRIEVED SFTP FILE: " + file)
        sftp.remove(sftp_folders.get('remote_path') + file)
        logger.info("REMOVED SFTP FILE: " + file)
    ssh.close()
    logger.info("SFTP Connection:CLOSED")

fn_get_file_sftp()
OK, I found the solution. It came from running the command
gcloud compute ssh INSTANCE_NAME
Doing this created the SSH key files required by the Python script, so no changes were required to the Python code. I still get the error "shell request failed on channel 0" if I try to run a basic ssh connection from the command line in Ubuntu, so when I was trying to do it through Unix commands I clearly broke something, but my main requirement was to get the Python script working.
========================
Here is the source of the information I used to find this fix
https://groups.google.com/forum/#!topic/gce-discussion/PXEzoZQpcSo
add a rule to GCE firewall : $ gcloud compute firewall-rules create mySSH --allow tcp:22 --source-ranges 0.0.0.0/0
test the rule by telnetting : $ telnet IP 22
if telnet is working, then SSH is accepting connections, so try to ssh : $ gcloud compute ssh INSTANCE_NAME
I'm writing a fairly simple application which connects to a server through SSH (using Paramiko), does something, and writes the output to a web page. I wrote a script which works well when I run it from the command line. However, when I run it in a Django application, it can't get through the connect part.
SSH connect part:
transport = paramiko.Transport((host, port))
# application cannot get through this line
transport.connect(username = '***', password = '***')
output = ...
View:
def ssh_output(request):
return HttpResponse(output)
Any idea why does it behave like this? Is there any way to fix it?
I'm guessing your Django app may be running under a different user than the one you run your command-line script as. Also, I'm guessing it might be the first time the Django app user has tried to ssh to the host, so it may be hanging on some sort of 'is it OK to update ~/.ssh/known_hosts' question.
It looks like if you use SSHClient instead of Transport, you can set the missing host key policy to automatically add missing host keys:
import paramiko
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(your_host, port=your_port, username=your_username, password=your_password)
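A sketch of wiring that into the Django view from the question (host, credentials, and the command are placeholders):
import paramiko
from django.http import HttpResponse

def ssh_output(request):
    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect(your_host, port=your_port, username=your_username,
                password=your_password)
    # Run the remote command and collect its output for the response.
    stdin, stdout, stderr = ssh.exec_command('uname -a')  # placeholder command
    output = stdout.read().decode()
    ssh.close()
    return HttpResponse(output)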