mount via paramiko fails with "No such file or directory" - Python

I am using Python's paramiko to run commands on a remote Linux machine. My command "mount device dir" fails with "No such file or directory", even though exactly the same command succeeds when I run it remotely over a plain ssh session (connected via ssh, not via paramiko).
I have tried varying the values in /etc/fstab; the result is the same. Typed over ssh it works, but the same command via paramiko gives the error above.
Any ideas?
Example command (changed minimally from the original):
import paramiko
self.ssh = paramiko.SSHClient()
self.ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
self.ssh.connect('192.168.1.1', username='root', password='passwd')
stdin, stdout, stderr = self.ssh.exec_command("/bin/mount /dev/sda1")
gives me an error:
mount /dev/sda1 failed: mount: mounting /dev/sda1 on /media/card failed: No such file or directory
Contents of /etc/fstab:
/dev/sda1 /media/card vfat fmask=0000,dmask=0000 0 0
Of course, the /media/card directory exists. Again, I can run the above command manually via ssh and it works as expected.
Update:
Meanwhile I tried Python's fabric library (built on paramiko), exactly as described in Python - How do I authenticate SSH connection with Fabric module?
c = fabric.Connection(host = '192.168.1.1', user = "root", connect_kwargs={'password': 'passwd'})
c.run("/bin/mount /dev/sda1")
It gives exactly the same error message as paramiko directly.
Update 2: As a workaround, I now mount the drive using a direct ssh call, as suggested in the comments below. After my code does whatever is necessary, I unmount the drive using a "normal" paramiko call:
self.ssh.exec_command("/bin/umount /dev/sda1")
and it works. So now I am completely lost: mount as above fails, but umount works. This is really strange.
Update 3: I have also tried setting LD_LIBRARY_PATH to the location of mount's libraries (it needs both libm.so.6 and libc.so.6, both located in /lib):
self.ssh.exec_command("export LD_LIBRARY_PATH=/lib:/usr/lib && /bin/mount /dev/sda1")
yet again, no success.
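One way to narrow this down is a minimal diagnostic sketch (assuming the same self.ssh client as above): dump the remote environment with and without a pseudo-terminal and compare it with env | sort typed over ssh, since an exec channel typically does not run a login shell:
# Diagnostic sketch: print the remote environment as paramiko sees it.
stdin, stdout, stderr = self.ssh.exec_command("env | sort")
print(stdout.read().decode())
# Same command with a PTY, which is closer to an interactive session.
stdin, stdout, stderr = self.ssh.exec_command("env | sort", get_pty=True)
print(stdout.read().decode())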

I was able to get this to work (first draft; also, I am new to Python). Anyway, here is a snippet of my code.
The biggest hang-up for me was that there seems to be a 4-to-1 requirement for backslashes in the Windows hostname.
Make sure you have a share on the Windows PC first. My computer/share name in this case is "COMP_NAME/SHARE_NAME".
The username/password provided are your Windows credentials for accessing the share.
import sys
import paramiko
import constant

### START ###############################################################################
# connect to a GW device
# GW: hostname to connect to
# return: client connection object
def connectToClient(GW):
    try:
        client = paramiko.SSHClient()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        client.connect(GW, username=constant.GW_USER, password=constant.GW_PASS)
    except Exception:
        print("Unexpected error:", sys.exc_info()[0])
        return None
    return client
### END ################################################################################
### END ################################################################################
### START ###############################################################################
# execute a command on the remote device
# client: client connection object to the GW
# cmd: the command to execute
#      eg. 'ls -l'
# return: nothing (TODO: maybe return error info)
def exec(client, cmd):
    stdin, stdout, stderr = client.exec_command(cmd)
    for line in stdout:
        print(line.strip('\n'))
    #for line in stderr:
    #    print(line.strip('\n'))
    return
### END #################################################################################
# other stuff
# .
# .
# .
##########################################
# Start - upload the self extracting file to the GW
##########################################
#create the mount point
exec(client, "sudo mkdir /mnt/remote_files")
#mount the source directory (4 to 1 for the back slash chars in the UNC address ...)
exec(client, "sudo mount -t cifs -o username=oxxxxxxp,password=cxxxxxxxxx0 \\\\\\\\COMP_NAME\\\\SHARE_NAME /mnt/remote_files")
#copy the script file
exec(client, "cp /mnt/remote_files/selfextract.bsx rtls/scripts/selfextract.bsx")
#unmount the remote source
exec(client, "sudo umount /mnt/remote_files")
##########################################
# Done - upload the self extracting file to the GW
##########################################
# other stuff
# .
# .
# .
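To make the 4-to-1 backslash rule concrete, here is a small illustration (the share name is hypothetical): Python's string parser and the remote shell each strip one level of escaping, so every backslash that mount should see needs four in the Python source.
# Hypothetical illustration of the 4-to-1 backslash rule:
unc = "\\\\\\\\COMP_NAME\\\\SHARE_NAME"   # 8 + 4 backslashes in the source
print(unc)   # prints \\\\COMP_NAME\\SHARE_NAME (4 + 2 after Python parsing)
# the remote shell strips one more level, so mount sees \\COMP_NAME\SHARE_NAME
A raw string such as r"\\\\COMP_NAME\\SHARE_NAME" halves the noise, since only the remote shell's level of escaping remains.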
Hope this helps someone..
Pat


How to send data from a file in a Python docker container to remote SFTP server?

I have a Python script I am trying to run in a Docker container to send a file that is on this container to an SFTP server.
I tried the following:
import paramiko
ssh=paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
session = ssh.connect(hostname="X", port=X, username='X', password="X")
stdin,stdout,stderr = ssh.exec_command('sftp -P X ldzdl#hostname', get_pty=True)
I also tried the paramiko Transport method, but it didn't work from the remote (Docker container) to the remote SFTP server.
But I get the following error: paramiko.ssh_exception.AuthenticationException: Authentication failed.
How can I do this? I don't know if my method is okay or if there is a better way to solve it (sending data from the container to an SFTP server).
The argument given to the exec_command function is not a command you run in the local (client) host's shell; it is executed on the remote (server) host. While attempting to run a remote command is unlikely to produce an AuthenticationException, you did not post a full traceback in the question, so it is hard to tell for sure.
I suggest checking the following code:
import paramiko
ssh=paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
session = ssh.connect(hostname="X", port=X, username='X', password="X")
### so far, same code as in the question ###
print("Auth OK")
sftp_connection = ssh.open_sftp()
sftp_connection.put("local/file/path", "remote/file/path")
If you see the "Auth OK" print, you should be good to go; just replace the file path arguments of the sftp_connection.put() method with actual local and remote file paths.
Otherwise, there is an actual authentication issue that needs to be resolved.
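For completeness, since the question also mentions the Transport approach, here is a minimal sketch of that route (the "X" values are placeholders, as in the question):
import paramiko

# Sketch: SFTP over a raw Transport; replace the "X" placeholders with real values.
transport = paramiko.Transport(("X", 22))
transport.connect(username="X", password="X")
sftp = paramiko.SFTPClient.from_transport(transport)
sftp.put("local/file/path", "remote/file/path")
sftp.close()
transport.close()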

remote kubectl commands are not working in python [duplicate]

I am slowly trying to write a Python script to SSH and then FTP, to automate some manual file fetching I have to do all the time. I am using Paramiko and the session seems to connect and run commands, and it prints the directory, but my change-directory command doesn't seem to work; it prints the directory I start in: /01/home/.
import paramiko
hostname = ''
port = 22
username = ''
password = ''
#selecting PROD instance, changing to data directory, checking directory
command = {
    1: 'ORACLE_SID=PROD',
    2: 'cd /01/application/dataload',
    3: 'pwd'
}
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(hostname, port, username, password)
for key, value in command.items():
    stdin, stdout, stderr = ssh.exec_command(value)
    outlines = stdout.readlines()
    result = ''.join(outlines)
    print(result)
ssh.close()
When you run exec_command multiple times, each command is executed in its own "shell", so the previous commands have no effect on the environment of the following commands.
If you need the previous commands to affect the following ones, use the appropriate syntax of your server's shell. Most *nix shells use a semicolon or a double ampersand (with different semantics) to specify a list of commands. In your case, the double ampersand is more appropriate, as it executes the following commands only if the previous ones succeed:
command = "ORACLE_SID=PROD && cd /01/application/dataload && pwd"
stdin,stdout,stderr = ssh.exec_command(command)
In many cases, you do not even need to use multiple commands.
For example, instead of this sequence, which you might use when working with the shell interactively:
cd /path
ls
You can do:
ls /path
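In paramiko terms, the single combined command runs in one channel (a minimal sketch, reusing the ssh client from above):
# One channel, one command: no state needs to survive between exec_command calls.
stdin, stdout, stderr = ssh.exec_command("ls /path")
print(stdout.read().decode())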
See also:
How to get each dependent command execution output using Paramiko exec_command
Obligatory warning: Do not use AutoAddPolicy on its own – You are losing a protection against MITM attacks by doing so. For a correct solution, see Paramiko "Unknown Server".
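A minimal sketch of the safer setup (assuming the server's key is already present in ~/.ssh/known_hosts):
import paramiko

ssh = paramiko.SSHClient()
ssh.load_system_host_keys()                               # trust ~/.ssh/known_hosts
ssh.set_missing_host_key_policy(paramiko.RejectPolicy())  # refuse unknown hosts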
Well, by accidentally trying something, I believe I managed to figure this out. You need to run all the commands at one time; you do not need to do them in a loop. For my instance it would be:
import paramiko
hostname = ''
port = 22
username = ''
password = ''
#selecting PROD instance, changing to data directory, checking directory
command = 'ORACLE_SID=PROD;cd /01/application/dataload;pwd'
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(hostname, port, username, password)
stdin, stdout, stderr = ssh.exec_command(command)  # pass the combined command string here
outlines = stdout.readlines()
result = ''.join(outlines)
print(result)
ssh.close()

python 3 paramiko ssh agent forward over jump host with remote command on third host

Hello!
Situation: connect to destination.host over jump.host and run a command on destination.host that connects in the background to another.host (my ssh key is needed on that host).
Scheme: client --> jump.host --> destination.host --- remote_command with ssh key needed on the other host --> another.host
#!/usr/bin/python
import paramiko
jumpHost=paramiko.SSHClient()
sshKey = paramiko.RSAKey.from_private_key_file('path.to.key/file', password = 'the.passphrase')
jumpHost.set_missing_host_key_policy(paramiko.AutoAddPolicy())
jumpHost.connect('jump.hostname',username='foo', pkey = sshKey)
jumpHostTransport = jumpHost.get_transport()
dest_addr = ('destination.hostname', 22)
local_addr = ('jump.hostname', 22)
jumpHostChannel = jumpHostTransport.open_channel("direct-tcpip", dest_addr, local_addr)
destHost=paramiko.SSHClient()
destHost.set_missing_host_key_policy(paramiko.AutoAddPolicy())
destHost.connect('destination.hostname', username='foo', sock=jumpHostChannel, pkey=sshKey)
destHostAgentSession = destHost.get_transport().open_session()
paramiko.agent.AgentRequestHandler(destHostAgentSession)
stdin, stdout, stderr = destHost.exec_command("my.command.which.connects.to.another.host")  # exec_command returns (stdin, stdout, stderr) in this order
print(stdout.read())
print(stderr.read())
destHost.close()
jumpHost.close()
The above code works well if I run "local" commands on destination.host - e.g. uname, whoami, hostname, ls and so on. But if I run a command that connects in the background to another host where my ssh key is needed, the code fails with the error:
raise AuthenticationException("Unable to connect to SSH agent")
paramiko.ssh_exception.AuthenticationException: Unable to connect to SSH agent
If I connect via PuTTY over the same chain, it works well.
Can anyone give me a hint to resolve my problem?
Thx in advance.
Assumption: your keys work across the jump host and the destination host.
Creating a local agent in that case will work. You could create it manually via the shell first and test it via IPython:
eval `ssh-agent`; ssh-add <my-key-file-path>
Programmatically, this can be done as follows:
# Using shell=True is not a great idea because it is a security risk.
# Refer this post - https://security.openstack.org/guidelines/dg_avoid-shell-true.html
subprocess.check_output("eval `ssh-agent`; ssh-add <my-key-file-path>", shell=True)
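One caveat with the snippet above: the variables printed by ssh-agent are only set inside that subshell, so the Python process itself never sees SSH_AUTH_SOCK. A sketch of one way to wire it up (parsing ssh-agent's output into os.environ; the key path is hypothetical):
import os
import re
import subprocess

# Start an agent, export its socket into this process's environment,
# then add the key so agent-aware code (e.g. paramiko) can find it.
out = subprocess.check_output(["ssh-agent", "-s"], text=True)
sock = re.search(r"SSH_AUTH_SOCK=([^;]+);", out)
if sock:
    os.environ["SSH_AUTH_SOCK"] = sock.group(1)
subprocess.check_call(["ssh-add", "/path/to/key.file"])  # hypothetical key path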
I am trying to do something similar and came across this post; I will update if I find a better solution.
EDIT: I have posted the implementation over here - https://adikrishnan.in/2018/10/25/agent-forwarding-with-paramiko/


Automating SSH using Python

I have 30 virtual machines and I am doing everything manually right now, which is quite a lot of trouble. I am trying to write a script so I can connect to them all automatically and perform my desired tasks. I made a flow diagram of what I need to do. Can anybody give me some hints on how to achieve this task programmatically? I am attaching the flow diagram.
Thanks in advance.
Reading a text file and getting the data is trivial:
with open('host.txt', 'r') as inf:
    lines = inf.readlines()
hostlist = [ln.split() for ln in lines]
Now hostlist should be a list of lists:
[['192.168.0.23', 'root', 'secret'], ['192.168.0.24', 'root', 'secret2'] ...
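Tying that back to paramiko, a connection loop over hostlist might look like this (a minimal sketch, assuming the three-column host/user/password format above):
import paramiko

for ip, user, pw in hostlist:
    ssh = paramiko.SSHClient()
    ssh.load_system_host_keys()
    ssh.connect(ip, username=user, password=pw)
    stdin, stdout, stderr = ssh.exec_command('uname -a')  # example task
    print(ip, stdout.read().decode().strip())
    ssh.close()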
But your list shouldn't have to contain more than the hostnames. IP addresses can be obtained from DNS, which you have to configure anyway, and ssh can log in without passwords if configured correctly.
Putting the passwords for all your virtual hosts in a plain text file has security concerns. If you want to go that route, make sure to restrict access to that file!
You can use subprocess to execute commands. I would suggest using rsync to push the desired files to the virtual machines. That minimizes network traffic, and you can deploy directly from filesystem to filesystem without having to roll a tarball. It can be as simple as:
status = subprocess.check_output(['rsync', '-av', localdir, remotedir])
Where localdir is the directory where the files for the virtual machine in question are stored (it should end with a '/'), and 'remotedir' is the hostname::directory on the virtual machine where the data should land (this should not end with a '/').
For executing commands remotely, ssh is the way to go. Configure that for passwordless login, using ssh's authorized_keys file on each remote host. Then you don't need to put passwords in your list of hosts.
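With that configured, a paramiko login needs no password at all; a minimal sketch (the hostname and key path are hypothetical):
import paramiko

ssh = paramiko.SSHClient()
ssh.load_system_host_keys()
ssh.connect('vm01.example.net', username='root',
            key_filename='/home/me/.ssh/id_rsa')  # hypothetical host and key
stdin, stdout, stderr = ssh.exec_command('uptime')
print(stdout.read().decode())
ssh.close()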
Fabric is the best solution for you. Fabric is built on paramiko; it makes it very easy to run commands on a remote host and provides functions to upload and download files from the remote host.
Here it is: http://docs.fabfile.org/en/1.5/
The docs also cover the put function.
I don't fully get your question from the flow diagram; however, you can use paramiko as suggested. I have a large number of background utilities written on top of paramiko that enable support people to monitor remote web servers in the browser. Snippet below:
client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.load_system_host_keys()
client.connect('192.168.100.1', port=4001, username='monitor', password='XXXX')
cmds = ["sed -i 's/\/bin\/date -u/\/bin\/date/g' /etc/cron.hourly/reboot"]
for cmd in cmds:
    if __DEBUG_MODE__:
        print('Executing..... ' + cmd)
    stdin, stdout, stderr = client.exec_command(cmd)
Also, if you want to push files, below is the snippet:
# (requires paramiko and traceback imported at module level)
def setupSFTPClient(self, ip_add):
    print('Setting SFTP client: ' + ip_add)
    tb = 'Finished SFTP-ing. '
    try:
        client = paramiko.SSHClient()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        client.load_system_host_keys()
        client.connect(ip_add, port=4001, username='monitor', password='XXXX')
        sftp_client = client.open_sftp()
        # NB you need the filename in the remote path, otherwise
        # paramiko barfs with IOError: Failure
        sftp_client.put('/home/projects/portal_release.tgz', '/var/ND/portal_release.tgz')
        sftp_client.put('/home/projects/portal_installer.sh', '/var/ND/portal_installer.sh')
        sftp_client.close()
    except Exception as e:
        print(e)
        tb = traceback.format_exc()
    finally:
        print(tb)
        client.close()
