I need to run some bash commands via Fabric API (ssh).
I have the following string in my Python module:
newCommand = command + "'`echo -ne '\\015'"
When I print this string directly in Python the output is the expected:
command'`echo -ne '\015'
However, if I try to run this command via the Fabric API the command is somehow modified into this:
/bin/bash -l -c "command'\`echo -ne '\015'"
Notice the '\' before 'echo'. Why is this happening? The '\' is breaking my command and I can't successfully run the command.
PS: the prefix "/bin/bash -l -c" is expected, since that's how Fabric runs commands over SSH
This is not a valid shell command:
command'`echo -ne '\015'
Even if you add the missing backtick and single quote, it's nothing like writing "command" and pressing enter.
The context your command will be run in is basically what you'd get if you'd ssh and paste a command:
clientprompt$ ssh host
Welcome to Host, User
hostprompt$ <COMMAND HERE>
You should focus your efforts on finding a single command that does what you want, not on a series of keypresses that you would type to do it interactively (that's not how SSH works).
Related
I am trying to make a Python program write to a root-protected file. This uses the python notify module; I am trying to get the program to use the registered endpoint.
On the console these both work and write sometext in the file /root/.config/notify-run:
sudo sh -c 'echo sometext >> /root/.config/notify-run'
echo sometext | sudo tee /root/.config/notify-run
Now in python I tried:
link = 'the endpoint'
command = ['sudo sh -c "echo', link, ' >>/root/.config/notify-run"']
subprocess.call(command, shell=True)
This returns:
syntax error unterminated quoted string
And trying:
link = 'the endpoint'
command = ['echo', link, '| sudo tee -a /root/.config/notify-run']
subprocess.call(command, shell=True)
Returns no error but does not write the endpoint in the file.
Does anyone know how to fix this, using this or other code that does the same thing I am trying to do here?
Use a string command rather than an array. This works for me:
link = 'the endpoint'
command = 'echo ' + link + ' | sudo tee -a /root/.config/notify-run'
subprocess.call(command, shell=True)
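If the endpoint can contain shell metacharacters, concatenating it into a shell string like the above can still break (or be abused). A shell-free sketch that feeds the text to tee on stdin instead; the helper name is illustrative, not from the question:

```python
import subprocess

def append_via_tee(path, text, use_sudo=False):
    """Append text to a file via tee, optionally under sudo.

    Because the text is passed on stdin rather than spliced into a
    shell command line, odd characters in it cannot break the command.
    """
    cmd = (["sudo"] if use_sudo else []) + ["tee", "-a", path]
    subprocess.run(cmd, input=text + "\n", text=True, check=True,
                   stdout=subprocess.DEVNULL)  # tee echoes input; discard it

# For the question's root-owned file (needs a sudo password or NOPASSWD):
# append_via_tee("/root/.config/notify-run", link, use_sudo=True)
```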
However, I advise you to edit the notify-run file directly from your Python script and run the whole script with root privileges, so you don't have to run sudo at all, unless your script does much more than writing to that file.
Typing sudo in your command would normally trigger the password prompt, and the call would probably fail there.
The above answer suggests running your script as root directly; if that's not possible, you can move the sudo tee -a /root/.config/notify-run command into a separate script file.
Give that script sudo access in the /etc/sudoers file and execute it.
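If the whole script is started with sudo, as the first answer recommends, the shell round-trip disappears entirely. A minimal sketch (path and variable names taken from the question):

```python
def append_endpoint(path, link):
    """Append the notify-run endpoint to the config file directly."""
    with open(path, "a") as f:
        f.write(link + "\n")

# When the script itself runs as root, no further privilege
# escalation is needed:
# append_endpoint("/root/.config/notify-run", "the endpoint")
```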
I want to run a local Python script on a remote server by using Posh-SSH, which is a PowerShell module.
This topic mentions doing it with regular ssh:
Get-Content hello.py | ssh user@192.168.1.101 python -
I would like to use the Posh-SSH module here, but I can't figure out how to achieve it...
I tried something like this, but it doesn't work:
Invoke-SSHCommand -Index $sessionid.sessionid -Command "$(Get-Content hello.py) | python3 -u -" -ErrorAction Stop
EDIT
No errors are shown; it just gets stuck and does nothing...
EDIT2
OK, I understand now why I get this error when I send a multi-line .py file.
The newlines are not carried over; look at the command that will be sent to the remote server:
python3 -u - <<"--ENDOFPYTHON--"
chevre="basic" print("Hello, World!")
--ENDOFPYTHON--
and not:
python3 -u - <<"--ENDOFPYTHON--"
chevre="basic"
print("Hello, World!")
--ENDOFPYTHON--
EDIT3
And finally, done!
Thanks to this topic I managed to convert the whitespace back to newlines, as it should be.
To do this, just use
( (Get-Content hello.py) -join "`r`n")
instead of simple
$(Get-Content hello.py)
The final line will be:
$cmd = "python3 -u - <<`"--ENDOFPYTHON--`"`n$((Get-Content $pyscript) -join "`r`n")`n--ENDOFPYTHON--"
Invoke-SSHCommand -Index $sessionid.sessionid -Command $cmd -ErrorAction Stop
Also don't forget to remove the line
#!/usr/bin/env python3
if present at the top of your .py file; otherwise it won't work.
You are currently sending the following to the SSH endpoint:
# Example Python Added: would be replaced with your code
print("Hello, World!") | python3 -u -
Assuming bash as the endpoint shell, the above is invalid and would either produce an error or hang.
You need to encapsulate the code being sent to the server using echo (assuming bash ssh endpoint).
Invoke-SSHCommand -Index $sessionid.sessionid -Command "echo '$((Get-Content hello.py) -join "`r`n")' | python3 -u -" -ErrorAction Stop
The above will now send:
echo 'print("Hello, World!")' | python3 -u -
This works as long as you don't use single quotes. However, if you must use those or other special characters, you may need to use a here document instead:
Invoke-SSHCommand -Index $sessionid.sessionid -Command "python3 -u - <<`"--ENDOFPYTHON--`"`n$((Get-Content hello.py) -join "`r`n")`n--ENDOFPYTHON--" -ErrorAction Stop
The here document sends its contents verbatim to the standard input stream of the program: tabs, spaces, quotes and all. Therefore the above will send the following:
python3 -u - <<"--ENDOFPYTHON--"
print("Hello, World!")
--ENDOFPYTHON--
You can replace --ENDOFPYTHON-- with anything as long as it does not appear in your python file.
Reference for here docs
UPDATE:
Added -join "`r`n" as it is needed for newlines to be sent correctly, as pointed out by the asker.
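For readers building the same command from Python rather than PowerShell, the here-document string can be assembled like this (a sketch; the delimiter and the python3 -u - invocation are taken from the answer above, the helper name is illustrative):

```python
def heredoc_command(script_text, delim="--ENDOFPYTHON--"):
    """Wrap Python source in a here-document for a remote POSIX shell.

    The quoted delimiter ("--ENDOFPYTHON--") tells the remote shell not
    to expand anything inside the body, so quotes and $ pass through
    untouched.
    """
    return 'python3 -u - <<"{d}"\n{s}\n{d}'.format(d=delim, s=script_text)
```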
I need to execute multiple shell commands in an SSH session using the subprocess module.
I am able to execute one command at a time with:
subprocess.Popen(["ssh", "-o UserKnownHostsFile=/dev/null", "-o StrictHostKeyChecking=no", "%s" % <HOST>, <command>])
But is there a way to execute multiple shell commands with subprocess in an SSH session? If possible, I don't want to use third-party packages.
Thank you very much!
Strictly speaking you are only executing one command with Popen no matter how many commands you execute on the remote server. That command is ssh.
To have multiple commands executed on the remote server, just pass them in your command string separated by ;:
commands = ["echo 'hi'", "echo 'another command'"]
subprocess.Popen([
    "ssh",
    "-o UserKnownHostsFile=/dev/null",
    "-o StrictHostKeyChecking=no",
    host,  # the remote host, as in your single-command example
    ";".join(commands)
])
You could alternatively join commands on && if you wanted each command to only execute if the previous command had succeeded.
If you have many commands and you are concerned that you might exceed the command line limits, you could execute sh (or bash) with the -s option on the remote server, which will execute commands one-by-one as you send them:
p = subprocess.Popen([
    "ssh",
    "-o UserKnownHostsFile=/dev/null",
    "-o StrictHostKeyChecking=no",
    host,  # the remote host, as before
    "sh -s",
], stdin=subprocess.PIPE)
for command in commands:
    p.stdin.write(command)
    p.stdin.write("\n")
    p.stdin.flush()
p.communicate()
Note that in Python3 you will need to encode the command to a byte string (command.encode("utf8")) before writing it to the stdin of the subprocess.
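A sketch of the same approach using Python 3's text mode, which handles the encoding for you. The ssh arguments and host are placeholders; the helper is generic, so it can also be exercised against a local sh -s:

```python
import subprocess

def run_via_stdin(argv, commands):
    """Start argv and feed it one shell command per line on stdin.

    text=True makes the pipes str-based, so no manual encode() is needed.
    """
    p = subprocess.Popen(argv, stdin=subprocess.PIPE,
                         stdout=subprocess.PIPE, text=True)
    out, _ = p.communicate("\n".join(commands) + "\n")
    return out

# Remote usage (sketch; example.com is a placeholder host):
# run_via_stdin(["ssh", "-o", "StrictHostKeyChecking=no",
#                "example.com", "sh -s"], ["echo hi", "uname -a"])
```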
I feel like this is overkill for most simple situations though, where the initial suggestion is simplest.
I am using a Python script to restrict command usage via the command option in the authorized_keys file.
command:
ssh host-name bash --login -c 'exec $0 "$@"' mkdir -p hello
My script performs the required actions to restrict the commands. After filtering, the Python script does sys.exit(1) on error and sys.exit(0) on success. But after the script returns, the command in the ssh invocation above never gets executed. Is there something else I need to send from the Python script to the SSH daemon?
The command modifier in authorized_keys is not (only) used to validate the user's command: that command is run instead of the command provided by the user. This means calling sys.exit(0) from there prevents the user-provided command from running at all.
In that script, after you validate the command, you need to run it too!
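A minimal sketch of such a validating wrapper, assuming it is installed as the command= script in authorized_keys. The whitelist and helper names are illustrative, not from the question:

```python
import shlex
import subprocess

ALLOWED = {"mkdir", "ls", "echo"}  # hypothetical whitelist

def run_if_allowed(original):
    """Validate the client's command and, if valid, actually execute it.

    Returns the exit status to hand back to the SSH client.
    """
    argv = shlex.split(original)
    if not argv or argv[0] not in ALLOWED:
        return 1  # reject: exit non-zero without running anything
    return subprocess.call(argv)  # accept: run the user's command

# In the wrapper script itself you would end with something like:
#   import os, sys
#   sys.exit(run_if_allowed(os.environ.get("SSH_ORIGINAL_COMMAND", "")))
# sshd exposes the client's original command line in SSH_ORIGINAL_COMMAND.
```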
I think changing it to
ssh host-name bash --login -c 'exec $0 "$@" && mkdir -p hello'
should do the trick, otherwise bash will assume only the part in the single quotes is the command to execute.
If the second part should be executed even if the first part fails, replace the && with ;
I use the Python module pysftp to connect to a remote server. Below you can see the Python code:
import pysftp
import sys
import sqr_common
srv = pysftp.Connection(host="xxxxxx", username="xxxx",
                        password="xxxxx")
command = "/usr/bin/bash"
command2="APSHOME=/all/aps/msc_2012; export APSHOME; "
srv.execute(command)
srv.execute(command2)
srv.close()
The problem is that the command /usr/bin/bash starts a shell that never returns, so my second command will never be executed. Can anyone tell me how to choose a shell (for example bash) on the remote server and execute a command in it? Is there any pysftp function that allows choosing the shell?
Try this:
/usr/bin/bash -c "APSHOME=/all/aps/msc_2012; export APSHOME; "
This problem is not specific to Python; it is more about how to execute commands under a specific shell.
If you need to run only a single command, you can use the bash -c switch:
bash -c "echo 123"
You can run multiple commands, separated by ;:
bash -c "echo 123 ; echo 246"
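Applied to the pysftp script in the question, the two statements can be folded into one bash -c call; a sketch (the helper name is illustrative):

```python
def bash_c(*commands):
    """Build a single bash -c invocation from several shell commands."""
    return '/usr/bin/bash -c "{}"'.format("; ".join(commands))

# With the question's connection object:
# srv.execute(bash_c("APSHOME=/all/aps/msc_2012", "export APSHOME"))
```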
If you need to run many commands under a specific shell, remotely create a shell script file (a .bash file) and execute it:
bash myscript.bash