I want to run a Python script on my server (that script has a GUI), but I want to start it over SSH. Something like this:
ssh me@server -i my_key "nohup python script.py"
…and let the script run forever.
But it complains "unable to access video driver", since it is trying to use my SSH terminal as its output.
Can I somehow make my command's output stay on the server machine and not come to my terminal? Basically something like "wake-on-LAN" functionality: tell the server what you want, and it does everything on its own system (without sending any output back).
What about
ssh me@server -i my_key "nohup python script.py >/dev/null 2>&1"
? :)
You can of course redirect to a remote logfile instead of /dev/null.
EDIT: GUI applications on X usually use the $DISPLAY variable to know where they should be displayed. Moreover, X11 display servers use authorization to allow or deny applications connecting to their display. The commands
export DISPLAY=:0 && xhost +
may be helpful for you.
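Putting those pieces together, here is a minimal sketch of launching the script from a local Python helper (the host name, key path, and script path are placeholders from the question); the remote command sets DISPLAY, detaches with nohup, and discards all output so nothing is sent back over SSH:
import subprocess

# Minimal sketch, assuming key-based auth; "server", "my_key" and "script.py"
# are placeholders. The remote command sets DISPLAY for the server's own X
# display, detaches with nohup and sends all output to /dev/null, so nothing
# comes back over the SSH channel.
subprocess.run([
    "ssh", "-i", "my_key", "me@server",
    "export DISPLAY=:0; nohup python script.py >/dev/null 2>&1 &",
], check=True)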
Isn't it possible for you to use a Python SSH library instead of calling an external application? (A rough sketch follows the list below.)
It would:
run as one process
guarantee that the invocation will be the same across all systems
avoid the overhead of spawning an external process
send everything through SSH (you won't have to worry about input like "; some locally executed command")
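For instance, a rough sketch with Paramiko, one such SSH library (the host, key, and script names are placeholders; key-based authentication is assumed):
import paramiko

# Rough sketch, assuming Paramiko and key-based auth; host, key and script
# names are placeholders.
client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect("server", username="me", key_filename="my_key")
# Detach the script on the server and return immediately.
client.exec_command("nohup python script.py >/dev/null 2>&1 &")
client.close()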
If not, go with what Piotr Wades suggested.
Related
I am trying to exit a Python interactive shell on an SSH server without closing the SSH connection; using exit(), quit(), or Ctrl-D closes the SSH connection.
I'm assuming your connection is to a Linux/Unix server. If it's Windows, this won't help.
If you only close the Python interpreter, it shouldn't close the SSH connection, since the Python interpreter is running on top of the Unix shell that you are actually connected to.
The best way (or at least the easiest) to keep your SSH connection and keep any program running after you leave is to use a tool like tmux, or screen if your Linux machine does not have tmux installed.
To do so, you can either start your program with $ screen python, or start screen before you run anything, which starts a screen session with bash running.
Then you can safely close the SSH connection and, when you SSH back into the machine, use screen -r to return to where you left off.
You can easily import the pty module
import pty
and then spawn a new bash shell "/bin/bash" with the module using
pty.spawn("/bin/bash")
Note: "bash" in this context can be changed depending on what shell is avaliable on the server which could be "sh", "dash" or any other type of unix compatible shell
ssh -t
The -t flag forces pseudo-terminal allocation, so control sequences are sent directly to the remote process.
The only solution I found was this:
cat MyScript.py | ssh username@ip_addr python -
The problem with this is that it won't show the script's output live; it waits until the program is finished and then displays the output.
I'm using scapy in this script to sniff packets, and I have to run it on that remote server (and no, I can't copy it there or write it there).
So what is the solution? How can I view the output of a script live in my command line?
I'm using Windows 10.
Important note: I also think ssh is doing some sort of buffering and sends the output only after the amount printed exceeds the buffer, since when the output is very large it does suddenly show part of it. I want the output to reach my computer as soon as possible, not after some buffer fills up.
You should first send the file to your remote machine using
scp MyScript.py username@ip_addr:/path/to/script
then SSH to your remote machine using
ssh username@ip_addr
And finally, you run your script normally:
python path/to/MyScript.py
EDIT
To execute your script directly without copying it to the remote machine, use this command:
ssh user@ip_addr 'python -s' < script.py
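If the buffering you mention is still a problem, much of it is usually Python's own stdout buffering when stdout is not a terminal; running the remote interpreter with -u disables that. A rough sketch from the Windows side, assuming the OpenSSH client is available (user, host, and script name are placeholders):
import subprocess

# Rough sketch: pipe the local script to an unbuffered remote interpreter.
# "python -u -" reads the program from stdin and disables output buffering,
# so prints arrive as soon as they happen. User, host and script name are
# placeholders.
with open("MyScript.py", "rb") as script:
    subprocess.run(["ssh", "username@ip_addr", "python -u -"], stdin=script, check=True)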
I have a problem running a non-terminating process via SSH from a Python script. What I need to do:
Connect to remote server via ssh
Run mongod on this server
Finish script execution without killing the mongod process
For now, I'm using subprocess.Popen in this way:
subprocess.Popen(['ssh', 'user#' + host, 'mongod', '-f', '/temp-mongo.conf', '&'])
The problem is that the script ends before I'm asked for the user password, so it finishes with Too many authentication failures for root.
I tried using p = subprocess.Popen(...).communicate() and it's almost OK, but then the script waits for the mongod command to finish, which obviously won't happen.
What is the proper way to do this? Can I do something to pass the password automatically?
I agree with e4c5 that you should use a tool like Fabric for this. If you want to stay with the quick-and-dirty way, something like this should work:
subprocess.call('ssh user#%s "mongod -f /temp-mongo.conf &>/dev/null &"' % host,
shell=True)
Note that you need to:
put quotes around the remote command
add &>/dev/null, which routes all of mongod's output to /dev/null (without this it will block; not 100% sure why, probably because the shell's stdout stays attached to the command)
use shell=True so the shell builds the command string for you (so you don't have to split it into a list at every space)
This also works with public-key authentication (instead of typing the password by hand).
The proper way to use SSH without having to enter passwords is public-key authentication.
I would also look at Fabric for running commands via SSH.
And you aren't running the command in a shell environment on the server. If you run
ssh host ls &
the & will actually put ssh in the background, not ls on the host. Instead try doing
ssh host sh -c 'ls &'
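Applied to the mongod example from the question, a rough sketch (the host value is a placeholder) would be:
import subprocess

host = "example.com"  # placeholder

# The whole remote command is a single string: sh -c runs it on the server,
# and nohup plus the redirections let mongod keep running after ssh
# disconnects.
subprocess.call(["ssh", "user@" + host,
                 "sh -c 'nohup mongod -f /temp-mongo.conf >/dev/null 2>&1 &'"])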
Keys are set up (ssh-keygen etc.) so I can just ssh remotehost without a password; shell = bash.
I am using a for loop to SSH to multiple nodes (remote hosts) and want to execute a script, but it doesn't seem to work:
for i in {1..10};
do
ssh -f node$i "python script.py $i"
done
The script hangs in the terminal and nothing happens.
Also, I can manually SSH in and use Python; PYTHONPATH etc. are configured on the nodes.
The nodes have csh, so I put exec /bin/bash in .cshrc, which at least gives me a bash shell when I log in manually, so the problem doesn't seem to be there.
Instead of wrapping the Python script in a shell script, you should probably have a Python script that connects to all the remote hosts via SSH and executes what you need.
Paramiko is a very good framework for this kind of use case. It will be much easier to maintain your script this way in the long run.
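For example, a rough sketch with Paramiko (node names and the script invocation are taken from the question; key-based authentication is assumed):
import paramiko

# Rough sketch, assuming key-based auth; node names and the script call come
# from the question.
for i in range(1, 11):
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect("node%d" % i)
    _, stdout, _ = client.exec_command("python script.py %d" % i)
    print(stdout.read().decode())  # wait for and print the output; drop this to fire and forget
    client.close()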
Use -o BatchMode=yes
and maybe you need -t to force pseudo-tty allocation, and -n to prevent reading from stdin.
I want to execute a Python script on several (15+) remote machines using SSH. After invoking the script/command, I need to disconnect the SSH session and keep the processes running in the background for as long as they are required.
I have used Paramiko and PySSH in the past, so I have no problem using them again. The only thing I need to know is how to disconnect an SSH session in Python (since normally the local script would wait for each remote machine to finish processing before moving on).
This might work, or something similar:
ssh user#remote.host nohup python scriptname.py &
Basically, have a look at the nohup command.
On Linux machines, you can run the script with 'at'.
echo "python scriptname.py" ¦ at now
If you are going to perform repetitive tasks on many hosts, for example deploying software and running setup scripts, you should consider using something like Fabric (a short sketch follows the quote below):
Fabric is a Python (2.5 or higher) library and command-line tool for
streamlining the use of SSH for application deployment or systems
administration tasks.
It provides a basic suite of operations for executing local or remote
shell commands (normally or via sudo) and uploading/downloading files,
as well as auxiliary functionality such as prompting the running user
for input, or aborting execution.
Typical use involves creating a Python module containing one or more
functions, then executing them via the fab command-line tool.
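For the original question (start the script on each host and then disconnect), a rough fabfile sketch using the Fabric 1 API quoted above might look like this (host names and the script name are placeholders):
# fabfile.py -- rough sketch with the Fabric 1 API; hosts and the script name
# are placeholders. Run it with: fab start_script
from fabric.api import env, run

env.hosts = ["user@host1", "user@host2"]  # the 15+ machines in practice

def start_script():
    # nohup + redirection so the process survives after Fabric disconnects
    run("nohup python scriptname.py >/dev/null 2>&1 &", pty=False)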
You can even use tmux in this scenario.
As per the tmux documentation:
tmux is a terminal multiplexer. It lets you switch easily between several programs in one terminal, detach them (they keep running in the background) and reattach them to a different terminal. And do a lot more.
From a tmux session, you can run a script, quit the terminal, log in again and check back, since tmux keeps the session alive until the server restarts.
How to configure tmux on a cloud server
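A rough sketch of driving that from Python across several machines (the host list and script name are placeholders):
import subprocess

# Rough sketch: start the script in a detached tmux session on each host so
# it keeps running after the SSH connection closes; hosts and the script
# name are placeholders.
for host in ["user@host1", "user@host2"]:
    subprocess.run(
        ["ssh", host, "tmux new-session -d -s myscript 'python scriptname.py'"],
        check=True,
    )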