Execute remote Python script via SSH

I want to execute a Python script on several (15+) remote machines using SSH. After invoking the script/command I need to disconnect the SSH session and keep the processes running in the background for as long as they are required.
I have used Paramiko and PySSH in the past, so I have no problem using them again. The only thing I need to know is how to disconnect an SSH session in Python (since normally the local script would wait for each remote machine to finish processing before moving on).

This might work, or something similar:
ssh user@remote.host nohup python scriptname.py &
Basically, have a look at the nohup command.
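If you go the Paramiko route, here is a minimal sketch of the same idea (host names are placeholders). Because the remote command is backgrounded with nohup and its output redirected, exec_command returns immediately and the client can disconnect while the script keeps running:
import paramiko

hosts = ["host1.example.com", "host2.example.com"]  # your 15+ machines (placeholders)

for host in hosts:
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(host, username="user")  # or key_filename=... / password=...
    # nohup + redirection + & detaches the remote process, so this call returns at once
    client.exec_command("nohup python scriptname.py > /dev/null 2>&1 &")
    client.close()  # disconnect; the remote script keeps running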

On Linux machines, you can run the script with 'at'.
echo "python scriptname.py" ¦ at now

If you are going to perform repetitive tasks on many hosts, for example deploying software and running setup scripts, you should consider using something like Fabric:
Fabric is a Python (2.5 or higher) library and command-line tool for streamlining the use of SSH for application deployment or systems administration tasks.
It provides a basic suite of operations for executing local or remote shell commands (normally or via sudo) and uploading/downloading files, as well as auxiliary functionality such as prompting the running user for input, or aborting execution.
Typical use involves creating a Python module containing one or more functions, then executing them via the fab command-line tool.
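A minimal fabfile sketch along those lines, assuming the classic Fabric 1.x API (fabric.api); the host names are placeholders, and pty=False plus output redirection keeps the remote process alive after fab disconnects:
from fabric.api import env, run

env.hosts = ["user@host1", "user@host2"]  # the 15+ machines (placeholders)

def start_job():
    # background the script and detach it from the pty so it survives the disconnect
    run("nohup python scriptname.py > /dev/null 2>&1 &", pty=False)
Running fab start_job then executes the task on every host listed in env.hosts.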

You can even use tmux in this scenario.
As per the tmux documentation:
tmux is a terminal multiplexer. It lets you switch easily between several programs in one terminal, detach them (they keep running in the background) and reattach them to a different terminal. And do a lot more.
From a tmux session, you can run a script, quit the terminal, log in again and check back later, since tmux keeps the session alive until the server restarts.
How to configure tmux on a cloud server
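For example, a detached session can be started straight from the ssh command (session name, user and host are placeholders):
ssh user@remote.host "tmux new-session -d -s myjob 'python scriptname.py'"
Log in later and run tmux attach -t myjob to check on it.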

Related

Invoking multiple docker commands via python

I am trying to invoke a Docker instance from a Python subprocess (Windows / WSL).
Let's assume that what I need to run is a simple docker run -it busybox (it's not going to be that, but it's a shell for experimenting). Once it has loaded, I need to programmatically send it some commands (asynchronous or blocking, either way is fine) for git to pull some sources, then compile and deploy them (before invoking Docker, I ask the user to choose a tag from a set of repos).
So far, using a normal subprocess.Popen, I was able to tap into Docker, but I need this to stay persistent until I leave the interactive shell (from busybox).
Is this possible, or does the subprocess stop at the next command once it is done (as happens now)?
(PS I can post some of my code, but I need to clean up some bits first)
Could you possibly do a simple while loop? My other thought would be: if the commands are the same each time they are called, put them in a batch file and call the batch file from Python. That's all I can come up with without seeing code.
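As a separate rough sketch of the piping approach described in the question (image, repo URL and commands are placeholders, and the image would need git installed): drop -t, keep -i, and feed the container's shell through the pipe that Popen gives you.
import subprocess

# keep the container's shell open and feed it commands over stdin
proc = subprocess.Popen(
    ["docker", "run", "-i", "--rm", "busybox", "sh"],
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT,
    text=True,
)
proc.stdin.write("echo pulling sources...\n")
proc.stdin.write("git clone https://example.com/some/repo.git\n")  # placeholder; busybox itself has no git
proc.stdin.write("exit\n")
proc.stdin.flush()
output, _ = proc.communicate()
print(output)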

Minecraft Server Executing Scripts

I am working on a Minecraft world that can interact with a terminal shell and run commands on the computer directly. I intend to use not just the vanilla server but maybe CraftBukkit or Spigot.
Is it possible to create a listener on the Minecraft server.jar that waits for a certain command and then executes a script on the computer itself?
Is there a plugin out there made for this purpose?
You can create a new plugin for Bukkit/Spigot (more information here).
In the onCommand method you can then call Runtime.getRuntime().exec("your shell command") to run commands in the Linux shell (this also works on Windows servers).
See also the Java documentation.

Spawning external processes from Jenkins - spawned but not executing

I am attempting to launch a couple of external applications from a Jenkins build step on Windows 7 64-bit. They are essentially programs designed to interact with each other and perform a series of regression tests on some software. Jenkins is run as a Windows service under a user with admin privileges on the machine. I think that's full disclosure on any weirdness with my Jenkins installation.
I have written a Python 3 script that successfully does what I want when run from the Windows command line. When I run this script as a Jenkins build step, I can see that the applications have been spawned via the Task Manager, but there is no CPU activity associated with them and no other evidence that they are actually doing anything (they normally produce log files, etc., but none of these appear). One of the applications typically runs at 25% CPU during the course of the regression tests.
The Python script itself runs to completion as if everything is OK. Jenkins is correctly monitoring the output of the script, which I can watch from the job's console output. I'm using os.spawnv(os.P_NOWAIT, ...) for each of the external applications. The subprocess module doesn't do what I want; I just want these programs to run externally.
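For reference, the spawn calls in the script are roughly this shape (paths and arguments are placeholders):
import os

exe = r"C:\regression\app.exe"  # placeholder path
# P_NOWAIT returns the new process handle immediately instead of waiting for the app to exit
pid = os.spawnv(os.P_NOWAIT, exe, [exe, "--run-tests"])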
I've even run a bash script via Cygwin that functionally does the same thing as the Python script with the same results. Any idea why these applications spawn but don't execute?
Thanks!

run ssh command remotely without redirecting output

I want to run a Python script on my server (that Python script has a GUI). But I want to start it from SSH, something like this:
ssh me@server -i my_key "nohup python script.py"
... > let the script run forever
But it complains "unable to access video driver", since it is trying to use my SSH terminal as output.
Can I somehow make my command's output go to the server machine and not to my terminal? Basically something like "wake-on-LAN" functionality -> tell the server you want something and it will do everything using its own system (not sending any output back).
What about
ssh me@server -i my_key "nohup python script.py >/dev/null 2>&1"
? :)
You can use redirection to some remote logfile instead of /dev/null, of course.
EDIT: GUI applications on X usually use the $DISPLAY variable to know where they should be displayed. Moreover, X11 display servers use authorization to permit or disallow applications connecting to their display. The commands
export DISPLAY=:0 && xhost +
may be helpful for you.
Isn't it possible for you to use a Python SSH extension instead of calling an external application?
It would:
run as one process
guarantee that the invocation will be the same on all possible systems
lose the overhead from "execution"
send everything through SSH (you won't have to worry about input like "; some locally executed command")
If not, go with what Piotr Wades suggested.
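For instance, a rough Paramiko equivalent of the nohup command above, assuming key-based auth and an X server already listening on the server's display :0:
import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect("server", username="me", key_filename="my_key")
# point the GUI at the server's own X display and detach it from the SSH session
client.exec_command("export DISPLAY=:0; nohup python script.py > /dev/null 2>&1 &")
client.close()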

How to debug long running python scripts or services remotely?

Pretty much what the title says: I would like to be able to connect to a Python process running under paster or uwsgi and use pdb functionality.
Using winpdb, you can attach to a running process like this:
1. Insert import rpdb2; rpdb2.start_embedded_debugger('mypassword') inside your script.
2. Launch your script (through paster or uwsgi) as usual.
3. Run winpdb.
4. Click File > Attach.
5. Type in the password (e.g. "mypassword") and select the process.
To detach, click File > Detach. The script will continue to run, and can be attached to again later.
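A minimal sketch of step 1 inside a long-running service; fAllowRemote is an optional rpdb2 flag for attaching from a different machine (an assumption here, so check your rpdb2 version's defaults):
import rpdb2

# the script pauses here waiting for a winpdb client to attach (subject to rpdb2's timeout),
# then continues; fAllowRemote=True is only needed when winpdb runs on another machine
rpdb2.start_embedded_debugger('mypassword', fAllowRemote=True)

# ... the rest of the long-running paster/uwsgi application ...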
