Python: capture and store curl command process ID

I have a Python script which uses threading to run a process.
It starts three threads, each executing a curl command on a different server.
I am not using threading sub-processes or any Python sub-processes (my version of Python does not support it), and I want to kill these specific running curl processes on the remote server.
I want to capture the process ID at the time each curl command is executed, put it in a list, and then use this list of PIDs to kill the processes if necessary.
What is the best way to do this?
I have tried a few ways but nothing is working.
curl command & echo $!
will print the PID to the screen, but I want to capture it. I have tried setting it as a variable and exporting it, with no luck.
If I try:
ret,output = remoteConnection.runCmdGetOutput(cmd)
it returns the output (which includes the PID) of only one curl command (the first one), not all three (I think this is due to the threading), and the parent script continues.
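To illustrate, this is roughly what I am aiming for in each thread (just a sketch; runCmdGetOutput is my existing wrapper and is assumed to return a status and the combined output):

import threading

pids = []                      # PIDs of the remote curl processes
pids_lock = threading.Lock()   # protect the list across the three threads

def run_curl(remoteConnection, curl_cmd):
    # Background curl on the remote host and echo its PID.
    ret, output = remoteConnection.runCmdGetOutput(curl_cmd + " > /dev/null 2>&1 & echo $!")
    pid = output.strip().splitlines()[-1]   # the echoed PID should be the last line
    pids_lock.acquire()
    try:
        pids.append(pid)
    finally:
        pids_lock.release()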
Any ideas?
Thanks

Related

Paramiko read stdout after every stdin [duplicate]

I am writing a program in Python which must communicate through SSH with a physical target and send it some commands automatically (it is for testing).
I started doing this with Paramiko and everything was perfect until I had to send several commands where, for example, the second one must execute in the context of the first (for example, the first one is cd /mytargetRep and the second one is ./executeWhatIWant). I can't use exec_command for that, because each exec_command starts a new session.
I tried to use a channel with invoke_shell(), but I have another problem with that: I don't know when command execution has ended. Some commands finish very quickly while others take much longer, so I need to know when the command execution is over.
I know a workaround is to use exec_command with shell logic operators such as && or ;, for example exec_command("cd /mytargetRep && ./executeWhatIWant"). But I can't do that, because it must also be possible to execute commands manually (I have a minimalist terminal where I can send commands), so the user will type cd /mytargetRep and then ./executeWhatIWant, not cd /mytargetRep && ./executeWhatIWant.
So my question is: is there a solution using Paramiko to send several commands in the same SSH session and be able to know when each command's execution has ended?
Thanks
It seems that you want to implement an interactive shell, yet you need to control the execution of individual commands. That's not really possible with just the SSH interface. The "shell" channel in SSH is a black box with an input and an output. So there's nothing in Paramiko that will help you implement this.
If you need to find out when a specific command finishes or where the output of a specific command ends, you have to use features of the shell itself.
You can solve that by inserting a unique separator (string) in between the commands and searching for it in the channel output stream. With common *nix shells something like this works:
channel = ssh.invoke_shell()
channel.send('cd /mytargetRep\n')
channel.send('echo unique-string-separating-output-of-the-commands\n')
channel.send('./executeWhatIWant\n')
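A sketch of the reading side, assuming a blocking channel (note that with invoke_shell the line you send is usually echoed back as well, so you may need to skip the first occurrence of the marker):

marker = "unique-string-separating-output-of-the-commands"
buf = ""
while marker not in buf:
    buf += channel.recv(1024).decode("utf-8", "replace")
# Everything before the marker belongs to "cd /mytargetRep";
# output arriving after it belongs to ./executeWhatIWant.
first_output, _, rest = buf.partition(marker)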
Though I do not really think that you need this very often. Most commands that are needed to make other commands work, like cd or set, do not really output anything.
So in most cases you can use SSHClient.exec_command and your code will be way simpler and more reliable (see the sketch after the links below):
Execute multiple commands in Paramiko so that commands are affected by their predecessors
Even if you need to use something seemingly complex like su/sudo, it is still better to stick with SSHClient.exec_command:
Executing command using "su -l" in SSH using Python
For a similar question, see:
Combining interactive shell and recv_exit_status method using Paramiko
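For reference, a minimal SSHClient.exec_command sketch of that simpler path (the host, credentials and command are placeholders):

import paramiko

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect("mytarget", username="user", password="password")

# Commands that need a shared context are chained into one exec_command call.
stdin, stdout, stderr = ssh.exec_command("cd /mytargetRep && ./executeWhatIWant")
output = stdout.read().decode()
status = stdout.channel.recv_exit_status()   # blocks until the command has finished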

Running scripts on multiple hosts concurrently with fabric

I am trying to create a program that creates multiple droplets, sends a script to each droplet, and starts the execution of all of the scripts without waiting for the output. I have tried to run them in the background using nohup, so that they aren't killed off when the terminal disconnects, with the following code:
from fabric import Connection

for i in range(len(script_names)):
    c = Connection(host=host[i], user=user[i],
                   connect_kwargs={"password": password, "key_filename": key_filename})
    c.run("nohup python3 /root/" + script_names[i] + " &")
I have tried other variations of the same idea, including setting pty=False and redirecting the output to /dev/null with "> /dev/null < /dev/null &", yet nothing seems to work.
Is it possible to issue multiple commands to run scripts on different hosts concurrently, without waiting for the output, using Fabric? Or should I use another package?
Fabric 2.x's groups aren't fully fleshed out yet, so they aren't well suited for this use case.
In Fabric 1.x I would accomplish this by using a dictionary for script_names, where the keys are the host strings from your host list and the values are the entries currently in script_names. Then I'd have each task perform its run commands in parallel as usual, looking up the script name via fabric.api.env.host_string within the task.
The execution layer of Fabric 2.x does not yet support this use case, as far as I know. This was my attempt at hacking it in, but the author rightly pointed out that this functionality should be handled in an Executor, which I could not come up with a solution for at the time: https://github.com/fabric/fabric/pull/1595
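A rough sketch of that Fabric 1.x approach (the host strings and script names below are placeholders):

from fabric.api import env, run, parallel, execute

# Hypothetical mapping of host strings to the script each host should run.
script_map = {
    "root@203.0.113.10": "scriptA.py",
    "root@203.0.113.11": "scriptB.py",
}

env.hosts = list(script_map)
env.key_filename = key_filename   # same credentials as in the question
env.password = password

@parallel
def launch():
    script = script_map[env.host_string]
    # Redirect output and skip the pty so the nohup'd process survives the disconnect.
    run("nohup python3 /root/%s > /dev/null 2>&1 &" % script, pty=False)

execute(launch)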

Python restart script in another cmd's window

Is there a way to restart another script in another shell?
I have a script that sometimes gets stuck waiting to read email from Gmail over IMAP. From another script I would like to restart the main one, but without stopping the execution of the second one.
I have tried:
os.system("C:\Users\light\Documents\Python\BOTBOL\Gmail\V1\send.py")
process = subprocess.Popen(["python", "C:\Users\light\Documents\Python\BOTBOL\Gmail\V1\send.py"])
but both run the main script in the second one's terminal.
EDIT:
Sorry, by shell I mean terminal window.
After your last comment, and since the syntax shows that you are using Windows, I assume that you want to launch a Python script in another console. The magic word here is START if you want the launching script to continue in parallel with the new one, or START /W if you want it to wait for the end of the subprocess.
In your case, you could use:
subprocess.call(["cmd.exe", "/c", "START", "C:\Path\To\PYTHON.EXE",
"C:\Users\light\Documents\Python\BOTBOL\Gmail\V1\send.py"])
subprocess has an option called shell, which is what you want. os.system calls are blocking, which means that only after the command has completed will the interpreter move to the next line. subprocess.Popen, on the other hand, is non-blocking; note that both commands spawn a child process of the process running this code. If you want to run the command in a shell and get access to shell features, pass shell=True to subprocess.Popen.
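For instance, a rough sketch of the difference (the script path is the one from the question):

import subprocess

# Non-blocking: Popen returns immediately while send.py keeps running.
proc = subprocess.Popen(["python", r"C:\Users\light\Documents\Python\BOTBOL\Gmail\V1\send.py"])

# shell=True gives access to cmd.exe features such as the START built-in,
# which opens the script in its own console window.
subprocess.Popen(r'start "" python C:\Users\light\Documents\Python\BOTBOL\Gmail\V1\send.py', shell=True)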
I could try to explain everything you need, but I think this video will do it better: YouTube video about multithreading
This will allow you to run two things at once, e.g. have one thread checking email and another handling input, so neither blocks the other; running them in parallel makes multiple 'shells' possible.
If you really want a separate window for this, I am sorry but I cannot help.
Hope this was what you were looking for.

How can I start and stop a Python script from shell

Thanks for helping!
I want to start and stop a Python script from a shell script. The start works fine, but I want to stop / terminate the Python script after 10 seconds (it's a counter that keeps counting), and it won't stop. I think the shell script is hanging on the first line.
What is the right way to start it, wait 10 seconds, and then stop it?
Shell script:
python /home/pi/count1.py
sleep 10
kill /home/pi/count1.py
It's not working yet. I get the point of running the script in the background, and that part is working! But I get another message from my Raspberry Pi after doing:
python /home/pi/count1.py &
sleep 10; kill /home/pi/count1.py
/home/pi/sebastiaan.sh: line 19: kill: /home/pi/count1.py: arguments must be process or job IDs
The problem has to be in this line (but what is it? Thanks for helping out!):
sleep 10; kill /home/pi/count1.py
You're right, the shell script "hangs" on the first line until the Python script finishes; until it does, the shell script won't continue. Therefore you have to add & at the end of that command to run it in the background. This way, the Python script starts and the shell script continues.
The kill command doesn't take a path, it takes a process ID. After all, you might run the same program several times, and then try to kill the first, or the last, one.
The bash shell supports the $! variable, which is the pid of the last background process.
Your current example script is wrong, because it doesn't run the python job and the sleep job in parallel. Without adornment, the script will wait for the python job to finish, then sleep 10 seconds, then kill.
What you probably want is something like:
python myscript.py & # <-- Note '&' to run in background
LASTPID=$! # Save $! in case you do other background-y stuff
sleep 10; kill $LASTPID # Sleep then kill to set timeout.
You can terminate any process from any other process, provided the OS lets you do it, i.e. as long as it isn't some critical process belonging to the OS itself.
The kill command uses a PID to identify the process to kill, not the process's name or command.
Use pkill for that.
You can also send it a different signal instead of SIGTERM (the request to terminate a program), which you may wish to detect inside your Python application and respond to.
For instance, you may wish to check whether the process is alive and get some data from it.
To do this, choose one of the user-defined signals and register a handler for it within your Python program using the signal module.
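A minimal sketch of what the Python side could look like for a user-defined signal (the handler body is just an example):

import signal
import time

def handle_usr1(signum, frame):
    # React to SIGUSR1, e.g. report that the counter is still alive.
    print("still counting")

signal.signal(signal.SIGUSR1, handle_usr1)

count = 0
while True:
    count += 1
    time.sleep(1)

From the shell you would then send the signal with something like kill -USR1 $LASTPID.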
To see why your script hangs, see Austin's answer.

Python Thread Breaking Terminal

Hello minds of Stack Overflow,
I've run into a perplexing bug. I have a Python script that creates a new thread which ssh's into a remote machine and starts a process. This process does not return on its own (and I want it to keep running throughout the duration of my script). In order to force the thread to return, at the end of my script I ssh into the machine again and kill -9 the process. This works well, except for the fact that it breaks my terminal.
To start the thread I run the following code:
t = threading.Thread(target=run_vUE_rfal, args=(vAP.IP, vUE.IP))
t.start()
The function run_vUE_rfal is as follows:
cmd = "sudo ssh -ti ~/.ssh/my_key.pem user#%s 'sudo /opt/company_name/rfal/bin/vUE-rfal -l 3 -m -d %s -i %s'" % (vUE_IP, vAP_IP, vUE_IP)
output = commands.getstatusoutput(cmd)
return
It seems that when the command is run, it somehow breaks my terminal. It is broken in that instead of starting a new line for each print, it appends the width of my terminal in whitespace to the end of each line and prints everything as one seemingly endless string. Also, I cannot see my keyboard input in that terminal, although it is still read successfully. My terminal looks something like this:
normal formatted output
normal formatted output
running vUE-rfal
print1
print2
print3_extra_long
print4
If I replace the body of the run_vUE_rfal function with some simple prints, the terminal does not break. I have many other ssh and telnet calls in this script that work fine; however, this is the only one I run in a separate thread, since it is the only one that does not return. I need to retain the ability to kill the process on the remote machine when my script is finished.
Any explanation of the cause and ideas for a fix are much appreciated.
Thanks in advance.
It seems the process you control is changing the terminal settings. Such changes bypass stdout and stderr, for good reasons: ssh itself, for example, needs this to ask the user for a password even when its output is being redirected.
A way to solve this could be to use the Python module pexpect (a third-party library) to launch your process, as it will create its own pseudo-tty that your real terminal does not have to care about.
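A rough sketch of what run_vUE_rfal might look like with pexpect instead of commands.getstatusoutput (untested; the command string is the one from the question):

import pexpect

def run_vUE_rfal(vAP_IP, vUE_IP):
    cmd = "sudo ssh -ti ~/.ssh/my_key.pem user@%s 'sudo /opt/company_name/rfal/bin/vUE-rfal -l 3 -m -d %s -i %s'" % (vUE_IP, vAP_IP, vUE_IP)
    child = pexpect.spawn(cmd, timeout=None)  # pexpect allocates its own pseudo-tty
    child.expect(pexpect.EOF)                 # block until the remote process exits
    return child.before                       # whatever output the process produced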
BTW, to "repair" your terminal, use the reset command. As you already noticed, you can enter commands. reset will set the terminal to default settings.
