Open a terminal, run a Python script and keep it open for results?

How do I write an sh script that starts a new terminal, executes a Python script in it, and keeps it running? The Python script is supposed to run continuously in a perpetual loop, printing results as they come in. Whenever I try an sh script that calls gnome-terminal, I just get: child process exited normally with status 2
Manually it would just be: python /home/ubuntu/pyscript.py
Could someone give me an idea of how to do this?
I have a list of scripts to run, so the manual approach is tedious.

You can use gnome-terminal with the -x flag.
Suppose you have a spam.py script; then the following command will spawn a new terminal, run spam.py in it, and close the terminal once the script has ended.
gnome-terminal -x python spam.py
Try with this script:
# spam.py
import time

for _ in range(5):
    print("eggs")
    time.sleep(1)
Then the previous command will spawn a terminal, print eggs in it five times, and then close it.
If you want to leave the terminal open with the Python interpreter still running after the script has ended, then Python's -i flag (doc, then Ctrl+F -> -i) is what you want:
gnome-terminal -x python -i spam.py
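Since the question mentions a whole list of scripts, a small launcher can open each one in its own terminal. This is a minimal sketch, assuming gnome-terminal and hypothetical script paths; note that recent gnome-terminal releases deprecate -x in favour of --:
# launch_all.py - hypothetical helper: one terminal per script
import subprocess

scripts = ["/home/ubuntu/pyscript.py", "/home/ubuntu/otherscript.py"]  # adjust to your list

for path in scripts:
    # "--" is the current spelling of -x on newer gnome-terminal versions;
    # -i keeps each interpreter (and therefore each terminal) open afterwards
    subprocess.Popen(["gnome-terminal", "--", "python3", "-i", path])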

To run the Python script in a new instance of your favourite terminal, write:
x-terminal-emulator -e python -i /home/ubuntu/pyscript.py
This will start the Python script and run it until it ends, then display a Python prompt to stop the terminal emulator from closing.
This works with x-terminal-emulator replaced by any of the many, many terminal emulators installed on my computer, so it should work with little modification across all POSIX-compatible systems that have the standard terminals installed. It won't work on a Mac, however. For a properly cross-platform Python implementation of something slightly different, see here; most of the techniques should be transferable.
To run the Python script in the same terminal whilst carrying on with the rest of the shell script, write:
python /home/ubuntu/pyscript.py &
Note the &, which runs the program as a background process (while still connecting its output to the terminal).
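The same fire-and-forget behaviour is available from inside Python via subprocess, in case your launcher is itself a Python script; a minimal sketch, reusing the path from the question:
import subprocess

# Popen returns immediately, like "&" in the shell; output still goes to this terminal
proc = subprocess.Popen(["python3", "/home/ubuntu/pyscript.py"])

# ... carry on with other work here ...

proc.wait()  # optional: block until the script finishes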

Shell script kill command

OS: Raspbian
Python: 3.7.3
I am trying to run and kill my shell script through a Python script. The purpose is so that I can simply press "Run" on my Python script, instead of having to go through the terminal every time. Here is my shell script (T.sh):
#!/bin/bash
#Track command
cd /home/pi/rpi-deep-pantilt
. ./rpi-deep-pantilt-env/bin/activate
rpi-deep-pantilt track Raspi --edge-tpu
Here is my Py script:
import os
os.system('bash /home/pi/T.sh')
When I issue the command rpi-deep-pantilt track Raspi --edge-tpu in my terminal, pressing CTRL + C kills it, but that doesn't work when I use this Python script, and neither does pkill. The Python script stops, but the camera stays on and the tracking functionality keeps running. Is there any way I can incorporate a kill command that I can issue with a keyboard interrupt?
If there is a better way to go about this let me know. I'm very new to this as you can probably tell.
Python may create a new process for T.sh, so within your Python code, try:
os.system("pkill -f T.sh")
(The -f flag matches against the full command line; a plain pkill T.sh would not match here, because the process name of a script started with bash /home/pi/T.sh is bash.)

Is there a way to run multiple bash scripts with one script at the same time?

I am running an Instagram bot using Python and Selenium. I use a bash script to run a Python script with an account's credentials (username, password, hashtags, etc.). I run multiple Instagram accounts, so I have made multiple copies of this file. Is there a way to put this in a single file that I can click on and run?
To open multiple terminals, each running its assigned account?
I've already tried just adding them to one big file, but each script won't run until the previous one finishes.
Also, since I'm using Selenium, multithreading in Python is somewhat difficult, but I would not mind going that route if someone could point me to where to start with that.
#!/bin/sh
cd PycharmProjects/InstaBot
python3 W.py
I highly recommend that everyone read about Bash Job Control.
Getting into multithreading is ridiculous overkill if your bottleneck has nothing to do with the CPU.
for script in PycharmProjects/InstaBot/*.py; do
    python3 "$script" &
done
jobs
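If the wrapper script should block until every background job has finished, add a wait after the loop; jobs by itself only lists what is still running at that point.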
Only one process in a shell can run in the foreground, so each command executes only after the previous one completes.
Adding the & symbol at the end of a command line tells the shell to execute that command in the background. This way the shell starts Python and continues without waiting.
This will execute both instances simultaneously, but they will both output to the same terminal:
#!/bin/sh
cd PycharmProjects/InstaBot
python3 W.py first_credentials &
python3 W.py second_credentials &
You can use the same technique to start a new terminal process for each python script:
#!/bin/sh
cd PycharmProjects/InstaBot
gnome-terminal -e "python3 W.py first_credentials" &
gnome-terminal -e "python3 W.py second_credentials" &

How to send a value from a Python script to the console and exit

My Python script generates a command that the user needs to run in the same console. My scenario: the user runs the script and, as a result, sees the command that must be run. Is there any way to exit the Python script and send that command to the console, so the user does not need to copy/paste it?
A solution would be to have your Python script (let's call it script.py) just print the command: print('ls -l'), and use it in a terminal like so: $(python3 script.py). This makes bash run the output of your script as a command, and would basically run ls -l in the terminal.
You can even go a step further and create an alias in ~/.bashrc so that you no longer need to type the whole line. Write something like alias printls='$(python3 /path/to/script.py)' at the end of the file (the single quotes stop the command substitution from running when .bashrc is sourced). After starting a new terminal, you can type printls and the script will run.
A drawback of this method is that you have no proper way of handling exceptions or errors in your code, since everything it prints will be run as a command. One way (though ugly) would be to print('echo "An error occurred!"') so that the user who runs the command can see that something malfunctioned.
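A minimal sketch of such a script, with ls -l standing in for whatever command yours really generates:
# script.py - everything printed to stdout is executed by the caller's $( ... )
try:
    command = "ls -l"  # build the real command here
    print(command)
except Exception as exc:
    # surface failures as an echo so the user at least sees a message
    print('echo "An error occurred: {}"'.format(exc))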
However, I'd suggest going for the "traditional" way and running the command directly from python. Here's a link to how you can achieve this: Calling an external command in Python.
Python can run system commands in new subshells. The proper way of doing this is via the subprocess module, but for simple tasks it's easier to just use os.system. Example Python script (assuming a Unix-like system):
import os
os.system('ls')
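For anything beyond a throwaway ls, the subprocess module mentioned above also gives you the exit status and captured output. A minimal sketch (capture_output needs Python 3.7+):
import subprocess

# run ls -l, capturing stdout/stderr as text instead of printing them directly
result = subprocess.run(["ls", "-l"], capture_output=True, text=True)
print(result.returncode)
print(result.stdout)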

Bash script: Open Python, execute some lines, then let user take control

I have a Python program, A.py, that creates binary data upon completion. To help users analyze the output, I want to add a small script, B.sh, to the output directory that fires up a Python console and executes some commands, C, that load the data and prepare them such that a user sees what is available. After executing C, the script B.sh should keep the Python console open.
First attempt at B.sh:
I figured out that
#!/bin/sh
xterm -e python
opens a Python console and keeps it open but doesn't execute anything within that console.
Second attempt at B.sh:
I figured out that
#!/bin/sh
xterm -e python -i C.py
executes C.py (I'd prefer not to have to write an additional file for the startup commands, but I could live with that) and keeps the window open, but doesn't show what was done. More specifically, the user would be presented with the outputs of C, but not with the commands that were used to achieve them.
Instead, I'd like the user to be presented with a console like this:
>>> [info,results] = my_package.load(<tag>)
>>> my_package.plot(results)
>>> print(info)
<output>
>>> my_package.analyze(results)
<output>
>>>
Save this in a file called demo.tcl
#!/usr/local/bin/expect -f
# Spawn Python and await prompt
spawn /usr/local/bin/python3
expect ">>>"
# Send Python statement and await prompt
send "print('Hello world!')\n"
expect ">>>"
# Pass control to user so he can interact with Python
interact
Then make it executable with:
chmod +x demo.tcl
And run with:
xterm -e ./demo.tcl
You can then carry on after the "Hello world!" in the same session, for example to print the system version info.
Your paths for Python and expect may be different, so check and alter to suit.
For anyone who happens to be using macOS (a.k.a. OSX), you can install expect with homebrew as follows:
brew install expect
And, since Macs don't ship with X11 any more, rather than installing XQuartz and running xterm, you can start a new Terminal and run Python in there quite simply with:
open -a Terminal.app demo.tcl
As suggested by user pask, I simply printed the commands before executing them.
Furthermore, I used the -c switch to be able to put the Python commands directly in B.sh instead of having to write an additional file C.py.
Here is the B.sh I am using now:
#!/bin/sh
xterm -e python -i -c "print('>>> import my_package');import my_package;print('>>> [info,results] = my_package.load()');[info,results] = my_package.load()"
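If the one-liner gets unwieldy, the same print-then-execute pattern fits in a small startup file, launched with xterm -e python -i startup.py. A sketch, keeping the hypothetical my_package calls from the question:
# startup.py - echo each command as a fake prompt line, then execute it
commands = [
    "import my_package",
    "[info, results] = my_package.load()",
]

for cmd in commands:
    print(">>> " + cmd)
    exec(cmd)  # at module level, names defined here survive into the -i session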

python multithreading issue in cronjob

I have a Python program that uses ThreadPool for multithreading. The program is one step in a shell script. When I execute the shell script manually on the command line, the entire flow works as expected. However, when I execute the shell script as a cron job, it appears that the flow moves on to the next steps before the Python multithreading step has completely finished.
Inside the python program, I do call AsyncResult.get(timeout) to wait for all the results to come back before moving on.
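The pattern described is roughly the following (a minimal sketch; the real worker function is hypothetical):
from multiprocessing.pool import ThreadPool

def work(n):  # stand-in for the real task
    return n * n

pool = ThreadPool(4)
async_results = [pool.apply_async(work, (i,)) for i in range(10)]

# get(timeout) raises multiprocessing.TimeoutError if a result is not ready in
# time, so a slow cron environment can fall through to the next step early
values = [r.get(timeout=30) for r in async_results]

pool.close()
pool.join()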
Run your program via batch(1) (see the output of the command man batch) as well. If that works OK, but the cron version does not, then it is almost certainly a problem with your environment variable setup. To verify that, run printenv from your interactive shell to inspect your environment there. Then do the same thing inside the crontab (you will just need to temporarily set up an extra cron entry for it). Try setting the variables in your shell script before invoking Python.
On the other hand, if it doesn't work via batch(1) either, it could be something to do with the files that your code has open. Try running your shell script with input redirected from /dev/null and output going to a file:
$ /usr/local/bin/myscript </dev/null >|/tmp/outfile.txt 2>&1
Try setting "TERM=xterm" (or whatever env variable you have, figure out by command 'env' on your terminal) in your crontab.
