How to block terminal/console output in Python

I'm using Python to execute some bash commands. The problem is that the terminal output from these bash scripts is spamming my terminal. Is there any way to block the output messages from these scripts? I have tried the steps in this answer, but they only block the print calls I make; they do not block the console output from the bash commands.
Can anyone suggest a better solution?

In Bash you can simply use:
$ eclipse &>/dev/null
This redirects both stdout and stderr to the redirect target (in bash).
(Here eclipse is just my example command.)
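From Python itself, you can get the same effect by sending the child process's streams to the null device. A minimal sketch with the subprocess module (substitute your own command for ls -l):
import subprocess

# Discard both stdout and stderr of the child process.
subprocess.run(['ls', '-l'],
               stdout=subprocess.DEVNULL,
               stderr=subprocess.DEVNULL)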

Related

How to send value from python script to console and exit

My Python script generates a command that the user then needs to run in the same console. So the user runs the script and, as a result, sees the command that must be run. Is there any way to exit the Python script and send that command to the console, so the user does not need to copy/paste it?
A solution would be to have your Python script (let's call it script.py) just print the command: print('ls -l') and use it in a terminal like so: $(python3 script.py). This makes bash run the output of your script as a command, effectively running ls -l in the terminal.
You can even go a step further and create an alias in ~/.bashrc so that you no longer need to type the whole line. At the end of the file write something like alias printls='$(python3 /path/to/script.py)' (note the single quotes, which defer evaluation until the alias is actually used). After starting a new terminal, you can type printls and the script's output will be run.
A drawback of this method is that you have no proper way of handling exceptions or errors in your code, since everything it prints will be run as a command. One way (though ugly) would be to print('echo "An error occurred!"') so that the user who runs the command can see that something malfunctioned.
However, I'd suggest going for the "traditional" way and running the command directly from python. Here's a link to how you can achieve this: Calling an external command in Python.
Python can run system commands in new subshells. The proper way of doing this is via the subprocess module, but for simple tasks it's easier to just use os.system. Example Python script (assuming a Unix-like system):
import os
os.system('ls')
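For comparison, here is a minimal subprocess equivalent (subprocess.run is available from Python 3.5 onward); it is the more flexible option once you need exit codes or output capture:
import subprocess

# Run 'ls' and raise CalledProcessError on a non-zero exit code.
subprocess.run(['ls'], check=True)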

nohup multiple sequential commands not logging output

I have a Python script (which takes a long time to finish) that I need to run several times with varying parameters, and it's executed on a remote machine. For instance, and for test purposes, take the following script test.py:
import time
print("\nStart time: {}".format(time.ctime()))
time.sleep(10)
print("End time: {}".format(time.ctime()))
For this I normally use nohup. It works just fine with one execution, using the following command:
nohup python test.py &
The output is correctly saved in the nohup.out file. To run executions in sequence I did some research, found a related question, and came up with the following command:
nohup $(python test.py; python test.py) &
Which works fine: I ran the command, quickly logged out and in again, and saw through htop the first execution running, then finishing, and then the second one starting. But the problem is that the output isn't being saved into the nohup.out file. If I wait in the terminal for both executions to finish, the following error is shown:
nohup: failed to run command 'Start': No such file or directory
What am I doing wrong here?
PS.:
I need to log the output because I need to see the current progress of the script and know which error happened if it doesn't finish properly. So if there is some other command to use instead of nohup that can log Python's print output, it will be welcome too.
The command you have:
nohup $(python test.py; python test.py) &
will run the two scripts first and then attempt to execute their printed output as a command. That's likely not what you wanted.
What you want here is for nohup to start a command that executes the two scripts in sequence. The most straightforward way to do that is to run a child shell:
nohup bash -c "python one.py; python two.py" &
As for a better way to do this, you might want to investigate tmux or screen. If you start a command in a tmux/screen session, not only can you detach the session from the currently running shell, you can also reconnect to the session later on to resume and interact with the program.
The nohup command is passed a utility and arguments for that utility.
If you'd like to run your scripts in sequence via nohup, using bash as that utility isn't a bad idea:
nohup bash -c "python ./test.py; python ./test.py" &
However, I recommend looking into Python's logging package instead, as I consider nohup a workaround (nohup only appends to nohup.out if standard output is a terminal).
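For illustration, a minimal logging setup for the test.py above (the file name and format are just examples) that writes progress to its own file regardless of how the script is launched:
import logging
import time

# Log to a file so progress survives logout, independently of nohup.
logging.basicConfig(filename='test.log', level=logging.INFO,
                    format='%(asctime)s %(message)s')

logging.info("Start")
time.sleep(10)
logging.info("End")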
There is also the approach of using a queue to manage the running of your tasks sequentially; with a queue you needn't repeat yourself to run the same script twice. Then again, between what you have and writing a worker to consume the queue, I think what you have is simpler ¯\_(ツ)_/¯
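If you do want the queue approach, here is a minimal sketch using only the standard library (the commands are the same ones from the question):
import queue
import subprocess

tasks = queue.Queue()
tasks.put(['python', 'test.py'])
tasks.put(['python', 'test.py'])

# A single consumer loop runs the queued commands strictly in sequence.
while not tasks.empty():
    subprocess.run(tasks.get())
    tasks.task_done()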

Open terminal, run python script and keep it open for results?

How do I get an sh script to start a new terminal, execute a Python script, and keep it running? The Python script is supposed to run continuously in a perpetual loop, spitting out results as they pop in. Whenever I try an sh script with gnome-terminal I just get: child process exited normally with status 2.
Manually it would just be: python home/ubuntu/pyscript.py
Could someone give an idea how to do this?
I have a list of scripts to run, so resorting to the manual solution is tedious.
You can use gnome-terminal with the -x flag.
Suppose you have a spam.py script; then the following command will spawn a new terminal, run spam.py in it, and close the terminal once the script has ended.
gnome-terminal -x python spam.py
Try with this script:
# spam.py
import time
for _ in range(5):
    print("eggs")
    time.sleep(1)
Then the previous command will spawn a terminal in which eggs is printed five times, and the terminal will then close.
If you want to leave the terminal open with the Python interpreter still running after the script ends, then Python's -i flag (see the docs, then CTRL+F -> -i) is what you want:
gnome-terminal -x python -i spam.py
To run the Python script in a new instance of your favourite terminal, write:
x-terminal-emulator -e python -i home/ubuntu/pyscript.py
This will start the Python script and run it until it ends, then display a Python prompt to stop the terminal emulator from closing.
This will work with x-terminal-emulator substituted with any of the many terminal emulators commonly installed, so it should work with little modification across POSIX-compatible systems that have a standard terminal installed. This won't work on a Mac, however. For a properly cross-platform Python implementation of something slightly different, see here. Most of the techniques should be transferable.
To run the Python script in the same terminal whilst carrying on with the rest of the shell script, write:
python home/ubuntu/pyscript.py &
Note the &, which runs the program in the background as a separate process (but still connects its output to the virtual terminal).
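Since the question mentions a whole list of scripts, the loop can itself live in a small Python launcher. A sketch (the script paths are placeholders for your own list):
import subprocess

# Placeholder paths; replace with your own list of scripts.
scripts = ['home/ubuntu/pyscript.py', 'home/ubuntu/pyscript2.py']

for script in scripts:
    # Each script gets its own terminal; -i keeps the window open afterwards.
    subprocess.Popen(['x-terminal-emulator', '-e', 'python', '-i', script])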

How to capture stdout of a Python script executed with pythonw?

When a script is executed with pythonw it will not open a console.
Is there a way to capture the stdout of such a script by keeping the usage of pythonw?
Note, I am looking for a solution that does not require modifying the script (I know that I can use logging).
Update: pythonw script.py >somefile seems to work. How can I redirect the output to the console instead?
It was obvious: pythonw script.py|more
If you can change how you invoke it (as you do in the update), why don't you just run it with python instead of pythonw?
python script.py
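If the goal is to capture the output programmatically rather than view it, a second script can collect it. A minimal sketch (requires Python 3.7+ for capture_output; script.py stands in for the pythonw-run script):
import subprocess

# Run the script under pythonw and capture its stdout as text.
result = subprocess.run(['pythonw', 'script.py'],
                        capture_output=True, text=True)
print(result.stdout)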

python multithreading issue in cronjob

I have a python program that uses the ThreadPool for multithreading. The program is one step in a shell script. When I execute the shell script manually on the command line, the entire flow works as expected. However, when I execute the shell script as a cronjob, it appears that the flow goes to the next steps before the python multithreading steps are completely finished.
Inside the python program, I do call AsyncResult.get(timeout) to wait for all the results to come back before moving on.
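For reference, a minimal sketch of the pattern the question describes (the worker function, pool size, and timeout are illustrative):
from multiprocessing.pool import ThreadPool

def work(n):
    return n * n  # placeholder task

pool = ThreadPool(4)
async_results = [pool.apply_async(work, (i,)) for i in range(10)]
pool.close()

# get() blocks until each task finishes (or the timeout expires)
# and re-raises any worker exception here.
values = [r.get(timeout=60) for r in async_results]
pool.join()
print(values)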
Run your program via batch(1) (see the output of the command man batch) as well. If that works OK, but the cron version does not, then it is almost certainly a problem with your environment variable setup. To verify that, run printenv from your interactive shell to inspect your environment there. Then do the same thing inside the crontab (you will just need to temporarily set up an extra cron entry for it). Try setting the variables in your shell script before invoking Python.
On the other hand, if it doesn't work via batch(1) either, it could be something to do with the files that your code has open. Try running your shell script with input redirected from /dev/null and output going to a file:
$ /usr/local/bin/myscript </dev/null >|/tmp/outfile.txt 2>&1
Try setting TERM=xterm (or whatever value your interactive shell uses; check by running env in your terminal) in your crontab.
