Python - nohup.out doesn't show print statements

I'm new to python and web development, and I have a basic issue, I believe.
I'm running my server with pyramid, and I use nohup to write the output to a nohup.out file.
nohup ../bin/pserve development.ini
When I do tail -f nohup.out
I can see all the output coming from the logging.info() calls in my code.
but I don't see all the output from the print() calls.
What is the reason for that, and how can I make the print() output appear in the nohup file?

You can use the -u flag to disable output buffering:
nohup python -u python_script.py &

You can use stdbuf -oL to line-buffer the output so print statements are flushed on each newline. The command will look like:
nohup stdbuf -oL python python_script.py > nohup.out &

Printed output is buffered when stdout is not a terminal, and nohup replaces stdout with a non-terminal (a file, in fact).
In the absence of any code, all I can do is guess, but the most likely answer is that your output is buffered inside the python process, and when that buffer fills it will be flushed to nohup.out. To see if this is the case, try adding a lot more prints to fill up the buffer faster.
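The buffering behavior described above can be sketched in a few lines; the loop and the tick messages are illustrative, not the asker's code:

```python
import sys
import time

# Minimal sketch: when stdout is redirected (as under nohup), Python
# block-buffers print output. An explicit flush pushes each line out to
# nohup.out right away instead of waiting for the buffer to fill.
for i in range(3):
    print("tick", i)
    sys.stdout.flush()   # without this, lines may sit in the buffer
    time.sleep(0.1)
```

Running with `python -u` makes every print unbuffered, so the explicit flush becomes unnecessary.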

Related

How to block terminal/console outputs in python

I'm using Python to execute some bash commands. The problem is that the terminal output from these bash scripts is spamming my terminal. Is there any way to block the output messages from these scripts? I have tried the step in this answer, but it only blocks the print calls I make; it does not block the console output from the bash commands.
Can anyone suggest any better solution?
In Bash you can simply use:
$ eclipse &>/dev/null
This redirects both stdout and stderr to the given target (in bash).
(here eclipse is my example command)
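Since the asker is launching the bash commands from Python, the same suppression can be done without the shell; a minimal sketch, where "ls" is just a placeholder for the asker's commands:

```python
import subprocess

# Discard both stdout and stderr of the child process, the in-Python
# equivalent of &>/dev/null. "ls" is an illustrative placeholder.
result = subprocess.run(
    ["ls", "/"],
    stdout=subprocess.DEVNULL,
    stderr=subprocess.DEVNULL,
)
```

subprocess.DEVNULL achieves the same effect as the bash redirect without spawning a shell.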

nohup multiple sequential commands not logging output

I have a python script (which takes a long time to execute) that I need to run several times, varying the parameters, and it's executed on a remote machine. For illustration and test purposes, take the following script test.py:
import time
print("\nStart time: {}".format(time.ctime()))
time.sleep(10)
print("End time: {}".format(time.ctime()))
For this I normally use nohup. It works just fine with one execution using the following command:
nohup test.py &
The output is correctly saved in the nohup.out file. To run the executions in sequence I did some research, found a related question, and came up with the following command:
nohup $(python test.py; python test.py) &
Which seems to work: I ran the command, quickly logged out and in again, and saw through htop the first execution running, finishing, and then the second one starting. But the problem is that the output isn't being saved into the nohup.out file. If I wait in the terminal for both executions to finish, the following error is shown:
nohup: failed to run command 'Start': No such file or directory
What am I doing wrong here?
PS.:
I need to log the outputs because I need to see the current progress of the script and know which error happened if it doesn't finish properly. So if there is some other command to use instead of nohup which can log python's print output, it will be welcome too.
The command you have:
nohup $(python test.py; python test.py) &
will attempt to execute the output of the scripts, because $( ... ) is command substitution: the shell runs both scripts first and substitutes their printed output into the command line, which is why nohup complains about the command 'Start' (the first word of the output). That's likely not what you wanted.
What you wanted here is to have nohup start a command that executes the two commands in sequence. The most straightforward way to do this is to run a child shell:
nohup bash -c "python one.py; python two.py" &
As for a better way to do this, you might want to investigate tmux or screen. If you start off a command in a tmux/screen, not only you can detach the session from the currently running shell, you'd also be able to reconnect to the session later on to resume and interact with the program.
The nohup command is passed a utility and arguments for that utility.
If you'd like to run your script in sequence via nohup, using bash as the utility isn't a bad idea:
nohup bash -c "python ./test.py; python ./test.py"
However, I recommend looking into python's logging package, as I consider nohup a workaround (nohup only appends to nohup.out if the standard output is the terminal).
There is also the approach of using a queue to manage running your tasks sequentially.
Here you needn't be verbose to run the same script twice. Then again, between what you have and writing a worker to consume the queue, I think what you have is simpler ¯\_(ツ)_/¯

Is there a way to unbuffer nohup for a running python program?

I started running my python program on a server with:
nohup python program.py &
The program contains a loop which runs a function and prints its output each time:
for i in s:
    print f(i)
I started the program yesterday, but my nohup.out is still empty. I searched the internet and it seems python buffers the outputs. I don't want to stop and rerun my program. Is there any way to flush the outputs to nohup.out now?
You can use stdbuf. Fully unbuffered STDOUT:
stdbuf -o0 nohup python program.py &
Line buffered STDOUT:
stdbuf -oL nohup python program.py &
Or better use -u option with python:
nohup python -u program.py &
Or use sys.stdout.flush() in your program to flush STDOUT explicitly.
Not easily or reliably. I can describe something that might work, but I have never tested it in Python, and it has some assumptions.
Assuming that Python is using the libc stdio buffers, you might be able to attach to the running Python interpreter with gdb and use gdb to call fflush on stdout.
Be aware that this might crash your running program.
First find the PID of the python program.
Second, use GDB to attach to it: gdb --pid [PID from first step]
Third, in GDB type call fflush(stdout)
Then type detach and then q to get out of GDB.
No guarantees.

no error messages with nohup and python?

I run a python script on a linux server with nohup like that:
nohup python3 ./myscript.py > ./mylog.log &
It works and the script writes the log file, but the problem is that python error messages / thrown exceptions don't seem to get written into the log file. How can I get them logged as well?
Has this something to do with stderr? (but it says: "nohup: redirecting stderr to stdout" when I start the script.)
It is a long running script and after a while sometimes the script stops working because of some problem and with missing error messages I have no clue why. The problems always happen after a few days so this is really a pain to debug.
edit:
Could it have something to do with flushing? My own prints use flush, but maybe python's error output doesn't get flushed, so it doesn't show up in the file once the script aborts?
I have found the reason. It really was the buffering problem (see my edit above). :)
nohup python3 -u ./myscript.py > ./mylog.log &
With the python -u parameter it works. It disables buffering.
Now I can go bug hunting...
You are only redirecting stdout. Error messages are given on stderr. Rerun your script like this:
nohup python3 ./myscript.py &> ./mylog.log &
The &> redirects all output (stdout and stderr) to your log file.
With nohup, errors will not be logged unless you specifically redirect the error stream to a second file, as shown below:
nohup python myscript.py > out.log 2> err.log
You can also redirect both the output and the errors to the same file:
nohup python myscript.py > out.log 2>&1
Python buffers output before writing it to the log. To get a real-time log, use the -u flag for "unbuffered":
nohup python -u myscript.py > out.log 2>&1
To run in the background and get your prompt back, add & at the end:
nohup python -u myscript.py > out.log 2>&1 &
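Besides redirecting stderr at the shell level, the script can record its own uncaught exceptions; a hedged sketch using sys.excepthook, where the err.log filename and the logger name are illustrative assumptions:

```python
import logging
import sys

# Dedicated logger writing to err.log so tracebacks survive even if
# stderr was not redirected when the script was launched.
handler = logging.FileHandler("err.log")
handler.setFormatter(logging.Formatter("%(asctime)s %(message)s"))
crash_log = logging.getLogger("crash")
crash_log.addHandler(handler)
crash_log.setLevel(logging.ERROR)

def log_uncaught(exc_type, exc_value, exc_tb):
    # Record the full traceback, then defer to the default hook.
    crash_log.error("Uncaught exception", exc_info=(exc_type, exc_value, exc_tb))
    sys.__excepthook__(exc_type, exc_value, exc_tb)

sys.excepthook = log_uncaught
```

This complements the shell redirect: the log line is written the moment the exception propagates, regardless of stream buffering.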

Nohup is not writing log to output file

I am using the following command to run a python script in the background:
nohup ./cmd.py > cmd.log &
But it appears that nohup is not writing anything to the log file. cmd.log is created but is always empty. In the python script, I am using sys.stdout.write instead of print to print to standard output. Am I doing anything wrong?
You can run Python with the -u flag to avoid output buffering:
nohup python -u ./cmd.py > cmd.log &
It looks like you need to flush stdout periodically (e.g. sys.stdout.flush()). In my testing Python doesn't automatically do this even with print until the program exits.
Using -u with nohup worked for me. Using -u forces the stdout and stderr streams to be unbuffered. It does not affect stdin. Everything will be saved in the "nohup.out" file, like this:
nohup python -u your_code.py &
You can also save it into your directory. This way-
nohup python -u your_code.py > your_directory/nohup.out &
Also, you can use PYTHONUNBUFFERED. If you set it to a non-empty string it works the same as the -u option. To use this, run one of the commands below before running your python code:
export PYTHONUNBUFFERED=1
or
export PYTHONUNBUFFERED=TRUE
P.S.: I suggest using tools like cron to run things in the background on a schedule.
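The equivalence between PYTHONUNBUFFERED and -u can be demonstrated with a small sketch: the child writes a byte and exits via os._exit(), which skips all buffer flushing, so the byte only reaches the pipe when stdout is truly unbuffered.

```python
import os
import subprocess
import sys

# Child writes one byte to stdout (a pipe here) and exits without flushing.
child = "import sys, os; sys.stdout.write('x'); os._exit(0)"

# Environment without PYTHONUNBUFFERED, so the parent's setting can't leak in.
clean_env = {k: v for k, v in os.environ.items() if k != "PYTHONUNBUFFERED"}

buffered = subprocess.run(
    [sys.executable, "-c", child],
    env=clean_env, capture_output=True, text=True,
)
unbuffered = subprocess.run(
    [sys.executable, "-c", child],
    env=dict(clean_env, PYTHONUNBUFFERED="1"),
    capture_output=True, text=True,
)

print(repr(buffered.stdout))    # '' — the write was lost in the block buffer
print(repr(unbuffered.stdout))  # 'x' — written straight through to the pipe
```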
export PYTHONUNBUFFERED=1
nohup ./cmd.py > cmd.log &
or
nohup python -u ./cmd.py > cmd.log &
https://docs.python.org/2/using/cmdline.html#cmdoption-u
Python 3.3 and above have a flush argument to print, and this is the only method that worked for me.
print("number to train = " + str(num_train), flush=True)
print("Using {} evaluation batches".format(num_evals), flush=True)
I had a similar issue, but not connected with a Python process. I was running a script which did a nohup and the script ran periodically via cron.
I was able to resolve the problem by:
redirecting stdin, stdout and stderr
ensuring that the script being invoked via nohup didn't run anything else in the background
PS: my scripts were written in ksh running on RHEL
I run my scripts in the following way and I have no problem at all:
nohup python my_script.py &> my_script.out &
Comparing with your syntax, it looks like you are only missing a "&" in the redirection: &> captures stderr as well, while > alone captures only stdout.
