get pytest exit code from shell script - python

I'm running pytest tests from a shell script.
The relevant line in the script looks something like:
pytest pytest_tests --param=$my_param
According to pytest documentation, "Running pytest can result in six different exit codes" (0-5).
My question is how can I get this exit code from the script?
I tried something like
exit_code = pytest pytest_tests --param=$my_param
echo $exit_code
But I got this:
exit_code: command not found
How can I get it? Or is there a better way to get pytest results in the shell script?

After a command runs, its exit code is available via the $? variable. Try something like this:
pytest pytest_tests --param=$my_param
echo Pytest exited $?
This works in Bash, and should work in the regular sh Bourne shell and zsh as well.
If you need to assign this to another variable, use
my_var=$?
Note the lack of spaces around the equals sign.
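If you end up driving pytest from Python rather than from a shell script, note that pytest.main() returns the same exit code directly. A minimal sketch (assuming your conftest.py defines the custom --param option):
import sys
import pytest

# pytest.main() returns the same 0-5 exit code the pytest CLI would;
# sys.argv[1] stands in for the $my_param shell variable
exit_code = pytest.main(["pytest_tests", "--param=" + sys.argv[1]])
print("pytest exited with", int(exit_code))
sys.exit(int(exit_code))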

Related

How to properly redirect Python script output to file?

I am trying to run a script remotely on a server and I intend to use something along the following lines:
nohup ./script.py > runtime.out 2> runtime.err & and monitor the script's progress with tail -f runtime.out. The problem I am having is that the redirect doesn't seem to work as expected. For the purposes of this question, the problem can be reproduced as described below:
script.py:
#!/usr/bin/env python3
import time
if __name__ == '__main__':
    for i in range(1000):
        print("hi")
        time.sleep(1)
Then in a shell run ./script.py > a.out &. This prints the PID of the process and returns to the prompt as expected. However, a.out is empty even though the program is running. Also, if I run ./script.py > a.out without the '&', a.out remains empty until I Ctrl-C the command; only then does it display all the output produced up to the script's termination.
I thought ">" streamed stdout to the file continuously, not only at command completion.
The simplest way to do that is to use the -u flag of the python command. It should look like this:
nohup python3 -u script.py > runtime.out 2> runtime.err &
According to python3 --help:
-u : force the stdout and stderr streams to be unbuffered;
this option has no effect on stdin; also PYTHONUNBUFFERED=x
Using print("hi", flush=True) will keep forcing the stream to flush contents, so it will continuously update the output file. I don't have enough information about your program to suggest alternatives, but I would look for a better method if possible.

py.test use -m AS WELL as directory path in command

I'd like to only run selenium tests in my test suite, in addition to filtering it down to only run tests in a specific file/folder. It seems like I should be able to accomplish this with the -m option, and the path positional argument. Furthermore, I'm doing this in a bash script.
So for example, I tried something like this:
#!/bin/bash
# ...some logic here for determining `EXTRA` arg
EXTRA="not selenium"
py.test -m $EXTRA -v -s --tb=long --no-flaky-report ~/project/mytests/test_blerg.py
And then my test looks like this (still using xunit-style classes):
@pytest.mark.selenium
class BaseTest(UnitTest):
    pass

class ChildTest(BaseTest):
    def test_first_case(self):
        pass
When I run the py.test command as I described above, I get this:
============================================================================ no tests ran in 0.01 seconds ============================================================================
ERROR: file not found: selenium"
Not completely sure why this doesn't work. I'll try manually overriding pytest_runtest_setup(), but I feel like I should be able to accomplish what I want without doing that. Also, just FYI, this is a Django project, using Django==1.8.7 and pytest-django==2.9.1.
Any help would be greatly appreciated :)
Figured it out. This has nothing to do with py.test itself. I had an error in how I was calling the py.test command in my bash script: $EXTRA has to be quoted. Unquoted, bash word-splits the value, so -m only sees not and pytest treats the leftover selenium as a file path argument. The amended command looks like this:
py.test -m "$EXTRA" -v -s --tb=long --no-flaky-report ~/project/mytests/test_blerg.py
Works as expected!

Can we run pylint while executing a Python script, such that the code executes only if pylint passes, and otherwise shows the pylint errors?

Example: let's say I have a Python script test.py.
When I run python test.py, pylint should execute first; if pylint passes, it should execute test.py, otherwise it should show the pylint errors.
@falsetru: Thank you for the answer. Another solution is to write a script that runs both commands, i.e. it runs pylint test.py, and if the code's rating (pylint's output) is greater than some x (let's say x = 8) it runs python test.py, otherwise it shows the pylint errors.
That is, instead of python test.py we run my_script test.py, where my_script is the script containing the logic described above, as sketched below.
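A minimal sketch of such a wrapper, assuming pylint's usual "Your code has been rated at X/10" summary line and the threshold of 8 from the comment above:
#!/usr/bin/env python3
# hypothetical my_script: run as "my_script test.py"
import re
import subprocess
import sys

THRESHOLD = 8.0  # the x from above

target = sys.argv[1]
result = subprocess.run(["pylint", target], capture_output=True, text=True)
match = re.search(r"rated at ([-\d.]+)/10", result.stdout)
score = float(match.group(1)) if match else 0.0

if score > THRESHOLD:
    # rating is good enough: run the script itself
    sys.exit(subprocess.call([sys.executable, target]))
else:
    # otherwise show the pylint report instead
    print(result.stdout)
    sys.exit(result.returncode or 1)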

Error in check_call() subprocess, executing 'mv' unix command: "Syntax error: '(' unexpected"

I'm making a python script for Travis CI.
.travis.yml
...
script:
- support/travis-build.py
...
The python file travis-build.py is something like this:
#!/usr/bin/env python
from subprocess import check_call
...
check_call(r"mv !(my_project|cmake-3.0.2-Darwin64-universal) ./my_project/final_folder", shell=True)
...
When Travis building achieves that line, I'm getting an error:
/bin/sh: 1: Syntax error: "(" unexpected
I've tried writing it a lot of different ways, but I get the same result. Any ideas?
Thanks in advance!
Edit
My current directory layout:
- my_project/final_folder/
- cmake-3.0.2-Darwin64-universal/
- fileA
- fileB
- fileC
With this command I'm trying to move all the files in the current directory (fileA, fileB and fileC), excluding the my_project and cmake-3.0.2-Darwin64-universal folders, into ./my_project/final_folder. If I execute this command in a Linux shell it does what I want, but not through check_call().
Note: I can't move the files one by one, because there are many others
I don't know which shell Travis is using by default, because I don't specify one. I only know that if I write the command directly in my .travis.yml:
.travis.yml
...
script:
# Here is the previous Travis code
- mv !(my_project|cmake-3.0.2-Darwin64-universal) ./my_project/final_folder
...
It works. But if I use the script, it fails.
I found this command from the following issue:
How to use 'mv' command to move files except those in a specific directory?
You're using the bash feature extglob to exclude the entries you're specifying, and it needs to be enabled for the pattern to work.
The python subprocess module explicitly uses /bin/sh when you use shell=True, which doesn't enable the use of bash features like this by default (it's a compliance thing to make it more like original sh).
If you want to get bash to interpret the command; you have to pass it to bash explicitly, for example using:
subprocess.check_call(["bash", "-O", "extglob", "-c", "mv !(my_project|cmake-3.0.2-Darwin64-universal) ./my_project/final_folder"])
I would not choose to do the job in this manner, though.
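For instance, a pure-Python sketch of the same move that sidesteps shell globbing entirely (directory names taken from the question):
import os
import shutil

exclude = {"my_project", "cmake-3.0.2-Darwin64-universal"}
for entry in os.listdir("."):
    if entry not in exclude:
        # move everything except the excluded directories into the destination
        shutil.move(entry, os.path.join("my_project", "final_folder", entry))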
Let me try again: in which shell do you expect your !(...) syntax to work? Is it bash? Is it ksh? I have never used it, and a quick search for a corresponding bash feature led nowhere. I suspect your syntax is just wrong, which is what the error message is telling you. In that case, your problem is entirely independent of Python and the subprocess module.
If a special shell you have on your system supports this syntax, you need to make sure that Python is using the same shell when invoking your command. It tells you which shell it has been using: /bin/sh. This is usually just a link to the real shell executable. Does it point to the same shell you have tested your command in?
Edit: the SO solution you referenced contains the solution in the comments:
Tip: Note however that using this pattern relies on extglob. You can
enable it using shopt -s extglob (If you want extended globs to be
turned on by default you can add shopt -s extglob to .bashrc)
Just to demonstrate that different shells might deal with your syntax in different ways, first using bash:
$ !(uname)
-bash: !: event not found
And then, using /bin/dash:
$ !(uname)
Linux
Without shell=True, the argument to a subprocess method must be a list of command line arguments. Use e.g. shlex.split() to split the string into command line arguments:
import shlex, subprocess
subprocess.check_call(shlex.split("mv !(...)"))
Note, however, that this alone won't expand the !(...) pattern, since globbing is performed by the shell, not by mv.
EDIT:
So, the goal is to move files/directories, with the exception of some file(s)/directory(ies). By playing around with bash, I could get it to work like this:
mv `ls | grep -v -e '\(exclusion1\|exclusion2\)'` my_project
So in your situation that would be:
mv `ls | grep -v -e '\(my_project\|cmake-3.0.2-Darwin64-universal\)'` ./my_project/final_folder
This could go into the subprocess.check_call(..., shell=True) and it should do what you expect it to do.

bash wrap a piped command with a python script

Is there a way to create a Python script which wraps an entire bash command, including the pipes?
For example, if I have the following simple script
import sys
print(sys.argv)
and call it like so (from bash or ipython), I get the expected outcome:
[pkerp@pendari trell]$ python test.py ls
['test.py', 'ls']
If I add a pipe, however, the output of the script gets redirected to the pipe sink:
[pkerp@pendari trell]$ python test.py ls > out.txt
And the > out.txt portion is not in sys.argv. I understand that the shell processes this redirection itself, but I'm curious if there's a way to force the shell to ignore it and pass it to the process being called.
The point of this is to create something like a wrapper for the shell. I'd like to run the commands regularly, but keep track of the strace output for each command (including the pipes). Ideally I'd like to keep all of the bash features, such as tab completion, the up and down arrows, and history search, and then just pass the completed command through a Python script which invokes a subprocess to handle it.
Is this possible, or would I have to write my own shell to do this?
Edit
It appears I'm asking the exact same thing as this question.
The only thing you can do is pass the entire shell command as a string, then let Python pass it back to a shell for execution.
$ python test.py "ls > out.txt"
Inside test.py, something like
subprocess.call("strace " + sys.argv[1], shell=True, executable="/bin/bash")
to ensure the entire string is passed to the shell (and bash, specifically).
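Put together, a self-contained test.py along those lines might look like this (a sketch; it simply prefixes the quoted command with strace):
#!/usr/bin/env python
import subprocess
import sys

# sys.argv[1] is the entire quoted command, e.g. "ls > out.txt"
sys.exit(subprocess.call("strace " + sys.argv[1], shell=True, executable="/bin/bash"))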
Well, I don't quite see what you are trying to do. The general approach would be to give the desired output destination to the script using command line options: python test.py ls --output=out.txt. Incidentally, strace writes to stderr. You could capture everything using strace python test.py > out 2> err if you want to save it all...
Edit: If your script writes to stderr as well you could use strace -o strace_out python test.py > script_out 2> script_err
Edit2: Okay, I understand better what you want. My suggestion is this: Write a bash helper:
function process_and_evaluate()
{
    strace -o /tmp/output/strace_output "$@"
    /path/to/script.py /tmp/output/strace_output
}
Put this in a file like ~/helper.sh, then open a bash shell and source it using . ~/helper.sh.
Now you can run it like this: process_and_evaluate ls -lA.
Edit3:
To capture output / error you could extend the macro like this:
function process_and_evaluate()
{
    out=$1
    err=$2
    shift 2
    strace -o /tmp/output/strace_output "$@" > "$out" 2> "$err"
    /path/to/script.py /tmp/output/strace_output
}
You would have to use the (less obvious) process_and_evaluate out.txt err.txt ls -lA.
This is the best that I can come up with...
At least in your simple example, you could just run the python script as an argument to echo, e.g.
$ echo $(python test.py ls) > test.txt
$ more test.txt
['test.py', 'ls']
Enclosing a command in parentheses preceded by a dollar sign, $(...), first executes the contents and then passes the output as arguments to echo.
