Shell Script: Execute a python program from within a shell script - python

I've tried googling the answer but with no luck.
I need to use my work's supercomputer server, but for my Python script to run, it must be executed via a shell script.
For example, I want job.sh to execute python_script.py.
How can this be accomplished?

Just make sure the python executable is in your PATH environment variable, then add this line to your script:
python path/to/the/python_script.py
Details:
In the file job.sh, put this
#!/bin/sh
python python_script.py
Execute this command to make the script runnable for you: chmod u+x job.sh
Run it: ./job.sh
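For example, a minimal sketch of such a job.sh (the absolute path to python_script.py is a placeholder; point it at wherever your script actually lives):
#!/bin/sh
# Use the python found on PATH and give the script's full path,
# so the job works no matter which directory it is launched from.
python /path/to/python_script.py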

Method 1 - Create a shell script:
Suppose you have a python file hello.py
Create a file called job.sh that contains
#!/bin/bash
python hello.py
mark it executable using
$ chmod +x job.sh
then run it
$ ./job.sh
Method 2 (BETTER) - Make the python script itself runnable from the shell:
Modify your script hello.py and add this as the first line
#!/usr/bin/env python
mark it executable using
$ chmod +x hello.py
then run it
$ ./hello.py

Save the following program as print.py:
#!/usr/bin/python3
print('Hello World')
Then in the terminal type:
chmod +x print.py
./print.py

You should be able to invoke it as python scriptname.py, e.g.
#!/bin/bash
python /home/user/scriptname.py
Also make sure the script has permissions to run.
You can make it executable by using chmod u+x scriptname.py.

Imho, writing
python /path/to/script.py
is quite wrong, especially these days. Which python? python2.6? 2.7? 3.0? 3.1? Most of the time you need to specify the Python version in the shebang line of the Python file. I encourage using #!/usr/bin/env python2 (or python2.6, python3, or even python3.1) for compatibility.
In that case, it is much better to make the script executable and invoke it directly:
#!/bin/bash
/path/to/script.py
This way the version of python you need is written in only one file. Most systems these days have both python2 and python3, and it can happen that the python symlink points to python3 while most people expect it to point to python2.
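To see what a bare python resolves to on a given machine, a quick check like this makes the symlink situation explicit (standard shell tools, not part of the original answer):
# Show where the "python" name points and which version it runs
ls -l "$(command -v python)"
python --version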

This works for me:
Create a new shell file, say job.sh:
touch job.sh, then add the commands that run your Python scripts (you can even pass command-line arguments to them; I usually predefine my command-line arguments).
chmod +x job.sh
Inside job.sh, add the lines that run your .py files, for example:
python python_file.py argument1 argument2 argument3 >> testpy-output.txt && echo "Done with python_file.py"
python python_file1.py argument1 argument2 argument3 >> testpy-output.txt && echo "Done with python_file1.py"
Output of job.sh should look like this:
Done with python_file.py
Done with python_file1.py
I usually use this when I have to run multiple Python files with different, predefined arguments.
Note: Just a quick heads up on what's going on here:
python python_file.py argument1 argument2 argument3 >> testpy-output.txt && echo "Done with python_file.py"
Here the shell script runs python_file.py and passes multiple command-line arguments to it at run time.
This does not necessarily mean you have to pass command-line arguments.
You can just use it like python python_file.py, plain and simple.
Next, the >> appends the output of this .py file to the testpy-output.txt file.
&& is a logical AND: the command after it runs only if the command before it exits successfully, so the optional echo "Done with python_file.py" is printed to your cli/terminal once the script finishes without error.
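Putting it all together, a sketch of what the whole job.sh could look like (the script names and arguments are the placeholders from the example above):
#!/bin/bash
# Run two Python scripts in sequence, appending their output to a log file
# and reporting progress on the terminal after each one succeeds.
python python_file.py argument1 argument2 argument3 >> testpy-output.txt && echo "Done with python_file.py"
python python_file1.py argument1 argument2 argument3 >> testpy-output.txt && echo "Done with python_file1.py"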

This works best for me:
Add this at the top of the script:
#!c:/Python27/python.exe
(C:\Python27\python.exe is the path to the python.exe on my machine)
Then run the script via:
chmod +x script-name.py && ./script-name.py

I use this and it works fine
#!/bin/bash
/usr/bin/python python_script.py

Since the other posts already say everything (and I stumbled upon this post while looking for the following), here is a way to execute a Python script from another Python script:
Python 2:
execfile("somefile.py", global_vars, local_vars)
Python 3:
with open("somefile.py") as f:
    code = compile(f.read(), "somefile.py", 'exec')
    exec(code, global_vars, local_vars)
and you can supply arguments by providing some other sys.argv beforehand.

Here I have demonstrated an example of combining a Python script with shell commands. Depending on your purpose, you may need to read the output of a shell command, or execute both Python code and shell commands from the same file.
To execute a shell command from Python, use the os.system() method. To read the output of a shell command, use os.popen().
The following example greps all processes whose command line contains the text sample_program.py, collects their process IDs (using Python), and then kills them all.
#!/usr/bin/python3
import os
# listing all matched processes and taking the output into a variable s
s = os.popen("ps aux | grep 'sample_program.py'").read()
s = '\n'.join([l for l in s.split('\n') if "grep" not in l]) # avoiding killing the grep itself
print("To be killed:")
print(s)
# now manipulating this string s and finding the process IDs and killing them
os.system("kill -9 " + ' '.join([x.split()[1] for x in s.split('\n') if x]))
References:
Execute a python program from within a shell script
Assign output of os.system to a variable and prevent it from being displayed on the screen
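As an aside, a similar effect can be had from the shell alone; a one-line sketch, assuming the pkill utility from procps is available:
# Kill every process whose full command line mentions sample_program.py
pkill -9 -f sample_program.py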

If you have a bash script and you need to run a python3 script (with external modules) inside it, I recommend pointing your bash script at the full path of the Python interpreter, like this:
#!/usr/bin/env bash
-- bash code --
/usr/bin/python3 your_python.py
-- bash code --

Related

Get user input for a Python script run from a bash script

TL;DR at the end
I have a simple Python script with a few functions; let's use this main.py as a minimal example:
import sys

def userAdd():
    var = input("Please type any input: ")
    print("My input is", var)

if __name__ == '__main__':
    globals()[sys.argv[1]]()
When I call:
python main.py userAdd
Everything works fine. The script runs, the Python console asks for my input, and then edits the JSON file. Now I want this script to be executed every time I edit a text file, say myfile.txt, which for now only has this line:
foo
For this, I use the following bash script, with the "RUN COMMAND" line replaced by python main.py userAdd (let's call it update.sh):
#!/bin/bash
### Set initial time of file
LTIME=`stat -c %Z ./myfile.txt`
while true
do
    ATIME=`stat -c %Z ./myfile.txt`
    if [[ "$ATIME" != "$LTIME" ]]
    then
        python main.py userAdd
        LTIME=$ATIME
    fi
    sleep 1
done
My problem happens here. Every time the Python script is called from the Bash script, the input prompt shows up. I enter a value, and I get bash: <input>: command not found, which means the terminal I'm typing into is being read by Bash, not Python.
$ chmod +x update.sh
$ ./update.sh &
$ echo "bar" >> myfile.txt
$ Please type any input: test
bash: test : command not found
I tried a few things (using /usr/bin/env and /dev/tty or <&1, or using python -i).
tl;dr: My Python script asks for user input to update a file. When I run it directly from my bash terminal (python main.py myfunction), it works fine. When I run the same Python script from a Bash script (which contains that same python [...] line) and type my input, I get bash: <input>: command not found. This means the input is read by Bash, not Python. How can I get a Python prompt that will accept my input in this case?
For anyone who might be facing the same issue, here's how I solved it: I edited the line in update.sh that invokes the Python script, like so:
#!/bin/bash
### Set initial time of file
LTIME=`stat -c %s ./myfile.txt`
while true
do
    ATIME=`stat -c %s ./myfile.txt`
    if [[ "$ATIME" != "$LTIME" ]]
    then
        /usr/bin/konsole -e /usr/bin/bash -c './main.py userAdd'
        LTIME=$ATIME
    fi
    sleep 1
done
So this line ties me to a specific terminal emulator, but I think more advanced Linux users could easily work around that.
Then, when I run the script using ./update.sh & and add this line to a cron, every time I edit and close the file a new terminal runs, asks for the input, prints the result, then closes (although for the sake of the example a wait should be added after the print() so the window doesn't close immediately).

Trying to run python script in PHP

I'm trying to run a Python script inside a perl script with the following command:
system("python3 script.py -d http:\/\/site.com --no-interaction");
qx/python3 script.py -d http:\/\/site.com --no-interaction/;
On the operating system's command line the Python script executes, but when I make the call from a PHP application, the Perl part works but the Python script doesn't run.
Do you get any error message from the Perl side?
Likely the directory your PHP/Perl script runs from isn't the same as the one where script.py lives. Try using the full path to the Python script. Also double-check that python3 is in your $PATH.
For example:
-> cat /home/me/python/script.py
print("This line will be printed.")
-> cat /home/me/perl/pytest.pl
#!/bin/env perl
print "From perl:\n";
system ("python3 /home/me/python/script.py");
cd /home/me/perl/
ksh
whence python3
"/usr/bin"
pytest.pl
"From perl:
This line will be printed."
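The equivalent sanity check in bash rather than ksh might look like this (a sketch reusing the paths from the example above):
# Confirm that python3 is on PATH, then run the Perl wrapper by full path
command -v python3
/home/me/perl/pytest.pl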

Run a script using full path

I have a folder called TEST containing:
script.py
script.sh
The bash file is:
#!/bin/bash
# Run the python script
python script.py
If I run the bash file like this:
./TEST/script.sh
I get the following error:
python: can't open file 'script.py': [Errno 2] No such file or directory
What can I do to tell script.sh to look in its own directory (which may change), so that I can run it from outside the TEST directory?
Trickier still: my Python script uses a sqlite database, and I have the same problem when calling the script from outside the folder; it doesn't look inside the folder to find the database!
Alternative
You are able to run the script directly by adding this line to the top of your python file:
#!/usr/bin/env python
and then making the file executable:
$ chmod +x script.py
With this, you can run the script directly with ./TEST/script.py
What you asked for specifically
This works to get the path of the script, and then pass that to python.
#!/bin/sh
SCRIPTPATH="$( cd "$(dirname "$0")" ; pwd -P )"
python "$SCRIPTPATH/script.py"
Also potentially useful:
You mentioned having this problem when accessing a sqlite DB in the same folder; the shell-side fix above will not solve that on its own, because the Python process still runs with the original working directory. I imagine this question may be of use to you for that problem: How do I get the path of the Python script I am running in?
You could use $0, which is the name of the currently executing program as invoked, combined with dirname, which provides the directory component of a file path, to determine the path (absolute or relative) under which the shell script was invoked. Then you can apply it to the python invocation.
This example worked for me:
$ t/t.sh
Hello, world!
$ cat t/t.sh
#!/bin/bash
python "$(dirname $0)/t.py"
Take it a step further and change your current working directory, which will also be inherited by python, thus helping it find its database:
$ t/t.sh; cat t/t.sh ; cat t/t.py ; cat t/message.txt
hello, world!
#!/bin/bash
cd "$(dirname $0)"
python t.py
with open('message.txt') as msgf:
    print(msgf.read())
hello, world!
From the shell script, you can always find your current directory: Getting the source directory of a Bash script from within. While the accepted answer to that question provides a very comprehensive and robust solution, your relatively simple case only really needs something like
#!/bin/bash
dir="$(dirname "${BASH_SOURCE[0]}")"
# Run the python script
python "$(dir)"/script.py
Another way to do it would be to change the directory from which you run the script:
#!/bin/bash
dir="$(dirname "${BASH_SOURCE[0]}")"
# Run the python script
(cd "$dir"; python script.py)
The parentheses (...) around cd and python create a subshell, so the directory does not change for the rest of your bash script. This may not be necessary if you don't do anything else in the bash portion, but it is still useful if you ever decide to, say, source your script instead of running it as a subprocess.
If you do not change the directory in bash, you can do it in Python using a combination of sys.argv[0], os.path.dirname and os.chdir:
import sys
import os
...
os.chdir(os.path.dirname(sys.argv[0]))

Shell script not executing a python file

I am trying to run a script which in turn should execute a basic python script.
This is the shell script:
#!usr/bin/bash
mv ~/Desktop/source/movable.py ~/Desktop/dest
cd ~/Desktop/dest
pwd
ls -lah
chmod +x movable.py
python movable.py
echo "Just ran a python file from a shell script"
This is the python script:
#!usr/bin/python
import os
print("movable transfered to dest")
os.system("pwd")
os.system("mv ~/Desktop/dest/movable.py ~/Desktop/source")
print("movable transfered to dest")
os.system("cd ~/Desktop/source")
os.system("pwd")
Q1. The shell script is not executing the python file. What am I doing wrong?
Q2. Do I need to write the first line #!usr/bin/python in the python script?
Thank you.
You are missing a '/' in the shebang line:
#!usr/bin/python
should be
#!/usr/bin/python
Another thing I noticed: you are calling cd via os.system(). Since that command is executed in a subshell, it won't change the current directory of the calling script's process.
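The subshell behaviour is easy to see from the shell itself; here is a tiny standalone sketch (not tied to the scripts above):
#!/bin/bash
# A cd inside a subshell (the parentheses) is local to that subshell,
# just as a cd run via os.system() is local to the shell it spawns.
pwd                 # prints the starting directory
(cd /tmp && pwd)    # prints /tmp
pwd                 # prints the starting directory again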

bash wrap a piped command with a python script

Is there a way to create a python script which wraps an entire bash command, including the pipes?
For example, if I have the following simple script
import sys
print sys.argv
and call it like so (from bash or ipython), I get the expected outcome:
[pkerp@pendari trell]$ python test.py ls
['test.py', 'ls']
If I add a pipe, however, the output of the script gets redirected to the pipe sink:
[pkerp@pendari trell]$ python test.py ls > out.txt
And the > out.txt portion is not in sys.argv. I understand that the shell processes this redirection automatically, but I'm curious if there's a way to force the shell to ignore it and pass it through to the process being called.
The point of this is to create something like a wrapper for the shell. I'd like to run the commands regularly, but keep track of the strace output for each command (including the pipes). Ideally I'd like to keep all of the bash features, such as tab-completion and up and down arrows and history search, and then just pass the completed command through a python script which invokes a subprocess to handle it.
Is this possible, or would I have to write my own shell to do this?
Edit
It appears I'm asking the exact same thing as this question.
The only thing you can do is pass the entire shell command as a string, then let Python pass it back to a shell for execution.
$ python test.py "ls > out.txt"
Inside test.py, something like
import subprocess
subprocess.call("strace " + sys.argv[1], shell=True, executable="/bin/bash")
to ensure the entire string is passed to the shell (and bash, specifically).
Well, I don't quite see what you are trying to do. The general approach would be to give the desired output destination to the script using command line options: python test.py ls --output=out.txt. Incidentally, strace writes to stderr. You could capture everything using strace python test.py > out 2> err if you want to save everything...
Edit: If your script writes to stderr as well you could use strace -o strace_out python test.py > script_out 2> script_err
Edit2: Okay, I understand better what you want. My suggestion is this: Write a bash helper:
function process_and_evaluate()
{
    strace -o /tmp/output/strace_output "$@"
    /path/to/script.py /tmp/output/strace_output
}
Put this in a file like ~/helper.sh. Then open a bash shell and source it using . ~/helper.sh.
Now you can run it like this: process_and_evaluate ls -lA.
Edit3:
To capture output / error you could extend the macro like this:
function process_and_evaluate()
{
    out=$1
    err=$2
    shift 2
    strace -o /tmp/output/strace_output "$@" > "$out" 2> "$err"
    /path/to/script.py /tmp/output/strace_output
}
You would have to call it with the (less obvious) process_and_evaluate out.txt err.txt ls -lA.
This is the best that I can come up with...
At least in your simple example, you could just run the python script as an argument to echo, e.g.
$ echo $(python test.py ls) > test.txt
$ more test.txt
['test.py','ls']
Enclosing a command in parentheses preceded by a dollar sign first executes the contents, then passes the output as arguments to echo.
