I'm working with an HPC where I write a bash script which executes my python_script.py job like so:
#!/bin/bash -l
#$ -l h_rt=0:00:10
#$ -l mem=1G
#$ -cwd
module load python3/3.8
python3 python_script.py
In my python_script.py I define the variables repeat and directory like so:
repeat = 10
directory = r'User/somedir/'
I would like to be able to set these variables in my bash script, so that they overwrite these values within python_script.py and are used instead.
The easiest way is to read them from sys.argv in your python script:
import sys
repeat = int(sys.argv[1])  # argv values arrive as strings, so convert
directory = sys.argv[2]
and in your bash script:
repeat="10"
directory="User/somedir/"
python3 python_script.py "$repeat" "$directory"
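If you would rather have the bash values override defaults (so the script still runs with no arguments at all), the standard-library argparse module is an option. The flag names below are my own choice for illustration, not part of the original script:

```python
import argparse

parser = argparse.ArgumentParser()
# Defaults mirror the hard-coded values in python_script.py.
parser.add_argument("--repeat", type=int, default=10)
parser.add_argument("--directory", default=r"User/somedir/")

# An explicit list is parsed here so the example is self-contained;
# in the real script, call parser.parse_args() to read the command line.
args = parser.parse_args(["--repeat", "5"])
print(args.repeat, args.directory)  # 5 User/somedir/
```

From bash you would then call python3 python_script.py --repeat "$repeat" --directory "$directory", and omitting a flag keeps the default from the script.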
The question is related to Linux Debian, bash and python3.
Run a python3 file from bash, send parameters from bash to the .py file, and output the result with echo in bash.
The following is a sample that works for me. It starts a python3 script from bash and shows the python3 script's output in the terminal.
bash_script.sh
#!/bin/bash
python3 python_script.py
python_script.py
#!/usr/bin/env python3
import random # import the random module
# determining the values of the parameters
mu = 100
sigma = 25
print(random.normalvariate(mu, sigma))
My question:
How do I change the code so that the parameters mu and sigma are sent from bash to python, and the python result is printed in bash via echo $python_script_output?
Below is a non-working draft of a solution:
bash_script.sh
#!/bin/bash
mu = 100
sigma = 25
python3 python_script.py
echo $python_script_output
python_script.py
#!/usr/bin/env python3
import random
print(random.normalvariate(mu, sigma))
Is there a particular reason you are looking to run the python script via bash?
If not, you can use the built-in python module sys to pass arguments to the python script and print the results directly to the terminal.
import sys

def func(arg1, arg2):
    # do_something with the two arguments
    return arg1, arg2

arg1, arg2 = sys.argv[1], sys.argv[2]
print(func(arg1, arg2))
Then you can call it directly from bash:
python script_name.py arg1 arg2
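For the mu/sigma question above, the missing piece on the bash side is command substitution. A sketch (the heredoc just creates a stand-in for python_script.py so the example is self-contained):

```shell
# Stand-in for python_script.py that reads mu and sigma from argv.
cat > python_script.py <<'EOF'
import random, sys
mu = float(sys.argv[1])
sigma = float(sys.argv[2])
print(random.normalvariate(mu, sigma))
EOF

mu=100
sigma=25
# $(...) captures the script's stdout into a bash variable.
python_script_output=$(python3 python_script.py "$mu" "$sigma")
echo "$python_script_output"
```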
Eventually I understood this, and it works.
bash script:
#!/bin/bash
#$ -V
#$ -cwd
#$ -o $HOME/sge_jobs_output/$JOB_ID.out -j y
#$ -S /bin/bash
#$ -l mem_free=4G
c=$SGE_TASK_ID
cd /home/xxx/scratch/test/
FILENAME=$(head -$c testlist | tail -1)
python testpython.py $FILENAME
python script:
#!/usr/bin/env python3
import sys, os
path = '/home/xxx/scratch/test/'
name1 = sys.argv[1]
job_id = os.path.join(path, name1)
f = open(job_id, 'r').readlines()
print(f[1])
thx
Exported bash variables are actually environment variables. You get at them through the os.environ object with a dictionary-like interface. Note that there are two types of variables in Bash: those local to the current process, and those that are inherited by child processes. Your Python script is a child process, so you need to make sure that you export the variable you want the child process to access.
To answer your original question, you need to first export the variable and then access it from within the python script using os.environ.
#!/bin/bash
#$ -V
#$ -cwd
#$ -o $HOME/sge_jobs_output/$JOB_ID.out -j y
#$ -S /bin/bash
#$ -l mem_free=4G
c=$SGE_TASK_ID
cd /home/xxx/scratch/test/
export FILENAME=$(head -$c testlist | tail -1)
chmod +x testpython.py
./testpython.py
#!/usr/bin/env python3
import sys
import os
for arg in sys.argv:
    print(arg)
f = open('/home/xxx/scratch/test/' + os.environ['FILENAME'], 'r').readlines()
print(f[1])
Alternatively, you may pass the variable as a command line argument, which is what your code is doing now. In that case, you must look in sys.argv, which is the list of arguments passed to your script. They appear in sys.argv in the same order you specified them when invoking the script. sys.argv[0] always contains the name of the program that's running. Subsequent entries contain other arguments. len(sys.argv) indicates the number of arguments the script received.
#!/usr/bin/env python3
import sys
import os
if len(sys.argv) < 2:
    print('Usage: ' + sys.argv[0] + ' <filename>')
    sys.exit(1)
print('This is the name of the python script: ' + sys.argv[0])
print('This is the 1st argument: ' + sys.argv[1])
f = open('/home/xxx/scratch/test/' + sys.argv[1], 'r').readlines()
print(f[1])
use this inside your script (edited per Aaron's suggestion):
def main(args):
    do_something(args[0])

if __name__ == "__main__":
    import sys
    main(sys.argv[1:])
Take a look at parsing Python arguments. Your bash code would be fine, just need to edit your Python script to take the argument.
Command line arguments to the script are available as sys.argv list.
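A quick way to see how sys.argv is populated, using python -c so the example is self-contained (note that with -c, argv[0] is the literal string '-c' rather than a script name):

```python
import subprocess
import sys

# Run a tiny inline script with two extra arguments and capture its output.
out = subprocess.run(
    [sys.executable, "-c", "import sys; print(sys.argv)", "foo", "bar"],
    capture_output=True, text=True,
).stdout.strip()
print(out)  # ['-c', 'foo', 'bar']
```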
I am trying to write a program that opens a gnome-terminal window and executes a python file in it.
When I call the gnome-terminal subprocess with the subprocess module like this:
import subprocess
subprocess.call(['gnome-terminal', '-x', 'python3 '+filename])
I get the following error:
Failed to execute child process "python3 /home/user/Documents/test.py" (No such file or directory)
I have tried to cd to the directory of /home/user/Documents/test.py first and then run the file, but it didn't work.
You're trying to execute the literal command python3 /home/user/Documents/test.py which obviously doesn't exist on your system.
When you type that line in a shell, the shell will split it on spaces and in the end it will call python3 with /home/user/Documents/test.py as argument.
When using subprocess.call, you have to do the splitting yourself.
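If you already have the whole command as one string, the standard-library shlex module can do the shell-style splitting for you (it only splits; it does not expand variables or globs):

```python
import shlex

# Split a command string the way a POSIX shell would.
cmd = "python3 /home/user/Documents/test.py"
print(shlex.split(cmd))  # ['python3', '/home/user/Documents/test.py']
```

The resulting list can be passed straight to subprocess.call.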
I believe you need to pass your filename as another element in the array. I don't have gnome-terminal, but I replicated your issue with plain sh.
import subprocess
subprocess.call(['gnome-terminal', '-x', 'python3', filename])
Try this:
from os import system
system("gnome-terminal -e 'bash -c \"python3 %s\"'"%filename)
Add other commands using a semicolon:
system("gnome-terminal -e 'bash -c \"python3 %s; [second command]\"'"%filename)
Try this (I assume that python3 is in your PATH):
from subprocess import Popen
command = ["gnome-terminal", "-x", "python3", filename]
proc = Popen(command)
If this does not work, first try running your python file on its own and see whether it works:
python3 filename
In UNIX from command line, I do
setenv HOME <path to home>
I pass it as an argument to my python script
python hello.py HOME
and do
sys.argv[1] = os.environ["HOME"]
but it still doesn't read the path.
I am new to Python; is os.environ correct for this case?
If your aim is to get the path of the home directory as an argument, you can just have the user send it by letting the shell evaluate the variable before calling the script.
I have a simple script like -
import sys
print(sys.argv[1])
In Windows I call it as -
set HOME=D:\
python script.py %HOME%
The output I get is -
D:\
In Linux -
$ python hello.py $HOME
output -
/home/random/
If you want to get it from the environment, passing the name of the environment variable to use as the first argument to the script, then you should change your script like:
sys.argv[1] = os.environ.get(sys.argv[1], sys.argv[1])
This would look up the name given in the first argument in the environment, falling back to the literal argument value if no such variable is set.
It seems this depends a little on your shell. For example, in Linux using bash 4.3-7ubuntu1.5:
$ export XXX="abc"
$ python
>>> import os
>>> os.environ["XXX"]
'abc'
>>> os.environ["HOME"]
'/home/alan'
I've tried googling the answer but with no luck.
I need to use my work's supercomputer server, but for my python script to run, it must be executed via a shell script.
For example I want job.sh to execute python_script.py
How can this be accomplished?
Just make sure the python executable is in your PATH environment variable, then add this to your script:
python path/to/the/python_script.py
Details:
In the file job.sh, put this
#!/bin/sh
python python_script.py
Execute this command to make the script runnable for you : chmod u+x job.sh
Run it : ./job.sh
Method 1 - Create a shell script:
Suppose you have a python file hello.py
Create a file called job.sh that contains
#!/bin/bash
python hello.py
mark it executable using
$ chmod +x job.sh
then run it
$ ./job.sh
Method 2 (BETTER) - Make the python script itself runnable from the shell:
Modify your script hello.py and add this as the first line
#!/usr/bin/env python
mark it executable using
$ chmod +x hello.py
then run it
$ ./hello.py
Save the following program as print.py:
#!/usr/bin/python3
print('Hello World')
Then in the terminal type:
chmod +x print.py
./print.py
You should be able to invoke it as python scriptname.py e.g.
#!/bin/bash
python /home/user/scriptname.py
Also make sure the script has permissions to run.
You can make it executable by using chmod u+x scriptname.py.
Imho, writing
python /path/to/script.py
Is quite wrong, especially these days. Which python? python2.6? 2.7? 3.0? 3.1? Most of the time you need to specify the python version in the shebang line of the python file. I encourage using #!/usr/bin/env python2 (or python2.6, python3, or even python3.1) for compatibility.
In such case, is much better to have the script executable and invoke it directly:
#!/bin/bash
/path/to/script.py
This way the version of python you need is written in only one file. Most systems these days have both python2 and python3, and it can happen that the symlink python points to python3, while most people expect it to point to python2.
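A quick diagnostic (my addition, not part of the answer above) to see which interpreter a bare name resolves to on a given machine:

```shell
# Where does python3 live on this system?
command -v python3                                      # e.g. /usr/bin/python3
# And which version is it?
python3 -c 'import sys; print(sys.version.split()[0])'
```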
This works for me:
Create a new shell file, say job.sh:
touch job.sh, then add the command(s) to run your python script (you can even add command line arguments to that python; I usually predefine mine).
chmod +x job.sh
Inside job.sh, add the lines that run the .py files, let's say:
python python_file.py argument1 argument2 argument3 >> testpy-output.txt && echo "Done with python_file.py"
python python_file1.py argument1 argument2 argument3 >> testpy-output.txt && echo "Done with python_file1.py"
Output of job.sh should look like this:
Done with python_file.py
Done with python_file1.py
I usually use this when I have to run multiple python files with different, predefined arguments.
Note: Just a quick heads up on what's going on here:
python python_file.py argument1 argument2 argument3 >> testpy-output.txt && echo "Done with python_file.py"
Here the shell script runs python_file.py, passing multiple command-line arguments to it at run time.
This does not necessarily mean you have to pass command line arguments; you can use it plain and simple: python python_file.py.
Next up, >> appends the output of the .py file to the testpy-output.txt file (rather than printing it to the terminal).
&& is a logical operator: the command after it runs only if the command before it succeeded, so the optional echo "Done with python_file.py" is printed to your cli/terminal at run time.
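The && behaviour is easy to check in isolation (nothing python-specific here):

```shell
# && runs the right-hand side only if the left-hand side exited with status 0.
true && echo "ran after success"
# Adding || gives a fallback branch, and keeps the whole line's exit status 0.
false && echo "skipped after failure" || echo "fallback ran"
```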
This works best for me:
Add this at the top of the script:
#!c:/Python27/python.exe
(C:\Python27\python.exe is the path to the python.exe on my machine)
Then run the script via:
chmod +x script-name.py && ./script-name.py
I use this and it works fine:
#!/bin/bash
/usr/bin/python python_script.py
Since the other posts say everything (and I stumbled upon this one while looking for the following), here is a way to execute a python script from another python script:
Python 2:
execfile("somefile.py", global_vars, local_vars)
Python 3:
with open("somefile.py") as f:
    code = compile(f.read(), "somefile.py", 'exec')
exec(code, global_vars, local_vars)
and you can supply args by providing some other sys.argv
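On Python 3 the standard-library runpy module is a tidier alternative; arguments still travel via sys.argv. The temp file below just makes the sketch self-contained:

```python
import os
import runpy
import sys
import tempfile

# Write a throwaway script so the example can actually run.
with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
    f.write("import sys\nresult = sys.argv[1:]\n")
    path = f.name

sys.argv = [path, "alpha", "beta"]     # supply args via sys.argv
module_globals = runpy.run_path(path)  # returns the executed module's globals
print(module_globals["result"])        # ['alpha', 'beta']
os.remove(path)
```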
Here I have demonstrated an example of running shell commands from within a python script. For different purposes you may need to read the output from a shell command, or execute both a python script and shell commands from the same file.
To execute a shell command from python use os.system() method. To read output from a shell command use os.popen().
Following is an example which will grep all processes having the text sample_program.py inside of it. Then after collecting the process IDs (using python) it will kill them all.
#!/usr/bin/python3
import os
# listing all matched processes and taking the output into a variable s
s = os.popen("ps aux | grep 'sample_program.py'").read()
s = '\n'.join([l for l in s.split('\n') if "grep" not in l]) # avoiding killing the grep itself
print("To be killed:")
print(s)
# now manipulating this string s and finding the process IDs and killing them
os.system("kill -9 " + ' '.join([x.split()[1] for x in s.split('\n') if x]))
References:
Execute a python program from within a shell script
Assign output of os.system to a variable and prevent it from being displayed on the screen
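os.system and os.popen still work, but the subprocess module is the interface the Python docs now recommend. A sketch of the capture step, with a harmless pipeline standing in for the ps | grep one:

```python
import subprocess

# shell=True keeps the pipe syntax from the os.popen version above;
# capture_output=True and text=True give stdout back as a str.
out = subprocess.run("echo 'a b c' | tr ' ' '\\n'",
                     shell=True, capture_output=True, text=True).stdout
print(out.split())  # ['a', 'b', 'c']
```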
If you have a bash script and you need to run a python3 script (with external modules) inside it, I recommend pointing your bash script at the full path of the python interpreter you want, like this:
#!/usr/bin/env bash
-- bash code --
/usr/bin/python3 your_python.py
-- bash code --