I am trying to write a script that executes a couple of things on a Linux server. I would like to use bash for most of the Linux-specific commands and only use Python for the more complex parts, but to do that I need to export some variables from the bash script and use them in the Python script, and I haven't found a way to do so. I have created two very small scripts to test this functionality:
1.sh is a bash script
#!/bin/bash
test_var="Test Variable"
export test_var
echo "1.sh has been executed"
python 2.sh
2.sh is a Python script:
#!/usr/bin/env python3
print("The python script has been invoked successfully")
print(test_var)
As you can guess, when I execute the first script, the second one fails with an error about an unknown variable:
$ ./1.sh
1.sh has been executed
The python script has been invoked successfully
Traceback (most recent call last):
File "2.sh", line 4, in <module>
print(test_var)
NameError: name 'test_var' is not defined
The reason I am trying to do this is that I am more comfortable with bash, and I want to use the $1 and $2 positional parameters as I do in bash. Is this also possible in Python?
[EDIT] - I have just found out how to use $1 and $2 in Python: you need to use sys.argv[1] and sys.argv[2] and import the sys module with import sys.
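A minimal sketch of a script that reads them:
#!/usr/bin/env python3
import sys

# sys.argv[0] is the script's own name; positional arguments start at index 1
print("first argument:", sys.argv[1])
print("second argument:", sys.argv[2])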
To use environment variables from your Python script, you need to call:
import os
os.environ['test_var']
os.environ is a dictionary containing all the environment variables, so you can use all the methods a dict has. For instance, you could write:
os.environ.get('test_var', 'default_value')
Check the Python file's extension: it should be .py instead of .sh.
1.sh
#!/bin/bash
test_var="Test Variable"
export test_var
echo "1.sh has been executed"
python 2.py
The os module gives you access to the environment variables. The following Python code will give you the required result:
#!/usr/bin/env python3
import os
print("The python script has been invoked successfully")
print(os.environ['test_var'])
For reference, see: How do I access environment variables from Python?
Related
I was hoping to write a Python script that creates some appropriate environment variables by running it in whatever directory I'll be executing some simulation code in, and I've read that I can't write a script that makes these env vars persist in the macOS terminal. So two things:
Is this true?
and
It seems like it would be a useful thing to do; why isn't it possible in general?
You can't do it from Python, but some clever bash tricks can do something similar. The basic reasoning is this: environment variables exist in a per-process memory space. When a new process is created with fork(), it inherits its parent's environment variables. When you set an environment variable in your shell (e.g. bash) like this:
export VAR="foo"
What you're doing is telling bash to set the variable VAR in its process space to "foo". When you run a program, bash uses fork() and then exec() to run the program, so anything you run from bash inherits the bash environment variables.
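You can see this inheritance directly from the shell; a quick illustrative session (assuming python3 is installed):
$ export VAR="foo"
$ python3 -c 'import os; print(os.environ["VAR"])'
foo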
Now, suppose you want to create a bash command that sets some environment variable DATA with content from a file in your current directory called ".data". First, you need to have a command to get the data out of the file:
cat .data
That prints the data. Now, we want to create a bash command to set that data in an environment variable:
export DATA=`cat .data`
That command takes the contents of .data and puts it in the environment variable DATA. Now, if you put that inside an alias command, you have a bash command that sets your environment variable:
alias set-data="export DATA=`cat .data`"
You can put that alias command inside the .bashrc or .bash_profile files in your home directory to have that command available in any new bash shell you start.
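Usage would then look something like this (assuming a .data file exists in the current directory):
$ echo "hello" > .data
$ set-data
$ echo $DATA
hello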
One workaround is to output export commands and have the parent shell evaluate them:
thescript.py:
import pipes
import random
r = random.randint(1,100)
print("export BLAHBLAH=%s" % (pipes.quote(str(r))))
...and the bash alias (the same can be done in most shells, even tcsh!):
alias setblahblahenv="eval $(python thescript.py)"
Usage:
$ echo $BLAHBLAH
$ setblahblahenv
$ echo $BLAHBLAH
72
You can output any arbitrary shell code, including multiple commands like:
export BLAHBLAH=23 SECONDENVVAR='something else' && echo 'everything worked'
Just remember to be careful about escaping any dynamically created output (pipes.quote is good for this; on Python 3, prefer shlex.quote, as the pipes module is deprecated).
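A Python 3 sketch of thescript.py using shlex.quote instead:
import random
import shlex

# quote the value so the generated shell code is safe to eval
r = random.randint(1, 100)
print("export BLAHBLAH=%s" % shlex.quote(str(r)))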
If you set environment variables within a python script (or any other script or program), it won't affect the parent shell.
Edit clarification:
So the answer to your question is yes, it is true.
You can, however, export from within a shell script and source it using the dot invocation.
In fooexport.sh:
export FOO="bar"
At the command prompt:
$ . ./fooexport.sh
$ echo $FOO
bar
It's not generally possible. The new process created for Python cannot affect its parent process's environment. Neither can the parent affect the child once it is running, but the parent gets to set up the child's environment as part of creating the new process.
Perhaps you can set them in .bashrc, .profile or the equivalent "runs on login" or "runs on every new terminal session" script in macOS.
You can also have Python start the simulation program with the desired environment (use the env parameter of subprocess.Popen; see http://docs.python.org/library/subprocess.html):
import subprocess, os
os.chdir('/home/you/desired/directory')
subprocess.Popen(['desired_program_cmd', 'args', ...], env=dict(SOMEVAR='a_value') )
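Note that passing env replaces the child's entire environment. If you only want to add one variable on top of the current environment, a merged copy is a reasonable sketch:
import subprocess, os

os.chdir('/home/you/desired/directory')
# start from a copy of the current environment and add to it
child_env = dict(os.environ, SOMEVAR='a_value')
subprocess.Popen(['desired_program_cmd', 'args'], env=child_env)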
Or you could have python write out a shell script like this to a file with a .sh extension:
export SOMEVAR=a_value
cd /home/you/desired/directory
./desired_program_cmd
and then chmod +x it and run it from anywhere.
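A minimal sketch of generating such a wrapper from Python (the filename run_sim.sh and the paths are placeholders):
import os
import stat

script = """#!/bin/bash
export SOMEVAR=a_value
cd /home/you/desired/directory
./desired_program_cmd
"""
with open("run_sim.sh", "w") as f:
    f.write(script)
# equivalent of chmod +x
mode = os.stat("run_sim.sh").st_mode
os.chmod("run_sim.sh", mode | stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH)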
What I like to do is use /usr/bin/env in a shell script to "wrap" my command line when I find myself in similar situations:
#!/bin/bash
/usr/bin/env NAME1="VALUE1" NAME2="VALUE2" "$@"
("$@" preserves the quoting of arguments containing spaces, which ${*} would break.)
So let's call this script "myappenv". I put it in my $HOME/bin directory which I have in my $PATH.
Now I can invoke any command using that environment by simply prepending "myappenv", like so:
myappenv dosometask -xyz
Other posted solutions work too, but this is my personal preference. One advantage is that the environment is transient, so if I'm working in the shell only the command I invoke is affected by the altered environment.
Modified version based on new comments
#!/bin/bash
/usr/bin/env G4WORKDIR="$PWD" "$@"
You could wrap this all up in an alias too. I prefer the wrapper script approach since I tend to have other environment prep in there too, which makes it easier for me to maintain.
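For reference, the alias version might look like this (same placeholder names as above):
alias myappenv='/usr/bin/env NAME1="VALUE1" NAME2="VALUE2"'
Since an alias is expanded in place, myappenv dosometask -xyz becomes the same env invocation as the wrapper script.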
Building on Benson's answer, the best hack-around is to create a simple bash function to preserve arguments:
upsert-env-var () { eval "$(python upsert_env_var.py "$@")"; }
You can do whatever you want with the arguments in your Python script. To simply add a variable, use something like:
import os
import sys

var = sys.argv[1]
val = sys.argv[2]
if os.environ.get(var):
    print("export %s=%s:%s" % (var, val, os.environ[var]))
else:
    print("export %s=%s" % (var, val))
Usage:
upsert-env-var VAR VAL
As others have pointed out, the reason this doesn't work is that environment variables live in a per-process memory space and thus die when the Python process exits.
They point out that a solution is to define an alias in .bashrc that does what you want, such as this:
alias export_my_program='export MY_VAR=$(my_program)'
(The single quotes matter here: with double quotes, the command substitution would run once at alias definition time instead of each time the alias is used.)
However, there's another (a tad hacky) method which does not require you to modify .bashrc, nor does it require my_program to be in $PATH (or the full path to it to be specified in the alias). The idea is to run the program in Python if it is invoked normally (./my_program), but in Bash if it is sourced (source my_program). (Using source on a script does not spawn a new process and thus does not kill environment variables created within it.) You can do that as follows:
my_program.py:
#!/usr/bin/env python3
_UNUSED_VAR=0
_UNUSED_VAR=0 \
<< _UNUSED_VAR
#=======================
# Bash code starts here
#=======================
'''
_UNUSED_VAR
export MY_VAR=`$(dirname $0)/my_program.py`
echo $MY_VAR
return
'''
#=========================
# Python code starts here
#=========================
print('Hello environment!')
Running this in Python (./my_program.py), the first 3 lines will not do anything useful and the triple-quotes will comment out the Bash code, allowing Python to run normally without any syntax errors from Bash.
Sourcing this in bash (source my_program.py), the heredoc (<< _UNUSED_VAR) is a hack used to "comment out" the first-triple quote, which would otherwise be a syntax error. The script returns before reaching the second triple-quote, avoiding another syntax error. The export assigns the result of running my_program.py in Python from the correct directory (given by $(dirname $0)) to the environment variable MY_VAR. echo $MY_VAR prints the result on the command-line.
Example usage:
$ source my_program.py
Hello environment!
$ echo $MY_VAR
Hello environment!
However, if run normally, the script will still do everything it did before except exporting the environment variable:
$ ./my_program.py
Hello environment!
$ echo $MY_VAR
<-- Empty line
As noted by other authors, the memory is thrown away when the Python process exits. But while the Python process is running, you can edit its environment. For example:
>>> import os
>>> os.environ["foo"] = "bar"
>>> import subprocess
>>> subprocess.call(["printenv", "foo"])
bar
0
>>> os.environ["foo"] = "foo"
>>> subprocess.call(["printenv", "foo"])
foo
0
I have a folder called TEST containing:
script.py
script.sh
The bash file is:
#!/bin/bash
# Run the python script
python script.py
If I run the bash file like this:
./TEST/script.sh
I get the following error:
python: can't open file 'script.py': [Errno 2] No such file or directory
What can I do to tell script.sh to look in its own directory (which may change), so that I can run it from outside the TEST directory?
Trickier still: my Python file uses a SQLite database, and I have the same problem when calling the script from outside the folder; it doesn't look inside the folder to find the database!
Alternative
You are able to run the script directly by adding this line to the top of your Python file:
#!/usr/bin/env python
and then making the file executable:
$ chmod +x script.py
With this, you can run the script directly with ./TEST/script.py
What you asked for specifically
This gets the path of the script and then passes it to python:
#!/bin/sh
SCRIPTPATH="$( cd "$(dirname "$0")" ; pwd -P )"
python "$SCRIPTPATH/script.py"
Also potentially useful:
You mentioned having this problem with accessing a SQLite DB in the same folder; the path fix above will not solve that on its own, because it is the Python process's working directory that matters when opening the database. I imagine this question may be of use to you for that problem: How do I get the path of the Python script I am running in?
You could use $0, which is the name of the currently executing program as invoked, combined with dirname, which provides the directory component of a file path, to determine the path (absolute or relative) under which the shell script was invoked. Then you can apply it to the Python invocation.
This example worked for me:
$ t/t.sh
Hello, world!
$ cat t/t.sh
#!/bin/bash
python "$(dirname $0)/t.py"
Take it a step further and change your current working directory, which will also be inherited by python, thus helping it to find its database:
$ t/t.sh; cat t/t.sh ; cat t/t.py ; cat t/message.txt
hello, world!
#!/bin/bash
cd "$(dirname $0)"
python t.py
with open('message.txt') as msgf:
    print(msgf.read())
hello, world!
From the shell script, you can always find your current directory: Getting the source directory of a Bash script from within. While the accepted answer to that question provides a very comprehensive and robust solution, your relatively simple case only really needs something like:
#!/bin/bash
dir="$(dirname "${BASH_SOURCE[0]}")"
# Run the python script
python "$(dir)"/script.py
Another way to do it would be to change the directory from which you run the script:
#!/bin/bash
dir="$(dirname "${BASH_SOURCE[0]}")"
# Run the python script
(cd "$dir"; python script.py)
The parentheses (...) around cd and python create a subshell, so the directory does not change for the rest of your bash script. This may not be necessary if you don't do anything else in the bash portion, but it is still useful if you ever decide to, say, source your script instead of running it as a subprocess.
If you do not change the directory in bash, you can do it in Python using a combination of sys.argv[0], os.path.dirname and os.chdir:
import sys
import os
...
os.chdir(os.path.dirname(sys.argv[0]))
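One caveat with this sketch: sys.argv[0] can be a bare filename (e.g. when you run python script.py from the script's own directory), in which case os.path.dirname returns an empty string and os.chdir fails. Taking the absolute path first avoids that:
import os
import sys

# abspath ensures dirname is never empty
os.chdir(os.path.dirname(os.path.abspath(sys.argv[0])))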
How do I set environment variables in one script and use them in another when piping the output between them? I need a certain environment variable to be set in the first script and read in the second.
I am using Python scripts.
python script1.py | python script2.py
In script1.py, I am doing:
os.environ["some_key"] = some_value
and in script2.py, I am trying to read it as:
saved_value = os.environ["some_key"]
When I run the scripts as shown above, I get:
Traceback (most recent call last):
File "script2.py", line 11, in <module>
filename = os.environ["some_key"]
File "/usr/lib/python2.7/UserDict.py", line 23, in __getitem__
raise KeyError(key)
Is there any way this can be corrected? I specifically need the value to be read from os.environ["some_key"] in script2.py, as that file is part of a bigger framework and I can't change it. I have flexibility with script1.py.
It's not possible to set environment variables of another process - unless you launch it yourself.
If you can, consider changing script1 to launch script2 directly using subprocess.Popen, and supply the env argument.
If you can't do this, you'll need to change the parent process (probably the bash script that launches the scripts) to somehow receive the desired variable from script1 and set it when launching script2.
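A hedged sketch of the first option, using the script names from the question (the value assigned is a placeholder):
import os
import subprocess

# script1.py: set the variable in a copy of our own environment,
# then launch script2.py with that environment
child_env = dict(os.environ, some_key="some_value")
subprocess.Popen(["python", "script2.py"], env=child_env).wait()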
You won't be able to alter the environment of a sibling process.
You can use a convoluted construct like the one below:
{ export some_key=$(python script1.py) ; } |& python script2.py
where script1.py outputs the environment variable to stdout and writes the stuff to be fed to script2.py to stderr. This will only allow setting one environment variable though.
Example:
$ cat script1.py
import sys
print("some_env_value")
sys.stderr.write("some_piped_value\n")
$ cat script2.py
import os
import sys
print("Passed through environment:", os.environ["some_key"])
print("Passed through stdin:", end=" ")
for line in sys.stdin:
    print(line)
Result:
$ { export some_key=$(python script1.py) ; } |& python script2.py
Passed through environment: some_env_value
Passed through stdin: some_piped_value
I have a demo file: test.py.
In the Windows Console I can run the file with: C:\>test.py
How can I execute the file in the Python Shell instead?
Use execfile for Python 2:
>>> execfile('C:\\test.py')
Use exec for Python 3:
>>> exec(open("C:\\test.py").read())
If you want to run the script and end at a prompt (so you can inspect variables, etc.), then use:
python -i test.py
That will run the script and then drop you into a Python interpreter.
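For example, a hypothetical session, assuming test.py contains x = 42:
$ python -i test.py
>>> x
42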
It depends on what is in test.py. The following is an appropriate structure:
# suppose this is your 'test.py' file
def main():
    """This function runs the core of your program"""
    print("running main")

if __name__ == "__main__":
    # if you call this script from the command line (the shell) it will
    # run the 'main' function
    main()
If you keep this structure, you can run it like this in the command line (assume that $ is your command-line prompt):
$ python test.py
$ # it will print "running main"
If you want to run it from the Python shell, then you simply do the following:
>>> import test
>>> test.main() # this calls the main part of your program
There is no necessity to use the subprocess module if you are already using Python. Instead, try to structure your Python files in such a way that they can be run both from the command line and the Python interpreter.
For newer versions of Python:
exec(open(filename).read())
If you want to avoid writing all of this every time, you can define a function:
def run(filename):
    exec(open(filename).read())
and then call it:
run('filename.py')
From the same folder, you can do:
import test
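Keep in mind that import runs the module's top-level code only once per session; re-importing does nothing. To run it again after editing the file, reload it, for example:
>>> import test
>>> import importlib
>>> importlib.reload(test)  # re-executes test.py's top-level code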