I need to launch a terminal from my Python script and execute a command in a clean environment, as if I had just opened a terminal emulator and typed the command there. But my script exports some variables that should not be available in this terminal. There can be an arbitrary number of such variables (some of them may even be set outside the script using the bash 'export' command), so I can't delete them manually before launching the terminal.
I tried the common solution that is claimed to reset the environment to its defaults, but it did not work. The code looks like this:
import subprocess
import os
os.environ['X'] = 'Y'
cmd = 'gnome-terminal -x env -i bash -c "echo $X; bash --noprofile --norc"'
subprocess.Popen([cmd], stdout=subprocess.PIPE, shell=True)
The output still prints "Y". When I try to do the same thing directly in a terminal, the result is the same:
$ export X=Y
$ gnome-terminal -x env -i bash -c "echo $X; bash --noprofile --norc"
A new terminal is opened and "Y" is printed.
Is there a solution to this problem?
Use the env argument when calling subprocess.Popen:
subprocess.Popen([cmd], stdout=subprocess.PIPE, shell=True, env={})
This will run it in as clean an environment as possible; however, many environment variables that might be needed will be missing. You may want to cache os.environ when your script starts and then populate the env argument from that cache, so the sub-process gets the environment variables you had when the script started.
Update (for clarity's sake): keep in mind that the current environment is always copied to any sub-process (and sub-processes cannot access or change the environment of their parents), so the above essentially takes the current environment and blanks it out, giving the sub-process a copy of an empty environment. If the sub-process cannot establish a new environment, it will never know the variables from your script's environment. One way to partially mitigate that is to let bash (or whatever shell your sub-process calls) load its profile and other user scripts, but it still won't get the global environment.
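If the goal is for the new terminal to see the environment exactly as it was before your script exported anything, one option (a minimal sketch, reusing the gnome-terminal invocation from the question) is to snapshot os.environ at startup and pass that snapshot as env:
import os
import subprocess

# Snapshot the environment before the script exports anything of its own.
BASE_ENV = dict(os.environ)

os.environ['X'] = 'Y'  # later exports are not part of the snapshot

# The spawned terminal gets the snapshot, so X is not defined inside it.
subprocess.Popen(
    ['gnome-terminal', '-x', 'bash', '--noprofile', '--norc'],
    env=BASE_ENV,
)
DISPLAY, HOME, PATH and friends are still present in the snapshot, so the terminal can start normally.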
Related
I work in tcsh and have an environment setup script with commands like setenv foo bar. I usually run this script with source set_env.sh and then, once the environment is set, run the compilation command make my_target...
I would like to automate the whole process from Python by calling subprocess.Popen or even os.system. The problems are:
The shell opened from Python does not recognize the source command. My assumption is that a /bin/bash shell is being opened. How do I force Python to open /bin/tcsh?
How do I preserve the changed environment settings for the second command (make my_target...)?
If you use the subprocess module in Python, it won't run a shell by default; it just launches the process directly. You can add shell=True, but that won't do you much good, since environment variables are set for the current process only and are not sent back to the parent process.
Python starts a tcsh child process, and child processes can't affect parent processes.
The solution would be to make your script print out the environment variables, and then parse the output of that in Python.
So if your script prints something like:
XX=one
YY=two
Then in Python do something along these lines:
for line in output.split('\n'):
    if line == '':
        continue
    # split on the first '=' only, since values may themselves contain '='
    k, v = line.split('=', 1)
    os.environ[k] = v
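Putting it together (a sketch that assumes the set_env.sh script and make target from the question), you can have tcsh source the script, dump the resulting environment with env, and parse that output in Python:
import os
import subprocess

# Have tcsh source the setup script, then print the resulting environment.
output = subprocess.check_output(
    ['/bin/tcsh', '-c', 'source set_env.sh && env'],
    text=True,
)

for line in output.splitlines():
    if '=' not in line:
        continue
    k, v = line.split('=', 1)
    os.environ[k] = v

# Processes started from now on inherit the imported variables.
subprocess.check_call(['make', 'my_target'])
This addresses both parts of the question: /bin/tcsh is invoked explicitly, and the settings survive because they are copied into os.environ before make runs.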
I'm trying to activate my virtualenv called env (created earlier) using call() from subprocess, and it doesn't work. The command itself is fine: it works when I type it directly in a terminal.
Python code:
import subprocess
subprocess.call("source env/bin/activate", shell=True)
I was trying also:
import os
os.system("source env/bin/activate")
Any ideas why the command has no effect, or what I should use instead of os.system() and subprocess.call()?
In both examples, your code launches a subprocess. That subprocess has its own environment; it runs the source command and then exits. Since the environment changes happen only inside that subprocess, they have no effect on the current process.
Assuming your end goal is to run some other command in the subprocess, you should just run it directly. You don't specifically need to activate the virtual environment.
subprocess.call(["./env/bin/pip", "list"])
Avoid using the shell=True option if at all possible; it can be quite dangerous if you're not extremely careful with it.
If you really do need the environment variables that the activate script sets, you will have to set them manually in os.environ. You can read the activate script to see what they are. They usually won't matter, though.
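If you do want a subprocess to behave as if the virtualenv were activated, here is a minimal sketch; VIRTUAL_ENV and PATH are the variables the stock activate script normally sets, though details can vary between versions:
import os
import subprocess

venv = os.path.abspath('env')  # the virtualenv from the question

child_env = dict(os.environ)
child_env['VIRTUAL_ENV'] = venv
child_env['PATH'] = os.path.join(venv, 'bin') + os.pathsep + child_env['PATH']
child_env.pop('PYTHONHOME', None)  # activate unsets this if it is set

# 'pip' now resolves to the virtualenv's pip via the modified PATH.
subprocess.call(['pip', 'list'], env=child_env)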
Recently I have been trying to use a Python script to set up my environment on Linux. This is one line of my code:
p = subprocess.call(['/bin/csh', '-c', "source setup.csh"])
My setup.csh file is below:
add questa10.2b
add ds5-2013.06
setenv MODELSIM modelsim.ini
But when I run my Python script, the screen shows that the files have been sourced, yet it turns out I still have to type the commands myself on the command line.
How can I solve this problem? Can anyone please help me with this?
You're creating a new csh shell as a subprocess and then running your commands inside that shell, which then terminates. The commands do not run in, or affect, the parent shell within which Python is running. When you just run the commands yourself, they affect the current shell.
If you need these settings to persist in your current shell after Python terminates, your best bet in general is to source setup.csh yourself rather than doing it from a Python script. If other child processes of the Python script need your environment variables, you can alter os.environ.
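For example (a small sketch mirroring the setenv line from setup.csh), changes made through os.environ are visible to any process Python starts afterwards, but never to the shell that launched Python:
import os
import subprocess

# Mirror the `setenv MODELSIM modelsim.ini` line from setup.csh.
os.environ['MODELSIM'] = 'modelsim.ini'

# Children inherit the variable; the parent shell does not.
subprocess.call(['/bin/csh', '-c', 'echo $MODELSIM'])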
I am writing a Python script (Linux) that adds some shell aliases (it writes them to $HOME/.bash_aliases).
In order to make an alias available immediately after it is written, I should issue the following bash builtin:
source $HOME/.bashrc
source is a bash builtin, so I cannot just do:
os.system('source $HOME/.bashrc')
If I try something like:
os.system('/bin/bash -s "source $HOME/.bashrc"')
...this freezes the script (as if it is waiting for something).
Any suggestions?
What you want is not possible. A program (your script) cannot modify the environment of the caller (the shell you run it from).
Another approach that gets you something close is to write it as a bash function, which runs in the same process and can modify the caller's environment. Note that sourcing at runtime can have negative side effects, depending on what the user has in their bashrc.
what you are trying to do is impossible. or better: how you are trying to do it is impossible.
your bash command is wrong. bash -s command does not execute command. it just stores the string "command" in the positional parameter $1 and then starts reading commands from stdin, which is why the python script seems to freeze (it is waiting for input). what you meant to do is bash -c command.
why do you source .bashrc? would it not be enough to just source .bash_aliases?
even if you got your bash command right, the changes will only take effect in the bash session started from python. once that bash session is closed, and your python script is done, you are back at your original bash session. all changes made in the bash session started from python are lost.
every time you want to change something in the current bash session, you have to do it from inside the current bash session. most of the commands you run from bash (system commands, python scripts, even bash scripts) will spawn another process, and everything you do in that other process will not affect your first bash session.
source is a bash builtin which allows you to execute commands inside the currently running bash session, instead of spawning another process and running the commands there. defining a bash function is another way to execute commands inside the currently running bash session.
see this answer for more information about sourcing and executing.
what you can do to achieve what you want
modify your python script to just do the changes necessary to .bash_aliases.
prepare a bash script to run your python script and then source .bash_aliases.
#i am a bash script, but you have to source me, do not execute me.
modify_bash_aliases.py "$@"
source ~/.bash_aliases
add an alias to your .bashrc to source that script
alias add_alias='source modify_bash_aliases.sh'
now when you type add_alias some_alias in your bash prompt it will be replaced with source modify_bash_aliases.sh and then executed. since source is a bash builtin, the commands inside the script will be executed inside the currently running bash session. the python script will still run in another process, but the subsequent source command will run inside your currently running bash session.
another way
modify your python script to just do the changes necessary to .bash_aliases.
prepare a bash function to run your python script and then source .bash_aliases.
add_alias() {
    modify_bash_aliases.py "$@"
    source ~/.bash_aliases
}
now you can call the function like this: add_alias some_alias
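for completeness, a minimal sketch of what modify_bash_aliases.py could look like (the two-argument name/command interface is an assumption, not something given in the question):
#!/usr/bin/env python3
# sketch: append one alias definition to ~/.bash_aliases
import os
import sys

def main():
    # assumed usage: modify_bash_aliases.py NAME 'COMMAND'
    name, command = sys.argv[1], sys.argv[2]
    path = os.path.expanduser('~/.bash_aliases')
    with open(path, 'a') as f:
        f.write("alias {0}='{1}'\n".format(name, command))

if __name__ == '__main__':
    main()
after it runs, the wrapper's source ~/.bash_aliases makes the new alias available in the current session right away.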
I had an interesting issue where I needed to source an RC file to get the correct output in my python script.
I eventually used this inside my function to bring over the same variables from the bash file I needed to source. Be sure to have os imported.
with open('overcloudrc') as data:
    lines = data.readlines()
    for line in lines:
        # each line is expected to look like "export NAME=value"
        var = line.split(' ')[1].split('=')[0].strip()
        val = line.split(' ')[1].split('=')[1].strip()
        os.environ[var] = val
Working solution from "Can I use an alias to execute a program from a python script":
import subprocess
sp = subprocess.Popen(["/bin/bash", "-i", "-c", "nuke -x scriptpath"])
sp.communicate()
From a Mercurial hook, I am trying to call a shell script that sets a bunch of environment variables on our server. The shell script gets called fine when a new changegroup comes in, but the environment variables don't carry over past the call to the shell script.
My hgrc file on the repository looks like this:
[hooks]
changegroup = shell_script
changegroup.env = env
I can see the output of the shell script, and then the output of the env command, but the env command doesn't include the new environment variables set by the shell script.
I have verified that the shell script works fine when run by itself but when run in the context of the mercurial hook it does not properly set the environment.
Shell scripts can't modify their caller's environment.
http://tldp.org/LDP/abs/html/gotchas.html
"A script may not export variables back to its parent process, the shell, or to the environment. Just as we learned in biology, a child process can inherit from a parent, but not vice versa."
$ cat > eg.sh
export FOO="bar";
^D
$ bash eg.sh
$ echo $FOO;
$
Also, the problem is compounded, because there are multiple bash processes involved:
bash 1 -> hg -> bash 2 (shell script)
             -> bash 3 (env call)
It would be like thinking I could set a variable in one PHP script and then magically read it in another simply by running one after the other.
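If the real goal is just for the env hook to see those variables, one workaround (a sketch, assuming shell_script is compatible with the POSIX shell that runs the hook) is to source the script and run env inside the same shell process by combining them into a single hook:
[hooks]
changegroup = . shell_script && env
Here the dot command runs shell_script inside the hook's own shell, so the variables it sets are still present when env runs.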