Inside my SCons script I execute another Python script:
fs = env.Command('fs', None, 'python updatefs.py')
AlwaysBuild(fs)
Depends(fs, main)
In the Python script I am trying to access an environment variable:
import os
mode = os.environ['PROC_MODE']
The variable was previously set up in the shell:
export PROC_MODE='some_mode'
Python complains:
KeyError: 'PROC_MODE'
What is the proper way to propagate the environment to an external script?
This is covered lightly in the SCons FAQ.
Basically SCons constructs a clean reproducible set of environment variables so that differences in any user's environment won't break a build.
So if you want to propagate a particular variable from your shell you can explicitly do it as such:
env['ENV']['MY_VARIABLE'] = os.environ['MY_VARIABLE']
If you wanted to propagate all environment variables, you'd do this:
env['ENV'] = os.environ
Where env is your Environment()
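Putting the pieces together, here is a minimal SConstruct sketch (PROC_MODE, updatefs.py, and the fs target are the names from the question):

import os

env = Environment()
# forward just the one variable the external script needs
env['ENV']['PROC_MODE'] = os.environ['PROC_MODE']

fs = env.Command('fs', None, 'python updatefs.py')
AlwaysBuild(fs)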
I have a Python module that should run Python scripts (let's call it the launcher).
I have a list of scripts. Each of them has its own virtual environment.
Launcher's input:
name of the script to launch
path to script
arguments to pass to script
I need to come up with a solution so that the launcher is able to run the scripts without creating new processes.
I tried to use the __import__() function, but the main problem is that I don't know how to use each script's own virtual environment.
Based on Lie Ryan's answer to activate-a-virtualenv-with-a-python-script:
You could try to alter the current interpreter and import the scripts:
# Running this file with execfile() alters the current interpreter's
# environment so you can import libraries from the virtualenv
activate_this_file = "/path/to/virtualenv/bin/activate_this.py"
execfile(activate_this_file, dict(__file__=activate_this_file))
# Python 3 has no execfile(); use exec() instead:
# exec(open(activate_this_file).read(), dict(__file__=activate_this_file))
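Building on that, a hypothetical launcher could look like the sketch below (the function name, paths, and argument handling are all made up for illustration):

import importlib
import sys

def launch(name, path, venv, args):
    # point this interpreter at the script's own virtualenv
    activate = venv + "/bin/activate_this.py"
    exec(open(activate).read(), dict(__file__=activate))
    sys.path.insert(0, path)   # make the script importable
    sys.argv = [name] + args   # the arguments the script expects
    return importlib.import_module(name)

mod = launch("script1", "/path/to/scripts", "/path/to/venv1", ["-a", "3.14"])

Note that activate_this.py only prepends the venv's site-packages to sys.path, so after activating several venvs in one interpreter, imports may still resolve against the wrong one; that limitation is what motivates the separate-process pipeline below.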
If every script is going to need a different venv, then your best choice is to create a bash file with the pipeline and connect the scripts through output files. That's what I would do.
You can use pickle to transfer numpy arrays, dictionaries, or other Python objects, but be sure that the pickle protocol is the same on both sides.
For example:
#!/usr/bin/env bash
# General configuration
var1=1.0
var2="text"

# For each script: enter its directory, activate its venv, run it, deactivate
cd mypath1/
conda activate venv1
python script1.py -a 3.141592 -b "$var1"  # outputs some file
conda deactivate

cd mypath2/
conda activate venv2
python script2.py -a "$var2" -b "text"  # takes the previous output
conda deactivate
...
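For the pickle hand-off between stages, a minimal sketch (the file name data.pkl and the pinned protocol are assumptions):

# in script1.py: write the result for the next stage
import pickle
import numpy as np

result = {"weights": np.ones(3), "label": "text"}
with open("data.pkl", "wb") as f:
    pickle.dump(result, f, protocol=4)  # pin the protocol so both venvs agree

# in script2.py: read the previous stage's output
import pickle

with open("data.pkl", "rb") as f:
    result = pickle.load(f)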
I've created a Windows system variable using the code below:
import os
os.environ['BJT'] = 'HELLO'
But I can't see it in Advanced settings \ System variables. Also, I can't see it when I try to print it:
import os
print(os.environ['BJT'])
I thought that when I create a system variable using os.environ, it is created exactly like when I do it in the system settings. Is it possible to create a system variable from Python code and access it even after I restart the computer?
You need to call the system (with admin privileges) to create a system variable; you can use subprocess:
import subprocess
subprocess.run(['setx', 'BJT', 'HELLO', '/M'])
Prior to Python 3.5 you will need to use subprocess.call instead.
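Keep in mind that setx only writes the registry for future processes. A sketch that also updates the current process (the /M machine-wide flag requires an elevated prompt):

import os
import subprocess

# persist the variable for future processes (/M = machine-wide)
subprocess.run(['setx', 'BJT', 'HELLO', '/M'], check=True)
# setx does not touch the current process, so mirror the change here too
os.environ['BJT'] = 'HELLO'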
There is a misunderstanding about what the environment is. It is just a mapping of (string) variables that a process passes to its children. Specifically, a process can change its own environment (which will be used by its future children), but this will not change its parent's environment, nor even the environment of its already existing children, if any.
In addition, Windows provides system and user environment variables which are used as the initial environment of any process. These are not changed by os.environ nor by putenv, but only through the Windows API or the shell command setx.
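To illustrate (the variable name DEMO_VAR is made up): a change to os.environ is visible to future children, never to the parent:

import os
import subprocess
import sys

os.environ['DEMO_VAR'] = 'visible'  # changes this process only...
# ...and anything it spawns from now on:
subprocess.run([sys.executable, '-c',
                "import os; print(os.environ['DEMO_VAR'])"])
# the shell that launched this script still has no DEMO_VAR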
I added an environment variable manually with setx NEWVAR SOMETHING,
so that my tool can later use the NEWVAR variable in the script, but I am unable to access it. Please help. The code is below. For your information, I am able to access the predefined system variables.
import os
import sys

try:
    kiran = os.environ["NEWVAR"]
    print(kiran)
except KeyError:
    print("Please set the environment variable NEWVAR")
    sys.exit(1)
You need to make sure your environment variable persists across a restart of your shell; otherwise your new environment variable will not be accessible later on:
ekavala@elx75030xhv:/var/tmp$ export NEWVAR='alan'
ekavala@elx75030xhv:/var/tmp$ python test.py
alan
*closes shell and reopens*
ekavala@elx75030xhv:/var/tmp$ python test.py
Please set the environment variable NEWVAR
Update your $HOME/.bashrc or /etc/environment with the variable instead of just doing a setx or export
Note: If you update /etc/environment you will need to reboot your computer to have the environment variables set in your shell
If you want to set an environment variable from within a script:
import os
os.environ["JAVA_HOME"] = "somepath"
If you set it using the command prompt it will be available only for that shell. If you set it in Advanced system settings it will be available everywhere, but not in processes that were started before the change.
Instead of using setx to set those environment variables, try using export. The following example worked for me.
export MYCUSTOMVAR="testing123"
python testing.py
Note: If you are on Windows you can set the env variable with SET.
SET MYCUSTOMVAR=testing123
testing.py
import os

if __name__ == '__main__':
    print("MYCUSTOMVAR: {0}".format(os.environ['MYCUSTOMVAR']))
Output:
MYCUSTOMVAR: testing123
Trying to install Spark, I have some problems when I try to set the system environment variables. I modify the PATH using:
“Advanced system settings” → “Environment Variables”
But when I read these variables from Python, using the code:
import os
path = os.environ.get('PATH', None)
print(path)
the path that Python shows doesn't have the modifications that I made. Thanks.
Any program invoked from the command prompt is given the environment variables that existed at the time the command prompt was started.
Therefore, when you modify or add an environment variable you should restart the command prompt (cmd.exe) and then invoke python to see the changes.
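If you need the freshly edited value without restarting anything, one option is to read the system variables straight from the registry (a sketch; this is where Windows stores the machine-wide environment):

import winreg

# read the machine-wide PATH directly from the registry, bypassing
# the stale environment the current cmd.exe handed to this process
key = winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE,
                     r'SYSTEM\CurrentControlSet\Control\Session Manager\Environment')
path, _ = winreg.QueryValueEx(key, 'Path')
print(path)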
I have a bash backup script, run as root (cron), that delegates certain tasks to other bash scripts owned by different users. (Simplified example; the principle is that some things have to be done as root, while different tasks are delegated to users with the appropriate environment: oracle, amazon, ...)
mkdir -p /tmp/backup$NAME
su - oracle -c "~/.backups/export-test.sh"
tar cf /tmp/backup/$NOW.tar /tmp/backup$NAME
su - amazon -c "upload_to_amazon.sh /tmp/backup/$NOW.tar"
That script itself then does some tasks as user oracle:
mkdir -p $TMP_LOCATION
cd ~/.backups
exp $TMP_LOCATION/$NAME-$NOW
When I try to mimic this behaviour in Python, I come up with the following (started from cron as root):
name = "oracle"
# part run as root
os.makedirs(tmp_backup + name)
os.setegid(pwd.getpwnam(name)[3])
os.seteuid(pwd.getpwnam(name)[2])
# part run as oracle
os.makedirs(tmp_location)
os.chdir(os.path.expanduser("~{user}/.backups".format(user=name)))
subprocess.check_call(["exp",
"os.path.join(tmp_location, name+'-'+now)"
])
In bash, when using su -, a real new shell is invoked and all the environment variables of that user are set.
How can I improve this for my python script? Is there a standard recipe I can follow? I'm thinking of environment variables, umask, ...
The environment is Solaris, if that matters.
all environment variables of that user are set
Usually because a shell runs a .profile file when it starts up.
You have several choices.
Create a proper subprocess with subprocess.Popen to execute the shell and its .profile -- the same as su - (see the sketch after this list).
Carefully locate the environment variable settings and mimic them in Python. The issue is that a .profile can do all kinds of crazy things, making it a potential problem to determine the exact effects of the .profile.
Or you can extract the relevant environment variables to make them accessible to both the shell environment and your Python programs.
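A minimal sketch of the first choice (reusing the oracle script path from the question):

import subprocess

# "su - oracle" starts a login shell, so oracle's .profile runs and
# sets up the user's full environment before the script executes
subprocess.check_call(["su", "-", "oracle", "-c", "~/.backups/export-test.sh"])

The numbered steps below flesh out the third choice.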
First. Read the .profile for each user to be clear on which environment variables it sets (as distinct from aliases or other craziness that doesn't apply to your Python script). Some of these environment variables are relevant to the scripts you're running; some aren't.
Second. Split the "relevant" environment variables into a tidy env_backup.sh or env_uploads.sh script.
Once you have those environment variable scripts, update your .profile files to replace the environment variable settings with source env_backup.sh or source env_uploads.sh.
Third. Source the relevant env_backup.sh and env_uploads.sh scripts before running the Python program. Now your Python environment shares the variables with your shell environment, and you only maintain them in one place.
my_script.sh
source ~oracle/env_backup.sh
source ~amazon/env_uploads.sh
python my_script.py
That seems best to me. (Since that's how we do it.)
I can run the amazon part as root, without needing environment variables after all;
I used boto for that.
As for the Oracle environment variables, I used this piece of code:
if "ORACLE_HOME" not in os.environ or os.environ["ORACLE_HOME"] != ORACLE_HOME:
logger.debug("setting ORACLE_HOME='{oh}'".format(oh=ORACLE_HOME))
os.environ['ORACLE_HOME'] = ORACLE_HOME
if ORACLE_HOME + "/bin" not in os.environ["PATH"].split(":"):
logger.debug("setting PATH='{p}'".format(p=os.path.expandvars(ORACLE_PATH)))
os.environ['PATH'] = os.path.expandvars(ORACLE_PATH)
if "NLS_LANG" not in os.environ or os.environ["NLS_LANG"] != NLS_LANG:
logger.debug("setting NLS_LANG='{n}'".format(n=NLS_LANG))
os.environ['NLS_LANG'] = NLS_LANG