Python: Platform independent way to modify PATH environment variable

Is there a way to modify the PATH environment variable in a platform-independent way using Python?
Something similar to os.path.join()?

You should be able to modify os.environ.
Since os.pathsep is the character used to separate different paths, you should use it when appending each new path:
os.environ["PATH"] += os.pathsep + path
or, if there are several paths to add in a list:
os.environ["PATH"] += os.pathsep + os.pathsep.join(pathlist)
As you mentioned, os.path.join can also be used to build each individual path you need to append, in case you have to construct it from separate parts.
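For example, a minimal sketch combining the two (the directory names below are placeholders, not from the question):
import os
# hypothetical directories, built portably from parts with os.path.join
extra_dirs = [os.path.join(os.getcwd(), "tools"), os.path.join(os.getcwd(), "scripts")]
os.environ["PATH"] += os.pathsep + os.pathsep.join(extra_dirs)
print(os.environ["PATH"])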

Please note that os.environ is not actually a dictionary. It's a special dictionary-like object which actually sets environment variables in the current process using putenv.
>>> import os
>>> os.environ.__class__
<class os._Environ at 0x100472050>
>>> os.environ["HELLO"] = "WORLD"
>>> os.getenv("HELLO")
'WORLD'
This means that PATH (and other environment variables) will be visible to C code run in the same process.
(Since comments can't contain formatting, I have to put this in an answer, but I feel like it's an important point to make. This is really a comment on the comment about there being no equivalent to 'export'.)

The caveat to be aware of when modifying environment variables in Python is that there is no equivalent of the "export" shell command: the changes affect the current Python process and any child processes it spawns, but there is no way of injecting them back into the parent process (for example, the shell that launched the script).

You can force the current value to be re-exported (i.e. putenv to be called for it) like this:
os.environ["PATH"] = os.environ["PATH"]



Setting environment variables when calling Linux commands from Python [duplicate]

I believe that running an external command with a slightly modified environment is a very common case. That's how I tend to do it:
import subprocess, os
my_env = os.environ
my_env["PATH"] = "/usr/sbin:/sbin:" + my_env["PATH"]
subprocess.Popen(my_command, env=my_env)
I've got a gut feeling that there's a better way; does it look alright?
I think os.environ.copy() is better if you don't intend to modify the os.environ for the current process:
import subprocess, os
my_env = os.environ.copy()
my_env["PATH"] = "/usr/sbin:/sbin:" + my_env["PATH"]
subprocess.Popen(my_command, env=my_env)
That depends on what the issue is. If it's to clone and modify the environment one solution could be:
subprocess.Popen(my_command, env=dict(os.environ, PATH="path"))
But that somewhat depends on the replaced variable names being valid Python identifiers, which they most often are (how often do you run into environment variable names that are not alphanumeric plus underscore, or that start with a number?).
Otherwise you could write something like:
subprocess.Popen(my_command, env=dict(os.environ,
                                      **{"Not valid python name": "value"}))
In the very odd case that the keys of the environment are bytes (how often do you use control codes or non-ASCII characters in environment variable names?), you can't even use that construct on Python 3.
As you can see, the techniques used here (especially the first) rely on the environment keys normally being valid Python identifiers that are known in advance (at coding time); otherwise the second approach runs into the issues above. In cases where that isn't true, you should probably look for another approach.
With Python 3.5 you could do it this way:
import os
import subprocess
my_env = {**os.environ, 'PATH': '/usr/sbin:/sbin:' + os.environ['PATH']}
subprocess.Popen(my_command, env=my_env)
Here we end up with a copy of os.environ in which the PATH value has been overridden.
It was made possible by PEP 448 (Additional Unpacking Generalizations).
Another example. If you have a default environment (i.e. os.environ), and a dict you want to override defaults with, you can express it like this:
my_env = {**os.environ, **dict_with_env_variables}
You might use my_env.get("PATH", '') instead of my_env["PATH"] in case PATH is somehow not defined in the original environment, but other than that it looks fine.
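A short usage sketch of that override pattern (the override values and the 'date' command are just examples, not from the answer above):
import os
import subprocess

overrides = {"LANG": "C", "TZ": "UTC"}   # values that should win over the inherited defaults
my_env = {**os.environ, **overrides}
subprocess.run(["date"], env=my_env)     # assumes a 'date' command is on PATH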
To temporarily set an environment variable without having to copy the os.environ object etc., I do this:
process = subprocess.Popen(['env', 'RSYNC_PASSWORD=foobar', 'rsync',
                            'rsync://username@foobar.com::'], stdout=subprocess.PIPE)
The env parameter accepts a dictionary. You can simply take os.environ, add your desired variable to it (or to a copy of the dict if you must), and use that as the env parameter to Popen.
I know this has been answered for some time, but there are some points worth knowing about using PYTHONPATH instead of PATH in your environment. I have outlined an explanation of running Python scripts with cron jobs that deals with the modified environment in a different way (found here). I thought it would be useful for those who, like me, needed just a bit more than this answer provided.
In certain circumstances you may want to pass down only the environment variables your subprocess needs, but I think you've got the right idea in general (that's how I do it too).
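A rough sketch of that whitelisting idea (the variable names in the needed tuple are just examples; pick the ones your command actually requires):
import os
import subprocess

needed = ("PATH", "HOME", "LANG")   # only what the child actually requires
minimal_env = {k: os.environ[k] for k in needed if k in os.environ}
subprocess.Popen(my_command, env=minimal_env)   # my_command as in the snippets above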
This could be a solution (here env is a dict of values you want to prepend to the corresponding existing variables):
env = {"PATH": "/usr/sbin:/sbin"}   # example: values to prepend, keyed by variable name
new_env = {k: (':'.join([env[k], v]) if k in env else v) for k, v in os.environ.items()}

Using %systemdrive% in python

I would like to open and work with a file in Python; however, I would like to use the Windows %systemdrive% variable when referring to the file instead of the full path.
This piece of code works:
if not os.path.isfile('C:\\Work\\test\sample.txt'):
This does not:
if not os.path.isfile('%systemdrive%\\Work\\test\\sample.txt'):
Any idea? Thank you in advance!
There is a dedicated function for solving this problem named os.path.expandvars.
Return the argument with environment variables expanded. Substrings of
the form $name or ${name} are replaced by the value of environment
variable name. Malformed variable names and references to non-existing
variables are left unchanged.
On Windows, %name% expansions are supported in addition to $name and
${name}.
if not os.path.isfile(os.path.expandvars('%systemdrive%\\Work\\test\\sample.txt')):
    pass  # do something
You can use os.environ
import os
import os.path
fn = r'{0}\Work\test\sample.txt'.format(os.environ['systemdrive'])
if not os.path.isfile(fn):
    ...

Reading and writing environment variables in Python? [duplicate]

My Python script calls many Python functions and shell scripts. I want to set an environment variable in Python (the main calling script) and have all the daughter processes, including the shell scripts, see the environment variable that was set.
I need to set some environment variables like this:
DEBUSSY 1
FSDB 1
1 is a number, not a string. Additionally, how can I read the value stored in an environment variable? (Like DEBUSSY/FSDB in another python child script.)
Try using the os module.
import os
os.environ['DEBUSSY'] = '1'
os.environ['FSDB'] = '1'
# Open child processes via os.system(), popen() or fork() and execv()
someVariable = int(os.environ['DEBUSSY'])
See the Python docs on os.environ. Also, for spawning child processes, see Python's subprocess docs.
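As a small sketch of the child-process part (assuming a POSIX shell is available; the echo command is just for illustration):
import os
import subprocess

os.environ["DEBUSSY"] = "1"
# Any child process inherits the value, including shell scripts:
subprocess.run(["sh", "-c", "echo DEBUSSY is $DEBUSSY"])   # prints: DEBUSSY is 1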
First things first :) reading books is an excellent approach to problem solving; it's the difference between band-aid fixes and long-term investments in solving problems. Never miss an opportunity to learn. :D
You might choose to interpret the 1 as a number, but environment variables don't care. They just pass around strings:
The argument envp is an array of character pointers to null-terminated strings. These strings shall constitute the environment for the new process image. The envp array is terminated by a null pointer.
(From environ(3posix).)
You access environment variables in python using the os.environ dictionary-like object:
>>> import os
>>> os.environ["HOME"]
'/home/sarnold'
>>> os.environ["PATH"]
'/home/sarnold/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games'
>>> os.environ["PATH"] = os.environ["PATH"] + ":/silly/"
>>> os.environ["PATH"]
'/home/sarnold/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/silly/'
If you want to pass global variables into new scripts, you can create a python file that is only meant for holding global variables (e.g. globals.py). When you import this file at the top of the child script, it should have access to all of those variables.
If you are writing to these variables, then that is a different story. That involves concurrency and locking the variables, which I'm not going to get into unless you want.
Use os.environ["DEBUSSY"] for both reading and writing (http://docs.python.org/library/os.html#os.environ).
As for reading, you have to parse the number from the string yourself of course.
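For example (falling back to "0" when the variable is unset is just one possible choice):
import os
value = int(os.environ.get("DEBUSSY", "0"))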

Change current process environment's LD_LIBRARY_PATH

Is it possible to change the environment variables of the current process?
More specifically, in a Python script I want to change LD_LIBRARY_PATH so that, on import of a module 'x' which depends on some xyz.so, xyz.so is taken from the path I give in LD_LIBRARY_PATH.
Is there any other way to dynamically change the path from which the library is loaded?
Edit: I think I need to mention that I have already tried things like
os.environ["LD_LIBRARY_PATH"] = mypath
os.putenv('LD_LIBRARY_PATH', mypath)
but these modify the environment for spawned sub-processes, not the current process, and module loading doesn't consider the new LD_LIBRARY_PATH.
Edit 2: So the question is, can we change the environment or something else so that the library loader sees it and loads from there?
The reason
os.environ["LD_LIBRARY_PATH"] = ...
doesn't work is simple: this environment variable controls behavior of the dynamic loader (ld-linux.so.2 on Linux, ld.so.1 on Solaris), but the loader only looks at LD_LIBRARY_PATH once at process startup. Changing the value of LD_LIBRARY_PATH in the current process after that point has no effect (just as the answer to this question says).
You do have some options:
A. If you know that you are going to need xyz.so from /some/path, and control the execution of python script from the start, then simply set LD_LIBRARY_PATH to your liking (after checking that it is not already so set), and re-execute yourself. This is what Java does.
B. You can import /some/path/xyz.so via its absolute path before importing x.so (one way to do this pre-loading is sketched after option C). When you then import x.so, the loader will discover that it has already loaded xyz.so, and will use the already loaded library instead of searching for it again.
C. If you build x.so yourself, you can add -Wl,-rpath=/some/path to its link line, and then importing x.so will cause the loader to look for dependent modules in /some/path.
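One way to do the pre-loading in option B is via ctypes (a sketch; /some/path/xyz.so and the module name x are the placeholders from this answer, and RTLD_GLOBAL may or may not be needed depending on how x was linked):
import ctypes

# Load the dependency by absolute path so the dynamic loader already has it cached.
ctypes.CDLL("/some/path/xyz.so", mode=ctypes.RTLD_GLOBAL)
import x   # x's dependency on xyz.so is now resolved from the pre-loaded copy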
Based on the answer from Employed Russian, this is what works for me:
import os
import sys

oracle_libs = os.environ['ORACLE_HOME'] + "/lib/"
rerun = True
if 'LD_LIBRARY_PATH' not in os.environ:
    os.environ['LD_LIBRARY_PATH'] = ":" + oracle_libs
elif oracle_libs not in os.environ.get('LD_LIBRARY_PATH'):
    os.environ['LD_LIBRARY_PATH'] += ":" + oracle_libs
else:
    rerun = False
if rerun:
    os.execve(os.path.realpath(__file__), sys.argv, os.environ)
In my experience, trying to change the way the loader works for a running Python process is very tricky; it is probably OS/version dependent and may not work. One work-around that might help in some circumstances is to launch a sub-process that changes the environment using a shell script and then launch a new Python from that shell.
The code below sets LD_LIBRARY_PATH (or any other environment variable path required by the modules you import) and then re-executes the interpreter so the change takes effect:
import os
import sys

if os.getenv('LD_LIBRARY_PATH') is None:
    os.environ['LD_LIBRARY_PATH'] = '<PATH>'
    try:
        sys.stdout.flush()
        os.execl(sys.executable, sys.executable, *sys.argv)
    except OSError as e:
        print(e)
elif '<PATH>' not in os.getenv('LD_LIBRARY_PATH'):
    os.environ['LD_LIBRARY_PATH'] = ':'.join([os.getenv('LD_LIBRARY_PATH'), '<PATH>'])
    try:
        sys.stdout.flush()
        os.execl(sys.executable, sys.executable, *sys.argv)
    except OSError as e:
        print(e)
# import X
The function os.execl will replace the current process. In UNIX a new executable will be loaded into the current process.
By having this code before the import of the 'X' module, the loader will now look for the libraries in the newly set path.
More on execl
Well, the environment variables are stored in the dictionary-like object os.environ, so if you want to change one, you can do (note that this replaces the whole PATH rather than appending to it):
os.environ["PATH"] = "/usr/bin"
