Environment variable gets removed immediately after it is set - python

I have this simple script in Python that sets two environment variables:
import os
os.environ['API_USER'] = 'SuperUser'
os.environ['API_PASSWORD'] = 'SuperPass'
USER = os.environ.get('API_USER')
PASSWORD = os.environ.get('API_PASSWORD')
print(USER)
print(PASSWORD)
When I run it I get this:
SuperUser
SuperPass
So far so good, but when I comment out the two lines where I set the user and password and run it again, I get this:
None
None
Why are those two environment variables removed and not saved? I thought that once I set an environment variable, I could use it again later without setting it again (until I delete it), right?
What is wrong?
Thank you!

You are only setting the environment variables for that specific running Python process. Once the process exits, the variables cease to exist. You cannot set a persistent environment variable from Python, as described here:
Why can't environmental variables set in python persist?
It looks like you'll need to print the variable you want to persist and have the shell execute it, as described in the second answer to the question above, like so:
import pipes
import random
r = random.randint(1,100)
print("export BLAHBLAH=%s" % (pipes.quote(str(r))))
Edit:
As per @SuperStormer's comment below (they are correct), this solution is Unix/Mac specific. If you need to accomplish the same thing on Windows, you would use setx like so (this sets the variable for the current user only; add a /m switch at the end to set it for the local machine):
import pipes
import random
r = random.randint(1,100)
print("setx BLAHBLAH %s" % (pipes.quote(str(r))))

Related

Check if current Python script is running in a Jenkins environment

I'm writing a Python 3 script meant to be run from Jenkins; however, I'd like it to print several debug messages only when it runs locally on a developer's PC.
I know a possible solution would be creating an environment variable in the developer's IDE to be passed to the interpreter and then checking for it on start-up:
import os

debug_mode = False
if 'DEBUGMODE' in os.environ:
    debug_mode = bool(os.environ.get('DEBUGMODE'))

print('Script is starting up')
(...) # Do stuff
if debug_mode:
    print('So many things to do...')
    (...) # Do other stuff
Actually, I don't want to force the developer to define DEBUGMODE in their environment, so I'm wondering if there's any other way for my script to automatically know that it's running in a Jenkins job and not in a debugger.
Thanks in advance!
Max
When a Jenkins job executes, it always sets some default environment variables.
In your Python code you can simply check whether one (or more) of these variables exists.
You can go for the JENKINS_URL environment variable, as it is quite unique and probably won't be used for any purpose other than the one you are after.
So your code can look like:
import os

debug_mode = 'JENKINS_URL' not in os.environ

print('Script is starting up')
(...) # Do stuff
if debug_mode:
    print('So many things to do...')
    (...) # Do other stuff
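Other variables that Jenkins sets for every build include BUILD_NUMBER, JOB_NAME and WORKSPACE, so if you are worried about false positives you could require two or more of them to be present.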

Set the variable in command output

I would like to know how I can use my variables to fill in the prompts of another command. For example, if I try to generate some keys with "openssl", I'll get questions about the country, state, organization, etc.
I would like to use variables in my script to fill in this information. I'll have a variable "Country", a variable "State", etc., and they should be passed to those questions when the openssl command is executed.
I'm trying this in bash, but I would also like to know how the same thing would be done in Python.
Kind regards
You have multiple ways to do this.
1. If your script is launched before the Python script and the result is set in an environment variable, you can read the environment variable from your Python script as follows:
import os
os.environ.get('MYVARIABLE', 'Default val')
2. Otherwise, you can launch the other application from your Python script and read the result using os.popen():
import os
tmp = os.popen("ls").read()
or better (if you have a Python newer than 2.6):
import subprocess
proc = subprocess.Popen('ls', stdout=subprocess.PIPE)
tmp = proc.stdout.read()
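If the goal is specifically to answer the openssl prompts from variables in your script, one option is to pipe the answers into the command's standard input. A rough sketch (the key and CSR file names are placeholders, and the exact number and order of prompts depends on your openssl configuration):

import subprocess

country = "US"
state = "Texas"
city = "Austin"
organization = "Example Inc"
common_name = "example.com"

# one answer per prompt, in the order openssl asks for them;
# empty strings accept the default / leave the field blank
answers = "\n".join([country, state, city, organization, "",
                     common_name, "", "", ""]) + "\n"
result = subprocess.run(
    ["openssl", "req", "-new", "-key", "server.key", "-out", "server.csr"],
    input=answers, capture_output=True, text=True)

For openssl specifically it is often simpler to avoid the prompts entirely by passing the subject on the command line, e.g. -subj "/C=US/ST=Texas/O=Example Inc/CN=example.com".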

Change environment variables persistently with Python

Is it possible using Python 3.5 to create and update environment variables in Windows and Linux so that they get persisted?
At the moment I use this:
import os
os.environ["MY_VARIABLE"] = "TRUE"
However it seems as if this does not "store" the environment variable persistently.
I'm speaking for Linux here, not sure about Windows.
Environment variables don't work that way. They are a part of the process (which is what you modify by changing os.environ), and they will propagate to child processes of your process (and their children obviously). They are in-memory only, and there is no way to "set and persist" them directly.
There are however several configuration files which allow you to set the environment on a more granular basis. These are read by various processes, and can be system-wide, specific to a user, specific to a shell, to a particular type of process etc.
Some of them are:
/etc/environment for system-wide variables
/etc/profile for shells (and their children)
Several other shell-specific files in /etc
Various dot-files in a user's home directory such as .profile, .bashrc, .bash_profile, .tcshrc and so on. Read your shell's documentation.
I believe that there are also various ways to configure environment variables launched from GUIs (e.g. from the gnome panel or something like that).
Most of the time you'll want to set environment variables for the current user only. If you only care about shells, append them to ~/.profile in this format (the export is what makes child processes inherit the variable):
export NAME="VALUE"
The standard way to 'persist' an environment variable is with a configuration file. Write your application to open the configuration file and set every NAME=VALUE pair that it finds. Optionally this step could be done in a wrapper startup script.
If you wish to 'save' the state of a variable, you need to open the configuration file and modify its contents. Then when it's read in again, your application will set the environment accordingly.
You could of course store the configuration in some other way. For example in a configuration_settings class that you pickle/shelve. Then on program startup you read in the pickled class and set the environment. The important thing to understand is that when a process exits its environment is not saved. Environments are inherited from the parent process as an intentional byproduct of forking.
Config file could look like:
NAME=VALUE
NAME2=VALUE2
...
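As a rough illustration of that idea (a sketch only; the file name env.conf and the exact NAME=VALUE syntax are assumptions), the application could apply such a file to its own environment at startup:

import os

def load_env_file(path):
    # apply NAME=VALUE lines from the configuration file to the current process
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            name, _, value = line.partition("=")
            os.environ[name.strip()] = value.strip()

load_env_file("env.conf")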
Or your config class could look like:
import os

class Configuration:
    def __init__(self):
        self.env_vars = {}

    def set(self, name, val):
        self.env_vars[name] = val
        os.environ[name] = val

    def unset(self, name):
        del self.env_vars[name]
        del os.environ[name]

    def init(self):
        # re-apply the saved variables to the current process
        for name, val in self.env_vars.items():
            os.environ[name] = val
Somewhere else in our application
import shelve

filename = "config.db"
d = shelve.open(filename)

# get our configuration out of shelve (or start fresh on the first run)
config = d.get('configuration', Configuration())

# initialize environment
config.init()

# setting an environment variable
config.set("MY_VARIABLE", "TRUE")

# unsetting
config.unset("MY_VARIABLE")

# save our configuration
d['configuration'] = config
d.close()
The code is not tested, but I think you get the gist.

Is there any way to modify the pydevd_file_utils.PATHS_FROM_ECLIPSE_TO_PYTHON value without having to modify that file?

I am using the pydev plugin to debug a remote application.
This (remote) application has a structure of files that differs from the structure where my Eclipse is running. This leads to problems when I set breakpoints from the Eclipse IDE, because the pydev debugger server cannot match the absolute path of the file with the file on the remote application, and hence the breakpoint isn't hit.
I don't want to hard-code the pydevd_file_utils.PATHS_FROM_ECLIPSE_TO_PYTHON constant to enable file path translations.
Do you know some way to modify this value without changing the file?
Thanks!
There are 2 ways of setting the path translation:
Use the PATHS_FROM_ECLIPSE_TO_PYTHON environment variable, which maps the paths from the client to the server side.
The value is a json string with a list(list(str, str)) such that:
PATHS_FROM_ECLIPSE_TO_PYTHON=[["c:/local/path", "/path/in/server"]]
Note that you may set the environment variable in any place you'd like (such as the Environment tab in the Python interpreter preferences page, in the OS itself, in the launch config, etc).
Use the pydevd API to set the path translation at runtime from the Python process:
from pydevd_file_utils import setup_client_server_paths

MY_PATHS_FROM_ECLIPSE_TO_PYTHON = [
    ('/home/user/local-project', '/remote/path/to/project'),
]
setup_client_server_paths(MY_PATHS_FROM_ECLIPSE_TO_PYTHON)

# At this point we could connect to the remote debugger client with:
import pydevd
pydevd.settrace("10.0.0.12")
See: https://www.pydev.org/manual_adv_remote_debugger.html for more info on the Remote Debugging.
Note: the mapping set in Window > Preferences select PyDev > Debug > Source Locator doesn't really map to that environment variable nor the actual debugger mapping (that's a separate translation that only translates paths which are found on Eclipse locally and it's not really passed on to the debugger to hit breakpoints remotely).
You can do that by setting a new environment variable like this:
PATHS_FROM_ECLIPSE_TO_PYTHON='[["client_src_fullpath", "remote_src_fullpath"]]'
In Linux, simply run that before starting the program from the command line, or set it as a global variable.
In Windows you will need to set it as a global system variable.
Variable name: PATHS_FROM_ECLIPSE_TO_PYTHON
Variable value: [["client_src_path", "remote_src_path"]]
As an alternative, you can also do this in code, BUT you need to do it BEFORE you import pydevd:
import os
os.environ.setdefault("PATHS_FROM_ECLIPSE_TO_PYTHON", '[["client_src_path", "remote_src_path"]]')

import pydevd
pydevd.settrace("10.0.2.2", port=5678, stdoutToServer=True, stderrToServer=True, suspend=False, patch_multiprocessing=True)
(I'm aware this is a very old question, but none of the answers were updated to the current code)
Unfortunately there is no good way to do that.
As a workaround, I explicitly replaced the function NormFileToServer by adding the following code at the beginning of my source file.
def SrcPathMapping(file):
    eclipse_src_path = 'C:\\tmp\\workspace\\test\\Scripts\\'
    server_src_path = '/home/tester/test/Scripts/'
    return file.replace(eclipse_src_path, server_src_path)

import pysrc.pydevd as pydevd
pydevd.NormFileToServer = SrcPathMapping
This simplistic mapping is sufficient when all source files are located in one directory. For a proper implementation of the mapping function, check NormFileToServer in pydevd_file_utils.

Change current process environment's LD_LIBRARY_PATH

Is it possible to change environment variables of current process?
More specifically, in a Python script I want to change LD_LIBRARY_PATH so that when a module 'x' that depends on some xyz.so is imported, xyz.so is taken from the path I set in LD_LIBRARY_PATH.
Is there any other way to dynamically change the path from which the library is loaded?
Edit: I think I need to mention that I have already tried things like
os.environ["LD_LIBRARY_PATH"] = mypath
os.putenv('LD_LIBRARY_PATH', mypath)
but these modify the environment for spawned sub-processes, not the current process, and module loading doesn't consider the new LD_LIBRARY_PATH.
Edit 2: so the question is, can we change the environment (or something else) so that the library loader sees it and loads from there?
The reason
os.environ["LD_LIBRARY_PATH"] = ...
doesn't work is simple: this environment variable controls behavior of the dynamic loader (ld-linux.so.2 on Linux, ld.so.1 on Solaris), but the loader only looks at LD_LIBRARY_PATH once at process startup. Changing the value of LD_LIBRARY_PATH in the current process after that point has no effect (just as the answer to this question says).
You do have some options:
A. If you know that you are going to need xyz.so from /some/path, and control the execution of python script from the start, then simply set LD_LIBRARY_PATH to your liking (after checking that it is not already so set), and re-execute yourself. This is what Java does.
B. You can import /some/path/xyz.so via its absolute path before importing x.so. When you then import x.so, the loader will discover that it has already loaded xyz.so, and will use the already loaded module instead of searching for it again (see the sketch after this list).
C. If you build x.so yourself, you can add -Wl,-rpath=/some/path to its link line, and then importing x.so will cause the loader to look for dependent modules in /some/path.
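A minimal sketch of option B (the path /some/path/xyz.so is an assumption; since xyz.so is a plain shared library rather than a Python extension, this uses ctypes to load it by absolute path before importing x):

import ctypes

# load the dependency by absolute path and export its symbols globally,
# so the dynamic loader reuses this copy when module 'x' is imported
ctypes.CDLL("/some/path/xyz.so", mode=ctypes.RTLD_GLOBAL)

import x  # x now resolves against the already-loaded xyz.so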
Based on the answer from Employed Russian, this is what works for me
import os
import sys

oracle_libs = os.environ['ORACLE_HOME'] + "/lib/"

rerun = True
if 'LD_LIBRARY_PATH' not in os.environ:
    os.environ['LD_LIBRARY_PATH'] = ":" + oracle_libs
elif oracle_libs not in os.environ.get('LD_LIBRARY_PATH'):
    os.environ['LD_LIBRARY_PATH'] += ":" + oracle_libs
else:
    rerun = False

if rerun:
    # re-execute the script so the loader picks up the new LD_LIBRARY_PATH
    os.execve(os.path.realpath(__file__), sys.argv, os.environ)
In my experience, trying to change the way the loader works for a running Python process is very tricky; it is probably OS/version dependent and may not work. One work-around that might help in some circumstances is to launch a sub-process that changes the environment variable using a shell script and then launches a new Python using the shell.
The code below sets LD_LIBRARY_PATH (or any other environment variable path required by the imported modules).
import os
import sys

if os.getenv('LD_LIBRARY_PATH') is None:
    os.environ['LD_LIBRARY_PATH'] = '<PATH>'
    try:
        sys.stdout.flush()
        # restart the interpreter so the loader sees the new value
        os.execl(sys.executable, sys.executable, *sys.argv)
    except OSError as e:
        print(e)
elif '<PATH>' not in os.getenv('LD_LIBRARY_PATH'):
    os.environ['LD_LIBRARY_PATH'] = ':'.join([os.getenv('LD_LIBRARY_PATH'), '<PATH>'])
    try:
        sys.stdout.flush()
        os.execl(sys.executable, sys.executable, *sys.argv)
    except OSError as e:
        print(e)

# import X
The function os.execl replaces the current process: on UNIX, a new executable is loaded into the current process. By placing this code before the import of the 'X' module, the loader will look for the shared libraries in the newly set path.
More on execl
Well, the environment variables are stored in the dictionary os.environ, so if you want to change one, you can do:
os.environ["PATH"] = "/usr/bin"
