I know how subprocess works and use it a lot, but I've run into a strange issue. I need to execute an export of some environment variables. The reason is that a black-box program launches another program in what appears to be a subshell, so that program doesn't have access to the environment variables, although it does have access to all my files.
I can't hard-code the environment variables, so I want to source (or .) the file that contains the export commands. However, if I source that file in a subprocess, it makes no difference to the parent process. In that case I need some function besides subprocess that can execute shell commands without creating a subprocess, if such a thing exists. Another issue is that a subprocess doesn't have the proper permissions to read the file.
And copying the environment variables via os isn't really possible either.
Does anything besides subprocess exist? Or is there some other kind of workaround?
IMHO the simplest solution is to create a new shell script (let's call it run_black_box.sh) which sources the setup script (let's assume it is named setup.sh) to initialize the environment and then calls the black_box program.
Here is a possible content of run_black_box.sh:
#!/bin/bash
source setup.sh
black_box
Then you can pass run_black_box.sh to subprocess for execution, for example:
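A minimal sketch of the Python side, assuming run_black_box.sh is in the current directory and is run with bash:
import subprocess
# Run the wrapper script; setup.sh is sourced and black_box is started
# inside the same shell, so the exported variables are visible to it.
subprocess.call(["/bin/bash", "run_black_box.sh"])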
While there are many questions about accessing and setting environment variables in Python, I could not find an answer for my particular scenario.
I have a shell script that, when called, exports a bunch of environment variables that are used later on.
When you need those variables to be available in your current session, you would do
. ./script_exp_var.sh
that does, say
export MYVAR=MYVAL
then if you run python, you could access it with os.environ.get('MYVAR').
My question is how to invoke the script from Python and then access the environment variables that the called script just exported. Is it possible at all, and if so, how?
Note: I know I could set the env var from python using os.environ["MYVAR"] = MYVAL but I would like to use the existing logic in my ./script_exp_var.sh because it exports many variables.
Making sure to execute the script first and then run Python is also not an option in my scenario.
You can't. Environment variables are copied from parent to child, never back to the parent.
If you execute a shell script from python then the environment variables will be set in that shell process (the child) and python will be unaware of them.
You could write a parser to read the shell commands from python, but that's a lot of work.
Better to write a shell script with the settings in that and then call the python program as a child of the script.
Alternatively, write a shell script that echoes the values back to Python, which can pick them up through a pipe.
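A rough sketch of that pipe approach, reusing the script_exp_var.sh name from the question: the child shell sources the script and prints its environment, which Python copies into os.environ (the parsing is simplistic and assumes no newlines inside values).
import os
import subprocess
# Source the script in a child shell, dump the resulting environment,
# and read it back over a pipe.
output = subprocess.check_output(
    ["bash", "-c", "source ./script_exp_var.sh && env"],
    universal_newlines=True,
)
for line in output.splitlines():
    name, _, value = line.partition("=")
    os.environ[name] = value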
Recently I wanted to use a Python script to set up an environment on Linux. This is one line of my code:
p = subprocess.call(['/bin/csh', '-c', "source setup.csh"])
My setup.csh file is below:
add questa10.2b
add ds5-2013.06
setenv MODELSIM modelsim.ini
But when I run my Python script, the output shows that the files have been sourced, yet it turns out I still have to type the commands myself on the command line.
How can I solve this problem? Can anyone please help me with this?
You're creating a new csh shell as a subprocess and then running your commands inside that shell, which then terminates. The commands do not run in, or affect, the parent shell within which Python is running. When you just run the commands yourself, they affect the current shell.
If you need these settings to persist in your current shell after Python terminates, your best bet in general is to source setup.csh rather than putting it in a Python script. If other child processes of the Python script need your environment variables, you can alter os.environ.
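A minimal sketch of the os.environ approach, reusing the MODELSIM variable from the question's setup.csh and assuming /bin/csh is installed:
import os
import subprocess
# Set the variable in the Python process itself; any subprocess started
# afterwards inherits it.
os.environ["MODELSIM"] = "modelsim.ini"
subprocess.call(["/bin/csh", "-c", "echo $MODELSIM"])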
I am new to Python, so I'm in the early stages of learning it. I was wondering if anyone knows how to run one system command after another. It's hard to explain:
subprocess.call('dir',shell=True)
subprocess.call('cd ..',shell=True)
subprocess.call('dir',shell=True)
When I run the first command I expect to see the directory from which the file is run, and that was fine.
With the second command I expect to go up a directory.
With the third command I expected to see the parent directory, but I didn't; I just saw the first directory again.
Could someone explain why it isn't working as I expected and what I should do to correct it?
The general rule is that children cannot affect the parent's environment.
subprocess.call creates a child process. The child process can do many things. But, any changes it makes to the current working directory or to environment variables only last for the duration of the subprocess call. After the call completes and control returns to the parent, the parent's environment is restored unchanged.
If you want the cd to affect the next dir command, you need to have both in the same child. For example:
subprocess.call('cd .. && dir', shell=True)
You probably asked this question for more general purposes. But for the specific examples you provided, note that those actions might be better performed with the os module rather than the subprocess module: listing files in the current directory can be done with os.listdir, and changing the current working directory can be done with os.chdir.
If you are trying to change the working directory in Python, that can be accomplished simply with the os module. You can find that documentation here. I would suggest only using subprocess.call to call a script or another program that isn't trying to modify things based on the current environment.
When you run a subprocess with shell=True, Python starts up a new shell to run the command in. It is basically the same as if Python started up a new command prompt, entered the command, and then closed the command prompt.
The consequence is that any action which only affects the shell is lost when the shell is closed. So you can create files and you'll see that because the hard drive is changed. But if you change the current directory of the shell that change will be lost.
You might wonder about the output of the program. Basically, the default is for the output of the program to be copied to the output of the calling program. (You can override this.)
If you want to change the current directory, you want os.chdir. In general, you should avoid calling subprocesses and prefer Python's own tools. For example, instead of dir, use os.listdir.
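A sketch of the same three steps using the os module instead of subprocess:
import os
print(os.listdir("."))  # like the first dir
os.chdir("..")          # like cd ..; the change persists within the Python process
print(os.listdir("."))  # now lists the parent directory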
I need to run a lot of bash commands from Python. For the moment I'm doing this with
subprocess.Popen(cmd, shell=True)
Is there any solution to run all these commands in the same shell? subprocess.Popen opens a new shell at every execution and I need to set up all the necessary variables at every call, in order for cmd command to work properly.
subprocess.Popen lets you supply a dictionary of environment variables, which will become the environment for the process being run. If the only reason you need shell=True is to set an environment variable, then I suggest you use an explicit environment dictionary instead; it's safer and not particularly difficult. Also, it's usually much easier to construct command invocations when you don't have to worry about quoting and shell metacharacters.
It may not even be necessary to construct the environment dictionary, if you don't mind having the environment variables set in the running process. (Most of the time, this won't be a problem, but sometimes it is. I don't know enough about your application to tell.)
If you can modify your own environment with the settings, just do that:
os.environ['theEnvVar'] = '/the/value'
Then you can just use a simple subprocess call (check_output or similar) to run the command:
output = subprocess.check_output(["ls", "-lR", "/tmp"])
If for whatever reason you cannot change your own environment, you need to make a copy of the current environment, modify it as desired, and pass it to each subprocess.call:
env = os.environ.copy()
env['theEnvVar'] = '/the/value'
output = subprocess.check_output(["ls", "-lR", "/tmp"], env=env)
If you don't want to have to specify env=env every time, just write a little wrapper class.
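A small illustrative sketch of such a wrapper; the class name and its single method are made up for the example:
import os
import subprocess
class EnvRunner:
    # Remember a modified copy of the environment and pass it to every call.
    def __init__(self, **extra):
        self.env = os.environ.copy()
        self.env.update(extra)
    def check_output(self, cmd):
        return subprocess.check_output(cmd, env=self.env)
runner = EnvRunner(theEnvVar="/the/value")
output = runner.check_output(["ls", "-lR", "/tmp"])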
Why not just create a shell script with all the commands you need to run, then just use a single subprocess.Popen() call to run it? If the contents of the commands you need to run depend on results calculated in your Python script, you can just create the shell script dynamically, then run it.
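One way that dynamic approach could look; the commands and the temporary-file handling here are only placeholders:
import os
import subprocess
import tempfile
# Write the commands to a temporary script and run them in a single shell,
# so state such as exported variables persists from one command to the next.
commands = "export MYVAR=value\necho $MYVAR\n"
with tempfile.NamedTemporaryFile("w", suffix=".sh", delete=False) as f:
    f.write(commands)
    script_path = f.name
subprocess.call(["bash", script_path])
os.unlink(script_path)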
Use multiprocessing instead; it's more lightweight and efficient.
Unlike subprocess.Popen it does not open a new shell at every execution.
You didn't say you need to run subprocess.Popen and you may well not need to; you just said that's what you're currently doing. More justification please.
See set env var in Python multiprocessing.Process for how to set your env vars once and for all in the parent process.
I am very new to Python and I have been trying to find a way to run cmd commands with Python.
I tried os.system and subprocess too. But I am not sure how to use subprocess.
While using os.system(), I got an error saying that the file specified cannot be found.
This is what I am trying to write in cmd: os.system('cd '+path+'tesseract '+'a.png out')
I have tried searching Google but still I don't understand how to use subprocess.
EDIT:
It's not a problem with Python anymore; I have figured it out. Here is my code now:
os.system("cd C:\\Users\\User\\Desktop\\Folder\\data\\")
os.system("tesseract a.png out")
Now it says the file cannot be opened. But if I open cmd separately and type the same commands, it successfully creates a file in the folder\data.
Each call to os.system is a separate instance of the shell. The cd you issued only had effect in the first instance of the shell. The second call to os.system was a new shell instance that started in the Python program's current working directory, which was not affected by the first cd invocation.
Some ways to do what you want:
1 -- put all the relevant commands in a single bash file and execute that via os.system
2 -- skip the cd call; just invoke your tesseract command using a full path to the file (see the sketch after this list)
3 -- change the directory for the Python program as a whole using os.chdir but this is probably not the right way -- your Python program as a whole (especially if running in a web app framework like Django or web2py) may have strong feelings about the current working directory.
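For instance, option 2 might look like the sketch below; the paths come from the question and tesseract is assumed to be on PATH:
import subprocess
# Invoke tesseract with full paths so no cd is needed.
subprocess.call([
    "tesseract",
    r"C:\Users\User\Desktop\Folder\data\a.png",
    r"C:\Users\User\Desktop\Folder\data\out",
])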
The main takeaway is, os.system calls don't change the execution environment of the current Python program. It's equivalent to what would happen if you created a sub-shell at the command line, issued one command then exited. Some commands (like creating files or directories) have permanent effect. Others (like changing directories or setting environment variables) don't.