From Python, I need to run another Python file inside Git Bash while running on Windows.
That is, I have a configuration script written in Python that calls other Python scripts. Unfortunately, some of them use Unix commands, so they must be run through Git Bash on Windows.
Currently I'm using this:
cmd = f'{sys.executable} mydependency.py'
pipe = subprocess.Popen(cmd, stderr=subprocess.PIPE, stdout=subprocess.PIPE)
# waiting for pipe is handled later...
However, this doesn't work; it gives me a "cannot execute binary file" message. How can I get it to run?
PS: For slightly more context, mydependency.py is actually the amalgamate.py script from the simdjson (https://github.com/simdjson/simdjson) project.
EDIT:
I have also attempted the following:
Switching to run or call instead of subprocess.Popen
Using f'{git_bash_path} {sys.executable} mydependency.py' as the command
Changing the shell and executable parameters of Popen, run and call
I found a solution:
cmd = git_bash_path # Found with glob.
pipe = subprocess.Popen(cmd, stderr=subprocess.PIPE, stdout=subprocess.PIPE, stdin=subprocess.PIPE)
pipe.communicate(input=f'{sys.executable} mydependency.py'.encode())
I'm not entirely sure why this works; if anyone has an explanation, I'd be glad to hear it.
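A tentative explanation: bash started with no script argument executes whatever it reads from stdin, so piping the command line in via communicate() behaves much like handing it to bash with -c. A minimal sketch of that roughly equivalent form, assuming git_bash_path points at Git Bash's bash.exe (the forward-slash conversion is only there because bash would otherwise treat the backslashes in the interpreter path as escapes):
# Sketch only: pass the command to bash via -c instead of stdin.
# Assumes git_bash_path points at Git Bash's bash.exe.
py = sys.executable.replace('\\', '/')  # keep bash from eating backslashes
cmd = [git_bash_path, '-c', f'{py} mydependency.py']
pipe = subprocess.Popen(cmd, stderr=subprocess.PIPE, stdout=subprocess.PIPE)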
Related
I am trying to run a shell script through Python using subprocess.Popen().
The shell script just has the following lines:
#!/bin/sh
echo Hello World
Following is the Python code:
print("RUNNNING SHELL SCRIPT NOW")
shellscript = subprocess.Popen(['km/example/example1/source/test.sh'], stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
shellscript.wait()
for line in shellscript.stdout.readlines():
    print(line)
print("SHELL SCRIPT RUN ENDED")
However, on running this, I am only getting the following output:
RUNNNING SHELL SCRIPT NOW
SHELL SCRIPT RUN ENDED
i.e. I am not getting the shell script output in between these 2 lines.
Moreover, when I remove the stderr=subprocess.PIPE part from the subprocess, I get the following output:
RUNNNING SHELL SCRIPT NOW
'km' is not defined as an internal or external command.
SHELL SCRIPT RUN ENDED
I am not able to understand how to resolve this, and run the shell script properly. Kindly guide. Thanks.
UPDATE:
I also tried the following change:
print("RUNNNING SHELL SCRIPT NOW")
shellscript = subprocess.Popen(['km/example/example1/source/test.sh'], stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=True)
out, err = shellscript.communicate()
print(out)
print("SHELL SCRIPT RUN ENDED")
I get the following output:
RUNNNING SHELL SCRIPT NOW
b''
SHELL SCRIPT RUN ENDED
The simple and straightforward fix is to not use bare Popen for this.
You also don't need a shell to run a subprocess; if the subprocess is a shell script, that subprocess itself will be a shell, but you don't need the help of the shell to run that script.
proc = subprocess.run(
    ['km/example/example1/source/test.sh'],
    check=True, capture_output=True, text=True)
out = proc.stdout
If you really need to use Popen, you need to understand its processing model. But if you are just trying to get the job done, the simple answer is don't use Popen.
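If Popen is genuinely needed, a minimal sketch of the usual pattern, under the same assumptions as the run() example above (communicate() reads both pipes and waits, which avoids the deadlock that wait() followed by readlines() can cause once a pipe buffer fills up):
proc = subprocess.Popen(['km/example/example1/source/test.sh'],
                        stdout=subprocess.PIPE, stderr=subprocess.PIPE)
out, err = proc.communicate()
print(out.decode())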
The error message actually suggests you are on Windows: with shell=True the command is run via cmd, which treats the slashes as option separators rather than directory separators, so it tries to run km on its own. Removing the shell=True avoids this complication and just starts a process with the requested name. (This of course still requires that the file exists at the relative path you are specifying. Perhaps see also What exactly is current working directory?, and consider switching to native Windows backslashes, with an r'...' string to prevent Python from trying to interpret the backslashes.)
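A tiny sketch of that sanity check, using the relative path from the question (whether it matches your actual layout is for you to verify):
from pathlib import Path

# Sketch: confirm the relative path actually resolves from the current
# working directory before handing it to subprocess.
script = Path(r'km\example\example1\source\test.sh')
print(Path.cwd())
print(script.resolve(), script.exists())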
I am trying to call mvn commands from Python, for this subprocess module has to be used.
The problem is that this has been working for a long time and all of a sudden does not work anymore: the executed Maven commands complain about JAVA_HOME not being set, even though it is set when I manually type echo $JAVA_HOME into the shell.
I have no idea why it stopped working all of a sudden.
What I would expect
command= "echo $JAVA_HOME"
proc = subprocess.Popen(['bash', '-c', command],
                        stdout=subprocess.PIPE, stderr=subprocess.PIPE,
                        stdin=subprocess.PIPE)
output, err = proc.communicate()
print(str(output))
prints the path to my Java JDK.
$ echo $JAVA_HOME
prints the path to my Java JDK.
What happens instead
command= "echo $JAVA_HOME"
proc = subprocess.Popen(['bash', '-c', command],
                        stdout=subprocess.PIPE, stderr=subprocess.PIPE,
                        stdin=subprocess.PIPE)
output, err = proc.communicate()
print(str(output))
prints b'\n'
$ echo $JAVA_HOME
prints path/to/my/java/jdk
What I already tried
Using shell=True in Popen: works, but is discouraged due to security risks, and on our Jenkins it seems to use /bin/sh, which makes the script crash because some commands are only executable with bash. It worked without it before, so there must be a way to get along without it.
Adding env=os.environ.copy() as an argument to Popen: no effect, even when specifying JAVA_HOME explicitly using env.
Moving the JDK to a path with no weird spaces or anything like that: no effect.
Checking the output of os.environ['JAVA_HOME']: prints the path to my Java JDK.
Information
I am still using the same Python version. I did not update anything that could have caused this weird behavior all of a sudden; at least, I wouldn't know what it is.
I am using Windows 10 Enterprise, x64 based
I am using Git Bash
I am using Python 3.8.5
Update 1:
After reading something about problems with sharing environment variables between WSL and Windows, I discovered that I can specify shared variables by setting an environment variable named 'WSLENV'. I added JAVA_HOME/p to it, and now the Python subprocess no longer prints b'\n' but b'/tmp/docker-desktop-root/mnt/host/c/Users/user/Desktop/jdk11\n'. So the problem seems to be WSL (?).
Unfortunately, Maven still says JAVA_HOME should point to a JDK, not a JRE, so this path seems not to work.
Update 2:
By changing the WSLENV variable's content from JAVA_HOME/p to JAVA_HOME/u, the subprocess now prints the correct path to the JDK. Still, Maven fails with the same error message.
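As a very rough sketch, the same flag could in principle be supplied per call from Python; whether the WSL launcher honours a WSLENV value passed only through env= is an assumption on my part (in the updates above it was set globally in Windows):
import os
import subprocess

# Sketch: ask WSL to pass JAVA_HOME through untranslated (the /u flag).
# Assumption: bash.exe reads WSLENV from the environment supplied here.
env = os.environ.copy()
env['WSLENV'] = 'JAVA_HOME/u'
proc = subprocess.Popen(['bash', '-c', 'echo $JAVA_HOME'],
                        stdout=subprocess.PIPE, stderr=subprocess.PIPE,
                        env=env)
print(proc.communicate()[0])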
Update 3:
To make it work with WSL enabled, check out my answer below.
I found a way to make it work with WSL enabled; it is a bit ugly, but it seems to work.
command = "mvn --version"
proc = subprocess.Popen(['wsl', 'bash.exe', '-c', command],
                        stdout=subprocess.PIPE, stderr=subprocess.PIPE,
                        stdin=subprocess.PIPE)
output, err = proc.communicate()
print(str(output))
print(str(err))
By prepending wsl and bash.exe I managed to make it work; the output is the usual output from mvn --version, just as expected. Notice the .exe, which seems to tell WSL to use the same bash executable as in normal usage without subprocess.
Without the .exe, WSL seems to use a different bash executable where JAVA_HOME is not defined, or at least Maven complains about it with the error message that I already mentioned above.
Notice that this code probably won't work when WSL is not enabled, so you would need to programmatically test if WSL is enabled and then modify the command accordingly.
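A rough sketch of such a check, under the assumption that finding wsl on PATH is an acceptable proxy for WSL being enabled:
import shutil
import subprocess

command = "mvn --version"
# Sketch: fall back to plain bash when wsl is not on PATH.
# Assumption: "wsl exists on PATH" is treated as "WSL is enabled".
if shutil.which('wsl'):
    args = ['wsl', 'bash.exe', '-c', command]
else:
    args = ['bash', '-c', command]
proc = subprocess.Popen(args, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
output, err = proc.communicate()
print(str(output))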
I'm still searching for a solution where I don't need to modify the process args; I'll update this answer if I find one.
I'm currently running an OpenELEC (XBMC) installation on a Raspberry Pi and installed a tool named "Hyperion" which takes care of the connected Ambilight. I'm a total noob when it comes to Python-programming, so here's my question:
How can I run a script that checks if a process with a specific string in its name is running and:
kill the process when it's running
start the process when it's not running
The goal of this is to have one script that toggles the Ambilight. Any idea how to achieve this?
You may want to have a look at the subprocess module, which can run shell commands from Python; see, for instance, this answer. You can then capture the shell command's stdout in a variable. I suspect you are going to need the pidof shell command.
The basic idea would be along the lines of:
import subprocess

try:
    # capture the pid so we can kill the process later if it is running
    output = subprocess.check_output(["pidof", "-s", "-x", "hyperiond"])
except subprocess.CalledProcessError:
    # not running: spawn the process using subprocess.Popen
    subprocess.Popen("hyperiond")
else:
    # running: kill the process using a shell command with subprocess.call
    subprocess.call("kill %s" % output.decode().strip(), shell=True)
I've tested this code in Ubuntu with bash as the process and it works as expected. In your comments you note that you are getting file not found errors. You can try putting the complete path to pidof in your check_output call. This can be found using which pidof from the terminal. The code for my system would then become
subprocess.check_output(["/bin/pidof", "-s", "-x", "hyperiond"])
Your path may differ. On Windows, adding shell=True to the check_output arguments fixes this issue, but I don't think this is relevant for Linux.
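If hard-coding the path feels brittle, a small sketch of resolving it from Python instead (shutil.which is the stdlib counterpart of the which command and needs Python 3.3+):
import shutil
import subprocess

# Sketch: look up the full path to pidof at runtime instead of
# hard-coding /bin/pidof; fall back to the bare name if not found.
pidof = shutil.which("pidof") or "pidof"
output = subprocess.check_output([pidof, "-s", "-x", "hyperiond"])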
Thanks so much for your help @will-hart, I finally got it working. I needed to change some details because the script kept saying that "output" is not defined. Here's how it looks now:
#!/usr/bin/env python
import subprocess

try:
    subprocess.check_output(["pidof", "hyperiond"])
except subprocess.CalledProcessError:
    subprocess.Popen(["/storage/hyperion/bin/hyperiond.sh", "/storage/.config/hyperion.config.json"])
else:
    subprocess.call(["killall", "hyperiond"])
I'm having trouble using multiple subprocess calls back to back.
These 2 work fine:
subprocess.call(["gmake", "boot-tilera"], cwd="/home/ecorbett/trn_fp")
p = subprocess.Popen(["gmake", "run-tilera"], stdout=subprocess.PIPE, cwd="/home/ecorbett/trn_fp")
However, I get an error when I try to run this call directly after:
time.sleep(10)
subprocess.call(["./go2.sh"], cwd="/home/ecorbett/trn_fp/kem_ut")
I added sleep in there because I need a few seconds before I run the "./go2.sh" program. Not sure if that is the issue.
Any advice?
A possible reason why your shell script works on the command line but not from Python is that the shebang line was not written correctly (or not written at all). See an example in which the script would work from the command line but not as a Python subprocess: Is this the right way to run a shell script inside Python?
If your shell script does not have a shebang line, it works from the command line because $SHELL is set in your environment and the script takes that as a default. When run from a Python subprocess, Python does not know what it is and fails with OSError: [Errno 8] Exec format error. The subprocess.call() to gmake worked because it is a binary program and not a shell script. Using the argument shell=True gives an instruction to interpret the argument exactly as a shell would.
However, be careful about using shell=True in subprocess.call() as it may be insecure in some cases: subprocess Python docs.
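As an alternative to shell=True, a minimal sketch: invoke the interpreter explicitly, so the script runs even if its shebang line is missing (the sh name and the paths below are taken from the question and may need adjusting):
import subprocess

# Sketch: run the script through an explicit interpreter rather than
# relying on a shebang line or on shell=True.
subprocess.call(["sh", "./go2.sh"], cwd="/home/ecorbett/trn_fp/kem_ut")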
I have successfully run several Python scripts, calling them from a base script using the subprocess module:
subprocess.Popen([sys.executable, 'script.py'], shell=True)
However, each of these scripts executes some simulations (.exe files from a C++ application) that generate some output to the shell. All these outputs are written to the base shell from which I've launched those scripts. I'd like to generate a new shell for each script. I've tried to generate new shells using the shell=True argument when calling subprocess.call (I also tried with Popen), but it doesn't work.
How do I get a new shell for each process generated with subprocess.call?
I was reading the documentation about stdin and stdout as suggested by Spencer and found a flag that solved the problem: subprocess.CREATE_NEW_CONSOLE. Maybe redirecting the pipes does the job too, but this seems to be the simplest solution (at least for this specific problem). I've just tested it and it worked perfectly:
subprocess.Popen([sys.executable, 'script.py'], creationflags=subprocess.CREATE_NEW_CONSOLE)
To open in a different console, do (tested on Windows 7 / Python 3):
from sys import executable
from subprocess import Popen, CREATE_NEW_CONSOLE
Popen([executable, 'script.py'], creationflags=CREATE_NEW_CONSOLE)
input('Enter to exit from this launcher script...')
Popen already generates a subprocess to handle things. You just need to redirect the output pipes. Look at the subprocess documentation, specifically the section on Popen's stdin, stdout and stderr redirection.
If you don't redirect these pipes, it inherits them from the parent. Just be careful about deadlocking your processes.
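For example, a minimal sketch of giving each child its own output file so nothing lands in the parent's console (the script and log names are placeholders):
import subprocess
import sys

# Sketch: redirect a launched script's stdout and stderr to its own
# log file; 'script.py' and 'script1.log' are placeholder names.
with open('script1.log', 'w') as log:
    proc = subprocess.Popen([sys.executable, 'script.py'],
                            stdout=log, stderr=subprocess.STDOUT)
# the child keeps its own handle to the log file after the with block exits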
You wanted additional windows for each subprocess. This is handled as well. Look at the startupinfo section of subprocess. It explains what options to set on Windows to spawn a new terminal for each subprocess. Note that it requires the use of the shell=True option.
This doesn't actually answer your question. But I've had my problems with subprocess too, and pexpect turned out to be really helpful.
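In case it is useful, a tiny sketch of the pexpect pattern (Unix-only; './go2.sh' is just a placeholder command):
import pexpect

# Sketch: spawn a command, wait for it to finish, and collect its output.
child = pexpect.spawn('./go2.sh')
child.expect(pexpect.EOF)
print(child.before.decode())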