I have a Python script that calls a bash script which renames a file. I then need the new name of the file so Python can do some further processing on it. I'm using subprocess.Popen to call the shell script. The shell script echoes the new file name, so I can use stdout=subprocess.PIPE to read it.
The problem is that, depending on the circumstances, the bash script sometimes tries to rename the file to its old name, so the mv command complains that the two files are the same. I have cut out all the other stuff and included a basic example below.
$ ls -1
test.sh
test.txt
This shell script is just an example to force the error message.
$ cat test.sh
#!/bin/bash
mv "test.txt" "test.txt"
echo "test"
In Python:
$ python
>>> import subprocess
>>> p = subprocess.Popen(['/bin/bash', '-c', './test.sh'], stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
>>> p.stdout.read()
"mv: `test.txt' and `test.txt' are the same file\ntest\n"
How can I ignore the message from the mv command and only get the output of the echo command? If all goes well, the only output of the shell script is the result of the echo, so really I just need to ignore the mv error message.
Thanks,
Geraint
Direct stderr to /dev/null, thusly:
$ python
>>> import os
>>> from subprocess import *
>>> p = Popen(['/bin/bash', '-c', './test.sh'], stdout=PIPE, stderr=open(os.devnull, 'w'))
>>> p.stdout.read()
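On Python 3.3 and later you can let subprocess open the null device for you via subprocess.DEVNULL; a minimal variant of the call above:
>>> from subprocess import Popen, PIPE, DEVNULL
>>> p = Popen(['/bin/bash', '-c', './test.sh'], stdout=PIPE, stderr=DEVNULL)
>>> p.stdout.read()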
To get subprocess' output and ignore its error messages:
#!/usr/bin/env python
from subprocess import check_output
import os
with open(os.devnull, 'wb', 0) as DEVNULL:
    output = check_output("./test.sh", stderr=DEVNULL)
check_output() raises an exception if the script returns with non-zero status.
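If a non-zero status is something you want to handle rather than crash on, you can catch that exception; a minimal sketch:
#!/usr/bin/env python
from subprocess import check_output, CalledProcessError
import os

with open(os.devnull, 'wb', 0) as DEVNULL:
    try:
        output = check_output("./test.sh", stderr=DEVNULL)
    except CalledProcessError as e:
        # e.returncode holds the script's exit status
        print("script failed with status", e.returncode)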
See How to hide output of subprocess in Python 2.7.
Related
I am trying to run a specific command in PowerShell from Python. The command works as expected when run directly in PowerShell:
gpt .\Method\gpt_scripts\s1_cal_deb.xml -t .\deburst\S1A_IW_SLC__1SDV_20180909T003147_20180909T003214_023614_0292B2_753E_Cal_deb_script.dim .\images\S1A_IW_SLC__1SDV_20180909T003147_20180909T003214_023614_0292B2_753E.zip
PowerShell runs the command and prints the tool's normal output (not reproduced here). The current working directory in Python is the same as in PowerShell:
os.getcwd()
'C:\\Users\\Ishack\\Documents\\Neta-Analytics\\Harvest Dates\\S1_SLC_Processing'
I tried something like this:
import subprocess
process = 'gpt .\Method\gpt_scripts\s1_cal_deb.xml -t .\deburst\S1A_IW_SLC__1SDV_20180909T003147_20180909T003214_023614_0292B2_753E_Cal_deb_script.dim .\images\S1A_IW_SLC__1SDV_20180909T003147_20180909T003214_023614_0292B2_753E.zip'
process = subprocess.Popen(['powershell.exe', '-NoProfile', '-Command', '"&{' + process + '}"'], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
process
Output:
<subprocess.Popen at 0x186bb202388>
But when I press Enter I get no response; I would like Python to print the output, just like in PowerShell. I researched other similar questions but still found no solution.
Thanks,
Ishack
This worked for me, with the commands in the file SamplePowershell.ps1:
import os
os.system(r"powershell.exe -ExecutionPolicy Bypass -File .\SamplePowershell.ps1")
SamplePowershell.ps1:
Write-Host "Hello world"
hostname
And if you want to use subprocess:
import subprocess
k = subprocess.Popen('powershell.exe hostname', stdout=subprocess.PIPE, stderr=subprocess.PIPE)
print(k.communicate())
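To see the output of the original gpt command the same way, read the pipes with communicate(); a sketch, assuming command holds the same gpt string as in the question:
import subprocess

# command is the gpt string shown in the question
process = subprocess.Popen(['powershell.exe', '-NoProfile', '-Command', command],
                           stdout=subprocess.PIPE, stderr=subprocess.PIPE)
stdout, stderr = process.communicate()  # blocks until gpt finishes
print(stdout.decode())
print(stderr.decode())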
I need to use wget in a Python script with the subprocess.call function, but it seems the wget command cannot be found by the bash subprocess opened by Python.
I have added the environment variable (the path where wget lives):
export PATH=/usr/local/bin:$PATH
to the ~/.bashrc file and the ~/.bash_profile file on my Mac and made sure to source them.
The Python script looks like:
import subprocess as sp
cmd = 'wget'
process = sp.Popen(cmd, stdout=sp.PIPE, stdin=sp.PIPE,
                   stderr=sp.PIPE, shell=True, executable='/bin/bash')
(stdoutdata, stderrdata) = process.communicate()
print stdoutdata, stderrdata
The expected output should be like
wget: missing URL
Usage: wget [OPTION]... [URL]...
But the result is always
/bin/bash: wget: command not found
Interestingly, I can get the help output if I type wget directly in a bash terminal, but it never works in the Python script. How can that be?
PS:
If I change the command to
cmd = '/usr/local/bin/wget'
then it works. So I am sure I got wget installed.
You can pass an env= argument to the subprocess functions.
import os
import subprocess

myenv = os.environ.copy()  # copy the environment so the original is untouched
myenv['PATH'] = '/usr/local/bin:' + myenv['PATH']
subprocess.run(..., env=myenv)
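For instance, a concrete call under those settings (the URL here is just a placeholder):
subprocess.run(['wget', 'https://example.com/file.txt'], env=myenv)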
However, you probably want to avoid running a shell at all, and instead augment the PATH that Python uses to find the binary to run in the subprocess call.
import subprocess as sp
import os
os.environ['PATH'] = '/usr/local/bin:' + os.environ['PATH']
cmd = 'wget'
# use run instead of Popen
# don't needlessly use a shell
# and thus put [cmd] as a list
process = sp.run([cmd], stdout=sp.PIPE, stdin=sp.PIPE,
                 stderr=sp.PIPE,
                 universal_newlines=True)
print(process.stdout, process.stderr)
Running Bash commands in Python explains the changes I made in more detail.
However, there is no good reason to use an external utility for this; Python's requests library does pretty much everything wget does, often more naturally and with more control over what exactly it does.
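For example, a rough requests equivalent of wget URL (the URL and filename here are placeholders):
import requests

url = 'https://example.com/file.txt'
response = requests.get(url)
response.raise_for_status()  # raise on HTTP errors instead of saving an error page
with open('file.txt', 'wb') as f:
    f.write(response.content)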
I'm writing a GIT hook in Python 3.5. The Python script calls a Bash script that reads input from the user using the read command.
The bash script works by itself, and also when I call the Python script directly, but when GIT runs the hook written in Python, it doesn't work as expected because no input is requested from the user.
Bash script:
#!/usr/bin/env bash
echo -n "Question? [Y/n]: "
read REPLY
GIT Hook (Python script):
#!/usr/bin/env python3
from subprocess import Popen, PIPE
proc = Popen('/path/to/myscript.sh', shell=True, stderr=PIPE, stdout=PIPE)
stdout_raw, stderr_raw= proc.communicate()
When I execute the Python script, Bash's read does not seem to wait for input, and I only get:
b'\nQuestion? [Y/n]: \n'
How to let the bash script read input when being called from Python?
It turns out the problem had nothing to do with Python: when the GIT hook called a bash script directly, it also failed to ask for input.
The solution I found is given here.
Basically, the solution is to add the following to the bash script before the read:
# Allows us to read user input below, assigns stdin to keyboard
exec < /dev/tty
In my case, I also had to call the bash process simply as Popen(mybashscript) instead of Popen(mybashscript, shell=True, stderr=PIPE, stdout=PIPE), so the script can freely output to STDOUT and not get captured in a PIPE.
Alternatively, I didn't modify the bash script and instead used in Python:
sys.stdin = open("/dev/tty", "r")
proc = Popen(mybashscript, stdin=sys.stdin)
which is also suggested in the comments of the aforementioned link.
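Putting that second variant together, a minimal version of the hook (the script path is the placeholder from the question):
#!/usr/bin/env python3
import sys
from subprocess import Popen

# reattach stdin to the terminal so the bash read can reach the keyboard
sys.stdin = open("/dev/tty", "r")
proc = Popen('/path/to/myscript.sh', stdin=sys.stdin)
proc.wait()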
Adding
print(stdout_raw)
print(stderr_raw)
shows
b''
b'/bin/sh: myscript.sh: command not found\n'
here. Prefixing the script with ./ made the read work once Python could find it. Passing cwd='.' to Popen may also work.
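A sketch of that cwd variant (script name as in the question):
from subprocess import Popen

# cwd controls the directory that relative paths like ./myscript.sh resolve against
proc = Popen('./myscript.sh', cwd='.')
proc.wait()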
This is what worked for me without invoking a bash script from within Python. It is a modified version of arod's answer.
import subprocess
import sys
sys.stdin = open("/dev/tty", "r")
user_input = subprocess.check_output(
    "read -p \"Please give your input: \" userinput && echo \"$userinput\"",
    shell=True, stdin=sys.stdin).rstrip()
print(user_input)
Based on the above replies:
import sys
import subprocess

def getInput(prompt):
    sys.stdin = open("/dev/tty", "r")
    command = f"read -p \"{prompt}\" ret && echo \"$ret\""
    userInput = subprocess.check_output(command, shell=True, stdin=sys.stdin).rstrip().decode("utf-8")
    return userInput
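For example (the prompt text is arbitrary):
answer = getInput("Continue? [Y/n]: ")
print(answer)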
I need to run a bash script from Python. I got it to work as follows:
import os
os.system("xterm -hold -e scipt.sh")
That isn't exactly what I am doing, but it's pretty much the idea. It works fine: a new terminal window opens, and I hold it for debugging purposes. My problem is that I need the Python script to keep running even if the xterm hasn't finished. Is there any way I can do this?
I recommend you use the subprocess module (see the docs).
And you can do:
import subprocess

cmd = "xterm -hold -e scipt.sh"
# does not block; this starts the command as a subprocess
p = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
# you can block until the command finishes
p.wait()
# or: stdout, stderr = p.communicate()
For more info, read the docs. :)
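If you want the Python script to carry on and only occasionally check whether the xterm has finished, poll() returns None while the child is still running; a minimal sketch:
import subprocess
import time

p = subprocess.Popen("xterm -hold -e scipt.sh", shell=True)
while p.poll() is None:
    # xterm is still open; the script is free to do other work here
    time.sleep(1)
print("xterm exited with status", p.returncode)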
I have a bit of code designed to take a file, perform the dos2unix command on it, copy that file to a file called INPUT, and then run a command that boots a program. The first two tasks work flawlessly, but the script doesn't seem to execute the command line that starts the program. However, when I take that command line exactly as written in the script and run it in a terminal, it works fine.
Here is the code:
import subprocess
import os
os.chdir('/home/mike/testing/crystal')
subprocess.Popen(['dos2unix mgo_input'], stdout=subprocess.PIPE, shell=True)
subprocess.call(['cp mgo_input INPUT'], shell=True)
subprocess.Popen(['mpirun -np 8 Pcrystal </dev/null &> mgo_singlepoint.out &'], stdout=subprocess.PIPE, shell=True)
It is the mpirun section of the code that seems to be getting hung up.
Popen() returns before the called program finishes execution; using call() (or check_call(), which checks return codes) can be a better solution. Better yet, use Python itself for the conversion and the copy.
I'm not sure why you are piping stdout, but I'm going to assume you don't want dos2unix or mpirun to print to the screen, so I redirect dos2unix to /dev/null and mpirun to the output file.
import subprocess
import shutil
import os
os.chdir('/home/mike/testing/crystal')
subprocess.check_call('dos2unix mgo_input', stdout=open('/dev/null', 'w'), shell=True)
shutil.copy2('mgo_input', 'INPUT')
subprocess.check_call('mpirun -np 8 Pcrystal', stdout=open('mgo_singlepoint.out', 'w'), stderr=subprocess.STDOUT, shell=True)
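And if you want to drop the dos2unix dependency entirely, as suggested above, the conversion and the copy can be done in plain Python; a minimal sketch:
# read the file in binary mode, strip carriage returns, write the result to INPUT
with open('mgo_input', 'rb') as src:
    data = src.read().replace(b'\r\n', b'\n')
with open('INPUT', 'wb') as dst:
    dst.write(data)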