Python - pipelining subprocess in Windows

I'm using Windows 7, and I've tried this under Python 2.6.6 and Python 3.2.
So I'm trying to call this command line from Python:
netstat -ano | find ":80"
Under Windows cmd, this line works perfectly fine.
So,
1st attempt:
output = subprocess.Popen(
[r'netstat -ano | find ":80"'],
stdout=subprocess.PIPE,
shell=True
).communicate()
An error is raised saying that 'find' did not receive the correct parameter (as if it had been invoked as find ":80" \):
Access denied - \
2nd attempt:
#calling netstat
cmd_netstat = subprocess.Popen(
['netstat','-ano'],
stdout = subprocess.PIPE
)
#pipelining netstat result into find
cmd_find = subprocess.Popen(
['find','":80"'],
stdin = cmd_netstat.stdout,
stdout = subprocess.PIPE
)
Again, the same error is raised.
Access denied - \
What did I do wrong? :(
EDIT:
3rd attempt (as @Pavel Repin suggested):
cmd_netstat = subprocess.Popen(
['cmd.exe', '-c', 'netstat -ano | find ":80"'],
stdout=subprocess.PIPE
).communicate()
Unfortunately, subprocess with ['cmd.exe','-c'] results in something resembling a deadlock, or a blank cmd window. I assume '-c' is ignored by cmd, so communicate() waits indefinitely for cmd to terminate. Since this is Windows, my best bet is that cmd only accepts parameters starting with a slash (/). So I substituted '-c' with '/c':
cmd_netstat = subprocess.Popen(
['cmd.exe', '/c', 'netstat -ano | find ":80"'],
stdout=subprocess.PIPE
).communicate()
And...back to the same error:
Access denied - \
EDIT:
I gave up; I'll just process the string returned by 'netstat -ano' in Python. Might this be a bug?

What I suggest is that you do as much as possible in Python code. So, you can execute the following command:
# executing the command
import subprocess
output = subprocess.Popen(['netstat', '-ano'], stdout=subprocess.PIPE).communicate()
and then parse the output:
# filtering the output
valid_lines = [ line for line in output[0].split('\r\n') if ':80' in line ]
You will get a list of lines. On my computer, the output looks like this for port number 1900 (no HTTP connection active):
[' UDP 127.0.0.1:1900 *:* 1388', ' UDP 192.xxx.xxx.233:1900 *:* 1388']
In my opinion, this is easier to work with.
Note that:
the shell=True option is not mandatory, but with it a command-line window briefly opens and closes. See what suits you best, but beware of command injection;
the list of Popen arguments should be a list of strings. Quoting the individual items is not necessary; subprocess takes care of that for you (see the sketch below).
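Putting those notes together, here is a minimal sketch of the same approach that also works on Python 3, where communicate() returns bytes unless you ask for text:
import subprocess

# Run netstat without a shell; universal_newlines=True gives text output on
# both Python 2 and Python 3.
proc = subprocess.Popen(['netstat', '-ano'],
                        stdout=subprocess.PIPE,
                        universal_newlines=True)
output, _ = proc.communicate()

# Keep only the lines that mention port 80.
valid_lines = [line for line in output.splitlines() if ':80' in line]
print(valid_lines)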
Hope this helps.
EDIT: oops, I missed the last line of the edit. Seems you've already got the idea on your own.

So I revisited this question and found two solutions (I switched to Python 2.7 some time ago, so I'm not sure about Python 2.6, but it should be the same):
Replace find with findstr, and remove the double quotes
output = subprocess.Popen(['netstat', '-ano', '|', 'findstr', ':80'],
                          stdout=subprocess.PIPE,
                          shell=True).communicate()
But this doesn't explain why "find" cannot be used, so:
Use string parameter instead of list
output = subprocess.Popen('netstat -ano | find ":80"',
                          stdout=subprocess.PIPE,
                          shell=True).communicate()
or
pipeout = subprocess.Popen(['netstat', '-ano'],
                           stdout=subprocess.PIPE)
output = subprocess.Popen('find ":80"',
                          stdin=pipeout.stdout,
                          stdout=subprocess.PIPE).communicate()
The problem arises from the fact that ['find', '":80"'] is actually translated into the command line find \":80\".
Thus the following command is executed in the Windows command shell:
>find \":80\"
Access denied - \
Proof:
Running:
output = subprocess.Popen(['echo', 'find', '":80"'],
                          stdout=subprocess.PIPE,
                          shell=True).communicate()
print output[0]
returns:
find \":80\"
Running:
output = subprocess.Popen('echo find ":80"',
                          stdout=subprocess.PIPE,
                          shell=True).communicate()
print output[0]
returns:
find ":80"
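As a quick cross-check of the explanation above, subprocess.list2cmdline (the helper Popen uses on Windows to turn an argument list into a command-line string) shows the same escaping directly:
import subprocess

# Embedded double quotes in a list item are backslash-escaped when the list
# is converted into a single Windows command-line string.
print(subprocess.list2cmdline(['find', '":80"']))  # find \":80\"
print(subprocess.list2cmdline(['find', ':80']))    # find :80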

New answer, after reading this old question again: this may be due to the following two facts:
The pipe operator executes the commands that follow it in a sub-shell; see for instance this interesting consequence.
Python itself uses the pipe as a way to get the results back:
Note that (...) to get anything other than None in the result tuple, you need to give stdout=PIPE and/or stderr=PIPE too.
Not sure if this 'conflict' is kind of a bug, or a design choice though.

Related

Python subprocess.Popen() not running command

I'm trying to use subprocess.Popen() to run a command in my script. The code is:
output = Popen(["hrun DAR_MeasLogDump " + log_file_name], stdout=subprocess.PIPE, stderr = subprocess.PIPE, executable="/bin/csh", cwd=cwdir, encoding='utf-8')
When I print the output, it's printing out the created shell output and not the actual command that's in the list. I tried getting rid of executable='/bin/csh', but then Popen wouldn't even run.
I also tried using subprocess.communicate(), but it didn't work either. I would also get the shell output and not the actual command run.
I want to completely avoid using shell=True because of security issues.
EDIT: In many different attempts, "hrun" is not being recognized. "hrun" is a Perl script that is being called, DAR_MeasLogDump is the action, and log_file_name is the file that the script will act on. Is there any sort of setup or configuration that needs to be done in order for "hrun" to be recognized?
I think the problem is that Popen requires a list of every part of the command (command + options); the documentation for Popen in subprocess has an example of that. So for that line in your script to work, you would need to write it like this:
output = Popen(["/bin/csh", "hrun", "DAR_MeasLogDump", log_file_name], stdout=subprocess.PIPE, stderr = subprocess.PIPE)
I've removed the executable argument, but I guess it could work that way as well.
Try:
output = Popen(["-c", "hrun DAR_MeasLogDump " + log_file_name], stdout=subprocess.PIPE, stderr=subprocess.PIPE, executable="/bin/csh", cwd=cwdir, encoding='utf-8')
csh is expecting -c "full command here". Without -c I think it just tries to open it as a file.
Specifying an odd shell and an explicit cwd seems completely out of place here (assuming cwdir is defined as the current directory).
If the first argument to subprocess is a list, no shell is involved.
result = subprocess.run(["hrun", "DAR_MeasLogDump", log_file_name],
stdout=subprocess.PIPE, stderr = subprocess.PIPE,
universal_newlines=True, check=True)
output = result.stdout
If you need this to be run under a legacy version of Python, maybe use check_output instead of run.
You generally want to avoid Popen unless you need to do something which the higher-level wrapper functions cannot do.
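For older Pythons, a rough sketch of the check_output variant mentioned above (the log file name here is just a placeholder):
import subprocess

log_file_name = "example.log"  # placeholder; use the real log file name

# check_output raises CalledProcessError on a non-zero exit status (like
# check=True above) and returns the captured stdout; stderr is merged in here.
output = subprocess.check_output(["hrun", "DAR_MeasLogDump", log_file_name],
                                 stderr=subprocess.STDOUT,
                                 universal_newlines=True)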
You are creating a subprocess.Popen instance but never waiting for it or collecting its output.
You should try:
p = Popen(["hrun", "DAR_MeasLogDump", log_file_name], stdout=subprocess.PIPE, stderr=subprocess.PIPE, cwd=cwdir, encoding='utf-8')
out, err = p.communicate() # This will get you output
Args should be passed as a sequence if you do not use shell=True, and then using executable should not be required.
Note that if you are not using advanced features from Popen, the doc recommends using subprocess.run:
from subprocess import run
p = run(["hrun", "DAR_MeasLogDump", log_file_name], capture_output=True, cwd=cwdir, encoding='utf-8')
out, err = p.stdout, p.stderr  # run() already waits, so read the captured output directly
This works with a cat example:
import subprocess
log_file_name='-123.txt'
output = subprocess.Popen(['cat', 'DAR_MeasLogDump' + log_file_name],
stdout=subprocess.PIPE,
stderr=subprocess.STDOUT)
stdout, stderr = output.communicate()
print (stdout)
print (stderr)
I think you only need to change 'cat' to your 'hrun' command.
This looks like the same problem I had at the beginning of a project: it's about Windows environment variables. When you open CMD or PowerShell, it does not recognize perl, java, etc. unless you first go to the folder where the .exe, .py, .java, etc. file is located and open the cmd there.
In my ADB project, once I added the tool's folder to my environment variables, I no longer needed to go to the folder where the .exe, .py, or adb code was located.
Now you can open a CMD anywhere and it will execute the command, even for your Perl script, and the interpreter used by PowerShell will find and recognize the command as well.
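If editing the system PATH is not convenient, one alternative is to pass an augmented PATH only to the child process via the env argument. A sketch, where the directory is hypothetical:
import os
import subprocess

env = os.environ.copy()
# Hypothetical directory that contains the hrun script; adjust for your setup.
env["PATH"] = "/opt/hrun/bin" + os.pathsep + env["PATH"]

result = subprocess.run(["hrun", "DAR_MeasLogDump", "example.log"],
                        stdout=subprocess.PIPE, stderr=subprocess.PIPE,
                        universal_newlines=True, env=env)
print(result.stdout)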

Drush hangs when started from subprocess.Popen

I am trying to pass a pretty long bash command into my Popen command. The command is this -
'/usr/local/bin/drush --alias-path=/data/scripts/drush_aliases #test pml -y | /bin/grep -i dblog | /bin/grep -i enabled'
When passing the whole command in one go, the Popen command doesn't return the correct output under cron. To remedy this, I am trying to split it apart into a list (as seen in command) and pass that in instead, to get around the issue.
In my full code, I'm chaining together several different Popen objects. However, my bug can be reproduced with only the following:
command = ['/usr/local/bin/drush', '--alias-path=/data/scripts/drush_aliases',
           '#test', 'pml', '-y']
try:
    process = subprocess.Popen(command, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    output = process.communicate()
except OSError as err:
    # handle a failure to launch drush
    print(err)
What can cause this hang?
One of the most common reasons for a process to hang is if it's trying to read input from stdin.
You can work around this by explicitly passing a closed pipe (or a handle on /dev/null) on stdin:
process = subprocess.Popen(command,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
stdin=subprocess.PIPE) ## this is new
communicate() will close the pipe passed to stdin after writing any content passed to it as an argument, preventing the process from hanging.
In Python 3.3 or newer, you can also use stdin=subprocess.DEVNULL.
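A minimal sketch of that DEVNULL variant, reusing the command list from the question:
import subprocess

command = ['/usr/local/bin/drush', '--alias-path=/data/scripts/drush_aliases',
           '#test', 'pml', '-y']

# With stdin=DEVNULL any read from stdin sees end-of-file immediately, so the
# child cannot block waiting for terminal input.
process = subprocess.Popen(command,
                           stdout=subprocess.PIPE,
                           stderr=subprocess.PIPE,
                           stdin=subprocess.DEVNULL)
output, errors = process.communicate()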

Resource unavailable when piping subprocess

I am trying to find the path to the MATLAB executable using Python when it is not in PATH. I am using subprocess.Popen to execute locate and grep the result; however, this produces a "Resource temporarily unavailable" error:
locate = subprocess.Popen(['locate', 'matlab'], stdout=subprocess.PIPE)
grep = subprocess.Popen(['grep', '/bin/matlab$'], stdin=locate.stdout, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
result, err = grep.communicate()
MATLAB_PATH = result.decode('UTF-8').split()
The result variable is empty and the err variable is:
b'grep: (standard input): Resource temporarily unavailable\n'
I have tried your code on Linux with Python 3.5.2 and 3.6.1 and it does work:
locate = subprocess.Popen(['locate', 'find'], stdout=subprocess.PIPE)
grep = subprocess.Popen(['grep', '/bin/find$'], stdin=locate.stdout, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
grep.communicate()
(b'/usr/bin/find\n', b'')
For the record: locate find gives 1619 lines.
For completeness I have also tried locate fdafad (gibberish) and it also works.
It does also work when the code is in a script.
edit:
Try using communicate to pass data between the two processes:
locate = subprocess.Popen(['locate', 'find'], stdout=subprocess.PIPE)
stdout, stderr = locate.communicate()
grep = subprocess.Popen(['grep', '/bin/find$'], stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
print(grep.communicate(input=stdout))
NOTE: the second part of the answer was written before the asker updated the question with information about the PATH.
However, there is a much better way to find executables using Python:
from distutils.spawn import find_executable
find_executable('find')
'/usr/bin/find'
If you insist on using shell utilities, why not use something like which? (See the sketch below.)
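On Python 3.3 and later, shutil.which does the same lookup as the shell's which, without relying on distutils:
import shutil

# Returns the absolute path of the executable if it is found on PATH, else None.
matlab_path = shutil.which('matlab')
print(matlab_path)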
Adding just a little bit more information on why this error occurred.
This looks to be a problem with eventlet using "green" threads and non-blocking IO and locate not generating output fast enough. That is, eventlet assumes that the stdout is to be used by python. Eventlet uses non-blocking IO to assist in cooperative threading. This means the file descriptor behind locate.stdout that you pass to grep has already been set to non-blocking. If grep tries to read from stdin when it is empty then you will get that error.
An easy solution would be to do both commands in a single shell (so Python doesn't get to mess with the pipe between the two sub-processes).
eg.
result = subprocess.check_output('locate matlab | grep /bin/matlab$', shell=True).strip()

python issue with popen and mysql

I'm new to Python, and haven't used Linux in years, so I'm not sure where I'm getting tangled up. I'm trying to use Popen to run sql files in MySQL on Ubuntu.
Here is the relevant code:
command = ['mysql', '-uUSER', '-pPWD','-h192.168.1.132', '--database=dbName', '<', './1477597236_foo.sql' ]
print("command is: "+subprocess.list2cmdline(command))
proc = subprocess.Popen(
command, stderr=subprocess.PIPE, stdout=subprocess.PIPE, cwd='.'
)
The output from this is the same as if I had run 'mysql --help'. The puzzling thing to me is that if I take the command output by subprocess.list2cmdline and run it directly, it runs perfectly. Also, if I replace '< file.sql' with '-e select * from foo', it runs. So the '<' and the file are causing my problem. I know WHAT is causing the problem, but nothing I've tried so far has fixed it.
tia, Craig
When a redirection, pipe, or built-in command is present in the command line, shell=True is required. However, in simple cases like this, shell=True is overkill. There's a much cleaner way to avoid shell=True, and it gives better control over the input file:
if the input file doesn't exist, you get an exception before reaching the subprocess, which is easier to handle
the process runs without the shell: better portability & performance
the code:
command = ['mysql', '-uUSER', '-pPWD', '-h192.168.1.132', '--database=dbName']
with open('./1477597236_foo.sql') as input_file:
    proc = subprocess.Popen(
        command, stdin=input_file, stderr=subprocess.PIPE, stdout=subprocess.PIPE)
    output, error = proc.communicate()
(I added the last line, a communicate call: since both stdout and stderr are redirected, it's the simplest way to avoid a deadlock between the two output streams.)
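For Python 3.5+, here is a sketch of the same idea with subprocess.run, which performs the communicate step for you (credentials and file name are the question's placeholders):
import subprocess

command = ['mysql', '-uUSER', '-pPWD', '-h192.168.1.132', '--database=dbName']

with open('./1477597236_foo.sql') as input_file:
    # run() waits for the process and captures both streams into the result.
    result = subprocess.run(command, stdin=input_file,
                            stdout=subprocess.PIPE, stderr=subprocess.PIPE)

print(result.stdout.decode())
print(result.stderr.decode())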
So you need to add shell=True to your Popen call. < is a shell feature, and you can't use shell features without that parameter. Note that with shell=True the command should be passed as a single string rather than a list:
proc = subprocess.Popen(' '.join(command), stderr=subprocess.PIPE, stdout=subprocess.PIPE, cwd='.', shell=True)

Can Powershell read code from stdin?

I'm trying to run a Powershell subprocess from Python. I need to send Powershell code from Python to the child process. I've got this far:
import subprocess
import time
args = ["powershell", "-NoProfile", "-InputFormat None", "-NonInteractive"]
startTime = time.time()
process = subprocess.Popen(args, stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
process.stdin.write("Write-Host 'FINISHED';".encode("utf-8"))
result = ''
while 'FINISHED' not in result:
result += process.stdout.read(32).decode('utf-8')
if time.time() > startTime + 5:
raise TimeoutError(result)
print(result)
This times out, because nothing ever gets written to stdout. I think the Write-Host cmdlet never gets executed. Even the simple bash/Cygwin code echo "Write-Host 'FINISHED';" | powershell doesn't seem to do the job.
For comparison, sending the code block using the -Command flag works correctly.
How can I convince Powershell to run the code which I'm sending to stdin?
There are a couple of things you can consider:
Invoke PowerShell in a mode where you provide it with a script file that it should execute. Write this script file prior to calling the subprocess, and use the -File <FilePath> parameter for PowerShell (cf. the docs); see the sketch after this list.
If you really want to go with the stdin technique, you might be missing a newline character after the command. If this does not help, you might need to send another control character that tells PowerShell that the input EOF is reached; consult the PowerShell docs to find out how to 'terminate' commands on stdin. One thing you definitely need is the -Command - argument: the value of Command can be "-", a string, or a script block, and if the value of Command is "-", the command text is read from standard input. You may also want to look at this little hack: https://stackoverflow.com/a/13877874/145400
If you only want to execute one command, you can simplify your code by using out, err = process.communicate(input).
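A sketch of the first option, writing the code to a script file and running it with -File (the script path here is hypothetical):
import subprocess

# Hypothetical script path; write the PowerShell code out first, then run it.
script_path = r"C:\temp\finished.ps1"
with open(script_path, "w") as f:
    f.write("Write-Host 'FINISHED';\n")

completed = subprocess.run(
    ["powershell", "-NoProfile", "-NonInteractive", "-File", script_path],
    stdout=subprocess.PIPE, stderr=subprocess.PIPE)
print(completed.stdout.decode("utf-8"))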
I had trouble with a similar task, but I was able to solve it.
First my example code:
import subprocess
args = ["powershell.exe", "-Command", r"-"]
process = subprocess.Popen(args, stdin = subprocess.PIPE, stdout = subprocess.PIPE)
process.stdin.write(b"$data = Get-ChildItem C:\\temp\r\n")
process.stdin.write(b"Write-Host 'Finished 1st command'\r\n")
process.stdin.write(b"$data | Export-Clixml -Path c:\\temp\\state.xml\r\n")
process.stdin.write(b"Write-Host 'Finished 2nd command'\r\n")
output = process.communicate()[0]
print(output.decode("utf-8"))
print("done")
The main issue was getting the argument list args right. It is required to start PowerShell with the -Command flag, followed by "-", as indicated by Jan-Philipp.
Another mystery was the end-of-line character that is required to get the stuff executed. \r\n works quite well.
Getting the output of PowerShell as it runs is still an issue, but if you don't care about real-time output, you can collect all of it after the commands have finished by calling
output = process.communicate()[0]
However, the PowerShell process will be terminated afterwards.
