How to pass `echo t |` as input when calling `subprocess.check_output` - Python

I am new to Python and I am stuck on the problem below.
I call an exe using subprocess.check_output from my Python script:
res = subprocess.check_output(["svn.exe", "list", "Https://127.0.0.1:443/svn/Repos"], stderr=subprocess.STDOUT)
When the script is executed from a command prompt, I get a prompt asking for input:
(R)eject, accept (t)emporarily or accept (p)ermanently?
But when I execute the script from a batch file, I don't get this message and the check_output call fails.
Is there any way to pass the input while calling subprocess.check_output, so that I can run the script in batch mode?
Update:
I tried to run svn from a command prompt with the command below,
echo t | svn.exe list Https://127.0.0.1:443/svn/Repos
and I got the output without any user input.
But I could not find a way to pass the echo t | part to subprocess.check_output.
Is there any way to do this?
Thanks

I guess this is not the answer to the question in the title, but did you try running svn with --non-interactive --trust-server-cert? Isn't that what you want?
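For example (a sketch based on the command from the question; adjust the flags and URL to your setup):
import subprocess

# Sketch: run svn non-interactively and accept the server certificate,
# so the (R)eject/(t)emporarily/(p)ermanently prompt never appears.
res = subprocess.check_output(
    ["svn.exe", "--non-interactive", "--trust-server-cert",
     "list", "Https://127.0.0.1:443/svn/Repos"],
    stderr=subprocess.STDOUT)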

The message you're getting looks related to certificates to me. So rather than adding the complication of passing input to the command, either fix the certificate problem or pass the --trust-server-cert flag to the SVN command:
res = subprocess.check_output(["svn.exe", "--trust-server-cert", "list", "url"], stderr=subprocess.STDOUT)

I found two solutions to my problem:
1. I found this solution at this link: http://desipenguin.com/techblog/2009/01/13/fun-with-python-subprocessstdin/
proc = subprocess.Popen(["svn.exe", "list", "Https://127.0.0.1:443/svn/Repos"], stdin=subprocess.PIPE, stdout=subprocess.PIPE)
print proc.communicate('t\n')[0]
This solved passing the key input 't' to the process, but I was not able to read the output, so I followed the second solution.
2. Using two subprocesses:
p = subprocess.Popen("echo t", shell=True, stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
p1 = subprocess.Popen(["svn.exe", "list", "Https://127.0.0.1:443/svn/Repos"], shell=True, stdin=p.stdout, stdout=subprocess.PIPE, stderr=subprocess.PIPE).communicate()
output = p1[0]
Here the output of process 1 is given as the input of process 2, so this is equivalent to echo t | svn.exe list Https://127.0.0.1:443/svn/Repos.
Thanks.
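For reference, on Python 3.4+ the prompt answer can also be supplied in a single call via the input argument of check_output (a minimal sketch, reusing the URL from the question):
import subprocess

# Sketch: feed "t" (accept temporarily) to svn's stdin in one call.
res = subprocess.check_output(
    ["svn.exe", "list", "Https://127.0.0.1:443/svn/Repos"],
    input=b"t\n",
    stderr=subprocess.STDOUT)
print(res)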

Related

How can I pass input to adb shell using a Python script?

I am facing a problem: I have to pass input after I run the command adb shell libtest_ip through Python:
import subprocess
command = 'adb shell libtest_ip'
p = subprocess.Popen(command, shell=True,
                     stdout=subprocess.PIPE, stderr=subprocess.PIPE)
After this I have to pass input like 1 or en_us, but as soon as the command to run the binary (libtest_ip is a binary) is executed, it gets stuck.
Please help me if anyone has an idea how to solve this.
I think your best bet is pexpect.
In particular, you can take a look at script.py, which would help you create the interactive script.
Basically, you should end up with something like this:
...
self.child.expect('Whatever')
self.child.sendline('1')
self.child.expect('Whatever 2')
self.child.sendline('en-us')
...
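Fleshed out a bit (a sketch; the prompt strings and the responses 1 / en-us are taken from the question and must be adjusted to whatever libtest_ip actually prints):
import pexpect

# Sketch: drive the interactive adb session with pexpect.
child = pexpect.spawn('adb shell libtest_ip')
child.expect('Whatever')      # replace with the real first prompt
child.sendline('1')
child.expect('Whatever 2')    # replace with the real second prompt
child.sendline('en-us')
child.expect(pexpect.EOF)
print(child.before.decode())  # output produced after the last response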
Update:
Your example should work; try
#! /usr/bin/env python3
import pexpect
print(pexpect.run('/bin/echo hello'))
and running it should output
% ./test-pexpect.py
b'hello\r\n'

Python subprocess.Popen() not running command

I'm trying to use subprocess.Popen() to run a command in my script. The code is:
output = Popen(["hrun DAR_MeasLogDump " + log_file_name], stdout=subprocess.PIPE, stderr = subprocess.PIPE, executable="/bin/csh", cwd=cwdir, encoding='utf-8')
When I print the output, it prints out the shell's own output and not the output of the command in the list. I tried getting rid of executable="/bin/csh", but then Popen wouldn't run at all.
I also tried using subprocess.communicate(), but it didn't work either; I would still get the shell output and not the output of the command.
I want to completely avoid using shell=True because of security issues.
EDIT: In many different attempts, "hrun" is not being recognized. "hrun" is a Perl script that is being called, DAR_MeasLogDump is the action, and log_file_name is the file that the script will perform its action on. Is there any sort of setup or configuration needed for "hrun" to be recognized?
I think the problem is that Popen requires a list of every part of the command (command + options); the documentation for Popen inside subprocess has an example of that. So for that line in your script to work, you would need to write it like this:
output = Popen(["/bin/csh", "hrun", "DAR_MeasLogDump", log_file_name], stdout=subprocess.PIPE, stderr = subprocess.PIPE)
I've removed the executable argument, but I guess it could work that way as well.
Try:
output = Popen(["-c", "hrun DAR_MeasLogDump " +log_file_name], stdout=subprocess.PIPE, stderr = subprocess.PIPE, executable="/bin/csh", cwd=cwdir, encoding='utf-8')
csh is expecting -c "full command here". Without -c I think it just tries to open it as a file.
Specifying an odd shell and an explicit cwd seems completely out of place here (assuming cwdir is defined to the current directory).
If the first argument to subprocess is a list, no shell is involved.
result = subprocess.run(["hrun", "DAR_MeasLogDump", log_file_name],
                        stdout=subprocess.PIPE, stderr=subprocess.PIPE,
                        universal_newlines=True, check=True)
output = result.stdout
If you need this to be run under a legacy version of Python, maybe use check_output instead of run.
You generally want to avoid Popen unless you need to do something which the higher-level wrapper functions cannot do.
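For older Pythons, that would look roughly like this (a sketch; check_output raises CalledProcessError on a non-zero exit, much like check=True above):
import subprocess

# Sketch: legacy-friendly variant of the run() call above.
output = subprocess.check_output(["hrun", "DAR_MeasLogDump", log_file_name],
                                 stderr=subprocess.STDOUT,
                                 universal_newlines=True)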
You are creating an instance of subprocess.Popen but never reading its output.
You should try:
p = Popen(["hrun", "DAR_MeasLogDump ", log_file_name], stdout=subprocess.PIPE, stderr = subprocess.PIPE, cwd=cwdir, encoding='utf-8')
out, err = p.communicate() # This will get you output
Args should be passed as a sequence if you do not use shell=True, and then using executable should not be required.
Note that if you are not using advanced features from Popen, the doc recommends using subprocess.run:
from subprocess import run
p = run(["hrun", "DAR_MeasLogDump", log_file_name], capture_output=True, cwd=cwdir, encoding='utf-8')
out, err = p.stdout, p.stderr  # run() has already collected the output
This works with a cat example:
import subprocess
log_file_name='-123.txt'
output = subprocess.Popen(['cat', 'DAR_MeasLogDump' + log_file_name],
                          stdout=subprocess.PIPE,
                          stderr=subprocess.STDOUT)
stdout, stderr = output.communicate()
print(stdout)
print(stderr)
I think you only need to change it to your 'hrun' command.
This looks like the same problem I had at the beginning of a project with Windows environment variables. When you open CMD or PowerShell, it does not recognize perl, java, etc. unless you are in the folder where the corresponding .exe, .py, .java, etc. is located.
In my ADB project, once I added the tool to my environment variables, I no longer needed to go to the folder where the executable was located.
Now I can open a CMD and it will execute the command from anywhere, so the interpreter that PowerShell uses will find and recognize the command.
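In Python terms, the equivalent workarounds are to call the script by its full path or to extend PATH for the child process (a sketch; the paths are hypothetical):
import os
import subprocess

# Sketch: either give Popen the full path to the script (hrun is a Perl script)...
p = subprocess.Popen(["perl", "/full/path/to/hrun", "DAR_MeasLogDump", log_file_name],
                     stdout=subprocess.PIPE, stderr=subprocess.PIPE)

# ...or extend PATH for the child so a bare "hrun" can be found.
env = dict(os.environ, PATH=os.environ["PATH"] + os.pathsep + "/full/path/to")
p = subprocess.Popen(["hrun", "DAR_MeasLogDump", log_file_name],
                     stdout=subprocess.PIPE, stderr=subprocess.PIPE, env=env)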

Drush hangs when started from subprocess.Popen

I am trying to pass a pretty long bash command into my Popen command. The command is this -
'/usr/local/bin/drush --alias-path=/data/scripts/drush_aliases #test pml -y | /bin/grep -i dblog | /bin/grep -i enabled'
When passing the whole command in one go, the Popen call doesn't return the correct output under cron. In order to remedy this, I am trying to split it apart into a list (as seen in "command") and pass it in, to get around the issue.
In my full code, I'm chaining together several different Popen objects. However, my bug can be reproduced with only the following:
command = ['/usr/local/bin/drush', '--alias-path=/data/scripts/drush_aliases', '#test', 'pml', '-y']
try:
    process = subprocess.Popen(command, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    output = process.communicate()
What can cause this hang?
One of the most common reasons for a process to hang is if it's trying to read input from stdin.
You can work around this by explicitly passing a closed pipe (or a handle on /dev/null) on stdin:
process = subprocess.Popen(command,
                           stdout=subprocess.PIPE,
                           stderr=subprocess.PIPE,
                           stdin=subprocess.PIPE)  # this is new
communicate() will close the pipe passed to stdin after writing any content passed to it as an argument, preventing the process from hanging.
In Python 3.3 or newer, you can also use stdin=subprocess.DEVNULL.
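That variant looks like this (a sketch; subprocess.DEVNULL requires Python 3.3+):
process = subprocess.Popen(command,
                           stdout=subprocess.PIPE,
                           stderr=subprocess.PIPE,
                           stdin=subprocess.DEVNULL)  # drush immediately sees EOF on stdin
output, errors = process.communicate()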

Can Powershell read code from stdin?

I'm trying to run a Powershell subprocess from Python. I need to send Powershell code from Python to the child process. I've got this far:
import subprocess
import time
args = ["powershell", "-NoProfile", "-InputFormat None", "-NonInteractive"]
startTime = time.time()
process = subprocess.Popen(args, stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
process.stdin.write("Write-Host 'FINISHED';".encode("utf-8"))
result = ''
while 'FINISHED' not in result:
    result += process.stdout.read(32).decode('utf-8')
    if time.time() > startTime + 5:
        raise TimeoutError(result)
print(result)
This times out, because nothing ever gets written to stdout. I think the Write-Host cmdlet never gets executed. Even the simple bash/Cygwin code echo "Write-Host 'FINISHED';" | powershell doesn't seem to do the job.
For comparison, sending the code block using the -Command flag works correctly.
How can I convince Powershell to run the code which I'm sending to stdin?
There are a couple of things you can consider:
Invoke PowerShell in a mode where you provide it with a script file which it should execute. Write this script file prior to calling the subprocess and use the -File <FilePath> parameter for PowerShell (cf. the docs); see the sketch after this list.
If you really want to go with the stdin technique, you might be missing a newline character after the command. If that does not help, you might need to send another control character that tells PowerShell that input EOF is reached; consult the PowerShell docs on how to 'terminate' commands on stdin. One thing you definitely need is the -Command - argument: the value of -Command can be "-", a string, or a script block, and if the value is "-", the command text is read from standard input. You may also want to look at this little hack: https://stackoverflow.com/a/13877874/145400
If you only want to execute one command, you can simplify your code by using out, err = process.communicate(input)
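A sketch of the first option (the temporary script file and its contents are only an illustration):
import subprocess
import tempfile

# Sketch: write the PowerShell code to a .ps1 file and run it with -File.
with tempfile.NamedTemporaryFile("w", suffix=".ps1", delete=False) as f:
    f.write("Write-Host 'FINISHED';\n")
    script_path = f.name

result = subprocess.run(
    ["powershell", "-NoProfile", "-NonInteractive",
     "-ExecutionPolicy", "Bypass",   # may be needed if script execution is restricted
     "-File", script_path],
    stdout=subprocess.PIPE, stderr=subprocess.PIPE)
print(result.stdout.decode("utf-8"))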
I had trouble with a similar task, but I was able to solve it.
First my example code:
import subprocess
args = ["powershell.exe", "-Command", r"-"]
process = subprocess.Popen(args, stdin = subprocess.PIPE, stdout = subprocess.PIPE)
process.stdin.write(b"$data = Get-ChildItem C:\\temp\r\n")
process.stdin.write(b"Write-Host 'Finished 1st command'\r\n")
process.stdin.write(b"$data | Export-Clixml -Path c:\\temp\state.xml\r\n")
process.stdin.write(b"Write-Host 'Finished 2nd command'\r\n")
output = process.communicate()[0]
print(output.decode("utf-8"))
print("done")
The main issue was the correct argument list args. It is required to start PowerShell with the -Command flag, followed by "-", as indicated by Jan-Philipp.
Another mystery was the end-of-line character that is required to get the stuff executed. \r\n works quite well.
Getting the output of PowerShell is still an issue, but if you don't care about real time, you can collect the output after all commands have finished by calling
output = process.communicate()[0]
However, the active PowerShell process will be terminated afterwards.
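If you do need output while PowerShell is still running, a common alternative to communicate() is to close stdin once everything has been written and then read stdout line by line (a sketch, replacing the communicate() call above):
process.stdin.close()        # sends EOF, so PowerShell starts executing the piped commands
for line in process.stdout:  # read output as it becomes available
    print(line.decode("utf-8"), end="")
process.wait()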

Python - pipelining subprocess in Windows

I'm using Windows 7, and I've tried this under Python 2.6.6 and Python 3.2.
So I'm trying to call this command line from Python:
netstat -ano | find ":80"
under Windows cmd, this line works perfectly fine.
So,
1st attempt:
output = subprocess.Popen(
    [r'netstat -ano | find ":80"'],
    stdout=subprocess.PIPE,
    shell=True
).communicate()
An error is raised indicating that 'find' did not receive a correct parameter:
Access denied - \
2nd attempt:
# calling netstat
cmd_netstat = subprocess.Popen(
    ['netstat', '-ano'],
    stdout=subprocess.PIPE
)
# pipelining netstat result into find
cmd_find = subprocess.Popen(
    ['find', '":80"'],
    stdin=cmd_netstat.stdout,
    stdout=subprocess.PIPE
)
Again, the same error is raised.
Access denied - \
What did I do wrong? :(
EDIT:
3rd attempt (as Pavel Repin suggested):
cmd_netstat = subprocess.Popen(
    ['cmd.exe', '-c', 'netstat -ano | find ":80"'],
    stdout=subprocess.PIPE
).communicate()
Unfortunately, subprocess with ['cmd.exe', '-c'] results in something resembling a deadlock or a blank cmd window. I assume '-c' is ignored by cmd, resulting in communicate() waiting indefinitely for cmd to terminate. Since this is Windows, my best bet is that cmd only accepts parameters starting with a slash (/). So I substituted '-c' with '/c':
cmd_netstat = subprocess.Popen(
    ['cmd.exe', '/c', 'netstat -ano | find ":80"'],
    stdout=subprocess.PIPE
).communicate()
And...back to the same error:
Access denied - \
EDIT:
I gave up, I'll just process the string returned by 'netstat -ano' in Python. Might this be a bug?
What I suggest is that you do as much as possible inside Python code. So, you can execute the following command:
# executing the command
import subprocess
output = subprocess.Popen(['netstat', '-ano'], stdout=subprocess.PIPE).communicate()
and then parse the output:
# filtering the output
valid_lines = [ line for line in output[0].split('\r\n') if ':80' in line ]
You will get a list of lines. On my computer, the output looks like this for port number 1900 (no HTTP connection was active):
[' UDP 127.0.0.1:1900 *:* 1388', ' UDP 192.xxx.xxx.233:1900 *:* 1388']
In my opinion, this is easier to work with.
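Putting the two steps together as a small helper (a sketch; the function name and default port are just an illustration):
import subprocess

def netstat_lines(port=80):
    """Return the lines of 'netstat -ano' that mention the given port."""
    out = subprocess.Popen(['netstat', '-ano'],
                           stdout=subprocess.PIPE).communicate()[0]
    needle = ':%d' % port
    return [line for line in out.decode().split('\r\n') if needle in line]

print(netstat_lines(80))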
Note that:
the shell=True option is not mandatory, but it briefly opens and closes a command-line window; see what suits you most, and beware of command injection;
the Popen argument list should be a list of strings; quoting the individual parts is not necessary, subprocess takes care of it for you.
Hope this helps.
EDIT: oops, I missed the last line of the edit. Seems you've already got the idea on your own.
So I revisited this question and found two solutions (I switched to Python 2.7 some time ago, so I'm not sure about Python 2.6, but it should be the same):
Replace find with findstr, and remove the double quotes
output = subprocess.Popen(['netstat', '-ano', '|', 'findstr', ':80'],
                          stdout=subprocess.PIPE,
                          shell=True).communicate()
But this doesn't explain why "find" cannot be used, so:
Use a string parameter instead of a list
output = subprocess.Popen('netstat -ano | find ":80"',
                          stdout=subprocess.PIPE,
                          shell=True).communicate()
or
pipeout = subprocess.Popen(['netstat', '-ano'],
                           stdout=subprocess.PIPE)
output = subprocess.Popen('find ":80"',
                          stdin=pipeout.stdout,
                          stdout=subprocess.PIPE).communicate()
The problem arises from the fact that ['find', '":80"'] is actually translated into find \":80\" on the command line.
Thus the following command is executed in the Windows command shell:
>find \":80\"
Access denied - \
Proof:
Running:
output = subprocess.Popen(['echo', 'find', '":80"'],
                          stdout=subprocess.PIPE,
                          shell=True).communicate()
print output[0]
returns:
find \":80\"
Running:
output = subprocess.Popen('echo find ":80"',
                          stdout=subprocess.PIPE,
                          shell=True).communicate()
print output[0]
returns:
find ":80"
New answer, after reading this old question again: this may be due to the following two facts:
The pipe operator executes the commands that follow it in a sub-shell; this has some interesting consequences.
Python itself uses the pipe as a way to get the results back:
Note that (...) to get anything other than None in the result tuple, you need to give stdout=PIPE and/or stderr=PIPE too.
Not sure if this 'conflict' is kind of a bug, or a design choice though.
