Executing script inside python subprocess and getting the output - python

I am trying to execute a script with some arguments and capture its output in a buffer for further processing. The original code snippet, which currently works, is below:
cmd = "/pathToScript/myscript.sh --endpoint "+ ip_addr + " --register '"+ reg_token + "' --output-file " + pathToOutputFile
os.system(cmd)
The above command runs and creates the output file, where it writes the data generated by the script.
Now I want to run the same script, but instead of writing to a file, I expect the output to come back in a buffer as the return value.
I rewrote the code as below, but it does not capture any data into the buffer.
creds = subprocess.run(
    ["/pathToScript/myscript.sh", "--endpoint {0}".format(endpoint_ipaddr), "--register {0}".format(reg_token)],
    check=True,
    stdout=subprocess.PIPE,
    universal_newlines=True
).stdout
print("creds")
I tried both subprocess.run() and subprocess.call(), but neither returned anything. Is my modified code correct with respect to executing the script and collecting its output into the returned variable?
Appreciate your help on this.
thank you
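In case it helps, here is a minimal sketch of the usual fix, assuming myscript.sh prints the credentials to stdout when --output-file is omitted: pass each flag and its value as separate list elements, and print the variable creds rather than the literal string "creds".

import subprocess

# Sketch only: endpoint_ipaddr and reg_token are the variables from the question.
creds = subprocess.run(
    ["/pathToScript/myscript.sh",
     "--endpoint", endpoint_ipaddr,   # flag and value as separate items
     "--register", reg_token],
    check=True,
    stdout=subprocess.PIPE,
    universal_newlines=True,
).stdout
print(creds)  # print the variable, not the string "creds"

If the script only ever writes to the file named by --output-file, no subprocess option will capture that from stdout; in that case read the file back after the run instead.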

Related

Running a .bat file in python that requires a user input

I am trying to get output from a .bat file in a Python script. The code works fine if I hard-code a variable value in the .bat file, but I want that value to be dynamic.
This is the code I am using to execute the external file:
command = 'C:/this/this.bat'
p = subprocess.Popen(command, universal_newlines=True,
                     shell=True, stdout=subprocess.PIPE,
                     stderr=subprocess.PIPE)
text = p.stdout.read()
retcode = p.wait()
The file this.bat requires a user input, but I am not sure how to provide it from inside the Python script, e.g. from a variable. Thanks for the help.
I've dealt with something similar. The way I solved it was to prompt the user for values in Python, then pass those values into the .bat file using subprocess.
command = [shutil.which('C:/this/this.bat'), sys.argv[1], sys.argv[2], sys.argv[3], sys.argv[4], sys.argv[5]]
subprocess.Popen(command).wait()
This assumes you can change the .bat file to take parameters instead of prompts.
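If the .bat file cannot be changed and still prompts on standard input, another option (a sketch, assuming the prompt actually reads from stdin) is to feed the value through communicate():

import subprocess

command = 'C:/this/this.bat'
user_value = 'some dynamic value'  # hypothetical value collected in Python

p = subprocess.Popen(command, universal_newlines=True, shell=True,
                     stdin=subprocess.PIPE, stdout=subprocess.PIPE,
                     stderr=subprocess.PIPE)
# communicate() writes the value to the prompt and collects all output
text, errors = p.communicate(input=user_value + '\n')
retcode = p.returncode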

subprocess.Popen or proc.stdout.read() corrupting JSON data

I am running a PHP script through Python Django as it contains legacy code for a client.
Data is passed through to the PHP script via JSON, and after the script runs a string is returned for display, like so:
proc = subprocess.Popen(["php -f script.php " + json.dumps(data_for_php, sort_keys=True)], shell=True, stdout=subprocess.PIPE)
script_response = proc.stdout.read()
return HttpResponse(script_response)
The issue I am having is that something in this process is corrupting the data.
e.g. one JSON field from data_for_php with the key and value 'xxx_amount': u'$350,000.00' comes back with ,000.00 as the value in script_response.
It's not doing this for anything else.
I have been doing a bit of debugging and have determined that json.dumps(data_for_php, sort_keys=True) is not causing the issue, also data_for_php is good too.
It leads me to believe that proc.stdout.read() is somehow turning $350 into a blank.
Note: same thing is occurring for other dictionary values.
Update
I have been led to believe that what I am running is effectively a command-line script inside Python. When the command is called, the JSON data is being passed on that command line. This is probably the issue. Looking for a solution.
$350 in bash is a variable; in the shell it gets replaced by its value, which is not defined. Adding single quotes around the dump should do the trick and stop special characters from being interpreted:
proc = subprocess.Popen(["php -f script.php '" + json.dumps(data_for_php, sort_keys=True) + "'"], shell=True, stdout=subprocess.PIPE)
script_response = proc.stdout.read()
return HttpResponse(script_response)
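An alternative that sidesteps shell quoting entirely (a sketch, not part of the original answer) is to drop shell=True and pass the JSON dump as its own list element, so the shell never gets a chance to expand $350:

import json
import subprocess

proc = subprocess.Popen(
    ["php", "-f", "script.php", json.dumps(data_for_php, sort_keys=True)],
    stdout=subprocess.PIPE)
script_response = proc.stdout.read()

The rest of the view (return HttpResponse(script_response)) stays the same.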

Python file output adding weird characters

Turns out it is an error with my C program. I changed my printf to only print a preset string and redirected it to a file and the extra characters were still there. I still don't know why though.
Hi, I'm writing a Python script to run analysis on a C program I'm parallelizing. Right now I have the number of processors used and the iterations I want to pass to my C program in a separate file called tests. I'm extremely new to Python; here's the sample code I wrote to figure out how to write results to a file, which will eventually be a .csv file.
#!/usr/bin/env python
import subprocess

mpiProcess = "runmpi"
piProcess = "picalc"

tests = open("tests.txt")
analysis = open("analysis.txt", "w")

def runPiCalc(numProcs, numIterations):
    numProcs = str(numProcs)
    numIterations = str(numIterations)
    args = (mpiProcess, piProcess, numProcs, numIterations)
    popen = subprocess.Popen(args, stdout=subprocess.PIPE)
    popen.wait()
    output = popen.stdout.read()
    return output

def runTest(testArgs):
    testProcs = testArgs[0]
    testIterations = testArgs[1]
    output = runPiCalc(testProcs, testIterations)
    appendResults(output)

def appendResults(results):
    print results
    analysis.write(results + '\n')

for testLine in tests:
    testArgs = testLine.split()
    runTest(testArgs)

tests.close()
analysis.close()
My problem right now is that when I "print results" to stdout the output comes out as expected and I get 3.14blablablablawhatever. When I check the analysis.txt file, though, I get [H[2J (preceded by ESC characters that don't show up on the web) at the start of every line before my pi calculation shows up. I can't figure out why that is. Why would file.write produce different output than print? Again, this is my first time with Python, so I'm probably just missing something easy.
This is on an Ubuntu server I'm SSHing into, by the way.
Here's the tests.txt and a picture of how the characters look on linux
The problem was that I had a bash script executing my C program. The bash script was inserting the weird characters before the program output on its standard output. Putting the command directly inside the Python script instead of calling the bash script fixed the problem.
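For what it's worth, [H[2J preceded by ESC is the ANSI "clear screen" sequence, which fits the bash-wrapper explanation. A sketch of calling the MPI launcher directly from the Python script instead of the wrapper (the exact mpirun arguments are an assumption, since the wrapper isn't shown):

import subprocess

def runPiCalc(numProcs, numIterations):
    # Call mpirun directly instead of the "runmpi" wrapper script,
    # so nothing clears the screen before the program's output.
    args = ("mpirun", "-np", str(numProcs), "./picalc", str(numIterations))
    popen = subprocess.Popen(args, stdout=subprocess.PIPE)
    output, _ = popen.communicate()  # also avoids the wait()/read() deadlock risk
    return output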

Splitting a large file in python by calling a command line function

I am trying to split a file into a number of parts via a python script:
Here is my snippet:
def bashCommandFunc(commandToRun):
    process = subprocess.Popen(commandToRun.split(), stdout=subprocess.PIPE)
    output = process.communicate()
    return output

filepath = "/Users/user/Desktop/TempDel/part-00000"
numParts = "5"
splitCommand = "split -l$((`wc -l < " + filepath + "`/" + numParts + ")) " + filepath
The resulting splitCommand:
'split -l$((`wc -l < /Users/user/Desktop/TempDel/part-00000`/5)) /Users/user/Desktop/TempDel/part-00000'
If I run this command on a terminal, it splits the file as it's supposed to, but it fails for the above defined subprocess function.
I have tested the function for other generic commands and it works fine.
I believe the character " ` " (backtick) might be the issue.
What is the workaround to get this command to work?
Are there better ways to split a file into "n" parts from Python?
Thanks
You'll have to let Python run this line via a full shell, rather than trying to run it as a command. You can do that by adding the shell=True option and not splitting your command. But you really shouldn't do that if any part of the command may be influenced by users (huge security risk).
You could do this in a safer way by first calling wc, getting the result and then calling split. Or even implement the whole thing in pure Python instead of calling out to other commands.
What happens now is that you're calling split with the first parameter -l$((`wc, the second parameter -l, and so on.
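A sketch of that safer approach: count the lines in Python, then call split without a shell (paths are the ones from the question):

import subprocess

filepath = "/Users/user/Desktop/TempDel/part-00000"
num_parts = 5

# Count lines without spawning `wc`
with open(filepath) as f:
    total_lines = sum(1 for _ in f)

# Ceiling division so we get at most num_parts files
# (the shell version floors, which can produce one extra file)
lines_per_part = -(-total_lines // num_parts)

subprocess.check_call(["split", "-l", str(lines_per_part), filepath])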

pOpen with python

This is baffling me.
I have a python script that does some work on a Windows platform to generate an XML file, download some images and then call an external console application to generate a video using the xml and images the python script has generated.
The application I call with Popen is supposed to return a status, i.e. [success], [invalid] or [fail], depending on how it interprets the data I pass it.
If I use my generation script to generate the information and then call the console application separately from another script, it works fine: I get [success] and a video is generated.
Successful code (please ignore the test prints!):
print ("running console app...")
cmd = '"C:/Program Files/PropertyVideos/propertyvideos.console.exe" -xml data/feed2820712.xml -mpeg -q normal'
print (cmd)
p = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
output = p.communicate()[0]
print ("\n----\n[" + output + "]\n----\n")
if output == "[success]":
    print "\nHURRAHHHHH!!!!!!!"
print ("finished...")
But if I include the same code at the end of the script that generates the info to feed the console application, it runs for about 2 seconds and output is [].
Same code, just run at the end of a different script...
EDIT:
Thanks to Dave, Dgrant and ThomasK, it seems that the generation script is not closing the file, as redirecting stderr to stdout shows:
Unhandled Exception: System.IO.IOException: The process cannot access the file 'C:\videos\data\feed2820712.xml' because it is being used by another process.
   at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
   at System.IO.FileStream.Init(String path, FileMode mode, FileAccess access, Int32 rights, Boolean useRights, FileShare share, Int32 bufferSize, FileOptions options, SECURITY_ATTRIBUTES secAttrs, String msgPath, Boolean bFromProxy)
However I AM closing the file:
Extract from the generation script:
xmlfileObj.write('</FeedSettings>\n') # write the last line
xmlfileObj.close
# sleep to allow files to close
time.sleep(10)
# NOW GENERATE THE VIDEO
print ("initialising video...")
cmd = '"C:/Program Files/PropertyVideos/propertyvideos.console.exe" -xml data/feed2820712.xml -mpeg -q normal'
print (cmd)
p = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
output = p.communicate()[0]
print ("\n----\n[" + output + "]\n----\n")
if output == "[success]":
    print "\nHURRAHHHHH!!!!!!!"
print ("finished...")
Any help would be appreciated.
You're not closing the file. Your code says:
xmlfileObj.close
when it should be
xmlfileObj.close()
Edit: Just to clarify - the code xmlfileObj.close is a valid Python expression which returns a reference to the built-in close method of a file (or file-like) object. Since it is a valid Python expression it is perfectly legal code, but it has no side effects. Specifically, it does not actually call the close() method. You need to include the parentheses to do that.
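A related sketch (not from the original answer): using a with block makes it impossible to forget the close() call, because the file is closed as soon as the block exits, before the console application is launched. The path here assumes the file being written is the one passed via -xml.

import subprocess

with open('data/feed2820712.xml', 'w') as xmlfileObj:
    # ... write the rest of the feed ...
    xmlfileObj.write('</FeedSettings>\n')  # write the last line
# Leaving the with block closes the file, so no sleep is needed

cmd = '"C:/Program Files/PropertyVideos/propertyvideos.console.exe" -xml data/feed2820712.xml -mpeg -q normal'
p = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
output = p.communicate()[0]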
