Capture output of bash code run in python

I'm trying to capture the output of a bash line I'm executing in my python script using the following code:
call_bash = subprocess.Popen(['/space/jazz/1/users/gwarner/test.sh'],stdout=subprocess.PIPE)
output = call_bash.communicate()[0]
This works fine, but I'd like to change it so that the bash code is written directly into the Python script (eliminating the need for a separate .sh file). I know that I can run bash code from Python with subprocess.call(), but I can't figure out how to capture the output (what normally gets printed to the terminal, in case I'm using the wrong terminology). Can anyone help?
I'm running Python 2.7 on CentOS.

The first argument to subprocess.Popen is a list of arguments (unless you specify shell=True).
So you could do your example like
call_bash = subprocess.Popen(['ssh', 'me@cli.globusonline.org', 'scp me@comp:/~/blah.text ep@comp:/register/'], stdout=subprocess.PIPE)
output = call_bash.communicate()[0]
This will invoke the ssh command with two arguments: me@cli.globusonline.org and scp me@comp:/~/blah.text ep@comp:/register/.
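For the inline case the question actually asks about, one option (a minimal sketch; the bash snippet here is purely illustrative) is to pass a single string with shell=True, which hands it to /bin/sh (pass executable='/bin/bash' if you need bash-specific syntax):

```python
import subprocess

# With shell=True the command is a single string run by the shell, so the
# bash code can live directly in the Python source instead of a .sh file.
bash_code = 'echo hello; echo world'
call_bash = subprocess.Popen(bash_code, shell=True, stdout=subprocess.PIPE)
output = call_bash.communicate()[0]
print(output.decode())
```

This works the same way on Python 2.7; communicate() returns the captured stdout once the shell exits.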

Related

Using pyinstaller to convert Python3 and Python2 to .exe

I am currently writing a script, on Python 3.
I need to call an already existing Python 2 script, from within my Python 3 script.
In the Python3 script I am doing this using something like:
import subprocess
my_command = 'python script_in_py2.py arg1 arg2'
process = subprocess.Popen(my_command.split(), stdout = subprocess.PIPE)
output, error = process.communicate()
Thus, the variable output will hold what would have been displayed on screen if I had just typed my_command in a terminal.
Although I am currently developing in Ubuntu, eventually I will migrate it to Windows where I will need to convert it to an .exe. For this I have previously used pyinstaller.
I know that running the resulting .exe (of a purely Python3 script) on a Windows machine without Python installed at all works.
However, I need to know before going on whether it will also work like this. My intuition says no (to my knowledge, creating a process like that is similar to running the command directly in the Terminal/Command Prompt), but I would like to know if anybody knows of a way or workaround. (Installing Python 2 on the machine that will run the script is not an option.)
A possible solution could be like this:
Convert your Python 2 script, i.e. 'script_in_py2.py', into an executable, i.e. 'script_in_py2.exe', using pyinstaller:
pyinstaller script_in_py2.py
In your Python 3 script, call this Python 2 executable instead of calling the Python 2 script.
import subprocess
my_command = 'script_in_py2.exe arg1 arg2'
process = subprocess.Popen(my_command.split(), stdout = subprocess.PIPE)
output, error = process.communicate()
Now convert your Python 3 script into an executable using pyinstaller:
pyinstaller script_in_py3.py
Paste the executable 'script_in_py2.exe' and 'python27.dll' into the dist folder of 'script_in_py3' (i.e. next to the executable 'script_in_py3.exe').
Run "script_in_py3.exe" and you will get the desired output.
I have tried this on my system and it works smoothly.
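One practical wrinkle with the steps above: when the Python 3 script is frozen, the relative name 'script_in_py2.exe' is resolved against the current working directory, not the dist folder. A small sketch of locating the sibling executable robustly (bundled_path is a hypothetical helper; the frozen attribute is set by PyInstaller on frozen programs):

```python
import os
import subprocess
import sys

def bundled_path(name):
    # When frozen by PyInstaller, sys.executable is script_in_py3.exe, and
    # the sibling script_in_py2.exe sits next to it in the dist folder;
    # during development, fall back to the directory the script started from.
    if getattr(sys, 'frozen', False):
        base = os.path.dirname(sys.executable)
    else:
        base = os.path.dirname(os.path.abspath(sys.argv[0]))
    return os.path.join(base, name)

exe = bundled_path('script_in_py2.exe')
if os.path.exists(exe):
    # Call the bundled Python 2 executable instead of 'python script_in_py2.py'.
    process = subprocess.Popen([exe, 'arg1', 'arg2'], stdout=subprocess.PIPE)
    output, error = process.communicate()
```

The os.path.exists guard just makes the sketch a no-op where the .exe has not been built yet.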

How to send value from python script to console and exit

My Python script generates the command that the user needs to run in the same console. My scenario is that the user runs a script and, as a result, sees the command that must be run. Is there any way to exit the Python script and send that command to the console, so the user does not need to copy/paste it?
A solution would be to have your Python script (let's call it script.py) just print the command: print('ls -l'), and use it in a terminal like so: $(python3 script.py). This makes bash run the output of your script as a command, and would effectively run ls -l in the terminal.
You can even go a step further and create an alias in ~/.bashrc so that you no longer need to type the whole line. You can add at the end of the file something like alias printls='$(python3 /path/to/script.py)' (the single quotes matter: they defer the command substitution until the alias is used, rather than running the script when ~/.bashrc is sourced). After starting a new terminal, you can type printls and the script's output will run.
A drawback of this method is that you have no proper way of handling exceptions or errors in your code, since everything it prints will be run as a command. One way (though ugly) would be to print('echo "An error occurred!"') so that the user who runs the command can see that something malfunctioned.
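Put together, script.py might look like this sketch (the command contents and error handling are purely illustrative):

```python
# script.py: prints a command for the calling shell to run via $(python3 script.py)
import sys

try:
    # Hypothetical logic that decides which command the user should run.
    command = 'ls -l'
    print(command)
except Exception as exc:
    # Failures must also surface as a runnable command (here, an echo),
    # because everything this script prints gets executed by the shell.
    print('echo "An error occurred: {}"'.format(exc))
    sys.exit(1)
```

Whatever reaches stdout is what the shell will execute, so the script should print nothing else.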
However, I'd suggest going for the "traditional" way and running the command directly from python. Here's a link to how you can achieve this: Calling an external command in Python.
Python can run system commands in new subshells. The proper way of doing this is via the subprocess module, but for simple tasks it's easier to just use os.system. Example Python script (assuming a Unix-like system):
import os
os.system('ls')
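If the output needs to be captured rather than just echoed to the screen, subprocess can do that directly; a minimal sketch:

```python
import subprocess

# os.system only returns the exit status; check_output runs the command
# and returns its stdout (as bytes on Python 3).
listing = subprocess.check_output(['ls', '-l'])
print(listing)
```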

Is there a way to get the output of a bash script to "feed" into a variable in a python script that executes the action initially?

So, I have a bash script that emits data from a sensor connected to the NVIDIA Jetson TK1 as a string in the shell. Is there any way I can run a Python script that launches the bash script, takes the output data from bash, and feeds it back into a Python string variable where it can be parsed? I cannot edit the bash script.
Thanks
In python:
import subprocess
bash_output = subprocess.check_output(['/path/to/script.sh'])
subprocess.check_output runs the command and pipes its standard output back to Python as a string. Look at the documentation and examples for subprocess.check_output for details.
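A slightly fuller sketch, with echo standing in for the sensor script (which the question says cannot be edited; substitute its real path in practice):

```python
import subprocess

# 'echo' stands in for the real sensor script here; in practice you would
# call subprocess.check_output(['/path/to/sensor_script.sh']) instead.
raw = subprocess.check_output(['echo', '12.5 13.0 12.8'])

# check_output returns the script's stdout as one string (bytes on
# Python 3), which can then be parsed like any other string.
readings = [float(x) for x in raw.decode().split()]
print(readings)  # [12.5, 13.0, 12.8]
```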

Cannot make consecutive calls with subprocess

I'm having trouble using multiple subprocess calls back to back.
These 2 work fine:
subprocess.call(["gmake", "boot-tilera"], cwd="/home/ecorbett/trn_fp")
p = subprocess.Popen(["gmake", "run-tilera"], stdout=subprocess.PIPE, cwd="/home/ecorbett/trn_fp")
However, I get an error when I try to run this call directly after:
time.sleep(10)
subprocess.call(["./go2.sh"], cwd="/home/ecorbett/trn_fp/kem_ut")
I added sleep in there because I need a few seconds before I run the "./go2.sh" program. Not sure if that is the issue.
Any advice?
A possible reason why your shell script is working on the command-line is that the shebang line was not written correctly (or not written at all). See an example in which the script would work from a command line but not as a Python subprocess: Is this the right way to run a shell script inside Python?
If your shell script does not have a shebang line, it works from the command line because $SHELL is set in your environment and the shell falls back to it. When run as a Python subprocess, Python does not know how to execute the file and fails with OSError: [Errno 8] Exec format error. The subprocess.call() to gmake worked because gmake is a binary program, not a shell script. Passing shell=True tells subprocess to interpret the argument exactly as a shell would.
However, be careful about using shell=True in subprocess.call() as it may be insecure in some cases: subprocess Python docs.
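A short, self-contained sketch of the failure mode and both fixes, using a throwaway temp script (with no shebang) in place of the question's go2.sh:

```python
import os
import subprocess
import tempfile

# A throwaway script with *no* shebang line, standing in for go2.sh.
script = tempfile.NamedTemporaryFile('w', suffix='.sh', delete=False)
script.write('echo "hello from go2"\n')
script.close()
os.chmod(script.name, 0o755)

# Fails with OSError: [Errno 8] Exec format error, since exec() cannot
# load a script that names no interpreter:
# subprocess.call([script.name])

# Works: the interpreter is named explicitly.
rc = subprocess.call(['/bin/bash', script.name])

# Also works, with the shell=True security caveats noted above:
subprocess.call(script.name, shell=True)

os.unlink(script.name)
```

Of course, the cleanest fix is simply to add #!/bin/bash as the first line of the script itself.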

Python 2.4.6 subprocess is befuddling me

I have written a nice python script that parses XML and adds some sophisticated logic to then interface with an external command via subprocess module.
Most of the subprocess.Popen calls do exactly what they're supposed to, but the last one simply refuses to execute. There is no error message; it just doesn't do what it's supposed to. I even put the actual CMD into a shell script surrounded by debugging statements, and the shell script gets executed, but not the actual CMD.
More infuriatingly, the very same line of code in a separate .py file executes just fine.
I have no idea why or how this could be?
The python code is generating a file and tries to invoke the external command with options
p = subprocess.Popen([CMD,'object','new_host','--file','/tmp/add.1234'], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
r = p.communicate()
print r
this logic works in the standalone file, but not in the larger python script (which has other working Popen calls in it).
Does anybody have an idea why this could be?
PS: I can not update python to a more recent version
