#!/usr/bin/env bash
I am working on using bash commands in Python. I found this example in Red Hat Magazine, but the tail command does not produce any output. I would like to understand why it is failing. I have also tried to import subprocess, but that just hangs.
#Create Commands
SPACE=`df -h`
MESSAGES=`tail /var/log/messages`
#Assign to an array(list in Python)
cmds=("$MESSAGES" "$SPACE")
#iteration loop
count=0
for cmd in "${cmds[@]}"; do
count=$((count + 1))
printf "Running Command Number %s \n" $count
echo "$cmd"
done
Printing the command doesn't mean executing it. Look at Python's subprocess library for API and examples for doing what you want.
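For example, a minimal sketch (assuming Python 3.7+ for capture_output) that actually runs both commands in the loop and prints what they return:
import subprocess

# Run each command and print its captured output.
# Note: reading /var/log/messages may require root on many systems.
cmds = [["df", "-h"], ["tail", "/var/log/messages"]]

for count, cmd in enumerate(cmds, start=1):
    print("Running Command Number %s" % count)
    result = subprocess.run(cmd, capture_output=True, text=True)
    print(result.stdout)
    if result.returncode != 0:
        print("Command failed:", result.stderr)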
I'm using Windows 10 and, while working on a new project, I need to interact with the WSL (Ubuntu on Windows) bash from within Python (the Windows Python interpreter).
I tried using the Python subprocess library to execute commands. What I did looks like this:
import subprocess
print(subprocess.check_call(['cmd','ubuntu1804', 'BashCmdHere(eg: ls)']))#not working
print(subprocess.check_output("ubuntu1804", shell=True).decode())#also not working
The expected behavior is to execute the ubuntu1804 command, which starts a WSL Linux bash, then execute my 'BashCmdHere' on it and retrieve its results into Python, but it just freezes. What am I doing wrong, or how can I do this?
Thank you so much
Found 2 ways to achieve this:
A correct version of my code looks like this:
#e.g: To execute "ls -l"
import subprocess
print(subprocess.check_call(['wsl', 'ls','-l','MaybeOtherParamHere']))
I should have used wsl to invoke the Linux shell (bash) from Windows, then passed my command and its parameters as separate arguments to the subprocess call.
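If you also want the command's results back in Python rather than just its exit status, a minimal sketch using check_output (assuming wsl is on the PATH) could be:
import subprocess

# Capture the output of a command run inside WSL and decode it to a string.
output = subprocess.check_output(['wsl', 'ls', '-l']).decode()
print(output)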
The other way, which I think is cleaner but may be heavier, is using PowerShell scripts:
#script.ps1
param([String]$folderpath, [String]$otherparam)
Write-Output $folderpath
Write-Output $otherparam
wsl ls -l $folderpath $otherparam
Then to execute it in python and get the results:
import subprocess
def callps1():
    powerShellPath = r'C:\WINDOWS\system32\WindowsPowerShell\v1.0\powershell.exe'
    powerShellCmd = "./script.ps1"
    # call the script with argument '/mnt/c/Users/aaa/'
    p = subprocess.Popen([powerShellPath, '-ExecutionPolicy', 'Unrestricted',
                          powerShellCmd, '/mnt/c/Users/aaa/', 'SecondArgValue'],
                         stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    output, error = p.communicate()
    rc = p.returncode
    print("Return code given to Python script is: " + str(rc))
    print("\n\nstdout:\n\n" + str(output))
    print("\n\nstderr: " + str(error))

# Test
callps1()
Thank you for helping out
What about:
print(subprocess.check_call(['ubuntu1804', 'run', 'BashCmdHere(eg: ls)'])) #also try without "run" or change ubuntu1804 to wsl
Or
print(subprocess.check_call(['cmd', '/c', 'ubuntu1804', 'run', 'BashCmdHere(eg: ls)']))#also try without "run" or change "ubuntu1804" to "wsl"
# I think you need to play with quotes here to produce: cmd /c 'ubuntu1804 run BashCmdHere(eg: ls)'
First, try to call your command from cmd.exe to see the right format and then translate it to Python.
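For example, a rough translation of that cmd.exe invocation into Python (assuming the wsl launcher is available) might be:
import subprocess

# Equivalent of running "cmd /c wsl ls -l" from a Windows command prompt.
print(subprocess.check_call(['cmd', '/c', 'wsl', 'ls', '-l']))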
os.system('bash')
I figured this out by accident.
I want to get the string output of executing ps -p $$ -oargs= through python.
When I try this:
subprocess.check_output("ps -p $$ -oargs=", shell=True)
I get "ps -p 9017 -oargs=" but when I try ps -p $$ -oargs= directly in a shell I get "-bash". How do I get "-bash" through python?
That's doing exactly what you asked it to do, filtering the ps to only output information on the process with the PID of the shell that's running it (that's what $$ means).
If you want information on the Python session itself, you need to pass the PID of the Python process itself, e.g. (removing the shell=True wrapping since you don't need it for $$ here):
subprocess.check_output(["ps", "-p", str(os.getpid()), "-oargs="])
Make sure to import os at the top of your file so you have access to os.getpid.
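If what you actually want is the "-bash" of the invoking shell, one option is to pass the parent process ID instead; a sketch, assuming the Python script was launched directly from that bash shell:
import os
import subprocess

# os.getppid() is the PID of the process that started this script,
# i.e. the interactive bash shell when the script is run from it directly.
print(subprocess.check_output(["ps", "-p", str(os.getppid()), "-oargs="]).decode())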
I was using the Python project pick to select an option from a list. The code below returns the option and index.
option, index = pick(options, title)
Pick uses the curses library from Python. I want to pass the output of my Python script to a shell script.
variable output = $(pythonfile.py)
but it gets stuck on the curses screen. It cannot draw anything. What can be the reason for this?
pick gets stuck because when you use $(pythonfile.py), the shell redirects the output of pythonfile.py as if it were a pipe. Also, the output of pick contains characters for updating the screen (not what you want). You can work around those problems by
redirecting the output of pythonfile.py to /dev/tty
ensuring that your pythonfile.py writes its result to the standard error, and
directing the standard error in the shell script to the output of the $(...) construct.
For example:
#!/bin/bash
foo=$(python pythonfile.py 2>&1 >/dev/tty)
echo "result '$foo'"
and in pythonfile.py, doing
import sys
print(option, index, file=sys.stderr)
rather than
print(option, index)
To pass the output of a Python script to a Bash variable, you need to specify the command used to run the Python file inside the variable's declaration.
Like so:
variable_output=$(python pythonfile.py)
Furthermore, if you'd like to pass a variable from Python to Bash, you could use Python's sys module and then redirect stderr.
Like so:
test.py
import sys
test_var = (str(3 + 3))
sys.exit(test_var)
test.sh
test_var=$(python3 test.py 2>&1)
echo "$test_var"
Now, if we run test.sh we get the output 6.
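Note that sys.exit() with a string argument writes it to stderr and sets the exit status to 1. If a zero exit status matters, a sketch that writes to stderr directly (as in the earlier answer) behaves the same way for the capturing shell:
import sys

test_var = str(3 + 3)
# Writing to stderr leaves the exit status at 0, unlike sys.exit(test_var),
# while the shell's 2>&1 redirection still captures the value.
print(test_var, file=sys.stderr)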
I am invoking a bash script from a Python script.
I want the bash script to add an element to the dictionary "d" in the Python script.
abc3.sh:
#!/bin/bash
rank=1
echo "plugin"
function reg()
{
    if [ "$1" == "what" ]; then
        python -c 'from framework import data;data(rank)'
        echo "iamin"
    else
        plugin
    fi
}
plugin()
{
    echo "i am plugin one"
}
reg $1
Python file:
import sys,os,subprocess
from collections import *
subprocess.call(["./abc3.sh what"],shell=True,executable='/bin/bash')
def data(rank,check):
    d[rank]["CHECK"]=check
    print d[1]["CHECK"]
If I understand correctly, you have a Python script that runs a shell script, which in turn runs a new Python script. And you'd want the second Python script to update a dictionary in the first script. It will not work like that.
When you run your first python script, it will create a new python process, which will interpret each instruction from your source script.
When it reaches the instruction subprocess.call(["./abc3.sh what"],shell=True,executable='/bin/bash'), it will spawn a new shell (bash) process which will in turn interpret your shell script.
When the shell script reaches python -c <commands>, it invokes a new Python process. This process is independent of the initial Python process (even if you run the same script file).
Because each of these scripts runs in a different process, they don't have access to each other's data (the OS makes sure that each process is independent of the others, except via specific inter-process communication methods).
What you need to do: use some kind of inter-process mechanism, so that the initial Python script gets data from the shell script. You may, for example, read data from the shell's standard output, using https://docs.python.org/3/library/subprocess.html#subprocess.check_output
Let's suppose that you have a shell plugin that echoes the value:
echo $1 12
The mockup Python script looks like this (I'm on Windows/MSYS2, hence the strange paths for a Linux user):
import subprocess
p = subprocess.Popen(args=[r'C:\msys64\usr\bin\sh.exe',"-c","C:/users/jotd/myplugin.sh myarg"],stdout=subprocess.PIPE,stderr=subprocess.PIPE)
o,e= p.communicate()
p.wait()
if len(e):
    print("Warning: error found: "+e.decode())
result = o.strip()
d=dict()
d["TEST"] = result
print(d)
It prints the dictionary, proving that the argument was passed to the shell and came back processed.
Note that stderr has been filtered out to avoid being mixed up with the results, but it is printed to the console if an error occurs.
{'TEST': b'myarg 12'}
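Since the answer mentions check_output, the same round trip can be written more compactly; a minimal sketch, assuming a POSIX sh on the PATH and the plugin script in the current directory:
import subprocess

# check_output raises CalledProcessError on a non-zero exit status,
# so there is no need to inspect the return code by hand.
result = subprocess.check_output(["sh", "-c", "./myplugin.sh myarg"]).strip()
d = {"TEST": result}
print(d)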
I am trying to run 2 bash commands sequentially in a for loop in Python. The loop is as given below:
for dataset in data:
subprocess.call("cd " +dataset+'.matrix',shell=True)
subprocess.call("cat part-r-* > combined_output", shell=True)
However, this way each command is being considered independently of the others. I need them to execute one after the other. I don't know how to use the subprocess module well (I also tried with os.system). I went through some documentation online, but it wasn't really useful. Any help with this would be appreciated. Thanks in advance!
I don't believe they're running asynchronously (at the same time), just that they're in different subprocesses which each inherit the parent process' working directory. So it is waiting for the cd to complete, but the subprocess which has cd'd then disappears.
Thankfully, in this case you can tell the call to operate in a different directory with the cwd parameter, so you don't need your cd:
for dataset in data:
subprocess.call("cat part-r-* > combined_output",shell=True,cwd=dataset+'.matrix')
An example with os.system:
import os
os.system("echo aaa > ~/SOMEFILE1111; cd ~/; cat SOMEFILE1111; rm SOMEFILE1111")
and one with subprocess.call:
import subprocess
cmds = [ "echo aaa > ~/SOMEFILE1111", "cd ~/",
"cat SOMEFILE1111", "rm SOMEFILE1111"]
subprocess.call(";".join(cmds), shell=True)