How to call a shell script function/variable from Python?

Is there any way to call a shell script and use the functions/variables defined in the script from Python?
The script is unix_shell.sh
#!/bin/bash
function foo
{
...
}
Is it possible to call this function foo from python?
Solution:
For functions: Convert Shell functions to python functions
For shell local variables (non-exported), run this command in the shell just before calling the Python script:
export $(set | tr '\n' ' ')
For shell global variables (exported from the shell), in Python you can do:
import os
print(os.environ["VAR1"])
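For example, assuming the variable was exported before Python was started (VAR1 is just a placeholder name here), a minimal round trip looks like this:
# in the shell
export VAR1="hello"
python3 -c 'import os; print(os.environ.get("VAR1"))'   # prints: hello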

Yes, in a similar way to how you would call it from another bash script:
import subprocess
subprocess.check_output(['bash', '-c', 'source unix_shell.sh && foo'])

This can be done with subprocess. (At least this was what I was trying to do when I searched for this)
Like so:
output = subprocess.check_output(['bash', '-c', 'source utility_functions.sh; get_new_value 5'])
where utility_functions.sh looks like this:
#!/bin/bash
function get_new_value
{
let "new_value=$1 * $1"
echo $new_value
}
Here's how it looks in action...
>>> import subprocess
>>> output = subprocess.check_output(['bash', '-c', 'source utility_functions.sh; get_new_value 5'])
>>> print(output)
b'25\n'

No, that's not possible. You can execute a shell script, pass parameters on the command line, and it could print data out, which you could parse from Python.
But that's not really calling the function. That's still executing bash with options and getting a string back on stdio.
That might do what you want, but it's probably not the right way to do it. There is very little Bash can do that Python cannot, so implement the function in Python instead.
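For instance, the get_new_value helper shown in the answer above is a one-liner once it is ported to Python (just a sketch of the porting idea):
def get_new_value(n):
    # same as the bash version: square the argument
    return n * n

print(get_new_value(5))   # 25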

With the help of the answer above and this answer, I came up with this:
import subprocess
command = 'bash -c "source ~/.fileContainingTheFunction && theFunction"'
stdout = subprocess.getoutput(command)
print(stdout)
I'm using Python 3.6.5 in Ubuntu 18.04 LTS.

I don't know too much about Python, but if you use export -f foo after the shell script's function definition, the function can be called from a sub-bash. Without the export, you need to source the script as . script.sh inside the sub-bash started from Python, but that will run everything in the script and define all of its functions and variables.
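Roughly, the export -f route could look like this (a sketch; it assumes Python itself is started from the shell that exported foo):
# in the shell
source unix_shell.sh     # defines foo
export -f foo            # make foo visible to child bash processes
# then start Python, e.g. python3 call_foo.py

# call_foo.py
import subprocess
# bash inherits exported functions through the environment,
# so there is no need to source the script again
subprocess.check_call(['bash', '-c', 'foo'])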

You could separate each function into their own bash file. Then use Python to pass the right parameters to each separate bash file.
This may be easier than just re-writing the bash functions in Python yourself.
You can then call these functions using
import subprocess
subprocess.call(['bash', 'function1.sh'])
subprocess.call(['bash', 'function2.sh'])
# etc. etc.
You can use subprocess to pass parameters too.
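For example (a sketch; the script name and arguments are placeholders):
import subprocess
# arguments after the script name become $1, $2, ... inside function1.sh
subprocess.call(['bash', 'function1.sh', 'arg1', 'arg2'])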

Related

Running bash code in python

For one piece of functionality I need bash commands (12 lines of bash code). How can I put these 12 lines in the middle of my Python code? At the moment I was using:
import subprocess
command = 'bash 1-line code'
subprocess.call(command, shell=True)
This worked, but I was only using one line of code; now I have 12 and the quoted string doesn't seem to work well...
Any suggestions?
Just extend what you were doing. Put all your bash code in a file named, say, script.sh and call it from Python. You can call it the same way you call normal commands, i.e. using the subprocess module:
import subprocess
subprocess.call(['./script.sh'])
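If you would rather keep the 12 lines inline instead of in a separate file, a triple-quoted string with shell=True works too (a sketch with placeholder commands):
import subprocess
script = """
set -e              # stop at the first failing command
echo "line 1"
echo "line 2"
# ... the rest of your bash lines ...
"""
subprocess.call(script, shell=True, executable='/bin/bash')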

Calling alias Command from python script

I need to run an OpenFOAM command by automatized python script.
My python code contains the lines
subprocess.Popen(['OF23'], shell=True)
subprocess.Popen(['for i in *; do surfaceConvert $i file_path/$i.stlb; done'], shell=True)
where OF23 is a shell alias defined as
alias OF23='export PATH=/usr/lib64/openmpi/bin/:$PATH;export LD_LIBRARY_PATH=/usr/lib64/openmpi/lib/:$LD_LIBRARY_PATH;source /opt/OpenFOAM/OpenFOAM-2.3.x/etc/bashrc'
This script runs the OpenFOAM command in the terminal, and file_path points to the STL files which are converted to binary format.
But when I run the script, I get 'OF23' is not defined.
How do I make my script run the alias command and then perform the OpenFOAM file conversion command?
That's not going to work, even once you've resolved the alias problem. Each Python subprocess.Popen is run in a separate subshell, so the effects of executing OF23 won't persist to the second subprocess.Popen.
Here's a brief demo:
import subprocess
subprocess.Popen('export ATEST="Hello";echo "1 $ATEST"', shell=True)
subprocess.Popen('echo "2 $ATEST"', shell=True)
output
1 Hello
2
So whether you use the alias, or just execute the aliased commands directly, you'll need to combine your commands into one subprocess.Popen call.
Eg:
subprocess.Popen('''export PATH=/usr/lib64/openmpi/bin/:$PATH;
export LD_LIBRARY_PATH=/usr/lib64/openmpi/lib/:$LD_LIBRARY_PATH;
source /opt/OpenFOAM/OpenFOAM-2.3.x/etc/bashrc;
for i in *;
do surfaceConvert $i file_path/$i.stlb;
done''', shell=True)
I've used a triple-quoted string so I can insert linebreaks, to make the shell commands easier to read.
Obviously, I can't test that exact command sequence on my machine, but it should work.
You need to issue shopt -s expand_aliases to activate alias expansion. From bash(1):
Aliases are not expanded when the shell is not interactive, unless the expand_aliases shell option is set using shopt [...]
If that does not help, check if the shell executed from your Python program is actually Bash (e.g. by echoing $BASH).
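A quick way to run that check from Python (just a sanity-check sketch):
import subprocess
# $BASH is only set when the interpreting shell really is bash;
# with plain shell=True this shows what /bin/sh actually is
subprocess.call('echo "shell is: ${BASH:-not bash}"', shell=True)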
If your command may use bash-isms then you could pass the executable parameter, otherwise /bin/sh is used. To expand aliases, you could use @Michael Jaros' suggestion:
#!/usr/bin/env python
import subprocess
subprocess.check_call("""
shopt -s expand_aliases
OF23
for i in *; do surfaceConvert $i file_path/$i.stlb; done
"""], shell=True, executable='/bin/bash')
If you already have a working bash-script then just call it as is.
Though to make it more robust and maintainable, you could convert to Python the parts that provide the most benefit; e.g., here's how you could emulate the for-loop:
#!/usr/bin/env python
import os
import subprocess

for entry in os.listdir('.'):
    subprocess.check_call(['/path/to/surfaceConvert', entry,
                           'file_path/{entry}.stlb'.format(entry=entry)])
It allows filenames to contain shell meta-characters such as spaces.
To configure the environment for a child process, you could use Popen's env parameter e.g., env=dict(os.environ, ENVVAR='value').
It is possible to emulate source bash command in Python but you should probably leave the parts that depend on it in bash-script.
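For the OF23 part of the question specifically, the PATH and LD_LIBRARY_PATH changes from the alias could be expressed through env, leaving only the source step to bash (a sketch using the paths from the question):
import os
import subprocess

# extend the inherited environment instead of exporting inside a throwaway subshell
env = dict(os.environ)
env['PATH'] = '/usr/lib64/openmpi/bin/:' + env.get('PATH', '')
env['LD_LIBRARY_PATH'] = '/usr/lib64/openmpi/lib/:' + env.get('LD_LIBRARY_PATH', '')

# the "source .../etc/bashrc" step still has to happen inside bash itself
subprocess.check_call(
    'source /opt/OpenFOAM/OpenFOAM-2.3.x/etc/bashrc && '
    'for i in *; do surfaceConvert "$i" "file_path/$i.stlb"; done',
    shell=True, executable='/bin/bash', env=env)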

Passing parameters to shell script from a python program

I would like to call a shell script from my Python script. I need to pass 3 parameters/arguments to the shell script. I am able to call the shell script (it is in the same directory as the Python script), but I'm having some issues with the parameter passing.
from subprocess import call
# other code here.
line = "Hello"
# Here is how I call the shell command
call(["./myscript.sh", "/usr/share/file1.txt", "/usr/share/file2.txt", line], shell=True)
In my shell script I have this
#!/bin/sh
echo "Parameters are $1 $2 $3"
...
Unfortunately parameters are not getting passed correctly.
I get this message:
Parameters are
None of the parameter values are passed to the script.
call ("./myscript.sh /usr/share/file1.txt /usr/share/file2.txt "+line, shell=True)
When you use shell=True you can pass the command directly, as if you were typing it on the shell.
Drop shell=True (you might need to make myscript.sh executable: $ chmod +x myscript.sh):
#!/usr/bin/env python
from subprocess import check_call
line = "Hello world!"
check_call(["./myscript.sh", "/usr/share/file1.txt", "/usr/share/file2.txt",
            line])
Do not use a list argument and shell=True together.

bash wrap a piped command with a python script

Is there a way to create a python script which wraps an entire bash command, including the pipes?
For example, if I have the following simple script
import sys
print sys.argv
and call it like so (from bash or ipython), I get the expected outcome:
[pkerp@pendari trell]$ python test.py ls
['test.py', 'ls']
If I add a pipe, however, the output of the script gets redirected to the pipe sink:
[pkerp@pendari trell]$ python test.py ls > out.txt
The > out.txt portion is not in sys.argv. I understand that the shell automatically processes this redirection, but I'm curious if there's a way to force the shell to ignore it and pass it to the process being called.
The point of this is to create something like a wrapper for the shell. I'd like to run the commands regularly, but keep track of the strace output for each command (including the pipes). Ideally I'd like to keep all of the bash features, such as tab-completion and up and down arrows and history search, and then just pass the completed command through a python script which invokes a subprocess to handle it.
Is this possible, or would I have to write my own shell to do this?
Edit
It appears I'm asking the exact same thing as this question.
The only thing you can do is pass the entire shell command as a string, then let Python pass it back to a shell for execution.
$ python test.py "ls > out.txt"
Inside test.py, something like
subprocess.call("strace " + sys.argv[1], shell=True, executable="/bin/bash")
to ensure the entire string is passed to the shell (and bash, specifically).
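Put together, test.py could look roughly like this (a sketch; it assumes strace is available and the whole command arrives as one quoted argument):
#!/usr/bin/env python
import subprocess
import sys

# sys.argv[1] holds the complete shell command, pipes and redirections included
cmd = sys.argv[1]
subprocess.call("strace " + cmd, shell=True, executable="/bin/bash")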
Well, I don't quite see what you are trying to do. The general approach would be to give the desired output destination to the script using command line options: python test.py ls --output=out.txt. Incidentally, strace writes to stderr. You could capture everything using strace python test.py > out 2> err if you want to save everything...
Edit: If your script writes to stderr as well you could use strace -o strace_out python test.py > script_out 2> script_err
Edit2: Okay, I understand better what you want. My suggestion is this: Write a bash helper:
function process_and_evaluate()
{
strace -o /tmp/output/strace_output "$@"
/path/to/script.py /tmp/output/strace_output
}
Put this in a file like ~/helper.sh. Then open a bash, source it using . ~/helper.sh.
Now you can run it like this: process_and_evaluate ls -lA.
Edit3:
To capture output / error you could extend the macro like this:
function process_and_evaluate()
{
out=$1
err=$2
shift 2
strace -o /tmp/output/strace_output "$@" > "$out" 2> "$err"
/path/to/script.py /tmp/output/strace_output
}
You would have to use the (less obvious) process_and_evaluate out.txt err.txt ls -lA.
This is the best that I can come up with...
At least in your simple example, you could just run the python script as an argument to echo, e.g.
$ echo $(python test.py ls) > test.txt
$ more test.txt
['test.py','ls']
Enclosing a command in parentheses preceded by a dollar sign first executes the contents, then passes the output as an argument to echo.

How to use export with Python on Linux

I need to make an export like this in Python:
# export MY_DATA="my_export"
I've tried to do :
# -*- python-mode -*-
# -*- coding: utf-8 -*-
import os
os.system('export MY_DATA="my_export"')
But when I list the exports, MY_DATA does not appear:
# export
How can I do an export with Python without saving "my_export" into a file?
export is a command that you give directly to the shell (e.g. bash), to tell it to add or modify one of its environment variables. You can't change your shell's environment from a child process (such as Python), it's just not possible.
Here's what's happening when you try os.system('export MY_DATA="my_export"')...
/bin/bash process, command `python yourscript.py` forks python subprocess
  |_ /usr/bin/python process, command `os.system()` forks /bin/sh subprocess
       |_ /bin/sh process, command `export ...` changes its local environment
When the bottom-most /bin/sh subprocess finishes running your export ... command, then it's discarded, along with the environment that you have just changed.
You actually want to do
import os
os.environ["MY_DATA"] = "my_export"
Another way to do this, if you're in a hurry and don't mind the hacky aftertaste, is to have Python print out the commands that set the environment and then execute that output in your bash environment. Not ideal, but it can get the job done in a pinch. It's not very portable across shells, so YMMV.
$(python -c 'print "export MY_DATA=my_export"')
(you can also enclose the statement in backticks in some shells ``)
Not that simple:
python -c "import os; os.putenv('MY_DATA','1233')"
$ echo $MY_DATA # <- empty
But:
python -c "import os; os.putenv('MY_DATA','123'); os.system('bash')"
$ echo $MY_DATA #<- 123
I have an excellent answer.
#! /bin/bash
output=$(git diff origin/master..origin/develop | \
python -c '
# DO YOUR HACKING
variable1_to_be_exported="Yo Yo"
variable2_to_be_exported="Honey Singh"
… so on
magic=""
magic+="export onShell-var1=\""+str(variable1_to_be_exported)+"\"\n"
magic+="export onShell-var2=\""+str(variable2_to_be_exported)+"\""
print magic
'
)
eval "$output"
echo "$onShell-var1" // Output will be Yo Yo
echo "$onShell-var2" // Output will be Honey Singh
Mr Alex Tingle is correct about those process and sub-process issues.
It can be achieved the way I have shown above.
Key Concept is :
Whatever is printed from Python gets stored in the catching variable in bash [output]
We can execute any command in the form of string using eval
So, format the output printed from Python as meaningful bash commands
use eval to execute it in bash
And you can see your results
NOTE
Always execute the eval using double quotes, or else bash will mangle your newlines (\n) and the output will be strange
PS: I don't like bash, but you have to use it
I've had to do something similar on a CI system recently. My options were to do it entirely in bash (yikes) or use a language like python which would have made programming the logic much simpler.
My workaround was to do the programming in python and write the results to a file.
Then use bash to export the results.
For example:
# do calculations in python
with open("./my_export", "w") as f:
    f.write(your_results)
# then in bash
export MY_DATA="$(cat ./my_export)"
rm ./my_export # if no longer needed
You could try os.environ["MY_DATA"] instead.
Kind of a hack because it's not really python doing anything special here, but if you run the export command in the same sub-shell, you will probably get the result you want.
import os
cmd = "export MY_DATA='1234'; echo $MY_DATA" # or whatever command
os.system(cmd)
In the hope of providing clarity over common confusion...
I have written many python <--> bash <--> elfbin toolchains, and the proper way to see it is this:
Each process (originator) has an environment state inherited from whatever invoked it. Any change remains local to that process. Transferring an environment state is a function in itself and runs in two directions, each with its own caveats. The most common thing is to modify the environment before running a sub-process. To go down to the metal, look at the exec() call in C. There is a variant that takes a pointer to environment data. This is the only actually supported transfer of environment in typical OSes.
Shell scripts will create a state to pass on to children when you do an export. Otherwise they just use what they got in the first place.
In all other cases it will be some generic mechanism used to pass a set of data to allow the calling process itself to update its environment based on the child process's output.
Ex:
ENVUPDATE=$(CMD_THAT_OUTPUTS_KEYVAL_LISTS)
echo "$ENVUPDATE" > "$TMPFILE"
source "$TMPFILE"
The same can of course be done using json, xml or other things as long as you have the tools to interpret and apply.
The need for this may be (50% chance) a sign of misconstruing the basic primitives and that you need a better config or parameter interchange in your solution.....
Oh, in python I would do something like...
(needs improvement depending on your situation)
import os
import re

RE_KV = re.compile(r'([a-z]\w*)\s*=\s*(.*)')

OUTPUT = RunSomething(...)   # assuming it returns something like 'k1=v1 k2=v2'
for kv in OUTPUT.split(' '):
    try:
        k, v = RE_KV.match(kv).groups()
        os.environ[k] = str(v)
    except AttributeError:
        # not a key=value property, skip it
        pass
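The JSON variant mentioned above is less fragile than splitting on spaces; a sketch (the child command name is hypothetical):
import json
import os
import subprocess

# assume the child prints something like {"K1": "v1", "K2": "v2"} on stdout
out = subprocess.check_output(['some_child_cmd', '--dump-env-json'])
for k, v in json.loads(out).items():
    os.environ[k] = str(v)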
One line solution:
eval `python -c 'import sysconfig;print("python_include_path={0}".format(sysconfig.get_path("include")))'`
echo $python_include_path   # prints /home/<usr>/anaconda3/include/python3.6m in my case
Breakdown:
Python call
python -c 'import sysconfig;print("python_include_path={0}".format(sysconfig.get_path("include")))'
It's launching a python script that
1. imports sysconfig
2. gets the python include path corresponding to this python binary (use "which python" to see which one is being used)
3. prints the string "python_include_path={0}" with {0} replaced by the path from step 2
Eval call
eval `python -c 'import sysconfig;print("python_include_path={0}".format(sysconfig.get_path("include")))'`
It's executing in the current bash instance the output from the python script. In my case, it's executing:
python_include_path=/home/<usr>/anaconda3/include/python3.6m
In other words, it's setting the environment variable "python_include_path" with that path for this shell instance.
Inspired by:
http://blog.tintoy.io/2017/06/exporting-environment-variables-from-python-to-bash/
import os
import shlex
from subprocess import Popen, PIPE

os.environ.update({'KEY': 'value'})   # placeholder: set whatever variables you need
# note: don't combine shlex.split() with shell=True; pass the argument list directly
res = Popen(shlex.split("cmd xxx -xxx"), stdin=PIPE, stdout=PIPE, stderr=PIPE,
            env=os.environ).communicate('y\ny\ny\n'.encode('utf8'))
stdout = res[0]
stderr = res[1]
os.system('/home/user1/exportPath.ksh')
exportPath.ksh:
export MY_DATA="my_export"
