Calling a Python function from a bash script

I have an "alarm email" function inside a python module. I want to be able to call this function from a bash script. I know you can call a module using 'python ' in the script, but I'm not if you can or how you would call a specific function within the module.

python -c 'import themodule; themodule.thefunction("boo!")'

You can use the -c option:
python -c "import random; print random.uniform(0, 1)"
Modify as you need.

To call a specific function in a module, ensure that the module contains the following:
if __name__ == "__main__":
    the_function_to_call()
Then you can simply do this in your shell script.
python module.py
Or
python -m module
Depending on whether or not the module's on the PYTHONPATH.
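For example, a minimal (hypothetical) module.py wired up this way:
# module.py (hypothetical example)
def the_function_to_call():
    print("alarm email sent")

if __name__ == "__main__":
    # runs only when the file is executed, not when it is imported
    the_function_to_call()
Running python module.py then calls the function, while importing the module from other code does not.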

To supplement the other answers with two notes:
If you need to feed environment variables as parameters, use double quotes;
The parameters are not necessarily str.
Here is an example:
# test.py
def call(param):
    print(f'{param} with type={type(param)}')

print('hi')
Execute from shell:
PARAM=10
python3 -c "import test; test.call(${PARAM})"
You can read:
hi
10 with type=<class 'int'>
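Note that the shell substitutes the value into the command text before Python parses it, which is why 10 arrives as an int. If you want the parameter to arrive as a str, quote it inside the command (same test.py as above):
PARAM=10
python3 -c "import test; test.call('${PARAM}')"
This time you read:
hi
10 with type=<class 'str'>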

Related

How to use a python module in combination with a command from shell

I would like to use a python module from the shell (to be exact: indirectly from gnuplot). I do not want to write an extra script for every call or implement some I/O logic.
Let's say as a minimal working example, I have a python module module_foo.py with
#!/usr/bin/python
def bar():
    print(1, 2, 3)
My question is:
Why isn't it possible to use a python module combining module loading and command execution like here?:
$ python -m module_foo -c 'bar()'
When executed, nothing happens. But what does work, is using only a command call like this
$ python -c 'import module_foo; module_foo.bar()'
1 2 3
or this
$ python -c 'from module_foo import *; bar()'
1 2 3
As soon as I load a module first, even a syntactically erroneous command is "accepted", though presumably not executed (the bracket of the call to bar isn't closed):
$ python -m module_foo -c 'bar('
$
It is, however, possible to use the -m module option using a python unit test (from the python docs):
python -m unittest test_module1 test_module2
The python manpage says for both options:
-c command
Specify the command to execute (see next section). This terminates the option
list (following options are passed as arguments to the command).
-m module-name
Searches sys.path for the named module and runs the corresponding .py file as
a script.
So I'd expect to be able to use both options in this order, -m ... -c ..., but not in the reverse order -c ... -m .... Am I missing something obvious?
If you want your Python module to be executable and to call the function bar(), you should add this to the end of the python file:
if __name__ == "__main__":  # this checks that the file is "executed", rather than "imported"
    bar()  # call the function you want to call
Then call:
python module_foo.py
If you want more control, you can pass arguments to the script and access them from sys.argv.
For even more flexibility in the arguments passed to the script, see the argparse module.
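As for why python -m module_foo -c 'bar()' does nothing: everything after -m module_foo is passed to the module in sys.argv, so -c 'bar()' arrives as plain arguments rather than as an instruction to the interpreter. If you want behavior close to the original idea, the module can dispatch on those arguments itself. A minimal sketch (the globals() lookup assumes you trust the names passed in):
#!/usr/bin/python
import sys

def bar():
    print(1, 2, 3)

if __name__ == "__main__":
    # e.g. python -m module_foo bar  ->  sys.argv == ['.../module_foo.py', 'bar']
    for name in sys.argv[1:]:
        globals()[name]()  # look up each named function and call it
With this in place, $ python -m module_foo bar prints 1 2 3.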

How can I disassemble a function in a file without going into the Python interactive interpreter

So I have the following script, test.py:
def foo(x):
    y = x * 2
    return y

print(foo(10))
I run this command to disassemble it: python -m dis test.py. The problem is, it does not show the byte code of function foo.
Can you show me how I can make a simple BASH or Python script so that, given a Python script name and a function name, I can run it at command line to list the bytecode of the function?
[UPDATE]
OK, here it is:
dis_py_f() {
    python -c "import dis; import $1; dis.dis($1.$2)"
}
Thanks Mad Physicist for the tip.
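For example, assuming the test.py above is in the current directory (so it is importable):
dis_py_f test foo
prints the bytecode of foo.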

Export a variable from bash and use it in Python

I am trying to write a script which executes a couple of things on a Linux server. I would like to use bash for most of the Linux-specific commands and only use Python for the most complex parts, but in order to do that I need to export some variables from the bash script and use them in the Python script, and I haven't found a way to do that. So I created two very small scripts to test this functionality:
1.sh is a bash script
#!/bin/bash
test_var="Test Variable"
export test_var
echo "1.sh has been executed"
python 2.sh
2.sh is a Python script:
#!/usr/bin/env python3
print("The python script has been invoked successfully")
print(test_var)
As you can guess, when I execute the first script the second fails with an error about an unknown variable:
$ ./1.sh
1.sh has been executed
The python script has been invoked successfully
Traceback (most recent call last):
File "2.sh", line 4, in <module>
print(test_var)
NameError: name 'test_var' is not defined
The reason I am trying to do this is that I am more comfortable with bash and I want to use the $1 and $2 variables as in bash. Is this also possible in Python?
[EDIT] - I have just found out how to use $1 and $2 in Python: use sys.argv[1] and sys.argv[2] after importing the sys module (import sys).
To use environment variables from your python script you need to call:
import os
os.environ['test_var']
os.environ is a dictionary with all the environment variables; you can use all the methods a dict has. For instance, you could write:
os.environ.get('test_var', 'default_value')
Check the Python file's extension: it should be .py instead of .sh.
1.sh
#!/bin/bash
test_var="Test Variable"
export test_var
echo "1.sh has been executed"
python 2.py
The os module gives you access to the environment variables. The following Python code gives the required result:
#!/usr/bin/env python3
import os
print("The python script has been invoked successfully")
print(os.environ['test_var'])
For reference, see: How do I access environment variables from Python?

How to call a shell script function/variable from python?

Is there any way to call a shell script and use the functions/variables defined in the script from python?
The script is unix_shell.sh
#!/bin/bash
function foo
{
    ...
}
Is it possible to call this function foo from python?
Solution:
For functions: convert the shell functions to Python functions.
For shell local variables (non-exported), run this command in the shell, just before calling the Python script:
export $(set | tr '\n' ' ')
For shell global variables (exported from the shell), in Python you can:
import os
print(os.environ["VAR1"])
Yes, in a similar way to how you would call it from another bash script:
import subprocess
subprocess.check_output(['bash', '-c', 'source unix_shell.sh && foo'])
This can be done with subprocess. (At least this was what I was trying to do when I searched for this)
Like so:
output = subprocess.check_output(['bash', '-c', 'source utility_functions.sh; get_new_value 5'])
where utility_functions.sh looks like this:
#!/bin/bash
function get_new_value
{
    let "new_value=$1 * $1"
    echo $new_value
}
Here's how it looks in action...
>>> import subprocess
>>> output = subprocess.check_output(['bash', '-c', 'source utility_functions.sh; get_new_value 5'])
>>> print(output)
b'25\n'
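Note that check_output returns bytes; to get the plain string, decode it:
>>> print(output.decode().strip())
25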
No, that's not possible directly. You can execute a shell script, pass parameters on the command line, and have it print data out, which you can then parse from Python.
But that's not really calling the function. That's still executing bash with options and getting a string back on standard output.
That might do what you want. But it's probably not the right way to do it. There is not much that bash can do that Python cannot; implement the function in Python instead.
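For instance, the get_new_value helper from the answer above is one line of Python:
def get_new_value(x):
    # the same squaring the shell function does with let "new_value=$1 * $1"
    return x * x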
With the help of the above answer and this answer, I came up with this:
import subprocess
command = 'bash -c "source ~/.fileContainingTheFunction && theFunction"'
stdout = subprocess.getoutput(command)
print(stdout)
I'm using Python 3.6.5 in Ubuntu 18.04 LTS.
I do not know too much about Python, but if you use export -f foo after the shell script's function definition, then if you start a sub-bash, the function can be called. Without export, you need to source the shell script with . script.sh inside the sub-bash started from Python, but that will run everything in it and define all the functions and all the variables.
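A sketch of the export -f route, assuming foo is defined in unix_shell.sh:
$ source unix_shell.sh   # define foo in the current shell
$ export -f foo          # make foo visible to child bash processes
$ python3 -c "import subprocess; subprocess.run(['bash', '-c', 'foo'])"
The exported function is passed to the child bash through the environment, so the inner bash -c 'foo' can call it without sourcing the script again.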
You could separate each function into their own bash file. Then use Python to pass the right parameters to each separate bash file.
This may be easier than just re-writing the bash functions in Python yourself.
You can then call these functions using
import subprocess
subprocess.call(['bash', 'function1.sh'])
subprocess.call(['bash', 'function2.sh'])
# etc. etc.
You can use subprocess to pass parameters too.
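For example, arguments appended to the list show up as $1, $2, ... inside the script (function1.sh is the hypothetical file from above):
import subprocess
# inside function1.sh, "value" is available as $1
subprocess.call(['bash', 'function1.sh', 'value'])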

run program in Python shell

I have a demo file: test.py.
In the Windows Console I can run the file with: C:\>test.py
How can I execute the file in the Python Shell instead?
Use execfile for Python 2:
>>> execfile('C:\\test.py')
Use exec for Python 3:
>>> exec(open("C:\\test.py").read())
If you want to run the script and end at a prompt (so you can inspect variables, etc.), then use:
python -i test.py
That will run the script and then drop you into a Python interpreter.
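For example, using the test.py structure shown in the next answer:
$ python -i test.py
running main
>>> main()  # the script's names are now available at the prompt
running main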
It depends on what is in test.py. The following is an appropriate structure:
# suppose this is your 'test.py' file
def main():
    """This function runs the core of your program"""
    print("running main")

if __name__ == "__main__":
    # if you call this script from the command line (the shell) it will
    # run the 'main' function
    main()
If you keep this structure, you can run it like this in the command line (assume that $ is your command-line prompt):
$ python test.py
$ # it will print "running main"
If you want to run it from the Python shell, then you simply do the following:
>>> import test
>>> test.main() # this calls the main part of your program
There is no need to use the subprocess module if you are already using Python. Instead, try to structure your Python files so that they can be run both from the command line and from the Python interpreter.
For newer versions of Python:
exec(open(filename).read())
If you want to avoid writing all of this every time, you can define a function:
def run(filename):
    exec(open(filename).read())
and then call it
run('filename.py')
From the same folder, you can do:
import test
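Note that import runs the file's top-level code only once per interpreter session, since Python caches imported modules. To pick up changes after editing the file, reload it:
>>> import importlib
>>> importlib.reload(test)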
