How to resolve this bash-Python interaction?

I have 3 files; let's look at them one by one.
params.conf
[a]
[b]
[c]
[d]
[e]
parsing.py
from ConfigParser import SafeConfigParser
parser = SafeConfigParser()
parser.read('params.conf')
lst=parser.sections()
print lst
demo.sh
#! /bin/bash
value=$(python parsing.py)
echo "$value"
After running demo.sh, I should get the output ['a', 'b', 'c', 'd', 'e'], and I do.
But there are several problems when I move to the next level.
I want to use the list elements. At the moment I would need sed to parse the printed list inside the bash file, which I don't want; I would rather access the list as a bash array so that I can use it later.
Currently I am printing a single list, but I can't always do that: I could have multiple lists in my Python file, and I can't afford to print each of them and retrieve it from bash. So can I somehow return a variable from Python to bash? I am looking for some hack like sourcing one bash file from another.
Eventually I want to have multiple functions in my Python file and access a particular function's return value from bash. Can I achieve that? Rather than calling a whole Python file, can I call a specific function from it?

You do not need to use sed to parse the output of the Python script.
In the Python script, before printing the output, convert the list into a space-separated string (assuming no item in the list contains a space):
print " ".join(lst)
This output can then easily be looped over in the shell.
For me, the following script gives the output below.
#! /bin/bash
lst=$(python parsing.py)
for i in $lst
do
    echo $i
done
params.conf is the same as yours.
output
a
b
c
d
e
Hope this helps in solving some of your problems.
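If you specifically want a bash array rather than a flat string (as the question asks), here is a minimal sketch building on the same space-separated output (assuming no item contains spaces):
#! /bin/bash
# Split the space-separated output of the Python script into a bash array
read -r -a sections <<< "$(python parsing.py)"
echo "${#sections[@]} sections"   # number of elements
echo "${sections[0]}"             # first element, e.g. a
for s in "${sections[@]}"; do
    echo "$s"
done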
Also, calling a specific function in Python from a shell script can be achieved like below.
Python file with multiple functions (params.py, as used by the shell script below):
import sys

def foo():
    print 'i am foo'

def func():
    print 'i am func'

def bye():
    print 'i am bye'

if __name__ == "__main__":
    if len(sys.argv) == 2:
        x = getattr(sys.modules[__name__], sys.argv[1])
        x()
and the shell script, demo.sh (I am using a command-line argument to name the Python function to call):
#! /bin/bash
echo $(python params.py "$1")
output...
# ./demo.sh foo
i am foo
# ./demo.sh func
i am func
# ./demo.sh bye
i am bye
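Note that the function prints its value rather than returning it to the shell; bash can only capture standard output. A short usage sketch of the same demo (assuming the params.py above):
#! /bin/bash
value=$(python params.py foo)   # capture whatever foo() printed
echo "Got: $value"              # -> Got: i am foo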

Related

Pass arguments to python from bash script

I have been working on a mixture of a bash script and a Python script. The bash script can receive an unknown number of input arguments. For example:
tinify.sh test1.jpg test2.jpg test3.jpg .....
After the bash script receives them all, it passes these arguments to tinify.py. I have come up with two ways to do that:
Loop in bash and call python tinify.py testx.jpg for each file
In other words, python tinify.py test1.jpg, then python tinify.py test2.jpg, and finally python tinify.py test3.jpg
Pass all arguments to tinify.py, then loop in Python
But there is a problem: I want to filter out duplicate parameters. For example, if the user runs tinify.sh test1.jpg test1.jpg test1.jpg, I only want tinify.sh test1.jpg. So I think it's easier to do it the second way, because Python may be more convenient.
How can I pass all the arguments to the Python script? Thanks in advance!
In addition to Chepner's answer above:
#!/bin/bash
python tinify.py "$@"
Within the Python script, tinify.py:
import sys

inputArgs = sys.argv[1:]

def remove_duplicates(l):
    return list(set(l))

arguments = remove_duplicates(inputArgs)
The list arguments will contain the arguments passed to the Python script, with duplicates removed (a Python set cannot contain duplicate values).
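Note that set() does not preserve the original order of the arguments. If order matters, here is a possible order-preserving sketch (not part of the original answer):
from collections import OrderedDict

def remove_duplicates_keep_order(items):
    # OrderedDict keeps insertion order, so the first occurrence of each item wins
    return list(OrderedDict.fromkeys(items))

print(remove_duplicates_keep_order(['test1.jpg', 'test2.jpg', 'test1.jpg']))
# -> ['test1.jpg', 'test2.jpg']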
You use "$@" in tinify.sh:
#!/bin/bash
python tinify.py "$@"
It will be far easier to eliminate duplicates inside the Python script than to filter them out from the shell. (Of course, this raises the question whether you need a shell script at all.)
A Python program can accept any number of command-line arguments via sys.argv; just remember that sys.argv[0] is the name of the script and the actual arguments are contained in sys.argv[1:].
$ cat test_args.py
from sys import argv
prog_name = argv[0]
print('Program name:', prog_name)
for arg in argv[1:]:
    print(arg)
$ python test_args.py a b 'c d'
Program name: test_args.py
a
b
c d
$
Note that an argument containing spaces must be quoted according to shell syntax.
Your file tinify.py should start with the following (if you have two arguments):
import sys
arg1, arg2 = sys.argv[1], sys.argv[2]
sys.argv[0] is the name of the script itself. You can of course loop over sys.argv. Personally, I like to pass all the arguments as a JSON object and then call json.loads() on it.
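A minimal sketch of that JSON idea (the file names and the single-JSON-argument shape are illustrative assumptions, not part of the original answer):
tinify.sh
#!/bin/bash
# Encode all shell arguments as one JSON array and hand it to the Python script
args_json=$(python -c 'import json, sys; print(json.dumps(sys.argv[1:]))' "$@")
python tinify.py "$args_json"
tinify.py
import json
import sys

files = json.loads(sys.argv[1])   # decode the JSON array back into a list
print(files)                      # e.g. ['test1.jpg', 'test2.jpg']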

Pass a code of Python in a command line

Is it possible to somehow convert a string which I pass via the command line, e.g.
python myfile.py "s = list(range(1000))"
into a module? And then...
#myfile.py
def printlist(s):
    print(s)
?
Something like the timeit module, but with my own code.
python -m timeit -s "s = list(range(1000))" "sorted(s)"
Save the code
import sys
print(sys.argv)
as filename.py, then run python filename.py Hi there! and see what happens ;)
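With that file, python filename.py Hi there! should print something like ['filename.py', 'Hi', 'there!'].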
I'm not sure if I understand correctly, but the sys module has the attribute argv that is a list of all the arguments passed in the command line.
test.py:
#!/usr/bin/env python
import sys
for argument in sys.argv:
    print(argument)
This example will print every argument passed to the script; keep in mind that the first element, sys.argv[0], will always be the name of the script.
$ ./test.py hello there
./test.py
hello
there
There are two options for sending short code snippets, similar to what you can do via the timeit module or certain other -m module options:
You can use the -c option of Python, e.g.:
python -c "<code here>"
Or you could use piping, such as:
echo "<code here>" | python
You can combine multiple statements using the ; statement separator. However, if you need a block introduced by a colon, such as with while, for, def, or if, you cannot join it with ;, so the options are limited. There may be clever ways around this limitation, but I have yet to see them.
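One way around that along the piping route is a here-document, which preserves full multi-line syntax (a minimal sketch; the snippet itself is only an illustration):
#!/bin/bash
# Feed a multi-line snippet to Python on standard input
python <<'EOF'
s = list(range(10))
for x in s:
    if x % 2 == 0:
        print(x)
EOF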

How to import a Python file in a bash script? (to use a Python value in my bash script)

I would like to know if it's possible, in a bash script, to include a Python script in order to use (in the bash script) the return value of a function I wrote in my Python program.
For example:
my file "file.py" has a function which returns a value "my_value" (which represents the name of a file, but anyway)
I want to create a bash script which has to be able to execute a command line like "ingest my_value"
So do you know how to include a Python file in a bash script (import ...?) and how it is possible to use a value from a Python file inside a bash script?
Thank you in advance.
Update
Actually, my Python file looks like this:
import subprocess

class formEvents():
    def __init__(self):
        ...
    def myFunc1(self):  # function which returns the name of a file that the user chose on his computer
        ...
        return name_file
    def myFunc2(self):  # function which calls an existing bash script (bash_file.sh), writing name_file inside it (in the middle of a line)
        subprocess.call(['./bash_file.sh'])

if __name__ == "__main__":
    FE = formEvents()
I don't know if it's clear enough, but here is my problem: it is to be able to pass name_file into bash_file.sh.
Jordane
The easiest way of doing this is via the standard UNIX pipeline and your shell.
Here's an example:
foo.sh:
#!/bin/bash
my_value=$(python file.py)
echo $my_value
file.py:
#!/usr/bin/env python

def my_function():
    return "my_value"

if __name__ == "__main__":
    print(my_function())
The way this works is simple:
You launch foo.sh
Bash spawns a subprocess and runs python file.py
Python (while interpreting file.py) runs the function my_function and prints its return value to standard output
Bash captures the standard output of the Python process in my_value
Bash then simply echoes the value stored in my_value to standard output, and you should see "my_value" printed to the shell/terminal.
If the Python script prints the return value to the console, you should be able to just do this:
my_value=$(command)
Edit: Damn, beat me to it
Alternatively, you can make the bash script process arguments.
#!/usr/bin/bash
if [[ -n "$1" ]]; then
name_file="$1"
else
echo "No filename specified" >&2
exit 1
fi
# And use $name_file in your script
In Python, your subprocess call should be changed accordingly:
subprocess.call(['./bash_file.sh', name_file])

Execute the shell script by passing certain parameters to it using Python?

I am working on a project in which I am supposed to execute a shell script with Python.
I have written a simple program with which I can execute a shell script from my Python code. But now I need to pass certain parameters from my Python code to the shell script and then print out those parameters when the shell script is executed.
For simplicity's sake, I am currently executing a shell script which prints out Hello World, but now I want to pass the hostname, pp, and sp values to my shell script and print out those values from the shell script when it is executed by the Python client.
#!/usr/bin/python
import subprocess
import json
import socket
hostname = socket.gethostname()
jsonData = '{"desc": "some information about the host", "pp": [0,3,5,7,9], "sp": [1,2,4,6,8]}'
jj = json.loads(jsonData)
print jj['pp'] # printing it from Python program for now
print jj['sp'] # printing it from Python program for now
print hostname # printing it from Python program for now
# pass the above values to my shell script
jsonStr = '{"script":"#!/bin/bash\\necho Hello World\\n"}'
j = json.loads(jsonStr)
print "start"
subprocess.call(j['script'], shell=True)
print "end"
In general, I want to pass the hostname, pp, and sp values to my shell script as shown in jsonStr, and then print out those values from the shell script itself when I run my Python code.
So it should print out like this whenever I execute the shell script in jsonStr:
start
Hello world
[0, 3, 5, 7, 9]
[1, 2, 4, 6, 8]
myhostname
end
Is it possible to do this in Python?
There are two ways to pass variables to a script: as arguments, or through the environment.
Since you're trying to execute a script as if it were a giant command line (which won't actually work—especially if you're escaping your newlines as \\n so the shell sees the whole thing as one line—but let's pretend it would), you can't pass arguments, so you will need to pass an environment.
This is trivial:
import os

env = {}
env.update(os.environ)
# environment variables must be strings, so stringify the values coming from jj
env.update({k: str(v) for k, v in jj.items()})
subprocess.call('echo ${pp}', shell=True, env=env)
This will print out whatever was in jj['pp'], and return 0.
Why? Well, in bash, ${pp} means "whatever is in the environment variable pp". And we copied every key-value pair from jj into the env environment, so the environment variable pp has whatever value was in jj[pp]. (In some cases you might want to quote things, e.g. "${pp}", but for echo there's no reason to do that, and without knowing what you're going to do in your real-life code I can't guess what you might need.)
If you actually had a script that you wanted to call, stored in a file, then of course you could pass it arguments the same way you do with any other program you run via subprocess, and inside the script you could reference them as $1, etc.
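For example, a minimal sketch of that (the script name show_args.sh is an illustrative assumption, not from the question):
show_args.sh
#!/bin/bash
echo "host: $1"
echo "pp:   $2"
and on the Python side:
import subprocess
subprocess.call(['./show_args.sh', hostname, str(jj['pp'])])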
However, I don't see why you need to pass arguments to a script you're building on the fly. Just build the values into the script. While that's usually a bad idea, that's mainly because building a script on the fly is a bad idea, and you've already committed to that part for some reason. Format your strings in Python, where you have access to the full power of str.format or % (as you prefer), and the entire Python stdlib. For example:
import shlex  # on Python 2, use pipes.quote instead of shlex.quote

script = 'echo {}'.format(shlex.quote(str(jj['pp'])))
subprocess.call(script, shell=True)
Now the script doesn't have to do anything to access the value; it's hard-coded into the script.
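Putting it together for the output the question asks for (a sketch reusing the same format-and-quote approach; not from the original post):
script = 'echo Hello World\necho {}\necho {}\necho {}'.format(
    shlex.quote(str(jj['pp'])),
    shlex.quote(str(jj['sp'])),
    shlex.quote(hostname))
subprocess.call(script, shell=True)
This prints Hello World followed by the pp list, the sp list, and the hostname.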

Calling a Python function from a shell script

I am trying to figure out how to call a Python function from a shell script.
I have a Python file with multiple functions, and I need to use the values returned by them in my shell script. Is there a way to do it?
I am doing this in order to read a config file using Python and get the values in the shell. Is there any better way to achieve this?
test.py contains:
import ConfigParser

config = ConfigParser.ConfigParser()
config.read("test.conf")

def get_foo():
    return config.get("locations", "foo")

def get_bar():
    return config.get("locations", "bar")
I need to store the values returned by the Python functions in a shell variable.
You can send the result of your functions to the standard output by asking the Python interpreter to print the result:
python -c 'import test; print test.get_foo()'
The -c option simply asks Python to execute some Python commands.
In order to store the result in a variable, you can therefore do:
RESULT_FOO=`python -c 'import test; print test.get_foo()'`
or, equivalently
RESULT=$(python -c 'import test; print test.get_foo()')
since backticks and $(…) evaluate a command and replace it by its output.
PS: Getting the result of each function requires parsing the configuration file each time, with this approach. This can be optimized by returning all the results in one go, with something like:
ALL_RESULTS=$(python -c 'import test; print test.get_foo(), test.get_bar()')
The results can then be split and put in different variables with
RESULT_BAR=$(echo $ALL_RESULTS | cut -d' ' -f2)
which takes the second result and puts it in RESULT_BAR for example (and similarly: -fn for result #n).
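An alternative to cut, assuming the individual results contain no spaces, is to let read split them in one step (a small sketch):
read RESULT_FOO RESULT_BAR <<< "$ALL_RESULTS"
echo "foo=$RESULT_FOO bar=$RESULT_BAR"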
PPS: As Pablo Maurin mentioned, it would probably be easier to do everything in a single interpreter (Python, but maybe also the shell), if possible, instead of calculating variables in one program and using them in another one.
