How to insert a python program into a bash script?

I have a small python program that parses a text file and writes the output to another file. Currently, I am writing a bash script to call this program several times; it looks something like this:
for i in $(seq 1 100); do
    python /home/Documents/myProgram.py myTextFile$i
done
This works fine, but I want to know whether it is possible to have the python program inside the bash script file itself, so that another user who runs the script doesn't need to have the python program as a separate file; i.e., is it possible to copy and paste the python program into the bash script and run it from within the script itself?

#!/bin/bash
python3 - 1 2 3 << 'EOF'
import sys
print('Argument List:', str(sys.argv))
EOF
Output:
Argument List: ['-', '1', '2', '3']
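If the goal is a single self-contained file, the same heredoc trick combines with the loop from the question. Here is a minimal sketch, assuming the actual parsing code from myProgram.py is pasted where the placeholder comment sits; the filename is passed on python3's command line, so it shows up as sys.argv[1] inside the embedded program:
#!/bin/bash
for i in $(seq 1 100); do
    python3 - "myTextFile$i" << 'EOF'
import sys

infile = sys.argv[1]
# ... paste the parsing/writing code from myProgram.py here ...
print('would process', infile)
EOF
done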

I think you should be able to put:
python << END
[Python code here]
END
But I have not tested this.

For simple scripts you can also run python with the -c option. For example:
python -c "x=1; print(x); print('great program eh')"
I wouldn't recommend writing anything too complicated, but it could work for something simple.

Pardon the thread necromancy, but here's a technique that hasn't been mentioned yet and may be useful to someone.
#!/bin/bash
""":" # Hide bash from python
for i in $(seq 1 100); do
    python3 "$(readlink -f "$0")" myTextFile$i
done
exit
"""
# Python script starts here
import sys
print('Argument List:', str(sys.argv))
However, I do agree with the general recommendation to just do the loop in Python.
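For reference, here is a minimal sketch of that recommendation; parse_file() is just a hypothetical stand-in for whatever myProgram.py actually does:
#!/usr/bin/env python3
def parse_file(path):
    # hypothetical placeholder for the real parsing/writing logic
    print('processing', path)

if __name__ == '__main__':
    for i in range(1, 101):
        parse_file('myTextFile%d' % i)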

How to pass arguments to python script? [duplicate]

I know that I can run a python script from my bash script using the following:
python python_script.py
But what if I wanted to pass a variable/argument to my python script from my bash script? How can I do that?
Basically, bash will work out a filename and then python will upload it, but I need to send the filename from bash to python when I call it.
To execute a python script from a bash script you need to call the same command that you would within a terminal. For instance:
python python_script.py var1 var2
To access these variables within python you will need:
import sys
print(sys.argv[0]) # prints python_script.py
print(sys.argv[1]) # prints var1
print(sys.argv[2]) # prints var2
Besides sys.argv, also take a look at the argparse module, which helps define options and arguments for scripts.
The argparse module makes it easy to write user-friendly command-line interfaces.
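For example, here is a minimal argparse sketch; the option and argument names are made up for illustration:
import argparse

parser = argparse.ArgumentParser(description='Upload a file.')
parser.add_argument('filename', help='file to upload')
parser.add_argument('--verbose', action='store_true', help='print progress')
args = parser.parse_args()

print(args.filename)
if args.verbose:
    print('verbose output enabled')
Called as python python_script.py somefile --verbose, it prints somefile and then the verbose message, and python python_script.py -h gives a usage screen for free.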
Use
python python_script.py filename
and in your Python script
import sys
print(sys.argv[1])
Embedded option:
Wrap python code in a bash function.
#!/bin/bash
function current_datetime {
    python - <<END
import datetime
print(datetime.datetime.now())
END
}

# Call it
current_datetime

# Call it and capture the output
DT=$(current_datetime)
echo "Current date and time: $DT"
Use environment variables to pass data into your embedded python script.
#!/bin/bash
function line {
    PYTHON_ARG="$1" python - <<END
import os
line_len = int(os.environ['PYTHON_ARG'])
print('-' * line_len)
END
}
# Do it one way
line 80
# Do it another way
echo $(line 80)
http://bhfsteve.blogspot.se/2014/07/embedding-python-in-bash-scripts.html
Use this in the script:
echo $(python python_script.py arg1 arg2) > /dev/null
or
python python_script.py "string arg" > /dev/null
The script will be executed without output.
I have a bash script that calls a small python routine to display a message window. As I need to use killall to stop the python script, I can't use the above method, as it would then mean running killall python, which could take out other python programs, so I use:
pythonprog.py "$argument" &  # the & returns control straight to the bash script
As long as the python script will run from the command line by name (rather than via python pythonprog.py), this works within the script. If you need more than one argument, just use a space between each one within the quotes.
Also take a look at the getopt module (see the sketch below).
It works quite well for me!
Print all args without the script name:
import sys

for i in range(1, len(sys.argv)):
    print(sys.argv[i])
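Since getopt was mentioned above without an example, here is a minimal sketch of it; the -f/--file option is made up for illustration:
import getopt
import sys

# parse a hypothetical -f/--file option plus any remaining arguments
opts, remaining = getopt.getopt(sys.argv[1:], 'f:', ['file='])
filename = None
for opt, value in opts:
    if opt in ('-f', '--file'):
        filename = value

print('file:', filename)
print('remaining arguments:', remaining)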

Bash script loop on python variable

I'm trying to write a simple script, but I don't understand how to pass a variable to the command that I need:
#!/bin/bash
for i in 1 2 3
do
    python -c 'print "a"*' $i
done
If you really want to go on with your solution using python -c, you would have to remove the space:
#!/bin/bash
for i in 1 2 3
do
    python -c 'print "a"*'$i
done
But the approach suggested by Asmox makes more sense to me (and it has nothing to do with where you pipe the standard output).
Maybe this topic will help:
How do I run Python script using arguments in windows command line
Have you considered making a whole script? Like this one:
import sys

def f(a):
    print(a)

if __name__ == "__main__":
    a = int(sys.argv[1])
    f(a)
This example might seem overcomplicated, but I wanted to show how to pass a parameter from the console to a function.
I have tried the solution provided above and had to modify it for python3+.
Due to lack of sufficient points I am not allowed to comment, that's why I posted my solution separately.
#!/bin/bash
for i in {1..3}
do
python -c "print ('a'* ${i})"
done
Output:
a
aa
aaa

accessing python dictionary from bash script

I am invoking the bash script from a python script.
I want the bash script to add an element to the dictionary "d" in the python script.
abc3.sh:
#!/bin/bash
rank=1
echo "plugin"
function reg()
{
    if [ "$1" == "what" ]; then
        python -c 'from framework import data;data(rank)'
        echo "iamin"
    else
        plugin
    fi
}
plugin()
{
    echo "i am plugin one"
}
reg $1
Python file:
import sys,os,subprocess
from collections import *

subprocess.call(["./abc3.sh what"],shell=True,executable='/bin/bash')

def data(rank,check):
    d[rank]["CHECK"]=check
    print d[1]["CHECK"]
If I understand correctly, you have a python script that runs a shell script, which in turn runs a new python script. And you'd want the second python script to update a dictionary in the first script. That will not work.
When you run your first python script, it will create a new python process, which will interpret each instruction from your source script.
When it reaches the instruction subprocess.call(["./abc3.sh what"],shell=True,executable='/bin/bash'), it will spawn a new shell (bash) process which will in turn interpret your shell script.
When the shell script reaches python -c <commands>, it invokes a new python process. This process is independent from the initial python process (even if you run the same script file).
Because each of these scripts runs in a different process, they don't have access to each other's data (the OS makes sure that each process is independent from the others, except via specific inter-process communication methods).
What you need to do: use some kind of inter-process mechanism, so that the initial python script gets data from the shell script. You may for example read data from the shell's standard output, using https://docs.python.org/3/library/subprocess.html#subprocess.check_output
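For example, here is a minimal check_output sketch along those lines, using the abc3.sh script from the question (the dictionary key is made up):
import subprocess

# run the shell script and capture whatever it writes to stdout
output = subprocess.check_output(['bash', './abc3.sh', 'what'])

d = {}
d['CHECK'] = output.decode().strip()
print(d)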
Let's suppose that you have a shell plugin that echoes the value:
echo $1 12
The mockup python script looks like this (I'm on Windows/MSYS2, by the way, hence the paths that will look strange to a Linux user):
import subprocess

p = subprocess.Popen(args=[r'C:\msys64\usr\bin\sh.exe', "-c", "C:/users/jotd/myplugin.sh myarg"],
                     stdout=subprocess.PIPE, stderr=subprocess.PIPE)
o, e = p.communicate()
p.wait()
if len(e):
    print("Warning: error found: " + e.decode())
result = o.strip()
d = dict()
d["TEST"] = result
print(d)
It prints the dictionary, proving that the argument has been passed to the shell and came back processed.
Note that stderr has been captured separately to avoid it being mixed up with the results, but it is printed to the console if an error occurs.
{'TEST': b'myarg 12'}

Running code with another interpreter on a Perl script

This thread discusses a way of running Python code from within a Bash script.
Is there any way to do something similar from within a Perl script? I.e., is there any way to run Python code typed directly in a Perl script? Note that I am not asking about running a Python file from a Perl script. I am asking about running Python code typed within the same file that contains the Perl script (in the same way that the other thread discussed running Python code from within a Bash script).
Example:
#!/usr/bin/perl
use 5.010;
my $some_perl_variable = 'hello';
# ... BEGIN PYTHON BLOCK ...
# We are still in the same file. But we are now running Python code
import sys;
print some_perl_variable # Notice that this is a perl variable
for r in range(3):
    print r
# ... END PYTHON BLOCK ...
say "We are done with the Perl script!";
say "The output of the Python block is:";
print $output;
1;
Should print:
We are done with the Perl script!
The output of the Python block is:
hello
1
2
3
We are done with the perl script
It sounds like you would be interested in the Inline module. It allows Perl to call code in many other languages, and relies on support modules for each language.
You don't say what you want to do, but you mention Python and there is an Inline::Python.
Yes, the same technique (here-docs) can be used for Perl.
Perl in Bash:
perl <<'END' # note single quotes to avoid $variable interpolation
use 5.010;
say "hello world";
END
or
perl -E'say "hello from perl"'
Bash in Perl:
use autodie; # automatic error handling, so less explicit error handling needed
open my $bash, "|-", "bash";
print $bash <<'END'; # single quotes again
echo hello from bash
END
Perl in Bash in Perl:
use autodie; # automatic error handling, so less explicit error handling needed
open my $bash, "|-", "bash";
print $bash <<'END'; # single quotes again
perl <<'INNER_END'
use 5.010;
say "hello inception";
INNER_END
END
(which I ironically tested on the commandline, in another heredoc)

I want to have raw_input and nohup together in python

I am dealing with a large data set and my script takes some days to run; therefore, I use nohup to run it in the terminal.
This time I need to first get a raw_input from the terminal, and then have my code run under nohup. Any suggestion on how I can do that?
So first I need to get input from the terminal, like this:
$ python myprogram.py
enter_input: SOMETHING
then the process should be like this:
$ nohup python myprogram.py &
But I want to do this in one step via terminal. I hope my explanation is clear :)
Here's one more option, in case you want to stick with the user-friendly nature of the input box. I did something like this because I needed a password field, and didn't want the user to have to display their password in the terminal. As described here, you can create a small wrapper shell script with input boxes (with or without the -s option to hide), and then pass those variables via the sys.argv solution above. Something like this, saved in an executable my_program.sh:
#!/bin/bash
echo enter_input:
read input
echo enter_password:
read -s password
nohup python myprogram.py "$input" "$password" &
Now, running ./my_program.sh will behave exactly like your original python myprogram.py
I think your program shouldn't read its input from stdin, but should instead be given its data via the command line.
So instead of
startdata = raw_input('enter_input:')
you do
import sys
startdata = sys.argv[1]
and you start your program with
$ nohup python myprogram.py SOMETHING &
and all works the way you want - if I get you right.
You could make your process fork to the background after reading the input (see the sketch below). By far the easier variant, though, is to start your process inside tmux or GNU screen.
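If you do want the fork-after-input route, here is a minimal sketch; do_long_run() is a hypothetical stand-in for the real work:
import os
import sys

startdata = raw_input('enter_input: ')  # input() on Python 3

pid = os.fork()
if pid > 0:
    # parent: report the child's PID and give the terminal back
    print('running in background, PID %d' % pid)
    sys.exit(0)

# child: start a new session so closing the terminal does not kill it,
# and send Python-level output to a log file, roughly what nohup does
os.setsid()
sys.stdout = open('out.log', 'a', 1)
sys.stderr = sys.stdout
do_long_run(startdata)  # hypothetical long-running work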
