Is it possible to somehow convert a string that I pass via the command line, e.g.
python myfile.py "s = list(range(1000))"
into a module? And then...
#myfile.py
def printlist(s):
    print(s)
?
Something like the timeit module, but with my own code.
python -m timeit -s "s = list(range(1000))" "sorted(s)"
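A minimal sketch of that idea (not from the question itself): run the string from sys.argv[1] with exec() and hand the result to your own function. The variable name s is an assumption about what the passed snippet defines.
#myfile.py
import sys

def printlist(s):
    print(s)

if __name__ == "__main__":
    namespace = {}
    exec(sys.argv[1], namespace)   # e.g. sys.argv[1] == "s = list(range(1000))"
    printlist(namespace["s"])      # assumes the snippet defined a variable named s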
Save the code
import sys
print(sys.argv)
as filename.py, then run python filename.py Hi there! and see what happens ;)
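Running it should print something like:
$ python filename.py Hi there!
['filename.py', 'Hi', 'there!']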
I'm not sure if I understand correctly, but the sys module has an attribute argv, which is a list of all the arguments passed on the command line.
test.py:
#!/usr/bin/env python
import sys
for argument in sys.argv:
    print(argument)
This example will print every argument passed to the script. Keep in mind that the first element, sys.argv[0], will always be the name of the script.
$ ./test.py hello there
./test.py
hello
there
There are two options for running short code snippets from the command line, similar to what the timeit module (and some other -m module options) offers:
You can use either the -c option of Python, e.g.:
python -c "<code here>"
Or you can pipe the code to Python, such as:
echo <code here> | python
You can combine multiple statements using the semicolon (;) statement separator. However, if you use a compound statement that needs a colon, such as while, for, def, or if, you cannot simply continue with ;, so the options are somewhat limited. There may be clever ways around this limitation, but I have yet to see them.
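For what it's worth, one workaround is that the string passed to -c (or piped in) may contain real newlines, so a compound statement works if you quote a multi-line snippet, for example:
python -c '
for i in range(3):
    print(i)
'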
Related
I know that I can run a python script from my bash script using the following:
python python_script.py
But what if I wanted to pass a variable/argument to my Python script from my bash script? How can I do that?
Basically, bash will work out a filename and then Python will upload it, but I need to send the filename from bash to Python when I call it.
To execute a Python script from a bash script, you call the same command that you would use in a terminal. For instance:
> python python_script.py var1 var2
To access these arguments within Python you will need:
import sys
print(sys.argv[0]) # prints python_script.py
print(sys.argv[1]) # prints var1
print(sys.argv[2]) # prints var2
Besides sys.argv, also take a look at the argparse module, which helps define options and arguments for scripts.
The argparse module makes it easy to write user-friendly command-line interfaces.
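A minimal argparse sketch (the option and argument names here are just an illustration):
import argparse

parser = argparse.ArgumentParser(description="Example script")
parser.add_argument("filename", help="a positional argument")
parser.add_argument("-n", "--count", type=int, default=1, help="an optional argument")
args = parser.parse_args()

print(args.filename, args.count)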
Use
python python_script.py filename
and in your Python script
import sys
print(sys.argv[1])
Embedded option:
Wrap python code in a bash function.
#!/bin/bash
function current_datetime {
python - <<END
import datetime
print(datetime.datetime.now())
END
}
# Call it
current_datetime
# Call it and capture the output
DT=$(current_datetime)
echo Current date and time: $DT
Use environment variables to pass data into your embedded Python script.
#!/bin/bash
function line {
PYTHON_ARG="$1" python - <<END
import os
line_len = int(os.environ['PYTHON_ARG'])
print('-' * line_len)
END
}
# Do it one way
line 80
# Do it another way
echo $(line 80)
http://bhfsteve.blogspot.se/2014/07/embedding-python-in-bash-scripts.html
Use in the script:
echo $(python python_script.py arg1 arg2) > /dev/null
or
python python_script.py "string arg" > /dev/null
The script will be executed without output.
I have a bash script that calls a small Python routine to display a message window. As I need to use killall to stop the Python script, I can't use the above method, since that would mean running killall python, which could take out other Python programs. So I use:
pythonprog.py "$argument" &  # the & returns control straight to the bash script
As long as the Python script will run from the CLI by name, rather than as python pythonprog.py, this works within the script. If you need more than one argument, just use a space between each one within the quotes.
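A sketch of that setup, assuming pythonprog.py starts with a Python shebang and has been made executable (chmod +x pythonprog.py), so killall can target it by its own name:
#!/bin/bash
./pythonprog.py "$argument" &    # & hands control straight back to this script
# ... later, stop only that script, without touching other Python programs:
killall pythonprog.py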
Also take a look at the getopt module. It works quite well for me!
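A minimal getopt sketch (the option names here are just an illustration):
import getopt
import sys

try:
    # -o/--output takes a value, -h/--help does not
    opts, args = getopt.getopt(sys.argv[1:], "ho:", ["help", "output="])
except getopt.GetoptError as err:
    print(err)
    sys.exit(2)

for opt, value in opts:
    if opt in ("-h", "--help"):
        print("usage: script.py [-h] [-o FILE] arguments...")
    elif opt in ("-o", "--output"):
        print("output file:", value)

print("remaining arguments:", args)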
Print all args without the filename:
import sys

for i in range(1, len(sys.argv)):
    print(sys.argv[i])
I'm trying to write a simple script, but I can't figure out how to pass a variable to the command that I need:
#!/bin/bash
for i in 1 2 3
do
python -c 'print "a"*' $i
done
If you really want to go on with your solution using python -c, you would have to remove the space:
#!/bin/bash
for i in 1 2 3
do
python -c 'print "a"*'$i
done
But the approach suggested by Asmox makes more sense to me (and it has nothing to do with the question of where you pipe the standard output).
Maybe this topic will help:
How do I run Python script using arguments in windows command line
Have you considered making a whole script? Like this one:
import sys

def f(a):
    print(a)

if __name__ == "__main__":
    a = int(sys.argv[1])
    f(a)
This example might seem overcomplicated, but I wanted to show how to pass a parameter from the console to a function.
I have tried the solution provided above and had to modify it for Python 3+.
Due to a lack of sufficient points I am not allowed to comment, which is why I posted my solution separately.
#!/bin/bash
for i in {1..3}
do
python -c "print ('a'* ${i})"
done
Output:
a
aa
aaa
I have been working on a script that mixes bash and Python. The bash script can receive an unknown number of input arguments. For example:
tinify.sh test1.jpg test2.jpg test3.jpg .....
After bash receives them all, it passes these arguments to tinify.py. I have come up with two ways to do that:
Loop in bash and call python tinify.py testx.jpg
In other words, python tinify.py test1.jpg, then python tinify.py test2.jpg, and finally python tinify.py test3.jpg
Pass all arguments to tinify.py then loop in python
But there is a problem: I want to filter out duplicate parameters. For example, if the user inputs tinify.sh test1.jpg test1.jpg test1.jpg, I only want tinify.sh test1.jpg. So I think it's easier to do it the second way, because Python may be more convenient.
How can I pass all the arguments to the Python script? Thanks in advance!
In addition to Chepner's answer above:
#!/bin/bash
python tinify.py "$@"
Within the Python script, tinify.py:
from sys import argv

inputArgs = argv[1:]

def remove_duplicates(l):
    return list(set(l))

arguments = remove_duplicates(inputArgs)
The list arguments will contain the arguments passed to the Python script, with duplicates removed, since a set can't contain duplicate values in Python.
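If the order of the files matters, a small variation (assuming Python 3.7+, where dicts keep insertion order) removes duplicates while preserving the first occurrence of each argument:
arguments = list(dict.fromkeys(inputArgs))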
You use "$@" in tinify.sh:
#!/bin/bash
python tinify.py "$@"
It will be far easier to eliminate duplicates inside the Python script than to filter them out from the shell. (Of course, this raises the question whether you need a shell script at all.)
A Python program can accept any number of command-line arguments via sys.argv; just remember that sys.argv[0] is the name of the script and the actual arguments are contained in sys.argv[1:].
$ cat test_args.py
from sys import argv
prog_name = argv[0]
print('Program name:', prog_name)
for arg in argv[1:]:
print(arg)
$ python test_args.py a b 'c d'
Program name: test_args.py
a
b
c d
$
Note that an argument containing spaces must be quoted according to the shell syntax.
Your file tinify.py should start with the following (if you have two arguments):
import sys
arg1, arg2 = sys.argv[1], sys.argv[2]
sys.argv[0] is the name of the script itself. You can of course loop over sys.argv. Personally, I like to pass all the arguments as a JSON object, so afterwards I can just do json.loads().
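A minimal sketch of that JSON idea (the payload shape here is only an illustration):
import json
import sys

# expects a single JSON argument, e.g.:
#   python tinify.py '{"files": ["test1.jpg", "test2.jpg"]}'
payload = json.loads(sys.argv[1])
print(payload["files"])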
So, to be more precise, what I am trying to do is:
read a full shell command as the argument of my Python script, like: python myPythonScript.py ls -Fl
call that command within my Python script whenever I'd like (loop over some folders and apply the command, etc.)
I tried this:
import subprocess
from optparse import OptionParser
from subprocess import call

def execCommand(cmd):
    call(cmd)

if __name__ == '__main__':
    parser = OptionParser()
    (options, args) = parser.parse_args()
    print(args)
    execCommand(args)
The result is that now I can do python myPythonScript.py ls, but I don't know how to add options. I know I can use parser.add_option, but I don't know how to make it work for all options: I don't want to make only specific options available, but all possible options, depending on the command I am running.
Can I use something like parser.add_option('-*')? How can I parse the options then and call the command with its options?
EDIT
I need my program to parse all types of commands passed as arguments: python myScript.py ls -Fl, python myScript.py git pull, python myScript.py rm -rf *, etc.
OptionParser is useful when your own program wants to process the arguments: it helps you turn string arguments into booleans or integers or list items or whatever. In your case, you just want to pass the arguments on to the program you're invoking, so don't bother with OptionParser. Just pass the arguments as given in sys.argv.
subprocess.call(sys.argv[1:])
Depending on how much your program depends on command-line arguments, you can go with a simple route.
Simple way of reading command line arguments
Use sys to obtain all the arguments passed on the command line:
import sys
print(sys.argv[1:])
Then you can use subprocess to execute it.
from subprocess import call
# e.g. call(["ls", "-l"])
call(sys.argv[1:])
The sample below works fine for me.
import sys
from subprocess import call
print(sys.argv[1:])
call(sys.argv[1:])
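On Python 3.5+, subprocess.run is the usual replacement for call; a minimal sketch that also propagates the wrapped command's exit status:
import sys
import subprocess

# forward everything after the script name to the wrapped command
result = subprocess.run(sys.argv[1:])
sys.exit(result.returncode)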
I am trying to figure out how to call a Python function from a shell script.
I have a Python file with multiple functions and I need to use the values returned by them in my shell script. Is there a way to do it?
I am doing this in order to read a config file using Python and get the values in the shell. Is there a better way to achieve this?
test.py contains:
import configparser

config = configparser.ConfigParser()
config.read("test.conf")

def get_foo():
    return config.get("locations", "foo")

def get_bar():
    return config.get("locations", "bar")
I need to store the values returned by the Python functions in a shell variable.
You can send the result of your functions to the standard output by asking the Python interpreter to print the result:
python -c 'import test; print(test.get_foo())'
The -c option simply asks Python to execute some Python commands.
In order to store the result in a variable, you can therefore do:
RESULT_FOO=`python -c 'import test; print(test.get_foo())'`
or, equivalently
RESULT=$(python -c 'import test; print(test.get_foo())')
since backticks and $(…) evaluate a command and replace it by its output.
PS: Getting the result of each function requires parsing the configuration file each time, with this approach. This can be optimized by returning all the results in one go, with something like:
ALL_RESULTS=$(python -c 'import test; print(test.get_foo(), test.get_bar())')
The results can then be split and put in different variables with
RESULT_BAR=$(echo $ALL_RESULTS | cut -d' ' -f2)
which takes the second result and puts it in RESULT_BAR for example (and similarly: -fn for result #n).
PPS: As Pablo Maurin mentioned, it would probably be easier to do everything in a single interpreter (Python, but maybe also the shell), if possible, instead of calculating variables in one program and using them in another one.