How do I pass a function definition to a Python script as a string - python

I want to pass a function definition to a Python command line script. What is the best way to do this? I am using Python 2. Suppose I have a script like this:
#myscript.py
x = load_some_data()
my_function = load_function_definition_from_command_line()
print my_function(x)
And I want to call it like this: python myscript.py 'def fun(x): return len(x)'
How do I perform the load_function_definition_from_command_line part?
I imagine a workaround:
get the string function definition from command line
write it to a file with .py extension in some temp directory
load the definition from file using solutions from this question: How to import a module given the full path?
execute
cleanup
But I am sure there must be a better way.
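For what it's worth, that workaround would look roughly like this (an untested sketch; imp.load_source and the temp-file handling are just one way to do the middle steps, and it assumes the passed definition is named fun):
import imp
import os
import shutil
import sys
import tempfile

def load_function_definition_from_command_line():
    source = sys.argv[1]                            # e.g. 'def fun(x): return len(x)'
    tmpdir = tempfile.mkdtemp()
    path = os.path.join(tmpdir, 'cmdline_func.py')
    with open(path, 'w') as f:
        f.write(source + '\n')
    module = imp.load_source('cmdline_func', path)  # import the temp file as a module
    shutil.rmtree(tmpdir)                           # cleanup (also removes any .pyc)
    return module.fun                               # assumes the function is called 'fun'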

You can use eval to run code defined in a string. Like so:
import sys
x = load_some_data()
function = eval("".join(sys.argv[1:]))
print(function(x))
With your specific example, though, note that eval only accepts an expression, so you would have to pass something like lambda x: len(x) instead of a def statement.
As #Jan-Spurny rightly points out: "Never, never, never use eval unless you're absolutely sure there is no other way. And even then you should stop and think again."
In my mind the better strategy would be to turn the data loader and executor into a module with a method that takes a function as an argument and runs the desired code. The end result would look something like this:
import data_loader_and_executor
def function(x):
    return len(x)
data_loader_and_executor.run(function)
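Where data_loader_and_executor could be as simple as this (a sketch; load_some_data stands in for however you actually load your data):
# data_loader_and_executor.py (sketch)
def load_some_data():
    # placeholder for your real data loading
    return [1, 2, 3]

def run(func):
    x = load_some_data()
    print(func(x))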

You can use eval or exec to create a function in your current namespace.
exec "somefunc(x): return x * 2"
somefunc(2) # 2
Example within your context
python load_function.py "def f(x): return x * 2"
# load_function.py
import sys
exec sys.argv[1]
print f(2)
Command line output:
4
Edit: Obligatory, "It is not wise to execute user input like this."

Use the exec function:
import sys
def load_function_definition_from_command_line():
    exec(sys.argv[1])
    return locals()['fun']
Of course you have to know how your function will be named, but this can be done by passing a second argument to your script:
$ python myscript.py 'def fun(x): return len(x)' fun
And then your function will look like:
import sys
def load_function_definition_from_command_line():
    exec(sys.argv[1])
    return locals()[sys.argv[2]]
!!Remember though, that evaluating user input is very dangerous!!
Edit: Since fun would be the only object defined in locals, you can just return the first value in locals() (in Python 2, values() returns a list):
def load_function_definition_from_command_line():
    exec(sys.argv[1])
    return locals().values()[0]
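A variant of the same idea (just a sketch, not part of the answer above) is to exec into a dedicated dictionary, which avoids digging through locals():
import sys

def load_function_definition_from_command_line():
    namespace = {}
    exec(sys.argv[1], namespace)   # the new function lands in our own dict
    return namespace[sys.argv[2]]  # second argument still gives the function name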

The most obvious source for the correct answer on how to do this is the timeit module in Python's standard library.
It is invoked like this:
$ python -m timeit '"-".join(str(n) for n in range(100))'
and you can find the source code here, which uses compile and exec to run the code passed on the command line:
https://hg.python.org/cpython/file/2.7/Lib/timeit.py#l143
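Stripped down, that pattern looks roughly like this (a simplified sketch, not the actual timeit source; it assumes the passed definition is named fun):
import sys

src = sys.argv[1]                              # e.g. 'def fun(x): return len(x)'
code = compile(src, '<command line>', 'exec')  # compile gives nicer tracebacks
namespace = {}
exec(code, namespace)                          # run the definition in its own namespace
fun = namespace['fun']                         # assumes the function is named 'fun'
print(fun([1, 2, 3]))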

Related

How to run different python functions from command line

I'm trying to run different functions from a Python script (some with arguments and some without).
So far I have
import sys

def math(x):
    ans = 2*x
    print(ans)

def function1():
    print("hello")

if __name__ == '__main__':
    globals()[sys.argv[1]]()
and in the command line if I type python scriptName.py math(2)
I get the error
File "scriptName.py", line 28, in <module>
globals()[sys.argv[1]]()
KeyError: 'math(2)'
New to Python and programming, so any help would be appreciated. This is also a general example...my real script will have a lot more functions.
Thank you
Try this!
import argparse

def math(x):
    try:
        print(int(x) * 2)
    except ValueError:
        print(x, "is not a number!")

def function1(name):
    print("Hello!", name)

if __name__ == '__main__':
    # if you type --help
    parser = argparse.ArgumentParser(description='Run some functions')
    # Add a command
    parser.add_argument('--math', help='multiply the integer by 2')
    parser.add_argument('--hello', help='say hello')
    # Get our arguments from the user
    args = parser.parse_args()
    if args.math:
        math(args.math)
    if args.hello:
        function1(args.hello)
You run it from your terminal like so:
python script.py --math 5 --hello ari
And you will get
>> 10
>> Hello! ari
You can use --help to describe your script and its options
python script.py --help
Will print out
Run some functions
optional arguments:
  -h, --help     show this help message and exit
  --math MATH    multiply the integer by 2
  --hello HELLO  say hello
Read More: https://docs.python.org/3/library/argparse.html
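If you end up with many functions, argparse sub-commands are another way to structure this (a sketch along the same lines, not part of the answer above):
import argparse

def math(x):
    print(int(x) * 2)

def function1(name):
    print("Hello!", name)

if __name__ == '__main__':
    parser = argparse.ArgumentParser(description='Run some functions')
    subparsers = parser.add_subparsers(dest='command')
    math_parser = subparsers.add_parser('math', help='multiply the integer by 2')
    math_parser.add_argument('x')
    hello_parser = subparsers.add_parser('hello', help='say hello')
    hello_parser.add_argument('name')
    args = parser.parse_args()
    if args.command == 'math':
        math(args.x)
    elif args.command == 'hello':
        function1(args.name)
You would then call it as python script.py math 5 or python script.py hello ari.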
Here is another approach you can take:
#!/usr/bin/env python3
"""Safely run Python functions from command line.
"""
import argparse
import ast
import operator

def main():
    # parse arguments
    parser = argparse.ArgumentParser(description=__doc__)
    parser.add_argument("function", help="Python function to run.")
    parser.add_argument("args", nargs='*')
    opt = parser.parse_args()
    # try to get the function from the operator module
    try:
        func = getattr(operator, opt.function)
    except AttributeError:
        raise AttributeError(f"The function {opt.function} is not defined.")
    # try to safely eval the arguments
    try:
        args = [ast.literal_eval(arg) for arg in opt.args]
    except SyntaxError:
        raise SyntaxError(f"The arguments to {opt.function} "
                          f"were not properly formatted.")
    # run the function and pass in the args, print the output to stdout
    print(func(*args))

if __name__ == "__main__":
    main()
Then you can execute this by doing the following:
./main.py pow 2 2
4
We use the argparse module from Python's Standard Library to facilitate the parsing of arguments here. The usage for the script is below:
usage: main.py [-h] function [args [args ...]]
function is the name of the function you want to run. The way this is currently structured is to pull functions from the operator module, but this is just an example. You can easily create your own file containing functions and use that instead, or just pull them from globals().
Following the function name you can supply any number of arguments. Those arguments will be run through ast.literal_eval to safely parse them and get the corresponding types.
The cool thing about this is your arguments are not strictly limited to strings and numbers. You can pass in any literal. Here is an example with a tuple:
./main.py getitem '(1, 2, 3)' 1
2
These arguments are then passed to the selected function, and the output is printed to stdout. Overall, this gives you a pretty flexible framework in which you can easily expand the functionality. Plus, it avoids having to use eval, which greatly reduces the risk of the kind of thing described below.
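For example, swapping the operator module for your own dispatch table might look roughly like this (a sketch; double and greet are made-up example functions):
#!/usr/bin/env python3
"""Sketch: run functions from a hand-written dispatch table instead of operator."""
import argparse
import ast

def double(x):            # made-up example function
    return 2 * x

def greet(name):          # made-up example function
    return "Hello, " + name

FUNCTIONS = {"double": double, "greet": greet}

def main():
    parser = argparse.ArgumentParser(description=__doc__)
    parser.add_argument("function", choices=sorted(FUNCTIONS))
    parser.add_argument("args", nargs="*")
    opt = parser.parse_args()
    args = [ast.literal_eval(arg) for arg in opt.args]
    print(FUNCTIONS[opt.function](*args))

if __name__ == "__main__":
    main()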
Why not to use eval:
Here is a small example of why just using eval is so unsafe. If you were to simply use the following code to solve your issue:
import sys

def math(x):
    ans = 2*x
    print(ans)

def function1():
    print("hello")

if __name__ == '__main__':
    print(eval(sys.argv[1])) # DO NOT DO IT THIS WAY
Someone could pass in an argument like so:
python main.py 'import shutil; shutil.rmtree("/directory_you_really_dont_want_to_delete/")'
This would, in effect, import the shutil module and then call the rmtree function to remove a directory you really do not want to delete. Obviously this is a trivial example, but I am sure you can see the potential here to do something really malicious. An even more malicious, yet easily accessible, example would be to import subprocess and use recursive calls to the script to fork-bomb the host, but I am not going to share that code here for obvious reasons. There is nothing stopping that user from downloading a malicious third-party module and executing code from it here (a topical example would be jeilyfish, which has since been removed from PyPI). eval does not ensure that the code is "safe" before running it; it just arbitrarily runs any syntactically correct Python code given to it.

How could Python functions of a library be made available as Bash commands?

Let's say I have a large Python library of functions and I want these functions (or some large number of them) to be available as commands in Bash.
First, disregarding Bash command options and arguments, how could I get a function of a Python file containing a number of functions to run using a single word Bash command? I do not want to have the functions available via commands of a command 'suite'. So, let's say I have a function called zappo in this Python file (say, called library1.py). I would want to call this function using a single-word Bash command like zappo, not something like library1 zappo.
Second, how could options and arguments be handled? I was thinking that a nice way could be to capture all of the options and arguments of the Bash command and then use them within the Python functions using docopt parsing at the function level.
Yes, but the answer might not be as simple as you hope. No matter what you do, you're going to have to create something in your bash shell for each function you want to run. However, you could have a Python script generate aliases stored in a file which gets sourced.
Here's the basic idea:
#!/usr/bin/python
import sys
import __main__ #<-- This allows us to call methods in __main__
import inspect #<-- This allows us to look at methods in __main__

########### Function/Class.Method Section ##############
# Update this with functions you want in your shell    #
########################################################
def takesargs():
    # Just an example that reads args
    print(str(sys.argv))
    return

def noargs():
    # and an example that doesn't
    print("doesn't take args")
    return
########################################################

# Make sure there's at least 1 arg (since arg 0 will always be this file)
if len(sys.argv) > 1:
    # This fetches the function info we need to call it
    func = getattr(__main__, str(sys.argv[1]), None)
    if callable(func):
        # Actually call the function with the name we received
        func()
    else:
        print("No such function")
else:
    # If no args were passed to this script, just output a list of aliases
    # for this script that can be appended to .bashrc or similar.
    funcs = inspect.getmembers(__main__, predicate=inspect.isfunction)
    for func in funcs:
        print("alias {0}='./suite.py {0}'".format(func[0]))
Obviously, if you're using methods in a class instead of functions in __main__, change references from __main__ to your class, and change the predicate in the inspect call to inspect.ismethod (a sketch of that variant is at the end of this answer). Also, you probably would want to use absolute paths for the aliases, etc.
Sample output:
~ ./suite.py
alias noargs='./suite.py noargs'
alias takesargs='./suite.py takesargs'
~ ./suite.py > ~/pyliases
~ echo ". ~/pyliases" >> ~/.bashrc
~ . ~/.bashrc
~ noargs
doesn't take args
~ takesargs blah
['./suite.py', 'takesargs', 'blah']
If you use the method I've suggested above, you can actually have your .bashrc run ~/suite.py > ~/pyliases before it sources the aliases from the file. Then your environment gets updated every time you log in/start a new terminal session. Just edit your python function file, and then . ~/.bashrc and the functions will be available.
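Going back to the class-based variant mentioned above, the lookup might look roughly like this (a sketch; the Suite class and its methods are made up):
#!/usr/bin/python
import sys
import inspect

class Suite(object):
    # put the methods you want exposed as shell commands here
    def takesargs(self):
        print(str(sys.argv))

    def noargs(self):
        print("doesn't take args")

if __name__ == "__main__":
    suite = Suite()
    if len(sys.argv) > 1:
        func = getattr(suite, sys.argv[1], None)
        if callable(func):
            func()
        else:
            print("No such function")
    else:
        # bound methods instead of plain functions, hence inspect.ismethod
        for name, _ in inspect.getmembers(suite, predicate=inspect.ismethod):
            print("alias {0}='./suite.py {0}'".format(name))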

Importing code from a file as a function in python

Essentially I have a script in one file which I would like to import and run as a function in another file. Here is the catch: the contents of the first file CANNOT be written as a function definition; it just needs to be a plain old script (I'm writing a simulator for my robotics kit, so user experience is important). I have no idea how to go about this.
Adam
Anything can be written as a function.
If you additionally need the ability to call your script directly, you just use the __name__ == '__main__' trick:
def my_function():
    ... code goes here ...

if __name__ == '__main__':
    my_function()
Now you can import my_function from the rest of your code, but still execute the file directly since the block at the end will call the function.
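So from the other file it would just be (assuming the script above is saved as, say, my_module.py, which is a placeholder name):
from my_module import my_function  # my_module.py is a placeholder name

my_function()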
Assuming that the code in the file you need to import is a well-bounded script - then you can read it in as a text variable and use the execfile function to run that script.
By well-bounded I mean that you understand all the data it needs and you are able to provide all of it from your program.
An alternative would be to use the system call, or the subprocess module, to call the script as if it were an external program (depending on whether you need the script's output).
A final approach would be to use exec to create a function - see approach 3.
The approach you use determines what you need your other script to do.
Examples:
hello.py (your file you want to run, but can't change):
# Silly example to illustrate a script which does something.
fp = open("hello.txt", "a")
fp.write("Hello World !!!\n")
fp.close()
Three approaches to use hello.py without importing hello.py
import os

print "approach 1 - using system"
os.system("python hello.py")

print "approach 2 - using execfile"
execfile("hello.py", globals(), locals())

print "approach 3 - exec to create a function"
# read script into string and indent
with open("hello.py", "r") as hfp:
    hsrc = ["    " + line for line in hfp]
# insert def line
hsrc.insert(0, "def func_hello():")
# execute our function definition
exec "\n".join(hsrc) in globals(), locals()
# you now have a function called func_hello, which you can call just like a normal function
func_hello()
func_hello()

print "My original script is still running"

Convert string to command

I'm trying to get a command executed that is passed to the print statement. E.g.:
print "exec(raw_input())"
Can I get the exec() to run in some way?
Do this:
command = raw_input("Command: ")
exec(command)
Why are you trying to print it? Be more clear if this isn't what you are looking for.
I guess this is what you are looking for?
>>> exec("x = raw_input()")
23
>>> x
'23'
>>>
Are you asking for something simple like
aString = "raw_input()"
print "exec(" + aString + ")"
exec(aString)
from __future__ import print_function
def foo(x):
    exec x
print = foo
print("exec(raw_input())")
Running
% test.py
kaboom
results in:
NameError: name 'kaboom' is not defined
From the tags you applied, it seems like you're looking for a magic string that will allow you to run any arbitrary code when Python passes that string to the print statement.
There are no such known strings. You might try very long strings, just over the obvious boundaries, looking for the traditional stack- and heap-overflow cases, but if you find one in Python X.Y.Z, chances are it was already fixed in X.Y.Z+1.
If the Python script is actually doing an exec or eval on your string, of course that's easy to exploit; if it's just doing a print, the contents never even get compiled, much less run; they're just a plain old string. The fact that the string happens to have dangerous-looking things like exec in it is irrelevant; it's no more exploitable than getting Python to print the string "rm -rf /".
Of course if you can arrange for the output of one Python interpreter to feed in as the input to an interactive session in another interpreter, then whatever you get the first interpreter to print will get executed in the second one… but in that case, you don't need the exec in the middle of the string anyway.
The print statement writes to sys.stdout by default, and sys.stdout is like a file. You can make a class that looks like a file, with a write method, which would be called by the print statement. I put together a script which demonstrates this, print_exec.py. Also note, this doesn't work if the code in the print statement contains print, itself. i.e., print "print 'foo'" won't work. So, in the example, I had to print to sys.stderr to actually see something happening. Hope this helps.
print_exec.py
import sys
class file_like(object):
    def write(self, x):
        exec(x)

sys.stdout = file_like()
y = "exec(raw_input())"
print "exec(y)"
Example running the print_exec.py:
>python print_exec.py
print "import sys; print >> sys.stderr, 'hi'"
hi
>

An interpretation error for a python multithreading and multiprocessing

I want to compare a multithreading and a multiprocessing Python program, but I got an interpretation error:
File "./parallelPython.py", line 23
time fornorm(g,range(100))
^
SyntaxError: invalid syntax
The code is as follows:
#!/usr/bin/python -tt
import numpy as np
import math
def f(x):
print x
y = [1]*10000000
[math.exp(i) for i in y]
def g(x):
print x
y = np.ones(10000000)
np.exp(y)
from handythread import foreach
from processing import Pool
from timings import f,g
def fornorm(f,l):
for i in l:
f(i)
time fornorm(g,range(100))
time fornorm(f,range(10))
time foreach(g,range(100),threads=2)
time foreach(f,range(10),threads=2)
p = Pool(2)
time p.map(g,range(100))
time p.map(f,range(100))
I do not know why fornorm() has a problem; it has been defined!!!
Thanks
It doesn't say fornorm hasn't been defined, it says you have a syntax error on the line where you're calling fornorm. Syntax errors mean Python can't even understand your code: it's as if I say to you "flrk ask web ski ur lkjq", and then ask you to do what I said. An error about fornorm not being defined would happen much later. As it is, Python can't even tell whether you're asking it to call a function, let alone whether you're calling one that is already defined or not.
It looks like your error is this:
time fornorm(g,range(100))
That looks like you're trying to use the shell command time. Shell commands are not Python, and Python doesn't understand them.
However, your code as pasted into SO also has indentation errors, which should have triggered a syntax error earlier than that line, so I suspect what we can see here is not exactly what you were running.
It looks like an indentation error here:
def fornorm(f,l):
for i in l:
f(i)
After your def, Python is expecting an indented block.
By the way, time something is an IPython "magic" function and it won't work in a script file. You should import the timeit module and use that instead.
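For example, something along these lines (a rough sketch; it assumes the statement sits at the bottom of your script in place of the time lines):
import timeit

# roughly what IPython's %time does for one of the calls above
print(timeit.timeit('fornorm(g, range(100))',
                    setup='from __main__ import fornorm, g',
                    number=1))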
Where are you getting "time" from? That's not a valid Python statement. It's not like shell scripting.
If you want to time stuff, use the timeit library:
http://docs.python.org/library/timeit.html
