Suspending function calls in Python for passing later (functional paradigm)

I'm writing a Python command-line program which has some interdependent options; I would like the user to be able to enter the options in whichever order they please.
Currently I am using the getopt library to parse the command-line options; unfortunately, that parses them in order. I've thrown together a system of boolean flags to delay the processing of certain command-line arguments until the ones they depend on are processed, but I had the idea of using a priority queue of function calls which would execute after all the command-line options are parsed.
I know that Python can store functions under variable names, but that seems to call the function at the same time.
For example:
help = obj.PrintHelp()
heapq.heappush(commandQ, (0, help))
will print the help dialog immediately. How would I go about implementing this so that PrintHelp() isn't called immediately upon assigning it a name?
EDIT:
Oh, I just realized I was pushing into a queue called help; that's my mistake.
Thanks for the tip on removing the () after PrintHelp.
What if I now want to call a function that requires more than the self argument?
myFun = obj.parseFile(path)
heapq.heappush(commandQ, (1, myFun))
Would I just make the tuple bigger and take the command line argument?

If you heappush like this:
myFun = obj.parseFile
heapq.heappush(commandQ, (1, myFun, path))
then to later call the function, you could do this:
while commandQ:
    x = heapq.heappop(commandQ)
    func = x[1]
    args = x[2:]
    func(*args)
Use
help = obj.PrintHelp
without the parentheses. This makes help reference the function.
Later, you can call the function with help().
Note also (if I understand your situation correctly) that you could just use the optparse or (if you have Python 2.7 or newer) argparse modules in the standard library to handle the command-line options in any order.
PS. help is a built-in function in Python. Naming a variable help overrides the built-in, making it difficult (though not impossible) to access the built-in. Generally, it's a good idea not to overwrite the names of built-ins.
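To illustrate the argparse route, here is a minimal sketch; the --path and --verbose option names are made up, and obj.parseFile is taken from the question. argparse collects every option before you act on any of them, so the order the user types them in doesn't matter:
import argparse

parser = argparse.ArgumentParser()
parser.add_argument('--path')                          # hypothetical option name
parser.add_argument('--verbose', action='store_true')  # hypothetical option name
args = parser.parse_args()

# All options are available here regardless of the order they were typed in,
# so any interdependent handling can happen after parsing has finished.
if args.path:
    obj.parseFile(args.path)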

Instead of using getopt, I would suggest using optparse (or argparse, if you are using a newer Python version): most probably, you will get everything you need already implemented.
That said, in your example code you are actually calling the function, while you should simply store a reference to it:
help = obj.PrintHelp
heapq.heappush(commandQ, (0, help))

If you want to store a complete function call in Python, you can do it one of two ways:
# option 1: hold the parameters separately
# (I've also skipped saving the function in a 'help' variable)
heapq.heappush(commandQ, (0, obj.PrintHelp, param1, param2))
# later:
command = heapq.heappop(commandQ)
command[1](*command[2:])  # call the function (second item) with the remaining items as args
Alternatively, you can use a helper to package the arguments up via lambda:
# option 2: build a no-argument anonymous function that knows what arguments
# to give the real one
# module scope
def makeCall(func, *args):
    return lambda: func(*args)
# now you can:
help = makeCall(obj.PrintHelp, param1, param2)
heapq.heappush(commandQ, (0, help))
If you need keyword arguments, let me know and I'll edit to take care of those too.
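For keyword arguments you don't even need a hand-rolled helper: functools.partial in the standard library packages positional and keyword arguments the same way. A small sketch, where the verbose keyword is just a made-up example argument for parseFile:
import functools
import heapq

# functools.partial freezes both positional and keyword arguments;
# 'verbose' here is a hypothetical keyword argument of parseFile.
call = functools.partial(obj.parseFile, path, verbose=True)
heapq.heappush(commandQ, (1, call))

# Later, popping and calling needs no special argument handling:
priority, func = heapq.heappop(commandQ)
func()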

Related

Transparently passing through a function with a variable argument list

I am using Python RPyC to communicate between two machines. Since the link may be prone to errors, I would like to have a generic wrapper function which takes a remote function name plus that function's parameters as its input, does some status checking, calls the function with the parameters, does a little more status checking and then returns the result of the function call. The wrapper should have no knowledge of the function, its parameters/parameter types or the number of them, or the return value for that matter; the user has to get that right. It should just pass them transparently through.
I understand the getattr(conn.root, function)() pattern for calling the function, but my Python expertise runs out at populating the parameters. I have read various posts on the use of *args and **kwargs, in particular this one, which suggests that it is either difficult or impossible to do what I want to do. Is that correct and, if so, might there be a scheme which would work if I, say, ensured that all the function parameters were keyword parameters?
I do own both ends of this interface (the caller and the called) so I could arrange to dictionary-ise all the function parameters but I'd rather not make my API too peculiar if I could possibly avoid it.
Edit: the thing being called, at the remote end of the link, is a class with very ordinary methods, e.g.:
def exposed_a(self)
def exposed_b(self, thing1)
def exposed_c(self, thing1=None)
def exposed_d(self, thing1=DEFAULT_VALUE1, thing2=None)
def exposed_e(self, thing1, thing2, thing3=DEFAULT_VALUE1, thing4=None)
def exposed_f(self, thing1=None, thing2=None)
...where the types of each argument (and the return values) could be string, dict, number or list.
And it is indeed trivial; my Google fu had simply failed me in finding the answer. In the hope of helping anyone else who is inexperienced in Python and is having a bad Google day:
One simply takes *args and **kwargs as parameters and passes them directly on, with the asterisks attached. So in my case, to do my RPyC pass-through, where conn is the RPyC connection:
def my_passthru(conn, function_name, *function_args, **function_kwargs):
    # Do a check of something or other here
    return_value = getattr(conn.root, function_name)(*function_args, **function_kwargs)
    # Do another check here
    return return_value
Then, for example, a call to my exposed_e() method above might be:
return_value = my_passthru(conn, 'e', thing1, thing2, thing3)
(the exposed_ prefix being added automagically by RPyC in this case).
And of course one could put a try: / except ConnectionRefusedError: around the getattr() call in my_passthru() to generically catch the case where the connection has dropped underneath RPyC, which was my main purpose.
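For reference, a sketch of the wrapper with that error handling folded in; ConnectionRefusedError is the exception mentioned above, but the exact exceptions a dropped RPyC link raises in your setup may differ, so treat the except clause as an assumption:
def my_passthru(conn, function_name, *function_args, **function_kwargs):
    try:
        return getattr(conn.root, function_name)(*function_args, **function_kwargs)
    except ConnectionRefusedError:
        # The link has dropped underneath RPyC; report, retry or re-raise here.
        raise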

How do I trace all function calls and args automatically?

For the purpose of building a specialized debugger for my code I would like to trace some of the functions in my code, and to log the arguments they received in each call.
I would like to be able to do this without adding a line of code to each function, or even a decorator around all functions, but by setting a trace around the whole run.
This is somewhat similar to what you can do with sys.settrace in the sys module:
https://docs.python.org/2/library/sys.html
except that, as far as I can tell, the trace doesn't include the functions' arguments.
So I would like to write a function that looks something like this:
def tracing_func(func_name, args):
    if func_name in ['func', 'foo']:
        log_func_args(func_name, args)
where log_func_args logs it in a file for later analysis.
Then set this function to be called whenever any function in my code is called, with the functions' name and args.
Can this be done?
OK, so sys.settrace does the trick pretty well:
https://docs.python.org/2/library/sys.html#sys.settrace
and an example:
https://pymotw.com/2/sys/tracing.html
Note that the function you pass to settrace has to return a reference to itself (or to another function for further tracing in that scope).
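To tie it back to the original idea, here is a sketch of a trace function that logs the arguments of selected functions; the names 'func' and 'foo' are the placeholders from the question, and printing stands in for whatever logging you prefer:
import sys

def trace_calls(frame, event, arg):
    # We only care about function call events, not 'line'/'return'/'exception'
    if event != 'call':
        return
    code = frame.f_code
    func_name = code.co_name
    if func_name in ('func', 'foo'):                 # placeholder names from the question
        # At 'call' time, f_locals holds the arguments the function was called with
        arg_names = code.co_varnames[:code.co_argcount]
        args = dict((name, frame.f_locals[name]) for name in arg_names)
        print '%s called with %r' % (func_name, args)   # or write to a log file instead
    return None   # returning None means: don't trace individual lines in this frame

sys.settrace(trace_calls)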

Python call a function by indirection

Windows 7, Python 2.7, MPD2.
I am writing a program to control MPD.
MPD has several (over 50) different functions.
Normally one would make a call in the form:
mpd_client.pause()
#or
mpd_client.playlistmove(playlist_name, old_pos, new_pos)
I want to encapsulate all the separate calls in one function so I can use a single try / except.
I am thinking I want to use some sort of lambda, and *args but I have little experience with either of those.
In the body of my program, I want to call something like this:
MPD('pause')
#or
MPD('playlistmove', playlist_name, old_pos, new_pos)
I envision my function looking something like...
def MPD(required_param, *args):
    try:
        mpd_client.required_param(args)
    except:
        ...
of course, this isn't working.
Short of writing a huge switch statement and 50 different try structures, is there a way I can use lambda?
maybe something like:
lambda m=mpd_client.required_param: m(args)
but, this isn't working either.
I don't know.
Thanks, Mark.
You need to use getattr() to retrieve the actual method to call by name:
getattr(mpd_client, required_param)(*args)
(Note that you also need the * in front of the args for the function call as well, to re-expand the argument list back into separate arguments.)
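Putting that together with the single try/except the question asks for, a rough sketch; the error handling here is deliberately generic, so narrow it to whatever exception types your mpd client actually raises:
def MPD(command_name, *args):
    # Look the method up by name and share a single error handler for every command.
    try:
        return getattr(mpd_client, command_name)(*args)
    except Exception as exc:   # replace with the client's specific exception types
        print 'MPD command %r failed: %s' % (command_name, exc)

MPD('pause')
MPD('playlistmove', playlist_name, old_pos, new_pos)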
What you need is the class's __dict__ rather than the instance's (bound methods live on the class, so mpd_client.__dict__ won't contain them), and then you pass the instance explicitly:
func = mpd_client.__class__.__dict__['pause']
func(mpd_client)
func = mpd_client.__class__.__dict__['playlistmove']
func(mpd_client, playlist_name, old_pos, new_pos)
Note that this only finds methods defined directly on that class (not inherited ones), which is why getattr() is usually the simpler choice.

Don't know where to start with these parameters

I'm creating a custom module to help me write Python faster, with less code and cleaner syntax.
At the moment I am creating a function which is a cleaner version of wx.AcceleratorTable.
By the end of this, I expect:
accel_tbl = wx.AcceleratorTable([(wx.ACCEL_CTRL, ord('O'), PUT_ID_HERE),
(wx.ACCEL_CTRL, ord('S'), PUT_ID_HERE)])
self.SetAcceleratorTable(accel_tbl)
to become:
accelerate((wx.ACCEL_CTRL,'O',PUT_ID_HERE),
(wx.ACCEL_CTRL,'S',PUT_ID_HERE))
The only problem is... I don't know where to start. I know how to process information through parameters, but I've never learned how to process multiple tuples with dynamic information inside parameters.
Can someone provide insight? Thank you.
EDIT:
Current code:
## Create a cleaner accelerator
def accelerate(*args):
    accel_tbl = wx.AcceleratorTable(list(args))
    wx.SetAcceleratorTable(accel_tbl)
Current call:
import Sky
Sky.accelerate((wx.ACCEL_CTRL,'s',wx.ID_ANY),
(wx.ACCEL_CTRL,'t',wx.ID_ANY))
Add this method to your custom wx.Window class:
def accelerate(self, *args):
    accel_tbl = wx.AcceleratorTable(args)
    self.SetAcceleratorTable(accel_tbl)
Then invoke it as follows:
win = YourCustomWindowClass(PUT_WINDOW_ARGUMENTS_HERE)
win.accelerate((wx.ACCEL_CTRL,'O',PUT_ID_HERE),(wx.ACCEL_CTRL,'S',PUT_ID_HERE))
Alternatively, you can define it as a function taking a wx.Window argument as follows:
def accelerate(win, *args):
    accel_tbl = wx.AcceleratorTable(args)
    win.SetAcceleratorTable(accel_tbl)
The *args represents all the positional arguments as a tuple. If AcceleratorTable really requires a list, you can use list(args) instead as the argument (i.e. accel_tbl = wx.AcceleratorTable(list(args))).
You can learn about *args here.
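Since the desired call in the question passes plain characters like 'O' rather than key codes, here is a sketch of the function variant that also does that conversion; it is purely illustrative, the point being that wx.AcceleratorTable itself expects integer key codes:
def accelerate(win, *entries):
    # Accept ('O', ...) style entries and convert single characters to key codes,
    # so callers don't have to write ord('O') themselves.
    table = []
    for modifiers, key, cmd_id in entries:
        if isinstance(key, str):
            key = ord(key)
        table.append((modifiers, key, cmd_id))
    win.SetAcceleratorTable(wx.AcceleratorTable(table))

accelerate(win, (wx.ACCEL_CTRL, 'O', PUT_ID_HERE),
                (wx.ACCEL_CTRL, 'S', PUT_ID_HERE))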

Can I be warned when I use a generator function by accident

I was working with generator functions and private functions of a class, and I am wondering:
Why, when yielding (which in my case was by accident) in __someFunc, does that function just appear not to be called from within __someGenerator? Also, what is the terminology I should use when referring to these aspects of the language?
Can the Python interpreter warn of such instances?
Below is an example snippet of my scenario.
class someClass():
    def __init__(self):
        pass

    # Copy-and-paste mistake where yield ended up in a regular function
    def __someFunc(self):
        print "hello"
        # yield True  # if yielding in this function, it isn't called

    def __someGenerator(self):
        for i in range(0, 10):
            self.__someFunc()
            yield True
        yield False

    def someMethod(self):
        func = self.__someGenerator()
        while func.next():
            print "next"

sc = someClass()
sc.someMethod()
I got burned on this and spent some time trying to figure out why a function just wasn't getting called. I finally discovered I was yielding in a function I didn't mean to.
A "generator" isn't so much a language feature, as a name for functions that "yield." Yielding is pretty much always legal. There's not really any way for Python to know that you didn't "mean" to yield from some function.
This PEP http://www.python.org/dev/peps/pep-0255/ talks about generators, and may help you understand the background better.
I sympathize with your experience, but compilers can't figure out what you "meant for them to do", only what you actually told them to do.
I'll try to answer the first of your questions.
A regular function, when called like this:
val = func()
executes its inside statements until it ends or a return statement is reached. Then the return value of the function is assigned to val.
If a compiler recognizes the function to actually be a generator and not a regular function (it does that by looking for yield statements inside the function -- if there's at least one, it's a generator), the scenario when calling it the same way as above has different consequences. Upon calling func(), no code inside the function is executed, and a special <generator> value is assigned to val. Then, the first time you call val.next(), the actual statements of func are being executed until a yield or return is encountered, upon which the execution of the function stops, value yielded is returned and generator waits for another call to val.next().
That's why, in your example, function __someFunc didn't print "hello" -- its statements were not executed, because you haven't called self.__someFunc().next(), but only self.__someFunc().
Unfortunately, I'm pretty sure there's no built-in warning mechanism for programming errors like yours.
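To see the difference described above in isolation, here is a minimal illustration, using Python 2 syntax to match the snippet in the question; the function names are made up:
def regular():
    print "runs immediately"

def gen():
    print "runs only when iterated"
    yield True

regular()    # prints right away
g = gen()    # prints nothing yet; g is just a generator object
g.next()     # now the body runs up to the yield, printing the message and returning True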
Python doesn't know whether you want to create a generator object for later iteration or call a function. But Python isn't your only tool for seeing what's going on with your code. If you're using an editor or IDE that allows customized syntax highlighting, you can tell it to give the yield keyword a different color, or even a bright background, which will at least help you find your errors more quickly. In vim, for example, you might do:
:syntax keyword Yield yield
:highlight yield ctermbg=yellow guibg=yellow ctermfg=blue guifg=blue
Those are horrendous colors, by the way. I recommend picking something better. Another option, if your editor or IDE won't cooperate, is to set up a custom rule in a code checker like pylint. An example from pylint's source tarball:
from pylint.interfaces import IRawChecker
from pylint.checkers import BaseChecker


class MyRawChecker(BaseChecker):
    """check for line continuations with '\' instead of using triple
    quoted string or parenthesis
    """

    __implements__ = IRawChecker

    name = 'custom_raw'
    msgs = {'W9901': ('use \\ for line continuation',
                      ('Used when a \\ is used for a line continuation instead'
                       ' of using triple quoted string or parenthesis.')),
            }
    options = ()

    def process_module(self, stream):
        """process a module

        the module's content is accessible via the stream object
        """
        for (lineno, line) in enumerate(stream):
            if line.rstrip().endswith('\\'):
                self.add_message('W9901', line=lineno)


def register(linter):
    """required method to auto register this checker"""
    linter.register_checker(MyRawChecker(linter))
The pylint manual is available here: http://www.logilab.org/card/pylint_manual
And vim's syntax documentation is here: http://www.vim.org/htmldoc/syntax.html
Because the return keyword is applicable in both generator functions and regular functions, there's nothing you could possibly check (as #Christopher mentions). The return keyword in a generator indicates that a StopIteration exception should be raised.
If you try to return with a value from within a generator (which doesn't make sense, since return just means "stop iteration"), the compiler will complain at compile-time -- this may catch some copy-and-paste mistakes:
>>> def foo():
...     yield 12
...     return 15
...
  File "<stdin>", line 3
SyntaxError: 'return' with argument inside generator
I personally just advise against copy and paste programming. :-)
From the PEP:
Note that return means "I'm done, and have nothing interesting to
return", for both generator functions and non-generator functions.
We do this.
Generators have names with "generate" or "gen" in them, and each one has a yield statement in its body. That's pretty easy to check visually, since no method is much over 20 lines of code.
Other methods don't have "gen" in their name.
Also, we do not ever use __ (double underscore) names under any circumstances: 32,000 lines of code, no __ names.
Whether a method function is a generator or a non-generator is entirely a design question. What did the programmer "intend" to happen? The compiler can't easily validate your intent; it can only validate what you actually typed.
