Python wrapper function discarding arguments

I have a setting where event handlers are always functions taking a single event argument.
But more often than not, I find myself writing handlers that don't use any of the event information. So I constantly write handlers of the form:
def handler(_):
    # react
in order to discard the argument.
But I wish I didn't have to, as sometimes I want to reuse handlers as general action functions that take no arguments, and sometimes I have existing zero-argument functions that I want to use as handlers.
My current solution is wrapping the function in a lambda:
def handler():
    # react
event.addHandler(lambda _: handler())
But that seems wrong for other reasons.
My intuitive understanding of a lambda is that it is first and foremost a description of a return value, and event handlers return nothing. I feel lambdas are meant to express pure functions, and here I'm using them only to cause side effects.
Another solution would be a general decorator discarding all arguments.
def discardArgs(func):
    def f(*args):
        return func()
    return f
But I need this in a great many places, and it seems silly having to import such a utility to every script for something so simple.
Is there a particularly standard or "pythonic" way of wrapping a function to discard all arguments?

Use *args:
def handler(*args):
    # react
Then handler can take zero arguments, or any number of positional arguments.
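A minimal sketch of how the same `*args` handler serves both roles, as an event handler and as a zero-argument action. The `EventSource` class here is a hypothetical stand-in for whatever event API is being used:

```python
class EventSource:
    """Hypothetical event source: calls each handler with one event argument."""
    def __init__(self):
        self.handlers = []

    def addHandler(self, handler):
        self.handlers.append(handler)

    def fire(self, event):
        for handler in self.handlers:
            handler(event)

calls = []

def handler(*args):
    # react; silently ignores any arguments it receives
    calls.append("reacted")

source = EventSource()
source.addHandler(handler)
source.fire({"type": "click"})  # called with one argument
handler()                       # also callable with none
```

The same function can be registered directly and reused as a plain action, with no wrapper or decorator needed.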

Related

Transparently passing through a function with a variable argument list

I am using Python RPyC to communicate between two machines. Since the link may be prone to errors I would like to have a generic wrapper function which takes a remote function name plus that function's parameters as its input, does some status checking, calls the function with the parameters, does a little more status checking and then returns the result of the function call. The wrapper should have no knowledge of the function, its parameters/parameter types or the number of them, or the return value for that matter, the user has to get that right; it should just pass them transparently through.
I get the getattr(conn.root, function)() pattern to call the function, but my Python expertise runs out at populating the parameters. I have read various posts on the use of *args and **kwargs, in particular this one, which suggests that it is either difficult or impossible to do what I want to do. Is that correct and, if so, might there be a scheme which would work if I, say, ensured that all the function parameters were keyword parameters?
I do own both ends of this interface (the caller and the called) so I could arrange to dictionary-ise all the function parameters but I'd rather not make my API too peculiar if I could possibly avoid it.
Edit: the thing being called, at the remote end of the link, is a class with very ordinary methods, e.g.:
def exposed_a(self): ...
def exposed_b(self, thing1): ...
def exposed_c(self, thing1=None): ...
def exposed_d(self, thing1=DEFAULT_VALUE1, thing2=None): ...
def exposed_e(self, thing1, thing2, thing3=DEFAULT_VALUE1, thing4=None): ...
def exposed_f(self, thing1=None, thing2=None): ...
...where the types of each argument (and the return values) could be string, dict, number or list.
And it is indeed trivial; my Google-fu had simply failed me in finding the answer. In the hope of helping anyone else who is inexperienced in Python and is having a Google bad day:
One simply takes *args and **kwargs as parameters and passes them straight on, with the asterisks attached. So in my case, to do my RPyC pass-through, where conn is the RPyC connection:
def my_passthru(conn, function_name, *function_args, **function_kwargs):
    # Do a check of something or other here
    return_value = getattr(conn.root, function_name)(*function_args, **function_kwargs)
    # Do another check here
    return return_value
Then, for example, a call to my exposed_e() method above might be:
return_value = my_passthru(conn, 'e', thing1, thing2, thing3)
(the exposed_ prefix being added automagically by RPyC in this case).
And of course one could put a try: / except ConnectionRefusedError: around the getattr() call in my_passthru() to generically catch the case where the connection has dropped underneath RPyC, which was my main purpose.
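The same pattern works for any target object, not just an RPyC connection. A self-contained sketch, with a dummy target standing in for conn.root so it runs without RPyC (the `DummyRoot`/`DummyConn` names and the error handling are my own illustration, not from the question):

```python
class DummyRoot:
    """Stand-in for conn.root so the pattern can be shown without RPyC."""
    def exposed_add(self, a, b=0):
        return a + b

class DummyConn:
    root = DummyRoot()

def my_passthru(conn, function_name, *function_args, **function_kwargs):
    try:
        # Look up the remote method by name and forward all arguments as-is.
        return getattr(conn.root, function_name)(*function_args, **function_kwargs)
    except ConnectionRefusedError:
        # Hypothetical recovery point: a real version might reconnect or re-raise
        # a friendlier error when the link has dropped underneath RPyC.
        raise

conn = DummyConn()
result = my_passthru(conn, "exposed_add", 2, b=3)  # → 5
```

Note the dummy is looked up by its literal method name here; with a real RPyC connection the exposed_ prefix is resolved by RPyC itself, as the question mentions.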

How to pass optional python arguments to sub-functions with optional arguments

Given a simple Python function with an optional argument, like:
def wait(seconds=3):
    time.sleep(seconds)
How do I create a function that calls this and passes on an optional argument? For example, this does NOT work:
def do_and_wait(message, seconds=None):
    print(message)
    wait(seconds)
Note: I want to be able to call wait from other functions with the optional seconds argument without having to know and copy the current default seconds value in the underlying wait function to every other function which calls it.
As above, if I call it with the optional argument, like do_and_wait(2) then it works, but trying to rely on wait's default, e.g. calling it like do_and_wait() causes a TypeError because inside wait seconds == None.
Is there a simple and clean way to make this work? I know I can abuse kwargs like this:
def do_and_wait(message, **kwargs):
    print(message)
    wait(**kwargs)
But that seems unclear to the reader and user of this function since there is no useful name on the argument.
Note: This is a stupidly simplified example.
I understand you've simplified your question, but I think you mean how one can call a function with optional arguments that could be None. Does the following work for you?
import time

def func1(mess, sec):
    if sec is not None:
        time.sleep(sec)
    print(mess)

func1('success', sec=None)
I don't think you've quite explained your problem completely because I don't expect an answer should be this simple, but I would just use the same default value (and data type) in do_and_wait() as wait() uses, like so:
def do_and_wait(message, seconds=3):
    print(message)
    wait(seconds)
After thinking a bit more, I came up with something like this; Han's answer suggested it and reminded me that I think a PEP even suggests this pattern somewhere. It especially avoids having to know and copy the default value into any function that calls wait and wants to support a variable value for seconds.
def wait(seconds=None):
    time.sleep(seconds if seconds is not None else 3)

def do_and_wait(message, seconds=None):
    print(message)
    wait(seconds)

def something_else(callable, seconds=None):
    callable()
    wait(seconds)
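A variation on the same idea uses a dedicated sentinel object instead of None, which keeps the real default in one place (inside wait) while still allowing None to be passed as a genuine value. The `_MISSING` name is my own, not from the question:

```python
import time

# Unique sentinel meaning "argument was not given". Unlike None,
# it can never collide with a value a caller actually wants to pass.
_MISSING = object()

def wait(seconds=_MISSING):
    # The real default (3) lives here, and only here.
    time.sleep(seconds if seconds is not _MISSING else 3)

def do_and_wait(message, seconds=_MISSING):
    print(message)
    wait(seconds)  # forwards "not given" without knowing wait's default
```

Callers of do_and_wait never need to know or copy wait's default; changing it in wait changes it everywhere.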

Calling a python function

I am trying to make a python library that allows me to make custom tkinter widgets that are more aesthetically pleasing than the built-in ones. However, I have run into a problem while defining a few functions.
The problem stems from the difference between functions like append() and str(). While the append function works as follows...
somelist = ['a', 'b', 'c']
somelist.append('d')
The str() function works like this...
somenumber = 99
somenumber_text = str(somenumber)
You 'call upon' the append function by (1) stating the list that you are modifying (somelist), (2) adding a period, and (3) naming the append function itself (append()). Meanwhile, you 'call upon' the str function by placing a positional argument (somenumber) within its argument area. I have no idea why there is this difference, and more importantly, whether there is a way to specify which style is used to 'call upon' a function that I define myself?
Thanks...
In Python, a function is a group of related statements that performs a specific task.
Functions help break our program into smaller and modular chunks. As our program grows larger and larger, functions make it more organized and manageable.
Furthermore, it avoids repetition and makes code reusable.
Syntax of Function
def function_name(parameters):
    """docstring"""
    statement(s)
Shown above is a function definition, which consists of the following components:
The keyword def marks the start of the function header.
A function name to uniquely identify it. Function naming follows the same rules as writing identifiers in Python.
Parameters (arguments) through which we pass values to a function. They are optional.
A colon (:) to mark the end of the function header.
An optional documentation string (docstring) to describe what the function does.
One or more valid Python statements that make up the function body. Statements must have the same indentation level (usually 4 spaces).
An optional return statement to return a value from the function.
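A small concrete example following that syntax (the function name and strings are my own illustration):

```python
def greet(name):
    """Return a greeting for the given name."""
    message = "Hello, " + name + "!"
    return message

text = greet("Ada")  # → "Hello, Ada!"
```

Every component above appears here: the def keyword, the name greet, one parameter, the colon, a docstring, an indented body, and a return statement.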
You really don't need to create a class, or any methods. You can make a plain-old function that's similar to bind, and just take the widget to bind as a normal parameter. For example:
def bind_multi(widget, callback, *events):
    for event in events:
        widget.bind(event, callback)
That means you have to call this function as bind_multi(mybutton, callback, event1, event2) instead of mybutton.bind_multi(callback, event1, event2), but there's nothing wrong with that.
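To answer the original 'call style' question directly: whether you write value.func() or func(value) depends only on whether func is defined inside a class. A minimal sketch with made-up names:

```python
def shout(text):
    """Plain function: called as shout(value)."""
    return text.upper() + '!'

class Message:
    def __init__(self, text):
        self.text = text

    def shout(self):
        """Method: called as instance.shout(); self is passed automatically."""
        return self.text.upper() + '!'

a = shout('hello')             # plain-function style → 'HELLO!'
b = Message('hello').shout()   # method style → 'HELLO!'
```

So append is a method defined on the list class, while str is a plain (built-in) callable; you choose which style your own code supports by where you define the function.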

Why doesn't functools.partial return a real function (and how to create one that does)?

So I was playing around with currying functions in Python and one of the things that I noticed was that functools.partial returns a partial object rather than an actual function. One of the things that annoyed me about this was that if I did something along the lines of:
five = partial(len, 'hello')
five('something')
then we get
TypeError: len() takes exactly 1 argument (2 given)
but what I want to happen is
TypeError: five() takes no arguments (1 given)
Is there a clean way to make it work like this? I wrote a workaround, but it's too hacky for my taste (doesn't work yet for functions with varargs):
def mypartial(f, *args):
    argcount = f.func_code.co_argcount - len(args)
    params = ''.join('a' + str(i) + ',' for i in xrange(argcount))
    code = '''
def func(f, args):
    def %s(%s):
        return f(*(args+(%s)))
    return %s
''' % (f.func_name, params, params, f.func_name)
    exec code in locals()
    return func(f, args)
Edit: I think it might be helpful if I added more context. I'm writing a decorator that will automatically curry a function like so:
@curry
def add(a, b, c):
    return a + b + c

f = add(1, 2)  # f is a function
assert f(5) == 8
I want to hide the fact that f was created from a partial (maybe a bad idea :P). The message that the TypeError message above gives is one example of where whether something is a partial can be revealed. I want to change that.
This needs to be generalizable, so EnricoGiampieri's and mgilson's suggestions only work in that specific case.
You definitely don't want to do this with exec.
You can find recipes for partial in pure Python, such as this one—many of them are mislabeled as curry recipes, so look for that as well. At any rate, these will show you the proper way to do it without exec, and you can just pick one and modify it to do what you want.
Or you could just wrap partial…
However, whatever you do, there's no way the wrapper can know that it's defining a function named "five"; that's just the name of the variable you store the function in. So if you want a custom name, you'll have to pass it in to the function:
five = my_partial('five', len, 'hello')
At that point, you have to wonder why this is any better than just defining a new function.
However, I don't think this is what you actually want anyway. Your ultimate goal is to define a @curry decorator that creates a curried version of the decorated function, with the same name (and docstring, arg list, etc.) as the decorated function. The whole idea of replacing the name of the intermediate partial is a red herring; use functools.wraps properly inside your curry function, and it won't matter how you define the curried function, it'll preserve the name of the original.
In some cases, functools.wraps doesn't work. And in fact, this may be one of those times—you need to modify the arg list, for example, so curry(len) can take either 0 or 1 parameter instead of requiring 1 parameter, right? See update_wrapper, and the (very simple) source code for wraps and update_wrapper to see how the basics work, and build from there.
Expanding on the previous: To curry a function, you pretty much have to return something that takes (*args) or (*args, **kw) and parse the args explicitly, and possibly raise TypeError and other appropriate exceptions explicitly. Why? Well, if foo takes 3 params, curry(foo) takes 0, 1, 2, or 3 params, and if given 0-2 params it returns a function that takes 0 through n-1 params.
The reason you might want **kw is that it allows callers to specify params by name—although then it gets much more complicated to check when you're done accumulating arguments, and arguably this is an odd thing to do with currying—it may be better to first bind the named params with partial, then curry the result and pass in all remaining params in curried style…
If foo has default-value or keyword args, it gets even more complicated, but even without those problems, you already need to deal with this problem.
For example, let's say you implement curry as a class that holds the function and all already-curried parameters as instance members. Then you'll have something like this:
def __call__(self, *args):
    if len(args) + len(self.curried_args) > self.fn.func_code.co_argcount:
        raise TypeError('%s() takes exactly %d arguments (%d given)' %
                        (self.fn.func_name, self.fn.func_code.co_argcount,
                         len(args) + len(self.curried_args)))
    self.curried_args += args
    if len(self.curried_args) == self.fn.func_code.co_argcount:
        return self.fn(*self.curried_args)
    else:
        return self
This is horribly oversimplified, but it shows how to handle the basics.
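For comparison, here is my own self-contained Python 3 sketch of the same idea, using functools.wraps so the curried function keeps the original's name (this is not the answerer's class, and it handles positional arguments only):

```python
import functools
import inspect

def curry(fn):
    """Return a version of fn that accumulates positional args until it
    has enough to call fn; functools.wraps preserves fn's name/docstring."""
    arity = len(inspect.signature(fn).parameters)

    @functools.wraps(fn)
    def curried(*args):
        if len(args) >= arity:
            return fn(*args[:arity])
        # Not enough arguments yet: bind what we have and keep currying.
        return curry(functools.partial(fn, *args))
    return curried

@curry
def add(a, b, c):
    return a + b + c

r1 = add(1, 2)(5)   # → 8
r2 = add(1)(2)(5)   # → 8
name = add.__name__  # → 'add' (preserved by functools.wraps)
```

inspect.signature also understands partial objects, which is what makes the recursive step work; functools.wraps silently skips attributes a partial lacks, so the top-level name is the one that survives.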
My guess is that the partial function just delays the execution of the function; it does not create a whole new function out of it.
My guess is that it is just easier to define a new function directly in place:
def five(): return len('hello')
This is a very simple line, it won't clutter your code, and it is quite clear, so I wouldn't bother writing a function to replace it, especially if you don't need this in a large number of cases.

Python asynchronous callbacks and generators

I'm trying to convert a synchronous library to use an internal asynchronous IO framework. I have several methods that look like this:
def foo():
    ....
    sync_call_1()  # synchronous blocking call
    ....
    sync_call_2()  # synchronous blocking call
    ....
    return bar
For each of the synchronous functions (sync_call_*), I have written a corresponding async function that takes a callback. E.g.
def async_call_1(callback=None):
    # do the I/O
    callback()
Now for the Python newbie question: what's the easiest way to translate the existing methods to use these new async methods instead? That is, the method foo() above needs to become:
def async_foo(callback):
    # Do the foo() stuff using async_call_*
    callback()
One obvious choice is to pass into each async method a callback that effectively "resumes" the calling foo function, and is invoked at the very end of the method. However, that makes the code brittle and ugly, and I would need to add a new callback for every call to an async_call_* method.
Is there an easy way to do that using a python idiom, such as a generator or coroutine?
UPDATE: take this with a grain of salt, as I'm out of touch with modern python async developments, including gevent and asyncio and don't actually have serious experience with async code.
There are 3 common approaches to thread-less async coding in Python:
Callbacks - ugly but workable, Twisted does this well.
Generators - nice but require all your code to follow the style.
Use Python implementation with real tasklets - Stackless (RIP) and greenlet.
Unfortunately, ideally the whole program should use one style, or things become complicated. If you are OK with your library exposing a fully synchronous interface, you are probably OK, but if you want several calls to your library to work in parallel, especially in parallel with other async code, then you need a common event "reactor" that can work with all the code.
So if you have (or expect the user to have) other async code in the application, adopting the same model is probably smart.
If you don't want to understand the whole mess, consider using bad old threads. They are also ugly, but work with everything else.
If you do want to understand how coroutines might help you - and how they might complicate you, David Beazley's "A Curious Course on Coroutines and Concurrency" is good stuff.
Greenlets might actually be the cleanest way if you can use the extension. I don't have any experience with them, so I can't say much.
There are several ways to multiplex tasks. We can't say which is best for your case without deeper knowledge of what you are doing. Probably the easiest and most universal way is to use threads. Take a look at this question for some ideas.
You need to make the function foo async as well. How about this approach?
@make_async
def foo(somearg, callback):
    # This function is now async. Expect a callback argument.
    ...
    # change
    # x = sync_call1(somearg, some_other_arg)
    # to the following:
    x = yield async_call1, somearg, some_other_arg
    ...
    # same transformation again
    y = yield async_call2, x
    ...
    # change
    # return bar
    # to a callback call
    callback(bar)
And make_async can be defined like this:
def make_async(f):
    """Decorator to convert sync function to async
    using the above mentioned transformations"""
    def g(*a, **kw):
        async_call(f(*a, **kw))
    return g

def async_call(it, value=None):
    # This function is the core of async transformation.
    try:
        # send the current value to the iterator and
        # expect function to call and args to pass to it
        x = it.send(value)
    except StopIteration:
        return
    func = x[0]
    args = list(x[1:])
    # define callback and append it to args
    # (assuming that callback is always the last argument)
    callback = lambda new_value: async_call(it, new_value)
    args.append(callback)
    func(*args)
CAUTION: I haven't tested this
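A small end-to-end sketch of that trampoline idea, with a trivial "async" stand-in that invokes its callback immediately (real code would schedule it on an event loop; `async_double` and the argument values are my own):

```python
def make_async(f):
    """Decorator: drive the generator returned by f through async_call."""
    def g(*a, **kw):
        async_call(f(*a, **kw))
    return g

def async_call(it, value=None):
    try:
        x = it.send(value)  # generator yields (func, *args)
    except StopIteration:
        return
    func, args = x[0], list(x[1:])
    # The async function is expected to take the callback as its last argument.
    args.append(lambda new_value: async_call(it, new_value))
    func(*args)

def async_double(n, callback):
    callback(n * 2)  # stand-in: a real version would do I/O before calling back

results = []

@make_async
def foo(n, callback):
    x = yield async_double, n   # resumes with x = 2 * n
    y = yield async_double, x   # resumes with y = 2 * x
    callback(y)

foo(3, results.append)  # results becomes [12]
```

Each yield hands (function, args) back to the trampoline, which appends a resume-callback and calls the async function; when that callback fires, the generator picks up where it left off.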
