I have a third-party Python library that allows me to register a callback function that it calls later.
The code works fine with plain functions, but when I try to pass a bound method it fails: the callback is never called.
I have no control over the third party library (source code not available).
def old_callbackFunction(param, data):
    print(data)

class MyClass():
    def callbackFunction(self, param, data):
        print(data)

myObj = MyClass()
# old_setCallback(myObj.callbackFunction, param = "x") # this would work
setCallback(myObj.callbackFunction, param = "x") # this is never called
Sorin actually figured this out himself, with help from my comment, but he indicated that he wanted me to post the original comment as an answer. I was reluctant to post this originally because I'm unsure of the precise behavior of the setCallback and callbackFunction code; use at your own risk and modify as reason dictates.
The best way to wrap a function is to use functools.partial:
from functools import partial
setCallback(partial(myObj.callbackFunction), param="x")
You may also use a lambda (but you'll lose style points with the pythonistas):
setCallback(lambda param, data: myObj.callbackFunction(param, data), param="x")
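For reference, here is a self-contained sketch of the idea. setCallback and fire are hypothetical stand-ins for the third-party API, which we assume simply stores the callable and invokes it later with the registered param:

from functools import partial

_callbacks = []

def setCallback(cb, param):
    # hypothetical stand-in: store the callable to invoke later
    _callbacks.append((cb, param))

def fire(data):
    # hypothetical stand-in for the library invoking its callbacks
    for cb, param in _callbacks:
        cb(param, data)

class MyClass:
    def callbackFunction(self, param, data):
        print(param, data)

myObj = MyClass()
setCallback(partial(myObj.callbackFunction), param="x")
fire("hello")  # prints: x hello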
Suppose I have the following methods:
from torch import device
def run_cuda(device: device, count: int):
    ...

def gen_noise(device: device, width: int, height: int):
    ...
To call these, I have to write:
device = DEVICE
run_cuda(device, count=8)
gen_noise(device, width=128, height=128)
What I'm trying to achieve is to remove the repeated device argument for better readability. I'd rather write:
device = DEVICE
def use_device(device: device):
    ...
# multiline
device_user = use_device(device)
device_user.run_cuda(count=8)
device_user.gen_noise(width=128, height=128)
# inline
use_device(device).run_cuda(count=8)
use_device(device).gen_noise(width=128, height=128)
Obviously I could simply wrap the device in a class and manually define the methods on it. But it feels like it would be better to define a simple use_device(device) function that takes the device argument and passes it into the following function 'call'.
Maybe something like so:
def use_device(device: device):
    # call arbitrary methods that have a 'device' argument
    return ExtensionMethodLikeCallable[[device, *args, **kwargs], any]()
Is this possible in Python?
Thank you.
Edit:
A justification for this design is that I could then write:
use_device(device).use_dimension(512,512).use_iteration(8).gen_noise()
instead of:
gen_noise(device=device, width=512, height=512, iteration=8)
Which one has 'better' readability is admittedly arguable, but let's say it's just my personal preference.
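One sketch of how use_device could work, assuming run_cuda and gen_noise are the free functions defined above; the wrapper class and its function registry here are invented for illustration, with the device argument bound via functools.partial:

from functools import partial

class _DeviceUser:
    # Exposes each registered function with `device` already bound.
    _funcs = {'run_cuda': run_cuda, 'gen_noise': gen_noise}

    def __init__(self, device):
        self._device = device

    def __getattr__(self, name):
        try:
            func = self._funcs[name]
        except KeyError:
            raise AttributeError(name)
        return partial(func, self._device)

def use_device(device):
    return _DeviceUser(device)

use_device(DEVICE).run_cuda(count=8)
use_device(DEVICE).gen_noise(width=128, height=128)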
I'm working with matplotlib, and for a certain program I need to change the matplotlib parameters only within a context, not for the whole document, so I need to use this context with a lot of parameters:
with plt.rc_context(rc={'figure.figsize': (10, 9),
                        'font.size': 17,
                        ....
                        'xtick.top': True,
                        'ytick.right': True,
                        'xtick.minor.visible': True,
                        'ytick.minor.visible': True}):
As there are a lot of parameters, in order to clean up the code I intend to put all of this in a separate file and later import it as a function:
def context():
    with plt.rc_context( ... ):
But it can't be done this way; it gives an error because the function is not "closed" (SyntaxError: unexpected EOF while parsing).
My question is whether there is some way to wrap this "context" inside a function.
I think what you might be asking is how to create your own context manager that you can use with with (this is what this comment was suggesting). But actually there's an even simpler solution than that.
rc_context itself is already a context manager. If you want to make your own "version" of it that bakes in a specific context, you can do that with partial application using functools.partial:
from functools import partial
my_mpl_context = partial(plt.rc_context, rc={
    # fill in your mpl settings here
})
And then you can use it like:
with my_mpl_context():
    # make plots, etc...
To make your code work you need to use yield in the with block. And to make context a context manager you can decorate it with contextlib.contextmanager:
from contextlib import contextmanager

@contextmanager
def context():
    with plt.rc_context( ... ):
        yield
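Usage is then the same as with the partial-based version:

with context():
    # plots made here pick up the rc settings
    plt.plot([1, 2, 3])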
When trying to unit-test the code snippet below, I am constrained by the timing limit that the decorator wrapping calc_something imposes on me. It seems that I can't override RAND_RATE in my unit tests, since by the time I import the module containing my implementation, the decorator has already wrapped my function. How can I solve this issue?
RAND_RATE=20
RAND_PERIOD=10
@limits(calls=RAND_RATE, period=RAND_PERIOD)
def calc_something():
    ...
Without knowing exactly what limits does, we don't know what (if anything) can be patched. Instead, leave the base implementation undecorated for use by the unit tests. calc_something is then defined separately, by applying limits manually:
RAND_RATE=20
RAND_PERIOD=10
def _do_calc():
    ...
calc_something = limits(calls=RAND_RATE, period=RAND_PERIOD)(_do_calc)
Now in your tests, you can define any decorated version you like:
test_me = limits(10, 5)(my_module._do_calc)
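For example, a minimal test sketch; the module name my_module is hypothetical, and we assume limits is accessible through it:

import unittest
import my_module  # hypothetical module containing _do_calc and limits

class CalcTests(unittest.TestCase):
    def test_calc_logic(self):
        # Exercise the undecorated implementation directly,
        # so the test is not throttled by the production limit.
        my_module._do_calc()

    def test_with_relaxed_limit(self):
        # Or build a custom decorated version just for this test.
        fast_calc = my_module.limits(calls=1000, period=1)(my_module._do_calc)
        fast_calc()

if __name__ == "__main__":
    unittest.main()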
Many languages support ad-hoc polymorphism (a.k.a. function overloading) out of the box. However, it seems that Python opted out of it. Still, I can imagine there might be a trick or a library that is able to pull it off in Python. Does anyone know of such a tool?
For example, in Haskell one might use this to generate test data for different types:
-- In some testing library:
class Randomizable a where
genRandom :: a
-- Overload for different types
instance Randomizable String where genRandom = ...
instance Randomizable Int where genRandom = ...
instance Randomizable Bool where genRandom = ...
-- In some client project, we might have a custom type:
instance Randomizable VeryCustomType where genRandom = ...
The beauty of this is that I can extend genRandom for my own custom types without touching the testing library.
How would you achieve something like this in Python?
Python is not a statically typed language, so it really doesn't matter whether you have an instance of Randomizable or an instance of some other class that has the same methods.
One way to get the appearance of what you want could be this:
types_ = {}

def registerType(dtype, cls):
    types_[dtype] = cls

def RandomizableT(dtype):
    return types_[dtype]
Firstly, yes, I did define a function with a capital letter, but it's meant to act more like a class. For example:
registerType(int, TheLibrary.Randomizable)
registerType(str, MyLibrary.MyStringRandomizable)
Then, later:
dtype = ...  # get whatever type you want to randomize
randomizer = RandomizableT(dtype)()
print(randomizer.getRandom())
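A self-contained version of the same pattern, with two toy randomizer classes standing in for the library ones:

import random
import string

class IntRandomizable:
    def getRandom(self):
        return random.randint(0, 100)

class StringRandomizable:
    def getRandom(self):
        return ''.join(random.choice(string.ascii_letters) for _ in range(8))

registerType(int, IntRandomizable)
registerType(str, StringRandomizable)

print(RandomizableT(int)().getRandom())  # e.g. 42
print(RandomizableT(str)().getRandom())  # e.g. 'kqZfXwYa'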
A Python function cannot be automatically specialised based on static compile-time typing. Therefore its result can only depend on its arguments received at run-time and on the global (or local) environment, unless the function itself is modifiable in-place and can carry some state.
Your generic function genRandom takes no arguments besides the typing information. Thus in Python it should at least receive the type as an argument. Since built-in classes cannot be modified, the generic function (instance) implementation for such classes should be somehow supplied through the global environment or included into the function itself.
I've found out that since Python 3.4, there is the @functools.singledispatch decorator. However, it works only for functions which receive a type instance (an object) as the first argument, so it is not clear how it could be applied in your example. I am also a bit confused by its rationale:
In addition, it is currently a common anti-pattern for Python code to inspect the types of received arguments, in order to decide what to do with the objects.
I understand that anti-pattern is a jargon term for a pattern which is considered undesirable (and does not at all mean the absence of a pattern). The rationale thus claims that inspecting types of arguments is undesirable, and this claim is used to justify introducing a tool that will simplify ... dispatching on the type of an argument. (Incidentally, note that according to PEP 20, "Explicit is better than implicit.")
The "Alternative approaches" section of PEP 443 "Single-dispatch generic functions" however seems worth reading. There are several references to possible solutions, including one to "Five-minute Multimethods in Python" article by Guido van Rossum from 2005.
Does this count as ad hoc polymorphism?
class A:
    def __init__(self):
        pass
    def aFunc(self):
        print("In A")

class B:
    def __init__(self):
        pass
    def aFunc(self):
        print("In B")

f = A()
f.aFunc()
f = B()
f.aFunc()
Output:
In A
In B
Another version of polymorphism:
from module import aName
If two modules use the same interface, you could import either one and use it in your code.
One example of this is from xml.etree.ElementTree import XMLParser
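A classic instance of this pattern is falling back between interface-compatible modules at import time; the rest of the code never knows which one it got:

# simplejson (third-party) and the standard-library json module share
# the same interface, so either can be used under the same name.
try:
    import simplejson as json
except ImportError:
    import json

print(json.dumps({"a": 1}))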
Imagine you have an IO-heavy function like this:
from hashlib import md5

def getMd5Sum(path):
    with open(path, 'rb') as f:  # read bytes so md5() works on Python 3
        return md5(f.read()).hexdigest()
Do you think Python is flexible enough to allow code like this (notice the $):
def someGuiCallback(filebutton):
    ...
    path = filebutton.getPath()
    md5sum = $getMd5Sum()
    showNotification("Md5Sum of file: %s" % md5sum)
    ...
To be executed something like this:
def someGuiCallback_1(filebutton):
    ...
    path = filebutton.getPath()
    Thread(target=someGuiCallback_2, args=(path,)).start()

def someGuiCallback_2(path):
    md5sum = getMd5Sum(path)
    glib.idle_add(someGuiCallback_3, md5sum)

def someGuiCallback_3(md5sum):
    showNotification("Md5Sum of file: %s" % md5sum)
    ...
(glib.idle_add just pushes a function onto the queue of the main thread)
I've thought about using decorators, but they don't allow me to access the 'content' of the function after the call (the showNotification part).
I guess I could write a 'compiler' to change the code before execution, but it doesn't seem like the optimal solution.
Do you have any ideas on how to do something like the above?
You can use import hooks to achieve this goal...
PEP 302 - New Import Hooks
PEP 369 - Post Import Hooks
... but I'd personally view it as a little bit nasty.
If you want to go down that route though, essentially what you'd be doing is this:
You add an import hook for an extension (eg ".thpy")
That import hook is then responsible for (essentially) passing some valid code as a result of the import.
That valid code is given arguments that effectively relate to the file you're importing.
That means your precompiler can perform whatever transformations you like to the source on the way in.
On the downside:
Whilst using import hooks in this way will work, it will surprise the life out of any maintainer of your code. (Bad idea IMO)
The way you do this relies upon imputil, which has been removed in Python 3.0, which means code written this way has a limited lifetime.
Personally I wouldn't go there, but if you do, there's an issue of the Python Magazine where doing this sort of thing is covered in some detail, and I'd advise getting a back issue of that to read up on it. (Written by Paul McGuire, April 2009 issue, probably available as PDF).
Specifically, that uses imputil and pyparsing as its example, but the principle is the same.
How about something like this:
from threading import Thread

def performAsync(asyncFunc, notifyFunc):
    def threadProc():
        retValue = asyncFunc()
        glib.idle_add(notifyFunc, retValue)
    Thread(target=threadProc).start()
def someGuiCallback(filebutton):
    path = filebutton.getPath()
    performAsync(
        lambda: getMd5Sum(path),
        lambda md5sum: showNotification("Md5Sum of file: %s" % md5sum)
    )
A bit ugly with the lambdas, but it's simple and probably more readable than using precompiler tricks.
Sure, you can access the function's code (already compiled) from a decorator, disassemble it and hack it. You can even access the source of the module it's defined in and recompile it. But I think this is not necessary. Below is an example using a decorated generator, where the yield statement serves as a delimiter between the synchronous and asynchronous parts:
from threading import Thread
import hashlib

def run_async(gen):  # renamed from `async`, which is a reserved word in Python 3.7+
    def func(*args, **kwargs):
        it = gen(*args, **kwargs)
        result = next(it)
        Thread(target=lambda: list(it)).start()
        return result
    return func

@run_async
def test(text):
    # synchronous part (empty in this example)
    yield  # Use "yield value" if you need to return a meaningful value
    # asynchronous part[s]
    digest = hashlib.md5(text).hexdigest()
    print(digest)
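Usage might then look like this (note that on Python 3, hashlib.md5 needs bytes):

# returns immediately; the digest is printed from the worker thread
test(b"hello world")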