I'm trying to create a generic function caller class with the purpose of calling GUI functions from outside the main thread. The class has a signal and two functions: one is called to emit the signal and runs from a side thread. The other runs the actual function and is connected to the signal in the main thread.
To keep this class generic, I want to be able to pass any number of arguments, as would normally be done using *args or **kwargs. But when initializing the Signal, I'm required to define what arguments are passed.
This is my generic function caller (code stripped down for clarity):
from PySide.QtCore import QObject, Signal, Slot

class FunctionCaller(QObject):
    _request = Signal(int)

    def __init__(self, function=str):
        super().__init__()
        self.function = function
        self._request.connect(self._do_function)

    # @Slot(*args)
    def _do_function(self, *args):
        """This runs in the GUI thread to handle the thing."""
        self.function(*args)

    def doFunction(self, *args):
        self._request.emit(*args)
Here's what appears in the main thread:
functionCaller = FunctionCaller(function = someGUIfunction)
foo = functionCaller.doFunction # foo is a global variable
And here's where I call the function in the side thread:
foo(0xc000, 'string')
The error I get when running the function with 3 arguments: _request(int) only accepts 1 arguments, 3 given!
Is there a way to do this or must this function be less generic than I hoped, only working for a set number and types of arguments?
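A generic workaround that is sometimes used (a sketch, not part of the original question) is to declare the signal with a single object parameter and pack the positional and keyword arguments into one payload before emitting; the slot then unpacks them in the GUI thread:

from PySide.QtCore import QObject, Signal, Slot

class FunctionCaller(QObject):
    # 'object' lets the signal carry any single Python object,
    # so arbitrary args/kwargs can be packed into one payload
    _request = Signal(object)

    def __init__(self, function=str):
        super().__init__()
        self.function = function
        self._request.connect(self._do_function)

    @Slot(object)
    def _do_function(self, payload):
        """Runs in the GUI thread and unpacks the arguments."""
        args, kwargs = payload
        self.function(*args, **kwargs)

    def doFunction(self, *args, **kwargs):
        # called from the worker thread
        self._request.emit((args, kwargs))

With that change, foo(0xc000, 'string') from the side thread should reach someGUIfunction in the main thread through a queued connection.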
Related
I wanted to use a decorator to handle exceptions in my PyQt5 application:
def handle_exceptions(func):
    def func_wrapper(*args, **kwargs):
        try:
            print(args)
            return func(*args, **kwargs)
        except Exception as e:
            print(e)
            return None
    return func_wrapper
from PyQt5.QtWidgets import QMainWindow
from PyQt5.uic import loadUi

class MainWindow(QMainWindow):
    def __init__(self):
        QMainWindow.__init__(self)
        loadUi("main_window.ui", self)
        self.connect_signals()

    def connect_signals(self):
        self.menu_action.triggered.connect(self.fun)

    @handle_exceptions
    def fun(self):
        print("hello there!")
When I run it, I get the following exception:
fun() takes 1 positional argument but 2 were given
The args printed by the decorator include a False value.
The interesting thing is that when I call self.fun() directly in the constructor, or comment out the decorator, everything works. It seems like the decorator adds an additional argument, but only when the function is called via the signal. What is going on?
The decorator isn't the problem; you simply haven't defined fun with the correct number of parameters.
@handle_exceptions
def fun(self, foo):
    print("hello there!")
fun is an ordinary function; self.fun is a bound method that, when called, calls fun with self as the first argument and its own arguments as additional arguments. Whatever is calling self.fun is passing an additional argument, so the definition of fun has to accept that.
The issue is that QAction.triggered passes a boolean (checked) when it is emitted. When a slot receives a signal, the signal's arguments are matched against the slot's signature; when the slot has a shorter signature than the signal, the extra arguments are ignored. In your case, the non-decorated function has no parameters besides self, so the checked argument of QAction.triggered is ignored when the non-decorated function receives the signal. The decorated function, however, accepts an arbitrary number of arguments, so when it receives the triggered signal, the checked argument is not ignored, and that is the extra argument Python is complaining about.
The problem is caused by the triggered signal being overloaded, that is to say it has two signatures:
void QAction::triggered(bool checked = false)
QAction.triggered()
QAction.triggered(bool checked)
So by default it sends a boolean (False), which the fun method clearly does not accept, causing the error.
In this case the solution is to use the @pyqtSlot() decorator to indicate the signature the slot should accept:
# pyqtSlot is imported from PyQt5.QtCore
@pyqtSlot()
@handle_exceptions
def fun(self):
    print("hello there!")
It's not the decorator which adds this argument, it is the fact that you are dealing with a method.
It would act the same way if you omitted the @handle_exceptions.
What happens?
You take self.fun and pass it to self.menu_action.triggered.connect().
Whenever a menu action triggers, it (presumably) tries to call the given callable with one argument (maybe an event argument?)
But: when you take this self.fun, you don't get the function object itself; you get what MainWindow.fun.__get__(self) (or similar, I don't remember the exact syntax) gives you: a so-called "bound method object". It is a wrapper object which can be called and forwards calls to the original function object, but prepends the given self to the argument list.
This means that when this object is called with one argument (the event object?), the original function is called with two arguments (self and the event object). Since it is not prepared to take this additional argument, you get the error you saw.
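To make that concrete, here is a tiny standalone sketch (Demo is a made-up class, not from the question) showing how a bound method prepends the instance to the argument list:

class Demo:
    def fun(self):
        print("hello there!")

d = Demo()

bound = d.fun   # equivalent to Demo.fun.__get__(d): a bound method object
bound()         # fine: ends up calling Demo.fun(d)

# This mirrors what happens when the signal passes its extra argument:
# bound(False)  # TypeError: fun() takes 1 positional argument but 2 were given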
I have a method like this in Python:
def test(a, b):
    return a + b, a - b
How can I run this in a background thread and wait until the function returns?
The problem is that the method is pretty big and the project involves a GUI, so I can't block until it returns.
In my opinion, you should run another thread alongside this one that checks whether there is a result, or implement a callback that is called at the end of the thread. However, since you have a GUI, which as far as I know is simply a class, you can store the result in an object/class variable and check whether the result has arrived.
I would use a mutable variable, a technique that is sometimes used. Let's create a special class that will be used for storing results from thread functions.
import threading
import time

class ResultContainer:
    results = []  # Mutable - anything inside this list is accessible anywhere in your program

# Let's use a decorator with an argument
# This way it won't break your function
def save_result(cls):
    def decorator(func):
        def wrapper(*args, **kwargs):
            # get the result from the function
            func_result = func(*args, **kwargs)
            # pass the result into the mutable list in our ResultContainer class
            cls.results.append(func_result)
            # return the result from the function
            return func_result
        return wrapper
    return decorator

# as argument to the decorator, pass the class with the mutable list
@save_result(ResultContainer)
def func(a, b):
    time.sleep(3)
    return a, b

th = threading.Thread(target=func, args=(1, 2))
th.daemon = True
th.start()

while not ResultContainer.results:
    time.sleep(1)

print(ResultContainer.results)
So, in this code, we have the class ResultContainer with a list. Whatever you put in it, you can easily access from anywhere in the code (across threads and so on; the exception is between separate processes, which don't share memory). I made a decorator so you can store the result of any function without modifying the function itself. This is just an example of how you can run threads and leave it to them to store their results without you having to take care of it. All you have to do is check whether the result has arrived.
You could use global variables to do the same thing, but I don't advise it. They are ugly and you have to be very careful when using them.
For even more simplicity, if you don't mind modifying your function, you can skip the decorator and push the result into the class's list directly inside the function, like this:
def func(a, b):
    time.sleep(3)
    ResultContainer.results.append((a, b))
    return a, b
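The callback option mentioned at the start of this answer could look something like the sketch below (run_in_background and on_done are made-up names, not from the question):

import threading

def test(a, b):
    return a + b, a - b

def run_in_background(func, args, callback):
    # run func(*args) in a worker thread and hand the result to the callback
    def worker():
        callback(func(*args))
    th = threading.Thread(target=worker, daemon=True)
    th.start()
    return th

def on_done(result):
    # in a real GUI this would have to be marshalled back to the main thread
    print("got result:", result)

run_in_background(test, (3, 1), on_done).join()  # join only so this demo waits; a GUI loop would keep running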
I'm trying to clean up some stuff after I kill a running task within Celery. I'm currently hitting 2 problems:
1) Inside the task-revoked handler body, how can I get access to the parameters the task function was called with? For example, if the task is defined as:
@app.task()
def foo(bar, baz):
    pass
How will I get access to bar and baz inside the task_revoked.connect code?
2) I want to kill a task only when its state is anything but X. That means inspecting the task on one hand and setting the state on the other. Inspecting the state could be done, I guess, but I'm having difficulty getting my head around the context inside the task function body.
If I define foo like this:
@app.task(bound=True)
def foo(self, bar, baz):
    pass
and call it from, say, Flask like foo(bar, baz), then I'll get an error that a third parameter is expected, which means the decorator does not automatically add any context through the self parameter.
The app is simply defined as celery.Celery().
Thanks in advance
You can get the task's args from the request object.
from celery.signals import task_revoked

@task_revoked.connect
def my_task_revoked_handler(sender=None, body=None, *args, **kwargs):
    print(kwargs['request'].args)
This prints the arguments the task was called with.
Update:
You have to use bind, not bound.
@app.task(bind=True)
def foo(self, bar, baz):
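For illustration, with bind=True the task body receives the task instance as self and can read the request context from it (a sketch based on the question's foo; the prints are just examples):

@app.task(bind=True)
def foo(self, bar, baz):
    # with bind=True, self is the task instance, so the request context
    # is available inside the task body
    print(self.request.id)    # the running task's id
    print(self.request.args)  # positional arguments the task was called with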
I have a decorator @newthread which wraps functions to run in a separate thread (using wraps from functools and Thread from threading). However, there are some functions for which I only want this to happen some of the time.
At the moment, @newthread checks the keyword arguments of the function to be wrapped, and if it finds a bool new_thread equal to True, it runs the function in a separate thread; otherwise it runs the function normally. For example,
@newthread
def foo(new_thread=False):
    pass  # Do stuff...

foo()                 # Runs normally
foo(new_thread=True)  # Runs in new thread
Is this the canonical way of doing this, or am I missing something?
Don't use newthread as a decorator, then. A decorator is just a function that takes a function and returns a function.
If you want it to run in the current thread, call
foo(some, params)
If you want to run foo in a new thread, call
newthread(foo)(some, params)
@newthread
def foo(new_thread=False):
    pass  # Do stuff...

foo()                 # Runs normally
foo(new_thread=True)  # Runs in new thread
That is good - but I, for one, would prefer to have the decorator consume the new_thread argument, instead of having it show up in the parameter list of the decorated functions.
Also, you could use a default value so that you could pick up whether a separate thread is actually needed from somewhere else (like an environment variable):
import os
import threading

MARKER = object()

def newthread(func):
    def wrapper(*args, newthread=MARKER, **kwargs):
        if newthread is MARKER:
            newthread = os.environ.get("force_threads", True)
        if newthread:
            # create a new thread and return it as a simple stand-in
            # for a future-like object
            thread = threading.Thread(target=func, args=args, kwargs=kwargs)
            thread.start()
            return thread
        else:
            return func(*args, **kwargs)
    return wrapper
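Usage of that sketch might then look like this (slow_add is a made-up example function):

import time

@newthread
def slow_add(a, b):
    time.sleep(1)
    return a + b

t = slow_add(1, 2)                        # runs in a new thread by default; returns the Thread
result = slow_add(1, 2, newthread=False)  # runs synchronously and returns 3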
Example code:
# -*- coding: utf-8 -*-
from functools import wraps

class MyClass(object):
    def __init__(self):
        pass

    # decorator inside class
    def call(f):
        @wraps(f)
        def wrapper(*args):
            print 'Wrapper: ', args
        return wrapper

    # decorated 'method' without self
    @call
    def myfunc(a):
        pass

c = MyClass()
c.myfunc(1)
Returns:
Wrapper: (<test3.MyClass object at 0xb788a34c>, 1)
Is this normal? Can someone explain?
If this is a feature I would use it in my library.
This is perfectly normal.
The function myfunc is replaced by the wrapper function, whose signature is (*args). Because it is accessed as a bound method, the first argument is the instance of MyClass, which is printed out after the string 'Wrapper: '.
What's confusing you?
It's worth noting that if you use call as a decorator from outside of MyClass, it will generate a TypeError. One way around this is to apply the staticmethod decorator to it but then you can't call it during class construction.
It's a little bit hacky but I address how to have it both ways here.
Update after comment:
It gets the instance as the first argument regardless of whether you put self in the parameter list, because after the class is created and an instance is instantiated, it is a bound method. When you call it in the form
@instance.call
def foo(bar):
    return bar + 1
it expands to
def foo(bar):
    return bar + 1

foo = instance.call(foo)
but note that you are calling it on an instance! This will automatically expand to a call of the form
def foo(bar):
    return bar + 1

foo = MyClass.call(instance, foo)
This is how methods work. But you only defined call to take one argument, so this raises a TypeError.
As for calling it during class construction, it works fine. But the function it returns gets passed an instance of MyClass when it is called, for the same reason explained above. Specifically, whatever arguments you explicitly pass come after the instance it is called on, which is implicitly and automatically placed at the front of the argument list.
@call
def myfunc(a):
    ...
is equivalent to
def myfunc(a):
    ...

myfunc = call(myfunc)
The original myfunc may have expected only one argument, a, but after being decorated with call, the new myfunc can take any number of positional arguments, and they will all be put in args.
Notice also that def call(f) never calls f. So the fact that def myfunc(a) lacks the normal self argument is not an issue. It just never comes up.
When you call c.myfunc(1), wrapper(*args) gets called.
What is args? Well, since c.myfunc is a method call, c is sent as the first argument, followed by any subsequent arguments. In this case, the subsequent argument is 1. Both arguments are sent to wrapper, so args is the 2-tuple (c,1).
Thus, you get
Wrapper: (<test3.MyClass object at 0xb788a34c>, 1)