How to catch task arguments with default values in Celery signal? - python

I use the Celery task_postrun signal to execute some logic after a task my_task has been executed. The Celery task receives its arguments from a dictionary, unpacked with **.
The problem is that args inside the Celery signal only contains the arguments passed positionally (without a default value), and I guess that unpacking a dictionary with ** passes the key:value pairs the same way as arguments with default values.
Task
@app.task(name='Task_example')
def my_task(arg1, arg2):
    ...
Signal
@task_postrun.connect
def task_postrun_handler(task_id=None, task=None, args=None, state=None, retval=None, **kwargs):
    if task.name == 'Task_example':
        log.info(len(args))
Now when I execute the task, I'm unable to catch the arguments passed via the dict. How can I do that without unpacking everything?
kwargs = dict(arg1=1, arg2=2)
my_task.delay(**kwargs)
It says there are 0 arguments in args, and when I execute the task with:
kwargs = dict(arg2=2)
my_task.delay(arg1, **kwargs)
Then it finds arg1 but not arg2.
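For reference, the task_postrun signal also hands the handler the task's keyword arguments: besides args, Celery passes a kwargs argument, so anything sent via my_task.delay(**kwargs) shows up there rather than in args. A minimal sketch of a handler that logs both (reusing the Task_example task from above; the logger setup is illustrative):

import logging
from celery.signals import task_postrun

log = logging.getLogger(__name__)

@task_postrun.connect
def task_postrun_handler(task_id=None, task=None, args=None, kwargs=None,
                         retval=None, state=None, **extra):
    if task and task.name == 'Task_example':
        # Positional arguments land in args; keyword arguments
        # (e.g. arg1=1, arg2=2 passed via **) land in kwargs.
        log.info('args=%r kwargs=%r', args, kwargs)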

Related

Default value for optional parameter

I'm making a function in Python that takes an optional parameter *args. This function calls another function, passing this optional parameter along. However, when a certain condition applies, I want the optional parameter to have a certain default value rather than the value passed in the function call. It is clear in the snippet of code below that simply setting a new value for *args is incorrect, but what is the correct way of doing this?
def function(arg1, arg2, *args):
    if condition:
        *args = value
    function2(*args)
You could identify your parameter by index in args:
def function(arg1, arg2, *args):
    if condition:
        args = list(args)
        if len(args) == 0:  # the second arg in args was not set
            args.append("first default")
        if len(args) == 1:
            args.append("second default")
    function2(*args)
But as you can see, it's a little ugly: it involves a lot of if statements if you want many default positional arguments. Use **kwargs if you can.
Apparently *args is a list, so I had to assign it as args = [value] instead of *args = value. That solved the problem.
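For comparison, here is a minimal sketch of the **kwargs approach suggested above; condition and function2 are the same placeholders as in the question, and the key names and default values are made up for illustration:

def function(arg1, arg2, **kwargs):
    if condition:
        # Only fill in values the caller did not supply.
        kwargs.setdefault('first', 'first default')
        kwargs.setdefault('second', 'second default')
    function2(**kwargs)

With keyword arguments, each default is a single setdefault call instead of a chain of len() checks.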

In Celery+Python, how do I access task parameters from the task_revoked handler?

I'm trying to clean up some stuff after I kill a running task within Celery. I'm currently hitting two problems:
1) Inside the task_revoked handler body, how can I get access to the parameters that the task function was called with? So, for example, if the task is defined as:
@app.task()
def foo(bar, baz):
    pass
How will I get access to bar and baz inside the task_revoked.connect code?
2) I want to kill a task only when its state is anything but X. That means inspecting the task on one hand, and setting the state on the other. Inspecting the state could be done, I guess, but I'm having difficulty getting my head around the context inside the task function body.
If I define foo like this:
@app.task(bound=True)
def foo(self, bar, baz):
    pass
and call it from, say, Flask like foo(bar, baz), then I'll get an error that a third parameter is expected, which means the decorator does not automatically add any context through the self parameter.
The app is simply defined as celery.Celery().
Thanks in advance
You can get the task's args from the request object.
from celery.signals import task_revoked

@task_revoked.connect
def my_task_revoked_handler(sender=None, body=None, *args, **kwargs):
    print(kwargs['request'].args)
This prints the arguments given to the task.
Update:
You have to use bind, not bound.
@app.task(bind=True)
def foo(self, bar, baz):
    ...
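For what it's worth, with bind=True the execution context for the current invocation is available on self.request (task id, args, kwargs, etc.), and the task_revoked signal passes a request object carrying the same kind of data. A rough sketch, assuming the same app and foo as above:

from celery.signals import task_revoked

@app.task(bind=True)
def foo(self, bar, baz):
    # self.request carries the context for this invocation.
    print(self.request.id, self.request.args, self.request.kwargs)
    return bar + baz

@task_revoked.connect
def on_revoked(sender=None, request=None, **kwargs):
    # The revoked task's arguments are available on the request object.
    print(request.id, request.args, request.kwargs)

Calling foo(bar, baz) directly should still work with bind=True; the self argument is supplied by Celery, not by the caller.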

How to enqueue a job in rq from redis

I have to fetch functions, and the times at which they should execute, from MySQL and then save this data into Redis. Then, from Redis, I have to execute the functions at the prescribed times. I want to use rq as the scheduler, but I am not able to figure out the model in which I should save the imported data into Redis.
I am totally new to Python and Redis.
If you install rq there is a file (for me it was ~/lib/python2.7/site-packages/rq/queue.py, which in turn calls job.py) that clearly defines the enqueue and enqueue_call functions:
def enqueue_call(self, func, args=None, kwargs=None,
                 timeout=None, result_ttl=None, description=None,
                 depends_on=None):
    """Creates a job to represent the delayed function call and enqueues it.

    It is much like `.enqueue()`, except that it takes the function's args
    and kwargs as explicit arguments. Any kwargs passed to this function
    contain options for RQ itself.
    etc...."""

def enqueue(self, f, *args, **kwargs):
    """Creates a job to represent the delayed function call and enqueues it.

    Expects the function to call, along with the arguments and keyword
    arguments.
    etc...."""
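To connect that to your use case, a minimal usage sketch (send_report is a placeholder for a function you would load based on the MySQL data; note that plain rq only runs jobs as soon as a worker picks them up, so the "execute at a prescribed time" part needs something extra, e.g. the separate rq-scheduler package):

from redis import Redis
from rq import Queue

def send_report(report_id, email=None):
    ...  # placeholder for the real work

q = Queue(connection=Redis())

# Simple form: extra positional/keyword arguments go to the job function.
q.enqueue(send_report, 42, email='user@example.com')

# Explicit form: args/kwargs for the function, plus options for RQ itself.
q.enqueue_call(func=send_report, args=(42,),
               kwargs={'email': 'user@example.com'}, timeout=60)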

Passing functions as arguments with named parameters

I followed this to pass functions as arguments:
Passing functions with arguments to another function in Python?
However, I could not figure out how to pass a function with its own arguments as named parameters:
def retry(func, *args):
    func(*args)

def action(args):
    ...  # do something

retry(action, arg1, namedArg=arg2)
Here I get an exception:
TypeError: retry() got an unexpected keyword argument 'namedArg'
Normally, I can do:
action(arg1, namedArg=arg2)
Please help.
*args and its sibling **kwargs are the names generally used for extra positional arguments and keyword arguments. You are passing a keyword argument when you pass namedArg=arg2.
So, try this instead:
def retry(func, *args, **kwargs):
    func(*args, **kwargs)

def action(*args, **kwargs):
    ...  # do something

retry(action, arg1, namedArg=arg2)
If you instead use
def action(args, kwargs):
    ...  # do something

Then you will end up with args as a list of arguments and kwargs as a dictionary of keyword arguments, so in your case:
args = [arg1]
kwargs = {'namedArg':arg2}
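To make that concrete, here is a quick check of what retry actually receives in that call (note that the extra positional arguments arrive as a tuple; the string values are just stand-ins for arg1 and arg2):

def retry(func, *args, **kwargs):
    print(args)    # ('arg1',)
    print(kwargs)  # {'namedArg': 'arg2'}
    func(*args, **kwargs)

def action(*args, **kwargs):
    pass

retry(action, 'arg1', namedArg='arg2')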
Read up on keyword arguments in the Python docs.
As the error clearly states, retry() got an unexpected keyword argument 'namedArg', whereas you are only accepting positional arguments in *args.
You will find plenty of examples to understand keyword arguments.
What you need is functools.partial.
http://docs.python.org/2/library/functools.html#functools.partial
from functools import partial

def action(arg):
    ...  # do something

def action2(arg=1):
    ...  # do something

def action3(arg1, arg2=2):
    ...  # do something

partial(action, arg1)
partial(action3, arg1, arg2=3)
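Tying that back to retry, one way this could look is to bind the keyword argument with partial up front and hand retry a ready-to-call function (a sketch; here retry only needs the callable, since partial already carries the arguments):

from functools import partial

def retry(func):
    func()

def action3(arg1, arg2=2):
    print(arg1, arg2)

# arg2 is bound in advance, so retry does not need to know about it.
retry(partial(action3, 'arg1', arg2=3))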

Use **kwargs both in function calling and definition

Suppose I have a function get_data which takes some number of keyword arguments. Is there some way I can do this
def get_data(arg1, **kwargs):
    print arg1, arg2, arg3, arg4

arg1 = 1
data = {}
data['arg2'] = 2
data['arg3'] = 3
data['arg4'] = 4
get_data(arg1, **data)
So the idea is to avoid typing the argument names in both the function call and the function definition. I call the function with a dictionary as the argument, and the keys of the dictionary become local variables of the function, with the dictionary values as their values.
I tried the above and got an error saying global name 'arg2' is not defined. I understand I could update locals() in the definition of get_data to get the desired behavior.
So my code would look like this
def get_data(arg1, kwargs):
    locals().update(kwargs)
    print arg1, arg2, arg3, arg4

arg1 = 1
data = {}
data['arg2'] = 2
data['arg3'] = 3
data['arg4'] = 4
get_data(arg1, data)
and it wouldn't work either. Also, can't I achieve this behavior without using locals()?
Inside the function, kwargs is a plain dictionary. Try this:
def get_data(arg1, **kwargs):
    print arg1, kwargs['arg2'], kwargs['arg3'], kwargs['arg4']
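For instance, with the data dict from the question this prints all four values (a quick check, keeping the Python 2 print syntax used above):

data = {'arg2': 2, 'arg3': 3, 'arg4': 4}
get_data(1, **data)  # prints: 1 2 3 4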
Also, check documentation on keyword arguments.
If we examine your example:
def get_data(arg1, **kwargs):
    print arg1, arg2, arg3, arg4
In your get_data function's namespace, there is a variable named arg1, but there is no variable named arg2, so you cannot reach a function or a variable that is not in your namespace.
In fact, in your namespace there is a variable arg1 and a dictionary object called kwargs. Using the ** notation before kwargs (the name kwargs is not important here; it can be something else too) tells Python that kwargs is a dictionary and that all values in that dictionary will be treated as named parameters in your function definition.
Let's take a look at this example:
def myFunc(**kwargs):
    ...  # do something

myFunc(key1='mykey', key2=2)
When you call myFunc with the named parameters key1 and key2, your function definition acts like:
def myFunc(key1=None, key2=None):
but with an exception: since myFunc declares no named parameters, Python does not know how to handle them directly, because you could pass any named parameter to your function.
So your function accepts those named parameters collected in a dictionary, roughly as if you had called your function like this:
myFunc({'key1': 'mykey', 'key2': 2})
So your function definition gets those parameters within a dictionary. **kwargs defines that dictionary at that point, and also tells Python that any named parameters will be accepted (thanks to the ** before kwargs).
def myFunc(**kwargs):
    print kwargs
will print
{'key1': 'mykey', 'key2': 2}
So in your example, you can use kwargs like a dictionary:
def myFunc(**kwargs):
    kwargs['arg2']
    kwargs.get('arg3')
I hope it is not complicated (:
