How to call a Celery task with a class-based method? - python

I have the class:

class Parser:
    def __init__(self):
        self.OM = Omni()  # Creates a class instance, which authenticates on the web site, for scraping

    @app.task
    def foo(self, data):
        self.OM.parse(data)

So how can I call a task with the foo method?
When I try it like this, I get the error: Missing argument: data. I think it is because the call passes data as the self parameter:

prs = Parser()
prs.foo.delay(data)

How can I resolve it?

Creating tasks from methods was possible in Celery 3.x, but it was removed in Celery 4.0 because it was too buggy.
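To see why the error complains about a missing data argument: the decorator runs in the class body and replaces foo with a task object before any Parser instance exists, so nothing ever binds self, and the single argument you pass lands in the self slot. A runnable sketch with a stand-in decorator (fake_task imitates only this one aspect of Celery's @app.task; it is not Celery's API):

```python
def fake_task(func):
    # Stand-in for Celery's @app.task: it replaces the function with a
    # task-like object before the class is ever instantiated.
    class FakeTask:
        def delay(self, *args, **kwargs):
            return func(*args, **kwargs)  # nothing binds the Parser instance
    return FakeTask()

class Parser:
    @fake_task
    def foo(self, data):
        return data

try:
    Parser().foo.delay("payload")  # "payload" lands in the self slot
except TypeError as e:
    print(type(e).__name__)  # TypeError: 'data' is missing
```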
I would create a little helper function:

class Parser:
    def __init__(self):
        self.OM = Omni()  # Creates a class instance, which authenticates on the web site, for scraping

    def foo(self, data):
        self.OM.parse(data)

@app.task
def foo_task(data):
    prs = Parser()
    prs.foo(data)

foo_task.delay(data)

Related

Calling a class function from a class inside the class

I have this code:

class CongressApi:
    class apiKeyError(Exception):
        pass

    class Member:
        def __init__(self):
            print("self.makeRequest()?")  # want to call the makeRequest function in the external class

    def __init__(self, apiKey):
        self.key = apiKey

    def makeRequest(self, req):
        ret = requests.get(f"https://api.propublica.org/congress/v1/{req}", headers={"X-API-Key": self.key})
        return ret.content

I would like to be able to call that makeRequest() function from inside the Member class. Is this possible?
It is not common practice in Python to nest classes like this. I would recommend something like this instead:

class CongressApi:
    def __init__(self, apiKey):
        self.key = apiKey

    def makeRequest(self, req):
        ret = requests.get(f"https://api.propublica.org/congress/v1/{req}", headers={"X-API-Key": self.key})
        return ret.content

class Member:
    def __init__(self, congress_api_key):
        self.C = CongressApi(congress_api_key)
        print(f"{self.C.makeRequest('some/endpoint')}")  # makeRequest needs a request path

class apiKeyError(Exception):
    pass  # this is largely unnecessary - it's easier to use try/except blocks at each point in the code where an exception might be triggered

In general, it's good practice to separate out your classes.
If you want your inner class's instance methods to be able to access instance methods of the outer class, the inner class's instance needs a reference to an instance of the outer class. For example:

class CongressApi:
    class Member:
        def __init__(self, api):
            api.makeRequest("bar")

    def __init__(self, apiKey):
        self.key = apiKey

    def makeRequest(self, req):
        print(f"making request {req} with apiKey {self.key}")

    def do_member_thing(self):
        member = self.Member(self)

api = CongressApi("foo")
api.do_member_thing()  # making request bar with apiKey foo

Note that this is not actually a sensible way to organize your classes. Typically the point of an inner class is to encapsulate some piece of state that doesn't depend on the outer class, and to abstract that implementation away from the rest of the outer class's implementation. Passing the inner class a reference to the outer class is permitted, but it entirely defeats the purpose from an architectural standpoint.
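The scoping rule behind all of this is worth spelling out: a class body does not create an enclosing scope for functions or classes nested inside it, so an inner class cannot see the outer class's names implicitly. A minimal illustration (Outer/Inner are hypothetical names):

```python
class Outer:
    greeting = "hello"

    class Inner:
        def speak(self):
            # Class bodies are not enclosing scopes, so 'greeting' is not
            # visible here; the name lookup falls through to NameError.
            try:
                return greeting
            except NameError:
                return "no access to Outer's namespace"

print(Outer.Inner().speak())  # no access to Outer's namespace
```

This is why the examples above pass an explicit reference (an instance, or an API key) into the inner class instead.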

Use an instance method as a decorator within another class

I am trying to create a class (MySerial) that instantiates a serial object so that I can write/read to a serial device (UART). There is an instance method that is a decorator which wraps a function belonging to a completely different class (App). The decorator is responsible for writing and reading to the serial buffer.
If I create an instance of MySerial inside the App class, I can't use the decorator instance method created from MySerial.
I have tried foregoing instance methods and using class methods as explained in this second answer, but I really need to instantiate MySerial, and thus create an instance using __init__.
How can this be accomplished? Is it impossible?

Create a decorator that is an instance method.
Use this decorator within another class.

class MySerial():
    def __init__(self):
        pass  # I have to have an __init__

    def write(self):
        pass  # write to buffer

    def read(self):
        pass  # read from buffer

    def decorator(self, func):
        def func_wrap(*args, **kwargs):
            self.write(func(*args, **kwargs))
            return self.read()
        return func_wrap

class App():
    def __init__(self):
        self.ser = MySerial()

    @self.ser.decorator  # <-- does not work here.
    def myfunc(self):
        # 'yummy_bytes' is written to the serial buffer via
        # MySerial's decorator method
        return 'yummy_bytes'

if __name__ == '__main__':
    app = App()
You can make decorator a staticmethod. The inner func_wrap function of decorator then takes an additional parameter in its signature: cls. cls can be used to access the ser attribute of the instance of App, and the desired write and read methods can then be called from cls.ser. Also note that in your declarations, MySerial.write takes no parameters but is passed the result of the wrapped function; the code below uses *args to prevent the TypeError which would otherwise be raised:

class MySerial():
    def __init__(self):
        pass  # I have to have an __init__

    def write(self, *args):
        pass  # write to buffer

    def read(self):
        pass  # read from buffer

    @staticmethod
    def decorator(func):
        def func_wrap(cls, *args, **kwargs):
            cls.ser.write(func(cls, *args, **kwargs))
            return cls.ser.read()
        return func_wrap

class App():
    def __init__(self):
        self.ser = MySerial()

    @MySerial.decorator
    def myfunc(self):
        # 'yummy_bytes' is written to the serial buffer via
        # MySerial's decorator method
        return 'yummy_bytes'

App().myfunc()
The reason this does not work is that you are referring to self in the class body, where it is not defined. Here are two solutions.

Store the serial object as a class attribute
If you store the MySerial instance as a class attribute, it will be accessible in the class body:

class App():
    ser = MySerial()

    @ser.decorator
    def myfunc(self):
        return 'yummy_bytes'

Decorate on each instantiation
Or, if you need a different MySerial instance for every App instance, you will need to wait for the instance to be created before defining an instance attribute my_func. This means the function is decorated dynamically on every instance creation, in which case the @ decorator syntax must be replaced by a function call:

class App():
    def __init__(self):
        self.ser = MySerial()
        self.my_func = self.ser.decorator(self.myfunc)

    def myfunc(self):
        return 'yummy_bytes'
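To see that each App instance really gets its own decorated function and its own state with this pattern, here is a runnable sketch with a toy Logger standing in for MySerial (all names are illustrative, not the original serial API):

```python
class Logger:
    # Toy stand-in for MySerial: records what the wrapped function returned.
    def __init__(self):
        self.calls = []

    def decorator(self, func):
        def wrapper(*args, **kwargs):
            result = func(*args, **kwargs)
            self.calls.append(result)  # state lives on this Logger instance
            return result
        return wrapper

class App:
    def __init__(self):
        self.log = Logger()
        # decorate dynamically, per instance, instead of using '@' syntax
        self.myfunc = self.log.decorator(self.myfunc)

    def myfunc(self):
        return 'yummy_bytes'

app = App()
app.myfunc()
print(app.log.calls)  # ['yummy_bytes']
```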
This solution generalizes to decorating multiple methods, or to conditionally deactivating serialization, say in a test environment:

import env

class App():
    def __init__(self):
        self.ser = MySerial()
        to_decorate = [] if env.test else ['myfunc']
        for fn_name in to_decorate:
            fn = getattr(self, fn_name)
            setattr(self, fn_name, self.ser.decorator(fn))
There are a lot of hidden pitfalls that make this a risky design; however, it is a great learning example.
First off, the call to 'self' when decorating fails because there is no self at that scope. It only exists inside the methods. Now that the easy one is out of the way...
myfunc is an attribute of the App class. When you create an instance of App, it is always that one function that gets called. Even when it is turned into a bound method, each method object wraps the same single function:

a1 = App()
a2 = App()
assert a1.myfunc.__func__ is a2.myfunc.__func__
# bound method objects are created fresh on each attribute access, so the
# methods themselves differ, but they all wrap the same underlying function

This is why self is needed to get a unique namespace for the instance. It is also why you won't be able to get a decorator that is unique to the instance this way.
Another way to think about it: the class must be defined before you can produce instances, therefore you can't use an instance in the definition of a class.
Solution
The decorator needs to be written so that it does not store any instance attributes; it accesses the App instance's attributes instead.

class MySerial():
    def __init__(self):
        pass  # Possibly don't need an __init__

    def write(self, data, serial_config):
        pass  # write to buffer

    def read(self, serial_config):
        pass  # read from buffer

    def decorator(self, func):
        def func_wrap(self_app, *args, **kwargs):
            self.write(func(self_app, *args, **kwargs), self_app.serial_config)
            return self.read(self_app.serial_config)
        return func_wrap

ser = MySerial()

class App():
    def __init__(self, serial_config):
        self.serial_config = serial_config  # This is the instance data for MySerial

    @ser.decorator
    def myfunc(self):
        # 'yummy_bytes' is written to the serial buffer via
        # MySerial's decorator method
        return 'yummy_bytes'

if __name__ == '__main__':
    app = App(serial_config={})  # supply the real serial settings here

I'm assuming MySerial is meant to have a unique file, port, or something similar per instance of App; this is what would be recorded in serial_config. This may not be elegant if the stream is opening and closing, but you should be able to improve it for your exact application.

How to use the 'self' param inside Spyne server methods

I've seen that in many Spyne examples the methods don't have the typical self parameter; there aren't examples of Spyne using the self parameter, nor cls. They use a ctx parameter, but ctx refers neither to the instance nor to the class (and I need to maintain some state).
Is it possible to use it? Or are the classes never instantiated and used as static classes?
I was trying to do something similar to:

# -*- coding: utf-8 -*-
from __future__ import (
    absolute_import,
    unicode_literals,
    print_function,
    division
)

from spyne.decorator import rpc
from spyne.service import ServiceBase
from spyne.model.primitive import String

class RadianteRPC(ServiceBase):
    def __init__(self, name):
        self._name = name

    @rpc(_returns=String)
    def whoami(self):
        """
        Dummy test method.
        """
        return "Hello I am " + self._name + "!"

The problem with this piece of code is that RadianteRPC never seems to be instantiated as an object by Spyne, but used as a static class.
Solution 1:
As it stands, Spyne doesn't instantiate any object, so if we need to store some state, we can do it through class attributes.
Since we can't access a cls parameter in our methods, we need to refer to the class by its name:

class RadianteRPC(ServiceBase):
    _name = "Example"

    @rpc(_returns=String)
    def whoami(ctx):  # ctx is the 'context' parameter used by Spyne
        """
        Dummy test method.
        """
        return "Hello I am " + RadianteRPC._name + "!"

Solution 2 (found in the Spyne mailing lists):
In many cases we can't directly refer to the class name, so there is an alternative: find the class through the ctx parameter.

class RadianteRPC(ServiceBase):
    _name = "Example"

    @rpc(_returns=String)
    def whoami(ctx):  # ctx is the 'context' parameter used by Spyne
        """
        Dummy test method.
        """
        return "Hello I am " + ctx.descriptor.service_class._name + "!"
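The class-attribute approach works because Python resolves the class name at call time, inside the function body. A plain-Python sketch of the same pattern, outside Spyne (HitCounter and handler are illustrative names, not Spyne API; this also ignores Spyne's threading concerns):

```python
class HitCounter:
    _hits = 0  # class-level state; no instance is ever created

    def handler(ctx):  # no self, mirroring Spyne's calling convention
        # The class name is looked up when the function runs, so mutating
        # the class attribute here persists across calls.
        HitCounter._hits += 1
        return f"hit {HitCounter._hits}"

print(HitCounter.handler(None))  # hit 1
print(HitCounter.handler(None))  # hit 2
```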
What I did is to subclass the Application class and then access the application object through ctx.app.

from spyne.protocol.soap.soap11 import Soap11
from spyne.server.wsgi import WsgiApplication
from spyne import Application, rpc, ServiceBase, Integer

class MyApplication(Application):
    def __init__(self, *args, **kwargs):
        Application.__init__(self, *args, **kwargs)
        assert not hasattr(self, 'session')
        self.session = 1

    def increment_session(self):
        self.session += 1

    def get_session(self):
        return self.session

class Service(ServiceBase):
    @rpc(_returns=Integer)
    def increment_session(ctx):
        s = ctx.app.get_session()
        ctx.app.increment_session()
        return s

application = MyApplication([Service],
                            'spyne.soap',
                            in_protocol=Soap11(validator='lxml'),
                            out_protocol=Soap11())
wsgi_application = WsgiApplication(application)
...

I guess there should be a "cleaner" way, not requiring subclassing of the Application class, by subclassing the context instead, but this should let you work dynamically.
To come back to your question, you also have the opportunity to access your service, since it is available in the Application.services attribute.

How to determine the class defining a method through introspection

I'm building a rate-limiting decorator in Flask using Redis stores that will recognize different limits on different endpoints. (I realize there are a number of rate-limiting decorators out there, but my use case is different enough that it made sense to roll my own.)
Basically, the issue I'm having is ensuring that the keys I store in Redis are class-specific. I'm using the blueprint pattern in Flask, which basically works like this:

class SomeEndpoint(MethodView):
    def get(self):
        # Respond to get request

    def post(self):
        # Respond to post request

The issue here is that I want to be able to rate limit the post method of these classes without adding any additional naming conventions. In my mind the best way to do this would be something like this:

class SomeEndpoint(MethodView):
    @RateLimit  # Access SomeEndpoint class name
    def post(self):
        # Some response

but within the decorator, only the post function is in scope. How would I get back to the SomeEndpoint class given the post function? This is the basic layout of the decorator:

class RateLimit(object):
    """
    The base decorator for app-specific rate-limiting.
    """
    def __call__(self, f):
        def endpoint(*args, **kwargs):
            print class_backtrack(f)  # Should print SomeEndpoint
            return f(*args, **kwargs)
        return endpoint

I'm basically looking for what that class_backtrack function looks like. I've looked through the inspect module, but I haven't found anything that seems to accomplish this.
You can decorate the entire class instead of just the methods (this is Python 2 code: print is a statement, and method.__class__ is the unbound-method type, which wrap uses to rebuild an unbound method on Class):

def wrap(Class, method):
    def wrapper(self, *args, **kwargs):
        print Class
        return method(self, *args, **kwargs)
    return method.__class__(wrapper, None, Class)

def rate_limit(*methods):
    def decorator(Class):
        for method_name in methods:
            method = getattr(Class, method_name)
            setattr(Class, method_name, wrap(Class, method))
        return Class
    return decorator

@rate_limit('post')
class SomeEndpoint(object):
    def post(self):
        pass

class Subclass(SomeEndpoint):
    pass

a = Subclass()
a.post()
# prints <class 'SomeEndpoint'>
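On Python 3 there is a lighter option: a function defined in a class body carries the defining class's name in its __qualname__, so the decorator can recover it without decorating the whole class. A sketch (the owner attribute is a hypothetical name for illustration; a real rate limiter would key its Redis entries on it):

```python
import functools

class RateLimit:
    """Decorator that recovers the defining class's name from __qualname__."""
    def __call__(self, f):
        owner = f.__qualname__.rsplit('.', 1)[0]  # 'SomeEndpoint.post' -> 'SomeEndpoint'

        @functools.wraps(f)
        def endpoint(*args, **kwargs):
            # rate-limit bookkeeping would be keyed on `owner` here
            return f(*args, **kwargs)
        endpoint.owner = owner
        return endpoint

class SomeEndpoint:
    @RateLimit()
    def post(self):
        return "ok"

print(SomeEndpoint.post.owner)  # SomeEndpoint
```

Like the class-decorator version, __qualname__ reports the class where the method was defined, so subclasses that merely inherit post still report SomeEndpoint.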

Register unique callbacks per instance using classmethod

I wanted to make it easier to register callbacks using decorators when designing a library, but the problem is that both subclasses end up sharing the same callback on Consumer.
I am trying to allow both of these examples to co-exist in the same project:

class SimpleConsumer(Consumer):
    @Consumer.register_callback
    def callback(self, body):
        print body

class AdvancedConsumer(Consumer):
    @Consumer.register_callback
    def callback(self, body):
        print body

a = AdvancedConsumer()
s = SimpleConsumer()

What happens here is that the callback implementation of AdvancedConsumer overrides that of SimpleConsumer, as it is defined last.
The implementation of the decorator class is pretty simple:

class Consumer(object):
    def start_consumer(self):
        self.consuming_messages(callback=self._callback)

    @classmethod
    def register_callback(cls, function):
        def callback_function(cls, body):
            function(cls, body)
        cls._callback = callback_function
        return callback_function

I am very happy with the implementation, but since someone might register a second callback, I would like to ensure that it won't be a problem in the future. So, does anyone have a suggestion on how to implement this in a way that is not static?
The implementation shown here is obviously simplified; as a precaution I have something like this in the code:

if cls._callback:
    raise RuntimeError('_callback method already defined')
You can do it with a class decorator:

def register_callback(name):
    def decorator(cls):
        cls._callback = getattr(cls, name)
        return cls
    return decorator

@register_callback('my_func')
class SimpleConsumer(Consumer):
    def my_func(self, body):
        print body

If you decorate a method, you get only a function, so you cannot access any information about the class that contains the method.
But if only one callback should be available per class, why not just call it _callback?

class SimpleConsumer(Consumer):
    def _callback(self, body):
        print body

Or do something like:

class SimpleConsumer(Consumer):
    def my_func(self, body):
        print body

    _callback = my_func

?
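On Python 3.6+, another option that keeps each subclass's callback isolated is __init_subclass__, which runs once for every subclass as it is defined, so the registration lands on the subclass rather than on Consumer. A sketch (here register_callback is a simple marker function, not the original classmethod):

```python
def register_callback(func):
    func._is_callback = True  # marker consumed at class-creation time
    return func

class Consumer:
    _callback = None

    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        # collect methods marked by the decorator, for this subclass only
        marked = [attr for attr in cls.__dict__.values()
                  if getattr(attr, "_is_callback", False)]
        if len(marked) > 1:
            raise RuntimeError('_callback method already defined')
        if marked:
            cls._callback = marked[0]  # stored per subclass, not on Consumer

class SimpleConsumer(Consumer):
    @register_callback
    def callback(self, body):
        return "simple: " + body

class AdvancedConsumer(Consumer):
    @register_callback
    def callback(self, body):
        return "advanced: " + body

print(SimpleConsumer()._callback("x"))    # simple: x
print(AdvancedConsumer()._callback("x"))  # advanced: x
```

The duplicate check also falls out naturally: two marked methods in one class body raise RuntimeError at class-definition time rather than at runtime.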
