How to create a singleton object in the Flask microframework - Python

I am creating a Producer class which pushes messages to RabbitMQ. It makes use of the pika module.
I would like to create a handler so that I have control over the number of connections that interact with RabbitMQ.
Is there a way to add this to the app context and refer to it later, or is there a way to use init_app to define this handler?
Any code snippet would be a great help.

In Python, the singleton pattern is not needed in most cases, because a Python module is essentially a singleton. But you can use it anyway.
class Singleton(object):
    _instance = None

    def __init__(self):
        raise RuntimeError('call instance()')

    @classmethod
    def instance(cls):
        if cls._instance is None:
            cls._instance = cls.__new__(cls)
            # more init operations here
        return cls._instance
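A quick usage sketch: every call to instance() returns the same object, and direct construction is blocked.

s1 = Singleton.instance()
s2 = Singleton.instance()
assert s1 is s2   # both names refer to the one shared instance
# Singleton()     # would raise RuntimeError('call instance()')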
To use a Flask (or any other web framework) app as a singleton, you can do the same thing:
from flask import Flask

class AppContext(object):
    _app = None

    def __init__(self):
        raise RuntimeError('call instance()')

    @classmethod
    def app(cls):
        if cls._app is None:
            cls._app = Flask(__name__)
            # more init operations here
        return cls._app

app = AppContext.app()  # can be called as many times as you want
Or inherit from the Flask class and make it a singleton itself.
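Coming back to the original question, here is a minimal sketch of how a pika-based producer could be wired into a Flask app through init_app so that a single connection is reused. The class name RabbitProducer, the RABBITMQ_HOST config key, and the extensions key are assumptions for illustration, not an established API:

import pika
from flask import Flask

class RabbitProducer(object):
    # Hypothetical extension-style handler that reuses one pika connection.
    def __init__(self, app=None):
        self._connection = None
        self._channel = None
        if app is not None:
            self.init_app(app)

    def init_app(self, app):
        app.config.setdefault('RABBITMQ_HOST', 'localhost')
        # Store the handler on the app so it can be looked up later.
        app.extensions['rabbit_producer'] = self

    def publish(self, app, queue, body):
        # Open the connection lazily and reuse it for subsequent messages.
        if self._connection is None or self._connection.is_closed:
            params = pika.ConnectionParameters(host=app.config['RABBITMQ_HOST'])
            self._connection = pika.BlockingConnection(params)
            self._channel = self._connection.channel()
            self._channel.queue_declare(queue=queue)
        self._channel.basic_publish(exchange='', routing_key=queue, body=body)

producer = RabbitProducer()
app = Flask(__name__)
producer.init_app(app)

A view could then look the handler up via current_app.extensions['rabbit_producer'] and call publish().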

Related

How to mock a method inside a @singleton decorated class in Python

The class calls a get_credentials method from its __init__ method, and that is what I need to mock. I am using unittest for mocking:
from unittest import TestCase, mock
from src.layer.utils.db import Db

@singleton
class Db:
    def __init__(self):
        self.get_credentials()

    def get_credentials(self):
        # stuff
        pass

# Tried and failed:
# mock.patch('src.layer.utils.db.Db.get_credentials', get_creds_mock)
# mock.patch.object(Db, 'get_credentials', get_credentials_mock)

class DbMock:
    def get_credentials(self):
        pass

def get_credentials_mock(self):
    pass

class TestDb(TestCase):
    @mock.patch.object(Db, 'get_credentials', get_credentials_mock)
    def test_init(self):
        db = Db()
        self.assertIsInstance(db, Db)
The code of the @singleton decorator:
def singleton(cls):
    instances = {}
    def instance():
        if cls not in instances:
            instances[cls] = cls()
        return instances[cls]
    return instance
I need to mock the get_credentials function because it communicates with a server, which is not allowed in the testing environment, so I must return the JSON token itself.
Is there any feasible approach to mock that function?
You could use a solution like https://pypi.org/project/singleton-decorator/, which tackles exactly this problem.
If you cannot replace the singleton decorator because it is some kind of framework solution, then with this particular implementation you are stranded, because you cannot access the instances dictionary.
If you cannot use another package for any reason, but can modify your definition of the singleton wrapper, you could add this to your singleton code:
def singleton(cls):
    instances = {}
    def instance():
        if cls not in instances:
            instances[cls] = cls()
        return instances[cls]
    instance.__wrapped__ = cls
    return instance
and then you should be able to patch it as presented in that package's documentation:

@mock.patch('wherever.Db.__wrapped__.get_credentials')
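Putting it together, a minimal test sketch, assuming the modified singleton wrapper above and that Db lives at src.layer.utils.db (the fake token value is made up):

from unittest import TestCase, mock
from src.layer.utils.db import Db

class TestDb(TestCase):
    @mock.patch('src.layer.utils.db.Db.__wrapped__.get_credentials',
                return_value={'token': 'fake-json-token'})
    def test_init(self, mocked_get_credentials):
        db = Db()                                   # the patched method runs in __init__
        self.assertIsInstance(db, Db.__wrapped__)   # Db itself is now the wrapper function
        mocked_get_credentials.assert_called_once()

Note that because the wrapper caches the first instance, the patch only has an effect if Db() has not already been called earlier in the test run.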

Use an instance method as a decorator within another class

I am trying to create a class (MySerial) that instantiates a serial object so that I can write/read to a serial device (UART). There is an instance method that is a decorator which wraps a function belonging to a completely different class (App), so the decorator is responsible for writing to and reading from the serial buffer.
If I create an instance of MySerial inside the App class, I can't use the decorator instance method that is created from MySerial.
I have tried foregoing instance methods and using class methods as explained in this second answer, but I really need to instantiate MySerial, and thus create an instance using __init__.
How can this be accomplished? Is it impossible?
Create a decorator that is an instance method.
Use this decorator within another class
class MySerial():
    def __init__(self):
        pass  # I have to have an __init__

    def write(self):
        pass  # write to buffer

    def read(self):
        pass  # read from buffer

    def decorator(self, func):
        def func_wrap(*args, **kwargs):
            self.write(func(*args, **kwargs))
            return self.read()
        return func_wrap

class App():
    def __init__(self):
        self.ser = MySerial()

    @self.ser.decorator  # <-- does not work here
    def myfunc(self):
        # 'yummy_bytes' is written to the serial buffer via
        # MySerial's decorator method
        return 'yummy_bytes'

if __name__ == '__main__':
    app = App()
You can use a staticmethod to wrap decorator. The inner func_wrap function of decorator takes an additional parameter in its signature: cls. cls can be used to access the ser attribute of the instance of App, and then the desired methods write and read can be called from cls.ser. Also, note that in your declarations MySerial.write takes no parameters but is passed the result of the wrapped function. The code below uses *args to prevent the TypeError which would otherwise be raised:
class MySerial():
    def __init__(self):
        pass  # I have to have an __init__

    def write(self, *args):
        pass  # write to buffer

    def read(self):
        pass  # read from buffer

    @staticmethod
    def decorator(func):
        def func_wrap(cls, *args, **kwargs):
            cls.ser.write(func(cls, *args, **kwargs))
            return cls.ser.read()
        return func_wrap

class App():
    def __init__(self):
        self.ser = MySerial()

    @MySerial.decorator
    def myfunc(self):
        # 'yummy_bytes' is written to the serial buffer via
        # MySerial's decorator method
        return 'yummy_bytes'

App().myfunc()
The reason this does not work is that you are referring to self in the class body, where it is not defined. Here are two solutions.
Store the serial object as a class attribute
If you store the MySerial instance as a class attribute, then it will be accessible in the class body:
class App():
    ser = MySerial()

    @ser.decorator
    def myfunc(self):
        return 'yummy_bytes'
Decorate on each instantiation
Or, if you need a different MySerial instance for every App instance, then you will need to wait for the instance to be created before defining an instance attribute my_func. This means the function is decorated dynamically on every instance creation, in which case the @ decorator syntax must be replaced by a function call:
class App():
    def __init__(self):
        self.ser = MySerial()
        self.my_func = self.ser.decorator(self.myfunc)

    def myfunc(self):
        return 'yummy_bytes'
This solution generalizes to decorating multiple methods or conditionally deactivating serializing, say in a test environment.
import env

class App():
    def __init__(self):
        self.ser = MySerial()
        to_decorate = [] if env.test else ['myfunc']
        for fn_name in to_decorate:
            fn = getattr(self, fn_name)
            setattr(self, fn_name, self.ser.decorator(fn))
There are a lot of hidden pitfalls that make this a risky design; however, it is a great learning example.
First off, the call to 'self' when decorating fails because there is no self at that scope. It only exists inside the methods. Now that the easy one is out of the way...
myfunc is an attribute of the App class. When you create an instance of App, it is always that one function that gets called. Even when it becomes a bound method, that only happens once.
a1 = App()
a2 = App()
assert a1.myfunc.__func__ is a2.myfunc.__func__
assert id(a1.myfunc) == id(a2.myfunc)  # bound methods won't compare with 'is', but the ids show they wrap the same function
This is why self is needed: to get a unique namespace for the instance. It is also why you won't be able to get a decorator that is unique to the instance in this way.
Another way to think about it is that the class must be defined before you can produce instances. Therefore, you can't use an instance in the definition of a class.
Solution
The decorator needs to be written in a way that it won't store any instance attributes. It will access the App instance attributes instead.
class MySerial():
    def __init__(self):
        pass  # Possibly don't need to have an __init__

    def write(self, serial_config):
        pass  # write to buffer

    def read(self, serial_config):
        pass  # read from buffer

    def decorator(self, func):
        def func_wrap(self_app: "App", *args, **kwargs):
            self.write(func(self_app, *args, **kwargs), self_app.serial_config)
            return self.read(self_app.serial_config)
        return func_wrap

ser = MySerial()

class App():
    def __init__(self, serial_config):
        self.serial_config = serial_config  # This is the instance data for MySerial

    @ser.decorator
    def myfunc(self):
        # 'yummy_bytes' is written to the serial buffer via
        # MySerial's decorator method
        return 'yummy_bytes'

if __name__ == '__main__':
    app = App(serial_config={})  # e.g. a port or file name per App instance
Now I'm assuming MySerial was going to have a unique file, port, or something similar per instance of App. This is what would be recorded in serial_config. This may not be elegant if the stream is opening and closing, but you should be able to improve this for your exact application.

Bind some arbitrary value to a route in Flask

How can I elegantly bind some arbitrary value to a Flask route? Suppose I want to access this value in my session interface implementation or in a before_request hook.
For now I'm doing it this way:
@app.route('/foo/<bar>', defaults={'_my_val': True})
def foo(bar, _my_val):  # this is an ugly hack
    pass
and accessing the value through the request object like this:
class MyRequest(flask.Request):
    def get_my_val(self):
        return (self.url_rule.defaults or {}).get('_my_val', False)
But it looks like a hack.
UPD: It looks like this can be done by extending the werkzeug.routing.Rule class and adding **kwargs to its constructor. Is it OK to override the Rule class in Flask?
Eventually I ended up overriding Flask's Request and Rule classes:
# here app is a Flask current application object
from flask import Request as FlaskRequest
from werkzeug.routing import Rule as FlaskRule

class Request(FlaskRequest):
    def is_foo(self):
        return bool(self.url_rule._foo) if self.url_rule else False

    def get_bar(self):
        return getattr(self.url_rule, '_bar')

class Rule(FlaskRule):
    def __init__(self, *args, **kwargs):
        for param in ('_foo', '_bar'):
            setattr(self, param, kwargs.pop(param, None))
        super().__init__(*args, **kwargs)

# app initialization
app.request_class = Request
app.url_rule_class = Rule

# route example
@app.route('/path', _foo=True, _bar='baz')
def route():
    pass
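With those classes installed, a before_request hook can read the values off the incoming request. A small sketch (the hook body is illustrative only):

from flask import request

@app.before_request
def check_foo():
    if request.is_foo():
        # e.g. switch session handling for routes registered with _foo=True
        print(request.get_bar())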

Flask: using app as an attribute and accessing decorators

Using the Python flask module, I would like to have
app = flask.Flask(__name__)
as an attribute of a class:
import flask
import gevent.queue

class Handler(object):
    def __init__(self):
        self.datastores = {}
        self.websocket_queue = gevent.queue.JoinableQueue()
        self.app = flask.Flask(__name__)
The problem is how to access the decorators then:
@self.app.route('/socket.io/<path:remaining>')
def socketio(self, remaining):
That generates the error NameError: name 'self' is not defined.
Thanks
You could try Flask-Classy, as it provides an easy way to use classes with Flask.
It depends: if you are adding handlers inside a method of the Handler class, it should work without issue:
def add_routes(self):
    @self.app.route("/some/route")
    def some_route():
        return "At some route"
If you are attempting to add routes outside of Handler you will need to use a reference to your Handler instance:
handler = Handler()

@handler.app.route("/some/route")
def some_route():
    return "At some route"

Use cases of RequestHandler.initialize() in Tornado

Is it correct to say that one should use the initialize method to prepare resources that will be shared by all other methods (e.g. get, post, etc.) of a RequestHandler subclass?
What are the other common use cases for initialize in Tornado? It would be great to have some examples!
Why don't you like the example in the Tornado source code?
def initialize(self):
    """Hook for subclass initialization.

    A dictionary passed as the third argument of a url spec will be
    supplied as keyword arguments to initialize().

    Example::

        class ProfileHandler(RequestHandler):
            def initialize(self, database):
                self.database = database

            def get(self, username):
                ...

        app = Application([
            (r'/user/(.*)', ProfileHandler, dict(database=database)),
        ])
    """
    pass
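Fleshing that docstring out into a runnable sketch (the in-memory database dict and the /user route are illustrative):

import tornado.ioloop
import tornado.web

# Hypothetical shared resource passed to every ProfileHandler via initialize().
database = {"alice": {"bio": "hello"}}

class ProfileHandler(tornado.web.RequestHandler):
    def initialize(self, database):
        # Runs before each request; stores the shared resource on the handler.
        self.database = database

    def get(self, username):
        self.write(self.database.get(username, {}))

app = tornado.web.Application([
    (r"/user/(.*)", ProfileHandler, dict(database=database)),
])

if __name__ == "__main__":
    app.listen(8888)
    tornado.ioloop.IOLoop.current().start()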
