Is it correct to say that one should use the initialize method to prepare resources that will be shared by all the other methods (e.g. get, post, etc.) of a RequestHandler subclass?
What are the other common use cases for using initialize in Tornado? It'd be great to have some examples!
Why don't you like the example in the Tornado code?
def initialize(self):
    """Hook for subclass initialization.

    A dictionary passed as the third argument of a url spec will be
    supplied as keyword arguments to initialize().

    Example::

        class ProfileHandler(RequestHandler):
            def initialize(self, database):
                self.database = database

            def get(self, username):
                ...

        app = Application([
            (r'/user/(.*)', ProfileHandler, dict(database=database)),
        ])
    """
    pass
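As for other common use cases: one pattern you see a lot is reusing a single handler class at several routes, with each URL spec injecting different resources or flags through initialize(). A rough sketch (the handler and resource names below are made up, not from the Tornado docs):

import tornado.web
from tornado.web import Application, RequestHandler

class ItemHandler(RequestHandler):
    def initialize(self, db, readonly=False):
        # Runs once per request, before get()/post(); a good place to
        # stash shared resources and per-route options on self.
        self.db = db
        self.readonly = readonly

    def get(self, item_id):
        self.write(self.db.get(item_id, {}))

    def post(self, item_id):
        if self.readonly:
            raise tornado.web.HTTPError(405)
        # ... write to self.db ...

# db and archive_db are assumed to exist; each route passes its own kwargs.
app = Application([
    (r'/items/(.*)', ItemHandler, dict(db=db)),
    (r'/archive/items/(.*)', ItemHandler, dict(db=archive_db, readonly=True)),
])

In other words, initialize() (rather than __init__) is the supported hook for wiring per-handler resources such as database handles, and the third element of the URL spec is how the application feeds them in.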
I have a Django model that makes use of some libraries which I would like to be able to override. For instance, when testing I'd like to pass a mock instead of having my model tightly coupled. I can do this in python, but for the life of me I can't figure out how to do it with a Django model. Here's a simplified example not using Django:
import requests

class APIClient:
    def __init__(self, **kwargs):
        self.http_lib = kwargs.get("http_lib", requests)

    def get_url(self, url):
        return self.http_lib.get(url)
For regular use of this class I can still use requests, but if I want to use a different library for some reason, or if I want to test certain outcomes, I can invoke the class with client = APIClient(http_lib=MockRequests()).
But how do I do that with a Django model? If I try to pass kwargs that aren't backed by a database field Django throws an error. Overriding __init__ is not considered a good practice either. Is there a way in Django to set and get a value that isn't backed by a database column?
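For concreteness, here is roughly what fails (a hypothetical model; the exact error message varies by Django version, but it is a TypeError about an unexpected keyword argument):

from django.db import models

class APIClient(models.Model):
    endpoint = models.CharField(max_length=200)  # a real database field

# APIClient(http_lib=MockRequests())
# -> TypeError: 'http_lib' is not a model field / unexpected keyword argument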
Do you have a settings.TEST var? If so, you could make http_lib a function that returns the proper lib:
from django.conf import settings

def get_http_lib(mock=None):
    if not mock:
        return requests
    return MockRequests()

class APIClient(Model):
    def __init__(self, **kwargs):
        # ...whatever...

    @property
    def some_column(self):
        http_lib = get_http_lib(settings.TEST)
        # ...etc...
Not ideal, but passable.
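A hedged sketch of how a test might then flip that flag, using django.test.override_settings (assuming the TEST setting from the snippet above exists, or is introduced, in your settings module):

from django.test import TestCase, override_settings

class APIClientColumnTests(TestCase):
    @override_settings(TEST=True)
    def test_some_column_uses_mock(self):
        client = APIClient()
        # Inside some_column, get_http_lib(settings.TEST) now returns
        # MockRequests() because TEST is truthy for this test only.
        client.some_column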
PRE-EDIT ANSWER (doesn't work):
What if you setattr subsequent to instantiating the Model?
# In model...
class APIClient(Model):
    def __init__(self, **kwargs):
        self.http_lib = requests
        # ...etc...

# In tests...
client = APIClient()
setattr(client, 'http_lib', MockRequests())
First I created some user management functions that I want to use everywhere and bound them to cherrypy, thinking I could import cherrypy elsewhere and they would be there. Other functions seem to import fine this way when not used as decorators.
from user import validuser
cherrypy.validuser = validuser
del validuser
That didn't work, so next I tried passing the function into the class that serves one section of my cherrypy site (/analyze) from the top-level class of pages:
class Root:
    analyze = Analyze(cherrypy.validuser)  # maps to /analyze
And in the Analyze class, I referred to them. This works for normal functions but not for decorators. Why not?
class Analyze:
    def __init__(self, validuser):
        self.validuser = validuser

    @cherrypy.expose
    @self.validuser(['uid'])
    def index(self, **kw):
        return analysis_panel.pick_data_sets(user_id=kw['uid'])
I'm stuck. How can I pass functions in and use them as decorators? I'd rather not wrap my functions like this:
return self.validuser(analysis_panel.pick_data_sets(user_id=kw['uid']),['uid'])
thanks.
ADDED/EDITED: Here's what the decorator is doing, because as a separate issue, I don't think it is properly adding user_id into the kwargs:
def validuser(old_function, fetch=['uid']):
    def new_function(*args, **kw):
        "... do stuff. decide if USER is logged in. return USER id or -1 ..."
        if USER != -1 and 'uid' in fetch:
            kw['uid'] = user_data['fc_uid']
        return old_function(*args, **kw)
    return new_function
Only the kwargs that were passed in appear in the kwargs for new_function; anything I try to add isn't there. (What I'm doing appears to work in this related question: How can I pass a variable in a decorator to function's argument in a decorated function?)
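For what it's worth, the kwarg injection itself does work when the decorator is written as a factory and applied at definition time; a minimal standalone sketch (no CherryPy, and the login check is replaced with a fake user id):

def validuser(fetch=('uid',)):
    def decorator(old_function):
        def new_function(*args, **kw):
            USER = 42  # pretend the login check succeeded
            if USER != -1 and 'uid' in fetch:
                kw['uid'] = USER
            return old_function(*args, **kw)
        return new_function
    return decorator

@validuser(['uid'])
def index(**kw):
    return kw['uid']

print(index())  # 42 -- the injected kwarg reaches the wrapped function

The trouble in the class body is only that self does not exist yet at class-definition time, which is what the answers below work around.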
The proper way in CherryPy to handle a situation like this is to have a tool and to enable that tool on the parts of your site that require authentication. Consider first creating this user-auth tool:
@cherrypy.tools.register('before_handler')
def validate_user():
    if USER == -1:
        return
    cherrypy.request.uid = user_data['fc_uid']
Note that the 'register' decorator was added in CherryPy 5.5.0.
Then, wherever you wish to validate the user, either decorate the handler with the tool:
class Analyze:
    @cherrypy.expose
    @cherrypy.tools.validate_user()
    def index(self):
        return analysis_panel.pick_data_sets(user_id=cherrypy.request.uid)
Or in your cherrypy config, enable that tool:
config = {
    '/analyze': {
        'tools.validate_user.on': True,
    },
}
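A hedged sketch of wiring that together (assuming Root mounts Analyze at /analyze as in the question):

import cherrypy

if __name__ == '__main__':
    # The per-path config switches the tool on for everything under /analyze.
    cherrypy.quickstart(Root(), '/', config)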
The function/method is defined in the class; it doesn't make sense to decorate it with an instance variable because it wouldn't be the same decorator for each instance.
You may consider using a property to create the decorated method when it is accessed:
@property
def index(self):
    @cherrypy.expose
    @self.validuser(['uid'])
    def wrapped_index(**kw):
        return analysis_panel.pick_data_sets(user_id=kw['uid'])
    return wrapped_index
You may also consider trying to apply lru_cache to save the method for each instance but I'm not sure how to apply that with the property.
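If you are on Python 3.8+, one hedged way to get that per-instance caching is to swap the plain property for functools.cached_property, which builds the wrapped handler once on first access and stores it on the instance (analysis_panel and validuser are as in the question):

import functools
import cherrypy

class Analyze:
    def __init__(self, validuser):
        self.validuser = validuser

    @functools.cached_property
    def index(self):
        @cherrypy.expose
        @self.validuser(['uid'])
        def wrapped_index(**kw):
            return analysis_panel.pick_data_sets(user_id=kw['uid'])
        return wrapped_index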
I'm building an HTTP API and I factored out a lot of code into a superclass that handles requests to a collection of objects. In my subclass, I specify what database models the operation should work on and the superclass takes care of the rest.
This means that I don't need to re-implement the get, post, etc. methods from the superclass; however, I want to change their docstrings in the subclass so that I can have documentation more specific to the actual model the endpoint is operating on.
What is the cleanest way to inherit the parent class's functionality but change the docstrings?
Example:
class CollectionApi(Resource):
    """Operate on a collection of something.
    """
    class Meta(object):
        model = None
        schema = None

    def get(self):
        """Return a list of collections.
        """
        # snip

    def post(self):
        """Create a new item in this collection.
        """
        # snip

class ActivityListApi(CollectionApi):
    """Operations on the collection of Activities.
    """
    class Meta(object):
        model = models.Activity
        schema = schemas.ActivitySchema
Specifically, I need ActivityListApi to have get and post run like in CollectionApi, but I want different docstrings (for automatic documentation's sake).
I can do this:
def get(self):
    """More detailed docs
    """
    return super(ActivityListApi, self).get()
But this seems messy.
class CollectionApi(Resource):
    """Operate on a collection of something.
    """
    def _get(self):
        """actual work... lotsa techy doc here!
        the get methods only serve to have something to hang
        their user docstrings onto
        """
        pass

    def get(self):
        """user-intended doc for CollectionApi"""
        return self._get()

class ActivityListApi(CollectionApi):
    def get(self):
        """user-intended doc for ActivityListApi"""
        return self._get()
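Another hedged option, if all you need is a different docstring: a tiny helper that wraps the inherited method purely to carry new documentation (redoc is a made-up name, not a library function; the Meta classes from the question are omitted for brevity):

import functools

def redoc(parent_method, docstring):
    # Thin wrapper whose only job is to carry a different docstring.
    @functools.wraps(parent_method)
    def wrapper(self, *args, **kwargs):
        return parent_method(self, *args, **kwargs)
    wrapper.__doc__ = docstring
    return wrapper

class ActivityListApi(CollectionApi):
    """Operations on the collection of Activities.
    """
    get = redoc(CollectionApi.get, "Return a list of Activities.")
    post = redoc(CollectionApi.post, "Create a new Activity in the collection.")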
I'm building a rate-limiting decorator in flask using redis stores that will recognize different limits on different endpoints. (I realize there are a number of rate-limiting decorators out there, but my use case is different enough that it made sense to roll my own.)
Basically the issue I'm having is ensuring that the keys I store in redis are class-specific. I'm using the blueprint pattern in flask, which basically works like this:
class SomeEndpoint(MethodView):
    def get(self):
        # Respond to get request
        pass

    def post(self):
        # Respond to post request
        pass
The issue here is that I want to be able to rate limit the post method of these classes without adding any additional naming conventions. In my mind the best way to do this would be something like this:
class SomeEndpoint(MethodView):
    @RateLimit  # Access SomeEndpoint class name
    def post(self):
        # Some response
        pass
but within the decorator, only the post function is in scope. How would I get back to the SomeEndpoint class given just the post function? That might be confusing, so here's a more concrete example of the decorator's basic layout:
class RateLimit(object):
    """
    The base decorator for app-specific rate-limiting.
    """
    def __call__(self, f):
        def endpoint(*args, **kwargs):
            print class_backtrack(f)  # Should print SomeEndpoint
            return f(*args, **kwargs)
        return endpoint
Basically, I'm looking for what that class_backtrack function would look like. I've looked through the inspect module, but I haven't found anything that seems to accomplish this.
You can decorate the entire class instead of just the methods:
def wrap(Class, method):
    def wrapper(self, *args, **kwargs):
        print Class
        return method(self, *args, **kwargs)
    # Python 2: method.__class__ is instancemethod, so this rebinds
    # wrapper as an unbound method of Class.
    return method.__class__(wrapper, None, Class)

def rate_limit(*methods):
    def decorator(Class):
        for method_name in methods:
            method = getattr(Class, method_name)
            setattr(Class, method_name, wrap(Class, method))
        return Class
    return decorator

@rate_limit('post')
class SomeEndpoint(object):
    def post(self):
        pass

class Subclass(SomeEndpoint):
    pass

a = Subclass()
a.post()
# prints <class 'SomeEndpoint'>
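On Python 3.6+, another hedged option is to make RateLimit a descriptor and let __set_name__ hand it the owning class when the class body is executed, so no class decorator is needed:

class RateLimit(object):
    """Decorator/descriptor that learns its owning class via __set_name__."""
    def __init__(self, f):
        self.f = f
        self.owner = None

    def __set_name__(self, owner, name):
        # Called automatically when SomeEndpoint's class body finishes executing.
        self.owner = owner

    def __get__(self, instance, owner):
        def endpoint(*args, **kwargs):
            # e.g. build a redis key like "ratelimit:SomeEndpoint:post"
            print('ratelimit:%s:%s' % (self.owner.__name__, self.f.__name__))
            return self.f(instance, *args, **kwargs)
        return endpoint


class SomeEndpoint(object):
    @RateLimit
    def post(self):
        pass

SomeEndpoint().post()  # prints ratelimit:SomeEndpoint:post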
I am trying to define a base request handling class so that my webapp pages can inherit some basic methods and variables that would otherwise have to be defined repeatedly for each page of the application, somewhat like Django preprocessors. This is my base class from which other pages inherit:
class BasePage(webapp.RequestHandler):
    def __init__(self):
        self.user = users.get_current_user()
        self.template_values = {
            'user': self.user,
            'environ': self,  # I don't like the idea of passing the whole environ object to a template
            ## The three functions below cannot be executed during __init__ because self.request is absent
            #'openid_providers': self.openid_providers(),
            #'logout_url': self.get_logout_url(),
            #'request': self.get_request(),
        }

    ## A sort of similar functionality to render_to_response in django
    def render_template(self, template_name, values=None, *args, **kwargs):
        # PATH is the directory containing the templates
        if values:
            for value in values:
                self.template_values[value] = values[value]
        self.response.out.write(template.render(PATH + template_name, self.template_values, *args, **kwargs))

    ## Returns the logout url, as the name suggests
    def logout_url(self):
        return users.create_logout_url(self.request.url)

    ## Returns the request, as the name suggests
    def request(self):
        return request

    ## Returns openid login urls
    def openid_providers(self):
        # OPENID_PROVIDERS is a list of dictionaries
        for p in OPENID_PROVIDERS:
            p['login_url'] = users.create_login_url(self.request.get('next', '/'), p['name'], p['url'])
        return OPENID_PROVIDERS
Everything is working fine except that I cannot set some of the template variables during initialization, as self.request is not available there. So as a workaround I passed the whole self object as a template variable.
Is there some other way to provide the template variables (request, logout_url etc) to the templates?
A much simpler solution than bgporter's is to do the common setup in the initialize method of webapp.RequestHandler. Here's an example from work, where we wanted to add a Django-like is_ajax method to the request object:
class BaseHandler(webapp.RequestHandler):
    def initialize(self, request, response):
        super(BaseHandler, self).initialize(request, response)
        # Add a Django-like is_ajax() method to the request object
        request.is_ajax = lambda: \
            request.environ.get('HTTP_X_REQUESTED_WITH') == 'XMLHttpRequest'
This method is called to, uh, initialize each request handler with the current request and response objects, before the appropriate get or post (or whatever) methods are called.
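A hedged usage sketch (the handler name and response bodies are made up): any handler deriving from BaseHandler can then branch on the flag in its get or post methods.

class ThingsHandler(BaseHandler):
    def get(self):
        if self.request.is_ajax():
            # XHR callers get a bare JSON fragment...
            self.response.out.write('{"things": []}')
        else:
            # ...while normal navigation gets the full page.
            self.response.out.write('<html><body>full page</body></html>')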
I've solved that problem in my AppEngine code by using the Template Method Pattern.
Basically, the base class looks like:
class MyBasePage(webapp.RequestHandler):
    def __init__(self):
        # common setup/init stuff here,
        # omitted for this discussion
        pass

    def Setup(self):
        # request handling setup code needed in both GET/POST methods, like
        # checking for user login, getting session cookies, etc.
        # omitted for this discussion
        pass

    def get(self, *args):
        self.Setup()
        # call the derived class' 'DoGet' method that actually has
        # the logic inside it
        self.DoGet(*args)

    def post(self, *args):
        self.Setup()
        # call the derived class' 'DoPost' method
        self.DoPost(*args)

    def DoGet(self, *args):
        '''Derived classes override this method and
        put all of their GET logic inside. Base class does nothing.'''
        pass

    def DoPost(self, *args):
        '''Derived classes override this method and
        put all of their POST logic inside. Base class does nothing.'''
        pass
...your derived classes then mostly just need to worry about the guts of those DoGet() and DoPost() methods.
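A hedged sketch of what such a derived page might look like (the page name and output are made up):

class DashboardPage(MyBasePage):
    def DoGet(self, *args):
        # By the time this runs, MyBasePage.get() has already called Setup(),
        # so login checks / session cookies have been handled.
        self.response.out.write('<html><body>dashboard</body></html>')

    def DoPost(self, *args):
        # Handle the form submission, then render or redirect.
        self.redirect('/dashboard')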