I would like to store some information at the "request scope" when using google app engine (python). What I mean by this is that I would like to initialize some information when a request is first received, and then be able to access it from anywhere for the life of the request (and only from that request).
An example of this would be if I saved the current user's name at request scope after they were authenticated.
How would I go about doing this sort of thing?
Thanks!
A pattern used in App Engine itself is threading.local, which you can grep for in the SDK code. For example, making os.environ request-local is done that way in runtime/request_environment.py.
A rough example:
import threading

class _State(threading.local):
    """State keeps track of request info."""
    user = None

_state = _State()
Elsewhere, you can authenticate early in your handler code:
from state import _state

if authentication_passed:
    _state.user = user
and provide a convenience function that can be used in other parts of your code:
from state import _state

def get_authenticated_user():
    user = _state.user
    if not user:
        raise AuthenticationError()
    return user
You need something like this:
class BaseHandler(webapp2.RequestHandler):
    # A function which is useful in order to determine whether the user is logged in
    def initialize(self, *a, **kw):
        # Let webapp2 set up the request/response objects first
        webapp2.RequestHandler.initialize(self, *a, **kw)
        # Do the authentication
        self.username = username

class MainHandler(BaseHandler):
    def get(self):
        print self.username
Now, if you inherit from BaseHandler, every request will first go through BaseHandler's initialize method. Since initialize sets the username and MainHandler inherits from BaseHandler, self.username will already be defined by the time get() runs.
@property
def get_maca(self, request):
    if request.user.name == "example":
        return self
I want to do something like this: if the user's name is "example", return that object.
How can I access the request like this?
The standard way is to pass the request, or in your case just the user object, from the view/router all the way down to the models.
This gets out of hand very quickly in a larger project, so my approach is to use thread-local storage to save some of the request context that I like to have available across the whole project. Thread-local storage keeps data available inside a single thread without it being accessible from other threads, which is what you want if you're going to run the Django app on a production server.
Start with the local storage:
from threading import local

from django.contrib.auth.models import AnonymousUser

_active_user = local()


def activate_user(user):
    if not user:
        return
    _active_user.value = user


def deactivate_user():
    if hasattr(_active_user, "value"):
        del _active_user.value


def get_user():
    """Returns `(is_anonymous, user)`."""
    active_user = getattr(_active_user, "value", None)
    if active_user and active_user is not AnonymousUser:
        try:
            return False, active_user
        except AttributeError:
            pass
    return True, None
That's all good, and you can use this manually: calling activate_user makes the user retrievable via get_user anywhere in your project. However, this is error prone; if you forget to call deactivate_user, the user object will still be available to the next request served by that thread.
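For example, used manually from a view it looks roughly like this (the module name request_globals and the view are purely illustrative):

# Purely illustrative usage; module and view names are assumptions.
from request_globals import activate_user, deactivate_user, get_user

def my_view(request):
    activate_user(request.user)
    try:
        is_anonymous, user = get_user()  # callable from any layer, e.g. models or services
        ...
    finally:
        deactivate_user()  # avoid leaking the user into the next request on this thread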
The rest of the answer is to show how to make things automatic.
Let's first make a middleware to clean up by calling deactivate_user after every single request.
class ThreadCleanupMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response
        # One-time configuration and initialization.

    def __call__(self, request):
        # Code executed for each request before
        # the view (and later middleware) are called.
        response = self.get_response(request)
        # Code executed for each request/response after
        # the view is called.
        deactivate_user()
        return response
Add the dotted path to ThreadCleanupMiddleware to the end of your settings.MIDDLEWARE list.
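For example, assuming the middleware lives in myproject/middleware.py (the dotted path is an assumption):

# settings.py
MIDDLEWARE = [
    # ... Django's built-in middleware ...
    "myproject.middleware.ThreadCleanupMiddleware",  # last, so cleanup runs right after the view
]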
Finish up with a view mixin that activates the user automatically (that's for class-based views; for function-based views it would be a decorator instead, sketched after the mixin below):
from rest_framework import viewsets


class ContextViewSetMixin:
    def initial(self, request, *args, **kwargs):
        super().initial(request, *args, **kwargs)
        if request.user.is_authenticated:
            activate_user(request.user)


class ContextModelViewSet(ContextViewSetMixin, viewsets.ModelViewSet):
    pass
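For function-based views, a rough equivalent is a small decorator (a sketch; the decorator name is an assumption):

from functools import wraps

def with_active_user(view_func):
    @wraps(view_func)
    def wrapper(request, *args, **kwargs):
        # Make the user retrievable via get_user() for the duration of the view.
        if request.user.is_authenticated:
            activate_user(request.user)
        return view_func(request, *args, **kwargs)
    return wrapper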
I'm building a single database/shared schema multi-tenant application using Django 2.2 and Python 3.7.
I'm attempting to use the new contextvars api to share the tenant state (an Organization) between views.
I'm setting the state in a custom middleware like this:
# tenant_middleware.py
import contextvars

from organization.models import Organization
import tenant.models as tenant_model

tenant = contextvars.ContextVar('tenant', default=None)


class TenantMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        response = self.get_response(request)
        user = request.user
        if user.is_authenticated:
            organization = Organization.objects.get(
                organizationuser__is_current_organization=True,
                organizationuser__user=user,
            )
            tenant_object = tenant_model.Tenant.objects.get(organization=organization)
            tenant.set(tenant_object)
        return response
I'm using this state by having my app's models inherit from a TenantAwareModel like this:
# tenant_models.py
from django.contrib.auth import get_user_model
from django.db import models
from django.db.models.signals import pre_save
from django.dispatch import receiver

from organization.models import Organization
from tenant_middleware import tenant

User = get_user_model()


class TenantManager(models.Manager):
    def get_queryset(self, *args, **kwargs):
        tenant_object = tenant.get()
        if tenant_object:
            return super(TenantManager, self).get_queryset(*args, **kwargs).filter(tenant=tenant_object)
        else:
            return None


@receiver(pre_save)
def pre_save_callback(sender, instance, **kwargs):
    tenant_object = tenant.get()
    instance.tenant = tenant_object


class Tenant(models.Model):
    organization = models.ForeignKey(Organization, null=False, on_delete=models.CASCADE)

    def __str__(self):
        return self.organization.name


class TenantAwareModel(models.Model):
    tenant = models.ForeignKey(
        Tenant,
        on_delete=models.CASCADE,
        related_name='%(app_label)s_%(class)s_related',
        related_query_name='%(app_label)s_%(class)ss',
    )

    objects = models.Manager()
    tenant_objects = TenantManager()

    class Meta:
        abstract = True
In my application the business logic can then retrieve querysets using .tenant_objects... on a model class rather than .objects...
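For illustration, with a hypothetical Project model that inherits from TenantAwareModel, the two managers behave like this:

# Hypothetical model, for illustration only
class Project(TenantAwareModel):
    name = models.CharField(max_length=100)

# Unscoped queryset:
Project.objects.all()
# Scoped to the tenant held in the context variable (None if no tenant is set):
Project.tenant_objects.all()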
The problem I'm having is that it doesn't always work - specifically in these cases:
1. In my login view, after login() is called, the middleware runs and I can see the tenant is set correctly. When I redirect from my login view to my home view, however, the state is (initially) empty again and only seems to get set properly after the home view executes. If I reload the home view, everything works fine.
2. If I log out and then log in again as a different user, the state from the previous user is retained, again until I do a reload of the page. This seems related to the previous issue; it is almost as if the state is lagging (for lack of a better word).
3. I use Celery to spin off shared_tasks for processing. I have to manually pass the tenant to these, as they don't pick up the context.
Questions:
Am I doing this correctly?
Do I need to manually reload the state somehow in each module?
I'm frustrated, as I can find almost no examples of doing this and very little discussion of contextvars. I'm trying to avoid passing the tenant around manually everywhere or using thread locals.
Thanks.
You're only setting the context after the response has been generated. That means it will always lag. You probably want to set it before, then check afterwards whether the user has changed.
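For instance, a minimal sketch of the "set it before" part, reusing the names from the middleware above (ordering relative to the authentication middleware is an assumption):

class TenantMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        # Set the tenant before the view runs, so tenant.get() sees it inside the view.
        user = request.user
        if user.is_authenticated:
            organization = Organization.objects.get(
                organizationuser__is_current_organization=True,
                organizationuser__user=user,
            )
            tenant.set(tenant_model.Tenant.objects.get(organization=organization))
        else:
            tenant.set(None)
        return self.get_response(request)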
Note, though, that I'm not really sure this will ever work exactly how you want. Context vars are by definition local, but in an environment like Django you can never guarantee that consecutive requests from the same user will be served by the same server process, and similarly one process can serve requests from multiple users. Plus, as you've noted, Celery runs in yet another separate process, which won't share the context.
I'm trying to test my Django app, which has a proxy API that is instantiated in its own module.
api.py
class ProxyApi(object):
    def __init__(self, server_info):
        pass

    def validate_login(self, credentials):
        # call to real api here
        pass

api = ProxyApi()
middlewares.py
from django.utils.deprecation import MiddlewareMixin
from mymodule.api import api

class MyMiddleware(MiddlewareMixin):
    def process_request(self, request):
        if api.validate_login():
            # do something with proxy api
            pass
views.py
from django.contrib.auth.mixins import LoginRequiredMixin
from django.views.generic import FormView
from mymodule.api import api

class TaskView(LoginRequiredMixin, FormView):
    def get(self, request):
        if api.validate_login():
            # do something with proxy api
            pass
tests.py
from unittest import mock

from django.test import TestCase
from django.urls import reverse

class InputTasksViewTest(TestCase):
    @mock.patch('mymodule.api.ProxyApi')
    def test_add(self, mock_api):
        mock_api.validate_login.return_value = True
        response = self.client.get(reverse('task'))
The original validate_login is still called.
I would like to know how to handle the instantiation of ProxyApi while still retaining mocking capacity.
OK, I found my own solution. The problem was that once Django starts, it imports some files (like models, views, or middleware) that automatically instantiate the api variable at import time.
I just needed to defer this instantiation so I could mock the ProxyApi object. Here's what I did:
from django.utils.functional import SimpleLazyObject

api = SimpleLazyObject(lambda: ProxyApi())
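With the lazy wrapper, the proxy is no longer constructed at import time, so a patch applied in a test takes effect before the real class is ever used. A sketch of how the test could then look (note that the instance the lazy object ends up wrapping is mock_api.return_value):

class InputTasksViewTest(TestCase):
    @mock.patch('mymodule.api.ProxyApi')
    def test_add(self, mock_api):
        # The lazy object calls ProxyApi() on first access, which now yields the mock instance.
        mock_api.return_value.validate_login.return_value = True
        response = self.client.get(reverse('task'))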
You have def validate_login(self, credentials): in api.py, but in the middleware you call it with no arguments (code below). How will you send credentials to the API from the middleware? You should pass the credentials as a parameter: api.validate_login(<credentials>).
from mymodule.api import api

class MyMiddleware(MiddlewareMixin):
    def process_request(self, request):
        if api.validate_login():
            pass
I implemented a server-side session in Flask with SQLAlchemy, based on this snippet:
class SqlAlchemySession(CallbackDict, SessionMixin):
    ...

class SqlAlchemySessionInterface(SessionInterface):
    def __init__(self, db):
        self.db = db

    def open_session(self, app, request):
        ...

    def save_session(self, app, session, response):
        ...
Everything works as expected. When the user logs in, a session is stored in the database, and the session id is placed in a cookie and returned to the user. When the user logs out, session.clear() is called, and the cookie is removed from the user.
However, the session is not deleted from the database. I was hoping that I could implement this logic in my SqlAlchemySessionInterface class, as opposed to defining a function and calling this instead of session.clear().
Likewise, in the sessions.py code, there isn't any reference to clear, and the only time a cookie is deleted is if the session was modified.
The API documentation for sessions also doesn't indicate how the clear method works.
Would anyone know of a way of accomplishing this, other than replacing all my calls to session.clear() with:
def clear_session():
    sid = session.get('sid')
    if sid:
        db.session.query(DBSession).filter_by(sid=sid).delete()
        db.session.commit()
    session.clear()
If you want to remove the duplication, you can define a function called logout_user.
In that function, remove the session record from your database and then call session.clear().
Call this function in your /logout view or wherever else it is suitable.
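A rough sketch, assuming the same db, DBSession, and Flask session objects used in the question (the app, route, and 'index' endpoint are assumptions):

from flask import redirect, session, url_for

def logout_user():
    sid = session.get('sid')
    if sid:
        # Remove the server-side session row as well as the client-side session data.
        db.session.query(DBSession).filter_by(sid=sid).delete()
        db.session.commit()
    session.clear()

@app.route('/logout')
def logout():
    logout_user()
    return redirect(url_for('index'))  # 'index' endpoint is an assumption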
I'm having trouble creating admin pages on my Python Google App Engine site. I think the answer should be pretty straightforward, but honestly, I've been trying to understand how classes inherit from other classes and how functions can wrap other functions, and I just can't seem to get a good grasp of it.
Basically, my site has two kinds of pages: the main page, and some pages that allow the user to perform admin actions. The main page can be seen by anyone without signing in. The other pages are for admins. The only users with accounts are admins, so I've set up webapp2 sessions, and as long as
self.session.get('username')
returns something, that's enough to allow access to the other pages.
Here are my handlers:
class BaseHandler(webapp2.RequestHandler):
    def write(self, *a, **kw):
        self.response.out.write(*a, **kw)

    def render(self, template, **kw):
        self.response.out.write(render_str(template, **kw))

    def dispatch(self):
        # Get a session store for this request.
        self.session_store = sessions.get_store(request=self.request)
        try:
            # Dispatch the request.
            webapp2.RequestHandler.dispatch(self)
        finally:
            # Save all sessions.
            self.session_store.save_sessions(self.response)

    @webapp2.cached_property
    def session(self):
        # Returns a session using the default cookie key.
        return self.session_store.get_session()


class MainHandler(BaseHandler):
    def get(self):
        animals = Animal.query().fetch(100)
        self.render('index.html', animals=animals)


class AdminHandler(BaseHandler):
    def get(self):
        if self.session.get('username'):
            self.render('admin.html')
        else:
            self.render('signin.html')


class ReorderHandler(BaseHandler):
    def get(self):
        self.render('reorder.html')

    def post(self):
        # Change order of item display
        self.write('OK')


class DeleteHandler(BaseHandler):
    def get(self):
        self.render('delete.html')

    def post(self):
        # Delete entry from db
        self.write('OK')


class AddHandler(BaseHandler):
    def get(self):
        self.render('add.html')

    def post(self):
        # add entry to db
        self.write('OK')


class SigninHandler(BaseHandler):
    def post(self):
        # Check username and password
        if valid:
            self.session['username'] = username
            self.redirect('/admin')
        else:
            self.write('Not valid')
The AdminHandler lays out the basic logic of what these Admin pages should do.
If someone is trying to access an admin page, the handler should check whether the user is signed in and, if so, allow access to the page. If not, it renders the sign-in page.
Reorder, Delete, and Add are all actions I want admins to be able to do, but there might be more in the future. I could add the AdminHandler logic to all the GETs and POSTs of those other handlers, but that is extremely repetitive and therefore I am sure that it is the wrong thing to do.
Looking for some guidance on how to get the logic of AdminHandler incorporated into all of the other Handlers that cover "administrative" tasks.
Update: Brent Washburne pointed me in the right direction enough to get the thing working, although I still don't feel like I understand what the decorator function actually does. Anyway, the code seems to be working, and now looks like this:
def require_user(old_func):
    def new_function(self):
        if not self.session.get('username'):
            self.redirect('/signin')
            return
        old_func(self)
    return new_function


class AdminHandler(BaseHandler):
    @require_user
    def get(self):
        self.render('admin.html')


class AddHandler(BaseHandler):
    @require_user
    def get(self):
        self.render('add.html')

    @require_user
    def post(self):
        name = self.request.get('name')
        qry = Animal.query(Animal.name == name).get()
        if not qry:
            new_animal = Animal(name=name)
            new_animal.put()
        self.write('OK')
And so on for all the other "admin" Handlers.
Here's a brute-force way to ensure a user is logged in for every page (except the login page), or it redirects them to the login page:
def dispatch(self):
    # Get a session store for this request.
    self.session_store = sessions.get_store(request=self.request)
    if not self.session.get('username') and self.request.path != '/login':
        return self.redirect('/login')
A better way is to add this code to the top of every get() and put() routine:
def get(self):
    if not self.session.get('username'):
        return self.redirect('/login')
An even better way is to turn that code into a decorator so all you need to add is one line:
@require_login
def get(self):
    ....
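Such a decorator could look roughly like the require_user function shown in the question's update, for example (the name and redirect target are assumptions):

def require_login(old_func):
    def new_function(self, *a, **kw):
        # Redirect to the login page if no username is stored in the session.
        if not self.session.get('username'):
            return self.redirect('/login')
        return old_func(self, *a, **kw)
    return new_function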