How to delete Flask HTTP Sessions from the Database on session.clear() - python

I implemented a server-side session in Flask with SQLAlchemy based on this snippet:
class SqlAlchemySession(CallbackDict, SessionMixin):
    ...

class SqlAlchemySessionInterface(SessionInterface):
    def __init__(self, db):
        self.db = db

    def open_session(self, app, request):
        ...

    def save_session(self, app, session, response):
        ...
Everything works as expected. When the user logs in, a session is stored in the database, and the session id is placed in a cookie and returned to the user. When the user logs out, session.clear() is called, and the cookie is removed from the user's browser.
However, the session is not deleted from the database. I was hoping that I could implement this logic in my SqlAlchemySessionInterface class, as opposed to defining a function and calling this instead of session.clear().
Likewise, in the sessions.py code, there isn't any reference to clear, and the only time a cookie is deleted is if the session was modified.
The API documentation for sessions also doesn't indicate how the clear method works.
Would anyone know of a way of accomplishing this, other than replacing all my calls to session.clear() with:
def clear_session():
    sid = session.get('sid')
    if sid:
        db.session.query(DBSession).filter_by(sid=sid).delete()
        db.session.commit()
    session.clear()

If you want to remove the duplication, you can define a function called logout_user.
In that function, remove the session record from your database and then call session.clear().
Call this function in your /logout view, or wherever else it is suitable.
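A minimal sketch of what that helper might look like, reusing the DBSession model and the 'sid' key from the question (those names are assumptions taken from the snippets above):

from flask import session

def logout_user():
    # delete the persisted server-side session row for this sid, if any
    sid = session.get('sid')
    if sid:
        db.session.query(DBSession).filter_by(sid=sid).delete()
        db.session.commit()
    # then clear the cookie-backed session as before
    session.clear()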

Related

Session variable set before request is not accessible in Django view during tests

I need to store some state between HTTP requests. Because I don't use a JS frontend, I decided to use session variables stored in cookies. My session settings look like this:
SESSION_ENGINE = "django.contrib.sessions.backends.signed_cookies"
SESSION_COOKIE_HTTPONLY = False
I need a function to easily manipulate one session variable. Because this functionality is quite simple, I decided to make an exception and use a function-based view:
def set_category_filter(request, pk):
    if pk == 0:
        if 'filter_category_id' in request.session:  # this is False during unit testing, but it's ok in a normal app run
            del request.session['filter_category_id']
    else:
        get_object_or_404(Category, pk=pk, user=request.user)
        request.session['filter_category_id'] = pk
    return redirect('index')
The problem is that I cannot test the case where I want to unset this session variable. Suppose the session variable filter_category_id is set and I want to reset it. When I do this in my browser, everything works fine. When I try to do it via a Django TestCase, it fails, because the session variable is not passed to the request!
class SetCategoryFilterTest(TestCase):
    def setUp(self) -> None:
        super().setUp()
        self.fake = Faker()
        self.user = User.objects.create_user(username="testuser", password="12345")
        self.client.force_login(self.user)

    def tearDown(self):
        self.client.logout()

    # ...

    def test_unset_successfully(self):
        """Verify unsetting the "filter_category_id" session field.

        Expected: "filter_category_id" is deleted when the URL is called with parameter 0.
        """
        # GIVEN
        session = self.client.session
        session['filter_category_id'] = DUMMY_INT
        session.save()
        clear_arg = 0
        # WHEN
        self.client.get(reverse("filter_tasks_by_category", args=[clear_arg]))
        # THEN
        self.assertFalse("filter_category_id" in session)
As you can see, the test assumes that the session variable is already set. Then I try to delete it, but there are two problems:
During debugging, the condition if 'filter_category_id' in request.session evaluates to False, so the session variable is not unset.
After the request finishes and execution returns to the test case, the session variable set at the beginning of the test still exists, so the test fails.
I tried to follow the rule from the documentation that the session must be stored in a variable before modification and save() called afterwards, but it didn't help. I also read somewhere that it's better to do another request, or at least call login(), in order to set up the session object. But, as you can see in the setUp() method, the login is already done.
It also doesn't help when I perform another request first to ensure that the session has been created:
def test_unset_successfully(self):
    """Verify unsetting the "filter_category_id" session field.

    Expected: "filter_category_id" is deleted when the URL is called with parameter 0.
    """
    # GIVEN
    category = Category.objects.create(user=self.user, name=self.fake.name())
    self.client.get(reverse("filter_tasks_by_category", args=[category.id]))
    session = self.client.session
    session['filter_category_id'] = DUMMY_INT
    session.save()
    clear_arg = 0
    # WHEN
    self.client.get(reverse("filter_tasks_by_category", args=[clear_arg]))
    # THEN
    self.assertFalse("filter_category_id" in session)
This code is not only awkward, it also does not confirm the claim that I have to make a request before accessing the session object.
Do you know how to deal with this situation? Basically all I want to do is to set a session variable and pass it with a request to my view.
If you just want to test the delete part, use mocks:
from unittest.mock import Mock

def test_unset_successfully(self):
    request_mock = Mock()
    request_mock.session = {"filter_category_id": 123}
    set_category_filter(request_mock, 0)
    self.assertIsNone(request_mock.session.get("filter_category_id"))
Finally I found out that I need to update the session cookie:
# GIVEN
session = self.client.session
session["filter_category_id"] = DUMMY_INT
session.save()
# After saving new session, it must be re-assigned to a session cookie
session_cookie_name = django_settings.SESSION_COOKIE_NAME
self.client.cookies[session_cookie_name] = session.session_key
Solution found on https://kurzacz.com/writing-django-tests-for-session-variables-stored-in-cookies/
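Putting that together, the whole test might look roughly like this (a sketch assembled from the snippets above; DUMMY_INT and the URL name come from the question, and the final assertion re-reads self.client.session because the locally held session object is stale after the request):

from django.conf import settings as django_settings
from django.urls import reverse

def test_unset_successfully(self):
    # GIVEN
    session = self.client.session
    session["filter_category_id"] = DUMMY_INT
    session.save()
    # re-attach the freshly saved session to the client's session cookie
    self.client.cookies[django_settings.SESSION_COOKIE_NAME] = session.session_key
    # WHEN
    self.client.get(reverse("filter_tasks_by_category", args=[0]))
    # THEN
    # reload the session from the client instead of using the stale object
    self.assertNotIn("filter_category_id", self.client.session)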

Correct way to register flask admin views with application factory

I am using an application factory to add views to my flask application like so:
(this is not my actual application factory, and has been shortened for the sake of brevity)
def create_app(config_name='default'):
    app = Flask(__name__, template_folder="templates", static_folder='static')
    admin_instance = Admin(app, name='Admin')
    admin_instance.add_view(EntityAdmin(Entity, db.session))
My EntityAdmin class looks like this:
class EntityAdmin(ModelView):
    column_filters = [
        MyCustomFilter(column=None, name='Custom')
    ]
My custom filter looks like this:
class MyCustomFilter(BaseSQLAFilter):
    def get_options(self, view):
        entities = Entity.query.filter(Entity.active == True).all()
        return [(entity.id, entity.name) for entity in entities]
The problem is that the get_options function seems to be called when the app is instantiated, so a select query runs every time create_app is called.
So if I update my database schema and run the flask db migrate command, I get an error: the new column I added does not yet exist when the select query runs, because my models are ahead of the actual database.
Can I register my views only when an actual HTTP request is made? How can I differentiate between a request and a command?
You have one more problem with this filter: its options are built when the application is instantiated, so if the list of entities changes while the application is running, the filter will keep returning the same stale options.
To fix both problems you don't need to postpone view registration. You need the filter to fetch the list of options every time it is used.
This SO answer to the question "Resetting generator object in Python" describes a way to reuse a generator (in your case, a database query):
from flask import has_app_context

def get_entities():
    # has_app_context is used to prevent database access
    # when application is not ready yet
    if has_app_context():
        for entity in Entity.query.filter(Entity.active.is_(True)):
            yield entity.id, entity.name

class ReloadingIterator:
    def __init__(self, iterator_factory):
        self.iterator_factory = iterator_factory

    def __iter__(self):
        return self.iterator_factory()

class MyCustomFilter(BaseSQLAFilter):
    def get_options(self, view):
        # This will return a generator which is
        # reloaded every time it is used
        return ReloadingIterator(get_entities)
The problem is that the query against the Entity table can be called multiple times during a request, so I usually cache the result for a single request using Flask globals:
from flask import g, has_app_context

def get_entities():
    if has_app_context():
        if not hasattr(g, 'entities'):
            query = Entity.query.filter(Entity.active.is_(True))
            g.entities = [(entity.id, entity.name) for entity in query]
        for entity_id, entity_name in g.entities:
            yield entity_id, entity_name

How to work with multiple databases with Python Pyramid

I want to work with multiple databases using the Python Pyramid framework and SQLAlchemy.
I have 1 database with user information, and multiple databases (with the same structure) where the application information is stored. Each user at login time selects a database and is only shown information from that database (not others).
How should I structure my code?
I was thinking of storing the database name in the session and, on every request, checking the user's permission for the selected database and creating a new db session. So my view would look like this (pseudo code):
@view_config(route_name='home', renderer='json')
def my_view_ajax(request):
    try:
        database = int(request.GET['database'])
        # check user permissions from the user database
        engine = create_engine('postgresql://XXX:XXX@localhost/' + str(database))
        DBSession.configure(bind=engine)
        items = DBSession.query('table').all()
    except DBAPIError:
        return 'error'
    return items
Should I generate a new db session with the user information on each request? Or is there a better way?
Thanks
This is quite easy to do in Pyramid + SQLAlchemy, but you'll likely want to switch to a heavier-boilerplate, more manual session management style, and you'll want to be up on the session management docs for SQLAlchemy, because you can easily trip up when working with multiple concurrent sessions. Also, things like connection management should stay out of views and live in components that are created at server start-up and shared across request threads. If you're doing it right in Pyramid, your views should be pretty small and you should have lots of components that work together through the ZCA (the registry).
In my apps, I have DB factory objects that hand out sessions when asked, and I instantiate these objects in the server start-up code (the stuff in __init__.py) and save them on the registry. Then you can attach sessions for each db to your request object with the reify decorator, and also attach a housekeeping end-of-request callback to close them. This can be done either with a custom request factory or with the methods for attaching to the request right from __init__; I personally wind up using a custom factory, as I find it easier to read and I usually end up adding more there.
from pyramid.config import Configurator
from pyramid.decorator import reify
from pyramid.request import Request
from pyramid.response import Response
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

# our DBFactory component, from some model package
# (Base is the declarative base from that package)
class DBFactory(object):
    def __init__(self, db_url, **kwargs):
        db_echo = kwargs.get('db_echo', False)
        self.engine = create_engine(db_url, echo=db_echo)
        self.DBSession = sessionmaker(autoflush=False)
        self.DBSession.configure(bind=self.engine)
        self.metadata = Base.metadata
        self.metadata.bind = self.engine

    def get_session(self):
        session = self.DBSession()
        return session

# we instantiate them in the __init__.py file, and save them on the registry
def main(global_config, **settings):
    """runs on server start, returns a Pyramid WSGI application"""
    config = Configurator(
        settings=settings,
        # ask for a custom request factory
        request_factory=MyRequest,
    )
    config.registry.db1_factory = DBFactory(db_url=settings['db_1_url'])
    config.registry.db2_factory = DBFactory(db_url=settings['db_2_url'])
    return config.make_wsgi_app()

# and our custom request class, probably in another file
class MyRequest(Request):
    "override the pyramid request object to add explicit db session handling"

    @reify
    def db1_session(self):
        "returns the db session at the start of the request lifecycle"
        # register a callback to close the session automatically after
        # everything else in the request lifecycle is done
        self.add_finished_callback(self.close_dbs_1)
        return self.registry.db1_factory.get_session()

    @reify
    def db2_session(self):
        self.add_finished_callback(self.close_dbs_2)
        return self.registry.db2_factory.get_session()

    def close_dbs_1(self, request):
        request.db1_session.close()

    def close_dbs_2(self, request):
        request.db2_session.close()

# now the view code can be very simple
def my_view(request):
    # get from db 1
    stuff = request.db1_session.query(Stuff).all()
    # get from db 2
    other_stuff = request.db2_session.query(OtherStuff).all()
    # the above sessions will be closed at the end of the request when
    # pyramid calls your close methods on the request factory
    return Response("all done, no need to manually close sessions here!")

Integrating a non-threaded SQLAlchemy code with Flask-SQLAlchemy

I have a Python module, UserManager, that takes care of everything user-management-related: users, groups, rights, authentication. Access to these assets is provided via a master class that is passed an SQLAlchemy engine parameter in its constructor. The engine is needed to make the table-class mappings (using mapper objects) and to emit sessions.
This is how the global variables are established in the app module:
class UserManager:
    def __init__(self, db):
        self.db = db
        self._db_session = None
        meta = MetaData(db)
        user_table = Table(
            'USR_User', meta,
            Column('field1'),
            Column('field3')
        )
        mapper(User, user_table)

    @property
    def db_session(self):
        if self._db_session is None:
            self._db_session = scoped_session(sessionmaker())
            self._db_session.configure(bind=self.db)
        return self._db_session

class User(object):
    def __init__(self, um):
        self.um = um

from flask_sqlalchemy import SQLAlchemy

db = SQLAlchemy(app)
um = UserManager(db.engine)
This module is deliberately designed to be context-agnostic, so that it can be used both from locally run scripts and from a web application.
But here the problems arise: from time to time I get the dreaded "Can't reconnect until invalid transaction is rolled back" error, presumably caused by some failed transaction in the UserManager code.
I am now trying to identify the source of the problem. Maybe this is not the right way to handle the database in the dynamic context of a web server? Perhaps I have to pass db.session to the um object so that I can be sure the db connections are not mixed up?
In a web context you should consider each user's request isolated. For this you must use flask.g:
To share data that is valid for one request only from one function to another, a global variable is not good enough because it would break in threaded environments. Flask provides you with a special object that ensures it is only valid for the active request and that will return different values for each request. In a nutshell: it does the right thing, like it does for request and session.
You can read more about it here.
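A rough sketch of how that could look with the UserManager from the question (get_um_session and the teardown hook are hypothetical names, not part of any library; um and app are the objects from the question's code):

from flask import g

def get_um_session():
    # lazily create one UserManager session per request and stash it on g
    if 'um_session' not in g:
        g.um_session = um.db_session()
    return g.um_session

@app.teardown_appcontext
def cleanup_um_session(exc):
    # roll back on failure, then return the connection to the pool
    session = g.pop('um_session', None)
    if session is not None:
        if exc is not None:
            session.rollback()
        session.close()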

Best way to integrate SqlAlchemy into a Django project

I changed my Django application to use SQLAlchemy, and it works now.
But I'm wondering where I should put these lines:
engine = sqlalchemy.create_engine(settings.DATABASE_URL)
Session = sqlalchemy.orm.sessionmaker(bind=engine)
session = Session()
The reason I'm asking is that I want to use SQLAlchemy in many places, and I don't think it's correct/clean/well-written to repeat these three lines every time I need to use the database.
The places where I will require SA are:
In my views, of course
In some middleware I wrote
In my models. Like in get_all_tags for a BlogPost Model.
What I think would be correct is to get the session by reconnecting to the database if the session is closed, or by returning the current connected session if one exists.
How can I use SQLAlchemy correctly with my Django apps?
Thanks for your help!
Note: I already followed this tutorial to implement SA into my Django application, but this one doesn't tell me exactly where to put those 3 lines (http://lethain.com/entry/2008/jul/23/replacing-django-s-orm-with-sqlalchemy/).
For the first two, engine and Session, you can put them in settings.py; they are configuration, after all.
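For instance (a sketch; DATABASE_URL stands for whatever connection string your project already uses):

# settings.py
import sqlalchemy
import sqlalchemy.orm

DATABASE_URL = 'postgresql://user:password@localhost/mydb'  # example value
engine = sqlalchemy.create_engine(DATABASE_URL)
Session = sqlalchemy.orm.sessionmaker(bind=engine)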
Actually creating a session requires slightly more care, since a session is essentially a transaction. The simplest thing to do is to create one in each view function when needed and commit it just before returning. If you'd like a little more magic than that, or if you want/need to use the session outside of the view function, you should instead define some middleware, something like:
class MySQLAlchemySessionMiddleware(object):
    def process_request(self, request):
        request.db_session = settings.Session()

    def process_response(self, request, response):
        try:
            session = request.db_session
        except AttributeError:
            return response
        try:
            session.commit()
            return response
        except:
            session.rollback()
            raise

    def process_exception(self, request, exception):
        try:
            session = request.db_session
        except AttributeError:
            return
        session.rollback()
Then every view will have a db_session attribute on its request, which it can use as it sees fit, and anything that was added will get committed when the response is finished.
Don't forget to add the middleware to MIDDLEWARE_CLASSES.
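For example (a sketch; the dotted path, the BlogPost model, and the template name are placeholders):

# settings.py
MIDDLEWARE_CLASSES += ('myproject.middleware.MySQLAlchemySessionMiddleware',)

# views.py
from django.shortcuts import render

def blog_posts(request):
    # the middleware attached a fresh SQLAlchemy session to this request;
    # it will be committed (or rolled back) when the response is finished
    posts = request.db_session.query(BlogPost).all()
    return render(request, 'posts.html', {'posts': posts})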
