I am working on sessions in Django. By default, Django stores sessions in the django_session table, and I found out that it never purges expired sessions on its own.
The clearsessions management command can be used to delete the expired rows, and it is recommended to run it as a cron job. But doesn't doing this log out all logged-in users?
Is this the right way to do it?
The Django documentation states (emphasis mine):
Clearing the session store
As users create new sessions on your website, session data can
accumulate in your session store. If you’re using the database
backend, the django_session database table will grow. If you’re using
the file backend, your temporary directory will contain an increasing
number of files.
To understand this problem, consider what happens with the database
backend. When a user logs in, Django adds a row to the django_session
database table. Django updates this row each time the session data
changes. If the user logs out manually, Django deletes the row. But if
the user does not log out, the row never gets deleted. A similar
process happens with the file backend.
Django does not provide automatic purging of expired sessions.
Therefore, it’s your job to purge expired sessions on a regular basis.
Django provides a clean-up management command for this purpose:
clearsessions. It’s recommended to call this command on a regular
basis, for example as a daily cron job.
Note that the cache backend isn’t vulnerable to this problem, because
caches automatically delete stale data. Neither is the cookie backend,
because the session data is stored by the users’ browsers.
I found this link in Abid A's answer.
The clearsessions command can be run as a cron job, or directly, to clean out expired sessions only; active sessions are left untouched, so it won't log off every user.
This was mentioned by Kevin Christopher Henry in a comment, and in the other possible duplicate of your question flagged by e4c5.
Django 1.6 or above
python manage.py clearsessions
Django 1.5 or lower
python manage.py cleanup
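For the cron job, an entry along these lines (the interpreter and project paths are placeholders for your own virtualenv and project) runs the cleanup once a day:
0 4 * * * /path/to/venv/bin/python /path/to/project/manage.py clearsessions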
From the Django shell
from django.contrib.sessions.models import Session
Session.objects.all().delete()  # deletes every session, logging all users out
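If you only want to remove expired rows from the shell, a sketch of what clearsessions does for the database backend is:
from django.utils import timezone
from django.contrib.sessions.models import Session
Session.objects.filter(expire_date__lt=timezone.now()).delete()  # expired sessions only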
django-session-cleanup (run as a periodic cron job)
Clearing a single session at logout, based on the session key present in the request:
from django.contrib.sessions.models import Session
session_key = request.data['sessionKey']
Session.objects.filter(session_key=session_key).delete()  # removes just that one session
To drop every session instead:
Session.objects.all().delete()
Newer versions of Django also allow clearing the current session's data:
request.session.clear()
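Note that request.session.clear() only empties the data of the current session; the database row and the session key remain. To delete the row and regenerate the key on logout, flush() is the usual call. A minimal sketch:
def logout_view(request):
    # flush() deletes the current session's data and regenerates the session key
    request.session.flush()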
Related
I'm having a problem with the sessions in my python/wsgi web app. There is a different, persistent mysqldb connection for each thread in each of 2 wsgi daemon processes. Sometimes, after deleting old sessions and creating a new one, some connections still fetch the old sessions in a select, which means they fail to validate the session and ask for login again.
Details: Sessions are stored in an InnoDB table in a local mysql database. After authentication (through CAS), I delete any previous sessions for that user, create a new session (insert a row), commit the transaction, and redirect to the originally requested page with the new session id in the cookie. For each request, a session id in the cookie is checked against the sessions in the database.
Sometimes, a newly created session is not found in the database after the redirect. Instead, the old session for that user is still there. (I checked this by selecting and logging all of the sessions at the beginning of each request). Somehow, I'm getting cached results. I tried selecting the sessions with SQL_NO_CACHE, but it made no difference.
Why am I getting cached results? Where else could the caching occur, and how can I stop it or refresh the cache? Basically, why do the other connections fail to see the newly inserted data?
MySQL defaults to the "REPEATABLE READ" isolation level, which means that within a transaction you will not see changes made after the transaction started - even if those (other) changes were committed.
If you issue a COMMIT or ROLLBACK in those sessions, you should see the changed data (because that will end the transaction that is "in progress").
The other option is to change the isolation level for those sessions to "READ COMMITTED". Maybe there is an option to change the default level as well, but you would need to check the manual for that.
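There is indeed such an option: if you control the MySQL server configuration, the default isolation level can be set in my.cnf (a sketch, assuming you can restart the server):
[mysqld]
transaction-isolation = READ-COMMITTED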
Yes, it looks like the assumption is that you are only going to perform a single transaction and then disconnect. If you have a different need, then you need to work around this assumption. As mentioned by @a_horse_with_no_name, you can put in a commit (though I would use a rollback if you are not actually changing data). Or you can change the isolation level on the cursor - based on this discussion, I used this:
dbcursor.execute("SET SESSION TRANSACTION ISOLATION LEVEL READ COMMITTED")
Or, it looks like you can set auto commit to true on the connection:
dbconn.autocommit(True)
Though, again, this is not recommended if you are actually making changes on the connection.
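Putting it together, a minimal sketch of the per-request workaround with MySQLdb (the table and column names here are hypothetical):
import MySQLdb

conn = MySQLdb.connect(host="localhost", user="app", passwd="secret", db="appdb")

def fetch_session(session_id):
    # End any open transaction first, so this SELECT sees rows committed
    # by other connections since our last snapshot was taken.
    conn.rollback()
    cur = conn.cursor()
    cur.execute("SELECT data FROM sessions WHERE session_id = %s", (session_id,))
    return cur.fetchone()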
Basically, what I want to do is regenerate every session with a new set of keys without making users log in again. How can I do this?
(edited for clarity)
So let's assume we are using Redis as the session backend and keeping a cookie on the client side. The cookie consists only of the session id, which corresponds to a session stored in Redis. After we have initialized sessions by writing Session(APP) in our application, we can fetch the current user's session in any request context with:
from flask import session
After the admin changes some general settings in the application, I want to regenerate the session of every current user; but the import above only ever gives me the session of the current request. This is as far as I know.
For example, let's say there is a value in the user's session set as
session['arbitrary_key'] = not_important_database_function()
After the admin changes some settings in the application, I need to reload that key in the current user's session by
session['arbitrary_key'] = not_important_database_function()
because after the admin's changes it will yield a different value. After that, I set session.modified to True. What I want to learn is how I can change arbitrary_key in the session of EVERY USER, because I don't know how to fetch every session and change it from Flask.
If I delete the sessions from Redis, users are required to re-authenticate, and I don't want that. I just want the back-end sessions to be changed, because I use information stored in the user's session that is fetched from Redis, so that I don't have to call not_important_database_function on every request.
I hope this is enough information to answer, or at least not downvote, so I can continue looking for a solution to my problem.
I am not sharing code snippets because, in my opinion, no code snippet would help with this case.
The question is rather old but it looks like many developers are interested in the answer. There are several approaches which come to mind:
1. Lazy calculations
You need a way to differentiate old and new session values. For example, storing version number in session. Then you can force clients to update their sessions when they are on your resource:
CURRENT_VERSION = '1.2.3'

@app.route('/some_route')
def some_handler():
    if session.get('version', '0.0.0') < CURRENT_VERSION:
        session['arbitrary_key'] = not_important_database_function()
        session['version'] = CURRENT_VERSION
Pros: The method is easy to implement, and it is agnostic to how session data is stored (database, user-agent, etc.).
Cons: Session data is not updated instantly after a deploy, and you have to accept that some users' session data may never be updated at all.
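One caveat with the sketch above: comparing version strings compares them lexicographically, so '1.10.0' sorts before '1.2.3'. A plain integer schema version is a safer variant:
SESSION_SCHEMA_VERSION = 3  # integer, bumped on each deploy that changes the session layout

if session.get('schema_version', 0) < SESSION_SCHEMA_VERSION:
    session['arbitrary_key'] = not_important_database_function()
    session['schema_version'] = SESSION_SCHEMA_VERSION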
2. Session storage update
You need to perform something like a database migration for the session storage. It is backend-dependent and will look different for different storages. For Redis, assuming the sessions are stored as JSON, it may look like this:
import json
import redis

r = redis.Redis()  # your connection settings here
for key in r.scan_iter():  # iterate over all keys in the database
    session = json.loads(r.get(key))
    session['arbitrary_key'] = not_important_database_function()
    r.set(key, json.dumps(session))
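Note that the exact key layout depends on your session library. Flask-Session, for example, prefixes its Redis keys with session: and pickles the values rather than storing JSON, so in that case you would scan with r.scan_iter('session:*') and use pickle.loads/pickle.dumps instead of the json calls.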
Pros: Session data for all users will be updated right after your service deployment.
Cons: The implementation differs between storages, and it is not applicable if all session data is stored in user-agents.
3. Dropping session data
It is still an option, even though the question clearly states that you need to keep the users logged in. It may be part of your policy for deploying new application versions: the secret key used for sessions is rotated on each deployment, and all user sessions are invalidated.
Pros: You don't need to implement any logic to set new user session values.
Cons: There are no new user session values, the data is just wiped out.
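For a Redis-backed store, this can be as blunt as flushing the database that holds the sessions (a sketch, assuming the sessions live in a dedicated Redis database):
import redis

r = redis.Redis(db=1)  # assuming db 1 is dedicated to session data
r.flushdb()  # wipes every session; all users are logged out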
I have a site with a hierarchy of pages that shows tables based on complex calculations over values stored in a database. That database can be updated by an external application. Because the calculation takes a long time, I prefer to use per-page caching to serve the result pages (I'm using DatabaseCache). After the external application updates the database, I can clear the cache, but I would rather refresh it (create a new version) before any user visits, so that users only ever see the next cached version.
Is there any way in Django to force a cache refresh from an external application?
The only idea that comes to mind is calling some Django code from the external app that requests all the page URLs one by one after the cache is deleted.
I would be grateful for your advice anyway.
In your external script you can do:
import os
import django

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'yourproject.settings')  # your settings module here
django.setup()  # needed to make Django ready from the external script

from django.core.cache import cache
cache.clear()  # flush all the old cache entries
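To pre-warm the per-page cache after clearing it, as suggested in the question, the same script can simply request each page again (PAGE_URLS is a hypothetical list of the cached pages):
import requests

PAGE_URLS = ['https://example.com/page-1/', 'https://example.com/page-2/']
for url in PAGE_URLS:
    requests.get(url)  # each response is rendered once and stored back into the page cache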
I've been working on a small project, and I've started to wonder how to keep some data after the user logs in or out.
For example, say a user adds items to his shopping cart while not logged in. Django sessions by default generate a new session_id after the user logs in or out, so when the user adds 5 products to his cart and then logs in, his cart will be cleared.
How can I introduce an element of persistence to this user data? For example, should I use signals or some sort of middleware to carry the cart from the old session to the new one?
My main goal is to keep it safe, so I don't want to disable any security mechanisms.
You have to use the database-backed session. From the doc:
you need to add 'django.contrib.sessions' to your INSTALLED_APPS
setting.
Once you have configured your installation, run manage.py migrate to install the single database table that stores session data.
Then you have to ensure that session.flush() is not called in the logout/login process, which implies avoiding django.contrib.auth.logout(), which calls session.flush(); it is also called in django.contrib.auth.login(). Log the user in and out yourself to avoid losing the session data (see the source for login/logout).
The session is flushed at login/logout, as a security measure. If you want to retain some variables, you can use the solution at:
https://stackoverflow.com/a/41849076/146289
It basically involves backing up old values, and then restoring them in the new session.
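A minimal sketch of that back-up-and-restore approach (the 'cart' key is hypothetical):
from django.contrib import auth

def login_view(request):
    cart = request.session.get('cart')  # back up before login() flushes/cycles the session
    user = auth.authenticate(request,
                             username=request.POST['username'],
                             password=request.POST['password'])
    if user is not None:
        auth.login(request, user)
        if cart is not None:
            request.session['cart'] = cart  # restore into the fresh session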