Idle Timeout of Authentication Tokens - python

I have an app using Flask and SQLAlchemy that allows users to make calls to a RESTful API using authentication tokens which I provide to them when they log into the application. One of the requirements of the application is that if a user is idle for 15 minutes, their token should expire and they will be required to log into the application again to get a new one.
Currently I am doing this by storing a user's session in a table:
from datetime import datetime, timedelta

from itsdangerous import Serializer, BadSignature  # assumed source of the question's Serializer

class Session(Model):
    id = Column(Integer, primary_key=True)
    user_id = Column(Integer, ForeignKey('user.id'))
    user = relationship('User')
    creation_time = Column(DateTime, default=datetime.now)
    last_active = Column(DateTime, default=datetime.now)

    _serializer = Serializer('foobar')

    def is_active(self):
        # note: timedelta(15 * 60) would mean 900 *days*; the idle window is 15 minutes
        return (datetime.now() - self.last_active) < timedelta(minutes=15)

    def create_token(self):
        return self._serializer.dumps({
            'session_id': self.id,
            'user_id': self.user_id,
        })

    @staticmethod
    def from_token(token):
        try:
            data = Session._serializer.loads(token)
        except BadSignature:
            return None
        session = Session.query.get(data.get('session_id'))
        if session and session.is_active() and session.user_id == data.get('user_id'):
            # Refresh the session
            session.last_active = datetime.now()
            return session
class User(Model):
    id = Column(Integer, primary_key=True)
    username = Column(String)

    def generate_token(self):
        # SQLAlchemy models take keyword arguments
        return Session(user_id=self.id).create_token()

    @staticmethod
    def verify_auth_token(token):
        session = Session.from_token(token)
        if session:
            return session.user
Is there a more elegant and efficient way of doing this? I have seen many examples where the token stores the creation time and the tokens are only valid for a specific amount of time (which allows for an easy test without having to know which user it is, etc.), but I haven't been able to find any information about expiration based on a user's idle time.
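For reference, the fixed-lifetime pattern mentioned above is usually built on a timed serializer. A minimal sketch, assuming the Serializer/BadSignature names in the code come from itsdangerous:

from itsdangerous import URLSafeTimedSerializer, SignatureExpired, BadSignature

_serializer = URLSafeTimedSerializer('foobar')

def create_token(user_id):
    return _serializer.dumps({'user_id': user_id})

def verify_token(token, max_age=15 * 60):
    try:
        # loads() checks the token's embedded timestamp against max_age (seconds)
        return _serializer.loads(token, max_age=max_age)
    except (SignatureExpired, BadSignature):
        return None

Note that this enforces a fixed lifetime from creation, which is exactly why it cannot express the idle-timeout requirement on its own.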

Apart from the potential bug that session.last_active = … is assigned without ever being flushed or committed (where session is the SQLAlchemy-mapped object), this is a totally reasonable way to do it.
In general, sites use a non-relational datastore (e.g., Redis) for session management because it can be more performant (RDBMSs tend to be slower for writes, and in this case especially you'll be doing a lot of writes).
The logic would still be the same, though:
import json

def session_from_token(token):
    key = "session:%s" % (token, )
    session_json = redis.get(key)
    if not session_json:
        return None
    session = json.loads(session_json)
    if not session.get("is_valid"):
        return None
    # sliding expiry: each successful lookup pushes the TTL out another 15 minutes
    redis.expire(key, 15 * 60)
    return session
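The write side would mirror this. A sketch of the login path under the same assumptions (redis is a redis-py client; the token scheme is illustrative):

import json
import secrets

def create_session(user_id):
    token = secrets.token_urlsafe(32)
    session = {"user_id": user_id, "is_valid": True}
    # setex stores the value with a TTL, so idle sessions expire on their own
    redis.setex("session:%s" % (token, ), 15 * 60, json.dumps(session))
    return token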


Flask admin remember form value

In my application, I have Users and Posts as models. Each post has a foreign key to a username. When I create a ModelView on top of my Posts model, I can create posts as specific users in the admin interface (screenshot omitted).
After I have added a post and click "Save and Add Another", the "User" reverts back to "user1". How can I make the form remember the previous value "user2"?
My research has led me to believe it can be done by modifying on_model_change and on_form_prefill and saving the previous value in the Flask session, but that seems like overengineering for such a simple task. There must be a simpler way.
My code can be seen below:
from flask import Flask
from flask_sqlalchemy import SQLAlchemy
import flask_admin
from flask_admin.contrib import sqla

app = Flask(__name__)
db = SQLAlchemy()
admin = flask_admin.Admin(name='Test')

class Users(db.Model):
    """Contains the users of the database."""
    user_id = db.Column(db.Integer, primary_key=True)
    username = db.Column(db.String(64), index=True, unique=True, nullable=False)

    def __str__(self):
        return self.username

class Posts(db.Model):
    """Contains the posts of the database."""
    post_id = db.Column(db.Integer, primary_key=True)
    username = db.Column(db.String(11), db.ForeignKey(Users.username), nullable=False)
    post = db.Column(db.String(256))
    user = db.relationship(Users, backref='user')

def build_sample_db():
    db.drop_all()
    db.create_all()
    # a plain dict would silently drop a duplicate 'user1' key, so map each user to a list of posts
    data = {'user1': ['post1', 'post2'], 'user2': ['post1']}
    for username, posts in data.items():
        db.session.add(Users(username=username))
        for post in posts:
            db.session.add(Posts(username=username, post=post))
    db.session.commit()

class MyModelView(sqla.ModelView):
    pass

if __name__ == '__main__':
    app.config['SECRET_KEY'] = '123456790'
    app.config['DATABASE_FILE'] = 'sample_db.sqlite'
    app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite:///database'
    app.config['SQLALCHEMY_ECHO'] = True
    db.init_app(app)
    admin.init_app(app)
    admin.add_view(MyModelView(Posts, db.session))
    with app.app_context():
        build_sample_db()
    # Start app
    app.run(debug=True)
I have come across this situation before and solved it using two functions. It's pretty easy and small.
@expose('/edit/', methods=('GET', 'POST'))
def edit_view(self):
    # write your logic to populate the value into the HTML
    self._template_args["arg_name"] = stored_value
    # in your HTML, find this value to populate it as you need
The function above lets you populate the value into the HTML when the user tries to edit an entry, using the value stored earlier. Below is the function that saves the value from the previous edit; both go inside your class MyModelView(sqla.ModelView):
def on_model_change(self, form, model, is_created):
    stored_value = model.user  # this is the user name being stored
    # get the value of the column from your model and save it
This is a two-step operation that is pretty small and quick. I have added just a skeleton/pseudocode for now.
on_form_prefill will not help you with this problem, as it only works for the edit form. As the library code says, on_form_prefill exists to "perform additional actions to pre-fill the edit form".
When a request hits, the code flows as follows:
(Base class) View(create_view) -> create_form -> _create_form_class -> get_create_form -> get_form -> scaffold_form
Hence, if we want to change the default value of the create form being loaded, we can update it in the methods handling the form, as explained in the flow above. So we can override create_form:
def create_form(self):
    form = super().create_form()
    # set the user id we want in the form as the default
    form.user_id.default = 'USER_2_ID'
    return form
That covers getting the default value into the form. To set the value, we override on_model_change:
def on_model_change(self, form, model, is_created):
    # read the USER_ID from model.user_id here and store it for the next request
    pass
Now, there are two ways to share this data (USER_ID) between the setter and the getter:
1. Set it in a cookie (or the signed Flask session) and read it on the create request; a sketch of this follows after the snippet below.
2. Update the "Save and Add Another" button link to add the user_id to the query string.
This data has to be shared between two different requests, so storing it in the application context g won't work: the application context "will not be shared between requests".
def create_form(self):
    form = super().create_form()
    user_id = request.args.get('user_id')
    form.user_id.default = user_id
    # or, if you want only one option to be available:
    user = self.session.query(User).filter(User.id == user_id).one()
    form.user.query = [user]
    return form
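For the cookie/session route, here is a minimal sketch using the signed Flask session that the question itself mentions (the key and attribute names are illustrative, not flask-admin API):

from flask import session as flask_session

class MyModelView(sqla.ModelView):
    def on_model_change(self, form, model, is_created):
        # remember which user was picked on this save
        flask_session['last_user_id'] = model.user.user_id

    def create_form(self, obj=None):
        form = super().create_form(obj)
        last_user_id = flask_session.get('last_user_id')
        if last_user_id is not None and not form.user.data:
            # default the relationship field to the previously chosen user
            form.user.data = self.session.query(Users).get(last_user_id)
        return form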
As the answer above mentions, overriding edit_view/create_view can also be used for this. An easy option, though not a good one, is to query the model on create to fetch the latest entry (this approach is not optimal):
@expose('/new/', methods=('GET', 'POST'))
def create_view(self):
    model = self.get_model()
    # query the model here to get the latest value of the user_id
    self._template_args['user_id'] = user_id
    return super(YourAdmin, self).create_view()

How can I automatically let API keys expire?

I am building an application which uses API keys during sessions. I have so far successfully generated the API keys and I can check them for validity and whether they go with the correct account and I've also added brute force protection.
My problem is that I would like to automatically let them expire after 24 hours. Right now I remove old keys when a user requests a new one to lessen the chance of someone guessing the right key, but this doesn't work for users who don't use the application again.
I was going to achieve this by scheduling a cronjob, as I have read others advise. However, the server the application will be hosted on isn't mine, and the person the server does belong to doesn't see the need for the automatic expiry in the first place. That means I would like to either include the expiry in the code itself, or have good reasoning for why he should let me (or do it himself) schedule a cronjob.
The table containing the API keys looks as follows:
class DBAuth(db.Model):
    __tablename__ = 'auth'
    id = db.Column(db.Integer, primary_key=True)
    user_id = db.Column(db.Integer, index=True)
    api_key = db.Column(db.String(256))
    begin_date = db.Column(db.DateTime, nullable=False)
And the API key generator is called as follows:
auth = DBAuth()
key = DBAuth.query.filter_by(user_id=user.id).first()
if key is not None:
    db.session.delete(key)
    db.session.commit()
api_key = auth.generate_key(user.id)
db.session.add(auth)
db.session.commit()
With the generator function like this:
def generate_key(self, user_id):
    self.user_id = user_id
    self.api_key = ...  # key generation redacted in the question
    self.begin_date = datetime.datetime.now()
    return self.api_key
My question is really two-part:
1. Is my colleague right in saying that the automatic expiry isn't necessary?
2. Is there a way to add automatic expiry in the code instead of scheduling a cronjob?
Sorry, I don't have enough rep to comment; a simple approach would be the following:
Since you already have DateTime columns in your schema, you can add another one, say key_expiry_date, containing the creation time plus 24 hours.
You can then check key_expiry_date to validate further requests: a key past its expiry is simply rejected, which removes the need for a cronjob.
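A minimal sketch of that approach (the key_expiry_date column and is_key_valid helper are illustrative names):

import datetime

class DBAuth(db.Model):
    __tablename__ = 'auth'
    id = db.Column(db.Integer, primary_key=True)
    user_id = db.Column(db.Integer, index=True)
    api_key = db.Column(db.String(256))
    begin_date = db.Column(db.DateTime, nullable=False)
    key_expiry_date = db.Column(db.DateTime, nullable=False)

    def generate_key(self, user_id):
        self.user_id = user_id
        self.api_key = ...  # key generation redacted in the question
        self.begin_date = datetime.datetime.now()
        self.key_expiry_date = self.begin_date + datetime.timedelta(hours=24)
        return self.api_key

def is_key_valid(auth_row):
    # an expired key is treated exactly like a missing one, so no cronjob is needed
    return auth_row is not None and datetime.datetime.now() < auth_row.key_expiry_date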

Google Cloud Datastore 'NoneType' object has no attribute 'email'

When I try getting an entity from my datastore, it returns None, as if the query were empty. I know the datastore is saving properly, but pulling an entity back out with a query is not working.
Using the GQL query tool in the Google Cloud Console website, SELECT * FROM User does return all the entities. The User kind has no parent; it is at the root. I made sure all the properties are indexed as well.
I am not sure what I am doing wrong on GET.
MyApp.py
import webapp2
from google.appengine.ext import ndb
from google.appengine.ext.db import GqlQuery

class MainHandler(webapp2.RequestHandler):
    def post(self):
        message = self.request.body
        message = message.splitlines()
        if message[0] == "register":
            user = User.create_user(message[1], message[2], message[3])
            user_key = User.save_user(user)
            if user_key is not None:
                self.response.write(user_key)
        else:
            user = User.get_by_id(User.email == message[0])
            if User.token == message[1]:
                self.response.write("CURRENT")
            else:
                User.token = message[1]
                User.save_user(user)
                self.response.write("UPDATED")

    def get(self):
        self.response.write("CONNECTED")
        user = User.query().get()
        self.response.write("\n" + user.email)

class User(ndb.Model):
    email = ndb.StringProperty()
    token = ndb.StringProperty()
    name = ndb.StringProperty()

    @classmethod
    def create_user(cls, email, token, name):
        user = User(email=email, token=token, name=name, id=email)
        return user

    @classmethod
    def save_user(cls, user):
        user_key = user.put()
        return user_key

    @classmethod
    def get_user(cls, email):
        return User.get_by_id(User.email == email)

app = webapp2.WSGIApplication([
    ('/', MainHandler)
], debug=True)
You seem to be confusing .get_by_id() with a query.
The get_by_id method is actually mapped to ndb.Model._get_by_id, which invokes ndb.Model._get_by_id_async; that method requires an entity key identifier, used to build the entity's key for a direct lookup (not a query!). From appengine.ext.ndb.model.py:
@classmethod
@utils.positional(3)
def _get_by_id_async(cls, id, parent=None, app=None, namespace=None,
                     **ctx_options):
    """Returns an instance of Model class by ID (and app, namespace).

    This is the asynchronous version of Model._get_by_id().
    """
    key = Key(cls._get_kind(), id, parent=parent, app=app, namespace=namespace)
    return key.get_async(**ctx_options)
But in your code you're passing as id the expression User.email == message[0], which in ndb produces a query filter node rather than a key identifier, so it won't match any existing entity, hence the None result causing the error you see.
Since the info you have available is the value of an entity's property (the email value), you probably want to perform a query, something along these lines:
results = User.query(User.email == message[0]).fetch(limit=1)
if results:
    user = results[0]
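Alternatively, since create_user stores each entity with id=email, a direct key lookup by the email itself should also work (a sketch based on the question's own code):

# the email was used as the entity id at creation time
user = User.get_by_id(message[0])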
So I figured out what was wrong: there is an issue with the way the Google Cloud SDK is set up on my computer. When running the same code on the Google servers rather than on my network, everything works properly.

flask API querying items, JSON

I am using a Flask API as the REST endpoint for my Angular application, and I am currently testing the API. I tested my /users endpoint to make sure I got all the users.
# importing db, app, models, schema, etc.
from flask import jsonify, request

@app.route('/users')
def get_users():
    # fetching from database
    users_objects = User.query.all()
    # transforming into JSON-serializable objects
    users_schema = UserSchema(many=True)
    result = users_schema.dump(users_objects)
    # serializing as JSON
    return jsonify(result.data)
That worked. However, now that I am trying to get other data (a table with more than 9000 objects), it doesn't work when I query all of them. I first just grabbed the first item:
@app.route('/aggregated-measurements')
def get_aggregated_measurements():
    aggregated_measurements_objects = AggregatedMeasurement.query.first()
    # transforming into JSON-serializable objects
    aggregated_measurement_schema = AggregatedMeasurementSchema()
    result = aggregated_measurement_schema.dump(aggregated_measurements_objects)
    return jsonify(result.data)
That showed me the first AggregatedMeasurement. However, when I change the query to aggregated_measurements_objects = AggregatedMeasurement.query.all(), nothing displays, even though the same query displays everything in my Jupyter notebook. I then thought that maybe this was too much info, so I limited the query like this: aggregated_measurements_objects = AggregatedMeasurement.query.all()[:5]. That works in the notebook, but still displays nothing when I hit the route.
I don't understand why the /users endpoint shows all the users, but /aggregated-measurements returns nothing (even with the limited query). I am using flask_sqlalchemy with a SQLite db.
**Update with model and schema**
from datetime import datetime
# ... import db
import pandas as pd
from marshmallow import Schema, fields

class AggregatedMeasurement(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    created = db.Column(db.DateTime, nullable=False, default=datetime.utcnow)
    time = db.Column(db.DateTime, nullable=False)
    speed = db.Column(db.Float, nullable=False)
    direction = db.Column(db.Float, nullable=False)
    # related fields
    point_id = db.Column(db.Integer, db.ForeignKey('point.id'), nullable=False)
    point = db.relationship('Point', backref=db.backref('aggregated_measurements', lazy=True))

class AggregatedMeasurementSchema(Schema):
    id = fields.Int(dump_only=True)
    time = fields.DateTime()
    speed = fields.Number()
    direction = fields.Number()
    point_id = fields.Number()
**Second update: found the error.**
After verifying that it was indeed hitting the db (thank you @gbozee), I noticed that on the /aggregated-measurements route I built the schema for just one object: I forgot to include many=True like I did in users_schema. That is why only one point serialized and the list queries returned nothing. I am using marshmallow (an object serialization package).
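For the record, the fixed route differs from the original only in the schema construction (query.limit(5) stands in for the earlier list slice):

@app.route('/aggregated-measurements')
def get_aggregated_measurements():
    aggregated_measurements_objects = AggregatedMeasurement.query.limit(5).all()
    # many=True makes the schema serialize a list, mirroring the /users route
    aggregated_measurement_schema = AggregatedMeasurementSchema(many=True)
    result = aggregated_measurement_schema.dump(aggregated_measurements_objects)
    return jsonify(result.data)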

What is the best way to store runtime information in SQLalchemy model?

What is the best way to store runtime information in a model? And is it a good idea to store such state (like online/offline) in a model at all?
For example:
class User(Base):
    __tablename__ = 'users'
    id = Column(Integer, primary_key=True)
    username = Column(String, unique=True, nullable=False)
    fullname = Column(String, default='')
    password = Column(String, nullable=False)
    role = Column(String, nullable=False)
    status = {0: "Offline", 1: "Online", -1: "Unknown"}

    def __init__(self, **kwargs):
        Base.__init__(self, **kwargs)
        self.init_status()

    @orm.reconstructor
    def init_status(self):
        self._online = 0

    @property
    def online(self):
        if self._online is None:
            self._online = 0
        if self.enable:
            return self._online
        return -1

    @online.setter
    def online(self, value):
        if value != self.online:
            dispatcher.send(sender=self, signal="state", value=value)
        self._online = value
If I get an object from a session like
    user = session.query(User).get(1)
change its state with
    user.online = 1
then after session.close() I have a detached object.
Do I always have to expunge(user) after commit() and before close(), and then, if I want to change it again, add it to a new session and once more commit, expunge, and close?
Is there any other way?
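For reference, the standard SQLAlchemy pattern for carrying an object across sessions is expunge plus merge. A rough sketch (SessionFactory stands for your sessionmaker; note that merge copies only mapped columns, so instance-only state such as _online stays on the object you keep):

user = session.query(User).get(1)
session.expunge(user)   # detach so the instance survives session.close()
session.close()

user.online = 1         # attribute access still works on the detached object

new_session = SessionFactory()
merged = new_session.merge(user)  # reattach: copies mapped state onto a managed instance
new_session.commit()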
P.S.
What is the most common practice: to create a DAO layer, or does the session itself act as the DAO layer?
I need access to this state for the whole life of the app, but as I understand it, using one session the whole time is not a good idea.
The proper way is to open a session, do all my DB work, then close the session. But then I lose my state.
In Java I had a DAO layer and business objects that kept all my DB fields and all my state regardless of the session, but with SQLAlchemy I already have the session, DBO objects, and Manager objects. I don't want to create so many layers; I think that's not very Pythonic.
Thanks.
You should store the status in the DB as well, instead of only in memory.
Since it's not really user data, preferably use a different table, UserSession, with a user id FK.
If you do so, you can store other data as well, e.g. lastlogintime.
You can even make intelligent decisions with it: if lastlogintime is more than 30 minutes ago, change the status back to offline.
Storing such state only in memory is not a good idea.
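A minimal sketch of that suggestion in the question's declarative style (the table and column names are illustrative):

from datetime import datetime
from sqlalchemy import Column, DateTime, ForeignKey, Integer

class UserSession(Base):
    __tablename__ = 'user_sessions'
    id = Column(Integer, primary_key=True)
    user_id = Column(Integer, ForeignKey('users.id'), nullable=False)
    status = Column(Integer, default=0)  # 0: Offline, 1: Online, -1: Unknown
    lastlogintime = Column(DateTime, default=datetime.now)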
