What is the proper way to delineate modules and classes in Python?

I am new to Python, and I'm starting to learn the basics of code structure. I've got a basic app that I'm working on, up on my GitHub.
For my simple app, I'm creating a basic "Evernote-like" service which allows the user to create and edit a list of notes. In the early design, I have a Note object and a Notepad object, which is effectively a list of notes. Presently, I have the following file structure:
Notes.py
|
|------ Notepad (class)
|------ Note (class)
From my current understanding and implementation, this translates into the "Notes" module having a Notepad class and Note class, so when I do an import, I'm saying "from Notes import Notepad / from Notes import Note".
Is this the right approach? I feel, out of Java habit, that I should have a folder for Notes and the two classes as individual files.
My goal here is to understand what the best practice is.

As long as the classes are rather small, put them into one file.
You can still move them later, if necessary.
Actually, it is rather common for larger projects to have a fairly deep internal hierarchy but expose a flatter one to the user. So if you move things later but still want to offer notes.Note even though the class Note has moved deeper, you can simply import notes.path.to.module.Note into notes and the user can get it from there. You don't have to do that, but you can. So even if you change your mind later and want to keep the API stable, it's no problem.
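As a hedged illustration (the deeper module path here is made up), keeping a flat API usually comes down to a re-export in the package's __init__.py:

# notes/__init__.py  -- hypothetical package layout
# The implementation lives deeper inside the package, but users can still
# write `from notes import Note` because the name is re-exported here.
from notes.storage.models import Note, Notepad  # assumed deep module path

__all__ = ["Note", "Notepad"]

Users then keep importing from the top-level package, regardless of where the classes actually live.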

I've been working on a similar application myself. I can't say this is the best possible approach, but it has served me well. The classes are meant to interact with the database (context) when the user makes a request (an HTTP request; this is a webapp).
# -*- coding: utf-8 -*-
import json
import datetime


class Note(object):
    """A note. This class is part of the data model and is instantiated every
    time there is access to the database"""

    def __init__(self, noteid=0, note="", date=None, context=None):
        self.id = noteid
        self.note = note
        # evaluate the default per call; a datetime.now() default argument
        # would be evaluated only once, at import time
        self.date = date if date is not None else datetime.datetime.now()
        self.ctx = context  # context holds the db connection and some globals

    def get(self):
        """Get the current object from the database. This function needs the
        instance to have an id"""
        if self.id == 0:
            raise self.ctx.ApplicationError(404, "No note with id 0 exists")
        cursor = self.ctx.db.conn.cursor()
        cursor.execute("select note, date from %s.notes where id=%s" %
                       (self.ctx.db.DB_NAME, str(self.id)))
        data = cursor.fetchone()
        if not data:
            raise self.ctx.ApplicationError(404, "No note with id %s was found"
                                            % self.id)
        self.note = data[0]
        self.date = data[1]
        return self

    def insert(self, user):
        """This function inserts the object into the database. It can be an empty
        note. User must be authenticated to add notes (authentication handled
        elsewhere)"""
        cursor = self.ctx.db.conn.cursor()
        query = ("insert into %s.notes (note, owner) values ('%s', '%s')" %
                 (self.ctx.db.DB_NAME, str(self.note), str(user['id'])))
        cursor.execute(query)
        return self

    def put(self):
        """Modify the current note in the database"""
        cursor = self.ctx.db.conn.cursor()
        query = ("update %s.notes set note = '%s' where id = %s" %
                 (self.ctx.db.DB_NAME, str(self.note), str(self.id)))
        cursor.execute(query)
        return self

    def delete(self):
        """Delete the current note, by id"""
        if self.id == 0:
            raise self.ctx.ApplicationError(404, "No note with id 0 exists")
        cursor = self.ctx.db.conn.cursor()
        query = ("delete from %s.notes where id = %s" %
                 (self.ctx.db.DB_NAME, str(self.id)))
        cursor.execute(query)

    def toJson(self):
        """Returns a json string of the note object's data attributes"""
        return json.dumps(self.toDict())

    def toDict(self):
        """Returns a dict of the note object's data attributes"""
        return {
            "id": self.id,
            "note": self.note,
            "date": self.date.strftime("%Y-%m-%d %H:%M:%S")
        }


class NotesCollection(object):
    """This class handles the notes as a collection"""

    def __init__(self):
        # a per-instance list; a class-level list would be shared by every
        # instance and grow on each call to get()
        self.collection = []

    def get(self, user, context):
        """Populate the collection object and return it"""
        cursor = context.db.conn.cursor()
        cursor.execute("select id, note, date from %s.notes where owner=%s" %
                       (context.db.DB_NAME, str(user["id"])))
        note = cursor.fetchone()
        while note:
            self.collection.append(Note(note[0], note[1], note[2]))
            note = cursor.fetchone()
        return self

    def toJson(self):
        """Return a json string of the current collection"""
        return json.dumps([note.toDict() for note in self.collection])
I personally use Python as a "get it done" language and don't bother myself with details, which shows in the code above. However, one piece of advice: there are no truly private variables or methods in Python, so don't bother trying to create them. Make your life easier, code fast, get it done.
Usage example:
class NotesCollection(BaseHandler):
    @tornado.web.authenticated
    def get(self):
        """Retrieve all notes from the current user and return a json object"""
        allNotes = Note.NotesCollection().get(self.get_current_user(), settings["context"])
        json = allNotes.toJson()
        self.write(json)

    @protected
    @tornado.web.authenticated
    def post(self):
        """Handles all post requests to /notes"""
        requestType = self.get_argument("type", "POST")
        ctx = settings["context"]
        if requestType == "POST":
            Note.Note(note=self.get_argument("note", ""),
                      context=ctx).insert(self.get_current_user())
        elif requestType == "DELETE":
            Note.Note(noteid=self.get_argument("id"), context=ctx).delete()
        elif requestType == "PUT":
            Note.Note(noteid=self.get_argument("id"),
                      note=self.get_argument("note"),
                      context=ctx).put()
        else:
            raise ApplicationError(405, "Method not allowed")
By using decorators I'm getting user authentication and error handling out of the main code. This makes it clearer and easier to maintain.
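As a rough illustration of that idea only, here is a minimal sketch of what such an error-handling decorator could look like. The handle_app_errors name is made up, and ApplicationError is assumed to carry a code and a message; this is not the code from the actual app:

import functools

def handle_app_errors(method):
    """Hypothetical decorator: turns ApplicationError into an HTTP error
    response so the handler body stays free of error-handling code."""
    @functools.wraps(method)
    def wrapper(self, *args, **kwargs):
        try:
            return method(self, *args, **kwargs)
        except ApplicationError as err:  # assumed custom exception (code, message)
            self.set_status(err.code)
            self.write({"error": err.message})
    return wrapper

class NotesHandler(BaseHandler):
    @handle_app_errors
    @tornado.web.authenticated
    def get(self):
        ...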

Related

Python MongoDB dynamic query building with multiple conditions

I am building an open source project, a Python MongoDB ORM (especially for Flask) using flask_pymongo, and I am kind of stuck at building dynamic conditions.
The code below is what I have written in the corresponding files.
Model.py
from app.database import Database

class Model:
    conditions = {"and": [], "or": [], "in": []}
    operators = {
        "!=": "$ne",
        "<": "$lt",
        ">": "$gt",
        "<=": "$lte",
        ">=": "$gte",
        "in": "$in",
        "not in": "$nin",
        "and": "$and",
        "or": "$or"
    }

    def __init__(self):
        # collection property from User class
        # Database class takes collection to fire MongoDB queries
        self.db = Database(self.collection)

    def where(self, field, operator, value=None):
        if value is None:
            # to enable Model.where("first_name", "John")
            value = operator
            operator = "="
        self._handle_condition("and", field, operator, value)
        # to enable Model.where().where_or() and etc
        return self

    def where_or(self, field, operator, value=None):
        if value is None:
            # to enable Model.where("first_name", "John")
            value = operator
            operator = "="
        self._handle_condition("or", field, operator, value)
        # to enable Model.where().where_or() and etc
        return self

    def _handle_condition(self, type, field, operator, value):
        self.conditions[type].append({"field": field, "operator": operator, value: value})

    def get(self):
        filetrs = {}
        for type in self.conditions:
            filetrs[self.operators[type]] = []
            for condition in self.conditions[type]:
                if condition["operator"] == "=":
                    filter = {condition["field"]: condition["value"]}
                else:
                    filter = {condition["field"]: {self.operators[condition["operator"]]: condition["value"]}}
                filetrs[self.operators[type]].append(filter)
        return self.db.find(filters)
User.py
from app.Model import Model
class UserModel(Model):
    # MongoDB collection name
    collection = "users"

    def __init__(self):
        Model.__init__(self)

User = UserModel()
What I want to achieve is to import User.py into UserController.py and use it as in the code below: multiple conditions are added using the where and where_or Model methods, and the get method parses all the conditions and passes them as a filter to the find method.
UserController.py
from app.User import User
class UserController:
    def index(self):
        # Should return all the users where _id is not blank or their first_name is equal to John
        return User.where("_id", "!=", "").where_or("first_name", "John").get()
The problem is that this is not working as it should. It seems to work fine for any single condition, where or where_or, but when I add multiple where and where_or conditions together it does not work.
Your help is really appreciated.
PS: This question has a lot of code, but I had to include it to make the complete scenario clear. Please feel free to comment if you still need any clarification.
Eagerly looking forward to your answers.

To lock or to catch IntegrityError? [duplicate]

I want to get an object from the database if it already exists (based on provided parameters) or create it if it does not.
Django's get_or_create (or source) does this. Is there an equivalent shortcut in SQLAlchemy?
I'm currently writing it out explicitly like this:
def get_or_create_instrument(session, serial_number):
    instrument = session.query(Instrument).filter_by(serial_number=serial_number).first()
    if instrument:
        return instrument
    else:
        instrument = Instrument(serial_number)
        session.add(instrument)
        return instrument
Following the solution of @WoLpH, this is the code that worked for me (simple version):
def get_or_create(session, model, **kwargs):
    instance = session.query(model).filter_by(**kwargs).first()
    if instance:
        return instance
    else:
        instance = model(**kwargs)
        session.add(instance)
        session.commit()
        return instance
With this, I'm able to get_or_create any object of my model.
Suppose my model object is:
class Country(Base):
    __tablename__ = 'countries'
    id = Column(Integer, primary_key=True)
    name = Column(String, unique=True)
To get or create my object, I write:
myCountry = get_or_create(session, Country, name=countryName)
That's basically the way to do it; there is no shortcut readily available AFAIK.
You could generalize it, of course:
from sqlalchemy.sql import ClauseElement

def get_or_create(session, model, defaults=None, **kwargs):
    instance = session.query(model).filter_by(**kwargs).one_or_none()
    if instance:
        return instance, False
    else:
        params = {k: v for k, v in kwargs.items() if not isinstance(v, ClauseElement)}
        params.update(defaults or {})
        instance = model(**params)
        try:
            session.add(instance)
            session.commit()
        # The actual exception depends on the specific database, so we catch
        # all exceptions. This is similar to the official documentation:
        # https://docs.sqlalchemy.org/en/latest/orm/session_transaction.html
        except Exception:
            session.rollback()
            instance = session.query(model).filter_by(**kwargs).one()
            return instance, False
        else:
            return instance, True
2020 update (Python 3.9+ ONLY)
Here is a cleaner version using Python 3.9's new dict union operator (|=):
def get_or_create(session, model, defaults=None, **kwargs):
    instance = session.query(model).filter_by(**kwargs).one_or_none()
    if instance:
        return instance, False
    else:
        kwargs |= defaults or {}
        instance = model(**kwargs)
        try:
            session.add(instance)
            session.commit()
        # The actual exception depends on the specific database, so we catch
        # all exceptions. This is similar to the official documentation:
        # https://docs.sqlalchemy.org/en/latest/orm/session_transaction.html
        except Exception:
            session.rollback()
            instance = session.query(model).filter_by(**kwargs).one()
            return instance, False
        else:
            return instance, True
Note:
Similar to the Django version, this will catch duplicate key constraint violations and similar errors. If your get-or-create is not guaranteed to return a single result, it can still lead to race conditions.
To alleviate some of that issue you would need to add another one_or_none()-style fetch right after the session.commit(). This is still not a 100% guarantee against race conditions unless you also use with_for_update() or a serializable transaction isolation level.
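As a rough sketch only (not from the original answer), here is what the with_for_update() variant could look like, assuming the filter columns are backed by a unique constraint:

def get_or_create_locked(session, model, defaults=None, **kwargs):
    # SELECT ... FOR UPDATE locks the matched row against concurrent writers.
    # If no row exists yet, whether concurrent inserts are blocked depends on
    # the database's gap/predicate locking, so this is not a full guarantee.
    instance = session.query(model).filter_by(**kwargs).with_for_update().one_or_none()
    if instance:
        return instance, False
    instance = model(**dict(kwargs, **(defaults or {})))
    session.add(instance)
    session.commit()
    return instance, True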
I've been playing with this problem and have ended up with a fairly robust solution:
from sqlalchemy.exc import IntegrityError
from sqlalchemy.orm.exc import NoResultFound

def get_one_or_create(session,
                      model,
                      create_method='',
                      create_method_kwargs=None,
                      **kwargs):
    try:
        return session.query(model).filter_by(**kwargs).one(), False
    except NoResultFound:
        kwargs.update(create_method_kwargs or {})
        created = getattr(model, create_method, model)(**kwargs)
        try:
            session.add(created)
            session.flush()
            return created, True
        except IntegrityError:
            session.rollback()
            return session.query(model).filter_by(**kwargs).one(), False
I just wrote a fairly expansive blog post on all the details, but here are a few quick ideas on why I used this.
It unpacks to a tuple that tells you whether the object existed or not. This can often be useful in your workflow.
The function gives the ability to work with @classmethod-decorated creator functions (and attributes specific to them).
The solution protects against race conditions when you have more than one process connected to the datastore.
EDIT: I've changed session.commit() to session.flush() as explained in this blog post. Note that these decisions are specific to the datastore used (Postgres in this case).
EDIT 2: I've updated the function to avoid using {} as a default value, as this is a typical Python gotcha. Thanks for the comment, Nigel! If you're curious about this gotcha, check out this StackOverflow question and this blog post.
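For readers unfamiliar with the gotcha referred to in EDIT 2, a quick illustrative snippet (not part of the original answer):

def append_to(item, target=[]):  # the default list is created once, at def time
    target.append(item)
    return target

print(append_to(1))  # [1]
print(append_to(2))  # [1, 2]  -- the same list is reused across calls

def append_to_fixed(item, target=None):
    target = [] if target is None else target
    target.append(item)
    return target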
A modified version of erik's excellent answer
from sqlalchemy.exc import IntegrityError
from sqlalchemy.orm.exc import NoResultFound

def get_one_or_create(session,
                      model,
                      create_method='',
                      create_method_kwargs=None,
                      **kwargs):
    try:
        return session.query(model).filter_by(**kwargs).one(), True
    except NoResultFound:
        kwargs.update(create_method_kwargs or {})
        try:
            with session.begin_nested():
                created = getattr(model, create_method, model)(**kwargs)
                session.add(created)
            return created, False
        except IntegrityError:
            return session.query(model).filter_by(**kwargs).one(), True
Use a nested transaction to roll back only the addition of the new item instead of rolling back everything (see this answer on using nested transactions with SQLite).
Move create_method inside the with block. If the created object has relations and it is assigned members through those relations, it is automatically added to the session. For example, create a book which has user_id and user as the corresponding relationship; then doing book.user = <user object> inside create_method will add book to the session. This means that create_method must run inside the with block to benefit from an eventual rollback (see the sketch below). Note that begin_nested automatically triggers a flush.
Note that if using MySQL, the transaction isolation level must be set to READ COMMITTED rather than REPEATABLE READ for this to work. Django's get_or_create (and here) uses the same stratagem; see also the Django documentation.
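To make the create_method point concrete, here is a minimal sketch. The User/Book models and the create_with_user helper are hypothetical, not taken from the original answer:

from sqlalchemy import Column, ForeignKey, Integer, String
from sqlalchemy.orm import relationship
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class User(Base):
    __tablename__ = 'users'
    id = Column(Integer, primary_key=True)

class Book(Base):
    __tablename__ = 'books'
    id = Column(Integer, primary_key=True)
    title = Column(String, unique=True)
    user_id = Column(Integer, ForeignKey('users.id'))
    user = relationship(User, backref='books')

    @classmethod
    def create_with_user(cls, user=None, **kwargs):
        book = cls(**kwargs)
        # Because of the backref, this also appends book to user.books, and the
        # save-update cascade from the (already persistent) user pulls the
        # pending Book into the session, which is why this must run inside the
        # begin_nested() block above.
        book.user = user
        return book

# hypothetical call, using the get_one_or_create() defined above:
# book, found = get_one_or_create(session, Book,
#                                 create_method='create_with_user',
#                                 create_method_kwargs={'user': some_user},
#                                 title='Dune')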
This SQLAlchemy recipe does the job nicely and elegantly.
The first thing to do is to define a function that is given a Session to work with, and that associates a dictionary with the Session(), which keeps track of the current unique keys.
def _unique(session, cls, hashfunc, queryfunc, constructor, arg, kw):
    cache = getattr(session, '_unique_cache', None)
    if cache is None:
        session._unique_cache = cache = {}
    key = (cls, hashfunc(*arg, **kw))
    if key in cache:
        return cache[key]
    else:
        with session.no_autoflush:
            q = session.query(cls)
            q = queryfunc(q, *arg, **kw)
            obj = q.first()
            if not obj:
                obj = constructor(*arg, **kw)
                session.add(obj)
        cache[key] = obj
        return obj
An example of utilizing this function would be in a mixin:
class UniqueMixin(object):
    @classmethod
    def unique_hash(cls, *arg, **kw):
        raise NotImplementedError()

    @classmethod
    def unique_filter(cls, query, *arg, **kw):
        raise NotImplementedError()

    @classmethod
    def as_unique(cls, session, *arg, **kw):
        return _unique(
            session,
            cls,
            cls.unique_hash,
            cls.unique_filter,
            cls,
            arg, kw
        )
And finally creating the unique get_or_create model:
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import sessionmaker
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()
engine = create_engine('sqlite://', echo=True)
Session = sessionmaker(bind=engine)

class Widget(UniqueMixin, Base):
    __tablename__ = 'widget'
    id = Column(Integer, primary_key=True)
    name = Column(String, unique=True, nullable=False)

    @classmethod
    def unique_hash(cls, name):
        return name

    @classmethod
    def unique_filter(cls, query, name):
        return query.filter(Widget.name == name)

Base.metadata.create_all(engine)

session = Session()

w1, w2, w3 = Widget.as_unique(session, name='w1'), \
             Widget.as_unique(session, name='w2'), \
             Widget.as_unique(session, name='w3')
w1b = Widget.as_unique(session, name='w1')

assert w1 is w1b
assert w2 is not w3
assert w2 is not w1

session.commit()
The recipe goes deeper into the idea and provides different approaches but I've used this one with great success.
The closest semantically is probably:
def get_or_create(model, **kwargs):
    """SqlAlchemy implementation of Django's get_or_create."""
    session = Session()
    instance = session.query(model).filter_by(**kwargs).first()
    if instance:
        return instance, False
    else:
        instance = model(**kwargs)
        session.add(instance)
        session.commit()
        return instance, True
Not sure how kosher it is to rely on a globally defined Session in SQLAlchemy, but the Django version doesn't take a connection, so...
The tuple returned contains the instance and a boolean indicating if the instance was created (i.e. it's False if we read the instance from the db).
Django's get_or_create is often used to make sure that global data is available, so I'm committing at the earliest point possible.
I slightly simplified @Kevin's solution to avoid wrapping the whole function in an if/else statement. This way there's only one return, which I find cleaner:
def get_or_create(session, model, **kwargs):
    instance = session.query(model).filter_by(**kwargs).first()
    if not instance:
        instance = model(**kwargs)
        session.add(instance)
    return instance
There is a Python package that has @erik's solution as well as a version of update_or_create(): https://github.com/enricobarzetti/sqlalchemy_get_or_create
Depending on the isolation level you have adopted, none of the above solutions may work.
The best solution I have found is raw SQL in the following form:
INSERT INTO table(f1, f2, unique_f3)
SELECT 'v1', 'v2', 'v3'
WHERE NOT EXISTS (SELECT 1 FROM table WHERE f3 = 'v3')
This is transactionally safe whatever the isolation level and the degree of parallelism are.
Beware: in order to make it efficient, it would be wise to have an INDEX for the unique column.
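If you want to issue that statement through SQLAlchemy rather than a raw driver, here is a hedged sketch using text() with bound parameters. The widgets table and name column are placeholders, not from the answer above:

from sqlalchemy import text

def insert_if_absent(session, name):
    # One round trip; the NOT EXISTS guard (ideally backed by a unique index
    # on widgets.name) performs the "only if it doesn't already exist" check,
    # which the answer above argues holds regardless of isolation level.
    session.execute(
        text(
            "INSERT INTO widgets (name) "
            "SELECT :name "
            "WHERE NOT EXISTS (SELECT 1 FROM widgets WHERE name = :name)"
        ),
        {"name": name},
    )
    session.commit()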
One problem I regularly encounter: when a field has a max length (say, String(40)) and you'd like to perform a get-or-create with a longer string, the above solutions will fail.
Building off of the above solutions, here's my approach:
from sqlalchemy import Column, String

def get_or_create(self, add=True, flush=True, commit=False, **kwargs):
    """
    Get an entity based on the kwargs or create an entity with those kwargs.

    Params:
        add: (default True) should the instance be added to the session?
        flush: (default True) flush the instance to the session?
        commit: (default False) commit the session?
        kwargs: key, value pairs of parameters to lookup/create.

    Ex: SocialPlatform.get_or_create(**{'name': 'facebook'})
        returns --> existing record or, will create a new record
    ---------
    NOTE: I like to add this as a classmethod in the base class of my tables,
    so that all data models inherit the base class --> functionality is
    transmitted across all ORM-defined models.
    """
    # Truncate values if necessary
    for key, value in kwargs.items():
        # Only use strings
        if not isinstance(value, str):
            continue
        # Only use if it's a column
        my_col = getattr(self.__table__.columns, key)
        if not isinstance(my_col, Column):
            continue
        # Skip non-strings again here
        if not isinstance(my_col.type, String):
            continue
        # Get the max length
        max_len = my_col.type.length
        if value and max_len and len(value) > max_len:
            # Update the value
            value = value[:max_len]
            kwargs[key] = value
    # -------------------------------------------------
    # Make the query...
    instance = session.query(self).filter_by(**kwargs).first()
    if instance:
        return instance
    else:
        # Max length isn't accounted for here.
        # The assumption is that auto-truncation will happen on the child model
        # or directly in the db
        instance = self(**kwargs)
        # You'll usually want to add to the session
        if add:
            session.add(instance)
        # Navigate these with caution
        if add and commit:
            try:
                session.commit()
            except IntegrityError:
                session.rollback()
        elif add and flush:
            session.flush()
    return instance

What is the most pythonic way/trick to support two database backends and keep my code DRY?

I have the following example code which uses either MongoEngine or Peewee as the DB backend.
import mongoengine, peewee
from mongomodels import *
from mysqlmodels import *

class Parser(object):
    def __init__(self, line, dbBackend):
        if dbBackend in ["MongoDB", "MySQL"]:
            self.line = line
            self.DB = dbBackend
            user = self.createUser()
            car = self.createCar(user)
            parking = self.createParking(car)
        else:
            raise Exception()

    def createUser(self):
        if self.DB == "MongoDB":
            newUserID = self._createMongoUser(self.line['firstname'], self.line['lastname'], '...')
        else:
            newUserID = self._createMySQLUser(self.line['firstname'], self.line['lastname'], '...')
        return newUserID

    def _createMongoUser(self, firstname, lastname, '...'):
        try:
            _user = MongoUserModel.objects.get(firstname=firstname, lastname=lastname)
        except mongoengine.errors.DoesNotExist as e:
            user = MongoUserModel(firstname=firstname, password)
            _user = user.save()
        finally:
            return _user

    def _createMySQLUser(self, firstname, lastname, '...'):
        try:
            _user = MySQLUserModel.get(MySQLUserModel.fistname == firstname, MySQLUserModel.lastname == lastname)
        except Exception as e:
            user = MySQLUserModel(fistname=fistname, lastname=lastname)
            _user = user.save()
        finally:
            return _user

    def createCar(self, user):
        pass

    def createParking(self, car):
        pass
Is there any good practice / trick / module to keep my code DRY and to avoid redefining two methods to create my Models?
Should I create a new abstraction class 'UserModel', as PDO does in PHP?
This is something I went through recently - I swapped from a mongo backend to postgres. When I set up the original project I had some models and a DataLayer. The datalayer (dl) had quite a simple interface that I used throughout my app.
# note: this is half python / half pseudocode

class Model1(object):
    __collection__ = 'model1'
    __tablename__ = 'model1'
    # field definitions etc

class MongoDataLayer(object):
    def __init__(self, mongo_db_connection):
        self.conn = mongo_db_connection

    def load(self, model, conditions):
        raw = self.conn[model.__collection__].find(...)
        return model(**raw)

    def persist(self, obj):
        self.conn[obj.__collection__].save(obj.as_dict())

class SQLDataLayer(object):
    def __init__(self, sa_session_factory):
        self.Session = sa_session_factory
        self.session = self.Session()

    def load(self, model, conditions):
        return self.session.query(model).find_by(conditions).one()  # ...etc

    def persist(self, obj):
        # SQL flavour of persist: hand the object to the SQLAlchemy session
        self.session.add(obj)
        self.session.commit()

# connections - mongo and postgres (I use SQLAlchemy)
dl_mongo = MongoDataLayer(db...)
dl_sql = SQLDataLayer(Session...)

# using them - you don't care which one you have
m = dl_mongo.load(models.Model1)
dl_mongo.persist(m)

m = dl_sql.load(models.Model1)
dl_sql.persist(m)
In my app I load up the dl in the initial load and then inject it into the app whenever data access needs to happen. The app itself then knows about models but not the details of how to load / save them.
Maybe not the best way to do it, but it has worked well for me. I'd be interested to hear how other people deal with it.

PyQT: Using attributes from SQLite query

I am confused as to how I can use certain attributes that are returned by a query to a local SQLite database. I can populate a QListWidget with one of the attributes, but I do not know how to get the other attributes when a user clicks on the list widget item.
The following code was created using Eric, which pre-populates some of the signals and slots:
#pyqtSignature("QString")
def on_searchText_textEdited(self, p0):
"""
Slot documentation goes here.
"""
# TODO: not implemented yet
self.resultsList.clear()
self.searchItem = self.searchText.text()
self.search()
#pyqtSignature("QListWidgetItem*")
def on_resultsList_itemClicked(self, item):
"""
Slot documentation goes here.
"""
# TODO: not implemented yet
result = str(item.text())
QMessageBox.about(self, "Clicked Item", "%s")%(result)
#pyqtSignature("")
def on_cancelButton_clicked(self):
"""
Slot documentation goes here.
"""
self.close()
def search(self):
conn = sqlite3.connect("C:\\file.sqlite")
cur = conn.cursor()
sqlqry = "SELECT name, number, size FROM lookup WHERE name LIKE '%s' LIMIT 100;"%("%"+self.searchItem+"%")
try:
c = cur.execute(sqlqry)
data = c.fetchall()
for i in data:
self.resultsList.addItem(i[0])
except sqlite3.Error, e:
QMessageBox.about(self, "Error message", "Error")
So my resultsList gets populated when the user enters text into the line edit, but when the user clicks on an item I get an error from the message box saying something about NoneType and str.
However, what I really need are the second and third attributes, for use somewhere else in my code.
So how do I get those attributes through the itemClicked signal and create two new variables?
I hope that makes sense; it has been a long day going round in circles.
You just need to query from the database again and work with the new row.
#pyqtSignature("QListWidgetItem*")
def on_resultsList_itemClicked(self, item):
"""
Slot documentation goes here.
"""
result = str(item.text())
QMessageBox.about(self, "Clicked Item", "%s")%(result)
conn = sqlite3.connect("C:\\file.sqlite")
cur = conn.cursor()
sqlqry = "SELECT name, number, size FROM lookup WHERE name = '%s' LIMIT 1;"%(result)
try:
c = cur.execute(sqlqry)
data = c.fetchone()
# Do something with data
except sqlite3.Error, e:
QMessageBox.about(self, "Error fetching %s"%name, "Error")
Obviously, this doesn't deal with the input sanitisation issues you might have, and it assumes that name is unique in the database.
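As a hedged variant addressing that sanitisation caveat (not part of the original answer), the same lookup can be done with a parameterized query so the clicked text is never interpolated into the SQL string. The helper name here is made up:

def fetch_row_for(self, result):
    # hypothetical helper: same query as above, but with a bound parameter
    conn = sqlite3.connect("C:\\file.sqlite")
    cur = conn.cursor()
    try:
        cur.execute("SELECT name, number, size FROM lookup WHERE name = ? LIMIT 1", (result,))
        return cur.fetchone()  # (name, number, size) or None
    finally:
        conn.close()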

Fastest Way to Create a New Object Only if it Doesn't Already Exist (SQLAlchemy) [duplicate]
