Provide an enum to model at init time for attribute conversion - python

Let's say I have a generic Food sqlalchemy model that I want to re-use in several different apps:
class Food(Base):
    _type = Column(Integer, index=True, unique=False, nullable=False)
The _type attribute here is an integer. It could technically be an Enum, but when I write my generic model, I don't have access to the enum values (they are defined later in the apps). I tried the Mixin approach (see my previous question: provide Enum at DB creation time), but my real use case is slightly more complex than the Food example. There are multiple relationships defined on this model, including one that points back to the Food model. This forces me, among other issues, to declare several relationships at the app level, and I really don't want to do that.
Instead, I would like to do something like this:
class FoodType(Enum):
    pass

class Food(Base):
    _type = Column(Integer, index=True, unique=False, nullable=False)

    @hybrid_property
    def type(self) -> FoodType:
        return FoodType(self._type)

    @type.setter
    def type(self, food_type):
        self._type = food_type.value
and I wanted to somehow "fill" the FoodType enum later at the app level, but it doesn't seem to be possible. I tried to override/extend/subclass FoodType but my attempts were unsuccessful.
Do you have any suggestion?

OK, the only way I found is to "monkey patch" the model by assigning a subclassed (and extended) enum to the class:
class FoodType(Enum):
    pass

class Food(Base):
    food_types: FoodType

    _type = Column(Integer, index=True, unique=False, nullable=False)

    @hybrid_property
    def type(self) -> FoodType:
        return self.food_types(self._type)

    @type.expression
    def type(cls):
        return cls._type

    @type.setter
    def type(self, food_type):
        self._type = food_type.value
and then in my apps, I can subclass FoodType, add the enum values, and before the call to create_all I just need to do:
Food.food_types = MySubClassedEnum
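For illustration, a minimal sketch of the app-level wiring under this approach; the MyFoodType members, the engine, and the create_all call are assumptions, not part of the original setup:

class MyFoodType(FoodType):   # subclassing works because FoodType defines no members
    FRUIT = 1
    VEGETABLE = 2

Food.food_types = MyFoodType           # monkey patch the generic model
Base.metadata.create_all(engine)       # assumed engine; create tables afterwards

apple = Food()
apple.type = MyFoodType.FRUIT          # setter stores 1 in the _type column
assert apple.type is MyFoodType.FRUIT  # getter converts back via food_types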


SqlAlchemy attribute that tracks an assigned attribute

Goal: Create an SQLAlchemy attribute which tracks/follows changes in another object's SQLAlchemy attribute.
Given:
class ClazzA():
    attributeA = Column(JSONDict)

class ClazzB():
    attributeB = Column(?)

objectA = ClazzA()
objectA.attributeA = {'foo': 1}

objectB = ClazzB()
objectB.attributeB = objectA.attributeA

objectA.attributeA['foo'] = 2
JSONDict is associated with MutableDict as described here: http://docs.sqlalchemy.org/en/latest/orm/extensions/mutable.html#module-sqlalchemy.ext.mutable , i.e. the JSONDict type allows for mutation tracking.
So we have this dictionary on objectA whose changes are being recorded by SQLAlchemy. I would like for attributeB to track attributeA such that even if the application is restarted (i.e. the attributes are reloaded from the DB), then attributeB will continue to reflect changes made to attributeA's dictionary.
Of course, this is closely related to the fact that Python doesn't have an idea of pointers. I was wondering if SQLAlchemy has a solution for this particular problem.
TL;DR
You want a one-to-many relationship.
from sqlalchemy import ForeignKey, Integer, Column
from sqlalchemy.orm import relationship

class Widget(Base):
    __tablename__ = 'widget'

    widget_id = Column(Integer, primary_key=True)
    # name columns, type columns, ...
    json = Column(JSONDict)

class ClazzB(Base):
    __tablename__ = 'clazzb'

    clazzb_id = Column(Integer, primary_key=True)
    # Your "attributeB"
    widget_id = Column(Integer,
                       ForeignKey('widget.widget_id',
                                  onupdate='cascade',
                                  ondelete='cascade'),
                       nullable=False)
    widget = relationship('Widget')
    # possible association_proxy
    #widget_json = association_proxy('widget', 'json')
Using a Relationship
Define a relationship between the models ClazzA and ClazzB. Since we don't have the whole picture, the definitions below are just examples.
from sqlalchemy import ForeignKey
from sqlalchemy.orm import relationship

class ClazzA(Base):  # replace Base with the base class of your models
    __tablename__ = 'clazza'  # replace with the real tablename

    # T is the type of your primary key, the column name is just an example
    clazza_id = Column(T, primary_key=True)

class ClazzB(Base):
    # The column that will relate this model to ClazzA
    clazza_id = Column(T, ForeignKey('clazza.clazza_id',
                                     onupdate='cascade',
                                     ondelete='cascade'),
                       nullable=False)

    # A handy accessor for the relationship between the mapped classes,
    # not strictly required. Configurable to be either very lazy
    # (loaded if accessed, by issuing a SELECT) or eager (JOINed
    # when loading objectB, for example)
    objectA = relationship('ClazzA')
Now, instead of adding a reference to attributeA of ClazzA on ClazzB, add a reference to the related objectA to objectB on initialization.
objectB = ClazzB(..., objectA=objectA)
The two are now related, and to access attributeA of the related objectA through objectB, do
objectB.objectA.attributeA
No need to track changes to attributeA, since it is the attributeA of the instance.
Now if you must have an attribute attributeB on ClazzB (to avoid refactoring existing code or some such), you could add a property
class ClazzB:
    @property
    def attributeB(self):
        return self.objectA.attributeA
which will return the attributeA of the related objectA with
objectB.attributeB
objectB.attributeB['something'] = 'else'
and so on.
There is also an SQLAlchemy mechanism for accessing attributes across relationships: the association proxy. It supports simple querying, but it is not, for example, subscriptable.
from sqlalchemy.ext.associationproxy import association_proxy

class ClazzB(Base):
    attributeB = association_proxy('objectA', 'attributeA')
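As a hedged illustration of the "simple querying" part (the session and the comparison value are assumptions, and whether equality on a JSONDict column translates to SQL depends on the underlying type):

# Finds ClazzB rows whose related ClazzA.attributeA equals the given dict;
# note that the proxy itself cannot be subscripted like a dict here.
session.query(ClazzB).filter(ClazzB.attributeB == {'foo': 2}).all()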
If you wish for ClazzB.attributeB to access values from the JSONDict under a certain key, you can, for example, use something like this
class ClazzB(Base):
    key = Column(Unicode)

    @property
    def attributeB(self):
        return self.objectA.attributeA[self.key]
You can also make attributeB work as an SQL expression on class level using hybrid properties, if you need such a thing. You would have to write your class level expressions yourself though.
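A hedged sketch of what that could look like, assuming SQLAlchemy 1.4+ select() syntax and the ClazzA/ClazzB columns defined above:

from sqlalchemy import select
from sqlalchemy.ext.hybrid import hybrid_property

class ClazzB(Base):
    # ... clazza_id column and objectA relationship as defined earlier ...

    @hybrid_property
    def attributeB(self):
        # Instance level: read through the relationship, as before
        return self.objectA.attributeA

    @attributeB.expression
    def attributeB(cls):
        # Class level: a correlated scalar subquery against the related row
        return (
            select(ClazzA.attributeA)
            .where(ClazzA.clazza_id == cls.clazza_id)
            .scalar_subquery()
        )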

Same Polymorphic Identity for Different Models

I have one model which is a polymorphic identity of an abstract class:
class AbstractModel(Base):
    type = Column(String())
    __mapper_args__ = {"polymorphic_on": type}

class ModelA(AbstractModel):
    __mapper_args__ = {"polymorphic_identity": "model_a"}

class FlaskModel(ModelA):
    __mapper_args__ = {"polymorphic_identity": "model_a"}
I need FlaskModel to have the same polymorphic identity as ModelA, because FlaskModel has Flask-specific constraints that can't exist within ModelA (request context, user permissions, etc.).
However, when creating the second class, SQLAlchemy throws a warning about the duplicate identity, and for good reason: any query then always resolves to FlaskModel, even when issued against ModelA.
Any suggestions on accomplishing this? Splitting the code into a package and then importing isn't an option.
I ended up going with this route, which worked well and allowed me to create Flask-specific functions.
class AbstractModel(Base):
    type = Column(String())
    __mapper_args__ = {"polymorphic_on": type}

class ModelA(AbstractModel):
    __mapper_args__ = {"polymorphic_identity": "model_a"}

class FlaskModel(ModelA):
    pass

## Event listeners are required for the inherited polymorphic relationships.
## If additional work needs to be done on other aspects, including after_delete
## and after_update, they can also be added here.
@event.listens_for(FlaskModel, 'init')
def receive_label_init(target, args, kwargs):
    kwargs["type"] = "model_a"

@event.listens_for(FlaskModel, 'mapper_configured')
def receive_label_mapper_configured(mapper, class_):
    mapper.polymorphic_identity = "model_a"
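A small, hedged sanity check of the 'init' listener, assuming the default declarative constructor (which applies keyword arguments as attributes):

obj = FlaskModel()
# the listener injected type="model_a" into the constructor kwargs
assert obj.type == "model_a"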

SQLAlchemy - create an instance in another instances __init__

Newish to SQLAlchemy (so my terminology may be a bit off). I want to create a database object inside the constructor of another, but the problem is I can't add said object to the session, so I get an error.
My schema looks a bit like the following:
class Tag(Base):
    __tablename__ = 'tag'

    id = Column(Integer, Sequence('tag_id_seq'), primary_key=True, nullable=False)
    type = Column(String(1), nullable=False)
    name = Column(String(255), unique=True, nullable=False)

    def __init__(self, type, name):
        self.type = type
        self.name = name

    def __repr__(self):
        return "<Tag('%s')>" % (self.id)

class Venue:
    __tablename__ = 'venue'

    tag_id = Column(Integer)
    tag_type = Column(String(1), nullable=False)
    location = Column(String(255), nullable=False)

    tag = ForeignKeyConstraint(
        (tag_id, tag_type),
        (Tag.id, Tag.type),
        onupdate='RESTRICT',
        ondelete='RESTRICT',
    )

    def __init__(self, name, location):
        self.tag = Tag('V', name)
        self.location = location
When I do the following:
session.add(Venue("Santa's Cafe","South Pole"))
I get an error:
UnmappedInstanceError: Class '__builtin__.instance' is not mapped
I assume this is because the Tag object created in Venue's constructor is not added to the session. My question is how/when do I do this. (I'd really prefer to create that Tag object in the constructor if possible). I think I could do this with a scoped_session but that seems like a really bad solution.
Thanks
Inherit Venue from Base. Otherwise Venue won't be mapped.
Move the ForeignKeyConstraint into __table_args__.
Replace the currently meaningless tag attribute with a relationship to Tag. The default value of the cascade parameter of relationship() includes the 'save-update' rule, which adds a newly referred object to the same session as its parent.
From the documentation:
save-update - cascade the Session.add() operation. This cascade
applies both to future and past calls to add(), meaning new items
added to a collection or scalar relationship get placed into the same
session as that of the parent, and also applies to items which have
been removed from this relationship but are still part of unflushed
history.
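Putting those three points together, a hedged sketch of what the corrected model could look like (the surrogate id primary key is an assumption, since the original Venue shows no primary key):

from sqlalchemy import Column, ForeignKeyConstraint, Integer, String
from sqlalchemy.orm import relationship

class Venue(Base):                        # inherit from Base so Venue is mapped
    __tablename__ = 'venue'

    id = Column(Integer, primary_key=True)        # assumed surrogate key
    tag_id = Column(Integer)
    tag_type = Column(String(1), nullable=False)
    location = Column(String(255), nullable=False)

    # the composite foreign key moves into __table_args__
    __table_args__ = (
        ForeignKeyConstraint(
            [tag_id, tag_type],
            [Tag.id, Tag.type],
            onupdate='RESTRICT',
            ondelete='RESTRICT',
        ),
    )

    # a real relationship; its default 'save-update' cascade puts the new
    # Tag into the same session as the Venue
    tag = relationship(Tag)

    def __init__(self, name, location):
        self.tag = Tag('V', name)
        self.location = location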

SQLAlchemy: Multiple Inheritance with dynamic 'association_proxy' creator function

I am currently trying to create the following database schema with SQLAlchemy (using ext.declarative):
I have a base class MyBaseClass which provides some common functionality for all of my publicly accessible classes, a mixin class MetadataMixin that provides functionality to query metadata from imdb and store it.
Every class that subclasses MetadataMixin has a field persons which provides a M:N relationship to instances of the Person class, and a field persons_roles which provides a 1:N relationship to an object (one for each subclass) which stores the role a concrete Person plays in the instance of the subclass.
This is an abbreviated version of what my code looks like at the moment:
from sqlalchemy import Column, Integer, Enum, ForeignKey, Unicode
from sqlalchemy.orm import relationship
from sqlalchemy.ext.associationproxy import association_proxy
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class MyBaseClass(object):
    """Base class for all publicly accessible classes"""
    id = Column(Integer, primary_key=True)

class Person(MyBaseClass):
    """A Person"""
    name = Column(Unicode)
    movies = association_proxy('movie_roles', 'movie',
                               creator=lambda m: _PersonMovieRole(movie=m))
    shows = association_proxy('show_roles', 'show',
                              creator=lambda s: _PersonShowRole(show=s))

class _PersonMovieRole(Base):
    """Role for a Person in a Movie"""
    __tablename__ = 'persons_movies'

    id = Column(Integer, primary_key=True)
    role = Column(Enum('none', 'actor', 'writer', 'director', 'producer'),
                  default='none')
    person_id = Column(Integer, ForeignKey('persons.id'))
    person = relationship('Person', backref='movie_roles')
    movie_id = Column(Integer, ForeignKey('movies.id'))
    movie = relationship('Movie', backref='persons_roles')

class _PersonShowRole(Base):
    """Role for a Person in a Show"""
    __tablename__ = 'persons_shows'

    id = Column(Integer, primary_key=True)
    role = Column(Enum('none', 'actor', 'writer', 'director', 'producer'),
                  default='none')
    person_id = Column(Integer, ForeignKey('persons.id'))
    person = relationship('Person', backref='show_roles')
    show_id = Column(Integer, ForeignKey('shows.id'))
    show = relationship('Episode', backref='persons_roles')

class MetadataMixin(object):
    """Mixin class that provides metadata-fields and methods"""
    # ...
    persons = association_proxy('persons_roles', 'person',
                                creator= #...???...#)

class Movie(Base, MyBaseClass, MetadataMixin):
    #....
    pass
What I'm trying to do is to create a generic creator function for association_proxy that creates either a PersonMovieRole or a PersonShowRole object, depending on the class of the concrete instance that a Person is added to. What I'm stuck on at the moment is that I don't know how to pass the calling class to the creator function.
Is this possible, or is there maybe even an easier way for what I'm trying to accomplish?
By the time your persons field is defined, you cannot really know what class it will end up in. Python builds classes out of already-prepared dictionaries of class members (via type.__new__), and by the time that happens, those members are already fully defined.
So you need to provide the required information directly to the mixin, and tolerate the small duplication it will create in your code. I'd opt for an interface similar to this one:
class Movie(Base, MyBaseClass, MetadataMixin('Movie')):
    pass
(You cannot have MetadataMixin(Movie) either, for the exact same reasons: Movie requires its base classes to be completely defined by the time the class is created).
To implement such a "parametrized class", simply use a function:
def MetadataMixin(cls_name):
    """Mixin class that provides metadata-fields and methods"""
    person_role_cls_name = 'Person%sRole' % cls_name
    person_role_cls = Base._decl_class_registry[person_role_cls_name]

    class Mixin(object):
        # ...
        persons = association_proxy('persons_roles', 'person',
                                    creator=person_role_cls)
    return Mixin
This works because what we're looking up in Base._decl_class_registry - the registry of all classes descending from your declarative base - is not the final class (e.g. Movie), but the association object (e.g. PersonMovieRole).

Rails model inheritance

It will be simplest to explain with a code example; in Python I can achieve model inheritance like this:
"""Image model"""
from sqlalchemy import Column, ForeignKey
from sqlalchemy.types import Integer, String, Text
from miasto_3d.model.meta import Base
class Image(Base):
__tablename__ = "image"
image_id = Column(Integer, primary_key=True)
path = Column(String(200))
def get_mime(self):
#function to get mime type from file
pass
"""WorkImage model"""
class WorkImage(Image, Base):
__tablename__ = "work_images"
image_id = Column(Integer, ForeignKey("image.image_id"), primary_key=True)
work_id = Column(Integer, ForeignKey("work.id"))
work = relation("Work", backref=backref('images',order_by='WorkImage.work_id'))
"""UserAvatar model"""
class UserAvatar(Image, Base):
__tablename__ = "user_avatars"
image_id = Column(Integer, ForeignKey("image.image_id"), primary_key=True)
user_id = Column(Integer, ForeignKey("user.id"))
user = relation("User", backref=backref('images',order_by='UserAvatar.user_id'))
How do I do similar things in Rails? Or maybe there is another, better way to do it?
I know Paperclip, but I don't like its approach of using a shared table to store photo and model data.
It looks like you're wanting either a polymorphic association or perhaps single table inheritance.
Since you don't define database fields in the model, you cannot inherit database schema in this way - all your fields will need to be specified per table in a migration. You probably should use paperclip, if only because reinventing the wheel is a pain. It works really well, and abstracts away from the actual database structure for you.
In Rails, rather than model inheritance, shared functionality tends to be implemented in modules, like so:
http://handyrailstips.com/tips/14-drying-up-your-ruby-code-with-modules
