I am using SQLAlchemy in Python, declaring my classes by inheriting from a declarative base as follows:
from sqlalchemy import Column, Integer, String
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class SomeClass(Base):
    __tablename__ = 'some_table'
    id = Column(Integer, primary_key=True)
    name = Column(String(50))
As a user, I would like to define the __tablename__ as a parameter rather than a hard-coded value, something like this:
class SomeClass(Base):
    __tablename__ = f'{environment}_some_table'
    id = Column(Integer, primary_key=True)
    name = Column(String(50))
It is my understanding that f'{environment}_some_table' will be evaluated when I import this package, and therefore I won't be able to modify it at a later stage (i.e. in an interactive notebook). I have a broken piece of code that tries to solve this through nested classes and encapsulation, but I cannot manage to reference the instance variable of the outer class.
class Outer:
    def __init__(self, env):
        self.environment = env

    class SomeClass(Base):
        __tablename__ = f'{self.environment}_some_table'
        id = Column(Integer, primary_key=True)
        name = Column(String(50))
I have read in a couple of SO questions that it is better not to use nested classes since no special relationship is established between these classes.
So what kind of design should I use to solve this problem?
Thanks in advance!
You can put all your model definitions inside a function scope so that they depend on the function's arguments:
def create_models(environment):
    class SomeClass(Base):
        __tablename__ = f'{environment}_some_table'
        id = Column(Integer, primary_key=True)
        name = Column(String(50))
    ...
    globals().update(locals())  # update the outer (module) scope if needed

...  # some time later
create_models('cheese')
...  # now create the db session/engine/etc ...
Another option to look at is the built-in reload mechanism (importlib.reload in Python 3). Check it out :)
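For reference, a minimal sketch of the reload approach, assuming a hypothetical models module that builds its table names from an environment variable (none of these names are from the original code):

import importlib
import os

import models  # hypothetical module defining Base and SomeClass

# Suppose models.py builds each __tablename__ from os.environ['APP_ENV'].
# Changing the variable and reloading re-executes the class bodies, so the
# f-string table names are evaluated again with the new value.
os.environ['APP_ENV'] = 'cheese'
importlib.reload(models)

SomeClass = models.SomeClass  # rebind to the freshly created class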
We use SQLAlchemy to read/write data, but not to create tables (that is done by DBAs). Because of this, some incorrect definitions have not yet been caught (they still work for reads/writes).
Is there a way to override them for testing purposes (create tables on the fly, etc.) without touching the original definition? A simple class override doesn't seem to work, and I don't see any other solution to this problem:
from sqlalchemy import Column, Integer, Numeric, String
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class Product(Base):
    __tablename__ = "product"

    idn = Column(Numeric, primary_key=True)   # should be Integer
    code = Column(String, primary_key=True)   # should be unique (not primary)

class Product(Product):
    __tablename__ = "product"
    __table_args__ = {"extend_existing": True}  # want to override, not extend

    idn = Column(Integer, primary_key=True)
    code = Column(String, unique=True)
Is there a way to be more explicit in SqlAlchemy ORM definitions by using class/member references instead of string constants without running into cyclic dependencies? One of the main benefits of an ORM is keeping things 'cleaner' and more maintainable than having string constants copied all over the place. This totally undermines that benefit.
A simple example from SQLAlchemy's docs, showing the use of string constants:
class Parent(Base):
    __tablename__ = 'parent'
    id = Column(Integer, primary_key=True)
    children = relationship("Child")

class Child(Base):
    __tablename__ = 'child'
    id = Column(Integer, primary_key=True)
    parent_id = Column(Integer, ForeignKey('parent.id'))
I want to do this:
class Parent(Base):
    __tablename__ = 'parent'
    id = Column(Integer, primary_key=True)
    children = relationship(Child)

class Child(Base):
    __tablename__ = 'child'
    id = Column(Integer, primary_key=True)
    parent_id = Column(Integer, ForeignKey(Parent.id))
This is generally legal, but the problem is that I run into cyclic dependencies with the necessary importing of Parent from Child and of Child from Parent (assuming they are in separate files). The best I can do is split the difference: use strings on one end and the explicit class (with an import) on the other. Just feels icky.
Just wondering if I'm missing something or someone has some ways of accomplishing this.
As an alternative to string-based attributes, attributes may also be defined after all classes have been created. Just add them to the target class after the fact:
class Parent(Base):
    __tablename__ = 'parent'
    id = Column(Integer, primary_key=True)

class Child(Base):
    __tablename__ = 'child'
    id = Column(Integer, primary_key=True)
    parent_id = Column(Integer, ForeignKey(Parent.id))

Parent.children = relationship(Child)
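To show that the late-bound attribute behaves like any other mapped attribute, here is a minimal usage sketch building on the snippet above; the in-memory SQLite engine and session setup are assumptions, not part of the original answer:

from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

engine = create_engine('sqlite://')  # throwaway in-memory database for the demo
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()

# The attribute added after class creation works in the constructor and in queries.
parent = Parent(children=[Child(), Child()])
session.add(parent)
session.commit()

assert len(session.query(Parent).one().children) == 2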
Let's say we have classes X and Y and relationships between them, x2y and y2x.
From the class_mapper(Class).iterate_properties iterator we can get all of a class's properties.
So x2y and y2x are RelationshipProperty instances, and what I hope to get from them is the class (or class name) of the objects on the remote side of the relationship.
I've already tried to put together a solution.
I've found x2y.remote_side[0].table.name and built a tables_map which maps a table name to a class; it works fine for one-to-many and one-to-one. If I use it for many-to-many, though, the table name is that of the association table.
Any hints on how I can get the remote-side class?
X.x2y.property.mapper.class_
RelationshipProperty will eventually get class-level attribute documentation, the same as Mapper has now.
Edit: here is a test which illustrates the above returning Y from X. And no, reflection doesn't create relationships, so it should have no effect:
from sqlalchemy import Column, Integer, ForeignKey
from sqlalchemy.orm import relationship
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class X(Base):
    __tablename__ = 'x'
    id = Column(Integer, primary_key=True)
    x2y = relationship("Y")

class Y(Base):
    __tablename__ = 'y'
    id = Column(Integer, primary_key=True)
    x_id = Column(Integer, ForeignKey("x.id"))

assert X.x2y.property.mapper.class_ is Y
I've found that calling the argument() method on the RelationshipProperty returns the remote class.
from sqlalchemy.orm import RelationshipProperty, class_mapper

for prop in class_mapper(X).iterate_properties:
    if isinstance(prop, RelationshipProperty):
        relation = prop

relation.argument()
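For completeness, a small sketch building on the X/Y example above (my addition, not part of either answer): prop.mapper.class_ resolves the mapped class on the remote side, which also covers many-to-many, because it looks at the target mapper rather than the association table.

from sqlalchemy.orm import RelationshipProperty, class_mapper

def remote_classes(cls):
    """Map each relationship name on cls to the mapped class on its remote side."""
    return {
        prop.key: prop.mapper.class_
        for prop in class_mapper(cls).iterate_properties
        if isinstance(prop, RelationshipProperty)
    }

assert remote_classes(X) == {'x2y': Y}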
I am currently trying to create the following database schema with SQLAlchemy (using ext.declarative):
I have a base class MyBaseClass which provides some common functionality for all of my publicly accessible classes, and a mixin class MetadataMixin that provides functionality to query metadata from IMDb and store it.
Every class that subclasses MetadataMixin has a field persons which provides a M:N relationship to instances of the Person class, and a field persons_roles which provides a 1:N relationship to an object (one for each subclass) which stores the role a concrete Person plays in the instance of the subclass.
This is an abbreviated version of what my code looks like at the moment:
from sqlalchemy import Column, Integer, Enum, ForeignKey, Unicode
from sqlalchemy.orm import relationship
from sqlalchemy.ext.associationproxy import association_proxy
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class MyBaseClass(object):
    """Base class for all publicly accessible classes"""
    id = Column(Integer, primary_key=True)

class Person(MyBaseClass):
    """A Person"""
    name = Column(Unicode)
    movies = association_proxy('movie_roles', 'movie',
                               creator=lambda m: _PersonMovieRole(movie=m))
    shows = association_proxy('show_roles', 'show',
                              creator=lambda s: _PersonShowRole(show=s))

class _PersonMovieRole(Base):
    """Role for a Person in a Movie"""
    __tablename__ = 'persons_movies'
    id = Column(Integer, primary_key=True)
    role = Column(Enum('none', 'actor', 'writer', 'director', 'producer'),
                  default='none')
    person_id = Column(Integer, ForeignKey('persons.id'))
    person = relationship('Person', backref='movie_roles')
    movie_id = Column(Integer, ForeignKey('movies.id'))
    movie = relationship('Movie', backref='persons_roles')

class _PersonShowRole(Base):
    """Role for a Person in a Show"""
    __tablename__ = 'persons_shows'
    id = Column(Integer, primary_key=True)
    role = Column(Enum('none', 'actor', 'writer', 'director', 'producer'),
                  default='none')
    person_id = Column(Integer, ForeignKey('persons.id'))
    person = relationship('Person', backref='show_roles')
    show_id = Column(Integer, ForeignKey('shows.id'))
    show = relationship('Episode', backref='persons_roles')

class MetadataMixin(object):
    """Mixin class that provides metadata-fields and methods"""
    # ...
    persons = association_proxy('persons_roles', 'person',
                                creator= #...???...#)

class Movie(Base, MyBaseClass, MetadataMixin):
    # ....
    pass
What I'm trying to do is to create a generic creator function for association_proxy that creates either a PersonMovieRole or a PersonShowRole object, depending on the class of the concrete instance that a Person is added to. What I'm stuck on at the moment is that I don't know how to pass the calling class to the creator function.
Is this possible, or is there maybe even an easier way for what I'm trying to accomplish?
By the time your persons field is defined, you cannot really know what class it will end up in. Python takes a ready dictionary of class members and creates a class out of it (via type.__new__), and by the time that happens, those members are already fully defined.
So you need to provide the required information directly to the mixin, and tolerate the small duplication it will create in your code. I'd opt for an interface similar to this one:
class Movie(Base, MyBaseClass, MetadataMixin('Movie')):
    pass
(You cannot have MetadataMixin(Movie) either, for the exact same reasons: Movie requires its base classes to be completely defined by the time the class is created).
To implement such "parametrized class", simply use a function:
def MetadataMixin(cls_name):
    """Mixin class that provides metadata-fields and methods"""
    person_role_cls_name = 'Person%sRole' % cls_name
    person_role_cls = Base._decl_class_registry[person_role_cls_name]

    class Mixin(object):
        # ...
        persons = association_proxy('persons_roles', 'person',
                                    creator=person_role_cls)
    return Mixin
This works because what we're looking up in Base._decl_class_registry - the registry of all classes descending from your declarative base - is not the final class (e.g. Movie), but the association object (e.g. PersonMovieRole).
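The "parametrized mixin via a factory function" pattern itself can be seen in isolation in a small sketch; the names here are purely illustrative and not from the original code:

def GreetingMixin(greeting):
    """Return a new mixin class configured by the factory argument."""
    class Mixin(object):
        def greet(self):
            # `greeting` is captured from the enclosing factory scope, so each
            # generated mixin carries its own configuration.
            return '%s, %s!' % (greeting, self.name)
    return Mixin

class EnglishUser(GreetingMixin('Hello')):
    def __init__(self, name):
        self.name = name

assert EnglishUser('Ann').greet() == 'Hello, Ann!'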
It will be simplest to explain with a code example. In Python I can do the following to achieve model inheritance:
"""Image model"""
from sqlalchemy import Column, ForeignKey
from sqlalchemy.types import Integer, String, Text
from miasto_3d.model.meta import Base
class Image(Base):
__tablename__ = "image"
image_id = Column(Integer, primary_key=True)
path = Column(String(200))
def get_mime(self):
#function to get mime type from file
pass
"""WorkImage model"""
class WorkImage(Image, Base):
__tablename__ = "work_images"
image_id = Column(Integer, ForeignKey("image.image_id"), primary_key=True)
work_id = Column(Integer, ForeignKey("work.id"))
work = relation("Work", backref=backref('images',order_by='WorkImage.work_id'))
"""UserAvatar model"""
class UserAvatar(Image, Base):
__tablename__ = "user_avatars"
image_id = Column(Integer, ForeignKey("image.image_id"), primary_key=True)
user_id = Column(Integer, ForeignKey("user.id"))
user = relation("User", backref=backref('images',order_by='UserAvatar.user_id'))
How do I do similar things in Rails? Or maybe there is another, better way to do it?
I know paperclip, but I don't like its approach of using a shared table to store photo and model data.
It looks like you're wanting either a polymorphic association or perhaps single table inheritance.
Since you don't define database fields in the model, you cannot inherit database schema in this way - all your fields will need to be specified per table in a migration. You probably should use paperclip, if only because reinventing the wheel is a pain. It works really well, and abstracts away from the actual database structure for you.
In Rails, rather than model inheritance, shared functionality tends to be implemented in modules, like so:
http://handyrailstips.com/tips/14-drying-up-your-ruby-code-with-modules