Separating SQLAlchemy mapping class from functional class in Python

I'm sure there have been questions about this before, and I'm just not searching for the right keywords...
Is it considered good/bad practice to separate ORM mapping classes from the classes that are used for processing in an application?
For instance:
class Artist:
    def __init__(self, name, age, genre):
        self.name = name
        self.age = age
        self.genre = genre
        self.score = None  # Calculated in real time, never stored to DB

class ArtistM(Base):
    __tablename__ = "artists"
    id = Column(Integer, primary_key=True)  # mapped classes need a primary key
    name = Column(String(50))
    age = Column(Integer)
    genre = Column(String(50))
A conceivable benefit would be to keep the classes used by the primary application completely free of ORM concerns. For instance, assume I've created an Artist class and an entire suite of tools that operate on that class. Later on, I want to start handling LOTS of these objects and decide to add a DB component. Do I want to go back and modify the original Artist class, or build new mapping classes on top?

Use inheritance to achieve this:
# Option 1: use ArtistM everywhere
class ArtistM(Artist, Base):
    # ...

# Option 2: use Artist everywhere
class Artist(ArtistM):
    # ...
I prefer option 1, as it does not clutter your pure classes with persistence-related information. In your code you could even import them under the pure name, so the rest of your code works interchangeably:
from mypuremodule import Artist
# or
from mymixedinmodule import ArtistM as Artist
# then use 'Artist' everywhere
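For concreteness, here is a minimal, self-contained sketch of option 1 (the Base setup and the column sizes are assumptions carried over from the question):
from sqlalchemy import Column, Integer, String
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class Artist:
    """Pure domain class: no ORM imports anywhere."""
    def __init__(self, name, age, genre):
        self.name = name
        self.age = age
        self.genre = genre
        self.score = None  # computed at runtime, never persisted

class ArtistM(Artist, Base):
    """Persistence layer stacked on top of the pure class."""
    __tablename__ = "artists"
    id = Column(Integer, primary_key=True)
    name = Column(String(50))
    age = Column(Integer)
    genre = Column(String(50))
The pure class stays importable (and testable) without any database in sight, while ArtistM picks up both the domain behaviour and the mapping.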

Related

django simple history - using model methods?

I'm using django-simple-history:
http://django-simple-history.readthedocs.io/en/latest/
I have a model whose methods I would like to apply to a historical instance. Example:
from django.db import models
from simple_history.models import HistoricalRecords

class Person(models.Model):
    firstname = models.CharField(max_length=20)
    lastname = models.CharField(max_length=20)
    history = HistoricalRecords()

    def fullName(self):
        return self.firstname + self.lastname

person = Person.objects.get(pk=1)  # Person instance
for historyPerson in person.history.all():
    historyPerson.fullName()  # won't work
The class HistoricalPerson does not inherit the methods of Person, even though using Person's methods would actually make sense, since they share the same fields.
Any solution for this? I'd prefer something simple, not duplicating every method in my models for the history instances.
I found another workaround (maybe the add-on has simply been updated and gained this feature). It's based on the documentation: adding-additional-fields-to-historical-models
The HistoricalRecords field accepts a bases parameter, which sets a class that the history objects will inherit from. But you can't just set bases=[Person] inside the Person class definition, because the class is not yet initialized at that point.
So I ended up with an abstract class, which is inherited by both the Person class and the HistoricalRecords field. The example from the question would look like:
class AbstractPerson(models.Model):
    class Meta:
        abstract = True

    firstname = models.CharField(max_length=20)
    lastname = models.CharField(max_length=20)

    def fullName(self):
        return self.firstname + self.lastname

class Person(AbstractPerson):
    history = HistoricalRecords(bases=[AbstractPerson])
And now history objects can use the fullName method.
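A quick usage sketch under the same setup (assuming, as in the question, a Person with pk=1 exists):
person = Person.objects.get(pk=1)
for historical in person.history.all():
    # fullName is inherited by the generated HistoricalPerson model
    print(historical.fullName())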
For anyone else having the same problem, I made it work by calling the method from the original class on the historical record object. So for the example in the question, a solution could be:
for historyPerson in person.history.all():
    Person.fullName(historyPerson)
This works because methods are very much like functions in Python, except that when you call a method on an instance, the instance is implicitly passed as the first parameter for the method. So if you have a class like:
class Foo:
    def method(self):
        ...
doing
f = Foo()
f.method()
is the same as:
f = Foo()
Foo.method(f)
I don't know exactly why simple-history does not copy the original model's methods, though. One reason might be that it allows you to exclude fields from being recorded, so copying the original methods might not make sense: a method could break if it uses fields that are not recorded on the historical record.

How to create exact duplicate of Django model that has M2M and FK relationships

I have an existing Django model that I'd like to duplicate, and I can't figure out an easy way to do it because of related-name conflicts across ForeignKeys and ManyToManys.
As an example, let's call the model I currently have Dog:
class Dog(models.Model):
    name = models.CharField()
    owner = models.ForeignKey('myapp.Owner')
    breeds = models.ManyToManyField(
        'myapp.Breed',
        help_text="Remember, animals can be mixed of multiple breeds.")
I'd like to make an exact duplicate of this model for use elsewhere, with a different database table and name. I tried using an abstract base class:
class AnimalAbstract(models.Model):
    name = models.CharField()
    owner = models.ForeignKey('myapp.Owner')
    breeds = models.ManyToManyField(
        'myapp.Breed',
        help_text="Remember, animals can be mixed of multiple breeds.")

    class Meta:
        abstract = True

class Dog(AnimalAbstract):
    pass

class Cat(AnimalAbstract):
    pass
This fails because of related_name conflicts.
Is there any way to automatically copy a model like this without explicitly redefining every ForeignKey and ManyToMany?
To preemptively answer questions: yes, I know about multi-table inheritance, and I don't want to use it. I also know that I could simply store this all in the same table and use proxy models with custom managers to automatically filter out the wrong type of animal, but I don't want that either—I want them on separate database tables.
https://docs.djangoproject.com/en/1.8/topics/db/models/#abstract-related-name
To work around this problem, when you are using related_name in an abstract base class (only), part of the name should contain %(app_label)s and %(class)s.
%(class)s is replaced by the lower-cased name of the child class that the field is used in.
%(app_label)s is replaced by the lower-cased name of the app the child class is contained within. Each installed application name must be unique and the model class names within each app must also be unique, therefore the resulting name will end up being different.
Example:
class Dog(models.Model):
    name = models.CharField()
    owner = models.ForeignKey(
        'myapp.Owner',
        related_name="%(app_label)s_%(class)s_dogs")
    breeds = models.ManyToManyField(
        'myapp.Breed',
        help_text="Remember, animals can be mixed of multiple breeds.",
        related_name="%(app_label)s_%(class)s_dogs")

Sqlalchemy (and alembic) Concrete table inheritance without polymorphic union

I'm new to SQLAlchemy, and I'm trying to achieve the following object/DB-table structure (using alembic as well):
class BaseConfig(Base):
    pk = Column(Integer, primary_key=True)
    name = Column(Unicode(150), nullable=False, unique=True)
    # ... lots of other general columns, such as description,
    # display_name, creation time, etc.
And I want all other configuration classes to inherit the predefined columns from it:
class A(BaseConfig):
    __tablename__ = "A"
    column1 = Column...
    column2 = Column...

class B(BaseConfig):
    __tablename__ = "B"
    column1 = Column...
    column2 = Column...
The BaseConfig table is not a real table, only a class that holds common columns.
Other than that, nothing is related between A and B, and there is no need for a shared pk, etc. It seems that "polymorphic_union" is not relevant here either.
When I try to run alembic autogenerate, I get an error that BaseConfig doesn't have a table mapped to the class, which is true, and I really don't see a reason to add the "polymorphic union" option to BaseConfig, since this class is generic.
Any suggestions? (In Django with south this works out of the box, but here it seems that this behavior is not supported easily).
Thanks,
Li
Either use mixins (read the Mixing in Columns part of the documentation), where BaseConfig does not subclass Base and the actual tables subclass both Base and BaseConfig:
class BaseConfig(object):
    # ...

class MyModel(BaseConfig, Base):
    # ...
OR simply use __abstract__:
class BaseConfig(Base):
    __abstract__ = True
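For concreteness, a minimal sketch of the __abstract__ variant using the columns from the question (the column1 definitions on A and B are placeholders):
from sqlalchemy import Column, Integer, Unicode
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class BaseConfig(Base):
    __abstract__ = True  # no table is created for this class

    pk = Column(Integer, primary_key=True)
    name = Column(Unicode(150), nullable=False, unique=True)

class A(BaseConfig):
    __tablename__ = "A"
    column1 = Column(Unicode(50))

class B(BaseConfig):
    __tablename__ = "B"
    column1 = Column(Unicode(50))
The common columns are copied into each subclass, and alembic's autogenerate should then see only the two concrete tables A and B.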

dynamic sqlalchemy columns on class

I have an assortment of SQLAlchemy classes, e.g.:
class OneThing(Base):
    id = Column(Integer, Sequence('one_thing_seq'), primary_key=True)
    thing = Column(Boolean())
    tag = Column(String(255))

class TwoThing(Base):
    id = Column(Integer, Sequence('two_thing_seq'), primary_key=True)
    thing = Column(Boolean())
    tag = Column(String(100))
i.e. fairly standard class constructions for SQLAlchemy.
Question: Is there a way to get greater control over the column creation, or does that need to be relatively static? I'd at least like to consolidate the more mundane columns and their imports across a number of files, for example (not as a mixin, because I already do that for certain columns that are the same across models, but as a function that returns a column based on potential vars):
class OneThing(Base):
    id = Base.id_column()
    thing = Base.bool_column
    tag = Base.string_col(255)

class TwoThing(Base):
    id = Base.id_column()
    thing = Base.bool_column
    tag = Base.string_col(255)
It seems simple enough and fairly approachable that I could just start reading/typing, but I have not found any examples or the proper search terms for examples yet. There is no reason that class columns need to be static, and it is probably simple. Is this a thing, or a foolish thing?
from sqlalchemy import Column, Boolean, Integer, String

def c_id():
    return Column(Integer, primary_key=True)

def c_bool():
    return Column(Boolean, nullable=False, default=False)

def c_string(length):
    return Column(String(length), nullable=False, default='')

class Thing(Base):
    id = c_id()
    thing = c_bool()
    tag = c_string(255)
The SQLAlchemy developer goes into more detail here: http://techspot.zzzeek.org/2011/05/17/magic-a-new-orm/
The Column() call is not magical; you can use any random way to create the appropriate object. The magic (i.e. binding the column to the variable name and the table) happens in Base's metaclass.
So one solution is for you to write your own code which returns a Column() or three -- nothing prevents you from doing this:
class Thing(Base):
    id, thing, tag = my_magic_creator()
On the other hand, you can drop all these assignments wholesale and do the work in a metaclass; see my answer here: Creating self-referential tables with polymorphism in SQLAlchemy, for a template on how to do that.
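For instance, my_magic_creator (the hypothetical helper named above, not part of SQLAlchemy) could simply return a tuple of columns matching the models in the question:
from sqlalchemy import Column, Boolean, Integer, String

def my_magic_creator(tag_length=255):
    # return the (id, thing, tag) column trio used by these models
    return (
        Column(Integer, primary_key=True),
        Column(Boolean, nullable=False, default=False),
        Column(String(tag_length), nullable=False, default=''),
    )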

python class attributes and sql alchemy relationships

OK, so I have two problems:
My first is SQLAlchemy-related. I'm forced to use SQLite, and I have to implement some cascade deletes. Now, I've found that relationships do the job; however, these are normally declared in the parent table. Due to some design decisions I'm forced to do this in the child, so I'm doing something like:
class Operation(Base):
    """The class used to log any action executed in Projects."""
    __tablename__ = 'OPERATIONS'
    id = Column(Integer, primary_key=True)
    parameters = Column(String)
    # ... rest of class here ...

class DataType(Base):
    __tablename__ = 'DATA_TYPES'
    id = Column(Integer, primary_key=True)
    gid = Column(String)
    # ... more params ...
    parent_operation = relationship(Operation,
                                    backref=backref("DATA_TYPES",
                                                    order_by=id,
                                                    cascade="all,delete"))
    # ... rest of class ...
Now this seems to work but I'm still not certain of a few things.
Firstly, what can I do with parent_operation from here on? I see that the cascade works, but I make no use of parent_operation except for the actual declaration.
Secondly, does the "DATA_TYPES" in the above case, the first parameter in the backref, need to be the name of the child table, or does it just need to be unique per model?
And finally, in my case both the Operation and DataType classes are in the same module, so I can pass Operation as the first parameter to the relationship. If this weren't the case and I had them in separate modules, and I still wanted to declare this relationship, should I pass 'Operation' or 'OPERATIONS' to the relationship (class name or table name)?
My second problem is more core Python, but since it still has some connection to the above, I'll add it here. I need to be able to add a class attribute dynamically. Basically, I need to add a relationship like the ones declared above.
class BaseClass(object):
    def __init__(self):
        my_class = self.__class__
        if not hasattr(my_class, self.__class__.__name__):
            reference = ("my_class." + self.__class__.__name__ +
                         " = relationship('DataType', backref=backref('" +
                         self.__class__.__name__ + "', cascade='all,delete'))")
            exec reference
The reasons WHY I need to do this are complicated and have to do with some design decisions (basically, I need every class that extends this one to have a relationship declared to the 'DataType' class). I know using the exec statement isn't good practice, so is there a better way to do the above?
Regards,
Bogdan
For the second part of your question, keep in mind that anything in your class constructor won't be instrumented by SQLAlchemy. In your example you can simply declare the relationship in its own class (note it does not inherit from your declarative_base class) and then inherit it in any subclass, something like this:
from sqlalchemy.ext.declarative import declared_attr

class BaseDataType(object):
    @declared_attr  # mixin relationships must be declared_attr callables
    def parent_operation(cls):
        return relationship(Operation, backref="datatype")

class DataTypeA(Base, BaseDataType):
    id = Column(Integer, primary_key=True)

class DataTypeB(Base, BaseDataType):
    id = Column(Integer, primary_key=True)
The SQLAlchemy documentation gives good examples of what's possible:
http://www.sqlalchemy.org/docs/orm/extensions/declarative.html#mixing-in-relationships
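Here is a fuller sketch of that mixin approach, adding the foreign-key column the relationship needs and a per-class backref name so the two subclasses don't clash on Operation (table and attribute names are assumptions based on the question):
from sqlalchemy import Column, ForeignKey, Integer, String
from sqlalchemy.ext.declarative import declarative_base, declared_attr
from sqlalchemy.orm import relationship

Base = declarative_base()

class Operation(Base):
    __tablename__ = 'OPERATIONS'
    id = Column(Integer, primary_key=True)
    parameters = Column(String)

class BaseDataType(object):
    # every subclass gets its own FK and relationship to Operation
    @declared_attr
    def fk_parent_operation(cls):
        return Column(Integer, ForeignKey('OPERATIONS.id'))

    @declared_attr
    def parent_operation(cls):
        # a distinct backref name per subclass avoids backref conflicts
        return relationship(Operation,
                            backref="datatype_%s" % cls.__name__.lower())

class DataTypeA(Base, BaseDataType):
    __tablename__ = 'DATA_TYPES_A'
    id = Column(Integer, primary_key=True)

class DataTypeB(Base, BaseDataType):
    __tablename__ = 'DATA_TYPES_B'
    id = Column(Integer, primary_key=True)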
