The app works like this: a list of people is stored in the database, and each person has a rating that is calculated in real time and never stored in the database. I want to use one class to work with both the database fields (name, age, etc.) and the non-database field (rating).
Is this possible in SQLAlchemy? Right now I'm using inheritance, Man -> ManMapping:
class Man:
    rating = None

    def get_rating(self):
        return self.rating
    ...

class ManMapping(Base, Man):
    __tablename__ = 'man'
    id = Column('man_id', Integer, primary_key=True)
    name = Column(Unicode)
    ...
It works, but it looks terrible to me. Is this the right approach, or should I do something else?
This is the correct solution: https://docs.sqlalchemy.org/en/13/orm/constructors.html
Hybrid properties are somewhat less flexible than this. The accepted answer is not an actual answer to the problem.
The SQLAlchemy ORM does not call __init__ when recreating objects from database rows. The ORM's process is somewhat akin to the Python standard library's pickle module, invoking the low-level __new__ method and then quietly restoring attributes directly on the instance rather than calling __init__.
If you need to do some setup on database-loaded instances before they're ready to use, there is an event hook known as InstanceEvents.load() which can achieve this; it is also available via a class-specific decorator called reconstructor(). When using reconstructor(), the mapper will invoke the decorated method with no arguments every time it loads or reconstructs an instance of the class. This is useful for recreating transient properties that are normally assigned in __init__:
from sqlalchemy import orm

class MyMappedClass(object):
    def __init__(self, data):
        self.data = data
        # we need stuff on all instances, but not in the database.
        self.stuff = []

    @orm.reconstructor
    def init_on_load(self):
        self.stuff = []
If you are using any data from the DB to calculate rating, I would recommend looking at hybrid properties. Otherwise, I would set self.rating in __init__ and keep your function inside the ManMapping class. Something like:
class ManMapping(Base):
    __tablename__ = 'man'
    id = Column('man_id', Integer, primary_key=True)
    name = Column(Unicode)

    def __init__(self):
        self.rating = None

    def get_rating(self):
        return self.rating
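If the rating can be derived from stored columns, a hybrid property computes it on access without ever storing it. A minimal sketch, assuming SQLAlchemy 1.4+ and hypothetical wins/games columns that are not in the original schema:

```python
from sqlalchemy import Column, Integer, Unicode
from sqlalchemy.ext.hybrid import hybrid_property
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class Man(Base):
    __tablename__ = 'man'
    id = Column('man_id', Integer, primary_key=True)
    name = Column(Unicode)
    wins = Column(Integer)   # hypothetical stored columns
    games = Column(Integer)

    @hybrid_property
    def rating(self):
        # computed at access time; never stored in the database
        return self.wins / self.games if self.games else 0.0

m = Man(name='Bob', wins=3, games=4)
print(m.rating)  # 0.75
```

The instance-level accessor works on plain objects as shown; a matching class-level expression can be added later if you need to filter queries by rating.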
From my point of view, you should have two distinct classes: one for the logic in your code and one to communicate with your DB.
class Man(object):
    """This class is for your application"""
    def __init__(self, name, rating):
        # If the identifier is only used by the DB, it should not be in this class
        self.name = name
        self.rating = rating

class ManModel(Base):
    """This model is only to communicate with the DB"""
    __tablename__ = 'man'
    id = Column('man_id', Integer, primary_key=True)
    name = Column(Unicode)
You should have a provider that queries the DB with ManModel objects, maps the results to Man objects, and returns the mapped data to the caller.
Your application will only use Man objects and your provider will do the mapping.
Something like below:
class DbProvider(object):
    def get_man(self, id):
        man_model = session.query(ManModel).filter(ManModel.id == id).one_or_none()
        return self.man_mapper(man_model) if man_model else None

    def get_men(self):
        men_models = session.query(ManModel).all()
        return [self.man_mapper(man_model) for man_model in men_models]

    def man_mapper(self, man_model):
        return Man(man_model.name, self.calculate_rating(man_model))

class Test(object):
    def display_man(self):
        man = db_provider.get_man(15)
        if man:
            print(man.name, man.rating)
Related
I'm currently in a situation where I have a series of classes that turn API request JSON into objects. The objects are modeled after my database schema. The part I'm struggling with is how to represent the entity relationships that are formed with foreign keys in my database.
The following classes are just an example; the instance variables are quite different in my application's schema.
from abc import ABC, abstractmethod

class Table(ABC):
    def __init__(self):
        pass  # stuff

    @abstractmethod
    def validateSchema(self):
        """Validates the resource's column values."""
        pass

class ClassRoom(Table):
    def __init__(self, id, location_id, location):
        super().__init__()
        self.id = id
        self.location = Location(location_id, location)

    def validateSchema(self):
        pass  # stuff

class Location(Table):
    def __init__(self, id, location):
        super().__init__()
        self.id = id
        self.location = location

    def validateSchema(self):
        pass  # stuff
The part I'm concerned about is when I am creating an object of the same type as the class that has the object as an instance variable.
class ClassRoom(Table):
    def __init__(self, id, location_id, location):
        # Can I instantiate this class even if it inherits the same parent?
        self.location = Location(location_id, location)
Is this ok in OOP? Is there a better way to design my classes?
Also, these classes are defined only for the request JSONs that get sent to my API. Their purpose is to facilitate column validation, among a few other things. The specific validation I am hoping to implement in these classes comes from this other Stack Overflow post: Flask sqlAlchemy validation issue with flask_Marshmallow. I'm not trying to recreate SQLAlchemy here.
Your Table class is analogous to SqlAlchemy's db.Model class. And just as it can have references between different subclasses, so can you.
The specific design of your ClassRoom.__init__() method seems wrong. All the classrooms in the same location should hold references to the same Location object, but you create a new one for each classroom. The Location should be a parameter, rather than the location ID and name.
class ClassRoom(Table):
    def __init__(self, id, location):
        super().__init__()
        self.id = id
        self.location = location
Then you can create multiple classrooms in a location:
loc = Location(loc_id, loc_name)
c1 = ClassRoom(c1_id, loc)
c2 = ClassRoom(c2_id, loc)
I have two particular model classes that are giving me errors during my testing. Upon inspecting the methods of each, I was fairly certain my issues were the result of typos on my part.
I've made the changes in both classes as needed, but re-running the tests produces the same error. I even tried dropping my schema and re-creating it with Flask-SQLAlchemy's create_all() method, but I still run into issues.
In the Metrics class, the variables in the __init__ method were wrong and missing underscores (i.e. self.name instead of self._name). I addressed that by changing them to self._name and self._metric_type.
In the HostMetricMapping class, I needed to add the host_id parameter to the __init__ method, since I had forgotten it the first time. So I added it.
class Metrics(_database.Model):
    __tablename__ = 'Metrics'
    _ID = _database.Column(_database.Integer, primary_key=True)
    _name = _database.Column(_database.String(45), nullable=False)
    _metric_type = _database.Column(_database.String(45))
    _host_metric_mapping = _database.relationship('HostMetricMapping', backref='_parent_metric', lazy=True)

    def __init__(self, name, metric_type):
        self._name = name  # This line used to say self.name, but was changed to self._name to match the column name
        self._metric_type = metric_type  # This line used to say self.metric_type, but was changed to self._metric_type to match the column name

    def __repr__(self):
        return '{0}'.format(self._ID)

class HostMetricMapping(_database.Model):
    __tablename__ = 'HostMetricMapping'
    _ID = _database.Column(_database.Integer, primary_key=True)
    _host_id = _database.Column(_database.Integer, _database.ForeignKey('Hosts._ID'), nullable=False)
    _metric_id = _database.Column(_database.Integer, _database.ForeignKey('Metrics._ID'), nullable=False)
    _metric = _database.relationship('MetricData', backref='_metric_hmm', lazy=True)
    _threshold = _database.relationship('ThresholdMapping', backref='_threshold_hmm', lazy=True)

    def __init__(self, host_id, metric_id):
        self._host_id = host_id  # This line and its corresponding parameter were missing, and were added
        self._metric_id = metric_id

    def __repr__(self):
        return '{0}'.format(self._ID)
The issues I encounter are:
When trying to instantiate an instance of Metrics and add it to the database, SQLAlchemy raises an IntegrityError: the _name column is set to NOT NULL, yet SQLAlchemy inserts both _name and _metric_type as None/NULL, even though I instantiate the object with values for both parameters.
For HostMetricMapping, Python raises an exception because it still treats the class as having only the metric_id parameter, instead of also having the host_id parameter I've added.
A better way to override __init__ when using Flask-SQLAlchemy is to use reconstructor. Object initialization with SQLAlchemy is a little tricky, and Flask-SQLAlchemy may complicate it further. Anyway, here's how to do it:
from sqlalchemy.orm import reconstructor

class MyModel(db.Model):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.init_on_load()

    @reconstructor
    def init_on_load(self):
        pass  # put your init stuff here
I have a Django model that I want to attach an extra piece of information to, depending on the environment the instance is in (i.e. which user is logged in). For this reason, I don't want to do it at the database level.
Is this okay to do? Or are there problems that I don't foresee?
in models.py
class FooOrBar(models.Model):
    """Type is 'foo' or 'bar'"""
    def __init__(self, type):
        self.type = type
in views.py
class FooCheck(FooOrBar):
    """Never saved to the database"""
    def __init__(self, foo_or_bar):
        self.__dict__ = foo_or_bar.__dict__.copy()

    def check_type(self, external_type):
        if external_type == 'foo':
            self.is_foo = True
        else:
            self.is_foo = False

foos_or_bars = FooOrBar.objects.all()
foochecks = map(FooCheck, foos_or_bars)
for foocheck in foochecks:
    foocheck.check_type('foo')
Extra-credit question: is there a more efficient way of calling a method on multiple objects, i.e. replacing the last for loop with something clever?
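For the extra-credit question, a plain loop is already about as efficient as it gets; a list comprehension only spells the eager evaluation out on one line. A toy sketch with a stand-in class (no Django involved):

```python
class FooCheck:
    def __init__(self, type_):
        self.type = type_

    def check_type(self, external_type):
        self.is_foo = (external_type == self.type)

checks = [FooCheck('foo'), FooCheck('bar')]
# calls the method on every object; the collected None results are discarded
[check.check_type('foo') for check in checks]
print(checks[0].is_foo, checks[1].is_foo)  # True False
```

Note that map() in Python 3 is lazy, so relying on it purely for side effects would silently do nothing unless the iterator is consumed.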
Okay, this does not work. Trying to delete a FooOrBar object raises:
OperationalError at /
no such table: test_FooCheck
To get around this I'm just not going to inherit from FooOrBar, but if anyone has a suggestion for a better way to do it, I'd be interested in hearing it.
I had a similar issue; I did something like:
class Foo(models.Model):
    pass  # specific info goes here

class Bar(models.Model):
    pass  # specific info goes here

class FooBar(models.Model):
    CLASS_TYPES = {
        "foo": Foo,
        "bar": Bar,
    }
    type = models.CharField(max_length=3, choices=[(key, key) for key in CLASS_TYPES])
    id = models.IntegerField()  # field to identify FooBar
then you can get the object back using
object = FooBar.CLASS_TYPES[instance.type].objects.get(id=instance.id)
where instance is the FooBar instance.
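Stripped of the Django specifics, the lookup is plain dictionary dispatch: map a stored type tag to a class, then use that class. A sketch with stand-in classes (Foo/Bar here are not ORM models):

```python
class Foo:
    def __init__(self, id):
        self.id = id

class Bar:
    def __init__(self, id):
        self.id = id

CLASS_TYPES = {"foo": Foo, "bar": Bar}

def resolve(type_name, obj_id):
    # pick the concrete class by its stored type tag, then build/fetch it
    return CLASS_TYPES[type_name](obj_id)

obj = resolve("bar", 7)
print(type(obj).__name__, obj.id)  # Bar 7
```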
I'm new to SQLAlchemy and am using Flask-SQLAlchemy for my current project. I'm getting an error that has me stumped:
sqlalchemy.orm.exc.UnmappedInstanceError: Class 'flask_sqlalchemy._BoundDeclarativeMeta' is not mapped; was a class (app.thing.Thing) supplied where an instance was required?
I've tried changing the notation in the instantiation and moving some things around. Could the problem be related to this question, or something else?
Here is my module (thing.py) with the class that inherits from Flask-SQLAlchemy's db.Model (like SQLAlchemy's Base class):
from app import db

class Thing(db.Model):
    indiv_id = db.Column(db.INTEGER, primary_key=True)
    stuff_1 = db.Column(db.INTEGER)
    stuff_2 = db.Column(db.INTEGER)
    some_stuff = db.Column(db.BLOB)

    def __init__():
        pass
And here is my call to populate the database in the thing_wrangler.py module:
import thing
from app import db

def populate_database(number_to_add):
    for each_thing in range(number_to_add):
        a_thing = thing.Thing
        a_thing.stuff_1 = 1
        a_thing.stuff_2 = 2
        a_thing.some_stuff = [1, 2, 3, 4, 5]
        db.session.add(a_thing)
        db.session.commit()
You need to create an instance of your model. Instead of a_thing = thing.Thing, it should be a_thing = thing.Thing(). Notice the parentheses. Since you overrode __init__, you also need to fix it so it takes self as the first argument.
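The difference is easy to reproduce without any ORM: assigning the class object itself mutates class attributes shared by everyone, while calling the class produces a fresh instance. A minimal sketch:

```python
class Thing:
    def __init__(self):
        self.stuff_1 = None

a_thing = Thing      # the class object itself, not an instance
instance = Thing()   # an actual instance

a_thing.stuff_1 = 1          # sets a CLASS attribute on Thing
print(Thing.stuff_1)         # 1
print(instance.stuff_1)      # None (instance attribute set in __init__)
```

This is why session.add(a_thing) fails in the question: the session receives the class, not a mapped instance.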
I want to pass an instance of a mapped class to a non-SQLAlchemy-aware method (in another process) and only need the values of my attributes. The problem is that an UnboundExecutionError occurs every time the method wants to read an attribute value. I do understand why this happens, but I would like a solution for this problem.
I only need the values of my defined attributes (id, name and dirty in the example) and do not need the SQLAlchemy overhead in the destination method.
Example class:
Base = declarative_base()

class Record(Base):
    __tablename__ = 'records'
    _id = Column('id', Integer, primary_key=True)
    _name = Column('name', String(50))
    _dirty = Column('dirty', Boolean, index=True)

    @synonym_for('_id')
    @property
    def id(self):
        return self._id

    @property
    def name(self):
        return self._name

    @name.setter
    def name(self, value):
        self._name = value
        self._dirty = True
    name = synonym('_name', descriptor=name)

    @synonym_for('_dirty')
    @property
    def dirty(self):
        return self._dirty
Example call:
...
def do_it(self):
    records = self.query.filter(Record.dirty == True)
    for record in records:
        pass_to_other_process(record)
I've tried using session.expunge() and copy.copy(), but without success.
You need to remove the SQLAlchemy object from the session, i.e. 'expunge' it. Then you can read any already-loaded attribute without the object attempting to reuse its last known (unbound) session.
self.expunge(record)
Be aware, though, that any unloaded attribute will return its last known value or None. If you would like to work with the object again later, you can just call 'add' or 'merge' again:
self.add(record)
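Put together with an in-memory SQLite database, the detach-then-read flow looks like this (a sketch, assuming SQLAlchemy 1.4+; Record is trimmed to two plain columns):

```python
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class Record(Base):
    __tablename__ = 'records'
    id = Column(Integer, primary_key=True)
    name = Column(String(50))

engine = create_engine('sqlite://')
Base.metadata.create_all(engine)

session = Session(engine)
session.add(Record(id=1, name='alpha'))
session.commit()

record = session.query(Record).first()  # attributes are loaded here
session.expunge(record)                 # detach from the session
session.close()

# already-loaded attributes stay readable on the detached object
print(record.name)  # alpha
```

Accessing an attribute that was never loaded (or was expired) on a detached object is what raises the error, so make sure everything you need is loaded before expunging.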
My guess is that you're running afoul of SQLAlchemy's lazy loading. Since I don't actually know a whole lot about SQLAlchemy's internals, here's what I recommend:
class RecordData(object):
    __slots__ = ('id', 'name', 'dirty')

    def __init__(self, rec):
        self.id = rec.id
        self.name = rec.name
        self.dirty = rec.dirty
Then later on...
def do_it(self):
    records = self.query.filter(Record.dirty == True)
    for record in records:
        pass_to_other_process(RecordData(record))
Now, I think there is a way to tell SQLAlchemy to turn your object into a 'dumb' object that has no connection to the database and looks very much like what I just made here. But I don't know what it is.
You may need to make the object transient as well:
This describes one of the major object states which an object can have within a Session; a transient object is a new object that doesn't have any database identity and has not been associated with a session yet. When the object is added to the session, it moves to the pending state.
You can make that by expunging the old object, making it transient and then using it to create the new object.
Something like:
# Get the original object (new query style)
query = select(MyTable).where(MyTable.id == old_object_id)
result = session.execute(query)
old_object = result.scalar_one_or_none()

# Remove it from the session & make it transient
session.expunge(old_object)
make_transient(old_object)

# Use it to create the new object
new_object = MyTable(attr1=old_object.attr1 ...)

# Add & commit the new object
session.add(new_object)
session.commit()