Python - Class references another class before assignment

I am working with sqlalchemy's ORM to create classes mapped to SQL tables. I am running into issues generating the relationships between these classes, since they reference each other before the class is declared. When I run the code, the interpreter raises NameError: name 'Account' is not defined.
I've included a code sample below that demonstrates how I am declaring these classes.
class Location(Base):
    __tablename__ = 'locations'
    id = Column(Integer, primary_key=True)
    name = Column(String)
    address = Column(String)
    city = Column(String)
    state = Column(String)
    zip_code = Column(String)
    account = sa.orm.relationship('Account', order_by=Account.id, back_populates='location')
    entity = sa.orm.relationship('Entity', order_by=Entity.id, back_populates='location')

    def __repr__(self):
        return "<Location(name='{}', address='{}', city='{}', state='{}', zip_code='{}')>".\
            format(self.name, self.address, self.city, self.state, self.zip_code)
class Account(Base):
    __tablename__ = 'accounts'
    id = Column(Integer, primary_key=True)
    name = Column(String)
    number = Column(String)
    institution = Column(String)
    # entity_id = Column(Integer, sa.ForeignKey('entities.id'))
    entity = sa.orm.relationship('Entity', back_populates='accounts')
    location = sa.orm.relationship('Location', order_by=Location.id, back_populates='account')

    def __repr__(self):
        return "<Account(name='{}', account={}, institution={}, entity={})>".\
            format(self.name, self.number, self.institution, self.entity)
class Entity(Base):
    __tablename__ = 'entities'
    id = Column(Integer, primary_key=True)
    name = Column(String)
    accounts = sa.orm.relationship('Account', order_by=Account.id, back_populates='entity')
    location = sa.orm.relationship('Location', order_by=Location.id, back_populates='entity')

    def __repr__(self):
        return "<Entity(name='{}', location='{}')>".format(self.name, self.location)
What am I missing here? Is there a way to define all the classes first and reference them later, as you can with functions? With functions, for example, it's simple to call main at the bottom after all the functions are defined:

def main():
    foo()

def foo():
    pass

if __name__ == '__main__':
    main()

Define your orderings either as callables or as expression strings, as explained in the relationship API documentation:
class Location(Base):
    ...
    account = sa.orm.relationship('Account',
                                  order_by=lambda: Account.id, ...)
or
class Location(Base):
    ...
    account = sa.orm.relationship('Account',
                                  order_by='Account.id', ...)
The problem is that during evaluation of the Location class body, the name Account does not yet exist in the global scope and was not defined in the local scope of the class body. Passing a function/lambda defers the evaluation to "mapper initialization time":
Some arguments accepted by relationship() optionally accept a callable function, which when called produces the desired value. The callable is invoked by the parent Mapper at “mapper initialization” time, which happens only when mappers are first used, and is assumed to be after all mappings have been constructed. This can be used to resolve order-of-declaration and other dependency issues, such as if Child is declared below Parent in the same file
Passing a string will also resolve the order-of-declaration issue, and provides another feature:
These string arguments are converted into callables that evaluate the string as Python code, using the Declarative class-registry as a namespace. This allows the lookup of related classes to be automatic via their string name, and removes the need to import related classes at all into the local module space
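The deferred-evaluation idea behind the lambda form can be seen in plain Python, without SQLAlchemy at all. In this illustrative sketch (`account_order` is a made-up attribute, not SQLAlchemy API), the name inside the lambda is only resolved when the lambda is called, by which time the class exists:

```python
class Location:
    # order_by=Account.id here would raise NameError: Account isn't defined yet,
    # but the lambda body is not executed while the class body is evaluated
    account_order = staticmethod(lambda: Account.id)

class Account:
    id = 42

# By the time the callable is invoked (SQLAlchemy does this at mapper
# initialization), Account exists in the module namespace.
print(Location.account_order())  # 42
```

The string form works the same way, except the lookup happens in the declarative class registry instead of the module namespace.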

Related

Inherit From SQLAlchemy Joined Table Inheritance Class

Python 3.6 and SQLAlchemy 1.2.
I have a package called events which defines Match as a type of Event and uses joined table inheritance to distinguish it from other types of Event. The other type of event is Competition, all events are one or the other:
class Event(Base):
    __tablename__ = 'tbl_events'
    event_id = Column(Integer, primary_key=True)
    event_type_id = Column(String(50))  # 0 is Match, 1 is Competition
    __mapper_args__ = {'polymorphic_on': event_type_id}

class Match(Event):
    __tablename__ = 'tbl_match_details'
    event_id = Column(Integer,
                      ForeignKey('tbl_events.event_id'),
                      primary_key=True)
    team_1 = Column(String(50))
    team_2 = Column(String(50))
    __mapper_args__ = {'polymorphic_identity': 0}
I'm using Match in another package which distinguishes multiple types of Match. It relies on the Match object's attributes and methods to pull event info from the database, but otherwise operates away from the database:

from events import Match

class BaseMatch(Match):
    # define common methods and attrs
    pass

class TennisMatch(BaseMatch):
    # do Tennis based stuff
    pass

class FootballMatch(BaseMatch):
    # do football based things
    pass
Any difference between events.Match and the classes that inherit from it matters only in this package, and this package only reads from the database; it doesn't otherwise insert or update it.
The issue I'm having is that attempting to instantiate an instance of any of the classes that inherits from Match results in a NULL value being passed into the query for the event_type_id field. This is the WHERE part of the query:
WHERE tbl_match_details.event_id = %s AND tbl_events.match_comp_id IN (NULL)
I can't simply give each class their own polymorphic identifier as those identifiers won't exist in the database.
I tried this:
class BaseMatch(Match):
    @declared_attr
    def __mapper_args__(cls):
        return {'polymorphic_identity': 0}

class TennisMatch(BaseMatch):
    # do tennis stuff
    pass

class FootballMatch(BaseMatch):
    # do footy stuff
    pass
but importing the module, I get warnings like:
SAWarning: Reassigning polymorphic association for identity 0 from <Mapper at 0x7f80197f0550; Match> to <Mapper at 0x7f80197a9fd0; BaseModel>: Check for duplicate use of 0 as value for polymorphic_identity.
SAWarning: Reassigning polymorphic association for identity 0 from <Mapper at 0x7f80197a9fd0; BaseModel> to <Mapper at 0x7f800dfdf940; TennisMatch>: Check for duplicate use of 0 as value for polymorphic_identity.
I get one of those warnings for each class that inherits from Match, and when I attempt to instantiate any of the match types, I get an instance of whichever type was last associated with that polymorphic id.
I'd really appreciate a nudge in the right direction!
Thanks.
Here's what I've done to work around this. I'm not sure it is 'right', but it has allowed me to move forward with what I'm doing and helped me understand a bit more of the goings-on under the hood.
I've created factory methods on my Event, Competition and Match classes, and a class attribute on Competition and Match that gives me access to each event type's event_type_id value:
from sqlalchemy import inspect

class Event(Base):
    __tablename__ = 'tbl_events'
    event_id = Column(Integer, primary_key=True)
    event_type_id = Column(String(50))  # 0 is Match, 1 is Competition
    __mapper_args__ = {'polymorphic_on': event_type_id}

    @classmethod
    def from_id(cls, id, session):
        mapper = inspect(cls).mapper
        mapper.polymorphic_map[cls.EVENT_TYPE_ID] = mapper
        mapper.polymorphic_identity = cls.EVENT_TYPE_ID
        return session.query(cls).filter_by(event_id=id).one()

class Match(Event):
    EVENT_TYPE_ID = 0
    __tablename__ = 'tbl_match_details'
    event_id = Column(Integer,
                      ForeignKey('tbl_events.event_id'),
                      primary_key=True)
    team_1 = Column(String(50))
    team_2 = Column(String(50))
    __mapper_args__ = {'polymorphic_identity': EVENT_TYPE_ID}
This way, whenever classes that inherit from Match or Competition are instantiated through the factory methods, the polymorphic identity is forced to the identity defined on the parent class, and the polymorphic map points that identity at the class the factory is called on.
A disadvantage obvious to me is that this will only work when objects are instantiated through the factory methods. Fine in this case but maybe not for all.
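The redirection the factory performs can be sketched in plain Python. Here `polymorphic_map` and `claim_identity` are illustrative stand-ins for `mapper.polymorphic_map` and the `from_id` factory, not SQLAlchemy API; the sketch also shows why only the most recent caller "owns" the identity, which is exactly the factory-only limitation noted above:

```python
# Stand-in for mapper.polymorphic_map: identity value -> class to materialize.
polymorphic_map = {}

class Match:
    EVENT_TYPE_ID = 0

    @classmethod
    def claim_identity(cls):
        # The factory does this just before querying, so rows carrying
        # identity 0 come back as whichever subclass the factory ran on.
        polymorphic_map[cls.EVENT_TYPE_ID] = cls

class TennisMatch(Match):
    pass

class FootballMatch(Match):
    pass

TennisMatch.claim_identity()
print(polymorphic_map[0].__name__)   # TennisMatch

FootballMatch.claim_identity()
print(polymorphic_map[0].__name__)   # FootballMatch (last caller wins)
```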
Would appreciate any feedback on how I've gone about this and any pointers toward a cleaner solution.
Thanks

Are SQLAlchemy model variables class or object type?

I am learning Web Development in Flask. I am using SQLAlchemy. A typical database object is structured like so:
class Role(db.Model):
    __tablename__ = 'roles'
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(64), unique=True)
    default = db.Column(db.Boolean, default=False, index=True)
    permissions = db.Column(db.Integer)
    users = db.relationship('User', backref='role', lazy='dynamic')

    def __repr__(self):
        return '<Role %r>' % self.name
My question is, are these all class variables or object variables? They are outside the __init__ so it would seem they are class variables, that seems odd though. Any pointers on this would be great! Thanks!
The fields with type Column in Role are indeed class variables. But they are replaced with InstrumentedAttribute objects during construction of the Role class, which happens in declarative.api.DeclarativeMeta.__init__().
This is why Role inherits from Base, whose metaclass is declarative.api.DeclarativeMeta (declarative_base() returns such a class).
InstrumentedAttribute is a data descriptor: it defines __get__ and __set__, which operate on the instance dictionary. As a result, we can perform these operations through instances.
In the following example, Role.name is a data descriptor, so r.name = 'hello' is equivalent to Role.name.__set__(r, 'hello'), which in turn does r.__dict__['name'] = 'hello':
r = Role()
r.name = 'hello'
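The descriptor mechanism itself can be demonstrated without SQLAlchemy. This is a minimal sketch of the protocol InstrumentedAttribute relies on; `TrackedColumn` is a made-up name, and a real ORM attribute does far more (change tracking, SQL expression generation) in its hooks:

```python
class TrackedColumn:
    """A minimal data descriptor: defines both __get__ and __set__."""

    def __set_name__(self, owner, name):
        self.name = name

    def __get__(self, instance, owner):
        if instance is None:
            return self  # class-level access returns the descriptor itself
        return instance.__dict__.get(self.name)

    def __set__(self, instance, value):
        # a real ORM would record this change for the unit of work here
        instance.__dict__[self.name] = value

class Role:
    name = TrackedColumn()  # class variable, but instances go through it

r = Role()
r.name = 'hello'             # invokes Role.name.__set__(r, 'hello')
print(r.__dict__['name'])    # 'hello' -- value landed in the instance dict
```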

How to define (or simulate) a collection containing different types in SQLAlchemy?

Working on a project with SQLAlchemy, I was attempting to employ what I believe is a composition pattern. I have a class of "owner" objects; I encapsulate some functionality in component classes and give the owners different capabilities by assigning components to them. The owners and the components all have state that needs to be serialized, so they're all SQLAlchemy objects. Here's a simple example:
class Employee(DeclarativeBase):
    __tablename__ = 'employee'
    id = Column(Integer, primary_key=True)
    name = Column(String)

    def __init__(self, name):
        self.name = name

class SalesRole(DeclarativeBase):
    __tablename__ = 'sales_role'
    id = Column(Integer, primary_key=True)
    employee_id = Column(Integer, ForeignKey('employee.id'))
    employee = relationship(
        'Employee',
        backref=backref('sales_role', uselist=False)
    )

    def __init__(self, employee):
        self.employee = employee
        self.total_sales = 0

    def __repr__(self):
        return "<SalesRole(employee='%s')>" % self.employee.name

    # Sales-specific data and behavior
    total_sales = Column(Float)

class CustomerSupportRole(DeclarativeBase):
    __tablename__ = 'support_role'
    id = Column(Integer, primary_key=True)
    employee_id = Column(Integer, ForeignKey('employee.id'))
    employee = relationship(
        'Employee',
        backref=backref('support_role', uselist=False)
    )

    def __init__(self, employee):
        self.employee = employee
        self.tickets_resolved = 0

    def __repr__(self):
        return "<CustomerSupportRole(employee='%s')>" % self.employee.name

    # Support-specific data and behavior
    tickets_resolved = Column(Integer)
What I would like to be able to do is to define a property on the owner class that returns a collection (of whatever kind) containing all such components that have been assigned to the owner, i.e.,
# Define an Employee object bob and assign it some roles
>>> bob.roles
[<SalesRole(employee='Bob')>, <CustomerSupportRole(employee='Bob')>]
I want to accomplish this without hardcoding any references to the types of components that can exist on the owner class -- I want to define new components without changing code anywhere else.
I can more or less accomplish this with an intermediary table mapping owner instances to their components using generic_relationship from sqlalchemy-utils. Unfortunately, generic_relationship severs SQLAlchemy's automatic cascading of child objects.
Another approach I've tried with help elsewhere was to use SQLAlchemy's event framework to listen for mappings of relationships targeting the owner class ('mapper_configured' events). The components would define a backref from themselves to the owner class, and use the info parameter to set an arbitrary flag denoting this relationship as a referring to one of the components we want to be made available via this collection. The function registered to catch mapping events would test for this flag, and hypothetically build the collection containing those relationships, but we could never figure out how to make that work.
It's not important to me that this collection be a SQLAlchemy object via which I can actually write to the database (i.e., bob.roles.append(SalesRole())). That would be very cool, but a property serving as a read-only iterable view would suffice.
It's not important whether the named attribute backrefs exist on the owner class (e.g., bob.sales_role). It's fine if they do, but the collection is more important to me.
Like I mentioned earlier, cascading is important (unless you want to convince me it's not!).
Again, it is important that I don't have to hardcode a list of component types anywhere. Whatever magic distinguishes the classes I want to show up in this collection of components from everything else should live in the definition of the components themselves. I want this to be readily extensible as I define new components.
Is there a way to make this work? Or should I be taking a different approach in general -- feel free to tell me this sounds like an XY problem.
And the answer is something I suspected but didn't look for hard enough: class inheritance. Simple, elegant, and accomplishes everything I want.
class Role(DeclarativeBase):
    __tablename__ = 'role'
    id = Column(Integer, primary_key=True)
    role_type = Column(String)
    employee_id = Column(Integer, ForeignKey('employee.id'))
    employee = relationship('Employee', backref='roles')
    __mapper_args__ = {
        'polymorphic_identity': 'employee',
        'polymorphic_on': role_type
    }

class SalesRole(Role):
    __tablename__ = 'sales_role'
    id = Column(Integer, ForeignKey('role.id'), primary_key=True)
    __mapper_args__ = {
        'polymorphic_identity': 'sales_role'
    }
    # Sales-specific attributes, etc.

class CustomerSupportRole(Role):
    __tablename__ = 'support_role'
    id = Column(Integer, ForeignKey('role.id'), primary_key=True)
    __mapper_args__ = {
        'polymorphic_identity': 'support_role'
    }
    # Support-specific attributes, etc.
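Why inheritance satisfies the "no hardcoded component list" requirement can be shown with a plain-Python sketch (no SQLAlchemy; the manual append in `Role.__init__` stands in for what the `backref='roles'` relationship does automatically): a single collection typed to the base class holds any subclass, and defining a new role touches no code on Employee.

```python
class Employee:
    def __init__(self, name):
        self.name = name
        self.roles = []   # the ORM backref creates and maintains this for us

class Role:
    def __init__(self, employee):
        self.employee = employee
        employee.roles.append(self)   # stand-in for the relationship backref

class SalesRole(Role):
    pass

class CustomerSupportRole(Role):
    pass

bob = Employee('Bob')
SalesRole(bob)
CustomerSupportRole(bob)
print([type(r).__name__ for r in bob.roles])
# ['SalesRole', 'CustomerSupportRole']
```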

With SQLAlchemy, how do I make a dynamic relation?

I have a SyncEntities class (shown below).
I have several other classes (such as CommodityTypes also shown below) related to the SyncEntities class.
All of my Base subclasses have this column uuidKey = Column(String, primary_key=True)
Assume se is an instance of SyncEntities.
se.entityKind is the name of a Base subclass.
How do I query for an object that is in the se.entityKind class filtering for se.uuidKey?
class SyncEntities(Base):
    __tablename__ = 'SyncEntities'
    uuidKey = Column(String, primary_key=True)
    dateCreated = Column(DateTime, index=True)
    dateModified = Column(DateTime, index=True)
    dateSynced = Column(DateTime, index=True)
    username = Column(String)
    entityKind = Column(String)
    deleted = Column(Boolean)

    def __init__(self, entity, security):
        self.uuidKey = newUUID()
        self.dateCreated = security.now
        self.dateModified = security.now
        self.dateSynced = security.then
        self.username = security.username
        self.entityKind = entity.__tablename__
        self.deleted = False

    def modified(self, security):
        self.dateModified = security.now
        self.username = security.username

class CommodityTypes(Base):
    __tablename__ = 'CommodityTypes'
    uuidKey = Column(String, ForeignKey('SyncEntities.uuidKey'), primary_key=True)
    myName = Column(String, unique=True)
    sortKey = Column(Integer, unique=True)
    mySyncEntity = relationship("SyncEntities")

    def __init__(self, security, myName, sortKey):
        self.mySyncEntity = SyncEntities(self, security)
        self.uuidKey = self.mySyncEntity.uuidKey
        self.myName = myName
        self.sortKey = sortKey
The structure here is similar, though not quite the same, as a "polymorphic association", and you can read about this pattern over at this blog post: http://techspot.zzzeek.org/2007/05/29/polymorphic-associations-with-sqlalchemy/ . It's an old post but the example at http://techspot.zzzeek.org/files/2007/discriminator_on_association.py was added later as an updated example.
This case is a little different in that an object like CommodityTypes only refers to a single SyncEntities, not multiple as in the usual polymorphic association. The SyncEntities also can only refer to a single type of related object since you have entityKind on it locally.
I would note that a potential problem with this design is that you could have rows in other tables that have a uuidKey pointing to a particular SyncEntities instance, but are not of a type that matches "entityKind". If the relationship between CommodityTypes and SyncEntities is actually one-to-one, that changes everything - this pattern is really simple joined table inheritance and you'd use the patterns described at http://docs.sqlalchemy.org/en/rel_0_7/orm/inheritance.html.
You also don't have backrefs between the target and SyncEntities, which is often a way to automate these styles of lookup. But you can still approximate things using a lookup of entityKind types to classes:
def lookup_related(se):
    types = {
        'commodity': CommodityTypes,
        'foobar': FooBarTypes
    }
    cls = types[se.entityKind]
    session = object_session(se)
    return session.query(cls).filter(cls.mySyncEntity == se).all()
here's a mixin that could do it also, using a backref:
class HasSyncEntity(object):
    entity_kind = None
    """subclasses need to populate this"""

    @declared_attr
    def uuidKey(cls):
        return Column(String, ForeignKey("SyncEntities.uuidKey"), primary_key=True)

    @declared_attr
    def mySyncEntity(cls):
        return relationship("SyncEntities", backref="_%s_collection" % cls.entity_kind)
CommodityTypes becomes:
class CommodityTypes(HasSyncEntity, Base):
    entity_kind = "commodity"
    # ...
You then add a method like this to SyncEntities, which looks up the appropriate backref, and you're done:
def get_related(self):
    return getattr(self, "_%s_collection" % self.entityKind)
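The naming-convention dispatch that makes get_related work can be sketched in plain Python (no SQLAlchemy; the manually assigned `_commodity_collection` stands in for the backref the mixin would generate): the parent looks up a collection attribute whose name is derived from its own type tag.

```python
class SyncEntity:
    def __init__(self, entity_kind):
        self.entityKind = entity_kind

    def get_related(self):
        # same convention as the mixin's backref: "_<entity_kind>_collection"
        return getattr(self, "_%s_collection" % self.entityKind)

se = SyncEntity("commodity")
# In the real mapping, the HasSyncEntity backref would populate this.
se._commodity_collection = ["gold", "silver"]

print(se.get_related())   # ['gold', 'silver']
```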

SQLAlchemy: avoiding repetition in declarative style class definition

I'm using SQLAlchemy, and many classes in my object model have the same two attributes: id (an integer primary key) and name (a string). I'm trying to avoid declaring them in every class like so:
class C1(declarative_base()):
    id = Column(Integer, primary_key=True)
    name = Column(String)
    # ...

class C2(declarative_base()):
    id = Column(Integer, primary_key=True)
    name = Column(String)
    # ...
What's a good way to do that? I tried using metaclasses but haven't gotten it to work yet.
You could factor out your common attributes into a mixin class, and multiply inherit it alongside declarative_base():
from sqlalchemy import Column, Integer, String
from sqlalchemy.ext.declarative import declarative_base

class IdNameMixin(object):
    id = Column(Integer, primary_key=True)
    name = Column(String)

class C1(declarative_base(), IdNameMixin):
    __tablename__ = 'C1'

class C2(declarative_base(), IdNameMixin):
    __tablename__ = 'C2'

print(C1.__dict__['id'] is C2.__dict__['id'])
print(C1.__dict__['name'] is C2.__dict__['name'])
EDIT: You might think this would result in C1 and C2 sharing the same Column objects, but as noted in the SQLAlchemy docs, Column objects are copied when originating from a mixin class. I've updated the code sample to demonstrate this behavior.
Could you also use the Column's copy method? This way, fields can be defined independently of tables, and those fields that are reused are just field.copy()-ed.
id = Column(Integer, primary_key=True)
name = Column(String)

class C1(declarative_base()):
    id = id.copy()
    name = name.copy()
    # ...

class C2(declarative_base()):
    id = id.copy()
    name = name.copy()
    # ...
I think I got it to work.
I created a metaclass that derives from DeclarativeMeta, and made that the metaclass of C1 and C2. In that new metaclass, I simply said
def __new__(mcs, name, bases, attrs):
    attrs['__tablename__'] = name.lower()
    attrs['id'] = Column(Integer, primary_key=True)
    attrs['name'] = Column(String)
    return super().__new__(mcs, name, bases, attrs)
And it seems to work fine.
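The injection mechanism can be sketched self-contained in plain Python; here `AutoFieldsMeta` derives from type rather than DeclarativeMeta so the example runs without SQLAlchemy, and the None placeholders stand in for the Column objects the real metaclass would insert. All names are illustrative.

```python
class AutoFieldsMeta(type):
    """Injects __tablename__ and shared field names into every class."""

    def __new__(mcs, name, bases, attrs):
        attrs['__tablename__'] = name.lower()
        attrs.setdefault('id', None)    # stands in for Column(Integer, primary_key=True)
        attrs.setdefault('name', None)  # stands in for Column(String)
        return super().__new__(mcs, name, bases, attrs)

class C1(metaclass=AutoFieldsMeta):
    pass

class C2(metaclass=AutoFieldsMeta):
    pass

print(C1.__tablename__, C2.__tablename__)  # c1 c2
```

The mixin approach above is usually preferred because it needs no custom metaclass, but this shows what the metaclass route is doing under the hood.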