I am trying to create a program that loads over 100 tables from a database so that I can change all occurrences of a user's user ID.
Rather than map all of the tables individually, I decided to use a loop to map each of the tables using an array of objects. This way, the table definitions can be stored in a config file and later updated.
Here is my code so far:
def init_model(engine):
    """Call me before using any of the tables or classes in the model"""
    meta.Session.configure(bind=engine)
    meta.engine = engine

class Table:
    tableID = ''
    primaryKey = ''
    pkType = sa.types.String()

    class mappedClass(object):
        pass

WIW_TBL = Table()
LOCATIONS_TBL = Table()

WIW_TBL.tableID = "wiw_tbl"
WIW_TBL.primaryKey = "PORTAL_USERID"
WIW_TBL.pkType = sa.types.String()

LOCATIONS_TBL.tableID = "locations_tbl"
LOCATIONS_TBL.primaryKey = "LOCATION_CODE"
LOCATIONS_TBL.pkType = sa.types.Integer()

tableList = [WIW_TBL, LOCATIONS_TBL]

for i in tableList:
    i.tableID = sa.Table(i.tableID.upper(), meta.metadata,
                         sa.Column(i.primaryKey, i.pkType, primary_key=True),
                         autoload=True,
                         autoload_with=engine)
    orm.mapper(i.mappedClass, i.tableID)
The error that this code returns is:
sqlalchemy.exc.ArgumentError: Class '<class 'changeofname.model.mappedClass'>' already has a primary mapper defined. Use non_primary=True to create a non primary Mapper. clear_mappers() will remove *all* current mappers from all classes.
I can't use clear_mappers(), as it wipes out all of the classes, and the entity_name scheme doesn't seem to apply here.
It seems that every object wants to use the same class, even though each of them should have its own.
Does anyone have any ideas?
Well, in your case it *is* the same class that you are trying to map to different Tables. To solve this, create a class dynamically for each Table:
class Table(object):
    tableID = ''
    primaryKey = ''
    pkType = sa.types.String()

    def __init__(self):
        self.mappedClass = type('TempClass', (object,), {})
But I would prefer a slightly cleaner version:
class Table2(object):
    def __init__(self, table_id, pk_name, pk_type):
        self.tableID = table_id
        self.primaryKey = pk_name
        self.pkType = pk_type
        self.mappedClass = type('Class_' + self.tableID, (object,), {})

# ...

WIW_TBL = Table2("wiw_tbl", "PORTAL_USERID", sa.types.String())
LOCATIONS_TBL = Table2("locations_tbl", "LOCATION_CODE", sa.types.Integer())
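With Table2 in place, the reflection loop from the question only needs to map each instance's own dynamically created class. A minimal sketch, reusing the question's sa, orm, meta and engine names (assumptions about the surrounding setup) and storing the reflected Table on a separate table attribute instead of overwriting tableID:

    tableList = [WIW_TBL, LOCATIONS_TBL]

    for t in tableList:
        # reflect the table and map this instance's own class to it
        t.table = sa.Table(t.tableID.upper(), meta.metadata,
                           sa.Column(t.primaryKey, t.pkType, primary_key=True),
                           autoload=True,
                           autoload_with=engine)
        orm.mapper(t.mappedClass, t.table)  # each t.mappedClass is a distinct class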
I'm running into an issue using mongoengine. A raw query that works in Compass isn't working with __raw__ in mongoengine. I'd like to rewrite it using mongoengine's methods, but I'd also like to understand why it isn't working with __raw__ either.
I'm using an embedded document list field whose elements use inheritance. The query is "give me all sequences that have a 'TypeA' Assignment".
My schema:
class Sequence(Document):
    seq = StringField(required=True)
    samples = EmbeddedDocumentListField(Sample)
    assignments = EmbeddedDocumentListField(Assignment)

class Sample(EmbeddedDocument):
    name = StringField()

class Assignment(EmbeddedDocument):
    name = StringField()
    meta = {'allow_inheritance': True}

class TypeA(Assignment):
    pass

class TypeB(Assignment):
    other_field = StringField()
Writing {'assignments._cls': 'TypeA'} into Compass returns a list of results, but with mongoengine I get an empty queryset:
from mongo_objects import Sequence

def get_samples_assigned_as_class(cls: str):
    query_raw = Sequence.objects(__raw__={'assignments._cls': cls})  # raw query, fails
    # query2 = Sequence.objects(assignments___cls=cls)                # first attempt, failed
    # query3 = Sequence.objects.get().assignments.filter(cls=cls)     # second attempt, also failed; didn't like that it queried everything first
    print(query_raw)  # empty list, iterating does nothing

get_samples_assigned_as_class('TypeA')
"Assignments" is a list because one sequence may have multiples of the same class. An in depth awnser on how to query these lists for categorical information would be ideal, as I'm not sure how to properly go about it. I'm mostly filtering on the inheritence _cls, but eventually I'd like to do nested queries (cls : TypeA, sample : Sample_1)
Thanks
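(Not part of the original post.) One thing to check, as an assumption rather than a confirmed answer: with allow_inheritance enabled, mongoengine normally stores the full class path in _cls (e.g. 'Assignment.TypeA') rather than just the subclass name, so a raw filter on 'TypeA' may simply not match the stored value. A rough sketch of the queries under that assumption, using the models above ('some_name' is a placeholder):

    full_cls = 'Assignment.TypeA'  # assumption: check the exact stored value in Compass

    # Raw query against the stored _cls value:
    hits = Sequence.objects(__raw__={'assignments._cls': full_cls})

    # Nested conditions on a single assignments element via $elemMatch:
    nested = Sequence.objects(__raw__={
        'assignments': {'$elemMatch': {'_cls': full_cls, 'name': 'some_name'}}
    })

    # mongoengine's match operator is its spelling of $elemMatch:
    same_as_nested = Sequence.objects(assignments__match={'_cls': full_cls, 'name': 'some_name'})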
I know the question of how to duplicate or copy a SQLAlchemy mapped object has been asked many times. The answer always depends on the needs, or on how "duplicate" or "copy" is interpreted.
This is a specialized version of the question because I got the tip to use make_transient() for that.
But I have some problems with it. I don't really know how to handle the primary key (PK) here. In my use cases the PK is always autogenerated by SQLAlchemy (or by the DB in the background), but this doesn't happen for the new duplicated object.
The code below is a little bit pseudo-code.
import sqlalchemy as sa
import sqlalchemy.orm as sao
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm.session import make_transient
from sqlalchemy.orm.util import has_identity

_Base = declarative_base()
_engine = sa.create_engine('postgres://...')
_session = sao.sessionmaker(bind=_engine)()

class MachineData(_Base):
    __tablename__ = 'Machine'
    _oid = sa.Column('oid', sa.Integer, primary_key=True)

class TUnitData(_Base):
    __tablename__ = 'TUnit'
    _oid = sa.Column('oid', sa.Integer, primary_key=True)
    _machine_fk = sa.Column('machine', sa.Integer, sa.ForeignKey('Machine.oid'))
    _machine = sao.relationship("MachineData")

    def __str__(self):
        return '{}.{}: oid={}(hasIdentity={}) machine={}(fk={})' \
            .format(type(self), id(self),
                    self._oid, has_identity(self),
                    self._machine, self._machine_fk)
if __name__ == '__main__':
    # any query resulting in one persistent object
    obj = GetOneMachineDataFromDatabase()

    # there is a valid 'oid', has_identity == True
    print(obj)

    # should I call expunge() first?
    # remove the association with any session
    # and remove its "identity key"
    make_transient(obj)

    # 'oid' is still there, but has_identity == False
    print(obj)

    # THIS causes an error because the 'oid' still exists
    # and is not newly auto-generated (which is what should
    # happen, in my understanding)
    _session.add(obj)
    _session.commit()
After making an object instance transient, you have to remove its object id (the primary key value). Without an object id you can add it to the database again, which will generate a new object id for it.
if __name__ == '__main__':
    # the persistent object with an identity in the database
    obj = GetOneMachineDataFromDatabase()

    # make it transient
    make_transient(obj)

    # remove the identity / object id
    obj._oid = None

    # adding the object again generates a new identity / object id
    _session.add(obj)

    # this includes a flush() and creates a new primary key
    _session.commit()
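If this comes up more than once, the pattern can be wrapped in a small helper. This is only a sketch based on the answer above; the function name, the session/pk_attr parameters, and the assumption of a single autogenerated primary key column (here '_oid', as in the models above) are mine, not part of the original answer.

    from sqlalchemy.orm.session import make_transient

    def duplicate_row(session, obj, pk_attr='_oid'):
        # Detach obj from its session and identity, then re-insert it as a new row.
        make_transient(obj)          # drop the session association and identity key
        setattr(obj, pk_attr, None)  # let the database generate a fresh primary key
        session.add(obj)
        session.commit()             # flushes and assigns the new primary key
        return obj

    # usage: duplicated = duplicate_row(_session, GetOneMachineDataFromDatabase())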
I am trying to write a logging system that uses dynamic classes to create its tables. Creating the classes and the tables seems to work fine, but trying to put entries into them leads to an error message about mapping. Below are the sample code and the error message.
Base = declarative_base()

# my init function
def tableinit(self, keyargs):
    self.__dict__ = dict(keyargs)

# table creation
tableName = "newTable"
columnsDict["__tablename__"] = tableName
columnsDict["__init__"] = tableinit
columnsDict["id"] = Column("id", Integer, autoincrement=True, nullable=False, primary_key=True)
columnsDict["pid"] = Column("pid", Integer, ForeignKey('someparenttable.id'))  # someparenttable is created with a hard-coded class
newTable = type(tableName, (Base,), columnsDict)
tableClassDict[tableName] = newTable

# when doing an entry
newClassInst = subEntryClassDict[tableName]
newEntry = newClassInst(dataDict)
entryList.append(newEntry)       # this is called in a for loop, together with the entries for someparenttable
self.session.add_all(entryList)  # at this point the error occurs
The error:
UnmappedInstanceError: Class 'newTable' is mapped, but this instance lacks instrumentation. This occurs when the instance is created before sqlalchemy.orm.mapper(module.newTable) was called.
This is easier if you create a function to return a class that you set up normally. I've tried something like this and it works:
def getNewTable(db, table):
    class NewTable(Base):
        __tablename__ = table
        __table_args__ = {'schema': db}
        id = Column( ...
    return NewTable

newClassInst = getNewTable('somedb', 'sometable')
newRow = newClassInst(data)
As the error description says, this problem is caused by the instance lacking the instrumentation the ORM adds to mapped classes, and I think it is actually triggered by self.__dict__ = dict(keyargs), which replaces that instrumentation wholesale.
So it can be solved by rewriting the init so that it does not clobber what the ORM has injected.
Turn this:

# my init function
def tableinit(self, keyargs):
    self.__dict__ = dict(keyargs)

into this:

# my init function
def tableinit(self, **kwargs):
    self.__dict__.update(kwargs)
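For completeness, a minimal sketch of how the fixed init slots into the dynamic class creation from the question. This is an assumption-laden illustration, not the original code: Base, dataDict and session stand in for the question's own objects, and because the signature is now **kwargs the data dict gets unpacked at the call site.

    from sqlalchemy import Column, Integer

    # __init__ that updates __dict__ instead of replacing it, so the
    # instance state set up by SQLAlchemy's instrumentation survives
    def tableinit(self, **kwargs):
        self.__dict__.update(kwargs)

    columnsDict = {
        "__tablename__": "newTable",
        "__init__": tableinit,
        "id": Column("id", Integer, autoincrement=True, nullable=False, primary_key=True),
    }
    newTable = type("newTable", (Base,), columnsDict)

    newEntry = newTable(**dataDict)  # note the ** unpacking with the new signature
    session.add(newEntry)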
I'm trying to build some generic apps using SQLAlchemy, such as tags or ratings for any model, but I couldn't find any help in the docs. I really liked what I could do with the Django contenttypes framework. Is there any similar functionality in SQLAlchemy?
I once wrote some example code about something similar to this (see http://taketwoprogramming.blogspot.com/2009/08/reusable-sqlalchemy-models.html).
The basic idea is that you can create a model like this:
@commentable
class Post(Base):
    __tablename__ = 'posts'
    id = sa.Column(sa.Integer, primary_key=True)
    text = sa.Column(sa.String)
...where commentable is defined like this...
class BaseComment(object):
    pass

def build_comment_model(clazz):
    class_table_name = str(class_mapper(clazz).local_table)
    metadata = clazz.metadata
    comment_class_name = clazz.__name__ + 'Comment'
    comment_class = type(comment_class_name, (BaseComment,), {})
    comment_table_name = class_table_name + '_comments'
    comment_table = sa.Table(comment_table_name, metadata,
        sa.Column('id', sa.Integer, primary_key=True),
        sa.Column(class_table_name + '_id',
                  sa.Integer,
                  sa.ForeignKey(class_table_name + '.id')),
        sa.Column('text', sa.String),
        sa.Column('name', sa.String(100)),
        sa.Column('url', sa.String(255)),
    )
    mapper(comment_class, comment_table)
    return comment_class, comment_table

def commentable(clazz):
    comment_class, comment_table = build_comment_model(clazz)
    clazz.Comment = comment_class
    setattr(clazz, 'comments', relation(comment_class))

    def add_comment(self, comment):
        self.comments.append(comment)

    setattr(clazz, 'add_comment', add_comment)
    return clazz
Basically, the commentable decorator dynamically creates a new mapped class and table, and adds some helper methods to the decorated class. This is the test I used to check that the code works, and it shows an example of how it would be used:
class TestModels(SATestCase):
    def test_make_comment(self):
        p = Post()
        p.text = 'SQLAlchemy is amazing!'

        text = 'I agree!'
        name = 'Mark'
        url = 'http://www.sqlalchemy.org/'

        c = Post.Comment()
        c.text = text
        c.name = name
        c.url = url
        p.add_comment(c)
        Session.add(p)

        # This is a method I use to force the reload of the objects from
        # the database to make sure that when I test them, I'm actually
        # pulling from the database rather than just getting the data
        # of the object still in the session.
        p = self.reload(p)

        self.assertEquals(len(p.comments), 1)
        c = p.comments[0]
        self.assertEquals(c.text, text)
        self.assertEquals(c.name, name)
        self.assertEquals(c.url, url)
I wrote this a while ago, and I don't think there's anything in SQLAlchemy that will do this kind of thing for you, but you can create something similar without too much trouble. In my example, a class decorator creates the new mapped classes and helper methods on the fly.
I never made much use of it, but it might give you some ideas.
I have some problems with setting up the dictionary collection in Python's SQLAlchemy:
I am using declarative table definitions. I have an Item table in a 1:N relation with a Record table. I set up the relation using the following code:
_Base = declarative_base()

class Record(_Base):
    __tablename__ = 'records'
    item_id = Column(String(M_ITEM_ID), ForeignKey('items.id'))
    id = Column(String(M_RECORD_ID), primary_key=True)
    uri = Column(String(M_RECORD_URI))
    name = Column(String(M_RECORD_NAME))

class Item(_Base):
    __tablename__ = 'items'
    id = Column(String(M_ITEM_ID), primary_key=True)
    records = relation(Record, collection_class=column_mapped_collection(Record.name), backref='item')
Now I want to work with the Items and Records. Let's create some objects:
i1 = Item(id='id1')
r = Record(id='mujrecord')
And now I want to associate these objects using the following code:
i1.records['source_wav'] = r
but the Record r doesn't get its name attribute (the key column) set. Is there any way to ensure this automatically? (I know that setting the name when the Record is created works, but that doesn't feel right to me.)
Many thanks
You want something like this:
from sqlalchemy.orm import validates

class Item(_Base):
    [...]

    @validates('records')
    def validate_record(self, key, record):
        assert record.name is not None, "Record fails validation, must have a name"
        return record
With this, you get the desired validation:
>>> i1 = Item(id='id1')
>>> r = Record(id='mujrecord')
>>> i1.records['source_wav'] = r
Traceback (most recent call last):
[...]
AssertionError: Record fails validation, must have a name
>>> r.name = 'foo'
>>> i1.records['source_wav'] = r
>>>
I can't comment yet, so I'm just going to write this as a separate answer:
from sqlalchemy.orm import validates

class Item(_Base):
    [...]

    @validates('records')
    def validate_record(self, key, record):
        record.name = key
        return record
This is basically a copy of Gunnlaugur's answer but abusing the validates decorator to do something more useful than exploding.
You have:
backref='item'
Is this a typo for
backref='name'
?