I want to get all objects that are related to an instance of a model.
Because my code is fairly generic, I pass the related table name as a string and use the eval() function to convert it to the related table class, but I get an error.
Suppose we have an instance of a table like self.casefile; this is part of my code:
def related_models_migration(self):
    opts = self.casefile._meta
    table_name = 'Files'
    for f in opts.many_to_many:
        name = ''.join(f.name.split('_'))
        table_name += name.capitalize()
        objects = self.casefile.eval(table_name).all()
and I got this error:
AttributeError Traceback (most recent call last)
<ipython-input-6-025484eeba97> in <module>
----> 1 obj.related_models_migration()
~/Documents/kangaroo/etl/data_migration.py in related_models_migration(self)
28 name = ''.join(f.name.split('_'))
29 table_name += name.capitalize()
---> 30 objects = self.casefile.eval(table_name).all()
31
32 for d in dir(etl.models):
AttributeError: 'FilesCasefiles' object has no attribute 'eval'
How can I pass the class name?
You cannot use eval(..) for that. What you probably want to use here is getattr(..):
def related_models_migration(self):
    opts = self.casefile._meta
    table_name = 'Files'
    for f in opts.many_to_many:
        name = ''.join(f.name.split('_'))
        table_name += name.capitalize()
        objects = getattr(self.casefile, table_name).all()
I am not sure you should use table_name += … here, however, since it keeps appending more content to table_name on each iteration. You likely want something like table_name = 'Files{}'.format(name.capitalize()).
Note: normally related fields are not capitalized. One writes users or user_set, not Users.
Django provides a way to do this, although you do need to specify the name of the app in which the model is defined (because it's possible to have two models with the same name in different apps).
apps.get_model(app_label, model_name, require_ready=True)
Returns the Model with the given app_label and model_name. As a shortcut, this method also accepts a single argument in the form app_label.model_name. model_name is case-insensitive.
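A minimal sketch of how that could look here (assuming the app label is etl, which is only a guess based on the etl.models reference in the traceback; FilesCasefiles is the model name from the error message):

from django.apps import apps

# 'etl' is an assumed app label; 'FilesCasefiles' is the model name
# reported in the AttributeError above.
Model = apps.get_model('etl', 'FilesCasefiles')
objects = Model.objects.all()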
I'm running into an issue using mongoengine. A raw query that works in Compass isn't working with __raw__ in mongoengine. I'd like to rewrite it using mongoengine's own query methods, but I'd also like to understand why it isn't working with __raw__ either.
I have an embedded document list field that uses inheritance. The query is: "give me all sequences that have a TypeA Assignment".
My schema:
from mongoengine import Document, EmbeddedDocument, EmbeddedDocumentListField, StringField

class Sample(EmbeddedDocument):
    name = StringField()

class Assignment(EmbeddedDocument):
    name = StringField()
    meta = {'allow_inheritance': True}

class TypeA(Assignment):
    pass

class TypeB(Assignment):
    other_field = StringField()

class Sequence(Document):
    seq = StringField(required=True)
    samples = EmbeddedDocumentListField(Sample)
    assignments = EmbeddedDocumentListField(Assignment)
Writing {'assignments._cls': 'TypeA'} into Compass returns a list of documents, but with mongoengine I get an empty result:

from mongo_objects import Sequence

def get_samples_assigned_as_class(cls: str):
    query_raw = Sequence.objects(__raw__={'assignments._cls': cls})  # raw query, fails
    # query2 = Sequence.objects(assignments___cls=cls)  # first attempt, failed
    # query3 = Sequence.objects.get().assignments.filter(cls=cls)  # second attempt, also failed; didn't like that it queried everything first
    print(query_raw)  # empty list, iterating does nothing

get_samples_assigned_as_class('TypeA')
"Assignments" is a list because one sequence may have multiples of the same class. An in depth awnser on how to query these lists for categorical information would be ideal, as I'm not sure how to properly go about it. I'm mostly filtering on the inheritence _cls, but eventually I'd like to do nested queries (cls : TypeA, sample : Sample_1)
Thanks
In Django, can I re-use an existing Q object on multiple models, without writing the same filters twice?
I was thinking about something along the lines of the pseudo-Django code below, but did not find anything relevant in the documentation:
class Author(Model):
    name = TextField()
    company_name = TextField()

class Book(Model):
    author = ForeignKey(Author)

# Create a Q object for the Author model
q_author = Q(company_name="Books & co.")

# Use it to retrieve Book objects
qs = Book.objects.filter(author__matches=q_author)
If that is not possible, can I extend an existing Q object to work on a related field? Pseudo-example:

# q_book == Q(author__company_name="Books & co.")
q_book = q_author.extend("author")

# Use it to retrieve Book objects
qs = Book.objects.filter(q_book)
The only thing I've found that comes close is using a subquery, which is a bit unwieldy:
qs = Book.objects.filter(author__in=Author.objects.filter(q_author))
From what I can tell from your comment, it looks like you're just trying to pass a set of common arguments to multiple filters; to do that you can simply unpack a dictionary.
The values in the dictionary can still be Q objects if required, just as if they were values you would pass to the filter arguments normally:
args = { 'author__company_name': "Books & co" }
qs = Book.objects.filter(**args)
args['author__name'] = 'Foo'
qs = Book.objects.filter(**args)
To share this between different models, you'd have to do some dictionary mangling, e.g. stripping the author__ prefix so the same lookups can be applied directly to Author:
author_args = {(k[len('author__'):] if k.startswith('author__') else k): v
               for k, v in args.items()}
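Going the other way (pointing author-level lookups at Book's author relation) can be done with a small helper; prefix_lookups below is just a hypothetical name, not something Django provides:

from django.db.models import Q

def prefix_lookups(prefix, **lookups):
    # turns company_name="Books & co." into Q(author__company_name="Books & co."), etc.
    return Q(**{'{}__{}'.format(prefix, k): v for k, v in lookups.items()})

author_lookups = {'company_name': "Books & co."}

# the same lookups, reused on both models
authors = Author.objects.filter(**author_lookups)
books = Book.objects.filter(prefix_lookups('author', **author_lookups))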
You can do this:
books = Book.objects.filter(author__company_name="Books & co")
Having these 2 MongoEngine Documents:
from mongoengine import Document, StringField, BooleanField, ReferenceField

class A(Document):
    a = StringField()

class B(Document):
    b = StringField()
    boolfield = BooleanField(default=False)
    ref = ReferenceField(A)
I'd like first to filter() on a specific A object, and then, from the first query, filter() on the BooleanField. But these lines cause an error:
a_objects = A.objects(a='test') # OK
query = B.objects(ref__in=a_objects) # OK
query2 = query.filter(boolfield=True) # FAILS
The error is:
TypeError: 'Collection' object is not callable. If you meant to call the '__deepcopy__' method on a 'Collection' object it is failing because no such method exists.
See the full code and traceback here: https://gist.github.com/nferrari/4962245
Thanks!
It seems that querying on reference fields can't be chained in 0.7.8, so for the time being please use a dictionary and pass it in as kwargs as a workaround, e.g.:
a_objects = A.objects(a='test')
query_dict = {'ref__in': a_objects}
query_dict['boolfield'] = True
self.assertEquals(B.objects(**query_dict).count(), 1)
I have added: https://github.com/MongoEngine/mongoengine/issues/234 to be fixed in 0.8
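For reference, once that issue is fixed, the chained form from the question should work directly (this is only a sketch of the expected behaviour in a release containing the fix, not something verified against 0.7.8):

a_objects = A.objects(a='test')
query2 = B.objects(ref__in=a_objects).filter(boolfield=True)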
I am trying to create a program that loads over 100 tables from a database so that I can change all occurrences of a user's user ID.
Rather than map all of the tables individually, I decided to use a loop to map each table from an array of objects. This way, the table definitions can be stored in a config file and updated later.
Here is my code so far:
def init_model(engine):
    """Call me before using any of the tables or classes in the model"""
    meta.Session.configure(bind=engine)
    meta.engine = engine

class Table:
    tableID = ''
    primaryKey = ''
    pkType = sa.types.String()

    class mappedClass(object):
        pass

WIW_TBL = Table()
LOCATIONS_TBL = Table()

WIW_TBL.tableID = "wiw_tbl"
WIW_TBL.primaryKey = "PORTAL_USERID"
WIW_TBL.pkType = sa.types.String()

LOCATIONS_TBL.tableID = "locations_tbl"
LOCATIONS_TBL.primaryKey = "LOCATION_CODE"
LOCATIONS_TBL.pkType = sa.types.Integer()

tableList = [WIW_TBL, LOCATIONS_TBL]

for i in tableList:
    i.tableID = sa.Table(i.tableID.upper(), meta.metadata,
                         sa.Column(i.primaryKey, i.pkType, primary_key=True),
                         autoload=True,
                         autoload_with=engine)

    orm.mapper(i.mappedClass, i.tableID)
The error that this code returns is:
sqlalchemy.exc.ArgumentError: Class '<class 'changeofname.model.mappedClass'>' already has a primary mapper defined. Use non_primary=True to create a non primary Mapper. clear_mappers() will remove *all* current mappers from all classes.
I can't use clear_mappers, as it wipes all of the classes, and the entity_name scheme doesn't seem to apply here.
It seems that every object wants to use the same class, although they should each have their own instance of it.
Does anyone have any ideas?
Well, in your case it is the same class that you try to map to different Tables. To solve this, create a class dynamically for each Table:
class Table(object):
    tableID = ''
    primaryKey = ''
    pkType = sa.types.String()

    def __init__(self):
        self.mappedClass = type('TempClass', (object,), {})
But I would prefer a slightly cleaner version:
class Table2(object):
    def __init__(self, table_id, pk_name, pk_type):
        self.tableID = table_id
        self.primaryKey = pk_name
        self.pkType = pk_type
        self.mappedClass = type('Class_' + self.tableID, (object,), {})

# ...

WIW_TBL = Table2("wiw_tbl", "PORTAL_USERID", sa.types.String())
LOCATIONS_TBL = Table2("locations_tbl", "LOCATION_CODE", sa.types.Integer())
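The mapping loop from the question should then work unchanged, since each instance now carries its own dynamically created class. A sketch, assuming the same sa, orm, meta and engine objects as in the question:

tableList = [WIW_TBL, LOCATIONS_TBL]

for i in tableList:
    # reflect the table and map this instance's own class to it
    i.tableID = sa.Table(i.tableID.upper(), meta.metadata,
                         sa.Column(i.primaryKey, i.pkType, primary_key=True),
                         autoload=True,
                         autoload_with=engine)
    orm.mapper(i.mappedClass, i.tableID)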
I have some problems with setting up a dictionary collection in Python's SQLAlchemy:
I am using declarative definition of tables. I have an Item table in a 1:N relation with a Record table. I set up the relation using the following code:
_Base = declarative_base()

class Record(_Base):
    __tablename__ = 'records'

    item_id = Column(String(M_ITEM_ID), ForeignKey('items.id'))
    id = Column(String(M_RECORD_ID), primary_key=True)
    uri = Column(String(M_RECORD_URI))
    name = Column(String(M_RECORD_NAME))

class Item(_Base):
    __tablename__ = 'items'

    id = Column(String(M_ITEM_ID), primary_key=True)
    records = relation(Record, collection_class=column_mapped_collection(Record.name), backref='item')
Now I want to work with the Items and Records. Let's create some objects:
i1 = Item(id='id1')
r = Record(id='mujrecord')
And now I want to associate these objects using the following code:
i1.records['source_wav'] = r
but the Record r doesn't have its name attribute set (the foreign key). Is there any way to ensure this automatically? (I know that setting the foreign key during Record creation works, but that doesn't feel right to me.)
Many thanks
You want something like this:
from sqlalchemy.orm import validates

class Item(_Base):
    [...]

    @validates('records')
    def validate_record(self, key, record):
        assert record.name is not None, "Record fails validation, must have a name"
        return record
With this, you get the desired validation:
>>> i1 = Item(id='id1')
>>> r = Record(id='mujrecord')
>>> i1.records['source_wav'] = r
Traceback (most recent call last):
[...]
AssertionError: Record fails validation, must have a name
>>> r.name = 'foo'
>>> i1.records['source_wav'] = r
>>>
I can't comment yet, so I'm just going to write this as a separate answer:
from sqlalchemy.orm import validates

class Item(_Base):
    [...]

    @validates('records')
    def validate_record(self, key, record):
        record.name = key
        return record
This is basically a copy of Gunnlaugur's answer but abusing the validates decorator to do something more useful than exploding.
You have backref='item'. Is this a typo for backref='name'?