The Stack Overflow question below mentions how to fetch all related objects in Django:
Get all related Django model objects
In short, the solution that best fits my needs would be something like the code below:
from django.contrib.admin.utils import NestedObjects
collector = NestedObjects(using="default")  # database alias from settings
collector.collect([objective])  # expects a list of objects; a single one won't do
print(collector.data)
Basically, I'm trying to figure out how to take the same approach using SQLAlchemy.
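For reference, the direction I was imagining in SQLAlchemy looks roughly like the sketch below; it just walks each mapped relationship with inspect(). The function name and the cycle handling are my own invention, not a known equivalent of NestedObjects:

from sqlalchemy import inspect

def collect_related(obj, seen=None):
    # Recursively gather every object reachable through mapped relationships.
    # 'obj' is any mapped instance; 'seen' guards against cycles.
    seen = seen if seen is not None else set()
    if obj in seen:
        return seen
    seen.add(obj)
    mapper = inspect(type(obj))
    for rel in mapper.relationships:
        value = getattr(obj, rel.key)
        if value is None:
            continue
        related = value if isinstance(value, (list, set)) else [value]
        for item in related:
            collect_related(item, seen)
    return seen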
Thanks in advance!!
I am working on a project where all the MongoDB collections contain mandatory fields.
While modeling the same in FastAPI, I am trying to create an ABC (Abstract Base Class) for the mandatory fields and inherit from it in child classes.
The issue is: the code is not picking up the fields from the ABC class at all.
This URL says "Models can't be inherited".
My environment is: Python + FastAPI + MongoDB. I am using ODMantic for MongoDB operations.
Is there any workaround for this issue? Any help is much appreciated.
OK, they can't be inherited, but what do you need them for? I don't see the real question here.
I can only make assumptions about what you may need:
If you need to validate the input, then FastAPI has you covered with pydantic. See https://fastapi.tiangolo.com/tutorial/body/?h=pydantic#create-your-data-model . You can then create the ODMantic model by passing the input as a dictionary (omodel(**model_name.dict()), or whatever names you use); there is a sketch of this at the end of this answer.
If you want to reduce the amount of copy-and-paste code, or want the two models to share a common base, the link you mentioned has docs on how to integrate it with FastAPI: https://art049.github.io/odmantic/usage_fastapi/
Apart from the two points above, I do not understand what other needs you could have. If this answer did not get you on the right path, let me know, but please first be more specific about your goal.
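To make the first point concrete, here is a rough sketch of what I mean; the model and field names are invented for illustration:

from pydantic import BaseModel
from odmantic import Model

class ItemIn(BaseModel):       # request body, validated by FastAPI/pydantic
    name: str
    quantity: int

class Item(Model):             # ODMantic document with the mandatory fields
    name: str
    quantity: int

payload = ItemIn(name="screw", quantity=10)
item = Item(**payload.dict())  # build the ODMantic model from the validated input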
I am currently trying to connect my new Django REST API to my already existing MongoDB database by copying the structure of my database objects into models. The problem I ran into is that I set up a structure like this in my DB:
{
objects: { DE: [], US: [] }
}
The attributes DE and US can be anything here (any geo, for that matter). Is there any way I can incorporate this kind of pattern in my Djongo model?
If by anything, you truly mean anything (or at least more than a few types of data), you could set up the model(s) as follows:
from djongo import models
...
class ObjectDataModel(models.Model):
    US = models.ListField()
    DE = models.ListField()

    class Meta:
        abstract = True  # Stops a database table from being made
...
class YourModel(models.Model):
    objects = models.ArrayModelField(model_container=ObjectDataModel)
You could also add custom validation if you want the ListFields to not just take anything under the sun; here's how to do that.
NOTE: This makes the objects field completely inaccessible via the Django Admin website; this is simply because the Admin site cannot possibly represent all possible input types that a ListField might be able to handle for the user (you can still submit values to the field via your forms/views, however).
You can also design a custom field, if you have the time to do so. I am (sadly) not terribly familiar with the geo field, so I'll instead point you here for instructions on how to go about that. You might also want to look at how Djongo's author went about implementing the ListField mentioned prior; it might give a hint on how to make list-like database entries. Here's the raw code for that.
Hope this helps!
I am working on a site, and I have both Followers and Following as two ReferenceFields attached to a User. Is this overkill? I felt like it might make the querying a little easier, but is this just asking for trouble in the long run?
Something like:
class User(Document):
    name = StringField()
    followers = ListField(ReferenceField('User'))
    following = ListField(ReferenceField('User'))
I wasn't sure what best practice would be for this kind of relationship, or whether it's easier to just keep one of the reference lists and filter when I query the database. That would prevent instances where the follower and following lists did not match, but it also seemed like it would make database queries more complicated.
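The single-list alternative I'm weighing would look roughly like this (untested sketch; alice stands in for any User instance):

class User(Document):
    name = StringField()
    # Keep only one side of the relationship stored on the document.
    followers = ListField(ReferenceField('User'))

# Followers are read straight off the document:
alice_followers = alice.followers
# The other direction is derived with a query (users whose followers list contains alice):
alice_following = User.objects(followers=alice)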
I see this article on the MongoDB website, which is helpful, but I don't feel it directly answers my question: https://docs.mongodb.com/ecosystem/use-cases/storing-comments/.
I need a way of reading data in Django from different DBs using different models, because the models and fields in the DB have changed between the projects.
What I try to do is this:
from sbo.core import models as sbo_core_models
from sbo_cloud.core import models as cloud_core_models
company_details = sbo_core_models.CompanyDetails.objects.using('sbo').filter(company=sbo_company).order_by("id")[0]
new_company_details = cloud_core_models.CompanyDetails.objects.get(id=int(reply['id']))
The model that actually gets used for new_company_details is sbo_core_models.CompanyDetails and not cloud_core_models.CompanyDetails: the resulting object is missing properties that only exist in the second one.
Any idea why this might happen and what I am doing wrong? From what I've seen, it uses the model that I import first, no matter which model I tell it to use.
I am using Python 2.7 and Django 1.3.3.
All you need is in the Django documentation:
https://docs.djangoproject.com/en/1.3/topics/db/multi-db/
I think that the best solution for your case is a database router.
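A minimal router sketch along those lines; the module prefix, the 'sbo' database alias, and the 'myproject.routers' path are assumptions taken from your snippet and will need adjusting:

# routers.py
class SboRouter(object):
    def db_for_read(self, model, **hints):
        # Send models defined under sbo.core to the 'sbo' database.
        if model.__module__.startswith('sbo.core'):
            return 'sbo'
        return None  # everything else falls back to the default database

    def db_for_write(self, model, **hints):
        return self.db_for_read(model, **hints)

# settings.py
DATABASE_ROUTERS = ['myproject.routers.SboRouter']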
I'm looking for a way to easily filter a collection of Model objects without hitting the database each time. By definition, QuerySets are lazy and will always hit the DB when evaluated. So I am wondering if anything exists that can do this. If not, perhaps it's a good library to create.
For example:
all_records = object_set(Record.objects.filter(company=user.company))
object_set being a hypothetical function which would gather all of the objects in a QuerySet as static data. The result would be an "object manager" instance that could have filters run against it, similar to QuerySet filters. This would be particularly useful for creating, updating, and deleting objects based on data from multidimensional lists.
for row in data:
    for col in row:
        # This would not hit the DB; it only filters within the "object_set" in memory.
        all_records.filter(date=col.date, type=col.type, creator=col.user)
I realize I may be trying to solve this the wrong way, but regardless, I think this would be a great tool to have in Django. Does anyone know of an existing library or functionality within Django that would solve this problem? Thanks in advance!
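To make the behaviour I'm after concrete, here is a plain-Python sketch; object_set and filter_objects are purely illustrative, not an existing API:

def object_set(queryset):
    # Evaluate the queryset once: a single database hit, results held in memory.
    return list(queryset)

def filter_objects(objects, **lookups):
    # Pure in-memory filtering; no further queries are issued.
    return [obj for obj in objects
            if all(getattr(obj, field) == value for field, value in lookups.items())]

all_records = object_set(Record.objects.filter(company=user.company))
created_by_me = filter_objects(all_records, creator=user)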
I think the QuerySet's select_related method is what you want:
https://docs.djangoproject.com/en/dev/ref/models/querysets/#select-related
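For instance, something along these lines (the 'company' field comes from your example; the 'name' attribute is just assumed for illustration):

# select_related joins the related company in the same query, so reading
# record.company afterwards does not trigger extra queries.
records = Record.objects.select_related('company').filter(company=user.company)
for record in records:
    print(record.company.name)  # no additional database hit here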
Please check out managers.py in the following project: django-model-utils/.../managers.py
It should show you how the author implemented the queryset:
def get_query_set(self):
    qs = super(QueryManager, self).get_query_set().filter(self._q)
    if self._order_by is not None:
        return qs.order_by(*self._order_by)
    return qs
If long datasets are your motivation for this question, use a Redis cache in your Django project.
http://unfoldthat.com/2011/09/14/try-redis-instead.html
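For example, with a Redis-backed cache configured in Django, something like this rough sketch keeps the evaluated queryset out of the database on subsequent reads (the cache key and timeout are arbitrary):

from django.core.cache import cache

records = cache.get('company_records')
if records is None:
    # One database hit, then the evaluated list lives in the cache.
    records = list(Record.objects.filter(company=user.company))
    cache.set('company_records', records, 60 * 15)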