Can django lazy-load fields in a model?

One of my django models has a large TextField which I often don't need to use. Is there a way to tell django to "lazy-load" this field? i.e. not to bother pulling it from the database unless I explicitly ask for it. I'm wasting a lot of memory and bandwidth pulling this TextField into python every time I refer to these objects.
The alternative would be to create a new table for the contents of this field, but I'd rather avoid that complexity if I can.

The functionality happens when you make the query, using the defer() method, rather than in the model definition. Check it out in the docs:
http://docs.djangoproject.com/en/dev/ref/models/querysets/#defer
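For example, a minimal sketch assuming a model called Entry with a large TextField named body (both names are placeholders):

entries = Entry.objects.defer("body")   # the SELECT skips the body column
for entry in entries:
    print(entry.title)    # already loaded, no extra query
    # entry.body          # accessing the deferred field would trigger a separate query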
That said, your alternative of refactoring and pulling the data into another table is actually a really good solution. Some people would say that the need to lazy-load fields means there is a design flaw, and that the data should have been modeled differently.
Either way works, though!

There are two options for lazy-loading in Django: https://docs.djangoproject.com/en/1.6/ref/models/querysets/#django.db.models.query.QuerySet.only
defer(*fields)
Avoid loading those fields that require expensive processing to convert them to Python objects:
Entry.objects.defer("text")
only(*fields)
Only load the fields that you actually need:
Person.objects.only("name")
Personally, I think only is better than defer since the code is not only easier to understand, but also more maintainable in the long run.

For something like this you can just override the default manager. Usually, it's not advised but for a defer() it makes sense:
class CustomManager(models.Manager):
    def get_queryset(self):
        # Defer the large TextField on every queryset by default.
        return super(CustomManager, self).get_queryset().defer('YOUR_TEXTFIELD_FIELDNAME')

class DjangoModel(models.Model):
    objects = CustomManager()
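With that in place, the normal queryset API defers the field automatically; a small sketch of the behaviour (YOUR_TEXTFIELD_FIELDNAME is a placeholder as above):

obj = DjangoModel.objects.first()      # SELECT skips the deferred TextField
text = obj.YOUR_TEXTFIELD_FIELDNAME    # accessing it triggers one extra query

If you sometimes need the field loaded up front, you can also keep a second, plain manager on the model (e.g. all_objects = models.Manager()).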

Related

Converting existing python classes into Django Models

I have a small program with a command line interface that uses a number of python classes with thorough implementations. I want to scrap the command line interface and wrap the app within a Django app, but I'm just learning Django and I'm unfamiliar with the conventions.
I have a number of classes, in-memory storage structures, getters/setters etc and I'd like to convert them into Django models so that I can persist them to the database and interact with them around the django app. Is there a general approach for doing something like this?
Should I just inherit the django.db.models.Model class in my existing classes and set them up for direct interaction? Or is there a better, more general/conventional way to do this?
I would like to be able to use all of this code in other apps, not necessarily Django ones, so I don't really want to modify my existing classes in a way that would make them work only with Django. I thought of creating the models separately and then a sort of middle-man class to manage interaction between the actual in-memory classes and the Django model classes, but that just seems like more places I have to make changes when I extend or modify the code.
Thanks for any help ahead of time...
Personally, I would modify your existing classes to extend models.Model and maintain separate versions of these classes for use outside of Django.
This will keep your classes lean and maintainable within their respective environments.
You could also create a new class that extends both models.Model and your Python class through multiple inheritance. However, this will result in duplicate fields for the same data.
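For the separate-versions approach, a pair of small conversion helpers can keep the two classes in sync; a rough sketch (the class and field names are made up):

# plain_models.py -- framework-agnostic class, usable outside Django
class Thing(object):
    def __init__(self, name, size):
        self.name = name
        self.size = size

# models.py -- Django counterpart with explicit conversion helpers
from django.db import models

class ThingModel(models.Model):
    name = models.CharField(max_length=100)
    size = models.IntegerField()

    @classmethod
    def from_plain(cls, thing):
        # Build a persistable model from the in-memory object.
        return cls(name=thing.name, size=thing.size)

    def to_plain(self):
        # Hand back the plain object for use outside Django.
        return Thing(name=self.name, size=self.size)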
If you would like, post an example Model as a new question and tag me in a link to it here, and I can help you convert it.
One of Django's greatest strengths is its ORM; I recommend you use it. Yes, you will probably need to rewrite the part that interacts with the database, but if you have already isolated those functions in a models module/classes, the modification won't be that hard.
Although in your case I would recommend checking out Tornado/aiohttp, since it looks like you are just trying to create an interface for your functions.

Django model with hundreds of fields

I have a model with hundreds of properties. The properties can be of different types (integer, strings, uploaded files, ...). I would like to implement this complex model step by step, starting with the most important properties. I can think of two options:
Define the properties as regular model fields
Define a separate model to hold each property separately, and link it to the main model with a ForeignKey
I have not found any suggestions on how to handle models with lots of properties with django. What are the advantages / drawbacks of both approaches?
You definitely should not define your properties as ForeignKeys. Every time you need a full model, your database server will have to make hundreds of JOINs, therefore ruining your performance.
If your properties are needed almost every time you access the model, you should keep them in the same model. If not, you could make a separate Properties model and link it to your original model via OneToOneField.
I personally had such an experience. We had to build a hotel recommendation engine, and we were using Drupal back then. As Drupal stores every custom property in a separate MySQL table, we quickly realised we should switch frameworks, because every single query crashed our production servers (20+ JOINs are a deadly thing for MySQL). BTW, we ended up using a custom solution based on ElasticSearch, which handles hundreds of fields just fine.
Update: If you're lucky enough to be using a recent version of PostgreSQL, you could leverage the JSONField storage to pack all your fields to a single model field. Note, though, that you'll have to implement a validation scheme by yourself.
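A rough sketch of that approach (model and field names are invented; on Django 3.1+ the field is django.db.models.JSONField, older versions used django.contrib.postgres.fields.JSONField):

from django.db import models

class Product(models.Model):
    # a handful of important, queryable columns...
    name = models.CharField(max_length=200)
    # ...and everything else packed into one JSON blob
    properties = models.JSONField(default=dict)

# Usage:
# Product.objects.create(name="Widget", properties={"color": "red", "weight_g": 120})
# Product.objects.filter(properties__color="red")   # PostgreSQL can query JSON keys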
Regarding this being a "customer requirement": first off, I feel your pain and wish you the best! I want to reiterate that, if this weren't a hard requirement, you should first be looking to change the design, as there should never be any need for hundreds of properties on a single object; it normally indicates a need for an array, inheritance, or separate classes, etc.
Going forward, you're going to need to make heavy use of values() and values_list() so that you only pull the properties you actually need from the database, since performance will otherwise be severely crippled.
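For instance, a minimal sketch (Building and its fields are invented names):

# Returns dicts / flat tuples containing only the requested columns,
# instead of fully populated model instances.
Building.objects.values("id", "name", "height")
Building.objects.values_list("id", flat=True)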
Since you can't do anything about the model itself, you should try to address your performance issues from the design side of things. The single responsibility principle should feature heavily in your website, which means any given view will only ever need a few values from the model. This way it really won't make much difference which option you choose, since what is returned will be very limited.
Filter where you can, and use ordering sparingly.
You could group them into a few separate models, linked by OneToOneFields to the main model. That would "namespace" your data, and namespaces are "one honking great idea".
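A small sketch of that grouping (model and field names are invented):

from django.db import models

class Building(models.Model):
    name = models.CharField(max_length=200)

class BuildingDimensions(models.Model):
    building = models.OneToOneField(Building, on_delete=models.CASCADE,
                                    related_name="dimensions")
    height = models.FloatField()
    floor_area = models.FloatField()

# building.dimensions.height  -- the extra table is only joined when you follow the relation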

How should I create an SQLAlchemy model based on another class?

I have a regular class Something that basically just holds some fields. I need to make an SQLAlchemy model SomethingModel (in the ORM) that represents a Something-like object, with a couple extra fields (primary key, for example). What's the best way to do this?
So far, I've thought of having SomethingModel inherit from Something, but then I'm using multiple inheritance which I've heard is bad (SomethingModel would be inheriting from Something and SQLAlchemy's Base). I also thought that I could simply call Something.__init__ from within SomethingModel.__init__ - would that be better?
I am aware that I'll still need SQLAlchemy's column fields for the fields of Something that I want to save in SomethingModel. This also seems to make things a bit messier.
What's the "best" way to accomplish this?
Use classical mapping for your class: define the table separately and map it onto your existing Something class, so Something itself stays free of SQLAlchemy-specific code.
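A minimal sketch of what that could look like (the column names are assumed; in SQLAlchemy 1.4 and later the classical style is spelled registry().map_imperatively(), older versions used orm.mapper()):

from sqlalchemy import Table, Column, Integer, String
from sqlalchemy.orm import registry

class Something(object):
    # Plain class, unaware of SQLAlchemy.
    def __init__(self, name, value):
        self.name = name
        self.value = value

mapper_registry = registry()

something_table = Table(
    "something",
    mapper_registry.metadata,
    Column("id", Integer, primary_key=True),   # ORM-only extra field
    Column("name", String(50)),
    Column("value", Integer),
)

# Attach the table to the existing class without subclassing a declarative Base.
mapper_registry.map_imperatively(Something, something_table)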

Getting and serializing the state of dynamically created python instances to a relational model

I'm developing a framework of sorts. I'm providing a base class, that will be subclassed by other developers to add behavior to the system. The instances of those classes will have attributes that my framework doesn't necessarily expect, except by inspecting those instances' __dict__. To make things even more interesting, some of those classes can be created dynamically, at any time.
I'd like some things to be handled by the framework, namely, I will need to persist those instances, display their attribute values to the user, and let her search/filter instances using those values.
I have to use a relational database. I know there are some decent Python OO databases out there, but unfortunately they're not an option in this case.
I'm not looking for a full-blown ORM either... and it may not even be an option, given that some of the classes can be created dynamically.
So, my question is, what state of a python instance do I need to serialize to ensure that I can deserialize it later on? Is it enough to look at __dict__, or are there other private attributes that I should be using?
Pickling the instances is not enough, because I'll need to unpickle them to search/filter the attribute values, and I'm afraid it's too much data to do it in-memory (instead of letting the database do it).
Just use an ORM. This is what they are for.
What you are proposing to do is create your own half-assed ORM on your own time. Save your time for your own code that does things, and use the effort other people have put in, for free, to solve this problem for you.
Note that all class creation in python is "dynamic" - this is not an issue, for, well, anything at all. In fact, if you are assembling classes programmatically, it is probably slightly easier with an ORM, because they provide reifications of fields.
In the worst case, if you really do need to store your objects in a fake nosql-type schema, you will still only have to write your own backend driver if you use an existing ORM, rather than coding the whole stack yourself. (As it happens, you're not the first person to face this; solutions exist. Google "python orm store dynamically created models" and "sqlalchemy store dynamically created models".)
Candidates include:
Django ORM
SQLAlchemy
Some others you can find by googling "Python ORM".
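As a purely illustrative sketch of the "dynamically created models" idea with Django's ORM (the function, app label, and fields below are all made up, and a real project needs the app registry configured first):

from django.db import models

def make_model(name, fields, app_label="myframework"):
    # Assemble a Django model class at runtime; fields is a dict of model field instances.
    attrs = {
        "__module__": app_label + ".models",
        "Meta": type("Meta", (), {"app_label": app_label}),
    }
    attrs.update(fields)
    return type(name, (models.Model,), attrs)

Plugin = make_model("Plugin", {
    "title": models.CharField(max_length=100),
    "enabled": models.BooleanField(default=True),
})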

Converting Python App into Django

I've got a Python program with about a dozen classes, with several classes possessing instances of other classes, e.g. ObjectA has a list of ObjectB's, and a dictionary of (ObjectC, ObjectD) pairs.
My goal is to put the program's functionality on a website.
I've written and tested JSON encode and decode methods for each class. The problem as I see it now is that I need to choose between starting over and writing the models and logic afresh from a database perspective, or simply storing the python objects (encoded as JSON) in the database, and pulling out the saved states for changes.
Can someone confirm that these are both valid approaches, and that I'm not missing any other simple options?
Man, what I think you can do is convert the classes you have already made into Django model classes. Of course, only the ones that need to be saved to a database. The other classes, and the rest of the code, I recommend you encapsulate for use as helper functions. That way you don't have to change your code too much, and it's going to work fine. ;D
Or, another choice that can be easier to implement: put everything in a helper module: the classes, the functions and everything else.
Then you'll just need to call those functions in your views and define models to save your data into the database.
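A rough sketch of that split (all names here are invented):

# helpers.py -- your existing classes and functions, unchanged
# models.py
from django.db import models

class SavedGame(models.Model):
    owner = models.CharField(max_length=100)
    state = models.TextField()   # e.g. the JSON encoding you already have

# views.py (sketch)
# from . import helpers
# from .models import SavedGame
#
# def play(request):
#     result = helpers.run_turn(...)   # call straight into your existing code
#     SavedGame.objects.create(owner=request.user.username, state=result.to_json())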
Your idea of saving the objects as JSON in the database works, but it's ugly. ;)
Anyway, if you are in a hurry to deliver the website, anything is valid. Just remember that things made this way always give us lots of problems in the future.
I hope this is useful! :D
