How to query database items from models.py in Django?

I have different models. The choices of a MultiSelectField on one model depend on another model, so the database has to be queried inside models.py. Doing so causes a problem during migration (a "Table doesn't exist" error).
class Invigilator(models.Model):
    # ...

    # this method queries Shift and ExamRoom objects
    def get_invigilator_assignment_list():
        assignment = []
        shifts = Shift.objects.all()
        for shift in shifts:
            rooms = ExamRoom.objects.all()
            for room in rooms:
                assign = str(shift.shiftName) + " " + str(room.name)
                assignment.append(assign)
        return assignment

    assignment_choice = []
    assign = get_invigilator_assignment_list()
    i = 0
    for assignm in assign:
        datatuple = (i, assignm)
        assignment_choice.append(datatuple)
        i = i + 1
    ASSIGNMENT_CHOICE = tuple(assignment_choice)
    assignment = MultiSelectField(choices=ASSIGNMENT_CHOICE, blank=True, verbose_name="Assignments")

You cannot add dynamic choices because choices are stored in the migration files and the table metadata. If Django let you do that, it would mean that every time someone added a record to those two models, a new migration would have to be created and the database changed. You must approach this problem differently.
As far as I know, django-smart-selects has a ChainedManyToManyField which can do the trick.
Here is an example from the repo.
from smart_selects.db_fields import ChainedManyToManyField

class Publication(models.Model):
    name = models.CharField(max_length=255)

class Writer(models.Model):
    name = models.CharField(max_length=255)
    publications = models.ManyToManyField('Publication', blank=True, null=True)

class Book(models.Model):
    publication = models.ForeignKey(Publication)
    writer = ChainedManyToManyField(
        Writer,
        chained_field="publication",
        chained_model_field="publications")
    name = models.CharField(max_length=255)

This cannot be done in the model and doesn't make sense. It's like trying to create a column in a table with a certain fixed set of choices (what is MultiSelectField anyway?), but when someone later adds a new row to the Shift or ExamRoom table, the initial column choices would have to change again.
You can
either make your assignment column a simple CharField and create the choices dynamically when building the form (a sketch of this approach follows the code below),
or model your relationships differently. For example, since assignment looks like a combination of Shift and ExamRoom, I would create a through relationship:
class Invigilator(models.Model):
    # ...
    shifts = models.ManyToManyField(Shift, through='Assignment')

class Assignment(models.Model):
    room = models.ForeignKey(ExamRoom)
    shift = models.ForeignKey(Shift)
    invigilator = models.ForeignKey(Invigilator)
When creating the relationship, you'd have to pick a Shift and a Room which would create the Assignment object. Then you can query things like invigilator.shifts.all() or invigilator.assignment_set.first().room.
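For the first option, here is a minimal sketch of building the choices in the form instead of the model (assuming get_invigilator_assignment_list() is moved to module level and the model stores assignment as a plain CharField; AssignmentForm is an illustrative name, not from the original code):
from django import forms

class AssignmentForm(forms.Form):
    # Declared without choices; they are filled in per request in __init__,
    # so Shift and ExamRoom are only queried when the form is built,
    # never at import or migration time.
    assignment = forms.MultipleChoiceField(required=False)

    def __init__(self, *args, **kwargs):
        super(AssignmentForm, self).__init__(*args, **kwargs)
        self.fields['assignment'].choices = [
            (label, label) for label in get_invigilator_assignment_list()
        ]
How you then persist the selection into the CharField (for example, joining the chosen labels with a separator) is up to you.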

Related

How to fetch related model in django_tables2 to avoid a lot of queries?

I might be missing something simple here, or I simply lack the knowledge or some how-to.
I have three models: Site, SiteField and, most importantly, SiteFieldValue.
My idea is to create a django-tables2 table (for Site) that shows the values from SiteFieldValue as the numbers in a row, for a specific site, under certain headers. The problem is that each site can have around 50 of them. That, times the number of columns produced by the render_ functions, times the number of sites, equals a lot of queries, and I want to avoid that.
My question is: is it possible to, for example, prefetch all the values for each site (SiteFieldValue.objects.filter(site=record).first() somewhere in the SiteListTable class), put them into an array and then use them in the render_ functions by simply looking up the value assigned to a key (the id of the field)?
Models:
class Site(models.Model):
    name = models.CharField(max_length=100)

class SiteField(models.Model):
    name = models.CharField(max_length=100)
    description = models.CharField(max_length=500, null=True, blank=True)

    def __str__(self):
        return self.name

class SiteFieldValue(models.Model):
    site = models.ForeignKey(Site, on_delete=models.CASCADE)
    field = models.ForeignKey(SiteField, on_delete=models.CASCADE)
    value = models.CharField(max_length=500)
Table view:
class SiteListTable(tables.Table):
    name = tables.Column()
    importance = tables.Column(verbose_name='Importance', empty_values=())
    vertical = tables.Column(verbose_name='Vertical', empty_values=())
    # ... and many more to come... all values based on SiteFieldValue

    def render_importance(self, value, record):
        q = SiteFieldValue.objects.filter(site=record, field=1).first()
        # ^^ I don't want this!! I would want the SiteFieldValue rows to be
        # prefetched somewhere else for that model and just check the array
        # for the field id in here.
        if q:
            return q.value
        else:
            return None

    def render_vertical(self, value, record):
        q = SiteFieldValue.objects.filter(site=record, field=2).first()
        # ^^ same problem as above
        if q:
            return q.value
        else:
            return None

    class Meta:
        model = Site
        attrs = {
            "class": "table table-striped",
            "thead": {'class': 'thead-light'},
        }
        template_name = "django_tables2/bootstrap.html"
        fields = ("name", "importance", "vertical",)
This might get you started. I've broken it up into steps but they can be chained quite easily.
# Get all the objects you'll need. You can filter as appropriate (say, by site__name).
qs = SiteFieldValue.objects.select_related('site', 'field')

# Let's keep things simple and only get the values we want.
qs_values = qs.values('site__name', 'field__name', 'value')

# qs_values is a queryset. For ease of manipulation, let's make it a list.
qs_list = list(qs_values)

# Set up a final dict.
final_dict = {}

# Create the keys (sites) and values so they are all grouped.
for site in qs_list:
    # Create the sub-dict for the fields if not already created.
    if site['site__name'] not in final_dict:
        final_dict[site['site__name']] = {}
    final_dict[site['site__name']]['name'] = site['site__name']
    final_dict[site['site__name']][site['field__name']] = site['value']

# Now let's convert our dict of dicts into a list of dicts,
# for use as per the django-tables2 docs.
data = []
for site in final_dict:
    data.append(final_dict[site])
Now you have a list of dicts, e.g. [{'name': site_name, 'col1name': value, ...}, ...], and you can pass it to the table as shown in the django-tables2 docs.
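A short usage sketch of how that list could feed the table (assuming the table's columns are renamed to match the SiteField names so the per-row render_ lookups are no longer needed; the view and the build_site_data() helper are illustrative, not from the original post):
from django.shortcuts import render
from django_tables2 import RequestConfig

def site_list(request):
    data = build_site_data()  # hypothetical helper returning the list of dicts built above
    table = SiteListTable(data)
    RequestConfig(request, paginate={'per_page': 25}).configure(table)
    return render(request, 'site_list.html', {'table': table})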

Creating records by model

Suppose I have the following models:
class Recipe(models.Model):
    par_recipe = models.CharField(max_length=200)

class Line(models.Model):
    par_machine = models.CharField(max_length=200)

class Measurements(models.Model):
    par_value = models.IntegerField(default=0)
    id_line = models.ForeignKey(Line)
    id_recipe = models.ForeignKey(Recipe)
Do I understand correctly that this way I have a 1:1 relationship, and that when adding entries the id_line and id_recipe values will be filled in automatically?
I will add for example:
for row in ws.iter_rows(row_offset=1):
    recipe = Recipe()
    line = Line()
    measurements = Measurements()
    recipe.par_recipe = row[1].value
    line.par_machine = row[2].value
    measurements.par_value = row[8].value
And a small question about Measurements: it was designed so that all the foreign keys go into it; is that implemented correctly?
It is not quite like that, you would have to tie them together:
for row in ws.iter_rows(row_offset=1):
    recipe = Recipe.objects.create(par_recipe=row[1].value)
    line = Line.objects.create(par_machine=row[2].value)
    measurements = Measurements.objects.create(
        par_value=row[8].value,
        id_line=line,
        id_recipe=recipe,
    )
None of this is optimized on the database side. If there are a lot of rows, you could make it faster by wrapping the loop in a transaction:
from django.db import transaction

with transaction.atomic():
    for row in ws.iter_rows(row_offset=1):
        recipe = Recipe.objects.create(par_recipe=row[1].value)
        line = Line.objects.create(par_machine=row[2].value)
        measurements = Measurements.objects.create(
            par_value=row[8].value,
            id_line=line,
            id_recipe=recipe,
        )
This creates a single transaction and commits once at the end instead of after every statement. But it also means the whole transaction fails on any error.
See Django Database Transactions.
You could get more creative by counting the records and committing every 1000 rows, for example. Note that transaction.commit() cannot be called inside an atomic() block, so autocommit is switched off manually instead:
from django.db import transaction

transaction.set_autocommit(False)
try:
    for idx, row in enumerate(ws.iter_rows(row_offset=1)):
        recipe = Recipe.objects.create(par_recipe=row[1].value)
        line = Line.objects.create(par_machine=row[2].value)
        measurements = Measurements.objects.create(
            par_value=row[8].value,
            id_line=line,
            id_recipe=recipe,
        )
        # every 1000 records, commit the transaction
        if idx % 1000 == 0:
            transaction.commit()
    # commit whatever is left over
    transaction.commit()
except Exception:
    transaction.rollback()
    raise
finally:
    transaction.set_autocommit(True)
Do I understand correctly that this way I have a 1:1 relationship, and that when adding entries the id_line and id_recipe values will be filled in automatically?
The relations will not automatically link to the previously constructed objects; that would also be quite unsafe, since a small change to the code fragment could result in a totally different way of linking elements together.
Furthermore, a ForeignKey is a many-to-one relation: multiple Measurements objects can refer to the same Recipe object.
You need to do this manually, for example:
for row in ws.iter_rows(row_offset=1):
    recipe = Recipe.objects.create(par_recipe=row[1].value)
    line = Line.objects.create(par_machine=row[2].value)
    measurements = Measurements.objects.create(
        par_value=row[8].value,
        id_line=line,
        id_recipe=recipe,
    )
Note that a ForeignKey refers to the object, not to the primary key value, so you probably want to rename your ForeignKeys. A model typically has a singular name, so Measurement instead of Measurements:
class Measurement(models.Model):
    par_value = models.IntegerField(default=0)
    line = models.ForeignKey(Line, on_delete=models.CASCADE)
    recipe = models.ForeignKey(Recipe, on_delete=models.CASCADE)
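With those names, the import loop from above would read (same assumptions about the worksheet columns):
for row in ws.iter_rows(row_offset=1):
    recipe = Recipe.objects.create(par_recipe=row[1].value)
    line = Line.objects.create(par_machine=row[2].value)
    Measurement.objects.create(par_value=row[8].value, line=line, recipe=recipe)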

How to override create_table() in peewee to create extra history?

Following this SO answer and using the (excellent) Peewee ORM, I'm trying to make a versioned database in which the history of a record is stored in a second _history table. So when I create a new table using the create_table() method, I also need to create a second table with four extra fields.
So let's say I've got the following table:
class User(db.Model):
    created = DateTimeField(default=datetime.utcnow)
    name = TextField()
    address = TextField()
When this table is created I also want to create the following table:
class UserHistory(db.Model):
    created = DateTimeField()  # Note: no default anymore, because the value is taken from the original User row
    name = TextField()
    address = TextField()
    # The following fields are extra
    original = ForeignKeyField(User, related_name='versions')
    updated = DateTimeField(default=datetime.utcnow)
    revision = IntegerField()
    action = TextField()  # 'INSERT' or 'UPDATE' (I never delete anything)
So I tried overriding the Model class like this:
class Model(db.Model):
    @classmethod
    def create_table(cls, fail_silently=False):
        super(Model, cls).create_table(fail_silently=fail_silently)
        history_table_name = cls._meta.db_table + 'history'
        # How to create the history table here?
As you can see I manage to create a variable with the history table name, but from there I'm kinda lost.
Does anybody know how I can create a new table which is like the original one, but just with the added 4 fields in there? All tips are welcome!
Maybe something like this:
class HistoryModel(db.Model):
    @classmethod
    def create_table(cls, fail_silently=False):
        # Call the parent's `create_table()` for the model itself.
        super(HistoryModel, cls).create_table(fail_silently=fail_silently)

        # The extra fields that only exist on the history table.
        history_fields = {
            'original': ForeignKeyField(cls, related_name='versions'),
            'updated': DateTimeField(default=datetime.utcnow),
            'revision': IntegerField(),
            'action': TextField()}

        # Build a <ModelName>History subclass dynamically and create its table,
        # skipping this override so we don't recurse into a History-of-History table.
        model_name = cls.__name__ + 'History'
        history_model = type(model_name, (cls,), history_fields)
        super(HistoryModel, history_model).create_table(fail_silently=fail_silently)
Also note you'll want to do the same thing for create_indexes().
I'd suggest creating a property or some other way to easily generate the HistoryModel.
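A usage sketch under those assumptions (the model fields come from the question; the exact name of the generated history table depends on peewee's table-naming rules):
class User(HistoryModel):
    created = DateTimeField(default=datetime.utcnow)
    name = TextField()
    address = TextField()

# Creates the User table and, through the override, a second
# UserHistory table carrying the four extra history columns.
User.create_table(fail_silently=True)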

django 'null value in column'

I'm trying to save a user's birth date, but I get a "null value in column "dob" violates not-null constraint" error.
models.py:
class Profile(models.Model):
    user = models.OneToOneField(User, unique=True)
    nickname = models.CharField(max_length=32)
    dob = models.DateField(null=False)
    sex = models.BooleanField(null=False)
Here I try to generate random users:
def create_random_users(userCount=1000):
    random.seed()
    for i in range(0, userCount):
        sex = random.randint(0, 1)
        name = random.choice(names[sex])
        email = "{0}{1}@mail.com".format(name, i)
        user = soc_models.User.objects.create_user(email, email, password='password')
        user.save()
        userProfile = soc_models.Profile.objects.create()
        userProfile.user = user
        _year = random.randrange(1962, 1995)
        _month = random.randrange(1, 12)
        _day = random.randrange(1, calendar.monthrange(_year, _month)[1])
        userProfile.dob = datetime.datetime(_year, _month, _day)
        userProfile.sex = random.randrange(0, 1)
        userProfile.city = random.randrange(4000000)
        userProfile.country = random.randrange(230)
        userProfile.save()
Thank you.
The create method is documented as "a convenience method for creating an object and saving it all in one step". So when the following statement in your sample data creation script runs:
userProfile = soc_models.Profile.objects.create()
It attempts to save an empty Profile object to the database. Since you haven't set the dob attribute at this point, you trigger the NOT NULL constraint.
Two ways to avoid this are:
create the object via the constructor, so that it isn't immediately saved to the database, or
provide values for all the fields via keyword arguments to create (a quick sketch of both follows).
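For example, with the Profile model above (the values shown are illustrative):
# Option 1: build the object in memory first, save once everything is set.
userProfile = soc_models.Profile(
    user=user,
    nickname=name,
    dob=datetime.date(_year, _month, _day),
    sex=bool(sex),
)
userProfile.save()

# Option 2: pass every required field to create() so the single INSERT is valid.
userProfile = soc_models.Profile.objects.create(
    user=user,
    nickname=name,
    dob=datetime.date(_year, _month, _day),
    sex=bool(sex),
)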
When using create you must pass all required values to it; if you want to call save() later, use the model constructor instead, i.e.:
userProfile = soc_models.Profile()
I'm not sure whether it fixes your error or not, but according to the docs, DateField should be used for storing a datetime.date instance, and DateTimeField for storing a datetime.datetime.
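For example, in the script above:
userProfile.dob = datetime.date(_year, _month, _day)  # a plain date is what DateField expects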
P.S. It really looks like you're trying to "migrate" the DB schema (change already created columns). Django doesn't support that out of the box, but you can use external applications like South.

How can I do INNER JOIN in Django in legacy database?

Sorry for a probably simple question, but I'm a newbie in Django and really confused.
I have an ugly legacy database that I cannot change. It has two tables:
class Salespersons(models.Model):
    id = models.IntegerField(unique=True, primary_key=True)
    xsin = models.IntegerField()
    name = models.CharField(max_length=200)
    surname = models.CharField(max_length=200)

class Store(models.Model):
    id = models.IntegerField(unique=True, primary_key=True)
    xsin = models.IntegerField()
    brand = models.CharField(max_length=200)
So I suppose I cannot add foreign keys to the class definitions, because that would change the tables.
I need to execute an SQL request like this:
SELECT * FROM Salespersons INNER JOIN Store ON (Salespersons.xsin = Store.xsin);
How can I achieve it using Django ORM?
Or am I only able to get Salespersons and Store separately, i.e.
stores = Store.objects.filter(xsin=1000)
salespersons = Salespersons.objects.filter(xsin=1000)
Given your example query, are your tables actually named Salespersons/Store?
Anyway, something like this should work:
results = Salespersons.objects.extra(
    tables=["Store"],
    where=["Salespersons.xsin = Store.xsin"])
However, given the names of the tables/models, it doesn't seem to me that an inner join would be logically correct, unless you always have just one salesperson per store with the same xsin.
If you can make one of the xsin fields unique, you can use a ForeignKey with to_field to generate the inner join, like this:
class Salespersons(models.Model):
    xsin = models.IntegerField(unique=True)

class Store(models.Model):
    xsin = models.ForeignKey(Salespersons, db_column='xsin', to_field='xsin')

>>> Store.objects.select_related('xsin')
I don't see why you can't use the models.ForeignKey fields even if the database lacks the constraints -- if you don't explicitly execute the SQL to change the database then the tables won't change. If you use a ForeignKey then you can use Salespersons.objects.select_related('xsin') to request that the related objects are fetched at the same time.
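A sketch of how the legacy tables could be mapped without Django ever touching them, assuming Salespersons.xsin can be made unique (the Meta options come from Django's standard legacy-database support, not from either answer; on recent Django versions the ForeignKey would also need an on_delete argument):
class Salespersons(models.Model):
    id = models.IntegerField(unique=True, primary_key=True)
    xsin = models.IntegerField(unique=True)
    name = models.CharField(max_length=200)
    surname = models.CharField(max_length=200)

    class Meta:
        managed = False          # Django never creates or alters this table
        db_table = 'Salespersons'

class Store(models.Model):
    id = models.IntegerField(unique=True, primary_key=True)
    xsin = models.ForeignKey(Salespersons, db_column='xsin', to_field='xsin')
    brand = models.CharField(max_length=200)

    class Meta:
        managed = False
        db_table = 'Store'

# The ORM then emits the INNER JOIN itself:
stores = Store.objects.select_related('xsin').filter(xsin__xsin=1000)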
