Suppose I have the following models:
class Recipe(models.Model):
    par_recipe = models.CharField(max_length=200)

class Line(models.Model):
    par_machine = models.CharField(max_length=200)

class Measurements(models.Model):
    par_value = models.IntegerField(default=0)
    id_line = models.ForeignKey(Line)
    id_recipe = models.ForeignKey(Recipe)
Do I understand correctly that this way I have a 1:1 relationship, and that when adding entries the id_line and id_recipe ids will be created automatically?
I will add, for example:
for row in ws.iter_rows(row_offset=1):
    recipe = Recipe()
    line = Line()
    measurements = Measurements()
    recipe.par_recipe = row[1].value
    line.par_machine = row[2].value
    measurements.par_value = row[8].value
And a small question about Measurements: it was designed so that all the foreign keys end up in it; is that implemented correctly now?
It is not quite like that; you would have to tie them together:
for row in ws.iter_rows(row_offset=1):
    recipe = Recipe.objects.create(par_recipe=row[1].value)
    line = Line.objects.create(par_machine=row[2].value)
    measurements = Measurements.objects.create(
        par_value=row[8].value,
        id_line=line,
        id_recipe=recipe,
    )
None of this is optimized for the database. If there are a lot of rows, you could make it faster by using a transaction to batch the writes:
from django.db import transaction

with transaction.atomic():
    for row in ws.iter_rows(row_offset=1):
        recipe = Recipe.objects.create(par_recipe=row[1].value)
        line = Line.objects.create(par_machine=row[2].value)
        measurements = Measurements.objects.create(
            par_value=row[8].value,
            id_line=line,
            id_recipe=recipe,
        )
This wraps all the inserts in one transaction and commits once instead of once per row. But an error will also roll back the whole transaction.
See the Django documentation on Database Transactions.
You could get more creative by committing in chunks, for example every 1000 records. Note that transaction.commit() cannot be called inside an atomic() block, so each chunk gets its own transaction:
from itertools import islice

from django.db import transaction

rows = ws.iter_rows(row_offset=1)
while True:
    # take the next chunk of 1000 rows
    chunk = list(islice(rows, 1000))
    if not chunk:
        break
    # commit every chunk of 1000 records in its own transaction
    with transaction.atomic():
        for row in chunk:
            recipe = Recipe.objects.create(par_recipe=row[1].value)
            line = Line.objects.create(par_machine=row[2].value)
            measurements = Measurements.objects.create(
                par_value=row[8].value,
                id_line=line,
                id_recipe=recipe,
            )
Do I understand correctly that this way I have a 1:1 relationship, and that when adding entries the id_line and id_recipe ids will be created automatically?
No, the relations will not automatically link to the previously constructed objects. That would also be quite unsafe, since a small change to the code fragment could result in a totally different way of linking elements together.
Furthermore, a ForeignKey is a many-to-one relation: multiple Measurements objects can refer to the same Recipe object.
You need to do this manually, for example:
for row in ws.iter_rows(row_offset=1):
    recipe = Recipe.objects.create(par_recipe=row[1].value)
    line = Line.objects.create(par_machine=row[2].value)
    measurements = Measurements.objects.create(
        par_value=row[8].value,
        id_line=line,
        id_recipe=recipe,
    )
Note that a ForeignKey refers to the object itself, not to its primary key value, so you probably want to rename your ForeignKeys. A model typically has a singular name as well, so Measurement instead of Measurements:
class Measurement(models.Model):
    par_value = models.IntegerField(default=0)
    line = models.ForeignKey(Line, on_delete=models.CASCADE)
    recipe = models.ForeignKey(Recipe, on_delete=models.CASCADE)
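With the renamed model and fields, the import loop above would then look like this (a minimal sketch, assuming the same worksheet ws):
import_loop example:
for row in ws.iter_rows(row_offset=1):
    recipe = Recipe.objects.create(par_recipe=row[1].value)
    line = Line.objects.create(par_machine=row[2].value)
    # the ForeignKey parameters take model instances, not ids
    measurement = Measurement.objects.create(
        par_value=row[8].value,
        line=line,
        recipe=recipe,
    )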
I might be missing something simple here, or I simply lack the knowledge of some how-to.
I have three models: Site, SiteField and, the most important one, SiteFieldValue.
My idea is to build a django-tables2 table (for Site) that shows the values from SiteFieldValue as the numbers in a row, for a specific site, under certain headers. The problem is that each site can have around 50 of them. That, multiplied by the number of columns defined by the render_ functions and by the number of sites, equals a lot of queries, and I want to avoid that.
My question is: is it possible to, for example, prefetch all the values for each site (something like SiteFieldValue.objects.filter(site=record) somewhere in the SiteListTable class), put them into an array, and then use them in the render_ functions by simply looking up the value assigned to a key (the id of the field)?
Models:
class Site(models.Model):
    name = models.CharField(max_length=100)

class SiteField(models.Model):
    name = models.CharField(max_length=100)
    description = models.CharField(max_length=500, null=True, blank=True)

    def __str__(self):
        return self.name

class SiteFieldValue(models.Model):
    site = models.ForeignKey(Site, on_delete=models.CASCADE)
    field = models.ForeignKey(SiteField, on_delete=models.CASCADE)
    value = models.CharField(max_length=500)
Table view
class SiteListTable(tables.Table):
    name = tables.Column()
    importance = tables.Column(verbose_name='Importance', empty_values=())
    vertical = tables.Column(verbose_name='Vertical', empty_values=())
    # ... and many more to come... all values based on SiteFieldValue

    def render_importance(self, value, record):
        q = SiteFieldValue.objects.filter(site=record, field=1).first()
        # ^^ I don't want this!! I would want the SiteFieldValue to be prefetched
        # somewhere else for that model and just check the array for the field id here.
        if q:
            return q.value
        else:
            return None

    def render_vertical(self, value, record):
        q = SiteFieldValue.objects.filter(site=record, field=2).first()
        # ^^ I don't want this!! I would want the SiteFieldValue to be prefetched
        # somewhere else for that model and just check the array for the field id here.
        if q:
            return q.value
        else:
            return None

    class Meta:
        model = Site
        attrs = {
            "class": "table table-striped",
            "thead": {'class': 'thead-light'},
        }
        template_name = "django_tables2/bootstrap.html"
        fields = ("name", "importance", "vertical")
This might get you started. I've broken it up into steps but they can be chained quite easily.
# Get all the objects you'll need. You can filter as appropriate, say by site__name.
qs = SiteFieldValue.objects.select_related('site', 'field')

# Let's keep things simple and only get the values we want.
qs_values = qs.values('site__name', 'field__name', 'value')

# qs_values is a queryset; for ease of manipulation, let's make it a list.
qs_list = list(qs_values)

# Set up a final dict.
final_dict = {}

# Create the keys (sites) and values so they are all grouped.
for row in qs_list:
    # Create the sub-dict for the fields if not already created.
    if row['site__name'] not in final_dict:
        final_dict[row['site__name']] = {}
        final_dict[row['site__name']]['name'] = row['site__name']
    final_dict[row['site__name']][row['field__name']] = row['value']

# Now let's convert our dict of dicts into a list of dicts
# for use as per the django-tables2 docs.
data = []
for site in final_dict:
    data.append(final_dict[site])
Now you have a list of dicts, e.g. [{'name': site_name, 'col1name': value, ...}, ...], and can use it as the table data as shown in the django-tables2 docs.
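For completeness, here is a minimal sketch of how that list of dicts could be handed to the table in a view (the view and template names are assumptions). Note that once the data is pre-built, the render_ methods and their per-row queries are no longer needed, since each column simply reads its key from the dict:
from django.shortcuts import render
from django_tables2 import RequestConfig

def site_list(request):
    # ... build `data` (the list of dicts) exactly as above ...
    table = SiteListTable(data)
    RequestConfig(request).configure(table)  # optional: sorting/pagination from the request
    return render(request, "site_list.html", {"table": table})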
I have the following models:
class Customer(models.Model):
    name = models.CharField(max_length=255)
    email = models.EmailField(max_length=255, default='example@example.com')
    authorized_credit = models.IntegerField(default=0)
    balance = models.IntegerField(default=0)

class Transaction(models.Model):
    customer = models.ForeignKey(Customer, on_delete=models.CASCADE)
    payment_amount = models.IntegerField(default=0)  # can be 0 or have a value
    exit_amount = models.IntegerField(default=0)  # can be 0 or have a value
    transaction_date = models.DateField()
I want a query that gets all customer information plus the date of the last payment.
I have this query in Postgres that does exactly what I need:
select e.*, max(l.transaction_date) as last_date_payment
from app_customer as e
left join app_transaction as l
on e.id = l.customer_id and l.payment_amount != 0
group by e.id
order by e.id
But I need this query in Django for a serializer. I tried the following, but it returns a different query.
In Python:
print(Customer.objects.filter(transaction__isnull=True).order_by('id').query)
>>> SELECT app_customer.id, app_customer.name, app_customer.email, app_customer.balance FROM app_customer
LEFT OUTER JOIN app_transaction
ON (app_customer.id = app_transaction.customer_id)
WHERE app_transaction.id IS NULL
ORDER BY app_customer.id ASC
But what I need is these rows:
example
Whether you are working with a serializer or not, you can reuse the same view/function for both tasks.
First, to get the transaction details for a given customer object, you have to be aware of related_name. related_name has a default value, but you can set it to something unique so that you remember it.
Change your model:
class Transaction(models.Model):
    customer = models.ForeignKey(Customer, related_name="transac_set", on_delete=models.CASCADE)
related_name is a way in Django to create a reverse relationship from Customer to Transaction. This way, for a Customer instance cus, you will be able to do cus.transac_set.all() and it will fetch all the transactions of the cus object.
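For example (a small sketch; the lookup value is just an assumption):
cus = Customer.objects.get(pk=1)   # any existing customer
cus.transac_set.all()              # all Transaction rows pointing to this customer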
Since you might have multiple customers to get transaction details for, you can use select_related() when querying; this hits the database the least number of times and gets all the data for you.
Create a function to get the data of all transactions of a set of customers:
def get_cus_transac(cus_id_list):
    # here cus_id_list is the list of customer ids you need to fetch
    cus_transac_list = Transaction.objects.select_related('customer').filter(customer_id__in=cus_id_list)
    return cus_transac_list
For your purpose you need another approach, and that is the reason you needed related_name: prefetch_related().
Create a function to get the latest transaction data of customers. Warning: the snippet below does not actually restrict things to the latest transaction per customer; it prefetches all of them. You can work along similar lines and refine it.
def get_latest_transac(cus_id_list):
    # here cus_id_list is the list of customer ids you need to fetch
    latest_transac_list = Customer.objects.filter(id__in=cus_id_list).prefetch_related('transac_set')
    return latest_transac_list
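If you only need the date of the latest non-zero payment per customer, one option (an assumption on my part, not part of the original snippet) is to annotate instead of prefetching, reusing the transac_set reverse name defined above:
from django.db.models import Max, Q

def get_latest_payment_dates(cus_id_list):
    # annotate each customer with the date of their most recent non-zero payment
    return Customer.objects.filter(id__in=cus_id_list).annotate(
        last_date_payment=Max(
            'transac_set__transaction_date',
            filter=Q(transac_set__payment_amount__gt=0),
        )
    )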
Now coming to serializers: you need three serializers (actually you only need two, but the third one can serialize the Customer data plus the latest transaction that you need): one for Transaction, another for Customer, and a third one to combine them.
There might be some mistakes in the code or I might have missed some details, as I have not tested it. I am assuming you know how to write the serializers and views for this.
One approach is to use subqueries:
from django.db.models import OuterRef, Subquery

transaction_subquery = Transaction.objects.filter(
    customer=OuterRef('pk'), payment_amount__gt=0,
).order_by('-transaction_date')

Customer.objects.annotate(
    last_date_payment=Subquery(
        transaction_subquery.values('transaction_date')[:1]
    )
)
This will get all customer data and annotate it with the date of the last transaction that has a non-zero payment_amount, in one query.
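Since the question mentions a serializer, here is a minimal Django REST Framework sketch of how the annotated queryset could be exposed (the serializer name and field list are assumptions):
from rest_framework import serializers

class CustomerSerializer(serializers.ModelSerializer):
    # filled in by the annotation above, so it is read-only
    last_date_payment = serializers.DateField(read_only=True)

    class Meta:
        model = Customer
        fields = ('id', 'name', 'email', 'authorized_credit', 'balance', 'last_date_payment')
The annotated queryset can then be serialized with CustomerSerializer(queryset, many=True).data.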
To solve your problem:
I want a query that gets all customer information and the date of the last payment.
You can try using order_by combined with distinct:
Customer.objects.prefetch_related('transaction_set').values(
    'id', 'name', 'email', 'authorized_credit', 'balance', 'transaction__transaction_date'
).order_by('-transaction__transaction_date').distinct('transaction__transaction_date')
Note:
Passing field names to distinct() is only supported on PostgreSQL.
Usage of distinct: https://docs.djangoproject.com/en/3.2/ref/models/querysets/#distinct
I have different models. The choices of a MultiSelectField in one model depend on another model, so the database has to be queried inside models.py. Doing so causes a problem during migration (a "Table doesn't exist" error).
class Invigilator(models.Model):
    # ---

    # this method queries Shift and ExamRoom objects
    def get_invigilator_assignment_list():
        assignment = []
        shifts = Shift.objects.all()
        for shift in shifts:
            rooms = ExamRoom.objects.all()
            for room in rooms:
                assign = str(shift.shiftName) + " " + str(room.name)
                assignment.append(assign)
        return assignment

    assignment_choice = []
    assign = get_invigilator_assignment_list()
    i = 0
    for assignm in assign:
        datatuple = (i, assignm)
        assignment_choice.append(datatuple)
        i = i + 1
    ASSIGNMENT_CHOICE = tuple(assignment_choice)

    assignment = MultiSelectField(choices=ASSIGNMENT_CHOICE, blank=True, verbose_name="Assignments")
You cannot add dynamic choices, because choices are stored in the migration files and in the table info. If Django let you do that, it would mean that every time someone adds a record to those two models, a new migration would have to be created and the database changed. You must approach this problem differently.
As far as I know, django-smart-selects has a ChainedManyToManyField which can do the trick.
Here is an example from the repo.
from smart_selects.db_fields import ChainedManyToManyField

class Publication(models.Model):
    name = models.CharField(max_length=255)

class Writer(models.Model):
    name = models.CharField(max_length=255)
    publications = models.ManyToManyField('Publication', blank=True, null=True)

class Book(models.Model):
    publication = models.ForeignKey(Publication)
    writer = ChainedManyToManyField(
        Writer,
        chained_field="publication",
        chained_model_field="publications",
    )
    name = models.CharField(max_length=255)
This cannot be done in the model and doesn't make sense. It's like trying to create a column in a table with a certain fixed set of choices (what is MultiSelectField anyway?), but when someone later adds a new row to the Shift or ExamRoom table, the initial column choices would have to change again.
You can:
either make your assignment column a simple CharField and create the choices dynamically when creating the form,
or model your relationships differently. For example, since an assignment looks like a combination of Shift and ExamRoom, I would create a through relationship:
class Invigilator(models.Model):
    # ...
    shifts = models.ManyToManyField(Shift, through='Assignment')

class Assignment(models.Model):
    room = models.ForeignKey(ExamRoom, on_delete=models.CASCADE)
    shift = models.ForeignKey(Shift, on_delete=models.CASCADE)
    invigilator = models.ForeignKey(Invigilator, on_delete=models.CASCADE)
When creating the relationship, you'd pick a Shift and an ExamRoom, which creates the Assignment object. Then you can query things like invigilator.shifts.all() or invigilator.assignment_set.first().room.
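A short usage sketch under these assumptions (it presumes some Shift, ExamRoom and Invigilator rows already exist):
shift = Shift.objects.first()
room = ExamRoom.objects.first()
invigilator = Invigilator.objects.first()

# creating the through object links all three together
Assignment.objects.create(invigilator=invigilator, shift=shift, room=room)

invigilator.shifts.all()                 # all shifts assigned to this invigilator
invigilator.assignment_set.first().room  # the room of one particular assignment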
I'm new to Django. I just created my models and loaded data into my sqlite3 database using a .csv import. These are my models:
class Backlog(models.Model):
    sales_order = models.CharField(max_length=30)
    po_number = models.CharField(max_length=30)
    order_number = models.IntegerField(blank=True)
    line_number = models.IntegerField(blank=True)
    ship_Set = models.IntegerField(blank=True)
    product_id = models.CharField(max_length=30)
    ordered_quantity = models.IntegerField(blank=True)

class Material(models.Model):
    product_id = models.CharField(max_length=50)
    tan_id = models.CharField(max_length=50)
Now that I have the information in my tables, I want to do the following:
Find whether product_id from Backlog exists in the Material model. Once it is found, check the leading digits of the tan_id: if they are 74, classify as '1'; if they are 800, classify as '3'; otherwise set as '2'. (tan_id values commonly look like 74-102345-03 or 800-120394-03.)
My two questions are:
How do I do that, and do I have to create a new column to store this information for every product_id?
OK, well, given your current models, here is a possible solution to the problem you are having:
for backlog in Backlog.objects.all():
    try:
        material = Material.objects.get(product_id=backlog.product_id)
        if material.tan_id[0:2] == '74':
            classification = 1  # classify as 1
        elif material.tan_id[0:2] == '80':
            classification = 3  # classify as 3
        else:
            classification = 2  # classify as 2
    except Material.DoesNotExist:
        print("No matching material for product_id %s" % backlog.product_id)
        continue
This code loops over every instance of Backlog in your database and tries to find the associated Material. If it doesn't find one, objects.get() raises Material.DoesNotExist, we print that it's not there and continue with the loop. If it is found, we classify it as you specified. It might require a slight bit of tweaking, but it should give you the bones of what you want. Let me know if it doesn't.
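On your second question (whether to create a new column): if you want to keep the result, one option, purely as a sketch with a hypothetical field name, is to add a classification field to Backlog and save it inside the loop:
# Hypothetical extra field on Backlog (an assumption; it needs a migration):
#     classification = models.IntegerField(default=2)

for backlog in Backlog.objects.all():
    try:
        material = Material.objects.get(product_id=backlog.product_id)
    except Material.DoesNotExist:
        continue
    prefix = material.tan_id.split('-')[0]  # '74' or '800' from e.g. '74-102345-03'
    backlog.classification = {'74': 1, '800': 3}.get(prefix, 2)
    backlog.save()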
I have an Orders model which stores the orders of users. I'd like to filter only orders that have been issued (the order_started field) in the last 24 hours for a user. I am trying to update the following view:
def userorders(request):
    Orders = Orders.objects.using('db1').filter(order_owner=request.user).extra(
        select={'order_ended_is_null': 'order_ended IS NULL'},
    )
The Orders model has the following fields:
order_uid = models.TextField(primary_key=True)
order_owner = models.TextField()
order_started = models.DateTimeField()
order_ended = models.DateTimeField(blank=True, null=True)
How can I add the extra filter?
You can do it as below, where you add another argument in the filter call (assuming the rest of your function was working):
import datetime

def userorders(request):
    time_24_hours_ago = datetime.datetime.now() - datetime.timedelta(days=1)
    orders = Orders.objects.using('db1').filter(
        order_owner=request.user,
        order_started__gte=time_24_hours_ago
    ).extra(select={'order_ended_is_null': 'order_ended IS NULL'},)
Note that Orders is not a good choice for a variable name, since it refers to another class in the project and begins with caps (generally used for classes), so I've used orders instead (different case).
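If the project has USE_TZ enabled, order_started will hold timezone-aware values, and comparing them against a naive datetime.datetime.now() produces a warning; a timezone-aware variant of the same view (a sketch, otherwise identical) would be:
import datetime

from django.utils import timezone

def userorders(request):
    time_24_hours_ago = timezone.now() - datetime.timedelta(days=1)
    orders = Orders.objects.using('db1').filter(
        order_owner=request.user,
        order_started__gte=time_24_hours_ago,
    ).extra(select={'order_ended_is_null': 'order_ended IS NULL'},)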