Django Execute Function When DateTimeField Equals Current Date and Time - python

So I have implemented a subscription product on my website.
When a user starts the subscription, the current date and time is stored in the database. When they cancel the subscription, a period of time is added onto the start date so I know when to cancel the subscription and change some values elsewhere.
This is fine, and I know how to use Django and Python; however, I am stuck on the last bit: what happens when the cancel date comes around in the future.
Question: How do I execute a function when the cancel date (in the db) is equal to the current date and time?
Below is a simple example of the model I will be using:
models.py
class Subscriptions(models.Model):
    subscription_id = models.AutoField(primary_key=True)
    start_date = models.DateTimeField(auto_now_add=True)
    cancel_date = models.DateTimeField(auto_now_add=False)
    person_id = models.ForeignKey('Persons', on_delete=models.CASCADE)

class Persons(models.Model):
    person_id = models.AutoField(primary_key=True)
    value_to_change = models.BooleanField()
Before you ask: I have not attempted any code, as I couldn't find a solution for this problem. Thanks <3

Without Celery, on a UNIX system providing cron (cron docs: e.g. https://www.computerhope.com/unix/ucrontab.htm):
Write a management command (https://docs.djangoproject.com/en/2.1/howto/custom-management-commands/) that fetches the objects whose cancel_date is in the past and that have not been cancelled yet. (If you did the lookup with an exact datetime.now() match, which has the finest possible granularity, you would have to be more than lucky to find anything.)
You should add another date field that tells you when the system actually ran the cancellation, and you should allow both cancel_date and cancelled_date to be null.
# models.py
class Subscriptions(models.Model):
    subscription_id = models.AutoField(primary_key=True)
    start_date = models.DateTimeField(auto_now_add=True)
    cancel_date = models.DateTimeField(auto_now_add=False, null=True)
    cancelled_date = models.DateTimeField(null=True)
    person_id = models.ForeignKey('Persons', on_delete=models.CASCADE)

# myapp/management/commands/cancellation.py
from datetime import datetime

from django.core.management.base import BaseCommand

from myapp.models import Subscriptions

class Command(BaseCommand):
    def handle(self, *args, **options):
        now = datetime.now()
        # subscriptions whose cancel date has passed and that have not been cancelled yet
        to_cancel_qs = Subscriptions.objects.exclude(
            cancelled_date__isnull=False).filter(
            cancel_date__lte=now)
        for sub in to_cancel_qs.all():
            # do your cancelling
            sub.cancelled_date = now
            sub.save()
        # or: to_cancel_qs.update(cancelled_date=now)
Install a cron job that runs this command via ./manage.py cancellation at a regular interval.
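For illustration, a crontab entry along these lines would run the command at the start of every hour; the interpreter and project paths are placeholders that you need to adapt to your setup:
# m h dom mon dow command
0 * * * * /path/to/venv/bin/python /path/to/project/manage.py cancellation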

Related

Python: Calculate time between current time and last login. (Automated Communication)

I'm trying to make a Celery task that would send a basic reminder to our users. In our automated communication project, we have these tasks:
As you can see, a few of the actions are different. For now I have created logic that fetches all the users from the DB and then checks the time difference, but I have only set it up for the "2 hours or more" case. How should I do this correctly? I do not want to repeat each if statement, because that's bad practice. How can I keep it clear and reduce the load on the system?
@app.task
def check_registered_users():
    from apps.users.models import User
    from apps.notifications.models import AutomatedCommunicationNotifications

    day_start = datetime.utcnow().date()
    day_end = day_start + timedelta(days=1)
    users = User.objects.filter(is_active=True, date_joined__range=(day_start, day_end))
    users_that_received_notification = AutomatedCommunicationNotifications.objects.all().values('user__id')
    excluded_users = users.exclude(id__in=users_that_received_notification)
    for user in excluded_users:
        if user.last_login < user.last_login + timedelta(hours=2):
            # Sign-up uncompleted push notification 2 hours after last login
            template = SiteConfiguration.get_solo().automated_comms_signup_uncompleted
            send_plain_email_task(
                email=user.email,
                subject=template.subject,
                html_message=template.content,
                from_email=f'{settings.EMAIL_FROM_PREFIX} <{settings.DEFAULT_FROM_EMAIL}>',
            )
P.S. The AutomatedCommunicationNotifications table is for us to track which users have already received a notification.
class AutomatedCommunicationNotifications(BaseModel):
    """ Model for automated comms notifications """
    user = models.ForeignKey(User, on_delete=models.CASCADE)
    type = models.CharField(
        max_length=255,
        choices=NotificationTypes.get_choices(),
        default=NotificationTypes.EMAIL_NOTIFICATION
    )

    def __str__(self):
        return str(self.user.phone)
You'll have to iterate over your queried users at least once, but here are some tips that may help:
models.py
class User(...):
    # add a field to determine if the user has registered or not;
    # set this to `True` when a User successfully registers:
    is_registered = models.BooleanField(default=False)

class AutomatedCommunicationNotifications(BaseModel):
    # add a related_name for easier coding:
    user = models.ForeignKey(..., related_name='notifications')
tasks.py
# load packages outside of your function so this only runs once on startup:
from datetime import datetime, timedelta

from django.db.models import F
from apps.users.models import User
from apps.notifications.models import AutomatedCommunicationNotifications

@app.task
def check_registered_users():
    # timestamps:
    two_hours_ago = datetime.now() - timedelta(hours=2)
    # query for unregistered users who have not received a notification:
    users = User.objects.filter(
        is_registered=False,
        last_login__lt=two_hours_ago  # last logged in 2 or more hours ago
    ).exclude(
        notifications__type="the type"
    ).prefetch_related(
        'notifications'  # prejoins tables to improve performance
    )
    for user in users:
        # send email
        ...
I would do this with a cron job. You can let it run as often as you want, depending on how soon after the given time frame you want this to be sent.
You start by making a folder in your app:
/django/yourapp/management/commands
There you create a Python file which contains your logic. Make sure to import the right modules, as you would in your views.
from django.core.management.base import BaseCommand, CommandError
from yourapp.models import every, module, you, need
from django.utils import timezone
from datetime import datetime, date, timedelta
from django.core.mail import send_mail, EmailMessage

class Command(BaseCommand):
    help = 'YOUR HELP TEXT FOR INTERNAL USE'

    def handle(self, *args, **options):
        # Your logic
I added the command to the www-data user's crontab like this:
# m h dom mon dow command
45 3 * * * /websites/vaccinatieplanner/venv/bin/python /websites/vaccinatieplanner/manage.py reminder
You can tweak that crontab entry to set the optimal time between checks. If you replace the 3 with a *, the command will run at minute 45 of every hour (i.e. hourly) instead of once a day at 03:45.
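For example, the hourly variant of the same entry (paths as in the crontab above) would be:
# m h dom mon dow command
45 * * * * /websites/vaccinatieplanner/venv/bin/python /websites/vaccinatieplanner/manage.py reminder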

How can I automatically let API keys expire?

I am building an application which uses API keys during sessions. So far I can successfully generate the API keys, check them for validity and for whether they belong to the correct account, and I've also added brute-force protection.
My problem is that I would like to automatically let them expire after 24 hours. Right now I remove old keys when a user requests a new one, to lessen the chance of someone guessing the right key, but this doesn't work for users who don't use the application again.
I was going to achieve this by scheduling a cron job, as I have read others advising. However, the server the application will be hosted on isn't mine, and the person the server actually belongs to doesn't see the need for automatic expiry in the first place. This means I would either like to include it in the code itself somehow, or have a good argument for why he should let me (or do it himself) schedule a cron job.
The table containing the API keys looks as follows:
class DBAuth(db.Model):
    __tablename__ = 'auth'

    id = db.Column(db.Integer, primary_key=True)
    user_id = db.Column(db.Integer, index=True)
    api_key = db.Column(db.String(256))
    begin_date = db.Column(db.DateTime, nullable=False)
And the API key generator is called as follows:
auth = DBAuth()
key = DBAuth.query.filter_by(user_id=user.id).first()
if key is not None:
    db.session.delete(key)
    db.session.commit()
api_key = auth.generate_key(user.id)
db.session.add(auth)
db.session.commit()
With the generator function like this:
def generate_key(self, user_id):
    self.user_id = user_id
    self.api_key = #redacted#
    self.begin_date = datetime.datetime.now()
    return self.api_key
My question is really two parts:
1: Is my colleague right in saying that the automatic expiry isn't necessary? And 2: Is there a way to add automatic expiry in the code instead of scheduling a cron job?
Sorry, I don't have enough rep to comment, but a simple approach would be the following:
Since you already have DateTime objects in your schema, you could add another such column, say "key_expiry_date", that contains the current time plus 24 hours.
You can then use "key_expiry_date" to validate further requests, e.g. as in the sketch below.
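A minimal sketch of that idea, assuming a key_expiry_date = db.Column(db.DateTime) column is added to DBAuth; check_api_key is a hypothetical helper name, not part of the question's code:
import datetime

def generate_key(self, user_id):
    self.user_id = user_id
    self.api_key = ...  # redacted, as in the question
    self.begin_date = datetime.datetime.now()
    # assumed new column: the moment this key stops being valid
    self.key_expiry_date = self.begin_date + datetime.timedelta(hours=24)
    return self.api_key

# hypothetical helper called on every request that presents a key
def check_api_key(user_id, api_key):
    key = DBAuth.query.filter_by(user_id=user_id, api_key=api_key).first()
    if key is None:
        return False
    # reject keys that are past their expiry date
    return key.key_expiry_date > datetime.datetime.now()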

Bulk update in django with calculations

I have 2 models in my project:
class Currency(models.Model):
    title = models.CharField(max_length=100, unique=True)
    value = models.FloatField()

class Good(models.Model):
    name = models.CharField(max_length=100)
    slug = models.SlugField(max_length=100, unique=True)
    cost_to_display = models.IntegerField(default=0)
    cost_in_currency = models.IntegerField()
    currency = models.ForeignKey(Currency)
The idea of such a model is to speed up searching by price and have all goods in one currency.
Therefore I need some hook which will update all Goods whenever an exchange rate is updated.
In raw SQL it looks like this:
mysql> update core_good set cost_to_display = cost_in_currency * (select core_currency.value from core_currency where core_currency.id = currency_id ) ;
Query OK, 663 rows affected (0.10 sec)
Rows matched: 7847 Changed: 663 Warnings: 0
Works pretty fast. However, I tried to implement the same in the Django admin like this (using bulk_update):
def save_model(self, request, obj, form, change):
    """Update rate values"""
    goods = Good.objects.all()
    for good in goods:
        good.cost_to_display = good.cost_in_currency * good.currency.value
    bulk_update(goods)
    obj.save()
It takes up to 20 minutes to update all records via the Django admin this way.
What am I doing wrong? What is the right way to update all the prices?
This is purely untested, but it sort of works in my mind:
from django.db.models import F
Good.objects.all().update(cost_to_display=F('cost_in_currency') * F('currency__value'))
Even though you are calling bulk_update, you still loop through all goods in Python, which is why your process is slow.
Edit:
This won't work, because F() doesn't support joined fields in update(). It can be done with a raw query instead.
For future readers: every access to good.currency in your loop hits the database. Consider using select_related to fetch Currency and Good objects in one query:
goods = Good.objects.select_related('currency')
Also, Django ships with a bulk_update() method on querysets since version 2.2 (docs).
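A sketch of how the loop could look on Django 2.2+ with select_related plus the built-in bulk_update (field and model names are the ones from the question; batch_size is an arbitrary choice):
# fetch goods together with their currency in a single query
goods = list(Good.objects.select_related('currency'))
for good in goods:
    # no extra query here: currency is already loaded
    good.cost_to_display = good.cost_in_currency * good.currency.value
# write the changed rows back in a handful of UPDATE statements
Good.objects.bulk_update(goods, ['cost_to_display'], batch_size=500)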

django model use database server time

I have two separate Django servers and a MySQL server in different locations.
Using a Django model, when I create/update an object (as in the code below), the result is that all 3 columns get the same Django-server local time. How do I get the DB server's local time instead?
my model:
class Test(models.Model):
    id = models.AutoField(primary_key=True, null=False)
    create = models.DateTimeField(auto_now_add=True)
    update = models.DateTimeField(auto_now=True)
    test = models.DateTimeField(null=True)
My code:
y = Test()
y.test = datetime.now()
y.save()
result
id create update test
------ ------------------- ------------------- -------------------
6 2013-10-07 06:57:04 2013-10-07 06:57:04 2013-10-07 06:57:04
If you handle this in Django, Django will set these fields before saving the data to the database. In that case, Django sets the time according to your application server (the server Django runs on).
You must make some settings in your MySQL installation (if you have not set them already). Then you must re-create your tables, or alter them in your database (I guess the Django model definition cannot handle this). This is a how-to doc explaining these settings.
An example SQL CREATE TABLE statement will look like this (create and update are reserved words in MySQL, so they need backticks):
CREATE TABLE test_test (
    `create` DATETIME DEFAULT CURRENT_TIMESTAMP,
    `update` DATETIME DEFAULT 0 ON UPDATE CURRENT_TIMESTAMP
);
If you already have data in your table, then you may execute ALTER statements instead of re-creating the tables.
Then you must edit your models and make the related datetime fields nullable, so MySQL can set them.
class Test(models.Model):
    id = models.AutoField(primary_key=True, null=False)
    create = models.DateTimeField(null=True)
    update = models.DateTimeField(null=True)
    test = models.DateTimeField(null=True)
I think you can't do what you want without setting up the database timezone. Let's see what each of your dates means:
create = models.DateTimeField(auto_now_add=True)
It automatically sets the date to the time when the object is first created.
update = models.DateTimeField(auto_now=True)
It automatically sets the date to the time at which the object is saved, every time it is saved (from either the web process or a command).
test = models.DateTimeField(null=True)
...
y.test = datetime.now()
It sets the date to the current time on the machine where the Django process runs.
To get the database time, you could store the database's timezone in settings and use it whenever necessary. You can't use any automatically generated field for this.
Another solution could be to grab the database's local time manually through a raw query, as sketched below.
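A minimal sketch of that last idea, using Django's low-level database connection (NOW() is MySQL syntax; the database_now helper name is an assumption):
from django.db import connection

def database_now():
    # ask the database server for its current local time
    with connection.cursor() as cursor:
        cursor.execute("SELECT NOW()")
        return cursor.fetchone()[0]

# usage: store the database server's time instead of the Django server's time
y = Test()
y.test = database_now()
y.save()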

Django - how to implement lock data

I have a database table. Some database items can be edited by a user, but only one user can edit the table content at a time, and if after 2 hours the user hasn't finished editing, other users can edit the table. How can I do this?
The table is like this:
class NodeRevision(BaseModel, NodeContent):
    node = models.ForeignKey(Node, related_name='revisions')
    summary = models.CharField(max_length=300)
    revision = models.PositiveIntegerField()
    revised_at = models.DateTimeField(default=datetime.datetime.now)
    suggested = models.BooleanField(default=False)
    suggest_status = models.CharField(max_length=16, default="")
Should I add a BooleanField to it, such as editing_locked = models.BooleanField(default=False)? Or something else? And how could I implement the 2-hour check?
You'd need a locked_at datetime field and a locked_by field.
Every time somebody loads an edit page, update the database with the locked_at and locked_by information.
To implement the 2-hour restriction, I'd calculate the result only when a user asks for permission (as opposed to polling / updating models). When a user tries to edit a model, have it check locked_by/locked_at and return a Boolean indicating whether it's editable by that user or not.
def can_edit(self, user):
    if self.locked_by is None:
        # nobody holds the lock yet
        return True
    elif user == self.locked_by:
        return True
    # the lock expires two hours after it was taken
    elif self.locked_at and (datetime.datetime.now() - self.locked_at).total_seconds() > 2 * 60 * 60:
        return True
    return False
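For completeness, taking the lock could look roughly like this sketch (locked_at/locked_by are the fields proposed above; acquire_lock is a hypothetical method name):
def acquire_lock(self, user):
    # take the lock only if the revision is currently editable by this user
    if self.can_edit(user):
        self.locked_by = user
        self.locked_at = datetime.datetime.now()
        self.save(update_fields=['locked_by', 'locked_at'])
        return True
    return False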
