Dynamic settings.py - python

I have used django-constance as a library. One thing I noticed is that when I try to make ADMINS and MANAGERS dynamic with
CONSTANCE_CONFIG = {
    'ADMINS': ([('Errors', 'admin@gmail.com')], 'Admin Emails'),
}
sending of emails does not work.
For MANAGERS I have tried this:
MANAGER = CONSTANCE_CONFIG['ADMINS'][0]
but sending emails still does not work. Is my implementation wrong? Or can you suggest another library that can override ADMINS and MANAGERS in settings.py? I am using Django 1.8.5 and Python 3.
Importing constance inside settings.py raises an error as well.

1. As you probably know, django-constance does not support tuples. It is hard to pick a form widget for a tuple, especially in your case: ADMINS entries can be added and deleted, so no single widget can make them dynamic (think about all the Django widgets). So CONSTANCE_ADDITIONAL_FIELDS will not work here either.
2. I think you are misunderstanding how django-constance works. It does not reload your Django server, so MANAGER = CONSTANCE_CONFIG['ADMINS'][0] is wrong (even with CONSTANCE_ADDITIONAL_FIELDS): you are reading a constant value at import time, not a dynamic one. You need to access it like
from constance import config
print(config.ADMINS)
3. The default logging config uses the AdminEmailHandler class for mail_admins, which reads the ADMINS value from the Django settings, not from the constance config. So one possible solution is to create your own handler class that reads the admin addresses from the constance config. Change your settings.py to
CONSTANCE_CONFIG = {
    'ADMIN1': ('admin@gmail.com', 'This one will receive errors on 500'),
}  # add as many admins as you want with ADMIN1, ADMIN2, etc. (no tuples)
Then create your own handler class that uses the constance config:
from django.conf import settings
from django.core.mail.message import EmailMultiAlternatives
from django.utils.log import AdminEmailHandler
from constance import config

class ConstanceEmailHandler(AdminEmailHandler):
    def send_mail(self, subject, message, html_message=None, fail_silently=False,
                  *args, **kwargs):
        # Build a list of admin emails here if you have more than one admin.
        mail = EmailMultiAlternatives('%s%s' % (settings.EMAIL_SUBJECT_PREFIX, subject),
                                      message, settings.SERVER_EMAIL, [config.ADMIN1],
                                      connection=self.connection())
        if html_message:
            mail.attach_alternative(html_message, 'text/html')
        mail.send(fail_silently=fail_silently)
Then change your LOGGING config. If you do not have a custom LOGGING setup, I recommend copying the default config from django.utils.log (DEFAULT_LOGGING) and changing mail_admins to
'mail_admins': {
    'level': 'ERROR',
    'filters': ['require_debug_false'],  # change to require_debug_true if you want to test locally
    'class': '<yourproject>.<yourfile>.ConstanceEmailHandler',  # path to the newly created handler class
    'include_html': True,
},
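For reference, a complete LOGGING dict wired to such a handler might look like the sketch below. This is only an illustration: the dotted path myproject.log_handlers.ConstanceEmailHandler is a hypothetical location, so adjust it to wherever you actually put the class.

```python
# Minimal LOGGING sketch using the custom handler described above.
# 'myproject.log_handlers.ConstanceEmailHandler' is an assumed module path.
LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'filters': {
        # Only email admins when DEBUG is False (production behaviour).
        'require_debug_false': {'()': 'django.utils.log.RequireDebugFalse'},
    },
    'handlers': {
        'mail_admins': {
            'level': 'ERROR',
            'filters': ['require_debug_false'],
            'class': 'myproject.log_handlers.ConstanceEmailHandler',
            'include_html': True,
        },
    },
    'loggers': {
        # django.request logs unhandled exceptions (HTTP 500s).
        'django.request': {
            'handlers': ['mail_admins'],
            'level': 'ERROR',
            'propagate': True,
        },
    },
}
```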

Django redis cache cannot access redis cache set outside of django

I'm setting up Redis as a Docker service and connecting to it through the Django cache and the Python redis library.
First:
from django.http import HttpResponse
from django.core.cache import cache

def test_dj_redis(request):
    cache.set('dj_key', 'key_in_dj', 1000)
    print(cache.get('dj_key'))     # works
    print(cache.get('py_key'))     # Nope
    print(cache.get(':1:py_key'))  # No, tried to reverse-engineer the prefix
    return HttpResponse("bla bla bla")
Second:
import redis

r = redis.Redis(
    host='redis',
    port=6379
)

def cache_it():
    r.set('py_key', 'key in py', 1000)
    print(r.get('py_key'))     # works
    print(r.get(':1:dj_key'))  # works
    print(r.keys())            # shows two keys
If I run both of them (the first by refreshing the web page tied to that Django view, the second by calling cache_it() in Python):
In the first, I cannot access 'py_key'; it returns None.
But in the second, I can see the cache entry set in the Django view. The Django cache added a prefix, turning 'dj_key' into ':1:dj_key', but I can access it nonetheless.
Also in the second, r.keys() returns [b'py_key', b':1:dj_key']; the key 'py_key' remained unchanged.
The Django cache setting:
CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.redis.RedisCache',
        'LOCATION': 'redis://redis:6379',
    },
}
Question: how do I use django.core.cache to access Redis cache entries set outside of Django?
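For context, Django composes Redis keys as <KEY_PREFIX>:<version>:<key>, which is exactly why 'dj_key' is stored as ':1:dj_key' and why cache.get('py_key') misses the unprefixed key. One possible workaround (a sketch, not necessarily the best fix for every setup) is a custom KEY_FUNCTION that returns keys unmodified, so both clients agree on names:

```python
# Django's default key function builds "<prefix>:<version>:<key>".
# A KEY_FUNCTION that ignores prefix and version makes Django read and
# write raw key names, matching the plain redis-py client.

def raw_key(key, key_prefix, version):
    # Store keys exactly as given; prefix and version are intentionally unused.
    return key

# settings.py (sketch -- 'myproject.cache_keys.raw_key' is a hypothetical path):
# CACHES = {
#     'default': {
#         'BACKEND': 'django.core.cache.backends.redis.RedisCache',
#         'LOCATION': 'redis://redis:6379',
#         'KEY_FUNCTION': 'myproject.cache_keys.raw_key',
#     },
# }
```

Note the trade-off: without the version segment, cache versioning (cache.set(..., version=2)) no longer distinguishes keys.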

Use Django Oscar Signals

I want to send an email to the admin when an Order is placed (currently only the user who placed the order receives the email). The order_placed Oscar signal could work for me here.
For this I have already forked the order app, and inside this app an order_placed function is created in signals.py. I have also imported signals in config.py, but this order_placed still does not get fired when I place an order from the site.
Can anyone share an example of Oscar signal usage?
Code:
config.py
from oscar.apps.order import config

class OrderConfig(config.OrderConfig):
    name = 'catalogue.order'

    def ready(self):
        from oscar.apps.order import signals
signals.py
from django.db.models.signals import post_save
from django.dispatch import receiver
from oscar.apps.order.models import Order

@receiver(post_save, sender=Order)
def order_placed(*args, **kwargs):
    """
    :param args:
    :param kwargs:
    :return:
    """
    print("i ma here ----------------------")
You don't need signals for this. As part of the payment flow (framework), Oscar provides the view PaymentDetailsView, which in turn implements the mixin OrderPlacementMixin.
In that mixin you'll find the method handle_successful_order, which is the correct place to send the messages and do other things once you are sure the order was placed.
So do not fork the order app; fork the checkout app and override this method in order to do something like this:
from django.conf import settings

class PaymentDetailsView:
    # ...
    def handle_successful_order(self, order):
        # send_mail_to_admin is a helper you define yourself
        send_mail_to_admin(settings.ADMIN_EMAIL_ADDRESS)
        return super(PaymentDetailsView, self).handle_successful_order(order)
If you read the code of this method in Oscar you'll see that this is indeed where Oscar notifies the user about the order that has just been placed.
And of course, we cannot ignore the docstring, which states:
Override this view if you want to perform custom actions when an
order is submitted.
def handle_successful_order(self, order):
    """
    Handle the various steps required after an order has been successfully
    placed.

    Override this view if you want to perform custom actions when an
    order is submitted.
    """
    # Send confirmation message (normally an email)
    self.send_confirmation_message(order, self.communication_type_code)

    # Flush all session data
    self.checkout_session.flush()

    # Save order id in session so thank-you page can load it
    self.request.session['checkout_order_id'] = order.id

    response = HttpResponseRedirect(self.get_success_url())
    self.send_signal(self.request, response, order)
    return response
As @raydel-miranda pointed out, you don't need a signal to send email to the admin when an order is placed by the customer.
Just fork the checkout app using python manage.py oscar_fork_app checkout apps.
Inside the forked checkout app, create a views.py file and override the default Oscar checkout/views.py with this code.
yourproject_name/apps/checkout/views.py
from django.conf import settings
from django.core.mail import send_mail
from oscar.core.loading import get_class

OscarPaymentDetailsView = get_class("checkout.views", "PaymentDetailsView")

# CheckCountryPreCondition is a mixin from this project; drop it if you
# don't have one.
class PaymentDetailsView(CheckCountryPreCondition, OscarPaymentDetailsView):
    def handle_successful_order(self, order):
        print('order created')
        send_mail(
            subject="New Order Needs Attention",
            message='Please log in to the dashboard and complete the order',
            from_email=settings.EMAIL_HOST_USER,
            recipient_list=['admin_email_address_1', 'admin_email_address_2'],  # or [x.email for x in partner.users.all()]
            fail_silently=False,
        )
        return super(PaymentDetailsView, self).handle_successful_order(order)
This may save someone else quality time trying to create signals for sending an admin email when an order is placed.

How to use one db to run the django service and other to fetch data

I have an existing application database from which my web site should only fetch data according to user input. I added the database details in settings.py, ran python manage.py inspectdb, and all 300+ tables came into my models.py file. I was never able to run python manage.py runserver; it threw a million errors. Now I have found a workaround, but I need your opinion on it.
I added a default database to settings.py, and using it I was able to run the server. settings.py looks like this:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': 'mydatabase',
    },
    'user': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'test',
        'USER': 'testuser',
        'PASSWORD': 'readonly',
        'HOST': '10.20.30.40',
        'PORT': '5446',
    }
}
Now, can I access the user db to fetch data from my form? For example, views.py looks like this:
from django.shortcuts import render_to_response
from .models import TestCases
from django.shortcuts import HttpResponse
from django.http import JsonResponse
from django.forms.models import model_to_dict
from django.views.decorators.csrf import csrf_exempt

# Create your views here.
@csrf_exempt
def index(request):
    posts = TestCases.objects.all()[:10]
    return render_to_response('index.html', {'posts': posts})
where TestCases is a class from models.py.
Now when I click the button to retrieve data I get "no such table: test_cases".
models.py looks like
class TestCases(models.Model):
    id = models.BigIntegerField(primary_key=True)
    clientid = models.ForeignKey(Clients, db_column='clientid')
    projectid = models.ForeignKey(Projects, db_column='projectid')

    class Meta:
        managed = False
        db_table = 'test_cases'
What am I doing wrong in getting the data from the user input? Please help.
Use the queryset .using() method. I guess Django is going for the default database.
Try this:
@csrf_exempt
def index(request):
    posts = TestCases.objects.using('user').all()[:10]
    return render_to_response('index.html', {'posts': posts})
With .using() you can specify which database Django will query.
More information in the Django docs.
A better approach, so you don't have to set the database manually on every query:
you can create a custom manager for your objects and force it to use a specific database. For example:
# this will override the default queryset for objects that use this Manager
class UserDatabaseManager(models.Manager):
    def get_queryset(self):
        return super().get_queryset().using('user')

class MyModel(models.Model):
    objects = UserDatabaseManager()
Then, when you use MyModel, you can query as usual and Django will use the user db by default, but only for models that have objects = UserDatabaseManager().
Keep in mind that this is just a simple use of managers; you can have multiple managers and do a lot of cool stuff.
Take a look at the Django managers docs.
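A further option the answers above don't mention (so treat this as a sketch of the general technique, not a recommendation specific to this project) is Django's DATABASE_ROUTERS setting, which can route every query for the legacy models to the 'user' database without touching the queries or managers. Here 'legacy' is a hypothetical app label for the inspectdb-generated models:

```python
# Sketch of a database router: any model in the (hypothetical) 'legacy' app
# is read from and written to the 'user' database; everything else falls
# through to 'default'.

class LegacyRouter:
    route_app_labels = {'legacy'}

    def db_for_read(self, model, **hints):
        if model._meta.app_label in self.route_app_labels:
            return 'user'
        return None  # None means "no opinion" -- Django tries the next router

    def db_for_write(self, model, **hints):
        if model._meta.app_label in self.route_app_labels:
            return 'user'
        return None

    def allow_migrate(self, db, app_label, **hints):
        # The legacy tables are unmanaged; never create or migrate them.
        if app_label in self.route_app_labels:
            return False
        return None

# settings.py (sketch -- 'myproject.routers.LegacyRouter' is a hypothetical path):
# DATABASE_ROUTERS = ['myproject.routers.LegacyRouter']
```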
First of all you need to run
python manage.py inspectdb > models.py
to store your models in models.py.
By default, inspectdb creates unmanaged models. That is, managed = False in the model's Meta class tells Django not to manage the table's creation, modification, and deletion. If you do want Django to manage the table's lifecycle, change the managed option to True (or simply remove it, because True is its default value).
Next, run the migrate command to install any extra needed database tables:
python manage.py migrate
See the docs on integrating legacy databases.

Django: proper way to reload template library

I am using custom template tags to show site-specific content. The templates for these tags are stored in the database, and staff can edit them. I used this SO answer as a reference. Each time a template is changed and saved to the database, I have to force a reload of the Django app (by restarting the uwsgi service) or the changes don't become visible. It looks like Django populates the template library for custom tags only on the first load of the app.
I think reloading the Django app isn't the best way to refresh DB templates, and I am searching for a way to do it programmatically (from the model's save() or a save signal).
Template tags code:
from django.template import Template, Library
from project.custom_templates.models import DbTemplate

register = Library()

# template names
SHOW_PART_TABLE_ADMIN = 'custom/tags/show_part_table_admin.html'
SHOW_PART_TABLE_PUBLIC = 'custom/tags/show_part_table_public.html'

t_show_part_table_admin = Template(DbTemplate.objects.get(name=SHOW_PART_TABLE_ADMIN).content)
t_show_part_table_public = Template(DbTemplate.objects.get(name=SHOW_PART_TABLE_PUBLIC).content)

@register.inclusion_tag(t_show_part_table_admin)
def show_part_table_admin(parts, user, perms):
    return {'parts': parts, 'user': user, 'perms': perms}

@register.inclusion_tag(t_show_part_table_public)
def show_part_table_public(parts, user, perms):
    return {'parts': parts, 'user': user, 'perms': perms}
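The root cause in the code above is that the Template objects are built once, at module import time. One way out (a stripped-down sketch of the idea, not a drop-in Django answer: a plain dict stands in for the DbTemplate table and stdlib string.Template for Django's Template) is to fetch the template source at call time, so edits in the store take effect on the very next render with no app reload:

```python
# Build the template at call time instead of import time. In the real tag you
# would replace the dict lookup with DbTemplate.objects.get(name=...).content
# and string.Template with django.template.Template.
from string import Template

template_store = {'greeting': 'Hello, $name!'}  # stands in for the DbTemplate table

def render_greeting(name):
    # Fetched fresh on every call -- no stale module-level Template object.
    source = template_store['greeting']
    return Template(source).substitute(name=name)
```

The cost is one extra store lookup (a DB query, in the Django case) per render; if that matters, cache the compiled Template and invalidate the cache entry from the model's save().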

Defining Constants in Django

I want to have some constants in a Django project. For example, let's say a constant called MIN_TIME_TEST.
I would like to be able to access this constant in two places: from within my Python code, and from within any Templates.
What's the best way to go about doing this?
EDIT:
To clarify, I know about Template Context Processors and about just putting things in settings.py or some other file and just importing.
My question is, how do I combine the two approaches without violating the "Don't Repeat Yourself" rule? Based on the answers so far, here's my approach:
I'd like to create a file called global_constants.py, which will have a list of constants (things like MIN_TIME_TEST = 5). I can import this file into any module to get the constants.
But now, I want to create the context processor which returns all of these constants. How can I go about doing this automatically, without having to list them again in a dictionary, like in John Mee's answer?
Both Luper and Vladimir are correct, IMHO, but you'll need both in order to meet your requirements.
The constants don't need to be in settings.py; you could put them anywhere and import them from that place into your view/model/module code. I sometimes put them in the __init__.py if I don't consider them globally relevant.
A context processor like this will ensure that selected variables are globally in the template scope:
def settings(request):
    """
    Put selected settings variables into the default template context
    """
    from django.conf import settings
    return {
        'DOMAIN': settings.DOMAIN,
        'GOOGLEMAPS_API_KEY': settings.GOOGLEMAPS_API_KEY,
    }
But this might be overkill if you're new to Django; perhaps you're just asking how to put variables into the template scope?
from django.conf import settings
...
# do stuff with settings.MIN_TIME_TEST as you wish
return render_to_response("the_template.html", {
    "MIN_TIME_TEST": settings.MIN_TIME_TEST,
}, context_instance=RequestContext(request))
To build on other people's answers, here's a simple way you'd implement this:
In your settings file:
GLOBAL_SETTINGS = {
    'MIN_TIME_TEST': 'blah',
    'RANDOM_GLOBAL_VAR': 'blah',
}
Then, building off of John Mee's context processor:
def settings(request):
    """
    Put selected settings variables into the default template context
    """
    from django.conf import settings
    return settings.GLOBAL_SETTINGS
This will resolve the DRY issue.
Or, if you only plan to use the global settings occasionally and want to call them from within the view:
def view_func(request):
    from django.conf import settings
    # function code here
    ctx = {}  # context variables here
    ctx.update(settings.GLOBAL_SETTINGS)
    # whatever output you want here
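One detail the answers above leave implicit: a context processor only runs if it is registered in settings. A sketch of the registration, assuming the processor lives at the hypothetical dotted path myapp.context_processors.settings (this is the TEMPLATES form used by Django 1.8 and later; older versions used the TEMPLATE_CONTEXT_PROCESSORS tuple instead):

```python
# settings.py sketch: register the custom context processor so it fires on
# every RequestContext render. The 'myapp.context_processors.settings' path
# is an assumed location for the processor defined in the answer above.
TEMPLATES = [
    {
        'BACKEND': 'django.template.backends.django.DjangoTemplates',
        'DIRS': [],
        'APP_DIRS': True,
        'OPTIONS': {
            'context_processors': [
                'django.template.context_processors.request',
                'myapp.context_processors.settings',  # custom processor
            ],
        },
    },
]
```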
Consider putting it into the settings.py of your application. Of course, in order to use it in a template you will need to make it available to the template like any other usual variable.
Use context processors to have your constants available in all templates (settings.py is a nice place to define them, as Vladimir said).
Context processors are better suited to handling more dynamic object data: they're defined as a mapping in the documentation, and in many of the posts here they're being modified or passed around to views. It doesn't make sense that a template may lose access to global information because, for example, you forgot to use a specialized context processor in the view. The data is global by definition, and that couples the view to the template.
A better way is to define a custom template tag. This way:
templates aren't relying on views to have global information passed into them
it's DRY-er: the app defining the global settings can be exported to many projects, eliminating common code across projects
templates decide whether they have access to the global information, not the view functions
In the example below I deal with your problem--loading in this MIN_TIME_TEST variable--and a problem I commonly face, loading in URLs that change when my environment changes.
I have 4 environments--2 dev and 2 production:
Dev: django-web server, url: localhost:8000
Dev: apache web server: url: sandbox.com -> resolves to 127.0.0.1
Prod sandbox server, url: sandbox.domain.com
Prod server: url: domain.com
I do this on all my projects and keep all the urls in a file, global_settings.py, so they're accessible from code. I define a custom template tag {% site_url %} that can (optionally) be loaded into any template.
I create an app called global_settings and make sure it's included in my settings.INSTALLED_APPS tuple.
Django compiles templated text into nodes with a render() method that tells how the data should be displayed. I created an object that renders data by returning values from my global_settings.py based on the name passed in.
It looks like this:
from django import template
import global_settings

class GlobalSettingNode(template.Node):
    def __init__(self, settingname):
        self.settingname = settingname

    def render(self, context):
        if hasattr(global_settings, self.settingname):
            return getattr(global_settings, self.settingname)
        else:
            raise template.TemplateSyntaxError('%s tag does not exist' % self.settingname)
Now, in global_settings.py I register a couple of tags: site_url for my example and min_time_test for yours. This way, when {% min_time_test %} is invoked from a template, it calls get_min_time_test, which resolves to the value 5. In my example, {% site_url %} does a name-based lookup so that I can keep all 4 URLs defined at once and choose which environment I'm using. This is more flexible for me than just using Django's built-in settings.DEBUG=True/False flag.
from django import template
from templatenodes import GlobalSettingNode

register = template.Library()

MIN_TIME_TEST = 5

DEV_DJANGO_SITE_URL = 'http://localhost:8000/'
DEV_APACHE_SITE_URL = 'http://sandbox.com/'
PROD_SANDBOX_URL = 'http://sandbox.domain.com/'
PROD_URL = 'http://domain.com/'

CURRENT_ENVIRONMENT = 'DEV_DJANGO_SITE_URL'

def get_site_url(parser, token):
    return GlobalSettingNode(CURRENT_ENVIRONMENT)

def get_min_time_test(parser, token):
    return GlobalSettingNode('MIN_TIME_TEST')

register.tag('site_url', get_site_url)
register.tag('min_time_test', get_min_time_test)
Note that for this to work, Django expects global_settings.py to be located in a Python package called templatetags under your Django app. My Django app here is called global_settings, so my directory structure looks like:
/project-name/global_settings/templatetags/global_settings.py
etc.
Finally the template chooses whether to load in global settings or not, which is beneficial for performance. Add this line to your template to expose all the tags registered in global_settings.py:
{% load global_settings %}
Now, other projects that need MIN_TIME_TEST or these environments exposed can simply install this app =)
In the context processor you can use something like:
import settings

context = {}
for item in dir(settings):
    # use some way to exclude __doc__, __name__, etc.
    if item[0:2] != '__':
        context[item] = getattr(settings, item)
A variant on John Mee's last part, with a little elaboration on the same idea Jordan Reiter discusses.
Suppose you have something in your settings akin to what Jordan suggested -- in other words, something like:
GLOBAL_SETTINGS = {
    'SOME_CONST': 'thingy',
    'SOME_OTHER_CONST': 'other_thingy',
}
Suppose further that you already have a dictionary with some of the variables you'd like to pass to your template, perhaps passed as arguments to your view. Let's call it my_dict, and suppose you want the values in my_dict to override those in the settings.GLOBAL_SETTINGS dictionary.
You might do something in your view like:
def my_view(request, *args, **kwargs):
    from django.conf import settings
    my_dict = some_kind_of_arg_parsing(*args, **kwargs)
    tmp = settings.GLOBAL_SETTINGS.copy()
    tmp.update(my_dict)
    my_dict = tmp
    return render_to_response('the_template.html', my_dict, context_instance=RequestContext(request))
This lets you have the settings determined globally, available to your templates, and doesn't require you to manually type out each of them.
If you don't have any additional variables to pass to the template, nor any need to override, you can just do:
return render_to_response('the_template.html', settings.GLOBAL_SETTINGS, context_instance=RequestContext(request))
The main difference between what I'm discussing here and what Jordan has is that in his version, settings.GLOBAL_SETTINGS overrides anything it may have in common with your context dictionary, while in mine, the context dictionary overrides settings.GLOBAL_SETTINGS. YMMV.
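To make that override order concrete, here is a tiny pure-Python check of the merge described above (values are placeholders):

```python
# copy()-then-update(): keys present in the view's dict win over the global
# defaults, while untouched globals pass through to the template context.
GLOBAL_SETTINGS = {'MIN_TIME_TEST': 5, 'SOME_CONST': 'thingy'}
my_dict = {'MIN_TIME_TEST': 10}  # view-supplied value overrides the default

merged = GLOBAL_SETTINGS.copy()
merged.update(my_dict)
# merged == {'MIN_TIME_TEST': 10, 'SOME_CONST': 'thingy'}
```

Reversing the two dicts (GLOBAL_SETTINGS last) would give Jordan's behavior instead, with the globals winning.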