Single model dynamic database settings in Django

For example, assume that I have 100 clients who use WordPress, and I have to write a service in Django that returns the list of posts from each client's WordPress MySQL DB. The problem is that the 100 clients all have different database connection settings.
I know that I can use a DatabaseRouter to switch between databases that are already loaded in settings, but I don't know how to make a single model class use different database settings.
I have tried mutating settings.
I also tried mutating the model's app_label.
But I later understood that mutating anything in Django is pointless.
My Requirements
I want to create one model and dynamically change its database connection. The list of connections can live in a managed database table, but I don't want to unnecessarily load all the connection settings up front or create multiple model classes.

I made something like that, but for switching MongoDB connections.
I created a generic view that selects the connection and uses it in get_queryset.
I'm using Django REST Framework, so I made something like this:
class SwitchDBMixinView(object):
    model = None
    fields = None

    def initial(self, request, *args, **kwargs):
        result = super().initial(request, *args, **kwargs)
        if request.user.is_authenticated():
            request.user.database_connection.register()
        return result

    def get_object(self, *args, **kwargs):
        return super().get_object(*args, **kwargs).switch_db(self.get_db_alias())

    def get_db_alias(self):
        if self.request is None or not self.request.user.is_authenticated():
            return DEFAULT_CONNECTION_NAME
        return self.request.user.database_connection.name

    def get_queryset(self):
        return self.model.objects.using(self.get_db_alias()).all()

    def perform_destroy(self, instance):
        instance.switch_db(self.get_db_alias()).delete()
The model:
from mongoengine.connection import register_connection, get_connection

AUTH_USER_MODEL = getattr(settings, 'AUTH_USER_MODEL')

class Connection(models.Model):
    class Meta:
        pass

    owner = models.OneToOneField(
        AUTH_USER_MODEL,
        related_name='database_connection',
    )
    uri = models.TextField(
        default=DefaultMongoURI()
    )

    def register(self):
        register_connection(
            self.name,
            host=self.uri,
            tz_aware=True,
        )
        get_connection(
            self.name,
            reconnect=True,
        )

    def get_name(self):
        return 'client-%d' % self.owner.pk

    name = property(get_name)

    def __str__(self):
        return self.uri

You may want to have a look at django.db.connections (in django/db/__init__.py) and django.db.utils.ConnectionHandler (of which django.db.connections is an instance). This should let you dynamically add new database configs without hacking settings.DATABASES (ConnectionHandler actually builds its _databases attribute from settings.DATABASES). I can't tell for sure since I've never tried it, but it should mostly boil down to:
from django import db

def add_db(alias, connection_infos):
    databases = db.connections.databases
    if alias in databases:
        # either raise or log and ignore -- your choice
        raise ValueError("alias %r already registered" % alias)
    db.connections.databases[alias] = connection_infos
where connection_infos is a mapping similar to the ones expected in settings.DATABASES.
Then it's mostly a matter of using QuerySet.using(alias) for your queries, i.e.:
alias = get_alias_for_user(request.user)
posts = Post.objects.using(alias).all()
cf https://docs.djangoproject.com/en/1.11/topics/db/multi-db/#manually-selecting-a-database
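Putting the pieces together, here is a minimal, untested sketch of how the per-client lookup could work (the Client model and its fields are hypothetical, purely for illustration):

from django import db

def get_alias_for_user(user):
    # Hypothetical: each user is linked to a Client row holding
    # that client's WordPress DB coordinates.
    client = user.client
    alias = 'client-%d' % client.pk
    if alias not in db.connections.databases:
        db.connections.databases[alias] = {
            'ENGINE': 'django.db.backends.mysql',
            'NAME': client.db_name,
            'HOST': client.db_host,
            'USER': client.db_user,
            'PASSWORD': client.db_password,
            'PORT': '3306',
        }
    return alias

Then, in a view, Post.objects.using(get_alias_for_user(request.user)) queries that client's database.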
The main problem with this IMHO (assuming you manage to make something that works out of the untested suggestion above) is that you will have to store database usernames and passwords in clear text somewhere, which can be a major security issue. I don't know how much control you have over the databases' admin side, but it would be better if you could add a 'django' user with the same password (and appropriate permissions, of course) on all those databases, so you can keep the password in your settings file instead of having to keep it in your main DB.
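As a sketch of that idea (the Client fields are again hypothetical), the per-client rows then only need to store the host and database name, while the shared credentials stay in settings:

from django.conf import settings

def connection_infos_for(client):
    # Start from the shared template in settings.py and override only
    # the per-client parts; USER/PASSWORD never touch the main DB.
    infos = settings.DATABASES['default'].copy()
    infos.update({
        'NAME': client.db_name,
        'HOST': client.db_host,
    })
    return infos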


Indirect way of using Model.objects.all() in a formset

I'm using something like this to populate inline formsets for an update view:
formset = inlineformset_factory(Client, Groupe_esc, form=GroupEscForm, formset=BaseGroupEscInlineFormset, extra=len(Groupe.objects.all()))
(Basically I need as many extra forms as there are entries in that table, for some special processing I'm doing in class BaseGroupEscInlineFormset(BaseInlineFormSet).)
That all works fine, BUT if I pull my code and try to makemigrations in order to set up a brand-new DB, that line apparently fails some Django checks and throws a "no such table (Groupe)" error, so I cannot makemigrations. Commenting that line out solves the issue (then I can uncomment it after making migrations). But that's not exactly best programming practice.
So I need a way to achieve the same result (determine the number of extra forms based on the content of the Groupe table) but without triggering the Django check that fails. I'm unsure if the answer is Django-ic or Pythonic.
E.g. perhaps I could do some Python hack that lets me reference the class without actually importing Groupe, so I can do my_hacky_groupe_import.objects.all(), and maybe that wouldn't trigger the error?
EDIT:
In forms.py:
from .models import Client, Groupe

class BaseGroupEscInlineFormset(BaseInlineFormSet):
    def get_form_kwargs(self, index):
        """This BaseInlineFormSet method returns the kwargs provided to the form.
        In this case the kwargs are provided to the GroupeEscForm constructor.
        """
        kwargs = super().get_form_kwargs(index)
        try:
            group_details = kwargs['group_details'][index]
        except Exception:  # likely this is a POST, but the data is already in the form
            group_details = []
        return {'group_details': group_details}

GroupeEscFormset = inlineformset_factory(Client, Groupe_esc,
                                         form=GroupeEscForm,
                                         formset=BaseGroupEscInlineFormset,
                                         extra=len(Groupe.objects.all()),
                                         can_delete=False)
The issue, as already outlined, is that your code runs at module level, so it executes a query before the migrations have been applied, giving you an error.
One solution, as I already pointed out in the comments, is to create the formset class inside a view, for example:
def some_view(request):
    GroupeEscFormset = inlineformset_factory(
        Client,
        Groupe_esc,
        form=GroupeEscForm,
        formset=BaseGroupEscInlineFormset,
        extra=len(Groupe.objects.all()),
        can_delete=False,
    )
Or, if you want to optimize and keep this line at module level so the formset class isn't recreated on every request, you can override the __init__ method and accept extra as an argument (basically your indirect way of calling Model.objects.all()):
class BaseGroupEscInlineFormset(BaseInlineFormSet):
    def __init__(self, *args, extra=3, **kwargs):
        self.extra = extra
        super().__init__(*args, **kwargs)
    ...

GroupeEscFormset = inlineformset_factory(Client, Groupe_esc,
                                         form=GroupeEscForm,
                                         formset=BaseGroupEscInlineFormset,
                                         can_delete=False)

# In your views:
def some_view(request):
    # count() is better if the queryset is only needed for its count
    formset = GroupeEscFormset(..., extra=Groupe.objects.count())

Using Python 3.7 contextvars to pass state between Django views

I'm building a single database/shared schema multi-tenant application using Django 2.2 and Python 3.7.
I'm attempting to use the new contextvars api to share the tenant state (an Organization) between views.
I'm setting the state in a custom middleware like this:
# tenant_middleware.py
from organization.models import Organization
import contextvars
import tenant.models as tenant_model

tenant = contextvars.ContextVar('tenant', default=None)

class TenantMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        response = self.get_response(request)
        user = request.user
        if user.is_authenticated:
            organization = Organization.objects.get(
                organizationuser__is_current_organization=True,
                organizationuser__user=user)
            tenant_object = tenant_model.Tenant.objects.get(organization=organization)
            tenant.set(tenant_object)
        return response
I'm using this state by having my app's models inherit from a TenantAwareModel like this:
# tenant_models.py
from django.contrib.auth import get_user_model
from django.db import models
from django.db.models.signals import pre_save
from django.dispatch import receiver
from organization.models import Organization
from tenant_middleware import tenant

User = get_user_model()

class TenantManager(models.Manager):
    def get_queryset(self, *args, **kwargs):
        tenant_object = tenant.get()
        if tenant_object:
            return super(TenantManager, self).get_queryset(*args, **kwargs).filter(tenant=tenant_object)
        else:
            return None

@receiver(pre_save)
def pre_save_callback(sender, instance, **kwargs):
    tenant_object = tenant.get()
    instance.tenant = tenant_object

class Tenant(models.Model):
    organization = models.ForeignKey(Organization, null=False, on_delete=models.CASCADE)

    def __str__(self):
        return self.organization.name

class TenantAwareModel(models.Model):
    tenant = models.ForeignKey(
        Tenant,
        on_delete=models.CASCADE,
        related_name='%(app_label)s_%(class)s_related',
        related_query_name='%(app_label)s_%(class)ss')

    objects = models.Manager()
    tenant_objects = TenantManager()

    class Meta:
        abstract = True
In my application the business logic can then retrieve querysets using .tenant_objects... on a model class rather than .objects...
The problem I'm having is that it doesn't always work - specifically in these cases:
In my login view, after login() is called, the middleware runs and I can see the tenant is set correctly. When I redirect from my login view to my home view, however, the state is (initially) empty again, and it only seems to get set properly after the home view executes. If I reload the home view, everything works fine.
If I log out and then log in again as a different user, the state from the previous user is retained, again until I do a reload of the page. This seems related to the previous issue; it almost seems like the state is lagging (for lack of a better word).
I use Celery to spin off shared_tasks for processing. I have to manually pass the tenant to these, as they don't pick up the context.
Questions:
Am I doing this correctly?
Do I need to manually reload the state somehow in each module?
Frustrated, as I can find almost no examples of doing this and very little discussion of contextvars. I'm trying to avoid passing the tenant around manually everywhere or using threading.local.
Thanks.
You're only setting the context after the response has been generated, which means the view always sees whatever the previous request stored; that's why it lags. You probably want to set the context before calling get_response, then check afterwards whether the user has changed.
Note, though, that I'm not really sure this will ever work exactly how you want. Context vars are by definition local; but in an environment like Django you can never guarantee that consecutive requests from the same user will be served by the same server process, and similarly one process can serve requests from multiple users. Plus, as you've noted, Celery is yet another separate process, which won't share the context.
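For illustration, a minimal reworking of the middleware's __call__ along those lines, reusing the names from the question's code (untested sketch):

def __call__(self, request):
    user = request.user
    if user.is_authenticated:
        organization = Organization.objects.get(
            organizationuser__is_current_organization=True,
            organizationuser__user=user)
        tenant.set(tenant_model.Tenant.objects.get(organization=organization))
    else:
        # Reset the state so an anonymous request never sees
        # the previous user's tenant.
        tenant.set(None)
    return self.get_response(request)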

Correct way to register flask admin views with application factory

I am using an application factory to add views to my Flask application, like so
(this is not my actual application factory; it has been shortened for brevity):
def create_app(config_name='default'):
    app = Flask(__name__, template_folder="templates", static_folder='static')
    admin_instance = Admin(app, name='Admin')
    admin_instance.add_view(EntityAdmin(Entity, db.session))
My EntityAdmin class looks like this:
class EntityAdmin(ModelView):
    column_filters = [
        MyCustomFilter(column=None, name='Custom')
    ]
My custom filter looks like this:
class MyCustomFilter(BaseSQLAFilter):
    def get_options(self, view):
        entities = Entity.query.filter(Entity.active == True).all()
        return [(entity.id, entity.name) for entity in entities]
The problem is that the get_options function is called when the app is instantiated, so a select query runs every time create_app gets called.
So if I update my database schema and run the flask db migrate command, I get an error, because the new column I added does not exist yet when the select query runs. The query fails because my models are no longer in sync with the actual database.
Can I register my views only when an actual HTTP request is made? How can I differentiate between a request and a command?
You have one more problem with this filter: its options are created at application instantiation, so if the list of entities changes while the application is running, the filter will still return the same stale list of options.
To fix both problems you don't need to postpone view registration. You need the filter to fetch the list of options every time it is used.
This SO answer to the question "Resetting generator object in Python" describes a way to reuse a generator (in your case, a database query):
from flask import has_app_context

def get_entities():
    # has_app_context is used to prevent database access
    # when the application is not ready yet
    if has_app_context():
        for entity in Entity.query.filter(Entity.active.is_(True)):
            yield entity.id, entity.name

class ReloadingIterator:
    def __init__(self, iterator_factory):
        self.iterator_factory = iterator_factory

    def __iter__(self):
        return self.iterator_factory()

class MyCustomFilter(BaseSQLAFilter):
    def get_options(self, view):
        # This will return a generator which is
        # reloaded every time it is used
        return ReloadingIterator(get_entities)
The problem is that the query to the Entity table can be run multiple times during a request, so I usually cache the result for a single request using Flask globals:
from flask import g

def get_entities():
    if has_app_context():
        if not hasattr(g, 'entities'):
            query = Entity.query.filter(Entity.active.is_(True))
            g.entities = [(entity.id, entity.name) for entity in query]
        for entity_id, entity_name in g.entities:
            yield entity_id, entity_name

How to programmatically generate the CREATE TABLE SQL statement for a given model in Django?

I need to programmatically generate the CREATE TABLE statement for a given unmanaged model in my Django app (managed = False).
Since I'm working with a legacy database, I don't want to create a migration and use sqlmigrate.
The ./manage.py sql command was useful for this purpose, but it was removed in Django 1.8.
Do you know of any alternatives?
As suggested, I am posting a complete answer for the case the question might imply.
Suppose you have an external DB table that you have decided to access as a Django model, and have therefore described it as an unmanaged model (Meta: managed = False).
Later you need to be able to create it in your code, e.g. for some tests using your local DB. Obviously, Django doesn't make migrations for unmanaged models and therefore won't create them in your test DB.
This can be solved using Django APIs, without resorting to raw SQL: SchemaEditor. See a more complete example below, but as a short answer you would use it like this:
from django.db import connections

with connections['db_to_create_a_table_in'].schema_editor() as schema_editor:
    schema_editor.create_model(YourUnmanagedModelClass)
A practical example:
# your_app/models/your_model.py
from django.db import models

class IntegrationView(models.Model):
    """A read-only model to access a view in some external DB."""

    class Meta:
        managed = False
        db_table = 'integration_view'

    name = models.CharField(
        db_column='object_name',
        max_length=255,
        primary_key=True,
        verbose_name='Object Name',
    )
    some_value = models.CharField(
        db_column='some_object_value',
        max_length=255,
        blank=True,
        null=True,
        verbose_name='Some Object Value',
    )

    # Depending on the situation it might be a good idea to redefine
    # some methods as a NOOP as a safety net.
    # Note that it's not completely safe this way, but it might help
    # with some silly mistakes in user code.
    def save(self, *args, **kwargs):
        """Prevent data modification."""
        pass

    def delete(self, *args, **kwargs):
        """Prevent data deletion."""
        pass
Now, suppose you need to be able to create this model via Django, e.g. for some tests.
# your_app/tests/some_test.py
# This import gives access to the `SchemaEditor` for the DB
from django.db import connections
from django.test import TestCase

from your_app.models.your_model import IntegrationView

class SomeLogicTestCase(TestCase):
    """Tests some logic that uses `IntegrationView`."""

    # Since the `IntegrationView` is assumed to be read-only for the case
    # being described, it's a good idea to put the setup logic in the class
    # setup fixture, which will run only once for the whole test case.
    @classmethod
    def setUpClass(cls):
        """Prepares `IntegrationView` mock data for the test case."""
        # This is the actual part that will create the table in the DB
        # for the unmanaged model (any model in fact, but managed models
        # will have their tables created already by the Django testing
        # framework).
        # Note: here we're able to choose which DB, defined in your
        # settings, will be used to create the table.
        with connections['external_db'].schema_editor() as schema_editor:
            schema_editor.create_model(IntegrationView)
        # That's all you need; after the execution of these statements
        # a DB table for `IntegrationView` will be created in the DB
        # defined as `external_db`.

        # Now suppose we need to add some mock data...
        # Again, if we consider the table to be read-only, the data can be
        # defined here; otherwise it's better to do it in the `setUp()` method.
        # Remember `IntegrationView.save()` is overridden as a NOOP, so simple
        # calls to `IntegrationView.save()` or `IntegrationView.objects.create()`
        # won't do anything, so we need to "Improvise. Adapt. Overcome."

        # One way is to use the `save()` method of the base class,
        # but provide the instance of our class:
        integration_view = IntegrationView(
            name='Biggus Dickus',
            some_value='Something really important.',
        )
        super(IntegrationView, integration_view).save(using='external_db')

        # Another is to use `bulk_create()`, which doesn't use `save()`
        # internally, and in fact is a better solution if we're creating
        # many records:
        IntegrationView.objects.using('external_db').bulk_create([
            IntegrationView(
                name='Sillius Soddus',
                some_value='Something important',
            ),
            IntegrationView(
                name='Naughtius Maximus',
                some_value='Whatever',
            ),
        ])

    # Don't forget to clean up after:
    @classmethod
    def tearDownClass(cls):
        with connections['external_db'].schema_editor() as schema_editor:
            schema_editor.delete_model(IntegrationView)

    def test_some_logic_using_data_from_integration_view(self):
        self.assertTrue(IntegrationView.objects.using('external_db').filter(
            name='Biggus Dickus',
        ))
To make the example more complete: since we're using multiple DBs (default and external_db), Django will try to run migrations on both of them for the tests, and as of now there's no option in the DB settings to prevent this, so we have to use a custom DB router for testing.
# your_app/tests/base.py
class PreventMigrationsDBRouter:
    """DB router to prevent migrations for specific DBs during tests."""

    _NO_MIGRATION_DBS = {'external_db'}

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        """Disallow migrations for specific DBs."""
        return db not in self._NO_MIGRATION_DBS
And a test settings file example for the described case:
# settings/test.py
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.oracle',
        'NAME': 'db_name',
        'USER': 'username',
        'HOST': 'localhost',
        'PASSWORD': 'password',
        'PORT': '1521',
    },
    # In production we would have settings here to connect to the external
    # DB, but for testing purposes we can get by with an SQLite DB
    'external_db': {
        'ENGINE': 'django.db.backends.sqlite3',
    },
}

# It isn't necessary to use a router in the production config, since if the
# DB is unspecified for some action, Django will use the `default` DB
DATABASE_ROUTERS = ['your_app.tests.base.PreventMigrationsDBRouter']
Hope this detailed, new-Django-user-friendly example helps someone and saves them some time.
Unfortunately there seems to be no easy way to do this, but luckily for you I have just succeeded in producing a working snippet by digging into the internals of the Django migrations jungle.
Just:
save the code to get_sql_create_table.py (for example)
do $ export DJANGO_SETTINGS_MODULE=yourproject.settings
launch the script with python get_sql_create_table.py yourapp.yourmodel
and it should output what you need.
Hope it helps!
import django
django.setup()

from django.db.migrations.state import ModelState
from django.db.migrations import operations
from django.db.migrations.migration import Migration
from django.db import connections
from django.db.migrations.state import ProjectState

def get_create_sql_for_model(model):
    model_state = ModelState.from_model(model)

    # Create a fake migration with the CreateModel operation
    cm = operations.CreateModel(name=model_state.name, fields=model_state.fields)
    migration = Migration("fake_migration", "app")
    migration.operations.append(cm)

    # Let the migration framework think that the project is in an initial state
    state = ProjectState()

    # Get the SQL through the schema_editor bound to the connection
    connection = connections['default']
    with connection.schema_editor(collect_sql=True, atomic=migration.atomic) as schema_editor:
        state = migration.apply(state, schema_editor, collect_sql=True)

    # Return the CREATE TABLE statement
    return "\n".join(schema_editor.collected_sql)

if __name__ == "__main__":
    import importlib
    import sys

    if len(sys.argv) < 2:
        print("Usage: {} <app.model>".format(sys.argv[0]))
        sys.exit(100)

    app, model_name = sys.argv[1].split('.')
    models = importlib.import_module("{}.models".format(app))
    model = getattr(models, model_name)
    rv = get_create_sql_for_model(model)
    print(rv)
For Django v4.1.3, the above get_create_sql_for_model source code changes like this:
from django.db.migrations.state import ModelState
from django.db.migrations import operations
from django.db.migrations.migration import Migration
from django.db import connections
from django.db.migrations.state import ProjectState

def get_create_sql_for_model(model):
    model_state = ModelState.from_model(model)
    table_name = model_state.options['db_table']

    # Create a fake migration with the CreateModel operation
    cm = operations.CreateModel(name=model_state.name, fields=model_state.fields.items())
    migration = Migration("fake_migration", "app")
    migration.operations.append(cm)

    # Let the migration framework think that the project is in an initial state
    state = ProjectState()

    # Get the SQL through the schema_editor bound to the connection
    connection = connections['default']
    with connection.schema_editor(collect_sql=True, atomic=migration.atomic) as schema_editor:
        state = migration.apply(state, schema_editor, collect_sql=True)

    # Keep only the actual statements, skipping SQL comments
    sqls = schema_editor.collected_sql
    items = []
    for sql in sqls:
        if sql.startswith('--'):
            continue
        items.append(sql)
    return table_name, items
I used it to create all the tables (like the syncdb command of old Django versions):
for app in settings.INSTALLED_APPS:
    app_name = app.split('.')[0]
    app_models = apps.get_app_config(app_name).get_models()
    for model in app_models:
        table_name, sqls = get_create_sql_for_model(model)
        if settings.DEBUG:
            s = "SELECT COUNT(*) AS c FROM sqlite_master WHERE name = '%s'" % table_name
        else:
            s = "SELECT COUNT(*) AS c FROM information_schema.TABLES WHERE table_name='%s'" % table_name
        rs = select_by_raw_sql(s)
        if not rs[0]['c']:
            for sql in sqls:
                exec_by_raw_sql(sql)
            print('CREATE TABLE DONE: %s' % table_name)
The full source code can be found at: Django syncdb command came back for v4.1.3 version

Django models: Only permit one entry in a model?

I want to make some of my Django global settings configurable through the admin interface.
To that end, I've decided to set them as database fields, rather than in settings.py.
These are the settings I care about:
class ManagementEmail(models.Model):
    librarian_email = models.EmailField()
    intro_text = models.CharField(max_length=1000)
    signoff_text = models.CharField(max_length=1000)
These are one-off global settings, so I only ever want there to be a single librarian_email, intro_text, etc. floating around the system.
Is there a way I can prevent admin users from adding new records here, without preventing them from editing the existing record?
I guess I can do this by writing a custom admin template for this model, but I'd like to know if there's a neater way to configure this.
Could I use something other than a model class, for example?
Thanks!
Please see this question on "keep[ing] settings in database", where the answer seems to be django-dbsettings
Update
Just thought of another option: you can create the following model:
from django.contrib.sites.models import Site

class ManagementEmail(models.Model):
    site = models.OneToOneField(Site)
    librarian_email = models.EmailField()
    intro_text = models.CharField(max_length=1000)
    signoff_text = models.CharField(max_length=1000)
Because of the OneToOneField, you can only have one ManagementEmail record per site. Then, just make sure you're using sites, and you can pull the settings like this:
from django.contrib.sites.models import Site

managementemail = Site.objects.get_current().managementemail
Note that what everyone else is telling you is true; if your goal is to store settings, adding them one by one as fields to a model is not the best implementation. Adding settings over time is going to be a headache: you have to add the field to your model, update the database structure, and modify the code that is calling that setting.
That's why I'd recommend using the django app I mentioned above, since it does exactly what you want -- provide for user-editable settings -- without making you do any extra, unnecessary work.
I think the easiest way to do this is using the has_add_permission function of the ModelAdmin:
class ContactUsAdmin(admin.ModelAdmin):
    form = ContactUsForm

    def has_add_permission(self, request):
        return False if self.model.objects.count() > 0 else super().has_add_permission(request)
You can set the count threshold above to any number you like; see the Django docs.
If you need more granularity than that, and want to make the class a singleton at the model level, see django-solo. There are also many other singleton implementations that I came across.
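For illustration, a minimal sketch of the django-solo approach (assuming the package is installed; the fields are taken from the question):

from django.db import models
from solo.models import SingletonModel

class ManagementEmail(SingletonModel):
    librarian_email = models.EmailField()
    intro_text = models.CharField(max_length=1000)
    signoff_text = models.CharField(max_length=1000)

# Anywhere in your code; creates the single row on first access:
# config = ManagementEmail.get_solo()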
For a StackedInline, you can use max_num = 1.
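For example, a hypothetical sketch (an inline requires a ForeignKey to its parent model, which the question's ManagementEmail doesn't have, so the names here are made up):

class SiteSettingsInline(admin.StackedInline):
    model = SiteSettings  # hypothetical child model with a FK to the parent
    max_num = 1           # never show more than one inline form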
Try django-constance.
Here are some useful links:
https://github.com/jezdez/django-constance
http://django-constance.readthedocs.org/en/latest/
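A minimal sketch of how django-constance is used, per its documentation (the setting name here is just an example):

# settings.py
CONSTANCE_CONFIG = {
    'LIBRARIAN_EMAIL': ('librarian@example.com', 'Recipient of library notifications'),
}

# anywhere in your code; the value is editable through the admin:
from constance import config
recipient = config.LIBRARIAN_EMAIL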
I'd take a page out of WordPress's book and create a model that supports settings.
class Settings(models.Model):
    option_name = models.CharField(max_length=1000)
    option_value = models.CharField(max_length=25000)
    option_meta = models.CharField(max_length=1000)
Then you can just pickle (serialize) objects into the fields and you'll be solid.
Build a little API, and you can be as crafty as WordPress and call AdminOptions.get_option(opt_name).
Then you can just load the custom settings into the runtime, keeping the settings.py module separate but equal. A good place to write this would be in an __init__.py file.
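A sketch of what such an API might look like (hypothetical code; pickled bytes need a text-safe encoding such as base64 to live in a CharField):

import base64
import pickle

from django.db import models

class AdminOptions(models.Model):
    option_name = models.CharField(max_length=1000, unique=True)
    option_value = models.CharField(max_length=25000)

    @classmethod
    def get_option(cls, opt_name, default=None):
        try:
            raw = cls.objects.get(option_name=opt_name).option_value
        except cls.DoesNotExist:
            return default
        return pickle.loads(base64.b64decode(raw))

    @classmethod
    def set_option(cls, opt_name, value):
        encoded = base64.b64encode(pickle.dumps(value)).decode('ascii')
        cls.objects.update_or_create(
            option_name=opt_name, defaults={'option_value': encoded})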
Just set up a GlobalSettings app or something with a Key and a Value field.
You could easily prevent admin users from changing values by not giving them permission to edit the GlobalSettings app.
class GlobalSettingsManager(models.Manager):
    def get_setting(self, key):
        try:
            setting = GlobalSettings.objects.get(key=key)
        except GlobalSettings.DoesNotExist:
            raise MyExceptionOrWhatever
        return setting

class GlobalSettings(models.Model):
    key = models.CharField(unique=True, max_length=255)
    value = models.CharField(max_length=255)

    objects = GlobalSettingsManager()

>>> APP_SETTING = GlobalSettings.objects.get_setting('APP_SETTING')
There are apps for this but I prefer looking at them and writing my own.
You can prevent users from adding/deleting an object by overriding these methods on your admin class:
ModelAdmin.has_add_permission(self, request)
ModelAdmin.has_delete_permission(self, request, obj=None)
Modification of @radtek's answer to also prevent deletion if only one entry is left:
class SendgridEmailQuotaAdmin(admin.ModelAdmin):
    list_display = ('quota', 'used')

    def has_add_permission(self, request):
        return self.model.objects.count() == 0

    def has_delete_permission(self, request, obj=None):
        return self.model.objects.count() > 1

    def get_actions(self, request):
        actions = super(SendgridEmailQuotaAdmin, self).get_actions(request)
        if self.model.objects.count() <= 1:
            del actions['delete_selected']
        return actions
I had basically the same problem as the original poster describes, and it's easily fixed by overriding the ModelAdmin class. Something like this in an admin.py file prevents adding a new object but allows editing the current one:
class TitleAdmin(admin.ModelAdmin):
    def has_delete_permission(self, request, obj=None):
        return False

    def has_add_permission(self, request):
        return False

    def has_change_permission(self, request, obj=None):
        return True
This doesn't prevent a user from POSTing a form that edits the data, but it keeps things from happening in the admin site. Depending on whether you feel it's necessary for your needs, you can enable deletion and addition of a record with a minimum of coding.
