From the official documentation:
For tests involving models with managed=False, it’s up to you to ensure the correct tables are created as part of the test setup.
I don't know how to create the tables as part of the test setup. I found this question, and the accepted answer doesn't work for me. I think this is because of the migration files: the configuration lives in the migration files, so changing the values "on the fly" has no effect.
What's the way to solve this in Django 1.7+?
I found a way. Modify the migration files and add the SQL to generate the tables:
# 0001_initial.py (or a later migration)
from django.db import migrations


class Migration(migrations.Migration):

    operations = [
        migrations.RunSQL("CREATE TABLE..."),
        ...
    ]
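As a side note, RunSQL also accepts the reverse SQL as a second argument, which makes the migration reversible; a minimal sketch (both statements are placeholders, not a real table definition):

migrations.RunSQL(
    "CREATE TABLE...",  # forward SQL (placeholder)
    "DROP TABLE...",    # reverse SQL (placeholder)
),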
I'm a "migrations newbie" so I don't know if this is the best option. But it works.
I think it should be similar in Django 1.7+. When you run the tests, you should let Django manage those models (just for testing purposes).
This conversion has to happen before the tables are created, and Django lets you plug in a custom test runner class by setting TEST_RUNNER in your settings.py.
# settings_test.py
TEST_RUNNER = 'utils.test_runner.ManagedModelTestRunner'

# test_runner.py
from django.test.runner import DiscoverRunner


class ManagedModelTestRunner(DiscoverRunner):
    """
    Test runner that automatically makes all unmanaged models in your Django
    project managed for the duration of the test run, so that one doesn't need
    to execute the SQL manually to create them.
    """

    def setup_test_environment(self, *args, **kwargs):
        from django.db.models.loading import get_models
        super(ManagedModelTestRunner, self).setup_test_environment(*args, **kwargs)
        self.unmanaged_models = [m for m in get_models(only_installed=False)
                                 if not m._meta.managed]
        for m in self.unmanaged_models:
            m._meta.managed = True

    def teardown_test_environment(self, *args, **kwargs):
        super(ManagedModelTestRunner, self).teardown_test_environment(*args, **kwargs)
        # Reset the unmanaged models.
        for m in self.unmanaged_models:
            m._meta.managed = False
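Note that django.db.models.loading was removed in Django 1.9; here is a sketch of the same runner using the django.apps registry instead (assuming the same utils.test_runner module path):

# test_runner.py (Django 1.9+ variant)
from django.apps import apps
from django.test.runner import DiscoverRunner


class ManagedModelTestRunner(DiscoverRunner):
    """Make all unmanaged models managed for the duration of the test run."""

    def setup_test_environment(self, *args, **kwargs):
        super(ManagedModelTestRunner, self).setup_test_environment(*args, **kwargs)
        self.unmanaged_models = [m for m in apps.get_models()
                                 if not m._meta.managed]
        for m in self.unmanaged_models:
            m._meta.managed = True

    def teardown_test_environment(self, *args, **kwargs):
        super(ManagedModelTestRunner, self).teardown_test_environment(*args, **kwargs)
        for m in self.unmanaged_models:
            m._meta.managed = False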
So I added a new "status" field to a Django database table. This field needed a default value, so I defaulted it to "New", but I then added a custom migration file that calls the save() method on all of the objects in that table, since I have save() overridden to check a different table and pull the correct status from it. However, after running this migration, all of the statuses are still set to "New", so it looks like the save isn't getting executed. I tested this by manually calling the save on all the objects after running the migration, and the statuses were updated as expected.
Here's the table model in models.py:
class SOS(models.Model):
    number = models.CharField(max_length=20, unique=True)
    ...
    # The default="New" portion is missing here because I have a migration to
    # remove it after the custom migration (shown below) that saves the models.
    status = models.CharField(max_length=20)

    def save(self, *args, **kwargs):
        self.status = (self.history_set.get(version=self.latest_version).status
                       if self.history_set.count() != 0 else "New")
        super(SOS, self).save(*args, **kwargs)
And here is the migration:
# Generated by Django 2.0.5 on 2018-05-23 13:50
from django.db import migrations, models


def set_status(apps, schema_editor):
    SOS = apps.get_model('sos', 'SOS')
    for sos in SOS.objects.all():
        sos.save()


class Migration(migrations.Migration):

    dependencies = [
        ('sos', '0033_auto_20180523_0950'),
    ]

    operations = [
        migrations.RunPython(set_status),
    ]
So it seems pretty clear to me that I'm doing something wrong with the migration, but I matched it exactly to what I see in the Django documentation, and I also compared it to this StackOverflow answer, and I can't see what I'm doing wrong. There are no errors when I run the migrations, but the custom one I wrote runs pretty much instantaneously, which seems strange, since when I do the save manually it takes about 5 seconds to save all 300+ entries.
Any suggestions?
P.S. Please let me know if there are any relevant details I neglected to include.
When you run migrations and get a model via apps.get_model(), you cannot use custom managers or a custom save(), create(), or anything like that. That historical model only has the fields and nothing else. To achieve what you want, you should add your logic into the migration itself, like this:
def set_status(apps, schema_editor):
    SOS = apps.get_model('sos', 'SOS')
    for sos in SOS.objects.all():
        if sos.history_set.exists():
            sos.status = sos.history_set.get(version=sos.latest_version).status
        else:
            sos.status = "New"
        sos.save()
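As a side note, if you want this data migration to be reversible, RunPython accepts a reverse function; a minimal sketch using the built-in no-op, applied to the same migration as in the question:

class Migration(migrations.Migration):

    dependencies = [
        ('sos', '0033_auto_20180523_0950'),
    ]

    operations = [
        # RunPython.noop does nothing on backwards migration, which makes
        # the operation reversible without undoing the status changes.
        migrations.RunPython(set_status, migrations.RunPython.noop),
    ]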
I need to programmatically generate the CREATE TABLE statement for a given unmanaged model in my Django app (managed = False).
Since I'm working on a legacy database, I don't want to create a migration and use sqlmigrate.
The ./manage.py sql command was useful for this purpose, but it was removed in Django 1.8.
Do you know about any alternatives?
As suggested, I'm posting a complete answer for the case that the question might imply.
Suppose you have an external DB table that you decided to access as a Django model, and therefore have described it as an unmanaged model (Meta: managed = False).
Later you need to be able to create it in your code, e.g. for some tests using your local DB. Obviously, Django doesn't make migrations for unmanaged models and therefore won't create the table in your test DB.
This can be solved using Django APIs without resorting to raw SQL: SchemaEditor. See a more complete example below, but as a short answer, you would use it like this:
from django.db import connections

with connections['db_to_create_a_table_in'].schema_editor() as schema_editor:
    schema_editor.create_model(YourUnmanagedModelClass)
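Relatedly, if you only need the generated SQL rather than executing it, the schema editor can collect the statements instead of running them; a short sketch with the same placeholder names:

from django.db import connections

connection = connections['db_to_create_a_table_in']
with connection.schema_editor(collect_sql=True) as schema_editor:
    schema_editor.create_model(YourUnmanagedModelClass)

# The statements were collected instead of executed.
print("\n".join(schema_editor.collected_sql))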
A practical example:
# your_app/models/your_model.py
from django.db import models


class IntegrationView(models.Model):
    """A read-only model to access a view in some external DB."""

    class Meta:
        managed = False
        db_table = 'integration_view'

    name = models.CharField(
        db_column='object_name',
        max_length=255,
        primary_key=True,
        verbose_name='Object Name',
    )
    some_value = models.CharField(
        db_column='some_object_value',
        max_length=255,
        blank=True,
        null=True,
        verbose_name='Some Object Value',
    )

    # Depending on the situation it might be a good idea to redefine
    # some methods as a NOOP as a safety net.
    # Note that it's not completely safe this way, but it might help with
    # some silly mistakes in user code.
    def save(self, *args, **kwargs):
        """Prevent data modification."""
        pass

    def delete(self, *args, **kwargs):
        """Prevent data deletion."""
        pass
Now, suppose you need to be able to create this model via Django, e.g. for some tests.
# your_app/tests/some_test.py
# This will allow us to access the `SchemaEditor` for the DB
from django.db import connections
from django.test import TestCase

from your_app.models.your_model import IntegrationView


class SomeLogicTestCase(TestCase):
    """Tests some logic that uses `IntegrationView`."""

    # Since the `IntegrationView` is assumed to be read-only for the case
    # being described, it's a good idea to put the setup logic in the class
    # setup fixture, which runs only once for the whole test case.
    @classmethod
    def setUpClass(cls):
        """Prepares `IntegrationView` mock data for the test case."""
        super(SomeLogicTestCase, cls).setUpClass()
        # This is the actual part that creates the table in the DB for the
        # unmanaged model (any model in fact, but managed models already have
        # their tables created by the Django testing framework).
        # Note: here we're able to choose which DB, defined in your settings,
        # will be used to create the table.
        with connections['external_db'].schema_editor() as schema_editor:
            schema_editor.create_model(IntegrationView)
        # That's all you need; after the execution of these statements
        # a DB table for `IntegrationView` will be created in the DB
        # defined as `external_db`.

        # Now suppose we need to add some mock data...
        # Again, if we consider the table to be read-only, the data can be
        # defined here; otherwise it's better to do it in the `setUp()` method.

        # Remember `IntegrationView.save()` is overridden as a NOOP, so simple
        # calls to `IntegrationView.save()` or `IntegrationView.objects.create()`
        # won't do anything, so we need to "Improvise. Adapt. Overcome."

        # One way is to use the `save()` method of the base class,
        # but provide the instance of our class.
        integration_view = IntegrationView(
            name='Biggus Dickus',
            some_value='Something really important.',
        )
        super(IntegrationView, integration_view).save(using='external_db')

        # Another one is to use `bulk_create()`, which doesn't use `save()`
        # internally, and in fact is a better solution if we're creating
        # many records.
        IntegrationView.objects.using('external_db').bulk_create([
            IntegrationView(
                name='Sillius Soddus',
                some_value='Something important',
            ),
            IntegrationView(
                name='Naughtius Maximus',
                some_value='Whatever',
            ),
        ])

    # Don't forget to clean up afterwards.
    @classmethod
    def tearDownClass(cls):
        with connections['external_db'].schema_editor() as schema_editor:
            schema_editor.delete_model(IntegrationView)
        super(SomeLogicTestCase, cls).tearDownClass()

    def test_some_logic_using_data_from_integration_view(self):
        self.assertTrue(IntegrationView.objects.using('external_db').filter(
            name='Biggus Dickus',
        ))
To make the example more complete... Since we're using multiple DBs (default and external_db), Django will try to run migrations on both of them for the tests, and as of now there's no option in the DB settings to prevent this. So we have to use a custom DB router for testing.
# your_app/tests/base.py
class PreventMigrationsDBRouter:
    """DB router to prevent migrations for specific DBs during tests."""

    _NO_MIGRATION_DBS = {'external_db'}

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        """Actually disallows migrations for specific DBs."""
        return db not in self._NO_MIGRATION_DBS
And a test settings file example for the described case:
# settings/test.py
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.oracle',
        'NAME': 'db_name',
        'USER': 'username',
        'HOST': 'localhost',
        'PASSWORD': 'password',
        'PORT': '1521',
    },
    # For production here we would have settings to connect to the external DB,
    # but for testing purposes we can get by with an SQLite DB.
    'external_db': {
        'ENGINE': 'django.db.backends.sqlite3',
    },
}

# It's not necessary to use the router in the production config, since if the
# DB is not specified explicitly for some action, Django uses the `default` DB.
DATABASE_ROUTERS = ['your_app.tests.base.PreventMigrationsDBRouter']
Hope this detailed, new-Django-user-friendly example helps someone and saves them some time.
Unfortunately, there seems to be no easy way to do this, but luckily I have just succeeded in producing a working snippet for you by digging into the internals of the Django migrations jungle.
Just:
save the code to get_sql_create_table.py (for example)
do $ export DJANGO_SETTINGS_MODULE=yourproject.settings
launch the script with python get_sql_create_table.py yourapp.YourModel
and it should output what you need.
Hope it helps!
import django

django.setup()

from django.db import connections
from django.db.migrations import operations
from django.db.migrations.migration import Migration
from django.db.migrations.state import ModelState, ProjectState


def get_create_sql_for_model(model):
    model_state = ModelState.from_model(model)

    # Create a fake migration with the CreateModel operation
    cm = operations.CreateModel(name=model_state.name, fields=model_state.fields)
    migration = Migration("fake_migration", "app")
    migration.operations.append(cm)

    # Let the migration framework think that the project is in an initial state
    state = ProjectState()

    # Get the SQL through the schema_editor bound to the connection
    connection = connections['default']
    with connection.schema_editor(collect_sql=True, atomic=migration.atomic) as schema_editor:
        state = migration.apply(state, schema_editor, collect_sql=True)

    # Return the CREATE TABLE statement
    return "\n".join(schema_editor.collected_sql)


if __name__ == "__main__":
    import importlib
    import sys

    if len(sys.argv) < 2:
        print("Usage: {} <app.Model>".format(sys.argv[0]))
        sys.exit(100)

    app, model_name = sys.argv[1].split('.')
    models = importlib.import_module("{}.models".format(app))
    model = getattr(models, model_name)
    rv = get_create_sql_for_model(model)
    print(rv)
For Django v4.1.3, the above get_create_sql_for_model source code changes like this:
from django.db import connections
from django.db.migrations import operations
from django.db.migrations.migration import Migration
from django.db.migrations.state import ModelState, ProjectState


def get_create_sql_for_model(model):
    model_state = ModelState.from_model(model)
    table_name = model_state.options['db_table']

    # Create a fake migration with the CreateModel operation.
    # (In Django 4.x, ModelState.fields is a dict, hence .items().)
    cm = operations.CreateModel(name=model_state.name, fields=model_state.fields.items())
    migration = Migration("fake_migration", "app")
    migration.operations.append(cm)

    # Let the migration framework think that the project is in an initial state
    state = ProjectState()

    # Get the SQL through the schema_editor bound to the connection
    connection = connections['default']
    with connection.schema_editor(collect_sql=True, atomic=migration.atomic) as schema_editor:
        state = migration.apply(state, schema_editor, collect_sql=True)

    # Keep only the actual statements, skipping SQL comments.
    items = []
    for sql in schema_editor.collected_sql:
        if sql.startswith('--'):
            continue
        items.append(sql)
    return table_name, items
I used it to create all tables (like the syncdb command of old Django versions):
for app in settings.INSTALLED_APPS:
    app_name = app.split('.')[0]
    app_models = apps.get_app_config(app_name).get_models()
    for model in app_models:
        table_name, sqls = get_create_sql_for_model(model)
        if settings.DEBUG:
            s = "SELECT COUNT(*) AS c FROM sqlite_master WHERE name = '%s'" % table_name
        else:
            s = "SELECT COUNT(*) AS c FROM information_schema.TABLES WHERE table_name='%s'" % table_name
        rs = select_by_raw_sql(s)
        if not rs[0]['c']:
            for sql in sqls:
                exec_by_raw_sql(sql)
            print('CREATE TABLE DONE:%s' % table_name)
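The select_by_raw_sql and exec_by_raw_sql helpers aren't shown in the post; a minimal sketch of what they might look like on top of Django's default connection:

from django.db import connection


def select_by_raw_sql(sql):
    # Run a raw SELECT and return the rows as a list of dicts.
    with connection.cursor() as cursor:
        cursor.execute(sql)
        columns = [col[0] for col in cursor.description]
        return [dict(zip(columns, row)) for row in cursor.fetchall()]


def exec_by_raw_sql(sql):
    # Execute a raw DDL/DML statement without returning rows.
    with connection.cursor() as cursor:
        cursor.execute(sql)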
The full source code can be found at Django syncdb command came back for v4.1.3 version.
For example, assume that I have 100 clients who use WordPress, and I have to write a service in Django which should return the list of posts from WordPress's MySQL DB. The problem is that the 100 clients have different database connection settings.
I know that I can use a DatabaseRouter to switch between databases which are already loaded in settings. But I don't know how to make a single model class use different database settings.
I have tried mutating settings.
I also tried mutating the model's app_label.
But I later understood that mutating anything in Django is meaningless.
My requirements:
I want to create a model and dynamically change the database connection. The list of connections can be kept in a managed database table. But I don't want to unnecessarily load all the connection settings or create multiple models.
I made something like that, but to switch MongoDB connections.
I created a generic view that selects the connection and uses it in get_queryset().
I'm using Django REST framework, so I made something like this:
class SwitchDBMixinView(object):
    model = None
    fields = None

    def initial(self, request, *args, **kwargs):
        result = super().initial(request, *args, **kwargs)
        if request.user.is_authenticated():
            request.user.database_connection.register()
        return result

    def get_object(self, *args, **kwargs):
        return super().get_object(*args, **kwargs).switch_db(self.get_db_alias())

    def get_db_alias(self):
        if self.request is None or not self.request.user.is_authenticated():
            return DEFAULT_CONNECTION_NAME
        return self.request.user.database_connection.name

    def get_queryset(self):
        return self.model.objects.using(self.get_db_alias()).all()

    def perform_destroy(self, instance):
        instance.switch_db(self.get_db_alias()).delete()
The model:
from mongoengine.connection import register_connection, get_connection

AUTH_USER_MODEL = getattr(settings, 'AUTH_USER_MODEL')


class Connection(models.Model):

    class Meta:
        pass

    owner = models.OneToOneField(
        AUTH_USER_MODEL,
        related_name='database_connection',
    )
    uri = models.TextField(
        default=DefaultMongoURI()
    )

    def register(self):
        register_connection(
            self.name,
            host=self.uri,
            tz_aware=True,
        )
        get_connection(
            self.name,
            reconnect=True,
        )

    def get_name(self):
        return 'client-%d' % self.owner.pk

    name = property(get_name)

    def __str__(self):
        return self.uri
You may want to have a look at django.db.connections (in django/db/__init__.py) and django.db.utils.ConnectionHandler (which django.db.connections is an instance of). This should let you dynamically add new DB configs without hacking settings.DATABASES (actually, ConnectionHandler builds its _databases attribute from settings.DATABASES). I can't tell for sure since I never tried it, but it should mostly boil down to:
from django import db


def add_db(alias, connection_infos):
    databases = db.connections.databases
    if alias in databases:
        # Either raise or log and ignore, your choice.
        raise ValueError("alias %r is already defined" % alias)
    db.connections.databases[alias] = connection_infos
where connection_infos is a mapping similar to the ones expected in settings.DATABASES.
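For illustration, a hypothetical call (the alias, credentials, and host are all made up):

add_db('client_42', {
    'ENGINE': 'django.db.backends.mysql',
    'NAME': 'wordpress_client_42',
    'USER': 'django',
    'PASSWORD': 'secret',
    'HOST': 'db42.example.com',
    'PORT': '3306',
})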
Then it's mostly a matter of using Queryset.using(alias) for your queries, ie:
alias = get_alias_for_user(request.user)
posts = Post.objects.using(alias).all()
cf https://docs.djangoproject.com/en/1.11/topics/db/multi-db/#manually-selecting-a-database
The main problem with this IMHO (assuming you manage to make something that works out of the untested suggestion above) is that you will have to store database users/passwords in cleartext somewhere, which can be a major security issue. I don't know how much control you have over the databases' admin side, but it would be better if you could add a 'django' user with the same password (and appropriate permissions, of course) on all those databases, so you can keep the password in your settings file instead of having to keep it in your main DB.
I have a model with dynamic choices, and I would like to return an empty choice list if I can guarantee that the code is being run by a django-admin.py migrate / makemigrations command, to prevent it from either creating or warning about useless choice changes.
Code:
from django.db import models
from django.utils.functional import lazy

from artist.models import Performance
from location.models import Location


def lazy_discover_foreign_id_choices():
    choices = []
    performances = Performance.objects.all()
    choices += {performance.id: str(performance) for performance in performances}.items()
    locations = Location.objects.all()
    choices += {location.id: str(location) for location in locations}.items()
    return choices

lazy_discover_foreign_id_choices = lazy(lazy_discover_foreign_id_choices, list)


class DiscoverEntry(models.Model):
    foreign_id = models.PositiveIntegerField(
        'Foreign Reference',
        choices=lazy_discover_foreign_id_choices(),
    )
So I would think if I can detect the run context in lazy_discover_foreign_id_choices then I can choose to output an empty choice list. I was thinking about testing sys.argv and __main__.__name__ but I'm hoping there's possibly a more reliable way or an API?
Here is a fairly non-hacky way to do this (since Django already creates flags for us):
import sys


def lazy_discover_foreign_id_choices():
    if 'makemigrations' in sys.argv or 'migrate' in sys.argv:
        return []
    # Leave the rest as is.
This should work for all cases.
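For completeness, here is the guard dropped into the question's function (a sketch under the same imports and models as the original code):

import sys

from artist.models import Performance
from location.models import Location


def lazy_discover_foreign_id_choices():
    # Skip the DB queries entirely while a migration command is running.
    if 'makemigrations' in sys.argv or 'migrate' in sys.argv:
        return []
    choices = []
    performances = Performance.objects.all()
    choices += {p.id: str(p) for p in performances}.items()
    locations = Location.objects.all()
    choices += {l.id: str(l) for l in locations}.items()
    return choices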
A solution I can think of would be to subclass Django's makemigrations command to set a flag before performing the actual operation.
Example:
Put the following code in <someapp>/management/commands/makemigrations.py; it will override Django's default makemigrations command.
from django.core.management.commands import makemigrations
from django.db import migrations


class Command(makemigrations.Command):
    def handle(self, *args, **kwargs):
        # Set the flag.
        migrations.MIGRATION_OPERATION_IN_PROGRESS = True
        # Execute the normal behaviour.
        super(Command, self).handle(*args, **kwargs)
Do the same for the migrate command.
And modify your dynamic choices function:
from django.db import migrations


def lazy_discover_foreign_id_choices():
    if getattr(migrations, 'MIGRATION_OPERATION_IN_PROGRESS', False):
        return []
    # Leave the rest as is.
It is very hacky but fairly easy to set up.
Using the django.db.models.signals.pre_migrate signal should be enough to detect the migrate command. The drawback is that you cannot use it at the configuration stage.
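For reference, a minimal sketch of hooking that signal (the app label and module-level flag are made up for illustration; note the receiver only fires once migrate starts, which is too late for anything evaluated at configuration time):

# yourapp/apps.py
from django.apps import AppConfig
from django.db.models.signals import pre_migrate

MIGRATION_IN_PROGRESS = False  # hypothetical module-level flag


def _flag_migration(sender, **kwargs):
    global MIGRATION_IN_PROGRESS
    MIGRATION_IN_PROGRESS = True


class YourAppConfig(AppConfig):
    name = 'yourapp'

    def ready(self):
        # Fired by the migrate command for this app, before applying migrations.
        pre_migrate.connect(_flag_migration, sender=self)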
I'm trying to port my project to Django 1.7. Everything is fine except for one thing: models inside tests folders.
Django 1.7's new migrations run the migrate command internally; previously, syncdb was run. That means that if a model is not included in migrations, it won't be created in the DB (and thus not in the test DB either). That's exactly what I'm experiencing right now.
What I do is:
In my /app/tests/models.py I have a dummy model: class TestBaseImage(BaseImage): pass
All it does is inherit from the abstract BaseImage model.
Then in tests I create instances of that dummy model to test it.
The problem is that it doesn't work any more. The model is not included in migrations (that's expected, as I don't want to keep my test models in a production DB), and running my tests causes a DB error saying that the table does not exist. That makes sense, since it's not included in migrations.
Is there any way to make it work with the new migrations system? I can't find a way to "fix" that.
Code I use:
app/tests/models.py
from ..models import BaseImage


class TestBaseImage(BaseImage):
    """Dummy model just to test the BaseImage abstract class."""
    pass
app/models.py
class BaseImage(models.Model):
    # ... fields ...

    class Meta:
        abstract = True
factories:
class BaseImageFactory(factory.django.DjangoModelFactory):
    """Factory class for the Vessel model."""
    FACTORY_FOR = BaseImage
    ABSTRACT_FACTORY = True


class PortImageFactory(BaseImageFactory):
    FACTORY_FOR = PortImage
example test:
def get_model_field(model, field_name):
    """Returns a field instance."""
    return model._meta.get_field_by_name(field_name)[0]


def test_owner_field(self):
    """Tests the owner field."""
    field = get_model_field(BaseImage, "owner")
    self.assertIsInstance(field, models.ForeignKey)
    self.assertEqual(field.rel.to, get_user_model())
There is a ticket requesting a way to do test-only models here
As a workaround, you can decouple your tests.py and make it an app.
tests
|--migrations
|--__init__.py
|--models.py
|--tests.py
You will end up with something like this:
myapp
|-migrations
|-tests
|--migrations
|--__init__.py
|--models.py
|--tests.py
|-__init__.py
|-models.py
|-views.py
Then you should add it to your INSTALLED_APPS
INSTALLED_APPS = (
    # ...
    'myapp',
    'myapp.tests',
)
You probably don't want to install myapp.tests in production, so you can keep separate settings files. Something like this:
INSTALLED_APPS = (
    # ...
    'myapp',
)

try:
    from local_settings import *
except ImportError:
    pass
Or better yet, create a test runner and install your tests there.
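A sketch of that last idea (the module path and app label are made up; note that apps.set_installed_apps() is the semi-private helper Django's own override_settings uses, so treat this as a sketch rather than a guaranteed API):

# utils/test_runner.py
from django.apps import apps
from django.conf import settings
from django.test.runner import DiscoverRunner


class TestAppRunner(DiscoverRunner):
    def setup_test_environment(self, *args, **kwargs):
        super(TestAppRunner, self).setup_test_environment(*args, **kwargs)
        # Register the test-only app so its models get tables in the test DB.
        settings.INSTALLED_APPS += ('myapp.tests',)
        apps.set_installed_apps(settings.INSTALLED_APPS)

    def teardown_test_environment(self, *args, **kwargs):
        apps.unset_installed_apps()
        super(TestAppRunner, self).teardown_test_environment(*args, **kwargs)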
Last but not least, remember to run python manage.py makemigrations
Here's a workaround that seems to work: trick the migration framework into thinking that there are no migrations for your app. In settings.py:
import sys

if 'test' in sys.argv:
    # Only during unit tests...
    # myapp uses a test-only model, which won't be loaded if we only load
    # our real migration files, so point to a nonexistent module, which will
    # make the test runner fall back to 'syncdb' behavior.
    MIGRATION_MODULES = {
        'myapp': 'myapp.migrations_not_used_in_tests'
    }
I found the idea in the first post of this Django dev mailing list thread, and it's also currently used in Django itself, but it may not work in future versions of Django where migrations are required and the "syncdb fallback" is removed.