I used inspectdb to import the model definitions from a MySQL database connected with MySQL Connector/Python.
When I run the tests, I get:
django.db.utils.ProgrammingError: Table 'test_mydb.cards' doesn't exist
But the table name is just mydb.cards, not test_mydb.cards.
Why is this prefix being added? My app's name is just container.
Django uses the unittest module for testing, and it creates a blank database for the test run, because tests should ideally always run against the same blank database or one filled from fixtures.
Tests that require a database (namely, model tests) will not use your “real” (production) database. Separate, blank databases are created for the tests.
Regardless of whether the tests pass or fail, the test databases are destroyed when all the tests have been executed.
Check whether you have migrations for the table the test tries to access; tables that don't have migrations aren't created in the test database, so the test suite can't reach them without monkeypatching or other workarounds.
I found the reason: the managed option was set to False in the Meta class of each model in models.py. I never set it there myself; it is the default that inspectdb generates.
Please read: Django Models Options: Managed
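For reference, a minimal sketch of an inspectdb-generated model and the change that fixed it for me (the Cards model and its field are placeholders, not my real schema); with managed = True, the test runner creates the table in the test database:

from django.db import models

class Cards(models.Model):
    name = models.CharField(max_length=100)  # placeholder field

    class Meta:
        managed = True  # inspectdb generates managed = False by default
        db_table = 'cards'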
Related
I want to use my test database with the data in it that was created when the test case ran. How can I do that?
I tried running a normal Django test which inherits from TestCase and put a breakpoint after the test data has been generated. If I then log in to the test_db (which Django creates) in a different terminal tab through the postgres command line and query it, no data is shown! Can someone explain why this happens?
TestCase wraps tests in an atomic block and rolls back the transaction so that no changes are saved to the database.
If you want changes to be saved, you could use SimpleTestCase and set databases = '__all__' (Django 2.2+), or allow_database_queries = True (earlier versions of Django).
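A minimal sketch of that approach (the myapp.models.Card import and the name field are placeholders):

from django.test import SimpleTestCase
from myapp.models import Card  # placeholder app and model

class PersistentDataTest(SimpleTestCase):
    # Unlike TestCase, SimpleTestCase does not wrap each test in a transaction,
    # so rows created here remain visible in the test database from another session.
    databases = '__all__'  # Django 2.2+; use allow_database_queries = True on older versions

    def test_create_card(self):
        Card.objects.create(name='ace of spades')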
I'm trying to create tests for my Django application but I'm having some trouble creating a test database.
I'd like to keep the existing structure while entering new curated test information, creating test users, uploading test content, etc., which I can then populate a test database with so that I have curated data on which I can test edge cases.
Creating a test database seems simple: just run python manage.py test --keepdb. Getting entries into it seems more difficult.
Is it possible to run Django in "test mode" with the test database being used so that I can use the website UI to enter all the data, or is there some other better way to do it entirely?
I assume you mean testing with unit tests?
Usually you fill the database with fixtures, or some other test data that is populated into the database as a part of the test itself.
Django fixtures: https://code.djangoproject.com/wiki/Fixtures
Fixtureless is a good option, https://pypi.org/project/django-fixtureless/
Factory Boy http://factoryboy.readthedocs.io/en/latest/
These options allow you to fill your database with fake or static data to use in your tests.
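As a rough Factory Boy sketch (the Animal model and its fields are made up for illustration):

import factory
from myapp.models import Animal  # placeholder app and model

class AnimalFactory(factory.django.DjangoModelFactory):
    class Meta:
        model = Animal

    # Fake but plausible values for the (made-up) model fields
    name = factory.Faker('first_name')
    sound = 'roar'

In a test, AnimalFactory() or AnimalFactory.create_batch(5) would then insert rows into the test database.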
My default database is not being migrated into Django's test database. A database is being created with default tables that Django uses to log tests (such as django_content_type and django_admin_log), but not my app's tables.
When I run the tests with a verbosity level of 3 (-v 3), I see that my app is categorized under Synchronizing apps without migrations... which confirms that the migrations aren't being performed.
I don't, however, know how to tell Django's tests to migrate my app's database tables. I can run python manage.py makemigrations easily, but apparently that doesn't carry over to the tests.
What am I missing?
Thank you!
Edit: Sorry, I meant the relations are not being created. I want to test views which rely on models in the database. In order to do so, I'm uploading fixtures (to not deal with data on production). However, when I try to upload a fixture I get a relation "mymodel" does not exist error.
Django's tests are designed to create a test database to perform tests.
I think that making your tests depend on a prod database is a bad idea.
You should design your tests to cover as many cases as possible, not to check if it works with your current database (which can evolve).
Here are some pages about testing with Django 1.8:
https://docs.djangoproject.com/en/1.8/intro/tutorial05/
https://docs.djangoproject.com/en/1.8/topics/testing/
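That said, if the app's tables are missing from the test database, a common first check is that the app actually has migrations committed, since the test runner applies them when it builds the test database (myapp below is a placeholder for your app name):

python manage.py makemigrations myapp
python manage.py test -v 3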
Django 1.7, Python 3.4.
In my models I have several TextFields defined.
When I go to load a JSON fixture (which was generated from an SQLite3 dump), it fails on the second object, which has 515 characters for one of its fields.
The error printed is
psycopg2.DataError: value too long for type character varying(500)
I created a new database (not just a table drop, a whole new db), modified my settings.py file, ran manage.py syncdb on the new database, created a user, and tried to load the data again, getting the same error.
Upon opening pgAdmin3, all columns, both the CharField and TextField ones, are listed as type character varying.
So it seems TextField is being ignored and CharFields are being created instead. The PostgreSQL documentation explicitly lists both text and character types, and defines text as being unlimited in length. Any idea why?
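For reference, the mapping I expected (the model and fields here are just an illustration, not my real models):

from django.db import models

class Note(models.Model):
    title = models.CharField(max_length=500)  # should become character varying(500) in PostgreSQL
    body = models.TextField()                 # should become text (no length limit) in PostgreSQL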
I'm not sure what the exact cause was, but it seems to be related to Django's migration tool storing migrations, even on a new database.
What I did to get this behavior:
Create django project, then apps, using CharField
syncdb, run the project's dev server
kill the devserver, modify fields to be TextField
Create a new Postgres database, modify settings.py
Run syncdb, attempt to load fixtures
See the error in question, examine db instance
What fixed the problem:
Create a new database, modify settings.py
delete all migrations in apps/migrations folders
after running syncdb, also run makemigrations and migrate
The last step generated a migration, even though there were none stored in the migrations folder, and there had been no changes to models or data since syncdb was run on the new database, which I found to be odd.
Somewhere in the last two steps this was fixed. Future people stumbling upon this: sorry, I'm not going to keep creating django projects to test the behavior further, but perhaps with this information you can fix your own database problems.
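In command form, the fix looked roughly like this (run after creating the new database, pointing settings.py at it, and deleting the old migration files; the fixture name is a placeholder):

python manage.py syncdb
python manage.py makemigrations
python manage.py migrate
python manage.py loaddata my_fixture.json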
I'm trying to test my Django apps which run on a PostGIS database, by following the info in the Django testing docs.
Normally I create a new database by copying a template:
(as user postgres)
createdb -T template_postgis -O lizard test_geodjango2
When I run ./manage.py test, I get the following message:
Creating test database...
Got an error creating the test database: permission denied to create database
Type 'yes' if you would like to try deleting the test database 'test_geodjango2', or 'no' to cancel:
What's the best way to let the system create the database?
It may be that your DATABASE_USER doesn't have permissions to create a new database/schema.
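If that's the case, one way to fix it is to give that role the CREATEDB privilege (replace lizard with your own DATABASE_USER):

psql -U postgres -c "ALTER ROLE lizard CREATEDB;"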
Edit
If you read the source for the Django test command, you'll see that it always creates a test database. Further, it modifies your settings to reference this test database.
See this: http://docs.djangoproject.com/en/dev/topics/testing/#id1
What you should do is use fixtures. Here's how we do it.
From your template database, create a "fixture". Use the manage.py dumpdata command to create a JSON file with all of your template data. [Hint, the --indent=2 option gives you readable JSON that you can edit and modify.]
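For example (myapp is a placeholder for your application name):

python manage.py dumpdata myapp --indent=2 > mammals.json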
Put this in a fixtures directory under your application.
Reference the fixtures file in your TestCase class definition. This will load the fixture prior to running the test.
from django.test import TestCase

class AnimalTestCase(TestCase):
    fixtures = ['mammals.json', 'birds']

    def testFluffyAnimals(self):
        # assertions against the fixture data go here
        pass
The fixtures replace your template database. You don't need the template anymore once you have the fixtures.
As S.Lott mentioned, use the standard test command.
Using GeoDjango with PostGIS, you'll need to add the following to your settings for the spatial templates to be created properly.
settings.py
POSTGIS_SQL_PATH = 'C:\\Program Files\\PostgreSQL\\8.3\\share\\contrib'
TEST_RUNNER='django.contrib.gis.tests.run_tests'
console
manage.py test
Described here:
http://geodjango.org/docs/testing.html?highlight=testing#testing-geodjango-apps
I haven't looked into it yet, but when I do this I get prompted for the database password when it attempts to install the necessary sql.
As of Django 1.11, Django supports the Postgres-only setting settings.DATABASES[alias]['TEST']['TEMPLATE'], which dictates which template the test database is created from:
https://docs.djangoproject.com/en/dev/ref/settings/#template
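A sketch of that setting, reusing the template_postgis template from above (the other connection details are placeholders):

DATABASES = {
    'default': {
        'ENGINE': 'django.contrib.gis.db.backends.postgis',
        'NAME': 'geodjango2',  # placeholder connection details
        'USER': 'lizard',
        'TEST': {
            'TEMPLATE': 'template_postgis',
        },
    },
}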