create all table models using inspectdb in django - python

I have a database "Employee" in SQL Server 2016, which contains a lot of tables, such as:
dbo.Employee
dbo.Department
dbo.Salary
dbo.Country
dbo.Address
domain1\userId.auth_group_permissions
domain1\userId.auth_user
domain1\userId.django_session
domain1\userId.auth_permission
......
I have tried to generate Django models for the database tables by using:
python manage.py inspectdb > models.py
but only Django's own tables were generated as models, such as AuthUserGroups, AuthUserUserPermissions, DjangoAdminLog, ...
What should I do to generate models for all the dbo schema tables?

Give Django your database parameters
You need to specify your database connection parameters in your settings file.
Do that by editing the DATABASES setting and assigning values to the following keys for the 'default' connection:
NAME
ENGINE
USER
PASSWORD
HOST
PORT
For further reference, see https://docs.djangoproject.com/en/2.1/howto/legacy-databases/
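As a sketch, a filled-in DATABASES setting for the "Employee" database might look like this. The ENGINE value is an assumption (it matches the django-pyodbc-azure backend; use whichever SQL Server backend package you actually have installed), and the user, password, and host values are placeholders:

```python
# Sketch of a settings.py DATABASES entry for the "Employee" SQL Server DB.
# ENGINE assumes the django-pyodbc-azure backend is installed; swap in your
# own backend's dotted path. Credentials/host below are placeholders.
DATABASES = {
    'default': {
        'ENGINE': 'sql_server.pyodbc',   # assumption: adjust to your backend
        'NAME': 'Employee',
        'USER': 'your_user',             # placeholder
        'PASSWORD': 'your_password',     # placeholder
        'HOST': 'localhost',             # placeholder
        'PORT': '1433',                  # SQL Server default port
    }
}
```

Once the connection works, `python manage.py inspectdb > models.py` introspects the tables visible to the connecting user, so make sure that user has rights on the dbo schema tables.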


Configuring PostgreSQL schema for default Django DB working with PgBouncer connection pool

I need to set the default DB schema for a Django project, so that the tables of all apps (including 3rd-party apps) are created in the configured PostgreSQL schema.
One solution is to use a DB connection option, like this:
# in Django settings module add "OPTIONS" to default DB, specifying "search_path" for the connection
DB_DEFAULT_SCHEMA = os.environ.get('DB_DEFAULT_SCHEMA', 'public') # use postgresql default "public" if not overwritten
DATABASES['default']['OPTIONS'] = {'options': f'-c search_path={DB_DEFAULT_SCHEMA}'}
This works for a direct connection to PostgreSQL, but not when connecting through PgBouncer (to use connection pools), failing with OperationalError: unsupported startup parameter: options. It appears PgBouncer doesn't recognize "options" as a startup parameter (at this point in time).
Another solution to set the schema without using startup parameters is to prefix all tables with the schema. To make sure this works for built-in and 3rd-party apps too (not just my own app), a solution is to inject the schema name into the db_table attribute of all models as they're being loaded by Django, using the class_prepared signal and an AppConfig. This approach is close to what projects like django-db-prefix use; one only needs to make sure the schema name is well quoted:
from django.conf import settings
from django.db.models.signals import class_prepared

def set_model_schema(sender, **kwargs):
    schema = getattr(settings, "DB_DEFAULT_SCHEMA", "")
    db_table = sender._meta.db_table
    if schema and not db_table[1:].startswith(schema):
        sender._meta.db_table = '"{}"."{}"'.format(schema, db_table)

class_prepared.connect(set_model_schema)
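As an aside, the quoting is what makes PostgreSQL treat the name as schema-qualified rather than as one table whose name contains a dot. The prefixing logic of the handler can be sketched as a pure function (no Django required):

```python
def prefix_with_schema(db_table: str, schema: str) -> str:
    """Return the table name prefixed with the schema, double-quoted,
    skipping tables that are already prefixed (or when no schema is set)."""
    if not schema or db_table[1:].startswith(schema):
        return db_table  # nothing to do, or already prefixed
    return '"{}"."{}"'.format(schema, db_table)
```

The `db_table[1:]` check works because an already-prefixed name starts with a double quote, so the schema name appears from index 1 onward.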
This works for connection pools too, however it doesn't play well with Django migrations.
Using this solution, python manage.py migrate fails to work, because the migrate command ensures the django_migrations table exists by introspecting existing tables, and the db_table prefix of the models has no effect on that introspection.
I'm curious what a proper way could be to solve this problem.
This is the solution I came up with: mixing both solutions above, using two separate DB connections.
Using the connection startup parameters (which work well for apps and migrations), but only to run migrations, not the app server. This means Django migrations have to connect to PostgreSQL directly, not via PgBouncer, which in my case is fine.
Prefixing DB tables with the schema using a class_prepared signal handler, but excluding the django_migrations table. The handler is registered in a Django app (say django_dbschema) using the AppConfig.__init__() method, which is the first stage of the project initialization process, so all other apps are affected. An environment variable is used to flag bypassing this registration; it is set when running migrations. This way, when the app runs to serve requests, it can connect to PgBouncer just as well, but Django migrations are unaware of the schema prefixes.
Two environment variables (used by the Django settings module) configure this behavior: DB_DEFAULT_SCHEMA is the name of the schema, and DB_SCHEMA_NO_PREFIX flags disabling the registration of the signal handler. It'll be like this:
The django_dbschema app structure (in the root of the project)
django_dbschema/
├── apps.py
├── __init__.py
where apps.py defines the signal handler and AppConfig to register it:
from django.apps import AppConfig
from django.conf import settings
from django.db.models.signals import class_prepared
def set_model_schema(sender, **kwargs):
    """Prefix the DB table name for the model with the configured DB schema.

    Excluding the Django migrations table itself (django_migrations),
    because the Django migrate command directly introspects tables in the DB,
    looking for an existing "django_migrations" table; prefixing that table
    with the schema won't work, so Django migrations would think the table
    doesn't exist and try to create it.
    So Django migrations can/should not use this app to target a schema.
    """
    schema = getattr(settings, "DB_DEFAULT_SCHEMA", "")
    if schema == "":
        return
    db_table = sender._meta.db_table
    if db_table != "django_migrations" and not db_table[1:].startswith(schema):
        # double quotes are important to target a schema
        sender._meta.db_table = '"{}"."{}"'.format(schema, db_table)

class DjangoDbschemaConfig(AppConfig):
    """Django app to register a signal handler for the model class preparation
    signal, to prefix all models' DB tables with the schema name from
    "DB_DEFAULT_SCHEMA" in settings.

    This is better than specifying "search_path" in the "options" of the
    connection, because this approach works both for direct connections AND
    connection pools (where the "options" connection parameter is not
    accepted by PgBouncer).

    NOTE: This app defines __init__() to register the class_prepared signal.
    Make sure no models are imported in __init__. See
    https://docs.djangoproject.com/en/3.2/ref/signals/#class-prepared

    NOTE: The signal handler for this app excludes Django migrations,
    so Django migrations can/should not use this app to target a schema.
    This means that, with this enabled, when starting the app server, Django
    thinks migrations are missing and always warns with:
    You have ... unapplied migration(s). Your project may not work properly until you apply the migrations for ...
    To actually run migrations (python manage.py migrate), use another way to
    set the schema.
    """

    name = "django_dbschema"
    verbose_name = "Configure DB schema for Django models"

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        schema = getattr(settings, "DB_DEFAULT_SCHEMA", "")
        if schema and not getattr(
            settings, "DB_SCHEMA_NO_PREFIX", False
        ):  # don't register the signal handler if no schema or disabled
            class_prepared.connect(set_model_schema)
This app is registered in the list of INSTALLED_APPS (I had to use the full class path to the app config, otherwise Django wouldn't load my AppConfig definition).
Also, the Django settings module (say settings.py) defines one extra DB connection (a copy of default), but with connection options:
# ...
INSTALLED_APPS = [
    'django_dbschema.apps.DjangoDbschemaConfig',  # has to be the full path to the class, otherwise Django won't load the local app
    'django.contrib.admin',
    # ...
]

# 2 new settings to control the schema and prefix
DB_DEFAULT_SCHEMA = os.environ.get('DB_DEFAULT_SCHEMA', '')
DB_SCHEMA_NO_PREFIX = os.environ.get('DB_SCHEMA_NO_PREFIX', False)  # force-disable prefixing DB tables with the schema

DATABASES = {
    'default': {  # default DB connection, used by the app (not migrations); works with PgBouncer, no connection options
        # ...
    }
}

# default_direct: the default DB connection, but a direct connection, NOT A CONNECTION POOL, so it can have connection options
DATABASES['default_direct'] = deepcopy(DATABASES['default'])
# explicit test DB info; prevents Django tests from confusing multiple DB aliases to the same actual DB with circular dependencies
DATABASES['default_direct']['TEST'] = {'DEPENDENCIES': [], 'NAME': 'test_default_direct'}

# allow overriding connection parameters if necessary
if os.environ.get('DIRECT_DB_HOST'):
    DATABASES['default_direct']['HOST'] = os.environ.get('DIRECT_DB_HOST')
if os.environ.get('DIRECT_DB_PORT'):
    DATABASES['default_direct']['PORT'] = os.environ.get('DIRECT_DB_PORT')
if os.environ.get('DIRECT_DB_NAME'):
    DATABASES['default_direct']['NAME'] = os.environ.get('DIRECT_DB_NAME')

if DB_DEFAULT_SCHEMA:
    DATABASES['default_direct']['OPTIONS'] = {'options': f'-c search_path={DB_DEFAULT_SCHEMA}'}
# ...
Now setting the environment variable DB_DEFAULT_SCHEMA=myschema configures the schema. To run migrations, we'll set the proper environment variable, and explicitly use the direct DB connection:
env DB_SCHEMA_NO_PREFIX=True python manage.py migrate --database default_direct
And when the app server runs, it'll use the default DB connection, which works with PgBouncer.
The downside is that, since Django migrations are excluded from the signal handler, Django will think no migrations were run, so it always warns:
"You have ... unapplied migration(s). Your project may not work properly until you apply the migrations for ..."
This is not true, as long as we make sure we actually run migrations before starting the app server.
A side note about this solution: the Django project now has multiple DB connection settings (if it didn't before). So, for example, DB migrations should be written to work with an explicit connection, not relying on the default connection. In particular, if RunPython is used in a migration, it should pass the connection alias (schema_editor.connection.alias) to the object manager when querying. For example:
my_model.save(using=schema_editor.connection.alias)
# or
my_model.objects.using(schema_editor.connection.alias).all()

create multiple tables with different schemas using flask-migrate on a single database

I am using flask-migrate to handle the creation and maintenance of tables. I have multiple tables with different schemas.
class A(db.Model):
    __tablename__ = 'A'
    __table_args__ = {'schema': 'public'}
    # rest of data

class B(db.Model):
    __tablename__ = 'B'
    __table_args__ = {'schema': 'schema_b'}
    # rest of data
So when I run flask db init and flask db migrate, a migration script is created in the migrations folder. But when I run flask db upgrade to add the tables to the database, it shows me this error:
sqlalchemy.exc.ProgrammingError: (psycopg2.errors.InvalidSchemaName) schema "schema_b.B" does not exist
As I searched about the issue, I found this include_schemas and migrate using different schema; in both it is mentioned to use include_schemas=True in the configure() call in migrations/env.py. Also, the link mentioned in the solution of the answer is invalid, so this is becoming a little problematic for me.
I made the changes accordingly, then ran flask db migrate, which detected all the tables in all schemas. But as I run flask db upgrade, the
sqlalchemy.exc.ProgrammingError: (psycopg2.errors.InvalidSchemaName) schema "schema_b.B" does not exist
error appears again.
Help me solve this problem using Flask-Migrate. To create the tables I have a SQL command which works fine.
The include_schemas=True option makes Alembic look for tables in your non-default schemas, but it cannot generate new schemas when you add them to a model definition.
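For reference, enabling that option is a one-line addition to the configure() call in the migrations/env.py that Flask-Migrate generated. This is a fragment, not a standalone script: connection and target_metadata come from the surrounding generated code.

```python
# migrations/env.py (Flask-Migrate generated) -- inside run_migrations_online()
context.configure(
    connection=connection,
    target_metadata=target_metadata,
    include_schemas=True,  # autogenerate looks beyond the default schema
)
```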
In such a case, what you have to do is run flask db migrate to generate the migration, then open the generated migration script and add the schema creation right before the new table is created. For your B model, the migration would have been generated more or less like this:
def upgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.create_table('schema_b.B',
        # ...
    )
    # ### end Alembic commands ###
So you need to add the schema creation statement above the table creation, so that the schema already exists when the table is created:
def upgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.execute('create schema schema_b')  # <--- add this
    op.create_table('schema_b.B',
        # ...
    )
    # ### end Alembic commands ###
And then for consistency also drop the schema on the downgrade path:
def downgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.drop_table('schema_b.B')
    op.execute('drop schema schema_b')  # <--- add this
    # ### end Alembic commands ###
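If the same migration may run against a database where the schema already exists, a slightly safer variant (my suggestion, not Alembic output) is PostgreSQL's IF NOT EXISTS form, which makes the upgrade idempotent. Sketched as a fragment of the same migration (the alembic op context is assumed):

```python
def upgrade():
    # idempotent: won't fail if the schema is already there
    op.execute('CREATE SCHEMA IF NOT EXISTS schema_b')
    op.create_table('schema_b.B',
        # ...
    )
```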

Using the SQL initialization hook with ManytoManyField

I'm fairly new to Django and I'm trying to add some 'host' data to 'record' using Django's hook for initialising with SQL (a lowercase SQL file in the app folder's sql subfolder).
Here are the models:
class Record(models.Model):
    species = models.TextField(max_length=80)
    data = models.TextField(max_length=700)
    hosts = models.ManyToManyField('Host')

class Host(models.Model):
    hostname = models.TextField()
I've used a ManyToManyField as each record should be able to have multiple hosts, and hosts should be 'reusable': ie be able to appear in many records.
When I'm trying to insert via SQL I have
INSERT INTO myapp_record VALUES ('Species name', 'data1', XYZ);
I'm not sure what to put for XYZ (the ManyToMany) if I wanted hosts 1, 2 and 3, for example.
Separating them by commas doesn't work, obviously, and a tuple didn't work either.
Should I be trying to insert into the intermediary table Django makes? Does that have a similar hook to the one I'm using? If not, how can I execute SQL inserts on this table?
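To the last part: yes, many-to-many rows live in the intermediary table, so there is no single value to put at XYZ. By Django's default naming conventions that table would be called myapp_record_hosts, with record_id and host_id columns (those names are assumptions based on the conventions; check your actual schema). A stdlib sqlite3 sketch of the equivalent raw inserts:

```python
import sqlite3

conn = sqlite3.connect(':memory:')
cur = conn.cursor()
# Minimal stand-ins for the tables Django would generate for these models.
cur.execute('CREATE TABLE myapp_record (id INTEGER PRIMARY KEY, species TEXT, data TEXT)')
cur.execute('CREATE TABLE myapp_host (id INTEGER PRIMARY KEY, hostname TEXT)')
cur.execute('CREATE TABLE myapp_record_hosts ('
            'id INTEGER PRIMARY KEY, record_id INTEGER, host_id INTEGER)')

cur.execute("INSERT INTO myapp_record (species, data) VALUES ('Species name', 'data1')")
record_id = cur.lastrowid
for host_id in (1, 2, 3):
    cur.execute('INSERT INTO myapp_host (id, hostname) VALUES (?, ?)',
                (host_id, f'host{host_id}'))
    # one join-table row per record/host pair -- this is what replaces "XYZ"
    cur.execute('INSERT INTO myapp_record_hosts (record_id, host_id) VALUES (?, ?)',
                (record_id, host_id))
conn.commit()
```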
The use of initial SQL data files is deprecated. Instead, you should be using a data migration, which might look something like this:
from django.db import models, migrations

def create_records(apps, schema_editor):
    # We can't import the Record model directly, as it may be a newer
    # version than this migration expects. We use the historical version.
    Record = apps.get_model("yourappname", "Record")
    Host = apps.get_model("yourappname", "Host")
    host1 = Host.objects.get(hostname='host1')
    record = Record.objects.create(species='Species name', data='Data')
    record.hosts.add(host1)
    # ...etc...

class Migration(migrations.Migration):
    dependencies = [
        ('yourappname', '0001_initial'),
    ]
    operations = [
        migrations.RunPython(create_records),
    ]

Django manage.py sqlclear omits some tables

I wrote a Python script for dropping the tables of all Django apps (using settings.INSTALLED_APPS):
https://gist.github.com/1520683
My Django project creates 41 tables after running manage.py syncdb, but my script says only 40 tables will be dropped. So I examined the results of sqlall and sqlclear, and found that sqlclear omits one table that stores a ManyToManyField relationship.
I know that dropping the database is much simpler than the script above, but I'm confused about why the Django admin/manage script omits some tables when running SQL commands.
The model below creates the common_userbook_purchasedBooks table when running syncdb, but that table doesn't appear in the sqlclear output.
class UserBook(models.Model):
    user = models.OneToOneField(User)
    purchasedBooks = models.ManyToManyField(Book)
(Added) So I'm using an alternative approach for this:
https://gist.github.com/1520810
lqez, I guess this issue is related to your local environment, because with Django 1.3.1 and Python 2.7.2,
for the models
from django.contrib.auth.models import User
from django.db import models

class Book(models.Model):
    name = models.CharField(max_length=10)

class UserBook(models.Model):
    user = models.OneToOneField(User)
    purchasedBooks = models.ManyToManyField(Book)
when I run (.env)testme$ ./manage.py sqlclear testapp the output looks like:
sqlite3
BEGIN;
DROP TABLE "testapp_userbook";
DROP TABLE "testapp_userbook_purchasedBooks";
DROP TABLE "testapp_book";
COMMIT;
postgresql_psycopg2
BEGIN;
ALTER TABLE "testapp_userbook_purchasedBooks" DROP CONSTRAINT "userbook_id_refs_id_8bda4b0";
DROP TABLE "testapp_userbook";
DROP TABLE "testapp_userbook_purchasedBooks";
DROP TABLE "testapp_book";
COMMIT;
mysql
BEGIN;
ALTER TABLE `testapp_userbook_purchasedBooks` DROP FOREIGN KEY `userbook_id_refs_id_8bda4b0`;
DROP TABLE `testapp_userbook`;
DROP TABLE `testapp_userbook_purchasedBooks`;
DROP TABLE `testapp_book`;
COMMIT;
Also, your script can be improved a bit by using introspection:
from django.db import connection
cursor = connection.cursor()
connection.introspection.get_table_list(cursor)
[u'auth_group', u'auth_group_permissions', u'auth_message', u'auth_permission', u'auth_user', u'auth_user_groups', u'auth_user_user_permissions', u'django_content_type', u'django_session', u'django_site', u'testapp_book', u'testapp_userbook', u'testapp_userbook_purchasedBooks']
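To sketch what a drop-everything script built on that idea looks like, here is the same approach against a plain sqlite3 database (stdlib only; in a real Django project you would iterate over connection.introspection.get_table_list(cursor) instead of querying sqlite_master directly):

```python
import sqlite3

def drop_all_tables(conn):
    """Drop every user table -- the same idea as iterating over
    Django's connection.introspection.get_table_list()."""
    cur = conn.cursor()
    # skip sqlite's internal bookkeeping tables
    cur.execute("SELECT name FROM sqlite_master "
                "WHERE type='table' AND name NOT LIKE 'sqlite_%'")
    for (name,) in cur.fetchall():
        cur.execute('DROP TABLE "{}"'.format(name))
    conn.commit()

conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE testapp_book (id INTEGER PRIMARY KEY)')
conn.execute('CREATE TABLE testapp_userbook (id INTEGER PRIMARY KEY)')
drop_all_tables(conn)
```

Because introspection lists what actually exists in the database, it also catches ManyToManyField join tables that sqlclear misses.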

Django not creating database table

Hey,
I have a database already created. Now I've updated UserProfile with:
class UserProfile(models.Model):
    user = models.ForeignKey(User, unique=True, related_name='user')
    follows = models.ManyToManyField("self", related_name='follows')  # <-- NEW LINE
so python manage.py sqlall myapp returns:
[...]
CREATE TABLE "myapp_userprofile_follows" (
    "id" integer NOT NULL PRIMARY KEY,
    "from_userprofile_id" integer NOT NULL,
    "to_userprofile_id" integer NOT NULL,
    UNIQUE ("from_userprofile_id", "to_userprofile_id")
)
[...]
When I run python manage.py syncdb:
Creating tables ...
Installing custom SQL ...
Installing indexes ...
No fixtures found.
But the table is not created when I try to insert data into it. Why? (I'm testing locally, with sqlite3.)
manage.py syncdb will not modify existing tables to add or remove fields. You will need to either manually modify your database, or use a tool like South to create automated database migrations (which is what I highly recommend).
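With South, the typical workflow for a change like this would look roughly as follows (a sketch; "myapp" is your app's name, and the commands come from South's own management commands):

```shell
# one-time: put the app under South's control
python manage.py convert_to_south myapp
# after adding the ManyToManyField: generate and apply a schema migration
python manage.py schemamigration myapp --auto
python manage.py migrate myapp
```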
Have you added your app to INSTALLED_APPS in settings.py?
settings.py
INSTALLED_APPS = (
    ...,
    'my_app',
)
https://docs.djangoproject.com/en/dev/ref/settings/?from=olddocs#installed-apps
