I want to connect to a Postgres database running in my dev Docker container.
I can establish a connection via the connection string:
def create_api():
    from .models import db
    from .models.userModels import User

    # Initialize and configure the Flask app
    app = Flask(__name__)
    app.config.from_object(AppConfig)
    db = SQLAlchemy(app)
I have a separate module for my routes and the related classes, as well as a separate module for the models I use for Postgres model-based communication.
In the __init__ file I execute the create_api() method. There I create the routes and connect them to a Resource class (userResource.py).
In userResource.py I use userModels.User for my database operations.
Structure:
- api
  - __init__.py
  - models
    - userModels.py
  - resources
    - userResource.py
Model:
class User(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(100))

    def __init__(self, name):
        self.name = name
The docs say that I should use the db.create_all() method to create the user table if it does not exist.
I've also read other answers on this topic which mention that, if the model class is placed in a separate module, you have to import it before executing this command. As you can see, I've imported the model beforehand, but nothing happens in my database.
Is there something else I have to take care of?
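For reference, the pattern the docs describe usually looks like the sketch below. This is a minimal sketch reusing the question's names (AppConfig, models.userModels.User); the two key points are that a single SQLAlchemy instance is shared and that db.create_all() runs inside an application context after the models have been imported.

from flask import Flask
from flask_sqlalchemy import SQLAlchemy

db = SQLAlchemy()  # the single shared instance, e.g. in models/__init__.py

def create_api():
    app = Flask(__name__)
    app.config.from_object(AppConfig)  # AppConfig as in the question
    db.init_app(app)  # bind the existing instance instead of creating a second one
    from .models.userModels import User  # registers the model on db.metadata
    with app.app_context():
        db.create_all()  # now sees the imported User model
    return app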
I'm working on a project in which I have several sub-projects sharing the same database architecture, so I used a peewee Model like this:
dynamic_db = SqliteDatabase(None)

class BaseModel(Model):
    class Meta:
        database = dynamic_db

class KV(BaseModel):
    key = TextField()
    value = IntegerField()
And whenever a new project is created, I call a function:
dynamic_db.init(r'{}\database.db'.format(ProjectName.upper()))
dynamic_db.connect()
dynamic_db.create_tables([KV])
dynamic_db.close()
The problem is that once this database is created, I can't access it with peewee.
When I try to create a record:
KV.create(key='Saul', value=123)
I get this error:
peewee.InterfaceError: Error, database must be initialized before opening a connection.
I would appreciate any help or cookbook for peewee.
I believe something is incorrect, either in your question description or in the error you are receiving. The call you are making to .init() is what initializes the database. After that, you should have no problems using it.
Full example which works fine:
from peewee import *

db = SqliteDatabase(None)

class Base(Model):
    class Meta:
        database = db

class KV(Base):
    key = TextField()
    value = IntegerField()

db.init('foo.db')  # database is now initialized
db.connect()
db.create_tables([KV])  # no problems
db.close()
I was finally able to create a record.
I didn't mention that I was trying to create them in another file, but the procedure is the same as the one coleifer posted in his answer.
The file in which I create the peewee models is databases.py, so in the other file I do the following:
import databases
databases.db.init('foo.db')
databases.KV.create(key='Saul', value=123)
Thanks!
As far as I know from the research I've done, the typical way of defining a table using PonyORM in Python is the following:
from pony.orm import *

db = Database()
# Database connection ...

class SampleTable(db.Entity):
    sample_int_field = Required(int)
    sample_string_field = Required(str)
    # ...

db.generate_mapping(create_tables=True)
My problem: this uses db.Entity.
I wish to define a table without using the specific Database instance, in an abstract, general manner, and connect it to the instance when I need to.
Is there a way to do so?
Concept (not real runnable code, presumably):

# SampleAbstractTable.py
from pony.orm import *

class SampleAbstractTable(Database):
    sample_int_field = Required(int)
    sample_string_field = Required(str)
    # ...

# main.py
from pony.orm import *
import SampleAbstractTable

db = Database()
# Database connection ...
db.connectTables((SampleAbstractTable.SampleAbstractTable, ...))
db.generate_mapping(create_tables=True)
EDIT:
One idea I have is to create a wrapper class for the database I wish to use with a certain group of tables, and define the tables in the __init__, because the whole point of me wishing to define tables dynamically is to separate the Database instance creation from the table classes' definitions, namely:
from pony.orm import *

class sampleDatabase:
    def __init__(self):
        self._db = Database()
        # Database connection ...

        class TableA(self._db.Entity):
            # ...

        class TableB(self._db.Entity):
            # ...

        self._db.generate_mapping(create_tables=True)

But then I have issues accessing the database tables...
First of all, you're working with Entities, not Tables. They're not the same thing.
Your problem can be solved by just defining a factory function:
def define_entities(db):
    class Entity1(db.Entity):
        attr1 = Required(str)
        # ... and so on
And then later, when you create your Database instance, you just call:

db = Database(...)
define_entities(db)
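To make the flow concrete, here is a minimal runnable sketch of that factory approach, assuming an in-memory SQLite database; the entity and attribute names are illustrative.

from pony.orm import Database, Required, db_session

def define_entities(db):
    class Entity1(db.Entity):
        attr1 = Required(str)

db = Database()
define_entities(db)                      # entities get attached to this instance
db.bind(provider='sqlite', filename=':memory:')
db.generate_mapping(create_tables=True)  # tables are only created at this point

with db_session:
    db.entities['Entity1'](attr1='hello')  # entity classes are reachable via the db registry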
I need to programmatically generate the CREATE TABLE statement for a given unmanaged model in my Django app (managed = False).
Since I'm working on a legacy database, I don't want to create a migration and use sqlmigrate.
The ./manage.py sql command was useful for this purpose, but it was removed in Django 1.8.
Do you know about any alternatives?
As suggested, I'm posting a complete answer for the case that the question might imply.
Suppose you have an external DB table, that you decided to access as a Django model and therefore have described it as an unmanaged model (Meta: managed = False).
Later you need to be able to create it in your code, e.g for some tests using your local DB. Obviously, Django doesn't make migrations for unmanaged models and therefore won't create it in your test DB.
This can be solved using Django APIs, without resorting to raw SQL, via the SchemaEditor. See a more complete example below, but as a short answer you would use it like this:
from django.db import connections

with connections['db_to_create_a_table_in'].schema_editor() as schema_editor:
    schema_editor.create_model(YourUnmanagedModelClass)
A practical example:
# your_app/models/your_model.py

from django.db import models

class IntegrationView(models.Model):
    """A read-only model to access a view in some external DB."""

    class Meta:
        managed = False
        db_table = 'integration_view'

    name = models.CharField(
        db_column='object_name',
        max_length=255,
        primary_key=True,
        verbose_name='Object Name',
    )
    some_value = models.CharField(
        db_column='some_object_value',
        max_length=255,
        blank=True,
        null=True,
        verbose_name='Some Object Value',
    )

    # Depending on the situation it might be a good idea to redefine
    # some methods as a NOOP as a safety net.
    # Note that it's not completely safe this way, but it might help with
    # some silly mistakes in user code.
    def save(self, *args, **kwargs):
        """Prevent data modification."""
        pass

    def delete(self, *args, **kwargs):
        """Prevent data deletion."""
        pass
Now, suppose you need to be able to create this model via Django, e.g. for some tests.
# your_app/tests/some_test.py

# This import allows access to the `SchemaEditor` for the DB
from django.db import connections
from django.test import TestCase

from your_app.models.your_model import IntegrationView

class SomeLogicTestCase(TestCase):
    """Tests some logic that uses `IntegrationView`."""

    # Since the `IntegrationView` is assumed to be read-only for the case
    # being described, it's a good idea to put the setup logic in the class
    # setup fixture, which runs only once for the whole test case.
    @classmethod
    def setUpClass(cls):
        """Prepares `IntegrationView` mock data for the test case."""
        # This is the actual part that creates the table in the DB for the
        # unmanaged model (any model in fact, but managed models will have
        # their tables created already by the Django testing framework).
        # Note: here we're able to choose which DB, defined in your settings,
        # will be used to create the table.
        with connections['external_db'].schema_editor() as schema_editor:
            schema_editor.create_model(IntegrationView)
        # That's all you need; after the execution of these statements
        # a DB table for `IntegrationView` will be created in the DB
        # defined as `external_db`.

        # Now suppose we need to add some mock data...
        # Again, if we consider the table to be read-only, the data can be
        # defined here; otherwise it's better to do it in the `setUp()` method.

        # Remember `IntegrationView.save()` is overridden as a NOOP, so simple
        # calls to `IntegrationView.save()` or `IntegrationView.objects.create()`
        # won't do anything, so we need to "Improvise. Adapt. Overcome."

        # One way is to use the `save()` method of the base class,
        # but provide the instance of our class.
        integration_view = IntegrationView(
            name='Biggus Dickus',
            some_value='Something really important.',
        )
        super(IntegrationView, integration_view).save(using='external_db')

        # Another is to use `bulk_create()`, which doesn't use `save()`
        # internally, and in fact is a better solution if we're creating
        # many records.
        IntegrationView.objects.using('external_db').bulk_create([
            IntegrationView(
                name='Sillius Soddus',
                some_value='Something important',
            ),
            IntegrationView(
                name='Naughtius Maximus',
                some_value='Whatever',
            ),
        ])

    # Don't forget to clean up afterwards.
    @classmethod
    def tearDownClass(cls):
        with connections['external_db'].schema_editor() as schema_editor:
            schema_editor.delete_model(IntegrationView)

    def test_some_logic_using_data_from_integration_view(self):
        self.assertTrue(IntegrationView.objects.using('external_db').filter(
            name='Biggus Dickus',
        ))
To make the example more complete... Since we're using multiple DBs (default and external_db), Django will try to run migrations on both of them for the tests, and as of now there's no option in the DB settings to prevent this. So we have to use a custom DB router for testing.
# your_app/tests/base.py

class PreventMigrationsDBRouter:
    """DB router to prevent migrations for specific DBs during tests."""

    _NO_MIGRATION_DBS = {'external_db'}

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        """Actually disallows migrations for specific DBs."""
        return db not in self._NO_MIGRATION_DBS
And a test settings file example for the described case:
# settings/test.py

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.oracle',
        'NAME': 'db_name',
        'USER': 'username',
        'HOST': 'localhost',
        'PASSWORD': 'password',
        'PORT': '1521',
    },
    # For production here we would have settings to connect to the external DB,
    # but for testing purposes we can get by with an SQLite DB.
    'external_db': {
        'ENGINE': 'django.db.backends.sqlite3',
    },
}

# It's not necessary to use a router in the production config, since if the DB
# is not specified explicitly for some action, Django will use the `default` DB.
DATABASE_ROUTERS = ['your_app.tests.base.PreventMigrationsDBRouter']
Hope this detailed, new-Django-user-friendly example helps someone and saves them some time.
Unfortunately there seems to be no easy way to do this, but luckily I have just succeeded in producing a working snippet for you by digging into the internals of the Django migrations jungle.
Just:
save the code below to get_sql_create_table.py, for example
do $ export DJANGO_SETTINGS_MODULE=yourproject.settings
launch the script with python get_sql_create_table.py yourapp.yourmodel
and it should output what you need.
Hope it helps!
import django

django.setup()

from django.db import connections
from django.db.migrations import operations
from django.db.migrations.migration import Migration
from django.db.migrations.state import ModelState, ProjectState

def get_create_sql_for_model(model):
    model_state = ModelState.from_model(model)

    # Create a fake migration with the CreateModel operation
    cm = operations.CreateModel(name=model_state.name, fields=model_state.fields)
    migration = Migration("fake_migration", "app")
    migration.operations.append(cm)

    # Let the migration framework think that the project is in an initial state
    state = ProjectState()

    # Get the SQL through the schema_editor bound to the connection
    connection = connections['default']
    with connection.schema_editor(collect_sql=True, atomic=migration.atomic) as schema_editor:
        state = migration.apply(state, schema_editor, collect_sql=True)

    # Return the CREATE TABLE statement
    return "\n".join(schema_editor.collected_sql)

if __name__ == "__main__":
    import importlib
    import sys

    if len(sys.argv) < 2:
        print("Usage: {} <app.model>".format(sys.argv[0]))
        sys.exit(100)

    app, model_name = sys.argv[1].split('.')
    models = importlib.import_module("{}.models".format(app))
    model = getattr(models, model_name)
    rv = get_create_sql_for_model(model)
    print(rv)
For Django v4.1.3, the get_create_sql_for_model source code above changes like this:
from django.db import connections
from django.db.migrations import operations
from django.db.migrations.migration import Migration
from django.db.migrations.state import ModelState, ProjectState

def get_create_sql_for_model(model):
    model_state = ModelState.from_model(model)
    table_name = model_state.options['db_table']

    # Create a fake migration with the CreateModel operation
    cm = operations.CreateModel(name=model_state.name, fields=model_state.fields.items())
    migration = Migration("fake_migration", "app")
    migration.operations.append(cm)

    # Let the migration framework think that the project is in an initial state
    state = ProjectState()

    # Get the SQL through the schema_editor bound to the connection
    connection = connections['default']
    with connection.schema_editor(collect_sql=True, atomic=migration.atomic) as schema_editor:
        state = migration.apply(state, schema_editor, collect_sql=True)

    # Keep only the actual SQL statements, dropping comment lines
    sqls = schema_editor.collected_sql
    items = []
    for sql in sqls:
        if sql.startswith('--'):
            continue
        items.append(sql)
    return table_name, items
# EOP
I used it to create all tables (like the syncdb command of old Django versions):
# select_by_raw_sql / exec_by_raw_sql are the author's own raw-SQL helpers
for app in settings.INSTALLED_APPS:
    app_name = app.split('.')[0]
    app_models = apps.get_app_config(app_name).get_models()
    for model in app_models:
        table_name, sqls = get_create_sql_for_model(model)
        if settings.DEBUG:
            s = "SELECT COUNT(*) AS c FROM sqlite_master WHERE name = '%s'" % table_name
        else:
            s = "SELECT COUNT(*) AS c FROM information_schema.TABLES WHERE table_name='%s'" % table_name
        rs = select_by_raw_sql(s)
        if not rs[0]['c']:
            for sql in sqls:
                exec_by_raw_sql(sql)
            print('CREATE TABLE DONE: %s' % table_name)
The full source code can be found at: Django syncdb command came back for v4.1.3 version.
I'm starting to write tests with Flask-SQLAlchemy, and I'd like to add some fixtures for those. I have plenty of good data for that in my development database and a lot of tables so writing data manually would get annoying. I'd really like to just sample data from the dev database into fixtures and then use those. What's a good way to do this?
I would use factory_boy.
To create a model factory you just do:
import factory
from . import models

class UserFactory(factory.Factory):
    class Meta:
        model = models.User

    first_name = 'John'
    last_name = 'Doe'
    admin = False
Then, to create instances:

UserFactory.create()

To add static data, just pass it as a kwarg to create():

UserFactory.create(first_name='hank')

So to seed a bunch of records, throw that in a for loop. :)
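For instance, a trivial seeding loop (the field name matches the factory above):

for i in range(20):
    UserFactory.create(first_name='user{}'.format(i))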
If you need to handle fixtures with SQLAlchemy or another ORM/backend, the Flask-Fixtures 0.3.3 package may be of use.
It is a simple library that allows you to add database fixtures for your unit tests using nothing but JSON or YAML.
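As a rough illustration of the idea (the exact keys are documented in the Flask-Fixtures README; the table and column names here are made up), a JSON fixture file might look like this:

[
    {
        "table": "user",
        "records": [
            {"id": 1, "first_name": "John", "last_name": "Doe"}
        ]
    }
]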
While Kyle's answer is correct, we still need to provide the model factory with a database session; otherwise we would never actually commit to the DB. Also, factory_boy has a dedicated class, SQLAlchemyModelFactory, for interacting with SQLAlchemy.
https://factoryboy.readthedocs.io/en/stable/orms.html#sqlalchemy
The whole setup could look something like this:
import os

import pytest
from sqlalchemy import create_engine
from sqlalchemy.orm import scoped_session, sessionmaker
from factory.alchemy import SQLAlchemyModelFactory

# Assumed imports: `models` is your SQLAlchemy models module and `Base`
# the declarative base your models share
from . import models
from .models import Base

engine = create_engine(os.getenv("SQLALCHEMY_DATABASE_URI"))
SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)

# This resets our tables in between each test
def _reset_schema():
    db = SessionLocal()
    for table in Base.metadata.sorted_tables:
        db.execute(
            'TRUNCATE {name} RESTART IDENTITY CASCADE;'.format(name=table.name)
        )
    db.commit()

@pytest.fixture
def test_db():
    yield engine
    engine.dispose()
    _reset_schema()

@pytest.fixture
def session(test_db):
    connection = test_db.connect()
    transaction = connection.begin()
    db = scoped_session(sessionmaker(bind=engine))
    try:
        yield db
    finally:
        db.close()
        transaction.rollback()
        connection.close()
        db.remove()

class UserFactory(SQLAlchemyModelFactory):
    class Meta:
        model = models.User

    first_name = 'John'
    last_name = 'Doe'
    admin = False

@pytest.fixture(autouse=True)
def provide_session_to_factories(session):
    # usually you'd have one factory for each db table
    for factory in [UserFactory, ...]:
        factory._meta.sqlalchemy_session = session
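With those fixtures in place, an illustrative test could look like this (assuming the models.User name used above):

def test_user_factory_commits_to_db(session):
    UserFactory.create()
    assert session.query(models.User).count() == 1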