I'm trying to set up a reusable set of data models which I can include in multiple apps, something like this (I'm using users as an example here, but the actual one is a peewee backend for the Authlib library):
# mixins.py
class UserMixin(peewee.Model):
    username = peewee.CharField()
    password = peewee.CharField()

    def set_password(self):
        # do stuff
        ...
Once that mixin's created, I should be able to import it like this, defining only the additional fields (the defaults will already be there from the mixin)
# models.py
db = peewee.SqliteDatabase(config.get('DATABASE_FILE'))

class BaseModel(peewee.Model):
    class Meta:
        database = db

class User(BaseModel, UserMixin):
    email = peewee.CharField()
    ...
I've seen people do this with SQLAlchemy, but when I use this strategy with peewee it doesn't seem to save the fields properly:
if UserMixin inherits from peewee.Model, it says "unable to resolve import hierarchy" (probably because we end up inheriting from peewee.Model along multiple paths)
if UserMixin is just a plain object, then peewee doesn't seem to handle its fields properly: they all end up as unbound instances and don't get saved in the database.
My question: is there an "official way" to create reusable model mixins with fields in peewee?
I've seen other projects (such as flask-login) use mixins, but those generally add extra methods like set_password in this example, not the field definitions themselves.
I have a few potential alternate solutions, like
Define the models themselves, rather than mixins, in the shared file, and override their .Meta.database separately for each models.py entry
Define only the other functions in the mixin; let the fields be defined separately each time in models.py
Use the shared code as a file to copy-paste from rather than importing directly.
But there's probably some cleaner way of doing this?
Here's a simple example:
from peewee import *

db = SqliteDatabase(':memory:')

class Base(Model):
    class Meta:
        database = db

class UserModelMixin(Model):
    username = TextField()

class User(UserModelMixin, Base):
    pass

print(User._meta.fields)
# {'id': <AutoField: User.id>, 'username': <TextField: User.username>}
I think the problem was the ordering of your mixins.
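To see why fields declared on a mixin can still be picked up, here is a toy sketch of the general mechanism; this is an assumption-laden illustration, not peewee's actual implementation. An ORM metaclass typically walks the class's MRO and gathers every Field attribute it finds, so a mixin's fields are resolved as long as Python can build a consistent MRO, which is what the base-class ordering controls:

```python
# Toy sketch only -- NOT peewee's real metaclass, just an illustration
# of how an ORM metaclass can walk the MRO to collect Field attributes.
class Field:
    pass

class ModelMeta(type):
    def __new__(mcs, name, bases, attrs):
        cls = super().__new__(mcs, name, bases, attrs)
        fields = {}
        # Walk the MRO from most-generic to most-specific so that
        # subclasses can override fields they inherit.
        for klass in reversed(cls.__mro__):
            for key, value in vars(klass).items():
                if isinstance(value, Field):
                    fields[key] = value
        cls._fields = fields
        return cls

class Model(metaclass=ModelMeta):
    pass

class Base(Model):
    pass

class UserModelMixin(Model):
    username = Field()

class User(UserModelMixin, Base):
    pass

print(sorted(User._fields))  # ['username']
```

Under this model of the machinery, the mixin's fields show up on User because they sit on a class in User's MRO, regardless of whether the mixin itself is a Model subclass.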
Django allows us to use '%(class)s' to automatically create a related name from mixins. But when I have a ClassName I'd rather access it using class_name, not classname. I know it's only semantics, but I was wondering if it's possible to adjust the model field to make the name snake_case instead.
I was browsing through django.db.models.fields but I can't find where the class interpolation is happening.
Example code
from django.db import models

class User(models.Model):
    pass

class UserRelatedMixin(models.Model):
    user = models.OneToOneField(
        to=User,
        parent_link=True,
        on_delete=models.CASCADE,
        related_name='%(class)s',
        related_query_name="%(class)s",
    )

    class Meta:
        abstract = True

class HomeAddress(UserRelatedMixin):
    pass
user = User.objects.first()
What I have
user.homeaddress
What I want instead
user.home_address
Right now I'm using a @property, but that doesn't allow ORM queries, so it's only a partial solution.
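For reference, the '%(class)s' interpolation happens in RelatedField.contribute_to_class (in django/db/models/fields/related.py), which substitutes the lowercased class name. One possible direction, offered here as an untested sketch rather than a verified Django field, is to subclass OneToOneField and snake-case the owning class's name before delegating to super(). The name conversion itself is plain Python:

```python
import re

def camel_to_snake(name):
    # Classic two-step CamelCase -> snake_case conversion:
    # 'HomeAddress' -> 'home_address', 'HTTPHeader' -> 'http_header'
    s1 = re.sub(r'(.)([A-Z][a-z]+)', r'\1_\2', name)
    return re.sub(r'([a-z0-9])([A-Z])', r'\1_\2', s1).lower()

print(camel_to_snake('HomeAddress'))  # home_address

# Hypothetical Django integration (untested assumption, requires Django):
#
# class SnakeCaseOneToOneField(models.OneToOneField):
#     def contribute_to_class(self, cls, name, **kwargs):
#         rel = self.remote_field
#         if rel.related_name and '%(class)s' in rel.related_name:
#             rel.related_name = rel.related_name.replace(
#                 '%(class)s', camel_to_snake(cls.__name__))
#         super().contribute_to_class(cls, name, **kwargs)
```

The commented field subclass is only a starting point; whether it interacts correctly with abstract-base field cloning would need to be verified against your Django version.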
I think I have a more or less unorthodox and hackish question for you. What I currently have is a Django project with multiple apps.
I want to take a non-abstract model (ModelA) from one app (app1) and use it in another app (app2) by subclassing it. App1's models
should not be migrated to the DB; I just want to use the capabilities of app1 and its model classes by extending their functionality and logic.
I achieved that by adding both apps to settings.INSTALLED_APPS and preventing app1's models from being migrated to the DB.
INSTALLED_APPS += (
    'App1',
    'App2',
)

# This is needed to just use App1's models
# without creating its database tables
# See: http://stackoverflow.com/a/35921487/1230358
MIGRATION_MODULES = {
    'App1': None,
}
So far so good; ugly and hackish, I know... The remaining problem is that most of app1's models are non-abstract (ModelA), and if I try
to subclass them, none of ModelA's fields get created in the table app2_modelb. This is clear to me, because I excluded app1 from
migrating to the DB, and therefore the table app1_modela is completely missing.
My idea now was to clone ModelA, preserve all its functionality, and change its Meta information from non-abstract to abstract (ModelB.Meta.abstract = True).
I hope that this way all the original fields of ModelA will be inherited by ModelB and end up in its respective DB table and columns (app2_modelb).
What I have right now is:
# In app1 -> models.py
class ModelA(models.Model):
    title = models.CharField(_('title'), max_length=255)
    subtitle = models.CharField(_('subtitle'), max_length=255)

    class Meta:
        abstract = False  # just explicitly for demonstration

# In app2 -> models.py
from app1.models import ModelA

class ModelB(ModelA):
    pass

# Just extending ModelA does not create the title and subtitle fields
# in app2_modelb, because ModelA.Meta.abstract = False
My current way (pseudo code) to make an existing non-abstract model abstract looks like this:
# In app2 -> models.py
from app1.models import ModelA

def get_abstract_class(cls):
    o = dict(cls.__dict__)
    o['_meta'].abstract = True
    o['_meta'].app_label = 'app2'
    o['__module__'] = 'app2.models'
    #return type('Abstract{}'.format(cls.__name__), cls.__bases__, o)
    return type('Abstract{}'.format(cls.__name__), (cls,), o)

ModelB = get_abstract_class(ModelA)

class ModelC(ModelB):
    # title and subtitle are inherited from ModelA
    description = models.CharField(_('description'), max_length=255)
This does not work, so after this lengthy description my (simple) question is: is it possible to clone a non-abstract model class while preserving all its functionality, and how do I change it to be abstract?
Just to be clear: all the fuss above is because I can't change any code in app1. It may well be that app1 is a Django app installed via pip.
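One concrete pitfall in the get_abstract_class approach, worth noting independently of the Django specifics: dict(cls.__dict__) is a shallow copy, so o['_meta'] is still the very same _meta object that ModelA holds, and setting abstract = True on it mutates ModelA itself. A minimal plain-Python illustration:

```python
# Plain-Python illustration of the shallow-copy pitfall; no Django needed.
class Meta:
    abstract = False

class ModelA:
    _meta = Meta()

o = dict(ModelA.__dict__)    # shallow copy: values are shared references
o['_meta'].abstract = True   # mutates the one and only Meta instance

print(ModelA._meta.abstract)  # True -- the "copy" changed the original
```

A deep copy of the Meta object (or constructing a fresh Meta) would be needed to avoid flipping the original model's options as a side effect.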
Why not, in app1
class AbstractBaseModelA(models.Model):
    # other stuff here

    class Meta:
        abstract = True

class ModelA(AbstractBaseModelA):
    # stuff

in app2:

class ModelB(AbstractBaseModelA):
    # stuff
Sorry if I've misunderstood your aims, but I think the above should achieve the same end result.
My app_templ models definition:
models.py
class TableName(models.Model):
    name = models.CharField(max_length=100)

class TableAbstract(models.Model):
    ...

    class Meta:
        abstract = True
It can be used by other apps:
app1 / models.py
from app_templ.models import TableAbstract

class Table1(TableAbstract):
    ...
app2 / models.py
from app_templ.models import TableAbstract

class Table2(TableAbstract):
    ...
and so on...
What I need is for TableName to store the names of the models (tables) that inherit from TableAbstract.
How can I do this with code that lives only in the app_templ app?
Technically, what you are describing sounds fine. You are defining an abstract model and then using it to create several models. You do need to import it, and to specify that you want to create these tables (using your above examples). You should think carefully about why you are using the same model multiple times in different apps (should this actually be one app?), but in theory it is fine.
I don't quite understand your first definition, you should probably define your model something like this:
class TableBaseClass(models.Model):
    name = models.CharField(max_length=100)

    class Meta:
        abstract = True
abstract = True means the model is not created in your database (docs), so for clarity you could store this file in a location distinct from your regular model classes that create tables.
This code:
from app_templ.models import TableAbstract

class Table1(TableAbstract):
    ...
should be in models.py in your app
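The registration half of the question (getting successor names into TableName) isn't covered above. One hedged approach, sketched here in plain Python without Django, is to hook subclass creation on the abstract base via __init_subclass__; in a real app the hook body would get_or_create a TableName row (and would need to run only once the app registry and database are ready) instead of appending to a list:

```python
# Plain-Python sketch: the base class records every subclass name.
# In Django you'd put this hook on TableAbstract and create TableName
# rows instead of appending to a list (an untested assumption).
class TableAbstract:
    registered_names = []

    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        TableAbstract.registered_names.append(cls.__name__)

class Table1(TableAbstract):
    pass

class Table2(TableAbstract):
    pass

print(TableAbstract.registered_names)  # ['Table1', 'Table2']
```

Because the hook lives on the base class, each consuming app only has to subclass TableAbstract as before; no code outside app_templ needs to change.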
I have a bunch of Django models that inherit from django-polymorphic's Polymorphic model. In another model, I have a ForeignKey relation to ContentType which I'd like to limit to models that inherit from a specific base class.
Example:
from django.db import models
from polymorphic import PolymorphicModel
from django.contrib.contenttypes.models import ContentType

class MagicBaseModel(PolymorphicModel):
    def do_magic(self):
        # ...

class MagicObjectA(MagicBaseModel):
    def do_magic(self):
        super(MagicObjectA, self).do_magic()
        # ...

class MagicObjectB(MagicBaseModel):
    def do_magic(self):
        super(MagicObjectB, self).do_magic()
        # ...

class NonMagicObject(models.Model):
    # ...

class MagicAction(models.Model):
    magic_object_type = models.ForeignKey(ContentType)
    # ...
In the example above, I'd like to restrict MagicAction.magic_object_type so that only MagicObjectA and MagicObjectB are available as choices.
I've tried using limit_choices_to like this:
magic_object_type = models.ForeignKey(ContentType, limit_choices_to=Q(polymorphic_ctype=ContentType.objects.get_for_model(MagicBaseModel)))
However, it seems you can't execute that query during model initialisation as the ContentType model isn't ready yet.
Any ideas of a better way I could approach this?
I would try the following, inspired by this answer: Choose queryset for limit_choices_to based on object fields.
Your query for the two content types cannot yet be evaluated in the ForeignKey function, since the db will not yet be ready.
Therefore, you need to use a function instead, which will fetch the two choices for you when needed.
Then, in that function, you can filter on the 'model' field of the content type instance.
Like so:
def get_choices():
    # Evaluated lazily, once the ContentType table is ready.
    return {'model__in': [
        ContentType.objects.get_for_model(MagicObjectA).model,
        ContentType.objects.get_for_model(MagicObjectB).model,
    ]}
Where in your field declaration you can refer to this function, like so:
magic_object_type = models.ForeignKey(ContentType, limit_choices_to=get_choices)
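The key point is deferred evaluation: a callable passed to limit_choices_to is not invoked at class-definition time, when the ContentType registry isn't ready yet, but later, when the choices are actually needed. A toy plain-Python illustration of the difference:

```python
# Toy illustration (no Django): why passing a callable defers a lookup
# that would fail if performed at definition time.
registry = {}

def get_choices():
    # Runs only when called -- by then the registry is populated.
    return registry['magic_types']

# At "definition time" registry['magic_types'] does not exist yet, so
# calling get_choices() here would raise KeyError.  Storing the
# callable itself is safe:
limit_choices_to = get_choices

# ... later, the framework finishes initialising ...
registry['magic_types'] = ['MagicObjectA', 'MagicObjectB']

print(limit_choices_to())  # ['MagicObjectA', 'MagicObjectB']
```

The same principle explains why the Q(...) expression in the question fails: it calls ContentType.objects.get_for_model eagerly, while the model classes are still being constructed.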
I would like to create a BaseModel that is an EndpointsModel to contain properties common across all my models (ie. created_at, modified_at). Then I would like to create a User model that extends that BaseModel.
However, I'm running into an issue: when I look at my "user.create" method in the API Explorer, the request body only shows the BaseModel properties created_at and modified_at, but not the username property.
Here's what I have:
from endpoints_proto_datastore.ndb import EndpointsModel
from google.appengine.ext import ndb

class BaseModel(EndpointsModel):
    created_at = ndb.DateTimeProperty(auto_now_add=True)
    modified_at = ndb.DateTimeProperty(auto_now=True)

class User(BaseModel):
    username = ndb.StringProperty(required=True)
Here's the API built using Google Cloud Endpoints:
import endpoints
from google.appengine.ext import ndb
from models import User
from protorpc import remote

@endpoints.api(name='user', version='v1',
               description='API for User Management')
class UserApi(remote.Service):

    @User.method(name='user.create', path='user')
    def create_user(self, user):
        user.put()
        return user

application = endpoints.api_server([UserApi])
If you go to http://localhost:8080/_ah/api/discovery/v1/apis/user/v1/rest you'll see the discovery document generated by your API. Note that (toward the bottom) the create method on the user resource is shown as taking a BaseModel rather than a User.
Now I don't know precisely why this happens (it's definitely related to the magic being done by EndpointsModel), but I have been able to achieve the result you want by switching the inheritance around and treating BaseModel as a mixin rather than a base class; that way the User model can inherit directly from EndpointsModel:
class BaseModel:
    created_at = ndb.DateTimeProperty(auto_now_add=True)
    modified_at = ndb.DateTimeProperty(auto_now=True)

class User(BaseModel, EndpointsModel):
    username = ndb.StringProperty(required=True)
It would then make sense to rename BaseModel to something that makes it more explicit that it's now a mixin.
If you check the same discovery document (or API Explorer) you'll notice create takes a User message after this change.