Consider this code:
from sqlalchemy import Column, Integer
from sqlmodel import SQLModel, Field

class Task(SQLModel, table=True):
    id = Column(Integer, primary_key=True, index=True)
I only stepped away from Python for a few weeks and now there are arguments in the class-inheritance brackets? What does this do, how do I do it, and are there drawbacks/benefits? What's it called? Once I know what it's called, I can look it up.
EDIT: This is not a typo, this code works as is.
Short answer
table=True is a keyword argument in the class definition; it is what SQLModel uses to distinguish a plain Pydantic model from a model that is also an SQLAlchemy table.
Long answer
Not that long... but here is the class and here is where the check happens. There is more to it, but basically it decides whether the class should behave as a Pydantic model or as an SQLAlchemy model.
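In general, the language feature you are looking for is keyword arguments in the class statement: any keyword other than metaclass is forwarded to the metaclass and to the parent's __init_subclass__ hook (PEP 487), which is how libraries like SQLModel pick it up. A minimal sketch of the mechanism (the Model/is_table names are made up for illustration):

class Model:
    def __init_subclass__(cls, table=False, **kwargs):
        super().__init_subclass__(**kwargs)
        cls.is_table = table  # react to the class keyword here

class Task(Model, table=True):
    pass

print(Task.is_table)  # True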
Related
I would like to generate a Pydantic model that inherits from a parent class, but only has a subset of that parent model's fields.
E.g. ModelB should inherit only field_b from ModelA:
from pydantic import BaseModel

class ModelA(BaseModel):
    field_a: str
    field_b: str

class ModelB(ModelA):
    pass
As far as I know, there is no built-in mechanism for this in Pydantic.
Difficult solutions
You could start messing around with internals like __fields__ and __fields_set__, but I would strongly advise against it. This is likely less than trivial, because you need to take into account validators that are already registered and maybe a bunch of other things that happen internally once a field is defined on a model.
You could also go the route of e.g. defining your own __init_subclass__ on ModelA or even subclassing ModelMetaclass, but this will likely lead to the same difficulties. Unless you are very familiar with the intricacies of Pydantic models and are prepared to rework your code if something fundamentally changes on their end, I would not recommend this.
I can think of a few workarounds though.
Potential workarounds
The simplest one in my opinion is factoring out the fields that you want to share into their own model:
from pydantic import BaseModel

class ModelWithB(BaseModel):
    field_b: str

class ModelA(ModelWithB):
    field_a: str

class ModelB(ModelWithB):
    pass
This obviously doesn't work if you have no control over ModelA. It may also mess up the order of the fields on ModelA (in this case field_b would come before field_a), which may or may not be important to you. Validation, for example, depends on the order in which fields were defined.
Another possible workaround would be to override the unneeded field in ModelB, make it optional with a None default, and exclude it from dict and json exports:
from pydantic import BaseModel, Field

class ModelA(BaseModel):
    field_a: str
    field_b: str

class ModelB(ModelA):
    field_a: str | None = Field(default=None, exclude=True)

b = ModelB(field_b="foo")
print(b.json())
Output:
{"field_b": "foo"}
Note that this does not actually get rid of the field. It is still there and by default still visible in the model's string representation, for example, as well as in the model schema. But at least you never need to pass a value for field_a, and it is not present when calling dict or json by default.
Note also that you may run into additional problems if you have custom validators for field_a that don't work with a None value.
If you provide more details, I might amend this answer, but so far I hope this helps a little.
It was enough for me to hard-copy the field and adjust the extras I had defined. Here is a snippet from my code:
import copy
from typing import Type

from pydantic import BaseModel

def copy_primary_field(
    model_from: Type[BaseModel],
    model_to: Type[BaseModel],
    primary_key: str,
) -> Type[BaseModel]:
    # the parameters are model classes, hence Type[BaseModel]
    new_field_name = f"{model_from.__name__}_{primary_key}"
    model_to.__fields__[new_field_name] = copy.deepcopy(
        model_from.__fields__[primary_key]
    )
    model_to.__fields__[new_field_name].name = new_field_name
    model_to.__fields__[new_field_name].field_info.extra["references"] = (
        f"{model_from.__name__}:{primary_key}"
    )
    return model_to
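For illustration, hypothetical usage of that helper might look as follows (Parent and Child are made-up names, and this relies on Pydantic v1 internals such as __fields__ and field_info, so treat it as a sketch):

class Parent(BaseModel):
    id: int

class Child(BaseModel):
    name: str

# Child gains a "Parent_id" field whose extras carry "references": "Parent:id"
Child = copy_primary_field(Parent, Child, "id")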
EDIT
The issue arises when trying to inherit from a class that is an attribute of an instance. This minimal reproducible example reproduces it; I'll leave the rest of the question below for posterity:
class A:
    class SubA:
        pass

a = A()

class B(a.SubA):
    pass
mypy output:
Name 'a.SubA' is not defined
This passes:
class A:
    class SubA:
        pass

class B(A.SubA):
    pass
The example in this related issue is pretty much exactly what Flask-SQLAlchemy does to provide the declarative base class under the db namespace. In the issue, a mypy maintainer asserts that they wouldn't support the pattern.
My question is: what is incorrect about the above pattern such that mypy would not support it? Especially in the context that it is used by a large project such as Flask-SQLAlchemy.
Further, what is the best way for users of Flask-SQLAlchemy and mypy to manage this in their projects?
ORIGINAL QUESTION
This question isn't about the lack of Flask-SQLAlchemy stubs. Accepting that, I came across this question.
Please help me to understand why the following does not work as I expect.
In my environment I have only Flask-SQLAlchemy and mypy installed.
I have mypy configured with ignore_missing_imports = True.
A mypy pass over the following:
from flask_sqlalchemy import SQLAlchemy

db = SQLAlchemy()

class Widget(db.Model):
    id = db.Column(db.Integer, primary_key=True)
reveals:
error: Name 'db.Model' is not defined
So, I've attempted to subclass SQLAlchemy to provide an annotation for Model, which shows up in __annotations__ on the object, yet the mypy analysis of the module doesn't change:
from flask_sqlalchemy import SQLAlchemy
from flask_sqlalchemy.model import DefaultMeta

class TypedSQLAlchemy(SQLAlchemy):
    Model: DefaultMeta

db = TypedSQLAlchemy()
print(db.__annotations__)  # {'Model': <class 'flask_sqlalchemy.model.DefaultMeta'>}

class Widget(db.Model):
    id = db.Column(db.Integer, primary_key=True)
When I execute the file, the print(db.__annotations__) call displays {'Model': <class 'flask_sqlalchemy.model.DefaultMeta'>}, yet mypy still reports the same error:
error: Name 'db.Model' is not defined
I would expect that providing an annotation for db.Model should make that error go away.
Earlier Edit
I initially misinterpreted the error: it doesn't suggest that the attribute Model doesn't exist on db, it suggests that the name db.Model doesn't exist in the namespace. But why would mypy treat db.Model as a full name, rather than db as the locally defined name and Model as its attribute? Is this something to do with inheriting from a class variable?
Also, my annotation was incorrect; it should be:

from typing import Type

class TypedSQLAlchemy(SQLAlchemy):
    Model: Type[DefaultMeta]
You should use A.SubA.
As far as I know, accessing a nested class via an instance variable is not allowed from mypy's point of view, because classes derived from A could override the nested class and mypy cannot tell which one is meant, as in:
class A:
    class SubA:
        pass

class C(A):
    class SubA:
        pass

def foo(a: A):
    class B(a.SubA):  # which SubA is this?
        pass

foo(C())
Update:
As for Flask-SQLAlchemy, the following workaround was suggested in this discussion:
from app import db
from sqlalchemy.ext.declarative import DeclarativeMeta

BaseModel: DeclarativeMeta = db.Model

class MyModel(BaseModel): ...
If you are using flask_sqlalchemy, you can use from flask_sqlalchemy.model import DefaultMeta instead of DeclarativeMeta.
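Spelled out, that variant of the workaround reads:

from app import db
from flask_sqlalchemy.model import DefaultMeta

BaseModel: DefaultMeta = db.Model

class MyModel(BaseModel): ...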
I'm trying to use an Enum within my mixin model. The mixin looks like this:
from enum import Enum as EnumClass

from sqlalchemy import Column, Enum, Integer

class MyEnum(EnumClass):
    active = 1
    deactive = 2
    deleted = 3

class MyMixin:
    id = Column(Integer(), primary_key=True)
    status = Column(Enum(MyEnum), nullable=False, unique=False, default=MyEnum.active)
And I have other models inheriting my mixin like this:
class Inherited1(MyMixin, Base):
    __tablename__ = 'test_table_1'

class Inherited2(MyMixin, Base):
    __tablename__ = 'test_table_2'
I'm using PostgreSQL as my database. When I try to use Base.metadata.create_all(), it returns the following error:
sqlalchemy.exc.ProgrammingError: (psycopg2.ProgrammingError) type "myenum" already exists
The reason is that it tries to recreate the enum type in PostgreSQL once per table. A workaround would be defining the enum on the inherited classes and passing the name parameter to the Enum type used for the column.
But I wanted to know: is there any better way to use MyMixin with the Enum type and inherit it in multiple models?
I should add that I am using Alembic for migrations. I even tried adding create_type=False as a parameter to sa.Enum() within the sa.Column() definition for my enum type, and it didn't work, like this:

sa.Column('status', sa.Enum('active', 'deactive', 'deleted', name='myenum', create_type=False), nullable=False)
Workaround:
I added this part to the question because I still don't think it is the best practice.
I edited the migration script by changing sa.Enum() to sa.dialects.postgresql.ENUM():

sa.Column('status', sa.dialects.postgresql.ENUM('active', 'deactive', 'deleted', name='myenum', create_type=False), nullable=False)
It seems the sa.Enum type doesn't take a create_type parameter, since that parameter is exclusive to the PostgreSQL dialect.
But I'm still looking for a better way that doesn't require making changes to the migration script.
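For what it's worth, one direction that matches the workaround above is to share a single named PostgreSQL ENUM instance across the models and create the type explicitly once before the tables; a minimal sketch (the engine variable and the declared_attr wrapping are assumptions, not from the question):

from sqlalchemy import Column, Integer
from sqlalchemy.dialects import postgresql
from sqlalchemy.ext.declarative import declared_attr

# create_type=False stops each table's DDL from emitting CREATE TYPE
status_enum = postgresql.ENUM(MyEnum, name="myenum", create_type=False)

class MyMixin:
    id = Column(Integer(), primary_key=True)

    @declared_attr
    def status(cls):
        return Column(status_enum, nullable=False, default=MyEnum.active)

status_enum.create(engine, checkfirst=True)  # create the type exactly once
Base.metadata.create_all(engine)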
How would you go about creating a relationship between two different entity models?
When I try this I get an error:
class Spec(ndb.Model):
    userKey = ndb.KeyProperty(kind=User)

class User(ndb.Model):
    specs = ndb.KeyProperty(kind=Spec, repeated=True)
The error, as I understand it, stems from referencing User before it is defined.
I did the following to solve it, and I use get_by_id, but I do not like this solution:
class Spec(ndb.Model):
    userKey = ndb.IntegerProperty()

class User(ndb.Model):
    specs = ndb.KeyProperty(kind=Spec, repeated=True)
How would you solve this so I can define my models as in the first example? Even better, how would you go about defining each class in its own file/module?
I tried following this article, but it seems to be outdated and not relevant to ndb:
https://developers.google.com/appengine/articles/modeling
Thank you
As the docs show, the kind argument can be a string. So use kind='User'.
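Applied to the models from the question, that looks like:

from google.appengine.ext import ndb

class Spec(ndb.Model):
    userKey = ndb.KeyProperty(kind='User')  # the string defers the kind lookup

class User(ndb.Model):
    specs = ndb.KeyProperty(kind=Spec, repeated=True)

Because the kind is resolved by name, this also works when the two classes live in separate modules.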
OK, so I have two problems:
My first is SQLAlchemy related. I'm forced to use SQLite and I have to implement some cascade deletes. Now, I've found that relationships do the job; however, these would normally be declared in the parent table. Due to some design decisions I'm forced to do this in the child, so I'm doing something like:
class Operation(Base):
    """
    The class used to log any action executed in Projects.
    """
    __tablename__ = 'OPERATIONS'

    id = Column(Integer, primary_key=True)
    parameters = Column(String)
    # ... rest of class here ...

class DataType(Base):
    __tablename__ = 'DATA_TYPES'

    id = Column(Integer, primary_key=True)
    gid = Column(String)
    # ... more params ...
    parent_operation = relationship(Operation,
                                    backref=backref("DATA_TYPES",
                                                    order_by=id,
                                                    cascade="all,delete"))
    # ... rest of class ...
Now, this seems to work, but I'm still not certain of a few things.
Firstly, what can I do with parent_operation from here on? I see that the cascade works, but I make no use of parent_operation except for the actual declaration.
Secondly, does the "DATA_TYPES" in the above case, the first parameter of the backref, need to be the name of the child table, or does it need to be unique per model?
And finally, in my case both the Operation and DataType classes are in the same module, so I can pass Operation as the first parameter of the relationship. If this weren't the case and they lived in separate modules, should I pass 'Operation' or 'OPERATIONS' to relationship() (the class name or the table name?)
Now, my second problem is more core Python, but since it still has some connection with the above I'll add it here. I need to be able to add a class attribute dynamically; basically, I need to add a relationship like the ones declared above.
class BaseClass(object):
    def __init__(self):
        my_class = self.__class__
        if not hasattr(my_class, self.__class__.__name__):
            reference = ("my_class." + self.__class__.__name__ +
                         " = relationship('DataType', backref=backref('" +
                         self.__class__.__name__ + "', cascade='all,delete'))")
            exec(reference)
The reasons WHY I need to do this are complicated and have to do with some design decisions (basically, I need every class that extends this one to have a relationship declared to the DataType class). Now, I know using exec isn't good practice, so is there a better way to do the above?
Regards,
Bogdan
For the second part of your question, keep in mind that anything in your class constructor won't be instrumented by SQLAlchemy. In your example you can simply declare the relationship in its own class (note that it does not inherit from your declarative_base class) and then inherit it in any subclass, something like this:
from sqlalchemy.ext.declarative import declared_attr

class BaseDataType(object):
    # relationship() on a mixin must be wrapped in declared_attr;
    # a per-class backref name avoids collisions on Operation
    @declared_attr
    def parent_operation(cls):
        return relationship(Operation, backref=cls.__name__.lower())

class DataTypeA(Base, BaseDataType):
    id = Column(Integer, primary_key=True)

class DataTypeB(Base, BaseDataType):
    id = Column(Integer, primary_key=True)
The SQLAlchemy documentation gives good examples of what's possible:
http://www.sqlalchemy.org/docs/orm/extensions/declarative.html#mixing-in-relationships
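To see the cascade from the question wired through such a mixin, here is a self-contained sketch; the in-memory SQLite engine, the table names, and the operation_id foreign key column are assumptions for illustration:

from sqlalchemy import Column, ForeignKey, Integer, create_engine
from sqlalchemy.ext.declarative import declarative_base, declared_attr
from sqlalchemy.orm import backref, relationship, sessionmaker

Base = declarative_base()

class Operation(Base):
    __tablename__ = 'OPERATIONS'
    id = Column(Integer, primary_key=True)

class BaseDataType(object):
    @declared_attr
    def operation_id(cls):
        return Column(Integer, ForeignKey('OPERATIONS.id'))

    @declared_attr
    def parent_operation(cls):
        # the cascade lives on the backref, so deleting an Operation
        # also deletes the rows that reference it
        return relationship(
            Operation,
            backref=backref(cls.__name__.lower(), cascade="all, delete"),
        )

class DataTypeA(Base, BaseDataType):
    __tablename__ = 'DATA_TYPES_A'
    id = Column(Integer, primary_key=True)

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()

op = Operation()
session.add_all([op, DataTypeA(parent_operation=op)])
session.commit()

session.delete(op)  # cascades to the DataTypeA row
session.commit()
print(session.query(DataTypeA).count())  # prints 0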