I cannot figure out how to display a one-to-many relationship using FastAPI and SQLModel. I've read through this question, but my case seems to be slightly different, especially in the function call.
This is my schemas.py:
from typing import Optional

from sqlmodel import Field, Relationship, SQLModel


class BinaryBase(SQLModel):
    product_id: int
    software_install_path: Optional[str] = None
    host_id: int = Field(foreign_key="host.id", nullable=False)


class Binary(BinaryBase, table=True):
    id: Optional[int] = Field(default=None, primary_key=True)
    host_id: int = Field(foreign_key="host.id", nullable=False)
    host: "Host" = Relationship(back_populates="binaries")


class HostBase(SQLModel):
    name: str
    region: Optional[str] = None
    os_version: Optional[str] = None
    network: Optional[int] = None


class Host(HostBase, table=True):
    id: Optional[int] = Field(default=None, primary_key=True)
    binaries: list[Binary] = Relationship(back_populates='host')


class HostReadWithBinary(HostBase):
    bins: list[HostBase] = []


class BinaryReadWithHost(BinaryBase):
    host: BinaryBase
And this is my main.py:
from fastapi import Depends, FastAPI
from sqlmodel import Session, col, select

...

@app.get(
    "/binaries/",
    response_model=list[HostReadWithBinary]
)
def get_binary(name: Optional[str] = None, session: Session = Depends(get_session)) -> list[HostReadWithBinary]:
    query = select(Host).limit(100)
    if name:
        query = query.where(col(Host.name).contains(name.lower()))
    return session.exec(query).all()
The Host table represents the "one" side and the Binary table represents the "many" side. I would like the response to eagerly include all the BinaryBase attributes for every host. But what I get is this:
[
    {
        "name": "hkl20014889",
        "region": "HK",
        "os_version": "Red Hat 6.10",
        "network": 3,
        "bins": []
    },
    {
        "name": "hkl20016283",
        "region": "HK",
        "os_version": "Red Hat 6.10",
        "network": 3,
        "bins": []
    },
    ....
Theoretically, bins should hold the attributes of the Host table where id in Host joins host_id in Binary.
You need to realize that when you define a response_model for a route, it will always try to parse whatever data comes out of your route handler function (get_binary in this case) through that model. This is done by essentially calling the .from_orm method on the response model, which goes through all fields defined on it and tries to find corresponding attributes (i.e. with the same names) on the object you pass to it.
The model you specified (in a list, but the argument stands) is HostReadWithBinary. Aside from the fields defined on its parent model HostBase it only has the field bins, which is supposed to be a list of HostBase.
First of all, I think you meant to declare that bins field to be of the type list[BinaryBase], not list[HostBase]. If you had named the field correctly, this would have caused an error, but this is where your second mistake comes in.
You also named the field on your response model bins, but your handler function performs a query that returns a list of Host model instances. That model does not have a bins field. It has a field named binaries. This means that when the from_orm method gets to the HostReadWithBinary.bins field, it checks if the corresponding Host instance has an attribute named bins. It does not find one, but no problem because you set a default for HostReadWithBinary.bins, namely the empty list [], so that is what is set on each of the resulting response model instances.
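To see that mechanism in isolation, here is a rough sketch of what effectively happens during response parsing (assuming your session and at least one Host row with related Binary rows in the database):

host = session.exec(select(Host)).first()
parsed = HostReadWithBinary.from_orm(host)
# `host` has no attribute named `bins`, so the declared default [] is used:
print(parsed.bins)       # -> []
# The related objects are there, just under a different name:
print(host.binaries)     # -> [Binary(...), Binary(...)]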
You should therefore be able to fix your error by changing the response model definition like this:
class HostReadWithBinary(HostBase):
    binaries: list[BinaryBase] = []
Alternatively, you can change the name of the relationship field on your Host model:
class Host(HostBase, table=True):
    id: Optional[int] = Field(default=None, primary_key=True)
    bins: list[Binary] = Relationship(back_populates='host')


class HostReadWithBinary(HostBase):
    bins: list[BinaryBase] = []
They need to be the same, otherwise parsing an object of one model to the other will not work (properly).
Side note: You also mistakenly annotated the host field on BinaryReadWithHost with BinaryBase instead of HostBase.
PS: I also just noticed a minor mistake related to type annotations. You declare the return type of your route handler function to be list[HostReadWithBinary], but that is not what it returns. It returns list[Host]. This is part of the misunderstanding with the response models. The decorated version of your route is what returns list[HostReadWithBinary]. Your route handler get_binary by itself (i.e. before decoration) returns list[Host], which is then passed to the wrapper around it and that wrapper parses it to list[HostReadWithBinary] and sends that data on its way (to the client eventually). This wrapper action obviously happens behind the scenes and is part of that FastAPI decorator magic.
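Putting both corrections together, the read models from your schemas.py would end up looking like this (a sketch; the table models Host and Binary stay exactly as you defined them, and the route handler does not need to change):

class HostReadWithBinary(HostBase):
    # Name and shape must mirror the ORM attribute Host.binaries:
    binaries: list[BinaryBase] = []


class BinaryReadWithHost(BinaryBase):
    # Mirrors Binary.host and describes a Host, not a Binary:
    host: HostBase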
Related
I have the following Django models:
from django.db import models


class Foo(models.Model):
    id: int
    name = models.TextField(null=False)


class Bar(models.Model):
    id: int
    foo = models.ForeignKey(
        Foo,
        on_delete=models.CASCADE,
        null=False,
        related_name="bars",
    )
And Pydantic models (with orm_mode set to True):
from pydantic import BaseModel


class BarPy(BaseModel):
    id: int
    foo_id: int


class FooPy(BaseModel):
    id: int
    name: str
    bars: list[BarPy]
Now I want to perform a query on the model Foo and load it into FooPy, so i wrote this query:
foo_db = Foo.objects.prefetch_related("bars").all()
pydantic_model = FooPy.from_orm(foo_db)
But it gives me this error:
pydantic.error_wrappers.ValidationError: 1 validation error for FooPy
bars
value is not a valid list (type=type_error.list)
I am able to do it when explicitly using the FooPy constructor and assigning the values manually, but I want to use from_orm.
The bars attribute on your Foo model is a ReverseManyToOneDescriptor that just returns a RelatedManager for the Bar model. As with any manager in Django, to get a queryset of all the instances managed by it, you need to call the all method on it. Typically you would do something like foo.bars.all().
You can add your own custom validator to FooPy and make it pre=True to grab all the related Bar instances and pass a sequence of them along to the default validators:
from django.db.models.manager import BaseManager
from pydantic import BaseModel, validator

...

class FooPy(BaseModel):
    id: int
    name: str
    bars: list[BarPy]

    @validator("bars", pre=True)
    def get_all_from_manager(cls, v: object) -> object:
        if isinstance(v, BaseManager):
            return list(v.all())
        return v
Note that it is not enough to just do .all() because that will return a queryset, which will not pass the default sequence validator built into Pydantic models. You would get the same error.
You need to give it an actual sequence (e.g. list or tuple). A QuerySet is not a sequence, but an iterable. But you can consume it and turn it into a sequence, by calling for example list on it.
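With that validator in place, from_orm works on a single Foo instance without any manual assignment; a rough usage sketch (note that from_orm expects one model instance, not a whole queryset):

# One Foo instance; the pre=True validator consumes the related manager.
foo_db = Foo.objects.prefetch_related("bars").first()
pydantic_model = FooPy.from_orm(foo_db)
print(pydantic_model.bars)  # -> [BarPy(id=..., foo_id=...), ...]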
More generalized version
You could make an attempt at generalizing that validator and add it to your own (Pydantic) base model. Something like this should work on any field you annotate as list[Model], with Model being a subclass of pydantic.BaseModel:
from django.db.models.manager import BaseManager
from pydantic import BaseModel, validator
from pydantic.fields import ModelField, SHAPE_LIST

...

class CustomBaseModel(BaseModel):
    @validator("*", pre=True)
    def get_all_from_manager(cls, v: object, field: ModelField) -> object:
        if not (isinstance(field.type_, type) and issubclass(field.type_, BaseModel)):
            return v
        if field.shape is SHAPE_LIST and isinstance(v, BaseManager):
            return list(v.all())
        return v
I have not thoroughly tested this, but I think you get the idea.
Side note
It is worth mentioning that prefetch_related has nothing to do with the problem. The problem and its solution are the same whether you do that or not. The difference is that without prefetch_related, you trigger additional database queries when calling from_orm and thus executing the validator that consumes the queryset of .bars.all().
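To make that concrete, a small sketch of both variants (assuming the validator from above is in place):

# Without prefetch_related: the validator's v.all() call hits the database
# again to load this Foo's related Bar rows.
foo_lazy = Foo.objects.first()
FooPy.from_orm(foo_lazy)

# With prefetch_related: the Bar rows were already fetched up front,
# so the validator consumes the prefetched cache without an extra query.
foo_eager = Foo.objects.prefetch_related("bars").first()
FooPy.from_orm(foo_eager)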
I have a simple API to insert data into an object-type dictionary; my issue arises when I try to save a calculated field. Example code:
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
users = {}


class User(BaseModel):
    name: str
    email: str
    calculated: float


@app.post('/user-create/{id_user}')
def creating_an_user(id_user: int, user: User):
    calculated = 1+2+3*2
    if id_user in users:
        return {"Error": "User ID already exists."}
    users[id_user] = {
        "name": user.name,
        "email": user.email,
        "calculated": user.calculated  # should I need to put just like that?
    }
    return users[id_user]
Obviously I receive an error; I think it is because my endpoint expects the calculated field to be sent in manually, but it is not:
TypeError: Failed to execute 'fetch' on 'Window': Request with GET/HEAD method cannot have body.
I am lost here, can anybody help me with this?
Currently I have this script:
from typing import List
from uuid import uuid4, UUID

from pydantic import BaseModel, Field


class Id(BaseModel):
    value: UUID = Field(default_factory=uuid4)


class User(BaseModel):
    id: Id = Field(default_factory=Id)
    roles: List[str] = ['student']


class Project(BaseModel):
    id: Id = Field(default_factory=Id)
    creator: User


id = Id()
print(id)
user = User()
print(user)
project = Project(creator={})
print(project)
The script should run as it is.
When I instantiate a Project, Pydantic allows me to pass non-User objects, like dicts, strings, Id objects and some others, but I feel this is not expected. The weirdest thing is that the fields are auto-filled with an object of the correct type when a bad value is passed.
What is happening here? Thanks in advance!
I have a situation where I want to authorize the active user against one of the values (Organization) in a FastAPI route. When an object of a particular type is being submitted, one of the keys (organization_id) should be present and the user should be verified as having access to the organization.
I've solved this as a dependency in the API signature to avoid having to replicate this code across all routes that need access to this property:
def get_organization_from_body(organization_id: str = Body(None),
                               user: User = Depends(get_authenticated_user),
                               organization_service: OrganizationService = Depends(get_organization_service),
                               ) -> Organization:
    if not organization_id:
        raise HTTPException(status_code=400, detail='Missing organization_id.')
    organization = organization_service.get_organization_for_user(organization_id, user)
    if not organization:
        raise HTTPException(status_code=403, detail='Organization authorization failed.')
    return organization
This works fine, and if the API endpoint expects an organization through an organization_id key in the request, I can get the organization directly populated by introducing get_organization_from_body as a dependency in my route:
@router.post('', response_model=Bundle)
async def create_bundle([...]
                        organization: Organization = Depends(get_organization_from_body),
                        ) -> Model:
.. and if the user doesn't have access to the organization, a 403 exception is raised.
However, I also want to include my actual object on the root level through a schema model. So my first attempt was to make a JSON request like this:
{
    'name': generated_name,
    'organization_id': created_organization['id_key']
}
And then adding my incoming Pydantic model:
@router.post('', response_model=Bundle)
async def create_bundle(bundle: BundleCreate,
                        organization: Organization = Depends(get_organization_from_body),
                        [...]
                        ) -> BundleModel:
    [...]
    return bundle
The result is the same whether the pydantic model / schema contains organization_id as a field or not:
class BundleBase(BaseModel):
    name: str

    class Config:
        orm_mode = True


class BundleCreate(BundleBase):
    organization_id: str
    client_id: Optional[str]
.. but when I introduce my get_organization_from_body dependency, FastAPI sees that I have another dependency that refers to a Body field, and the description of the bundle object then has to be moved inside a bundle key instead. So instead of just "validating" the organization_id field, the JSON layout has to change - and since I feel that organization_id is part of the bundle description, it should live there, if possible.
The error message tells me that bundle was expected as a separate field:
{'detail': [{'loc': ['body', 'bundle'], 'msg': 'field required', 'type': 'value_error.missing'}]}
And rightly so, when I move name inside a bundle key instead:
{
    'bundle': {
        'name': generated_name,
    },
    'organization_id': created_organization['id_key']
}
.. my test passes and the request is successful.
This might be slight bike-shedding, but if there's a quick fix to work around this limitation, I'd be interested in a way to get both the validation (either through Depends() or some alternative that doesn't have to be repeated explicitly in every route function that needs it) and a flat JSON layout that matches my output format more closely.
Prior to FastAPI 0.53.2, body dependencies were resolved in the way you are trying to use them. Code like this:
class Item(BaseModel):
    val_1: int
    val_2: int


def get_val_1(val_1: int = Body(..., embed=True)):
    return val_1


@app.post("/item", response_model=Item)
def handler(full_body: Item, val_1: int = Depends(get_val_1)):
    return full_body
would expect a body like this:
{
    "val_1": 0,
    "val_2": 0
}
But starting from version 0.53.2, different body dependencies are automatically embedded (embed=True) and the code above expects the following body:
{
    "full_body": {
        "val_1": 0,
        "val_2": 0
    },
    "val_1": 0
}
Now, in order to have access to both the model for the whole body and its elements as a separate dependency, you need to use the same body model in every dependency:
def get_val_1(full_body: Item):
    return full_body.val_1


@app.post("/item", response_model=Item)
def handler(full_body: Item, val_1: int = Depends(get_val_1)):
    return full_body
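Applied to the routes from the question, the dependency would take the whole BundleCreate body and pull organization_id out of it, keeping the flat JSON layout. A rough sketch, reusing the names from the question:

def get_organization_from_body(bundle: BundleCreate,
                               user: User = Depends(get_authenticated_user),
                               organization_service: OrganizationService = Depends(get_organization_service),
                               ) -> Organization:
    if not bundle.organization_id:
        raise HTTPException(status_code=400, detail='Missing organization_id.')
    organization = organization_service.get_organization_for_user(bundle.organization_id, user)
    if not organization:
        raise HTTPException(status_code=403, detail='Organization authorization failed.')
    return organization


@router.post('', response_model=Bundle)
async def create_bundle(bundle: BundleCreate,
                        organization: Organization = Depends(get_organization_from_body),
                        ) -> Model:
    ...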
Update for shared dependency
You can share one body dependency between multiple models, but in this case two conditions must be met: the parameter names must be the same and their types must be compatible (whether through inheritance or not). Example below:
class Base(BaseModel):
    val_1: int


class NotBase(BaseModel):
    val_1: int


class Item1(Base):
    val_2: int


class Item2(Base):
    val_3: int


def get_val1_base(full_body: Base):
    return full_body.val_1


def get_val1_not_base(full_body: NotBase):
    return full_body.val_1


@app.post("/item1", response_model=Item1)
def handler(full_body: Item1, val_1: int = Depends(get_val1_base)):
    return full_body


@app.post("/item2", response_model=Item2)
def handler(full_body: Item2, val_1: int = Depends(get_val1_not_base)):
    return full_body
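For completeness, a quick sketch of the flat request bodies these two routes accept, using FastAPI's test client (no extra wrapper key around the model):

from fastapi.testclient import TestClient

client = TestClient(app)

client.post("/item1", json={"val_1": 1, "val_2": 2})  # parsed as Item1
client.post("/item2", json={"val_1": 1, "val_3": 3})  # parsed as Item2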
Cornice's documentation mentions how to validate your schema using a colander MappingSchema subclass. How should we use a colanderalchemy schema for the same purpose? Because if we create a schema using colanderalchemy as stated in the documentation, the schema object has already instantiated the colander class, and I think that this results in an error.
To be more precise, here is my sample code:
from sqlalchemy.ext.declarative import declarative_base
from cornice.resource import resource, view
from colanderalchemy import SQLAlchemySchemaNode
from sqlalchemy import (
    Column,
    Integer,
    Unicode,
)

Base = declarative_base()

'''
SQLAlchemy part
'''
class DBTable(Base):
    __tablename__ = 'mytable'
    id = Column(Integer, primary_key=True,
                info={'colanderalchemy': {'exclude': True}})
    name = Column(Unicode(70), nullable=False)
    description = Column(Unicode(256))

'''
ColanderAlchemy part
'''
ClndrTable = SQLAlchemySchemaNode(DBTable)

'''
Cornice part
'''
PRF = 'api'

@resource(collection_path='%s/' % PRF, path='%s/{fid}' % PRF)
class TableApi(object):

    def __init__(self, request):
        self.request = request

    @view(schema=ClndrTable, renderer='json')
    def put(self):
        # do my stuff here
        pass
Where ClndrTable is my auto-generated schema. Now, when trying to deploy this code, I get the following error:
NotImplementedError: Schema node construction without a typ argument or a schema_type() callable present on the node class
As I've mentioned earlier, I suspect that the problem is that ClndrTable (given as an argument to the view decorator) is an instance of the schema automatically generated by colanderalchemy.
Does anyone know how to resolve this?
Thanks in advance!
This appears to be due to the issue of colander having both a typ property and a schema_type property. They're both supposed to tell you the schema's type, but they can actually be different values. I filed an issue with colander, but if there's a fix, it'll likely not make it to PyPI any time soon.
So what's happening is: ColanderAlchemy ignores schema_type and uses typ, while Cornice ignores typ and uses schema_type.
You can hack a fix with the following: ClndrTable.schema_type = lambda: ClndrTable.typ
However, that just leads you to the next exception:
cornice.schemas.SchemaError: schema is not a MappingSchema: <class 'colanderalchemy.schema.SQLAlchemySchemaNode'>
This is due to Cornice not duck typing but expecting all Schema to be a subclass of MappingSchema. However, MappingSchema is just a Schema with typ/schema_type being Mapping (which is what ColanderAlchemy returns).
I'll see if I can enact some changes to fix this.
Update
Despite the names, 'typ' and 'schema_type' have two different purposes. 'typ' always tells you the type of a schema instance. 'schema_type' is a method that's called to give a SchemaNode a default type when it's instantiated (so it's called in the __init__ if you don't pass a typ in, but other than that it's not supposed to be used).
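A tiny illustration of that distinction, sketched against plain colander (not ColanderAlchemy):

import colander


class MyMapping(colander.SchemaNode):
    # Only consulted by __init__ when no typ argument is passed in:
    schema_type = colander.Mapping


node_default = MyMapping()                    # typ becomes a Mapping instance via schema_type
node_explicit = MyMapping(colander.String())  # typ is String; schema_type is never called
print(type(node_default.typ), type(node_explicit.typ))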
Cornice has been patched to properly use typ now (though, as of this message, it's not part of the latest release).