I have data coming into my FastAPI application that can take any shape or form, so I need an "empty" Pydantic model that accepts arbitrary content. I tried creating a dynamic model like this:
DynamicModel = create_model('RandomData', random_data=(dict, ...))
However, this requires the incoming payload to follow this structure:
{"random_data": { } }
What I would like is for the model to accept an empty json dictionary like this: { }
What am I missing?
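One way to accept arbitrary JSON, including an empty object, is a model with no declared fields that allows extras. A minimal sketch, assuming Pydantic v1-style configuration (the route path here is hypothetical):

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class RandomData(BaseModel):
    # No declared fields; any keys, or none at all, are accepted.
    class Config:
        extra = 'allow'

@app.post('/data')
async def receive(payload: RandomData):
    return payload.dict()  # {} validates, as does any other object

Declaring the body parameter as a plain dict (payload: dict) would also accept {}.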
I have a payload that is a list containing a variable number of dictionaries, and I was hoping to unpack it via a single call to a Pydantic model.
Below is an example of what I am trying to accomplish, with two dictionaries:
#!/usr/bin/env python3.9
from pydantic import BaseModel
from typing import List, Dict

class RequestPayload(BaseModel):
    site: str

class RequestPayload_unpack(BaseModel):
    endpoints: List[Dict, RequestPayload]

payload = [
    {
        'site': 'nyc01',
    },
    {
        'site': 'lax02',
    },
]

request = RequestPayload_unpack(*payload)
print(request)
Running that I get the following error:
raise TypeError(f"Too {'many' if alen > elen else 'few'} parameters for {cls};"
TypeError: Too many parameters for typing.List; actual 2, expected 1
The workflow that I have working now (not using pydantic) parses/validates each dict in the list and then passes it along to the next step, which will perform additional lookups and then eventually do the work. So right now the model is that the entire RequestPayload is operated upon through the pipeline, versus each dict being treated as a distinct request.
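For reference, typing.List takes exactly one type parameter, which is what the TypeError is complaining about. A minimal sketch of one way to validate the whole list in a single call, assuming Pydantic v1, where parse_obj_as is available:

from typing import List
from pydantic import BaseModel, parse_obj_as

class RequestPayload(BaseModel):
    site: str

class RequestPayloadUnpack(BaseModel):
    endpoints: List[RequestPayload]  # one type parameter, not two

payload = [{'site': 'nyc01'}, {'site': 'lax02'}]

# Either wrap the list in a named field...
request = RequestPayloadUnpack(endpoints=payload)

# ...or validate the bare list directly.
requests = parse_obj_as(List[RequestPayload], payload)

Each parsed RequestPayload can then be passed through the pipeline as a distinct request, matching the existing workflow.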
I have some data (a dict) and I want to first validate its structure, and after validation I want to rename the fields (CamelCase to snake_case). That's all!
I have searched a lot and I know about the to_representation method (it seems it is only called when using ModelSerializer as a parent class), and I have also read about ListSerializer.
Any help will be appreciated :)
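For the renaming step specifically, one option is to override to_representation, which is available on a plain Serializer as well, not just ModelSerializer. A minimal sketch (the fields below are hypothetical placeholders):

import re
from rest_framework import serializers

def camel_to_snake(name):
    # 'firstName' -> 'first_name'
    return re.sub(r'(?<!^)(?=[A-Z])', '_', name).lower()

class PayloadSerializer(serializers.Serializer):
    # Hypothetical fields; replace with the real schema.
    firstName = serializers.CharField()
    isActive = serializers.BooleanField()

    def to_representation(self, instance):
        data = super().to_representation(instance)
        return {camel_to_snake(key): value for key, value in data.items()}

After serializer.is_valid() succeeds, accessing serializer.data runs to_representation and returns the renamed keys.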
Here is one possible way to do this.
Assume you have a payload like this:
{
    'data': {},
    'serializer_class': 'user_profile'
}
Now you load the serializer by the serializer_class token from a mapping of import paths. Please see this:
serializer_classes = {
    'user_profile': 'path.to.user_profile.serializer'
}
Once you load the serializer as a module attribute, you can pass the data to it and perform validation on the data.
After validation you can work further with your validated data.
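A minimal sketch of that dynamic loading step (the module path and names are hypothetical):

import importlib

serializer_classes = {
    'user_profile': 'myapp.serializers.UserProfileSerializer',  # hypothetical path
}

def load_serializer(token):
    dotted_path = serializer_classes[token]
    module_path, class_name = dotted_path.rsplit('.', 1)
    module = importlib.import_module(module_path)
    return getattr(module, class_name)

def validate(payload):
    serializer_class = load_serializer(payload['serializer_class'])
    serializer = serializer_class(data=payload['data'])
    serializer.is_valid(raise_exception=True)
    return serializer.validated_data

If you are inside Django anyway, django.utils.module_loading.import_string does the same dotted-path import in one call.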
I'm using Django Rest Framework 3.6.3. I'm trying to write nested create serializers for one optional argument and one required argument inside a child. I know that I need to override create in the base serializer to tell DRF how to create the nested children - http://www.django-rest-framework.org/api-guide/relations/#writable-nested-serializers.
However, I can't figure out how to get it to parse the nested object information without declaring what the nested child serializer is. Declaring it causes DRF to use it to parse the child objects, which then returns a validation error that I have no control over, because the child serializer's create method is never called.
Base
    Specification (many=True)
        OptionalField
        RequiredField
The information I pass in is a JSON object:
{
    "base_id": 1,
    "base_info": "blah",
    "specifications": [
        {
            "specification_id": 1,
            "optional_info": {
                "optional_id": 1,
                "optional_stuff": "blah"
            },
            "required_info": {
                "required_id": 1,
                "required_stuff": "required"
            }
        }
    ]
}
The BaseCreationSerializer calls its create method. I know that I need to pull out the rest of the information and create it manually. However, I can't figure out how to get the BaseCreationSerializer to parse the data into validated_data without defining specification = SpecificationCreationSerializer(), which then tries to parse that and throws an error. Printing self shows the entire JSON object in data, but validated_data only contains the subset of things that the serializer knows about. How do I parse the appropriate things from data so that I can create the objects on my own?
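One common pattern is to declare the child serializer as a field anyway and pop the nested data out of validated_data inside the parent's create, so the child serializer only validates and never tries to create anything itself. A sketch under the assumption that the models are named Base and Specification (all names here are illustrative):

from rest_framework import serializers

class SpecificationCreationSerializer(serializers.Serializer):
    specification_id = serializers.IntegerField()
    optional_info = serializers.DictField(required=False)
    required_info = serializers.DictField()

class BaseCreationSerializer(serializers.Serializer):
    base_id = serializers.IntegerField()
    base_info = serializers.CharField()
    specifications = SpecificationCreationSerializer(many=True)

    def create(self, validated_data):
        # Pop the nested payload so it never reaches Base's constructor.
        specs_data = validated_data.pop('specifications')
        base = Base.objects.create(**validated_data)  # hypothetical model
        for spec_data in specs_data:
            optional = spec_data.pop('optional_info', None)
            required = spec_data.pop('required_info')
            # Create the child rows manually from optional/required here.
            Specification.objects.create(base=base, **spec_data)
        return base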
I run Django 1.9 with the new JSONField and have the following Test model:
class Test(TimeStampedModel):
    actions = JSONField()
Let's say the actions JSONField looks like this:
[
    {
        "fixed_key_1": "foo1",
        "fixed_key_2": {
            "random_key_1": "bar1",
            "random_key_2": "bar2"
        }
    },
    {
        "fixed_key_1": "foo2",
        "fixed_key_2": {
            "random_key_3": "bar2",
            "random_key_4": "bar3"
        }
    }
]
I want to be able to filter on the fixed_key_1 values (foo1 and foo2) for every item of the list.
When I do:
>>> Test.objects.filter(actions__1__fixed_key_1="foo2")
The Test is in the queryset. But when I do:
>>> Test.objects.filter(actions__0__fixed_key_1="foo2")
It isn't, which makes sense. I want to do something like:
>>> Test.objects.filter(actions__values__fixed_key_1="foo2")
Or
>>> Test.objects.filter(actions__values__fixed_key_2__values__contains="bar3")
And have the Test in the queryset.
Any idea if this can be done, and how?
If you want to filter your data by one of the fields in your array of dicts, you can try this query:
Test.objects.filter(actions__contains=[{'fixed_key_1': 'foo2'}])
It will list all Test objects that have at least one object in the actions field containing the key fixed_key_1 with the value foo2.
It should also work for nested lookups, even if you don't know the actual indexes:
Test(actions=[
    {'fixed_key_1': 'foo4', 'fixed_key_3': [
        {'key1': 'foo2'},
    ]},
]).save()
Test.objects.filter(actions__contains=[{'fixed_key_3': [{'key1': 'foo2'}]}])
In simple words, contains ignores every key other than the ones you specify.
Unfortunately, if the nested element is an object, you must know the key name; lookup by value alone won't work in that case.
You should be able to use a __contains lookup for this and pass the queried values as a list, as documented here. The lookup behaves exactly like the one on ArrayField. So, something like this should work:
Test.objects.filter(actions__contains=[{'fixed_key_1': 'foo2'}])
You can use the django-jsonfield package; I guess it's already the one you are using.
from jsonfield import JSONField

class Test(TimeStampedModel):
    actions = JSONField()
So to make a search with a specific property, you can just do this:
def test_filter(**kwargs):
    result = Test.objects.filter(actions__contains=kwargs)
    return result
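For example, a hypothetical call matching the model above:

matches = test_filter(fixed_key_1='foo2')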
If you are using PostgreSQL, maybe you can take advantage of the PostgreSQL-specific model fields.
PS: If you are dealing with a lot of JSON structures, you may want to consider using a NoSQL database.
I am using ElasticSearch in my Python application and want to be able to create a reusable dictionary object that represents a query. The JSON structures are described here http://pulkitsinghal.blogspot.co.uk/2012/02/how-to-use-elasticsearch-query-dsl.html and I am using PyES to query the search server. With PyES we can pass a dict object which gets jsonified before sending to the server. I want to create a library of common queries where only the actual query term changes, so I thought I would subclass dict so I could pass in the query term via the constructor, for example, and when the dict gets jsonified I would end up with something like this:
{
    "fields": [
        "name",
        "shortDescription",
        "longDescription"
    ],
    "query": {
        "query_string": {
            "fields": [
                "name"
            ],
            "query": query_term,
            "use_dis_max": true
        }
    }
}
How would I do this? Is it true that only instance members get returned via __dict__? If so, would I have to set up this data structure in the constructor? Is this the best way of doing this, or should I create a class that does not extend dict and just give it a to_dict() method that returns a dictionary in the correct structure?
Answer:
This seems to work fine; any suggestions for making this more 'pythonic' will be appreciated! (Yes, I know there are no docstrings.)
class StandardQuery(object):
    # Fields to run the query against, and fields to return with each hit.
    search_fields = ['meta_keywords', 'meta_description',
                     'fields.title.value', 'slug']
    return_fields = ['slug', 'fields.title.value']

    def __init__(self, query):
        self.query = query

    def to_dict(self):
        output = dict()
        # The top-level 'fields' key lists what ElasticSearch returns;
        # 'query_string.fields' lists what it searches.
        output['fields'] = self.return_fields
        output['query'] = {'query_string': {'fields': self.search_fields,
                                            'query': self.query,
                                            'use_dis_max': True}}
        return output
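For example (the query term is just an illustration):

import json

query = StandardQuery('some search term')
print(json.dumps(query.to_dict(), indent=2))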
If you don't want all of the normal dict behaviour, then you should definitely just create a separate class. You can then either give it a to_dict() method or, better, since you really want to convert to JSON, create a custom JSON encoder (and, if required, decoder) to use with the default argument of json.dumps().
In json.dump() and json.dumps(), the optional default argument is a callable that should either return a serializable version of the object or raise a TypeError (to get the default behaviour).
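A minimal sketch of that approach, reusing the to_dict() method from the class above:

import json

def encode_query(obj):
    # Called by json.dumps for objects it cannot serialize natively.
    if hasattr(obj, 'to_dict'):
        return obj.to_dict()
    raise TypeError('Object of type %s is not JSON serializable'
                    % type(obj).__name__)

body = json.dumps(StandardQuery('shoes'), default=encode_query)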