I know that you can create GET requests like this:
@router.get("/findStuff")
def get_stuff(a: int = None, b: str = None):
    return {'a': a, 'b': b}
But is there a way to create the query parameters dynamically from, say, a Pydantic schema? I've tried the code below, and although it does create the query parameters in the OpenAPI doc, the endpoint is unable to process them and returns a 422 (Unprocessable Entity). Has anyone tried something similar? Being able to specify an object containing the query parameters would let me create GET requests dynamically for any arbitrary object with primitive fields. I did this in Flask with webargs, but am not sure what I can do within FastAPI.
class MySchema(BaseModel):
    a: int = None
    b: str = None

@router.get("/findStuff")
def get_stuff(inputs = Depends(MySchema)):
    return inputs
This turned out to be user error. There was an endpoint with a path parameter
@router.get("/{id}")
def get_stuff_by_id(id: int):
    return id
that appeared above the /findStuff endpoint, so /findStuff got clobbered.
The solution was to just put the /findStuff block above the block with the path parameter.
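For illustration, a minimal sketch of the ordering fix, reusing the route and schema names from the question (the router setup itself is an assumption):

from fastapi import APIRouter, Depends
from pydantic import BaseModel

router = APIRouter()

class MySchema(BaseModel):
    a: int = None
    b: str = None

# Declare the fixed-path route first so it is matched before the
# path-parameter route.
@router.get("/findStuff")
def get_stuff(inputs = Depends(MySchema)):
    return inputs

# The path-parameter route comes afterwards, so "/findStuff" is no
# longer swallowed by "/{id}".
@router.get("/{id}")
def get_stuff_by_id(id: int):
    return id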
I am trying to set a super-class field in a subclass using a validator, as follows:
Approach 1
from typing import List
from pydantic import BaseModel, validator, root_validator

class ClassSuper(BaseModel):
    field1: int = 0

class ClassSub(ClassSuper):
    field2: List[int]

    @validator('field1')
    def validate_field1(cls, v, values):
        return len(values["field2"])

sub = ClassSub(field2=[1, 2, 3])
print(sub.field1)  # It prints 0, but expected it to print 3
If I run the code above it prints 0, but I expected it to print 3 (which is basically len(field2)). However, if I use @root_validator() instead, I get the expected result.
Approach 2
from typing import List
from pydantic import BaseModel, validator, root_validator

class ClassSuper(BaseModel):
    field1: int = 0

class ClassSub(ClassSuper):
    field2: List[int]

    @root_validator()
    def validate_field1(cls, values):
        values["field1"] = len(values["field2"])
        return values

sub = ClassSub(field2=[1, 2, 3])
print(sub.field1)  # This prints 3, as expected
I am new to using pydantic and a bit puzzled about what I am doing wrong with Approach 1. Thank you for your help.
The reason your Approach 1 does not work is that, by default, validators for a field are not called when a value for that field is not supplied (see docs).
Your validate_field1 is never even called. If you add always=True to your @validator, the method is called, even if you don't provide a value for field1.
However, if you try that, you'll see that it will still not work, but instead throw an error about the key "field2" not being present in values.
This in turn is due to the fact that validators are called in the order they were defined. In this case, field1 is defined before field2, which means that field2 is not yet validated by the time validate_field1 is called. And values only contains previously-validated fields (see docs). Thus, at the time validate_field1 is called, values is simply an empty dictionary.
Using the #root_validator is the correct approach here because it receives the entire model's data, regardless of whether or not field values were supplied explicitly or by default.
And just as a side note: If you don't need to specify any parameters for it, you can use @root_validator without the parentheses.
And as another side note: If you are using Python 3.9+, you can use the regular list class as the type annotation. (See standard generic alias types) That means field2: list[int] without the need for typing.List.
Hope this helps.
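Putting those two side notes together, here is a minimal sketch, assuming Python 3.9+ and the pydantic v1 validator API used above:

from pydantic import BaseModel, root_validator

class ClassSuper(BaseModel):
    field1: int = 0

class ClassSub(ClassSuper):
    field2: list[int]  # built-in generic on 3.9+, no typing.List needed

    @root_validator  # no parentheses needed when there are no parameters
    def validate_field1(cls, values):
        # values contains the whole model's data at this point
        values["field1"] = len(values["field2"])
        return values

sub = ClassSub(field2=[1, 2, 3])
print(sub.field1)  # 3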
Okay, so pardon me if I don't make much sense. I get 'ObjectId' object is not iterable whenever I run the collection.find() functions. Going through the answers here, I'm not sure where to start. I'm new to programming, please bear with me.
Every time I hit the route which is supposed to fetch me data from MongoDB, I get ValueError: [TypeError("'ObjectId' object is not iterable"), TypeError('vars() argument must have __dict__ attribute')].
Help
Exclude the "_id" from the output.
result = collection.find_one({'OpportunityID': oppid}, {'_id': 0})
I was having a similar problem to this myself. Not having seen your code, I am guessing the traceback similarly traces the error to FastAPI/Starlette not being able to process the "_id" field. What you will therefore need to do is change the "_id" field in the results from an ObjectId to a string type, and rename the field to "id" (without the underscore) on return, to avoid incurring issues with Pydantic.
First of all, if we had some examples of your code, this would be much easier. I can only assume that you are not mapping your MongoDB collection data to your Pydantic BaseModel correctly.
Read this:
MongoDB stores data as BSON. FastAPI encodes and decodes data as JSON strings. BSON has support for additional non-JSON-native data types, including ObjectId which can't be directly encoded as JSON. Because of this, we convert ObjectIds to strings before storing them as the _id.
I want to draw attention to the id field on this model. MongoDB uses _id, but in Python, underscores at the start of attributes have special meaning. If you have an attribute on your model that starts with an underscore, pydantic—the data validation framework used by FastAPI—will assume that it is a private variable, meaning you will not be able to assign it a value! To get around this, we name the field id but give it an alias of _id. You also need to set allow_population_by_field_name to True in the model's Config class.
Here is a working example:
First create the BaseModel:
class PyObjectId(ObjectId):
    """Custom type for reading MongoDB IDs"""

    @classmethod
    def __get_validators__(cls):
        yield cls.validate

    @classmethod
    def validate(cls, v):
        if not ObjectId.is_valid(v):
            raise ValueError("Invalid object_id")
        return ObjectId(v)

    @classmethod
    def __modify_schema__(cls, field_schema):
        field_schema.update(type="string")


class Student(BaseModel):
    id: PyObjectId = Field(default_factory=PyObjectId, alias="_id")
    first_name: str
    last_name: str

    class Config:
        allow_population_by_field_name = True
        arbitrary_types_allowed = True
        json_encoders = {ObjectId: str}
Now just unpack everything:
async def get_student(student_id) -> Student:
    data = await collection.find_one({'_id': student_id})
    if data is None:
        raise HTTPException(status_code=404, detail='Student not found.')
    student: Student = Student(**data)
    return student
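For completeness, a hedged sketch of how such a helper might be wired into a route; the app instance and route path here are assumptions, not part of the answer above:

from fastapi import FastAPI

app = FastAPI()

@app.get("/students/{student_id}", response_model=Student)
async def read_student(student_id: str):
    # get_student raises a 404 HTTPException if nothing is found;
    # json_encoders on the model turns the ObjectId back into a string.
    return await get_student(student_id)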
Use a response model inside the app decorator. Here is a sample example:
from pydantic import BaseModel

class Todo(BaseModel):
    title: str
    details: str
main.py
@app.get("/{title}", response_model=Todo)
async def get_todo(title: str):
    response = await fetch_one_todo(title)
    if not response:
        raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail='not found')
    return response
Use db.collection.find({"_id": ObjectId("12348901384918")}).
Here db is the database, collection is the collection name, and the ObjectId value is passed as a string in double quotes.
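A minimal PyMongo sketch of that idea; the database name, collection name, and ObjectId value below are placeholders:

from bson import ObjectId
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
collection = client["mydb"]["students"]

# Query by _id: the string must be wrapped in ObjectId, and the result's
# _id can be converted to str before returning it from a FastAPI route.
doc = collection.find_one({"_id": ObjectId("62a1f3c4e13e4a8f9c2b4d61")})
if doc is not None:
    doc["_id"] = str(doc["_id"])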
I was trying to iterate through all the documents and what worked for me was this solution https://github.com/tiangolo/fastapi/issues/1515#issuecomment-782835977
These lines just need to be added to the subclass of the ObjectId class. An example is given at the following link:
https://github.com/tiangolo/fastapi/issues/1515#issuecomment-782838556
I had this issue until I upgraded from MongoDB version 5.0.9 to version 6.0.0, so MongoDB made some changes on their end to handle this, if you have the ability to upgrade! I ran into this issue when creating a test server, and when I created a new test server on 6.0.0, the error went away.
As a Flask beginner, I can't understand how request.args is used. I read somewhere that it is used to return values from the query string (correct me if I'm wrong), and I also don't know how many parameters request.args.get() takes.
I know that when I have to store submitted form data, I can use fname = request.form.get("firstname"). Here, only one parameter is passed, whereas the code below takes two parameters.
@app.route("/")
def home():
    cnx = db_connect()
    cur = cnx.cursor()
    output = []
    page = request.args.get('page', 1)
    try:
        page = int(page)
        skip = (page - 1) * 4
    except:
        abort(404)
    stmt_select = "select * from posts limit %s, 4;"
    values = [skip]
    cur.execute(stmt_select, values)
    x = cur.fetchall()
    for row in reversed(x):
        data = {
            "uid": row[0],
            "pid": row[1],
            "subject": row[2],
            "post_content": row[3],
            "date": datetime.fromtimestamp(row[4]),
        }
        output.append(data)
    next = page + 1
    previous = page - 1
    if previous < 1:
        previous = 1
    return render_template("home.html", persons=output, next=next, previous=previous)
Please explain why it takes two parameters, and then what its use is.
According to the flask.Request.args documentation:
flask.Request.args
A MultiDict with the parsed contents of the query string. (The part in the URL after the question mark).
So args.get() is the get() method of MultiDict, whose prototype is as follows:
get(key, default=None, type=None)
In newer versions of Flask (v1.0.x and v1.1.x), flask.Request.args is an ImmutableMultiDict (an immutable MultiDict), so the prototype and method above are still valid.
As a newbie using Flask and Python myself, I think some of the other answers here take for granted that you have a good understanding of the fundamentals. In case you or other readers don't, I'll give more context.
... request.args returns a "dictionary" object for you. The "dictionary" object is similar to other collection-type objects in Python, in that it can store many elements in one single object. Therefore, the answer to your question
And how many parameters request.args.get() takes.
It will take only one object, a "dictionary" type of object (as stated in the previous answers). This "dictionary" object, however, can have as many elements as needed... (dictionaries have paired elements called Key, Value).
Other collection-type objects besides "dictionaries" would be "tuples" and "lists"... you can run a Google search on those and "data structures" in order to learn other Python fundamentals. This answer is based on Python; I have no idea if the same applies to other programming languages.
It has some interesting behaviour in some cases that is good to be aware of:
from werkzeug.datastructures import MultiDict

d = MultiDict([("ex1", ""), ("ex2", None)])

d.get("ex1", "alternative")
# returns: ''

d.get("ex2", "alternative")
# returns no visible output of any kind.
# It is returning literally None, so if you do:
d.get("ex2", "alternative") is None
# it returns: True

d.get("ex3", "alternative")
# returns: 'alternative'
request.args is a MultiDict with the parsed contents of the query string.
From the documentation of the get method:
get(key, default=None, type=None)
Return the default value if the requested data doesn't exist. If type is provided and is a callable it should convert the value, return it or raise a ValueError if that is not possible.
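As a hedged illustration of that type parameter (the keys and values below are made up):

from werkzeug.datastructures import MultiDict

args = MultiDict([("page", "2"), ("q", "flask")])

args.get("page", 1, type=int)     # returns 2: the string "2" is converted by int
args.get("q", 1, type=int)        # int("flask") raises ValueError, so the default 1 is returned
args.get("missing", 1, type=int)  # key absent, so the default 1 is returned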
Let's say I have a model like this:
class User(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    hometown = db.Column(db.String(140))
    university = db.Column(db.String(140))
To get a list of users from New York, this is my query:
User.query.filter_by(hometown='New York').all()
To get a list of users who go to USC, this is my query:
User.query.filter_by(university='USC').all()
And to get a list of users from New York, and who go to USC, this is my query:
User.query.filter_by(hometown='New York').filter_by(university='USC').all()
Now, I would like to dynamically generate these queries based on the value of a variable.
For example, my variable might look like this:
{'hometown': 'New York'}
Or like this:
{'university': 'USC'}
... Or even like this:
[{'hometown': 'New York'}, {'university': 'USC'}]
Can you help me out with writing a function which takes a dictionary (or list of dictionaries) as an input, and then dynamically builds the correct sqlalchemy query?
If I try to use a variable for the keyword, I get this error:
key = 'university'
User.query.filter_by(key='USC').all()
InvalidRequestError: Entity '<class 'User'>' has no property 'key'
Secondly, I am not sure how to chain multiple filter_by expressions together dynamically.
I can explicitly call out a filter_by expression, but how do I chain several together based on a variable?
Hope this makes more sense.
Thanks!
SQLAlchemy's filter_by takes keyword arguments:
filter_by(**kwargs)
In other words, the function will allow you to give it any keyword parameter. This is why you can use any keyword that you want in your code: SQLAlchemy basically sees the arguments as a dictionary of values. See the Python tutorial for more information on keyword arguments.
So that allows the developers of SQLAlchemy to receive an arbitrary bunch of keyword arguments in a dictionary form. But you're asking for the opposite: can you pass an arbitrary bunch of keyword arguments to a function?
It turns out that in Python you can, using a feature called unpacking. Simply create the dictionary of arguments and pass it to the function preceded by **, like so:
kwargs = {'hometown': 'New York', 'university' : 'USC'}
User.query.filter_by(**kwargs)
# This above line is equivalent to saying...
User.query.filter_by(hometown='New York', university='USC')
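To tie that back to the question, here is a minimal sketch of a helper that accepts either a dict or a list of single-key dicts; the function name build_user_query is my own, not from the answer:

def build_user_query(filters):
    """Build a User query from a dict or a list of single-key dicts."""
    # Merge a list like [{'hometown': 'New York'}, {'university': 'USC'}]
    # into one dict; a plain dict is used as-is.
    if isinstance(filters, list):
        merged = {}
        for item in filters:
            merged.update(item)
        filters = merged
    return User.query.filter_by(**filters)

# Usage:
users = build_user_query({'hometown': 'New York'}).all()
users = build_user_query([{'hometown': 'New York'}, {'university': 'USC'}]).all()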
filter_by(**request.args) doesn't work well if you have non-model query parameters, like page for pagination, otherwise you get errors like these:
InvalidRequestError: Entity '<class 'flask_sqlalchemy.MyModelSerializable'>' has no property 'page'
I use something like this which ignores query parameters not in the model:
builder = MyModel.query
for key in request.args:
    if hasattr(MyModel, key):
        vals = request.args.getlist(key)  # one or many
        builder = builder.filter(getattr(MyModel, key).in_(vals))

if 'page' not in request.args:
    resources = builder.all()
else:
    resources = builder.paginate(int(request.args['page'])).items
Considering a model with a column called valid, something like this will work:
curl -XGET "http://0.0.0.0/mymodel_endpoint?page=1&valid=2&invalid=whatever&valid=1"
invalid will be ignored, page remains available for pagination, and best of all, the following SQL will be generated: WHERE mymodel.valid IN (1, 2)
(get the above snippet for free if you use this boilerplate-saving module)
As pointed out by @opyate, filter_by(**request.args) doesn't work well if you have non-model query parameters, like page for pagination, so the following alternative can be used too:
Assuming that page is being taken in the form of request.args.get(), then:
def get_list(**filters):
    page = None
    if 'page' in filters:
        page = filters.pop('page')
    items = Price.query.filter_by(**filters)
    if page is not None:
        items = items.paginate(per_page=int(page)).items
    else:
        items = items.all()
    return {
        "items": items
    }
and then the get function
def get(self):
    hometown = request.args.get('hometown')
    university = request.args.get('university')
    page = request.args.get('page')
    return get_list(**request.args)
I have tried implementing this on my flask application, and it works smoothly.
Of course, one drawback is that if there are multiple values like page that are not part of the model, each of them has to be handled separately in get_list, but that can be done with a comprehension, as sketched below.
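A hedged sketch of that idea; the set of non-model keys is an assumption, and the paginate call mirrors the answer above:

NON_MODEL_KEYS = {'page', 'per_page', 'sort'}  # assumed pagination/sorting parameters

def get_list(**filters):
    # Split off anything that is not a model column before calling filter_by.
    extras = {k: filters.pop(k) for k in list(filters) if k in NON_MODEL_KEYS}
    items = Price.query.filter_by(**filters)
    page = extras.get('page')
    if page is not None:
        items = items.paginate(per_page=int(page)).items
    else:
        items = items.all()
    return {"items": items}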
I've looked at documentation, and have searched Google extensively, and haven't found a solution to my problem.
This is my readRSS function (note that get is a function from Kenneth Reitz's requests module):
def readRSS(name, loc):
    linkList = []
    linkTitles = list(ElementTree.fromstring(get(loc).content).iter('title'))
    linkLocs = list(ElementTree.fromstring(get(loc).content).iter('link'))
    for title, loc in zip(linkTitles, linkLocs):
        linkList.append((title.text, loc.text))
    return {name: linkList}
This is one of my MongoAlchemy classes:
class Feed(db.Document):
    feedname = db.StringField(max_length=80)
    location = db.StringField(max_length=240)
    lastupdated = datetime.utcnow()

    def __dict__(self):
        return readRSS(self.feedname, self.location)
As you can see, I had to call the readRSS function within a function of the class, so I could pass self, because it's dependent on the fields feedname and location.
I want to know if there's a different way of doing this, so I can save the readRSS return value to a field in the Feed document. I've tried assigning the readRSS function's return value to a variable within the function __dict__ -- that didn't work either.
I have the functionality working in my app, but I want to save the results to the Document to lessen the load on the server (the one I am getting my RSS feed from).
Is there a way of doing what I intend to do or am I going about this all wrong?
I found out the answer. I needed to make use of the computed_field decorator, where the first argument is the structure of my return value and deps is a set containing the fields this field depends on. I then passed the dependent fields into the function's arguments, and there you have it.
@fields.computed_field(db.KVField(db.StringField(), db.ListField(db.TupleField(db.StringField()))), deps=[feedname, location])
def getFeedContent(a=[feedname, location]):
    return readRSS(a['feedname'], a['location'])
Thanks anyway, everyone.