How to implement callback functionality in FastAPI?

I'm trying to implement a service that will get a request from an external API, do some work (which might take time) and then return a response to the external API with the parsed data. However I'm at a loss on how to achieve this. I'm using FastAPI as my API service and have been looking at the following documentation: OpenAPI Callbacks
By following that documentation I can get the OpenAPI docs looking all pretty and nice. However I'm stumped on how to implement the actual callback and the docs don't have much information about that.
My current implementation is as follows:
from typing import Union
from fastapi import APIRouter, FastAPI
from pydantic import BaseModel, AnyHttpUrl
import requests
import time
from threading import Thread

app = FastAPI()

class Invoice(BaseModel):
    id: str
    title: Union[str, None] = None
    customer: str
    total: float

class InvoiceEvent(BaseModel):
    description: str
    paid: bool

class InvoiceEventReceived(BaseModel):
    ok: bool

invoices_callback_router = APIRouter()

@invoices_callback_router.post(
    "{$callback_url}/invoices/{$request.body.id}", response_model=InvoiceEventReceived
)
def invoice_notification(body: InvoiceEvent):
    pass

@app.post("/invoices/", callbacks=invoices_callback_router.routes)
async def create_invoice(invoice: Invoice, callback_url: Union[AnyHttpUrl, None] = None):
    # Send the invoice, collect the money, send the notification (the callback)
    thread = Thread(target=do_invoice(invoice, callback_url))
    thread.start()
    return {"msg": "Invoice received"}

def do_invoice(invoice: Invoice, callback_url: AnyHttpUrl):
    time.sleep(10)
    url = callback_url + "/invoices/" + invoice.id
    json = {
        "data": ["Payment celebration"],
    }
    requests.post(url=url, json=json)
I thought putting the actual callback in a separate thread might work, and that the {"msg": "Invoice received"} would be returned immediately and then 10s later the external API would receive the result from the do_invoice function. But this doesn't seem to be the case, so perhaps I'm doing something wrong.
I've also tried putting the logic in the invoice_notification function, but that doesn't seem to do anything at all.
So what is the correct way to implement a callback like the one I want? Thankful for any help!

I thought putting the actual callback in a separate thread might work and that the {"msg": "Invoice received"} would be returned immediately and then 10s later the external API would receive the result from the do_invoice function. But this doesn't seem to be the case so perhaps I'm doing something wrong.
If you would like to run a task after the response has been sent, you could use a BackgroundTask, as demonstrated in this answer, as well as here and here. If you instead would like to wait for the task to finish before returning the response, you could run the task in either an external ThreadPool or ProcessPool (depending on the nature of the task) and await it, as explained in this detailed answer.
I would also strongly recommend using the httpx library in an async environment such as FastAPI, instead of using Python requests—you may find details and working examples here, as well as here and here.

Related

Is it possible to create multi thread in a flask server?

I am using flask and flask-restx to try to create a protocol to get a specific string from another service. I am trying to figure out a way to run the function in the server in different threads. Here's my code sample:
from flask_restx import Api, fields, Resource
from flask import Flask

app = Flask(__name__)
api = Api(app)

parent = api.model('Parent', {
    'name': fields.String(get_answer(a, b)),
    'class': fields.String(discriminator=True)
})

@api.route('/language')
class Language(Resource):
    # @api.marshal_with(data_stream_request)
    @api.marshal_with(parent)
    @api.response(403, "Unauthorized")
    def get(self):
        return {"happy": "good"}
What I expect:
On the client side, first the server should run, i.e., we should be able to make curl -i localhost:8080 work. Then, when a specific condition is true, the client side should receive a GET request with the parent JSON data I have in the server. However, if that condition is not true, the GET request should not be able to return the correct result.
What I did:
One of the methods I tried was to wrap the decorator and the class Language(Resource) part in a different function, wrap that function in a different thread, and put that thread under a condition check. Not sure if that's the right way to do it. I've seen people say Celery might be a good choice, but I'm not sure if it can work with flask-restx.
I have the answer for you. To run a process in the background with Flask, schedule it to run using another process with APScheduler, a very simple package that helps you schedule tasks to run functions at an interval; in your case, one time at utcnow().
Here is the link to Flask-APScheduler.
job = scheduler.add_job(myfunc, 'interval', minutes=2)
In your case use 'date' instead of 'interval' and specify run_date
job = scheduler.add_job(myfunc, 'date', run_date=datetime.utcnow())
You can send arguments to the function:
job = scheduler.add_job(myfunc, 'date', args = (your args), run_date=datetime.utcnow())
here is the documentation:
User Guide

Is my views model doing too much work? MVVM

I am a very beginner writing one of my first webapps. I'm using FastAPI and I'm stuck on the logic of creating an endpoint that has to do a lot of things before it returns something back to the user. Since I'm new I also lack a lot of the vocabulary that I think I'd have if I were more experienced so bear with me:
To start -- What is my web app doing?
Well, I am trying to create something that will pull a schedule from a 3rd party API and then show it to me, allowing me to book things from my webapp rather than having to navigate to the 3rd party APIs. Some of these events are recurring (or are assumed to be) and get recorded as a "preference"
My approach so far, using 'MVVM'
User navigates to www.myfakeurl.com/schedule (either directly or from the home page) (handled by views/schedule.py)
This sends a request to the /schedule router (also handled by views/schedule.py). The request is forwarded to view_models/schedule.py
In the ScheduleViewModel in view_models/schedule.py, there's no payload to check, but I do need to request the schedule from the 3rd party API. I decoupled this step from the constructor on purpose, one must call get_schedule.
Once get_schedule is called, the user id and token are sent to a function called get_current_schedule in services/3p_schedule.py; this is an asynchronous function that requests the schedule from the 3rd party API
Back in view_models.schedule.ScheduleViewModel, I validate the response and do some json reorganizing. The 3rd party API sends over a lot of stuff I don't need, so I just keep a few relevant pieces of info (like booking_id, if I already am signed up for a class, etc)
Still in view_models.schedule.ScheduleViewModel, I quickly check against the models.preferences.UserPreferences table (in the SQLAlchemy DB) if any classes in the schedule match my preferences and create a is_a_preferred_time attribute for each item in the list generated in (4) above
Still in the ScheduleViewModel, I finally combine the dict of info from step (4) with my info about user preferences from step (5). The schedule is a Dict[str, List[dict]]] where the key is the date and values are a list of dicts containing time (from 3rd party API), is_preferred (from UserPreferences table), booking_id (from 3rd party API), currently_enrolled (from 3rd party API). This is returned the views/schedule.py and the template is generated
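The combining in steps (5) and (6) is plain dictionary work; a pure-Python sketch could look like the following. The field names are taken from the description above, and preferred_times is a stand-in assumption for the real UserPreferences lookup:

```python
# Sketch of steps (5)-(6): merge 3rd-party schedule entries with user preferences.
# `preferred_times` stands in for the UserPreferences table lookup in the real app.

def merge_with_preferences(schedule, preferred_times):
    """schedule: {date: [{"booking_id", "time", "currently_enrolled"}, ...]}
    preferred_times: set of (date, time) pairs the user marked as preferred."""
    merged = {}
    for date, slots in schedule.items():
        merged[date] = [
            {**slot, "is_preference": (date, slot["time"]) in preferred_times}
            for slot in slots
        ]
    return merged

schedule = {"2023-01-02": [
    {"booking_id": 1, "time": "9:00 am", "currently_enrolled": False},
    {"booking_id": 2, "time": "6:00 pm", "currently_enrolled": True},
]}
result = merge_with_preferences(schedule, {("2023-01-02", "6:00 pm")})
```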
(Somewhat) pseudocode below:
# views/schedule.py
from fastapi import APIRouter
from fastapi.templating import Jinja2Templates
from starlette.requests import Request

from pydantic_schema import Schedule
from view_models.schedule import ScheduleViewModel

router = APIRouter()
templates = Jinja2Templates(directory="templates")

@router.get('/schedule')
async def schedule(request: Request) -> Schedule:
    schedule_model = ScheduleViewModel(request)
    schedule = await schedule_model.get_schedule(schedule_model.user)
    return templates.TemplateResponse("schedule.html", {"schedule": schedule})
# view_models/schedule.py
from starlette.requests import Request
from sqlalchemy.orm import Session

from dependencies.db import get_db
from pydantic_schema.user import User
# note: a Python module/identifier cannot start with a digit, so "3p" is spelled out here
from services.third_party_services import get_3p_schedule

class ScheduleViewModel:
    def __init__(self, request: Request):
        self.user = get_current_user(request)
        self.db = get_db()

    async def get_schedule(self, user: User):
        # try to request the schedule
        third_party_schedule = await get_3p_schedule(id=user.id, token=user.token)
        # parse schedule
        relevant_schedule_info = self.parse_schedule(third_party_schedule)
        # combine with user preferences
        user_schedule = await self.update_sched_with_prefs(relevant_schedule_info)
        return user_schedule

    def parse_schedule(self, schedule):
        # does stuff; returns a dict of length N with this format
        return {"yyyy-mm-dd": [{"booking_id": 1, "time": "x:xx xm", "currently_enrolled": False}]}

    async def update_sched_with_prefs(self, schedule):
        # gonna ignore the exact implementation here for brevity
        # dict of len(N) with values that are lists of variable length
        return {"yyyy-mm-dd": [{"booking_id": 1, "time": "x:xx xm", "currently_enrolled": False, "is_preference": True}]}
I use the dict to populate a template.
My concerns and confusions
I started off using pydantic for a lot of this, and then I learned about MVVM and got confused and just went with MVVM since I don't know what I am doing and it seemed more clear.
But this feels like a lot for one endpoint to handle? Or maybe the examples in all my tutorials are just very basic, and real life is closer to what I have going on?
I apologize in advance for just how clueless I am, but I don't have anyone to ask about this and am looking for any and all guidance / resources. I think I am in a very steep section of the learning curve atm :)

Using FastAPI in a sync way, how can I get the raw body of a POST request?

Using FastAPI in a sync, not async mode, I would like to be able to receive the raw, unchanged body of a POST request.
All examples I can find show async code, when I try it in a normal sync way, the request.body() shows up as a coroutine object.
When I test it by posting some XML to this endpoint, I get a 500 "Internal Server Error".
from fastapi import FastAPI, Response, Request, Body

app = FastAPI()

@app.get("/")
def read_root():
    return {"Hello": "World"}

@app.post("/input")
def input_request(request: Request):
    # how can I access the RAW request body here?
    body = request.body()
    # do stuff with the body here
    return Response(content=body, media_type="application/xml")
Is this not possible with FastAPI?
Note: a simplified input request would look like:
POST http://127.0.0.1:1083/input
Content-Type: application/xml
<XML>
<BODY>TEST</BODY>
</XML>
and I have no control over how input requests are sent, because I need to replace an existing SOAP API.
Using async def endpoint
If an object is a co-routine, it needs to be awaited. FastAPI is actually Starlette underneath, and Starlette methods for returning the request body are async methods (see the source code here as well); thus, one needs to await them (inside an async def endpoint). For example:
from fastapi import Request

@app.post("/input")
async def input_request(request: Request):
    return await request.body()
Update 1 - Using def endpoint
Alternatively, if you are confident that the incoming data is a valid JSON, you can define your endpoint with def instead, and use the Body field, as shown below (for more options on how to post JSON data, see this answer):
from fastapi import Body

@app.post("/input")
def input_request(payload: dict = Body(...)):
    return payload
If, however, the incoming data are in XML format, as in the example you provided, one option is to pass them using Files instead, as shown below—as long as you have control over how client data are sent to the server (have a look here as well). Example:
from fastapi import File

@app.post("/input")
def input_request(contents: bytes = File(...)):
    return contents
Update 2 - Using def endpoint and async dependency
As described in this post, you can use an async dependency function to pull out the body from the request. You can use async dependencies on non-async (i.e., def) endpoints as well. Hence, if there is some sort of blocking code in this endpoint that prevents you from using async/await—as I am guessing this might be the reason in your case—this is the way to go.
Note: I should also mention that this answer—which explains the difference between def and async def endpoints (that you might be aware of)—also provides solutions when you are required to use async def (as you might need to await for coroutines inside a route), but also have some synchronous expensive CPU-bound operation that might be blocking the server. Please have a look.
Example of the approach described earlier can be found below. You can uncomment the time.sleep() line, if you would like to confirm yourself that a request won't be blocking other requests from going through, as when you declare an endpoint with normal def instead of async def, it is run in an external threadpool (regardless of the async def dependency function).
from fastapi import FastAPI, Depends, Request
import time

app = FastAPI()

async def get_body(request: Request):
    return await request.body()

@app.post("/input")
def input_request(body: bytes = Depends(get_body)):
    print("New request arrived.")
    # time.sleep(5)
    return body
For convenience, you can simply use asgiref; this package supports async_to_sync and sync_to_async:
from asgiref.sync import async_to_sync

sync_body_func = async_to_sync(request.body)
print(sync_body_func())
async_to_sync executes an async function in an event loop; sync_to_async executes a sync function in a threadpool.

How to write unit tests for Durable Azure Functions?

I'm writing an Azure Durable Function, and I would like to write some unit tests for this whole Azure Function.
I tried to trigger the Client function (the "Start" function, as it is often called), but I can't make it work.
I'm doing this for two reasons:
It's frustrating to run the Azure Function code by running "func host start" (or pressing F5), then going to my browser, finding the right tab, going to http://localhost:7071/api/orchestrators/FooOrchestrator and going back to VS Code to debug my code.
I'd like to write some unit tests to ensure the quality of my project's code. Therefore I'm open to suggestions, maybe it would be easier to only test the execution of Activity functions.
Client Function code
This is the code of my Client function, mostly boilerplate code like this one
import logging

import azure.functions as func
import azure.durable_functions as df

async def main(req: func.HttpRequest, starter: str) -> func.HttpResponse:
    # 'starter' seems to contain the JSON data about
    # the URLs to monitor, stop, etc., the Durable Function
    client = df.DurableOrchestrationClient(starter)
    # The Client function knows which orchestrator to call
    # according to 'function_name'
    function_name = req.route_params["functionName"]
    # This part fails with a ClientConnectorError
    # with the message: "Cannot connect to host 127.0.0.1:17071 ssl:default"
    instance_id = await client.start_new(function_name, None, None)
    logging.info(f"Orchestration '{function_name}' started with ID = '{instance_id}'.")
    return client.create_check_status_response(req, instance_id)
Unit test try
Then I tried to write some code to trigger this Client function like I did for some "classic" Azure Functions:
import asyncio
import json

import azure.functions as func
# ... plus an import of the Client function's `main` shown above,
# from wherever it lives in your project

if __name__ == "__main__":
    # Build a simple request to trigger the Client function
    req = func.HttpRequest(
        method="GET",
        body=None,
        url="don't care?",
        # What orchestrator do you want to trigger?
        route_params={"functionName": "FooOrchestrator"},
    )
    # I copy-pasted the data that I obtained when I ran the Durable Function
    # with "func host start"
    starter = {
        "taskHubName": "TestHubName",
        "creationUrls": {
            "createNewInstancePostUri": "http://localhost:7071/runtime/webhooks/durabletask/orchestrators/{functionName}[/{instanceId}]?code=aakw1DfReOkYCTFMdKPaA1Q6bSfnHZ/0lzvKsS6MVXCJdp4zhHKDJA==",
            "createAndWaitOnNewInstancePostUri": "http://localhost:7071/runtime/webhooks/durabletask/orchestrators/{functionName}[/{instanceId}]?timeout={timeoutInSeconds}&pollingInterval={intervalInSeconds}&code=aakw1DfReOkYCTFMdKPaA1Q6bSfnHZ/0lzvKsS6MVXCJdp4zhHKDJA==",
        },
        "managementUrls": {
            "id": "INSTANCEID",
            "statusQueryGetUri": "http://localhost:7071/runtime/webhooks/durabletask/instances/INSTANCEID?taskHub=TestHubName&connection=Storage&code=aakw1DfReOkYCTFMdKPaA1Q6bSfnHZ/0lzvKsS6MVXCJdp4zhHKDJA==",
            "sendEventPostUri": "http://localhost:7071/runtime/webhooks/durabletask/instances/INSTANCEID/raiseEvent/{eventName}?taskHub=TestHubName&connection=Storage&code=aakw1DfReOkYCTFMdKPaA1Q6bSfnHZ/0lzvKsS6MVXCJdp4zhHKDJA==",
            "terminatePostUri": "http://localhost:7071/runtime/webhooks/durabletask/instances/INSTANCEID/terminate?reason={text}&taskHub=TestHubName&connection=Storage&code=aakw1DfReOkYCTFMdKPaA1Q6bSfnHZ/0lzvKsS6MVXCJdp4zhHKDJA==",
            "rewindPostUri": "http://localhost:7071/runtime/webhooks/durabletask/instances/INSTANCEID/rewind?reason={text}&taskHub=TestHubName&connection=Storage&code=aakw1DfReOkYCTFMdKPaA1Q6bSfnHZ/0lzvKsS6MVXCJdp4zhHKDJA==",
            "purgeHistoryDeleteUri": "http://localhost:7071/runtime/webhooks/durabletask/instances/INSTANCEID?taskHub=TestHubName&connection=Storage&code=aakw1DfReOkYCTFMdKPaA1Q6bSfnHZ/0lzvKsS6MVXCJdp4zhHKDJA==",
            "restartPostUri": "http://localhost:7071/runtime/webhooks/durabletask/instances/INSTANCEID/restart?taskHub=TestHubName&connection=Storage&code=aakw1DfReOkYCTFMdKPaA1Q6bSfnHZ/0lzvKsS6MVXCJdp4zhHKDJA==",
        },
        "baseUrl": "http://localhost:7071/runtime/webhooks/durabletask",
        "requiredQueryStringParameters": "code=aakw1DfReOkYCTFMdKPaA1Q6bSfnHZ/0lzvKsS6MVXCJdp4zhHKDJA==",
        "rpcBaseUrl": "http://127.0.0.1:17071/durabletask/",
    }
    # I need to use async methods because the "main" of the Client
    # uses async.
    response = asyncio.get_event_loop().run_until_complete(
        main(req, starter=json.dumps(starter))
    )
But unfortunately the Client function still fails in the await client.start_new(function_name, None, None) part.
How could I write some unit tests for my Durable Azure Function in Python?
Technical information
Python version: 3.9
Azure Functions Core Tools version 4.0.3971
Function Runtime Version: 4.0.1.16815
Not sure if this will help, but here is an official sample from Microsoft on unit testing for what you are looking for: https://github.com/kemurayama/durable-functions-for-python-unittest-sample

How to avoid logging request in Locust without using context manager?

The Locust documentation explains that a request can be prevented from being logged by using a context manager and raising an exception. For example:
try:
    with self.client.get('/wont_be_logged', catch_response=True) as response:
        raise RuntimeError
except RuntimeError:
    pass
Is there a way to achieve the same without having to use a context manager?
Just do the request yourself (without using self.client), for example by using requests.get(...).
(Note that this will use a different session, so it won't use the same cookies or underlying HTTP connection.)
Not to detract from cyberwiz's answer, as ultimately you could just do your own requests and get the same behavior, but if you really want to use Locust's client and not manage another client yourself, you can. catch_response=True should be enough for it to not automatically fire success or failure events. You can then manually fire events with whatever you want after that.
Demo Locust file:
from locust import HttpUser
from locust.user.task import task

class TestUser(HttpUser):
    host = "http://localhost:8089"

    @task
    def test_call(self):
        r = self.client.get("/", catch_response=True)
        print("Test")
        print(r.elapsed)
        self.environment.events.request_success.fire(
            request_type="POST",
            name="/somewhere_new",
            response_time=r.elapsed.microseconds / 1000,
            response_length=13,
        )
Again, doing it this way doesn't save much, but it works.
