FastAPI communication with another API - Python

I have only recently started using FastAPI, and as an exercise I want to connect my FastAPI app to a validation service on another server... but I do not know how to do this, and I have not found anything in the official documentation that helps. Will I have to write it with plain Python code, or is there a built-in way?
FastAPI docs
Thank you for your help, and excuse my English.

The accepted answer certainly works, but it is not an efficient solution: with each request, the ClientSession is closed, so we lose the advantages [0] of ClientSession, namely connection pooling, keep-alives, and so on.
We can use the startup and shutdown events [1] in FastAPI, which are triggered when the server starts and shuts down respectively. In these events it is possible to create a ClientSession instance and use it during the runtime of the whole application (and therefore utilize its full potential).
The ClientSession instance is stored in the application state. [2]
Here I answered a very similar question in the context of the aiohttp server: https://stackoverflow.com/a/60850857/752142
from __future__ import annotations

import asyncio
from typing import Final

from aiohttp import ClientSession
from fastapi import Depends, FastAPI
from starlette.requests import Request

app: Final = FastAPI()

@app.on_event("startup")
async def startup_event():
    # Create a single ClientSession for the lifetime of the application.
    setattr(app.state, "client_session", ClientSession(raise_for_status=True))

@app.on_event("shutdown")
async def shutdown_event():
    # Close the session gracefully, giving up after 5 seconds.
    await asyncio.wait_for(app.state.client_session.close(), timeout=5.0)

def client_session_dep(request: Request) -> ClientSession:
    return request.app.state.client_session

@app.get("/")
async def root(
    client_session: ClientSession = Depends(client_session_dep),
) -> str:
    async with client_session.get(
        "https://example.com/", raise_for_status=True
    ) as the_response:
        return await the_response.text()
[0] https://docs.aiohttp.org/en/stable/client_reference.html
[1] https://fastapi.tiangolo.com/advanced/events/
[2] https://www.starlette.io/applications/#storing-state-on-the-app-instance
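As a side note, newer FastAPI releases expose the same lifecycle through a lifespan context manager instead of the on_event hooks. A minimal sketch of the same session-reuse pattern in that style, assuming a recent FastAPI version (the 5-second close timeout simply mirrors the example above):

import asyncio
from contextlib import asynccontextmanager

from aiohttp import ClientSession
from fastapi import FastAPI

@asynccontextmanager
async def lifespan(app: FastAPI):
    # One ClientSession for the whole application lifetime.
    app.state.client_session = ClientSession(raise_for_status=True)
    yield
    # Close it on shutdown, capped at 5 seconds.
    await asyncio.wait_for(app.state.client_session.close(), timeout=5.0)

app = FastAPI(lifespan=lifespan)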

You will need to code it with Python.
If you're using async, you should use an HTTP client that is also async, for example aiohttp.

import aiohttp

@app.get("/")
async def slow_route():
    async with aiohttp.ClientSession() as session:
        async with session.get("http://validation_service.com") as resp:
            data = await resp.text()
            # do something with data
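httpx is another async client that is commonly paired with FastAPI; a minimal sketch of the same idea using it (the route path and the validation-service URL are placeholders):

import httpx

@app.get("/validate")
async def validate():
    # A one-off client; for many requests you would reuse a single AsyncClient.
    async with httpx.AsyncClient() as client:
        resp = await client.get("http://validation_service.com")
        return resp.text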

Related

Scheduled HTTP Request using FastAPI

Inside my FastAPI application, I would like to schedule an HTTP request to be made every X time interval to check for new results (comparing against the database). What would be the easiest way to accomplish this using httpx?
You can add an async task to the event loop during the startup event. This async task would check (and sleep) and store the result somewhere. In the example below, I've chosen to pass around a shared object using the app.state feature of FastAPI. This should give you enough pointers to implement your exact use case. I have commented out an example of making the check with httpx.
from fastapi import FastAPI
import asyncio

class MySharedObject:
    def __init__(self) -> None:
        self.count = 0

async def timed_checker(obj: MySharedObject):
    while True:
        obj.count += 1
        # async with httpx.AsyncClient() as client:
        #     r = await client.get('https://www.example.com/')
        await asyncio.sleep(3)

app = FastAPI()

@app.on_event("startup")
async def startup_function():
    # Create the shared object and start the background checker task.
    app.state.shared_object = MySharedObject()
    asyncio.create_task(timed_checker(app.state.shared_object))

@app.get("/")
async def root():
    return {"hello": "world"}

@app.get("/count")
async def get_count():
    return app.state.shared_object.count

if __name__ == "__main__":
    import uvicorn
    uvicorn.run(app, host="0.0.0.0", port=8000)
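One thing the example above does not handle is stopping the background task when the server shuts down. A minimal sketch of how the startup/shutdown handlers could look if the task is kept on app.state (the attribute name checker_task is my own choice, not part of the answer above):

@app.on_event("startup")
async def startup_function():
    app.state.shared_object = MySharedObject()
    # Keep a reference to the task so it can be cancelled on shutdown (hypothetical attribute name).
    app.state.checker_task = asyncio.create_task(timed_checker(app.state.shared_object))

@app.on_event("shutdown")
async def shutdown_function():
    app.state.checker_task.cancel()
    try:
        await app.state.checker_task
    except asyncio.CancelledError:
        pass  # expected when the task is cancelled cleanly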

Use anyio.TaskGroup with fastapi.StreamingResponse

anyio is a dependency of Starlette and, therefore, of FastAPI. I find it quite convenient to use its task groups to perform concurrent requests to external services from within one of my API servers.
Also, I would like to stream the results out as soon as they are ready. fastapi.StreamingResponse could do the trick, but then I need to keep the task group up and running after returning the StreamingResponse, which sounds like something that goes against the idea of structured concurrency.
Using an asynchronous generator may look like an obvious solution, but yield in general cannot be used inside a task group context, according to this: https://trio.readthedocs.io/en/stable/reference-core.html#cancel-scopes-and-nurseries
There is an example of a FastAPI server that seems to work, though it aggregates the responses before returning them:
import anyio
from fastapi import FastAPI
from fastapi.responses import StreamingResponse

app = FastAPI()

@app.get("/")
async def root():
    # What to put below?
    result = await main()
    return StreamingResponse(iter(result))

async def main():
    send_stream, receive_stream = anyio.create_memory_object_stream()
    result = []
    async with anyio.create_task_group() as tg:
        async with send_stream:
            for num in range(5):
                tg.start_soon(sometask, num, send_stream.clone())
        async with receive_stream:
            async for entry in receive_stream:
                # What to do here???
                result.append(entry)
    return result

async def sometask(num, send_stream):
    await anyio.sleep(1)
    async with send_stream:
        await send_stream.send(f'number {num}\n')

if __name__ == "__main__":
    import uvicorn
    # Debug-only configuration
    uvicorn.run(app)
So, the question is: is there something similar to @trio_util.trio_async_generator in anyio, or is it possible to use @trio_util.trio_async_generator with FastAPI directly?
Maybe there are other solutions?
import anyio
from fastapi import FastAPI
from fastapi.responses import StreamingResponse

app = FastAPI()

@app.get("/")
async def root():
    return StreamingResponse(main())

async def main():
    send_stream, receive_stream = anyio.create_memory_object_stream()
    async with anyio.create_task_group() as tg:
        async with send_stream:
            for num in range(5):
                tg.start_soon(sometask, num, send_stream.clone())
        async with receive_stream:
            async for entry in receive_stream:
                yield entry

async def sometask(num, send_stream):
    async with send_stream:
        for i in range(1000):
            await anyio.sleep(1)
            await send_stream.send(f"number {num}\n")

if __name__ == "__main__":
    import uvicorn
    # Debug-only configuration
    uvicorn.run(app)
Unexpectedly, it works. Because main() is an async generator, its body (including the task group's async with block) only runs as the StreamingResponse iterates over it, so the task group stays open for as long as the response is being streamed and is closed when the generator finishes.

What is the default concurrency level with asyncio in Python?

I'm using Python Asyncio to do a lot of HTTP requests.
I'm wondering what is the default concurrency level of Asyncio or how many HTTP requests would be happening in parallel at any given time?
The code I'm using to do the HTTP requests can be found below:
async def call_url(self, session, url):
    response = await session.request(method='GET', url=url)
    return response

async def main(self, url_list):
    async with aiohttp.ClientSession() as session:
        res = await asyncio.gather(*[self.call_url(session, url) for url in url_list])
        return res
There is no built-in limit in asyncio itself, but there is one in aiohttp: the TCPConnector limits the number of simultaneous connections to 100 by default. You can override it by creating a TCPConnector with a different limit and passing it to the session.
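A minimal sketch of overriding that default (the limit of 50 and the function name fetch_all are my own choices, not from the question above):

import asyncio
import aiohttp

async def fetch_all(urls):
    # Allow at most 50 simultaneous connections instead of aiohttp's default of 100.
    connector = aiohttp.TCPConnector(limit=50)
    async with aiohttp.ClientSession(connector=connector) as session:
        async def fetch(url):
            async with session.get(url) as resp:
                return await resp.text()
        return await asyncio.gather(*(fetch(u) for u in urls))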

Simple way to test websocket availability in python

I am using the following code to test that a local websocket server is running:
import asyncio
import websockets

async def hello():
    async with websockets.connect('ws://127.0.0.1:8000/ws/registration/123') as websocket:
        await websocket.send(json_data)

asyncio.get_event_loop().run_until_complete(hello())
Is there a simpler way to do this without using asyncio? Something such as:
import asyncio
import websockets
conn = websockets.connect('ws://127.0.0.1:8000/ws/registration/123')
conn.send('hello')
Basically, I'm just trying to find the simplest way to test to see if my websocket server is listening and receiving messages at a particular url.
Doesn't async_to_sync make this more complex? Why not just create a normal test_url function:
import asyncio
import websockets

def test_url(url, data=""):
    async def inner():
        async with websockets.connect(url) as websocket:
            await websocket.send(data)
    return asyncio.get_event_loop().run_until_complete(inner())

test_url("ws://127.0.0.1:8000/ws/registration/123")
You can do the above by using async_to_sync, for example:
from asgiref.sync import async_to_sync
import websockets

def test_url(url, data=""):
    conn = async_to_sync(websockets.connect)(url)
    async_to_sync(conn.send)(data)

test_url("ws://127.0.0.1:8000/ws/registration/123")
Note that the "handshake" will probably not complete here because it needs to be accepted both ways, but the above should enable you to test to make sure that the urls are being routed properly, etc.

How to receive data from multiple WebSockets asynchronously in Python?

I am currently trying to use the websockets library. If another library is better suited for this purpose, let me know.
Given these functions:
def handle_message(msg):
    ...  # do something

async def consumer_handler(websocket, path):
    async for message in websocket:
        handle_message(message)
How can I (indefinitely) connect to multiple websockets? Would the below code work?
import asyncio
import websockets

connections = set()
connections.add(websockets.connect(consumer_handler, 'wss://api.foo.com', 8765))
connections.add(websockets.connect(consumer_handler, 'wss://api.bar.com', 8765))
connections.add(websockets.connect(consumer_handler, 'wss://api.foobar.com', 8765))

async def handler():
    await asyncio.wait([ws for ws in connections])

asyncio.get_event_loop().run_until_complete(handler())
For anyone who finds this, I found an answer. I believe it only works in Python > 3.6.1.
import asyncio
import websockets

connections = set()
connections.add('wss://api.foo.com:8765')
connections.add('wss://api.bar.com:8765')
connections.add('wss://api.foobar.com:8765')

async def handle_socket(uri):
    async with websockets.connect(uri) as websocket:
        async for message in websocket:
            print(message)

async def handler():
    await asyncio.wait([handle_socket(uri) for uri in connections])

asyncio.get_event_loop().run_until_complete(handler())
Instead of a set:
connections = set()
I would use a list:
connections = ["wss://api.foo.com:8765"]
connections.append("wss://api.bar.com:8765")
connections.append("wss://api.foobar.com:8765")
