I'm trying to create a very simple system where a user, in order to use a consumer, needs to input a key in the WS url, like ws://127.0.0.1:8000/main?key=KEY. Once the consumer is called, Django Channels needs to perform a very simple DB query to check if the key exists:
class TestConsumer(AsyncJsonWebsocketConsumer):
    async def websocket_connect(self, event):
        ...
        key_query = await database_sync_to_async(keys.objects.get(key=key))
        ...
But the problem with this code is that it will give the following error:
You cannot call this from an async context - use a thread or sync_to_async.
Is there any way to accomplish this? Any advice is appreciated!
database_sync_to_async must be called on the method, not on the result of the method:
key_query = await database_sync_to_async(keys.objects.get)(key=key)
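For illustration, a fuller connect might look like the sketch below, assuming the key arrives in the query string as in the URL above; the keys model is taken from the question, while the query-string parsing and DoesNotExist handling are illustrative:

from channels.db import database_sync_to_async
from channels.generic.websocket import AsyncJsonWebsocketConsumer

class TestConsumer(AsyncJsonWebsocketConsumer):
    async def connect(self):
        # The raw query string arrives as bytes in the ASGI scope.
        params = dict(
            pair.split("=", 1)
            for pair in self.scope["query_string"].decode().split("&")
            if "=" in pair
        )
        key = params.get("key")
        try:
            # Wrap the callable, then call the wrapper with the arguments.
            await database_sync_to_async(keys.objects.get)(key=key)
        except keys.DoesNotExist:
            await self.close()
            return
        await self.accept()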
I have very recently started checking out asyncio for Python. The use case is that I hit an endpoint /refresh_data and the data is refreshed (from an S3 bucket), but this should be a non-blocking operation; other API endpoints should still be able to be serviced.
So in my controller, I have:
def refresh_data():
    myservice.refresh_data()
    return jsonify(dict(ok=True))
and in my service I have:
async def refresh_data():
    try:
        s_result = await self.s3_client.fetch()
    except (FileNotFound, IOError) as e:
        logger.info("problem")
    gather = {i.pop("x"): i async for i in summary_result}
    # ... some other stuff
and in my client:
async def fetch():
    result = pd.read_parquet("bucket", "pyarrow", cols, filts).to_dict(orient="col1")
    return result
And when I run it, I see this error:
TypeError: 'async for' requires an object with __aiter__ method, got coroutine
I don't know how to move past this. Adding async definitely makes it return a coroutine type, but either I have implemented this messily or I haven't fully understood the asyncio package in Python. I have been working off of simple examples, but I'm not sure what's wrong with what I've done here.
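For reference, the error itself is mechanical: async for needs an async iterable (an object with __aiter__, such as an async generator), while calling an async def function produces a coroutine that must be awaited instead. A minimal sketch of the distinction, with illustrative names not taken from the code above:

import asyncio

async def fetch_rows():
    # A plain coroutine: produces one value and must be awaited.
    return [{"x": 1, "y": 2}, {"x": 3, "y": 4}]

async def stream_rows():
    # An async generator: has __aiter__, so it works with `async for`.
    for row in await fetch_rows():
        yield row

async def main():
    rows = await fetch_rows()                  # await the coroutine...
    by_x = {r.pop("x"): r for r in rows}       # ...then iterate normally,
    also = {r.pop("x"): r async for r in stream_rows()}  # or iterate an async generator
    print(by_x, also)

asyncio.run(main())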
My application will be sending hundreds, if not thousands, of messages over redis every second, so we are going to open the connection when the app launches and use that connection for every transaction. I am having trouble keeping that connection open.
Here is some of my redis handler class:
class RedisSingleDatabase(AsyncGetSetDatabase):
    def __init__(self):
        self._redis = None

    async def redis(self):
        if not self._redis:
            self._redis = await aioredis.create_redis(('redis', REDIS_PORT))
        return self._redis

    async def get(self, key):
        r = await self.redis()
        # r = await aioredis.create_redis(('redis', REDIS_PORT))
        print(f'CONNECTION IS {"CLOSED" if r.closed else "OPEN"}!')
        data = await r.get(key, encoding='utf-8')
        if data is not None:
            return json.loads(data)
This does not work. By the time we call r.get, the connection is closed (ConnectionClosedError).
If I uncomment the second line of the get method, and connect to the database locally in the method, it works. But then we're no longer using the same connection.
I've considered trying all this using dependency injection as well. This is a Flask app, and in my create_app method I would connect to the database and construct my RedisSingleDatabase with that connection. Besides the question of whether my current problem would remain, I can't seem to get this approach to work, since we need to await the connection, which I can't do in my root-level create_app, which can't be async (unless I'm missing something).
I've tried asyncio.run all over the place, and it either doesn't fix the problem or raises its own error that it can't be called from a running event loop.
Please help!!!
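One pattern worth noting here (an assumption about the setup, not something stated in the question): if each request drives asyncio with its own asyncio.run, every call creates and then closes a fresh event loop, and a connection tied to a closed loop is dead. Keeping one long-lived loop in a background thread and funnelling every coroutine through it lets a single connection survive across requests:

import asyncio
import threading

# One long-lived loop in a daemon thread; a connection created on this
# loop stays usable because the loop itself is never closed.
_loop = asyncio.new_event_loop()
threading.Thread(target=_loop.run_forever, daemon=True).start()

def run_async(coro):
    # Submit a coroutine to the shared loop from synchronous Flask code.
    return asyncio.run_coroutine_threadsafe(coro, _loop).result()

# Usage in a Flask view, with db a RedisSingleDatabase instance:
# value = run_async(db.get('some-key'))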
I want to create a REST service using FastAPI and aio-pika while working asynchronously. For other async database drivers, I can create clients on startup, then get them in route handlers. For example, with motor I would declare a simple connection manager:
from motor.motor_asyncio import AsyncIOMotorClient

class Database:
    client: AsyncIOMotorClient = None

db = Database()

async def connect_to_mongo():
    db.client = AsyncIOMotorClient("mongo:27017")

async def close_mongo_connection():
    db.client.close()

async def get_mongo_client() -> AsyncIOMotorClient:
    return db.client
Then add a couple of handlers:
app.add_event_handler("startup", connect_to_mongo)
app.add_event_handler("shutdown", close_mongo_connection)
and then just use get_mongo_client to get a client in my handler.
The problem here is that aio-pika needs an asyncio loop to function. Here is an example from the docs:
connection = await aio_pika.connect_robust(
    "amqp://guest:guest@127.0.0.1/", loop=loop
)
And with FastAPI I don't have an asyncio loop. Is there any way to use it with an interface like in the example? Can I just create a new loop using asyncio.get_event_loop() and pass it to connect_robust without really using it anywhere? Like this:
connection = await aio_pika.connect_robust(
    "amqp://guest:guest@127.0.0.1/", loop=asyncio.get_event_loop()
)
Ok, so, according to the docs, I can just use connect instead of connect_robust:
connection = await aio_pika.connect(
    "amqp://guest:guest@127.0.0.1/"
)
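Wired into FastAPI the same way as the motor example above, that might look like the following; a minimal sketch, where the Rabbit holder class and handler names are illustrative:

import aio_pika

class Rabbit:
    connection = None  # holds the aio-pika connection once startup runs

rabbit = Rabbit()

async def connect_to_rabbit():
    # No explicit loop argument: FastAPI's startup handler already runs
    # inside the running event loop, and aio-pika picks it up by default.
    rabbit.connection = await aio_pika.connect("amqp://guest:guest@127.0.0.1/")

async def close_rabbit_connection():
    await rabbit.connection.close()

app.add_event_handler("startup", connect_to_rabbit)
app.add_event_handler("shutdown", close_rabbit_connection)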
I'm currently building an application that allows users to collaborate and create things together, so I require a sort of Discord-like group chat feed. I need to be able to subscribe logged-in users to a project for notifications.
I have a method open_project that retrieves details from a project that has been selected by the user, which I use to subscribe them to any updates for that project.
I can think of two ways of doing this. I have created an instance variable in my connect function, like this:
def connect(self):
    print("connected to projectconsumer...")
    self.accept()
    self.projectSessions = {}
And here is the open_project method:
def open_project(self, message):
    p = Project.objects.values("projectname").get(id=message)
    if message not in self.projectSessions:
        self.projectSessions[message] = []
    # list.append returns None, so append without re-assigning
    self.projectSessions[message].append(self)
    print(self.projectSessions[message])
    self.userjoinedMessage(self.projectSessions[message])
    message = {}
    message["command"] = "STC-openproject"
    message["message"] = p
    self.send_message(json.dumps(message))
Then when the user opens a project, he is added to the projectSessions list. This, however, doesn't work (I think), because whenever a new user connects to the websocket, he gets his own project consumer.
The second way I thought of doing this is to create a managing class that has only one instance and keeps track of all the users connected to a project. I have not tried this yet, as I would like some feedback on whether I'm even swinging in the right ballpark. Any and all feedback is appreciated.
EDIT 1:
I forgot to add the userjoinedMessage method to the question. This method is simply there to mimic future mechanics and to give feedback on whether my solution actually works, but here it is:
def userjoinedMessage(self, pointer):
    message = {}
    message["command"] = "STC-userjoinedtest"
    message["message"] = ""
    pointer.send_message(json.dumps(message))
Note that I attempt to reference the instance of the consumer.
I will also attempt to implement a consumer manager that has the role of keeping track of what consumers are browsing what projects and sending updates to the relevant channels.
From the question, the issue is how to save projectSessions and have it accessible across multiple instances of the consumer. Instead of trying to save it in memory, you can save it in a database. It is a dictionary with the project as key, so you can make it a table with a ForeignKey to the Project model.
That way it is persisted, and there would be no issue retrieving it, even across multiple Channels server instances if you ever decide to scale your Channels across multiple servers.
Also, if you feel that a traditional database will slow down the retrieval of the sessions, you can use a faster storage system like Redis.
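A minimal sketch of what that table could look like; the model and field names are illustrative, and the ForeignKey target assumes the Project model from the question:

from django.db import models

class ProjectSession(models.Model):
    # One row per consumer subscribed to a project, replacing the
    # in-memory projectSessions dict; channel_name identifies the consumer.
    project = models.ForeignKey("Project", on_delete=models.CASCADE)
    channel_name = models.CharField(max_length=255)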
Right, this is probably a horrible way of doing things, and I should be taken out back and shot for doing it, but I have a fix for my problem. I have made a ProjectManager class that handles subscriptions and updates to the users of a project:
import json

class ProjectManager():
    def __init__(self):
        if not hasattr(self, 'projectSessions'):
            self.projectSessions = {}

    def subscribe(self, projectid, consumer):
        print(projectid not in self.projectSessions)
        if projectid not in self.projectSessions:
            self.projectSessions[projectid] = []
        self.projectSessions[projectid].append(consumer)
        self.update(projectid)

    def unsubscribe(self, projectid, consumer):
        pass

    def update(self, projectid):
        if projectid in self.projectSessions:
            print(self.projectSessions[projectid])
            for consumer in self.projectSessions[projectid]:
                message = {}
                message["command"] = "STC-userjoinedtest"
                message["message"] = ""
                consumer.send_message(json.dumps(message))
In my apps.py file I initialize the above ProjectManager class and assign it to a variable:
from django.apps import AppConfig
from .manager import ProjectManager

class ProjectConfig(AppConfig):
    name = 'project'
    manager = ProjectManager()
I then use this in my consumers.py file: I import the manager from the ProjectConfig class and assign it to an instance variable inside the created consumer whenever it's connected:
def connect(self):
    print("connected to projectconsumer...")
    self.accept()
    self.manager = ProjectConfig.manager
And whenever I call open_project, I subscribe to that project with the given project id received from the front-end:
def open_project(self, message):
    p = Project.objects.values("projectname").get(id=message)
    self.manager.subscribe(message, self)
    message = {}
    message["command"] = "STC-openproject"
    message["message"] = p
    self.send_message(json.dumps(message))
As I said, I in no way claim that this is the correct way of doing it, and I am also aware that channel_layers supposedly does this for you in a neat way. I, however, don't really have the time to get into channel_layers and will therefore be using this.
I am still open to suggestions, of course, and am always happy to learn more.
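For comparison, the channel_layers equivalent is roughly the following; a minimal sketch, assuming a configured channel layer (e.g. channels_redis) and a synchronous consumer:

from asgiref.sync import async_to_sync

def open_project(self, message):
    # Groups replace the hand-rolled ProjectManager: every consumer added
    # to a group receives events sent to it, even across processes.
    async_to_sync(self.channel_layer.group_add)(
        f"project_{message}", self.channel_name
    )

Broadcasting an update then becomes a group_send to the same group name, with a message type that maps to a handler method on the consumer.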
I'm using RethinkDB with Tornado with an async approach. Based on my data model in RethinkDB, upon inserting a record into my topic table, I also have to update/insert a new record into the user_to_topic table. Here's the basic setup of my post request handler.
class TopicHandler(tornado.web.RequestHandler):
    def get(self):
        pass

    @gen.coroutine
    def post(self, *args, **kwargs):
        # establish the database connection
        connection = rth.connect(host='localhost', port=00000, db=DATABASE_NAME)
        # Thread the connection
        threaded_conn = yield connection
        title = self.get_body_argument('topic_title')
        # insert into table
        new_topic_record = rth.table('Topic').insert({
            'title': title,
        }, conflict="error", durability="hard").run(threaded_conn)
        # {TODO} for now assume the result is always written successfully
        io_loop = ioloop.IOLoop.instance()
        # I want to return my generated_keys here
        io_loop.add_future(new_topic_record, self.return_written_record_id)

    # do stuff with the generated_keys here, how do I get those keys
    def return_written_record_id(self, f):
        _result = f.result()
        return _result['generated_keys']
After the insert operation finishes, the Future object exposes the result of the RethinkDB insert through its Future.result() method, from which I can retrieve the id of the new record via the generated_keys attribute. According to the Tornado documentation, I can do stuff with the result in my callback function, return_written_record_id. Of course I could do all the other database operations inside my return_written_record_id function, but is it possible to return all the ids to my post function? Or is this the way it has to be when using coroutines in Tornado?
Any suggestion will be appreciated. Thank you!
Simply do:
result = yield new_topic_record
Whenever you yield a Future inside a coroutine, Tornado pauses the coroutine until the Future is resolved to a value or exception. Then Tornado resumes the coroutine by passing the value in or raising an exception at the "yield" expression.
For more info, see Refactoring Tornado Coroutines.
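Applied to the handler above, the post method then collapses to something like this; a minimal sketch that keeps the question's placeholder port:

@gen.coroutine
def post(self, *args, **kwargs):
    # port=00000 is the placeholder from the question, not a real port
    threaded_conn = yield rth.connect(host='localhost', port=00000, db=DATABASE_NAME)
    title = self.get_body_argument('topic_title')
    # Yielding the insert Future resolves it right here, so generated_keys
    # is available inside post itself; no add_future callback is needed.
    result = yield rth.table('Topic').insert(
        {'title': title}, conflict="error", durability="hard"
    ).run(threaded_conn)
    generated_keys = result['generated_keys']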