Using FastAPI for socket chat system? - python

How would one implement a comprehensive chat system using sockets within FastAPI? Specifically, keeping the following in mind:
Multiple chat rooms, many-to-many between users and rooms
Storing messages in a SQL or NoSQL database for persistence
Security: authentication and possibly encryption
I've looked at some libraries, but actually useful implementations are few and far between, regrettably.
Any advice or pointers to places with more information would be a great help!

For chat rooms you could use FastAPI's built-in WebSocket support and add Redis pub/sub or PostgreSQL's pg_notify on top for fanning messages out to all participants in a room.
Storing messages in PostgreSQL is a solid choice because of its long history and stability.
Authentication can be handled by the OAuth2 support in FastAPI. Authorization can be handled by OAuth2 scopes, covered in the Advanced Security section of the FastAPI documentation. Encryption in transit is provided by HTTPS, via a reverse proxy that you put in front of your app.
There aren't any fully ready-made libraries that provide everything out of the box. But breaking the problem down into smaller pieces and then working on those will get you pretty far:
Write down what fields/data you want to store about your users, chat rooms, and messages.
Implement those basic models in FastAPI, probably using SQLAlchemy.
Wire those models up to API endpoints so that you can exercise them in Swagger (list chat rooms, get and post messages in a chat room).
Implement a WebSocket endpoint in FastAPI that echoes back everything sent to it. That should let you wire up some client-side JavaScript for sending and receiving messages over the WebSocket.
Modify your existing message-storing endpoint to also publish the message to a Redis channel, and change your WebSocket endpoint to subscribe to that channel.
Add authentication to your endpoints: at first basic username/password, later more advanced configurations.
Add a reverse proxy with HTTPS in front, and voilà.

Related

Can we pull messages from Pub/sub topic to Angular?

I'm able to push messages from the Python backend (which is on a VM instance) to a topic and see the messages on the Pub/Sub topic. But I can't find any code for pulling data from the topic using Angular. I want to pull that data and show it in the Angular UI. Could you please help me with this?
With Pub/Sub, there are two subscription modes, which imply two kinds of authentication:
Push subscription, where the sender (the Pub/Sub subscription) needs to be authenticated to push messages to a secure endpoint.
Pull subscription, where the client needs to be authenticated to be able to get the messages.
So, in your case, you need to authenticate your Angular app against a Pub/Sub pull subscription to be able to read the messages. You have two solutions:
Either you generate a service account key file and put it in your static code. That's obviously a bad idea, because you'd be sharing a secret publicly, which amounts to having no security at all!
Or, because the previous solution is as good as no security, you can make the pull subscription public: grant allUsers the Pub/Sub Subscriber role.
It will work, but there is a design issue: anyone will be able to subscribe to your pull subscription, and because messages aren't duplicated between subscribers, you will potentially lose messages.
A better solution would be to serve a streaming endpoint, with Cloud Run for example: authenticate your users on the Cloud Run endpoint, and stream the messages from the Pub/Sub pull subscription through the Cloud Run streaming connection.
That way you add a security layer, something like a proxy.

Slack API - queue for retrieving slash commands

I am tasked with building a Slack slash command app in Python which will respond to incoming slash commands. However, for security reasons, I am not allowed to open the firewall for incoming webhooks from Slack. Is there instead a way to check a queue of sent slash commands?
For example, a user types "/myslashapp" in a specific channel. My app will need to do something like call an endpoint every 30 seconds and check if the "/myslashapp" command was sent. If it was, my app should trigger a Lambda function in AWS.
Based on reading the Slack API docs, I haven't found any way to do this other than perhaps the RTM API, though it seems like overkill and still requires an open socket.
No. The Slack API has no built-in support for pulling requests after the fact from a queue instead of receiving them from Slack as they happen.
The RTM API might work for you, because the connection to Slack is initiated from your side, so, provided your firewall allows outgoing connections, it would also work from within an intranet. However, you cannot use slash commands with the RTM API, nor any of the other interesting interactive Slack features like buttons; only simple messages and events.
You could implement your own bridging solution and poll it. But I don't think a polling solution would work well, because it adds a lot of latency to your app. Users expect an immediate response to their slash command, not a delay of 30 seconds or more.
So in summary I think you have only two valid options:
Host your app internally and use a secure tunnel like ngrok to expose a public URL to your app.
Run your app on the Internet and give it a secure connection to your intranet for accessing internal data (similar to how, e.g., a shopping website works: a public app on the Internet that can also transmit orders to business applications on the company's intranet).

Client-Server framework for python

I'm currently working on a University project that needs to be implemented with a Client - Server model.
I had experiences in the past where I was managing the communication at socket level and that really sucked.
I was wondering if someone could suggest an easy to use python framework that I can use for that purpose.
I don't know what kind of details you may need to answer so I'm just going to describe the project briefly.
Communication should happen over HTTP, possibly HTTPS.
The server does not need to send data back or invoke methods on the clients; it just collects data.
Many clients send data concurrently to server, who needs to distinguish the sender, process the data accordingly and put the result in a database.
You can use something like Flask or Django. Both frameworks are fairly easy to get started with; Flask is much easier than Django IMO, although Django has a built-in authentication layer that you can use, albeit one that is more difficult to apply in a client/server scenario like yours.
I would personally use Flask with JWT (JSON Web Tokens), which lets you give each client a token for authenticating with the server and also lets you differentiate between clients; you can use HTTPS for your SSL/TLS requirement. This is far easier to implement, and although I like Django better for what it brings to the table, it is probably overkill to have you learn it for a single assignment.
For Flask with SSL, here is a quick rundown of that.
For JWT with Flask, here is that.
You can use any database system you would like.
If I understood you correctly, you can use any web framework in Python. For instance, you can use Flask (I use it and I like it). Django is also a popular choice among Python web frameworks. However, you shouldn't feel limited to these two; there are plenty of them out there. Just google for them.
The implementation of the client depends on what kind of communication there will be between the clients and the server - I don't have enough details here. I only know it's unidirectional.
The client can be a browser accessing your web application written in Flask, where users send only POST requests to the server. However, even here the communication will be bidirectional (the client needs to open the page, which means the server sends responses back to the client), which violates your initial requirement.
Alternatively, it can be a specific client written in Python sending particular requests to your server over HTTP/HTTPS. For instance, your client can use the requests package to send HTTP requests.

Can push notifications be done with an AngularJS+Flask stack?

I have a Python/Flask backend and an Angular frontend for my website. At the backend there is a process that occasionally checks SQS for messages, and I want it to push a notification to the client, which can then in turn update an Angular controller. What is the best way to do this with my existing technologies?
To be able to push to the client, you'll have to implement web socket support in some fashion. If you want to keep it in python/flask, there is this tutorial on how to do that with gevent:
http://www.socketubs.org/2012/10/28/Websocket_with_flask_and_gevent.html
In that article, Geoffrey also mentions a SocketIO-compatible library for Python/gevent, called "gevent-socketio", that may let you leverage the SocketIO client-side JS library.
That may reduce how much work you have to do in terms of cross-browser compatibility since SocketIO has done a lot of that already.
Here is a pretty good tutorial on how to use SocketIO in AngularJS so that you can notify the AngularJS model when an event comes in from SocketIO:
http://www.html5rocks.com/en/tutorials/frameworks/angular-websockets/
If you don't want to host the web socket backend, you could look to a hosted service like PubNub or Pusher and then integrate them into AngularJS as a service. You can communicate with these services through your Python app (when the SQS notification happens) and they'll notify all connected clients for you.
I know this is a bit late, but I have done pretty much exactly what you ask for (though without Angular).
I ended up having a separate process running a websocket server called Autobahn, which listens on a Redis pub/sub channel; the code is here on GitHub:
This allows you to send push notifications to your clients from pretty much anything that can access redis.
So when I want to publish a message to all my connected clients I just use Redis like this:
import redis

r = redis.Redis()
r.publish('broadcasts', "SOME MESSAGE")
This has worked fairly well so far. What I can't currently do is send a push notification to a specific client. But if you have an authentication system or something else to identify a specific user, you could tie that to the open websockets and then be able to send messages directly to a specific client :-)
You could of course use any websocket server or client (like socket.io or sock.js), but this has worked great for me :-)
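The fan-out that the Autobahn + Redis setup performs, including the per-client targeting mentioned above, can be modeled in plain Python like this (class and method names are my own, for illustration):

```python
import queue
from typing import Dict

class Broadcaster:
    """Fan messages out to connected clients, like Redis pub/sub does."""

    def __init__(self) -> None:
        self.clients: Dict[str, queue.Queue] = {}

    def connect(self, client_id: str) -> queue.Queue:
        # One queue per open websocket; the server drains it to the socket
        q: queue.Queue = queue.Queue()
        self.clients[client_id] = q
        return q

    def publish(self, message: str) -> None:
        # Broadcast: every connected client gets its own copy
        for q in self.clients.values():
            q.put(message)

    def send_to(self, client_id: str, message: str) -> None:
        # Targeted push, possible once sockets are tied to authenticated users
        self.clients[client_id].put(message)
```

The send_to method is exactly the piece that becomes possible once you map authenticated users to their open websockets.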

rabbitmq python - notifications design pattern to scale for thousands of users

Our website has around 50,000 users and daily active traffic is pretty good. We are designing a new notifications feature for our user base.
Our requirement is as follows:
Users are part of different Groups.
A user can be part of multiple Groups.
When a user uploads an image in a group, all the members of that particular group should get a notification saying "new image uploaded", regardless of being online or offline.
We thought of creating RabbitMQ exchanges for each group and a queue for each user, but got confused about the right way to design it going forward!!
Say a user should receive notifications even if he logs in to the site days after the notification was generated. We ended up storing the messages in the DB, which is not a good thing at all for offline users.
Can someone suggest proper design pattern with explanation for this use case? We are using celery + rabbitmq + tornado. Should tornado talk directly to the celery consumer? Where do the messages get stored when the user is offline?
I have a similar project. Here is how it works:
Put messages onto your RabbitMQ queue from your event sources (Django, Celery, anywhere).
Use pika with the Tornado IOLoop (so Tornado is notified via the pika loop when messages arrive, alongside the HTTP requests and WebSocket connections it is already serving).
Keep a collection of the WebSockets opened in your Tornado application and use it to message users.
You can check out a very similar project with logging via Tornado + RabbitMQ on my GitHub: https://github.com/rmuslimov/RapidLog
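The exchange-per-group, queue-per-user idea from the question, including notifications that survive until an offline user returns, can be modeled in memory roughly like this (names are illustrative; in production the per-user inbox would live in RabbitMQ or the database, not in a dict):

```python
from collections import defaultdict, deque

class GroupNotifier:
    """In-memory model of one 'exchange' per group, one 'queue' per user."""

    def __init__(self) -> None:
        self.members = defaultdict(set)    # group -> set of user ids
        self.inbox = defaultdict(deque)    # user  -> pending notifications

    def join(self, user: str, group: str) -> None:
        # Users can be members of multiple groups
        self.members[group].add(user)

    def image_uploaded(self, uploader: str, group: str) -> None:
        # Fan the notification out to every other member of the group;
        # offline users simply accumulate messages in their inbox
        for user in self.members[group]:
            if user != uploader:
                self.inbox[user].append(f"new image uploaded in {group}")

    def fetch(self, user: str) -> list:
        # Called when the user comes online; drains their pending inbox
        msgs = list(self.inbox[user])
        self.inbox[user].clear()
        return msgs
```

Online users would have fetch driven by an open Tornado WebSocket; offline users drain the same inbox on their next login, which is why the inbox needs durable storage in a real deployment.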
