TL;DR: I am trying to run a client that fetches data from a websockets URI, and then use Flask to serve the fetched data.
My intended workflow looks like: read data from websockets (async) -> write data to a Python dict (continuously) -> read data from the Python dict via GET requests on Flask (continuously).
The problem I am facing is that I am using a Python dictionary for storage, but when I read that dictionary from Flask, it does not show me updated values.
I have created an illustration of my problem with code:
Client:
import asyncio
import websockets
import json

SOME_URI = "ws://localhost:8080/foo"

connections = set()
connections.add(SOME_URI)


class Storage:
    storage = dict()  # local storage dict for simplicity

    @staticmethod
    def store(data):  # here I store the value
        a, b, c = data
        Storage.storage[a] = c

    @staticmethod
    def show():  # here I just show the value to be used in the GET
        return Storage.storage


async def consumer_handler(uri):
    async with websockets.connect(uri) as websocket:
        async for message in websocket:
            await consumer(message)


async def consumer(message):
    line = json.loads(message)
    Storage.store(line)  # adds message to dict


async def main():
    await asyncio.wait([consumer_handler(uri) for uri in connections])


if __name__ == "__main__":
    asyncio.run(main())
App:
from flask import Flask
from client import Storage

app = Flask(__name__)
app.debug = True


@app.route('/bar', methods=['GET'])
def get_instruments():
    res = Storage.show()  # I expected here to see updated values for the dict as it fills up from websockets
    return res, 200


if __name__ == "__main__":
    app.run()
Whenever I make a GET request to the Flask endpoint, I get an empty dict (it does not reflect the changes added from the websocket). I was expecting to see an updated dict with every GET request.
The problem is that you have two separate Python processes involved here -- one is the async script ingesting data and the other is running Flask. Since these are separate processes, the dict updates made in the first process are not visible to the second one.
You need some form of IPC (Inter-Process Communication) mechanism here for data portability and/or persistence. One obvious and easy option to try is a named FIFO or a plain file. Depending on the level of complexity you want to handle, using Redis (or similar) is another option.
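For illustration, here is a minimal sketch of the Redis option, assuming a local Redis server and the redis-py package; the hash name shared_storage is just an example. The websocket consumer writes into a Redis hash instead of the in-process dict, and the Flask process reads it back on each GET:

# --- writer side (in the websocket client, replacing Storage) ---
import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

def store(data):
    a, b, c = data
    r.hset("shared_storage", a, c)  # visible to any other process talking to Redis

# --- reader side (in the Flask app; it would create its own redis.Redis connection) ---
from flask import Flask, jsonify

app = Flask(__name__)

@app.route('/bar', methods=['GET'])
def get_instruments():
    return jsonify(r.hgetall("shared_storage")), 200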
I do not want to await my function. Look at this code:
loop = asyncio.new_event_loop() # Needed because of flask server
asyncio.set_event_loop(loop) # Needed because of flask server
task = loop.create_task(mainHandler(code, send, text)) #Start mainHandler
return formResponse(True, "mainHandler Started") # Run this without awaiting the task
This is part of a function called on a Flask server endpoint. I don't really care how mainHandler is started; all I want is for it to start and for the function (not mainHandler) to return immediately without awaiting. I have tried scheduling tasks, using futures, and just running it, but all to no avail. Even this question describing exactly what I need did not help: How to run an Asyncio task without awaiting? Does anyone have experience with this?
When trying to use asyncio.create_task in this example:
from flask import Flask
from flask import request
import asyncio

app = Flask(__name__)

async def mainHandler(code: str, sends: int, sendText: str):
    await asyncio.sleep(60)  # simulate the time the function takes

@app.route('/endpoint')
def index():
    code = request.args.get('code', default="NOTSET", type=str)
    send = request.args.get('send', default=0, type=int)
    text = request.args.get('text', default="NOTSET", type=str).replace("+", " ")
    if (code != "NOTSET" and send > 0 and text != "NOTSET"):
        asyncio.create_task(mainHandler(code, send, text))
        return {"error": False}
    else:
        return {"error": True}

if __name__ == '__main__':
    app.debug = True
    app.run(host="0.0.0.0", port=80)
It throws:
RuntimeError: no running event loop
when calling asyncio.create_task(mainHandler(code, send, text)).
To sum this up for everyone coming here in the future:
The version of Flask I used was sync. I switched to Sanic, which is similar in syntax (only minor differences), faster, and supports async via asyncio.ensure_future(mainHandler(code, send, text)).
According to gre_gor (thank you!), you should also be able to run pip install flask[async] to install Flask with async support.
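For reference, here is a minimal sketch of the Sanic variant; the handler and argument names just mirror the Flask example above, so treat it as an outline rather than a drop-in replacement:

import asyncio

from sanic import Sanic
from sanic.response import json

app = Sanic("background-task-demo")

async def mainHandler(code: str, sends: int, sendText: str):
    await asyncio.sleep(60)  # simulate the time the function takes

@app.get('/endpoint')
async def index(request):
    code = request.args.get('code', "NOTSET")
    send = int(request.args.get('send', 0))
    text = request.args.get('text', "NOTSET").replace("+", " ")
    if code != "NOTSET" and send > 0 and text != "NOTSET":
        # the handler runs inside Sanic's event loop, so scheduling works here
        asyncio.ensure_future(mainHandler(code, send, text))
        return json({"error": False})
    return json({"error": True})

if __name__ == '__main__':
    app.run(host="0.0.0.0", port=80)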
I made an API for my AI model, but I would like to avoid downtime when I update the model. I'm looking for a way to load the new model in the background and, once it's loaded, swap the old model for the new one. I tried passing values between subprocesses, but it doesn't work well. Do you have any idea how I can do that?
You can place the serialized model in raw storage, like an S3 bucket if you're on AWS. In S3's case, you can use bucket versioning, which might prove helpful. Then set up some sort of trigger. You can definitely get creative here, and I've thought about this a lot. In practice, the best options I've tried are:
1. Set up an endpoint that, when called, opens the new model from wherever you store it. Set up a webhook on the storage/S3 bucket that sends a quick automated call to that endpoint and auto-loads the new item (a minimal sketch of this option follows below).
2. Same as #1, but you call the endpoint manually. In both cases you'll really want some security on that endpoint, or anyone who finds your site can absolutely abuse your stack.
3. Set a timer at startup that calls a given function nightly, running within the application itself. The function is invoked and then goes and reloads the model.
There could be other ideas I'm not smart enough (yet!) to use; I'm just trying to start some dialogue.
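As a rough illustration of option 1 (a sketch only: load_model, the route name, and the X-Reload-Token header are made-up placeholders, not an established API):

import os
from flask import Flask, request, abort

app = Flask(__name__)

def load_model():
    # hypothetical helper: deserialize the model from disk/S3 and return it
    return object()  # placeholder for the real model object

model = load_model()  # the model your prediction endpoints use

@app.route('/reload-model', methods=['POST'])
def reload_model():
    # basic protection so only the storage webhook can trigger a reload
    if request.headers.get('X-Reload-Token') != os.environ.get('RELOAD_TOKEN'):
        abort(403)
    global model
    model = load_model()  # swap in the freshly loaded model
    return {'reloaded': True}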
I found a way to do it with async and multiprocessing:
import asyncio
import time
from multiprocessing import Process, Manager

from uvicorn import Server, Config
from fastapi import FastAPI

app = FastAPI()

value = {"latest": 1, "b": 2}

@app.get("/")
async def root():
    global value
    return {"message": value}

def background_loading(d):
    time.sleep(2)
    d["test"] = 3

async def update():
    global value
    while True:
        manager = Manager()
        d = manager.dict()
        p1 = Process(target=background_loading, args=(d,))
        p1.daemon = True
        p1.start()
        while p1.is_alive():
            await asyncio.sleep(5)
        print(f'Updating value to {d}')
        value = d

if __name__ == "__main__":
    loop = asyncio.new_event_loop()
    config = Config(app=app, loop=loop)
    server = Server(config)
    loop.create_task(update())
    loop.run_until_complete(server.serve())
OK, so I'm doing a project on collecting health details of a remote server using Python, and I'm hosting the main server with Flask. But I don't know how to send the health report, which I have created in Python, to the Flask app. The health report is a dictionary, and I need to pass its values into database columns named after the dictionary's keys. Can someone please help me with sending the health report to the Flask app? The health report is generated on another system, and I need to send it to my main server.
import psutil
import time
import json
import requests

'''
This program will be loaded on to the target server.
A flask app will transmit health data to the main flask app.
'''

SERVER_NAME = "test_local_server"

def getHealth():  # function for generating health report. Returns a json object.
    print('generating health report')
    report = {}
    report['sever_name'] = SERVER_NAME
    report['cpupercent'] = psutil.cpu_percent(interval=2.0)
    report['ctime'] = psutil.cpu_times()
    report['cpu_total'] = report['ctime'].user + report['ctime'].system
    report['disk_usages'] = psutil.disk_usage("/")
    report['net'] = psutil.net_io_counters()
    report['bytes_sent'] = report['net'].bytes_sent
    report['bytes_received'] = report['net'].bytes_recv
    report['packets_sent'] = report['net'].packets_sent
    report['packets_received'] = report['net'].packets_recv
    report['mem'] = psutil.virtual_memory()
    report['memory_Free'] = report['mem'].free
    json_report = json.dumps(report)
    return json_report

if __name__ == '__main__':
    print(f'starting health report stream for server :\t{SERVER_NAME}')
    while True:
        getHealth()
This is the code for generating the health details. How do I send this back to my Flask app in the form of a dictionary?
Client
I would start by simplifying that code somewhat:
import psutil

STATS_URL = 'http://localhost:5000/'
SERVER_NAME = "test_local_server"

def get_health():
    print('generating health report')

    cpu_percent = psutil.cpu_percent(interval=2.0)
    cpu_times = psutil.cpu_times()
    disk_usage = psutil.disk_usage("/")
    net_io_counters = psutil.net_io_counters()
    virtual_memory = psutil.virtual_memory()

    # The keys in this dict should match the db cols
    report = dict(
        sever_name=SERVER_NAME,
        ctime=cpu_times.__str__(),
        disk_usages=disk_usage.__str__(),
        net=net_io_counters.__str__(),
        mem=virtual_memory.__str__(),
        cpupercent=cpu_percent,
        cpu_total=cpu_times.user + cpu_times.system,
        bytes_sent=net_io_counters.bytes_sent,
        bytes_received=net_io_counters.bytes_recv,
        packets_sent=net_io_counters.packets_sent,
        packets_received=net_io_counters.packets_recv,
        memory_Free=virtual_memory.free,
    )

    return report
This get_health function builds and returns a report dictionary. Notice that for some of the return values from the psutil functions, I've used the built-in __str__ method. This ensures a friendly type to insert into the database.
If you want to check the types yourself, you can do something like:
for item in report:
    print(item, type(report[item]), report[item])
Next, have this function run in a loop, with the desired time delay between requests:
if __name__ == '__main__':
    import time
    import requests

    print(f'starting health report stream for server :\t{SERVER_NAME}')

    while True:
        report = get_health()
        r = requests.post(STATS_URL, json=report)
        print(r, r.json())
        time.sleep(1)
Notice this uses the json argument to requests.post, which automatically sets the correct Content-Type header that Flask's request.get_json function expects.
Server
This is pretty easy to receive:
from flask import Flask, request

app = Flask(__name__)

@app.route('/', methods=['POST'])
def index():
    incoming_report = request.get_json()
    add_to_db(incoming_report)  # We'll build this in a sec.
    return {'message': 'success'}
You can now work with incoming_report which is a dictionary.
This also sends a success message back to the client, so on the client you'll see the output:
starting health report stream for server : test_local_server
generating health report
<Response [200]> {'message': 'success'}
# Repeats until killed
Database
and I need to pass the values of the dictionary into columns which are the keys of the dictionary in my database
Now that you have a dictionary incoming_report, it should be easy to add it to your database if you're using an ORM.
Something along the lines of this answer should allow you to simply unpack that dictionary. So, assuming your model is called Report, you could do something like:
def add_to_db(d):
    report = Report(**d)
    db.session.add(report)
    db.session.commit()
Note this could probably use some validation, and authentication if your deployment requires this.
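For completeness, here is a hypothetical sketch of what that Report model could look like with Flask-SQLAlchemy; the column names just mirror the keys of the report dict built above (including the sever_name spelling), and the column types are assumptions:

from flask_sqlalchemy import SQLAlchemy

db = SQLAlchemy()

class Report(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    sever_name = db.Column(db.String(64))
    ctime = db.Column(db.String(255))
    disk_usages = db.Column(db.String(255))
    net = db.Column(db.String(255))
    mem = db.Column(db.String(255))
    cpupercent = db.Column(db.Float)
    cpu_total = db.Column(db.Float)
    bytes_sent = db.Column(db.BigInteger)
    bytes_received = db.Column(db.BigInteger)
    packets_sent = db.Column(db.BigInteger)
    packets_received = db.Column(db.BigInteger)
    memory_Free = db.Column(db.BigInteger)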
I have made a Flask application which sends and receives messages from the user and generates a reply from a chatbot backend I have created. When I run my app.py file and go to my localhost, it runs fine if I only open one instance; however, if I try to open multiple instances, they all try to use the same bot. How do I create a unique bot for each session? I tried using g.bot = mybot(), but the problem was it still kept creating a new bot each time the user replied to the bot. I am relatively new to this, so links to a detailed explanation would be appreciated. Note: some pieces of code are unrelated junk from previous versions.
app = Flask(__name__)

items = ["CommonName", "Title",
         "Department", "Address", "City", "PhoneNum"]

app.config.from_object(__name__)

bot2 = Bot2()

@app.before_request
def before_request():
    session['uid'] = uuid.uuid4()
    print(session['uid'])
    g.bot2 = Bot2()

@app.route("/", methods=['GET'])
def home():
    return render_template("index.html")

@app.route("/tables")
def show_tables():
    data = bot2.df
    if data.size == 0:
        return render_template('sad.html')
    return render_template('view.html', tables=[data.to_html(classes='df')], titles=items)

@app.route("/get")
def get_bot_response():
    userText = request.args.get('msg')
    bot2.input(str(userText))
    print(bot2.message)
    g.bot2.input(str(userText))
    print(g.bot2.message)
    show_tables()
    if (bot2.export):
        return (str(bot2.message) + "<br/>\nWould you like to narrow your results?")
        # return (str(bot.message) + "click here" + "</span></p><p class=\"botText\"><span> Would you like to narrow your results?")
    else:
        return (str(bot2.message))

if __name__ == "__main__":
    app.secret_key = 'super secret key'
    app.run(threaded=True)
Problem: A new bot is created each time the user replies to the bot
Reason: app.before_request runs before each request your Flask server receives, hence each reply creates a new Bot2 instance. You can read more about it here.
Problem: Creating a bot per instance
Possible Solution: I'm not really sure what you mean by opening multiple instances (are you trying to run multiple instances of the same server, or are multiple people accessing a single server?). I would suggest reading up on sessions in Flask and storing a Bot2 instance per session as a server-side variable. You can read more about it here and here.
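As a rough sketch of that idea (assuming Bot2 is importable and a single-process server, so a module-level dict is an acceptable server-side store), you could key the bots by the session uid so the cookie only carries the id, never the bot itself:

import uuid
from flask import Flask, session, request

# assumption: adjust this import to wherever Bot2 is defined in your project
from bot2 import Bot2

app = Flask(__name__)
app.secret_key = 'super secret key'

bots = {}  # uid -> Bot2 instance, kept server-side only

def get_bot():
    if 'uid' not in session:
        session['uid'] = str(uuid.uuid4())  # created once per browser session
    uid = session['uid']
    if uid not in bots:
        bots[uid] = Bot2()  # one bot per session, not per request
    return bots[uid]

@app.route("/get")
def get_bot_response():
    user_text = request.args.get('msg', '')
    bot = get_bot()
    bot.input(str(user_text))
    return str(bot.message)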
You could try assigning the session id from flask-login to the bot instead of creating a unique id in before_request. Right now, as @Imma said, a new bot is created for every request.
You will have to store an array of class instances. When a session is created, i.e., a user or an anonymous user logs in, a bot/class instance gets created and is pushed onto the array.
You can keep the array in the session object... do note that the session object gets transferred to the front end in the form of a cookie, so you may potentially be exposing all the chat sessions. Also, many users will unnecessarily slow down the response.
Another alternative is to create a separate container and run the bot as a separate microservice rather than integrating it with your existing Flask application (this is what we ended up doing).
Delete the line bot2 = Bot2(), and change all the references to bot2 to g.bot2.
I'm developing a simple API in Flask; it isn't REST or anything.
There's a module that returns real-time data in a list.
The module in question
# module.py

def get_data():
    do_something()
    return [info_from_somewhere, info_from_other_place]
And the app
# app.py
from flask import Flask, jsonify
import module

app = Flask(__name__)

@app.route('/')
def get_data():
    return jsonify(data=module.get_data()[0])
The thing is, this would run the function every time someone requests that route. And since the data is the same for everyone, I want to serve it on every request but run the function just once.
Edit: I tried this:
got_data = module.get_data()

@app.route('/')
def get_data():
    return jsonify(data=got_data[0])
It works, but it doesn't refresh the list. So my question would be: how can I refresh it every second? I tried sleep, but it freezes my app.
You could achieve this with Celery. From the project page:
Celery is an asynchronous task queue/job queue based on distributed message passing. It is focused on real-time operation, but supports scheduling as well.
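As a rough sketch of that approach (assumptions: a local Redis broker, the redis-py package, and the module from the question; the key name latest_data is illustrative), a Celery beat schedule can refresh the data every second while the Flask view only reads the cached value:

# tasks.py -- run with a celery worker plus celery beat
import json

import redis
from celery import Celery

import module  # the module with get_data() from the question

r = redis.Redis(host="localhost", port=6379, decode_responses=True)
celery_app = Celery("tasks", broker="redis://localhost:6379/0")

celery_app.conf.beat_schedule = {
    "refresh-data-every-second": {"task": "tasks.refresh_data", "schedule": 1.0},
}

@celery_app.task
def refresh_data():
    # cache the freshest value where the Flask app can read it
    r.set("latest_data", json.dumps(module.get_data()))

# in app.py the route would then just read the cache:
# @app.route('/')
# def get_data():
#     return jsonify(data=json.loads(r.get("latest_data"))[0])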
Another solution would be to spawn a thread that updates the data every second, but this can get tricky really fast:
from threading import Timer
from flask import Flask

app = Flask(__name__)

DATA = "data"

def update_data(interval):
    Timer(interval, update_data, [interval]).start()
    global DATA
    DATA = DATA + " updating..."

# update data every second
update_data(1)

@app.route("/")
def index():
    return DATA

if __name__ == "__main__":
    app.run(debug=True)