Flask remove session variable from thread - python

I'm trying to implement a voting system. It works like this: if a user votes on a post, I record the temporary state in session variables: upvoted, starred, etc.
If the current user hasn't voted before, I save the results to a temporary table. The user may change the vote within 5 minutes. After 5 minutes pass, the results are written to the database permanently by a thread.
I'd like to clear the temporary session variables after the results are written to the database. Is there a way to achieve this?
task.py
import threading
import time
from flask import Flask, copy_current_request_context, session
from threading import Lock

from vote import voteQuestion

app = Flask(__name__)
app.secret_key = "appkey"

@app.before_first_request
def initializeTask():
    @copy_current_request_context
    def runTask():
        while True:
            voteQuestion()
            time.sleep(10)

    task = threading.Thread(target=runTask)
    task.start()

@app.route('/vote')
def vote():
    session['test'] = "This is a test"
    return 'success'

@app.route("/")
def hello():
    return "Hello world!"

if __name__ == "__main__":
    app.run()
vote.py
from flask import session

def voteQuestion():
    print('session variables', session.items())
    result = session.get('test', 'not set')
    print('result ', result)
    if 'test' in session:
        session.pop('test', None)
    print('Running in the background')

No, it's not possible. The request is over, the session in that thread is essentially a read-only copy. Writing to it won't do anything because there's no response to carry the updated cookie to the browser.
It would make more sense to store the timestamp in the temporary table when you store the temporary vote, rather than trying to do something with threads and the session.
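A minimal sketch of that approach, assuming SQLAlchemy models named TempVote and Vote with a created_at column (all names here are illustrative, not from the question):

from datetime import datetime, timedelta

# db, TempVote and Vote are assumed Flask-SQLAlchemy objects defined elsewhere.

def save_temp_vote(user_id, post_id, value):
    # Store the timestamp alongside the temporary vote
    db.session.add(TempVote(user_id=user_id, post_id=post_id,
                            value=value, created_at=datetime.utcnow()))
    db.session.commit()

def finalize_old_votes():
    # Called periodically (from a background worker or cron job):
    # promote votes older than 5 minutes to the permanent table
    cutoff = datetime.utcnow() - timedelta(minutes=5)
    for temp in TempVote.query.filter(TempVote.created_at <= cutoff):
        db.session.add(Vote(user_id=temp.user_id, post_id=temp.post_id,
                            value=temp.value))
        db.session.delete(temp)
    db.session.commit()

With the timestamp in the table, no per-user session cleanup is needed at all: the vote is simply considered final once it is older than the cutoff.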

Related

Python threading in flask webapp

I've been experimenting with threads in my web app recently, and I've come across an issue that I can't seem to solve.
The issue is that I have an index page, and every time a user enters that page, a new thread is started which checks for changes in my database in a while loop, even though I only want one thread running at that moment. Is there a way to "kill" the existing thread when the index page is accessed a second time and then start a new one?
I did try using is_alive() to check whether a thread is already running, but had no success, since each request creates a different thread object.
Code example below:
#app.route("/")
def index():
#copy_current_request_context
def check_for_updates():
while True:
...... # Query for information
if something_changed:
socketio.emit('new_notifications', {'data': new_data})
if index_opened_again:
break
sleep(5)
index_thread = threading.Thread(target=check_for_updates, daemon=True)
index_thread.start()
return render_template("index.html")
I use the code below to kill threads when I am exiting a server.
My suggestion is to kill all active threads before starting a new one:
import ctypes
import threading

for thread_id, thread in threading._active.items():
    ctypes.pythonapi.PyThreadState_SetAsyncExc(ctypes.c_long(thread_id), ctypes.py_object(SystemExit))
In my example I use a global variable in combination with a lock. It's certainly not optimal, but you can check if a thread is already running.
If you're using flask-socketio anyway, I recommend you take a look at start_background_task. The functionality is compatible with that of a standard thread.
from threading import Lock
from time import sleep

from flask import Flask, render_template
from flask_socketio import SocketIO

app = Flask(__name__)
app.secret_key = 'your secret here'
sio = SocketIO(app)

thread = None
thread_lock = Lock()

def background_task():
    while True:
        # ...
        sleep(5)

@app.route('/')
def index():
    global thread
    with thread_lock:
        if thread is None:
            thread = sio.start_background_task(background_task)
    return render_template('index.html')

if __name__ == '__main__':
    sio.run(app)
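The lock guarantees that only one background task is started even if several requests hit the route at the same time, and start_background_task uses whichever mechanism matches the configured async mode (threading, eventlet or gevent), so the same code works under any of them.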

How to send a POST request in the form of a dictionary to my flask main server

OK, so I'm doing a project on finding the health details of a remote server using Python, and I'm hosting the main server with Flask. But I don't know how to send the health report, which I generate with Python, to the Flask app. The health report is a dictionary, and I need to insert its values into database columns named after the dictionary's keys. Can someone please help me send the health report to the Flask app? The report is generated on another system and needs to be sent to my main server.
import psutil
import time
import json
import requests

'''
This program will be loaded on to the target server.
A flask app will transmit health data to the main flask app.
'''

SERVER_NAME = "test_local_server"

def getHealth():  # function for generating health report. Returns a json object.
    print('generating health report')
    report = {}
    report['sever_name'] = SERVER_NAME
    report['cpupercent'] = psutil.cpu_percent(interval=2.0)
    report['ctime'] = psutil.cpu_times()
    report['cpu_total'] = report['ctime'].user + report['ctime'].system
    report['disk_usages'] = psutil.disk_usage("/")
    report['net'] = psutil.net_io_counters()
    report['bytes_sent'] = report['net'].bytes_sent
    report['bytes_received'] = report['net'].packets_recv if False else report['net'].bytes_recv
    report['packets_sent'] = report['net'].packets_sent
    report['packets_received'] = report['net'].packets_recv
    report['mem'] = psutil.virtual_memory()
    report['memory_Free'] = report['mem'].free
    json_report = json.dumps(report)
    return json_report

if __name__ == '__main__':
    print(f'starting health report stream for server :\t{SERVER_NAME}')
    while True:
        getHealth()
This is the code for generating the health details. How do I send it back to my Flask app in the form of a dictionary?
Client
I would start by simplifying that code somewhat:
import psutil

STATS_URL = 'http://localhost:5000/'
SERVER_NAME = "test_local_server"

def get_health():
    print('generating health report')

    cpu_percent = psutil.cpu_percent(interval=2.0)
    cpu_times = psutil.cpu_times()
    disk_usage = psutil.disk_usage("/")
    net_io_counters = psutil.net_io_counters()
    virtual_memory = psutil.virtual_memory()

    # The keys in this dict should match the db cols
    report = dict(
        sever_name = SERVER_NAME,
        ctime = cpu_times.__str__(),
        disk_usages = disk_usage.__str__(),
        net = net_io_counters.__str__(),
        mem = virtual_memory.__str__(),
        cpupercent = cpu_percent,
        cpu_total = cpu_times.user + cpu_times.system,
        bytes_sent = net_io_counters.bytes_sent,
        bytes_received = net_io_counters.bytes_recv,
        packets_sent = net_io_counters.packets_sent,
        packets_received = net_io_counters.packets_recv,
        memory_Free = virtual_memory.free,
    )

    return report
This get_health function builds and returns a report dictionary. Notice that for some of the return values from the psutil functions, I've used the built-in __str__ method. This ensures a type that can be inserted into the database cleanly.
If you want to check the types yourself, you can do something like:
for item in report:
    print(item, type(report[item]), report[item])
Next have this function run in a loop, with a desired time delay between requests:
if __name__ == '__main__':
    import time
    import requests

    print(f'starting health report stream for server :\t{SERVER_NAME}')

    while True:
        report = get_health()
        r = requests.post(STATS_URL, json=report)
        print(r, r.json())
        time.sleep(1)
Notice this uses the json argument to requests.post, which automatically sets the Content-Type header that Flask's request.get_json function expects.
Server
This is pretty easy to receive:
from flask import Flask, request

app = Flask(__name__)

@app.route('/', methods=['POST'])
def index():
    incoming_report = request.get_json()
    add_to_db(incoming_report)  # We'll build this in a sec.
    return {'message': 'success'}
You can now work with incoming_report which is a dictionary.
This also sends a success message back to the client, so on the client you'll see the output:
starting health report stream for server : test_local_server
generating health report
<Response [200]> {'message': 'success'}
# Repeats until killed
Database
I need to insert its values into database columns named after the dictionary's keys
Now that you have a dictionary incoming_report it should be easy to add this to your database if you're using an ORM.
Something along the lines of this answer should allow you to simply unpack that dictionary. So assuming your model is called Report you could simply do something like:
def add_to_db(d):
    report = Report(**d)
    db.session.add(report)
    db.session.commit()
Note: this could probably use some validation, and authentication if your deployment requires it.
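For completeness, a minimal sketch of what such a Report model could look like with Flask-SQLAlchemy; the column types are guesses and the whole model is illustrative, not something from the question:

from flask_sqlalchemy import SQLAlchemy

# Assumes the Flask `app` from the server snippet above
db = SQLAlchemy(app)

class Report(db.Model):
    # Column names mirror the keys of the report dictionary
    id = db.Column(db.Integer, primary_key=True)
    sever_name = db.Column(db.String(64))
    ctime = db.Column(db.Text)
    disk_usages = db.Column(db.Text)
    net = db.Column(db.Text)
    mem = db.Column(db.Text)
    cpupercent = db.Column(db.Float)
    cpu_total = db.Column(db.Float)
    bytes_sent = db.Column(db.BigInteger)
    bytes_received = db.Column(db.BigInteger)
    packets_sent = db.Column(db.BigInteger)
    packets_received = db.Column(db.BigInteger)
    memory_Free = db.Column(db.BigInteger)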

Flask App with Slow Queries, Multiple Client Users, and Hosted on Kubernetes

I've got a Flask app in which I hope to accomplish the following things:
Have an endpoint that will run a series of queries
This endpoint needs to respond to the HTTP request within a limited number of seconds.
The queries can take up to several minutes to finish so I need them to run in a separate thread, with multiple clients polling the server every so often to see if they have fresh data to be returned to them
Hopefully hosted on Kubernetes with multiple instances of the pod running.
My implementation below has several issues:
The poll endpoint seems unnecessarily large; most of it is just dealing with the Queue of queries and making sure that each client gets their own results back, and not someone else's.
Not sure what is going on, but when I try to host more than one instance of this pod on Kubernetes, it's like some poll requests from some users are being sent to instances where their uuid does not exist.
I'm hoping for some insight into what I'm doing wrong with threading and Queues, because this seems like a hacky way of doing it. Also, how can I make the results of these queries available to all running instances on Kubernetes?
Thanks!
from flask import Flask, render_template, request, jsonify, g
from Queue import Queue
from threading import Thread
from time import sleep

app = Flask(__name__, template_folder='Templates')

@app.route('/')
def index():
    return render_template('index.html')

@app.before_first_request
def before_first_request():
    g.output = Queue()
    g.data_results = {}
    return ""

@app.route('/data')
def data():
    """
    Endpoint hit to fire off a request for data from a given user (uuid)
    """
    params = request.args.to_dict()
    uuid = params['uuid']

    # Create a list for this user, to store their results
    g.data_results[uuid] = []

    list_of_queries = ["SELECT * FROM tbl1;",
                       "SELECT * FROM tbl2;",
                       "SELECT * FROM tbl3;"]

    for query in list_of_queries:
        t = Thread(target=worker, args=(query, uuid, g.output))
        t.daemon = True
        t.start()

    return jsonify({'msg':'Queries started'})

def worker(*args):
    query, uuid, output = args
    # Will actually be something like `result = run_query(query)`
    result = {'uuid':uuid}
    sleep(10)
    output.put(result)

@app.route('/poll')
def poll():
    """
    Endpoint hit every x seconds from frontend
    to see if the data is ready
    """
    params = request.args.to_dict()
    uuid_from_client = params['uuid']

    # If client polls for result, but server has no record of this uuid
    # This can happen in kubernetes with multiple instances running
    if g.data_results.get(uuid_from_client) is None:
        return jsonify({'msg':'pong', 'data':None, 'freshdata':None})

    try:
        output = g.output
        # This line throws an error if there is nothing to get
        results = output.get(False)
        output.task_done()
        # What is the uuid associated with the most recently returned data
        # More than 1 chunk of data can be in here
        uuid_from_data = results['uuid']
        g.data_results[uuid_from_data].append(results)
    except:
        uuid_from_data = None
        results = None

    results_for_client_uuid = g.data_results[uuid_from_client]
    if len(results_for_client_uuid) > 0:
        res = results_for_client_uuid.pop(0)
    else:
        res = None

    return jsonify({'msg':'pong', 'data':res})

if __name__ == "__main__":
    with app.app_context():
        app.run(host='0.0.0.0')
Set up your app architecture to use queuing software so that there is a separation of concerns in terms of which component does which job.
Here is a great article that can give you some insight: http://blog.gorgias.io/deploying-flask-celery-with-docker-and-kubernetes/
and one more https://endocode.com/blog/2015/03/24/using-googles-kubernetes-to-build-a-distributed-task-management-cluster/
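To make that concrete, here is a minimal sketch of how the two endpoints could look with Celery; the Redis broker URL, task body and SQL strings are illustrative assumptions, not something from the linked articles:

from celery import Celery
from flask import Flask, jsonify, request

app = Flask(__name__)

# Assumed Redis broker/backend for illustration; any broker Celery supports works
celery = Celery(__name__,
                broker='redis://localhost:6379/0',
                backend='redis://localhost:6379/0')

@celery.task
def run_queries(queries):
    # Placeholder for the real, slow query logic
    return [{'query': q, 'rows': []} for q in queries]

@app.route('/data')
def data():
    task = run_queries.delay(["SELECT * FROM tbl1;", "SELECT * FROM tbl2;"])
    # The Celery task id replaces the hand-rolled uuid bookkeeping
    return jsonify({'task_id': task.id})

@app.route('/poll')
def poll():
    result = celery.AsyncResult(request.args['task_id'])
    if result.ready():
        return jsonify({'msg': 'pong', 'data': result.get()})
    return jsonify({'msg': 'pong', 'data': None})

Because the task results live in the shared broker/result backend rather than in one process's memory, any pod can answer the poll request, which is what removes the multi-instance problem.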

How to run a function once per second separated from the requests on Flask?

I'm developing a simple API in Flask; it isn't REST or anything.
There's a module that returns real-time data in a list.
The module in question
# module.py
def get_data():
    do_something()
    return [info_from_somewhere, info_from_other_place]
And the app
# app.py
import module
@app.route('/')
def get_data():
    return jsonify(data=module.get_data()[0])
The thing is, this would run the function every time someone requests that route. Since the data is the same for everyone, I want to return it on every request while running the function just once.
Edit: I tried this:
got_data = module.get_data()
@app.route('/')
def get_data():
    return jsonify(data=got_data[0])
It works, but it doesn't refresh the list. So my question would be: how can I refresh it every second? I tried sleep, but it freezes my app.
You could achieve this with Celery. From the project page:
Celery is an asynchronous task queue/job queue based on distributed message passing. It is focused on real-time operation, but supports scheduling as well.
Another solution would be to spawn a thread that updates the data every second, but this can get tricky really fast.
from threading import Timer
from flask import Flask

app = Flask(__name__)

DATA = "data"

def update_data(interval):
    Timer(interval, update_data, [interval]).start()
    global DATA
    DATA = DATA + " updating..."

# update data every second
update_data(1)

@app.route("/")
def index():
    return DATA

if __name__ == "__main__":
    app.run(debug=True)
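Adapted to the question's module, the same idea might look like the sketch below; module.get_data and the one-second interval come from the question, while the lock and cache variable are added assumptions:

from threading import Timer, Lock
from flask import Flask, jsonify

import module  # the module from the question, providing get_data()

app = Flask(__name__)

_lock = Lock()
_cached = module.get_data()  # initial value so the first request has data

def refresh(interval):
    global _cached
    Timer(interval, refresh, [interval]).start()
    data = module.get_data()  # runs once per interval, not once per request
    with _lock:
        _cached = data

# refresh the cached list every second
refresh(1)

@app.route('/')
def get_data():
    with _lock:
        return jsonify(data=_cached[0])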
