Run Flask webserver in parallel with main code? - python

Context:
For the Raspberry Pi I am developing some home automation tools.
On one side I have my main application that reads a CSV file consisting of date+time entries, each with a GPIO port number and the duration for which a signal needs to be sent to that port.
My main app reads this CSV, builds a small list of entries from it, and then basically checks every 60 seconds whether there is any job to do.
So far so good; this works like a charm.
Now, on the other half, I am trying to run a Flask webservice so I can interact directly with this schedule: overwrite it, push to refresh the CSV, and so on.
Later on (future music) I am thinking of making an Android app with a nice GUI that talks to this webservice.
But I keep struggling to start the webservice and then kick off the main app (read CSV; execute loop).
Some code snippet:
import threading
from flask import Flask, render_template, request
from dwe_homeautomation_app import runMainWorker

app = Flask(__name__)

# Some routing samples
@app.route('/app/breakLoop')
def breakLoop():
    m_worker.breakLoop = True  # set global var to exit the 60 sec loop
    return "break!"

if __name__ == '__main__':
    # TODO: how to run this parallel ?
    t1 = threading.Thread(target=app.run(debug=True, use_reloader=False, port=5000, host='0.0.0.0'))  # Flask webserver
    t2 = threading.Thread(target=runMainWorker())  # The main app that reads the csv and executes the 60 sec loop
    t1.start()
    t2.start()
I was reading some topics through Google and Stack Overflow, but I couldn't really figure out how to get this working in my code; I saw some advice about multithreading, though the info and advice don't seem to be very in sync with each other.
For some reason t1 (the webservice) starts, but t2 doesn't start at all.
I'm relatively new to Python, so I might be missing the obvious here.
Any advice pointing me in the right direction, or pointing out my mistake in the code sample, is much appreciated.

Try this:
from flask import Flask, render_template, request
from threading import Thread
from dwe_homeautomation_app import runMainWorker

app = Flask(__name__)

# Some routing samples
@app.route('/app/breakLoop')
def breakLoop():
    m_worker.breakLoop = True  # set global var to exit the 60 sec loop
    return "break!"

def runApp():
    app.run(debug=True, use_reloader=False, port=5000, host='0.0.0.0')

if __name__ == '__main__':
    # pass the target functions without calling them so both run in parallel
    Thread(target=runApp).start()
    Thread(target=runMainWorker).start()

Check the threading.Thread docs:
https://docs.python.org/3/library/threading.html#thread-objects
You have to pass the target without parentheses (the function object itself, not its return value), and pass args/kwargs as described in the docs.
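For example, a minimal self-contained sketch of passing the target and its arguments through the Thread constructor; the runMainWorker body here is just a stand-in for the questioner's 60-second scheduler loop:

from threading import Thread
from flask import Flask
import time

app = Flask(__name__)

def runMainWorker():
    # stand-in for the CSV-reading 60 sec loop from the question
    while True:
        time.sleep(60)

if __name__ == '__main__':
    # pass app.run itself (no parentheses); its arguments go into kwargs
    web = Thread(
        target=app.run,
        kwargs={'debug': True, 'use_reloader': False, 'port': 5000, 'host': '0.0.0.0'},
    )
    worker = Thread(target=runMainWorker)
    web.start()
    worker.start()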

Related

flask REST API with multithreading is not working

I have designed a REST API which receives inputs through POST requests, then applies some logic to the inputs and returns the result to the callback URI that is part of the inputs.
This design was working fine for a single input, but then I wanted to implement multithreading so that I can handle multiple POST requests. I have tried using app.run(threaded=True) but was not successful.
I am running this code on a Linux platform. I'm not sure what is wrong in the following code, and I'm not so good at using threads in Python; I would appreciate it if someone could tell me where the issue is.
I am able to get the '200' response once there is a POST request and the inputs are appended to inp_params, after which there is no processing in the thread.
from flask import Flask, jsonify, request
import time
import json
import os
import threading
import Queue
import test_func_module as tf

app = Flask(__name__)
inp_params = []

# Create the queue and threader
q = Queue.Queue()

@app.route('/', methods=['GET', 'POST'])
def get_data():
    if request.method == 'GET':
        return 'RESTful API'
    elif request.method == 'POST':
        global inp_params
        inputs = {"fileName": request.json["fileName"], "fileId": request.json["fileId"], "ModuleId": request.json["ModuleId"], "WorkflowId": request.json["WorkflowId"], "Language": request.json["Language"], "callbackuri": request.json["callbackuri"]}
        inp_params.append(inputs)
        return '200'

def test_integrate(worker):
    TF_output = tf.test_func(worker)
    return TF_output

def threader():
    while True:
        # gets a worker from the queue
        worker = q.get()
        # run the example job with the available worker in the queue (thread)
        test_integrate(worker)
        # completed with the job
        q.task_done()

if __name__ == '__main__':
    for worker in inp_params:
        q.put(worker)
    for x in range(4):  # 4 cores
        t = threading.Thread(target=threader)
        # classifying as a daemon, so they will die when the main thread dies
        t.daemon = True
        # begins, must come after daemon definition
        t.start()
    # wait until the threads terminate.
    q.join()
    app.run(threaded=True)
@Shilparani Since you mentioned
I have tried using 'app.run(threaded=True)' but was not successful.
this may not be an exact answer to your question, but I would like to share my experience of achieving concurrency through uWSGI/gunicorn:
Keep it simple by coding Flask for the REST endpoints only, and move the multithreading/multiprocessing logic to gunicorn or uWSGI, where you can specify the workers and threads that help achieve concurrency and parallelism, if that's what you are trying to achieve.
gunicorn -b localhost:8080 -w 4 --threads 4 app:app
Based on your needs and operations:
If tasks are CPU intensive, try to keep the number of workers close to the number of CPU cores
If tasks are I/O intensive, it may be safe to try with more threads
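As an illustration (a minimal sketch, not the asker's exact code): under this approach the Flask module only defines the endpoint and hands the work to a queue, and the worker threads are started at import time so they also exist inside each gunicorn worker process. test_func_module and the POSTed JSON come from the question; the 202 response and the queue-per-process layout are assumptions of this sketch.

from queue import Queue            # the question used the Python 2 'Queue' module
from threading import Thread
from flask import Flask, request, jsonify
import test_func_module as tf      # assumed from the question

app = Flask(__name__)
q = Queue()

def threader():
    while True:
        job = q.get()
        tf.test_func(job)          # heavy work happens off the request thread
        q.task_done()

# start worker threads at import time so they also run under gunicorn
for _ in range(4):
    Thread(target=threader, daemon=True).start()

@app.route('/', methods=['POST'])
def get_data():
    q.put(request.json)            # enqueue and return immediately
    return jsonify(status='accepted'), 202

# run with: gunicorn -b localhost:8080 -w 1 --threads 4 app:app
# (note: with more than one worker, each process gets its own in-memory queue)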

How to invoke python/flask server to reload client page from server-side function?

I am working on a small Python/Flask project which interfaces a heavy computation routine with a browser interface. For practical reasons, I have to keep the computation in a background process and reload/redirect the page (with output results) when the computation is done. The following is a minimal version of what I have so far (in reverse order):
interface.py
from flask import Flask
from threading import Thread
import time

app = Flask(__name__)

# step 4: rerender browser with output data
@app.route('/done')
def done(data_to_pass):
    # rerender browser's html here?
    print data_to_pass
    return data_to_pass

# step 3: heavy computation routine
def background():
    print "start running background process"
    time.sleep(3)  # simulate heavy computation routine
    data = 'done from background'
    done(data)

# step 2: initiate background process
def init():
    t = Thread(target=background)
    t.daemon = True
    t.start()

# step 1: home interface
@app.route('/')
def front_end():
    init()
    return 'initiate background process'

if __name__ == '__main__':
    app.run()
When interface.py is running, accessing 127.0.0.1:5000 shows the string initiate background process in the browser. However, the final data (the string done from background in this case) is only printed in the server's terminal and never reaches the browser.
I believe this procedure is commonly done on most servers, but I can't find any Flask solution... Or am I going in the wrong direction?
If you want to know when the process is finished, I would suggest using one of:
Long polling (see the sketch below)
WebSocket
However, you can also reload the whole page:
window.location.reload()
It is good practice to return from the server only the result of the background process and to update only the related fragment of the page.
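For example, a minimal long-polling sketch on the Flask side (the /status endpoint and the results dict are hypothetical names, not from the question): the background thread stores its result, the client polls /status until done is true, and then updates the relevant fragment of the page.

from flask import Flask, jsonify
from threading import Thread
import time

app = Flask(__name__)
results = {'done': False, 'data': None}   # hypothetical shared state

def background():
    time.sleep(3)                          # simulate heavy computation routine
    results['data'] = 'done from background'
    results['done'] = True

@app.route('/')
def front_end():
    Thread(target=background, daemon=True).start()
    return 'initiate background process'

@app.route('/status')
def status():
    # the browser polls this endpoint (e.g. with fetch() every second)
    # and swaps in results['data'] once done is true
    return jsonify(results)

if __name__ == '__main__':
    app.run()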

How to run a function once per second separated from the requests on Flask?

I'm developing a simple API in Flask; it isn't REST or anything.
There's a module that returns real-time data in a list.
The module in question:
# module.py
def get_data():
    do_something()
    return [info_from_somewhere, info_from_other_place]
And the app:
# app.py
import module

@app.route('/')
def get_data():
    return jsonify(data=module.get_data()[0])
The thing is, this would run the function every time someone requests that route. Since the data is the same for everyone, I want to serve it on every request but run the function only once.
Edit: I tried this:
got_data = module.get_data()

@app.route('/')
def get_data():
    return jsonify(data=got_data[0])
This works, but it doesn't refresh the list. So my question would be: "how can I refresh it every second?" I tried sleep, but it freezes my app.
You could achieve this with Celery. From the project page:
Celery is an asynchronous task queue/job queue based on distributed message passing. It is focused on real-time operation, but supports scheduling as well.
Another solution would be to spawn a thread that updates the data every second, but this could get tricky really fast.
from threading import Timer
from flask import Flask

app = Flask(__name__)
DATA = "data"

def update_data(interval):
    Timer(interval, update_data, [interval]).start()
    global DATA
    DATA = DATA + " updating..."

# update data every second
update_data(1)

@app.route("/")
def index():
    return DATA

if __name__ == "__main__":
    app.run(debug=True)
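An alternative sketch, assuming the questioner's module.get_data() from above: a single daemon thread refreshes a cached copy every second instead of spawning a new Timer thread on every tick, and the route simply returns the cached value, so the sleep never blocks request handling.

from threading import Thread
from flask import Flask, jsonify
import time
import module                      # the questioner's module with get_data()

app = Flask(__name__)
cached = module.get_data()         # initial value

def refresher():
    global cached
    while True:
        time.sleep(1)              # sleeping here only pauses this thread
        cached = module.get_data()

Thread(target=refresher, daemon=True).start()

@app.route('/')
def index():
    return jsonify(data=cached[0])

if __name__ == '__main__':
    app.run(debug=True)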

Run gevent processes and server concurrently

How do I run a given module when I want to run some functions concurrently that do not necessarily use routing (they could be daemon services), while at the same time running the app server?
For example:
# some other route functions app.post(...)

# some other concurrent functions
def alarm():
    '''
    Run this service every X duration
    '''
    ALARM = 21
    try:
        while 1:
            # checking time and doing something. Then finding INTERVAL
            gevent.sleep(INTERVAL)
    except KeyboardInterrupt, e:
        print 'exiting'
Do I have to use the above like this, after main?
gevent.joinall(gevent.spawn(alarm))
app.run(....)
or
gevent.joinall((gevent.spawn(alarm), gevent.spawn(app.run)))
The objective is to run these alarm-like daemon services, let them do their work and snooze, while the rest of the service operations work as usual.
The server should start concurrently as well. Correct me if I'm not on the right track.
Gevent comes with its own WSGI servers, so it is really not necessary to use app.run. The servers are:
gevent.pywsgi.WSGIServer
gevent.wsgi.WSGIServer
Both provide the same interface.
You can use these to achieve what you want:
import gevent
import gevent.monkey
gevent.monkey.patch_all()

import requests
from gevent.pywsgi import WSGIServer

# app = YourBottleApp

def alarm():
    '''
    Run this service every X duration
    '''
    ALARM = 21
    while 1:
        # checking time and doing something. Then finding INTERVAL
        gevent.sleep(INTERVAL)

if __name__ == '__main__':
    http_server = WSGIServer(('', 8080), app)
    srv_greenlet = gevent.spawn(http_server.serve_forever)
    alarm_greenlet = gevent.spawn(alarm)

    try:
        gevent.joinall([srv_greenlet, alarm_greenlet])
    except KeyboardInterrupt:
        http_server.stop()
        print 'Quitting'

Threaded Flask application not working as expected

I want my Flask application to be able to process more than one call at the same time.
I've been testing with threaded=True or processes=3 with the code below, but when I make two calls to the server the later one always has to wait for the first one to complete.
I know that it's recommended to deploy the application on a more sophisticated WSGI container, but for now I just want my small app to be able to process two calls at once.
from flask import Flask, Response, stream_with_context
from time import sleep

app = Flask(__name__)

def text_gen(message):
    for c in message:
        yield c
        sleep(1)

@app.route("/")
def hello():
    stream = text_gen('Hello World')
    return Response(stream_with_context(stream))

if __name__ == "__main__":
    app.run(host='0.0.0.0', port=8080, debug=True, threaded=True)
@Lukas was right.
I was debugging in Google Chrome with two tabs. Apparently Chrome tries to be smart by reusing the same socket for both tabs. How Chrome handles that can be changed with the -socket-reuse-policy flag when starting Chrome.
An easier way to test is by using a different hostname in each tab or by using curl -N (the -N flag disables buffering so you can see the streaming). Doing that, it did indeed work as expected.
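For example, a quick way to reproduce the test from Python instead of curl (a minimal sketch assuming the requests library is installed and the app above is running on port 8080): two threads stream the response at the same time, so with threaded=True the characters of both requests should interleave instead of the second request waiting for the first.

import threading
import time
import requests   # assumed installed: pip install requests

def stream(name):
    # stream=True avoids buffering, similar to curl -N
    with requests.get('http://localhost:8080/', stream=True) as r:
        for chunk in r.iter_content(chunk_size=1):
            print(name, chunk.decode(), time.strftime('%H:%M:%S'))

t1 = threading.Thread(target=stream, args=('req1',))
t2 = threading.Thread(target=stream, args=('req2',))
t1.start()
t2.start()
t1.join()
t2.join()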
