I have created a face recognition model using Keras and TensorFlow, and now I am trying to turn it into a web application using Flask and Python. My requirement is that a live webcam feed should be displayed on the webpage; clicking a button should take a picture, save it to a specified directory, and use that picture to recognize the person. If the person is not found in the dataset, a message should be displayed on the webpage that an unknown identity has been found. To do this job I started learning Flask, but when it came to this requirement it became very difficult for me. Can somebody help me out with this situation?
What you want to do is streaming with Flask: take the webcam stream and handle each frame with your machine learning code. The main script for the Flask web server loads your index.html file and then streams each frame through the /video_feed path:
from flask import Flask, render_template, Response, jsonify
from camera import VideoCamera
import cv2

app = Flask(__name__)
video_stream = VideoCamera()

@app.route('/')
def index():
    return render_template('index.html')

def gen(camera):
    while True:
        frame = camera.get_frame()
        yield (b'--frame\r\n'
               b'Content-Type: image/jpeg\r\n\r\n' + frame + b'\r\n\r\n')

@app.route('/video_feed')
def video_feed():
    return Response(gen(video_stream),
                    mimetype='multipart/x-mixed-replace; boundary=frame')

if __name__ == '__main__':
    app.run(host='127.0.0.1', port=5000, debug=True)
Then you need the VideoCamera class, in which you handle each frame and where you can run any prediction or processing you want on the frames. The camera.py file:
import cv2

class VideoCamera(object):
    def __init__(self):
        # Open the default webcam (device 0)
        self.video = cv2.VideoCapture(0)

    def __del__(self):
        self.video.release()

    def get_frame(self):
        ret, frame = self.video.read()
        # DO WHAT YOU WANT WITH TENSORFLOW / KERAS AND OPENCV
        ret, jpeg = cv2.imencode('.jpg', frame)
        return jpeg.tobytes()
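As a sketch of where the recognition could go, the frame can be passed through a Keras model before it is encoded. The model path, input size, labels and confidence threshold below are placeholders, not part of the original code:

from keras.models import load_model
import numpy as np

# Hypothetical model and class names -- adjust to your own training setup.
model = load_model('face_model.h5')          # placeholder path
labels = ['alice', 'bob']                    # placeholder class names

def recognize(frame):
    """Run the classifier on a single BGR frame and return a label or 'unknown'."""
    face = cv2.resize(frame, (224, 224))     # assumed 224x224 input size
    face = face.astype('float32') / 255.0
    face = np.expand_dims(face, axis=0)
    probs = model.predict(face)[0]
    if probs.max() < 0.5:                    # arbitrary confidence threshold
        return 'unknown'
    return labels[int(probs.argmax())]

Inside get_frame you could then call recognize(frame) and, for example, draw the result onto the frame with cv2.putText before it is JPEG-encoded.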
And finally, the page showing the video stream in index.html (in the templates/ folder; create the folder if it does not exist):
<!DOCTYPE html>
<html lang="en">
<head>
<title>Video Stream</title>
</head>
<body>
<img src="{{ url_for('video_feed') }}" />
</body>
</html>
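To cover the "take a picture on button click" part of the requirement, a minimal sketch of an extra route on the same Flask app could look like the following; the /capture path and the captures/ directory are illustrative, not part of the original code:

import os
import time

@app.route('/capture')
def capture():
    # Read one frame from the shared VideoCamera and write it to disk.
    ret, frame = video_stream.video.read()
    if not ret:
        return jsonify({'status': 'camera error'}), 500
    os.makedirs('captures', exist_ok=True)
    path = os.path.join('captures', time.strftime('%Y%m%d-%H%M%S') + '.jpg')
    cv2.imwrite(path, frame)
    return jsonify({'status': 'saved', 'path': path})

On the page, a plain button whose click handler calls fetch('/capture') would be enough to trigger it.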
from flask import Flask, request, jsonify
import numpy as np
import cv2
import tensorflow as tf
import base64

app = Flask(__name__)
graph = tf.get_default_graph()

@app.route('/')
def hello_world():
    return """
    <!DOCTYPE html>
    <html lang="en">
    <head>
        <meta charset="UTF-8">
        <title>Title</title>
    </head>
    <body>
        <video id="video" width="640" height="480" autoplay></video>
        <button id="snap">Snap Photo</button>
        <canvas id="canvas" width="640" height="480"></canvas>
    </body>
    <script>
        var video = document.getElementById('video');

        if(navigator.mediaDevices && navigator.mediaDevices.getUserMedia) {
            navigator.mediaDevices.getUserMedia({ video: true }).then(function(stream) {
                //video.src = window.URL.createObjectURL(stream);
                video.srcObject = stream;
                video.play();
            });
        }

        var canvas = document.getElementById('canvas');
        var context = canvas.getContext('2d');
        var video = document.getElementById('video');

        // Trigger photo take
        document.getElementById("snap").addEventListener("click", function() {
            context.drawImage(video, 0, 0, 640, 480);
            var request = new XMLHttpRequest();
            request.open('POST', '/submit?image=' + video.toString('base64'), true);
            request.send();
        });
    </script>
    </html>
    """

# HtmlVideoElement

@app.route('/test', methods=['GET'])
def test():
    return "hello world!"

@app.route('/submit', methods=['POST'])
def submit():
    image = request.args.get('image')
    print(type(image))
    return ""
I have done it like this, but the problem is that when the /submit API is called, the image parameter arrives as the text of an HTMLVideoElement (that is what printing the image variable shows). I don't know how to convert it into JPEG format and use it for further purposes.
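One way around this (a sketch, not a confirmed fix): on the page, keep drawing the video onto the canvas as above, but send canvas.toDataURL('image/jpeg') in the POST body instead of putting the video element into the query string. The /submit handler can then strip the data URL header and decode the base64 payload; the names below are illustrative and this would replace the /submit route shown above:

import base64
import time

@app.route('/submit', methods=['POST'])
def submit():
    # Expect the raw data URL ("data:image/jpeg;base64,....") in the request body.
    data_url = request.get_data(as_text=True)
    encoded = data_url.split(',', 1)[1]          # drop the "data:image/jpeg;base64," prefix
    image_bytes = base64.b64decode(encoded)
    filename = time.strftime('%Y%m%d-%H%M%S') + '.jpg'
    with open(filename, 'wb') as f:              # now a real JPEG the model can read
        f.write(image_bytes)
    return jsonify({'saved': filename})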
I was referring to this GitHub repository https://github.com/ahmedfgad/AndroidFlask and https://heartbeat.comet.ml/uploading-images-from-android-to-a-python-based-flask-server-691e4092a95e. I was unable to display the images at the remote destination from the Android emulator, and I don't think the source code they provide actually displays the images at the remote destination using Flask.
Below is my GitHub repository; it shows all the changes I have made: https://github.com/2100723/AndroidEmulator/tree/main. I have made changes to flask_server.py and image_template.html, which is located in the templates folder. I need the image to display on the remote host.
My Android emulator manages to save my images from the emulator into the folder called static whenever I press "connect to server & upload". All that is left is to display my images at the remote destination.
#Flask_server.py
import flask
import werkzeug
import time

app = flask.Flask(__name__, static_folder='static')

@app.route('/', methods=['GET', 'POST'])
def handle_request():
    image_paths = []
    files_ids = list(flask.request.files)
    print("\nNumber of Received Images : ", len(files_ids))
    image_num = 1
    for file_id in files_ids:
        print("\nSaving Image ", str(image_num), "/", len(files_ids))
        imagefile = flask.request.files[file_id]
        filename = werkzeug.utils.secure_filename(imagefile.filename)
        print("Image Filename : " + imagefile.filename)
        timestr = time.strftime("%Y%m%d-%H%M%S")
        file_path = timestr + '_' + filename
        imagefile.save(file_path)
        image_paths.append(filename)
        image_num = image_num + 1
    print("\n")
    return flask.render_template('image_template.html', images=image_paths)

app.run(host="0.0.0.0", port=5000, debug=True)
#image_template.html
<!doctype html>
<html>
<head>
<title>Image Gallery</title>
</head>
<body>
<h1>Image Gallery</h1>
{% for image in images %}
<img src="{{ url_for('static', filename=image) }}" alt="image" width="200">
{% endfor %}
</body>
</html>
I tried changing my flask_server.py code and adding image_template.html to make the images display on the remote server using Flask, but nothing is displayed at the remote destination.
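For what it's worth, image_template.html can only find files that are inside the static folder and whose exact saved name is passed to it, while the posted handler saves a timestamp-prefixed file into the working directory and appends the unprefixed name to image_paths. A sketch of the handler with both points adjusted (an assumption about the intended layout, not a verified fix):

import os
import time
import flask
import werkzeug

@app.route('/', methods=['GET', 'POST'])
def handle_request():
    image_paths = []
    for file_id in flask.request.files:
        imagefile = flask.request.files[file_id]
        filename = werkzeug.utils.secure_filename(imagefile.filename)
        save_name = time.strftime("%Y%m%d-%H%M%S") + '_' + filename
        # Save into static/ so url_for('static', filename=...) resolves to the file.
        imagefile.save(os.path.join(app.static_folder, save_name))
        image_paths.append(save_name)   # pass the name that was actually saved
    return flask.render_template('image_template.html', images=image_paths)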
I am setting up a web server with Flask and seaborn/matplotlib. In it I create a Python variable img_buffer of type BytesIO that represents a graph.
Then I transform this data into a str img_tag:
...
import base64
str_equivalent_image = base64.b64encode(img_buffer.getvalue()).decode()
img_tag = "<img src='data:image/png;base64," + str_equivalent_image + "'/>"
Now I want to use this img_tag in the corresponding index.html to display the image. How do I do that? The following did not work:
...
<body>
<h1>Graph</h1>
{img_tag}
</body>
</html>
This is the code I used, reduced as much as possible. The page is updated every 0.5 s. Later the graphic should also be updated on the basis of dynamic data read from a URL, but this is not yet implemented:
server.py:
from gevent import monkey; monkey.patch_all()
from flask import Flask, Response, render_template, stream_with_context, url_for, send_file
from gevent.pywsgi import WSGIServer
import json
import time
import io
import base64
from requests.exceptions import HTTPError
import matplotlib as mpl
from matplotlib.backends.backend_agg import FigureCanvasAgg as FigureCanvas
import matplotlib.pyplot as plt
import seaborn as sns

fig, ax = plt.subplots(figsize=(6, 6))
ax = sns.set(style="darkgrid")
x = [i for i in range(100)]
y = [i for i in range(100)]

app = Flask(__name__)
counter = 100

@app.route("/")
def render_index():
    return render_template("index.html")

@app.route('/getVisualizeData')
def getVisualizeData():
    sns.lineplot(x, y)
    canvas = FigureCanvas(fig)
    img = io.BytesIO()
    fig.savefig(img)
    img.seek(0)
    return img

@app.route("/listen")
def listen():
    def respond_to_client():
        while True:
            img_buffer = getVisualizeData()
            str_equivalent_image = base64.b64encode(img_buffer.getvalue()).decode()
            img_tag = "<img src='data:image/png;base64," + str_equivalent_image + "'/>"
            _data = json.dumps({"img_tag": img_tag})
            yield f"id: 1\ndata: {_data}\nevent: online\n\n"
            time.sleep(0.5)
    return Response(respond_to_client(), mimetype='text/event-stream')

if __name__ == "__main__":
    http_server = WSGIServer(("localhost", 8080), app)
    http_server.serve_forever()
index.html:
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>APP</title>
</head>
<body>
<h1>Graph</h1>
{img_tag}
<script>
var eventSource = new EventSource("/listen")
eventSource.addEventListener("message", function(e) {
console.log(e.data)
}, false)
eventSource.addEventListener("online", function(e) {
data = JSON.parse(e.data)
document.querySelector("#img_tag").innerText = data.img_tag
}, true)
</script>
</body>
</html>
Try using double brackets
<body>
    <h1>Graph</h1>
    {{img_tag}}
</body>
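For the double brackets to receive anything, img_tag also has to be passed into the template; a minimal sketch built from the code above (the |safe filter stops Jinja from escaping the HTML, and this renders the graph once at page load rather than streaming it):

@app.route("/")
def render_index():
    # Build the inline <img> tag and hand it to Jinja.
    img_buffer = getVisualizeData()
    str_equivalent_image = base64.b64encode(img_buffer.getvalue()).decode()
    img_tag = "<img src='data:image/png;base64," + str_equivalent_image + "'/>"
    return render_template("index.html", img_tag=img_tag)

In index.html, use {{ img_tag|safe }} where the image should appear; the /listen SSE route would still be needed for the periodic updates.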
I'm currently trying to code something that will let websites view my webcam. I'm roughly following the tutorial linked on this website, except using Python and pygame instead of Processing.
At the moment, my code grabs a pygame image (which was originally a SimpleCV image), attempts to convert it into JPEG format, and sends it over WebSockets to the client, where it should be displayed inside an img tag. However, I can't figure out how to convert a pygame image into JPEG and get it to display properly in the web browser.
This is my code for the server, which uses Flask and gevent:
#!/usr/bin/env python
import base64
import cStringIO
import time

from geventwebsocket.handler import WebSocketHandler
from gevent.pywsgi import WSGIServer
from flask import Flask, request, render_template

import pygame
pygame.init()
import SimpleCV as scv

app = Flask(__name__)
cam = scv.Camera(0)

@app.route('/')
def index():
    return render_template('index.html')

@app.route('/camera')
def camera():
    if request.environ.get('wsgi.websocket'):
        ws = request.environ['wsgi.websocket']
        while True:
            image = cam.getImage().flipHorizontal().getPGSurface()
            data = cStringIO.StringIO()
            pygame.image.save(image, data)
            ws.send(base64.b64encode(data.getvalue()))
            time.sleep(0.5)

if __name__ == '__main__':
    http_server = WSGIServer(('', 5000), app, handler_class=WebSocketHandler)
    http_server.serve_forever()
This is my HTML file:
<!DOCTYPE HTML>
<html>
<head>
<title>Flask/Gevent WebSocket Test</title>
<script src="http://code.jquery.com/jquery-1.10.1.min.js"></script>
<script type="text/javascript" charset="utf-8">
$(document).ready(function(){
if ("WebSocket" in window) {
cam = new WebSocket("ws://" + document.domain + ":5000/camera");
cam.onmessage = function (msg) {
$("#cam").attr('src', 'data:image/jpg;base64,' + msg.data);
};
cam.onerror = function(e) {
console.log(e);
}
} else {
alert("WebSocket not supported");
}
});
</script>
</head>
<body>
<img id="cam" src="" width="640" height="480" />
</body>
</html>
These are the specific lines that I think I'm having trouble with:
while True:
    image = cam.getImage().flipHorizontal().getPGSurface()
    data = cStringIO.StringIO()
    pygame.image.save(image, data)
    ws.send(base64.b64encode(data.getvalue()))
    time.sleep(0.5)
Currently, if I try and run my code, going to localhost:5000 will display an invalid jpg image. It also becomes really laggy if I try running it on Firefox, but that may be an unrelated issue that I can debug later.
I've checked and made sure that the pygame image is a valid one, since I'm converting it from another library, and also checked that I was using websockets correctly by sending text data back and forth.
I've also tried calling pygame.image.to_string to try and convert the pygame surface into RGB format, but that also doesn't work.
What am I doing wrong?
Using the underlying PIL image, we can write to a file-like object, read back and base-64 encode it:
from geventwebsocket.handler import WebSocketHandler
from gevent.pywsgi import WSGIServer
from flask import Flask, request
from time import sleep
from cStringIO import StringIO

import pygame
pygame.init()
import SimpleCV as scv

app = Flask(__name__)
cam = scv.Camera(0)

@app.route('/camera')
def camera():
    if request.environ.get('wsgi.websocket'):
        ws = request.environ['wsgi.websocket']
        while True:
            fp = StringIO()
            image = cam.getImage().flipHorizontal().getPIL()
            image.save(fp, 'JPEG')
            ws.send(fp.getvalue().encode("base64"))
            # fp.close() << benchmark and memory tests needed
            sleep(0.5)

if __name__ == '__main__':
    http_server = WSGIServer(('', 5000), app, handler_class=WebSocketHandler)
    http_server.serve_forever()
I'm fighting with the same issue, and the problem is a double encoding. In the Python file you have to remove the line ws.send(base64.b64encode(data.getvalue())) and send the image without encoding it; then your script in the JS file does the encoding, and that's all.
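A sketch of the server-side change this answer describes, applied to the loop from the previous answer; how the page then turns the binary WebSocket message into something an img tag can show (for example via a Blob and an object URL) is left to the client, and this is untested:

while True:
    fp = StringIO()
    image = cam.getImage().flipHorizontal().getPIL()
    image.save(fp, 'JPEG')
    ws.send(fp.getvalue())      # raw JPEG bytes, no base64 on the server
    sleep(0.5)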
<!DOCTYPE html>
<html>
<head>
<title>dog or cat huh</title>
<style>
*{
font-size: :30px;
}
</style>
</head>
<body>
<input id='input_image', type='file'/>
<button id='submit_button'>Submit</button>
<p style='font-weight: bold'>Predicions</p>
<p>Dog <span id='Dog_prediction'></span></p>
<p>Cat <span id='Cat_prediction'></span></p>
<img id='selected_image' src=''/>
<script href="http://code.jquery.com/jquery-3.3.1.min.js"></script>
<script>
let base64Image;
$('#input_image').change(function(){
let reader = new FileReader();
reader.onload = function(e){
let dataUrl = reader.result;
$('#selected_image').attr('src', dataUrl);
base64Image = dataUrl.replace("data:image/png;base64","")
}
reader.readAsDataURL($('#input_image')[0].files[0])
$("#Dog_prediction").text("");
$("#Cat_prediction").text("");
});
$('#submit_button').click(function(event){
let message={
image: base64Image
}
console.log(message);
$.post('http://192.168.1.106:5000/predict', JSON.stringify(message), function(response){
$("#Dog_prediction").text(response.prediction.dog.toFixed(6));
$("#Cat_prediction").text(response.prediction.cat.toFixed(6));
console.log(response);
})
});
</script>
</body>
</html>
import base64
import io

import numpy as np
import keras
from keras import backend as k
from keras.preprocessing.image import ImageDataGenerator
from keras.preprocessing.image import img_to_array
from keras.models import Sequential
from keras.models import load_model
from PIL import Image
from flask import Flask
from flask import request
from flask import jsonify
from flask import render_template

app = Flask(__name__)

def get_model():
    global model
    model = load_model('VGG_cats_and_dogs.h5')
    print('Model loaded!')

def preprocess(image, target_size):
    if image.mode != 'RGB':
        image = image.convert('RGB')
    image = image.resize(target_size)
    image = img_to_array(image)
    image = np.expand_dims(image, axis=0)
    return image

get_model()
print('Model is loading.........')

@app.route('/predict', methods=["GET", "POST"])
def predict():
    if request.method == 'POST':
        message = request.get_json(force=True)
        encoded = message['image']
        decoded = base64.b64decode(encoded)
        image = Image.open(io.BytesIO(decoded))
        processed_image = preprocess(image, target_size=(224, 224))
        prediction = model.predict(processed_image).tolist()
        response = {
            'prediction': {
                'dog': prediction[0][0],
                'cat': prediction[0][1]
            }
        }
        return jsonify(response)
    else:
        return render_template("predict.html")

if __name__ == "__main__":
    app.run(debug=True)
I'm trying to create a web application using Flask to classify images with VGG16. The website loads fine and I can select a file, but I can't submit the image: after clicking the submit button, nothing happens. In the original tutorial GET was not added to methods, but without it my website wouldn't load, so I had to add the if/else statement for POST and GET, which made the website load.
Anyone?
Your javascript is failing, check the developer console of your browser.
Uncaught ReferenceError: $ is not defined
at predict:23
You weren't loading jQuery properly.
You need to do this:
<script src="http://code.jquery.com/jquery-3.3.1.min.js"></script>
not this:
<script href="http://code.jquery.com/jquery-3.3.1.min.js"></script>
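Once jQuery loads, it can also help to confirm the /predict route independently of the page. A small test client sketch using the requests library; the image path and host here are placeholders, and the expected response shape follows the server code above:

import base64
import requests

# Hypothetical local test: encode an image file and POST it the same way the page does.
with open('some_cat.jpg', 'rb') as f:                     # placeholder file name
    encoded = base64.b64encode(f.read()).decode()

resp = requests.post('http://127.0.0.1:5000/predict',
                     json={'image': encoded})
print(resp.json())                                        # e.g. {'prediction': {'dog': ..., 'cat': ...}}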