I have a stream of images coming from a live video feed (RTP). The images pass through a pipeline of transformations, and I would like to re-stream the result to the end user using HLS. How can I take the buffer of images and stream it live over HLS?
The current solution I'm using returns the frames over HTTP, which is too slow and not scalable.
#app.route("/")
def main():
return Response(gen(),
mimetype='multipart/x-mixed-replace; boundary=frame')
def gen():
while True:
time.sleep(.04)
data = get()
yield (b'--frame\r\n'
b'Content-Type: image/jpeg\r\n\r\n' + data.tobytes() + b'\r\n\r\n')
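For HLS specifically, one common direction (this is my sketch, not from the original post; it assumes ffmpeg is available on the server and that get() keeps returning JPEG frames as above) is to pipe the frames into an ffmpeg process that writes HLS segments plus a playlist, and serve that directory as static files to an HLS player such as hls.js:

# Sketch: feed JPEG frames from the pipeline into ffmpeg, which writes
# HLS segments (.ts) and a playlist (.m3u8) that the player requests over HTTP.
import subprocess

ffmpeg = subprocess.Popen(
    ['ffmpeg',
     '-f', 'image2pipe', '-framerate', '25', '-i', '-',   # JPEG frames on stdin
     '-c:v', 'libx264', '-preset', 'veryfast',
     '-pix_fmt', 'yuv420p', '-g', '50',
     '-f', 'hls',
     '-hls_time', '2',                                    # 2-second segments
     '-hls_list_size', '5',
     '-hls_flags', 'delete_segments',                     # rolling live playlist
     '/var/www/hls/stream.m3u8'],                         # hypothetical output path
    stdin=subprocess.PIPE)

while True:
    data = get()                      # same get() as in the snippet above
    ffmpeg.stdin.write(data.tobytes())

The player then only polls stream.m3u8 and the segment files, so the Flask process no longer has to hold one long-lived HTTP response per viewer.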
Greetings,
I am working on a drone project: I want to take the stream from my drone, process it on my laptop, and send commands back based on that processing. I am using the Flask framework for this.
Currently, as the first step, I want to take the stream from the drone, PUT it to the Flask server, and view it on the Flask website; I'm not doing the processing part right now.
I PUT the video to the server after compressing each frame to JPG, encoding it with base64, wrapping it with json.dumps(), and finally sending it with requests.put().
On the server side, in the Flask program, I read it with request.json and json.loads(), but I am not clear what to do next.
I am not very experienced with Flask or web development and wrote these programs with that limited knowledge; the Flask program returns error 405.
Here are the programs
flask server
import base64
import json

from flask import Flask, make_response, render_template, request

app = Flask(__name__)

def getFrames(img):
    pass

@app.route('/video', methods=['POST', 'GET'])
def video():
    if request.method == 'PUT':
        load = json.loads(request.json)
        imdata = base64.b64decode(load['image'])
        respose = make_response(imdata.tobytes())
        return respose

@app.route('/')
def index():
    return render_template('index.html')

@app.route('/cmd')
def cmd():
    pass

if __name__ == "__main__":
    app.run(debug=True)
index.html
<!DOCTYPE html>
<html>
<head>
    <title>Video Stream</title>
</head>
<body>
    <h1>Live Stream</h1>
    <div>
        <img src="{{ url_for('video') }}" width="50%">
    </div>
</body>
</html>
drone program
import base64
import json
import requests
import cv2
cap = cv2.VideoCapture(1)
ip = ''  # url returned by the flask program

while True:
    success, img = cap.read()
    cv2.imshow("OUTPUT", img)
    _, imdata = cv2.imencode('.JPG', img)
    jStr = json.dumps({"image": base64.b64encode(imdata).decode('ascii')})
    requests.put(url=(ip + '/video'), data=jStr)
    if cv2.waitKey(1) == 27:
        break
Any help is highly appreciated!!!
You don't have to convert to base64 and use JSON; it can be simpler and faster to send the JPG directly as raw bytes.
To keep it simple I would use /upload to send images from the drone to the server, and /video to send images to users.
import requests
import cv2

cap = cv2.VideoCapture(0)

while True:
    success, img = cap.read()
    if success:
        cv2.imshow("OUTPUT", img)
        _, imdata = cv2.imencode('.JPG', img)
        print('.', end='', flush=True)
        requests.put('http://127.0.0.1:5000/upload', data=imdata.tobytes())

    # 40ms = 25 frames per second (1000ms/40ms),
    # 1000ms = 1 frame per second (1000ms/1000ms)
    # but this will work only when `imshow()` is used.
    # Without `imshow()` it will need `time.sleep(0.04)` or `time.sleep(1)`
    if cv2.waitKey(40) == 27:  # 40ms = 25 frames per second (1000ms/40ms)
        break

cv2.destroyAllWindows()
cap.release()
Now Flask. This part is not complete.
It gets the image from the drone and keeps it in a global variable, and when a user opens the page it loads a single image from /video.
from flask import Flask, make_response, render_template, request

app = Flask(__name__)

frame = None  # global variable to keep single JPG

@app.route('/upload', methods=['PUT'])
def upload():
    global frame
    # keep jpg data in global variable
    frame = request.data
    return "OK"

@app.route('/video')
def video():
    if frame:
        return make_response(frame)
    else:
        return ""

@app.route('/')
def index():
    return 'image:<br><img src="/video">'

if __name__ == "__main__":
    app.run(debug=True)
At this moment it can only display one static image. It needs to be sent as motion-JPEG.
EDIT:
Here is a version which sends motion-JPEG, so you see video.
It works correctly with Chrome, Microsoft Edge and Brave (all use the Chrome engine).
Firefox is the problem: it hangs and keeps trying to load the image. I don't know what the real cause is, but adding time.sleep() seems to solve it.
from flask import Flask, Response, render_template_string, request
import time

app = Flask(__name__)

frame = None   # global variable to keep single JPG,
               # at start you could assign bytes from an empty JPG

@app.route('/upload', methods=['PUT'])
def upload():
    global frame
    # keep jpg data in global variable
    frame = request.data
    return "OK"

def gen():
    while True:
        yield (b'--frame\r\n'
               b'Content-Type: image/jpeg\r\n'
               b'\r\n' + frame + b'\r\n')
        time.sleep(0.04)  # my Firefox needs some time to display image / Chrome displays image without it
                          # 0.04s = 40ms = 25 frames per second

@app.route('/video')
def video():
    if frame:
        # if you use `boundary=other_name` then you have to yield `b'--other_name\r\n'`
        return Response(gen(), mimetype='multipart/x-mixed-replace; boundary=frame')
    else:
        return ""

@app.route('/')
def index():
    return 'image:<br><img src="/video">'
    #return render_template_string('image:<br><img src="{{ url_for("video") }}">')

if __name__ == "__main__":
    app.run(debug=True)  #, use_reloader=False)
The server may run requests in separate threads or processes, and then it may not share frame between users. If I use use_reloader=False then I can stop sending to /upload, which stops the video in the browser, and later start sending to /upload again and the browser displays the stream again (without reloading the page). Without use_reloader=False the browser doesn't restart the video and the page has to be reloaded. Maybe it will need flask.g to keep frame, or /upload will have to save the frame to a file or database and /video will have to read it from there, as sketched below.
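A minimal sketch of that last idea (my own addition, not part of the original answer; it assumes one writable path, here the hypothetical /tmp/latest_frame.jpg, is visible to all worker processes): /upload writes the JPG to disk and /video reads it back, so the latest frame survives across workers.

import os
import time

from flask import Flask, Response, request

app = Flask(__name__)

FRAME_PATH = '/tmp/latest_frame.jpg'   # assumption: path writable by every worker

@app.route('/upload', methods=['PUT'])
def upload():
    # write atomically: write a temp file, then rename it over the old one
    tmp_path = FRAME_PATH + '.tmp'
    with open(tmp_path, 'wb') as f:
        f.write(request.data)
    os.replace(tmp_path, FRAME_PATH)
    return "OK"

def gen():
    while True:
        if os.path.exists(FRAME_PATH):
            with open(FRAME_PATH, 'rb') as f:
                jpg = f.read()
            yield (b'--frame\r\n'
                   b'Content-Type: image/jpeg\r\n\r\n' + jpg + b'\r\n')
        time.sleep(0.04)   # at most 25 frames per second

@app.route('/video')
def video():
    return Response(gen(), mimetype='multipart/x-mixed-replace; boundary=frame')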
I have a video stored in the cloud, and I'm using this Flask route to download chunks of the video and return them to the user as a stream:
#app.route("/api/v1/download")
def downloadAPI():
def download_file(streamable):
with streamable as stream:
stream.raise_for_status()
for chunk in stream.iter_content(chunk_size=512):
yield chunk
headers = {"range": range}
resp = requests.request(
method=request.method,
url="example.com/api/file",
headers=headers,
data=request.get_data(),
cookies=request.cookies,
allow_redirects=False,
stream=True)
return Response(download_file(resp), resp.status_code)
How would I transcode these chunks on the fly?
I'm planning on transcoding basically any type of video to mp4
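One way to approach this (my sketch, not from the original post; it assumes ffmpeg is installed on the server and that a fragmented MP4 stream is acceptable to the client) is to pipe the downloaded chunks through an ffmpeg subprocess and yield ffmpeg's stdout to the client instead of the raw chunks:

import subprocess
import threading

def transcode_to_mp4(streamable):
    # Transcode whatever arrives on stdin to H.264/AAC fragmented MP4 on stdout.
    proc = subprocess.Popen(
        ['ffmpeg', '-i', 'pipe:0',
         '-c:v', 'libx264', '-preset', 'veryfast',
         '-c:a', 'aac',
         '-movflags', 'frag_keyframe+empty_moov',   # MP4 that can be written to a pipe
         '-f', 'mp4', 'pipe:1'],
        stdin=subprocess.PIPE, stdout=subprocess.PIPE)

    def feed():
        # Push the source chunks into ffmpeg from a separate thread,
        # so reading stdout below cannot deadlock.
        with streamable as stream:
            for chunk in stream.iter_content(chunk_size=64 * 1024):
                proc.stdin.write(chunk)
        proc.stdin.close()

    threading.Thread(target=feed, daemon=True).start()

    while True:
        data = proc.stdout.read(64 * 1024)
        if not data:
            break
        yield data

# in the route:  return Response(transcode_to_mp4(resp), mimetype='video/mp4')

Fragmented MP4 (frag_keyframe+empty_moov) is needed here because a regular MP4 cannot be written to a non-seekable pipe.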
I am currently trying to use OpenCV with Python to load a video from a URL onto a localhost webpage. The loaded video is a little choppy, but the main problem is that it stops reading the video frames after a while and displays the following error messages.
[h264 @ 0955e140] error while decoding MB 87 29, bytestream -5
[h264 @ 0955e500] left block unavailable for requested intra4x4 mode -1
[h264 @ 0955e500] error while decoding MB 0 44, bytestream 126
Debugging middleware caught exception in streamed response at a point where response headers were already sent.
Traceback (most recent call last):
File "C:\Users\\AppData\Local\Programs\Python\Python38-32\Lib\site-packages\werkzeug\wsgi.py", line 506, in __next__
return self._next()
File "C:\Users\\AppData\Local\Programs\Python\Python38-32\Lib\site-packages\werkzeug\wrappers\base_response.py", line 45, in _iter_encoded
for item in iterable:
File "C:\Users\\Downloads\VideoStreamingFlask\main.py", line 12, in gen
frame = camera.get_frame()
File "C:\Users\\Downloads\VideoStreamingFlask\camera.py", line 13, in get_frame
ret, jpeg = cv2.imencode('.jpg', image)
cv2.error: OpenCV(4.3.0) C:\projects\opencv-python\opencv\modules\imgcodecs\src\loadsave.cpp:919: error: (-215:Assertion failed) !image.empty() in function 'cv::imencode'
Code
main.py
from flask import Flask, render_template, Response
from camera import VideoCamera

app = Flask(__name__)

@app.route('/')
def index():
    return render_template('index.html')

def gen(camera):
    while True:
        frame = camera.get_frame()
        yield (b'--frame\r\n'
               b'Content-Type: image/jpeg\r\n\r\n' + frame + b'\r\n\r\n')

@app.route('/video_feed')
def video_feed():
    return Response(gen(VideoCamera()),
                    mimetype='multipart/x-mixed-replace; boundary=frame')

if __name__ == '__main__':
    app.run(host='0.0.0.0', debug=True)
camera.py
import cv2

class VideoCamera(object):
    def __init__(self):
        self.video = cv2.VideoCapture(*url*)

    def __del__(self):
        self.video.release()

    def get_frame(self):
        success, image = self.video.read()
        ret, jpeg = cv2.imencode('.jpg', image)
        return jpeg.tobytes()
Questions
What might be causing the problem here?
How do I make the video less choppy?
The crash in your Python code happened because video.read() failed, so image could not be passed to cv2.imencode(). You should check the success value in get_frame(self) and be prepared for camera.get_frame() to sometimes not return a valid JPEG.
Now, let's understand why video.read() failed in this case. This could happen if the connection to the camera was not good enough and some packets got lost. But more likely, your VideoCapture was not fast enough to handle the video stream.
This could be improved if you reduce the work that the video capture thread is doing. As suggested in another discussion, offload processing to a separate thread.
Currently, your Flask server listens to the camera stream, converts it to a series of JPEGs, and sends them to the client over HTTP. If you have a thread dedicated to the camera stream, you may find that your server cannot pass on every video frame because the encoder and HTTP transport are too slow, so some frames will be skipped.
Here is a detailed article about video streaming with Flask: https://blog.miguelgrinberg.com/post/flask-video-streaming-revisited. You can also find other open-source projects that stream video to the browser, not necessarily with OpenCV and Python.
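To make those two suggestions concrete, here is a minimal sketch (my own, not from the linked article) of a camera class that checks the success flag and reads frames in a background thread, so the HTTP generator only encodes the most recent frame:

import threading
import cv2

class ThreadedCamera:
    def __init__(self, source):
        self.video = cv2.VideoCapture(source)
        self.frame = None
        self.lock = threading.Lock()
        threading.Thread(target=self._reader, daemon=True).start()

    def _reader(self):
        # Keep only the newest frame; drop frames the server is too slow to send.
        while True:
            success, image = self.video.read()
            if not success:           # stream hiccup: skip instead of crashing
                continue
            with self.lock:
                self.frame = image

    def get_frame(self):
        with self.lock:
            image = self.frame
        if image is None:             # nothing received yet
            return None
        ret, jpeg = cv2.imencode('.jpg', image)
        return jpeg.tobytes() if ret else None

gen() would then skip iterations where get_frame() returns None instead of crashing on an empty image.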
I am able to send frames to a web API using the script below:
import cv2
import requests

cap = cv2.VideoCapture(0)

while True:
    ret, frame = cap.read()
    cv2.imshow('frame', frame)
    #frame_im = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    print(type(frame))
    frame_in = cv2.imencode('.jpg', frame)
    headers = {'Content-Type': 'image/jpg'}
    files = {'form': frame_in}
    #img_files = urlopen(frame_in)
    response = requests.post(
        url="http://127.0.0.1:5000/test",
        data=files,
        headers=headers
    )
    if cv2.waitKey(1) and 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()
On the API side I am receiving the data with the script below, and I am trying to receive every frame and save it in a folder:
from flask import Flask, redirect, url_for, request, Response, render_template, send_file, jsonify
import base64

app = Flask(__name__)

@app.route('/test', methods=['POST'])
def test():
    if request.method == 'POST':
        body = request.data
        #print(request.form)
        print(body)
        print("Test Revealed")
    return jsonify({"hi": "hi"})

if __name__ == '__main__':
    #app.secret_key = os.urandom(12)
    app.run(host='127.0.0.1', port=5000, threaded=True)
And I got results as one huge line, as shown below:
=%5B39%5D&form=%5B13%5D&form=%5B146%5D&form=%5B192%5D&form=%5B117%5D&form=%5B24%5D&form=%5B206%5D&form=%5B70%5D&form=%5B106%5D&form=%5B121%5D&form=%5B235%5D&form=%5B41%5D&form=%5B114%5D&form=%5B180%5D&form=%5B215%5D&form=%5B98%5D&form=%5B84%5D&form=%5B212%5D&form=%5B149%5D&form=%5B154%5D&form=%5B63%5D&form=%5B255%5D&form=%5B217%5D'
Test Revealed
127.0.0.1 - - [07/Jan/2020 10:09:42] "POST /test HTTP/1.1" 200 -
I want to convert the data I am receiving, but I still don't understand how to convert it and save it as an image file.
Any suggestion would be really helpful.
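The long form=%5B...%5D output appears because the imencode result is being form-encoded element by element. A minimal sketch of what I would try instead (my own suggestion, not from the original post; it assumes the client is changed to POST the raw JPEG bytes, i.e. data=jpg.tobytes() where ok, jpg = cv2.imencode('.jpg', frame)) decodes the request body on the server and writes each frame to a folder:

import os
import time

import cv2
import numpy as np
from flask import Flask, request, jsonify

app = Flask(__name__)

SAVE_DIR = 'frames'                     # hypothetical output folder
os.makedirs(SAVE_DIR, exist_ok=True)

@app.route('/test', methods=['POST'])
def test():
    buf = np.frombuffer(request.data, dtype=np.uint8)
    img = cv2.imdecode(buf, cv2.IMREAD_COLOR)        # decode JPEG bytes back to a BGR image
    if img is not None:
        path = os.path.join(SAVE_DIR, '%.3f.jpg' % time.time())
        cv2.imwrite(path, img)                       # save this frame to the folder
    return jsonify({"saved": img is not None})

if __name__ == '__main__':
    app.run(host='127.0.0.1', port=5000, threaded=True)

If no further processing is needed, the decode step can be skipped and request.data written straight to a .jpg file.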
So I want to create a video stream using imutils' VideoStream and put it on the web. This is the code:
camera_web.py
from flask import Flask, render_template, Response
from imutils.video import VideoStream
from imutils.video import FPS
import cv2

app = Flask(__name__)
vs = VideoStream(src=0).start()

@app.route('/')
def index():
    """ Video streaming home page """
    return render_template('index.html')

def gen():
    rval, frame = vs.read()
    cv2.imwrite('t.jpg', frame)
    yield (b'--frame\r\n'
           b'Content-Type: image/jpeg\r\n\r\n' + open('t.jpg', 'rb').read() + b'\r\n')

@app.route('/video_feed')
def video_feed():
    return Response(gen(), mimetype='multipart/x-mixed-replace; boundary=frame')

if __name__ == '__main__':
    app.run(host='0.0.0.0', debug=True, port=80)
index.html
<html lang="en">
<head>
<meta charset="UTF-8">
<title>Vehicle Counter Web</title>
</head>
<body>
<h1>Vehicle Counter Demo</h1>
<img src="{{ url_for('video_feed') }}">
</body>
</html>
Now when I run it, it returns this error:
[ WARN:0] videoio(MSMF): OnReadSample() is called with error status: -1072875772
[ WARN:0] videoio(MSMF): async ReadSample() call is failed with error status: -1072875772
[ WARN:1] videoio(MSMF): can't grab frame. Error: -1072875772
and it doesn't display any of my video stream.
Is there an error in my code, or does Flask not support imutils' VideoStream? Thanks in advance.
Alright, so I am the one who was just being dumb. The code should be like this:
camera_web.py
from flask import Flask, render_template, Response
from imutils.video import WebcamVideoStream
from imutils.video import FPS
import imutils
import time
import cv2

app = Flask(__name__)

@app.route('/')
def index():
    """ Video streaming home page """
    return render_template('index.html')

def gen():
    vs = WebcamVideoStream(src=1).start()
    time.sleep(2.0)
    while True:
        frame = vs.read()
        frame = imutils.resize(frame, width=500)
        rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
        cv2.imwrite('t.jpg', frame)
        yield (b'--frame\r\n'
               b'Content-Type: image/jpeg\r\n\r\n' + open('t.jpg', 'rb').read() + b'\r\n')

@app.route('/video_feed')
def video_feed():
    return Response(gen(), mimetype='multipart/x-mixed-replace; boundary=frame')

if __name__ == '__main__':
    app.run(host='0.0.0.0', debug=True, port=80)
And there we go! We should now see the video stream.
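As a side note (my own suggestion, not part of the original answer), writing t.jpg to disk for every frame can be avoided by encoding each frame in memory with cv2.imencode:

# Same generator, but encoding each frame in memory instead of
# writing and re-reading t.jpg from disk.
def gen():
    vs = WebcamVideoStream(src=1).start()
    time.sleep(2.0)
    while True:
        frame = vs.read()
        frame = imutils.resize(frame, width=500)
        ok, jpg = cv2.imencode('.jpg', frame)
        if not ok:
            continue                   # skip frames that fail to encode
        yield (b'--frame\r\n'
               b'Content-Type: image/jpeg\r\n\r\n' + jpg.tobytes() + b'\r\n')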