How to send and receive a webcam stream using TCP sockets in Python?

I am trying to recreate this project. What I have is a server (my computer) and a client (my Raspberry Pi). What I am doing differently from the original project is that I am trying to use a simple webcam instead of a Raspberry Pi camera to stream images from my Pi to the server. I know that I must:
Get OpenCV image frames from the camera.
Convert a frame (which is a numpy array) to bytes.
Transfer the bytes from the client to the server.
Convert the bytes back into frames and view.
Examples would be appreciated.
self_driver.py
import SocketServer
import threading
import numpy as np
import cv2
import sys

ultrasonic_data = None

# BaseRequestHandler is used to process incoming requests
class UltrasonicHandler(SocketServer.BaseRequestHandler):
    data = " "

    def handle(self):
        while self.data:
            self.data = self.request.recv(1024)
            ultrasonic_data = float(self.data.split('.')[0])
            print(ultrasonic_data)
# VideoStreamHandler uses streams, which are file-like objects, for communication
class VideoStreamHandler(SocketServer.StreamRequestHandler):

    def handle(self):
        stream_bytes = b''
        try:
            stream_bytes += self.rfile.read(1024)
            image = np.frombuffer(stream_bytes, dtype="B")
            print(image.shape)
            cv2.imshow('F', image)
            cv2.waitKey(0)
        finally:
            cv2.destroyAllWindows()
            sys.exit()
class Self_Driver_Server:

    def __init__(self, host, portUS, portCam):
        self.host = host
        self.portUS = portUS
        self.portCam = portCam

    def startUltrasonicServer(self):
        # Create the ultrasonic server, binding to localhost on port 50001
        server = SocketServer.TCPServer((self.host, self.portUS), UltrasonicHandler)
        server.serve_forever()

    def startVideoServer(self):
        # Create the video server, binding to localhost on port 50002
        server = SocketServer.TCPServer((self.host, self.portCam), VideoStreamHandler)
        server.serve_forever()

    def start(self):
        ultrasonic_thread = threading.Thread(target=self.startUltrasonicServer)
        ultrasonic_thread.daemon = True
        ultrasonic_thread.start()
        self.startVideoServer()


if __name__ == "__main__":
    # From the SocketServer documentation
    HOST, PORTUS, PORTCAM = '192.168.0.18', 50001, 50002
    sdc = Self_Driver_Server(HOST, PORTUS, PORTCAM)
    sdc.start()
video_client.py
import socket
import time
import cv2

client_sock = socket.socket()
client_sock.connect(('192.168.0.18', 50002))

# We are going to 'write' to a file in 'binary' mode
conn = client_sock.makefile('wb')

try:
    cap = cv2.VideoCapture(0)
    cap.set(cv2.cv.CV_CAP_PROP_FRAME_WIDTH, 320)
    cap.set(cv2.cv.CV_CAP_PROP_FRAME_HEIGHT, 240)
    start = time.time()

    while cap.isOpened():
        conn.flush()
        ret, frame = cap.read()
        byteImage = frame.tobytes()
        conn.write(byteImage)

finally:
    finish = time.time()
    cap.release()
    client_sock.close()
    conn.close()

You can't just display every received buffer of 1-1024 bytes as an image; you have to accumulate them and only display an image once your buffer holds a complete frame.
If you know, out of band, that your images are going to be a fixed number of bytes, you can do something like this:
IMAGE_SIZE = 320 * 240 * 3

def handle(self):
    stream_bytes = b''
    try:
        while True:
            # keep reading until at least one complete frame is buffered
            chunk = self.rfile.read(1024)
            if not chunk:
                break
            stream_bytes += chunk
            while len(stream_bytes) >= IMAGE_SIZE:
                image = np.frombuffer(stream_bytes[:IMAGE_SIZE], dtype="B")
                stream_bytes = stream_bytes[IMAGE_SIZE:]
                print(image.shape)
                cv2.imshow('F', image)
                cv2.waitKey(0)
    finally:
        cv2.destroyAllWindows()
        sys.exit()
If you don't know that, you have to add some kind of framing protocol, like sending the frame size as a uint32 before each frame, so the server knows how many bytes to receive for each frame.
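For example, a minimal sketch of that framing (the helper names send_frame/recv_exact/recv_frame are mine, not part of the original project):

import struct

def send_frame(conn, frame_bytes):
    # prefix each frame with its length as a big-endian (network order) uint32
    conn.sendall(struct.pack('!I', len(frame_bytes)) + frame_bytes)

def recv_exact(conn, n):
    # keep calling recv() until exactly n bytes have arrived
    data = b''
    while len(data) < n:
        chunk = conn.recv(n - len(data))
        if not chunk:
            raise ConnectionError("socket closed mid-frame")
        data += chunk
    return data

def recv_frame(conn):
    (length,) = struct.unpack('!I', recv_exact(conn, 4))
    return recv_exact(conn, length)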
Next, if you're just sending the raw bytes, without any dtype or shape or order information, the server needs to have the dtype and shape built in. If you know it's supposed to be, say, bytes in C order in a particular shape, you can do that manually:
image = np.frombuffer(stream_bytes, dtype="B").reshape(240, 320, 3)  # height, width, channels for a 320x240 capture
… but if not, you have to send that information as part of your framing protocol as well.
Alternatively, you could send a pickle.dumps of the buffer and pickle.loads it on the other side, or np.save to a BytesIO and np.load the result. Either way, that includes the dtype, shape, order, and stride information as well as the raw bytes, so you don't have to worry about it.
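A minimal sketch of the np.save route (function names are illustrative; the framing above is still needed to get each payload across intact):

import io
import numpy as np

def frame_to_bytes(frame):
    # np.save records dtype, shape, and order alongside the raw data
    buf = io.BytesIO()
    np.save(buf, frame)
    return buf.getvalue()

def bytes_to_frame(payload):
    return np.load(io.BytesIO(payload))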
The next problem is that you're exiting as soon as you display one image. Is that really what you want? If not… just don't do that.
But that just raises another problem. Do you really want to block the whole server with that cv2.waitKey? Your client is capturing images and sending them as fast as it can; surely you either want to make the server display them as soon as they arrive, or change the design so the client only sends frames on demand. Otherwise, you're just going to get a bunch of near-identical frames, then a many-seconds-long gap while the client is blocked waiting for you to drain the buffer, and then repeat.
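If displaying frames as soon as they arrive is the goal, one common pattern (my illustration, not part of the original answer) is to poll with a 1 ms waitKey instead of blocking:

import cv2

def show_frames(frames):
    # frames: any iterable of decoded image arrays
    for frame in frames:
        cv2.imshow('F', frame)
        if cv2.waitKey(1) & 0xFF == ord('q'):  # wait ~1 ms; press 'q' to stop
            break
    cv2.destroyAllWindows()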

Related

error when sending video feed bytes using socket

I am using sockets to send video feed bytes from a server to a client. The video feed is being captured using OpenCV. The method I am using works for a couple of seconds and then stops with this error: OSError: [WinError 10040] A message sent on a datagram socket was larger than the internal message buffer or some other network limit, or the buffer used to receive a datagram into was smaller than the datagram itself. Where did I go wrong and how can I fix it? Thanks in advance.
HOST
import cv2
import socket
import pickle

s_stream = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
s_stream.setsockopt(socket.SOL_SOCKET, socket.SO_SNDBUF, 10000000)
streamIp = "192.168.3.5"
streamPort = 8787
camera = cv2.VideoCapture(0)

while True:
    ret, img = camera.read()
    ret, buffer = cv2.imencode(
        '.jpg', img, [int(cv2.IMWRITE_JPEG_QUALITY), 30])
    x_as_bytes = pickle.dumps(buffer)
    s_stream.sendto(x_as_bytes, (streamIp, streamPort))
CLIENT
import cv2, socket, pickle

s_stream = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
streamIp = "192.168.3.5"
streamPort = 8787
s_stream.bind((streamIp, streamPort))

while True:
    x = s_stream.recvfrom(10000000)
    clientip = x[1][0]
    data = x[0]
    data = pickle.loads(data)
    print(data)
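For what it's worth, WinError 10040 means a single datagram exceeded what the receiver's buffer (or the network) can carry; a UDP payload tops out around 65,507 bytes. A minimal guard on the sending side, assuming the same setup as the HOST code above, is to check the pickled size before sendto and skip frames that are too large:

import cv2, pickle, socket

MAX_DGRAM = 65000  # stay safely under the ~65507-byte limit for one UDP payload

s_stream = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
camera = cv2.VideoCapture(0)

while True:
    ret, img = camera.read()
    if not ret:
        break
    ret, buffer = cv2.imencode('.jpg', img, [int(cv2.IMWRITE_JPEG_QUALITY), 30])
    payload = pickle.dumps(buffer)
    if len(payload) <= MAX_DGRAM:  # drop any frame too big for a single datagram
        s_stream.sendto(payload, ("192.168.3.5", 8787))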

Python sockets: received frames queue management

I'm creating an instrument panel for a flight simulator. The sim sends out data via UDP, and the panel parses this data and uses it to transform and render an SVG. The panel runs on Linux.
The sim sends out UDP frames faster than the panel can render, so the frames queue up, and the panel ends up lagging several seconds behind the sim. I haven't found any advice on how to configure or manage this queue beyond the MSG_PEEK flag, which does the exact opposite of what I want.
Each UDP frame contains the full set of key/value pairs, so ideally I'd discard any older frames in the queue when a new one is received.
How might I do any one of these:
read the next frame and discard any later frames in the queue
read the most recent frame and discard the rest
set the queue size to 1 or a small number
I'm sure there's a straightforward way to do this but an hour or so of searching and experimentation hasn't come up with a solution.
Here's a representation of the current panel code:
import socket

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("", 54321))
sock.settimeout(0.5)

while True:
    try:
        frame = sock.recv(1024)
        parse_and_render(frame)
    except socket.timeout:
        pass
This is ugly but works:
import socket

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("", 54321))
sock.setblocking(False)

frame = bytearray(1024)
extra_frame = bytearray(1024)

while True:
    try:
        # load the next available frame
        sock.recv_into(frame, 1024)
        # if there are any more frames queued, keep only the newest one
        try:
            while True:
                sock.recv_into(extra_frame, 1024)
                frame, extra_frame = extra_frame, frame
        except BlockingIOError:
            pass
        # render the most recent frame
        parse_and_render(frame)
    except BlockingIOError:
        pass
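There is no exact queue-length knob for the third option, but one approximation (my assumption, not something from the original post) is to shrink the socket receive buffer with SO_RCVBUF so the kernel holds fewer pending datagrams. Linux doubles the requested value and enforces a minimum, so treat this as best-effort:

import socket

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# ask the kernel to buffer only a couple of KB of unread datagrams
sock.setsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF, 2048)
sock.bind(("", 54321))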

loading image through socket in bytestring takes forever in python

I am wondering how I can send an image through a Python socket.
I have already tried it with pickle, sending the byte string piece by piece. However, that takes forever.
Here is what I have tried.
my server code:
import socket
from PIL import Image
import pickle

host = ""
port = 80
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("", port))
server.listen()
path = "path"
image = Image.open(path)

def acc():
    while True:
        conn, addr = server.accept()
        print("connected to %s" % (conn))
        conn.send(pickle.dumps(image))

acc()
my client code:
import socket
import pickle

host = "192.168.1.11"
port = 80
c = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
c.connect((host, port))

while True:
    data = []
    while True:
        packet = c.recv(100000)
        if not packet:
            break
        data.append(packet)
    data_arr = b"".join(data)
    print(pickle.loads(data_arr))
If the answer doesn't use pickle or PIL, that is fine; I just need a way that works. I'm looking forward to answers!
import socket, time

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("10.0.0.9", 2000))
server.listen()

def acc(image_Path):
    with open(image_Path, "rb") as image:
        data = image.read()  # Read the bytes from the path
    while True:
        conn, addr = server.accept()
        print("connected to %s" % (conn))
        conn.sendall(data)  # Send the bytes

acc("The path of the image to transform")
import socket
import pickle

host = "10.0.0.9"
port = 2000
c = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
c.connect((host, port))

while True:
    data = c.recv(100000000)  # We don't know the size of the image, so 100000000 just in case
    with open("The path of the location of the image that recived", "wb") as newImage:
        newImage.write(data)  # Write the bytes to a new path to create the image
    print("Got the image")
The answer is neither PIL nor Pickle!
If you have a 31kB JPEG of a 1920x1080 image and you open it with PIL, it will get JPEG-decompressed and expanded out into a 1920x1080x3 RGB image in memory - which will require a minimum of 6,220,800 bytes, i.e. you have increased its size by 200 times. Moral of the story is to send the (compressed) JPEG itself.
If you then pickle that 6MB monster, it will a) take longer and b) get bigger, and it is entirely unnecessary because you can send binary data down a socket anyway.
The easiest way is to:
simply read() the JPEG/PNG file with Python's built-in open()/read() rather than decoding it with PIL's Image.open()
do not pickle it
send a 4-byte header in network order (see htonl()) with the image size before your image and, on the receiving end, read 4 bytes and unpack them, then read the correct number of bytes for the image
put the received bytes into a BytesIO and then use PIL's Image.open(BYTESIO_THING), as sketched below
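A hedged sketch of those four steps (the helper names are mine; struct.pack('!I', ...) produces the same network byte order as htonl()):

import io
import struct
from PIL import Image

def send_image(sock, path):
    # send the compressed JPEG/PNG bytes as-is: no PIL decode, no pickle
    with open(path, "rb") as f:
        data = f.read()
    sock.sendall(struct.pack('!I', len(data)))  # 4-byte size header, network order
    sock.sendall(data)

def recv_exact(sock, n):
    data = b''
    while len(data) < n:
        chunk = sock.recv(n - len(data))
        if not chunk:
            raise ConnectionError("socket closed early")
        data += chunk
    return data

def recv_image(sock):
    (size,) = struct.unpack('!I', recv_exact(sock, 4))
    return Image.open(io.BytesIO(recv_exact(sock, size)))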

Sending frames using sockets in Python

I'm trying to get frames from the client, send them to the server, and from there write them into a video. But I keep failing at the sending part, getting a TypeError: Expected Ptr<cv::UMat> for argument '%s' error from out.write(frame).
I've also tried using pickle.dumps(frame) and then loading it in server side but it keeps getting truncated.
Server:
import numpy as np
import cv2, socket

fourcc = cv2.VideoWriter_fourcc(*'XVID')
out = cv2.VideoWriter("output.avi", fourcc, 19.0, (1366, 768))

s = socket.socket()
host = socket.gethostname()
port = 8080
s.bind((host, port))
s.listen(1)
print(host)
print("Waiting for any incoming connections ... ")
conn, addr = s.accept()
print(addr, "Has connected to the server")

while True:
    frame = conn.recv(1024)
    # write frame to video writer
    out.write(frame)
    if cv2.waitKey(1) == 27:
        break

out.release()
cv2.destroyAllWindows()
Client:
import numpy as np
import cv2, socket
from PIL import ImageGrab

s = socket.socket()
host = input(str("Please enter the host address of the sender : "))
port = 8080
s.connect((host, port))
print("Connected ... ")

while True:
    img = ImageGrab.grab()
    img_np = np.array(img)
    frame = img_np
    s.send(frame)
Apparently, on the server, frame becomes <class 'bytes'>. So I'm trying to find a way to fix this, either by converting the bytes back into an ndarray or by any other workaround.
Thanks.
Let's separate your question into two parts:
How to send data over a socket?
You are using recv() with a 1024-byte buffer, which means each call returns at most 1024 bytes of data.
When working at this low level, you should either put a unique end-of-frame token at the end of each frame and keep calling .recv() on the server side until you reach it, or send the length of your message first and count the received bytes. Either way, you know when you have a complete frame; then you can break out of the loop, convert it to a numpy array, and .write() it.
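For instance, a minimal sketch of the end-token approach (END_TOKEN is illustrative and must be a byte sequence that can never occur inside a frame, which is why the length-prefix variant is usually safer):

END_TOKEN = b'--end--'  # illustrative delimiter; must never occur inside a frame

def recv_frames(conn):
    # yield complete frames, buffering partial data between recv() calls
    buf = b''
    while True:
        chunk = conn.recv(1024)
        if not chunk:
            return
        buf += chunk
        while END_TOKEN in buf:
            frame, _, buf = buf.partition(END_TOKEN)
            yield frame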
How to pass a numpy array over the network?
You can serialize it with np.save() into an io.BytesIO stream, send those bytes, and load them on the other side with np.load() (or use pickle.dumps()/pickle.loads(), which also preserves the array metadata).
You can also send the raw frame pixels with frame.tobytes(), then rebuild the array on the other side with np.frombuffer(..., dtype=...) and reshape it, as long as both sides agree on the dtype and shape.
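A small sketch of that raw-pixel route, assuming both sides have agreed on the dtype and shape (the 480x640x3 uint8 values here are illustrative):

import numpy as np

SHAPE = (480, 640, 3)  # illustrative: rows, cols, channels agreed by both sides
DTYPE = np.uint8

def encode(frame):
    return frame.tobytes()  # raw pixel bytes in C order, no metadata

def decode(payload):
    return np.frombuffer(payload, dtype=DTYPE).reshape(SHAPE)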

How do I implement a raw frame capture using Python 3.3?

Note: I'm not sure if this is a programming issue, or a hardware/OS specific related issue.
I'm trying to capture raw Ethernet frames using Python 3.3's socket class. Looking directly at the example from the Python docs:
import socket
import struct

# CAN frame packing/unpacking (see 'struct can_frame' in <linux/can.h>)
can_frame_fmt = "=IB3x8s"
can_frame_size = struct.calcsize(can_frame_fmt)

def build_can_frame(can_id, data):
    can_dlc = len(data)
    data = data.ljust(8, b'\x00')
    return struct.pack(can_frame_fmt, can_id, can_dlc, data)

def dissect_can_frame(frame):
    can_id, can_dlc, data = struct.unpack(can_frame_fmt, frame)
    return (can_id, can_dlc, data[:can_dlc])

# create a raw socket and bind it to the 'vcan0' interface
s = socket.socket(socket.AF_CAN, socket.SOCK_RAW, socket.CAN_RAW)
s.bind(('vcan0',))

while True:
    cf, addr = s.recvfrom(can_frame_size)
    print('Received: can_id=%x, can_dlc=%x, data=%s' % dissect_can_frame(cf))
    try:
        s.send(cf)
    except OSError:
        print('Error sending CAN frame')
    try:
        s.send(build_can_frame(0x01, b'\x01\x02\x03'))
    except OSError:
        print('Error sending CAN frame')
I get the following error:
OSError: [Errno 97] Address family not supported by protocol.
breaking at this specific line:
s = socket.socket(socket.AF_CAN, socket.SOCK_RAW, socket.CAN_RAW)
The only change I made to the code was the actual interface name (i.e. 'em1'). I'm using Fedora 15.
Looking further into the Python source code, it appears that the AF_CAN (address family) and the CAN_RAW (protocol) aren't the correct pair.
How do I capture raw ethernet frames for further processing?
Ultimately, what I need to be able to do is capture raw Ethernet frames and process them as they come into the system.
I was finally able to do this with the following:
import socket
import struct
import time

# 0x0003 is ETH_P_ALL: receive frames for every protocol (byte-swapped for the kernel)
s = socket.socket(socket.AF_PACKET, socket.SOCK_RAW, socket.ntohs(0x0003))

while True:
    now = time.time()
    message = s.recv(4096)
    # Process the message from here
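From there, one way to start processing (a sketch, not part of the original answer): the first 14 bytes of each message are the Ethernet header, which struct can unpack into destination MAC, source MAC, and EtherType:

import struct

def mac_to_str(mac_bytes):
    return ':'.join('%02x' % b for b in mac_bytes)

def parse_ethernet_header(message):
    # 6-byte destination MAC, 6-byte source MAC, 2-byte EtherType, then payload
    dst, src, ethertype = struct.unpack('!6s6sH', message[:14])
    return mac_to_str(dst), mac_to_str(src), ethertype, message[14:]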
