Sending frames using sockets in Python

I'm trying to grab frames on the client, send them to the server, and write them into a video there. But I keep failing at the sending part, getting TypeError: Expected Ptr<cv::UMat> for argument '%s' in out.write(frame).
I've also tried pickle.dumps(frame) and loading it on the server side, but it keeps getting truncated.
Server:
import numpy as np
import cv2, socket
fourcc = cv2.VideoWriter_fourcc(*'XVID')
out = cv2.VideoWriter("output.avi", fourcc, 19.0, (1366, 768))
s = socket.socket()
host = socket.gethostname()
port = 8080
s.bind((host,port))
s.listen(1)
print(host)
print("Waiting for any incoming connections ... ")
conn, addr = s.accept()
print(addr, "Has connected to the server")
while True:
    frame = conn.recv(1024)
    # write frame to video writer
    out.write(frame)
    if cv2.waitKey(1) == 27:
        break
out.release()
cv2.destroyAllWindows()
Client:
import numpy as np
import cv2, socket
from PIL import ImageGrab
s = socket.socket()
host = input(str("Please enter the host address of the sender : "))
port = 8080
s.connect((host,port))
print("Connected ... ")
while True:
    img = ImageGrab.grab()
    img_np = np.array(img)
    frame = img_np
    s.send(frame)
Apparently, on the server, frame arrives as <class 'bytes'>. So I'm trying to find a way to fix this, whether by converting the bytes back into an ndarray or by some other workaround.
Thanks.

Let's separate your question into two parts:
How to send data over a socket?
You are reading from the socket with a 1024-byte buffer, which means every call to .recv() returns at most 1024 bytes, so a single call can never contain a whole frame.
When working with low-level networking like this, you should either put a unique end-of-frame token at the end of each frame and keep calling .recv() on the server side until you reach it, or send the length of your message first and count the received bytes. Either way, you know when you have a complete frame; then you can break the loop, convert the bytes to a numpy array, and .write() it.
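As a minimal sketch of the length-prefix approach (the 4-byte '!I' header and the recv_exact helper are illustrative choices, not part of the original code):
import struct

def send_frame(sock, frame_bytes):
    # Prefix the payload with its length as a 4-byte big-endian unsigned int
    sock.sendall(struct.pack('!I', len(frame_bytes)) + frame_bytes)

def recv_exact(sock, n):
    # Keep calling recv() until exactly n bytes have been collected
    buf = b''
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed mid-frame")
        buf += chunk
    return buf

def recv_frame(sock):
    (length,) = struct.unpack('!I', recv_exact(sock, 4))
    return recv_exact(sock, length)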
How to pass numpy array over network?
You can serialize the array with np.save() into an io.BytesIO stream, send those bytes, and load the stream back with the np.load() function on the other side (plain pickle works the same way).
You can also serialize just the frame pixels as an array of your pixel type, read them from the socket into a buffer, and then rebuild the numpy array with np.frombuffer(..., dtype=...), provided you know the dtype and shape in advance.
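A small sketch of the np.save()/np.load() route; how the bytes travel over the socket (e.g. with the length prefix above) is up to you:
import io
import numpy as np

def array_to_bytes(arr):
    buf = io.BytesIO()
    np.save(buf, arr)  # .npy format: stores dtype, shape and data together
    return buf.getvalue()

def bytes_to_array(data):
    return np.load(io.BytesIO(data))  # restores the exact dtype and shape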

Related

error when sending video feed bytes using socket

I am using sockets to send video feed bytes from a server to a client. The video feed is captured using OpenCV. But the method I am using works for a couple of seconds and then stops with the error OSError: [WinError 10040] A message sent on a datagram socket was larger than the internal message buffer or some other network limit, or the buffer used to receive a datagram into was smaller than the datagram itself. Where did I go wrong and how can I fix it? Thanks in advance.
HOST
import cv2
import socket
import pickle
s_stream = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
s_stream.setsockopt(socket.SOL_SOCKET, socket.SO_SNDBUF, 10000000)
streamIp = "192.168.3.5"
streamPort = 8787
camera = cv2.VideoCapture(0)
while True:
    ret, img = camera.read()
    ret, buffer = cv2.imencode(
        '.jpg', img, [int(cv2.IMWRITE_JPEG_QUALITY), 30])
    x_as_bytes = pickle.dumps(buffer)
    s_stream.sendto(x_as_bytes, (streamIp, streamPort))
CLIENT
import cv2, socket, pickle
s_stream = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
streamIp = "192.168.3.5"
streamPort = 8787
s_stream.bind((streamIp, streamPort))
while True:
    x = s_stream.recvfrom(10000000)
    clientip = x[1][0]
    data = x[0]
    data = pickle.loads(data)
    print(data)
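(For context: a single UDP datagram can carry at most 65,507 bytes of payload, so any frame whose pickled JPEG exceeds that raises WinError 10040 no matter how large SO_SNDBUF is. A hedged sketch of one workaround for the host loop above, sending the JPEG bytes directly and skipping frames that are still too big; the 65000 threshold is an illustrative safety margin:)
MAX_DATAGRAM = 65000  # stay under the 65,507-byte UDP payload limit

while True:
    ret, img = camera.read()
    if not ret:
        break
    ret, buffer = cv2.imencode('.jpg', img, [int(cv2.IMWRITE_JPEG_QUALITY), 30])
    payload = buffer.tobytes()  # raw JPEG bytes, no pickle needed
    if len(payload) <= MAX_DATAGRAM:
        s_stream.sendto(payload, (streamIp, streamPort))
    # else: drop the frame or re-encode at a lower JPEG quality
On the receiving side, cv2.imdecode(np.frombuffer(data, np.uint8), cv2.IMREAD_COLOR) turns the JPEG bytes back into a frame.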

how to send numpy array over python socket

Hey, good day!
I implemented a simple program to send a numpy array over Python sockets.
This is server.py
import socket
import numpy as np
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.bind((socket.gethostname(), 1024))
s.listen(5)
print('Server is ready...')
while True:
    client, adr = s.accept()
    print(f'Connection to {adr} established')
    myarray = np.array([[1,2],[3,4]])
    client.send(myarray)
    client.close()
This is client.py
import socket
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.connect((socket.gethostname(), 1024))
getarray = s.recv(100)
print(getarray)
I want to send myarray in server.py to client.py
I want to get the myarray in client.py 100% similar to the myarray in server.py
But when I run server.py and client.py, client.py's output is this:
b'\x01\x00\x00\x00\x02\x00\x00\x00\x03\x00\x00\x00\x04\x00\x00\x00'
I don't know which encoding (ASCII or UTF-8) to use to decode those bytes.
How can I decode them?
Thank you!
You could use pickle to encode the array object into bytes, and then send that.
Depending on the size of the array, though, you may need to change the way you receive it: s.recv() only gets a certain number of bytes at a time, so you would need to send the size of the pickled array first and then repeat s.recv() until you have all of it. pickle.dumps(obj) returns the byte stream of the object, and pickle.loads(data) returns the original object once all of it has been received.
Hope that helps :)
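A rough sketch of that idea, reusing the struct-based length prefix shown in the first answer above (names are illustrative):
import pickle
import struct

def recv_exact(sock, n):
    buf = b''
    while len(buf) < n:
        buf += sock.recv(n - len(buf))
    return buf

# server side: length-prefix the pickled array
payload = pickle.dumps(myarray)
client.sendall(struct.pack('!I', len(payload)) + payload)

# client side: read the length, then exactly that many bytes
(size,) = struct.unpack('!I', recv_exact(s, 4))
getarray = pickle.loads(recv_exact(s, size))  # identical to myarray, dtype and shape included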

loading image through socket in bytestring takes forever in python

I am wondering how I can send an image through a Python socket.
I have already tried it with pickle, sending the byte string piece by piece, but that takes forever.
Here is what I have tried.
my server code:
import socket
from PIL import Image
import pickle
host=""
port=80
server=socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("", port))
server.listen()
path=("path")
image=Image.open(path)
def acc():
while True:
conn, addr=server.accept()
print("connected to %s" %(conn))
conn.send(pickle.dumps(image))
acc()
my client code:
import socket
import pickle
host="192.168.1.11"
port=80
c=socket.socket(socket.AF_INET, socket.SOCK_STREAM)
c.connect((host, port))
while True:
    data = []
    while True:
        packet = c.recv(100000)
        if not packet:
            break
        data.append(packet)
    data_arr = b"".join(data)
    print(pickle.loads(data_arr))
If the answer doesn't involve pickle or PIL, that's fine. I just need a way that works. I'm looking forward to your answers!
import socket, time

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("10.0.0.9", 2000))
server.listen()

def acc(image_Path):
    with open(image_Path, "rb") as image:
        data = image.read()  # Read the bytes from the path
    while True:
        conn, addr = server.accept()
        print("connected to %s" % (conn))
        conn.sendall(data)  # Send the bytes

acc("The path of the image to transform")
import socket

host = "10.0.0.9"
port = 2000
c = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
c.connect((host, port))
while True:
    data = c.recv(100000000)  # We don't know the size of the image, so 100000000 just in case
    with open("The path of the location of the image that recived", "wb") as newImage:
        newImage.write(data)  # Write the bytes in new path to create the image
    print("Got the image")
The answer is neither PIL nor Pickle!
If you have a 31kB JPEG of a 1920x1080 image and you open it with PIL, it will get JPEG-decompressed and expanded out into a 1920x1080x3 RGB image in memory - which will require a minimum of 6,220,800 bytes, i.e. you have increased its size by 200 times. Moral of the story is to send the (compressed) JPEG itself.
If you then pickle that 6MB monster, it will a) take longer and b) get bigger, and it is entirely unnecessary because you can send binary data down a socket anyway.
The easiest way is to (see the sketch after this list):
simply read() the JPEG/PNG file with Python's built-in open()/read() rather than decoding it through PIL's Image.open()
do not pickle it
send a 4-byte header in network order (see htonl()) with the image size before your image and, on the receiving end, read 4 bytes and unpack them, then read the correct number of bytes of the image
put the received bytes into a BytesIO and then use PIL Image.open(BYTESIO_THING)
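A minimal sketch of that recipe (paths, host and port are placeholders):
import io
import socket
import struct
from PIL import Image

def send_image(sock, path):
    with open(path, "rb") as f:
        data = f.read()  # the compressed JPEG/PNG bytes: no PIL, no pickle
    sock.sendall(struct.pack('!I', len(data)) + data)  # 4-byte network-order size header

def recv_image(sock):
    header = b''
    while len(header) < 4:
        header += sock.recv(4 - len(header))
    (size,) = struct.unpack('!I', header)
    data = b''
    while len(data) < size:
        data += sock.recv(size - len(data))
    return Image.open(io.BytesIO(data))  # decompress only at the very end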

Pickle data was truncated (desktop streaming)

I'm trying to build a desktop streaming app. It consists of a server and a client for now. I learned that I should use the pickle library to serialize/deserialize the data. However, when I run both scripts, I get the error "Pickle data was truncated" on the client side. Could you help me solve this? I tried the solution in the following link, whose OP was apparently trying to do a similar thing, but it didn't work:
python 3.6 socket pickle data was truncated
Server
import numpy as np
import cv2
from PIL import ImageGrab
import socket
import pickle
HOST = "0.0.0.0"
SOCKET = 5000
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.bind((HOST,SOCKET))
while True:
    s.listen(5)
    client, addres = s.accept()
    print(addres, " has connected")
    img = ImageGrab.grab()
    img_np = np.array(img)
    img_np_serial = pickle.dumps(img_np)
    client.send(img_np_serial)
    if cv2.waitKey(1) == 27:
        break
cv2.destroyAllWindows()
Client
import socket
import pickle
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.connect((socket.gethostbyname(socket.gethostname()),5000))
data = b""
while True:
packet = s.recv(4096)
if not packet: break
data += packet
data_deserial = pickle.loads(data)
print((data_deserial))
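(The truncation comes from calling pickle.loads() on a partially received buffer. A hedged sketch of the usual fix for this pair, framing each pickled screenshot with a length prefix as in the first answer above:)
import struct

def recv_exact(sock, n):
    buf = b''
    while len(buf) < n:
        buf += sock.recv(n - len(buf))
    return buf

# server side, inside the loop
img_np_serial = pickle.dumps(img_np)
client.sendall(struct.pack('!I', len(img_np_serial)) + img_np_serial)

# client side, inside the loop
(size,) = struct.unpack('!I', recv_exact(s, 4))
data_deserial = pickle.loads(recv_exact(s, size))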

How to send and receive webcam stream using tcp sockets in Python?

I am trying to recreate this project. What I have is a server (my computer), and a client (my raspberry pi). What I am doing differently than the original project is that I am trying to use a simple webcam instead of a raspberry pi camera to stream images from my rpi to the server. I know that I must:
Get OpenCV image frames from the camera.
Convert a frame (which is a numpy array) to bytes.
Transfer the bytes from the client to the server.
Convert the bytes back into frames and view them.
Examples would be appreciated.
self_driver.py
import SocketServer
import threading
import numpy as np
import cv2
import sys

ultrasonic_data = None

# BaseRequestHandler is used to process incoming requests
class UltrasonicHandler(SocketServer.BaseRequestHandler):
    data = " "

    def handle(self):
        while self.data:
            self.data = self.request.recv(1024)
            ultrasonic_data = float(self.data.split('.')[0])
            print(ultrasonic_data)

# VideoStreamHandler uses streams which are file-like objects for communication
class VideoStreamHandler(SocketServer.StreamRequestHandler):
    def handle(self):
        stream_bytes = b''
        try:
            stream_bytes += self.rfile.read(1024)
            image = np.frombuffer(stream_bytes, dtype="B")
            print(image.shape)
            cv2.imshow('F', image)
            cv2.waitKey(0)
        finally:
            cv2.destroyAllWindows()
            sys.exit()

class Self_Driver_Server:
    def __init__(self, host, portUS, portCam):
        self.host = host
        self.portUS = portUS
        self.portCam = portCam

    def startUltrasonicServer(self):
        # Create the Ultrasonic server, binding to localhost on port 50001
        server = SocketServer.TCPServer((self.host, self.portUS), UltrasonicHandler)
        server.serve_forever()

    def startVideoServer(self):
        # Create the video server, binding to localhost on port 50002
        server = SocketServer.TCPServer((self.host, self.portCam), VideoStreamHandler)
        server.serve_forever()

    def start(self):
        ultrasonic_thread = threading.Thread(target=self.startUltrasonicServer)
        ultrasonic_thread.daemon = True
        ultrasonic_thread.start()
        self.startVideoServer()

if __name__ == "__main__":
    # From SocketServer documentation
    HOST, PORTUS, PORTCAM = '192.168.0.18', 50001, 50002
    sdc = Self_Driver_Server(HOST, PORTUS, PORTCAM)
    sdc.start()
video_client.py
import socket
import time
import cv2

client_sock = socket.socket()
client_sock.connect(('192.168.0.18', 50002))
# We are going to 'write' to a file in 'binary' mode
conn = client_sock.makefile('wb')
try:
    cap = cv2.VideoCapture(0)
    cap.set(cv2.cv.CV_CAP_PROP_FRAME_WIDTH, 320)
    cap.set(cv2.cv.CV_CAP_PROP_FRAME_HEIGHT, 240)
    start = time.time()
    while cap.isOpened():
        conn.flush()
        ret, frame = cap.read()
        byteImage = frame.tobytes()
        conn.write(byteImage)
finally:
    finish = time.time()
    cap.release()
    client_sock.close()
    conn.close()
You can't just display every received buffer of 1-1024 bytes as an image; you have to concatenate them up and only display an image when your buffer is complete.
If you know, out of band, that your images are going to be a fixed number of bytes, you can do something like this:
IMAGE_SIZE = 320*240*3

def handle(self):
    stream_bytes = b''
    try:
        while True:
            # keep accumulating until at least one full frame is buffered
            stream_bytes += self.rfile.read(1024)
            while len(stream_bytes) >= IMAGE_SIZE:
                image = np.frombuffer(stream_bytes[:IMAGE_SIZE], dtype="B")
                stream_bytes = stream_bytes[IMAGE_SIZE:]
                print(image.shape)
                cv2.imshow('F', image)
                cv2.waitKey(0)
    finally:
        cv2.destroyAllWindows()
        sys.exit()
If you don't know that, you have to add some kind of framing protocol, like sending the frame size as a uint32 before each frame, so the server knows how many bytes to receive for each frame.
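A hedged sketch of that framing protocol for this client/server pair (struct stands in for htonl-style packing):
import struct

# client side: prefix each frame with its size
byteImage = frame.tobytes()
conn.write(struct.pack('!I', len(byteImage)))
conn.write(byteImage)

# server side, inside handle(): read the header, then exactly that many bytes
header = self.rfile.read(4)
(size,) = struct.unpack('!I', header)
image_bytes = self.rfile.read(size)  # buffered rfile.read(n) returns n bytes unless EOF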
Next, if you're just sending the raw bytes, without any dtype, shape, or order information, you need to embed that knowledge in the server. If you know it's supposed to be, say, bytes in C order in a particular shape, you can do that manually:
image = np.frombuffer(stream_bytes, dtype="B").reshape(240, 320, 3)  # height, width, channels
… but if not, you have to send that information as part of your framing protocol as well.
Alternatively, you could send a pickle.dumps of the buffer and pickle.loads it on the other side, or np.save to a BytesIO and np.load the result. Either way, that includes the dtype, shape, order, and stride information as well as the raw bytes, so you don't have to worry about it.
The next problem is that you're exiting as soon as you display one image. Is that really what you want? If not… just don't do that.
But that just raises another problem. Do you really want to block the whole server with that cv.waitKey? Your client is capturing images and sending them as fast as it can; surely you either want to make the server display them as soon as they arrive, or change the design so the client only sends frames on demand. Otherwise, you're just going to get a bunch of near-identical frames, then a many-seconds-long gap while the client is blocked waiting for you to drain the buffer, then repeat.
