Google Speech API from client mic stream: data format - Python

I'm trying to adapt this Java codelab to Python / Socket.IO (Flask-SocketIO on the server side).
My socket is working: I'm able to pass both the sample rate (and retrieve it on the server side) and the audio data.
The problem is that the format of my audio data is not correct. I send it this way:
// Create a node that sends raw bytes across the websocket
var scriptNode = context.createScriptProcessor(4096, 1, 1);
// Need the maximum value for 16-bit signed samples, to convert from float.
const MAX_INT = Math.pow(2, 16 - 1) - 1;
scriptNode.addEventListener('audioprocess', function(e) {
  var floatSamples = e.inputBuffer.getChannelData(0);
  // The samples are floats in range [-1, 1]. Convert to 16-bit signed
  // integer.
  socket.emit('audiodata', Int16Array.from(floatSamples.map(function(n) {
    return n * MAX_INT;
  })));
});
This is the code from the codelab, but I use socket.emit instead of socket.send.
The Python code that handles the data and sends it to the Speech API is this one:
credentials = service_account.Credentials.from_service_account_file(CONFIG[u"service_account"])
client = speech.SpeechClient(credentials=credentials)

config = types.RecognitionConfig(
    encoding=enums.RecognitionConfig.AudioEncoding.LINEAR16,
    sample_rate_hertz=44100,
    language_code=u"fr"
)
streaming_config = types.StreamingRecognitionConfig(
    config=config,
    interim_results=True,
    single_utterance=False)

@socketio.on(u'audiodata')
def handle_audio_data_msg(audio_data_msg):
    print u'Received raw audio.'
    print audio_data_msg
    for response in client.streaming_recognize(streaming_config, [audio_data_msg]):
        print response
    emit(u'audiodatareceived', {u"AudioData": u"Acquired"})
I got TypeError: descriptor 'SerializeToString' requires a 'google.protobuf.pyext._message.CMessage' object but received a 'dict',
so I tried, on the front-end side, to pass Int16Array.from(...).buffer instead, but then I got TypeError: descriptor 'SerializeToString' requires a 'google.protobuf.pyext._message.CMessage' object but received a 'str'.
So I'm not sure how I should pass the data... Any help appreciated!
[edit]
I rewrote my handler because I think I was not using the right type; it now looks like this:
@socketio.on(u'audiodata')
def handle_audio_data_msg(audio_data_msg):
    print u'Received raw audio.'
    request = types.StreamingRecognizeRequest(audio_content=audio_data_msg)
    response = client.streaming_recognize(streaming_config, [request])
    for item in response:
        print u"error", item.get(u"error")
        print u"results", item.get(u"results")
        print u"resultIndex", item.get(u"resultIndex")
        print u"endpointerType", item.get(u"endpointerType")
    emit(u'audiodatareceived', {u"AudioData": u"Acquired"})
I don't get any error, but I also get no response (an empty result), so nothing is printed...
[edit2]: I realized that the type of my input is never binary data but str. Could the problem come from there? I tried to instantiate my Flask-SocketIO app as follows, but the type is still str.
socketio = SocketIO(app, binary=True)
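For reference, this is the kind of normalization I'm experimenting with on the server side (the helper name and the fallback branches below are my own guesses, not something from the codelab):

import array

def to_linear16_bytes(audio_data_msg):
    # Socket.IO may deliver the Int16Array as raw bytes, or as a list of ints.
    if isinstance(audio_data_msg, (list, tuple)):
        return array.array('h', audio_data_msg).tostring()  # pack int16 samples
    return bytes(audio_data_msg)

@socketio.on(u'audiodata')
def handle_audio_data_msg(audio_data_msg):
    chunk = to_linear16_bytes(audio_data_msg)
    requests = [types.StreamingRecognizeRequest(audio_content=chunk)]
    for response in client.streaming_recognize(streaming_config, requests):
        print response
    emit(u'audiodatareceived', {u"AudioData": u"Acquired"})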

Related

Sending a payload to IoT Hub for use in Azure Digital Twins using an Azure Function

Apologies for any incorrect formatting; it's been a long time since I posted anything on Stack Overflow.
I'm looking to send a JSON payload of data to Azure IoT Hub, which I am then going to process using an Azure Function App to display real-time telemetry data in Azure Digital Twins.
I'm able to post the payload to IoT Hub and view it using the explorer fine, however my function is unable to take this and display the telemetry data in Azure Digital Twins. From Googling I've found that the JSON needs to be utf-8 encoded and the content type set to application/json, which I think might be the problem with my current attempt at fixing this.
I've included a snippet of the log stream from my Azure Function App below; as shown, the "body" part of the message is scrambled, which is why I think it may be an issue with how the payload is encoded:
"iothub-message-source":"Telemetry"},"body":"eyJwb3dlciI6ICIxLjciLCAid2luZF9zcGVlZCI6ICIxLjciLCAid2luZF9kaXJlY3Rpb24iOiAiMS43In0="}
2023-01-27T13:39:05Z [Error] Error in ingest function: Cannot access child value on Newtonsoft.Json.Linq.JValue.
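Decoding that body by hand shows it is just base64 of the JSON payload, e.g.:

import base64, json

body = "eyJwb3dlciI6ICIxLjciLCAid2luZF9zcGVlZCI6ICIxLjciLCAid2luZF9kaXJlY3Rpb24iOiAiMS43In0="
print(json.loads(base64.b64decode(body)))
# should print: {'power': '1.7', 'wind_speed': '1.7', 'wind_direction': '1.7'}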
My current test code is below for sending payloads to IoT Hub, with the potential issue being that I'm not encoding the payload properly.
import datetime, requests
import json
deviceID = "JanTestDT"
IoTHubName = "IoTJanTest"
iotHubAPIVer = "2018-04-01"
iotHubRestURI = "https://" + IoTHubName + ".azure-devices.net/devices/" + deviceID + "/messages/events?api-version=" + iotHubAPIVer
SASToken = 'SharedAccessSignature'
Headers = {}
Headers['Authorization'] = SASToken
Headers['Content-Type'] = "application/json"
Headers['charset'] = "utf-8"
datetime = datetime.datetime.now()
payload = {
    'power': "1.7",
    'wind_speed': "1.7",
    'wind_direction': "1.7"
}
payload2 = json.dumps(payload, ensure_ascii = False).encode("utf8")
resp = requests.post(iotHubRestURI, data=payload2, headers=Headers)
I've attempted to encode the payload correctly in several different ways, including utf-8 within requests.post; however, this either produces an error that a dict cannot be encoded, or the body still arrives encoded in the Function App log stream, unable to be deciphered.
Thanks for any help and/or guidance that can be provided on this - happy to elaborate further on anything that is not clear.
Is there any particular reason why you want to use the Azure IoT Hub REST API endpoint instead of the Python SDK? Also, even though you see the values in JSON format when viewed through Azure IoT Explorer, the message format when viewed through a storage endpoint such as Blob Storage reveals a different format, as you pointed out.
I haven't tested the Python code with the REST API, but I have a Python SDK sample that worked for me. Please refer to the code sample below.
import os
import random
import time
from datetime import date, datetime
from json import dumps
from azure.iot.device import IoTHubDeviceClient, Message

def json_serial(obj):
    """JSON serializer for objects not serializable by default json code"""
    if isinstance(obj, (datetime, date)):
        return obj.isoformat()
    raise TypeError("Type %s not serializable" % type(obj))

CONNECTION_STRING = "<AzureIoTHubDevicePrimaryConnectionString>"
TEMPERATURE = 45.0
HUMIDITY = 60
MSG_TXT = '{{"temperature": {temperature},"humidity": {humidity}, "timesent": {timesent}}}'

def run_telemetry_sample(client):
    print("IoT Hub device sending periodic messages")
    client.connect()
    while True:
        temperature = TEMPERATURE + (random.random() * 15)
        humidity = HUMIDITY + (random.random() * 20)
        x = datetime.now().isoformat()
        timesent = dumps(datetime.now(), default=json_serial)
        msg_txt_formatted = MSG_TXT.format(
            temperature=temperature, humidity=humidity, timesent=timesent)
        message = Message(msg_txt_formatted, content_encoding="utf-8", content_type="application/json")
        print("Sending message: {}".format(message))
        client.send_message(message)
        print("Message successfully sent")
        time.sleep(10)

def main():
    print("IoT Hub Quickstart #1 - Simulated device")
    print("Press Ctrl-C to exit")
    client = IoTHubDeviceClient.create_from_connection_string(CONNECTION_STRING)
    try:
        run_telemetry_sample(client)
    except KeyboardInterrupt:
        print("IoTHubClient sample stopped by user")
    finally:
        print("Shutting down IoTHubClient")
        client.shutdown()

if __name__ == '__main__':
    main()
You can edit the MSG_TXT variable in the code to match your payload format and pass the values. Note that the SDK uses the Message class from the Azure IoT Device library, which has overloads for content type and content encoding. Here is how I have passed the overloads in the code: message = Message(msg_txt_formatted, content_encoding="utf-8", content_type="application/json")
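As a rough sketch only (reusing the field names from your question, on top of a connected client as in the sample above), the same mechanism applied to your payload would look like:

from json import dumps
from azure.iot.device import Message

payload = {"power": "1.7", "wind_speed": "1.7", "wind_direction": "1.7"}
# content_encoding/content_type ensure the routed message stays readable JSON
message = Message(dumps(payload), content_encoding="utf-8", content_type="application/json")
client.send_message(message)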
I have validated the message by routing it to a Blob Storage container and could see the telemetry data in JSON format at that endpoint.
Hope this helps!

How do I send an image rotate request to a gRPC server with Python?

I am new to gRPC and am having trouble understanding it. Anyway, I am working from a protobuf file called image.proto.
Below are the contents of my image.proto file:
syntax = "proto3";

option java_multiple_files = true;

message NLImage {
  bool color = 1;
  bytes data = 2;
  int32 width = 3;
  int32 height = 4;
}

message NLImageRotateRequest {
  enum Rotation {
    NONE = 0;
    NINETY_DEG = 1;
    ONE_EIGHTY_DEG = 2;
    TWO_SEVENTY_DEG = 3;
  }
  Rotation rotation = 1;
  NLImage image = 2;
}

service NLImageService {
  rpc RotateImage(NLImageRotateRequest) returns (NLImage);
  rpc MeanFilter(NLImage) returns (NLImage);
}
I was able to create a server.py file and a client.py file, and I generated image_pb2.py and image_pb2_grpc.py from image.proto.
Now, I am stuck into sending an image rotation request from the client to the server, and getting back an appropriate response.
Here is what I have so far tried in my client.py
import grpc
# import the generated files
import image_pb2
import image_pb2_grpc
#.......
#.......SEVERAL LINES OF CODE LATER
#.......
print('Input image path is', inputfile)
print('Output path is', outputfile)
print('Mean is', mean)
print("Rotate is", rotate)
channel = grpc.insecure_channel('localhost:5000')
stub = image_pb2_grpc.NLImageServiceStub(channel)
# SEND ROTATION REQUEST IF rotate
img_rotate_request = image_pb2.NLImageRotateRequest()
stub.RotateImage(img_rotate_request, inputfile)
I am not sure how to send that RotateImage request properly.
Below is my server.py:
import sys, getopt
from google.protobuf.descriptor import EnumDescriptor
import grpc
from concurrent import futures
import time
import numpy as np
# importing grpc generated classes
import image_pb2
import image_pb2_grpc

class NLImageServiceServicer(image_pb2_grpc.NLImageServiceServicer):
    def RotateImage(self, request, context):
        print("Request to rotate image received")
        return image_pb2._NLIMAGE()

    def MeanFilter(self, request, context):
        print("Request to filter received")
        return image_pb2.NLImage()

server = grpc.server(futures.ThreadPoolExecutor(max_workers=10))
image_pb2_grpc.add_NLImageServiceServicer_to_server(NLImageServiceServicer, server)
server.add_insecure_port(host + ":" + str(port))
server.start()
print("Server started ...on " + host + ":" + port)
server.wait_for_termination()
Eyeballing your code, it appears OK; I've not run it.
You're correctly calling the method stub.RotateImage, but you're not correctly creating the message type (NLImageRotateRequest) for it. It will be an instance of the class generated for you by protoc, probably image_pb2.NLImageRotateRequest, and this contains rotation and image properties. The image is itself an instance of a class, image_pb2.NLImage, etc.
Please see this link for using protobuf with Python because each language is distinct and Python has some quirks in how you create messages.
At some point, you'll need to read in inputfile as bytes in order to populate the data property of image.
If you've not, perhaps run through the Python HelloWorld example on grpc.io. It's basic but it will get you started with gRPC. Then review the link I included above and the protobuf sections on Python.
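To make that concrete, here is a rough sketch of the request construction (the width/height values and the file handling are placeholders, not taken from your code):

# Read the image file as bytes so it can populate the NLImage.data field
with open(inputfile, 'rb') as f:
    img_bytes = f.read()

image = image_pb2.NLImage(color=True, data=img_bytes, width=640, height=480)
request = image_pb2.NLImageRotateRequest(
    rotation=image_pb2.NLImageRotateRequest.NINETY_DEG,
    image=image)
rotated = stub.RotateImage(request)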

Cannot deserialize protobuf binary (created by Nodejs) in Python

Please review the context below and help resolve the issue.
I define the following message in device.proto:
message DeviceConfiguration {
  message Resolution {
    uint32 width = 1;
    uint32 height = 2;
  }
  string device_name = 1;
  string brand_name = 2;
  Resolution resolution = 3;
}
Then I compile this message for two languages: Node.js (device_pb.js) and Python (device_pb2.py).
1. From Node.js: send the above message to Kafka
const {DeviceConfiguration} = require("./device_pb")
const resolution = new DeviceConfiguration.Resolution();
resolution.setWidth(1280);
resolution.setHeight(960);
const deviceConfiguration = new DeviceConfiguration();
deviceConfiguration.setDeviceName("S6");
deviceConfiguration.setBrandName("samsung");
deviceConfiguration.setResolution(resolution);
let binaryEvent = deviceConfiguration.serializeBinary();
Finally, I use kafka-node to send the binaryEvent value to a Kafka topic.
2. From the Python component: consume the message via Kafka
Here is the binary value received in the Python component:
b'10,8,111,98,106,101,99,116,73,100,26,21,10,2,83,54,18,7,115,97,109,115,117,110,103,26,6,8,128,10,16,192,7'
Then I use the code below to deserialize the message, but it throws "Error parsing message":
from device_pb2 import DeviceConfiguration
device = DeviceConfiguration()
device.ParseFromString(message)
=> As far as I can see, ParseFromString doesn't work for the above binary value and throws "Error parsing message". It only works if the bytes value was created by SerializeToString in Python code.
Note: I'm able to deserialize the binary by using deserializeBinary (made public) in Node.js, but there is no similar function in device_pb2.py.
So, is there any way for me to deserialize the message in Python code?
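For reference, the value received looks like a comma-separated decimal string of byte values rather than raw bytes, so one thing I could try (assuming that really is the format) is rebuilding the bytes before parsing:

from device_pb2 import DeviceConfiguration

# message arrives as b'10,8,111,...' -- a textual list of byte values
raw = bytes(bytearray(int(v) for v in message.decode("ascii").split(",")))

device = DeviceConfiguration()
device.ParseFromString(raw)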
Many Thanks.

Request-streaming gRPC client request error

When I run my gRPC client and it attempts to stream a request to the server, I get this error: "TypeError: has type list_iterator, but expected one of: bytes, unicode"
Do I need to encode the text I'm sending in some way? The error message makes some sense, as I am definitely passing in an iterator. I assumed from the gRPC documentation that this is what was needed (https://grpc.io/docs/tutorials/basic/python.html#request-streaming-rpc). Anyway, sending a list or string yields a similar error.
At the moment I am sending a small test list of strings to the server in the request, but I plan to stream requests with very large amounts of text in the future.
Here's some of my client code.
def gen_tweet_space(text):
    for tweet in text:
        yield tweet

def run():
    channel = grpc.insecure_channel('localhost:50050')
    stub = ProseAndBabel_pb2_grpc.ProseAndBabelStub(channel)
    while True:
        iterator = iter(block_of_text)
        response = stub.UserMarkov(ProseAndBabel_pb2.UserTweets(tweets=iterator))
Here's relevant server code:
def UserMarkov(self, request_iterator, context):
    return ProseAndBabel_pb2.Babel(prose=markov.get_sentence(request_iterator.tweets))
Here's the proto where the rpc and messages are defined:
service ProseAndBabel {
  rpc GetHaiku (BabelRequest) returns (Babel) {}
  rpc GetBabel (BabelRequest) returns (Babel) {}
  rpc UserMarkov (stream UserTweets) returns (UserBabel) {}
}

message BabelRequest {
  string ask = 1;
}

message Babel {
  string prose = 1;
}

message UserTweets {
  string tweets = 1;
}

message UserBabel {
  string prose = 1;
}
I've been successful getting the non-streaming RPCs to work, but I'm having trouble finding walkthroughs for request-side streaming in Python applications, so I'm sure I'm missing something here. Any guidance/direction appreciated!
You need to pass the iterator of requests to the gRPC client stub, not to the protobuf constructor. The current code tries to instantiate a UserTweets protobuf with an iterator rather than an individual string, resulting in the type error.
response = stub.UserMarkov(ProseAndBabel_pb2.UserTweets(tweets=iterator))
You'll instead need to have your iterator return instances of ProseAndBabel_pb2.UserTweets, each of which wraps one of the request strings you would like to send, and pass the iterator itself to the stub. Something like:
iterator = iter([ProseAndBabel_pb2.UserTweets(tweets=x) for x in block_of_text])
response = stub.UserMarkov(iterator)
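Equivalently, the gen_tweet_space generator from your code could yield the request messages directly; for example:

def gen_tweet_requests(block_of_text):
    # yield one UserTweets message per string instead of raw strings
    for tweet in block_of_text:
        yield ProseAndBabel_pb2.UserTweets(tweets=tweet)

response = stub.UserMarkov(gen_tweet_requests(block_of_text))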

Golang - Sending binary encoded data via TLS Socket

I'm new to Go and this kind of lower-level stuff, so maybe I'm just thinking in the wrong direction.
My project is a small Go monitoring client which sends a ping message (the current date) over an encrypted TLS connection (this is just for learning purposes).
The Python server and client are working flawlessly for now.
The (python) server & client are packing the data like this:
[...]
def _send_msg(self, msg):
    msg = struct.pack('>I', len(msg)) + bytes(msg, 'UTF-8')
    self.client.send(msg)

def _recv_msg(self):
    raw_msglen = self.client.recv(4)
    if not raw_msglen:
        return ""
    msglen = struct.unpack('>I', raw_msglen)[0]
    return self._recvall(msglen)
[...]
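(_recvall isn't shown above; roughly, it's the usual "read exactly n bytes" loop, along these lines:)

def _recvall(self, n):
    # keep reading until n bytes have arrived (or the connection closes)
    data = b''
    while len(data) < n:
        chunk = self.client.recv(n - len(data))
        if not chunk:
            return data
        data += chunk
    return data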
On the Go-Side I pack the data like this:
type packet struct {
    a uint32
    b []byte
}

func pack(d string) []byte {
    buf := bytes.Buffer{}
    u := len([]byte(d))
    p := packet{
        a: uint32(u),
        b: []byte(d),
    }
    err := binary.Write(&buf, binary.BigEndian, p.a) // struct.pack('>I', ...) in Python
    if err != nil {
        fmt.Println(err)
    }
    buf.Write(p.b) // Append payload bytes to the buffer
    return buf.Bytes()
}
reader := bufio.NewReader(tlsConn) // Socket reader
[..]
// Writing binary data
tlsConn.Write(pack("login test:test")) // Works so far

// Reading response (from python server)
var p uint32                              // packet size
binary.Read(reader, binary.BigEndian, &p) // Read packet length (4 bytes, uint32)
buf1 := make([]byte, int(p))
reader.Read(buf1)
fmt.Println(string(buf1)) // Print response - this also works

// Send some ping
// This code part also gets passed, but the python server doesn't
// receive any message
c.Write(pack("ping 0000-00-00 00:00:00"))
binary.Read(reader, binary.BigEndian, &p)
buf1 = make([]byte, int(p))
fmt.Println(string(buf1)) // Here I don't get a response
Here is the golang client-side test code which I'm using: http://pastebin.com/dr1mJZ9Y
and the corresponding terminal output: http://pastebin.com/mrctJPs5
The strange thing is that the login message (including the server response) gets sent and received correctly, but the second time, when I try to send a ping like:
tlsConn.Write(pack("ping 0000-00-00 00:00:00"))
the message doesn't seem to reach the server and no response gets sent back to the client.
Is there an error with the binary encoding of the packet-data and the length prefixing?
I know the code looks a bit messy and not very Go-idiomatic; sorry for that.
Thank you again in advance for your help!
Software dependencies / environment:
Python 3.4
Golang 1.5.3 (amd64 Linux)
No 3rd-Party libs
