When you use the AWS API to run a command on a remote Docker container (ECS), the API hands you back a websocket from which to read your command's output. When using the aws command line utility (which also uses the AWS API), reading the websocket stream is handled by session-manager-plugin.
session-manager-plugin is written in Go, and I've been trying to rewrite parts of it in Python. I don't speak Go, but I fumbled my way through adding some code to session-manager-plugin to dump the raw binary data it sends and receives while the binary is in use.
Essentially, the output of the command you ran is split into messages, each with headers and a payload. One of the headers on each message is a messageID, which is a UUID. Each message needs to be acknowledged by telling the server that you've received the message with that UUID.
The issue I'm having is that when analyzing the raw binary data, I can see that a message that was received with UUID b'\x85\xc3\x12P\n\x08\xaf)\xfd\xba\x1b8\x1asMd' is being acknowledged by session-manager-plugin with a packet that says this:
b'{"AcknowledgedMessageType":"output_stream_data","AcknowledgedMessageId":"fdba1b38-1a73-4d64-85c3-12500a08af29","AcknowledgedMessageSequenceNumber":4,"IsSequentialMessage":true}'
To figure out what UUID b'\x85\xc3\x12P\n\x08\xaf)\xfd\xba\x1b8\x1asMd' is in Python, I do this:
import uuid
print(str(uuid.UUID(bytes=b'\x85\xc3\x12P\n\x08\xaf)\xfd\xba\x1b8\x1asMd')))
# 85c31250-0a08-af29-fdba-1b381a734d64
At first glance, the UUID of the message that was received and the UUID of the message being acknowledged do not match, but if you look closely, you'll see that one is the other with its two halves swapped: in the 16-byte UUID, the first 8 bytes come after the last 8 bytes.
85c31250-0a08-af29-fdba-1b381a734d64
fdba1b38-1a73-4d64-85c3-12500a08af29
Is there any reason this would be happening? Am I decoding b'\x85\xc3\x12P\n\x08\xaf)\xfd\xba\x1b8\x1asMd' wrong?
Note: As you can see from above, the UUID in the acknowledgement packet is inside JSON. If I were decoding it wrong, the whole thing would be gibberish.
Also note that this is just an analysis of a perfectly working session-manager-plugin communication stream. One way or another, this actually works. I'm just trying to figure out how so I can re-create it.
Looking at the source code for session-manager-plugin, it appears to read the first eight bytes as the least significant bytes, then the next eight bytes as the most significant bytes, and then concatenate them in the order MSB, LSB. That would produce exactly the behavior you're seeing.
// getUuid gets the 128bit uuid from an array of bytes starting from the offset.
func getUuid(log log.T, byteArray []byte, offset int) (result uuid.UUID, err error) {
    byteArrayLength := len(byteArray)
    if offset > byteArrayLength-1 || offset+16-1 > byteArrayLength-1 || offset < 0 {
        log.Error("getUuid failed: Offset is invalid.")
        return nil, errors.New("Offset is outside the byte array.")
    }

    leastSignificantLong, err := getLong(log, byteArray, offset)
    if err != nil {
        log.Error("getUuid failed: failed to get uuid LSBs Long value.")
        return nil, errors.New("Failed to get uuid LSBs long value.")
    }
    leastSignificantBytes, err := longToBytes(log, leastSignificantLong)
    if err != nil {
        log.Error("getUuid failed: failed to get uuid LSBs bytes value.")
        return nil, errors.New("Failed to get uuid LSBs bytes value.")
    }

    mostSignificantLong, err := getLong(log, byteArray, offset+8)
    if err != nil {
        log.Error("getUuid failed: failed to get uuid MSBs Long value.")
        return nil, errors.New("Failed to get uuid MSBs long value.")
    }
    mostSignificantBytes, err := longToBytes(log, mostSignificantLong)
    if err != nil {
        log.Error("getUuid failed: failed to get uuid MSBs bytes value.")
        return nil, errors.New("Failed to get uuid MSBs bytes value.")
    }

    uuidBytes := append(mostSignificantBytes, leastSignificantBytes...)
    return uuid.New(uuidBytes), nil
}
Source Code
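If that's what's happening, reproducing it in Python is just a matter of swapping the two halves before handing the bytes to uuid.UUID. A minimal sketch (the function name is mine, not the plugin's):

import uuid

def decode_wire_uuid(raw):
    # On the wire, the least-significant 8 bytes come first, then the
    # most-significant 8 bytes; swap the halves before decoding.
    return uuid.UUID(bytes=raw[8:] + raw[:8])

print(decode_wire_uuid(b'\x85\xc3\x12P\n\x08\xaf)\xfd\xba\x1b8\x1asMd'))
# fdba1b38-1a73-4d64-85c3-12500a08af29

That reproduces the UUID in the acknowledgement packet from your capture.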
I'm working with an API provided by a client, who supplied sample code for authorizing via C#, but my company works in Python. They authorize via HMAC, and following their sample code (on .NET Fiddle), I finally got to the point where our byte key and byte message match those of the C# call.
However, when we get to this in their code:
using (HMACSHA256 sha = new HMACSHA256(secretKeyByteArray))
{
    // Sign the request
    byte[] signatureBytes = sha.ComputeHash(signature);
Where our equivalent is
signature_hmac = hmac.new(
    base64.b64decode(secretKey),
    bytes(message, "utf8"),
    digestmod=hashlib.sha256,
)
The byte value of their signatureBytes doesn't match the byte value of our signature_hmac.digest(). If we're using the same hashing algorithm and the byte values of the inputs match, we should get the same result, right?
To be clear, when I say the byte values match: the value of base64.b64decode(secretKey) (Python) matched var secretKeyByteArray = Convert.FromBase64String(secretKey); (C#).
Worked for me; make sure your input arrays are EXACTLY the same on both sides. I think you might be missing a base64 decode, or you're encoding the strings differently.
Here are my two tests:
secretKey = "wxyz".encode()
message = "abcd".encode()
signature_hmac = hmac.new(
secretKey,
message,
digestmod=hashlib.sha256,
)
x = signature_hmac.hexdigest()
and C#
byte[] signatureBytes;
byte[] secretKeyByteArray = Encoding.UTF8.GetBytes("wxyz");
byte[] signature = Encoding.UTF8.GetBytes("abcd");

using (HMACSHA256 sha = new HMACSHA256(secretKeyByteArray))
{
    signatureBytes = sha.ComputeHash(signature);
}

string x = Convert.ToHexString(signatureBytes);
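If the digests still don't match on real inputs, hex-dumping the exact key and message bytes on each side usually pinpoints where they diverge. A minimal sketch on the Python side (the key here is just "wxyz" base64-encoded, standing in for the real key):

import base64
import binascii
import hashlib
import hmac

secretKey = "d3h5eg=="  # base64 for "wxyz", a stand-in for the real key
message = "abcd"

key_bytes = base64.b64decode(secretKey)
msg_bytes = bytes(message, "utf8")

# Compare these dumps against the C# side, e.g.
# Convert.ToHexString(secretKeyByteArray) and Convert.ToHexString(signature).
print(binascii.hexlify(key_bytes))
print(binascii.hexlify(msg_bytes))
print(hmac.new(key_bytes, msg_bytes, digestmod=hashlib.sha256).hexdigest())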
Please review the context below and help resolve the issue.
I define the message below in device.proto:
message DeviceConfiguration {
    message Resolution {
        uint32 width = 1;
        uint32 height = 2;
    }

    string device_name = 1;
    string brand_name = 2;
    Resolution resolution = 3;
}
Then I compile this message into two languages: Node.js (device_pb.js) and Python (device_pb2.py).
1. From Node.js: send the above message to Kafka
const {DeviceConfiguration} = require("./device_pb")
const resolution = new DeviceConfiguration.Resolution();
resolution.setWidth(1280);
resolution.setHeight(960);
const deviceConfiguration = new DeviceConfiguration();
deviceConfiguration.setDeviceName("S6");
deviceConfiguration.setBrandName("samsung");
deviceConfiguration.setResolution(resolution);
let binaryEvent = deviceConfiguration.serializeBinary();
Finally, use kafka-node to send the binaryEvent value to a Kafka topic.
2. From the Python component: consume the message via Kafka
Here is the binary value received in the Python component:
b'10,8,111,98,106,101,99,116,73,100,26,21,10,2,83,54,18,7,115,97,109,115,117,110,103,26,6,8,128,10,16,192,7'
Then I use the code below to deserialize the message, but it throws "Error parsing message":
from device_pb2 import DeviceConfiguration
device = DeviceConfiguration()
device.ParseFromString(message)
=> As I see that the ParseFromString doesn't work for above binary value and throws "Error parsing message". It only works if the bytes value is created by SerializeToString python code.
Note that: I'm able to deserialize the binary by using deserializeBinary (make it public) in Nodejs but there is no similar function in device_pb2.py
So, is there any way for me to deserialize the message in Python code?
Many Thanks.
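For what it's worth, the value shown above looks like the ASCII text of comma-separated decimal byte values, not the raw protobuf bytes, which would explain why ParseFromString rejects it. If that's really what the consumer delivers, rebuilding the raw bytes first might look like this (a sketch against the message value from the question; note the payload may also be wrapped in an outer envelope by the producer, in which case DeviceConfiguration alone still won't parse it):

from device_pb2 import DeviceConfiguration

# The consumer delivered ASCII decimals; convert them back to raw bytes.
raw = bytes(int(v) for v in message.split(b","))

device = DeviceConfiguration()
device.ParseFromString(raw)
print(device)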
When I run my gRPC client and it attempts to stream a request to the server, I get this error: "TypeError: has type list_iterator, but expected one of: bytes, unicode"
Do I need to encode the text I'm sending in some way? The error message makes some sense, as I am definitely passing in an iterator. I assumed from the gRPC documentation that this is what was needed (https://grpc.io/docs/tutorials/basic/python.html#request-streaming-rpc). Anyway, sending a list or a string yields a similar error.
At the moment I am sending a small test list of strings to the server in the request, but I plan to stream requests with very large amounts of text in the future.
Here's some of my client code.
def gen_tweet_space(text):
    for tweet in text:
        yield tweet

def run():
    channel = grpc.insecure_channel('localhost:50050')
    stub = ProseAndBabel_pb2_grpc.ProseAndBabelStub(channel)
    while True:
        iterator = iter(block_of_text)
        response = stub.UserMarkov(ProseAndBabel_pb2.UserTweets(tweets=iterator))
Here's relevant server code:
def UserMarkov(self, request_iterator, context):
    return ProseAndBabel_pb2.Babel(prose=markov.get_sentence(request_iterator.tweets))
Here's the proto where the rpc and messages are defined:
service ProseAndBabel {
    rpc GetHaiku (BabelRequest) returns (Babel) {}
    rpc GetBabel (BabelRequest) returns (Babel) {}
    rpc UserMarkov (stream UserTweets) returns (UserBabel) {}
}

message BabelRequest {
    string ask = 1;
}

message Babel {
    string prose = 1;
}

message UserTweets {
    string tweets = 1;
}

message UserBabel {
    string prose = 1;
}
I've been successful getting the non-streaming RPCs to work, but I'm having trouble finding walkthroughs for request-side streaming in Python applications, so I'm sure I'm missing something here. Any guidance/direction appreciated!
You need to pass the iterator of requests to the gRPC client stub, not to the protobuf constructor. The current code tries to instantiate a UserTweets protobuf with an iterator rather than an individual string, resulting in the type error.
response = stub.UserMarkov(ProseAndBabel_pb2.UserTweets(tweets=iterator))
You'll instead need to have your iterator return instances of ProseAndBabel_pb2.UserTweets, each of which wraps one of the request strings you would like to send, and pass the iterator itself to the stub. Something like:
iterator = iter([ProseAndBabel_pb2.UserTweets(tweets=x) for x in block_of_text])
response = stub.UserMarkov(iterator)
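Note that on the server side the handler for a request-streaming RPC also receives an iterator, so it needs to loop over the incoming messages rather than reading a single .tweets field. A rough sketch of what that could look like (markov.get_sentence is assumed from the question, and the proto above declares the return type as UserBabel):

def UserMarkov(self, request_iterator, context):
    # Each streamed request is one UserTweets message; collect the strings.
    tweets = [request.tweets for request in request_iterator]
    return ProseAndBabel_pb2.UserBabel(prose=markov.get_sentence(tweets))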
I'm trying to adapt this Java codelab to Python / Socket.IO (Flask-SocketIO on the server side).
My socket is working; I'm able to pass both the sample rate (and retrieve it on the server side) and the audio data.
The problem is that the format of my audio data is not correct. I send it this way:
// Create a node that sends raw bytes across the websocket
var scriptNode = context.createScriptProcessor(4096, 1, 1);

// Need the maximum value for 16-bit signed samples, to convert from float.
const MAX_INT = Math.pow(2, 16 - 1) - 1;

scriptNode.addEventListener('audioprocess', function(e) {
    var floatSamples = e.inputBuffer.getChannelData(0);
    // The samples are floats in range [-1, 1]. Convert to 16-bit signed
    // integer.
    socket.emit('audiodata', Int16Array.from(floatSamples.map(function(n) {
        return n * MAX_INT;
    })));
});
This is the code from the codelab, except that I use socket.emit instead of socket.send.
The Python code that handles it and sends it to the Speech API is this:
credentials = service_account.Credentials.from_service_account_file(CONFIG[u"service_account"])
client = speech.SpeechClient(credentials=credentials)

config = types.RecognitionConfig(
    encoding=enums.RecognitionConfig.AudioEncoding.LINEAR16,
    sample_rate_hertz=44100,
    language_code=u"fr"
)
streaming_config = types.StreamingRecognitionConfig(
    config=config,
    interim_results=True,
    single_utterance=False)

@socketio.on(u'audiodata')
def handle_audio_data_msg(audio_data_msg):
    print u'Received raw audio.'
    print audio_data_msg
    for response in client.streaming_recognize(streaming_config, [audio_data_msg]):
        print response
    emit(u'audiodatareceived', {u"AudioData": u"Acquired"})
I got TypeError: descriptor 'SerializeToString' requires a 'google.protobuf.pyext._message.CMessage' object but received a 'dict',
so I tried, on the front-end side, to pass Int16Array.from(...).buffer instead, but then I get TypeError: descriptor 'SerializeToString' requires a 'google.protobuf.pyext._message.CMessage' object but received a 'str'.
So I'm not sure how I should pass the data... Any help appreciated!
[edit]
I re-wrote my handler because I think I was not using the right type; it now looks like this:
@socketio.on(u'audiodata')
def handle_audio_data_msg(audio_data_msg):
    print u'Received raw audio.'
    request = types.StreamingRecognizeRequest(audio_content=audio_data_msg)
    response = client.streaming_recognize(streaming_config, [request])
    for item in response:
        print u"error", item.get(u"error")
        print u"results", item.get(u"results")
        print u"resultIndex", item.get(u"resultIndex")
        print u"endpointerType", item.get(u"endpointerType")
    emit(u'audiodatareceived', {u"AudioData": u"Acquired"})
I don't get any error, but I get no response (an empty list), so nothing is printed...
[edit2]: I realized that the type of my input was never binary data, but str. Could the problem come from there? I tried to instantiate my Flask-SocketIO app as follows, but the type is still str:
socketio = SocketIO(app, binary=True)
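One pattern worth trying, given edit 2: the classic google-cloud-speech API expects a single streaming_recognize call fed by an iterator that yields one StreamingRecognizeRequest per chunk, rather than a new stream per Socket.IO event. A minimal sketch (client, types, and streaming_config as defined above; audio_chunks is a hypothetical iterable of raw LINEAR16 byte strings collected from the socket):

def request_stream(audio_chunks):
    # One request per chunk of raw 16-bit PCM audio.
    for chunk in audio_chunks:
        yield types.StreamingRecognizeRequest(audio_content=chunk)

responses = client.streaming_recognize(streaming_config, request_stream(audio_chunks))
for response in responses:
    for result in response.results:
        print(result.alternatives[0].transcript)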
I'm new to Go and this kind of lower-level stuff, so maybe I'm just thinking in the wrong direction.
My project is a small Go monitoring client which sends a ping message (the current date) over an encrypted TLS connection (this is just for learning purposes).
The Python server and client are working flawlessly for now.
The (Python) server & client pack the data like this:
[...]
def _send_msg(self, msg):
    msg = struct.pack('>I', len(msg)) + bytes(msg, 'UTF-8')
    self.client.send(msg)

def _recv_msg(self):
    raw_msglen = self.client.recv(4)
    if not raw_msglen:
        return ""
    msglen = struct.unpack('>I', raw_msglen)[0]
    return self._recvall(msglen)
[...]
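(The _recvall helper isn't shown in the question; a typical implementation, included here only for context, loops until exactly msglen bytes have arrived, since recv may legally return fewer:)

def _recvall(self, n):
    # Keep reading until exactly n bytes arrive or the peer closes.
    data = b""
    while len(data) < n:
        chunk = self.client.recv(n - len(data))
        if not chunk:
            break
        data += chunk
    return data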
On the Go side I pack the data like this:
type packet struct {
    a uint32
    b []byte
}

func pack(d string) []byte {
    buf := bytes.Buffer{}
    u := len([]byte(d))
    p := packet{
        a: uint32(u),
        b: []byte(d),
    }
    err := binary.Write(&buf, binary.BigEndian, p.a) // struct.pack('>I', ...) in python
    if err != nil {
        fmt.Println(err)
    }
    buf.Write(p.b) // Append payload bytes to the buffer
    return buf.Bytes()
}
reader := bufio.NewReader(tlsConn) // Socket reader
[..]

// Writing binary data
tlsConn.Write(pack("login test:test")) // Works so far

// Reading response (from python server)
var p uint32                              // packet size
binary.Read(reader, binary.BigEndian, &p) // Read packet length (4 bytes uint32)
buf1 := make([]byte, int(p))
reader.Read(buf1)
fmt.Println(string(buf1)) // Print response - this also works

// Send some ping
// This code part also gets passed, but the python server doesn't
// receive any message
c.Write(pack("ping 0000-00-00 00:00:00"))
binary.Read(reader, binary.BigEndian, &p)
buf1 = make([]byte, int(p))
fmt.Println(string(buf1)) // Here I don't get a response
Here is the Go client-side test code I'm using: http://pastebin.com/dr1mJZ9Y
and the corresponding terminal output: http://pastebin.com/mrctJPs5
The strange thing is that the login message (including the server response) gets sent and received correctly, but the second time, when I try to send a ping like:
tlsConn.Write(pack("ping 0000-00-00 00:00:00"))
the message never seems to reach the server, and no response gets sent back to the client.
Is there an error in the binary encoding of the packet data and the length prefixing?
I know the code looks a bit messy and not very Go-idiomatic; sorry for that.
Thank you in advance for your help!
Software dependencies / environment:
Python 3.4
Golang 1.5.3 (amd64 Linux)
No 3rd-Party libs
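For reference, the Go pack function above should produce byte-for-byte the same framing as the Python _send_msg, which is easy to sanity-check from the Python side (a sketch mirroring the code in the question):

import struct

def pack(msg):
    # 4-byte big-endian length prefix followed by the UTF-8 payload,
    # as in _send_msg (note len(msg) counts characters, which equals
    # the byte count only for ASCII payloads).
    return struct.pack(">I", len(msg)) + bytes(msg, "UTF-8")

print(pack("ping 0000-00-00 00:00:00")[:4])  # b'\x00\x00\x00\x18' -> 24-byte payload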