Hi,
I'm writing an HCI monitoring program that captures the HCI traffic between a BlueCore and a host, independent of the BlueZ stack. The program's output is a log of all HCI traffic in a format that can be reread and interpreted using hcidump --read-dump=hcidumplog.
The reply to an earlier mail from my colleague (Han Hoekstra) unfortunately didn't solve the problem, so I'll try to restate it in more detail, hoping you can help us out.
We're able to extract the BCSP packet from the SLIP layer, and to derive the HCI packet from the BCSP packet. But then, to put the HCI packet in an hcidump-usable format, an HCIDUMP HEADER must be placed in front of it. This is what the first question is about. According to the hcidump source code, each HCI packet is preceded by a 12-byte header containing:
- HCI packet length (2 bytes)
- HCI packet direction (1 byte)
- HCI packet pad (1 byte)
- HCI timestamp, seconds (4 bytes)
- HCI timestamp, microseconds (4 bytes)
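To make our reading of the header concrete, here is how we're currently packing it. This is a sketch under two assumptions that may well be wrong: the fields are little-endian, and the pad byte is plain alignment filler that can be set to zero. The function name is ours, not from hcidump.

```python
import struct

def hcidump_header(payload_len, incoming, ts_sec, ts_usec):
    """Pack the 12-byte hcidump record header described above.

    Assumptions (please correct us if wrong):
    - fields are little-endian,
    - the pad byte is alignment filler, always 0,
    - 'incoming' is 1 for controller->host, 0 for host->controller.
    """
    return struct.pack("<HBBII", payload_len, incoming, 0, ts_sec, ts_usec)

# Example: a 4-byte frame, incoming, with some timestamp.
hdr = hcidump_header(4, 1, 1111111111, 500000)
```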
What is the pad byte's function, and what value should it have?
Now the second question. When I retrieve an HCI packet and put an HCIDUMP HEADER in front of it, it is not re-readable using hcidump. According to the hci_dump function in hci.c (line 2666), the first data byte of the frame holds the packet type, which is the starting point for hcidump's interpretation of the HCI packet. This packet-type byte is an extra byte not present in the real communication, but necessary for hcidump to work properly. I can find the defines for the different packet types (bluetooth/hci.h), but I can't figure out where and how this byte is calculated or given a value. I first thought it was derived from the contents of the BCSP packet header, but to be honest I can find no relationship between the two. (Maybe due to seeing too much hex code the last few days.) So the question here is:
How do I calculate the packet-type byte that goes between the HCIDUMP HEADER and the original HCI packet?
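For reference, this is the mapping we've been trying, sketched in Python. It assumes the packet-type byte follows from the BCSP channel ID (our reading of the BCSP documentation: channel 5 carries HCI commands and events, channel 6 ACL data, channel 7 SCO data), with the direction distinguishing command from event on channel 5. The type values themselves are the defines from bluetooth/hci.h; everything else is our guess.

```python
# Packet-type indicator values, copied from bluetooth/hci.h.
HCI_COMMAND_PKT = 0x01
HCI_ACLDATA_PKT = 0x02
HCI_SCODATA_PKT = 0x03
HCI_EVENT_PKT   = 0x04

def h4_type_from_bcsp(channel, incoming):
    """Guess the hcidump packet-type byte from a BCSP channel ID.

    Assumed mapping (our reading of the BCSP spec, unconfirmed):
      channel 5 -> HCI command (host->controller) or event (controller->host)
      channel 6 -> ACL data
      channel 7 -> SCO data
    """
    if channel == 5:
        return HCI_EVENT_PKT if incoming else HCI_COMMAND_PKT
    if channel == 6:
        return HCI_ACLDATA_PKT
    if channel == 7:
        return HCI_SCODATA_PKT
    raise ValueError("BCSP channel %d does not carry HCI traffic" % channel)
```

Is this the right idea, or is the type byte determined some other way?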
I'd really appreciate your help!
Regards,
Ruud