NTL2009
TL;DR Summary: What is responsible for Bluetooth audio latency?
I often watch instructional YouTube videos on my tablet with Bluetooth headphones. The delay/latency in the audio often isn't so noticeable, but when I see someone using a hammer, or sandpaper, the delay between the visual of the hammer hit or sandpaper swipe and the audible "bam" or "swoosh" is jarring. Which always reminds me of those film clapperboard out-takes, which were used to sync video-audio.
When I try to search on this subject, I get a bunch of generic info about making sure drivers are up to date, you have a strong signal, etc. But I'm not talking about out-of-the-ordinary delay, I'm talking about the inherent, unavoidable delays in the process.
In practice, the common protocols seem to have a latency of ~ 150 msec, which is very noticeable under the conditions I mentioned.
I'm somewhat familiar with GSM transmission, and I assume it's fairly similar. I think the inherent delays there come from:
A) Sample the audio for 1/50th of a second (20 ms).
B) Only after that 20 msec is captured can the data be compressed and put into a "packet". I'm not sure how long the compression takes, but I'm pretty sure it can't begin until the entire 20 msec sample is collected, so that's 20 msec plus processing time.
C) The packet is transmitted. The time will depend on the available time slice (and maybe bandwidth?). Though BT may be different, I'm pretty sure GSM is set up so that the entire packet fits into one time slice, so bandwidth is not a limiting factor by design. Or maybe I should say the bandwidth delay is fixed by design?
D) The packet is received. Again, I'm pretty sure the entire packet must be received before decompression can begin, and only after it is fully decompressed can the audio be output. And I'd imagine some amount of buffering is needed to ensure the packets can be seamlessly connected (and maybe to allow a resend of data for error detection/correction?).
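To make steps A through D concrete, here is a rough back-of-the-envelope latency budget for a frame-based wireless audio pipeline. The component numbers are illustrative assumptions only (except the 20 msec capture frame from A), not measured Bluetooth values:

```python
# Rough latency budget for a frame-based wireless audio pipeline.
# All values except "capture" are illustrative guesses, not measurements.

budget_ms = {
    "capture":        20.0,   # A) must buffer a full frame before encoding
    "encode":          5.0,   # B) compression starts only after capture completes
    "transmit":        5.0,   # C) wait for a time slot, then send the packet
    "decode":          5.0,   # D) full packet must arrive before decompression
    "jitter_buffer": 100.0,   # extra buffering to bridge gaps / allow retransmits
}

total = sum(budget_ms.values())
for stage, ms in budget_ms.items():
    print(f"{stage:>13}: {ms:6.1f} ms")
print(f"{'total':>13}: {total:6.1f} ms")
```

If numbers anything like these are right, the receive-side buffering, not the per-frame processing, would dominate, which would be consistent with the ~150 msec figure above.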
So if that is reasonably analogous to how BT works, does anyone know how long a sample BT takes for audio? And would that be the biggest factor? I believe BT uses the SBC codec: https://en.wikipedia.org/wiki/SBC_(codec)
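For what it's worth, the SBC frame itself appears to be much shorter than GSM's 20 msec: an SBC frame covers (subbands × blocks) PCM samples per channel, and the codec's maximum settings are 8 subbands and 16 blocks, i.e. 128 samples. A quick sketch of the arithmetic (frame parameters taken from the SBC codec description linked above):

```python
# Duration of one SBC frame: samples_per_frame = subbands * blocks.
# 8 subbands x 16 blocks = 128 samples is SBC's largest frame size.

def sbc_frame_ms(subbands: int, blocks: int, sample_rate_hz: int) -> float:
    """Milliseconds of audio covered by one SBC frame."""
    samples = subbands * blocks
    return 1000.0 * samples / sample_rate_hz

print(round(sbc_frame_ms(8, 16, 44100), 2))  # largest frame at 44.1 kHz: ~2.9 ms
print(round(sbc_frame_ms(4, 4, 48000), 2))   # smallest frame at 48 kHz: ~0.33 ms
```

So a maximal SBC frame at 44.1 kHz is only about 2.9 msec of audio, which (if I have this right) would mean the frame length itself can't explain ~150 msec, and the buffering on the receive side is the likelier culprit.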
TIA