Delay variation (DV), jitter

Delay variation (DV), or jitter, arises when different packets take different amounts of time to travel from sender to receiver.

Netrounds' jitter calculation for synthetic traffic follows IETF RFC 3393 (IP Packet Delay Variation Metric). In short, Netrounds calculates the difference between the maximum and the minimum measured delay within a specified interval, commonly one second.
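The max-minus-min calculation described above can be sketched as follows. This is an illustrative example, not Netrounds' actual implementation; the function name, the millisecond units, and the one-second default interval are assumptions for the sketch.

```python
from collections import defaultdict

def jitter_per_interval(timestamps, delays, interval=1.0):
    """Group one-way delay samples into fixed time intervals and
    report the jitter (max delay minus min delay) for each interval.

    timestamps -- arrival times in seconds
    delays     -- measured one-way delays (e.g. in milliseconds)
    interval   -- bucket width in seconds (commonly one second)
    """
    buckets = defaultdict(list)
    for t, d in zip(timestamps, delays):
        # Assign each delay sample to the interval it arrived in.
        buckets[int(t // interval)].append(d)
    # Jitter per interval: spread between the slowest and fastest packet.
    return {k: max(v) - min(v) for k, v in sorted(buckets.items())}

# Example: five samples spanning two one-second intervals.
ts = [0.1, 0.5, 0.9, 1.2, 1.7]
ms = [20.0, 25.0, 22.0, 30.0, 28.0]
print(jitter_per_interval(ts, ms))  # {0: 5.0, 1: 2.0}
```

In the first interval the delays range from 20 ms to 25 ms, giving 5 ms of jitter; in the second they range from 28 ms to 30 ms, giving 2 ms.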

Since the output of a video stream needs to be continuous, jitter forces a set-top box (STB) to buffer a certain amount of data. The more jitter there is, the more the STB needs to buffer. If the buffer runs empty or overflows, the effect on IPTV quality is the same as that of packet loss (pixelation, audio glitches, etc.).

The standard buffering requirement for coping with jitter is 50 ms. Modern STBs are often able to buffer much more than 50 ms of data, and jitter buffer sizes vary between vendors.
