| Metric | Definition | Unit | Example |
|--------|------------|------|---------|
| Latency (RTT) | Time for a packet to travel to the destination and back | ms | 20 ms |
| Jitter | Standard deviation or mean absolute deviation of latencies across multiple packets | ms | 5 ms |
| Packet Loss | Percentage of packets never acknowledged | % | 0.1% |
\[ J(i) = J(i-1) + \frac{|D(i-1,i)| - J(i-1)}{16} \]

where \(D(i-1,i)\) is the difference in spacing between consecutive packets at the receiver versus the sender; each new sample moves the running estimate \(J\) one sixteenth of the way toward the latest \(|D|\), smoothing out transient spikes.
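A minimal sketch of this running jitter estimator in Python, assuming the absolute spacing differences \(|D(i-1,i)|\) in milliseconds have already been measured (the function name and input representation are illustrative, not from the original):

```python
def rfc3550_jitter(transit_deltas):
    """Running jitter estimate using the smoothing recurrence above.

    transit_deltas: sequence of |D(i-1, i)| values in ms, i.e. the
    absolute differences in packet spacing between consecutive packets.
    Returns the list of successive jitter estimates J(i).
    """
    j = 0.0
    estimates = []
    for d in transit_deltas:
        # Move the estimate 1/16 of the way toward the newest sample,
        # so a single spike barely shifts J but sustained variation does.
        j += (abs(d) - j) / 16.0
        estimates.append(j)
    return estimates


# A constant 16 ms spacing difference: the first estimate is 1 ms,
# and J converges toward 16 ms as samples accumulate.
print(rfc3550_jitter([16] * 200)[0])   # 1.0
print(round(rfc3550_jitter([16] * 200)[-1], 3))
```

The 1/16 gain is a deliberately slow exponential filter: it trades responsiveness for stability, which suits a metric meant to size playout buffers rather than react to individual packets.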
Jitter is the variation in packet inter-arrival times. If packets are sent at perfectly regular intervals (e.g., every 10 ms) but arrive at intervals of 8 ms, 12 ms, 9 ms, and 11 ms, that variation is jitter. When jitter exceeds what an application's de-jitter buffer can absorb, packets are discarded or delayed, causing perceptible degradation.
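The worked example above can be checked directly: with a nominal 10 ms interval, jitter as the mean absolute deviation of the observed intervals comes out to 1.5 ms. A short sketch (the helper name is hypothetical):

```python
def jitter_mad(intervals_ms, nominal_ms):
    """Jitter as mean absolute deviation of inter-arrival intervals
    from the nominal sending interval, in ms."""
    return sum(abs(i - nominal_ms) for i in intervals_ms) / len(intervals_ms)


# Intervals from the text: packets sent every 10 ms arrive
# 8, 12, 9, and 11 ms apart.
print(jitter_mad([8, 12, 9, 11], 10))  # 1.5
```

An application whose de-jitter buffer holds less than ~1.5 ms of audio would start dropping or delaying these packets, which is the degradation the paragraph describes.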