7.4.9 DECODER LATENCY TIME
Latency time is defined as elapsed time between the
moment that the first byte of an audio frame is delivered to
the SAA2502 and the moment that the output response
resulting from the first (sub-band) sample of the same
frame reaches its maximum.
Latency time results from the addition of two internal latency contributions: t_latency = t_proc + t_buf.
The processing latency time (t_proc) is sample frequency dependent (see Table 10).
The input buffer latency time (t_buf) is input interface mode dependent.
The precision of the latency time calculation is sampling rate and bit rate dependent; the maximum deviation is roughly ±4 sample periods.
7.4.9.1 Master and slave input interface modes
The input buffer latency time is t_buf = min(t_buf1, t_buf2) + cr × 3.52 ms, where:
• t_buf1 is sample frequency dependent (see Table 10)
• t_buf2 is input bit rate dependent (see Table 11 and Table 12)
• cr is the ratio between the maximum and the actual value of the MCLKIN frequency.
For slave input interface mode, the CDCL frequency (the input bit rate during the burst) should be used for the table look-up, not the average input bit rate. For free format bit rates the tables should be interpolated (t_buf2 is proportional to 1/bit rate).
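As a worked example of these definitions (an illustration, not a specified operating point): for a 48 kHz Layer II stream at 192 kbit/s with MCLKIN running at its maximum frequency (so cr = 1), Table 10 gives t_proc = 6.67 ms and t_buf1 = 24.00 ms, and Table 11 gives t_buf2 = 12.88 ms. Hence t_buf = min(24.00, 12.88) + 1 × 3.52 = 16.40 ms and t_latency = 6.67 + 16.40 = 23.07 ms.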
Table 10  Processing latency time

SAMPLE FREQUENCY (kHz)  t_proc (ms)  t_buf1 LAYER I (ms)  t_buf1 LAYER II (ms)
48                      6.67         8.00                 24.00
44.1                    7.26         8.71                 26.12
32                      10.00        12.00                36.00
24                      13.33        16.00                48.00
22.05                   14.51        17.41                52.24
16                      20.00        24.00                72.00
Table 11  Buffer latency time; high bit rate

BIT RATE (kbits/s)  t_buf2 (ms)
448                 5.52
384                 6.44
320                 7.73
256                 9.66
192                 12.88
160                 15.45
128                 19.31
96                  25.75
64                  38.63
48                  51.50
32                  77.25
16                  154.50
Table 12  Buffer latency time; low bit rate

BIT RATE (kbits/s)  t_buf2 (ms)
416                 5.94
352                 7.02
288                 8.58
224                 11.04
176                 14.05
144                 17.17
112                 22.07
80                  30.90
56                  44.14
40                  61.80
24                  103.00
8                   309.00
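The latency calculation above can also be expressed in software. The following C sketch is illustrative only (its names and structure are not part of the device or of this data sheet): it looks up t_proc and t_buf1 in Table 10 and approximates t_buf2 by the 1/bit rate proportionality that Tables 11 and 12 follow, which is also how free format bit rates are handled.

/*
 * Illustrative sketch only; names are not part of the SAA2502 data sheet.
 * Evaluates t_latency = t_proc + t_buf with
 * t_buf = min(t_buf1, t_buf2) + cr x 3.52 ms, using Table 10 values
 * and the 1/bit-rate proportionality of Tables 11 and 12 for t_buf2.
 */
#include <stdio.h>

/* Table 10: sample frequency (kHz), t_proc and t_buf1 (ms) */
static const double fs_khz[6]    = {48.0, 44.1, 32.0, 24.0, 22.05, 16.0};
static const double t_proc_ms[6] = {6.67, 7.26, 10.00, 13.33, 14.51, 20.00};
static const double t_buf1_l1[6] = {8.00, 8.71, 12.00, 16.00, 17.41, 24.00};
static const double t_buf1_l2[6] = {24.00, 26.12, 36.00, 48.00, 52.24, 72.00};

/* t_buf2 is proportional to 1/bit rate; anchored on the Table 11 entry
 * 192 kbit/s -> 12.88 ms (matches the tabulated values to ~0.1 ms). */
static double t_buf2_ms(double bit_rate_kbps)
{
    return 12.88 * 192.0 / bit_rate_kbps;
}

/* layer: 1 or 2; cr: maximum MCLKIN frequency / actual MCLKIN frequency.
 * Returns the estimated latency in ms, or a negative value if the
 * sample frequency is not listed in Table 10. */
static double latency_ms(double fs, int layer, double bit_rate_kbps, double cr)
{
    for (int i = 0; i < 6; i++) {
        if (fs_khz[i] == fs) {
            double t_buf1 = (layer == 1) ? t_buf1_l1[i] : t_buf1_l2[i];
            double t_buf2 = t_buf2_ms(bit_rate_kbps);
            double t_buf  = ((t_buf1 < t_buf2) ? t_buf1 : t_buf2) + cr * 3.52;
            return t_proc_ms[i] + t_buf;
        }
    }
    return -1.0;
}

int main(void)
{
    /* 48 kHz, Layer II, 192 kbit/s, MCLKIN at its maximum (cr = 1) */
    printf("t_latency = %.2f ms\n", latency_ms(48.0, 2, 192.0, 1.0));
    return 0;
}

Run on the worked example above, this prints t_latency = 23.07 ms, in line with the table-based calculation.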