
The ADC needs at least 50 µs to complete the power-up transient phase. Clear the ADOFF bit first, and start the first conversion only after this 50 µs delay has elapsed.
Note:
If bit ADOFF is set while bit ADST is also set, the ADC is switched off at the end of the conversion (or of the conversion cycle if SCAN mode is selected), as soon as ADBSY is cleared.
Turn off the ADC (by setting bit ADOFF) only once the calibration, which starts after every reset, has completed; otherwise setting bit ADOFF stops the calibration, and it is not restarted or completed when bit ADOFF is cleared again.
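The power-up and switch-off sequence described above can be captured in firmware. The following is a minimal sketch only: the ADCON stand-in, the bit positions and the delay routine are assumptions, so take the real SFR and bit definitions from the device header file and the register description.

#include <stdint.h>

static volatile uint16_t ADCON;            /* stand-in for the ADC control SFR      */
#define ADCON_ADOFF   (1u << 11)           /* placeholder bit position: ADC off     */
#define ADCON_ADST    (1u << 7)            /* placeholder bit position: start conv. */

/* Crude busy-wait placeholder; use a timer-based delay on real hardware. */
static void delay_us_approx(uint32_t us)
{
    volatile uint32_t n = us * 10u;        /* scale factor depends on the CPU clock */
    while (n--) { }
}

void adc_power_up_and_start(void)
{
    /* Power the analog section back on (ADOFF cleared). */
    ADCON &= (uint16_t)~ADCON_ADOFF;

    /* Wait at least 50 us for the power-up transient phase to complete. */
    delay_us_approx(50u);

    /* Only now request the first conversion. */
    ADCON |= ADCON_ADST;
}

void adc_power_down(void)
{
    /* Per the note above, only switch the ADC off after the post-reset
     * calibration has completed; otherwise it is aborted and not resumed. */
    ADCON |= ADCON_ADOFF;
}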
14.2 Conversion timing control
When a conversion is started, the capacitances of the converter are first charged, via the respective analog input pin, to the current analog input voltage. The time needed to charge these capacitances is referred to as the sample time. Next, the sampled voltage is converted to a digital value in several successive steps, corresponding to the 10-bit resolution of the ADC. During these steps the internal capacitances are repeatedly charged and discharged via the VAREF pin.
The current that has to be drawn from the sources for sampling and for changing charges depends on the time each respective step takes, because the capacitors must reach their final voltage level, at least to within a certain approximation, within the given time. The maximum current that a source can deliver, however, depends on its internal resistance.
The times taken by the two different actions during a conversion (sampling and converting) can be programmed within a certain range in the ST10F252M with respect to the CPU clock. The absolute time consumed by the different conversion steps is therefore independent of the general speed of the controller. This allows the ADC of the ST10F252M to be adjusted to the properties of the system.
● Fast conversion can be achieved by programming the respective times to their absolute possible minimum. This is preferable for scanning high-frequency signals. However, the internal resistance of the analog source and of the analog supply must be sufficiently low.
● Slow conversion can be achieved by programming the respective times to a higher value, or to the possible maximum. This is preferable when using analog sources and a supply with a high internal resistance, in order to keep the current as low as possible; the conversion rate in this case may be considerably lower (see the sketch after this list).
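The two options above come down to programming the sample-time and conversion-time fields. The following is a minimal sketch, assuming these are the ADSTC and ADCTC fields of ADCON as on other ST10 devices; the field positions and the meaning of the encodings are placeholders, so confirm them against the register description before use.

#include <stdint.h>

static volatile uint16_t ADCON;                 /* stand-in for the ADC control SFR */

#define ADCON_ADCTC_MASK   (3u << 14)           /* assumed: conversion time control */
#define ADCON_ADSTC_MASK   (3u << 12)           /* assumed: sample time control     */

/* Fast conversion: select the shortest sample and conversion times
 * (assuming the all-zero encoding is the minimum). Requires low source
 * and supply impedance. */
void adc_timing_fast(void)
{
    ADCON &= (uint16_t)~(ADCON_ADCTC_MASK | ADCON_ADSTC_MASK);
}

/* Slow conversion: select the longest sample and conversion times
 * (assuming the all-ones encoding is the maximum). Tolerates higher
 * source impedance at a lower conversion rate. */
void adc_timing_slow(void)
{
    ADCON |= (uint16_t)(ADCON_ADCTC_MASK | ADCON_ADSTC_MASK);
}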
The ADC input bandwidth is limited by the achievable accuracy. For example, assuming a maximum error of 0.5 LSB (2 mV) contributing to the global TUE (the TUE also depends on other causes), in the worst case of temperature and process, the maximum frequency for a sine-wave analog signal is around 7.5 kHz. To reduce the effect of the input signal variation on the accuracy to 0.05 LSB, the maximum input frequency of the sine wave is reduced to 800 Hz.
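For design checks, the two figures quoted above can be kept as simple limits. The helper below only restates them; it adds no device modelling, and the thresholds are taken directly from this section.

#include <stdbool.h>

#define ADC_FMAX_0P5_LSB_HZ   7500.0   /* sine input, ~0.5 LSB error contribution  */
#define ADC_FMAX_0P05_LSB_HZ   800.0   /* sine input, ~0.05 LSB error contribution */

/* Returns true if a sine-wave input of frequency f_hz stays within the
 * selected error budget (tight_budget selects the 0.05 LSB limit). */
bool adc_input_bandwidth_ok(double f_hz, bool tight_budget)
{
    const double limit = tight_budget ? ADC_FMAX_0P05_LSB_HZ
                                      : ADC_FMAX_0P5_LSB_HZ;
    return f_hz <= limit;
}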
If a static signal is applied during the sampling phase, the series resistance must not be greater than 20 kΩ (this takes into account any possible input leakage). Do not connect any capacitance to the analog input pins, in order to reduce the effect of charge partitioning (and the consequent voltage drop error) between the external and the internal capacitance. If an RC filter is necessary, the external capacitance must be greater than 10 nF to minimize the impact on accuracy.
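These external-network rules can also be verified at design time. The sketch below simply encodes the 20 kΩ and 10 nF limits stated above and prints the usual first-order RC cutoff for reference; the function name and reporting style are illustrative only.

#include <stdio.h>

#define ADC_RSERIES_MAX_OHM   20000.0   /* max series resistance, static signal */
#define ADC_CEXT_MIN_F         10e-9    /* min external capacitance if RC used  */

/* Returns 1 if the external R/C network respects the limits above,
 * 0 otherwise. Pass c_farad = 0.0 when no external capacitor is fitted. */
int adc_check_frontend(double r_ohm, double c_farad)
{
    int ok = 1;

    if (r_ohm > ADC_RSERIES_MAX_OHM) {
        printf("series resistance %.0f Ohm exceeds the 20 kOhm limit\n", r_ohm);
        ok = 0;
    }
    if (c_farad > 0.0 && c_farad < ADC_CEXT_MIN_F) {
        printf("external capacitance %.1f nF is below the 10 nF minimum\n",
               c_farad * 1e9);
        ok = 0;
    }
    if (ok && c_farad > 0.0 && r_ohm > 0.0) {
        /* First-order cutoff of the external RC filter, for reference. */
        printf("RC filter cutoff: %.1f Hz\n",
               1.0 / (2.0 * 3.14159265358979 * r_ohm * c_farad));
    }
    return ok;
}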