
Analogue to Digital Conversion

Also referred to as digitisation or quantisation. The process by which a continuous analogue signal is sampled and converted to a series of binary integers.

In a narrow sense, it is a device (usually a microchip) that transforms a signal from analogue form to digital form. This is done by taking samples of the analogue signal at regular intervals; each analogue sample value is then quantised into a binary code.
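The sample-and-quantise step described above can be sketched in a few lines of Python. The `quantise` helper and its value range are illustrative assumptions, not part of any standard; it simply maps an analogue sample value to the nearest of 2^n binary codes.

```python
import math

def quantise(value, bits, v_min=-1.0, v_max=1.0):
    """Map an analogue sample value to the nearest of 2**bits levels."""
    levels = 2 ** bits
    # Clamp into the analogue range, then scale to an integer code 0 .. levels-1.
    clamped = max(v_min, min(v_max, value))
    return round((clamped - v_min) / (v_max - v_min) * (levels - 1))

# Sample a 1 kHz sine wave at 48 kHz intervals and quantise each sample to 8 bits.
sample_rate = 48_000
tone_hz = 1_000
codes = [quantise(math.sin(2 * math.pi * tone_hz * n / sample_rate), 8)
         for n in range(48)]
```

Each entry of `codes` is one 8-bit sample word, the "binary code" the definition refers to.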

In a wider sense, it is a device performing all the functions necessary to convert analogue video signals to a specified digital interface format: in particular, sampling-frequency genlocking, video signal pre-filtering, black-level clamping, sync code word insertion and sometimes even parallel-to-serial conversion.

The conversion of analogue signals into digital data, normally for subsequent use in a digital machine. For TV, samples of audio and video are taken; the accuracy of the process depends on both the sampling frequency and the resolution of the analogue amplitude information, i.e. how many bits are used to describe the analogue levels. For TV pictures 8 or 10 bits are normally used; for sound, 16 or 20 bits are common.
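The effect of the word length is easy to quantify: n bits give 2^n distinct levels, and the theoretical signal-to-noise ratio for a full-scale sine wave is approximately 6.02n + 1.76 dB. A small sketch for the word lengths mentioned above (the function name is illustrative):

```python
def quantisation_stats(bits):
    """Number of levels and theoretical full-scale SNR for an n-bit quantiser."""
    levels = 2 ** bits
    snr_db = 6.02 * bits + 1.76   # standard quantisation-noise approximation
    return levels, snr_db

# Typical video (8, 10 bit) and audio (16, 20 bit) word lengths.
for bits in (8, 10, 16, 20):
    levels, snr = quantisation_stats(bits)
    print(f"{bits:2d} bits -> {levels:7d} levels, ~{snr:.1f} dB SNR")
```

This shows why audio, with its wide dynamic range, uses longer words than video.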

The CCIR 601 standard defines a video sampling frequency of 13.5 MHz for luminance, and the AES/EBU digital audio standard defines sampling frequencies of 44.1 and 48 kHz for audio.
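Combining sampling frequency and word length gives the raw data rate of the digitised signal. A minimal sketch, assuming uncompressed samples (the helper name is illustrative):

```python
def raw_bit_rate(sample_rate_hz, bits, channels=1):
    """Uncompressed bit rate in bit/s for a sampled stream."""
    return sample_rate_hz * bits * channels

# CCIR 601 luminance: 13.5 MHz sampled at 8 bits.
video_y = raw_bit_rate(13_500_000, 8)          # 108 Mbit/s
# AES/EBU-style stereo audio: 48 kHz at 16 bits, 2 channels.
audio = raw_bit_rate(48_000, 16, channels=2)   # 1.536 Mbit/s
```

The two-orders-of-magnitude gap between the luminance and audio rates illustrates why digital video interfaces were the more demanding engineering problem.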


References

Pank, B. (ed.), 1994. The Digital Fact Book, 7th edition. Quantel, Newbury, UK.