Originally posted by jimmy p
Well, the lecture I went to, the dude said that... well, I suppose the idea of digital is old-fashioned, with Morse code and whatever. What I meant was that they are simple (e.g. square) whereas normal waves are... wave shaped.
Yes, normal waves *are* 'wave shaped', but you've missed the point about 'digital' representations a little -- they're not 'square' per se, just slightly jagged compared to the analogue.
So what's the difference? Simply, an analogue signal varies continuously. A digital one does not.
To illustrate, let's imagine you've drawn a waveform with pencil and paper. You want to show it to a friend, but you don't want to send the paper itself (it's your only copy), so you measure the amplitude of the waveform at regular intervals, write down the values you measure, and post these values to your friend. He can 'reassemble' the waveform by plotting the amplitude values and joining the dots. The representation you send to your friend is digital, whereas the waveform you drew on the paper in the first place is analogue.
Here is where the distinction comes in -- your waveform on the piece of paper *isn't* the same as the numbers you sent your friend, as you made an approximation each time you measured the amplitude (you can't measure to better than about +/- 0.5 mm with most rulers, for example). This is why analogue might be considered 'better' than digital -- digital has inherent inaccuracies.
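To put some flesh on that, here's a minimal Python sketch of the same idea -- sample the continuous waveform at regular intervals and round each reading to the ruler's precision. The sine wave, the sampling interval and the 'ruler precision' are all made-up numbers, purely for illustration:

[code]
import math

def analogue_waveform(t):
    # The 'drawing on the paper' -- it has a value at *every* instant t.
    return math.sin(2 * math.pi * t)

sample_interval = 0.05   # how often we hold the ruler up to the drawing
ruler_precision = 0.1    # the smallest difference the ruler can resolve

samples = []
t = 0.0
while t <= 1.0:
    exact = analogue_waveform(t)                                  # true amplitude
    measured = round(exact / ruler_precision) * ruler_precision   # approximate reading
    samples.append(round(measured, 1))
    t += sample_interval

# This list of numbers is the 'digital' version you post to your friend.
print(samples)
[/code]

The original drawing varies continuously; the printed list only takes a handful of rounded values -- that rounding is the inherent inaccuracy mentioned above.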
So -- why use digital? (the original question)
i) You don't *need* all the information in an image to perceive the important information it conveys. Hence, if we represent an image digitally, then strip out the unnecessary information, we save on transmission costs.
ii) Analogue signals are much harder to transmit cleanly than digital ones -- noise accumulates along the way and can't be undone, whereas a digital signal can be regenerated exactly at each stage.
iii) Digital signals allow good error checking (see the sketch after this list); analogue signals don't.
iv) Digital signals can be heavily manipulated by computers (DSP), and as such we can implement sophisticated systems with them that wouldn't be possible with analogue.
The list goes on...
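To illustrate point (iii), here's a minimal Python sketch of one very simple error check, an even-parity bit. The 8-bit word and the 'noise' are invented for the example; the point is just that because a digital signal is a list of discrete values, we can add redundancy and spot corruption:

[code]
# A toy even-parity check -- not how real links do it, but it shows the idea.

def add_parity(bits):
    # Append one extra bit so the total number of 1s is even.
    return bits + [sum(bits) % 2]

def looks_intact(bits):
    # With even parity, an intact word always has an even number of 1s.
    return sum(bits) % 2 == 0

word = [1, 0, 1, 1, 0, 0, 1, 0]   # the data we want to send
sent = add_parity(word)

received = list(sent)
received[3] ^= 1                  # simulate noise flipping one bit in transit

print(looks_intact(sent))         # True  -- arrived intact
print(looks_intact(received))     # False -- error detected, ask for a resend
[/code]

There's no analogue equivalent of this trick: once noise has shifted an analogue voltage, there's no way to tell the noise apart from the signal.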