Time for another question, from Paul Hunter, who asks
“what is the difference between PCM 2.0 stereo channel and Dolby Digital (DD) 2.0? I have a brand new amp (Onkyo) which shows that Standard Definition TV programmes, via my Humax HDR Fox T2 recorder, are transmitted in PCM 2.0, whereas the majority of HD programmes are transmitted in DD 2.0 (although some in DD 5.1). My amp appears to be capable of converting both the PCM 2.0 and DD 2.0 input formats to 2.1 stereo, or even 5.1 surround outputs. Please could you explain the difference between PCM 2.0 and DD 2.0 inputs?”
There’s not a massive difference, in terms of what you’ll hear, really. The 2.0 indicates that the signal has just two channels of audio, and no separate channel for the subwoofer. When your amp converts this to 2.1, what it’s really doing is filtering off the low frequencies and directing them to the subwoofer channel.
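For the curious, here’s a minimal sketch of that bass-management step in Python. The 80 Hz crossover and the filter order are my own illustrative choices, not what any particular amp actually uses:

```python
import numpy as np
from scipy import signal

def bass_manage(stereo, sample_rate=48000, crossover_hz=80):
    """Derive a 2.1 feed from a 2.0 signal: the low frequencies from
    both channels are summed into a subwoofer channel, and the main
    speakers carry what's left. Crossover and filter order are
    illustrative choices, not any real amp's settings."""
    lp = signal.butter(4, crossover_hz, btype="low", fs=sample_rate, output="sos")
    hp = signal.butter(4, crossover_hz, btype="high", fs=sample_rate, output="sos")

    left, right = stereo[:, 0], stereo[:, 1]
    # Sum the bass from both channels into the ".1" feed
    sub = signal.sosfilt(lp, left + right)
    # The mains keep everything above the crossover
    mains = np.column_stack([signal.sosfilt(hp, left), signal.sosfilt(hp, right)])
    return mains, sub

# Example: one second of a 50 Hz + 1 kHz test signal in both channels
t = np.arange(48000) / 48000
test = np.column_stack([np.sin(2 * np.pi * 50 * t) + np.sin(2 * np.pi * 1000 * t)] * 2)
mains, sub = bass_manage(test)
```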
But stepping back a bit, the real question is this: what’s the difference between PCM 2.0 and Dolby Digital 2.0, given that they are both stereo signals?
PCM stands for Pulse Code Modulation, which is a pretty simple way of encoding digital audio, without compression: the waveform is sampled at regular intervals and each sample is stored as a number. It’s understood by just about every bit of AV kit out there, and you could call it the lingua franca of digital audio, I suppose.
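To make that concrete, here’s what PCM amounts to, sketched in Python: sample the waveform, quantise each sample to an integer, and that’s it. The test tone and 16-bit depth are just illustrative parameters:

```python
import numpy as np

SAMPLE_RATE = 48000  # samples per second, typical for broadcast audio

def pcm_encode(duration_s=1.0, freq_hz=440.0):
    """Encode a test tone as 16-bit PCM: just sample and quantise,
    with no compression of any kind."""
    t = np.arange(int(SAMPLE_RATE * duration_s)) / SAMPLE_RATE
    wave = np.sin(2 * np.pi * freq_hz * t)   # the waveform to be sampled
    return (wave * 32767).astype(np.int16)   # each sample is a plain 16-bit integer

samples = pcm_encode()
# Uncompressed size: 48,000 samples/s x 2 bytes, per channel
print(samples.nbytes, "bytes for one second of one channel")
```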
Dolby Digital is a codec that uses lossy compression, so theoretically, if you were to take the same audio signal and encode it in both PCM and Dolby Digital, you might (depending on the bit rates involved) hear a small difference, in favour of PCM.
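Some quick sums show the scale of the difference, and they also anticipate the S/PDIF point in the next paragraph. The Dolby Digital rate used here is a typical broadcast figure, not a fixed one:

```python
# Bit rates for 48 kHz, 16-bit audio (typical broadcast parameters)
sample_rate, bit_depth = 48_000, 16

pcm_stereo = sample_rate * bit_depth * 2   # ~1.5 Mbps uncompressed
pcm_5_1    = sample_rate * bit_depth * 6   # ~4.6 Mbps uncompressed

# Dolby Digital 5.1 at a typical broadcast rate (illustrative figure)
dd_5_1 = 448_000  # bps

# S/PDIF was designed to carry two channels of linear PCM (~1.5 Mbps),
# which is why 5.1 PCM won't fit but compressed DD 5.1 will.
print(f"PCM 2.0: {pcm_stereo / 1e6:.2f} Mbps")
print(f"PCM 5.1: {pcm_5_1 / 1e6:.2f} Mbps  (won't fit down S/PDIF)")
print(f"DD 5.1:  {dd_5_1 / 1e6:.2f} Mbps  (fits comfortably)")
```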
Both formats can be used for stereo or multi-channel audio. However, because PCM is uncompressed, you can’t really get more than two channels of it down an S/PDIF link, whether optical or coaxial, though some kit will support multi-channel PCM via HDMI. But I digress. Back to the 2.0 versions of each:
On Standard Definition Freeview (and most other services) the soundtrack uses MP2 (MPEG-1 Layer II) compression. Some kit will support that, but not all, and when you’re using a digital connection, whether HDMI or S/PDIF, the most straightforward thing to do is simply to decode it to PCM, so you get a PCM 2.0 stream from your receiver.
On High Definition channels on Freeview, the sound uses the AAC codec (and can use HE-AAC, the High Efficiency version of the same codec). This is used for both stereo and multi-channel programmes, and like the MP2 audio on standard def channels, it can be decoded to PCM.
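You can try the same decode step yourself with a tool like ffmpeg; given a recording of an off-air transport stream (the filenames below are just placeholders), it will decode the MP2 or AAC track down to plain PCM in a WAV file:

```python
import subprocess

# Decode the broadcast audio track (MP2 on SD, AAC on HD) to 16-bit PCM.
# "recording.ts" and "decoded.wav" are placeholder filenames.
subprocess.run([
    "ffmpeg",
    "-i", "recording.ts",   # input: an off-air transport stream
    "-vn",                  # ignore the video
    "-c:a", "pcm_s16le",    # decode the audio to 16-bit little-endian PCM
    "decoded.wav",
], check=True)
```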
However, not all home AV kit has HDMI, so many people will be relying on an S/PDIF connection for their audio. And some kit with HDMI doesn’t support multi-channel PCM, either.
So, for surround broadcasts, the most sensible thing to do for broadest compatibility is to convert the AAC multi-channel audio into Dolby Digital 5.1, and the chipsets in many Freeview HD products are capable of doing this. Kit connected via HDMI can tell the receiver whether it supports multi-channel PCM (via its EDID), but that’s not possible via S/PDIF, which is a one-way connection. So, creating Dolby Digital is a solution that will work for most kit, whether the programme is in stereo or multi-channel audio.
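Put together, the receiver’s reasoning looks roughly like this. It’s a sketch under my own assumptions, with illustrative names throughout, not any real box’s firmware; the bit about switching glitches is explained in the next paragraph:

```python
def choose_output_format(link, sink_supports_multichannel_pcm, channels):
    """Pick an output format for decoded broadcast audio.

    A sketch of the article's reasoning, not real firmware.
    `link` is "hdmi" or "spdif"; over HDMI the sink's capabilities
    are known (from its EDID), over S/PDIF they aren't.
    """
    if link == "hdmi" and sink_supports_multichannel_pcm:
        # HDMI told us multi-channel PCM is fine, so just decode and send.
        return "PCM 2.0" if channels == 2 else f"PCM {channels - 1}.1"
    # S/PDIF (or limited HDMI kit): stick to one codec, Dolby Digital,
    # and just vary the channel count, avoiding codec-switching glitches.
    return "DD 2.0" if channels == 2 else "DD 5.1"

print(choose_output_format("spdif", False, 2))  # DD 2.0
print(choose_output_format("spdif", False, 6))  # DD 5.1
print(choose_output_format("hdmi", True, 6))    # PCM 5.1
```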
It would be possible to switch between using Dolby Digital 5.1 and PCM 2.0 when a programme is broadcast in stereo on HD channels, but that’s not generally considered a great idea. Lots of kit will handle it perfectly, but some won’t, and you may get a momentary loss of sound, or some other audible glitch. So, it’s best to stick to the same codec, and simply indicate the number of channels included in the stream.
This is less of a problem when changing between SD and HD channels, of course, because you’ll be expecting to lose sound and vision anyway.
The quick answer to the question, then, is that it’s just a different codec; the signal is still in stereo. And the details above explain why it makes sense to create a Dolby Digital signal even for a stereo programme.