OK, this is attributed to Stephen Massey (Massey Plugins). I’ve no reason to believe it isn’t by SM, but I can’t seem to locate the original. Regardless, it effectively trashes the -0.3 dB philosophy:
There’s this theory in mastering that you should leave a tiny bit of digital headroom in your brickwall limiter’s output. For example, if you’re bouncing down a mix using the L2007 last in the chain, the theory says you should drop the max. output control a fraction of a dB. The precise value varies by opinion: -0.3 dB, -0.1 dB, -0.5 dB, etc.
The reasoning is that the analog signal might clip at its loudest peaks, once reconstructed at the output by the digital-to-analog converters in the listener’s playback device. That is, if the analog circuitry hasn’t been well designed. This phenomenon has been labeled “inter-sample clipping.” It’s a reasonable idea based on sound electrical engineering analysis. It probably happens in rare cases. There was a paper published a while back demonstrating it was possible in real-life CD players.
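For the curious, here is a minimal sketch of the effect (my own illustration, not the paper’s test and not the TL Labs method): a tone at a quarter of the sample rate whose samples all land off-peak can read exactly 0 dBFS sample-wise while the reconstructed waveform runs about 3 dB over full scale. The fs/4 tone, the 8x oversampling factor, and the use of scipy.signal.resample as a stand-in for a DAC’s reconstruction filter are all assumptions chosen for illustration.

```python
import numpy as np
from scipy.signal import resample

# A sine at fs/4 whose samples all land at +/-0.707 of the true waveform
# peak. Normalizing the *samples* to 0 dBFS pushes the underlying
# continuous waveform about 3 dB over full scale.
fs = 48_000
n = np.arange(4 * 1024)                     # an integer number of 4-sample periods
x = np.sin(2 * np.pi * (fs / 4) * n / fs + np.pi / 4)
x /= np.max(np.abs(x))                      # sample peak is now exactly 0 dBFS

# 8x band-limited (FFT) oversampling stands in for the smoothing a DAC's
# reconstruction filter performs.
x8 = resample(x, len(x) * 8)

print(f"sample peak      : {20 * np.log10(np.max(np.abs(x))):+.2f} dBFS")
print(f"inter-sample peak: {20 * np.log10(np.max(np.abs(x8))):+.2f} dBFS")
```

Using a tone that sits exactly on an FFT bin keeps the resampling artifact-free, so the printed overshoot is the true band-limited value.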
The company I once worked for, TL Labs, designed a metering plugin to model this process. You can read more about inter-sample clipping in the user guide: TL Labs Plugins Guide.
But, personally, I’ve never bought into this story in its entirety. (Maybe you can tell from my other postings, but I suffer from chronic skepticism of any and all dogma.)
This theory raises deeper questions, such as: is this clipping at all audible above the massive distortion already introduced by the limiting in the first place? My guess is: no. Significant inter-sample clipping goes hand-in-hand with substantial brickwall limiting levels. After blowing out the music with limiter distortion, it’s a little too late to start fretting like an audiophile.
Very few folks even use a CD player anymore, and the CD player is the original foundation of this theory. If someone’s still using a CD player, they’re probably an audiophile and own a well-designed model.
Or, they’re living in the past, don’t care much about sonic quality, and won’t be buying your modern, smashed CD anyway. If the listener is using an MP3 player, computer, or other media player, then the gain scaling for the volume control sometimes happens in the digital domain well before reaching the digital-to-analog converters. This means the output is nowhere near the power “rails” of the analog circuitry. (Furthermore, what impact does MP3 compression generally have on peak levels?)
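To put the digital-volume-control point in concrete terms, here is a back-of-envelope sketch; the -20 dB listening level and the +3 dB reconstructed overshoot are purely illustrative assumptions, not measurements from any particular player.

```python
# Illustrative arithmetic only: assume a worst-case inter-sample peak 3 dB
# over nominal full scale, and a media player whose volume control is
# applied in the digital domain before the signal reaches the DAC.
intersample_peak_db = +3.0    # hypothetical reconstructed overshoot
digital_volume_db = -20.0     # hypothetical digital-domain listening level

# The attenuation scales the overshoot along with everything else, so the
# converter (and the analog stage behind it) sees a peak far below 0 dBFS.
peak_at_dac_db = intersample_peak_db + digital_volume_db
print(f"peak presented to the converter: {peak_at_dac_db:+.1f} dBFS")  # -17.0 dBFS
```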
Most perplexing is the promotion of such a minuscule “headroom” value of -0.3 dB, etc. This isn’t going to get you any audible decrease in distortion in the event of actual clipping. You’re not going to hear the loss of that tiny 0.3 dB tip of the sound wave. But you don’t have to take my word for it: insert a gain plugin on your master fader, last in the chain. Go for broke: set it to +0.5 dB and listen to your mix. How far can you push it?
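If you’d rather put rough numbers on that “tiny tip” before reaching for a gain plugin, here is a quick sketch: hard-clip a pure tone at a few depths and measure the harmonic distortion that results. The 1 kHz tone, the clip depths, and the THD measure are my own choices for illustration, and a clipped sine is obviously a far gentler case than a limited mix.

```python
import numpy as np

fs, f0 = 48_000, 1_000                      # one second of a 1 kHz tone
t = np.arange(fs) / fs
tone = np.sin(2 * np.pi * f0 * t)

def thd_after_clip(clip_db):
    """Hard-clip the tone clip_db below its peak; return total harmonic
    distortion relative to the fundamental."""
    ceiling = 10 ** (clip_db / 20)
    clipped = np.clip(tone, -ceiling, ceiling)
    spec = np.abs(np.fft.rfft(clipped))
    fund = spec[f0]                         # 1 Hz bins, so f0 lands exactly on bin f0
    rest = np.sqrt(np.sum(spec ** 2) - spec[0] ** 2 - fund ** 2)
    return rest / fund

for clip_db in (-0.1, -0.3, -0.5, -3.0):
    print(f"clip {clip_db:+.1f} dB below peak -> THD ~ {100 * thd_after_clip(clip_db):.2f} %")
```

Whatever numbers come out, the relevant comparison is with the distortion the limiting itself has already added.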
If you’re genuinely concerned about fidelity, then I say do something more substantive and give us 2 or 3 dB of headroom. Otherwise, you’re just playing a psychological game, buying some emotional comfort from the self-deluding marketplace of audio-engineering groupthink. The TL Labs meter was always a curious irony to me: winning the loudness war is mutually exclusive with achieving fidelity, but here was a gadget trying to sell us both.
Anyhow, if you click the “max. output” label on the purchased version of the L2007, a text entry box will appear and you can punch in an exact value. A little secret: -0.5 is usually a little bit grainy and digital-sounding, but -0.6 can be rapturously warm and fuzzy. But, you’ll have to upgrade from the demo version to find out!