Not disagreeing, but long ago and far away, when CDs were played on CD players, the players would use oversampling or single-bit formats to let the D/A converter operate in its most accurate range. When a sample uses only a small number of the available bits, the chances the converter will incorrectly resolve the sample go up, so some players would shift everything up into the converter's more accurate range.
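A quick sketch of why quiet signals resolve worse (illustrative only, not any particular converter's behavior): the quantization step is fixed, so a signal 40 dB below full scale only exercises about 10 of a 16-bit converter's bits and loses roughly 40 dB of signal-to-noise ratio.

```python
import math

def quantize_snr_db(amplitude, bits=16):
    """Quantize a sine of the given amplitude (1.0 = full scale)
    to `bits` and return the resulting SNR in dB."""
    n = 48000
    step = 2.0 / (2 ** bits)            # quantizer step over the range [-1, 1)
    sig_pow = err_pow = 0.0
    for i in range(n):
        x = amplitude * math.sin(2 * math.pi * 997 * i / n)
        q = round(x / step) * step      # ideal mid-tread quantizer
        sig_pow += x * x
        err_pow += (x - q) ** 2
    return 10 * math.log10(sig_pow / err_pow)

# Full-scale tone vs one 40 dB down: the quiet tone's SNR drops by ~40 dB
print(quantize_snr_db(1.0))    # roughly 98 dB (the classic 16-bit figure)
print(quantize_snr_db(0.01))   # roughly 58 dB
```

That is the textbook "6 dB per bit" relationship; oversampling and noise shaping were ways of pushing the conversion into a range where that fixed noise floor mattered less.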
Whether this is relevant to a DAW's A/D converter is questionable, but it is an interesting question for inexpensive interfaces. Recording at a higher bit depth would probably give some signal-to-noise advantage, but most playback systems and streaming formats throw all of that away anyway.