Victor Shevtsov, a sound engineer and physicist, some time ago began to explore the problem of objectively comparing the sound quality of CDs and vinyl records.
He found that a vinyl record is more precise in reproducing sound harmonics than digital audio at CD resolution (44.1 kHz/16 bit). This could be an interesting technical “proof” that
"vinyl sounds better" is not a myth - if "better" means "closer to the original and more realistic".
The best way to determine the precision of an audio device or chain is to compare its input and output audio signals. Let's try to do this for the CD player. I used 96 kHz/24 bit audio as a high-quality original signal (the "input signal") with a spectrum like the one shown in Fig.1. We can emulate the "CD mastering" by decimating to 44.1 kHz and down-converting to 16 bits, and then the "CD playback" by up-converting and up-sampling back to 96 kHz/24 bits (this is similar to what the DAC does during playback in a high-quality CD player). For all conversions I used high-quality iZotope algorithms with dithering and an anti-aliasing filter. Now we can compare this final sound file with the original one. I used the Cubase multitrack editor for sample-accurate subtraction of these two files from each other, which resulted in a file with a spectrum like the one shown in Fig.2. This difference between the "input" and "output" signals of our emulated CD system lies in the audible frequency range, so this system obviously suffers significant data loss and reduces the sound quality. We can call this difference a "collection of lost harmonics" (I have some additional clarification for this statement).
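The round trip described above can be sketched in a few lines of code. This is only an illustration of the procedure, not the author's actual workflow: scipy's polyphase resampler and a simple TPDF dither stand in for the iZotope algorithms, and a synthetic two-tone signal stands in for real 96 kHz audio.

```python
import numpy as np
from scipy.signal import resample_poly

def quantize(x, bits, dither=True):
    """Quantize a float signal in [-1, 1) to the given bit depth,
    with optional TPDF dither (a simple stand-in for studio dithering)."""
    q = 2 ** (bits - 1)
    if dither:
        # Triangular-PDF dither, +/- 1 LSB peak-to-peak
        x = x + (np.random.uniform(-0.5, 0.5, x.shape)
                 + np.random.uniform(-0.5, 0.5, x.shape)) / q
    return np.clip(np.round(x * q), -q, q - 1) / q

# "Original": 1 second of a synthetic 96 kHz signal with content above 22 kHz
fs_hi = 96000
t = np.arange(fs_hi) / fs_hi
original = 0.5 * np.sin(2 * np.pi * 1000 * t) + 0.1 * np.sin(2 * np.pi * 30000 * t)

# Emulated CD mastering: decimate 96 kHz -> 44.1 kHz (ratio 147/320), 16 bits
cd = quantize(resample_poly(original, 147, 320), bits=16)

# Emulated CD playback: upsample back to 96 kHz / 24 bits
restored = quantize(resample_poly(cd, 320, 147), bits=24)

# Sample-accurate subtraction -> the "collection of lost harmonics"
n = min(len(original), len(restored))
residual = original[:n] - restored[:n]
rms_db = 20 * np.log10(np.sqrt(np.mean(residual ** 2)))
print("residual RMS: %.1f dBFS" % rms_db)
```

In this synthetic case the residual is dominated by the 30 kHz component removed by the anti-aliasing filter, plus quantization noise; with real music the residual spectrum is what Fig.2 shows.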
So I suggest a frequency-dependent parameter: the MINIMUM "ORIGINAL SIGNAL" HARMONICS PLAYBACK LEVEL relative to the full signal level (at each frequency). We can obtain this graph for the CD format by subtracting the amplitudes of these two spectra at each frequency - Fig.3, graph "CD" (it is, of course, the product of averaging the results for several different audio files).
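As a sketch of how this parameter could be computed (the function name and the dummy spectra are illustrative, not from the article): subtracting dB amplitudes at each frequency is the same as taking the dB ratio of the residual spectrum to the original spectrum, then averaging the resulting curves over several files.

```python
import numpy as np

def playback_level_db(orig_amp, resid_amp, eps=1e-12):
    """Per-frequency level of the lost part of the signal relative to the
    full signal level, in dB. Inputs are magnitude spectra on the same
    frequency grid; eps guards against log of zero."""
    return 20 * np.log10((np.asarray(resid_amp) + eps) /
                         (np.asarray(orig_amp) + eps))

# Averaging over several "files" (dummy two-bin spectra for illustration):
curves = [playback_level_db(o, r) for o, r in [
    ([1.0, 0.5], [0.01, 0.02]),
    ([1.0, 0.4], [0.02, 0.01]),
]]
cd_curve = np.mean(curves, axis=0)
```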
In order to investigate the accuracy of sound reproduction from vinyl records, I used my mid-class equipment (a Technics 1210 MK2 with a Benz-Micro MC20E2/H cartridge and a Pro-Ject 7 Amp preamp) and a test record made by Melodiya in the 1980s (VG condition) with sine waves up to 20 kHz. I recorded these signals with a MOTU 828 mk2 ADC and obtained spectra like the one shown in Fig.4. Here, from left to right, we can see the main tone and the 2nd and 3rd distortion harmonics. From such spectra we can obtain the level of the "noise floor" formed by the 2nd and 3rd distortion harmonics at each frequency (Fig.3, "vinyl"). This last graph can also be qualified as a "minimum harmonics playback level", which lets us compare the precision of CD and vinyl sound reproduction.
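Reading the 2nd and 3rd harmonic levels off such a spectrum can be automated roughly as follows. This is a hypothetical helper, not the author's measurement chain: the FFT/window choices are illustrative, and a synthetic tone with known distortion stands in for the recording of the Melodiya test disc.

```python
import numpy as np

def harmonic_levels(signal, fs, f0):
    """Estimate the 2nd and 3rd harmonic levels of a recorded test tone,
    in dB relative to the fundamental, via an FFT peak search."""
    win = np.hanning(len(signal))
    spec = np.abs(np.fft.rfft(signal * win))
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)

    def peak(f):
        # search within +/- 2% of the expected frequency
        mask = (freqs > 0.98 * f) & (freqs < 1.02 * f)
        return spec[mask].max()

    p1 = peak(f0)
    return [20 * np.log10(peak(k * f0) / p1) for k in (2, 3)]

# Synthetic stand-in for a recorded test tone: 1 kHz fundamental with
# a -40 dB 2nd harmonic and a roughly -50 dB 3rd harmonic
fs = 96000
t = np.arange(fs) / fs
tone = (np.sin(2 * np.pi * 1000 * t)
        + 0.01 * np.sin(2 * np.pi * 2000 * t)
        + 0.003 * np.sin(2 * np.pi * 3000 * t))
print(harmonic_levels(tone, fs, 1000))
```

Repeating this for each test frequency on the record yields the per-frequency distortion "noise floor" plotted as the "vinyl" curve in Fig.3.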
Such a comparison shows that THE PRECISION OF HARMONICS REPRODUCTION IS MORE THAN TWICE AS GOOD (on average, 6 dB in amplitude) FOR THE VINYL RECORD, even played on mid-class equipment, THAN FOR THE CD IN THE FREQUENCY RANGE FROM 3 kHz TO 12 kHz, WHICH IS VERY IMPORTANT FOR TIMBRE REPRODUCTION.
The main point is that in the case of vinyl records we can investigate the linearity of the sound reproduction system using the classical method of measuring harmonic distortion components and use, for example, the THD parameter to estimate sonic precision. The situation is different with the CD. Although "classical" THD measurements can show extremely low values, there are still data (and sound) losses for real (non-sine) signals, for example due to the truncation from 24 bits to 16 bits and the decimation from 96 kHz to 44.1 kHz.
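For reference, the classical THD measurement mentioned above can be sketched like this (an illustrative implementation, assuming a tone whose frequency falls on the FFT grid; real measurements would use windowing corrections and averaging):

```python
import numpy as np

def thd(signal, fs, f0, n_harmonics=5):
    """Classical THD: RMS sum of harmonics 2..n relative to the fundamental."""
    spec = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)

    def amp(f):
        mask = (freqs > 0.98 * f) & (freqs < 1.02 * f)
        return spec[mask].max()

    fund = amp(f0)
    harm = np.sqrt(sum(amp(k * f0) ** 2 for k in range(2, n_harmonics + 1)))
    return harm / fund

# A sine with a single 1% (-40 dB) 2nd harmonic should give THD close to 0.01
fs = 96000
t = np.arange(fs) / fs
test_tone = np.sin(2 * np.pi * 1000 * t) + 0.01 * np.sin(2 * np.pi * 2000 * t)
thd_val = thd(test_tone, fs, 1000)
print("THD = %.4f" % thd_val)
```

The catch the paragraph describes is exactly that this figure is computed from a steady sine: a CD chain can score near-zero THD here while still discarding information from a complex, real-world signal.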