Naim Input Sensitivity

My DAC has an output voltage of 2V. When playing through my Naim NAC 202 the volume dial sits very low, at about the 7 o'clock position. This gives me very little control over the volume, as small movements of the dial produce large increases in level. Should I reduce the output voltage of the DAC so I can turn the volume dial of the NAC 202 to a more usable position, or will this degrade sound quality? What is the optimum input voltage?

Naim pre-amps can handle up to 7V before overload. However, as you have found, with modern digital mastering the 2V standard for digital sources sometimes leaves only a narrow usable range on the volume pot before the music gets too loud. The issue is exacerbated by many DACs and CD players which, although specified to output 2V, actually measure higher. Speaker sensitivity also has an effect.
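For a rough feel of the numbers (a minimal sketch, taking the 7V overload figure and the 2V standard above at face value):

```python
import math

def db(v, v_ref):
    """Voltage ratio expressed in decibels."""
    return 20 * math.log10(v / v_ref)

redbook = 2.0    # nominal 2 V rms digital source output
overload = 7.0   # quoted Naim pre-amp input overload point, V rms

print(f"Headroom from 2 V to overload: {db(overload, redbook):.1f} dB")    # ~10.9 dB
print(f"Effect of dropping DAC output to 1 V: {db(1.0, redbook):.1f} dB")  # -6.0 dB
```

So there is roughly 11 dB of headroom before overload, and halving the DAC output to 1V buys about 6 dB of extra travel on the pot.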

If your DAC allows you to reduce the output to around 1V without reducing resolution then that’s the best step to take as SQ will not be degraded.


Hi Richard,

if line level only overloads at 7 V rms, that doesn't explain why adequate volume with a 2 V rms source is reached between 7 and 8 o'clock; shouldn't the volume knob be nearer the 12 o'clock position for this setup?

I have a similar issue with an external LCR phono stage with 46 dB of gain, a SUT (1:32) and an SPU Meister cartridge (0.3 mV, 1.5 Ω output impedance).
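As a rough sanity check on that chain (a sketch only; it assumes the 0.3 mV cartridge figure is at the usual 5 cm/s reference and ignores loading losses):

```python
cart_mv = 0.3        # cartridge output, mV (assumed to be at the 5 cm/s reference)
sut_ratio = 32       # 1:32 step-up transformer
phono_gain_db = 46   # LCR phono stage gain, dB

after_sut_mv = cart_mv * sut_ratio                               # ~9.6 mV into the phono stage
line_out_v = (after_sut_mv / 1000) * 10 ** (phono_gain_db / 20)
print(f"Approximate line-level output: {line_out_v:.2f} V rms")  # ~1.9 V
```

That lands in the same ballpark as a 2 V DAC, so a similarly cramped volume-pot range would be expected.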

Not really. Adequate volume will depend on a number of factors.

On vinyl on my system (Superline E with Apheta 2), loud starts around 11 o’clock. Very loud by 12, disco by 2. That’s with SL2s, which are around 89dB sensitivity.

Via a NAIM CD player or the Naim DAC it’s actually very similar provided the CD or file was well mastered (i.e. prior to the added loudness, brick walling and compression that arrived in the late '90s and is still with us on too many releases). Now, it could be argued that the 2V standard of Redbook was already on the high side. But it was OK. The problem has come with the ridiculous mastering of recent years which has squashed dynamic range and pushed levels even to digital clipping in some cases.

I noticed the same thing when I wired up my TT.
Music from CDs/rips is loud at 8 o'clock (newer CDs from the loudness-war era) and at around 9 o'clock for older ones (nDac input into the 52).
Music from the phono stage (Prefix) is much quieter and the volume can be adjusted more finely. Loud begins here at 10 o'clock - my usual listening level is 9 to 9:30 :slight_smile:

Have to be VERY careful when switching from phono to nDac!

The input sensitivity of most Naim pre-amps is very high (75mV), which means little usable range and potential channel balance problems, particularly if, like me, you prefer modest listening levels; as Richard says, this problem is made even worse as a result of ‘loudness wars’ CDs. It’s a real PITA, IMO.
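To put numbers on that (a minimal sketch, taking the quoted 75mV sensitivity and a nominal 2V source at face value):

```python
import math

sensitivity_v = 0.075   # quoted Naim pre-amp input sensitivity, 75 mV rms
source_v = 2.0          # nominal DAC/CD output, V rms

excess_db = 20 * math.log10(source_v / sensitivity_v)
print(f"Excess over input sensitivity: {excess_db:.1f} dB")  # ~28.5 dB
```

Roughly 28 dB of the pot's travel is spent just getting a 2V source back down to sensible listening levels, which is why the usable range ends up crammed into the first hour or two of rotation.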

Attenuation would seem to be the easy answer, but I believe Naim (used to?) recommend against this, as it was said to have a detrimental impact on musical performance?

Well, the channel tracking of a potentiometer varies with the level setting, so it makes more sense to design the gain stage so that you are always working in the middle of the pot's adjustment range.

And if you look at the treble and bass behaviour of a carbon or carbon-composite potentiometer with a square-wave signal on an oscilloscope, the resulting knob position should ideally sit around the 12 o'clock position as well. But as I understand it, Richard is pointing at correcting this "misbehaviour" on the source side, or by using a variable-input-sensitivity volume board in the Chrome Bumper or Olive series.

These are best avoided as, IIRC, there was also some FR massaging to try to make early CD sound a bit more palatable.


The problem stems from early Naim amplification, in the days before CD, when line-level sources typically had much lower output levels; even when Naim introduced their own CD players, they stuck with 75mV.

As Richard points out, the variable sensitivity *28 boards are also filtered, so best avoided.

@Don_Camillo, I notice from your other post that you have very high sensitivity loudspeakers, which will make the problem even (much) more apparent; I struggle with 85dB speakers, never mind the 90 and 100dB that you have! :astonished:


Yes, the high sensitivity of my speakers makes the issue even more apparent. To be honest, the compression driver for the mids goes up to a maximum of 120 dB when operated without a crossover at around 2000 Hz. The crossover levels this out over the entire frequency band to an average of 105 dB, with -6 dB points at 30 Hz and 18 kHz.

Thanks Richard. Hmm, I'll have a look at what mic-input SUTs are napping in the parts box. Otherwise I'll go with a basic voltage divider for the level adjustment.
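If it helps, here is a minimal sketch of the voltage-divider sums. The resistor values are purely illustrative; in practice you would want the series value to stay low relative to the pre-amp's input impedance and the shunt value to stay high relative to the source's output impedance:

```python
import math

def divider_attenuation_db(r_series, r_shunt):
    """Attenuation of a simple series/shunt resistive divider, in dB."""
    ratio = r_shunt / (r_series + r_shunt)
    return 20 * math.log10(ratio)

# Illustrative values only: roughly -12 dB, i.e. 2 V in -> ~0.5 V out
r_series = 6800.0   # ohms
r_shunt = 2200.0    # ohms
print(f"Attenuation: {divider_attenuation_db(r_series, r_shunt):.1f} dB")
```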
