Can an interconnect be too good for an amplifier?

I have a friend with a Nait XS 2 powered by a HiCap DR, a CD5 XS powered by a HiCap DR, and a Michell Gyro SE. For a long time he has been using Naim Lavender leads. Last year he swapped them out for Vertere Redlines. We had a listening session yesterday because he felt they were lacking the soundstage and detail he was used to, and to be honest, so did I. The thing that came to my mind, though, was that maybe the leads were too good for the amp. I’m thinking that because I have a full Redline loom and it beats Lavender by a long way, maybe his issue is that the Nait XS 2 is not up to getting the most out of the Redlines.

Others thoughts on this would be very welcome.

I think it’s all about balance in a system. I also have a Vertere Redline full loom, originally on a 332/300/250NC and now with my Nait 50. It’s superb, but I can imagine it might not be in synergy with another amp. Either sell the Redlines or pick up a Nait 50…..

1 Like

Let’s try to apply some logic here. If the original Naim wires, which, don’t forget, were specifically chosen to work with Naim stuff across the piece, work well, why wouldn’t Vertere wires work the same? And in my experience the various Vertere wires I’ve tried with various Naim stuff were all excellent.

An interconnect can’t be too good but it can:

  • Be a disproportionate and non-cost-effective upgrade at its price compared to other options.
  • Simply be a poor sonic match (at any price).
9 Likes

Provided they have decent wire and screening, with well-made connections to decent-quality plugs, there is no such thing as a good or bad cable. But the cable’s characteristics may modify the sound in some way, and in part that can depend on each of the units they connect. Where an audible difference between particular cables is evident, it seems common to select a bit like using tone controls, to find what sounds best to the individual’s ears in their room. So what works well with one setup might not with another, and what is not so good in one may be very good in another.

1 Like

The obvious test would be to put the Naim cables back in and to see if the missing elements return. The Redlines aren’t cheap but they are not totally out of proportion in the way that something like Chord Music would be.

One thing I’d look at is the power supplies. The Hicap is not generally considered a good match with the 5 series CDPs; indeed I tried one on my then CD5X and it made the thing sound bonkers, like it was on speed. The Flatcap was a much better match. People have said similar about the Nait / Hicap match but I’ve not tried it. Then again, your friend must have tried it before buying, but if there is something not right I’d be looking at the Hicaps before the cables.

2 Likes

When I updated from 272/XPS DR/250 DR to 222/300/250, I swapped my SL 250DR cable, which, to be brutally honest, hadn’t delivered a chalk-and-cheese moment in terms of SQ, for the appropriate SL XLR interconnect. Having used the Naim in-the-box XLRs up to that point, I was really impressed with the performance results: deffo a lightbulb moment.

But if I hadn’t had the older SL interconnect to trade in, I’d probably have been more than happy with the supplied XLRs for quite some time/ever……?

ATB, J

I had a chat with him earlier on. He still has his Flatcap XS, so will try that next week (when SWMBO isn’t at home) :grin: I’m amazed she has never noticed two Hicaps in place of one Flatcap lol.

No, but it can be WAY too expensive!

3 Likes

It might help. Powering a CD5 XS with a HiCap DR is often described as making the player sound “frenetic” or “on steroids”. When combined with the high-energy delivery of a full Redline loom, the resulting sound can become overblown or aggressive, losing the natural “flow” and subtle detail that the simpler Lavender cables maintained.

The Vertere Redlines are perfectly fine with a Nait XS amp. They are, however, directional if using RCA to RCA.

I have used Vertere DFI cables at home with my Naim systems, with an outdoor Bose speaker, and in my car aux, and I would never like to listen to music without them.

Always remember: interconnects cannot add anything. ‘Better’ interconnects detract less.

2 Likes

Or “better sounding” interconnects detract in a way that sounds better to your ears in your room with your system…

1 Like

Excellent sum up.

I would disagree with some of the posts, in that I do believe at times an interconnect can be too good for an amplifier. Typically the more expensive the cable is, the more revealing/transparent it is. Now if it’s connected to good kit, this is typically a good thing. If it is connected to not so good kit, it can expose weaknesses in said kit. So there is the potential for a cable to be “too good”.

Typically, manufacturers will make their entry-level cables warmer/fuller/less transparent, because these will match better with, and mask flaws in, the more entry-level kit they’re expected to be paired with. Flagship cables are often very transparent/resolving and more neutral-sounding, so they get out of the way and let you hear the kit itself, something you typically want with top-notch kit, often not so much with entry-level kit.

1 Like

A cable can’t improve detail in whatever signal it carries; a perfect cable would simply deliver it unchanged. However, due to interactions with the source, the sink, or directly with the signal, the signal can potentially be changed, detracted from, as @simon put it. But perfect transmission is not being too good for an amplifier, though it could be considered too good for a bad source, where changes to the signal, for example suppressing an excessive top end relative to the bottom end, could improve the sound.

I believe this is in response to my post, though I never indicated a cable can improve detail vs the original signal. However without going too far off topic, I could see where this might be perceived as possible. If you have a “perfect cable” & compare it to another cable that is elevated in a certain frequency range, this “non perfect cable” may actually be perceived as “adding more detail”?

Unless I’m reading incorrectly your post seems somewhat contradictory. Yes, technically a cable can’t be too good for an amplifier but all we care about as listeners is what our ears hear, & in this case I would still say that is possible.

If we take someone with all entry-level kit in a standard room, and he upgrades all his cables to top-flight cables built from silver wire and rhodium connectors, where previously he may have used oxygen-free copper cables paired with gold-plated connectors: technically the silver/rhodium cables are far superior, far more transparent and accurate to the source, but it’s highly likely he would prefer the sound of the far “inferior” copper/gold combo.

1 Like

No, an interconnect cable cannot be too good.

All we can ask of these cables is to be a perfect conduit for the audio signal into the amp i.e. without fault or added colouration, what came in goes out exactly in the same way and protects the signal from external interference. The amp does whatever it does from there with that signal.

That does not mean cables in reality do this or that one cable does not sound better than another. It is the cable that is doing that not the amp.

The amp may have its own issues, but it can’t really be put off by having too pure an input signal. The cable is a conduit not a source.

1 Like

But the really big question is: is it? Or, for example, might it perhaps be possible that the cable has lower impedance at higher frequencies than at lower frequencies, so sounding ‘brighter’, and is hence perceived as more transparent because the listener can then pick out some detail better, while actually passing no more information, and possibly presenting a sound that is actually coloured/inaccurate? Without either analytical measurement of the signal and recording, or direct comparison of the recording and the amp input, anything is of course just conjecture.
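For what it’s worth, the dominant first-order electrical effect of a line-level interconnect into a typical solid-state input is the RC low-pass formed by the source’s output impedance and the cable’s capacitance. The numbers below are illustrative assumptions (not measurements of any Vertere or Naim cable), but they show why, for sane values, the corner frequency lands far above the audio band:

```python
import math

def cable_corner_hz(source_impedance_ohms: float,
                    capacitance_pf_per_m: float,
                    length_m: float) -> float:
    """-3 dB corner of the first-order low-pass formed by the source's
    output impedance R and the cable's total capacitance C:
    f = 1 / (2 * pi * R * C)."""
    c_farads = capacitance_pf_per_m * length_m * 1e-12  # pF -> F
    return 1.0 / (2.0 * math.pi * source_impedance_ohms * c_farads)

# Assumed illustrative values: 100-ohm source output impedance,
# 1 m of cable at 100 pF/m.
print(round(cable_corner_hz(100, 100, 1.0)))  # roughly 16 MHz
```

A corner in the MHz region is three orders of magnitude above audibility, which is the measurement-side argument that differences between competently made interconnects are matters of system matching and perception rather than bandwidth.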

BTW, regarding the question of “too good for the amp”, my post wasn’t contradictory, rather I was suggesting that if it is too good for anything then it surely would be the source for which it is too good.

I think speakers being too revealing for a lesser amp or source and laying bare their deficiencies is more likely than a cable.

I’ve done some extreme cable tests in the past where the synergy was good and the cable was way more expensive than either of the bits it was connecting with good results - albeit not cost effective results.

For example, Arcam components are all designed with Audioquest in mind. The brand synergy is very strong indeed. Each Arcam source has a recommended corresponding Audioquest cable, and when testing their entry-level CD player back in the 90s (the Alpha One) I used the Audioquest Diamond rather than the Ruby. So a GBP600 cable on a GBP300 CD player. It was quite an eye-opener. It showed how much information was really available, and I dare say it outperformed an Alpha 5 player (with the recommended cable, not the Diamond) by some margin. But the combination was GBP900, and the Alpha 5 with the recommended Quartz was GBP580. And a GBP800 Linn Mimik II came with its recommended Linn cable and was miles better than the Alpha One with the Diamond, for less money. As it should be.

So we come back to my first post that the cable can be simply wrong (regardless of price) or it can have great synergy (as in this case), but ceases to be the cost effective way forward. While it detracts from the source less, a better source puts more information out than a poor cable is masking in the first place.

So for any source, one has to ask: at what point does the cost of the cable tilt in favour of simply getting a better source? There isn’t a rule for that, by the way. I’d like to say a good guesstimate is 10% of cost on cable, but I have systems where the cable is more like 25% the cost of the source and others where the right cable was more like 1% the cost.

1 Like