No, you don’t. (Did you not read my post?)
As I said above:
and
Good, so we can rule out electrical effects in these dielectrics.
On to other possibilities…
1 Mechanical changes:
This makes sense for adaptation over time
a) For dielectrics, particularly with PTFE & PE (much less so for PP or XPE), due to creep; this fits with the usually indicated timescales (in the 10s or low 100s of hours).
b) For parts of the conductor system: mechanical stress may be able to affect softer metals such as solder with effects occurring in the usually indicated timescales.
2 Psychological / biological changes:
This is likely to be involved as no independent reference is being used to maintain organoleptic calibration.
a) Adaptation to expected stimuli. This again would fit well with the usually indicated timescales. It also fits the often recorded but otherwise paradoxical observation that this is a discontinuous process, sometimes even including a reversal of perceived quality gains.
b) Temporal drift. Sensory perception in complex animals (such as humans) is subject to temporal drift, enabling adaptation to the environment.
c) Expectation bias. People who expect ‘cable burn in’ will hear it, no matter whether it occurs or not.
There may be other mechanisms involved but electrically induced dielectric changes in the previously stated dielectrics can be ruled out. This throws doubt on ALL explanations given by accounts of the phenomenon that attribute any part of the changes to this mechanism (since the people proffering these explanations clearly lack sufficient understanding - they have failed to rule out the one obviously flawed proposal for a mechanism).
Am I missing something obvious here… if a cable really does ‘burn in’ over time, then it should be a cinch to compare its molecular structure under an electron microscope. You should be able to show how it has changed structure over time if that is indeed the case, right?
This assumes that the change is visible, but proponents of the idea will tell you that it’s not.
I don’t understand. Either the cables change or they do not. How could it not be visible under a scanning or even tunnelling electron microscope?
It’s easy to claim, e.g., that the changes are on the quantum level and you don’t see them under an electron microscope, or that we are not looking at the right things. Everything is easy to claim if proof is not required.
Well quite…
Firstly, the molecular structure shouldn’t change - we’re not expecting significant chemical changes, and should these occur they are likely to be swamped by oxidation effects.
If you’re suggesting mechanical changes in the materials (e.g. atomic migration at crystal boundaries), these would require a scanning tunnelling microscope rather than an electron microscope… and you have to know not just exactly where to look, but also exactly what it looked like before the change. The latter is particularly problematic as sample preparation for these techniques is destructive, making it impossible to get ‘before’ and ‘after’ images of the same location.
Superficial stress-induced changes in solder joints can be demonstrated using a scanning electron microscope, as can stress-accelerated corrosion in some solder types.
I guess my point is that if burn-in is a ‘thing’, then how it works should be explainable, repeatable (not just in audio), observable, and quantifiable. Other industries that rely on extreme levels of accuracy in their electrical circuits don’t seem to mention it as far as I can tell.
You are being much too rational, like an atheist asking for proof of the existence of (a) God.
The main argument employed by cable burn-in believers is the absence of disproving evidence:
“You cannot prove conclusively that it/he/something doesn’t exist”
This is a fallacious argument, since it is not possible to prove a negative. But as long as it is used, this will likely never be settled.
It’s much more likely that the effects of burn-in are psychological or physiological like @Innocent_Bystander explained earlier. But those who believe in physical burn-in can always say that our equipment is simply not precise enough to measure the effects.
Interesting how rare it is for someone to take two identical cables, put one into use and the other not (ideally lying side by side so any mechanical movement or environmental influence is the same), then after a period of use conduct blind comparative listening tests. The vast majority of claims about burn-in are based on hearing perception at one time compared with memory of another, not directly comparing a “burnt in” cable with another not burnt in.
Belief requires no proof.
Indeed. Or someone with the opportunity, like a dealer, doing the same with two preamps or whatever.
I believe one reason is that the test scenario is not straightforward: setting up a useful blind test with statistical significance is hard. And it seems to me that in audio it is harder than in many other fields, because you need the test subjects’ active participation and some ability. Having one guy switch between both items is not sufficient, particularly if one wants to convince skeptics.
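To make “statistical significance” concrete: a blind ABX run is commonly scored with a one-sided binomial test. A minimal sketch (my own illustration, not anyone’s published protocol; the function name and the 16-trial run are assumptions):

```python
from math import comb

def abx_p_value(correct: int, trials: int) -> float:
    """One-sided binomial p-value: the probability of getting at least
    `correct` answers right out of `trials` by pure guessing (p = 0.5)."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

# Illustrative run: a listener identifies the cable correctly 12 times out of 16.
print(f"p = {abx_p_value(12, 16):.4f}")  # ~0.0384, just under the usual 0.05 threshold
```

Note how unforgiving this is: 12/16 only just clears the conventional 0.05 bar, while 11/16 (p ≈ 0.105) does not, which is one reason casual switching sessions convince nobody.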
While I agree with your post, it is complicated by the fact that hifi aficionados have been told for decades that certain things are not measurable and/or cannot possibly make a difference, when they clearly do. So some skepticism directed at this is also understandable.
Yes, I agree with yours too.
In the end it’s best to not trust our feelings or gut instinct too much when it comes to science.
Instead of:
“I’m going to trust this claim because it feels right, until someone proves to me that it isn’t right”
A better approach is generally:
“I’m going to be sceptical of this claim regardless of how I feel about it, until it is actually proven to be correct”
After all, like @facefirst explained, if there is a clear audible effect, then surely that effect should be measurable too. Our hearing is ultimately not more sensitive than our most sensitive measuring equipment is.
While it’s possible that certain effects are so small or subtle that we haven’t correctly measured them yet, at the same time we know that the effects of psychological and physiological bias are huge. So it’s not unreasonable to assume that psychological factors are a more likely cause, when we cannot measure anything significant using objective methods.
Very well put
It’s also possible that the effects are measurable, but, so far, no one has performed a test that shows the effect.
A case in point is how the human ear/brain system can detect timing differences below the Fourier limit. It is possible to build an instrumentation system to do this, but not to do it in ‘real time’. Since doing it with delayed processing is of no real practical value when dealing with human-understandable audio signals, such instruments aren’t routinely built or used. (It is, however, very useful for submarine SONAR and for RADAR, and is implemented in those systems.)
My understanding is that the human brain can discern timing differences of between 4 and 10 microseconds. Our current measuring equipment is much more precise than that; I believe we can measure things in femtoseconds without much trouble!
The Fourier limit is about timing differences in the start of a periodic signal, within other signals such as noise. It isn’t a simple matter of resolving two edges and measuring the time difference (which, as you say, is possible on the femtosecond scale or even below); rather it requires analysis of the waveform to extract the characteristics of the periodic signal (analysed as one or more sinusoidal elements).
It’s this information analysis that can’t be done in real time. By the time sufficient data have been accumulated, the start of the signal has already passed! Detecting 10 μs differences between a 5 kHz and a 6 kHz signal represents less than 1/4 wavelength of either signal (the Fourier limit).
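A rough numeric sketch of that argument, using the 5 kHz / 6 kHz figures above (the 1/Δf window is the standard Fourier-resolution rule of thumb; the variable names are mine):

```python
# Why the analysis can't be done in real time: the observation window needed
# to tell the two tones apart dwarfs the timing difference being resolved.

f1, f2 = 5_000.0, 6_000.0   # the two tones from the example, in Hz
dt = 10e-6                  # timing difference to detect: 10 microseconds

t_window = 1.0 / abs(f2 - f1)  # ~time needed to separate the tones (1/delta-f)
print(f"Observation window needed: {t_window * 1e6:.0f} us")  # 1000 us = 1 ms

for f in (f1, f2):
    quarter = 0.25 / f  # a quarter period of each tone
    print(f"{f / 1000:.0f} kHz quarter period: {quarter * 1e6:.1f} us; "
          f"10 us is {dt / quarter:.0%} of it")
# By the time ~1 ms of data has been accumulated, the 10 us onset difference
# has long since passed - hence delayed processing only.
```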
I guess that would mean we can still do comparative analyses of a cable signal before and after a break-in period, but perhaps just not in real time?