In the context of burn-in, the implied result is always that the cables will audibly change for the better over the burn-in period. There is never talk of any other outcome, simply that burn-in is required for better performance. If "better" is a possible outcome, how much better is actually audibly better? And why are "not better" and "no change" not also possible outcomes? "Worse" and "no difference" are absolutely possibilities during burn-in, yet they are never discussed as possible outcomes.
Does the normal (Gaussian) distribution not apply to the burn-in of cables? It sure seems like it should, since it applies to just about everything else in the world.
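As a rough illustration of that point, here is a minimal sketch in Python, assuming purely for argument's sake that any net change over burn-in is a draw from a zero-mean Gaussian; the spread and the audibility threshold are arbitrary made-up numbers, not measurements:

```python
import random

# Toy model: suppose (purely hypothetically) that the net change in some
# audible parameter after burn-in is a draw from a zero-mean Gaussian.
# SIGMA and THRESHOLD are arbitrary values chosen only to illustrate
# the shape of the argument.
random.seed(0)
N_CABLES = 100_000
SIGMA = 1.0        # spread of the hypothetical change
THRESHOLD = 0.5    # below this magnitude, call the change inaudible

changes = [random.gauss(0.0, SIGMA) for _ in range(N_CABLES)]

better = sum(1 for c in changes if c > THRESHOLD)
worse = sum(1 for c in changes if c < -THRESHOLD)
no_change = N_CABLES - better - worse

print(f"audibly better:    {better / N_CABLES:.1%}")
print(f"audibly worse:     {worse / N_CABLES:.1%}")
print(f"no audible change: {no_change / N_CABLES:.1%}")
```

Under any symmetric, zero-centred distribution, "audibly better" and "audibly worse" come out in roughly equal proportion, which is exactly the outcome the burn-in narrative never mentions.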
It would seem that the outcome of burn-in simply should not be a binary function: no burn-in = bad vs. burn-in = better. If it were a binary function, there would be a magical turning point during the burn-in period at which bad instantaneously becomes better. But that magical, instantaneous point almost certainly doesn't exist; if there really is change, it is gradual in nature.
So, if there is some form of accelerated audible change to the equipment over the initial use period, does that not imply that the change will continue forever? Why would the change simply stop after a couple hundred hours of use?
Also, if change does occur during burn-in, does that not imply that the change will continue at some (perhaps reduced) rate forever? Big changes in the first few hundred hours of use and then no change at all, for eternity, seems a little far-fetched. If cables really do change audibly during burn-in, how long before that continual change becomes audibly unacceptable and those cables are burned-out, their useful life expired? At what point in a cable's life would it transition from burned-in to burned-out?
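To make that reductio concrete, here is a toy sketch, assuming, again purely for argument's sake, that the rate of change slows with use (here like 1/(1+t)) but never actually stops; every constant in it is an arbitrary illustration value, not a measurement:

```python
import math

# Toy model: the hourly rate of change decays like 1/(1 + t), so the
# cumulative change grows like log(1 + hours) -- it slows down but
# never stops. All constants are arbitrary, made-up illustration values.
RATE_0 = 1.0           # initial change per hour (arbitrary units)
BURNED_IN_AT = 200     # hours of "burn-in" claimed by proponents
BURN_OUT_LEVEL = 10.0  # cumulative change we deem audibly unacceptable

def cumulative_change(hours: float) -> float:
    """Integral of RATE_0 / (1 + t) from 0 to `hours`."""
    return RATE_0 * math.log(1.0 + hours)

print(f"change accumulated during burn-in: {cumulative_change(BURNED_IN_AT):.1f}")

# If the change never actually stops, it eventually crosses ANY threshold:
burn_out_hours = math.exp(BURN_OUT_LEVEL / RATE_0) - 1.0
print(f"'burned-out' after roughly {burn_out_hours:,.0f} hours")
```

The point of the sketch is only this: under any model in which the change never truly stops, there is some finite number of hours after which the cumulative change crosses whatever "unacceptable" threshold you care to set. Every burned-in cable would be a burned-out cable in waiting.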
Know anyone whose cables have burned out, in anything other than a house fire?