He'll need peer review before publishing.
Wow, just noticed new avatar - true beauty again!
I had a question: do these cables provide full ground isolation? Reading the link, I understand they do not transfer any power except internally for self-powering at both ends, but I couldn't determine whether the ground remains totally isolated (ground loops).
... For the first time anywhere, we have shown that we can change the analog output of a DAC by changing how we feed it in digital domain in USB ...
IMHO, there should be two types of USB cables available on the market, designed exclusively for audio D/A applications*:
one type that carries both the digital signal and the DC power;
another type that carries only the digital signal, to be used with DACs that have their own built-in power supply.
* I know very little about the catalogs issued by the USB manufacturers for audio applications ...
If, in these DACs, the 5V lines are terminated, using a USB cable that also carries the DC would create a ground loop, and therefore noise, unless the connection is transformer isolated.
The ground connection is required for device detection.
What about just recording the digital output from a DAC that is fed via a USB cable, to see whether the source file and the received data are identical?
You can't. Analog capture of a digital stream will have noise, variation in speed, etc. And will differ from run to run so no exact match can be found.
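For what it's worth, when the capture stays entirely in the digital domain (e.g. a loopback recording on the PC), bit-exactness is easy to check. A minimal sketch, with assumed file names:

```python
# Minimal sketch (file names assumed): check whether two PCM captures are
# bit-identical. Only meaningful for a purely digital capture; an analog
# re-capture will never match the source exactly.
import hashlib
import wave

def pcm_sha256(path: str) -> str:
    """Hash only the PCM frames, ignoring header metadata."""
    with wave.open(path, "rb") as w:
        frames = w.readframes(w.getnframes())
    return hashlib.sha256(frames).hexdigest()

src = pcm_sha256("source.wav")    # the file sent to the DAC (assumed name)
cap = pcm_sha256("captured.wav")  # the digital loopback capture (assumed name)
print("bit-identical" if src == cap else "files differ")
```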
No, digital jitter is too small to cause audible speed variations. What I mentioned is drift. Our hearing is not sensitive to that, so the clocks used in audio are allowed to change over time, temperature, etc. This is why your digital watch can lose time after a while. This drift occurs in a matter of seconds and will make run-to-run comparisons difficult.

So in the end we go back to the infamous jitter as the "real life" indication of speed accuracy between the PC transmitter and the DAC receiver, right?
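To put the drift point in rough numbers, a back-of-the-envelope sketch with an assumed, ordinary crystal tolerance:

```python
# Rough illustration (assumed numbers): how quickly a typical clock tolerance
# turns into sample-level misalignment between two captures.
sample_rate = 48_000   # Hz
clock_error_ppm = 50   # assumed ordinary crystal tolerance
seconds = 10

offset_samples = sample_rate * seconds * clock_error_ppm / 1e6
print(f"After {seconds} s, ~{offset_samples:.0f} samples of accumulated drift")  # ~24
```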
Here's someone who prefers 75 Ohm - and even better, 110 Ohm - over USB for his digital audio transfer, purely for the sonic results.
Does a modern DAC, among the ones you are testing here, have some sort of intelligence to compare its received data to the original?
I'm not an expert; I wonder what the impact on the final sound would be if there is any difference between the data file sent and the one received ...
Oh, so your hearing aid device can help you hear that difference too? Impressive..
Bad USB DAC or cable design could lead to audible artifacts.
This is usually caused by two issues:
- some USB DACs' output is powered from USB bus power. If the power quality is very bad and the DAC's supply is not properly filtered, the output will be noisy.
- some bad USB cables cannot transport the digital signal reliably (the signal may have a very high error rate because of the power loss through the wire; see the rough estimate after this list).
Usually these two factors are super easy to avoid --- as @amirm said, a cheap but properly implemented DAC and a normal USB cable can avoid those problems easily.
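On the power-loss point, a rough estimate may help; wire gauge, length and current draw below are all assumed values, not measurements:

```python
# Back-of-the-envelope sketch (assumed values): voltage drop across the power
# wires of a cheap USB cable feeding a bus-powered DAC.
ohm_per_m = 0.21   # assumed ~28 AWG conductor resistance, ohm per metre
length_m = 1.5     # assumed cable length
current_a = 0.4    # assumed current draw of a bus-powered DAC

round_trip_r = 2 * ohm_per_m * length_m   # VBUS out plus ground return
v_drop = current_a * round_trip_r
print(f"~{v_drop:.2f} V lost out of the 5 V supply")  # ~0.25 V
```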
maybe his DAC or computer is not rigorously implemented, so that such an impedance change can cause signal defects. In my view that's entirely possible.

That is obvious but certainly not what we're discussing here. I understood @graz_lag was speaking about sonic differences between USB and coax SPDIF.
I find it particularly interesting that he's convinced he can also hear the difference between 110 ohm and 75 ohm cable impedance.
maybe his DAC or computer is not rigorously implemented, so that such an impedance change can cause signal defects. In my view that's entirely possible.
Usually a USB cable should have ~90 ohm differential impedance. 110 or 75 ohms is not a small deviation from the spec and could change the signal in some very audible way.
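For a sense of scale, the standard mismatch formula gives the fraction of an incident edge that gets reflected; whether that matters at all over a short run is exactly the question discussed below.

```python
# Quick sketch: reflection coefficient if a nominally 90-ohm USB differential
# pair is replaced by 75- or 110-ohm cable (values taken from the posts above).
def reflection(z_cable: float, z_spec: float = 90.0) -> float:
    return (z_cable - z_spec) / (z_cable + z_spec)

for z in (75, 110):
    print(f"{z} ohm: {reflection(z) * 100:+.0f}% of the incident edge is reflected")
```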
With such short cable lengths impedance is practically irrelevant. It would take a much (and here I really mean much) longer cable for the impedance to start affecting digital transmission.
Indeed so. S-PDIF and AES-EBU (both balanced and unbalanced) are incredibly rugged, and pretty much anything will transmit the stream adequately for the receiver to lock over normal domestic distances. In studio complexes where the cable runs can be hundreds of metres, impedance does matter, but not at short distances.
I would be surprised if the impedance of that circuit was closely controlled... wouldn't it depend on where the fish were?
S.
It would depend on the frequencies involved, don't you think? A 1 m cable qualifies as a lumped component only up to about 30 MHz. USB 1.1 @ 12 Mbps isn't likely to care, USB 2.0 high speed @ 480 Mbps might, and USB 3.0 definitely will. (Found a lil' article on the challenges of high-speed USB 2.0 board layout from back in the day.)
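A small sketch of that rule of thumb; the lambda/10 criterion and the ~0.66 velocity factor are assumptions, and the fundamental is taken as half the bit rate for a worst-case 1010... pattern:

```python
# Sketch of the "electrically short" rule of thumb for a 1 m cable.
C = 3.0e8               # speed of light, m/s
velocity_factor = 0.66  # assumed for a typical cable
cable_length_m = 1.0

for label, bitrate in (("USB full speed", 12e6), ("USB high speed", 480e6)):
    f_fundamental = bitrate / 2                      # worst-case 1010... pattern
    wavelength = velocity_factor * C / f_fundamental
    lumped = cable_length_m < wavelength / 10        # lambda/10 criterion
    print(f"{label}: lambda ~ {wavelength:.2f} m -> "
          f"{'lumped' if lumped else 'transmission line'} behaviour at 1 m")
```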
Cable impedance is not necessarily related to length as resistance is.

In fact, characteristic impedance is entirely independent of length by definition. Resistance, of course, will depend upon length and conductor diameter as you would expect, and a manufacturer could cheap out in ways that would push certain devices over the edge. (Some bus-powered interfaces were notorious for doing this even with good cables, like a certain Terratec that would develop a habit of crashing regularly... was it the 6fire USB? In that case proper operation would be restored by shorting out a choke in series with the supply voltage, bypassing its resistance.)
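To make the distinction concrete, a minimal sketch with assumed (not measured) per-metre values: the characteristic impedance falls out of L and C per metre with no length term, while DC resistance scales directly with length.

```python
# Contrast of characteristic impedance vs. DC resistance for a hypothetical
# 90-ohm-class cable. All values below are assumed for illustration.
import math

L_per_m = 0.45e-6      # H/m, assumed
C_per_m = 55e-12       # F/m, assumed
rho_cu = 1.68e-8       # ohm*m, copper resistivity
area_28awg = 8.1e-8    # m^2, ~28 AWG conductor cross-section

z0 = math.sqrt(L_per_m / C_per_m)   # lossless-line approximation: no length term
print(f"Z0 ~ {z0:.0f} ohm regardless of length")

for length_m in (0.5, 2.0, 5.0):
    r = rho_cu * length_m / area_28awg   # single-conductor DC resistance
    print(f"{length_m} m: ~{r * 1000:.0f} milliohm")
```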