
Beta-test: DeltaWave Null Comparison software

pkane

Master Contributor
Forum Donor
Joined
Aug 18, 2017
Messages
5,632
Likes
10,205
Location
North-East
Folks,

Only for the brave: I'd like to solicit your feedback and suggestions on DeltaWave null-testing software. It's an early preview, so please be gentle!

Details and software download are available here: https://deltaw.org

An example report produced by DeltaWave comparing two USB cables :) (warning: large file)

Some quick points:
  • Runs on Windows 64bit, requires .NET 4.6
  • The faster the CPU, the more cores, and the more memory - the better
  • Meant for analyzing differences between two audio files captured with something different in the audio path, such as a different power cable, usb cable, DAC, preamp, DDC, ADC, digital filter, power supply, etc., etc.
  • Similar in some ways to, but hopefully better than, Audio DiffMaker. This is completely my own design and implementation, started because I could never get DiffMaker to work
  • Finally, the software is completely free of charge, but the current beta version will expire in 60 days. By that time I'll have an updated version posted on the site
Highlights:
  • Reads DSD, FLAC, and WAV file formats, writes difference files as WAV
  • Plays original and difference files over WASAPI (ASIO in the works)
  • All computations are performed in 64-bit floating point format
  • Matches phase to sub-sample accuracy, removes clock drift, and matches levels (non-linear level matching is also supported, but consider it experimental)
  • Will resample to the same frequency if two files are not sampled at the same rate
  • Generates delta waveform, spectrum, spectrogram, phase difference, and cepstrum plots with zooming
  • Produces a full HTML report documenting the results of a comparison, including how close to bit-perfect the two files are
  • Supports low- and high-pass filtering of the files, as well as notch-filter to remove a specific frequency
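The headline number these comparisons produce, null depth, boils down to the RMS level of the difference signal relative to the reference. A toy NumPy sketch of the general idea (not DeltaWave's actual algorithm; the function name and the noise level are mine):

```python
import numpy as np

def null_depth_db(a, b):
    """RMS level of the difference signal relative to the reference, in dB.
    More negative = deeper null = more similar signals."""
    delta = a - b
    rms_ref = np.sqrt(np.mean(a ** 2))
    rms_delta = np.sqrt(np.mean(delta ** 2))
    if rms_delta == 0:
        return -np.inf  # identical samples: the null is total
    return 20 * np.log10(rms_delta / rms_ref)

# 1 kHz tone at 48 kHz, plus a copy with roughly -80 dB of added noise
fs = 48000
t = np.arange(fs) / fs
ref = np.sin(2 * np.pi * 1000 * t)
rng = np.random.default_rng(0)
noisy_copy = ref + 1e-4 * rng.standard_normal(fs)

print(null_depth_db(ref, ref))         # -inf: identical files null completely
print(null_depth_db(ref, noisy_copy))  # about -77 dB for this noise level
```

Of course the hard part, and most of what DeltaWave does, is aligning time, level, and clock rate *before* subtracting, since any residual misalignment dominates the difference.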
Please post feedback and suggestions in this thread, or PM me if you'd rather keep it private.
 

restorer-john

Grand Contributor
Joined
Mar 1, 2018
Messages
12,579
Likes
38,280
Location
Gold Coast, Queensland, Australia
Runs fine on my i5 4440 with 8GB (SSD+HDD). CPU usage was around 40% for the duration and it only took a minute or two, so no problems with older hardware.

So, I compared the same file to null test the program. Any idea why it says the files are not bit perfect?

1549584259096.png
 
OP
pkane

pkane

Master Contributor
Forum Donor
Joined
Aug 18, 2017
Messages
5,632
Likes
10,205
Location
North-East
Runs fine on my i5 4440 with 8GB (SSD+HDD). CPU usage was around 40% for the duration and it only took a minute or two, so no problems with older hardware.

So, I compared the same file to null test the program. Any idea why it says the files are not bit perfect?

View attachment 21532

Probably because dithering is turned on in settings.
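This is easy to demonstrate numerically: TPDF dither randomizes the low-order bits on re-quantization, so even two identical sources stop comparing bit-perfect. A toy NumPy sketch (the signal and dither scale are my assumptions, not DeltaWave's internals):

```python
import numpy as np

rng = np.random.default_rng(42)
t = np.arange(48000) / 48000
x = np.round(np.sin(2 * np.pi * 997 * t) * 20000).astype(np.int16)

# Without dither, a copy is trivially bit-identical
assert np.array_equal(x, x.copy())

# TPDF dither: sum of two uniform variables, +/-1 LSB, added before rounding
tpdf = rng.uniform(-0.5, 0.5, x.size) + rng.uniform(-0.5, 0.5, x.size)
dithered = np.round(x.astype(np.float64) + tpdf).astype(np.int16)

changed = np.count_nonzero(dithered != x)
print(f"{changed} of {x.size} samples differ")  # roughly a quarter of them
```

The changed samples differ by only 1 LSB, so the files still null extremely deeply, they just aren't bit-identical any more.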
 

restorer-john

Grand Contributor
Joined
Mar 1, 2018
Messages
12,579
Likes
38,280
Location
Gold Coast, Queensland, Australia
Where is the option to output the HTML report, or does it automatically generate it and place it somewhere?

When it installs, it creates the program folder Pkaudio, with Delta Wave in it. I went looking for Delta Wave under D and figured it hadn't installed properly. There's no option to add a desktop icon or shortcut during the install process, which might confuse some people.

Anyway, it's on my taskbar now where it belongs :)

1549585786300.png


It runs really fast now, those files were 4.6 minutes long BTW with no issues.
 
OP
pkane

pkane

Master Contributor
Forum Donor
Joined
Aug 18, 2017
Messages
5,632
Likes
10,205
Location
North-East
Where is the option to output the HTML report, or does it automatically generate it and place it somewhere?

When it installs, it creates the program folder Pkaudio, with Delta Wave in it. I went looking for Delta Wave under D and figured it hadn't installed properly. There's no option to add a desktop icon or shortcut during the install process, which might confuse some people.

Anyway, it's on my taskbar where it belongs :)

View attachment 21537

It runs really fast now, those files were 4.6 minutes long BTW with no issues.

Look under File menu (top left). Generate a report after performing a match. You tell it where to write it.

Report cruncher will generate multiple reports, performing a match and writing a report for each pair of files selected.
 

restorer-john

Grand Contributor
Joined
Mar 1, 2018
Messages
12,579
Likes
38,280
Location
Gold Coast, Queensland, Australia
My testing will mostly be at 16/44 or 16/48, so processing times will clearly be much shorter for me than for the guys with 24/96 or above.

So far your software seems rock solid and a whole lot of fun. I'll run some CD transport comparisons over the weekend and some optical/coaxial comparisons using the 24/192 soundcards either via SPDIF straight in or via the A/Ds.
 

Guermantes

Senior Member
Joined
Feb 19, 2018
Messages
484
Likes
561
Location
Brisbane, Australia
I've been playing with this today, too. Great work, Paul!

The HTML report is loading fine in Firefox but hanging in Chrome. The final spectrogram plot seems to be giving Internet Explorer 11 some problems, too.

So far, I haven't been able to get any playback via WASAPI to my RME Fireface but this may require some troubleshooting outside DeltaWave . . .
 
OP
pkane

pkane

Master Contributor
Forum Donor
Joined
Aug 18, 2017
Messages
5,632
Likes
10,205
Location
North-East
I've been playing with this today, too. Great work, Paul!

The HTML report is loading fine in Firefox but hanging in Chrome. The final spectrogram plot seems to be giving Internet Explorer 11 some problems, too.

So far, I haven't been able to get any playback via WASAPI to my RME Fireface but this may require some troubleshooting outside DeltaWave . . .

I did see the spectrogram plot giving IE problems, but it loaded fine in Chrome and Firefox for me. It's a large image that doesn't compress well, so I assume it has to do with image size. I use a default size for all reports; maybe I'll add a setting to reduce it if the report turns out to be too large.

Let me know if you still have issues with WASAPI. ASIO drivers will not work correctly yet.
 

Guermantes

Senior Member
Joined
Feb 19, 2018
Messages
484
Likes
561
Location
Brisbane, Australia
Ok, WASAPI issues were resolved in Windows Control Panel -- working now.
 
OP
pkane

pkane

Master Contributor
Forum Donor
Joined
Aug 18, 2017
Messages
5,632
Likes
10,205
Location
North-East
Ok, WASAPI issues were resolved in Windows Control Panel -- working now.

By the way, I updated the version on the site to v1.0.2 with a small change that should help with very large files. Not that you should ever want to process files much larger than about 5-10 minutes, but now you can, assuming you have enough memory :)

The previous version was limited to about 2 GB of file size when loaded in memory, regardless of how much memory was available. After this update, I was able to load and process an 80-minute 24/48kHz FLAC file.

To try to keep things reasonable, DeltaWave will now warn if the file duration exceeds 10 minutes. It will let you continue if you really insist, but expect out of memory errors and slow performance if you do.
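The arithmetic behind that warning is easy to check: once decoded to 64-bit floats for processing, even a modest file gets big. A back-of-the-envelope sketch (the helper name is mine; 2 GB was the old in-memory cap mentioned above):

```python
# In-memory size of audio decoded to 64-bit floats (8 bytes per sample)
def decoded_size_bytes(minutes, sample_rate, channels, bytes_per_sample=8):
    return int(minutes * 60 * sample_rate * channels * bytes_per_sample)

# The 80-minute 24/48kHz stereo FLAC from the post, decoded to float64:
size = decoded_size_bytes(80, 48000, 2)
print(size, "bytes =", round(size / 2**30, 2), "GiB")  # about 3.4 GiB, past the old ~2 GB cap

# A 10-minute stereo capture at 96 kHz stays far more manageable:
print(decoded_size_bytes(10, 96000, 2) / 2**20, "MiB")
```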
 

Blumlein 88

Grand Contributor
Forum Donor
Joined
Feb 23, 2016
Messages
20,523
Likes
37,054
Alright, I have used the new one. In the settings, some of the boxes for each setting cover up part of the box label. I don't think it was like that before.

Thank you for the larger font option. :)

I'm only given an FFT option up to 512K size. I thought the previous one (which I uninstalled) went up to 1 million. Am I misremembering?

This version is somewhat slower to match with the same settings on my laptop. Perhaps it's doing more, or doing a better job. Using the same comparison files (4 minutes long), it gets deeper nulls.

Prior results were a 70 dB difference and an 84 dB correlated null depth, while now it is 88 and 89 dB. The difference spectrogram and FFT charts look more like I'd expect. I was comparing the same file captured and then captured again; the leftovers were fairly flat random noise. The prior version didn't seem to show that as the real difference the way I expected.

Comparing the capture to the digital original resulted in mid-40 dB results, which seem right and in line with the prior results.

Now the timing results are different. Offset is measuring the same as before. The clock rate has been reported as .00000001 ppm on everything. Previously I was getting around .004 ppm, which seems more believable, but I'm not sure it is. In any case, it is different.

I'll play around with it more, but from what I'm seeing, I think it is providing more accurate results.

Like Guermantes, I found it was taking much longer to do the matching. But it defaulted to a 500K FFT, while the previous version was set to 128K. After changing the setting, it is closer to before, though still somewhat slower, I think.
 

Blumlein 88

Grand Contributor
Forum Donor
Joined
Feb 23, 2016
Messages
20,523
Likes
37,054
I tried the following: took a file, slowed it down by 10 ppm, then compared it to itself.
The difference is in the mid-50 dB range, and the null depth in the low-60 dB range. It shows a clock drift of .000273 ppm. Is this an estimate of how much drift was left uncorrected? Or is the decimal placement possibly wrong on the readout?

I did the same thing with one file 100 ppm faster. The difference was 44 dB and the null depth 47 dB. Clock drift was .00273 ppm.

The difference spectrum looks the same as when I got a null depth of 89 dB, only at some 45 dB higher level. With a speed offset, I would have expected the difference spectrum to take on a 6 dB/octave tilt.

So this looks sensitive to speed differences in the clock. My earlier comparisons were much better, but the clock speeds in those were probably almost exactly the same.
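This experiment is easy to reproduce: a fixed clock-speed error can be simulated by resampling on a stretched time axis, and without drift correction the null collapses quickly as the ppm error grows. A toy sketch using linear interpolation on a pure tone (my own simplification; DeltaWave's drift estimator and correction work differently, which is why its numbers are far better than these):

```python
import numpy as np

fs = 48000
t = np.arange(fs * 4) / fs               # 4 seconds of 1 kHz tone
x = np.sin(2 * np.pi * 1000 * t)

def with_clock_error(sig, ppm):
    """Resample sig as if its clock ran `ppm` parts-per-million fast."""
    n = np.arange(sig.size)
    return np.interp(n * (1 + ppm * 1e-6), n, sig)

def null_depth_db(a, b):
    return 20 * np.log10(np.sqrt(np.mean((a - b) ** 2)) /
                         np.sqrt(np.mean(a ** 2)))

# Even small uncorrected drift accumulates phase error over the file,
# so the uncorrected null gets shallower as ppm grows
for ppm in (10, 100):
    print(ppm, "ppm ->", round(null_depth_db(x, with_clock_error(x, ppm)), 1), "dB")
```

The accumulated time offset at the end of a 4-second file is only 40 µs at 10 ppm, yet that is already a quarter radian of phase at 1 kHz, which is why drift removal has to come before subtraction.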
 

Blumlein 88

Grand Contributor
Forum Donor
Joined
Feb 23, 2016
Messages
20,523
Likes
37,054
Adding to my last post: when I listened to the delta file for the 10 ppm slower comparison, it had the expected treble-heavy balance of a time mismatch, but that cycled over a 3- or 4-second interval, as if it were getting synced in time and then drifting off again. Every few seconds the sound briefly became more balanced and lower in level before going the other way.

When I listened to the difference file for two consecutive recordings, which should have very little if any clock drift, it was similar, except the cycle between seeming to sync up and drifting off was much longer, around 15 seconds instead of 3 or 4. Is this what the rest of you are hearing? BTW, this latter example was a near-90 dB null, and it took lots of gain to hear anything at all.
 
OP
pkane

pkane

Master Contributor
Forum Donor
Joined
Aug 18, 2017
Messages
5,632
Likes
10,205
Location
North-East
Alright, I have used the new one. In the settings, some of the boxes for each setting cover up part of the box label. I don't think it was like that before.

Thank you for the larger font option. :)

I'm only given an FFT option up to 512K size. I thought the previous one (which I uninstalled) went up to 1 million. Am I misremembering?

This version is somewhat slower to match with the same settings on my laptop. Perhaps it's doing more, or doing a better job. Using the same comparison files (4 minutes long), it gets deeper nulls.

Prior results were a 70 dB difference and an 84 dB correlated null depth, while now it is 88 and 89 dB. The difference spectrogram and FFT charts look more like I'd expect. I was comparing the same file captured and then captured again; the leftovers were fairly flat random noise. The prior version didn't seem to show that as the real difference the way I expected.

Comparing the capture to the digital original resulted in mid-40 dB results, which seem right and in line with the prior results.

Now the timing results are different. Offset is measuring the same as before. The clock rate has been reported as .00000001 ppm on everything. Previously I was getting around .004 ppm, which seems more believable, but I'm not sure it is. In any case, it is different.

I'll play around with it more, but from what I'm seeing, I think it is providing more accurate results.

Like Guermantes, I found it was taking much longer to do the matching. But it defaulted to a 500K FFT, while the previous version was set to 128K. After changing the setting, it is closer to before, though still somewhat slower, I think.

Hi Dennis,

Change the precision in settings from 18 to 8 to speed things up. That just tells the algorithms how many decimal places to which to keep improving the result; with 8, they should stop sooner.

As to the other issues, I think I know what’s happening. In the latest version I added a non-linear (non-constant) clock drift calculation and correction. I bet that’s being thrown off by some noise in the recording. Let me post a version that allows it to be bypassed, and we’ll see if the issue goes away.

Thanks for testing!
 

KSTR

Major Contributor
Joined
Sep 6, 2018
Messages
2,690
Likes
6,013
Location
Berlin, Germany
Very nice, Paul, thanks.
I'm currently on Linux 32-bit and can't test your SW right now (will do later on Windoze), but I can explain my potential use case for it.

My inputs would always be time-aligned, sample-synced, record-while-playback takes (a second or so) that are already time-averaged over 1,000 to 10,000 runs (which may easily take an hour or so) to get rid of most of the random noise and any non-correlated stuff like mains hum, clicks and pops, etc. The test signal usually is a log sweep over the full audio range. These condensed data blocks (64-bit doubles, plain binary arrays, no .WAV) can be assumed to be close to identical, with high signal-to-noise ratio and energy at all frequencies. And there is no need to worry about clock drifts etc.; no preprocessing required.
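The payoff of that kind of sample-synced averaging is the classic square-root law: the correlated signal survives intact while uncorrelated noise drops by 10·log10(N) dB. A quick numerical sanity check (a toy sketch with a plain tone and white noise, not KSTR's actual rig):

```python
import numpy as np

rng = np.random.default_rng(1)
fs, n_runs = 48000, 1000
t = np.arange(fs) / fs
signal = np.sin(2 * np.pi * 1000 * t)   # stand-in for the repeated test signal

# Each take = same signal + fresh uncorrelated noise; average them sample-synced
noise_rms = 0.1
avg = np.zeros(fs)
for _ in range(n_runs):
    avg += signal + noise_rms * rng.standard_normal(fs)
avg /= n_runs

# Noise left in the average vs. noise in a single take
residual_rms = np.sqrt(np.mean((avg - signal) ** 2))
gain_db = 20 * np.log10(noise_rms / residual_rms)
print(round(gain_db), "dB")  # close to the theoretical 10*log10(1000) = 30 dB
```

The catch, as noted below, is that the averaged takes must stay sample-synced for the whole run, which is exactly why clock and reference drift become the limiting factor.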

This should be ideal for extracting the linear transfer function A-->B that A is processed with to obtain A'. The difference A' - B should then give the "true" residual, without the linear differences dominating. The software does some of this already: you show the A-->B transfer function in the form of a complex spectrum, so the data is right there, correct?
While not unimportant, the linear differences usually are trivial and in many cases *not* what we are mainly looking for; we want to look at distortion, feedthrough, and similar patterns that are there in spite of the dominating linear differences.
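The "remove the linear difference, keep the rest" idea can be sketched in the frequency domain: estimate H = B/A from calibration takes (here via the standard H1 cross-spectrum estimator), apply H to A, then subtract. A bare-bones NumPy sketch with a made-up toy device (noise-free, circular, sample-synced data; real measurements need windowing, segmenting, and regularization of H):

```python
import numpy as np
from numpy.fft import rfft, irfft

rng = np.random.default_rng(7)
n = 8192
K = np.zeros(n)
K[0:3] = [1.0, 0.05, -0.02]          # toy DUT impulse response (near-identity FIR)
KF = rfft(K)

def through_dut(x):
    """Toy device B: mild linear coloration (circular FIR) + tiny 2nd-order distortion."""
    return irfft(rfft(x) * KF, n) + 1e-4 * x**2

# Several sample-synced calibration takes -> averaged H1 transfer-function estimate
num = np.zeros(n // 2 + 1, dtype=complex)
den = np.zeros(n // 2 + 1)
for _ in range(16):
    a = rng.standard_normal(n)
    A, B = rfft(a), rfft(through_dut(a))
    num += np.conj(A) * B
    den += np.abs(A) ** 2
H = num / den

def rms_db(x, ref):
    return 20 * np.log10(np.sqrt(np.mean(x**2)) / np.sqrt(np.mean(ref**2)))

# Fresh measurement: raw null vs. null after factoring out the linear difference
a2 = rng.standard_normal(n)
b2 = through_dut(a2)
raw = rms_db(b2 - a2, a2)                                  # dominated by the FIR coloration
linear_removed = rms_db(b2 - irfft(rfft(a2) * H, n), a2)   # dominated by the distortion
print(round(raw), round(linear_removed))
```

With the linear part factored out, the residual drops from the level of the coloration (tens of dB) to the level of the deliberately injected nonlinearity, which is the quantity of interest here.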

In a cable A vs. B measurement, a shifting low-pass pole (changed capacitance), for example, easily introduces enough phase shift to spoil the null at high frequencies. With the heavy time-domain averaging I do, I actually found that clock and reference drift in my RME ADI-2 Pro FS is enough to produce the smallest level differences and phase shifts at the frequency extremes (when the clock rises, the analog poles effectively lower, just like a real component-value shift would, and this spoils the null). While I managed to get rid of the reference and clock drift effect by interleaving the source data fed to the averagers, this is only useful for a certain limited class of tests where another in-sync test signal can be used to change something in the main path. Microphonic effects, electrical feedthrough, shield-current voltage drop, RF demodulation and such I can bring out very cleanly with this technique. But it is not useful for a hardware DUT A vs. B test (say, cables) unless I get to finish a block-synced, high-quality switching unit that does an A/B switch after every N blocks, so I can still keep most of the benefit of interleaving.

For these use cases, a software solution that automatically generates the linear transform and applies it before subtraction would be really nice. Taking this to its conclusion, the SW could also generate the log sweep (WAV output) and use it later to obtain the complex frequency response / IR of the signal chain under test with great accuracy and robustness (since the correction transfer function is a quotient, we need high S/N and energy everywhere for a stable result).

A truly robust automatic or semi-automatic "linear difference remover" for data with known properties (to avoid overprocessing) would be a breakthrough (like so many people, I liked the original DiffMaker idea back then but never got the program to work). At the moment I have to do this all by hand (manual trimming, plus the "Acourate" transfer-function toolbox by AudioVero to calculate the correction function), which is *very* tedious...

Does this make any sense? ;-)
 
OP
pkane

pkane

Master Contributor
Forum Donor
Joined
Aug 18, 2017
Messages
5,632
Likes
10,205
Location
North-East
Hi KSTR,

This does make sense. A lot of what you are describing is what I wanted to accomplish with this software. DeltaWave is a work in progress. For me, it's a learning process, but also a lot of fun to play with :) Any suggestions on enhancements, improvements, or new features are very welcome.

DeltaWave currently applies a combination of linear and non-linear transformations to both clock drift and level mismatches, although any of these can be turned off in settings.

For example, I wanted to be able to get a better null between a digital recording and the same recording played back through a DAC and captured by an ADC. The second recording has two or three different filters applied to it, plus added noise, harmonics, IMD, added jitter, and clock drift. Doing a non-linear match on such dirty data is a bit scary!

You can see the difference at higher frequencies. The first plot is after linear clock drift and linear gain adjustments were already made! Blue is the original digital data, pink is the DAC/ADC-recorded copy:
1549660164579.png



And here is the same comparison, but now with a 5th-order polynomial used to remove the non-linear gain difference between the two. The polynomial appears to be helping here, but not perfectly. There may be a need for a different non-linear approximation that fits the effect of ADC and DAC filters better. Maybe using Bessel functions, or Butterworth, or Chebyshev? Don't know, didn't get that far yet :)

1549660286820.png
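The polynomial level-matching step can be sketched as an ordinary least-squares fit of captured amplitude against reference amplitude, sample by sample. A toy version using numpy.polyfit (the test signal, gain, and distortion coefficients are invented; DeltaWave's actual fitting procedure may differ):

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(48000) / 48000
ref = np.sin(2 * np.pi * 997 * t) * np.linspace(0.1, 1.0, t.size)  # tone with varying level

# Fake capture: gain of 0.9 plus a mild compressive (cubic) nonlinearity and a little noise
cap = 0.9 * ref - 0.02 * ref**3 + 1e-5 * rng.standard_normal(t.size)

def null_db(a, b):
    return 20 * np.log10(np.sqrt(np.mean((a - b) ** 2)) / np.sqrt(np.mean(a ** 2)))

# Linear level match only: best single gain g with cap ~ g * ref
g = np.dot(cap, ref) / np.dot(ref, ref)
linear_null = null_db(ref * g, cap)

# 5th-order polynomial match: cap ~ p(ref), which can absorb the cubic term
p = np.polyfit(ref, cap, 5)
poly_null = null_db(np.polyval(p, ref), cap)

print(round(linear_null), round(poly_null))
```

When the level error really is a smooth function of amplitude, the polynomial null is dramatically deeper than the plain gain match; when it isn't (e.g. frequency-dependent filter effects, as in the plots above), the polynomial can only help partially.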


Deriving an accurate IR from a 'natural' recording will be a bit harder. I think the errors in the process could become the bottleneck: noise, ADC non-linearities, random and correlated clock modulation, computation errors, and so on could cause effects that are hard to remove. But it is certainly worth considering. Averaging multiple results may help to improve accuracy.

What would you want to do with this computed IR? Correct for system errors, or something else?
 

KSTR

Major Contributor
Joined
Sep 6, 2018
Messages
2,690
Likes
6,013
Location
Berlin, Germany
Yes, correction of the linear part of the total system error, actually of the A-to-B difference. Only small errors, in the 1%-max regime.
Think of applications like capacitor testing: two caps of the same nominal value but different construction, say, input coupling capacitors.
This will introduce an (additional) high-pass in the loopback chain, but the corner frequencies will slightly differ (even after selection and/or analog fine trim), and the resulting phase shift spoils coherence at low frequencies, decreasing null depth. The influence of the linear differences could easily be brought down an order of magnitude if most of it were factored out by applying a well-estimated or measured transfer function. For the capacitor example, we know that the correction function would have the shape of an analog (min-phase) shelving filter. In general, any slightly moving pole or zero in the analog transfer function will result in a shelving correction filter (with a fraction of a dB of level change).
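The size of that correction is easy to estimate numerically: two first-order high-pass corners a few percent apart produce only a fraction-of-a-dB shelf, but a clear low-frequency phase difference. A quick check of the analog transfer functions (the corner frequencies are made-up example values):

```python
import numpy as np

def highpass(f, fc):
    """Complex response of a 1st-order RC high-pass with corner frequency fc (Hz)."""
    s = 1j * f / fc
    return s / (1 + s)

f = np.geomspace(10, 20000, 500)
h_a = highpass(f, 2.0)    # coupling cap A: 2.0 Hz corner
h_b = highpass(f, 2.1)    # coupling cap B: corner 5% higher (tolerance mismatch)

correction = h_b / h_a    # the shelving correction A would need to match B
mag_db = 20 * np.log10(np.abs(correction))
phase_deg = np.degrees(np.angle(correction))

# A shelf: a small LF level dip and phase lead, flat again well above the corners
print(round(mag_db[0], 3), "dB,", round(phase_deg[0], 2), "deg at 10 Hz;",
      round(mag_db[-1], 5), "dB at 20 kHz")
```

Even these sub-0.1 dB, sub-degree differences are enough to cap the achievable null depth at the frequency extremes, which is why factoring the correction out first matters.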

I'm aware of the problems of processing noise. A way to mitigate this is curve-fitting an analytical transfer function to the reasonably smoothed empirical one. If we know the correction function is min-phase, then this can be a bunch of simple IIR filters, which can be applied directly (or used to obtain a "clean" convolution kernel by sampling a Dirac response).

I feel I need to try your program first before making any further comments ;-)
 