Geez, the more I learn about Dirac the more I think they use marketing buzzwords to deceive people into thinking they have something new. I mean, DLBC sounds like it works just like MSO (Dirac's "genetic algorithm" is another word for "brute force optimisation", isn't it? @andyc56 ?)...
MSO has been described many times as using a "brute force optimization" approach, but that is not true. There is a specific meaning to that term. It describes a technique where you're trying to optimize a small number of parameters, each of which can only take on a discrete set of values. It would be implemented as the equivalent of a nested loop in software, where the depth of nesting of the loop is the number of parameters to be evaluated. The error to be minimized is evaluated by exhaustively testing every possible discrete value of every parameter and choosing the best combination.
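To make the nested-loop picture concrete, here's a minimal sketch of brute-force optimization over a made-up two-parameter problem. The parameter names, discrete value sets, and error function are all hypothetical, chosen only to illustrate the exhaustive-search structure; `itertools.product` is just a compact way of writing the nested loops.

```python
import itertools

# Hypothetical toy problem: pick a per-sub gain and delay, each from a small
# discrete set, to minimize a stand-in error function. This is the nested-loop
# exhaustive search described above, written with itertools.product.

gains_db = [-3.0, -1.5, 0.0]       # discrete allowable gain values
delays_ms = [0.0, 2.5, 5.0, 7.5]   # discrete allowable delay values

def error(gain_db, delay_ms):
    # Stand-in for a real metric such as seat-to-seat variation.
    return (gain_db + 1.5) ** 2 + (delay_ms - 5.0) ** 2

# Evaluate every combination of every parameter and keep the best one.
best = min(itertools.product(gains_db, delays_ms), key=lambda p: error(*p))
print(best)  # -> (-1.5, 5.0)
```

With only 3 × 4 = 12 combinations this is trivial, which is exactly why brute force only works when the parameter count is small and the value sets are discrete.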
The use of brute-force optimization for multiple subwoofers is discussed by Todd Welti in his 2003 AES article "In-Room Low Frequency Optimization", the predecessor of his more familiar 2006 article "Low-Frequency Optimization Using Multiple Subwoofers". The latter article describes SFM. SFM is the only multiple-sub optimization software I know of that uses brute-force optimization. It can do this because it uses only a single PEQ per sub, with discrete allowable Q and cut values, plus per-sub attenuators and delays, each of which can take on only a discrete set of values. I'm pretty sure it finds the PEQ center frequency using a conventional search for the frequency at which the seat-to-seat variation, obtained from measurements, is worst. Once it has that information, the remaining parameters (Q and cut of the PEQ, per-sub attenuation and per-sub delay) can take on only discrete values. Brute-force optimization is then applied to all parameters except the PEQ center frequency, which at that point has already been determined.
An optimization approach whose parameters can take on only a discrete set of values is called combinatorial optimization. Brute-force optimization refers to the subset of combinatorial optimization for which all possible values of all parameters can be exhaustively evaluated.
What about MSO? A fully stuffed miniDSP 2x4 HD, using a PEQ in every open slot, has around 250 parameters to adjust, all of which interact with one another, and each of which takes its value from a continuous range of allowable values. Combinatorial optimization can't be used, because the allowable parameter values don't come from a discrete set, and since the ranges are continuous, it's also impossible to exhaustively evaluate all possible parameter values. So brute-force optimization can't be used for this problem.
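Even setting the continuity issue aside, the parameter count alone rules out exhaustive evaluation. The discretization below is purely hypothetical (MSO doesn't discretize), just to show the scale:

```python
# Suppose each of the ~250 continuous parameters were crudely discretized to
# just 10 allowable values (a hypothetical simplification, not what MSO does).
n_params = 250
levels_per_param = 10

# Exhaustive evaluation would require one error evaluation per combination:
combinations = levels_per_param ** n_params  # 10**250

# That's a 251-digit number of evaluations -- hopeless at any hardware speed.
print(len(str(combinations)))  # -> 251
```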
MSO uses a variant of an open-source technique called Differential Evolution. It's described in this book; you'll find my review of the book on that page, written back when I was first starting out with MSO in 2013. MSO adds some enhancements to the algorithm, described in the Acknowledgements section of the MSO home page. Differential Evolution is a so-called population-based algorithm, meaning that instead of a single "current guess" that gets moved around, there is a collection of "current guesses" (in my case 100) that wander around the solution space like ants searching for the best food. When a lucky ant finds the best food so far, his buddies wander over there - but they may find better food on the way. As an algorithm, this can be described as new guesses being generated (ants moving around) based on the family of existing guesses (ant locations).
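The ant analogy above maps onto the classic DE/rand/1/bin form of Differential Evolution, sketched below on a toy two-parameter problem. This is the textbook algorithm, not MSO's enhanced variant, and the objective function and population/iteration counts are illustrative only:

```python
import random

def differential_evolution(f, bounds, pop_size=100, F=0.5, CR=0.9, gens=200):
    """Textbook DE/rand/1/bin sketch (not MSO's enhanced variant)."""
    dim = len(bounds)
    # The "family of current guesses": pop_size random points in the box.
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(gens):
        for i in range(pop_size):
            # New guesses are built from differences between existing guesses.
            a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
            j_rand = random.randrange(dim)  # ensure at least one mutated gene
            trial = [a[k] + F * (b[k] - c[k])
                     if (random.random() < CR or k == j_rand) else pop[i][k]
                     for k in range(dim)]
            # Clip back into the allowable continuous ranges.
            trial = [min(max(t, lo), hi) for t, (lo, hi) in zip(trial, bounds)]
            # Greedy selection: a guess only moves if the new spot is better.
            if f(trial) <= f(pop[i]):
                pop[i] = trial
    return min(pop, key=f)

# Toy objective: a smooth bowl with its minimum at (1, -2).
best = differential_evolution(lambda x: (x[0] - 1) ** 2 + (x[1] + 2) ** 2,
                              [(-10, 10), (-10, 10)])
print(best)  # converges near [1.0, -2.0]
```

Note that every parameter here is a float from a continuous range, which is exactly why DE fits the MSO problem where combinatorial methods don't.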
Genetic algorithms are also population-based and very similar to this, except that the guesses can take on only discrete values, and the method for generating new guesses from the family of existing ones is different. Each parameter within a guess vector takes on discrete values y according to this formula:

y = m * i + b

Here, i is an integer, and m and b are floating-point numbers. There's a pretty good article about the approach here.
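A small sketch of that decoding step, with hypothetical m and b values chosen to represent a PEQ cut grid (not taken from any actual GA implementation):

```python
def decode(i, m, b):
    """Map an integer gene i to a discrete real parameter value y = m*i + b."""
    return m * i + b

# Hypothetical example: a PEQ cut from 0 dB down to -12 dB in 0.5 dB steps,
# so m = -0.5 and b = 0.0, with the gene i ranging over 0..24.
levels = [decode(i, m=-0.5, b=0.0) for i in range(25)]
print(levels[0], levels[-1])  # -> 0.0 -12.0
```

The GA then searches over the integers i, so every candidate guess lands exactly on this discrete grid, which is what makes the method inherently combinatorial.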
Here is a diagram showing a taxonomy of optimization techniques.
Edit: As if this post weren't already long enough, I should add that genetic algorithms should be considered a sibling of Differential Evolution in the diagram above (both are population-based), but genetic algorithms descend from the combinatorial branch on the right side, whereas DE descends from the continuous branch. DE can be made combinatorial, but it takes a shoehorn to do so, and convergence problems can result.