All of this seems a bit theoretical for now. I know there is still some bandwidth left in BT, but for audio applications it will be hard to squeeze out more bitrate and keep it reliable, especially with the antenna limitations of such a small device. If the OS doesn't fully support UAT, and the BT SoC doesn't either, a software player could potentially bypass the OS audio engine/driver, but that's sketchy.

I want to see some real-world use, not only measurements. If it stutters at one foot, we can have all the fidelity in the world but it's useless. I obviously don't know their algorithm, but to my ears even LDAC doesn't sound better than AptX HD. Theoretically it is better, but we don't know how much interpolation actually happens when samples don't arrive in time. Bluetooth will never be Wi-Fi.

Now let's wait and see. It would be unfair of me not to at least give them the benefit of the doubt, but in my opinion, chasing more fidelity for Bluetooth applications has limited appeal. That said, we have to respect manufacturers who are at least trying to innovate. Still, selling this to the big guys at Google and Apple, which in the end is the only answer to real market penetration, will be a rough path.
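To illustrate the interpolation point: when packets arrive late or not at all, the receiver has to conceal the gap somehow. Below is a minimal sketch of the crudest possible approach, linear interpolation across the missing samples. This is purely illustrative — LDAC's actual concealment strategy isn't public, and real codecs use far more sophisticated methods — but it shows why heavy concealment can eat into the fidelity the extra bitrate was supposed to buy.

```python
# Naive packet-loss concealment: fill a run of lost PCM samples by
# linearly interpolating between the last good sample before the gap
# and the first good sample after it. Illustrative only -- NOT the
# algorithm used by LDAC or any real Bluetooth codec.

def conceal_gap(samples, gap_start, gap_len):
    """Return a copy of `samples` with samples[gap_start:gap_start+gap_len]
    replaced by a straight line between the neighboring good samples."""
    before = samples[gap_start - 1]          # last sample received before the gap
    after = samples[gap_start + gap_len]     # first sample received after the gap
    filled = list(samples)
    for i in range(gap_len):
        t = (i + 1) / (gap_len + 1)          # fractional position inside the gap
        filled[gap_start + i] = before + t * (after - before)
    return filled

# Example: samples at indices 2..4 were lost (marked None).
pcm = [0.0, 0.2, None, None, None, 1.0, 0.8]
print([round(x, 3) for x in conceal_gap(pcm, 2, 3)])
```

The obvious cost is audible: any transient or high-frequency detail that fell inside the gap is replaced by a straight line, which is exactly why lost or late packets can erase the theoretical advantage of a higher-bitrate codec.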