
MQA Deep Dive - I published music on tidal to test MQA

Status
Not open for further replies.

Thomas savage

Grand Contributor
The Watchman
Forum Donor
Joined
Feb 24, 2016
Messages
10,260
Likes
16,298
Location
uk, taunton
Outstanding.

I'll save you some. Maybe try it with some Fava beans and a nice Chianti.
Well that's the family Christmas presents sorted too, who doesn't love mother's pickled onions..


Oh, add garlic and some ghost chilli, you're going to get haunted so you might as well go with the chillies too. It's humorous, who doesn't like a laugh over the pickled remains of mother..

BTW pickled mom is far less deviant than rollmops! That pickled fish shit the Scandinavian folks think is normal..

Just sayin', not everything Scando is hyper evolved.


And yes I'm back from the pub now .... .. .. .
 

amirm

Founder/Admin
Staff Member
CFO (Chief Fun Officer)
Joined
Feb 13, 2016
Messages
44,374
Likes
234,475
Location
Seattle Area
@amirm I do not think MQA is claiming to be “perceptually lossless”. Their position is “MQA coding is lossless [in a traditional, mathematical sense] as far as the source material stays within the [spectral] boundaries of the MQA design (‘the triangle’)”.
I just explained how it is NOT perceptually lossless, so I am not sure why you are saying that to me. MQA's approach of preserving all the music can be called lossless since no music was left behind. It would be a lay definition of lossless in some sense: "give me all the music." Compare that to 16-bit/44.1 kHz CD, which cannot say that.

The audiophile community was the one that assumed their definition was mathematically lossless, even though MQA did not use that term ("mathematically"). The community had good reason to think so, but hopefully, as a result of this thread and Bob's new explanation, everyone should now know there is a distinction. The old assumption could never have been true, and I had said that many times.

I think MQA made a strategic mistake by never engaging the audiophile community on these forums. Had they done so, they could have provided the explanations I gave in this thread a long time ago, obviating the need for the video the OP made and the tons of arguments back and forth. Instead they only went the route of one-shot interviews with the press and such, with no allowance for the other side to ask tough questions.
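The "mathematically lossless" versus "perceptually lossless" distinction being argued here has a crisp operational test. A minimal sketch of the definition follows; this is an illustration, not MQA's actual codec. zlib stands in for any mathematically lossless coder (FLAC and ALAC behave the same way on PCM), and zeroing low bits stands in for an irreversible, at-best "perceptually lossless" step.

```python
# Illustration of the distinction, not MQA's actual codec.
import zlib

pcm = bytes(range(256)) * 4  # stand-in for raw PCM sample bytes

# Mathematically lossless: decompressing the compressed stream
# recovers every bit of the original.
assert zlib.decompress(zlib.compress(pcm)) == pcm

# Lossy: once the low bit of each byte is discarded, no decoder can
# restore it; the information is simply gone.
lossy = bytes(b & 0xFE for b in pcm)
assert lossy != pcm
```

The round-trip equality check is the whole definition: if any bit differs after decode, the scheme is not lossless in the mathematical sense, whatever it sounds like.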
 

Hai-Fri. Audio

Member
Forum Donor
Joined
May 28, 2021
Messages
32
Likes
53
@amirm

Thank you, prof. The concise distillation of information in your last few posts has been awesome for me. I feel like I'm in school again, but this time I'm actually studying the stuff I like. haha! :D

I'm going to put my pitchfork away now, and take out my tiny violin to play for Tidal and MQL... seems only a matter of time
 

earlevel

Addicted to Fun and Learning
Joined
Nov 18, 2020
Messages
545
Likes
776
Something you didn't mention but I value is elegance in efficient coding of music. I have always considered the PCM format to be highly wasteful. To someone who has spent decades optimizing technology, it seems like such a poor solution. Going from 44.1 kHz to 88.2 kHz doubles the data rate, yet there is hardly any musical information to be gained from that doubling. In that regard, MQA's approach of noticing the statistical properties of music and encoding around them is appealing to me. It is simply neat!

But is the improvement worth it? Not a rhetorical question, I'd like your opinion and explanation—I think I have an idea of how much savings, but I've been told zero savings and that doesn't seem to be right either (btw Audio DSP is my expertise, and I work in big storage, including deduplication and compression).

That is... packages from Amazon could be optimized for significant space savings. Under certain circumstances it might be worth it, but in general it isn't: as long as the packages going out for delivery fit in the truck, it doesn't matter if they could be occupying 60% of the space.

MQA is being used for streaming. Through pipes that accommodate video. How many people are really in a hypothetical sweet spot where they have the audio environment and equipment to hear the difference between CD and 24/96, yet they need just a bit more stream compression to be able to realize it? Bob Stuart gave an example of a guy in Japan with expensive data rates, presumably needing hi-res music on his phone—seems he could have come up with a better explanation if the compression aspect was truly important to people.

Just saying, I doubt the value of the compression aspect. On our own computers, we typically don't bother to compress everything, and there is much greater redundancy (though for enterprise-wide storage, we often do rely on compression, where deduplication can routinely reduce storage by 95%).

Again, I don't want this to sound like rhetoric. I'm hoping you can either tell me I have a valid point of view, or that I'm missing something important. My main view, which I've posted before, is that the compression aspect for streaming is not compelling, and if I accept that their process improves the sound over streaming hi-res, why don't they just build that conversion into the players and not be dependent on infiltrating the entire chain? (If encoding time is far higher than decoding time, that would be a possible reason, but I haven't seen that argument presented.)

PS: You answered some of my questions, at least qualitatively, in later posts, some after this one. On Tidal saving space, OK, but how is this an advantage over ALAC, which is far simpler? (That is, ALAC or similar to save space, which is trivially decompressible for normal streaming, with no change on the user end.)
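The bandwidth arithmetic behind this question is easy to make concrete. A minimal sketch follows; the 55% lossless-compression ratio is a rough assumption for typical music (FLAC/ALAC commonly land in the 50-60% range on real material), not a measured figure.

```python
# Back-of-envelope stereo PCM bitrates, to make the "doubling" concrete.
def pcm_kbps(rate_hz, bits, channels=2):
    """Raw PCM bitrate in kilobits per second."""
    return rate_hz * bits * channels / 1000

cd = pcm_kbps(44_100, 16)     # 1411.2 kbps (CD)
hires = pcm_kbps(96_000, 24)  # 4608.0 kbps (24/96)

# Assumed 55% ratio for a generic lossless coder on typical music.
alac_estimate = hires * 0.55  # roughly 2534 kbps for packed 24/96
```

Even under that assumption, losslessly packed 24/96 fits comfortably inside pipes that carry video, which is the crux of the "is the extra compression worth it?" question.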
 
Last edited:

Hai-Fri. Audio

Member
Forum Donor
Joined
May 28, 2021
Messages
32
Likes
53
We are lucky we are not in Norway. :)

“On this basis we ask the institutions to find ways to stop the unauthorised use of the title professor,” the ministry said.

"The ministry is of the opinion that the fine has to be of such size that it will have a preventive effect for the largest institutions, and the size of the fine will be decided in each case," the ministry proposes.

I don't think anyone's ever thought of me as an institution... absolutely chuffed!! :D
 

amirm

Founder/Admin
Staff Member
CFO (Chief Fun Officer)
Joined
Feb 13, 2016
Messages
44,374
Likes
234,475
Location
Seattle Area
You appear to be confused. What you describe is the excuse for MQA. It's purpose is to enrich Bob Stuart.
Looks like you are the one who is confused. You think the world of technology should be in the charity business. Smartphones would not be here if it were not for enriching the employees and stockholders of Apple and Google. Steve Jobs didn't set out to save humanity, you know. I don't see Elon Musk trying to save the hungry in Africa either. Netscape popularized the browser and made Marc Andreessen rich.

So let's dispense with putting down profitability as a reason something is bad. It is a sign of being smart to build technology that makes money.

This place would not exist if I hadn't made a couple of dollars out of my career of building technology. Maybe you live off donations from others, but in my book there is nothing whatsoever bad about making technology, selling it, and getting paid for it.

As it is, Bob Stuart could have made far, far more money getting a job at Apple, Amazon, etc. and getting rich from those stock options. With his qualifications, he could have almost any job he wants. Instead, he has chosen to be an entrepreneur in a super high-risk venture. Likely gets little sleep doing so.

So I suggest once again that we don't opine outside of our core knowledge and expertise. You are very much out of line with that comment.
 

Thomas savage

Grand Contributor
The Watchman
Forum Donor
Joined
Feb 24, 2016
Messages
10,260
Likes
16,298
Location
uk, taunton
BTW Amirm's Mrs is the master of pickling, even if Amirm himself is rather uncouth in his appreciation of certain pickling efforts. I had to amend things and eat a ton because of his ignorance of palate.

Pickled veg, tea, darning wool, family and feeling sorry for the postman are my main lasting memories of my lovely friend @amirm, that and me struggling to eat a burger; fortunately that indiscretion was away from the home...

He did drop me off at the airport with only 30 mins before the international flight took off, but I understood, he didn't really want me to go.. .. . ... .
 
Last edited:

dmac6419

Major Contributor
Joined
Feb 16, 2019
Messages
1,246
Likes
770
Location
USofA

amirm

Founder/Admin
Staff Member
CFO (Chief Fun Officer)
Joined
Feb 13, 2016
Messages
44,374
Likes
234,475
Location
Seattle Area
A bit of a tangent here - this MQA debate has inspired me to try and learn a bit more about the inherent limitations and challenges of PCM when it comes to recording and digitizing real-life audio sources. Are there any good resources that discuss this subject matter in more depth?
Most definitely. The most authoritative paper and analysis actually comes from Bob Stuart! It exists as both a conference paper and a journal article, with the latter being more detailed. It is also publicly available. It is called Coding for High Resolution Audio Systems.

https://www.aes.org/e-lib/browse.cfm?elib=12986

I found an online version although I don't know if it is the conference paper or the journal one: http://decoy.iki.fi/dsound/ambisonic/motherlode/source/coding2.pdf

"What do we mean by high resolution? The recording and replay chain is reviewed from the viewpoints of digital audio engineering and human psychoacoustics. An attempt is made to define high resolution and to identify the characteristics of a transparent digital audio channel. The theory and practice of selecting high sample rates such as 96 kHz and word lengths of up to 24 bit are examined. The relative importance of sampling rate and word size at various points in the recording, mastering, transmission, and replay chain is discussed. Encoding methods that can achieve high resolution are examined and compared, and the advantages of schemes such as lossless coding, noise shaping, oversampling, and matched preemphasis with noise shaping are described. "


This paper is heavily referenced in other research. And if you read it, it actually takes a very sober look at audio, saying we don't need crazy sample rates to preserve what is important. Likely that is the reason MQA doesn't even try to encode above 88.2/96 kHz and just upsamples.
 

amirm

Founder/Admin
Staff Member
CFO (Chief Fun Officer)
Joined
Feb 13, 2016
Messages
44,374
Likes
234,475
Location
Seattle Area
The thing is... how many audiophiles are worried about saving disc space?
Almost all of the MQA content consumed today is streamed through Tidal, so storage saving is not a factor. Audiophiles are attracted to it based on "sound quality" (perceived through different masters, or imagined). For many, though, like me, we got MQA for free and play it when it comes our way.
 

amirm

Founder/Admin
Staff Member
CFO (Chief Fun Officer)
Joined
Feb 13, 2016
Messages
44,374
Likes
234,475
Location
Seattle Area
Why would I not have an idea? Is it some special audiophile music or something? I know exactly who they are, Bob Ludwig too, so if I may ask, what are you trying to say?
I don't know why he went after you. You were right that Bandcamp had made a mistake by not checking what was available in which format versus Prostudiomasters.

They are not totally at fault, though, in that I read the title was released on CD as MQA without any marking on the CD either. So likely they were just given that same source.
 

Thomas savage

Grand Contributor
The Watchman
Forum Donor
Joined
Feb 24, 2016
Messages
10,260
Likes
16,298
Location
uk, taunton
Most definitely. The most authoritative paper and analysis actually comes from Bob Stuart! It exists as both a conference paper and a journal article, with the latter being more detailed. It is also publicly available. It is called Coding for High Resolution Audio Systems.

https://www.aes.org/e-lib/browse.cfm?elib=12986

I found an online version although I don't know if it is the conference paper or the journal one: http://decoy.iki.fi/dsound/ambisonic/motherlode/source/coding2.pdf

"What do we mean by high resolution? The recording and replay chain is reviewed from the viewpoints of digital audio engineering and human psychoacoustics. An attempt is made to define high resolution and to identify the characteristics of a transparent digital audio channel. The theory and practice of selecting high sample rates such as 96 kHz and word lengths of up to 24 bit are examined. The relative importance of sampling rate and word size at various points in the recording, mastering, transmission, and replay chain is discussed. Encoding methods that can achieve high resolution are examined and compared, and the advantages of schemes such as lossless coding, noise shaping, oversampling, and matched preemphasis with noise shaping are described. "


This paper is heavily referenced in other research. And if you read it, it actually takes a very sober look at audio, saying we don't need crazy sample rates to preserve what is important. Likely that is the reason MQA doesn't even try to encode above 88.2/96 kHz and just upsamples.
Also why so many reference the Nils Lofgren live album when it was done in 16/44.

These just are not the important things. If all this time and effort had been aimed at labels and distributors giving us files mastered for hi-fi playback, we would all be better off.

Master it for speakers in a room; if you want to use earbuds or play fancy games trying to negate the limits of headphones, all of that should be done in the player or on the phone/attached device..

That would increase fidelity for all humans. You're literally better off being stuck with nothing but MQA, if it respects a hi-fi playback chain, than with the current status quo.

It's so disappointing, it's all just so disappointing.
 

KSTR

Major Contributor
Joined
Sep 6, 2018
Messages
2,690
Likes
6,013
Location
Berlin, Germany
I think MQA made a strategic mistake by never engaging the audiophile community on these forums. Had they done so, they could have provided the explanations I gave in this thread a long time ago, obviating the need for the video the OP made and the tons of arguments back and forth. Instead they only went the route of one-shot interviews with the press and such, with no allowance for the other side to ask tough questions.
Looks like that nails it. In this day and age, you have to meet your customers at eye level. We're not the dumb sheep we used to be in the last century, notably pre-internet.
 

Thomas savage

Grand Contributor
The Watchman
Forum Donor
Joined
Feb 24, 2016
Messages
10,260
Likes
16,298
Location
uk, taunton
I just explained how it is NOT perceptually lossless, so I am not sure why you are saying that to me. MQA's approach of preserving all the music can be called lossless since no music was left behind. It would be a lay definition of lossless in some sense: "give me all the music." Compare that to 16-bit/44.1 kHz CD, which cannot say that.

The audiophile community was the one that assumed their definition was mathematically lossless, even though MQA did not use that term ("mathematically"). The community had good reason to think so, but hopefully, as a result of this thread and Bob's new explanation, everyone should now know there is a distinction. The old assumption could never have been true, and I had said that many times.

I think MQA made a strategic mistake by never engaging the audiophile community on these forums. Had they done so, they could have provided the explanations I gave in this thread a long time ago, obviating the need for the video the OP made and the tons of arguments back and forth. Instead they only went the route of one-shot interviews with the press and such, with no allowance for the other side to ask tough questions.
Wrong timing for them on all fronts.

The marketing brief was clear: create value in better audio, make it understandable, create a link between authenticity and MQA, and break the numbers game in favour of the letters game. Do all this while creating added value, value principles as effective as the public's high regard for screen resolution versus its disregard and misunderstanding of sampling rate... so make it letters and try like a motherfucker for authenticity.

Good effort; probably the Internet age defeated them. I'm sure SACD never had to deal with this shit.

PCM is wasteful; there was a time when low-rate codecs were taking over, and that would have been sad, so MQA would have saved us...

It's just that things never quite turned out that way, and the progression of information sharing and storage, combined with the difficulty of establishing any standard in audio, got in the way.

Oh well... @amirm, if Bob asks you for a fiver for a 'sure thing', run like the clappers...
 

Hai-Fri. Audio

Member
Forum Donor
Joined
May 28, 2021
Messages
32
Likes
53
Looks like that nails it. In this day and age, you have to meet your customers at eye level. We're not the dumb sheep we used to be in the last century, notably pre-internet.

If anything, their utter ineptitude regarding business/marketing has always kept me skeptical of the idea that, excuse my hyperbole, MQA is some attempt at world domination.
 

raistlin65

Major Contributor
Forum Donor
Joined
Nov 13, 2019
Messages
2,279
Likes
3,421
Location
Grand Rapids, MI
The audiophile community was the one that assumed their definition was mathematically lossless, even though MQA did not use that term ("mathematically"). The community had good reason to think so, but hopefully, as a result of this thread and Bob's new explanation, everyone should now know there is a distinction. The old assumption could never have been true, and I had said that many times.

It's a very MQA Ltd. biased way of looking at it to describe it as an assumption.

Lossless is generally understood to mean mathematically lossless in the audiophile community. A rhetor is responsible for communicating to their audience and has to use language correctly for that target audience.

So the complete burden for this so-called misunderstanding rests on MQA for not communicating well. They feel their encoder/decoder is a new form of lossless that preserves the music, even if it's not bit-for-bit lossless. That this didn't land means MQA had a major marketing/PR failure of messaging by not correctly interpreting the rhetorical situation.

(I say "so-called" because I don't think MQA has well-proven that the encoder/decoder does what it says it does. Which doesn't make any sense, because proving it doesn't seem to have a downside.)

But what's weird is that Bob Stuart should understand consumer audio well enough to have known this might happen. And when it did start happening, he should have done a better job of revising the messaging to convey this new idea of lossless to consumers. But that hasn't happened.
 

PierreV

Major Contributor
Forum Donor
Joined
Nov 6, 2018
Messages
1,437
Likes
4,686
Keep in mind that Apple and Amazon likely have much lower bandwidth costs than Tidal, since they use so much of it.

Mostly because Tidal is an Amazon customer...

from aws json ip list

{
  "ip_prefix": "13.249.0.0/16",
  "region": "GLOBAL",
  "service": "AMAZON",
  "network_border_group": "GLOBAL"
},

some tidal addresses, among others (https://tools.tracemyip.org/lookup/tidal.com)

13.249.109.19
13.249.123.24
13.249.135.70
13.249.110.101

and of course...

Domain Name: TIDAL.COM
Updated Date: 2017-05-02T20:56:59Z
Creation Date: 1995-06-08T04:00:00Z
Registry Expiry Date: 2025-06-07T04:00:00Z
Registrar: GoDaddy.com, LLC
Name Server 1: NS-1073.AWSDNS-06.ORG
Name Server 2: NS-1865.AWSDNS-41.CO.UK
Name Server 3: NS-196.AWSDNS-24.COM
Name Server 4: NS-974.AWSDNS-57.NET
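For anyone who wants to reproduce the check above, Python's stdlib `ipaddress` module can confirm whether those tidal.com addresses fall inside the published AWS prefix. A minimal sketch using the prefix and addresses quoted in this post:

```python
# Verify that the listed tidal.com IPs sit inside the AWS-published
# 13.249.0.0/16 prefix from the AWS JSON IP ranges file.
import ipaddress

aws_prefix = ipaddress.ip_network("13.249.0.0/16")
tidal_ips = [
    "13.249.109.19",
    "13.249.123.24",
    "13.249.135.70",
    "13.249.110.101",
]

# Membership test works directly between an address and a network.
for ip in tidal_ips:
    assert ipaddress.ip_address(ip) in aws_prefix
```

All four addresses pass the membership test, consistent with Tidal being served from AWS infrastructure.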
 

mtristand

Member
Joined
Jan 12, 2021
Messages
27
Likes
167
So much of MQA's material throws such phrases around, "completely reversible lossless process", "unwrapped perfectly", "original sound restored", "lossless", "touched up to lossless", "better than lossless", etc.

"Lossless" is a term that is already widely used, well understood, and well defined in computer science, cryptography, mathematics, audio communities, etc.: all information from the original is present; absolutely nothing is lost.

If you're going to make claims that make use of a well-established phrase, you need to let people verify it. If you're not going to communicate or offer transparency, then extreme skepticism is warranted.

If you're using common words in non-standard or misleading ways, you need to be clearer with your advertising to more accurately reflect what it is you're offering - otherwise you lose the right to clutch pearls when people rightfully take umbrage with it.
 