svart-hvitt

Major Contributor
Joined
Aug 31, 2017
Messages
2,375
Likes
1,253
Thanks, @amirm. Tried that. The «optical out» symbol shows on the screen, but the volume cannot be altered with the TV's remote volume control. So no fix.

I suspect HDMI on Apple’s (A4K) or Samsung’s (The Frame) side, and I have this reference in mind:

«Arvus Digital completes development of the world’s first HDMI to AES/EBU converter. The HDMI-2A is now used internationally for audio level calibration by companies such as Sony, Dolby, CBS, Warner Bros, AT&T, Park Road Post etc.»
Source: http://arvusgroup.com/history/

In other words, there seems to be something going on in the HDMI protocol that causes problems for engineers volume-wise.

I can’t tell if the «direct» chain, Mac Pro via USB, is better once you adjust for volume, because switching between setups takes too long (and volume matching is by ear only). My anecdotal impression, however, is that nothing good happens sound-wise in the HDMI stage.
 
OP
amirm

Founder/Admin
Staff Member
CFO (Chief Fun Officer)
Joined
Feb 13, 2016
Messages
44,651
Likes
240,792
Location
Seattle Area
Yeh, it was a long shot. :)

I have tested HDMI in the past against S/PDIF and have not experienced any level differences. So as a protocol I don't think it is to blame.
 

svart-hvitt

Major Contributor
Joined
Aug 31, 2017
Messages
2,375
Likes
1,253
Yeh, it was a long shot. :)

I have tested HDMI in the past against S/PDIF and have not experienced any level differences. So as a protocol I don't think it is to blame.

Take my experience with a grain of salt; there could be something somewhere I haven’t adjusted (but I tried out all the suggestions from people above).

However, if other people have some «experiences» with HDMI, then don’t hesitate to write about this bastard standard.

:)
 

Fitzcaraldo215

Major Contributor
Joined
Mar 4, 2016
Messages
1,440
Likes
634
It definitely is a horrible "standard." Calling it a standard is definitely a stretch of that term.
I agree it is lacking in many respects, though it is perfectly OK for video. It is generally a compromise with audio, though. But even there, it was not so bad that it prevented me from realizing the major advantages of Mch over stereo music from the get-go. I used it for 5-6 years for audio with a prepro. But all audio sounds much better to me via USB into a Mch DAC, and I think the elimination of HDMI audio was part of that.

And, then, of course, there is the awful HDMI connector.
 

svart-hvitt

Major Contributor
Joined
Aug 31, 2017
Messages
2,375
Likes
1,253
I agree it is lacking in many respects, though it is perfectly OK for video. It is generally a compromise with audio, though. But even there, it was not so bad that it prevented me from realizing the major advantages of Mch over stereo music from the get-go. I used it for 5-6 years for audio with a prepro. But all audio sounds much better to me via USB into a Mch DAC, and I think the elimination of HDMI audio was part of that.

And, then, of course, there is the awful HDMI connector.

Thanks for your experience report!

I find your comment interesting but highly disturbing. What do you mean by «generally a compromise with audio»?

(Your anecdotal listening impressions resemble mine, by the way, when I try to compare USB vs HDMI, sighted and with no way to ensure rapid AB(X) switching or volume matching.)
 

Fitzcaraldo215

Major Contributor
Joined
Mar 4, 2016
Messages
1,440
Likes
634
https://audiosciencereview.com/forum/index.php?threads/a-deep-dive-into-hdmi-audio-performance.56/

The main issue might be jitter, which is inherent in the HDMI protocol because it uses the video clock for audio transmission and reception. There is always video transmitted along with the audio in HDMI, even if just blank frames. Asynchronous USB does no such thing; it vanquishes jitter by using the DAC's master clock, together with buffering and logic at both ends, to maximize audio performance.

The point is twofold. First, HDMI is primarily a video transmission protocol, with the unique needs of audio not fully recognized. Second, it is not terrible for audio, but other protocols do a better job.
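(An illustrative aside on the clock coupling: in HDMI the sink regenerates the audio clock from the video (TMDS) clock via two source-supplied parameters, N and CTS, so that 128 × fs = f_TMDS × N / CTS. Below is a minimal sketch, assuming the commonly cited N/CTS pair for 48 kHz audio at a 148.5 MHz TMDS clock; the numbers are illustrative, not taken from this post.

```python
# Sketch of HDMI Audio Clock Regeneration (ACR): the sink rebuilds the
# audio sample clock from the video (TMDS) clock using two source-supplied
# parameters, N and CTS:  128 * fs = f_tmds * N / CTS.
# The values below are the commonly cited pair for 48 kHz audio at a
# 148.5 MHz (1080p60) TMDS clock; they are illustrative assumptions.

def regenerated_sample_rate(f_tmds_hz: float, n: int, cts: int) -> float:
    """Audio sample rate (Hz) the sink derives from the video clock."""
    return f_tmds_hz * n / (cts * 128)

f_tmds = 148_500_000    # 1080p60 TMDS clock, Hz
n, cts = 6144, 148_500  # typical ACR parameters for 48 kHz

print(f"Regenerated audio clock: {regenerated_sample_rate(f_tmds, n, cts):.1f} Hz")
# -> Regenerated audio clock: 48000.0 Hz
```

Any jitter or drift on the video clock carries straight into the recovered audio clock unless the sink re-clocks and buffers the audio, which is the cleanup described further down the thread.)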
 
OP
amirm

Founder/Admin
Staff Member
CFO (Chief Fun Officer)
Joined
Feb 13, 2016
Messages
44,651
Likes
240,792
Location
Seattle Area
I agree it is lacking in many respects, though it is perfectly OK for video.
There are tons of incompatibilities for video. You can have a perfectly good system, then get a new projector and it won't sync. Cabling is too thick and not field-terminable, which makes it a pain for longer runs such as to projectors. There are no testing requirements, so anything can get released and call itself HDMI.

We went from component video, which always worked, to this, which works when it wants to.
 

RayDunzl

Grand Contributor
Central Scrutinizer
Joined
Mar 9, 2016
Messages
13,250
Likes
17,185
Location
Riverview FL
OP
amirm

Founder/Admin
Staff Member
CFO (Chief Fun Officer)
Joined
Feb 13, 2016
Messages
44,651
Likes
240,792
Location
Seattle Area
Not a simple one. The issue is that HDMI audio is slaved to video. In each frame of video, there is room for X number of audio samples. In other words, that many audio samples have to be played before the next video frame arrives; otherwise audio will lose sync with video. Indeed, that synchronization is the reason audio is slaved to video.

What this means is that you can't just have your own audio clock and play at your own rate. You will drift away from video in a few seconds.

Instead, you need to clean up the noisy HDMI clock and extract your data with extreme care to isolate it from the noisy display circuits wailing close by. The Mark Levinson No 502 did that in my HDMI testing: https://audiosciencereview.com/forum/index.php?threads/a-deep-dive-into-hdmi-audio-performance.56/
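To put rough numbers on the "X samples per frame" point above, here is a minimal back-of-the-envelope sketch; the 48 kHz / 60 fps rates, the 100 ppm clock offset, and the 64-sample buffer headroom are assumed example values, not figures from the post.

```python
# Why a free-running DAC clock cannot simply ignore the HDMI video clock.
# Assumed example values (not from the post): 48 kHz audio, 60 fps video,
# a DAC clock off by 100 ppm, and 64 samples of receive-buffer headroom.

fs = 48_000          # audio sample rate, Hz
fps = 60             # video frame rate, frames per second
ppm_offset = 100     # assumed error of a free-running DAC clock
fifo_headroom = 64   # assumed spare samples in the receive buffer

print(f"Audio samples carried per video frame: {fs / fps:.0f}")  # 800

# If the DAC runs on its own clock, it consumes samples slightly too fast
# or too slow; the mismatch accumulates in the receive buffer until it
# over- or underflows and audio glitches or loses sync with video.
drift_per_second = fs * ppm_offset / 1_000_000         # samples/s
seconds_to_failure = fifo_headroom / drift_per_second
print(f"Clock mismatch: {drift_per_second:.1f} samples/s")
print(f"Buffer over/underflow after roughly {seconds_to_failure:.0f} s")
```

With these assumed numbers the buffer runs out in a bit over ten seconds, which is why the audio either has to follow the recovered (noisy) video clock or be re-clocked behind careful buffering and isolation, as described above.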
 

svart-hvitt

Major Contributor
Joined
Aug 31, 2017
Messages
2,375
Likes
1,253
Not a simple one. The issue is that HDMI audio is slaved to video. In each frame of video, there is room for X number of audio samples. In other words, that many audio samples have to be played before the next video frame arrives; otherwise audio will lose sync with video. Indeed, that synchronization is the reason audio is slaved to video.

What this means is that you can't just have your own audio clock and play at your own rate. You will drift away from video in a few seconds.

Instead, you need to clean up the noisy HDMI clock and extract your data with extreme care to isolate it from the noisy display circuits wailing close by. The Mark Levinson No 502 did that in my HDMI testing: https://audiosciencereview.com/forum/index.php?threads/a-deep-dive-into-hdmi-audio-performance.56/

Do you have access to the February 2009 issue of Hifi-News on HDMI and jitter? It supports your findings. However, Paul Miller's test method wasn't fully accounted for, according to some critical voices.
 
OP
amirm

Founder/Admin
Staff Member
CFO (Chief Fun Officer)
Joined
Feb 13, 2016
Messages
44,651
Likes
240,792
Location
Seattle Area
Do you have access to the February 2009 issue of Hifi-News on HDMI and jitter? It supports your findings. However, Paul Miller's test method wasn't fully accounted for, according to some critical voices.
I do. My testing actually came about as a result of people saying he had tested older gear and that HDMI audio performance had since been fixed. Well, it obviously had not.
 

Soniclife

Major Contributor
Forum Donor
Joined
Apr 13, 2017
Messages
4,510
Likes
5,437
Location
UK
The volume difference could be because of the volume levelling that's available in the Tidal app to compensate for the loudness war; your phone might have it turned on and your Mac off. It's in the settings.
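For context, player-side loudness levelling of this kind typically measures each track's integrated loudness and applies a gain toward a fixed target. A minimal sketch, where the -14 LUFS target and the track's loudness are assumed example values rather than Tidal's documented settings:

```python
# Minimal sketch of how a per-app loudness normalization toggle can make
# one playback chain audibly quieter than another. The -14 LUFS target and
# the track loudness below are assumed examples, not Tidal's documented
# behaviour.

def normalization_gain_db(track_lufs: float, target_lufs: float = -14.0) -> float:
    """Gain the player applies so the track lands on the loudness target."""
    return target_lufs - track_lufs

track_lufs = -9.0  # a loudly mastered track (assumed value)
print(f"Player applies {normalization_gain_db(track_lufs):+.1f} dB")  # -5.0 dB

# If the option is on in one app (e.g. the phone) and off in another
# (e.g. the Mac), the two chains differ by exactly this gain even though
# both DACs are behaving identically.
```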
 

svart-hvitt

Major Contributor
Joined
Aug 31, 2017
Messages
2,375
Likes
1,253
The volume difference could be because of the volume levelling that's available in the Tidal app to compensate for the loudness war; your phone might have it turned on and your Mac off. It's in the settings.

Will check...
 

svart-hvitt

Major Contributor
Joined
Aug 31, 2017
Messages
2,375
Likes
1,253
The volume difference could be because of the volume levelling that's available in the Tidal app to compensate for the loudness war; your phone might have it turned on and your Mac off. It's in the settings.

@Soniclife , I believe you’re right!

I had never paid attention to that button in the iPhone app.

:)

And then some of my newfound digital (HDMI) skepticism was blown away again.

To my ears the sound is now much the same again.

So I guess: case closed! Myth (on HDMI sound level) busted! Technology wins over man once more...
 

Soniclife

Major Contributor
Forum Donor
Joined
Apr 13, 2017
Messages
4,510
Likes
5,437
Location
UK
@Soniclife , I believe you’re right!

I had never paid attention to that button in the iPhone app.

:)

And then some of my newfound digital (HDMI) skepticism was blown away again.

To my ears the sound is now much the same again.

So I guess: case closed! Myth (on HDMI sound level) busted! Technology wins over man once more...
So once level matched they sound the same?
 

svart-hvitt

Major Contributor
Joined
Aug 31, 2017
Messages
2,375
Likes
1,253
So once level matched they sound the same?

Yes, I believe so.

(Family house, no way to measure SPL properly...).

However, in Tidal’s Apple 4K app, I still think the volume is slightly lower (and I cannot find any settings for this in the app; there may not be any), but I will leave it at that:

TIDAL FROM USB AND HDMI VIA A4K HAVE THE SAME VOLUME AND SOUND (ABOUT?) THE SAME.

:)
 