
Digital Audio Jitter Fundamentals Part 2

OP
amirm


Founder/Admin
Staff Member
CFO (Chief Fun Officer)
Joined
Feb 13, 2016
Messages
44,671
Likes
241,053
Location
Seattle Area
It actually doesn't work that way. Packet delay variation is compensated with jitter buffers in the process called Packet Loss Concealment. As long as packet loss is under 1% it is considered inaudible, while if greater than 3% you will certainly be able to hear it. It will not actually modify the sound itself, but you will hear it as clicks or dropouts.
This technique is only used in real-time communications, where after a short period of time you can no longer wait for the packet to arrive. Too much delay/buffering makes real-time communication hard to use due to the latency in hearing the other party.

This is NOT in play when you stream music or video online or in your home network. Data can be fetched well in advance of playback and that gained "time" used to retransmit packets. If something doesn't arrive on time, then the player pauses, resulting in that "buffering" message.

The larger point is correct: there is no meaningful notion of jitter at the network end, since packet timing in networks is assumed to be highly variable to begin with.
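To make the jitter-buffer idea above concrete, here is a minimal sketch (not any product's implementation; packet interval and buffer depth are illustrative assumptions). Each packet has an ideal playout time; the buffer delays playback by a fixed amount so that late packets can still make their deadline, and anything later than that is counted lost and must be concealed:

```python
def classify_packets(arrivals_ms, packet_interval_ms=20, buffer_ms=60):
    """arrivals_ms[i] is the actual arrival time of packet i in milliseconds.

    A packet is playable if it arrives before its ideal playout time
    (i * packet_interval_ms) plus the jitter-buffer depth; otherwise it
    counts as lost and has to be concealed (heard as a click/dropout
    when the loss rate gets high enough).  Returns (played, lost).
    """
    played = lost = 0
    for i, arrival in enumerate(arrivals_ms):
        deadline = i * packet_interval_ms + buffer_ms
        if arrival <= deadline:
            played += 1
        else:
            lost += 1
    return played, lost
```

For example, a stream where every packet arrives within 60 ms of its slot plays cleanly, while one packet delayed by 130 ms past its slot is dropped and concealed.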
 

Krunok

Major Contributor
Joined
Mar 25, 2018
Messages
4,600
Likes
3,068
Location
Zg, Cro
This technique is only used in real-time communications, where after a short period of time you can no longer wait for the packet to arrive. Too much delay/buffering makes real-time communication hard to use due to the latency in hearing the other party.

This is NOT in play when you stream music or video online or in your home network.

This technique is used with the RTP protocol, which is used to stream video and audio over the Internet, as his question was "At what stage does the jitter from the entire internet.." ;)

In our home networks we use the TCP/IP protocol, but player applications still buffer in a similar manner, which buys you time if a packet needs to be re-transmitted. It's easy with audio, as today's computers have enough memory to easily buffer the whole song, but not so with video. Still, it is the same technique with both protocols, and the point is that we are not speaking of jitter in packet transmission networks.
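The "buffer the whole song" claim is easy to check with arithmetic. A sketch for uncompressed CD-quality PCM (the three-minute duration is just an example):

```python
def pcm_song_bytes(seconds, sample_rate=44100, channels=2, bytes_per_sample=2):
    """Size of uncompressed PCM audio: samples/sec * channels * bytes/sample."""
    return seconds * sample_rate * channels * bytes_per_sample

# A 3-minute CD-quality song:
size = pcm_song_bytes(180)  # 31,752,000 bytes, roughly 30 MiB
```

About 30 MiB even uncompressed, and far less for a lossy or lossless compressed stream, so holding an entire track in RAM is trivial for any modern computer.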
 
OP
amirm


Founder/Admin
Staff Member
CFO (Chief Fun Officer)
Joined
Feb 13, 2016
Messages
44,671
Likes
241,053
Location
Seattle Area
This technique is used with the RTP protocol, which is used to stream video and audio over the Internet, as his question was "At what stage does the jitter from the entire internet.." ;)
That is only the case for real-time communication such as VoIP. For general audio/video streaming, we did start with RTP/RTSP, but firewalls block these by default, so it died early on. What took its place is HTTP streaming, which in many cases encapsulates RTP/RTSP packets inside HTTP. Since HTTP runs over TCP, packet retransmission is there by definition.

Also, error concealment is done at the codec level because it best knows how to make up for lost data. It is not handled at networking level.
 

Cosmik

Major Contributor
Joined
Apr 24, 2016
Messages
3,075
Likes
2,180
Location
UK
It actually doesn't work that way. Packet delay variation is compensated with jitter buffers in the process called Packet Loss Concealment. As long as packet loss is under 1% it is considered inaudible, while if greater than 3% you will certainly be able to hear it. It will not actually modify the sound itself, but you will hear it as clicks or dropouts.

To put it short: in packet audio transmission you don't really have jitter; instead you have packet loss and delay. And yes, if packet loss goes over a certain limit you will hear it, in a very different way than you hear jitter.

More info here: https://kb.smartvox.co.uk/voip-sip/rtp-jitter-audio-quality-voip/
It's going back a bit, but as I recall I was being slightly facetious: I think I was stressing the point that streaming actually works across the globe and beyond, hence all the FUD about 'jitter' across networks and worries about short lengths of ethernet cable are misplaced. Bits are bits :) and as long as they get through in time in an asynchronous system, no one will be any the wiser when they look at the output.
 

Krunok

Major Contributor
Joined
Mar 25, 2018
Messages
4,600
Likes
3,068
Location
Zg, Cro
That is only the case for real-time communication such as VoIP. For general audio/video streaming, we did start with RTP/RTSP, but firewalls block these by default, so it died early on. What took its place is HTTP streaming, which in many cases encapsulates RTP/RTSP packets inside HTTP. Since HTTP runs over TCP, packet retransmission is there by definition.

The point of my post was that it really makes no sense to talk about jitter in packet transmission of audio or video, so it doesn't really make any difference whether you use RTP or TCP/IP.

Btw, who exactly are "we"? If by "we" you mean "Microsoft" I really hate to remind you that HLS protocol (HTTP Live Streaming) was invented by Apple, not Microsoft. :D

Also, error concealment is done at the codec level because it best knows how to make up for lost data. It is not handled at networking level.

True, but if you're using TCP/IP a dropped packet will always be re-transmitted by TCP at the transport layer, regardless of what the higher OSI layers do.
 

Krunok

Major Contributor
Joined
Mar 25, 2018
Messages
4,600
Likes
3,068
Location
Zg, Cro
It's going back a bit, but as I recall I was being slightly facetious: I think I was stressing the point that streaming actually works across the globe and beyond, hence all the FUD about 'jitter' across networks and worries about short lengths of ethernet cable are misplaced. Bits are bits :) and as long as they get through in time in an asynchronous system, no one will be any the wiser when they look at the output.

"Bits are bits" sounds nice but it's not really that simple. Once your computer receives TCP/IP or RTP packet over the Internet say it will pass it over USB to the digital converter which will then use I2S or SPDIF to move the data to the DAC chip, and that is where things start to get complicated as that transfer is not asynchronous any more, but also is not fully synchronous, and that may cause problems. But with modern chips it doesn't really happen in a way we can hear, so we can move on.. :D
 

Krunok

Major Contributor
Joined
Mar 25, 2018
Messages
4,600
Likes
3,068
Location
Zg, Cro
Btw, who exactly are "we"? If by "we" you mean "Microsoft" I really hate to remind you that HLS protocol (HTTP Live Streaming) was invented by Apple, not Microsoft. :D

Or did you mean: "We are the Borg, resistance is futile"? :D
 
OP
amirm


Founder/Admin
Staff Member
CFO (Chief Fun Officer)
Joined
Feb 13, 2016
Messages
44,671
Likes
241,053
Location
Seattle Area
Btw, who exactly are "we"? If by "we" you mean "Microsoft" I really hate to remind you that HLS protocol (HTTP Live Streaming) was invented by Apple, not Microsoft. :D
We being a company called VXtreme, a start-up I was part of that we sold to Microsoft in 1997. We were streaming video over dial-up modems when Apple engineers were still in diapers. :D Indeed, we patented the idea of adaptive streaming, where content at different bit rates is selected dynamically as network bandwidth changes. This is used by all streaming platforms today; otherwise you would be getting a lot more buffering messages. If you ever see the fidelity of the video change as you are watching, adaptive streaming is in play.
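The core decision in adaptive streaming can be sketched in a few lines: the player measures its recent throughput and picks the highest-bitrate rendition that fits, switching rungs as conditions change. The ladder values and headroom factor below are illustrative assumptions, not any service's actual encoding ladder:

```python
# Hypothetical bitrate ladder (kbit/s) of the renditions a server offers.
LADDER_KBPS = [235, 750, 1750, 4300, 15000]

def pick_rendition(measured_kbps, ladder=LADDER_KBPS, headroom=0.8):
    """Choose the highest rendition whose bitrate stays within
    headroom * measured bandwidth; fall back to the lowest rung
    when even that doesn't fit (better a blurry picture than a stall)."""
    budget = measured_kbps * headroom
    candidates = [r for r in ladder if r <= budget]
    return max(candidates) if candidates else ladder[0]
```

So a player measuring 6 Mbit/s would stream the 4300 kbit/s rendition, and if bandwidth collapses it drops to the lowest rung rather than showing the "buffering" message; that down-switch is the visible fidelity change described above.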
 

Krunok

Major Contributor
Joined
Mar 25, 2018
Messages
4,600
Likes
3,068
Location
Zg, Cro
We being a company called VXtreme, a start-up I was part of that we sold to Microsoft in 1997. We were streaming video over dial-up modems when Apple engineers were still in diapers. :D

Hahaha - very true about the diapers! :D

Indeed, we patented the idea of adaptive streaming, where content at different bit rates is selected dynamically as network bandwidth changes. This is used by all streaming platforms today; otherwise you would be getting a lot more buffering messages. If you ever see the fidelity of the video change as you are watching, adaptive streaming is in play.

I was technical head at a national TV broadcaster until recently, so I'm familiar with adaptive streaming, but nevertheless, kudos for inventing that!
 

svart-hvitt

Major Contributor
Joined
Aug 31, 2017
Messages
2,375
Likes
1,253
Reading this, I think I am more concerned about the actual data rates of services like Netflix, iTunes, and HBO, for both video and audio.

To me, it’s clear that some digital streams are of lower quality than others. Jitter is of no concern!

Maybe a question for a separate thread...
 
OP
amirm


Founder/Admin
Staff Member
CFO (Chief Fun Officer)
Joined
Feb 13, 2016
Messages
44,671
Likes
241,053
Location
Seattle Area
They use very aggressive bit rates so as to keep the cost of bandwidth low. As such, even though they advertise lofty specs like "4K," the actual fidelity on non-static images can be pretty low.
 

svart-hvitt

Major Contributor
Joined
Aug 31, 2017
Messages
2,375
Likes
1,253
They use very aggressive bit rates so as to keep the cost of bandwidth low. As such, even though they advertise lofty specs like "4K," the actual fidelity on non-static images can be pretty low.

Maybe worth a new thread? Could we measure output data rate?

I hate it when the sound is congested and the image is foggy. And I don’t want to go back to discs! :(
 

Krunok

Major Contributor
Joined
Mar 25, 2018
Messages
4,600
Likes
3,068
Location
Zg, Cro
Reading this, I think I am more concerned about the actual data rates of services like Netflix, iTunes, and HBO, for both video and audio.

To me, it’s clear that some digital streams are of lower quality than others. Jitter is of no concern!

Maybe a question for a separate thread...

Just as an mp3 covering the whole frequency spectrum doesn't tell you much about its quality, 1080p or 4K resolution doesn't tell you much about video quality; you have to look at the effective bitrate of the stream you're getting.
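The "effective bitrate" in question is straightforward to compute when you know how many bytes were delivered over how long, e.g. from a saved file or a traffic counter (the file size and duration below are made-up example numbers):

```python
def effective_kbps(file_bytes, duration_s):
    """Average delivered bitrate in kbit/s: bytes -> bits, then per second."""
    return file_bytes * 8 / 1000 / duration_s

# Example: a 3-minute audio file of 2,880,000 bytes
rate = effective_kbps(2_880_000, 180)  # 128.0 kbit/s
```

Two files with identical sample rates (or two streams with identical resolutions) can differ enormously on this number, which is exactly the point being made.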
 

svart-hvitt

Major Contributor
Joined
Aug 31, 2017
Messages
2,375
Likes
1,253
Just as an mp3 covering the whole frequency spectrum doesn't tell you much about its quality, 1080p or 4K resolution doesn't tell you much about video quality; you have to look at the effective bitrate of the stream you're getting.

Does anyone want to pick up the gauntlet and measure and present the output data rates of the streaming services?

Who is watering down the juice the most?
 

Cosmik

Major Contributor
Joined
Apr 24, 2016
Messages
3,075
Likes
2,180
Location
UK
We seem to be back at that point where everyone doubts that bits are bits, and digital is really analogue, etc. Next thing we'll be told that expensive ethernet cable reduces jitter and is better for EMI. Because everything matters, and how can you be sure that the dielectric isn't draining the colour from the sound?

Don't lose sight of the fact that shifting bits around is *trivial*, and that the only critical bit is right at the DAC, and that can (should) be surrounded by ample isolation, shielding, power supply regulation, decoupling and so on. It just needs the bits from the outside world, entering through an airlock, to arrive before they're needed.

Don't fall for the FUD! :)
 
OP
amirm


Founder/Admin
Staff Member
CFO (Chief Fun Officer)
Joined
Feb 13, 2016
Messages
44,671
Likes
241,053
Location
Seattle Area
Does anyone want to pick up the gauntlet and measure and present the output data rates of the streaming services?
I have covered some of that in my lengthy article here on HDR, UHD/4K, etc.: https://audiosciencereview.com/foru...ge-hdr-and-wide-gamut-video-technologies.669/

"Let me cheer you up by mentioning that online streaming of UHD content is running around 15 megabits/sec by likes of Netflix. Yes, it is nearly one third the data rate of Blu-ray at 1080p! And they are trying to push four times the pixels? Right… Every demo of 4K streaming I have seen has been underwhelming. UHD Blu-ray should easily outperform such streaming content and by a good mile."​
 
OP
amirm


Founder/Admin
Staff Member
CFO (Chief Fun Officer)
Joined
Feb 13, 2016
Messages
44,671
Likes
241,053
Location
Seattle Area
And on YouTube, right-click in the video window and select "Stats for nerds."

You get a display that looks like this:

(screenshot: YouTube "Stats for nerds" overlay)


You see the encoded rates (resolution/frame rate), whether video frames are dropped, and what your network speed is.

In older versions I could tell the encoded rate, but it seems they have changed it and it is no longer listed.

Netflix also has a benchmarking mode. I am not a customer so I have not played with it.
 

svart-hvitt

Major Contributor
Joined
Aug 31, 2017
Messages
2,375
Likes
1,253
I have covered some of that in my lengthy article here on HDR, UHD/4K, etc.: https://audiosciencereview.com/foru...ge-hdr-and-wide-gamut-video-technologies.669/

"Let me cheer you up by mentioning that online streaming of UHD content is running around 15 megabits/sec by likes of Netflix. Yes, it is nearly one third the data rate of Blu-ray at 1080p! And they are trying to push four times the pixels? Right… Every demo of 4K streaming I have seen has been underwhelming. UHD Blu-ray should easily outperform such streaming content and by a good mile."​
So the TV producers want you to buy their sets to watch a digital stream way below the sets’ capabilities...

Like having a Porsche for sitting in a traffic jam :(
 

RayDunzl

Grand Contributor
Central Scrutinizer
Joined
Mar 9, 2016
Messages
13,250
Likes
17,194
Location
Riverview FL
Does anyone want to pick up the gauntlet and measure and present the output data rates of the streaming services?

If you route the video traffic through your PC, you can observe it in excruciating detail with Wireshark.

Example: Amazon to Roku

(screenshot: Wireshark capture of Amazon-to-Roku streaming traffic)
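If a full Wireshark capture is overkill, the same figure can be approximated from cumulative interface byte counters (e.g. psutil's net_io_counters, or /proc/net/dev on Linux; which counter source you use is up to your platform). The conversion from two counter samples to an average rate is just:

```python
def throughput_mbps(bytes_start, bytes_end, interval_s):
    """Average receive rate in megabits per second, given two samples of
    a cumulative byte counter taken interval_s seconds apart."""
    return (bytes_end - bytes_start) * 8 / 1_000_000 / interval_s

# Example: 18,750,000 bytes received over a 10-second window
rate = throughput_mbps(0, 18_750_000, 10)  # 15.0 Mbit/s
```

Sample the counter once per second while the stream plays and you get a live bitrate graph without touching a packet trace.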
 