
TV Sales Dying?

xr100

Addicted to Fun and Learning
Joined
Jan 6, 2020
Messages
518
Likes
237
Location
London, UK
CFL backlight + LCD screen = LCD
LED backlight + LCD screen = LED
CRT + phosphor layer = CRT
OLED = OLED
Plasma = Plasma
And so on. Random naming.
Color wheels, DLP, LED, laser, xenon, etc...

"LED TV," IIRC, was expressly introduced as a term by one of the Korean conglomerates (Samsung?) for product differentiation purposes, and in a way that arguably was misleading to the uninformed consumer. The first device I acquired that used an LED-backlit LCD display was a MacBook Pro (Core Duo) and I don't remember that being marketed as having an "LED" display.

Given that "all" LCD's use LCD-backlights, it seems as ridiculous as it ever was to call them "LED" displays. Particularly given LED displays per se exist.

As for projectors--it is not a "laser projector" nor a "xenon projector" but a "laser-light source" projector or a "xenon-light source" projector. And sloppy terminology is not helpful there, either, as a "laser phosphor-light source" projector is not equivalent to a "laser-light source" projector.

In the case of CRTs, they by definition use an electron gun to excite a phosphor coating on the screen; there is nothing to differentiate by adding further qualifications.

I think we all know what is meant by "LED" TVs, and that battle was lost back then. Though it could be argued that with newer technologies, i.e. "QLED," the "LED" term continues to be confusing nomenclature*--and I still find "LED TV" or "LED display" particularly irritating when the device is an LCD that happens to be backlit by LEDs.

(* I mean, Q comes after O in the alphabet, so someone might imagine that QLED is better/newer than OLED?! Maybe they should be called LEDLCQDD displays...)
 

Tks

Major Contributor
Joined
Apr 1, 2019
Messages
3,221
Likes
5,497
Stagnating?

We now have 4K HDR as the norm, with 8K as a possible new standard coming in. (Even if that doesn't happen, 4K mostly covers what people can see in most situations.) TV technology has vastly upped its game, and fairly recently in the TV timeline.

If someone is complaining about television progress, I am inclined to put that into the "never satisfied" category ;-)
I think you need to do some reading (six-part article series). Just the mention of HDR around me incites this sort of response.

You also mention 8K, but that isn't progress; resolution can be increased with relative ease. All you need is more video controllers/processors to drive it, and that's where the majority of the cost comes from, along with yield rates on whole panels. 8K is also treading into pointless territory: the back-end support for it simply isn't there. It's not there in terms of content, it's not there in terms of I/O bandwidth standards (though HDMI 2.1 and DP 1.4, or 2.0 whenever it releases, are enough in that respect), and it's not there in terms of signal processing and broadcast standards.

But mostly, as I said at first: who cares about 8K? Home cinema folks. 4K is more than enough (honestly, even that is more than I really need).

You can also see this TV trend extend to monitors: constant promises and delays for what essentially amounts to nothing more than vaporware products.

"Making 8K and HDR affordable" isn't technological progress that I was talking about. That's just manufacturing strides and economies of scale hitting. Jumping from CRT to LCD is a technological leap, then to OLED, and maybe this decade if we're lucky, MicroLED. THAT is what I am talking about when I say technological progress. Getting televisions that can come remotely close(if this thing ever sees the light of day) even with respect to brightness, to the entire Dobly Vision specification would be nice. Likewise with panel bit-depth being 10-bit (mostly 8-bit panels with FRC nonsense), I'd like to see some prototypes of 12-bit panels some day soon. Otherwise I don't even know what colorists are even doing when the content provider is advertising a movie or series capable of Dolby-Vision HDR.... HOW? When such panels don't even exist to master the content on - even in the professional offerings?

Likewise with color gamut: how long are we going to keep getting BT.2020 (a 2012 specification) displays, and often without full coverage at that? BT.2100 (the 2016 variant that slightly improves on BT.2020) is still nowhere to be found on consumer displays. And going by TV release dates, the trend seems to be that gamut coverage on standard LCDs is outpacing OLEDs, which everyone assumed would reign supreme in this respect.

So in that respect we're not only seeing no progress, we're seeing regression (but that's what happens when the focus goes into idiotic Android TV OS reskins riddled with security and telemetry nightmares, and nonsensical garbage of that sort).

So forgive me if my excitement for televisions seems to rise a few blips per decade.
 

xr100

Addicted to Fun and Learning
Joined
Jan 6, 2020
Messages
518
Likes
237
Location
London, UK
I think you need to do some reading (six-part article series). Just the mention of HDR around me incites this sort of response.

Thanks for the link. (Haven't read it yet but will do.)

8K is also treading into pointless territory.

Depends on viewing distance. At the minimum screen-width-to-viewing-distance ratio specified by IMAX, 8K is required, according to this article.
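
To put rough numbers on that (mine, not the article's): with the small-angle approximation and the usual ~1 arcminute figure for 20/20 acuity, the pixel count needed before the eye can resolve individual pixels is just the width-to-distance ratio divided by the acuity angle. A quick sketch:

```python
import math

ARCMIN = math.pi / (180 * 60)  # one arcminute in radians

def pixels_needed(width_over_distance, acuity_arcmin=1.0):
    """Horizontal pixel count at which the pixel pitch at screen centre
    just matches visual acuity, for a flat screen of width W viewed
    from distance D (pass the ratio W/D)."""
    return width_over_distance / (acuity_arcmin * ARCMIN)

# Screen as wide as the viewing distance (roughly IMAX-style seating):
print(round(pixels_needed(1.0, 1.0)))   # ~3438 -> roughly 4K territory
print(round(pixels_needed(1.0, 0.5)))   # ~6875 -> roughly 8K, if you grant
                                        # finer acuity for high-contrast detail
# Living-room case: a 1.2 m wide screen at 3 m
print(round(pixels_needed(1.2 / 3.0)))  # ~1375 -> even 1080p is near the limit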

Also, consumer "4K" is not "true" 4K because of the chroma sub-sampling. (Whether or not it matters...)
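
For anyone wondering what the sub-sampling actually costs, the sample counts follow directly from the scheme definitions; a quick sketch:

```python
def raw_samples_per_frame(w, h, scheme):
    """Total Y' + Cb + Cr samples per frame for common subsampling schemes.
    The factor is the fraction of luma resolution kept per chroma plane."""
    chroma_factor = {"4:4:4": 1.0, "4:2:2": 0.5, "4:2:0": 0.25}[scheme]
    return w * h * (1 + 2 * chroma_factor)

w, h = 3840, 2160
for scheme in ("4:4:4", "4:2:2", "4:2:0"):
    ratio = raw_samples_per_frame(w, h, scheme) / raw_samples_per_frame(w, h, "4:4:4")
    print(scheme, f"{ratio:.0%}")
# 4:2:0 carries 50% of the raw samples of 4:4:4: full-resolution luma,
# but colour detail at only 1920x1080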

Otherwise at this time I put "8K" into the same box as "1000W PMPO." It's a simple figure for the uninformed consumer to think "bigger must be better." Heck, for years there have been people watching SD versions of channels when HD versions are available.

It's not there in terms of content

It has to be supported all the way through the chain--from the lens (where applicable) to the sensor, or CG rendered at high resolution (more render time plus more work).

Likewise with panel bit depth: most "10-bit" panels are really 8-bit with FRC nonsense

Why is FRC "nonsense"? Surely it would always make sense to use FRC, no matter the bit-depth of the display, in the same way that dither is used in audio?
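
To illustrate the analogy: FRC is essentially temporal dithering, where the panel alternates the two nearest 8-bit codes fast enough that the eye averages them to the intended 10-bit level. A toy model, just to show the idea:

```python
def frc_frames(level_10bit, n_frames=4):
    """Toy model of FRC (temporal dithering): show the two nearest 8-bit
    codes in a ratio such that the average over n_frames equals the
    10-bit target, much as dither trades amplitude resolution for noise
    in audio."""
    lo = level_10bit // 4                            # nearest 8-bit code at or below
    n_hi = round((level_10bit % 4) / 4 * n_frames)   # frames shown one code higher
    return [lo + 1] * n_hi + [lo] * (n_frames - n_hi)

frames = frc_frames(514)              # a 10-bit level with no exact 8-bit code
print(frames)                         # [129, 129, 128, 128]
print(sum(frames) / len(frames) * 4)  # 514.0 -- the average hits the target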

Otherwise I don't know what colorists are doing when a content provider advertises a movie or series as Dolby Vision HDR. How, when panels capable of displaying it don't even exist to master the content on, even among the professional offerings?

That's the whole point of a system like Dolby Vision: to "translate" what is seen in grading to the final display, as best as possible within the capabilities of that display. Since "no-one" has display devices that can achieve 10,000 nits peak brightness, there is no reason to make use of that capability at this time.
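
To make the "translate" step concrete: the PQ curve below is from the published SMPTE ST 2084 spec, but the roll-off is only a toy stand-in, since the actual Dolby Vision mapping is proprietary.

```python
def pq_eotf_nits(e):
    """SMPTE ST 2084 (PQ) EOTF: non-linear signal e in [0, 1] -> luminance
    in nits. This is the transfer function HDR10 and Dolby Vision build on."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    p = e ** (1 / m2)
    return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

def tone_map(nits, display_peak=800.0):
    """Toy highlight roll-off standing in for the proprietary mapping a
    Dolby Vision display performs: linear below half the panel's peak,
    compressing everything above it asymptotically toward the peak."""
    knee = display_peak / 2
    return nits if nits <= knee else knee + (display_peak - knee) * (1 - knee / nits)

# A highlight mastered at the 10,000-nit ceiling still lands on screen,
# just compressed to fit an 800-nit panel:
print(round(pq_eotf_nits(1.0)))             # 10000
print(round(tone_map(pq_eotf_nits(1.0))))   # 784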

Likewise with color gamut: how long are we going to keep getting BT.2020 (a 2012 specification) displays?

I'm not up to speed on the latest display devices, but how many can achieve the full Rec. 2020 gamut?

A quick Google search turns up this:

"The Hisense L9 is its latest 4K Laser TV, which uses an X-Fusion TriChoma image engine composed of red, green and blue lasers to deliver a claimed colour gamut that can hit 100 per cent of Rec.2020. That's basically all the colours the human eye can see, and it would be one of the first displays to reach this technological milestone."

Source: A Detailed Look at the Best TVs at CES 2020 (Wired.)
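
For a sense of scale, the gamut triangles can be compared directly from the published primaries. Area in CIE xy is a crude metric (xy isn't perceptually uniform), but it shows why full Rec.2020 coverage is such a claim:

```python
def xy_area(primaries):
    """Area of a gamut triangle in the CIE 1931 xy plane (shoelace formula)."""
    (xr, yr), (xg, yg), (xb, yb) = primaries
    return abs(xr * (yg - yb) + xg * (yb - yr) + xb * (yr - yg)) / 2

REC_709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
DCI_P3   = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]
REC_2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

for name, prim in (("Rec.709", REC_709), ("DCI-P3", DCI_P3)):
    print(f"{name}: {xy_area(prim) / xy_area(REC_2020):.0%} of Rec.2020 by xy area")
# Rec.709: 53%, DCI-P3: 72% -- full Rec.2020 coverage is a big step beyond
# what most current panels deliver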

So forgive me if my excitement for televisions seems to rise a few blips per decade.

I get what you're saying, but let's not forget that not much more than 20 years ago, i.e. before DVD, only some lucky people had LaserDisc players. 20 years before that, most didn't have a VHS VCR. And 10 years before that, colour TV broadcasts didn't even exist in most countries...

And now we have incredible digital cameras for motion pictures that are also used for TV production, combined with incredible CG work (CGI sucks... when you notice it... which, most of the time, you don't!). Old films are being rescanned from the original negatives. This is a MUCH better situation than the old telecine transfers, with generational loss before the transfer had even started, bob and weave everywhere, and limited colour grading and restoration options. And such high-quality source material on a new OLED TV looks FANTASTIC.
 

MattHooper

Master Contributor
Forum Donor
Joined
Jan 27, 2019
Messages
7,329
Likes
12,285
I think you need to do some reading (six-part article series). Just the mention of HDR around me incites this sort of response.

You also mention 8K, but that isn't progress; resolution can be increased with relative ease. All you need is more video controllers/processors to drive it, and that's where the majority of the cost comes from, along with yield rates on whole panels. 8K is also treading into pointless territory: the back-end support for it simply isn't there. It's not there in terms of content, it's not there in terms of I/O bandwidth standards (though HDMI 2.1 and DP 1.4, or 2.0 whenever it releases, are enough in that respect), and it's not there in terms of signal processing and broadcast standards.

But mostly, as I said at first: who cares about 8K? Home cinema folks. 4K is more than enough (honestly, even that is more than I really need).

You can also see this TV trend extend to monitors: constant promises and delays for what essentially amounts to nothing more than vaporware products.

"Making 8K and HDR affordable" isn't the technological progress I was talking about. That's just manufacturing strides and economies of scale kicking in. Jumping from CRT to LCD is a technological leap, then to OLED, and maybe this decade, if we're lucky, MicroLED. THAT is what I mean by technological progress. Getting televisions that come remotely close to the full Dolby Vision specification, even just in brightness (if that ever sees the light of day), would be nice. Likewise with panel bit depth: most "10-bit" panels are really 8-bit with FRC nonsense, and I'd like to see prototypes of true 12-bit panels some day soon. Otherwise I don't know what colorists are doing when a content provider advertises a movie or series as Dolby Vision HDR. How, when panels capable of displaying it don't even exist to master the content on, even among the professional offerings?

Likewise with color gamut: how long are we going to keep getting BT.2020 (a 2012 specification) displays, and often without full coverage at that? BT.2100 (the 2016 variant that slightly improves on BT.2020) is still nowhere to be found on consumer displays. And going by TV release dates, the trend seems to be that gamut coverage on standard LCDs is outpacing OLEDs, which everyone assumed would reign supreme in this respect.

So in that respect we're not only seeing no progress, we're seeing regression (but that's what happens when the focus goes into idiotic Android TV OS reskins riddled with security and telemetry nightmares, and nonsensical garbage of that sort).

So forgive me if my excitement for televisions seems to rise a few blips per decade.

Yes, I'm familiar with all that (as a long-time denizen of the AVSforum).

I had already given the caveat about the utility of 8K vs 4K.

For me, all the nit-picking you point to is missing the forest for the trees in terms of the overall advances TVs have made. They were black and white forever, then CRT colour tube sets were ubiquitous for almost 35 years or so, then we leapt into flat screens in the 2000s with excellent performance upgrades through that decade. And this decade we have a HUGE variety of TV sizes, far thinner, with much better technology in terms of colour, contrast, and other aspects (e.g. viewing angles) from the latest LED to OLED, and very obvious gains with HDR material.

Sorry, but I would still put your kvetches into nit-picky "hard to please" territory. But of course, whether you are excited about the latest TV technology is subjective and up to you. I was for a long time far more submerged in the type of videophile minutiae you are referencing. These days I have a different perspective, and I find myself amazed by what we have available in TV technology. (Even though I concentrate on projection these days.)
 

Tks

Major Contributor
Joined
Apr 1, 2019
Messages
3,221
Likes
5,497
Yes, I'm familiar with all that (as a long-time denizen of the AVSforum).

I had already given the caveat about the utility of 8K vs 4K.

For me, all the nit-picking you point to is missing the forest for the trees in terms of the overall advances TVs have made. They were black and white forever, then CRT colour tube sets were ubiquitous for almost 35 years or so, then we leapt into flat screens in the 2000s with excellent performance upgrades through that decade. And this decade we have a HUGE variety of TV sizes, far thinner, with much better technology in terms of colour, contrast, and other aspects (e.g. viewing angles) from the latest LED to OLED, and very obvious gains with HDR material.

Sorry, but I would still put your kvetches into nit-picky "hard to please" territory. But of course, whether you are excited about the latest TV technology is subjective and up to you. I was for a long time far more submerged in the type of videophile minutiae you are referencing. These days I have a different perspective, and I find myself amazed by what we have available in TV technology. (Even though I concentrate on projection these days.)

Well you're free to put me in "insane rambler" territory in the same way you're free to put me in any territory. I've made my case as well as I could, given the investment I'm willing to dedicate to demonstrating why I hold the view I do.

You, on the other hand, are simply stating "but I would still put you in the hard to please territory" without shedding much light on why my points don't hold enough water to sway your opinion, nor on the threshold that would. It's fine if you have no established threshold, of course, but any approximation of what would lead you to hold my view would be a demonstration of good faith in the discussion.

Also, speaking of CRTs, I am actually sad they ever left. Losing the sort of picture characteristics they were capable of is a regression (not general consumer TVs, but something like a Trinitron or Sony's BVM/PVM line was just beautiful). One aspect in which EVERY single currently existing display gets absolutely obliterated is latency and pixel persistence (the factors behind the ghosting and such that are present in all displays). People's memory has faded with respect to this very important aspect of displays (the motion capabilities). But this hasn't much to do with the specifics of the topic I was discussing; I just wanted to mention it as an aside.
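
To put an illustrative number on the persistence point (assumed figures, not measurements): for eye-tracked motion, the blur width is simply the panning speed times the time each frame is held lit, which is why a short-flash CRT beats every sample-and-hold panel here.

```python
def eye_tracked_blur_px(speed_px_per_s, hold_time_ms):
    """Blur width perceived when the eye tracks a moving object: the eye
    sweeps on while each frame is held static on screen, smearing it
    across the retina by (speed x hold time)."""
    return speed_px_per_s * hold_time_ms / 1000

pan = 960  # px/s: a pan crossing a 1920-px-wide screen in two seconds
print(eye_tracked_blur_px(pan, 16.7))  # ~16 px on a 60 Hz sample-and-hold panel
print(eye_tracked_blur_px(pan, 1.0))   # ~1 px with a CRT-like ~1 ms phosphor flash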

Thanks for the link. (Haven't read it yet but will do.)



Depends on viewing distance. At the minimum screen-width-to-viewing-distance ratio specified by IMAX, 8K is required, according to this article.

Sure, but as I said, I am not interested in theaters; we're talking about the consumer sphere.


Also, consumer "4K" is not "true" 4K because of the chroma sub-sampling. (Whether or not it matters...)

Not really, to me; as long as it's capable of 4:4:4, I'd be fine living with that. It also speaks volumes that this is the case (the clusterfuck of all these currently existing bullshit standards that don't even definitionally qualify as standards... how is something a standard if it has optional portions? There is no sensible reason to let one product get the label while a superior one is granted only the same basic label as the worse television. The only reason this even occurs is the monetary gain that comes from confusing consumers who don't do their research on labels and specs).

Otherwise at this time I put "8K" into the same box as "1000W PMPO." It's a simple figure for the uninformed consumer to think "bigger must be better." Heck, for years there have been people watching SD versions of channels when HD versions are available.

You and I are on the same page here. Glad we understand this mess and how it demonstrates companies just making up nonsense on paper that fools consumers, in an attempt to make up for the stagnation that has left all their products very close to one another's offerings. And new consortiums keep popping up so companies can make their own standards, hoping that another sticker on the TV will drive someone to pick their product over a competitor's that looks similar on paper.

It has to be supported all the way through the chain--from the lens (where applicable) to the sensor, or CG rendered at high resolution (more render time plus more work).

Agreed here as well, just as with your prior statements.

Why is FRC "nonsense"? Surely it would always make sense to use FRC, no matter the bit-depth of the display, in the same way that dither is used in audio?

It's a demonstration of the over-reliance that signals they're not progressing technologically; the fact that they make use of it so much simply shows this. It's not the end of the world in terms of perceptibility, it just shows they're not willing to take the high road (which they will eventually anyway; they're just putting it off, and that has been my whole qualm in this topic).

That's the whole point of a system like Dolby Vision: to "translate" what is seen in grading to the final display, as best as possible within the capabilities of that display. Since "no-one" has display devices that can achieve 10,000 nits peak brightness, there is no reason to make use of that capability at this time.



I'm not up to speed on the latest display devices, but how many can achieve the full Rec. 2020 gamut?

A quick Google search turns up this:

"The Hisense L9 is its latest 4K Laser TV, which uses an X-Fusion TriChoma image engine composed of red, green and blue lasers to deliver a claimed colour gamut that can hit 100 per cent of Rec.2020. That's basically all the colours the human eye can see, and it would be one of the first displays to reach this technological milestone."

Source: A Detailed Look at the Best TVs at CES 2020 (Wired.)

Finally, a 2012 spec in the year 2020 (no pun intended with BT.2020). Perfectly demonstrates my talking points.

I get what you're saying, but let's not forget that not much more than 20 years ago, i.e. before DVD, only some lucky people had LaserDisc players. 20 years before that, most didn't have a VHS VCR. And 10 years before that, colour TV broadcasts didn't even exist in most countries...

This is tangential to my points. All I care about, with respect to the topic of contention, is display tech. And I know things don't move in isolation from one another, but nor do all aspects move in tandem (resolution proliferation doesn't seem to be a problem, while improvements in refresh rate, contrast ratio, or bit depth seem to be avoided at all costs). I am only speaking about the last decade; please don't mistake my dissatisfaction as covering the whole industry since its inception (as some might have).

And now we have incredible digital cameras for motion pictures that are also used for TV production, combined with incredible CG work (CGI sucks... when you notice it... which, most of the time, you don't!). Old films are being rescanned from the original negatives. This is a MUCH better situation than the old telecine transfers, with generational loss before the transfer had even started, bob and weave everywhere, and limited colour grading and restoration options. And such high-quality source material on a new OLED TV looks FANTASTIC.


Absolutely fine by me; nothing here treads upon my talking points per se. OLED is old tech, and it hasn't been improving for the past half decade either; its main issues, luminance and above all burn-in, are a clear example of stagnation in even a revolutionary tech, one that arrived right around the decade threshold I was referring to, when all this stagnation began. I'm not saying there's no new tech ever; the whole issue is the pace. Simply put, the pace of the past decade is nothing compared to the decade, or decades, before it. This is strictly talking about display tech.

If you want to see more of this, check out desktop monitors. Recycled nonsense for years on end, the same old AU Optronics panels, with a few others simply copying the basic specs of one another. Zero competition aside from very recently (and you pay out the ass for it). Worse yet, even the slight generational improvements (whenever they occur) are accompanied by lunatic price hikes. Look at even more established industries like smartphones. The new Samsung Galaxy S20's STARTING price is $999, the S20 Plus is $1,199, and the S20 Ultra is $1,399. These are all successors to the previous generation, and yet the incremental improvements (aside from the new SKU of the Ultra) demand more and more after a generation or two.

Just insane. But at least with smartphones you can see they're busting their asses competing. With displays, everyone seems to have a gentlemen's agreement. Heck, even with OLED there's only one choice (LG), unless you want to pay more for almost nothing on a Sony. This is just awful; this is the sort of environment that breeds complacency, and it's the sort of horseshit that led to Intel STILL putting out 14nm CPUs in 2020. Absolutely comedic when you take a step back and look.

So everyone can call me "nitpicky," but I wish anyone luck in demonstrating to me that we're not stagnating more now than we were in the past (say, 10+ years ago).
 

MattHooper

Master Contributor
Forum Donor
Joined
Jan 27, 2019
Messages
7,329
Likes
12,285
Well you're free to put me in "insane rambler" territory in the same way you're free to put me in any territory.

Now, now...let's not exaggerate. Being picky or "never pleased" (which is of course hyperbole) is not "insane rambler" territory. ;-)
If you aren't happy, again, that's your prerogative and I'm not in a position to tell you otherwise. Further, it's the pickiest among us who tend to drive progress. I was there. I did it for years and years, sweating the minutiae. My "reviews" of flat panels/plasmas etc in the early days were almost legendary on AVS. Ultimately I grew fatigued from it. So, for instance, I used to sweat every little videophile difference when buying TVs and my projectors. I was very home-theater/AV focused for a decade or more. But I pivoted back into "sweating the details" in my 2-channel system (swapping one obsession for another; I can't juggle both). For me that gave some personal perspective. When I turn on my projector I'm not looking at how everything could be even more improved ("is that *really* hitting the color standards bang on??"). Rather, I'm enjoying it in a relaxed way for the absolutely UNBELIEVABLE visual experience it gives me. I feel a bit more like my non-videophile guests, who don't mine the picture for every possible failing but just react "Holy Shit! This is amazing!"

But, again, that's my personal spot on my own journey.



You, on the other hand, are simply stating "but I would still put you in the hard to please territory" without shedding much light on why my points don't hold enough water to sway your opinion, nor on the threshold that would. It's fine if you have no established threshold, of course, but any approximation of what would lead you to hold my view would be a demonstration of good faith in the discussion.

I already mentioned the kinds of advancements that I see as warranting my belief that TV has made plenty of amazing progress, including relatively recently. If that's not enough for you, so be it.

Also, speaking of CRTs, I am actually sad they ever left. Losing the sort of picture characteristics they were capable of is a regression (not general consumer TVs, but something like a Trinitron or Sony's BVM/PVM line was just beautiful). One aspect in which EVERY single currently existing display gets absolutely obliterated is latency and pixel persistence (the factors behind the ghosting and such that are present in all displays). People's memory has faded with respect to this very important aspect of displays (the motion capabilities). But this hasn't much to do with the specifics of the topic I was discussing; I just wanted to mention it as an aside.

There was a fair amount of that sentiment on the AVSforum for a long time. Me, I never missed CRT once I moved on to plasma. I'd had a really nice Panasonic Tau TV that gave a stellar picture, and I'm familiar with all the points made by those who bemoan the loss of CRT. But when I replaced my tube TV with a 42" plasma, it was like I'd died and gone to heaven. It looked so much more clear, dimensional, beautiful and cinematic.
The characteristics I gained with the Panny plasma, for me, far outweighed whatever I lost with the smaller CRT tube set.

Also, while CRT RPTVs had nice contrast, I couldn't abide the hot-spotting and narrowed viewing angles of all RPTVs, CRTs included. Same for the viewing angles of LED during much of its lifespan. (That's why even today OLED is more attractive to me than the LED variants, along with OLED's contrast.)

But such criteria are subjective, like choosing our audio equipment. Some things bugged me that didn't bother others, and vice versa. But as I said, I don't sweat the videophile stuff nearly so much anymore. I do get my projector professionally calibrated, but then I sit back, enjoy, and don't nit-pick.
 

Tks

Major Contributor
Joined
Apr 1, 2019
Messages
3,221
Likes
5,497
Now, now...let's not exaggerate. Being picky or "never pleased" (which is of course hyperbole) is not "insane rambler" territory. ;-)
My "reviews" of flat panels/plasmas etc in the early days were almost legendary on AVS. Ultimately I grew fatigued from it.
If you aren't happy, again, that's your prerogative and I'm not in a position to tell you otherwise.

I'm not a videophile at all; this is a trend observable lately in nearly all corners of consumer electronics. It's just worse, for example, in the desktop monitor market with respect to this topic.


I already mentioned the kinds of advancements that I see as warranting my belief that TV has made plenty of amazing progress, including relatively recently. If that's not enough for you, so be it.

If you don't mind, what exactly would those be, within the last half decade to a decade at most? Maybe I have a reading comprehension issue. If you're going to stick to 8K, or mention OLED (which doesn't specifically qualify per se; and even if we include OLED here, its progress hasn't moved much where people hoped it would most), then that's fine, but that isn't progress to me, nor is it anywhere near close to coming to consumer markets (8K, for example...).


There was a fair amount of that sentiment on the AVSforum for a long time. Me, I never missed CRT once I moved on to plasma. I'd had a really nice Panasonic Tau TV that gave a stellar picture, and I'm familiar with all the points made by those who bemoan the loss of CRT. But when I replaced my tube TV with a 42" plasma, it was like I'd died and gone to heaven. It looked so much more clear, dimensional, beautiful and cinematic.
The characteristics I gained with the Panny plasma, for me, far outweighed whatever I lost with the smaller CRT tube set.

Also, while CRT RPTVs had nice contrast, I couldn't abide the hot-spotting and narrowed viewing angles of all RPTVs, CRTs included. Same for the viewing angles of LED during much of its lifespan. (That's why even today OLED is more attractive to me than the LED variants, along with OLED's contrast.)

But such criteria are subjective, like choosing our audio equipment. Some things bugged me that didn't bother others, and vice versa. But as I said, I don't sweat the videophile stuff nearly so much anymore. I do get my projector professionally calibrated, but then I sit back, enjoy, and don't nit-pick.


I still have my Panasonic 42-inch plasma as my main TV. I just don't watch much TV, though, so it's not really much of a concern to me. I'm mostly on my desktop these days, and only on the TV when I'm entertaining guests during a get-together (and the TV's on in the background). Picture quality isn't the end-all be-all for me, as you can tell, since I still have that old thing (heck, the only reason I got it in the first place is that I wanted a bigger television back then, not so much that I was thrilled with its performance, and I'm certainly not thrilled with its power consumption now, for example). With respect to televisions I feel like the equivalent of an audiophile tube aficionado, except that, unlike the signature of tube distortion, the signature of the CRT image cannot be emulated beyond basic shaders. The glow, the response time, the refresh, the lack of ghosting (especially when playing older games, and to be honest it looks interesting even with newer ones) impart a look that simply cannot be had today. Granted, I am talking about broadcast CRT reference displays from Sony back then (I am sure I would appreciate a $40,000 Sony BVM today as well, no doubt). So my comparison of modern consumer displays with the reference CRTs of the past is skewed, I will admit.

I wouldn't think CRTs are practical in light of recent trends, of course. The size, the hum, the maintenance, the shipping, the disposal, the power usage, the pathetic screen sizes: those are all massive downsides. Current OLED tech is fine enough for me, in the same way plasma was (though the jump to plasma was FAR more drastic). I am not picky about my TV choices; heck, I wouldn't mind even an LCD. I just wish we could have seen what would have happened if CRT had stuck around, and what sort of advances people could have imagined there.

All of this is beside my main point, though, of slow progress within the industry. I am sorry, but again, the prior decade has not seen as much progress as the decade before it, in my view. Perhaps if you want to list a few things, I am open to being enlightened and changing my mind on the matter. But as things stand, I simply do not see much movement in the consumer market.
 

GGroch

Major Contributor
Forum Donor
Joined
Apr 7, 2018
Messages
1,059
Likes
2,053
Location
Denver, Colorado
........This is beside my main point, though, of slow progress within the industry. I am sorry, but again, the prior decade has not seen as much progress as the decade before it, in my view......But as things stand, I simply do not see much movement in the consumer market.

I agree.....the state of the art in TV display technology has not advanced much recently...but why is that? There is strong evidence that the primary reason is not decisions made by manufacturers or a technological bottleneck, but rather decisions made by consumers. In 2019 the top five selling brands in North America (by number of screens shipped) were:

1. TCL (Surpassing Samsung for the 1st time ever)
2. Samsung
3. Vizio
4. LG
5. Funai

Three of the five are clearly budget brands...not leading display technology brands. The above link provides more details: TCL sales in 2019 were up 112% year over year (moving from 16% to 26% of first-quarter sales, while Samsung dropped from 28% to 22%).

You could argue that Samsung and LG market share drops relative to TCL and Funai were caused by a lack of new premium technological advances, but this does not match the industry's experience. In 2019 the two most differentiated premium ($$) display technology retail offerings were QLED and OLED, which made up only 1.8% and 1.6% of sales respectively.

Display technology may be stagnant, but consumers are clearly making dynamic choices in what to buy. 4K resolution and sophisticated smart functions saw huge growth in the last 3 years...but they only became popular when they became virtually free. The market for super-premium display technologies in televisions is clearly smaller than the market for super-premium smartphones...where each new Apple and Samsung flagship release immediately impacts those companies' stock prices.

This does not mean that TV sales are dying. It just means that as far as consumers are concerned, flat panel TV is now a mature technology in which performance at the TCL/Vizio/Funai level is plenty good enough and sales increases happen in the low end of the market.
 

Tks

Major Contributor
Joined
Apr 1, 2019
Messages
3,221
Likes
5,497
Nice post; anything I don't address, I agree with.

There is strong evidence that the primary reason is not decisions made by manufacturers or a technological bottleneck, but rather decisions made by consumers.

This implies that decisions made by consumers are an isolated factor, one that technological bottlenecks, and more importantly manufacturer decisions, have no effect on. As user @Berwhale somewhat relates in another topic, these things aren't independent of one another. And I am inclined to agree with him on some particulars in terms of perspective, and I feel that applies here. As for the root of the issue, deliberations can be had. From my view, simply put (without turning this into a massive discussion), general consumers themselves are simply more stupid. It seems counterintuitive, given that the internet has made information more available, but that information is an overload, so critical thought about which specs should be considered never comes to fruition. Instead, specs are served hot and ready by the advertising arms of the manufacturers, all the while supported by consortiums that now have "standards" with optional portions (basically a contradiction in terms, for all intents and purposes).

You could argue that Samsung and LG market share drops relative to TCL and Funai were caused by a lack of new premium technological advances, but this does not match the industry's experience

I'd personally refrain from doing so; to be perfectly honest, I am not an analyst in this industry, equipped to read trends and ascribe causality. Especially not over such a short time frame.

The market for super-premium display technologies in televisions is clearly smaller than the market for super-premium smartphones...where each new Apple and Samsung flagship release immediately impacts those companies' stock prices.

100% agree, and I think this is the biggest contributing factor. It's also why, when someone says to me "look at 8K, see the innovation?", I don't understand what they think they're demonstrating. Certainly not the technological advancement of yesteryear, if we're going by relative comparisons. That's just an iteration on arguably the easiest aspect of something like LCD display tech (a resolution increase). Of course it's not as simple as I allude, but it certainly isn't as impressive as doing it with MicroLED, or even older tech like plasma or CRTs.

But yes, I don't think people care enough about televisions or displays anymore (nor are they bothered to find out why they might want to pay some heed if they're in the market to buy one, which speaks to my opening commentary about general consumer trends these days). Pair that with the efficiencies and information-systems analysis that companies now do to properly gauge what they can or can't get away with (as opposed to the more relaxed, more customer-oriented marketplace for electronics 10, 20, 30, 40, 50 years ago). Back then, companies seemingly had no choice but to appeal to consumer demand (like people sending letters about what they hoped a company would fix or improve). Now you have companies with feedback mechanisms where their customers' direct sentiments (as might be seen in feedback on social media about products) are virtually useless, and spending-habit/psychology models are employed to get a firmer grasp of what consumers will actually pay for, rather than what they claim to want.

This does not mean that TV sales are dying. It just means that as far as consumers are concerned, flat panel TV is now a mature technology in which performance at the TCL/Vizio/Funai level is plenty good enough and sales increases happen in the low end of the market.

I feel like this in audio ;D though I actually go for what I say I want, and pay for it, rather than saying I want something, having the company provide it, seeing the price tag, and then avoiding buying it. Which definitely seems to be the case with consumer TVs and monitors.

You should have seen Reddit when "next-gen" monitor prices were announced by Asus and Acer, like the X27 (they're gaming-oriented monitors, but gaming requirements are highly demanding on actual hardware that costs billions in R&D, like GPUs). Keep in mind, these monitors aren't nonsense: they're usually close to their television counterparts of the time, with superior specifications elsewhere, like refresh rate and latency. All the while catering to a far smaller market, and being supplied by a single OEM that had both manufacturers seemingly waiting for years until the prototypes were iterated into market viability. When the price was eventually revealed, the community was upset and in shock. Then came the comments: "LOL $2,000 for an LCD?? Rather buy an OLED in that case". The folks making comments like that are totally oblivious to the real-world realities involved.

So I'm with you on TVs not dying or whatnot, perhaps. But it's a far cry from the trends that were commonplace in years past. This must be what old people feel like when they talk about lunacy or stupidity taking over. I'm not there yet, and I'm not that concerned if people are happy.
 

xr100

Addicted to Fun and Learning
Joined
Jan 6, 2020
Messages
518
Likes
237
Location
London, UK
[On sitting close enough to the screen for 8K to be within visual acuity] Sure, but as I said, I am not interested in theaters; we're talking about the consumer sphere.

What is preventing consumers from laying out their home theatres around IMAX "immersiveness" standards? :)

[On FRC] it just shows they're not willing to take the high road (which they will eventually anyway; they're just putting it off, and that has been my whole qualm in this topic).

Will take the high road eventually...?

Not really, to me; as long as it's capable of 4:4:4, I'd be fine living with that.

The content ain't 4:4:4...

Finally, a 2012 spec in the year 2020 (no pun intended with BT.2020). Perfectly demonstrates my talking points.

Specs can be written at any time...

For example, in Europe, there was an effort to move from PAL to MAC (Multiplexed Analogue Components). It worked by "squashing" the luma and chroma in time, which had to be "unsquashed" by the receiver. There was also a section for data (audio, etc.).

[Image from Wikipedia: one MAC line--data, luma, chroma.]

Version "D" dated from 1982 (at least checking on the unreliable source that is Wikipedia, but that sounds about right), was not introduced in the UK until 1990. Then flopped because, in a nutshell, the competing satellite system that used PAL won. It made for very expensive receivers, because everything had to be A/D converted, buffered, and D/A converted in "unsquashed" form.

Why bother mentioning something that can only be considered extremely arcane, especially outside of Europe (and a few other places)? Well, the obvious feature: by using component video, it was bye-bye to composite artifacts such as moiré patterns.

MOREOVER, the UK receivers at least supported WIDESCREEN (16:9) out of the box, able to work with "pan-and-scan" data that would be transmitted, and a high-definition version (HD-MAC) would have been introduced: "1250 lines" (2x625 lines). Some HD-MAC transmissions did occur in certain countries in the early 1990s.

Also, the data section, I think, supported up to 4 channels of digital audio (14-bit (well, 10-bit companded)/32 kHz), so it could have been used for discrete surround.

Over in Japan, high definition broadcasts existed in the form of Hi-Vision.

So much for 1980s standards that had some use in the early 1990s. How long did it take for high definition to properly "arrive"...?

OLED is old tech, and it hasn't been improving for the past half decade either; its main issues, luminance and above all burn-in, are a clear example of stagnation in even a revolutionary tech, one that arrived right around the decade threshold I was referring to, when all this stagnation began. I'm not saying there's no new tech ever; the whole issue is the pace. Simply put, the pace of the past decade is nothing compared to the decade, or decades, before it. This is strictly talking about display tech.

The problem is that once LCD became "good enough," plasma really suffered. I knew people who replaced their plasma displays because the screen was too reflective in a strongly illuminated environment (e.g. they were fed up with seeing their own face reflected in the TV!). Once 4K was on the horizon, the decision was made to pull the plug on plasma.

BTW, plasma as a display technology dates from the 1970s. ;-)

OLED has indeed been around for a while, but there's a difference between it being developed to a certain quality point and mass production of large, TV-sized OLED panels. For TVs, OLED has only just reached the volume and pricing needed for mass consumer adoption.

Worse yet, even the slight generational improvements (whenever they occur) are accompanied by lunatic price hikes. Look at even more established industries like smartphones.

I find the whole smartphone situation and its pricing (given they are basically disposable, planned obsolescence included) absolutely ridiculous, and (perhaps not entirely reasonably, but they're certainly part of the picture) I blame Apple for this "technology as jewellery" closed-box nonsense.

The complete lack of modularity is extremely irritating--why do I need to buy a new smartphone when the only thing I want to upgrade (Samsung Galaxy S8+) is the camera? After all, certain PDAs used to have expansion ports. The rest works just fine--browser, messaging, phone calls, and some apps. Why can't I replace the battery without gross inconvenience (and "luck" in not destroying the phone) or having to take it to a repair shop? Ludicrous.

You and I are on the same page here. Glad we understand this mess and how it demonstrates companies just making up nonsense on paper that fools consumers, in an attempt to make up for the stagnation that has left all their products very close to one another's offerings. And new consortiums keep popping up so companies can make their own standards, hoping that another sticker on the TV will drive someone to pick their product over a competitor's that looks similar on paper.

If the world (i.e. humans) worked in a rational way then things would be different. But it doesn't. And never did. So here we are, at least in a better era than the "Dark Ages." :)

How many CRTs could have been better if the HT line had been properly implemented instead of penny-pinched?
 

Palladium

Addicted to Fun and Learning
Joined
Aug 4, 2017
Messages
666
Likes
815
To put things into perspective, LG in 2019 had a complete monopoly on TV OLED panel manufacturing, and yet they still took a huge loss of $4.1 billion, and OLED TV sales still occupy just a tiny 1% of the market in terms of volume.

Also, I think people are getting very wary of being early adopters of anything new these days, based on past trends. For example, VR was hyped to the moon and back in 2018, yet it still remains a niche at best today.
 

raindance

Major Contributor
Joined
Sep 25, 2019
Messages
1,042
Likes
971
I'm in the pro-AV business, and we do a fair number of real LED video wall displays for command and control centers, amongst other things like upgrading TV studios to digital.

I'm constantly shocked by three things:

1. Hotels advertising high definition TV that only have an SD distribution system to the rooms - and most customers don't notice the hideously stretched and crappy quality picture...
2. Visiting people's homes where their nice 4K TV is still in demo mode with everything far too saturated and far too bright... and then they watch in the DARK and boast about how good their TV is because of its specs!!
3. People actually watching live TV, complete with brain-dead advertising and over-dramatized news :) I haven't had live TV in 7 years and don't miss it.
 

BostonJack

Active Member
Editor
Joined
Jul 2, 2019
Messages
288
Likes
350
Location
Boston area, Cambridge, MA
I don't keep a TV in the house, on general "it rots your brains" ethical grounds. I watch video on my MacBook's 15" Retina screen; plenty good quality for me.

I guess I'm an anti-TV elitist. I read my news, pretty voraciously.

When the planes hit the Twin Towers, I was working at a Cisco Systems location in Massachusetts with about 140 people on site. The internet was flailing, with most news sites only intermittently available. One or two Israeli sites were up for some reason (probably low traffic) and were our primary source of news. A small crew found that we had no cable TV into the building and no antennas for the two TVs used for video presentations; they improvised antennas for both and got scratchy broadcast TV out of Boston. Pretty odd to realize that we were TV-less.
 

Palladium

Addicted to Fun and Learning
Joined
Aug 4, 2017
Messages
666
Likes
815
1. Hotels advertising high definition TV that only have an SD distribution system to the rooms - and most customers don't notice the hideously stretched and crappy quality picture...

No surprises there, because most people simply don't care much about image quality.
 
OP
Wombat

Wombat

Master Contributor
Joined
Nov 5, 2017
Messages
6,722
Likes
6,464
Location
Australia
No surprises there, because most people simply don't care much about image quality.

A relative of mine purchased a widescreen TV when they first appeared. He couldn't be told that 4:3 programs were not meant to be watched stretched to widescreen. He didn't seem to see that the actors became short and chubby.
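
For the record, the distortion is easy to quantify:

```python
# 4:3 content stretched to fill a 16:9 screen is widened by
# (16/9) / (4/3) = 4/3, so everyone ends up about 33% wider than intended.
stretch = (16 / 9) / (4 / 3)
print(f"{stretch:.2f}x wider ({stretch - 1:.0%})")  # 1.33x wider (33%)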
 

Blumlein 88

Grand Contributor
Forum Donor
Joined
Feb 23, 2016
Messages
20,761
Likes
37,616
Where do you put such a large projector?
On a side table repurposed for that use. The screen is 120 inch, and the projector is separate. I could change the screen to 140 inch and move the projector box back around 3 feet and have a 140 inch projector set up.
 

thefsb

Addicted to Fun and Learning
Joined
Nov 2, 2019
Messages
796
Likes
657
On a side table repurposed for that use. The screen is 120 inch, and the projector is separate. I could change the screen to 140 inch and move the projector box back around 3 feet and have a 140 inch projector set up.
I was trying to imagine what kind of super-high-end projector would be 10 feet high: does it need an upgraded power service, does it have special ductwork to get the heat out, how much noise does it make?
 

Sal1950

Grand Contributor
The Chicago Crusher
Forum Donor
Joined
Mar 1, 2016
Messages
14,197
Likes
16,923
Location
Central Fl
Today's TV's are absolutely amazing! I love my Sony 75XBR940D and am looking forward to soon replacing it with an 85.
Do you remember fiddling with the rabbit ears to get a decent picture, or constantly tuning the Vertical Hold to stop the picture from rolling? Or how about seeing the colors of faces change from purple to green to red, etc with every show or channel change, or the grainy, fuzzy picture from you VHS player running at the 8 hour speed. LOL I could go on but I think you all get it by now.. A huge wonderful improvement in the picture quality for the average consumer while the price per screen inch has fallen dramatically over the last few decades. It wasn't that long ago, sometime around 2000 that I paid around $3K for a 50" rear projection set that weighted 250 lbs, got a washed out, crappy picture, had the bottom right corner burned in from channel ID's, ugh.

When I'm home, either the TV is on or the HiFi is playing music. Yeah, I admit to watching a lot of TV, something I think is true of many more around here than will admit it. Of the posters here claiming to "never watch TV, or only a few hours a month," I believe at least 50% are trying to feed us a fairytale, believing that admitting to TV viewing will cast them in a poor light with respect to their taste, education, or intelligence. There's a lot of very poor programming tailored to the lowest possible level, but there is also a ton of excellent programming available, tailored to just about any interest or taste.
Quit trying to blow smoke up my butt, all you "I never watch TV" folk; I don't believe you.

I think what's really hurting the market today is the constant division of available programming among more and more sources. New streaming providers are popping up every day, trying to get yet another $10-20 a month to give you what you used to get from providers you already had. How much of this crap can Joe Consumer afford? I also think the quality of programming from many is hurting due to the division of what is really a fairly static pool of advertiser dollars. You can only slice that pie into so many pieces.
 