
Did Nvidia just kill the game?

Neutron

Active Member
Joined
Jun 3, 2020
Messages
131
Likes
65
Just wait a bit. They are having many issues with these cards lately, including ongoing random crashes.

I don't like high-power cards, and NV won't get away without consequences. The MSI 3080 Trio X was tested pulling over 400W with PCAT. Igor's Lab found that the VRAM consumes a lot of power, and on the FE it can reach 100°C. If you have to pick a card, it's better to go with one that has a large, effective cooler rather than a cheaper variant like the Ventus.
 
Last edited:

sweetchaos

Major Contributor
The Curator
Joined
Nov 29, 2019
Messages
3,872
Likes
11,554
Location
BC, Canada
Here's what I've been waiting for...finally posted on Techspot!

Quick analysis:
1440p gamers? Get the 3080 and be happy, since you'd only gain +8% in average framerate by going 3090, but you'd pay +114% more for it. LoL
4K gamers? Get the 3080 and be happy, since you'd only gain +11% in average framerate by going 3090, but you'd pay +114% more for it. LoL
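
For anyone who wants to redo that value math, here's a minimal sketch. It assumes launch MSRPs of $699 and $1,499; the chart above may use slightly different prices.

```python
# Frames-per-dollar sanity check for the 3080 vs 3090 numbers quoted above.
# Assumption: launch MSRPs of $699 (3080) and $1499 (3090).
msrp = {"RTX 3080": 699, "RTX 3090": 1499}
perf_gain = {"1440p": 0.08, "4K": 0.11}  # avg-framerate gains from the Techspot charts

price_gain = msrp["RTX 3090"] / msrp["RTX 3080"] - 1
print(f"Price premium: +{price_gain:.0%}")  # ~ +114%

for res, gain in perf_gain.items():
    value_ratio = (1 + gain) / (1 + price_gain)
    print(f"{res}: +{gain:.0%} performance for +{price_gain:.0%} price "
          f"-> ~{value_ratio:.2f}x the frames per dollar of a 3080")
```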
 

Neutron

Active Member
Joined
Jun 3, 2020
Messages
131
Likes
65
Here's what I've been waiting for...finally posted on Techspot!

You might want to keep waiting until custom-card reviews come out. NV has been limiting power draw on the FE cards. A custom 3080 can draw nearly as much power as a 3090 FE.

PS. I have been checking reviews. It seems the 3090 only draws about 30W more than the 3080 variants. But the extra CUDA cores (about 17% of the 3090's total) and 2.4x the VRAM (at a current estimate of ~2.5W per GB, the extra 14GB alone would be ~35W) should consume far more than that. It seems NV is putting a tight power limit on the 3090, or the 3090 is not being utilized effectively.
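
A rough version of that back-of-envelope math, assuming the ~2.5W/GB VRAM figure above, the public spec-sheet numbers (8704 vs 10496 CUDA cores, 10GB vs 24GB, 320W vs 350W TGP), and core power scaling roughly linearly with core count at the same clocks:

```python
# Back-of-envelope estimate of how much extra power the 3090 "should" draw
# over the 3080 -- not a measurement, just the reasoning above in numbers.
VRAM_W_PER_GB = 2.5  # rough GDDR6X estimate cited above

cards = {
    "RTX 3080 FE": {"cores": 8704,  "vram_gb": 10, "tgp_w": 320},
    "RTX 3090 FE": {"cores": 10496, "vram_gb": 24, "tgp_w": 350},
}
c80, c90 = cards["RTX 3080 FE"], cards["RTX 3090 FE"]

extra_vram_w = (c90["vram_gb"] - c80["vram_gb"]) * VRAM_W_PER_GB    # 14 GB * 2.5 W = 35 W
core_budget_3080 = c80["tgp_w"] - c80["vram_gb"] * VRAM_W_PER_GB    # crude core/board budget
extra_core_w = core_budget_3080 * (c90["cores"] / c80["cores"] - 1)  # ~21% more cores

print(f"Extra VRAM power:   ~{extra_vram_w:.0f} W")
print(f"Extra core power:   ~{extra_core_w:.0f} W (linear-scaling assumption)")
print(f"Naive expected gap: ~{extra_vram_w + extra_core_w:.0f} W "
      f"vs. an actual TGP gap of {c90['tgp_w'] - c80['tgp_w']} W")
```

The naive estimate lands well above the ~30W gap seen in reviews, which is consistent with the 3090 being clocked or power-limited more conservatively than the 3080.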
 
Last edited:

w1000i

Active Member
Forum Donor
Joined
Aug 31, 2019
Messages
260
Likes
138
Location
Jubail SA
AMD on TSMC 7nm will dominate Nvidia. TSMC 7nm seems to have about a 45% transistors-per-die-area advantage over Samsung 8nm.

This video opened my eyes to how poor Samsung 8nm really is.
 
Last edited:

Astrozombie

Senior Member
Forum Donor
Joined
May 7, 2020
Messages
388
Likes
144
Location
Los Angeles
That chart of course doesn't use second-hand prices; nobody will be paying $700 for a 2080 for much longer.
 

Neutron

Active Member
Joined
Jun 3, 2020
Messages
131
Likes
65
AMD on TSMC 7nm will dominate Nvidia. TSMC 7nm seems to have about a 45% transistors-per-die-area advantage over Samsung 8nm.

This video opened my eyes to how poor Samsung 8nm really is.

Nvidia made several bad choices this generation. However, AMD is not going to take down NV for now. I extrapolated the PS5 to 80 CUs, and it comes out a bit better than a 3070. HEDT buyers (and streamers) are still stuck with the 3080/3090.
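
A minimal sketch of that extrapolation, assuming the public PS5 GPU figures (36 CUs at up to 2.23GHz) and naive linear scaling to 80 CUs at the same clock; how the resulting RDNA2 TFLOPS map onto Ampere game performance is a separate question, since Ampere's FP32 numbers aren't directly comparable.

```python
# Naive CU-count extrapolation from the PS5 GPU to a hypothetical 80-CU RDNA2 part.
# Assumptions: 64 shaders per CU, 2 FLOPs/clock, boost clock fully sustained,
# and performance scaling linearly with CU count.
ps5_cus, ps5_clock_ghz = 36, 2.23

ps5_tflops = ps5_cus * 64 * 2 * ps5_clock_ghz / 1000
big_navi_cus = 80
big_navi_tflops = ps5_tflops * big_navi_cus / ps5_cus

print(f"PS5 GPU:      ~{ps5_tflops:.1f} TFLOPS FP32")
print(f"80-CU RDNA2:  ~{big_navi_tflops:.1f} TFLOPS FP32 (same clock, linear scaling)")
```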
 
Last edited:

w1000i

Active Member
Forum Donor
Joined
Aug 31, 2019
Messages
260
Likes
138
Location
Jubail SA
Nvidia made several bad choices this generation. However, AMD is not going to take down NV for now. I extrapolated the PS5 to 80 CUs, and it comes out a bit better than a 3070. HEDT buyers (and streamers) are still stuck with the 3080/3090.
I think the PS5 is power-limited to maintain consistent thermals and framerate, but a PC card can be more dynamic.
 

Tks

Major Contributor
Joined
Apr 1, 2019
Messages
3,221
Likes
5,494
You might want to keep waiting until custom-card reviews come out. NV has been limiting power draw on the FE cards. A custom 3080 can draw nearly as much power as a 3090 FE.

PS. I have been checking reviews. It seems the 3090 only draws about 30W more than the 3080 variants. But the extra CUDA cores (about 17% of the 3090's total) and 2.4x the VRAM (at a current estimate of ~2.5W per GB, the extra 14GB alone would be ~35W) should consume far more than that. It seems NV is putting a tight power limit on the 3090, or the 3090 is not being utilized effectively.

Case in point: the 3090 Strix, by not penny-pinching on the power filtering, can sustain a far higher frequency with far more stability than any other card currently. Also look at the power limit this allows Asus to chase: a whopping 480W, making all other cards look like a joke.

The other companies aren't even close; it's almost hilarious.

[chart: clocks and thermals]

[chart: TDP adjustment limit]
 

Neutron

Active Member
Joined
Jun 3, 2020
Messages
131
Likes
65
I think the PS5 is power-limited to maintain consistent thermals and framerate, but a PC card can be more dynamic.

The XSX is surely limited. However, the PS5's clock is about what people expected from RDNA2, and its CU count is lower than the XSX's. It could be throttled, but probably not by much.
 

Neutron

Active Member
Joined
Jun 3, 2020
Messages
131
Likes
65
Case in point: the 3090 Strix, by not penny-pinching on the power filtering, can sustain a far higher frequency with far more stability than any other card currently. Also look at the power limit this allows Asus to chase: a whopping 480W, making all other cards look like a joke.

The other companies aren't even close; it's almost hilarious.

[chart: clocks and thermals]

[chart: TDP adjustment limit]

Have you seen reviews of the Strix 3080? I haven't seen one yet.
 

w1000i

Active Member
Forum Donor
Joined
Aug 31, 2019
Messages
260
Likes
138
Location
Jubail SA
The XSX is surely limited. However, the PS5's clock is about what people expected from RDNA2, and its CU count is lower than the XSX's. It could be throttled, but probably not by much.
Case in point: the 3090 Strix, by not penny-pinching on the power filtering, can sustain a far higher frequency with far more stability than any other card currently. Also look at the power limit this allows Asus to chase: a whopping 480W, making all other cards look like a joke.

The other companies aren't even close; it's almost hilarious.

[chart: clocks and thermals]

[chart: TDP adjustment limit]

Going for 480W of power consumption to gain higher performance means Nvidia will lose the laptop market to AMD for sure.
 

Tks

Major Contributor
Joined
Apr 1, 2019
Messages
3,221
Likes
5,494
Lose the laptop market, lol? Why? They have a million SKUs for every 10W of TDP (I exaggerate, but what I'm trying to say is that they have very power-efficient low-end cards every generation).
 

Neutron

Active Member
Joined
Jun 3, 2020
Messages
131
Likes
65
Lose the laptop market, lol? Why? They have a million SKUs for every 10W of TDP (I exaggerate, but what I'm trying to say is that they have very power-efficient low-end cards every generation).

I kind of agree with him on that. NV will struggle to cut down power consumption. Ampere reminds me of the GTX 480, which only lived for half a year.

NV pushed up core counts significantly. The 3080 has about 10 billion more transistors than the 2080 Ti, and the 3070 only about 1 billion fewer. Implementing the 30 series in laptops the way they did with the 20 series (using the desktop chips directly) will be challenging. Very low-power GPUs are being squeezed out by integrated graphics, making them less necessary unless they're paired with 10th/11th-gen Intel CPUs for now (next year Intel's own discrete graphics might eat that away too).

Intel isn't doing a great job cutting down power consumption either. We are back to 4-core CPUs with 11th gen, and PL2 is still up to 64W. I am afraid next-gen platforms will have strict power limits.

PS. As more details on the 3080 crashes emerged, it seems the better-binned cards suffer, since they can boost to 2.05GHz and even 2.1GHz. On those cards, every configuration of MLCCs and POSCAPs has failed (according to Hardware Unboxed), including the FE and TUF OC. Cards with higher power limits suffer the most. Lower-quality GPUs that cannot boost to 2GHz are less affected. This is probably why the Strix and EVGA's FTW are being delayed.
 
Last edited:
  • Like
Reactions: Tks