
Official RTX 4000 reveal, DLSS 3


UltraMega


Quote

Nvidia has officially unveiled its next-gen GeForce RTX 40 series GPUs based on its new Ada Lovelace architecture and headlined by the flagship RTX 4090 GPU. Ada Lovelace is built on TSMC's 4N process and will debut in new RTX 4090 and RTX 4080 cards. The former packs 76 billion transistors, 16,384 CUDA cores and 24GB of GDDR6X memory from Micron.

According to Nvidia, an RTX 4090 with DLSS 3 is up to four times faster than an RTX 3090 Ti running DLSS 2, and...

— www.techspot.com

RT performance getting a major boost as expected. 

 

 


...looks like good tech (withheld from RTX3K - for now at least), but until I see a pile of independent tests by trusted sources, I'll have the 'salt shaker' ready. The YT video below claims both RT on and off on the left, not to mention no DLSS (2?) at 8K on the left...duh

 

 


38 minutes ago, pioneerisloud said:

I think this is the most important bit in the whole article since $1600 MSRP....gross.... 🤮

$1600 for the 4090 isn't that crazy since it's about what the 3090's MSRP was at launch, but the prices on the two 4080 variants are just insulting. $1200 for a 4080 is completely ridiculous. What's a 4060 going to cost, $700? Prices like these make me want to just buy a console.

 

Not that any of these prices matter, because a new gen of GPUs is probably going to start another mining frenzy anyway. Sad times for PC gamers with markets like this. What Nvidia is showing off is impressive, but the cost is completely out of touch with reality for all but high-end enthusiasts with money to burn. If they don't drop these prices significantly after they clear out some 3000-series overstock, I'll buy another AMD card next time I upgrade just out of spite, unless AMD does the same thing with prices. I can live without DLSS for the sake of not feeling like I'm being bent over by Nvidia.


39 minutes ago, J7SC_Orion said:

...looks like good tech (withheld from RTX3K - for now at least), but until I see a pile of independent tests by trusted sources, I'll have the 'salt shaker' ready. The YT video below claims both RT on and off on the left, not to mention no DLSS (2?) at 8K on the left...duh

I doubt we'll see it on the 3000 series since DLSS 3 is actually generating new frames entirely. I suspect that takes more processing than the 3000 series would be able to handle. 


Given the "Jensen 2X" for the 3080 over the 2080 when it released, I likewise take these claims with a grain of salt.

 

The 4090 is actually cheaper than I was expecting. I thought for sure they'd push $1999 on it. 

 

I think the bigger travesty is the pair of 4080s. The 16GB is being priced like the 2080 Ti / 3080 Ti before it, yet is not the same die as the 4090 and features a really mid-range 256-bit bus I'd expect to see on the 104-die cards. In my mind, that is really bad. Even RDNA2 with its Infinity Cache showed deficiencies at 4K using a 256-bit bus. To be fair, 80-class cards having 256-bit buses and 104 dies is not unusual; the 680, 980, 1080, and 2080 are all examples of this. However, what is new is the astronomical price increase. None of those cards were $1200.

 

The biggest slap in the face to consumers is the $900 12GB 4080. With a measly 192-bit bus, I can't possibly see how this won't have problems for higher-resolution gaming...something people in this price range are going to want to be able to do (see the rough bandwidth check below). A lot of people are calling it a 4070 rebranded as a 4080, and in a sense that is certainly true, but it's even worse than that. Have we ever had a 70-class card that was 192-bit? Seems to me that was usually the 60-class card. So in my mind they are taking 60-class card specs, pumping a ton of power through it, and calling it "80-class".
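As a rough sanity check on those bus widths: theoretical bandwidth is just the per-pin data rate times the bus width. The data rates below are the announced GDDR6X speeds for each card, so treat this as a back-of-the-envelope sketch rather than measured figures:

```python
# Theoretical memory bandwidth (GB/s) = data rate (Gbps per pin) * bus width (bits) / 8
# Data rates are the announced GDDR6X speeds; treat them as assumptions, not measurements.
cards = [
    ("RTX 4090 (384-bit)",      21.0, 384),
    ("RTX 4080 16GB (256-bit)", 22.4, 256),
    ("RTX 4080 12GB (192-bit)", 21.0, 192),
]
for name, gbps, bus_bits in cards:
    print(f"{name}: {gbps * bus_bits / 8:.0f} GB/s")
# -> roughly 1008, 717, and 504 GB/s; the 12GB card gets half the 4090's bandwidth
```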

 

Maybe it's just me, but this feels like Kepler all over again. Anyone remember the GTX 680?

 

And maybe this is an extreme reaction, but I sincerely hope people let these 4080s rot on shelves.

Edited by Sir Beregond

Quote

RTX 4080 12GB features 7,680 CUDA cores

RTX 4080 16GB features 9,728 CUDA cores

 

Why do they do this????? They shouldn't both be called 4080s. It all started with the 8800 GTS (512MB) and 8800 GTS (320MB). The two GPUs weren't even on the same node and had nothing in common other than both being called an 8800 GTS.

 

As for the prices, with inflation at a 40-year high, it was to be expected.


33 minutes ago, Diffident said:

 

Why do they do this????? They shouldn't both be called 4080s. It all started with the 8800 GTS (512MB) and 8800 GTS (320MB). The two GPUs weren't even on the same node and had nothing in common other than both being called an 8800 GTS.

 

As for the prices, with inflation at a 40-year high, it was to be expected.

I don't know; I think @Avacado might be on to something. Given the 4080 12GB specs, I think it's extremely likely that the 3090/3090 Ti will outperform it at 4K.

 

Probably just a way for them to sell unsold stock of 30-series cards.


55 minutes ago, Diffident said:

 

Why do they do this????? They shouldn't both be called 4080s. It all started with the 8800 GTS (512MB) and 8800 GTS (320MB). The two GPUs weren't even on the same node and had nothing in common other than both being called an 8800 GTS.

 

As for the prices, with inflation at a 40-year high, it was to be expected.

Inflation didn't go up 40% though. These prices are pure monopolistic/duopolistic greed. 

 

3080 MSRP: $699

4080 12GB MSRP: $899 (28.6% increase over 3080 MSRP)

4080 16GB MSRP: $1199 (71.5% increase over 3080 MSRP)

 

Inflation in the US over the last two years combined is a little under 15%. 

 

If the price difference were just inflation, the 4080 should only be about $100 more expensive. 
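To make the arithmetic explicit, here's a quick sketch; the ~15% cumulative inflation figure is the estimate from above, not an official statistic:

```python
# Percent increase of each 4080 MSRP over the 3080's $699, plus an
# inflation-adjusted 3080 price assuming ~15% cumulative inflation (estimate from above).
base_3080 = 699
for name, msrp in [("4080 12GB", 899), ("4080 16GB", 1199)]:
    print(f"{name}: +{(msrp / base_3080 - 1) * 100:.1f}% over the 3080")
print(f"Inflation-adjusted 3080 MSRP: ~${base_3080 * 1.15:.0f}")  # ~$804, i.e. about $100 more
```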

Edited by UltraMega


On a different note, DLSS 3 is impressive tech. Can't deny that even with the insane pricing. 

 

Quote

Nvidia improves upon this with DLSS 3 by adding Optical Multi Frame Generation to generate entirely new frames rather than just pixels, similar to motion interpolation found on TVs (although hopefully without the input lag penalty). In performance mode, DLSS 3 is reconstructing seven-eighths of the total displayed pixels. Super Resolution rebuilds three-fourths of the first frame (e.g. 1080p to 4K), with Frame Generation reconstructing the entire second frame.

Since Optical Multi Frame Generation executes as a post-process on the GPU, it can increase framerates even in CPU-limited games, such as Microsoft Flight Simulator. Unfortunately, DLSS Frame Generation is powered by the new fourth-generation Tensor cores and Optical Flow Accelerator of the Nvidia Ada Lovelace architecture, meaning it only works on RTX 40 GPUs.

Nvidia DLSS 3 will provide up to four times more FPS, exclusive to RTX 40 series | TechSpot

 

Seems to work basically how you'd expect when you hear the term "frame generation". The idea is so simple, I expect other smart upscalers will do the same thing eventually. Once you have enough AI cores to run DLSS 2 with a good amount of headroom, why not just start generating more frames? On the surface it's similar to how TVs do it, but I bet it works a hell of a lot better, coming from a much richer data set than you'd get from 2D video.
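The "seven-eighths" figure in the quote checks out with a little arithmetic. A minimal sketch of where it comes from, assuming performance mode renders the first frame at quarter resolution and generates the whole second frame, per the quoted description:

```python
# DLSS 3 performance mode, per the quoted TechSpot description:
# frame 1 is rendered at quarter resolution and upscaled (3/4 of its pixels reconstructed),
# frame 2 is generated entirely.
rendered_share_f1 = 1 / 4                  # natively rendered pixels in frame 1
reconstructed_f1 = 1 - rendered_share_f1   # 3/4 of frame 1
reconstructed_f2 = 1.0                     # all of frame 2
avg_reconstructed = (reconstructed_f1 + reconstructed_f2) / 2
print(avg_reconstructed)  # 0.875 -> seven-eighths of all displayed pixels
```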

Edited by UltraMega


2 hours ago, UltraMega said:

Inflation didn't go up 40% though. These prices are pure monopolistic/duopolistic greed. 

 

3080 MSRP: $699

4080 12GB MSRP: $899 (28.6% increase over 3080 MSRP)

4080 16GB MSRP: $1199 (71.5% increase over 3080 MSRP)

 

Inflation in the US over the last two years combined is a little under 15%. 

 

If the price difference were just inflation, the 4080 should only be about $100 more expensive. 

He said a 40-year high, not 40% inflation.


5 minutes ago, Sir Beregond said:

He said a 40-year high, not 40% inflation.

The point is the same, which is that the price increase is way more than inflation would justify. 


32 minutes ago, UltraMega said:

The point is the same, which is that the price increase is way more than inflation would justify. 

 

...but not if it goes by size / weight 😬

 

 


5 hours ago, UltraMega said:

On a different note, DLSS 3 is impressive tech. Can't deny that even with the insane pricing. 

 

Nvidia DLSS 3 will provide up to four times more FPS, exclusive to RTX 40 series | TechSpot

 

Seems to work basically how you'd expect when you hear the term "frame generation". The idea is so simple, I expect other smart upscalers will do the same thing eventually. Once you have enough AI cores to run DLSS 2 with a good amount of headroom, why not just start generating more frames? On the surface it's similar to how TVs do it, but I bet it works a hell of a lot better, coming from a much richer data set than you'd get from 2D video.

I will agree 110% that the DLSS tech is amazing, as are FSR and the one Intel is developing. I love this idea, especially for older or lower-tier cards. One problem though: paying $1600 (MSRP, we know it'll be more than that at retail) for a GPU, I shouldn't HAVE to run DLSS or FSR even at 4K. It's 2022, and these GPUs are 100-fold faster than their console counterparts. I could see utilizing DLSS / FSR on consoles and, again, lower-model GPUs....but a 4090 or 7900XT? It shouldn't be needed, and they're promoting it on the high-end cards in these videos like it's gold.

Sorry, I'm actually a little salty that I paid $1000 for a 6900XT and I absolutely have to run FSR in some titles (that are older than the card, mind you). And Nvidia is pushing DLSS on the 4090 / 4080 series too in those videos. Yeah, the tech is great. Agreed. But show it off on, say, the 4050 or 4060 series, where it belongs.

Edited by pioneerisloud

1 hour ago, pioneerisloud said:

I will agree 110% that the DLSS tech is amazing, as are FSR and the one Intel is developing. I love this idea, especially for older or lower-tier cards. One problem though: paying $1600 (MSRP, we know it'll be more than that at retail) for a GPU, I shouldn't HAVE to run DLSS or FSR even at 4K. It's 2022, and these GPUs are 100-fold faster than their console counterparts. I could see utilizing DLSS / FSR on consoles and, again, lower-model GPUs....but a 4090 or 7900XT? It shouldn't be needed, and they're promoting it on the high-end cards in these videos like it's gold.

Sorry, I'm actually a little salty that I paid $1000 for a 6900XT and I absolutely have to run FSR in some titles (that are older than the card, mind you). And Nvidia is pushing DLSS on the 4090 / 4080 series too in those videos. Yeah, the tech is great. Agreed. But show it off on, say, the 4050 or 4060 series, where it belongs.

 

...good point about DLSS being more important in the mid-range cards. Then again, the few demos I've seen of the 4090 w/ DLSS 3.0 and RTX all seem to point at 8K...looks like Nvidia is trying to push that one (8K) again. On the mid-range front, though, I find it telling that Nvidia is releasing the 'weaker' 4080 with the gimped memory bus and reduced core count; in earlier and otherwise accurate leaks (re. 4090, 4080 16 GB), that model was actually tagged as a 4070. Maybe Nvidia is expecting more competition in the mid-range from AMD and even Intel (once Intel sorts out its vBIOS and driver problems) and wants to inject a reconstituted 4070 into that battle, especially during weaker economic times and the associated consumer budget concerns?


2 hours ago, ozlay said:

They should have called that 16GB card an RTX 4085.

 

Bring back the 5s, like the 200 series.

No, that will come later as a cut-down AD102 with 20GB on a 320-bit bus (i.e. what the 4080 Ti will likely be, in my mind). They could also do 22GB on a 352-bit bus, but the first seems more likely to me.
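Those capacity/bus pairings aren't arbitrary. A quick sketch of why, assuming one 2GB GDDR6X package per 32-bit channel (the configuration current non-clamshell cards use; clamshell designs like the 3090 double the chip count instead):

```python
# Each GDDR6X package has a 32-bit interface, and current packages are 2GB,
# so VRAM capacity follows directly from bus width (one package per channel assumed).
CHIP_BITS, CHIP_GB = 32, 2
for bus in (384, 352, 320, 256, 192):
    chips = bus // CHIP_BITS
    print(f"{bus}-bit bus -> {chips} chips -> {chips * CHIP_GB}GB")
# 384 -> 24GB (4090), 352 -> 22GB, 320 -> 20GB, 256 -> 16GB (4080 16GB), 192 -> 12GB (4080 12GB)
```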

 

But I agree with your sentiment. That said, wasn't the GTX 285 a die shrink of the GTX 280, or am I misremembering?

Edited by Sir Beregond


The mid-range card prices are poor. Not surprised, however, as this is Nvidia and, like it or not, their technology, both hardware and software, still outdoes what AMD has to offer. It sucks for consumers.

 

Having two models under the 4080 name is just offensive. Who in their right mind thinks that is sensible?

 

While no doubt the 4090/Ti will be a powerhouse, it's still a good chunk of cash to lay down. Even though the upgrade bug is biting, realistically there is no point in upgrading if you have a 3090/Ti. Other than professional uses, you will not see a real-world benefit in gaming, besides the DLSS 3 tech. That is also assuming they don't later release it for the 30 series.


1 hour ago, ENTERPRISE said:

The mid-range card prices are poor. Not surprised, however, as this is Nvidia and, like it or not, their technology, both hardware and software, still outdoes what AMD has to offer. It sucks for consumers.

 

Having two models under the 4080 name is just offensive. Who in their right mind thinks that is sensible?

 

While no doubt the 4090/Ti will be a powerhouse, it's still a good chunk of cash to lay down. Even though the upgrade bug is biting, realistically there is no point in upgrading if you have a 3090/Ti. Other than professional uses, you will not see a real-world benefit in gaming, besides the DLSS 3 tech. That is also assuming they don't later release it for the 30 series.

It would be one thing if they were the same chip, but given the vast disparity between the two 4080s, it's definitely a slap in the face to consumers...especially the ones who won't know better and will just assume the only difference is VRAM.

 

For me personally, it sounds pretty cool, but I will wait and see how the DLSS 3 tech plays out. From what I've heard, 30-series owners should still be able to run DLSS 3 titles in DLSS 2.

 

More than just the cost, I find the concept of a 450W card, with what sound to be optional 600W+ cards via the AIBs, to be ridiculous. I know enthusiasts will like that for OC sessions, but as a general gaming card to use daily, that just seems flat-out irresponsible to me given rising energy costs, especially in places like Europe, as well as inflation, etc. Just very poor taste in my opinion.


We haven't really seen any interesting leaks about AMD's next GPUs; who knows, maybe they will surprise us.

 

It's not totally infeasible that AMD could come out with really strong RT performance. If AMD could match or beat Nvidia's RT performance and offer it at a lower price, they could box Nvidia into a corner where the only way Nvidia could meaningfully pull ahead would be with DLSS 3. If AMD can just match non-DLSS 3 numbers for a lower price, they will be in a really good position, where people will have to say: yeah, Nvidia can beat AMD, but only with frame generation.

 

I've said this already somewhere, but I really see no reason why AMD couldn't also do frame generation. We know AMD's next GPUs are going to focus a lot more on AI than previous ones; as long as they have the AI headroom to generate frames, they should be able to do it too. It seems like a very simple concept that can be easily replicated so long as the GPU power is available.

 

On pricing, it seems the prices are slotted in an almost linear way against the 3000 series, minus the 4090. The two new 4080 cards are priced as if they are supposed to be a higher-end tier above the current GPUs rather than replacements for them. Obviously, they are doing this because there is still 3000-series overstock. It's very frustrating that they wouldn't just drop the prices of the 3000 series if they wanted these cards to cost so much more than the older ones, but perhaps the silver lining is that price drops could be substantial once 3000-series stock clears out, though I'm not holding my breath.


1 hour ago, UltraMega said:

We haven't really seen any interesting leaks about AMD's next GPUs; who knows, maybe they will surprise us.

 

It's not totally infeasible that AMD could come out with really strong RT performance. If AMD could match or beat Nvidia's RT performance and offer it at a lower price, they could box Nvidia into a corner where the only way Nvidia could meaningfully pull ahead would be with DLSS 3. If AMD can just match non-DLSS 3 numbers for a lower price, they will be in a really good position, where people will have to say: yeah, Nvidia can beat AMD, but only with frame generation.

 

I've said this already somewhere, but I really see no reason why AMD couldn't also do frame generation. We know AMD's next GPUs are going to focus a lot more on AI than previous ones; as long as they have the AI headroom to generate frames, they should be able to do it too. It seems like a very simple concept that can be easily replicated so long as the GPU power is available.

 

On pricing, it seems the prices are slotted in an almost linear way against the 3000 series, minus the 4090. The two new 4080 cards are priced as if they are supposed to be a higher-end tier above the current GPUs rather than replacements for them. Obviously, they are doing this because there is still 3000-series overstock. It's very frustrating that they wouldn't just drop the prices of the 3000 series if they wanted these cards to cost so much more than the older ones, but perhaps the silver lining is that price drops could be substantial once 3000-series stock clears out, though I'm not holding my breath.

 

I'm still skeptical of a lot of the charts and claims from Nvidia yesterday, and of their heavy use of "relative performance" rather than raw FPS differences. A lot of those charts were also DLSS in performance mode per the small print, which no one in their right mind should be using. I imagine actual reviews are going to paint a different picture of how much better the 40-series cards actually are.

 

If I had to predict RDNA3, I'd guess their raster performance should be pretty competitive across the stack. My guess is that ray tracing will still be faster on the Nvidia cards, but as long as RDNA3 can at least beat Ampere in RT, even if it doesn't match Lovelace, that still makes it a compelling option.

 

The biggest thing here is pricing: if AMD can come in at, or only slightly above, RDNA2's MSRPs, that would be really compelling in the current market, and it's probably AMD's best chance at gaining market share while Nvidia keeps trying to sell an overstock of 30-series cards and overpriced 40-series cards everyone is hating.

Edited by Sir Beregond

6 hours ago, Sir Beregond said:

 

I'm still skeptical of a lot of the charts and claims from Nvidia yesterday, and of their heavy use of "relative performance" rather than raw FPS differences. A lot of those charts were also DLSS in performance mode per the small print, which no one in their right mind should be using. I imagine actual reviews are going to paint a different picture of how much better the 40-series cards actually are.

 

If I had to predict RDNA3, I'd guess their raster performance should be pretty competitive across the stack. My guess is that ray tracing will still be faster on the Nvidia cards, but as long as RDNA3 can at least beat Ampere in RT, even if it doesn't match Lovelace, that still makes it a compelling option.

 

The biggest thing here is pricing: if AMD can come in at, or only slightly above, RDNA2's MSRPs, that would be really compelling in the current market, and it's probably AMD's best chance at gaining market share while Nvidia keeps trying to sell an overstock of 30-series cards and overpriced 40-series cards everyone is hating.

QFT. Another interesting tidbit here......as a 6900XT owner, the current-gen AMD cards really aren't HORRIBLE at ray tracing; it's just "RTX" that they fall on their face with. The cards CAN do it, but "RTX" is more so an Nvidia thing, just like PhysX was. So that's my bet too: the 4000 series will still be better at regular "RTX", but the 7000 series from AMD should end up doing better than the 3000 series do now, or worst case about on par with them. Play Crysis Remastered with RT turned on, and the 6900XT performs just fine. Launch Cyberpunk though, and yeah, slideshow city. The difference being "DX raytracing" vs "RTX", of course.

 

Doesn't AMD already have a DLSS equivalent with RSR and FSR though? Honestly, I like the openness of AMD's options better than Nvidia's DLSS anyway. Any card can use FSR in games that support it. And RSR is honestly pretty neat to play around with too on the AMD cards that support it (5700XT and higher). I mean, they're not perfect, and I'm sure DLSS is slightly more fine-tuned.....but they work, and they work pretty well. FSR 2.0 is even better, so they're obviously putting effort into it. It beats playing at lowered native resolutions by a long shot, anyway.


47 minutes ago, pioneerisloud said:

QFT. Another interesting tidbit here......as a 6900XT owner, the current-gen AMD cards really aren't HORRIBLE at ray tracing; it's just "RTX" that they fall on their face with. The cards CAN do it, but "RTX" is more so an Nvidia thing, just like PhysX was. So that's my bet too: the 4000 series will still be better at regular "RTX", but the 7000 series from AMD should end up doing better than the 3000 series do now, or worst case about on par with them. Play Crysis Remastered with RT turned on, and the 6900XT performs just fine. Launch Cyberpunk though, and yeah, slideshow city. The difference being "DX raytracing" vs "RTX", of course.

 

Doesn't AMD already have a DLSS equivalent with RSR and FSR though? Honestly, I like the openness of AMD's options better than Nvidia's DLSS anyway. Any card can use FSR in games that support it. And RSR is honestly pretty neat to play around with too on the AMD cards that support it (5700XT and higher). I mean, they're not perfect, and I'm sure DLSS is slightly more fine-tuned.....but they work, and they work pretty well. FSR 2.0 is even better, so they're obviously putting effort into it. It beats playing at lowered native resolutions by a long shot, anyway.

 

...I run both a 6900XT and an RTX 3090 (work and play, but somewhat interchangeable functions) and yeah, the 6900XT in ray tracing is at about the same level as one of my older 2080 Tis - so not bad, but it can't compete with the 3090 in RTX. If anything, the narrower memory bus of the 6900XT is a bigger issue at 4K across the board.

 

I am certainly hanging back until early next year re. any potential upgrades (unless I change my mind and do the opposite...). So far, Nvidia's 'numbers' on performance gains are basically 'apples to oranges', i.e. DLSS 3.0 vs no DLSS at 8K, etc. I figure around 60-70% genuine improvement over, for example, my OC'ed 3090, but that card is already giving me joy even at max 4K OLED. So since there's no immediate need, I might as well see what the 4090 Ti and/or 7900XT(X) are all about for a bigger jump.


1 hour ago, pioneerisloud said:

QFT. Another interesting tidbit here......as a 6900XT owner, the current-gen AMD cards really aren't HORRIBLE at ray tracing; it's just "RTX" that they fall on their face with. The cards CAN do it, but "RTX" is more so an Nvidia thing, just like PhysX was. So that's my bet too: the 4000 series will still be better at regular "RTX", but the 7000 series from AMD should end up doing better than the 3000 series do now, or worst case about on par with them. Play Crysis Remastered with RT turned on, and the 6900XT performs just fine. Launch Cyberpunk though, and yeah, slideshow city. The difference being "DX raytracing" vs "RTX", of course.

 

Doesn't AMD already have a DLSS equivalent with RSR and FSR though? Honestly, I like the openness of AMD's options better than Nvidia's DLSS anyway. Any card can use FSR in games that support it. And RSR is honestly pretty neat to play around with too on the AMD cards that support it (5700XT and higher). I mean, they're not perfect, and I'm sure DLSS is slightly more fine-tuned.....but they work, and they work pretty well. FSR 2.0 is even better, so they're obviously putting effort into it. It beats playing at lowered native resolutions by a long shot, anyway.

 

I'm not sure that's accurate. Ray tracing is ray tracing; there's not an Nvidia version of it. They just call it RTX, but it's the same. I could be wrong, but that's how I understand it. AMD 6000 is fine in games that use RT for shadows or GI, but it doesn't really hold up well with reflections. Crysis Remastered probably isn't a good example, because it's the one and only game that doesn't need RT cores to do RT.

Edited by UltraMega


On 20/09/2022 at 12:25, UltraMega said:

RT performance getting a major boost as expected. 

Just want to be careful with this statement. DLSS 3.0 brings no advancement to ray tracing. SER brings up to 1.25x RT performance, and that's about it. Other uplifts specific to RT may come from the new node and increased hardware units, but it's not going to be near the claimed 4x.

 

The larger portion of the 4x is coming from the new frame generation, which has been seen used with SVP since Turing and is known as Optical Flow. There is no ray tracing happening in this step, because it runs outside of the game engine and only has access to pixels from already-rendered frames and the motion vectors of those pixels. At best, it's another trick, one of several, and Nvidia is running out of tricks relating to RT performance. A more noteworthy trick that is directly part of the RT pipeline, and needs to be in the engine, is the denoiser, which has gotten significant upgrades and was featured on Two Minute Papers. I appreciate RT, but NV is coming up short each gen, and this time they've come up with the wrong compromise. RT is about fidelity. Frame generation undoes ALL of that.
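For a rough sense of scale, here's a purely illustrative sketch of how the claimed multipliers could stack, assuming frame generation at best doubles displayed FPS and taking Nvidia's stated 1.25x SER ceiling at face value (assumptions, not measurements):

```python
# Assumption-laden decomposition of the "up to 4x" marketing claim:
# frame generation ~2x at best (every other frame synthesized),
# SER up to ~1.25x on RT-heavy workloads (Nvidia's stated upper bound).
claimed_total = 4.0
frame_generation = 2.0   # assumption: FG doubles displayed FPS at best
ser = 1.25               # Nvidia's quoted ceiling for Shader Execution Reordering
residual = claimed_total / (frame_generation * ser)
print(f"Uplift still needed from raw hardware + upscaling: {residual:.2f}x")  # 1.60x
```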

