
RTX 4090 and RTX 4080 Final Specifications (leaked/rumor)


UltraMega

GeForce RTX 4090

The GPU SKU AD102-300 for the GeForce RTX 4090 should feature 16384 CUDA cores clocked at up to 2520 MHz. The card should include 24GB of GDDR6X memory offering ~1 TB/s of bandwidth, the same as the RTX 3090. Default power consumption should be ~450 W, with a maximum of 660 W. According to sources, the 660 W TGP may not be accessible on all custom models, because it will be unlocked via the BIOS.

 

16GB GeForce RTX 4080

The GeForce RTX 4080 16GB features GPU model AD103-300 with 9728 CUDA cores and a frequency of up to 2505 MHz, which is nearly comparable to the RTX 4090's clock. The model includes 16GB of GDDR6X memory and a standard TGP of 340 W, with a maximum of 516 W.

 

12GB GeForce RTX 4080

Finally, the GeForce RTX 4080 12GB is the GPU previously known as the RTX 4070. According to the leaks, NVIDIA altered the product's name at the last minute. The graphics card is powered by the AD104-400 GPU with 7680 CUDA cores and a stated frequency of up to 2610 MHz. The model's TGP is 285 W, with a maximum of 366 W.


https://www.guru3d.com/news-story/are-these-the-geforce-rtx-4090-and-rtx-4080-final-(leaked)-specifications.html
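For context on those numbers: peak VRAM bandwidth is just bus width times per-pin data rate. A quick sketch (assuming 21 Gbps GDDR6X, the rate widely rumored for these cards) shows how the 4090's "1 TB/s" figure arises, and what a 192-bit bus would imply:

```python
def mem_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s = (bus width in bits / 8 bits per byte) * per-pin rate in Gbps."""
    return bus_width_bits / 8 * data_rate_gbps

# RTX 4090: 384-bit bus at 21 Gbps GDDR6X -> 1008 GB/s, i.e. the "1 TB/s" quoted above
print(mem_bandwidth_gbs(384, 21))  # 1008.0

# A 192-bit card at the same 21 Gbps would manage only half that
print(mem_bandwidth_gbs(192, 21))  # 504.0
```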

 

I wonder if the 16GB 4080 is going to end up being a 4080 Ti. Kinda disappointing to know that the 4070 and lower will have less than 16GB.


I'm sorry, an 80-class card (the 12GB 4080) with a 192-bit bus? Get bent, Nvidia, that's atrocious.

 

Not sure if that was part of this source, but it matches what I've seen in similar rumors: the 16GB version being 256-bit and the 12GB version being 192-bit (as opposed to 384-bit).



  On 16/09/2022 at 19:17, Sir Beregond said:

I'm sorry, an 80-class card (the 12GB 4080) with a 192-bit bus? Get bent, Nvidia, that's atrocious.


From what I've heard, the 12GB 4080 was supposed to be the 4070, but they renamed it to justify charging more for it, since the flood of used 3000-series cards and overstock is taking over the lower end of what the 4000 series would cover.


  On 16/09/2022 at 19:19, UltraMega said:

From what I've heard, the 12GB 4080 was supposed to be the 4070, but they renamed it to justify charging more for it, since the flood of used 3000-series cards and overstock is taking over the lower end of what the 4000 series would cover.


Still, a spec like a 192-bit bus is firmly midrange.



  On 16/09/2022 at 19:17, Sir Beregond said:

I'm sorry, an 80-class card (the 12GB 4080) with a 192-bit bus? Get bent, Nvidia, that's atrocious.

Not sure if that was part of this source, but it matches what I've seen in similar rumors: the 16GB version being 256-bit and the 12GB version being 192-bit (as opposed to 384-bit).


 

...yeah, a 192-bit bus 🤢

 

...looks like my 3090 Strix (with up to 1 kW custom BIOS, a 384-bit bus and 24GB VRAM) will be it for me for a while yet. I am likely to skip the 4090 and wait for the 4090 Ti (or AMD's rumoured 7900 XTX)...


I'm sure raw performance will be great, but I just don't understand how a 192-bit 80-class card wouldn't be kneecapped for higher resolution gaming, something 80-class card buyers may care about lol.

 

Seems like more funny business in the vein of Kepler. The GTX 680 was a mid-range die sold as a flagship, too.


  On 19/09/2022 at 15:29, Sir Beregond said:

I'm sure raw performance will be great, but I just don't understand how a 192-bit 80-class card wouldn't be kneecapped for higher resolution gaming, something 80-class card buyers may care about lol.

 

Seems like more funny business in the vein of Kepler. The GTX 680 was a mid-range die sold as a flagship, too.


...the 256-bit bus of the 6900 XT was already found to be a bit tight (even with Infinity Cache) at 4K and above.



  On 19/09/2022 at 16:08, J7SC_Orion said:

  

...the 256-bit bus of the 6900 XT was already found to be a bit tight (even with Infinity Cache) at 4K and above.


Exactly. This is why I am perplexed by these rumors. Perhaps it will just turn out to be that...rumors. Guess we'll find out tomorrow.



  On 19/09/2022 at 16:08, J7SC_Orion said:

  

...the 256-bit bus of the 6900 XT was already found to be a bit tight (even with Infinity Cache) at 4K and above.


Hmmm, so I have the bus width to blame? :lachen:

 

Seriously though, as @Sir Beregond said above, this really feels like the GTX 680 release all over again with these RTX 4000 series cards. And the pricing on top of it is just a slap in the face to boot. I don't recall who said it or in what thread, but it really almost does feel like they're releasing garbage on purpose to push existing RTX 3000 stock and get that moved first, before releasing the big card and calling it the 4090 Ti, or maybe saving it for the 5090.

 

Guess we'll have to see how AMD answers this with the 7900XT to get a better idea of what's going on.  Probably wait for the next "big" Nvidia card too, just to see if there's any reasonable explanation behind what they're doing this gen.



  On 24/09/2022 at 04:36, pioneerisloud said:

Hmmm, so I have the bus width to blame? :lachen:

 

Seriously though, as @Sir Beregond said above, this really feels like the GTX 680 release all over again with these RTX 4000 series cards. And the pricing on top of it is just a slap in the face to boot. I don't recall who said it or in what thread, but it really almost does feel like they're releasing garbage on purpose to push existing RTX 3000 stock and get that moved first, before releasing the big card and calling it the 4090 Ti, or maybe saving it for the 5090.

 

Guess we'll have to see how AMD answers this with the 7900XT to get a better idea of what's going on.  Probably wait for the next "big" Nvidia card too, just to see if there's any reasonable explanation behind what they're doing this gen.


 

As posted before, I see no 'use case' reason for me right now to upgrade to a 4090; among other things, I first want to see what the 4090 Ti and AMD's 7900 XT/X bring to the table in real-world testing from several sources.

 

The AMD vs. Nvidia matchup early next year should be a really interesting development, as it pits a monolithic super-heavyweight die (the 4090 with ~76 billion transistors, vs. the 3090's ~28 billion) against new-gen mGPU chiplets. I am convinced that AMD will compete against the 4090 Ti with a multi-chiplet approach, just like it did with Ryzen on the CPU front...it filed related patents a while back which describe how multiple chiplets in a GPU appear as a single GPU to the OS and drivers (so no Crossfire/SLI profile requirements). Ultimately, chiplets will win over monolithic for cost and manufacturing reasons as well. Whether this will occur with the 4090 Ti / AMD XTX matchup or later, I do not know, but it will come to pass, IMO. In the meantime, Nvidia will try to keep its crown and get more of our monie$ with exclusive software and driver trickery.

 

Somewhat related re. upgrade plans: I always ran SLI or Crossfire over the last decade, without any real issues in at least the apps and games I have...but that upgrade path is clearly not an option anymore. So I'll just have to make do with my trusty 'Raven's eyes' for now.


 




Got 550 GB/s with a little overclock on my 6800 XT. Is the 6900 XT not faster?


  On 24/09/2022 at 05:20, bonami2 said:

Got 550 GB/s with a little overclock on my 6800 XT. Is the 6900 XT not faster?


 

...same VRAM spec for the 6800 and 6900, but when it comes to 550 GB/s, there's this:

 

[Attachment: GPU-Z VRAM bandwidth comparison]
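The 550 GB/s figure is easy to sanity-check with the same bus-width arithmetic, assuming the 6800 XT's stock 256-bit bus and 16 Gbps GDDR6 (512 GB/s at stock):

```python
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s from bus width and per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

def pin_rate_for(target_gbs: float, bus_width_bits: int) -> float:
    """Per-pin data rate (Gbps) needed to hit a target bandwidth on a given bus."""
    return target_gbs * 8 / bus_width_bits

print(bandwidth_gbs(256, 16))  # stock 6800 XT: 512.0 GB/s
print(pin_rate_for(550, 256))  # 17.1875 Gbps effective needed for 550 GB/s
```

So a modest memory overclock (~16 to ~17.2 Gbps effective) accounts for the 550 GB/s reading.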



  On 24/09/2022 at 05:42, J7SC_Orion said:

 

...same VRAM spec for the 6800 and 6900, but when it comes to 550 GB/s, there's this:

 

[Attachment: GPU-Z VRAM bandwidth comparison]


Oh well, I never realized Nvidia was so much further ahead. Pretty sad, knowing AMD was always first bandwidth-wise before, with the R9 290 and the HBM-equipped Fury.

That would explain why the 6800 XT performs really well at 1080p and starts losing a lot to Nvidia at 4K.


Share on other sites

  On 24/09/2022 at 07:21, bonami2 said:

Oh well, I never realized Nvidia was so much further ahead. Pretty sad, knowing AMD was always first bandwidth-wise before, with the R9 290 and the HBM-equipped Fury.

That would explain why the 6800 XT performs really well at 1080p and starts losing a lot to Nvidia at 4K.


Yeah, when you look at a lot of benchmarks, 1080p and 1440p often favor the Radeon cards, whereas 4K often swings over to Nvidia, with some exceptions.

 

I imagine if they are expanding the bus width with RDNA3 + infinity cache, that should be much better and hopefully help eliminate that bottleneck. At that point it will be down to GDDR6 vs GDDR6X, but my guess is this is where Infinity Cache can further help, with the proper bus width for high resolutions.

 

And this is why I am really perplexed by the 192-bit 4080 12GB. Imagine buying that for $900-$1k and it not doing great at 4K.
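A rough way to picture the Infinity Cache point: the effective bandwidth a GPU sees is a blend of on-die cache and VRAM bandwidth, weighted by how often the cache is hit, and hit rates fall as resolution (and thus the working set) grows. The hit rates and cache bandwidth below are purely illustrative placeholders, not measured values:

```python
def effective_bandwidth_gbs(vram_gbs: float, cache_gbs: float, hit_rate: float) -> float:
    """Blend on-die cache and VRAM bandwidth by the fraction of accesses the cache serves."""
    return hit_rate * cache_gbs + (1 - hit_rate) * vram_gbs

VRAM = 512.0    # 256-bit GDDR6 at 16 Gbps (6900 XT class)
CACHE = 1600.0  # assumed on-die cache bandwidth, well above VRAM

# Hypothetical hit rates shrinking as resolution rises
for res, hit in [("1080p", 0.80), ("1440p", 0.70), ("4K", 0.55)]:
    print(f"{res}: ~{effective_bandwidth_gbs(VRAM, CACHE, hit):.0f} GB/s effective")
```

Even with made-up numbers, the shape of the result matches the benchmark pattern above: the narrower bus hurts most exactly where the cache helps least, at 4K.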

 

  

  On 24/09/2022 at 05:19, J7SC_Orion said:

 

As posted before, I see no 'use case' reason for me right now to upgrade to a 4090; among other things, I first want to see what the 4090 Ti and AMD's 7900 XT/X bring to the table in real-world testing from several sources.

 

The AMD vs. Nvidia matchup early next year should be a really interesting development, as it pits a monolithic super-heavyweight die (the 4090 with ~76 billion transistors, vs. the 3090's ~28 billion) against new-gen mGPU chiplets. I am convinced that AMD will compete against the 4090 Ti with a multi-chiplet approach, just like it did with Ryzen on the CPU front...it filed related patents a while back which describe how multiple chiplets in a GPU appear as a single GPU to the OS and drivers (so no Crossfire/SLI profile requirements). Ultimately, chiplets will win over monolithic for cost and manufacturing reasons as well. Whether this will occur with the 4090 Ti / AMD XTX matchup or later, I do not know, but it will come to pass, IMO. In the meantime, Nvidia will try to keep its crown and get more of our monie$ with exclusive software and driver trickery.

 

Somewhat related re. upgrade plans: I always ran SLI or Crossfire over the last decade, without any real issues in at least the apps and games I have...but that upgrade path is clearly not an option anymore. So I'll just have to make do with my trusty 'Raven's eyes' for now.


 


 

I agree. If Moore's Law is indeed dead, then monolithic GPU dies are a problem, and chiplet designs are the solution: the less critical pieces (a GPU equivalent of the I/O die) can go on a less advanced, cheaper node, while the compute chiplets use the advanced node.


