
Modder puts 16GB of VRAM on a 3070


Recommended Posts

Quote

After some quick testing, Paulo found that he needed to set the card to work in high-performance mode to prevent random flickering or black screens when running 3D applications. Otherwise, the results are quite interesting despite only featuring one game – Resident Evil 4.

 

The modded card not only achieved a higher average frame rate but also went from single-digit one percent and 0.1 percent lows to more respectable values around 60 and 40 frames per second, respectively. The testing was done at a resolution of 2,560 by 1,080 and VRAM usage went over 12 gigabytes at times.

WWW.TECHSPOT.COM

When Nvidia launched the GeForce RTX 4070 graphics card with only 12 gigabytes of GDDR6X memory, gamers...

 

 

The original video is not in English, FYI. 

 

 

What started as a compromised amount of VRAM due to pandemic shortages has turned into full-blown planned obsolescence from Nvidia. 

 

 


21 minutes ago, UltraMega said:
WWW.TECHSPOT.COM

When Nvidia launched the GeForce RTX 4070 graphics card with only 12 gigabytes of GDDR6X memory, gamers...

 

 

The original video is not in English, FYI. 

 

 

What started as a compromised amount of VRAM due to pandemic shortages has turned into full-blown planned obsolescence from Nvidia. 

 

 

8GB just never made sense. Hell, neither did 10GB for the original 3080.

 

The fact is we had 8GB cards in 2015 from AMD and 2016 from Nvidia. Why are we still on 8GB as a standard?

 

Moreover, VRAM used to double every generation at every segment.

 

Tesla to Fermi doubled. Fermi to Kepler doubled. Kepler to Maxwell doubled. And Maxwell to Pascal doubled. Then it stopped, and the only thing that doubled after that was the 3090, which finally doubled the 1080 Ti, a card that had been artificially cut down to 11GB (same as the 2080 Ti) just to differentiate it from the Titans. Three generations of practically the same RAM in all mainstream SKUs.
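
To put rough numbers behind that pattern, here's a minimal sketch; the capacities below are the commonly listed launch specs for Nvidia's x80-class cards and are approximate, included only to illustrate the doubling trend stalling.

```python
# Approximate launch VRAM of Nvidia's x80-class card per architecture (GB).
# Values are common launch specs, listed here only to illustrate the trend.
x80_vram_gb = {
    "Fermi (GTX 480)": 1.5,
    "Kepler (GTX 680)": 2,
    "Maxwell (GTX 980)": 4,
    "Pascal (GTX 1080)": 8,
    "Turing (RTX 2080)": 8,
    "Ampere (RTX 3080)": 10,
    "Ada (RTX 4080)": 16,
}

prev = None
for gen, gb in x80_vram_gb.items():
    note = f"{gb / prev:.2f}x over previous" if prev else "baseline"
    print(f"{gen}: {gb} GB ({note})")
    prev = gb
```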


32 minutes ago, Sir Beregond said:

8GB just never made sense. Hell, neither did 10GB for the original 3080.

 

The fact is we had 8GB cards in 2015 from AMD and 2016 from Nvidia. Why are we still on 8GB as a standard?

 

Moreover, VRAM used to double every generation at every segment.

 

Tesla to Fermi doubled. Fermi to Kepler doubled. Kepler to Maxwell doubled. And Maxwell to Pascal doubled. Then it stopped, and the only thing that doubled after that was the 3090, which finally doubled the 1080 Ti, a card that had been artificially cut down to 11GB (same as the 2080 Ti) just to differentiate it from the Titans. Three generations of practically the same RAM in all mainstream SKUs.

Not to try and defend Nvidia here, but some things have changed. For example, every console generation saw massive (16x or more) memory capacity increases except the most recent one. The PS5/Xbox have about 12GB of VRAM available to the GPU, which is roughly double the last generation. 
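
For a rough sense of those console jumps, a minimal sketch using commonly cited PlayStation memory totals (these are unified pools, so only part of each is actually usable as VRAM):

```python
# Total main memory per PlayStation generation (GB), commonly cited figures.
# The most recent jump is ~2x, versus ~16x for the generation before it.
ps_memory_gb = {
    "PS3 (2006)": 0.5,   # 256MB system + 256MB video
    "PS4 (2013)": 8,     # unified GDDR5
    "PS5 (2020)": 16,    # unified GDDR6
}

gens = list(ps_memory_gb.items())
for (prev_name, prev_gb), (name, gb) in zip(gens, gens[1:]):
    print(f"{prev_name} -> {name}: {gb / prev_gb:.0f}x more memory")
```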

 

Nvidia is clearly being greedy, but it's also true that VRAM capacity increases are less important today than they were in the past, in no small part because we now have storage drives fast enough to load in tons of game assets in real time. 

 

I think Nvidia also knows that a card as powerful as a 4070 could last a really long time if it had the proper amount of VRAM, with DLSS 3 in the mix. Once you can do path tracing with good results, there's not a lot left to sell to gamers in the foreseeable GPU market. I mean, what are they going to do, try to sell people on how many bounces the path tracing can do? Anything past two bounces has extremely diminishing returns, and we're already at two in Cyberpunk 2077. There just aren't many compelling reasons left to upgrade a GPU beyond one that can already do PT reasonably well, so if Nvidia didn't skimp on VRAM they might end up with a bunch of gamers reluctant to upgrade in a future where the only difference between a hypothetical 4000-series card with enough VRAM and an even better card is which DLSS setting you have to use for PT. 

 

I'm not disagreeing with you @Sir Beregond, but some aspects of this have evolved over time and it's interesting to bring up those points. 

 

 

That said, the backlash Nvidia is getting seems stronger than before; maybe we'll see a 4000-series refresh, or a rushed-out 5000 series. 

 

 


1 hour ago, UltraMega said:

Not to try and defend Nvidia here, but some things have changed. For example, every console generation saw massive (16x or more) memory capacity increases except the most recent one. The PS5/Xbox have about 12GB of VRAM available to the GPU, which is roughly double the last generation. 

 

Nvidia is clearly being greedy, but it's also true that VRAM capacity increases are less important today than they were in the past, in no small part because we now have storage drives fast enough to load in tons of game assets in real time. 

 

I think Nvidia also knows that a card as powerful as a 4070 could last a really long time if it had the proper amount of VRAM, with DLSS 3 in the mix. Once you can do path tracing with good results, there's not a lot left to sell to gamers in the foreseeable GPU market. I mean, what are they going to do, try to sell people on how many bounces the path tracing can do? Anything past two bounces has extremely diminishing returns, and we're already at two in Cyberpunk 2077. There just aren't many compelling reasons left to upgrade a GPU beyond one that can already do PT reasonably well, so if Nvidia didn't skimp on VRAM they might end up with a bunch of gamers reluctant to upgrade in a future where the only difference between a hypothetical 4000-series card with enough VRAM and an even better card is which DLSS setting you have to use for PT. 

 

I'm not disagreeing with you @Sir Beregond, but some aspects of this have evolved over time and it's interesting to bring up those points. 

 

 

That said, the backlash Nvidia is getting seems stronger than before; maybe we'll see a 4000-series refresh, or a rushed-out 5000 series. 

 

 

Yeah, keep in mind though that the way a console handles direct access to storage and the way a PC does it while running Windows are different. You are definitely going to need more VRAM on a Windows machine than you will on a console.

 

I could see a 40-series refresh. Unless TSMC has sorted out 3nm yields, the 50-series is a ways off.

Edited by Sir Beregond


1 hour ago, Sir Beregond said:

Yeah, keep in mind though that the way a console handles direct access to storage and the way a PC does it while running Windows are different. You are definitely going to need more VRAM on a Windows machine than you will on a console.

That's true, but it's a lot less true than it used to be. Most people don't know this, but Nanite in UE5 can actually run just fine off a hard drive; the point data it streams in is not as much data as people assume. On top of that, a PCIe 3.0 NVMe drive at 3,500MB/s should be more than enough to let games load in whatever assets they need, even if it's not done in a particularly efficient way. To date, only a few games can even saturate the bandwidth of a SATA SSD, and only for very brief stretches. 
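
As a back-of-the-envelope check on that claim, a minimal sketch of how much asset data each class of drive can stream per frame at 60 fps (the sequential-read numbers are typical round figures, not measurements):

```python
# Per-frame streaming budget for a few drive classes at 60 fps.
# Sequential-read speeds are typical round numbers, not benchmark results.
drives_mb_per_s = {
    "HDD": 150,
    "SATA SSD": 550,
    "PCIe 3.0 NVMe": 3500,
    "PCIe 4.0 NVMe": 7000,
}

FPS = 60
for drive, mb_s in drives_mb_per_s.items():
    print(f"{drive}: ~{mb_s / FPS:.0f} MB of assets streamable per frame at {FPS} fps")
```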

 

On top of that, DirectStorage and planned updates to DirectX will give GPUs much greater access to storage and system RAM in the future. 

 

I don't think Nvidia is skimping on VRAM because they're expecting storage access to keep improving, but those things are going to get better regardless. 

 

On a bit of a side note, Nvidia may be setting themselves up for a situation where people are a lot less compelled to upgrade in the future. By skimping on VRAM but pushing DLSS, they are basically sending the message that they think VRAM is less important now because DLSS can make up the difference better than the extra VRAM can. If that proves true to a reasonable extent, it could take a lot longer for 4000-series owners to feel a strong need to upgrade. I suspect DLSS will only continue to get better, so if the main difference for gamers becomes running DLSS at Balanced or Performance more often instead of Quality as time goes on, while DLSS itself keeps improving at upscaling, gamers may just decide to rely on upscaling more and more, making new hardware less and less necessary. 

I wouldn't be surprised if Nvidia is already planning to pump the brakes on DLSS iterations and wait for AMD and Intel to catch up, so they aren't giving gamers as much of a free upgrade via effective upscaling. It may turn out that Nvidia eventually aims to just stay a step ahead on upscaling while minimizing the benefit it actually gives to anyone on the fence about an upgrade. 

 

Interesting times ahead. Upscaling has certainly thrown a big wrench into the predictability of future hardware, like it or not. One of the most interesting things about this is that I don't think Nvidia was planning to make upscaling such a big part of their marketing back when they had to cut the VRAM on the 3000 series in half due to pandemic shortages. If someone had told them back then that they would be inferring 7 out of every 8 pixels with AI upscaling just a few years later, I think they would have been almost as surprised as anyone else. 
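
That 7-in-8 figure is just the arithmetic of DLSS Performance mode (50% render scale per axis, so a quarter of the output pixels) combined with frame generation (every other frame synthesized), as this minimal sketch shows:

```python
# Fraction of displayed pixels natively rendered when combining
# DLSS Super Resolution (Performance mode) with DLSS 3 Frame Generation.
render_scale = 0.5                      # per-axis scale in Performance mode
rendered_per_frame = render_scale ** 2  # 1/4 of output pixels are rendered
generated_frame_share = 0.5             # every other frame is AI-generated

rendered_fraction = rendered_per_frame * (1 - generated_frame_share)      # 1/8
print(f"Natively rendered: {rendered_fraction:.3f} of displayed pixels")  # 0.125
print(f"AI-inferred: {1 - rendered_fraction:.3f} of displayed pixels")    # 0.875 (7 of 8)
```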

Edited by UltraMega


4 hours ago, UltraMega said:

That's true, but it's a lot less true than it used to be. Most people don't know this, but Nanite in UE5 can actually run just fine off a hard drive; the point data it streams in is not as much data as people assume. On top of that, a PCIe 3.0 NVMe drive at 3,500MB/s should be more than enough to let games load in whatever assets they need, even if it's not done in a particularly efficient way. To date, only a few games can even saturate the bandwidth of a SATA SSD, and only for very brief stretches. 

 

On top of that, DirectStorage and planned updates to DirectX will give GPUs much greater access to storage and system RAM in the future. 

 

I don't think Nvidia is skimping on VRAM because they're expecting storage access to keep improving, but those things are going to get better regardless. 

 

On a bit of a side note, Nvidia may be setting themselves up for a situation where people are a lot less compelled to upgrade in the future. By skimping on VRAM but pushing DLSS, they are basically sending the message that they think VRAM is less important now because DLSS can make up the difference better than the extra VRAM can. If that proves true to a reasonable extent, it could take a lot longer for 4000-series owners to feel a strong need to upgrade. I suspect DLSS will only continue to get better, so if the main difference for gamers becomes running DLSS at Balanced or Performance more often instead of Quality as time goes on, while DLSS itself keeps improving at upscaling, gamers may just decide to rely on upscaling more and more, making new hardware less and less necessary. 

I wouldn't be surprised if Nvidia is already planning to pump the brakes on DLSS iterations and wait for AMD and Intel to catch up, so they aren't giving gamers as much of a free upgrade via effective upscaling. It may turn out that Nvidia eventually aims to just stay a step ahead on upscaling while minimizing the benefit it actually gives to anyone on the fence about an upgrade. 

 

Interesting times ahead. Upscaling has certainly thrown a big wrench into the predictability of future hardware, like it or not. One of the most interesting things about this is that I don't think Nvidia was planning to make upscaling such a big part of their marketing back when they had to cut the VRAM on the 3000 series in half due to pandemic shortages. If someone had told them back then that they would be inferring 7 out of every 8 pixels with AI upscaling just a few years later, I think they would have been almost as surprised as anyone else. 

I could be wrong, but I am pretty sure you still need VRAM with DLSS. While I am sure it does reduce its usage a bit by nature of rendering a lower texture resolution, I have not heard of DLSS necessarily being a big VRAM saver. Maybe a question for @J7SC_Orion.

 

I can tell you I definitely still ran out of VRAM on my 12GB card on Witcher 3 with DLSS. I really had to crank down some settings and push DLSS to balanced to make it work without getting slideshow mode a little while into the playthrough.

Edited by Sir Beregond


Remember also, the Xbox One used an AMD APU: a custom AMD Durango GCN 1.0 chip on a 28nm process. The nearest AMD and Nvidia cards used 6 or 8GB of memory. The AMD Scarlett is the GPU inside the Xbox Series X, built on the RDNA 2.0 architecture on a 7nm production process. That right there covers why the consoles were able to jump the amount of VRAM they are using.


51 minutes ago, Sir Beregond said:

I could be wrong, but I am pretty sure you still need VRAM with DLSS. While I am sure it does reduce its usage a bit by nature of rendering a lower texture resolution, I have not heard of DLSS necessarily being a big VRAM saver. Maybe a question for @J7SC_Orion.

 

I can tell you I definitely still ran out of VRAM on my 12GB card on Witcher 3 with DLSS. I really had to crank down some settings and push DLSS to balanced to make it work without getting slideshow mode a little while into the playthrough.

 

...happen to have this handy: a Cyberpunk 2077 4K DLSS 3 comparison with an HWiNFO inset that includes allocated and dedicated VRAM.

 

[Image: CP2077_OD_PSYCHO_comp.jpg (Cyberpunk 2077 comparison screenshot with HWiNFO VRAM readout)]

 

Edited by J7SC_Orion

5 hours ago, Sir Beregond said:

I could be wrong, but I am pretty sure you still need VRAM with DLSS. While I am sure it does reduce its usage a bit by nature of rendering a lower texture resolution, I have not heard of DLSS necessarily being a big VRAM saver. Maybe a question for @J7SC_Orion.

 

I can tell you I definitely still ran out of VRAM on my 12GB card on Witcher 3 with DLSS. I really had to crank down some settings and push DLSS to balanced to make it work without getting slideshow mode a little while into the playthrough.

DLSS scales VRAM usage the same way render resolution does, and there is a very strong relationship between resolution and VRAM usage. Rendering at 1080p or 1440p and upscaling to 4K uses a lot less VRAM than rendering at native 4K does. 
[Image: blops-vram-bench-sp.png (VRAM usage vs. resolution benchmark chart)]

 

DLSS does not change texture resolution or anything other than the render resolution. 
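
For reference, a quick sketch of the internal render resolutions behind each DLSS preset at a 4K output, using the widely documented per-axis scale factors; the relative pixel counts are what drive the VRAM savings:

```python
# Internal render resolution and relative pixel count for DLSS presets at 4K output.
# Per-axis scale factors are the widely documented DLSS 2.x preset values.
out_w, out_h = 3840, 2160
presets = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 1 / 3,
}

for name, scale in presets.items():
    w, h = round(out_w * scale), round(out_h * scale)
    share = (w * h) / (out_w * out_h)
    print(f"{name}: {w}x{h} internal ({share:.0%} of native 4K pixels)")
```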

 

It's definitely the case that DLSS, when used, greatly reduces VRAM usage; there are plenty of benchmarks online to back this up. I don't think the Cyberpunk 2077 screenshot in the post above is showing VRAM usage during a benchmark run, and it's also not a comparison of different resolutions/DLSS quality settings; it's a comparison of different RT settings. 

 

Edit: it is actually pretty hard to find data on this specifically. I know I have seen several charts on it, but it's hard to find any of them now. I did find this one:

[Image: 9532_105_death-stranding-benchmarked (Death Stranding VRAM benchmark chart)]

Perhaps, based on this one chart, it does not scale down as much as just lowering the resolution usually does, but it still makes quite a big difference. 

Edited by UltraMega


7 hours ago, UltraMega said:

DLSS scales VRAM usage the same way render resolution does, and there is a very strong relationship between resolution and VRAM usage. Rendering at 1080p or 1440p and upscaling to 4K uses a lot less VRAM than rendering at native 4K does. 
[Image: blops-vram-bench-sp.png (VRAM usage vs. resolution benchmark chart)]

 

DLSS does not change texture resolution or anything other than the render resolution. 

 

It's definitely the case that DLSS, when used, greatly reduces VRAM usage; there are plenty of benchmarks online to back this up. I don't think the Cyberpunk 2077 screenshot in the post above is showing VRAM usage during a benchmark run, and it's also not a comparison of different resolutions/DLSS quality settings; it's a comparison of different RT settings. 

 

Edit: it is actually pretty hard to find data on this specifically. I know I have seen several charts on it, but it's hard to find any of them now. I did find this one:

[Image: 9532_105_death-stranding-benchmarked (Death Stranding VRAM benchmark chart)]

Perhaps, based on this one chart, it does not scale down as much as just lowering the resolution usually does, but it still makes quite a big difference. 

 

I don't know. I wouldn't call 2GB a "big" difference. But it is a difference.


4 hours ago, Sir Beregond said:

 

I don't know. I wouldn't call 2GB a "big" difference. But it is a difference.

 

...it also comes down to which DLSS you're talking about 🤔. Below are three runs of the Cyberpunk 2077 built-in benchmark with identical OC and test conditions (temps; signing out and back in after each settings change and before each test). The first is 4K native, the second DLSS 2 (Quality), and the third DLSS 3 (Quality, Frame Generation, NVR). DLSS 3 with Frame Generation uses more VRAM than DLSS 2 without Frame Generation.

 

[Image: CP2077_DLSScomp.jpg (Cyberpunk 2077 benchmark comparison: 4K native vs. DLSS 2 vs. DLSS 3)]


8 hours ago, Sir Beregond said:

 

I don't know. I wouldn't call 2GB a "big" difference. But it is a difference.

I think it's usually a lot more of a difference, but I can't seem to find any of the charts that I'm pretty sure I've seen over the last year or two. 🤷

 

I guess I could test it myself with FSR2 and see what happens. 


On 24/04/2023 at 16:08, UltraMega said:

Not to try and defend Nvidia here, but some things have changed. For example, every console generation saw massive (16x or more) memory capacity increases except the most recent one. The PS5/Xbox have about 12GB of VRAM available to the GPU, which is roughly double the last generation. 

 

Nvidia is clearly being greedy, but it's also true that VRAM capacity increases are less important today than they were in the past, in no small part because we now have storage drives fast enough to load in tons of game assets in real time. 

 

I think Nvidia also knows that a card as powerful as a 4070 could last a really long time if it had the proper amount of VRAM, with DLSS 3 in the mix. Once you can do path tracing with good results, there's not a lot left to sell to gamers in the foreseeable GPU market. I mean, what are they going to do, try to sell people on how many bounces the path tracing can do? Anything past two bounces has extremely diminishing returns, and we're already at two in Cyberpunk 2077. There just aren't many compelling reasons left to upgrade a GPU beyond one that can already do PT reasonably well, so if Nvidia didn't skimp on VRAM they might end up with a bunch of gamers reluctant to upgrade in a future where the only difference between a hypothetical 4000-series card with enough VRAM and an even better card is which DLSS setting you have to use for PT. 

 

I'm not disagreeing with you @Sir Beregond, but some aspects of this have evolved over time and it's interesting to bring up those points. 

 

 

That said, the backlash Nvidia is getting seems stronger than before; maybe we'll see a 4000-series refresh, or a rushed-out 5000 series. 

 

 

Keep in mind that console architecture is different, including latency between memory, storage, GPU, etc. I know you know this... but still, it's worth flagging in this discussion. 

 

Generally speaking, looking at the quantity of memory on a console in isolation doesn't translate, because the pipeline a PC uses to render the same game is different. 

 

We know from years of data that it generally takes X% more PC system power to drive a similar console experience. Consoles are becoming more like PCs, but they're still different. 12GB will likely be the minimum for high-quality textures and assets at 4K, but I suspect the recommendation will be closer to 16GB in the next couple of years. You need that buffer for all those bad ports out there... 🙂

 

If Nvidia wants to resolve things, they'll do so when their shareholders lash out. For now, they can continue with the "deal with it" sentiment. What do we have... maybe 6-8 months more of interest rate hikes in the US? When we're at peak inflation, maybe, just maybe, the ship will start to correct itself. Consumers and businesses are already cutting spending, and Nvidia can only respond in one way: lower prices. On the flip side, Nvidia is thirsty right now, and they have a few fish on the line (tech companies with deep pockets throwing money at "AI"). Anyway, I digress with my tinfoil-hat tech/economics theory 😄

 

TL;DR -> PC gamers need to be aware of VRAM, memory, and storage requirements this gen if they want full detail settings. If you're not keeping up with 16GB of GPU VRAM, 32GB of system RAM, and an SSD, just get ready to lower your expectations and lean on settings like DLSS/FSR (lower rendering resolutions). 


1 hour ago, Slaughtahouse said:

Keep in mind that console architecture is different, including latency between memory, storage, GPU, etc. I know you know this... but still, it's worth flagging in this discussion. 

 

Generally speaking, looking at the quantity of memory on a console in isolation doesn't translate, because the pipeline a PC uses to render the same game is different. 

 

We know from years of data that it generally takes X% more PC system power to drive a similar console experience. Consoles are becoming more like PCs, but they're still different. 12GB will likely be the minimum for high-quality textures and assets at 4K, but I suspect the recommendation will be closer to 16GB in the next couple of years. You need that buffer for all those bad ports out there... 🙂

 

If Nvidia wants to resolve things, they'll do so when their shareholders lash out. For now, they can continue with the "deal with it" sentiment. What do we have... maybe 6-8 months more of interest rate hikes in the US? When we're at peak inflation, maybe, just maybe, the ship will start to correct itself. Consumers and businesses are already cutting spending, and Nvidia can only respond in one way: lower prices. On the flip side, Nvidia is thirsty right now, and they have a few fish on the line (tech companies with deep pockets throwing money at "AI"). Anyway, I digress with my tinfoil-hat tech/economics theory 😄

 

TL;DR -> PC gamers need to be aware of VRAM, memory, and storage requirements this gen if they want full detail settings. If you're not keeping up with 16GB of GPU VRAM, 32GB of system RAM, and an SSD, just get ready to lower your expectations and lean on settings like DLSS/FSR (lower rendering resolutions). 

I agree with all that. 

 

I'm definitely not a fan of what Nvidia is doing at all; that's why I bought a 7900 XT instead of a 4070 Ti. But when it comes to DLSS 3 and path tracing, I think this Cyberpunk 2077 tech preview is proving that they can do more with less under the right conditions. 

 

I guess my overall point is just that, although there is no argument that Nvidia is skimping on VRAM, there could be an argument for why that's not as big of a deal today with DLSS 2/3 to pick up the slack. 

Edited by UltraMega

You also have to realize that the coding for games on console and PC is vastly different. If it wasn't, you wouldn't get the poor ports to PC that crash and burn and have worse FPS until 17 patches have been released. 🤣 That's the only reason the previous-gen AMD APU in the consoles is able to make games look as good as it does.

 


2 hours ago, schuck6566 said:

You also have to realize that the coding for games on console and PC is vastly different. If it wasn't, you wouldn't get the poor ports to PC that crash and burn and have worse FPS until 17 patches have been released. 🤣 That's the only reason the previous-gen AMD APU in the consoles is able to make games look as good as it does.

 

It goes both ways. There are games ported from PC to console that don't get fully optimized in the port either. 

