
Rumored GeForce 5090 Specs


UltraMega

Quote

The RTX 5090 is projected to include 204 Streaming Multiprocessors (SMs), leading to a total core count of 26,112. It is expected to be equipped with 32Gbps GDDR7 memory operating over a 384-bit bus. This configuration contrasts with earlier considerations of a 512-bit bus. The L2 cache is anticipated to be 96MB. The GPU's core architecture, referred to as the Blackwell "GB202" gaming die, is described as having a 12 Graphics Processing Clusters (GPCs) x 8 Texture Processing Clusters (TPCs) configuration.

 

Each TPC is expected to contain 2 SMs, each with 128 FP32 cores. This structure leads to a theoretical core count of 24,576 (192 SMs) for a fully enabled GB202 die. However, manufacturing efficiencies may necessitate disabling some SMs, reducing the expected core count to 24,046 or less. The RTX 5090's memory bandwidth is projected to be 1,536 GB/s, a considerable increase from the 1,008 GB/s of the RTX 4090. This improvement represents a 50% enhancement in memory throughput and approximately a 60% increase in shader density. The overall performance uplift is anticipated to be up to 2x or more compared to its predecessor.

https://www.guru3d.com/story/rumored-nvidia-rtx-5090-specs-cores-gddr7-memory-and-bandwidth/
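
For anyone who wants to sanity-check the rumor's arithmetic, here is a quick back-of-the-envelope sketch in Python, using only the figures quoted above (128 FP32 cores per SM is the usual figure for recent Nvidia architectures; none of this is confirmed spec):

```python
# Back-of-the-envelope check of the rumored RTX 5090 figures.
FP32_PER_SM = 128                        # FP32 cores per SM on recent Nvidia parts

# Full GB202 die per the rumor: 12 GPCs x 8 TPCs x 2 SMs per TPC
full_sms = 12 * 8 * 2
print(full_sms, full_sms * FP32_PER_SM)  # 192 SMs -> 24,576 cores

# The 204 SM / 26,112-core claim is internally consistent (204 * 128 = 26,112)
# but doesn't fit the 12 x 8 x 2 = 192 SM layout quoted above, so one of the
# two rumored numbers has to be off. (The cut-down "24,046" figure isn't a
# multiple of 128 either, so treat it as approximate.)
print(204 * FP32_PER_SM)                 # 26,112

# Memory bandwidth: 32 Gbps per pin x 384 pins / 8 bits per byte
bandwidth = 32 * 384 / 8
print(bandwidth)                         # 1536.0 GB/s
print(f"{bandwidth / 1008 - 1:.0%}")     # ~52% over the 4090's 1,008 GB/s
```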

 

 

Hopefully they let people pay by donating their organs directly to Nvidia this time. Cut out the middle man. 


I wonder what the memory density will be per chip by that point. Are we talking another 24GB card? Maybe 36GB or 48GB? 

 

Are they really saying 192-bit now for the 5080? That's a low blow. While I am sure the increased L2 helps, it's clear time and again that performance at higher resolutions, you know, the ones you'd use if buying top-tier products, definitely gets hampered by a 192-bit bus. The 4070 Ti is a great example. Sure, it works at 4K, but if it were a 256-bit card, it would work even better. For the money spent, that's definitely performance left on the table because of that drastic cut-down.

 

AMD figured that out. Notice RDNA3 has a smaller Infinity Cache and larger traditional bus than RDNA2 did.
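
As a rough guide to the density question: GDDR uses one 32-bit channel per chip, so capacity is just (bus width / 32) x per-chip density, doubled if the board runs chips on both sides (clamshell, like the 3090). A minimal sketch, assuming the commonly discussed 16Gb and 24Gb GDDR7 densities:

```python
# VRAM capacity from bus width and per-chip density (Gbit).
def vram_gb(bus_width_bits: int, gbit_per_chip: int, clamshell: bool = False) -> float:
    chips = bus_width_bits // 32 * (2 if clamshell else 1)
    return chips * gbit_per_chip / 8

print(vram_gb(384, 16))          # 24.0 GB -- a 4090-style configuration
print(vram_gb(384, 24))          # 36.0 GB -- if 24Gb GDDR7 chips materialize
print(vram_gb(384, 16, True))    # 48.0 GB -- clamshell, 3090-style
print(vram_gb(192, 16))          # 12.0 GB -- the worry with a 192-bit 5080
```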

Edited by Sir Beregond

I really doubt they will give us more than 24GB anytime soon. Wouldn't want to compete with their own professional GPUs. Also, with how hard they push DLSS and frame gen, I think it's safe to say they will keep relying on software magic instead of giving us stronger hardware.

 

I totally understand cutting down specs for lower-tier cards, but it's REALLY annoying how hard they gimp their lineup starting with the XX80 SKUs. When Ampere was first announced I was really hoping for a 20GB 3080. I ended up buying a 3090 because I wanted it for my 4K OLED and wasn't going to buy a 10GB "high end" GPU.

 

Interestingly, the 4070 Ti Super is getting upgraded to 16GB of VRAM on a 256-bit bus. If it just replaced the current 4070 Ti, it would probably be the best-value Nvidia card on the market. Knowing them, they will sell it alongside the current cards and jack the price up.

 

 


1 hour ago, Sir Beregond said:

I wonder what the memory density will be per chip by that point. Are we talking another 24GB card? Maybe 36GB or 48GB? 

 

Are they really saying 192-bit now for the 5080? That's a low blow. While I am sure the increased L2 helps, it's clear time and again that performance at higher resolutions, you know, the ones you'd use if buying top-tier products, definitely gets hampered by a 192-bit bus. The 4070 Ti is a great example. Sure, it works at 4K, but if it were a 256-bit card, it would work even better. For the money spent, that's definitely performance left on the table because of that drastic cut-down.

 

AMD figured that out. Notice RDNA3 has a smaller Infinity Cache and larger traditional bus than RDNA2 did.

With you on that. The 5090 looks like a powerhouse, but it will also cost powerhouse prices, and I think many people will not want to opt for the top-end tier because of that. As usual, though, Nvidia is forcing people to heavily consider the 5090 if it proceeds with gimping the 5080 with a 192-bit bus. This is nothing new from Nvidia, to be perfectly honest. Considering the price the 5080 will end up being... a 192-bit bus will be an impossible pill to swallow.

 

Back when I got my 3090, I went that route because I found the lack of VRAM on the lower-tier line-up concerning; that was a large determining factor (4K and productivity work, etc.). Though I did not NEED 24GB, and would likely have been fine with 16GB, Nvidia's game plan no doubt works.

 

Due to life being more expensive these days, something like a 5080/5090 would likely be out of reach anyway.


27 minutes ago, Fluxmaven said:

I really doubt they will give us more than 24GB anytime soon. Wouldn't want to compete with their own professional GPUs. Also, with how hard they push DLSS and frame gen, I think it's safe to say they will keep relying on software magic instead of giving us stronger hardware.

 

I totally understand cutting down specs for lower-tier cards, but it's REALLY annoying how hard they gimp their lineup starting with the XX80 SKUs. When Ampere was first announced I was really hoping for a 20GB 3080. I ended up buying a 3090 because I wanted it for my 4K OLED and wasn't going to buy a 10GB "high end" GPU.

 

Interestingly, the 4070 Ti Super is getting upgraded to 16GB of VRAM on a 256-bit bus. If it just replaced the current 4070 Ti, it would probably be the best-value Nvidia card on the market. Knowing them, they will sell it alongside the current cards and jack the price up.

 

 

 

Nvidia skimping on VRAM is what pushed me to AMD. I had a 1080 Ti at the time, an 11GB card I got for $700 back when it was the best gaming card money could buy. When the 2000 series came out and the 2080 Ti was so much more expensive but hardly any faster in raster, and didn't even come with a VRAM upgrade, going with AMD felt like the only thing that made sense. I didn't want to go from an 11GB card to another 11GB card, or to one with even less VRAM, if I wanted to spend about the same on a new card as I did on the 1080 Ti. Going with AMD meant I could get more VRAM and spend less than before. Seemed like a no-brainer.

But this was also peak COVID GPU-shortage time, and I was just lucky enough to get a card at MSRP from AMD's website. There was a lot of other stuff going on in the market at that time.


That said, I think it's pretty unlikely Nvidia could get away with a third gen of GPUs that doesn't up the VRAM.

Edited by UltraMega

1 hour ago, Fluxmaven said:

I really doubt they will give us more than 24GB anytime soon. Wouldn't want to compete with their own professional GPUs. Also, with how hard they push DLSS and frame gen, I think it's safe to say they will keep relying on software magic instead of giving us stronger hardware.

 

I totally understand cutting down specs for lower-tier cards, but it's REALLY annoying how hard they gimp their lineup starting with the XX80 SKUs. When Ampere was first announced I was really hoping for a 20GB 3080. I ended up buying a 3090 because I wanted it for my 4K OLED and wasn't going to buy a 10GB "high end" GPU.

 

Interestingly, the 4070 Ti Super is getting upgraded to 16GB of VRAM on a 256-bit bus. If it just replaced the current 4070 Ti, it would probably be the best-value Nvidia card on the market. Knowing them, they will sell it alongside the current cards and jack the price up.

 

 

Yeah, but they'll still have the problem of a "70"-branded card being $800, which is just asinine. While they are certainly going the right direction on performance per dollar with the Supers, the stack still feels wrong. Just last gen, the 4070 Ti being a performance match for the previous-gen 3090 Ti would make it seem more like it should be a 4070, branding-wise (and pricing-wise). The 3070 matched the 2080 Ti, the 2070 matched the 1080 Ti, and so on; that has been the precedent.

 

And then the actual mainstream 60-class cards are just stagnant, so there are literally zero worthwhile cards to buy below $550, and anything over $550 is still misbranded and overpriced.

 

As for the VRAM issue: perhaps, perhaps not. If the next-gen cards are coming out in late 2024 or early 2025 and are expected to last 2+ years as a generation, I'm not so sure they can go another gen without upgrading VRAM. That said, if they keep skimping on the memory bus in favor of L2, that does sort of limit the options. It also makes me somewhat question this 192-bit rumor for the 5080. Unless they are doubling density again and somehow making it a 24GB card over 6 chips (or maybe front-and-back chips like a 3090), it seems off to me that they'd go from the 16GB 4080 to a 12GB 5080. Seems to me that would get crucified.


53 minutes ago, Sir Beregond said:

Yeah, but they'll still have the problem of a "70"-branded card being $800, which is just asinine. While they are certainly going the right direction on performance per dollar with the Supers, the stack still feels wrong. Just last gen, the 4070 Ti being a performance match for the previous-gen 3090 Ti would make it seem more like it should be a 4070, branding-wise (and pricing-wise). The 3070 matched the 2080 Ti, the 2070 matched the 1080 Ti, and so on; that has been the precedent.

 

And then the actual mainstream 60-class cards are just stagnant, so there are literally zero worthwhile cards to buy below $550, and anything over $550 is still misbranded and overpriced.

 

As for the VRAM issue: perhaps, perhaps not. If the next-gen cards are coming out in late 2024 or early 2025 and are expected to last 2+ years as a generation, I'm not so sure they can go another gen without upgrading VRAM. That said, if they keep skimping on the memory bus in favor of L2, that does sort of limit the options. It also makes me somewhat question this 192-bit rumor for the 5080. Unless they are doubling density again and somehow making it a 24GB card over 6 chips (or maybe front-and-back chips like a 3090), it seems off to me that they'd go from the 16GB 4080 to a 12GB 5080. Seems to me that would get crucified.

 

The problem is that at this point people have proved they will pay for it. Pascal was the last gen that made sense price-wise. Turing came along with a big price hike, barely any raw performance uplift, and a huge focus on ray tracing and various software add-ons that nobody asked for. Back then we had one of the mining booms, which kept prices sky-high all the way until Ampere came along with that cute promise of 2080 Ti performance for $500. A handful of lucky people got in on an original-MSRP 3070 or 3080. Most people got shafted because they waited and waited and waited to upgrade, just for another mining boom paired with a global pandemic to make affordable GPUs a distant dream.

 

Now, big surprise, they follow that up with the most mispriced lineup of cards ever. While people * and moan constantly about how much they hate them, they still keep buying it all up, mostly because you had a bunch of people still holding on to those Pascal or even Maxwell cards, waiting for a decently priced card to come along, and it just never did.

 

As for the VRAM thing, if we get more, it'll be in the form of a new $3,000 Titan. They don't like doubling flagship VRAM without making some comically priced "best card in the universe". I think the lower-tier cards will continue to get bumped up. I can see 12GB as the new minimum offering, with more cards in the 16, 20, and 24GB range. I just don't see them giving us a 48GB 5090 as a direct replacement for the 4090 at the same price level. They should, but I'm sure they will find a way to make it a more expensive product, then serve up the cut-down version at still-absurd prices.

 

AMD should have priced more aggressively to undercut Nvidia's poor pricing/naming this gen, but they are happy to just sell what they sell of their also-overpriced cards.


On 01/01/2024 at 00:51, Fluxmaven said:

 

The problem is that at this point people have proved they will pay for it. Pascal was the last gen that made sense price-wise. Turing came along with a big price hike, barely any raw performance uplift, and a huge focus on ray tracing and various software add-ons that nobody asked for. Back then we had one of the mining booms, which kept prices sky-high all the way until Ampere came along with that cute promise of 2080 Ti performance for $500. A handful of lucky people got in on an original-MSRP 3070 or 3080. Most people got shafted because they waited and waited and waited to upgrade, just for another mining boom paired with a global pandemic to make affordable GPUs a distant dream.

 

Now, big surprise, they follow that up with the most mispriced lineup of cards ever. While people * and moan constantly about how much they hate them, they still keep buying it all up, mostly because you had a bunch of people still holding on to those Pascal or even Maxwell cards, waiting for a decently priced card to come along, and it just never did.

 

As for the VRAM thing, if we get more, it'll be in the form of a new $3,000 Titan. They don't like doubling flagship VRAM without making some comically priced "best card in the universe". I think the lower-tier cards will continue to get bumped up. I can see 12GB as the new minimum offering, with more cards in the 16, 20, and 24GB range. I just don't see them giving us a 48GB 5090 as a direct replacement for the 4090 at the same price level. They should, but I'm sure they will find a way to make it a more expensive product, then serve up the cut-down version at still-absurd prices.

 

AMD should have priced more aggressively to undercut Nvidia's poor pricing/naming this gen, but they are happy to just sell what they sell of their also-overpriced cards.

 

Unfortunately, yes. People are not voting with their wallets. However, on the consumer side, or should I say the professional/prosumer side, Nvidia has a good chokehold with its penetration into hardware acceleration, whether it be CUDA or other related Nvidia technology. As a CAD and 3D-scanning user, I cannot/will not touch AMD. From prior and recent experience, Solidworks performs much better on Nvidia GPUs and the reliability is far better. I'm not saying you simply cannot use AMD GPUs with Solidworks; you can, and I have, but the experience with an Nvidia GPU is superior.

 

Furthermore, our 3D scanner at work will not run on AMD GPU hardware PERIOD; it is literally designed for Nvidia only, and after speaking with the makers, they told me they could not get reliable performance from AMD hardware. So some users such as myself have no choice but to go Nvidia, regardless of the price 😞. This is why I would love Intel... or whoever... to really disrupt the GPU market and give us another contender. That, or AMD really pulls itself together in the prosumer/professional segment and gives us competition as reliable as Nvidia on hardware acceleration and APIs.

 

So, some people certainly have the luxury of voting out of Nvidia, but some of us don't 😞


48 minutes ago, ENTERPRISE said:

 

Unfortunately, yes. People are not voting with their wallets. However, on the consumer side, or should I say the professional/prosumer side, Nvidia has a good chokehold with its penetration into hardware acceleration, whether it be CUDA or other related Nvidia technology. As a CAD and 3D-scanning user, I cannot/will not touch AMD. From prior and recent experience, Solidworks performs much better on Nvidia GPUs and the reliability is far better. I'm not saying you simply cannot use AMD GPUs with Solidworks; you can, and I have, but the experience with an Nvidia GPU is superior.

 

Furthermore, our 3D scanner at work will not run on AMD GPU hardware PERIOD; it is literally designed for Nvidia only, and after speaking with the makers, they told me they could not get reliable performance from AMD hardware. So some users such as myself have no choice but to go Nvidia, regardless of the price 😞. This is why I would love Intel... or whoever... to really disrupt the GPU market and give us another contender. That, or AMD really pulls itself together in the prosumer/professional segment and gives us competition as reliable as Nvidia on hardware acceleration and APIs.

 

So, some people certainly have the luxury of voting out of Nvidia, but some of us don't 😞

 

That was another tangent I was going to go down, but figured I'd typed enough the other day 😂

 

A lot of things just work better on Nvidia, or only work on Nvidia. My roommate does 3D modeling and the plugins he uses only work on Nvidia. There are some newer options that support AMD, but that involves paying for and then learning different software, which isn't really feasible.

 

When the current gen of cards was coming out I was thinking he'd at least be able to grab a 7900 XTX for his VR rig... In typical AMD fashion, that came with 6 months of waiting on driver fixes. Not something that sits well when you drop $1,000+ on a new card.

 

I personally enjoy donating resources to the Folding@home project and Nvidia is just head and shoulders above in that department. 

 

It'll be a while before the dynamic at the high end shifts. AMD has been content with letting Nvidia dominate the upper end of the market. If Arc can get more traction and steal some of that lower to midrange market, AMD will have to step up in some way. That could just be with prices. But they might decide to get serious and make something special.


2 hours ago, ENTERPRISE said:

 

Unfortunately, yes. People are not voting with their wallets. However, on the consumer side, or should I say the professional/prosumer side, Nvidia has a good chokehold with its penetration into hardware acceleration, whether it be CUDA or other related Nvidia technology. As a CAD and 3D-scanning user, I cannot/will not touch AMD. From prior and recent experience, Solidworks performs much better on Nvidia GPUs and the reliability is far better. I'm not saying you simply cannot use AMD GPUs with Solidworks; you can, and I have, but the experience with an Nvidia GPU is superior.

 

Furthermore, our 3D scanner at work will not run on AMD GPU hardware PERIOD; it is literally designed for Nvidia only, and after speaking with the makers, they told me they could not get reliable performance from AMD hardware. So some users such as myself have no choice but to go Nvidia, regardless of the price 😞. This is why I would love Intel... or whoever... to really disrupt the GPU market and give us another contender. That, or AMD really pulls itself together in the prosumer/professional segment and gives us competition as reliable as Nvidia on hardware acceleration and APIs.

 

So, some people certainly have the luxury of voting out of Nvidia, but some of us don't 😞

 

I am a bit surprised about Solidworks being so poor on AMD in your experience. It wasn't that long ago (10 years ago?) that AMD was much more affordable and simply worked very well for Solidworks. I think at the time they had better OpenGL performance than competing Nvidia cards, but I could be wrong. That said, I've seen the same stranglehold from Nvidia in the professional/visualization market on the hardware side. HP, Lenovo, and Dell (the big 3), which basically own the OEM market for workstations, typically pair with Nvidia. Google the Z, P, or Precision workstations for context. I can't recall seeing a workstation come from one of them with AMD graphics out of the box in a long time, or why a business would opt for AMD specifically. Anyway, the OEM market is fairly small these days in comparison to the data center/enterprise market and the now-smaller gaming market. Intel and AMD need to compete there; otherwise Nvidia will keep its stranglehold on the entire market and you'll continue to see its dominance across the board.

 

On 31/12/2023 at 13:02, Sir Beregond said:

I wonder what the memory density will be per chip by that point. Are we talking another 24GB card? Maybe 36GB or 48GB? 

 

Are they really saying 192-bit now for the 5080? That's a low blow. While I am sure the increased L2 helps, its clear time and again that at higher resolution, you know like the ones you'd use if buying top tier products, definitely gets hampered by a 192-bit bus limiting performance potential. 4070 Ti is a great example. Sure it works at 4k, but if it was a 256-bit card, it would work even better. For the money spent, its definitely performance left on the table because of that drastic cut-down.

 

AMD figured that out. Notice RDNA3 has a smaller Infinity Cache and larger traditional bus than RDNA2 did.

 

 

I've always figured that going to a narrower bus, whilst sacrificing bandwidth, was a win for power consumption. I know this generation (Ada) was a poor example, but discussing bus width without the full context of the card's features, like cache, is technically jumping the gun. Even if it's a clear red flag to us power users.

 

Yes, it's not a good sign for memory-intensive applications, especially gaming at higher resolutions, but who knows. My gut feeling, and probably everyone else's here too, is that Nvidia will trim the hardware to maximize profit per SKU and rely more on the "AI" solutions and other technologies they've already invested heavily into to mislead, er, convince consumers about their superior performance. We're a vocal minority, unfortunately. If most consumers are OK with upscaled resolutions to unlock other features (ray tracing), I suspect this trend will continue. Why increase power consumption (poor marketing), chase resolution, and offer better bandwidth when "4K gaming" is technically OK on a 192-bit bus with 4K output at DLSS Performance (1080p render) or Quality (1440p render)? Nvidia knows those are the features people pay for: DLSS, ray tracing. That's why they're in big shiny letters on the box.
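
For reference, those render resolutions fall straight out of Nvidia's published DLSS scale factors; a minimal sketch:

```python
# Internal render resolution behind a "4K" DLSS output.
# Per-axis scale factors: Quality = 2/3, Balanced ~ 0.58, Performance = 1/2.
DLSS_MODES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

out_w, out_h = 3840, 2160
for mode, s in DLSS_MODES.items():
    print(f"{mode}: {round(out_w * s)}x{round(out_h * s)} -> {out_w}x{out_h}")
# Quality: 2560x1440, Performance: 1920x1080 -- the figures mentioned above.
```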

 

Either way, get ready for more "3-4x the perf of Ampere and Ada" charts...


8 hours ago, Slaughtahouse said:

I've always figured that going to a narrower bus, whilst sacrificing bandwidth, was a win for power consumption. I know this generation (Ada) was a poor example, but discussing bus width without the full context of the card's features, like cache, is technically jumping the gun. Even if it's a clear red flag to us power users.

 

Yes, it's not a good sign for memory-intensive applications, especially gaming at higher resolutions, but who knows. My gut feeling, and probably everyone else's here too, is that Nvidia will trim the hardware to maximize profit per SKU and rely more on the "AI" solutions and other technologies they've already invested heavily into to mislead, er, convince consumers about their superior performance. We're a vocal minority, unfortunately. If most consumers are OK with upscaled resolutions to unlock other features (ray tracing), I suspect this trend will continue. Why increase power consumption (poor marketing), chase resolution, and offer better bandwidth when "4K gaming" is technically OK on a 192-bit bus with 4K output at DLSS Performance (1080p render) or Quality (1440p render)? Nvidia knows those are the features people pay for: DLSS, ray tracing. That's why they're in big shiny letters on the box.

 

Either way, get ready for more "3-4x the perf of Ampere and Ada" charts...

 

I mean, I think lowering traditional bandwidth and replacing it with cache or other solutions is fine for some segments, but it's another thing entirely when they do it for literally every SKU that isn't the top-end halo product, at a time when more people are checking out ultrawides, 1440p, or 4K. I don't think I am jumping the gun in voicing my displeasure at that trend, as we are already seeing it with the 40-series, even if we don't really know what the 50-series will be yet.

 

Yeah, I don't think you are wrong in your assessment here. But at the same time, gamers have been pretty vocal about VRAM this year, so it will be interesting to see how they navigate that. If they shipped another 8GB card next year, or 12GB in certain segments, is that really going to be seen as OK?

 

Anyway, it's not just about high resolutions either. Why are cards marketed as 1080p cards suddenly sitting at $500 price points, with a clear stagnation of performance improvements over the previous generation? Talking about the 4060 Ti and 4060 here. 1080p is considered a basic resolution by now, if not an old one, and the idea that a good card for it isn't also getting cheaper doesn't make much sense.

 

Anyway, don't mind me, I hardly ever miss an opportunity to rant about graphics cards.

Edited by Sir Beregond

18 hours ago, Sir Beregond said:

 

I mean, I think lowering traditional bandwidth and replacing it with cache or other solutions is fine for some segments, but it's another thing entirely when they do it for literally every SKU that isn't the top-end halo product, at a time when more people are checking out ultrawides, 1440p, or 4K. I don't think I am jumping the gun in voicing my displeasure at that trend, as we are already seeing it with the 40-series, even if we don't really know what the 50-series will be yet.

 

Yeah, I don't think you are wrong in your assessment here. But at the same time, gamers have been pretty vocal about VRAM this year, so it will be interesting to see how they navigate that. If they shipped another 8GB card next year, or 12GB in certain segments, is that really going to be seen as OK?

 

Anyway, it's not just about high resolutions either. Why are cards marketed as 1080p cards suddenly sitting at $500 price points, with a clear stagnation of performance improvements over the previous generation? Talking about the 4060 Ti and 4060 here. 1080p is considered a basic resolution by now, if not an old one, and the idea that a good card for it isn't also getting cheaper doesn't make much sense.

 

Anyway, don't mind me, I hardly ever miss an opportunity to rant about graphics cards.

 

Applying critical thinking to these discussions is sometimes not possible when speaking about Nvidia cards and pricing. I also believe there is nothing wrong with a good solid rant 🙂 I feel that Nvidia rolled out of the crypto bed and woke up in the AI bed, and unfortunately is extending all of this pain to consumers. The 4060 Ti and 4060 were absolutely ridiculously poor cards. I bought my 3060 Ti in 2020 and paid $400 for it. For Nvidia to come out with an updated model that performs nearly identically across all resolutions, and worse at 4K, was absolutely insane. Then the grift tax of slapping 16GB onto a 128-bit card for $500, to convince misinformed consumers that extra VRAM will solve their problem, shows Nvidia's true colours once again.

 

Their marketing department is absolutely on top of its game right now... I went AMD for the first time and I am generally happy about it, but it doesn't bring me "joy" to spend 1,000 USD on a 7900 XTX. More and more RT titles will be coming out, and unless AMD solves that problem, the gap will widen immensely with the next gen. I fully foresee going back to Nvidia, but it really comes down to price and value for me.

 

I would hope for Blackwell we see...

  • 5090 with 24GB+ VRAM and 30-60% more perf than the 4090 -> $1,000 to $1,200
  • 5080 with 20GB+ VRAM and a 10-15% perf improvement over the 4090 -> $700
  • 5070 with 16GB VRAM and 4080-to-4090-level performance -> $500
  • 5060 with 12GB VRAM and 4070-to-4070 Ti performance -> $300

For all the other milking SKUs (Ti / Super) they can add their 3-7% performance uplift at whatever scalping price they want, but those should be the approximate performance and VRAM per segment at launch, if Blackwell does release this fall/winter. Anything less in performance, or more in cost, will be seriously questionable.
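
To put that rule of thumb in numbers, here is a minimal sketch of the 30-50% perf-per-dollar test. The 4090's $1,599 and 4080's $1,199 launch prices are real; the performance figures plugged in are the wish-list numbers above plus an assumed ~0.75x raster gap between the 4080 and 4090, not measurements:

```python
# Rule of thumb: the same money should buy 30-50% more performance
# than it did last generation (perf normalized so the 4090 = 1.0).
def perf_per_dollar_gain(new_perf, new_price, old_perf, old_price):
    return (new_perf / new_price) / (old_perf / old_price) - 1

# Wish-list 5090: ~1.45x a 4090 for $1,100, vs the 4090 at $1,599.
print(f"{perf_per_dollar_gain(1.45, 1100, 1.00, 1599):.0%}")  # ~111%

# Wish-list 5080: ~1.10x a 4090 for $700, vs the 4080 (assumed 0.75x) at $1,199.
print(f"{perf_per_dollar_gain(1.10, 700, 0.75, 1199):.0%}")   # ~151%
```

Both wishes clear the 30-50% bar by a wide margin, which is another way of saying how far current pricing has drifted from it.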

Edited by Slaughtahouse

10 minutes ago, Slaughtahouse said:

 

Applying critical thinking to these discussions is sometimes not possible when speaking about Nvidia cards and pricing. I also believe there is nothing wrong with a good solid rant 🙂 I feel that Nvidia rolled out of the crypto bed and woke up in the AI bed, and unfortunately is extending all of this pain to consumers. The 4060 Ti and 4060 were absolutely ridiculously poor cards. I bought my 3060 Ti in 2020 and paid $400 for it. For Nvidia to come out with an updated model that performs nearly identically across all resolutions, and worse at 4K, was absolutely insane. Then the grift tax of slapping 16GB onto a 128-bit card for $500, to convince misinformed consumers that extra VRAM will solve their problem, shows Nvidia's true colours once again.

 

Their marketing department is absolutely on top of its game right now... I went AMD for the first time and I am generally happy about it, but it doesn't bring me "joy" to spend 1,000 USD on a 7900 XTX. More and more RT titles will be coming out, and unless AMD solves that problem, the gap will widen immensely with the next gen. I fully foresee going back to Nvidia, but it really comes down to price and value for me.

 

I would hope for Blackwell we see...

  • 5090 with 24GB+ VRAM and 30-60% more perf than the 4090 -> $1,000 to $1,200
  • 5080 with 20GB+ VRAM and a 10-15% perf improvement over the 4090 -> $700
  • 5070 with 16GB VRAM and 4080-to-4090-level performance -> $500
  • 5060 with 12GB VRAM and 4070-to-4070 Ti performance -> $300

For all the other milking SKUs (Ti / Super) they can add their 3-7% performance uplift at whatever scalping price they want, but those should be the approximate performance and VRAM per segment at launch, if Blackwell does release this fall/winter. Anything less in performance, or more in cost, will be seriously questionable.

Well said.

 

Man, I want to give AMD a try, but it still feels second-rate to me. Sure, the raster is there, but everything else, not so much. I used to be mainly a Radeon guy back in the ATi days and with the 5000 series back in the Fermi era, but it just doesn't feel good to spend that much and feel like I am missing features like RT / DLSS. The couple of times I tried FSR it looked downright terrible. It would be nice if AMD cared to do more than just "good enough" to slot into Nvidia's BS pricing. I don't think the 7900 XTX and 7900 XT MSRPs made any sense at all; they were just a "well, this fits into Nvidia's pricing stack" move.

 

The only reason the 7900 XTX was $999 was because the 4080 was $1199. 😂

Edited by Sir Beregond

Share on other sites

Considering the lack of feature parity, I don't see how AMD gets off pricing its second-rate cards as high as it does. The 7900 XTX is an alright card if you just look at raw raster, but being a gen behind on RT and upscaling tech while slotting into the price bracket between a 4070 Ti and a 4080 just seems outright lazy.

I just wish they would undercut Nvidia or actually step up and make a 4090-level card.

 

 


9 hours ago, Fluxmaven said:

Considering the lack of feature parity, I don't see how AMD gets off pricing its second-rate cards as high as it does. The 7900 XTX is an alright card if you just look at raw raster, but being a gen behind on RT and upscaling tech while slotting into the price bracket between a 4070 Ti and a 4080 just seems outright lazy.

I just wish they would undercut Nvidia or actually step up and make a 4090-level card.

 

 

Yep exactly my thought.

 

I fully acknowledge that some don't care about said features and just want the raster, but I still feel like they are/were overpriced for that too. As much as I hate what Nvidia is doing, AMD hasn't exactly won me over either.


On 03/01/2024 at 10:26, Slaughtahouse said:

Snip

I would hope for Blackwell we see...

  • 5090 with 24GB+ VRAM and 30-60% more perf than the 4090 -> $1,000 to $1,200
  • 5080 with 20GB+ VRAM and a 10-15% perf improvement over the 4090 -> $700
  • 5070 with 16GB VRAM and 4080-to-4090-level performance -> $500
  • 5060 with 12GB VRAM and 4070-to-4070 Ti performance -> $300

 

I think it would be awesome if these improvements and costs came to be!

The 5080 for $700 would be a fair and reasonable price.

 

However, unfortunately, I think the 5090's price will be much higher than the $1,000 to $1,200 range.

You know Nvidia will charge as much as they can for the top card.


47 minutes ago, Barefooter said:

 

I think it would be awesome if these improvements and costs came to be!

The 5080 for $700 would be a fair and reasonable price.

 

However, unfortunately, I think the 5090's price will be much higher than the $1,000 to $1,200 range.

You know Nvidia will charge as much as they can for the top card.

 

For sure 👍 

 

I tried to use a fair scheme based on general rules of thumb. Not necessarily Moore's law, but the idea that relative performance should improve by 30-50%+ for the same $ spent as on the previous gen.

 

It’s just getting out of hand… 

 

Rant incoming: Banks are still keeping interest rates high in an effort to drain bank accounts and kill the total number of jobs available on the market (demand), so at some point we must hit a breaking point. I just can't understand how consumers can keep spending this much on GPU hardware. It's not just the high-end SKUs, it's the entire stack. Here in Canada, everyone who signed a mortgage when rates bottomed out in 2020/2021 will need to renew their mortgage (max 5-year term) at a new rate. So 2025/2026 should be fun. Either rates come down or people will continue to lose their homes. Something could be said about taking on a bigger mortgage than you can sustain when rates were stupidly low, but banks here are federally regulated and fairly risk-averse. Yet some people took a variable rate when rates were good, so IDK…

 

Not sure exactly how it works in the US, but my understanding is that your rate is locked in for the entire term (25 years, etc.). So unless you need to sell your house for some external factor (losing a job, having a child, moving closer to another school/employer, etc.), you're locked in.

 

/end rant

 

Anyway, I know that doesn't justify these prices from Nvidia, but when EVERYTHING is getting more expensive, which didn't happen in the 2008 financial crisis, I just can't believe this model will be sustainable under current market conditions. It doesn't help that AMD isn't expected to compete in the high end this year with RDNA4 either, so who knows. Nvidia is pulling in record revenue and profits because of "AI", but I suspect companies will only invest so much into AI before they slow their spending and hit Nvidia's bottom line. Maybe that will help the gaming segment, but I have no idea when that bubble will burst, or if it even will.

 

Bleak times ahead…


19 hours ago, Slaughtahouse said:

 

For sure 👍 

 

I tried to use a fair scheme based on general rules of thumb. Not necessarily Moore's law, but the idea that relative performance should improve by 30-50%+ for the same $ spent as on the previous gen.

 

It’s just getting out of hand… 

 

Rant incoming: Banks are still keeping interest rates high in an effort to drain bank accounts and kill the total number of jobs available on the market (demand), so at some point we must hit a breaking point. I just can't understand how consumers can keep spending this much on GPU hardware. It's not just the high-end SKUs, it's the entire stack. Here in Canada, everyone who signed a mortgage when rates bottomed out in 2020/2021 will need to renew their mortgage (max 5-year term) at a new rate. So 2025/2026 should be fun. Either rates come down or people will continue to lose their homes. Something could be said about taking on a bigger mortgage than you can sustain when rates were stupidly low, but banks here are federally regulated and fairly risk-averse. Yet some people took a variable rate when rates were good, so IDK…

 

Not sure exactly how it works in the US, but my understanding is that your rate is locked in for the entire term (25 years, etc.). So unless you need to sell your house for some external factor (losing a job, having a child, moving closer to another school/employer, etc.), you're locked in.

 

/end rant

 

Anyway, I know that doesn't justify these prices from Nvidia, but when EVERYTHING is getting more expensive, which didn't happen in the 2008 financial crisis, I just can't believe this model will be sustainable under current market conditions. It doesn't help that AMD isn't expected to compete in the high end this year with RDNA4 either, so who knows. Nvidia is pulling in record revenue and profits because of "AI", but I suspect companies will only invest so much into AI before they slow their spending and hit Nvidia's bottom line. Maybe that will help the gaming segment, but I have no idea when that bubble will burst, or if it even will.

 

Bleak times ahead…

Yeah, this is it. From a financial point of view, I'm not looking forward to the next few years as a homeowner, bill-payer, parent, and tech enthusiast, with how much things are costing. Something will have to give, and usually it's the hobbies we enjoy... go figure.


20 minutes ago, ENTERPRISE said:

Yeah, this is it. From a financial point of view, I'm not looking forward to the next few years as a homeowner, bill-payer, parent, and tech enthusiast, with how much things are costing. Something will have to give, and usually it's the hobbies we enjoy... go figure.

 

Kinda a side note that your comment made me think about: what do you guys usually do with your old GPU when you upgrade? For a long time now I have been in the habit of selling my old GPU right after I get a new one, so the price to upgrade is only ever the difference between what I can sell the old one for and what the new one costs.

 

Because of COVID/mining pricing, when I sold my 1080 Ti and upgraded to an RX 6800, I actually made a little bit of money (like $10). After that I sold the RX 6800 for $400 and bought the 7900 XT for $800.

 

Do you guys do this sort of thing? 
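
For what it's worth, the flip math is simple enough to write down; a tiny sketch using the resale numbers from this post:

```python
# Net cost of an upgrade when you sell the outgoing card first.
def net_upgrade_cost(new_price: float, old_sale_price: float) -> float:
    return new_price - old_sale_price

# Sold the RX 6800 for $400, bought the 7900 XT for $800:
print(net_upgrade_cost(800, 400))   # 400 -> the upgrade "cost" $400, not $800
```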


2 hours ago, UltraMega said:

 

Kinda a side note that your comment made me think about: what do you guys usually do with your old GPU when you upgrade? For a long time now I have been in the habit of selling my old GPU right after I get a new one, so the price to upgrade is only ever the difference between what I can sell the old one for and what the new one costs.

 

Because of COVID/mining pricing, when I sold my 1080 Ti and upgraded to an RX 6800, I actually made a little bit of money (like $10). After that I sold the RX 6800 for $400 and bought the 7900 XT for $800.

 

Do you guys do this sort of thing? 

 

I always flip. I try to sell before the next gen comes out if possible. 

 

If you're in the upper tiers, then beyond the initial investment it's more cost-effective to stay in the game: recover as much value from the current GPU as you can before jumping into the next gen.

 

So I may go to Blackwell and sell my 7900XTX for approximately 750 USD or so when the time is right.


On 05/01/2024 at 20:41, Slaughtahouse said:

 

For sure 👍 

 

I tried to use a fair scheme based on general rules of thumb. Not necessarily Moore's law, but the idea that relative performance should improve by 30-50%+ for the same $ spent as on the previous gen.

 

It’s just getting out of hand… 

 

Rant incoming: Banks are still keeping interest rates high in an effort to drain bank accounts and kill the total number of jobs available on the market (demand), so at some point we must hit a breaking point. I just can't understand how consumers can keep spending this much on GPU hardware. It's not just the high-end SKUs, it's the entire stack. Here in Canada, everyone who signed a mortgage when rates bottomed out in 2020/2021 will need to renew their mortgage (max 5-year term) at a new rate. So 2025/2026 should be fun. Either rates come down or people will continue to lose their homes. Something could be said about taking on a bigger mortgage than you can sustain when rates were stupidly low, but banks here are federally regulated and fairly risk-averse. Yet some people took a variable rate when rates were good, so IDK…

 

Not sure exactly how it works in the US, but my understanding is that your rate is locked in for the entire term (25 years, etc.). So unless you need to sell your house for some external factor (losing a job, having a child, moving closer to another school/employer, etc.), you're locked in.

 

/end rant

 

Anyway, I know that doesn't justify these prices from Nvidia, but when EVERYTHING is getting more expensive, which didn't happen in the 2008 financial crisis, I just can't believe this model will be sustainable under current market conditions. It doesn't help that AMD isn't expected to compete in the high end this year with RDNA4 either, so who knows. Nvidia is pulling in record revenue and profits because of "AI", but I suspect companies will only invest so much into AI before they slow their spending and hit Nvidia's bottom line. Maybe that will help the gaming segment, but I have no idea when that bubble will burst, or if it even will.

 

Bleak times ahead…

 

Well, I just learned something new about Canadian mortgages.

 

You have a variety of options with mortgages in the US, but the most popular ones are 30-year fixed rate, 15-year fixed, 5/1 adjustable rate (ARM), and 5/5 adjustable, which appears to be similar to the 5-year fixed rate in Canada except it auto-adjusts the rate every five years, so you're not doing any shopping around after the five years are up. Smart money treats ARMs as the option for people who move often rather than as a lower-upfront-interest-rate option. You can always refinance your mortgage to get a more favorable current market rate as long as the math works out for you. After 2021, though, the math won't work for the vast majority of people for a very long time.

 

I refinanced twice in 2021 to get a 2.375% 20-year fixed rate, which basically ensured that I'm not moving anywhere else, because that type of deal will probably never happen again and I'm happy with everything else about my living situation: the house's attributes, location, etc. I didn't need to do a second refinance in 2021 (the first one was 3.25% for 20 years), but with the way things were going that year in the midst of the pandemic, I had a strong feeling it would be a massive hedge against everything that might happen in the future. Now I can kind of sit back and ignore the chaos, although I never got the feeling of an impending doomsday from mortgage terms expiring like in Canada, because of the variety of mortgage options here.
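
For anyone curious what those refinances are worth, the standard fixed-rate amortization formula makes it concrete. The principal below is a made-up example; only the rates and the 20-year term come from the post:

```python
# Monthly payment on a fixed-rate mortgage:
# M = P * r * (1 + r)^n / ((1 + r)^n - 1), with monthly rate r and n payments.
def monthly_payment(principal: float, annual_rate: float, years: int) -> float:
    r = annual_rate / 12
    n = years * 12
    return principal * r * (1 + r) ** n / ((1 + r) ** n - 1)

p = 300_000  # hypothetical principal
print(f"3.25% / 20y:  ${monthly_payment(p, 0.0325, 20):,.2f}")   # ~$1,701
print(f"2.375% / 20y: ${monthly_payment(p, 0.02375, 20):,.2f}")  # ~$1,572
```

On those assumptions, the second refinance saves roughly $130 a month for the life of the loan.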

 

5 hours ago, UltraMega said:

 

Kinda a side note that your comment made me think about: what do you guys usually do with your old GPU when you upgrade? For a long time now I have been in the habit of selling my old GPU right after I get a new one, so the price to upgrade is only ever the difference between what I can sell the old one for and what the new one costs.

 

Because of COVID/mining pricing, when I sold my 1080 Ti and upgraded to an RX 6800, I actually made a little bit of money (like $10). After that I sold the RX 6800 for $400 and bought the 7900 XT for $800.

 

Do you guys do this sort of thing? 

 

I always sell off my old one. I had a bit of a hiccup reselling the 1080 Ti that I used for maybe half a year before the 3060 Ti, because the first buyer was a weasel who invented a BS reason to return it and forced me to retest it, which I seriously dragged my feet on. I did benefit a little from the pandemic/mining pricing, though, because I turned a small profit selling the 980 Ti I had prior to the 1080 Ti, but I also seriously overpaid for the 3060 Ti.

 

The 3060 Ti was the first time in a long time that I bought a brand new GPU, and that was mainly out of necessity due to getting my current second monitor (TV) and realizing that I needed a GPU that supported HDMI 2.1 to run 4K 120 Hz. Otherwise, I'd rather stay a generation behind so someone else can pay the premium of Nvidia's nonsense and AMD's complacency, and I got back on that cycle with the 3080 Ti when the 40 series came out. To be honest though, I don't play games as often as I imagine myself doing, so the entire stretch of going from 980 Ti -> 1080 Ti -> 3060 Ti -> 3080 Ti is a historical anomaly for me.

 

This time around, the problem is that the 40 series is so underwhelming that if I make any upgrade in the near future, the only GPU that might make any sense to me would be the 7900 XTX after the 8000 series comes out. I'll most likely be sitting it out for a bit since I don't foresee much gaming time over the next couple of years.

Edited by Snakecharmed

Back when I was married and only allowed to have one PC, I'd sell a card to offset upgrading to the next one... Now that I'm divorced, I have several PCs, and the old card trickles down to the other systems in the house when I upgrade the main rig. After rotating out of use, cards get stuck in the inventory of stuff for HWBOT comps that I barely ever compete in. 😂

 

At last check I have 20 Nvidia, 11 AMD, and 1 Intel Arc GPUs.

 

I grossly overpaid for my 3090 when I "won" the chance to buy it in a Newegg shuffle back in the GPU craziness, but I sold a few of my other cards for a profit to make up for it. 


12 hours ago, Snakecharmed said:

 

Well, I just learned something new about Canadian mortgages.

 

You have a variety of options with mortgages in the US, but the most popular ones are 30-year fixed rate, 15-year fixed, 5/1 adjustable rate (ARM), and 5/5 adjustable, which appears to be similar to the 5-year fixed rate in Canada except it auto-adjusts the rate every five years, so you're not doing any shopping around after the five years are up. Smart money treats ARMs as the option for people who move often rather than as a lower-upfront-interest-rate option. You can always refinance your mortgage to get a more favorable current market rate as long as the math works out for you. After 2021, though, the math won't work for the vast majority of people for a very long time.

Spoiler

Blah blah blah, me continuing to rant about mortgages

 

Thanks for sharing the additional context about the US market 🙂 I’ll stop derailing this thread 🙂

Edited by Slaughtahouse
Put spoiler tag... :P
