
4060 Ti 16GB Release on July 18th, AIB-Only


UltraMega


Quote

Sources suggest that the NVIDIA RTX 4060 Ti, equipped with 16GB of memory, is set to be released within the forthcoming two weeks. 
 

This enhanced variant of the RTX 4060 Ti, characterized by its expanded memory buffer, is projected to enter the order, shipment, and global distribution phase from July 18th onwards. The timeline and specifics related to the review embargo have not yet been clarified.
 

Originally released with 8GB of memory, the RTX 4060 Ti had a muted reception from gaming enthusiasts, largely attributed to its elevated $399 price tag. The upgraded model, featuring twice the memory, has a confirmed price tag of $499. Mirroring the original non-Ti RTX 4060, the 16GB Ti model will be solely procurable via add-in board (AIB) partners, indicating the absence of a Founders Edition directly distributed by NVIDIA.

 

NVIDIA RTX 4060 Ti 16GB Set to Release on July 18th, AIB-Exclusive Model (guru3d.com)



I always upvote Picard facepalm.

 


3 weeks later...

Update: 

 


 

Seems like Nvidia allowed the 16GB model to exist to address criticism that 8GB wasn't enough, but the 128-bit memory bus is just too constrained for the extra capacity to make a difference.

 

WWW.TECHSPOT.COM: "Despite our fondness for the GeForce RTX 4060 Ti, we can admit that just 8GB of VRAM is not enough for a $400 GPU..."
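The bandwidth point can be sanity-checked with quick arithmetic: peak bandwidth is bus width in bytes times the effective per-pin data rate. A minimal sketch, with bus widths and data rates taken from commonly reported launch specs (assumptions, not official confirmation):

```python
# Peak memory bandwidth = bus width (bytes) x effective per-pin data rate.
# Bus widths and data rates below are commonly reported launch specs
# (assumptions, not official confirmation).
def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    """Peak theoretical bandwidth in GB/s."""
    return bus_bits / 8 * gbps_per_pin

cards = [
    ("RTX 3060 Ti (256-bit, 14 Gbps GDDR6)", 256, 14.0),
    ("RTX 4060 Ti 8GB (128-bit, 18 Gbps GDDR6)", 128, 18.0),
    ("RTX 4060 Ti 16GB (128-bit, 18 Gbps GDDR6)", 128, 18.0),
]
for name, bus, rate in cards:
    print(f"{name}: {bandwidth_gb_s(bus, rate):.0f} GB/s")
```

Doubling the capacity leaves the 288 GB/s ceiling untouched, which is the point above: on these figures, both 4060 Ti variants trail the previous-gen 3060 Ti's 448 GB/s.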

 

 


Aside from the 4090, which itself is cut down significantly from the full die versus, say, the 3090 and full GA102, the rest of the stack definitely got shafted on memory bandwidth, and the added L2 cache doesn't seem to have compensated like they thought it would. The 4080 being 256-bit lines up with most "80"-branded cards of the last decade, but the $1200 asking price sure doesn't. Everything else though? Yeah... it's not good.
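The cut-down point can be illustrated with shader counts. A quick sketch, using commonly reported CUDA core counts for each die (treat the figures as assumptions):

```python
# Fraction of the full die's shaders enabled on the flagship card.
# CUDA core counts below are commonly reported figures (assumptions).
def enabled_fraction(card_cores: int, full_die_cores: int) -> float:
    """Share of the full die's cores that the shipping card enables."""
    return card_cores / full_die_cores

frac_3090 = enabled_fraction(10496, 10752)  # RTX 3090 on GA102
frac_4090 = enabled_fraction(16384, 18432)  # RTX 4090 on AD102
print(f"3090: {frac_3090:.1%} of GA102, 4090: {frac_4090:.1%} of AD102")
```

On those numbers the 3090 enabled roughly 98% of GA102 while the 4090 enables about 89% of AD102, leaving noticeably more headroom above it.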


The whole 4000 series has been a bit of a mess from the very beginning.

 

When the 4000 series was first on the horizon, people were scared by the rumors of 600W cards. They were thinking it was going to be Fermi all over again. As it turned out, the 4000 series is actually very power efficient.


 

People all cry about the memory bus and how cut down the card is, but for the average Joe buying a midrange card who just wants to game, the 4060 Ti is fine. It performs in line with previous-gen "70" cards at much lower power draw.

 

What's not fine is that it's priced more like an upper midrange card. The whole product stack just seems off. 

 

If the 4060 Ti was priced at $300 for 8GB and $350 for 16GB, it would be a lot more palatable. Budget gamers aren't trying to do 4K 120. They understand that getting a lower-end card means they may need to tune settings to hit their FPS goals. However, when you drop $500+ you expect to be able to just crank everything up and not worry about it.

 

 

If the naming and pricing were closer to this, I think it would be more in line with people's expectations:

4090 = $1200

4080 = $800

4070 Ti -> 4070 = $500

4070 -> 4060 Ti = $400

4060 Ti -> 4060 = $300/$350

4060 -> 4050 = $250
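The remap above can be written out against the actual launch MSRPs to see how big the cuts would be. The MSRPs below are commonly reported US launch prices, so treat them as assumptions:

```python
# (actual name, actual launch MSRP) -> (proposed name, proposed price).
# MSRPs are commonly reported US launch prices (assumptions).
remap = {
    ("4090", 1599): ("4090", 1200),
    ("4080", 1199): ("4080", 800),
    ("4070 Ti", 799): ("4070", 500),
    ("4070", 599): ("4060 Ti", 400),
    ("4060 Ti 8GB", 399): ("4060", 300),
    ("4060", 299): ("4050", 250),
}
for (name, msrp), (new_name, new_price) in remap.items():
    print(f"{name} (${msrp}) -> {new_name} (${new_price}), "
          f"cut ${msrp - new_price}")
```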


19 minutes ago, Fluxmaven said:

The whole 4000 series has been a bit of a mess from the very beginning. [...]

You hit the nail on the head.

 

Regarding memory, I do think a 192-bit bus on products that are north of $500 is something awful, though.


When you bring the 6800 and 6800 XT into the picture, Nvidia's whole lineup falls apart. My 6800 XT was on par with a lot of 6900 XTs with a few tweaks in the overclock tab, like just increasing the power limit. I imagine the 6800 is about the same.

 


7 hours ago, Fluxmaven said:

The whole 4000 series has been a bit of a mess from the very beginning. [...]

Kinda just saying the same thing in a different way. If a card is expensive, it shouldn't be considered a low-tier card, and it should come with decent features.

 

Gotta keep in mind that the 3000 series was supposed to have double the RAM. The 3060 12GB and the 3090 are the only cards from that gen that released more or less as planned. Supply shortages are the only reason we didn't get 16GB 3070s and 20GB 3080s. There's no shortage now, but Nvidia didn't course-correct.

 

I think most of us see the writing on the wall and see this as planned obsolescence. Nvidia got used to skimping on VRAM by accident; now they're doing it on purpose.


Nvidia thought: what if we make our top-end card the only one that really makes sense, then price the whole rest of the lineup so badly that people find ways to justify buying an x090?

 

When the 3XXX series was coming out, everyone freaked out over "2080 Ti performance for $500" with the 3070. We were coming off a different wave of the mining boom, so 2080 Tis were still selling for MSRP right up until just before the 3XXX launch. People were just excited to finally get a real upgrade. The 2XXX series introduced a lot of cool new technologies, but it took a couple of years for ray tracing and DLSS to be implemented or optimized well enough to actually use. Yet that generation came with a price hike over the 1XXX series.

 

Not getting the 20GB 3080 felt like a real scam. Regressing from 11GB on the 1080 Ti or 2080 Ti to the 10GB 3080 didn't feel right. Shelling out for a 3090 didn't feel right. The 12GB 3080 was too little, too late to mend our broken hearts.

 

If Nvidia could just stop jerking us around on VRAM and pricing...

that-would-be-great-office-space.gif


3 hours ago, UltraMega said:

Kinda just saying the same thing in a different way. [...]

I wonder if it's less about planned obsolescence and more that they just got addicted to pandemic-level margins and are building everything in the stack to maintain them. The fact that cards aren't selling doesn't seem to matter to them now because of the AI cash cow, so in their mind there's no reason to make things sane again on the consumer side of graphics. If it doesn't sell, that's more silicon allocation that can go toward higher-priced AI products; if it does sell, it's at a high margin.

 

Either way consumers lose in my opinion.

 

2 hours ago, Fluxmaven said:

Nvidia thought what if we make our top end card the only one that really makes sense. [...]

 

Well said. The 20-series also just didn't offer a big enough performance upgrade over the 10-series outside of the 2080 Ti, so most 10-series and older users were waiting for the 30-series.

 

I'm glad you touched on the VRAM scam of the 3080. This was clue #1 that the real "flagship" of the 30-series was the "90" card, which was really a rebrand of what used to be the "80 Ti" with a price hike. They tried to market it as "Titan-like" to justify that, but the 10GB 3080 told you all you needed to know. Didn't know "flagships" were supposed to regress in VRAM versus the prior two generations.

 

Edited by Sir Beregond


Almost reminds me of when I bought my 660 Ti. Those were 192-bit cards, with a 2GB model ($299) and a 3GB model ($350?). The 2GB model used an asymmetrical memory layout to get 2GB onto a 192-bit bus; the 3GB model wasn't asymmetrical, IIRC. Both cards had 16 PCIe 3.0 lanes. The upsell was the 670 at $399.
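That asymmetric layout can be sketched numerically. On the commonly cited configuration (treat the per-pin rate and the split as assumptions): three 64-bit channels, one carrying double-density chips, so the first 1.5GB interleaves across all three channels at the full 192-bit width while the last 0.5GB is reachable only over a single 64-bit channel:

```python
# GTX 660 Ti 2GB asymmetric layout: three 64-bit channels, one with
# double-density chips. The first 1.5GB interleaves across all three
# channels (full 192-bit); the last 0.5GB sits on a single 64-bit channel.
# The 6 Gbps GDDR5 per-pin rate is the commonly cited spec (assumption).
GDDR5_GBPS = 6.0

def region_bandwidth(bus_bits: int) -> float:
    """Peak bandwidth in GB/s for a memory region's bus slice."""
    return bus_bits / 8 * GDDR5_GBPS

print(f"first 1.5GB: {region_bandwidth(192):.0f} GB/s")  # full-width region
print(f"last  0.5GB: {region_bandwidth(64):.0f} GB/s")   # single-channel region
```

So once a game spilled into the top 0.5GB, that region ran at roughly a third of the card's headline bandwidth.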

 

Feels like Nvidia is following that same philosophy but cracking down even further: larger price gaps ($100), fewer PCIe lanes (x8), and an even narrower bus (128-bit), sold as power efficiency (really just cost savings).

 

If the model has been working for 10 years and they have 70% market share in the consumer market... they do not care. Next generation they will want to entice Ampere owners to upgrade, so I imagine they will put in a bit more effort than with Ada in terms of marketing and pricing, but still... very bleak.


It's a shame AMD doesn't undercut them and gain some market share. At least they have been less stingy with the VRAM. It's just that most people don't want to wait for the drivers to fine-wine into something usable. They also suck for VR, and F@H PPD is laughable. The 7XXX series is barely better than last gen, whereas Nvidia's 4XXX series has massive gains over last gen.

 

I need to pull my A770 out and see how it's doing on the latest drivers. Obviously it's been like paying to be a beta tester, but every time I put it on the bench it's been faster. We've got a way to go before I'd call Intel a real competitor in the GPU space, but it's nice to have a glimmer of hope for a viable 3rd choice in the market. 


3 hours ago, Fluxmaven said:

It's a shame AMD doesn't undercut them and gain some market share. [...]

I have a 7900 XT and had an RX 6800 before that. AMD drivers are very solid now. I can't think of the last time I had a driver issue with a game. Everything works all the time, and in fact AMD's control panel is light years ahead of Nvidia's.

 

The days of AMD being behind on drivers in any meaningful way are long gone. 

 

 

4 hours ago, Sir Beregond said:

I wonder if its less about planned obsolescence and more that they just got addicted to pandemic level margins [...]

 

 

I think it's planned obsolescence in the sense that just about any new card would be able to run any game with RT turned off without issue, if only it had enough VRAM. Nvidia has to make sure people can't just turn off RT and avoid upgrading. RT is the only thing that's not low-hanging fruit on the performance tree these days, and most gamers are happy to ignore it completely.

 

We are in a new era of graphics where GPU power is far outpacing normal 3D rendering techniques. So now Nvidia has to both push RT and make sure the lower end of the stack can't just turn off RT and get acceptable performance without hitting a VRAM bottleneck in the not-too-distant future.

