
4060Ti 8GB reviews out



Quote

Conclusion

As an objective assessment, the RTX 4060 Ti 8GB exhibits very respectable performance, especially within a Full HD and even 2560x1440 mindset. Its shader engine performance is satisfactory, and the addition of DLSS 3 frame generation substantially improves its functionality. NVIDIA continues to lead in raw raytracing performance. This graphics card's 32MB L2 cache is particularly effective at this resolution, though cache misses force the card back onto a narrow 128-bit bus with only 8GB of graphics memory. At QHD and UHD, however, you're bound to run into memory limitations; also keep in mind that DLSS frame generation consumes VRAM when used. While this could potentially cause issues, the card seems to handle such scenarios well. The RTX 4060 Ti 8GB boasts enough performance, solid build quality, and appealing aesthetics. However, its pricing is a notable drawback: at $399, it is far too expensive for a mainstream product.

https://www.guru3d.com/articles_pages/geforce_rtx_4060_ti_8gb_(fe)_review,31.html
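The narrow bus the review flags is easy to put in numbers. A minimal sketch, assuming the published per-pin data rates (18 Gbps GDDR6 on a 128-bit bus for the 4060 Ti, 14 Gbps on a 256-bit bus for the 3060 Ti):

```python
# Peak memory bandwidth in GB/s = (bus width in bits / 8) * per-pin rate in Gbps.
def mem_bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

# Published specs (assumed here, not measured):
print(mem_bandwidth_gb_s(128, 18))  # RTX 4060 Ti -> 288.0 GB/s
print(mem_bandwidth_gb_s(256, 14))  # RTX 3060 Ti -> 448.0 GB/s
```

On paper the new card has roughly 36% less raw bandwidth than its predecessor, which is why the L2 cache has to carry so much weight when it misses.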

 

At low resolutions, it can perform like a 3070. At high resolutions it's worse than a 1080 Ti, occasionally even worse than a 1070.

 

 

Edited by UltraMega

CPU: 5800x
MOTHERBOARD: ASUS TUF Gaming B550-Plus
RAM: 3600MHz CL16 (XMP)
GPU: 7900XT
SOUNDCARD: Sound Blaster Z 5.1 home theater
MONITOR: 4K 65 inch TV
Full Rig Info

It's like Nvidia sends me a reminder every couple of months that I don't need to have any buyer's remorse over switching teams. Thanks, Nvidia.


CPU: AMD Ryzen 5 5600X
MOTHERBOARD: MSI MPG B550 Gaming Plus
RAM: Corsair Vengeance LPX 4x8GB (32GB) DDR4 3600MHz
SSD/NVME: KINGSTON KC3000 1TB
GPU: ASUS Dual AMD Radeon RX 6750 XT OC Edition 12GB
SOUNDCARD: Sound Blaster Z SE
PSU: XPG CORE REACTOR 650W GOLD
CASE: NZXT H510i
Full Rig Info

How did Guru3D give it 4 stars and a recommendation when it performs similarly to the same class of card from two years ago at the same price? Worse, if you're still on a PCIe 3.0 board, because the card only has 8 lanes.

 

They keep cutting the costs down, yet they charge the same price. It doesn't add up. This shouldn't be more than $300 in today's market.


Looks like it's about on par with a 5700 XT or 6700 XT from AMD. Definitely seems a tad weak for that performance bracket, and obviously Nvidia's pricing is outrageous. Honestly, I'm not interested in anything Nvidia anymore after this generation. Normally I don't care that much between brands (I tend to go AMD because of my collections), but this is getting to be stupid.


Just a quick note: $400 today would be about $280 in pre-pandemic-greedflation numbers. It's still a pathetic card, but the price isn't quite as bad as it looks.


I never once imagined I'd see the day when Nvidia would pull a Bulldozer moment with midrange GPUs and think they're winning...


CPU: AMD Ryzen 7 5800X3D
MOTHERBOARD: ASRock B550M Steel Legend
GPU: Radeon RX 6700 XT 12GB
RAM: 64GB G.Skill DDR4 3200 CL16
SSD/NVME: 512GB Sabrent Rocket (Gen 4) NVMe
SSD/NVME 2: 1TB WD Black (Gen 3) NVMe
SSD/NVME 3: 2TB Sabrent Rocket (Gen 3) NVMe
SSD/NVME 4: 2TB Samsung EVO (SATA)
Full Rig Info


CPU: Intel Core i7-10875H (8C/16T @ 45w)
MOTHERBOARD: "Comet Lake" HM470 Chipset
RAM: 32GB TeamGroup DDR4 2933
GPU: 8GB Nvidia RTX 2070 (130w)
SSD/NVME: 1TB Sabrent Rocket (Gen 3)
SSD/NVME 2: 2TB Silicon Power A80 (Gen 3)
MONITOR: 17.3" 1920x1080 @ 144Hz
PSU: Chicony 240 Power Brick
Full Rig Info

3 hours ago, UltraMega said:

Just a quick note: $400 today would be about $280 in pre-pandemic-greedflation numbers. It's still a pathetic card, but the price isn't quite as bad as it looks.

Where'd you get that figure?

 

$400 in 2023 is about $346 in 2020 money and $338 in 2019 money.
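The CPI math can be sketched like this, using approximate US CPI-U annual averages as assumed illustrative values (check the BLS for exact figures; results land close to the numbers above but shift a bit depending on which index and months you compare):

```python
# Approximate US CPI-U annual averages (assumed values for illustration).
CPI = {2019: 255.7, 2020: 258.8, 2023: 304.7}

def in_year_dollars(amount: float, from_year: int, to_year: int) -> float:
    """Re-express `amount` of from_year dollars in to_year dollars."""
    return amount * CPI[to_year] / CPI[from_year]

print(round(in_year_dollars(400, 2023, 2020)))  # 340
print(round(in_year_dollars(400, 2023, 2019)))  # 336
```

Either way, $400 in 2023 deflates to somewhere in the $330-$350 range, nowhere near $280.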

 

Anyway, another pathetic showing from Nvidia. The 4060 Ti is at best a misbranded (and clearly mispriced) 4060, and the 128-bit bus makes it look even worse. There's no excuse for a $400 product in 2023 to have these issues. We can talk about cache, sure, but AMD clearly realized cache doesn't fix everything after RDNA2; that's why RDNA3 widened its bus again and actually reduced total Infinity Cache from what RDNA2 had.
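That RDNA2-to-RDNA3 shift checks out against the flagship dies' publicly listed specs, reproduced here purely as an illustration:

```python
# Flagship die comparison: RDNA3 widened the bus and shrank the Infinity Cache.
specs = {
    "Navi 21 (RDNA2, RX 6900 XT)": {"bus_bits": 256, "infinity_cache_mb": 128},
    "Navi 31 (RDNA3, RX 7900 XTX)": {"bus_bits": 384, "infinity_cache_mb": 96},
}
for die, s in specs.items():
    print(f"{die}: {s['bus_bits']}-bit bus, {s['infinity_cache_mb']} MB Infinity Cache")
```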

 

Just pathetic Nvidia.

Edited by Sir Beregond


CPU: AMD Ryzen 9 5900X
GPU: Nvidia RTX 3080 Ti Founders Edition
RAM: G.Skill Trident Z Neo 32GB DDR4-3600 (@ 3733 CL14)
MOTHERBOARD: ASUS Crosshair VIII Dark Hero
SSD/NVME: x2 Samsung 970 Evo Plus 2TB
SSD/NVME 2: Crucial MX500 1TB
PSU: Corsair RM1000x
MONITOR: LG 42" C4 OLED
Full Rig Info


CPU: E8400, i5-650, i7-870, i7-960, i5-2400, i7-4790k, i9-10900k, i3-13100, i9-13900ks
GPU: many
RAM: Corsair 32GB DDR3-2400 | Oloy Blade 16GB DDR4-3600 | Crucial 16GB DDR5-5600
MOTHERBOARD: ASUS P7P55 WS SC | ASUS Z97 Deluxe | EVGA Z490 Dark | EVGA Z790 Dark Kingpin
SSD/NVME: Samsung 870 Evo 1TB | Inland 1TB Gen 4
PSU: BeQuiet Straight Power 12 1500W
CASE: Cooler Master MasterFrame 700 - bench mode
OPERATING SYSTEM: Windows 10 LTSC
Full Rig Info


CPU: M1 Pro
RAM: 32GB
SSD/NVME: 1TB
OPERATING SYSTEM: MacOS Sonoma
CASE: Space Grey
Full Rig Info

9 minutes ago, Sir Beregond said:

Where'd you get that figure?

Saw a video talking about it, but maybe the number was somewhat exaggerated. 


I read that JayzTwoCents removed his video. Not that I watch his content (not my cup of tea), but it keeps popping up in my feed.

 

Even TechPowerUp removed the “Recommended” at the end of the written review.

 

Not sure why the reviews are all wishy-washy. Yes, the market is frustrating, but when a GPU has a 128-bit bus, 8 PCIe lanes, 8GB of VRAM, and a small die, yet they want $400+, you really don't have to do a whole lot of testing to know this thing will buckle at high resolutions. Have we gone mental? *in Gordon Ramsay's voice*

Reviewers need to call it out. Consumers will follow. Otherwise the trust is broken.

 

This class of card should easily handle modest 4K gaming or maxed-out 1440p at $400.

 

The fact that it performs worse than a 3060 Ti in 4K in some titles is simply disgusting.

 

And they want $500 for 16GB of VRAM, LOL. I used to love the midrange, but this generation looked like a dumpster fire when the rumours were rolling out last year. Unfortunately, they all came true.
 

Sure, maybe DLSS 3 means the overall experience can be better than last-gen GPUs for some gamers, but come on, we all know this class of card can't take full advantage of that feature. Especially with only ~40 games supporting it. Not to mention that DLSS 3 should be able to run on Ampere… With so many caveats around it, it's hard to call it a selling feature today.

 

Honestly, the revenue from the data centre / corporate / "AI" market must be stellar, because I don't know why else Nvidia would be shooting themselves in the foot.

 

Reminds me of Intel before Zen came out. An age of Nvidia stagnation inbound? I hope not.

 


I saw a listing of the JayzTwoCents video on YouTube this morning before it got pulled, and even opened it to see that it was someone else on his team (his camera guy, from what I understand) doing the actual review. I didn't actually watch it, though. At the time the comments weren't blistering yet, but there were already some criticisms about how incongruent it was with other 4060 Ti reviews.

 

I told my friend that the 3060 Ti I sold him for $300 at the beginning of the year actually gained value after this release. Probably not literally true, but figuratively? Sure. The 4060 Ti provides no meaningful gain in games you would plan to play with a mid-range card like that, and regardless of new vs. used, it's 33% more than what he paid for the 3060 Ti. The second-hand market pricing adapts to the perceived value of the latest generation, and the 3060 Ti isn't looking any worse or more outdated after this release.

 

How is the 4060 Ti actually slower than a 1080 Ti from 2017 at 4K? Even the 3060 Ti outperformed the 1080 Ti in practically every scenario.

 

I don't like how a 128-bit memory bus reads on a spec sheet, and I've always had a bias against GPUs with one. In terms of direct successors to a card I once owned, this is along the lines of the memory bus being slashed from the GTX 760 to the 960, but worse. The results here are truly embarrassing considering what's at stake nowadays.

Edited by Snakecharmed

CPU: AMD Ryzen 9 7900X
MOTHERBOARD: Asus ROG Strix B650E-F Gaming WiFi
RAM: 64 GB (2x32 GB) G.Skill Trident Z5 Neo RGB DDR5-6000 CL30
GPU: EVGA GeForce RTX 3080 Ti FTW3 Ultra Gaming
SSD/NVME: 1 TB WD_BLACK SN850X PCIe 4.0 NVMe
SSD/NVME 2: 2 TB WD_BLACK SN770 PCIe 4.0 NVMe
MONITOR: 38" LG UltraGear 38GN950-B 3840x1600 144 Hz
MONITOR 2: 55" Samsung Neo QLED QN85A 4K 120 Hz 4:4:4
Full Rig Info


CPU: AMD Ryzen 5 5600G
MOTHERBOARD: ASRock X300M-STM
RAM: 16 GB (2x8 GB) ADATA DDR4-3200 CL22
SSD/NVME: 500 GB Gigabyte Gen3 2500E PCIe 3.0 NVMe
SSD/NVME 2: 3.84 TB Samsung PM863a Enterprise SATA 6 Gbps
CASE: ASRock DeskMini X300W
CPU COOLER: Thermalright AXP90-X36
CPU COOLER 2: [Fan] Noctua NF-A9x14 92mm PWM 2.52 W
Full Rig Info

30 minutes ago, Slaughtahouse said:

Not sure why the reviews are all wishy-washy. […] Reminds me of Intel before Zen came out. Age of Nvidia stagnation inbound? I hope not.

For a long time now, reviewers have gone on to say "it's expensive" and "recommended" in the same review.

 

Reviewers really dropped the ball when this all started with Kepler: the GTX 680 clearly wasn't the actual replacement for the GTX 580 (that came later with the 780 Ti), yet they gave it a pass, and we got a decade of what we got.



It's basically a 3060 variant/refresh with DLSS 3 tacked on, but because DLSS 3 itself uses some VRAM, it can't really help the 4060 Ti get to 4K when the card runs out of VRAM well before that. It might be able to take a game running at 1080p or 1440p and double the frame rate, but since the card already performs well at low resolutions, so what? If you can't use the upscaling to get to 4K, it's pointless.

 

It really is mind-boggling. I usually like to play devil's advocate, but there is just nothing that can be said to defend this card.

 

4060 and up should have been 12GB

4070 should have been 16GB

4080 should have been 20GB

 

You would have thought Nvidia would redesign their GPUs to be more flexible on RAM sizes after the 3000-series cards. I'm not that technical, but AMD figured it out, so...

It seems like it would have been such low-hanging fruit for Nvidia to just not skimp on VRAM this time and have huge success in the market, but instead another failed series is most likely just going to lead to them lowering production to inflate prices.


12 hours ago, UltraMega said:

It's basically a 3060 variant/refresh with DLSS3 tacked on […] another failed series is just going to lead to them lowering production to inflate the prices most likely.

Nothing to figure out here. This is literally just cheaping out on $20 of additional RAM and on building dies with a normal number of memory controllers. They cheaped out, plain and simple, and thought adding extra cache would "fix" it.

 

Meanwhile charging more than ever for the privilege. 

 

I wish AMD would price their stuff appropriately at launch. They don't have feature parity with Nvidia, but they've been content to just slot into Nvidia's price points rather than drive any real competition on pricing. It's nice to see the 7900 XT at a much better price than it launched at, but it sounds like the upcoming 7600 XT might fall into the same pricing nonsense for an 8GB card in 2023.

Edited by Sir Beregond



Create an account or sign in to comment

You need to be a member in order to leave a comment

Create an account

Sign up for a new account in our community. It's easy!

Register a new account

Sign in

Already have an account? Sign in here.

Sign In Now
×
×
  • Create New...

Important Information

This Website may place and access certain Cookies on your computer. ExtremeHW uses Cookies to improve your experience of using the Website and to improve our range of products and services. ExtremeHW has carefully chosen these Cookies and has taken steps to ensure that your privacy is protected and respected at all times. All Cookies used by this Website are used in accordance with current UK and EU Cookie Law. For more information please see our Privacy Policy