RTX 4070 reviews out


Quote

we'll quickly note that 12GB is now what we consider to be the bare minimum - that is to say, you shouldn't purchase a graphics card with less than 12GB of VRAM when spending over $200 on a new GPU. It is our opinion that 8GB should be reserved for entry-level gaming products now, and anything less than 8GB isn't suitable for current generation gaming.

 

Bottom line, the GeForce RTX 4070 is a good value product packing a strong feature set and excellent performance. We're happy that the RTX 4070 has turned out to be a product that we can recommend, and we expect this GPU to sell very well if they can meet the promised $600 MSRP.

 

WWW.TECHSPOT.COM

Nvidia's GeForce RTX 4070 release has been highly anticipated for its potential of bringing the newer generation to a...
TLDR: It's a 3080 with better power consumption for $600.

[Attached benchmark screenshots: WDL_4K.png, FC6_4K.png, CoD_1440p.png, Screenshots_2023-04-12-12-22-39.png]

Edited by UltraMega
CPU: 5800x
MOTHERBOARD: ASUS TUF Gaming B550-Plus
RAM: 32GB 3600mhz CL16
GPU: 7900XT
SOUNDCARD: Sound Blaster Z 5.1 home theater
MONITOR: 4K 65 inch TV

As much as I like to bash Nvidia for their somewhat overbearing business tactics, I gotta give them credit here. DLSS3 on the 4090 is sort of an afterthought, but when you pair it with a $600 card and run path tracing at "4K" ~60fps, it does make AMD look like they're falling behind again just when it started feeling like they were close to being on even terms with the DLSS2/FSR2 race. 

 

Interesting times. I think only about 1 in every 8 pixels is actually rendered, and the rest is being filled in with AI. It does look a bit soft, but it seems like we've gone from caring a lot about native res to very effective upscaling in a relatively short time. I can only imagine where this 'render a small amount of pixels and fill in the rest' approach will lead in the future.
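For what it's worth, the 1-in-8 figure checks out as back-of-envelope arithmetic if you assume DLSS Performance-style upscaling (each frame rendered at a quarter of the output resolution) combined with frame generation (every second displayed frame fully AI-generated) — a sketch under those two assumptions, not an official Nvidia figure:

```python
# Rough estimate of what share of displayed pixels the GPU actually renders.
# Assumptions (not confirmed specifics): upscaling renders each frame at 1/4
# of output resolution, and frame generation AI-generates every other frame.
upscale_fraction = 1 / 4   # rendered pixels per displayed frame
rendered_frames = 1 / 2    # share of displayed frames that are rendered at all

rendered_pixel_share = upscale_fraction * rendered_frames
print(rendered_pixel_share)  # 0.125 -> roughly 1 in every 8 pixels
```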


7 hours ago, UltraMega said:

As much as I like to bash Nvidia for their somewhat overbearing business tactics, I gotta give them credit here. DLSS3 on the 4090 is sort of an afterthought, but when you pair it with a $600 card and run path tracing at "4K" ~60fps, it does make AMD look like they're falling behind again just when it started feeling like they were close to being on even terms with the DLSS2/FSR2 race. 

 

Interesting times. I think only about 1 in every 8 pixels is actually rendered, and the rest is being filled in with AI. It does look a bit soft, but it seems like we've gone from caring a lot about native res to very effective upscaling in a relatively short time. I can only imagine where this 'render a small amount of pixels and fill in the rest' approach will lead in the future.

Oh absolutely.  The only reason I said Nvidia tax is because it's $600, and there are many cheaper cards that outperform it.  Not to mention the die being a traditionally lower-tier size, and all the other reasons this card really shouldn't be considered worth $600.  You're not wrong though.

Personally, I'd still rather have native resolution.  That might just be me though.  I'm willing to accept things like FSR / DLSS in exchange for MASSIVE purdies that make me drool at my screen.  Otherwise, not really.  In Cyberpunk, for example, I'm happy to use FSR in exchange for RT (right now, on a first-gen RT card).

As far as the AI aspect of it, also correct, very interesting times indeed.  It could get very interesting for video games.  Where I don't like seeing this technology is in cameras and video recording devices (which are already doing it).  If AI is generating frames or photos in cameras, what's going to stop it from producing fake "photographs"?  Certain phones already do this.  They might not be producing outright fakes just yet, but what's going to stop that if AI is generating the images in your camera?

Gaming wise, I think this tool could be very useful though.  I'd like to see buttery smooth framerates, high resolution (native if possible), and lots of pretties.  This is what I pay big money for a GPU for.  If AI is involved in that, so be it. 🙂


$600 bought you 4k cards not that long ago. This is definitely a 1080p/1440p card, so yeah, I have no positive opinions of this card aside from the performance per watt, which is good.

CPU: AMD Ryzen 9 5900X
GPU: Nvidia RTX 3080 Ti Founders Edition
RAM: G.Skill Trident Z Neo 32GB DDR4-3600 (@ 3733 CL14)
MOTHERBOARD: ASUS Crosshair VIII Dark Hero
SSD/NVME: x2 Samsung 970 Evo Plus 2TB
SSD/NVME 2: Crucial MX500 1TB
PSU: be Quiet! Straight Power 12 1500W
MONITOR: LG 42" C4 OLED

CPU: E8400, i5-650, i7-870, i7-960, i5-2400, i7-4790k, i9-10900k, i3-13100, i9-13900ks
GPU: many
RAM: Corsair 32GB DDR3-2400 | Oloy Blade 16GB DDR4-3600 | Crucial 16GB DDR5-5600
MOTHERBOARD: ASUS P7P55 WS SC | ASUS Z97 Deluxe | EVGA Z490 Dark | EVGA Z790 Dark Kingpin
SSD/NVME: Samsung 870 Evo 1TB | Inland 1TB Gen 4
PSU: Seasonic Focus GX 1000W
CASE: Cooler Master MasterFrame 700 - bench mode
OPERATING SYSTEM: Windows 10 LTSC

CPU: M1 Pro
RAM: 32GB
SSD/NVME: 1TB
OPERATING SYSTEM: MacOS Sonoma
CASE: Space Grey

3 hours ago, Sir Beregond said:

$600 bought you 4k cards not that long ago. This is definitely a 1080p/1440p card, so yeah, I have no positive opinions of this card aside from the performance per watt, which is good.

It seems like all new GPUs are 4K cards to me these days. I think the 4070 can probably run just about any game without RT at 4K just fine. In a sense, Nvidia needed to push RT because without it, there wouldn't be much reason to buy an expensive card past a certain point. Once a mid-range GPU can run HQ photogrammetry assets with enough detail to really flesh out a 3D environment, the only thing to build on next is the lighting.

Kinda makes me wonder, once we have GPUs that can run path tracing in high detail at high res well enough to just make it the standard for all games, will we even need better GPUs anymore for gaming? 


5 hours ago, UltraMega said:

Kinda makes me wonder, once we have GPUs that can run path tracing in high detail at high res well enough to just make it the standard for all games, will we even need better GPUs anymore for gaming

🤷‍♂️

I'm sure there will be some advancement in gaming tech that'll make it so we do need new GPUs.  Maybe GPUs won't even be GPUs anymore, who knows.  Fun stuff to think about though. 🙂


18 hours ago, UltraMega said:

It seems like all new GPUs are 4K cards to me these days. I think the 4070 can probably run just about any game without RT at 4K just fine. In a sense, Nvidia needed to push RT because without it, there wouldn't be much reason to buy an expensive card past a certain point. Once a mid-range GPU can run HQ photogrammetry assets with enough detail to really flesh out a 3D environment, the only thing to build on next is the lighting.

Kinda makes me wonder, once we have GPUs that can run path tracing in high detail at high res well enough to just make it the standard for all games, will we even need better GPUs anymore for gaming? 

I seriously doubt anyone serious about 4k gaming is looking at severely cut-down, third-tier silicon with only 12GB. The 4070 Ti already showed a deficiency in performance scaling from 1440p to 4k. I've already hit a VRAM limit in one game at 4k with my 12GB 3080 Ti. These 4070s are not 4k cards. I'm not saying you couldn't play at 4k with them, but let's be real, these are at best 1440p cards. My point was that not long ago, $600 got you cards that were marketed as 4k cards. So 4k is getting more expensive, ok. But why are 1080p and 1440p also getting more expensive?

 

That issue aside, let's look at something else.

 

970 was roughly equivalent to a 780 Ti - for $329

1070 was equivalent performance to a 980 Ti / Titan X - for $379

2070 was a bit worse than a 1080 Ti - for $499

2070 Super was roughly equivalent or beat a 1080 Ti - for $499

3070 was equivalent performance to a 2080 Ti - for $499

4070 is roughly equal to, but can still lose to a 3080 (regular, not Ti) - for $599
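Putting the MSRPs from that list side by side makes the pricing trend easy to check — a quick sanity check using only the prices quoted above:

```python
# 70-class launch MSRPs as listed in the post above (USD).
msrp = {
    "970": 329,
    "1070": 379,
    "2070": 499,
    "2070 Super": 499,
    "3070": 499,
    "4070": 599,
}

# Premium of the 4070 over the previous three $499 70-class cards.
premium_over_3070 = msrp["4070"] - msrp["3070"]
print(premium_over_3070)  # 100

# And versus the 970, the 70-class price has risen by:
print(msrp["4070"] - msrp["970"])  # 270
```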

 

So not only does the 4070 fail to match the performance characteristics of previous 70-class GPUs, Nvidia also has the gall to charge at least $100 extra for it compared to those.

 

The 4070 and 4070 Ti are misbranded. The 4070 Ti should be the 4070, and the 4070 is really a 4060 Ti - for $600.

 

I hope these rot on shelves.

Edited by Sir Beregond


13 minutes ago, Sir Beregond said:

I seriously doubt anyone serious about 4k gaming is looking at severely cut-down, third-tier silicon with only 12GB. The 4070 Ti already showed a deficiency in performance scaling from 1440p to 4k. I've already hit a VRAM limit in one game at 4k with my 12GB 3080 Ti. These 4070s are not 4k cards. I'm not saying you couldn't play at 4k with them, but let's be real, these are at best 1440p cards. My point was that not long ago, $600 got you cards that were marketed as 4k cards. So 4k is getting more expensive, ok. But why are 1080p and 1440p also getting more expensive?

I agree with what you're saying about it being a cut-down and overpriced card, but when the 4090 can only do Cyberpunk 2077's path tracing at 16 FPS without upscaling, you could argue there are no 4k cards.

 

I think these days the real dividing line is RT on or off, more than resolution, since when RT is on it's never really at 4k anyway, unless it's very minor RT.

