AMD 7900XTX & XT reviews


WWW.TECHSPOT.COM: The Radeon RX 7900 XTX is a pretty good GPU, at least relative to its GeForce competitor, but whether or...

 

 

Similar to previous cards, the raster performance is usually very good, while RT is not. 

 

[Benchmark screenshots attached]


Part of me likes this, but the other part of me says at $999 this is just raising the price of that bracket (given its competition being the overpriced 4080).

 

Last gen, the 3080 and its competitor, the 6800 XT, were closer to the $700 mark (MSRP anyway).

 

I don't know, I would have liked to see this card cheaper frankly, but from a business standpoint I get why they didn't. AMD still makes themselves look like the hero given it's not $1200. Forget the naming for a sec: this card feels more like a 6800 XT successor than a 6900/6950 XT successor, given that across the board it competes more with the 4080 than the 4090 (with some caveats).


2 hours ago, Sir Beregond said:

Part of me likes this, but the other part of me says at $999 this is just raising the price of that bracket (given its competition being the overpriced 4080).

 

Last gen, the 3080 and its competitor, the 6800 XT, were closer to the $700 mark (MSRP anyway).

 

I don't know, I would have liked to see this card cheaper frankly, but from a business standpoint I get why they didn't. AMD still makes themselves look like the hero given it's not $1200. Forget the naming for a sec: this card feels more like a 6800 XT successor than a 6900/6950 XT successor, given that across the board it competes more with the 4080 than the 4090 (with some caveats).

To be fair, the 6900XT didn't compete with the 3090 either; the results were pretty similar to what we see this gen, with the 7900XTX sitting between the 4080 and 4090 in most tests. In some tests it even beat the 4090, and yes, in some tests it was slightly slower than the 4080. Personally, it looks to be about the same as last gen as far as comparisons go.

I do agree with you on the pricing thing.  As much as I want to get one, because I really do feel like it'd be JUST ENOUGH performance finally that I'd be perfectly content at my resolution......I'm gonna pass till they hit the used market at a more affordable price.  Same goes for the AM5 kit too.  Nope, it can wait.  I jumped on the 5800x / 6900XT, made that mistake.  I'm almost, ALMOST happy with my rig's performance.  I think this gen might finally do it for me.  But........it can wait. 🙂  I still have a lot of enjoyment to get out of my current rig anyway.


27 minutes ago, pioneerisloud said:

To be fair, the 6900XT didn't compete with the 3090 either; the results were pretty similar to what we see this gen, with the 7900XTX sitting between the 4080 and 4090 in most tests. In some tests it even beat the 4090, and yes, in some tests it was slightly slower than the 4080. Personally, it looks to be about the same as last gen as far as comparisons go.

I do agree with you on the pricing thing.  As much as I want to get one, because I really do feel like it'd be JUST ENOUGH performance finally that I'd be perfectly content at my resolution......I'm gonna pass till they hit the used market at a more affordable price.  Same goes for the AM5 kit too.  Nope, it can wait.  I jumped on the 5800x / 6900XT, made that mistake.  I'm almost, ALMOST happy with my rig's performance.  I think this gen might finally do it for me.  But........it can wait. 🙂  I still have a lot of enjoyment to get out of my current rig anyway.

I'm not sure that's accurate. Unless I am just looking at more mature stats vs launch, the 6900 XT competed very well with, or even beat, the 3090 in raster (at sub-4K), much more so than the 7900 XTX does against the 4090. Maybe I am missing something here, but it doesn't look remotely the same to me.


7 minutes ago, Sir Beregond said:

I'm not sure that's accurate. Unless I am just looking at more mature stats vs launch, the 6900 XT competed very well with, or even beat, the 3090 in raster (at sub-4K), much more so than the 7900 XTX does against the 4090. Maybe I am missing something here, but it doesn't look remotely the same to me.

I could be wrong to be honest.  I really didn't look at the reviews all that terribly hard when I bought mine.  I knew it was a ton faster than my 5700XT, so I bought one (my 6900xt).  I just know the 7900XTX is just about my dream performance card for my monitor finally (other than RT performance).  I'm not willing to shell out $1600, or even $1000 to get that though, so *shrugs.


3 hours ago, pioneerisloud said:

I could be wrong to be honest.  I really didn't look at the reviews all that terribly hard when I bought mine.  I knew it was a ton faster than my 5700XT, so I bought one (my 6900xt).  I just know the 7900XTX is just about my dream performance card for my monitor finally (other than RT performance).  I'm not willing to shell out $1600, or even $1000 to get that though, so *shrugs.

I guess I am levying my same rebranding criticisms I've made against Nvidia on AMD. Sure it's $200 cheaper than a 4080, but it's more like a successor to the 6800XT than it is to the 6900/6950XT. At least just speaking MSRP, the 6800XT was a $650 card so to me the 7900XTX is like $300 overpriced.

 

I don't know. Just my read. The 7900 cards seem much less competitive against the 4090 than last gen's matchup was.


I'm pretty happy with where the 7900xtx is sitting right now. Considering this is their first stab at a chiplet GPU, and this early on it's already beating the 4080, I think it will be a monster once the drivers mature a bit.

 

My roommate is going to get one for his VR sim racing build. So I'll be sure to play with it on the test bench since I'll probably end up building the PC anyway. We do have a Microcenter near us, but won't be camping out trying to get one at launch. 


Yeah, don't get me wrong, I am totally in agreement that the card looks great. I just feel the branding/pricing is wrong. If this were $700-$800 and called a 7800XT or XTX, I think I would like it better. I find it strange they chose to brand it a "9"-class card in the series.


I think if the RT performance were a bit better it would be a nice card. As for the pricing, let's face it, they probably went with undercutting the 4080, and in 6 months will release a "7950xtx" card that outperforms the 4080 in all tests and is halfway between it and the 4090, for only $1399. Meanwhile, as Linus said....

" the real state of amd vs. nvidia"



Yeah, the RT performance hit, I mean we knew before release it wasn't going to be spectacular. But I'd like to actually be able to fully enjoy RT games with maxed out graphics (Portal / Cyberpunk, cough cough). So since that still can't be done, I don't mind waiting, although I absolutely DO want one of the 7900XTX's. I apparently have a collection or something, so now it's an impulse "must have". 🙄  But.....it can wait till it's ~$500-ish or less on the used market. I already bought a "top dollar" GPU (as did a LOT of us). This just isn't that exciting, as great of a card as it is.

 

Same goes for the 4000 series in my book. Yeah, the 4090 is top tier, high end performance. It's a great card, please don't get me wrong. But even it can't play 4K native-res RT titles at smooth framerates, as it relies heavily on DLSS. Both launches suck so far. -_-


19 minutes ago, schuck6566 said:

I think if the RT performance were a bit better it would be a nice card. As for the pricing, let's face it, they probably went with undercutting the 4080, and in 6 months will release a "7950xtx" card that outperforms the 4080 in all tests and is halfway between it and the 4090, for only $1399. Meanwhile, as Linus said....

" the real state of amd vs. nvidia"


Any way you look at it, it's pretty much like....

 



I hate to be a broken record, but these price hikes are related to Moore's Law ending and the kinds of issues that arise from it; and this is just the beginning of the end. Without some kind of major unforeseen advancement in chip design, this is just physics making things more expensive, and probably also a bit of greed on Nvidia's side.

 

As others have noted, both Nvidia and AMD seem to be releasing cards that look like upper mid-tier cards and calling them top-tier cards, and I think this is a means of getting people more used to the price hikes. By calling the $1000 GPU a 7900XTX, it won't sting as much when they release a 7920, 7950, and 7990 or whatever they call it, should they decide to release more powerful cards; but that raises some questions about what a top-tier card is. Is it the best GPU they can make within a given design, or is it a GPU that runs up against the line of what consumers are willing to put up with as far as price, heat and power? I think Nvidia and AMD are trying to answer those questions right now. So maybe we'll see a lot of higher-tier cards come out and push these new releases down into the mid-tier over time, or maybe Nvidia and AMD will decide to draw a line based on cost and power consumption. These are the kinds of things that need to be addressed as Moore's Law breaks down more and more. It's not the only factor at play, but it's a huge part of all this.

 

 

Now onto the topic of the 7900XTX release: AMD has been right on the mark for the last few years at least. Their GPUs neither impress nor disappoint; they just exist as a decent but not exciting option against Nvidia. Whenever it seems like they have an opening, they just give us something average. If their RT performance had been maybe 20% better, this would have been pretty exciting.

 

AMD has said they see RT as still being too early to focus more on, and that they plan to make it a top priority with the next series of GPUs, RDNA4 I would assume. If that's true, and they can catch up to Nvidia while still having just as impressive raster numbers, then next gen will be the first really serious RT gen.


13 minutes ago, UltraMega said:

I hate to be a broken record, but these price hikes are related to Moore's Law ending and the kinds of issues that arise from it; and this is just the beginning of the end. Without some kind of major unforeseen advancement in chip design, this is just physics making things more expensive, and probably also a bit of greed on Nvidia's side.

The MCDs are small and on TSMC 6nm (matured 7nm). Yields have got to be fantastic for these, and price per wafer great given the small size and yield.

 

Frankly, I'd say the same for the 5nm main GCD. It's of a fairly mid-range size (compared to monolithic chips), so again, yields will be up, and pricing should be good as it's not the most cutting-edge 4nm.

 

I find the $999 price point to be overpriced, just not overpriced as badly as the 4080. But given the dies used in its design, Moore's Law is not why it's $1000.

 

Anyway I don't disagree with the rest of your post. The branding from both companies and push for higher priced products has always been in an effort to see what the market will tolerate.

 

As for RT, I won't disagree that I still find it to be largely niche as a feature, but at the end of the day it comes down to marketing and as long as Nvidia can claim superiority here, you will see less actual competition between the two parties. Especially as Nvidia has really pushed the RT angle with now 3 gens of cards. It matters for the mindshare game.


4 minutes ago, Sir Beregond said:

The MCDs are small and on TSMC 6nm (matured 7nm). Yields have got to be fantastic for these, and price per wafer great given the small size and yield.

 

Frankly, I'd say the same for the 5nm main GCD. It's of a fairly mid-range size (compared to monolithic chips), so again, yields will be up, and pricing should be good as it's not the most cutting-edge 4nm.

 

I find the $999 price point to be overpriced, just not overpriced as badly as the 4080. But given the dies used in its design, Moore's Law is not why it's $1000.

 

It's not the only reason, but a big one. Bigger than most people talking about this realize. There's also inflation, which is about 20% since the last GPU releases, which is pretty huge.


2 minutes ago, UltraMega said:

 

It's not the only reason, but a big one. Bigger than most people talking about this realize. There's also inflation, which is about 20% since the last GPU releases, which is pretty huge.

I mean, it explains why AMD went down the MCD path and I imagine Nvidia will need to as well eventually. But since AMD did, the pricing makes no sense.

 

What's the 20% number from?


1 hour ago, Sir Beregond said:

I mean, it explains why AMD went down the MCD path and I imagine Nvidia will need to as well eventually. But since AMD did, the pricing makes no sense.

 

What's the 20% number from?

I'm not an expert on the exact causes of inflation, but in this case it's clearly COVID related, and whether that's because companies decided to be greedy at the same time and saw COVID as a good excuse to raise prices or not, IDK. I just know that inflation from the last two years is about 20%.

 

And yeah, Moore's Law ending is definitely why AMD went the chiplet route. Nvidia doubled down on the typical design and decided to simply push prices and power.

 

Chiplets help curb the price increase from Moore's Law ending, but chiplets don't eliminate the extra cost altogether.

 

If GPU manufacturing costs are 20% higher (just a number I made up), and inflation is 20% higher, that's 40% higher final prices right there. I think it adds up. It sucks, but it adds up. I don't think AMD got more greedy; their prices are realistic, unfortunately. I don't think the same is true for Nvidia, but they're not as greedy as they look. With Nvidia I think it's a combination of greed and higher manufacturing costs while doing nothing to change the design to bring costs down. They decided to just lean into the higher end of the price curve and be a premium brand even more so than before.
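For what it's worth, here's a quick back-of-the-envelope sketch of that arithmetic in Python. Both percentages are the rough, made-up figures from the post above, not real data; the sketch just shows what the simple additive stacking looks like next to a compounded version.

```python
# Back-of-the-envelope sketch of the "costs up + inflation" arithmetic above.
# The 20% manufacturing-cost figure is the poster's own made-up number;
# the ~20% inflation figure is a rough estimate, not an official statistic.

manufacturing_increase = 0.20  # hypothetical rise in GPU manufacturing cost
inflation = 0.20               # rough cumulative inflation since the last launches

# Simple additive view, as described in the post: 20% + 20% = 40%
additive = manufacturing_increase + inflation

# Compounded view: costs rise 20%, then the whole price is inflated another 20%
compounded = (1 + manufacturing_increase) * (1 + inflation) - 1

print(f"Additive estimate:   {additive:.0%}")    # 40%
print(f"Compounded estimate: {compounded:.0%}")  # 44%
```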



Anyone have an expected Folding@home performance figure for the 7900xt or XTX?

 

4090 at 25m ppd is pretty far from my 2-4m 6800xt.

 

🙂 am folding 24/7


43 minutes ago, UltraMega said:

I'm not an expert on the exact causes of inflation, but in this case it's clearly COVID related, and whether that's because companies decided to be greedy at the same time and saw COVID as a good excuse to raise prices or not, IDK. I just know that inflation from the last two years is about 20%.

 

And yeah, Moore's Law ending is definitely why AMD went the chiplet route. Nvidia doubled down on the typical design and decided to simply push prices and power.

 

Chiplets help curb the price increase from Moore's Law ending, but chiplets don't eliminate the extra cost altogether.

 

If GPU manufacturing costs are 20% higher (just a number I made up), and inflation is 20% higher, that's 40% higher final prices right there. I think it adds up. It sucks, but it adds up. I don't think AMD got more greedy; their prices are realistic, unfortunately. I don't think the same is true for Nvidia, but they're not as greedy as they look. With Nvidia I think it's a combination of greed and higher manufacturing costs while doing nothing to change the design to bring costs down. They decided to just lean into the higher end of the price curve and be a premium brand even more so than before.

I don't know. I want to see real numbers, not "if such is x, then y". I get what you are saying, but I am not entirely sure it's accurate numbers-wise.

 

I do agree with you on Nvidia, they doubled down on expensive, but given the reports I've seen on their margins, I'm not so sure the cost to manufacture has increased as woefully as some make it out. They've also vastly increased their margins, so something doesn't jibe here.


6 hours ago, Sir Beregond said:

I don't know. I want to see real numbers, not "if such is x, then y". I get what you are saying, but I am not entirely sure it's accurate numbers-wise.

 

I do agree with you on Nvidia, they doubled down on expensive, but given the reports I've seen on their margins, I'm not so sure the cost to manufacture has increased as woefully as some make it out. They've also vastly increased their margins, so something doesn't jibe here.

Well the inflation part is definitely accurate, you can easily check on that. 

 

WWW.THEBALANCEMONEY.COM: The U.S. inflation rate by year is the percentage of change in prices from one year to the next. It responds to business cycle phases and interest rates.

 

So between 2020 and 2022 inflation is about 14%, with another ~3% estimated for 2023. Not quite 20%, but pretty close.

 

If inflation is just 15%, then GPU manufacturing prices only need to have gone up another 15% for AMD's prices to make sense compared to RDNA2. If you assume AMD has to pay 15% more for manufacturing simply because of inflation, and then they want to charge 15% higher prices because of inflation, that's 30% right there, and that's about how much their prices have actually increased if you think of the 7900XT as a replacement for the 6800XT.
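Here's a minimal sketch of that comparison, assuming launch MSRPs of roughly $649 for the 6800 XT and $899 for the 7900 XT and the ~15% cumulative inflation estimate cited above; the numbers are approximations and only meant to show the rough math.

```python
# A rough sanity check on the MSRP comparison above, assuming launch MSRPs of
# about $649 for the RX 6800 XT and $899 for the RX 7900 XT, and the ~15%
# cumulative 2020-2022 inflation figure from the linked article.

msrp_6800xt = 649
msrp_7900xt = 899
inflation = 0.15

actual_increase = msrp_7900xt / msrp_6800xt - 1
inflation_only_price = msrp_6800xt * (1 + inflation)

print(f"Actual MSRP increase:        {actual_increase:.0%}")        # ~39%
print(f"Inflation-adjusted 6800 XT:  ${inflation_only_price:.0f}")  # ~$746
print(f"Gap left for other factors:  ${msrp_7900xt - inflation_only_price:.0f}")
```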


On 13/12/2022 at 17:30, UltraMega said:

Well the inflation part is definitely accurate, you can easily check on that. 

 

WWW.THEBALANCEMONEY.COM: The U.S. inflation rate by year is the percentage of change in prices from one year to the next. It responds to business cycle phases and interest rates.

 

So between 2020 and 2022 inflation is about 14%, with another ~3% estimated for 2023. Not quite 20%, but pretty close.

 

If inflation is just 15%, then GPU manufacturing prices only need to have gone up another 15% for AMD's prices to make sense compared to RDNA2. If you assume AMD has to pay 15% more for manufacturing simply because of inflation, and then they want to charge 15% higher prices because of inflation, that's 30% right there, and that's about how much their prices have actually increased if you think of the 7900XT as a replacement for the 6800XT.

I guess, but I don't know. Profits are not in line with inflation alone. Look at the bottom lines; at least for Nvidia, they are making more money than they ever have in their history. With that knowledge I have to assume there is more going on here, and wafer costs are not the same now as they were when 5nm/4nm was brand spanking new. Just like any new node, cost is high at first and then it goes down. Companies then negotiate deals for supply, etc.

 

Given Nvidia's bottom line just seems to be growing (I have not looked at AMD), I have to assume the hit to them on manufacturing costs isn't as much as they are trying to make it out to be.


2 hours ago, Sir Beregond said:

I guess, but I don't know. Profits are not in line with inflation alone. Look at the bottom lines; at least for Nvidia, they are making more money than they ever have in their history. With that knowledge I have to assume there is more going on here, and wafer costs are not the same now as they were when 5nm/4nm was brand spanking new. Just like any new node, cost is high at first and then it goes down. Companies then negotiate deals for supply, etc.

 

Given Nvidia's bottom line just seems to be growing (I have not looked at AMD), I have to assume the hit to them on manufacturing costs isn't as much as they are trying to make it out to be.

Demand for GPUs has gone up a lot, and Nvidia makes the best ones. Data centers are relying on GPUs now more than ever. Even if their margins stayed the same, they would probably be making money hand over fist because they're in a huge market with very few sellers. 

 

I'm not saying it's just inflation though; I'm saying inflation is a big part of it... and there are rising manufacturing costs from getting to these extremely small nodes / Moore's Law starting to break down, and there is the greed factor from Nvidia.


1 hour ago, UltraMega said:

Demand for GPUs has gone up a lot, and Nvidia makes the best ones. Data centers are relying on GPUs now more than ever. Even if their margins stayed the same, they would probably be making money hand over fist because they're in a huge market with very few sellers. 

 

I'm not saying it's just inflation though; I'm saying inflation is a big part of it... and there are rising manufacturing costs from getting to these extremely small nodes / Moore's Law starting to break down, and there is the greed factor from Nvidia.

Right, you are right about the "multitude of factors". I am just saying $1200 4080's ain't in line with a damn thing except greed. As for the 7900 XTX, I would imagine there's margin to play with on AMD's part given the chiplet approach. The MCDs are functionally on an old node at this point (TSMC 6nm being a derivative of TSMC 7nm), and the GCD on the newer 5nm is of a mid-range size given the separation of functionality into the other chips.

 

This calculator is fun to play with: WWW.SILICON-EDGE.CO.UK

 

When you get the stats for TSMC wafers and plug in the details for AD102/AD103, and then the GCD and MCDs for AMD's Navi31, it's very interesting to see how much better a position AMD is in on yields per wafer vs Nvidia. I feel the Moore's Law thing falls apart a bit as a factor in the AMD pricing for RDNA3. It was definitely a factor in the decision to move to a multi-chip design, but now that they have, the actual result should be lower manufacturing costs vs Nvidia.
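For illustration, here's a rough sketch of the kind of gross dies-per-wafer comparison that calculator does, using the standard geometric approximation and commonly reported approximate die areas (AD102 ~608 mm², AD103 ~379 mm², Navi 31 GCD ~304 mm², Navi 31 MCD ~37 mm²). It ignores defect density and yield modelling, which the linked calculator handles properly, so treat the outputs as ballpark only.

```python
import math

# Gross die candidates per 300 mm wafer using the standard approximation:
#   dies ≈ pi*(d/2)^2 / A  -  pi*d / sqrt(2*A)
# Die areas below are commonly reported approximations, not official figures.

WAFER_DIAMETER_MM = 300

def gross_dies_per_wafer(die_area_mm2: float, d: float = WAFER_DIAMETER_MM) -> int:
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

die_areas = {
    "AD102 (4090, ~608 mm^2)":      608,
    "AD103 (4080, ~379 mm^2)":      379,
    "Navi 31 GCD (~304 mm^2, 5nm)": 304,
    "Navi 31 MCD (~37 mm^2, 6nm)":   37,
}

for name, area in die_areas.items():
    print(f"{name}: ~{gross_dies_per_wafer(area)} gross dies per wafer")
```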

 

Nvidia is definitely more screwed by the fact they are still monolithic, and I totally agree that's where you are running into the issues with Moore's Law. As for AMD: smaller chips, still mostly competing with these monolithic chips, on a functionally much cheaper-to-manufacture design, yet still $1k. My contention is that the 7900 XTX is only $1k because the 4080 is $1200. It's enough of a discount to look good (depending on who you ask), but in my mind still overpriced.


Just for shiggles, I decided to look up these cards to see if they were released yet on newegg, Amazon, etc.......

 

Looks like scalpers are still at it.  7900 XTX's are listed as more expensive than 4090's are at current time of posting on most sites.  Ebay does seem to have some "cheaper", but they're still $1500 there too.


23 minutes ago, pioneerisloud said:

Just for shiggles, I decided to look up these cards to see if they were released yet on newegg, Amazon, etc.......

 

Looks like scalpers are still at it.  7900 XTX's are listed as more expensive than 4090's are at current time of posting on most sites.  Ebay does seem to have some "cheaper", but they're still $1500 there too.

lol, I don't know if it's a good or bad thing that some of the different models of the 7900 XT's are still in stock at their list price on newegg. Does that mean the scalpers can't even be bothered with the card? 🤔
