
AMD Retreating from Enthusiast Graphics Segment with RDNA4?


Rumor or not, is it a smart move?

18 members have voted

  1. AMD possibly exiting the high-end GPU segment and focusing instead on the AI sector, and/or devoting more resources to the midrange segment: is this a smart move on AMD's part or not?

    • Smart move.
      5
    • Not so much.
      13


Recommended Posts

22 minutes ago, Sir Beregond said:

I watch the stock numbers for my local Micro Center regularly. My store keeps them pretty accurate on their site for each card model. It's slowed down now because I think anyone in the market for a $1600+ GPU has bought one by now, but until it hit that point, the 4090s were selling like hot cakes here while the lower-tier Nvidia cards were sitting or moving much slower.

I mean, that is one data point. I'm not saying you're wrong, but it seems like we don't have any real info to base sales on. 

 

I just find it really hard to believe that the $1600 gaming GPU buyers club is significant in the overall market. I could be wrong, but there is little to go off of. 


2 hours ago, UltraMega said:

I think there is a technical factor being missed in this discussion. 

 

I'm not technical enough to say I fully understand this, but I have seen a lot of info from people who are, and as I understand it, Nvidia actually made comments to the same effect before AMD; they just phrased it differently.

 

Nvidia makes monolithic GPUs while AMD has moved to chiplets. As I understand it, there is sort of a hard limit on how big a monolithic GPU can be, and Nvidia is close to hitting that limit. They have said they face challenges with their monolithic design going forward from where they are now. Monolithic GPUs are more efficient, but they have a limit on how large they can get relative to their node. Chiplets are the way around this issue, but they come with a power cost as it's less efficient to move data around on chiplets vs a monolithic GPU. 

Both companies are facing similar challenges as GPUs get so big and nodes get so small that the physics starts to become an overwhelming obstacle. 

That is indeed a massive factor moving forward, and something Nvidia will have to address eventually. AMD has already started mitigating that issue with its chiplet design, as you say, so in that respect they are ahead of the game despite the chiplet disadvantages. Honestly, I would love to see AMD do to Nvidia what they did to Intel with the Ryzen CPUs.

 

Chiplet design is still in its infancy, and I think over time the disadvantages can be minimized for both CPUs and GPUs.

 

GPU and CPU architectures are different, but I would love to see some Ryzen-style action from an AMD GPU; that could be a game changer.

 

Ultimately, we consumers just want competition in the market. Unchecked markets lead to continually high prices and innovation stagnation. I would like Nvidia to have some pressure on their pricing at the high-end product tiers.

 


7 hours ago, UltraMega said:

I mean, that is one data point. I'm not saying you're wrong, but it seems like we don't have any real info to base sales on. 

 

I just find it really hard to believe that the $1600 gaming GPU buyers club is significant in the overall market. I could be wrong, but there is little to go off of. 

I think your mistake is limiting your demographic to "gamers". Enthusiasts and gamers who want the best every gen are a thing.

 

Yes, it's one data point, but it makes sense. The 4090 only got a $100 increase over the 3090's MSRP, but a massive uplift in performance.

 

On the other hand, the 4080 got a $500 increase in MSRP, and everything else down the stack got gimped plus a price hike. Nobody is interested in those cards unless they are on a budget (in which case I suspect many are considering cheap 6950 XTs) or are looking for something small and efficient like a 4070 for an SFF build.
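To put rough numbers on that gen-on-gen comparison, here is a quick back-of-envelope in Python. The launch MSRPs are from memory (3090 $1499, 4090 $1599, 3080 $699, 4080 16GB $1199), so double-check them against Nvidia's announcements:

```python
# Launch MSRPs in USD (from memory; verify against official announcements).
msrps = {
    "3090": 1499, "4090": 1599,   # flagship tier
    "3080": 699,  "4080": 1199,   # 80-class tier (4080 16GB)
}

def gen_on_gen(old_card, new_card):
    """Return (absolute MSRP increase, percent increase) between two cards."""
    old, new = msrps[old_card], msrps[new_card]
    return new - old, round(100 * (new - old) / old, 1)

print(gen_on_gen("3090", "4090"))  # (100, 6.7)  -> modest bump at the top
print(gen_on_gen("3080", "4080"))  # (500, 71.5) -> huge bump one tier down
```

The asymmetry is the whole argument: a ~7% price increase on the flagship versus ~72% on the 80-class card.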


9 hours ago, UltraMega said:

I think there is a technical factor being missed in this discussion. 

 

I'm not technical enough to say I fully understand this, but I have seen a lot of info from people who are, and as I understand it, Nvidia actually made comments to the same effect before AMD; they just phrased it differently.

 

Nvidia makes monolithic GPUs while AMD has moved to chiplets. As I understand it, there is sort of a hard limit on how big a monolithic GPU can be, and Nvidia is close to hitting that limit. They have said they face challenges with their monolithic design going forward from where they are now. Monolithic GPUs are more efficient, but they have a limit on how large they can get relative to their node. Chiplets are the way around this issue, but they come with a power cost as it's less efficient to move data around on chiplets vs a monolithic GPU. 

Both companies are facing similar challenges as GPUs get so big and nodes get so small that the physics starts to become an overwhelming obstacle. 

Yes, this is called the reticle limit. On 4nm (which, to be clear, is a refined 5nm node, not a true 4nm), they are still a ways off from hitting that limit. I forget where I heard this, but when they presumably go to TSMC 3nm next gen, that reticle limit goes down massively. Theoretically they can use advanced packaging technology to push the limit back up, but that just adds cost.
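A rough sketch of how close today's biggest monolithic die is to that limit. Numbers are approximate and from memory: the standard EUV exposure field is 26 mm x 33 mm, AD102 (the 4090 die) is roughly 609 mm^2, and the upcoming High-NA EUV tools (often cited as the reason the limit shrinks, rather than the node itself) halve the field in one direction:

```python
# Approximate figures; treat these as illustrative, not authoritative.
RETICLE_MM2 = 26 * 33        # 858 mm^2, standard EUV exposure field
HIGH_NA_MM2 = 26 * 16.5      # 429 mm^2, halved field on High-NA tools
AD102_MM2 = 609              # approx. AD102 (RTX 4090) die size

def pct_of_limit(die_mm2, limit_mm2):
    """Percent of the reticle limit a die occupies."""
    return round(100 * die_mm2 / limit_mm2, 1)

print(pct_of_limit(AD102_MM2, RETICLE_MM2))  # 71.0  -> fits with headroom today
print(pct_of_limit(AD102_MM2, HIGH_NA_MM2))  # 142.0 -> would not fit; needs chiplets/packaging
```

Under those assumptions, a 4090-sized monolithic die simply cannot be exposed in one shot on the smaller field, which is why both vendors end up looking at chiplets or advanced packaging.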

 

EDIT: Sorry, on mobile, was not trying to double post.

Edited by Sir Beregond


22 hours ago, UltraMega said:

I mean, that is one data point. I'm not saying you're wrong, but it seems like we don't have any real info to base sales on. 

 

I just find it really hard to believe that the $1600 gaming GPU buyers club is significant in the overall market. I could be wrong, but there is little to go off of. 

 

You can look at the submission counts on 3DMark as well for another kind of out-there number that gives some, but not enough, insight.


On 12/08/2023 at 07:18, Sir Beregond said:

I think your mistake is limiting your demographic to "gamers". Enthusiasts and gamers who want the best every gen are a thing.

 

Yes, it's one data point, but it makes sense. The 4090 only got a $100 increase over the 3090's MSRP, but a massive uplift in performance.

 

On the other hand, the 4080 got a $500 increase in MSRP, and everything else down the stack got gimped plus a price hike. Nobody is interested in those cards unless they are on a budget (in which case I suspect many are considering cheap 6950 XTs) or are looking for something small and efficient like a 4070 for an SFF build.

 

 

Of course I am aware that people buy GPUs for workstations and other non-gaming related tasks. Still, we have no real data to go off of to get a good idea of real world 4090 sales. I don't think it's fair to say I've made any kind of mistake when we have no real data one way or the other. 

 

I simply think as prices go up, the number of people willing to pay those prices probably goes down and if that's the case, AMD probably isn't missing much. I could be way off but there is no good data to point to so we really just don't know. Based on what I can tell, it definitely sold well for the first few weeks of release which makes sense with it being the fastest consumer GPU available, but I can't find anything about ongoing sales. 

 

This is the only recent article I've found that is talking about 4090 sales, but it doesn't give any good data points. https://www.pcgamesn.com/nvidia/rtx-4090-sales-trouble

 

If it's true that Nvidia canceled a 4090Ti, it could be because AMD isn't competing at that price point and/or because the 4090 isn't selling well enough to suggest a more expensive card would do well. If I had to guess, the 4090 sales are probably not bad but probably a little worse than 3090 sales. There are always people who want or need the fastest hardware available, but I think GPU prices have risen faster than most people's willingness to pay. 

 

 

On a different note, one thing is for sure: Nvidia left the mid-range wide open with its skimpy VRAM. If AMD gets some cards out with similar raster performance but more VRAM at a similar price, they will probably do very well in that segment, which is usually by far the largest. I think most gamers would take more VRAM over better RT and DLSS if most other factors were about equal. It feels like AMD is still waiting for 6000-series cards to clear out, though, since they are already pretty good mid-range cards.

 


13 minutes ago, UltraMega said:

 

 

Of course I am aware that people buy GPUs for workstations and other non-gaming related tasks. Still, we have no real data to go off of to get a good idea of real world 4090 sales. I don't think it's fair to say I've made any kind of mistake when we have no real data one way or the other. 

 

I simply think as prices go up, the number of people willing to pay those prices probably goes down and if that's the case, AMD probably isn't missing much. I could be way off but there is no good data to point to so we really just don't know. Based on what I can tell, it definitely sold well for the first few weeks of release which makes sense with it being the fastest consumer GPU available, but I can't find anything about ongoing sales. 

 

This is the only recent article I've found that is talking about 4090 sales, but it doesn't give any good data points. https://www.pcgamesn.com/nvidia/rtx-4090-sales-trouble

 

If it's true that Nvidia canceled a 4090Ti, it could be because AMD isn't competing at that price point and/or because the 4090 isn't selling well enough to suggest a more expensive card would do well. If I had to guess, the 4090 sales are probably not bad but probably a little worse than 3090 sales. There are always people who want or need the fastest hardware available, but I think GPU prices have risen faster than most people's willingness to pay. 

 

 

On a different note, one thing is for sure: Nvidia left the mid-range wide open with its skimpy VRAM. If AMD gets some cards out with similar raster performance but more VRAM at a similar price, they will probably do very well in that segment, which is usually by far the largest. I think most gamers would take more VRAM over better RT and DLSS if most other factors were about equal. It feels like AMD is still waiting for 6000-series cards to clear out, though, since they are already pretty good mid-range cards.

 

I don't know, man. They keep selling at my local Micro Center. I think in general the 40-series is a poor seller, but the 4090, while being the most expensive model, is also the only real gen-on-gen gain for a minimal increase in MSRP. So the generational money/performance increase on it was actually decent. The 4080, on the other hand, had none of that.

 

To your last point, I do agree. AMD tends to do well when they put out a compelling, better-value offering in the mid-range. If they don't skimp on VRAM and also provide a bit of a discount for the weaker RT and DLSS, it's a winner. Even last gen, the 6600 XT and 6700 XT were easy wins vs the Nvidia offerings at similar pricing.



2 hours ago, Sir Beregond said:

I don't know, man. They keep selling at my local Micro Center. I think in general the 40-series is a poor seller, but the 4090, while being the most expensive model, is also the only real gen-on-gen gain for a minimal increase in MSRP. So the generational money/performance increase on it was actually decent. The 4080, on the other hand, had none of that.

 

To your last point, I do agree. AMD tends to do well when they put out a compelling, better-value offering in the mid-range. If they don't skimp on VRAM and also provide a bit of a discount for the weaker RT and DLSS, it's a winner. Even last gen, the 6600 XT and 6700 XT were easy wins vs the Nvidia offerings at similar pricing.

One Micro Center, sporadically browsed, is not convincing data.


5 hours ago, UltraMega said:

One Micro Center, sporadically browsed, is not convincing data.

I've noticed the exact same thing at my local Micro Center in a different part of the country.

 

The 4080 stands out as the most unloved card. They just sit on the shelves, and when they are sold, they are often returned. The open-box returns sit for weeks until they get discounted enough to make a 4080 seem like a decent value. It's almost like they priced the card that badly just to make the 4090 easier to justify.

 

I know this is just one more data point, but I check stock regularly. I also check the new Indy location and the Columbus location and see similar trends.


32 minutes ago, Fluxmaven said:

I've noticed the exact same thing at my local Micro Center in a different part of the country.

 

The 4080 stands out as the most unloved card. They just sit on the shelves, and when they are sold, they are often returned. The open-box returns sit for weeks until they get discounted enough to make a 4080 seem like a decent value. It's almost like they priced the card that badly just to make the 4090 easier to justify.

 

I know this is just one more data point, but I check stock regularly. I also check the new Indy location and the Columbus location and see similar trends.

 

They definitely priced the 4080 to push 4090 sales. It's the same reason no one ever gets a small or medium popcorn at the movie theater: why spend $4 for a small bag when $7 gets you a huge bucket? This is called "decoy pricing".

 

Even if we take the observations of two microcenters as being representative of the entire market, all that really means is the 4080 is not selling. 

 

 

I doubt anyone here is going to microcenter so much that they are accurately tracking 4090 sales figures because of it. 

 

Nvidia has learned to control GPU supply as well now, so they can keep prices high by controlling the scarcity of their products. They would rather sell fewer units at a higher price than let the market reach equilibrium.
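The popcorn version of decoy pricing is just unit-price arithmetic. The $4/$7 prices come from the example above; the ounce figures below are made up purely for illustration:

```python
# Decoy pricing sketch: the "small" exists mainly to make the big option
# look like the deal. Prices from the popcorn example; sizes are hypothetical.
options = {"small": (4.00, 5), "large": (7.00, 15)}  # (price USD, ounces)

def unit_price(name):
    """Dollars per ounce for a given option."""
    price, oz = options[name]
    return round(price / oz, 2)

print(unit_price("small"))  # 0.8
print(unit_price("large"))  # 0.47 -> the large looks far cheaper per ounce
```

The same logic maps onto the 4080/4090: price the middle option badly enough and the expensive one looks rational by comparison.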

Edited by UltraMega


29 minutes ago, UltraMega said:

 

They definitely priced the 4080 to push 4090 sales. It's the same reason no one ever gets a small or medium popcorn at the movie theater: why spend $4 for a small bag when $7 gets you a huge bucket? This is called "decoy pricing".

 

Even if we take the observations of two microcenters as being representative of the entire market, all that really means is the 4080 is not selling. 

 

 

I doubt anyone here is going to microcenter so much that they are accurately tracking 4090 sales figures because of it. 

 

Nvidia has learned to control GPU supply as well now, so they can keep prices high by controlling the scarcity of their products. They would rather sell fewer units at a higher price than let the market reach equilibrium.

Believe whatever you want. You have no data, and we've provided you two observations. I'm not claiming we are providing data, as I haven't given you any numbers, but I can say the following:

 

  • My Micro center keeps accurate stock numbers on their website
  • I watch stocks multiple times a week
  • 4090s move
  • 4080s sit
  • lower stack moves slowly
  • 4090s are the fastest moving 40-series cards
  • RDNA2 cards are moving too

 

Flux lives on the opposite side of the country from me and sounds like he is seeing the same thing. Yeah, it's just two stores, but I think one can get an idea of what may be going on.

 

As for Nvidia doing stock control, absolutely they are doing that. AI is their new cash cow. If consumer gaming cards aren't selling at the new prices asked and Nvidia is unwilling to lower their margins to price correct, then they are just going to move more silicon back to AI products and control stock of their only popular card (the 4090) so it doesn't drop in price.

Edited by Sir Beregond


I guess my intended sarcasm didn't come through; I'm well aware of decoy pricing.

 

Also, when I stated I noticed the same thing, I was referring to the fact that 4090s are selling, not just that the 4080 is a bad value/not selling.

 

Between work and personal interest, I'm at Micro Center a couple of times a month and check prices/inventory daily. Also, as mentioned, I'm looking at three locations, so between just Sir B and me, that's four locations being observed.

 

I do realize that Micro Centers in general only serve a small portion of the country and thus don't represent total sales. The whole point of mentioning any of this is just to point out that yes, people do in fact buy 4090s.



17 minutes ago, Sir Beregond said:

Believe whatever you want. You have no data, and we've provided you two observations. I'm not claiming we are providing data, as I haven't given you any numbers, but I can say the following:

 

  • My Micro center keeps accurate stock numbers on their website
  • I watch stocks multiple times a week
  • 4090s move
  • 4080s sit
  • lower stack moves slowly
  • 4090s are the fastest moving 40-series cards
  • RDNA2 cards are moving too

 

Flux lives on the opposite side of the country from me and sounds like he is seeing the same thing. Yeah, it's just two stores, but I think one can get an idea of what may be going on.

 

As for Nvidia doing stock control, absolutely they are doing that. AI is their new cash cow. If consumer gaming cards aren't selling at the new prices asked and Nvidia is unwilling to lower their margins to price correct, then they are just going to move more silicon back to AI products and control stock of their only popular card (the 4090) so it doesn't drop in price.

I think what you see at Micro Centers is probably just decoy pricing in effect. It doesn't mean the 4090 is selling well, just that it's selling better than the 4080, which is clearly intentional on Nvidia's part and therefore not at all a surprise.

 

You can't really judge by stock supplies since the supply is being limited to keep prices high. 

 

Again, I'm not saying you're wrong, but two people who have visited and watched Micro Centers is just not enough to go on for a well-informed opinion on this.


9 minutes ago, UltraMega said:

I think what you see at Micro Centers is probably just decoy pricing in effect. It doesn't mean the 4090 is selling well, just that it's selling better than the 4080, which is clearly intentional on Nvidia's part and therefore not at all a surprise.

 

You can't really judge by stock supplies since the supply is being limited to keep prices high. 

 

Again, I'm not saying you're wrong, but two people who have visited and watched Micro Centers is just not enough to go on for a well-informed opinion on this.

Just watch Reddit and other places too. People are building stuff with 4090s.

 

Like I said, believe whatever you want; this is what I am observing in the real world. Because I can't provide sales numbers you don't want to believe it, and that's fine, but it's naive to dismiss what's going on just because you can't put a number to it.

 

You know what we do have numbers on? Market share. AMD has been completely uncompetitive in the flagship/top-end space for a decade: basically everything between the last card that was (the 290X) and the most recent card that was (the 6900 XT/6950 XT). What has that done to their market share but shrink it over the past decade? AMD might get taken seriously again if they actually push to one-up Nvidia in features and performance, but either they can't or they won't. They seem perfectly content to just slot into Nvidia's pricing stack while still being overpriced at launch considering the deficiencies, and they don't seem to care. Gaming may be a big deal for AMD's business, but I suspect that's primarily consoles; dedicated GPUs are probably near the bottom of what they care about. It shows.

Edited by Sir Beregond


27 minutes ago, Sir Beregond said:

Just watch Reddit and other places too. People are building stuff with 4090s.

 

Like I said, believe whatever you want; this is what I am observing in the real world. Because I can't provide sales numbers you don't want to believe it, and that's fine, but it's naive to dismiss what's going on just because you can't put a number to it.

 

You know what we do have numbers on? Market share. AMD has been completely uncompetitive in the flagship/top-end space for a decade: basically everything between the last card that was (the 290X) and the most recent card that was (the 6900 XT/6950 XT). What has that done to their market share but shrink it over the past decade? AMD might get taken seriously again if they actually push to one-up Nvidia in features and performance, but either they can't or they won't. They seem perfectly content to just slot into Nvidia's pricing stack while still being overpriced at launch considering the deficiencies, and they don't seem to care. Gaming may be a big deal for AMD's business, but I suspect that's primarily consoles; dedicated GPUs are probably near the bottom of what they care about. It shows.

Not sure why this is turning into a heated issue for you. We don't have sales numbers, so this is mostly speculative. I've said multiple times I'm not saying you're wrong, but it's hard to judge from mere observations when decoy pricing is in effect, because it skews the observational data.

 

I totally agree Nvidia is dominating in terms of market share, no question; I never said otherwise. Nvidia's biggest competitors right now are the GeForce 30-series and AMD 6000-series cards, and I suspect the entire 40-series is losing hard to last-gen cards.

 

If I bring my own observations into this: as someone who buys a lot of used parts, I can tell you the used market is definitely thriving. I suspect for every 4090 sold, half a dozen or so 3070s sell on the used market.

Edited by UltraMega


4 minutes ago, UltraMega said:

Not sure why this is turning into a heated issue for you. We don't have sales numbers, so this is mostly speculative. I've said multiple times I'm not saying you're wrong, but it's hard to judge from mere observations when decoy pricing is in effect, because it skews the observational data.

 

I totally agree Nvidia is dominating in terms of market share, no question; I never said otherwise. Nvidia's biggest competitors right now are the GeForce 30-series and AMD 6000-series cards, and I suspect the entire 40-series is losing hard to last-gen cards.

No heat here.

 

It is hard to get numbers, but I don't feel it's hard to judge when you look at multiple forums, Reddit, local stores, etc. and can see the trend; that's really all I am saying. Is it the full picture? No.

 

Yes, I agree that the 40-series overall is probably losing to last-gen RX 6000 and any remaining 30-series stock. Where that is not true is at the top end, where the 4090 seems to sell just fine.

 

I guess my point is that the top-end market still means something and can drive public perception of your whole product lineup, whether you buy top end or not. Just playing the sweet-spot value market hasn't been a winning formula for any real market gain for AMD. It's been great for value gamers, but as AMD abandons even being the value player by slotting into Nvidia's pricing, that prospect goes away until their cards course-correct on price (see the 7900 XT).

Showcase

 Share

CPU: AMD Ryzen 9 5900X
GPU: Nvidia RTX 3080 Ti Founders Edition
RAM: G.Skill Trident Z Neo 32GB DDR4-3600 (@ 3733 14-8-14-14-21-35 1T GDM)
MOTHERBOARD: ASUS Crosshair VIII Dark Hero
SSD/NVME: x2 Samsung 970 Evo Plus 2TB
SSD/NVME 2: Crucial MX500 1TB
PSU: Corsair RM1000x
MONITOR: LG 48" C1
Full Rig Info

Owned

 Share

CPU: Intel Core i7-4790K, Core i9-10900K, Core i3-13100, Core i9-13900KS
GPU: various
RAM: Corsair 32GB DDR3-2400 | Oloy Blade 16GB DDR4-3600 | Crucial 16GB DDR5-5600
MOTHERBOARD: ASUS Z97 Deluxe | EVGA Z490 Dark | EVGA Z790 Dark Kingpin
SSD/NVME: Samsung 870 Evo 1TB | Inland 1TB Gen 4
PSU: BeQuiet Straight Power 12 1500W
CASE: Cooler Master MasterFrame 700 - bench mode
OPERATING SYSTEM: Windows 10 LTSC
Full Rig Info

Owned

 Share

CPU: M1 Pro
RAM: 32GB
SSD/NVME: 1TB
OPERATING SYSTEM: MacOS Sonoma
CASE: Space Grey
Full Rig Info
Link to comment
Share on other sites

4 minutes ago, Sir Beregond said:

No heat here.

 

It is hard to get numbers, but I don't feel it's hard to judge the trend when you look at multiple forums, Reddit, local stock, etc. That's really all I am saying. Is it the full picture? No.

 

Yes, I agree that the 40-series in overall numbers is probably losing to last-gen RX 6000 and any remaining 30-series stock. Where that is not true is at the top end, where the 4090 seems to sell just fine.

 

I guess my point is that the top-end market still means something and can drive public perception of your product lineup as a whole, whether you buy top end or not. My overall point is that just playing the sweet-spot value market hasn't really been a winning formula to drive any real market gain for AMD. It's been great for value gamers, but as AMD abandons even being the value player by slotting into Nvidia's pricing, that prospect goes away until their own cards' pricing course-corrects (see 7900 XT).

I know the steam hardware survey is not a total picture of the market, but consider these numbers:

 

The 4090 has a 0.64% market share and a 0.1% adoption rate.

 

The 3070 has a 3.15% market share and an adoption rate of 0.25%.

 

At least according to Steam, the 3070, a last-gen card with only 8GB, is still selling 2.5x more than the 4090. 

 

 

The 3070 actually has the highest adoption rate of any GPU on Steam right now. The entire 4000 series (not including mobile chips) has an adoption rate of 0.63%, which is less than the adoption rate of just the 3060, 3070, 3070 Ti, 3080 and 3080 Ti combined (0.65%). If you were to add up 2000-series and 1000-series cards as well, it's clear that the used market is outselling the 4000 series by a lot right now. 

 

Since these are the only numbers we have, and they seem to show the 4000 series still losing badly to last-gen cards to this day, it seems reasonable to conclude that none of the 4000-series cards are selling especially well, including the 4090. Though the 4090 has the highest adoption rate of any card in the 4000 series, it's still very low compared to last-gen cards. 

I would agree that the 4090 is selling well compared to other 4000-series cards, but I maintain that it's very likely selling less than a typical top-tier card does, which, back to the topic at hand, suggests that AMD isn't missing much by not competing with the 4090. 
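Since those adoption figures get compared a few different ways above, here's a quick back-of-the-envelope check of the arithmetic. The percentages are just the ones quoted in this post, not live Steam data:

```python
# Sanity-check the Steam survey arithmetic quoted above.
# All percentages are the figures cited in this post, not live survey data.

rtx_4090_adoption = 0.10   # monthly adoption, % of surveyed machines
rtx_3070_adoption = 0.25

# "the 3070 ... is still selling 2.5x more than the 4090"
ratio = rtx_3070_adoption / rtx_4090_adoption
print(f"3070 vs 4090 adoption: {ratio:.1f}x")

# Entire desktop 40-series vs just five Ampere cards combined
rtx_4000_total    = 0.63   # all 40-series desktop cards
ampere_five_total = 0.65   # 3060 + 3070 + 3070 Ti + 3080 + 3080 Ti
gap = ampere_five_total - rtx_4000_total
print(f"Those five 30-series cards out-adopt the whole 40-series by {gap:.2f} points")
```

Nothing fancy, just making the 2.5x ratio and the 0.63% vs 0.65% comparison explicit.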


1 hour ago, UltraMega said:

Not sure why this is turning into a heated issue for you. We don't have sales numbers, so this is mostly speculative. I've said multiple times I'm not saying you're wrong, but it's hard to judge based on mere observations when price decoying is in effect because it somewhat skews the observational data.

 

I totally agree Nvidia is dominating the market in terms of market share, no question. I never said otherwise. Nvidia's biggest competitor right now is GeForce 3000 and AMD 6000 cards, and I suspect the entire 4000 series is losing hard to last-gen cards. 

 

If I bring my own observations into this, as someone who buys a lot of used parts, I can tell you the used market is definitely thriving. I suspect for every 4090 sold, half a dozen or so 3070s sell on the used market. 

 

I agree that the 4000 series is likely losing out to 3000-series sales, at least toward the top end. While the 4090 is an awesome GPU, unless you have some specific requirements warranting it, the jump from the 3090 to the 4090 does not make an awful lot of sense when taking into account the performance-vs-price uplift. The cost of jumping to the 4090 from the 3090 far outweighs the performance benefits, at least in my mind.

 

I am wondering what the 5000 series will bring, other than the astronomical price tag of the 5090/Ti lol.


29 minutes ago, ENTERPRISE said:

 

I agree that the 4000 series is likely losing out to 3000-series sales, at least toward the top end. While the 4090 is an awesome GPU, unless you have some specific requirements warranting it, the jump from the 3090 to the 4090 does not make an awful lot of sense when taking into account the performance-vs-price uplift. The cost of jumping to the 4090 from the 3090 far outweighs the performance benefits, at least in my mind.

 

I am wondering what the 5000 series will bring, other than the astronomical price tag of the 5090/Ti lol.

I mean, I suspect many like you had the same feeling, but others also saw it as the first 4K120 card and jumped on it, because the 3090 is not that, and let's be real, a lot of enthusiasts don't seem to have a problem dropping that kind of dough every gen. I'm not one of them, but I do see that.

 

Also, I haven't seen any new 3090s, 3090 Tis, 3080 Tis, or 3080s in stock for months, so I am pretty sure Nvidia has mostly sold through its inventory of top-end GA102-based Ampere cards. Below that, though, if there are good prices, the 30-series might still be selling well. Personally, I saw an uptick in RDNA2 sales more than anything at my local Micro Center.


52 minutes ago, ENTERPRISE said:

 

I agree that the 4000 series is likely losing out to 3000-series sales, at least toward the top end. While the 4090 is an awesome GPU, unless you have some specific requirements warranting it, the jump from the 3090 to the 4090 does not make an awful lot of sense when taking into account the performance-vs-price uplift. The cost of jumping to the 4090 from the 3090 far outweighs the performance benefits, at least in my mind.

 

I am wondering what the 5000 series will bring, other than the astronomical price tag of the 5090/Ti lol.

For gaming, the 4090 kinda solves a problem that doesn't really exist yet. A GPU fast enough to do 4K at 200+ FPS in a typical modern game is cool, but even among enthusiasts, very few people have a screen that can keep up with that. CP2077 path tracing is cool, but it's just one game.

 

18 minutes ago, Sir Beregond said:

I mean, I suspect many like you had the same feeling, but others also saw it as the first 4K120 card and jumped on it, because the 3090 is not that, and let's be real, a lot of enthusiasts don't seem to have a problem dropping that kind of dough every gen. I'm not one of them, but I do see that.

 

Also, I haven't seen any new 3090s, 3090 Tis, 3080 Tis, or 3080s in stock for months, so I am pretty sure Nvidia has mostly sold through its inventory of top-end GA102-based Ampere cards. Below that, though, if there are good prices, the 30-series might still be selling well. Personally, I saw an uptick in RDNA2 sales more than anything at my local Micro Center.

 

I disagree that 4K 120Hz+ adoption is currently high at all. You can find people on sites like this who do it, but it's a very tiny part of the overall market IRL. I mean, you need the $1600 GPU and then a screen that's probably even more expensive. But I totally agree that it is the first reliable 4K120 GPU. 

 

The 3000 series is 100% for sure outselling the 4000 series right now. You're not going to see as much older stock in a store in person, and you won't see the used market at all. 

 

Best selling GPU on newegg is the 3060: Desktop Graphics Cards | Video Cards - Newegg.com

 

 

 

Thinking about this subject, I can't help but think that until AMD is close to parity on RT, all they would do by releasing a card to compete with the 4090 is add some price pressure on it, because people in that price bracket are not going to choose AMD to save $100 or so if it has fewer features. Maybe we would see the 4090 drop in price to $1500 and still no one would buy the AMD card lol. 


On 07/08/2023 at 19:17, acoustic said:

The reality is that AMD isn't even winning at the low-end. Intel is doing an awesome job pricing competitively, and I think Arc is an interesting product.. can't say the same about the RX7600. Considering how poor AMD's GPU section seems to be managed, it's not surprising they haven't realized Intel isn't competing with NV -- Intel is stealing what little pebbles of market-share AMD still had, and AMD is letting them.

I've had the displeasure of using an Arc 730. It couldn't even play YouTube videos without massive rendering issues. Most people measure graphics performance in frames per second; that card was how many seconds per frame. Intel is not competitive in the graphics market. The 730 should never have been sold; graphics cards from 15 years ago work better than that.

 

The enthusiast hardware group has always been a small market compared to the economical stuff. In the enthusiast segment, AMD isn't the most popular, so it makes sense that they would reduce focus on high-end graphics. They have stated before that they want enterprise to be their primary focus, and servers don't prioritize graphics. AI is the next push, and AMD must be competitive in that department if they want to stay relevant.

 

Being a hardware enthusiast, I do hope AMD can offer competition to Nvidia's high end. I think AMD had the opportunity to sock it to Nvidia with their pricing this year, but instead they decided to maintain the status quo. I feel that ray tracing and DLSS are marketing gimmicks, and I don't care about those, but it's true that a lot of software doesn't work with AMD. Matrix jewelry design software doesn't work without CUDA cores, and it hasn't for years. Nvidia took their time to infiltrate the market and form partnerships; AMD is still just trying to sell hardware. Let's not forget the tessellation scandal, where game companies Nvidia partnered with were rendering tessellated geometry below the map because AMD cards didn't handle tessellation as well. 

Link to comment
Share on other sites

Steam, a gaming platform, shows that the most common cards are mid-range units... Color me surprised. 

 

The popular cards on Steam are always going to be mid-range and typically a gen or two behind. People building monster rigs are using them for something other than gaming, or barely have time to game... whereas kids who have infinite time to game can't afford top-end GPUs. 

 


 

I really hope AMD doesn't pull out of the enthusiast GPU space, because Intel is still way out from being a real competitor and obviously if you give Nvidia an inch, they will take a mile. 

 

Another Micro Center observation, when it comes to open-box GPUs: with the exception of the 4080, all other 4000-series cards tend to sell within the week. Which is interesting, because if you wait a week, the prices drop further. This suggests people would be interested in the midrange 4000 series if the prices were a bit better. The AMD cards tend to sit around a bit longer, waiting for a larger discount before people consider them. 

 

Which is interesting to me, because often in the initial discount period the 4000 cards are still, IMO, overpriced for what they offer, yet people will walk right past a better-priced, better-performing AMD card to grab the Nvidia stuff. Obviously the Nvidia cards do have some better features and perform better in some workloads, but it still seems odd.

 

People like to be associated with the winning team. Just because most people wouldn't buy a 4090-tier AMD card doesn't mean it wouldn't strengthen AMD's brand to be seen as an actual competitor and not just the bargain-bin second-place pick. 

 



Rumor has it that NVIDIA has slowed down their production of the 4090 just enough to keep TSMC biting.

 

If I had to speculate, my guess would be that Leather Jacket realizes that the larger part of what was their gaming market is now holding on to whatever disposable income they've got left to pay for things like food, shelter, and clothing these days. 

 

But I'm not Maslow; instead, it could be that Leather Jacket is now far more interested in his future of AI (e.g., NVIDIA A100 & H100 GPUs, and their NEXT series).

 

In any event, the way I/we see it, the argument is mostly moot, because I/we own two 4090s (among other cards) AND because you can't argue with a dead horse.

 

(well you could, but you might end up beating it to death.)  😉

 


12 minutes ago, Fluxmaven said:

Steam, a gaming platform, shows that the most common cards are mid range units... Color me surprised. 

 

The popular cards on Steam are always going to be mid-range and typically a gen or two behind. People building monster rigs are using them for something other than gaming, or barely have time to game... whereas kids who have infinite time to game can't afford top-end GPUs. 

 


 

I really hope AMD doesn't pull out of the enthusiast GPU space, because Intel is still way out from being a real competitor and obviously if you give Nvidia an inch, they will take a mile. 

 

Another Micro Center observation, when it comes to open-box GPUs: with the exception of the 4080, all other 4000-series cards tend to sell within the week. Which is interesting, because if you wait a week, the prices drop further. This suggests people would be interested in the midrange 4000 series if the prices were a bit better. The AMD cards tend to sit around a bit longer, waiting for a larger discount before people consider them. 

 

Which is interesting to me because often in the initial discount period, the 4000 cards are still IMO overpriced for what they offer, yet people will walk right past a better priced and performing AMD card to grab the Nvidia stuff. Obviously they do have some better features and perform better in some workloads, but it still seems odd.

 

People like to be associated with the winning team. Just because most people wouldn't buy a 4090 tier AMD card, doesn't mean it wouldn't strengthen their brand to be seen as an actual competitor and not just the bargain bin second place pick. 

 

Steam hardware surveys are not a representation of play time, just of the hardware on machines with Steam installed. 


18 minutes ago, Fluxmaven said:

Steam, a gaming platform, shows that the most common cards are mid range units... Color me surprised. 

 

The popular cards on Steam are always going to be mid-range and typically a gen or two behind. People building monster rigs are using them for something other than gaming, or barely have time to game... whereas kids who have infinite time to game can't afford top-end GPUs. 

 


 

I really hope AMD doesn't pull out of the enthusiast GPU space, because Intel is still way out from being a real competitor and obviously if you give Nvidia an inch, they will take a mile. 

 

Another Micro Center observation, when it comes to open-box GPUs: with the exception of the 4080, all other 4000-series cards tend to sell within the week. Which is interesting, because if you wait a week, the prices drop further. This suggests people would be interested in the midrange 4000 series if the prices were a bit better. The AMD cards tend to sit around a bit longer, waiting for a larger discount before people consider them. 

 

Which is interesting to me because often in the initial discount period, the 4000 cards are still IMO overpriced for what they offer, yet people will walk right past a better priced and performing AMD card to grab the Nvidia stuff. Obviously they do have some better features and perform better in some workloads, but it still seems odd.

 

People like to be associated with the winning team. Just because most people wouldn't buy a 4090 tier AMD card, doesn't mean it wouldn't strengthen their brand to be seen as an actual competitor and not just the bargain bin second place pick. 

 

Interesting, I'll have to keep a closer eye on the 4070 Ti and below. I saw a big uptick in RDNA2 movement here in Denver, but that's because, like you say, they are massively discounted. The 30-series hasn't seen a huge discount from 2020-2021 MSRPs as far as I can see. I also noticed that the only 30-series cards that still see any volume at my store are 3050s, 3060s, and 3060 Tis. The 3070 and 3070 Ti are more onesie-twosie. The 3080 and above have just been gone for months.

 

I don't really see RDNA3 cards moving either. Maybe once the 7800 series starts releasing, but the 7900 series has been priced completely wrong. AMD had an opportunity to sock it to Nvidia this gen, absolutely, but they chose not to. I don't know why people keep thinking AMD only doing mid-range cards is a winning formula for them. This is exactly why their market share shrank and why general consumer perception of Radeon as a brand is that it's second-rate. The only reason I see AMD pulling out of the enthusiast market is that they stopped giving a crap about trying to do anything revolutionary with Radeon and are content to serve 10-15% of the market. Sad, really, considering the turnaround they pulled on Intel with Ryzen and Epyc.

