
AMD confirms plans to not play ‘king of the hill’ with Nvidia


Quote

There have been lots of reports about AMD’s changing strategy around its desktop graphics cards, but we finally have something a bit more definitive.
 

When asked directly at a press Q&A at IFA 2024, Jack Huynh, AMD’s senior vice president and general manager of Computing and Graphics, discussed how the company’s upcoming approach to competing with Nvidia would change moving forward. The goal is still to achieve higher market share against Nvidia, but that may not involve releasing the high-end flagship tier that some PC enthusiasts want.

Source


I've never spent more than $800 on a GPU personally, so this doesn't bother me. I just wonder how they plan to compete more effectively in the lower segments. If they can't put real price pressure on Nvidia, then I don't see how they'll improve their market share.


Honestly, I don't think this idea is impractical at all. Not everyone cares about having the top-of-the-line GPU. On top of that, not everyone can afford one anyway, so concentrating on mid-tier markets makes sense for them long term. I think most people just want good value with their performance, so if AMD can deliver that, it's going to win more fans.


Mid tier or low tier would be great, as afaik the current options are bad and way overpriced.

 

A card with 4070 or 4070 Ti-level power for $500 from AMD, for example.

 

The low end, for people on a $300 budget, needs attention too.

 

This hobby has gotten far too expensive, and imo it's because of crazy inflation and Biden not reversing the China trade war tariffs. There was that bipartisan bill to try to bring electronics prices down, but I'm not seeing it in the stuff I've bought lately. (I know political talk isn't allowed; if what I said is over the line, delete this last paragraph. I was just mentioning factors that contribute to GPU prices being stupidly expensive.)


It's not just electronics!  

 

Things are high priced - everywhere. There are some caveats and special circumstances. 

 

1. Gas in gulf states - cheaper, lucky bastards. 

2. Arizona Iced Tea - it's still a dollar. 

3. Beer isn't any more expensive than it's ever been; there are just a lot more choices, and everything gets an automatic bump because of it. 

 

Back on topic: AMD isn't pivoting because the technology can't be produced; they're seeing the trend for what it is. Things in 10 to 20 years will not look much like they do today. 

 

It's only been 18 years since "smart phones" arrived (debatable, given the PDAs with cellular access in the '90s). In those 18 years, the entire world has changed: 

 

1. How we purchase everyday things and splurge items. 

2. How we purchase homes, how we obtain the information about them (no longer a closed circle), and how we move the money around to do it. 

3. One huge worldwide economic collapse, and one pandemic that nearly caused another. 

4. Education is completely different than when most of us went to school (at all levels). 

5. We (quite literally) just witnessed the first private commercial spacewalk. 

 

And it's only changing... faster now. AMD sees this - the future is not in the past. 


On 15/09/2024 at 17:15, neurotix said:

Mid tier or low tier would be great, as afaik the current options are bad and way overpriced.

 

A card with 4070 or 4070 Ti-level power for $500 from AMD, for example.

 

The 7800XT ($499) and 7900GRE ($549 MSRP) already fill that gap.

 

If they can get 4080-level performance for $500 next round, that will be solid.


I’m not an expert. I just don’t see how they’re going to hit their objective without getting extremely creative.

 

High costs start at the wafer, and TSMC has an effective monopoly. Unless AMD sources other manufacturers or, alternatively, designs a more advanced solution akin to Zen to keep costs down, I'm not seeing how they can improve performance and lower costs at the same time.
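To put rough numbers on the wafer-cost point, here is a minimal sketch of the standard dies-per-wafer and Poisson-yield arithmetic. Every figure in it (wafer price, die size, defect density) is an illustrative assumption, not actual TSMC pricing or AMD data:

```python
import math

# All figures below are illustrative assumptions, not actual TSMC
# pricing, AMD die sizes, or real defect densities.
wafer_cost_usd = 17000     # assumed price of one leading-edge 300 mm wafer
wafer_diameter_mm = 300
die_area_mm2 = 300         # assumed mid/high-end GPU die
defect_density = 0.1       # assumed defects per cm^2

# Standard gross-dies-per-wafer estimate with an edge-loss correction.
wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
gross_dies = int(wafer_area / die_area_mm2
                 - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

# Simple Poisson yield model: bigger dies fail more often.
yield_frac = math.exp(-defect_density * die_area_mm2 / 100)

good_dies = gross_dies * yield_frac
print(f"{gross_dies} gross dies, {yield_frac:.0%} yield, "
      f"~${wafer_cost_usd / good_dies:.0f} per good die")
```

Doubling the die area in that sketch roughly halves the gross dies and noticeably hurts yield, which is exactly the cost spiral a chiplet approach like Zen's is meant to break.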

 

I suspect performance will flatline with the 8000 series products, but hopefully with a price that makes a statement.

 

If you could get 7800XT performance early next year for, say, $300, that would move the needle.


I think desktop hardware is going to condense and become one piece of a streaming puzzle in the end. This seems like only the beginning from AMD's perspective: short- and mid-term coverage, with functionality moving off PCs altogether. 

 

If you could offer 120FPS (or more) to various devices at up to 1920x1080 (or 1440p) without expensive hardware, that would likely be more profitable for the masses in a growing market.
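For scale, a quick back-of-envelope bandwidth estimate for that kind of stream; the bits-per-pixel figure is an assumption in the rough range of modern hardware encoders, not a measured number for any real service:

```python
# Rough bandwidth sketch for a 1080p/120FPS stream. The bits-per-pixel
# figure is an assumption roughly in line with modern codecs.
width, height, fps = 1920, 1080, 120
bits_per_pixel = 0.1

bitrate_mbps = width * height * fps * bits_per_pixel / 1e6
print(f"~{bitrate_mbps:.0f} Mbit/s sustained per viewer")  # ~25 Mbit/s
```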

 

Gaming in any form (mobile, console or PC, streaming, TV, etc.) is only going to grow, because older populations who likely don't game anyway are slowly becoming history. The last of the silent generation are meeting their makers, and in 20 years Gen A will vote for the first time. 


By then, I don't think 'mainstream' hardware will remain the same. Maybe not.  Maybe that's 30 years out. I dunno. Maybe never. 


51 minutes ago, GanjaSMK said:

I think desktop hardware is going to condense and become one piece of a streaming puzzle in the end. …

 

For long-term vision and planning, sure, I could see conventional tech continuing to converge onto smaller devices for the average consumer, whether through streaming or other means that improve UX, remove barriers, and push the medium to be even more accessible. More content, more iteration, faster development, easier to play. None of which requires the fastest consumer card.

 

In the meantime, and back to current reality, there will still be enthusiasts, consumers, and businesses tied to models that rely on personal computers and components on regular upgrade schedules to keep cash flowing. I don't see that model shifting significantly for another 10 years at minimum.

 

Also, AMD can say a lot of things, but I'll believe it when I see it. No one needs an 8900 XTX from AMD, and I don't want to spend $1k+. What I would like, as a consumer, is hardware that can push the latest software, without significant compromises, at an affordable price.

 

If that means FSR4 and other tech or design choices that circumvent the challenge of more raw compute, I'm fine with that. As long as the results are consistent.

 

The challenge, as we all know, is that the marketing doesn't hold its weight. The results are compromised (lack of VRAM, poor upscaling, high power draw, etc.), and we, specifically enthusiasts, are left unsatisfied.

 

So we’re back at square one.

 

Most of us want the absolute best with minimal compromise. It's the reason most of us avoid upscalers and use OLED or high-refresh-rate displays. It's the same reason we concede and spend more than we want to: so we can avoid wasting our time and just enjoy the content we like in what little time we have.

 

Sounds cryptic but that’s more or less what influences our decisions. 

 

So yeah, AMD can skip the high end if they choose to, and I think that's fine, but it requires them not to drop the ball on their vision. Please deliver mid-range parts that can play most games at native 4K/60fps with mid-level settings/RT at affordable prices.

 

Otherwise, they're just wasting time in a very competitive, fast-paced industry.


5 hours ago, GanjaSMK said:

… If you could offer 120FPS (or more) to various devices at up to 1920x1080 (or 1440p) without expensive hardware, that would likely be more profitable for the masses in a growing market.

This already exists. Nvidia has a streaming service that can do 240FPS for $20 a month. It's called GeForce Now. 

 

Streaming is never going to be big in the gaming market, because no matter how good the hardware involved is, latency will always be an issue; it's just physics. So many of the most popular games on PC rely on input latency being as low as possible that streaming just isn't practical as anything more than a niche. It only really works for games where latency doesn't matter, and there's no getting around that. Faster internet speeds and better hardware won't change the latency issue. 
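As a rough illustration of that physics floor, here's a minimal latency-budget sketch. The encode, decode, capture, and display figures are assumptions, and real networks add routing overhead on top of straight-line propagation:

```python
# Back-of-envelope round-trip latency floor for game streaming.
# All non-propagation figures are assumptions; real routes add overhead.
FIBER_KM_PER_MS = 200          # light in fiber covers roughly 200 km per ms

def added_latency_ms(distance_km, encode_ms=5, decode_ms=5,
                     capture_ms=8.3, display_ms=10):
    # Propagation there and back, plus fixed pipeline stages.
    rtt = 2 * distance_km / FIBER_KM_PER_MS
    return rtt + encode_ms + decode_ms + capture_ms + display_ms

for km in (100, 500, 2000):
    print(f"{km:>4} km to the server -> "
          f"~{added_latency_ms(km):.0f} ms on top of the game itself")
```

Even with a data centre only 100 km away, the fixed pipeline stages alone keep the added delay around 30 ms, which local rendering simply doesn't pay.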

 

 

 

Hopefully AMD just keeps doing what they did with the 7000 series. As is, they really only lack a solid 4090 competitor; the 7900XTX is close enough to the 4080 to compete there. A lot of people on sites like this may have 4090s, but Steam shows the 4090 with less than 1% of the market.

 

Right now AMD is just suffering from being so far behind on upscaling and RT. Their roadmap has always made RDNA4 their real focus for RT, and the 8000 series will do AI-enhanced upscaling like DLSS and XeSS as well. Hopefully they can bring those things up to be good enough to hold up against Nvidia; if so, they'll be in a good position to regain some ground. There are still very few games that make demanding RT a big part of the presentation, but we all know that's where things are headed, so people feel like Nvidia is more future-proof right now. 

 

3 hours ago, Slaughtahouse said:

Most of us want the absolute best with minimal compromise. It's the reason most of us avoid upscalers and use OLED or high-refresh-rate displays. …

I don't avoid upscalers at all. It's the first thing I'll turn to if I need more performance in a game. 

 

On my 7900XT, it's fast enough that I never really need to use a setting lower than FSR quality mode unless I try to do path tracing or something like that, and quality mode is perfectly fine most of the time.
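For anyone wondering what those modes mean in practice, here's a small sketch using the per-axis scale factors commonly documented for FSR 2 / DLSS 2 style upscalers; exact ratios can vary by game and version:

```python
# Commonly documented per-axis scale factors for FSR 2 / DLSS 2 modes
# (exact ratios can vary by game and version).
SCALE = {
    "quality": 1.5,
    "balanced": 1.7,
    "performance": 2.0,
    "ultra performance": 3.0,
}

def internal_resolution(out_w, out_h, mode):
    s = SCALE[mode]
    return round(out_w / s), round(out_h / s)

# At 4K output, quality mode renders at roughly 1440p internally,
# which is why it holds up so much better than the lower modes.
for mode in SCALE:
    print(f"{mode:>17}: {internal_resolution(3840, 2160, mode)}")
```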

 

I think the vast majority of PC gamers who understand what these upscalers are use them regularly, and those who don't know what they are probably have them enabled by default without knowing it in a lot of their games.  

 

 

I get a good comparison for upscaling since my gf and I sometimes play games together in the same room, and I can see the differences between DLSS and FSR in real time. There is no denying that DLSS is pretty impressive. It can go really low and still look good. Here I am with a big expensive GPU, and my gf's GeForce 2070 Super can run games that look just as good in a lot of cases 🤣

 

Maybe in a few more years we will all be playing demanding games at 1080p or lower and upscaling heavily all the time. It's already gotten really good, and it hasn't been around very long; if you don't count DLSS 1.0 and just look at upscaling from the point where it became good enough to be practical, it's still a pretty new thing. I don't think Nvidia planned on DLSS even existing until they had tensor cores on their GPUs and wanted a way to justify the extra cost to gamers. They sort of stumbled into a success with DLSS in that sense, and looking for a way to justify tensor cores just worked out really well. FSR is really impressive IMO for what it's able to do without any special hardware tricks.



7 hours ago, UltraMega said:

I don't avoid upscalers at all. It's the first thing I'll turn to if I need more performance in a game. …

 

That's fair (leaving them on, or prioritizing leaving them on), and I think typical gamers don't make a point of toggling them off like I, or maybe other enthusiasts, do. I just believe that with channels like DF going hard on FSR, video after video, and the quality really not being there, people have taken notice, and it's one of the reasons they continue to purchase Nvidia. DLSS is better. How much better it is today vs. 2+ years ago when I last used it, I can't say, but when I had my 3060 Ti, I had no problem running games on my TV at 4K with DLSS quality. Anything less than quality was easily noticeable as a compromise. On my monitor (1440p) I would still disable it, though, either due to the viewing distance or the lower rendering resolution. 

 

If FSR4, or whatever the next version is called, can improve the IQ, I think that's their best shot. It'd have to be as good as DLSS 2.x, alongside more affordable prices, to make an impact. Otherwise, people will stick with Nvidia because... it already has the best upscaler, the best RT performance, and significant mindshare. 


2 hours ago, Slaughtahouse said:

 

… I just believe that with channels like DF going hard on FSR, video after video, and the quality really not being there, people have taken notice, and it's one of the reasons they continue to purchase Nvidia. …

Yeah, Digital Foundry is pretty hard on FSR, but I think that's mostly because console games have been really overusing it and pushing it way beyond its limits. DLSS is definitely a significant step up from FSR, but at the higher settings they are both fine. It's only at lower resolutions that the difference really stands out. 
