
Sir Beregond

Reviewer
  • Posts: 1,879
  • Joined
  • Last visited
  • Days Won: 33
  • Feedback: 100%

Everything posted by Sir Beregond

  1. According to this page on AMD's site, the 16GB of GDDR6 with 128MB Infinity Cache over a 256-bit bus has a theoretical 1664GB/s effective speed. Reviews/benchmarks will tell... Overall it looks pretty good, but I think it is still overpriced. I would have preferred to see the 6800 XT at $599 and the 6800 at $499, but that's just AMD working off of the market price points Nvidia created...
  2. Totally understand that different use cases will necessitate different cards being better. Makes sense. Anyway, the other bit that leaves a bad taste in my mouth is the $500 price tag (so AIB cards will be more). And it's an xx70-tier card. So I guess the Turing price hikes most of us complained about are fine now?
  3. I'm not impressed. You can't really call this a $500 2080 Ti, as it doesn't beat it everywhere, and spec-wise it falls short in some respects. If you managed to snag a 2080 Ti for around $500, you have more VRAM on a higher-bandwidth bus, and the 3070's DLSS/RT gaming performance is not better than the 2080 Ti's. As far as I am concerned, AMD can really kill this card if any of the leaks are true.
  4. Launch availability won't matter to me as I am not building for a few months, but as far as I am concerned Ampere is a hot mess as an architecture and for being on Samsung "8"nm. Sure, like Fermi before it, it performs, but it's got its problems, and Nvidia is sandbagging on VRAM. If Big Navi meets, is close to, or beats Ampere and is priced well, I will be jumping back over to Radeon for my next build, no problem. I've been on two Nvidia cards since my last Radeon cards. I'm ready. I expect Ampere to get a refresh on 7nm, and that will help. It won't be a magic bullet, but I expect it will alleviate some of the issues they have on "8"nm from a power and yields perspective, and maybe clocks too. I think it's Fermi 1 to Fermi 2 all over again. AdoredTV did an interesting analysis of VRAM going back to Fermi and how we generally saw a doubling for each tier with each new generation, and Ti cards were typically half of a Titan. Then it started to get weird with Pascal, and going into Turing and Ampere it hasn't really progressed; if their marketing is to be believed, it can even be seen as a regression if the 3080 is supposed to be a replacement for the 2080 Ti (Nvidia does keep calling the 3080 their new "flagship", yet continually compares it to the 2080 on charts more than to the 2080 Ti, which was the flagship of last gen). More credence to the idea that the 3090 is a replacement for the 2080 Ti, not the Titan RTX, while carrying a $300 price hike.
Fermi mid-range: 1GB (GTX 460/560)
Fermi high-end: 1.5GB (GTX 480/580)
Kepler mid-range: 2GB (GTX 680)
Kepler high-end: 3GB (GTX 780 Ti)
Kepler Titan: 6GB (OG Titan)
Maxwell mid-range: 4GB (GTX 980)
Maxwell high-end: 6GB (GTX 980 Ti)
Maxwell Titan: 12GB (Titan X)
Pascal mid-range: 8GB (GTX 1080)
Pascal high-end: 11GB (GTX 1080 Ti)
Pascal Titan: 12GB (Titan Xp)
Turing mid-range: 8GB (RTX 2080/2080 Super)
Turing high-end: 11GB (RTX 2080 Ti)
Turing Titan: 24GB (Titan RTX)
Ampere mid-range: 10GB (RTX 3080)
Ampere high-end: 24GB (RTX 3090)
Ampere Titan: there isn't one (48GB if one comes out?)
  5. Never really had any major driver issues from either camp. The only time I had gameplay issues was running CrossFired 5770s back in the day, and I think that's more down to multi-GPU just sucking in general. After that I just adopted the philosophy of buying the best possible single-GPU solution that fit my needs/budget. I will say this: Nvidia should probably update their control panel to not look like Windows XP anymore. Anyway, as long as they don't have the reported issues the 5700 XT drivers seemed to have at launch, I'm ready to jump ship from Nvidia. That said, I am skeptical of the performance at 1440p and 4K on a 256-bit bus and am awaiting the benchmarks and an understanding of the architecture.
  6. That's nice to be able to do. Yeah, hearing that some people have nausea issues with it has always given me pause about outright buying into VR.
  7. Not tried any VR yet. Never really had a way to try it, and the cost of entry, while seemingly coming down, was always expensive.
  8. Just swapping my current PC into a new case and replacing the tubing scratched that itch till I can do a Zen 3 build in a few months.
  9. Yeah, it does feel good to see Intel get their comeuppance (now Nvidia, please). But like you said, it will not benefit the consumer in the long run if it's just a pendulum swinging from one dominant company to the other for years at a time. We need innovation, technology, and performance competition from both camps, which will lead to pricing competition as well; in that case the consumer benefits with better products at better prices.
  10. Hmm, maybe I'll try it. I wasn't so much concerned about the 4790K as much as I am about the GTX 980.
  11. I've heard of this game, but don't know anything about it. I'll have to check it out more in depth.
  12. Been looking forward to this game immensely. Deus Ex: Mankind Divided had its moments, but was otherwise kind of a disappointing follow-up to Human Revolution, and I've been missing a game in that style/genre. Happy to see this one being an open-world game of a similar futuristic genre, and I can't wait to pick it up next year when I build a new rig (I don't like playing at medium or below settings lol). CDPR have really done a fantastic job with their games to date. I'm sure this one will be no exception.
  13. Not sure. A 4790K and a GTX 980 (non-Ti) with a 32" 1440p monitor. This GPU really can't handle newer stuff at 1440p.
  14. True, but in the Fermi example, it was an actual necessary refresh for the 400-series which was a hot mess even though it was winning. This was the most recent example for Nvidia analogous to what I think will probably happen with Ampere.
  15. Agreed. Really looking forward to grabbing this when I get a new system built next year.
  16. I don't get it; two Apple cult-member co-workers of mine were going gaga over it. They are coming from 11s...
  17. Agree. Sad to see hardware following software trends, where buggy and unfinished products get released only to later be refreshed or completed... maybe. Ampere feels like Fermi to me. The GTX 580 came out 8 months after the GTX 480.
  18. Yeah, this does make sense and jibes with earlier statements they have made about Ampere primarily using TSMC 7nm. Guess I'll say it again: RIP early adopters!
  19. Still trying to decide if that rumor is true or not. It would make sense, but it would also mean they would have had to develop it in tandem with the Samsung "8"nm versions; they definitely can't copy/paste those. If it does end up being true, RIP early adopters of Ampere. Curious to see what the GPU landscape looks like around March next year, which is when I was thinking of building a new rig (same case). I will need a GPU too.
  20. Yeah, I wouldn't upgrade that either personally. I'm on an almost 6-year-old PC now, so I'm definitely more in need of an upgrade.
  21. I thought about waiting for Zen 4 on 5nm and DDR5, but new RAM is always expensive when it first comes out. I think I'm OK upgrading to a mature AM4 platform with cheap DDR4 and Zen 3, and then we'll see what things look like in 2-4 years' time.
  22. Yeah, that makes sense. Just one, but my previous case didn't have a place to mount them once I had to remove the drive cage to fit a reservoir/pump unit. In my current case I have it mounted.
  23. If we are talking SATA 2.5" SSDs, I've just let them sit unmounted in the back of the case when I didn't have anywhere to mount them...
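The effective-bandwidth figure quoted in the first post can be sanity-checked with a quick sketch. This assumes a 16 Gbps GDDR6 per-pin data rate and AMD's marketing claim of up to 3.25x effective bandwidth from the 128MB Infinity Cache; neither number appears in the post itself:

```python
# Sanity check of the "1664GB/s effective" claim for a 256-bit GDDR6 card.
# Assumptions (not from the post): 16 Gbps GDDR6 per pin, and AMD's claimed
# "up to 3.25x" effective-bandwidth multiplier for the 128MB Infinity Cache.

def raw_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Raw memory bandwidth in GB/s: (bus width in bits / 8) * per-pin Gbps."""
    return bus_width_bits / 8 * data_rate_gbps

raw = raw_bandwidth_gb_s(256, 16.0)   # 512.0 GB/s over a plain 256-bit bus
effective = raw * 3.25                # 1664.0 GB/s with the claimed cache uplift
print(f"raw: {raw} GB/s, effective: {effective} GB/s")
```

So the quoted 1664GB/s is exactly 3.25x the 512GB/s raw figure; real-world effective bandwidth depends on the Infinity Cache hit rate, which is why reviews and benchmarks have to tell the rest.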