

Everything posted by mouacyk

  1. It can and it will. There are just extraordinary circumstances preventing the market from following MSRPs for longer than usual. The 2080 Ti's $1,200 MSRP versus its performance equal, the $500 3070, shows that NVidia is willing to price according to the cost of production. Unlike scalpers, who care little what you think of them, companies like NVidia and AMD do have a reputation to keep. Once production can ramp up to meet demand, prices will normalize back to around MSRP. Things are happening that should allow that, sooner or later. I would resist feeding scalpers altogether if value is a big deal; otherwise they only get encouraged even more in good times.
  2. I stopped waiting as well. It finally dawned on me that there is no performance gap between the 3080 and 3090 for the TI to fill. Unless you have a specific need for 20GB or 24GB VRAM, the 3080 is as good as it gets for Ampere, for all things gaming. As always, no to scalpers though -- find some other means.
  3. I do wonder if the card was damaged by that initial boot without any cooling. It's easy enough to strap any heatsink onto the die. With all the effort invested, it was foolish to drop caution there.
  4. I like to leak test complex components like GPU and CPU blocks by themselves with an air pressure test kit. Otherwise it's harder and messier to identify the leaking part without actually seeing the leak.
  5. The root issue here is a quid pro quo. I think the reporting and reactions are overblown and sensationalized. A company takes a risk on leading-edge tech, and it is sending out free samples to ask for the favor of dedicated coverage. I'm not seeing anything malicious in this request, such as requiring the reviewer to manipulate data, exclude relevant data (raster results), or outright exclude competitor exposure. Unless I missed some detail, it seems HWUB interpreted the directive as #2, and correctly weighed its audience against NVidia. But that interpretation is badly stretched. I get that NVidia and AMD are no one's friend, but I also only see one of them taking leading-edge risks. As one customer, I'd like to see what the rewards are.
  6. Wow, wingman looks like it's straight out of BF3 flight missions.
  7. The Phoronix results were very confusing to me too as a Linux user. It's more likely they haven't completely optimized the Pro driver.
  8. 320-bit is a step down; hopefully it comes out as 352-bit with 11GB or 22GB (see the capacity sketch after this list). At the high end, you might as well pay for the best RTX performance and the most VRAM. The choice between this and the 6900 XT will be hard.
  9. Haven't seen the term interleave mentioned yet, so I will. This used to be an old BIOS feature that could be enabled for multiple DIMM modules. It's basically RAID0 for RAM (see the address-mapping sketch after this list). With channels populated optimally, perhaps Ryzen is automatically enabling it, or some form of striping?
  10. There's a potential cross-CCD issue with the 5900X. https://www.igorslab.de/en/amd-ryzen-9-5900x-und-ryzen-5600x-im-ersten-test-wird-intels-10-generation-jetzt-obsolet-2/6/
  11. Something's wrong with me when I see 2080 Tis posted on second-hand markets for $700-800 and feel like that's a deal. Damn these scalpers for holding up the Ampere supply. @Sir Beregond At the end of the day, it still feels like $500 is too much for this tier. NVidia really did move the stack prices up on all of us, even for AMD.
  12. Not too far from this, just a few more cables:
  13. Can't wait to see the transformation. I've been going SFF for my primary desktop, but I do have a server to overhaul someday.
  14. Rasterization is far from dead, so AMD choosing to invest performance here is wise and welcome for a few more generations. Despite NVidia's push, ray tracing just isn't powerful enough for competitive play yet, so the compromise is to use it purposefully to enhance immersion in good single-player games, especially walking simulators.
  15. @J7SC_Orion Thanks, that helped, but the rabbit hole still leads nowhere yet: https://hexus.net/tech/news/graphics/145957-amd-infinity-cache-patent-hints-rdna-2-secret-sauce/ I guess, given the success of GameCache on Ryzen, it's only reasonable that AMD would try to replicate it on GPUs too. Caching is great if you love running around the same level doing the same thing over and over again; it's not so great when you want dynamism, like a streaming open world with corresponding dynamic events (see the cache hit-rate sketch after this list). I just have a feeling it won't work so well for scientific data, which in its raw form is usually a unique series. The initial loading and processing will be slower, but any re-processing will be much faster. Hopefully it all ends up being a wash, like Ryzen.
  16. Does anyone know if AMD is planning to go beyond 256 bits with a bigger die for the compute community? It was something, back in the day, to see AMD create 512-bit behemoth gaming dies. It's still amazing they managed to match Ampere with 66% of the bus width by making it up in clocks and pipeline efficiencies (see the bandwidth arithmetic after this list).
  17. I also read that the realism in this game is amazing. A guy on hardforums looked out of his computer room window and saw the Boeing 747 he was piloting hanging above his house. It was unlikely, though, that he saw himself looking up at the plane -- too far away.
  18. Not many 256-bit GPUs have been pushed beyond 300W. Clocks, performance and power consumption leaks are indicating that AMD is making the best use of their node advantage over NVidia. Why produce a complicated GPU when you can push more cycles through a simpler one? Complete slap in the face to HBM.
  19. I've covered Fast-Sync and G-Sync before at OCN, but would like to point out this special little gem for use with blur reduction and/or displays without VRR capability. ELMB-Sync need not apply. Source: https://medium.com/@blurbusters/another-new-mode-that-blur-busters-helped-advise-guru3d-in-adding-to-rtss-is-the-new-rtss-scanline-3d3a18d88372 There are many guides, how-tos, and technical explanations on Blur Busters. It takes a little manual calibration, but essentially you can force the tearline off the visible screen and still maintain input latency similar to VSync-Off without any visible tearing (see the sketch after this list). One small caveat: GPU utilization of <=80% is required to maintain consistent behavior. Otherwise, this is great for 60fps emulators, PC gaming on a TV, and ULMB. I will be trying this out in BF4 and BF1 on my Acer Predator XB271HU with ULMB. I have always preferred its motion clarity over G-Sync, but sometimes the tearline just sits right in the middle of the display, making it quite the eyesore. Now, with S-Sync, it seems possible to get ULMB with V-Sync minus the V-Sync lag.
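
A quick arithmetic note on post 8: with GDDR6/GDDR6X, each 32-bit slice of the bus carries one memory chip, so bus width directly determines the capacity options. A minimal sketch of that relationship (the 1 GB / 2 GB chip densities are the common ones, assumed here for illustration):

```python
# Each 32-bit GDDR6/GDDR6X channel carries one memory chip; common chip
# densities are 8 Gb (1 GB) and 16 Gb (2 GB), hence two capacity options.
def vram_options(bus_width_bits: int) -> tuple[int, int]:
    channels = bus_width_bits // 32
    return channels * 1, channels * 2   # GB with 1 GB chips, GB with 2 GB chips

for bus in (256, 320, 352, 384):
    low, high = vram_options(bus)
    print(f"{bus}-bit -> {low} GB or {high} GB")
# 320-bit gives 10/20 GB; 352-bit gives the hoped-for 11/22 GB.
```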
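
On the interleave comment in post 9: channel interleaving stripes consecutive chunks of the physical address space across memory channels, so sequential accesses are serviced by both channels in parallel, which is why the RAID0 analogy fits. A toy sketch of the mapping (the 64-byte stripe size and dual-channel layout are assumptions for illustration, not the actual Ryzen policy):

```python
STRIPE = 64      # assumed stripe size in bytes (one cache line)
CHANNELS = 2     # assumed dual-channel configuration

def channel_for(address: int) -> int:
    """Return which memory channel services a given physical address."""
    return (address // STRIPE) % CHANNELS

# Consecutive 64-byte blocks alternate channels, RAID0-style.
for addr in range(0, 512, 64):
    print(f"address 0x{addr:04x} -> channel {channel_for(addr)}")
```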
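
To make the locality argument in post 15 concrete, here is a toy LRU cache fed two access patterns: one that keeps revisiting the same working set (running the same level over and over) and one that streams unique data (a raw scientific series). The cache size and patterns are invented purely for illustration:

```python
from collections import OrderedDict

def hit_rate(accesses, capacity):
    """Simulate a simple LRU cache and return its hit rate."""
    cache, hits = OrderedDict(), 0
    for key in accesses:
        if key in cache:
            hits += 1
            cache.move_to_end(key)          # refresh recency
        else:
            cache[key] = None
            if len(cache) > capacity:
                cache.popitem(last=False)   # evict least recently used
    return hits / len(accesses)

CAPACITY  = 128                                # assumed cache size in "lines"
replay    = [i % 100 for i in range(10_000)]   # same level, over and over
streaming = list(range(10_000))                # unique, never-repeated data

print(f"replay hit rate:    {hit_rate(replay, CAPACITY):.0%}")    # ~99%
print(f"streaming hit rate: {hit_rate(streaming, CAPACITY):.0%}") # 0%
```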
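
For the "66% of the bus width" remark in post 16, raw bandwidth is just pins times the per-pin data rate. A back-of-the-envelope comparison, assuming the commonly reported 16 Gbps GDDR6 on the 6900 XT and 19.5 Gbps GDDR6X on the 3090 (clocks and Infinity Cache are what make up the difference):

```python
def bandwidth_gb_s(bus_width_bits: int, rate_gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s: pins * per-pin rate / 8 bits per byte."""
    return bus_width_bits * rate_gbps_per_pin / 8

rx_6900xt = bandwidth_gb_s(256, 16.0)   # ~512 GB/s
rtx_3090  = bandwidth_gb_s(384, 19.5)   # ~936 GB/s
print(f"6900 XT: {rx_6900xt:.0f} GB/s vs 3090: {rtx_3090:.0f} GB/s "
      f"({rx_6900xt / rtx_3090:.0%} of the raw bandwidth)")
```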
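
On the Scanline Sync post (19): the calibration amounts to steering the tearline into the vertical blanking interval, the scanlines between the active picture and the vertical total, so it never crosses the visible image. A toy sketch of that idea with made-up timing numbers (your monitor's real vertical total comes from its EDID/driver timings):

```python
ACTIVE_LINES   = 1440   # visible scanlines (e.g. a 1440p panel)
VERTICAL_TOTAL = 1500   # active lines + vertical blanking (assumed value)

def tearline_visible(scanline_index: int) -> bool:
    """True if a tearline parked at this scanline would appear on screen."""
    return 0 <= (scanline_index % VERTICAL_TOTAL) < ACTIVE_LINES

print(tearline_visible(720))    # True  -> tear mid-screen, very noticeable
print(tearline_visible(1470))   # False -> hidden in the blanking interval
```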
