Posts: 2,596 · Days Won: 65 · Feedback: 0%
Everything posted by UltraMega
-
Ubisoft has had a lot of games that didn't meet general expectations, and the expectations are never that high to begin with. I can't think of any major flops except for Skull and Bones, but there are lots of mini flops to go with it. Some of the more recent ones: Ghost Recon Breakpoint, Far Cry 6, Watch Dogs Legion, Skull and Bones, Star Wars Outlaws.
-
What are your specs?
-
Just some added context to my meaningless Ubisoft rant: Ubisoft Released New Official Survey for the Next Ghost Recon! Just further evidence that Ubisoft might actually be trying to make a good game in the future. Ghost Recon Wildlands was in dev for 5 years before launch. The following game had a larger team, but was only in dev for ~18 months before launch. Here's hoping the next game gets both plenty of time and a full dev team. I'm not a particular fan of Ubisoft; I think they get more hate than they deserve sometimes, but they have clear and obvious faults. I do think their ups and downs are interesting to follow, though. Ubisoft always seems to be well-intentioned but still misses the mark in ways that are just always so... Ubisoft.
-
That's cool. I thought I was the only one here gaming in 4K. Do you guys ever get close to your refresh rate in demanding games at 4K?
-
How long have you been on 4K? Do you use a TV or a monitor? What's the refresh rate? What GPU and CPU do you have now?
-
AMD Ryzen 7 9800X3D Achieves 6.9GHz Overclock and 1480 FPS Performance in Counter-Strike 2
-
I'd imagine if the benchmarks were fully GPU bound and the chart was flat, 1% lows would have to be pretty similar. 1% lows can highlight larger differences between two CPUs that otherwise look very close, but in pure 4K GPU-bound terms, if the average chart is flat then the 1% lows should be pretty flat too. That was also before we were firmly in the DX12/UE5/shader-compilation-stutter era, though.
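To make the "1% lows" idea concrete, here's a minimal sketch in Python, with made-up frame-time data, of one common way they're computed. Conventions vary between reviewers (some average the slowest 1% of frames, some convert the 99th-percentile frame time), so treat this as an illustration, not how any particular outlet does it:
```python
import numpy as np

# Hypothetical frame times in milliseconds from a benchmark run
# (purely illustrative data, not from any real test).
rng = np.random.default_rng(0)
frame_times_ms = np.clip(rng.normal(loc=8.3, scale=1.2, size=10_000), 4.0, None)

avg_fps = 1000.0 / frame_times_ms.mean()

# One common convention: take the 99th-percentile frame time
# (the boundary of the slowest 1% of frames) and convert it to FPS.
p99_frame_time = np.percentile(frame_times_ms, 99)
one_percent_low_fps = 1000.0 / p99_frame_time

print(f"Average FPS: {avg_fps:.1f}")
print(f"1% low FPS:  {one_percent_low_fps:.1f}")
```
The point of the metric: two CPUs can produce identical averages while one stutters, and the stutter only shows up in the tail of the frame-time distribution.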
-
There were a couple of years where 4K benchmarks would just be literally flat, with no differences at all for just about any modern (at the time) CPU. There might be some very minor differences with a 4080 or 7900-series GPU, but beyond that it would still be flat. Of course, that's for demanding titles. If you're talking about games like CS, that's a different beast altogether.
-
Yes, definitely agree Guru3D is not the best, but it is very simple and to the point. When I post news like this it's often just a discussion starting point. I figure by the time people visit EHW, they have probably already seen reviews, and these threads can be more of a discussion area for all the info out there. Still, these 4K charts are only interesting with super high-end GPUs. For anything else, there are no meaningful differences to show.
-
"Second star to the right, and straight on till morning."
UltraMega replied to ENTERPRISE's topic in Announcements
If you don't find a seller sooner rather than later, have you thought about trying to auction it off or something like that? Even if it didn't go for the amount you'd hoped, it would be better than nothing, right?
-
If it were anything other than a 4090, the 4K charts would just be flat. I think Guru3D actually stopped doing 4K benchmarks for a few years, at least for most games, because there was just no point. I wish they would go even further back on the 4K charts and show some older CPUs.
-
https://www.guru3d.com/review/review-ryzen-7-9800x3d-processor/page-29/#final-words
-
Great track for Rocket League/Hype:
-
Bump/reminder
-
This is an op-ed about some of the recent struggles Ubisoft has faced.

Context summary: Ubisoft is getting a lot of flak lately, but I think some of their recent moves are actually positive for the company overall. I don't think anyone here really cares one way or the other, but I see a lot of news about this that isn't being properly represented, so I just wanted to put some logic out there.

Ubisoft is dropping dead weight and freeing up resources. Ubi has been known to enter strange development agreements, and I suspect that is why they just released an NFT game. Not because they think NFTs are still viable, as the news about this suggests, but because they probably had a contractual agreement to deliver the game. Ubisoft did have such an agreement for "Skull and Bones", and when they finally got it out, it rid Ubisoft of that obligation and logically also freed up some resources. I suspect the NFT game is the same: getting a contract out the door to free up company resources.

Ubisoft has some games that have been in development for years that we have not seen yet. They are taking their time to bake something good, or at the very least they are taking the right approach. Ubisoft should be in a good position now. They have an extremely capable game engine and a lot of newly freed-up resources to take advantage of it. I think Ubisoft's stock will recover in 1-3 years, and it may gain significantly if they are able to deliver a true next-gen single-player game.
-
Ghostwire Tokyo is free on Epic right now. I played this game and it was pretty weird, but cool. It has some good RT effects and supports all the upscalers. Ghostwire: Tokyo | Download and Buy Today - Epic Games Store
-
The gains are getting smaller and smaller, and further apart. Even if smaller nodes scale well, we're still almost at the end of the road. We're at 3nm right now, right? We can't get any smaller than 1nm, and even if we can, it will be just barely. Once we get there, that's it. The only gains left will be architecture improvements, until and unless we find some new ways to make chips.
-
I'm just going to take this as an opportunity to go on a mini-rant: we're at the point where Moore's Law is dead. I think Moore's Law actually died at 16nm. It's my understanding that there are parts of a CPU that just cannot get smaller than 16nm, whereas others can; individual parts of a CPU have different limits to how small they can be. While we're not quite at the limit for how small transistors can be, we're close, and we are at the limits for a lot of other parts of modern chips. Ever since we started hitting these limits, there is less and less to gain and less and less to be excited about with new hardware. That's why efficiency gains are a bigger deal now than ever before: partly because increasing efficiency is one way to reduce heat and thus push chips harder, and also because there are no big wins left for raw performance.

But then there are NPUs. I honestly think Nvidia did not have a plan to create DLSS when it launched the first NPU/tensor-enabled graphics cards. I don't know this to be true and it's pure speculation, but following the development of this stuff as it went on, I felt like DLSS was Nvidia trying to create something to justify the extra cost of the tensor cores. I think Nvidia knew they would be useful for something, and they wanted to create a foundation of hardware that could support a market for things like LLMs and all the other stuff NPUs can do. Nvidia was lucky that DLSS worked out so well, much better than I think they had any idea of when they first released the 2000 series.

I think NPUs are a very interesting future direction for hardware, much more so in relation to gaming than anything else going on at the moment. We're still in the early days of what NPUs can bring to the table for 3D rendering. NPUs can do a lot to speed up the actual rendering process, and so far we are just using them for image enhancement, so it's basically a post-processing effect in a lot of ways (though not entirely). Nvidia was basically able to get an 8x multiplier of rendering power, in the sense that DLSS 3 with frame generation can render 1 out of every 8 pixels you actually see and get roughly comparable quality to native rendering, though with some obvious trade-offs. I don't think most people realize how impressive that is right now, because the effects of Moore's Law ending make it feel like these are just normal gains, but they're actually much more fascinating gains based on new hardware that is still really early yet extremely capable.
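The 8x figure above works out if you assume DLSS performance-mode upscaling (half resolution per axis, so a quarter of the pixels) combined with frame generation (every other displayed frame is generated). A quick back-of-the-envelope check in Python, under those assumptions:
```python
# Back-of-the-envelope check of the "1 out of every 8 pixels" claim,
# assuming DLSS performance mode (half resolution per axis) plus
# frame generation (every other displayed frame is generated).

display_w, display_h = 3840, 2160                    # 4K output
render_w, render_h = display_w // 2, display_h // 2  # internal render resolution

upscale_factor = (display_w * display_h) / (render_w * render_h)  # 4x pixels
frame_gen_factor = 2          # one rendered frame per two displayed frames

pixels_shown_per_rendered = upscale_factor * frame_gen_factor
print(f"Displayed pixels per traditionally rendered pixel: {pixels_shown_per_rendered:.0f}")
# -> 8: the GPU shades roughly 1 in 8 of the pixels you actually see
```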
-
The reticle limit gets smaller in correlation with the node size, so it's smaller for current chips than it was with older ones. It's definitely true that the reticle limit is not the only factor, but it is one of many compounding factors that are forcing chip makers to go in the chiplet direction. IIRC, one of the bigger factors is power leakage, which gets worse the larger the die is.
-
For reasons I don't fully remember off the top of my head, nor do I really fully understand, there are limits to how large a single die can be on any given node. The only way around certain limitations is to do chiplets. AMD does homogeneous chiplets; Intel is doing heterogeneous chiplets. I think both are reasonable approaches, but Intel's approach is surely not going to play as nicely with certain software. It makes sense to me that E cores are essentially the replacement for hyper-threading, and just like some apps work better with hyper-threading off, some apps may work better with E cores off. I do agree that it would be nice to see one or two P-core-only chips from Intel, with or without HT.

As for the AI stuff, I do think it's actually really important for the hardware industry overall. Regardless of how any of us feel about stuff like ChatGPT or AI, the capabilities an NPU brings to the table across the board are potentially pretty huge, and normalizing that across hardware platforms is a good thing in general, so that the software advantages can then follow as we figure out more and more ways to use NPUs effectively. It's definitely not a fad or a trend that is going to go away. Even if "AI" goes through some ups and downs in terms of how receptive people are to it, NPUs are only going to get better and more prevalent, simply because there are already a lot of common tasks they can do better than a CPU or GPU. The hardware is still pretty early on that front in some areas, and I think Intel ought to have not even bothered with this gen. The NPU they tacked onto these CPUs is so weak they might as well have skipped it. It's not even a tenth of what AMD has on their MX300 CPUs. AMD meets the Windows NPU requirement; Intel doesn't even come close.

You are probably right that there is a supply shortage, given the CPU just launched. I think it's still true that demand is low; supply is just also low at the moment.

Personally, I find I get better answers/information from Copilot vs. doing a search in Google/Bing for the same info. You just have to keep in mind AI can be wrong, just like anything else online can be wrong. You still have to verify the info, but the process is ten times faster (and without any ads) than trying to sort through a bunch of different search results yourself until you find everything you're looking for, like some sort of primitive caveman!
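One of the die-size limits gestured at above is yield. Here's a minimal sketch, using the textbook Poisson yield model with made-up numbers (the defect density and die sizes are assumptions for illustration, not figures for any real node), of why chip makers are pushed toward chiplets:
```python
import math

# Poisson yield model: Y = exp(-A * D). The defect density below is an
# assumed, illustrative value, not a published figure for any node.
DEFECT_DENSITY = 0.1  # defects per cm^2 (assumption)

def die_yield(area_mm2: float) -> float:
    """Fraction of dies with zero defects under a Poisson model."""
    return math.exp(-(area_mm2 / 100.0) * DEFECT_DENSITY)

# A hypothetical 600 mm^2 monolithic die vs. four 150 mm^2 chiplets.
mono_yield = die_yield(600.0)   # ~55%: one defect scraps all 600 mm^2
chip_yield = die_yield(150.0)   # ~86%: one defect scraps only 150 mm^2

# Silicon spent per good product (ignoring packaging/interconnect cost):
silicon_mono = 600.0 / mono_yield          # ~1093 mm^2
silicon_chip = 4 * (150.0 / chip_yield)    # ~697 mm^2

print(f"Monolithic yield: {mono_yield:.1%}, silicon/product: {silicon_mono:.0f} mm^2")
print(f"Chiplet yield:    {chip_yield:.1%}, silicon/product: {silicon_chip:.0f} mm^2")
```
The intuition: a single defect wastes a whole 600 mm^2 die in the monolithic case, but only a 150 mm^2 chiplet in the tiled case, so less good silicon is thrown away per shipped product, and that gap widens as dies approach the reticle limit.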
-
All chip makers are moving to tiles/chiplets. It's not because they want to; it's because it's literally a requirement. The laws of physics do not allow monolithic chips to work past a certain point, and we are at that point now. AMD is ahead because they embraced this absolutely necessary step forward sooner than Intel did, although Intel does already have tiled chips in the server market, so this isn't really a totally new thing for them. According to the article, this has nothing to do with low supply, only low demand.
-
The title is misleading. They have literally sold some, but so few that it's "practically nothing".
-
It's happening: https://www.guru3d.com/story/mindfactory-reports-zero-sales-for-intel-arrow-lake-core-ultra-200s-cpus/
-
https://www.guru3d.com/story/amd-officially-announces-nextgeneration-amd-ryzen-7-9800x3d-at/ $479