
J7SC_Orion (Members) - Posts: 2,209 - Days Won: 94 - Feedback: 0%
Everything posted by J7SC_Orion

  1. "30 fps", "and may be crazy or getting old idk" ...could be that you are, but I'll also get there soon enough. CP 2077 was designed with the 'Bladerunner' look in mind, not to be photo-realistic. I also don't see the handlebar 'problem', noting that these pics were taken with both HDR and RTX Psycho on which makes getting good stills a bit tougher. By comparison, the road in the vid you posted above looks a blurry mess, the plants are jaggy, as are some of the lines of the Range Rover and the houses. That said, when I feel like photorealism, I play > this which I got last year...Unreal 4 engine. From the demos I have seen about Unreal 5, we're in for a treat !
  2. ...at what resolution are you playing? At 4K ultra / everything maxed, I find it super-crisp, at least on my NV cards (AMD not quite). Still, if this improves it even more, I am going to try it. Also hoping to get some decent titles with the Unreal 5 engine soon... Unreal 4 is already quite something.
  3. Nice - though exceeding the speed of sound in a 737 may not go down well with the FAA & Co; busted! ...yeah, MSFS 2020 has been getting a lot of patches and also additional 3rd-party marketplace content lately. That's good to see, as they're still investing in it. The DLSS3/FI/NVReflex RTX4K update made me very happy, considering how MSFS 2020 started out - always full of promise, but the early iterations had some 'coding issues'. Some of those are still lurking under the surface, but have far less impact. Even the DLSS2 / RTX3K secondary setup I sometimes use is a lot more playable, IMO. Below, the 'weird' Halo Pelican leaving Lukla for Mt. Everest... some Pitts stomach-twisting over our place... and a quick jaunt over to Iceland for some night flying...
  4. I saw that at another news site yesterday and was going to post it here, but then went down some related rabbit holes on YouTube...
  5. ...your front yard looks a bit warmer though
  6. ...I like roaming around in winter weather, doing an hour or so in the forest and taking some pics if the conditions are right...
  7. Point well taken; volume at the highest margin is what an economist would recommend to maximize profits (not revenue). In any case, as important as we might think we are as gamers and 'fan-boys-and-girls', the real money for the big GPU manufacturers is in the workstation and enterprise markets. With the huge growth of AI, (primarily) NVidia has been sitting pretty. It is also known to ride roughshod over some of its board partners by keeping them on a short leash re. designs and cost-plus pricing (see 'EVGA'; compare that to Palit, which owns all kinds of brands including Galax and is the largest GPU vendor). AMD and Intel are desperately trying to catch up to NVidia in the lucrative workstation and enterprise markets with their latest multi-tile offerings. Still, for now, NVidia rules with the Ada Lovelace and Hopper designs, and well-developed related software with years of a head start. Some permutations and combinations from those will hit the 'high-end' consumer market... see other threads about a potential 4090 Ti/Titan with 48 GB of VRAM; if/when it hits, it will be priced between the 4090 and the RTX 6000 Ada... from that perspective, the US$ 1,619 I paid for my 4090 is outright 'cheap'. Put differently, those are not upgrades of GPU chips from the gamer market, but downgrades from the workstation and enterprise markets... Who knows what the future will bring? Multi-tile HD GPUs with tiles disabled for consumer models? How about paying a subscription fee to unlock some tiles on the GPU you already own? It boggles the mind... DLSS3 for the RTX4K series was also easier for NVidia, with its massive AI prowess, than for its competitors. All I can say is that the 4090 with its 76.3 billion transistors is a monster compared to the 3090 at 28.3 billion and a 2080 Ti at 18.6 billion. DLSS3 / Frame Insertion / NVReflex is hugely impressive from my vantage point re. not only speed in fps and frame-times, but also visual quality... I use a C1 48 OLED 4K120 for comparison, and at that size and resolution, imperfections would show up more dramatically. A final point concerns the sheer gain in efficiency. Below are screenies from a workstation bench program called Octanebench, and while the oc'ed 3090 + 2x 2080 Ti setup just nudges ahead of a single 4090, I could try to oc the 4090 a bit more - they are certainly close. HOWEVER, the 4090 used around 580 W for that while the 3-GPU combo was at about 1,260 W (see the quick arithmetic sketch after this list)... at the end of the day, DLSS3/FI/NVR is an incredible technology; I just wish it would also be available on my 3090 Strix, at least partially, given some additional hardware requirements with DLSS3. At least we can hope that some of it is used to improve DLSS2 some more.
  8. After experiencing DLSS3, Frame Insertion and NVidia Reflex on my RTX 4090 in MSFS 2020 as a flawless suite of tools that rejuvenated the 5950X setup - with 4K ultra settings hitting the 120 fps limit of the 4K120 OLED in Flight Simulator - my expectations were already raised re. what the long-promised DLSS3/FI/NVRe update would do for Cyberpunk 2077... answer: a heck of a lot. With virtually no stuttering or visual oddness, the same CPU/mobo/4090 oc setup went from an already decent 90 fps in the CP2077 internal benchmark (DLSS2, 'Quality') to over 130 fps (min fps are even more impressive) in DLSS3, 'Quality' - see the quick uplift sketch after this list. I didn't even oc to what the card is capable of because with the C1 OLED at max 120, it makes no difference... and I didn't think I would be able to exceed the 4K120 level so soon. FYI, the same system with the same settings managed a respectable 45-50 fps (DLSS2, 'Quality') with an overclocked 3090 Strix at close to 2200 MHz. That card is now hooked up to an older IPS HDR panel at the other end of our place, and that is a happy combo - be it for work or play - in its own right. For certain work tasks, the 3090 is paired with 2x 2080 Ti (all water-cooled). Still, I recall the early days of DLSS 1 - leaving much to be desired on the visuals. DLSS 3 / Frame Insertion / NV Reflex really are game changers, IMO, in the visual fidelity department as well. The irony is that the RTX4K series, especially the 4090, is probably the line that least required such boosts, though apparently there is also some extra hardware trickery involved on Ada Lovelace compared to Ampere.
  9. ...I initially thought it was some leftover Christmas fruitcake
  10. For my use cases, dual ethernet is a must - I only recently 'returned' from exclusively building HEDT with an eye on work/productivity apps (the HEDT market is sort of dead - for now). But yeah, pricing has gone 'silly'. My Crosshair X570 Dark Hero currently costs C$ 120 more than what I paid for it at the same retailer. I wonder what Gigabyte will charge for the new B650E Tachyon - it competes with the X670E Crosshair Gene but should be a bit more affordable. It has some nifty features re. DDR5 oc'ing and VRMs.
  11. ...well, caution is advisable because I like pizza with pineapple, anchovies and Canadian back bacon ...I am far from having done any serious research on X670E, but for performance-benching-gaming only, I fancy the Asus Crosshair Gene, or something equivalent from MSI or Gigabyte. That said, I am more likely to go with a semi-productivity mobo with decent oc capabilities. Like my current X570 Dark Hero, it should also come with dual ethernet - something like the Asus X670E ProArt, maybe, which has 10 Gb and 2.5 Gb ethernet, DynamicOC and a few other tricks and bits.
  12. Both Intel and AMD released their financial results recently. Nothing to write home about for either... in fact, rather bad news on many fronts for both. I picked YouTube / Coretek's summaries below - Coretek is an acquired taste, much like certain kinds of strong European cheese, but he does make some good points, especially about some accounting gymnastics at Intel, and also their share dividend apparently being funded by additional debt. I will update this thread when NVidia releases financial results, though NVidia's position is expected to be stronger, as they are well positioned in AI-related hardware (and they got my 4090 money )
  13. On paper, asymmetric cores such as P + E make sense, but in practice there will always be scheduler issues with some apps, given all the legacy baggage Windows carries. I prefer symmetric setups because of that. Also, I use systems for both work and play, and the 7950X3D makes huge sense to me at the price (compared to, for example, a 7/5995 Threadripper Pro). For gaming and some benchies, I would simply disable one or the other CCD in the 7950X3D and thus end up with a symmetric setup if needed (see the affinity sketch after this list for a software-side take on the same idea).
  14. For me, the choice is a bit easier since I skipped LGA1700 altogether (and so far, AM5). It is just too late to get into LGA1700 at this point, and either way, I will get into DDR5 with the next upgrade. All that said, I first want to see lots of reviews on how the OS schedulers deal with the 7950X3D and 7900X3D. Also, I am still thrilled with my 5950X AM4 at 4K, so there is a bit of extra time to make an informed choice on a new mobo, DDR5, etc.
  15. The 7950X3D makes more sense to me, as you can effectively turn it into a 7800X3D with a simple BIOS setting...
  16. Noice! I love the 'older' stuff... and for K/Ubuntu, it really doesn't matter anyway.
  17. ...time to check your credit card balance! The Ryzen 7950X3D is priced at US$ 699 with a release date of February 28th... great price, at least before you factor in the AM5 / X670 / DDR5 upgrade costs. I am still going to wait for 3rd-party reviews (regarding the asymmetric CCDs, the Windows scheduler and all that jazz), but if/when I go for it, it will be with two sticks of the fastest 32 GB-per-stick DDR5 I can find, hopefully 7200+... I know about Ryzen IF limitations (see the clock-domain sketch after this list), but I generally only buy one type of DDR5 'family' to allow for mix-and-match interchangeability between work and play machines, be that Intel and/or AMD... never owned a set of GSkill AMD-specific 'Neo', for example, and never will... @Bastiaan_NL @bonami2 tempted?
  18. ...psst, wanna load a >1000 W XOC BIOS on your Inno3D with no safeties enabled?
  19. ...and it gets worse and worse before it gets better... better seek medical assistance now
  20. ...very nice! That said, I am not so sure about the PCB's power-handling capabilities, and even the side water-cooling connections will take up additional space lengthwise... it probably depends on the case used, etc., whether that is an advantage or disadvantage. Even a 'regular' 4090 such as the Gigabyte G-OC below with a Bykski block is only 233.1 mm long (versus 340 mm with its massive stock air-cooler), but it comes out in the wash re. side vs top water-cooling connections. Bonus with regular models: a full 600 W at stock (more with a custom vbios, as the 20+4 PCB can handle it). FYI - Alphacool also makes a block for the Gigabyte; it is 233.3 mm long, but @bonami2 will be happy to see those ultra-tiny Alphacool blocks - in quadrophonic splendor, no less... I just don't get why they don't use 8 sticks of RAM on an 8-channel CPU and mobo.
  21. ...Cyberpunk 2077 finally got DLSS3/Frame Insertion/NVReflex... the 4K120 OLED monitor met its match ...and MSFS 2020 also had a new patch (mostly around reflections and ships / yachts)...
  22. Another MSFS 2020 patch came out... it seems to have improved ray reflections. In addition, there are a lot of new details for seafarers, such as new super- and mega-yachts (in case you're in the market for one )
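Re. the efficiency point in post #7: a quick back-of-the-envelope sketch in Python, using only the wattage figures quoted there and assuming (as the post describes) that the Octanebench scores of the two setups are effectively tied:

```python
# Performance-per-watt comparison, assuming roughly equal Octanebench
# throughput for both setups (power figures quoted in post #7).
power_4090 = 580     # W, single oc'ed RTX 4090
power_trio = 1260    # W, 3090 + 2x 2080 Ti combo

ratio = power_trio / power_4090
print(f"3-GPU combo draws ~{ratio:.1f}x the power for similar throughput")
# -> ~2.2x, i.e. the single 4090 delivers roughly double the perf per watt
```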
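Likewise, the CP2077 benchmark numbers in post #8 work out as below. Taking the 45-50 fps range at its midpoint and treating 'over 130 fps' as 130 are both assumptions, made only to put rough multipliers on the quoted results:

```python
# DLSS2 -> DLSS3 uplift in the CP2077 internal benchmark (post #8 numbers)
fps_4090_dlss2 = 90     # 4K ultra, DLSS2 'Quality', RTX 4090
fps_4090_dlss3 = 130    # same settings, DLSS3 'Quality' (reported as 130+)
fps_3090_dlss2 = 47.5   # midpoint of the 45-50 fps quoted for the 3090 Strix

print(f"4090, DLSS3 vs DLSS2: +{(fps_4090_dlss3 / fps_4090_dlss2 - 1) * 100:.0f}%")
print(f"4090 DLSS3 vs 3090 DLSS2: {fps_4090_dlss3 / fps_3090_dlss2:.1f}x")
# -> roughly +44% from the DLSS3 update alone, and ~2.7x over the 3090
```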
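On the CCD-disabling idea in post #13: the post means a BIOS setting, but purely as an illustration, here is a software-side sketch of the same idea using Python's psutil to pin a process to one CCD. The logical-CPU layout (CPUs 0-15 on CCD0 with SMT) is an assumption and varies by topology, so check the real mapping first:

```python
# Hypothetical sketch: restrict a process to CCD0 of a 16-core part,
# approximating the "disable one CCD" approach without a BIOS change.
# Assumes logical CPUs 0-15 map to CCD0's 8 cores with SMT enabled --
# verify the actual topology (lscpu / Task Manager) before relying on this.
import psutil

ccd0_cpus = list(range(min(16, psutil.cpu_count())))  # assumed CCD0 CPUs

proc = psutil.Process()       # current process as a stand-in; pass the
                              # game's PID here in practice
proc.cpu_affinity(ccd0_cpus)  # scheduler may now only use these CPUs
print(f"Affinity now: {proc.cpu_affinity()}")
```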
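And on the Ryzen IF limitation mentioned in post #17: a small sketch of the clock domains involved, assuming the usual Zen 4 conventions (MCLK is half the DDR5 transfer rate; a 1:1 UCLK:MCLK ratio rarely holds much beyond DDR5-6000, so a 7200+ kit would typically run 1:2, with FCLK decoupled at around 2000 MHz):

```python
# Zen 4 clock-domain arithmetic for a hypothetical DDR5-7200 kit.
ddr5_rate = 7200        # MT/s
mclk = ddr5_rate / 2    # 3600 MHz memory clock
uclk_1to1 = mclk        # 3600 MHz -- unlikely to be stable on Zen 4
uclk_1to2 = mclk / 2    # 1800 MHz -- the realistic setting for 7200+
fclk = 2000             # MHz, typical decoupled Infinity Fabric clock

print(f"MCLK {mclk:.0f} / UCLK {uclk_1to2:.0f} (1:2) / FCLK {fclk} MHz")
```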