Posts: 2,209 · Days Won: 93 · Feedback: 0%
Everything posted by J7SC_Orion
-
Replacing Desktop With Laptop, Recommendations?
J7SC_Orion replied to ENTERPRISE's topic in Laptops/Tablets & Phones
...a point to remember is that what they call a 4090 in laptops is actually a bit slower than a desktop 4080, and it scales down the GPU stack like that. I would make sure that the laptop has very capable GPU power for connecting to and driving a wall-mounted OLED TV / monitor at 4K/120 to go with that couch for when you are home and relaxing (~vegging out). Dell/Alienware, MSI, Gigabyte, Asus, Acer etc. all have good models around the 17-inch OLED screen size; just bring your wallet...
-
...good for 'floating' in OLED + Atmos space
-
...I might be utterly (get it: utter) wrong, but DNS services might be one place to look when chasing site performance issues...
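For what it's worth, here is a minimal sketch of the kind of check I mean - timing the DNS lookup separately from the TCP connect so you can see whether name resolution is where the time actually goes. The host name is just a placeholder and the Python is purely for illustration:

```python
# Rough diagnostic sketch: split "how long does the site take to respond" into
# DNS-lookup time vs TCP-connect time. HOST is a placeholder, not the real server.
import socket
import time

HOST = "forums.example.com"   # swap in the site you are actually testing
PORT = 443

t0 = time.perf_counter()
infos = socket.getaddrinfo(HOST, PORT, proto=socket.IPPROTO_TCP)
ip = infos[0][4][0]           # first resolved address
t1 = time.perf_counter()

with socket.create_connection((ip, PORT), timeout=5):
    t2 = time.perf_counter()

print(f"DNS lookup : {(t1 - t0) * 1000:.1f} ms")
print(f"TCP connect: {(t2 - t1) * 1000:.1f} ms")
```

If the lookup number is consistently the big one, the resolver / DNS provider is where to dig; if the connect time dominates, DNS is off the hook.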
-
I time-stamped the YT vid below; this is also used for sound-checking home theaters and sound bars. The graphics are neat, too. While I have a sound bar, it is not even connected to the LG 48-inch OLED...for gaming and YT, I sit quite close (~1.5 ft) and as such, the 'native' sound is superb, IMO. Have fun:
-
First carwash since last fall's first snows ...then: ...now: The protective crust of dirt that had built up protected the paint perfectly...
-
...^this is not an isolated incident, IMO. Of the last 14 GPUs I bought for work + play, the majority had badly applied thermal paste and high deltas for hotspot temps. My thermal pantry box is full of various thermal pastes, thermal pads and thermal putty (the latter vacuum-sealed and in the fridge for future builds). Only the 2x Gigabyte 2080 Ti WF factory full waterblock cards (from late 2018) have/had no issues and have never been apart. I water-cool most GPUs anyway which is how I also know about factory thermal paste disasters, even if they are not immediately obvious...My standard GPU cooling-mod approach is: Gelid GC Extreme for GPU die paste; thermal putty for the VRAM plus select VRM and back of die; decent thermal pads for the rest of the VRM; extra big heatsink for the backplates.
-
...yeah, but... ...one can increase the power limit of the AMD, and reduce it on the NVidia (I do both, coz, reasons)...but seriously - NVidia segmenting the market to death with GPUs that have as little as a 128-bit memory bus is getting pretty silly. While AMD could take advantage of the situation and build a really competitive price-performance lead in this US$400 segment, they probably won't. Hopefully, I'm wrong on that.
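On the power-limit bit, a minimal sketch of how the NVidia side can be scripted - this assumes the nvidia-ml-py package ('pynvml'), a driver that exposes the limits, and admin/root rights for the actual set call; the 350 W figure is just an example. The AMD card goes through different tooling (rocm-smi / MorePowerTool and the like), not NVML:

```python
# Sketch: read the allowed power-limit range on the first NVIDIA GPU and
# (optionally) cap it. Values are in milliwatts; setting needs admin/root.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system

min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
current_mw = pynvml.nvmlDeviceGetPowerManagementLimit(handle)
print(f"Power limit: {current_mw / 1000:.0f} W "
      f"(allowed range {min_mw / 1000:.0f}-{max_mw / 1000:.0f} W)")

# Example only - cap the card at 350 W (must sit inside the range printed above):
# pynvml.nvmlDeviceSetPowerManagementLimit(handle, 350_000)

pynvml.nvmlShutdown()
```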
-
...Good idea - and I have enough fans for it and also a good breeze ~30 floors up >>> but the condo strata council & city safety engineers would get too annoying if I hung that outside the windows...
-
...another new eye-candy release and great viewing for OLED. Personal bonus: this one actually has a few shots of our neighborhood in it, including the street we live on (I did a double-take - hey, I know that intersection)...
-
The built-in YouTube app of the LG is a bit of a pain to maneuver around in even with voice commands compared to 'regular desktop YT', but it can eliminate both the PC-GPU and the cable as a potential problem. FYI, I just ran the vid below on the 6900XT switched to the 48-inch OLED, and 4K HDR and 8K HDR both worked well - though the 8K HDR stream is a bit jerkier on the AMD w/ 16 GB VRAM compared to the 4090 w/ 24 GB of VRAM (not due to my connection - same for both, which is 1 Gbps up/down); could be codecs ...still, 4K HDR especially is stunning on both GPUs and the OLED; for 8K streams, I prefer the 4090.
-
...I would love to see some additional upcycling ideas here, re. fans and 'other'...pic on the left is just one of three bags of old fans (OEM, custom, 80 mm - 200 mm), and the fans need to find some gainful employment instead of just taking up space / collecting dust! Pic on the right is an upcycling solution of sorts. This thing with its three GPUs and TR normally gets used for some light ML and as a rendering backup machine. However, we had some window closing / locking issues right at the coldest time mid-winter, so in the evenings, I fired that thing up and ran a bunch of Octane benches which maxed all three GPUs at once; within 5 min, room temp had risen by 3 C (and more when running it longer). Nice and cozy space-heater!
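For the curious, the 3 C in 5 min actually checks out on the back of an envelope - the numbers here (roughly 900 W total draw for the three GPUs plus TR, and about 50 m³ of air in the room) are assumptions, not measurements:

$$ E = P \cdot t \approx 900\,\text{W} \times 300\,\text{s} = 270\,\text{kJ} $$
$$ \Delta T \approx \frac{E}{m \, c_p} = \frac{270\,\text{kJ}}{(50\,\text{m}^3 \times 1.2\,\text{kg/m}^3) \times 1.0\,\text{kJ/(kg·K)}} \approx 4.5\,\text{K} $$

...and that is for the air alone; with walls and furniture soaking up part of the heat, landing around 3 C is about right.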
-
I had a similar issue with a 2080 Ti on an LG 55-inch IPS HDR some years back - at the time, I was using a 'good' HDMI 2.0 cable, but it would only do 30 Hz at 4K no matter what. However, when using a DP-to-HDMI adapter on a DP port of the GPU and then the same HDMI 2.0 cable as before, I got 60 Hz and far less stuttering, and in my mind anyway, a sharper pic. FYI, I just tried the vid above at 4K and 8K on my 6900XT > 40-inch Philips VA and it is nice and sharp with both DP and HDMI 2.1 ....will try the LG OLED / 4090 / HDMI 2.1 combo after work. Have you tried watching the same vid on the LG 48 OLED's internal 'native' YouTube (if connected to a network)?
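The bandwidth math explains that hard 30 Hz wall, by the way - rough numbers, assuming standard CTA-861 4K timing (4400 x 2250 total) and plain 8-bit RGB:

$$ f_{pix} = 4400 \times 2250 \times 60\,\text{Hz} = 594\,\text{MHz} $$
$$ 594\,\text{MHz} \times 24\,\tfrac{\text{bit}}{\text{px}} \approx 14.3\,\text{Gbps}, \qquad \times \tfrac{10}{8}\ (\text{TMDS}) \approx 17.8\,\text{Gbps} $$

That is just under HDMI 2.0's 18 Gbps ceiling but way over HDMI 1.4's ~10.2 Gbps, so if the cable, port or TV only negotiates 1.4-class rates, the link quietly falls back to 4K/30. An adapter that converts DP to full HDMI 2.0 rates side-steps whatever was forcing that fallback.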
-
...kudos, really good result for ECC ...I'll try to find some older comparisons I did with ECC on vs off at the same MHz using memtest_vulkan...regular GDDR6X actually already has 'some' error correction built in anyhow, but 'full-on ECC' is an unnecessary anvil to drag behind. ... @Avacado @Bastiaan_NL and I had some extensive discussions when this first arose...some HWBot elite league folks felt upstaged by 'regular' newbies who posted results which clearly had some artifacts. Then HWBot had a knee-jerk reaction and required only RTX 40-series cards to run with ECC on. My HWBot elite league years are behind me anyway, but I argued that this rule change only applying to one model range is ludicrous and a failure of HWBot management....they introduced a huge bias with that, even though a bit of sleuthing can expose artifact runs anyways. All that said, I don't like it either when 3DM has results that clearly involved artifacts, but as mentioned, you can usually tell by looking at extra details and fps curves. ...a couple of my runs below from up to ~March '23 for both Port Royal and Speedway involved artifacts - but I didn't post those. I also have newer runs, but not handy. Anyways, 'the cure' is worse than 'the disease', IMO. Addendum: Also per earlier discussions, there are some up-to-date, quick tests to check and scan for artifacts (i.e. below)...something like that should be built into 3DM SystemInfo (which takes forever these days anyways) for any card and model to make it a level playing field...
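If anyone wants to sanity-check what state their card is actually in before a run, here is a minimal sketch - assuming the nvidia-ml-py package ('pynvml') and a GPU/driver combo that exposes the ECC toggle at all (the same switch nvidia-smi -e flips):

```python
# Sketch: query the current and pending ECC state on the first NVIDIA GPU.
# Only works on cards/drivers that expose the ECC toggle in the first place.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

try:
    current, pending = pynvml.nvmlDeviceGetEccMode(handle)
    on_off = {pynvml.NVML_FEATURE_ENABLED: "on", pynvml.NVML_FEATURE_DISABLED: "off"}
    print(f"ECC now: {on_off[current]}, after next reset/reboot: {on_off[pending]}")
except pynvml.NVMLError_NotSupported:
    print("This GPU does not expose an ECC toggle.")

pynvml.nvmlShutdown()
```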
-
...reminds me of the chiller (originally behind a curtain) during the press intro of the 28-core Xeon W-3175X ...that 'test' is a bit like sneaking up from behind and sucker-punching LeBron James while he's having dessert. I don't consider myself a fanboy for any particular brand and have lots of AMD stuff, too (3x 16-core CPUs, 6900XT with 520 W bios, other older AMD GPUs), but this is a bit ridiculous. FYI, there are some 1000 W XOC vbios folks have been running on a 4090 - chiller combo (typically hitting around 700 W actual) for a 'fair' test... ...as to core numbers, AMD and NVidia cores are not directly comparable even in rasterization. On top of that, the 4090 packs a whole bunch of extra tensor and ray-tracing cores that are part of the overall core count.
-
...that's a bit of a tempest in a teapot with some apples and oranges swirling about, IMO -- especially as PC Gamer is comparing a PL-modified 7900XTX against a stock 4090 (there are several stock power-limit versions of the 4090, btw, from 450 W max to 667 W max). To that end, my own stock 4090 hits just under 22,000 on the Time Spy Extreme GPU test - that's with water at 44 C according to 3DM below - and I do have some higher results since, as do a ton of other 4090s if you check the 3DM tables...
-
...speaking of playback resolution, YT sometimes 'drops' resolution levels later on in the playback phase, especially during heavy usage times. Clicking on vid settings and choosing 4K (w/HDR where applicable) manually settles it. Cyberpunk 2077 on OLED is pure eye-candy, especially with ray tracing.
-
That's weird - I haven't noticed anything on my 4K YTs regarding sharpness, though it also comes down to the originator's HDR. How is this 4K YT HDR playback working for you (I sometimes use this to calibrate certain settings)?
-
...I got to have my FS2020 fix pretty much every 2nd day. Below is an 'early' 4K screenshot of the Pemberton Meadows / Mt. Meager region, taken in August 2020 (DX11, 2x 2080 Ti in SLI-CFR). FS2020 has clearly come a long way since then, but also still has room to improve. At least with DLSS3 / FrameGen / NVR, it is now buttery-smooth with a single RTX Ada L and 4K DX12.
-
...not OLED-specific, but these trailers still look best on it, IMO - much of 'Hollywood' does use LG OLEDs for editing and effects-checking.