Snakecharmed
Premium Bronze · Posts: 284 · Days Won: 8 · Feedback: 0%

Everything posted by Snakecharmed

  1. Well, here's something annoying. Lately, my CWT-built 2018 Corsair RM850x has been making stupid noises similar to a hard drive seeking, distinct enough to be different from, and more annoying than, my actual hard drive. It doesn't happen all the time, but at its worst it can be pretty rhythmic and consistent, like a hard drive defrag operation. I can't pinpoint the conditions under which it occurs, other than that I think it starts at a mid-level power load. Of course, it turns out to be a known problem; other users have posted similar complaints over the years and ended up RMAing their units. And of course, I only found this out via Google long after buying the product and assuming the professional reviews had covered all the bases on what a great PSU it was. I'll be bringing the Seasonic-built Corsair AX850 Gold from my previous rig into this one over the weekend. I paid $81 on eBay for this unused RM850x, which really isn't much more than the $60 or so I would have paid for a used Seasonic Focus 750 as a backup. I guess that makes it fine as a backup power supply that I won't really care about, although I was seriously considering a Seasonic Prime 850 Titanium last night because the RM850x's rhythmic "drive seek" noise was pissing me off. Seasonic is probably the closest thing I have to brand loyalty in the DIY PC space. In the meantime, I guess I'll just have to keep the music on more frequently.

     I also spent $75 on two cans of new old stock spray paint last week as extra supply, because the paint I used for my case has been discontinued for probably a few years now. After dusting off the work I last did in 2015, the "bad" spray job on the top of the wraparound panel wasn't as bad as I remembered. The main repair is to respray the top of the case with clearcoat to fill in the pit marks that originally resulted from spraying a can that sputtered from being dangerously low on paint, then sand it all down evenly and polish. The pitting was deep enough that I couldn't reliably buff the top any flatter; it was already glass smooth otherwise.

     Not much in the way of other updates at the moment, but I did increase my PBO Curve Optimizer negative offset from -20 to -25. There may still be more headroom, but I haven't tested it yet. I hit 28146 in Cinebench R23 multi-core, which is exceptional for a 7900X at a cTDP of 105W, considering most reviews put an uncapped 7900X at 28500-29500, and users who have manually tweaked their 7900X far more extensively than I have at the 105W power limit are topping out in the upper 27000s. My previous best from three weeks ago was 27744. (Quick math on those scores below.)
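     A back-of-the-envelope comparison of the Cinebench R23 figures quoted above (nothing new measured here, just the percentages):

     ```python
     # Cinebench R23 multi-core scores quoted above.
     capped_105w   = 28146            # 7900X at 105W cTDP, Curve Optimizer -25
     previous_best = 27744            # previous best at -20, three weeks earlier
     uncapped_low, uncapped_high = 28500, 29500   # typical review range for an uncapped 7900X

     gain     = (capped_105w - previous_best) / previous_best * 100
     gap_low  = (uncapped_low - capped_105w) / uncapped_low * 100
     gap_high = (uncapped_high - capped_105w) / uncapped_high * 100

     print(f"Gain over previous best:        +{gain:.1f}%")      # ~+1.4%
     print(f"Behind uncapped range, low end:  {gap_low:.1f}%")   # ~1.2%
     print(f"Behind uncapped range, high end: {gap_high:.1f}%")  # ~4.6%
     ```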
  2. I would be wary of what you see in the product description of these Chinese-marketed products though. They're confusing at best and blatantly misleading at worst. The bullet points at the top of the Amazon product description are a good example. The product's ambiguous, nonstandard notation (104 Mbp/s) notwithstanding, the transfer speed cannot be 5 Gbps (625 MB/s) to an SD card. SD cards have their own bus. The adapter may provide 5 Gbps of available bandwidth via USB 3.0/3.1 Gen 1/whatever, but the SD card bus and the card itself will limit the maximum throughput. The fastest commercially available SD card bus is UHS-II, which has a half-duplex throughput of 312 MB/s and is only found on SD cards most commonly used in and marketed for professional digital cameras with 4K video recording. Even the fastest general-purpose SD cards are UHS-I, so unless you use UHS-II cards, I wouldn't go out of my way to seek out a UHS-II card reader if it's not the most vital feature of this USB hub. UHS-II cards physically have a second row of contacts, and if a card reader doesn't have them, the card falls back to UHS-I, which maxes out at 104 MB/s half-duplex (or whichever lower-spec bus the card reader supports). There is an approved UHS-III standard defined by the SD 6.0 spec that maxes out at 624 MB/s full duplex and actually would saturate a USB 3.0 5 Gbps interface, but no UHS-III cards or readers currently exist on the market. There is also an SD Express bus that uses a single PCI Express lane, but those aren't currently implemented as multi-input hub solutions and, again, there are no cards. (Quick bandwidth math below.)
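     A rough sketch of those bandwidth ceilings using the figures quoted above (625 MB/s is the raw 5 Gbps line rate; usable USB 3.0 throughput is lower after encoding and protocol overhead):

     ```python
     # Interface ceilings quoted above, in MB/s (decimal megabytes).
     usb3_gen1_raw = 5e9 / 8 / 1e6    # 5 Gbps line rate ~= 625 MB/s before encoding overhead
     sd_buses = {
         "UHS-I":   104,   # half duplex
         "UHS-II":  312,   # half duplex, needs the second pin row on card and reader
         "UHS-III": 624,   # full duplex, approved in SD 6.0 but no shipping cards/readers
     }

     for bus, mbps in sd_buses.items():
         share = mbps / usb3_gen1_raw * 100
         print(f"{bus:8s} {mbps:4d} MB/s -> at most {share:.0f}% of a 5 Gbps USB link")
     ```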
  3. Technically, none. Has the card reader throughput been your sticking point? That is almost never specified by the manufacturer or reseller of the hubs. Otherwise, you've described practically any $30 MacBook Pro hub out there. I assume you're talking about SD cards, otherwise I'd imagine you would have specified which one. SD card reader interfaces aren't rated by USB bus speeds. They have their own bus interfaces and all the commercially available ones are slower than USB 3.0. The fastest is UHS-II which theoretically tops out at 312 MB/s. UHS-III cards aren't on the market yet. The better question is, what's your fastest SD card? Because if it's not UHS-II, you're needlessly looking for a needle in a haystack. Either way, this is the best you'll be able to do. https://www.amazon.com/Dual-Slot-Reader-10Gbps-Adapter-TF4-0-Compatible-Windows/dp/B0BLCF8YDP/
  4. Okay, this size comparison is what I was looking for: Visual TV Size Comparison: 45 inch 21x9 display vs 48 inch 16x9 display (displaywars.com). If you ignore the bendable feature, the 45" Corsair is otherwise pretty pointless. Over time, people are less inclined to make frequent adjustments to their space if they can be avoided. I used to pull my TV wall mount to my desk in the morning and retract it back to the wall at the end of the workday. Now I don't even bother unless the TV is actually in the way of me getting between my desk and the wall. Seriously, for the price of that Corsair, I'd pick up two 42" LG C2s and call it a day.
  5. I was using the original 2015 34" Acer Predator X34 (they've reused the X34 model number more than once) prior to this LG UltraGear 38GN950-B. The X34 had a 3800R curve, which was barely even noticeable. This 38GN950-B has a curve of 2300R, which is pretty much at my limit for a curved screen that doesn't bother me. The majority of curved monitors today sport a 1900R or tighter curve. They're completely useless to me outside of gaming, and I don't consider them true game-changers for gaming either. My biggest gripe with curved ultrawide LCD monitors though is their bottom-of-the-barrel quality control. If you don't send your first one back, you scored a minor miracle. The bendable 45" OLEDs could have been worth considering if they didn't have such an abysmally low resolution. It's so low that it doesn't belong on a desk, at which point, you've lost the benefit of a curved screen. If it doesn't belong on a desk, then other solutions are far superior.
  6. I demand that 45" ultrawides have a higher resolution than 3440x1440.
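     To put some numbers behind that, a quick pixel-density comparison; the diagonals are the nominal marketing sizes, and the 38" entry corresponds to the 38GN950 mentioned in the post above:

     ```python
     from math import hypot

     # Nominal diagonals (inches) and resolutions; the 45" 3440x1440 entry is the
     # class of bendable ultrawide OLED being complained about above.
     panels = {
         '45" 21:9 OLED (3440x1440)': (45.0, 3440, 1440),
         '42" 16:9 OLED (3840x2160)': (42.0, 3840, 2160),
         '38" UltraGear (3840x1600)': (38.0, 3840, 1600),
     }

     for name, (diag_in, w, h) in panels.items():
         ppi = hypot(w, h) / diag_in   # pixels along the diagonal divided by diagonal length
         print(f"{name}: {ppi:.0f} PPI")   # ~83, ~105, ~109 PPI respectively
     ```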
  7. I doubt anyone will mind a necropost, but this is a niche benchmark test and now I have something to add even if it's not Zen 3. RAM disk benchmarks are fun because of how stupid off-the-charts they are. I've been doing them since I was on Sandy Bridge. I also have a practical use for them as scratch disks in Adobe CC apps as well as session-based temp storage, like for downloading and running program installers. The RAM disk app you use makes a big difference. When I first started using RAM disks, I used Dataram RAMDisk, which resulted in much slower sequential read speeds compared to SoftPerfect RAM Disk, to the tune of 6725 MB/s sequential read vs. 10874 on the same 16 GB Corsair Vengeance LP DDR3-1600 RAM. It's kind of wild to think that PCIe Gen 4 NVMe SSDs can outpace Dataram's read speed on those Corsair sticks now. Years ago, SoftPerfect RAM Disk was the fastest RAM disk app out there, and I haven't checked recently to see if anything has surpassed it, so here it is again on Zen 4.
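     For a quick-and-dirty sanity check of a RAM disk without a dedicated benchmark tool, something along these lines works; the R:\ mount point is an assumption, and Python plus OS overhead means it will report lower numbers than a proper benchmark:

     ```python
     import os
     import time

     # Adjust to wherever your RAM disk is mounted; R:\ is just an assumption here.
     TARGET = r"R:\ramdisk_test.bin"
     SIZE_MB = 2048
     CHUNK = 4 * 1024 * 1024          # 4 MB blocks
     buf = os.urandom(CHUNK)

     # Sequential write
     start = time.perf_counter()
     with open(TARGET, "wb", buffering=0) as f:
         for _ in range(SIZE_MB * 1024 * 1024 // CHUNK):
             f.write(buf)
     write_mbps = SIZE_MB / (time.perf_counter() - start)

     # Sequential read
     start = time.perf_counter()
     with open(TARGET, "rb", buffering=0) as f:
         while f.read(CHUNK):
             pass
     read_mbps = SIZE_MB / (time.perf_counter() - start)

     os.remove(TARGET)
     print(f"Sequential write: {write_mbps:.0f} MB/s")
     print(f"Sequential read:  {read_mbps:.0f} MB/s")
     ```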
  8. I have a 7900X, but from what I've learned, the best practices are largely unchanged from the 5000 series. Curve Optimizer with negative offsets is the easiest way to get a baseline, and then you can fine-tune individual cores if you want to get the most out of every core. When I installed the 7900X, I set it to 105W Eco mode immediately. I got whatever I got in the CPU-Z benchmark and didn't think anything of it, to the point of not even remembering to compare it in the CPU-Z benchmark thread we had here last year. Last week, I read a little about undervolting to drop the temperatures, then saw what people were doing with the 7000 series. I started with -15 in Curve Optimizer, which checked out fine, and then went to -20, which resulted in my best single- and multithreaded CPU-Z scores yet. With the generational uplift, the 7900X is almost matching 5950X multithreaded scores in CPU-Z despite having four fewer cores. I haven't gone beyond -20 or tried to find the limits of each core yet, largely because I haven't had to, but I figure I'll get around to it as the weather and the ambient room temp get warmer. I haven't even looked at CPU offset voltage or boost clock override, and it doesn't look like there's any real reason to unless you've gotten everything there is to get out of Curve Optimizer.
  9. The latest LG C-series model is the C3 which debuted last month. The digit represents the model year. That means new old stock of C2s should be on clearance now. The best time to pick up a C2 is probably within the next couple of months while the more reputable retailers and even LG direct still have the C2 available. Once they fully shift to selling the C3, the remaining smaller retailers who still have stock of the C2 will jack up the prices on their meager inventory.
  10. I looked up the old model number for this computer and found this ad in a Google Books scan of the Jan 22, 1996 edition of InfoWorld. I got this PC in May of that year. I'm not really sure why they opted to depict the P5-150 instead of the 166 XL, other than perhaps to avoid a copy layout issue trying to wrap text around the much taller XL model tower case. Anyway, beyond the specs, I'm a little surprised that 4+ months after the Windows 95 launch, they were still offering 3.11 by default instead of 95. That had obviously changed by May when I got it. One thing I will say about those old days is that, once again, timing was everything. Dell and HP had yet to gain the level of market dominance where they introduced proprietary case-specific hardware in their prebuilt systems. If this had been 1995, the system would have been Baby AT and I would have tossed it following my 1999 or 2001 custom builds, because I was way too young to care about case modding back then. And if it had been a couple of years later, Gateway had moved on from these tanky case designs by then, and I certainly wouldn't have kept one of their later, flimsier cases for 20+ years.
  11. I finally put on a case badge. My original plan was to have this done as a vinyl decal, but I realized partway through putting together the inquiry to the vinyl shop that it wasn't going to work. The actual Cobra emblem has too many fine details for a double-layered, 1.2" square vinyl cutout with a metallic silver or chrome bottom layer. Even a black inverse print on metallic silver or chrome vinyl using a vector image of the emblem would have resulted in a loss of detail, because vector renderings aren't as finely detailed as the actual metal emblem. None of the vector images I've found seem to get the belly of the cobra exactly right either. I ended up printing this myself with my color laser printer on adhesive-backed heavy glossy paper stock meant for CD/DVD labels, then cut it to size with a straight edge and utility knife. It turned out pretty well, but it lacks the wow factor I originally wanted; it's not reflective like it would be if it were inverse printed on vinyl. Well, at least it's not blank anymore, and it's something other than the P5-166 XL badge that was there before the paint job.
  12. I'm not a fan of the core parking concept for the dual-CCD X3D chips. The way it's been described, it seems it may prevent you from actually using the CPU as a 12- or 16-core part if you want to run background production processes while gaming. After all, why would you want those other cores to do nothing if you actually have something for them to do? As for the 7800X3D versus the 7900X, if you never do any production work on your PC, then you could go for the X3D. Otherwise, I find that more and higher-clocked cores are far more valuable and noticeable for a mixed workload. I'll notice faster file decompression on a large archive or a video encode finishing a few minutes sooner. I probably won't notice a 30 FPS difference in a game when the lower result is still well past 144, or even 100. Also, all these CPU gaming benchmarks purposely create CPU-bound scenarios that aren't reflective of an enjoyable, or in my case, a remotely feasible gaming experience. I have never gamed at 1080p on a desktop because I went from CRT to 1200p and now only have 3840x1600 or 4K to work with, depending on whether the game is optimized for keyboard/mouse or controller. As for the 7900X3D, it does appear to occupy no man's land. I can't make a case for it versus either the 7800X3D or the 7900(X) when taking price into consideration.
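     If you do want background work kept off the gaming cores on a dual-CCD part, manually pinning the process is one workaround (tools like Process Lasso do the same thing with a GUI). A minimal sketch using psutil; the CCD-to-logical-CPU mapping and the ffmpeg job below are placeholders and will differ on your system:

     ```python
     import subprocess
     import psutil

     # Assumed mapping for a dual-CCD part with SMT enabled:
     # logical CPUs 0-15 on CCD0 (the V-Cache CCD), 16-31 on CCD1.
     # Check your own topology first; this layout is NOT guaranteed.
     CCD1_CPUS = list(range(16, 32))

     # Launch a background job (placeholder encode command) and pin it to CCD1
     # so the game keeps the other CCD to itself.
     job = subprocess.Popen(["ffmpeg", "-i", "input.mkv", "output.mkv"])
     psutil.Process(job.pid).cpu_affinity(CCD1_CPUS)
     print("Background job pinned to logical CPUs", CCD1_CPUS)
     ```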
  13. I'll see what I can do about tidying up the cable sprawl on the case floor. I did forget to tie together the two bundles of USB 2.0 header cables like I originally meant to. I played around with PBO Curve Optimizer today and I'm currently at -20 all-core, with a slight reduction in idle CPU package power and cooler temperatures across the board. The office room temp has stayed within 3-5°F of the ambient floor temp all day, a significant improvement over the 8-10°F I was seeing before with the i7-2600K. I did get a break today because it was overcast, but I'm now optimistic that tinting the west-facing window should keep the afternoon temps in this room under control without any other significant measures.

     I'm trying to optimize idle power right now because it turns out that sleep might not be a great option after all. I need to do more testing, but in two instances when the PC went to sleep, my monitor didn't want to recover on wake. Apparently, LG decided that DisplayPort deep sleep recovery settings on their monitors weren't important. I've yet to find the exact cause, since everyone in the forum threads I researched was focused on how Windows 10 rearranges app windows on wake. I should have tried pulling the DP cable and plugging it back in when I hit the no-wake issue this morning, but I forgot.

     Meanwhile, the 55" Samsung has behaved well in Windows 10 so far coming out of screen-off, non-sleep idle. Even that was constantly an issue in Windows 7. With enough trial and error, I found that selecting the Switch User option in the Windows 7 Ctrl-Alt-Del menu would wake it up, but there was also a small chance that the screen resolution would revert to 1024x768 instead of keeping the 4K setting and screen layout position in Nvidia Control Panel. Thus far in Windows 10, I've had no issues with the TV forgetting its resolution or screen layout position, although it did act up the one time I tried to enable VRR by dropping the video signal in Game Mode. Monitors and TVs these days seem to be a nightmare to configure properly with a PC.
  14. Early tests with QD-OLED have not been promising on the burn-in front. If this matters to you, stick with LG. Longevity Burn-In Investigative Paths After 3 Months: QD-OLED vs. WOLED, LG vs. Sony, And More (rtings.com). I'm waiting it out for microLED. It'll probably be another 5-10 years, but what I do with my monitors would amount to abuse for OLED because I have them in work mode for most of the day. It's worse than letting a CNN news ticker scroll persist all day. I've been wanting a 65" LG C-series OLED for my family room for years now, but I've put it off indefinitely knowing that I almost never just "watch TV" anymore.
  15. Cable management isn't really going to be possible beyond moving what I can out of the way to help airflow as best I can. There's nowhere to stuff all the extra cable slack, and it'd be worse if I tied it all into a thicker bundle. Considering this case has no side window, leaving it a little messy is fine, but it does bother me slightly. I used to not care about these things, but I do now. I'm not even sure what I'm photographing, but everything that's supposed to be inside the case is in there now. Despite how it looks in this picture, the path from the front intake to the CPU fan is unobstructed aside from the USB 3.0 header cable, which disrupts the airflow a little. The last thing I needed was a pair of USB 2.0 header extension cables so I could put the PCI slot cover ports above the GPU. That allowed me to put all those SilverStone Aeroslots Gen 2 vented PCI slot covers below the 3080 Ti to reduce dead air zones at the back of the case.

     Here's a bonus pic of the packaging of the USB 2.0 header extension cables. I can put aside a lot of things when it comes to generic Chinese products that don't have much design complexity or criticality, but I can't get over the futility of their alphabet soup brand names on Amazon. I can confirm those are letters from the alphabet. I know we all just accept this now and never really talk about it, despite it being covered in feature articles in the New York Times and Slate, but it annoys me. This would be like an American product called Zzyzx. Actually, it's worse, because at least Zzyzx is the name of an actual town.
  16. I had started a post several days ago about exhaust temps but never ended up posting it. The i7-2600K in the Montech X1 case had an exhaust temp of 105°F at idle. The 7900X in my modified case had an exhaust temp of 86°F at idle before I set it up in my desk. This weekend, I put the case in the desk and brought in the 12 TB HDD and 3080 Ti. In the middle of extracting 40 GB of data, the exhaust temp was 96°F under load. Ambient room temp was 79°F for that measurement; on the day the idle temps were measured, it was 81°F. (The deltas are broken out below.)

     I used to have some theories as to what was making this room so hot and figured it couldn't have been just the PC, but the room this PC was in previously was also much bigger. At this point, it seems the only must-do in this room is to have the west-facing window tinted so the afternoon sun doesn't heat it up so much. The significantly cooler 105W Eco-limited 7900X should allow the room temp to stay within 10°F, or hopefully even within 5°F, of the rest of the floor now. There were days last week when the room temp got up to 86°F with both PCs running, which was as much as 14°F higher than the rest of the floor.

     After overclocking everything I've built for myself since 1999, it's an unfamiliar feeling to not be doing that anymore. However, the overclocking hobby, in the sense of getting free performance so your affordable CPU could punch significantly above its weight class, has been dead to the average DIYer for quite some time now.
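     The exhaust-to-ambient deltas from those measurements, for easier comparison (same figures as above, just subtracted; a °F delta divided by 1.8 gives the °C delta):

     ```python
     # (exhaust °F, ambient °F) pairs quoted in the post above.
     readings = {
         "i7-2600K, idle":     (105, 81),
         "7900X @ 105W, idle": (86, 81),
         "7900X @ 105W, load": (96, 79),
     }

     for label, (exhaust_f, ambient_f) in readings.items():
         delta_f = exhaust_f - ambient_f
         print(f"{label}: +{delta_f}°F above ambient (+{delta_f / 1.8:.1f}°C)")
         # prints +24°F, +5°F, and +17°F respectively
     ```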
  17. If you're looking at CAD prices, that Asus GS-AX3000 seems to be a decent price compared to what it goes for in USD. However, I don't see Asuswrt-Merlin support for it. Like ENTERPRISE, I consider the availability of that firmware my primary criterion for choosing an Asus router.
  18. There are some mixed visual cues happening here. The unread items indicator is being interpreted by users as a notification indicator because it's a large red circle with a number inside, which matches the visual cue of a red badge with a number inside used for notifications in many other web portals, chat apps, mobile OSes, and macOS. In these forums, the notification bell next to the unread items indicator also shows a dot at its top right when you've received an actual notification, but it's nowhere near as prominent visually. Since I don't have any active notifications, I can't even remember if the dot is red. Yet that dot is what a user really wants to know about or see when they've received a notification. I've learned to ignore the unread items indicator completely. You can clear it if you want by clicking through and marking all forums as read, but that doesn't really serve a functional purpose. They aren't actual notifications.
  19. Not enough brown for that.
  20. Those are all Newegg Marketplace sellers. Just like Amazon Marketplace sellers who sell new old stock, they all price gouge. If you want older hardware at a reasonable price, you pretty much have to look on forums or go for used on eBay.
  21. Helm's Deep Reborn was a creative use of game assets to make the game function in a way that wasn't intended by Valve, to great effect. It gave a narrative purpose to survival mode; I never played any other survival map because the entire mode felt pointless to me otherwise. That map is so difficult that it worked well with the servers I played on, which supported up to 12 players. We also had increased-difficulty mods including double tank spawns, increased horde sizes, and one-hit-kill witches, but we also gained firepower with infinite minigun ammo and repositionable turrets. Since this is a survival map played on a survival game server, I don't think we had a CS-like buy menu system available for it. Anyway, the mod potential of L4D2 is so vast, unlike other games in the genre, that the replayability is nearly limitless as long as you can embrace the game's core mechanics.
  22. L4D2 holds up exceptionally well, and it's largely because the game isn't micromanaged. I stopped playing a few months before the Last Stand update, but the custom map community still appears to be going very strong. HS Top on YouTube still regularly puts out videos that surprise me with trick plays you can perform in the game, and Andre Ng does a good job of showcasing custom campaign map secrets and gameplay facts. I stopped playing partly because online play is a massive timesink, but that's where most of the game's replay value is. I wore out the official campaigns long ago, the vanilla game experience feels lacking after you've played on modded servers, and the team play dynamic and randomness aren't there in single player because the bots are stupid.
  23. I feel like these types of leaks have persisted ever since HL3 became a meme. L4D3 has existed as comments left behind in source code for years. It's been worked on and subsequently abandoned. Tech demo screenshots exist, although they have no context. Nobody on the outside knows its current status. L4D2 on modded servers is my favorite gameplay experience of all time. There is a case to be made for L4D3, because Turtle Rock's Back 4 Blood seems to have missed the mark in many ways.
  24. Yeah, it's definitely a keeper. The paint job alone is a reminder of what I can do when I make the effort to do things the right way, because that was my first and only attempt at an automotive-quality paint project. I said earlier in the thread that I might have gone with a vinyl wrap on the whole case if I had started a few years later, after wraps became more popular. That still wouldn't have addressed the complexity of the front bezel, which has enough nooks and crannies to make a vinyl wrap very difficult to adhere nicely. In any feasible scenario that involved keeping the case, I think painting it was inevitable. I was never a retro purist for beige. As for the Lian Li cases, I would go so far as to say Lian Li didn't think far enough outside the box back in the mid-2000s. They still tried to make those cases retain the dimensions of a normal ATX tower, and I think that limited their imagination and led to those monstrosities. I do like the shape of the Burj Al Arab hotel as a design concept, but as an ATX case, the illusion of over a dozen stepped front panel drive bays doesn't work for me. I think the Lian Li PC-Y6 yacht from a few years ago is an amazing limited edition case, and part of that is because it didn't try to satisfy the dimensional requirements of an ATX case.