J7SC_Orion
Premium Platinum · Posts: 2,152 · Days Won: 85 · Feedback: 0%

Everything posted by J7SC_Orion

  1. ...we have some proprietary and somewhat complex databases dating back to 1996 (and still in use today)....GPU acceleration is for our data analytics and deep learning. At the time of the GPU purchase, NVidia had just extended the relevant software to run on Titan RTX and 2080 Ti...
  2. Well Sir, you might have seen Orca and his pod come up for air here and there before... ...and yes, the lone slim 120mm above the VRM and DRAM is supposed to help w/ airflow, though it's not really necessary, given the temp sensors...still, why not add to the 20+ 120mm fans anyhow in this build to make 'Orca' really comfy?
  3. ...single pump ? My inner Bonobo is offended (...then again, he's offended quite easily...) But seriously, I've never had a single D5 fail in operation at work or home in 10+ years; though I still run two of them in series for fail-over (just in case...)
  4. banned coz..you're doing well with this weight lifting program, even if Leo & Gus are chikinz-less in their tummies for now...keep it up !
  5. Admins please note: This thread is intended to settle somewhere between 'GPU' and 'Benchmarking'.
...Ah, the joys of mGPU (multi-GPU), such as NVidia SLI/NVLink and AMD Crossfire/Quadfire...and why CFR (checkerboard / tile-based) mGPU vs AFR (traditional Alternate Frame Rendering) matters...maybe not now, but certainly in the not-too-distant future.
Yes, yes - there is the chorus that SLI/mGPU "is dead"...while not entirely true, it certainly is the case that a single GPU will usually be far more painless to optimize for a given game, or really be the only option for other games...that said, I recently switched from four Quad-SLI / Quad-Fire systems to 'only' dual NVLink (2x 2080 Tis), and while I do not play as many games on this HEDT system (which also does 'work') as others do, I have relatively few problems with NVLink in the fav games I do play, such as various NFS titles and Metro Exodus, never mind 3DMark and other benches.
Yet this thread is NOT intended to convert anyone to mGPU. Instead, it points out that there seems to be a movement afoot by various GPU producers (NVidia, Intel and likely AMD) to introduce 'mGPU' in future generations of their GPUs. Think AMD's 7nm (and soon smaller) 'chiplets' vs Intel's difficulties with giant, complex 10nm 'all-in-one' dies in the CPU realm. Likely, the next gens of GPUs will still be single-die, but sooner or later it will be mGPU for the middle- and upper-class performance graphics cards - for which you need extremely well-performing mGPU drivers.
As such, NVidia rather quietly released CFR capability in their drivers as of late 2019 - for RTX only. CFR is actually not new, but was supplanted by the easier-for-developers AFR. Yet with future GPUs, CFR seems to be the far more capable ticket for future mGPU generations than AFR... Below are some early CFR vs AFR comps with the current gen of RTX GPUs.
I have already run benchmarks of my own, such as 8K Unigine Superposition with CFR vs AFR, but much more (tedious mod and setup) work is needed. I will update this post as I get more results of my own, time permitting... In the meantime, I will note that CFR is not always faster than AFR in outright FPS, but it seems to have better frame times...and below are some YouTube vids someone else ran for DX12 (Titan RTX) and Tom Clancy's The Division 2 (DX11)...have fun and plan your next mGPU (oops )
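For anyone new to the AFR vs CFR distinction, here is a toy sketch (my own illustration - function names and numbers are made up, nothing from NVidia's actual driver) of how the two schemes divide rendering work between two GPUs:

```python
# AFR vs CFR in a nutshell - a toy model, NOT real driver code.

def afr_assignment(num_frames, num_gpus=2):
    """Alternate Frame Rendering: each GPU renders whole frames in turn."""
    return {frame: frame % num_gpus for frame in range(num_frames)}

def cfr_assignment(tiles_x, tiles_y, num_gpus=2):
    """Checkerboard Frame Rendering: within ONE frame, screen tiles
    alternate between GPUs like the squares of a checkerboard."""
    return {(x, y): (x + y) % num_gpus
            for y in range(tiles_y) for x in range(tiles_x)}

# AFR: GPU0 gets frames 0, 2, 4..., GPU1 gets 1, 3, 5... - inter-frame
# dependencies (temporal effects etc.) force syncs between the GPUs.
print(afr_assignment(4))   # {0: 0, 1: 1, 2: 0, 3: 1}

# CFR: both GPUs co-operate on the SAME frame, which is one reason
# frame pacing tends to be smoother even when average FPS is lower.
print(cfr_assignment(4, 2))
```

The key takeaway: under AFR, a GPU's output depends on the previous frame rendered by the *other* GPU, while under CFR both cards contribute to every frame.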
  6. lol ...point well taken - plug away for EHW ! I do point out though that this build is in part a workstation that, well, works for a living. Also, from my perspective, I was being prudent with 'Orca', coming from at least four prior HEDT projects that were either NV Quad-SLI or AMD Quad-Fire, needing multiple PSUs and sturdy circuit-breakers...
  7. I used to live in places with high humidity, but I can now confirm that ocean breezes are nice re. humidity, apart from keeping the mosquitoes at bay...
As to the overall theme of global warming, human activities clearly play a very big role, and *we* have to think about how we can contribute to solving this by looking at our daily behavior, especially with 7.8 billion people now on our earth. All that said, there are also natural cycles which drive climate change...even trickier when they combine with human-made climate impacts.
BUT...at the end of the day, there needs to be a discussion about how technology can best be employed (and its potential risks mitigated) to check the rise of - if not lower - temps, humidity and greenhouse gasses. IMO, things have already progressed so much and moved us so much closer to tipping points that we need technical solutions, not just 'behavior modification'.
The back-and-forth between 'man-made climate impacts vs natural climate cycles' is somewhat irrelevant anyway - because it needs to be dealt with one way or the other. Consider the billions of people who live in cities and areas near oceans and river deltas (where much of early civilization was established). Rising temps will lead to, among other things, higher water levels and flooding - and it makes little difference what caused it. It is happening and we have to deal with it, instead of suggesting, rightfully or wrongfully, that 'it wasn't man-made, so ignore it'...
One day, we'll manage to get to the hydrogen economy...since the rest of the universe seems to run on hydrogen as an energy source, it's clearly 'not a bad' idea. The overriding challenge seems to be how to SAFELY store and transport it...now back to my overpowered 1000+ watt computer
  8. There was no more room on the front ...so the other 3 rads had to migrate to the back, which has a white plastic cover - but I never look back there anyways, as the system sits in a corner between floor-to-ceiling windows. Cooling performance is great though (and so it should be w/ 1800x55 rad space). Starting to think about a future HEDT workstation build with 2x MoRa3-420s, angled and at the back
  9. banned coz kudos & good for you ...doing a cleansing and diet can be really healthy in the long term. Now, you just have to do some extensive weight training (you can lift Leo and Gus twice a day for starters) so that you can get out of ward 17
  10. banned coz aren't you in ward 17 ? That makes you neighbors and you can show E around the whole place and maybe plan an escape attempt or two
  11. banned for forgetting Midget the Kitty again @ENTERPRISE - you need to make a banned appointment with Nurse Ratched
  12. ...either, whatever you want it to be ! Could also be face mask rash ban
  13. tx :-) ...all D5s at 70% speed. Also, they are rubber-mounted on the back TT Core P5 cover...below, during temporary layout measuring w/ temp soft tubes for drilling the mounting holes. Fans on the left and center are GentleTyphoons; fans on the right are painted Noctuas. Rads, fans and pumps have been in continuous use since 2012-2014; they just needed a thorough cleaning
  14. Thanks :-) ...yeah, the X399 Creation has been flawless, though I wasn't: During an early test fitting while modding the TT Core P5 to accept E-ATX, still-too-tight PSU cables pulled the mobo off the standoffs and shaved a cap off :-( But MSI was GREAT about it and gave me an RMA anyway, perhaps because I did not tell them any 'BS' and explained what had happened and how. I just had to pay a nominal amount for shipping and shipped the original X399 Creation w/o the two mobo cover metal pieces. The replacement board arrived with its own set, so that's how this build ended up with those extra decorative pieces. The four horizontal white tubes are actually copper pipes, and the painted-white 'end-plate' on the right used to do duty as a backplate for an R290X...
As to the 4x D5s and 5x XSPC RX360 rads, per above, they are divided into two discrete loops (2x D5 & 3x RX360 for the GPUs, 2x D5 & 2x RX360 for the CPU). I had most of the water-cooling parts lying around anyway from various earlier builds, including work-related servers. They just needed a thorough cleaning. I did plan on an eventual upgrade with a simple mobo swap (ie TRX40 / 3970X), so the cooling system was over-engineered from the get-go to handle higher core-count CPUs later on.
At the same time, I usually build any discrete loop with 2x D5 in series anyway...First and foremost, that's 'fail-over' insurance, especially for servers and workstations where downtime is more expensive than the computer parts and peripherals. In addition, with for example 3x rads and 2x GPU waterblocks in just the GPU loop, 2x D5s are useful - even with 2x rads, a CPU and a GPU block per your question above, I would still employ dual D5s in series for the above reasons, though it is not absolutely necessary from a flow POV.
A final comment on the D5 (also as compared to DDC pumps): DerBauer had a nice comparison of those last fall. On paper, a D5 slightly outperforms the smaller DDC, with the DDC also running a bit hotter.
The D5s have a larger diameter - but as such, they are also MORE sensitive to serious flow drop (and thus temp spikes) if there are air bubbles - yet with dual D5s, that problem is all but eliminated and flow is superb even with the odd set of air bubbles (which can easily get trapped in GPU blocks with all their nooks and crannies).
Quick tip for systems with multiple rads and blocks: I pre-fill the rads, with the pre-sized tubes / pipes attached, with the cooling liquid solution and temporarily plug the tubes / pipes at one end. Then I mount the whole thing (PSU disconnected, of course) and connect all the bits of the loop(s), with usually just a bit of spillage when removing the temp plugs and connecting them...even just pre-filling the rads makes bleeding the loop(s) much easier and quicker...
  15. @SeriousDon ....banned for not banning enough. Also, Leo and Gus called re. their missing extra helping of 'Sunday chicken'. They want you banned...
  16. Thanks :-) 2 rads (sideways) on the front; 3 rads (again sideways), 4 pumps and 2 mini-reservoirs 'hidden' on the back. Wiring and fan control boxes sit 'inside' the split black back area, which also mounts the mobo. With the exception of the slim 120mm top fan over the VRM/DRAM, all other fans pull air from the left (by a window) and blow out on the right. Dual 2080 Tis are 'pigs on gas, err watts', but with 1080x55 rad space in their dedicated loop, temps stay under control, per the pic from a Port Royal run below
  17. With 2x w-cooled 2080 Tis rocking away on a 4K monitor, I haven't really hit a performance barrier yet, so if there's some additional confirmation re. 5nm Hopper in latish 2021, I might skip Ampere and go straight to Hopper
  18. This is an abbreviated, mostly picture-based summary of a workstation prototype project, "Orca". The project was realized about a year ago (with continued tinkering since then) in order to test out RTX cores for some database DLSS analysis tasks. The system has part workstation, part desktop functions. Living on the Pacific Coast, and with a project that had black, white and silver (before RGB) as its main theme, "Orca" seemed an appropriate project title.
Basic description and major parts:
  • TT Core P5 case, heavily modded
  • AMD TR 2950X 16-core / 32-thread at 4.3 GHz all-core
  • MSI X399 Creation motherboard with 32 GB / 64 GB of TridentZ DDR4 3466 MHz (14-14-14-30 settings)
  • 2x AORUS GeForce RTX™ 2080 Ti XTREME WATERFORCE WB 11G GPUs (factory full waterblocks)
  • Philips 4K / 40-inch monitor on DisplayPort
  • 3x WD Blue 1TB M.2
  • Antec HPC 1300W Platinum PSU
  • Dual cooling loops with a total of 5x XSPC RX360/55 rads and 4x MCP655 (D5) pumps
  • Cooling system partly made of copper tubing and copper elbows
Some GPU (desktop bench) performance
  19. Thanks ! I am definitely going to try this...question is whether I go for an 8700K or a 9900K - once the 10900K / LGA 1200 series is available for consumers, I expect the current-gen 9900K to drop in price, and that's when I'm going to move ahead with this. Luumi has been able to get a 9900K working on a Z170 ASRock board, and the process should be similar. The biggest hurdle is getting the right Gigabyte BIOS update, so any help is appreciated
  20. ...fine (ultra large) puppies indeed - I love them. Long live LeoGustus Gigglebytes ! Also, how's the 48 core coming along ? ban
  21. Neon Noir at 4K with 2080 Ti NVLink..."AFR2" settings in the pic below. Will try CFR (the checkerboard / tile NVLink setting) on the weekend if I can get it going. Anyhow, I really like this bench...! For comparison, 2x 2080 Ti at Superposition 8K, Heaven 4K and Valley 4K ...will try to rerun those with "CFR" as well - sometimes, CFR produces lower FPS but much better 1%, 0.1% frame times :-)
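For anyone wondering how those 1% / 0.1% figures come out of a frametime log, here is a minimal sketch (my own illustration - the exact method varies by benchmark tool; this version averages the slowest fraction of frames and converts back to FPS):

```python
# Deriving "1% low" FPS from a frametime log (times in milliseconds).
# Method assumed here: average the worst pct fraction of frames,
# then convert that average frametime back to FPS.

def percentile_low_fps(frametimes_ms, pct):
    """Average the worst `pct` fraction of frames, returned as FPS."""
    worst = sorted(frametimes_ms, reverse=True)  # slowest frames first
    n = max(1, int(len(worst) * pct))            # at least one sample
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms

# Two runs with similar average FPS, but run B has one big hitch:
run_a = [16.7] * 99 + [18.0]    # smooth frame pacing
run_b = [15.0] * 99 + [180.0]   # one stutter spike

print(round(percentile_low_fps(run_a, 0.01), 1))  # 55.6 - smooth
print(round(percentile_low_fps(run_b, 0.01), 1))  # 5.6 - stuttery
```

This is why a config can "win" on average FPS yet feel worse: the 1% / 0.1% lows capture the stutters that the average hides.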
  22. ..say...me thinks we met somewhere before with those massive mastiffs...ah, at the meat and poultry wholesalers ! oh yeah, 'banned'
This Website may place and access certain Cookies on your computer. ExtremeHW uses Cookies to improve your experience of using the Website and to improve our range of products and services. ExtremeHW has carefully chosen these Cookies and has taken steps to ensure that your privacy is protected and respected at all times. All Cookies used by this Website are used in accordance with current UK and EU Cookie Law. For more information please see our Privacy Policy