
franz

Members

  • Posts: 36
  • Joined
  • Last visited
  • Days Won: 2
  • Feedback: 0%

franz last won the day on June 3 2020

franz had the most thanked content!

Reputation

22 (Has started their journey)

Personal

  • Location: Rhode Island

Distributed Computing

  • Folding@Home: https://folding.extremeoverclocking.com/user_summary.php?s=&u=441008
  • BOINC: https://www.boincstats.com/stats/-1/user/detail/142551/projectList


Recent Posts

  1. Another note about OCing multiple GPUs: your desktop entry should look like this:

     [Desktop Entry]
     Type=Application
     Exec=nvidia-settings -a "[gpu:0]/GPUFanControlState=1" -a "[fan:0]/GPUTargetFanSpeed=***" -a "[gpu:0]/GPUGraphicsClockOffset[4]=***" -a "[gpu:0]/GPUMemoryTransferRateOffset[4]=***"
     X-GNOME-Autostart-enabled=true
     Name=nvidia-fan-speed
     Name[en_US]=gpu1.desktop

     Then you would change the gpu:0 value for each GPU (gpu:1, gpu:2, etc.).

     If you would like to lower power caps on NVIDIA GPUs, use:

     sudo nvidia-smi -pl 135

     135 is the desired power level in watts, so for my GTX 1070 with a stock power cap of 151 W, 135 would be about a 10% reduction. This applies the same power cap to all GPUs and needs to be set each time you boot. I haven't figured out setting it up automatically yet (one possible approach is sketched after this list).

     If you want to set individual power levels, use:

     sudo nvidia-smi -i 0 -pl 135
     sudo nvidia-smi -i 1 -pl 150

     The number after -i is the GPU id.
  2. Not sure if you ever got this going or not, but here are my notes from my 20.04 rig.

     Install the latest driver, then reboot. Enable coolbits:

     sudo nvidia-xconfig --enable-all-gpus   # use if running multi GPU
     sudo nvidia-xconfig -a --cool-bits 28

     Reboot. Then you can manually change fan speed and OC.

     If you want to load your OC at boot, do the following. Create a new file:

     sudo gedit /home/your user name here/.config/autostart/gpuoc_all.desktop

     In the new window, paste:

     [Desktop Entry]
     Type=Application
     Exec=nvidia-settings -a GPUFanControlState=1 -a GPUTargetFanSpeed=75 -a GPUGraphicsClockOffset[4]=100 -a GPUMemoryTransferRateOffset[4]=500
     X-GNOME-Autostart-enabled=true
     Name=nvidia-fan-speed
     Name[en_US]=gpuoc_all.desktop

     Click Save, Reboot, Profit.

     Note: you can use any filename.desktop that you want, and enter your own fan speed, clock and mem settings. This config will OC all GPUs with the same settings. This config is for RTX GPUs; if you are using GTX GPUs, change the offset parameters from [4] to [3], like so:

     GPUGraphicsClockOffset[3]=100 -a GPUMemoryTransferRateOffset[3]=500

     To edit the file afterwards:

     gedit admin:///home/your user name here/.config/autostart/gpuoc_all.desktop

     If you want to use different OC and fan values for each GPU, you would have to create a new file with the same info posted above for each GPU. Then you could name them GPU1.desktop, GPU2.desktop, etc. (A quick way to verify the offsets actually applied is sketched after this list.)
  3. A little late to the party, but HighSpeed PC has tech benches and junk. I have 3 of them so far. Here is something that may suit you @BWG: https://www.highspeedpc.com/category_s/1844.htm
  4. This won't be updated 24/7 but I got it working anyways: http://franz.42web.io/summary.html Thanks for the writeup @BWG
  5. In, but gimped for the time being due to some maintenance. Out for prizes. Just here to help
  6. I found the highlights at redbull.com. They still have Monte and Finland available to watch, but I don't know how long they keep them up. Only 30 min of coverage for each day, but better than nothing. I love WRC, but I'm not paying over $100/yr to get the official livestream at wrc.com.
  7. I was able to catch the first 2 rounds of the WRC. It's been very limited coverage the last few years in the US, so it was nice to see those again, especially the Arctic rally, which is always my favorite. Watching Oliver Solberg driving his first WRC stages was really fun, since I grew up cheering for his father and the Subaru team. Looking forward to the F1 season as well.
  8. HFM.net is a great tool. I have used it for a long time to monitor multiple systems across my network with a mix of Ubuntu and Windows rigs. HFM is installed on my Windows rig and it reads the clients on the Ubuntu rigs remotely, because getting HFM itself installed on Ubuntu was a PITA. I haven't used the website option in years, so I'm looking forward to this. (There is a quick connectivity check sketched after this list.)
  9. @BWG Really nice to see you taking the lead on this and getting the ball rolling. Although I tend to fold 24/7, summer is generally limited to just FaTs, so I gave up on TCs a long time ago. Also, my participation level tends to ebb and flow with whatever else is going on in my life, so I might be super active for a week or a month and then disappear for 5 months....lol. I will dedicate my rigs to OCN until I hit 1 billion; after that I could switch my rigs to this forum and help move it up the ranks...
  10. Congrats all, and thanks again @ENTERPRISE for hosting a 5 day.....errr 4 day foldathon. @Minotaurtoo I'm usually 24/7 stable, but of course for the foldathon one system decides to f that up. It's an Ubuntu rig and I recently upgraded to 20.04, which doesn't play well with fahclient, so I'm pretty sure I know what the issue is. @Bastiaan_NL the 17800s do seem to prefer smaller or maybe older GPUs. My 1070s don't seem to notice them at all, with similar PPD to other WUs, but my 2060s lose 10-15% vs other projects.
  11. That is Ubuntu, so he is using NVIDIA X Server Settings, which should be installed when the drivers are installed, and psensor, which should also be preinstalled but can be installed if needed. AMD definitely does something weird with their PCIe lanes: when I added a second GPU to my B350 board, it decided to make PCIE16_2 the primary GPU slot and PCIE_1 the secondary, so my main GPU is limited to x4 lol. Then F@H set up GPU0 (main) as Slot 0, Bus 1 and GPU1 as Slot 1, Bus 10? Every Intel rig I have is always bus 1, bus 2, etc. (A quick way to check bus IDs and link width is sketched after this list.)
  12. That was really fun! Reminds me of foldathons in my past life. Nice job everyone; next time we need to get to 100 mil PPD and/or 900 WUs in one day. Didn't win the 3060 lottery today; I was able to click on Buy Now at Best Buy, but could not add to cart. Oh well. Time to rig up a 3rd 1070 in my main folding rig.
  13. I just woke up to a frozen system with a bunch of "error initializing CUDA" messages. Reset everything and it seems to be working....will dig deeper after work.
  14. I need a riser cable for that, but if I add my 3rd 1070 to that rig I will test it then. Ouch, good to know.
  15. I compared x16 vs x4 on my 1070s recently. CPU: 1700X, Mobo: B350 (PCIe 3.0 slots). I used the same 75 projects, but PPD still varies between WUs in the same project, so....

      x16 GPU average: 1,280,295 PPD
      x4 GPU average:  1,258,435 PPD

      That is about a 2% difference on GTX 1070s.
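Power-limit autostart sketch (re: note 1 above). nvidia-smi power caps reset on every boot, so one way to reapply them automatically is a small root script plus a cron @reboot entry. This is only a sketch under assumptions: the script path, GPU indices, and wattage values below are placeholders, not recommendations, and it has not been tested on the rigs described above.

    #!/bin/bash
    # /usr/local/bin/gpu-power-caps.sh  (hypothetical path)
    # Reapply NVIDIA power limits after boot; wattages are examples only.
    nvidia-smi -pm 1             # enable persistence mode so the limits stick
    nvidia-smi -i 0 -pl 135      # GPU 0: 135 W
    nvidia-smi -i 1 -pl 150      # GPU 1: 150 W

Make it executable (chmod +x) and add a line such as "@reboot /usr/local/bin/gpu-power-caps.sh" to root's crontab (sudo crontab -e). A systemd service would work just as well.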
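Offset verification sketch (re: note 2 above). After the autostart .desktop runs, you can sanity-check that the fan speed and clock offsets actually applied. The query syntax below mirrors the assignment syntax used in the note, but exact attribute names and per-performance-level brackets can vary by driver version, so treat it as a starting point; it assumes an X session is running with coolbits enabled.

    # Query current fan state and offsets on the first GPU
    nvidia-settings -q "[gpu:0]/GPUFanControlState"
    nvidia-settings -q "[fan:0]/GPUTargetFanSpeed"
    nvidia-settings -q "[gpu:0]/GPUGraphicsClockOffset"
    nvidia-settings -q "[gpu:0]/GPUMemoryTransferRateOffset"

    # Cross-check real clocks and power draw while folding
    nvidia-smi -q -d CLOCK,POWER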
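Remote-monitoring check (re: note 8 above). HFM.net reads each rig through FAHClient's TCP command interface, which listens on port 36330 by default. If a Linux rig does not show up in HFM, a quick reachability check helps narrow things down; the IP address below is a placeholder for the Ubuntu rig. FAHClient's own allow/command-allow settings also have to permit the monitoring machine's address, which is a separate step not shown here.

    # From the machine running HFM: can we reach the FAHClient command port?
    nc -zv 192.168.1.50 36330

    # On the Ubuntu rig: is FAHClient actually listening on that port?
    ss -ltnp | grep 36330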
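PCIe sanity-check sketch (re: note 11 above). A quick way to see which bus each GPU landed on and what link width it is currently negotiating, which helps explain the odd slot/bus numbering F@H reports. The query fields come from nvidia-smi's --query-gpu list; note that the link width can downshift at idle, so check it under load.

    # Bus ID plus current and maximum PCIe link width for each GPU
    nvidia-smi --query-gpu=index,name,pci.bus_id,pcie.link.width.current,pcie.link.width.max --format=csv

    # The same topology from the kernel's point of view
    lspci | grep -i vga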
