
Kaz

Members
  • Posts: 113
  • Joined
  • Last visited
  • Days Won: 3
  • Feedback: 0%

Everything posted by Kaz

  1. I've got 2 LG 32" 2560x1440 240 Hz monitors. I wish I could keep them, but I am experiencing eye strain. Yesterday 10 minutes of desktop use resulted in blurry vision for hours, and eye irritation for 6-8 hours. These LG 32" screens are always blurry. Text in particular is very bad. I've tried increasing sharpness; it's an improvement, but not great. I think my eye strain comes from forcing my eyes to look for detail that isn't there. I'm not sure if it's a low PPI issue, or how LG decided to use an RWBG subpixel layout instead of standard RGB. I almost switched to LG's 27" OLED, but it turns out that monitor has really bad text clarity as well. Something about RWBG not being very good for text. I do not know if there is a way to force sRGB with AMD graphics cards; if there is, could anyone explain how it's done? After months of struggling, I want to use my computer badly enough that I'm ready to dump these monitors and buy something else. I'm looking for 27" or 32" monitors in the 1440p to 4K range. If I get another 1440p monitor it's going to be 27". I don't know if there is a visual clarity difference between 27" 1440p and 4K. I thought my 27" 1440p IPS Yamakasi's picture was very sharp. It had a gloss screen. I'm at a loss. One thing I know: text needs to be clear. I know I want 2 monitors. I know I like high refresh rates, but that it's wasted on a 2nd monitor (unless I get 3!). I liked having 2 of the same monitor, but I can see where it's not the best idea for gamers. I'm leaning towards 2 4K monitors, with the thought that if I ever outgrow them and want a faster refresh rate, I could buy just 1 monitor and run all 3. I also think the 4K monitors will still be relevant in 5 years, whereas the higher refresh won't matter as much. I know LG makes a lot of panels that get rebranded; are there other companies I need to avoid because they use RWBG and text looks terrible? I've never used a curved monitor. I have a friend who swears by them.
I probably prefer flat screens because it's easier to show other people what I'm working on, but I could be persuaded to give them a try. My primary use is office/gaming. My graphics card is a 7900 XTX. Thanks for any input. I've put off buying replacements for a long time because I don't know what to buy.
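The 27" 1440p vs. 4K clarity question above mostly comes down to pixel density. A quick back-of-envelope sketch (panel sizes and resolutions are the ones mentioned in the post):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal resolution divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

panels = {
    '27" 1440p': (2560, 1440, 27),
    '32" 1440p': (2560, 1440, 32),
    '27" 4K':    (3840, 2160, 27),
    '32" 4K':    (3840, 2160, 32),
}
for name, (w, h, d) in panels.items():
    print(f"{name}: {ppi(w, h, d):.0f} PPI")
# 27" 1440p: 109 PPI
# 32" 1440p: 92 PPI
# 27" 4K:    163 PPI
# 32" 4K:    138 PPI
```

So a 32" 1440p panel is noticeably coarser than the 27" 1440p Yamakasi, which may explain part of the softness; either 4K option is a clear density upgrade.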
  2. FSR is the reason I don't care about DLSS. With my GTX 970, Cyberpunk was unplayable with default/low graphics settings, but with FSR enabled it was an enjoyable game. That graphics card couldn't use DLSS. Nvidia likes to lock their DLSS improvements to newer hardware. That's why the 4060 is SO MUCH BETTERZ than the 3060 in charts, when in actuality the hardware has very few improvements and is a downgrade with PCIe 3.0. I don't need upscaling/smoothing for new hardware, I need it when the hardware is a little older... When Nvidia users are locked out of newer DLSS versions, FSR is still available. It's only a matter of time until the DLSS for current-gen hardware is outdated, and FSR is the superior option. It's kind of like G-Sync/FreeSync. G-Sync hit the market and required additional hardware on the monitor, which raised monitor prices by about $50. When FreeSync hit the market, it didn't require additional hardware and worked with most monitors on the market. It wasn't long until monitors all went FreeSync / G-Sync Compatible. G-Sync Compatible was not the same as G-Sync, since it didn't have the hardware module, but good luck finding an actual G-Sync monitor now. Hardware companies learned that $50 more for a monitor means fewer buyers. I've noticed a trend of proprietary tech entering the market, people claiming it's better, then open-source tech catching up and the proprietary tech no longer mattering. Nvidia has been the company pushing the envelope, which means they keep their stuff proprietary. When AMD comes along, they open the market. It will be interesting to see if Intel's competition means AMD keeps more stuff proprietary.
  3. I've had the displeasure of using an Arc 730. It couldn't even play YouTube videos without massive rendering issues. Most people measure graphics performance in frames per second; that card was how many seconds per frame. Intel is not competitive in the graphics market. The 730 should never have been sold; graphics cards from 15 years ago work better than that. The enthusiast hardware group has always been a small market compared to the economical stuff. In the enthusiast segment, AMD isn't the most popular, so it makes sense that they would reduce focus on high-end graphics. They stated before that they want enterprise to be their primary focus. Servers don't prioritize graphics. AI is the next push, and AMD must be competitive in that department if they want to stay relevant. Being a hardware enthusiast, I do hope AMD can offer competition to Nvidia's high end. I think AMD had the opportunity to sock it to Nvidia with their pricing this year, but instead they decided to maintain the status quo. I feel that ray tracing and DLSS are marketing gimmicks, and I don't care about those, but it's true that a lot of software doesn't work with AMD. Matrix jewelry design software doesn't work without CUDA cores, and it hasn't for years. Nvidia took their time to infiltrate the market and form partnerships; AMD is still trying to sell hardware. Let's not forget about the tessellation scandal and how game companies Nvidia partnered with were running tessellation below the map because AMD cards didn't handle tessellation as well.
  4. I like autoruns for windows (microsoft made!) I use that to check what is running in the background. Other than that I just delete folders.
  5. If I didn't know any better, I'd say this was done as a direct response to the 6900x and 6900X3D. 28 threads is kind of an odd count, but I guess it doesn't take a lot to beat out 24 threads...
  6. MSI Ventus GeForce RTX 4080 16GB VENTUS 3X OC - Newegg.com (WWW.NEWEGG.COM) A $100 discount code is included; total price is $999.99. Doesn't seem so bad to me...
  7. MSI Ventus GeForce RTX 4080 16GB VENTUS 3X OC - Newegg.com (WWW.NEWEGG.COM) Price is listed as $1,099.99, but has a $100 promo code that drops the price to $999.99. This is the first time I would consider a 4080 over a 7900XTX.
  8. I miss the days of bandits with names. This Steam summer sale has been hard for me. I'm so tempted to buy 10+ games, but part of me says I never play them all when I do it that way. I've been going through my catalog of games and I still have so many unfinished games. Hard to resist the temptation though.
  9. AI is the real deal. We had the machine age, then we had the information age, now we are entering the age of AI. That is to say, we can finally make use of all the information we have accumulated. AI is still in its infancy. People have to understand statistics and datasets to program it, then they have to train it, and then verify results and tweak the process. Part of the problem with AI is that it's very hard for anyone to understand what it learned after training. ChatGPT may be able to answer a question, but until it can explain exactly how it came to that answer, its use as a tool is limited. A lot of people think they can just slap AI onto a problem and it will solve it. That's not what AI is about. In fact, from a cybersecurity standpoint, adding AI onto something just increases the attack vectors people can use. The real importance is the datasets. AI is only as good as its data. To companies like Google, Microsoft, and Apple, AI is invaluable. To someone like me with limited data, it is not very useful. AI plays a major role in the growth and development of nations. The NSA has a ridiculous amount of data. Similarly, China has an immense amount of data. These AI algorithms will play a major part in how our countries function in the next 50 years. TikTok is available worldwide. In China it promotes learning and growth. In America it promotes the dumbest things they can find. Perhaps the most influential use of AI will be in the growth and development of the next generation of children. Already we are seeing AI develop new chemical drugs and assist with silicon development. We will hit a point where its growth is exponential in comparison to what people come up with alone. AI by itself may never be able to match the creativity of a person, but when AI has the output of thousands of people's work as a training dataset, it can potentially do what it would take a million people to come up with. Right now, AI is still a child.
  10. I spent some time bumming around trying to learn more about audio. Ultimately, it led to information overload. What I learned: I should probably take @GanjaSMK and @The Pook's advice, buy a 2.1 Klipsch speaker set, run onboard sound, and escape while I can. Sadly, I never claimed to be smart. @pioneerisloud I don't know enough to build my own audio equipment. I can handle the electrical stuff, but when it comes to the generic stuff I am woefully lacking. I don't know the difference between two different sets of speakers based on their stats. One thing I do know: I don't want an MDF-finished product. Perhaps there is a way to make them look decent; I'd love to learn more. In high school I had a welder friend build a sub box for me out of MDF. If I were to do it again, I would buy a carbon fiber box and not think twice. That's not to say the MDF underperformed; Jeff Keys (owner of Sight and Sound) told me my subs would probably do better in a band-pass box. Nowadays I'd pay more for a slightly finished product. If you can give me the knowledge on how to make a product seem finished, I'm still interested. I fell into the trap of listening to advertisers tell me why their stuff is good, only to decide they are marketing guys and I don't trust anything they say. A few 'neutral' YouTube videos later and I realized these guys don't have a clue what they are talking about. This led me back to @T.Sharp's chart information, and I realized I should reset and learn how to read a chart. That's it for today. Maybe tomorrow I will make more headway. @Neurotix I'm really happy to hear someone is using a Raspberry Pi to drive their audio equipment. I happen to have a Pi that's not in use. I originally bought it to drive a TV, but I'm disappointed in its graphics for a 4K TV. Driving from something as simple as a Pi makes things seem a lot more future-proof. (The joke about future-proofing is that the best way to future-proof your stuff is to buy it in the future.)
What are you using for a remote, and are you running headless? You mentioned gaming; is your system also running off a main computer as well as the Pi? Edit - What's up with @user formatting? How do I make them all the same?
  11. Aeron is, ok, I wouldn't say it's amazing. They have a tendency to break; in particular, the reclining lock pin is small and snaps off. Their lifetime warranty doesn't help me because I bought mine used.
  12. Lots of responses, thanks everyone! I'll try to answer questions. If I don't respond directly to you, please know that your opinion is still appreciated. I'm in a house, the closest neighboring house is about 25 feet away, with all other houses being much farther away. In the 3 years I've been here, I think the owners of the nearby home have only been around a couple months, it's not their primary residence. I live alone, though I do have a guest that stays in the room below for a couple weeks of the year, so the option to limit or turn down the bass during those times would be helpful. The room is decent sized, with the main focus of it being the computer. It's on a folding table, with another desk nearby that is unused. I like the table for the surface area. Most of my table is occupied by 2 32" monitors, I've considered adding a 3rd and will probably do so at some point in the future. The gap between the table and monitors is 6-7 inches, depending on if the monitor stand is taking up the other inch. The gap between the table and monitor bezel is 8 inches, but that would put the speakers fairly close to me. I'm not opposed to using stands for speakers and have a book case directly behind me that speakers could easily sit on. The reason I haven't used the 5.1 setup on my current speakers is that they don't have a stand and the wires are fairly short. For the amount of impact they make, the effort hasn't been worthwhile. Speaking of wires.... I embrace wire chaos. I believe that if a wire looks at itself it instantly becomes tangled. I'm not opposed to a better setup, but currently I haven't figured out a good way to hide wires and make them look respectable. I do have a concern about running wires across the floor (around the edge of the room might be ok), as people with dogs visit me and they don't seem to respect wires, creating a potential tripping/failure point. 
A receiver could easily sit on the desk next to me and be reachable/controllable without moving. The slight hiss would bother me. My environment is usually quiet. This is the first I'm learning about monitors, so yeah... Guess I have more reading to do. That was the speaker set that I was thinking about. I've got that terrible itch to scratch that makes me wonder if I shouldn't invest a bit more. I probably fall into the audiophile category (scary), since I've got about $800 invested into my headphone setup. These days I'm not using my headphones very much, only for specific situations. I wouldn't feel bad spending more if it's something I can get 20 years of use out of. I've used sound cards in the past. My old system had one after I messed up the onboard sound. There was a noticeable difference between driving headphones with a DAC/amp vs. using the sound card. I haven't compared my current motherboard's onboard vs. DAC. That would probably be the rational/economical way to do things. The question really comes down to: do I need an amplifier for the speakers? If so, I might as well throw in a DAC. I plan to keep the Objective 2 for headphones. Edit - Realtek onboard doesn't sound great in Linux; apparently the drivers are proprietary and they haven't released a Linux driver. I do notice a difference in sound quality between Windows and Linux, which has significantly impacted how much I'm using Linux. I'd like to be using Linux a lot more, but when the overall activity I'm doing is the same with either operating system, but one sounds a lot better... I'd prefer to be using Linux over Windows. I have the space to run a surround sound system. Part of the problem becomes positioning the speakers and running wires to them, which could probably be done; I'd need longer speaker wires to avoid dragging them across the center of the room. I like the chest rattling of a sub, I just don't want to overdo it to the point that it drowns out other musical qualities.
I'm looking for a good balance. So far I'm leaning towards a receiver with near field monitor speakers, but I should probably do more research on monitor speakers so I know what I'm getting into. This made me laugh.
  13. My Creative 5.1 speakers sound fine in use, but when nothing is playing they have a constant popping sound that is starting to bother me. They are 13 years old and giving me the itch to upgrade. It's not PC/wall outlet related; those have both changed over the years. I've grown accustomed to using a DAC/headphone amplifier, and can't help but think I should have a DAC/amp for computer speakers as well. My Objective 2 DAC/amp could work as the DAC, but I think I'd have to change the wiring to switch between headphones and speakers. I'd much rather have a switch/button, or even a software setting, than swap wires all the time. Which means I may be interested in a new DAC/amp, and some new speakers. My 5 speakers are all in front of me on my desk, so I don't know how relevant surround sound is. Might be nice in a theater setup, but not relevant for directional sound in gaming; I use headphones for that. I'm at risk of spending countless hours researching and way more money than I should on audio equipment. That said, I really like quality, clean sound. It doesn't have to be loud. I value my hearing. Max budget $1,200. I'd sleep better if it were around $400 or less. Part of me is cringing as I write that, knowing that if I bought some $150 speakers I'd likely be content. If I'm going to use it for 12+ years I don't mind spending a bit more. Audio technology hasn't changed a lot over the years. Amplifiers/speakers are always relevant; the only thing that seems to change are receivers (CDs, Bluetooth, ARC HDMI, WiFi, etc.), which is why I might be better off only buying a receiver if I switch where the sound system is. Any suggestions? How deep is this rabbit hole, and how do I climb out before I've spent way too much time and money?
  14. I'm annoyed by the blackout. I've been unwillingly forced into a protest when I don't care. I don't use the Reddit app or a 3rd-party app to view their forums. I just use Firefox with my array of addons. That means sometimes NSFW content doesn't come through correctly on mobile, but I probably didn't need to see it anyways. Reddit's primary value has always been their data. They collect everything, and it's a good amount of information to parse through when training AI and other such things. It makes sense that they want to sell this data to AI development companies. They can't do that when all the data is free. I saw this coming 10 years ago. I don't understand why everyone is in shock about it. The real pushback here is: Reddit's app sucks. Maybe they should make it better?
  15. Everyone knows you need 3 save games to not get stuck! J/k. Personally I just buy NAS drives and have stuff I really care about copied between 2 drives. I have yet to have a NAS drive fail, but I'm sure it could happen. If I were running a server or had really sensitive information, I'd run RAID 3 and do an offsite/offline data backup once a week. RAID 3 allows information to be recovered if a drive goes down, and offline backup helps in the event of ransomware. The problem with everyday backups is that files can become corrupted and the corrupted file is copied over the backup. I've only had 4 drives fail on me in my lifetime. 2 were over 20 years ago, before I used UPS systems, and 2 were Intel SSDs; I don't buy Intel storage anymore.
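The corrupted-file-overwrites-the-backup problem mentioned above can be softened by comparing checksums before overwriting and flagging anything that changed. A minimal Python sketch, not any particular NAS tool; the `backup` helper and its return convention are illustrative:

```python
import hashlib
import shutil
from pathlib import Path

def sha256(path: Path) -> str:
    """Hash a file in chunks so large files don't load into memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def backup(src: Path, dst: Path) -> list[Path]:
    """Mirror src into dst, skipping files whose backup copy is identical.
    Returns the files that overwrote an existing backup copy, so an
    unexpected change to a cold file can be reviewed (possible corruption
    or ransomware) instead of silently replacing the last good copy."""
    changed = []
    dst.mkdir(parents=True, exist_ok=True)
    for f in src.rglob("*"):
        if not f.is_file():
            continue
        target = dst / f.relative_to(src)
        if target.exists() and sha256(target) == sha256(f):
            continue  # backup already matches, nothing to do
        target.parent.mkdir(parents=True, exist_ok=True)
        if target.exists():
            changed.append(f)  # flag overwrites for review
        shutil.copy2(f, target)
    return changed
```

In practice you'd review the returned list (or log it) before rotating the offline copy, which is the step that actually protects against ransomware.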
  16. It's a feel good article for people who bought a 7900XTX. They had to use a water chiller to cool it.
  17. Source Sony should absolutely fight the merger at every step possible. 10 years isn't very long, and CoD is a major title on PlayStation.
  18. Nice to see them moving forward. The last time I played, they hadn't sorted the netcode and servers were sending all events to every user, not just the local events they should see. Any time someone spawned a Caterpillar ship, the server would crash within 5 minutes.
  19. uBlock Origin > Adblock Plus. Adblock Plus sold out years ago and they allow certain advertisements through. YouTube doesn't play any advertisements for me. I'm running it on Firefox, and I have configured uBlock's dashboard to include pretty much every list available. Whenever I see an annoying ad, I purge/refresh my lists. If that doesn't work I pull out the element picker and remove the advertisement that way. It blocks all YouTube ads on Windows, Linux, and Android. I use the Firefox browser on my phone instead of the YouTube app, though I have heard good things about NewPipe. Twitch.tv advertisements, on the other hand... My best attempts to block them have left me staring at a black screen for the duration of the advertisement. I consider that an improvement over seeing the same advertisement 12 times in an hour. Maybe one of these days I'll try to put Pi-hole on an OpenWRT router. +1 for NoScript. It's a real pain to get used to, especially on new websites, but it's easily one of the best ways to control what content you see. At a certain point I think it becomes easier for websites to identify you. I used to use Waterfox, but quit because it's a lot easier to be identified when so few people use it. I also ran into websites that refused to run because my browser was "incompatible" and needed to be upgraded... It's literally a clone of Firefox with the telemetry ripped out.
  20. Are there any chips that can utilize that ram speed?
  21. According to Intel, the ability to run at high temperatures is a feature. They are unlikely to change their outlook any time soon.
  22. Source Impressive leap considering how their 10nm development went. It always felt like they were sandbagging when they had the lead. That's a lot of tech combined to make it happen.
  23. For the same reason I don't password my BIOS. Physical access > all. I used to reset BIOS passwords for my school when people messed with them. They never bothered with BIOS passwords because they just cloned hard drives for all computer setups. Teachers generally only have hardware issues. BitLocker makes sense if you want to use a USB key. It also makes sense if you're worried about employees stealing hard drive data, though in reality that data shouldn't be stored on their hard drives, but on a server they don't have physical access to. Personally, I'm wary of calling anything encrypted when I've just handed the keys to someone else. BitLocker isn't going to help against online threats, because the machine is already running. TPM unlock is an invitation to steal the entire computer instead of just the drive. I've had candy vending machines stolen from me. The audacity of people and the ability for stuff to walk away is not lost on me. Probably the best use case for BitLocker is on a laptop with a USB key. I'd rather trade that USB key for a password. I know passwords are weaker against brute force, but the search space for brute-force attacks on passwords longer than 8 characters is exponentially large. The likelihood of me running into someone with the knowledge, skill, and resources (server) to break a good long password is fairly low. They still need physical access to the device. I'm just not that important. If I did have sensitive information that I really cared about, I wouldn't want it decrypted upon boot. I don't use LastPass either. I know security guys everywhere have been touting its merits, but ever since its inception, I've felt it's just a single point of weakness that presents a target. LastPass got hacked. It's a testament to their security team that it has taken this long for it to happen.
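The brute-force point above is easy to make concrete. Assuming passwords drawn from the 95 printable ASCII characters (the usual baseline for this kind of estimate), each extra character multiplies the search space by 95:

```python
# Search space for passwords built from the 95 printable ASCII characters.
def keyspace(length: int, alphabet: int = 95) -> int:
    return alphabet ** length

# Each additional character multiplies the attacker's work by 95.
print(f"{keyspace(8):.3e}")   # ~6.6e15 combinations
print(f"{keyspace(12):.3e}")  # ~5.4e23 combinations
```

Going from 8 to 12 characters grows the space by a factor of 95^4 (roughly 81 million), which is why a long passphrase holds up against offline attacks that make short ones trivial.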
  24. Edge just became the new porn browser.