Kaz

Members · Posts: 111 · Days Won: 3 · Feedback: 0%

Everything posted by Kaz

  1. Has Intel fixed the drivers for the A380? My only experience was horrible; it couldn't play YouTube videos without freezing up.
  2. That software engineer worked really hard. I almost feel bad that it came out the way it did. What he did was along the lines of a firmware update: using the hardware in ways better suited to the software's needs. Did they engineer their hardware to do this from the beginning? Maybe...
  3. Gamers Nexus has mentioned this is a problem. In the drive for the highest/best performance, motherboard partners go ham on processor settings. That's why they stress that they run everything to spec. It's funny that it takes four companies to point it out.
  4. Tom's Hardware: Nvidia cards + Unreal Engine compiling shaders on Intel 13th/14th gen processors with uncapped power limits is causing trouble. That must have been really fun to troubleshoot. I've heard of companies pointing fingers before, but man, those motherboard partners really screwed up! (J/k). If you're an Nvidia gamer with a new Intel processor, you might want to enable Intel's default power limits, or at least cap them.
  5. After 2 failed attempts to learn programming, I have finally found a learning platform that I like and am taking the plunge. I wanted to take a minute and hype "Py4e.com". Charles R. Severance is a professor who put this program together because he believes everyone should learn to program, not just career programmers. The ability to tell a computer to do exactly what you want it to do is such a useful skill. It's a completely free course, and I am incredibly impressed with it. The course includes lectures, reading material, exercises (writing actual programs), and quizzes. He also works through the assignments, which I have found interesting, since he has suggestions I didn't think of when solving them. He even has a sense of humor! I have paid money for college courses that are not as good as this. Py4E sure trumps my trade-school attempt, where they handed me a textbook and told me to read for 4 hours every day. I didn't write one program or learn any specific language, just the concepts of programming. Yeah, I didn't finish that course... I also tried learning Python on Alison.com 3 years ago and was bored to tears by the end of the first chapter. I did 2 chapters, told myself I would do more the next day, and never returned. Professor Severance's course is the first one that actually feels fun, and the only course that has me writing programs. If only I'd found this 10 years ago.
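     For flavor, here's a minimal sketch of the sort of early exercise the course has you write. It's my own toy example rather than one lifted from the course, though mbox-short.txt is the sample file Py4E's exercises typically work against:

         # Count how often each word appears in a text file,
         # then report the most common one.
         counts = {}
         with open("mbox-short.txt") as handle:
             for line in handle:
                 for word in line.split():
                     counts[word] = counts.get(word, 0) + 1

         top = max(counts, key=counts.get)
         print(top, counts[top])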
  6. What we know about the xz Utils backdoor that almost infected the world ARSTECHNICA.COM Malicious updates made to a ubiquitous tool were a few weeks away from going mainstream. It made it into some unstable builds (like Kali's experimental branch), so if you're on an unstable/rolling Linux build, make sure it has been updated. This is a few days old, but since it hasn't been mentioned, I thought I'd bring it up.
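     If you want a quick sanity check, the compromised releases were xz 5.6.0 and 5.6.1. A minimal sketch, assuming xz is on your PATH:

         # Compare the installed xz version against the two known
         # backdoored releases (5.6.0 and 5.6.1).
         import re
         import subprocess

         COMPROMISED = {"5.6.0", "5.6.1"}

         out = subprocess.run(["xz", "--version"],
                              capture_output=True, text=True).stdout
         match = re.search(r"\d+\.\d+\.\d+", out)
         version = match.group(0) if match else "unknown"
         if version in COMPROMISED:
             print(f"xz {version} is a compromised release - update now!")
         else:
             print(f"xz version looks OK: {version}")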
  7. In my Security+ class they talked about how a company set up a large (many servers) honeypot that was infected incredibly fast; I think within 12 seconds the entire system had been compromised. Within 24 hours the servers were compromised hundreds of times. It was used as a warning about connecting unpatched systems to the internet. A well-designed web crawler doesn't need AI to infect things. I would love to know how they successfully defend against a DDoS attack while still allowing standard users to connect. How do they tell attacker traffic from regular traffic? It's very likely Google is already using AI to help do that. What is going to be really scary is when AI blows a hole in all these hardware backdoors. Tech debt is a real problem, and botnets are only getting larger. Hardware backdoors are an incredibly short-sighted idea. This is an old video, but still an awesome watch about how they have found backdoors that were intentionally baked into the CPU. The guy works for Intel now, go figure.
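     On telling attackers from regular users: the crudest building block is per-client rate limiting, and real mitigation layers reputation, fingerprinting, and anomaly models on top of it. A toy sketch of a per-IP token bucket (names and limits made up for illustration):

         # Each client IP gets a bucket of tokens; requests spend tokens,
         # and tokens refill at a fixed rate. Clients hammering the server
         # drain their bucket and get refused, while normal users never
         # notice the limit.
         import time
         from collections import defaultdict

         RATE = 5.0    # tokens refilled per second, per client
         BURST = 10.0  # bucket capacity (maximum allowed burst)

         buckets = defaultdict(lambda: {"tokens": BURST, "last": time.monotonic()})

         def allow(ip: str) -> bool:
             """Return True if this client may make another request now."""
             bucket = buckets[ip]
             now = time.monotonic()
             elapsed = now - bucket["last"]
             bucket["tokens"] = min(BURST, bucket["tokens"] + elapsed * RATE)
             bucket["last"] = now
             if bucket["tokens"] >= 1.0:
                 bucket["tokens"] -= 1.0
                 return True
             return False  # over the limit: drop or challenge the request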
  8. Google kept this open while they developed market share. Now they will use that market share to stifle competition. They no longer need third-party cookies to collect their data. The Manifest V3 change is about a lot more than just third-party cookies.
  9. This is good news; the 40 series was overpriced because they were still trying to sell leftover 30-series stock.
  10. The article title makes it seem like they are selling their firmware updates. Some of their monitors just won't support updates; that may have been a sales decision, since they have the same panel. Overclockers like us love getting stuff we can flash different firmware onto to receive more features. This isn't pitchfork-worthy news. Up until last year, none of the monitors I owned could update their firmware. I remember when video games were sold as complete games rather than having day-one updates. Those were the days.
  11. This might be the first real feature of Windows 11 over 10! I generally don't use upscaling or frame generation if I don't need it, but it may be a way to keep older games relevant on new hardware. I'm curious how the AI is actually learning this stuff. If it doesn't know right from wrong, good from bad, how does it know what improvements are desired and what aren't? It must be an interesting algorithm if they plan to implement it on thousands of games at once. There's some talk about different processing units in the article, but it's something I didn't understand. Can anyone explain it better? The only thing I know is that Nvidia GPUs are good with AI; AMD and Intel, not so much. I've always wondered what is physically different with AI compute as opposed to rasterization. The fundamental concept still leads to a 1 or 0, so why is it hardware that's different and not software? Maybe we'll start to see software/firmware bridge the gap that hardware was doing.
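     To make concrete what upscaling even means at the pixel level, here's a toy nearest-neighbour 2x upscaler. An AI upscaler replaces this fixed copy rule with a learned function, which boils down to huge amounts of dense multiply-accumulate math; my (possibly naive) understanding is that NPUs and tensor cores are built to chew through exactly that kind of math:

         # Nearest-neighbour 2x upscale of a grayscale image
         # stored as a 2D list: duplicate every column, then every row.
         def upscale_2x(image):
             out = []
             for row in image:
                 wide = [pixel for pixel in row for _ in range(2)]
                 out.append(wide)
                 out.append(list(wide))
             return out

         small = [[0, 255],
                  [255, 0]]
         for row in upscale_2x(small):
             print(row)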
  12. This is interesting. Intel may not have licensing rights to Nvidia technology, but if they are building it for them, they are in a prime position to leverage Nvidia for their own gains. Sort of like China stealing technology: "We'll make it, just give us your designs." Intel can't copy the product verbatim, but they can still benefit from the R&D, which is no small cost. It's also interesting because it has similarities to Nvidia and AMD board partners. Seems to me that EVGA dropped out because they didn't want to compete with Nvidia: the ability to set the price they sell their components for, set the MSRP, and compete in the same market was too much to bear. It may indeed be time to buy Intel stock. It will be interesting to see just how many keys to their kingdom Nvidia is handing to Intel. It's still a bit early for Intel graphics cards, though. I suspect it will take Intel 3 generations to smooth out the bumps, that is, if they stick with it. Rumor has it their $3.5 billion investment may be axed before it can pay off.
  13. Yeah, it sounds more like someone bought the 4090, then returned their borked card instead of the 4090. The old swaparoo. Seems more like a one-off scenario than something to get worked up over.
  14. $700 for a 7900 XT is not bad. I had a 7900 XT; it had an intermittent DisplayPort, so I sent it back and picked up the XTX instead. I can honestly say the 7900 XT was enough card for me, and I overspent upgrading to the XTX. They are both really nice graphics cards.
  15. I was just investigating whether one of my add-ons was causing the news/latest problem when I stumbled across this post. Thanks, that fixed it. That's a really obscure setting for such a central feature of the site.
  16. This will be a fun topic to follow. Material science has some real potential; it is a game changer. My friend who works in AI development (they are working with bees and plant germination) says the problem with AI is that no one understands how it generated the answers. It's a black box that people don't understand. For stuff where the answer can be checked, it's amazing. ChatGPT can generate an answer, but it's also known to lie and make up sources. Its real goal is to understand language, but can a statistician ever know if it actually understands? It can tell you the probability or likelihood that a certain word will be next (a toy sketch of that idea is below), but it doesn't actually know what it's saying. Hey, if I listen to enough smart people talk, I can always just repeat the answer... The problem is, unless they release their dataset (which they won't, because they web-crawled it without permission), we don't know how smart those people are, when it regurgitates, or when it just plain makes up answers. https://nytco-assets.nytimes.com/2023/12/NYT_Complaint_Dec2023.pdf I was trying to copy article 99 to add an excerpt, but it doesn't like copying. It's a blatant example of OpenAI ripping off an NYTimes article. It's bad, like I wouldn't have even tried to pull that off in middle school, because any teacher would have called it plagiarism. Statisticians are scary. In business statistics we learned how data can be manipulated to show just about anything. Most of AI development is just statistics. Where those statistics come from matters. What people are trying to show matters. (I wouldn't trust AMD, Intel, or Nvidia marketing statistics to tell me how good a product is; there's too much variance between that and independent reviews.) OpenAI reached a little hard by not getting authorization to use company datasets. They did it the good old-fashioned piracy way: just take it.
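     The "next-word probability" point, as a toy bigram model built from a tiny made-up corpus. Real language models are vastly more sophisticated, but the output is still a probability distribution over what comes next:

         # Count which word follows which, then turn the
         # counts into next-word probabilities.
         from collections import Counter, defaultdict

         corpus = "the cat sat on the mat the cat ate the fish".split()

         following = defaultdict(Counter)
         for word, nxt in zip(corpus, corpus[1:]):
             following[word][nxt] += 1

         def next_word_probs(word):
             counts = following[word]
             total = sum(counts.values())
             return {w: c / total for w, c in counts.items()}

         print(next_word_probs("the"))  # {'cat': 0.5, 'mat': 0.25, 'fish': 0.25}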
  17. My approach would be to use Wireshark to figure out what ports are needed, and nmap to see if those ports are actually open. Chances are they aren't, and your ISP router is still in play. Setting the pfSense box as the DMZ host on your ISP router is a reasonable solution that should eliminate the ISP router as a problem. In my previous experiences with this, the ISP router was still in play, and the ports had to be forwarded on it as well.
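     A bare-bones reachability check you can run from a host outside your network (a VPS, or a phone on mobile data); nmap gives far more detail, but this is the minimal version. The address and port below are placeholders:

         # Can we open a TCP connection to the forwarded port from outside?
         import socket

         HOST = "203.0.113.7"  # placeholder: your WAN IP
         PORT = 25565          # placeholder: the port you forwarded

         try:
             with socket.create_connection((HOST, PORT), timeout=5):
                 print(f"{HOST}:{PORT} is reachable")
         except OSError as exc:
             print(f"{HOST}:{PORT} is NOT reachable: {exc}")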
  18. The monitor I wanted, but probably shouldn't buy... (My wallet is thanking me.) I'm also hesitant about OLED, though I think QD-OLED is supposed to be better with burn-in.
  19. I love MX Blues. My first experience typing was with typewriters, and the Blues' audible click always reminds me of a typewriter. I tried MX Browns; they are alright, but without O-rings the bottom-out sound makes them arguably as loud as Blues, and the sound isn't rewarding. When my 2nd keyboard died, I went back to Blues. I'd like to see what Holy Pandas and Topre switches are about, but there's little point in trying them when I already love Blues.
  20. Gigabyte's 32" 3840x2160 is a good monitor, but I returned mine for the 27" version. At my viewing distance the difference in image quality is noticeable. I can't imagine wanting to go bigger at anything less in terms of resolution; I deeply regretted buying 32" 1440p monitors. I'm holding out on OLED: burn-in is a concern for me, and I recently found out LG's OLED doesn't play well with text. Glossy glass monitors look a lot better than plastic matte finishes. I wish I could still buy glossy IPS screens; I'd pay extra for that.
  21. Update on the situation. I've now had Gigabyte M32U monitors for about a week. I was using text scaling, then I tried Windows scaling (apparently they are different). While there were some gains/benefits, I ultimately decided the monitors weren't working out for me. I am now using 2x Gigabyte M32U (32" 4K 144 Hz) monitors, and I'm debating returning them and getting Gigabyte M27U (27" 4K 150/160 Hz overclocked) monitors. The smaller screen size might be preferable to me for a sharper image and better text clarity. I'll probably fire up ClearType and see how that goes for the next week or so before making a decision. From what I've read, the LG panels don't play well with ClearType; it was designed for RGB/BGR subpixel layouts, not RWBG. The Gigabyte panels should be fine, but the LG panels are unsupported. With the prevalence of LG OLED panels hitting the market, it will be interesting to see if Windows adds support for RWBG.

     Interesting side note: my 27" Yamakasi 2560x1440 monitor felt like it had a sharper/crisper image than the Gigabyte 32" 4K monitor. The 4K monitors have higher PPI, but the Yamakasi had a glass finish, as opposed to the matte finish on new monitors. I miss the days when monitors had glass.

     The 32" monitors I was using are LG 32GQ850-B. I question how much of the issue is a text issue vs. a screen/panel lighting issue. Before finally purchasing new monitors, I noticed that watching video for 10 minutes would bother my eyes. The final straw was looking at my blue Windows login screen and realizing my eyes were bothered, within 30 seconds of firing up my computer. Since the monitor swap I do still have some eye trouble, but it is significantly reduced; I feel like I can recover in this situation, vs. before, when things were only getting worse. For now, I'll be avoiding LG panels due to the unsupported RWBG layout, which includes companies like Corsair who use LG panels. I'm extremely disappointed that all the new OLED panels fall into this category, especially considering they are the only panels that still use glass.
  22. Simulators are where VR really shines. Condor 2 (a glider simulator) is pretty damn close to flying an actual glider. It translates to real flight extremely well, to the point that I would say any prospective glider pilot should spend time on the simulator. My instructor likes to say, "Airplanes make poor classrooms, and classrooms make poor airplanes." Simulators are extremely nice because you can pause and talk through everything going on while still getting the airplane experience. The only things missing are the g-force effect and the resistance of holding the air brakes open. (Also, the first real flight comes with the hard realization that the plane is always falling and has no engine; it's unnerving.) My local mall had one, but their pricing was more in line with $50 for 30 minutes. I'm a bit too frugal to justify that, but my personal hardware is such that I could easily add VR. I've been waiting for Valve to come out with a new headset; the Index has held its value on the used market, and paying full price for 4-year-old computer equipment doesn't entice me. The Oculus Rift S is fairly cheap on the used market, but I don't want to deal with Facebook/Meta. When I last looked into VR, there were quite a few headsets on the market, but the actual implementation/support they get is questionable.
  23. After spending the evening reading reviews, the Gigabyte M32U is the leading choice.
  24. I've got 2 LG 32" 2560x1440 240 Hz monitors. I wish I could keep them, but I am experiencing eye strain. Yesterday, 10 minutes of desktop use resulted in blurry vision for hours and eye irritation for 6-8 hours. These LG 32" screens are always blurry; text in particular is very bad. I've tried increasing sharpness, which is an improvement, but not great. I think my eye strain comes from forcing my eyes to look for detail that isn't there. I'm not sure if it's a low-PPI issue or LG's decision to use an RWBG subpixel layout instead of standard RGB. I almost switched to LG's 27" OLED, but it turns out that monitor has really bad text clarity as well, something about RWBG not being very good for text. I do not know if there is a way to force standard RGB subpixel rendering with AMD graphics cards; if there is, could anyone explain how it's done?

     After months of struggling, I want to use my computer badly enough that I'm ready to dump these monitors and buy something else. I'm looking for 27" or 32" monitors in the 1440p to 4K range. If I get another 1440p monitor, it's going to be 27". I don't know if there is a visual-clarity difference between 27" 1440p and 4K; I thought my 27" 1440p IPS Yamakasi picture was very sharp, and it had a gloss screen. I'm at a loss. One thing I know: text needs to be clear. I know I want 2 monitors. I know I like high refresh rates, but that they're wasted on a 2nd monitor (unless I get 3!). I liked having 2 of the same monitor, but I can see where it's not the best idea for gamers. I'm leaning towards 2 4K monitors, with the thought that if I ever outgrow them and want a faster refresh rate, I could buy just 1 monitor and run all 3. I also think the 4K monitors will still be relevant in 5 years, whereas the higher refresh won't matter as much.

     I know LG makes a lot of panels that get rebranded; are there other companies I need to avoid because they use RWBG and text looks terrible? I've never used a curved monitor. I have a friend who swears by them. I probably prefer flat screens because it's easier to show other people what I'm working on, but I could be persuaded to give them a try. My primary use is office/gaming, and my graphics card is a 7900 XTX. Thanks for any input. I've put off buying replacements for a long time because I don't know what to buy.
  25. FSR is the reason I don't care about DLSS. With my GTX 970, Cyberpunk was unplayable at default/low graphics settings, but with FSR enabled it was an enjoyable game. That graphics card couldn't use DLSS. Nvidia likes to lock their DLSS improvements to newer hardware; that's why the 4060 is SO MUCH BETTERZ than the 3060 in charts when, in actuality, the hardware has very few improvements and is a downgrade with PCIe 3.0. I don't need upscaling/smoothing on new hardware, I need it when the hardware is a little older... When Nvidia users are locked out of newer DLSS versions, FSR is still available. It's only a matter of time until the DLSS for current-gen hardware is outdated and FSR is the superior option. It's kind of like G-Sync/FreeSync. G-Sync hit the market and required additional hardware in the monitor, which raised monitor prices by about $50. When FreeSync hit the market, it didn't require additional hardware and worked with most monitors on the market. It wasn't long until monitors all went FreeSync / G-Sync Compatible. G-Sync Compatible was not the same as G-Sync, since it didn't have the hardware module, but good luck finding an actual G-Sync monitor now. Hardware companies learned that $50 more for a monitor means fewer buyers. I've noticed a trend: proprietary tech enters the market, people claim it's better, then open tech catches up and the proprietary tech doesn't matter. Nvidia has been the company pushing the envelope, which means they keep their stuff proprietary. When AMD comes along, they open the market. It will be interesting to see if Intel's competition means AMD keeps more stuff proprietary.