
UltraMega

News Editor

Posts posted by UltraMega

1. LogoFAIL is a set of security vulnerabilities affecting image-parsing libraries used in system firmware by various vendors during the device boot process. It impacts devices by placing malicious code inside an image file that is parsed during boot, leading to persistence. Here are some key points about LogoFAIL:

    1. What Is LogoFAIL?

    2. How It Works:

      • LogoFAIL involves hardware vendor logos displayed on the device screen during the boot process while the UEFI is still running.
      • Image parsers in UEFIs from major vendors are riddled with roughly a dozen critical vulnerabilities that have gone unnoticed until now.
      • By replacing legitimate logo images with identical-looking ones specially crafted to exploit these bugs, LogoFAIL enables the execution of malicious code at the most sensitive stage of the boot process (known as DXE, short for Driver Execution Environment).
    3. Scope and Impact:

      • Hundreds of Windows and Linux computer models from virtually all hardware makers are vulnerable to LogoFAIL.
      • The attack can be remotely executed in post-exploit situations, using techniques that can’t be easily spotted by traditional endpoint security products.
      • Exploits run during the earliest stages of the boot process, bypassing defenses like Secure Boot and similar protections designed to prevent bootkit infections.
    4. Affected Parties:

    5. Protection and Mitigation:

    Remember, LogoFAIL is not a virus but rather a set of vulnerabilities that allow attackers to bypass security measures and install malicious software during the boot process. Stay vigilant and keep your devices secure! 🛡️🔒

     
     
    • Thanks 2
    • Respect 3
  2. Quote

     

    NVIDIA has officially launched its next-generation Blackwell platform, and with it comes a set of new and elaborate hardware for fueling the next stage in the AI craze, including some that the company says are powerful enough to enable “trillion-parameter-scale AI models.” These would include the GB200 NVL72, a new exascale computer that can deliver up to 1,440 PFLOPS and 3,240 TFLOPS of performance thanks in part to its 70+ Blackwell GPUs—new GPUs based on TSMC’s 4NP process that feature 208 billion transistors.

    NVIDIA Announces Blackwell GPUs with 208 Billion Transistors, including GB200 System Supporting 72 Blackwell GPUs and 13.5 TB of HBM3e Memory (msn.com)

     

    China has the world's top supercomputer today (among those that are publicly disclosed). Its top supercomputer is about 2 exaflops. Under Blackwell, Nvidia is going to be shipping supercomputer systems that can do 2 exaflops with just a couple of racks. 
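A quick unit sanity check on the figures above (1 exaFLOPS = 1,000 petaFLOPS), bearing in mind that vendor rack figures like these are typically quoted for low-precision AI math and aren't directly comparable to Top500 FP64 numbers:

```python
# Unit sanity check for the figures quoted above:
# 1 exaFLOPS = 1,000 petaFLOPS.
PFLOPS_PER_EFLOPS = 1_000

nvl72_pflops = 1_440  # per-rack figure quoted for the GB200 NVL72
nvl72_eflops = nvl72_pflops / PFLOPS_PER_EFLOPS

print(nvl72_eflops)      # 1.44 exaFLOPS per rack
print(2 * nvl72_eflops)  # ~2.88 exaFLOPS for two racks
```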

     

    Just food for thought. 

    • Thanks 2
  3. Quote

    "Among the myriad factors influencing the performance of language models, the concept of 'positive thinking' has emerged as a fascinating and surprisingly influential dimension," Battle and Gollapudi said in their paper.

    "Intuition tells us that, in the context of language model systems, like any other computer system, 'positive thinking' should not affect performance, but empirical experience has demonstrated otherwise," they said.

    This would suggest it's not only what you ask the AI model to do, but how you ask it to act while doing it that influences the quality of the output.

    In order to test this out, the authors fed three Large Language Models (LLMs), Mistral-7B, Llama2-13B, and Llama2-70B, 60 human-written prompts.

    These were designed to encourage the AIs, and ranged from "This will be fun!" and "Take a deep breath and think carefully," to "You are as smart as ChatGPT."

    The engineers asked the LLM to tweak these statements when attempting to solve the GSM8K, a dataset of grade-school-level math problems. The better the output, the more successful the prompt was deemed to be.

    Their study found that in almost every instance, automatic optimization surpassed hand-written attempts to nudge the AI with positive thinking, suggesting machine learning models are still better at writing prompts for themselves than humans are.

    Still, giving the models positive statements provided some surprising results. One of Llama2-70B's best-performing prompts, for instance, was: "System Message: 'Command, we need you to plot a course through this turbulence and locate the source of the anomaly. Use all available data and your expertise to guide us through this challenging situation.'"

    The prompt then asked the AI to include these words in its answer: "Captain's Log, Stardate [insert date here]: We have successfully plotted a course through the turbulence and are now approaching the source of the anomaly."

    The authors said this came as a surprise.

     

    AIs are more accurate at math if you ask them to respond as if they are a Star Trek character — and we're not sure why (msn.com)
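The study's method boils down to a scoring loop: try each candidate system prompt, measure accuracy on a set of math word problems, and keep the best. The sketch below is illustrative only; `ask_model` is a hypothetical stand-in for a real LLM call (e.g. to Mistral-7B), stubbed out here so the scoring logic itself is runnable.

```python
# Illustrative sketch of the evaluation loop described above: each candidate
# system prompt is scored by accuracy on a set of math word problems.

def ask_model(system_prompt: str, question: str) -> str:
    # Placeholder: a real implementation would query an LLM with the
    # system prompt prepended to the question.
    return "42"

def score_prompt(system_prompt: str, problems: list[tuple[str, str]]) -> float:
    """Fraction of problems the model answers correctly under this prompt."""
    correct = sum(
        ask_model(system_prompt, q).strip() == answer
        for q, answer in problems
    )
    return correct / len(problems)

prompts = ["This will be fun!", "Take a deep breath and think carefully."]
problems = [("What is 6 x 7?", "42")]
best = max(prompts, key=lambda p: score_prompt(p, problems))
```

An automatic optimizer, as in the paper, would generate and mutate candidate prompts in this loop instead of relying on a fixed hand-written list.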

     

     

    Trekkies and Trekkers get it. 

     

    • Respect 1
  4. On 27/02/2024 at 15:04, The Pook said:

     

    tupac-dance.gif

     

    On 27/02/2024 at 13:46, iamjanco said:

     

    bubbles-vs-ricky.gif

    Do you guys listen to any music that's not old enough to have kids that are legally able to drink? 

     

    Maybe you could post something good from the modern era? 

  5. On 17/02/2023 at 15:41, UltraMega said:

    Prices on used GPUs are actually going up. They have gone up maybe 10% since I made those charts. 

     

    Just a minor update to this:

     

    I've thought about making updated charts but the prices have actually not changed much at all. I've bought maybe 10 or so GeForce 3070 GPUs at about the same price over the span of about a year and a half.

     

    Same has been generally true for the used CPU market. 6 months ago I sold a Ryzen 3600 on ebay for $60. Last week I sold one for $70. 

     

     

    With the exception of cards that had just come out when I made the charts and have gotten official price drops since, things have barely changed. Seems like inflation and high prices on new GPUs are keeping the used market unusually stable. 

     

    On a side note, Newegg's prices for their used cards now more closely match what you find on ebay. 

    • Respect 1
  6. On 24/02/2024 at 11:18, ENTERPRISE said:

     

    Doubt it, Nvidia would likely see that as giving users the ability to OC and hang onto their older GPUs for longer. I think Nvidia would prefer you bought their latest and greatest 😕

    I would have also assumed no OC controls, but I was listening to the newest digital foundry podcast and they said Nvidia is actually planning on adding OC controls.

  7. 1 hour ago, GanjaSMK said:

    Yeah true. True true. I hope it doesn't go that route... 

     

    If Neuralink inspires (I'm sure others are trying, if not governments/very quiet private equity investments) real machine-abled and/or knowledgeable database sets entirely accessible within normal human brain functions... I'll take that over robot overlords. 


    I've always asserted that we as a species (humans) must embrace saving at least some biological function of ourselves, otherwise we cease to exist.  Meaning, let's not completely upload into the void. When that ability (if ever) becomes a reality, I hope we decide to maintain some of our natural biology. 

     

    IMO Neuralink is not going to go anywhere. It's an interesting science project, but what it can achieve is doable with non-implant devices. Over ten years ago I saw demos of head-mounted sensor devices that just read your brain waves and could do the same things Neuralink is trying to do now. 

    Elon Musk thinks we live in a simulation, so I don't think his perspective on this is in touch with reality. The concept of controlling a computer with a neural interface is a good idea, but not via an implant device. The brain can't understand binary and computers can't speak in brain waves, so some kind of telepathy-like ability from implants is just not realistic, at least not without another giant leap in this tech that is nowhere near happening. Any neural device, implant or not, is a one-way signal: the brain puts out signals and something picks them up. There is no tech in existence, theoretical or otherwise, that can send information to the brain in this context. 

     

    The only useful application for a neural implant device is for disabled people. If you have a prosthetic arm, an implant device might be somewhat better than other input methods for controlling the arm, but that's never going to be a mass market kind of thing. 

  8. 2 hours ago, LabRat said:

    The description given, makes this sound like a 'no brain-er' technology. 
    (One of those 'things' that was both long-wanted and inevitable, but never materialized due to technical limitations and other limiting factors)


    (Feel free to correct me if I'm wrong)
    One of the major reasons why even today's X86-based Consoles perform 'better for the given HW' is the in-built scheduling/task assignment. -Made 'easy-er' by the fixed configurations of each console's generation.

     

    I'm not a hardware engineer, but I think this is somewhat more dynamic than a typical scheduler. 

    Spoiler

    SHMT uses what the researchers call a "smart quality-aware work-stealing (QAWS) scheduler" to manage the heterogeneous workload dynamically between components. This part of the process aims to balance performance and precision by assigning tasks requiring high accuracy to the CPU rather than the more error-prone AI accelerator, among other things. Additionally, the scheduler can seamlessly reassign jobs to the other processors in real time if one component falls behind.

     

  9. Quote

    Researchers at the University of California Riverside developed a technique called Simultaneous and Heterogeneous Multithreading (SHMT), which builds on contemporary simultaneous multithreading. Simultaneous multithreading splits a CPU core into numerous threads, but SHMT goes further by incorporating the graphics and AI processors.

     

    The key benefit of SHMT is that these components can simultaneously crunch away on entirely different workloads, optimized to each one's strength. The method differs from traditional computing, where the CPU, GPU, and AI accelerator work independently. This separation requires data transfer between the components, which can lead to bottlenecks.

     

    Meanwhile, SHMT uses what the researchers call a "smart quality-aware work-stealing (QAWS) scheduler" to manage the heterogeneous workload dynamically between components. This part of the process aims to balance performance and precision by assigning tasks requiring high accuracy to the CPU rather than the more error-prone AI accelerator, among other things. Additionally, the scheduler can seamlessly reassign jobs to the other processors in real time if one component falls behind.

    https://www.techspot.com/news/102016-new-multi-threading-technique-promises-double-processing-speeds.html
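The quoted "quality-aware work-stealing" policy can be sketched in a few lines: precision-critical tasks are pinned to the CPU, everything else goes to the faster but error-prone accelerator, and an idle unit steals work from the other. This is a toy illustration only; the class name, fields, and policy details are invented, and the real SHMT scheduler is far more involved.

```python
from collections import deque

class QAWSScheduler:
    """Toy sketch of a quality-aware work-stealing policy."""

    def __init__(self):
        self.cpu = deque()          # accurate but slower
        self.accelerator = deque()  # fast but error-prone

    def submit(self, task_id: str, needs_high_accuracy: bool):
        # Quality-aware placement: precision-critical work goes to the CPU.
        (self.cpu if needs_high_accuracy else self.accelerator).append(task_id)

    def steal(self):
        # Work stealing: an idle CPU takes pending accelerator work.
        # (The reverse never happens, so accuracy-critical tasks stay put.)
        if not self.cpu and self.accelerator:
            self.cpu.append(self.accelerator.popleft())
```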

     

     

    Finally we'll be able to run crysis. 

    • Great Idea 1
  10. Quote

    Amazon founder Jeff Bezos, Nvidia and other big technology names are investing in startup Figure AI that develops human-like robots, Bloomberg News reported on Friday, citing people with knowledge of the situation.

    Figure AI, also backed by ChatGPT-maker OpenAI and Microsoft is raising about $675 million in a funding round that carries a pre-money valuation of about $2 billion, according to the report.

    Bezos had committed $100 million through his firm Explore Investments LLC and Microsoft is investing $95 million, while Nvidia and an Amazon-affiliated fund are each providing $50 million, the report added.

    Investments in artificial intelligence startups have surged since the launch of OpenAI's viral chatbot ChatGPT in November 2022, as investors sense an opportunity, betting that these startups might outpace bigger rivals.

    https://www.reuters.com/technology/bezos-nvidia-join-openai-funding-humanoid-robot-startup-bloomberg-reports-2024-02-23/

     

    terminator GIF

     

     

  11. 4 minutes ago, pioneerisloud said:

    I edited them in.  Sorry, I should've typed it all up first lol.  I'm done editing now.

    OMG RUSH. Rush 2049 was the best game. The stunt mode was so good. I totally forgot about that, it definitely deserves an OP spot in the retro gaming hierarchy. The stunt mode was so much fun. I played it with my GF via emulator at one point just to make sure she knew how good it was, and we both had a blast. 

    • Agreed 1
  12. 5 minutes ago, pioneerisloud said:


    There's so many to choose from, just 5?  Here's what I'd say mine would be, and this is in absolutely no particular order.

    1. Super Mario All-Stars (SNES)
    2. Rush 2: Extreme Racing (N64)
    3. Quake (Win9x)
    4. SimCity 3000 (Win9x)
    5. Driver (Win9x)

    Honorable Mention:  NFS Underground 2 (WinXP)

    Any fond memories you could share on those games? 

    • Thanks 1
  13. So anyway.. 

     

    My top 5 retro games/game series other than FF games:

     

    1. Armored Core 2 on PS2

    2. Unreal Tournament 2003, 2004, UT3

    3. Tribes: Aerial Assault on PS2

    4. Halo 1

    5. Battlefield 1942

     

    Armored Core 2's intro is a straight-up banger:

     

     

    Ten to fifteen years ago I had a 720p projector. Unreal Tournament 3 was a PS3 game. On PC the code to do split-screen was included but not usable from any menu. Me and my buds played UT3 4-player split screen on that 720p projector using console commands and it was glorious. Some of the most fun I've had gaming IRL. 

     

    Tribes 2 was a great game on PC. Tribes: Aerial Assault was a version of Tribes 2 that was optimized for PS2. Me and a buddy of mine, who later got hooked on drugs and committed suicide, would play Tribes on PS2 online via the PS2 network adapter. We would own servers. It was a ton of fun. 

     

    Halo 1 (Combat Evolved), do I really need to explain? 

    My parents had just got divorced and my mom bought us a good enough desktop PC for me to game on while my dad bought me an Xbox. I guess they both wanted to make sure I had something to do. I booted up Halo on Xbox while my dad watched. He has never had any interest in video games, but Halo 1 was so impressive at the time that he was actually interested. My dad was absolute dog * at playing games tho, so coop was not an option 😅

     

    BF1942 was my jam. It kicked off a lifelong addiction to BF games. My grandma bought it for me for Christmas one year. I made sure it was the only thing I asked her for so she wouldn't get me anything else. She complained about being apprehensive buying me a war game, but she did me a huge favor. It kicked off my PC gaming enthusiasm and I probably wouldn't be here on EHW without that. 

     

     

    What are your top 5 retro games? 

     

     

     

    • Respect 3
  14. 26 minutes ago, neurotix said:

    You are more than welcome to add and discuss retro pcs here, and I've tried to be accommodating. However, if you want your own club, feel free to make one. I wish you both the best, and again, I really feel like I've been nice and open toward retro PCs, offering to add lists and such. I just can't add much to the discussion myself.

     

    Unfortunately the user base here is small. Bear in mind, this is a continuation of my club on OCN which had 40+ members discussing retro gaming consoles. Clearly not happening here.

    Don't let these old farts rile you up over excessively defining "retro". Some of them were probably already halfway in the grave when Pong came out 😜

     

    Not everyone needs to have the same definition in their mind, we can all discuss retro gaming here, pc, console, or otherwise. 

     

    Would chess be a retro game? 

    • Great Idea 2
  15. 13 minutes ago, ENTERPRISE said:

     

    Yes I see what you mean, they are not asking you to subscribe or pay extra fees to get the firmware update, but they are sort of putting an artificial wall in place by making the updates only available to a set tier of product. Granted, monitor firmware updates are seldom needed...but they do exist. This is more of a principle thing than about the likelihood of needing a firmware update.

    Do we even know that they are doing it artificially? I would think it's likely the higher end models have higher end parts that would make certain updates irrelevant to the lower end models. 

     

    If they are gatekeeping updates and/or restricting features that would otherwise be available then yea that would be lame, but it seems more like the author of this article is trying to create dread over nothing. 

  16. This article seems a bit misleading. Where is the pay wall? 

     

    Sounds like there are just some high end models that will get firmware updates for longer than lower end models, which seems pretty normal to me. A pay wall would mean customers have to pay for updates, which is not the case here. 

     

    I have never updated the firmware on a monitor. Didn't even know that was a thing. I've done firmware upgrades on my TV, but that has apps built in. Do people need firmware updates for monitors? 

  17. Quote

    Nvidia hit $2 trillion in market value for the first time on Friday, riding on an insatiable demand for its chips that made the Silicon Valley firm the pioneer of the generative artificial intelligence boom.
     

    The milestone followed another bumper revenue forecast from the chip designer that drove up its market value by $277 billion on Thursday - Wall Street's largest one-day gain on record.
     

    Its rapid ascent in the past year has led analysts to draw parallels to the picks-and-shovels providers during the gold rush of the 1800s, as Nvidia's chips are used by almost all generative AI players, from ChatGPT-maker OpenAI to Google.
     

    That has helped the company vault from $1 trillion to $2 trillion market value in around eight months - the fastest among U.S. companies and in less than half the time it took tech giants Apple and Microsoft.

    Nvidia hits $2 trillion valuation as AI frenzy grips Wall Street (msn.com)

     

    Crazy that Nvidia hit 1 trillion less than a year ago, and has doubled since. 

    • Shocked 1
  18. Nvidia is selling massive GPUs and NPUs to data centers. They have products that sell for a quarter million each in the data center market. 

     

    They are also the leader in a market that is growing rapidly. 

     

    Nvidia H100: This is the chip behind AI's supersonic stock rally (theedgesingapore.com)

    Spoiler

    Computer components are not usually expected to transform entire businesses and industries, but a graphics processing unit Nvidia Corp. released in 2023 has done just that. The H100 data centre chip has added more than US$1 trillion to Nvidia’s value and turned the company into an AI kingmaker overnight. It’s shown investors that the buzz around generative artificial intelligence is translating into real revenue, at least for Nvidia and its most essential suppliers. Demand for the H100 is so great that some customers are having to wait as long as six months to receive it.

    1. What is Nvidia’s H100 chip?
    The H100, whose name is a nod to computer science pioneer Grace Hopper, is a graphics processor. It’s a beefier version of a type of chip that normally lives in PCs and helps gamers get the most realistic visual experience. But it’s been optimized to process vast volumes of data and computation at high speeds, making it a perfect fit for the power-intensive task of training AI models. Nvidia, founded in 1993, pioneered this market with investments dating back almost two decades, when it bet that the ability to do work in parallel would one day make its chips valuable in applications outside of gaming.

    2. Why is the H100 so special?
    Generative AI platforms learn to complete tasks such as translating text, summarizing reports and synthesizing images by training on huge tomes of preexisting material. The more they see, the better they become at things like recognizing human speech or writing job cover letters. They develop through trial and error, making billions of attempts to achieve proficiency and sucking up huge amounts of computing power in the process. Nvidia says the H100 is four times faster than the chip’s predecessor, the A100, at training these so-called large language models, or LLMs, and is 30 times faster at replying to user prompts. For companies racing to train LLMs to perform new tasks, that performance edge can be critical.

    3. How did Nvidia become a leader in AI?
    The Santa Clara, California company is the world leader in graphics chips, the bits of a computer that generate the images you see on the screen. The most powerful of those are built with hundreds of processing cores that perform multiple simultaneous threads of computation, modelling complex physics like shadows and reflections. Nvidia’s engineers realized in the early 2000s that they could retool graphics accelerators for other applications, by dividing tasks up into smaller lumps and then working on them at the same time. Just over a decade ago, AI researchers discovered that their work could finally be made practical by using this type of chip.

    4. Does Nvidia have any real competitors?
    Nvidia controls about 80% of the market for accelerators in the AI data centres operated by Amazon.com Inc’s AWS, Alphabet Inc’s Google Cloud and Microsoft Corp’s Azure. Those companies’ in-house efforts to build their own chips, and rival products from chipmakers such as Advanced Micro Devices Inc. and Intel Corp., haven’t made much of an impression on the AI accelerator market so far.

     

