
UltraMega

News Editor

Posts posted by UltraMega

  1. AI solves nuclear fusion puzzle for near-limitless clean energy (msn.com)

     

    Quote

    The latest research was published in the scientific journal Nature on Wednesday in a paper titled ‘Avoiding fusion plasma tearing instability with deep reinforcement learning’.

    “Being able to predict instabilities ahead of time can make it easier to run these reactions than current approaches, which are more passive,” said SanKyeun Kim, who co-authored the study.

    “We no longer have to wait for the instabilities to occur and then take quick corrective action before the plasma becomes disrupted.”
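The advantage described in the quote — acting before an instability crosses the disruption threshold rather than after — can be sketched with a toy control loop. This is purely illustrative and not the paper's method; the "tearing" metric, threshold, and dynamics below are all made up.

```python
# Toy illustration, not the paper's method: a "reactive" controller resets a
# fictional tearing metric only after it crosses a disruption threshold,
# while a "predictive" one acts one step ahead. All numbers are invented.

def simulate(predict_ahead: bool, steps: int = 100) -> int:
    """Return how many times the plasma 'disrupts' during the run."""
    THRESHOLD = 10
    tearing = 0
    disruptions = 0
    for _ in range(steps):
        tearing += 1                                  # instability grows each step
        if predict_ahead and tearing + 1 >= THRESHOLD:
            tearing = 0                               # corrective action in time
        elif not predict_ahead and tearing >= THRESHOLD:
            disruptions += 1                          # too late: already disrupted
            tearing = 0
    return disruptions

print(simulate(predict_ahead=False))  # reactive control suffers disruptions
print(simulate(predict_ahead=True))   # predictive control avoids them all
```

In this toy run the reactive controller disrupts repeatedly while the predictive one never lets the metric cross the threshold, which is the gist of the quoted claim.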

     

  2. Quote

Nvidia is finally making long-awaited, much-needed changes to their graphics card software. Available now is a new beta application simply called the Nvidia App, which greatly improves the software experience for Nvidia GPU owners. While it doesn't contain all of its eventual features just yet, the goal is for the Nvidia App to replace both the outdated Nvidia Control Panel and the annoying GeForce Experience with a single unified application.

     

One of the biggest new features of the Nvidia App is that you do not need to log in to use it. That's right, no more logins just to access driver updates and other features that were previously restricted in GeForce Experience. Finally!

    https://www.techspot.com/news/101985-nvidia-app-launches-beta-nvidia-new-gpu-control.html

     

     

    Took them long enough. 

     

    • Agreed 1
  3. Update on this:

     

I have decided not to pursue trying to sell PCs on Amazon for now. While it is definitely something I could do, it turned out to be more work than I expected. For example, because the PC I listed has Wi-Fi, I had to make sure it was registered/approved with the FCC, and it all added up to a lot of work. So while it's doable, in the sense that Amazon certainly allows someone like me to sell PCs on their platform, it doesn't really make sense to do it one PC at a time. If I had 10+ identical PCs to sell on there, it would make sense then.

As for Newegg, there is no way for an individual like me to sell on there. They don't allow sole proprietors on their platform; not for any particular reason, I think it just never occurred to them that any one person would even try, so they have no options for it.

     

     

I still have 2 PCs left. All of the others I built were sold for a reasonable profit, and I haven't really tried to sell the two I still have. I find that I eventually encounter customers who are interested in buying one, so if I only have one or two at a time, I don't really need to try to sell them online. One of the PCs I have left is probably already spoken for, so unless that falls through I really only have one left. Right now that one is loaned out to a customer who needed a working PC to run his NVMe drive off of after his laptop failed (he's waiting on parts), and there is a good chance he decides to keep it and buy it off me after using it for a few days.

     

     

So all in all, I would call this whole thing a success. Is it possible to build a PC around the $1,000 price range and sell it for a profit from home? Yes, definitely. The hard part is finding a way to sell them fast enough to make it anything more than a slow side project.

     

    I'll be keeping my eye out for a steady way to sell these that doesn't require too much work to make it viable. I think I may actually try getting one or two in a local store eventually and just see if I can strike a deal with the owner of a local business to sell my PCs in their shop. No idea how practical that is, but it seems reasonable to try. 

     

    This is probably my last real update for this thread. It was a lot of fun, I learned a lot about online selling, gained a deep hatred for Etsy, figured out the logic of having one or two ready to go prebuilt PCs at all times to sell to whoever wants one, and got to build a bunch of nice PCs. 

    • Thanks 1
  4. Just wanted to add, I've been trying to find some good info on the technical differences between a GPU and an NPU but it's kind of hard to find a good article or video that explains it in detail. Most of the info I can find about what makes NPUs different from GPUs basically just boils down to "NPUs are much better at AI/Deep learning/Machine Learning tasks".

     

I think a good example would be this: I know from personal experience that you need a GPU about equal to or better than a 3080 Ti to do Stable Diffusion image generation locally with decent results. That means a big GPU that draws a lot of power and has a huge heatsink. But now there are NPUs that can do the same task even better without the need for a ton of power and heat. There are NPUs that will be able to do things like image generation on a standard smartphone.

     

    Another point that comes up a lot is that things that are typically processed in the cloud will be able to be done locally much more often, which will reduce power consumption for data processing in a big way. 

     

GPUs and CPUs are good at tasks with clearly defined answers. NPUs are good at tasks with no defined answer, where the answer needs to be created or generated based on existing data. It's kind of ironic that AI isn't very good at math in the same sense that humans are not very good at math, but it makes sense when you think about it. We are very good at coming up with undefined/creative answers to things but not good at very specific, detailed answers. The same is true for NPUs, to a certain degree.

     

     

    There was a time when simple 3D graphics were rendered on the CPU, and then GPUs came along and did that much faster. CPUs could still do it, but GPUs can do it better. NPUs are to GPUs as GPUs are to CPUs when it comes to AI/DL/ML tasks. The average user isn't doing anything like that today, but presumably once NPUs are ubiquitous we are all going to be using them for a lot more. 
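One concrete way to see what NPUs are built for: the core of neural-net inference is low-precision matrix multiplication. The NumPy sketch below quantizes fp32 tensors to int8 and runs the matmul in integer arithmetic (accumulating in int32, the way NPU multiply-accumulate arrays do), then de-quantizes. The shapes and scaling scheme are arbitrary choices for illustration, not any specific NPU's design.

```python
import numpy as np

# Illustration only: quantize fp32 "activations" and "weights" to int8,
# multiply in integer math (int32 accumulation, as NPU MAC arrays do),
# then de-quantize and compare against the full-precision result.

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8)).astype(np.float32)   # toy activations
w = rng.standard_normal((8, 3)).astype(np.float32)   # toy weights

def quantize(a):
    """Symmetric per-tensor int8 quantization; returns values and scale."""
    scale = float(np.abs(a).max()) / 127.0
    return np.round(a / scale).astype(np.int8), scale

xq, xs = quantize(x)
wq, ws = quantize(w)

y_int = (xq.astype(np.int32) @ wq.astype(np.int32)) * (xs * ws)  # NPU-style path
y_fp = x @ w                                                     # fp32 path

# the int8 path closely tracks full precision at a fraction of the bit width
print(np.abs(y_int - y_fp).max())
```

The point of the sketch: the int8 result stays close to the fp32 one, and hardware that only has to move and multiply 8-bit values can do far more of these operations per watt, which is where the NPU efficiency advantage comes from.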

  5. 57 minutes ago, Slaughtahouse said:

     

I want to believe that, but my instincts tell me that the NPU will be more like a PhysX-type card. Great for upgrading an old rig, but not ideal if you're considering getting a faster video card that can simply do this work. Especially if future graphics cards have an NPU.

     

    Thanks for sharing. Curious how "NPUs" will evolve and from a practical standpoint, if they move the needle forward for enthusiasts. 

    Going forward, pretty much all GPUs and CPUs will have NPUs embedded. 

    • Thanks 1
  6. 13 hours ago, pioneerisloud said:

I mean technically speaking, per the definition in the article, "Neural Processing Units are dedicated AI chips that remove some of the work from a computer's CPU or GPU so the device can work better.", that would mean any RX 6000 or 7000 series GPU can function as an NPU (same with any of Nvidia's RTX series cards).

Interesting, I guess we'll see what the future of NPUs will end up looking like soon enough. Maybe one day, we'll be popping NPUs into our systems to assist with our GPUs. Like I said, we kind of can NOW, it's just that people aren't well enough aware that you can run Nvidia and AMD together (or Intel too now).

Definitely a bit of overlap between a GPU and an NPU, but not as much as you might think. An NPU can be something like 10,000 times more efficient than a GPU at certain tasks.

     

    Just like how a CPU technically can do most of what a GPU can do, just not very well. 

     

    If AMD GPUs had NPUs or functioned well as NPUs, FSR and DLSS would be of similar quality. It's the NPU that makes the difference. 

  7. Quote

    As reported by Neowin, AMD NPUs will finally show up in Windows' Task Manager. If you're not sure what an NPU is, it's a new component to computers to aid with processing. Much like how the CPU handles regular processes and GPUs are dedicated graphics processors, NPUs are specially used for AI-based processes.

    With this agreement, AMD's NPUs will now use the Microsoft Compute Driver Model (MCDM). Part of this implementation means you can keep tabs on how the NPU is doing and how much processing power it's using via Task Manager, which will be particularly important when people begin using their NPUs for strenuous AI-based tasks. As AMD puts it:
     

    Being able to track how resources are allocated in real-time and which system components are under load is useful for monitoring application behavior. This kind of tracking is particularly important in notebooks where end users may choose to maximize battery life by controlling where certain workloads run or adjusting global power settings either via the Windows Settings menu or in an OEM-provided application.
     

    NPU implementation is going to be vital in the coming months. With AI PCs entering the consumer market, Microsoft will ideally support as many different NPU brands as possible so people aren't left out because of their choice of hardware.

    AMD's NPUs will finally get support from Windows' Task Manager (msn.com)

     

    This does not apply to Ryzen 7000 CPUs or 7000 series GPUs because neither have NPUs. I think this just serves to help understand what an NPU is and what the roadmap for their implementation will look like. 

     

    What is an NPU? | Windows Central

     

    Intel unveils Core Ultra, its first chips with NPUs for AI work (engadget.com)

     

    • Thanks 1
  8. 3 hours ago, Sir Beregond said:

It was more that that was AMD's first go at RT, and it showed, than anything else. That was Nvidia's 2nd-gen RT, so maybe it improved more than they thought it would, but I don't think there was otherwise anything there to really take AMD off guard.

     

You don't think Nvidia being in their third gen of NPU-capable GPUs while AMD has no NPUs in any of their GPUs is something that caught AMD off guard? I'd say it definitely did.

     

    I don't think even Nvidia knew how effectively they would be able to leverage the NPUs for gaming at first. I think DLSS 1.0 was Nvidia's attempt to justify the existence of NPUs on their GPUs to customers after the fact, and over time I think DLSS turned out way better than Nvidia expected. Now with DLSS 3 and 3.5, I'm sure AMD feels pretty far behind. 

     

    I think Sony and Microsoft also feel like they messed up with current gen consoles. If they had put NPUs in the current gen consoles with no other real changes, it would have been a game changer. Hardware wise, they just barely missed the boat on that. Had the consoles launched a year later, they might have had some NPUs. 

     

I think everyone selling graphics hardware without embedded NPUs feels like they're missing out.

     

     

     

     

  9. Some thoughts on some of this recent video generation stuff after learning a bit more about it:

     

    AI being able to generate these kinds of videos has some implications that go beyond just impressive generation capabilities. It means the AI has a pretty good understanding of the real world and 3D space. The same understanding of the real world that allows AI to make pretty realistic looking videos also means it understands things like physics. 

The ability of the AI to understand physics in this case is an emergent property, meaning it's something the developers didn't set out to achieve, but which happened as a result of training the model on data about the world.

     

    Having an AI that understands the world well enough to generate these videos has a lot of implications. There are more and more rumors and conspiracy theories about OpenAI either already having AGI, or something very close. When I think about what kind of other things an AI with such a deep understanding of the world could potentially be capable of, it starts to seem like AGI rumors could have some merit. 

     

When I try to take things at face value and assume OpenAI doesn't have any big secrets about AGI, the first thing that comes to mind is the $7 trillion figure. Maybe OpenAI isn't realistically anywhere near AGI right now, but they could easily have a very clear path to get there. Perhaps $7 trillion is what Sam Altman thinks it will take to get to AGI. Or perhaps $7 trillion is what he thinks it will take to make AGI accessible en masse, and AGI is actually much closer on a small scale.

     

     

There is so much information to take in with this stuff. I wonder how a person living through the Industrial Revolution would have felt if, instead of happening over roughly a century, it had happened much faster. For example, the steam engine was invented in 1712. It took another 50 years for the design to be improved to a point where it could be used for a wide range of applications. Steam-powered trains were not invented until 1804, so it took almost a century to go from steam engines to the first steam train. I'd imagine that if you were living through that era, the progress would have been almost imperceptible in people's daily lives. GPT-1 was released in 2018, and I don't think anyone would argue that the advancements from then to now have been anything short of 'very impressive' at the very least. We are somewhere between the point at which steam engines were invented and mostly used just to pump water out of mines and the point at which the technology becomes widespread and kicks off an industrial revolution, but it's not going to take centuries, or even close.

  10. Demo:

     

    :CoPilot, please respond to this thread.

     

    Spoiler

    Here is a possible reply as UltraMega:
     

    I appreciate your input, bonami2. I agree that AI is a double-edged sword that could have both positive and negative impacts on the world. However, I don’t think we should be pessimistic or fearful of the future.

    AI is a tool that can be used for good or evil, depending on who controls it and how they use it. I believe that we, as humans, have the responsibility and the power to shape the direction of AI development and ensure that it aligns with our values and goals. That’s why I think it’s important to have a diverse and inclusive community of researchers, developers, investors, and users who can collaborate and communicate effectively on AI issues.

    I hope that Sam Altman and his partners will be transparent and ethical in their AI chip fabrication project, and that they will consider the social and environmental implications of their actions.

     

This took all of a few clicks in my sidebar, where I asked Copilot to respond to this thread as me.

     

I hope this makes the point easy to absorb. I know it's not as annoying as I am yet, but they're working on it.

    • Thanks 1
    • Respect 2
  11. 32 minutes ago, bonami2 said:

AI will break the world. I want to do remote work, but I know that with my skill level, for those jobs, I will be replaced. Almost everyone working on a computer will be replaced. Manual labor is the only thing still safe for a while. It's pretty sad.

    It might break the world. 

     

Sam Altman has been seeking this investment from all sorts of international investors. All the biggest tech companies and governments are involved. Hopefully, that level of collaboration will provide a level of safety.

    • Shocked 1
  12. Quote

    OpenAI CEO Sam Altman is in talks with investors to raise as much as $5 trillion to $7 trillion for AI chip manufacturing, according to people familiar with the matter. The funding seeks to address the scarcity of graphics processing units (GPUs) crucial for training and running large language models like those that power ChatGPT, Microsoft Copilot, and Google Gemini.

    Report: Sam Altman seeking trillions for AI chip fabrication from UAE, others | Ars Technica

     

I've seen these headlines floating around for a few days. It took me a while to wrap my head around it, since at face value it almost seems satirical. $7 trillion is roughly 7% of annual global GDP. It sounds hypothetical, but he's serious. The running theory is that the only way to justify this kind of investment is if something that will fundamentally shift society is behind it.
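A quick back-of-the-envelope check on the scale of that number. The world GDP figure here is an assumption for illustration (roughly the 2023 ballpark; the real figure varies by year and source):

```python
# Rough scale check: the reported $7T ask against an assumed ~$100T world GDP.
ask = 7e12            # upper end of the reported raise
world_gdp = 100e12    # assumed annual global GDP, for scale only
share = ask / world_gdp
print(f"{share:.0%}")  # about 7% of a year of global economic output
```

Even as a fraction of a single year of everything the world produces, it is an almost unprecedented sum for a private fundraising effort.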

     

     

    • Agreed 1
  13. 6 hours ago, neurotix said:

    Agreed with everything..

     

    I think we will keep the 10 year old rule and let people list a PS4 or Xbox One but *only* the originals, not the pro models.

     

    I do agree though on 15 years and 360/PS3/Wii actually being retro, but since this club is for fun I'll allow the original PS4/Xbone.

     

PC-wise, I started the hobby around 2009 with a Phenom II X2 that I unlocked to an X4 and overclocked. I think that is retro now, given it can only really run games up to about 2014-2015, and poorly at that; stuff like Just Cause 2 ran great on it, though. I agree with pio in that respect; nobody is playing modern games on an FX chip or a 2600K.

I maintain that currently supported consoles should be exempt from a ten-year rule of thumb. The PS4 is just not a retro console by any measure, yet.

    • Thanks 1
  14. 42 minutes ago, pioneerisloud said:

     

I'd put Windows 7 + 8.1 rigs into that category as well. I think that'd be reasonable enough, no? Nobody's playing on Phenom IIs, FX chips, or Sandy Bridge / Ivy Bridge these days unless you're actively trying to (the era those OSes are from).

     

    15 minutes ago, axipher said:

I would have to say at least 2 generations old; given we're on the 9th generation now, that means 7th generation and older, so roughly the Wii, Xbox 360, and PS3.

     

Wii U might kind of get swept up with that, though, with how close its titles were to the Wii's.

     

     

PC gaming is a little harder because of the huge range in quality and the long support for games, including engine updates, but probably around 2014 or 2015 I guess would be the latest cutoff; so games like Shovel Knight, Dark Souls II, Fallout 4, and Undertale, probably right before the huge influx of "16-bit" sprite-based games in modern engines that don't actually give any regard to the CRTs that made sprite art come to life rather than just being a bunch of pixels.

     

     

     

It's hard to pinpoint an exact spot. Years and generations are an easy way to put a number on it, but it's really hard to decide, since games can be in development for up to 5, even 10 years, and others can be complete reimaginings of 10-15 year old games with modern engines and quality-of-life improvements.

     

Deep down, part of me really wants to say that retro games are any games that were designed to be played on CRTs by the majority, from before DLC was a thing. But then I also consider that "retro" shouldn't point to a specific time, but rather to a past relative to where we are now; 10 years in video games is a huge amount of time, with so many changes in engine design, game design, and culture.

     

    I agree with Axipher that it's harder to pin down for PC. I don't think you can just go by when an OS stopped being supported in reference to a game because the hardware may still be supported on the most recent OS, and the OS can just be updated to more recent versions for free. That's why I think it makes more sense to just have a timeframe in mind for PC games. I think 10 years is fine, even though that would awkwardly make GTAV a retro game. Technically though, it is a PS3 game. It would also make Star Citizen PTU a retro game. 😄

     

    15 years is also reasonable though.

     

If you're just talking about PC hardware, though, I would say anything that can't run a currently supported OS is retro, and since Windows 10 is still supported, that means you would have to go back pretty far for retro PC hardware. On the other hand, I would also gladly say something like a Q6600 is retro now, and that can run Windows 10, so there is a bit of a grey area.

  15. This is an interesting concept...

     

Having an upscaler that can run on any NPU is interesting because, as far as I know, an NPU doesn't have to be part of a GPU or CPU; it can just be a PCIe or M.2 card... which technically means one could be added to a PS5 or Xbox Series console via the storage expansion slot. It also means other weird mismatches would be possible. I guess in theory someone with an older PC that doesn't support any upscaling options could throw an NPU into a PCIe slot and use Microsoft's upscaling. It also means any modern CPU with an NPU will be able to get a lot more mileage out of its integrated graphics.

     

     

    I asked CoPilot to make a list of NPU capable hardware:

    Spoiler

    As of now, several consumer-level hardware components incorporate NPUs (Neural Processing Units) to enhance AI-related tasks. Here's a list of some notable ones:


    - Intel Core Ultra Processors:
        - The Intel Core Ultra processors (formerly codenamed Meteor Lake) feature built-in NPUs.
        - These NPUs are widely utilized by various computer companies in their latest Ultrabook and Notebook laptops.


    - AMD Ryzen 8040-Series Laptop Processors:
        - The AMD Ryzen 8040-series laptop processors also integrate NPUs.
        - These processors enhance AI capabilities and contribute to improved performance in AI workloads.


    - Qualcomm Windows PCs:
        - While not as common, Qualcomm's Windows PCs were among the first to include NPUs.
        - Arm processors used in most smartphones have included machine-learning acceleration for several years, and this technology has extended to some Windows laptops.


    - Apple M-Series Chips for Macs:
        - Apple's M-series chips for Macs (including the M1 and later versions) incorporate NPUs.
        - These chips have been designed to accelerate AI tasks, including neural network inference.


    - Future Trends:
        - A recent report predicts that 60% of PCs sold will be AI-capable by 2027.
        - As technology evolves, we can expect more consumer-level hardware to integrate NPUs, further enhancing AI experiences.

     

     

Ironically enough, Copilot forgot to mention Nvidia GPUs, which have NPU functionality via their tensor cores. AMD 7000 series GPUs do not have an NPU, which means they might not be able to run Microsoft's upscaler. When I looked this up, apparently AMD GPUs should be supported via a fallback method, but it might not have the same quality level as an NPU-based method.

    Spoiler

    Microsoft Super Resolution (MSR), also known as DirectML Super Resolution, is an AI-based upscaling technique developed by Microsoft. It aims to enhance the resolution and visual quality of images and videos. However, as of my last knowledge update (which was current until 2021), MSR primarily relies on DirectML, which is a framework for machine learning acceleration on Windows devices.
    Here's the situation regarding MSR and AMD 7000 series GPUs:
    - DirectML Compatibility:
        - DirectML is compatible with a wide range of hardware, including both NVIDIA and AMD GPUs.
        - However, the specific capabilities and performance of DirectML on different GPUs can vary based on their architecture, driver support, and other factors.
    - AMD 7000 Series GPUs:
        - The AMD Radeon RX 7000 series (such as the RX 7700 XT, RX 7600 XT, etc.) are based on the RDNA 3 architecture.
        - While these GPUs are powerful for gaming and general graphics tasks, they do not inherently include dedicated NPUs (Neural Processing Units) like some other AMD products.
        - MSR primarily relies on the DirectML framework, which leverages the GPU's compute capabilities for AI tasks.
    - MSR and AMD 7000 Series GPUs:
        - MSR should technically run on AMD 7000 series GPUs, but the performance and efficiency may not be optimal compared to GPUs specifically designed for AI workloads.
        - For the best experience with MSR, consider using GPUs that have better AI acceleration capabilities, such as those with dedicated NPUs or specialized AI hardware.

     

     

It will definitely be interesting to see how this all plays out and what kind of weird hardware configs it will allow for. As far as I know, there are no NPU-only devices available on the consumer market yet, but I think they are coming. I recently saw a video about an NPU device that looked like an NVMe drive and went in the M.2 slot.
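The "fallback" behavior described above boils down to a provider-preference chain: the runtime tries the NPU path first, then falls back to GPU compute, then CPU. The sketch below is a generic illustration of that pattern; the provider names and function are made up, not DirectML's or ONNX Runtime's actual API.

```python
# Generic sketch of execution-provider fallback, the pattern inference
# runtimes use: prefer an NPU, fall back to GPU compute, then CPU.
# Names are illustrative only, not a real API.

PREFERENCE = ["npu", "gpu", "cpu"]

def pick_provider(available):
    """Return the most preferred execution provider that is present."""
    for provider in PREFERENCE:
        if provider in available:
            return provider
    raise RuntimeError("no execution provider available")

# A system with no NPU (e.g. an RX 7000 card) falls back to the GPU path;
# the result is the same feature running with potentially lower quality/speed.
print(pick_provider({"gpu", "cpu"}))         # gpu
print(pick_provider({"npu", "gpu", "cpu"}))  # npu
```

This is also why the quality can differ by hardware: the same upscaler API can dispatch to very different silicon depending on which provider ends up selected.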

     

     

    • Respect 1
I've hardly played any JRPGs outside of Final Fantasy. I would have to say 9 is my favorite in terms of plot and style, with 10 being my favorite in terms of gameplay. 9 is actually the one the creator of the series considers the most faithful to his overall vision. I think if 9 had come out when 7 did, it would be regarded as the better game, but coming at the end of the PS1 life cycle, it got overlooked by many.

     

I would love to see Square do an FF9 remake. Supposedly there was an FF9 anime in the works, but I haven't heard any updates on that in a while.

    • Thanks 1
  17. 4 hours ago, neurotix said:

    Next time I replay that game, I will definitely use it. Thanks for the explanation.

     

    Any love for FF7? Last time I played that game I beat the whole game, including beating Ruby and Emerald Weapon, in 42 hrs.

     

    What do you guys think of the FF7 Remake?

There are similar mods for FF7, though I don't think they're quite on par with the Moguri Mod. It's been a while since I checked, though.

     

    I haven't played the remake simply because I want to wait until they finish making the game (all three parts) before I play it. 

    4 hours ago, The Pook said:

    I loved FF9 (and 7) as a kid but I tried replaying 9 when it came to Steam and couldn't get into it. Nostalgia glasses strike again 😢

     

    Check out the moguri mod. It goes a long way towards cleaning up the rough edges. 

     

I still genuinely enjoy the older-style turn-based FF games; I wish they would make more like them. FFX was the peak of the combat system for the entire series.

    • Respect 1
  18. 59 minutes ago, neurotix said:

    Favorite genre is JRPG for me as well.

     

FFIX is fantastic; we got the Switch version last year and I helped my wife through it. We lay in bed for multiple weekends playing it. She loved it. (Our bed can be raised with a remote, and we raised the head up. 55" on a dresser.) However, my favorite game of all time is FFIV on SNES (aka Final Fantasy II back in the day), even if it's not the best FF. (The best overall is definitely VI; this is pretty well agreed upon, I think.)

     

    Read through this thread, Ultra posted something interesting called FFIX Moguri Mod that runs in an emulator and upscales the backgrounds using AI for FFIX. I need to look into this myself and try it out but I've beaten that game and gotten every chocograph treasure etc  like 10 times and just played it with my wife so...

     

    I have a SNES flashcart and need to play FFIV Namingway Edition which is supposed to be the definitive translation of FFIV, better than the bad J2E translation of that game from around 2000.

     

The Moguri Mod for FF9 is great. It doesn't use an emulator; it's for the Steam version.

    • Thanks 1