Oh no.

  • Eager Eagle@lemmy.world · +189/-2 · 2 years ago

    Downfall, Inception, Meltdown, Spectre, I hate to see new vulnerabilities, but their naming choices are solid.

  • cybervseas@lemmy.world · +136/-2 · 2 years ago

    Intel claims most consumer software shouldn’t see much impact, outside of image and video editing workloads…

    But that’s, like, the one place other than games where consumers are looking for performance. What’s left, web browsing and MS Office?

    • philluminati@lemmy.ml · +8/-1 · 2 years ago

      It’s not that they aren’t impacted, it’s just that you don’t see the impact as noticeably.

    • FaceDeer@kbin.social · +0 · 2 years ago

      I just skimmed through the article and it seems like this vulnerability is only really meaningful on multi-user systems. It allows one user to access memory dedicated to other users, letting them read stuff they shouldn’t. I would expect that most consumer gaming computers are single-user machines, or only have user accounts for trusted family members and whatnot, so if this mitigation causes too much of a performance hit I expect it won’t be a big risk to turn it off for those particular computers.
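
      For the single-user case described here, Linux exposes the Downfall (GDS) mitigation state, and it can be turned off at your own risk. A minimal sketch, assuming a kernel recent enough to report the status in sysfs; the `classify_gds` helper is invented for illustration:

```shell
#!/bin/sh
# Where recent Linux kernels report the Downfall (GDS) status.
GDS_SYSFS=/sys/devices/system/cpu/vulnerabilities/gather_data_sampling

# classify_gds: map the kernel's status string to a short verdict
# (helper name is made up for this sketch).
classify_gds() {
    case "$1" in
        "Not affected")  echo "safe" ;;
        Mitigation:*)    echo "mitigated" ;;   # e.g. "Mitigation: Microcode"
        Vulnerable*)     echo "vulnerable" ;;  # e.g. "Vulnerable: No microcode"
        *)               echo "unknown" ;;
    esac
}

# On a real machine:
#   classify_gds "$(cat "$GDS_SYSFS")"
```

      If you accept the tradeoff on a trusted single-user box, the mitigation can reportedly be disabled with the kernel boot parameter `gather_data_sampling=off`.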

      • Arghblarg@lemmy.ca · +0 · edited · 2 years ago

        this vulnerability is only really meaningful on multi-user systems

        Well, that says it all. CPU manufacturers have no incentive at all to secure the computations of multiple users on a single CPU (or cores on the same die)… why would they? They make more cash if everyone has to buy their own complete unit, and they can outsource security issues to ‘the network’ or ‘the cloud’…

        Years ago when I was in University this would have been a deathblow to the entire product line, as multi-user systems were the norm. Students logged into the same machines to do their assignments, employees logged into the same company servers for daily tasks.

        I guess that isn’t such a thing any more. But wow, what a sh*tshow modern CPU architecture has become, if concern for performance has completely overridden proper process isolation and security. We can’t even trust that a few different users on the same machine can be separated properly due to the design of the CPU itself?

        • philluminati@lemmy.ml · +0/-1 · edited · 2 years ago

          Processor manufacturers target their devices and sales towards cloud computing so they have a huge incentive to avoid having issues like these. It’s ridiculous to suggest otherwise.

  • hark@lemmy.world · +121/-4 · 2 years ago

    Install backdoors and sell that info to governments and companies, then years later reveal the issue to justify downgrading performance of older CPUs to encourage people to upgrade.

    • 1984@lemmy.today · +0/-1 · edited · 2 years ago

      Antivirus companies have also been caught making viruses.

      A lot of shady shit happens when there is money and power to be had.

  • AvgJoe@lemmy.world · +50/-1 · edited · 2 years ago

    It took them a year to come up with a microcode fix, and it still causes a performance loss of 50% in some cases? Ew

  • dual_sport_dork 🐧🗡️@lemmy.world · +48/-1 · edited · 2 years ago

    Ha-ha. My chip’s too old to be affected. I don’t see my architecture on the list.

    I knew putting off upgrading for around a decade would pay off. (Windows Update tells me my PC is not “ready” for Windows 11 due to its hardware, either. Oh no, whatever shall I do.)

    • ArcaneSlime@lemmy.dbzer0.com · +12/-1 · 2 years ago

      This inspires confidence with my 2010-ass Toshiba Satellite with an i5 and 8GB of DDR3. I need to look and see if mine is too old lol.

    • atticus88th@lemmy.world · +9 · 1 year ago

      Don’t the older chips suffer a greater performance drop from the Spectre and Meltdown mitigations?

  • FrankFrankson@lemmy.world · +39/-1 · 1 year ago

    Every article is a copy-paste of the same bullshit: talk about the vulnerability, then point to the stupid cryptic list of processors that requires you to jump through hoops to read it. You can’t just search for your processor in a database; I mean, fuck, that would take them at least a couple hours of their precious time to set up, and they’ve only had a year. How do you fix it? Why, with a microcode update, of course!! From where, you ask? Well, don’t worry, just look at the cryptic list, it’ll tell you if you need a microcode update!!

    Fuck every article about this shit. Anyone wanna bust out an ELI5 on how to fix this problem for people? (I was assuming it’s a BIOS update, but the articles have only confused me further.)
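
    To actually answer the ELI5: on Linux the kernel itself will tell you whether your CPU still needs the fix, and the microcode arrives either as a distro package, as a BIOS/UEFI update from your board vendor, or (on Windows) through Windows Update. A hedged sketch; the `needs_microcode` helper is invented here, and package names vary by distro:

```shell
#!/bin/sh
# The kernel's Downfall verdict lives here on recent kernels:
#   /sys/devices/system/cpu/vulnerabilities/gather_data_sampling

# needs_microcode: a status starting with "Vulnerable" (e.g.
# "Vulnerable: No microcode") means the CPU is affected and the fix
# is not loaded yet; anything else needs no action from you.
needs_microcode() {
    case "$1" in
        Vulnerable*) echo "yes" ;;
        *)           echo "no" ;;
    esac
}

# Typical use on a real box:
#   needs_microcode "$(cat /sys/devices/system/cpu/vulnerabilities/gather_data_sampling)"
```

    If the answer is yes: on Debian/Ubuntu the package is `intel-microcode` (reboot afterwards); otherwise a BIOS/UEFI update from your motherboard vendor ships the same microcode, which is why the articles keep pointing at BIOS updates.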

  • HexesofVexes@lemmy.world · +32 · 2 years ago

    Guess it’s time for another FPS hit…

    While the article says it won’t impact most applications, I suspect it’s closer to saying “won’t impact most applications as much”.

  • chicken@lemmy.dbzer0.com · +31/-1 · 1 year ago

    This vulnerability, identified as CVE-2022-40982, enables a user to access and steal data from other users who share the same computer.

    So just continue not letting people use my computer, got it. Very simple fix.

  • scottywh@lemmy.world · +29/-3 · 1 year ago

    /tinfoilhat

    I admittedly stopped reading halfway through but I feel like these newest vulnerabilities being discovered are probably just fucking government back doors the manufacturers have been forced to include.

    /tinfoilhat

    • deranger@lemmy.world · +4/-1 · 1 year ago

      Check out the documentary Zero Days (2016) if you haven’t already. That’s not really a tinfoil hat take these days IMO.

      • scottywh@lemmy.world · +1 · 1 year ago

        Just means they have to intentionally create new ones to be eventually found for the next generation.

    • luciferofastora@discuss.online · +2 · edited · 1 year ago

      I can’t comment on the general trend, but this specific one seems a bit too circumstantial to be of use for a serious spying effort. You’d have to have the spyware running in parallel with the apps using the passwords you want to steal, in a specific way.

      The risk exists, which is bad enough for stochastic reasons (eventually, someone will get lucky and manage to grab something sensitive, and since the potential damage from that is incalculable, the impact axis alone drives this into firm “you need to get that fix out asap” territory), but it’s probably irrelevant in terms of consistency, which is what you’d need to actually monitor anyone.

      If you manage to grab enough info to crack some financial access data, you can steal money. If you can take over some legit online account or obtain some email-password combo, you can sell it. But if you want to monitor what people are doing in otherwise private systems, you need some way to either check on demand or log their actions and periodically send them to your server.

      It would be far more reliable to have injection backdoors to allow you access by virtue of forcing a credential check to come up valid than to hope for the lucky grab of credentials the user might change at an arbitrary moment in time.

  • Veedem@lemmy.world · +22 · 2 years ago

    Yikes, the performance hit is scary, but if you’re running a server, what option do you have?

    • XIIIesq@lemmy.world · +8 · edited · 1 year ago

      If it’s anything like the industry that I work in, the CEO would have been informed of the shortcomings of the design numerous times and given a response along the lines of “does it make our CPUs faster and more powerful though?”.

      The CEO won’t be pissed off at their chip designers; they’ll be pissed because they’ve been caught out.

    • Roboticide@lemmy.world · +2 · 1 year ago

      Given that the AMD vulnerability was called “Inception,” maybe they just like using movie titles to name CPU vulnerabilities?

  • Buddahriffic@lemmy.world · +8/-1 · 1 year ago

    I’m curious if there’s a silver lining of all current DRM keys being accessible through this.