• doctortofu@piefed.social · 4 hours ago

    I said it before, and I’ll say it again - this is a scene from Idiocracy playing before our eyes, only the corporate morons were brainwashed into putting “AI is awesome!” in every sentence instead of “brought to you by Carl’s Junior”…

    Brawndo AI - it has electrolytes!

  • mintiefresh@piefed.social · 6 hours ago

    This person is so out of touch.

    It’s so absurd that it doesn’t even sound like a real story.

    • MudMan@fedia.io · 6 hours ago

      I do appreciate LinkedIn because it helps surface how social bubbles distort perception.

      That includes Fedi/Lemmy. This guy is painfully out of touch, but so is everyone else. That’s the entire problem of the unmoored, atomized media and interaction landscape we’ve built for ourselves this century.

      I mean, I’m not excusing the guy - this was a pretty dumb post - but my first reaction to these is to wonder what dumb, unhinged assumptions my personal reality filters are inflicting on me that won’t make headlines, you know?

      • rebelsimile@sh.itjust.works · 56 minutes ago

        I appreciate your compassion and all, but like, they just got fired from that place, and the guy is like “to alleviate the pain of being fired from GloboCorp, GloboCorp is giving the former GloboTherapy team a 30-day free trial of GloboCorp-brand GloboTherapy”.

        Like, if I got fired from Pizza Hut and they were like “but here’s a 10% off coupon on your next order”, I’m not fucking ordering Pizza Hut again for the rest of my life. And that’s just Pizza Hut.

    • Ogmios@sh.itjust.works · 6 hours ago

      He knows exactly what he’s doing, and he’s going to continue to do it until people stop expecting machines to solve all their problems for them.

  • TheDuffmaster@lemmy.world · 6 hours ago

    He calls this his “best advice” - he definitely sounds like the kind of guy who’s so brain-dead he asks ChatGPT how it would lead a large gaming conglomerate.

    • Evil_Shrubbery@lemmy.zip · 51 seconds ago

      It’s scary how large a proportion of upper managers & CEOs use ChatGPT daily for business. It’s so convenient when the decisions are just a, b, or c.

  • leftzero@lemmynsfw.com · 6 hours ago

    LLMs ain’t gonna replace programmers any time soon (they might get us laid off, but they’re not going to do our jobs no matter how much executives want them to), but they seem to have already replaced executives, though sadly without the laying off part.

    It’s becoming more and more evident that these extremely harmful idiots (including CEOs and whatnot) have completely outsourced all their decision-making, and what little thinking they used to do, to LLMs.

    We’re being ruled by vegetables parroting hallucinating autocomplete engines.

    • massive_bereavement@fedia.io · 2 hours ago

      But the sales guy said that a programmer using their copilot technology will program at 10x speed. So I guess we can fire 10% to pay for it.

    • squaresinger@lemmy.world · 5 hours ago

      We’re being ruled by vegetables parroting hallucinating autocomplete engines.

      That’s been the case since MBAs in leadership positions have been a thing. They’ve only swapped out the external business-analyst consultants for LLMs.

  • rem26_art@fedia.io · 6 hours ago

    Absolutely reckless suggestion to have newly emotionally vulnerable people seek help from an LLM. And it’s even worse that they’re probably emotionally vulnerable because of the actions of your company.