• truthfultemporarily@feddit.org · 6 days ago

    Just a reminder: LLMs are incapable of even counting. They are statistical models that predict which token is most likely to come next, given the previous tokens.

    Putting Copilot in Excel makes no sense whatsoever, and MS must know people will use it and get completely wrong results.

    • WhatAmLemmy@lemmy.world · 6 days ago (edited)

      Even better: they are incapable of discerning correlation from causation, which is why they give completely illogical and irrelevant information.

      Turns out pattern recognition doesn’t mean shit when you don’t know how anything works, and never will.

      • Wolf314159@startrek.website · 6 days ago

        Somehow this reminds me of a meme thread that just popped up, in which a lot of people proudly declare their inability to study and claim that the mere suggestion that one should read the manual as a first step to solving a problem is actually very offensive.

        • WhatAmLemmy@lemmy.world · 6 days ago (edited)

          That’s not far off from reality, where normies laugh at you for suggesting they read the manual for the 21st-century appliance (basically a computer) they spent hundreds or thousands of dollars purchasing.

          Soon the ridicule will be replaced with offense, then “straight to jail” shortly after.

          • Cypher@lemmy.world · 6 days ago

            My only issue with RTFM is how often the manual is absolute dogshit, written by some engineer who assumes knowledge only another engineer would already have.