Just a reminder that LLMs are incapable of even counting reliably. They are statistical models figuring out which tokens are most likely to appear next based on the previous tokens.
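To make the "most likely next token" framing concrete, here is a toy sketch: a bigram counter that always picks the statistically most frequent next word. This is emphatically not how a production LLM works (those use neural networks over subword tokens and long contexts), but it shows the autoregressive loop, and that nothing in the loop ever "counts" anything beyond token frequencies.

```python
from collections import Counter, defaultdict

# Toy illustration of "predict the most likely next token given the previous one."
# Real LLMs use neural networks over subword tokens; this only mimics the framing.
corpus = "the cat sat on the mat the cat ate the rat".split()

# Count how often each token follows each other token.
next_counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    next_counts[prev][nxt] += 1

def predict_next(token: str) -> str:
    """Return the most frequently observed next token, or <unk> if unseen."""
    candidates = next_counts.get(token)
    return candidates.most_common(1)[0][0] if candidates else "<unk>"

# Generate a short continuation by repeatedly taking the most likely next token.
token = "the"
generated = [token]
for _ in range(5):
    token = predict_next(token)
    generated.append(token)

print(" ".join(generated))  # e.g. "the cat sat on the cat"
```

The output is fluent-looking but has no model of what a cat or a mat is; it is pure sequence statistics, which is the point being made above.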
Putting Copilot in Excel makes no sense whatsoever, and MS must know people will use it and get completely wrong results.
Even better: they are incapable of distinguishing correlation from causation, which is why they give completely illogical and irrelevant information.
Turns out pattern recognition means dogshit when you don’t know how anything works, and never will.
Somehow this reminds me of a meme thread that just popped up, in which a lot of people proudly declare their inability to study and claim that the mere suggestion of reading the manual as a first step to solving a problem is deeply offensive.
That’s not far off from reality, where normies laugh at you for suggesting they read the manual of the 21st-century appliance (basically a computer) they spent hundreds or thousands to purchase.
Soon the ridicule will be replaced with offense, then “straight to jail” shortly after.
My only issue with RTFM is how often the manual is absolute dogshit, written by some engineer who assumes knowledge only another engineer would already have.