• Optional@lemmy.world · 6 months ago

    Because CoPilot+ is purportedly trained on what users actually do, it looked plausible to someone in marketing at Microsoft that it could deliver on “help the users get stuff done”. Unfortunately, human beings assume that LLMs are sentient and understand the questions they’re asked, when they are in fact unthinking statistical models that cough up the highest-probability answer-shaped object in response to any prompt, whether or not that answer is truthful.

    Hehehe.