• mhague@lemmy.world · 7 months ago

    I don’t get it — what makes the output trustworthy? If it seems real, it’s probably real? If it keeps hallucinating something, it must have some truth to it? Those seem to be the two main mindsets: “you can tell by the way it is,” and “look, it keeps saying this.”

    • Olgratin_Magmatoe@lemmy.world · 7 months ago

      Given that multiple other commenters in the infosec.exchange thread have reproduced similar results, that right-wingers tend to have poor security, and that LLMs are pretty much impossible to fully control for now, it seems most likely that it’s real.