• jeffw@lemmy.world (OP)

    That can't answer most questions, though. For example, I hung a door recently and had some questions that it answered (mostly) accurately. An encyclopedia can't tell me how to hang a door.

    • Balder@lemmy.world

      Yeah, there's a reason this wasn't done before generative AI: earlier systems couldn't handle anything even slightly specific.

    • linearchaos@lemmy.world

      Same here. I was dealing with a strange piece of software and spent hours searching configs and samples without finding anyone who'd had problems with the weird language it uses. I finally gave up and asked GPT; it explained exactly what was going wrong and gave me half a dozen fixes to try.

    • btaf45@lemmy.world

      That can't answer most questions, though.

      It would make AI much more trustworthy. You can't trust ChatGPT on anything related to science, because it tells you things like the Andromeda Galaxy being inside the Milky Way. The only way to fix that is to program basic, established science directly into the AI.
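
      In practice, "programming known science in" usually means checking the model's output against a curated fact store before trusting it. Here's a minimal sketch of that idea in Python; the KNOWN_FACTS table and check_claim helper are made-up illustrations, not any real ChatGPT feature:

      ```python
      # Hypothetical sketch: grounding a model's claim against a curated fact store.
      # KNOWN_FACTS and check_claim are illustrative only, not a real API.

      KNOWN_FACTS = {
          # (subject, relation): accepted value
          ("Andromeda Galaxy", "located_in"): "Local Group",
          ("Milky Way", "located_in"): "Local Group",
      }

      def check_claim(subject: str, relation: str, claimed_value: str) -> str:
          """Compare a model-generated claim against the curated fact store."""
          known = KNOWN_FACTS.get((subject, relation))
          if known is None:
              return "unverified: not in fact store, treat with caution"
          if known == claimed_value:
              return "consistent with known science"
          return f"contradicts known science (expected {known!r})"

      # Example: the model claims Andromeda is inside the Milky Way.
      print(check_claim("Andromeda Galaxy", "located_in", "Milky Way"))
      # -> contradicts known science (expected 'Local Group')
      ```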