• Repple (she/her)@lemmy.world
    6 months ago

    Super disappointed if they’re doing this off-device. If we’re getting more language model crap, at least make it local, please.

    • WalnutLum@lemmy.ml
      6 months ago

      The problem is that notably “powerful” AIs need pretty significant hardware to run well.

      As an example, the Snapdragon NPUs can, I think, barely handle 7B models.
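
      For a rough sense of why: just holding a 7B model’s weights takes several GB, depending on quantization, before you even count activations or KV cache. A back-of-the-envelope sketch (the ~10% overhead factor here is my own assumption, not a measured figure):

      ```python
      # Rough weight-memory estimate for a 7B-parameter model.
      # Ignores activations and KV cache, so real usage is higher.
      PARAMS = 7e9

      def weight_memory_gb(params: float, bytes_per_param: float,
                           overhead: float = 1.1) -> float:
          # overhead=1.1 is an assumed ~10% fudge for runtime bookkeeping
          return params * bytes_per_param * overhead / 1e9

      for name, width in [("fp16", 2.0), ("int8", 1.0), ("int4", 0.5)]:
          print(f"{name}: ~{weight_memory_gb(PARAMS, width):.1f} GB")
      ```

      So even aggressively quantized, a 7B model wants a few GB of fast memory to itself, which is a big ask for a phone-class NPU.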