I think AI is neat.

  • Redacted@lemmy.world · 10 months ago

    Whilst everything you linked is great research which demonstrates the vast capabilities of LLMs, none of it demonstrates understanding as most humans know it.

    This argument always boils down to one’s definition of the word “understanding”. For me that word implies a degree of consciousness, for others, apparently not.

    To quote GPT-4:

    LLMs do not truly understand the meaning, context, or implications of the language they generate or process. They are more like sophisticated parrots that mimic human language, rather than intelligent agents that comprehend and communicate with humans. LLMs are impressive and useful tools, but they are not substitutes for human understanding.

    • Even_Adder@lemmy.dbzer0.com · 10 months ago

      When people say that the model “understands”, it means just that, not that it is human, and not that it does so exactly as humans do. Judging its capabilities by how closely it mimics humans is pointless, just like judging a boat by how well it can do the breaststroke. The value lies in its performance and output, not in imitating human cognition.

      • Redacted@lemmy.world · 10 months ago

        Understanding is a human concept so attributing it to an algorithm is strange.

        It can be done by taking a very shallow definition of the word, but then we’re just entering a debate about semantics.

          • Redacted@lemmy.world · 10 months ago

            Yes, sorry, I probably shouldn’t have used the word “human”. It’s a concept we apply to living things that experience the world.

            Animals certainly understand things, but it’s a sliding scale where we use human understanding as the benchmark.

            My point stands though: to attribute it to an algorithm is strange.