• Doorknob@lemmy.world · 2 days ago

    Took me ages to understand this. I’d thought, “If an AI doesn’t know something, why not just say so?”

    The answer is: that wouldn’t make sense, because an LLM doesn’t know ANYTHING. It has no notion of what it knows or doesn’t know; it only predicts which token is statistically likely to come next.
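
    To make that concrete: all the model ever does is turn the prompt into a probability distribution over the next token, and it always produces one, whether or not the question has a real answer. A rough sketch (GPT-2 via Hugging Face transformers as a stand-in for any LLM; the prompt is deliberately unanswerable):

    ```python
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    # Sketch: an LLM never reports "I know" or "I don't know" -- it only
    # scores every token in its vocabulary for how likely it is to come next.
    tok = AutoTokenizer.from_pretrained("gpt2")
    model = AutoModelForCausalLM.from_pretrained("gpt2")

    prompt = "The capital of Atlantis is"  # unanswerable on purpose
    inputs = tok(prompt, return_tensors="pt")

    with torch.no_grad():
        next_token_logits = model(**inputs).logits[0, -1]

    probs = torch.softmax(next_token_logits, dim=-1)
    top = torch.topk(probs, k=5)
    for p, idx in zip(top.values, top.indices):
        print(f"{tok.decode(int(idx))!r}: {float(p):.3f}")
    # A nonsense question still yields a confident-looking distribution;
    # there is no separate "knowledge" signal anywhere in the output.
    ```

    Any “I don’t know” you do see from a chatbot is itself just a token sequence the training made likely, not a report from some internal knowledge check.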

    • Electricd@lemmybefree.net · 19 hours ago

      Thinking models can, to an extent, realize their prediction doesn’t make sense if they really know nothing, but yeah, it’s not always accurate.
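
      Roughly, that self-check amounts to something like this (pure sketch; `ask` is a hypothetical placeholder for whatever chat API you use, not a real library call):

      ```python
      # Hypothetical sketch of a "thinking model" style self-check loop.
      # `ask` is a placeholder, not a real API -- wire it to your own client.
      def ask(prompt: str) -> str:
          raise NotImplementedError("connect to an LLM API here")

      def answer_with_self_check(question: str) -> str:
          draft = ask(f"Answer concisely: {question}")
          # Second pass: the model critiques its own draft. This catches
          # some confabulation, but the critique is also just next-token
          # prediction, so it is not always accurate either.
          verdict = ask(
              f"Question: {question}\nDraft answer: {draft}\n"
              "Is the draft actually supported by well-known facts? "
              "Reply exactly SUPPORTED or UNSUPPORTED."
          )
          if "UNSUPPORTED" in verdict:
              return "I'm not confident I have a reliable answer to that."
          return draft
      ```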