• mrfugu [he/him, any]@hexbear.net · 22 points · 2 days ago

    You can run a pretty decent LLM on your home computer and tell it to act however you want. That won’t stop it from hallucinating constantly, but it will at least attempt to prioritize truth.
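    For anyone curious, the “tell it to act however you want” part is just a system prompt. Here’s a minimal sketch using the llama-cpp-python bindings; the model path, prompt wording, and settings are placeholders, not a recommendation:

    ```python
    from llama_cpp import Llama

    # Load any local GGUF model you've downloaded (path is a placeholder).
    llm = Llama(model_path="./models/some-model.gguf", n_ctx=4096, verbose=False)

    # The system message is where you set the behavior you want from the model.
    result = llm.create_chat_completion(
        messages=[
            {"role": "system",
             "content": "Answer truthfully. If you are unsure, say so instead of guessing."},
            {"role": "user", "content": "What year did the Berlin Wall fall?"},
        ],
        temperature=0.2,  # lower temperature makes output more conservative, though it won't stop hallucination
    )
    print(result["choices"][0]["message"]["content"])
    ```

    No system prompt actually makes the model truthful, of course; it just biases it toward admitting uncertainty instead of confidently making things up.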

    • BynarsAreOk [none/use name]@hexbear.net · 3 points · 1 day ago

      “Attempt” being the key word. Once you catch it deliberately lying to you, the confidence is surely broken; otherwise you’re stuck double- and triple-checking (or more) the output, which defeats the purpose for some applications.