The users of AI companion app Replika found themselves falling for their digital friends. Until – explains a new podcast – the bots went dark, a user was encouraged to kill Queen Elizabeth II and an update changed everything …
You can run a pretty decent LLM on your home computer and tell it to act however you want. That won't stop it from hallucinating constantly, but it will at least attempt to prioritize truth.
"Attempt" being the key word: once you catch it deliberately trying to lie to you, the trust is surely broken. Otherwise you're having to double- and triple-check (or more) the output, which defeats the purpose for some applications.
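For anyone curious what "tell it to act however you want" looks like in practice: with a local runtime like llama-cpp-python, the steering mostly happens through the system prompt. A minimal sketch, assuming you've installed llama-cpp-python and downloaded a GGUF model file; the model path here is a placeholder, not a real file:

```python
# Minimal sketch: steering a locally run LLM via its system prompt.
# Assumes llama-cpp-python is installed and a quantized GGUF model has
# been downloaded separately (the path below is a placeholder).
from llama_cpp import Llama

llm = Llama(
    model_path="./models/some-7b-model.Q4_K_M.gguf",  # placeholder path
    n_ctx=4096,     # context window size
    verbose=False,
)

# The system prompt is the "act however you want" knob: here we ask it
# to prioritize accuracy and admit uncertainty instead of guessing.
response = llm.create_chat_completion(
    messages=[
        {"role": "system",
         "content": ("You are a careful assistant. Prioritize factual "
                     "accuracy, and say 'I don't know' rather than guess.")},
        {"role": "user",
         "content": "When was the Replika companion app launched?"},
    ],
    temperature=0.2,  # lower temperature reduces creative drift
)
print(response["choices"][0]["message"]["content"])
```

As the reply above points out, none of this prevents hallucination; the system prompt only nudges the model's behavior, so the "say you don't know" instruction is a mitigation, not a guarantee.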