So given that the output of an LLM is unreliable at best, your plan is to verify that an LLM didn't bullshit you by asking another LLM?

That sounds... counterproductive
