Developers building RAG applications want to overcome the inherent problem of hallucinations in LLMs. In this talk, I'll explain why hallucinations occur, how RAG helps mitigate them, and how the HHEM (Hallucination Evaluation Model) can help detect and measure them.