Written by Max Zeshut
Founder at Agentmelt
Grounding is the practice of connecting AI model outputs to verifiable, factual sources—documents, databases, APIs, or real-time data—so responses are based on evidence rather than the model's parametric memory alone. It is the primary defense against hallucination. Techniques include RAG (retrieving relevant documents at query time), tool use (querying live APIs), and citation requirements (forcing the model to reference specific sources). Grounded agents are essential in high-stakes domains such as law, healthcare, and finance, where accuracy is non-negotiable.
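The RAG-plus-citation pattern can be sketched in a few lines. This is a minimal illustration, not a production implementation: the document store, the keyword-overlap retriever (a stand-in for real vector search), and the prompt wording are all hypothetical.

```python
# Minimal grounding sketch: retrieve evidence, then build a prompt that
# restricts the model to that evidence and requires a citation.
# DOCS, retrieve(), and grounded_prompt() are illustrative names.

DOCS = {
    "policy.md#refunds": "Refunds are issued within 14 days of purchase.",
    "policy.md#shipping": "Orders ship within 2 business days.",
}

def retrieve(query, docs, k=1):
    """Rank documents by word overlap with the query.

    A toy stand-in for embedding-based retrieval in a real RAG pipeline.
    """
    words = set(query.lower().split())
    scored = sorted(
        docs.items(),
        key=lambda kv: len(words & set(kv[1].lower().split())),
        reverse=True,
    )
    return scored[:k]

def grounded_prompt(query, docs):
    """Assemble a prompt that forces evidence-only answers with citations."""
    evidence = retrieve(query, docs)
    sources = "\n".join(f"[{ref}] {text}" for ref, text in evidence)
    return (
        "Answer using ONLY the sources below. Cite the [ref] you used.\n"
        f"{sources}\n"
        f"Question: {query}"
    )

prompt = grounded_prompt("How long do refunds take?", DOCS)
```

The key design choice is that the retrieved text and its reference travel together into the context window, so the model can quote and cite rather than recall.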
A legal agent asked about a contract clause retrieves the actual clause text from the document, quotes it verbatim, and cites the section number—rather than generating a plausible-sounding but potentially inaccurate paraphrase from memory.
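The legal-agent behavior above can be expressed as a small lookup-and-quote routine. The contract data and function names here are hypothetical; the point is the pattern of returning verbatim text with its section number, and refusing rather than paraphrasing when the source is missing.

```python
# Illustrative clause lookup: quote the source verbatim with its section
# number, and decline to answer from memory when the clause is absent.

CONTRACT = {
    "7.2": "Either party may terminate this Agreement upon thirty (30) "
           "days' written notice to the other party.",
}

def cite_clause(section, contract):
    """Return the exact clause text with its citation, or an explicit refusal."""
    text = contract.get(section)
    if text is None:
        # Grounded behavior: no source, no answer.
        return f"Section {section} not found in the contract; cannot answer."
    return f'Section {section}: "{text}"'

answer = cite_clause("7.2", CONTRACT)
```

Because the quote is copied from the document rather than generated, the output can be checked against the source mechanically, which is what makes grounding auditable.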