
Precision and Recall

Two complementary metrics for evaluating information retrieval systems: precision measures the accuracy of the returned results; recall measures their completeness.

Precision = True Positives / (True Positives + False Positives)
Recall = True Positives / (True Positives + False Negatives)

Precision asks: of the items returned, what fraction were actually relevant? Recall asks: of all relevant items that exist, what fraction were returned? There is typically a trade-off between the two: increasing recall (returning more results) tends to reduce precision, while increasing precision (returning only confident matches) tends to reduce recall.
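To make the definitions concrete, here is a minimal Python sketch that computes both metrics from raw counts; the function name and the example numbers are illustrative, not taken from any particular library.

def precision_recall(true_positives, false_positives, false_negatives):
    """Compute precision and recall from raw outcome counts.

    A zero denominator (nothing returned, or nothing relevant exists)
    yields 0.0 for the corresponding metric.
    """
    returned = true_positives + false_positives   # everything the system returned
    relevant = true_positives + false_negatives   # everything it should have returned
    precision = true_positives / returned if returned else 0.0
    recall = true_positives / relevant if relevant else 0.0
    return precision, recall

# Example: a system returns 8 documents, 6 relevant (TP) and 2 irrelevant (FP),
# out of 10 relevant documents in the corpus (so 4 were missed, FN).
p, r = precision_recall(true_positives=6, false_positives=2, false_negatives=4)
print(f"precision = {p:.2f}")  # 6 / 8  -> 0.75
print(f"recall    = {r:.2f}")  # 6 / 10 -> 0.60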

For document intelligence in legal and compliance contexts, the relative importance of precision and recall depends on the use case. In M&A due diligence, missing a change-of-control clause (false negative) may be far more costly than flagging an irrelevant clause for human review (false positive) — suggesting high-recall retrieval is appropriate. In executive summaries, precision matters more — including irrelevant information reduces trust and increases review burden. The F1 score (harmonic mean of precision and recall) provides a single balanced metric when both matter equally.
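Continuing the illustrative numbers above (precision 0.75, recall 0.60), here is a short sketch of the F1 computation; the helper is hypothetical, written out to show the harmonic mean rather than taken from a library.

def f1_score(precision, recall):
    """Harmonic mean of precision and recall; defined as 0.0 when both are zero."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

print(f"F1 = {f1_score(0.75, 0.60):.3f}")  # 2 * 0.45 / 1.35 -> 0.667

Note that the harmonic mean punishes imbalance: a system with precision 1.0 but recall 0.1 scores only about 0.18, where the arithmetic mean would misleadingly report 0.55.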
