Trust calibration

Term

Trust calibration

Plain-language definition

Matching how much you trust an AI system to what it can actually do and how much is at stake.

Technical definition

The process of aligning a user's reliance on a system with the system's actual reliability, its expressed uncertainty, the context of use, and the consequences of error.
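
In practice, calibration can be framed as a simple decision rule: rely on the system only when its estimated reliability clears a bar set by the stakes. The sketch below makes this concrete; the function name, inputs, and thresholds are illustrative assumptions, not part of any standard.

    # Minimal sketch of calibrated reliance (hypothetical names and thresholds):
    # rely on the system only when estimated reliability clears a bar that
    # rises with the consequences of being wrong.

    def should_rely(estimated_reliability: float, consequence_severity: float) -> bool:
        """Return True when reliance is warranted in this context.

        estimated_reliability: assumed in [0, 1], e.g. from held-out accuracy.
        consequence_severity:  assumed in [0, 1], 0 = trivial, 1 = severe.
        """
        # Higher stakes demand a higher reliability bar: 0.50 up to 0.95.
        threshold = 0.50 + 0.45 * consequence_severity
        return estimated_reliability >= threshold

    # A 90%-reliable system is worth relying on for a low-stakes task...
    assert should_rely(0.90, 0.2)       # threshold 0.59
    # ...but not for a high-stakes one.
    assert not should_rely(0.90, 1.0)   # threshold 0.95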

Evidence status

Established concept

Why it matters

Miscalibrated trust leads to overreliance (accepting incorrect outputs unchecked) or underuse (ignoring a capable system's correct outputs).

Example

A research tool displays an uncertainty indicator when the quality of its supporting sources is weak, prompting the user to verify the answer rather than accept it outright.
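
A hedged sketch of that example, assuming a hypothetical tool that scores its sources from 0 to 1 and appends a warning when the average score falls below an illustrative threshold:

    # Hypothetical illustration of the example above: surface an uncertainty
    # cue when supporting-source quality is weak, so user trust tracks the
    # system's actual reliability.

    def answer_with_trust_cue(answer: str,
                              source_quality_scores: list[float],
                              threshold: float = 0.6) -> str:
        """Append a low-confidence notice when average source quality is weak.

        source_quality_scores: assumed floats in [0, 1]; threshold is illustrative.
        """
        if not source_quality_scores:
            return answer + "\n[No sources found - verify independently.]"
        avg = sum(source_quality_scores) / len(source_quality_scores)
        if avg < threshold:
            return (answer + "\n[Low confidence: source quality is weak "
                    f"(avg {avg:.2f}). Verify before relying on this.]")
        return answer

    print(answer_with_trust_cue("The deadline is March 3.", [0.4, 0.5]))
    # -> The deadline is March 3.
    #    [Low confidence: source quality is weak (avg 0.45). Verify before relying on this.]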

Related terms

Related frameworks

Sources or further reading

Last updated