Term
Trust calibration
Plain-language definition
Matching trust to actual capability and risk.
Technical definition
The process of aligning user reliance with system reliability, uncertainty, context, and consequences.
Evidence status
Established concept
Why it matters
Miscalibrated trust leads to overreliance (accepting flawed outputs) or underuse (dismissing reliable ones).
Example
A tool displays an uncertainty indicator when source quality is weak, cueing users to verify before relying on the answer.
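A minimal sketch of that example follows. Everything in it is illustrative: the `Answer` record, the `model_confidence` and `source_quality` scores, and the thresholds are assumptions for demonstration, not part of any cited framework; a real system would derive such signals from calibration data.

```python
from dataclasses import dataclass


@dataclass
class Answer:
    text: str
    model_confidence: float  # hypothetical model self-estimate in [0, 1]
    source_quality: float    # hypothetical retrieval-quality score in [0, 1]


def reliance_cue(answer: Answer) -> str:
    """Map confidence and source quality to a user-facing reliance cue."""
    # Use the weaker of the two signals, so a confident answer built on
    # weak sources still prompts the user to verify.
    score = min(answer.model_confidence, answer.source_quality)
    if score >= 0.8:
        return "High confidence: sources are strong."
    if score >= 0.5:
        return "Moderate confidence: consider checking the cited sources."
    return "Low confidence: source quality is weak; verify before relying on this."


# A confident answer built on weak sources still gets the low-trust cue.
print(reliance_cue(Answer("example answer", model_confidence=0.9, source_quality=0.4)))
```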
Related terms
Related frameworks
Sources or further reading
- NIST AI Risk Management Framework
- NIST AI RMF Playbook
- UNESCO Recommendation on the Ethics of Artificial Intelligence
- Google Search Central: helpful, reliable, people-first content
- Schema.org DefinedTerm
