Term
Adaptive calibration
Plain-language definition
Tuning an AI system's behavior in response to feedback, risk, and context.
Technical definition
Methods for adjusting a system's behavior in response to user intent, uncertainty estimates, trust signals, risk levels, and measured outcomes.
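A minimal Python sketch of one such adjustment: a single calibration step that moves a confidence threshold toward a target success rate, weighted by risk. The function name, parameters, and update rule are illustrative assumptions, not a standard API.
```python
def calibrate_threshold(threshold: float,
                        success_rate: float,
                        risk_weight: float,
                        lr: float = 0.05,
                        target: float = 0.95) -> float:
    """One hypothetical calibration step: raise the confidence threshold
    when measured outcomes fall short of the target, scaled by risk."""
    error = target - success_rate          # positive if underperforming
    threshold += lr * risk_weight * error  # higher risk -> larger adjustment
    return min(max(threshold, 0.0), 1.0)   # keep the threshold within [0, 1]

# Usage: a high-risk context with an 88% success rate raises the bar.
print(calibrate_threshold(0.70, success_rate=0.88, risk_weight=2.0))  # ~0.707
```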
Evidence status
Design principle
Why it matters
Calibration keeps an adaptive system's behavior grounded in human intent and institutional constraints, rather than letting it drift on feedback alone.
Example
An assistant asks more clarifying questions after repeated ambiguous requests.
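A toy Python sketch of that behavior, assuming a hypothetical ClarifyingAssistant class and a sliding window of ambiguity judgments; the thresholds and window size are illustrative, not from any particular system.
```python
from collections import deque

class ClarifyingAssistant:
    """Toy assistant that calibrates how many clarifying questions
    it asks based on how often recent requests were ambiguous."""

    def __init__(self, window: int = 10):
        # Sliding window of booleans: was each recent request ambiguous?
        self.history: deque[bool] = deque(maxlen=window)

    def record(self, was_ambiguous: bool) -> None:
        self.history.append(was_ambiguous)

    def questions_to_ask(self) -> int:
        # More ambiguity in the recent window -> more clarifying questions.
        if not self.history:
            return 0
        ambiguity_rate = sum(self.history) / len(self.history)
        if ambiguity_rate > 0.5:
            return 3   # high ambiguity: probe thoroughly before acting
        if ambiguity_rate > 0.2:
            return 1   # moderate ambiguity: ask one clarifying question
        return 0       # low ambiguity: act directly

# Usage: after several ambiguous requests, the assistant asks more.
assistant = ClarifyingAssistant()
for ambiguous in [True, True, False, True]:
    assistant.record(ambiguous)
print(assistant.questions_to_ask())  # -> 3 (ambiguity rate 0.75)
```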
Related terms
Related frameworks
Sources or further reading
- NIST AI Risk Management Framework
- NIST AI RMF Playbook
- UNESCO Recommendation on the Ethics of Artificial Intelligence
- Google Search Central: helpful, reliable, people-first content
- Schema.org DefinedTerm
