AI Resource

Adaptive calibration

Term

Adaptive calibration

Plain-language definition

Adjusting an AI system's behavior in response to feedback, risk, and context.

Technical definition

Methods for adjusting system behavior in relation to intent, uncertainty, trust, risk, and measured outcomes.

Evidence status

Design Principle

Why it matters

Calibration keeps adaptation grounded in human and institutional constraints.

Example

An assistant asks more questions after repeated ambiguous requests.
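The example above can be sketched in code. This is a minimal, hypothetical illustration (the class, window size, and threshold are assumptions, not a real system's API): the assistant tracks how many recent requests were flagged as ambiguous and, once that rate crosses a threshold, switches from answering directly to asking a clarifying question.

```python
from collections import deque

class CalibratedAssistant:
    """Hypothetical sketch of adaptive calibration via an ambiguity window."""

    def __init__(self, window=5, threshold=0.5):
        # Rolling record of recent requests: 1 = ambiguous, 0 = clear.
        self.recent = deque(maxlen=window)
        # Ambiguity rate above which the assistant asks instead of answers.
        self.threshold = threshold

    def handle(self, request, ambiguous):
        self.recent.append(1 if ambiguous else 0)
        rate = sum(self.recent) / len(self.recent)
        if rate > self.threshold:
            # Measured outcome (repeated ambiguity) calibrates behavior:
            # ask a clarifying question rather than guessing.
            return f"Clarify: what do you mean by '{request}'?"
        return f"Answer: {request}"

a = CalibratedAssistant()
print(a.handle("summarize the report", ambiguous=False))  # answers directly
print(a.handle("fix it", ambiguous=True))                 # still answers
print(a.handle("do the thing", ambiguous=True))           # now asks a question
```

The point of the sketch is the feedback loop, not the heuristic: any signal (uncertainty scores, user corrections, risk level) could replace the ambiguity flag, and the threshold itself could be tuned against measured outcomes.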

Related terms

Related frameworks

Sources or further reading

Last updated