The human tendency to over-rely on automated systems at the expense of independent judgment.
Automation bias is a cognitive phenomenon in which people disproportionately favor the outputs of automated or AI-driven systems over contradictory information from other sources, including their own reasoning. Rather than treating automated recommendations as one input among many, individuals exhibiting automation bias tend to defer to system outputs uncritically, reducing their own analytical engagement. This effect is compounded in modern AI contexts, where systems can appear highly confident, produce fluent and authoritative-sounding outputs, and operate at speeds that discourage careful human scrutiny.
The mechanism behind automation bias involves both complacency and trust-calibration failures. When automated systems perform well most of the time, users learn to rely on them and gradually reduce their vigilance — a pattern that becomes dangerous precisely in the rare cases where the system errs. In AI-assisted workflows, this can manifest as accepting a model's classification, diagnosis, or recommendation without verifying it against domain knowledge or contextual cues that the model may have missed. The problem is especially acute in high-stakes domains like clinical medicine, aviation, and financial trading, where AI tools are increasingly embedded in decision pipelines.
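One way to guard against uncritical acceptance of model outputs is to route them through an independent domain-knowledge check before they are acted on. The sketch below is purely illustrative: the `Prediction` type, the toy `domain_check` rule, and the `triage` function are hypothetical names, not part of any real system, and a production check would encode far richer domain constraints.

```python
# Hypothetical sketch: instead of accepting a model's output uncritically,
# run it through a simple domain-knowledge rule and escalate conflicts
# to human review. All names and rules here are illustrative.

from dataclasses import dataclass

@dataclass
class Prediction:
    label: str
    confidence: float

def domain_check(patient_age: int, prediction: Prediction) -> bool:
    """Toy domain rule: a 'pediatric_condition' label is implausible for adults."""
    if prediction.label == "pediatric_condition" and patient_age >= 18:
        return False
    return True

def triage(patient_age: int, prediction: Prediction) -> str:
    # Uncritical acceptance would return prediction.label directly,
    # regardless of confidence. Here, a failed domain check routes the
    # case to a human instead of silently deferring to the model.
    if not domain_check(patient_age, prediction):
        return "flag_for_human_review"
    return prediction.label

print(triage(45, Prediction("pediatric_condition", 0.97)))  # flag_for_human_review
print(triage(45, Prediction("adult_condition", 0.80)))      # adult_condition
```

Note that the high model confidence (0.97) is deliberately ignored by the check: the point of the pattern is that a contextual cue the model may have missed overrides fluent, confident-sounding output.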
Automation bias matters deeply for AI deployment and system design. It challenges the assumption that adding an AI assistant always improves human decision-making; in some conditions, AI recommendations actively degrade performance by anchoring users to incorrect outputs. Mitigating automation bias requires deliberate interface design choices — such as withholding AI confidence scores until after a human has formed an initial judgment, or requiring explicit human sign-off on consequential decisions. It also motivates research into appropriate reliance, a growing subfield of human-AI interaction concerned with helping users trust AI systems neither too much nor too little.
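The interface-level mitigations above can be made concrete as a small state machine: the AI recommendation is withheld until the user commits an initial judgment, and the final decision requires an explicit human sign-off. This is a minimal sketch under assumed names; `DecisionSession` and its methods are hypothetical, not an existing API.

```python
# Hypothetical sketch of a decision flow that mitigates automation bias:
# the AI recommendation stays hidden until the user records an initial
# judgment, and every final decision requires explicit human sign-off.

class DecisionSession:
    def __init__(self, ai_recommendation: str):
        self._ai_recommendation = ai_recommendation  # withheld initially
        self.initial_judgment = None
        self.final_decision = None

    def record_initial_judgment(self, judgment: str) -> str:
        """User must commit a judgment first; only then is the AI output revealed."""
        self.initial_judgment = judgment
        return self._ai_recommendation

    def sign_off(self, decision: str) -> str:
        # Enforce the ordering: no sign-off without a prior human judgment.
        if self.initial_judgment is None:
            raise RuntimeError("Initial human judgment required before sign-off.")
        self.final_decision = decision
        return self.final_decision

session = DecisionSession(ai_recommendation="approve")
revealed = session.record_initial_judgment("deny")  # AI output shown only now
final = session.sign_off("deny")                    # user may keep or revise their view
print(revealed, final)  # approve deny
```

Forcing the initial judgment before revealing the recommendation is one way to prevent anchoring on the model's output; the explicit `sign_off` step keeps a human accountable for the consequential decision.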