Complacency and Automation Bias in the Use of Imperfect Automation
In: Human Factors: The Journal of the Human Factors and Ergonomics Society, Vol. 57, No. 5, pp. 728-739
ISSN: 1547-8181
Objective: We examine the effects of two different kinds of decision-aiding automation errors on human-automation interaction (HAI), occurring at the first failure following repeated exposure to correctly functioning automation. The two errors are incorrect advice, triggering the automation bias, and missing advice, reflecting complacency.

Background: Contrasts between analogous automation errors in alerting systems, rather than decision aiding, have revealed that alerting false alarms are more problematic to HAI than alerting misses are. Prior research in decision aiding, although contrasting the two aiding errors (incorrect vs. missing), has confounded error expectancy.

Method: Participants performed an environmental process control simulation with and without decision aiding. For those with the aid, automation dependence was created through several trials of perfect aiding performance, and an unexpected automation error was then imposed in which the automation was either gone (one group) or wrong (a second group). A control group received no automation support.

Results: The correct aid supported faster and more accurate diagnosis and lower workload. The aid failure degraded all three variables, but "automation wrong" had a much greater effect on accuracy, reflecting the automation bias, than did "automation gone," reflecting the impact of complacency. Some complacency was also manifested in the automation-gone condition, reflected in longer latency and a more modest reduction in accuracy.

Conclusions: Automation wrong, creating the automation bias, appears to be a more problematic form of automation error than automation gone, reflecting complacency.

Implications: Decision-aiding automation should indicate its lower degree of confidence in uncertain environments to avoid the automation bias.