As carmakers look to expand the market for autonomous vehicles (AVs), they are hitting a psychological speed bump: People blame AVs more for accidents—even when they are not at fault.
Experiments with more than 5,000 respondents across three studies reveal a common tendency: People imagine what would have happened if a perfect human driver had been behind the wheel instead of artificial intelligence (AI). This “what if” reaction could extend to AI in other sectors as well, says Harvard Business School Assistant Professor Julian De Freitas, and it represents a barrier to broader acceptance and expansion of the technology.
“It's different than when regular systems fail, because AVs are still this unfamiliar and unsettling technology,” says De Freitas, a co-author of the study and director of the Ethical Intelligence Lab at HBS. “And so, when there is an AI failure people fixate on the technology and imagine how things could have turned out differently had a superior driver been present instead. This makes people more likely to view AV makers as liable, raising their legal and financial risk.”
Successful lawsuits and settlements against AV manufacturers can be very costly to firms. They can also drive up insurance premiums and affect whether insurers will cover AV makers in the first place. The promise of AVs, now slowly spreading to cities across the United States, is at stake, raising questions for companies about how to manage risk and win over skeptical users.
De Freitas probes this tension in the article “Public Perception and Autonomous Vehicle Liability,” published online in the Journal of Consumer Psychology in January. He teamed with Xilin Zhou and Margherita Atzei of Swiss Reinsurance, Shoshana Boardman of the University of Oxford, and Luigi Di Lillo of the Massachusetts Institute of Technology.
The Cruise case: a cautionary tale
In 2023, a human-driven vehicle hit a pedestrian who was jaywalking in San Francisco. The pedestrian was launched into the path of an autonomous Chevy Bolt, operated by Cruise (a subsidiary of General Motors).
The autonomous vehicle braked but still hit the woman. After the impact, the car attempted to pull over but dragged her approximately 20 feet before coming to a stop. She survived but was seriously injured.
Although a human driver could not have avoided the accident, Cruise was fined $500,000, lost its license in San Francisco, and eventually shut down its robotaxi business—largely because it failed to fully disclose the car’s role in the dragging incident.
Held to an impossible standard
Yet, might the public have blamed Cruise more than it deserved?
To test how consumers view an AV’s responsibility in accidents like the Cruise incident, where the AV is not at fault, the authors ran three studies in which participants judged liability for crashes involving one at-fault vehicle and one not-at-fault vehicle (without being shown the dragging incident).
Inside the research
De Freitas and his collaborators asked more than 5,000 people recruited online to assess liability after hypothetical accidents in which a human-driven vehicle hits a second vehicle that is not at fault. Each study tested a different dimension.
- Study 1: Who gets blamed when not at fault?
Participants were more likely to support suing the maker of the not-at-fault car if it was an AV (38 on a 100-point scale) than if it was human-driven (31).
- Study 2: The "what if" effect
When asked what could have happened differently, participants mentioned the not-at-fault car more often when it was an AV (43%) than when it was human-driven (14%).
- Study 3: Refocusing blame on the real culprit
When participants were reminded of the at-fault driver’s traffic violation, the disproportionate blame toward the not-at-fault AV disappeared (12 out of 100 for the AV maker versus 11 for the maker of the human-driven car).
De Freitas, Julian, Xilin Zhou, Margherita Atzei, Shoshana Boardman, and Luigi Di Lillo. "Public Perception and Autonomous Vehicle Liability." Journal of Consumer Psychology (forthcoming). (Pre-published online January 12, 2025.)