
Why People Blame Self-Driving Cars More Than Human Drivers

In a crisis, are people too quick to assume that technology is the cause? Research by Julian De Freitas examines the psychology driving the debate about autonomous vehicles.


As carmakers look to expand the market for autonomous vehicles (AVs), they are hitting a psychological speed bump: People blame AVs more for accidents—even when they are not at fault.

Experiments with more than 5,000 respondents across three studies reveal a common tendency: People imagine what would have happened if a perfect human driver had been behind the wheel instead of artificial intelligence (AI). This “what if” reaction could extend to AI in other sectors as well, says Harvard Business School Assistant Professor Julian De Freitas, and it represents a barrier to broader acceptance and expansion of the technology.

“It's different than when regular systems fail, because AVs are still this unfamiliar and unsettling technology,” says De Freitas, a co-author of the study and director of the Ethical Intelligence Lab at HBS. “And so, when there is an AI failure people fixate on the technology and imagine how things could have turned out differently had a superior driver been present instead. This makes people more likely to view AV makers as liable, raising their legal and financial risk.”


Successful lawsuits and settlements against AV manufacturers can be very costly to firms. They can also increase insurance premiums and even determine whether insurers will cover these firms in the first place. The promise of AVs, now slowly spreading to cities across the United States, is at stake, raising questions for companies about how to manage risk and win over skeptical users.

De Freitas probes this tension in the article “Public Perception and Autonomous Vehicle Liability,” published online by the Journal of Consumer Psychology in January. He teamed with Xilin Zhou and Margherita Atzei of Swiss Reinsurance, Shoshana Boardman of the University of Oxford, and Luigi Di Lillo of the Massachusetts Institute of Technology.

The Cruise case: a cautionary tale

In 2023, a human-driven vehicle hit a pedestrian who was jaywalking in San Francisco. The pedestrian was launched into the path of an autonomous Chevy Bolt, operated by Cruise (a subsidiary of General Motors).

The autonomous vehicle braked but still hit the woman. After the impact, the car attempted to pull over but dragged her approximately 20 feet before coming to a stop. She survived but was seriously injured.

Although a human driver could not have avoided the accident, Cruise was fined $500,000, lost its license in San Francisco, and eventually shut down its robotaxi business—largely because it failed to fully disclose the car’s role in the dragging incident.

Held to an impossible standard

Yet, might the public have blamed Cruise more than it deserved?

To test how consumers view an AV’s responsibility in accidents like the Cruise incident, where the AV is not at fault, the authors ran three studies in which participants assessed liability for crashes where one vehicle was at fault and the other was not (participants were not shown the dragging incident).

Inside the research

De Freitas and his collaborators asked more than 5,000 people recruited online to assess liability after hypothetical accidents in which a human-driven vehicle hits a second vehicle that’s not at fault. Each study tested a different dimension.

  • Study 1: Who gets blamed when not at fault?

    Participants were more likely to support suing the maker of the not-at-fault car if it was an AV (38 on a 100-point scale) versus human-driven (31).

  • Study 2: The "what if" effect

    When asked to imagine how the accident could have turned out differently, participants mentioned the not-at-fault car more often when it was an AV (43%) than when it was human-driven (14%).

  • Study 3: Refocusing blame on the real culprit

    When reminded of the at-fault driver’s traffic violation, disproportionate blame toward the not-at-fault AV disappeared (12 out of 100 for the AV maker versus 11 for the human-driven carmaker).

Participants tended to view the automaker of the not-at-fault vehicle as more responsible when the vehicle was self-driving than when it was human-driven, even though in both cases the crash was unavoidable.

Why? People fixate on the AV itself as an “abnormal” presence, which makes them more likely to imagine alternative scenarios in which another driver had been present instead. And they don’t imagine just any driver, but a “perfect” one. When the researchers asked participants to complete sentences about how things could have turned out differently, they gave answers like “If only there were a person in the car, they would have been able to swerve to avoid being collided into,” even though the accidents were constructed so that such split-second maneuvers would have been impossible.

De Freitas explains: "In effect, these vehicles end up being held to a higher standard than human drivers, because people compare them to an imagined ideal that surpasses what a human is capable of in the same situation.”

Managing risks for AV companies

If the public unduly blames AV companies for accidents they did not cause, insurance premiums for AV providers could rise, the research suggests. Some companies might then have to choose between passing insurance costs on to consumers and reducing their liability coverage, exposing themselves to tremendous risk in the event of a technology failure.

“As we saw with the Cruise incident, that would be unwise, because even when it is not at fault, an accident could be potentially devastating for a company’s survival,” De Freitas says.

In fact, the risks De Freitas and his team uncovered might be magnified in practice, for at least three reasons:

  1. Liability in some US states is uncapped, so juries may award damages beyond actual costs if AVs are unfairly blamed.

  2. Even if they are only partially liable, AV companies are seen as deep-pocketed and well insured, making them attractive targets for lawsuits.

  3. In some states, laws may force AV providers to pay full damages if other parties in the accident lack full insurance coverage.

How to build trust

One upside from the research: The more people trusted AVs, the less likely they were to view them as liable. For these people, AVs may seem less abnormal in the first place, making them less likely to imagine alternative “perfect” scenarios.

A later experiment showed that highlighting the at-fault driver’s responsibility lessened the tendency to fault the AI. Once attention was steered away from the vehicle’s novelty, people were less prone to exaggerating hypothetical risks.

De Freitas says automakers and other companies expanding in AI need to:

  • Anticipate tech failures. “Be aware that these systems can fail, and when they do, you're potentially looking at more risk than usual.”

  • Earn consumers’ trust so that the tech seems less abnormal and unsettling. “Explain what you're doing, why it makes sense, why it's reasonable, and what you're continuously doing to improve and be transparent, especially when it comes to safety.” AI-based systems will need to “feel more familiar, like a common part of our surroundings.”

  • Rule out counterfactuals. The public will look for what could have happened, even under impossible scenarios. Companies and their legal teams will need to counter such theories with facts, pointing out what AVs can and cannot do, and how this limits which counterfactuals are plausible.

  • Steer attention away from the technology. Focus the public’s attention away from the tech and toward the true sources of fault. Still, redirecting attention is not the same as concealing facts—companies must remain transparent about AI’s role in any accidents.




De Freitas, Julian, Xilin Zhou, Margherita Atzei, Shoshana Boardman, and Luigi Di Lillo. "Public Perception and Autonomous Vehicle Liability." Journal of Consumer Psychology (forthcoming). (Pre-published online January 12, 2025.)
