Drivers of automated vehicles are blamed for crashes that they cannot reasonably avoid
People seem to hold the human driver primarily responsible when their partially automated vehicle crashes. But is this reasonable? In a paper recently published in Nature Scientific Reports, researchers from the AiTech Institute investigated the mismatch between the public's attribution of blame and findings from the human factors literature regarding humans' ability to remain vigilant in partially automated driving. Participants in the experiment blamed the driver primarily for crashes, even though they recognized the driver's decreased ability to avoid them.

The public expects drivers to remain vigilant and supervise the automated vehicle at all times, yet we know this is an unreasonable demand for a human driver; even highly trained pilots struggle to supervise autopilot systems for prolonged periods. Drivers lose awareness of what is happening in their surroundings, and they cannot respond as fast as the system requires.

The imbalance between these human-factors challenges to driver ability and the participants' responsibility attributions reveals a culpability gap. In this gap, responsibility is not reasonably distributed over the involved human agents: the driver receives most of the blame, yet this may be unreasonable given their diminished ability to change the outcome.