Blog Ipsa Loquitur

Published under The Digital Age

Madeleine Clare Elish, writing on the future of machine intelligence, argues that future designers of autonomous systems will consult ethnographers. She opens her piece by recounting a taxi driver who drove to the (nonexistent) back entrance of an airport because Google Maps told him to ignore the front gate. Elish’s piece is all about the expectations society places on the human operators of these systems:

In a previously published case study of the history of aviation autopilot litigation, Tim Hwang and I documented a steadfast focus on human responsibility in the arenas of law and popular culture, even while human tasks in the cockpit have been increasingly replaced and structured by automation. Our analysis led us to thinking about the incongruities between control and responsibility and the implications for future regulation and legal liability in intelligent systems. The dilemma, as we saw it, was that as control has become distributed across multiple actors (human and nonhuman), our social and legal conceptions of responsibility have remained generally about an individual.

We developed the term moral crumple zone to describe the result of this ambiguity within systems of distributed control, particularly automated and autonomous systems. Just as the crumple zone in a car is designed to absorb the force of impact in a crash, the human in a highly complex and automated system may become simply a component—accidentally or intentionally—that bears the brunt of the moral and legal responsibilities when the overall system malfunctions.