Humans are terrible at taking back control from autonomous cars: when we’re authorized to not pay attention, we don’t, and shifting back is not so easy

One of the big safeguards for autonomous cars is a human backup driver. In most self-driving car trials, a human sits behind the wheel, ready to take back control if the car freaks out and doesn’t know what to do. But in practice, things don’t quite work out like that. Humans, says a new study from Stanford, are terrible at handling the sudden context change required to take over in an instant.

“Many people have been doing research on paying attention and situation awareness. That’s very important,” lead author Holly Russell told the Stanford University News. “But, in addition, there is this physical change and we need to acknowledge that people’s performance might not be at its peak if they haven’t actively been participating in the driving.”

Holly Russell, former graduate student in the Dynamic Design Lab, is lead author of a new study on the handover of control from an autonomous car to a human driver. She is shown in Stanford’s X1 experimental vehicle. [Photo: Steve Castillo]

Russell, of Stanford’s Dynamic Design Lab, sent participants around a 15-second course in Stanford’s autonomous test vehicle. Each participant drove a loop, then let the car drive a loop, four times in total. Then the test began: the drivers drove the course 10 more times, only now, when they took back control, the steering had been modified to simulate the changes in sensitivity that occur at different speeds. For instance, you might haul the wheel around to make a small adjustment while parking, but if you tried that at highway speeds you’d end up flipping onto the grass verge.

Drivers were warned before each change, but even then they had trouble readjusting. It took a few moments of wobbling to get accustomed to the new steering sensitivity. “Even knowing about the change, being able to make a plan and do some explicit motor planning for how to compensate,” said co-author Lane Harbott, “you still saw a very different steering behavior and compromised performance.”

If you’ve ever taken the wheel for a moment from the passenger seat, you’ll know how odd it is to be thrown suddenly into a new context while trying to drive. The big catch here is that an autonomous car is most likely to hand back control in an emergency situation, which is precisely when you don’t have a spare few moments to dither.

And this was just steering. In a real car on a real road, there are many more variables that would need to be assessed on takeover. Also remember that this was a controlled test, with the human drivers paying attention the whole time. Out on the real streets, human drivers already ignore the road ahead while they’re in charge of a vehicle, preferring to send a text or otherwise endanger the life of everyone around them. What will happen when the car drives itself? Do we really think that requiring a car to hand back control to a human driver in an emergency will work out well? What if the “driver” is pouring a coffee, or totally engrossed in a book? In these cases, even a confused car will do better than a surprised and panicked human.

It’s likely that human takeover will still be a requirement, if only because the alternative—full autonomous control with no backup—seems so scary. But while self-driving cars will get better and better as they log more miles, the handover will always be jeopardized by one weak element: the human driver. It’s impossible to predict what a person will do when tossed into the middle of a potential crash situation, but if we continue to insist on believing that people are capable of driving safely, we will have a lot of work to do.

“If someone is designing a method for automated vehicle handover, there will need to be detailed research on that specific method,” says Harbott. “This study is the tip of an iceberg.”