Saturday, January 27, 2018

Split Responsibility


Please give me your undivided attention. Huh? What does that mean? If the phone rings, don’t answer it? If a text ping tells you someone is trying to reach you, ignore it? If someone rings your doorbell, pretend they aren’t there? We know what happens when attention is divided, as it almost always is, when someone is driving a car. Virtually all traffic laws make it unlawful to text and drive, to hold a mobile phone to your ear as you turn the steering wheel. Notwithstanding the sensors, automatic braking systems, adaptive cruise control and automated warning tones, the number of accidents from “distracted driving” – mostly from drivers engaging with those compelling smartphones – is still rising.

What is distracted driving? Just “device irresponsibility”? How about: Listening to talk radio and getting agitated? Listening to hot music and singing along, or just listening to soft music and relaxing? Sipping coffee or nibbling on a doughnut? Checking/adjusting make-up? Looking at Waze or the car’s navigation system? Luddite map-reading? Arguing with a passenger? Looky-looing? Correcting junior’s misbehavior in the back seat? Using a hands-free Bluetooth cell phone to negotiate a business deal during rush hour?

But as we slowly move to a world where human drivers are being replaced by “Otto Pilot,” where cars of the future might not even have steering wheels, exactly where does the responsibility for safety lie? The computer manufacturer? Still the driver/owner of the car? Or a new insurance schema that makes it a third-party problem? Even if we are able to use this technology vastly to reduce accidents, we know that there will still be accidents. Our entire legal system has yet to adapt to this obvious quagmire. I don’t even want to address what might happen if a malicious power were to hack into the overall system intentionally to cause accidents. And what about that messy transition, likely to last for decades, as overlapping technologies coexist in ugly disharmony?

As states allow driverless cars to ply their streets to develop that technology, we are in a particularly awkward period. Most such “permissions to test” require that a real person be at the controls “just in case.” But is that really an effective practice? Think about the recent crash when such a “manned” Tesla, operating fully automatically, still plowed into a Culver City fire truck on California’s notorious 405 freeway. A human driver sitting in a car on autopilot. Crash!

There are folks who are beginning to question the value of a human back-up during such autopilot tests. The January 27th Los Angeles Times explains: “Researchers with deep experience in human-machine interaction say it’s folly to think that won’t cause problems. Even if the human-robot team-up leads to safer roads on average, plenty of drivers will abuse the relationship, intentionally or not, and events like Monday’s [1/22] crash will make the news.

“‘There’s something we used to call split responsibility,’ said Hod Lipson, director of Columbia University’s Creative Machines Lab. ‘If you give the same responsibility to two people, they each will feel safe to drop the ball. Nobody has to be 100%, and that’s a dangerous thing.’… That’s also true for humans sharing tasks with robots, he said.

“Engineering researchers in the psychology department at the University of Utah are studying whether semiautonomous driving technology will make things better or worse… During the experiments, people are put in semiautonomous driving simulators to measure their reaction times when something goes wrong. When subjects were distracted, average reaction time in the simulator almost doubled, researcher Kelly Funkhouser said…. The longer the subjects remained ‘cognitively disengaged,’ the longer their reaction times got. Some fell asleep.” Hmmm… back to the drawing board. But “it” is still happening. Slowly but surely.

The fact remains that automated driving systems are much more reliable than even sober human drivers. But that has to be of little comfort to someone sitting in a fully automated car that is heading for an obvious accident… when there is no manual override to stop it. That is the fear of lots of people, particularly middle-aged and older, when they are asked to put their trust in something they are used to controlling but over which they no longer have control. As infrastructure crumbles without repair or fails to expand to accommodate greater usage, automation might be the only way to make it all work… so get used to it!

I’m Peter Dekom, and I am one of those folks who likes to drive… so I have to get used to it too!
