A Twitter friend posted this on a blustery Wisconsin morning:

Driverless cars

His car wasn’t being driven by a robot, but the tweet highlights some issues. Driverless cars use sensors—radar, cameras, and lasers—to detect obstacles and road edges. The radar sensors and cameras were iced over. If he had been in a driverless car, would it have had to pull over and wait until the storm passed?

Even if the sensors had been working, a Business Insider article pointed out that snow and heavy rain can confound them because they need to detect lane markers to keep the car out of trouble. This problem might be overcome with high-resolution three-dimensional maps that depict not only the road but also signs and other landmarks.
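To make that failure mode concrete, here is a minimal sketch of the kind of degraded-mode logic the iced-over-sensor scenario implies. The sensor names, confidence scores, and thresholds are purely hypothetical—they are not drawn from any real self-driving system—but the idea is simple: if perception is degraded and a high-resolution map is available, fall back to map-based localization; if not, pull over and wait.

```python
# Hypothetical sketch of degraded-mode handling for an autonomous car.
# Sensor names, confidence values, and the threshold are illustrative
# assumptions, not taken from any actual autonomous-driving stack.

from dataclasses import dataclass

@dataclass
class SensorStatus:
    name: str
    confidence: float  # 0.0 (blocked or iced over) to 1.0 (clear)

MIN_CONFIDENCE = 0.6  # assumed cutoff below which a sensor is unusable

def plan_action(sensors: list[SensorStatus], hd_map_available: bool) -> str:
    """Decide how the vehicle should proceed given sensor health.

    If lane-marker perception is degraded (e.g., by snow or ice) but a
    high-resolution 3D map is available, fall back to map-based
    localization. Otherwise, pull over and wait out the storm.
    """
    degraded = [s for s in sensors if s.confidence < MIN_CONFIDENCE]
    if not degraded:
        return "drive normally using live sensor data"
    if hd_map_available:
        return "fall back to HD-map localization and reduce speed"
    return "pull over and wait for conditions to improve"

# Example: radar and cameras iced over, lidar partly clear, no map loaded.
sensors = [
    SensorStatus("radar", 0.1),
    SensorStatus("camera", 0.2),
    SensorStatus("lidar", 0.7),
]
print(plan_action(sensors, hd_map_available=False))
# -> pull over and wait for conditions to improve
```

In other words, under these assumptions the iced-over car in the tweet would indeed have had to sit out the storm.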

Driverless cars also have difficulty with situations where speed is a factor, such as merging onto a highway. The cues a human driver uses to infer how traffic will behave are not picked up by robots.

A different problem with speed occurred on a race track in Buenos Aires. Two driverless cars were competing in a “Roborace” when one of them missed a curve and crashed.

According to a BBC article, Roborace’s chief marketing officer put a positive “spin” [pun intended] on the crash: “It’s actually fantastic for us because the more we see these moments the more we are able to learn and understand what was the thinking behind the computer and its data.

“The car was damaged, for sure, but it can be repaired. And the beauty is no drivers get harmed because… there is no-one in them.”

One more real issue with self-driving cars was highlighted in another Business Insider piece: the Trolley problem.

“The Trolley problem goes like this: a runaway trolley is barreling toward five people on a track who cannot move. But you have the option to pull a lever and send it to a side track where you see one person standing. What would you do?

“But as [MIT associate professor Iyad] Rahwan puts it, the Trolley problem gets thornier when considering self-driving cars. The first scenario puts the ethical burden on a person. But if a self-driving car is in a lose-lose situation where it must make a choice, we’re asking a robot in our everyday environment to make the call.”

It’s a challenge for sure. The BI article discussed some differing takes on the subject from personnel involved with the Mercedes-Benz driverless car project. One manager said the computer would always favor saving the driver, but another was a bit vague, saying, “For Daimler it is clear that neither programmers nor automated systems are entitled to weigh the value of human lives. There is no instance in which we’ve made a decision in favor of vehicle occupants. We continue to adhere to the principle of providing the highest possible level of safety for all road users.”

Here’s a question I have wondered about. My driverless car decides to take out a pedestrian instead of hitting a school bus. Who gets sued—me or the car’s manufacturer?

Finally, a report surfaced this week about a 2015 incident in which a “rogue robot” apparently entered a workspace where it didn’t belong and fatally crushed a female worker in a Michigan auto parts factory.

So are you ready to have an autonomous robot perform your gallbladder surgery? I’m not.

Skeptical Scalpel is a retired surgeon and was a surgical department chairman and residency program director for many years. He is board-certified in general surgery and a surgical sub-specialty and has re-certified in both several times. For the last six years, he has been blogging at SkepticalScalpel.blogspot.com and tweeting as @SkepticScalpel. His blog has had more than 2,500,000 page views, and he has over 15,500 followers on Twitter.
