BOSTON — Americans’ perception of risk affects how we will adopt and use new technologies. At 9:00 p.m. tonight, PBS’s NOVA will examine the challenges around autonomous vehicles in “Look Who’s Driving.”
Despite improvements in safety measures in conventional vehicles, there are about 35,000 traffic fatalities per year in the U.S., partly due to distracted driving from smartphone usage, according to Mark Rosekind, former administrator of the National Highway Traffic Safety Administration. Most automotive accidents involve human error, and major automakers and technology companies are spending billions of dollars with the promise of improving safety.
At the same time, the development of fully autonomous vehicles has been difficult, with several noteworthy incidents such as the case of an Uber test vehicle killing a woman crossing a road with her bicycle in Arizona last year.
“Three-quarters of Americans are afraid to ride in one,” said Missy Cummings, director of the Humans and Autonomy Lab at Duke University, in “Look Who’s Driving.”
Building trust along with technology
“We had a panel last Friday at the Computer History Museum in Silicon Valley, and one thing that impressed me was how careful the panelists are being,” said Chris Schmidt, co-executive producer of NOVA. “Jesse Levinson of Zoox was eloquent about knowing that the industry needs to build public trust.”
Zoox yesterday raised $200 million and plans to expand development and testing of robotic taxis in Las Vegas.
“Sometimes technologists will try to cram something down the public’s throat or develop technologies without thinking about their social impact,” Schmidt told The Robot Report. “None of the participants gave any pushback or misunderstood where we raised criticisms, and the film was well-received by the audience.”
However, Tesla and Uber declined to participate, and several automakers have “walked back” claims that fully autonomous passenger vehicles would be available by next year, he said.
“Waymo, for example, is not trying to reach Level 5 autonomy; it has said it’s going for Level 4,” said Schmidt. “Waymo did early experiments with Level 2 and 3 autonomy, which it thought were too dangerous.” Chris Gerdes, director of the Center for Automotive Research at Stanford University, mentions the challenge of safely handing off control between human drivers and autonomous systems in “Look Who’s Driving.”
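For readers keeping the levels straight, the short sketch below paraphrases the SAE J3016 driving-automation levels that Schmidt and Gerdes refer to. It is purely illustrative; the one-line summaries are informal paraphrases, not the standard’s official wording.

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """Informal paraphrase of the SAE J3016 driving-automation levels."""
    L0 = 0  # No automation: the human does all driving.
    L1 = 1  # Driver assistance: steering OR speed support (e.g., adaptive cruise).
    L2 = 2  # Partial automation: steering AND speed, but the human must supervise.
    L3 = 3  # Conditional automation: system drives, human must take over on request.
    L4 = 4  # High automation: no human fallback needed, within a limited domain.
    L5 = 5  # Full automation: drives anywhere a human could, with no limits.

# The handoff problem Gerdes describes lives mostly at Levels 2 and 3,
# where the system may suddenly hand control back to an inattentive human.
print(SAELevel.L4 > SAELevel.L2)  # True: Waymo is targeting L4, not L5.
```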
“There is a recognition that cars will have to prove themselves to be safer than humans for people to trust them,” Schmidt noted. “Will we accept fatalities if they’re less than the rate with human drivers?”
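One way to frame Schmidt’s question is as a back-of-the-envelope rate comparison. The sketch below uses the roughly 35,000 annual U.S. fatalities cited above and an assumed figure of about 3.2 trillion vehicle-miles traveled per year (an assumption for illustration, not a number from the film) to show how a “safer than humans” threshold might be expressed.

```python
# Back-of-the-envelope comparison: human vs. hypothetical autonomous fatality rates.
US_FATALITIES_PER_YEAR = 35_000        # approximate figure cited by Rosekind
US_VEHICLE_MILES_PER_YEAR = 3.2e12     # assumed annual vehicle-miles traveled

human_rate = US_FATALITIES_PER_YEAR / (US_VEHICLE_MILES_PER_YEAR / 1e8)
print(f"Human drivers: ~{human_rate:.2f} fatalities per 100M miles")

def is_acceptably_safe(av_fatalities: float, av_miles: float) -> bool:
    """Toy threshold: an AV fleet 'proves itself' only if its rate beats humans'."""
    av_rate = av_fatalities / (av_miles / 1e8)
    return av_rate < human_rate

# Example: a hypothetical fleet with 2 fatalities over 500 million miles.
print(is_acceptably_safe(av_fatalities=2, av_miles=5e8))  # True in this toy case
```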
At the same time, there’s a risk that people will get too comfortable in self-driving cars.
In “Look Who’s Driving,” MIT research subject Taylor Ogan and some early Tesla Autopilot users demonstrate a lack of attention while relying on the driver-assist feature. Joshua Brown died in 2016 and Jeremy Beren Banner died in May when their Teslas each failed to recognize a truck crossing a highway in Florida.

Autonomous vehicle demonstration. Source: PBS
Lidar versus cameras
Tesla co-founder and CEO Elon Musk has stated that cameras are better than lidar for perceiving a vehicle’s surroundings, but the debate continues.
“I had a demo ride in a Waymo car, and it was loaded with sensors — cameras, lidar, microphones, and radar,” noted Schmidt. “It also carried a detailed 3D map of the area and was not relying on just GPS.”
“The pre-map was enormous, maybe a terabyte,” he recalled. “The company had come up with a way to compress the data by turning the map into a text-based encoded file format. That shows me that these systems are still brittle. How will a driverless car recognize a stop sign covered in snow?”
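Schmidt’s description of turning a huge pre-map into “a text-based encoded file format” is only a broad outline, so the snippet below is just a guess at the general idea: serialize map features to compact text and compress them. The field names, coordinates, and format are invented for illustration and do not reflect Waymo’s actual pipeline.

```python
import gzip
import json

# Hypothetical map features: stop signs, lane geometry, etc., as coordinate lists.
map_tile = {
    "tile_id": "phoenix_0421",
    "stop_signs": [{"lat": 33.4484, "lon": -112.0740, "heading_deg": 90}],
    "lane_centerlines": [[[33.4480, -112.0745], [33.4481, -112.0744]]],
}

# Step 1: encode the structure as plain text (JSON here, purely illustrative).
text = json.dumps(map_tile, separators=(",", ":"))

# Step 2: compress the text. A real map tile, full of repeated keys and nearby
# coordinates, compresses far better than this toy example can show.
compressed = gzip.compress(text.encode("utf-8"))
print(len(text), "bytes as text ->", len(compressed), "bytes compressed")

# Round trip to confirm nothing was lost.
assert json.loads(gzip.decompress(compressed).decode("utf-8")) == map_tile
```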
“Real-time mapping — like what Zoox does, focusing on tracking dynamic objects — makes more sense than trying to pre-map everything,” added Schmidt. “On the other hand, if vehicles are trained in a specific, limited environment, they could reach a high level of safety.”
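As a rough illustration of what “tracking dynamic objects” in real time involves, the following sketch associates detections from one sensor frame to the next with a greedy nearest-neighbor match. It is a generic textbook approach, not Zoox’s system, and every number in it is made up.

```python
import math

def associate(prev_tracks: dict, detections: list, max_dist: float = 2.0) -> dict:
    """Greedy nearest-neighbor association of new detections to existing tracks.

    prev_tracks: {track_id: (x, y)} positions from the last frame.
    detections:  [(x, y), ...] positions seen in the current frame.
    Returns an updated {track_id: (x, y)} map; unmatched detections get new IDs.
    """
    next_id = max(prev_tracks, default=-1) + 1
    updated = {}
    unmatched = list(prev_tracks.items())
    for det in detections:
        best = min(unmatched, key=lambda t: math.dist(t[1], det), default=None)
        if best is not None and math.dist(best[1], det) <= max_dist:
            updated[best[0]] = det        # same object, new position
            unmatched.remove(best)
        else:
            updated[next_id] = det        # a newly appeared object
            next_id += 1
    return updated

tracks = {0: (10.0, 5.0), 1: (2.0, 3.0)}               # e.g., a cyclist and a pedestrian
tracks = associate(tracks, [(10.5, 5.2), (7.0, 7.0)])  # cyclist moved; new object appears
print(tracks)  # {0: (10.5, 5.2), 2: (7.0, 7.0)}
```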
This explains why several companies, such as Optimus Ride and Perrone Robotics, are working on shuttles for more controlled environments like retirement communities, limited routes, and corporate and college campuses, as mentioned by Daniela Rus, director of the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL), in the NOVA episode.
In addition, connected vehicles could share data. “If a Waymo car sees something unexpected, it can communicate ‘back to the ranch,’ but driverless cars are not yet connected with one another,” Schmidt said. Connected cars might also reduce traffic, Gerdes noted.
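The “back to the ranch” idea amounts to a fleet-telemetry message. Below is a hypothetical sketch of what such a report might contain; the field names and the endpoint are invented for illustration and do not correspond to any real fleet API.

```python
import json
import time

def build_anomaly_report(vehicle_id: str, lat: float, lon: float, note: str) -> str:
    """Package an unexpected-scene observation for upload to a fleet server."""
    report = {
        "vehicle_id": vehicle_id,
        "timestamp": time.time(),
        "position": {"lat": lat, "lon": lon},
        "note": note,                      # e.g., "stop sign occluded by snow"
        "needs_remote_review": True,       # flag for a human operator to inspect
    }
    return json.dumps(report)

payload = build_anomaly_report("av-017", 37.4275, -122.1697, "unmapped construction zone")
# In a real fleet this would go out over a cellular link, for example:
#   requests.post("https://fleet.example.com/reports", data=payload)  # hypothetical URL
print(payload)
```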
‘Look Who’s Driving’ examines machine learning
“Driving is the most complex activity that adults on the planet regularly engage with,” said Raj Rajkumar, a professor at Carnegie Mellon University (CMU). “That’s a high bar for technology to overcome.”
“Look Who’s Driving” describes the three areas of autonomous vehicle research and development since the DARPA Grand Challenge in 2004: perception, understanding, and planning.
“Of the three, understanding is arguably the hardest. That’s where edge cases and brittleness show up,” said Schmidt. “Planning is only as good as understanding.”
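The perception, understanding, and planning split that the film and Schmidt describe can be pictured as three stages feeding one another. The sketch below is only a schematic of that pipeline; the function bodies are stand-ins, not a real autonomy stack.

```python
from typing import Dict, List

def perceive(sensor_frame: Dict) -> List[Dict]:
    """Perception: turn raw sensor data (camera, lidar, radar) into detected objects."""
    return sensor_frame.get("detections", [])

def understand(objects: List[Dict]) -> Dict:
    """Understanding: infer what the detected objects are and what they may do next.
    This is the stage where the edge cases and brittleness Schmidt mentions show up."""
    return {"pedestrian_may_cross": any(o["type"] == "pedestrian" for o in objects)}

def plan(scene: Dict) -> str:
    """Planning: choose an action. It can only be as good as the scene understanding."""
    return "slow_down" if scene["pedestrian_may_cross"] else "maintain_speed"

frame = {"detections": [{"type": "pedestrian", "position": (12.0, 1.5)}]}
print(plan(understand(perceive(frame))))  # -> "slow_down"
```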
NOVA notes that much of the training of these algorithms involves human annotation of images. “This cuts out a key limitation of traditional, hand-coded, rule-based programming,” Schmidt said. “It reduces the problem of a situation or condition the programmer didn’t anticipate. Machine learning developers are at least starting from the assumption that there are too many variables for a human to encode from the top down.”
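To make the point about human annotation concrete, here is a minimal, hypothetical example of the workflow NOVA describes: labeled examples go in, and a learned classifier comes out, rather than a programmer writing explicit rules. The feature values and labels are invented, and scikit-learn is used purely for illustration.

```python
from sklearn.linear_model import LogisticRegression

# Human-annotated examples: crude image features paired with labels people assigned.
# The features are made up, e.g., [red_fraction, octagonality, height_ratio].
annotations = [
    ([0.85, 0.90, 0.30], "stop_sign"),
    ([0.10, 0.05, 0.95], "pedestrian"),
    ([0.80, 0.88, 0.28], "stop_sign"),
    ([0.15, 0.10, 0.90], "pedestrian"),
]
X = [features for features, _ in annotations]
y = [label for _, label in annotations]

# The "program" is learned from the human labels rather than written as explicit rules.
model = LogisticRegression().fit(X, y)
print(model.predict([[0.82, 0.87, 0.31]]))  # expected: ['stop_sign']
```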
Watching for blind spots
“There’s still a danger of blind spots, like bias in facial recognition, which could be amplified if inputs are affected by human bias,” he added. “Sometimes, engineering self-selects for people who are highly analytical and deductive but not as good at understanding social nuance or environments.”
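A simple way to watch for the kind of blind spot Schmidt describes is to break model performance out by subgroup instead of reporting a single aggregate number. The sketch below is generic and hypothetical, not tied to any particular perception system.

```python
from collections import defaultdict

def accuracy_by_group(records: list) -> dict:
    """Compute accuracy per subgroup from (group, predicted, actual) records.

    A model that looks fine in aggregate can still fail badly for one group,
    which is exactly the blind spot a single overall metric hides.
    """
    correct, total = defaultdict(int), defaultdict(int)
    for group, predicted, actual in records:
        total[group] += 1
        correct[group] += int(predicted == actual)
    return {g: correct[g] / total[g] for g in total}

# Hypothetical evaluation records: (lighting condition, prediction, ground truth).
records = [
    ("daylight", "pedestrian", "pedestrian"),
    ("daylight", "cyclist", "cyclist"),
    ("night", "clear_road", "pedestrian"),   # missed detection at night
    ("night", "pedestrian", "pedestrian"),
]
print(accuracy_by_group(records))  # {'daylight': 1.0, 'night': 0.5}
```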
“It would be good if self-driving companies had a staff philosopher or ethicist in addition to training in simulation,” said Schmidt. “The ‘trolley problem’ was beyond the scope of this film, but it has gotten more attention than other issues, such as privacy and security. [Rethink Robotics Inc. and Robust AI founder] Rodney Brooks asked, ‘When did a human driver ever have to solve this?'”
“At the panel last week, an 8-year-old girl asked an interesting question: ‘If I’m riding in one of these cars, how can I make it honk?'” he said. “When might you honk your horn? If a driver in the next lane swerves. It’s not necessarily a situation that driverless cars would know how to avoid, and developers have to think of how they’ll communicate with human drivers.”
“If bad actors know a self-driving car will stop if they jump in front of it, could they use that to take control?” Schmidt asked. “There are also questions around privately owned cars versus fleets. If you summon a sports car, it might not drive the same as other models.”
Bullish on autonomy
“The fact that so much money is flowing into this technology reflects the optimism of the researchers, who are knocking down problems one by one,” said Schmidt. “While Level 5 autonomy won’t arrive anytime soon, as Martial Hebert [dean of CMU’s School of Computer Science] notes, it’s a question of when and where it will be deployed.”
Last week, PBS also broadcast a “Lunch Hour Live” conversation with Senior Correspondent Miles O’Brien, MIT researcher Bobbie Seppelt, and Perceptive Automata Chief Technology Officer Sam Anthony to discuss the issues raised in the NOVA piece.
“Look Who’s Driving” airs at 9:00 p.m. tonight on PBS.