You might not know it, but chances are you’ve seen some of Joel Johnson’s work. Better known as YouTuber JJRicks, he is the most prolific documenter of Waymo One Level 4 robotaxis in Chandler, Arizona – and probably the most frequent passenger.
According to his records, he’s taken 146 rides over the last two years, both of the safety-driver and fully driverless variety. He’s had various disengagements along the way and has even required Waymo’s roadside assistance, meaning a human Waymo employee had to manually take over because the vehicle couldn’t navigate a given situation.
But the Waymo robotaxi ride he took on May 3 is one he’ll never forget. You can watch the full ride in JJRicks’ video above, but the interesting parts start around the 11-minute mark. The Waymo robotaxi needed to make a right turn onto a multi-lane main road, but the far right lane was closed off by orange construction cones. And, boy, did they confuse the vehicle.
When the planning system couldn’t figure out how to handle the situation, it called for roadside assistance. A human was supposed to arrive in mere minutes to get the car unstuck. However, before the assistance arrived, the Waymo robotaxi pulled out into the road, only to immediately stop again, this time blocking traffic. In fact, the Waymo robotaxi got stuck and took off again two more times before the roadside assistance employee could actually get into the vehicle, take over control and complete the ride.
As you’ll hear in the video, JJRicks asked Waymo how the roadside assistance program works. Waymo said it doesn’t assign its team members to individual vehicles; rather, they patrol a particular area to cover multiple vehicles. The dispatcher said one-to-one assignment never existed, but JJRicks disputed that.
Waymo issued the following statement about the incident:
“While driving fully autonomously through an extended work zone, the Waymo Driver detected an unusual situation and requested the attention of a remote Fleet Response specialist to provide additional information. During that interaction the Fleet Response team provided incorrect guidance, which made it challenging for the Waymo Driver to resume its intended route, and required Waymo’s Roadside Assistance team to complete the trip. While the situation was not ideal, the Waymo Driver operated the vehicle safely until Roadside Assistance arrived. Throughout, Waymo’s team was in touch with the rider, who provided thoughtful and helpful feedback that allows us to continue learning and improving the Waymo Driver. Our team has already assessed the event and improved our operational process.”
According to his spreadsheet, JJRicks has taken seven disengagement-free Waymo One rides since the incident. “These cars have almost never made me actually scared that they’re going to do something dangerous, hard to shake my trust at this point,” he said. “I knew I’d make it out in one piece.”
I’m a big fan of what Waymo is doing. That’s why it made our 2020 RBR50 list. We share this not to criticize, but because the video highlights the challenges of scaling autonomous vehicles. Every company developing autonomous vehicles has had similar incidents, and others have had far worse. An Uber autonomous vehicle killed a woman in 2018. And Tesla, well, where do we even begin with its blatant disregard for safety in autonomous vehicles?
Waymo is in a different stratosphere than the two aforementioned companies. It’s rightfully considered the leader in the industry. But we don’t often see its mishaps on camera. And this one, which was the result of a simple change to road conditions, shows Level 5 autonomous vehicles remain a long, long way off. The Society of Automotive Engineers International recently changed the details of its six-level classification of autonomous driving capability. It defines Level 5, in part, as a system that can “drive everywhere in all conditions.”
Waymo has been mapping and operating in Chandler for years. It knows these roads better than any other roads in the world, yet it struggled with a simple, everyday lane closure. After all these years, navigating around traffic cones seems like it should be an easy maneuver. Not to mention, there were kinks communicating with the roadside assistance team, although those should be easier to work out than the edge cases the autonomous driving system will encounter. The Waymo One service operates in an area of about 80 square miles in Arizona, according to Waymo.
IEEE Spectrum recently interviewed Nathaniel Fairfield, who leads the behavior team at Waymo. He said some interesting things about the challenges of construction zones and when human assistance is needed. Fairfield said humans do not teleoperate the Level 4 Waymo One vehicles, and JJRicks’ video seems to confirm that statement, although we don’t know for sure.
“Imagine you’re out driving and you come up to a ‘road closed’ sign ahead,” said Fairfield. “You may pause for a bit as you look for a ‘Detour’ sign to show you how to get around it or, if you don’t see that, start preparing to turn around from that road and create your own detour or new route. The Waymo Driver does the same thing as it evaluates how to plot the best path forward. In a case like this where the road is fully blocked, it can call on our Fleet Response specialists to provide advice on what route might be better or more efficient and then take that input, combine it with the information it has from the onboard map and what it’s seeing in real time via the sensors, and choose the best way to proceed.”
Three of Waymo’s top executives have recently left the company, including former CEO John Krafcik. But the Alphabet subsidiary is moving forward. Earlier this week Reuters reported that Waymo applied for permits to start charging for rides and deliveries in San Francisco using its autonomous vehicles. Cruise submitted a similar permit application, but neither company has said when it intends to launch these services.
Chris says
Isn’t this why Tesla is choosing a different route? Some might argue it’s not quite so clear who is in the lead when it comes to level 5 autonomy.
Jason says
Waymo being in first is ignorant reporting at best if not outright biased reporting. Safety records per mile driven are not even in the same league.
Altino Cunha says
This reporter is an idiot… and if you don’t believe it, just wait 1 or 2 more years and everybody will see who will deliver full SELF driving…
Steve Crowe says
Altino, thanks for the kind words! I’ll reach out in 2 years for another friendly discussion!
Michael says
Hahaha excellent response Steve
Bobby Brown says
Musk fanboys don’t like it when you don’t worship their god.
They believe there is some magic happening at Tesla and have already factored this into their valuation.
There will be a rude awakening.
Ean says
Waymo is going for the 5-year solution. Tesla is going for the hail mary. I would put my money on Waymo. Waymo stores maps, and this offloads tonnes of compute.
Oh.sir says
The con, Ean, is that maps change so often that relying on them is a perfect recipe for disaster.
This whole article is about what a few unexpected cones can do in an 80 sq mile region Waymo operates in.
Their solution does not scale. What happens when they deploy this to more cities with more cars?
Ean says
This offloads tonnes of compute.
Soccerguy says
After seeing this video I will vigorously oppose expansion of self driving taxis. Endangering and inconveniencing the public for their beta test and taking jobs away.
Matt says
Unfortunately waymo and most of the other autonomous systems are built on a “think about how they would build this at Disney world and scale it to the whole world” mindset. Disney controls the construction and knows when those cones are showing up and can update the code, the auto makers can’t. You can’t map a road to the centimeter and expect level 5. Level 5 requires it to be able to handle new roads, not just old, static roads.
Robert says
Waymo’s approach of relying on pre-mapped roads and lidar is doomed. It will NEVER scale. It can’t adapt to changing conditions. Only Tesla and Comma.ai are aiming to solve for vision (cameras only, no pre-loaded maps, just real-time recognition of the road and all of the obstacles around it). A Tesla would have easily gone right around the cones, as is evidenced by the countless examples on YouTube.
CJ says
Wait, so Tesla gets flak for having something called FSD, while Waymo calls their vehicles Drivers and no one finds that weird or annoying to read/write? How did you write this article and not question who you were talking about when referencing Waymo the company, the vehicle, the remote technician/driver, or the real driver who came to help?
Also, haven’t all the Tesla accidents been caused by drivers doing something illegal, dangerous, or by not paying attention (as they are supposed to!). That’s like blaming a kid at a swimming pool for drowning when the lifeguard is asleep…
Michael says
So, in your analogy, Tesla FSD is a drowning kid in a swimming pool?
Dan says
I think Cj is saying Tesla FSD is a kid in a swimming pool, the driver is the lifeguard, and a Tesla accident is the kid drowning.
lubomir Firko says
AI is the future. Computers have no emotions; they don’t get frustrated and don’t get into road rage. Google’s Waymo is the future, because people are dumb
Paul A Winston says
It will never work. Waste of money. Besides there are organizations ready to tear them up and hack into the computer system
William Thompson says
Visual neural networks will have to come up with better solutions at Tesla, right? But isn’t that the challenge? The solution is really simple.
The article itself describes the options: drive around the problem, or reroute, or follow detour signs, or some combination. But what if the car itself breaks down? Someone will have to have it towed and another car will have to be sent, just as we do now. The trick is to virtually eliminate the likelihood of such events to a level statistically far less obstructive, less dangerous and less frequent than human drivers over the same number of miles traveled.
Captain says
Reminds me of a south park episode. Taking our jobs.
Deez nuts says
LOL my FSD tesla would’ve just gone into the other lane! 😂
Bobby Brown says
There is no autonomous driving with Tesla. It requires that a driver be in control.
Autonomous driving is something different. That’s what Waymo and others do, not Tesla.
Bobby Brown says
You’re confusing driver assistance and autonomous driving.
Waymo has autonomous driving you can use and pay for today.
Tesla has driver assistance, where the driver is responsible for steering the car.
Those are two different things.
Matt says
The biggest difference is that one is closed circuit (waymo), the other is not. I call it the “Disney version of self driving” because you can’t leave the premises. I can’t ask waymo to take me into any area that they haven’t mapped to the centimeter. It’s like you drove into the flat earther version of the world and hit the edge.
Al says
Funny thing is that most people that have a negative view on Tesla don’t regularly drive one.
Joel A Davis says
The fact that at no point could they remotely take control and stop the vehicle is terrifying
Evfunction says
AI has struggled to get anywhere near the capability of the human brain and body. Waymo and other solutions will, imho, never get there. If you paid for FSD at Tesla, get your money back now.
CLARA says
The assistant’s comment was interesting. He said he didn’t know why they took the cones off the map. So, am I to understand that if a construction cone is not represented on the car’s HD map, the car can’t deal with it?
April J. says
Great article! Very well researched and beautifully written! I am the mom of “JJRicks”… (Ricks is actually his middle name). Joel convinced me to ride along with him in a Waymo car in October 2020, when they first started operating without a safety driver… I couldn’t believe it! Since then, I’ve also joined the Waymo One program and taken 10+ rides… I can tell you that without a shadow of a doubt, I’m not worried about safety. Yes, it pulled out in traffic, and that was worrisome, but Waymo cares more about safety than the inventor of the parachute… and I don’t worry about him when he’s riding in Waymo cars… so I am Joel’s biggest fan as he catalogs one of the most impressive technology advances of our day! I still get excited to ride in a Waymo and I’m thankful we live in Chandler so we can get a front row view of all the incredible growth they are making!
Seth H says
Why are people making this into a Waymo vs Tesla issue? How many of you actually studied AI and artificial neural networks for the last 20 years? I’m sure not many, yet you talk as if you know the first thing about it but with a clear bias!
Given that I actually did study and receive my degree with a concentration in AI two decades ago, I think I might be qualified to inform you all that NOBODY is anywhere close to full autonomy on a massive scale. One thing I can say is that at least Waymo doesn’t outright lie to you like Elon does. But it seems like many of you just eat up those lies for some reason. Tesla’s approach does have merit, except that you’d need a system capable of full human-level intelligence, at which point, who even cares about a self-driving vehicle?
My feeling on this is the same as it was 20 years ago when AI wasn’t really a ‘thing’. Companies continually try to fly before they can crawl when it comes to AI research, which is what has held it back for the last half century or so. This is just another example of too many research resources being misdirected. What exactly do you think will happen if you reach a critical density of these vehicles on the road, where one vehicle getting confused and stuck leads to others getting confused and stuck with humans in the middle of it? It’s the ultimate gridlock scenario!
These things are a minimum of 10 years off, and that’s IF the necessary fundamental research, which isn’t being done now, actually gets done. If I hear about “edge cases” one more time I’m going to blow my lid! These are NOT edge cases; the AI is incapable of understanding human behavior and communication, including non-verbal communication like gestures and cones in the road. You can try to program this in, but it will never cover enough scenarios because you cannot hard-code human language. Much more fundamental research is needed where there’s less of an immediate payoff, but companies won’t put enough funding there. Governments and institutions used to fund such research for the good of everyone, but we’ve clearly lost our way in that regard for the sake of instant gratification.
B bold says
Well said sir! I think I’m more intrigued by your comment than the article it was written for. I feel that corporations have lost their way when it comes to keeping humanity’s best interest in mind. No one seems to care what the bill costs for that instant gratification you speak of. Just look around. Self-driving vehicles are popping up everywhere even though we have yet to master the technology. It’s no longer about being innovative; instead it’s a race for who can beat whom to market and gain the biggest market cap. It’s disgusting.