Self-Driving Cars

When a self-driving car doesn’t know what to do

Joel Johnson has encountered odd situations before while riding in Waymo’s fully autonomous vehicles. Heck, he’s even sought them out.

As a member of the company’s Waymo One ride-hailing program, the Arizona State University student has requested pickups and drop-offs at a local Costco because he knows the parking lot teems with pedestrians. Or rides to In-N-Out Burger because there are traffic cones near the drive-thru that may prove challenging.

Johnson has amassed extensive Waymo One experience, with 146 rides covering 1,111 miles and counting, all documented on his website, jjricks.com. The most notable of his rides without human backup drivers came May 3 during what should have been an ordinary jaunt between two metro Phoenix strip malls. Instead, a construction zone flummoxed his Waymo minivan.

A cascading series of awkward complications followed for more than 16 minutes, with the vehicle stopping and starting its route five times while sporadically blocking traffic and waiting for human help.

The incident raised concerns about Waymo’s self-driving technology, called Waymo Driver, and the humans who are supposed to oversee its operation. From the rear seat, Johnson caught the whole thing on video, providing an unvarnished look at how a leading self-driving tech company handles anomalies in real-world driving.

It comes at a time when Waymo has applied for a permit to commercialize its operations in California, and more broadly, at a time when industry executives are wondering, after a decade’s worth of development, when Waymo will expand its driverless service beyond a geofenced portion of the Phoenix suburbs.

“Waymo has been at this for a long time, and if their technology was reliable and scalable and made financial sense, they should have been able to deploy in a much wider area of Phoenix by now,” said Raj Rajkumar, an electrical and computer engineering professor at Carnegie Mellon University and co-director of the General Motors-Carnegie Mellon Connected and Autonomous Driving Collaborative Research Lab.

“It’s a complex problem to solve,” he said. “But after spending all this money, I’m surprised they are not doing something this fundamental well. That was an eye-opener that they could not complete this route. To me, it implies there are a lot more edge cases they have not handled yet.”

Construction zones are already something of a nemesis for autonomous driving systems. Given the presence of human workers and frequent changes in lane markings and signage, there’s no room for error.

Indeed, in a response to questions about the incident, Waymo underscored the dynamic nature of the construction environment as a key contributor to the unusual scenario encountered on Johnson’s ride.

But that does not necessarily explain why the car first paused. When it approached North Dobson Avenue, there were construction cones on that road between the two travel lanes but no ongoing work in the immediate area. Waymo’s path planning, visible on a touch screen, indicated the vehicle wanted to turn into the right lane, which was closed. Rather than turn into the outer of the two lanes, the car stopped.

Waymo said the cars are programmed, per Arizona state law, to turn into the tightest turn lane. Because the cones blocked the path, the car needed additional guidance.

Sometimes construction cones themselves can be problematic for self-driving systems, says Anuja Sonalker, CEO and founder of Steer, a Maryland startup that makes self-driving and parking systems.

“They’re very interesting objects that can become hard to detect using sensor modalities other than cameras,” she said, noting their conical shape and curved surface can hinder beams from radar, lidar or ultrasonic sensors from reflecting back.

Dedicated machine learning is required to recognize the shape as a traffic cone. In situations where returns from lidar or radar are unclear, as Sonalker described, cameras can serve as the primary sensor, with learned models estimating the position and depth of objects in their field of view. But fusing information from multiple sensors and perspectives achieves better outcomes.

Whether Waymo’s quandary stemmed from adherence to state law, difficulty in perception or path planning, mapping or something else is unclear. Ultimately, Sonalker says, the vehicle did the right thing. “If the automated driver does not know what to do,” she said, “the right answer is to pause.”
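The fusion idea Sonalker describes can be illustrated with a minimal sketch. This is a hypothetical toy model, not Waymo's actual pipeline: the sensor names, weights and confidence scores are assumptions chosen only to show how agreement between a strong camera detection and weak lidar or radar returns can raise overall confidence that an object is a cone.

```python
# Hypothetical sketch of confidence fusion for a traffic-cone detection.
# All sensors and scores are illustrative assumptions, not Waymo's system.

def fuse_detections(scores: dict[str, float]) -> float:
    """Combine independent per-sensor confidences that an object is a cone.

    Treats each sensor's score as an independent probability of a correct
    detection and returns the probability that at least one sensor is
    right: 1 - prod(1 - p_i). A single camera reading passes through
    unchanged; agreeing sensors push the fused score higher.
    """
    miss_all = 1.0
    for p in scores.values():
        miss_all *= 1.0 - p  # probability every sensor missed
    return 1.0 - miss_all

# Camera alone is fairly confident; lidar and radar returns off the
# cone's curved surface are weak, as Sonalker describes.
camera_only = fuse_detections({"camera": 0.80})
all_sensors = fuse_detections({"camera": 0.80, "lidar": 0.30, "radar": 0.15})

assert abs(camera_only - 0.80) < 1e-9
assert all_sensors > camera_only  # fusion raises confidence
```

Real systems fuse far richer signals (geometry, tracking over time, map priors), but the principle is the same: weak evidence from several independent modalities beats any single sensor's view.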

Four minutes after the vehicle paused, things got more interesting.

Johnson chatted with a remote fleet operator, who had dispatched a roadside assistance crew. But before the crew could arrive, the vehicle started moving again and turned the corner. It drove for five seconds before stopping again, this time partially blocking busy North Dobson.

A company spokesperson said the autonomous system itself requested guidance from the remote fleet operator, who can provide information on the surrounding area to help Waymo Driver navigate the problem spot. In this case, the guidance provided was erroneous, “which made it challenging for the Waymo Driver to resume its intended route.”

Getting the roadside assistance team to the vehicle proved to be another stumbling block.

The initial dispatch was canceled after the car started moving. Ninety seconds into the second pause, with other motorists honking as they circumvented the minivan, the remote operator told Johnson, “I don’t even have a roadside assistance assigned right now because the car is no longer stranded.”

Three minutes later, suddenly, the roadside crew’s estimated time of arrival was “right now,” according to the remote operator. But there was no Waymo roadside service in sight. The only crew that appeared was a construction crew, picking up cones along the intended route. About two minutes and 15 seconds passed, and the car suddenly started moving again.

More so than the technical hurdles with the construction zone, the general confusion and lack of accurate communication between Waymo’s remote operator and the company’s roadside team were what surprised Johnson, who alternately appeared giddy and calm throughout a ride that grew more harrowing by the minute.

“They were somehow not communicating exactly, and the car just kept going when it wasn’t supposed to,” he said. “It just seemed like they lost control of the situation. That struck me as particularly odd because usually everything’s very smooth.”

Rajkumar cuts Waymo some slack on the human communications breakdown, if only because the company likely does not have many real-world events in which humans gain valuable experience.

“They probably don’t have much of a chance to assess those capabilities with different scenarios,” he said. “Clearly, more needs to be done.”

The ride continued. As the car traveled along North Dobson, the remote operator queried Johnson. “Are you moving?” she asked, just before the robotaxi came to yet another stop that partially blocked traffic.

At last, Waymo’s roadside assistance crew arrived. But as a crew member approached, the minivan played a cat-and-mouse game and drove away. Another minute and a half passed, and the remote operator sounded incredulous: “The car took off again?”

Finally, after the car’s next unscheduled pit stop, a member of the roadside assistance team managed to open the front door and climb behind the wheel. He asked Johnson whether he was OK after the 16-minute ordeal.

“I live for moments like these,” Johnson replied.

At first, he was exhilarated. Like an art collector coming across a rare painting, Johnson knew the video would be the undisputed standout of his collection, which was already a valuable repository of information. In retrospect, Johnson wonders whether he should have been more worried for his safety. Or whether he should have tried to hop in the driver’s seat and steer the vehicle out of harm’s way.

“I should have been a little more freaked out, but I was mostly thinking of how good the video was going to be,” he said. Indeed, the video had nearly a quarter-million views on YouTube by late last week. “I should have been concerned about the whiplash potential.”

He remains enthusiastic about participating in Waymo One.

Rajkumar expressed admiration for the calm Johnson displayed. He wonders whether Johnson should have exited the vehicle. Earlier in the ride, the remote operator had advised Johnson to remain in his seat with his seat belt fastened. But are there situations where a rider should seek an exit? Or commandeer control and pull an erratic car to a safe stop on the side of the road rather than creating a traffic hazard? There are no surefire answers.

In its statement, Waymo said, “While the situation was not ideal, the Waymo Driver operated the vehicle safely until the Roadside Assistance arrived.”

Exactly what safety risk the incident posed to Johnson or other motorists as the minivan blocked traffic is difficult to quantify. What is reassuring in some sense is that a more dire scenario was avoided.

“In the best of all worlds, it should have pulled over,” Rajkumar said. “In the worst, something bad would have happened. It did not do the best. It did not do the worst. I give them credit for that.”


Source: https://carsrealtime.com/2021/05/29/when-a-self-driving-car-doesnt-know-what-to-do/

Donovan Larsen

Donovan is a columnist and associate editor at the Dark News. He has written on everything from politics to diversity issues in the workplace.
