
Self-Driving Systems Aren’t Sentient (Yet). That Could Be an Accident Waiting to Happen
In our current climate, rapid innovation is a key ingredient for success; if you can’t keep up with the competition, you’re done. However, this “move fast and break things” ethos has unintended consequences—particularly among startups that are developing self-driving cars.
It’s an area of white-hot debate within the automotive industry, and we asked Mary “Missy” Cummings, a professor of engineering and computer science and a former National Highway Traffic Safety Administration (NHTSA) adviser, to demystify it in a recent interview. She has proved a controversial (and useful) figure within the automotive space for being so outspoken about the industry’s pursuit of Advanced Driver Assistance Systems (ADAS) at all costs.
“Elon Musk isn’t the only one that has been promising that self-driving is just around the corner.”
Cummings is in an interesting position, as her viewpoint on autonomous systems (driverless cars, in particular) is often wildly misunderstood. She began her career in the aviation industry and very much supported that industry’s move toward autonomy, with a safe and calculated approach, and the same can be said for her stance on ADAS. Here, we discuss her views on how the automotive industry can change its game plan to get these systems across the line in a more realistic and safe manner.
Why Are Things Not Working Out?
The current self-driving arms race began back in 2015, when Tesla’s Autopilot system first became available to Model S drivers. Elon Musk would go on to promise Level 5 full self-driving, in which the vehicle drives itself without human interaction, by 2017. Clearly, that hasn’t happened, and self-driving technology has since produced more questions than answers: Will these systems ever be able to “outsmart” human drivers? How do they co-exist with pedestrians and other road users? Can we trust these systems?
Before we continue untangling the moral maze of self-driving systems, we’d like to note that these issues aren’t unique to Tesla and other EV startups; Ford’s BlueCruise and Cadillac’s Super Cruise are in the same boat, to name two.
Tesla CEO Elon Musk attends the start of production at Tesla’s “Gigafactory” in Germany in March 2022.
While many automakers are very engineering-focused in their aspirations to create self-driving cars, Cummings sees herself as an advocate for making sure these systems are safe.
“That’s the problem with machine learning … when you take a machine-learning algorithm and apply it to a million images, it’s looking for pixelated statistical correlations between those pixels and the image,” Cummings tells Popular Mechanics. “And you don’t actually know if the pixels follow the shape of a stop sign … or are they finding another statistical pattern that it could see as a stop sign.”
Many new vehicles now feature Level 2 ADAS, which can steer, accelerate, and brake on their own, but all of them still require an attentive human behind the wheel. That’s all well and good, but an issue called “phantom braking,” in which a vehicle makes an emergency stop for an obstacle that isn’t there, has emerged. “We don’t know why the computer vision systems detect obstacles that the human eye cannot see,” Cummings explains. It remains a sharp rock in the shoe of developing a safe and effective self-driving system.
“It’s a core problem in artificial intelligence,” says Cummings. “We don’t have a model … We don’t have a way to find these erroneous associations and correlations inside that data.” While we have next to no idea why phantom braking happens, we do know quite a lot about how these systems approach driving an automobile.
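To make her stop-sign point concrete, consider a toy experiment. The sketch below is our illustration, not code from Cummings or any automaker: a tiny classifier is trained on synthetic 4×4 “images” in which an incidental bright corner pixel always co-occurs with the “sign.” At test time, the artifact alone can trigger a detection while the real shape without it gets missed, exactly the kind of erroneous correlation that’s hard to find inside a trained model.

```python
# Toy "shortcut learning" demo (our illustration, not any automaker's code).
# The true feature is a bright 2x2 center block (the "sign"); a single,
# brighter corner pixel is an incidental artifact that always co-occurs
# with the sign in training.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_image(has_sign: bool, has_artifact: bool) -> np.ndarray:
    img = rng.normal(0.0, 0.1, (4, 4))  # background noise
    if has_sign:
        img[1:3, 1:3] += 1.0            # the actual "stop sign"
    if has_artifact:
        img[0, 0] += 3.0                # the spurious cue
    return img.ravel()                  # flatten to 16 pixel features

# Training set: artifact and sign are perfectly correlated.
y = np.array([0, 1] * 200)
X = np.array([make_image(bool(label), bool(label)) for label in y])
clf = LogisticRegression().fit(X, y)

# Test set decouples the cues: the classifier tends to follow the artifact.
artifact_only = make_image(False, True).reshape(1, -1)
sign_only = make_image(True, False).reshape(1, -1)
print("artifact, no sign ->", clf.predict(artifact_only)[0])  # typically 1
print("sign, no artifact ->", clf.predict(sign_only)[0])      # typically 0
```

Scaled up to millions of real images and millions of parameters, the same failure mode becomes far harder to detect, let alone explain.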
How Do Driverless Systems, Well, Drive?
Cummings has developed her own framework, known as the skills, rules, knowledge, expertise (SRKE) model, to decode how autonomous systems and humans make decisions within their environment. She likens the idea to a set of stairs, with a full understanding of how to perform a task at the very top. Here’s how that model applies to autonomous driving systems.
→ Skills
Think about when you were first learning how to drive: the literal skill itself is a fundamental building block toward mastery. “When you learn to drive a car, you need to learn the skills of how to stay between the two white lines on the road,” says Cummings. However, while keeping your vehicle between the lanes is vitally important, it’s merely a surface-level skill when it comes to driving an automobile.
→ Rules
Continuing with our learning-to-drive analogy, this is the stage where you can keep your vehicle between the lines without ping-ponging between them. With that rapidly becoming second nature, you free up the mental bandwidth to focus on the actual rules of the road; sure, there were quite a few to learn while you were nodding off in driver’s ed, but most of them are probably automatic by now.
→ Knowledge
The knowledge facet of Cummings’ model involves judgment under uncertainty. For autonomous vehicles, these are situations like a soccer ball rolling into the street: the car might just see a soccer ball, but we know in the back of our minds that someone is likely chasing after it.
“Your imagination, your ability to conceive of all of these potential probabilities is something that computers cannot do if they have never actually seen that event before,” says Cummings. Plain and simple, these systems haven’t achieved sentience yet, meaning they aren’t really able to make their own decisions. These machine-learning algorithms aren’t thinking through decisions as a human being would; they’re merely pattern-matching against the data they’ve been supplied, which is just an amalgamation of images.
Humans, by contrast, have an incredible ability to make sense of a massive amount of information very quickly, thanks in part to the brain’s amygdala. A recent NPR interview with behavioral and data scientist Pragya Agarwal notes that the human brain can process roughly 11 million bits of information every second. Consciously parsing all of it at once would be impossible, so our gray matter makes sense of the flood by matching it against pre-existing experiences, templates, and stereotypes.
→ Expertise
“Expertise,” or expert-based reasoning, describes how a human being or machine reacts to a situation it has never experienced before: the ability to think on its feet. During our interview, Cummings mentioned the case of Sully Sullenberger’s 2009 emergency landing in the Hudson River to explain the final facet of her model. “He had a lot of skills, and rules, and knowledge … and was able to reason under massive uncertainty to find a solution that would at least save the lives of all those people,” she says.
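As a rough gloss on the framework (our own sketch, not Cummings’ code, with made-up scenario names), you can picture the four rungs as a ladder that a pattern-matching system climbs only partway: it handles the skills and rules it has been trained on, but has no response for the situations above them.

```python
# A schematic gloss on the SRKE ladder (our illustration, with hypothetical
# scenario names): a system that only pattern-matches handles the lower
# rungs it was trained on and stalls at the upper ones.
from enum import IntEnum

class Rung(IntEnum):
    SKILLS = 1      # stay between the lines: continuous control
    RULES = 2       # stop at a red light: explicit procedures
    KNOWLEDGE = 3   # ball rolls into street: judgment under uncertainty
    EXPERTISE = 4   # Hudson landing: reasoning through the never-seen

# Everything the system has "seen" in training (hypothetical entries).
TRAINED_RESPONSES = {
    "lane_drift": (Rung.SKILLS, "steer back toward lane center"),
    "red_light": (Rung.RULES, "brake smoothly to a stop"),
}

def respond(scenario: str) -> str:
    """A pure pattern-matcher: no match in the data, no response."""
    if scenario in TRAINED_RESPONSES:
        rung, action = TRAINED_RESPONSES[scenario]
        return f"{rung.name}: {action}"
    # A human would imagine the child chasing the ball; this system can't.
    return "no learned response (knowledge/expertise gap)"

print(respond("red_light"))               # RULES: brake smoothly to a stop
print(respond("ball_rolls_into_street"))  # no learned response ...
```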
So, What’s the Problem?
Cummings brings up the fact that autonomous systems in the aviation world are more widespread than you might think. “In airplanes today, the pilots are not allowed to fly the aircraft for really more than a handful of minutes,” she says. “Planes that are flown autonomously fly much more smoothly, save a lot of gas, and save on tires during landings.” Yes, modern airliners can land themselves now.
While it’s well-documented that the autonomous systems in modern airliners are incredibly safe, autonomous driving presents a whole new set of challenges. Critically, things can go badly much more quickly while driving on the highway compared to flying in the air. “Even if you have the wing fall off the aircraft, you have minutes to figure things out,” says Cummings. Meanwhile, if a car gets spun around in front of you on the freeway, you have seconds to avoid making yourself a part of the accident.
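That seconds-versus-minutes gap is easy to quantify with some back-of-the-envelope numbers (ours, for illustration, not from the interview):

```python
# Back-of-the-envelope reaction-time math (illustrative numbers only).
speed_mph = 70.0
speed_mps = speed_mph * 0.44704      # mph to m/s: about 31.3 m/s
gap_m = 60.0                         # spun-out car roughly 60 m (~200 ft) ahead

time_to_impact = gap_m / speed_mps   # about 1.9 seconds
perception_reaction = 1.5            # commonly cited driver reaction time, s

print(f"time to impact:      {time_to_impact:.1f} s")
print(f"time left to steer:  {time_to_impact - perception_reaction:.1f} s")
```

That leaves well under half a second to actually do something, and that’s assuming the driver was watching the road in the first place.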
That’s not to mention that distracted driving is already a real issue, one that could be exacerbated by cars that can mostly drive themselves. A driver who has tuned out won’t be able to react in time at the critical moment when they need to retake control of the vehicle; they’ll just keep hurtling toward a hideous accident.
She gives the example of someone eating a meal who drops a french fry into the deep chasm between the seat and the center console. If they assume the car can drive itself, they’ll try to retrieve their starchy snack, but what are they to do if the vehicle fails to navigate an upcoming corner on its own? Without time to get back to the steering wheel, there’s a good chance they’ll drive off the road, or even worse, across the median toward oncoming traffic.
Cummings likes to refer to this disconnect as “capability confusion.” Take Tesla’s Full Self-Driving (FSD) system, for example, where drivers are likely to assume that FSD can drive the car sans human input; let’s be honest, it’s not called half-self-driving. Thankfully, other mainstream automakers have been much more reserved in naming their autonomous driving systems. BlueCruise and Super Cruise are much more transparent about their capabilities, and those systems can only be used on select sections of highway, not on city streets.
What Is LIDAR and Is It a Fix-All?
Light Detection and Ranging (LIDAR) is an advanced laser imaging system that really deserves its own article. “Computer vision has a lot of problems, and so we want to use some other kind of sensor system to either provide a second opinion or maybe even fuse the data for a richer world model,” says Cummings. Many automakers see LIDAR as the missing piece of the puzzle that will unlock the next level of self-driving capability, but Cummings says it’s not the magic fix-all it was claimed to be. “I do think that these systems can improve some of these problems … it will improve the phantom-braking issue, but it won’t eradicate it,” she says.
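The “second opinion” idea can be sketched in a few lines. The snippet below is a deliberately simplified fusion rule of our own devising, not any production stack (those use far more sophisticated probabilistic methods): requiring two sensors to agree suppresses single-sensor ghosts, but it can’t help when both modalities are fooled at once, which is why fusion reduces phantom braking without eradicating it.

```python
# Simplified "second opinion" sensor fusion (illustrative only): require
# two independent sensors to agree before commanding an emergency stop.
from dataclasses import dataclass

@dataclass
class Detection:
    obstacle: bool     # does this sensor report an obstacle ahead?
    confidence: float  # the sensor's own confidence, 0..1

def should_emergency_brake(camera: Detection, lidar: Detection,
                           threshold: float = 0.8) -> bool:
    """Brake hard only when both sensors agree with high confidence.
    This filters out single-sensor ghosts (one source of phantom braking)
    but fails when both modalities are fooled at once, e.g. a sheen-covered
    puddle that confuses lidar and camera alike."""
    camera_sure = camera.obstacle and camera.confidence >= threshold
    lidar_sure = lidar.obstacle and lidar.confidence >= threshold
    return camera_sure and lidar_sure

# A camera-only ghost no longer triggers a highway emergency stop:
print(should_emergency_brake(Detection(True, 0.95), Detection(False, 0.9)))  # False
print(should_emergency_brake(Detection(True, 0.95), Detection(True, 0.9)))   # True
```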
An electric Jaguar I-Pace car outfitted with Waymo full self-driving technology in Santa Monica, California, in February 2023.
One of the issues with LIDAR is that it only really works in ideal conditions. “Turns out, they don’t work very well with moisture in the air,” says Cummings. “Rain … misty rain is a problem … even after it rains and there are puddles on the road.” Puddles are problematic because the sheen on their surface is essentially invisible to LIDAR; Cummings says these systems can’t decipher whether a puddle is an inch deep or a mile deep.
So there’s still quite a lot of work to be done before we know whether these imaging systems can be integrated to improve the capabilities of ADAS features.
Where Do We Go From Here?
“Elon Musk isn’t the only one that has been promising that self-driving is just around the corner,” says Cummings.
Borrowing from her aviation background, Cummings says that the automotive industry really just needs to start taking systems engineering seriously. “They need to get knee-deep in the testing, they need better testing, more often testing … they need to do a lot more track testing and real-world testing.” It’s no surprise that getting this right will take time and money, which quite a few automakers don’t have.
Even without the resources to develop better testing procedures, Cummings says that automakers can still make an effort to embrace a real safety culture. The exciting prospect of being the first to make a big breakthrough in self-driving technology shouldn’t come at the cost of human life.
“Regulations are not a bad thing, and if you work with them, these companies would have a better time with things,” says Cummings. Despite the dissonance between the startups (e.g., Tesla, Rivian, Lucid) and the well-established automakers (Ford, GMC, Volkswagen, etc.) entering the EV space, Cummings says that the best way forward is working together.
Matt Crisara is a native Austinite who has an unbridled passion for cars and motorsports, both foreign and domestic, and as the Autos Editor for Popular Mechanics, he writes the majority of automotive coverage across digital and print. He was previously a contributing writer for Motor1 following internships at Circuit Of The Americas F1 Track and Speed City, an Austin radio broadcaster focused on the world of motor racing. He earned a bachelor’s degree from the University of Arizona School of Journalism, where he raced mountain bikes with the University Club Team. When he isn’t working, he enjoys sim-racing, FPV drones, and the great outdoors.