
Self-driving cars targeted by police
On a Friday morning last November, before dawn, a Tesla Model S was traveling on Highway 101 between San Francisco International Airport and Palo Alto with a California Highway Patrol cruiser following close behind. The gray Tesla was doing 70 miles per hour and cruised past multiple exits. When the car finally came to a stop, officers walked up to find the driver slumped with his head drooping; not even the lights and sirens could wake him. Police concluded the car had likely been driving itself under the control of Tesla’s Autopilot technology.
The company says every Tesla it builds ships with hardware capable of driving the car autonomously for an entire trip, without input from a human driver. For now, though, Tesla limits the feature to a system that guides the car from on-ramp to off-ramp on highways. The system, it turns out, is smart enough to keep a vehicle safe even when the driver appears incapacitated, but not smart enough to recognize a siren and pull over in response.
This appears to be the first time law enforcement has stopped a self-driving car on the open road. The police had no way to take control of the driving software, so they improvised around Tesla’s safety features: one patrol car blocked traffic behind the Tesla while another pulled in front of it and gradually slowed down until the Tesla came to a stop.
The incident captures the duality of driverless technology: a promising future on one side, deep concerns on the other. The driver, a 45-year-old California man who failed a field sobriety test, was charged with DUI, according to police. The car traveled roughly 10 kilometers along the highway at night with no one in control, so self-driving technology quite possibly prevented an accident this time. But neither the police nor Tesla expects drivers to lean on the technology in such a hair-raising way, at least not yet.
According to Tesla’s disclaimer, drivers should remain “alert and proactive” when using Autopilot, and should be prepared to take back control of the vehicle if, for example, police approach. If the car does not sense a hand on the steering wheel, it is designed to slow to a stop and turn on its hazard lights. Two days after the incident, Tesla CEO Elon Musk tweeted that he was “investigating this matter.” A company spokesman, meanwhile, declined to share any information gleaned from the car’s data logs.
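Tesla has not described how this fallback is implemented. Purely to make the described escalation concrete, here is a minimal sketch in Python of the behavior outlined above; the thresholds, class, and method names are all assumptions, not Tesla’s code.

```python
# Minimal sketch of the hands-on-wheel fallback described above. The thresholds,
# class, and method names are assumptions for illustration, not Tesla's code.

WARN_AFTER_S = 15.0   # assumed: warn the driver if no steering input is sensed
STOP_AFTER_S = 60.0   # assumed: begin a gradual stop if there is still no response

class StubVehicle:
    """Stand-in for a real vehicle control interface."""
    def show_warning(self, msg: str) -> None:
        print(f"WARNING: {msg}")

    def turn_on_hazard_lights(self) -> None:
        print("Hazard lights on")

    def decelerate_to_stop(self) -> None:
        print("Slowing gently to a stop")

def hands_off_fallback(seconds_without_hands: float, vehicle: StubVehicle) -> None:
    """Escalate from a warning to a controlled stop as hands-off time grows."""
    if seconds_without_hands >= STOP_AFTER_S:
        vehicle.turn_on_hazard_lights()
        vehicle.decelerate_to_stop()
    elif seconds_without_hands >= WARN_AFTER_S:
        vehicle.show_warning("Hold the steering wheel")

hands_off_fallback(75.0, StubVehicle())  # -> hazard lights on, slow to a stop
```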
When will it be possible for a driver to fall asleep in the car and still arrive safely at the destination? “My guess is maybe by the end of next year,” Musk said in a video interview released on Monday.
The officers who stopped the Tesla that night were breaking new ground. Their training does not cover stopping a self-driving car; they simply happened to know enough about Teslas to adapt. “The police did a great job this time,” said Saul Jaeger of the nearby Mountain View Police Department. The stop happened in the heart of Silicon Valley, with the car pulling over roughly between Facebook’s and Google’s headquarters, so perhaps it is no surprise that the officers there knew how to bring the car to a halt. But counting on law enforcement to improvise around self-driving vehicles is hardly a dependable plan.
Until automakers, engineers, lawmakers, and police work through a host of thorny issues, robotic vehicles cannot take to the roads in earnest. The open questions are many: How can police pull over a self-driving car? How should an autonomous driving system respond after a collision? How should a driving system be designed so that it recognizes and defers to human authority?
Five years ago, MIT robotics expert John Leonard began recording dashcam footage while driving around Boston, hoping to catalogue the situations that are hard for intelligent driving systems to handle. One night he watched a police officer walk into a busy intersection, hold back oncoming traffic, and wave pedestrians across against a red light. In his view, that is exactly the kind of situation a machine struggles to grasp.
Of all the challenges facing self-driving technology, scenarios like this are the most troublesome, Leonard said, and they are why the era of truly driverless cars will “come more slowly than many in the industry predict.” His view carries weight as an industry insider: after leaving MIT in 2016, Leonard joined the Toyota Research Institute, where he helps lead the company’s research and development of autonomous driving technology.
Waymo LLC, the self-driving company owned by Alphabet (Google’s parent), which currently carries passengers in the Phoenix, Arizona area, has already run into the problem Leonard describes. In January, one of the sensor-laden Chrysler Pacifica vans in Waymo’s autonomous ride-hailing fleet approached a darkened traffic light in Tempe, Arizona. The power was out, and police officers were standing in the road directing traffic. In the dashcam footage and computer-rendered imagery Waymo provided, the Pacifica can be seen waiting at the intersection for oncoming traffic and for left-turning vehicles from the other direction, then continuing when the police wave it through.
Waymo spokesperson Alexis Georgeson said the company’s vehicles can distinguish between civilians and police officers standing in the road and can track hand gestures. “Once the vehicle recognizes the police, it yields and responds accordingly,” she said. “Our vehicles navigate construction areas very well and respond to uniformed officers.”
Waymo is taking a territorial approach to self-driving cars, focusing on taxi-fleet service within limited areas rather than chasing fully autonomous driving everywhere (the industry classifies such go-anywhere capability as Level 5, the most advanced tier, and one that is not yet achievable). Working within a bounded area makes it possible to build detailed maps and easier to coordinate with government and law enforcement. Rather than try to span different government jurisdictions, Waymo chose Chandler, a suburb of Phoenix, as its first experimental area; Chandler has wide streets, sunny weather, and a welcoming local government. Many rivals have taken a similar approach, building autonomous fleets in specific areas: Ford is conducting trials in Miami and Washington, and dozens of companies are testing self-driving cars on the road in California, including GM’s Cruise, Zoox, and Toyota.
In the summer of 2017, about a year and a half before Waymo launched its ride-hailing service in Chandler, the company invited local police, fire, and ambulance crews to a test on a closed road: fire trucks, ambulances, and patrol cars, sirens blaring and lights flashing, approached a driverless car from every angle. “We have a lot of interaction with the people at Waymo when it comes to researching and developing technology,” city spokesman Matt Burdick said.
Last year, Waymo became the first self-driving developer to publish a law enforcement interaction protocol. If one of its cars detects a police vehicle behind it with lights flashing, the document says, it will “stop when it finds a safe place.”
Jeff West, chief of the Chandler Fire Department, said that from what he has observed on the road, Waymo’s vehicles yield faster than many human-driven cars. “Once it recognized us, it pulled over; some human drivers who are listening to the radio or fiddling with the air conditioning don’t respond as quickly,” he said.
For now, however, most Waymo cabs carry a human driver who can take over in any situation that stumps the self-driving system. Matt Burdick said there have not yet been any run-ins between local police and the driverless cars. If one did occur, police could reach the company’s support team by calling a 24-hour hotline or pressing the help button above the second row of seats, said Matthew Schwall, Waymo’s field safety director. Remote staff cannot drive the vehicle directly, but they can reroute it, for example instructing it to pull to the side of the road at police direction after a collision.
Last summer, Michigan state trooper Ken Monroe took Ford engineers for a ride around Flint. The engineers were particularly curious about how officers want a vehicle to behave when a patrol car appears behind it with lights flashing. How should a self-driving car tell whether the police are signaling it to pull over or simply want to pass?
The engineers explained to Monroe in detail how a self-driving car can tell whether police want it to pull over; the most straightforward criterion is how long the police car has been following the self-driving car.
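That cue can be illustrated with a small heuristic. The sketch below is only a toy rendering of the idea, with invented thresholds and names rather than Ford’s actual logic: sustained following with lights on is read as a pull-over request, while a brief approach is read as a request to yield.

```python
from dataclasses import dataclass

# Toy version of the "how long has the patrol car been behind us" heuristic.
# Thresholds and names are invented for illustration; this is not Ford's logic.

PULL_OVER_AFTER_S = 8.0   # assumed: sustained following implies a stop request
YIELD_WITHIN_S = 3.0      # assumed: a brief approach implies "let me pass"

@dataclass
class EmergencyVehicleTrack:
    behind_us: bool           # detected in our lane, to our rear
    lights_on: bool
    seconds_following: float  # time continuously tracked behind us

def classify_intent(track: EmergencyVehicleTrack) -> str:
    """Return a coarse guess at the officer's intent."""
    if not (track.behind_us and track.lights_on):
        return "ignore"
    if track.seconds_following >= PULL_OVER_AFTER_S:
        return "pull_over"         # find a safe place and stop
    if track.seconds_following <= YIELD_WITHIN_S:
        return "prepare_to_yield"  # slow down and make room to be overtaken
    return "monitor"               # keep tracking before committing

# A patrol car has been behind us with lights on for 10 seconds:
print(classify_intent(EmergencyVehicleTrack(True, True, 10.0)))  # -> pull_over
```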
In addition to its trials in Miami and Washington, Ford has worked with police in Michigan for nearly two years to prepare for the autonomous ride-hailing and delivery vehicles it plans to launch in 2021. Two years ago, dozens of Michigan police officers came to Ford’s offices in Dearborn to discuss the plan. “We stressed that these vehicles will not be privately owned, and that alleviated some of their concerns,” said Colm Boran, Ford’s director of autonomous driving systems engineering.
Teaching a self-driving car to pull over to the right is relatively easy; police lights and sirens are, after all, noticeable from a distance. “If a human can notice a police car, so should a machine,” said Zachary Doerzaph, director of the Center for Advanced Automotive Research at the Virginia Tech Transportation Institute. The harder part is knowing what to do next. Problems like these make up roughly the last 10 percent of the development process, yet they take a great deal of time to overcome. Doerzaph’s team is working on one such scenario for a group of automakers, though he would not reveal more.
“We call these uncommon situations ‘edge cases,’ but the term doesn’t really capture how hard the challenge is,” Doerzaph said. At any given moment there are thousands of construction zones, accident scenes, and police officers standing at intersections across the country. The cues humans use to read these situations are subtle and varied: people recognize basic gestures and, more importantly, can confirm a command with eye contact or a nod.
Self-driving researchers may need to replicate these subtle human behaviors, or else invent new channels of communication between automated cars and police officers. In theory, a trooper stepping out of a patrol car on a highway could use a handheld device to tell every driverless car in the area to move aside. Such solutions are technically feasible but face numerous organizational and legal obstacles.
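No such device or protocol exists today; the following sketch only makes the hypothetical concrete. The message format, field names, and the “clear_area” action are all invented for illustration.

```python
import json
import math
import time

# Hypothetical "move aside" broadcast that an officer's handheld device might
# send to driverless cars near an incident. The message format and field names
# are invented for illustration; no such standard exists today.

def make_clear_area_message(lat: float, lon: float, radius_m: int, officer_id: str) -> str:
    return json.dumps({
        "type": "clear_area",            # ask AVs to vacate or route around the area
        "center": {"lat": lat, "lon": lon},
        "radius_m": radius_m,
        "issued_by": officer_id,         # would be cryptographically signed in practice
        "issued_at": time.time(),
        "expires_s": 600,
    })

def distance_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Equirectangular approximation, adequate over a few hundred meters."""
    x = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    y = math.radians(lat2 - lat1)
    return math.hypot(x, y) * 6371000

def handle_message(raw: str, car_lat: float, car_lon: float) -> str:
    """Decide whether this car is inside the affected area and should reroute."""
    msg = json.loads(raw)
    if msg["type"] != "clear_area":
        return "ignore"
    center = msg["center"]
    if distance_m(car_lat, car_lon, center["lat"], center["lon"]) <= msg["radius_m"]:
        return "reroute_away"
    return "ignore"

raw = make_clear_area_message(37.400, -122.080, 500, "unit-12")
print(handle_message(raw, 37.401, -122.081))  # -> reroute_away
```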
Inrix, a Washington-state startup focused on digital traffic and parking information, has begun delivering software that lets cities encode traffic rules and road signs into high-definition maps for self-driving developers to use. City officials can mark the locations of stop signs, crosswalks, bike lanes, and more, and when the self-driving software requests a route from the navigation system, it also receives the rules and restrictions that apply along that trip. Several cities, including Boston and Las Vegas, are already using the service, called AV Road Rules.
(Photo: On May 11, 2018, in South Jordan, Utah, a Tesla Model S in Autopilot mode crashed into a fire truck stopped at a traffic light. A police report obtained by the Associated Press showed the car accelerated in the seconds before the crash.)
These maps can be continuously updated. If roadworks block a lane, the change in road conditions can be marked on the map. Inrix is working to enable police to update maps at any time directly from their police cars. “We’re exploring how to turn this hypothetical capability into a real-world tool,” said Avery Ash, head of its autonomous mobility division.
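As a loose illustration of how such a rules layer might be represented and queried (the schema below is invented and is not Inrix’s actual AV Road Rules format), a route request could return the restrictions attached to each road segment along the trip:

```python
from dataclasses import dataclass, field

# Invented, simplified schema for a city-maintained "road rules" map layer of
# the kind described above; it is not Inrix's actual AV Road Rules format.

@dataclass
class SegmentRules:
    segment_id: str
    speed_limit_mph: int
    stop_signs: list = field(default_factory=list)      # node IDs with stop signs
    crosswalks: list = field(default_factory=list)       # crosswalk IDs
    bike_lane: bool = False
    closed_lanes: list = field(default_factory=list)     # lanes blocked by roadworks

# City officials (or, eventually, officers in the field) maintain the layer:
rules_db = {
    "main-st-100": SegmentRules("main-st-100", 25, stop_signs=["n42"], bike_lane=True),
    "main-st-101": SegmentRules("main-st-101", 25, crosswalks=["x7"]),
}
rules_db["main-st-101"].closed_lanes.append(1)  # roadworks now block lane 1

def rules_for_route(route: list) -> list:
    """Return the restrictions attached to every known segment on a planned route."""
    return [rules_db[s] for s in route if s in rules_db]

for seg in rules_for_route(["main-st-100", "main-st-101"]):
    print(seg)
```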
Once the self-driving industry gets a handle on everyday traffic stops, accident scenes, and road construction, a further set of extreme scenarios awaits. Jaeger, who began working with Waymo’s engineers back when Waymo was still a self-driving project inside Google, offers an example: What about a terrorism suspect? What if someone orders a car, throws a backpack inside, tells the self-driving car where to go, and then detonates the bomb?
The good news is that cities, police, and automakers all have an incentive to solve these problems, because all of them regard the current toll of traffic accidents as unacceptable. More than 37,000 people are killed in car crashes in the U.S. each year, the vast majority because of human error. Police are often the first witnesses to those crashes, and sometimes the victims. If self-driving vehicles could detect sirens from miles away and reliably obey the rules, that would be a welcome change.
State Trooper Ken Monroe said: “Human drivers are unpredictable – it’s just too difficult.”
Waymo’s Matthew Schwall said that during training sessions, where the company shows police how its fleet works and takes them for rides, he often hears the same question: When can they have a self-driving police car?
Source: https://www.getinno.com/self-driving-cars-targeted-by-police.html