
Autonomous Vehicles and the Importance of Blending In

Driving seems like second nature for most individuals, even as hazards and potential pitfalls spring up around them. The experience of merging onto a speeding, overcrowded highway is so common that the frustration tends to fade away mere minutes later. In that moment, however, as cars weave in and out of lanes – practically pushing and shoving as they accelerate toward their destinations – the complexities are quite clear. Roads are often a mess, and yet somehow we navigate them with fairly positive results.

Jeffrey Rupp, chief technology officer and chief safety officer for the American Center for Mobility, an autonomous vehicle test track in Ypsilanti, Michigan, has taken note of the complicated road environment. When it comes to developing automobiles that drive themselves, Rupp believes the solution is to build cars that can handle the complex behaviors of human drivers.

“A lot of it is making a vehicle that blends in with traffic,” said Rupp. “It doesn’t stand out as having behaviors that are greatly different from human-driven vehicles, both from the other vehicles’ point of view [and] from the occupants.” Rupp said that occupants might not be comfortable with an AV that’s driving too conservatively. Human drivers might honk or exhibit other annoying behaviors as they try to maneuver around the driverless machine.

“[It’s important to] just kind of blend in and become part of the network of vehicles driving at the same time,” he explained. “That’s probably the biggest challenge – having a vehicle know where it’s at, know when to stop making sure it doesn’t hit stuff. Most companies are doing very well in that regard, but it’s much more sophisticated and challenging to make it more comfortable and more acceptable than [human] driving.”

Navigating Disaster

Edge cases are arguably the biggest threat to autonomous vehicles, particularly when natural disasters strike. Even something as simple as an ambulance could pose a problem if the AV doesn't know how to move over properly and allow it to pass. Rather than a written law stating exactly how driverless cars will react, Rupp anticipates a solution in which the vehicle can assess the situation and determine alternative actions on its own. For example, the automobile might realize that it is acceptable to cross a double yellow line if a parked car is blocking half the lane.
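To make the idea concrete, here is a minimal sketch of the kind of rule-based exception Rupp describes: the double-yellow-line restriction normally holds, but is relaxed when the current lane is obstructed and the oncoming lane is clear. The names, thresholds, and structure are illustrative assumptions, not taken from any real AV software.

```python
# Hypothetical sketch of an "exception to the rule" built into regular operation:
# crossing a double yellow line is normally forbidden, but becomes acceptable
# when a stopped vehicle blocks the lane and the oncoming lane is clear.

from dataclasses import dataclass


@dataclass
class LaneSituation:
    lane_blocked: bool           # e.g. a parked car halfway across the lane
    oncoming_lane_clear: bool    # no approaching traffic within a safe gap
    marking: str                 # "double_yellow", "dashed", ...


def may_cross_centerline(situation: LaneSituation) -> bool:
    """Return True if crossing the centerline is an acceptable exception."""
    if situation.marking != "double_yellow":
        return True  # dashed or no marking: normal passing rules apply
    # Exception to the rule: the lane is obstructed and it is safe to go around.
    return situation.lane_blocked and situation.oncoming_lane_clear


if __name__ == "__main__":
    blocked = LaneSituation(lane_blocked=True, oncoming_lane_clear=True,
                            marking="double_yellow")
    print(may_cross_centerline(blocked))  # True: acceptable to edge around
```

In a real system the decision would rest on far richer perception and prediction, but the point of the sketch is that the exception itself can be encoded ahead of time rather than written into law.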

“I think some of the more similar exceptions to the rule can be built into the regular operation,” said Rupp. “Some of the edge cases, I think for many years, there will be situations where we just haven’t had the ability to be that sophisticated in the approach. I think manufacturers would be hard-pressed to lock out all human intervention. I think when manufacturers market a vehicle they’ll have a very specific description of the operational designs, the boundaries within which this system is intended to operate.”

An automaker might warn consumers that an AV can only operate in normal weather without earthquakes or other major road hazards. If those things occur, the front-seat occupant might be expected to take over. “For the foreseeable future there will be so many edge cases that we can’t contain within the operational design domains up front,” Rupp added. “[For now] there will always be the need for human takeover. A Level 4 vehicle is very possible, but from a practical view I think there will be limitations that everyone will understand. And I think the buying public will accept those limitations and not expect the perfect automated vehicle that would solve all of their worries about getting to work and dealing with traffic.”
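The operational design domain Rupp describes can be thought of as a declared boundary that the system checks before and during automated driving, handing control back to the front-seat occupant when conditions fall outside it. The sketch below is a hypothetical illustration of that check; the condition names and the decision function are assumptions for the sake of the example.

```python
# Hypothetical sketch of an operational-design-domain (ODD) check: automation
# stays engaged only within the manufacturer's stated boundaries; otherwise
# the system asks the front-seat occupant to take over.

from dataclasses import dataclass


@dataclass
class DrivingConditions:
    weather: str        # "clear", "light_rain", "snow", ...
    road_hazard: bool   # earthquake damage, flooding, debris, ...


# The (hypothetical) boundaries within which this system is intended to operate.
SUPPORTED_WEATHER = {"clear", "light_rain"}


def within_odd(conditions: DrivingConditions) -> bool:
    """Check whether current conditions fall inside the declared ODD."""
    return conditions.weather in SUPPORTED_WEATHER and not conditions.road_hazard


def control_decision(conditions: DrivingConditions) -> str:
    """Decide whether automation may continue or a human must take over."""
    if within_odd(conditions):
        return "automation_engaged"
    return "request_human_takeover"


if __name__ == "__main__":
    print(control_decision(DrivingConditions(weather="clear", road_hazard=False)))
    print(control_decision(DrivingConditions(weather="snow", road_hazard=True)))
```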

About the author:

Louis Bedigian is an experienced journalist and contributor to various automotive trade publications. He is a dynamic writer, editor and communications specialist with expertise in the areas of journalism, promotional copy, PR, research and social networking.