Rumors of the self-driving car have been greatly exaggerated.
Researchers from Detroit to Silicon Valley are working feverishly to develop autonomous vehicles, but despite reports to the contrary, they have yet to succeed. That has become clear following at least three crashes, including at least one fatality, involving drivers using Tesla's semi-autonomous Autopilot system.
The headline-grabbing crashes seem to fly in the face of the promise that computer-driven cars will be safer than human-driven cars – as automatic pilot systems have been shown to be in airplanes.
In its marketing, Tesla has made a point of its Autopilot system's ability to take over driving responsibilities on highways. The truth is a little more complex.
Starting in late 2014, Tesla began building its cars with a collection of hardware designed to enable some autonomous features, similar to other high-end cars. In fact, it bought the suite of sensors and software from the same supplier many traditional automakers use: Israeli tech company Mobileye. The difference is in what Tesla has enabled the software to do.
Mobileye bills itself as a maker of collision avoidance systems – not autonomous driving systems. That is the key to understanding the state of the technology, as well as how Tesla's implementation stands apart.
The Mobileye system consists of sensors around the car – front, sides, and rear – to detect obstacles, as well as electronic controls for the brakes, steering, and accelerator. Most of these controls were already in place, either to improve safety (such as anti-lock braking systems, electronic stability control, lane departure warning, blind spot alerts, and rearview cameras), to improve fuel economy and emissions (electronic throttles, electric power steering), or as convenience features, like adaptive cruise control. By adding just a couple more sensors and combining their signals with these automatic controls in a central computer, today's cars can effectively operate themselves – up to a point. But they still lack the strategic reasoning that drivers rely on to anticipate traffic and maximize safety.
To better understand these systems, it helps to look at all the pieces and what they do, as well as their ultimate potential.
Front Obstacle Detection
Front obstacle detection governs acceleration and, more importantly, braking. Data from the Insurance Institute for Highway Safety (IIHS) show that these systems can reduce rear-end accidents by almost 40 percent. The most basic systems, called forward collision warning systems, warn inattentive drivers of impending accidents. More advanced systems can automatically apply the brakes. Automakers calibrate the systems differently; some intervene sooner, others later and more aggressively. The IIHS rates these systems on a six-point scale.
There are two parts to forward collision detection: locating an obstacle and identifying it.
Forward-facing radar scans the road ahead to locate obstacles as the car approaches them. If you're not paying attention and approaching, say, a parked truck, the radar will map the distance to the truck and how fast you're approaching it. Forward radar is also used in cars with adaptive cruise control, which can maintain a set distance between your car and the car ahead, rather than just a set speed. Cars with forward radar alone tend to have the most basic forward collision warning systems, rather than the more effective (and autonomous) automatic emergency braking.
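In essence, the radar's distance and closing-speed measurements boil down to a time-to-collision calculation that decides whether to warn or brake. Here is a minimal sketch of that logic; the function name and the threshold values are illustrative assumptions, not any manufacturer's actual calibration.

```python
def collision_response(distance_m, closing_speed_mps,
                       warn_ttc=2.5, brake_ttc=1.0):
    """Classify a radar track by time-to-collision (TTC).

    Thresholds are made up for illustration; real systems tune
    them differently (some intervene sooner, others later).
    """
    if closing_speed_mps <= 0:
        return "none"                 # obstacle holding steady or pulling away
    ttc = distance_m / closing_speed_mps
    if ttc < brake_ttc:
        return "brake"                # automatic emergency braking
    if ttc < warn_ttc:
        return "warn"                 # forward collision warning
    return "none"

print(collision_response(50.0, 10.0))  # 5.0 s to impact -> "none"
print(collision_response(20.0, 10.0))  # 2.0 s -> "warn"
print(collision_response(8.0, 10.0))   # 0.8 s -> "brake"
```

Adaptive cruise control uses the same two measurements, but to hold a following gap rather than to trigger an emergency stop.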
To enable safer automatic emergency braking, and for true autonomous driving, forward radar is not enough. For example, if you're on a rural highway and a car turns left in the distance ahead of you, a car with only forward radar might suddenly brake hard for the obstacle, then release the brakes once the crossing car clears the road ahead. That would be unnecessarily alarming to you and your passengers, not to mention hazardous to any cars behind you.
Front-facing cameras fill this gap and make automatic emergency braking more seamless. By capturing a digitized image of the road ahead, the cameras can compare it with stored images of various obstacles, such as the back of a truck, a bicycle, or a child crossing the road. If the image matches that of a stored obstacle, the car will apply the brakes. If not, it will continue on. (This is what happened in the fatal Tesla crash, when the computer couldn't correctly identify the side of a tractor-trailer turning left in front of the car and mistook it for a billboard.)
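Conceptually, the camera's job is to gate the radar's brake request: brake only when the classifier confirms a real, in-path obstacle. A simplified sketch, with hypothetical label names:

```python
# Hypothetical sketch: the brake command is issued only when the camera's
# classifier confirms the radar return matches a known obstacle type.
BRAKEABLE = {"truck_rear", "car_rear", "bicycle", "pedestrian"}
IGNORABLE = {"billboard", "overhead_sign", "bridge"}

def should_brake(radar_sees_obstacle, camera_label):
    if not radar_sees_obstacle:
        return False
    if camera_label in IGNORABLE:
        # A misclassification at this step (trailer side read as a
        # billboard) is what the article describes in the fatal crash.
        return False
    return camera_label in BRAKEABLE

print(should_brake(True, "truck_rear"))  # True  -> apply brakes
print(should_brake(True, "billboard"))   # False -> continue on
```

The sketch also shows why the failure mode is asymmetric: an obstacle wrongly classified as ignorable means no braking at all.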
A single front camera, however, has no way to distinguish between a large object far away and a small one up close. Some systems, such as Subaru's EyeSight, use dual cameras to capture a stereo image so they can gauge distance and closing speed. Others coordinate the camera's image with the information from forward radar to gather distance and closing speed.
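The stereo approach works because the same object lands at slightly different positions in the two cameras' images, and that disparity shrinks with distance. A minimal sketch of the standard rectified-stereo depth formula, with illustrative numbers:

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Depth from a rectified stereo pair: Z = f * B / d.

    focal_px     -- camera focal length in pixels
    baseline_m   -- distance between the two cameras, in meters
    disparity_px -- horizontal pixel shift of the object between images
    """
    if disparity_px <= 0:
        raise ValueError("zero disparity: object at infinity or bad match")
    return focal_px * baseline_m / disparity_px

# Example values (assumed, not Subaru's actual specs):
# 1000 px focal length, 35 cm camera spacing, 7 px disparity
print(stereo_depth(1000, 0.35, 7))  # 50.0 meters
```

Differencing two such depth readings over time yields the closing speed – the same quantity a radar measures directly.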
Forward cameras alone, however, don't work well in bad weather when snow, ice, or fog can distort their view. Radar cuts through such obstructions. (Most systems that use front cameras will simply shut down when the weather deteriorates.)
In some systems, including Tesla's, front cameras also watch the lane lines on the road. Cars with automatic steering use this information to keep the car in the lane or sound a warning if it starts to drift when you're not paying attention.
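Lane keeping of this kind is, at its core, a feedback loop: the camera measures how far the car has drifted from lane center, and the steering applies a small correction in the opposite direction. A simplified proportional-control sketch; the gains and limits are invented for illustration:

```python
import math

def lane_keep_steer(lateral_offset_m, heading_error_rad,
                    k_offset=0.3, k_heading=1.5, max_deg=5.0):
    """Proportional lane-keeping sketch.

    lateral_offset_m  -- drift from lane center (positive = right)
    heading_error_rad -- angle between car's heading and the lane
    Returns a small corrective steering angle in degrees,
    clamped so the system can only make minor corrections.
    """
    angle = math.degrees(-(k_offset * lateral_offset_m +
                           k_heading * heading_error_rad))
    return max(-max_deg, min(max_deg, angle))

print(lane_keep_steer(0.0, 0.0))   # centered: 0.0, no correction
print(lane_keep_steer(0.5, 0.0))   # drifting right: negative (steer left)
```

The clamp at the end mirrors the distinction drawn later in the article: most automakers cap the correction so the system cannot steer through a whole curve on its own.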
Some autonomous vehicle experts say lidar (Light Detection and Ranging) is needed to capture the view ahead of the car with greater resolution and speed. Google, for example, uses lidar along with cameras and radar in its self-driving test vehicles. So far, however, lidar is too expensive to be used in most cars sold to the public, and Teslas with Autopilot don't have it.
As these systems have advanced, some have developed the ability to stop the car completely in a traffic jam and then restart when traffic does. (Some systems can do this only above 37 mph; others only below 30 mph. A few, including Tesla's and Mercedes-Benz's, do it consistently.) Some have also developed the ability to recognize pedestrians and bicyclists, at least at certain speeds.
Google, Ford, and others have even demonstrated the ability for cars to recognize traffic lights, speed limit signs, and other traffic controls by creating images of what they should look like (and their precise locations) for the computer to reference. So far, however, this ability is not comprehensive and is limited to a few specific locations.
Automatic Steering
Electrically assisted power steering has replaced hydraulic steering on most modern cars because it uses less power, making cars more fuel efficient. Using an electric motor to control the steering has allowed automakers to enable all sorts of other features, from lane keeping assistance to self-parking systems. Tesla's and BMW's systems don't even require the driver to be in the car for it to park itself.
Using input from forward-facing cameras watching the lane markings on the road, these modern cars can steer themselves around corners. Teslas can even use input from the navigation system to take off-ramps, for example.
Blind-Spot Detection
Most modern cars also offer blind-spot detection, at least as an option. Blind-spot detection uses sensors (usually radar) mounted in the rear bumper to detect obstacles in the blind spots beside the rear of the car. In most cars, a yellow light illuminates in the outside mirror on the appropriate side of the car, and an alarm sounds if you activate your turn signal to move in the direction in which the system detects an obstacle or another car.
Tesla incorporates the signal from the blind-spot-monitoring sensors, along with the lane and obstacle information from the front cameras, into the electric steering control system. That way, if you flick the left turn signal on the highway, the car looks for obstacles next to it, and if it finds none, it steers into the adjacent lane automatically. (There's no technical reason why other automakers couldn't do this too; they just don't.)
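The fusion described here amounts to a simple gating decision: the turn signal requests a lane change, and the move proceeds only if the blind-spot sensors and lane cameras both agree it's safe. A hedged sketch of that decision logic (real systems check far more – closing traffic, lane type, speed):

```python
def auto_lane_change(turn_signal, blind_spot_clear, lane_line_detected):
    """Signal-initiated lane-change gating, simplified.

    turn_signal        -- "left", "right", or None
    blind_spot_clear   -- True if sensors see no obstacle on that side
    lane_line_detected -- True if the camera can see the lane markings
    """
    if turn_signal is None:
        return "hold_lane"
    if not lane_line_detected or not blind_spot_clear:
        return "hold_lane"          # abort: adjacent lane unsafe or unmarked
    return "steer_" + turn_signal   # hand the maneuver to steering control

print(auto_lane_change("left", True, True))    # steer_left
print(auto_lane_change("left", False, True))   # hold_lane (car alongside)
print(auto_lane_change(None, True, True))      # hold_lane (no request)
```

The point the article makes is that other automakers have all of these inputs available; they simply don't wire the last line into the steering.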
Rear Detection
Rear-vision systems are not yet a big distinguishing factor in autonomous cars. However, widespread adoption of rearview cameras, as well as ultrasonic "parking" sensors, has enabled automakers to develop back-over prevention systems, as well as automatic parking systems that can back a car into either a parallel or perpendicular parking space. They have also brought rear cross-traffic alert systems that can spot a car coming down a lane as you're backing out between two tall trucks, for example. The latest cross-traffic systems can automatically brake when they detect an approaching obstacle. A federal regulation requires rearview cameras as standard equipment on all new vehicles by 2018.
Putting It All Together
Most automakers promote these systems as driver safety aids, or what they call "active safety systems." They essentially act as a backstop for drivers who are distracted or not paying attention.
Tesla was initially criticized for not including any of these active safety systems on its cars, which compete with ultra-luxury cars that offer most of them. When Tesla added the systems by contracting with Mobileye in late 2014, it went a few steps further with them than most automakers. The difference isn't in the capability of the hardware, only in what the software allows it to do.
Most other automakers using such technology only allow the steering to make minor corrections. It can't automatically steer the car all the way around a curve, for example, even on gentle highway curves (though some Mercedes-Benz and Infiniti models can). If you try to let a car without automatic steering take a turn by itself, it won't turn hard enough and will sound an alarm for the driver to retake control. Most systems will also sound an alarm and shut themselves off if they don't detect hands on the steering wheel for five or ten seconds. Tesla gives drivers up to two minutes before warning them to retake control.
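The hands-on-wheel timeout described here is essentially a watchdog timer. A minimal sketch, with the thresholds chosen to mirror the article's figures ("five or ten seconds" for most automakers, much longer for Tesla) rather than any manufacturer's documented values:

```python
def handsoff_state(seconds_handsoff, warn_after=10.0, disengage_after=15.0):
    """Hands-on-wheel watchdog sketch.

    warn_after / disengage_after are illustrative defaults in the
    'five or ten seconds' range most automakers use; a Tesla-like
    calibration would set warn_after to roughly 120 seconds.
    """
    if seconds_handsoff < warn_after:
        return "assist_active"      # steering assist keeps working
    if seconds_handsoff < disengage_after:
        return "warn_driver"        # chime: hands back on the wheel
    return "disengage"              # alarm and hand control back

print(handsoff_state(3))    # assist_active
print(handsoff_state(12))   # warn_driver
print(handsoff_state(20))   # disengage
```

As the next paragraph notes, the hardware could keep steering far longer; the short timer is a deliberate software choice to keep drivers engaged.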
It's not that the systems couldn't follow all the way through a curve or continue steering even without the driver for longer than two minutes, but most automakers won't allow that, to ensure that the driver keeps paying attention.
Study after study has shown that humans are terrible at quickly retaking control of anything after being disengaged or distracted, and that they make poor decisions when they do. This will be the biggest challenge for autonomous vehicles until they can fully develop the strategic reasoning and planning abilities that driving requires. The airline industry has developed decades' worth of procedures – including alarms, communication systems, even the presence of co-pilots – to govern when and how pilots retake control after using autopilot systems, and to ensure that they're fully engaged when they do.
In fairness, when Tesla CEO Elon Musk introduced Autopilot, he said it was not a fully autonomous system "where you can go to sleep behind the wheel and arrive safely at your destination." The company did call the system Autopilot, however, and crowed about how its cars would be able to drive down the highway keeping pace with surrounding traffic, and following lane lines. And that's how quite a few Tesla drivers, including at least one of the accident victims, have been trying to use it.
Later, Musk expressed dismay at how Tesla drivers were using the system – YouTube videos showed it engaged on two-lane, undivided highways, for example. In truth, Autopilot is oversold. It's little more than the same full suite of active safety systems that other automakers use, only without most of the constraints that other automakers employ to keep drivers engaged.
Demographics may also play a role. Tesla buyers tend to be early adopters – and that applies as much to automated driving technology as to electric cars. Calling the system Autopilot only encourages eager Tesla drivers to use it that way.
Autopilot may be able to handle light driving assistance, but it's relatively new to the road, and like a teenager learning to drive, the system hasn't proved capable of handling the responsibility.