Tesla Inc. (NASDAQ: TSLA) launched its Autopilot autonomous driving technology in October 2015. By the following May, a Model S operating in self-driving mode was involved in a fatal crash. At the time, Consumer Reports magazine took Tesla to task for a number of issues, including the name of the autonomous driving system.
Last October, three years after Autopilot was first deployed in what Tesla called a “public beta phase,” the company rolled out its Navigate on Autopilot feature, which Tesla said “guides a car from a highway’s on-ramp to off-ramp, including suggesting lane changes, navigating highway interchanges and taking exits,” under active supervision by the driver. Navigate was updated last month to make lane changes automatically. Consumer Reports is still not impressed.
In a new report Wednesday, the consumer publication said that Tesla’s Navigate on Autopilot feature “lagged far behind a human driver’s skill set,” noting that the feature “cut off cars without leaving enough space and even passed other cars in ways that violate state laws.” Human intervention was required “to prevent the system from making poor decisions.”
Jake Fisher, Consumer Reports’ senior director of auto testing, said, “The system’s role should be to help the driver, but the way this technology is deployed, it’s the other way around. It’s incredibly nearsighted. It doesn’t appear to react to brake lights or turn signals, it can’t anticipate what other drivers will do, and as a result, you constantly have to be one step ahead of it.”
Tesla’s response to criticism of its Autopilot system has been a variation of what a company spokesperson told Consumer Reports: “[I]t is the driver’s responsibility to remain in control of the car at all times, including safely executing lane changes.”
Tesla CEO Elon Musk has promised that fully autonomous driving will be available by the end of next year. By now, most people take Musk’s promises with a large grain of salt, but even if the capability were there it would probably not be legal to operate the car in full self-driving mode unless state and local laws were changed to allow it.
The accepted definition of a fully autonomous vehicle is one in which human intervention is optional. A chart prepared by the Society of Automotive Engineers shows the six levels of autonomous driving and how the driver must interact with the vehicle at each level. Full autonomy, or Level 5, is highly unlikely to be available from Tesla or any other automaker by late 2020.
Additionally, Consumer Reports cited Shiv Patel, an analyst at research firm ABI Research, who notes that while Tesla is further along with its autonomous driving technology than other automakers, the Navigate on Autopilot system’s “current hardware does not have the computing power required to support full self-driving features.”
Tesla has said that a new “Tesla-developed AI chip with our Full Self-Driving platform will allow the speed at which our system processes data to increase by an order of magnitude and take a meaningful leap toward our full self-driving future.” That would be pretty impressive given the processing power of the company’s current system.
Consumer Reports’ Fisher concludes that Navigate on Autopilot’s automatic lane-changing feature “isn’t a convenience at all. Monitoring the system is much harder than just changing lanes yourself. Using the system is like monitoring a kid behind the wheel for the very first time.”
David Friedman, vice president of advocacy for Consumer Reports, said, “Tesla is showing what not to do on the path toward self-driving cars: release increasingly automated driving systems that aren’t vetted properly. Before selling these systems, automakers should be required to give the public validated evidence of that system’s safety—backed by rigorous simulations, track testing, and the use of safety drivers in real-world conditions.” Note that, so far, Tesla vehicles are not among the cars with the highest risk of being involved in a fatal crash.