
Is Tesla all wrong on Autopilot?

The woes continue to mount for Tesla amid growing questions over the safety and efficacy of its semi-autonomous system following a series of serious—and sometimes deadly—crashes.

The woes continue to mount for Tesla. And I’m not just talking about ongoing Model 3 production problems or testy earnings calls. I’m talking about the consequences—possibly fatal ones—of the decision to push out quasi-conditional autonomy before the technology was ready, banking on drivers being responsible and basing the system on an array of cameras instead of more reliable but much more expensive LiDAR sensors.

Another Model S crashed in Utah on Friday after failing to slow for a red light, slamming into a firetruck at 60 mph, apparently without braking before impact. This echoes a crash in Southern California back in January, when a Model S going over 50 mph slammed into a firetruck parked on the side of Interstate 405. Executive departures keep mounting, the company’s top engineer has taken a “leave of absence,” and CEO Elon Musk told employees on Monday that they are merely in the midst of a reorganization.

Following multiple accidents, there are four active federal investigations involving Tesla’s vehicles. Both the National Highway Traffic Safety Administration and the National Transportation Safety Board are conducting probes into the performance of the company’s semi-autonomous Autopilot system—a key feature that has attracted people to the brand, but which has also killed at least two people—just as the company’s point person with regulators has left to join autonomous vehicle competitor Waymo.

This all goes back to Musk and his “reality distortion field”: In order to do the impossible—in this case, start a new American car company, bring electric vehicles into the mainstream, and challenge both established automakers and Big Oil—he needed a lot of capital and a lot of hype. That meant embracing the new and the exciting, from Ludicrous Mode to Falcon Wing doors.

And Autopilot was a huge part of this. It was supposed to merely be a driver’s aid, according to Tesla’s lawyers, but (wink-wink) is often used as a shoe-horned “Level 3” option by commuters who stick water bottles and oranges into the steering wheel to trick the computer into thinking they’re paying attention. The problem is that, per decades of research cited by The Wall Street Journal, autopilot systems that rely on humans monitoring the operations and standing ready to retake control are the most dangerous.

Because people are lazy. And highway cruising can be so boring.

Most of Tesla’s competitors are skipping Level 3 autonomy altogether for this reason, with Ford, GM and Volvo going straight from Level 2 to Level 4 systems. Alphabet’s Waymo has been focusing on Level 4 and 5 autonomy from the outset, with riders as passengers with no responsibility already enjoying the experience in Phoenix. GM has petitioned the federal government to adjust motor vehicle standards, so it can road test a version of the Bolt EV without a steering wheel or pedals next year.


Audi was an exception, as it sought approval for its LiDAR-based “Traffic Jam Pilot” system, which is eyes-free and hands-free but not mind-free, with the system prompting the driver to take control when needed. But Audi has since decided to press pause and sell the system as an advanced Level 2 option in the United States (eyes-on, hands-on, mind-on) given the uncertain regulatory outlook and other legal issues. European drivers, on the other hand, get a chance to enjoy a more blissful commute thanks to well-sorted rules.

I think there is a real and rising risk that regulators will force Tesla to disable Autopilot functionality in its vehicles, which would be a crushing blow for the company at such a vulnerable time. Investment spending needs are high, necessitating a mad dash for profitability and cash flow to avoid another capital raise. The hit to demand and Musk’s hype bubble would jeopardize all that.

And it would be the result of an early decision Musk made to sell Autopilot systems as more capable than they are, relying on cameras and a single forward-looking radar for a total system cost of around $5,000, according to UBS. LiDAR, which is similar to radar but uses lasers to classify objects at high resolution, wasn’t included—because right now, a single spinning unit, like the pod seen on the top of Waymo’s minivans, costs $70,000.

The argument from Musk was that by accumulating millions of miles of data, Tesla’s cameras and the AI behind the Autopilot system would be able to more accurately identify objects, mimicking the “passive visual” perception and accumulated smarts of flesh-and-blood drivers and eventually paving the way for Level 4/5 autonomy. Tesla’s website says many vehicles leaving its factory “have the hardware needed for full self-driving capability.”

Left unsaid was the trial-and-error nature inherent to this approach.

Last fall, one of GM’s autonomous vehicle executives said Musk was “full of crap” for holding this view, since this LiDAR-less sensor package lacks redundancy.

Musk also reportedly nixed the idea of adding eye tracking and steering-wheel sensors to prevent Autopilot usage by inattentive drivers—similar to the safeguards used by Nissan in its ProPILOT system and GM in its Super Cruise—because he thought drivers would be annoyed. On balance, Musk believed the system, albeit troubled, would save lives if more widely used. Tesla, in a blog post following the fatal Autopilot crash earlier this year, proclaimed that drivers in an Autopilot-equipped vehicle were 3.7 times less likely to be in a fatal accident.

But isn’t it better to be a little inconvenienced—making absolutely sure drivers are paying attention—than lull them into a false sense of security?

In 2016, Consumer Reports urged Tesla to disable the Autopilot system, warning that it offered too much autonomy too soon and jeopardized the lives of drivers. German regulators took issue with the company’s description of the system as being in “Beta,” which Musk said on Twitter was used to “emphasize to those who chose to use [Autopilot] that it wasn’t perfect.” Tesla’s own user manuals note the system is fooled by bright sunlight, faded lane markings, road seams and more.

Driven by a profit motive and a rush to be first, I think Musk provided drivers with a temptation they couldn’t resist. With potentially deadly consequences.

Written by Anthony Mirhaydari

Anthony Mirhaydari was a senior financial writer at PitchBook, covering the intersection of private markets activity (VC, PE and M&A) with public markets and the broader economy. He has more than a decade of experience in the financial space, covering everything from earnings activity to geopolitics.

Prior to joining PitchBook, Anthony was a financial columnist for CBS News and MSN Money, Microsoft’s financial news portal. Previously, he was a business consulting analyst with Moss Adams LLP focusing on the financial services industry. He holds a BA in Business Administration (finance specialty) from the University of Washington’s Foster School of Business, graduating magna cum laude with distinction.
