Will Tesla’s Autopilot Feature Keep You Safer on the Highway?

by Terry Bryant


Are you a safer driver than your autonomous vehicle?

Although most of us welcome technological changes that offer greater safety and security, we also realize that whenever people operate a vehicle or device, human error remains a dangerous factor in the equation. On some level, human error likely played a role in the May 7, 2016 death of former Navy SEAL Joshua Brown while he was driving a 2015 Model S electric car.

Details Involved with Mr. Brown’s Accident – Some Not Yet Fully Documented

While driving on a freeway, Mr. Brown engaged his Tesla’s Autopilot feature. As he traveled along, a tractor-trailer apparently crossed the freeway, turning left in front of Mr. Brown’s Tesla. Reports of the accident indicate that the truck’s side was bright white and possibly difficult to see against the brightness of the sky.

Neither Mr. Brown nor the Autopilot feature applied the brakes in time to save him from the ensuing crash. The Tesla drove underneath the big trailer, smashing its windshield and tragically killing Mr. Brown. Investigations continue into all of the factors involved in this event.

What Has Tesla Done to Try to Prevent This Type of Rare Accident?

According to a USA Today article published on July 1, 2016, Tesla has designed its cars so that drivers using Autopilot must keep both hands on the steering wheel. When they fail to do so, the cars are supposed to slow down until both hands return. This is surely a sign that Tesla recognizes that human error can easily work against the precision of its new technologies. Given the precise maneuvers required to turn the Autopilot system on and off, one can only hope that each driver receives adequate training before heading out on the road.

Did Human Error Play a Role in Mr. Brown’s Death?

Although the reported facts of the accident aren’t entirely clear, it appears that Mr. Brown may have been watching a “Harry Potter” movie when the accident occurred. If that’s true, his possible decision to watch a film instead of observing the freeway in front of him may have contributed to the crash. This possibility is clearly outlined in the USA Today article published on July 1, 2016, and a DVD player was found in the wreckage of the Tesla.

We still don’t know whether the speed at which Mr. Brown was traveling played a role in his failure to stop. The same USA Today article notes that Mr. Brown had received a total of eight speeding tickets during the previous six years. That’s quite high, considering that most of us receive only one or two in a lifetime, assuming we drive only when fully alert and determined not to indulge in any distracted driving.

Another type of potential human error that can contribute to accidents is a failure by design engineers to create sensors effective enough to reliably detect large moving objects, regardless of their color, as those objects approach a Tesla or similar vehicle with an Autopilot-type system engaged.

How Many Model S Vehicles Are Currently on the Road?

The National Highway Traffic Safety Administration estimates that about 25,000 Model S vehicles are currently in use in this country. Given how rarely injuries or fatal accidents are reported, they may indeed be providing a safer drive for many car and truck owners. Proponents of Teslas and similar vehicles are quick to point out that Mr. Brown’s death was the first one recorded after more than 130 million miles of Autopilot usage.

Limitations of Both Self-Driving Vehicles and Cars with Autopilot-Type Features

Back in February of this year, a Google self-driving vehicle in California crashed into a public bus after apparently failing to navigate properly around some sandbags in the road. At least nine other accidents involving self-driving autonomous cars had already occurred by that point, all of them linked to human error.

Regardless of the millions of miles being logged by people driving Teslas with Autopilot and Google (and other brands of) self-driving cars, each of us will simply need to decide whether we think we’ll be safer driving such vehicles in the future. Since human errors abound every day, many of us may prefer to keep relying on our own defensive driving skills to control our driving safety for many more years to come.