Town Square

Widow files wrongful death lawsuit against Tesla over fatal Mountain View crash

Original post made on May 1, 2019

The family of a man who died when his Tesla Model X crashed last year has filed a wrongful death lawsuit against the car manufacturer. Attorneys representing the family say Tesla's actions amount to beta testing its Autopilot vehicle software on "live drivers."


Read the full story here. Posted Wednesday, May 1, 2019, 12:11 p.m.

Comments (8)

Posted by BarbG
a resident of another community
on May 1, 2019 at 1:30 pm

It would seem to me that it would be appropriate to "ground" every Tesla and disable any autopilot functions before they can be driven. I sure don't want one behind me on the road.


Posted by Jake O.
a resident of Rengstorff Park
on May 1, 2019 at 2:46 pm

I thought that drivers must stay alert and able to retake control when a car is on Autopilot? Seven seconds is a long time to notice you're veering off the road. It's an unfortunate accident, but I don't see Tesla at fault unless they said "Autopilot is 100% safe."
I also don't see Caltrans at fault unless they knew there was damage to the guardrail or that conditions at the crash site were unsafe. Caltrans covers tens of thousands of miles. I don't see how they would know an area needs repairs unless it is reported to them.


Posted by Wally
a resident of Rengstorff Park
on May 1, 2019 at 4:11 pm

I thought that Autopilot follows the speed limit. How did it accelerate to over 70 mph?


Posted by D Moore
a resident of Whisman Station
on May 1, 2019 at 10:13 pm

I thought I remembered reading that Tesla stated the Autopilot function was not active.


Posted by AllYouCanEat
a resident of Monta Loma
on May 2, 2019 at 8:29 am

Witnesses say the driver was asleep at the wheel. How is Tesla responsible for this? Seven seconds is plenty of time to make corrections. If he wasn't asleep, he was no doubt texting or goofing with his phone. I see this every day as I drive around Silicon Valley... including cops.


Posted by Kevin Forestieri
Mountain View Voice Staff Writer
on May 2, 2019 at 8:39 am

Kevin Forestieri is a registered user.

@D Moore
Directly from the NTSB prelim. report:
"The Autopilot system was engaged on four separate occasions during the 32-minute trip, including a continuous operation for the last 18 minutes 55 seconds prior to the crash."

@AllYouCanEat
I don't see anything in NTSB's preliminary report, nor anything in Tesla's two blog posts, nor anything in my notes from the MVFD fire chief on the day of the crash, indicating the driver was asleep at the wheel. You are likely confusing this collision with another incident.


Posted by Wondering
a resident of Another Mountain View Neighborhood
on May 2, 2019 at 12:43 pm

If he wasn't asleep, how do you account for the seven-second gap when the car was accelerating and moving left? He was probably on his phone, and now people want money to replace a loved one who acted irresponsibly. Personal responsibility. I know it is sad that this happened, but why blame Tesla or Caltrans or the fire department or anybody else you can think of when something was obviously done wrong here on the part of the driver?

You can take it to the Supreme Court if you want to, but it doesn't change the fact that this driver did something wrong. This is not an autonomous vehicle, nor did Tesla claim that it was; you need to be paying complete attention to your driving when you are behind the wheel. I know I'll get blasted for being unsympathetic, not politically correct, etc., but this is how I feel. I'm tired of people not taking responsibility for their own actions and trying to blame anyone and everyone else for their mistakes and the tragedies they cause.


Posted by Self-Inflicted
a resident of Rex Manor
on May 2, 2019 at 3:30 pm

Self-Inflicted is a registered user.

What this article does not remind people of is that the driver was already fully aware that this specific stretch of damaged roadway was somehow tricking the Tesla Autopilot software into doing exactly this IF he drove in the lane near the damage and used Autopilot without his hands on the wheel.

Mr. Huang himself knew that what he was doing was very dangerous and against Tesla's Autopilot directions, and that this specific section was potentially deadly, yet he continued to tempt death by using Autopilot and NOT keeping full control of the car as he passed this section.

Mr. Huang had even taken his wife on a trip past that area in order to demonstrate for her the problem. Mr. Huang had also complained to Tesla about the Autopilot for this reason and he had told his wife and others about his prior experience with this stretch of damaged road.

Mr. Huang had intentionally repeatedly experimented with Autopilot while driving along this specific section of the road in that lane to see how it would behave.

Mr. Huang had stated that he knew this section of the road had damage to the safety devices. He stated that due to the damage, an old, obsolete and dangerous set of lane markings were visible to the Tesla system and that he believed these old lane markings were the cause of the problem.

When the freeway was in the process of being reconfigured years ago, there were a variety of lane markings that changed from time to time as the construction proceeded. This specific section had lane markings from one old phase of the construction that were NOT scrubbed off the road surface, so when a prior crash (by a normal car) destroyed the safety barrier, the old lane markings could be seen.

The reason these old and dangerous lane markings had not been scrubbed off, which would be normal procedure, was that the plan was to plant a set of safety barriers in that location and thus they felt there was no real need to remove the old lane markings. They never considered what might happen if a crash took out these barriers and thus exposed the dangerous lane markings to view.

I understand why the construction people failed to obey standard procedures and remove these old lane markings, and I can understand why they violated the rules on how long a damaged safety rail was allowed to sit and wait for repair, but the fact is that we have two errors of judgment and two failures to do safety-critical freeway work in the proper manner. I understand that they never considered the what-ifs, but that's exactly why we have standard procedures.

Nobody ever claimed that the Autopilot was perfect and anyone who has ever used a computer or device that uses software should already be well aware of the limitations of software and should NOT risk their lives in foolish ways as Mr. Huang chose to do repeatedly until he finally paid the ultimate price of being careless with his own life.

And let's not forget that when Mr. Huang ultimately did crash, his crash could just as easily have killed other people in other cars. Mr. Huang was risking the lives of anyone driving near enough to him at the critical moment.

The bottom line is that Tesla Autopilot has SAVED many lives already and the tiny number of crashes have all been due to a combination of driver failure to maintain proper control and some extraordinary road situation.

Tesla is certainly not at fault here.

