Town Square

Family of Tesla crash victim looking into legal options

Original post made on Apr 11, 2018

The family of a man who died in a fiery crash involving a Tesla Model X in Mountain View has hired a San Francisco-based law firm to help them explore legal options, according to the law firm.

Read the full story here: Web Link. Posted Wednesday, April 11, 2018, 11:47 AM.

Comments (8)

Posted by Actually, no
a resident of Rengstorff Park
on Apr 11, 2018 at 12:15 pm

Tesla's Autopilot is designed to assist the driver, not to drive the car by itself. The driver was obviously not paying any attention to the road, and Autopilot is not 100% safe. There's a liability waiver signed when purchasing the car that spells this out. It's not Tesla's fault, it's the driver's! Obviously. Sorry for the loss, however.


Posted by resident
a resident of Old Mountain View
on Apr 11, 2018 at 12:57 pm

Customers have to realize that TESLA AUTOPILOT is not a real autopilot. The name is just a marketing scam. The product is really only slightly more advanced than what other brands call CRUISE CONTROL. I would never think of taking both hands off the steering wheel when using the cruise control on my non-Tesla car. Taking one hand off the steering wheel is sometimes necessary to use the gear shift or turn signal.


Posted by Wayne
a resident of Old Mountain View
on Apr 11, 2018 at 12:58 pm

Very true, it is there to assist the driver. Pilots of commercial planes have autopilot systems, but they still have to pay attention to the controls and navigation in case they need to take over. Tesla cars are not approved self-driving cars like Waymo's (Google's) cars. Sorry about your loss.


Posted by Richt
a resident of Rex Manor
on Apr 11, 2018 at 3:04 pm

Richt is a registered user.

Let's try this again; maybe I'm just too reasonable?

It seems that the MV Voice changed the wording of the original article, which used the term "navigation system" instead of "Autopilot".
It's a common mistake, one I have seen in TV news interviews as well.

"Navigation system" is exactly the term found in the Tesla service records of the car in question. The owner had reported that the "navigation system" (which is the GPS, not the Autopilot) had a problem, but Tesla service could find nothing wrong with the navigation system. There was no mention of the "Autopilot" system in the service records. He never demonstrated the Autopilot problem to Tesla or to anyone else who has come forward. The wife of the owner has stated on TV that the owner had taken her out to that dangerous strip of road to try to demonstrate the dangerous situation to her, but the problem did not occur.

Not something I would have done, risking my wife's life like that, but maybe I'm just too reasonable?

"allegations of a faulty Autopilot system,"
"had been using the vehicle's Autopilot function at the time of the crash"

So, maybe it's just me, maybe I'm just too "reasonable", but if I believed that a certain stretch of road, in a certain lane, in a certain driving mode, was likely to cause my car to crash, then I would not place myself, my passengers, and the cars behind me in such a dangerous situation.

Whatever happened to rational risk/benefit thinking?
What was the benefit of repeatedly putting yourself at risk when it was so easy to avoid that risk?

I would simply avoid that lane in that area, or keep my hands firmly on the wheel at that location, or better yet, not have Autopilot on at all during that short, dangerous stretch of road.
Or am I just being too reasonable?

I would have mounted a camera on the dash and left it running until I captured the event.
I would have gone to a service manager, carefully described the problem, and, had I caught it on video, given them a copy.
I would have had Tesla service people take the car out to that stretch of road, in the same conditions, and let them experience the problem.

Or is it more reasonable to keep putting yourself and others in danger until you finally crash?

Tesla has always been clear that the Autopilot system is a driver assist, not a self-driving car system: the human driver must keep hands on the steering wheel at all times, stay alert and watchful of the road, and be ready to take full control at any moment. That's why the car has a warning system to alert you when your hands are not on the steering wheel.

Tesla was always clear that the performance of the Autopilot system depends on road markings for its decision-making, and that drivers must be ready to correct any misreads the software might make when the road markings are non-standard or in error.

So, while I feel for his family, I cannot see how anything the owner did was reasonable. He repeatedly chose to risk his life in a manner that could easily have been avoided, and eventually the predictable outcome happened.

It's like sticking a metal fork in an electric toaster to pull out your stuck bagel, seeing an arc of electricity the first time, and then continuing to use a metal fork to get your stuck bagel out until you are eventually electrocuted, and then blaming the maker of the toaster.

If your bagel gets stuck and you really have no tool other than a metal object to remove it, then at least unplug the toaster first!

Or am I being too reasonable?


Posted by Cordelia
a resident of Old Mountain View
on Apr 11, 2018 at 3:28 pm

Cordelia is a registered user.

Tesla Autopilot thought the concrete median was just another lane and steered right into it. This was inevitable because Autopilot is in beta-testing, which means that finding problems is to be expected. It's the driver's job to catch the computer's mistakes and report them for fixing.

We've decided as a society that we value progress over safety, so treating customers as lab rats is perfectly legal in this situation. Many would even argue that this dangerous testing is necessary in order to more quickly achieve true autonomous driving and save even more lives down the road.


Posted by Richt
a resident of Rex Manor
on Apr 11, 2018 at 3:49 pm

Richt is a registered user.

@resident

"Customers have to realize that TESLA AUTOPILOT is not a real autopilot. The name is just a marketing scam. The product is really only slightly more advanced from what other brands call CRUISE CONTROL. "

No, "CRUISE CONTROL" is nothing more than a system that holds a "set speed" until the driver taps the brakes or the buttons to change the set point.
CRUISE CONTROL only effects the gas pedal to add more gas as the car climbs a hill and reduce gas when going down hill. That's pretty much it.

No CRUISE CONTROL can hit the brakes for you.
No CRUISE CONTROL can steer the car on a straight course, like an airplane autopilot can.
No CRUISE CONTROL can steer the car at all.
No CRUISE CONTROL can detect lane lines or obstructions, or help the driver avoid accidents in any manner.

CRUISE CONTROL is simply a means to avoid speeding tickets and to somewhat relieve the strain on the driver's right foot on long trips at a steady speed.

Tesla Autopilot is very much like the autopilot in an airplane, which also cannot avoid crashes, react to changes, or do anything safely without a pilot keeping an eye on things, ready at all times to take full control.

Just like Tesla Autopilot.
Just like Tesla clearly tells its owners.


Posted by Richt
a resident of Rex Manor
on Apr 11, 2018 at 4:13 pm

Richt is a registered user.

@ Cordelia

"Tesla Autopilot thought the concrete median was just another lane and steered right into it."

The news reports I have heard/read say it was not the concrete median itself, but rather the road lane markings that the Tesla was following. Specifically, a set of old, obsolete markings that had been covered by a long stretch of safety barrier, so they were not visible. Then one day another car crashed into that safety barrier, and Caltrans removed it but did not replace it quickly enough. The old, obsolete lane markings led the Tesla Autopilot to treat them as the correct lane markings to follow, and thus the car crashed into the unprotected concrete median.

"This was inevitable because Autopilot is in beta-testing which means that finding problems is to be expected."

Finding problems, yes, but this crash was 100% avoidable if the driver had simply followed Tesla's directions, or had done anything at all to avoid the crash, like not using that lane or not using Autopilot when approaching that spot. Or if Caltrans had quickly replaced the barrier or had done a better job of removing the old lane markings.

"It's the driver's job to test the computer's mistakes and report them for fixing."

Sure, but this driver knowingly and repeatedly drove in exactly the manner most likely to kill himself, and he never reported the actual problem either.

"We've decided as a society that we value progress over safety"

Many people exploit the call for "safety" to justify many bad things that almost never actually improve real safety. People say "if only one life is saved, then it's worth doing," and then do something that won't in fact save any lives but will certainly cost more.

"so treating customers as lab rats is perfectly legal in this situation."

I don't think that's a fair way to put it.

"Many would even argue that this dangerous testing is necessary in order to more quickly achieve true autonomous driving and save even more lives down the road."

Considering how few deaths or even serious injuries have resulted from driving a Tesla compared to any other car, I don't see how Tesla is putting people at risk. Rather, Tesla is offering people a way to drastically reduce their risk of injury or death, if they follow Tesla's driving directions along with the driving laws that apply to all other cars.

As with so many other issues, the media is stirring up fear by grossly misrepresenting the facts and ignoring the overall impact of something the general public does not yet understand well.

Fear sells, and fear results in giving a small number of people more and more power over everyone else, in the name of "safety". And it's virtually never true that giving people in authority more power actually results in more safety for anyone but the few who gain power by screaming about "safety".

Compare the safety record of Tesla cars to that of any other maker, and Tesla always wins.


Posted by Cordelia
a resident of Old Mountain View
on Apr 11, 2018 at 4:55 pm

Cordelia is a registered user.

@Richt

Seems we agree on just about everything. The path to reliable autonomous vehicles is not going to be bloodless. Humans will not pay attention to the road at all times, whether beta-testing systems are involved or not. Computers will be more reliable than human drivers someday, so let's keep our eyes on the prize.

