Tesla has decided to use a bizarre defense in a lawsuit brought by the family of a Tesla owner who died in an accident while using Autopilot a few years ago.

The automaker claimed that CEO Elon Musk shouldn’t be made available to explain some of his statements on self-driving because some of the public comments might have been “deep fakes.”

The lawsuit revolves around the death of Walter Huang, an Apple engineer who died in his Tesla Model X while driving to work in 2018.

As we previously reported, the Model X was driving on Autopilot when it entered the median of a ramp on the highway as if it was a lane and hit a barrier about 150 meters after going into the median.

The impact was quite severe because there was no crash attenuator; it had already been destroyed in a previous crash. The driver was rushed to the hospital, but he died of his injuries.

NHTSA investigated the accident and confirmed that the vehicle was using Autopilot at the time of the crash, but it placed the blame on the driver, who, according to phone data, was playing a video game at the time of the accident, and on the missing crash attenuator.

Tesla tells drivers to always pay attention and be ready to take control when using Autopilot.

The Huang family decided to sue anyway, arguing that some of Tesla's statements, and more specifically some of CEO Elon Musk's comments about Autopilot and self-driving, led Huang to believe he could use Autopilot in the manner that led to the crash.

The lawsuit is set to go to trial in Santa Clara County Superior Court this year, but Tesla has tried to keep Musk and his statements out of the case with a quite bizarre defense.

The automaker is claiming that some of the statements that Musk is believed to have made might have been “deep fakes,” and therefore he shouldn’t be questioned on them.

A deep fake generally refers to synthetic media that has been digitally manipulated to convincingly replace one person's likeness with another's, but the term is also used for CGI videos made to make someone appear to say something they never actually said.

Judge Evette D. Pennypacker didn’t buy the argument. She said in her judgment (via The Telegraph):

Their position is that because Mr Musk is famous and might be more of a target for deep fakes, his public statements are immune. In other words, Mr Musk, and others in his position, can simply say whatever they like in the public domain, then hide behind the potential for their recorded statements being a deep fake to avoid taking ownership of what they did actually say and do.

She ruled that Musk should be made available for a deposition of up to three hours to discuss his statements about Tesla Autopilot and Full Self-Driving.

Electrek’s Take

That's bizarre. If Tesla thinks some of the statements are deep fakes, it should say exactly which ones and try to prove it. The mere possibility of creating deep fakes certainly doesn't make anyone immune to scrutiny of their statements.

Also, it’s not like we don’t know for a fact that Musk has made some fairly ambitious statements about Tesla Autopilot and Full Self-Driving.

Is he now going to claim that he never said, three years ago, that Tesla would have 1 million robotaxis on the road by the end of the year? Was that a deep fake? Was it also a deep fake when he said it again the following year? It's ridiculous, and worrying, that Tesla would try such a defense. I guess that's Tesla's new "hardcore litigation team" at work.

However, in this case, the Huang family is facing an uphill battle because despite Musk’s comments about what he believes Tesla could achieve with self-driving in the future, Tesla has always been clear about how drivers should use Autopilot.

Every time Autopilot is activated, it tells the driver to keep their hands on the steering wheel and be ready to take control at all times. The data indicate that Huang was playing a video game, not paying attention, and had plenty of time to react after the car entered the median and before it hit the barrier. He was clearly not using Autopilot as intended.