Tesla worker killed in fiery crash may be first ‘Full Self-Driving’ fatality
Evidence suggests the advanced driver-assistance system was engaged during the crash that killed recruiter Hans von Ohain in 2022
The following footage, obtained by The Washington Post, shows Colorado authorities responding to a car crash on May 16, 2022, in Evergreen, Colo.
Feb. 13, 2024 at 5:55 a.m.
EVERGREEN, Colo.
Hans von Ohain and Erik Rossiter were on their way to play golf one afternoon in 2022 when von Ohain’s Tesla suddenly swerved off Upper Bear Creek Road. The car’s driver-assistance software, Full Self-Driving, was struggling to navigate the mountain curves, forcing von Ohain repeatedly to yank it back on course.
“The first time it happened, I was like, ‘Is that normal?’” recalled Rossiter, who described the five-mile drive on the outskirts of Denver as “uncomfortable.” “And he was like, ‘Yeah, that happens every now and then.’”
Hours later, on the way home, the Tesla Model 3 barreled into a tree and exploded in flames, killing von Ohain, a Tesla employee and devoted fan of CEO Elon Musk. Rossiter, who survived the crash, told emergency responders that von Ohain was using an “auto-drive feature on the Tesla” that “just ran straight off the road,” according to a 911 dispatch recording obtained by The Washington Post. In a recent interview, Rossiter said he believes von Ohain was using Full Self-Driving, which, if true, would make his death the first known fatality involving Tesla’s most advanced driver-assistance technology.
Tesla owners have long complained of occasionally erratic behavior by the cars’ software, including sudden braking, missed road markings and crashes with parked emergency vehicles. Since federal regulators began requiring automakers to report crashes involving driver-assistance systems in 2021, they have logged more than 900 in Teslas, including at least 40 that resulted in serious or fatal injuries, according to a Post analysis.
Most involved Autopilot, which is designed for use on controlled-access highways. No fatal crash has been definitively linked to the more sophisticated Full Self-Driving, which is programmed to guide the car nearly anywhere, from quiet suburban roads to busy city streets.
Two years ago, a Tesla shareholder tweeted that there “has not been one accident or injury” involving Full Self-Driving, to which Musk responded: “Correct.” But if that was accurate at the time, it no longer appears to be so. A Tesla driver who caused an eight-car pileup with multiple injuries on the San Francisco-Oakland Bay Bridge in 2022 told police he was using Full Self-Driving. And The Post has linked the technology to at least two serious crashes, including the one that killed von Ohain.
[The final 11 seconds of a fatal Tesla Autopilot crash]
Von Ohain and Rossiter had been drinking, and an autopsy found that von Ohain died with a blood alcohol level of 0.26, more than three times the legal limit, a level of intoxication that would have hampered his ability to maintain control of the car, experts said. Still, an investigation by the Colorado State Patrol went beyond drunken driving, seeking to understand what role the Tesla software may have played in the crash.
“Next thing I remember is waking up in the vehicle.”
— Erik Rossiter
The question is critical as automakers race toward the promise of a driverless future. For personal cars, that day is far from here. But critics say features like Full Self-Driving already are giving drivers a false sense of confidence about taking their eyes off the road, or getting behind the wheel after drinking, underscoring the dangers of letting consumers test an evolving, experimental technology on the open road.
Tesla did not respond to multiple requests for comment. The company, which has released Full Self-Driving to about 400,000 customers, acknowledges that the software is in “beta” mode, meaning it is still in development, constantly learning and being modified. But Tesla argues that its public release is a necessary step toward reducing America’s 40,000 annual road deaths. “The more automation technology offered to assist the driver, the safer the driver and other road users,” Tesla tweeted in December.
At the same time, Tesla user manuals cite a long list of circumstances under which Full Self-Driving may not function properly, including narrow roads with oncoming cars and curvy roads. The company has long maintained that drivers must control their cars and that Tesla is not liable for distracted or drunken driving.
[Tesla drivers run Autopilot where it’s not intended — with deadly consequences]
Several lawsuits have begun challenging the view that drivers are solely responsible when Tesla’s software allegedly causes crashes or fails to prevent them. So far, Tesla has prevailed. Last fall, a California jury found Tesla not liable for a 2019 Autopilot crash in which survivors said the car suddenly veered off the road. At least nine more cases are expected to go to trial this year.
Von Ohain’s widow, Nora Bass, said she has been unable to find a lawyer willing to take his case to court because he was legally intoxicated. Still, she said, Tesla should take at least some responsibility for her husband’s death.
“Regardless of how drunk Hans was, Musk has claimed that this car can drive itself and is essentially better than a human,” Bass said. “We were sold a false sense of safety.”

Von Ohain used Full Self-Driving nearly every time he got behind the wheel, Bass said, placing him among legions of Tesla boosters heeding Musk’s call to generate data and build the technology’s mastery. While Bass refused to use the feature herself, saying its unpredictability stressed her out, her husband was so confident in all it promised that he even used it with their baby in the car.
“It was jerky, but we were like, that comes with the territory of” new technology, Bass said. “We knew the technology had to learn, and we were willing to be part of that.”
“Now it feels like we were just guinea pigs.”
— Nora Bass
Von Ohain, a former Marine originally from Cincinnati, joined Tesla in late 2020 as a recruiter for engineers, drawn to the company’s mission of bringing electric and autonomous vehicles to the masses, Bass said. He also was inspired by the idea of working for Musk, she said, a “smart guy” who built a company that promised to save lives and make the roadways safer.
Von Ohain “had this opportunity to be part of a company that’s working on insanely advanced technology, and we had always thought Elon Musk was fascinating,” Bass said. “Hans was so interested in smart minds.”
At the time, Tesla had just launched Full Self-Driving, and would eventually release it to a wider group of owners who had been monitored by the carmaker and declared safe drivers. Like many Tesla employees, von Ohain received the feature, then a $10,000 option, free with his employee discount, according to Bass and a purchase order reviewed by The Post.
Though still in its beta phase, the technology is “the difference between Tesla being worth a lot of money and being worth basically zero,” Musk has said, noting the enthusiasm of his customers, and his investors, for a fully autonomous car. Many major automakers have been developing advanced driver-assistance technology, but Tesla was more aggressive in pushing sophisticated features out to an eager public.

For years, Musk had preached the benefits of pursuing autonomous driving. In 2019, he predicted that it would one day be so reliable that drivers “may fall asleep,” though for now Tesla’s user agreement requires the driver to stay engaged and ready to take over from Full Self-Driving at all times.
In 2022, Tesla recalled more than 50,000 vehicles amid concerns that Full Self-Driving caused the car to roll through stop signs without coming to a full halt. Even as Musk tweeted months later that Tesla had made Full Self-Driving Beta available to anyone in North America who bought it, complaints continued to pile up: Drivers reported that cars would stop short, blow through stop signs or suddenly veer off the road when lane markings were unclear.
“We test as much as possible in simulation and with [quality assurance] drivers, but reality is vastly more complex,” Musk tweeted last spring about a new version of the software. Tesla employees would get it first, he said, with wider release to come “as confidence grows.”
“I’m glad it wasn’t Nora and his daughter in that car.”
— Erik Rossiter
On the day of the crash, von Ohain and Rossiter played 21 holes of golf, downing several drinks along the way. Though an autopsy would later show that von Ohain was legally drunk, Rossiter said he seemed composed and “not at all intoxicated” as they got in the Tesla and headed home.
Rossiter, who was found to have a similar blood alcohol level, can recall only shreds of the crash: A bright orange glow. Careening off the road. Jumping out of the car and trying to pull his friend out. The driver’s-side door blocked by a fallen tree.
As Rossiter yelled for help on the deserted mountain road, he remembers, his friend was screaming inside the burning car.

Colorado State Patrol Sgt. Robert Madden, who oversaw the agency’s investigation, said it was one of the “most intense” vehicle fires he had ever seen. Fueled by thousands of lithium-ion battery cells in the car’s undercarriage, according to the investigation report, the fire is what killed von Ohain: His cause of death was listed as “smoke inhalation and thermal injuries.” Madden said he probably would have survived the impact alone.
At the scene of the crash, Madden said, he found “rolling tire marks,” meaning the motor continued to feed power to the wheels after impact. There were no skid marks, Madden said, meaning von Ohain appeared not to have hit the brakes.
“Given the crash dynamics and how the vehicle drove off the road with no evidence of a sudden maneuver, that fits with the [driver-assistance] feature” being engaged, Madden said.
Colorado police were unable to access data from the car because of the intensity of the fire, according to the investigation report, and Tesla said it could not confirm that a driver-assistance system had been in use because it “did not receive data over-the-air for this incident.” Madden said the remote location may have hindered communications.
However, Tesla did report the crash to the National Highway Traffic Safety Administration. According to NHTSA, Tesla received notification of the crash through an unspecified “complaint” and alerted federal authorities that a driver-assistance feature had been in use at least 30 seconds before impact. Because of the extensive fire damage, NHTSA could not confirm whether it was Full Self-Driving or Autopilot.
In December, Tesla acknowledged problems with driver inattention, issuing a recall for nearly all of its 2 million U.S. vehicles to add more-frequent alerts. Bass said that von Ohain knew he needed to pay attention but that his focus naturally flagged with Full Self-Driving.
“You’re told that this car should be smarter than you, so when it’s in Full Self-Driving, you relax,” she said. “Your reaction time is going to be slower than if we weren’t in Full Self-Driving.”


Alcohol also dramatically slows reaction time, and von Ohain’s intoxication probably factored heavily in the crash, said Ed Walters, who teaches autonomous vehicle law at Georgetown University. If the technology was acting up on the way to the golf course, as Rossiter claims, von Ohain should have known that he needed to remain fully alert on the drive home, Walters said.
“This driver, when sober, was able to pull the car back on the road and was able to correct for any problems in the Tesla safely,” Walters said. “People need to understand that whatever kind of car they’re driving, whatever kind of software, they need to be paying attention. They need to be sober and they need to be careful.”
Still, Andrew Maynard, a professor of advanced technology transitions at Arizona State University, said reports of the car’s frequent swerving raise questions about Tesla’s decision to release Full Self-Driving.
“The FSD technology isn’t quite ready for prime time yet,” Maynard said, adding that the value of testing the technology on the open road should be weighed against the risks of pushing it out too quickly to drivers who overestimate its capabilities.
[Why Tesla Autopilot shouldn’t be used in as many places as you think]
“I have so many questions.”
— Nora Bass
Nearly two years later, mangled car parts and charred battery cells are still strewn along Upper Bear Creek Road. Madden closed the Colorado State Patrol investigation, unable to determine whether Full Self-Driving played a role.
In a recent interview, Madden said he worries about the proliferation of sophisticated driver-assistance systems. “Autonomous vehicles are something of the future, and they are going to be here,” he said. “So the more we know, the more we understand, the safer we can proceed into the future with this” technology.
Meanwhile, Tesla has yet to publicly acknowledge the death of an employee driving one of its cars.
To its workforce, the company has said little about what happened to von Ohain, making few efforts to console those who knew him, according to a former employee who spoke on the condition of anonymity for fear of retribution. Von Ohain’s replacement was hired within a few weeks, the person said.
“Once Hans passed away and time went by, there wasn’t any more discussion about him,” said the former employee, a member of von Ohain’s team who soon resigned.
To von Ohain’s widow, Tesla’s silence seemed almost cruel.
Though the company eventually helped cover the cost of her move back home to Ohio, Bass said, Tesla’s first communication with the family after the crash was a termination notice she found in her husband’s email.
About this story
Additional development by Jake Crump. Editing by Lori Montgomery and Karly Domb Sadof. Video production by Jessica Koscielniak. Design editing by Betty Chavarria. Photo editing by Monique Woo. Video graphics by Sarah Hashemi. Copy editing by Anne Kenderdine and Martha Murdock.