Tesla's 'self-driving' software fails at train crossings


“They said when they got to the tracks, the car just turned left,” said Western Berks Fire Commissioner Jared Renshaw, who interviewed the driver. “The car was in self-driving mode, and it just turned left.” The incident received some news coverage at the time, but Tesla hasn’t addressed it, and the driver’s identity hasn’t been made public.

Frigoli said that if any Tesla should handle rail crossings well, it should have been his. He drives a 2025 Model Y with the latest Tesla self-driving hardware, known as HW4, and the most recent version of the software, FSD 13.2.9. He also had good driving conditions, with a mostly clear sky and no one on the road ahead of him, as his car approached the train tracks in June.

“I would think with flashing red lights the car should stop on its own,” he said. “In future iterations of FSD, hopefully they’ll code it to work and recognize the railroad crossings correctly.”

The train incidents have tested the faith of some otherwise-satisfied Tesla drivers.

“It’s kind of crazy that it hasn’t been addressed,” said Jared Cleaver, a Tesla owner in Oakland, California.

Cleaver, a project manager in the construction industry, said he was driving his 2021 Tesla Model 3 last fall when he approached a railroad crossing near downtown Oakland. He said he had FSD engaged and was watching the car carefully.

“The car came to a complete stop, and I was like, ‘OK, we’re good.’ And then the car just jumped forward like it was going to go,” he said.

He said he slammed on the brakes. “I don’t know if it would have just jerked and stopped, but I wasn’t going to wait to find out,” he said. He said the same thing happened again this month and that this time he was able to document the mishap on video, which he shared with NBC News.

Cleaver said he loves his car and uses FSD often. He said it sometimes amazes him but also makes dumb mistakes.

“I think it doesn’t perform nearly as well as Elon claims and Tesla claims, but I think it is good,” he said. “They seem to make a habit out of making these really big claims and then falling short. It bothers me.”

“It seems like borderline false advertising,” he said.

Tesla has previously been accused of exaggerating the capabilities of its software, including in a wrongful-death trial this summer over a different piece of Tesla software known as Autopilot. A jury in that case in Miami awarded $243 million to the plaintiff, finding Tesla 33% responsible for a crash. Autopilot is a narrower set of driver-assistance features that includes lane control and blind spot monitoring. Tesla has asked the trial judge to set aside the jury’s verdict or order a new trial.

Full Self-Driving is a package of driver-assistance features that Tesla owners and lease-holders can buy for $99 a month or a one-time fee of $8,000. The software package works with pre-installed hardware, including cameras that capture what’s around the vehicle. Despite the name, the software doesn’t make a Tesla autonomous, and it requires constant human supervision.

FSD has limited appeal. Musk said in July that “half of Tesla owners who could use it haven’t tried it even once,” and a survey of U.S. consumers in August found that only 14% said FSD would make them more likely to buy a Tesla. (Musk didn’t define the phrase “owners who could use it.”)

There are six levels of driving automation, according to a rating system developed by SAE International, a professional association for engineers. Level 0 indicates no automation, and Level 5 represents full self-driving under all conditions without human intervention. Tesla classifies FSD as a “Level 2” system, and it tells drivers in its online manual that FSD requires active human supervision at all times.

NHTSA said in October that it was investigating Tesla FSD’s ability to safely navigate through fog, glaring sun or other “reduced roadway visibility conditions.” Tesla has not provided an update, and NHTSA says the investigation remains open.

Musk, though, has continued to make claims that go beyond what his company says. He recently asserted that, with FSD, Tesla vehicles “can drive themselves,” a claim that experts say isn’t backed up by the evidence.

The rail industry has warned for years about the potential danger of autonomous vehicles. In 2018, the Association of American Railroads, a trade group, told federal regulators in a letter that it was a complicated problem, requiring self-driving cars to recognize “locomotive headlights, horns, and bells,” because not all rail crossings have gates and flashing lights. (Some rail crossings have only white X-shaped signs known as “crossbucks.”)

“Just as the behavior of automated vehicles must be governed as they approach busy roadway intersections, so too must their behavior be governed as they approach highway-rail grade crossings. Rail corridors must be afforded respect,” the association said. An association spokesperson said the rail industry’s position hasn’t changed but didn’t comment on specific incidents. Norfolk Southern declined to comment on the Tesla crash in Pennsylvania.

Last year, 267 people died at railroad crossings, according to the Federal Railroad Administration, which monitors the safety of rail crossings nationwide. It doesn’t track the makes or models of the vehicles involved or whether the vehicles were using autonomous software. The FRA said it was aware of some Tesla incidents but declined to comment.

The issue with Tesla FSD persists despite two examples that got widespread attention: a viral video from May 2024 in which a Tesla in Ohio nearly collided with a train, and a video from this July in which a Tesla robotaxi test rider said his vehicle failed to recognize a railroad crossing in Austin.

Joe Tegtmeyer, the robotaxi rider and a Tesla booster, said in a video on X that his vehicle was stopped in an area with a traffic light and a railroad crossing. Then, he said, it began to move at precisely the wrong moment.

“The lights came on for the train to come by, and the arms started coming down, and the robotaxi did not see that,” he said. He said a Tesla employee sitting in the front passenger seat overseeing the robotaxi had to stop the vehicle until the train had passed.

Tegtmeyer declined an interview request. Responding to a question from NBC News, he said in a post on X that he had taken 122 successful rides in Tesla’s robotaxi service and that he thought the service, which began in June in Austin, worked well overall. The Tesla robotaxis use a new version of the FSD software, different from the consumer version, according to Musk.

One Tesla driver said his vehicle didn’t make the same mistake twice. Nathan Brassard of Brunswick, Maine, said his Tesla Cybertruck in FSD mode correctly handled a recent train crossing by braking before the white stop line on the road — an improvement, he said, from an earlier experience when it stopped on the tracks at a red light and he had to take over.

“It’s fun for me to watch it get better,” he said. “I don’t mind when I need to step in. It’s just so much easier than driving, especially on long trips.”

But experts in autonomous technology said there’s no way to be sure whether Tesla’s FSD software is improving, because outsiders have little to no visibility into how it works. According to Musk, the latest versions of FSD aren’t built on human-written rules at all but are end-to-end neural networks, trained solely on driving data rather than explicit instructions. Musk has compared the approach to ChatGPT.

Koopman of Carnegie Mellon said Tesla’s choice of training data — the hours of driving video that it feeds into the software to help it learn — is likely to be the root of the rail crossing issue. He said that engineers at Tesla have to choose which videos go into the training data and that it’s not known how many train examples they included.

“The only possible explanation is that it is not sufficiently trained on that scenario,” he said.

He also echoed a complicating factor raised separately by the rail industry: Not all train crossings are the same. Some have white stop lines on the road, but others don’t. Some have flashing lights and gate arms, but many don’t. Lights and gate arms can also malfunction.

Waymo, the market leader in robotaxis and a Tesla competitor, may have taken a different, more cautious approach to rail crossings. Waymo began “rider only” autonomous rides in 2019, and for years afterward some Waymo customers speculated in social media posts that the cars were routing around railroad tracks to avoid the risk.

A Waymo representative said the company’s software considers many factors when it draws up a route and confirmed that rail crossings are one such factor. She added that Waymo vehicles have been regularly crossing heavy rail lines, not just light rail lines, since 2023. She said Waymo uses audio receivers to detect train sounds, and the company has said it has a model rail crossing at a training facility in California.

NBC News didn’t find examples of customers complaining that Waymo vehicles nearly crashed into gates or ignored flashing lights.

“The team at Waymo has invested copiously in developing a safety model and developing a safety culture that seems to be working well,” said Bryan Reimer, a research scientist at the Massachusetts Institute of Technology’s Center for Transportation and Logistics.

Tesla said last year that it planned to begin using audio inputs for better handling of situations involving emergency vehicles, but it’s not clear whether it does so for trains or train crossings. The company didn’t respond to questions about it.

Reimer said he would expect Tesla’s FSD software to be able to handle rail crossings without supervision if Musk is serious that the cars can drive themselves, including as robotaxis.

“You’d think they’d be able to reliably detect this stuff,” he said.

David Ingram is a tech reporter for NBC News.

Tom Costello and Nollaig O'Connor contributed.
