These Tesla Drivers Claim Their 'Full Self Driving' EVs Tried To Drive Onto Active Train Tracks
Using Tesla's Full Self-Driving feature can turn a long, boring trip into a relaxing drive by handing much of the driving task over to the car. While Tesla's driver-assistance technology has been an impressive leap for the automotive industry, some drivers have run into serious problems with the system. Tesla drivers have reported that the self-driving software steered their vehicles off the road, and in some cases, onto active train tracks.
A recent report from NBC News detailed the experiences of multiple Tesla drivers who warned of the dangers the self-driving system poses when approaching train tracks. According to one Tesla Model Y driver in Texas, when his SUV approached a railroad crossing with the signals flashing to announce an oncoming train, the Model Y didn't blink an eye and tried to drive straight over the tracks. The NBC News team and the driver attempted to recreate the situation, and the Model Y again rolled toward the tracks as the crossing arms came down; once more, the driver had to slam on the brakes before the self-driving system carried the car onto the rails.
The story goes on to mention other drivers who have encountered similar situations. An NBC News investigation into Tesla's self-driving system found that at least six other drivers had the system fail to recognize railroad crossings. In one case, the vehicle drove straight through the crossing arms. In another, a Tesla drove onto the train tracks and ended up under a train.
Tesla issues over the years
Unfortunately, Tesla has seen a number of issues over the years with both the technology in its vehicles and the vehicles themselves. One recurring problem involves Tesla's flush-mounted door handles, which are designed to pop out when the key is near the door. In numerous crashes involving Tesla vehicles, the hidden door handles have failed to pop out, leaving would-be rescuers unable to open the doors.
In fact, in 2019, a man named Dr. Omar Awan crashed his Tesla Model S into a palm tree. The vehicle caught fire, and when bystanders attempted to help Dr. Awan, the door handles would not pop out, and the bystanders were unable to open the door. Tragically, Dr. Awan died inside his Model S from the smoke and fire.
Earlier this year, the National Highway Traffic Safety Administration opened an investigation into the electronic door handles on more than 170,000 Tesla Model Y vehicles. The problem occurs when the vehicle's battery runs low, leaving too little voltage for the door handles to work properly. Bloomberg recently reported that Tesla plans to redesign the door handles to help in "panic situations."
Are autonomous vehicles safe?
There are six levels of driving automation, ranging from Level 0 (warning and momentary-assistance features such as lane departure warning and automatic emergency braking) to Level 5 (fully autonomous, with no driver input required). No consumer vehicle on sale today offers Level 4 or Level 5 capability. The most common form of automated driving is Level 2, which still requires constant driver supervision, and that is the level at which Tesla's Full Self-Driving is categorized.
Autonomous and self-driving vehicles are incredible feats of technology; there is no denying that. With numerous automakers releasing at least semi-autonomous driving features, it may seem as if manufacturers are close to perfecting this life-changing technology. Drivers on California streets regularly see Waymo robotaxis ferrying passengers to and fro. However, there have been many issues with how well the autonomous driving functions on these vehicles actually work.
In May of 2025, autonomous vehicle developer Zoox recalled its robotaxis from Las Vegas streets after one collided with a passenger vehicle. In California, police officers have pulled over autonomous vehicles for committing traffic violations, only to walk up to the door and find nobody behind the wheel. The officers did not write the autonomous vehicle a ticket, of course, which brings another big issue into the self-driving vehicle equation: when things go wrong, who is at fault?