Rules To Follow When Driving A Tesla On Autopilot

Autopilot is one of the most innovative features Tesla owners have access to, since it uses cameras, sensors, and AI to drive the car autonomously in certain circumstances. It's capable of a wide range of functions, and Elon Musk and his team frequently add more. Activating the self-driving capability is a very simple procedure, and when it's on, the car is mostly capable of driving down highways and well-marked roads without human intervention. The word "mostly" is very important there, as plenty of myths still surround Autopilot and its limitations.

Despite what some of Tesla's more bullish proponents might have you think, Autopilot does still make mistakes and can require emergency intervention at any time. If drivers don't follow the proper procedure when using it, they can put themselves and everyone else around them at unnecessary risk. That's not to say that the system is inherently unsafe — when used properly, it's generally proven itself to be just as safe as a human driver. However, it's always worth making sure you know the basic rules before venturing out onto the open road and trying out this revolutionary system for yourself.

Make sure the cameras are properly calibrated

Autopilot relies on a mix of sensors and cameras to relay data about the road and surrounding environment to the AI system. Newer Teslas have removed most of the sensors altogether and rely primarily on what the company calls "Tesla Vision," with HD cameras providing nearly all of the data that the car uses to navigate. In order to ensure that the data being fed back to the AI is accurate, it's important to make sure all these cameras are properly calibrated. The car will automatically detect that it needs to recalibrate after certain types of service or repair, and Autopilot will be disabled until this process is completed.

Tesla says that calibration is usually completed after around 20-25 miles of highway driving, and to ensure it's as quick as possible, it's best to drive on straight highways with multiple clearly marked lanes. In the event that one of the cameras has shifted away from its original position and the car doesn't automatically detect that recalibration is needed, drivers can manually reset the calibration through the Controls/Service menu.

Keep cameras and sensors clean

In addition to calibration, it's important to make sure that the car's cameras and sensors are kept free from any dirt or debris that might impede their functionality. This is particularly important with an Autopilot feature like Summon, which allows drivers to call their vehicle over from a parking lot to wherever they're standing. To maneuver out of its parking space and avoid hitting other cars, pedestrians, animals, or any other hazards, the car will need to have a crystal-clear view of everything in its surrounding area.

Even a small piece of dirt on a vital camera could mean that the car can't see said hazard, with potentially disastrous consequences. In most cases, the car will alert drivers if it detects any blockages, but as a precaution, Tesla advises drivers to conduct regular visual checks of their car's sensors and cameras to make sure they're clear and undamaged.

Monitor the car carefully when using Summon

Summon might be a very handy feature for drivers who often struggle to remember where they've parked their car, but it's one of several features that's currently offered to customers in Beta form. Tesla states in its owner's manual that drivers should continually monitor the vehicle while the feature is enabled, remaining prepared to take action at any time. Although the car's cameras can spot and maneuver around most obstacles, they're not always able to identify every hazard you might find in a parking lot or garage.

Tesla says that any object that's "very narrow" or located lower than the bumper might slip by the system unnoticed, and it's also not capable of detecting anything that's hung from a ceiling, like a bicycle. In short, the Summon feature is far from a foolproof system, and to avoid any costly damage to your car, your belongings, or anybody else, it's best to be very careful when using it.

Decide how independent you want lane changing to be

Another Beta feature offered as part of Tesla's Autopilot package is Navigate on Autopilot, which can change lanes while on the highway and follow your navigation route to guide you to off-ramps and exits. Tesla offers two settings for the lane-changing feature, the first of which requires the driver to confirm that they want the car to move lanes. They can do so by engaging the turn signal, and the car will then do the rest. Alternatively, they can opt to turn Lane Change Confirmation off, which allows the car to move between lanes without any driver input.

Picking which one suits you best largely comes down to how confident you are in Autopilot's capabilities. Especially in heavy traffic, drivers more acutely aware of the system's shortcomings might prefer to confirm each lane change rather than let the car decide independently, but that's not to suggest that either one is an inherently better option. If you're confident in the system and can ensure that you stay attentive even when there's no need for manual input, then turning confirmation off is the more convenient option.

Steer clear of fitting aftermarket accessories

There are plenty of things that can limit the car's ability to accurately gather the data that Autopilot requires to function, with aftermarket accessories being one of the most troublesome. Tesla specifically cites things like bike racks, tinted or aftermarket glass, and certain wraps as potential causes of interference, along with any in-car device that generates ultrasonic waves. As a general rule of thumb, it's best to keep the car exactly as it was when it left the factory, at least if the regular use of Autopilot is one of your biggest priorities.

The system can also be affected by bright lights, extreme temperatures, and poor visibility due to adverse weather conditions like fog or rain. Damage or a misaligned body panel can also cause issues, so if you spot any noticeable dents or loose paneling, then it's best not to use Autopilot until you get them fixed.

Keep your eyes on the road and hands on the wheel

One of the biggest criticisms of Autopilot has been its failure to ensure that drivers remain adequately attentive while the self-driving functions are activated. Earlier versions of the software required drivers to regularly apply slight pressure to the steering wheel while Autopilot was engaged, but the safeguard could easily be fooled. A video emerged of a driver wedging an orange into the steering wheel to trick the system, and a few companies even began selling an "Autopilot Buddy" accessory that clipped onto the wheel to mimic the weight of a driver's hand.

Tesla claimed to have fixed the problem by installing driver-monitoring cameras that check whether you're paying attention while Autopilot is engaged, but independent testing by Consumer Reports proved the system could still be cheated. In some instances, researchers found that they could cover the camera altogether without triggering a safety warning, and even if a warning did trigger, it wasn't effective enough to reliably ensure that the driver refocused on the road. It should go without saying that using any method to override Tesla's safety systems is inherently dangerous, not only to the car's occupants but to anyone else around them on the road. There is a simple solution — drivers need to make sure that they're not tempted to use distractions like phones while Autopilot is engaged, and that they remain alert to the road ahead of them, as Tesla recommends.

Watch out for unexpected braking

Anyone not paying adequate attention to their car's actions might find themselves caught out by one of Autopilot's occasional quirks, like its tendency to brake unexpectedly. A driver in late 2022 found this out the hard way when their Tesla slammed on its brakes on a San Francisco highway, causing a huge pileup that injured nine people. Video of the incident shows the car traveling down the highway with no obvious obstructions before abruptly stopping, causing several cars behind it to crash into it.

In another incident, Autopilot appeared to think that the moon was a yellow traffic light, causing it to repeatedly attempt to slow down on the highway. It's not just unexpected braking that could cause problems, either: a 2018 crash in a residential area appeared to show a Model 3 failing to brake before crashing into a tree, catching fire and killing its occupants. Disasters like these serve as stark reminders that, for all its impressive capabilities, Autopilot is still very much a system in development, and like any developmental software, it's prone to the occasional major bug. Except in Autopilot's case, these bugs could leave you unexpectedly stopped in the middle of a highway, or careening off it and into a tree.

Be careful that the steering wheel doesn't fall off

It might sound bizarre, but having the steering wheel suddenly fall off is a rare-yet-dangerous Tesla phenomenon that's currently under investigation by the NHTSA. It affects Model Y cars built for the 2023 model year, and it's caused by a manufacturing defect. It was found that certain cars were missing a retaining bolt that secured the steering wheel onto the steering column, but the problem wasn't initially noticeable because the friction fit created through installation made the wheel appear to be connected.

When a large force was exerted on the steering wheel, it could suddenly detach. In theory, the jolt associated with grabbing hold of the steering wheel in an emergency could cause the detachment, rendering the car effectively out of control. The last thing a driver wants when trying to take control of the car after an Autopilot error is for the steering wheel to come loose, although it's worth noting that, at the time of writing, the fault is only known to have affected a handful of cars. Owners of the 2023 Model Y will no doubt continue to watch the situation closely, but for now, no recall has been triggered and Tesla has yet to take any further remedial action.

Be aware of the tech's limitations

Despite some assumptions among owners that the car is capable of driving itself without intervention, that's simply not the case at present. Adding further confusion is the recently recalled "Full Self-Driving" mode, which, from its name, many would expect to be able to drive the car autonomously. To be clear, it's not capable of completely autonomous self-driving at all, and Musk and Tesla are reportedly under investigation over whether they misled customers about its capabilities.

Tesla's owners manual is a little clearer about the situation, stating that drivers should not "depend on [Autopilot] components to keep [them] safe. It is the driver's responsibility to stay alert, drive safely, and be in control of the vehicle at all times." Autopilot can be confused by anything from temporary road signs to cross-traffic turns at intersections, and it's also not capable of adjusting for slippery conditions or sharp turns in winding roads. It might be an impressive, futuristic piece of technology, but Autopilot and Full Self-Driving are not fully autonomous driving systems, and drivers need to be careful not to treat them as such.

Remember Elon Musk allegedly exaggerated Autopilot's capabilities

On that note, it's important to keep in mind that there have been allegations that Elon Musk has had a tendency to exaggerate what Autopilot is able to do. A senior Tesla engineer recently testified under oath that a 2016 promotional video showing off Autopilot's tech was not reflective of the system's capabilities at all, but rather had been staged. Instead of using its systems to drive independently like the video implied, the Model X shown was instead driving along a set path that had been pre-programmed, with all the relevant 3D data already fed into the system.

Moreover, Musk had personally overseen the production of the demonstration, sending late-night emails to engineers saying "Since this is a demo, it is fine to hardcode some of it, since we will backfill with production code later in an OTA [over-the-air] update." While this doesn't mean that Musk has misled customers on any recent aspects of Autopilot's capabilities, it shows that at the very least, he's previously been willing to bend the truth in an effort to impress customers and investors. So, it's worth taking Elon's more ambitious claims about Tesla's newest Autopilot features with a grain of salt, at least until there's plenty of real-world data to back them up.