
The Feds Want to Know More About Tesla Autopilot's Elon Mode — Do You?

31 Aug 2023, by CleanTechnica

Federal inquiries into Tesla are so frequent that we’ve become a bit desensitized when we see another one announced in the media. Yet, when news emerged of a federal automotive safety letter sent to Tesla recently, it seemed different. Partly, that’s because of a largely hidden Autopilot configuration under scrutiny, dubbed “Elon mode” by the researcher who discovered it. And there’s the revelation that Tesla owners may have had this Autopilot enhancement enabled without their knowledge or permission.

Regulators issued a special order on July 26 demanding that Tesla turn over detailed data about its driver assistance and driver monitoring systems. As reported by CNBC, the “Elon mode” configuration eliminates what’s called a “nag” that would otherwise remind Tesla drivers to keep their hands on the steering wheel. How many Tesla drivers have ever had this configuration enabled? Did a lack of notification about this endanger Tesla drivers? Those questions nag at us, don’t they (pun intended)?

The letter and special order expressed concern that knowledge of Elon mode may inspire more drivers to attempt to activate it, leading to reduced driver engagement, greater inattention, and failure to properly supervise Autopilot.

“NHTSA is concerned about the safety impacts of recent changes to Tesla’s driver monitoring system. This concern is based on available information suggesting that it may be possible for vehicle owners to change Autopilot’s driver monitoring configurations to allow the driver to operate the vehicle in Autopilot for extended periods without Autopilot prompting the driver to apply torque to the steering wheel.”

It’s definitely not the first time the National Highway Traffic Safety Administration (NHTSA) has issued a recall notice for certain Tesla vehicles equipped with the Full Self-Driving (FSD) suite of driver assistance technologies. On February 15, 2023, NHTSA issued recall 23V-085, part of its ongoing communications with the company about its self-driving capabilities.

Hands On, Hands Off — Elon Mode Revealed

Driver assistance technologies are generally seen as a very good thing. They hold the potential to reduce traffic crashes and save thousands of lives each year, according to the NHTSA. Advanced driver-assistance systems (ADAS) combine technologies with a human-machine interface to increase car and road safety. Sensors and cameras detect nearby obstacles or driver errors and assist with reactions to the problem. ADAS can enable various levels of autonomous driving.

Yet NHTSA sent a letter and special order to Tesla on July 26, seeking details about the use of what is apparently a special notification configuration, including how many cars and drivers Tesla has authorized to use it.

The Tesla website states that “Tesla driver assistance systems require a human driver to remain attentive and ready to brake or steer at any moment.” Often critiqued as being misnamed due to its implication that humans aren’t really needed to drive a Tesla, Autopilot “enables your car to steer, accelerate, and brake automatically within its lane,” the company says. “Current Autopilot features require active driver supervision and do not make the vehicle autonomous.”

Current FSD features add to the navigation available with Autopilot. A driver can program in a highway trip, which can include finding the most efficient route as well as navigating the on- and off-ramps, exits, interchanges, and lane changes. Once underway, FSD engages Auto Lane Change and Autosteer to evaluate and execute a lane change when the driver uses a turn signal. Later, Autopark can identify parallel and perpendicular parking spots at low speeds and guide the car into the spot. Summon can direct the vehicle to move into and out of a parking space.

FSD does not drive the car entirely by itself. A driver’s hands must remain on the steering wheel, and the driver is ultimately responsible for all driving functions. That means a responsible driver must be in the driver’s seat, alert, and prepared to take control when needed. Full Self-Driving does not absolve a Tesla driver of responsibility for the safe operation of the vehicle, and if the system operates improperly and a crash results, neither the software nor Tesla takes responsibility for that incident.

The company said its productive Q2 2023 performance was directly related to artificial intelligence (AI) development, which entered a new phase with initial production of Dojo training computers. The Dojo supercomputer will be able to process massive amounts of data, including videos from its cars, to further develop software for self-driving cars. Tesla’s complex neural net training needs will be satisfied with this in-house designed hardware, as the company has determined that “the better the neural net training capacity, the greater the opportunity for our Autopilot team to iterate on new solutions.”

It is this requirement of driver supervision that is under US federal regulator scrutiny. When a Tesla driver uses the Autopilot, FSD, or FSD Beta options, a prompt flashes on the touchscreen, urging the driver to grab the steering wheel. Linger too long without complying, and the “nag” escalates to a beeping rhythm. If the driver still doesn’t act as requested, the ADAS can shut down. Newly discovered is a hidden setting in Tesla vehicles that the company can enable so that the hands-on-the-steering-wheel prompt never appears. That means a Tesla driver can use ADAS without having hands on the steering wheel.
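To make that escalation concrete, here is a minimal sketch of the kind of driver-monitoring state machine described above. It is purely illustrative: the function and variable names, the timing thresholds, and the suppression flag are assumptions made for this example, not Tesla’s actual implementation.

```python
# Illustrative sketch only -- not Tesla code. All names and thresholds here
# are assumptions made for this example.
from dataclasses import dataclass

# Hypothetical escalation thresholds, in seconds of hands-off driving.
VISUAL_NAG_AFTER = 15.0    # flash the touchscreen prompt
AUDIBLE_NAG_AFTER = 30.0   # escalate to a beeping rhythm
DISENGAGE_AFTER = 45.0     # shut the driver-assistance system down

@dataclass
class MonitorState:
    nag_suppressed: bool = False      # the hidden "no-nag" style flag
    seconds_hands_off: float = 0.0    # time since steering torque was last detected

def driver_monitor_step(state: MonitorState, hands_on_wheel: bool, dt: float) -> str:
    """Advance the monitor by dt seconds and return the action to take."""
    if hands_on_wheel:
        state.seconds_hands_off = 0.0
        return "none"
    state.seconds_hands_off += dt
    if state.nag_suppressed:
        # With the prompt suppressed, the escalation chain never starts,
        # which is the safety concern described in the article.
        return "none"
    if state.seconds_hands_off >= DISENGAGE_AFTER:
        return "disengage_adas"
    if state.seconds_hands_off >= AUDIBLE_NAG_AFTER:
        return "audible_nag"
    if state.seconds_hands_off >= VISUAL_NAG_AFTER:
        return "visual_nag"
    return "none"
```

In this toy model, flipping the single `nag_suppressed` flag removes every prompt and the eventual shutdown, which mirrors why regulators want to know how many vehicles have such a configuration enabled.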

CleanTechnica’s Zachary Shahan mused a couple of months back that he’d be “curious how Elon Mode diverges from normal FSD Beta in city driving. Highway driving is quite straightforward and simple.” I guess Zachary learned more a few days ago, when the eponymous Musk livestreamed an FSD test drive that caused much controversy.

The test drive may have, in fact, violated the company’s own terms of use for Autopilot, FSD, and FSD Beta, according to Greg Lindsay, an urban tech fellow at Cornell. He told CNBC that the entire drive was like “waving a red flag in front of NHTSA.” During several moments of the video, Musk had no hands on the steering wheel.

The researcher who coined the “Elon mode” phrase says he expects future recalls related to issues with FSD Beta, particularly how well the system automatically stops for “traffic-control devices” like traffic lights and stop signs. (The researcher isn’t just a naysayer — he reports bugs back to Tesla and receives bug bounties.) The Full Self-Driving system builds on top of Tesla’s standard Autopilot driver-assistance system and costs $15,000, which is about double what the feature cost in 2020 and nearly three times what it cost in 2019.

