
Federal auto safety regulators have issued Tesla a special order requiring it to submit detailed information about its driver assistance and driver monitoring systems, including a previously secret configuration for those systems known as "Elon mode."
When a Tesla driver uses the company's driver assistance systems, marketed as Autopilot, Full Self-Driving or FSD Beta options, a visual symbol blinks on the car's touchscreen to prompt the driver to apply resistance to the steering wheel.
If the driver leaves the steering wheel unattended for too long, the "nag" escalates to a beeping sound. If the driver still does not respond, the vehicle can disable its advanced driver assistance features for the rest of the trip or longer.
As CNBC previously reported, when the "Elon mode" configuration is enabled, Tesla can allow a driver to use Autopilot, FSD or FSD Beta without the so-called nag.
In a letter and special order sent to Tesla on July 26, the National Highway Traffic Safety Administration sought details about the use of what apparently includes this special configuration, including how many vehicles and drivers Tesla has authorized to use it. The file was added to the agency's website on Tuesday and was first reported by Bloomberg.
The letter and special order were penned by John Donaldson, the agency's acting chief counsel.
He wrote that NHTSA is concerned about the safety effects of recent changes to Tesla's driver monitoring system. That concern, he said, is based on information suggesting it may be possible for vehicle owners to change Autopilot's driver monitoring settings so that a driver can operate the car in Autopilot for extended periods without being prompted to apply torque to the steering wheel.
Tesla was given until Aug. 25 to provide all the information demanded by the agency. The company responded on time, but NHTSA granted its response confidential treatment at Tesla's request. Tesla did not immediately respond to CNBC's request for comment.
After the order was made public, Philip Koopman, an automotive safety expert and associate professor of computer engineering at Carnegie Mellon University, told CNBC that NHTSA appears to take a dim view of cheat codes that allow drivers to disable safety features such as driver monitoring. He said he agrees, adding that hidden features that degrade safety have no place in production software.
Koopman also noted that NHTSA has yet to conclude several investigations into crashes in which Tesla Autopilot systems may have played a role, including a string of "fatal truck under-run crashes" and collisions in which Teslas struck stationary first responder vehicles. Acting NHTSA Administrator Ann Carlson has suggested in recent press interviews that a conclusion is near.
For years, Tesla has told regulators, including NHTSA and the California DMV, that its driver assistance systems, including FSD Beta, are merely "level 2" and do not make its cars autonomous, even though they are marketed under brand names that could suggest otherwise. Meanwhile, Elon Musk, Tesla's CEO and the owner and operator of the social network X (formerly Twitter), often implies that Tesla vehicles can drive themselves.
Over the weekend, Musk livestreamed a test drive on the social network in a Tesla equipped with a still-in-development version of the company's FSD software (v12). Ashok Elluswamy, Tesla's director of Autopilot software engineering, rode along as Musk streamed the demonstration while driving.
The fuzzy video stream did not show all the details of his touchscreen, nor did it demonstrate that Musk had his hands on the steering yoke and was ready to take over the driving task at any moment. At times, he appeared to have no hands on the yoke.
According to Greg Lindsay, an Urban Tech fellow at Cornell, Musk's use of Tesla's systems would likely violate the company's own terms of use for Autopilot, FSD and FSD Beta. He told CNBC the entire drive was like "waving a red flag in front of NHTSA."
Tesla's website warns drivers, in a section titled "Using Autopilot, Enhanced Autopilot, and Full Self-Driving Capability," that "it is your responsibility to stay alert, keep your hands on the steering wheel at all times, and maintain control of your car."
According to Bruno Bowden, managing partner at Grep VC and an investor in the autonomous driving startup Wayve, the demonstration showed that Tesla is making some technical progress but still has a long way to go before it can offer a safe, self-driving system.
During the drive, he observed, the Tesla system nearly ran a red light, forcing Musk to intervene and brake in time to avoid any danger.