From 'Demon Mode' to 'Elon Mode'
Are Tesla's methods of developing safety-critical software madness, or are they on course to permanently upend safety orthodoxy? I wish I knew.
Code in plain sight
Long-time readers of ‘Tech Safe Transport’ will know that one of the themes I return to is the growing challenge of assuring software, given that it is essentially opaque to users and regulators.
This concern has always been premised on the idea that the opacity was an unintended consequence of the growing power of computers and the increasing automation of transport functions. I had never considered that the lack of transparency might be driven by a deliberate intent to hide safety-related functions in code. Step forward Elon Musk and the Tesla ‘Elon Mode’.
‘Elon Mode’
The U.S. highway safety regulator is currently demanding an explanation from Tesla for a software change that allows drivers to keep their hands off the wheel for longer. The National Highway Traffic Safety Administration (NHTSA) has sent a special order to Tesla to come clean on a secret software mode, nicknamed ‘Elon Mode’ after Tesla’s maverick leader Elon Musk.
‘Full Self-Driving’ (FSD) is Tesla’s vision-based advanced driver-assist system. When a Tesla driver uses the FSD mode, and removes their hands from the wheel for an extended period of time, a symbol appears on the screen to ‘nag’ the driver into putting their hands back on the steering wheel.
Well, a hacker known as greentheonly has for some time been searching for hidden code in Tesla software, and they recently found a cheat mode (which they nicknamed ‘Elon Mode’) that turns the FSD ‘nag’ feature off. A demonstration of this was posted on YouTube and came to the NHTSA’s attention. The FSD software has already been a cause for concern: a leaked report recently indicated that FSD has attracted thousands of customer complaints about sudden braking and abrupt acceleration. In the conservative and sober world of software safety engineering, all of this therefore seems like utter, reckless madness.
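To make the safety engineers’ alarm concrete, here is a minimal sketch in Python of how a single hidden configuration flag can short-circuit a safety interlock such as the hands-on-wheel ‘nag’. This is purely illustrative and entirely my own invention, not Tesla’s code: the class, the `elon_mode` flag and the 30-second threshold are all hypothetical.

```python
from dataclasses import dataclass

HANDS_OFF_LIMIT_S = 30.0  # hypothetical threshold before the 'nag' fires


@dataclass
class DriverMonitor:
    hands_off_seconds: float = 0.0
    elon_mode: bool = False  # hidden flag: nothing in the UI reveals it exists

    def tick(self, dt: float, hands_on_wheel: bool) -> None:
        """Advance the monitor by dt seconds of driving time."""
        self.hands_off_seconds = 0.0 if hands_on_wheel else self.hands_off_seconds + dt

    def should_nag(self) -> bool:
        # The safety check is still present in the code,
        # but the hidden flag silently short-circuits it.
        if self.elon_mode:
            return False
        return self.hands_off_seconds > HANDS_OFF_LIMIT_S


monitor = DriverMonitor()
monitor.tick(dt=45.0, hands_on_wheel=False)
print(monitor.should_nag())   # True: the nag fires as designed

monitor.elon_mode = True      # flip the hidden flag, as a firmware hacker might
print(monitor.should_nag())   # False: the safety function is silently disabled
```

The point is that the check itself remains intact and would pass any test that never toggles the hidden flag – which is precisely what makes such modes so hard for users and regulators to detect.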
Going dark
The ‘Elon Mode’ story resonates because it confirms Elon Musk's image as a maverick. It's perhaps not surprising to most that he'd sneak some ‘Easter eggs’ into Tesla's code for his own libertarian convenience – perhaps to enjoy a carefree drive with his feet up on the dashboard.
This image conceals a crucial fact though: making cars safe enough for drivers to fully trust them to drive is the ultimate goal in automotive safety. Once achieved, it could potentially elevate road transport to the same level of safety as rail or aviation by eliminating the human error that inevitably arises when non-professional drivers take the wheel. The safety benefits could be immense.
News of ‘Elon Mode’ comes as a newly published biography of Musk also describes his ‘demon mode’. In the book, the renowned biographer Walter Isaacson describes Musk’s ‘maniacal sense of urgency’ and recounts how, many times, he saw the billionaire’s mood suddenly flip, frightening workers who failed to match his pace:
He'd go dark and I'd know that he was just going to rip that person apart.
This was reportedly a common occurrence when Musk first took over Twitter and gutted over half of the social-media site's staff. This approach is the complete opposite of the sort of safety culture that would be expected in an organisation responsible for safety-critical systems. But Isaacson, at least, has some sympathy for Musk’s toxic control freakery, seeing it as an inevitable part of the package:
as Shakespeare teaches us, all heroes have flaws…sometimes great innovators are risk-seeking man-children.
Whether or not you buy this will come down to your personal views. But the confounding fact for me is that this West Coast, disruptor mindset does seem to be steadily bearing fruit.
Vertical integration
Tesla is renowned for its frequent over-the-air software updates, which improve vehicle features and performance. This strategy lets Tesla continue to refine and personalise its vehicles even after sale, providing a competitive edge and building customer loyalty. However, this approach clashes with the mindset of traditional safety engineers, who naturally insist on rigorously tested and stable software.
Tesla has, though, found a different way to regain control over software safety, one that compensates for its rapid innovation cycle: vertical integration. Tesla manufactures a significant proportion of its components and systems in-house, including custom-designed hardware for autonomous driving, instead of relying heavily on external suppliers. This approach also grants Tesla more control over its system safety architecture and software safety requirements.
While other manufacturers and sectors - still coping with the remote supply chains favoured by ‘globalisation’ - struggle to comply with the mandatory verification and validation lifecycles in software safety standards, Tesla has organised itself for more fundamental control and less reliance on third parties.
A safety paradox
It is now over seven years since the fatal Autopilot collision involving a 2015 Tesla Model S 70D in Florida. The accident happened on May 7, 2016, when the driver, Joshua Brown, put the car in Autopilot mode, expecting it to drive itself on the highway. The Tesla’s Autopilot sensors failed to recognise a white tractor-trailer crossing the highway, and the car drove into it at full speed. Brown died from the injuries sustained in the crash. This was a landmark tragedy that will be analysed as a case study for many years to come.
Since then, major safety investigations have been undertaken into Teslas’ phantom braking and into drivers tricking the Autopilot system. Back in May, a whistleblower leaked data showing customer complaints about ‘self-acceleration’ issues. The NHTSA has also recently begun investigating incidents of Teslas striking emergency vehicles.
But there is a paradox here. In an engineered system with a rapid innovation cycle, errors can be designed out quickly and permanently. These recent incidents could be seen as an inevitable part of the learning needed to reach the holy grail of safe, self-driving vehicles.
There is still some way to go, but it appears that the flaws in the self-driving systems are being systematically eradicated. It is notable that modern Tesla models are rated five stars across the board, in all safety categories, by the NHTSA.
This all reminds me of a question that a friend once asked, many years ago, after we had both watched the magician Hans Moretti perform a spectacular trick on ‘The Paul Daniels Magic Show’:
How do you learn to catch a bullet between your teeth?
It made me laugh because I knew what I’d seen was a trick. But one thing’s for sure, if you were to try it for real you’d certainly have to be wildly reckless, if not insane.
I’m keen to build the network for Tech Safe Transport. If you know anyone who is interested in the safety of modern transport technology, and who likes a thought provoking read every few weeks, please do share a link with them.
Thanks for reading
All views here are my own. Please feel free to feed back any thoughts or comments and please do feel free to drop me an e-mail.
My particular area of professional interest is practical risk management and assurance of new technology. I’m always keen to engage on interesting projects or research in this area.