KU Leuven’s security research group COSIC has a strong track record in studying Tesla security, having demonstrated attacks on the Model S in 2018 and 2019. This time, they broke the Model X using some new techniques, chaining two design flaws that both exploit a lack of authentication. The first flaw is the lack of authentication for key fob software updates: to obtain a car unlock code from a key fob, they first patch the fob with an unauthenticated firmware update. The acquired unlock code gives access to the car but is not sufficient to drive it. For that, they exploit a second vulnerability: the apparent lack of authentication when pairing a new key fob to the car. After performing this pairing through an unauthenticated maintenance procedure with a spare key fob, they succeeded in driving the car.
One could argue that this is yet another attack on a car lock, one that merely underlines that the automotive industry is still not up to date with modern-day security. But I believe something more interesting is at play here: the adversarial use of a trusted hardware security feature (a secure enclave) in a broken platform.
In this attack, generating a valid unlock code remained possible after the unauthorized firmware update, because the overwritten firmware left the secure enclave in the chip untouched. The secure enclave still holds its keys and cryptographic implementations and can therefore generate valid security codes. However, the firmware update changed the crypto protocol, allowing the fob to send the unlock code without meeting the verification conditions; we can assume the original firmware would verify the car’s authenticity before sending the unlock code. The researchers have shown that separating the security logic from the crypto implementation is risky, and that the entire crypto protocol should have been implemented in the secure enclave. This is an important lesson that may also apply in other settings, such as mobile phones that keep their crypto implementation in a Trusted Execution Environment. Attackers may be inspired by this attack and apply the same concepts elsewhere.
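The risk can be sketched in a few lines. The model below is purely illustrative (the class names, the HMAC-based challenge-response, and the certificate check are my assumptions, not Tesla’s actual protocol): the enclave only holds the key and computes responses, while the decision of *when* to respond lives in replaceable firmware. Replacing that firmware removes the check without ever touching the enclave.

```python
import hmac
import hashlib

class SecureEnclave:
    """Holds the secret key and computes responses. Cannot be reflashed."""
    def __init__(self, key: bytes):
        self._key = key  # never leaves the enclave

    def compute_unlock_code(self, challenge: bytes) -> bytes:
        # Raw crypto primitive: answer any challenge it is handed.
        return hmac.new(self._key, challenge, hashlib.sha256).digest()

class OriginalFirmware:
    """Intended design: verify the car's authenticity before using the enclave."""
    def __init__(self, enclave: SecureEnclave, car_cert: bytes):
        self.enclave = enclave
        self.car_cert = car_cert

    def handle_request(self, challenge: bytes, presented_cert: bytes):
        if presented_cert != self.car_cert:  # security logic lives here,
            return None                      # outside the enclave
        return self.enclave.compute_unlock_code(challenge)

class PatchedFirmware:
    """Unauthenticated update: the check is gone, but the enclave still 'works'."""
    def __init__(self, enclave: SecureEnclave):
        self.enclave = enclave

    def handle_request(self, challenge: bytes, presented_cert: bytes):
        # No verification: any challenger receives a valid unlock code.
        return self.enclave.compute_unlock_code(challenge)
```

An attacker who flashes `PatchedFirmware` obtains codes that are cryptographically valid, because the enclave’s key and implementation were never compromised; only the gatekeeping around them was. Had the verification step lived inside the enclave, swapping the firmware would not have helped.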
For this specific case, a solution is underway, as Tesla is updating its system. In more general terms, developers should reconsider their system’s security whenever it partially relies on hardware security. The key question is: could different software in the ‘normal world’ degrade the strength of the ‘secure world’? In other words: could a compromised application or OS abuse the security features of a TEE (or another secure enclave) to break the solution? A well-designed and well-implemented solution would resist such attacks, but it is an important question to ask and evaluate.
If you have any questions, contact us at firstname.lastname@example.org.
Check out other posts in the Riscure Security Highlights series.