
Back in the Driver's Seat: Recovering Critical Data from Tesla Autopilot Using Voltage Glitching

If you suspend your transcription on amara.org, please add a timestamp below to indicate how far you progressed! This will help others to resume your work!

Please do not press “publish” on amara.org to save your progress, use “save draft” instead. Only press “publish” when you're done with quality control.

Video duration
00:41:19
Language
English
Abstract
Tesla's driving assistant has been the subject of public scrutiny, both positive and negative: while accidents involving its "full self-driving" (FSD) technology keep making headlines, the code and data behind the onboard Autopilot system remain well protected by the car manufacturer. In this talk, we demonstrate our voltage-glitching attack on Tesla Autopilot, which enables us to obtain root privileges on the system.

Apart from building electric vehicles, Tesla has gained a reputation for its integrated computer platform, comprising a feature-rich infotainment system, remote services through Tesla's cloud and mobile app, and, most notably, an automated driving assistant. Powered by a dedicated arm64-based system called Autopilot, Tesla offers different levels of "self-driving". The "full self-driving" (FSD) capability is offered to specific customers via in-car purchases and has been the subject of public discourse.

Despite the use of multiple cameras and Autopilot's machine learning (ML) models, accidents persist and shape reporting on FSD. While the platform security of Autopilot's hardware protects the code and ML models from competitors, it also hinders third parties from accessing critical user data, e.g., onboard camera recordings and other sensor data, that could facilitate crash investigations.

This presentation shows how we rooted Tesla Autopilot using voltage glitching. The attack enables us to extract arbitrary code and user data from the system. Among other cryptographic keys, we extract a hardware-unique key used to authenticate Autopilot towards Tesla's "mothership". Overall, our talk will shed light on Autopilot's security architecture and its gaps.
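The abstract does not detail the attack itself, but voltage glitching generally works by briefly dropping the target's core voltage at a precise moment so that an instruction (e.g., a boot-time security check) misbehaves, and attackers typically sweep glitch timing and width to find a working combination. The following is a purely illustrative, simplified sketch of such a parameter sweep against a simulated target — the success window, function names, and parameters are invented for illustration and are not the speakers' actual setup:

```python
# Illustrative sketch only: a toy parameter sweep for a fault-injection
# (voltage-glitching) campaign. A real attack drives external glitching
# hardware; here a simulated "target" stands in for the device under test.

def simulated_target(offset_ns: int, width_ns: int) -> bool:
    """Stand-in for the device under test. Returns True if the glitch
    'succeeds' (e.g., a boot-time check is skipped). The success window
    used here is entirely made up for demonstration purposes."""
    return 100 <= offset_ns <= 110 and 20 <= width_ns <= 30

def sweep(offsets, widths):
    """Try every (offset, width) combination and collect the hits.
    Real campaigns often randomize or adaptively narrow this search."""
    hits = []
    for off in offsets:
        for w in widths:
            if simulated_target(off, w):
                hits.append((off, w))
    return hits

if __name__ == "__main__":
    # Sweep glitch offsets 0..195 ns and widths 10..45 ns in 5 ns steps.
    hits = sweep(range(0, 200, 5), range(10, 50, 5))
    print(f"found {len(hits)} working parameter pair(s), e.g. {hits[0]}")
```

In practice the search space is far larger, success is probabilistic rather than deterministic, and each attempt requires resetting the target, which is why glitching campaigns can take many hours.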

Before delving into Autopilot, we successfully executed a Tesla jailbreak of the AMD-based infotainment platform and presented our attack at Black Hat USA 2023. That work enabled custom modifications to the root file system and temporarily allowed the activation of paid car features.

Talk ID
12144
Event:
37c3
Day
1
Room
Saal Granville
Start
1:50 p.m.
Duration
00:40:00
Track
Security
Type
lecture
Speaker
Christian Werling
Niclas Kühnapfel
Hans Niklas Jacob - hnj
Talk Slug & media link
37c3-12144-back_in_the_driver_s_seat_recovering_critical_data_from_tesla_autopilot_using_voltage_glitching
English
Checking done: 0.0%
Syncing done: 0.0%
Transcribing done: 0.0%
Nothing done yet: 100.0%

Work on this video on Amara!

English: Transcribed until

Last revision: 1 month, 2 weeks ago