My name is Urban Jonson, and I’m the Chief Technology Officer and Program Manager, Heavy Vehicle Cyber Security Program, with the National Motor Freight Traffic Association, Inc. (NMFTA).
I’m honored that IOActive has afforded me this guest blogging opportunity to connect with you. The research at IOActive is always innovative and they have done some really good work in transportation, including aviation, truck electronic logging devices, and even satellites. Being among such technical experts really raises the stakes of the conversation. Luckily, I can lean on some of my awesome collaborators like Ben Gardiner at NMFTA, as well as countless industry experts who I’m privileged to call friends.
I feel a profound sense of loss of technical progress in our field this year. All of my favorite technical events, where I can connect with people to discuss and share ideas, have been canceled (heavy sigh). The cancellation of the NMFTA HVCS meetings has been the hardest for me, as they pull together an incredible community in the motor freight industry. Many of the attendees are now my friends, and I miss them.
The cancellation of my other favorite industry events, Black Hat/DEF CON, CyberTruck Challenge, and ESCAR, has been hard as well. While I do enjoy many of the presentations at these conferences, the biggest benefit for me is meeting one-on-one with some of the brightest minds in the industry. Where else do I get to sit down with Craig Smith and casually discuss the state of the automotive industry? I remember having great conversations with Colin O’Flynn about wily new ideas on power fault injection at many different events. These one-on-one opportunities for conversation, collaboration, and information sharing are invaluable to me.
This year, I had wanted to talk with some of my friends about the Triton malware and vehicle safety systems such as lane departure assist, crash avoidance, and adaptive cruise control. Alas, no such luck. So, I’m going to dump this discussion out in the open here.
The Triton Malware
First, for the uninitiated, a quick review of the Triton malware. The Triton intrusion was a sophisticated attack that targeted a petrochemical plant in the Middle East in 2017. How the attackers first got into the network remains something of a mystery, but it was most likely the result of a misconfigured firewall or a spearphishing attack. The first component was a Windows-based remote access tool that gave the attackers control of an engineering workstation. What came next was very interesting: according to reports[1], a highly specific secondary attack was mounted from the compromised engineering workstation against a specific Schneider Electric Triconex[2] safety controller and select firmware versions (10.0 – 10.4) using a zero-day vulnerability. The safety controllers in question are designed to take direct action, without user intervention, to initiate shutdown operations for the plant in the case of serious safety issues.
Stop and think about that for a second—someone had taken the time to figure out which specific controller and firmware versions were running at the plant, obtain similar hardware to research, find a zero-day vulnerability, then research and compromise the plant’s IT infrastructure, just to install this malware. That is not an insignificant effort, and not as easy as they make it out to be in the hacker movies.
An unplanned “accidental” shutdown of the plant revealed the presence of the malware and the intrusion. It was theorized that the attackers wanted to obtain the capability but not use it, and that the shutdown was an accidental reveal[3]. Cyber-physical attacks are usually broken into separate cyber and physics packages. Given the effort put into the attack, it is extremely unlikely the attackers would have intended such a dumb physics package. If you want an example of a well-thought-out cyber-physical attack, read up on Operation Olympic Games, which targeted Iranian uranium centrifuges[4]. This goes to show that if you play around with bombs, physical or digital, they can go off unintentionally.
Another interesting tell occurred as the response team was trying to secure and clean up the intrusion, when the attackers fought back to try to maintain a foothold. Actively engaging blue-team efforts in real-time is risky, as it can quickly lead to full attribution and unwanted consequences. This tells us that the attackers considered this capability a high priority; they had made a large investment in resources to be able to compromise the safety controllers and they were determined to keep it. A great deal of information about this intrusion is still murky and closely guarded, but it is generally considered to have potentially been one of the deadliest malware attacks so far, had the capability been leveraged.
The safety controller concept of a contained device taking decisive action without user intervention sounds eerily familiar. The concept is virtually everywhere in the new safety technologies in modern cars and trucks in the form of crash avoidance, lane departure assist, and other features for which we have all seen the ads and literature. FMCSA is even studying how to retrofit existing trucks with some of these promising new safety technologies, which can help reduce accidents and save lives.
These automotive safety systems rely on sensors such as cameras and LIDAR to get the input they need to make decisions affecting steering, braking, and other actions. This brings up some interesting questions. How secure are these components? How diverse is the marketplace; that is, do we have risk aggregation through the deployment of just a few models/versions of sensors? Is there a specific sensor model that is ubiquitous in the industry? Do we have our own version of a Triconex safety controller that we need to worry about?
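The “risk aggregation” question can be made concrete. As a hedged illustration (all of the market shares below are invented), a standard concentration measure such as the Herfindahl-Hirschman Index (HHI) shows how quickly exposure piles up when one sensor model dominates a fleet:

```python
# Illustrative sketch only: quantifying sensor-market concentration with
# the Herfindahl-Hirschman Index (HHI). Shares are invented for the example.

def hhi(shares):
    """HHI over market shares expressed as fractions summing to 1 (max = 1.0)."""
    return sum(s * s for s in shares)

diverse = [0.2] * 5        # five vendors with equal share
concentrated = [0.8, 0.2]  # one dominant sensor model

print(round(hhi(diverse), 2), round(hhi(concentrated), 2))  # prints "0.2 0.68"
```

A Triconex-style scenario is the concentrated case: a single ubiquitous controller or sensor model means one zero-day covers most of the deployed base.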
The short answer seems to be yes. I read an interesting paper on Fault Detection, Isolation, Identification and Recovery (FDIIR) for automotive perception sensors by Goelles, Schlager, and Muckenhuber from Virtual Vehicle Research[5]. (Note: the paper is worth reading in its own right, and it discusses a sensor fault classification system that can be applied to other domains, such as aviation and maritime.) The paper concludes that, for the most part, systems such as LIDAR are treated as black boxes, with little or no knowledge of the internal firmware or interfaces. This is mostly because a small number of companies in fierce competition are working hard to protect their intellectual property. In my opinion, that is not a good sign. If we need multiple sensors to cooperatively decide on safety-critical actions, transparency is going to be crucial to designing a trusted system. The present lack of transparency in these systems almost certainly implies a lack of security assurance for their interfaces. This sort of inscrutable interface (aka attack surface) is a hacker’s delight.
All of this is not really new; our own Ben Gardiner discussed similar points in 2017[6]. So, what other truck-specific safety system black boxes can we examine through the filter of the Triton attack that may not be common knowledge to you? Enter RP 1218.
RP 1218 – Remote Disablement of Commercial Vehicles
First, a little background: the American Trucking Associations’ (ATA) Technology & Maintenance Council (TMC) develops recommended practices (RPs) for the trucking industry. The council comprises representatives from motor carriers, OEMs, and Tier 1 suppliers to the trucking industry. They generally do great work, mostly focused on physical truck maintenance issues, but they also develop recommended practices in other areas, such as the telematics-tractor connector (RP 1226). Strictly speaking, these are only recommendations for the industry, but many of them end up as de facto standards, especially in vehicle electronics.
The TMC has recently decided to take up RP 1218 and develop an updated version, which is how it came to our attention. Now, why has this RP drawn our attention and ire at the NMFTA HVCS? The title of RP 1218 is “Guidelines for Remote Disablement of Commercial Vehicles.” It is a recommended practice for implementing a remote shutdown and/or limp mode for a heavy truck. The current version is rather old, dating from 2005, and the problem is that cybersecurity was not at the forefront of the trucking industry’s thinking at that time.
The core security premise of RP 1218 was based on “secret” CAN message instructions sent to the engine controller. Uh-oh. CAN doesn’t include encryption, so there’s no such thing as a secret CAN message. Well, not for very long, anyway. Even the existence of the RP was enough to give us the jitters.
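To see why obscurity fails here, consider a hypothetical sketch (every arbitration ID and data byte below is invented). CAN frames are broadcast in cleartext, so anyone who can capture bus traffic before and during a disablement event can isolate the “secret” trigger frame with a simple set difference:

```python
# Hypothetical illustration: recovering a "secret" CAN frame by diffing
# two bus captures. All IDs and payloads are made up for this sketch.

def isolate_new_frames(baseline, during_event):
    """Return the (arbitration_id, data) pairs seen only during the event."""
    return sorted(set(during_event) - set(baseline))

# Invented captures: (29-bit arbitration ID, 8 data bytes)
baseline = {
    (0x0CF00400, b"\x10\x7D\x82\x00\x00\x00\x00\x00"),  # ordinary traffic
    (0x18FEF100, b"\x00\x00\x00\x00\x00\x00\x00\x00"),  # ordinary traffic
}
during_event = baseline | {
    (0x18EF0021, b"\xDE\xAD\xBE\xEF\x00\x00\x00\x00"),  # the "secret" trigger
}

print(isolate_new_frames(baseline, during_event))  # only the trigger remains
```

Once isolated, the frame can simply be replayed; without cryptographic authentication there is nothing to stop an attacker from sending it.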
We immediately set out to determine if anyone had implemented RP 1218 and did a basic survey of remote disablement technology with the assistance of our friends at CanBusHack. The good news was that we could not find anyone who had implemented RP 1218 as specified. The bad news was that we found plenty of other ways to do it, including messing around with diesel exhaust fluid (DEF) messages and derate limits, among others. I’m not going to dig into those details here.
We also discovered a robust global market for both Remote Vehicle Shutdown (RVS) and Remote Vehicle Disablement (RVD). Luckily for me, most of that market is outside of North America, my primary area of concern. The methods by which the various vendors achieved RVS/RVD varied significantly, but were not as simple as sending a message to the engine per RP 1218. That’s good, but the problem is that companies are building a full remote stop button into their entire fleets. The sensitivity of RVS/RVD seems to be well understood, and due to this concern, there’s not a great deal of transparency into these systems; we found it difficult to get even basic information. Another inscrutable black box.
While you can certainly make the case that the ability to disable a vehicle is necessary from a national security perspective, to prevent truck hijackings and terrorists turning trucks into battering rams, such a system would need to be absolutely bulletproof. While there are some ideas on how to mitigate such risks using approaches like Consequence-driven, Cyber-informed Engineering (CCE)[7], that’s a very hard thing to accomplish when it involves black-box technology with unknown interfaces. It’s worth repeating: black boxes with unknown interfaces are huge flashing targets for threat actors.
If we look at this through the lens of the Triton intrusion, how much effort do you think someone would go through to obtain the ability to affect motor transportation at scale? Do you think they would conduct the same level of research on infrastructure and components, and attempt to compromise these systems so that they can be hit at the most critical time? I certainly do. This whole set of problems is pressing, and I really needed to get some perspective and ideas.
This brings us back to meeting up in person with industry experts who have deep expertise in industrial control systems (ICS), automotive, and many other areas. As I look at this massive problem, I don’t get to ask important questions of my friends, many of whom I see only once a year at these events. Are there any lessons from the Triton ICS attack that we can leverage in designing active safety systems for vehicles? Can we develop an attack tree for someone attempting a sophisticated nation-state attack against vehicle safety control systems or remote disablement vendors? How do we best defend against someone who would like to own our infrastructure and unleash disruption on our transportation sector? How do we improve our designs, resiliency, processes, and general security posture against this type of threat?
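One of those questions, the attack tree, can at least be sketched in code. The node names and cost figures below are entirely invented; a real tree would be developed with ICS and automotive experts. AND nodes sum the cost of their children, OR nodes take the cheapest branch, so the root value approximates the least effort a determined attacker needs:

```python
# Toy AND/OR attack tree. Structure and cost figures are invented for
# illustration; they are not an assessment of any real system.

def min_cost(node):
    """Cheapest attacker path: AND sums its children, OR takes the minimum."""
    if "children" not in node:
        return node["cost"]
    costs = [min_cost(c) for c in node["children"]]
    return sum(costs) if node["type"] == "AND" else min(costs)

tree = {
    "name": "disable trucks at scale", "type": "AND",
    "children": [
        {"name": "gain foothold", "type": "OR", "children": [
            {"name": "spearphish fleet back office", "cost": 3},
            {"name": "exploit exposed telematics endpoint", "cost": 5},
        ]},
        {"name": "trigger shutdown", "type": "OR", "children": [
            {"name": "abuse vendor RVS/RVD channel", "cost": 4},
            {"name": "craft ECU-specific payload (zero-day)", "cost": 9},
        ]},
    ],
}

print(min_cost(tree))  # prints 7: cheapest branch at each OR, summed at the AND
```

Even a toy like this makes the Triton parallel visible: the cheapest path runs through the ready-made shutdown channel, not the bespoke zero-day.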
Multi-mode Transportation Sharing and Support
Unfortunately, in today’s world, I can’t ask my friends in a quiet corner over a drink and strategize on a way to mitigate this risk. I’m sure that I’m not the only one feeling bereft of such opportunities.
By the way, informal collaborations at security events are exactly how the NMFTA Heavy Vehicle Cyber Security program came to be in the first place. After a Miller and Valasek presentation at Black Hat 2014, I sat down with a bright guy from Cylance at the Minus5 Ice Bar at the Mandalay Bay and we “doodled” the attack vectors on a truck. After taking a look at the finished napkin, we were both horrified. When I returned to Alexandria, Virginia, I started doing the research that eventually became our first white paper on heavy vehicle cybersecurity, which we “published” in September 2015[8]. Okay, honestly, we sat on it for a couple of years before we made it public.
So how do I move past this obstacle to fun, as well as to progress and work on keeping trucks secure and moving? In my case, I’ve been endeavoring to create a multi-mode transportation sharing and support group. This is an informal monthly gathering of a few select folks from various transportation modes, sharing resources and connections and generally supporting each other’s missions. Additionally, I’m trying (and mostly still failing) to reach out to those wonderful, smart, and talented friends to connect with them and see how they are doing personally, and to share whatever resources, references, articles, papers, connections, or technology I can provide to help them be successful in their missions. I ask whether they might be able to give me some advice or a novel take on my problem, and discuss possible solutions and mitigations. Like most tech geeks, I like technology because it’s easier to understand and deal with than most people. However, the lack of camaraderie this year is a little much, even for me.
So let’s help everyone connect. Think of those people you see so infrequently at these now-canceled conferences, and call them to check in. Don’t text or email; just pick up the phone and give them a call. I am sure they’d appreciate hearing from you, and you’ll probably find that you have some interesting technical topics to discuss. Maybe you could invite the “usual gang/CTF team” to a Zoom happy hour.
Don’t stop connecting just because you’re stuck in the basement like me. You never know, maybe you’ll solve an interesting problem, find a new, really evil way to hack something cool, help someone find a resource, or just maybe make the world a slightly safer place. Most importantly, if you discover the solution to the problems I’ve discussed here, please let me know.
1. Blake Sobczak, “The Inside Story of the World’s Most Dangerous Malware,” E&E News, March 2019.
2. NSA/CISA Joint Report Warns on Attacks on Critical Industrial Systems, July 27, 2020.
3. Tara Seals, “Triton ICS Malware Hits a Second Victim,” SAS 2019, published on Threatpost.com, April 2019.
4. Pierluigi Paganini, “‘Olympic Games’ and Boomerang Effect: It Isn’t Sport, It’s Cyber War,” June 2012.
5. Goelles, T.; Schlager, B.; Muckenhuber, S., “Fault Detection, Isolation, Identification and Recovery (FDIIR) Methods for Automotive Perception Sensors Including a Detailed Literature Survey for Lidar,” Sensors, 2020, Vol. 20, Issue 13.
6. Ben Gardiner, “Automotive Sensors: Emerging Trends with Security Vulnerabilities and Solutions,” MEMS Journal, February 2017.
7. For more information on CCE, please see the INL website.
8. National Motor Freight Traffic Association, Inc., “A Survey of Heavy Vehicle Cyber Security,” September 2015.