The story behind Stuxnet, the malware that targeted an Iranian uranium enrichment plant, has been known in general terms since last fall, when a team of researchers at Symantec released this document, which we covered at the time in our article here. But seeing is believing. I had a chance to attend a special briefing at Symantec’s headquarters in Mountain View, Calif., where Patrick Gardner, a director in their security group, actually showed us what was involved. It was a real thrill.
Stuxnet was a very sophisticated piece of software, some 10,000 lines of code that took man-years to develop. Symantec started seeing versions of the malware up to a year before the actual attack last June; they just had no idea what they were looking at until things started to happen at the nuclear facility. They eventually reverse-engineered the entire code with a team of three working full time for several months.
The software is extremely specific, targeting a particular programmable logic controller from Siemens that runs an array of some 9,000 centrifuges used in uranium enrichment. The malware package ultimately destroyed about 1,000 of them, causing a considerable amount of damage and setting back the Iranian nuclear program by a year or more.
The plant’s computer network had what is called an “air gap” between the PCs used to interact with the Siemens controllers and the plant’s ordinary business network, which had an Internet connection. This means the controller-connected PCs had no direct Internet access themselves, which is just good security practice. So how did these PCs get infected? As it turns out, it was human error!
The authors targeted five of the plant’s subcontractors, knowing that eventually a worker at one of them would carry a laptop into the plant and use a thumb drive to load some software onto the controller PCs. The virus used a previously unknown zero-day exploit in the way Windows Explorer renders shortcut-file icons, so that merely viewing the contents of an infected drive would compromise the PC.
Once this happened, Stuxnet got to work. Ironically, its first task was to sit dormant and simply record the controller traffic and responses for two weeks, without doing anything to disrupt the controllers or other plant operations. Once it had gathered this data, it began to infect the controllers. And because of the way the virus hooked into Step7, the Siemens software engineers use to program and debug the controllers, there was no way to use the ordinary Step7 tools to see what code the Stuxnet authors had added to foul things up: the infected controller program looked the same as the normal one. That was pretty clever programming.
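To make that record-and-replay trick concrete, here is a toy Python sketch of the idea. To be clear, this is my own illustration, not anything resembling the actual Stuxnet code: the simulated sensor, the “two weeks” compressed into 20 samples, and every number in it are invented.

```python
# Toy illustration of "record normal traffic first, replay it later" -- not Stuxnet code.
import random

NORMAL_RPM = 1000.0   # made-up nominal reading for a healthy process

def read_live_sensor(sabotaged: bool) -> float:
    """Simulated sensor: small jitter normally, wildly off-nominal once sabotage starts."""
    if sabotaged:
        return NORMAL_RPM * random.choice([0.4, 1.6])   # far outside the safe range
    return NORMAL_RPM + random.uniform(-5.0, 5.0)

# Phase 1: sit dormant and quietly record a stretch of ordinary readings.
recorded = [read_live_sensor(sabotaged=False) for _ in range(20)]

# Phase 2: sabotage the process, but show operators the old recording instead of live data.
for step in range(20):
    real_value = read_live_sensor(sabotaged=True)        # what the hardware actually does
    shown_to_operator = recorded[step % len(recorded)]   # what the control screen displays
    print(f"step {step:2d}  real={real_value:7.1f}  display={shown_to_operator:7.1f}")
```

The point is simply that the operators’ screens get stale but plausible data while the real process goes off the rails.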
At the Symantec briefing, they had brought along an actual Siemens controller to show first-hand what was likely to have happened. It was a small box about the size of a loaf of bread. Connected to it was an automatic air pump, like the kind that you might carry in the trunk of your car to inflate your spare tire. The first demo showed “normal” operations, where the pump would run for three seconds to inflate a balloon. (No, we didn’t get to see any nuclear materials in use; that would have been too much.) Then we ran Stuxnet, and it changed the controller to run the pump continuously, and of course the balloon exploded.
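To get a feel for how small a change like that can be, here is a minimal sketch of the demo logic in Python. It is entirely my own invention (the real demo ran on the Siemens controller, not a PC), and the safety_cutoff exists only so the toy loop terminates.

```python
# Toy sketch of the balloon demo: a pump routine with its run-time setpoint overridden.
def run_pump(run_seconds: float, tick: float = 1.0, safety_cutoff: float = 10.0) -> None:
    """Simulate pumping air until run_seconds elapses (or the toy cutoff is reached)."""
    elapsed = 0.0
    while elapsed < run_seconds and elapsed < safety_cutoff:
        elapsed += tick
        print(f"  pumping... {elapsed:.0f}s")
    if elapsed >= run_seconds:
        print("  pump off")
    else:
        print("  (cutoff hit -- a real pump would still be running, and the balloon pops)")

print("normal program:")
run_pump(run_seconds=3.0)           # inflate for three seconds, then stop

print("tampered program:")
run_pump(run_seconds=float("inf"))  # the altered logic never tells the pump to stop
```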
The Stuxnet authors (no one is saying definitively who they are, but you can imagine some very well-funded state-sponsored operation) had access to the exact layout of the Iranian facility where the controllers were located. What this means is that the plant’s plans were stolen or otherwise compromised, so the virus authors knew where the pumps, motors, and other equipment were located and how they were connected to each other. The software was designed for this specific layout and no other; Stuxnet wouldn’t harm another plant using the same Siemens equipment.
“There had to be some kind of data exfiltration, as well as the skills needed to do the programming,” said Gardner. And the programming skills were very sophisticated. The software had 15 different modules and five different built-in concealment mechanisms. There were also two rootkits: one for the Windows host PC and one that hid the malicious code on the Siemens controller itself from the Step7 programming software. The virus authors had also stolen two digital certificates, from companies that were physically adjacent to each other in a Taiwanese business park. Why two? Because the first one was discovered and revoked before the virus could be fully deployed. All in all, there were six zero-day exploits coded into the virus. To give you an idea of the scope, Symantec found a total of just 14 zero-day attacks in all of 2010.
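As an aside on those stolen certificates: code signing is what let the malicious drivers install without complaint, because Windows trusts anything validly signed by a recognized vendor. The sketch below is my own toy illustration of that basic asymmetry using the third-party Python cryptography package, not how Authenticode actually works in detail; every name and value in it is made up.

```python
# Toy illustration of why a stolen signing key matters -- not real Authenticode.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# A vendor's key pair; the public half is what systems end up trusting via its certificate.
vendor_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
vendor_pub = vendor_key.public_key()

driver = b"totally legitimate device driver bytes"
signature = vendor_key.sign(driver, padding.PKCS1v15(), hashes.SHA256())

# Verification succeeds no matter *who* ran sign() -- the vendor, or a thief holding the key.
vendor_pub.verify(signature, driver, padding.PKCS1v15(), hashes.SHA256())
print("signature verified: this driver looks like it came from the vendor")
```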
As the virus took action, it spun the centrifuges up and down beyond their normal operating frequencies, eventually damaging them. While this was happening, the two weeks of recorded controller traffic was replayed to the plant operators, so they wouldn’t suspect anything was wrong until the machines literally started breaking apart. The virus also disabled the kill switches built into the controllers, so they essentially couldn’t be shut down.
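One more invented sketch, to show what that combination looks like in logic terms: a drive commanded far outside its safe frequency band while the interlock that should trip has been bypassed. The safe band and the commanded frequencies here are placeholders, not the real plant’s numbers.

```python
# Toy sketch (made-up numbers): off-nominal frequency commands plus a bypassed interlock.
SAFE_BAND_HZ = (800.0, 1200.0)   # hypothetical allowed operating range for the drive

def interlock_trips(freq_hz: float, bypassed: bool) -> bool:
    """A healthy interlock trips (shuts the drive down) outside the safe band."""
    if bypassed:
        return False             # the sabotaged controller never trips
    return not (SAFE_BAND_HZ[0] <= freq_hz <= SAFE_BAND_HZ[1])

# Brief wild excursions high and low, then back to normal so nothing looks odd for long.
commanded = [1000, 1000, 1400, 1400, 2, 2, 1400, 1000]
for freq in commanded:
    status = "TRIPPED" if interlock_trips(freq, bypassed=True) else "running"
    print(f"drive commanded to {freq:6.0f} Hz -> {status}")
```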
What was going through my mind during this demo was: what next? The Stuxnet team wasn’t going to stop with this effort. The next one may be even more chilling.
NB: I have done some consulting work for Symantec over the past few years on a variety of security-related projects.