Stuxnet - a genuine threat or just media hype?

Posted in Cyber Security, Democracy | 02-Oct-10

Felix Juhl is Government Information Management Specialist at the Center for Security Studies at the ETH Zurich.

Opening any paper or magazine in recent days reveals the same headlines: a powerful computer worm has attacked industrial facilities around the world, mainly in Iran and Indonesia. Self-propagating worms and viruses are almost commonplace today, so what makes this worm so special?

It is special because it crossed the divide between the cyber and physical environments. A worm of this kind usually executes a denial-of-service attack, slowing down or stopping a control system from operating. In an oil refinery, that would mean operators' workstations showing displays that no longer reflect what is actually happening. The worst-case scenario is losing control of the plant, possibly causing environmental, health, or safety issues; refineries can blow up.

This malicious code, called Stuxnet, reportedly went after several "very specific, valuable targets". There is evidence that the worm was the first malicious code specifically created to take over the control systems of industrial plants. In contrast to mass media reports, however, there is no real evidence that it was developed to target nuclear plants in Iran.

Intriguingly, designing such malicious code requires an expert team of highly trained computer specialists. A number of governments with sophisticated computer resources have the ability to create such code, among them China, Russia, Israel, Britain, Germany and the United States. Cyber war experts have no clue who developed it, or why.

Experts in Germany discovered the worm, and German officials transmitted the malware to the U.S. across a secure network so they could analyze the code. The two servers controlling the malware were in Malaysia and Denmark; both were shut down soon after they were discovered by computer security experts earlier this summer.

The worm was able to burrow into operating systems running software designed by Siemens AG, by exploiting a vulnerability in several versions of Microsoft Windows. Unlike a virus, which attaches itself to other programs and needs a host file to spread, a worm propagates on its own; this one was designed to take systems over, such as those that open doors or turn physical processes on or off.

The real concern about Stuxnet is not the attack vector, but its ability to spread by targeting files that administrators use to configure Siemens software. If present on a targeted Windows PC, these so-called Step7 files are immediately infected. Opening the files later sets off a new wave of infection.

Stuxnet's ability to infect project files and run when they are opened is a new propagation vector.

This technique can be especially effective in environments where Step7 files are located on a central computer, then copied to and executed on other machines. If Stuxnet infects that central server, it can then infect local machines downstream. This technique potentially allows Stuxnet to re-infect machines even after they have been purged of the malware. Infected projects restored from backups may reintroduce the infection to clean machines, so administrators must exercise caution when restoring files in this manner.
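The re-infection cycle described above can be sketched as a toy simulation. This is purely illustrative Python, not actual Stuxnet code: the machine and file names are invented, and the model simply captures the idea that opening a tainted project file infects a machine, that an infected machine taints the files it touches, and that restoring an old backup can undo a clean-up.

```python
# Toy model of project-file-based propagation (illustrative only; the
# names and mechanics are hypothetical, not drawn from real malware).

class Machine:
    def __init__(self, name):
        self.name = name
        self.infected = False
        self.project_files = {}          # filename -> tainted? (bool)

    def receive(self, filename, tainted):
        self.project_files[filename] = tainted

    def open_file(self, filename):
        # Opening a tainted project file sets off a new wave of infection.
        if self.project_files.get(filename):
            self.infected = True
        # An infected machine taints every project file it touches.
        if self.infected:
            self.project_files[filename] = True

    def clean(self):
        # Removing the malware does not disinfect the stored project files.
        self.infected = False


server = Machine("central-server")
station = Machine("engineering-station")

server.infected = True                    # central server is compromised
server.open_file("plant.s7p")             # its project file becomes tainted
station.receive("plant.s7p", server.project_files["plant.s7p"])

backup = dict(station.project_files)      # backup taken while file is tainted

station.open_file("plant.s7p")            # downstream machine is infected
station.clean()                           # malware removed from the machine...
station.project_files = dict(backup)      # ...but an old backup is restored
station.open_file("plant.s7p")            # re-infection from the backup
```

The last three lines are the point: cleaning the machine is not enough, because the infection survives in the restored project file.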

When control and safety systems moved away from being hardwired and relay-based to computerized systems, vendors and asset owners were more interested in functionality than security. Typically, the systems were standalone with a dedicated Safety Instrumented System, especially in high-risk environments like refineries and offshore oil installations. Advances in computer technology during the 1980s and 1990s caused a rapid shift from these proprietary systems to Intel/Microsoft-based systems, driven primarily by end users seeking to reduce costs and to standardize on the same infrastructure as the rest of IT. Microsoft released sporadic patches and updates to the base operating system, and security was rarely considered.

To compound these risks, when these control systems were analyzed, it was quickly discovered that the individual components would be at home in a standard IT environment: file servers, SQL (database) servers, and web servers. Yet these were often unpatched, lacking anti-virus software, and configured with weak or even no passwords; security risks that would not be tolerated on business networks. The risks associated with these systems were rising fast.
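The kinds of shortcomings listed above lend themselves to a simple inventory audit. The sketch below is hypothetical Python: the host records, field names, and weak-password list are invented for illustration, but it shows how the three risks the paragraph names could be flagged mechanically.

```python
# Hypothetical audit of a control-system host inventory, flagging risks
# that would not be tolerated on a business network. All data is invented.

hosts = [
    {"name": "hist-srv",  "patched": False, "antivirus": False, "password": ""},
    {"name": "scada-db",  "patched": True,  "antivirus": False, "password": "admin"},
    {"name": "office-pc", "patched": True,  "antivirus": True,  "password": "S7!rong-pass"},
]

WEAK_PASSWORDS = {"", "admin", "password", "1234"}

def audit(host):
    """Return a list of findings for one host record."""
    findings = []
    if not host["patched"]:
        findings.append("missing OS patches")
    if not host["antivirus"]:
        findings.append("no anti-virus")
    if host["password"] in WEAK_PASSWORDS:
        findings.append("weak or empty password")
    return findings

for host in hosts:
    issues = audit(host)
    if issues:
        print(f"{host['name']}: {', '.join(issues)}")
```

In this invented inventory, only the business PC passes; the two control-system hosts each carry findings.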

The last 10 years have seen a continued rapid rise in the complexity of these control systems, with asset owners demanding more functionality and vendors competing to add new features to differentiate their systems from the competition. These systems are typically marketed and sold to control engineers, so advances are focused on the parts of the system that will appeal to them: better HMIs, faster control loops, wireless I/O, and so on. These engineers often have little experience with IT infrastructure or cyber security, yet are unwilling to enlist IT security experts, who in turn have scant knowledge of process systems. A few evangelists have emerged, from the vendor community as well as from among users, but security often takes second place to system operations.

As stated, modern control systems contain many standard IT components. Complexity starts to grow when the location of these devices is optimised. Should they be installed on the process control network, or are they better suited to the business network? It is likely that both scenarios will be deployed; sometimes by choice, but often by accident because no one has looked at the architecture as a whole. Often, process control engineers at a plant will approve system architecture designs put forward by third parties (consultants or Main Automation Contractors) without the in-depth knowledge of current IT practices needed to judge whether those designs are suitable. The systems are often implemented by third parties as well, which limits the actual users' understanding of them and leads to poor security choices.

Microsoft releases patches for its operating systems every month; each may directly affect the vulnerability of one or more of your control systems and hence increase risk. Asset owners have to decide whether the increased risk is worth accepting, eliminating (by installing the patch), or mitigating by other means. The difficulties of testing and installing an operating system patch on a system located on a drilling platform miles from shore are huge. This is exacerbated because these control systems may perform critical functions, so their availability is essential. The CIA triad (confidentiality - integrity - availability) is used widely within IT security circles, but its order of importance is often reversed in control systems, as availability usually matters far more than protecting the information itself.
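The reversed priority order can be made concrete with a toy scoring example. The weights and impact figures below are entirely hypothetical; the point is only that the same event (a patch that requires a reboot) scores very differently depending on whether confidentiality or availability sits at the top of the triad.

```python
# Illustrative sketch of the reversed CIA priorities: IT security weighs
# confidentiality highest, control systems weigh availability highest.
# All weights and impact values are invented, purely to show the inversion.

IT_WEIGHTS  = {"confidentiality": 3, "integrity": 2, "availability": 1}
ICS_WEIGHTS = {"availability": 3, "integrity": 2, "confidentiality": 1}

def risk_score(impact, weights):
    """Weighted sum of an event's impact (0-10) on each CIA property."""
    return sum(weights[prop] * impact.get(prop, 0) for prop in weights)

# A patch that forces a reboot: negligible confidentiality impact, slight
# integrity impact, but the system is down while it installs.
reboot_patch = {"confidentiality": 0, "integrity": 1, "availability": 8}

print("IT view: ", risk_score(reboot_patch, IT_WEIGHTS))    # 10
print("ICS view:", risk_score(reboot_patch, ICS_WEIGHTS))   # 26
```

Under the control-system weighting the same patch looks far riskier, which is why asset owners may rationally defer or mitigate rather than install.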

The recent Stuxnet incident has obviously raised the profile of these problems. The vulnerability of control systems has been explained at security conferences for many years, as a Google search will quickly show. However, those presentations were generally vague, alluding to unpatched systems or poor or non-existent passwords. Stuxnet was specifically targeted at Siemens equipment, and there are likely to be more attacks of this kind in the future. The growth of news coverage around hacking critical infrastructure has obviously made these targets more attractive as well.

Potential consequences must be very carefully considered. The disaster aboard the Deepwater Horizon in April 2010 showed how easy it is to underestimate the consequences of any incident. Apart from the human and environmental cost, the industry as a whole will likely be impacted by increased regulation.

The question is clear: what happens next? My personal opinion is that Stuxnet will continue to mutate and infect systems as a demonstration of power. In the meantime, will we see waves of retaliatory strikes from Iran or its supporters? I wouldn't rule out the Chinese or Russians here. This may have been a wake-up call to let the Iranians know what's coming. There is also another objective: cyber reconnaissance. For example, launch a cyber attack on Iran that will likely be blamed on the U.S. and Israel, and then sit back and see what happens next. It would not cost much to fund a piece of code like this.

The latest reports suggest the malware has infected as many as 49,000 computer systems around the world. Siemens AG, the company that designed the system targeted by the worm, said it has infected at least 15 of the industrial control plants it was apparently intended to infiltrate. It's not clear what sites were infected, but they could include water filtration, oil delivery, electrical and nuclear plants. According to Siemens statements, none of those infections has adversely affected the industrial systems.