Bad news for users of Siemens systems that run the WinCC SCADA software: Siemens shipped WinCC with a deeply stupid vulnerability baked in, knew about the problem for two or more years, and made a command decision to do… nothing. That's right: Siemens built a serious vulnerability into their systems and then ignored it for over two years, even after being informed of it. That's Siemens, the company whose tag line is "Global Network of Innovation". With "innovation" like that, we're all screwed.
Say "Hello" to the Stuxnet worm. The Stuxnet worm uses a zero-day flaw in the Windows shell to spread. It also comes with its own drivers (!!) and separate binaries signed with digital certificates belonging to two legitimate technology vendors. That's impressive. But one of the most interesting things about the Stuxnet worm is that it looks like it's been designed to exploit a specific weakness in a particular SCADA control software package.
(SCADA stands for "supervisory control and data acquisition", and typically refers to an industrial control system or a computer system that monitors and/or controls a process.)
Once the Stuxnet worm gets onto a system running WinCC, it establishes a connection to a remote server and then tries to steal and export sensitive data.
Now, this is actually a two-part comedy of errors. The exploit is made possible in part by a previously unknown vulnerability in the way Microsoft Windows handles '.lnk' (shortcut) files. But the other half of the equation is that the worm searches specifically for systems running Siemens WinCC, which has a critical vulnerability: it uses a hard-coded password. This is one of the most elementary mistakes possible in software design, so elementary that it's on the CWE/SANS Top 25 Most Dangerous Software Errors list.
SANS says, in part: "Hard-coding a secret password or cryptographic key into your program is bad manners, even though it makes it extremely convenient – for skilled reverse engineers. While it might shrink your testing and support budgets, it can reduce the security of your customers to dust. If the password is the same across all your software, then every customer becomes vulnerable if (rather, when) your password becomes known."
In other words, "it's stupid, and don't do it".
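To make the mistake concrete, here's a minimal sketch of the anti-pattern and one common alternative. The names and values are entirely hypothetical (this is not WinCC's actual code or its actual credentials); it just illustrates why a password compiled into every copy of a product is a single shared point of failure, while per-installation credentials are not.

```python
import os

# Anti-pattern (hypothetical names and values): credentials baked into
# the program itself. Every installation shares them, so anyone who
# reverse-engineers ONE copy can log in to ALL of them -- and the vendor
# can't change the password without shipping new software.
HARDCODED_USER = "scada_svc"
HARDCODED_PASS = "s3cret"

def connect_hardcoded():
    """Returns the same credentials on every deployment. Don't do this."""
    return (HARDCODED_USER, HARDCODED_PASS)

# Safer pattern: credentials are supplied per-installation at deploy
# time (environment variables here; a protected config store or secrets
# manager in practice), so one leaked password compromises one site.
def connect_configured():
    user = os.environ["SCADA_DB_USER"]
    password = os.environ["SCADA_DB_PASS"]
    return (user, password)
```

The point isn't the specific mechanism (env vars, config files, whatever); it's that the secret must be changeable per customer, which a hard-coded value by definition is not.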
You could ask, "How did a supposedly competent company like Siemens make a moronic mistake like hard-coding a password into their system?" Or, "How did such poorly written software ever make it past a code review?"
But the real question we should be asking is, "Why did Siemens ignore this critical vulnerability when it was publicly disclosed more than two years ago?" That's the fact of it: Siemens waited more than two years to do anything, and even then they only started to address the problem after a worm exploited it. I'd bet there are lawyers pondering whether Siemens can be successfully sued for negligence, based on their lack of response even after they knew about this vulnerability.
If this is "innovation", give me less.