Normally, I Fuss About Code Weasels...
Sep. 30th, 2010 10:03 am
...with too much time on their hands. You know, the ones writing viruses that require a complete "Format C:." But in this particular instance?
I'm actually cheering them on:
Iran's official news agency said today [24 Sept] that a sophisticated computer worm purportedly designed to disrupt power grids and other such industrial facilities had infected computers at the country's first nuclear-power plant but had not caused any serious damage.
The Stuxnet worm, which some see as heralding a new era of cyberwarfare, appeared in July and was already known to be widespread in Iran. In fact, its high concentration there, along with a delay in the opening of the Bushehr plant, led one security researcher to hypothesize that Stuxnet was created to sabotage Iran's nuclear industry.
That was 4 days ago. Reports now are stating that it could take months to repair the damage to the systems, as a new "mutation" of the virus has appeared. The other interesting note, as of this morning:
“Since Iran's nuclear program in all probability would be a 'closed' system – without internet access – an individual would have had to carry a thumb drive into the facility and insert that into the system,” said Fred Burton, Stratfor's Vice President of Intelligence, in a video report available to members of Stratfor, a global intelligence company.
Back when I was in high school -- a lifetime ago, it seems -- I did an internship with one of Siemens' competitors. During that time, I learned quite a bit about programmable logic controllers, the industrial systems that would have been targeted by this virus. These were some of the earlier systems to use proprietary forms of networking to pass information from console to console, and to ease the process of updating programming. In places like paper mills and steel mills, these systems might even have a direct external connection so they can be remotely programmed. As noted in the articles above, however, a nuclear power plant would likely be a closed system, one in which a programmer would have to physically install the programming changes at a central location, or possibly even at each individual controller. The programming for these things is incredibly specific -- logic ladders that span thousands of pages -- and, while the applications are similar across the board, some of the syntax and training tends to be proprietary.
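For anyone who's never seen ladder logic, each "rung" of those thousands of pages boils down to a small Boolean expression the controller re-evaluates on every scan cycle. Here's a minimal Python sketch of a single classic rung (a motor start/stop circuit with a seal-in latch); the tag names are invented for illustration, not taken from any real plant program:

```python
def scan_rung(inputs):
    """One ladder rung: the motor runs while the start contact is closed
    (or the output has latched itself in) and the normally-closed stop
    contact is still intact."""
    start = inputs["start_button"]          # normally-open contact
    stop_ok = not inputs["stop_button"]     # normally-closed contact
    latched = inputs["motor"]               # seal-in branch fed by the output coil
    return (start or latched) and stop_ok   # state of the output coil

# One simulated scan cycle after the operator presses start:
state = {"start_button": True, "stop_button": False, "motor": False}
state["motor"] = scan_rung(state)           # motor energizes

# Operator releases the start button; the seal-in branch holds the motor on:
state["start_button"] = False
state["motor"] = scan_rung(state)           # motor stays on via the latch
```

Multiply that by thousands of rungs, each tied to real sensors and actuators, and you get a sense of both how specific the programming is and why subtly corrupting it is so hard to spot.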
Thus, given what (little) I know about these controllers, the closed nature of the nuclear plant systems, and the means by which they would have to be updated, it does seem highly likely that this is a coordinated effort. Not only does it create a process concern, delaying start-up of the nuclear plant, but it also has a secondary, psychological effect: "planting a 'seed of paranoia among the managers and bosses, and everybody in the workplace becomes scrutinized as a potential leak.'"
In one sense, this is bad. There are regular people, doing their jobs, who will likely be targeted by this scrutiny unjustly. And it will certainly make program officials more wary of any further incursions into their systems, which will create higher risk and more difficult conditions under which future missions might be executed. But maybe -- just maybe -- it will also slow things down and make the Iranians do things more deliberately, and that can be used to advantage.
The immediate result, however, cannot be ignored: Full start-up of the plant has been delayed by at least two or three months while the Iranians try to clean house. That is, of course, if they're telling the press the whole story; in this instance, they may be under-reporting the extent of the damage. Either way, it gives them food for thought, as well as buying other entities more time to develop a more interesting solution to the problem of Iran going nuclear.
Abso-frakking-lutely brilliant.
ETA: A pretty decent follow-up from the Times. From the article:
While the S-7 industrial controller is used widely in Iran, and many other countries, even Siemens says it does not know where it is being used. Alexander Machowetz, a spokesman in Germany for Siemens, said the company did no business with Iran’s nuclear program. “It could be that there is equipment,” he said in a telephone interview. “But we never delivered it to Natanz.”
Now, I'm kinda thinking out loud here, but to me, this creates a few issues. For the manufacturer, a third party often isn't as experienced in dealing with the equipment as the manufacturer's own sales/technical rep. That has frequently resulted in bad programming and an eventual service call to the manufacturer's technical reps to correct the problem.
In this instance, using a third-party vendor means Siemens has no control over where its equipment is going and who is programming it; it means the Iranians are relying on someone who is not a Siemens rep to program it, possibly creating a larger vulnerability than otherwise expected; and the extended chain of custody (as it were) provides more access for intentional sabotage.
Then again, if they had been able to purchase directly from Siemens, who is to say that anyone would have had access to plant the bug to begin with?
Ah, all the loverly what-ifs....
ETA 2: Another follow-up from The Weekly Standard. Even better than the Times article, it details the weaknesses of the programmable logic controllers that Stuxnet exploited.
I know I'm being repetitive, but I'll say it again: Absolutely brilliant work.