Science fiction-style sabotage a fear in new hacks

Oct 23, 2011 | By JORDAN ROBERTSON, AP Technology Writer
Researcher Dillon Beresford poses for a photo at his office, Wednesday, Aug. 31, 2011, in Austin, Texas. Beresford said it took him just two months and $20,000 in equipment to find more than a dozen vulnerabilities in electronic controllers of the same type used in Iran. The vulnerabilities, which included weak password protections, allowed him to take remote control of the devices and reprogram them. (AP Photo/Eric Gay)

When a computer attack hobbled Iran's unfinished nuclear power plant last year, it was assumed to be a military-grade strike, the handiwork of elite hacking professionals with nation-state backing.

Yet for all its science fiction sophistication, key elements have now been replicated in laboratory settings by security experts with little time, money or specialized skill. It is an alarming development that shows how technical advances are eroding the barrier that has long prevented computer assaults from leaping from the digital to the physical world.

The techniques demonstrated in recent months highlight the danger to operators of power plants, water systems and other critical infrastructure around the world.

"Things that sounded extremely unlikely a few years ago are now coming along," said Scott Borg, director of the U.S. Cyber Consequences Unit, an organization that helps the U.S. government prepare for future attacks.

While the experiments have been performed in laboratory settings, and the findings presented at security conferences or in technical papers, the danger of another real-world attack such as the one on Iran is profound.

The team behind the so-called Stuxnet worm used to attack the Iranian facility may still be active. New malware with some of Stuxnet's original code and behavior has surfaced, suggesting ongoing reconnaissance against industrial control systems.

And attacks on critical infrastructure are increasing. The Idaho National Laboratory, home to secretive defense labs intended to protect the nation's power grid and other critical infrastructure, has responded to triple the number of computer attacks from clients this year over last, the U.S. government has revealed.

For years, ill-intentioned hackers have dreamed of plaguing the world's infrastructure with a brand of sabotage reserved for Hollywood. They've mused about wreaking havoc in industrial settings by burning out power plants, bursting oil and gas pipelines, or stalling manufacturing plants.

But a key roadblock has prevented them from causing widespread destruction: they've lacked a way to take remote control of the electronic "controller" boxes that serve as the nerve centers for heavy machinery.

The attack on Iran changed all that. Now, security researchers - and presumably, malicious hackers - are racing to find weaknesses. They've already found a slew of vulnerabilities.

Think of the new findings as the hacking equivalent of Moore's Law, the famous observation that computing power roughly doubles every couple of years. Just as better computer chips have accelerated the spread of PCs and consumer electronics over the past 40 years, new hacking techniques are making all kinds of critical infrastructure - even prisons - more vulnerable to attacks.

One thing all of the findings have in common is that mitigating the threat requires organizations to bridge a cultural divide that exists in many facilities. Among other things, separate teams responsible for computer and physical security need to start talking to each other and coordinate efforts.

Many of the threats at these facilities involve electronic equipment known as controllers. These devices take computer commands and send instructions to physical machinery, such as regulating how fast a conveyor belt moves.

They function as bridges between the computer and physical worlds. Computer hackers can exploit them to take over physical infrastructure. Stuxnet, for example, was designed to damage centrifuges in the nuclear plant being built in Iran by affecting how fast the controllers instructed the centrifuges to spin. Iran has blamed the U.S. and Israel for trying to sabotage what it says is a peaceful program.
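The spin-rate manipulation described above can be illustrated with a toy simulation. All names and numbers below are hypothetical, and this is not Stuxnet's actual logic; the sketch only shows the general pattern reported by researchers: a compromised controller drives hardware outside its safe band while reporting normal readings back to operators.

```python
# Toy model of a controller relaying a spin-rate setpoint to hardware.
# Purely illustrative: names, numbers, and behavior are hypothetical.

class Controller:
    def __init__(self):
        self.reported_rpm = 0

    def command(self, setpoint_rpm):
        """Send the setpoint to the device and report it honestly."""
        self.reported_rpm = setpoint_rpm
        return setpoint_rpm  # what the hardware actually does


class CompromisedController(Controller):
    """Overspeeds the hardware but shows operators the expected value."""

    def command(self, setpoint_rpm):
        actual = setpoint_rpm * 1.4       # secretly drive the rotor too fast
        self.reported_rpm = setpoint_rpm  # operators still see a normal value
        return actual


good, bad = Controller(), CompromisedController()
print(good.command(1000), good.reported_rpm)  # 1000 1000
print(bad.command(1000), bad.reported_rpm)    # 1400.0 1000
```

The operator's console and the physical machinery disagree, which is why such attacks can run for a long time before anyone notices.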

Security researcher Dillon Beresford said it took him just two months and $20,000 in equipment to find more than a dozen vulnerabilities in the same type of electronic controllers used in Iran. The vulnerabilities, which included weak password protections, allowed him to take remote control of the devices and reprogram them.
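The weak-password class of vulnerability can be sketched generically. The password list and the `authenticate` stub below are hypothetical stand-ins (real PLC login protocols are vendor-specific); the point is simply that devices left on documented factory defaults fall to a trivial dictionary check.

```python
# Generic sketch of a default-credential check against an embedded device.
# The credential list and authenticate() stub are illustrative only.

COMMON_DEFAULTS = ["admin", "password", "1234", ""]  # hypothetical list

def authenticate(device, password):
    """Stand-in for a vendor-specific login call."""
    return password == device["password"]

def find_default_credential(device):
    """Return the first default password that works, or None."""
    for candidate in COMMON_DEFAULTS:
        if authenticate(device, candidate):
            return candidate
    return None

plc = {"name": "demo-plc", "password": "admin"}  # left at factory default
print(find_default_credential(plc))  # admin
```

A device whose password was changed to something unique would return `None` here, which is why changing default credentials is one of the cheapest mitigations available.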

"What all this is saying is you don't have to be a nation-state to do this stuff. That's very scary," said Joe Weiss, an industrial control system expert. "There's a perception barrier, and I think Dillon crashed that barrier."

One of the biggest makers of industrial controllers is Siemens AG, which made the controllers in question. The company said it has alerted customers, fixed some of the problems and is working closely with CERT, the cybersecurity arm of the U.S. Department of Homeland Security.

Siemens said the issue largely affects older models of controllers. Even with those, the company said, a hacker would have to bypass passwords and other security measures that operators should have in place. Siemens said it knows of no actual break-ins using the techniques identified by Beresford, who works in Austin, Texas, for NSS Labs Inc.

Yet because the devices are designed to last for decades, replacing or updating them isn't always easy. And the more research that comes out, the more likely attacks become.

One of the foremost Stuxnet experts, Ralph Langner, a security consultant in Hamburg, Germany, has come up with what he calls a "time bomb" of just four lines of programming code. He called it the most basic copycat attack that a Stuxnet-inspired prankster, criminal or terrorist could come up with.
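Langner has not published those four lines, and they were written for controller hardware, not a general-purpose language. The general shape of such a logic bomb is well known, though, and can be sketched conceptually (trigger and payload below are invented for illustration): the code sits dormant inside the controller's normal scan loop until a trigger condition, then forces an output into an unsafe state.

```python
# Conceptual logic-bomb sketch; NOT Langner's code. The trigger condition
# and the forced output are hypothetical examples.

def scan_cycle(cycle_count, outputs):
    """One pass of a controller-style scan loop with a hidden trigger."""
    if cycle_count >= 10_000:         # trigger: stay dormant until this cycle
        outputs["valve_open"] = True  # payload: force an unsafe output state
    return outputs

state = {"valve_open": False}
state = scan_cycle(9_999, state)
print(state["valve_open"])  # False: still dormant
state = scan_cycle(10_000, state)
print(state["valve_open"])  # True: payload fired
```

The dormancy is what makes such code hard to catch in testing: the controller behaves normally until the trigger fires.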

"As low-level as these results may be, they will spread through the hacker community and will attract others who continue digging," Langner said in an email.

The threat isn't limited to industrial facilities. Even prisons and jails are vulnerable.

Another research team, based in Virginia, was allowed to inspect a correctional facility - it won't say which one - and found vulnerabilities that would allow it to open and close the facility's doors, suppress alarms and tamper with video surveillance feeds.

During a tour of the facility, the researchers noticed controllers like the ones in Iran. They used knowledge of the facility's network and that controller to demonstrate weaknesses.

They said it was crucial to isolate critical control systems from the Internet to prevent such attacks.
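Isolation of that kind is usually enforced through network segmentation, and a basic audit checks whether control-system hosts are confined to their private segment. A minimal sketch, with hypothetical host names and address ranges:

```python
# Minimal segmentation audit: flag any control-system host whose address
# falls outside the private control-network range. Addresses and host
# names below are hypothetical examples.

import ipaddress

CONTROL_NET = ipaddress.ip_network("192.168.100.0/24")  # assumed segment

def exposed_hosts(inventory):
    """Return names of hosts not confined to the control segment."""
    return [name for name, addr in inventory.items()
            if ipaddress.ip_address(addr) not in CONTROL_NET]

inventory = {
    "door-controller": "192.168.100.12",
    "camera-server": "203.0.113.40",  # publicly routable: a red flag
}
print(exposed_hosts(inventory))  # ['camera-server']
```

A real audit would also check firewall rules and physical connections, since, as the Stuxnet case showed, an "air gap" can be bridged by something as simple as a USB stick.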

"People need to deem what's critical infrastructure in their facilities and who might come in contact with those," said Teague Newman, one of the three researchers behind the project.

Another example involves a Southern California power company that wanted to test the controllers used throughout its substations. It hired Mocana Corp., a San Francisco-based security firm, to do the evaluation.

Kurt Stammberger, a vice president at Mocana, told The Associated Press that his firm found multiple vulnerabilities that would allow a hacker to control any piece of equipment connected to the controllers.

"We've never looked at a device like this before, and we were able to find this in the first day," Stammberger said. "These were big, major problems, and problems frankly that have been known about for at least a year and a half, but the utility had no clue."

He wouldn't name the utility or the device maker. But he said it wasn't a Siemens device, which points to an industrywide problem, not one limited to a single manufacturer.

Mocana is working with the device maker on a fix, Stammberger said. His firm presented its findings at the ICS Cyber Security Conference in September.

Even if a manufacturer fixes the problem in new devices, there's no easy way to fix it in older units, short of installing new equipment. Industrial facilities are loath to do that because of the cost of even temporarily shutting down their operations.

"The situation is not at all as bad as it was five to six years ago, but there's much that remains to be done," said Ulf Lindqvist, an expert on industrial control systems with SRI International. "We need to be as innovative and organized on the good-guy side as the bad guys can be."



User comments (21)


gwrede
1 / 5 (4) Oct 23, 2011
After this, we are supposed to assume Stuxnet was not an American/Israeli operation. (See another recent thread about controlling public opinion.) Oh, well.

I wonder how long before somebody electronically booby-traps a nuclear powerplant or missile silo, and demands $10M for defusing it, or else!

Or, better yet, someone makes a Die Hard film about it, and then this idiot from Oklahoma sees the movie and actually does it.
trekgeek1
3 / 5 (2) Oct 23, 2011
After this, we are supposed to assume Stuxnet was not an American/Israeli operation. (See another recent thread about controlling public opinion.) Oh, well.

I wonder how long before somebody electronically booby-traps a nuclear powerplant or missile silo, and demands $10M for defusing it, or else!

Or, better yet, someone makes a Die Hard film about it, and then this idiot from Oklahoma sees the movie and actually does it.


You could always just pull the plug. No matter how complicated a system is you can always find the right place to just yank out the power that would enable a launch. That controller must send the final signal to the mechanism and you can just "hot wire" it.
InsaniD
5 / 5 (2) Oct 23, 2011
When I was a kid, I read a lot of sci-fi. The funny thing is, in these stories, the tech was sort of the same level we are at now and in the stories there were often counter tech that could turn the tables on would-be hackers.
At best, most high-tech facilities are running some pathetic bit of virus protection and have a firewall up - both of which do little in the event of an attack like this.
Come on folks - we have the tech of "the future" and it is time to protect it with real protection and even offensive capabilities...
Grizzled
2 / 5 (5) Oct 23, 2011
While not challenging his finding per se, I'd like to point out that finding some [potential] vulnerabilities isn't the same as finding a way of exploiting them - either practical or even theoretical. As a simple, very low-end example of that, I can name without even trying too hard at least a dozen weaknesses in my own home network. And I'm sure a good security expert OR hacker can find even more.

The reason I'm not doing anything about them is two-fold: they are both hard to fix and hard to exploit. Many would require a physical access to the device, even if briefly. Note that even in the article they mention physical inspection of the facilities. Others would require a truckload of electronic equipment sitting right next to the walls of my house. Even I would grow suspicious if I saw it there, the guys protecting a nuclear or chemical installation presumably should pay even more attention.

The trick is to do the same below the radar and the article makes no such claim.
Jeddy_Mctedder
1.7 / 5 (6) Oct 23, 2011
no you need to be a nation state to identify the hardware you are going to be hacking, and how to introduce viruses to the hardware by use of spies without getting caught.

the human part of security is and always be the hard part. we are the ultimate hardware running the ultimate software in our brains.
InsaniD
3.7 / 5 (3) Oct 23, 2011
Grizzled Said:
While not challenging his finding per se, I'd like to point out that finding som [potential] vulnerabilities isn't the same as finding a way of exploiting them - either practical or even theoretical.


The Article Says:
The vulnerabilities, which included weak password protections, allowed him to take remote control of the devices and reprogram them.


I'm not a genius, but I'd say if you could get into the systems and reprogram them (as Dillon did), you can probably exploit them rather easily as well.
Nerdyguy
2.6 / 5 (5) Oct 23, 2011
"You could always just pull the plug..."
- TrekGeek1

This is hopelessly naive. Modern industrial operations are huge monsters with so many tentacles stretched over such a massive amount of real estate that the difficulty is finding the right piece of equipment, let alone unplugging it. Many utilities have admitted as much over the years. And there have been countless articles (even since 9/11) of security breaches at high security facilities (e.g., nuclear).
copracr
3 / 5 (4) Oct 23, 2011
Seriously??? Why do people at the DOD or at nuclear power plants have computers connected to the internet? I have an old laptop that never gets plugged into the internet cause I want it to work. If I figured this out in 2002 why are these security geniuses not on the ball.
IF ITS IMPORTANT THEN DONT CONNECT IT TO THE NET.
ShotmanMaslo
2.3 / 5 (6) Oct 23, 2011
Seriously??? Why do people at the DOD or at nuclear power plants have computers connected to the internet?


Do they? I believe USB sticks are to blame here, not the net.
InsaniD
3 / 5 (4) Oct 23, 2011
Seriously??? Why do people at the DOD or at nuclear power plants have computers connected to the internet?
IF ITS IMPORTANT THEN DONT CONNECT IT TO THE NET.


SERIOUSLY, DUDE?

(From WIKI)
The Advanced Research Projects Agency Network (ARPANET - 1966), was the world's first operational packet switching network and the core network of a set that came to compose the global Internet. The network was funded by the Defense Advanced Research Projects Agency (DARPA) of the United States Department of Defense (DOD) for use by its projects at universities and research laboratories in the US.


Do I need to say more?
InsaniD
3.3 / 5 (4) Oct 23, 2011
Do they? I believe USB sticks are to blame here, not the net.

(From WIKI)

The Advanced Research Projects Agency Network (ARPANET - 1966), was the world's first operational packet switching network and the core network of a set that came to compose the global Internet. The network was funded by the Defense Advanced Research Projects Agency (DARPA) of the United States Department of Defense (DOD) for use by its projects at universities and research laboratories in the US.


Again, do I need to say more?
hush1
1 / 5 (2) Oct 23, 2011
Diebold's spokesperson:
Frankly, we welcome this. Our clientele welcome this. Providing you with outcomes you can afford.
rproulx45
not rated yet Oct 23, 2011
Has anyone heard of this newfangled thing called mechanical systems? Hard wired mechanical switches that can route power from one place to another. Tamper proof.
hush1
2.3 / 5 (3) Oct 23, 2011
In prehistoric times they called that Hoover Dams.
Grizzled
1.8 / 5 (4) Oct 23, 2011
Grizzled Said:
While not challenging his finding per se, I'd like to point out that finding som [potential] vulnerabilities isn't the same as finding a way of exploiting them - either practical or even theoretical.


The Article Says:
The vulnerabilities, which included weak password protections, allowed him to take remote control of the devices and reprogram them.


I'm not a genius, but I'd say if you could get into the systems and reprogram them (as Dillon did), you can probably exploit them rather easily as well.


I will give you an example of what you are missing there.

He mentions weak password protection, right? Have you stopped to think what exactly that means? Well, one of the most popular problems is sticky notes with passwords. Take a stroll through almost any large office and you'll probably pick up half a dozen or so. Weakness? For sure. Can you exploit it from the outside? Hardly. Lots of weaknesses are like that - you need a foot in the door first.
Grizzled
3 / 5 (6) Oct 23, 2011
Again, do I need to say more?

Actually yes. You see, in addition to Wiki, it wouldn't hurt to check other sources too. If you had, you may have found that in the case of Stuxnet, the Iranians were paranoid enough or security-conscious enough to isolate all their computers at the facility from the net. Didn't work.

The reason? A few sloppy scientists and engineers who took their own personal laptops and USB devices through the tight security, outside and back. That was enough.

This is another illustration of my earlier point that most such weaknesses are not technological but human. HUMANS brought that worm in. HUMANS failed to enforce physical site security. The rest was exploiting it. Clever exploiting, yes. Sophisticated - no doubt. But still relying on human error.
KWZ
5 / 5 (4) Oct 24, 2011
I know this is lame but did anyone else laugh when they read that the director of the U.S. Cyber Consequences Unit was named Scott Borg. Resistance is futile.
SiBorg
5 / 5 (1) Oct 24, 2011
Yup, the Borg thing tickled me too.

@Grizzled, I'd assumed that the weak password protection mentioned in the articles was referring to the default admin passwords on PLC. Manufacturers use a number of the more common ones and it only takes an idle engineer for them not to be reconfigured. I've worked on a number of systems where this was the case.
Smellyhat
5 / 5 (1) Oct 24, 2011
It is a remarkable (and false) supposition to think that government agencies would have more advanced capabilities than private researchers in this field.
COCO
1 / 5 (3) Oct 24, 2011
part of the problem Stinky is that is where talent goes in Amerika - that is where the money and fun is - just ask Hillary and her psycho lick-spittles
trekgeek1
not rated yet Oct 25, 2011
No matter how complicated a system is you can always find the right place to just yank out the power that would enable a launch.


-ME

...the difficulty is finding the right piece of equipment.


-Nerdguy

Glad we agree that at least you can find such a place. Did I say it's easy to find it? I just said you can find it and disable it. And the issue of people at the location being traitors is a completely different problem than a remote attack.
