The backbone of America — banks, oil and gas suppliers, the energy grid — is under constant attack by hackers.
But the biggest cyberattacks, the ones that can blow up chemical tanks and burst dams, are kept secret by a law that shields U.S. corporations. They stay in the dark forever.
You could live near — or work at — a major facility that has been hacked repeatedly and investigated by the federal government. But you’d never know.
What’s more, that secrecy could hurt efforts to defend against future attacks.
The murky information that is publicly available confirms that there is plenty to worry about.
Unnamed energy utilities and suppliers often make simple mistakes that easily expose the power grid to terrorist hackers and foreign spies. A CNNMoney review of public documents issued by regulators reveals widespread flaws.
There was the power company that didn’t bother to turn off communication channels on its gear at mini-stations along the electrical grid, leaving access points completely open to hackers. It was fined $425,000 by its regulator in August.
Another power company forgot to patch software on 66% of its devices, thus exposing them to known flaws exploited by hackers. It got a $70,000 fine in February.
There are plenty of other examples, and all “posed a serious or substantial risk” to portions of the electrical grid, these documents say.
And hackers do sometimes get through.
In an industry newsletter available online, the Department of Homeland Security occasionally documents hacks, though only with vague descriptions.
In early 2013, hackers attacked several natural gas pipelines in the Midwest, trying to break into the communication network that tells industrial machines what to do.
Last year, a hacker got into the network that controls industrial equipment at a public utility — but DHS won’t even say where it is in the United States.
We don’t know what happened in either case — or the dozens that stay under the radar each year. Neither do the very computer experts who train the nation’s next generation of hacking defenders. And even regulators can’t use this information to make safety regulations.
“Most folks don’t have any idea,” said David Kennedy, whose firm TrustedSec investigates attacks on power plants and other critical companies.
Steven Aftergood, who leads the Project on Government Secrecy at the Federation of American Scientists, worries that "by categorically withholding this information, the government is concealing the very factors that shape homeland security policy."
“Instead of a precise picture of an actual threat, the public is left with only vague generalities. The resulting deliberative process is crippled from the start,” Aftergood said.
It’s not just the energy industry. Every company that’s considered “critical infrastructure” can keep major hacks secret: the telecom industry, big banks, major chemical makers.
The only reason you hear about the small-time stuff, such as when a retailer loses your credit card data, is that some states have laws requiring disclosure. The potentially dangerous hacks stay in the dark permanently.
Why all the secrecy?
In the wake of the 2001 terrorist attacks, government officials were worried about protecting the nation’s critical infrastructure.
To encourage the sharing of information about major physical and computer-based attacks, the 2002 Homeland Security Act included special protection for U.S. companies: Any evidence they submit is considered “Protected Critical Infrastructure Information” (PCII) and kept from public disclosure.
CNNMoney reviewed a 2009 DHS manual explaining the policy to law enforcement, government agents and industry. It explicitly states that this information is to be kept out of the hands of journalists, regulators and the public at large. The media "may not receive PCII" unless a company approves. A safety inspector "does not have a valid need-to-know" if he or she plans to use that information "for regulatory purposes."
The manual spells out what this means in practice. Suppose a severe vulnerability makes train stations prone to terrorist attack. If that information is categorized as "PCII," a federal regulator can't mention it, even in reports pushing for better safety regulations at train stations.
At an energy industry conference in Philadelphia last month, Caitlin Durkovich, assistant secretary for infrastructure protection at DHS, repeatedly told company executives they’d never have to worry about public exposure.
“We go through extraordinary measures to make sure that information cannot get to someone who’d want to hurt you,” she said. “We cannot make it available to regulators, sunshine laws or [public records]. It’s part of building this trusted relationship with you.”
“We recognize we work in support of you,” Durkovich said.
The value of silence
There are good reasons for the policy. Investigators don't want to tip off hackers about how close they are to catching them. And cybersecurity experts agree that computer flaws shouldn't be made public before they're fixed.
American power utilities tell CNNMoney they don’t want to give hackers a road map to their systems, which rarely get upgraded or replaced.
Then there's DHS. The agency is concerned that if its records ever go public, the private companies that run the vast majority of the nation's backbone will stay silent.
“The program… offers an essential incentive for critical infrastructure owners and operators to share relevant information with DHS,” said agency spokesman S.Y. Lee.
Besides, the public hears about some hacks anyway.
“I think we know enough. We do know enough that this is an epidemic. I don’t think we need the whole picture and all the gory details,” said Phillip Dunkelberger, a technology executive who leads Nok Nok Labs, which specializes in biometrics and other futuristic authentication tools.
The darkness is blinding
But even if they agree about the need for initial secrecy, computer security experts are deeply skeptical about making it permanent.
“It makes zero sense to lock up this information forever,” said Jeremiah Grossman, who founded cybersecurity firm WhiteHat Security. “Certainly there are past breaches that the public should know about, is entitled to know about, and that others can learn from.”
Robert M. Lee spent time in the U.S. Air Force, where he conducted hacking operations as a "cyber warfare officer." Now he travels the world for the SANS Institute, teaching the actual government investigators and power plant computer teams who face these types of dangerous attacks.
Except he doesn’t have any class material. He can’t find it. It’s all secret.
“My class is the only hands-on training for industrial control systems, but my students’ number one complaint is that there aren’t case studies or enough data out there about the real threat we’re facing,” he said. “There’s no lessons learned. It is extremely destructive to the overall national security status of critical infrastructure.”