On Info Sharing

In September Critical Intelligence held its inaugural CounterIntel Conference and Training in beautiful Park City, Utah. It was probably the most eye-opening security conference I have ever attended. This owes primarily to the outstanding speakers. You can take a look at the full lineup here.

As several recent blog posts have dealt with information sharing, I wanted to relate the on-point comments provided by Mark Weatherford, who delivered the keynote address. Mark has an impressive resume: he is a former Naval Cryptologic Officer, a former DHS Deputy Under Secretary for Cybersecurity, and now a Principal at the Chertoff Group.

I have heard Mr. Weatherford speak on previous occasions, but I was unprepared for his candor — which had me scribbling furiously away. Here were some gems:

“Government thinks they know what’s right for private industry. That’s wrong and it’s wrong-headed.”

“The government is incapable of providing timely and actionable intelligence to the private sector today.”

“Good intelligence is integrated across sources.”

“Public-private partnership is probably not going to work out, at least in the near term, in a way satisfactory to the private sector.”

“In our business there simply isn’t time to get everyone’s opinion prior to making a decision.”

“There is a mystique around classified information — but once people get access to it they are severely disappointed.”

I was more than surprised to hear a former DHS cybersecurity official make those statements. But I think that makes them all the weightier.

National Security Systems and ICS

I was reviewing the Committee on National Security Systems Instruction 1253, which provides guidance for applying security controls to National Security Systems. I was specifically looking through the document for references to industrial control systems.


National Security Systems are essentially any systems used for military or intelligence activities (including weapons systems), excluding systems used for routine administrative applications such as payroll, finance, logistics, and personnel management (see 44 U.S.C. 3542(b)(2)).

CNSSI 1253 was updated in March 2014. The previous version (March 2012) said:

INDUSTRIAL CONTROL SYSTEMS

SECURITY CONTROLS, ENHANCEMENTS, AND SUPPLEMENTAL GUIDANCE

Adoption of National Institute of Standards and Technology Special Publication 800-53, Revision 3, Appendix I, is not mandatory and is solely at the discretion of national security community departments and agencies, at this time, pending further applicability by the national security community.

To me, this meant that there was a “free pass” for NSS ICS. I found that quite concerning.

Potentially worse, however: looking at the updated 1253 (March 2014) instruction, I find no reference whatsoever to industrial control systems. My guess is that NIST now addresses ICS security entirely outside of 800-53 (800-53 revision 4, released in April 2013, has no appendix dealing with industrial control systems); the ICS guidance now lives in 800-82. Because 800-82 is not part of the core “transformational documents” (e.g., SP 800-30, 800-37, 800-39, 800-53, and 800-53A) coordinated between NIST and CNSS, it appears to have been left out of CNSSI 1253.

This leaves me wondering what guidance is being used for categorization and control selection for industrial control systems that are national security systems.

Maybe the guidance is classified. Maybe it exists at the agency level. Maybe there is some reference to 800-82 that I didn’t find. But in an age when our country’s highest defense officials talk about a “cyber 9/11” and a “digital Pearl Harbor”, the inability to easily identify a common baseline for securing military industrial control systems is deeply concerning.

Three ideas for federal advancement of ICS security

Over the past several years I have witnessed a line of federal leaders push information sharing as if it were the absolute solution to forever ensuring the cyber security of the infrastructures on which Americans rely. We have had testimony in congressional hearings, proposed and passed legislation, executive orders, NIST info-sharing guidance, and Washington Post op-eds proclaiming the indispensable nature of information sharing for the preservation of the country.

While fans may show up to watch deceased baseball greats compete on the Field of Dreams, we aren’t talking about the American pastime; we are talking about American infrastructure. We don’t need fans. We need action.

There are three fundamental problems to securing the industrial control systems (ICS) at the heart of our infrastructure that info sharing will never address:

  • insecure control system architectures
  • insecure control systems products
  • insecure control systems communication protocols

Make no mistake: I am an intelligence and information sharing professional. I have been serving the unique intelligence needs of the country’s largest utilities for 8 years, the first two building out the DHS control systems cyber situational awareness effort that became the ICS-CERT, and the next six as the Director of Analysis of the only commercial vulnerability and threat intelligence organization to focus specifically and exclusively on industrial control systems. I believe that intelligence and information sharing have a role to play. But that role is to inform appropriate action. It is not a sufficient solution in and of itself.

In an attempt to advance conversation about how to address the core issues at a federal level, I submit the following action-oriented ideas:

ONE

Government to lead the way by cleaning its own house. The U.S. government is a significant buyer of industrial control systems technology. Why not use this purchasing power to advance the state of ICS security? How about this: As of July 1, 2016 (or whatever reasonable date you choose), new ICS builds relying on federal funds cannot use unauthenticated protocols.
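To make “unauthenticated protocols” concrete, here is a minimal sketch of a Modbus/TCP “Write Single Register” request (register and unit values below are illustrative). Note that no field in the frame carries credentials, a session token, or a signature; any host that can reach the controller’s TCP port 502 can issue writes.

```python
import struct

def modbus_write_single_register(transaction_id: int, unit_id: int,
                                 register: int, value: int) -> bytes:
    """Build a Modbus/TCP 'Write Single Register' request (function 0x06).

    The frame is MBAP header + PDU. There is no authentication field
    anywhere in the protocol -- that is the point of this sketch.
    """
    pdu = struct.pack(">BHH", 0x06, register, value)      # function, address, value
    mbap = struct.pack(">HHHB", transaction_id, 0x0000,   # protocol id 0 = Modbus
                       len(pdu) + 1, unit_id)             # length counts unit id + PDU
    return mbap + pdu

# Hypothetical write of value 0x1234 to holding register 0x0010 on unit 1.
frame = modbus_write_single_register(1, 0x01, 0x0010, 0x1234)
print(frame.hex())
```

Requiring authenticated protocols on federally funded builds would mean frames like this one could no longer be accepted from arbitrary network peers.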

TWO

Require network security monitoring for ICS networks owned by the federal government. Instead of worrying quite so much about enterprise networks (which are important), ask/encourage/incentivize federal ICS owner/operators to install network security monitoring solutions that give defenders insight into anomalies on *ICS* networks. Other technologies, such as application whitelisting, would also do well in many of those environments. A government push for these solutions on its ICS networks may spur the market and result in advancement and confidence.
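As a rough sketch of why monitoring ICS networks is tractable: ICS traffic is typically so regular that even a crude whitelist of learned (source, destination, port) flows flags meaningful anomalies. All addresses and the baseline below are hypothetical.

```python
# Hypothetical baseline learned during a quiet observation period.
BASELINE = {
    ("10.1.1.5", "10.1.1.20", 502),    # HMI -> PLC, Modbus/TCP
    ("10.1.1.5", "10.1.1.21", 44818),  # HMI -> PLC, EtherNet/IP
}

def detect_anomalies(flows):
    """Return any observed flow not seen during the learning period."""
    return [f for f in flows if f not in BASELINE]

observed = [
    ("10.1.1.5", "10.1.1.20", 502),    # expected engineering traffic
    ("10.1.1.99", "10.1.1.20", 502),   # unknown host talking to the PLC
]
alerts = detect_anomalies(observed)
print(alerts)  # -> [('10.1.1.99', '10.1.1.20', 502)]
```

A real deployment would learn the baseline from traffic capture and inspect protocol contents as well, but the underlying idea is this simple.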

THREE

Fines for Online ICS. For example, if a water provision system is connected to the public Internet, that fact must be brought to the water users’ attention. We need a small consequence right now (something like a $5,000 fine) to avoid a big consequence in the future.
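Detecting Internet-connected ICS does not require exotic tooling; reachability of a common ICS service port can be checked with a plain TCP connect. The port-to-protocol map below is illustrative, and an actual audit of someone else’s infrastructure would of course require authorization.

```python
import socket

def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Attempt a plain TCP connect; success means the port answers."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Common ICS service ports worth auditing from *outside* the perimeter.
ICS_PORTS = {502: "Modbus/TCP", 20000: "DNP3", 102: "Siemens S7", 44818: "EtherNet/IP"}

def audit(host: str) -> dict:
    """Report which well-known ICS ports answer on a given host."""
    return {name: port_open(host, port) for port, name in ICS_PORTS.items()}
```

If a utility’s public address space answers on any of these ports, the users of that utility deserve to know.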

I recognize that there are significant details to work out in order to implement each of these ideas. But these action-oriented concepts have at least some direct correlation with the underlying problems facing critical infrastructure cyber security.

Let’s forgo the info-sharing pastime, and focus federal efforts on the foundation of a more secure future.

Revisiting copycats and Stuxnet

As I read Kim Zetter’s “Countdown to Zero Day” I was reminded of the copycat discussions that seemed sparked by Ralph Langner’s warnings (see pp. 182-183).

“Langner suspected it would take just six months for the first copycat attacks to appear. They wouldn’t be exact replicas of Stuxnet, or as sophisticated in design… but they wouldn’t need to be.”

“[Melissa] Hathaway, the former national cyber security coordinator for the White House… told the New York Times, ‘We have about 90 days to fix this before some hacker begins using it’.”

Did we have copycats? Do we have copycats?

Off the top of my head, I can’t think of any I would call a close copycat. That doesn’t mean there aren’t any, but if there are, they are still virtually unknown.

However, we should recognize that some threat actors seem to have learned what I consider the most valuable lesson from Stuxnet: Engineering firms, ICS integrators and ICS software vendors are high value targets.

Stuxnet’s attackers apparently went after the computers at NEDA and other ICS integrators to get access to Natanz. This means the attackers had access to engineering details necessary to create highly specific and customized attacks. It also means that the attackers had access to the ICS networks themselves (via engineering laptops, at a minimum).

When we think of Stuxnet, we think of Natanz — but broaden your view. What other projects had NEDA and the other targeted ICS integrators worked on? Stuxnet and its cousin code (Duqu, etc.) have been all over Iranian (critical) infrastructure.

Back to the copycats thread. Look at Havex. The parties behind Havex certainly targeted ICS integrators and support providers (via Trojanized software from eWon and MBConnectLine). So in 2014 we saw a copycat of a key concept. And I would fully expect to see more ICS vendors, integrators, and engineering firms targeted by ICS-seeking malware in the near future.
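One inexpensive defense against Trojanized installers of the Havex variety is verifying every vendor download against a digest obtained out-of-band (a signed advisory, or a phone call to the vendor) rather than from the download page, which may itself be compromised. A minimal sketch:

```python
import hashlib

def sha256_of(path: str) -> str:
    """Stream the file through SHA-256 so large installers don't load into RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_installer(path: str, published_digest: str) -> bool:
    """Compare a downloaded installer against the vendor's published digest.

    The digest must come from a channel independent of the download itself;
    a Trojanized mirror can trivially publish a matching hash alongside
    the tampered file.
    """
    return sha256_of(path) == published_digest.lower()
```

This does not help when the vendor’s own build system is compromised, but it defeats the tampered-download vector used in the Havex campaign.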

So, if you operate critical infrastructure, consider the following questions:

  • Who are your ICS integrators?
  • Who is providing maintenance to your ICS?
  • What security policies and procedures are you requiring of those parties?

If the answers to these questions are buried in layers of subcontracts, and all you know is that “your control systems work,” chances are there’s not a lot of security oversight going on. Good luck when the next copycats arrive.

Schneider Proclima vulns: ICS or not?

The Schneider ProClima vulnerability disclosures were another interesting case study on ICS security communications.


Security Week ran an article on them. As did Threatpost.

Interesting-ness #1
In communications from Schneider and DHS, there are two “vulnerabilities”, both classified as “Command Injection” (CWE-77), yet a total of five CVEs. I understand the reasons behind combining analysis in some cases, but am I the only one who thinks each CVE should correspond to exactly one vulnerability?

Interesting-ness #2
ProClima software would very rarely be found on an industrial network. It is enclosure design software: it helps engineers design control enclosures/cabinets so that they don’t get too hot. It might conceivably be on ICS engineers’ laptops, but its fundamental purpose is not process control or process design — it is process control cabinet design!

Interesting-ness #3
The CVSSv2 base score for these vulnerabilities is 10.0 (the highest score possible). The vulnerabilities are in an ActiveX control, so even if the software were on an ICS network (it’s not — see #2 above), the vulnerable machine would still have to be browsing the public Internet to get exploited. If your ICS machines can do that, then you have worse problems than some obscure ActiveX vuln. In short, the score here does a poor job of characterizing the potential impact to the actual process being controlled.
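For reference, here is the arithmetic that produces the 10.0. A CVSSv2 base score of 10.0 corresponds to the vector AV:N/AC:L/Au:N/C:C/I:C/A:C; the metric weights and equations below are from the CVSS v2 specification. Nothing in the formula asks whether the machine actually controls a process, which is exactly the complaint above.

```python
# CVSS v2 metric weights for AV:N/AC:L/Au:N/C:C/I:C/A:C
AV, AC, Au = 1.0, 0.71, 0.704   # network access, low complexity, no authentication
C = I = A = 0.660               # "complete" impact on confidentiality/integrity/availability

impact = 10.41 * (1 - (1 - C) * (1 - I) * (1 - A))
exploitability = 20 * AV * AC * Au
f_impact = 0 if impact == 0 else 1.176
base = round((0.6 * impact + 0.4 * exploitability - 1.5) * f_impact, 1)
print(base)  # -> 10.0
```

The vector captures worst-case host compromise; it says nothing about how far that host is from the actual process.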

The reason I think these “small” analytical issues matter is that if we are really concerned about protecting critical infrastructure we have to communicate clearly. There is *virtually zero* potential process impact that results from successful exploitation of these vulnerabilities.

If you want to cut the hype and get solid ICS vuln analysis, then subscribe to Critical Intelligence ICS Core Intelligence Service.