I came across a GAO publication the other day: “Iranian Commercial Activities Update: Foreign Firms Reported to Have Engaged in Iran’s Energy or Communications Sectors”

GAO Iran

This is a recurring report the GAO issues on foreign firms that could be helping Iran with energy or communications infrastructure projects.

I found it interesting from two angles.

First, the report relies exclusively on OSINT to make its determinations:

We searched for the names of firms identified in our January 2014 report as well as for key terms such as “Iran” that appeared within 25 words from “explore,” “drill,” “refinery,” “natural gas,” or “petroleum.” We also searched for locations in Iran where oil, gas, and petrochemical activities were being conducted. In addition, we reviewed company publications, including annual reports; U.S. Securities and Exchange Commission (SEC) filings, if available; firms’ press releases and corporate statements that publicly reported their commercial activities in Iran; and corrected information that had been publicly reported. We excluded firms that reported purchasing crude oil or natural gas from Iran, because these purchases did not meet our definition of commercial activity in Iran’s oil, gas, or petrochemical sectors. We identified firms that were reported as having contracts, agreements, and memorandums of understanding to conduct commercial activity in Iran.
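GAO’s proximity criterion (“Iran” within 25 words of a keyword) is easy to approximate in code. Here is a toy sketch of that kind of windowed search over plain text; the function names and tokenization are my own illustration, not GAO’s actual methodology:

```python
import re

def words(text):
    """Lowercase word tokens from a document."""
    return re.findall(r"[a-z]+", text.lower())

def within_n_words(text, term, keyword, n=25):
    """True if `term` occurs within `n` words of `keyword` in `text`."""
    toks = words(text)
    term_pos = [i for i, w in enumerate(toks) if w == term]
    kw_pos = [i for i, w in enumerate(toks) if w == keyword]
    return any(abs(i - j) <= n for i in term_pos for j in kw_pos)
```

A real pipeline would also handle multi-word keywords like “natural gas” and run over a large document corpus, but the windowing idea is the same.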


Second, the report reads like a first-pass targeting list.

Maybe I am imagining too much here. But I can envision this report’s message as: “Here’s what these companies are saying about themselves on the public Internet. Now, let’s pass this thing off to the heavy dudes (for CNE) and see what’s really going on.”

There could even be an implicit threat — something like “You help our adversaries, we will help ourselves to your networks, your data, and the infrastructure you helped build.”


Have you heard of the “preemptive cyber strike” doctrine?

I view “preemptive cyber strike” as the digital counterpart to the Bush-era preemptive strike doctrine expressed in a national security policy document in September 2002. This policy was used to “justify” U.S. actions in Afghanistan and Iraq:

To forestall or prevent such hostile acts by our adversaries, the United States will, if necessary, act preemptively in exercising our inherent right of self-defense. The United States will not resort to force in all cases to preempt emerging threats. Our preference is that nonmilitary actions succeed. And no country should ever use preemption as a pretext for aggression.


While contemplating the implications of preemptive cyber strike for critical infrastructure, I had this novel idea for a NERC CIP-005 R2-“compliant” appropriate use banner:

//       WARNING

ICS vendors: Seven ways to smell like a security champ!

Picking up where Reid left off, I want to promote seven simple things ICS vendors can do to stink less when it comes to managing security (okay, number 2 isn’t so simple, but it is important!):

7. List security contact info on your Web page
This shows you are ready and willing to talk with researchers (or users) who have a security concern with the product you created. Then respond promptly and courteously when contacted.

6. Put your own advisories on your own Web page
Look, it’s *your* product. Don’t rely on ICS-CERT, US-CERT, CERT/CC, etc. to communicate with *your* customers. What they are doing is a public service. What you are doing is supporting the product *you* wrote and *your* customers purchased. Do *your* job.

5. Proofread the vulnerability advisories you write
If you don’t have a clue what that advisory means, there is a good chance no one else (including your customers) has a clue either. Hire someone who has a security background and can communicate effectively to author these important communications.

4. Monitor for vulnerabilities in the third-party components your products rely on
Lots of bugs exist in third-party code. Some of these have been known for years. Don’t keep them buried and pretend no one will ever know.

3. Keep public details about case studies to a minimum
If you tell the world all about the system you built for your customers, you could be making your customers a target.

2. Monitor your own networks for security breaches
Think about it: your support infrastructure, your code-management infrastructure, and your Web site, if compromised, can easily be leveraged against your customers. That makes those resources high-value targets.

1. Alert your customers when your networks suffer a breach
Don’t sweep it under the carpet. Your risk is their risk.

These aren’t the only ideas out there. Read the Microsoft Security Development Lifecycle book and Web site. Read guidance documents such as the Organization for Internet Safety’s “Guidelines for Security Vulnerability Reporting and Response” or the National Infrastructure Advisory Council’s “Vulnerability Disclosure Framework”.

Most of all, put yourself in the shoes of the critical infrastructure firms who rely on your product and your expertise. Put yourself in the shoes of the citizens who rely on electricity, water, oil, and natural gas. What do they want to know? What do they deserve to know? Don’t stink! Smell like a security champ!

Secure your buildings? uhhhh….

I had to emit a sad chuckle at the GAO report on cyber security for federal buildings.

GAO buildings report cover

By way of background: DHS, through its Federal Protective Service (FPS), provides security at more than 9,000 federal facilities nationwide.

I’ve read lots of GAO reports. This one is about as scathing as the GAO’s sterilely objective grammar permits:

DHS lacks a strategy that: (1) defines the problem, (2) identifies the roles and responsibilities, (3) analyzes the resources needed, and (4) identifies a methodology for assessing this cyber risk. A strategy is a starting point in addressing this risk. The absence of a strategy that clearly defines the roles and responsibilities of key components within DHS has contributed to a lack of action within the Department.

Unfortunately, building automation is one of the most overlooked areas of cyber security. In the commercial world, some buildings are leased rather than owned; facilities maintenance teams often fall under separate management from corporate IT; facilities maintenance personnel are unlikely to have cyber security training or forethought; and so on.

On one hand, I understand that the folks at FPS haven’t thought about cyber. Has your guard force? Imagine walking up to one of these cops and saying, “Hey, I bet there’s an ActiveX control in your HVAC HMI with a stack-based buffer overflow. It could probably be exploited via a malicious redirect employed in a strategic Web compromise.”

He or she will look at you like you are from another planet.

In addition, I recognize that the DHS mission is enormous and ill-defined, oversight is lacking, leadership turnover is surprisingly high, and the bureaucracy can be oppressive and slow.

For those reasons, I don’t mean to imply that implementing cyber security into federal building automation environments is easy.

On the other hand, it has been two and a half years since Billy Rios knocked building automation cyber security into the limelight. That should be time enough to at least have some type of plan or strategy.

And it would have been, had DHS risk management leaders been relying on a competent and appropriately-resourced *integrated cyber-physical intelligence team* to bring important developments in the external threat environment to leadership attention!

Fines or “Traffic School” for Online ICS?

My post on three ideas for advancing ICS security at a federal level received some comments from Andy Robinson on Twitter. You can read the thread here.

Andy liked my post but took issue with idea number 3:

“Fines for Online ICS. As an example, if water provision systems are connected to the public Internet, it must be brought to the water users’ attention. We need a small consequence right now (something like a $5,000 fine) to avoid a big consequence in the future.”

Essentially, Andy posits that government should help utilities, particularly municipal water provisioners, rather than fine them.

He thinks the punishment should be something like traffic school instead of just slapping them with a fine. He dislikes the idea of transferring funds from already struggling municipalities to the federal government, which only increases taxpayer burden.

That is a fair objection.

Twitter is not an apt medium for discussing some of the broader context of my idea. So here goes:

1. Finding online ICS

Just about anyone can find online ICS: Shodan, ZoomEye, the SCADASL cheat sheet. It is not hard. That’s not to say that every online ICS is vulnerable, but being online does make it a target, at least for opportunistic attackers. What we saw with BlackEnergy supports that.
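To make the opportunistic-discovery point concrete, here is a hypothetical sketch of the kind of banner filtering a Shodan-style search performs. The records, addresses, and keyword list below are invented for illustration:

```python
# Hypothetical scan records, standing in for Shodan-style results.
banners = [
    {"ip": "203.0.113.10", "port": 502, "data": "Modbus/TCP"},
    {"ip": "203.0.113.22", "port": 80, "data": "Niagara Web Server"},
    {"ip": "203.0.113.31", "port": 22, "data": "OpenSSH 7.4"},
]

# A few product keywords an opportunistic attacker might grep banners for.
ICS_HINTS = ("modbus", "niagara", "codesys", "simatic")

def looks_like_ics(record):
    """Crude keyword match against the service banner."""
    return any(hint in record["data"].lower() for hint in ICS_HINTS)

ics_hits = [r for r in banners if looks_like_ics(r)]
```

No exploitation is required: a keyword match against public banners is enough to build a first-pass target list, which is exactly why merely being online raises the risk.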

2. The problem is getting worse

If you go back to Eireann Leverett’s work on Internet-connected ICS in 2011 and repeat the searches today, you see that in most cases the number of hits has dramatically increased. Granted, Shodan’s scanning and parsing may also have improved, but the assertion that more ICS are being connected to the Internet today is confirmed by Project Shine. We are connecting more ICS to the Internet. The problem is getting worse.

3. Contacting the owner

The problem we want to address is getting the offending systems segmented from the public Internet. The hang-up is that many of these IP addresses are registered to Internet Service Providers (ISPs), and there is currently no way to compel an ISP to disclose who owns or operates a particular IP address. Essentially, you can’t warn whom you can’t reach. There could be privacy and free speech issues here, but there are also public safety concerns.

4. Assigning responsibility

Even if we know who owns an Internet-connected ICS, we don’t know who decided to put it on the Internet (in many cases they probably didn’t “mean” to). Many water systems rely on third-party engineers for nearly every operational aspect of their system, so in many cases it is the engineer’s “fault”: they might know how to specify pumps, but they might not know what a VPN is. Still, assigning ultimate responsibility is the job of the system owners.

5. Ratepayers must bear the burden at the end of the day

The fact of the matter is that ratepayers have to bear the cost of increased security. It might be unpleasant, but they reap the benefits of a more secure system, so they should pay what it costs.

6. A fine versus “traffic school”

The “traffic school” idea is a fair one. We all like to give people a chance because they don’t know better: alter behavior through education. I would simply point out that the federal government has offered free training for years. Sure, take the training; I hope it helps. But knowing and doing are two different things. Don’t leave it to goodwill alone.

7. Threat of fines will be sufficient in most cases

My belief is that if DHS or EPA or whichever agency is appropriate contacts the owner/operator of an Internet-connected ICS and says, “we have statutory authority to fine you $5,000 for connecting your industrial control system to the public Internet; however, we will forbear in this instance if you segregate your network within 15 days,” that will effectively get the job done without actually issuing a fine.


Now, I understand that creating a regulatory regime would be fraught with costs: defining terms would be a chore; there are burdens of proof, protocols for receiving and processing complaints, and records to keep; and there could even be litigation.

I’m not entirely convinced the “fines for online ICS” proposal is “worth it”; but I am convinced that it has a more direct connection with enhancing the cyber security of industrial control systems in the USA than many other programs have had to date.