Sunday, 3 March 2013

"Data Honesty" and why IOCs are not (yet)


For the past half decade I've been working extensively in incident response and data analysis, on projects that helped monitor security-related data on very large networks and that helped build Incident Response capabilities that were empowered rather than paralyzed by that data. It is no secret that I have always been a big proponent of data sharing among peers, both to improve data quality and, more importantly, to build what I have started to dub "data honesty". A term I can only hope will find some level of adoption sooner rather than later.

"Data honesty" is, in its simplest form, the level of maturity at which anybody with access to a dataset, and to the methodology used to derive information from it, will arrive at the same conclusions you did. It is the level at which you are confident enough to publish both your dataset and your methodology.

There is no doubt that the Verizon RISK Team as a whole has pushed the envelope in this field for our industry. The work they have done in developing VERIS ("Vocabulary for Event Recording and Incident Sharing") is a tremendous effort that I feel our community has largely ignored. It helps to know that, in a previous incarnation, the V in the acronym stood for Verizon in order to realize exactly why widespread adoption of VERIS among Incident Response service providers has not occurred. As with any standard, commercial organizations are only interested if there is an opportunity to OWN the standard (and thus an opportunity to exploit it for profit). It is, unfortunately, a problem that almost every standard in our industry has suffered from since forever.

All this has resulted in report after report based on what is undoubtedly real data, but without any structured methodology. This doesn't mean the reports were constructed in bad faith, but one always wonders how the data so conveniently correlated with a need for more services/products/... provided by the publisher of the report (or its sponsor).

All jokes aside, there has never been more interest in security-related data, to the point where entire industries are waiting for any data that can help them make informed risk decisions. The latest data points being thrown at us are 'Indicators of Compromise', or IOCs. VERIS covers IOCs here: http://www.veriscommunity.net/doku.php?id=iocs and frankly, I love them. They're little nuggets of information that, when related to properly analyzed incidents, can turn an incident response effort upside down. But caveat emptor!
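To illustrate that caveat, here is a minimal sketch in Python of the difference between a bare indicator and one tied back to an analyzed incident. The field names (`ioc_type`, `incident_id`) are my own illustrative choices, not the VERIS IOC schema:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Indicator:
    """A single indicator of compromise (IOC); fields are illustrative."""
    ioc_type: str                       # e.g. "ip", "domain", "file_hash"
    value: str
    incident_id: Optional[str] = None   # link back to an analyzed incident

def actionable(iocs: List[Indicator]) -> List[Indicator]:
    """Keep only indicators that are tied back to an analyzed incident."""
    return [i for i in iocs if i.incident_id is not None]

feed = [
    Indicator("domain", "bad.example.net", incident_id="IR-2013-007"),
    Indicator("ip", "203.0.113.7"),  # bare IOC: no incident context
]
usable = actionable(feed)
```

The point of the sketch: an indicator without provenance can only be pattern-matched, while one linked to an analyzed incident tells you what kind of campaign you may be dealing with.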

IOCs lose a lot of their value when they are not related to properly analyzed incidents. Therein probably lies their biggest weakness. Anybody can publish IOCs these days without the need to link them to any active (or terminated) targeted attack campaign, and attribute them to "Wim Remes' bad-ass Belgian Hacking Team extraordinaire". It is no longer a thorough methodology that supports the credibility of the IOCs, but the commercial power and (perceived) familiarity with the "threat du jour" of the publisher.

When respected researchers and entrepreneurs start calling for "IOC Wednesday", I would urge you to take a step back and look at VERIS first. If you can analyze and categorize your incidents based on the data YOU can gather, you won't need to wait for others to be compromised to protect yourself better.
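To make that concrete: VERIS classifies an incident along four "A"s — Actor, Action, Asset, and Attribute. A rough sketch of categorizing one of your own incidents that way (the values below are illustrative placeholders, not the official VERIS enumerations):

```python
# Classify an incident along the four VERIS "A"s.
# NOTE: the values here are illustrative placeholders, not the
# official VERIS enumerations.
incident = {
    "actor": "external",             # who caused the incident
    "action": "malware",             # what they did
    "asset": "web server",           # what was affected
    "attribute": "confidentiality",  # which security property was harmed
}

def summarize(inc: dict) -> str:
    """One-line summary of an incident classified with the four A's."""
    return ("{actor} actor used {action} against a {asset}, "
            "affecting {attribute}".format(**inc))
```

Even this crude level of structure, applied consistently to your own incidents, gives you something comparable across time — which is exactly what bare IOC feeds cannot give you.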

IOCs are, at this moment, not by definition "honest data" but you can use them to streamline the processes you build based on your own "honest data".

For those companies with a stake in Incident Response services, now is the time to set aside your egotistical reasons for not using VERIS in your analysis or reports. Any methodology will always have weaknesses, but this is the one we have RIGHT NOW, and for a flawed methodology it's a pretty darn good starting point if you ask me. The RISK Team deserves that credit and the industry deserves that standard.
