
Monday, November 26, 2012

The Next Big Thing

First off, this is not an end-of-year summary of 2012, nor is it where I'm going to lay out my predictions for 2013...because that's not really my thing.  What I'm more interested in addressing is this: what is "The Next Big Thing" in DFIR?  Rather than making a prediction, I'm going to suggest where, IMHO, we should be going as a community and industry.

There is, of course, the CDFS, which provides leadership and advocacy for the DFIR profession.  If you want to be part of a guiding force behind the direction of our profession, and help drive The Next Big Thing, consider getting involved through this group.

So what should be The Next Big Thing in DFIR?  In the time I've been in and around this profession, one thing I have seen is that a great deal of effort is still directed toward providing analysts with a layer of abstraction over the data.  Commercial tools provide frameworks for looking at the available (acquired) data, as do collections of free tools.  Some tools or frameworks provide different capabilities, such as allowing the analyst to easily conduct keyword searches, or providing default viewers or parsers for some file types.  However, what most tools do not provide is an easy means for analysts to describe the valuable artifacts that they've found, nor an easy means to communicate intelligence gathered through examination and research to other analysts.

Some of what I see happening is analysts going to training and/or a conference, hearing "experts" (don't get me wrong, many speakers are, in fact, experts in their field...) speak, and then returning to their desks with...what?  Not long ago, I was giving a presentation and the subject of analysis of shellbag artifacts came up.  I asked how many of the analysts in the room did shellbag analysis, and two raised their hands.  One of them stated that they had analyzed shellbag artifacts when they attended a SANS training course, but they hadn't done so since.  I then asked how many folks in the room conducted analysis where what the user did on the system was of primary interest in most of their exams, and almost everyone in the room raised their hands.  The only way I can explain the disparity between the two responses is that the tools used by most analysts provide a layer of abstraction to the data (acquired images) that they're viewing, and leave the identification of valuable (or even critical) artifacts, as well as the overall analysis process, up to the analyst.  A number of training courses provide information regarding analysis processes, but once analysts return from these courses, I'm not sure that there's a great deal of stimulus for them to incorporate what they just learned into what they do.  As such, I tend to believe that a great deal of extremely valuable intelligence is either missed or lost within our community.

I'm beginning to believe more and more that tools that simply provide a layer of abstraction to the data viewed by analysts are becoming a thing of the past.  Or, maybe it's more accurate to say that they should become a thing of the past.  The analysis process needs to be facilitated more, and the sharing of information and intelligence, both between the tools we use and between the analysts using them, needs to become a bigger part of our daily workflow.

Part of this belief may stem from the fact that many of the tools available don't necessarily provide an easy means for analysts to share that process and intelligence.  What do I mean by that?  Take a look at some of the tools used by analysts today, and consider why those tools are used.  Now, think to yourself for a moment...how easy is it for one analyst using that tool to share any intelligence that they've found with another (any other) analyst?  Let's say that one analyst finds something of value during an exam, and it would behoove the entire team to have access to that artifact or intelligence.  Using the tool or framework available, how does the analyst then share the analysis or investigative processes used, as well as the artifact found or intelligence gleaned?  Does the framework being used provide a suitable means for doing so?

Analysts aren't sharing intelligence for two reasons...they don't know how to describe it, and even if they do, there's no easy means for doing so within the framework that they're using.  They can't easily share information and intelligence between the tools they're using, nor with other analysts, even those using the same tools.
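To make that a bit more concrete, here's a purely hypothetical sketch (the field names are my own, not any existing tool's or standard's format) of what a shareable, machine-readable artifact record might look like, written in Python.  The specific fields matter far less than the idea that a finding described this way could be picked up by another analyst, or by another tool, without a conversation having to happen first:

    # Purely hypothetical example: describing a single finding in a form
    # that other analysts (and other tools) can consume.
    import json

    artifact = {
        "name": "Shellbag entry for attacker staging directory",  # what was found
        "source": "UsrClass.dat",                                 # where it lives
        "category": "user activity",                              # why it matters
        "description": "BagMRU entry showing the user browsed the staging "
                       "folder via Explorer, corroborating data staging.",
        "how_to_find": "Parse the BagMRU/Bags keys in the user's UsrClass.dat "
                       "hive and look for folder entries outside the expected paths.",
        "references": ["internal case number", "vendor write-up URL"],
    }

    # Written out this way, the record can be dropped into a shared
    # repository, indexed, and searched by the rest of the team.
    print(json.dumps(artifact, indent=2))

Again, that's just a sketch...the point is that the description of the artifact and the means of finding it travel together.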

For a great example of what I'm referring to, take a look at Volatility.  This started out as a project delivering something not available via any other means, and the folks that make up the team continue to do just that.  The framework provides much more than just a layer of abstraction that allows analysts to dig into a memory dump or hibernation file...the team also provides plugins that serve to illustrate not just what's possible to retrieve from a memory dump, but also what they've found valuable, and how others can find these artifacts via a repeatable process.  Another excellent resource is MHL et al.'s book, The Malware Analyst's Cookbook, which provides a great deal of process information via its cookbook format, as well as intel via the various 'recipes'.
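As a rough illustration of why I think the Volatility model works so well, here's a minimal sketch of what a plugin can look like under the Volatility 2.x API (the class below is my own trivial example, not a shipped plugin).  The point is that once an analyst has worked out how to find something in a memory dump, that knowledge gets wrapped up as a repeatable process anyone else can run:

    # Minimal sketch of a Volatility 2.x plugin; a trivial example, not a
    # shipped plugin.  DllList.calculate() handles walking the process list,
    # so what the plugin author is really sharing is what to look for/report.
    import volatility.plugins.taskmods as taskmods

    class ProcBasics(taskmods.DllList):
        """List the PID and image name for each process in the memory image."""

        def render_text(self, outfd, data):
            # 'data' is the set of process objects yielded by calculate()
            for task in data:
                outfd.write("{0}\t{1}\n".format(int(task.UniqueProcessId),
                                                str(task.ImageFileName)))

Once Volatility picks that class up from its plugins path, it becomes a command that any other analyst can run against their own memory dumps...and with it, the process the original author worked out.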

I kind of look at it this way...when I was in high school, we read Chaucer's Canterbury Tales, and each year the books were passed down from the previous year's class.  If you were lucky, you'd get a copy with some of the humorous or ribald sections highlighted...but what wasn't passed down was an understanding of what led us to read those passages in the first place.  Sure, there's a lot of neat and interesting stuff that analysts see on a regular basis, but what we aren't good at is sharing the really valuable stuff and the intel with other analysts.  If that's something that would be of use...one analyst being aware of what another analyst found...then as consumers we need to engage tool and process developers directly and consistently, let them know what our needs are, and start intelligently choosing the processes and tools that meet those needs.
