Friday, November 04, 2011

DF Analysis Lifecycle

In an effort to spur some interest within the DFIR community (and specifically with the NoVA Forensics Meetup group) in engaging and sharing information, I thought it would be a good idea to point out "forensic challenges" or exercises that are available online, as well as to perhaps set up and conduct some exercises of our own (as the meetup group).

As I was thinking about how to do this, one thing occurred to me...whenever I've done something like this as part of a training exercise or engagement, many times the first thing folks say is that they don't know how to get started.  When I've conducted training exercises, they've usually been for mixed audiences..."mixed" in the sense that the attendees often aren't all just DF analysts/investigators; some do DF work part-time, some do variations of DF work (such as "online forensics"), and others are SOC monitors and may not really do DF analysis.

As such, what I wanted to do was lay out the way I approach analysis engagements, and make that process available for others to read and comment on; I thought that would be a good way to get started on some of the analysis exercises that we can engage in going forward.  I've included some additional resources (by no means is this a complete list) at the end of this blog post.

Getting Started
The most common scenario I've faced is receiving either a hard drive or an image for analysis.  In many cases, it's been more than one, but if you know how to conduct the analysis of one image, then scaling it to multiple images isn't all that difficult.  Also, acquiring an image is one of those things that you either gloss over in a short blog post, or write an entire blog post (or series of posts) about...so let's just start our examination from the point where we've received an image.

Documentation
Documentation is the key to any analysis.  It's also the hardest thing to get technical folks to do.  For whatever reason, getting technical folks to document what they're doing is like herding cats down a beach.  If you don't believe me...try it.  Why it's so hard is up for discussion...but the fact of the matter is that proper documentation is an incredibly useful tool, and when you do it, you'll find that it will actually allow you to do more of the cool, sexy analysis stuff that folks like to do.

Document all the things!

Most often when we talk about documentation during analysis, we're referring to case notes, and as such, we need to document pretty much everything (please excuse the gratuitous meme) about the case that we're working on.  This includes when we start, what we start with, the tools and processes/procedures we use, our findings, etc. 

One of the documentation pitfalls that a lot of folks run into is that they start their case notes on a "piece of paper", and by the end of the engagement, those notes never quite make it into an electronic document.  It's best to get used to (and start out) documenting your analysis in electronic format, particularly so your notes can be stored and shared.  One means of doing so is to use Forensic CaseNotes from QCC.  You can modify the available tabs to meet your needs.  However, you can just as easily document what you're doing in MS Word; you can add bold and italics to the document to indicate headers, and you can even add images and tables (or embed Visio diagrams) to the document, if you need to.
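
If it helps to see how "starting out electronic" can look in practice, here's a minimal sketch (in Python; the file names and section headings are purely illustrative, not any particular tool or standard) of scripting a case-notes skeleton so the notes exist as an electronic document from the moment the case is opened:

#!/usr/bin/env python3
"""Create a skeleton case-notes file so documentation starts out electronic.

A minimal sketch; the sections and file names are illustrative, not a standard.
"""
import sys
from datetime import datetime
from pathlib import Path

TEMPLATE = """CASE NOTES - {case_id}
Opened: {opened}

== Case Information ==
Customer PoC:
Exhibits/Items Received:

== Goals of Exam ==
1.

== Analysis ==
(date/time, tool + version, process used, findings)

== Findings / Conclusions ==
"""

def create_case_notes(case_id, base_dir="."):
    """Write a new case-notes skeleton into its own case directory."""
    case_dir = Path(base_dir) / case_id
    case_dir.mkdir(parents=True, exist_ok=True)
    notes = case_dir / "case_notes.txt"
    if notes.exists():
        # never silently clobber existing notes
        raise FileExistsError(f"{notes} already exists")
    notes.write_text(TEMPLATE.format(
        case_id=case_id,
        opened=datetime.now().isoformat(timespec="seconds")))
    return notes

if __name__ == "__main__":
    print(create_case_notes(sys.argv[1] if len(sys.argv) > 1 else "CASE-001"))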

The reasons why we document what we do are (1) you may get "hit by a bus" and another analyst may need to pick up your work, and (2) you may need to revisit your analysis (you may be asked questions about it) 6 months or a year later.  I know, I know...these examples are used all the time and I know folks are tired of hearing them...but guess what?  We use these examples because they actually happen.  No, I don't know of an analyst who was actually "hit by a bus", but I do know of several instances where an analyst was on vacation, in surgery, or had left the organization, and the analysis had to be turned over to someone else.  I also know of several instances where a year or more after the report was delivered to the customer, questions were posed...this can happen when you're engaged by LE and the defense has a question, or when you're engaged by an organization, and their compliance and regulatory bodies have additional questions.  We often don't think much about these scenarios, but when they do occur, we very often find ourselves wishing we'd kept better notes.

So, one of the questions I hear is, "...to what standard should I keep case notes?"  Well, consider the two above scenarios, and keep your case notes such that (1) they can be turned over to someone else or (2) you can come back a year later and clearly see what you did.  I mean, honestly...it really isn't that hard.  For example, I start my case notes with basic case information...customer point of contact (PoC), exhibits/items I received, and most importantly, the goals of my exam.  I put the goals right there in front of me, and have them listed clearly and concisely in their own section so that I can always see them, and refer back to them.  When I document my analysis, I do so by including the tool or process that I used, and I include the version of the tool I used.  I've found this to be critical, as tools tend to get updated.  Look at EnCase, ProDiscover, or Mark Woan's JumpLister.  If you used a specific version of a tool, and a year later that tool had been updated (perhaps even several times), then you'd at least have an explanation as to why you saw the data that you did.
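
As a rough illustration of capturing the tool and version with each step, a small helper along these lines (a sketch only; the tool name, version, and file name shown are hypothetical) can append a timestamped entry to the case notes:

#!/usr/bin/env python3
"""Append a timestamped, tool-and-version-stamped entry to the case notes."""
from datetime import datetime

def log_step(notes_path, tool, version, action, findings):
    """Record what was run, which version of it, and what it showed."""
    stamp = datetime.now().isoformat(timespec="seconds")
    entry = (f"\n[{stamp}] {tool} v{version}\n"
             f"  Action:   {action}\n"
             f"  Findings: {findings}\n")
    with open(notes_path, "a") as fh:
        fh.write(entry)

# Hypothetical example...even "nothing found" gets recorded, against the
# exact version of the tool that was run at the time:
log_step("case_notes.txt", "SomeJumpListParser", "1.2.3",
         "parsed user JumpLists", "no significant/pertinent findings")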

Case notes should be clear and concise, and not include the complete output from every tool that you use or run.  You can, however, include pertinent excerpts from tool output, particularly if that output leads your examination in a particular direction.  By contrast, dumping the entire output of a tool into your case notes and including a note that "only 3 of the last 4 lines in the output are important" is far from clear or concise.  I would consider including information about why something is important or significant to your examination, and I've even gone so far as to include references, such as links to Microsoft KnowledgeBase articles, particularly if those references support my reasoning and conclusions.

If you keep your case notes in a clear and concise manner, then the report almost writes itself.

Now, I will say that I have heard arguments against keeping case notes; in particular, that they're discoverable.  Some folks have said that because case notes are discoverable, the defense could get ahold of them and make the examiner's life difficult, at best.  And yet, for all of these comments, no one has ever elaborated on this beyond the "maybe" and the "possibly".  To this day, I do not understand why an analyst, as a matter of course, would NOT keep case notes, outside of being explicitly instructed not to by whomever they're working for.

Checklists
Often, we use tools and scripts in our analysis process in order to add some level of automation, particularly when the tasks are repetitive.  A way to expand that is to use checklists, particularly for involved sets of tasks.  I use a malware detection checklist that I put together based on a good deal of work that I'd done, and I pull out a copy of that checklist whenever I have an exam that involves attempting to locate malware within an acquired image.  The checklist serves as documentation...in my case notes, I refer to the checklist, and I keep a completed copy of the checklist in the case directory along with my case notes.  The checklist allows me to keep track of the steps, the tools (and versions) I used, any significant findings, and any notes or justification I may have for not completing a step.  For example, I won't run a scan for NTFS ADSs if the file system of the image is FAT.
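
To make that concrete, a checklist can even be kept as structured data, with each step filled in as it's completed (or skipped, with the justification), and the completed copy saved in the case directory.  A sketch, in Python; the steps shown are a tiny illustrative subset, not an actual malware detection checklist:

#!/usr/bin/env python3
"""Keep a checklist as data; save the completed copy with the case notes.

A sketch only...the steps are an illustrative subset, not a real checklist.
"""
import json
from datetime import datetime

CHECKLIST = [
    {"step": "AV scan of mounted image"},
    {"step": "Scan for NTFS ADSs"},
    {"step": "Review autostart locations"},
]

def complete_step(step, tool, version, findings, skipped_reason=None):
    """Fill in one step; skips and 'no findings' get documented, too."""
    step.update(tool=tool, version=version, findings=findings,
                skipped_reason=skipped_reason,
                completed=datetime.now().isoformat(timespec="seconds"))

# e.g., skip the ADS scan on a FAT file system, but record why:
complete_step(CHECKLIST[1], tool="n/a", version="n/a",
              findings="step not run",
              skipped_reason="file system is FAT; no NTFS ADSs possible")

# the completed copy lives in the case directory, alongside the case notes
with open("malware_checklist_completed.json", "w") as fh:
    json.dump(CHECKLIST, fh, indent=2)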

The great thing about using a checklist is that it's a living document...as I learn and find new things, I can add them to the checklist.  It also allows me to complete the analysis steps more thoroughly and completely, and in a timely manner.  This, in turn, leaves me more time for things like conducting deep(er) analysis.  Checklists and procedures can also be codified into a forensic scanner, allowing the "low hanging fruit" and artifacts that you've previously found to be searched for quickly, thereby allowing you to focus on further analysis.  If the scanner is designed to keep a log of its activity, then you've got a good deal of documentation right there.
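
As a bare-bones illustration of the scanner idea...a handful of codified checks run in sequence, with a log of the scanner's activity doubling as documentation.  The check functions below are empty stubs standing in for real artifact checks, and the names and log format are just one possibility:

#!/usr/bin/env python3
"""A forensic 'scanner' skeleton: codified checks plus an activity log.

A sketch; the checks are stubs standing in for real artifact lookups.
"""
import logging

logging.basicConfig(filename="scanner.log", level=logging.INFO,
                    format="%(asctime)s %(levelname)s %(message)s")

def check_ads(image_path):
    """Stub: would enumerate NTFS alternate data streams in the image."""
    return []  # a list of findings; empty means nothing found

def check_run_keys(image_path):
    """Stub: would parse autostart/Run key entries from the hives."""
    return []

CHECKS = [check_ads, check_run_keys]

def scan(image_path):
    """Run every codified check; log activity and findings (or the lack)."""
    logging.info("scan started: %s", image_path)
    for check in CHECKS:
        findings = check(image_path)
        logging.info("%s: %d finding(s)", check.__name__, len(findings))
        for finding in findings:
            logging.info("  %s", finding)
    logging.info("scan complete: %s", image_path)

if __name__ == "__main__":
    scan("image.dd")  # placeholder image name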

Remember that when using a checklist or just conducting your analysis, no findings can be just as important as an interesting finding.  Let's say you follow all 10 steps (a purely arbitrary number, used only as an example) of your malware detection checklist, and only the ADS detection step finds anything of interest, but it turns out to be nothing.  If you choose to not document the steps that had no significant findings, what does that tell another analyst who picks up your case, or what does it tell the customer who reads your report?  Not much.  In fact, it sounds like all you did was run a scan for ADSs...and the customer is paying how much for that report?  Doing this makes whomever reads your report think that you weren't very thorough, when you were, in fact, extremely thorough.

One final note about checklists and procedures...they're a good place to start, but they're by no means the be-all-end-all.  They're tools...use them as such.  Procedures and checklists often mean the difference between conducting "Registry analysis" and getting it knocked out, and billing a customer for 16 hrs of "Registry analysis", with no discernible findings or results.  If you run through your checklist and find something odd or interesting (for example, no findings), use that as a launching point from which to continue your exam.

Start From The End
This is advice that I've given to a number of folks, and I often get a look like I just sprouted a third eye in the middle of my forehead.  What do you mean, "start at the end"?  Well, this goes back to the military "backwards planning" concept...determine where you need to be at the end of the engagement (clear, concise report delivered to a happy customer), and plan backwards based on where you are now (sitting at your desk with a drive image to analyze).  In other words, rather than sitting down with a blank page, start with a report template (you know you're going to have to deliver a report...) and work from there.

Very often when I have managed engagements, I would start filling in the report template while the analyst (or analysts) was getting organized, or even while they were still on-site.  I'd get the executive summary knocked out, putting the background and goals (the exact same goals that the analyst has in their case notes) into the report, and replicating that information into the body of the report.  That leaves the analyst to add the exhibits (what was analyzed) and findings information into the report, without having to worry about all of the other "stuff", and allows them to focus on the cool part of the engagement...the analysis.  Using a report template (and using the same one every time), they know what needs to be included where, and how to go about writing their findings (i.e., clearly and concisely).  As mentioned previously, the analysis steps and findings are often taken directly from the case notes.
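
A sketch of what pre-filling a template could look like in script form; the section names follow the report structure described later in this post, and the file name and wording are placeholders:

#!/usr/bin/env python3
"""Pre-fill a report skeleton...start from a template, not a blank page.

A sketch; the section names follow the structure described in this post.
"""

SECTIONS = ["Executive Summary", "Background", "Exhibits", "Goals",
            "Analysis and Findings", "Conclusions", "Recommendations"]

def start_report(path, background, goals):
    """Drop the background and goals in up front; the goals appear verbatim
    in both the ExSumm and the body, just as they do in the case notes."""
    with open(path, "w") as fh:
        for section in SECTIONS:
            fh.write(f"== {section} ==\n")
            if section in ("Executive Summary", "Background"):
                fh.write(background + "\n")
            if section in ("Executive Summary", "Goals"):
                for num, goal in enumerate(goals, 1):
                    fh.write(f"{num}. {goal}\n")
            fh.write("\n")

# placeholder background and goals:
start_report("report_draft.txt",
             background="Customer reported suspicious activity on one host.",
             goals=["Determine whether malware is present on the system."])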

What's the plan, Stan?
Having an analysis plan to start with can often be key to your analysis.  Have you ever seen someone start their analysis by loading the image into an analysis application and start indexing the entire image?  This activity can take a great deal of time, and we've all seen even commercial applications crash during this process.  If you're going to index an entire image, why are you doing so?  In order to conduct keyword searches?  Okay...what's your list of keywords?

My point is to think critically about what you're doing, and how you're going to go about doing it.  Are you indexing an entire image because doing so is pertinent to your analysis, or "because that's what we've always done"?  If it's pertinent, that's great...but consider either extracting data from the image or making an additional working copy of the image before kicking off the indexing process.  That way, you can be doing other analysis during the indexing process.  Also, don't waste time doing stuff that you don't need to be doing.
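
For instance, making (and verifying) a working copy can be done with a few lines of script before the indexing is kicked off against that copy, leaving the original free for other analysis.  A sketch...the file names are placeholders, and MD5 is used purely for illustration:

#!/usr/bin/env python3
"""Make and verify a working copy of an image before indexing it.

A sketch; file names are placeholders, MD5 is just for illustration.
"""
import hashlib
import shutil

def md5sum(path, chunk=1024 * 1024):
    """Hash a large file in chunks rather than reading it in all at once."""
    h = hashlib.md5()
    with open(path, "rb") as fh:
        for block in iter(lambda: fh.read(chunk), b""):
            h.update(block)
    return h.hexdigest()

def make_working_copy(image, copy_path):
    """Copy the image, then verify the copy's hash matches the original."""
    shutil.copyfile(image, copy_path)
    src, dst = md5sum(image), md5sum(copy_path)
    if src != dst:
        raise ValueError(f"hash mismatch: {src} != {dst}")
    return copy_path  # kick off indexing against this copy

if __name__ == "__main__":
    make_working_copy("image.dd", "image_working_copy.dd")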

Report Writing
No one likes to write reports.  However, if we don't write reports, how do we get paid?  How do we communicate our findings to others, such as the customer, the prosecutor, or anyone else?  Writing reports should not be viewed as a necessary evil, but instead as a required skill set.

When writing your report, as with your case notes, be clear and concise.  There's no need to be flowery and verbose in your language.  Remember, you're writing a report that takes a bunch of technical information and very often needs to translate it into something a non-technical person can understand in order to make a business or legal decision.  Not only is it harder to make up new verbiage for different sections of your report, it also makes the finished product harder to read and understand.

When walking through the analysis or findings portion of the report (leading up to my conclusions), I've found that it's best to use the same cadence and structure in my writing.  It not only makes it easier to write, but it also makes it easier to read.  For example, if I'm analyzing an image in order to locate suspected malware, in each section, I'll list what I did ("ran AV scan"), which tools I used ("AV scanner blah, version X"), and what I found ("no significant/pertinent findings", or "Troj/Win32.Blah found").  I've found that when trying to convey technical information to a non-technical audience, using the same cadence and structure over and over often leaves the reader remembering the aspects of the report that you want them to remember.  In particular, you want to convey that you did a thorough job in your analysis.  In contrast, having each section worded in a significantly different manner not only makes it harder for me to write (I have to make new stuff up for each section), but the customer just ends up confused, and remembering only those things that were different. 

Be professional in your reporting.  You don't have to be verbose and use $5 words; in fact, doing so can often lead to confusion because you've used a big word incorrectly.  Have someone review your report, and for goodness' sake, run spell check before you send it in for review!  If you see a bunch of words underlined with red squiggly lines, or phrases underlined with green squiggly lines, address them.  Get the report in for review early enough for someone to take a good look at it, and don't leave it to the last minute.  Finally, if there's something that needs to be addressed in the report, don't tell your reviewer, "fine, if you don't like it, fix it yourself."  Constructive criticism is useful and helps us all get better at what we do, but the petulant "whatever...fix it yourself" attitude doesn't go over well.

The report structure is simple...start with an executive summary (ExSumm).  This is exactly as described...it's a summary for executives.  It's not a place for you to show off how many really cool big words you know.  Make it simple and clear...provide some background info on the incident, the goals of the analysis (as decided upon with the customer) and your conclusions.  Remember your audience...someone non-technical needs a clear and concise one-pager (no more than 2) with the information that they can use to make critical business decisions.  Were they compromised?  Yes or no?  There's no need to pontificate on how easily they had been compromised...just be clear about it.  "A successful SQL injection attack led to the exposure of 10K records."

The body of the report should include background on the incident (with a bit more detail than the ExSumm), followed by the exhibits (what was analyzed), and the goals of the analysis.  From there, provide information on the analysis you conducted, your findings, and your conclusions.  The goals and conclusions from the body of the report should be identical...literally, copy-and-paste...from the ExSumm.

Finally, many reports include some modicum of recommendations...sometimes this is appropriate, other times it isn't.  For example, if you're looking at 1 or 10 images, does that really give you an overall view into the infrastructure as a whole?  Just because MRT isn't up-to-date on 5 systems, does that mean that the organization needs to develop and implement a patch management infrastructure?  How do you know that they haven't already?  This is the part of the report that is usually up for discussion, as to whether or not it's included.

Summary
So, my intention with this post has been to illustrate an engagement lifecycle, and to give an overview of what an engagement can look like, cradle-to-grave.  This has by no means been intended to be THE way of doing things...rather, this is a way of conducting an engagement that has been useful to me, and I've found to be successful.

Resources
Chris Pogue's "Sniper Forensics: One Shot, One Kill" presentation from DefCon18
Chris Pogue's "Sniper Forensics v.3" from the most recent SecTor (scroll down)
TrustWave SpiderLabs "Sniper Forensics" blog posts (five posts in the series)
Girl, Unallocated On Writing
UnChained Forensics Lessons Learned
Brad Garnett's tips on Report Writing (SANS)
Computer Forensics Processing Checklist

Useful Analysis Tidbits
Corey's blog posts on exploit artifacts

4 comments:

Rob said...

Great posting.  I really like the way you break down the reporting into sections.  I think there are a group of folks that "over report", which sounds odd.  Too much detail can turn off the reader and eventually have you trying to explain something to a judge and jury that you copied and pasted from an automated report, just because it was there.  Get to the point and put in enough detail that in a year or so you can read the report and understand what steps you took to get the results you have.  I think whoever reads the work product will be pleased that you have something like the "executive summary" which you described.

Good Stuff.. Thanks
Rob

H. Carvey said...

It's not so much that analysts tend to "over report"...in my experience, it's that they tend to try to make the report too fluffy, b/c they think that the customer will get bored reading the same thing over and over.

There is, however, a significant element of over-reporting in some cases. I was once told that customers prefer the "weight test" when it comes to their reports...rather than providing clear, concise answers that they can easily understand and absorb, the attitude seemed to be that the customer wanted massive volumes of data that consumed reams of paper when printed out.

I've never understood that one...nor have I ever talked to a customer that agreed with that line of thinking.

Paul Harper said...

My old boss told me that when producing a report for managers and lawyers, I should write it in crayon, as though for a child in kindergarten.  (A bit of hyperbole.)  What I tend to do is write a clear executive summary, followed by heaps of technical attachments.  The managers/lawyers tend to only read the executive summary anyway.

Cults14 said...

Agree 100%, and appreciate the sharing, Harlan.  I had just one case from my early days which I had to re-visit a year later - and the notes really took a lot of working on to make sense of.  I mean, I wondered, "what the heck?"

On the SANS 408 course I attended, they said you should run your report through a style analysis tool, and if it comes out at anything higher than a level 12 (or was it 8? - guess that makes sense in the USA, not so much in the UK), simplify it!