Monday, December 13, 2010

Links, Thoughts...

What's good, and what's been going on lately in the world of Windows IR/DF? Things seem to be slowing down around the holidays; many of us who are blogging are simply sharing the same links that everyone else is posting...it gets kind of circular, and when you poke your head up, you realize you've been reading the same half dozen stories for the past week.

Maybe there really isn't a whole lot going on at the moment...maybe adding something a little bit different, perhaps a different angle, will change things up just a bit.

Books
Many of you know that I've had a couple of books published, and number 5...Windows Registry Forensics...will be out shortly. WFA 2/e has been pretty successful so far, and a number of folks have asked, sort of off-handedly, about a third edition. To be honest, that's up to the publisher...WFA 2/e may be successful enough to warrant a third edition, or the publisher may decide that the title and subject have run their course...who knows?

However, the question itself got me thinking along a different line...namely, what does it take to write a book? While exploring material for a third edition, I took the opportunity to ask some folks what they thought should be included, and got some very interesting responses. Those responses got me thinking, "here's someone who does a lot of work...why doesn't he write a book?" So what is it that's holding folks back? There are a number of folks out there who likely have the necessary background to write a book...I mean, based on some of the books that have been written over the years, a solid background in e-discovery or digital forensic analysis should be enough to produce a solid book, right?

If you're looking for a particular topic, my publisher has a page up called "Write for Syngress" for those who may be thinking about writing a book, or just tossing the idea around.

I think that a lot of folks look for someone else to write a book for them, without ever considering that maybe they could write that book themselves. Time is always a factor, but based on my experience, there are a number of decisions that need to be made up front, and sometimes thinking through those decisions can actually get you over that hump.

As such, what I'm going to be doing over time is writing a series of blog posts about writing books. I'm going to offer up my thoughts and experiences, and hope that someone out there decides to put pen to paper (figuratively, or literally) and step out and just do it. Yes, writing a book is time consuming and it can be arduous, but it can also be very rewarding.

Prefetch Files
Mark Wade recently had an article published in DFI News regarding Prefetch file analysis. The article is a good resource for those not familiar with Prefetch files and how they might be used. I do, however, have a question about one statement made in the article:

For instance if the Standard Information Attribute (SIA) and File Name Attribute (FNA) timestamps are modified in the Master File Table (MFT) to impede analysis...

In light of what's available in Brian Carrier's File System Forensic Analysis book, I'm a bit unclear as to how FNA timestamps would be purposely modified. Does anyone have any input on that? Is anyone seeing FNA timestamps being purposely and maliciously modified in the same manner as SIA timestamps?
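Regardless of the answer, the comparison itself is worth automating, since tools like Timestomp have historically modified only the SIA times, and a mismatch between the two sets of timestamps can be a useful indicator. Here's a minimal sketch of that comparison in Python...it assumes a single, resident-attribute, 1024-byte FILE record exported from the $MFT (the file name is hypothetical, and the sketch deliberately ignores the update sequence fixups), using the structure offsets documented in Carrier's book:

import struct
from datetime import datetime, timedelta

def filetime_to_dt(ft):
    # 64-bit FILETIME: 100ns intervals since 1 Jan 1601
    return datetime(1601, 1, 1) + timedelta(microseconds=ft // 10)

def parse_times(record):
    # Walk the attributes in one MFT FILE record and pull the four
    # timestamps from $STANDARD_INFORMATION (0x10) and $FILE_NAME (0x30)
    times = {}
    offset = struct.unpack_from("<H", record, 0x14)[0]  # first attribute
    while offset < len(record) - 8:
        attr_type, attr_len = struct.unpack_from("<II", record, offset)
        if attr_type == 0xFFFFFFFF or attr_len == 0:
            break
        resident = record[offset + 8] == 0
        if resident and attr_type in (0x10, 0x30):
            content = struct.unpack_from("<H", record, offset + 0x14)[0]
            base = offset + content
            if attr_type == 0x30:
                base += 8  # skip the 8-byte parent directory reference
            vals = struct.unpack_from("<4Q", record, base)
            key = "SIA" if attr_type == 0x10 else "FNA"
            times.setdefault(key, [filetime_to_dt(v) for v in vals])
        offset += attr_len
    return times

# Hypothetical usage: one 1024-byte record carved from the $MFT
with open("mft_record.bin", "rb") as f:
    t = parse_times(f.read(1024))
if "SIA" in t and "FNA" in t and t["SIA"][0] < t["FNA"][0]:
    print("SIA creation time predates FNA...possible timestomping")

Keep in mind that a clean comparison is corroboration, not proof...as the first comment below points out, there are workarounds.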

That question aside, I think that Mark makes a number of excellent points and observations in his article, particularly that the data embedded within Prefetch files can be used to identify a number of interesting artifacts, such as errant removable storage device paths, "abnormal" or deleted user accounts, etc.
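To illustrate what's embedded in these files, here's a minimal Prefetch parsing sketch in Python; the offsets are the commonly documented ones for the XP/2003 (version 17) and Vista/7 (version 23) formats, so treat them as assumptions to verify against your own test data (the .pf file name is hypothetical):

import struct
from datetime import datetime, timedelta

def parse_prefetch(path):
    # Pull the exe name, run count, last run time, and the embedded
    # Unicode path strings from a Prefetch file
    with open(path, "rb") as f:
        data = f.read()
    if data[4:8] != b"SCCA":
        raise ValueError("not a Prefetch file")
    version = struct.unpack_from("<I", data, 0)[0]
    exe = data[0x10:0x4C].decode("utf-16-le").split("\x00")[0]
    if version == 17:        # XP/2003
        run_off, count_off = 0x78, 0x90
    elif version == 23:      # Vista/7
        run_off, count_off = 0x80, 0x98
    else:
        raise ValueError("unsupported version: %d" % version)
    ft = struct.unpack_from("<Q", data, run_off)[0]
    last_run = datetime(1601, 1, 1) + timedelta(microseconds=ft // 10)
    run_count = struct.unpack_from("<I", data, count_off)[0]
    # Filename strings section: offset/size fields at 0x64/0x68
    str_off, str_size = struct.unpack_from("<II", data, 0x64)
    raw = data[str_off:str_off + str_size].decode("utf-16-le", "replace")
    paths = [p for p in raw.split("\x00") if p]
    return exe, run_count, last_run, paths

exe, count, last_run, paths = parse_prefetch("CMD.EXE-087B4001.pf")
print("%s run %d times, last at %s" % (exe, count, last_run))
for p in paths:
    print("  " + p)  # look for odd device paths or user profiles

Scanning the path strings at the end is where the "interesting artifacts" Mark mentions tend to show up.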

Imaging Systems
Lance recently posted a "hard drive imaging process tree for basic training"; keeping in mind that it's intended for basic training, it's an excellent resource to refer to and use.

There are a couple of things you should consider adding if you're planning to use this decision tree for your own training purposes. For example, the first comment mentions adding chain of custody documentation...this is something that should definitely be done. Several other comments refer to collecting volatile memory; while Lance's response is that this is covered in other modules, I tend to agree with Rob and Andrew...this needs to be included at the basic level; we can no longer leave it to the advanced level.

The thing I'm glad to see at the end of the entire process is "Verify Image". I can't say it's happened too many times, but I do know of analysts who have gone on-site, acquired images, documented everything...and not verified their images. When they got back to the lab, they found that, for some reason, they did not have valid images. Verification must be part of the process.
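Verification is also trivially easy to script. Most imaging tools (FTK Imager, for example) will do it for you, but here's a minimal sketch of the idea in Python, assuming a raw/dd-format image and a hash recorded at acquisition time (the file name and recorded value are hypothetical placeholders):

import hashlib

def hash_image(path, algo="md5", chunk=1024 * 1024):
    # Hash the image in chunks so large images don't exhaust RAM
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return h.hexdigest()

# Hypothetical usage: compare against the hash recorded on-site
recorded = "<md5 recorded at acquisition time>"
if hash_image("suspect_drive.dd") != recorded:
    print("MISMATCH...do NOT leave the site; reacquire the image")

The point is to run the comparison before you leave the site, while reacquisition is still an option.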

Great job, Lance, on producing an excellent resource! Thanks!

Oh, hey, almost forgot...thanks for updating the post to include volatile data collection!

Memory Analysis
Gl33da's got a new blog post up, this one on identifying memory images. If you do memory analysis but don't actually perform the memory acquisition, this is an excellent post that's chock full of great information: she not only walks through a method for determining the Windows version of a memory image, but also notes that the method has already been incorporated into Volatility 1.4. Shoutz to gl33da for sharing!
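For those curious how that kind of identification can work under the hood, one well-known approach is to scan for the kernel debugger data block: its header carries a 'KDBG' owner tag followed by a size field that differs between Windows versions. Here's a rough sketch of that idea in Python...the size-to-version table is my own shorthand for what's publicly documented around Volatility's scanner, so treat it as an assumption to verify (see gl33da's post and the Volatility source for the authoritative details):

import mmap
import struct

# Size-to-version notes (x86; assumptions to verify against the
# Volatility kdbgscan signatures)
KDBG_SIZES = {
    0x290: "Windows XP",
    0x318: "Windows 2003",
    0x328: "Windows Vista/2008",
    0x340: "Windows 7",
}

def scan_kdbg(path):
    # Look for the 'KDBG' owner tag; the 4-byte size that follows it
    # varies with the Windows version (size filter trims false hits)
    with open(path, "rb") as f:
        data = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
        pos = data.find(b"KDBG")
        while pos != -1 and pos + 8 <= len(data):
            size = struct.unpack_from("<I", data, pos + 4)[0]
            if size in KDBG_SIZES:
                print("offset 0x%08x: size 0x%x -> %s"
                      % (pos, size, KDBG_SIZES[size]))
            pos = data.find(b"KDBG", pos + 4)

scan_kdbg("memory.dmp")  # hypothetical raw memory image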

If you are performing memory acquisition, are you using windd? Or are you using the MoonSols Windows Memory Toolkit Community Edition?

Addendum: Russ McRee pointed me to an article he wrote for InfoSecInstitute, titled "Security Incident Response Testing To Meet Audit Requirements". The article focuses primarily on PCI DSS v2.0, specifically para. 12.9, which mandates an IR capability and provides specifications for it. Overall, I think the article is very good, and it includes a good deal of what I would do or recommend when writing or evaluating an IR plan.

The only real issue I have with Russ's article...and this comes from being QSA certified and on a QIRA team for far too long...is that the majority of organizations I investigated had enough trouble just getting something down on paper, let alone producing a coherent IR plan. The likelihood of finding a comprehensive, tested plan and an IR team that was trained and drilled on a regular basis was slim to none. This isn't true of all organizations, obviously...it's simply based on my experience and the responses I performed.

I will say that Russ does have some very interesting information in the article that will not only benefit responders, but forensic analysts as well, along with those who provide training to responders and analysts. Thanks, Russ, for pointing me to the article.

2 comments:

Unknown said...

Not sure IF anyone is modifying FNA timestamps, but in regards to HOW, the Timestomp wiki points out an interesting workaround here: http://www.forensicswiki.org/wiki/Timestomp. I have also blogged about doing this with PowerShell here: http://securitybraindump.blogspot.com/2010/05/more-experiments-with-master-file-table.html . Of course, changing the system time as in my example is detectable in other ways. ;)

Ken Pryor said...

As you may know, I'm involved in writing a book for Syngress with Brad Garnett and Joe Garcia. It's the first book writing venture for any of us, so I'm looking forward to your upcoming posts on the topic.

I'm also greatly looking forward to your Windows Registry Forensics book. It will definitely be on my "buy now" list when it's available.
KP