Monday, June 22, 2009


My interview with Lee Whitfield is up as Forensic4Cast episode 17. Lee asked some interesting questions, so be sure to listen to the entire podcast...we talk about some things at the end of the interview that you'll want to hear.

Chris Pogue, co-author of Unix and Linux Forensic Analysis, has started his own blog...check it out! Chris and I have worked together, and it's good to see him getting into the mix now and bringing his experience and knowledge to the blogosphere, including posting a review of WFA 2/e! Chris will also be at the SANS Forensic Summit, speaking on the IR panel. I'm sure if you asked him, he'd be more than happy to sign your copy of ULFA...and, by the way, Syngress will have a table at the Summit with their books available.

Hogfly posted on the Need for Speed, and I really think that this is something that cannot be said enough. While there is a need for speed in response, there's also a need to ensure that things are still done right and still done to a standard of accuracy and quality. Again, though...the need for speed in response is very real. In many cases, you'll have an issue of suspected data leakage or exposure, and acquiring a small number of systems and taking 2 months or more to provide an answer is simply unacceptable, as much or more so than providing the wrong answer too quickly. Processes and techniques need to be addressed, improved and implemented in such a manner as to answer the three most important questions:

1. Was the system compromised?
2. Did the system house or store "sensitive" data?
3. Did #1 lead to the exposure of #2?

Suffice to say that a lot of what it takes to answer these questions rests squarely on the shoulders of the system owners themselves. There's only so much that can be done when the breach goes unnoticed (often, for weeks), and then the first reaction of the on-site staff is to shut the system down and take it off of the network.

Hogfly also posted his review of WFA 2/e...check it out. I like to see what practitioners have to say about the book (or any other resource, for that matter), because who better to have an opinion on something like that than someone who works in the business, right? Seriously. If you wanted to get someone's opinion on, say, the acceleration and handling of a sports car, who would you look to? Eddie, the introvert who reads car magazines (and other things) online, or Danica Patrick?

Wednesday, June 17, 2009

#1 on Amazon!

Just today, I found out that WFA 2/e is #1 in the Amazon Sales Rank for Forensics books! Awesome! Thanks to everyone who's reviewed the book and to everyone who's purchased a copy of the book!

Tuesday, June 16, 2009

Buy F-Response, get a free copy of WFA 2/e!

Hey, no kidding! Check it out! Matt's offering a free (as in "beer") copy of WFA 2/e with each purchase or renewal of F-Response CE/EE. Got four consultants? Outfit each of them with a copy of F-Response EE, and they'll each get a copy of WFA 2/e. Sweet! Don't think so? Check out the reviews!

Sunday, June 14, 2009

WFA 2/e eBooks

I've received a number of emails regarding ebook versions of WFA 2/e, and at this point, all I've been able to determine is that Elsevier will NOT be producing a PDF version of the book for sale. No, I don't know why, and to be honest, I'm as mystified as you are.

The information I have from the publisher at this time is that it takes about 1-2 months to produce the ebook version of a book, and multiple ebook versions are produced (Kindle, Safari, etc.). I've been told that while a PDF version will not be produced for sale, that there will be a version produced by Ingram (??) that will be available to be read on a computer, and I'm also told that this reader, like Adobe Reader, allows the ebook to be searched. Other than that, I have nothing...I don't even know where to download the reader just to take a look at it. Nor do I know when the ebook will be available in any version, nor how much it will cost.

That's all I have. Pretty amazing, eh? A large publishing firm like Elsevier, and an author such as myself is having trouble getting basic information.

So, all I can say is sorry (although I'm not sure why I'm the one saying that...), and if you really want to let someone know how you feel about this, email my editor...she's graciously consented to accepting your emails. Or stop by the Syngress table at the SANS Forensic Summit and let her know.

Saturday, June 13, 2009

Thoughts on Timeline Analysis

I was chatting with Chris Pogue (a fellow Syngress book author attending the SANS Forensic Summit) a bit over the past couple of days on the subject of Timeline Analysis, and had some thoughts that I wanted to throw out there and see what others thought about them...

Personally, I've been doing some pretty cool things with timeline analysis, incorporating not only file system metadata, but Event Log entries, data from the Registry, as well as the user's web browser history, etc. What this does is allow me to view events from several sources all in one place, giving me some context, but not all of the possible context. And this can be a LOT of data! I go through the process of creating a bodyfile, then a 5-field TLN format events file, and then a full timeline in ASCII, saving it in a text file. I've updated some of my code recently to allow me to re-run the events-file-to-timeline conversion tool and focus solely on a specific date range, down to a single day.
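For anyone unfamiliar with the formats being discussed, here's a rough sketch of the bodyfile-to-events-file step. The TLN field layout (time|source|host|user|description) and the TSK bodyfile layout are taken from public descriptions of those formats; the function itself is a simplified illustration (with a fabricated example line), not my actual tool:

```python
# Sketch: convert one TSK bodyfile line into 5-field TLN events.
# TLN fields: time|source|host|user|description (time = Unix epoch).
# Bodyfile layout (TSK 3.x): MD5|name|inode|mode|UID|GID|size|atime|mtime|ctime|crtime

def bodyfile_to_tln(line, host="HOST", user=""):
    fields = line.rstrip("\n").split("|")
    name = fields[1]
    atime, mtime, ctime, crtime = (int(x) for x in fields[7:11])
    events = []
    # Group identical timestamps so one bodyfile line yields one event per unique time
    stamps = {}
    for flag, t in (("A", atime), ("M", mtime), ("C", ctime), ("B", crtime)):
        if t > 0:
            stamps.setdefault(t, []).append(flag)
    for t in sorted(stamps):
        macb = "".join(stamps[t])
        events.append(f"{t}|FILE|{host}|{user}|{macb} {name}")
    return events

# Fabricated example bodyfile line:
line = "0|C:/Windows/System32/cmd.exe|1234|r/rrwxrwxrwx|0|0|302592|1244000000|1243000000|1243000000|1200000000"
for evt in bodyfile_to_tln(line, host="WORKSTATION1"):
    print(evt)
```

From there, sorting the merged events file (file system, Event Log, Registry, browser history sources) on the first field gives the full ASCII timeline, and filtering on that same field is what makes the date-range narrowing possible.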

This is where we usually start talking about visualization...what's a good way to present this information in a graphic format so that the analyst can determine the answer to the question they're trying to answer? Perhaps better yet...IS there a good way?

When it comes down to presenting the data to the customer, I've never been a supporter of giving the customer all of the raw data (there are folks out there who think a 3300+ page report is a good thing!), and giving the customer a timeline graphic of ALL of the data really doesn't do a whole lot, either for them to understand what's going on, or for your professional credibility. That's where the knowledge and ability of the analyst come in, and you create a timeline that summarizes the important and relevant events for the customer.

So, how do you do this? Do you sift through the data, extracting all of the irrelevant stuff (i.e., removing thousands of file last accessed events and replacing them with a single AV scan event, etc.) and dump it into some kind of program that will generate the timeline automatically, or is it something more of a manual process? (See the Resources section at the end of this post for some examples of how to create a graphic representation of a timeline that can be added to reports.)

At this point, I'm of the opinion that this is still largely a manual process. While timeline creation and analysis has been automated to some degree through the use of tools, the fact is that there's currently no automated "sausage grinder" that you can drop an acquired image into and have it chug away and give you a full timeline. Just the file system metadata alone from one system can be cumbersome and overwhelming, particularly if you don't know what you're looking for. Let's say that you automatically add the Event Log entries to the timeline...but what if the Security Registry hive shows that the type of auditing you're looking for (successful login attempts) wasn't enabled, and a scan of the Event Logs shows that the events do not cover the dates in question anyway? If this is an automatic process, you've now got a lot of extra, albeit irrelevant, data.

What about context? Not all context of the events is visible; in some cases, a recent modification date on a file isn't as important as what was added to (or removed from) the file. Or you may have two events...a USB removable storage device plugged into the system and, shortly thereafter, a Windows shortcut/LNK file created...and the valuable context of the correlation between the two events is in the path information and volume ID embedded in the LNK file.

In a way, this discussion brings us back around to the basic idea of the skill and knowledge of the examiner/analyst. Let's say an analyst responds to an incident, and goes on-site to find four desktop systems that had been powered down and taken off of the network. One analyst might look at this, remove the drives, and image them with the pair of Vooms he has in his jump kit. Another might hook each drive up to a write-blocker and acquire logical images of each partition. Yet another responder might boot each system, log in as Administrator, acquire volatile data, and then perform live acquisitions. Given this kind of disparity across a single response, how does an analyst then "correctly" decide which information needs to be included in a timeline for analysis, and then determine the context of the data?

IMHO, this all comes down to training and experience. Training specifically in this topic needs to be available, followed by guidance and mentoring. Cheatsheets need to be available to remind folks about what's available, why and how the data is important, and then within organizations and labs, there needs to be some kind of peer review.


Resources

How to create a timeline in Excel (free templates)
Free SmartDraw Timeline Software

Friday, June 12, 2009

Very Interesting Developments

I don't often get presented with issues of copyright violation and intellectual property theft, but I did see this one today. It seems someone has set up a blog where they are offering copyrighted ebooks for free, two of which I authored. I also know the authors of several of the other offered ebooks.

Folks, one of the big myths about authoring books is that somehow the author gets rich. I'm here to tell you, in this niche market, that just is NOT true at all. If you're that hard up that you need to steal someone else's intellectual property...well, what can I say?

Suffice it to say, this just isn't cool.

SANS Forensic Summit

Folks, let's not forget that the SANS Forensic Summit is coming up! Check out the list of speakers, presentations, and panels...this conference is going to be great!

Also, I spoke to the marketing folks at Syngress, and they are going to have a table at the Summit (graciously provided by Rob Lee) where they're going to have books available. Now, wait...the WAY COOL thing about this is that several of the authors are also speakers at the Summit! So, if you don't have Chris Pogue's book, get it and get it signed by none other than Chris Pogue himself! Eoghan Casey's going to be there, too!

Finally, I have pristine copies (one each) of Windows Forensic Analysis (first and second editions), as well as Perl Scripting for Windows Security. I am going to bundle all three of them together and provide them as a give-away following my presentation at the conference.

BTW...the presentations from the 2008 SANS Forensic Summit are archived here! Take a look!

PS: I had a meeting yesterday and got there a few minutes early...I was meeting a friend for lunch and took a minute or two to walk through a nearby bookstore. Guess what I saw on the shelf? I'll give you a hint...I went to the Computer section, and was browsing in the area where they keep the books on security and forensics... ;-)

Tuesday, June 09, 2009

More Links

NetWitness announced on 8 June the availability of NetWitness Insight. This is a very interesting announcement, in part because, IMHO, NetWitness is the premier product available today when it comes to seeing and understanding what's happening on your network. In this case, collection of network traffic isn't the hard's the analysis and presentation, and that's where NetWitness products excel. The inclusion of Insight now gives the NetWitness suite of products what appears to be a DLP and vulnerability assessment capability, so that customers can find out where that sensitive data resides, as well as (according to the press release) locate vulnerable systems and prioritize remediation. As an incident responder, this is a fantastic capability...but what's missing is still the host-based response capability. Sounds like a job for F-Response!

I recently heard about a tool called MIR-ROR, put together originally by Troy Larson and then expanded by Russ McRee, both of Microsoft. Russ blogged about it here, and there's a toolsmith article available on it, as well. MIR-ROR is a batch file that is useful for running tools on a system as part of incident response; what I like about this is that Russ isn't sitting back hoping that someone does something like this, he's taking advantage of his knowledge and capabilities to put this together. And he's made it available to the public, along with instructions on how to run it. I like tools like this because they're self-documenting...properly constructed and commented, they serve as their own documentation. As always, the standard caveat applies...use/deploy tools like this as part of an incident response plan. If your plan says you need to acquire a pristine image of the drive first, you will want to consider holding off on using a tool like this...

Didier updated his disitool...I'm not even going to try to explain this one; instead, go to his blog and check it out.

Win32dd has been updated...according to Matthieu, there are some bug fixes, improvements, and some additional information about the memory state is displayed when the tool is used. Thanks, Matthieu, for the great work you've done with this tool!

While we're on the subject of memory collection and analysis, Brendan has updated VolReg to support BIG_DATA data types, due in part to Matthieu's blog post on Undocumented Vista and later Registry Secrets. Also, be sure to check out Brendan's Volatility Plugins page.

If you're a follower of Lon Solomon, at this point, you might be thinking, "SO WHAT?!?" Well, take a look at this write-up from Sophos...the part I like about this bit of malware is:

Rather than creating another file on disk, the dropper logic writes an entire PE file into the registry. The executable is stored under the key HKLM\SOFTWARE\Licenses with a randomly generated entry name.

Years ago while I was working for a security company in New Jersey, I wrote some code that would go out to a web site and grab what appeared to be a GIF image, but was in reality a PE file. The code would then disassemble the PE file into various Registry keys...the idea being that disassembling and writing it into the Registry would avoid detection by AV scanners. Then another piece of code would reassemble the PE file into the Recycle Bin and launch it. I thought that was pretty cool...but that was 8 years ago. Reminds me of that song Round and Round, by RATT..."what comes around goes around...". Hey, I wonder if we'll "see" a resurgence in the use of NTFS Alternate Data Streams, say, to hide PCI data?
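The forensic takeaway from techniques like this: during Registry analysis, it can be worth checking value data for embedded PE signatures. The key path comes from the Sophos write-up; everything else below is a hypothetical sketch of the check itself, operating on raw value data:

```python
import struct

def looks_like_pe(data):
    """Return True if a Registry value's raw data looks like an embedded PE file.
    Checks the 'MZ' DOS magic, then follows e_lfanew to the 'PE\\0\\0' signature."""
    if len(data) < 0x40 or data[:2] != b"MZ":
        return False
    # e_lfanew: 32-bit offset to the PE header, stored at offset 0x3C of the DOS header
    (e_lfanew,) = struct.unpack_from("<I", data, 0x3C)
    if e_lfanew + 4 > len(data):
        return False
    return data[e_lfanew:e_lfanew + 4] == b"PE\x00\x00"

# Fabricated example: a minimal blob with the MZ magic and a PE signature at offset 0x80
blob = bytearray(0x100)
blob[:2] = b"MZ"
struct.pack_into("<I", blob, 0x3C, 0x80)
blob[0x80:0x84] = b"PE\x00\x00"
print(looks_like_pe(bytes(blob)))  # True
```

On a live system, you could walk the values under HKLM\SOFTWARE\Licenses and feed each value's data to a check like this; against an acquired image, the same logic could run over the raw Software hive (say, as a RegRipper-style plugin).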

Sunday, June 07, 2009

Forensic4Cast and Links

Lee Whitfield of the Forensic4Cast podcast reached out to me this past week, and asked me to be a guest on his podcast on Wed, 10 June.

If you've never listened to Lee's podcast, give it a shot...Matt Shannon of F-Response was interviewed, as well as others. Lee's also got a section for technical articles, many of which look to be extremely useful.

Lee's also taking nominations now through 21 June for Forensic4Cast Awards; be sure to place your vote in any or all of the various nomination categories. Take a look at the page to see how everything works, and dates for submissions, voting and the posting of the final results. While this isn't something huge that's going to get you a free pass to RSA next year or something, I do think that it's a great opportunity to show your appreciation for the work done in the various categories. See what Matt's posted as his nominations!

Speaking of podcasts, did you know that CERT has podcasts? Another security podcast out there is ExoticLiability. Man, there's just too much to check out!

Didier's posted some links to PDF analysis tidbits...very cool! Didier's done a great deal of work in the area, and his work reminds me a lot of the ComputerBytesMan's work in the area of MSWord metadata extraction. Now, some folks are going to look at these links and ask, "...okay, but how can I use this?" Far too often, folks will post links to other blogs or blogposts without any real explanation of how the information is useful, valuable, or important. Well, when conducting analysis of a compromised system, one of the questions that comes up very often is, how was the system compromised? What was the infection vector? It's pretty trivial, really, to scan a mounted image with AV software or to locate files that an intruder may have copied onto the system...but sometimes (many times?) we need to find out how they got in. One means of doing so is to run file signature analysis tools across web browser and email attachment cache directories to locate things like PDF documents or Excel spreadsheets that may have been downloaded. Finding such documents, which have recently been identified as having vulnerabilities, may lead to identifying the initial source of compromise or infection.
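The core of that file signature analysis is just comparing magic bytes against file contents, regardless of extension. A minimal sketch (the signature list is a small illustrative sample, not exhaustive) that could be pointed at a mounted image's cache directories:

```python
import os

# A few common file signatures (magic bytes at offset 0); a real tool would carry many more
SIGNATURES = {
    b"%PDF": "PDF document",
    b"\xd0\xcf\x11\xe0": "OLE2 (legacy Office) document",
    b"PK\x03\x04": "ZIP container (incl. Office 2007+ docs)",
    b"MZ": "PE executable",
}

def identify(path):
    """Return a file-type guess from magic bytes, ignoring the file extension."""
    with open(path, "rb") as f:
        header = f.read(8)
    for magic, desc in SIGNATURES.items():
        if header.startswith(magic):
            return desc
    return "unknown"

def scan_cache(cache_dir):
    """Walk a (mounted, read-only) browser cache directory, flagging files of interest."""
    hits = []
    for root, _dirs, files in os.walk(cache_dir):
        for name in files:
            path = os.path.join(root, name)
            kind = identify(path)
            if kind != "unknown":
                hits.append((path, kind))
    return hits
```

Run against Temporary Internet Files or an email client's attachment cache, the hits give you a short list of documents to examine for exploit content, and their timestamps slot right into a timeline.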

Moyix recently posted some Windows 7 Registry hives for examination, based on a request from Tim Morgan. I'd taken a look at hives from a Windows 7 VM earlier this year, and found that while key locations may change between various revs and versions of the OS, the binary structure appears to remain the same. Thankfully, MS hasn't moved to an all-XML format for the Registry (right now, a lot of you out there are going, "Dude, shut up!!"). I've been running my RegRipper plugins against the hives and dude...they work great!

Speaking of Registry hives, reviews of Windows Forensic Analysis 2/e are already starting to appear! It appears that some folks really like the Registry analysis chapter...maybe this is something to take off on its own...what do you think? Should Registry Analysis become its own book? Personally, I think that there's more than enough information out there for this...let me know your thoughts. Or let Syngress know your thoughts.

Finally, more reviews of WFA 2/e are being posted, and I've gotta thank Larry for his review of Perl Scripting for Windows Security! I greatly appreciate the efforts of those who are posting reviews, regardless of the forum. Thanks, folks!

Saturday, June 06, 2009

First Amazon Review!

The first review of WFA 2/e has been posted on Amazon! Check it out! Thanks, Dave!

Thursday, June 04, 2009

Links and stuff

First off...for anyone who purchased a copy of Windows Forensic Analysis Second Edition at the TechnoSecurity conference...I'd greatly appreciate it if you'd post a review on Amazon! Thanks!

Richard Bejtlich has an interesting post regarding incident ratings. I find Richard to have well-thought-out and well-reasoned views, and this is yet another example of that. When writing CSIRPs, we include things such as incident severity ratings for classification and escalation purposes, so having something like this, while perhaps a little complex for many organizations, is very important.

JL's been nice enough to post on some CEIC stuff. Thanks for posting and making these materials available!

Over on OffensiveComputing, there's a link for OfficeMalScanner, which scans Office documents for malware, embedded PE files, and OLE streams. If VB code is found, it's reportedly extracted for analysis. This sounds pretty cool and a good thing to have in your toolkit, along with other means for malware detection.

The eEvidence site has been updated again! Christine has a way of finding some really cool papers and presentations...while they may not always be brand-spanking new, they are definitely topical and well worth reading and discussing.

Ed posted some good command-line kung fu for getting user and group information from a live system. For post-mortem analysis, I use RegRipper's samparse plugin, which not only parses out the user information, but the group membership information, as well. Another interesting bit of analysis you can use this for is to determine all local users on the system; dumping the contents of the ProfileList key (from the Software hive) or doing a 'dir' of the Documents and Settings directory will give you the list of users with profiles on the system, but this will not distinguish between local and domain users.
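One way to make that local-vs-domain distinction: ProfileList subkeys are named by SID, and a local account's SID is the machine SID plus a RID. A hedged sketch (the SIDs below are fabricated; the machine SID would come from the SAM/local accounts on the system being examined):

```python
def split_profiles(profile_sids, machine_sid):
    """Split ProfileList SIDs into local accounts vs. other (domain/built-in) accounts.
    A local account's SID is the machine SID with a RID appended."""
    local, other = [], []
    for sid in profile_sids:
        # Strip the trailing RID and compare what's left against the machine SID
        if sid.rsplit("-", 1)[0] == machine_sid:
            local.append(sid)
        else:
            other.append(sid)
    return local, other

# Fabricated example SIDs
machine = "S-1-5-21-1111111111-2222222222-3333333333"
profiles = [
    machine + "-1001",                                  # local user
    "S-1-5-21-4444444444-5555555555-6666666666-1105",   # domain user
    "S-1-5-18",                                         # LocalSystem
]
local, other = split_profiles(profiles, machine)
```

Cross-referencing the "local" list against samparse output then tells you which profiled users are accounts that actually live on the box.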

According to SANS, the key ingredient to team development! Amen to that!

Wednesday, June 03, 2009

The Case of the "Default User"

Ever run across a case during which, while examining Internet browser history, you found that the "Default User" had browser history? Ever wondered about that?

Rob "van" Hensing was one of the first I know of to blog about this issue, almost three years ago. Given the time frame, this is a good time to bring this subject up again, don't'cha think?

I've seen this sort of thing in a couple of instances, specifically when SQL injection has been used to gain access to an infrastructure, and the bad guy gets a copy of wget.exe (static PE analysis will tell you if the program accesses the WinInet APIs) onto the system, and then uses that to pull down other files - in many cases, they'd use echo to create an FTP script, then launch the native command line FTP client using the script, or use wget.exe to pull the files down. Why? Well, most times FTP and/or HTTP are allowed out through the firewall.
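That "static PE analysis" check for WinInet can be done properly by parsing the import table (e.g., with a third-party module like pefile), but quick triage often amounts to a string scan over the binary. A crude, hypothetical sketch (the example blob is fabricated):

```python
def references_wininet(data):
    """Crude triage: does this binary appear to use the WinInet API?
    A real check would parse the PE import table; this just scans the raw bytes
    for the DLL name and a couple of telltale API names."""
    haystack = data.lower()
    markers = (b"wininet.dll", b"internetopenurla", b"internetreadfile")
    return any(m in haystack for m in markers)

# Fabricated blob standing in for a suspect wget.exe-style downloader
blob = b"MZ...\x00WININET.dll\x00InternetOpenUrlA\x00"
print(references_wininet(blob))  # True
```

A positive hit matters here because files pulled down via the WinInet APIs by a process running as LocalSystem leave browser history under the "Default User" profile, which is exactly the artifact this post is about.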

Good stuff.

Tuesday, June 02, 2009

WFA 2/e Published!

I caught a note over on Facebook yesterday from Syngress that Windows Forensic Analysis, 2/e was published today! Awesome!

The info posted on Facebook is the same information posted on Elsevier's site. Aside from the "?" where there should be quotes, the information itself looks good...just one of those things I kinda wish the publisher would've picked up on earlier in the game, and corrected.

I'm told that the book is available, right now, at the TechnoSecurity conference in Myrtle Beach, but I haven't received any confirmation of that, nor any feedback from the Syngress marketing folks who are on-site. Speaking of which, the SANS Forensic Summit is rapidly approaching, and I'm trying to get copies of WFA and other books there. Rob Lee has been gracious enough to offer a table at the Summit, and I've let the Syngress folks know that I'm not the only author who will be at the Summit.

Speaking of the TechnoSecurity conference, here's a picture of WFA 2/e on sale!

So, if you're going to be at the Summit (or someone you know is going), and you want Syngress to have books there for purchase, either post something publicly (blog) and send me a link, or comment here! Let the Syngress marketing folks know that if a computer conference has the word "forensics" in the title, then they should have books with the same word in the title available...particularly if there are going to be authors at the conference (and speaking)!

Another thing...if you are a college educator who uses WFA in computer forensics courses, or if you know of an academic institution (community college, college, academy, university, etc.) where WFA is part of required or recommended reading...please contact me at keydet89 at yahoo dot com. Thanks!