Tuesday, October 27, 2009


F-Response 3.09.05 is out! With this version comes "Compatible with Windows 7" status, as well as additional platform support (i.e., HP-UX and FreeBSD 7). If you haven't been watching Matt's product, all I can say is, you really need to be. Why? What does F-Response offer? As an incident responder, one of the biggest issues I've had to face is the lack of available data for analysis. This is most often due to the fact that the "victim" is woefully unprepared for those incidents that will, without question, occur. The short story here is that for relatively little expense, F-Response provides system owners and first responders (who should be the folks on-site) with the ability to quickly gather data so that the questions they do have (i.e., was the system infected/compromised, was sensitive data on the system, etc.) can be answered.

Thanks to JL, we should be looking for a new release of Volatility soon! JL's been doing a lot of great work documenting Volatility, as well.

A bunch of us will be at the NetWitness User Conference next week...I won't be speaking, but I will be there with my employer. This is a great product, and if you don't already know about it, you really should check it out. Richard Bejtlich of TaoSecurity fame, perhaps the predominant NSM luminary, has blogged about NetWitness, albeit not recently. Maybe there's something on the horizon...we can only hope!

TrueCrypt 6.3 is out, with full support for Windows 7 and Mac OS X 10.6 Snow Leopard! If you're one of those folks who loves the MacBook hardware, and loves to have the ability to use both Mac OS X and Windows (via Boot Camp), then you now have the ability to protect sensitive (i.e., customer) data on both platforms.

Hey, did you know that this guy has been collecting screenshots from TweetMyPC? Looking at the archive, all of the screenshots are from this past summer (June through August), but still...probably a little more revealing than I'd like to have folks see! Reminds me of the site that used to be up a couple of years ago called "seewhatyoushare.com"...

Christa Miller had an excellent article posted on Officer.com, regarding crime scene evidence that's being ignored. While specific to LE, my own experience tells me that this is also the case with IR activities, where first responders don't often recognize the value in certain devices or data. Also important in today's day and age, Christa raises the issue of evidence "in the cloud". I'd blogged about 4 1/2 years ago about GMail Drive artifacts, and it's good to see Christa bringing this sort of thing back into focus again.

There are some thought provoking posts over on the Cassandra Security site...give them a look and a read, leave your comments. At least one of the guys over there is a former Marine, like myself, and this guy...so that's a recipe for some interesting posts!

The Free Tools post is really taking off...if you've got something to add, please feel free to make a comment. Really. Just add a comment if you have a free tool for Windows systems that isn't already on the list.

File Extensions and Programs

Now and again in the lists, you'll see a post asking about a file extension, and what program it "belongs" to or what it does. Many times the way to determine some information about the file extension may be to search via Google or the Filext.com site. However, if you found the file while analyzing an acquired image, you already have the information you need at your fingertips, right there in the Registry within the image. The Registry maintains a list of file associations; that is, file extensions for installed applications, associated with the programs that should be used to open them. These are maintained for the system, as well as the user.
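To illustrate the lookup the Registry performs, here's a minimal sketch of the two-step resolution under HKEY_CLASSES_ROOT (or the Software hive's Classes key in an acquired image): the extension subkey points to a ProgID, and the ProgID's shell\open\command value names the program. The extension, ProgID, and path below are hypothetical stand-ins for values you'd pull out of the hive with a Registry viewer or a RegRipper-style plugin.

```python
# Two-step file-association lookup, modeled on HKCR:
#   ".ext" subkey's (Default) value -> ProgID
#   ProgID's shell\open\command (Default) value -> program
# Both dicts are hypothetical data extracted from a hive, not live queries.
extensions = {
    ".xyz": "XYZApp.Document",
}

progids = {
    "XYZApp.Document": r'"C:\Program Files\XYZApp\xyzapp.exe" "%1"',
}

def resolve_extension(ext):
    """Map an extension to the command used to open it, if known."""
    progid = extensions.get(ext.lower())
    if progid is None:
        return None
    return progids.get(progid)

if __name__ == "__main__":
    print(resolve_extension(".xyz"))
```

The same walk works against a user's NTUSER.DAT (per-user associations) before falling back to the system-wide Classes key.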

File extensions are the basis of traditional file signature analysis, where a file's signature (usually a "magic number" within the first 20 bytes of the file) is compared against the extensions known to be associated with that particular file type. When the signature and extension match, nothing happens...that's to be expected. However, when there's a mismatch...either an unknown file extension, or an unknown extension-and-"magic number" combination...the analyst should be notified with a flag of some kind.
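The comparison itself is simple enough to sketch; here's a minimal example with a tiny, illustrative subset of a signature table (real tools ship databases of hundreds of signatures):

```python
# Minimal sketch of file signature analysis: compare a file's leading
# bytes against the signature expected for its extension, and flag
# mismatches. The signature table is a tiny illustrative subset.
import os

SIGNATURES = {
    ".jpg": b"\xff\xd8\xff",
    ".gif": b"GIF8",
    ".zip": b"PK\x03\x04",
    ".exe": b"MZ",
}

def check_signature(filename, header):
    """Return None on a match; a message string on anything worth flagging."""
    ext = os.path.splitext(filename)[1].lower()
    magic = SIGNATURES.get(ext)
    if magic is None:
        return "unknown extension: " + ext
    if header.startswith(magic):
        return None
    return "signature mismatch for " + filename

print(check_signature("pic.jpg", b"\xff\xd8\xff\xe0"))  # matches -> None
print(check_signature("note.jpg", b"MZ\x90\x00"))       # flagged
```

The second call is the interesting case for an analyst: an executable renamed with a .jpg extension.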

I blogged on file associations over a year ago...sometimes circling back around to the older stuff is a good thing, can be very useful, and can remind us of things that might not have seemed useful at the time. So, the next time you run across an odd file extension, try taking a look at the Registry within the image; perform a little Registry analysis and post your findings to the list, rather than posting a question...because folks are just going to be asking you, "what did you find through Registry analysis?"

Friday, October 23, 2009

Free Tools

I've seen requests in several listservs for listings of free tools that people use during examinations, and most often, the response is something akin to "contact me off list". In my mind, that kind of defeats the purpose of the listserv...why not just close it down and move everyone to Craigslist?

Regardless, I thought that this would be a good way to start and even maintain a list of free tools (or at least some that have trials/demos available) that can/have been used during computer forensic examinations on Windows systems. I'll start by providing tools that I use, as well as links to other tools, and from there, I will expand the list as I receive information (i.e., comments, emails, etc.).

General Tools
Perl - 'nuff said; mostly for creating my own tools

FTK Imager - great for opening raw (i.e., dd) images, .E01 files, .vmdk files, etc. - even allows you to "acquire" other formats to raw/dd. Also great for selected file extraction from the image, when you don't need everything
dd - George M. Garner Jr's FAU
dcfldd - another CLI imaging tool, available for the Windows platform
Tableau TIM - coming Q4, 2009
Raptor - bootable Linux CD that can be used for imaging (this will likely open up a whole flurry of similar emails, so let's just use this one as a placeholder for all bootable Linux CDs...)

Image Mounting
ImDisk - great free tool for mounting Windows images on Windows systems, in read-only mode
VDKWin - another free tool
P2Explorer - from Paraben; free, requires registration

Image Analysis
TSK Tools - I've used mmls and fls mostly, but blkls is extremely useful, as well
ProDiscover, Basic Edition - Not a full suite, but very useful
AntiVirus Scanners (ClamWinPortable, SysClean, Malwarebytes)
Timeline Creation Tools (TSK tools, pasco, Perl scripts, etc.) - Perl scripts available from the Win4n6 Yahoo Group
Internet Evidence Finder (JADSoftware) - also, check out the Encrypted Disk Detector
Carving - foremost, scalpel, PhotoRec
DiskDigger - from Dmitry Brant; also check out NTFSWalker

File/Document Metadata
Structured Storage Extractor - view contents of structured storage/OLE files; this used to mean just MS Office (pre-2007) documents, but on Windows 7, this now means Sticky Notes, etc.
OffVis (fact sheet)
Office 2007 document metadata (script) - look for cat_open_xml.pl; other tools available, as well
Skype Extractor
PDF Tools - from Didier Stevens; some of Didier's tools have been incorporated into the VirusTotal site
MSI files - InstEd

Working with Email
Email Conversion Tools - may not be free
AvTech - Perl script
Emailchemy - from Weird Kid Software; demo available
Mail-Cure - free, described here
Aid4Mail - free trial available
Intella - from Vound Software; doesn't require that Outlook be installed; trial available

File Hashing
MD5Deep - also allows for other hashing algorithms
SSDeep - fuzzy hashing; is also incorporated into VirusTotal
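Cryptographic hashing of the sort md5deep performs is easy to reproduce with nothing but the standard library; a quick sketch (fuzzy hashing a la ssdeep requires its own library and isn't shown):

```python
# Sketch of md5deep-style file hashing using only the standard library.
# Reads in fixed-size chunks so large evidence files don't exhaust memory.
import hashlib

def hash_file(path, algorithm="md5", chunk_size=65536):
    h = hashlib.new(algorithm)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()
```

Pass algorithm="sha1" or "sha256" for the other algorithms md5deep's companion tools cover.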

Registry Analysis
RegRipper - includes rip, ripXP, and regslack
MiTeC Registry File Viewer
Didier Stevens' UserAssist
Pwdump7 or SAMInside - great way to get password hashes for cracking

Archive/Compression Utilities
Other utilities

Memory Collection/Analysis
Windd - 1.3, for x86 and x64 now available
MDD - ManTech's memory imaging tool; 32-bit, has the 4GB limit
Nigilant32 - from Matt Shannon, F-Response; Windows 2000/XP only
Volatility - XP SP2 and SP3 only
Memoryze - from Mandiant

Packet Analysis
NetWitness Investigator
Tools for extracting files from streams - not all of the tools listed run on Windows

Browser Analysis
SQLite Spy (for Firefox 3 analysis)

U3 Launcher Log parser
Other Mandiant Tools (Highlighter, Web Historian, etc.)
MIR-ROR - read about it here; great tool from Russ McRee (read Russ's ISSA toolsmith write-ups on other tools)
ShadowExplorer (Dan Mares' VSS)
SMPlayer - "for troublesome videos"
Evidence Mover
Windows Search Index Extractor - extracts information from the Windows Desktop Search database (i.e., the Windows.edb file)

Various thumbnail cache extractor applications can be found here.
NirSoft has a variety of free and useful utilities available.
RedWolf Computer Forensics - various parsing tools

Any you'd like to add? Comment, or email me.

Prefetch Parser
Fox Analysis - browser analysis
MiTeC Windows Registry Recovery
MiTeC Windows Registry Analyzer (associated guide)
DigestIT 2004 MD5 Hash

Wednesday, October 21, 2009

Windows 7 and the Future of Forensic Analysis

Okay, so I was in Redmond, WA, last week at some computer conferences (yes, plural) and was on-stage with Troy Larson while he waxed philosophic on forensicy stuff with respect to Windows Vista and beyond, including Windows 7. I've been noodling a lot of this over, and here's what I've come up with...

One of Troy's pet projects is Volume Shadow Copies (please, do not ask me about any of his other interests...), and I have to say, he's really one of the most knowledgeable folks I'm aware of on the subject of VSC and the needs of forensic analysts. Troy has some interesting things to say about how Volume Shadow Copies can be accessed, but one of the most interesting aspects is that one way to do this is by booting your acquired image via something like LiveView. Another means is to mount the image file as a drive letter from a like system. At that point, you can image the entire volume or dump only selected files.

Notice at no point did I say, "...insert your dongle...", or "...run this EnScript...". It turns out that Volume Shadow Copies can be enumerated and accessed via WMI, meaning that once you have an image mounted, you may be able to (haven't tried it yet) automatically process what you need.

I was doing some research into processing the new Windows Event Log format (new as of Vista and Windows 2008, that is...) for inclusion into timeline analysis, and what I've been able to find out is that if you extract the pertinent .evtx files from your acquired image, you may be able to process them via LogParser, but again...on a like system. Andreas Schuster did a great job in documenting the format, but .evtx files are a combination of binary and binary XML...eesh! Note - you may need to consider using something like wevtutil in your live response activities...
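One cheap sanity check before shipping extracted files off to a parser is to confirm the file header. Per Andreas Schuster's documentation, .evtx files open with an 8-byte "ElfFile\0" magic, while the older .evt format carries an "LfLe" signature at offset 4 of its header record; a quick sketch:

```python
# Distinguish new-format (.evtx) from old-format (.evt) Event Log files
# by their header signatures, per Andreas Schuster's format documentation.
EVTX_MAGIC = b"ElfFile\x00"

def looks_like_evtx(header):
    """New (Vista+) binary-XML Event Log: file starts with 'ElfFile\\0'."""
    return header.startswith(EVTX_MAGIC)

def looks_like_evt(header):
    """Old (NT/2000/XP) Event Log: 'LfLe' signature at offset 4."""
    return len(header) >= 8 and header[4:8] == b"LfLe"
```

A signature check like this also catches truncated or wiped log files before you burn time trying to parse them.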

Okay, I'm not sayin' that commercial forensic analysis suites are no longer useful...after all, ProDiscover 6.0 allows you to access Volume Shadow Copies if you're accessing the remote system live via the servlet...which means that if you're using PD for live response, you can likely automate what you need via Perl-based ProScripts.

So where does that leave us? Folks, I'm gonna sound the ol' "the age of Nintendo forensics is over" trumpet yet again, and the dawn of the educated, knowledgeable, sofis...soffis......sophisticated responder is upon us!

Tuesday, October 20, 2009

Timeline Creation Tools

As time progresses, we look at the tools we have available to us, tweak those that we have, and maybe look for new capabilities, creating new tools. Recently, someone was kind enough to take the time to post some feedback on their experiences with the timeline tools I released in the Win4n6 Yahoo Group a bit ago, and I took the opportunity to update some of the tools based on that feedback. Below are the tools I updated, and what I did to update them:

pref.pl - removed the path to the directory where the Prefetch files are kept; the feedback had an excellent point - don't want to confuse the user

evtparse.pl - updated this script to (a) dump the sequence of event records and time generated timestamps, and (b) get all .evt files in a directory, rather than requiring the user to enter one command line for each file

jobparse.pl - created this one recently, for parsing Scheduled Task .job files (NOT the schedlgu.txt log file); includes output in TLN format

Now, these updated tools have NOT been included in the toolset available in the group, largely because my second Hakin9 article - the one where I provide a hands-on walk-through of the tools - should be coming out in the near future, and I don't want to confuse anyone. Also, the feedback (which I greatly appreciate) pointed out that this is still largely a manual process, and I realize that this can be an impediment to a lot of forensic examiners. Maybe what needs to happen is that I need to provide training on using these tools, so that more folks can realize for themselves the real power in this analysis technique.

Another thing I really need to emphasize about timeline generation is how powerful it can be when used to optimize triage and analysis techniques. Let's say you have a large-ish incident that you're responding to, and it's clear that you need to have a means to get some analysis completed in parallel, while the rest of the data is being collected. On-site staff can collect file system metadata and specific files from acquired images while verifying the image file systems, and ship that data off to another analyst for timeline generation and analysis. Given an image of 80 or 160GB, getting the file system metadata, and archiving selected files that have been extracted from an image means that you're sending off several MB of data, rather than GB. In addition, you're not actually sending file contents...so in the case of response activities involving a data breach, you can get analysis done by shipping this data off, but you're not sending the actual sensitive data itself...file names and paths != file contents.

So consider this scenario...on-site staff are in the process of acquiring systems (or, perhaps the organization's own incident responders are acquiring memory dumps and images) and part of that process is to verify the acquired images by opening the image file in FTK Imager. Now, you may only have a few team members on-staff, all trying to collect a considerable amount of data; not just images, but also network diagrams, data flows, etc. So, their new process is to verify the file system of each image, and then run the appropriate tools to collect file system metadata, as well as various files (i.e., .evt, .pf, .job, Registry, etc.), zip them up, and ship them off for analysis. Put these in the hands of someone skilled and practiced in the use of the timeline creation tools, and you will very quickly get a timeline of activity from each system. This can help you quickly narrow down what you're looking for or at, as well as help you scope other systems that may be involved in the incident. And you haven't contributed to the exposure of sensitive data!
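The mechanics of pulling the shipped-off data into one view are straightforward: each tool emits pipe-delimited TLN lines (time|source|host|user|description), and the lines simply get merged and sorted on the epoch time in the first field. A minimal sketch, with hypothetical events standing in for real pref.pl/evtparse.pl/jobparse.pl output:

```python
# Sketch of merging per-source event lists into one timeline using the
# pipe-delimited TLN layout (time|source|host|user|description).
# The events below are hypothetical examples, not real tool output.
events = [
    (1256601600, "FILE", "HOST1", "-", "MACB c:\\windows\\foo.dll"),
    (1256601500, "EVT",  "HOST1", "-", "Event ID 528, logon"),
    (1256601700, "PREF", "HOST1", "-", "FOO.EXE last run"),
]

def to_tln(event):
    return "|".join(str(field) for field in event)

def build_timeline(events):
    """Sort by the 32-bit Unix epoch time in the first field."""
    return [to_tln(e) for e in sorted(events, key=lambda e: e[0])]

for line in build_timeline(events):
    print(line)
```

Because every source normalizes to the same five fields, adding a new data source to the timeline is just a matter of writing another parser that emits TLN.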

Friday, October 16, 2009


What challenges do you face in Windows forensic analysis?

Book news and Registry research

I've recently exchanged a number of emails with my editor at Syngress, and opted to put off working on a book on Registry analysis until next year.

Well, more accurately, I won't be submitting a manuscript until after the summer of 2010. One reason for this is because I want to have the time to really dig into the Windows 7 Registry and do some in-depth analysis (and thoroughly document it) to be included in the book. I also need to refine some of the updates I have planned for RegRipper and that set of tools.

However, there were other reasons for putting this project off, as well. I submitted my proposal for the book, and got back almost a dozen reviews...all anonymous. Many of the comments were interesting, but one of the common threads throughout the reviews was a need to compare commercial tools. Sadly, this isn't something I have access to...while some vendors have offered me trial versions of tools, this hasn't been the case with tools that deal with the Registry. I simply don't have access to such tools. Further, these tools are largely just Registry viewers, and don't offer the same sort of functionality or flexibility as RegRipper. I'm not sure, but this may end up being the biggest obstacle to the book.

Finally, I have to come up with a way to present the information I have and develop in the book without making it just a big, long, boring list of Registry keys and values. That'll take some time to develop...

DCC2009 Takeaways

I had an opportunity to attend some of the presentations at the Digital Crimes Consortium 2009 conference at the Microsoft campus in Redmond, WA.

One of my biggest takeaways from this event was the fact that the needs of CIOs, IT staffs and consultants (which is where I spend most of my time) are, on the surface, vastly different from the needs of law enforcement. "Victim" IT organizations are primarily concerned with getting rid of a malware infection, regardless of what it is...worm, Trojan, etc. In my experience, eradication and returning the infrastructure to normal operations are the primary concern, with compliance and questions about data loss/exfiltration usually popping up after the fact (i.e., too late).

However, LE is interested in intelligence, some sort of actionable data that can be used to investigate cyber crimes, track down the players and prosecute someone, preferably someone fairly high up the food chain.

At first glance, there may not be an obvious overlap. However, both sides have information available that is useful, even valuable, to the other. LE might have data available about cyber crimes that occur across a wide range of victims...such as, was the incident initiated by a browser drive-by, was it targeted, etc? LE (depending upon the level that we're talking) may have trending information available regarding victim types, intruder/criminal activity, etc. Victim IT organizations will have information available about malware variants, outbound connections (to command-and-control servers, etc.), sensitive information collected, etc.

Where things tend to break down is that in some cases, LE either doesn't track the kind of information that might be useful to victims, or they feel that they can't share it because doing so might expose information. Victim IT organizations many times feel the same way...that they can't share what information they have without exposing information about their infrastructure, intellectual property, or "secret sauce". Sometimes, the victim organizations do not want to contact LE for fear that their name would be included in public documents, exposing the fact that, and the means by which, they were compromised...something those organizations do NOT want made public.

Another takeaway I got from the conference is that there is a definite organization and structure behind cyber criminal activities. There's a hierarchy to the structure, an economic driver (i.e., money), and individuals in the communities are kicked out if they fail to provide something back to the community. These seem to be driven like businesses without an HR department...maybe there are certain elements to this structure that the good guys could emulate.

Taking this anywhere is going to take some thought and some work.

The first part of this trip was to participate with Troy Larson in his Windows 7 Forensics presentation. I've been focusing on the Registry, but Troy's been looking at a lot of other things, most notably Volume Shadow Copies and how they can be used.

One of the things that Troy brought up in the presentation that stood out for me was the number of files (Sticky Notes/.snt, etc.) that are based on Microsoft's OLE, "structured storage" file format. You might be able to get some interesting data from these files using oledmp.pl, or you can use MS's own Office Visualization Tool.
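Since OLE/structured storage files now cover more than just pre-2007 Office documents, a cheap first step before handing a file to oledmp.pl or another parser is checking the fixed 8-byte compound file signature:

```python
# OLE/structured storage files (pre-2007 Office docs, Windows 7 Sticky
# Notes .snt files, etc.) start with a fixed 8-byte signature; checking
# it is a cheap first step before handing the file to an OLE parser.
OLE_MAGIC = b"\xd0\xcf\x11\xe0\xa1\xb1\x1a\xe1"

def is_ole_file(path):
    with open(path, "rb") as f:
        return f.read(8) == OLE_MAGIC
```

Run across the output of a file carver, this kind of check quickly separates candidate structured storage files from everything else.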

Speaking of metadata, everyone should remember Kristinn's post to the SANS Forensic blog on Office 2007 document structure and metadata; I like it because he includes a Perl script for parsing this information. If you end up using the version of the script for Windows systems, be sure to read the file headers for instructions on how to ensure that you have the right modules installed.
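The reason Office 2007 metadata is so approachable is that the documents are just Zip archives with the core metadata sitting in docProps/core.xml. Here's a rough Python analogue of that extraction (a sketch of the idea, not a port of Kristinn's cat_open_xml.pl), using only the standard library:

```python
# Rough sketch of Office 2007 metadata extraction: .docx/.xlsx files are
# Zip archives, and document metadata lives in docProps/core.xml.
import zipfile
import xml.etree.ElementTree as ET

def core_properties(path):
    """Return creator/title/etc. from an OOXML document's core.xml."""
    with zipfile.ZipFile(path) as z:
        root = ET.fromstring(z.read("docProps/core.xml"))
    props = {}
    for elem in root:
        # strip the {namespace} prefix from each element's tag name
        props[elem.tag.split("}")[-1]] = elem.text
    return props
```

The same approach extends to docProps/app.xml, which carries application-level metadata (total editing time, company name, etc.).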

Usually when I mention something like this, I get questions like, "...ok, but what about other document metadata?" Well, let's not forget Didier's work with PDFid.

Saturday, October 10, 2009


Not much new, so here are some links to things I've found interesting...

Some of the TrustWave guys were at SecTor this past week...check out Chris's write-up on the event. Chris's presentations, and others, can be found here.

An astute reader found that the Kindle edition of WFA 2/e is now available. Thanks, Tom!

Matt Shannon posted recently on New Directions in Electronic Evidence Collection, regarding a conference he's attending at the University of Florida.

If you need to get specific information, such as product keys, from a Windows installation, check out KeyFinder from Magical Jelly Bean Software. Hey, it even has command line options so you can include it in your live response batch files!

If you haven't done so in a while, check out the e-Evidence site...the most recent update appears to be about 22 Sept, and Christina has linked some really interesting files, like this one and this one. There are even a couple of papers on forensics involving social networks (here, and here).

Wednesday, October 07, 2009

Hakin9 articles

I returned from a trip this morning, to find two copies of the most recent edition of Hakin9 on my desk, with the first of three articles I've written on timeline creation and analysis. This first article is more of an introduction to the topic, and my hope is that anyone reading the articles is able to understand what I'm trying to get across, and see the usefulness and the power of this technique. Personally, I've used this technique on several examinations, all to spectacular effect.

Something that's very interesting (and validating) about this edition is Ismael Valenzuela's "My ERP got hacked - An introduction to computer forensics, pt II" article. Not only does Ismael make use of RegRipper, but he also walks through some techniques for parsing data (i.e., Event Logs/.evt files, IE browser history/index.dat file, etc.) in forensic analysis...very cool stuff, indeed! While Ismael's article does not explicitly develop a timeline, there are some data collection and analysis techniques illustrated in the article that are pretty spot on and very useful.

The second article in the series (I'm told that it will be in the next edition) is a hands-on walk-through, using a freely available image file that can be downloaded from the Internet as a basis for actually creating a timeline. While this is still a very manual process, I firmly believe the benefits of this technique far outweigh the "costs" (i.e., having to extract files and run CLI tools, etc.).

The third and final article (which I'm working on now) is a wrap-up, showing some alternative and advanced techniques that have proven (for me, anyway) to be extremely useful in getting data to include in the timeline. I've also pointed out a couple of areas where we need coverage with respect to converting the retrieved data into something that we can include in a timeline.

Overall, I think that the biggest issue with timeline creation and analysis at this point is the sheer volume of data that's available, and how we can go about doing a bit of data reduction. For example, I have yet to find a suitable technique for data visualization on the front end, when you have all of this data to go through. Clustered dots showing various activity (i.e., file system, Event Log, etc.) don't particularly make a great deal of sense to me, largely due to the fact that things such as software updates and normal operating system activity tend to create a great deal of "noise", whereas the compromise or the malware activity falls into what Pete Silberman of Mandiant referred to as "least frequency of occurrence". So spitting things out in ASCII format so that the analyst can do...well...analysis seems, to me, to be the most effective way to go at this point.
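The "least frequency of occurrence" idea lends itself to a simple data-reduction pass: count how often each event description appears in the timeline, and surface the rarest entries first. A minimal sketch, with made-up event descriptions:

```python
# Sketch of "least frequency of occurrence" applied to a timeline:
# count each event description and surface the rarest for review first.
# The noise/signal descriptions below are hypothetical examples.
from collections import Counter

def least_frequent(descriptions, n=5):
    counts = Counter(descriptions)
    # sort ascending by count, so the rarest events come first
    return sorted(counts.items(), key=lambda kv: kv[1])[:n]

noise = ["software update"] * 50 + ["svchost.exe prefetch"] * 20
signal = ["unknown service installed", "odd .job file created"]
for desc, count in least_frequent(noise + signal, n=3):
    print(count, desc)
```

The high-volume "noise" events sink to the bottom, and the one-off events...which is often exactly where the compromise lives...float to the top of the analyst's queue.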

Once the analyst has nailed down the events in question, essentially separating the wheat from the chaff, then is the time for visualization techniques, particularly for reporting. I've seen and referred to some techniques for doing this, including Simile and using Excel to generate something usable.