Thursday, April 16, 2009

Obtaining file system timeline data

One of the things I've run across regarding generating timeline data from the file system is that there has to be more than one way to do this.

For example, what happens if you have an image that you can open in FTK Imager, but for some reason, you cannot get any meaningful data from the TSK tools mmls and fls? Or, what happens if you don't have an image, but are instead accessing a live system? While you can use tools like FTK Imager to extract Registry hive files, and use the copy command to get copies of the EVT (and other) files, you may have to find an alternative method by which to obtain file system data.

Well, that's no problem at all, really. First off, if you have an image and you can open that image in FTK Imager, you can export the directory listing. Simply right-click on the image and choose Export Directory Listing. You can also do this from the command line, via the /CreateDirListing=filename switch, as described in the user manual. The resulting output is tab-delimited and goes to a .csv file which you can easily open in Excel...although that isn't entirely useful, really. Instead, you'll want to parse it in Perl (of course). The format of the directory listing output by FTK Imager includes the filename, full path, size, created, modified and accessed dates (no "entry modified" date), and if the file "is deleted". As it's tab-delimited it can be easily parsed and placed in the necessary format for use in our timeline.
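As a rough sketch of what that parsing might look like (shown in Python here, though Perl works just as well; the column order and date format below are assumptions, so check them against your own FTK Imager output):

```python
# Hypothetical line from an FTK Imager directory listing; the actual
# column order and date format depend on your version of FTK Imager.
sample = ("boot.ini\tC:\\boot.ini\t211\t"
          "4/16/2009 10:12:44 AM\t4/16/2009 10:12:44 AM\t"
          "4/16/2009 10:12:44 AM\tno")

# Tab-delimited, so a simple split gets us the fields
(name, path, size, created, modified, accessed, deleted) = sample.split("\t")

record = {
    "name":     name,
    "path":     path,
    "size":     int(size),
    "created":  created,
    "modified": modified,
    "accessed": accessed,
    "deleted":  deleted.lower() == "yes",
}
print(record["path"], record["size"], record["deleted"])
```

From there, each record just needs its timestamps converted and the fields rearranged into whatever bodyfile format your timeline tools expect.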

Another thing we'll need to do, though, is translate the time format output by FTK Imager into a Unix epoch time...which is pretty simple with the proper application of the split() function and the use of the DateTime module. Through repeated tests, I've found this module to be more accurate than Time::Local, which has actually thrown several of my converted timestamps off by a month. Installing the DateTime module is as easy as ppm install datetime, if you're running ActiveState Perl.
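The same conversion is easy in Python's standard library; here's a minimal sketch, assuming the timestamp string format shown (adjust the format string to match your own listing, and note that this treats the timestamps as UTC):

```python
import calendar
from datetime import datetime

def ftk_to_epoch(ts):
    """Convert an FTK Imager-style timestamp string to Unix epoch time.
    The format string below is an assumption; adjust it to match the
    dates in your own directory listing."""
    dt = datetime.strptime(ts, "%m/%d/%Y %I:%M:%S %p")
    # Treat the timestamp as UTC; use time.mktime(dt.timetuple())
    # instead if your listing was generated in local time.
    return calendar.timegm(dt.timetuple())

print(ftk_to_epoch("4/16/2009 10:12:44 AM"))
```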

Okay, that's one way, how about some others?

You can probably just extract the MFT from an image and use Mark Menz's MFTRipper (discussed in the 22 March CyberSpeak podcast) to extract the data you need.

If you're accessing a live system, you can use Perl (or Python...that's what the illustrious Don Weber prefers) and the stat() function to get data from each file on the system, without having to actually access the file's contents. However, this will only get you the live files; you won't get things like entry modified times or deleted files, and you will be prevented from accessing some directories by default. You can use the same code to do the exact same thing with an image mounted via Andy Rosen's SmartMount, but you may want to first open a command prompt via psexec -s cmd, so that you can run the Perl script from that command prompt with System-level privileges. This will be particularly important if you want to get not only file system data from Windows XP System Restore Points, but also if you want to parse the rp.log file in each Restore Point for when and why the RP was created, or run tools such as ripXP.
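A minimal sketch of that stat()-based collection in Python might look like the following (keep in mind that on Windows, os.stat() reports st_ctime as the creation time, and there's no NTFS "entry modified" time available this way):

```python
import os

def walk_times(root):
    """Walk a live file system (or a mounted image) and yield the
    path, size, and timestamps of each file via stat(), without
    reading the files' contents."""
    for dirpath, dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
            except OSError:
                continue  # skip locked or access-denied files
            # On Windows, st_ctime is the creation time
            yield (path, st.st_size,
                   int(st.st_atime), int(st.st_mtime), int(st.st_ctime))

for rec in walk_times("."):
    pass  # feed these tuples into your bodyfile/timeline code
```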

Finally, if you've read my book, you'll know that I've written a number of ProScripts for use with ProDiscover. Writing one to create a timeline bodyfile shouldn't be too difficult, although at this point, I'm not sure why you'd need to, given the other methods listed.

A couple of things to remember when creating these timelines...

First, the output of fls.exe is pipe-delimited, and includes the fields:

MD5|name|inode|mode_as_string|UID|GID|size|atime|mtime|ctime|crtime

In most instances, the "MD5" entry is 0, although if you're writing your own code, you can definitely populate it. Populating this field can let you do all sorts of analysis besides just looking at the timeline, such as determining if Windows File Protection was subverted to modify "protected" files.

The timestamps are in Unix epoch time, and consist of the last access time, the last modified time, the "entry modified" (ie, ctime), and the creation date (ie, crtime) of the file. This order is important, not only when parsing the bodyfile, but also when creating a tool to assemble your own bodyfile; putting these various times in the wrong locations can throw off your timeline analysis.
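If you're writing your own code to assemble a bodyfile, a small helper like this (a sketch, with the non-timestamp fields defaulted to "0" as described above) makes it harder to get the field order wrong:

```python
def bodyfile_line(name, size, atime, mtime, ctime, crtime,
                  md5="0", inode="0", mode="0", uid="0", gid="0"):
    """Emit one pipe-delimited bodyfile line:
    MD5|name|inode|mode_as_string|UID|GID|size|atime|mtime|ctime|crtime
    Keeping the four epoch timestamps in this order (atime, mtime,
    ctime, crtime) is what keeps the timeline straight."""
    return "|".join(str(f) for f in
                    (md5, name, inode, mode, uid, gid,
                     size, atime, mtime, ctime, crtime))

line = bodyfile_line("C:/boot.ini", 211,
                     1239876764, 1239876764, 1239876764, 1239876764)
print(line)
```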

Finally, an interesting side effect of this approach is that timeline assembly and analysis can be run in parallel to other activities, particularly those that consume a great deal of time, such as scanning for PCI data. Also, as the output is largely text based and in most instances the filenames and other necessary files (INFO2, EVT, AV logs, etc.) do not themselves contain sensitive data, this information can be extracted from an image, compressed and protected as necessary, and sent to another analyst to conduct the assembly and analysis. What this leads to is faster response, answers delivered to the customer sooner, and much quicker resolution of an incident.
