Sunday, March 01, 2009

Timeline Analysis, pt IV

My last post on this subject was starting to get a bit long, so rather than add to it, I thought it would be best to write another post.

I think that in the long run, Michael was correct in his comment that a database schema would be needed. However, for the time being, there is enough here to provide a bit of insight to an analyst, albeit one with some programming ability. Okay, so here's how this can be used all together right now...

You have an acquired image, and you run Brian's TSK tool, fls, against it to get a body file. You can then modify the mactime utility to filter the body file into a secondary file containing the five fields described in part III. At this point, we have a file in the .tln format, derived from the file system. You can also extract files from within the acquired image itself (Event Logs, Registry hives, etc.) and run them through the same filtering process. For example, evt2xls has such a filtering capability for .evt files...I ran it against a SysEvent.evt file and got entries that look like the following:

1092790209|EVT|PETER||EventLog/6006;EVENTLOG_INFORMATION_TYPE;
1092790237|EVT|PETER||EventLog/6009;EVENTLOG_INFORMATION_TYPE;5.01. 2600 Service Pack 1 Uniprocessor Free
1092790237|EVT|PETER||EventLog/6005;EVENTLOG_INFORMATION_TYPE;

As you can see from the three .tln file entries, we have fields for the date/time value based on the Unix epoch (GMT, of course), source (EVT), host or system ("Peter", taken from Ender's Game), user (blank, in this case), and the description of the event. All fields are pipe-separated.
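To make the five-field layout concrete, here's a minimal sketch of a parser for a single .tln entry. The post's tools are written in Perl; this is Python purely for illustration, and the function name is mine, not part of any tool mentioned here. It splits on the pipes and renders the epoch value as a human-readable GMT string.

```python
import time

def parse_tln(entry):
    """Split a .tln entry into its five pipe-separated fields."""
    # maxsplit=4 so any pipes inside the description field are preserved
    stamp, source, host, user, desc = entry.split("|", 4)
    return {
        "time": time.strftime("%a %b %d %H:%M:%S %Y GMT", time.gmtime(int(stamp))),
        "source": source,
        "host": host,
        "user": user,
        "description": desc,
    }
```

Running this against the first entry above would show the 1092790209 value resolving to a date in August 2004, GMT.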

Quick segue...the description field for the above events is semicolon-separated, starting with the event source and ID fields from the event record. I do this because when I have a question about Windows event records, I'll start by going to a site where you can look up events by...well...source and ID. There are other ways to look up what an event record may be indicating, including searching on MS or Google, but that's for a different post altogether.

Now, say you wanted to write a filter for events from IIS web server logs or the like. Michael wrote such a filter for McAfee AV logs in ex-tip; converting the time values in the logs to Unix epoch times is pretty straightforward using the Perl Time::Local module, applying time zone settings as necessary to normalize all values to GMT-based Unix epoch times. Doing so gives us an easy means of comparison.
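The normalization step can be sketched as follows. This is Python rather than Perl (`calendar.timegm` plays the same role as Time::Local's `timegm`); the function names and the log formats assumed here (W3C-style IIS timestamps, which are already GMT, and a US-style local timestamp needing an offset) are my own illustration, not taken from ex-tip.

```python
import calendar
import time

def w3c_to_epoch(date_str, time_str):
    """Convert a W3C log date/time (already GMT in IIS logs) to a Unix epoch."""
    parsed = time.strptime(date_str + " " + time_str, "%Y-%m-%d %H:%M:%S")
    return calendar.timegm(parsed)  # timegm treats the tuple as GMT, no TZ skew

def local_to_epoch(date_str, time_str, utc_offset_hours=0):
    """Convert a local-time log stamp to a GMT epoch by removing the UTC offset."""
    parsed = time.strptime(date_str + " " + time_str, "%m/%d/%Y %H:%M:%S")
    return calendar.timegm(parsed) - utc_offset_hours * 3600
```

With everything normalized to GMT-based epoch values, entries from completely different sources sort and compare directly.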

So, after running filters and creating output files, you may end up with several such .tln files right there in a subdirectory within your analysis file structure. At that point, if you wanted to...say...locate all events within a specific time frame, you could enter the dates into a script, have the script do the conversion and searching, and then display all of the events appropriately, where "appropriately" could mean either a text file or some sort of graphical output, such as Simile.
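A bare-bones version of that search script might look like this. Again, this is a Python sketch under my own assumptions (a directory of .tln files, five-field pipe-separated entries, epoch values already converted by the analyst); it simply collects everything inside the window and sorts it by time.

```python
import glob

def events_in_window(tln_dir, start_epoch, end_epoch):
    """Collect events from all .tln files in a directory that fall within
    [start_epoch, end_epoch], sorted by time across all sources."""
    hits = []
    for path in glob.glob(tln_dir + "/*.tln"):
        with open(path) as fh:
            for line in fh:
                line = line.strip()
                if not line:
                    continue
                # the first pipe-separated field is the Unix epoch time (GMT)
                stamp = int(line.split("|", 1)[0])
                if start_epoch <= stamp <= end_epoch:
                    hits.append((stamp, line))
    return [line for stamp, line in sorted(hits)]
```

The sorted, merged output is what makes the multi-source timeline useful: file system, Event Log, and AV log entries interleave in one chronological view.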

You may be asking questions like, "What about specific events that I'd like to add to a .tln file that aren't from within the acquired image, or that I just want to add myself?" No problem...GUIs are easy to write, whether you're using Perl or Python or whatever...heck, you could even add an event using Notepad! The key to all of this is that you still have the raw data, you've performed filtering and data reduction without modifying that raw data, and you're able to narrow down your analysis a bit.
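Even without a GUI, appending an analyst-supplied event is a one-liner. A hypothetical helper (Python again, names mine) just writes another five-field line; the original evidence is never touched.

```python
def add_event(tln_path, epoch, source, host, user, description):
    """Append a single analyst-supplied event to a .tln file."""
    with open(tln_path, "a") as fh:
        fh.write("%d|%s|%s|%s|%s\n" % (epoch, source, host, user, description))
```

An entry added this way sorts into the timeline exactly like the machine-generated ones.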

The next step, I guess, would be to put something like this together as a complete package, and run it against an available image.

Addendum: This afternoon I updated a Perl script I wrote for Windows Forensic Analysis to parse INFO2 files. The update I added was the ability to write timeline information in the format I'm using, as well as to allow the analyst to add values for the system (host name) and the user (SID or username). Works like a champ!
