Monday, May 21, 2012

On 19 May, something of a firestorm seems to have erupted. While I don't know most of what's been said, nor by whom, some friends have expressed concern over what they've seen, so I thought I would take a moment to apologize for my actions. I realize that my actions were wrong, and for that, I sincerely apologize.
Around the beginning of the year (to be honest, I don't remember the exact date), I posted my Jump List parsing code to my Google Code site. The archive I posted consisted of two Perl modules (JumpList.pm and LNK.pm), two Perl scripts (jl.pl and jl2.pl), and a PDF user guide. During development and testing, I found that the jl.pl script did not work correctly; it failed to function at all, quitting with an error about a call to a function (getLNK()) that did not exist within the module, so the Perl interpreter would not allow the script to run. I opted to leave this code in place and post the archive, in order to see how many people would download it, and of those, how many would report any issues.
On 2 Jan, I posted updated, corrected code to the download site.
The Issue
On 15 May, I posted a comment to Jason Hale's blog. Here's the comment that appears to have caused all of the issues:
well, that's the one I wrote so that it wouldn't work...it was a social experiment, to compare the number of downloads to the number of folks who say anything about it not working. So far, only 2 people have said anything about issues with the script...
On Saturday, 19 May, I received emails from two friends while I was offline, stating that there were discussions about my comments occurring online. When I did get back online later in the day, I could not see most of those comments, so I don't know what was being said. Further, I do not have a Google Plus account, so I could not see what was discussed in that forum, either. I did receive an email on Sunday morning (sent late Saturday night) in which someone expressed their concerns and feelings to me.
Was there any deception?
Absolutely not. The jl.pl script did not work. It did not report incorrect or deceptive data, and there was no "joke app". The Perl interpreter did not allow the script to run at all, regardless of what arguments or data were passed to it; the same error would have occurred in every case. The script never processed any data.
Does this issue affect any other code?
Absolutely not.
This script is in no way associated with RegRipper. It is not part of RegRipper, nor any other tool or script I've ever released.
Have you conducted any other "social experiments" without anyone's knowledge?
Absolutely not. Nor will I do so again.
My Apology
I realize that what I said in my comments to Jason Hale's blog, specifically the two words "social experiment", appears to be what generated the reaction that many are seeing online. I sincerely apologize for the use of the term, and I sincerely apologize for purposely and knowingly releasing code that did not work. Doing so was poor judgment, and I realize that, and apologize for my actions. I have not done anything like this before, and it will never happen again. I am deeply sorry for any feelings of mistrust or betrayal my comments caused.
Friday, May 18, 2012
Good time for some 0xBADC0FEE
Every now and then, I'll post (to this blog or to an online forum) on the topic of sharing and collaboration within the community, and how it benefits those who are actively involved. Actively engaging with others in the field has worked out very well for me in a number of circumstances, both during casework and research, and I seek out that kind of active engagement and recommend it to others.
Here are some examples of how active engagement has been a benefit:
I was working on a malware detection case a while back, and by actively engaging with our malware analysis guy, we were able to provide much more information and intelligence to the customer than either one of us could have working separately...the final result was much greater than the sum of its parts. I provided information to the malware analyst regarding where the malware was found, artifacts associated with it, how it was likely launched, etc. He was able to successfully launch the malware on a similar platform, and then provide information and intel from memory analysis that was then used to further extend the host-based analysis. Working together in this manner, we were able to collect much more information in a timely manner, which allowed us to extend that information into intelligence by collecting and incorporating information from external sources. The final turnaround on all of this was 4 1/2 days, where the customer would have been happy with (and was more used to) 6 months.
Last January, when returning from a conference, a friend mentioned to me that he had an NT 4.0 system (yes, that's exactly right...NT 4.0) to examine, and the Event Logs had apparently been cleared. I was able to offer him a solution for recovering Event Log records from unallocated space, allowing him to continue and extend his analysis.
Okay, so continuing on...not long ago, the folks at Mandiant posted regarding the AppCompatCache Registry value (which is where the "0xbadc0fee" reference in the post title comes from...), and provided example code. This was a great bit of sharing with the community, and I was able to not only find and correct an issue with the tools I was using, but also create a RegRipper plugin based on the information and code they provided. From there, I started looking at other locations within the Registry that could provide indications of program execution, and created plugins for these, as well.
Then when I updated the Prefetch file parsing code in preparation for some upcoming training courses, I added some 'filters' to the code, so that certain artifacts (modules with "temp" in their path, etc.) were extracted and highlighted.
I was exchanging code and emails with Corey Harrell, engaging in discussions with him with respect to the usefulness of both the code and analysis techniques, and he suggested adding similar filtering capability to some of the already available RegRipper plugins (MUICache, UserAssist, etc.)...something I'd considered, though it hadn't occurred to me how useful it might be. Engaging with Corey made it clear that these filters would be valuable, and he suggested additional filters that would be useful in other areas, as well. In short, the idea was to take the breadth of both of our experiences and put that into filters that would illustrate to other analysts, "look, here's a bunch of data, but here are some things you might consider looking at sooner rather than later...".
It was that active sharing and collaboration that not only served to extend our own individual knowledge through the exchange of our experiences, but also allowed us to extend the tools themselves. Clearly, this would then have a cyclic effect...running the updated tools would likely show us artifacts that we might have missed, extending our knowledge and experience, which we could then use to extend the tools, etc.
One of the things I really appreciate about engaging with Corey is that he puts a great deal of thought into things. We're all busy, and I don't expect immediate responses from anyone, but I can always count on Corey to come back with something thoughtful, such as, "why would you want to do that?", or "how about if we did it this way?" It's this kind of thoughtfulness when engaging that allows all parties involved to grow and develop. Sometimes, when you're used to doing something a certain way, or if you're really actively engaged on something, a simple question like, "why?" can really provide a great deal of value.
If you haven't done so already, I highly recommend that you vote for Corey or his blog in the 2012 Forensic4Cast awards. Particularly in the last year, Corey has contributed a great deal to the community as a whole, and it would really be doing him justice if he were to win at least one of the awards he's nominated for. Also, he'll be at the SANS Forensic Summit in Austin this year, so be sure to buy him a beer, regardless.
You'll notice that throughout this post, I kept using the term "active" or "actively". I did this to make a distinction; downloading code or tools that someone provides is not actively engaging.
Wednesday, May 16, 2012
Tool Updates
In preparation for some upcoming training courses that I'll be delivering, I've been updating some of the tools that I use (in my own work, as well as in the training I provide). I've also been compiling them (via Perl2Exe) so that folks don't have to have Perl installed in order to run them.
One of the tools I've updated is pref, used for parsing Prefetch file metadata. Some of the updates allow for a modicum of Prefetch analysis to occur automatically; for example, when running the tool across a number of Prefetch files, you can look for outliers (with respect to the module paths). Another capability I've added to the tool is to automatically look for '.dat' and '.exe' files, as well as look for paths that include "temp".
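To give a rough sense of what these filters amount to (this is just a sketch of the idea, not the actual pref source, and the sample paths are hypothetical), each module path is simply checked against a few patterns:

#!/usr/bin/perl
# Sketch of the path filtering idea; the paths below are examples of
# what would be parsed from a Prefetch file's module list.
use strict;
use warnings;

my @paths = (
'\DEVICE\HARDDISKVOLUME1\WINDOWS\SYSTEM32\NTDLL.DLL',
'\DEVICE\HARDDISKVOLUME1\WINDOWS\SYSTEM32\INETSRV\RPCALL.EXE',
'\DEVICE\HARDDISKVOLUME1\DOCUME~1\VMWARE\LOCALS~1\TEMP\DEL10.BAT',
);

my @exes  = grep { m/\.exe$/i } @paths;   # second EXEs beyond the target itself
my @dats  = grep { m/\.dat$/i } @paths;   # index.dat and friends
my @temps = grep { m/temp/i }   @paths;   # anything loaded from a "temp" path

print "EXEs found:\n",       map { "$_\n" } @exes;
print "DAT files found:\n",  map { "$_\n" } @dats;
print "Temp paths found:\n", map { "$_\n" } @temps;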
I ran pref against a Prefetch file known to be associated with malware, and this is what I saw at the end of the output:
EXEs found:
\DEVICE\HARDDISKVOLUME1\SYSTEM VOLUME INFORMATION\_RESTORE{00D8A395-89D5-46B8-A8
50-E02B0F637CE5}\RP2\SNAPSHOT\REPOSITORY\FS\SMS.EXE
\DEVICE\HARDDISKVOLUME1\WINDOWS\SYSTEM32\INETSRV\RPCALL.EXE
DAT files found:
\DEVICE\HARDDISKVOLUME1\DOCUMENTS AND SETTINGS\VMWARE\LOCAL SETTINGS\TEMPORARY INTERNET FILES\CONTENT.IE5\INDEX.DAT
\DEVICE\HARDDISKVOLUME1\DOCUMENTS AND SETTINGS\VMWARE\COOKIES\INDEX.DAT
\DEVICE\HARDDISKVOLUME1\DOCUMENTS AND SETTINGS\VMWARE\LOCAL SETTINGS\HISTORY\HISTORY.IE5\INDEX.DAT
Temp paths found:
\DEVICE\HARDDISKVOLUME1\DOCUMENTS AND SETTINGS\VMWARE\LOCAL SETTINGS\TEMPORARY INTERNET FILES\CONTENT.IE5\INDEX.DAT
\DEVICE\HARDDISKVOLUME1\DOCUME~1\VMWARE\LOCALS~1\TEMP\DEL10.BAT
None of this was surprising, as one of the modules loaded by the malware was WinInet.dll. So what does this tell us?
Well, first off, the malware file was sms.exe...and look where it was launched from. And why is there a second .exe file listed as one of the modules?
Next, check out the .dat files...kind of interesting. We should expect this, as the WinInet API is used...but this also tells us where to go to get additional data. And because we were also able to get the last time that the program ran, we know where (what time) we should look in the index.dat files for artifacts.
Finally, we see two files with "temp" in their paths...one of them a .bat file. Now, we may be able to recover this file, if it was deleted. Hey, wouldn't it be cool if this was really a .bat file and the contents were resident within the MFT entry?
I've also updated the Jump List parsing script, jl, and compiled it.
Another tool that I've updated and compiled is jobparse, for collecting metadata from Windows XP/2003 .job files.
I still have evtparse, for parsing Windows XP/2003 Event Logs, and I've added evtxparse, for parsing the output of LogParser, run against Vista+ Windows Event Logs.
I've updated recbin to parse both Windows XP/2003 INFO2 files as well as the $Ixxxxxx files from Vista+ Recycle Bins. There are a few other odds and ends that I have (and have updated) as part of my toolkit, as well.
Oh, and yes, all of the tools are capable of formatting output to TLN format, for inclusion in the events file.
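For those who haven't seen it, a TLN event is just a five-field, pipe-delimited line (time|source|system|user|description), with the time expressed as a Unix epoch value; a hypothetical pref entry might look something like this:

1337107200|PREF|HOSTNAME|-|SMS.EXE last run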
These tools will be available as part of the training courses.
What's New
Timelines
The Sploited blog has part 3 of the Forensic Timeline for Beginners series of posts up, and there are a couple of interesting and useful things mentioned in the post that I wanted to point out.
First, the posts make use of Lance's first practical to illustrate developing a timeline. Why is this interesting? Well, even though Lance has closed his blog, he has left the content up. So, while it's sad to see Lance go, I'm grateful that he's left the information and images up on the 'net. This first practical has some very interesting artifacts.
The post includes the statement:
"For some reason RegRipper said this registry entry was not found. I'm not sure why this was at this point and decided to move on and try the application ProDiscover."
Okay, we can see from the graphic that follows that statement that the key in question does not appear to exist in the System hive within the image. Creating a ProDiscover project is a great way to troubleshoot the issue; "hey, this tool says that this artifact isn't available, I'd better check it out." I've heard from a couple of folks who use RegRipper and have told me that it has reduced the Registry analysis portion of their exams from days or hours to just minutes; if that's the case, and it's only taking you a few minutes to do something that normally takes hours, why not take a minute to troubleshoot or verify your findings? It's pretty quick and easy to do...
Overall, this has been a very interesting set of blog posts for me to follow along and read, as this mirrors much of what I see when I conduct training courses on Timeline Analysis, as well as other Windows Forensic Analysis topics.
SQLite
Here is an excellent article that addresses the SQLite Write Ahead Log (-wal file). While SQLite databases are found in a number of locations on Windows systems (Firefox places.sqlite and the Chrome History file, as well as in iTunes iPhone backups, etc.), they aren't only found on Windows systems. As such, this information can be very important to analysts who encounter a number of different target platforms.
Cyber Espionage
This post to the ShadowServer blog provides some excellent insight into the world of cyber espionage, and specifically strategic web compromises. Some of what's discussed in the post takes me back to what I mentioned previously in this blog regarding defense driven by intelligence, not FUD.
LFO
According to Grady Summers over at the Mandiant blog:
"...the very systems that hold the data targeted by an attacker are probably the least likely to have malware installed on them."
I think that's a very interesting distinction to make, and one that goes right along with Peter Silberman's comment a while back about LFO, or "least frequency of occurrence".
e-Evidence
The e-Evidence "what's new" page was updated just a bit ago...this site is always an excellent resource for information that you may not have known was out there and available.
Tuesday, May 08, 2012
Approximating Program Execution via VSC Analysis with RegRipper
I recently listened to Ovie and Corey on the latest CyberSpeak podcast, and wanted to combine what I'd heard them discuss with respect to the latest release of RegRipper, and provide a technique for analysis that incorporates VSCs.
Now, one of the things we may run across during our analysis, if we create a timeline, is that we may have a Registry key that was modified in some way during a particular time window of interest. There are a number of Registry keys for which all we have available is a LastWrite time (which is analogous to a file's last modification time), but we do not know what that modification entailed.
For example, some keys maintain a "most recently used" (MRU) list, and we know that whatever the most recent activity was (user accessed/viewed a file, etc.), we can be pretty sure that, in most cases, the activity we see is associated with the key's LastWrite time. However, we may see other keys that simply contain values and subkeys, with LastWrite times that occur during our window of interest. Examples of these keys include the Run key (in both the Software and NTUSER.DAT hives), the MUICache key, etc. We don't know, simply by looking at the contents of the key, what change may have occurred; was a value or subkey added or deleted?
So, given that we have an image of a Windows 7 (or Vista) system, which may have a number of VSCs available, how can we attempt to determine what that change may have been?
Well, the first thing you want to do is mount the image as a volume on your Windows 7 analysis system in a manner that allows you to access the VSCs within the image; perhaps the easiest way to do so is via the VHD method. Once you have the image mounted (let's call it V:\), you'll want to determine which VSCs are available using the vssadmin command:
vssadmin list shadows /for=v:\
You'll see a bunch of stuff go by, and you'll want to look for lines that look like this:
Shadow Copy Volume: \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy8
A really easy way to cut through all of that stuff is to use the following command line:
vssadmin list shadows /for=v:\ | find "Shadow Copy Volume: \\?\GLOBALROOT"
Okay, so now you should have a list of VSCs from the image; for the sake of this example, let's say that you have VSCs numbered 23 through 30. How can we use this information to compare the values in, say, the Run key from the Software hive? We know that the 'soft_run.pl' plugin lets us output the contents of this key, but how can we easily retrieve information from the previous versions of the Software hive within the VSCs?
Let's create a batch file to automate this for us. A simple 'for' loop in a batch file looks like this:
for /L %%i IN (23,1,30) DO @echo %%i
If you put the above command in a batch file ('test.bat') and then run the batch file, you'll see the numbers 23 through 30 echo'd to the console. We can use variables in our batch file by substituting the first and last numbers with '%1' and '%2', respectively:
for /L %%i IN (%1,1,%2) DO @echo %%i
Okay, that's easy enough. Now, all we need to do is get RegRipper to parse this information for us. We can do that using the latest version (2.5) of RegRipper, and specifically rip.exe.
for /L %%i IN (%1,1,%2) DO rip.exe -r \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy%%i\Windows\system32\config\Software -p soft_run
We would save this command into a batch file (name it "vsc_run.bat") in the same directory where we have rip.exe, and then run it using the following command line:
vsc_run.bat 23 30
That's it. That's all it takes. And the cool thing is that we can save this batch file and use it again and again, or modify it in some way for future use. One way to modify it would be to output everything to a file, rather than just allowing everything to fly by at the console. To do that, simply add " >> output.txt" to the end of the command within the batch file. You can also add comments to the batch file, via normal batch file scripting techniques.
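Putting those modifications together, the batch file might look something like this (the REM comments and the output redirection are the only additions to the command shown above):

@echo off
REM vsc_run.bat - run the soft_run plugin against a range of VSCs
REM Usage: vsc_run.bat [first VSC number] [last VSC number]
for /L %%i IN (%1,1,%2) DO rip.exe -r \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy%%i\Windows\system32\config\Software -p soft_run >> output.txt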
This is just another example of how we can use already-available tools and techniques to build better and more efficient analysis methodologies. Why sit around staring at your monitor when there are easy and efficient ways to extract the data that you need for analysis?
Here's another use for this technique; at the request of a member of LE, I recently modified the currently available 'ares.pl' plugin (for parsing Registry keys for the Ares P2P application) to parse out just specific information, which includes decoding the search terms typed in by the user. If a Windows 7 (or Vista) system were being examined, you could use this technique to see the changes in several of the values, including the user typing in search terms, over time. The key that maintains the search terms does not include an MRU value, so you can use this technique to easily parse out exactly what you're looking for, and rather than seeing "between these two dates, the user typed in 30 search terms", you could see the progression of the terms typed in over time, across the available VSCs.
Following that same technique, I've seen instances where administrators have "taken remediation actions" and "cleaned up" prior to systems being reported as compromised and acquired. Using the contents of the MUICache key (located in the USRCLASS.DAT hive on a Win7 system), I found that this little bundle of joy had been run on the system, but "cleaned up". Using this technique, I could narrow down when the application had been run, and then go into my timeline to see what else was going on around that time.
Note that this technique can be used to approximate the time of program execution on systems where application Prefetch files are not available, either because they were deleted, or because the system is a server (Win2003, Win2008) and doesn't have application prefetching enabled by default.
With that kind of pin-point, surgical accuracy, this kind of sounds like Sniper Forensics to me!
How not to get p0wned by RR v2.5
I recently provided a minor update to the RegRipper tools, moving to v2.5. As there was no modification to how the tools would interact with the plugins, I only provided the tools themselves, including both the Perl scripts (source code) and Windows executables, compiled via Perl2Exe. I did not include the contents of the plugins directory along with the distribution, as I figured folks who were using the tool would just copy the files over their current installation.
Since the release of the updates, I've received a couple of comments about the RegRipper GUI not working properly. Some folks are finding that the "Plugins File" drop-down box will not be populated, and the assumption appears to be that the tool isn't reading the plugins directory, even though the "plugins" directory exists and contains plugins. When you launch the GUI, one of the things that happens is that the GUI looks in the "plugins" directory for any files that do NOT have an extension, and assumes that these are profiles. What appears to be happening is that while the directory contains plugins, it does not contain profiles...these are the files that tell RegRipper which plugins to run. By default, those profiles are "ntuser", "sam", "security", "software" and "system", all without any extension (by that, I mean that the file name does not end with '.txt' or anything else).
What I think may be happening is that folks are creating fresh installations of the tool; they're downloading the new version and putting it into its own directory, and then getting the plugins archive file from here; this archive does not contain the profiles.
What you can do in that case is copy the profiles over from your RR v2.02 install, or simply create your own profile. A really easy way to do that is to go to your RR v2.5 install directory, open a command prompt, and type:
rip -c -l > plugins.csv
Open the resulting file in Excel and sort the rows, based on the hive column. Another way to do this for individual hive files is to use a command such as the following:
rip -l -c | find ",Software,"
This will list just the plugins that are intended to be run against the Software hive, their versions, etc.
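If you'd rather build your own profile from that output, keep in mind that a profile is nothing more than a plain-text file (with no extension) listing one plugin name per line; a trimmed-down, hypothetical "software" profile might look like this (the plugin names are examples):

appcompatcache
soft_run
winver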
Sunday, May 06, 2012
RegRipper: Update, Road Map
I thought that, due to some changes in how things were developing with respect to RegRipper, it was time to take a look at a couple of things that had been requested, and to go ahead and include some updates. After all, RegRipper hasn't been updated in a while...not that the tool itself particularly needed it, as RegRipper seems to be doing well; I'd think that it would be the plugins that need updating. Still, there were a couple of things sitting scattered about my workbench that I could include in RegRipper. As such, I opted to break things out into an update for a short-term release, and then include a road map of what I'm looking to do in the future.
**RegRipper v2.5 is posted here, and should be up on the RegRipper site shortly.
Updates - RegRipper v2.5
1. Recompile RegRipper (via Perl2Exe) with the updated version of the Perl module Parse::Win32Registry, which went to version 1.0 at the end of April. During some testing and development, I'd found that the version of the module available (v.0.60) didn't support "big data", so I contacted the author and developed my own patch (posted here). The author responded very quickly and issued an update, which can be downloaded here, as well as installed into ActiveState Perl via PPM, either at the command line ("ppm install parse-win32registry") or via the GUI.
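For anyone scripting against the module directly, a minimal sketch of opening a hive and reading a key looks like the following (the hive file name and key path are examples only):

#!/usr/bin/perl
# Minimal sketch of using Parse::Win32Registry directly; the hive file
# name and key path are examples.
use strict;
use warnings;
use Parse::Win32Registry;

my $registry = Parse::Win32Registry->new("Software")
    or die "Could not open hive file\n";
my $root_key = $registry->get_root_key;

# Walk to a subkey, print its LastWrite time, then list its values
if (my $key = $root_key->get_subkey('Microsoft\Windows\CurrentVersion\Run')) {
    print $key->get_path, " [LastWrite: ", scalar gmtime($key->get_timestamp), "]\n";
    foreach my $val ($key->get_list_of_values) {
        print "  ", $val->get_name, " -> ", $val->get_data, "\n";
    }
}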
2. Remove error checking for finding the hive files, so that the tools can be run against files within VSCs, via
\\?\GLOBALROOT\Device\HarddiskVolumeShadowCopyX (see Corey Harrell's blog post). I ran a test against my own live, running system using the following command:
rip.pl -r \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy13\Windows\system32\config\System -p compname
I got the following output:
Launching compname v.20090727
ComputerName = ENZO
TCP/IP Hostname = enzo
This seems to work pretty well. Thanks to Corey for documenting this technique. Keep in mind that, with the error checking removed, misspelling a path will result in errors that are not the fault of the tool itself.
3. Add a GPL v.3.0 license to the tool
In this case, I'm releasing just an updated version of the tool. The plugins themselves will all continue to work just as they are...none of what was done to the tool itself has any impact whatsoever on the plugins.
RoadMap - RegRipper v3.0
1. Update GUI to allow analyst to point at a mounted image
2. Implement "osmask" value within plugins
The "osmask" value is an 8-bit (little endian) value maintained in the configuration of a plugin, and something that can be read by external programs. The bits, values, and what they will refer to are listed in the table below:
Bits   | 7        | 6        | 5          | 4    | 3           | 2             | 1       | 0
OS     | Reserved | Reserved | Win8Server | Win8 | Win7/2008R2 | Vista/Win2008 | Win2003 | WinXP
Values | 128      | 64       | 32         | 16   | 8           | 4             | 2       | 1
As such, a value of "3", expressed as either decimal or hex, would indicate that the plugin is intended for Windows XP and 2003, whereas a value of "8" would indicate that the plugin was intended for Win7 and Win2008R2. In this way, if the OS version of the target system is known, the application can bypass the plugin, and not run it, logging the reason to the RegRipper log file. This isn't something that the analyst running the tool really needs to worry about, but it does help with questions that I get, such as, "I ran the ACMru plugin against a Windows 7 system, and the result says 'key not found'...why is that?"
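To illustrate the mechanics (this is a sketch, not actual RegRipper code; the bit values follow the table above), the check amounts to a single bitwise AND:

#!/usr/bin/perl
# Sketch of the osmask check; bit values follow the table above.
use strict;
use warnings;

my %os_bits = ( "WinXP" => 1, "Win2003" => 2, "Vista/Win2008" => 4, "Win7/2008R2" => 8 );

my $plugin_osmask = 3;                        # example: WinXP + Win2003
my $target_os     = $os_bits{"Win7/2008R2"};  # OS of the system being examined

if ($plugin_osmask & $target_os) {
    print "Plugin applies to this OS; run it.\n";
}
else {
    print "Plugin not intended for this OS; log the reason and skip it.\n";
}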
I'm also considering doing something similar with a "hivemask" value, which would be similar, except that it would be an "OR" function. By this I mean that a plugin could be run against the Software hive OR an NTUSER.DAT hive, rather than both. It's simply a matter of too many moving parts...having a plugin that requires multiple hives would return incorrect information if the analyst didn't include both hives.
This value will have to be added to the plugin by the plugin author.
3. Implement "categories" value within plugins
Adding "categories" to the configuration of a plugin will not only provide more information about the plugin, but also provide a facility for running an entire category of plugins against several hive files at once.
The "category" field within plugins can have multiple entries, comma separated. When the GUI initiates, it can search through all of the plugins and populate a drop-down box within the GUI with a unique list of the available categories, from the which the analyst can choose.
This will require that the author of the plugin populate the 'categories' field, and spell the category correctly; otherwise, the misspelled categories will appear as unique choices.
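As a sketch of how this might look within a plugin's existing %config hash (the 'category' entry is the proposed addition, and the values shown are examples only):

# Hypothetical plugin configuration; 'category' is the proposed field
my %config = (hive          => "Software",
              osmask        => 3,        # WinXP + Win2003, per the table above
              category      => "autostart,malware",
              hasShortDescr => 1,
              hasDescr      => 0,
              hasRefs       => 0,
              version       => 20120506);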
4. Keep the CLI version of RegRipper (i.e., rip) as is
5. Produce more plugins, particularly for timeline (TLN) format output
6. I'm going to reserve the release of RR v3.0 to coincide with an upcoming Registry Analysis course.
Now, I know that there are things that some folks have asked for...such as different output formats. One of the biggest challenges about this is that those asking for these features really don't have much more than "I want XML..."; they don't provide a style sheet or schema. For my part, that's a difficult request to work with...the data produced by the plugins as a whole doesn't fit well into a standard schema. That is, different plugins look for different things and display their results in different ways.
One final thing...if you're using RegRipper in a training course, I'd greatly appreciate hearing/seeing what it is you're providing with respect to the tool. You're welcome to continue using the tool...I'd simply be grateful to know what's being taught, that's all.
Saturday, May 05, 2012
Trusted Adviser
I've blogged before regarding the need for a "trusted adviser", and I recently had an opportunity to respond to a query and, yet again, recommend one.
This time, however, it was a little different, in that the initial question had to do with asking forensic analysts what they would do to educate prosecutors on what is available and what can be achieved from digital forensic analysis. The short story is...a lot. But that doesn't really help answer individual questions as they come up. So, providing an initial brief and then extending that to include something more regular and even on-call would be something that I would recommend.
I did an exam once where our team was asked to look at a bunch of EXE files pulled from a system, in relation to a CP case. While we were delivering our report on the exam, I asked the attorney if she was interested in answering the "Trojan Defense", and she responded that she was...which is why she'd asked us to look at the EXEs. I suggested to her that another approach might be to look to see what files (images, movies, etc.) the various user accounts had been used to view, and provide that list to her, along with dates and times, as applicable. She seemed very appreciative, and provided the necessary data we needed in order to provide the list. Additional information beyond what was requested had been provided, so we were also able to answer other questions specific to remote logins to the systems. The issue ended in a plea agreement. However, that route (Registry analysis) wasn't something that had been considered as a resource previously...so the "trusted adviser" role proved to be very effective.
Having a trusted adviser can be extremely beneficial. You can ask questions about what is available given specific data, and even have someone on-hand, that you trust, to do the actual work. This applies not only to public sector (LE, gov't, etc.) but also to the private sector, as well. Many times corporations may purchase an expensive product because they haven't clearly defined their needs in their own mind, and they get that "clarity" from someone in sales. Sometimes, it may simply be beneficial to contact someone to help you define what you're looking at or trying to do.
Friday, May 04, 2012
Links and Tools
Windows 8 Forensics Guide
You can now find a free Windows 8 forensics guide over on the Propeller Head Forensics blog. Amanda's guide is a great way to get started learning about some of the new things that you're likely to see in Windows 8 (if you aren't already running the Consumer Preview).
I had an opportunity to meet and listen to Christopher Ard of MS talk about some of the neat new features of Windows 8 recently at the Massachusetts Attorney General's Cyber Crime Conference. I also sat in on Chris Brown's presentation on ProDiscover, and he stated that he's working on adding support for the new ReFS file system to ProDiscover. Looks like there are lots of cool things on the horizon with Windows 8 forensic analysis.
Timelines
The Sploited blog has posted part 2 (part 1 is here) of the Forensic Timelines for Beginners series, in which they discuss creating timelines using the tools and techniques illustrated in chapter 7 of Windows Forensic Analysis Toolkit 3/e.
File System Behavior
There's an interesting thread over on the Win4n6 Yahoo Group regarding file system behavior when files are deleted, including being removed from the Recycle Bin. During the thread, one of the members made the statement that during some vendor training, they'd been told that when files are deleted, Windows will automatically securely wipe them. This is, in fact, not the case, as Troy Larson clearly states during the thread.
What this does bring up, as Troy says later in the thread, is that Windows systems are extremely active under the hood. During the thread, several members said that they did their own testing and found that files were not securely deleted...what this comes back to is that some files may be very quickly overwritten by normal system activity. This is something that I've pointed out in my books and presentations for some time, particularly when talking about the need for immediate response. Troy even mentions in a follow-up post that "just opening and editing a Word file creates several temporary and scratch files--more than you would image." Even with no specific user interaction, Windows systems have a great deal of activity that goes on behind the scenes...look at some of the performance enhancements for XP described here, in particular in the "Prefetch" section. Windows 7 is very similar, in that it ships with a Scheduled Task that performs a defrag once a week, and another that backs up the main Registry hives every 10 days. Add to that all of the other activity that occurs on Windows systems, and it's not surprising that some folks are seeing, on an inconsistent basis, what appears to be Windows securely wiping files upon deletion. This is very important for DF analysts to keep in mind while performing analysis and file or record carving, but also for incident responders, particularly when developing IR procedures...the more immediate the response, the fresher and more pristine the data you will be able to preserve.
Resources
MS File System Behavior Overview
SQLite WAL Files
The DigitalInvestigation blog has an excellent post on SQLite Write Ahead Log files, and their potential as a forensic resource. I've seen these, as well, in the course of forensic impact analysis, and this is a very good read for folks who want to get a little bit familiar with what these files are all about, and how they can be useful during an examination.
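If you want to see whether a database you've collected even uses write-ahead logging, a quick check is possible from Perl. This is just a sketch (it assumes DBD::SQLite is installed), and for forensic work you'd run it against a working copy...simply opening a live WAL-mode database can trigger a checkpoint and alter the files.

#!/usr/bin/perl
# Minimal sketch: report a SQLite database's journal mode. If the mode
# is "wal", a sidecar "<name>-wal" file may hold pages that haven't yet
# been checkpointed into the main database file.
# NOTE: run this against a working copy, not original evidence.
use strict;
use warnings;
use DBI;

my $db = shift or die "Usage: $0 <sqlite_db>\n";

my $dbh = DBI->connect( "dbi:SQLite:dbname=$db", "", "", { RaiseError => 1 } );
my ( $mode ) = $dbh->selectrow_array( "PRAGMA journal_mode;" );
print "$db journal mode: $mode\n";
print "Check for a sidecar WAL file: ${db}-wal\n" if ( lc( $mode ) eq "wal" );
$dbh->disconnect();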
Scripting
Melissa's got a very good post up that demonstrates how useful scripting skills (or "skillz") can be. Over the years that I've done infosec work, I've found that an ability to write scripts has been invaluable, and I've found that to be even more true in the DFIR realm. I once held an FTE position where I wrote a Perl script that would reach out across the enterprise, locate all systems that were turned on, query certain Registry keys and return the results to me. As I began investigating my findings, I was able to develop a white list, and within relatively short order got to the point where I could launch the script before lunch, and return to find a report that was about half a page in length.
I was able to provide a viable solution that worked extremely well in my environment (rather than fitting the problem to a commercial tool), for free.
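For folks curious what something like that can look like, here's a minimal sketch of the same general idea...this is not my original script, and the host list, key path, and whitelist are purely illustrative assumptions. It uses the Win32::TieRegistry and Net::Ping modules, and assumes admin rights on the target systems.

#!/usr/bin/perl
# Hypothetical sketch: sweep a list of hosts, query a Run key from each
# remote Registry, and report only entries not already on a whitelist.
use strict;
use warnings;
use Net::Ping;
use Win32::TieRegistry ( Delimiter => "/" );

my @hosts     = qw( host1 host2 host3 );              # assumed inventory list
my %whitelist = map { lc $_ => 1 } ( "ctfmon.exe" );  # assumed known-good names
my $key_path  = "LMachine/Software/Microsoft/Windows/CurrentVersion/Run";

my $ping = Net::Ping->new( "tcp", 2 );  # crude "is it on?" check; your
                                        # environment may need another method
foreach my $host ( @hosts ) {
    next unless ( $ping->ping( $host ) );   # skip systems that are offline

    # Connecting to a remote Registry requires admin rights on the target
    my $key = $Registry->Connect( $host, $key_path );
    if ( !$key ) {
        warn "$host: unable to connect to remote Registry\n";
        next;
    }

    # With Win32::TieRegistry, tied-hash entries prefixed with the
    # delimiter ("/") are value names; everything else is a subkey
    foreach my $entry ( keys %$key ) {
        next unless ( $entry =~ m{^/(.+)} );
        my $name = $1;
        my $data = $key->GetValue( $name );
        print "$host : $name -> $data\n" unless ( $whitelist{ lc $name } );
    }
}
$ping->close();

The point isn't the specific key being queried; it's that a few dozen lines of Perl, plus a whitelist that grows as you investigate your findings, can replace a manual, system-by-system review.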
If you're interested in trying out some of the things she demonstrates on your Windows box, check out the Resources section below.
Resources
Unix Command Line Tools for Windows
Utilities and SDK from MS
Unix Utilities
Encryption
Encryption has long been a thorn in the side of examiners. I've had a number of engagements where I was asked to acquire images of systems known to be encrypted, and more than a few where we found out after we got on-site that some of the systems employed whole disk encryption. In those cases, we opted for a live acquisition via FTK Imager (fully documented, of course). It appears that Jesse has found a free program that can reportedly decrypt BitLocker-protected volumes.
PE Files
If you do PE analysis, check out the CorkAmi Google Code site. There's a good deal of very good information there, as well as detailed information regarding the PE file format.
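As a quick illustration of working with the documented portions of the format, here's a minimal Perl sketch that follows the DOS header's e_lfanew field (at offset 0x3C) to the "PE\0\0" signature...a starting point for PE analysis, not an analysis tool.

#!/usr/bin/perl
# Minimal sketch: read the DOS header, follow e_lfanew to the NT
# headers, and confirm the "PE\0\0" signature.
use strict;
use warnings;

my $file = shift or die "Usage: $0 <file>\n";
open( my $fh, "<", $file ) or die "$file: $!\n";
binmode( $fh );

read( $fh, my $dos, 64 ) == 64 or die "File too short for a DOS header\n";
die "No MZ signature\n" unless ( substr( $dos, 0, 2 ) eq "MZ" );

my $e_lfanew = unpack( "V", substr( $dos, 0x3C, 4 ) );  # little-endian DWORD
seek( $fh, $e_lfanew, 0 ) or die "seek: $!\n";
read( $fh, my $sig, 4 ) == 4 or die "File too short for NT headers\n";
close( $fh );

if ( $sig eq "PE\x00\x00" ) {
    printf "PE signature found at offset 0x%x\n", $e_lfanew;
}
else {
    print "Not a PE file\n";
}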
Thursday, May 03, 2012
MASS AGO Conference
I recently had an opportunity to attend (and present at) the Massachusetts 2012 National Cyber Crime Conference, and wanted to take the time to present some takeaways.
As with many conferences, there were a number of excellent presentations, as well as opportunities for fruitful conversations outside of the classrooms.
I had an opportunity to sit next to Det. Cindy Murphy at lunch both Mon and Tues, and something we talked about on Tues was the thought that forensic analysts need to understand what their "customers" need, and end-users of analysis (investigators, attorneys, corporate senior-level managers, etc.) need to know what is possible and available. Now, this is nothing new, but it is clearly something that still needs to be addressed. When I say "still", I don't mean that it hasn't been addressed, but due to the volatile nature of jobs and the community as a whole, this is something that has to be readdressed and refreshed over time. The issue seems to be that on one side of the table we have some very technical folks (the analysts) who are performing the analysis, and on the other side we have less technical folks who need to make use of the results, whether it's an investigator or prosecutor pursuing a case, or a senior-level executive who needs to make a business decision. There's very often a translation that needs to occur, and in most cases, there isn't someone who is specifically designated to do that. As an extension of that, many cases involve the end customer asking the analyst to "find bad stuff", most often with little to no context, and then the analyst heading off to perform the work without asking any further questions. End customers need to understand what can be achieved through forensic analysis, especially if specific goals are provided.
I had another conversation shortly after a presentation that led me to understand that, as powerful as they can be, one limitation of open source tools is the lack of access to information that is only available from the operating system vendor or application developers...specifically, information regarding how certain fields within data structures are populated (created, modified, etc.). Most of the time, those creating open source tools for use in digital analysis have access to only a very limited amount of information to explain how a particular artifact is created or modified, or a field is populated, which they obtain via observation and monitoring. However, what the tool authors don't have access to is the inside information regarding all the ways in which the various fields are populated, managed, and modified. Examples of this include (but are not limited to):
- The algorithm used to manage the "count" value in the UserAssist subkey values (see the brief sketch following this discussion)
- What can populate the TypedURLs key within the Registry
- Mandiant's blog post regarding Shim Cache Data
As such, in most cases, open source tools are likely going to give an analyst access to information, and it is up to the analyst to collect and correlate additional information, in order to build context to the data and provide a more accurate interpretation of it...yes, this is a pitch for timeline analysis. ;-)
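To illustrate the distinction with the UserAssist example: the ROT-13 encoding of the value names is well known and was worked out through observation alone, so decoding them is trivial, as in the sketch below; what observation can't fully confirm is the vendor's internal logic for maintaining the associated "count" data.

#!/usr/bin/perl
# Minimal sketch: decode a ROT-13 encoded UserAssist value name.
use strict;
use warnings;

my $name = "HRZR_EHACNGU";                           # XP-era value name
( my $decoded = $name ) =~ tr/A-Za-z/N-ZA-Mn-za-m/;  # ROT-13
print "$name -> $decoded\n";                         # UEME_RUNPATH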
Overall, I think the conference went extremely well. I spoke to a number of attendees who were very happy with how the conference was set up and organized, as well as the content that was offered. I'm told that the conference had 460+ attendees, and was very well received. I hope to be invited to future events.
Thanks to the folks who set up the conference, and a special thanks to Natasha and Chris for making the conference the success that it was!
Wednesday, May 02, 2012
ASI Training
At ASI, we are developing digital forensics training courses, and our first offerings will be Intro to Windows Forensic Analysis (4-5 June) and Timeline Analysis (18-19 June). Both courses will be held at our Reston, VA, training facility, and the online course descriptions can be found here. Both courses focus on Windows 7 as the platform being analyzed, but the tools and techniques discussed can also be used on other versions of Windows (XP, Win2003, Win2008, and even Windows 8).
The purpose of the Intro to Windows Forensic Analysis course is to provide analysts with a detailed understanding of forensic resources and artifacts that are available within a Windows 7 system, so that they can extend their analysis and findings. As versions of Windows have progressed, there have been more and more forensic artifacts automatically created as a user or as malware interacts with the operating system environment. As such, this introductory course is intended to give attendees an in-depth view of what resources are available and how they can be accessed for pertinent data. This course will also illustrate how the data can be used in a variety of instances to further develop the analyst's findings. The course offers several hands-on exercises where attendees will work with tools and actual data in order to develop their skills.
The purpose of the Timeline Analysis course is to provide attendees with a thorough understanding of not only the benefits of timeline analysis, but also how to create a timeline, and how timeline analysis can be used to expedite and improve the overall analysis process. The course includes a number of hands-on, instructor-led exercises, and culminates in attendees creating their own timeline for analysis.
These courses are open to anyone. Attendees should be comfortable working at a command prompt and executing command-line interface (CLI) tools. The courses do not focus solely on the use of such tools, but several such tools are demonstrated and used by attendees so that they develop a thorough understanding of the processes presented and discussed.
ASI will also be offering a 1-day course in Registry Analysis in the near future. Stay tuned for additional offerings, and feel free to contact us for more information, as well as if you have specific training needs.