Friday, December 28, 2012

Malware Detection

Corey recently posted to his blog regarding his exercise of infecting a system with ZeroAccess.  In his post, Corey provides a great example of a very valuable malware artifact, as well as an investigative process, that can lead to locating malware that may be missed by more conventional means. 


This post isn't meant to take anything away from Corey's exceptional work; he's always done a fantastic job of performing research and presenting his findings.  Rather, my intention is to present another perspective on the same data, sort of like "The Secret Policeman's Other Ball", using Corey's work and blog post as a basis and a stepping stone.

The ZA sample that Corey looked at was a bit different from what James Wyke of SophosLabs wrote about, but there were enough commonalities that some artifacts could be used to create an IOC or plugin for detecting the presence of this bit of malware, even if AV didn't detect it.  Specifically, the file "services.exe" was infected, an EA attribute was added to the file record in the MFT, and a Registry modification occurred in order to create a persistence mechanism for the malware.  Looking at these commonalities is similar to looking at the commonalities between various versions of the Conficker family, which created a randomly-named service for persistence.

Using the Registry hives from Corey's test, I was able to create and test a RegRipper plugin that does a pretty good job of filtering through the Classes/CLSID subkey (in the Software hive) and locating anomalies.  In its original form, the MFT parser that I wrote finds the EA attribute but doesn't specifically flag on it, and it can't extract the shellcode and the malware PE file (because the data is non-resident).  However, there were a couple of interesting things I got from parsing the MFT...
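The core idea behind that kind of plugin can be sketched in a few lines of Python; the CLSIDs, sample paths, and "expected" locations below are purely illustrative, not taken from the actual plugin or from the test hives:

```python
# Sketch: flag CLSID entries whose server DLL path falls outside the
# locations you'd normally expect.  All sample data here is illustrative.

EXPECTED_PREFIXES = ("c:\\windows\\system32", "c:\\program files")

def find_anomalous_clsids(clsid_entries):
    """clsid_entries: dict mapping a CLSID string to its InprocServer32 path."""
    anomalies = []
    for clsid, server_path in clsid_entries.items():
        if not server_path.lower().startswith(EXPECTED_PREFIXES):
            anomalies.append((clsid, server_path))
    return anomalies

entries = {
    "{11111111-2222-3333-4444-555555555555}": "C:\\Windows\\system32\\ole32.dll",
    "{66666666-7777-8888-9999-000000000000}": "C:\\RECYCLER\\S-1-5-18\\$deadbeef\\n.",
}
print(find_anomalous_clsids(entries))
```

A real plugin walks the live hive structure, of course, but the filtering logic is about this simple: anything serviced from a Recycle Bin path sticks out immediately.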

If you refer to Corey's post, take a look at the section regarding the MFT record for the infected services.exe file.  If you compare the time stamps from the $STANDARD_INFORMATION attribute to those of the $FILE_NAME attribute that Corey posted, you'll see an excellent example of file system tunneling.  I've talked about this in a number of my presentations, but it's pretty cool to see an actual example of it.  I know that this isn't really "outside the lab", per se, but still, it's pretty cool to see this functionality as a result of a sample of malware, rather than a contrived exercise.  Hopefully, this example will go a long way toward helping analysts understand what they're seeing in the time stamps.
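The tunneling signature can be expressed as a simple check: a file occupying a *new* MFT record (the original was deleted and re-created), but with creation times in both attributes matching the deleted original.  A minimal Python sketch, assuming some MFT parser has already pulled the attributes into dicts (the old record number and the times are illustrative placeholders; only the new record number, 42756, comes from the post):

```python
from datetime import datetime

def tunneling_signature(old_rec, new_rec):
    """Flag a file that occupies a new MFT record but whose creation
    times -- in both $STANDARD_INFORMATION and $FILE_NAME -- match the
    deleted original.  NTFS file system tunneling re-applies the original
    creation time when a same-named file is re-created in the same
    directory within the tunneling window (15 seconds by default)."""
    return (new_rec["recno"] != old_rec["recno"]
            and new_rec["si_created"] == old_rec["si_created"]
            and new_rec["fn_created"] == old_rec["si_created"])

# Illustrative values; 42756 is the new record number from the post.
old = {"recno": 1234, "si_created": datetime(2008, 4, 14, 5, 42, 0)}
new = {"recno": 42756,
       "si_created": datetime(2008, 4, 14, 5, 42, 0),
       "fn_created": datetime(2008, 4, 14, 5, 42, 0)}
print(tunneling_signature(old, new))
```

The $FILE_NAME creation time matching is what separates tunneling from most user-land timestomping, which typically leaves $FILE_NAME alone.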

Corey also illustrated an excellent use of timeline analysis to locate other files that were created or modified around the same time that the services.exe file was infected.  What the timeline doesn't show clearly is that the time stamps were extracted from the $FILE_NAME attribute in the MFT...the $STANDARD_INFORMATION attributes for those same files indicate that there was some sort of time stamp manipulation ("timestomping"), as many of the files have M, A, and B times from 13 and 14 Jul 2009.  However, the date in question that Corey looked at in his blog post was 6 Dec 2012 (the day of the test).  Incorporating Prefetch file metadata and Registry key LastWrite times into a timeline would show a pretty tight "grouping" of these artifacts at or "near" the same time.
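That $STANDARD_INFORMATION-versus-$FILE_NAME anomaly can be flagged mechanically.  A minimal sketch (the specific dates mirror the pattern described above, but this is a heuristic, not a definitive test):

```python
from datetime import datetime

def maybe_timestomped(si_created, fn_created):
    """Flag records whose $STANDARD_INFORMATION creation time predates
    the $FILE_NAME creation time.  User-land timestomping tools generally
    go through APIs that rewrite only the $SI times, so a $SI time well
    before the $FN time is a classic stomping indicator."""
    return si_created < fn_created

# $SI back-dated to Jul 2009 on a file whose $FN attribute shows the
# true creation date, 6 Dec 2012 -- the pattern seen in this test.
print(maybe_timestomped(datetime(2009, 7, 13), datetime(2012, 12, 6)))
```

Run across a whole parsed MFT, a check like this surfaces the Jul 2009 cluster immediately, without the analyst having to eyeball every record.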

Another interesting finding in analyzing the MFT is that the "new" services.exe file was MFT record number 42756 (see Corey's blog entry for the original file's record number).  Looking "near" the MFT record number, there are a number of files and folders that are created (and "timestomped") prior to the new services.exe file record being created.  Searching for some of the filenames and paths (such as C:\Windows\Temp\fwtsqmfile00.sqm), I find references to other variants of ZeroAccess.  But what is very interesting about this is the relatively tight grouping of the file and folder creations, not based on time stamps or time stamp anomalies, but instead based on MFT record numbers.
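Grouping by record number rather than by time stamp is easy to automate.  A sketch, assuming a list of (record number, path) pairs from an MFT parser; the window size and all record numbers other than 42756 are illustrative:

```python
def records_near(records, anchor_recno, window=50):
    """Return (recno, path) entries whose MFT record number falls within
    +/- window of a known-bad record.  Because record numbers are handed
    out more or less sequentially, files created in the same burst of
    activity tend to cluster -- even when their time stamps have been
    manipulated."""
    return [r for r in records
            if r[0] != anchor_recno and abs(r[0] - anchor_recno) <= window]

records = [
    (42710, r"C:\Windows\Temp\fwtsqmfile00.sqm"),   # path from the post
    (42756, r"C:\Windows\System32\services.exe"),
    (58001, r"C:\Users\user\Documents\notes.txt"),
]
print(records_near(records, 42756))
```

The point is that the clustering survives timestomping: the adversary can rewrite times, but not the order in which MFT records were allocated.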

Some take-aways from this...at least what I took away...are:

1. Timeline analysis is an extremely powerful analysis technique because it provides us with context, as well as an increased relative level of confidence in the data we're analyzing.

2. Timeline analysis can be even more powerful when it is not the sole analysis technique, but is incorporated into an overall analysis plan.  What about that Prefetch file for services.exe?  A little bit of Prefetch file analysis would have produced some very interesting results, and using what was found through this analysis would have led to other artifacts that should be examined in the timeline.  Artifacts found outside of timeline analysis can be used as search terms or pivot points in a timeline, which then provides context to those artifacts, which can in turn be incorporated back into other analyses.
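The pivot-point idea described above can be reduced to a crude filter: take an artifact found through other analysis and pull every timeline event near it.  A sketch, with made-up events (not from the actual test system's timeline):

```python
def pivot(events, term, window=120):
    """events: (epoch_seconds, description) tuples.  Return every event
    within `window` seconds of any event matching the pivot term -- a
    crude version of using an artifact found through other analysis
    (a filename, a Registry path) as a timeline pivot point."""
    hits = [t for t, desc in events if term.lower() in desc.lower()]
    return sorted((t, d) for t, d in events
                  if any(abs(t - h) <= window for h in hits))

events = [
    (1000, "Prefetch: SERVICES.EXE executed"),
    (1060, "File created: C:\\Windows\\Temp\\fwtsqmfile00.sqm"),
    (90000, "Registry: unrelated key LastWrite"),
]
print(pivot(events, "services.exe"))
```

The unrelated event a day later drops out, and what remains is exactly the "context" the timeline is supposed to provide.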

3. Some folks have told me that having multiple tools for creating timelines makes creating timelines too complex a task; however, the tools I tend to create and use are multi-purpose.  For example, I use pref.pl (I also have a 'compiled' EXE) for Prefetch file analysis, as well as parsing Prefetch file metadata into a timeline.  I use RegRipper for parsing (and some modicum of analysis) of Registry hives, as well as to generate timeline data from a number of keys and value data.  I find this to be extremely valuable...I can run a tool, find something interesting in a data set as a result of the analysis, and then run the tool again, against the same data set, but with a different set of switches, and populate my timeline.  I don't need to switch GUIs and swap out dongles.  Also, it's easy to remember the various tools and switches because (a) each tool is capable of displaying its syntax via '-h', and (b) I created a cheat sheet for the tool usage.

4.  Far too often, a root cause analysis, or RCA, is not performed, for whatever reason.  We're losing access to a great deal of data, and as a result, we're missing out on a great deal of intel.  Intel such as, "hey, what this AV vendor wrote is good, but I tested a different sample and found this...".  Perhaps the reason for not performing the RCA is that "it's too difficult", "it takes too long", or "it's not worth the effort".  Well, consider my previous post, Mr. CEO...without an RCA, are you being served?  What are you reporting to the board or to the SEC, and is it correct?  Are you going with, "it's correct to the best of my knowledge", after you went to "Joe's Computer Forensics and Crabshack" to get the work done?

Now, to add to all of the above, take a look at this post from the Sploited blog, entitled Timeline Pivot Points with the Malware Domain List.  This post provides an EXCELLENT example of how timeline analysis can be used to augment other forms of analysis, or vice versa.  The post also illustrates how this sort of analysis can easily be automated.  In fact, this can be part of the timeline creation mechanism...when any data source is parsed (i.e., browser history, the TypedUrls Registry key, shellbags, etc.), compare any extracted URLs to the MDL, and then generate a flag of some kind within the timeline events file, so that the flag "lives" with the event.  That way, you can search for those events (based on the flag) after the timeline is created, or, as part of your analysis, create a timeline of only those events.  This would be similar to scanning all files in the Temp and system32 folders, looking for PE files with odd headers or mismatched extensions, and then flagging them in the timeline, as well.
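That flagging step might look something like this in Python; the domain list and events are made up for illustration, not an actual Malware Domain List snapshot:

```python
from urllib.parse import urlparse

def flag_events(events, bad_domains):
    """events: (timestamp, url_or_None, description) tuples.  Returns
    (timestamp, description, flag) tuples so the flag 'lives' with the
    event in the timeline events file and can be searched on later."""
    out = []
    for ts, url, desc in events:
        host = urlparse(url).netloc.lower() if url else ""
        out.append((ts, desc, "MDL" if host in bad_domains else ""))
    return out

# Illustrative data only.
mdl = {"bad.example.net"}
events = [
    (1354752000, "http://bad.example.net/dl.php", "TypedUrls entry"),
    (1354752060, None, "File created: services.exe"),
]
print(flag_events(events, mdl))
```

Because the flag is written into the events file itself, filtering the finished timeline down to only the flagged events is a one-line grep rather than a re-parse.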

Great work to both Corey and Sploited for their posts!

6 comments:

  1. Harlan,

    Excellent work on explaining what you found out by parsing the MFT. I'm curious, though: what stood out to you in the $MFT that indicated the timestamp manipulation was done leveraging file system tunneling, as opposed to a different timestomping technique? I picked up on the timestamp changes, but noticing file system tunneling was a great catch and helped explain other activity on the system.

  2. Corey,

    Thanks for the comment, and good question...

    ...what stood out to you in the $MFT that the timestamp manipulation was done leveraging file system tunneling...

    Well, I had some a priori knowledge, specifically, that the services.exe file was modified. From your post, it was clear that the original file was deleted and a new one created, based on the MFT record number.

    From there, I looked at the creation dates from both attributes in the MFT record...they were the same.

    Finally, I know what file system tunneling is... ;-)

  3. Anonymous, 10:20 PM

    Great post Harlan. I continue to learn from both your post and Corey's.

    I'm glad you can see some value in the Perl script I wrote too and as always appreciate the mention and highlight.

  4. Harlan,

    Thanks for elaborating on Corey's ZeroAccess blog post. As Corey stated, I also did not see it from that angle, and your explanation was helpful. I remember reading your article on "System Tunneling", but once you pointed it out, it was all there.

    Thanks to you both for this excellent blog post. I learnt a lot.

    Lakshmi N

  6. These two posts were great, a real Maradona-Messi pairing! Thank you very much!
