I think it's great when analysts and organizations share this kind of information, so that the rest of us can see what others are seeing. So, a big thanks goes out to TrustWave, and the next time you see Chris at a conference, be sure to say hi and buy him a beer...or better yet, treat him to some bread pudding!
What I'd like to do is take a moment to go through the post and discuss some things that might add a different perspective or view to the issue, or perhaps a bit of additional information.
As you can see from the post, Chris uses timeline analysis to locate the malware in question, and he's got some really good information in the post about creating the timeline for analysis (Chris uses the log2timeline tools). I'm sure that there's quite a bit about the engagement and the analysis that wasn't mentioned in the post, as Chris jumps right to the target date within his timeline and locates the malware.
I like the fact that Chris uses multiple analysis techniques to corroborate and check his findings. For example, in the post, Chris mentions looking at the file's $STANDARD_INFORMATION and $FILE_NAME attributes in the MFT, and confirming that there were no indications of "time stomping" going on. This is a great example that demonstrates that anti-forensics techniques target the analyst and their training, and that a knowledgeable analyst isn't slowed down by these techniques. I think that the post also demonstrates how timelines can be used to add context to what you're looking at, as well as increase the level of confidence that the analyst has in that data.
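The $STANDARD_INFORMATION/$FILE_NAME comparison Chris describes can be sketched in code. This is a minimal, hypothetical triage helper (the field names and heuristics are my own, not from the post, and it assumes you've already parsed the MFT record's timestamps into datetime objects); the two classic indicators it checks for are a $SI time that predates the corresponding $FN time, and $SI times with zeroed sub-second precision, which many stomping tools leave behind because they set times at only one-second granularity:

```python
from datetime import datetime, timezone

def check_time_stomping(si_times, fn_times):
    """Compare $STANDARD_INFORMATION and $FILE_NAME timestamps from an
    MFT record and return a list of indicators suggesting time stomping.

    si_times / fn_times: dicts mapping field names (e.g. 'created',
    'modified', 'accessed') to datetime objects parsed from the MFT.
    """
    indicators = []
    for field, si in si_times.items():
        fn = fn_times.get(field)
        if fn is None:
            continue
        # $FILE_NAME times are maintained by the kernel and are much
        # harder to modify via the public API; a $SI time *earlier* than
        # the corresponding $FN time is a classic stomping indicator.
        if si < fn:
            indicators.append(f"{field}: $SI ({si}) predates $FN ({fn})")
        # Stomping tools that set times at one-second granularity leave
        # the 100ns-resolution fractional part zeroed out.
        if si.microsecond == 0 and fn.microsecond != 0:
            indicators.append(f"{field}: $SI has zeroed sub-second precision")
    return indicators

# Synthetic example: $SI creation time pushed back before the $FN time
si = {"created": datetime(2009, 1, 1, tzinfo=timezone.utc)}
fn = {"created": datetime(2011, 3, 5, 12, 30, 15, 123456, tzinfo=timezone.utc)}
print(check_time_stomping(si, fn))
```

In Chris's case the check came back clean, which is exactly the point: running the comparison either way raises the analyst's confidence in the timeline.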
One of the things that kind of struck me as odd in the post is that there's mention of the "regedit/1" entry in the RunMRU key, and then the post jumps right to discussing the InProcServer32 key, based on the timeline. The RunMRU information (ie, key LastWrite time) is from a user's hive, so another key of interest to check might be the following:

HKCU\Software\Microsoft\Windows\CurrentVersion\Applets\Regedit
As you'd think, this key contains information about the key that had focus when the user closed RegEdit. Specifically, the LastKey value (mentioned in MS KB 244004) contains the name of that key. This value might be used to bridge the gap between the RunMRU data and the changes to the InProcServer32 key that's mentioned in the post, and possibly provide insight into how the malware was actually deployed on the system.
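Once you've pulled the LastKey string from the user's hive, correlating it with the registry change in the post is just string work. Here's a minimal sketch (the function name and parsing logic are mine, purely for illustration) that checks whether the LastKey value shows the user's account last had an InProcServer32 key open in RegEdit, and if so, which CLSID it belonged to:

```python
def lastkey_points_at_inprocserver(lastkey):
    """Return the CLSID if the RegEdit LastKey value indicates that an
    InProcServer32 key had focus when RegEdit was last closed under this
    user account; otherwise return None.

    lastkey: the string stored in the LastKey value, e.g.
    'My Computer\\HKEY_CLASSES_ROOT\\CLSID\\{...}\\InProcServer32'
    """
    parts = [p for p in lastkey.split("\\") if p]
    if len(parts) >= 2 and parts[-1].lower() == "inprocserver32":
        # The path component just before InProcServer32 is the CLSID
        return parts[-2]
    return None

# Illustrative value, using a placeholder CLSID
example = (r"My Computer\HKEY_CLASSES_ROOT\CLSID"
           r"\{00000000-0000-0000-0000-000000000000}\InProcServer32")
print(lastkey_points_at_inprocserver(example))
```

If the returned CLSID matches the one whose InProcServer32 key was modified, that's a strong hint the change was made interactively through RegEdit rather than by a script or dropper.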
As Chris points out in the post, the value type being changed from "REG_SZ" (string value) to "REG_EXPAND_SZ" does allow for the use of unexpanded references to environment variables, such as %SystemRoot%. One statement that I don't really follow is:
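To see why the type change matters, here's a rough sketch of what Windows' ExpandEnvironmentStrings() behavior does to a REG_EXPAND_SZ value (this is a simplified, case-sensitive stand-in for the real API, shown only to illustrate the mechanism):

```python
import re

def expand_environment_strings(value, env):
    """Minimal sketch of REG_EXPAND_SZ expansion: each %VAR% reference
    is replaced from the environment. A value typed REG_SZ would be
    handed to the consumer verbatim, percent signs and all.

    NOTE: real Windows expansion is case-insensitive; this sketch is
    case-sensitive for simplicity.
    """
    def repl(m):
        return env.get(m.group(1), m.group(0))  # unknown vars left alone
    return re.sub(r"%([^%]+)%", repl, value)

env = {"SystemRoot": r"C:\WINDOWS"}
print(expand_environment_strings(r"%SystemRoot%\system32\webcheck.dll", env))
# -> C:\WINDOWS\system32\webcheck.dll
```

Left as REG_SZ, the same data would be treated as a literal path beginning with "%SystemRoot%", and the load would fail; switching the type to REG_EXPAND_SZ is what makes the %SystemRoot% reference resolve.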
So now the threading for webcheck.dll is no longer pointing to the legitimate file, but to the malware!
The threading model listed doesn't have anything to do with the path...I'm going to have to reach out to Chris and find out what he was referring to in that statement. He follows that up with this statement later in the post:
So not only did the attackers use a legitimate threading, but they made sure to use a shell extension that was trusted by Windows.
Again, I'm not clear on the "threading" part of that statement, but Chris is quite correct about the shell extension issue. Basically, the Windows shell (Explorer.exe), which is launched when a user logs into the system, will load the approved shell extensions, which includes this particular malware. Trust seems to be implicit, as there are no checks run when Explorer goes to load a shell extension DLL. This is a bit different from the shell extension issue I'd mentioned last August, in part because it doesn't use the DLL Search Order issue. Instead, it simply points Explorer directly to the malware through the use of an explicit path.
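Since Explorer performs no checks when loading approved shell extensions, the checking falls to the analyst. Here's a hypothetical triage helper (the function, the directory allow-list, and the heuristic are all my own, not anything Explorer or the post describes): given CLSID-to-InProcServer32 mappings pulled from the Approved shell extensions list in the SOFTWARE hive, it flags any DLL loaded from outside the usual system directories:

```python
import ntpath

# Simplistic allow-list of expected parent directories (an assumption
# for illustration; a real triage list would be more thorough)
SYSTEM_DIRS = (r"c:\windows\system32", r"%systemroot%\system32")

def flag_suspect_shell_extensions(extensions):
    """Given a dict mapping approved-shell-extension CLSIDs to their
    InProcServer32 paths, return those whose DLL lives outside the
    expected system directories. Explorer itself never does this check;
    that's exactly the gap the attackers in the post relied on."""
    suspects = {}
    for clsid, dll_path in extensions.items():
        parent = ntpath.dirname(dll_path).lower()
        if parent not in SYSTEM_DIRS:
            suspects[clsid] = dll_path
    return suspects

# Synthetic data: one expected entry, one DLL parked somewhere odd
approved = {
    "{aaaa...}": r"C:\WINDOWS\system32\webcheck.dll",
    "{bbbb...}": r"C:\WINDOWS\Temp\webcheck.dll",
}
print(flag_suspect_shell_extensions(approved))
```

Note that a check like this would not have caught the case in the post by path alone, since the malicious value still pointed beneath %SystemRoot%; it's one layer of a triage process, not a silver bullet.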
All in all, I'm glad to see Chris and the TrustWave folks sharing this kind of thing with the community. I do think that there's more that isn't being said, like how the malware actually got on the system (ie, how it was deployed), but hey, we all know that there are some things that can't be said about engagements. And that's okay.