More APT Confusion
I ran across an interesting article on TechTarget recently, which states that confusion over the APT threat "...leads companies to often misappropriate resources, making unnecessary or uninformed investments."
Really? I remember going on-site to perform IR back in 2006, when I was with the ISS ERS Team, and seeing right away how the customer knew to contact us. They had three copies of ISS RealSecure. All still in their shrink-wrap. One was being used to prop a door open. So what I'm saying is that, with respect to the TechTarget article, it isn't necessarily confusion over what "APT" means that leads to "uninformed investments", although I do think that the marketing most organizations find themselves inundated with does lead to significant confusion. Rather, I think it's a lack of understanding of threats in general, as well as the panic that follows an incident, particularly one that, when investigated, is found to have been going on for some time (weeks or months) prior to detection.
Context...no, WFP. Wait...what?
When presenting on timeline analysis or, most recently at PFIC 2011, on Windows forensic analysis, one of the concepts I cover is context within your examination. Recently, Chris posted on the same topic and gave a great example.
Something about the post, and in particular the following words, caught my eye:
"...manually went through the list of running services using the same methodology...right name, wrong directory, or slightly misspelled name, right directory (for the answer to why I do this, check this out... http://support.microsoft.com/kb/222193)."
Looking at this, I was a little confused...what does Windows File Protection (WFP) have to do with looking for the conditions that Chris mentioned in the above quote? I mean, if a malware author were to drop "svch0st.exe" into the system32 directory, or "svchost.exe" into the Windows directory, then WFP wouldn't come into play, would it?
What's not mentioned in the post is that, while both of the conditions are useful techniques for hiding malware (because they work), WFP is also easily "subverted". The reason I put "subverted" in quotes is that it's not so much a hack as it is using an undocumented MS API call. That's right! To break stuff, you don't have to break other stuff first...you just use the exit ramp that the vendor didn't post signs for. ;-)
Okay, to start, open WFA 2/e and turn to pg. 328. Just below the middle of the page, there's a link to a BitSum page (the page doesn't seem to be available any longer...you'll need to look here) that discusses various methods for disabling WFP. One that I've seen used is method #3; that is, disabling WFP for one minute for a particular file, which is likely what Windows Update itself uses. This CodeProject page has some additional useful information regarding the use of the undocumented SfcFileException API call.
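Just to make the mechanics concrete, here's a minimal sketch (Python, via ctypes) of how that undocumented call is described as being invoked in the BitSum and CodeProject write-ups. The ordinal (5) and the argument values are taken from those sources, apply to XP-era systems only, and aren't anything Microsoft documents, so treat them as assumptions rather than gospel:

# Sketch only: per the BitSum/CodeProject descriptions, sfc_os.dll on XP
# exposes SfcFileException as an unnamed export at ordinal 5, which tells
# WFP to ignore changes to a single file for roughly one minute.
# Ordinal, signature, and argument values are assumptions from those sources.
import ctypes

def wfp_exception(path):
    sfc = ctypes.WinDLL("sfc_os.dll")
    SfcFileException = sfc[5]          # undocumented export, looked up by ordinal
    # Reported signature: (DWORD, PWCHAR pwszFile, DWORD), called as
    # (0, <unicode path>, -1); reportedly returns 0 on success.
    return SfcFileException(0, ctypes.c_wchar_p(path), -1)

if __name__ == "__main__":
    ret = wfp_exception(u"C:\\Windows\\system32\\calc.exe")   # hypothetical target
    print("SfcFileException returned %d" % ret)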
To show you what I mean by "undocumented", take a look at the image to the right...this is the Export Address Table from sfc_os.dll from a Windows XP system, via PEView. If you look at the Export Ordinal Table, you'll see only the last 4 functions listed, by name. However, in the Export Address Table, you don't see names associated with several of the functions.
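If you don't have PEView handy, you can see the same thing programmatically; here's a minimal sketch (Python, using the third-party pefile module) that lists the exports from sfc_os.dll and flags the ones that have no name. The file path is the XP location and is an assumption about where you're looking:

# Sketch: list the exports of sfc_os.dll as an alternative to eyeballing
# the export tables in PEView. Requires the third-party pefile module.
import pefile

pe = pefile.PE("C:\\Windows\\system32\\sfc_os.dll")   # XP-era path, adjust as needed
for exp in pe.DIRECTORY_ENTRY_EXPORT.symbols:
    # Unnamed exports show up here with an ordinal but no name
    name = exp.name.decode() if exp.name else "(no name - exported by ordinal only)"
    print(exp.ordinal, name)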
Note that at the top of the BitSum page (archived version), several tools are listed to demonstrate some of the mentioned techniques. As the page appears to be no longer available, I'm sure that the tools are not available either...not from this site, anyway.
Mandiant has a good example of how WFP "subversion" has been used for malware persistence; see slide 25 from Mandiant's "The Malies" presentation. W32/Crimea is another example of how disabling WFP may be required (I've seen the target DLL as a "protected" file on some XP systems, but not on others...). This article describes the WFP subversion technique and points to this McAfee blog post.
Yes, Virginia...it is UTC
I recently posted a link to some of my timeline analysis materials that I've used in previous presentations. I've mentioned before that I write all of my tools to normalize the time stamps to 32-bit Unix time format, based on the system's interpretation of UTC (which, for the most part, is analogous to GMT). In fact, if you open the timeline presentation from this archive, slide 18 includes a bullet that states "Time (normalized to Unix epoch time, UTC)".
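If you're curious what that normalization actually involves, here's a minimal sketch (Python) of the conversion: a Windows FILETIME is a 64-bit count of 100-nanosecond intervals since 1 Jan 1601 (UTC), and the Unix epoch begins 11,644,473,600 seconds later, so the math is simple division and subtraction. The function name and sample values are just for illustration:

# Sketch: normalize a 64-bit Windows FILETIME to 32-bit Unix epoch time (UTC).
# FILETIME counts 100-nanosecond intervals since 1601-01-01 UTC; the Unix
# epoch starts 11,644,473,600 seconds later, on 1970-01-01 UTC.
EPOCH_DIFF = 11644473600

def filetime_to_unix(low, high):
    """Combine the two 32-bit halves of a FILETIME and return Unix seconds."""
    ft = (high << 32) | low            # 64-bit count of 100-ns intervals
    return (ft // 10000000) - EPOCH_DIFF

# Example usage with the low/high DWORDs of a FILETIME value:
print(filetime_to_unix(0x6ED24250, 0x01CC9F4E))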
I hope this makes things a bit clearer to folks. Thanks!
Intel Sharing
Not long ago, I posted about OpenIOC.org, and recently ran across this DarkReading article that discusses intel sharing. Sharing within the community, of any kind, is something that's been discussed time and time again...very recently, I chatted with some great folks at PFIC (actually, at the PFIC AfterDark event held at The Spur in Park City) about this subject.
In the DarkReading article, Dave Merkel, Mandiant CTO, is quoted as saying, "There's no single, standardized way for how people to share attack intelligence." I do agree with this...with all of the various disparate technologies available, it's very difficult to express an indicator of compromise (IoC) in a manner that someone else can immediately employ within their infrastructure. I mean, how does someone running Snort communicate attack intel to someone else who monitors logs?
I'd suggest that it goes a bit further beyond that, however...there's simply no requirement (nor apparently any desire) for organizations to collect attack intelligence, or even simply share artifacts. Most "victim" organizations are concerned with resuming business operations, and consulting firms are likely more interested in competitive advantage. At WACCI 2010, Ovie talked about the lack of sharing amongst analysts during his keynote presentation, and like others, I've experienced that myself on the teams I've worked with...I wouldn't have any contact with another analyst on my team for, say, 3 months, and after all that time, they had nothing to share from their engagements. We took steps to overcome that...Chris Pogue and I wrote a white paper on SQL injection, we developed some malware characteristics, and I even wrote plugins for RegRipper. I've seen the same sharing issue when I've talked to groups, not just about intel sharing, but also about the forensic scanner.
I think that something like OpenIOC does provide a means for describing IoCs in a manner that can be used by others...but only others with the same toolset. Also, it is dependent upon what folks find, and from that, what they choose to share. As an example, take a look at the example Zeus IOC provided at the OpenIOC.org site. It contains some great information...file names/paths, process handles, etc...but no persistence mechanism for the malware itself, and no Registry indicators. So, this IoC may be great if I have a copy of IOCFinder and a live system to run it against. But what happens if I have a memory dump and an acquired image, or just a Windows machine that's been shut off? Other IoCs, like this one, are more comprehensive...maybe with a bit more descriptive information and an open parser, an analyst could download the XML content and parse out just the information they need/can use.
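As a rough illustration of what I mean by an "open parser", here's a minimal sketch (Python) that walks an OpenIOC XML file and dumps just the indicator terms and their values. The element and attribute names (IndicatorItem, Context/@search, Content) are based on the OpenIOC 1.0 layout published at OpenIOC.org, and the file name in the usage line is hypothetical:

# Sketch: pull individual indicator terms out of an OpenIOC XML file so an
# analyst can feed them to whatever tools they actually have on hand.
# Element/attribute names assume the OpenIOC 1.0 layout; adjust as needed.
import sys
import xml.etree.ElementTree as ET

def dump_indicators(path):
    tree = ET.parse(path)
    for elem in tree.iter():
        # Match on the local tag name so XML namespaces don't get in the way
        if elem.tag.split("}")[-1] == "IndicatorItem":
            context, content = None, None
            for child in elem:
                tag = child.tag.split("}")[-1]
                if tag == "Context":
                    context = child.get("search")
                elif tag == "Content":
                    content = child.text
            print("%s : %s" % (context, content))

if __name__ == "__main__":
    dump_indicators(sys.argv[1])       # e.g. zeus.ioc (hypothetical file name)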
Now, just to be clear...I'm not saying that no one shares DFIR info or intel. I know that some folks do...some folks have written RegRipper plugins, but I've also been in a room full of people who do forensic analysis, and while everyone admits to having a full schedule, not one person has a single artifact to share. I do think that the IoC definition is a good start, and I hope others pick it up and start using it; it may not be perfect, but the best way to improve things is to use them.
DoD CyberCrime Conference
Thanks to Jamie Levy for posting the DC3 track agenda for Wed, 25 Jan 2012. It looks like there are a number of interesting presentations, many of which go on at the same time. Wow. What's a girl to do?
NoVA Forensics Meetup
Just a quick reminder about the next NoVA Forensics Meetup, scheduled for Wed, 7 Dec 2011, at the ReverseSpace location. Sam Brothers will be presenting on mobile forensics.
6 comments:
"I write all of my tools to normalize the time stamps to 32-bit Unix time format"
FLS and TimeScanner both do this as well. Win32 date/times have a granularity of .001 sec, while Unix time only has a granularity of 1 sec. Would it not also be helpful during timeline analysis, and especially during anomaly detection, to maintain that higher level of accuracy?
I've had some thoughts on this, but I'm not sure if I want to share them with someone who doesn't want to sign their posts.
Thanks for commenting.
I think MAEC was invented to fill the intel sharing gap. Vendors haven't really adopted it yet though. http://maec.mitre.org/
Hi Harlan,
In terms of using IOCFinder, I have used Memoryze/Audit Viewer to review a memory sample, built an IOC using Mandiant's IOC Editor, and then run that IOC against the XML output from multiple Memoryze/Audit Viewer audits to see if other samples were compromised.
Excellent post as always.
The DOD conference looks insane... I may have to plan out which tracks I want to see for once.
I assume you will be attending this one on Wednesday: "Registry Analysis for Network Intrusions"? Sounded interesting to me at least.
Andrew,
I assume you will be attending this one...
Eesh, I don't know. At this point, I'm simply concerned with getting through my presentation! ;-) Also, it's in the IA track...does that mean that it won't be heavily technical?