The Windows Incident Response Blog is dedicated to the myriad information surrounding and inherent to the topics of IR and digital analysis of Windows systems. This blog provides information in support of my books; "Windows Forensic Analysis" (1st thru 4th editions), "Windows Registry Forensics", as well as the book I co-authored with Cory Altheide, "Digital Forensics with Open Source Tools".
Thursday, November 07, 2013
I've always enjoyed the format that Aaron has used for the OMFW, going back to the very first one. That first time, there was a short presentation followed by a panel, and back and forth, with breaks. It was fast-moving, the important stuff was shared, and if you wanted more information, there was usually a web site that you could visit in order to download the tools, etc.
This time around, there was greater focus on things like upcoming updates to Volatility, and the creation of the Volatility Foundation. Also, a presentation by George M. Garner, Jr., was added, so there were more speakers, more variety in topics discussed, and a faster pace, all of which worked out well.
The presentations that I really got the most out of were those that were more akin to use cases.
Sean and Steven did a great job showing how they'd used various Volatility plugins and techniques to get ahead of the bad guys during an engagement, by moving faster than the bad guys could react and getting inside their OODA loop.
Cem's presentation was pretty fascinating, in that it all seemed to have started with a claim by someone that they could hide via a rootkit on Mac OS X systems. Cem's very unassuming, and he disproved the claim pretty conclusively, apparently derailing a book (or at least a chapter of the book) in the process!
Jamie's presentation involved leveraging CybOX with Volatility, and was very interesting, as well as well-received.
There was more build-up and hype to Jamaal's presentation than there was actual presentation! ;-) But that doesn't take anything at all away from what Jamaal talked about...he'd developed a plugin called ethscan that will scan a memory dump (Windows, Linux, Mac) and produce a pcap. Jamaal pointed out quite correctly that many times when responding to an incident, you won't have access to a pcap file from the incident; however, it's possible that you can pull the information you need out of the memory buffers of the system(s) involved.
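Jamaal's actual plugin runs inside Volatility against the kernel's network buffers; I haven't seen his code, but as a rough, naive sketch of the underlying idea (carving Ethernet/IPv4 frames out of a raw memory buffer and re-wrapping them as pcap records), something like the following illustrates the concept. The carving heuristic here is my own assumption, not Jamaal's:

```python
import struct

# pcap global header: magic, version 2.4, LINKTYPE_ETHERNET (1), snaplen 65535
PCAP_GLOBAL_HDR = struct.pack("<IHHiIII", 0xA1B2C3D4, 2, 4, 0, 0, 65535, 1)

def carve_frames(buf):
    """Very naive carver: look for the IPv4 EtherType (0x0800) at the
    offset it would occupy in an Ethernet II header, then sanity-check
    the bytes that follow as an IPv4 header."""
    frames = []
    i = 0
    while True:
        # search from offset 12 at minimum, so the 12 MAC bytes fit before it
        j = buf.find(b"\x08\x00", max(i, 12))
        if j < 0:
            break
        start = j - 12                     # 6-byte dst MAC + 6-byte src MAC
        ip = buf[j + 2:]
        if len(ip) >= 20 and ip[0] >> 4 == 4:          # IPv4 version nibble
            total_len = struct.unpack(">H", ip[2:4])[0]  # IP total length
            if 20 <= total_len <= len(ip):
                frames.append(buf[start:j + 2 + total_len])
        i = j + 1
    return frames

def to_pcap(frames):
    """Wrap carved frames in pcap record headers (timestamps unknown, so zeroed)."""
    out = bytearray(PCAP_GLOBAL_HDR)
    for f in frames:
        out += struct.pack("<IIII", 0, 0, len(f), len(f))  # ts_sec, ts_usec, incl_len, orig_len
        out += f
    return bytes(out)
```

A real implementation would validate far more (checksums, EtherTypes beyond IPv4, VLAN tags) and would walk the OS's pool/buffer structures rather than brute-force scanning, but the output side (a pcap with zeroed timestamps, since the memory image doesn't preserve them) is essentially what any memory-to-pcap tool has to produce.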
What's really great about OMFW is that not only does Aaron get some of the big names that are really working hard (thanks to them!) to push the envelope in this area of study to present, but there are also a lot of great talks in a very short time period. I'll admit that I wasn't really interested in what goes into the framework itself (that's more for the developers), but there were presentations on Android and Linux memory analysis; there's something for everyone. You may not be interested in one presentation, but wait a few minutes...someone will talk about a plugin or a process, and you'll be glued to what they're saying.
Swag this year was a cool coffee mug and Volatility stickers.
Here's a wrap-up from last year's conference. You can keep up on new developments in Volatility, as well as the Volatility training schedule, at the Volatility Labs blog.
I've attended this conference before, and just as in the past, there is a lot of great information shared, with something for everyone. Personally, I'm more interested in the talks that present how a practitioner used open source tools to accomplish something, solve a problem, or overcome a challenge. I'm not so much interested in academic presentations, nor so much in talks that simply describe open source tools that folks have developed. As in the past, I'd suggest yet again that there be multiple tracks for this conference...one for academics and developers, and another for practitioners, by practitioners.
As part of full disclosure, I did not attend any of the training or tutorials, and I could not attend all of the presentations.
You can see the program of talks here.
Thoughts and Takeaways
Visualization in DFIR is a sticky point...in some ways, it may be a solution in search of a problem. Okay, so the recommendation is, "don't use pie charts"...got it. But how does one use visualization techniques to perform analysis, when malware and intrusions follow the principle of least frequency of occurrence? How can a histogram show an analyst when the bad guy or the malware compromised a system, when activities such as normal user activity, software and system updates, etc., make up the overwhelming majority of the available activity? Maybe there is a way to take a bite out of this, but I'm not sure that academics can really start to address this until there is a crossover into the practitioner's end of the pool. I only mention this because it's a recurring thought that I have each time I attend this conference.
As Simson pointed out, much of the current visualization occurs after the analyst has completed their examination and is preparing a report, either for a customer or for presentation in court. Maybe that's just the nature of the beast.
Swag this year was a plastic coffee cup for the car with the TSK logo, TSK stickers, and a DVD of Autopsy.
Link to Kristinn's stuff
We should all give a great, big Thank You to everyone involved in making both of these conferences possible. It takes a lot of work to organize a conference...I can only imagine that it's right up there with herding cats down a beach...and providing a forum to bring folks together. So, to the organizers and presenters, to everyone who worked so hard on making these conferences possible, to those who sat at tables to provide clues to the clueless ("...where's the bathroom?")...thank you.
There is another thing that I really like about DFIR-related conferences: interacting with other DFIR folks that I don't get to see very often, and even those who are not directly involved with what we do on a day-to-day basis. Unfortunately, it seems that few folks who attend these conferences want to engage and talk about DFIR topics, but now and again I find someone who does.
In this case, a good friend of mine wanted to discuss "...is what we do a 'science' or an 'art'?" at lunch. And when I say "discuss", I don't mean stand around and listen to others, I mean actively engaging in discussion. That's what a small group of us...there were only four of us at the table...did during lunch on Tuesday. Many times, finding DFIR folks at DFIR conferences that want to actively engage in discussion and sharing of DFIR topics...new malware autostart/persistence mechanisms seen, new/novel uses of tools, etc...is hard to do. I've been to conferences before where, for whatever reason, you just can't find anyone to discuss anything related to DFIR, or to what they do. In this instance, that wasn't the case, and some fascinating discussion ensued.
Posted by H. Carvey at 9:20 AM
Personally, I'm more interested in the talks that present how a practitioner used open source tools to accomplish something, solve a problem, or overcome a challenge. I'm not so much interested in academic presentations, nor so much in talks that simply describe open source tools that folks have developed.
IMO there are already too many of these "how to use the tool talks". I think the OSDFCon is a good alternative to these and good for getting the developers of the tools together with the users.
... there are already too many of these...
How so? Can you name some? I'm just curious, as many of the presentations I've seen at conferences are more along the lines of "...here's a tool, here's another tool..." but too few practitioners are sharing how they've used or extended tools to get a job done. Corey Harrell is a big exception to that.
For example, I was at SANS EU Prague, and there were a couple of talks along the lines of: "I had this problem, and I solved it by writing this tool, looking at this file, and getting this information from it." IMO, that's along the same lines.
At most conferences (vendor conferences aside), if someone presents a tool, they'll tell you what you can do with it.
I get the impression that what you're referring to is: "I had this case, and these are the steps (and tools) I used to solve it," discussed in full? And that you want to focus more on the decisions made/mindset used in the case than on the tools?
... if they present a tool they'll tell you what you can do with it.
Sure, but for the most part, saying, "...here's a tool and here's how you use it...", is one thing. I get that, and it's very beneficial for most folks. Many times, if I've seen something posted about the tool prior to a conference, I can generally get most of that from the announcement and description of the tool.
What I tend to be more interested in is how someone addressed an issue...what were the thought processes, what worked, what didn't work, and why?
... want to focus more on the decisions made/mindset used in the case than the tools?
Not so much...both really. But that's just me. I like to hear about new tools, but I'm also interested in thought processes and techniques used...not so much how you can use a tool, but how did you use the tool, particularly if it was in some novel manner.