Monday, May 30, 2011

NoVA Forensic Meetup

Reminder: NoVA Forensic Meetup, Wed, 1 June, at the ReverseSpace location in Herndon (7pm - 8:30pm)


Wednesday, May 25, 2011

Tools

I've run across a number of tools recently, some directly related to forensics, and others related more to IR or RE work. I wanted to go ahead and put those tools out there, to see what others think...

Memory Analysis
There have been a number of changes recently on the memory analysis front.  For example, Mandiant recently released their Redline tool, and HBGary released the Community Edition of their Responder product.

While we're on the topic of memory analysis tools, let's not forget the venerable and formidable Volatility.

Also, if you're performing memory dumps from live systems, be sure to take a look at the MoonSols Windows Memory Toolkit.

SQLite Tools
CCL-Forensics has a trial version of epilog available for download, for working with SQLite databases (found on smartphones, etc.).  One of the most notable benefits of epilog is that it allows you to recover deleted records, which can be very beneficial for analysts and investigators.

I'm familiar with the SQLite Database Browser...epilog would be interesting to try.
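
If you just need to pull live (intact) records from an SQLite database, Perl's DBI and DBD::SQLite modules will get you there...here's a minimal sketch, where the database path, table, and column names are hypothetical examples (substitute the ones from your own case).  Keep in mind that this only sees allocated records; recovering deleted records from free pages is where something like epilog comes in.

```perl
# Minimal sketch: dump rows from an SQLite database using DBI/DBD::SQLite.
# The database name and the table/column names below are hypothetical
# examples...substitute those from your own case.  This only returns live
# (allocated) records; deleted record recovery is a separate problem.
use strict;
use warnings;
use DBI;

my $db = shift || "sms.db";   # e.g., a messaging db from a smartphone
my $dbh = DBI->connect("dbi:SQLite:dbname=$db", "", "",
                       { RaiseError => 1, sqlite_unicode => 1 });

# List the tables in the database first...
my @tables = $dbh->tables(undef, undef, '%', 'TABLE');
print "Tables: @tables\n";

# ...then pull rows from a table of interest (hypothetical name/columns)
my $sth = $dbh->prepare("SELECT address, date, text FROM message");
$sth->execute();
while (my @row = $sth->fetchrow_array()) {
    print join("|", map { defined $_ ? $_ : "" } @row), "\n";
}
$dbh->disconnect();
```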

MFT Tools
Sometimes you need a tool to parse the NTFS $MFT file, for a variety of reasons.  A version of my own mft.pl is available online, and Dave Kovar has provided his analyzeMFT.py tool online, as well.  Mark McKinnon has chimed in and provided MFT parsing tools for Windows, Linux, and Mac OS X.
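
To give you a sense of what these tools are doing under the hood, here's a bare-bones sketch of walking $MFT records in Perl and pulling the creation time from the $STANDARD_INFORMATION attribute.  This is a simplification (it assumes 1024-byte records and skips fixup processing)...use mft.pl or analyzeMFT for real work.

```perl
# Minimal sketch: walk $MFT records and print the $STANDARD_INFORMATION
# creation time for records that are in use.  Assumes 1024-byte records
# and skips fixup (update sequence) processing for brevity...a production
# parser needs to apply the fixups.
use strict;
use warnings;

my $mft = shift || die "Usage: $0 <MFT file>\n";
open(my $fh, "<", $mft) || die "Could not open $mft: $!\n";
binmode($fh);

my ($rec, $num) = ("", 0);
while (read($fh, $rec, 1024) == 1024) {
    $num++;
    next unless (substr($rec, 0, 4) eq "FILE");
    # Offset to first attribute (0x14) and flags (0x16)
    my ($attr_ofs, $flags) = unpack("vv", substr($rec, 0x14, 4));
    next unless ($flags & 0x01);            # record in use?
    my $is_dir = $flags & 0x02;

    # Walk the attributes looking for $STANDARD_INFORMATION (type 0x10)
    my $ofs = $attr_ofs;
    while ($ofs < 1024 - 8) {
        my ($type, $len) = unpack("VV", substr($rec, $ofs, 8));
        last if ($type == 0xFFFFFFFF || $len == 0);
        if ($type == 0x10) {
            # Resident attribute: content offset lives at attr + 0x14
            my ($c_ofs) = unpack("v", substr($rec, $ofs + 0x14, 2));
            # First FILETIME in the content is the creation time
            my ($lo, $hi) = unpack("VV", substr($rec, $ofs + $c_ofs, 8));
            my $t = int(($hi * 4294967296 + $lo) / 10000000) - 11644473600;
            printf("Record %d (%s): created %s UTC\n",
                   $num - 1, $is_dir ? "dir" : "file", scalar gmtime($t));
            last;
        }
        $ofs += $len;
    }
}
close($fh);
```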

Other Tools
HBGary also made their AcroScrub tool available, which uses WMI to reach across the enterprise and scan for older versions of Adobe Reader.
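
AcroScrub itself is a packaged tool, but the underlying idea is straightforward enough to sketch in Perl using Win32::OLE.  The hostnames below are hypothetical, and the Win32_Product WMI class is notoriously slow...this is just a sketch of the approach, not a replacement for the tool.

```perl
# Minimal sketch of the AcroScrub idea: use WMI (via Win32::OLE) to check
# remote systems for installed Adobe Reader versions.  Hostnames are
# hypothetical; Win32_Product queries can be slow across many systems.
use strict;
use warnings;
use Win32::OLE qw(in);

my @hosts = qw(host1 host2);   # hypothetical list of systems to check

foreach my $host (@hosts) {
    my $wmi = Win32::OLE->GetObject(
        "winmgmts:{impersonationLevel=impersonate}!\\\\$host\\root\\cimv2");
    if (!defined $wmi) {
        print "$host: WMI connection failed\n";
        next;
    }
    my $prods = $wmi->ExecQuery(
        "SELECT Name, Version FROM Win32_Product " .
        "WHERE Name LIKE '%Adobe Reader%'");
    foreach my $p (in $prods) {
        print $host . ": " . $p->{Name} . " v" . $p->{Version} . "\n";
    }
}
```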

A very interesting tool that I ran across is Flash Dissector.  If you deal with or even run across SWF files, you might want to take a look at this tool, as well as the companion tools in the SWFRETools set.

The read_open_xml.pl Perl script is still available for parsing metadata from Office 2007 documents.
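
For those curious about what's going on under the hood, Office 2007 documents are just ZIP archives, with the document metadata in docProps/core.xml.  Here's a minimal sketch of the idea (a proper XML parser would be more robust than these quick-and-dirty regexes):

```perl
# Minimal sketch: Office 2007 (OOXML) files are ZIP archives, and document
# metadata lives in docProps/core.xml.  Requires Archive::Zip.  A real XML
# parser would be more robust than the regexes below.
use strict;
use warnings;
use Archive::Zip;

my $file = shift || die "Usage: $0 <docx/xlsx/pptx file>\n";
my $zip = Archive::Zip->new($file)
    || die "Could not open $file as a ZIP archive\n";

my $core = $zip->contents("docProps/core.xml");
die "No docProps/core.xml found\n" unless ($core);

foreach my $tag (qw/dc:creator cp:lastModifiedBy dcterms:created dcterms:modified/) {
    if ($core =~ m/<$tag[^>]*>([^<]+)<\/$tag>/) {
        print "$tag : $1\n";
    }
}
```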

From the same site as the SWFRETools are some malware write-ups, including NiteAim and Downloader-IstBar.  As a complete aside, here's a very interesting Gh0stNet writeup that Chris pointed me to recently (fans of Ron White refer to him as "Tater Salad"...fans of Chris Pogue should refer to him as "Beefcake" or "Bread Puddin'"...).

ADSs
Alternate data streams (ADSs) aren't something you see discussed much these days.  I recently received a question about a specific ADS, and thought I'd include some tools in this list.  I've used Frank's LADS, as well as Mark's streams.exe.  Scanning for ADSs is part of my malware detection process checklist, particularly when the goal of the analysis is to determine whether there's any malware on the system.

Also, I ran across this listing at MS of Known Alternate Stream Names.  This is very useful information when processing the output of the above tools, because what often happens is that someone runs one of those tools, finds one of the listed ADSs, and panics; once the panic subsides, their attitude swings to the other end of the spectrum...apathy...and that's when they're most likely to get hit.
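
One way to put that MS listing to work is to filter it against your tool output automatically.  Here's a minimal sketch that parses streams.exe output and flags only the ADSs that aren't on a known-good list; the known list below is just a small example subset (populate it from the MS page), and the output format can vary between versions of streams.exe, so adjust the regexes accordingly.

```perl
# Minimal sketch: run SysInternals' streams.exe against a directory and
# report only the ADSs whose names are NOT on a "known good" list.  The
# %known hash below is a small example subset...populate it from the MS
# Known Alternate Stream Names page.
use strict;
use warnings;

my %known = map { lc($_) => 1 } ("Zone.Identifier", "encryptable", "favicon");
my $dir = shift || die "Usage: $0 <directory>\n";

my $file;
foreach my $line (`streams.exe -s "$dir" 2>&1`) {
    chomp($line);
    if ($line =~ m/^(.*):$/) {                     # file name line
        $file = $1;
    }
    elsif ($line =~ m/^\s+:([^:]+):\$DATA\s+(\d+)/) {
        my ($stream, $size) = ($1, $2);
        next if (exists $known{lc($stream)});      # skip the known-good
        print "$file -> $stream ($size bytes)\n";
    }
}
```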

Here are some additional resources from Symantec, IronGeek, and MS. Also, be sure to check out what I've written about these in WFA 2/e.


Scanners

Microsoft recently released their Safety Scanner, which is a one-shot micro-scanner...you download it, run it, and it expires after 10 days, at which point you have to download it again.  This shouldn't replace the use of Security Essentials or other AV tools, but I'm pointing it out because it could be very useful as part of your malware detection process.  For example, you could mount an acquired image via FTK Imager or ImDisk and scan the image.  Also, the folks at ForensicArtifacts recently posted on accessing VSCs (their first research link actually goes back to my post by the same title...thanks to AntiForensics for reposting the entire thing...); without having to have EnCase or PDE, you could easily scan a mounted VSC, as well.
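
As a quick sketch of the VSC side of this: on a live Vista/Win7 system, you can list the available shadow copies via vssadmin and mount each one as a directory with mklink, then point your scanner at the mounted directory.  Here's a minimal Perl wrapper that emits the mklink commands (run from an elevated prompt; the C:\vsc link names are hypothetical):

```perl
# Minimal sketch: list Volume Shadow Copies via vssadmin and print the
# mklink commands to mount each one as a directory, which a scanner can
# then be pointed at.  Run from an elevated prompt; note the trailing
# backslash on the link target...it's required.
use strict;
use warnings;

my @out = `vssadmin list shadows /for=C:`;
my $i = 0;
foreach my $line (@out) {
    if ($line =~ m/Shadow Copy Volume: (\\\\\?\\GLOBALROOT\\Device\\HarddiskVolumeShadowCopy\d+)/) {
        my $dev = $1;
        $i++;
        print "mklink /d C:\\vsc$i $dev\\\n";
    }
}
```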


Frameworks
The Digital Forensics Framework (DFF) is open source, and was recently updated to include support for the AFF format, as well as mailbox reconstruction via Joachim Metz's libpff.

Christopher Brown, of TechPathways, has made ProDiscover Basic Edition v6.10.0.2 available, as well.  As a side note, Chris recently tweeted that he's just finished the beta of the full version of ProDiscover, adding the ability to image and diff VSCs.  Wowzers!

Sites
TZWorks - free "prototype" tools, including the Windows Shellbags parser, an EVTX file parser, and others.  Definitely worth checking out.

WoanWare - several free forensics tools including a couple for browser forensics, and (like TZWorks) a "USBStor parser".

NirSoft - the link to the site goes to the forensics tools, but there are a lot of free tools available at the NirSoft site...too many to list.

The Open Source Digital Forensics site is a good source of tools, as well.

OSDFC
Speaking of tools, let's not forget that the OSDFC is right around the corner...

Addendum
Check out Phil Harvey's ExifTool (it comes with a standalone Windows EXE)...there's a long list of supported file types on the tool page.

Additional lists of tools include Mike's Forensic Tools, as well as the tools at MiTeC (thanks to Anonymous' comment).  Also, Mark McKinnon has posted some freely available tools, as well.

Sunday, May 22, 2011

Brain Droppings

NoVA Forensics Meetup
The next NoVA Forensics Meetup will be held on Wed, 1 June 2011, from 7-8:30pm.  As for a location, I met with the great folks at Reverse Space, a hacker space in Herndon where some of the folks have an interest in forensics.  Thanks to Carl and Richard for taking the time to meet with me, and for offering to host our meetings.

I hope that we get a big turn-out for our currently scheduled presentation, titled "Build your own packet capture engine".

Our July meetup is scheduled for Wednesday, 6 July, and we've already got an offer of a presentation on setting up virtual machines for dynamic malware analysis.

As to further topics, I'd like to get suggestions on how we can expand our following; for example, Chris from the NoVA Hackers group told me that they follow the AHA participation model.  I'd like the development of this group to be a group effort, and as such I'll be asking participants and attendees for thoughts, ideas, and comments (and even to volunteer their own efforts) regarding how this group can grow.  For example, do we need a mailing list, or is the Win4n6 group sufficient?  If you have anything you'd like to offer up, please feel free to drop me a line.



Breakin' In
Speaking of the NoVA Forensics Meetup, at our last meeting one of our guests asked me how to go about getting into the business.  I tried to give a coherent answer, but as with many things, this question is one of those that have been marinating for some time, not just in my brain housing group, but within the community.

From my own perspective, when interviewing someone for a forensics position, I'm most interested in what they can do...I'm not so much interested in whether someone is an expert in a particular vendor's application.  I'm more interested in methodology and process: what problems have you solved, where have you stumbled, and what have you learned?  In short, are you tied to a single application, or do you fall back on a process or methodology?  How do you go about solving problems?  When you do something in particular (adding or skipping a step in your process), do you have a reason for doing so?

But the question really goes much deeper than that, doesn't it?  How does one find out about available positions and what it really takes to fill them?  One way to find available positions and job listings is via searches on Monster and Indeed.com.  Another is to take part in communities, such as the...[cough]...NoVA Forensics Meetup, or online communities such as lists and forums.

Breaches
eWeek recently (6 May) published an article regarding the Sony breach, written by Fahmida Rashid, which started off by stating:

Sony could have prevented the breach if they’d applied some fundamental security measures...

Sometimes, I don't know about that.  Is it really possible to say that, just because _a_ way was found to access the network, these "fundamental security measures" would have prevented the breach?

The article went on to quote Eugene Spafford's comments that Sony failed to employ a firewall, and used outdated versions of their web server.  'Spaf' testified before Congress on 4 May, where these statements were apparently made.

Interestingly, a BBC News article from 4 May indicates that at least some of the data stolen was from an "outdated database".   

The eWeek article also indicates (as did other articles) that Data Forte, Guidance Software and Protiviti were forensics firms hired to address the breach.

As an aside, there was another statement made within the article that caught my interest:

“There are no consequences for many companies that under-invest in security,” Philip Lieberman, CEO of Lieberman Software, told eWEEK. 

As a responder and analyst, I deal in facts.  When I've been asked to assist in breach investigations, I have done so by addressing the questions posed to me through analysis of the available data.  I do not often have knowledge of what occurred with respect to regulatory or legislative oversight.  Now and again, I have seen news articles in the media that have mentioned some of the fallout of the incidents I've been involved with, but I don't see many of these.  What I find interesting about Lieberman's statement is that this is the perception.

The Big Data Problem
I read a couple of interesting (albeit apparently diametrically opposed) posts recently; one was Corey Harrell's Triaging My Way (shoutz to Frank Sinatra) post where Corey talked about focusing on the data needed to answer the specific questions of your case.  Corey's post provides an excellent example of a triage process in which specific data is extracted/accessed based on specific questions.  If there is a question about the web browsing habits of a specific user, there are a number of specific locations an analyst can go within the system to get information to answer that question.

The other blog post was Marcus Thompson's We have a problem, part II post, which says, in part, that we (forensic analysts) have a "big data" problem, given the ever-increasing volume (and decreasing cost) of storage media.  Now, I'm old enough to remember when you could boot a computer off of a 5 1/4" floppy disk, remove that disk, and insert the storage disk that held your documents...before the time of hard drives actually installed in systems.  This glut of storage media naturally leads to backlogs in analysis, as well as in intelligence collection.

I would suggest that the "big data" problem is particularly an issue in the face of traditional analysis techniques.  Traditional techniques, applied to Corey's example (above), dictate that all potential sources of media be collected and keyword searches run across them.  Wait...what?  Well, no wonder we have backlogs!  If I'm interested in a particular web site that the user may have visited, why would I run a keyword search across all of the EXEs and DLLs in the system32 directory?  While there may be files on the 1TB USB-connected external hard drive, what is the likelihood that the user's web browser history is stored there?  And why would I examine the contents of the Administrator (or any other) account profile if it hasn't been accessed in two years?
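
To make the point concrete, here's a minimal sketch of the targeted approach for the web-browsing question...check only the locations where that user's browser history would live (XP-era paths shown; the M:\ mount point is a hypothetical read-only mounted image):

```perl
# Minimal sketch of the targeted triage approach: given a user name, check
# only the locations where that user's browser history would live on an XP
# system, rather than sweeping the entire image.  Paths assume the image
# is mounted read-only at M:\ (hypothetical); add other browsers as needed.
use strict;
use warnings;

my $user  = shift || die "Usage: $0 <username>\n";
my $prof  = "M:\\Documents and Settings\\$user";

my @locations = (
    "$prof\\Local Settings\\History\\History.IE5\\index.dat",
    "$prof\\Local Settings\\Temporary Internet Files\\Content.IE5\\index.dat",
    "$prof\\Cookies\\index.dat",
);
# Firefox profiles live under Application Data
push(@locations,
     glob("\"$prof\\Application Data\\Mozilla\\Firefox\\Profiles\\*\\places.sqlite\""));

foreach my $loc (@locations) {
    if (-e $loc) {
        my @st = stat($loc);
        printf("%-10d bytes  %s UTC  %s\n",
               $st[7], scalar gmtime($st[9]), $loc);
    }
    else {
        print "Not found: $loc\n";
    }
}
```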

Another variant on this issue was discussed, in part, in Mike Viscuso's excellent Understanding APT presentation (at the recent AccessData User's Conference)...the presentation indicates that the threat isn't really terribly "advanced", but mentions that the threat makes detection "just hard enough".

Writing Open Source Tools
This is a topic that came up when Cory and I were working on DFwOST...Cory thought that it would be a good section to add, and I agreed, but for the life of me, I couldn't find a place to put it in the book where it didn't seem awkward.  I still think that it's important, in part because open source tools come from somewhere, but also because I think that a lot more folks out there really have something to contribute to the community as a whole.

To start off, my own motivation for writing open source tools is simply to solve a problem or address something that I've encountered.  This is where RegRipper came from...I found that I'd been looking at many of the same Registry keys/values over and over again, and had built up quite a few scripts.  As such, I wanted a "better" way (that's sort of relative, isn't it??) to manage these things, particularly when there were so many, and they seemed to use a lot of the same code over and over.

I write tools in Perl because it's widely available and there are a LOT of resources available for anyone interested in learning to use it...even if just to read it.  I know the same is true for Python, but back in '98-'99, when I started teaching myself Perl, I did so because the network monitoring guys in our office were looking for folks who could write Perl, and infosec work was as hard for folks to sell back then as forensic analysis is now.

When I write Perl scripts, I (in most cases) try to document the code enough so that someone can at least open the script in Notepad and read the comments to see what it does.  I don't always go for the most elegant solution, or the fewest keystrokes to accomplish a task; spelling the steps out not only lets someone see more clearly what was done, but also lets someone else modify the code...simply comment out the lines in question and adjust the script to meet your own needs.
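
To illustrate, here's a minimal sketch in that style...a heavily commented script built on James Macfarlane's Parse::Win32Registry (the same module RegRipper uses) that dumps a user's Run key from an NTUSER.DAT hive:

```perl
# Minimal sketch: dump the contents of a user's Run key from an NTUSER.DAT
# hive file, using Parse::Win32Registry.  Comments are deliberately
# verbose so the script can be read in Notepad and understood.
use strict;
use warnings;
use Parse::Win32Registry qw(iso8601);

my $hive = shift || die "Usage: $0 <NTUSER.DAT>\n";

# Open the hive file and get the root key
my $reg  = Parse::Win32Registry->new($hive)
    || die "Could not open $hive as a Registry hive\n";
my $root = $reg->get_root_key();

# Navigate to the key of interest
my $path = "Software\\Microsoft\\Windows\\CurrentVersion\\Run";
if (my $key = $root->get_subkey($path)) {
    # The key's LastWrite time is often as valuable as the data itself
    print $path . "\n";
    print "LastWrite: " . iso8601($key->get_timestamp()) . "\n";
    foreach my $val ($key->get_list_of_values()) {
        print "  " . $val->get_name() . " -> " . $val->get_data() . "\n";
    }
}
else {
    print $path . " not found.\n";
}
```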

DFF
Speaking of open source tools, one of the tools discussed in DFwOST is the Digital Forensics Framework, of which version 1.1.0 was recently released.  This version includes a couple of updates, as well as a bug fix to the ntfs module.  I've downloaded it and got it running nicely on a Windows XP system...great work, and a huge thanks to the DFF folks for their efforts.  Be sure to check out the DFF blog for some tips on how you can use this open source forensic analysis application.

Thursday, May 05, 2011

Updates

NoVA Forensics Meetup
Last night's meetup went pretty well...there's nothing wrong with humble beginnings.  We had about 16 people show up, and a nice mix of folks...some vets, some new to the community...but it's all good.  Sometimes having new folks ask questions in front of those who've done it for a while gets the vets to think about/question their assumptions.  Overall, the evening went well...we had some good interaction, good questions, and we gave away a couple of books. 

I think we'd like to keep this on a Wed or Thu evening, perhaps once a month...maybe spread out over the summer due to vacations, etc. (we'll see).  What we do need now is a facility with presentation capability.  Also, I don't think we want the presentations to fall on just one person...we can do a couple of quick half-hour talks, or just have someone start a discussion by posing a question to the group.

Besides just basic information sharing, these can be good networking events for the folks who show up.  Looking to add to your team?  Looking for a job?  Looking for advice on how to "break in" to the business?  Just come on by and talk to folks.

So, thanks to everyone who showed up and made this first event a success.  For them, and for those who couldn't make it, we'll be having more of these meetups...so keep your eyes out and don't hold back on the thoughts, comments, or questions.


Volatility
Most folks familiar with memory analysis know about the simply awesome work provided through the Volatility project.  For those who don't know, this is an open source project, written in Python, for conducting memory analysis.

Volatility now has a Python implementation of RegRipper built in, thanks to lg, and you can read a bit more about the RegListPlugin.  Gleeda's got an excellent blog post regarding the use of the UserAssist plugin.

I've talked a bit in my blog, books, and presentations about finding alternate sources of forensic data when the sources we're looking for (or at) may be insufficient.  I've talked about XP System Restore Points, and I've pulled together some really good material on Volume Shadow Copies for my next book.  I've also talked about carving Event Log event records from unallocated space, as well as parsing information regarding HTTP requests from the pagefile.  Volatility provides an unprecedented level of access to yet another excellent resource...memory.  And not just memory extracted from a live running system...you can also use Volatility to parse data from a hibernation file, which you may find within a (laptop) image.

Let's say that you're interested in finding out how long a system has been compromised; i.e., you're trying to determine the window of exposure.  One of the sources I've turned to is crash dump logs...these are appended (the actual crash dump file is overwritten) with information about each crash, including a pslist-like listing of processes.  Sometimes you may find references to the malware in these listings, or in the specific details regarding the crashing process.  Now, assume that you're looking at a laptop and find a hibernation file...you know when the file was created, and using Volatility, you can parse that file and find specifics about what processes were running at the time the system went into hibernation.
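
As a quick illustration of the "alternate sources" idea, here's a minimal sketch of carving what look like HTTP requests out of a pagefile (or any other raw binary data, for that matter).  It's intentionally simple...hits that fall entirely within the chunk overlap may print twice:

```perl
# Minimal sketch: scan a pagefile (or any raw binary...unallocated space,
# a memory dump) for what look like HTTP GET/POST requests.  Chunked reads
# with a small overlap so hits spanning a chunk boundary aren't missed
# (hits entirely inside the overlap may be reported twice).
use strict;
use warnings;

my $file = shift || die "Usage: $0 <pagefile.sys>\n";
open(my $fh, "<", $file) || die "Could not open $file: $!\n";
binmode($fh);

my $chunk_size = 4 * 1024 * 1024;   # read in 4MB chunks
my $overlap    = 1024;
my ($buf, $tail) = ("", "");

while (read($fh, $buf, $chunk_size)) {
    my $data = $tail . $buf;
    while ($data =~ m/((?:GET|POST)\s+\S+\s+HTTP\/1\.[01])/g) {
        print $1 . "\n";
    }
    $tail = substr($data, -$overlap);
}
close($fh);
```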

And that's not all you can use Volatility for...Andre posted to the SemperSecurus blog about using Volatility to study a Flash 0-day vulnerability. 

If you haven't looked at Volatility, and you do have access to memory, you should really consider diving in and giving it a shot.  

Best Tool
Lance posted to his blog, asking readers what they consider to be the best imaging and analysis tools.  As of the time that I'm writing this post, there are seven comments (several are pretty much just "agree" posts), and even after reading through the thoughts and comments, I keep coming back to the same conclusion...that the best tool available to an analyst is the grey matter between their ears.

This brings to mind a number of thoughts, particularly because last week I had two opportunities to consider topics of analyst training, education, and experience.  During one of these, I was considering the fact that when I (like many other analysts) "came up through the ranks", there were no formal schools available to non-LE analysts, aside from vendor-specific training.  Some went that route, but there were others who couldn't afford it.  For myself, I took the EnCase v3.0 Introductory course in 1999...I was so fascinated by the approach taken to file signature analysis that I went home and wrote my own Perl code for it; not to create a tool, per se, but more to really understand what was happening "under the hood".  Over the years, knowing how things work and knowing what I needed to look for really helped me a lot...it wasn't a matter of having to have a specific tool as much as it was knowing the process, and being able to justify the purchase of a product if need be.
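
For anyone curious, the core of file signature analysis fits in a few lines of Perl, along the same lines as the code described above...compare the file's magic number against its extension.  The signature table below is just a small example subset; extend it as needed:

```perl
# Minimal sketch of file signature analysis: compare a file's extension
# against its magic number.  The %sigs table is a small example subset...
# extend it as needed.
use strict;
use warnings;

my %sigs = (
    "\xFF\xD8\xFF"     => [qw/jpg jpeg/],
    "\x89PNG"          => [qw/png/],
    "%PDF"             => [qw/pdf/],
    "MZ"               => [qw/exe dll sys/],
    "\xD0\xCF\x11\xE0" => [qw/doc xls ppt/],   # OLE/compound document
);

my $file = shift || die "Usage: $0 <file>\n";
open(my $fh, "<", $file) || die "Could not open $file: $!\n";
binmode($fh);
read($fh, my $hdr, 8);
close($fh);

my ($ext) = ($file =~ m/\.(\w+)$/);
$ext = lc($ext || "");

foreach my $magic (keys %sigs) {
    if (substr($hdr, 0, length($magic)) eq $magic) {
        my $ok = grep { $_ eq $ext } @{$sigs{$magic}};
        printf("%s: header says [%s], extension is [.%s] -> %s\n",
               $file, join("/", @{$sigs{$magic}}), $ext,
               $ok ? "match" : "MISMATCH");
        exit;
    }
}
print "$file: no signature match for header\n";
```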

Breaches
If the recent spate of breaches hasn't yet convinced you that no one is safe from computer security incidents, take a look at this story from The State Worker, which talks about the PII/PCI data of 2000 LE retirees being compromised.  I know, 2000 seems like such a small number, but hey...regardless of whether it's 77 million or 2000, if you're one of those people whose data was compromised, it's everything.

While the story is light on details (i.e., how the breach was identified, when the IT staff reacted in relation to when the incident actually occurred, etc.), if you read through it, you see a statement that's common throughout these types of announcements; specifically, "...taken steps to enhance security and strengthen [the] infrastructure...".  The sequence of events for incidents like this (and keep in mind, these are only the ones that are reported) is: breach, time passes, someone is notified of the breach, then steps are taken to "enhance security".  We find ourselves coming to this dance far too often.

Incident Preparedness
Not long ago, I talked about incident preparation and proactive IR...recently, CERT Societe Generale (the French CERT) posted a 6-step IRM Worm Infection cheat sheet.  I think things like this are very important, particularly because the basic steps necessarily assume certain things about your infrastructure.  For example, step 1 of the PDF includes several of the basic components of a CSIRP...if you already have all of the stuff outlined in the PDF covered, then you're almost to a complete CSIRP, so why not just finish it off and formalize the entire thing?

Step 3, Containment, mentions neutralizing propagation vectors...incident responders need to understand malware characteristics in order to respond effectively to these sorts of incidents.

One note about this and these sorts of incidents...worms can be especially virulent strains of malware, but this applies to malware in general: relying on your AV vendor to be your IR team is a mistake.  Incident responders have seen this time and again, and it's especially difficult for folks who do what I do, because we often get called after response efforts via the AV vendor have been ineffective and have exhausted the local IT staff.  I'm not saying that AV vendors can't be effective...what I am saying is that, in my experience, throwing signature files at an infrastructure based on samples provided by on-site staff doesn't work.  AV vendors are generally good at what they do, but AV is only part of the overall security solution.  Malware infections need to be responded to with an IR mindset, not through an AV business model.

Firefighters don't learn about putting out a fire during a fire.  Surgeons don't learn their craft during surgery.  Organizations shouldn't hope to learn IR during an incident...and the model of turning your response over to an external third party clearly doesn't work.  You need to be ready for that big incident...as you can see just from the media, it's a wave on the horizon headed for your organization.