Saturday, May 29, 2010

Some more stuff...

I've been working on a book on forensic analysis of the Windows Registry, and I was adding something to my outline the other day when I ran across Chris's blog post on how to crack passwords using files from an acquired image. Nothing quite like freeware to get the job done, eh? I guess one of the issues is that there's a "cost" associated with either approach: you pay a lot of $$ for a commercial package, or you "pay" by having to learn something that doesn't include pushing the "find all evidence" button. Kind of makes me wish for Forensicator Pro! ;-)

This is pretty cool stuff, particularly when you use it in conjunction with the samparse plugin, and this information about User Account Analysis. I know I keep referring back to that post, but hey...there are a LOT of analysts out there who think that the "Password Not Required" flag in the SAM means that the account doesn't have a password, and that's not the case at all.

Two things about this: first, some things (like this) bear repeating...again and again. Second, this is why we need to engage and be part of the larger community. Sitting in an office somewhere with no interaction with others in the community leads to misconceptions and bad assumptions.

Contacts and Sharing
Speaking of communities and sharing, Grayson had an interesting post that caught my eye, with respect to sharing. Evidently, he recently found out about a group that meets in Helena to discuss security, hacking, etc. This is a great way to network professionally, share information...and apparently, to just get out and have a sandwich!

Speaking Engagements
I've blogged recently about some upcoming speaking engagements, conferences where I and others will be speaking or presenting. My next two presentations (TSK/Open Source and the SANS Forensic Summit) will cover creating timelines, and using them for forensic analysis. The content of these presentations will be slightly different, due to time available, audience, etc. However, they both address timelines in forensic analysis because I really feel that they're important, and I'm just not seeing them being used often enough, particularly where it's glaringly obvious that a timeline would be an immensely powerful solution.

Yes, I know of folks who are using SIFT and log2timeline...I've seen a number of comments over in the Win4n6 Yahoo group. That's some real awesome sauce. I've written articles for Hakin9, including this one, which walks the reader through using my tools to create a timeline. I've done analysis of SQL injection attacks where a timeline consisting of the web server logs and the file system metadata basically gave me a .bash_history file with time stamps. I've created and used timelines to map activity across multiple systems and time zones, and found answers to questions that could only be seen in a timeline.
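For those who haven't seen one built, the core of a multi-source timeline is simple: normalize events from each source into a common record and sort by time. Here's a minimal sketch in Python (the five-field record below loosely mirrors a time/source/host/user/description layout; the sample events are made up):

```python
# Minimal sketch: merge events from several sources (web server logs,
# file system metadata, etc.) into a single time-ordered timeline.
# The (epoch, source, host, user, description) layout is illustrative.

def merge_timeline(*sources):
    events = [ev for src in sources for ev in src]
    return sorted(events, key=lambda ev: ev[0])

# hypothetical sample events
web_logs = [(1274800000, "IIS", "web01", "-", "GET /page.asp?id=1' OR '1'='1")]
fs_meta = [(1274800030, "FILE", "web01", "-", "MACB C:\\inetpub\\scripts\\cmd.asp")]

for ev in merge_timeline(web_logs, fs_meta):
    print(ev)
```

The value comes from the sort: once SQL injection attempts from the logs and file creations from the MFT land in the same ordered view, the intruder's session reads almost like a shell history.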

So, at this point, for those of you who are not creating timelines regularly, what is the biggest impediment or obstacle for you? Is it lack of knowledge, lack of access to tools...what?

Speaking of speaking engagements...I'm scheduled to be on with the guys from the Securabit podcast on 2 June. I'm a big fan of Ovie and Bret's CyberSpeak podcast and these kinds of things are always interesting. Most recently, I listened to the interview that included Dr. Eric Cole...whom I once worked with when he was at Teligent (I was with a consulting firm), albeit only for a couple of weeks.

I've also been on Lee Whitfield's Forensic4Cast podcast. Lee and Simon are swinging the Forensic4Cast Awards 2010, which they started last year...if you're planning to be at the SANS Forensic Summit this July (and even if you're not), be sure to enter a nomination and vote. You can view the 2009 awards here.

There's an updated version of CaseNotes keep case notes, right? Chris blogged on it, as well as on the importance of keeping case notes.

Wednesday, May 26, 2010

More links...

WFA 2/e Book Review
Peter Sheffield posted a review of WFA 2/e over on the SANS Forensic Blog. What can I say, besides, "Thanks, Peter!" I really appreciate it when folks let me know what they think of the book, or the tools, but I appreciate it even more when they do so publicly, like what Peter did. Such things really help the sales of the book. More importantly, it's beneficial for me to see that others in the community have found the work and effort put into the books to be useful or valuable.

I received the following quote from Chris Perkins, CISSP, ACE (Hujarl), Digital Forensic Investigator, along with his authorization to share it:

Some years ago while at a tech conference I ran across your first edition of the Windows Forensics Analysis book. On my return flight I read it cover to cover, and read the Registry Analysis chapter twice! I had an interest in the forensic space previous to this experience with my work as a security analyst, but your book spurred my interest even further and helped drive me towards my current career.

Fast forward to today and I am still referencing that great book frequently in my work as a Digital Forensic Investigator. It is well worn and dog-eared throughout.

In addition, your RegRipper tool is used constantly in my investigations, especially in Intellectual Property work. The beauty of the tool is its quick, clean text reports and flexibility for additional plug-ins based on specific needs. It can be verified directly with other tools and methods, which is a very necessary process to validate the data.

Thanks so much for the great work!

Thanks, Chris, for your words, as well as for allowing me to share your comments publicly.

MS goes Open Source
Microsoft recently released a tool for viewing the content structure of PST files called the PST Data Structure View Tool, or pstviewtool. MS has also released the PST file format SDK. These releases follow MS's release of the .pst structure specification earlier this year, and make it easier for programmers to access the contents of PST files without having Outlook or Exchange installed.

Date Formats
While writing recently, I've been trying to figure out where a good place would be to fit in a discussion, or even just state, "here are the date formats used by MS". The Old New Thing blog has a very good post on time stamp formats. One that isn't mentioned in the post is the 128-bit SYSTEMTIME format; this one is used in Scheduled Task .job files, as well as in several Registry keys that have to do with wireless access on Vista and above. Please don't think that that's a complete or comprehensive list of where the date format is used in Windows...those are only the two places that I'm aware of, and there are likely others.
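As a quick illustration, here's a sketch (in Python) of converting the two formats mentioned above. The FILETIME epoch math is standard; note that whether a SYSTEMTIME value is UTC or local time depends on where it came from:

```python
import struct
from datetime import datetime, timedelta, timezone

def filetime_to_datetime(ft: int) -> datetime:
    # 64-bit FILETIME: 100-nanosecond intervals since 1 Jan 1601 (UTC)
    return datetime(1601, 1, 1, tzinfo=timezone.utc) + timedelta(microseconds=ft // 10)

def parse_systemtime(buf: bytes) -> datetime:
    # 128-bit SYSTEMTIME: eight little-endian 16-bit words
    # (year, month, day-of-week, day, hour, minute, second, milliseconds);
    # whether the value is UTC or local time depends on the source
    year, month, _dow, day, hour, minute, sec, ms = struct.unpack("<8H", buf)
    return datetime(year, month, day, hour, minute, sec, ms * 1000)

# 116444736000000000 is the FILETIME value for the Unix epoch
print(filetime_to_datetime(116444736000000000))  # 1970-01-01 00:00:00+00:00
```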

I've recently seen and received a number of questions about Office 97-2003 metadata date formats, what the date values refer to (GMT vs. local system time), and where they're located in the binary format. Well, MS was nice enough to publish the formats, which you can use to verify findings from other tools. Click on the link in the "Date Formats" section above, and you'll see that the OLE date format is different from other formats, particularly the more recent Office (2007, 2010) formats.
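One of the formats covered there, the OLE automation date, is a floating-point count of days since 30 December 1899, with the fractional part carrying the time of day. A minimal conversion sketch follows; check the published spec to confirm which fields in a given file actually use this format:

```python
from datetime import datetime, timedelta

def oledate_to_datetime(value: float) -> datetime:
    # OLE automation DATE: days since 30 Dec 1899; fraction is time of day
    return datetime(1899, 12, 30) + timedelta(days=value)

print(oledate_to_datetime(2.75))  # 1900-01-01 18:00:00
```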

User Account Analysis
The issue of user account analysis comes up time and again, and I thought that this would be worth repeating. I've seen the question of the "password not required" flag and what it means come up in various forums, most recently in the new RegRipper forums. I understand that this can be a bit tough to grasp, so I'd like to post it again.

With respect to the "password not required" flag in the output of the plugin, what I got from someone at MS is as follows:

That specifies that the password-length and complexity policy settings do not apply to this user. If you do not set a password then you should be able to enable the account and logon with just the user account. If you set a password for the account, then you will need to provide that password at logon. Setting this flag on an existing account with a password does not allow you to logon to the account without the password.

I hope that helps those of you doing analysis.
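If you want to decode the flags yourself, a sketch of the idea follows. The bit values shown are the commonly documented SAM ACB flag definitions, so verify them against your parser of choice before relying on them:

```python
# Sketch of decoding SAM account control (ACB) flags, along the lines of
# what a samparse-style plugin reports. Bit values follow commonly
# documented ACB definitions; verify against your own parser.
ACB_FLAGS = {
    0x0001: "Account Disabled",
    0x0004: "Password not required",
    0x0010: "Normal user account",
    0x0200: "Password does not expire",
}

def decode_acb(flags: int) -> list:
    return [desc for bit, desc in sorted(ACB_FLAGS.items()) if flags & bit]

print(decode_acb(0x0014))  # ['Password not required', 'Normal user account']
```

The key point stands regardless of tooling: "Password not required" set does not mean the account has no password; it only means the password policy wasn't applied.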

Ovie (sans Bret) has posted another CyberSpeak podcast...check it out!

TSK/Open Source Conference
Just a reminder about the TSK/Open Source Digital Forensics Conference coming up on 9 June! Check out the presentations!

SANS Forensic Summit
The SANS Forensic Summit is coming up, 8/9 July! Check it out!

Friday, May 21, 2010

Analysis Tips

I wanted to throw out a couple of things that I've run across...

I've worked a number of incidents where malware has been placed on a system and its MAC times "stomped", either through something similar to timestomp, or through copying the times from a legitimate file. In such cases, extracting the $FILE_NAME attribute times for the file from the MFT has been essential for establishing accuracy in a timeline. Once this has been done, everything has fallen into place, including aligning the time with other data sources in the timeline (Scheduled Task log, Event Logs, etc.).
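As a sketch of the comparison itself (the MFT parsing is left to whatever tool you use to extract the attributes), two common timestomping tells are $STANDARD_INFORMATION times that predate the $FILE_NAME times, and $SI times with zeroed sub-second values:

```python
from datetime import datetime

def possible_timestomp(si_times, fn_times):
    # Heuristics only; verify against other timeline sources.
    # backdated: $SI times earlier than the corresponding $FN times
    backdated = any(si < fn for si, fn in zip(si_times, fn_times))
    # zeroed: all $SI times lack sub-second precision, a known
    # side effect of some timestomping tools
    zeroed = all(si.microsecond == 0 for si in si_times)
    return backdated or zeroed

si = [datetime(2004, 8, 4, 12, 0, 0)]            # "stomped" to look old
fn = [datetime(2010, 5, 17, 3, 22, 41, 501200)]  # actual creation
print(possible_timestomp(si, fn))  # True
```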

In a number of instances, the $LogFile NTFS metadata file has been a gold mine. I worked on an exam last year where we thought that the intruder had figured out the admin password for a web-based CMS. I had a live image of the drive, and found the contents of the password file in $LogFile, which clearly demonstrated that the admin password was blank. BinText worked great for this.
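BinText is a GUI tool, but the underlying idea is just pulling printable strings out of a binary blob. A rough sketch of doing the same thing over something like $LogFile (the sample data is made up):

```python
import re

def extract_strings(data: bytes, min_len: int = 6):
    # Rough equivalent of what strings/BinText does: pull printable
    # ASCII and UTF-16LE runs out of a binary blob such as $LogFile
    ascii_pat = rb"[ -~]{%d,}" % min_len
    utf16_pat = rb"(?:[ -~]\x00){%d,}" % min_len
    found = [m.group().decode("ascii") for m in re.finditer(ascii_pat, data)]
    found += [m.group().decode("utf-16le") for m in re.finditer(utf16_pat, data)]
    return found

blob = b"\x00\x01admin::\x00\x02" + "password".encode("utf-16le") + b"\xff"
print(extract_strings(blob))  # ['admin::', 'password']
```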

The $I30 index file appears in directories when you're viewing an image in FTK Imager, and in a number of instances, it has been indispensable. During a recent exam, due to a lack of temporal proximity, I found the directory used by an intruder, but it was apparently empty. The $I30 file contained references to the artifacts in question.

Thoughts on Analysis
When working on an engagement...IR, forensic analysis, etc...we often lose sight of what the term "analysis" means. We're supposed to be the experts, and the "customer" is relying on us to extract, review, and distill extremely technical data into something that they can understand and use. Running through gigabytes of IIS logs, dumping out the entries that appear to indicate SQL injection, and giving those to the customer for them to research and interpret is NOT analysis.

Let me break it down so you can put this on a t-shirt:

Run tool + dump output != analysis

Make sense?

Temporal Proximity

A couple of years ago, I heard someone...a really smart someone...talk about "temporal proximity". I know that sounds Star Trek-y, but the reference was to initiating response activities as close to an incident as possible.

Several of the industry reports indicate that the majority of incidents that the vendors have responded to have been the result of third-party notification to the victim. That alone indicates a number of things, the lack of temporal proximity (perhaps a better description would be "temporal dispersion") being one of them.

Why is this so important? Well, a lot of information...strike that...a lot of critical information can exist in memory, making it very volatile. Processes complete, network connections terminate. The longer you wait, the more this happens...and the more likely the system is to be rebooted.

Some intruders get on systems, run tools, and then leave with data. When they do so, they may delete their toolkits. I've seen batch files that include the "del" command for just that purpose. Well, the more temporal dispersion you have from the incident, the less likely you are to recover the deleted files. In case you haven't heard, Windows (especially XP) has its own built-in antiforensics measures.

Okay, you're probably wondering what I'm talking about...so I'll tell you. For Windows systems, even if you don't interact with the system, stuff still happens, particularly with XP. Just let an XP system sit for a couple of days, and you'll see. Restore Points are created every 24 hours, and if the disk space available for the RPs is getting short, older ones will be deleted. A limited defrag is run every three days. And this is just for an XP system that sits there with no network connectivity and no one interacting with it. Now, add to that things like Windows software and application know, the stuff that just kind of happens automatically with a network-connected system. Even with minimal auditing enabled, stuff still gets logged to the Event Log...more so on Vista and Windows 7, simply because there are so many more logs.

Now, add to the mix that no one within your infrastructure is aware of an incident (intruder, malware, etc.), and systems remain up, functioning, operational and in use. I've been on engagements where we collected data from a system and then three days later collected the same data...and you'd swear that they were two different systems. Prefetch files had been deleted, deleted files had been overwritten by OS and application updates, applications and tools being run, etc.

In order to achieve temporal proximity, you need a couple of things. First, visibility...if you don't have visibility into your infrastructure, how will you know when something occurs? You can't really expect to know when something goes wrong or changes if you're not monitoring, right?

Second, you need a plan. What's your IR plan? Acquire memory and disk, and then take the system offline? Or panic and do nothing at all until someone who has no idea what's going on makes a decision? I can't tell you the number of times I've responded and found out that the incident had been detected a month prior, and the infected/compromised system had been left up the entire time.

Me: "You know the intruder has been siphoning data off of this system for the past month, right?"

Them: "We didn't know what to do."

This happens more than you'd care to know, and not just to one vertical...not just PCI, but to many, many types of victims.

One final note...Marines in training learn what are referred to as "immediate actions". These are simple tasks that you use to clear a jammed weapon. They're simple when you're on the range, on a bright, sunny day after a good night's sleep. You can ask a range coach if you're doing it right. But we're trained on this over and over because you never need it in those conditions...when you're going to need that reaction to be programmed is during an assault, at 2:30am, after you've gone without sleep for two or more days and maybe haven't eaten in as long. And it's raining. And it's cold.

Are your IT assets critical to your business? If I were to back up a truck and take all of your computers...all desktops, laptops, servers, would that affect your business? It would disappear, wouldn't it? Well, if IT assets are so critical to your business, why not protect them? The bad guys aren't coming into your organization and walking out with boxes full of papers...they're coming into your network and stealing data that way. And they're successful because in many cases, they have greater visibility into your infrastructure than you do.

Friday, May 14, 2010

Linkity linkity
Brett updated the site recently...take a look. I received some emails recently regarding "404 Not Found" messages for some stuff linked at the original site, and then received a couple of messages from Brett.

Brett's done a great job of maintaining the site, but for the site to really be of value, it takes more than folks in the community coming by to grab RegRipper and that's it. It takes contributions...thoughts, ideas, communication, etc. One particular project that's benefited from a very active community is Volatility.

Here's an interesting post from the Binary Intelligence blog, explaining how Matt went about modifying the current RegRipper to meet his own needs! Great job, Matt!

64-bit Software hives
Speaking of the Registry, does anyone have Software hives from well-used 64-bit systems that they're willing to share, for research purposes?

Chris Brown recently released v6.5 of ProDiscover. Chris very graciously provided me with a license back when PD was at version 3, and I've used the framework ever since. Chris added Perl as the scripting language for PD, in the form of ProScripting, a while back, and that has proven to be very beneficial. Most times my case notes will start with something like, "Created a case project in ProDiscover v6.0, added the image, and populated the Registry and Internet History Views." This is a great way to get an initial view of things, particularly if you suspect malware has infected the system. One of the things I look for first is the Default User account having a web browsing history.

When conducting IR or analyzing live IR data, I tend to lean toward a little Least Frequency of Occurrence (LFO) analysis as an approach to malware detection on systems. Most times, what I do is grab the output of handle.exe and run it through a Perl script (posted to the Files section of the Win4n6 Yahoo group) to get a list of the mutants/mutexes that are unique or appear least frequently on the system.
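My script is in Perl, but the frequency-counting idea can be sketched in a few lines of Python (the handle.exe line layout assumed here is illustrative; adjust the parsing for the real output):

```python
from collections import Counter

def lfo_mutants(handle_lines, threshold=1):
    # Count how often each mutant name appears in handle.exe-style output
    # and return those at or below the threshold (the "least frequent").
    # The line layout assumed here is illustrative, not exact.
    names = [line.rsplit("\\", 1)[-1].strip()
             for line in handle_lines if "Mutant" in line]
    counts = Counter(names)
    return sorted((n, c) for n, c in counts.items() if c <= threshold)

sample = [
    r"  1A4: Mutant  \BaseNamedObjects\ShimCacheMutex",
    r"  1B0: Mutant  \BaseNamedObjects\ShimCacheMutex",
    r"  2C8: Mutant  \BaseNamedObjects\x4fx_malware_x",
]
print(lfo_mutants(sample))  # [('x4fx_malware_x', 1)]
```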

When it comes to LFO analysis, some folks seem to think that means just running a tool, like handle.exe, or just running a script that locates all of those mutexes that are unique and appear only once in the output of handle. But that's not analysis...that's just running a tool and getting data. Due to the way malware authors use mutexes, you have to look for something odd and out of place, so comparing the names with each other is one way to conduct LFO analysis. Another way to address this if you're looking at multiple systems is to compare your findings between systems. Let's say that you have some systems you know to not be infected, and others that you suspect may be infected...conducting LFO analysis on each system and comparing the output across all systems may provide some interesting findings.
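That cross-system comparison can be as simple as a set difference, assuming you've already reduced each system's handle output to a set of mutex names (the names below are made up):

```python
def suspect_mutexes(known_clean, suspect):
    # known_clean: list of sets of mutex names from known-good systems
    # suspect: set of mutex names from the system under investigation
    baseline = set().union(*known_clean)
    return suspect - baseline

clean = [{"ShimCacheMutex", "DBWinMutex"}, {"ShimCacheMutex"}]
print(suspect_mutexes(clean, {"ShimCacheMutex", "__evil_x99__"}))  # {'__evil_x99__'}
```

The set difference is only the data-gathering step; the analysis is in deciding whether what's left over is actually odd and out of place.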

My point is that running a tool and dumping the output into a report is NOT analysis, folks.

In addition to handle.exe, I ran across this post on Jamie Blasco's blog today that listed two tools, one of which would be of use during IR...enumeratemutex.exe. While this tool would enumerate the mutexes for you and allow you to do a really quick LFO analysis, it wouldn't necessarily allow you to tie a mutex to a specific process. However, it can be a good check.

TSK Open Source Conference
June 9th is the date for Brian Carrier's first TSK and Open Source Forensics Conference, right in my own backyard (well, almost...that would be kind of cool though...).

I'll be giving a presentation on using open source tools to create timelines for analysis.

Cory's apparently doing a presentation entitled Commando it wrong of me to want to go see Cory talk about doing forensics commando? I have to admit that there's a certain horrifying fascination there...but is it really so wrong? ;-)

The venue isn't far from a Dogfish Head Ale House, and Vintage 51 is close, as well. Look to one of those venues for the conference pre-party (I'm kind of proactive and not into after-parties...) the evening before.

SANS Forensic Summit
The SANS What Works in Forensics and Incident Response summit is coming up this summer in Washington, DC.

The agenda looks like another good one this year. Jesse will be talking about fuzzy hashing, Troy will be talking about Windows 7, and Richard will be presenting on the CIRT-level response to APT. Between the presentations and panels, this looks like it will be another great opportunity.

I'll be giving a workshop on adding pertinent Registry data to a timeline (can you see a trend developing here, with my presentation at the TSK conference?), and how doing so can really help develop context to and confidence in the data you're looking at.

Looks like I'm on a panel again this time around...those are always a good time. Troy Larson will be there...everyone should come on by and check out his Sharky lazer pointer. No, that's not a euphemism for anything.