Monday, August 31, 2009

What is "Registry Analysis"??

What is this thing called (that's the second time I've used that Benny Hill reference in this blog)...Registry analysis?

For most folks, this phrase probably conjures up memories of opening a hive file in their favorite Registry viewer (RegEdit, EnCase, ProDiscover, RFV, etc.) and looking at a couple of the more popular entries, such as the "ubiquitous Run key". Others may run through an entire list or spreadsheet's worth of Registry keys and values. Manually. By hand. How boring. And if you're a corporate consultant, there's no better way to waste a customer's money than to burn through the hours on this chore, calling it Registry analysis.

But is this really analysis? Is analysis simply viewing data, or is it extracting data and interpreting that data not only in its own context, but also in context with other data?

One of the things I've stated in my books as well as in this blog is that when interpreting data (particularly data extracted from the Registry), it's vital that the analyst understand what created or modified (with deletion being the extreme form of modification) a particular artifact, so that the nature and context of that artifact is understood (and presented/explained). For example, what would lead to a graphics image file being referenced in the MRU list of an image viewing application? The artifact is there, but what led to its creation? Depending upon how that particular MRU list is maintained by the application, you may have a very specific timestamp associated with the artifact, as it may correlate with the last access time of the file. Ah, but there's another point about Registry analysis, analysis in general, and understanding the context of artifacts...beginning with Vista, MS disabled the updating of last access times on files by default, so now analysts can't correlate a file's last access time to other artifacts.

So my point is that Registry analysis isn't just about viewing certain keys...that's Registry viewing. Rather, Registry analysis is about interpreting Registry artifacts (keys, LastWrite times, values, and data) in the context of the actions that led to their creation and modification, as well as in the context of other artifacts. Tools such as RegRipper strive to assist analysts and examiners with this sort of analysis by providing a framework for extraction, correlation, a modicum of interpretation, and the presentation of the data with some supporting information.

Growth and research in this area appears to be sought after by the community, but is also limited by a lack of support and contributions from the community.

Timeline Analysis
Registry data can be an integral part of a timeline created for analysis; however, there is much more timestamped data available in the Registry than just key LastWrite times...sort of like that quote from Hamlet. For example, on Windows XP, some data holds the SSIDs that the user connected the system to via wireless networking, as well as the WAP MAC addresses and when the connection was made. On Vista, you also get the first and last time that SSID was connected to (as well as the WAP MAC address). Let's not forget other keys and values, such as MRU listings and one of my personal favorites, the UserAssist subkeys. These aren't the only differences between Windows versions...and I'm sure that there are others out there who are working on documenting these differences besides myself.
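Speaking of the UserAssist subkeys, the value names beneath them are ROT-13 "encoded", and decoding them in Perl is nearly a one-liner; here's a minimal sketch (the sample value name below is just an illustration of the UEME_ prefix you'll see once decoded):

```perl
# UserAssist value names are ROT-13 "encoded"; tr/// handles the decode
sub rot13 {
	my $name = shift;
	$name =~ tr/N-ZA-Mn-za-m/A-Za-z/;
	return $name;
}

print rot13("HRZR_EHACNGU"), "\n";   # prints "UEME_RUNPATH"
```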

Links and Stuff (Registry research)

The subtitle of this blog post is "There are more things in heaven and earth than are dreamt of in your EnScript", to paraphrase Hamlet.

Thanks to the SANS ISC blog, I ran across a free AV product called Immunet. This seems very interesting as it takes a "cloud" approach to protection...I know that "cloud" is becoming something of a buzzword in the computing industry, but maybe this will bring it home for the older folks: it takes a SETI@Home approach to protection. How's that? The concept (not talking about the actual implementation) sounds pretty cool...

I ran across a tool called TrojanHunter recently that looks pretty interesting from a live system response perspective. The main page for TrojanHunter is absolutely correct...there are Trojans and other malware that don't necessarily hide on the system as much as they are DLLs injected into the memory of other processes. Therefore, when you're doing your live response, you're going to see network connections for some processes that may not normally have network connections...because it's the Trojan that's doing it.

Lately, I've been working on some Registry research, and I came across this page at the MS site, which includes the following:

Executable binary code should never be stored in the registry.

Pretty cool, eh? That's why I wrote a plugin for RegRipper...Don had written on his blog not long ago about malware that wrote executable files into binary Registry values, so there had to be a way to detect this sort of thing.

As part of this Registry research I was doing, I was digging into the Shell\BagMRU data, as I had read this excellent paper on the use of the keys, but noticed that the paper didn't include anything about the binary data itself. I contacted the authors of the paper and Mr. Zhu was kind enough to respond with some information that proved to be very helpful. I took it from there and wrote up a RegRipper plugin to parse the information...several actually, that provide the information in various formats. Right now, however, it's very bare bones and the data needs to be formatted a bit (no, I take that back...a LOT) better.

Also during that research, I found something interesting...actually, a couple of somethings. First, the MUICache key on Vista has moved to the other user hive file, %LocalAppData%\Microsoft\Windows\UsrClass.dat. On a Vista system, I found the key in the UsrClass.dat file, in this path: Local Settings\Software\Microsoft\Windows\Shell\MuiCache.

The other interesting thing I found was that some of the Shell\BagMRU key appeared to be in the normal location in the NTUSER.DAT file, but there was additional information in the UsrClass.dat file, as well, in the path, Local Settings\Software\Microsoft\Windows\Shell\BagMRU. In fact, some of the information in the UsrClass.dat file seemed to be more inclusive...but that's just in my initial testing.

While we're on the topic of the Registry, it's completely clear to me that the Windows Registry maintains a great deal of data, and as has been the case with memory analysis, things also seem to change as the versions of Windows change. With Vista and Windows 7, not only has a great deal of the information we would normally expect to look for on an XP system moved, but new data associated with wireless networking (specifically, binary values for wireless profiles named DateCreated and DateLastConnected) has been added, along with an entirely new format for the dates.

Mark has a blog post about translating the date format here; completely separately, I wrote the below code for my own use (thanks to my friends who helped me with the format):

sub parseDate128 {
# Parses the 128-bit date format: eight 16-bit little-endian values
# (year, month, day-of-week, day, hour, minute, second, milliseconds)
	my $date = $_[0];

	my @months = ("Jan","Feb","Mar","Apr","May","Jun","Jul",
	              "Aug","Sep","Oct","Nov","Dec");
	my @days = ("Sun","Mon","Tue","Wed","Thu","Fri","Sat");

	my ($yr,$mon,$dow,$dom,$hr,$min,$sec,$ms) = unpack("v*",$date);

	$hr = "0".$hr if ($hr < 10);
	$min = "0".$min if ($min < 10);
	$sec = "0".$sec if ($sec < 10);
	my $str = $days[$dow]." ".$months[$mon - 1]." ".$dom." ".$hr.":".$min.":".$sec." ".$yr;
	return $str;
}
Okay, so this shows us that, clearly, things change between versions of Windows in more than just the memory analysis arena. Not only does the Registry contain some time stamps in the 32-bit Unix time format, as well as the more expected 64-bit Windows FILETIME format, but now there's this 128-bit format for time stamps!
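As an aside, translating that more familiar 64-bit FILETIME format (100-nanosecond intervals since 1 Jan 1601) into a 32-bit Unix time is the sort of conversion these tools do constantly; this is a sketch of the approach I use in my own Perl code, taking the FILETIME as its two 32-bit halves:

```perl
# Convert a 64-bit FILETIME (passed as low/high 32-bit DWORDs) to Unix time;
# 0x019db1de:0xd53e8000 is the FILETIME value for the Unix epoch (1 Jan 1970)
sub getTime {
	my ($lo,$hi) = @_;
	my $t;
	if ($lo == 0 && $hi == 0) {
		$t = 0;
	} else {
		$lo -= 0xd53e8000;
		$hi -= 0x019db1de;
		$t = int($hi*429.4967296 + $lo/1e7);
	}
	$t = 0 if ($t < 0);
	return $t;
}
```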

More to come...

Wednesday, August 26, 2009

Goin' commando

Cory had a post a bit ago about using alternatives to commercial analysis suites when conducting an exam, and that got me thinking...when I wrote WFA 2/e, one of the things I was acutely aware of was that some of the information would age pretty quickly; that is, from the time I submitted the manuscript (early March) until the book was published (June), a LOT of things would change or improve, with new tools and new versions coming out. So something like a published book would be a good start, but it wouldn't be a great way to keep track of freely available tools that may be of use. Considering the fact that in most cases, folks don't even look for (or in some cases, write) tools until they actually need them, something online and easily edited (ForensicWiki) would be a better resource for tracking this sort of thing. The ForensicWiki would also be a great resource not only for providing information about tools (free or otherwise) for conducting analysis, but also for information on the format of the files being analyzed.

As a side note, I've found that over the past year or more, with the exception of PCI-specific searches, I've pretty much gone commando (i.e., sans dongle) on my exams, relying instead on specific, free tools...not because I have anything against the commercial stuff, but because the free tools fit the bill for what I needed. Does that make me a bad person?

Anyway, I think this would be a great place to start throwing up information, discussion and links to free and open-source tools that folks are using for analyzing various files or formats. This can include general stuff (such as, does anyone have a good, free grep utility for Windows that doesn't use cygwin?).

For example, over on the ForensicFocus forums recently, there was a question regarding viewing information in MSI files. The original poster (OP) found that one of the recommended tools, InstEd, was extremely helpful for what he needed to do.

So, I'll be posting links to and comments about tools here, but I'd love to have folks send in comments or emails about tools they use that are free and/or open-source, and allow them to "go commando" on their exams. Please, no pictures! ;-)

Monday, August 24, 2009


What is a way to use a system to perform various activities, but leave minimal traces? One might think that any of the available cleaner tools (anyone remember SilentRunners.vbs?) would be the answer, but such tools can be too good, making it clear that something was used to wipe artifacts off of the system, harkening back to one of the adages in my books, that the absence of an artifact is itself an artifact.

Think virtualization. Think plugging in a USB device (thumb drive, iPod, etc.) that contains its own running operating system, along with its own tools and storage area. This isn't something that I've seen a lot of, but in a time when the media is telling us that declines in the economy are leading to increased data theft by departing employees, it's something to consider.

Diane Barrett has talked about how virtualization affects forensics. When portable virtual environments such as MojoPac or Moka5 are used, the analyst is presented with a whole new set of challenges, as the usual remnants and artifacts (i.e., browser history, Registry settings, etc.) won't be available on the confiscated system. Instead, all artifacts will be on the USB storage device that contains the virtual environment, and only indications of the use of these environments (USBStor and MUICache Registry key entries, etc.) will be found.

So, the days of a simple, straightforward examination are fading into the past. Concerns of data leakage or IP theft just took on another dimension... going from file copying and malware and Trojans to leakage via social networking sites and virtualized environments. What's needed is specialized research and training to keep up with developments.

Qemu Manager - Manager for Qemu VMs
Running the OLPC Image in Qemu
ReactOS (use Qemu or VMWare)
Run Haiku under Qemu (Haiku is based on BeOS...I added this one for pure kewlness...)
Portable Virtual Privacy Machine
Windows + Qemu + Plan 9 (again...pure kewlness)
OS/2 Warp 4 on Qemu (ok, that's just going for the extra nerd points...)

Papers, Tools, and Such

If you do any work at all with network traffic captures (ie, .pcap files), you should take a look at Claus's Network Capture Tools and Utilities post. I mentioned several of the tools listed in Claus's post in Chapter 9 of Windows Forensic Analysis 2/e, and some of those tools have been updated. If you want to try the tools out but don't have any "interesting" traffic available, check out sites like this or this.

I didn't attend DFRWS this year, but I did see this very interesting paper about the CyberForensics TimeLab (CFTL) by Jens Olsson and Martin Boldt. CFTL is a "computer forensic timeline visualization tool", with the engine developed in Perl (yes!) and a C#/.NET GUI frontend. According to the paper, timestamped data is automatically extracted from an acquired image via processing with filters, and then stored in an XML format, which is then made available to the analyst via the frontend. CFTL appears to take a similar approach to ex-tip, in that filters extract timestamped data from various file formats. Taking the similarities a step further, CFTL also appears to parse Registry hive files only for keys and LastWrite times. From just reading the paper, the similarities end there, as CFTL uses an automated approach to extracting information from an acquired image, making in-processing and development of the XML file a fairly straightforward process.

I applaud the efforts of the authors, as this is just the kind of work we need to see within the computer forensic analysis community. Without this kind of work, we wouldn't move forward. However, I have a couple of thoughts with respect to this sort of approach to timeline creation and analysis. First, the context available from a timeline is derived from contents of the events themselves, not in a bar chart or histogram showing the numbers or frequencies of events within a given time period. For example, automatic software updates can lead to a great deal of file system and Registry activity, and any malicious activity during that timeframe will be hidden. Also, the LastWrite time for the ubiquitous Run key isn't nearly as relevant to an examination as the values contained in the key.

Second, I have heard analysts say time and time again that when developing a timeline, they want everything, all of the available data, and that data reduction will be performed later. With respect to the Registry, I tend to disagree, as in most cases, the timestamps embedded within Registry data provide greater context than the key's LastWrite time. I tend to agree that if you don't really know and understand what you have available to you within an image acquired from a Windows system, go ahead and grab everything that's available. I also believe that if you do know what you have available to you and what data you can use, you can take an iterative approach to building your timeline for analysis. I've done this myself, and I've found it to be much faster and more accurate than dumping all of the Registry key LastWrite times into a timeline and trying to make heads-or-tails of what happened.
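To illustrate what I mean by building a timeline iteratively, the core of it is nothing more than normalizing each event from each source to a timestamped record and sorting the merged set; the pipe-delimited layout below is a simplified stand-in for illustration only, not the actual format my tools use:

```perl
# Sketch: merge normalized events from several sources into one sorted
# timeline. Each event here is "epoch|source|description" (an assumed,
# simplified field layout for illustration purposes).
sub buildTimeline {
	my @events = @_;
	return sort {
		(split(/\|/, $a))[0] <=> (split(/\|/, $b))[0]
	} @events;
}

my @timeline = buildTimeline(
	"1251331200|REG|Run key LastWrite",
	"1251327600|FILE|malware.exe created",
	"1251334800|EVT|Event ID 528 logon",
);
print "$_\n" foreach (@timeline);
```

The point is that each source (file system, Event Logs, Registry) can be added or left out of the merge independently, which is what makes the mini-timeline approach possible.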

From Spanagel Hall out at NPS in Monterey, CA (my alma mater) comes a very interesting master's thesis from Greg Roussas titled "Visualization of Client-Side Web Browsing and Email Activity". This approach reminds me of the Analyst's Notebook, and I can see how this sort of approach would be useful.

A bit of computer trivia...Gary Kildall once had an office in Spanagel Hall. I used to walk by it almost every day.

I caught this ComputerWorld article about NZ police releasing a virtualized environment tool for evidence. One thing about the article that concerned me was this statement:

Digital forensic examiners faced with a complex inquiry can spend weeks delving into a computer to find the traces of evidence required for a successful prosecution, Police say.

Ouch! Really? Weeks? Why? On the commercial consulting side, a "complex inquiry" would be broken down into discrete goals and steps, each of which would be provided an answer, or something along the lines of "...could not be determined conclusively...", along with the reason why. But really...weeks? Anyway, in the end, the new tool sounds as if it's LiveView with a web interface...but hey, if it helps LE put bad guys in jail, so be it. Personally, I think that there's a lot being missed using something like this, but I do see how it can be a very useful tool.

Finally, IronGeek posted the video for a class he gave on anti-forensics. The video itself is quite long (just over 3 hrs), so you might want to grab the slides and check out the links in the page. There's some pretty interesting (re: challenging) stuff presented in the links alone (I haven't watched the video yet...).

Tuesday, August 18, 2009

Tools and Links

From DFRWS2009 (papers here) and Andreas's tweets I found out about NFI Defraser, a utility that reportedly detects full or partial multimedia files in data streams, such as unallocated space within an acquired image. Defraser downloads as an MSI file, so it appears to be Windows-based. John posted some great info on Defraser to the SANS Forensic blog.

Also from Andreas's tweets I found out about Snorkel from NFILabs. Snorkel is a Java-based library that provides access to acquired images. While an evaluation version is available for testing, the implications of something like this are incredible! Imagine being able to completely script (ie, automate) an entire pre-processing examination of an acquired image, where much of what is now done manually would be a matter of inputting a job and waiting.

Internet Evidence Finder from JADSoftware has been updated to version 2. IEF v2.0 runs searches for a variety of Internet artifacts, such as Facebook, Yahoo and Google chats, Limewire search histories, etc. If this is something you're looking for, particularly with regards to inappropriate usage or data leakage issues (hey, USB devices aren't the only way to get data off of a system), then this is a tool you'll want in your kit.

Also be sure to check out Jad's Encrypted Disk Detector tool. EDD is a CLI tool that checks local physical drives for PGP, TrueCrypt and BitLocker encrypted volumes. If you're doing live response and need to know this sort of information before shutting a system down and acquiring an image, then you really need to include a tool like this in your kit.

If you're interested in Vista's Volume Shadow Copies, check out this post from the Forensics from the Sausage Factory blog. Most folks seem to love 'war stories' and 'case studies', so this is something you might want to take a look at.

Speaking of analysis, Chad's post on the SANS Forensic blog about demystifying the use of defrag on a Vista system is very well-written and interesting. This follows Chad's other post that takes a look at the same subject, but from an XP perspective. Posts such as these are extremely important, IMHO, as they look at attempting to determine intentional use of the disk defragmenter utility, given the fact that the system itself runs a limited defrag on a regular basis. I think that too many times, analysts fall into the trap of thinking that if a Prefetch file exists on an XP system, that indicates that the user ran the utility...end of story. However, per Microsoft (from the section on Prefetch):

Then it launches the system defragmenter with a command-line option that tells the defragmenter to defragment based on the contents of the file instead of performing a full defrag.

On a side note, one of the tools I recently posted to the Win4n6 Yahoo group as part of the timeline creation toolkit does a great job of parsing information/metadata from both XP and Vista Prefetch files.

The folks over at the SANS Internet Storm Center posted a compilation of tools for extracting files from pcaps. Very cool. I talked about NetworkMiner in WFA 2/e, and it's one of my favorites.

Sunday, August 16, 2009

HelpNet Security Interview Posted

I recently responded to an email Q&A session for HelpNet Security; the interview is posted here.


Thursday, August 13, 2009

Timeline Creation Tools Posted

I've posted the currently available timeline creation tools to the Files section of the Win4n6 Yahoo group, along with a PDF document that illustrates how to use the tools.

The tools themselves are somewhat raw at the moment. They're all Perl scripts (none have been compiled into standalone EXEs) and therefore require that ActiveState Perl be installed to use them (I used version 5.8 to develop the scripts). Some of the scripts require additional modules, but they're pretty easy to install using the Perl Package Manager (PPM) that ships as part of ActiveState Perl.

At the moment, the process for using the tools is manual...there's no single "Find all evidence" button to push...this isn't Nintendo forensics. However, there is a method to my madness...this provides the analyst with the flexibility to create mini-timelines, comprised of a subset of available sources, such as only the Event Logs. I've found this capability alone to be extremely valuable.

Finally, these tools are a start, and even though they're still raw at the moment, they open the door to an extremely valuable analysis technique. If this is something you're interested in, download the zipped archive, take a look at the PDF document (walks through using the tools to build an actual timeline from an image), and let me know what you think. Posting this isn't the end, it's just the beginning...

Monday, August 10, 2009

Links and Stuff

Once again, I present for your reading pleasure an odd mish-mash of links to information that defy categorization...

I no longer do PCI forensic audits, but I thought this one was interesting...the BreachSource site includes an interesting tool called BreachProbe, which takes a packet capture and can parse out credit card data. I can't say that I had access to very many packet captures during a breach investigation...most often we'd get a call after the "victim" organization had been notified by an external third party that they were a CPP, usually as a result of a fraud investigation. From there, there was even more time that went by before folks like me were called on-site. However, this does look like a useful tool for those who still do this kind of work.

I've been doing some research on various values and data within the Windows Registry, and part of that included settings for WZCSVC, largely for Windows XP. During my research, I ran across this Symantec web page, which includes a tool called wlan.exe. This looks like it's a great tool for collecting information and troubleshooting issues in a wireless environment. Based, in part, on this information and more about the WZC_WLAN_CONFIG structure, I updated the RegRipper plugin to parse the MAC address of the WAP out of the data. There's some additional work that needs to be done regarding control flags, but so far this has been some pretty interesting information. I originally started looking down this road based on something said in a post to the EnCase user forum, and started digging for some credible, solid information on which to base some Registry research. While the structure I found is specific to WinCE, it seems to apply very well to XP, as well. In my own testing, I pulled the MAC addresses for two SSIDs out of a hive file; the MAC address for the "tmobile" SSID pointed to (via OUI lookup) Cisco as the manufacturer, and the one for the "ender" SSID (my own WAP) pointed to Cisco-Linksys.
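As an aside, pulling the WAP MAC address out of the binary value data is a simple unpack() once you know where it sits. This sketch assumes, based on my reading of the WZC_WLAN_CONFIG structure, that the 6-byte MAC address follows the two leading DWORDs (Length and control flags), i.e., at offset 8...verify that offset against your own test data:

```perl
# Sketch: extract the WAP MAC address from WZC_WLAN_CONFIG-style binary data;
# the 6-byte MAC at offset 8 (after the Length and flags DWORDs) is an
# assumption drawn from the structure definition
sub getMac {
	my $data = shift;
	my @mac = unpack("x8 C6", $data);
	return join(":", map { sprintf "%02X", $_ } @mac);
}
```

Feed it the raw value data for an SSID and you get back something you can run through an OUI lookup.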

A recent post on one of the lists asked about locating indications of the use of at.exe in the Task Scheduler log file. The at.exe command is used to schedule tasks, and on XP and above, you can also use schtasks.exe to create scheduled tasks. However, the schedlgu.txt file does not record the creation of scheduled tasks via either of these tools, but instead records Task Scheduler Service start/stop messages and scheduled task/job execution information. So this is something to keep in mind when conducting analysis if you find a Prefetch file for at.exe.

In a recent discussion regarding Windows trusted shells, someone sent in a link to a tool called "console", which appears to be a replacement for cmd.exe. It looks pretty interesting, and may be a great addition to your toolkit if you find the functionality useful.

A new post on the Hacking Exposed Computer Forensics blog continues the "what did they take when they left" series of posts, and also mentions RegRipper and RipXP. The post brings a lot of the thought processes that an analyst must use into play when trying to answer the question. It's easy to say "there's more to it than that", and I'm sure that the author(s) would agree, but that's why there's a series of posts...putting it all in one post would simply make that post too long and too hard to digest. Also, I'd suggest that since part of the title of the series includes "when", suggesting a relation of events to time, timeline creation and analysis might help to answer these questions. RegRipper doesn't have a plugin that parses all of the information mentioned in the blog post...yet.

On a side note, it looks as if HECF has a second edition on its way out. Nice! I still have the first edition posted prominently on my bookshelf, and it's nice to see a number of computer forensic books out there. As much of a niche as computer forensics is, it's nice to see a choice of books that covers different aspects of the field.

The Virus Bulletin recently released their RAP index, illustrating the results of testing to determine an AV product's reactive and proactive detection capabilities. Kind of scary when you think about it, but then most folks are aware of this and know not to rely on AV as a be-all-end-all, silver bullet solution. I mean, what do you expect, with malware write-ups like this? I mean, seriously...

Finally, RipXP has been picked up in Spanish on the NeoSystemsForensics blog! Nice!

Saturday, August 08, 2009

Thoughts on using a "trusted shell"

Recently I've seen the topic of using a "trusted shell", especially when performing live response on Windows systems, brought up in a couple of forums.

The idea of a "trusted shell" comes from the use of statically-compiled binaries on *nix systems, where the actual executable file carries with it its own libraries and does not rely on libraries on the "victim" system. The idea is that if the actual shell binary or any of the dependent libraries have been compromised, the responder can still get accurate information due to the fact that she's using a "trusted shell". And, of course, there's always the concern about "rootkits", but whenever I see a post or email that includes a reference to rootkits, my FUD radar goes off; in most cases, I don't think that folks understand rootkits, per se. For example, why say that you're only going to use a "trusted" cmd.exe because the cmd.exe on the system may have been subverted by a rootkit, when your "trusted" cmd.exe is going to use the same DLLs as the native one? I mean, most rootkits hook or subvert API functions, not executables.

Mapping this to the world of Windows incident response, one has to ask, how do you statically compile an executable for Windows? Can you even do this? At some point, you're going to have to rely on an MS-provided DLL, and it's all a matter of how deeply the system has been compromised. After all, if the intruder is able to compromise extremely low-level APIs to the point of tricking the kernel, it's game over!

Even SxS manifests will very likely present their own unique set of problems, particularly since most currently available tools and applications don't use manifests and would need to be rewritten.

Finally, when running cmd.exe, there are a number of ways to subvert the use of just that executable without ever touching or modifying the EXE or any of its dependent DLLs. One is through the use of the Command Processor\AutoRun Registry entry (HKLM or HKCU), one is through the exefile Registry entries, and another is through the Image File Execution Options Registry key.

So that leaves me with the question of, why are folks stressing so much about "trusted" shells, when we can't even figure out what that is?

Another way to look at this is that if we've done our IR planning, maybe we want the operating system to lie to us. Think about it...if we're using some form of differential analysis, wouldn't we want to run "netstat -ano" on a system, and then compare that output to a packet capture, firewall logs, or even an nmap scan? Or how about using tools that make disparate API calls, such as pslist.exe and handle.exe, to get similar information (ie, a process listing)? Wouldn't we get some benefit from quick, automatic parsing of the data that would highlight disparities in the output? Let's say you're seeing a steady stream of data flowing from a system, and you have firewall and IDS logs, as well as a packet capture, but the output of netstat shows no indication of the use of the port. Wouldn't this sort of analysis provide us with something useful, and obviate the need for speculation and assumption?
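The comparison itself is trivial to script. Here's a sketch of the idea, assuming you've already parsed the PID column out of the "netstat -ano" output and out of a process listing (the parsing itself is left out here):

```perl
# Sketch of differential analysis: flag PIDs seen in parsed "netstat -ano"
# output that do not appear in a parsed process listing (e.g., pslist output)
sub findOrphanPids {
	my ($netstat_pids, $process_pids) = @_;
	my %known = map { $_ => 1 } @$process_pids;
	my %seen;   # suppress duplicate PIDs (one per connection line)
	return grep { !$known{$_} && !$seen{$_}++ } @$netstat_pids;
}
```

Any PID that comes back from this is a disparity worth digging into, rather than speculating about.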

What are your thoughts?

Tuesday, August 04, 2009

More work on Timeline Analysis

I saw that there's a new post over on the SANS Forensic blog about generating timelines for analysis by adding "alternative" sources of timestamped data. This post points back to Kristinn Guðjónsson's blog post about the log2timeline tool for creating timelines.

This is very similar to some of the stuff I've been working on, and it's great to see that there's more interest in creating timelines for analysis and answering questions.

Another cool thing about Kristinn's work is that it's Perl-based! Sweet!

Speaking of Perl-based approaches to timeline creation, I received an email yesterday that contained a link to the Revealer Toolkit, or "RVT". This toolkit looks pretty promising, so be sure to take a look at the project page for downloads and news. RVT has its own newsletter/Google group, as well.

RVT is an interesting project. First off, there's the Perl aspect. Second, reading through the documentation, it appears that the authors came up with thoughts along the same lines I was thinking when I started down this road; in particular, how to incorporate not just different sources of time stamped data from a single system (ie, file system, EVT files, Registry hives, etc.), but also data from multiple systems, as well as other external sources.

Another interesting aspect of RVT is the ability to plot timelines in a graphical format. I'm having some trouble coming up with a means for plotting timeline data in a meaningful way, so that the analyst is not overwhelmed with raw data, but is instead able to glean some modicum of actual intelligence from the sheer glut of information. For right now, the timeline tools I've developed are a very manual process, but there's a method to my madness...more about that later.

Saturday, August 01, 2009

I love it when stuff works...

Remember my earlier post about parsing the output of handle.exe? Well, I received the output of this tool from a customer with a system that, in all likelihood, was infected with W32.Virut.CF. Using the Perl script from my earlier post, I found a mutant named 'l0r8' within the winlogon.exe process.
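For those curious, the parsing amounts to little more than a regex across the handle.exe output. A sketch follows; the line format shown in the comment is an assumption based on the output I've seen, so adjust the regex to match your version of the tool:

```perl
# Sketch: pull named mutants from handle.exe output lines such as
#   "  1A4: Mutant        \BaseNamedObjects\l0r8"
sub findMutants {
	my @mutants;
	foreach my $line (@_) {
		push(@mutants, $1) if ($line =~ /Mutant\s+\\BaseNamedObjects\\(\S+)/);
	}
	return @mutants;
}

my @found = findMutants("  1A4: Mutant        \\BaseNamedObjects\\l0r8");
print "Mutant found: $_\n" foreach (@found);
```

From there, the mutant names can be checked against the write-ups for known malware families.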

An interesting item about this malware is that it doesn't appear to use autostart locations within the Registry or file system in order to maintain persistence; rather, it achieves persistence as a file infector...some of the files infected get run on a regular basis, so the infection remains persistent across reboots. A list of files infected by at least one variant of Virut can be seen here. Here's a nice write-up on another variant of Virut, and here's an informative blog post from Fortinet.

So, when examining a system that may be infected with Virut, don't expect to find a process running or a Registry entry that points to an executable file. From a post-mortem analysis perspective, look for the artifacts in the write-ups available; in particular, an odd entry in the hosts file, an entry in the Registry allowing outbound connections through the firewall (AuthorizedApplications\List in the DomainProfile or StandardProfile...the RegRipper plugin can help you here), and a value being added to the HKLM\Software\Microsoft\Windows\CurrentVersion\Explorer key.

One important aspect of forensic analysis when it comes to malware infections is that customers are asking the question: how did it first get on the system? This malware doesn't float around on its own, and it doesn't install itself as an executable launching from a Registry key, so we'll have to find another way to determine when and how the system was first infected. One way to do this is to check the last modification times on some of the infected files, and correlate those with the Registry artifacts listed for some of the variants. If you're not sure which files to check (some variants may infect different files), there's another means of determining some of the files to look at: some of the write-ups on the malware state that it patches sfc_os.dll in memory, disabling Windows File Protection, presumably so that it can infect 'protected' files. Therefore, we can use a tool such as WFPCheck to identify infected files, and from there, determine their last modification times.
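As a rough sketch of that last step (Python rather than Perl, for brevity), given a list of suspected infected files on a mounted image (the paths you'd pass in are entirely case-specific), collecting the earliest and latest modification times might look like:

```python
import os
from datetime import datetime, timezone

def infection_window(paths):
    """Return (earliest, latest) last-modification times across the given
    files, skipping any that don't exist. On a mounted image, the mtimes
    approximate when each file was infected."""
    times = []
    for p in paths:
        try:
            times.append(datetime.fromtimestamp(os.path.getmtime(p), tz=timezone.utc))
        except OSError:
            pass  # file missing or inaccessible; skip it
    return (min(times), max(times)) if times else (None, None)
```

You'd point this at the files WFPCheck flags; the earliest mtime gives you a starting point to correlate against the Registry artifacts.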

Don't forget to check the contents of the hosts file, and if the described artifacts are present, check its last modification time, as well.
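A quick Python sketch of that hosts file check; the sample content and the flagged domain here are purely illustrative:

```python
def suspicious_hosts_entries(text):
    """Return hosts-file lines other than the default localhost mapping --
    Virut variants are reported to add an odd entry here."""
    flagged = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # blank lines and comments
        fields = line.split()
        if fields[0] == "127.0.0.1" and fields[1:] == ["localhost"]:
            continue  # the stock default entry
        flagged.append(line)
    return flagged

# Hypothetical hosts file content from an infected system
sample = "# default\n127.0.0.1 localhost\n127.0.0.1 some-odd-domain.pl\n"
print(suspicious_hosts_entries(sample))
```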

Now, one thing to keep in mind is this...what happens when an infected system is rebooted? When the system starts back up, infected EXE files are launched and the activity indicative of the malware occurs again...but are the EXE files re-infected? If so, the last modification time of the files should be shortly after the last reboot...something we can correlate with other system artifacts. So the question becomes, are the Registry and file system artifacts the result of only the first infection instance, or are they reproduced each time the system is rebooted and the infected files are run again?
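One way to test that hypothesis is to compare the infected files' last modification times against the last boot time. A Python sketch, with made-up times and file names (the 30-minute window is an arbitrary assumption):

```python
from datetime import datetime, timedelta, timezone

def reinfected_since_boot(mtimes, boot_time, window_minutes=30):
    """Flag files whose last-modified time falls shortly after the last
    reboot -- consistent with re-infection at startup rather than a single
    original infection event. mtimes maps file name -> datetime."""
    cutoff = boot_time + timedelta(minutes=window_minutes)
    return [p for p, ts in mtimes.items() if boot_time <= ts <= cutoff]

boot = datetime(2009, 6, 26, 8, 0, tzinfo=timezone.utc)
mtimes = {
    "notepad.exe": datetime(2009, 6, 26, 8, 3, tzinfo=timezone.utc),  # just after boot
    "calc.exe": datetime(2009, 5, 1, 12, 0, tzinfo=timezone.utc),     # long before boot
}
print(reinfected_since_boot(mtimes, boot))
```

If most of the infected files cluster inside that window, the mtimes are telling you about the last reboot, not the original infection, and you'll need other artifacts to establish the initial timeframe.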

Now, if you have a Windows XP or Vista system, you may be able to use existing Restore Points or Volume Shadow Copies (respectively) to narrow down (to an approximate window) when the system was infected.

Once you have a timeframe to focus on, use timeline analysis of system and user activity to attempt to determine the original infection vector...via web browsing activity, email attachments, or possibly connection of USB removable storage devices or connections to network shares.

More Links

Picked up this site the other day, with a post about the MS Office Visualization Tool...looks pretty cool! Offvis allows you to graphically view data structures and records within MSOffice documents...neat! If you need this kind of capability and need to be able to extract metadata from Office documents, then this is a good tool to have available. If you have your copy of WFA 2/e, be sure to check out and on the DVD...and check out chapter 8 for an example of how I've used!

Hey, the Illustrious Don Weber has posted about finding malware hiding in (that's right, I said "in") the Registry (Sophos post on the same sort of thing). This is actually pretty amazing when you think about it...how do you go about finding this sort of thing, if you suspect it? Do you do what Don did and essentially stumble across an exported DLL function or a handle to an odd Registry key during memory analysis, or can you find it another way? Remember my recent post where I mentioned least frequency of occurrence (props to Pete Silberman!)? Do something like that and parse through the entire hive file looking for values with binary (REG_BINARY) data types, and (a) map them based on size and (b) look for ones that start with "MZ". To get started on this, I wrote a RegRipper plugin (uploaded it to the Win4n6 Files section, and pasted it into a post on the forums) that parses through any Registry hive file and looks for all values with binary data. For each value with binary data that it finds, the plugin keeps a count and searches the binary data for "MZ", an indicator of a PE file. Here's what the output looks like when the plugin is run (via rip.exe) against the file that Don was looking at:

Launching findexes v.20090728
Key: Microsoft\SysMgr
LastWrite time: Fri Jun 26 08:18:27 2009

Value: ssdt Length: 2464 bytes
Value: hide Length: 3328 bytes

Value: door Length: 110592 bytes

Number of values w/ binary data types: 5103

Number of values w/ MZ in binary data: 3

Pretty neat, eh?
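For anyone curious about the logic rather than the Perl, the core of the plugin boils down to a simple scan. Here's a Python sketch of the same idea; the hive parsing itself is omitted (assume the name/data pairs for REG_BINARY values have already been extracted), and the sample values below are fabricated:

```python
def scan_binary_values(values):
    """Given (name, data) pairs for REG_BINARY values, return the total
    count and the subset whose data contains an 'MZ' signature -- a
    possible embedded PE file."""
    hits = [(name, len(data)) for name, data in values if b"MZ" in data]
    return len(values), hits

# Fabricated value data for illustration
values = [
    ("ssdt", b"\x00" * 16 + b"MZ" + b"\x00" * 2446),
    ("hide", b"MZ" + b"\x90" * 3326),
    ("harmless", b"\x01\x02\x03\x04"),
]
total, hits = scan_binary_values(values)
print(f"Number of values w/ binary data types: {total}")
for name, size in hits:
    print(f"Value: {name}  Length: {size} bytes")
```

A stricter version would require the data to start with "MZ" and sanity-check the PE header, but even this loose check cuts thousands of binary values down to a handful worth eyeballing.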

Speaking of neat tools (no, this is NOT a reference to Cory Althiede), I received this comment about ripXP over in the forums this week:

This is a very slick tool, and, aside from the fact that it's free, should be a candidate for "tool of the year" if there's such a thing!

Thanks! I don't know if there is such an award, but I'm glad that someone has found the tool to be useful!

Addendum, 8/10: Symantec posted a write-up on the malware (Backdoor.Regdor) that the plugin was originally written to address. Something interesting about the Symantec write-up is the statement that the malware takes control of the system when the replaced version of mspmsnsv.dll is loaded by svchost.exe. Hey, wait a second...on my XP systems, mspmsnsv.dll is a file that's protected by WFP...shouldn't the write-up be mentioning something about WFP being subverted? Yeah, yeah, I know that malware is doing this more and more, but this is still something that needs to be said.