
Tuesday, May 29, 2007

XP Firewall

Pp 216 - 128 of my book address the Windows XP firewall logs: where the file(s) are located on a system, and how they can be useful to an investigation. I even include a sample firewall log on the DVD, generated after I enabled all logging and scanned my system with nmap from another system. I wanted folks to see what this kind of thing looks like, and I hope that you've found it beneficial.

Has anyone seen the "Bonus" directory on the DVD yet? Within the Bonus directory is a Perl script (and an associated EXE file...be sure to follow the instructions and keep the appropriate DLL with the EXE if you copy it off of the DVD) called "fw.pl" that uses WMI to get configuration information about the Windows XP firewall, and the SecurityCenter, in general.

Using either the Perl script or the EXE, type "-?" or "/h" at the command prompt to see the syntax information. Simply typing "fw.pl" or "fw" (for the EXE) tells the tool to collect and display all information. The tool displays basic information about the firewall, authorized applications, service/port information, SecurityCenter information, etc., all from a live system.
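
To illustrate the general approach (this is not the fw.pl from the DVD, just a minimal sketch along the same lines...the HNetCfg.FwMgr COM object and the root\SecurityCenter class and property names are assumptions on my part, so verify them against your own systems), something like the following should pull the basics from a live XP SP2 box:

#! c:\perl\bin\perl.exe
# Minimal sketch only -- not the fw.pl from the DVD. Queries the firewall
# state of the current profile via the HNetCfg.FwMgr COM object, then lists
# any firewall products registered in the root\SecurityCenter WMI namespace.
# Intended to be run locally on a live XP SP2 system with Administrator rights.
use strict;
use Win32::OLE qw(in);

my $fwmgr = Win32::OLE->new("HNetCfg.FwMgr")
  || die "Could not create HNetCfg.FwMgr object: ".Win32::OLE->LastError()."\n";
my $profile = $fwmgr->LocalPolicy->CurrentProfile;
print "Firewall enabled       : ".($profile->FirewallEnabled ? "Yes" : "No")."\n";
print "Exceptions not allowed : ".($profile->ExceptionsNotAllowed ? "Yes" : "No")."\n";

# Authorized applications (the exceptions list)
foreach my $app (in $profile->AuthorizedApplications) {
  printf "  %-30s %s\n", $app->Name, $app->ProcessImageFileName;
}

# SecurityCenter information (class/property names assumed from the XP SP2 WMI schema)
my $wmi = Win32::OLE->GetObject("winmgmts:\\\\.\\root\\SecurityCenter")
  || die "Could not connect to the SecurityCenter namespace: ".Win32::OLE->LastError()."\n";
foreach my $fw (in $wmi->InstancesOf("FirewallProduct")) {
  print "Firewall product: ".$fw->{displayName}."\n";
}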

Porting this over to extracting the same information from an imaged system shouldn't be too difficult.

Note: The fw.exe file that you see in the Bonus directory was "compiled" from the Perl script using Perl2Exe. When I compiled the EXE, I used the "-small" switch so that the Perl runtime DLL would be pulled out as a separate file. However, other Perl modules are used as well, so I also compiled a version using the "-tiny" switch. This setting creates a separate DLL for each Perl module used, rather than pulling them out of the EXE at runtime and creating temporary files on the local hard drive. That version is in the "fw.zip" file...using the "-tiny" switch means that it's suitable for use in live response, particularly with the Forensic Server Project.

Saturday, May 26, 2007

XP Anti-Forensics

There is discussion now and again in computer forensic circles regarding "anti-forensics", techniques that are used on systems to remove or obfuscate the artifacts that an examiner may look for and analyze. These are usually discussed in the context of purposeful actions performed by a user or attacker, but not so much in the sense of the "under the hood", "behind the scenes" activity that goes on as part of the normal, day-to-day operation of the operating system and can serve the same function.

What is it that happens behind the scenes on a live XP system that most of us don't really know about? Have you ever fired up FileMon or RegMon and just watched what they recorded on a live system, without you interacting with the system at all? Now...move the mouse pointer across the screen...

While XP is a treasure trove of artifacts for an examiner, there are also things that the examiner needs to keep in mind when performing artifact extraction and analysis, particularly when it comes to looking for deleted files. When a file is deleted on a Windows system (not moved to the Recycle Bin, but really deleted), it's common knowledge that it's not really gone. In a nutshell, the sectors that the file occupies are still on the hard drive, although they are now available for use by the operating system. And there's a lot that XP does to use those available sectors.

Many applications, such as MS Word, like to create temporary files while you're editing a document, and then delete those when you close the document. The sectors used by those temporary files need to come from somewhere. Yes, this is an application-specific issue and applies to any version of Windows that the application is running on.

XP creates System Restore Points every 24 hours by default, but also during various other actions, such as software installation or removal. These Restore Points contain files that consume sectors. See my book for more information on Restore Point analysis.

Every three days, the XP Prefetch function performs a limited defragmentation of files on the hard drive. While this is limited, it still moves the contents of some sectors, overwriting others.

Speaking of XP Prefetch, when a user (any user) on the system launches a "new" application, a Prefetch file may be created for that application (assuming the 128 Prefetch file limit hasn't been reached). On my system, I have 104 .pf files, ranging in size from 8K to over 100K. Again, sectors are consumed.

As discussed in the Registry Analysis chapter of my book, there are a number of places within the Registry where a user's actions are recorded in some manner. New entries are added to the Registry, increasing the size of the files...not just the user's NTUSER.DAT file, but some actions will add entries to the HKLM hives, as well.

Of course, there are also a number of Registry settings that will have an effect on the examiner's analysis; these are addressed in detail in my book. While these aren't specific to XP, they do have a decidedly anti-forensic effect.

I mention these things because many times an examiner may be looking for evidence of a deleted file, carving unallocated space looking for a keyword or a file header, and come up empty. Remember Harlan's Corollary? ;-) Funny how there just seem to be more and more ways to apply that corollary...

Sites discussing anti-forensics aren't hard to find:
The MetaSploit AntiForensics Site
Ed Skoudis is quoted on antiforensics in 2003
Marcus K. Rogers' presentation
Ryan Harris' DFRWS paper

Friday, May 25, 2007

Prefetch Analysis

I've seen a couple of posts recently on other blogs (here's one from Mark McKinnon) pertaining to the Windows XP Prefetch capability, and I thought I'd throw out some interesting stuff on analysis that I've done with regards to the Prefetch folder.

First off, XP's Prefetch capability is meant to enhance the user eXPerience by helping frequently used applications load faster. Microsoft has a nice writeup on that, and portions of and references to that writeup are included in my book. XP has application prefetching turned on by default, and while Windows 2003 has the capability, only boot prefetching is turned on by default. So, XP systems are rich in data that can help you assess and resolve an incident investigation.

To get a bit more specific, XP can maintain up to 128 Prefetch files...these are files within the Windows\Prefetch directory that end in ".pf". These files contain a bunch of prefetched code, and the second half of each file generally contains a bunch of Unicode strings that point to the various modules that were accessed when the application was launched. Also, each Prefetch file contains the run count (number of times the application has been run) as well as a FILETIME object representing the last time the application was launched, within the file itself (ie, metadata).
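
Just to show how straightforward this metadata is to get at, here's a minimal sketch that pulls both values out of a .pf file. The offsets used (0x78 for the last run FILETIME, 0x90 for the run count) are the commonly documented values for XP/2003-format Prefetch files...treat them as assumptions and verify them against your own data:

#! c:\perl\bin\perl.exe
# Minimal sketch: extract the last run time and the run count from an XP
# Prefetch (.pf) file. Offsets (0x78 = last run FILETIME, 0x90 = run count)
# are the commonly documented values for XP/2003-format .pf files.
use strict;

my $file = shift || die "Usage: pf.pl <path to .pf file>\n";
open(FH, "<", $file) || die "Could not open $file: $!\n";
binmode(FH);
my $data;

# Last run time: 64-bit FILETIME (100-ns intervals since 1 Jan 1601, UTC)
seek(FH, 0x78, 0);
read(FH, $data, 8);
my ($lo, $hi) = unpack("VV", $data);
my $runtime = ($hi * 4294967296 + $lo) / 10000000 - 11644473600;   # convert to Unix epoch

# Run count: 32-bit little-endian value
seek(FH, 0x90, 0);
read(FH, $data, 4);
my $count = unpack("V", $data);
close(FH);

print "Last run : ".gmtime($runtime)." UTC\n";
print "Run count: $count\n";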

Okay, so how can this information be used during forensic analysis? Remember Harlan's Corollary to the First Law of Computer Forensics? If you acquire an image from a system...say, a user's laptop...and you're told that the user had this laptop for a year or so, and you don't find any .pf files...what does that tell you?

Mark talked about U3 Smart Technology, and some of the Prefetch artifacts left behind by the use of tools like this. Excellent observations, but keep in mind that the Prefetch files aren't specific to a user...they're system-wide. On a multi-user system, you may have to look other places to determine which user launched the application in the first place. Ovie does a great job talking about the UserAssist keys and how they can help you narrow down who did what on the system.

I've looked to the Prefetch folder for assistance with an investigation. In one instance, there was a suspicion that a user had deleted some files and removed software from the system, and attempted to cover his tracks. While it was clear that the user had done some of these things (ie, removed software, emptied the Recycle Bin, etc.), it was also clear that he hadn't gone through the trouble of running one of those tools that delete everything; most of the artifacts I would look for were still in place (can you guess from my book what those artifacts might have been?). I found a reference to defrag.exe in the Prefetch folder, but nothing to indicate that the user had run the defrag tool (XP's built-in, automatic anti-forensics capabilities are a subject for another post). It turns out that as part of the Prefetch capability, XP runs a limited defrag every 3 days...the Prefetch capability prefetches XP's own prefetch functionality. ;-)

In another instance, I wanted to see if a user had burned anything to CD from the system. I found the installed software (Roxio Sonic), but found no references in any of the user artifacts to actually launching the software. I did, however, find an IMAPI.EXE-XXXXXX.pf file in the Prefetch directory. Interestingly enough, the Unicode strings within the file included a reference to iTunes, which, it appeared, the user used a lot. It turns out that iTunes likes to know where your CD or DVD burner is...I confirmed this on another system on which I knew the user used iTunes, and had not burned any CDs.

So, as a wrap up, some things to look for when you're digging into the Prefetch directory:

- How many .pf files (between 0 and 128) are in the Prefetch directory?

- For each .pf file, get the last run time and the run count. The last run time is a FILETIME object, meaning that it is maintained in UTC format...you may need to adjust it using information from the TimeZoneInformation Registry key (ie, ActiveTimeBias); see the small conversion sketch following this list.

- Correlate .pf files and the last run times to UserAssist key entries to tie activity to a specific user, as well as the Event Logs.

- Run strings to get the Unicode strings from the file and see what other modules were accessed when the application was launched.
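
Here's the conversion sketch mentioned above...just the arithmetic for adjusting a UTC value to local time using ActiveTimeBias, which is stored in minutes (the example values below are made up):

#! c:\perl\bin\perl.exe
# Sketch: adjust a UTC time (such as a .pf last run time, already converted
# to a Unix epoch value) to local time using ActiveTimeBias from the
# TimeZoneInformation key. UTC = local + bias, so local = UTC - (bias * 60).
use strict;

my $utc_epoch = 1179766800;   # example UTC epoch value (assumed)
my $bias_min  = 240;          # example ActiveTimeBias: UTC-4 (assumed)

my $local_epoch = $utc_epoch - ($bias_min * 60);
print "UTC  : ".gmtime($utc_epoch)."\n";
print "Local: ".gmtime($local_epoch)."\n";   # gmtime() here because the offset was applied by hand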

Finally, there is a ProDiscover ProScript on the DVD that ships with my book (in the ch5 directory) that will locate the Prefetch folder (via the Registry) and automatically parse the .pf files, listing the last run time and run count for each. I have since updated that ProScript to display its output in time-sorted order, showing the most recent time first. I've found that this makes analysis a bit easier.

Saturday, May 19, 2007

New versions of tools released

I ran across a blog post this morning saying that new versions of pwdump6 and fgdump have been released.

So what does this have to do with forensic analysis? Well, like most folks, I've seen compromises that start with a downloader getting onto the system; the attacker is then able to gain System-level access and use something like wget to download their tools. I've seen not only the pwdump password dumping tool on systems, but also the output file from the command sitting on the system...in some cases, in a public web directory with a corresponding query for that page in the web logs.

For those of you who use hash comparison tools, grab these puppies, hash 'em and store the hashes! If you don't do hash comparisons, or don't use this technique to a great extent, you should still be aware of the tools.
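
If you want a quick way to do that, a minimal sketch along these lines (using the Digest::MD5 and Digest::SHA1 Perl modules) will hash whatever files you point it at so the values can be added to your hash set:

#! c:\perl\bin\perl.exe
# Minimal sketch: compute MD5 and SHA-1 hashes of files (e.g., freshly
# downloaded copies of the tools) so they can be added to a hash set.
use strict;
use Digest::MD5;
use Digest::SHA1;

foreach my $file (@ARGV) {
    open(my $fh, "<", $file) || do { warn "Could not open $file: $!\n"; next; };
    binmode($fh);
    my $md5 = Digest::MD5->new;
    $md5->addfile($fh);
    seek($fh, 0, 0);          # rewind before the second pass
    my $sha1 = Digest::SHA1->new;
    $sha1->addfile($fh);
    close($fh);
    printf "%s\n  MD5 : %s\n  SHA1: %s\n", $file, $md5->hexdigest, $sha1->hexdigest;
}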

Litchfield on Oracle Live Response

Thanks to Richard Bejtlich, I learned this morning that David Litchfield, famed security researcher with NGSSoftware, has released a paper entitled Oracle Forensics Part 4: Live Response. In that paper, David starts off by discussing live response in general, which I found to be very interesting, as he addresses some of the questions that we all face when performing live response, particularly those regarding trust and assurance...trusting the operating system, trusting what the tools are telling us, etc.

David's paper highlights some of the aspects of live response that every responder needs to be aware of...in particular, when the first responder arrives on-scene and wants to collect volatile data, she will usually start by assessing the situation, and then when she's ready to collect that volatile data, insert a CD full of tools into the CD-ROM drive. From David's paper:

When they insert the CD and run one of the tools, due to the way Windows launches new processes, the tool will have key system dynamic link libraries in its address space, i.e. the memory the tool uses.

Great point...but keep in mind that at this point in time, during live response, there really isn't any way to avoid this situation. It happens, and it has to happen. The key to live response isn't how to keep it from happening...rather, it's to have a thoroughly documented process that lets you address the situation head on.

One of the main concerns about live response is often: if we do live response and have to take the information to court, how do we prove that our investigation did not modify the "scene" in any way, and that everything is pristine? The fact is...we can't. Nor should we try. Instead, we need to have a thorough, documented process, and be able to show that while our actions did modify the "scene" (via the application of Locard's Exchange Principle or Heisenberg's Uncertainty Principle...), just as an EMT's actions will modify a real-world crime scene, what matters is the totality of the information or evidence that we're able to collect and examine.

So, in a nutshell, while it is possible that the tools we loaded and ran on the system to collect volatile data were themselves compromised by a patched version of ntdll.dll in memory, what does the totality of the information tell us?

One thing I would suggest is that when you're reading David's excellent paper, and you get to the General Steps of Live Response section, refer back to the Order of Volatility. Dave is correct in that the application-specific information (about Oracle in this case) should be collected last, but IMHO, the first thing that should be collected, as soon as possible, is a complete snapshot of physical memory (check out the sample chapter for my book, Windows Forensic Analysis). The reason I would suggest collecting the contents of physical memory first has to do with David's description of process creation...when a process is created, an EPROCESS block and all of the other structures necessary (at least one ETHREAD block) are created, consuming memory. This means that as processes are created, the pages used by other processes may be swapped out to the pagefile. Knowing this, we should collect as much of the contents of RAM as possible before moving on and collecting specific items, such as running processes, or the memory contents (RAM + pagefile) of those processes, etc.

Okay, enough about live response for now...this is a topic that deserves its own space.

I found David's paper to be particularly interesting, as some of the work I've been involved with (and likely will continue to be involved with) has had to do with databases: was the database compromised, and if so, was sensitive information extracted from it? I'm not a database guy (ie, a DBA), but I do need to know some things about databases; per David's suggestion, it's often best for an incident responder to work shoulder-to-shoulder with an experienced DBA, bringing the forensics mindset (and requirements) to the table.

If you're interested in database security in general, check out David's databasesecurity.com site for more information and books related to database security. For additional information about other database topics, I picked up a link at Andrew Hay's blog, pointing to the Comprehensive SQL Injection Cheat Sheet (well, the cheat sheet is actually here). This resource is invaluable to anyone performing forensic analysis of a potentially compromised system, particularly if it either has a web server installed, or acted as a database back-end for a web-based system. Hint: any reference in web server logs to SQL stored procedures is worth looking at!
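
As a quick illustration of that hint, a minimal sketch like the following will flag log entries that reference SQL stored procedures...the procedure names below are just examples I picked, not an exhaustive list:

#! c:\perl\bin\perl.exe
# Minimal sketch: flag web server log entries that reference SQL stored
# procedures, often a sign of SQL injection attempts against a back-end
# database. The patterns are examples only; extend the list as needed.
# Usage: perl sqlscan.pl <one or more log files>
use strict;

my @patterns = qw/xp_cmdshell sp_makewebtask sp_executesql xp_regread/;
my $re = join("|", @patterns);

while (my $line = <>) {
    print $line if ($line =~ m/$re/i);
}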

Sunday, May 13, 2007

Forensic Visualization

A while ago, I ran across an interesting 3D visualization project called fe3d. I remember thinking at the time that this would have been cool to have when I was performing vulnerability assessments. Something like this would have made analysis a bit easier...going through ASCII logs can be a pain...but it would have also been a plus in our deliverables, allowing us to present the data to the customer in a visually appealing way. I'd used the old version of cheops before, as well.

I was reading Andrew Hay's blog this morning and came across an interesting post from O'Reilly SysAdmin that has to do with log file visualization. This looks very interesting. I haven't dug into the code itself yet, but I have to ask...has anyone used this for log file analysis during incident response?

Some thoughts that I had:

1. Using Marcus Ranum's artificial ignorance, read in the IIS web server logs from a case, and compare the entries to the actual pages on the web server (yes, I understand that this would take a couple of phases). If a request is made for a page that exists on the web server, set the color of the dot to green. If the request is made for a page that doesn't exist on the web server (as with a scan), set the color to red. (See the sketch following this list.)

2. Modify the code to use Event Logs, and tag certain events or records from each log with a particular color based on the event. Say, records from the Security Event Log get a particular color, or successful logins get one color and failed login attempts get another color.
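
For item 1, a minimal sketch of the log-to-webroot comparison might look something like the following (it assumes a W3C-format IIS log and a copy of the web root that you can point it at; a visualization tool would then map the GREEN/RED tags to dot colors):

#! c:\perl\bin\perl.exe
# Minimal sketch of the "artificial ignorance" idea: read a W3C-format IIS
# log and, for each request, check whether the requested page exists beneath
# a copy of the web root (e.g., from a mounted image). Requests for pages
# that don't exist (scans, probes) are tagged RED.
use strict;

my ($logfile, $webroot) = @ARGV;
die "Usage: ai.pl <IIS log> <path to web root>\n" unless ($logfile && $webroot);

my $uri_idx;
open(LOG, "<", $logfile) || die "Could not open $logfile: $!\n";
while (my $line = <LOG>) {
    chomp($line);
    if ($line =~ /^#Fields:\s+(.*)/) {
        # Locate the cs-uri-stem column from the #Fields directive
        my @fields = split(/\s+/, $1);
        for my $i (0..$#fields) {
            $uri_idx = $i if ($fields[$i] eq "cs-uri-stem");
        }
        next;
    }
    next if ($line =~ /^#/ || !defined($uri_idx));
    my @vals = split(/\s+/, $line);
    my $stem = $vals[$uri_idx];
    # forward slashes in the URI stem work fine with -e on Windows
    my $tag  = (-e $webroot.$stem) ? "GREEN" : "RED";
    print "$tag\t$stem\n";
}
close(LOG);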

I can see how something like this would be very helpful in visualization of data content, as well as presentation and reporting of the data that is found. I'm thinking more along the lines of reports to customers, but I'm sure that there are others out there who are thinking, "would something like this be useful in presenting data to a prosecutor, or to a jury??"

Saturday, May 12, 2007

Forensic Laws

I mentioned a concept or idea in my book, but I wanted to follow up on it a bit...something I believe to be a theorem. Okay, maybe not a theorem (there's no math involved), so how about a law. Let's call it the First Law of Computer Forensics. Yeah, yeah...that's the ticket! Kind of like "Murphy's Law".

With that being said...here goes:

There is evidence of every action.

Just to be above board on this, credit (or blame, you decide) goes to Jesse Kornblum. One thing to keep in mind about this law is that the evidence is there...it simply may not exist on the media that you're currently examining. For example, one question that I've seen in the lists is, how do you tell from an acquired image of a system if files were copied from it to, say, a thumb drive? Well, you may find the existence of the file on the system, and you will find that the thumb drive was plugged into the system (to see how to determine this on Windows systems, grab a copy of my book), but how do you determine if the file was copied to the thumb drive, if all you have is the image of the system? The fact is...you can't. You need the thumb drive. Even though the evidence you're looking for isn't on the image, it is on the thumb drive.

Now, here's Harlan's Corollary to the First Law of Computer Forensics:

Once you understand what actions or conditions create or modify an artifact, then the absence of that artifact is itself an artifact.

What this is saying is that not only is there evidence of every action, but the lack of that evidence is itself evidence.

Thoughts?

Addendum, 13 May: I wanted to point out that the example I gave of the laptop and the thumb drive is just that...an example. If you're starting to think that I'm making an absolute, definitive statement about the existence of an artifact on the thumb drive, please re-read the statement, and think of it only as an example. Thanks.

Friday, May 11, 2007

PPT Metadata

I received an email recently asking if I had any tools to extract metadata from PowerPoint presentations. Chapter 5 of my book includes the oledmp.pl Perl script, which grabs OLE information from Office files; this includes Word documents, Excel spreadsheets, and PowerPoint presentations. I've run some tests using this script, and pulled out things like revision number, created and last saved dates, author name, etc.

Pretty interesting stuff. There may be more...maybe based on interest and time, someone can look into this...

Here's an example of the oledmp.pl output from a PPT file (some of the info is masked to protect privacy):

C:\perl>oledmp.pl file.ppt
ListStreams
Stream : ♣DocumentSummaryInformation
Stream : Current User
Stream : ♣SummaryInformation
Stream : Pictures
Stream : PowerPoint Document

Trash Bin Size
BigBlocks 0
SystemSpace 876
SmallBlocks 0
FileEndSpace 1558

Summary Information
subject
lastauth Mary
lastprinted
appname Microsoft PowerPoint
created 09.06.2002, 19:51:48
lastsaved 14.09.2004, 19:08:39
revnum 32
Title Title
authress John Doe

Pictures
Current User
♣SummaryInformation
PowerPoint Document
♣DocumentSummaryInformation

So what does all this mean? Well, we see the various streams that are embedded in the document, and an example of what is extracted from the SummaryInformation stream. Some of this information can be seen by right-clicking on the file in Windows Explorer, choosing Properties, selecting the Summary tab, and clicking the Advanced button.

Simple modifications to the oledmp.pl script will let you extract the stream tables, as well, showing even more available information.
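
If you want to poke at the stream structure yourself, here's a minimal sketch (not oledmp.pl itself) that walks the OLE tree of an Office file using the OLE::Storage_Lite module; the field and helper names are taken from that module's interface as I understand it, so check them against the version you have installed:

#! c:\perl\bin\perl.exe
# Minimal sketch: walk the OLE/structured storage tree of an Office file and
# list the storages and streams it contains, using OLE::Storage_Lite.
use strict;
use OLE::Storage_Lite;

my $file = shift || die "Usage: olels.pl <Office file>\n";
my $ole  = OLE::Storage_Lite->new($file);
my $root = $ole->getPpsTree() || die "$file does not appear to be an OLE file\n";
walk($root, 0);

sub walk {
    my ($pps, $depth) = @_;
    my %type = (1 => "Storage", 2 => "Stream", 5 => "Root");
    # PPS names are stored in UCS-2; convert for display
    my $name = OLE::Storage_Lite::Ucs2Asc($pps->{Name});
    printf "%s%-8s %s\n", "  " x $depth, $type{$pps->{Type}}, $name;
    foreach my $child (@{$pps->{Child} || []}) {
        walk($child, $depth + 1);
    }
}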

Tuesday, May 08, 2007

Event Logs in Unallocated Space

I received an email from a friend recently, asking about finding an Event Log in unallocated (ne "free") space. He mentioned that he'd found it using a hex editor and copied it out of the image to a separate file, but still couldn't open it in the Event Viewer.

That got me thinking about the content of my book, and how that could be useful in a situation like this. On page 201 of Windows Forensic Analysis, table 5.3 lists the event record structure; that is, what an event record "looks like". With this information alone, event records can be retrieved from unallocated space; once you find the "magic number", back up 4 bytes and you've got the size of the event record. From there, you can copy out the entire event record and the rest of the information within the record can be easily parsed from unallocated space, or even from the pagefile or a RAM dump.
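
A minimal sketch of that approach might look like the following...it scans a blob of data for the magic number, backs up 4 bytes for the record length, and unpacks a few of the header fields (the layout follows the EVENTLOGRECORD structure):

#! c:\perl\bin\perl.exe
# Minimal sketch: scan a blob of data (unallocated space, pagefile, RAM dump)
# for the event record magic number ("LfLe"), back up 4 bytes to get the
# record length, and unpack a few fields that follow the magic number.
use strict;

my $file = shift || die "Usage: evtcarve.pl <file>\n";
open(FH, "<", $file) || die "Could not open $file: $!\n";
binmode(FH);
my $data;
{ local $/; $data = <FH>; }    # slurp; fine for demo-sized files
close(FH);

my $pos = 0;
while (($pos = index($data, "LfLe", $pos)) > -1) {
    if ($pos >= 4 && $pos + 20 <= length($data)) {
        my $len = unpack("V", substr($data, $pos - 4, 4));
        # record number, time generated, time written and event ID follow the magic number
        my ($rec, $tgen, $twri, $id) = unpack("V4", substr($data, $pos + 4, 16));
        printf "Offset 0x%08x  len %-6d  rec %-6d  ID %-6d  %s UTC\n",
            $pos - 4, $len, $rec, $id & 0xFFFF, scalar gmtime($tgen);
    }
    $pos += 4;
}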

A post from another forum got me thinking that the same is true for Registry keys, as well. Figure 4.3 illustrates a hex view of what a Registry key and a Registry value "look like" on disk. Using this information, as well as the code listed on pgs. 133 and 134, Registry keys and values can be extracted and reconstructed from unallocated space, the pagefile, or even a RAM dump.

The great thing is that event records and Registry keys have time stamps associated with them (Registry values do not). This also illustrates what can be retrieved from these other areas through data carving...after all, event records and Registry structures have "magic numbers", similar to file headers, and their data can be carved out just as easily.
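
The same basic approach works for Registry key cells. Here's a minimal sketch that looks for the "nk" signature and pulls out the LastWrite time and key name; the offsets are the commonly documented ones for key cells, and the sanity checks are crude...a real tool would do far more validation:

#! c:\perl\bin\perl.exe
# Minimal sketch: carve Registry key (nk) cells out of a blob of data by
# searching for the "nk" signature, then reading the LastWrite FILETIME and
# key name from the fields that follow. The 4 bytes preceding the signature
# hold the cell size (negative for an in-use cell).
use strict;

my $file = shift || die "Usage: nkcarve.pl <file>\n";
open(FH, "<", $file) || die "Could not open $file: $!\n";
binmode(FH);
my $data;
{ local $/; $data = <FH>; }
close(FH);

my $pos = 0;
while (($pos = index($data, "nk", $pos)) > -1) {
    if ($pos >= 4 && $pos + 0x4C < length($data)) {
        my $size = unpack("l", substr($data, $pos - 4, 4));
        my ($lo, $hi) = unpack("VV", substr($data, $pos + 0x04, 8));
        my $lastwrite = ($hi * 4294967296 + $lo) / 10000000 - 11644473600;
        my $namelen = unpack("v", substr($data, $pos + 0x48, 2));
        # crude sanity checks to weed out false hits on the two-byte signature
        if ($size < 0 && $namelen > 0 && $namelen < 256 && $lastwrite > 0) {
            my $name = substr($data, $pos + 0x4C, $namelen);
            printf("0x%08x  %s UTC  %s\n", $pos - 4, scalar gmtime($lastwrite), $name)
                if ($name =~ /^[\x20-\x7e]+$/);
        }
    }
    $pos += 2;
}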

Sunday, May 06, 2007

SOLD OUT!

I went by the Syngress site for my book today and saw a message that said, in part:

Sorry! This item is currently out of stock at syngress.com. You may want to check availability at the resellers listed on the item's catalog page.

Cool! Many thanks to everyone who has purchased a copy of the book, and to those who are going to...

Addendum, 8 May: I called a bunch of people at Syngress yesterday, leaving messages all over. I was somewhat concerned that the original page had been completely replaced, so no one could put the book on back-order, or view the Table of Contents or even the Sample Chapter. This morning, the order page is back up, sans the picture of the book cover.

Addendum, 9 May: Okay, false alarm, folks! I finally got through to someone at the publisher, and it turns out that while the books are running low due to the volume of orders, what really happened is that the web page fell victim to a developer!

Interviews

For anyone who is curious about my book, Windows Forensic Analysis (ToC and sample chapter available), I've had an opportunity to speak to some folks and answer some questions recently:

Andrew Hay's Q&A
29 Apr CyberSpeak Podcast with Ovie and Brett
ForensicFocus

I'm really looking forward to Andrew's review of my book.

Tuesday, May 01, 2007

Something Else To Look For...

Not long ago, Didier Stevens blogged about Windows Safe Mode and some Registry keys/values that pertain to Safe Mode. He filed this blog entry under "hacking". One of the cool things about computer forensics is that it's the flip side of hacking...discovering artifacts or "footprints" to find what kind of things happened on a system when it was "hacked".

Didier points out in his blog post how easy it is to write your own service that launches from Safe Mode. As more and more malware authors seem to be choosing a Windows service over the ubiquitous Run key in order to maintain the persistence of their malware on a system, it simply makes sense that a check should be made of the SafeBoot (Windows 2000, XP) key, as well.
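
On a live system, a quick check is easy to script. Here's a minimal sketch using the Win32::TieRegistry module to list what's authorized to start under each Safe Mode profile; diff the output against a known-good system:

#! c:\perl\bin\perl.exe
# Minimal sketch: list the entries beneath the SafeBoot\Minimal and
# SafeBoot\Network keys on a live system, so additions (such as a malicious
# service authorized to start in Safe Mode) can be spotted or diffed
# against a known-good baseline.
use strict;
use Win32::TieRegistry(Delimiter => "/");

foreach my $mode (qw/Minimal Network/) {
    my $key = $Registry->{"LMachine/SYSTEM/CurrentControlSet/Control/SafeBoot/$mode/"}
        || do { warn "Could not open the SafeBoot\\$mode key\n"; next; };
    print "SafeBoot\\$mode\n";
    foreach my $entry (keys %{$key}) {
        print "  $entry\n";
    }
    print "\n";
}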

Is this really such an issue, something you should be concerned about when performing IR or conducting an investigation? Let me add some perspective...not long ago, I examined a worm that had infected several systems, and it created an entry for itself in the RunOnce key; the entry was prepended with a "*". Does anyone get the significance of that?