Thursday, February 15, 2007

(In)Secure Magazine Mention

I received an email from Didier Stevens this morning, letting me know that he'd mentioned me in an article that he wrote for (In)Secure Magazine (issue 1.10, Feb 2007). His article, "ROT-13 is used in Windows? You're joking!" starts on pg 72 of the PDF, and runs through pg 77, where he mentions my ProScript.

Didier's article is on the use of ROT-13 to "encrypt" information that Windows uses to keep track of most frequently used programs (MFUPs). These MFUPs are tracked in order to populate the new Start menu, in both the pinned list (left side) as well as the most frequently used programs list (at the bottom)...see Didier's article for the full explanation.

The ProScript that Didier mentions is a Perl script (go figure, right??) that works with Technology Pathways' ProDiscover forensic analysis product, and parses the NTUSER.DAT files for all of the users on the system, extracting and "decrypting" the UserAssist entries and sorting them in order based on the timestamps that Didier mentions in his article. The ProScript is run against an image that is open in ProDiscover.

I also use a Perl script that parses the raw NTUSER.DAT files, and collects the same information...an excerpt of the output appears below:

G:\book2\DVD\ch4\code>pnu.pl d:\cases\ntuser.dat
LastWrite time = Mon Sep 26 23:33:06 2005 (UTC)
Mon Sep 26 23:33:06 2005 (UTC)
UEME_RUNPATH
UEME_RUNPATH:C:\WINDOWS\system32\notepad.exe
Mon Sep 26 23:26:43 2005 (UTC)
UEME_RUNPATH:Z:\WINNT\system32\sol.exe
Mon Sep 26 23:22:30 2005 (UTC)
UEME_UISCUT
UEME_RUNPATH:Downloads.lnk
Mon Sep 26 23:16:26 2005 (UTC)
UEME_RUNPATH:C:\Program Files\Morpheus\Morpheus.exe
Mon Sep 26 23:16:25 2005 (UTC)
UEME_RUNPATH:Morpheus.lnk
Mon Sep 26 23:15:04 2005 (UTC)
UEME_RUNPATH:C:\Program Files\Internet Explorer\iexplore.exe
Mon Sep 26 23:04:08 2005 (UTC)
UEME_RUNPATH:d:\bintext.exe

The UserAssist keys record user activities that are performed via the shell (ie, Windows Explorer). The UserAssist key actually has two subkeys, both of which are GUIDs or CLSIDs...one for the Active Desktop, and the other for the Internet Toolbar. If you've installed IE 7.0, you will see a third subkey.
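
If you want to see for yourself just how thin this "encryption" is, here's a minimal sketch (an illustration only, not the ProScript itself...it assumes the value name has already been pulled from the key) that "decrypts" a UserAssist value name in Perl:

#!/usr/bin/perl
# rot13.pl - "decrypt" a ROT-13'd UserAssist value name
# (illustration only; the ProScript and pnu.pl do considerably more)
use strict;

my $name = shift || 'HRZR_EHACNGU:P:\JVAQBJF\flfgrz32\abgrcnq.rkr';

(my $decoded = $name) =~ tr/a-zA-Z/n-za-mN-ZA-M/;   # ROT-13 is its own inverse
print "$decoded\n";                                 # UEME_RUNPATH:C:\WINDOWS\system32\notepad.exe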

Both the ProScript and the Perl script mentioned here will be available on the DVD that accompanies my next book, Windows Forensic Analysis, due out later this spring from Syngress/Elsevier.

Didier's got some other very interesting posts on his blog, so be sure to check it out when you get a chance.

Wednesday, February 14, 2007

BlackHat DC

Who's going to BlackHat DC? I found out today that my publisher has a pass available for me, and I'm looking forward to the opportunity to attend (hopefully, at least the first day of the briefings). It's been a while since I presented at BlackHat (2002, 2004), and the Forensic track looks like a good reason to go back!

Friday, February 09, 2007

Why I like Perl for Forensic Analysis

Many times when I post to a list or give a presentation and talk about using Perl, I almost immediately hear comments back such as, "I can't do that because I don't know Perl".

Well, I have to say, it's not about knowing Perl, or how to program, really. Most of the stuff I'm doing with Perl is translating binary stuff into something readable (timestamps, checking flag values), and just doing repetitive stuff. For example, in Windows Memory Analysis, I just use Perl to implement various processes that are repetitive and boring, and therefore ones where I would be prone to making mistakes. The same is true with tools like SAMParse and SECParse that I use to parse the contents of raw Registry files. The information is there for anyone to see, extract, and interpret. I simply use Perl to do it automatically. In some cases, I write the scripts I do because there is no commercial application that comes anywhere close to providing the information I want or need.
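
To give you an idea of what I mean by "translating binary stuff", here's a minimal sketch (it assumes the 8 bytes of a 64-bit FILETIME object have already been read into a scalar) that turns that value into something readable:

#!/usr/bin/perl
# filetime.pl - translate a 64-bit FILETIME object into something readable
# (sketch only; assumes $ft holds the raw 8 bytes, little-endian)
use strict;

sub getTime {
    my ($lo, $hi) = unpack("VV", shift);       # two 32-bit little-endian DWORDs
    my $t = $hi * 4294967296 + $lo;            # 100-nanosecond intervals since Jan 1, 1601
    return int($t / 10000000) - 11644473600;   # convert to seconds, shift to the Unix epoch
}

my $ft = "\x80\x51\x01\xc8\xb8\xc0\xc5\x01";   # example value; swap in your own 8 bytes
print scalar gmtime(getTime($ft)), " (UTC)\n";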

I have written tools to parse RAM dumps, parse Event Log .evt files (without using the Windows API), parse raw Registry files, etc. I wrote these tools because I needed them, and as it was very likely that I would need to do those things again, I automated the process in a Perl script.

I also hear things like, "I don't want to have to keep Perl up-to-date". I think in a lot of cases, comments like these come from folks who don't know a great deal about Perl...and you know...that's fine. It's okay, people. I'm not being a Perl snob or forensics snob or whatever snob, I simply use Perl to automate the repetitive tasks that make up my work day. Besides, I also use Perl2Exe, so all the tools on the DVD that comes with my next book (Windows Forensic Analysis, from Syngress/Elsevier) will be provided with the Perl "source" code, as well as a standalone EXE file that will allow you to run them on a Windows system without having to install Perl.

Perl is an interpreted language, so I don't have to "compile" my scripts to run them, and I can easily make modifications and changes to the scripts, updating or correcting them quickly. The scripts are somewhat self-documenting, although I do add my own comments to give myself a better view of what's going on, so I can figure out what I was doing a year or two later, rather than just look at it and think, "*I* wrote this??". Finally, Perl gives me a freedom and flexibility that is not available in commercial tools. For example, there are commercial tools that will allow you to view certain Registry data in a nice GUI, but you can only view that data, and only one entry at a time. With Perl, I can output multiple entries, as well as correlate data from multiple keys, such as when I want to show USB removable storage devices, any drive letters they were mapped to, and the last time each device was connected to the system (I should call this one "USBParse"). Don't like ASCII output in block format? In a few moments, I can have CSV-style output. Ta-da!
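
To give you an idea of what the first step of that kind of correlation looks like, here's a bare-bones sketch (assuming the Parse::Win32Registry module, and that ControlSet001 is the ControlSet you care about...the full "USBParse" idea, with drive letters and last-connection times, takes more than this) that simply lists USB removable storage devices from a raw System file:

#!/usr/bin/perl
# usbstor-sketch.pl <system_hive> - list USB removable storage devices from a raw System file
# (first-step sketch only; drive letter and last-connection correlation is left out)
use strict;
use Parse::Win32Registry;

my $reg  = Parse::Win32Registry->new(shift) or die "Could not open hive.\n";
my $root = $reg->get_root_key;

my $usbstor = $root->get_subkey("ControlSet001\\Enum\\USBSTOR")
    or die "USBSTOR key not found.\n";

foreach my $dev ($usbstor->get_list_of_subkeys) {        # device class IDs
    print $dev->get_name, "\n";
    foreach my $inst ($dev->get_list_of_subkeys) {       # unique instance IDs
        print "  ", $inst->get_name, "  [LastWrite: ",
              scalar gmtime($inst->get_timestamp), " UTC]\n";
    }
}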

So...some folks don't like to use Perl or open-source tools for forensic analysis, and that's okay. I use Perl because I know that if I have to perform some kind of analysis or correlation once, I'll probably have to do it again, and in order to avoid (as much as possible) forgetting a step or making a mistake, I'll automate the process.

Besides, I give most of this stuff away for free, as in beer.

Tuesday, January 30, 2007

Intentional erasure

An interesting question appeared on one of the listservs a bit ago..."what is an investigator's protocol for demonstrating intentional erasure of data, ostensibly done by the user to remove evidence from a system?" This is an interesting question and since it doesn't fit neatly into one of the FAQ sections at the end of a chapter in my next book, I thought I'd address that question here in the blog.

The first thing I would look at is the level of erasure that has occurred. One of the first places to check is the Recycle Bin. Many users delete files through the Explorer shell, and they end up in the Recycle Bin...from there, some users don't bother to empty the Recycle Bin. However, this does show an intentional attempt to remove data, based on the actions that are required to move the files to the Recycle Bin.

I have seen instances in which the user has deleted files (ie, sent to the Recycle Bin) and then emptied the Recycle Bin. In such cases, the last modification time on the INFO2 file in the Recycle Bin may give you an idea of when the Recycle Bin was emptied. Again, this may show intent.
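
A quick way to check that (assuming you're working against a mounted copy of the image, or that your tool preserves the file's MAC times when you export it...otherwise the time you see is the time of the export) is to simply look at the INFO2 file's last modification time:

#!/usr/bin/perl
# info2time.pl <path_to_INFO2> - print the last modification time of an INFO2 file
# (sketch; the time is only meaningful if the file system timestamps were preserved)
use strict;

my $file = shift or die "You must enter a filename.\n";
my @st = stat($file) or die "Could not stat $file.\n";
print "INFO2 last modified: ", scalar gmtime($st[9]), " (UTC)\n";   # mtime is element 9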

In some cases, many of the sectors for the files were then overwritten due to the limited defrag that occurs about every 3 days on a Windows XP system, making the deleted files unrecoverable.

I would also suggest checking the contents of the UserAssist keys, and on XP systems, the Prefetch folder, to see if there are any artifacts to indicate that an erasure tool of some kind was used. This may range from commercial tools to freely available VBS scripts.

One important thing to keep in mind when performing forensic analysis is that given some artifacts, we can expect to see other artifacts. For example, if we find that auditing of logons has been enabled, and we see user profiles with MAC times (on the NTUSER.DAT files, etc.) that indicate logons, then we can expect to see some information in the SAM file, as well as in the Security Event Log. By correlating these sources, we can develop information about our case. However, the absence of those artifacts that we know we should see (but don't) is itself an artifact.

Tuesday, January 23, 2007

Scripts for parsing the Registry

I was working on a script recently to expand the reach of the Registry Analysis material in my upcoming book, and I thought it would be a good idea to implement something that would parse the audit policy from a system. So I wrote poladt.pl, a Perl script that uses the Parse::Win32Registry module to extract the necessary value from the raw Security file and parse it, displaying the audit policy as shown below:

G:\perl>poladt.pl d:\cases\security
LastWrite: Fri Sep 9 01:11:43 2005 (UTC)
Auditing was enabled.
There are 9 audit categories.

Privilege Use..............None
Object Access..............None
Account Logon Events.......Both
System Events..............Both
Policy Change..............Both
Logon Events...............Both
Account Management.........Both
Directory Service Access...None
Process Tracking...........None

Note: I added the little dots so that the output would line up better and be easier to read; formatting in the blog is a little beyond my current skillset.
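
If you're curious about how the guts of something like this work, the sketch below shows the basic idea (it's not the script from the DVD...it assumes the Parse::Win32Registry module, and that the audit policy lives in the default value of Policy\PolAdtEv; interpreting the individual categories is left out):

#!/usr/bin/perl
# poladt-sketch.pl <security_hive> - pull the raw audit policy data from a raw Security file
use strict;
use Parse::Win32Registry;

my $reg  = Parse::Win32Registry->new(shift) or die "Could not open hive.\n";
my $root = $reg->get_root_key;

my $key = $root->get_subkey("Policy\\PolAdtEv") or die "PolAdtEv key not found.\n";
print "LastWrite: ", scalar gmtime($key->get_timestamp), " (UTC)\n";

my $val  = $key->get_value("") or die "No default value found.\n";   # unnamed (default) value
my $data = $val->get_data;
printf "Auditing appears to be %s.\n",
    (unpack("C", $data) == 1) ? "enabled" : "disabled";              # first byte looks like the on/off flag
print "Raw policy data: ", unpack("H*", $data), "\n";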

Pretty neat, eh? You can compare this to the contents of the Event Logs (I still use the File::ReadEvt module that I wrote to do this, as the Event Viewer still reports the files as corrupted sometimes when you extract them from an image and try to import them into the Event Viewer on your analysis system). If I get any information about other values to parse out of the raw Security file, I'll add that to a script and call it "SECParse".

This script will be found in the Bonus section of the DVD that comes with my book. Look for Windows Forensic Analysis from Syngress later this spring.

Friday, January 19, 2007

SAMParse

Not too long ago, I blogged about using the Parse::Win32Registry module to parse raw Registry files, specifically the SAM file.

Since then, I've added to the code a bit, so that not only does it retrieve user information, but group membership info, as well. In this way, it's similar to the ProDiscover ProScript I use to do the same thing, only SAMParse works on the raw Registry file, and can be used on a SAM file exported from your image. It also works on the SAM files located in the Windows XP System Restore Points. It's a useful tool that still requires some additional testing, but for the most part it does provide me with a view into a Windows system that I wouldn't otherwise have.
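
If you want to start poking at this yourself, a bare-bones starting point (again assuming the Parse::Win32Registry module...this is nowhere near the full SAMParse output) is simply listing the user names from a raw SAM file:

#!/usr/bin/perl
# samusers-sketch.pl <sam_hive> - list user names (and key LastWrite times) from a raw SAM file
use strict;
use Parse::Win32Registry;

my $reg  = Parse::Win32Registry->new(shift) or die "Could not open hive.\n";
my $root = $reg->get_root_key;

my $names = $root->get_subkey("SAM\\Domains\\Account\\Users\\Names")
    or die "Names key not found.\n";

foreach my $user ($names->get_list_of_subkeys) {
    print $user->get_name, "  [LastWrite: ", scalar gmtime($user->get_timestamp), " UTC]\n";
}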

As a side note, I've also written a tool that parses the audit policy from the Security file, returning information similar to what you can view on a live system using auditpol.exe. When combined with other information from the Registry, this lets me know what I should expect to see in the Event Logs.

Both of these scripts, and others, are provided on the DVD that comes with my upcoming book, "Windows Forensic Analysis", due out from Syngress/Elsevier this spring. The scripts will be provided as Perl code, as well as standalone executables 'compiled' using Perl2Exe.

St Louis...here I come!

I'm headed to the DoD Cybercrime Conference in St. Louis, MO, next week. I'll be presenting at 0830 on Thursday morning. It turns out that there are a total of four presentations on Windows memory analysis at this conference. Wow. Had I known, I might have submitted on something else, like Registry analysis. The other presenters include Jesse Kornblum, Tom Goldsmith, and Tim Vidas. I'm sure that there will be some overlap, but I also think that this will be a very interesting conference.

Jesse was nice enough to set up a BoF/BYO on Memory Analysis on Thursday evening...swing by and say hi.

Addendum, 27 Jan: Okay, I'm back, safe and sound. I only regret that I did not get to spend as much time as I would have liked with the folks I met at the conference, but work called...

Wednesday, January 17, 2007

Legends of the Tech Industry

Okay, this post is not specific to Windows or IR, but I wanted to mention this...

Ever since I was in graduate school, I've been interested in the real stories behind the legends of the technology industry. Actually, it really goes back further than that to when I was in high school...my calculus teacher held Grace Hopper as her personal hero. Skip forward 10 years and I'm in grad school walking by Gary Kildall's former office every day.

I could keep rattling off names, but I wanted to share something that I had read yesterday...this online article from WSJ.com about John Draper, aka "Cap'n Crunch". I first saw the article linked from SlashDot just by chance, and I found the article itself to be fascinating. I happened to catch John on IM and he asked me for my opinion about the article. I told him what I thought, and by that time, he'd gotten started on his day so his thoughts were "mixed". Regardless of what the author of the article chose to point out and how he chose to portray John, I do think that it was a good thing that Steve Wozniak was quoted and that John's relationship with both "The Woz" and Steve Jobs was at least pointed out.

I think it's cool to be able to get the "inside scoop" as it were, to be able to pierce the veil of mystery (and misinformation) that is put up around people like John, Woz, etc., by the media. Whether it's an article, a book, or a movie, there's always something that was misinterpreted or simply done wrong to make the piece suitable for public consumption. Also, it's good to get another perspective on some of the events in history. Very cool.

Tuesday, January 16, 2007

Fundamentals

MS recently posted their Fundamental Computer Investigation Guide for Windows. You can download the document or read the entire doc online.

I haven't heard too many opinions yet about this document, good or bad. It may be because most folks aren't aware of it. I will say right up front that I was involved in the editing process, so I'll have a slightly different view. I've seen this document in various forms, and read each of those iterations at least twice.

The paper is intended for IT professionals in the United States who need a general understanding of computer investigations, including many of the procedures that can be used in such investigations and protocols for reporting incidents. Okay, that's a great way to start when writing a fundamentals paper. The paper seems to be directed at folks who either don't normally respond to incidents, or haven't done so before and are now in a position where they are required to do so.

The paper consists of five chapters and an appendix. I'm not going to go through each one, but rather just mention some highlights. I do think that the paper is worth the time to read it, as anyone who reads it is going to get something out of it, positive or negative.

Chapter 1: Assess the Situation: This chapter provides some good advice and things to think about when an incident occurs; I would suggest, however, that most of those things (obtain authorization, review laws, etc.) be done before an incident occurs...that way, attention can be focused on responding to the incident. The Conduct a Thorough Assessment section is fairly comprehensive, and as you read it, keep in mind that no one will be able to list everything you need to know for every incident. Not only do incidents vary, but the same worm on two different networks will be handled differently, because each infrastructure is different, both from a technical/network perspective and a human/socio-political perspective. I would say that there's enough there to get you started, and going forward, it's better to admit what you don't know than to make things up; many an investigation has gone down the wrong path because someone made assumptions based on too little information.

One thing I would like to see changed about the paper is how data about an incident is referred to. In the Prepare for Evidence Acquisition section, the first sentence says in part, "To prepare for the Acquire the Data phase..." So which is it...evidence or data? Given that many states are considering PI laws, it is important to differentiate between the two. Also, consistency of terminology is just a good idea.

On a positive note, the first chapter ends by referring to documentation. Documentation is very important during an investigation. Let's say that you're sweeping across servers to identify a piece of malware...if you don't document your process (what's being checked and why), you're going to have folks who go to a server and don't know what they're supposed to do. Also, you need to document which systems have been scanned and by whom, so that you don't spend a great deal of time rescanning the same server over and over again. Remember, if you didn't document it, it didn't happen.

Chapter 2: Acquire the Data: This chapter glosses over data acquisition as part of incident response. There is a reference to the Tools section in the Appendix, but the tools listed are exclusively either SysInternals tools or native commands on the system. Don't get me wrong...there are some extremely useful tools from SysInternals, but what's missing is a description of what information needs to be collected, and the best tools for doing so. For example, when responding to an incident, the first things I want to know are:
  1. The list of active processes, with the full path to the executable image and the command line to launch each one. I'd also like to know the user context of each process.
  2. Network connections.
  3. Process-to-port mappings, so that I know which network connection is owned or used by which process.
  4. Who is logged into the system (NetBIOS, mapped shares, Terminal Services, etc.)
  5. Running services.
This is just the basics...there's more info that I won't get into here (hey, I need something to blog about later, right?). It's not that the tools mentioned won't provide that information...some do. It's that the paper doesn't mention what's important to know...it just tells the reader that there are some tools they can use.
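
To put that into concrete terms, here's the sort of thing I mean...a bare-bones sketch (the tool selection is just an example and doesn't get you everything on the list above; full command lines, for instance, take something like tlist or wmic, and you'd want to run trusted copies of the tools from a CD or thumb drive and send the output off the system) that collects some of those items and nothing else:

#!/usr/bin/perl
# volatile-sketch.pl - collect a bare minimum of volatile data from a live Windows system
# (sketch only; substitute your own trusted tools and pipe the output off-system)
use strict;

my @cmds = (
    "tasklist /v",      # active processes, with user context (XP Pro/2003)
    "netstat -ano",     # network connections, with the owning PID (process-to-port)
    "tasklist /svc",    # running services, mapped to their processes
    "psloggedon",       # who is logged on (SysInternals)
    "net session",      # NetBIOS sessions to this system
);

foreach my $cmd (@cmds) {
    print "=" x 10, " $cmd ", "=" x 10, "\n";
    print `$cmd 2>&1`;    # capture STDOUT and STDERR
    print "\n";
}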

Chapter 3: Analyze the Data tells the reader what they should do with the data they've collected. In the section on analyzing host data, you'll see the following statement:

...you might use the Microsoft Windows® Sysinternals Strings tool to search the files located in the \Windows\Prefetch folder. This folder contains information such as when and where applications were launched.

Don't get me wrong...I agree with both sentences. I just don't think that the two of them should be next to each other. Why? Well, the information maintained in a .pf file regarding when the application was launched is a 64-bit FILETIME object...strings.exe won't find that for you. While using strings or BinText from Foundstone can be very useful when looking for ASCII, Unicode, or Resource strings within a file, neither will help you locate the FILETIME object.
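
To illustrate the point, here's a little sketch that pulls the last run time out of a .pf file...something strings will never show you (it assumes the XP/2003 Prefetch format, where the last run FILETIME appears to sit at offset 0x78):

#!/usr/bin/perl
# pref-sketch.pl <file.pf> - print the embedded last run time from an XP/2003 Prefetch file
# (sketch; assumes the XP/2003 .pf layout, with the last run FILETIME at offset 0x78)
use strict;

my $file = shift or die "You must enter a filename.\n";
open(FH, "<", $file) or die "Could not open $file.\n";
binmode(FH);
seek(FH, 0x78, 0);
read(FH, my $raw, 8);
close(FH);

my ($lo, $hi) = unpack("VV", $raw);
my $t = int(($hi * 4294967296 + $lo) / 10000000) - 11644473600;   # FILETIME -> Unix epoch
print "Last run time: ", scalar gmtime($t), " (UTC)\n";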

There's more to it than that, though. When responding to an incident, how do you identify a "suspicious" file or process? Within your infrastructure, what constitutes suspicious? Some might think that running three or even four remote desktop applications is suspicious, while others may think that multiple copies of svchost.exe running on a system is suspicious.

At this point, I think that I've provided enough comments regarding what I saw to be good in the paper, as well as what, IMHO, could be improved or changed. I think that the paper is a good start, but don't expect to sit down and read it, and then be able to conduct an investigation. I still think that MS would be better off with a different structure to documents such as these, directing different versions to different audiences. For example, a high level document for incident managers, one that's a bit more technical for incident team leaders (so that they can evaluate the performance of the team members), and then separate documents for data acquisition and analysis for host- and network-based live acquisition, as well as acquiring an image.

Thoughts?

Sunday, January 14, 2007

P0wned by certs

Have we been p0wned by certs?

You're probably asking yourself, "What the...?" right about now. Bear with me on this one. How many of us go away to training, receive a certification, and then never use that training again? Or worse, without an instructor there, we are sloppy about how we use the training, and then it's as if we had never gone in the first place. Come on, raise your hands. Okay, now, think about it...how many of us go away to training, come back, and thankfully never get evaluated on what we learned?

We've all seen it...we meet someone with a certification in, say for example, incident handling. And then we watch how they go about handling an incident, either in a training scenario or during a real incident, and we wonder...why are they doing that, or "what the...?"

I'm sure we could all trade stories on this, but I think that we've already gotten to the point. So the question becomes, what's the purpose of the certification if the person who gets it cannot then perform to the minimum level specified by that certification?

Back in the '90s, I decided that I wanted to learn SCUBA. So I went to the nearby military base, took classes, paid my tuition, took the "final" and then received my certification. From that point on, if I failed to perform at the level specified by my certification, I could have seriously injured (or killed) myself, or worse, others. Had I been unsafe and irresponsible, someone would likely have reported me, or at the very least, refused to dive with me and left me to hurt only myself.

The military itself is very similar. If you get sent off for training, it's very likely that when you return to your unit, you will have to actually use that training for something. During military training, I was taught how to disassemble and reassemble several weapons, to include the M9, M16 (w/ M203 40mm grenade launcher), M249, M60, and M2 .50 calibre machine gun. Not only did I have to pass a practical, but I then had to use that knowledge at my first unit to teach the same things to others.

So someone goes off to learn to be an incident handler and then returns to their organization, and an incident occurs. How is the certified individual evaluated, or are they evaluated at all? What is the outcome of the incident? Does the certified individual declare that the incident was the result of a "rootkit" with no data to support that claim?

See, I guess what I'm getting at is...are certifications effective, or have we been p0wned by certs? I guess you really have to look at the purpose of a certification, and what it's intended for. However, I do have some recommendations as to an alternative approach...
  1. Rather than sending someone off to take generic training, have functional training within your organization. That way, the training can be specific to your environment, and immediately useful in that environment. Be sure to coordinate with the instructor and provide input on the types of incidents you're seeing.
  2. This type of training isn't just for the technical folks...managers need it, too. Not to the same technical detail, of course, but managers need to know what skills they are deploying against an incident, so that efforts can be coordinated and properly...uh...managed.
  3. Managers also need to know how to evaluate the performance of the team, as well as each member. After all, don't we tend to remember things better if we know that we're going to be tested on it, and that something (bonus, promotion, etc.) may be riding on how well we use that knowledge? Also, being able to evaluate the team will allow the manager to identify shortcomings, obtain additional training, etc. One great way to do this is to see who's really strong in one area, and have them work with others to bring them up to speed.
So, in a nutshell, IMHO 'tis better to spring for functional training that can be used and evaluated than to send someone off for a certification and then not be able to evaluate them when they return. I do believe that certifications have their purpose, but I also believe that there are only a few folks out there who actually evaluate interview candidates, new hires, or even those who've just received their certification.

Thoughts?

Saturday, January 13, 2007

New SANS Cert

Do we need another SANS cert? I don't think it's so much about us...it's what SANS wants.

I was reading TaoSecurity this morning and ran across this link to the recent SANS newsletter...

Does anyone on your staff do an excellent job of cleaning out PCs that have been infected by spyware and other malicious software? We are just starting development of a new certification (and related training) for Certified Malware Removal Experts and we are looking for a council of 30 people who have done a lot of it to help vet the skills and knowledge required for the certification exam and classes. Email cmre@sans.org if you have a lot of experience.

I looked at this and thought...hhmmm...why create a new certification for skillsets that admins should already have? After all, malware detection is really just an advanced form of troubleshooting...which all admins should be knowledgeable in, right? I mean, when you have trouble with your car, do you examine it (ie, look at the gas gauge and determine you're out of gas) or do you just abandon your car on the side of the road and get a new one?

Of course, there is a school of thought that asks why you should certify someone to run "format c:\", then fdisk, then re-install the OS. Ugh. I don't know which is worse...thinking that "slash and burn" is an acceptable solution, or certifying something like this.

Thoughts?

Friday, January 12, 2007

IMHO

I received some questions from readers and I thought that I'd take a crack at them...

With consumer and commercial hard drives getting larger and larger (soon there will be a consumer-level 1TB HD), what effect will the sheer volume of information have on the forensics community in general, and the open-source forensics community in specific? Does the future include more automated tools? Distributed forensic examinations? Will the open-source forensic community be able to keep up with the commercial tools?

For the most part, this appears to be two separate questions, but I'll treat them as pretty much the same one.

This is nothing new...the challenge of increasing volumes has affected the forensic community since...well...since there was storage. Capacities have always increased...such is the nature of a consumer market. So what's the answer? Well, in a nutshell...the Age of Nintendo Forensics is dead. What I mean by that is that new techniques, processes, and methodologies need to be employed to meet these challenges. We can no longer rely on simply pushing a button...we have to understand what it is we're looking for, and where to look for it.

Notice I didn't say "developed". Rather, I said "employed". Yes, it was my intention to imply that these techniques are already available, and actually used by a number of practitioners. However, the vast majority of the digital forensics community appears to rely on the traditional approach to computer forensics analysis, which can be likened to killing the victim of a crime and performing an autopsy to see what happened.

Live analysis techniques need to be employed more often. Many traditionalists say, "I won't use these techniques until they've been proven in court." Well, consider this...at one point, computer forensics wasn't used in court, either, and had to be proven.

Suffice it to say, computing technology grows in complexity...this applies to the hardware, as well as the software. Accordingly, forensic techniques need to keep pace. The traditional techniques are still usable...I'm not saying that they aren't...but practitioners can no longer learn one thing and hope to stay in business as complexity and change surround them. We need to grow in knowledge and understanding in order to keep up. This includes Locard's Exchange Principle, as well as understanding what actions or conditions lead to the creation and modification of certain artifacts...from that, we understand that the absence of an artifact is in itself an artifact.

Some of the challenges introduced by Windows Vista include extracting live memory and (presumably) an increase in the use of EFS by consumers. How can these challenges best be addressed by the forensics community?

The challenge of extracting live memory has been an issue since Windows 2003 SP1, when user-mode access to the \\.\PhysicalMemory object was restricted. However, has it really been a challenge, per se? How many folks have obtained memory dumps, and of those, how have they been used? It isn't as if the vast majority of the community has been using physical memory as a source of evidence and presenting it in court.

That being said, these challenges will be addressed by a small portion of the forensics community. That core group (and it may be dispersed) has developed processes and methodologies, as well as documentation, that they will use and even publish. However, it may be a while before the rest of the "community" of forensic practitioners catches on and begins using these on a regular basis.

Since Vista was specifically asked about, I'll throw this out there...there is a Registry value called NtfsDisableLastAccessUpdate, and yes, it does exactly what it sounds like. If this value is set to "1", then the operating system will not update last access times on files. From NT through 2003, this value was not set by default, and setting it was recommended for high-volume file servers in order to increase performance. However, on Vista, this behavior is enabled by default...last access times are not updated. What this means is that examiners are going to have to use other techniques for determining timelines of activity, etc., and that may even require Registry analysis.
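
If you'd rather check this during an exam than on a live box, a quick sketch against a raw System file (again assuming the Parse::Win32Registry module, and that ControlSet001 is the ControlSet in use) looks like this:

#!/usr/bin/perl
# ntfs-lau-sketch.pl <system_hive> - check the NtfsDisableLastAccessUpdate setting
use strict;
use Parse::Win32Registry;

my $reg = Parse::Win32Registry->new(shift) or die "Could not open hive.\n";
my $key = $reg->get_root_key->get_subkey("ControlSet001\\Control\\FileSystem")
    or die "FileSystem key not found.\n";

if (my $val = $key->get_value("NtfsDisableLastAccessUpdate")) {
    printf "NtfsDisableLastAccessUpdate = %d\n", $val->get_data;
} else {
    print "Value not present; last access updates follow the OS default.\n";
}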

Do you think there are ethical challenges specific to forensic examiners or incident handlers? What do you think is the most effective method of oversight in small offices where there may only be one examiner? Would you say that strong personal ethics are more important than technical skill for an examiner/incident handler?

I would suggest that ethics is not something that is restricted to forensic examiners alone, but is just as important in all fields. I think that in any field, a wake-up call comes when someone falls into that "who are you to question me" trap and then gets caught. I also think that is an important evolution and good for the community at large. In this kind of work it is important to remain humble and to constantly keep checking ourselves. There's nothing wrong with going to someone else and asking, hey, did I do enough here, or is this telling me what I think it's telling me? Overall, I think that builds trust, and strengthens the individuals involved.

Thoughts? Comments? Email me, or leave a comment...

Friday, January 05, 2007

What's New So Far in 2007

Less than a week into 2007, and there's already a bunch of new stuff out that is really useful and really cool.

First off, there's a new blog on the streets...uh, web. Check out Mark McKinnon's CFED-TTF blog. One post up (at the time of this writing) and he already has some good info on the drivetable.txt file in XP System Restore Points. Great start so far!

Next, I was doing my monthly check on the E-Evidence site, and I found some really good stuff. No, wait...I mean REALLY good stuff! Raphael Bousquet has an interesting presentation on forensic triage. I know that it's a product pitch, but the information on the idea of doing a triage is interesting, and something that should be, at the very least, discussed.

Golden Richard has an interesting PPT available that discusses next-gen digital forensics, to include topics such as live forensics. In the PPT, Prof. Richard points out that evidence (and he uses the term "evidence") exists in places other than those thought of in the "traditional" sense of forensics. He also talks about all the work that needs to be done...and while I agree, the question is, who will do that work? We do have a plethora (like that? No, I didn't get a thesaurus for Christmas) of students now that many universities and even community colleges here in the US have started offering courses and degrees in digital and computer forensics, but how long will it be before the big-brain ideas become something usable by investigators and examiners?

There are other presentations and papers available in this month's "What's New", but IMHO, the best paper available in this collection of links is Jessica Reust's paper on AIM trace evidence. What struck me most about her paper is that by the time I finished it, I actually had something useful, something I could use in an examination.

Now, on the flip side of all this, we should take it upon ourselves, as a community, to identify those things that we need, and either create them or put them on the table. What am I talking about? If you see a need or have a question, get it out there. Let someone know. Maybe someone out there already has the information you need, or is working on it.

Thoughts? Ideas? Comments?

Wednesday, January 03, 2007

New Year's Resolutions

I read today that there are some technical bloggers that have resolved to not make any New Year's resolutions. Uh...okay...but isn't saying that you're not going to do that, in essence, a resolution? Hey, I'm just sayin'...

To kick 2007 off, I'm going to resolve to think big thoughts about IR and CF this year. Seriously. There has to be more to IR and forensic analysis than just what we're seeing. Think about it. There's got to be a ton of evidence in the Registry, right? After all, no one goes there. What about in RAM? And I know that there are a lot of questions out there, as I see some of them again and again. Questions like:
  • How do I show files were copied to/from a system?
  • How do I show that a CD/DVD was created on a system, and by whom?
  • How do I show that a user account was changed from a User to an Administrator, and when?
What I would ask of all of you for 2007 is to build the knowledge base of the forensic community. Remember, it takes a village...I know, I can't believe I said that either, but hey, it makes my point. I've heard that a lot of folks don't post or comment or ask questions online because they don't want to look stupid. Okay, so post under someone else's name so they look stupid. Or post anonymously. Whatever works. The point is that there're a bunch of us out there working in this area, and every now and then a "hey, what about..." or "hey, what if..." or "hey, look what I found..." would really go a long way toward adding to all of our knowledge.

I've also been told that LEOs don't like to post questions because opposing counsel might see it and hold that against them in court. Well, if that's the case, then couldn't opposing counsel pretty much get any testimony thrown out because at one point, before all of your training and reading, you didn't know anything? I mean, doesn't it make sense that you'd ask, get an answer, and verify it, and let that be the case, rather than go into court with less of a case, all because you didn't want to ask a question?

One last thing...please resolve that in 2007, when posting questions, you'll include the OS and version. Seriously. I know some of you think that when someone responds to your post with "what OS/version?", you've been "chastised" or p0wned...whatever. Get over it. Most times, the answer will be different, depending on whether we're talking Windows 2000, XP SP2, or Vista, and I don't have the time to write or read an if...then...else encyclopedic answer.

Saturday, December 30, 2006

Tagged!

Okay, I haven't blogged in a while...not because I haven't wanted to, but because I've been working (you know, that thing that pays the bills) and writing my next book (I've been exchanging chapters with tech editors). In between all that, I've been coding...and if I take this any further, I'm going to have to add a "really bad excuse" label to my blog.

So what exactly is a "blog tag"? To be honest, I have no idea. I first saw mention of it over on MySpace, so that should give you some indication of what it is. Now it's been picked up in the rest of the virtual world, even by nerds ("hey, I resemble that remark!") like us. So, my understanding from Richard is that I'm supposed to document 5 things people may not know about me, and then "tag" five other bloggers. Okay, here goes...

1. I attended the Virginia Military Institute for my undergraduate degree...by accident. I am a Navy brat, and was working toward an appointment to one of the federal service academies, and a high school counselor handed me an application to a small military school in the Shenandoah Valley. In fact, he gave me the last application in his folder, and didn't ask me to make copies of it. That's one of the things about my life that I would NOT change if I could.

2. While attending VMI, I ran marathons my senior year. Yes, looking at me, you'd never know that...according to many fellow Marines, I was "built like a brick sh*t house"...a physique not necessarily associated with nor conducive to marathoning-like activities. I first ran the Marine Corps Marathon in '88, with a finishing time of 2:57. This qualified me for the Boston Marathon, which I signed up for (that is, paid money) and ran...with a finishing time of 3:11. There's more to that story, but unless you attended VMI, it would neither make sense nor be funny. I ran that Marine Corps Marathon again in '89, while I was in The Basic School, and finished in 2:54. I still have all of my finisher medals.

3. In my first tour in the Fleet, I was in Okinawa for a year, stationed with MACS-4. This is the same sort of unit that Lee Harvey Oswald served in. I was a Communications Officer (MOS: 2502). While there, I studied Shorin Ryu, an Okinawan martial art. I'd like to study Aikido and Krav Maga.

4. I have a horse, and yes, I ride. Western. Well, not so much "ride" in the traditional sense, like dressage, but more in the sense of "don't dump me off and I'll give you a carrot". I've even done this at a dude ranch, but the carrot thing didn't work.

5. I completed my master's degree at the Naval Postgraduate School, just prior to separating from the military. While there, I wasn't actively engaged in a whole lot (set up a 10Base2 network and connected the detachment to the token ring CAN), so I worked out quite a bit. At one point, I was doing 27 dead-hang, palms-facing-out pullups and ran the 3 mile PFT run in 17 minutes or less. I could bench press 360 lbs unassisted...and yet I was 5 lbs over the weight that the Marine Corps said I should be for my height. So I was placed on weight control, along with the Ho-ho and Twinkie bandits.

Now, on to the tagging...

Jesse Kornblum

Marcus Ranum

Dave Roth

Andreas Schuster

Robert "van" Hensing

Live long and prosper!

Wednesday, November 29, 2006

Artifact classes

I've been doing some thinking about IR and CF artifacts over the past couple of weeks, and wanted to share my thoughts on something that may be of use, particularly if it's developed a bit...

When approaching many things in life, particularly a case I'm investigating, I tend to classify things (imagine the scene in The Matrix where Agent Smith has Morpheus captive, and tells him that he's classified the human species as a virus*) based on information I've received...incident reports, interviews with the client, etc. By classify, I mean categorizing the incident in my mind...web page defacement, intrusion/compromise, inappropriate use, etc. To some extent, I think we all do this...and the outcome of this is that we tend to look for artifacts that support this classification. If I don't find these artifacts, or the artifacts that I do find do not support my initial classification, then I modify my classification.

A by-product of this is that if I've classified a case as, say, an intrusion, I'm not necessarily going to be looking for something else, such as illicit images, particularly if it hasn't been requested by the client. Doing so would consume more time, and when you're working for a client, you need to optimize your time to meet their needs. After all, they're paying for your time.

Now, what got me thinking is that many times in the public lists (and some that require membership) I'll see questions or comments that indicate that the analyst really isn't all that familiar with either the operating system in the image, or the nature of the incident they're investigating, or both. This is also true (perhaps more so) during incident response activities...not understanding the nature of an issue (intrusion, malware infection, DoS attack, etc.) can many times leave the responder either pursuing the wrong things, or suffering from simple paralysis and not knowing where to begin.

So, understanding how we classify things in our minds can lead us to classifying events and incidents, as well as classifying artifacts, and ultimately mapping between the two. This then helps us decide upon the appropriate course of action, during both live response (ie, an active attack) and post-mortem activities.

My question to the community is this...even given the variables involved (OS, file system, etc.), is there any benefit to developing a framework for classification, to include artifacts, to provide (at the very least) a roadmap for investigating cases?

Addendum, 30 Nov: Based on an exchange going on over on FFN, I'm starting to see some thought being put into this, and it's helping me gel up my thinking (albeit not crystallize it, yet), as well. Look at it this way...doctors have a process that they go through to diagnose patients. There are things that occur every time you show up at the doctor's office (height, weight, temperature, blood pressure), and there are those things that the doctor does to diagnose your particular "issue du jour". Decisions are made based on the info the doctor receives from the patient, and courses of action are decided. The doctor will listen to the patient, but also observe the patient's reaction to certain stimuli...because sometimes patients lie, or the doctor may be asking the wrong question.

Continuing with the medical analogy, sometimes it's a doctor that responds, sometimes a nurse or an EMT. Either way, they've all had training, and they all have knowledge of the human body...enough to know what can possibly be wrong and how to react.

Someone suggested that this may not be the right framework to establish...IMHO, at least it's something. Right now we have nothing. Oh, and I get to be Dr. House. ;-)

*It's funny that I should say that...I was interviewed on 15 May 1989 regarding the issue of women at VMI, and I said that they would initially be treated like a virus.

Tuesday, November 28, 2006

MS AntiMalware Team paper

I know I'm a little behind on this one, but I saw this morning that back in June, the MS AntiMalware team released an MSRT white paper entitled "Windows Malicious Software Removal Tool: Progress Made, Trends Observed".

I've seen some writeups and overviews of the content, particularly at the SANS ISC. Some very interesting statistics have been pulled from the data collected, and as always, I view this sort of thing with a skeptical eye. From the overview:

This report provides an in-depth perspective of the malware landscape based on the data collected by the MSRT...

It's always good for the author to set the reader's expectations. What this tells me is that we're only looking at data provided by the MS tool.

Here are a couple of statements from the overview that I found interesting:

Backdoor Trojans, which can enable an attacker to control an infected computer and steal confidential information, are a significant and tangible threat to Windows users.

Yes. Very much so. If a botmaster can send a single command to a channel and receive the Protected Storage data from thousands (or tens of thousands) of systems, this would represent a clear and present danger (gotta get the Tom Clancy reference in there!).

Rootkits...are a potential emerging threat but have not yet reached widespread prevalence.

I don't know if I'd call rootkits a "potential" or "emerging" threat. Yes, rootkits have been around for quite a while, since Greg Hoglund started releasing them with the NTRootkit v.0.40. In fact, commercial companies like Sony have even seen the usefulness of such things. It's also widely known that there are rootkits-for-hire, as well. Well, I guess what it comes down to is how you define "widespread". We'll see how this goes...I have a sneaking suspicion that since it's easy enough to hide something in plain sight, why not do so? That way, an admin can run all the rootkit detection tools they want and never find a thing.

Social engineering attacks represent a significant source of malware infections. Worms that spread through email, peer-to-peer networks, and instant messaging clients account for 35% of the computers cleaned by the tool.

I can't say I'm surprised, really.

These reports are interesting and provide a different view of the malware landscape that many of us might not see on a regular basis...it's kind of hard to see the forest for the trees when you're face down in the mud, so to speak.

Even so, what I don't think we see enough of in the IR and computer forensics community is something along the same lines but geared toward information that is important to us, such as forensic artifacts. For example, how do different tools stack up against various rootkits and other malware, and what are the artifacts left by those tools?

Most of the A/V vendors note the various changes made when malware is installed on a system, but sometimes it isn't complete, and other times isn't correct (remember the MUICache issue??).

What would be useful is a repository, or even various sites, that could provide similar information but more geared toward forensic analysts.


Monday, November 27, 2006

Some sites of note

I've been perusing a couple of new sites that I've run across over the past couple of weeks and wanted to pass them on to others, and include some of my thoughts/experiences with the sites...

MultiMediaForensics - At first, this looks like another forum site, similar to ForensicFocus. One of the aspects of this site, however, is that the author maintains the shownotes for the CyberSpeak podcast. One of the most important things about this site is that it provides a forum for members of the computer forensics community to come together. In my experience, this is one of the things missing from the community...a "community". While ForensicFocus is UK-based, MultiMediaForensics appears to be more US-based, and that may have an effect on its popularity here in the US. If you follow the CyberSpeak podcast, or are just interested in computer forensics, make a point of contributing (post, write an article, provide a link, ask a question, etc.) to this site.

Hype-Free - While not specifically related to security or forensics, this site does have some interesting commentary. The author says in the "About" section that the goal of the blog is to "demystify security", so keep your eye out for some good stuff coming down the road.

MS AntiMalware Blog - Don't go to this site expecting to see MS's version of Symantec or some other A/V vendor that posts writeups on malware. However, you may find some interesting white papers and posts that will help you understand issues surrounding malware a bit better. For example, the ISC recently mentioned some highlights from one such white paper.

Sunday, November 26, 2006

A little programming fun, etc.

I was working on writing my book yesterday afternoon (for those of you who live in my area and know how nice it was outside, I know...shame on me for being inside), and was working on a section that has to do with the Windows Recycle Bin. Most forensic analysts are likely familiar with the Recycle Bin and how it works, as well as with Keith Jones's excellent description of the Recycle Bin records (1, 2), so this is more than likely nothing new.

For the book, I wanted to throw together a little script that can be used to parse the INFO2 file, in order to illustrate some concepts that I am trying to convey in the section. To my surprise, it was easier than I thought...I got something put together that works amazingly well - my only stumbling block at this point is how to present the data in such a way as to be useful to the widest range of folks.
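
For the curious, the core of it really is pretty small...here's a sketch of the record-parsing loop (it assumes the record layout Keith documented for 2000/XP, with the record size in the header at offset 0xC; the script in the book handles the details more carefully):

#!/usr/bin/perl
# info2-sketch.pl <INFO2> - parse the basics out of a Recycle Bin INFO2 file
# (sketch; assumes the record layout documented by Keith Jones for 2000/XP)
use strict;

my $file = shift or die "You must enter a filename.\n";
open(FH, "<", $file) or die "Could not open $file.\n";
binmode(FH);

read(FH, my $hdr, 16);
my $recsize = unpack("V", substr($hdr, 0xC, 4));    # record size; 800 (0x320) on XP
die "Invalid record size.\n" unless $recsize;

while (read(FH, my $rec, $recsize) == $recsize) {
    my $name  = (split(/\x00/, substr($rec, 0, 260)))[0];   # ANSI name, NULL-terminated
    my $index = unpack("V", substr($rec, 0x104, 4));        # index (the # in the Dc#.ext name)
    my $drive = unpack("V", substr($rec, 0x108, 4));        # drive number (0 = A:, 2 = C:)
    my ($lo, $hi) = unpack("VV", substr($rec, 0x10C, 8));   # deletion time (FILETIME)
    my $del   = int(($hi * 4294967296 + $lo) / 10000000) - 11644473600;
    my $size  = unpack("V", substr($rec, 0x114, 4));        # physical file size, in bytes
    print "Index $index  Drive $drive  ", scalar gmtime($del), " (UTC)  $size bytes  $name\n";
}
close(FH);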

This just shows how useful a little bit of programming skill can be. Something I've been thinking about adding to my list of ProDiscover ProScripts is a script that will go through the Recycler directory and look for files that are out of place...not only files that are in the Recycler directory itself, but those that are in the user's deleted files but not listed in the INFO2 file.

Way Too Funny
Okay, now this is just way too funny...I was doing a search on Amazon to see if my next book is up for early order yet (my publisher said that it would be soon), and I ran across my master's thesis! If you're suffering from insomnia, it's well worth the price to grab a copy of this thesis...I used video teleconferencing traffic to verify the fractal nature of ethernet traffic, so that real statistical models could be used (rather than assumed) for constructing buffers on switches at the edge of ATM clouds.

Anyway, this was really "back in the day" (re: 1996). I was using Java (which, with the help of O'Reilly, I had taught myself) to write an SNMP application to poll Cisco routers for traffic data. I then used MatLab to perform statistical analysis tests of the data to determine its "burstiness" and verify its fractal nature. The video conferencing came in because it allowed me to generate data...I could sit in front of one camera, across the room from the other camera, and have measurable traffic generated across the network. At the time, Notepad was my Java IDE...this was before IBM's Java IDE came out (I saw my first copy of the IBM Java IDE at the end of July, 1997).

Something new from ISC
There were a couple of items I found interesting on the SANS Internet Storm Center handler's diary recently.

One was this item about malware that uses a commercial tool to detect the existence of a virtual machine. Very interesting.

Another is this writeup on the FreeVideo Player Trojan, submitted by Brian Eckman (the writeup also appears on SecGeeks and First.org, among other locations). Brian's writeup is thorough and very comprehensive...I like seeing this sort of thing posted, as it's a nice break to see a submission like this. Brian put a lot of work into the writeup, and looked at a variety of areas on the system that were affected by this malware.

The one question I have (and would have for anyone) is this statement that was made:

Further analysis shows that this appears to inject itself into running processes.

Given the rest of the writeup, there really isn't anything that specifically states how Brian determined this. I'm sure we can come up with a variety of assumptions as to how he did it, or how we would go about doing it, but I'd like to hear from the author. I'm going to submit that as a question to him, and I'll let you know what I find out.

Friday, November 17, 2006

How do you do that voodoo that you do?

After reading through the SecuriTeam documents, and then reading this SANS ISC post, I have to admit, incident response can be extremely complex. This is particularly true if the bad guys catch you with your pants down (figuratively, or hey, even literally). From the SecuriTeam write-ups, we get an insight into how systems are targeted...via Google searches (with books such as Google Hacking from Johnny Long, there's really no big mystery to this) and even freely available vulnerability scanning tools.

In the SecuriTeam documents, the decision to conduct a live response is documented, as a "full forensic analysis" would take too long...it was evidently determined that the attackers hadn't simply defaced a web page, but were actively on the system and involved in creating mayhem. This isn't unusual...many organizations decide that affected systems cannot be taken down. However, with the Team Evil incident, there didn't seem to be any volatile data collected or analyzed. It appears that because a web page was defaced and some directories created, that the investigation and the IR team's reactions focused solely on the web application(s).

One thing I took away from the second SecuriTeam document was that there were...what? Eight attacks?

The post from the Internet Storm Center doesn't seem to make things any easier, does it? After all, not only do multiple downloaders dump a "zoo of malware" (i.e., keylogger, BHOs, even disables the Security Center) on the system, but it also reports the infected system's location so it can be tracked via Google Maps. While this is an interesting new capability, the real issue IMHO is the stuff that's dumped on the system. How do you track it all? If you really don't know a great deal about your systems (hosts and networks) and the applications that you're running, and you haven't taken any real steps to protect those systems, I'd have to say that the ISC recommendation to reinstall is the only option.

If you really think about it, though, maybe this is the intention. If you've been looking at some of the various conference presentations over the past couple of years, there have been some anti-forensics and "how to use the analyst's training against them" presentations. One way to look at these attacks is that the attacker is just being greedy. Another is to think that the intention of dumping multiple programs (keyloggers, BHOs, etc.) on the system is that IF the sysadmin detects any one of them, they'll reinstall the system...and it's likely that the system can be easily reinfected.

So, on the face of things, we've got a denial of service attack...compromise and infect a critical-use system, and it's so important that the sysadmin takes it offline for a reinstall. This also lends itself to persistence...the reinstalled system may have the same or additional vulnerabilities, so the attacker can reinfect the system once it's back up.

Of course, I can't help but think that this could also be a distraction, a little bit of misdirection...get the IT staff looking one direction while the attacker is siphoning data off of another system.

So I guess I have to concede the point that reinstallation is the thing to do. If you (a) don't really know your infrastructure (hosts or network) that well, (b) have no idea where critical (processing or storing sensitive personal information) applications are, (c) haven't really taken any defense-in-depth measures, (d) have no clue what to do, and (e) don't have an IR plan that is supported and endorsed by senior management, I guess there really isn't any other option.

Thoughts?