Tuesday, January 30, 2007

Intentional erasure

An interesting question appeared on one of the listservs a bit ago: "What is an investigator's protocol for demonstrating intentional erasure of data, ostensibly done by the user to remove evidence from a system?" Since this question doesn't fit neatly into one of the FAQ sections at the end of a chapter in my next book, I thought I'd address it here in the blog.

The first thing I would look at is the level of erasure that has occurred. One of the first places to check is the Recycle Bin. Many users delete files through the Explorer shell, and they end up in the Recycle Bin...from there, some users don't bother to empty the Recycle Bin. However, this does show an intentional attempt to remove data, based on the actions that are required to move the files to the Recycle Bin.

I have seen instances in which the user has deleted files (i.e., sent them to the Recycle Bin) and then emptied the Recycle Bin. In such cases, the last modification time on the INFO2 file in the Recycle Bin may give you an idea of when the Recycle Bin was emptied. Again, this may show intent.
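As a minimal sketch of that check (in Python here, though the scripts mentioned on this blog are Perl), pulling a file's last-modified time and reporting it in UTC might look like this; the RECYCLER path in the comment is a hypothetical example:

```python
import os
from datetime import datetime, timezone

def last_modified_utc(path):
    """Return a file's last-modified time as a UTC datetime."""
    return datetime.fromtimestamp(os.stat(path).st_mtime, tz=timezone.utc)

# On a mounted image, the INFO2 file might live somewhere like:
#   C:\RECYCLER\S-1-5-21-...\INFO2   (hypothetical path; naming varies)
# print(last_modified_utc(r"C:\RECYCLER\...\INFO2"))
```

Keep in mind that a copy or AV scan can alter file times on a live system, so this is most trustworthy when run against a mounted image or the metadata pulled from one.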

In some cases, many of the sectors occupied by the deleted files were then overwritten by the limited defragmentation that occurs about every three days on a Windows XP system, making the deleted files unrecoverable.

I would also suggest checking the contents of the UserAssist keys, and on XP systems, the Prefetch folder, to see if there are any artifacts to indicate that an erasure tool of some kind was used. This may range from commercial tools to freely available VBS scripts.
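One detail worth knowing when checking the UserAssist keys: the value names are ROT13-encoded, so they won't jump out at you in a Registry viewer. A quick Python sketch of decoding a value name (again, the tools discussed here are Perl; this is just for illustration, and the calc.exe path is a made-up example of an encoded name):

```python
import codecs

def decode_userassist(value_name):
    """UserAssist value names are ROT13-encoded; decode to readable text."""
    return codecs.decode(value_name, "rot13")

# "HRZR_EHACNGU" is the encoded form of the common "UEME_RUNPATH" prefix
print(decode_userassist("HRZR_EHACNGU:P:\\Jvaqbjf\\flfgrz32\\pnyp.rkr"))
# UEME_RUNPATH:C:\Windows\system32\calc.exe
```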

One important thing to keep in mind when performing forensic analysis is that given some artifacts, we can expect to see other artifacts. For example, if we find that auditing of logons has been enabled, and we see user profiles with MAC times (on the NTUSER.DAT files, etc.) that indicate logons, then we can expect to see some information in the SAM file, as well as in the Security Event Log. By correlating these sources, we can develop information about our case. However, the absence of those artifacts that we know we should see (but don't) is itself an artifact.

Tuesday, January 23, 2007

Scripts for parsing the Registry

I was working on a script recently to expand the reach of the Registry Analysis material in my upcoming book, and I thought it would be a good idea to implement something that would parse the audit policy from a system. So I wrote poladt.pl, a Perl script that uses the Parse::Win32Registry module to extract the necessary value from the raw Security file and parse it, displaying the audit policy as shown below:

G:\perl>poladt.pl d:\cases\security
LastWrite: Fri Sep 9 01:11:43 2005 (UTC)
Auditing was enabled.
There are 9 audit categories.

Privilege Use..............None
Object Access..............None
Account Logon Events.......Both
System Events..............Both
Policy Change..............Both
Logon Events...............Both
Account Management.........Both
Directory Service Access...None
Process Tracking...........None

Note: I added the little dots so that the output would line up better and be easier to read; formatting in the blog is a little beyond my current skillset.

Pretty neat, eh? You can compare this to the contents of the Event Logs (I still use the File::ReadEvt module that I wrote to do this, as the Event Viewer still reports the files as corrupted sometimes when you extract them from an image and try to import them into the Event Viewer on your analysis system). If I get any information about other values to parse out of the raw Security file, I'll add that to a script and call it "SECParse".
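For the curious, the parsing itself is straightforward once you have the raw PolAdtEv value data out of the Security hive. The layout sketched below (a leading enabled flag, one DWORD per category with 0=None/1=Success/2=Failure/3=Both, and a trailing DWORD holding the category count) is my reading of the data, not a specification, so verify it against your own Security files; the sketch is Python rather than the Perl used in poladt.pl:

```python
import struct

# ASSUMED layout of the PolAdtEv value data (verify before relying on it):
#   DWORD 0      : nonzero if auditing is enabled
#   DWORDs 1..N  : one per audit category (0=None, 1=Success,
#                  2=Failure, 3=Both)
#   final DWORD  : N, the number of audit categories
AUDIT = {0: "None", 1: "Success", 2: "Failure", 3: "Both"}

def parse_poladtev(data):
    count = struct.unpack_from("<I", data, len(data) - 4)[0]
    enabled = struct.unpack_from("<I", data, 0)[0] != 0
    cats = struct.unpack_from("<%dI" % count, data, 4)
    return enabled, [AUDIT.get(c, "?") for c in cats]

# Synthetic example: auditing enabled, 3 categories set to Both/None/Both
blob = struct.pack("<5I", 1, 3, 0, 3, 3)
print(parse_poladtev(blob))  # (True, ['Both', 'None', 'Both'])
```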

This script will be included in the Bonus section of the DVD that comes with my book. Look for Windows Forensic Analysis from Syngress later this spring.

Friday, January 19, 2007

SAMParse

Not too long ago, I blogged about using the Parse::Win32Registry module to parse raw Registry files, specifically the SAM file.

Since then, I've added to the code a bit, so that not only does it retrieve user information, but group membership info as well. In this way, it's similar to the ProDiscover ProScript I use to do the same thing, only SAMParse works on the raw Registry file, and can be used on a SAM file exported from your image. It also works on the SAM files located in the Windows XP System Restore Points. It's a useful tool; it still requires some additional testing, but for the most part it provides me with a view into a Windows system that I wouldn't otherwise have.

As a side note, I've also written a tool that parses the audit policy from the Security file, returning information similar to what you can view on a live system using auditpol.exe. When combined with other information from the Registry, this lets me know what I should expect to see in the Event Logs.

Both of these scripts, and others, are provided on the DVD that comes with my upcoming book, "Windows Forensic Analysis", due out from Syngress/Elsevier this spring. The scripts will be provided as Perl code, as well as standalone executables 'compiled' using Perl2Exe.

St Louis...here I come!

I'm headed to the DoD Cybercrime Conference in St. Louis, MO, next week. I'll be presenting at 0830 on Thursday morning. It turns out that there are a total of four presentations on Windows memory analysis at this conference. Wow. Had I known, I might have submitted on something else, like Registry analysis. The other presenters include Jesse Kornblum, Tom Goldsmith, and Tim Vidas. I'm sure that there will be some overlap, but I also think that this will be a very interesting conference.

Jesse was nice enough to set up a BoF/BYO on Memory Analysis on Thursday evening...swing by and say hi.

Addendum, 27 Jan: Okay, I'm back, safe and sound. I only regret that I did not get to spend as much time as I would have liked with the folks I met at the conference, but work called...

Wednesday, January 17, 2007

Legends of the Tech Industry

Okay, this post is not specific to Windows or IR, but I wanted to mention this...

Ever since I was in graduate school, I've been interested in the real stories behind the legends of the technology industry. Actually, it really goes back further than that to when I was in high school...my calculus teacher held Grace Hopper as her personal hero. Skip forward 10 years and I'm in grad school walking by Gary Kildall's former office every day.

I could keep rattling off names, but I wanted to share something that I read yesterday...this online article from WSJ.com about John Draper, aka "Cap'n Crunch". I first saw the article linked from SlashDot just by chance, and I found the article itself to be fascinating. I happened to catch John on IM and he asked me for my opinion about the article. I told him what I thought; by that time he'd gotten started on his day, and his own thoughts on it were "mixed". Regardless of what the author of the article chose to point out and how he chose to portray John, I do think that it was a good thing that Steve Wozniak was quoted and that John's relationship with both "The Woz" and Steve Jobs was at least pointed out.

I think it's cool to be able to get the "inside scoop", as it were, to be able to pierce the veil of mystery (and misinformation) that is put up around people like John, Woz, etc., by the media. Whether it's an article, a book, or a movie, there's always something that was misinterpreted or simply done wrong to make the piece suitable for public consumption. Also, it's good to get another perspective on some of the events in history. Very cool.

Tuesday, January 16, 2007

Fundamentals

MS recently posted their Fundamental Computer Investigation Guide for Windows. You can download the document or read the entire doc online.

I haven't heard too many opinions yet about this document, good or bad. It may be because most folks aren't aware of it. I will say right up front that I was involved in the editing process, so I'll have a slightly different view. I've seen this document in various forms, and read each of those iterations at least twice.

The paper is intended for IT professionals in the United States who need a general understanding of computer investigations, including many of the procedures that can be used in such investigations and protocols for reporting incidents. Okay, that's a great way to start when writing a fundamentals paper. The paper seems to be directed at folks who either don't normally respond to incidents, or haven't done so before and are now in a position where they are required to do so.

The paper consists of five chapters and an appendix. I'm not going to go through each one, but rather just mention some highlights. I do think that the paper is worth the time to read it, as anyone who reads it is going to get something out of it, positive or negative.

Chapter 1: Assess the Situation: This chapter provides some good advice and things to think about when an incident occurs; I would suggest, however, that most of those things (obtain authorization, review laws, etc.) be done before an incident occurs...that way, attention can be focused on responding to the incident. The Conduct a Thorough Assessment section is fairly comprehensive, and as you read it, keep in mind that no one will be able to list everything you need to know for every incident. Not only do incidents vary, but the same worm on two different networks will be handled differently, because each infrastructure is different, both from a technical/network perspective and a human/socio-political perspective. I would say that there's enough there to get you started, and going forward, it's better to admit what you don't know than to make things up; many an investigation has gone down the wrong path because someone made assumptions based on too little information.

One thing I would like to see changed about the paper is how data about an incident is referred to. In the Prepare for Evidence Acquisition section, the first sentence says in part, "To prepare for the Acquire the Data phase..." So which is it...evidence or data? Given that many states are considering PI laws, it is important to differentiate between the two. Also, consistency of terminology is just a good idea.

On a positive note, the first chapter ends by referring to documentation. Documentation is very important during an investigation. Let's say that you're sweeping across servers to identify a piece of malware...if you don't document your process (what's being checked and why), you're going to have folks who go to a server and don't know what they're supposed to do. Also, you need to document which systems have been scanned and by whom, so that you don't spend a great deal of time rescanning the same server over and over again. Remember, if you didn't document it, it didn't happen.

Chapter 2: Acquire the Data: This chapter glosses over data acquisition as part of incident response. There is a reference to the Tools section in the Appendix, but the tools listed are exclusively either SysInternals tools or native commands on the system. Don't get me wrong...there are some extremely useful tools from SysInternals, but what's missing is a description of what information needs to be collected, and the best tools for doing so. For example, when responding to an incident, the first things I want to know are:
  1. The list of active processes, with the full path to the executable image and the command line to launch each one. I'd also like to know the user context of each process.
  2. Network connections.
  3. Process-to-port mappings, so that I know which network connection is owned or used by which process.
  4. Who is logged into the system (NetBIOS, mapped shares, Terminal Services, etc.)
  5. Running services.
This is just the basics...there's more info that I won't get into here (hey, I need something to blog about later, right?). It's not that the tools mentioned won't provide that information...some do. It's that the paper doesn't mention what's important to know...it just tells the reader that there are some tools they can use.

Chapter 3: Analyze the Data tells the reader what they should do with the data they've collected. In the section on analyzing host data, you'll see the following statement:

...you might use the Microsoft Windows® Sysinternals Strings tool to search the files located in the \Windows\Prefetch folder. This folder contains information such as when and where applications were launched.

Don't get me wrong...I agree with both sentences. I just don't think that the two of them should be next to each other. Why? Well, the information maintained in a .pf file regarding when the application was launched is a 64-bit FILETIME object...strings.exe won't find that for you. While using strings or BinText from Foundstone can be very useful when looking for ASCII, Unicode, or Resource strings within a file, neither will help you locate the FILETIME object.
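To make that concrete: a FILETIME is a 64-bit count of 100-nanosecond intervals since 1 Jan 1601 (UTC), which is why a string search will never surface it. In XP-format .pf files the last-run FILETIME is commonly cited at offset 0x78, but treat that offset as an assumption to verify against your own samples. A rough Python sketch:

```python
import struct
from datetime import datetime, timedelta, timezone

EPOCH_1601 = datetime(1601, 1, 1, tzinfo=timezone.utc)

def filetime_to_datetime(ft):
    """Convert a 64-bit FILETIME (100ns ticks since 1601-01-01 UTC)."""
    return EPOCH_1601 + timedelta(microseconds=ft // 10)

def pf_last_run(pf_data, offset=0x78):
    """Pull the last-run FILETIME from XP-format .pf data. Offset 0x78
    is the commonly cited location; verify against your own samples."""
    (ft,) = struct.unpack_from("<Q", pf_data, offset)
    return filetime_to_datetime(ft)

# Sanity check: 116444736000000000 ticks is exactly the Unix epoch
print(filetime_to_datetime(116444736000000000))  # 1970-01-01 00:00:00+00:00
```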

There's more to it than that, though. When responding to an incident, how do you identify a "suspicious" file or process? Within your infrastructure, what constitutes suspicious? Some might think that running three or even four remote desktop applications is suspicious, while others may think that multiple copies of svchost.exe running on a system is suspicious.

At this point, I think that I've provided enough comments regarding what I saw to be good in the paper, as well as what, IMHO, could be improved or changed. I think that the paper is a good start, but don't expect to sit down and read it, and then be able to conduct an investigation. I still think that MS would be better off with a different structure to documents such as these, directing different versions to different audiences. For example, a high level document for incident managers, one that's a bit more technical for incident team leaders (so that they can evaluate the performance of the team members), and then separate documents for data acquisition and analysis for host- and network-based live acquisition, as well as acquiring an image.

Thoughts?

Sunday, January 14, 2007

P0wned by certs

Have we been p0wned by certs?

You're probably asking yourself, "What the...?" right about now. Bear with me on this one. How many of us go away to training, receive a certification, and then never use that training again? Or worse, without an instructor there, we are sloppy about how we use the training, and then it's as if we had never gone in the first place. Come on, raise your hands. Okay, now, think about it...how many of us go away to training, come back, and thankfully never get evaluated on what we learned?

We've all seen it...we meet someone with a certification in, say for example, incident handling. And then we watch how they go about handling an incident, either in a training scenario or during a real incident, and we wonder...why are they doing that, or "what the...?"

I'm sure we could all trade stories on this, but I think that we've already gotten to the point. So the question becomes, what's the purpose of the certification if the person who gets it cannot then perform to the minimum level specified by that certification?

Back in the '90s, I decided that I wanted to learn SCUBA. So I went to the nearby military base, took classes, paid my tuition, took the "final", and then received my certification. From that point on, if I failed to perform at the level specified by my certification, I could have seriously injured (or killed) myself, or worse, others. Had I been unsafe and irresponsible, someone would likely have reported me, or at the very least, refused to dive with me and left me to hurt only myself.

The military itself is very similar. If you get sent off for training, it's very likely that when you return to your unit, you will have to actually use that training for something. During military training, I was taught how to disassemble and reassemble several weapons, to include the M9, M16 (w/ M203 40mm grenade launcher), M249, M60, and M2 .50 calibre machine gun. Not only did I have to pass a practical, but I then had to use that knowledge at my first unit to teach the same things to others.

So someone goes off to learn to be an incident handler and then returns to their organization, and an incident occurs. How is the certified individual evaluated, or are they evaluated at all? What is the outcome of the incident? Does the certified individual declare that the incident was the result of "rootkit" with no data to support that claim?

See, I guess what I'm getting at is...are certifications effective, or have we been p0wned by certs? I guess you really have to look at the purpose of a certification, and what it's intended for. However, I do have some recommendations as to an alternative approach...
  1. Rather than sending someone off to take generic training, have functional training within your organization. That way, the training can be specific to your environment, and immediately useful in that environment. Be sure to coordinate with the instructor and provide input on the types of incidents you're seeing.
  2. This type of training isn't just for the technical folks...managers need it, too. Not to the same technical detail, of course, but managers need to know what skills they are deploying against an incident, so that efforts can be coordinated and properly...uh...managed.
  3. Managers also need to know how to evaluate the performance of the team, as well as each member. After all, don't we tend to remember things better if we know that we're going to be tested on them, and that something (bonus, promotion, etc.) may be riding on how well we use that knowledge? Also, being able to evaluate the team will allow the manager to identify shortcomings, obtain additional training, etc. One great way to do this is to see who's really strong in one area, and have them work with others to bring them up to speed.
So, in a nutshell, IMHO 'tis better to spring for functional training that can be used and evaluated than to send someone off for a certification and then not be able to evaluate them when they return. I do believe that certifications have their purpose, but I also believe that there are few folks out there who actually evaluate interview candidates or new hires, or even those who've just received their certification.

Thoughts?

Saturday, January 13, 2007

New SANS Cert

Do we need another SANS cert? I don't think it's so much about us...it's what SANS wants.

I was reading TaoSecurity this morning and ran across this link to the recent SANS newsletter...

Does anyone on your staff do an excellent job of cleaning out PCs that have been infected by spyware and other malicious software? We are just starting development of a new certification (and related training) for Certified Malware Removal Experts and we are looking for a council of 30 people who have done a lot of it to help vet the skills and knowledge required for the certification exam and classes. Email cmre@sans.org if you have a lot of experience.

I looked at this and thought...hhmmm...why create a new certification for skillsets that admins should already have? After all, malware detection is really just an advanced form of troubleshooting...which all admins should be knowledgeable in, right? I mean, when you have trouble with your car, do you examine it (i.e., look at the gas gauge and determine you're out of gas) or do you just abandon your car on the side of the road and get a new one?

Of course, there is a school of thought that asks: why should you certify someone to run "format c:\", then fdisk, then re-install the OS? Ugh. I don't know which is worse...thinking that "slash and burn" is an acceptable solution, or certifying something like this.

Thoughts?

Friday, January 12, 2007

IMHO

I received some questions from readers, and I thought I'd take a crack at them...

With consumer and commercial hard drives getting larger and larger (soon there will be a consumer-level 1TB HD), what effect will the sheer volume of information have on the forensics community in general, and the open-source forensics community in specific? Does the future include more automated tools? Distributed forensic examinations? Will the open-source forensic community be able to keep up with the commercial tools?

For the most part, this appears to be two separate questions, but I'll treat them as pretty much the same one.

This is nothing new...the challenge of increasing volumes has affected the forensic community since...well...since there was storage. Capacities have always increased...such is the nature of a consumer market. So what's the answer? Well, in a nutshell...the Age of Nintendo Forensics is dead. What I mean by that is that new techniques, processes, and methodologies need to be employed to meet these challenges. We can no longer rely on simply pushing a button...we have to understand what it is we're looking for, and where to look for it.

Notice I didn't say "developed". Rather, I said "employed". Yes, it was my intention to imply that these techniques are already available, and actually used by a number of practitioners. However, the vast majority of the digital forensics community appears to rely on the traditional approach to computer forensics analysis, which can be likened to killing the victim of a crime and performing an autopsy to see what happened.

Live analysis techniques need to be employed more often. Many traditionalists say, "I won't use these techniques until they've been proved in court." Well, consider this...at one point, computer forensics wasn't used in court, and had to be proven.

Suffice to say, computing technology grows in complexity...this applies to the hardware as well as the software. Accordingly, forensic techniques need to keep pace. The traditional techniques are still usable...I'm not saying that they aren't...but practitioners can no longer learn one thing and hope to stay in business as complexity and change surround them. We need to grow in knowledge and understanding in order to keep up. This includes Locard's Exchange Principle, as well as understanding what actions or conditions lead to the creation and modification of certain artifacts...from that, we understand that the absence of an artifact is in itself an artifact.

Some of the challenges introduced by Windows Vista include extracting live memory and (presumably) an increase in the use of EFS by consumers. How can these challenges best be addressed by the forensics community?

The challenge of extracting live memory has been an issue since Windows 2003 SP1, when user-mode access to the \\.\PhysicalMemory object was restricted. However, has it really been a challenge, per se? How many folks have obtained memory dumps, and of those, how have they been used? It isn't as if the vast majority of the community has been using physical memory as a source of evidence and presenting it in court.

That being said, these challenges will be addressed by a small portion of the forensics community. That core group (and it may be dispersed) has developed processes and methodologies, as well as documentation, that they will use and even publish. However, it may be a while before the rest of the "community" of forensic practitioners catches on and begins using these on a regular basis.

Since Vista was specifically asked about, I'll throw this out there...there is a Registry value called NtfsDisableLastAccessUpdate, and yes, it does exactly what it sounds like. If this value is set to "1", the operating system will not update last-access times on files. From NT through 2003, this setting was disabled by default, though enabling it was recommended for high-volume file servers in order to increase performance. On Vista, however, this functionality is enabled by default. What this means is that examiners are going to have to use other techniques for determining timelines of activity, etc., and that may even require Registry analysis.
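For reference, the value lives under the FileSystem key; the Registry path in the comments below is the real location, while the little helper function is just an illustrative sketch of how to interpret the DWORD:

```python
# HKLM\SYSTEM\CurrentControlSet\Control\FileSystem\NtfsDisableLastAccessUpdate
# On a live Windows system, one way to check it:
#   reg query "HKLM\SYSTEM\CurrentControlSet\Control\FileSystem" /v NtfsDisableLastAccessUpdate

def last_access_updates_disabled(value):
    """A value of 1 means NTFS will NOT update last-access times on
    files (the Vista default); 0 means updates occur."""
    return value == 1

print(last_access_updates_disabled(1))  # True
```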

Do you think there are ethical challenges specific to forensic examiners or incident handlers? What do you think is the most effective method of oversight in small offices where there may only be one examiner? Would you say that strong personal ethics are more important than technical skill for an examiner/incident handler?

I would suggest that ethics is not something restricted to forensic examiners alone; it's just as important in all fields. I think that in any field, a wake-up call comes when someone falls into that "who are you to question me" trap and then gets caught. I also think that's an important evolution, and good for the community at large. In this kind of work it is important to remain humble and to constantly keep checking ourselves. There's nothing wrong with going to someone else and asking, hey, did I do enough here, or is this telling me what I think it's telling me? Overall, I think that builds trust, and strengthens the individuals.

Thoughts? Comments? Email me, or leave a comment...

Friday, January 05, 2007

What's New So Far in 2007

Less than a week into 2007, and there's already a bunch of new stuff out that is really useful and really cool.

First off, there's a new blog on the streets...uh, web. Check out Mark McKinnon's CFED-TTF blog. One post up (at the time of this writing) and he already has some good info on the drivetable.txt file in XP System Restore Points. Great start so far!

Next, I was doing my monthly check on the E-Evidence site, and I found some really good stuff. No, wait...I mean REALLY good stuff! Raphael Bousquet has an interesting presentation on forensic triage. I know that it's a product pitch, but the information on the idea of doing triage is interesting, and something that should be, at the very least, discussed.

Golden Richard has an interesting PPT available that discusses next-gen digital forensics, including topics such as live forensics. In the PPT, Prof. Richard points out that evidence (and he uses the term "evidence") exists in places other than those thought of in the "traditional" sense of forensics. He also talks about all the work that needs to be done...and while I agree, the question is, who will do that work? We do have a plethora (like that? No, I didn't get a thesaurus for Christmas) of students, now that many universities and even community colleges here in the US have started offering courses and degrees in digital and computer forensics, but how long will it be before the big-brain ideas become something usable by investigators and examiners?

There are other presentations and papers available in this month's "What's New", but IMHO, the best paper available in this collection of links is Jessica Reust's paper on AIM trace evidence. What struck me most about her paper is that by the time I finished it, I actually had something useful, something I could use in an examination.

Now, on the flip side of all this, we should take it upon ourselves, as a community, to identify those things that we need, and either create them or put them on the table. What am I talking about? If you see a need or have a question, get it out there. Let someone know. Maybe someone out there already has the information you need, or is working on it.

Thoughts? Ideas? Comments?

Wednesday, January 03, 2007

New Year's Resolutions

I read today that there are some technical bloggers who have resolved not to make any New Year's resolutions. Uh...okay...but isn't saying that you're not going to do that, in essence, a resolution? Hey, I'm just sayin'...

To kick 2007 off, I'm going to resolve to think big thoughts about IR and CF this year. Seriously. There has to be more to IR and forensic analysis than just what we're seeing. Think about it. There's got to be a ton of evidence in the Registry, right? After all, no one goes there. What about in RAM? And I know that there are a lot of questions out there, as I see some of them again and again. Questions like:
  • How do I show files were copied to/from a system?
  • How do I show that a CD/DVD was created on a system, and by whom?
  • How do I show that a user account was changed from a User to an Administrator, and when?
What I would ask of all of you for 2007 is to build the knowledge base of the forensic community. Remember, it takes a village...I know, I can't believe I said that either, but hey, it makes my point. I've heard that a lot of folks don't post or comment or ask questions online because they don't want to look stupid. Okay, so post under someone else's name so they look stupid. Or post anonymously. Whatever works. The point is that there're a bunch of us out there working in this area, and every now and then a "hey, what about..." or "hey, what if..." or "hey, look what I found..." would really go a long way toward adding to all of our knowledge.

I've also been told that LEOs don't like to post questions because opposing counsel might see it and hold that against them in court. Well, if that's the case, then couldn't opposing counsel pretty much get any testimony thrown out because at one point, before all of your training and reading, you didn't know anything? I mean, doesn't it make sense that you'd ask, get an answer, and verify it, and let that be the case, rather than go into court with less of a case, all because you didn't want to ask a question?

One last thing...please resolve that in 2007, when posting questions, you'll include the OS and version. Seriously. I know some of you think that when someone responds to your post with "what OS/version?", you've been "chastised" or p0wned...whatever. Get over it. Most times, the answer will be different depending on whether we're talking Windows 2000, XP SP2, or Vista, and I don't have the time to write or read an if...then...else encyclopedic answer.