Wednesday, December 29, 2010

Mining MSRC analysis for forensic info

Anyone who's followed this blog for a while is familiar with my "rants" against AV vendors and the information they post about malware; specifically, AV vendors and malware analysts have, right in front of them, information that is extremely useful to incident responders and forensic analysts, but they do not release or share it, because they do not recognize its value. This could be due to the AV mindset, or it could be due to their business model (the more I think about that, the more it sounds like a chicken-and-egg discussion...).

When I was on the IBM ISS ERS team, we did a good deal of malware response. In several instances, team members were on-site with an AV vendor rep, whose focus was to get a copy of the malware to his RE team, so that an updated signature file could be provided to the customer. However, in the time it takes to get all this done, the customer is getting infected and re-infected, data is (or may be) flooding off of the infrastructure, etc. Relying on known malware characteristics, our team members were able to assist in stemming the tide and getting the customer on the road to recovery, even in the face of polymorphic malware.

What I find useful sometimes is to look at malware write-ups from several sites, and search across the 'net (via Google) to see what others may be saying about either the malware or specific artifacts.

I watched this video recently, in which Bruce Dang of Microsoft's MSRC talked about analyzing StuxNet to figure out what it did/does. The video is of a conference presentation, and I'd have to say that if you can get past Bruce saying "what the f*ck" way too many times, there's some really good information that he discusses, not just for malware RE folks, but also for forensic analysts. Here are some things I came away with after watching the video:

Real analysis involves symbiotic relationships. I've found this to be very true in some of the analysis I've done. I have worked very closely with our own RE guy, giving him copies of the malware, dependency files (ie, DLLs), and information such as paths, Registry keys, etc. In return, I've received unique strings, domain names, etc., which I've rolled back into iterative analysis. As such, we've been able to develop analysis that is much greater than the sum of its parts. This is also good reason to keep a copy of Windows Internals on your bookshelf, and keep a copy of Malware Analyst's Cookbook within easy reach.

Malware may behave differently based on the eco-system. I've seen a number of times where malware behaves differently based on the eco-system it infects. For example, Zeus takes different steps depending on whether the infected user has Administrator rights or not. I've seen other malware infections be greatly hampered by the fact that the user who got infected was a regular user and not an admin...indicating that the variant does not have a mechanism for checking for and handling different privilege levels. Based on what Bruce discussed in his presentation, StuxNet takes different steps depending upon the version of Windows (i.e., XP vs. Vista+) that it's running on.

Task Scheduler. I hear the question all the time, "what's different in Windows 7, as compared to XP?" Well, this seems to be a never-ending list. Oy. Vista systems (and above) use Task Scheduler 2.0, which is different from the version that runs on XP/Windows 2003 in a number of ways. For example, TS 1.0 .job files are binary, whereas TS 2.0 files are XML-based. Also, according to Bruce's presentation, when a task is created, a checksum for the task .job file is computed and stored in the Registry. Before the task is run, the checksum is recalculated and compared to the stored value, to check for corruption. Bruce stated that when StuxNet hit, the hash algorithm used was CRC32, and that generating collisions for this algorithm is relatively easy...and that's part of what StuxNet does. Bruce mentioned that the algorithm has since been updated to SHA-256.

The Registry key in question is:

HKLM\Software\Microsoft\Windows NT\CurrentVersion\Schedule\TaskCache

A lot more research needs to be done regarding how forensic analysts (and incident responders) can parse and use the information in this key, and in its subkeys and values.
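The store-then-recompute checksum scheme Bruce described can be sketched in a few lines of Python. This is purely illustrative...the actual TaskCache storage format isn't documented here, and the task "contents" below are made up...but it shows why a CRC32-based integrity check is weak:

```python
import zlib


def job_checksum(data: bytes) -> int:
    """Compute a CRC32 over a task file's contents, as a stand-in for
    the integrity check Task Scheduler performs before running a task."""
    return zlib.crc32(data) & 0xFFFFFFFF


# A checksum is stored when the (hypothetical) task is created...
original = b"<Task>original task definition</Task>"
stored = job_checksum(original)

# ...and recomputed before the task runs. Any tampered payload that
# collides with the stored CRC32 would pass this check, which is why
# generating CRC32 collisions matters.
assert job_checksum(original) == stored
```

The point of moving to SHA-256 is exactly this: producing a second input that matches a stored CRC32 value is trivial, while doing so for SHA-256 is not computationally feasible.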

MOF files. Bruce mentioned in his presentation that Windows has a thread that continually polls the system32\wbem\mof directory looking for new files, and when it finds one, runs it. In short, MOF files are compiled scripts, and StuxNet used such a file to launch an executable; drop the file in as Guest, and the executable referenced in the file gets run as System.
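The polling behavior described above can be loosely sketched as follows (this is a simplified illustration, not Windows' actual implementation; the function and file names are made up):

```python
import os


def poll_for_new_files(directory, seen):
    """Return files that have appeared in the directory since the last
    poll, mimicking (very loosely) the watcher thread that picks up new
    MOF files dropped into system32\\wbem\\mof."""
    current = set(os.listdir(directory))
    new_files = sorted(current - seen)  # anything not seen before
    seen.update(current)
    return new_files
```

The forensic significance is the privilege boundary: the directory is writable by a low-privileged account, but whatever the watcher does with a newly-discovered file happens as System.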

Management needs actionable information. This is true in a number of situations, not just the kind of analysis work that Bruce was performing. This applies to IR and DF tasks, as well...sure, analysts can find a lot of "neat" and extremely technical stuff, but the hard part...and what we're paid to translate that wealth of technical information into actionable intelligence that the customer can use to make decisions, within their environment. What good does it do a customer if you pile a 70-page report on them, expecting them to sift through it for data, and figure out how to use it? I've actually seen analysts write reports, and when I've asked about the significance or usefulness of specific items, been told, "...they can Google it." So, through experience, Bruce's point is well-taken...analysts sift through all of the data to produce the nuggets, then filter those to produce actionable intelligence that someone else can use to make decisions.

A final thought, not based specifically on the helps forensic analysts and incident responders to engage sources that are ancillary to their field, and not directly related to what we do every day. This helps us to see the forest for the trees, as it were...

Sunday, December 26, 2010

Writing Books, pt IV

Okay, by now, you've likely decided to write that book, and that you've opted to do so through a publisher to get it on the bookshelves and onto Kindles and other ereaders. Remember, this isn't the only way to get something published, but it is one of the only ways to get your book published and have someone else take care of getting it on shelves and in front of your intended audience through Amazon, etc. Your alternatives include self-publishing through online services, or simply writing your "book" and printing your manuscript to a PDF file rather than a printer. There are advantages and disadvantages to each approach, but we're going to go with the assumption that you'll be working with a publisher.

Working with the publisher
When you're working with the publisher, don't set your expectations (at all, or too high) of what that's like ahead of time. Remember, the publisher's staff are people, too, and may be working with multiple authors. You very likely won't be the only author that they're working with, nor the only schedule. In addition, remember that in the current economy many people are wearing multiple hats in their the editor you're dealing with may not get your email because they're traveling or at a book show. I've worked with staff who, in 2009/2010, did not have remote access to their email (can you imagine that??), so I wouldn't hear back from them for weeks at a time.

While working with the publisher, I've had editors and even editors' assistants changed partway through the writing process. The result was that chapters that I had sent in for review could no longer be found. I know this sounds like a bit much, but keep track of what you send in, when, and to whom. This can really help...particularly in instances where you have to resend things.

One of the things I've run into several times is that I've submitted what I thought was everything...DVD contents, revised chapters, etc...and asked the staff I was working with if everything was received, and if I needed to provide anything else. I'd been told, no, that's everything...only to be contacted three weeks later and told that something else was needed (review the proofs, provide a bio, etc.). The key to this is to see what's in other books, and keep a list of what you've provided...have you written a preface yet? A bio? How about that acknowledgment or the dedication page?

In short, be flexible. Focus on meeting your schedule in the contract. If you're not going to meet the schedule for some reason, do the professional thing and let them know ahead of time. Don't worry about what anyone else does or is doing. In the long run, it'll help.

Working with reviewers
When you're working with reviewers, keep in mind what their role is in the process. They're generally there to review your work, so don't take what they say or comment on personally.

There are generally two kinds of reviewers...those who do the grammar, spelling and formatting review for the publisher (they tend to work for the publisher), and those who are supposed to review your work from a technical perspective, to ensure that it's accurate (although why you'd put that amount of time into writing something that is completely off base, I have no idea). Generally speaking, whatever the grammar/spelling reviewer suggests is probably advisable to accept. However, this won't always be the case, particularly when you've written a turn of phrase that you really want to use, or are using acronyms specific to your field. I remember that I had an issue with the acronym "MAC"...did it refer to file "MAC" times, or to a NIC's "MAC" address? Kind of depended on the chapter and context.

As far as your technical reviewers go, that's another story. There's no reason that you have to accept any of their proposed changes, or follow what their comments say. Hey, I know that's kind of blunt, but that's the reality of it. In every book I've worked on, to my knowledge, the technical reviewer has had no prior contact with me, my book proposal, or my thought process prior to getting my draft chapters. Therefore, they are missing a great deal of context...and in some cases, their comments have made little sense when you consider the overall scope and direction of the book.

For some reason, the publishing process seems to be something of a maze of Chinese walls. You get an author who's writing a technical book, working with a publisher who knows publishing, but not the subject that's being addressed. One person reviewing the book and working for the publisher knows spelling, grammar, and formatting, and that's good...but often times, the technical reviewer may not know a great deal about the subject being addressed, and knows nothing at all about the author, the goals and direction of the book, or much in the way of overall context. In my mind, this is just a short-coming of the process, and something that you need to keep in mind. I've worked with a LOT of folks with respect to writing technical reports, and there are generally two things that most folks do with suggested changes and comments...they either accept them all unconditionally, or they delete and ignore them. I would suggest that when you are going through the document that you receive back from the technical reviewer, make your changes and add your own comments to theirs, justifying your actions. Then save the document, copy it, and (if it's written in MSWord) run the copy through the document inspection process, accepting the edits and removing comments. That way, you have a clean copy to send back, but you also have a clear record of what was suggested and what you chose to do about it.

Another thing to keep in mind is that people have varying schedules...if you submit a couple of chapters and you feel that you aren't getting much in the way of a review, or one that's technical, get in touch with the editor and request someone else. Or, suggest someone to them up front...after all, if you really know the subject that you're writing about, you will likely know someone else in the field who (a) knows enough about it to review your work, (b) has the time to do a good review, and (c) has the interest in working with you. I've had folks offer to review my work completely aside from the publisher...that's okay, too, but it also means you may submit a chapter and not hear back at all. Remember, in the technical field, you don't make enough money to support yourself writing books, so neither writing nor reviewing books is a full-time job, and people have day jobs, too.

Working with co-authors

Writing a book as the sole author can be tough, as it is a lot of work...but I think that writing a book as multiple authors, particularly when none of the authors ever actually sit in a room together, is much harder. There are a lot of decisions that need to be made and coordinated ahead of time, and continually revisited throughout the process. Again, writing books in this field is NOT a full-time such, people's day jobs and lives tend to take precedence. Family illness, holidays, vacations, etc., all play a role in the schedule that needs to be worked out ahead of time.

Another thing to consider is that someone has to take the lead on tone...or not. You need to decide early on what the division of labor will be (split up chapters or sections), and whether or not you feel it's important to have a single tone throughout the book. There will be times when it makes sense to have a single tone, and there will be other times when it's pretty clear that you aren't going to have a single tone, as the various authors take the lead on the chapters for which they have the most expertise in the subject matter.

Providing Materials With Your Book
I'm one of those folks who writes some of my own code, and I tend to create my own tools, whether they be a batch file or a Perl script. As such, it's helpful to others if I make those tools available to them in some manner, and this is often done by putting those tools on a CD or DVD included with the book. I think that a lot of times, this increases the value of the book, but it can also be a bit difficult to deal how you provide the materials is something to consider up front. Another item that a lot of folks find interesting and very valuable is "cheat sheets"...if you list or explain a process in your book, and it covers a good portion of a chapter, it might be a good idea to provide a cheat sheet that the reader can print out (perhaps modify to meet their own needs) and use. How you intend to provide these, and other materials (i.e., videos that show the viewer how to do something step-by-step, etc.), is something that you need to consider ahead of time.

The point is that if there are materials you're going to refer to in your book, you have to figure out ahead of time how you're going to provide them. In my experience, there's two ways you can do this...provide the materials on the DVD that comes with the book, or provide them separately. I have usually opted to provide the materials on a DVD, but after having written a couple of books, I think I'm going to move to something completely separate, and provide the materials online.

I have decided to do this for a couple of reasons. One is that there's always someone out there who ends up purchasing a copy of the book that mysteriously doesn't have a DVD. Or they lose the DVD. Or they leave it at home or at work, when they need it in the other location. Then there's the folks who purchase ebooks for their Kindle or other ereader, and never got the email that says, "...go here to download additional content." Or they did, but the publisher modified their infrastructure so now the instructions or path aren't valid. And, of course, there's always the person who's going to contact you directly because they want to ensure that they have the latest copy of the materials.

My thinking is that a lot of these issues can be avoided if you choose a site like Google Code or something else that is appropriate (and relatively stable/permanent) for hosting your additional materials. That way, you can control what's most up-to-date and not have to rely on someone else's schedule for that. You can refer to the actual tools (and other materials) in the book, so that having the book itself makes the tools more valuable, but by providing them on the web, you can include "here are the absolute latest, newest, most up to date copies" on the page where the reader will go to download those tools.

Blogging is a great way to get started and get the feel for writing, without the constraints of editing (and things like spelling, grammar, etc.). Face it, some folks don't take criticism of any kind well, and don't put a great deal of stock in checking their own spelling...blogging is sort of a way to get into writing without having someone looking over your shoulder. It's also a great way for some folks to realize how important that sort of thing is.

Blogging is also a great way to self-market your book, prior to and following publication. It's a great way to start talking about the book, to answer questions that you get about your book and materials, address errata, etc. In some ways, a blog can also lay the groundwork for a second edition, or even just for your next effort, as you get feedback, read reviews, post new ideas, etc. For example, if you start to see that your book on forensic analysis is linked to another blog on malware reverse engineering, with that author making comments about what you've written (positive or negative), that could be a good indicator for you...what do you need to improve on, expand on, and what were you dead on with in your book?

Take the lead on marketing your book. Present the publisher with ideas, and take the lead on getting the word out there (assuming that that's what you want). When WFA 2/e was coming out, I was excited because this was the first book in a new direction that Syngress was going, something that was exemplified by the new cover design. That summer, the SANS Forensic Summit was going to be in Washington, DC, and I was attending as a speaker. As I looked more and more into the conference, and who was speaking and attending, I counted almost half a dozen Syngress authors who would be there, all of whom had the word "forensics" in their book title. I contacted the publisher to find out if they'd have a bookstore...I thought, between sessions I could answer questions about the book. Well, it turns out that they had NO PLANS for a bookstore!! I thought (and said to them), you've GOT to be kidding me! Here's a conference with "forensics" in the title, and all these authors of "forensics" books will be me, it was a total marketing coup. The short story is that the editor was there with books on a table and it was a huge success for everyone.

Final Thoughts...
And now, some final thoughts as I close out this series of posts.

I hope that in reading these posts, you've enjoyed them and at the same time gotten something out of them. I tend to take something of a blunt approach, in part because I don't want to sugarcoat things for someone who's considering writing a technical book. Yes, it is hard...but if you know up front what you may be facing, you're less likely to let it slow you down. One of the hardest things about writing books is that you're rarely, if ever, face-to-face with anyone from the publisher's staff when discussing your book. In fact, you're rarely face-to-face with anyone throughout the process.

One of the misconceptions a lot of folks who have never written a book have about authors is that they retain some modicum of control over what happens with the book once it's submitted to the printer. Nothing could be further from the truth. When WFA 1/e was released by Syngress, a PDF version of the book was available...for the first couple of weeks, it was provided with each copy of the book purchased through the Syngress web site. After that, it was available for purchase. Later, Syngress was purchased by Elsevier, a company out of Europe that produced all e-format versions of its books EXCEPT PDF. The author's role in any of that, particularly in the availability of a PDF version of their book, is zero. And I say that only because there's nothing less than zero.

Another misconception that I've run across is that most folks think that book authors have access to endless resources, or that somehow, the publishing company will provide those resources. This simply isn't the case. When I submitted the proposal for the Registry forensics book, all of the reviews came back saying that I needed to include discussion of the use of commercial tools, such as EnCase and FTK. Well, the short answer was "no"; the long answer was that I neither have access to, nor have I been able to obtain a temporary license for, either...and none of the reviewers was offering such a license. In all fairness, I will say that I was offered a temporary license to one of the commercial tools, but by the time that offer was available, I was too far into the writing process to go back and add that work and material into the book. It would have been particularly time consuming because I don't use those tools regularly. Anyway, my point is that when I have written my books, I tend to do so based on my own experiences, or those interesting experiences that others have shared. I tend not to write about ediscovery, because I've never done it. I likely won't be writing about Registry analysis of a Windows-powered car or Windows 7 phone, because I neither own nor have access to either, nor do I have the tools available to work with either. Like most authors, I don't have access to massive data centers for testing various operating systems and application installations across numerous configurations.

Keep in mind that your book is not going to be everything to everyone. You're going to have critics, and you're also going to have "armchair quarterbacks". You're going to have people who post to public forums that you "should've done this...", and not once have a good thing to say about your work. You're going to have folks who will email you glowing commendations for what you've done, but not post them publicly...even when they purchased your book based on a publicly-posted review. Don't let any of this bother you. One of my good friends who's also written a book has received some not-so-glowing criticism, to which he's responded, "...come see me when you've published a book." In short, don't let criticism get you down, and don't let it be an obstacle that prevents you from writing in the first place.

Finally, I want to say once again that writing technical books is tough. It's tough enough if you're a single person and not at all used to writing. If you're married (particularly newly married) and/or have small children, it can be logarithmically harder, and it will require even more discipline to write. However, it can also be extremely rewarding. Seeing your work published and sitting on a bookshelf is very rewarding. Think about've completed and achieved something that few others have attempted. If you've put the effort in and done the best you can, you should take pride in what you've done...and don't let the little things become insurmountable obstacles that prevent you from even trying.

Saturday, December 25, 2010

Installing on Windows

I don't often work with Python scripts, but I recently had an instance where, due to advice from a trusted source, I needed to run a script mentioned in the Malware Analyst's Cookbook. In short, what I wanted to do was take a look at a couple of suspicious executable files, having already run several AV scanners to identify and locate those files. Based on what I learned in setting this up, I wanted to share the steps I used to get this script running on Windows XP SP3. The script is a powerful tool that takes a look into a Windows portable executable (PE) file, and reports on "suspicious" elements of the file, if found, based on heuristics identified within the "Pimp my PE" paper. It can also incorporate YARA functionality so that PEiD and ClamAV signatures can be used, as well. This can be extremely valuable to an analyst, as we're all aware of how AV alone oftentimes will not detect malware. I've seen cases where malware was detected by the installed AV, only to have the timeline clearly show that at some point further down the road, another file with the same name was dumped on the system, but NOT detected by the same AV.
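To give a feel for what PE-level inspection looks like at its most basic, here's a much-simplified sketch (not the script's actual logic) that validates the DOS and PE signatures every Windows executable carries, using nothing but the Python standard library:

```python
import struct


def is_pe_file(data: bytes) -> bool:
    """Check for the 'MZ' DOS header and the 'PE\\0\\0' signature that
    identify a Windows portable executable."""
    # The DOS header is 64 (0x40) bytes and starts with 'MZ'.
    if len(data) < 0x40 or data[:2] != b"MZ":
        return False
    # Offset 0x3C holds e_lfanew, the offset of the PE signature.
    (e_lfanew,) = struct.unpack_from("<I", data, 0x3C)
    if e_lfanew + 4 > len(data):
        return False
    return data[e_lfanew:e_lfanew + 4] == b"PE\x00\x00"
```

Real heuristics go far beyond this (section names and entropy, import tables, packer signatures, etc.), but they're all built on parsing these same structures, which is what the pefile module does for you.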

Install Python: I opted for ActiveState's ActivePython, but you can also get the current distro for Windows from

Install Pefile: Do NOT use the pypm utility that ships with ActivePython to install the pefile module; instead, go directly to the source and get the latest version. Download the archive and copy the module files into the Lib directory for your Python installation.

Download the script: Go here (this is rev. 18; get the most current one available) and get the file; the easiest thing to do is click on "View raw file" and save it where you want it to go. I had some issues getting the script running on Windows 2003, and it came down to the indentation...if you program in Python, you may know what I mean. I had selected and copied the code in my browser and pasted it into a Notepad window; when I saved the code using "View raw file" from the Google Code site, things worked. On my XP system, I pasted the code into an UltraEdit window and saved it.

Installing python-magic: According to this source, you'll need a couple of files to get python-magic installed on your system. First, go get the GnuWin32 File utility, and download the latest archive. Copy magic1.dll to your system32 directory and put the magic file in the same directory as the script. You can get regex2.dll from the latest regex archive (copy the file to your system32 dir), and zlib1.dll from the latest zlib archive.

For this one, I contacted MHL (one of the Cookbook authors...) and he sent me the below instructions for installing python-magic on Windows:

Assuming you already have Python from or the ActivePython version...
1) Install
2) Get
* python build
* python install

3) Get GnuWin32's File utility
* Place magic1.dll from the Binaries package into your system32 dir

* Place "magic" from the Binaries package into your system32 dir (or anywhere else, just as long as you remember the path)

* Place zlib1.dll and regex2.dll from the Dependencies package into your system32 dir

4) Test your installation

C:\> python

>>> import magic

>>> test = magic.Magic(magic_file='C:\path\to\your\magic')

>>> print test.from_buffer("test")
ASCII text, with no line terminators

For more information, go
here and here.

Again, many thanks to MHL for providing those instructions.

Another lesson here is to not stick with one tool or one set of tools, but instead be open to finding and using a tool or technique that works, and incorporating it into your toolkit. While Perl has the Parse::Win32Registry module and Python does not appear to have something comparable, Python does have the pefile module (on which the script was built) and Perl does not have (to the best of my knowledge) a comparable module. So rather than fitting the case to the tool, it's often a much better idea (albeit not easier) to find a tool or technique that will help you with your case.

Friday, December 24, 2010

Writing Books, pt III

At this point, if you're still following and reading this series, you're likely interested in writing a book, particularly (but not exclusively) in a technical field. Now that we're to the point where we've at least considered a publishing medium...or an actual publisher...let's talk about the actual writing.

Determine your audience and writing style

So, now that you have a topic, and your detailed outline, who are you going to be writing the book for? Who is your intended audience? In many ways, your audience can also dictate your writing style. Are you going to write from/to a more academic perspective, or do you want to reach as wide an audience as possible? Do you want to sound more clinical, or do you want to be a bit more free flowing?

From my own perspective, I think that over the books I've published, I've progressed a bit and so far, I've received the most and best comments regarding the writing style with respect to WFA 2/e...keep your eyes out for the Registry book, as I think that you'll notice an improvement along those lines.

One of the best ways to determine your writing style is to read. Seriously. What books have you read that you really enjoyed reading? Now, think about why you enjoyed them. Were the characters authentic; did they seem real? Was the writing and its tone enjoyable and easy to digest, or did the author bore you to death (hey, you should read my master's thesis!)?

Also, what other things do you enjoy? When I was originally working with Syngress to publish WFA 1/e and we were working up the title, I immediately agreed with the publisher that if the title had "Windows" and "forensics" in it, I would immediately pull it off the bookshelf to take a look at it. That made sense to me, because that's exactly what I did when I went to the bookstore. From then on, I started to look at those things that I liked about certain books, or didn't like, and tried to emulate them. Over time, I've tried to add more of those things...sidebars, use cases, etc...that people have told me that they want to see more of, in most cases because I wanted to see more of them, as well.

As to the audience, I would like to reach a wide range of folks within the community. I find a lot of books that aren't specifically about IR and digital forensics to be very valuable (such as the Malware Analyst's Cookbook), so my hope is that besides those directly involved with responding to and analyzing Windows systems, others find my books useful as well. Within the community, I'm hoping to provide something useful or valuable to a wide range of analysts, from those new to the field (or just interested in it) to those who have been in a while, from active responders to those taking university or community college courses, to LE, etc. It's probably something of a lofty goal, I know, but that's the direction I'm going.

Be thorough in your writing
Deciding on your overall audience will also have an impact on the level of technical ability that you assume. For example, when describing a technique that uses an open source tool (say, written in Perl), how far do you want to go in explaining the tool? Do you want to go into detail explaining how to install Perl (or Python, or Cygwin, or whatever), or are you going to assume that reader can figure that out themselves?

Simply writing "run the tool" (in magazines like 2600, you'll see 'run nmap' and little else...) doesn't often work very well. Providing a complete command line and relevant output from the command gives the reader something that they can hold on to. Many times, screen captures also help a great deal. However, the caveat about screen captures is that they have to be clear enough to see and understand...I've opened many books where I couldn't read the screen captures at all.

Also, when providing screen captures, be sure to specify what system or systems you're running the tool on, so that the output makes sense. I've met a LOT of folks who think that the major OSs that they have to address are Linux, MacOSX, and Windows. As someone who works with Windows systems, I'd strongly suggest to you that there's a major difference between Windows XP and Windows 7...but that's something to get into later. The fact of the matter is that in some cases, you're going to get different output based on a number of variables...version of the OS, version of Python or Perl installed, etc. Specify what you're working with when you write your book.

If something's important, it's okay to repeat it
Sometimes, this is the case. You don't have to repeat it verbatim, or even in the same chapter. If there's an important concept or technique that you've presented in your book, returning to it later in the book will help the reader see that it's important, and perhaps even understand the importance.

When I was an instructor in the military, we'd do that...a lot. We had things that we did to signify to the (usually dozing) students that something was important. One really easy way to do this is to think about it when you're developing your outline, and find places where you can tie concepts or techniques together, where you expound or expand on something...and if you need to, highlight them on the outline in different colors. Some of us who talk about IR mention things like "Locard's Exchange Principle" (I do, and I know that Chris Pogue does, as well...). When you talk about something like that, find places in the book, such as during exercises or case studies, where you can clearly demonstrate the concept. If it's important, use it.

Be consistent
One of the things I learned about report writing that is also true in books is that you need to be consistent in how you write and present things. For example, let's say you're writing a report based on your analysis of three acquired images. When you discuss the analysis of the first system in your report, you describe the AV scans you ran, the Registry analysis you conducted, etc. Then for the second system, you first discuss AV log analysis, then Event Log analysis, and you leave the AV scan results to the end and do not even mention your Registry analysis. The discussion of your analysis of the third system continues to be disjointed.

So, what do you think that the reader will walk away with after reaching the end of your report? Will she think that you did a thorough job in locating the malware? Not likely. It's more likely that she'll remember the inconsistencies, because the technical information you presented was lost in the "noise" of those inconsistencies.

The same can be said about writing books. Let's say you're writing a book on analyzing different image (JPG, TIFF, etc.) formats; providing a consistent structure for each image format will allow the reader to form comparisons and contrasts in their minds, whereas an inconsistent approach will leave the reader's mind similarly tangled and confused. Further down the road, the reader will have to search for important references in your book; however, if you've presented the material consistently, they're more likely to turn right to it, particularly if they're comparing information between two or more formats.

Getting Started
Regardless of the decisions you've made up until now, or how much discussion you've had with others about your ideas, nothing is going to actually happen until you start writing. That's right...write something. Anything. Because it's easier to change something that's already written, whereas if you're just sitting there staring at a blank page, you'll just keep staring...and that's hard.

When I was in the military and was writing fitness reports, I found it much easier to start early, consult my platoon commander's notes, and write something. I've found that 17 years later, the same things apply, and apply very well. As I develop my outline, I take notes in a notebook, on stickies, etc. Then I start pulling them all together, and start writing.

You're Not Gonna Be Perfect
What first comes out isn't pretty, and it may not be what I end up going with, but it's something to start with. When I was writing my first book, I sent chapter 1 in for review early on in the process...when it came back, I couldn't believe that I'd actually written what was there, and that it was only 8 pages! I really had to look at it, because in the process of working with one of the reviewers, I'd learned so much about writing from the other chapters that I had a hard time believing that I'd actually submitted what was in chapter 1! In the end, I completely redid the entire chapter.

As much as you want what you're writing to be just right, understand that it's not going to be perfect. Sometimes, that becomes the excuse we use for not writing, or not submitting what we've written. Just know that it's not going to be perfect...but if you've put your best effort into it, it will be good and something that you can be proud of. Also, remember that in a group of people, you're never going to please everyone...keep that in mind. You're going to need a tough skin, because whenever you put something out there for public consumption, you're going to get positive as well as negative're going to have critics.

All that matters is that you're doing this for the right reasons...and at the end of the day, how do you feel when you look up at your bookshelf, or at the shelf in the bookstore (or on someone's iPad...), and see your book there?

Tuesday, December 21, 2010

Writing Books, pt II

So far, the first post in this series seems to have done well. Next, we'll talk about something that's very important toward getting a book published...the publisher.

Finding/choosing a publisher
Once you've gotten through the recommendations in my first post in this series, it's time to start thinking about a publisher. Remember what I said...much as those of us in technical fields hate writing (some to the point of doing it badly so that they don't have to do it...), doing those things that I mentioned in the previous post is only going to help you in the long run. This is, in part, due to the fact that the publisher is going to make you do it anyway. Now, I'm sure that this is also going to be where most folks falter in their efforts...not realizing how much effort is required (I say "effort" instead of "work", due to the fact that a lot of what goes into this process is outside the norm and comfort zones of what most of us do...), some folks will start down the road and simply stop, however unintentionally, when asked to provide something else. Consider it something akin to special operations forces evaluation or assessment programs...the publishers really want to know who has the desire to stick with the process. They aren't throwing up ridiculous obstacles...they're just stating the needs of their process, and I'm sure it weeds out a lot of the folks who aren't serious and would just consume a great deal of their time and resources.

Some publishers, such as Syngress, actively advertise for authors. When I say "actively", I mean on their web site, as well as on social media sites such as LinkedIn, Facebook, and Twitter. We'll talk later about how you can use those sites, as well...but suffice to say for the moment that some publishing companies are taking full advantage of social media sites.

Now, I've only worked with two publishing companies, and for the most part, there were really no significant differences between the initial contracts (more about that later in the post) for either one of them. The only real difference was in the advance payment amount. Remember, you don't get into writing technical books to make money...and the advances come out of your initial royalties. After publishing my first book, I had another idea (to take the things I learned from writing that first book and write something a bit better), and had to provide my first publisher with their right of first refusal (this was stated in the contract) for the new concept. Once this was done, I moved on to Syngress, and through chance and circumstance, I opted to remain with Syngress in part due to some positive experiences with some of the staff, as well as the fact that by proving I wasn't a one-trick pony, I was able to get a slightly better royalty percentage.

When looking for a publisher, also consider what you're interested in. Who's going to market your book, and how are they going to do that? Will a marketing campaign consist of some mass emailings, or will there be more involved? What can you do to assist, or pick up the slack? Also, besides the actual book itself, in what other formats will the book be available? When the first edition of Windows Forensic Analysis came out, there was a PDF version of the book available. By the time I had written the second edition, Syngress had been purchased by Elsevier, and the second edition was available in several e-book formats...but not PDF. If this is something important to you, be sure that you ask the questions.

Remember, self-publishing is an option. However, before you go that route, be sure you thoroughly research what is involved. How much will you need to know about desktop publishing? How much effort do you need to put in and what will you get in return?

Another thing to consider if you self-publish is, how do the books get on the shelves? Hey, providing review copies to luminaries in your field is easy, thanks to FedEx. But if you handle the marketing of your book and generate hype, and then get some really amazing reviews from notable people in your profession, how do you then get the book to the people who want to buy it, or to those who don't yet know that they want to buy it?

I know someone who wrote and self-published a book, and provides all proceeds from the book to benefit a fallen comrade. This was the route he chose, and he's happy with it. This may be a route that you will want to go...or it may not. On another note, I was in one of the Family Christian Bookstores recently and found a pamphlet on one of the shelves for Westbow Press...I know someone who feels led to write about an experience in her life, and this may be an approach that she can use to get her book published and share what she's learned with others.

That being said, let's get on with it...

Working with your publisher
Working with the publisher is going to be a new experience for many, unless you're used to working as a contractor, of sorts. What I'm going to share with you now are some things I saw and learned with respect to working with two specific publishers, in writing technical books. As such, if you're going another route, such as writing children's books or non-technical books, YMMV.

The contract
The contract will usually specify the schedule. This may be an area where you will want to negotiate...take a look at your outline and anything else you've written, and correlate it with the schedule. Is it attainable? Is it something you can manage, or will you have to give up weekends and take a "vacation" to complete the book? This effort is going to be enough of a challenge without having to meet some arbitrary see if what's proposed makes sense, and don't get locked into something that's going to be really hard on you.

A word about royalties...again, you're not going to make money to the point of retiring if you're writing for a niche market, like forensics. That's just how it is. Therefore, that's usually not the reason that folks get into writing books, particularly technical books in this market. If your contract specifies an advance, remember that advance means "advance"...whatever you're given as an advance comes out of your royalties checks. So don't be disappointed if, during the first quarter or six months, your book seems to be doing really well, but your first royalties check is 0, or even negative.

The keys to any contract are:
1. Read it thoroughly and make sure you understand it. Discuss anything you don't understand or have a problem with.

2. Don't expect that the publisher has it all locked on and is doing everything right...publishers are people, too, and we all make mistakes, like sending the wrong contract or not including things that had been discussed.

3. Make sure you understand and agree with the schedule. If you don't agree, get on the phone with the publisher and work it out. Publishers have schedules for when they'd like to get the book out, so try to work with them on that.

4. If you have questions about anything, or don't understand something...ask. Before you sign.

When working with the publisher, you'll likely be provided with a template that specifies how the chapters should look. I've had templates that were really loose, while others were really stringent and specified a certain number of sidebars per chapter (among other things). It will help you a great deal in the long run if you get to know the template up front, and begin using it and abiding by it right away. Believe me, I know from experience that going back and restructuring an 80+ page chapter is NOT a pleasurable experience. Using and following the template from the beginning is going to be helpful to not just your overall writing experience, but also for everyone else involved.

However, that doesn't mean that as you're putting your detailed outline together, you can't start writing. In fact, I highly recommend that you do...start putting something down on paper. Throughout my time in the military...whether I was writing my master's thesis, or fitness reports, or whatever...I always found it much easier to write something and change it as I needed to, rather than sitting there staring at a blank page, waiting for the perfect turn of phrase to come to mind (which is a fancy way of saying "writer's block"). Even now, I have snippets written down for a project I have in mind...returning to those snippets will likely be the push I need to get that project off the ground and completed.

Reviewers and reviews
At some point, your chapters are going to have to be reviewed by someone; in many cases, several someones. There will be a reviewer who works for the publisher who will review your format (to ensure that it is in accordance with the template), grammar, spelling, etc. This reviewer will likely pick up things such as unfinished sentences, your horrific spelling, etc. It usually makes good sense to accept what they say (particularly if you're not too terribly good at spelling or grammar), and make most of the changes that they suggest. However, take a really close look at technical acronyms, and consider the reviewer's suggestions...acronyms specific to your field might be easily confused. I had an issue with "MAC"...depending upon the context, it could refer to the file "MAC" times, or to a "MAC" address.

There will also be someone who is a "technical reviewer", who is perhaps someone in your field and hopefully has some knowledge about your subject matter. Their job is to take a look at what you've written and see if it's correct and makes sense. Now, just because the technical reviewer makes a comment or suggestion, that doesn't mean that you have to make the change. I've had some really good reviewers, and some really bad ones. The bad ones were easy to recognize as they provided nothing of value to what had been written, and I would usually suggest to the publisher that they replace the reviewer. In fact, if you can, and you know someone that you can trust to be honest with you, provide their name to the publisher, rather than letting the publisher pick someone out for you. I've also had really good reviewers who've run every tool and every command I mention, noted differences in platforms, etc. However, if you get technical reviews back that say nothing more than "needs work", or "hey, this is a cool idea...", it's probably best to find someone else.

Once your book is published, getting reviews from folks in the field is a great idea. When WFA 1/e was published, 101 copies were sent to folks deemed by the publisher to be "in the industry" (most of whom I did not recognize) with the expectation that many (or some) of them would write reviews. None did. When WFA 2/e was published, I provided a list of about 25 folks (that I had already contacted) to the publisher, and they were provided copies of the book. Almost all of them have written reviews.

Marketing your book yourself is a great idea, and there's nothing to stop you from doing some cases, you may be more familiar with the industry, and with how to get "the word" out, than the publisher's marketing staff. In most cases, they have a set program that they follow, and you can follow a sort of "guerrilla marketing" plan. Using social networking sites and any lists you're on and forums you frequent, talk about your book and what's in it. Reference the publisher's page and the Amazon page for your book, and post the cover art graphic where it makes sense to do so.

Like I said, reviews are good, as they help with marketing your book. If you can get someone well-known and well-respected within the community to write a review of your book, and it's positive (yeah, no kidding, right?), then that's going to be the little push that gets someone who's heard about your book and is sitting on the fence to purchase it. If the reviews are posted to Amazon, that's a great site to reference. Blogging and posting links to reviews on other sites is also good. Several years ago, Richard Bejtlich noted and documented a favorable shift in the "Amazon Bestsellers Rank" for his book following a review being posted on Slashdot.

Think about something that you've been interested in...a car, an e-reader, or anything else. If you have a choice, how does the recommendation of a trusted friend weigh in your decision? This is how reviews work. However, much like the technical review I mentioned above, a review that says, "hey, this is a good book" and little else really isn't much of a help. One way to get some of these really useful reviews is to establish relationships through professional networking before writing your book, or early on in the process of writing your book. Lots of folks out there are known to "trade" books and reviews with other authors...and really, there's nothing wrong with this. In fact, it's a great idea. I recently reviewed the Malware Analyst's Cookbook, which was not only a fount of information, but excellent for cross-pollination between what I do and what the authors do.

Friday, December 17, 2010

Writing Books, pt I

I mentioned in an earlier post that I felt as if I wanted to share my experiences after having written five books, and been a minor co-author on a sixth.

First, I wanted to share a bit about my history of how I got involved in writing books, just to give you some perspective. I didn't start writing books; instead, I started reviewing books and book proposals. This seemed like a great way to see what was up and coming in the industry, as well as to make a little mad money on the side (or, if the remuneration was books, to stock my library). At one point, I was working on a review and completely disagreed with something the authors stated with respect to NTFS alternate data streams; not only did I disagree with it, but I offered up some text of what I thought the section should say. In the end, the authors agreed and accepted my contribution.

When the book was published, the publisher asked the authors if they knew of anyone who might be interested in writing a book. My name was offered up along with several others, and I accepted...and began the process of discovery that has led me to this point.

Further, I do have a long history of writing, albeit not books. Like many of us, I have a public school education. 'Nuff said, right? I wrote in college. I wrote during my first career as a military officer. I wrote a thesis for my graduate degree. I have written a lot of reports since I got out of the military and began pursuing a career in the information security field. I can't say that I've enjoyed all of the writing, but it does provide both breadth and depth to my writing experience. In every case, there's some bad stuff that I didn't like and left behind, and in most cases, there are some gems that have really benefited me that I can trace back to some of that early experience.

What I'd like to do now is provide something of a process but really more of my experiences and what I've learned in the process of writing books. My hope is that someone will take a look at this and perhaps decide to pick up their pen, as it were, and put it to paper. I know that there are some really smart folks out there with some really good ideas that need to be shared, and writing a book is a great way to do that.

Why write a book?
The first thing you need to think about is, why do you want to write a book? Some folks I have met want to write a book so that home users know what to do to secure their systems, or the members of their church know more about the Internet in general. Personally, I wanted to take all that stuff I had laying around in different locations and put it in one place...yes, that's right, I use my own book as a reference. ;-) After my first book, I wanted to prove to myself that I wasn't a one-trick pony and that this was something I could do again.

If you're in a technical field, do not expect to get rich writing a book, or even several that's not a reason to write a book. It just isn't going to happen. The technical field does not have the market that other genres tend to enjoy; in fact, the more technical and narrowly focused your book, the more of a niche market you'll be focused on and the less likely you will be to really "score".

There can be a number of reasons to write a book...those that I've mentioned above are just some of them. However, if you don't have a good reason or a goal that you'd like to reach, what's going to keep you going through the process?

Choose a topic
So, what do you want to write about? Clearly, if you're reading this blog, then you're interested in technical topics, including incident response and/or digital forensics. Okay, that's a good place to start. Can you narrow it down...or should you? Sometimes, taking a broad brush approach is a great way to get started on a topic. I've found that focusing very narrowly has proven to be the way to go for what I'm writing about...the narrow focus being forensic analysis of Windows systems.

However, I don't think that this process is really specific to just technical books. I think a lot of my experiences in writing books can be applied to other areas/genres, as well...such as writing children's books.

Research the topic
Look online and at bookstores to see if there are any current books that cover your chosen topic. If there are, you would have to determine how you would differentiate your book from the others. Will your book be more up-to-date, or will you provide a different perspective somehow?

If there aren't any other books or resources that cover what you're interested in writing may be home free.

Another aspect of this research that will benefit you greatly is to take a look at other resources, and not just those related to your chosen topic. Look at books in your chosen genre that you've enjoyed reading, and try to determine what it is you really enjoyed about them. One early example that I found that I really enjoyed is sidebars...little snippets that usually appear on the page in a grey box and tend to pertain to what you're currently reading, even if only tangentially. I recently reviewed the Malware Analyst's Cookbook, and I really like the idea of "recipes".

Create a detailed outline
Before you begin writing your book, outline it. Yes, just like you learned in grade school...and you thought you'd never use that again, like math. Begin with a simple outline, and begin adding to it. This allows you to begin to organize your thoughts with respect to writing your book, and begin preparing for the questions you're going to eventually encounter, such as how many chapters your book will have, what topics it will cover, etc. When I've put together outlines, I've found that just the act of writing the outline has led me to the idea that my chapters need to be restructured to make better sense, and that I needed to add additional chapters, as well.

A detailed outline helps me to organize my thoughts, and see where I need to include images, charts, figures, tables, etc. Would an image here make really good sense, and would a table here that encapsulates and summarizes the last three paragraphs really be useful to the reader?

Adding detail to your outline will only benefit you in the long run. Having a detailed outline keeps you on-point and on-track, provides some structure to what you're doing while you're doing it, and can be very beneficial, particularly if you've taken additional notes and written down some choice phrases that you'd like to use.

Just so you know, both publishers I've worked with thus far have asked for this research and work (i.e., outline) to be done and submitted as part of a proposal before going forward. So, you're going to *have* to do the work, and as much as I didn't want to do it the first time, I found out that going through the process...I mean, really going through it, not just doing it to get it done...made a difference.

Seek review
Most of us in the technical field do not like to have our written matter (when we do it) reviewed. I know that. You know it, too. But know that as you begin going down this road, you're going to get tunnel vision and you won't see the forest for the trees. To use another cliche and score a hat trick, you'll be too far down in the weeds and you'll need a reality check. Find someone you trust to give you honest, thoughtful criticism. When I was writing my very first book, I got a lot of that kind of criticism from Jennifer Kolde, and I've used what I learned through our exchanges to try and improve my writing with each book. As much as you won't want to, it will only benefit you to find someone (or someones, plural) to take a look at your ideas and give you a reality check.

Know that it will suck
That's right...simply accept the fact that this will be hard. Why? If you choose to write a book, you're going to have to write, which means you'll be adding a new activity that is going to replace an old activity in your life. You may have to give something least, in part. My lovely wife, God bless her, says that when I was writing my first book, she had no idea that I was writing a book. That was because I rarely gave up our dates during that process...instead, taking her to dinner or a comedy show served as a much needed break.

Also, the truth is, most people don't like to write, especially technical people. Case notes, reports, you name it...most technical folks simply do not enjoy the act of writing.

However, with a little (okay, a lot of) prior planning, you can make the process of writing a book much easier.

Thursday, December 16, 2010


At the BasisTech Open Source conference (June, 2010), a LEO told me, "we do CP and fraud cases, you do intrusions and malware."

My response at the time was, well, no, we (meaning analysts/examiners who are not LE) solve problems...and the people who call us have intrusion and malware issues. My point was that much of what we do and the skills we bring to the table are (or could be) very useful to LE. In fact, many of us who are not LE have done work on CP cases, some even resulting in plea agreements.

Looking back over some of the work that I and others have done, it occurs to me that there's an ever greater convergence between LE and analysts in the private sector. Say LE has a CP or fraud case...if the claim of "the Trojan did it" is made, then the case becomes a malware case, even to the point that the claim has to be disproved. If the claim is made that an unauthorized user accessed the system and placed the image files on the system, then the case then becomes an intrusion case.

So I guess my point is that there's a convergence in what each of us does, and we're not quite so separate and in our own silos the way some think, and we don't so much have disparate skill sets. I'm just sayin'...

Monday, December 13, 2010

Book Review: Malware Analyst's Cookbook

Michael Hale Ligh was kind enough to provide me with a review copy of a book he recently co-authored along with Steven Adair, Blake Hartstein, and Matthew Richard, titled Malware Analyst's Cookbook and DVD: Tools and Techniques for Fighting Malicious Code. The book was an excellent read, and is an extremely valuable resource for any analyst in this industry.

First off, I am not a malware reverse engineer. Yes, I do have some experience working with malware, but not to the level at which reverse engineers (RE) such as the authors tend to operate. I work with some really smart folks who do malware reverse engineering all the time, and they're very, very good at it. As such, during IR or digital forensics analysis, I tend to exchange information with the RE folks, providing what I've found and then taking what they find, and performing iterative analysis. I've found over time that this approach provides a much more in-depth analysis than the sum of its parts.

That being said, weighing in at a hefty 18 chapters, the Cookbook covers a wide range of topics specific to reverse engineering and analyzing malware, from anonymizing your research activities to honeypots to malware classification and automated analysis and beyond. Throughout the book, the authors present "recipes" for using various tools (many of which are open source) to solve specific problems. For example, chapter 3 includes several recipes involving YARA, an open-source, Python-based tool for identifying and classifying malware. Many other popular tools are also used, including ssdeep, Didier's PDF tools, and even RegRipper. Chapter 15 discusses effectively using Volatility for memory analysis. Many of the examples provided in the book are based on the real-world experiences of the authors, lending considerable credence and value to the demonstrated skills and information imparted.
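YARA itself uses its own rule language, so see the book (or the YARA documentation) for real recipes. Purely to illustrate the underlying idea of string-signature classification...matching known byte strings against a sample and tagging it with a family...here's a minimal, stdlib-only Python sketch. The family names and signature strings below are made up for this example, not real indicators:

```python
import hashlib

# Hypothetical signature set: maps a family label to byte strings that,
# if found in a sample, suggest membership in that family. These are
# illustrative only, not real malware indicators.
SIGNATURES = {
    "downloader": [b"URLDownloadToFile", b"WinExec"],
    "keylogger":  [b"GetAsyncKeyState", b"SetWindowsHookEx"],
}

def classify(data: bytes):
    """Return (sha256_hexdigest, [matching family labels]) for a
    sample's raw bytes."""
    digest = hashlib.sha256(data).hexdigest()
    hits = [family for family, strings in SIGNATURES.items()
            if any(s in data for s in strings)]
    return digest, hits

# A fake "sample" containing one family's strings.
sample = b"MZ...GetAsyncKeyState...SetWindowsHookEx..."
print(classify(sample))
```

Tools like YARA do this far more capably (wildcards, regular expressions, boolean conditions over strings), but the sketch shows why unique strings pulled out during RE work are so valuable to hand back to responders: they become searchable, shareable indicators.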

Chapter 10 is near and dear to my heart, not only due to the discussion of ADSs, but also due to the fact that the authors wrote their own RegRipper plugins! That's right! These guys found some things of value with RegRipper and wrote their own plugins. It brings a tear to my eye to think that my little tool is all grow'd up!

Some of the truly powerful aspects of the book include clear, thorough explanations of the presented topics, as well as easy-to-follow examples that allow the reader to follow along and learn by doing (I tend to learn more by doing than reading). Whether you're an aspiring reverse engineer, incident responder, or forensic analyst, this book will be an extremely valuable resource to you. For example, some of the explanations of how systems get infected with malware (JavaScripts, infectable document formats, HTML injection, etc.), as well as artifacts to indicate a malware infection, will prove extremely valuable to IR/DF folks. Heck, even if you're a somewhat-seasoned malware reverse engineer, it's likely that this Cookbook will show you some things that you haven't seen before, or show you some ways of looking at malware that you haven't thought of before.

Much of what's in the Cookbook goes beyond commercially available applications and clearly demonstrates the use of Python- (or Perl-) based open source tools that accomplish specific objectives. The cookbook even goes so far as to explain and demonstrate how different malware-related activities are performed, as well as how they can be detected.

I have to say that reading through the Cookbook gave me a new appreciation for what malware reverse engineers do. I also walk away from the book with a better understanding, not only of how to look for malware during IR/DF activities, but also of how to better provide information and data to our reverse engineer once I've found it. I also walk away from it knowing that I'll be back. With more study and practice, I'm sure I can do some modicum of malware analysis beyond what I already do, and while I know that I'll never be at the level of the authors, I thank them for a truly exemplary and valuable resource. If I didn't already have it, this Cookbook would be on my Christmas the very top!

Links, Thoughts...

What's good and what's been going on lately in the world of Windows IR/DF? I think maybe things are slowing down around the holidays, as it seems that many of us who are blogging are simply sharing the same links that everyone else is blogging...kind of circular, and when you poke your head up, you realize that you've been reading the same half dozen stories for the past week.

Maybe there really isn't a whole lot going on at the moment...maybe adding something a little bit different, perhaps a different angle, will change things up just a bit.

Many of you know that I've had a couple of books published, and number 5...Windows Registry Forensics...will be out shortly. WFA 2/e has been pretty successful so far, and a number of folks have sort of off-handedly asked me about a third edition. To be honest, this is something that's up to the publisher...WFA 2/e may be successful enough for a third edition, or the publisher may decide that the title and subject have run their course....who knows?

However, the question itself got me thinking along a different road...that is, what does it take to write a book? In exploring material for a third edition, I took an opportunity to ask some folks what they thought should be included in that edition, only to get some very interesting responses. Those responses then got me to thinking, "here's someone who does a lot of work...why doesn't he write a book?" I then thought, what is it that's holding folks back? There are a number of folks out there who likely have the necessary background to write a book...I mean, based on some of the books that have been written over the years, a solid background in ediscovery or digital forensic analysis should be enough to write a solid book, right?

If you're looking for a particular topic, my publisher has a page up called "Write for Syngress" for those who may be thinking about writing a book, or just tossing the idea around.

I think that a lot of folks look for someone to write a book for them, without ever thinking that maybe they could write that book themselves. Time is always a factor, but based on my experience, there are a number of decisions that need to be made, and sometimes thinking through those things can actually get you over that hump.

As such, what I'm going to be doing over time is writing a series of blog posts about writing books. I'm going to offer up my thoughts and experiences, and hope that someone out there decides to put pen to paper (figuratively, or literally) and step out and just do it. Yes, writing a book is time consuming and it can be arduous, but it can also be very rewarding.

Prefetch Files
Mark Wade recently had an article published in the DFI News regarding Prefetch file analysis. The article is a good resource for those not familiar with Prefetch files and how they might be used. I do, however, have a question about one statement made in the article:

For instance if the Standard Information Attribute (SIA) and File Name Attribute (FNA) timestamps are modified in the Master File Table (MFT) to impede analysis...

In light of what's available in Brian Carrier's File System Forensic Analysis book, I'm a bit unclear as to how FNA timestamps would be purposely modified. Does anyone have any input on that? Is anyone seeing FNA timestamps being purposely and maliciously modified in the same manner as SIA timestamps?

That question aside, I think that Mark makes a number of excellent points and observations in his article, particularly that data embedded within the Prefetch files can be used to reveal a number of interesting artifacts, such as errant removable storage devices or paths, "abnormal" or deleted user accounts, etc.

Imaging Systems
Lance recently posted a "hard drive imaging process tree for basic training"; keeping in mind that it's for basic training, it's an excellent resource to look to and use.

There are a couple of things that you should consider adding if you're planning to use this decision tree for your own training purposes. For example, the first comment talks about adding chain of custody documentation...this is something that should definitely be done. Several other comments refer to collecting volatile memory...while Lance's response is that this is covered in other modules, I tend to agree with Rob and Andrew...this is something that needs to be included at the basic level; we can no longer leave this to the advanced level.

The thing I am glad to see at the end of the entire process is "Verify Image". I can't say that it's happened too many times, but I do know of analysts who have gone on-site, acquired images, documented everything, and not verified their images. When they got back to the lab, they found that for some reason, they did not have valid images. This must be part of the process.

Great job, Lance, on producing an excellent resource! Thanks!

Oh, hey, almost forgot...thanks for updating the post to include volatile data collection!

Memory Analysis
Gl33da's got a new blog post up, this one on identifying memory images. If you do memory analysis, but don't actually perform the memory acquisition, this is an excellent post that's chock full of great information, in that she not only describes a method for determining the Windows version of a memory image, but also notes that it's already been incorporated into Volatility 1.4. Shoutz to gl33da for sharing!

If you are performing memory acquisition, are you using windd? Or are you using the MoonSols Windows Memory Toolkit Community Edition?

Addendum: Russ McRee pointed me to an article he wrote for InfoSecInstitute, titled "Security Incident Response Testing To Meet Audit Requirements". The article focuses primarily on the PCI DSS v2.0, specifically para. 12.9, which mandates the requirement for an IR capability, in addition to providing specifications. Overall, I think that the article is very good, and includes a good deal of the things I would do or recommend when writing or evaluating an IR plan.

The only real issue I have with Russ's article...and this comes from being QSA certified and on a QIRA team for far too that the majority of organizations that I investigated had enough trouble just getting something down on paper, let alone a coherent IR plan. The likelihood of a comprehensive plan that was tested, and of an IR team that was trained and drilled on a regular basis, was slim to none. Now, this isn't all organizations, obviously...this is simply based on my experience and the responses I performed.

I will say that Russ does have some very interesting information in the article that will not only benefit responders, but forensic analysts as well, along with those who provide training to responders and analysts. Thanks, Russ, for pointing me to the article.

Sunday, December 05, 2010

WFA 3/e

This fall, while I was at a couple/three conferences, I was asked if/when the third edition of WFA would be coming out. In response, I'd point out that something like this isn't really a decision for me, so much as it is a decision for the publisher.

However, that got me to's a question for you: as a forensic analyst, what would you like to see in a third edition of Windows Forensic Analysis?

Friday, December 03, 2010


Corey asked a question recently in the Win4n6 Yahoo group that piqued my interest...because it had to do with a Registry key.

In short, Corey had found that entries were created under the following key:


First, ESENT apparently refers to the built-in JET database engine that has shipped with Windows since Windows 2000. According to various pages at the MS site, this DB engine is very limited, not allowing remote access and only providing for simple queries...but it is used in instances where other storage formats (flat file, Registry, etc.) are simply not suitable. According to the MS Windows SDK blog:

"The ESENT database engine can be used whenever an application wants
high-performance, low-overhead storage of structured or semi-structured data."

I then began looking into the key itself to get an idea of what was happening. I found, on my system, a number of subkeys beneath this key...ipconfig, svchost, wmplayer, etc. As I found each executable image, I opened the .exe file in PEView and saw that each .exe had an IMAGE_DEBUG_DIRECTORY, as well as an IMAGE_DEBUG_TYPE_CODEVIEW entry listed. Within the IMAGE_DEBUG_TYPE_CODEVIEW entry, I could see a reference to a .pdb, or program database file, which is where symbols and program debugging information are stored. While this is admittedly a somewhat limited and narrow view, it did occur to me that this was something to check, if you have a full image that includes the Registry hives in addition to the executable files.
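The CodeView record that the debug directory points to has a simple, fixed layout: an "RSDS" signature, a 16-byte GUID, a 4-byte "age", and then the NUL-terminated .pdb path. As a minimal sketch of what's actually being read when a tool like PEView displays that entry, here's a parser for just that record; the sample bytes (GUID, age, and path) are made up for illustration, and in practice you'd carve the raw record out of the executable with a PE parser:

```python
import struct
import uuid

def parse_rsds(data: bytes):
    """Parse a CodeView RSDS record: 4-byte signature, 16-byte GUID,
    4-byte age, then a NUL-terminated PDB path."""
    if data[:4] != b"RSDS":
        return None
    guid = uuid.UUID(bytes_le=data[4:20])
    (age,) = struct.unpack_from("<I", data, 20)
    pdb_path = data[24:data.index(b"\x00", 24)].decode("ascii", "replace")
    return {"guid": str(guid), "age": age, "pdb": pdb_path}

# crafted sample record; a real one would be carved from the
# IMAGE_DEBUG_DIRECTORY of the executable image file
sample = (b"RSDS" + uuid.UUID(int=0x1234).bytes_le +
          struct.pack("<I", 2) + b"ipconfig.pdb\x00")
print(parse_rsds(sample)["pdb"])   # ipconfig.pdb
```

The GUID and age together form the signature a debugger uses to match the binary to its symbol file, so they can be useful for correlating samples, too.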

Of course, both Corey and I employed ProcMon to see what was happening when an executable listed in the ESENT\Process key was run. For the monitoring I did, I opened a command prompt, typed "ipconfig /all", started my capture, and then hit "Enter" for my command in the command prompt window. As soon as the command completed, I halted monitoring in ProcMon. I then began going through the output, and found that the ipconfig.exe process itself did, in fact, query the key in question.

Corey found in his capture that, for wmplayer.exe, "RegSetValue" was used on the "DEBUG\Trace Level" value; I have run the ipconfig command several times, and while I did see the accesses to the "ipconfig\DEBUG\Trace Level" value (ie, RegQueryValue), I do not see any values being set (via RegSetValue), nor modified.

Another interesting artifact that I saw in the ProcMon capture was that while esent.dll is not listed in the import table of the executable image file, the DLL is loaded by the ipconfig.exe process.

Okay, so this is interesting stuff...but only in a limited manner. If you Google for "ESENT\Process", you'll see a number of hits at AV sites that refer to malware. As I read through some of these (here, here), I was reminded of some of the artifacts surrounding the MUICache key, and how sometimes what is referred to as "analysis" could simply be secondary or tertiary artifacts of how the executable was run, or simply from the OS running and doing its thing.

However, there were other links, such as this one (refers to Virut), that seemed to indicate that a closer look at the executable file itself is likely warranted. Other entries at McAfee, such as this one, further indicate that there may be malware samples for which the ESENT\Process key entries are specifically created, and that this may definitely be worth a look. In many ways, entries beneath this key may...and I say may, as more research and assistance from Microsoft is definitely needed in this similar to those beneath the MUICache key, in that they are indirect artifacts created not by the executable itself, but instead by the environment or eco-system (that's just a fancy way of saying "operating system") as the malware interacts with it.

I have a couple of test images that I can look at with respect to this key, and of course, I created a plugin (took about 5 minutes) for RegRipper to make things like testing and analysis easier.

As a side note (and shout out), I'm reading through Malware Analyst's Cookbook at the moment, and there are a LOT of really good Python and Perl recipes listed in the book that would be abundantly useful for this kind of analysis. While my own book provides a short view into what a normal PE file should look like, the Cookbook provides not only a view into what malware looks like, but it also provides tools for examining malware files up close without actually having to run them.

So, going forward...any thoughts, comments, or assistance in understanding a bit more about the ESENT\Process key would be greatly appreciated!

Addendum: For anyone who uses or has used ProcMon, or more specifically, RegMon, when running applications, you'll usually see a process accessing the "Image File Execution Options" key; when associated DLLs are loaded, you'll see similar accesses, as well, for this key. When this key is accessed, it's the operating system looking to see if there is a Debugger value associated with the specific executable image file. It is this association with debugging that led me to think...tenuous, I know...that the files with .pdb associations in their PE headers were accessing the ESENT key. My reasoning was, if the OS is going to look for some sort of debugger association once, why not twice?

Wednesday, December 01, 2010

A Bit More About Timelines...

Just a few more thoughts about timelines and data reduction...

One of the comments to my previous post brought up the fact that sometimes, you DO want everything you can get into a timeline. The example given was a murder such cases, you may want to "see" what an individual's online or "cyber" life looked like for the last few days or weeks. In such instances, I completely agree that a comprehensive timeline could be very valuable...but in such instances, a comprehensive timeline would be an actual goal of your exam, wouldn't it? Everything goes back to the goals of the examination.

There are other things you can do with respect to data reduction and timelines. One of the tools I use to transition the events file (file containing the listing of all events) into a timeline includes an option to restrict the time window that I want to look at:

C:\tools> -f D:\cases\events.txt -r 11/20/2010-11/22/2010 > D:\cases\tln.txt

This command line tells the tool to create a timeline using all events from 00:00:00 Z on 11/20 to 23:59:59 Z on 11/22.
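Under the hood, that time-window option amounts to a bounds check on each event's timestamp. Here's a rough sketch of the idea, assuming the five-field, pipe-delimited event format (time|source|system|user|description) with the time stored as a Unix epoch in UTC; the sample events, hostname, and paths below are made up for illustration, not actual case data:

```python
from datetime import datetime, timezone

def in_range(line: str, start: str, end: str) -> bool:
    """Keep a five-field event line (epoch|source|system|user|description)
    if its timestamp falls within [start 00:00:00, end 23:59:59] UTC."""
    epoch = int(line.split("|", 1)[0])
    fmt = "%m/%d/%Y"
    lo = datetime.strptime(start, fmt).replace(tzinfo=timezone.utc).timestamp()
    hi = datetime.strptime(end, fmt).replace(tzinfo=timezone.utc).timestamp() + 86399
    return lo <= epoch <= hi

# hypothetical events: the first falls on 11/20/2010, the second on 11/24/2010
events = [
    "1290211200|FILE|HOST1|jdoe|MACB C:/Windows/Prefetch/IPCONFIG.EXE-1234.pf",
    "1290556800|REG|HOST1|jdoe|M HKLM/Software/Microsoft/ESENT/Process",
]
kept = [e for e in events if in_range(e, "11/20/2010", "11/22/2010")]
# kept now holds only the 11/20 event
```

Because the comparison happens on the raw events file, the reduction is documented simply by recording the range you passed in.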

Another tool you can use against the events file is grep. The "-v" option is for inverting regex matches, and allows you to remove all lines that meet specific criteria...such as application updates, Restore Point creation/deletion, etc. This can be very useful and powerful in removing noise from your timeline.

Rob Lee also pointed out that dumping all of the Registry key LastWrite times into a timeline has been valuable, and in at least one instance, revealed a very interesting artifact. With the tools and methodology I use, I've created mini-timelines of the various verbose data sources...Registry hives generally have a LOT of keys, and I may not want to include everything, so I can instead go find the interesting tidbits and add only those items, documenting my decision to include or exclude certain data. I've also done this with web server logs during SQL injection exams, and even used this approach to answer very specific questions that did not require that entire timelines be created; a micro-timeline of just the event ID 528, type 10 events from the Security Event Log provided the answer I needed.

One final point on data reduction when creating timelines: run your data reduction from an educated, knowledgeable standpoint. If you can't justify the data reduction steps in your own documentation, don't do it. Not only is repeatability something we're striving for, but being able to come back to the exam 6 months or more down the road and understand what you did...I think everyone sees the value in that.

Sunday, November 28, 2010


Corey Harrell recently posted to his blog about using Calc (the spreadsheet program associated with OpenOffice) to perform timeline analysis. Corey's post was very revealing and thorough, and clearly demonstrates to the reader that here's a guy who actually does timeline analysis. I mean, Corey looks at regular expressions for searching, etc., and really does a good job of covering a lot of the different analysis aspects that you'd do in Excel. Great job, Corey.

However, there are two things that came up in the post that might be good points for discussion. First, Corey says that his exploration originated from an anonymous question asking if Calc could be used in place of Excel. I applaud Corey's efforts, but this question demonstrates how some analysts will continue asking these kinds of questions, rather than doing their own research and posting the results. This is one of those things that could have been easily stated as "Calc can also be used...", rather than as an anonymous question.

The other issue is the concern that Corey expressed with respect to the spreadsheet program's ability to handle massive numbers (100K or more) of rows. This is definitely a concern (particularly with versions of Excel that were pre-Office 2007), but it also demonstrates how timelines, which are meant to some degree to be a data reduction technique, can actually become very cumbersome and even slow the analyst down, simply by the sheer volume of data. Yes, extracting data and generating a timeline is less data than a full image (i.e., a directory listing of a 160GB hard drive is much less than 160GB...), but it appears that so much data is capable of being added to a timeline that doing so could easily overwhelm an analyst.

Timelines and Data Reduction
One solution to this that I highly recommend is an educated, knowledgeable approach to timeline development and analysis, which is something Chris points to in his Sniper Forensics presentation. Rather than throwing everything (even the kitchen sink, if it has a time stamp) into your timeline and sorting it out from there, why not instead start with the goals of your examination, what you're trying to show, and go on from there? Your goals will show you what you need to look at, and what you need to show.

After all, one of the benefits and goals of timelines is...and should reduction. While a timeline compiled from an abundance of data sources is indeed a reduction of data volume, is it really a reduction of data to be analyzed? In many cases, it may not be.

Consider this...let's say that while driving your car, you hear an odd noise, maybe a squeal of some kind, coming from the front left tire every time you brake. What do you do? Do you disassemble the entire car in hopes of finding something bad, or something that might be responsible for the noise? Or do you do some testing to ensure that the noise is really coming from where you think it is, and under what specific conditions?

No, I'm not saying that we throw everything out and start over. Rather, what I'm suggesting is that rather than throwing everything into a timeline, assess the relative value of the data prior to adding it. One excellent example of this is a SQL injection analysis I did...I started with the file system metadata, and added just the web server logs that contained the pertinent SQL injection statements. There was no need to add Event Log data, nor all of the LastWrite times from all of the Registry keys from the hive files on the system. There was simply no value in doing this; in fact, doing so would have complicated the analysis, simply through sheer volume. Is this an extreme example? No, I don't think that it is.

By reducing the data that we need to analyze, even if doing so iteratively through the analysis (determining what we can remove from the timeline as "noise", etc.), we get to a point where relational analysis...that is, analyzing different systems in relation to each other...can give us a better view of what may have occurred (and when) in multi-system incidents/breaches. Remember, in the five-field timeline format that I've recommended using, there is a field for the system name, as well as one for the user name. Using these fields, you can observe the use of domain credentials across multiple systems, for example.

I know that many folks are going to say, "What if you don't know what you're looking for?", and the answer to that is, "If you don't know what you're looking for, why are you doing analysis?" Seriously. If you don't know what the goals of your analysis are, why are you doing it?

Sometimes folks will say, "my goals are to find all bad stuff". Well...what constitutes "bad"? What if you find nmap and Metasploit installed on a system? Is it bad? What if the user is a network engineer tasked with vulnerability scanning and assessment? Then, is this find "bad"?

From the questions I receive, a lot of times I think that there is difficulty in defining goals. Does anyone really want to "find all malware"? Really? So you want to know about every BHO and bit of spyware? Given some home user systems, imagine how long it would take to locate, document, and categorize all malware on a system. Usually what I have found is that "find all malware" really means "find the malware that may have been associated with a particular incident or event". Then, getting the person you're working with to describe the event can help you narrow down the goals of your analysis. After all, they're talking to you for a reason, right? If they could do the analysis themselves, there would be no reason to talk to you. Developing a concise set of goals allows you to define and set expectations, as well as deliver something tangible and useful.

Timeline as a Tool
Timeline analysis is an extremely useful and valuable tool...but like any other tool, it's just a tool. The actual analysis is up to the analyst. There may be times when it simply doesn't make sense to create a timeline...if that's the case, then don't bother. However, if it does make sense to develop a timeline, then do so intelligently.

Volatility Updates
Gleeda recently tweeted about documentation for Volatility being available. If you do any memory forensics, this is an excellent resource that walks you through getting all set up and running with the full capabilities of Volatility.

Monday, November 22, 2010

More Updates

New Blog
Ken Pryor has started a new blog, and has his first post up as of Sun, 21 Nov. Check it out, and add it to your RSS feed. I'm sure Ken's going to have some gems.

I met Ken face-to-face at the WACCI conference a bit ago. He's a great guy, very knowledgeable, and very enthusiastic. Speaking of which, Ken was at the WACCI conference along with Brad Garnett, who's also posted to his blog recently. If you like some caffeine-induced forensic ramblings, stop on by and take a look.

Russ reached out to me recently to let me know about Confessor, a tool that he'd covered in a recent toolsmith column. Russ had previously mentioned MIR-ROR, and says that Confessor uses similar tools but deploys them in an "enterprise-capable manner". Also from Russ's description of Confessor and another tool (mentioned below):

"These tools were born of needing better utilities for incident response and security analysis in complex, massive cloud-like environments."

Russ also mentioned MOLE in his toolsmith article; "MOLE" stands for "malicious online link engine", which allows the analyst to validate URLs to see if malware was present. I can see how a tool like this would be very useful for analysts during malware investigations and incident response.


I received a question the other day that I thought was interesting, because I'd seen it before. Back when I had submitted my proposal for the Windows Registry Forensics book, all of the proposal reviewers had stated that this book would need to compare and contrast RegRipper to the commercially available Registry "analysis" tools.

As it turned out, I wasn't able to do this for the book...for the simple reason that I didn't have access to those commercial tools. I don't use EnCase at work, nor do I use FTK. I did try to get a temporary license for one of the commercial tools, and was told "no". In the spirit of full disclosure, I did have an opportunity to meet Brian Karney of AccessData, and he did offer to discuss providing a temporary license for the AccessData product, but by then I was so close to the deadline for the book that there simply wasn't time to go back and work this into the book. I did reference Technology Pathways ProDiscover in the book, and that's because I had access to that commercial tool.

Also, I used quotes around the word "analysis" earlier, because most commercial tools are simply's up to the analyst to perform the analysis. To some extent, RegRipper is also a viewer, of sorts, although it doesn't so much leave the "what's important" up to the analyst, but instead allows the analyst to extract and analyze what is likely the more important and valuable data.

The question I received was right along the same lines. I guess, on the surface, a question such as "how is RegRipper better than or different from the commercial tools?" is one that comes from folks who, for the most part, haven't really used RegRipper much, if at all, and haven't really used the commercial tools to any great extent, either. I would also think that the question comes from not really having conducted a great deal of Registry analysis. I wouldn't say that RegRipper is any better than any other tool...because it's just a tool, and is therefore only as useful or as good as the analyst using it. Like any tool that's used improperly, RegRipper would be seen as useless. Or, a knowledgeable analyst can use the tool effectively and even find new ways to use it that had not been thought of before, particularly by the designer.

One of the benefits and useful features of RegRipper is that it's open source, and the tool can be modified to suit your needs. Chris Perkins has modified RegRipper, and so did Adam James. Okay, so most folks are likely to say to this, "...but I don't program", and may even qualify that with " Perl." That's okay, because you can always ask someone to assist in meeting your needs. One of the reasons many folks provide tools for free is to get feedback from others who are either doing the same or similar work, or those who may be new to the field and have a fresh view or perspective. So when I'm at a conference, and talk to someone who says, "...but I can't program...", I will generally ask them if they have email...because if they do, they can ask someone for assistance.

Another benefit of RegRipper as an open source tool is that if you need something done with a plugin...a new plugin written, or something a bit different done with the output of a current plugin's a simple matter to change things. Early on, shortly after releasing RegRipper, I received a request or two for XML response, I asked for recommendations on a style sheet...and never heard back. I've received requests for .csv output...but it's a simple matter for someone to open the plugins of their choice in Notepad, comment out (add "#" to the beginning of the line) the appropriate "::rptMsg()" statements, and add their own. Or copy a plugin to a different name and make the appropriate changes.
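Another option, if editing the Perl really isn't for you, is to leave the plugin alone and post-process its text output instead. This is a purely hypothetical sketch...the "name : value" report format shown here is made up for illustration (actual plugin output varies by plugin):

```python
import csv
import io

def report_to_csv(report_lines):
    """Convert simple 'name : value' report lines to CSV rows,
    skipping banner lines that don't contain a colon."""
    out = io.StringIO()
    writer = csv.writer(out)
    for line in report_lines:
        if ":" not in line:
            continue
        name, _, value = line.partition(":")
        writer.writerow([name.strip(), value.strip()])
    return out.getvalue()

# hypothetical report lines, not real plugin output
sample = [
    "esent plugin output (hypothetical)",
    "ipconfig : LastWrite 2010-11-20",
    "svchost : LastWrite 2010-11-21",
]
print(report_to_csv(sample), end="")
```

The point is the same as with editing the plugins themselves: because everything is plain text, reshaping the output to suit your workflow is a few minutes' work, not a feature request.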

Okay, so what's the point of all this? To answer the original question, RegRipper is open source, so if you want to know how something is done or if you want to change something, just open up the appropriate file in Notepad. If you're not a programmer, ask someone. It's that easy. RegRipper isn't any better than any other tool, simply because it's not the tool, but the analyst that plays the most important role in any examination.

ZeroAccess Write-up
I was reading through Giuseppe Bonfa's write-up of the ZeroAccess/Max++ rootkit recently, and I have to say...I was interested not only in how detailed and thorough the write-up was, but also the steps taken by the malware author.

In part 1 of the reverse engineering write-up, Giuseppe points out an important artifact associated with this malware...a randomly named Windows service. According to the write-up, the service is installed as a kernel driver, set to load on demand, and the ImagePath is set to "\*". The Service key name itself begins with a '.' (dot).

In part 2, Giuseppe reverse engineers the kernel-mode device driver. His analysis revealed that when the kernel-mode driver loads, it first deletes its Services Registry key, and then the entries under the "Enum\Root\LEGACY_" path. Apparently, the author(s) of this malware are taking steps to protect their gem from discovery, and are doing so by learning from incident responders and forensic analysts.

Giuseppe's write-up is as thorough as it is interesting. Take an opportunity to read through's not only a good example for reverse engineers, but it's also good for other analysts, so that they can understand the perspective of a reverse engineer, as well as what a reverse engineer can come up with and find out about malware. In this case, we've not only seen a rootkit that creates a hidden volume for its files, but one that also actively takes steps to obfuscate its presence on a live system.

OffensiveComputing also has a bit about the reverse engineering of this crimeware rootkit.