Wednesday, May 11, 2016

...back in the Old Corps...

I was digging through some boxes recently and ran across a bit of "ancient history"....

MS-DOS, WFW, Win95 diskettes
Ah, diskettes...anyone remember those?  When I was in college, this was how we did networking.  Sneakernet.  Copy the file to the diskette, carry it over to another computer.  Pretty reliable protocol.

I have (3) MS-DOS 6.0 diskettes, MS-DOS 6.22 setup diskettes, (8) diskettes for Windows-for-Workgroups 3.11, and (13) diskettes for Windows 95.

And yes, I still have a diskette drive, one that connects to a system via USB.  It would be interesting to see if I could set up a VM in VirtualBox running any of these systems.
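If I ever get around to it, the initial VM creation might look something like the following (a sketch using VBoxManage; the VM name, memory size, and floppy image file name are just examples):

VBoxManage createvm --name "MSDOS622" --ostype "DOS" --register
VBoxManage modifyvm "MSDOS622" --memory 32
VBoxManage storagectl "MSDOS622" --name "Floppy" --add floppy
VBoxManage storageattach "MSDOS622" --storagectl "Floppy" --port 0 --device 0 --type fdd --medium dos622_disk1.img

From there, it would just be a matter of imaging each diskette (assuming they still read!) and swapping the floppy images during setup.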

I guess the days of tweaking your autoexec.bat file are long gone.  Sigh.

I did find some interesting sites when I went looking around that purport to provide VirtualBox images:

Kirsle.net (this site refers the reader to the downloads available at Kirsle.net)
Vintage VMs, 386Experience

I wish I still had my OS/2 Warp disks.  I was in grad school when OS/2 Warp 3.0 came out, and I went up to Fry's Electronics in Sunnyvale, CA, and purchased a copy of OS/2 2.1, the box of which had a $15 off coupon for when you purchased version 3.0.  I remember installing it, and running a script provided by one of the CS professors that would optimize the driver loading sequence so that the system booted and ran quicker.  I really liked being able to have multiple windows open doing different things, and the web browser that came with Warp was the first one where you could drag-n-drop images from the browser to the desktop.

Windows, Office on CD
Here's a little bit more history...Windows 95 Plus, and Windows NT 4.0 Workstation, along with a couple of copies of Office.  I also have the CDs for Windows NT 4.0 Server, Windows 2003 Server, and I have (2) copies of Windows XP.

OS/2 Warp 4.52, running in VirtualBox
Oh, and hey...this just happened!  I found a VM someone had uploaded of OS/2 Warp 4.52, and it worked right out of the box (no pun intended...well, maybe just a little...)

Saturday, May 07, 2016

Accessing Historical Information During DF Work

There are a number of times when performing digital forensic analysis that you may want access to historical information on the system.  That is to say, you'd like to reach a bit further into the system's past than what's directly available within the image.

Modern Windows systems can contain hidden caches of historical information that provide an analyst with additional visibility and insight into events that previously occurred on a system.  Knowing where those caches are and how to access them can make all the difference in your analysis, and knowing how to access them efficiently means that doing so won't significantly add to the time your analysis takes.

System Restore Points
In the course of the analysis work I do, I still see Windows XP and 2003 system images; most often, they'll be images of Windows 2003 Server systems.  Specifically when analyzing XP systems, I've been able to extract Registry hives from Restore Points and get a partial view of how the system "looked" in the past.

One specific example that comes to mind is that during timeline analysis, I found that a Registry key had been modified (i.e., via the key LastWrite time).  Knowing that a number of events could lead to the key being modified, I found the most recent previous version of the hive in the Restore Points, and found that at that time, the value in question wasn't present beneath the key.  The most logical conclusion was that the modification of the key LastWrite time was the result of the value (in this case, used for malware persistence) being written to the key.

The great thing is that Windows actually maintains an easily-parsed log for each Restore Point that was created, which includes the date, as well as the reason for the RP being created.  Along with the reasons that Microsoft provides for RPs being created, these logs can provide some much-needed context to your analysis.
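For example, a quick-and-dirty parser for the rp.log file found in each RPxx folder might look something like the following sketch.  Fair warning: the offsets (a Unicode description string at offset 0x10, and a FILETIME object in the last 8 bytes of the file) are from my own notes, so verify them against your own data before relying on the output.

#! c:\perl\bin\perl.exe
# Sketch: pull the description and creation date from an XP Restore
# Point rp.log file.  Offsets are assumptions from my notes.
use strict;

my $file = shift || die "You must enter a filename.\n";
open(FH, "<", $file) || die "Could not open $file: $!\n";
binmode(FH);

my $data;
# Unicode (UTF-16LE) description string starts at offset 0x10
seek(FH, 0x10, 0);
read(FH, $data, 512);
# Naive UTF-16LE handling; fine for ASCII-range descriptions
my $desc = (split(/\x00\x00/, $data, 2))[0];
$desc =~ s/\x00//g;

# The last 8 bytes of the file are a FILETIME object
seek(FH, -8, 2);
read(FH, $data, 8);
close(FH);
my ($lo, $hi) = unpack("VV", $data);
my $epoch = int(($hi * 4294967296 + $lo) / 10000000 - 11644473600);
print gmtime($epoch)." Z - ".$desc."\n";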

RegBack Folder
Beginning with Vista, a number of system processes that ran as services were moved over to Scheduled Tasks that were part of the default installation of Windows.  The specific task is named "RegIdleBackup", is scheduled to run every 10 days, and creates backup copies of the hives in the system32\config folder, placing those copies in the system32\config\RegBack folder.
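You can verify the task and its schedule on your own system via schtasks (the task path below is from my notes; check it against your version of Windows):

schtasks /query /v /fo LIST /tn "\Microsoft\Windows\Registry\RegIdleBackup"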

VSCs
The images I work with tend to be from corporations, and in a great many instances, Volume Shadow Copies are not enabled on the systems.  Some of the systems are virtual machines; others are images taken from servers or employee laptops.  However, every now and then I do find a system image with difference files available, and it is sometimes fruitful to investigate the extent to which historical information may be available.

Now, the Windows Forensic Analysis books have an entire chapter that details tools and methods that can be used to access VSCs, and I've used the information in those chapters time and time again.  Like I mentioned in a previous post, one of the reasons I write the books is so that I have a reference; there are a number of analysis tasks I'll perform, the first step of which is to pull one of the books off my shelf.  As an update to the information in the books, and many thanks to David Cowen for sharing this with me, I've used libvshadow to access VSC metadata and folders when other methods didn't work.
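For reference, the basic live-system approach from the books looks something like the following (the shadow copy index is just an example; use whatever vssadmin reports, and don't forget the trailing backslash on the device path, as the symlink won't work without it):

vssadmin list shadows /for=C:
mklink /D C:\vsc25 \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy25\

With libvshadow, the vshadowinfo and vshadowmount tools provide similar access to the VSC metadata and volumes, respectively, when working from an acquired image.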

What can be found in a VSC is really pretty amazing...which is probably why a lot of threat actors and malware (ransomware) will disable and delete VSCs as part of their process.

Hibernation File
A while back, I was working on an issue where we knew a system had been infected with a remote access Trojan (RAT).  What initially got our client's attention was network alerts illustrating that the RAT was "phoning home" from this system.  Once we received an image of the system, we found very little to indicate the presence of the RAT on the system.

However, the system was a laptop, and the image contained a hibernation file.  Our analysis, along with the network alerts, provided us with an indication of when the RAT had been installed on the system, and the hibernation file had been created after that time, but before the system had been imaged. Using Volatility, we were able to not just see that the RAT had been running on the system; we were able to get the start time of the process, extract a copy of the RAT's executable image from memory, locate the persistence mechanism in the System hive extracted from memory, etc.
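To give a rough idea of what that process looks like, here's a sketch using Volatility 2.x syntax (the profile, process ID, and output locations are examples; adjust them to your image):

vol.py -f hiberfil.sys --profile=Win7SP1x86 imagecopy -O memory.raw
vol.py -f memory.raw --profile=Win7SP1x86 pslist
vol.py -f memory.raw --profile=Win7SP1x86 procdump -p 1234 -D dump/
vol.py -f memory.raw --profile=Win7SP1x86 printkey -K "ControlSet001\Services"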

Remember, the hibernation file is a snapshot of the running system at a point in time, much like a photograph that your mom took of you on your first day of school.  It's frozen in time, and can contain an incredible wealth of information, such as running processes, executable images, Registry keys/values, etc.  If the hibernation file was last modified during the window of compromise, or anywhere within the time frame of the incident you're investigating, you may very well find some extremely valuable information to help add context to your examination.

Windows.Old Folder
Not long ago, I went ahead and updated my personal laptop from Windows 7 to Windows 10.  Once the update was complete, I ended up with a folder named "Windows.old".  As I ran through the subfolders, reviewing the files available within each, I found that I had Registry hives (in the system32\config folder, RegBack folder, and user folders), Windows Event Log files, a recentfilecache.bcf file, etc.  There was a veritable treasure trove of historical information about the system just sitting there, and the great thing was that it was all from Windows 7!  Whenever I come out with a new book, the first question people ask is, "...does it cover Windows [insert version here]?"  Well, if that's a concern, when you find a Windows.Old folder, it's a previous version of Windows, so everything you knew about that version of Windows still applies.

Deleted Keys/Values
Another area of analysis that I've found useful time and time again is to look within the unallocated space of Registry hive files themselves for deleted keys and values.  Much like a deleted file or record of some kind, keys and values deleted from the Registry will persist within the unallocated space within the hive file itself until that space is reclaimed and the information is overwritten.
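Parse::Win32Registry, the Perl module beneath RegRipper, appears to make this pretty accessible via its entry iterator; the following is a rough sketch based on my reading of the module documentation, so test it before relying on it:

#! c:\perl\bin\perl.exe
# Sketch: walk every entry in a hive file and report the ones that
# are unallocated (i.e., deleted keys and values).
use strict;
use Parse::Win32Registry;

my $file = shift || die "You must enter a hive filename.\n";
my $registry = Parse::Win32Registry->new($file)
    || die "Could not parse $file as a hive file.\n";

my $entry_iter = $registry->get_entry_iterator;
while (defined(my $entry = $entry_iter->get_next)) {
    next if ($entry->is_allocated);
    print $entry->as_string."\n";
}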

Want to find out more about this subject?  Check out this book...seriously.  It covers what happens when keys and values are deleted, where they go, and tools you can use to recover them.

Wednesday, May 04, 2016

Updates

RegRipper Plugins
I don't often get requests on Github for modifications to RegRipper, but I got one recently that was very interesting. Duckexmachina said that they'd run log2timeline and found entries in one ControlSet within the System hive that weren't in the one marked as "current", and as a result, those entries were not listed by the appcompatcache.pl plugin.

As such, as a test, I wrote shimcache.pl, which accesses all available ControlSets within the System hive, and displays the entries listed.  In the limited testing I've done with the new plugin, I haven't yet found differences in the AppCompatCache entries in the available ControlSets; in the few System hives that I have available for testing, the LastWrite times for the keys in the available ControlSets have been identical.
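The core of the approach is pretty simple; the sketch below shows the idea (this is a stripped-down illustration, not the plugin itself, and it assumes the Vista/Win7-style key path and value name):

#! c:\perl\bin\perl.exe
# Sketch: check the AppCompatCache key in every available ControlSet,
# not just the one marked "Current".
use strict;
use Parse::Win32Registry;

my $file = shift || die "You must enter a System hive filename.\n";
my $reg = Parse::Win32Registry->new($file)
    || die "Could not parse $file as a hive file.\n";
my $root_key = $reg->get_root_key;

foreach my $subkey ($root_key->get_list_of_subkeys) {
    next unless ($subkey->get_name =~ m/^ControlSet00\d/);
    my $path = $subkey->get_name."\\Control\\Session Manager\\AppCompatCache";
    if (my $key = $root_key->get_subkey($path)) {
        print $path."\n";
        print "LastWrite: ".$key->get_timestamp_as_string."\n";
        # The cache data itself lives in the AppCompatCache value and
        # requires OS-version-specific parsing
        if (my $val = $key->get_value("AppCompatCache")) {
            print "Data size: ".length($val->get_data)." bytes\n";
        }
    }
}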

As you can see in the below timeline excerpt, the AppCompatCache keys in both ControlSets appear to be written at shutdown:

Tue Mar 22 04:02:49 2016 Z
  FILE                       - .A.. [107479040] C:\Windows\System32\config\SOFTWARE
  FILE                       - .A.. [262144] C:\Windows\ServiceProfiles\NetworkService\NTUSER.DAT
  FILE                       - .A.. [262144] C:\Windows\System32\config\SECURITY
  FILE                       - .A.. [262144] C:\Windows\ServiceProfiles\LocalService\NTUSER.DAT
  FILE                       - .A.. [18087936] C:\System Volume Information\Syscache.hve
  REG                        - M... HKLM/System/ControlSet002/Control/Session Manager/AppCompatCache 
  REG                        - M... HKLM/System/ControlSet001/Control/Session Manager/AppCompatCache 
  FILE                       - .A.. [262144] C:\Windows\System32\config\SAM
  FILE                       - .A.. [14942208] C:\Windows\System32\config\SYSTEM

Now there may be instances where this is not the case, but for the most part, what you see in the above timeline excerpt is what I tend to see in the recent timelines I've created.

I'll go ahead and leave the shimcache.pl plugin as part of the distribution, and see how folks use it.  I'm not sure that adding the capability of parsing all available ControlSets is something that is necessary or even useful for all plugins that parse the System hive.  If I need to see something from a historical perspective within the System hive, I'll either go to the RegBack folder and extract the copy of the hive stored there, or access any Volume Shadow Copies that may be available.
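As an example of that last bit, pointing the plugin (or any other System hive plugin) at the backed-up hive is just a matter of the -r switch:

rip.exe -r F:\Windows\System32\config\RegBack\SYSTEM -p shimcache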

Tools
MS has updated their Sysmon tool to version 4.0.  There's also this great presentation from Mark Russinovich that discusses how the tool can be used in an infrastructure.  It's well worth the time to go through it.

Books
A quick update to my last blog post about writing books...every now and then (and it's not very often), when someone asks if a book is going to address "what's new" in an operating system, I'll find someone who will actually be able to add some detail to the request.  For example, the question may be about new functionality added to the operating system, such as Cortana, Continuum, new browsers (Edge, Spartan), new search functionality, etc., and the artifacts left on the system and in the Registry through their use.

These are all great questions, but something that isn't readily apparent to most folks is that I'm not a testing facility or company.  I'm one guy.  I do not have access to devices such as a Windows phone, a Surface device, etc.  I'm writing this blog post using a Dell Latitude E6510...I don't have a touch screen device available to test functionality such as...well...the touch screen, a digital assistant, etc.

RegRipper is open source and free.  As some are aware, I end up giving a lot of the new books away.  I don't have access to a device that runs Windows 10 and has a touch screen, or can run Cortana.  I don't have access to MSDN to download and test new versions of Windows, MSOffice, etc.

Would I like to include those sorts of artifacts as part of RegRipper, or in a book?  Yes, I would...I think it would be really cool.  But instead of asking, "...does it cover...", ask yourself instead, "what am I willing to contribute?"  It could be devices for testing, or the data extracted from said devices, along with a description of the testing performed, etc.  I do what I can with the resources I have available, folks.

Analysis
I was pointed to this site recently, which begins a discussion of a technique for finding unknown malware on Windows systems.  The page is described as "part 1 of 5", and after reading through it, while I think that it's a good idea to have things like this available to DFIR analysts, I don't agree with the process itself.

Here's why...I don't agree that long-running processes (hash computation/comparison, carving unallocated space, AV scans, etc.) are the first things that should be done when kicking off analysis.  There is plenty of analysis that can be conducted in parallel while those processes are running, and the necessary data for that analysis should be extracted first.

Analysis should start with identified, discrete goals.  After all, imaging and analyzing a system can be an expensive (in terms of time, money, staffing resources, etc.) process, so you want to have a reason for going down this road.  "Find all the bad stuff" is not a goal; what constitutes "bad" in the context of the environment in which the system exists?  Is the user a pen tester, or do they find vulnerabilities and write exploits?  If so, "bad" takes on an entirely new context.  When tasked with finding unknown malware, the first question should be, what leads us to believe that this system has malware on it?  I mean, honestly, when a sysadmin or IT director walks into their office in the morning, do they have a listing of systems on the wall and just throw a dart at it, and whichever system the dart lands on suddenly has malware on it?  No, that's not the case at all...there's usually something (unusual activity, process performance degradation, etc.) that leads someone to believe that there's malware on a system.  And usually when these things are noticed, they're noticed at a particular time.  Getting that information can help narrow down the search, and as such should be documented before kicking off analysis.

Once the analysis goals are documented, we have to remember that malware must execute in order to do damage.  Well, that is...in most cases.  As such, what we'd initially want to focus on is artifacts of process execution, and from there look for artifacts of malware on the system.

Something I discussed with another analyst recently is that I love analyzing Windows systems because the OS itself will very often record artifacts as the malware interacts with its ecosystem.  Some malware creates files and Registry keys/values, and this functionality can be found within the code of the malware itself.  However, as some malware executes, there are events that may be recorded by the operating system that are not part of the malware code.  It's like dropping a rock in a pond...there's nothing about the rock, in and of itself, that requires that ripples be produced; rather, this is something that the pond does as a reaction to the rock interacting with it.  The same can very often be true with Windows systems and malware (or a dedicated adversary).

That being said, I'll look forward to reading the remaining four blog posts in the series.

Monday, May 02, 2016

Thoughts on Books and Book Writing

The new book has been out for a couple of weeks now, and already there are two customer reviews (many thanks to Daniel Garcia and Amazon Customer for their reviews).  Daniel also wrote a more extensive review of the book on his blog, found here.  Daniel, thanks for the extensive work in reading and then writing about the book, I greatly appreciate it.

Here's my take on what the book covers...not a review, just a description of the book itself for those who may have questions.

Does it cover ... ?
One question I get every time a book is released is, "Does it cover changes to [the latest version of Windows]?"  I got it with all of the Windows Forensic Analysis books, and I got it when the first edition of this book was released ("Does it cover changes in Windows 7?").  In fact, I got that question from someone at a conference I was speaking at recently.  I thought that was pretty odd, as most often these questions are posted to public forums, and I don't see them.  As such, I thought I'd try to address the question here, so that maybe people could see my reasoning, and ask questions that way.

What I try to do with the books is address an analysis process, and perhaps show different ways that Registry data can be incorporated into the overall analysis plan.  Here's a really good example of how incorporating Registry data into an analysis process worked out FTW.  But that's just one, and a recent one...the book is full of other examples of how I've incorporated Registry data into an examination, and how doing so has been extremely valuable.

One of the things I wanted to do with this book was not just talk about how I have used Registry data in my analysis, but illustrate how others have done so, as well.  As such, I set up a contest, asking people to send me short write-ups regarding how they've used Registry analysis in their case work.  I thought it would be great to get different perspectives, and illustrate how others across the industry were doing this sort of work.  I got a single submission.

My point is simply this...there really is no suitable forum (online, book, etc.) or means by which to address every change that can occur in the Registry.  I'm not just talking about between versions of Windows...sometimes, it's simply the passage of time that leads to some change creeping into the operating system.  For example, take this blog post that's less than a year old...Yogesh found a value beneath a Registry key that contains the SSID of a wireless network.  With the operating system alone, there will be changes along the way, possibly a great many.  Add to that applications, and you'll get a whole new level of expansion...so how would that be maintained?  As a list?  Where would it be maintained?

As such, what I've tried to do in the book is share some thoughts on artifact categories and the analysis process, in hopes that the analysis process itself would cast a wide enough net to pick up things that may have changed between versions of Windows, or simply not been discussed (or not discussed at great length) previously.

Book Writing
Sometimes, I think about why I write books; what's my reason or motivation for writing the books that I write?  I ask this question of myself, usually when starting a new book, or following a break after finishing a book.

I guess the biggest reason is that when I first started looking around for resources that covered DFIR work and topics specific to Windows systems, there really weren't any...at least, not any that I wanted to use/own.  Some of those that were available were very general, and with few exceptions, you could replace "Windows" with "Linux" and have the same book.  As such, I set out to write a book that I wanted to use, something I would refer to...and specifically with respect to the Windows Registry Forensics books, I still do.  In fact, almost everything that remained the same between the two editions did so because I still use it, and find it to be extremely valuable reference material.

So, while I wish that those interested in something particular in a book, like covering "changes to the Registry in [some version of Windows]", would describe the changes that they're referring to before the book goes to the publisher, that simply hasn't been the case.  I have reached out to the community because I honestly believe that folks have good ideas, and that a book that includes something one person finds interesting will surely be of interest to someone else.  However, the result has been...well, you know where I'm going with this.  Regardless, as long as I have ideas and feel like writing, I will.

Thursday, April 14, 2016

Training Philosophy

I have always felt that everyone, including DFIR analysts, needs to take some level of responsibility for their own professional education.  What does this mean?  There are a couple of ways to go about this in any industry; professional reading, attending training courses, engaging with others within the community, etc.  Very often, it's beneficial to engage in more than one of these, particularly as people tend to take in information and learn new skills in different ways.

Specifically with respect to DFIR, there are training courses available that you can attend, and it doesn't take a great deal of effort to find many of these courses.  You attend the training, sit in a classroom, listen to lectures, and run through lab exercises.  All of this is great, and a great way to learn something that is perhaps completely new to you, or simply a new way of performing a task.  But what happens beyond that?  What happens beyond what's presented, beyond the classroom?  Do analysts take responsibility for their education, incorporating what they learned into their day-to-day job and then going beyond what's presented in a formal setting?  Do they explore new data sources, tools, and processes?  Or do they sign up again for the course the following year in order to get new information?

When I was on the IBM ISS ERS team, we had a team member tell us that they could only learn something if they were sitting in a classroom, and someone was teaching them the subject.  On the surface, we were like, "wait...what?"  After all, do you really want an employee and a fellow team member who states that they can't learn anything new without being taught, in a classroom?  However, if you look beyond the visceral surface reaction to that statement, what they were saying was, they have too much going on operationally to take the time out to start from square 1 to learn something.  The business model behind their position requires them to be as billable as possible, which ends up meaning that out of their business day, they don't have a great deal of time available for things like non-billable professional development.  Taking them out of operational rotation, and putting them in a classroom environment where they weren't responsible for analysis, reporting, submitting travel claims, sending updates, and other billable commitments, would give them the opportunity to learn something new.  But what was important, following the training, was what they did with it.  Was that training away from the daily grind of analysis, expense reports and conference calls used as the basis for developing new skills, or was the end of the training the end of learning?

Learning New Skills
Back in 1982, I took a BASIC programming class on the Apple IIe, and the teacher's philosophy was to provide us with some basic (no pun intended) information, and then cut us loose to explore.  Those of us in the class would try different things, some (most) of which didn't work, or didn't work as intended.  If we found something that worked really well, we'd share it.  If we found something that didn't work, or didn't work quite right, we'd share that, as well, and someone would usually be able to figure out why we weren't seeing what we expected to see.

Jump ahead about 13 years, and my linear algebra professor during my graduate studies had the same philosophy.  Where most professors would give a project and the students would struggle for the rest of the week to "get" the core part of the project, this professor would provide us with the core bit of code (we were using MatLab) for the exercise or lab, and our "project" was to learn.  Of course, some did the minimum and moved on, and others would really push the boundaries of the subject.  I remember one such project where I spent a lot of time observing not just the effect of the code on different shaped matrices, but also the effect of running the output back into the code.

So now, in my professional life, I still seek to learn new things, and employ what I learn in an exploratory manner.  What happens when I do this new thing?  Or, what happens if I take this one thing that I learned, and share it with someone else?  When I learn something new, I like to try it out and see how to best employ it as part of my analysis process, even if it means changing what I do, rather than simply adding to it.  As part of that, when someone mentions a tool, I don't wait for them to explain every possible use of the tool to me.  After all, particularly if we're talking about the use of a native Windows tool, I can very often go look for myself.

So you wanna learn...
If you're interested in trying your skills out on some available data, Mari recently shared this MindMap of forensic challenges with me.  This one resource provides links to all sorts of challenges, and scenarios with data available for analysts to test their skills, try out new tools, or simply dust off some old techniques.  The available data covers disk, memory, pcap analysis, and more.

This means that if an analyst wants to learn more about a tool or process, there is data available that they can use to develop their knowledge base, and add to their skillz.  So, if someone talks about a tool or process, there's nothing to stop you from taking responsibility for your own education, downloading the data and employing the tool/process on your own.

Manager's Responsibility
When I was a 2ndLt, I learned that one of my responsibilities as a platoon commander was to ensure that my Marines were properly trained, and I learned that there were two aspects to that.  The first was to ensure that they received the necessary training, be it formal, schoolhouse instruction, via an MCI correspondence course, or some other method.  The second was to ensure that once trained, the Marine employed the training.  After all, what good is it to send someone off to learn something new, only to have them return to the operational cycle and simply go back to what they were doing before they left?  I mean, you could have achieved the same thing by letting them go on vacation for a week, and saved yourself the money spent on the training, right?

Now, admittedly, the military is great about training you to do something, and then ensuring that you then have opportunity to employ that new skill.  In the private sector, particularly with DFIR training, things are often...not that way.

The Point
So, the point of all this is simple...for me, learning is a matter of doing.  I'm sure that this is the case for others, as well.  Someone can point to a tool or process, and give general thoughts on how it can be used, or even provide examples of how they've used it.  However, for me to really learn more about the topic, I need to actually do something.

The exception to this is understanding the decision to use the tool or process.  For example, what led an analyst to decide to run, say, plaso against an image, rather than extract specific data sources, in order to create and analyze a timeline while running an AV scan?  What leads an analyst to decide to use a specific process or to look at specific data sources, while not looking at others?  That's something that you can only get by engaging with someone and asking questions...but asking those questions is also taking responsibility for your own education.

Tuesday, April 12, 2016

Links

Ramdo
I ran across this corporate blog post regarding the Ramdo click-fraud malware recently, and one particular statement caught my eye, namely:

Documented by Microsoft in 2014, Ramdo continues to evolve to evade host-based and network-based detection.

I thought, hold on a second...if this was documented in April 2014 (2 yrs ago), what about it makes host-based detection so difficult?  I decided to take a look at what some of the AV sites were saying about the malware.  After all, the MSRT link indicates that the malware writes its configuration information to a couple of Registry values, and the Win32/Ramdo.A write-up provides even more information along these lines.

I updated the RegRipper malware.pl plugin with checks for the various values identified by several sources, but because I have limited data for testing, I don't feel comfortable that this new version of the plugin is ready for release.
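That said, the core of such a check is straightforward; the fragment below illustrates the shape of it within a plugin's pluginmain() function.  To be clear, the key path and value names here are placeholders, not the actual Ramdo indicators:

sub pluginmain {
    my $class = shift;
    my $hive = shift;
    my $reg = Parse::Win32Registry->new($hive);
    my $root_key = $reg->get_root_key;
    # Placeholder indicators...not the actual Ramdo values
    my $key_path = "Microsoft\\Windows\\CurrentVersion";
    my @ramdo_values = ("PlaceholderValue1", "PlaceholderValue2");
    if (my $key = $root_key->get_subkey($key_path)) {
        foreach my $v (@ramdo_values) {
            if (my $val = $key->get_value($v)) {
                ::rptMsg("Possible Ramdo config value found: ".$v);
            }
        }
    }
}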

Book
Speaking of RegRipper plugins, just a reminder that the newly published Windows Registry Forensics 2e not only includes descriptions of a number of current plugins, but also includes an entire chapter devoted just to RegRipper, covering topics such as how to use it, and how to write your own plugins.

Timeline Analysis
The book also covers, starting on page 53 (of the softcover edition), tools that I use to incorporate Registry information into timeline analysis.  I've used this methodology to considerable effect over the years, including very recently to locate a novel WMI persistence technique, which another analyst was able to completely unravel.
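As a quick example of what that looks like in practice, the *_tln RegRipper plugins emit events in the intermediate TLN format, so adding Registry data to a timeline can be as simple as the following (file paths are examples; parse.pl is from the book materials):

rip.exe -r F:\Windows\System32\config\SYSTEM -p appcompatcache_tln >> events.txt
parse.pl -f events.txt > timeline.txt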

Mimikatz
For those who may not be aware, mimikatz includes the capability to clear the Event Log, as well as reportedly stop the Event Service from generating new events.

Okay, so someone can apparently stop the Windows Event Log service from generating event records, and then steal your credentials. If nothing else, this really illustrates the need for process creation monitoring on endpoints.

Addendum, 14 Apr: I did some testing last night, and found that when using the mimikatz functionality to clear the Windows Event Log, a Microsoft-Windows-EventLog/1102 event is generated.  Unfortunately, when I tried the "event::drop" functionality, I got an error.

Something else to keep in mind is that this isn't the only way that adversaries can be observed interacting with the Windows Event Log.  Not only are native tools (wevtutil.exe, PowerShell, WMI) available, but MS provides LogParser for free.
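For example, checking for "the audit log was cleared" events with native tools is a one-liner (the /rd and /c switches just limit the output to the five most recent hits):

wevtutil qe Security /q:"*[System[(EventID=1102)]]" /f:text /rd:true /c:5

LogParser can run a similar query against an exported .evtx file:

LogParser.exe -i:EVT "SELECT TimeGenerated, EventID FROM Security.evtx WHERE EventID = 1102"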

Ghost in the (Power)Shell
The folks at Carbon Black recently posted an article regarding the use of PowerShell in attacks.  As I read through the article, it wasn't abundantly clear to me what was meant by the adversary attempting to "cloak" attacks by using PowerShell, but due in part to the statistics shared in the article, it does give a view into how PowerShell is being used in some environments.  I'm going to guess that because many organizations still aren't using any sort of process creation monitoring, nor are many logging the use of PowerShell, this is how the use of PowerShell would be considered "cloaked".
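As an aside, if you want PowerShell use to show up in your logs, module and script block logging can be enabled via GPO, which amounts to setting the following values (paths are from my notes, and script block logging requires PowerShell v5, so verify against your environment):

HKLM\SOFTWARE\Policies\Microsoft\Windows\PowerShell\ModuleLogging
  EnableModuleLogging = 1
HKLM\SOFTWARE\Policies\Microsoft\Windows\PowerShell\ScriptBlockLogging
  EnableScriptBlockLogging = 1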

Be sure to take a look at the United Threat Research report described in the Cb article, as well.

Tuesday, April 05, 2016

Cool Stuff, re: WMI Persistence

In case you missed it, the blog post titled, "A Novel WMI Persistence Implementation" was posted to the Dell SecureWorks web site recently.  In short, this blog post presented the results of several SecureWorks team members working together and bringing technical expertise to bear in order to run an unusual persistence mechanism to ground.  The specifics of the issue are covered thoroughly in the blog post.

What was found was a novel WMI persistence mechanism that appeared to have been employed to avoid not just detection by those who administered the infected system, but also by forensic analysts.  In short, the persistence mechanism used was a variation on what was discussed during a MIRCon 2014 presentation; you'll see what I mean when you compare figure 1 from the blog post to slide 45 of the presentation.

After the blog post was published and SecureWorks marketing had tweeted about it, they saw that Matt Graeber had tweeted a request for additional information.  The ensuing exchange included Matt providing a command line for parsing embedded text from a binary MOF file:

mofcomp.exe -MOF:recovered.mof -MFL:ms_409.mof -Amendment:MS_409 binarymof.tmp

What this command does is go into the binary MOF file (binarymof.tmp), and attempt to extract the text that it was created from, essentially "decompiling" it, and placing that text into the file "recovered.mof".

It was no accident that Matt was asking about this; here is Matt's BlackHat 2015 paper, and his presentation.


Windows Registry Forensics, 2E

Okay, the book is out!  At last!  This is the second edition to Windows Registry Forensics, and this one comes with a good bit of new material.

Chapter 1 lays out what I see as the core concepts of analysis, in general, as well as providing a foundational understanding of the Registry itself, from a binary perspective.  I know that there are some who likely feel that they've seen all of this before, but I tend to use this information all the time.

Chapter 2 is again about tools.  I only cover available free and open-source tools that run on Windows systems, for the simple fact that I do not have access to the commercial tools.  Some of the old tools are still applicable, there are new tools available, and some tools are now under license, and in some cases, the strict terms of the license prevent me from including them in the book.  Hopefully, chapter 1 laid the foundation for analysts to be able to make educated decisions as to which tool(s) they prefer to use.

Chapters 3 and 4 remain the same in their focus as with the first edition, but the content of the chapters has changed, and in a lot of aspects, been updated.

Chapter 5 is my answer to anyone who has looked or is looking for a manual on how to use RegRipper.  I get that most folks download the tool and run it as is, but for my own use, I do not use the GUI.  At all.  Ever.  I use rip.exe from the command line, exclusively.  But I also want folks to know that there are more targeted (and perhaps efficient) ways to use RegRipper to your advantage.  I also include information regarding how you can write your own plugins, but as always, if you don't feel comfortable doing so, please consider reaching out to me, as I'm more than happy to help with a plugin.  It's pretty easy to write a plugin if you can (a) concisely describe what you're looking for, and (b) provide sample data.

Now, I know folks are going to ask about specific content, and that usually comes as the question, "do you talk about Windows 10?"  My response to that is to ask specifically what they're referring to, and very often, there's no response to that question.  The purpose of this book is not to provide a list of all possible Registry keys and values of interest or value, for all possible investigations, and for all possible combinations of Windows versions and applications.  That's simply not something that can be achieved.  The purpose of this book is to provide an understanding of the value and context of the Windows Registry that can be applied to a number of investigations.

Thoughts on Writing Books
There's no doubt about it, writing a book is hard.  For the most part, actually writing the book is easy, once you get started.  Sometimes it's the "getting started" that can be hard.  I find that I'll go through phases where I'll be writing furiously, and when I really need to stop (for sleep, life, etc.), I'll take a few minutes to jot down some notes on where I wanted to go with a thought.

While I have done this enough to find ways to make the process easier, there are still difficulties associated with writing a book.  That's just the reality.  It's easier now than it was the first time, and even the second time.  I'm much better at planning for writing a book, and can even provide advice to others on how to best go about it (and what to expect).

At this point, after having written the books that I have, I have to say that the single hardest part of writing books is the lack of feedback from the community.

Take the first edition of Windows Registry Forensics, for example.  I received questions such as, "...are you planning a second edition?", and when I asked for input on what that second edition should cover, I didn't get a response.

I think that from a 50,000 foot view, there's an expectation that things will be different in the next version of Windows, but the simple fact is that, when it comes to Registry forensics, the basic principles have remained the same through all available versions.  Keys are still keys, deleted keys are still discovered the same way, values are still values, etc.  From an application layer perspective, it's inevitable that each new version of Windows will include something "new" with respect to the Registry.  New keys, new values, etc.  The same is true with new versions of applications, and that includes malware, as well.  While the basic principles remain constant, stuff at the application layer changes, and it's very difficult to keep up without some sort of assistance.

Writing a book like this would be significantly easier if those within the community were to provide feedback and input, rather than waiting for the book to be published, and ask, "...did you talk about X?"  Even so, I hope that folks find the book useful, and that some who have received their copy of the book find the time to write a review.  Thanks.