Thursday, January 27, 2011

WRF book available!!

It seems that the Windows Registry Forensics book is available, as it was shipped to the DoD CyberCrime Conference. I'm looking forward to getting my copy!

If you have the Kindle edition of this book, and want the DVD contents, go here. Also, I've added a Books page to the blog, so check there in the future.

Addendum: Reviews!
Brad and Dave have been nice enough to post reviews of the book thus far! Thanks so much, guys...your efforts are greatly appreciated!

Now, they're both up on the Amazon page for the book, as well...

Speaking of reviews, but specific to WFA 2/e, Eric Huber posted this So You'd Like to...Learn Digital Forensics page on Amazon. In it, he says:

Harlan Carvey's Windows Forensic Analysis DVD Toolkit, Second Edition is the best book available on Windows digital forensics.

Thanks, Eric!!

Friday, January 21, 2011

New Tools and Links

Chris Brown has updated ProDiscover to version 6.8. This may not interest a lot of folks, but if you haven't kept up with PD, you should consider taking a look.

If you go to the Resource Center, you'll find a couple of things. First off, there's a whitepaper that demonstrates how to use ProDiscover to access Volume Shadow Copies on live remote systems. There's also a webinar available that demonstrates this. Further down the page, ProDiscover Basic Edition (BE) v 6.8 is available for download...BE now incorporates the Registry, EventLog and Internet History viewers.

Chris also shared with me that PD v6.8 (not BE, of course) includes the following:

Added full support for Microsoft BitLocker protected disks on Vista and Windows 7. This means that users can add any BitLocker protected disk/image to a project and perform all investigative functions, provided that they have the BitLocker recovery key.

The image compare feature in the last update is very cool for getting the diffs on Volume Shadow Copies.

Added support for Linux Ext4 file system.

Added a Thumbs.db viewer.

These are just some of the capabilities he mentioned, and there are more updates to come in the future. Chris is really working hard to make ProDiscover a valuable resource.

MS Tool
Troy Larson reached out to me the other day to let me know that MS had released the beta of their Attack Surface Analyzer tool. I did some looking around with respect to this tool, and while there are a lot of 'retweets', there isn't much out there showing its use.

Okay, so here's what the tool does...install the tool and run a baseline of the system. After you do something...install or update an app, for example...rerun the tool. In both cases, .cab files are created, and you can then run a diff between the two of them. I see two immediate uses for something like this...first, analysts and forensic researchers can add this to their bag of tricks and see what happens on a system when an app is installed or updated, or when updates are installed. The second, which I don't really see happening, is that organizations can install this on their critical systems (after testing, of course) and create baselines of systems, which can be compared to another snapshot after an incident.

I'll admit, I haven't worked with this tool yet, so I don't know if it creates the .cab files in a specific location or the user can specify the location, or even what's covered in the snapshot, but something like this might end up being very useful. Troy says that this tool has "great potential for artifact hunters", and I agree.

CyberSpeak is back!
After a bit of an absence, Ovie is back with the CyberSpeak podcast, posting an interview with Mark Wade of the Harris Corporation. The two of them talked about an article that Mark had written for DFINews...the interview was apparently based on pt. 1 of the article, now there's a pt. 2. Mark's got some great information based on his research into the application prefetch files generated by Windows systems.

During the interview, Mark mentioned being able to use time-based analysis of the application prefetch files to learn something about the user and their actions. Two thoughts on this...first, unless the programs that were run are in a specific user's profile directory (and in some cases, even if they are...), you're going to have to do more analysis to tie the prefetch files to when a user was logged in...application prefetch files are indirect artifacts generated by the OS, and are not directly tied to a specific user.

The second thought is...timeline analysis! All you would need to do to perform the analysis Mark referred to is generate a nano-timeline using only the metadata from the application prefetch files themselves. Of course, you could build on that, using the file system metadata for those files, and the contents of the UserAssist subkeys (and possibly the RecentDocs key) to build a more complete picture of the user's activities.
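As a rough sketch of what such a nano-timeline looks like, assuming the last-run times and run counts have already been extracted from the .pf files (the parsing itself isn't shown, and these entries are hypothetical stand-ins):

```python
from datetime import datetime, timezone

# Hypothetical (last-run time, prefetch file name, run count) tuples; in
# practice these would be extracted from the .pf files themselves.
events = [
    (datetime(2010, 1, 19, 0, 37, 46, tzinfo=timezone.utc), "ITUNES.EXE-56EF78AB.pf", 296),
    (datetime(2010, 1, 21, 3, 10, 26, tzinfo=timezone.utc), "SKYPE.EXE-12AB34CD.pf", 14),
]

def nano_timeline(events):
    """Sort events most-recent-first for review."""
    return sorted(events, key=lambda e: e[0], reverse=True)

for when, name, count in nano_timeline(events):
    print(when.strftime("%a %b %d %H:%M:%S %Y Z"), name, "({})".format(count))
```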

Gettin' Local
A recent article in the Washington Post stated that Virginia has seen a rise in CP cases. I caught this on the radio, and decided to see if I could find the article. The article states that the increase is a result of the growth of the Internet and P2P sharing networks. I'm sure that along with this has been an increase in the "I didn't do it" claims, more commonly referred to as the "Trojan Defense".

There's a great deal of analysis that can be done quickly and thoroughly to obviate the "Trojan Defense", before it's ever actually raised. Analysts can look to Windows Forensic Analysis, Windows Registry Forensics, and the upcoming Digital Forensics with Open Source Tools for solutions on how to address this situation. One example is to create a timeline that shows the user logging into the system, launching the P2P application, and then from there add any available logs of file down- or up-loads, the launching of an image viewing application (and associated MRU list...), etc.

Another issue that needs to be addressed involves determining what artifacts "look like" when a user connects a smart phone to a laptop in order to copy or move image or video files (or uploads them directly from the phone), and then share them via a P2P network.

Free Stuff
Ken Pryor has posted his second article about doing "Digital Forensics on a (less than) shoestring budget" to the SANS Forensic blog. Ken's first post addressed training options, and his second post presents some of the tools described in the upcoming Digital Forensics with Open Source Tools book.

What I like about these posts is that by going the free, open-source, and/or low cost route for tools, we start getting analysts to understand that analysis is not about the tools, it's about the process. I think that this is critically important, and it doesn't take much to understand why...just look around at all of the predictions for 2011, and see what they're saying about cybercrime continuing to become more sophisticated.

Tuesday, January 18, 2011

More VSCs

I was doing some writing last night, specifically documenting the process described in my previous blog post on accessing VSCs. I grabbed an NTUSER.DAT from within a user profile from the mounted image/VHD file, as well as the same file from within the oldest VSC available, and ran my RegRipper userassist plugin against both of the files.

Let me say that I didn't have to use robocopy to extract the files...I could've just run the plugin against the mounted files/file systems. However, I had some other thoughts in mind, and wanted the copies of the hive files to try things out. Besides, robocopy is native to Windows 7.

If the value of VSCs has not been recognized or understood by now, then we have a serious issue on our hands. For example, we know that the UserAssist key values can tell us the last time that a user performed a specific action via the shell (i.e., clicked on a desktop shortcut, followed the Start->Programs path, etc.) and how often they've done so. So, the 15th time a user performs a certain action, we only see the information about that most recent instance, and not the previous times.

By mounting the oldest VSC and parsing the user hive file, I was able to get additional historical information, including other times that applications (Quick Cam, Skype, iTunes, etc.) had been launched by the user. This provides some very significant historical data that can be used to fill in gaps in a timeline, particularly when there's considerable time between when an incident occurred and when it was detected.

Here's an excerpt of the UserAssist values from the NTUSER.DAT in the mounted VHD:

Thu Jan 21 03:10:26 2010 Z
UEME_RUNPATH:C:\Program Files\Skype\Phone\Skype.exe (14)
Tue Jan 19 00:37:46 2010 Z
UEME_RUNPATH:C:\Program Files\iTunes\iTunes.exe (296)

And here's an excerpt of similar values from the NTUSER.DAT within the mounted VSC:

Sat Jan 9 11:40:31 2010 Z
UEME_RUNPATH:C:\Program Files\iTunes\iTunes.exe (293)
Fri Jan 8 04:13:40 2010 Z
UEME_RUNPATH:C:\Program Files\Skype\Phone\Skype.exe (8)

Some pretty valuable information there...imagine how this could be used to fill in a timeline.
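As a quick sketch, merging the two excerpts into a single mini-timeline is just a sort across both sources (the values below are taken from the excerpts above; the app names are shortened):

```python
from datetime import datetime, timezone

# Values from the two UserAssist excerpts: current hive vs. oldest VSC
current = [
    (datetime(2010, 1, 21, 3, 10, 26, tzinfo=timezone.utc), "Skype.exe", 14),
    (datetime(2010, 1, 19, 0, 37, 46, tzinfo=timezone.utc), "iTunes.exe", 296),
]
vsc = [
    (datetime(2010, 1, 9, 11, 40, 31, tzinfo=timezone.utc), "iTunes.exe", 293),
    (datetime(2010, 1, 8, 4, 13, 40, tzinfo=timezone.utc), "Skype.exe", 8),
]

# One merged, most-recent-first view showing the recovered history
merged = sorted(current + vsc, key=lambda e: e[0], reverse=True)
for when, app, count in merged:
    print(when.date(), app, "run count:", count)
```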

And the really interesting thing is that just about everything else you'd do with a regular file system, you can do with the mounted AV scans, run RegRipper or the forensic scanner, etc.

Thursday, January 13, 2011

More Stuff

More on Malware
No pun intended. ;-)

The MMPC has another post up about malware, this one called Kelihos. Apparently, there are some similarities between Kelihos and Waledac, enough that the folks at the MMPC stated that there was likely code reuse. However, there's quite a bit more written about Waledac...and that's what concerns me. The write-up on Kelihos states that the malware "allows unauthorized access and control of an affected computer", but there's no indication as to how that occurs. The only artifact that's listed in the write-up is a file name and the persistence mechanism (i.e., the Run key). So how does this control occur? Might it be helpful to IT and network admins to know a little bit more about this?

Also, take a close look at the Kelihos mentions a file that's dropped into the "All Users" profile and an entry in the HKLM\...\Run key...but that Run key entry apparently doesn't point to the file that's listed.

I understand that the MMPC specifically and AV companies in general aren't in the business of providing more comprehensive information, but what would be the harm, really? They have the information...and I'm not talking about complete reverse engineering of the malware, so there's no need to do a ton of extra work and then post it for free. Given that this affects Microsoft operating systems, I would hope that some organization within MS could provide information that would assist organizations that use those OSs in detecting and reacting to infections in a timely manner.

Eric Huber posted a very illuminating interview with Hal Pomeranz over on the AFoD blog. Throughout the interview, Hal addresses several questions (from his perspective) that you see a lot in lists and particular, there are a lot of "how I got started in the business" responses. I see this sort of question all the time, and it's good to see someone like Hal not only discussing what he did to "break into the business", as it were, but also what he looks for with respect to new employees. If you have the time, take a read through the questions and answers, and see what Hal has to's definitely worth your time.

Personally, having received an iTouch for Christmas, I think that a podcast would be a great forum for this sort of thing. I'm just sayin'... ;-)

Corey Harrell posted the results of some research on his Journey into Incident Response blog; he's performed some analysis regarding locating AutoPlay and Autorun artifacts. He's done some pretty thorough research regarding this topic, and done a great job of documenting what he did.

Results aside, the most important and valuable thing about what Corey did was share what he found. Have you ever had a conversation with someone where maybe you showed them something that you'd run across, or just asked them a question, and their response was, "yeah, I've been doing that for years"? How disappointing is that? I mean, to know someone in the industry, and to have a problem (or even just be curious about something) and know someone who's known the answer but never actually said anything? And not just not said anything at that moment...but ever.

I think that's where we could really improve as a community. There are folks like Corey who find something, and share it. And there are others in the community who have things that they do all the time, but no one else knows until the topic comes up and that person says, "yeah, I do that all the time."

Process Improvement
I think that one of the best shows on TV now is Undercover Boss. Part of the reason I like it is because rather than showing people treating themselves and each other in a questionable manner, the show has CEOs going out and engaging with front line employees. At the end of the show, the employees generally get recognized in some way for their hard work and dedication.

One topic jumped out in particular from the UniFirst episode...that front line employees who were the ones doing the job were better qualified to suggest and make changes to make the task more efficient. After all, who is better qualified than that person to come up with a way to save time and money at a task?

When I was in the military, I was given training in Total Quality Management (TQM) and certified by the Dept of the Navy to teach it to others. Being a Marine, I had other Marines tell me that TQM (we tried to call it "Total Quality Leadership" to get Marines to accept it) would never be accepted or used. I completely agree now, just as I did then...there are some tasks for which process improvement won't provide a great deal of benefit, but there are others for which it will. More than anything else, the one aspect of TQM/TQL that I found Marines could use everywhere was the practice of engaging with the front line person performing the task in order to seek improvement. A great example of this was my radio operators, who had to assemble RC-292 antennas all the time; one of my Marines had used wire, some epoxy and the bottom of a soda can to create "cobra heads", or field-expedient antenna kits that could be elevated (and the radios operational) before other Marines could go to the back of the Hummer, pull out an antenna kit, and start putting the mast together. This improved the process of getting communications up and available, and it was a process developed by those on the "front lines" who actually do the work.

So what does that have to do with forensics or incident response? Well, one of the things I like to do now and again is look at my last engagement, or look back over a couple of engagements, and see what I can improve upon. What can I do better going forward, or what can I do if there's a slight change in one of the aspects of the examination?

While on the IBM team and performing data breach investigations, I tried to optimize what I was doing. Sometimes taking a little more time up front, such as making a second working copy of the image, would allow me to perform parallel operations...I could use one working copy for analysis, and the other would be subject to scans. Or, I could extract specific files and data from one working copy, start my analysis, and start scanning the two working images. Chris Pogue, a SANS Thought Leader who was on our team at the time, got really good at running parallel analysis operations, by setting up multiple VMs to do just that.

The point is that we were the ones tasked with performing the work, and we looked at the requirements of the job, and found ways to do a better, more comprehensive job in a more efficient manner, and get that done in fewer ticks of the clock. One thing that really benefited us was collaborating and sharing what we knew. For example, Chris was really good at running multiple VMs to complete tasks in parallel, and he shared that with the other members of the team. I wrote Perl scripts that would take the results of scans for potential credit card numbers, remove duplicate entries, and then separate the resulting list into separate card brands for archiving and shipping (based on the required process). We shared those with the team, and Chris and I worked together to teach others to use them.
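Those Perl scripts aren't reproduced here, but the dedupe-and-separate idea can be sketched in a few lines. The brand test below uses simplified leading-digit (IIN) prefixes, and the numbers are standard test card numbers, not real data:

```python
def brand(pan):
    """Very simplified card-brand test based on leading digits."""
    if pan.startswith(("34", "37")):
        return "amex"
    if pan.startswith("4"):
        return "visa"
    if pan[:2] in ("51", "52", "53", "54", "55"):
        return "mastercard"
    return "other"

def split_by_brand(hits):
    """De-duplicate scan hits, then bucket them by card brand."""
    buckets = {}
    for pan in sorted(set(hits)):  # dedupe, stable order
        buckets.setdefault(brand(pan), []).append(pan)
    return buckets

# Standard test numbers, with a duplicate hit from the scanner
hits = ["4111111111111111", "4111111111111111", "5555555555554444"]
print(split_by_brand(hits))
```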

So why does any of this matter? When I was taking the TQM training, we were told that Deming originally shared his thoughts on process improvement with his fellow Americans, who laughed him out of the country, but others (the Japanese) absorbed what he had to say because it made sense. In manufacturing processes, errors in the process can lead to increased cost, delays in delivery, and ultimately a poor reputation. The same is true for what we do. Through continual process improvement, we can move beyond where we are now, and provide a better, more comprehensive service in a timely manner.

In closing, use this as a starting point...a customer comes to you with an image, and says that they think that there's malware on the system, and that's it. Think about what you can provide them, in a report, at the end of 40 hours...5 days, 8 hrs a day of work. Based on what you do right now, and more specifically, the last malware engagement you did, how complete, thorough, and accurate will your report be?

Friday, January 07, 2011

Links and stuff

Windows Registry Forensics
The folks at Syngress tweeted recently that Windows Registry Forensics is due to be published this month! It's listed here on Amazon, along with editorial reviews from Troy Larson and Rob Lee. I, for one, cannot wait! Seriously.

A word about the book...if you're interested in an ebook/Kindle version, or if you have trouble getting the contents of the DVD with your ebook purchase, please contact the publisher first. Once the book has been sent in for printing, I (and authors in general) have very little to do with the book beyond marketing it in ways that the publisher doesn't.

Jesse Kornblum posted his Four Rules for Investigators recently. I would say that it was refreshing to see this, but I've gotta say, I've been saying most of the same things for some time...I think the big exception has been #2, and not because I disagree, but as a consultant, I generally assume that that's already been addressed and handled.

Jesse's other rules remind me a great deal of some of the concepts I and others have been discussing:

Rule 1 - Have a plan...that kind of sounds like "what are the goals of your investigation and how do you plan to address it with the data you have?"

Rule 2 - Have permission...definitely. Make sure the contract is signed before you forensicate.

Rule 3 - Write down what you do...Documentation! Now, I know some folks have said that they don't keep case notes, as those would be discoverable, and they don't want the defense counsel using their speculation and musings against them. Well, I'd suggest that that's not what case notes are about, or for. Case notes let you return to a case 6 months or a year later and see what you did, and even why. They also let someone else pick up where you left off, in case you get sick, or hit by a bus. What I really don't like seeing is the folks who say that they spent hours researching something that was part of a case, but they didn't document it, so they can't remember it...they then have to re-do all of that research the next time they encounter that issue. Also, consider what happens when one person on a 10-person team conducts research that takes 10 hrs to complete. If they don't document and share the results of the research, then the other 9 people on the team are going to spend a total of 90 hrs doing that research themselves...when the original research could have been shared via email, or in a 1/2 hr brown bag training session.

Rule 4 - Work on a copy...Always! Never work on the original data. I've had instances where immediately after I finished making copies of the images, the original media (shipped by the customer) died. Seriously. Now, imagine where I'd've been had I not followed procedure and made those copies...I'm sure the boss would've said, "...that's okay, because you made copies...right?" I'm generally one of those folks who follows procedure because it's the right thing to do, and I tend not to make arbitrary judgments as to when I will or won't follow the procedure.

Jesse isn't the only one saying these things. Take a look at Sniper Forensics: Part 1 over at the SpiderLabs Anterior blog. Chris has gotten a lot of mileage out of the Sniper Forensics presentations, and what his talks amount to include putting structure around what you do, and the KISS principle. That's "keep it simple, stupid", NOT listening to Love Gun while you forensicate (although I have done that myself from time to time).

Is it StuxNet, or is it APT?
I found this DarkReading article about targeted attacks tweeted about over and over again. I do agree with the sentiment of the article, particularly as the days of joyriding on the Information Superhighway are over with, my friends. No one is really deploying SubSeven any longer, just to mess with someone and open and close their CD-ROM tray. There's an economic driver behind what's going on, and as such, steps are being taken to minimize the impact of unauthorized presence on compromised systems. One thing's for appears that these skilled, targeted attacks are going to continue to be something that we see in the news.

USB Issues and Timelines
Okay, this isn't about the USB issues you might be thinking of...instead, it's about a question I get now and again, which is: why do all of the subkeys beneath the USBStor key in the System hive have the same LastWrite time? While I have noticed this, it hasn't been something that's pertinent to my exams, so I really haven't pursued it. I have seen where others have said that they've looked into it and found that the LastWrite time corresponded with an update.

Rather than speculating as to the cause, I thought what I'd do is recommend that folks who see this create a timeline. Use the file system metadata, LastWrite times from the keys in the System and Software hives, and Event Log data, to start. This should give you enough granularity to begin your investigation. I'd also think about adding Prefetch file metadata (if you have any Prefetch files...), as well as data from the Task Scheduler log (that is, if it says anything besides the Task Scheduler service starting...).

Tuesday, January 04, 2011

Accessing Volume Shadow Copies

Over the past months, I've had some work that involved Windows systems beyond XP...specifically, one Windows 7 system, and I had some work involving a Vista system. Now, most of what I looked at had to do with timelines, but that got me to thinking...with the number of systems now available with Windows 7 pre-installed (which is pretty much everything that comes with Windows pre-installed), how much do we really know about accessing Volume Shadow Copies (VSCs)?

Troy Larson, the senior forensic-y dude from Microsoft, has been talking about Volume Shadow Copies for quite some time. In his presentations, Troy has talked about accessing and acquiring VSCs on live systems, using George M. Garner, Jr's FAU dd.exe; however, this requires considerable available disk space.

Troy's SANS Forensic Summit 2010 presentation can be found here. In his presentation, Troy demonstrates (among other things) how to access VSCs remotely on live systems, using freely available tools.

ProDiscover - Chris Brown has a presentation available online (from TechnoSecurity 2010) in which he discusses using ProDiscover to access and mount Volume Shadow Copies on live systems...remotely. Pretty cool.

I ran across a QCCIS whitepaper recently that discusses mounting an acquired image using EnCase with the PDE module, accessing the VSCs using the same method Troy pointed out, and then copying files from the VSCs using robocopy. There are also a number of posts over at the Forensics from the sausage factory blog that address VSCs, and a couple that include the use of robocopy. As I often work with acquired images, being able to access VSCs within those images is something I'm very interested in being able to do. However, most of my online research points to folks using EnCase with the PDE module to mount their acquired images when demonstrating how to access VSCs within those images...and I don't have EnCase.

So...what if you don't have access to EnCase or the PDE module? How could you then access Volume Shadow Copies within an acquired image?

I started out with a host system that is a fresh install of Windows 7 Professional, 64-bit. The acquired image I started with is of a physical disk from a 32-bit Vista system; as it's an image of the physical disk, it has several partitions, including a maintenance partition. The acquired image is called "disk0.001". I also extracted the active Vista partition as a separate raw/dd image, calling it "system.001". I verified the file systems of both of these images using FTK Imager to ensure that I could 'see' the files.

So here are the tools I installed/used:
FTK Imager
ImDisk 1.3.1
Mount Image Pro (14 day trial)
Shadow Explorer 0.8.430.0

So the first thing I did was mount the image using FTK Imager 3.0, and noted the drive letter for the third this case, I:\. I opened a command prompt and used the 'dir' command to verify that I could access the volume. I then typed the following command:

vssadmin list shadows /for=i:

This got me an error message:

Error: Either the specified volume was not found or it is not a local volume.

Okay. I fired up ShadowExplorer, but the I:\ drive was not one of the options available for viewing.

I tried mounting the system.001 file, and then tried both image files again using ImDisk, and each time got the same result...running the vssadmin command I got the above error message. I also tried using the "Logical Only" option in FTK Imager 3.0's "Mount Type" option, and that didn't work, either. So, at this point, I was failing to even identify the VSCs, so I could forget accessing them.

I reached out to the QCCIS guys, and John responded that FTK Imager 3.0 seems to mount images so that they appear as remote/network drives to the host OS; as such, vssadmin doesn't 'see' VSCs on these drives. This also explains why ShadowExplorer doesn't 'see' the volumes, and why I get the same error message when using ImDisk. I also got in touch with Olof, the creator of ImDisk, and he said that ImDisk was written using the method for creating drive letters available in NT 4.0, prior to the Volume Mount Manager being included in Windows; as such, getting ImDisk to mount the volumes as local disks would require a re-write. Thanks to Olof for ImDisk, and thanks to Olof and the QCCIS guys for responding!

I then installed the VMWare VDDK 1.2 in order to get a version of vmware-mount that would run on Windows 7. I had booted the acquired image using LiveView, so I had a .vmdk file on my drive for this image. After installing the VDDK, I ran "vmware-mount /p", and clearly saw the 4 volumes within the image...I knew that I wanted to access volume 3. I then ran the following command:

vmware-mount /v:3 x: f:\vista\disk0.001.vmdk

This resulted in an error message stating that vmware-mount could not mount the virtual disk. Checking the log file that was produced, the first message I see is that the image file, disk0.001, "failed to open (38): AIOMgr_Open failed." I'm still researching this one...

Getting It To Work
So, at this point, I'm stuck...I want to access files within the VSCs in an acquired image, and I don't have EnCase/PDE. So far, my attempts to just see the VSCs have failed. So, I grabbed a copy of vhdtool.exe, which is available from MSDN (it is described as "unmanaged code"). Originally, I wanted to get a copy of this as I have XPMode installed on my Windows 7 Professional system, which means I have Virtual PC...but I don't want to boot the vhd file at this point, so that's not a required component. So I made a copy of system.001 to another storage location and ran vhdtool.exe with the "/convert" switch. This apparently adds a footer to the file...which I'd read about during my research, and is the reason I made a copy of system.001 to work with (don't want to muck up my original in case all of this doesn't work...know what I mean?). I should note here that running the tool adds the VHD footer to the file without changing the file even though I apparently now have a VHD file, I can still see only "system.001".
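If I understand the format correctly, what vhdtool appends is the standard 512-byte fixed-VHD footer, which begins with the eight-byte cookie "conectix"; a quick way to sanity-check a converted image is just to look for that cookie at the end of the file. This is a sketch, not a full VHD validator:

```python
def has_vhd_footer(path):
    """Check whether the last 512 bytes look like a fixed-VHD footer."""
    with open(path, "rb") as f:
        f.seek(-512, 2)  # the footer occupies the final 512 bytes of the file
        return f.read(8) == b"conectix"
```

Running this against the copy of system.001 before and after the "/convert" step should flip the result from False to True.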

Next, I opened the Computer Management interface in Windows 7 and fired up the Disk Manager. I then chose Action -> Attach VHD, and browsed to my new VHD file. Before clicking "OK", I made sure to check the "Read-only" box. I then had Disk2 added to Disk Manager, and the Volume listing included a new G:\ volume. In both instances, the drive icon was light blue, as opposed to the grey drive icon for the other drives on my system. When I ran the vssadmin command against the G:\ drive, I could see the VSCs! Oddly enough, the G:\ drive is NOT a visible option in ShadowExplorer.

Next, I ran the mklink command against the last VSC identified on the G:\ drive. To do this, I selected everything on the line following "Shadow Copy Name:"...the stuff that says "\\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy40".

mklink /d c:\vista \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy40\

Note: The final trailing "\" is EXTREMELY important!

From here, I could now access the files within the VSC. Several folks have mentioned using robocopy to get copies of files from within the VSCs, using something as simple as a batch file. See the QCCIS whitepaper for some good examples of the batch file. This is a great way to achieve data reduction...rather than acquiring an image of the VSC, simply mount it and access the files you need. Another idea would be to use RegRipper (include RegRipper's rip.exe in a batch file) or a forensic scanner to access and retrieve only the data you want. For example, when you access a user's UserAssist key and parse the information, if they've taken an action before (launched an application, etc.), you only see the date/time that they last took that action, and how many times they've done this. By accessing VSCs and running RegRipper, you could get historical information including the previous dates that they took those actions. Let's say that you use something similar to the batch file outlined in the QCCIS whitepaper, and include a command similar to the following (you may need to modify this to meet your environment):

rip.exe -p userassist -r D:\vsc%n\ > user_ua%n.txt

Now, this assumes that your VSCs are mounted with mklink under D:\vsc; %n refers to the VSC number.
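Generating that batch file programmatically is straightforward; this sketch just builds the mklink/rip/rmdir command lines for each shadow copy. The device numbers, mount points, and plugin choice are stand-ins...on a real system they'd come from your own 'vssadmin list shadows' output:

```python
def vsc_batch(shadow_ids):
    """Build the per-VSC command sequence as batch-file lines."""
    lines = []
    for n in shadow_ids:
        dev = r"\\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy" + str(n)
        # the trailing '\' on the mklink target is EXTREMELY important
        lines.append("mklink /d D:\\vsc{} {}\\".format(n, dev))
        lines.append("rip.exe -p userassist -r D:\\vsc{}\\ > user_ua{}.txt".format(n, n))
        lines.append("rmdir D:\\vsc{}".format(n))
    return lines

for line in vsc_batch([39, 40]):
    print(line)
```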

Something similar would also be true with respect to MRU lists within the Registry...we know that the LastWrite time for the key tells us when the most recently accessed file (for example) was accessed; accessing VSCs, we can see when the other files in the MRU list were accessed.

When you're done accessing the mounted VSC, simply remove the directory using rd or rmdir. Then go back to your Disk Manager, right-click the box to the left in the lower right pane, and select "Detach VHD".

If you need a command line method for doing this, take a look at can even include the list of commands you want to run in a script file (similar to ftp.exe) and run the script using:

diskpart /s <scriptfile>

This, of course, leads to other questions. For example, we've seen how, if you have a full image, you can use vhdtool to convert the image file to a vhd file, or extract the pertinent partition (using FTK Imager) and convert that raw/dd image to a vhd file. But, what if you have a vmdk file from a virtual environment? One option would be to use FTK Imager to re-acquire the file to raw/dd format; FTK Imager opens vmdk files, as well as EnCase format images, just fine.

There's a tool available from the VMToolkit (produced by the VMWare folks) that will reportedly convert vmdk files to vhd format. Apparently, Starwind has a free converter, and some have mentioned that WinImage should work, as well. I haven't tried any of these, so YMMV.

Monday, January 03, 2011

Links and Updates

It's been a while since I posted a list of links and resources from across the Internet. I thought that since things have been quiet toward the end of 2010, I'd post some of the things I'd run across and here goes...

Looks like Claus is back with an interesting update to his site. Claus hasn't been updating his site as much as he had done in the past, but it is always good to see his posts. A lot of what Claus posts that is oriented toward forensics is from an admin's perspective, which is great for a guy like me...I'm not an admin (nor do I play one on TV), so I often find that it's good to get a reminder of the admin's perspective. Besides, Claus always seems to be able to find the really good stuff...

One of the interesting things I found in Claus's post was the mention of a new mounting tool, OSFMount, for mounting images. I find it useful to be able to do this, and have been using FTK Imager 3.0. Claus also mentions in his post that ImDisk was updated OSFMount, it comes with a 64-bit version, in addition to the 32-bit version.

So, what does this tell us about image mounting tools? There are several other free and for-pay tools, some of varying quality, and others with vastly greater capabilities. So why does it seem that there's an increase in the number of tools that you can use to mount images? After all, you can use LiveView to convert a raw dd image to a vmdk and open it in VMPlayer, or you can use vhdtool to convert a raw dd image to a vhd and open it in MS's Virtual PC, which is freely available.

I watched the e-Evidence site for a long time and didn't see any updates...while I wasn't watching, Christine updated the site with a lot of great reading material back in November. This site has always been a great source for information.

Based on a link from the e-Evidence site, I did some reading about mounting images, and accessing and recovering data from Volume Shadow Copies. The first resource I looked at was a whitepaper that provides an explanation of what the Volume Shadow Service does, along with a simple example (albeit without a great deal of exacting detail) of mounting and extracting data from shadow copies. This is a good way to get started, and I've started looking at ways to implement far, I've used Windows 7 Professional 64-bit as a base system, mounted an image (with FTK Imager 3.0) that includes a Vista 32-bit volume, and not been able to access the shadow copies. I'll be trying some different things to see if I can mount images/volumes in order to access the Volume Shadow Copies.

Malicious Streams
This site isn't strictly fact, it's decidedly focused on MacOSX. However, it contains information about PDF malware, a bit of code geared toward Windows systems, and some good overall reading. Also, the author is working on a version of autoruns for MacOSX, and I hope that this gets released as a full version early this year, as it would be a great way to start things off in 2011.

Derek Newton's list of Forensic Tools
Open Source Digital Forensics Site
LNK Parser written in Python