Wednesday, July 11, 2007

ACPO Guidelines

The Association of Chief Police Officers (ACPO), in association with 7safe, has recently released their updated guide to collecting electronic evidence. While the entire document makes for an interesting read, I found pages 18 and 19, "Network forensics and volatile data", the most interesting.

The section begins with a reference back to Principle 2 of the guidelines, which states:

In circumstances where a person finds it necessary to access original data held on a computer or on storage media, that person must be competent to do so and be able to give evidence explaining the relevance and the implications of their actions.

Sounds good, right? We should also look at Principle 1, which states:

No action taken by law enforcement agencies or their agents should change data held on a computer or storage media which may subsequently be relied upon in court.

Also, Principle 3 states:

An audit trail or other record of all processes applied to computer-based electronic evidence should be created and preserved. An independent third party should be able to examine those processes and achieve the same result.

Okay, I'm at a loss here. Collecting volatile data inherently changes the state of the system, as well as the contents of the storage media (e.g., Prefetch files, Registry contents, the pagefile, etc.), and the process used to collect the volatile data cannot later be used by a third party to "achieve the same result", as the state of the system at the time the data is collected cannot be reproduced.

That being said, let's move on...page 18, in the "Network forensics and volatile data" section, includes the following:

By profiling the forensic footprint of trusted volatile data forensic tools, an investigator will be in a position to understand the impact of using such tools and will therefore consider this during the investigation and when presenting evidence.

It's interesting that this says "profiling the forensic footprint", but says nothing about error rates or statistics of any kind. I fully agree that this sort of thing needs to be done, but I would hope that it would be done and made available via a resource such as the ForensicWiki, so that not every examiner has to run every test of every tool.
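For the sake of illustration, here's a rough sketch (Python, with hypothetical paths and a placeholder tool name) of what profiling a tool's footprint on storage media might look like on a test system...a real profile would also have to cover Registry keys, the pagefile, and file system metadata:

# Hypothetical sketch: profile what a "trusted" volatile-data tool leaves behind
# in the Prefetch directory on a TEST system. The tool path is a placeholder.
import os
import subprocess

PREFETCH_DIR = r"C:\Windows\Prefetch"
TOOL_CMD = [r"D:\trusted_tools\psloggedon.exe"]   # tool being profiled (placeholder)

def snapshot(directory):
    """Map each file name to (size, last-modified time) for easy diffing."""
    state = {}
    for name in os.listdir(directory):
        path = os.path.join(directory, name)
        if os.path.isfile(path):
            st = os.stat(path)
            state[name] = (st.st_size, st.st_mtime)
    return state

before = snapshot(PREFETCH_DIR)
subprocess.run(TOOL_CMD, check=False)    # run the tool on the test system
after = snapshot(PREFETCH_DIR)

added = sorted(set(after) - set(before))
changed = sorted(n for n in before if n in after and before[n] != after[n])
print("Prefetch files added:   ", added)
print("Prefetch files modified:", changed)

Point the same before/after diff at an exported Registry hive or the pagefile's metadata and the "footprint" becomes a documented artifact rather than a guess...exactly the sort of thing that could be shared via the ForensicWiki.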

Here's another interesting tidbit...

Considering a potential Trojan defence...

Exactly!

Continuing on through the document, I can't say that I agree with the sequence given for collecting volatile data...specifically, the binary dump of memory should really be first, not last. This way, you can collect the contents of physical memory in as near a pristine state as possible. I also have to question the use of the term "bootable" to describe the platform from which the tools should be run, as booting to that media would inherently destroy the very volatile data you're attempting to collect.
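To illustrate the ordering point, here's a hedged sketch of a collection driver...the tool names and output paths below are placeholders, not recommendations...that grabs physical memory first and keeps the audit trail that Principle 3 asks for:

# Hypothetical collection driver: physical memory first, then progressively less
# volatile data. Each step is logged with a UTC timestamp and exit code so an
# audit trail exists (Principle 3). Tool names and paths are placeholders.
import datetime
import subprocess

TOOLS = r"D:\trusted_tools"   # tools run from removable media, not the suspect drive
OUT = r"E:\output"            # results go to collection media, not the suspect drive

# Ordered roughly by volatility: memory first, configuration-type data last.
STEPS = [
    ("physical_memory_dump", [TOOLS + r"\memdump.exe", OUT + r"\memdump.bin"]),
    ("network_connections",  [TOOLS + r"\netstat.exe", "-ano"]),
    ("process_list",         [TOOLS + r"\tlist.exe"]),
    ("logged_on_users",      [TOOLS + r"\psloggedon.exe"]),
]

with open(OUT + r"\collection_log.txt", "a") as log:
    for step_name, command in STEPS:
        started = datetime.datetime.utcnow().isoformat()
        result = subprocess.run(command, capture_output=True)
        # Save whatever the tool printed; the memory dumper writes its own file.
        with open(OUT + "\\" + step_name + ".txt", "wb") as out:
            out.write(result.stdout)
        log.write("%s\t%s\texit=%s\n" % (started, step_name, result.returncode))

Booting the system from the collection media, by contrast, would throw all of the above away before the first step ever ran.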

Going back to my concerns (the part where I said I was "at a loss") above, I found this near the end of the section:

By accessing the devices, data may be added, violating Principle 1 but, if the logging mechanism is researched prior to investigation, the forensic footprints added during investigation may be taken into consideration and therefore Principle 2 can be complied with.

Ah, there we go...so if we profile our trusted tools and document what their "forensic footprints" are, then we can identify our (the investigators') footprints on the storage media, much like a CSI following a specific route into and out of a crime scene, so that she can say, "Yes, those are my footprints."

Thoughts?

8 comments:

Anonymous said...

I've missed your reports! It's pretty much a "given" today that live acquisition is acceptable in a number of circumstances that you know better than I. It seems that the guidelines are dated in some respects, and a qualified examiner should be able to justify a live exam done with tested tools in a manner such as described in your book.

Consider a tool like X-Ways Capture. It acquires data (live) in an acceptable order. It produces a log of its steps, it's configurable, and it leaves a small footprint. The size of one's foot, however, varies from system to system. You can run a few tests and determine the data written to storage devices, e.g., the prefetch file you described. I'm not sure whether you can measure the footprint in RAM with any reliability.
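A rough sketch of that kind of test might look like this (the paths and tool name are made up): note the start time, run the tool, then walk the test volume and report every file modified since the start. A thorough profile would also watch the Registry hives and NTFS metadata.

# Hypothetical test-system check: note the start time, run the acquisition tool,
# then walk the test volume and report every file modified since the start, to
# document the tool's on-disk footprint. Paths are illustrative only.
import os
import subprocess
import time

TEST_ROOT = "C:\\"
ACQUIRE_CMD = [r"D:\capture\acquire.exe"]   # placeholder for the acquisition tool

start = time.time()
subprocess.run(ACQUIRE_CMD, check=False)

touched = []
for dirpath, dirnames, filenames in os.walk(TEST_ROOT):
    for name in filenames:
        path = os.path.join(dirpath, name)
        try:
            if os.path.getmtime(path) >= start:
                touched.append(path)
        except OSError:
            pass   # locked system files, reparse points, etc.

print("Files written or modified during the run:")
for path in sorted(touched):
    print(" ", path)

Of course, that only shows what was written to live files...it says nothing about what may have been overwritten in unallocated space, which brings me to the devil's advocate question.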

Playing devil's advocate, perhaps your opponent will claim that you overwrote key evidence when you consumed unallocated space by writing a prefetch file. Maybe the latest version of a key document was in memory, but was overwritten by your acquisition tool. What else? I'm looking for the downside issues, as they may lead to better preparation.

BTW, I've been told that KNTdd will in fact acquire Vista RAM, at least with the Home editions. There's a discount for LE. There's an interesting thread on the Digital Detective board, where Craig also alerted members to your book. :-)

H. Carvey said...

I'm not sure whether you can measure the footprint in RAM with any reliability.

I agree. Further, many tools will complete their task and exit, which means that very quickly, the process will be created, memory allocated and used, and then that memory will be released for use by other processes.

Not only will the number of memory pages consumed vary based on software load (what's installed and running) and the OS version, but it will be difficult enough to identify on a test system...let alone on the real-world systems we respond to.
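About the best you can do, I think, is poll the tool's own process while it runs...something like the sketch below, which assumes the third-party psutil module and a placeholder tool path...and even then, all you get is the working set at the instants you happened to sample it:

# Rough illustration only: launch a tool and sample its working set until it
# exits. The numbers are a point-in-time view, not a reliable "RAM footprint".
# Assumes the third-party psutil module; the tool path is a placeholder.
import subprocess
import time
import psutil

proc = subprocess.Popen([r"D:\trusted_tools\psloggedon.exe"])
ps = psutil.Process(proc.pid)
samples = []
while proc.poll() is None:              # keep sampling until the process exits
    try:
        samples.append(ps.memory_info().rss)
    except psutil.NoSuchProcess:
        break
    time.sleep(0.01)

if samples:
    print("peak working set observed: %d KB" % (max(samples) // 1024))
else:
    print("the process exited before a single sample could be taken")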

Maybe the latest version of a key document was in memory...

Perhaps, but when a process is loaded into memory, pages that are actively used are not overwritten...if that were the case, we'd have all sorts of issues and system failures when someone loaded Solitaire. This means that whatever memory is available for reuse will not be associated with an active, running process or thread of execution. The latest version of a key document may be in memory, but if the process being used to create it (e.g., Excel, Word, Notepad) is still running, those memory pages won't be overwritten...at worst, they'll be written out to the pagefile.

..I've been told that KNTdd will in fact acquire Vista RAM...

I was in the beta test program, so I can say that yes, kntdd will allow you to acquire RAM from Windows 2003 SP1 and Vista systems. I had to purchase my copy of the tool, and as I've recently installed Vista into a VMWare session, I'll be able to test this.

...Craig also alerted members to your book...

Craig who? I'll have to thank him...

Anonymous said...

Craig who? I'll have to thank him...

Craig Wilson, the list admin and author of NetAnalysis. Craig has just written a beta RAM imager.

H. Carvey said...

Jimmy...is this beta RAM imager available for preview?

Anonymous said...

Yes. Craig has asked for interested persons to email and request a copy of MemGrab. I'll send his address to you off-blog.

Oh, I forgot to mention that the thread on the DD forum discusses some very interesting findings on acquiring artifacts from memory after the machine has been shut down.

H. Carvey said...

Jimmy,

Can you tell me which thread this is? I'm a member of the board, and I'd like to review this...it's great that you point it out, but it would be immensely more helpful if you could tell me *where* it is, rather than just that it is. ;-) Thanks.

Anonymous said...

Sorry, Harlan. I didn't know you were on the board, though it was obvious that you'd be a valuable resource. (I still wish there was a way to get an email when you/I/anyone comments on a followed blog post. I'd be better able to post back quickly.)

The link is http://www.digital-detective.co.uk/cgi-bin/digitalboard/YaBB.pl?num=1184139817;start=all
"Recovery of data from RAM"

H. Carvey said...

Thanks, I found it...while it is interesting, I do see a lot of the same questions/concerns/misunderstandings with RAM acquisition that I see in other forums...and the same folks addressing and attempting to answer them! ;-)