Monday, August 29, 2011

Updates and Links

Report Writing
One of the hardest parts of what we do is writing reports; technical people hate to write.  I've seen this fact demonstrated time and again over the years.

Paul Bobby wrote up a very interesting blog post about criteria for an effective report.  As I read through it, I found myself agreeing, and by the time I got to the end of the post, I noticed that there were some things I see in a lot of reports that hadn't been mentioned.

One section of the post that caught my eye was the "Withstand a barrage of employee objections" section...I think that this can be applied to a number of other examinations.  For example, child pornography (CP) cases will sometimes result in "Trojan Defense" or remote access claims (I've seen both).  Adding the appropriate checklists (and training) to your investigative process can make answering these questions before they're asked an easy-to-complete task.

At the end of the post, Paul mentions adding opinions and recommendations; I don't so much have an issue with this, per se, as long as the opinions are based on and supported by clearly documented analysis and findings, and are clearly and concisely described in the report.  In many of the reports I've reviewed over the years, the more prolific the author attempts to be, the less clear the report becomes.  Also, invariably, the report becomes more difficult for the author to write.

CyberSpeak Podcast
Ovie's posted another CyberSpeak podcast, this one with an interview of Chris Pogue, author of the "Sniper Forensics" presentations.  Chris talks about the components of "Sniper Forensics", including Locard's Exchange Principle and the Alexiou Principle.

Another thing that Chris talks about is Occam's Razor...specifically, Chris (who loves bread pudding, particularly a serving the size of your head...) described a situation that we're all familiar with: an analyst finds one data point, and then jumps to a conclusion as to the meaning of that data point, not realizing that the conclusion is supported by that one data point and a whole bunch of assumptions.  When I find something that is critical to addressing the primary goal of my examination, I tend to look for other supporting artifacts that provide context, as well as a stronger relative level of confidence in the data I'm looking at, so that I can get a better understanding of what actually happened.

At the beginning of the podcast, Ovie addresses having someone review your analysis report before heading off to court...sort of a peer review.  Ovie said that Keith's mention (in a previous podcast) of this review probably referenced folks in your office, but this sort of thing can also include trusted outside analysts.  Ovie mentioned that you have to be careful about this, in case the analyst then goes about talking/blogging about their input to your case.  I agree that this could be an issue, but I would also suggest that if the analyst is trusted, then you can trust them not to say anything.

One thing to remember from the podcast is that there is no such thing as a "court-approved" tool...the term is simply marketing hype.

Finally, Chris...HUGE thanks for the RegRipper (and ripXP) shout-out!  And a HUGE thanks to Ovie and the CyberSpeak team for putting together such a great resource to the community.

Morto
I recently blogged regarding Jump Lists, and in that post pointed out which artifacts are available when a user uses the Remote Desktop Client to connect to other systems via RDP.  Another thought as to how this might be useful came with F-Secure's announcement of a worm called Morto, which appears to use RDP to spread.  Where Jump Lists might come into play is if RDP connections are observed between systems (or in the logs of the system being accessed), and an examination shows no Jump Lists associated with the Remote Desktop Client for the primary user on that system.  This goes back to what I was referring to earlier in this post...let's say you see repeated RDP connections between systems, and go to the system from which they originated.  Do you assume that the connections were the result of malware, or of the user?  Examining the system will provide you with the necessary supporting information, giving you that context.
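As a quick illustration of that check, here's a sketch in Perl...note that the AppID 1bc392b8e104a00e is the one commonly attributed to mstsc.exe in published AppID lists, and both that value and the profile path layout (Windows 7) are assumptions you'd want to verify against your own systems:

#!/usr/bin/perl
# Sketch only: look for a Remote Desktop client Jump List in a user profile.
# The AppID below comes from published AppID lists for mstsc.exe and should
# be verified; the path layout assumes a Windows 7 profile.
use strict;
use warnings;

my $profile = shift || die "Usage: $0 <path to user profile>\n";
my $jl = $profile."\\AppData\\Roaming\\Microsoft\\Windows\\Recent\\".
         "AutomaticDestinations\\1bc392b8e104a00e.automaticDestinations-ms";

if (-f $jl) {
	my $mtime = (stat($jl))[9];
	print "RDP client Jump List found; last modified ".gmtime($mtime)." UTC\n";
}
else {
	print "No RDP client Jump List for this user.\n";
}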

Mentions of Morto can also be found at Rapid7, as well as at the MMPC.


NoVA Forensics Meetup Reminder
The next NoVA Forensics Meetup is set for 7 Sept.  We're scheduled to have a presentation on botnets from Mitch Harris...I'm really looking forward to it!

Tools
I posted recently regarding StickyNotes analysis, and have since completed my own StickyNotes parser.  It works very well, and I've written it so that the output is available in listing, CSV, and TLN formats.  Not only does it print out information about the notes embedded within the StickyNotes.snt file, but it also provides the modification date/time for the "Root Entry" of the .snt file itself.  This would be useful if the user had deleted all of the sticky notes, as it would still provide an indication of user activity on the system (i.e., the user would have to be logged in to delete the sticky notes).  In order to write this tool, I followed the MS OLE/Compound Document binary format spec and wrote my own module to parse the Sticky Notes.  As I didn't use any proprietary modules (only the core Perl seek(), read(), and unpack() functions), the tool should be cross-platform.
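To give a feel for what that looks like, here's a stripped-down sketch of reading the compound document header with just those core functions...this isn't sn.pl itself, just an illustration of the approach, with offsets taken from the spec:

#!/usr/bin/perl
# Sketch: read and sanity-check the OLE/Compound Document header (the first
# 512 bytes of the file) using only core Perl; offsets are per the spec.
use strict;
use warnings;

my $file = shift || die "Usage: $0 <file.snt>\n";
open(FH, "<", $file) || die "Could not open $file: $!\n";
binmode(FH);
seek(FH, 0, 0);
read(FH, my $hdr, 512) == 512 || die "Short read\n";
close(FH);

# Bytes 0-7 must be the compound document signature D0 CF 11 E0 A1 B1 1A E1
die "Not an OLE compound document\n"
	unless unpack("H16", substr($hdr, 0, 8)) eq "d0cf11e0a1b11ae1";

my $sec_shift = unpack("v", substr($hdr, 30, 2)); # usually 9 => 512-byte sectors
my $dir_sect  = unpack("V", substr($hdr, 48, 4)); # SecID of first directory sector

printf "Sector size    : %d bytes\n", 1 << $sec_shift;
printf "Directory start: sector %d (file offset %d)\n",
	$dir_sect, 512 + ($dir_sect << $sec_shift);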

Anyway, the tool parses the notes out of the .snt file and presents information such as the creation and modification dates, and the contents of the text stream (not the RTF stream) of each note.  It also displays the modification date for the Root Entry of the OLE document, as well...

C:\Perl\sticky>sn.pl -f stickynotes.snt
Root Entry
  Mod Date     : Fri Aug 26 11:51:35 2011

Note: a4aed27b-cfd9-11e0-8
  Creation Date: Fri Aug 26 11:51:35 2011
  Mod Date     : Fri Aug 26 11:51:35 2011
  Text: Yet another test note||1. Testing is important!

Note: e3a17883-cfd8-11e0-8
  Creation Date: Fri Aug 26 11:46:18 2011
  Mod Date     : Fri Aug 26 11:46:18 2011
  Text: This is a test note

I also have CSV and TLN (shown below) output formats:

C:\Perl\sticky>sn.pl -f stickynotes2.snt -t
1314359573|StickyNote|||M... stickynotes2.snt Root Entry modified

In the above example, all of the notes had been deleted from the .snt file, so the only information that was retrieved was the modification date of the Root Entry of the document.
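For those wondering about the first field in the TLN line, it's a 32-bit Unix epoch time; the timestamps in the OLE directory entries are 64-bit FILETIME values (100-nanosecond intervals since 1 Jan 1601 UTC), so the conversion looks something like the following (the subroutine name is mine, for illustration, and not necessarily what's in the tool):

# Sketch: convert a 64-bit Windows FILETIME to Unix epoch for the TLN field.
# unpack("VV") on the raw 8 bytes returns the low and high 32-bit halves.
sub filetime_to_epoch {
	my ($lo, $hi) = @_;
	my $ft = ($hi * 4294967296) + $lo;           # reassemble 64-bit value (2**32)
	return int($ft / 10_000_000) - 11644473600;  # exact on a 64-bit Perl
}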

Addendum: I've posted the Windows binary of the Sticky Notes parsing tool to my Google Code site.  Note that all times are displayed in UTC format.

2 comments:

  1. "In many of the reports I've reviewed over the years, the more prolific the author attempts to be, the less clear the report becomes."

    That's so true. One book that really drives this point home is Why Business People Speak Like Idiots: A Bullfighter's Guide. It's especially great in audio: http://www.amazon.com/Business-People-Speak-Like-Idiots/dp/B0009RFWTI

  2. A colonel I once worked for said that if you can't summarize an issue in 3-5 bullets on an index card, you don't know enough about that issue.

    Some in the community have been trained that customers want volume, and that simply is not the case. Why write a 30+ page report when 5 pages will do AND actually be read?
