I recently shared my findings from an analysis challenge that Ali posted, and after publishing my post, I found out that Adam had also shared his findings. Looking at both posts, it's clear that there are two different approaches, and as I read through Adam's findings, it occurred to me that this is an excellent example of what I've seen time and time again in the industry.
Adam and I have never met, and I know nothing at all about his background. I only know that Adam's blog has (at the time of this writing) a single post, from 2 Jan 2019. On the other hand, my blogging goes back to 2004, and I've been in the industry for over 21 years, with several years of service prior to that. All of this is simply to point out that Adam and I each come to the table with a different perspective, and given the same data, will likely approach it differently.
A good deal of my experience is in the consulting arena, meaning that engagements usually start with a phone call, and in most cases, it's very likely that the folks I work with weren't the first to be called, and won't be the last. Yes, I'm saying that in some cases, consumers of digital analysis services shop around. This isn't a bad thing; it simply means that they're looking for the "best deal". As such, cases are "spec'd out" based on a number of hours...not necessarily the number of hours it will take to complete the work, but rather the number of hours a customer is willing to purchase for the work they want done. The natural outcome of this is that once the customer's questions have been established, the analyst assigned the work needs to focus on answering those questions. Otherwise, time spent pursuing off-topic issues or "rabbit holes" increases the time it takes to complete the work, time that isn't billed to the customer. As such, the effective hourly rate drops, and it can drop to the point where the company loses money on the work.
All of this is meant to say that without an almost pedantic focus on the questions at hand, an analyst is going to find themselves not making friends; reports won't be delivered on time, additional analysts will need to be assigned to actually complete the work, and someone may have their approved vacation rescinded (I've actually seen this happen...) in order to get the work done.
When I read the challenge, my focus was on the text that the admin saw and reported. As such, my analysis goal, before even downloading the challenge image, was to determine the location of the file on the system, and then determine how it got there.
However, I approached Ali's first question of how the system was hacked a little differently; over two decades, I've dealt with a lot of customers who've asked how something was "hacked", and I've had to keep in mind that their use of the term is different from mine. For me, "hacked" refers to exploiting a vulnerability to gain access to a system, escalate privileges, and/or take actions for which the system and data were never intended. I didn't go into the analysis assuming that the system was "hacked"...I approached it from the perspective that the main effort of my analysis was the message that the admin had reported, and any "hacking" of the system would be within some modicum of temporal proximity to that file being created and/or modified. As such, a timeline was in order, and this approach helped me with question #2, regarding "evidence". In fact, creating micro-timelines or "overlays" allowed me to target my analysis. At one point, I created a timeline of just logon events from the Security Event Log. In order to prove that the Administrator account was used to create the target readme.txt file, I created micro-timelines for both user profiles, using just web browser and Registry (specifically, shellbags and RecentDocs) data.
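For anyone who hasn't built one, a "micro-timeline" can be as simple as filtering a consolidated events file down to a single artifact category. Below is a minimal sketch of how that filtering might look; it assumes a pipe-delimited, TLN-style line format and matches a handful of common Security Event Log logon/logoff event IDs, so the field layout and matching strings would need to be adjusted to whatever your own tooling produces.

```python
# Minimal sketch: carve a logon-only "micro-timeline" out of a larger events file.
# Assumes pipe-delimited, TLN-style lines (time|source|system|user|description);
# adjust the field layout and matching strings for your own tooling.
import re
import sys

# Common Windows Security Event Log logon/logoff event IDs
LOGON_IDS = ("4624", "4625", "4634", "4647", "4648")

def micro_timeline(events_file, out_file):
    # Keep lines from the Security Event Log whose description carries one of
    # the logon-related event IDs; the exact strings depend on the parser used.
    pattern = re.compile(r"Security.*\b(%s)\b" % "|".join(LOGON_IDS))
    with open(events_file) as src, open(out_file, "w") as dst:
        for line in src:
            if pattern.search(line):
                dst.write(line)

if __name__ == "__main__":
    # e.g., python micro_timeline.py events.txt logon_events.txt
    micro_timeline(sys.argv[1], sys.argv[2])
```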
From my overall timeline, I found when the "C:\Tools\readme.txt" file had been created, and I then used that as a pivot point for my analysis. This is how I arrived at the finding that the Administrator account had been used to create the file.
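As a rough illustration of what I mean by a pivot point, the sketch below pulls every timeline entry within a window around a single timestamp. It assumes the first pipe-delimited field of each line is a Unix epoch timestamp, and the pivot time and window size shown are purely hypothetical placeholders, so it would need to be adapted to the actual timeline format in use.

```python
# Minimal sketch: extract timeline entries within a window around a pivot time.
# Assumes the first pipe-delimited field of each line is a Unix epoch timestamp;
# the pivot time and 30-minute window below are arbitrary placeholders.
import datetime

def pivot(timeline_file, pivot_epoch, window_minutes=30):
    lo = pivot_epoch - window_minutes * 60
    hi = pivot_epoch + window_minutes * 60
    with open(timeline_file) as fh:
        for line in fh:
            try:
                ts = int(line.split("|", 1)[0])
            except ValueError:
                continue  # skip headers or malformed lines
            if lo <= ts <= hi:
                yield line.rstrip("\n")

if __name__ == "__main__":
    # Hypothetical pivot: substitute the actual readme.txt creation time (UTC).
    creation = int(datetime.datetime(2019, 1, 6, 12, 0, 0,
                                     tzinfo=datetime.timezone.utc).timestamp())
    for entry in pivot("timeline.txt", creation):
        print(entry)
```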
From my perspective, the existence of additional text files (i.e., within a folder on the "master" profile desktop), who created them and when, the modification to magnify.exe, and the system time change all fell into the same bucket for question #4 (i.e., "anything you would like to add"). All of these things fell out of the timeline rather easily, but I purposely did not pursue further analysis of things such as the execution of magnify.exe, net.exe, and net1.exe, as I had already achieved my analysis goal.
Again, in my experience as a consultant, completing analysis work within a suitable time frame (one that allows the business to achieve its margins) hinges upon a focus on the stated analysis goals.
I've seen this throughout my career...a number of years ago, I was reviewing a ransomware case as part of an incident tracking effort, and noticed in the data that the same vulnerability used to compromise the system prior to ransomware being deployed had been used previously to access the system and install a bitcoin miner. As the customer's questions specifically focused on the ransomware, analysis of the bitcoin miner incident hadn't been pursued. This wasn't an issue...the analyst hadn't "missed" anything. In fact, the recommendations in the report that applied to the ransomware issue applied equally well to the bitcoin miner incident.
A Different Perspective
It would seem, from my reading (and interpretation) of Adam's findings, that his focus was more on the "hack". Early on in his post, Adam's statement of his analysis goals corresponded very closely to my understanding of the challenge:
We are tasked with performing an IR investigation from a user who reported they've found a suspicious note on their system. We are given only the contents of the message (seen below) without its file path, and also no time at which the note was left.
...and...
Since we can understand that this note was of concern to the user, it is very important to start developing a time frame of before the note was created to understand what led to this point. This will allow the investigator to find the root cause efficiently.
Adam went on to describe analyzing shellbags artifacts, but there was no indication in his write-up that he'd done a side-by-side mapping of the readme.txt file's date/time stamps to the shellbag artifacts. Shortly after that, the focus shifted to magnify.exe, and away from the text file in question.
Adam continued with a perspective that you don't often see in write-ups or reports; not only did he look up the hash of the file on VT, but he also demonstrated, in a virtual machine, his hypothesis regarding how magnify.exe might have been used.
In the end, however, I could not find where the question of how the "C:\Tools\readme.txt" file came to be on the system was clearly and directly addressed. It may be there; I just couldn't find it.
Final Words
I did engage with the challenge author early this morning (9 Jan), and he shared with me the specifics of how different files (the files in a folder on one user's desktop) were created on the system. I think that one of the major comments I shared with Ali was that not only was this challenge representative of what I've seen in the industry, but so are the shared findings. No two analysts come to the table with the exact same experience and perspectives, and left to their own devices, no two analysts will approach and solve the same analysis (or challenge) the same way. But this is where documentation and sharing are most valuable; by working challenges such as these, either separately or together, and then sharing our findings publicly, we can all find a way to improve every aspect of what we do.
1 comment:
No one should ever underestimate the power of a 'second set of eyes' on a problem. I believe that the sweet spot for how many sets of eyes is best for a specific type of problem is a moving target, but I am certain that one set of eyes is never best, yet it is the most common.
A different perspective (a free set of eyes, a second set of eyes, an impartial perspective, etc...) will see something completely different from what the original person, who has spent hours or weeks looking at the problem, sees.
In my first book, I wrote "Having persons new to your case look at your work may be effective for them to see something you overlooked or just didn’t put together. This is not a sign of weakness to ask, but instead it is a sign of being a good investigator."
I can't count how many times someone else has pointed something out to me that I completely missed, but which was so obvious once it was pointed out. I also can't count how many times I have done that for others. You'd be surprised at how many crimes have been solved by an officer/detective/agent walking by someone's desk and pointing to something that solves a case in seconds, simply because they're looking at the big picture to see the forest, rather than counting the leaves on the trees.