Thoughts on Detection Engineering
I read something online recently that suggested that the role of detection engineering is to reduce the false positive (FP) alerts sent to the SOC. In part, I agree with this; however, "cyber security" is a team sport, and it's really incumbent upon SOC and DFIR analysts to support the detection engineering effort through their investigations. This is something I addressed a bit ago in this blog, first here, and then here.
The overall point here is that the SOC shouldn't be inundated or overwhelmed with false positive (FP) detections. Rather, the SOC should be collecting the necessary metrics (through an appropriate level of investigation) to definitively demonstrate that the detections are FPs, and then feed that data directly into the DFIR cycle to collect and analyze the necessary information to determine how best to address those FPs.
One example of the use of such a process, although not related to false positives, can be seen here. Specifically, Huntress ThreatOps analysts were seeing a lot of malware (in particular, but not solely restricted to, Qakbot) on customer systems that seemed to originate from phishing campaigns that employed disk image file attachments. One of the things we did was create an advisory for customers, providing a means to disable the ability for users to simply double-click ISO, IMG, or VHD files and automatically mount them. Users are still able to access the files programmatically; they just can't mount them by double-clicking them.
While this specific event wasn't related to false positives, it does illustrate how taking a deeper look at an issue or event can provide something of an "upstream remediation", approaching and addressing the issue much earlier in the attack chain.
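As a rough sketch of what such a change can look like, one commonly published approach is to add an empty ProgrammaticAccessOnly value to the "mount" verb under the disk image ProgIDs; note that the exact ProgIDs shown below (Windows.IsoFile for .iso/.img, Windows.VhdFile for .vhd) are an assumption on my part and can vary by Windows version, so verify them before deploying anything:

Windows Registry Editor Version 5.00

; Hide the double-click "mount" verb; programmatic access remains available
[HKEY_CLASSES_ROOT\Windows.IsoFile\shell\mount]
"ProgrammaticAccessOnly"=""

[HKEY_CLASSES_ROOT\Windows.VhdFile\shell\mount]
"ProgrammaticAccessOnly"=""

With that value in place, double-clicking the file no longer triggers the mount verb, but scripts and tools can still invoke it programmatically, which is the behavior described above.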
Podcasts
If you're into podcasts, Zaira provided me the wonderful opportunity to appear on the Future of Cyber Crime podcast! It was a great opportunity for me to engage with and learn from Zaira! Thank you so much!
Recycle Bin Persistence
D1rkMtr recently released a Windows persistence mechanism (tweet found here) based on the Recycle Bin. This one is pretty interesting, not just in its implementation, but also in that you have to wonder how someone on the DFIR side of that persistence mechanism would even begin to investigate it.
I know how I would...I created a RegRipper plugin for it, one that will be run automatically on every investigation and will provide an analysis tip so I never forget what it's meant to show.
recyclepersist v.20230122
(Software, USRCLASS.DAT) Check for persistence via Recycle Bin
Category: persistence (MITRE T1546)
Classes\CLSID\{645FF040-5081-101B-9F08-00AA002F954E}\shell\open\command not found.
Classes\Wow6432Node\CLSID\{645FF040-5081-101B-9F08-00AA002F954E}\shell\open\command not found.
Analysis Tip: Adding a \shell\open\command value to the Recycle Bin will allow the program to be launched when the Recycle Bin is opened. This key path does not exist by default; however, the \shell\empty\command key path does.
Ref: https://github.com/D1rkMtr/RecyclePersist
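If you want to eyeball this outside of RegRipper, here's a minimal standalone Perl sketch (not the plugin itself) that uses Parse::Win32Registry to look for the same key paths in an offline Software hive; the paths are taken from the plugin output above, and the error handling is bare-bones:

#!/usr/bin/perl
# Quick check of an offline Software hive for Recycle Bin persistence
use strict;
use warnings;
use Parse::Win32Registry;

my $hive_file = shift @ARGV or die "Usage: $0 <Software hive>\n";
my $reg  = Parse::Win32Registry->new($hive_file) or die "Cannot parse $hive_file\n";
my $root = $reg->get_root_key();

# 64-bit and 32-bit (Wow6432Node) class registrations
my @paths = ('Classes\CLSID\{645FF040-5081-101B-9F08-00AA002F954E}\shell\open\command',
             'Classes\Wow6432Node\CLSID\{645FF040-5081-101B-9F08-00AA002F954E}\shell\open\command');

foreach my $path (@paths) {
    if (my $key = $root->get_subkey($path)) {
        # The (Default) value holds the command line launched when the Recycle Bin is opened
        my $val = $key->get_value("");
        print $path." -> ".(defined $val ? $val->get_data() : "(no default value)")."\n";
    }
    else {
        print $path." not found.\n";
    }
}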
Plugins
Speaking of RegRipper plugins, I ran across this blog post recently about retrieving Registry values to decrypt files protected by DDPE. While the overall post was fascinating in the approach taken, for me the biggest statement from the post was:
I don’t have a background in Perl and it turns out I didn’t need to. If the only requirement is a handful of registry values, several plugins that exist in the GitHub repository may be used as a template. To get a feel for the syntax, I found it helpful to review plugins for registry artifacts I’m familiar with. After a few moments of time and testing, I had an operational plugin.
For years, I've been saying that if there's a plugin that needs to be created or modified, it's as easy as either creating it yourself via copy-paste, or reaching out and asking. Providing a clear, concise description of what you're looking for, along with sample data, has regularly resulted in a working plugin being available in an hour or so.
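To give a sense of just how small that "template" really is, here's a rough, stripped-down sketch of the structure most plugins follow (the %config fields and helper functions are simplified here; grab any current plugin from the repo for the authoritative layout):

package example_plugin;
use strict;
# Parse::Win32Registry is loaded by the main RegRipper program,
# which also provides ::rptMsg() for report output

my %config = (hive     => "Software",
              category => "config",
              version  => 20230101);

sub getConfig     { return %config; }
sub getShortDescr { return "Example: report a single value from a key"; }
sub getHive       { return $config{hive}; }
sub getVersion    { return $config{version}; }

sub pluginmain {
    my $class = shift;
    my $hive  = shift;
    ::rptMsg("example_plugin v.".$config{version});

    my $reg      = Parse::Win32Registry->new($hive);
    my $root_key = $reg->get_root_key();

    # Swap in whatever key path and value name you're after
    my $key_path = 'Microsoft\Windows NT\CurrentVersion';
    if (my $key = $root_key->get_subkey($key_path)) {
        my $val = $key->get_value("ProductName");
        ::rptMsg($key_path."\\ProductName = ".$val->get_data()) if (defined $val);
    }
    else {
        ::rptMsg($key_path." not found.");
    }
}
1;

Most of that is boilerplate; the parts you actually change are the key path, the value name, and how you want the data reported.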
Taking the reins of the DIY approach is something that Corey Harrell started doing years ago, and it's what led to such tools as auto_rip.
Now, this isn't to say that it's always that easy...adding JSON output, for example, took some back-and-forth, but the person who asked about it was willing to talk it through, and I think we both learned from the engagement.
LNKs
Anyone who's followed me for even a short while will know that I'm a huge proponent of making the most of what's available, particularly when it comes to file metadata. One of the richest and yet (IMHO) largely untapped sources of such metadata is LNK files. Cisco's Talos team recently published a blog post titled, "Following the LNK Metadata Trail".
The article is interesting, and while several LNK builders are identified, the post falls just short of identifying toolmarks associated with these builders. At one point, the article turns to Qakbot campaigns and states that there was no overlap in LNK metadata between campaigns. This is interesting when compared to what Mandiant found regarding two Cozy Bear campaigns separated by two years (see figs 5 & 6). What does this say to you about the Qakbot campaigns vs the Cozy Bear campaigns?
Updates to MemProcFS-Analyzer
Evild3ad79 tweeted that MemProcFS-Analyzer has been updated to version 0.8. Wow! I haven't had the opportunity to try this yet, but it does look pretty amazing with all of the functionality provided in the current version! Give it a shot, and write a review of your use of the tool!
OneNote Tools
Following the prevalence of malicious OneNote files we've seen through social media over the past few weeks, both Didier Stevens and the Volexity crew have released tools for parsing those OneNote files.
Addendum, 30 Jan: Matthew Green added a OneNote parser/detection artifact to Velocidex.