Almost everyone likes spies, right? Jason Bourne, James Bond, that sort of thing? One of the things you don't see in the movies is the training these super spies go through, but you have to imagine that it's pretty extensive, if they can pop up in a city they may never have been to and transition seamlessly into the environment.
The same thing is true of targeted adversaries...they're able to seamlessly blend into your environment. Like special operations forces, they learn how to use tools native to the environment in order to get the information that they're after, whether it's initial reconnaissance of the host or the infrastructure, locating items of interest, moving laterally within the infrastructure, or exfiltrating data.
I caught this post from JPCERT/CC that discusses Windows commands abused by attackers. The author takes a different approach from previous posts and shares some of the command lines used, but also focuses on the frequency of use for each tool. There's also a section in the post that recommends using GPOs to restrict the use of unnecessary commands. An alternative approach might be to track attempts to use the tools, by creating a trigger to write a Windows Event Log record (discussed previously in this post). When incorporated into an overall log management (SIEM, filtering, alerting, etc.) framework, this can be an extremely valuable detection mechanism.
If you're not familiar with some of the tools that you see listed in the JPCERT/CC blog post, try running them, starting by typing the command followed by "/?".
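Beyond running the commands yourself, the detection side of that idea can be sketched fairly simply: match observed process command lines against a watchlist of frequently-abused native commands. This is a minimal sketch; the command names below are illustrative stand-ins for the kind of list in the JPCERT/CC post, not the post's actual list, and `flag_command_line` is a hypothetical helper.

```python
# Sketch: match an observed process command line against a watchlist of
# native Windows commands frequently abused by intruders. The watchlist
# below is illustrative, not the JPCERT/CC post's actual list.
import ntpath

ABUSED_COMMANDS = {
    "tasklist", "ver", "ipconfig", "net", "netstat",
    "whoami", "dsquery", "at", "reg", "wmic",
}

def flag_command_line(cmdline):
    """Return the matching watchlist name for a command line, else None."""
    parts = cmdline.split()
    image = parts[0] if parts else ""
    # ntpath handles Windows-style backslash paths on any platform
    name = ntpath.basename(image).lower()
    if name.endswith(".exe"):
        name = name[:-4]
    return name if name in ABUSED_COMMANDS else None
```

Feed it the command lines from your process-creation logging and you have the start of a filter for your alerting pipeline.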
TradeCraft Tuesday - Episode #6 discusses how Powershell can be used and abused. The presenters (one of whom is Kyle Hanslovan) strongly encourage interaction (wow, does that sound familiar at all?) with the presentation via Twitter. During the presentation, the guys talk about Powershell being used to push base64 encoded commands into the Registry for later use (often referred to as "fileless"), and it doesn't stop there. Their discussion of the power of Powershell for post-exploitation activities really highlights the need for a suitable level of instrumentation in order to achieve visibility.
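To make "base64 encoded commands" concrete: PowerShell's -EncodedCommand switch takes a base64 blob of the UTF-16LE-encoded command text, which is also what you'll typically find stashed in a Registry value. Here's a minimal decoder sketch; real blobs recovered in the wild may be wrapped in further layers (compression, string concatenation, etc.) that this doesn't handle.

```python
# Sketch: decode a base64 blob of the sort PowerShell's -EncodedCommand
# accepts, and of the sort attackers stash in the Registry for
# "fileless" persistence. PowerShell base64-encodes the UTF-16LE bytes
# of the command text.
import base64

def decode_powershell_blob(b64):
    raw = base64.b64decode(b64)
    try:
        return raw.decode("utf-16-le")
    except UnicodeDecodeError:
        # fall back for blobs that aren't standard PowerShell encoding
        return raw.decode("latin-1")

# Round-trip example: encode a command the way PowerShell would.
encoded = base64.b64encode("Get-Process".encode("utf-16-le")).decode("ascii")
```

Running suspicious Registry value contents through a decoder like this is often the fastest way to see what a "fileless" entry actually does.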
The use of native commands by an adversary or intruder is not new...it's been talked about before. For example, the guys at SecureWorks talked about the same thing in the articles Linking Users to Systems and Living off the Land. Rather than talking about what could be done, these articles show you data that illustrates what was actually done; not might or could, but did.
So, what do you do? Well, I've posted previously about how you can go about monitoring for command line activity, which is usually manifest when access is achieved via RATs.
Not all abuse of native Windows commands and functionality is going to be as obvious as some of what's been discussed already. Take this recent SecureWorks post for example...it illustrates how GPOs have been observed being abused by dedicated actors. An intruder moving about your infrastructure via Terminal Services won't be as easy to detect using command line process creation monitoring, unless and until they resort to some form of non-GUI interaction.
The Windows Incident Response Blog is dedicated to the myriad information surrounding and inherent to the topics of IR and digital analysis of Windows systems. This blog provides information in support of my books; "Windows Forensic Analysis" (1st thru 4th editions), "Windows Registry Forensics", as well as the book I co-authored with Cory Altheide, "Digital Forensics with Open Source Tools".
Wednesday, January 27, 2016
Saturday, January 23, 2016
Training
On the heels of my Skills Dilemma blog post, I wanted to share some thoughts on training. Throughout my career, I've been on both sides of that fence...in the military and in private sector consulting, I've received, as well as developed and conducted training at various levels. I've attended presentations, and developed and conducted presentations, at a number of levels and at a variety of venues.
Corey's also had some pretty interesting thoughts with respect to training in his blog.
Purpose
There are a lot of great training options out there. When you're looking at the various options, are you looking to use up training funds, or are you looking for specific skills development? What is the purpose of the training? What's your intent in either attending the training, or sending someone, a staff member, to that training?
If you're just looking to use up training funds so that you'll get that money included in your budget next year, well, pretty much anything will do. However, if you're looking for specific skills development, whether it's basic or advanced skills, you may want to look closely at what's being taught in the course.
What would really help with this is a Yelp-like system for reviewing training courses. Wait...what? You think someone should actually evaluate the services they pay for and receive? What are you, out of your mind?
So, here's my thought...as a manager, you sit down with one of your staff and develop performance indicators and goals for the coming year, as well as a plan for achieving those goals. The two of you decide that in order to meet those goals, one step is to attend a specific training course. Your staff member attends, and then you both write a review. You both write a review of the course based on what you agreed you wanted to achieve by attending the course; the staff member based on attending the course, and you (as the manager) based on your observation of your staff member's use of their new skills.
Accountability
I'll say it again...there are a lot of great training options out there, but in my experience, what's missing is accountability for that training. What I mean by that is, if you're a manager and you send someone off to training (whether they obtain a certification or not), do you hold them accountable for that training once they return?
Here's an example...when I was on active duty and was stationed overseas, there was an NBC (nuclear, biological, chemical) response course being conducted, and being the junior guy, I was sent. After I'd passed the course and received my certificate, I returned to my unit, and did absolutely NOTHING with respect to NBC. Basically, I was simply away for a week. I was sent off to the training for no other reason than I was the low guy on the totem pole, and when I returned, I was neither asked about, nor required to use or implement that training in any way. There was no accountability.
Later in my tenure in the military, I found an opening for some advanced training for a Sgt who worked for me. I felt strongly that this Sgt should be promoted and advance in his career, and one way to shore up his chances was to ensure that he advanced in his military occupational specialty. I got him a seat in the training course, got his travel set up, and while he was gone, I found a position in another unit where he would put his new-found skills to good use. When he returned, I informed him of his transfer (which had other benefits for him, as well). His new role required him to teach junior Marines about the device he'd been trained on, as well as train the new officers attending the Basic Communication Officers Course on how to use and deploy the device, as well. He was held accountable for the training he'd received.
How often do we do this? Be honest. I've seen an analyst who had attended some pretty extensive training, only to return and, within the next couple of weeks, not know how to determine whether a file had been time stomped. I know that the basics of how to conduct that type of analysis were covered in the training they'd attended.
Generalist vs. Specialist
What kind of training are you interested in? Basic skills development, or advanced training in a very specific skill set? What specific skills are you looking for? Are they skills specific to your environment?
There's a lot of good generalist training out there, training that provides a broad range of skills. You may want to start there, and then develop an in-house training program ("brown bag" lunch presentations, half- or full-day training, mentoring, etc.) that reinforces and extends those basic skills into something that's specific to your environment.
Analysis
A bit ago, I posted about doing analysis, and that post didn't really seem to get much traction at all. What was I trying for? To start a conversation about how we _do_ analysis. When we make statements to a client or to another analyst, on what are we basing those findings? Somewhere between the raw data and our findings is where we _do_ analysis; I know what that looks like for me, and I've shared it (in this blog, in my books, etc.), and what I've wanted to do for some time is go beyond the passivity of sitting in a classroom, and start a conversation where analysts engage and discuss analysis.
I have to wonder...is this even possible? Will analysts talk about what they do? For me, I'm more than happy to. But will this spark a conversation?
I thought I'd try a different tack this time around. In a recent blog post, I mentioned that two Prefetch parsers had recently been released. While it is interesting to see these tools being made available, I have to ask...how are analysts using these tools? How are analysts using these tools to conduct analysis, and achieve the results that they're sharing with their clients?
Don't get me wrong...I think having tools is a wonderful idea. We all have our favorite tools that we tend to gravitate toward or reach for under different circumstances. Whether it's commercial or free/open source tools, it doesn't really matter. Whether you're using a dongle or a Linux distro...it doesn't matter. What does matter is, how are you using it, and how are you interpreting the data?
Someone told me recently, "...I know you have an issue with EnCase...", and to be honest, that's simply not the case. I don't have an issue with EnCase at all, nor with FTK. I do have an issue with how those tools are used by analysts, and the issue extends to any other tool that is run auto-magically and expected to spit out true results with little to no analysis.
What do the tools really do for us? Well, basically, most tools parse data of some sort, and display it. It's then up to us, as analysts, to analyze that data...interpret it, either within the context of that and other data, or by providing additional context, by incorporating either additional data from the same source, or data from external sources.
RegRipper is a great example. The idea behind RegRipper (as well as the other tools I've written) is to parse and display data for analysis...that's it. RegRipper started as a bunch of scripts I had sitting around...every time I'd work on a system and have to dig through the Registry to find something, I'd write a script to do the actual work for me. In some cases, a script was simply to follow a key path (or several key paths) that I didn't want to have to memorize. In other cases, I'd write a script to handle ROT-13 decoding or binary parsing; I figured, rather than having to do all of that again, I'd write a script to automate it.
For a while, that's all RegRipper did...parse and display data. If you had key words you wanted to "pivot" on, you could do so with just about any text editor, but that's still a lot of data. So then I started adding "alerts"; I'd have the script (or tool) do some basic searching to look for things that were known to be "bad", in particular, file paths in specific locations. For example, an .exe file in the root of the user profile, or in the root of the Recycle Bin, is a very bad thing, so I wanted those to pop out and be put right in front of the analyst. I found...and still find...this to be an incredibly useful functionality.
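To make the "alerts" idea concrete, here's a minimal sketch of path-based alerting. The patterns are illustrative examples of the kind of rule described above, not RegRipper's actual rule set.

```python
# Sketch: flag executable paths found in locations where an .exe almost
# never legitimately belongs. Patterns are illustrative examples.
import re

ALERT_PATTERNS = [
    (re.compile(r"^[a-z]:\\users\\[^\\]+\\[^\\]+\.exe$", re.I),
     "exe in root of user profile"),
    (re.compile(r"^[a-z]:\\\$?recycle\.bin\\[^\\]+\.exe$", re.I),
     "exe in root of Recycle Bin"),
    (re.compile(r"^[a-z]:\\[^\\]+\.exe$", re.I),
     "exe in root of volume"),
]

def alerts_for_path(path):
    """Return the list of alert labels that fire for a given file path."""
    return [label for rx, label in ALERT_PATTERNS if rx.match(path)]
```

Run every parsed path through a check like this and the handful of "very bad things" pop out of thousands of lines of otherwise unremarkable output.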
Here's an example of what I'm talking about with respect to analysis...I ran across this forensics challenge walk-through recently, and just for sh*ts and grins, I downloaded the Registry hive (NTUSER.DAT, Software, System) files. I ran the appcompatcache.pl RegRipper plugin against the system hive, and found the following "interesting" entries within the AppCompatCache value:
C:\dllhot.exe Tue Apr 3 18:08:50 2012 Z Executed
C:\Windows\TEMP\a.exe Tue Apr 3 23:54:46 2012 Z Executed
c:\windows\system32\dllhost\svchost.exe Tue Apr 3 22:40:25 2012 Z Executed
C:\windows\system32\hydrakatz.exe Wed Apr 4 01:00:45 2012 Z Executed
C:\Windows\system32\icacls.exe Tue Jul 14 01:14:21 2009 Z Executed
Now, the question is, for each of those entries, what do they mean? Do they mean that the .exe file was "executed" on the date and time listed?
No, that's not what the entries mean at all. Check out Mandiant's white paper on the subject. You can verify what they're saying in the whitepaper by creating a timeline from the shim cache data and file system metadata (just the $MFT will suffice); if the files that had been executed were not deleted from the system, you'll see that the time stamp included in the shim cache data is, in fact, the last modification time from the file system (specifically, the $STANDARD_INFORMATION attribute) metadata.
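That cross-check can be sketched in a few lines. Here, both inputs are hypothetical parsed structures (a shim cache mapping and a mapping of $STANDARD_INFORMATION last-modified times from an $MFT parse); matching values support the interpretation that the cached time is the file system last-modified time, not an execution time.

```python
# Sketch: cross-check AppCompatCache timestamps against
# $STANDARD_INFORMATION last-modified times pulled from an $MFT parse.
# Both inputs are hypothetical already-parsed structures.
from datetime import datetime, timezone

def check_shimcache_vs_mft(shimcache, mft_si_mtimes):
    """Yield (path, cache_time, si_mtime, matches) for paths in both."""
    for path, cache_time in shimcache.items():
        si = mft_si_mtimes.get(path.lower())
        if si is not None:
            yield path, cache_time, si, cache_time == si
```

If the flags come back True for files still on disk, you've just demonstrated to yourself what the whitepaper says about the time stamps.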
I use this as an example, simply because it's something that I see a great deal of; in fact, I recently experienced a "tale of two analysts", where I reviewed work that had previously been conducted by two separate analysts. The first analyst did not parse the Shim Cache data, and the second parsed it, but assumed that the data meant the .exe files of interest had been executed at the time displayed alongside each entry.
Again, this is just an example, and not meant to focus the spotlight on anyone. I've talked with a number of analysts, and in just about every conversation, they've either known someone who's made the same mistake misinterpreting the Shim Cache data, or they've admitted to misinterpreting it themselves. I get it; no one's perfect, and we all make mistakes. I chose this one as an example, because it's perhaps one of the most misinterpreted data sources. A lot of analysts who have attended (or conducted) expensive training courses have made this mistake.
Pointing out mistakes isn't the point I'm trying to make...it's that we, as a community, need to engage in a community-wide conversation about analysis. What resources do we have available now, and what do we need? We can't all attend training courses, and when we do, what happens most often is that we learn something cool, and then don't see it again for 6 months or a year, and we forget the nuances of that particular analysis. Dedicated resources are great, but they (forums, emails, documents) need to be searched. What about just-in-time resources, like asking a question? Would that help?
Wednesday, January 20, 2016
Resources, Link Mashup
Monitoring
MS's Sysmon was recently updated to version 3.2, with the addition of capturing opens for raw read access to disks and volumes. If you're interested in monitoring your infrastructure and performing threat hunting at all, I'd highly recommend that you consider installing something like this on your systems. While Sysmon is not nearly as fully-featured as something like Carbon Black, employing Sysmon along with centralized log collection and filtering will provide you with a level of visibility that you likely hadn't even imagined was possible previously.
This page talks about using Sysmon and NXLog.
The fine analysts of the Dell SecureWorks CTU-SO recently had an article posted that describes what the bad guys like to do with Windows Event Logs, and both of the case studies could be "caught" with the right instrumentation in place. You can also use process creation monitoring (via Sysmon, or some other means) to detect when an intruder is living off the land within your environment.
The key to effective monitoring and subsequent threat hunting is visibility, which is achieved through telemetry and instrumentation. How are bad guys able to persist within an infrastructure for a year or more without being detected? It's not that they aren't doing stuff, it's that they're doing stuff that isn't detected due to a lack of visibility.
MS KB article 3004375 outlines how to improve Windows command-line auditing, and this post from LogRhythm discusses how to enable Powershell command line logging (another post discussing the same thing is here). The MS KB article gives you some basic information regarding process creation, and Sysmon provides much more insight. Regardless of which option you choose, however, all are useless unless you're doing some sort of centralized log collection and filtering, so be sure to incorporate the necessary and appropriate logs into your SIEM, and get those filters written.
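What does a filter for centrally collected process-creation records look like? Here's a minimal sketch against a simplified record structure (the field names are a stand-in for the real Security event 4688 or Sysmon event ID 1 schema); the parent/child pairs are illustrative examples of the kind of thing worth alerting on.

```python
# Sketch: a minimal filter over centrally collected process-creation
# records (Security event 4688, or Sysmon event ID 1). The dict fields
# are a simplified stand-in for the real log schema, and the pairs
# below are illustrative examples.
SUSPICIOUS_PARENT_CHILD = {
    ("winword.exe", "powershell.exe"),
    ("winword.exe", "cmd.exe"),
    ("outlook.exe", "powershell.exe"),
}

def filter_process_events(events):
    """Return the events whose parent/child pairing is on the watchlist."""
    hits = []
    for ev in events:
        pair = (ev["parent"].lower(), ev["image"].lower())
        if pair in SUSPICIOUS_PARENT_CHILD:
            hits.append(ev)
    return hits
```

The point isn't these particular pairs; it's that once the logs are centralized, a filter this simple starts surfacing activity you'd otherwise never see.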
Windows Event Logs
Speaking of Windows Event Logs, sometimes it can be very difficult to find information regarding various event source/ID pairs. Microsoft has a great deal of information available regarding Windows Event Log records, and I very often can easily find the pages with a quick Google search. For example, I recently found this page on Firewall Rule Processing events, based on a question I saw in an online forum.
From Deus Ex Machina, you can look up a wide range of Windows Event Log records here or here. I've found both to be very useful. I've used this site more than once to get information about *.evtx records that I couldn't find any place else.
Another source of information about Windows Event Log records and how they can be used can often be one of the TechNet blogs. For example, here's a really good blog post from Jessica Payne regarding tracking lateral movement...
With respect to the Windows Event Logs, I've been looking at ways to increase instrumentation on Windows systems, and something I would recommend is putting triggers in place for various activities, and writing a record to the Windows Event Log. I found this blog post recently that discusses using PowerShell to write to the Windows Event Log, so whatever you trap or trigger on a system can launch the appropriate command or run a batch file that contains the command. Of course, in a networked environment, I'd highly recommend a SIEM be set up, as well.
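Besides PowerShell, the native eventcreate.exe tool can drop a custom record into the Event Log from any trigger or batch file. Here's a sketch that builds (but doesn't run) such an invocation; the event ID and source name are arbitrary choices for illustration.

```python
# Sketch: build (but don't execute) an eventcreate.exe command line that
# a trigger or batch file could use to write a custom record to the
# Application Event Log. The ID and source name are arbitrary choices.
def eventcreate_argv(event_id, source, description):
    return [
        "eventcreate.exe",
        "/ID", str(event_id),          # custom event ID (1-1000)
        "/L", "APPLICATION",           # target log
        "/T", "INFORMATION",           # event type
        "/SO", source,                 # event source name
        "/D", description,             # record description text
    ]
```

Pair a trigger with something like `eventcreate_argv(999, "IRTrigger", "attempt to run restricted tool")` and the resulting records flow into your centralized collection alongside everything else.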
One thought regarding filtering and analyzing Windows Event Log records sent to a SEIM...when looking at various Windows Event Log records, we have to look at them in the context of the system, rather than in isolation, as what they actually refer to can be very different. A suspicious record related to WMI, for example, when viewed in isolation may end up being part of known and documented activity when viewed in the context of the system.
Analysis
PoorBillionaire recently released a Windows Prefetch Parser, which is reportedly capable of handling *.pf files from XP systems all the way up through Windows 10 systems. On 19 Jan, Eric Zimmerman did the same, making his own Prefetch parser available.
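One practical wrinkle those parsers have to handle: uncompressed .pf files carry the "SCCA" signature at offset 4 with a format version dword at offset 0, while Windows 10 Prefetch files are compressed and begin with "MAM". Here's a minimal triage sketch along those lines; it identifies a file, it doesn't parse it.

```python
# Sketch: triage a Prefetch file header before handing it to a parser.
# Raw .pf files have "SCCA" at offset 4 and a version dword at offset 0;
# Windows 10 Prefetch files are compressed and start with "MAM".
import struct

VERSIONS = {17: "XP/2003", 23: "Vista/7", 26: "8/8.1", 30: "10"}

def triage_prefetch(header):
    """Classify the first 8+ bytes of a candidate Prefetch file."""
    if header[:3] == b"MAM":
        return "compressed (Windows 10 style)"
    if header[4:8] == b"SCCA":
        version = struct.unpack("<I", header[:4])[0]
        return "raw, Windows " + VERSIONS.get(version, "unknown")
    return "not a Prefetch file"
```

A check like this tells you up front whether the tool you're reaching for can even read the file in question.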
Having tools available is great, but what we really need to do is talk about how those tools can be used most effectively as part of our analysis. There's no single correct way to use the tool, but the issue becomes, how do you correctly interpret the data once you have it?
I recently encountered a "tale of two analysts", where both had access to the same data. One analyst did not parse the ShimCache data at all as part of their analysis, while the other did and misinterpreted the information that the tool (whichever one that was) displayed for them.
So, my point is that having tools to parse data is great, but if the focus is tools and parsing data, but not analyzing and correctly interpreting the data, what have the tools really gotten us?
Creating a Timeline
I was browsing around recently and ran across an older blog post (yeah, I know it's like 18 months old...), and in the very beginning of that post, something caught my eye. Specifically, a couple of quotes from the blog post:
...my reasons for carrying this out after the filesystem timeline is purely down to the time it takes to process.
...and...
The problem with it though is the sheer amount of information it can contain! It is very important when working with a super timeline to have a pivot point to allow you to narrow down the time frame you are interested in.
The post also states that timeline analysis is an extremely powerful tool, and I agree, 100%. What I would offer to analysts is a more deliberate approach to timeline analysis, based on what Chris Pogue coined as Sniper Forensics.
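The "pivot point" idea can be sketched directly: rather than reading a super timeline end to end, pull the window around an event of interest. This is a minimal sketch over an already-built timeline of (timestamp, description) tuples.

```python
# Sketch of the "pivot point" idea: extract the slice of a timeline
# around an event of interest, rather than reading it end to end.
# Input is an iterable of (datetime, description) tuples.
from datetime import datetime, timedelta

def window_around_pivot(timeline, pivot_time, minutes=5):
    """Return time-sorted events within +/- `minutes` of the pivot."""
    delta = timedelta(minutes=minutes)
    return sorted(
        (t, desc) for t, desc in timeline
        if pivot_time - delta <= t <= pivot_time + delta
    )
```

Start with a pivot (a shim cache entry, an AV detection, a user report), pull the window, and let what you find there give you the next pivot.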
Speaking of analysis, the folks at RSA released a really good look at analyzing carrier files used during a phish. The post provides a pretty thorough walk-through of the tool and techniques used to parse through an old (or should I say, "OLE") style MS Word document to identify and analyze embedded macros.
Powershell
Not long ago, I ran across an interesting artifact...a folder with the following name:
C:\Users\user\AppData\Local\Microsoft\Windows\PowerShell\CommandAnalysis\
The folder contained an index file, and a bunch of files with names that follow the format "PowerShell_AnalysisCacheEntry_GUID". Doing some research into this, I ran across this BoyWonder blog post, which seems to indicate that this is a cache (yeah, okay, that's in the name, I get it...), and possibly used for functionality similar to auto-complete. It doesn't appear to illustrate what was run, though. For that, you might want to see the LogRhythm link earlier in this post.
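If you want to pull those cache entries out of a directory listing during triage, a simple pattern match does it. This is a sketch; the listing below is a hypothetical example, and only the file name pattern comes from the observation above.

```python
# Sketch: pick the CommandAnalysis cache-entry file names out of a
# directory listing. The name pattern follows the format noted above;
# the example listing is hypothetical.
import fnmatch

def cache_entries(listing):
    return [name for name in listing
            if fnmatch.fnmatch(name, "PowerShell_AnalysisCacheEntry_*")]
```

Counting and dating those entries gives you a rough sense of how much PowerShell activity the profile has seen, even if it doesn't tell you what was run.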
As it turned out, the folder path I listed above was part of legitimate activity performed by an administrator.
MS's Sysmon was recently updated to version 3.2, with the addition of capturing opens for raw read access to disks and volumes. If you're interested in monitoring your infrastructure and performing threat hunting at all, I'd highly recommend that you consider installing something like this on your systems. While Sysmon is not nearly as fully-featured as something like Carbon Black, employing Sysmon along with centralized log collection and filtering will provide you with a level of visibility that you likely hadn't even imagined was possible previously.
This page talks about using Sysmon and NXLog.
The fine analysts of the Dell SecureWorks CTU-SO recently had an article posted that describes what the bad guys like to do with Windows Event Logs, and both of the case studies could be "caught" with the right instrumentation in place. You can also use process creation monitoring (via Sysmon, or some other means) to detect when an intruder is living off the land within your environment.
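To make that concrete, here's a minimal Python sketch of what flagging "living off the land" activity via process creation events might look like. The event format (a dict with an "Image" key, as you might parse out of a Sysmon Event ID 1 record) and the tool list are illustrative assumptions for the example, not anything drawn from the SecureWorks article:

```python
# Minimal sketch: flag process creation events that reference native
# Windows tools commonly abused for "living off the land".
# The event format and the tool list are illustrative assumptions.

import ntpath

# A small, example subset of frequently abused native tools
LOLBINS = {"net.exe", "wmic.exe", "schtasks.exe", "wevtutil.exe",
           "reg.exe", "bitsadmin.exe", "certutil.exe"}

def flag_event(event):
    """Return True if a process creation event references a watched tool.

    'event' is a dict with an 'Image' key holding the full path to the
    executable, e.g. as parsed from a Sysmon Event ID 1 record.
    """
    image = ntpath.basename(event.get("Image", "")).lower()
    return image in LOLBINS

events = [
    {"Image": r"C:\Windows\System32\wevtutil.exe"},
    {"Image": r"C:\Program Files\Mozilla Firefox\firefox.exe"},
]
hits = [e for e in events if flag_event(e)]
```

In practice, you'd feed this from your centralized log collection rather than a hard-coded list of events, and the watch list would be tuned to what's actually abnormal in your environment.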
The key to effective monitoring and subsequent threat hunting is visibility, which is achieved through telemetry and instrumentation. How are bad guys able to persist within an infrastructure for a year or more without being detected? It's not that they aren't doing stuff, it's that they're doing stuff that isn't detected due to a lack of visibility.
MS KB article 3004375 outlines how to improve Windows command-line auditing, and this post from LogRhythm discusses how to enable Powershell command line logging (another post discussing the same thing is here). The MS KB article gives you some basic information regarding process creation, and Sysmon provides much more insight. Regardless of which option you choose, however, it's useless unless you're doing some sort of centralized log collection and filtering, so be sure to incorporate the necessary and appropriate logs into your SIEM, and get those filters written.
Windows Event Logs
Speaking of Windows Event Logs, sometimes it can be very difficult to find information regarding various event source/ID pairs. Microsoft has a great deal of information available regarding Windows Event Log records, and I very often can easily find the pages with a quick Google search. For example, I recently found this page on Firewall Rule Processing events, based on a question I saw in an online forum.
From Deus Ex Machina, you can look up a wide range of Windows Event Log records here or here. I've found both to be very useful. I've used this site more than once to get information about *.evtx records that I couldn't find any place else.
Another source of information about Windows Event Log records and how they can be used can often be one of the TechNet blogs. For example, here's a really good blog post from Jessica Payne regarding tracking lateral movement...
With respect to the Windows Event Logs, I've been looking at ways to increase instrumentation on Windows systems, and something I would recommend is putting triggers in place for various activities, and writing a record to the Windows Event Log. I found this blog post recently that discusses using PowerShell to write to the Windows Event Log, so whatever you trap or trigger on a system can launch the appropriate command or run a batch file that contains the command. Of course, in a networked environment, I'd highly recommend a SIEM be set up, as well.
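For example, a trigger script can shell out to the native eventcreate.exe tool to drop a custom record into the Application Event Log. This Python sketch just builds the command line; the event ID, source, and message are arbitrary example values:

```python
# Sketch: build an eventcreate.exe command line that a trigger script
# could run to write a custom record to the Application Event Log.
# The event ID, source name, and message are arbitrary example values.

def build_eventcreate_cmd(event_id, message, source="IRTrigger"):
    """Return an argument list suitable for subprocess.run() on Windows."""
    return [
        "eventcreate",
        "/L", "APPLICATION",    # target log
        "/T", "INFORMATION",    # event type
        "/SO", source,          # event source shown in the record
        "/ID", str(event_id),   # custom event ID (1-1000)
        "/D", message,          # description/message body
    ]

cmd = build_eventcreate_cmd(777, "Suspicious tool usage trapped")
# On a Windows host: subprocess.run(cmd, check=True)
```

Once the record is in the Application Event Log, it rides your normal log forwarding into the SIEM, where you can alert on the custom event ID.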
One thought regarding filtering and analyzing Windows Event Log records sent to a SIEM...when looking at various Windows Event Log records, we have to look at them in the context of the system, rather than in isolation, as what they actually refer to can be very different. A suspicious record related to WMI, for example, when viewed in isolation may end up being part of known and documented activity when viewed in the context of the system.
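A hedged sketch of what that context-aware triage might look like in code; the per-host baseline of known-good activity is entirely hypothetical example data:

```python
# Sketch: evaluate an event in the context of a per-host baseline of
# known and documented activity, rather than in isolation.
# The baseline contents below are hypothetical example data.

KNOWN_GOOD = {
    # host -> processes documented as part of legitimate admin activity
    "HR-FILESRV": {"scriptedadmin.exe"},
}

def triage(host, process):
    """Return 'suppress' if the activity is documented for this host,
    'investigate' otherwise."""
    if process.lower() in KNOWN_GOOD.get(host, set()):
        return "suppress"
    return "investigate"
```

The point isn't the trivial lookup; it's that the baseline has to exist and be documented per system before an analyst can tell a suspicious WMI record from routine administration.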
Analysis
PoorBillionaire recently released a Windows Prefetch Parser, which is reportedly capable of handling *.pf files from XP systems all the way up through Windows 10 systems. On 19 Jan, Eric Zimmerman did the same, making his own Prefetch parser available.
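Independent of either parser, a quick bit of triage you can do on *.pf files is to split the file name itself into the executable name and the path hash, per the NAME-HASH.pf naming convention; this Python sketch shows the idea:

```python
# Sketch: split a Prefetch file name into the executable name and the
# eight-character path hash, per the <NAME>-<HASH>.pf naming convention.

import re

PF_RE = re.compile(r"^(?P<exe>.+)-(?P<hash>[0-9A-F]{8})\.pf$", re.IGNORECASE)

def parse_pf_name(filename):
    """Return (executable_name, path_hash) or None if it doesn't match."""
    m = PF_RE.match(filename)
    if not m:
        return None
    return m.group("exe"), m.group("hash")
```

Two Prefetch files for the same executable name with different hashes can be an interesting data point, as the hash is derived from the path the executable was run from.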
Having tools available is great, but what we really need to do is talk about how those tools can be used most effectively as part of our analysis. There's no single correct way to use the tool, but the issue becomes, how do you correctly interpret the data once you have it?
I recently encountered a "tale of two analysts", where both had access to the same data. One analyst did not parse the ShimCache data at all as part of their analysis, while the other did and misinterpreted the information that the tool (whichever one that was) displayed for them.
So, my point is that having tools to parse data is great, but if the focus is tools and parsing data, but not analyzing and correctly interpreting the data, what have the tools really gotten us?
Creating a Timeline
I was browsing around recently and ran across an older blog post (yeah, I know it's like 18 months old...), and in the very beginning of that post, something caught my eye. Specifically, a couple of quotes from the blog post:
...my reasons for carrying this out after the filesystem timeline is purely down to the time it takes to process.
...and...
The problem with it though is the sheer amount of information it can contain! It is very important when working with a super timeline to have a pivot point to allow you to narrow down the time frame you are interested in.
The post also states that timeline analysis is an extremely powerful tool, and I agree, 100%. What I would offer to analysts is a more deliberate approach to timeline analysis, based on what Chris Pogue coined as Sniper Forensics.
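That "pivot point" idea is easy to sketch in code. Here, a super timeline (reduced to (timestamp, description) tuples) is narrowed to a window around a pivot time; the one-hour window is an arbitrary choice for the example:

```python
# Sketch: narrow a super timeline to a window around a pivot point.
# Timeline entries are (datetime, description) tuples; the one-hour
# window is an arbitrary example value.

from datetime import datetime, timedelta

def pivot_window(timeline, pivot, window=timedelta(hours=1)):
    """Return entries within +/- window of the pivot, sorted by time."""
    lo, hi = pivot - window, pivot + window
    return sorted(e for e in timeline if lo <= e[0] <= hi)

timeline = [
    (datetime(2016, 1, 15, 9, 0), "file created"),
    (datetime(2016, 1, 15, 14, 30), "malicious service installed"),
    (datetime(2016, 1, 15, 14, 45), "outbound connection"),
]
focus = pivot_window(timeline, datetime(2016, 1, 15, 14, 30))
```

The pivot itself comes from some other finding...an AV detection, a suspicious Registry value, a user report...which is exactly the deliberate, goal-driven approach Sniper Forensics describes.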
Speaking of analysis, the folks at RSA released a really good look at analyzing carrier files used during a phish. The post provides a pretty thorough walk-through of the tool and techniques used to parse through an old (or should I say, "OLE") style MS Word document to identify and analyze embedded macros.
Powershell
Not long ago, I ran across an interesting artifact...a folder with the following name:
C:\Users\user\AppData\Local\Microsoft\Windows\PowerShell\CommandAnalysis\
The folder contained an index file, and a bunch of files with names that follow the format "PowerShell_AnalysisCacheEntry_GUID". Doing some research into this, I ran across this BoyWonder blog post, which seems to indicate that this is a cache (yeah, okay, that's in the name, I get it...), and possibly used for functionality similar to auto-complete. It doesn't appear to illustrate what was run, though. For that, you might want to see the LogRhythm link earlier in this post.
As it turned out, the folder path I listed above was part of legitimate activity performed by an administrator.
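If you want to quickly inventory a cache folder like that during an exam, pulling the GUIDs out of the file names is trivial; the pattern here is an assumption based on the names I observed:

```python
# Sketch: extract GUIDs from PowerShell_AnalysisCacheEntry_<GUID> file
# names, e.g. when inventorying the CommandAnalysis folder during an exam.
# The file-name pattern is an assumption based on the names observed.

import re

GUID = r"[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}"
CACHE_RE = re.compile(r"^PowerShell_AnalysisCacheEntry_(" + GUID + r")$",
                      re.IGNORECASE)

def cache_entry_guid(filename):
    """Return the GUID portion of a cache entry file name, or None."""
    m = CACHE_RE.match(filename)
    return m.group(1) if m else None
```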
Tuesday, January 19, 2016
More Registry Fun
Once, on a blog far, far away, there was this post that discussed the use of the Unicode RLO control character to "hide" malware in the Registry, particularly from GUI viewers that processed Unicode.
Recently, Jamie shared this Symantec article with me; figure 1 in the article illustrates an interesting aspect of the malware when it comes to persistence...it apparently prepends a null character to the beginning of the value name. Interestingly, some seem to think that this makes tools like RegEdit "broken".
So, I wrote a RegRipper plugin called "null.pl" that runs through a hive file looking for key and value names that begin with a null character. Jamie also shared a couple of sample hives, so I got to test the plugin out. The following image illustrates the plugin output when run against one of the hives:
Figure 1 |
All in all, the turn-around time was pretty quick. I started this morning, and had the plugin written, tested, and uploaded to Github before lunch.
Later in the day, Eric Zimmerman followed up by testing the hive that Jamie graciously shared with me against the Registry Explorer. I should also note that MiTeC WRR has no issues with the value names; it displays them as follows:
Figure 2 |
Addendum, 20 Jan: On a whim, I ran the fileless.pl plugin against the hive, and it detected the two values with the "fileless" data seen in figure 2.
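The check itself is straightforward in any language; here's a minimal Python sketch of the logic null.pl implements, run against plain name strings rather than a parsed hive file (the hive parsing itself is left out), with the RLO check added for the "hiding" technique mentioned above:

```python
# Sketch of the check null.pl performs: flag Registry key/value names
# that begin with a null character, plus names containing the Unicode
# RLO control character used to visually "hide" entries.

RLO = "\u202e"  # Right-to-Left Override control character

def suspicious_name(name):
    """Return a reason string if the name looks hidden, else None."""
    if name.startswith("\x00"):
        return "begins with a null character"
    if RLO in name:
        return "contains the RLO control character"
    return None
```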
Friday, January 15, 2016
The Skills Dilemma
Is there an issue of skills within information or "cyber" security? Yes, without a doubt. But it's not the way you think...the dilemma is not one of a lack of qualified and skilled practitioners, it's one of a lack of skilled managers.
Okay, caveat time...if you're a manager, you might want to stop reading. If you get butt-hurt easily, you might not want to continue on beyond this point. Just sayin'...
I read Scott Scanlon's The Hunt for Cyber Security Leadership Intensifies article recently, and I have to say, having been in the industry for the past 19-some-odd years, I have a different perspective on the issue. The second sentence of Scott's article, referring to executive recruiters, says:
But they are finding a lack of qualified candidates just as companies put a greater emphasis and give a higher priority to corporate security.
It's not my intention to take anything away from Scott, nor am I suggesting that he's incorrect. I'm simply saying that I have a different perspective. In doing so, I'd like to take a look at that sentence; specifically, what constitutes a "qualified candidate", and who decides? If you're "finding a lack of qualified candidates", how are you looking?
Let's look at the process of finding a "qualified candidate":
Job Posting
Who writes job postings or position descriptions? Managers? Are you a manager? Write a description for a position you need to fill. Now, ball it up and throw it away, because you're wrong.
Here's what I mean...I was engaged in a thread recently on LinkedIn, where an employee of a company had posted two position descriptions, one for a threat intel analyst. When I read the position qualifications, one of the stated requirements was a familiarity with "EnCase or FTK". I was curious, so I asked why that was a requirement, and the employee who shared the links didn't know. Shortly, one of the C-level execs from the company responded, saying that it wasn't a requirement.
Then why say that it is?
Have you ever seen those position descriptions? "The candidate MUST have a CISSP, EnCE, etc." Really?
Running the Gauntlet
Position descriptions are passed from the manager to HR or a recruiting firm, who become the gate keepers. Most of the recruiters I've encountered have no experience in the information security field themselves...they're recruiters. So for them, the position description is a set-in-stone road map, and the words used by the hiring manager become the round holes in the board.
I once worked at a company where, after I was hired, one of the recruiters stated publicly that when they receive a resume from a candidate for a position in information security, they search the resume for the term "information security", and if they don't find it at least 4 times, they throw the resume out. What about qualifications? The hiring manager includes "CISSP" and "EnCE" as "requirements", but doesn't tell the recruiter that they really aren't requirements. So, the recruiter looks at resumes, and if "CISSP" AND "EnCE" aren't listed, you don't pass GO and you don't collect $200.
So the question then becomes, how does someone who's qualified pass through that gauntlet and get an actual interview? I "came up" in the industry before there were courses you could take, and a lot of what I know is self-taught. I know enough about EnCase and FTK to know when they're suitable for use. I'm not suggesting that I'm a "qualified candidate" but if I was, how would anyone know?
Interviewing a Candidate
I'll be 100% with you...most of the people I've encountered while interviewing don't know how to interview. We all like to think that we're good at it, but the simple fact is that we don't know how to interview.
When I first got out of the military, I interviewed at a defense contractor, and had four hours of interviews with different departments scheduled. At the beginning of the first interview of the day, the senior manager started off by telling me, very clearly, that he'd run all of my qualifications through a model that he'd developed, and he'd determined how much I would make in my first job. This is before he even spoke to me or got to know me. That's not how to conduct an interview...and I made considerably more than what his model showed in my first job.
A great way to lose a candidate is to take them around the office, surprising members of your team by dropping the candidate off for a "spur of the moment" interview.
Look, I've been on both sides of the fence in 19 years. When I was getting out of the military, I had to take classes in "how to interview". What made it disheartening was that the people I was interviewing with had NO training at all. All the preparation in the world cannot stand up to the first question in an interview being, "so...why are you here?"
I've also been responsible for conducting interviews. I've seen people lie on their resume, simply to make it past the "recruiter gauntlet" and get an interview. I've had interviews go really well, and some that didn't go well. I've also been in a position where someone was hired to support the work that I did, and I was not involved in the process, at any level. In fact, in that case, I wasn't even aware of the vision or business decision for filling the position...all I know is that I heard a discussion in the hallway about offering this person a signing bonus.
The Reality of the Position
What is the reality of the position itself? Yeah, I know what the job description says about the position and the company (words like "dynamic" are used), but all bullsh*t aside, what's the reality?
Is the actual work position in the heart of a major city? As someone who lives outside of a major city (way outside), I know better than to try to drive into the city for the odd social event...and you want me to drive into the city every day as part of the job? I thought the position description said that your company "values quality of life"....
What about the actual work itself? In my time, I've worked for a couple of contracting firms, "supporting" federal law enforcement. In both cases, a lot of very positive things were said about the position. When I supported a CSIRT, it took me 8 months to get my agency-specific clearance, and in that time, I found out that the "CSIRT" didn't actually respond to anything; if they happened to find out that something happened, they had to request that someone from network ops run a tool (just one) on the suspect system. When I found out that the one tool was one that simply listed processes, I suggested that along with the process, we also get the path to the executable image (for context), and the person I suggested this to got offended.
In the other position, all of the case agents would take their work to one or two analysts, while the rest of us got really good at Solitaire.
If you're a contractor and having trouble finding "qualified candidates", then the issue may lie with the positions themselves. I've spent time with contracting firms whose business model is to be a seat-filler, and to be honest, I can see why they're having trouble finding qualified candidates.
I'm not talking about being cynical about the position or the company...I'm talking about being honest about it, that's all. After all, if you're not honest about the position, it's going to be a revolving door of candidates. As bad as it sounds, a worse outcome is having someone realize how it is, and stay.
So, my point is that there are, in fact, highly skilled individuals in the "cyber" arena. Many of them have time in the industry, have learned a lot of the lessons I've described (and more), and have created for themselves an environment where they're happy. Some of the highly qualified but relatively new individuals in the industry have gravitated to the more experienced folks, and are similarly very happy.
Rather than repeating the "lack of qualified candidates" mantra, take a good hard look at what you're doing to find those candidates. Is it the process you're using? Is it the business model that needs to be changed? Or, consider "rolling your own"...use your current expertise to develop and grow new expertise.
Addendum, 19 Jan: I ran across this INC article today that gives 16 steps to help make your interview a success. The problem I've always found is that there aren't articles like this for those on the other side of the table...those who have head count and a position to fill. There are a lot of articles out there that talk about how to be an interviewee, but few that really prepare the interviewer.
Addendum, 25 Jan: Here's a Forbes article that discusses answers to the 5 dumbest interview questions; the point is that they're still being asked.