Monday, September 09, 2019

The Ransomware Economy

There's no doubt about it...cybercrime, and especially ransomware, is an entire economy in and of itself.

Don't believe me?

Read through this ProPublica article, not just once, but a couple of times.  And take notes.  Then go back and read the notes.  Here's what I got from the article:
  1. Organizations are looking to insurance policies to defray the costs of incidents.  Rather than investing in prevention, detection, and response, they're accepting (to some degree) that these incidents are going to happen, and seeking to establish a means to minimize their financial risk.  Hence, insurance policies.
  2. A ransomware incident occurs, and the policy kicks in.  Depending upon how the policy was set up, and what it covers, the deductible may be much less than the ransom.  Financial risk minimized.
  3. Insurance providers are more interested in getting ransoms paid quickly; getting the encryption keys and recovering files minimizes down time, and therefore any additional costs incurred as a result of services not being available.  So, insurance providers want the ransom paid, in order to minimize their financial exposure.
  4. There's also an entire economy that's popped up around ransom payment brokers, organizations that act as intermediaries between victim organizations, insurance providers, and the bad guys.
But is that the end?  Is this just about encrypting data and getting paid to unlock it?  I wouldn't think so, and here's why.  One of the things I've always been curious about is whether any data exfiltration is going on prior to the encryption.  Are the bad guys taking anything before encrypting files?  In most cases, it's hard to tell...I'm aware of ransomware cases where the bad guys were actually in the environment for weeks or even months before encrypting files, and the artifacts of data staging and exfiltration may be fleeting at best, and gone entirely by the time incident response begins.
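
As an aside, one way to check for staging is to sweep likely staging locations for recently created archive files.  Below is a minimal sketch in Python; the directory list, extensions, and time window are all assumptions that would need to be tuned to the environment and to the adversary's observed tradecraft.  Keep in mind that coming up empty proves nothing...adversaries routinely delete their archives once the data has left the environment.

    import os
    import time

    # Hypothetical staging locations and archive extensions...adjust as needed
    STAGING_DIRS = [r"C:\Windows\Temp", r"C:\Users\Public", r"C:\PerfLogs"]
    ARCHIVE_EXTS = (".zip", ".rar", ".7z", ".cab")
    WINDOW_DAYS = 30  # only flag archives created within the last 30 days

    cutoff = time.time() - WINDOW_DAYS * 86400

    for base in STAGING_DIRS:
        for root, _dirs, files in os.walk(base):
            for name in files:
                if not name.lower().endswith(ARCHIVE_EXTS):
                    continue
                path = os.path.join(root, name)
                try:
                    st = os.stat(path)
                except OSError:
                    continue  # file already gone...staging artifacts are fleeting
                # On Windows, st_ctime reflects the file's creation time
                if st.st_ctime >= cutoff:
                    print(time.ctime(st.st_ctime), st.st_size, path)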

Not long ago, a fellow responder shared that many of the ransomware cases he works include an element of data exfiltration.  A recent 60 Minutes segment on ransomware includes a similar statement; if you watch until 9:50 in the segment, you'll see mention of the bad guys further extorting an organization by threatening to leak its "internal data".

Let's look at some of the reporting on ransomware, such as this article from The Conversation.  At one point in the article, we see the statement:

Ransomware usually spreads via phishing emails or links...

Perhaps "usually", yes, but not always.  The 60Minutes segment mentioned the Samsam ransomware; during the first half of 2016, these guys were seen using the publicly available JexBoss exploit to gain access to organizations through JBoss CMS servers.  At that time, the average time between initial access to the organization and deploying the ransomware was 4 months. In 2017, in some cases, they switched to Terminal Services servers, gaining access via easily-guessed passwords.  Yes, some ransomware (some Ryuk incidents, for example) incidents begin with a phishing email, and then branch off into deploying remote access tools, internal reconnaissance, possibly privilege escalation, networking mapping, and finally, deploying the ransomware.

Another quote from the article:

Offenders will do their homework before launching an attack, in order to create the most severe disruption they possibly can.

Yes, they will.  But what does this mean?  It means a couple of things: first, they decide who to target, and when.  Employees within companies have targets against which they're judged; sales reps, for example, usually hit crunch time at the end of a quarter.  So what the bad guys will do is send the sales rep something that looks legit, something they need to open.  Yes, they're targeting individuals.

What does this look like, you ask?  While not related to ransomware, take a look at the Mia Ash story, and you'll see what targeting looks like.  Going after sales reps, the finance department, legal counsel...all of these are targets within an organization, and very often the "lure" looks attractive enough to defeat phishing awareness training.  However, this is only the beginning.  In the Mia Ash story, the adversary developed a relationship with their targets, to the point where, when it came time to send a weaponized document for the target to open, the target had no doubt that they were dealing with "Mia".

Something that isn't stated in the media is that, for some ransomware cases, once an adversary gains initial access to an infrastructure, there are a number of actions that must take place in order for them to have such an impact as to make paying the ransom the obvious choice going forward.  They need to observe and orient to where they are, collect information about the infrastructure, make decisions (that's the easy part, they're often quite practiced at this), and then act.  This is Col Boyd's OODA loop.  In some cases, this can take weeks, and in others, months.  Unfortunately, one of the things missing from public reporting of ransomware incidents, in addition to the observed initial access method, is the time that the adversary is on target before deploying ransomware.  It's not an easy task to go into a completely new infrastructure and find those files and systems that, if unavailable, would bring the organization to a halt.

With visibility, these actions can be detected and responded to in a timely manner.  When I say "responded to", I mean determining the initial infection vector and following a containment and eradication plan early in the adversary's process.  Let's say that you detect a new account being created on a system, because you have the visibility to do so...which user account was used to create the new one?  How did that user account gain access to the system on which the command was run?  Follow the tracks back to the starting point, determine how the adversary got on the system, and then search your infrastructure for other, similar artifacts.
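
To make that concrete, here's a minimal sketch, assuming you've acquired the Security.evtx from the system in question and have the open-source python-evtx library available; it pulls the "user account created" events (event ID 4720) and shows which account created which, giving you a pivot point for tracing the adversary's access:

    import re
    from Evtx.Evtx import Evtx  # pip install python-evtx

    EVTX_PATH = "Security.evtx"  # path to the acquired log; adjust as needed

    def field(xml, name):
        # Pull a named EventData value out of a record's XML
        m = re.search(r'<Data Name="{}">([^<]*)</Data>'.format(name), xml)
        return m.group(1) if m else "?"

    with Evtx(EVTX_PATH) as log:
        for record in log.records():
            xml = record.xml()
            # Quick filter; a stricter check would parse the System/EventID element
            if ">4720<" not in xml:  # 4720 = "A user account was created"
                continue
            print("{}: {} created {}".format(
                record.timestamp(),             # when
                field(xml, "SubjectUserName"),  # the account doing the creating
                field(xml, "TargetUserName"),   # the new account
            ))

From there, the SubjectUserName is your next question: the logon events (event ID 4624) for that account tell you where the session came from, and you keep walking it back.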

It all starts with visibility.  Don't address ransomware by trying to figure out if you should restore systems from backup or pay the ransom; instead, catch the adversary early in their process and stop them before they encrypt their first file.

A Brief History of DFIR Time, pt I

Whether we like it or not, we're all time travelers. We're all moving through time, caught in the flow. In the western world, we're moving left-to-right, going along with the flow of time, from point A to point B. 

Sometimes it's interesting to look back at where we've been, what we've been witness to, and to reflect on and appreciate it.  Here's an abridged version of my take...

When I was a kid, my parents purchased a Timex-Sinclair 1000 computer.  I started out by following instructions for writing programs and saving them to a cassette tape...or trying to, as the case may be.  This wasn't the most reliable means (although it was the only one) of saving programs, and sometimes things would get corrupted and I'd have to start all over.  As I learned a little bit of coding, I'd try different things...I'd start with the basic (no pun intended) recipe, and then make small modifications to see what happened.

In the early '80s, I was programming BASIC on the Apple IIe during a summer course.  Later, my parents purchased an Epson QX-10, which my father used for word processing.  During my senior year in high school, I took AP Computer Science, which involved programming Pascal on the TRS-80 systems at the school.  My folks found a copy of Turbo Pascal, which meant I could compile my programs at home in minutes, rather than trying to schedule time on one of the TRS-80 systems at school and getting in before lunch, because compilation there took over half an hour for some programs.

When I went to college (circa '85), we had a BASIC programming course, and we were still using the TRS-80 systems.  There were some mainframe systems in the physics building, and while I didn't get a real introduction to networking, some of us did have fun sending messages to each other using the "wall" command.

After I got commissioned and went on active duty, I really didn't have a great deal of contact with computers.  In the Marine Corps at that time, Communications was a separate MOS from Data Processing, and as such, officers (and enlisted Marines) in those MOSs attended separate schools.  For officers, both school houses were located at Quantico at the time.  After training, I found that there was a great deal of cross-training in the fleet; quite often, CommOs were sent to data processing courses by their units.  The Marine Corps later combined the MOSs, along with the school houses and the curricula.

In the mid-'90s, I had the opportunity to attend graduate school, and I really got much more involved with computers.  I showed up with a 486DX desktop system that I used at home, and one of the first things I did was add a hard drive.  At the time, that meant putting it in the right location on the ribbon cable, and setting the correct jumpers on the drive chassis.  I later saved up and purchased additional RAM, going from 4MB to 16MB.  Yes, with an "M".  I also began going beyond Windows for Workgroups 3.11, and expanding into OS/2 2.1, and then later, OS/2 3.0 Warp.  At the time, I was using a SLIP/PPP script to dial into a local ISP, and then connecting remotely to the school systems.

Interestingly enough, I found someone in my local community who was running a BBS based on Amiga systems, and got a look at his setup.  That was a big deal at the time, because the town I lived in was close to a LATA border, meaning that while I could dial a number that was physically located about 10 miles south of me for no extra charge, the closest AOL POP was two miles north, and therefore, a long distance charge.  Eventually an airline pilot who lived in the local community set up an ISP, and I used that to access the Internet.

At school, I was working on SPARCstations, using the Netscape browser.  I was learning about Usenet, SunOS, *nix-based systems, etc., none of which had anything to do with the curriculum.  I was the student rep to the sysadmin council when SATAN was released.  During the course of my "studies", I learned a little bit of C and C++ coding, a lot of MatLab, and a good bit of Java, at a time before Java reached 1.0 GA status.  I played around with a bunch of different things in Java...I wrote programs to query fingerd, an email spoofing program, and some code that connected the chargen port to the echo port...that was fun!

When I first started at graduate school, I didn't know it at the time, but I spent about 4 months walking by Gary Kildall's office every day on my way out of the building.  His office was next to one of the main doors that led out to the quad, where I'd go sit to eat lunch.  I never met Gary, nor took one of his courses, and again, it wasn't until much later that I found out who he was, and the role he played (or, depending on your perspective, didn't play...) in the history of computing.  In one of my courses, I learned about the Hamming distance, and later took a seminar from Dr. Hamming himself.

As part of my master's thesis, I set up a lab; it consisted of two Cisco 2514 routers that I cross-connected, and from which I ran two small networks.  One was 10BaseT, the other 10Base2, and each had one Windows NT 3.51 server and three Windows 95 workstations.  The entire setup was connected to the campus backbone via a 10Base5 "vampire tap".  To collect data for my thesis, I wrote an SNMP polling application in Java, and processed the data using various statistical techniques in MatLab.

While I was in graduate school, one of my favorite courses was a new class in neural networks.  Part of the reason I liked it was how it was structured; the first half of the course was instruction and small projects to get our feet wet, but the projects were small enough to allow us to stretch a bit, as well.  In many of the courses available at the time, the labs were such that it took most, if not all, of the week to get them done, so there was very little learning beyond just finishing the minimum requirements for the lab.  In this course (and a few others), a different approach was taken, one that allowed the students to engage, experiment, and learn.  The second half of the course was a project, which was really cool to work on.  As it turned out, several of the students used that course as the basis for their master's theses...one wrote a program that could discern 'dirty' images of six consecutive Cyrillic characters (something you'd see in a satellite photo of Red Square, for example).  Another student created a neural network to assist with sonar identification.

So, why does any of this matter?  Well, 24+ years later, I can discern what's behind the terms "ML" and "AI" that we see with respect to cyber security products.  ;-)

My time in grad school was also when I started brushing up against "information security" in the world of computers.  During a C programming course, I finished my assigned labs and wanted to learn a bit more, so I downloaded a file called 'crack.c' to see what it did.  All I ever did was open it in an editor, but the senior sysadmin for the department got upset.  She even told me that I had "violated security policies".  When I asked to see the policies, knowing that I had never signed any such policy, I learned that there really was no written "policy"...nothing that students read or signed.  That changed more than a year later, when a new Admiral took over the school.

After I graduated, I spent 8 months processing out of the military, and during that time was assigned to the Marine detachment at the Defense Language Institute (DLI).  While there, one of the things I did was get the detachment's computer systems connected to the DLI campus area network (CAN), which was Token Ring.  Also during that time, the Commandant of the Marine Corps (Gen. Krulak) had stated that Marines were authorized to play "Marine DOOM"; the setup at the detachment was six Gateway systems connected via 10Base2, running IPX.  I was able to use what I had learned just down the street (literally) to help get the "network" up and running.