
Thursday, April 04, 2019

A Look Back...

I was chatting recently with someone I'd crossed paths with a bit more than 21 years ago, and throughout the conversation, I had this sense of nostalgia. It led me to the thought that when you've been in and around any industry for a while, it's interesting to take a step back and reflect on your path.

One example of nostalgia and looking back that comes up pretty regularly is when I meet former Marines. I've met many who were anywhere from toddlers to elementary schoolers when I left active duty, and when we talk about what we did, I have to remember that my MOS (military occupational specialty) no longer exists. I was a Communications Officer, 2502, and the 25xx occfield is something a lot of current and former Marines have never heard of. So, nostalgia.

So, how does this relate to the cyberz? I first read The Cuckoo's Egg years ago, and thought, wow, what must it be like for Cliff Stoll and the others involved to look back on what they did and the decisions they made? What was it like to revisit the events of that time with someone else who was there, reminiscing with, "...hey, do you remember when...?"

When I started grad school in June 1994, I walked by Gary Kildall's office for about 4 months before he passed. I never met the guy, and to be honest, would not have known who he was; I didn't find out until several years later. At the time, I was using MS-DOS and Windows 3.1, and had transitioned to Windows for Workgroups 3.11 just prior to heading west. I had never heard of CP/M, nor did I know much about operating systems at the time.

While I was in grad school, OS/2 was a thing. I went to Fry's Electronics in Sunnyvale to purchase a copy of OS/2 2.1, in part to install it on my desktop computer (with a 486DX microprocessor), but also to get the $15-off coupon sticker to apply to the purchase of OS/2 Warp 3.0. I'd had to install a second hard drive in the computer, in part because drives were so small back then.

To get a sense of perspective on cost, I dropped $500 to upgrade to 16 MB (YES, with an "M") of RAM. When you installed the second hard drive (the ribbon cables only had two connectors), you had to be sure to set the jumpers correctly.

As part of my master's thesis, I wrote an SNMP polling utility in Java. That was just for the data collection phase of my thesis work; it was preceded by setting up a lab environment, using Windows 95 and Windows NT 3.51 servers, with two different network media (10BASE2, 10BASE-T) connected across two Cisco 2514 routers. One of the routers was connected to the campus area network (CAN) via a 10BASE5 vampire tap. I used simple video teleconferencing software to generate traffic across the network, and used the SNMP polling application to collect traffic volume information over time. Once volumes of data had been collected, I did all of the statistical processing and image display in MATLAB. To generate the traffic, I'd sit in front of one of the video cameras eating my lunch, in sight of the other camera, and wave my arm to generate frame updates from the second camera.
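
Out of curiosity more than anything else, here's what the core of that kind of polling loop might look like today. This is a minimal sketch in Python (the original utility was written in Java and was considerably more involved), assuming the pysnmp library's synchronous hlapi; the target address, community string, and interface index below are made up for illustration.

# Minimal sketch: poll an interface's inbound octet counter over SNMP
# at a fixed interval. Assumes the pysnmp library (v4 hlapi); the host,
# community string, and interface index are hypothetical.
import time
from pysnmp.hlapi import (SnmpEngine, CommunityData, UdpTransportTarget,
                          ContextData, ObjectType, ObjectIdentity, getCmd)

TARGET = "192.0.2.10"                     # hypothetical lab router
COMMUNITY = "public"                      # read-only community string
IF_IN_OCTETS = "1.3.6.1.2.1.2.2.1.10.1"   # IF-MIB::ifInOctets for ifIndex 1

def poll_once():
    """Issue a single SNMP GET and return the counter value."""
    error_indication, error_status, _, var_binds = next(getCmd(
        SnmpEngine(),
        CommunityData(COMMUNITY, mpModel=0),    # mpModel=0 -> SNMPv1
        UdpTransportTarget((TARGET, 161)),
        ContextData(),
        ObjectType(ObjectIdentity(IF_IN_OCTETS))))
    if error_indication or error_status:
        raise RuntimeError(error_indication or error_status.prettyPrint())
    return int(var_binds[0][1])

if __name__ == "__main__":
    # Record (timestamp, octet count) pairs for later statistical processing.
    samples = []
    for _ in range(10):
        samples.append((time.time(), poll_once()))
        time.sleep(5)
    print(samples)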

As an aside, the knowledge I developed of SNMP would serve me quite well, well after I left grad school and the military. Not only could I make recommendations to a network operations team regarding security (e.g., don't allow SNMP, or even just UDP, through the firewall) based on that knowledge, but I could also use what I had learned to develop information about systems during pen testing. Yes, you'd be surprised what was accessible from the Internet back then (SNMP, NetBIOS), and even later.

All of the main terminals in my graduate program were SPARCstations running Solaris (Netscape was the browser we used). I'd used TRS-80s in high school (programming Pascal) and college (programming BASIC), and my family's first home computer was a Timex Sinclair 1000; the second was an Epson QX-10. During that time (the early '80s) I had also taken a course in BASIC programming on the Apple IIe. Somewhere along the line, a friend of mine had a Commodore 64, but I wasn't into playing computer games so I didn't spend much time with it. I never did touch an Amiga, but I did take two courses in grad school that involved using assembly language to program the Motorola 68000 microprocessor.

I was in grad school when SATAN was released (circa 1995). Shortly after it became available, I transitioned into the role of student rep from my academic department to the IT board. I sat through more than a few meetings where the IT admins from across campus argued about running the scanner on each other's networks. I never actually ran the tool; I was afraid to even download it. This was due to the fact that I'd downloaded a copy of a file called "crack.c", in order to learn a bit more about C programming (for a course I was taking at the time). I got in "trouble" with the senior sysadmin for the department because I had the file on my system. I argued that I had never compiled it (part of the course was the compilation process), never created an object file, never linked it, etc. None of that mattered to her. Things went really sideways when she claimed that I had violated THE security policy; knowing that I'd never signed such a thing, I asked to see the security policy. That's when she turned red and stormed off. Even though the two other admins were on my side, I was in the doghouse with the senior sysadmin. The fact was that it was another year before a new Admiral took over the school as superintendent and there was an actual written security policy. Even when that policy became available, downloading crack.c would not have been a violation.

As my active duty time ended and I was processing out of the military, I was attached to the Marine Detachment at the Army's Defense Language Institute (DLI). My role at the time was to connect the computer systems in the detachment to the Army's CAN, which was Token Ring. Around that time, the Commandant of the Marine Corps (Chuck Krulak) had stated that Marines were authorized to play "Marine DOOM", and the detachment had purchased six Gateway computer systems for that purpose. These systems were set up on a round credenza, connected via a 10BASE2 network, running IPX. At one point, a SSgt joined the unit and decided to make the room his office. To make more room, he had the Marines split the credenza in half and place the flat parts against opposite walls, with three stations on either side of the room. To do so, the computers needed to be disconnected. Once everything was in place, the computers were reconnected, and attempts were made to run the game, all of which failed. The SSgt approached me for assistance; I didn't realize until much later in life that this was my first consulting gig. The SSgt informed me that the computer network had been reassembled EXACTLY as it had been before. I took a look at the system at the end of the network, and realized that the coax cable had been attached directly to the NIC; looking around, I found the T connector and terminator sitting under the keyboard. No one saw me reconnect everything, but I told the SSgt to try again, and the network sprang to life. Within minutes, the Marines were back to playing the game.

In one of my first jobs out of the military, I was doing assessment work. During that time, I spent time in one of the Twin Towers in New York City, doing war dialing work. Our tools of choice were THC-Scan and ToneLoc. It was pretty exciting for two of us to sit in a cubicle, with the sound turned way down on the laptop we were using, listening to the responses we were getting from remote systems. Most of the time, the phone on the other end would be answered, and we'd hear "hello" come out of the speakers. It really got exciting when we could hear the software dial a number, a phone ring two cubicles down, and the person in that cubicle answer, their "hello" echoing in the speaker of the laptop. When the software dialed the cubicles on either side of us, we threw our jackets over the laptop to ensure that the occupants didn't hear us. We got "caught" when the software dialed phones on opposite sides of the mainframe room; the mainframe admin apparently wasn't happy about having to get up and walk across the room, and he called the CIO to report the activity.

I later worked at Trident Data Systems (TDS), where I was doing vulnerability assessments and some light pen testing work. We used ISS's Internet Scanner for our vulnerability assessment work, and while I was there, I began developing a tool we called "NTCAT" to replace it. We were getting a good number of spurious findings from Internet Scanner, most notably with respect to the Windows "AutoAdminLogon" setting, and were learning more about what Internet Scanner was actually doing; that is, what data it was collecting, and how it was making its determinations. That is where we were running into issues, and what we were addressing with the new tool. When I wasn't doing assessment work or writing reports, I was connected to the lab, developing and testing NTCAT. This is also where I really started to see the value in graphic representations of data, albeit in a very simple format (i.e., red for bad, green for good). I also began to see, up close, the value of processing data in different ways; getting a quick SWAG while you're on-site, versus a deeper inspection of the data when you had time. Taking it a step further, we were finding out new things and developing new tools to process the data, making things more useful and valuable to the analyst, and subsequently to the customer.
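
For a sense of the kind of check that generated those spurious findings, here's a minimal sketch in Python of reading the Winlogon "AutoAdminLogon" value and rendering a simple red/green result. It uses the standard winreg module against the local registry, whereas NTCAT queried remote hosts; nothing here reproduces NTCAT's actual code.

# Minimal sketch: read the Winlogon AutoAdminLogon value and flag it
# red/green. Inspects the local registry via Python's standard winreg
# module; the real tool described above queried remote systems.
import winreg

WINLOGON = r"SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon"

def check_auto_admin_logon():
    """Return (status, detail) for the AutoAdminLogon setting."""
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, WINLOGON) as key:
        try:
            value, _ = winreg.QueryValueEx(key, "AutoAdminLogon")
        except FileNotFoundError:
            return ("green", "AutoAdminLogon value not present")
        try:
            pwd, _ = winreg.QueryValueEx(key, "DefaultPassword")
        except FileNotFoundError:
            pwd = None
    if str(value) == "1":
        detail = "enabled" + (", DefaultPassword stored in cleartext" if pwd else "")
        return ("red", "AutoAdminLogon " + detail)
    return ("green", "AutoAdminLogon disabled")

if __name__ == "__main__":
    status, detail = check_auto_admin_logon()
    print("[" + status.upper() + "] " + detail)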

One of the things I pitched to my manager at the time was that once we collected all of this data from the customer, we had it. We'd need to address it legally, through our contracts of course, but we could keep the actual data in a secure manner (encrypted and/or stored offline on a CD), and as we learned new things, run the new processing across the retained data. It's interesting to look back on those discussions, and then look at what we do now with respect to MSS and EDR, particularly regarding data retention times. I've seen some of the same questions and issues from back then in today's world; few EDR tools provide a retrospective look at what happened on a system prior to the installation of the agent or sensor, and as such, IR teams have to use other means to collect this historical data. This brings up questions of "what do you collect?", "how do you collect it?", as well as "how long can you keep the data?" Times have changed, technology has changed, but we still see some of the same issues.

During my time with TDS, some members of our team were working a pretty big pen test, and had run up against an Internet-connected PBX. They really wanted to get access to this PBX, and worked hard at doing so. At one point, several members of the team went to the loading dock on the first floor of our building for a smoke break, and were talking about what they'd run into. On the first floor of our building was an engineering firm, one that specialized (as it turns out) in that PBX. While our guys were out on their break, one of the engineers was just finishing up a break of his own when he overheard what they were discussing. He walked over and offered them the 15-character password that was hard-coded into the PBX.

In another job, I worked (indirectly) for a security director who'd appeared by name in the book Takedown. Several years later, while I was attending a conference in Seattle, I saw Kevin Mitnick. I didn't talk to him...he was surrounded by a mob, and I don't do mobs. Also, I wasn't really sure what I'd say.

As time passes, I may write some additional posts, looking back on the things I've seen, as well as folks I've engaged with.  We'll see...


1 comment:

  1. I remember when SATAN was released and Dan Farmer got $#!+canned for releasing it. Was pretty mad about it, at the time.

    Still kinda mad about it, actually.
