It has been my experience that many groups do a poor job of managing the tools they have. This comment is not directed at managing costs or keeping up with renewals, though those can be problems as well, but rather at how we deploy and employ the tools themselves.
So how are we falling short? Most tools are purchased and deployed in what I refer to as the “buy, fire and forget” mode. The whole mess starts when the tool’s sales team comes out to your company and demonstrates how their tool can assess the world in a picosecond and generate highly detailed reports at the press of a button. Interestingly enough, most tools will perform as advertised. The issue then is not how or when to press the button, but what to do once that button has been pressed.
We often speak of the integration of a tool as the installation of the tool, the granting of the access rights it needs to perform its function, and the subsequent generation of a report. This is where we make our mistake. As managers, we should view tool integration as the entire process from acquisition through to the point of an improved security posture. The interesting part of that view is that an improved security posture is not the sole province of the information security team. With that definition in mind, installation of the tool represents, maybe, 10% of the total effort required for true “integration”.
Allow me to provide an example of what I mean. A number of years ago, I was hired at a company to head up their information security program. Upon arrival, I was presented with a vulnerability report showing tens of thousands of vulnerabilities across the networked environment; the report was more than 700 pages long. The security team noted that they had been dutifully running the assessment monthly and the numbers continued to climb. From their perspective, this was clear evidence that the systems administrators were not concerned about the security of the environment.
When I spoke with the system administration staff, they stated that the reports were not accurate and cited several false positives in the report. Their view was that working through an inaccurate report was a total waste of time. What was the security team’s proposed solution? Purchase a different tool. Clearly the tool was the root of their problem.
On the surface, the stated problem and the proposed solution sounded reasonable. However, when speaking with the senior manager of the security group, he noted that the tool was eighteen months old and was considered one of the best on the market. So what had happened over the last year and a half to turn a state-of-the-art tool into something that needed to be replaced?
My first question to the team was, “Who here has fully read this report?” No hands were raised. The reason? The report was more than 700 pages. Who in their right mind would want to read that? I confessed to the team that, having slogged my way through a number of English literature courses, I too would be disinclined to read the report. There was no surprise that we were getting pushback from the administration teams. My suspicion was that the administrators had full-time jobs, and those jobs probably did not include reading the security team’s version of “War and Peace”.
I pointed out to the security team that if they, a group specifically hired as security experts, were unwilling to read through their own security document, why would they ever expect non-security people to read the same document? This was a clear case of “buy, fire, and forget” integration.
We undertook a formal effort to fully integrate the tool into the company. We, the security team, began by documenting what the basic elements of tool integration should include. The security team and I also reviewed all the complaints and excuses we received from the administration staff regarding our report.
For example, to address the complaint highlighted earlier regarding the false positive findings in our report, we brought in experts on the tool to help us manage those findings administratively. Did it add to our level of effort? Yes, but turning out a reliable report is also part of our job. Not so surprisingly, by addressing this complaint, confidence in the reports we produced rose dramatically.
Another major change came through our understanding of how the responsibilities of the system administration groups were organized. When we took their organizational structure into account, our 700-page tome became a dozen or so smaller reports. Each of those smaller reports contained only the information that fell within the purview of that group of administrators. An example: the Linux team only received reports that pertained to the Linux systems over which they had responsibility.
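The splitting described above is simple to automate. Here is a minimal sketch in Python; the findings, field names, and team labels are all hypothetical illustrations, not the actual tool's output format:

```python
from collections import defaultdict

# Hypothetical findings as the scanner might export them; all values are
# illustrative assumptions, not real data.
findings = [
    {"host": "lx-web-01", "team": "linux", "issue": "OpenSSL out of date"},
    {"host": "win-db-02", "team": "windows", "issue": "SMB signing disabled"},
    {"host": "lx-app-03", "team": "linux", "issue": "Banner disclosure"},
]

def split_by_team(findings):
    """Group findings into one smaller report per owning admin team."""
    reports = defaultdict(list)
    for finding in findings:
        reports[finding["team"]].append(finding)
    return dict(reports)

per_team = split_by_team(findings)
# Each team now receives only the findings it is responsible for.
```

In practice the grouping key would come from an asset inventory or CMDB lookup rather than a field already present in the scan output, but the principle is the same: one report per group of administrators.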
To help accelerate the path towards an improved security posture, we removed all the low and medium risk items from the report. This allowed the administrators to focus first on the items representing the greatest risk to the company. Only after we had a solid handle on the high risk findings did we reintroduce medium findings into the report. As a side note, the decision to initially address just the high risk items and ignore all others was made jointly with the senior executives and the company’s audit committee, and both of those groups were solidly behind the plan.
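The same kind of filter handles the severity triage: start with only the high risk findings, then widen the filter once those are under control. A minimal sketch, again with hypothetical field names and sample data:

```python
# Hypothetical findings; severities and issues are illustrative assumptions.
findings = [
    {"host": "lx-web-01", "severity": "high", "issue": "OpenSSL out of date"},
    {"host": "win-db-02", "severity": "medium", "issue": "SMB signing disabled"},
    {"host": "lx-app-03", "severity": "low", "issue": "Banner disclosure"},
]

def filter_by_severity(findings, keep=("high",)):
    """Return only the findings at the chosen risk levels."""
    return [f for f in findings if f["severity"] in keep]

# Phase 1: high risk items only.
high_only = filter_by_severity(findings)

# Phase 2, after the high risk backlog is under control:
# reintroduce the medium findings.
high_and_medium = filter_by_severity(findings, keep=("high", "medium"))
```

The point is not the code itself but the policy it encodes: the report that lands on an administrator's desk should contain only what that administrator is expected to act on right now.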
There were a number of other changes that we introduced to support the administration teams. The single biggest change was in our attitude towards those teams. As a direct result of this exercise, we came to realize that they were our customers and consumers of our product. That also led to the understanding that the security and integrity of the company’s computing resources was not the sole purview of information security.
In the end, our now fully integrated tool led to dramatically fewer issues, far more rapid resolution of the issues that arose, and significantly better relations between information security and the groups being assessed.