I've referenced my new job a couple of times recently without actually saying what I'm doing - I figure it's time to explain where I ended up. But I'll do that by way of a story about where the industry is and what I'm focusing on in my new role.
As you probably know, I've spent time in a whole bunch of different roles within the security community - customer side, service provider, product vendor, consultant, and so on. Most recently, I was a security architect at a large insurance company on the east coast, in a particularly dysfunctional security organization. And I wouldn't have traded it for anything - the dysfunction let me see clearly a whole bunch of management and career strategies that wouldn't have been evident otherwise.
But one thing in particular really started to bother me. We were an under-resourced group (what security team isn't?), and we had to maximize the effectiveness of our investments. So we did a good amount of due diligence on products - there were no paper evals. We actually brought products in and put them through their paces to make sure they worked.
And even in that type of organization, I was seeing project completion rates of 20-30%. I had heard the statistic that the IT industry completes less than 20% of its projects, and here was a relatively well-disciplined project management organization doing not much better.
The main reason? Product inadequacy or unfriendliness.
A great example came when we were deploying an enterprise desktop application. The product is a market leader with a great reputation for usability and some great reference customers, and it passed pilot incredibly easily, so we decided to deploy it on 40,000 desktops and laptops throughout the organization. Of course, we didn't have the resources or testing equipment to pilot the product on more than a few machines in a test environment. So, before moving forward, the project manager asked the sales engineers whether a 40,000-machine deployment was going to be a problem.
"Oh, no", they replied. "Our architecture can handle that perfectly well."
So we went forward. And when roll-out day came, we found out (the incredibly hard and painful way) that the product could deploy to only 900 machines at a time. Our 1-week roll-out became a 2-month roll-out. There was much wailing and gnashing of teeth, and much abuse of the vendor. But nothing could be done - that was all the product could do.
That was just a single example, and, having spent time on the vendor side of the world, I know it's not even a particularly egregious case of vendor sales exaggeration. I've seen salespeople completely misrepresent product functionality to clients to win business.
To me, this type of exaggeration and misrepresentation is one of the biggest risks information security teams face today - with budgets that are never high enough, a 7-figure purchase of a product that doesn't perform as advertised just isn't acceptable. It's the kind of thing that gets CISOs and their direct reports fired, and gives security a black eye within the organization.
So, when I got a call from Greg Shipley at Neohapsis about a vacancy at the top of the Neohapsis Labs organization, I got incredibly excited, because I saw immediately the opportunity to help stem the tide of crappy information out there. Neohapsis has always had an amazing reputation for product testing - from the old Network Computing reviews, to the work we do for individual clients validating that the product they're about to deploy actually works the way it's supposed to, to helping vendors prove that their product works as they're about to advertise (hint: most of the time, they have to fix something after we look at it). The work I'm doing now lets me help fight bad product.
As I look at it, I've seen a few too many multi-million dollar security product engagements fail to be anything but cynical about it - but the customers who use the lab before they deploy at least know that the product works as advertised. Or that it doesn't. (If only I had known as a customer all the things I've learned in my first 3 weeks reviewing the old lab reports here, I'd have saved my team a lot of headaches and steered clear of some big mistakes.)
So, if you're about to spend a few hundred thousand or a few million dollars on a product, it might be a good idea to drop me an email first... we might be able to keep you from making a really big career-limiting move.