Criticism: A Security Chief's Most Valuable Resource
This past week, a lesson about enterprise information security found its way to me via an unorthodox channel: an episode of Gordon Ramsay's Kitchen Nightmares. If you haven't seen it, the premise of the show is that Ramsay shows up at a restaurant -- usually one "in crisis" -- and leverages his experience to put it back on track.
This particular episode focused on an Arizona-based bakery and bistro. I won't go too far into the specifics, since that's not the relevant part, but the upshot was that Ramsay wasn't able to help. Why not? At least as it was portrayed on the show, the issue was how the owners handled criticism.
Even though Ramsay might have proposed any number of viable strategies to ultimately benefit the restaurant -- e.g., changes in image, menu additions or subtractions, or customer service improvements -- none of that would have been feasible since all feedback, constructive or otherwise, was perceived by the owners as an attack.
Instead of incorporating Ramsay's feedback -- learning from it and adapting their business for the better -- the owners maintained the status quo. There's a lesson in this for those of us in the network and security space -- particularly those in the community with a decision-making element to their role.
Specifically, to be successful, we need to be able to hear feedback. Not only do we need to be able to hear it, but we need to analyze it objectively and incorporate it directly into our planning. When we don't, we're leaving a valuable resource on the table.
Feedback -- Not Unsolicited Advice
Now, I want to be clear from the get-go that by "listening to feedback," I'm not suggesting practitioners react willy-nilly to unsolicited advice from anybody with an opinion. There are quite literally as many different viewpoints about how security "should be done" in an organization as there are people within it. As a consequence, it's a given that there will be some percentage of requests, desires, or needs that we can't accommodate. That's both normal and natural.
What I'm referring to is building a mechanism to systematically measure and analyze the degree to which our program services the needs of the stakeholders we support. As practitioners, we've all heard the truism about how it's important for security to support and "enable" the business. However, you'd be surprised how rarely organizations actually spend time evaluating whether they're doing that effectively.
Consider this against the broader context of IT. As most of us know well, many organizations actively take steps to understand customer satisfaction as it relates to support interactions in IT. For example, many ticketing systems automatically ask support customers to provide ratings or feedback in response to support tickets; other organizations use post-incident survey forms to determine the level of satisfaction with staff in dealing with a particular support issue. This information is used to evaluate a support team's performance: how well the support staff serviced the customer and how the customer viewed the interaction as a result.
In security, though, we don't often have a comparable measuring instrument -- meaning we often don't know whether the experience business teams have in working with our team is positive or negative. This is problematic. Why? Because if business teams find the experience of working with us valuable, they'll proactively get us involved in future endeavors. If they see the experience as nonvaluable, they'll bring us in only when they have to -- and then only grudgingly.
Security 'Experience Management'
What I'm essentially talking about here is "experience management" for security: having a way to know how our business partners view the level of service that we as the security organization provide. Keeping track of that over time is a way to gauge whether we are improving in light of changes to our program as well as changes in how business teams use our services.
Probably the best -- and arguably only -- method to accomplish this is to ask them directly. Reach out to them. One way to do that is to canvass them directly via face-to-face dialog -- i.e., go out and meet with them to ask how you're doing, where you can improve, what services you're not offering but should, and -- most importantly -- what their pain points are.
The advantages of performing this exercise as a face-to-face exercise are that it's intimate; you get to ask questions; and it gives you a chance to actively market the capabilities you provide. The disadvantages are that you need a thick skin, since not all feedback will be positive; it's time-intensive; and some percentage of the folks you meet with might be less than candid -- some folks will be challenged in telling you to your face that you're not getting something right.
Another approach is through computer-based surveys. This allows you to canvass a larger segment of the population in less time, but it lacks the personal interaction of a face-to-face meeting. It also may be perceived as more of an inconvenience by business partners, due to the perception -- in the absence of evidence to the contrary -- that responses won't be taken seriously. This is particularly true the first time through.
If you do decide to conduct an automated survey, keep in mind that identification of respondents is a core issue: An anonymous survey increases the likelihood of candor, while a participant-identified -- or "identification optional" -- survey means you can ask for clarification if needed. Either way, solicit business area information if you can, since one benefit of the exercise is to identify areas with special needs.
Whether you approach it through face-to-face discussion or through an automated survey, it's particularly valuable to repeat the process periodically. Why? Because this is where you get to see whether changes you've made in response to feedback are moving the needle when it comes to perceived value in the stakeholder community.
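If you're collecting survey scores over repeated rounds, even a minimal script can surface whether the needle is moving per business area. Here's a hedged sketch in Python -- all area names, rounds, and scores are invented for illustration, and real programs would pull responses from whatever survey tool they use:

```python
# Hypothetical sketch: average satisfaction per business area across
# repeated survey rounds, to see whether feedback-driven changes are
# moving the needle. Data below is invented for illustration.
from collections import defaultdict
from statistics import mean

# Each response: (survey_round, business_area, score on a 1-5 scale)
responses = [
    (1, "Finance", 2), (1, "Finance", 3), (1, "Engineering", 4),
    (2, "Finance", 3), (2, "Finance", 4), (2, "Engineering", 4),
]

def satisfaction_trend(responses):
    """Return {business_area: {survey_round: mean_score}}."""
    by_area = defaultdict(lambda: defaultdict(list))
    for rnd, area, score in responses:
        by_area[area][rnd].append(score)
    return {
        area: {rnd: round(mean(scores), 2) for rnd, scores in sorted(rounds.items())}
        for area, rounds in by_area.items()
    }

for area, rounds in satisfaction_trend(responses).items():
    print(area, rounds)
```

In this made-up data, Finance's average climbs from 2.5 to 3.5 between rounds while Engineering holds flat at 4.0 -- exactly the kind of per-area signal that tells you where changes are landing and where special attention is still needed.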