“More important than science is its results; one answer invokes a hundred questions. More important than poetry is its results; one poem invokes a hundred heroic acts.” J.L. Moreno
Some years ago, one of my teachers said, “It’s not so important what you do; it’s more important what you do NEXT.” What he meant was that if you undertake to do something, you should be prepared to handle what unfolds as a result of your actions. This is pertinent for organisations that choose to measure customer experience, carry out staff surveys or gauge culture or capability.
With this in mind, I’d like to expand on a point I made in one of my previous articles. Many organisations have cottoned on to the practice of measuring and surveying in order to get some kind of readout on culture, attitude or behaviour. Measuring is a good way of exposing OD gaps or establishing the L&D needs of an organisation. It is also a good way of getting a snapshot of how responsive an organisation is to its customers or how well it is doing with regard to staff engagement. There are many kinds of metrics, measuring all manner of organisational phenomena. I’ve encountered a few of these in my time and find them endlessly fascinating, but then I can be a bit of a nerd sometimes.
As I mentioned in my previous article, it is true that in some cases, a thing observed is a thing changed. This is not always the case, however. Carrying out staff surveys or measuring phenomena such as customer experience or staff engagement is only the first step. Once you have found the sources of your problems, I would suggest that there should be some sort of action plan to deal with them. This may seem blindingly obvious to many of you; however, in my time, I have seen far too much confusion and inaction in response to measures and surveys.
For my company, the first step in any engagement with clients is a comprehensive analysis of their system, and a few of our past clients have been tempted to go no further than this. Because we apply a strengths-based methodology to our analysis, people come away with a greater sense of what is functioning well and what capabilities they already have at their disposal. It is usual at this stage for clients to say to themselves, “Wow, we are doing better than we imagined,” but that is often because in our society, we are so habituated to focussing on what is going wrong that when the ‘health’ is instead highlighted, it comes as a bit of a surprise.
No system is perfect, however, and in our systems analysis, we also uncover the hidden causes of a client’s dysfunctions. Some clients have believed that this ‘seeing’ was sufficient; this could be likened to going to the doctor to investigate some recurring physical symptoms, say chronic lethargy, and upon discovering the underlying cause, doing nothing in response. “I’m just relieved to know what it is,” is something I’ve heard more than once. This is, of course, normal human behaviour; if someone is in the ‘pre-contemplative’ stage of making a change, the associated thoughts will be “There is nothing wrong,” or even “I’m sure it’ll sort itself out.” We humans are masters of self-deception, even in the face of hard fact and evidence. Even though we know what is good for us, we still sometimes exhibit behaviours which fly in the face of good sense and reason.
Beyond my own nerdishness and fascination with statistics and measurements, I can see the importance, in some instances, of measuring things in organisations. It is vital that this is done purposefully, though. A robust and strategic measure can confirm something which is felt intuitively, and catalyse an organisation into corrective action. A clear, simple metric, such as a good 360, can also be a spur for individuals to change something about themselves. Nothing like a cold, hard look in the mirror to jump-start us into doing something different.
Or not. As I say, there is nothing like human behaviour to prove that logic and good sense do not always prevail, and this is also the case for organisations. Do any of these scenarios seem familiar?
We’ve measured, we’ve got the results, but why did we bother measuring in the first place? It is essential that both the metrics and the subjects of measurement are well thought out, and that measuring is not done simply as some slavish, knee-jerk response to a perceived trend or fad. I have seen injudicious surveys carried out which arose from no identified need or strategy and which ended up wasting the organisation’s precious resources. Perhaps because “everyone else is doing it”, HR was charged with carrying out some kind of culture or staff engagement survey. Be sure there is an identified need in the organisation. Measure if you are seeking data; and when you get the data, do something purposeful with it. Blindingly obvious, I know.
We’ve measured, we understand what the results mean, now let’s see how things change over the next year. If you are going to measure, be prepared to do something in response when the results come in. The experts in measuring usually provide reports that outline and interpret the data and go on to make suggestions as to what to do next, but because their expertise is in measurement, they often do not know how to follow up, beyond coming back next year to measure again. You cannot count on change occurring simply because they have reported that your managers need to be better at listening to staff. There is a strong likelihood that, if you do nothing, when they come back next year to re-measure, things will not have changed very much. Ensure that you follow up with some sort of action plan. Blindingly obvious, I know.
We’ve seen the results of the survey and now we must do something about it. Many New Zealanders take pride in what they call their “#8 wire mentality”. Put simply, it translates as, “We can fix it ourselves.” While this may have been a real strength in years gone by, it is now a significant weakness. New Zealanders are not the only people who pride themselves on self-reliance, though. Many of us like to think that we’ve got it all in hand and that we don’t need to ask for help. Really? You may be the CEO or the CHRO, but it does not follow that you are actually capable of developing listening skills in your managers or that you know how to get your salespeople to really put themselves in their customers’ shoes. Admittedly, resources are pretty stretched these days, and kudos to leaders who attempt to follow up on survey results themselves, but holding a two-hour staff meeting in which you tell people to improve is no guarantee it will happen. Right tool for the right job: get a professional in. Blindingly obvious, I know.
We’ve measured, but we don’t like those results. Let’s not measure again. If you don’t like the answers, don’t ask the questions again. Sad to say, but I know of organisations that actually operate like this. In one of them, the senior executive team all undertook a 360 survey, and when the results were in, they were so shell-shocked that they decided they would never do an exercise like that again. Needless to say, by the time we came along some months later, they were not interested in engaging my company’s services because by then, the thought of shifting attitudes or behaviour was simply too scary or too hard. Again, this is normal human behaviour; we get overwhelmed when we look in the mirror sometimes and resolve never to look again. There are subtle shades to this approach: “Those results are invalid because those guys who did the measuring are no good anyway;” “Those results are invalid because our industry sector is really special and unique and their measures don’t take that into account;” “Those results are invalid because I told them what to measure and what not to measure and they didn’t do what I told them.” Don’t ask the question if you think you might find the answer distasteful. Blindingly obvious, I know.
What other scenarios have you encountered with regard to measurement and surveys, and what innovative responses have you seen?