Stop Measuring, Start Doing
David Ogilvy once said, “I notice increasing reluctance on the part of marketing executives to use judgment; they are coming to rely too much on research, and they use it as a drunkard uses a lamp post: for support, rather than for illumination.” As I sat in a brainstorming meeting, I began to see the inherent dangers in the expectations and assumptions we attach to data.
Ideas were flying around the room: “we should buy the wrap for the hotel keys”, “can we brand the lanyards?”, “last year we got the headrests for the buses and that worked out great!” And then it happened – a digital tactic – “we weren’t very impressed with the media buy last year”, “what kind of click-through rate can we expect?”, “how many customers are really going to see it?”
This is a classic case of data paralysis, on a small and relatable scale. When we talk about offline tactics, the sky is the limit because we can throw around generalizations about “how well they worked last time”. Based on what? You can’t measure it. You have no idea whether the key wrap was any more or less effective than the door drop or the coffee cup. And because we can’t measure it, we don’t have to justify it.
At a popular digital conference I attended, several trends stood out from the rest. One of the core messages was stop measuring, start doing – intuition still matters. Because we can measure digital, we are expected not only to prognosticate, but then to live up to our predictions. As a result we spend more time thinking, researching, and strategizing, and less time doing and measuring the results. Offline, however, is not held to the same standard. There is no post-mortem on the sign next to the coffee station or the mobile phone charging station, and no follow-up discussion on how we can improve it next time.
Unfortunately this scenario plays out all too often. A new digital technology evolves from, or replaces, a historically offline one – paper details moving to tablet-based CLM platforms, for instance. For the first time, organizations can granularly measure effectiveness and interest. Unfortunately, it doesn’t always turn out the way you think it will. As the authors of Subject to Change noted, “you don’t really understand your product until you put it in front of users.” When this happens, the knee-jerk reaction is to go back to where you were comfortable, to what “worked” … to offline.
Don’t do it! Offline may feel comfortable and familiar, but that comfort is just ignorance being bliss. Not tracking something does not mean it is working.
As I continued to listen to the research from the agency partner, she touted with pride the many ways the messages and segments had been derived and tested. “After a year and a half,” she proclaimed, “we’re ready to go to market and we’re confident in our approach.” Make no mistake, I love testing. But a year and a half of it?
Data is here to help us make better decisions faster, not to stop us from making them.