Discuss the merit of a campaign with a marketer and it won’t be too long before the jargon hits the conversation stream. Terms like ‘quantitative‘ and ‘qualitative data‘ are thrown around as if we were scientists analyzing a whiteboard, with a couple of ‘demographics‘ and ‘A/B testing‘ mentions to clarify that this is business.
I am no different, though analyzing metrics never really came into play until I was married. My husband would annually present his Excel spreadsheet on the screen, all the cells filled with every spending category in the family budget alongside a corresponding dollar amount. He even had the formulas in place so that if you changed one number, they all changed accordingly. My husband loves numbers.
I thought of this spreadsheet this weekend while teaching my Facebook for Business class at The Hub. I was deep into my lecture on Facebook Insights when a student raised her hand and asked a rather important question:
“Do these metrics reflect the reality of the campaign?”
I looked at the graphs that analyzed the last ad sets purchased for Goldstream Sports. By the metric standards the response was low, but the store had moved a lot of product during this time period.
“No,” I answered after some thought. “Few people actually responded to any of these ads on Facebook. But there were days when the store was officially closed, yet our doors were open for some kind of competitive event, and people unrelated to the event were coming in to buy stacks of gear.”
It’s easy to get wrapped up in the numbers, especially if you are paying for ads. We are told to look for the lowest Cost Per Click (CPC) or Cost Per Mille (CPM), but does that really indicate failure or success? Campaign assessment should reflect the campaign objective, and both come in many forms.
For example, if your objective is sales, then the receipts at the end of a campaign will tell you if you sold the expected amount in that time period. If it is to get people to attend an event, then the attendance will give you the appropriate metrics. Yes, we can ask people where they learned about the event, but consider this:
“Marketing guru Jay Levinson figures you have to run an ad twenty-seven times against one individual before it has its desired impact. Why? Because only one out of nine ads is seen, and you’ve got to see it at least three times before it sinks in.” (Permission Marketing by Seth Godin)
If you run a multimedia campaign, who’s to say your audience was even aware of the twenty-seven previous times they saw your ad before it started sinking in?
And then there are times when there is only ‘qualitative’ or ‘soft data’ to assess the campaign. These usually include individualized concepts such as satisfaction, enjoyment, and even customer service. Which is much like the way I killed the annual presentation of my husband’s spreadsheet: I acknowledged that the ‘quantitative data’ proved he was 100% right about reducing spending, but there was no way to prove, from a ‘qualitative’ perspective, that I would be a 76% happier wife if I continued to get my hair colored and styled at a salon.