
Right to reply: Everything you know about digital measurement is wrong

Measuring the true impact of your digital marketing campaigns can be a complex task, but are the current standard metrics the best way to measure success? Gary Angel, President of Semphonic, shows how the failings in digital measurement came about, and how to bypass them to make your own measurement better.

If you’re a savvy retail or eCommerce-focused Marketing Executive with your hands on digital, you may already know the key best practices in digital measurement: focus on a small set of KPIs, measure your site satisfaction against eCommerce benchmarks, trend your NetPromoter scores and track your mentions and level of engagement across social media, to name but a few.
What you probably don’t know is that every single one of these “best practices” is wrong, and that virtually every digital dashboard you rely on is poorly constructed, deeply misleading and fundamentally flawed.
The wrong focus
Let’s start at the top. The key to successful dashboarding and reporting is finding a small set of site KPIs that are understandable and immediately actionable. And your measurement department has probably delivered exactly that: a small set of key metrics like Site Conversion Rate, Total Visits Trend and Overall Site Satisfaction, all laid out in big numbers with great fonts, pretty colors, big trend arrows and lots of Tufte-inspired whitespace.
Sadly, these reports deliver neither understanding nor actionability.
Suppose I walk into your office and tell you that your Site Conversion rate is up 5%. You’ll probably be delighted. Now suppose I tell you that your search engine traffic is down 20%. That’s bad, right? But would you realise that in all probability, the two metrics are related and are, in fact, telling you exactly the same story? As you drive less early-stage traffic to your site via natural search, your Conversion Rate will go up. But in this instance, that isn’t a good thing.
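To see why, here is a simplified, hypothetical breakdown (the figures below are illustrative only, not from any real site) showing how a 20% drop in low-intent search traffic can push the site-wide conversion rate up even as total orders fall:

```python
# Illustrative figures only: fewer low-intent search visits raises the blended
# conversion rate while total orders actually decline.

def conversion_rate(orders, visits):
    return orders / visits

# Before: 10,000 natural-search visits converting at 1%, plus 10,000
# returning/direct visits converting at 4%.
search_visits, search_cr = 10_000, 0.01
other_visits, other_cr = 10_000, 0.04

orders_before = search_visits * search_cr + other_visits * other_cr   # 100 + 400 = 500
visits_before = search_visits + other_visits                          # 20,000
cr_before = conversion_rate(orders_before, visits_before)             # 2.5%

# After: natural-search traffic falls 20%; nothing else changes.
search_visits_after = search_visits * 0.8                                  # 8,000
orders_after = search_visits_after * search_cr + other_visits * other_cr   # 80 + 400 = 480
visits_after = search_visits_after + other_visits                          # 18,000
cr_after = conversion_rate(orders_after, visits_after)                     # ~2.67%

print(f"Conversion rate: {cr_before:.2%} -> {cr_after:.2%}")      # up
print(f"Orders:          {orders_before:.0f} -> {orders_after:.0f}")  # down
```

The headline metric improves while the business result gets worse, which is exactly the trap the author describes.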
In the real world, there are many different reasons why your Conversion Rate might change. There is simply no way to know which of the nearly infinite explanations might be true, and by removing all but a few key metrics from your dashboard, it is almost impossible for you to ever find out.
The simple fact is that site-wide metrics, from Conversion Rate to Total Traffic, are nearly all worthless. To be meaningful, a metric needs to be placed in the context of “who” it’s about and “what” those customers were trying to accomplish.
When I walk back in and say something like “Traffic is up 5%”, I expect to be asked, “With whom?” Because if you don’t know the audience behind that traffic increase, you don’t know anything. Once I’ve answered the “who” question, the next question I expect to be asked is, “And why were they there?” Understanding what your customers are trying to accomplish on the web is vital to understanding whether you are successful or not. Your chances of making a product sale during a Customer Support visit? Zero. So why are those visits included in your Conversion Rate?
If your dashboards aren’t built around this simple two-tiered segmentation concept, they probably aren’t very useful.
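As an illustration of that two-tiered view, here is a hypothetical sketch (segment names and figures are invented) of conversion reported by visitor type and visit intent rather than site-wide:

```python
from collections import defaultdict

# Hypothetical visit log: each visit tagged with who the visitor is ("who")
# and what they came to do ("why"). Figures are invented for illustration.
visits = [
    {"who": "new prospect",       "why": "research product", "converted": False},
    {"who": "new prospect",       "why": "research product", "converted": True},
    {"who": "returning customer", "why": "purchase",         "converted": True},
    {"who": "returning customer", "why": "purchase",         "converted": True},
    {"who": "existing customer",  "why": "customer support",  "converted": False},
    {"who": "existing customer",  "why": "customer support",  "converted": False},
]

segments = defaultdict(lambda: {"visits": 0, "orders": 0})
for v in visits:
    key = (v["who"], v["why"])
    segments[key]["visits"] += 1
    segments[key]["orders"] += v["converted"]

for (who, why), s in segments.items():
    print(f"{who:18s} | {why:16s} | conversion {s['orders'] / s['visits']:.0%}")

# A single site-wide rate (3/6 = 50%) blends purchase intent with support
# visits that could never convert, and so tells you very little.
```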
All you have to do is ask…?
If there is so little utility in site-wide metrics reported from web analytics systems, you might be tempted to steer your interest toward another staple of digital measurement – the online intercept survey.
Online survey research is based on a sample of your site visitors, but have you ever thought through the implications of this simple fact? Every time you launch a new marketing campaign, every time you improve (or worsen) your Search Engine Optimisation, you effectively change the demographic coming to your website and, therefore, the sample of visitors in your online survey.
The upshot is that site-wide trends in satisfaction are completely useless. Instead of tracking true changes to Site Satisfaction, you’re actually tracking changes to the site population caused by your marketing program. At the site-wide level, trends in online survey data are meaningless.
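A simple hypothetical mix-shift makes the point: suppose satisfaction within each visitor group never changes, but a new campaign changes who turns up in the sample (all numbers below are invented):

```python
# Hypothetical: per-segment satisfaction is constant; only the sample mix
# changes after a new acquisition campaign.
seg_satisfaction = {"loyal customers": 8.5, "first-time visitors": 6.0}

mix_before = {"loyal customers": 0.6, "first-time visitors": 0.4}   # pre-campaign sample
mix_after  = {"loyal customers": 0.3, "first-time visitors": 0.7}   # post-campaign sample

def blended_score(mix):
    return sum(share * seg_satisfaction[seg] for seg, share in mix.items())

print(f"Site-wide satisfaction before: {blended_score(mix_before):.2f}")  # 7.50
print(f"Site-wide satisfaction after:  {blended_score(mix_after):.2f}")   # 6.75

# The apparent "trend" (7.50 -> 6.75) is entirely a sampling artefact:
# no visitor group actually became less satisfied.
```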
“Sure,” you may be thinking, “but we use NetPromoter scores, not Sitewide Satisfaction.”
Unfortunately, NetPromoter scores suffer from exactly the same problems with sampling. Trending your top-level NetPromoter score is no more valid than trending your top level satisfaction. Without careful controls on sub-populations, every online survey metric is completely devoid of meaning. Worse, unlike Site Satisfaction, NetPromoter tends to be a very poor intrinsic measure of Site Experience.
Social ‘Measurement’
What about those cool new “Brand Mention” and “Social Share” metrics on your dashboard? Unfortunately, Social Media measurement is just as prone to failure. If your marketing team is really on the ball, you probably have a Social Media program along with a set of reports showing you how well it’s doing. At the heart of those reports is something called Total Mention Counts – a simple measure of how often your brand is mentioned in the social realms. It might be compared to other brands to demonstrate “Brand Share”, or trended over time to demonstrate “Social Success”.
These are very poor measures, but they are used by nearly every social media agency. Social media chatter is nearly always tainted by commentary driven by professional influencers with some stake in a given industry. When you include all these mentions in a count, you are measuring some impossible combination of PR messaging and actual consumer sentiment. Unless your team has aggressively, manually pruned the counts of all professional posters and commentators, your figures have zero correlation with actual consumer sentiment. In that Total Mention Count, you’ve added mentions in the New York Times to blog mentions by paid professionals in your industry to tweets to your customer support reps. What’s that combination supposed to represent? Nothing of value.
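A minimal sketch of the kind of pruning the author describes, assuming each mention’s source can be tagged (the tagging itself, via curated author lists or manual review, is the hard part; the records below are hypothetical):

```python
# Hypothetical mention records; the "source" tags are assumed to have been
# assigned by hand or from a curated list of professional authors.
mentions = [
    {"text": "Love the new release!",           "source": "consumer"},
    {"text": "Brand X launches new line",       "source": "press"},
    {"text": "Why Brand X leads the market",    "source": "professional blogger"},
    {"text": "@BrandXSupport my order is late", "source": "customer support"},
    {"text": "Switched to Brand X, no regrets", "source": "consumer"},
]

total_mentions = len(mentions)
consumer_mentions = [m for m in mentions if m["source"] == "consumer"]

print(f"Total mention count:    {total_mentions}")          # the dashboard number
print(f"Consumer-only mentions: {len(consumer_mentions)}")  # closer to real sentiment
```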
Blind leading the blind
If you’re like most marketing executives in the eCommerce space with an interest in digital, it’s time you realised that the figures your measurement teams produce are not fit for viewing, much less for basing decisions on. Do the people feeding you these numbers know better? Mostly, the answer is no.
People who never need to actually use the data are all too prone to accept it without question and, odd as it may sound, your measurement team probably never really uses the data. Worse, the idea that senior decision-makers need simple, clear numbers has become the guiding mantra of the measurement community. There’s nothing wrong with simple, clear numbers, of course. But when they lack meaning, misrepresent reality and hide the truth, simplicity is no virtue. It’s up to you to demand more, and by banishing these common, easy metric crutches from your dashboards, you’ll be on the right path to getting the information you need.
By Gary Angel
President
Semphonic

www.semphonic.com
