While digital analytics can uncover unparalleled insights for marketers, too often something goes wrong, and sometimes it goes terribly wrong. Traditionally, the “four horsemen of the apocalypse” have been identified as Conquest, War, Famine, and Death. Here we will look at how each, in a modern incarnation, can destroy digital analytics effectiveness, and how to battle them.
Horseman 1: Conquest
If you’ve been asked to abandon your healthy skepticism and buy into a single-vendor solution that “solves every problem” in analytics; if this also includes all “services” to be performed by the vendor; and worse, if you have allowed yourself to be convinced by your creative agency that yes, they also do analytics (in other words, they agree to umpire their own balls and strikes); then you have been a victim of conquest. You no longer control your own analytics destiny but have put it entirely in the hands of providers with an agenda that does not necessarily include your getting the most insight out of your analytics.
How to battle it:
Choose only best-of-breed solutions for each purpose. Worry less about interoperability. The dirty secret is that most branded solutions are made up of portals that don’t work together all that well anyway — they just have the same logo. Choose your own consulting company, make sure they know the tool, and let them deliver what they can. In many cases this will be superior to what the vendor can deliver. Finally, don’t let your creative team measure their own success. This is why baseball has umpires and why football has referees.
Horseman 2: War
Internal battles between IT and marketing can be ruinous. Endless turf wars, truth blockage, “not-invented-here” attitudes, and the lack of a clear chain of command make for heavy losses on all sides. And with the stakes higher and the “weapons” more powerful, the battle becomes bloodier. Analytics today is visible, but it’s tough to find someone who can really own it. In too many cases it’s easier to blame someone else, even if the only problem is that the trendline is headed south this month.
How to battle it:
Be an informed peacemaker. Understand the tools and technologies that powerfully affect your business, and make clear how each division plays its role. Unlike today at most enterprises, the person “in charge of analytics” in fact needs to be in charge of making sure measurement is done correctly, not at the whim of a developer reluctant to partake. Analytics is not optional. Clear leadership wins the battle.
Horseman 3: Famine
In a digitally driven media environment, a lack of accurate information about your audience is the equivalent of famine. It does not matter if you have spent enough to send rockets into space: if your tools are not instrumented properly (a far more common problem than most would care to admit), then your data is going to be inaccurate. Some might say this is nitpicky (yes, I have heard that) as long as the trends are well understood. But we are not talking about a few percentage points; we are talking about the actual raw numbers of visitors, page views, and more. Data famine leads to blindness, and eventually to the fourth horseman (see below).
Horseman 4: Death
You cannot fight death. If your analytics efforts are lying off in a ditch showing few signs of life, then your gong has been rung. But fortunately, in this world, you can reincarnate. Pull the wagon out of the mud. Choose new vendors. Strip out the old, junky code and replace it with something that will pass a basic QA test. There is nothing as bracing as a fresh start.
Remember that throughout history, in the real world, people have cheated every one of these horsemen many times over. In analytics it is no different. Next time we will talk about the legendary spider that failed seven times to make a web but succeeded on the eighth.
Media converges with digital more and more each day.
Even a year ago, would anyone have predicted that Amazon would win a Golden Globe for a show starring Jeffrey Tambor of Arrested Development fame? Or that today you can watch network television without a cable subscription (or even an antenna)? Given these new twists, can we begin to consider how this affects digital analytics as compared with the “ratings” that once drove all media sales?
Today, when media companies sell to advertisers, they continue to offer up the proverbial eyeballs, even if no one wants to call it that anymore. How many viewers? How many pre-rolls of the digital video? How many tweets, shares, replays, aborted sessions, repeated sessions? In an ad-sales environment, the data can drive the transfer of lots of real dollars. Yet media folks simply don’t have as reliable a data model today as they once had, which makes it that much more difficult to help their customers (brands) decide what to buy. And that’s despite the fact that they can offer far better targeting and far more depth in audience analytics than was ever before possible.
Why is there even a debate about what’s important in digital media? Aren’t the old-fashioned ratings systems working the way they used to?
Turns out they’re not. And it seems like it’s more about optics than data.
For instance, when television was unchallenged by the Internet as a media outlet, there was just one accepted metric, and that was the Nielsen rating. It was more or less gospel. If you had bad ratings, you had bad advertising sales rates, and if you couldn’t sell a 30-second spot for your show at an acceptable rate, then your show got canceled.
The ratings model was accepted not necessarily because Nielsen had come up with a fault-free method of audience measurement. More likely, it was because Nielsen didn’t share a great deal about exactly how they arrived at those ratings numbers; at the very least, they owned the methodology and were willing to stand behind it. And media folks had no way of questioning it because they had nothing else to compare it to. Nielsen said, “here are the numbers,” not “here are some methodologies we developed to measure audiences.” I call that a black box. It was easy and it worked.
Then came the Internet and, heaven help us, you could look in your browser at the code on a live, published page yourself, almost as if you had won a spot backstage at the sitcom set and could watch them writing the script just before show time. And if you were the “publisher” and wanted to know who was looking at the page, when, and for how long, you had no choice but to do the counting yourself. Whatever you found out you did not share; and you also could not reasonably compare. Where comparing one sitcom with another had been pretty close to apples to apples, now it became apples to starships. Worse, no one could agree on what an apple was. Or whether a starship could be made of apples.
With visibility into methodology came the right to question everything, including the methodology. It’s inherent to the Internet paradigm that no one company measures and presents results. Everyone is their own little Nielsen; they know every flaw in the model and have a thousand reasons not to believe the numbers. It isn’t that the measurement is less reliable than Nielsen’s; it’s just that it looks that way.
Somehow, media needs to get back to a model such as Nielsen had. But that might require a black box again, and that seems about as likely as a starship made out of apples.
If you’re a marketer and you’re thinking that the solution to your analytics troubles is a dose of training, think again.
Training won’t hurt, of course — unless you think you’ve got analytics knocked after a few sessions with an expert. But most likely training won’t get you very far down the road to solving your problems, either.
Not because you’re not smart or not good at what you do.
More likely it’s because you have underestimated how difficult it is to get analytics “right.”
And because by definition, if training is what you think you need, then you’re very likely a professional at something else. Can someone else — a digital analytics expert, let’s say — take a few hours of training and do what you do? Chances are they cannot.
Training as a Code Word
I’ve seen a number of cases where marketers begin a conversation about fixing an analytics ailment with a request for “training.” I typically assume that what they are trying to say is, “things here are a mess and we’ve got to start somewhere,” and very often “training” is the only remedy that comes to mind. This can be attributed at least partly to the fact that the prospective trainee doesn’t really know where the problems are and cannot begin to articulate the real need.
It may also be that it’s been made clear to the marketer that the company has no particular taste for hiring consultants to do stuff that “should be done internally.” Which is a little like suggesting that lawyering should never be outsourced: just pick up a book about case law and study! If that sounds a bit silly, perhaps that’s because it is.
I am not going to say analytics is as convoluted as law, but it isn’t anything like picking apples on a fine autumn day, either.
Analytics is hard. It occupies the minds of seasoned professionals all the live-long day, and sometimes even they cannot get it right. Large organizations have been known to falter in their attempts to keep their analytics current and accurate even after concerted efforts to get beyond the basics. Almost every analytics professional has a wagonload of war stories about “Big Analytics Messes.” Just as many have tales to tell of an “Inability to Gain Insight From Data,” even at the largest and most sophisticated of outfits.
In this context, “training” is code for “some kind of help — any kind of help.”
Misconception and Missed Opportunity
Recently I encountered a client who kicked off a phone call by saying they needed “a few hours of training.” Then, having signed up for a customized training program, they took their first lesson. It was supposed to last three hours. It lasted 45 minutes.
They cut it short because (wisely, I believe) they realized almost immediately that there was far more to understand than could possibly be covered in a few, or even a bunch of, training sessions. The next thing they asked for was a proposal for us to solve the problem with consulting hours.
Things might have gone more smoothly if my team had understood how little this client really knew; and how that substantial lack of perspective would affect their ability to understand what was needed. To me, it points out that analytics professionals often need to get away from their own jargon and do a little more listening than they might be used to.
Part of the solution our team designed would involve (as does nearly every analytics solution) “tagging.” Which means placing code into the HTML of the pages that need tracking. While this is a bedrock technology in analytics, it certainly is not common knowledge. And what is perhaps even less common is an understanding of who needs to get involved in order to make it happen (for instance, the client’s internal developers).
Our team submitted a specification for tagging and another for reporting. All seemed to go well and smoothly until it came time to take a first look at the data. And when our team looked, there was no data. And when we looked further, we saw that the tagging had not been placed on the pages — which meant no data could be collected and no reporting could be accomplished.
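A basic tag-presence audit, run before the first reporting review, would have caught the missing tags early. Here is a minimal sketch of such a check, assuming the site’s pages can be exported or crawled as local HTML files and that the vendor’s snippet can be recognized by a marker string; both the directory layout and the marker "analytics.js" are illustrative assumptions, not any real vendor’s tag.

```python
# Minimal tag-presence QA sketch: scan rendered HTML files and report
# any page that is missing the analytics snippet. The marker string is
# a hypothetical identifier for whatever tag the vendor specifies.
from pathlib import Path

TAG_MARKER = "analytics.js"  # hypothetical snippet identifier


def untagged_pages(page_dir: str, marker: str = TAG_MARKER) -> list[str]:
    """Return paths of HTML files under page_dir that lack the tag marker."""
    missing = []
    for page in sorted(Path(page_dir).glob("**/*.html")):
        if marker not in page.read_text(encoding="utf-8", errors="ignore"):
            missing.append(str(page))
    return missing
```

In practice a check like this would run against a crawl of the live site, and real tag-auditing tools go further, verifying that the snippet actually fires and sends data rather than merely appearing in the markup.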
The project only got worse after that, and neither side came away happy.
OK, Maybe You Do Need Training
Would this project have gone better if the client had been better informed about the basics? Certainly. Did they “know what they did not know”? No, they did not, and most folks do not, either. Should we have offered them training to get them familiar with certain terms and configurations that would weigh on the success of the project? In retrospect, perhaps.
So, maybe the marketer does need training. But not so much about “how to do it.” More likely the training should focus on the broader concepts that underpin the entire analytics endeavor. Therefore, if you think the answer to your analytics problem is a dose of training, you may not be as wrong as all that. It may be that the kind of training you think you want is not the kind you need.
And remember — much as you would not have a newbie do your job, try not to position yourself as a newbie trying to do someone else’s while at the same time expecting much in the way of results.
Analytics is hard, kind of like building a house. You don’t want to live in the one you “built yourself.”