14 Apr

Why You Probably Don’t Need Analytics Training

Written by Andrew Edwards. Posted in Analytics

If you’re a marketer and you’re thinking that the solution to your analytics troubles is a dose of training, think again.

Training won’t hurt, of course — unless you think you’ve got analytics knocked after a few sessions with an expert. But most likely training won’t get you very far down the road to solving your problems, either.

Not because you’re not smart or not good at what you do.

More likely it’s because you have underestimated how difficult it is to get analytics “right.”

And because, almost by definition, if training is what you think you need, then you’re very likely a professional at something else. Can someone else (a digital analytics expert, let’s say) take a few hours of training and do what you do? Chances are they cannot.

Training as a Code Word

I’ve seen a number of cases where marketers begin a conversation about fixing an analytics ague with a request for “training.” I typically assume that what they are trying to say is, “things here are a mess and we’ve got to start somewhere,” and very often “training” is the only remedy that comes to mind. This can be attributed at least partly to the fact that the prospective trainee doesn’t know where the real problems are and cannot begin to articulate the real need.

It may also be that it’s been made clear to the marketer that the company has no particular taste for hiring consultants to do things that “should be done internally.” Which is a little like suggesting that lawyering should not be outsourced: just pick up a book about case law and study! If that sounds a bit silly, perhaps that’s because it is a bit silly.

I am not going to say analytics is as convoluted as law, but it isn’t anything like picking apples on a fine autumn day, either.

Analytics is hard. It occupies the minds of seasoned professionals all the live-long day, and sometimes even they cannot get it right. Large organizations have been known to falter in their attempts to keep their analytics current and accurate even after concerted efforts to get beyond the basics. Almost every analytics professional has a wagonload of war stories about “Big Analytics Messes.” Just as many have tales to tell of an “Inability to Gain Insight From Data,” even at the largest and most sophisticated of outfits.

In this context, “training” is code for “some kind of help — any kind of help.”

Misconception and Missed Opportunity

Recently I encountered a client who kicked off a phone call by saying they needed “a few hours of training.” Then, having signed up for a customized training program, they took their first lesson. It was supposed to last three hours. It lasted 45 minutes.

They cut it short because (wisely, I believe) they realized almost immediately that there was far more to understand than could possibly be covered in a few training sessions, or even a great many of them. The next thing they asked for was a proposal for us to solve the problem with consulting hours.

Things might have gone more smoothly if my team had understood how little this client really knew, and how that substantial lack of perspective would affect their ability to understand what was needed. To me, it shows that analytics professionals often need to set aside their own jargon and do rather more listening than they may be used to.

Part of the solution our team designed involved (as does nearly every analytics solution) “tagging”: placing snippets of code into the HTML of the pages that need tracking. While this is a bedrock technology in analytics, it is certainly not common knowledge. What is perhaps even less common is an understanding of who needs to get involved to make it happen (for instance, the client’s internal developers).
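To make the idea concrete, here is a minimal sketch of what a page tag does, written in TypeScript for a browser. The collection endpoint (collect.example.com) and the function name are hypothetical, and real vendor tags are far more elaborate, but the shape is the same: gather page context and send it to the analytics server as the page loads.

```typescript
// Hypothetical minimal page tag; real vendor tags do much more.
function trackPageView(): void {
  const hit = {
    page: window.location.pathname, // which page was viewed
    referrer: document.referrer,    // where the visitor came from
    timestamp: Date.now(),          // when the view happened
  };
  // sendBeacon queues the hit so it survives the visitor
  // navigating away before a normal request could finish.
  navigator.sendBeacon("https://collect.example.com/hit", JSON.stringify(hit));
}

trackPageView();
```

This, or something like it, is the snippet the client’s internal developers would need to place on every tracked page, which is exactly the step that was missed in what follows.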

Our team submitted a specification for tagging and another for reporting. All seemed to go smoothly until it came time to take a first look at the data. When our team looked, there was no data. When we looked further, we saw that the tags had never been placed on the pages, which meant no data could be collected and no reporting could be accomplished.

The project only got worse after that, and neither side came away happy.

OK, Maybe You Do Need Training

Would this project have gone better if the client had been better informed about the basics? Certainly. Did they “know what they did not know”? No, they did not, and most folks do not, either. Should we have offered them training to get them familiar with certain terms and configurations that would weigh on the success of the project? In retrospect, perhaps.

So, maybe the marketer does need training. But not so much about “how to do it.” More likely the training should focus on the broader concepts that underpin the entire analytics endeavor. Therefore, if you think the answer to your analytics problem is a dose of training, you may not be as wrong as all that. It may be that the kind of training you think you want is not the kind you need.

And remember: just as you would not have a newbie do your job, try not to position yourself as a newbie doing someone else’s job while still expecting much in the way of results.

Analytics is hard, kind of like building a house. You don’t want to live in the one you “built yourself.”

03 Apr

Enterprises: Fix These 3 Analytics Challenges Now

Written by Andrew Edwards. Posted in Analytics

We’ve all been at this analytics thing for a while now, and with all the advances in big data and algorithms and data visualization it’s sometimes too easy to lose sight of some of the bedrock disciplines that got us here.

Big companies are different from smaller ones, and the stakes are higher when mistakes are made. Getting these three basics into good working order at the enterprise level makes positive change far more likely.

1. Governance

Who’s in charge here?

Many large organizations have content owners across the country or across the globe, and they are often in different stages of analytics maturity. Each team seems to think they have found a solution that works (more or less), but when it comes time to compare a Golden Delicious to a Granny Smith, often enough it turns out your team in Kalamazoo is using a pomegranate. You need to standardize on a platform and make it your benchmark. Each platform measures things in different ways and there is no reliable way to compare (for instance) what a “visit” is in one platform versus another. Most of the time, they won’t match.

The enterprise may also engage a fair number of content-creators, ranging from full-service agencies to bloggers to pure-play development shops. Too many of them also claim they can do the measurement as well as the creation. Avoid this trap by centralizing measurement inside one team that’s responsible for standards, governance, and measurement itself. It may require bringing in a digital analytics consulting team that can laser-focus on these issues. The alternative is a world of rogue sites, unhelpful self-measurement, and no way to roll up results beyond the market level.

2. Data Integrity

The notion of data collection is often remote and mysterious to marketers and decision-makers. They have little knowledge of the details and even less control over them. The safe posture seems to be skepticism about accuracy, and skepticism rapidly leads to doing nothing at all. Because if you cannot trust the integrity of the data, you won’t be taking any action based on it.

Taming this problem requires standardization (as noted above), but also trust in your analytics team.

Trust is a nice word, but how about verification?

A tag audit is a great place to start. Find someone neutral on content and measurement and have them look at whether the data is being collected properly. Too often, the answers will be less comforting than you’d hoped, but knowing is the first step toward fixing. Are tags firing more than once on a page? Are certain things not tagged at all? Are parameters set correctly so that the data maps to actual reports? Audit today.
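As a sketch of how mechanical the first pass of an audit can be, here is a toy script (TypeScript for Node 18+; the tag marker and page URLs are hypothetical) that flags exactly the problems above: pages with no tag at all and pages where the tag appears more than once. A real audit would also verify firing behavior and parameter values, which requires inspecting the live page rather than the raw HTML.

```typescript
// Identify the tag by a marker string in the page source,
// e.g. the (hypothetical) URL of the vendor's tag script.
const TAG_MARKER = "collect.example.com/tag.js";

async function auditPages(urls: string[]): Promise<void> {
  for (const url of urls) {
    const html = await (await fetch(url)).text();
    // Count occurrences of the marker in the page source.
    const count = html.split(TAG_MARKER).length - 1;
    if (count === 0) console.warn(`${url}: NOT TAGGED`);
    else if (count > 1) console.warn(`${url}: tag appears ${count} times`);
    else console.log(`${url}: OK`);
  }
}

auditPages([
  "https://www.example.com/",
  "https://www.example.com/products",
]).catch(console.error);
```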

3. Content Actionability

The main reason you measure user activity is to find out whether your content is working. This implies that if it turns out the content is weak, it will need to be changed. And how will that happen? Do you have a plan, a process, a regime that addresses this key element to success? Or do you end up making recommendations that never see the light of day? Are politics getting in the way? Is someone’s favorite campaign or favorite agency looking not-so-good after measurement? It’s tough, but this kind of thinking needs to be defeated before you can win at optimization.

Actionability requires human intervention. It requires putting aside personal preference and prejudice and looking at the data. Your mindset needs to be grounded in success metrics rather than aesthetics or alliances. If you want more customer activity and a healthier bottom line, ditch the so-called loyalty to any particular campaign or site design and look at its performance. The race goes to the swiftest car, not the prettiest and certainly not anyone’s subjective “favorite.”

Simply put, you need to have a plan to govern your data collection, then verify its accuracy, then be confident enough to make changes based on what you find out. If you are thinking this sounds pretty hard, you won’t be the first.

But making a success of it brings success to the business. Take immediate steps to fix these problems and you will be glad you did.


31 Mar

5 Weird Tricks for Keeping Analytics Projects on Track

Written by Andrew Edwards. Posted in Analytics

Analytics is complicated and, because it is designed to provide facts, is not often as welcome in a marketing discussion as other disciplines. Analytics sits between marketing and technology in a place where neither party feels altogether comfortable; and too often the analytics consultant finds herself taking arrows from both sides.

Here are some suggestions to help the analyst keep analytics customers happy, whether internal or external.

1. Establish Legitimacy (Confidence)

Do you know what you are talking about? Make sure your customer knows it. Often the more insecure a stakeholder is, the more skeptical they are of you. When you meet these kinds of folks for the first time, have handy a brief “elevator pitch” that lists your qualifications and experience in a friendly way. Then avoid jargon and instead talk in a way that some folks call “storytelling” but which I call “narrative.” This requires that you know the data and where it points. In this scenario, you are the scout leader heading the group on a hike to the facts.

2. Overcommunicate

Don’t make yourself a burden by going over the top, but when you wonder “should I check in?”, it often means your subconscious has already answered that question and is trying to get your attention. Keeping your stakeholders nearly as well-informed as you are is key to keeping them happy. Communicate more carefully, and more often, than you would with a colleague. Your customer is more averse to surprises, mainly because they often have their own reports to deliver, and when you surprise them, they have to surprise their boss. And their boss really does not like surprises. Avoid surprising your stakeholder, even when you think the news is good.

3. Test Before Launch

Have you heard of the “small technical glitch” that “caused a big problem”? It happens a lot more often than you’d expect. Almost always, this is because no one set up a proper testing environment and tested whether the change creates unexpected effects in data collection or reporting. A test environment is a great way of avoiding surprises (see above). In many cases, it can spell the difference between a good analytics program and a loss of confidence.
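One concrete check for such an environment, continuing the hypothetical page tag sketched in the earlier post: wrap the transport the tag uses and count the hits each page fires, so double-firing and never-firing tags surface before launch rather than after. This is a minimal sketch, assuming hits travel over navigator.sendBeacon.

```typescript
// Test-environment harness: count hits instead of trusting them.
// Assumes the page tag sends hits via navigator.sendBeacon.
let hitsSent = 0;
const realSendBeacon = navigator.sendBeacon.bind(navigator);

navigator.sendBeacon = (url: string | URL, data?: BodyInit | null): boolean => {
  hitsSent += 1; // record every hit the page fires
  console.log(`hit #${hitsSent} ->`, url);
  return realSendBeacon(url, data);
};

// ...load the page and interact with it as a visitor would, then:
if (hitsSent !== 1) {
  console.error(`expected exactly 1 page-view hit, saw ${hitsSent}`);
}
```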

4. Pay Attention to Narrative

People are storytellers. Data doesn’t tell a story, but it does provide you with reports so that you can tell one. Perhaps one day there will be a truly engaging storytelling robot, but today it is still the job of the human being to look at seemingly unconnected threads of information, see patterns, understand nuance and relative importance, and create a story out of raw numbers. You’ll likely need data from different sources to create a fully dimensional picture for yourself, which you will then use to craft the narrative that carries the insight needed to make changes based on data. Without the narrative, it’s just machines talking to other machines.

5. Don’t Defend Technology at the Expense of Business Needs

Technology is not the business; it is a subset of the business and a provider to it. So when a non-technical person asks “why not?”, the technologist is ill-advised to simply say “we can’t do that” and assume the non-techie will accept that “the technology just cannot do that.” First, you may not be right: very often there is a solution out there, and maybe you need to find it. Second, many businesspeople see a technology lack as your lack, because without technology you would not be there at all. It’s fine to say the technology cannot do something if it truly cannot, but you will need to communicate that as a business concept. For instance, “there is no data source” is not nearly as effective as “we need someone to give us access to the data sources; do you know who that might be?” All of your reasons for doing things (or not doing them) must serve a business purpose, or you need to supply a plausible business reason why it can’t be done.

With these five weird tricks you should be able to lose weight, get cheap car insurance, and even keep your analytics customers happy!
