Author Archive

10 Oct

Convergence Analytics 2.0: Everybody is Still Measuring Everything

Written by Andrew Edwards. Posted in Convergence Analytics

How much does multi-channel analytics really help the marketer?

It’s hard to believe it was only six months ago when ClickZ published the first Convergence Analytics Report that I co-authored. We just launched the second Convergence Analytics report at SES San Francisco and I feel like we were barely able to document some of the latest changes taking place in digital analytics today. Suffice it to say things are moving very, very fast in this field.

Our tagline for the first report was “everybody’s measuring everything.” We were referring to the way nearly every vendor and many practitioners were planning to broaden their web analytics practice, folding social, mobile, demographic, seasonal, advertising, customer relationship management (CRM) and other data into a single discipline.

Some folks call it “multi-channel analytics”. We called it Convergence Analytics because we were describing the convergence of many channels into a single tool—but also because we were describing how a multitude of vendors were converging on the notion of providing a single view into many measurement channels.

Today the rush to single-vendor solutions seems more headlong than ever.

But just because everybody’s doing it, does that mean it’s a good thing?

Allow me to answer that question with a definite “maybe”.

It’s a “good thing” if certain criteria are met:

  • the practitioner has in place both expertise and a process for deployment and action
  • the vendor is in fact delivering an integrated, robust and accurate solution
  • expectations are kept in check
  • costs are managed

In our second report we called out a number of factors that seemed to be impeding adoption of what really is a good idea – the ability to see more data at once, more quickly and at lower overall cost.

The biggest problem in a volatile market like this is that it’s very confusing for the buyer. There are simply too many analytics vendors talking about the same thing. Some are aligning what they say with what they deliver, and that’s the way it ought to be.

Many more are shoehorning themselves into what sounds good at the moment, and at the moment that might be multi-channel analytics. Paraphrasing an old line from Love Story: “SaaS means never having to say you don’t do that.”

Tomorrow the same vendors will pull back and say they are vertical (specialized for a single-market or single-solution) and would not dream of trying to be all things to all people when their tool is limited or perhaps unfocused. Because that would be dumb. And because investors aren’t putting their money into companies that say they measure everything for everyone all the time (which is probably true).

The ability to look at more information from more sources in one place is prima facie advantageous. For the general public, a device that provided this would have been called (until recently) “the newspaper”. For digital businesses, it’s more technological and less obvious, but it’s the same sound principle: the more you can know about your world, the better decisions you can make.

A newspaper might have told you it was likely to rain later. Better put on your boots. And it might have told you the garbage collectors were striking, so you might also want to carry a posy. What about an analytics tool that could tell you how web was affecting mobile, and further, help you automate the content served to specific customers in your CRM database?

A digital analytics tool that told you only what was going on with your web pages, assuming you had no other way to measure or act upon the rest of your digital properties, would be a prime candidate for retirement. Is it any wonder, then, that with the sudden emergence of a hundred and one digital channels, every company that ever measured anything, and some that never really did, would flop toward the concept like seals to a bucket of mackerel?

Seals are smarter than that, and so are most vendors. The bucket of mackerel holds only so many fish. And the notion that measuring everything is something everybody can do is more than a little bit fishy. Vendors without strong, integrated offerings and enough cash or customers to stay competitive will either go the way of all seals or find another pool to swim in.

Some market-leading companies, and some very capable upstarts, will thrive and prosper in Convergence Analytics. They will find customers that have the right expertise and processes in place to make the effort of deploying a complex measurement application worthwhile. Their tools will be powerful, different, and useful, rather than just cleverly described.

I believe the future will see marketers looking at multiple streams of data in contextually relevant ways that help drive their marketing programs more quickly, more efficiently and in a way that yields more tangible results.

At that point, the “maybe” I suggested above will become a more definite “yes.” But until we can better understand how to select and deploy the right technologies and disciplines at the right time, we are just splashing around in the crowded, shallow end of the pool.

04 Sep

3 Digital Practices Brands Need to Master Before It’s Too Late

Written by Andrew Edwards. Posted in Analytics Strategy

It’s August 2013. Do you know where your markets are?

In terms of digital analytics, that is.

With content focus shifting rapidly to mobile (as if the web were going anywhere!) and multi-channel analytics all the rage, it’s important to note that whatever the platform, three principles remain key to brand success.

The larger the brand, the more important these items become. And with added layers of measurement complexity, they become that much more difficult to achieve – as well as that much more essential to success.

Time to get the woodwinds playing with the strings; and the brass playing along, too. If you miss this opportunity to make a symphony, you’ll have an awful time getting control of it later when it’s even more complex. And by then the hall will be empty – your audience will have gone home.

Here are three important challenges that must be met by any brand hoping to get more value out of its digital properties.

1. Analytics governance. Far-flung content owners? One big, rather dated and rather expensive measurement solution that seemed to be the answer to everything back in 2007 (but that most business users don’t like or don’t use)? Lots of agencies each making the case they should measure their own success? Rogue sites with non-standard tools measuring in non-standard ways? And no way to roll up reporting because nobody seems to have any measurements in common anymore? It’s a poorly tended garden indeed, but it’s more common than you’d think among big brands.

There’s a way to get this under control, but that’s the keyword: control. Some call it “governance.” Governance requires at least the following:

  • Create a central digital measurement authority and run all analytics through this team without exception.
  • Choose a single, not overly costly measurement platform (or suites for different channels) and enforce its use.
  • Create a basket of core measures that every property must report no matter what else it also measures (again, there may be at least one set for desktop and one for mobile).
  • Don’t let agencies measure their own content; not only is it unmanageable for the brand, it’s also a natural conflict of interest.
  • Drive reporting responsibility to the markets and business owners. For instance, ask them to justify why they need a particular custom report (e.g., “What would you do differently if you knew the numbers?”).
  • Update reporting often. Underutilized custom reports and long-forgotten profiles are the bane of good analysis. They are confusing resource hogs and should be deleted. A clean interface leads to better adoption by business users.

2. Data integrity. They don’t trust the numbers. And they won’t take action (see below) because they don’t trust the numbers.

Adoption remains low.

Millions of dollars may be lost because no one’s keeping track of the data till.

In order to establish data integrity, institute governance, then:

  • Perform tag audits on everything right now (tags are the snippets of code that collect data); make sure they are connecting with the analysis engine!
  • Establish baseline reporting and a set of standard tags to support it.
  • Think seriously about tag management systems and how they can simplify tagging across the organization.
  • Don’t just accept numbers from external sources – use your own analytics to see if there’s a match.
  • Data-integrity laggards should be sharply questioned as to why they won’t or can’t measure accurately.
  • Remember the audit? Perform this not once, but often, and have the results reported up to senior management.
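
The audit step above can be partially automated. Here is a minimal sketch: it assumes a hypothetical tag URL pattern (`analytics.example.com/collect.js` is invented for illustration; substitute the loader snippet of whatever platform your organization has standardized on) and flags pages whose source lacks the standard tag:

```python
import re

# Hypothetical tag signature -- replace with the actual loader snippet
# of the measurement platform your governance team has standardized on.
TAG_PATTERN = re.compile(r"analytics\.example\.com/collect\.js")

def audit_pages(pages):
    """Return the names of pages whose HTML lacks the standard tag."""
    return [name for name, html in pages.items()
            if not TAG_PATTERN.search(html)]

# Sample audit over two illustrative page sources
pages = {
    "home.html": '<script src="//analytics.example.com/collect.js"></script>',
    "promo.html": "<div>Campaign landing page, tag forgotten</div>",
}
missing = audit_pages(pages)
print(missing)  # the pages that are silently sending no data
```

Run against a crawl of your properties, the resulting list is exactly the set of pages silently sending no data – the ones a recurring audit should surface to senior management.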

3. Content actionability. Does a digital campaign make noise in cyberspace if no one is there to hear it?

The answer is “no.” There’s no value to content that doesn’t drive conversion.

And that’s why you measure. To optimize content. To get better conversion rates.

This suggests rather strongly that you may need to do something different once you see the data gathered by measuring user interaction with your content.

There are two ways “actionability” works. One is indirect via human intervention. The other is automatic via sophisticated content delivery strategies supported by algorithms and what’s often called “real-time” data.

  • Human intervention means never having to say “my creative team liked it” even though the data says you wasted your money. They may have liked it when they launched it, but if it doesn’t bring dollars or meet an exposure goal (all measurable), then they should stop liking it and get to work fixing it. It can be very difficult to drive this message home to the teams you rely on to build excitement for your brand; but the best ones know which side of the bread the butter is on, and it’s the side that also has the conversions. Who’s paying the bills? You are. They work for you. Demand changes to content areas that don’t perform.
  • Automatic actionability is a more recent development but it can be very effective. Dozens of measurement platforms claim to be “real time,” and to deliver what they call “predictive” analytics. An awful lot of that is bunk. But when the right algorithm is hooked up to the right data at the right time, then content can be served up based on criteria gathered during the measurement phase. When it’s done right, this qualifies as both “real time” (very fast) and “predictive” (properly modeled based on data). This is what ad networks do all the time; and now this capability, via several competing multichannel analytics applications, is available to brands that want to leverage all the data they’ve gathered to achieve better-targeted messaging.
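
As a rough illustration of that automatic loop (the attribute names, thresholds, and variant names below are invented for the example, not drawn from any vendor’s API), a simple rule-based selector serves content based on criteria gathered during measurement:

```python
def pick_content(visitor):
    """Choose a content variant from measured visitor attributes.

    Illustrative only: a production system would score visitors against
    a model built from real measurement data, not hand-written rules.
    """
    if visitor.get("channel") == "mobile" and visitor.get("visits", 0) > 3:
        return "loyalty-offer"       # frequent mobile visitor
    if visitor.get("referrer") == "email-campaign":
        return "campaign-landing"    # arrived from a tracked campaign
    return "default-hero"            # no qualifying signal

# A returning mobile visitor gets the loyalty variant
print(pick_content({"channel": "mobile", "visits": 5}))
```

The “predictive” part in a real platform is the model that replaces these hand-written rules; the feedback loop – measure, decide, serve, measure again – is the same either way.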

Whether by human hands or robot power, the feedback loop from data to content and back again is essential to digital marketing success – and now is the time to get a handle on it.

Managing the digital brand experience is only going to get more complex; and the cost of reining in the wild analytics horses is going to get higher, quickly.

Best get out the lariat.


21 Aug

3 Reasons Why Analytics Disappoints and How Not to Be Disappointed

Written by Andrew Edwards. Posted in Digital Analytics

Many analytics experts will probably agree they very often come across organizations where analytics is not only “unadvanced” but disappointing. Where analytics isn’t being adopted by executives who need to make decisions based on performance data. Where, for one reason or another, analytics isn’t delivering much value at all.

Rare is the case where analytics is entirely tossed out (that would be an admission of failure!). Much more common is the “back burner” syndrome. Analytics chugs along quietly, ineffectively, inaccurately, unnoticed.

Here are three major reasons why analytics may seem a disappointment (and remedies for same):

No. 1: Business Misalignment

Ask the question “why do you have a website?” and sometimes the response is a deafening silence. No one wants to say “because everybody has one,” because they know that can’t be a good enough reason; and no one wants to say “to sell stuff,” because they know that’s too broad. After that, you’re liable to get five different answers from five different constituents.

Dumb as the question sounds, it’s the one that, when answered incompletely or carelessly, generates the most frustration with analytics.

If you don’t know why you have a site, it means you don’t know what matters to you about user behavior. If you don’t know what matters to you about user behavior, you don’t know what kind of behaviors to measure. If you don’t know what to measure, you just measure everything or nothing, and the result is much the same either way: meaninglessness.

How to Get Better Aligned

Convene a meeting of your business leaders and your web content stakeholders. Facilitate a discussion of what business outcome each part of the site is attempting to create. Broadly, there will be only a few possibilities, and within these there may be drilldowns. The broad business objectives for sites are generally as follows:

  • E-commerce: sell stuff online.
  • Content/branding: get visitors to spend time on the site viewing messaging from your organization or your advertisers.
  • Lead generation: lead nurturing, list-building, informative downloads, sales-call triggers.
  • Self-service: think of intranets, insurance portals, problem-resolution databases.

Nearly every site will fit entirely inside one of these large buckets, or may have a presence in a couple of them at once. Whatever the site seeks in terms of business outcomes will then determine the specific type of reporting you’ll want out of analytics.
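
That mapping from business objective to reporting can be sketched as a simple lookup. The bucket names follow the objectives above, but the example metrics are illustrative, not a prescribed set:

```python
# Illustrative only: which reports actually matter depends on the outcome
# discussion with your business leaders and content stakeholders.
REPORTING_BY_OBJECTIVE = {
    "e-commerce": ["revenue", "conversion rate", "average order value"],
    "content/branding": ["time on site", "pages per visit", "return visits"],
    "lead generation": ["form completions", "downloads", "cost per lead"],
    "self-service": ["task completion rate", "search exits", "call deflection"],
}

def reports_for(objective):
    # Fall back to a bare traffic count when the objective is undefined --
    # which is itself a symptom of the misalignment described above.
    return REPORTING_BY_OBJECTIVE.get(objective, ["visits"])
```

A site spanning two buckets would simply combine the lists; the point is that the report set is derived from the stated outcome, not the other way around.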

By customizing your analytics tool to answer the business questions generated by the abovementioned goals, you will avoid misalignment – and one of the most common and destructive failures in analytics.

No. 2: Poor Implementation

Many frustrated analysts have fallen victim to a sub-par or simply incomplete implementation. Much of the work in getting analytics to answer relevant business questions turns out to be non-obvious and not easy for the occasional user to implement correctly. This persistent complexity, when combined with architecture issues, multiple developers, and simple lack of tool-familiarity, results in what some call a “broken” implementation.

In a broken implementation, analysts will note that data seems incorrect or impossible to reconcile; calls to the tool vendor refer one back to whoever did the implementation (for example, the tagging and report building); developers often lack the understanding to even know what they did wrong; and the analyst is left without any way to do a good job for the company.

Just to give you an example of how typical this can be, imagine a fairly sizable organization deploying a Flash or Ajax module inside their digital offering. With two or three teams of developers involved in creating pages, putting tags on pages, and making pages go live, the communication is already prone to misunderstanding and error. Compounding this is a lack of understanding of how “tagging” actually works. The result is repeated failure to get the module to send tracking information to the analytics engine, even long after launch.

Result: the analyst – and the organization – is left blind as to the success of the campaign, in this case a rather expensive one.

How to Improve Implementation

Remember that page-tagging, tool implementation, and custom code development require specific expertise. Sometimes it’s difficult to see that, with all the web talent in the organization, this remains a significant gap – and yet in many organizations it is an enormous one. Some companies look to contractors, and this can be a great help, assuming you get a contractor who is knowledgeable and reliable. Others hire specialized agencies to take on the task and enjoy success, though at a cost in dollars. Training for existing in-house talent is often a stopgap; but mostly, training is geared to helping folks use analytics once it has already been properly set up.

Finally, some organizations, attempting to think ahead more than others, will hire an individual employee or an internal team that will focus solely on analytics deployment. This can also be a great solution, though it has its own significant costs. And there’s always the risk that the knowledge requirements shift away from the specific skill set of the hired specialist(s).

Whichever way you choose to fix a poor implementation, make sure it includes dedicated expertise – deploying deep expertise (both in business and technology) will result in a much more robust and effective analytics platform. Try not to rely on users with only a shallow understanding of how analytics tools, tags, and interactive architecture must work together in order to deliver meaningful insight.

No. 3: Company Politics

OK, lots of things fail because of company politics – not just digital analytics.

But there are particular ways that politics gets in the way of digital analytics, chiefly related to misunderstandings or misinterpretations of proper roles and responsibilities as they relate to technology and content.

Put simply, the “measuring” should never be managed by the “measured.” This is because no one wants to be forced to be objective about the success or failure of their own efforts. And when put in that position, they may sometimes behave in what might be called an “obstructionist” manner, even if they are otherwise very helpful and aboveboard. Of course this is not universal. But it is a noticeable tendency.

In practice, this means that the agency or marketing team responsible for putting up content (especially if they are third party) should be told that the measurement of that content is going to be handled by someone else. They will also need to be told that cooperation is a condition of the engagement.

Too often, analytics goes down a rabbit hole and never reappears once a third-party creative shop gets involved in performing it. And often enough that’s because of a lack of throughput on the agency side: either they don’t know how to tag and implement expertly enough, or they put it on their “later” pile because they have no upside in doing otherwise.

How to Get Past Company Politics

The best way to handle this problem is to think of analytics as a discrete project that needs to be assigned to a particular group that specializes in that and has a clear upside in making sure it’s done properly. Almost universally, this will result in a far better state of analytics than leaving it in the hands of folks who might not have a vested interest in the success of analytics.

Aligning expertise with a properly identified business need – in this case, analytics expertise with a need for accuracy and objectivity – will drive your analytics effort away from the whirlpool of competing interests.

Adoption Is Key to Web Optimization

You’ll probably find that more targeted, more accurate, more objective, more consistently reliable analytics data results in higher adoption rates. This means that people who need to look at the data will look at the data. And then they can make decisions based on what they see. But if the landscape is littered with meaningless reports, inaccuracy, and tardiness, expect low adoption and low impact. And in the end, low impact for analytics can leave your organization at a distinct disadvantage – because the competition may have figured out last week how to stop being disappointed in their analytics.


About Efectyv Digital

Efectyv Digital is focused on strategy for two distinct markets: digital analytics end-users; and marketing strategy for technology companies.

Click here to learn how we can help your business grow >