Why data is becoming the albatross around your neck

Guest Blogger


Ian Howard leads the strategy development at Little Giant, Linked by Isobar — a digitally-led creative and innovation agency in Auckland. He brings to the team extensive experience in creating growth-focussed brand, marketing and digital strategies for some of the biggest companies in New Zealand.

Data has long been heralded as the savior of marketing, so why is it still being used to make poor, generic or same-same decisions?

Ah, big data. Two little words which, when put together, seem to stir the loins of marketers like no other compound term. And quite rightly too. For the first time in human history, we have data at our fingertips to allow us to make informed and defendable decisions, to understand our target market beyond skin deep demographics, to report on our success or otherwise with unprecedented ease and robustness.

Sexy stuff I think you’ll agree. Except it’s not. For a combination of reasons, I think the way data is being used in marketing is doing the opposite. It’s making us less empathetic, less informed, less understanding and less smart. Here’s why.

Best practice = uniformity

One of the problems we have is that it’s not just us with access to data. Sure, you may have some proprietary data, but compared to the wealth of transparent and instantly accessible data out there, it’s a drop in the ocean. And so the first thing we do when looking to solve briefs is pull out huge pieces of research to understand best practice in our industry or among our consumer set. Or we buy a set of raw data and crunch it ourselves, coming up with the same conclusions as everyone else. Then we slavishly abide by the rules of best practice. After all, the research was conducted analyzing millions of data sets captured over millions of touch points, so who are we to argue with it?

Well that’s all very well. But if you, your main competitor, your other competitors and your adjacent industry competitors all have access to the same data, you’ll all end up doing the same things. The research says people respond better to this type of message. The research says a button should be this big. The research says this word works better than that. And so we disappear into a world of homogenous mediocrity. Oh no, not mediocrity. Homogenous best practice. That’s much better.

Capturing the wrong data

As a digitally-led agency, at Little Giant, we’re big on using analytics to get an informed understanding of the way people navigate through your digital assets, be they products or marketing platforms. Having a robust analytics implementation means being able to optimize the performance of your asset over time, leading to improved results and therefore business growth.

Great. Well, not so great actually. In my time in digital land, I’ve seen some analytics implementations that not only render the data being captured completely useless, but actually make the data obstructive. Doing analytics wrong is worse than not doing analytics at all. Indulge me for a second. Back in March I was at Eden Park living out a boyhood dream to watch the Lions take on the All Blacks. I witnessed one of the great test tries. Liam Williams gathers the ball at full back, steps inside in his own 22 and arcs his way up the middle. He’s looking around him, he gets to halfway and offloads to Davies. Davies passes on to Daly, who looks to go in, then out, then back inside to Davies, who has O’Brien — where did he come from! — on his inside to splash over. A great try and one in which four Lions players played a big part.

Now I’m going to take a punt here that with your current analytics setup, you’d give 100% of the credit to Sean O’Brien, forgetting entirely the part the other three played. So when you’re picking your next team, Williams, Daly and Davies might not make the starting line-up. After all, the data doesn’t show that they did much. That’s not how to make informed decisions.
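The rugby analogy maps onto what analytics people call attribution modelling. As a rough illustration (plain Python, not tied to any particular analytics product), here is the difference between a last-touch model, which behaves like the setup described above, and a simple linear multi-touch model that shares credit across the whole journey:

```python
# Illustrative sketch, not any specific analytics tool's API:
# comparing last-touch attribution (all credit to the final
# touchpoint) with a linear model (credit shared equally).

def last_touch(touchpoints):
    """Give 100% of the conversion credit to the final touchpoint."""
    return {tp: (1.0 if i == len(touchpoints) - 1 else 0.0)
            for i, tp in enumerate(touchpoints)}

def linear(touchpoints):
    """Share the credit equally across every touchpoint."""
    share = 1.0 / len(touchpoints)
    return {tp: share for tp in touchpoints}

# The try described above: four players touch the ball before the score.
journey = ["Williams", "Davies", "Daly", "O'Brien"]

print(last_touch(journey))  # O'Brien gets everything
print(linear(journey))      # each player gets an equal share
```

Real analytics platforms typically offer richer options (time-decay, position-based, data-driven models), but even the crude linear sketch above keeps Williams, Davies and Daly on the team sheet.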

You are what you measure

While analytics implementations require some specialist skills and technical knowledge, ensuring that your measures match your objectives doesn’t.

Please, please, please stop measuring things that don’t matter. A click through rate on an ad is only important if your objective is to get people clicking on it. The number of likes you get on a Facebook post only matters if your objective is to have people like it (don’t get me started on Facebook’s new feature that allows individual users to track their “performance” on the platform via a personal report that details out things like “engagement” and “new likes this week”).


The more measures you introduce to your marketing setup, the more things you have to worry about and the more benchmarks you have to fall short of. Restrict your measuring to those that really matter. Anything else is just vanity.

Correlation doesn’t always mean causation

I once sat in an all-agency briefing with a very smart marketer at a very large organization who presented research findings from a very impressive and very robust piece of work. They took us through a correlation between the amount someone used their product and the amount they knew about the product. They then explained that one of the key strategic initiatives going forward would be to concentrate on educating more people on the product, with the implication that this would drive increased usage.

Well, no. In some cases that might be right, but in this case it wasn’t. It makes perfect sense that the more someone uses something, the more they understand it. That doesn’t necessarily mean it works the other way around. I know how to use an iPhone from years of using one. I know what to look for in a whisky because I drink whisky. The likelihood of being bald correlates with increased age. Increased age causes a higher likelihood of being bald. Being bald does not cause increased age. If it did, we could concentrate on staving off baldness to prevent people getting old. Sadly, that’s not the case.

Distrusting intuition

Finally, I’m increasingly seeing marketers reluctant to trust their intuition without data to back it up. When it comes to many of the more intangible elements of marketing, there simply aren’t any guarantees. You can take ideas into testing, but recognize that consumers don’t know what they don’t know. And that the first reaction to anything different is generally discomfort.

If you rely purely on data and don’t allow your empathy, creativity, humour, sensibility, motivations — in short your humanity — to come through in your decisions, you’ll find it extremely hard to create distinctiveness. Trust your gut. It often knows better than your head.

Missed Ian's presentation at UX New Zealand 2017? Don't sweat! You can watch a video of his talk on the website, as well as download a copy of his slides.

Want to join the list of talented and insightful guest bloggers at Optimal Workshop? Get in touch with max@optimalworkshop.com and let her know your ideas!
