Data is dangerous when you stop asking questions

A few weeks ago I was reviewing a competitive analytics report with an in-house marketer at a large online university. I pointed out a bump in the graph a few months back that had given a competitor a big boost while his own team saw only a minor improvement. I asked him what had caused it, and he sheepishly admitted, “We don’t know, and we haven’t had time to look into it.” Obviously the team hadn’t stopped developing new marketing strategies, even though they could see the landscape shifting around them in ways they didn’t understand.

Do-it-yourself analytics

Data analytics has become an essential part of a university’s digital marketing toolkit. In the past 15 years, it has become possible for most marketers to get a dashboard of activity for whatever they’re working on. They can see anything from clickstream data on the .edu website to social impressions and engagement on Instagram. We can even identify individual students and personalize content. Data is collected almost automatically; it’s cheap to store and easy to visualize.

Because these reports are easy to access, everyone is expected to be an analyst.

DIY analytics is a good thing. It’s good that people are learning to unfold data, to ask questions of it, and to describe their work and activities with more accuracy. But easy accessibility brings challenges, too. There is still a learning curve for analytics practitioners, and there is a risk of misidentifying trends or attributing outcomes to the wrong activities. It happens because these newly minted data professionals aren’t paying attention, or don’t know better, or see a correlation and confuse it with causation.

Despite 15 years of industry opportunity, most marketing professionals haven’t learned to powerfully leverage the data available to them.

Are gut-based decisions better than bad-data decisions?

In enrollment marketing, decisions based on bad data may be no more dangerous than gut-based decisions when it comes to achieving outcomes. Teams regularly build websites and print collateral with or without data to back them up. Even at some sophisticated institutions, effectiveness gets measured after the fact, and they see how right or wrong their “best-guess” approach turned out to be.

But if decisions founded on bad data are worse than their gut-based counterparts, it’s because the word “data” can inspire too much confidence. You get a lot of credibility when you say you’re following the data. It allows you to declare “I know I’m doing the right thing!” when in reality the data isn’t telling you anything at all. You may be listening to noise, or misinterpreting what’s happening. You may tell yourself a story from the data that doesn’t reflect reality.

A soggy example

Let me give you a real-world example: This summer we arrived in Buffalo, NY after one of the biggest storms of the year. The rain let up right after the plane touched down. We jumped in our rental car and set off for Niagara Falls. Google Maps showed us the route, but we were stunned by the traffic. Why was there so much congestion on a Thursday afternoon near the Buffalo airport?

We crept our way down the street towards an underpass, only to find it blocked. The police were directing traffic away from the now submerged roads. The stream, swollen with rain, had overflowed its banks.

And Google would have driven us chest-deep into the river. The once-accurate map data didn’t account for the new, watery reality. It seems like a silly comparison; my marketing data may lead me astray, but who would drive straight into the water because the computer told them to? Sadly, it happens all the time.

Bad-data decisions become dangerous when you’re so married to your reading of the data that you stop being inquisitive and paying attention to other signals.


So what should you do?

What then? If data can’t be trusted, or if you have to second-guess the data when it appears to tell you something, what should you do? How do you move forward without worrying you’re driving into the river?

The most important thing to learn about data analytics in a marketing context is this: there is no one final answer. There are good ways and better ways to communicate online. Some students will really respond to your landing page, or you may have gotten a great response to that comm flow. But there is no single best way, no one right answer. Once you find a better way, you have to realize that the circumstances, environment, personality, and perspective of students are flexible and temporary. People change, and the university’s offerings change. And if you rest on your laurels, sitting at the top of your best idea, and say, “Well, we did it! Now we have the answer!”, you will invariably find that two years from now the answer will be different and you’ll have been left behind.

So instead of picking the winner and going all in, you need to iterate and optimize. Test variations, and explore options.

The only constant is change. You will learn from your current optimization work, and some of it will carry over, but it’s doubtful that the best answer today will hold.
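If you want to compare variations rather than declaring a winner by feel, the statistics involved are lightweight. Here’s a minimal sketch in Python of a two-proportion z-test comparing two landing-page versions; every visitor and conversion number below is invented for illustration:

```python
from math import sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z-statistic for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the assumption that both versions convert equally
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical test: landing page A vs. landing page B
z = two_proportion_z(conv_a=120, n_a=4000, conv_b=150, n_b=4000)
# |z| > 1.96 roughly corresponds to 95% confidence the difference is real
print(f"z = {z:.2f}, significant at 95%: {abs(z) > 1.96}")
```

With these made-up numbers, version B looks better but the difference isn’t yet significant, which is exactly the kind of signal that should keep you testing instead of declaring victory.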

Falling out of love

Overcoming the temptation to fall in love with one data story is hard. The only solution that we’ve ever found is simple, but not easy: you have to keep asking questions. We keep reading. We keep looking. We dig and optimize and make recommendations and try new things because the only thing you can be sure of is that what worked last time won’t necessarily work just as well this time.

And that’s a good thing. Because not only will it keep us all in a job, but it means that as the world changes, we don’t stubbornly refuse to recognize and adapt. There is no single right answer; rather, there are a wide variety of opportunities to explore and learn from. We’re not tied to the way things have always been done. We can be smart and careful and wise and still be experimental and forward-thinking. We don’t need to fear changes or the new work to be done.

Here are a few questions to get you started:

  • Are there other explanations that would make the data look like this?
  • Is my dataset clean and accurate?
  • Am I comparing this data to the right targets or date ranges to understand it in context?
  • Does my new strategy or idea impact my key performance indicators? (Is this the best way to spend my time?)
  • Is there something about students right now that makes my insight true but temporary?
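To make the date-range question concrete, here’s a toy Python sketch (every number is invented) showing how the same month can look like a decline month-over-month but an improvement year-over-year:

```python
# Hypothetical monthly inquiry counts for a program page
inquiries = {
    "2023-09": 410, "2023-10": 385,
    "2024-09": 450, "2024-10": 395,
}

def pct_change(new, old):
    """Percent change from old to new."""
    return 100 * (new - old) / old

# Month-over-month, October looks like a drop...
mom = pct_change(inquiries["2024-10"], inquiries["2024-09"])
# ...but against the same month last year, it's actually up
yoy = pct_change(inquiries["2024-10"], inquiries["2023-10"])
print(f"MoM: {mom:+.1f}%  YoY: {yoy:+.1f}%")
```

If inquiries always dip from September to October, the month-over-month number is mostly seasonality; the year-over-year comparison is the one that speaks to whether your work is moving the needle.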

A better mindset

This mindset liberates an institution’s marketing team to engage with their best effort. When a new initiative comes down from the president’s office or new programs are developed, they can get to work testing and improving the way it’s communicated. They can give you accurate, timely, and useful reports about reception and engagement. And they can make recommendations about what to do better. They become a vital investment and an integral part of your strategic team, instead of an expense or a loudspeaker to use once a decision is made.

To move in the right direction, your marketing team needs clarity from the institution about goals. If they know the one or two things that matter, they can analyze data with the right perspective.

Building a culture of curiosity and inquiry that allows your team to keep asking questions in the quest for better outcomes will keep you safe and dry.

We offer analytics audits and consulting, so if you feel like your data interpretation or culture could use an upgrade, we’d love to chat.