Why Data-Driven Decisions Work: A Short Use Case



This is a story of how we failed. But also of how we learned and how we got better.

A few months back, I was charged with digging into the Organic Traffic for one of our clients to report on the positive impact our blogging efforts had on their site traffic. I was using the Google Analytics Channels Report to see Organic Traffic over a specific time frame and comparing it with the previous period.

What I uncovered puzzled me. Actually, it straight up concerned me. Organic Traffic had dropped. Not just a little bit, either. Users, New Users, Sessions, Bounce Rate, Pages per Session, Average Session Duration – every metric had moved in the wrong direction, and by decent amounts.

[Screenshot: Google Analytics Channels report showing organic traffic down]

There's nothing more terrifying than seeing red arrows when you're expecting green ones. This had never happened before. Never had I seen organic traffic on a website take a turn for the worse after heavy content marketing efforts. That's not how it works – that's not how it's supposed to work. I was stumped.

[GIF: a stumped Tom Hanks]

We had done everything right, too. How could we go back to the client with this kind of news? We called an internal meeting to review and discuss what the data showed. The rest of the team was in disbelief as well. We talked through the updates that had been made to the site and pulled our hair out wondering what went wrong – and, more importantly, why. At that point, our developer mentioned that the hosting company this client uses requires all development to happen on a staging site before getting pushed live.

The obvious question came up. "What stops Google from indexing the development site, then?"

Turns out, the staging site uses a noindex tag to keep search engines from indexing it. Apparently (and unbeknownst to us), that tag doesn't get removed when the staging site is pushed live. We checked the source code of the live site, and sure enough, there it was: a noindex tag, telling search engines to ignore everything on the domain and pretend it didn't exist. Not surprisingly, the date organic traffic started declining aligned perfectly with the date our updates were pushed live. We removed the offending tag posthaste, and organic traffic started climbing almost overnight.
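A check like this is easy to automate. As a rough sketch (the helper names here are our own illustration, not part of any tool we actually used), a few lines of stdlib Python can scan a page's HTML for a robots meta tag carrying a noindex directive, so a stray tag gets caught right after a deploy instead of months later:

```python
from html.parser import HTMLParser


class NoindexChecker(HTMLParser):
    """Scan an HTML document for a robots meta tag containing 'noindex'."""

    def __init__(self):
        super().__init__()
        self.noindex_found = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        name = (attrs.get("name") or "").lower()
        content = (attrs.get("content") or "").lower()
        # <meta name="robots" content="noindex, ..."> hides the whole page
        # from search engines.
        if name == "robots" and "noindex" in content:
            self.noindex_found = True


def has_noindex(html: str) -> bool:
    """Return True if the HTML contains a robots noindex meta tag."""
    checker = NoindexChecker()
    checker.feed(html)
    return checker.noindex_found


# Hypothetical pages: the kind of tag a staging environment might inject,
# versus a clean live page.
staging_page = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'
live_page = '<html><head><title>Home</title></head></html>'

print(has_noindex(staging_page))  # True
print(has_noindex(live_page))     # False
```

Pointing something like this at the live URL after every push (say, via a fetch in a deploy script) would have flagged the problem the same day our updates went out.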

We checked again a couple months later and the data was much happier. Organic traffic had improved across the board when compared to the time period before our updates were made.

[Screenshot: Google Analytics Channels report showing organic traffic up]

Crisis averted. Well, almost. We still had to tell the client.

So, what's the moral of this story? What's the lesson learned? There are two of them, actually.

Lesson One

Had we not taken the time to make data-driven decisions, we likely would have lost a good client and their site would still have that noindex tag on it – forever a black hole in the eyes of Google. We could have just taken the data at face value and assumed that our tried and true tactics just didn't work this time.

Sometimes, taking data at face value can have disastrous results, as it would have here. In this situation, the data clearly showed that organic traffic had taken a nosedive. We forced ourselves not to make a knee-jerk decision and call it quits. After all, nothing we did should have caused anything to decrease – in fact, that's exactly why we dove deeper: the data showed the opposite of what we expected to happen.

Those are the situations that warrant a deeper look – to peel back the layers and get your hands dirty. Only by doing that did we uncover the true source of the problem.

Lesson Two

We had to own it.

The data didn't lie. It wasn't wrong. That client's organic traffic really did drop. It shouldn't have, but it did. We were the ones who failed to notice that the noindex tag carried over when the staging site was pushed live. We could have blamed the hosting company, because it was a system we had never used before. We could have blamed the data for being faulty. We could have blamed Google for being slow to re-index the pages. But none of those would have done any good for anyone – except for us. All of those excuses would have taken advantage of the client's lack of understanding, which is why they hired us in the first place: to help them understand.

At the end of the day, we were the ones that pulled the trigger and updated the site. The onus had to be on us. We took one hundred percent of the blame and apologized to our client. This client was understanding and took it in stride. Not to imply they were okay with what happened, but they understood the situation and trusted that we did our due diligence.

Living Up to a Core Value

To be transparent is one of our five core values here at Leighton Interactive. It's not easy, especially when we screw up and need to own up to it. I firmly believe that transparency is the single most important thing we can do for ourselves and for our clients. If we can't be transparent, then what else is there?

To bring this back full circle, when it comes to analyzing data, being transparent and open minded is critical.

