Content marketing has evolved, and the way we measure campaigns has evolved with it. Everyone is trying to find the perfect formula, and the Globe Edge Content Studio is no exception. As a digital producer, I’ve found myself beginning to group metrics into three distinct categories.
The old reliables
Industry players are no longer satisfied with click-through rates, page views and time spent. They want bigger and better results. It makes sense, especially if you have the ability and opportunity to access more sophisticated metrics.
At the same time, I can’t dump on click-through rates and page views. Every content marketer struggles with reach at some point, and metrics that measure whether people are actually consuming your content will always be relevant. The new and improved engaged time spent, which tracks readers scrolling or clicking on your page, might end up replacing the traditional time-spent metric. But there’s one thing engaged time spent doesn’t have on its side: a history of data.
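To make the difference concrete, here is one way engaged time could be derived from activity events logged during a page view. This is an illustrative sketch only — the five-second engagement window and the event model are assumptions, not any vendor's actual implementation.

```python
from datetime import datetime, timedelta

# Illustrative only: one way "engaged time" could be derived from
# activity events (scrolls, clicks) logged during a page view. The
# five-second engagement window is an assumption, not a standard.
ENGAGEMENT_WINDOW = timedelta(seconds=5)

def engaged_seconds(event_times: list[datetime]) -> float:
    """Total seconds covered by (possibly overlapping) engagement windows."""
    times = sorted(event_times)
    total = timedelta()
    # Each event keeps the reader "engaged" until the window expires
    # or the next event arrives, whichever comes first.
    for prev, nxt in zip(times, times[1:]):
        total += min(nxt - prev, ENGAGEMENT_WINDOW)
    if times:
        total += ENGAGEMENT_WINDOW  # window after the final event
    return total.total_seconds()
```

With activity at 0, 2 and 30 seconds, this counts 12 engaged seconds: the idle stretch between 7 and 30 seconds is excluded, which is exactly what separates engaged time from raw time spent.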
Time spent and click-throughs are universal metrics. There’s enough data out there to form robust industry-wide benchmarks, and the metrics are straightforward enough that comparing your click-through rates to another marketer’s click-throughs can be an apples-to-apples comparison. There is greater variance in how more complicated metrics are measured.
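Part of what makes click-through rate so comparable is that the arithmetic is the same everywhere: clicks divided by impressions. A minimal sketch — the campaign numbers and the 0.5 percent benchmark below are invented for illustration, not real industry figures:

```python
# Click-through rate is simple enough to benchmark apples-to-apples:
# clicks divided by impressions. Figures here are invented for
# illustration; real benchmarks vary by vertical.
def click_through_rate(clicks: int, impressions: int) -> float:
    """CTR as a fraction of impressions (0.0 when nothing was served)."""
    return clicks / impressions if impressions else 0.0

campaign_ctr = click_through_rate(clicks=420, impressions=60_000)
benchmark_ctr = 0.005  # assumed industry benchmark of 0.5%

print(f"Campaign CTR: {campaign_ctr:.2%} vs benchmark {benchmark_ctr:.2%}")
```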
The Globe Edge team keeps regular tabs on performance for different content and advertiser verticals. Benchmarking in the content marketing landscape keeps you motivated, and it gives you goals for further optimization. It can help prove whether your content strategy is working. There’s a lot to love about the old reliables.
The new, young things
You’ve probably heard the words engaged time spent, scroll depth and average finish thrown around a lot lately, and there’s plenty to love about these metrics.
For the longest time, there was no way to get granular when measuring content performance, and metrics such as average finish rate have helped content marketers better prove their ROI and glean insights about their target audience.
But each of these metrics only tells part of the story. They work best in tandem and as a suite of tools. As with all metrics, the numbers might mean nothing without a clear objective or KPI – which means the best metric for one marketer may be of little use to another. More metrics open a bevy of doors. They can also be distracting and confusing.
The ones you didn’t know you had
As you create more content, you might end up with more questions. Why was this piece more effective? Do the returns justify the time and investment put into this video? Would this story have done better with a different headline, or if it had been promoted with different hashtags?
Once you have a good amount of content that’s been tried and tested, an audit can be incredibly helpful for identifying where to focus your efforts and which assumptions were entirely wrong.
Here’s an example:
Earlier this year, we audited scroll-depth percentages across a selection of our native-advertising content. We had always assumed that adding more images would increase attention span and scroll behaviour because they gave readers a visual break from the text.
It turns out the biggest factor in increasing scroll depth was a larger font size (even on desktop). It’s something we never would have known if we hadn’t been regularly assessing our content as a whole.
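At its simplest, an audit like this just groups scroll depth by the attribute you want to test. Here is a minimal sketch of that grouping — the field names, font sizes and scroll-depth figures are made-up placeholders, not Globe Edge data:

```python
from statistics import mean

# Hypothetical audit data: the scroll-depth figures and font sizes
# below are invented for illustration, not Globe Edge results.
articles = [
    {"font_px": 16, "scroll_depth_pct": 48},
    {"font_px": 16, "scroll_depth_pct": 55},
    {"font_px": 18, "scroll_depth_pct": 63},
    {"font_px": 18, "scroll_depth_pct": 71},
    {"font_px": 20, "scroll_depth_pct": 74},
]

def avg_scroll_by(articles, attribute):
    """Average scroll depth for each value of the chosen attribute."""
    groups = {}
    for article in articles:
        groups.setdefault(article[attribute], []).append(article["scroll_depth_pct"])
    return {value: mean(depths) for value, depths in groups.items()}

print(avg_scroll_by(articles, "font_px"))
```

If larger font sizes consistently show deeper average scroll in a breakdown like this, that’s the kind of signal an audit surfaces that no single metric would.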