This article originally appeared on the Huffington Post. Below is the full version of the Q&A.
As the digital revolution continues to transform how companies market their products to consumers, brands and agencies are facing a set of challenges the industry has never encountered before.
There remains industry-wide confusion as to how to effectively track the success metrics of both content and content distribution. In fact, according to HubSpot’s State of Inbound 2014-2015, measurement is the number one challenge marketers face, with nearly 30 percent of those surveyed reporting that proving the financial return of their content-driven marketing activities was a widespread problem.
So how are brands and agencies attempting to overcome this stumbling block? How does their approach vary? I recently spoke to Leanne Brinkies, Global Head of Native Advertising at Sydney-based content agency, King Content, and Darin Diehl, Director of Content and Shared Services from direct bank Tangerine in Toronto, to find out.
Note: King Content and Tangerine have no affiliation and do not directly work with one another.
Industry-wide, we’re quickly learning that organic content distribution can only get you so far. How important do you think paid amplification is as part of an overall content strategy?
LB: My personal opinion is that if you don’t invest in any amplification of content, your ability to assess ROI is much lower because you have a smaller pool. That’s why King Content has created a specialist native advertising division—because we realized our clients weren’t seeing the returns they needed. If you’re not getting enough eyeballs on the content you’ve paid for, to me that isn’t a very good equation.
DD: In terms of content distribution, you have to start with your owned and earned channels. And then of course there’s paid channels, and we generally work with a media agency for that. The agency helps us devise a paid strategy that we can continuously optimize and track to ensure we’re getting the most out of our paid distribution dollars. This year we’re all about trying new things, seeing how they perform, and then optimizing them to ensure we see the results we need.
A problem facing most marketers today is how to track and measure the ROI of content. Are you any closer to solving the quandary?
LB: I think it goes back to the metrics we talked about before. It’s about how many people are viewing the content, where they are in the funnel, how they’re engaging with the content, how they’re sharing it, and how this is affecting sales and performance over time. At the end of the day, all our clients want to know how many sales they’re making and at what cost per acquisition those sales are delivered. But so far it’s been difficult to prove that case with content.
The last-click attribution model doesn’t really work with content. It doesn’t effectively demonstrate how content is performing or how engagement with content positively impacts a business’s bottom line. For example, in a last-click attribution scenario, search will always offer the lowest cost per acquisition (CPA). But with a content-driven approach, that CPA, even for search, should fall over time: because you’ve brought so many more people into the funnel, the number of searches a user performs should decrease, and when a user does come through, they should be further down the funnel.
A perfect attribution model is the holy grail of content marketing that we’re all trying to discover, but we also need to find a way to demonstrate the value of engagement metrics and draw attention to those top-funnel awareness considerations.
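To make the attribution point above concrete, here is a minimal sketch (not any agency’s actual model, and with entirely hypothetical journey data) contrasting last-click attribution, which credits only the final touchpoint, with a simple linear multi-touch model that spreads credit across every touchpoint in the journey:

```python
# Illustrative sketch with hypothetical data: last-click vs. linear attribution.
from collections import defaultdict

# Each journey is the ordered list of channels a user touched before converting.
journeys = [
    ["content", "social", "search"],
    ["content", "search"],
    ["search"],
    ["content", "email", "search"],
]
conversion_value = 100.0  # assumed revenue per conversion

last_click = defaultdict(float)
linear = defaultdict(float)

for journey in journeys:
    # Last-click: the final channel (often search) gets all the credit.
    last_click[journey[-1]] += conversion_value
    # Linear: every touchpoint in the journey shares the credit equally.
    share = conversion_value / len(journey)
    for channel in journey:
        linear[channel] += share

print("last-click:", dict(last_click))  # search appears to drive everything
print("linear:    ", dict(linear))      # content's contribution becomes visible
```

Under last-click, search collects all the credit even though content started three of the four journeys; the linear model is one crude way of surfacing that top-of-funnel contribution.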
DD: We closely track and monitor a range of metrics and KPIs so we can see what the results are, learn from the data, and then make adjustments accordingly. I would argue that we’re always going to be in the process of figuring it all out. Things are constantly changing—Google makes changes to its algorithms and user behaviour changes, so to truly understand the ROI of something, you’re going to need to constantly revise the ways in which you’re measuring to understand the impact your content is making.
I like to break our strategy down into three separate components because success looks different for each of them. Firstly, we look at whether the content is being consumed—it’s important to break through the noise by creating something that’s going to make a connection with your audience. Then we measure engagement to see if the audience is sharing or taking some kind of measurable action, including following through on a call to action. And lastly, we look at the impact on brand perception and whether we’re influencing conversion.
You’ve both mentioned the importance of user engagement with content. Have you found an effective means of measuring the impact your content is making on your audience?
LB: We definitely track key metrics like views, shares, and how much time a user spends reading a piece of content, which I think are all key from an engagement perspective. On our end we also measure what types of audiences or personas engage with content. This enables us to create themes, pillars and topics that relate to different audiences when we’re creating our content calendars. We also have a proprietary measurement system, Communiqué, that allows us to categorize all the data we collect, so we can see how each of these audiences or personas is performing. Tracking that data over time can completely change who we’re trying to talk to.
Another important element we track is the connection between the audience reached through content amplification and the amount of time a user has spent with a piece of content subsequent to this connection. A positive outcome is if the user has spent a considerable amount of time with the content and potentially shared it. On the flipside, where we’d say there’s a negative outcome is if someone has gone to a piece of content and has left pretty quickly. In this scenario, we would assume that either the content wasn’t right for the reader, we’ve reached the wrong audience, we need to work on changing the content, or there’s something wrong with the site itself that’s causing people to leave. We would then look for ways to tweak and change the amplification strategy and/or the content to improve a client’s results. In terms of what our clients look at the most, I would say that time on site is definitely an important key performance indicator (KPI) many of them are looking at.
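The engagement triage described above can be sketched in a few lines. This is a hypothetical illustration, not King Content’s actual logic, and the time thresholds are assumptions chosen purely for the example:

```python
# Hypothetical sketch: classify a single content visit as positive, negative,
# or neutral based on time spent and whether the reader shared the piece.
def classify_session(seconds_on_page: float, shared: bool,
                     engaged_threshold: float = 60.0,
                     bounce_threshold: float = 10.0) -> str:
    """Return 'positive', 'negative', or 'neutral' for one content visit."""
    if shared or seconds_on_page >= engaged_threshold:
        return "positive"   # considerable time spent, or content shared onward
    if seconds_on_page < bounce_threshold:
        return "negative"   # left quickly: wrong audience, wrong content, or site issue
    return "neutral"
```

A run of "negative" sessions for one persona would then prompt exactly the follow-up described in the interview: re-examine the amplification targeting, the content itself, or the site.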
DD: I would say that it really depends on the type of content. One example is when all the changes happened with the TFSA last year, we put out some content that explained how that product works, and it was highly consumed and engaged with because people needed to understand a product that was in the news. This is perhaps an example of content that sits lower down the funnel. But sometimes you might create a post or a video or an infographic that focuses not so much on a product, but instead on a habit or behaviour that’s based on a personal story or experience that touches people. Something that tells a story about a personal finance victory, big or small. It could be how someone was able to pay off their student debt or pay off their mortgage early. It’s all about storytelling that people engage with, find helpful, and draw insights from. And how do you measure the impact of that? Maybe metrics like views and engagement in the form of clicks if there’s a call to action, and then on social you might look at interactions.
We’re currently looking at all those things, and we’re learning along the way just like everybody else in this space. We’re all students of content marketing. We’re learning as we go, and that’s exactly why we measure things—so we can optimize it as we go.
How do you benchmark the success of your efforts?
LB: We have enough data in our system now that has allowed us to create benchmarks across different categories and different types of audiences. We generally try to allocate a period of time at the very beginning of a content marketing plan and say that during this time, we’ll be benchmarking performance to guide future initiatives. Content marketing isn’t a set-and-forget strategy, where we lock in 12 months of content we’re going to deliver. It actually needs to be very much a live plan, and after the first month everything is up for discussion based on how well that first month of activity went. This allows us to see what content is being engaged with and in what ways, and then we can set benchmarks for those key metrics. Some clients will come to us and say “these are the metrics we want”, but most of them aren’t like that because content marketing is so new to them that they don’t know exactly what success looks like for them.
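The benchmarking period described above can be illustrated with a small sketch. The numbers, categories, and metric names here are entirely hypothetical; the point is simply how a first month of activity becomes the baseline that later performance is compared against:

```python
# Illustrative sketch with hypothetical numbers: derive per-category
# benchmarks from the first month of a content plan.
from statistics import mean

# First-month observations per content category (assumed values).
first_month = {
    "how-to": {"avg_time_s": [95, 120, 80], "share_rate": [0.04, 0.06, 0.05]},
    "news":   {"avg_time_s": [40, 55, 35],  "share_rate": [0.02, 0.03, 0.02]},
}

# Benchmark = average of the first month's observations for each metric.
benchmarks = {
    category: {metric: mean(values) for metric, values in metrics.items()}
    for category, metrics in first_month.items()
}

def vs_benchmark(category: str, metric: str, observed: float) -> float:
    """Observed value relative to the category benchmark (1.0 = on par)."""
    return observed / benchmarks[category][metric]
```

A second-month piece could then be reported as, say, 1.3x its category benchmark for time on page, which is the kind of insight-over-raw-data reporting the interview describes.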
My view is that the more data and insights we can provide clients with the better. Clients can read data, but where we can add value is by providing insights that can show how things need to change so we can improve ROI.
DD: There’s an interesting debate going on out there at the moment. Recently, Seth Godin did an interview where he explained that you can’t over-metricize and industrialize content to the point where you’re sacrificing its integrity. But still, we need some way of measuring whether or not our strategy is having an effect. As I’ve said, we’re creating benchmarks across a few sets of metrics. First, is the content being consumed? Second, are people actually reading it? This would include metrics like time on site, views, clicks, and so on. Lastly, we track whether consumption of the content has influenced conversions. Are prospects becoming clients, or are existing clients opening new accounts? We also want to track the effect our content is having on the brand—we want to know how this content affects a client’s perception of Tangerine overall. Are they aware of us? Does it evoke a positive feeling towards the bank and its products?
There will likely never be a hard-and-fast, one-size-fits-all solution to measuring the financial return of content marketing. “It’s really a constant learning model,” explains Diehl. “I don’t think there’s a 'plug and play' for this yet, and perhaps there never will be because the ground beneath us is always shifting. You’ve got to learn as you go and adjust as you go.”