One of marketing's most overused, and frankly tired, clichés is the idea of “breaking through the clutter.” And with so many media outlets available, there’s plenty of clutter for advertising and general marketing messages to break through.
This clutter has definitely asserted itself in the world of content marketing, where enormous time and effort are spent generating content, which in turn creates even more clutter. Clutter like this blog post, for example. Or your latest video. Or that company’s online magazine.
Does any of it make a difference? And how would you know if it did?
Many marketers believe the answer lies in metrics: simply counting page views, video views, downloads or likes. Bigger numbers are better, right? That’s how it’s always been done: TV shows with the most viewers stay on the air longer and charge a premium for advertising; politicians with the most votes win the election. Counting is a measure, and you can loosely assume that if a message reached a lot of people, it made a difference. But did it?
A step further in the realm of measurement is trying to determine whether people are engaging with your content. For many, it’s not enough that someone liked a post; did they also share it with their network, retweet it or comment on it? Traditional trade media have long measured the “pass-along” rate of their print publications, and sharing is the digital equivalent. So many assume this is a good measure of “engagement” and proof that the content is successful. But it’s still just counting. Did it make a difference?
Another way is simply to ask people. Many conduct a survey or a quick poll to learn what content people are interested in, what they recall reading or viewing, or how they rate various materials. Magazine readership studies have asked these kinds of questions for years, and they are making their way into digital content as well. Even Google has created pop-ups asking whether you are satisfied with the search results you just received. But people tend not to respond as thoroughly as survey sponsors hope, leaving open the question of whether the content made a difference.
Still others try to measure the success of content by sales, which is perhaps the hardest connection of all to draw.
The key, in my view, is not relying solely on the outputs described above. Each of them provides a number or a rating, not an answer to whether your content made a difference. To know that, you need to know why you are creating the content, what its purpose is, and what its measures of success will be. Sadly, many stop at counting the output and checking an item off their list: “Hey, we published stuff!”
To really make a difference, content should be tied to your company’s business and marketing objectives. If you don’t have those objectives, or don’t know what they are, it is far harder to know whether your content is working.
Beyond just counting the volume of content, here are some helpful (and admittedly subjective) questions to ask about your content to determine whether it is making a difference, rather than just adding to the clutter:
• Does it help further position your brand the way you want to be perceived?
• Does it support your key sales focus for the year?
• Does it help your sales team do its job more effectively?
• Does it educate your potential customers about a particular product or service attribute?
• Does it solve a customer problem or challenge?
• Does it entertain?
These qualities may not be immediately measurable, but they are often critical to building affinity for your brand over time, and to actually making a difference.