I’m often surprised at how this one comes up again and again, especially at initial meetings with new clients. The notion is to use syndicated (or even copied) content to easily and cheaply build up a whole raft of keyword-rich pages on specific topics.

The pot of gold at the end of this particular rainbow is improved search engine rankings driving lots of new visitors. Unfortunately, it never turns out to be a pot of gold – usually it’s more like a crock of something less attractive.

The drawback is simple. Google likes unique content. It specifically doesn’t like duplicated content – and it’s pretty good at weeding it out of its search engine results.

If Google (and other search engines) didn’t do this, then – since the duplicate pages are all similarly relevant – search results would be littered with pages that differ in URL but are nearly identical in content. If that happened, people would soon switch search engines, so it’s not just bad for visitors, it’s also bad for Google.
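To illustrate the principle only (Google’s actual de-duplication is far more sophisticated and not public), here is a minimal Python sketch of one common near-duplicate test: break each page’s text into overlapping word ‘shingles’ and compare the sets with Jaccard similarity. The 0.8 threshold and the example text are arbitrary assumptions for the sketch.

```python
def shingles(text, size=5):
    """Return the set of overlapping `size`-word shingles in `text`."""
    words = text.lower().split()
    return {" ".join(words[i:i + size]) for i in range(len(words) - size + 1)}

def jaccard(a, b):
    """Jaccard similarity of two sets: |A intersect B| / |A union B|."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def looks_duplicated(page_a, page_b, threshold=0.8):
    """Crude near-duplicate test: high shingle overlap suggests copied content."""
    return jaccard(shingles(page_a), shingles(page_b)) >= threshold

# Example: the same syndicated article appearing on two different sites
article = "Ten tips for choosing a garden shed that will last you for many years to come"
print(looks_duplicated(article, article))  # True – identical text, similarity 1.0
```

Two copies of the same syndicated article score near 1.0 on a test like this, which is exactly why only one of them tends to survive into the results.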

So, I’ll be sitting in front of a prospective customer who has just passed me a glossy proposal from a syndicated content firm – claiming that, for a fraction of the cost of writing proper content, it can provide reams and reams of content that will do the same job. The proposal usually cites specific examples of high-performing pages too.

It looks attractive. Just add in the RSS feed, and you have a news section (or similar) with content that people are hungry for and that will be great for search engines.

It doesn’t take too much to debunk these claims.

First, a few quick searches in Google will usually reveal that the content provided to each customer is also provided to a few hundred others. None of these do particularly well in search engines, since they are all duplicates. More often than not, the content contains a link back to the originating article – and it will be this article that performs best in search engines.
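The ‘quick search’ is simply an exact-phrase query: paste a distinctive sentence from the article into Google in quotation marks and count how many different domains come back. A small Python sketch of building that query URL (the sentence is made up for illustration):

```python
from urllib.parse import quote_plus

def exact_phrase_search_url(sentence):
    """Build a Google search URL for an exact-phrase (quoted) query."""
    return "https://www.google.com/search?q=" + quote_plus(f'"{sentence}"')

# Pick a distinctive sentence from the syndicated article (made-up example text)
sentence = "Choosing the right garden shed is a decision you will live with for years"
print(exact_phrase_search_url(sentence))
# Open the URL in a browser: if dozens of unrelated domains carry the same
# sentence, the content is syndicated, not unique.
```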

Second, some of the more underhand companies use reference pages that are not typical of their work – that is to say, pages with unique content rather than syndicated content. Naturally, these perform well when tested. But testing other pages from their other clients almost always reveals the truth: each article is shared by hundreds of sites, and few of them gain any search engine benefit.

Third, the quality of the writing often isn’t that good. The work is usually subcontracted to writing firms with little quality control beyond a cursory glance. And, since each article is generally written by a different person, the articles don’t collectively form a ‘voice’ for the company – in some cases they directly contradict the company’s proposition.

Finally, the costs often aren’t that low. True, it can be a case of getting (for example) ten syndicated articles instead of two originals – but those two unique, original articles can be better written, more in line with the company’s proposition, of more value to visitors and of greater benefit in search engines. In some cases I have seen syndicated content cost as much as original content, but this isn’t usually the case.

People are always tempted by the easy path. We ‘know’ that there is no such thing as a free lunch, but it’s human nature to want to believe that we may have found the good deal that’s the exception to the rule.

With website content, it’s clear-cut. Duplicated content does not perform anywhere near as well as unique content (except, usually, on the originating site). Not only is that dire for search engine performance, it’s also a poor strategy for getting visitors to come back again and again – if your content isn’t unique, they can get it somewhere else.

