In this episode of Actionable Marketing In Minutes we talk with Mark Bowens about the best practices for duplicate content.

Use the player below to listen to this episode, or the download link to load it on your device for listening later. You can also find our RSS feed by clicking here, or subscribe by email in the sidebar to the left.

Download our Social Media Marketing Best Practices Guide here!

Not able to listen? Why not read the episode transcript below:

Problem:

The phrase “duplicate content” means just what it suggests – identical or exceptionally similar content appearing in more than one place online. The problem with duplicate content is that it confuses – or, better said, annoys – search engines in a few ways. First, of course, is the problem of not knowing which version of the content to index. Second, they don’t want to let people ‘game the system’ by dominating the SERPs with a ton of links that all point to the same content. And finally, users will abandon search engines that waste their time, so search engines have a strong incentive to filter out or penalize duplicate content.

Sometimes, duplicate content issues arise without your knowledge. If you’ve noticed a loss in traffic or rankings, you very well could be suffering from the effects of duplicate content. Often, site owners or managers are not even aware that they have a problem. That’s kind of scary.

To add insult to injury, you may have even caused your own problem. Either you or your programmer may have created duplicate content innocently enough. But you need to figure out how it happened so that the problem can be rectified and so it won’t be an issue going forward.

Solution:

The good news is: Solutions do exist. In fact, we’ll discuss a few today.

Did you know that if you have a print version of a webpage, it may be seen as a duplicate? Other ways you may have inadvertently created duplicate pages are through session IDs, sorting options, or affiliate codes. Session IDs are assigned to website visitors to track behavior, and each one can produce a unique URL for the same page. eCommerce and other large content sites offer sorting options for search results, and each sort order can generate its own URL. Affiliate codes, which connect a visitor to a referrer, append parameters in the same way. All are examples of where issues can occur.
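To make that concrete, here is a hypothetical illustration (the domain and parameter names are placeholders, not from any real site): all four of these URLs could serve the exact same product page, yet a search engine may treat each one as a separate page.

```
https://www.example.com/widgets/
https://www.example.com/widgets/?sessionid=ABC123
https://www.example.com/widgets/?sort=price-asc
https://www.example.com/widgets/?ref=affiliate42
```

One page, four addresses – and that’s before anyone copies the content anywhere else.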

So as you can see, normal, everyday life on the web creates duplicates. Of course, there are sometimes malicious and insidious reasons for duplicate content. Unscrupulous content curators or website developers can pinch your content without your permission. As content becomes more and more valuable, this is an ever-increasing problem, especially if you generate good content.

Today, Mark Bowens, one of our favorite SEO experts here at DirectiveGroup, has stopped in to speak with us. I’m sure he will enlighten us on how duplicate content might hurt our content marketing efforts and tell us what we can do to correct or prevent any problems.

Q&A:

Q: Hi, Mark. Thanks for coming by to share your insights with us today.

A: Glad to, Lisa.

Q: So, Mark, I’ve briefly touched on what duplicate content is, but can you tell us a little more?

A: Google says duplicate content usually refers to “substantive blocks of content within or across domains” that are either identical or too similar to be a coincidence. At the end of the day, it’s content that is similar or identical across multiple URLs. And it’s important to understand that it can be the result of content on or off your website.

Q: Why is it important to be mindful of duplicate content?

A: Google wants to index and serve up relevant and distinctive content to its users. When there is duplicate content, it is more difficult for Google to decide which version of the content to display when a search is made. Google may not know which version to index and rank, and may not know how to distribute link metrics across the duplicates.

Think of standing at a crossroads where road signs point in two different directions for the same final destination – where do you go? Google runs into the same issue with duplicate content.

Q: Does Google punish us for participating in duplicate content – whether that be publishing syndicated content or syndicating our content to others – or duplicating content on our own site or sites?

A: In general, no – you will not be penalized or de-listed. However, that doesn’t mean your site won’t lose rankings, or that a page you want to show up won’t be left out of the search results.

However, in extreme cases, when there is obviously an attempt to manipulate the search rankings, Google may manually review and penalize the site.

Q: What is de-listing? That doesn’t sound good.

A: No, it’s not good. They remove your site from their indices altogether, so it cannot be found in search at all.

Q: Wow! That could be a real nightmare! So, you say they do not de-list for duplicate content.

A: That’s right, Lisa. Just this month, Google’s Search Quality Senior Strategist, Andrey Lipattsev, stated in a YouTube video that Google does not have a duplicate content penalty. https://search.yahoo.com/yhs/search?p=who+is+googles+Andrey+Lipattsev&ei=UTF-8&hspart=mozilla&hsimp=yhs-002

That said, the real issue is not how much duplicate content you may have on your site; rather, it’s the lack of positive signals emanating from your site. In other words, if you’re not offering much that is unique or of value, you may not be getting much user interaction – or signals. And that will absolutely affect your ranking.

Q: What should I do if I have duplicate content?

A: There are quite a few different ways to resolve or even avoid duplicate content issues. I will briefly touch on some you may find on the Google webmaster page pertaining to this topic. https://support.google.com/webmasters/answer/66359

* Remove duplicate content manually if possible. Minimize occurrences of all similar content – including boilerplate repetition. You can do this by removing, updating or merging pages with duplicate or similar content.

* Use 301 redirects if you’ve restructured your site.

* Use the rel=canonical tag. This tag tells Google that the given page is a copy of another, preferred page. Alternatively, you can use the Parameter Handling tool in Google Search Console.

* Syndicate carefully. One beneficial thing you can do is have the site your content is syndicated on link back to your original content and, if they’re willing, use the noindex meta tag to keep search engines from indexing their site’s version of your content.

* Use Google’s Search Console to indicate your preferred domain name, so they will know how you want your site indexed.
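As a rough sketch of what two of those fixes look like in practice (example.com and the URLs here are placeholders, not a real implementation), the rel=canonical and noindex tags Mark mentions both live in a page’s head section:

```html
<!-- On a duplicate page (say, a print version), the canonical tag
     points search engines at the preferred URL to index -->
<link rel="canonical" href="https://www.example.com/widgets/">

<!-- On a syndication partner's copy of your article, this asks
     engines to follow the links but not index that version -->
<meta name="robots" content="noindex, follow">
```

A 301 redirect, by contrast, is set up at the server level – for example in your server configuration or your CMS’s redirect settings – rather than in the page markup itself.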

Q: I know I kept you a long time today, Mark. But, you had so much good information to share I wanted our listeners to get it all. Thank you very much.

A: Sure, Lisa.

Benefits:

As we’ve learned today, most of the time our duplicate content is created unintentionally. Fortunately, we now know we won’t be penalized for it, and we’ve learned how to correct the problems. To that end, many free tools are available to assist you, such as Copyscape, Duplichecker, Plagiarisma, and Plagium. Some of these check URLs; others can scan documents. Check them out – they’re very interesting, and very useful.

Once you fix those duplicates, you should notice better search rankings. And an added benefit is that your content will, no doubt, become richer and more valuable – which, in the long run, will make your marketing efforts more interesting to your prospects.

We hope you’ve found this information helpful. Please connect with us on Twitter @DirectiveGroup or on LinkedIn, and be sure to share this episode with your networks using hashtag #actionablemarketing. Join us next time for more actionable marketing in minutes.