Saturday 7 September 2013

Sep 4th, 2013 - Sudden Drop In Traffic - A Thin Or Lack Of Original Content Ratio Issue?

Many people have reported a sudden drop in traffic to their websites since September 4th, 2013.

The Google Webmaster forum is full of related posts. A WebmasterWorld thread has been started. Search Engine Roundtable mentions 'early signs of a possible major Google update'. A group spreadsheet has been created. No one seems able to make sense of what is happening. There is a lot of confusion and speculation, without any definitive conclusion.

Since then, I have seen a major drop in traffic on a new website I am working on. However, traffic to this blog has remained the same: no impact.

I am going to post relevant information and facts here as I find them. If you have any relevant or conclusive information to contribute, please share it in the comments and I will include it here. Let's try to understand what has happened.

Facts & Observations

  • Many owners claim no black hat techniques, no keyword stuffing, only original content, legitimate incoming links.
  • Many owners say they have not performed any (or significant) modifications to their website.
  • All keywords and niches are impacted.
  • Both old and new websites are impacted.
  • Both high and low traffic websites are impacted.
  • Some blogs are also impacted.
  • It is an international issue, not specific to a country, region or language.
  • Sites with few backlinks are also penalized, not only those with many backlinks.
  • Nothing changed from a Yahoo or Bing ranking perspective.
  • One person mentions a site with thin content still ranking well.
  • At least one website with valuable content has been penalized.
  • Several sites acting as download repositories have been impacted.
  • Some brands have been impacted too.
  • So far, Google has nothing to announce.
Also:
  • In May 2013, Matt Cutts announced that Penguin 2.0 aims to get better at fighting black-hat techniques. He also announced that upcoming changes include better identification of websites with higher-quality content, higher authority and higher trust. Google wants to know whether you are 'an authority in a specific space'. Link analysis will become more sophisticated too.

Examples Of Impacted Websites

  1. http://www.keyguru1.com/
  2. http://allmyapps.com/
  3. http://www.techerhut.com
  4. http://domainsigma.com/
  5. http://www.knowledgebase-script.com/
  6. http://alternativeto.net/
  7. http://www.pornwebgames.com/ (adult content)
  8. http://www.fresherventure.net/
  9. http://www.casaplantelor.ro/
  10. http://www.dominicancooking.com/
  11. http://www.weedsthatplease.com/
  12. http://www.medicaltourinfo.com/
  13. http://www.safetricks.net/
  14. http://www.qosmodro.me/
  15. http://botsforgames.com/
  16. http://blog.about-esl.com/
  17. http://last-minute.bg/
  18. http://www.shine.com/
  19. http://www.rcmodelscout.com/
  20. http://itech-hubz.blogspot.nl/
  21. http://gpuboss.com/
  22. http://www.taxandlawdirectory.com/
  23. http://www.newkannada.com/
  24. http://places.udanax.org/review.php?id=33
  25. http://www.seniorlivingguide.com/
  26. http://listdose.com/
  27. http://indianquizleague.blogspot.nl/
  28. http://beginalucrativeonlinebusiness.com/
  29. http://www.teenschoolgirlsfucking.com/ (adult content)
  30. http://quickmoney365.com/
  31. http://www.orthodoxmonasteryicons.com/
  32. http://codecpack.co/
  33. http://www.filecluster.com/
  34. http://pk.ilookstyle.com/
  35. http://www.ryojeo.com/2013/08/ciptojunaedy.html

Hypotheses

  • Websites with content duplicated on other pirate websites are penalized.
  • Websites with little or no original content, or with badly written content, are penalized (ratio of thin to substantial content).
  • Websites with aggregated content have been penalized.
  • Sites having a bad backlink profile have been penalized.
  • Sites having outbound links to murky sites or link farms have been penalized.
  • Ad density is an issue.
  • Google has decided to promote brand websites.
  • This is a follow-up update to the August 21st/22nd update, at a broader scale or deeper level.
  • An update has been rolled out that contains a bug (or a complex, unanticipated and undesirable side effect).

Analysis

Using collected information and data gathered in the group spreadsheet:
  • The average drop in traffic is around 72%
  • No one reports use of black hat techniques
  • 12.8% report use of grey hat techniques
  • 23.1% report an impact before September 3rd/4th
  • 7.7% have an EMD (exact-match domain)
  • 17.9% had a couple of 404 or server errors
  • 17.9% are not using AdSense
  • 30.8% admit thin content
  • 38.5% admit duplicate content
  • 25.6% admit aggregated content
  • 15.4% admit automatically generated content
  • 64.1% admit thin, duplicate, aggregated or automatically generated content
  • The number of backlinks ranges from 10 to 5.9 million
  • The number of indexed pages ranges from 45 to 12 million
The spreadsheet sample contains only 39 entries, which is small.
  1. The broad range in the number of backlinks seems to rule out a pure backlink (quality or quantity) issue.
  2. The broad range of indexed pages points at a quality issue, rather than a quantity issue.
  3. More than 92% do not have an EMD, so this rules out a pure exact-match domain issue.
  4. More than 82% did not have server or 404 issues, so this rules them out as the main cause.
  5. 17.9% are not using AdSense, meaning this cannot be solely a 'thin content above the fold' or 'too many ads above the fold' issue.
  6. Some brand websites have been impacted, so it does not seem that Google is trying to promote them over non-brand websites.
  7. Domain age, country or language are not discriminating factors.

Best Guess

By taking a look at the list of impacted websites and the information gathered so far, it seems we are dealing with a Panda update where sites are delisted or very severely penalized in search rankings because of quality issues.

These are likely due to thin content, lack of original content, duplicate content, aggregated content or automatically generated content, or a combination of these. It seems a threshold may have been reached for these sites, triggering the penalty or demotion.

Regarding duplicate content, there is no evidence confirming for sure that penalties have been triggered because a third-party website stole one's content. More than 60% do not report duplicate content issues.

To summarize, the September 4th culprit seems to be a high ratio of thin or unoriginal content, leading to an overall lack of high-quality content, which in turn leads to a lack of trust and authority in one's specific space.

Unfortunately, Google has a long history of applying harsh mechanical decisions to websites without providing any specific explanation. This leaves people guessing what is wrong with their websites. Obviously, many of the impacted websites are not the products of hackers or ill-willed people looking for an 'I win, Google loses' relationship.

Some notifications could be sent in advance to webmasters who have registered with Google Webmaster Tools. If webmasters register, it can only mean they are interested in being informed (and not after the fact). This would also give them an opportunity to solve their website issues and work hand in hand with Google. So far, there is no such opportunity or reward system.

Possible Solutions

Someone from Network Empire claims that Panda is purely algorithmic and that it is run from time to time. If this is true, it might explain why no one received any notification or manual penalty in Google Webmaster Tools, and why no one will.

Google might just be waiting for people to correct the issues on their websites, and will 'restore' these sites when they pass the Panda filter again. The upside is that this update may not be as fatal as it seems.

Assuming the best guess is correct, the following would help solve or mitigate the impact of this September 4th update:
  • Re-read Dr. Meyers' post about Fat Pandas & Thin Content.
  • Thin content pages should be marked as noindex (or removed from one's website), or merged into plain/useful/high-quality content pages for users (see the snippet after this list).
  • Pages with low-quality content (lots of useless text) should preferably be removed from the website, or at least be marked as noindex.
  • Internal duplicate content should be eliminated by removing duplicate pages or by using rel="canonical" (canonical pages).
  • Content aggregated from other websites is not original content. Hence, removing these pages can only help (or, at least, these pages should be marked as noindex).
  • A lack of valuable content above the fold should be solved by removing excessive ads, if any.
  • Old pages not generating traffic should be marked as noindex (or removed).
  • Outbound links to bad pages should be removed (or at least marked as nofollow), especially if they do not contribute to a good user experience. This helps restore credibility and authority.
  • Disavow incoming links from dodgy or bad-quality websites (if any). One will lose all PageRank benefit from those links, but it will improve one's reputation (see the sample disavow file below).
  • Regarding Panda, it is known (and I'll post the link when I find it again) that one bad-quality page can impact a whole website. So being diligent is a requirement.
  • Lorel says that he has seen improvements on his clients' websites after de-optimizing and removing excessive $$$ keywords.
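
For reference, here is a minimal sketch of what the noindex, canonical and nofollow markup mentioned above looks like in a page's HTML. The example.com URLs are placeholders, to be adapted to one's own pages:

    <!-- In the <head> of a thin or low-quality page: keep it out of
         the index while still letting crawlers follow its links. -->
    <meta name="robots" content="noindex, follow">

    <!-- In the <head> of a duplicate page: point to the preferred version. -->
    <link rel="canonical" href="http://www.example.com/original-article/">

    <!-- In the body: keep an outbound link for users without endorsing it. -->
    <a href="http://www.example.com/murky-page/" rel="nofollow">resource</a>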
Something to remember:
  • Matt Cutts has confirmed that noindex pages can accumulate and pass PageRank. Therefore, using noindex may be more interesting than removing a page, especially if it has accumulated PageRank and if it links to other internal pages.
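
As for disavowing, Google's disavow tool takes a plain text file where each line is either a single URL or a domain: directive, and lines starting with # are comments. A minimal sketch, with made-up domains for illustration:

    # Ignore all links from this link farm
    domain:spammy-link-farm.example

    # Ignore a single dodgy page linking to the site
    http://dodgy-directory.example/links/page12.html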

7 comments:

1. Indeed, it's a Panda content quality update.
   A lot of thin sites got busted.

   See alternative.to - good idea, a lot of spam; a lot of empty content pages.
   Filecluster.com is a 100% copycat of softpedia.com.

   Also, a lot of sites that list tech specifications and smartphone software got unranked.

   Google Play and the iTunes Store are down in search, but blogs and sites with real software reviews (not copy & paste descriptions) are up.
2. Mine has been severely affected (www.weedsthatplease.com). I was first page, top 3 if not the top spot, for searches. I am still number one in Bing, Yahoo! and DuckDuckGo - Google has been the only engine taking me off the first page for hundreds of phrases across about 70 crawled pages. It is NOT due to thin content or major changes I made. It began exactly on May 8th, 2013 (for my website).

   Google's relevance has dropped. One only needs to look at the sites it has moved above me to see that their content is NOT better than what I offered. Yahoo! has, for the first time in history (August 2013), exceeded Google in searches - so others know of the relevancy issues; it is clear.
3. This is a very interesting article. Thank you for your analysis. Please take my site (number 3) off the list as I prefer that it does not appear. Thank you very much!

   1. I just did. Let me know if I removed the wrong one. If so, put the link in a comment and I will delete the comment after.
4. I do not agree with you on this issue of duplicate or badly written content, because hundreds of sites with low-quality or completely copied content still show up at the front of the search engine. Google has double standards.

   1. Don't worry, they will be caught sooner or later. Watch this video for more explanation: https://www.youtube.com/watch?v=gMDx8wFAYYE Google does not have double standards... Have patience!
5. This comment has been removed by a blog administrator.