Google launched Panda 2.2 in late June of this year. The update marks the fourth time the company has filtered sites through its new ranking factor, each time with tweaks and modifications based on the results of the previous pass. Here are a few things you need to know about the nature of Google’s Panda, as well as how to prevent the change from harming your web presence.
Panda is a Negative Ranking Factor
Panda is not designed to reward high-quality content, but rather to punish lower-quality content. In theory, Panda should prevent low-quality sites, such as scraper sites or sites with lots of duplicate content, from dominating the search rankings, even if they contain many keywords relevant to the search. Other parts of Google’s algorithm focus on the relevance of a result; Panda is geared specifically towards the quality of the content provided. If enough pages on a website score poorly enough under Panda, the entire website will be punished in the rankings, essentially being flagged as a scraper site or other low-quality source.
So what constitutes lower-quality content? For starters, duplicate content will have a huge impact on a website’s Panda ranking. This is a specific effort on the part of Google to punish scraper site practices of copying quality content for the purposes of black-hat SEO.
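Google has never published how Panda detects copied content, but a classic technique for spotting near-duplicates algorithmically is word shingling with Jaccard similarity. The sketch below is purely illustrative of that general idea, not Google’s actual implementation:

```python
# Illustrative sketch: near-duplicate detection via word shingles and
# Jaccard similarity. A textbook technique, NOT Google's actual Panda
# implementation, which has never been published.

def shingles(text, w=3):
    """Return the set of w-word shingles (overlapping word runs) in a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + w]) for i in range(len(words) - w + 1)}

def jaccard(a, b):
    """Jaccard similarity between two shingle sets (0.0 to 1.0)."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

original = "the quick brown fox jumps over the lazy dog near the river"
scraped  = "the quick brown fox jumps over the lazy dog near the road"
fresh    = "panda updates reward sites that publish consistently useful pages"

print(jaccard(shingles(original), shingles(scraped)))  # high score: near-duplicate
print(jaccard(shingles(original), shingles(fresh)))    # low score: original content
```

A scraper that copies a page wholesale and changes a few words still shares almost every shingle with the original, which is exactly why lightly-reworded copies are easy to flag.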
Other red flags which hurt a page’s Panda score include the quality of the site’s own content and the quality of the sites linking to the page. That is, where your links come from can now hurt your website if the linking site is of poor quality. So where before it was only important for your own site to have quality content, it is now also important that the sites which link to you be reputable.
Additionally, the number of advertisements on your page can now hurt your rankings, as Panda includes advertisement density in its calculations. If you have a page with high-quality content which nonetheless makes heavy use of ads (such as Google AdSense units), Panda could still label it a low-quality page. It is therefore important to balance advertisements and content on any given page, and to make high-quality content the rule rather than the exception.
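Google does not say how it measures advertisement density, but the idea the paragraph above describes is a simple ratio: how much of the page is ads versus actual content. The following toy heuristic (the `ad_density` function and the sample page are hypothetical, not anything Google has published) shows the kind of calculation involved:

```python
# Hypothetical heuristic: estimate a page's ad-to-content ratio.
# Google has not disclosed how Panda measures advertisement density;
# this only illustrates the ratio described in the article.

def ad_density(blocks):
    """blocks: list of (kind, word_count) pairs, kind being 'ad' or 'content'.
    Returns the fraction of the page's words devoted to ads."""
    total = sum(words for _, words in blocks)
    if total == 0:
        return 0.0
    ads = sum(words for kind, words in blocks if kind == "ad")
    return ads / total

# A page with two content blocks and two ad units:
page = [("content", 600), ("ad", 120), ("content", 250), ("ad", 80)]
print(ad_density(page))  # 200 / 1050, roughly one word in five is advertising
```

The takeaway matches the advice above: keep the ratio weighted firmly towards content, so strong pages are the rule rather than the exception.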
Avoiding the Panda Slap
SEO circles already have a slang term for what happens when Panda flags a site as lower quality: the “Panda slap.” Specifically, this term refers to what happens when a number of different pages on a website are deemed low-quality by the algorithm’s standards, resulting in the entire website being punished. This can reduce a site’s traffic from Google by as much as 50%. So it is now more important than ever not just to have some high-quality content for the user, but to be consistent in the quality of your work. Too many low-quality articles can hurt your entire web presence.
Google’s goal behind the Panda update is to add a slightly more “human” factor to its relevance algorithms by incorporating aspects such as the readability of the content, site bounce rates, and how reputable the site seems in general. For the most part, this has been successful: Google reports approximately 85% overlap between what Panda catches and what users block using Chrome’s site-blocking extension. The goal of Panda is to ensure that websites are user-friendly rather than simply web-crawler-friendly. Understanding this goes a long way in creating content which plays nicely with the new setup.
Panda is Not Always On
This is a big misconception out there: Panda is not something Google runs constantly to compute search rankings. It currently takes too many system resources to leave Panda fully automated. Instead, Google runs Panda over its index once every few weeks to get a “snapshot” of each site. It is rumored that Panda will reward sites which show steady improvement from snapshot to snapshot (even if the content isn’t immediately perfect), but this is not certain. Therefore, if a site is Panda-slapped, figuring out the problem with the site and correcting it will not immediately fix things. Instead, it could take up to six weeks before Google updates its results. This is important to know, as it means you will not find out in real time whether you identified the problem.
Why Did Google Do This?
Google incorporated Panda into its algorithms for one primary purpose: to prevent scraper sites from outranking the sites they scraped, and more generally to keep unhelpful content from outranking helpful content. That, at least, is the theory. As such, it is now more important than ever to write for a human audience and provide helpful, original content rather than duplicate or shallow content. Long story short: present yourself as an expert on your topic and Panda will be happy. Then you can worry about making your site web-crawler-friendly to satisfy the other elements of Google’s algorithm.