by The RevSEO Team
Recently we discussed a variety of ineffective, “black hat” SEO techniques that are all too commonly used to try to get a quick boost in page rank. While we explained that such tactics would inevitably result in Google blacklisting the site as spam, we never fully explained what Google looks for when making that decision. So in this blog, we approach the “black hat” techniques from another point of view. Ultimately, Google is the referee that determines which websites are spam, so it is good to know what the referee takes into consideration when keeping a website in, or ejecting it from, the SEO game. Here are three factors that could result in a website being marked as spam.
Your Site is Too Young
Let’s start with the most unfair way that Google may decide your site doesn’t deserve its page rank. The age of your website and the time remaining before your domain registration expires are both factored into Google’s ranking algorithm. This means that if your domain was registered for a term of a year or less, it could be hurting your page rank. Though on the surface this seems unfair and arbitrary, Google has good reason to be suspicious: many scammers use short-term websites to hit as many people as possible before vanishing from the internet. The result is that young websites often have to work extra hard to “prove” to Google that they are reputable. The good news is that this is easily avoided by registering your domain for a longer term.
Your Content is Duplicated
If your website features content that is identical or extremely similar to another website’s content, both websites will be harmed in the search rankings. This is why it is so important to make sure the content on your website is original and does not get reposted elsewhere. While it is impossible to stop other people from reposting your content, there are ways to make it harder. The easiest is to offer only partial RSS feeds, if your site uses them, rather than publishing the full text of every article to the feed. By making your web pages the only place readers can see your content in its entirety, you put a significant hurdle in the path of scrapers and reduce the likelihood of damaging duplicate content.
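If your site generates its own feed, truncating each article down to a teaser takes very little code. Here is a minimal sketch using only the Python standard library; the excerpt length, feed titles, and field names are illustrative assumptions, not anything your blogging platform prescribes.

```python
import xml.etree.ElementTree as ET

def excerpt(text, limit=200):
    """Cut article text down to a short teaser, breaking on a word boundary."""
    if len(text) <= limit:
        return text
    return text[:limit].rsplit(" ", 1)[0] + "…"

def build_partial_feed(site_url, articles):
    """Build an RSS 2.0 feed whose <description> holds only an excerpt,
    so readers (and scrapers) must visit the site for the full article."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = "Example Blog"  # illustrative
    ET.SubElement(channel, "link").text = site_url
    ET.SubElement(channel, "description").text = "Partial feed of recent posts"
    for art in articles:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = art["title"]
        ET.SubElement(item, "link").text = site_url + art["slug"]
        # Only the excerpt goes into the feed, never the full body.
        ET.SubElement(item, "description").text = excerpt(art["body"])
    return ET.tostring(rss, encoding="unicode")
```

Most blogging platforms expose the same idea as a setting (often labeled “summary” or “excerpt” feeds), so in practice you may only need to flip a switch rather than write code.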
Your Site is in a Bad Community
When building links for a website, one of the most common mistakes people make is linking to a website that Google has flagged as spam or a scam and given a page rank of zero. That page rank effectively buries the site, and sites linking to it often get penalized as well. Conversely, if a site that has been marked as spam links to your website, you can be penalized for that too. Google uses links to determine affiliations between websites, and it will often penalize those affiliations when a site is flagged as spam. The result is that you need to be careful about who you link to and who links to you, because these links are effectively endorsements. Google will see your link to a scammer’s website as an endorsement of that scammer and act accordingly. Likewise, having a scammer endorse your website hurts its credibility, and therefore its page rank.
While avoiding these three pitfalls won’t get you to the top of the search engine rankings, it will keep you from being penalized. That, in turn, lets your website’s content be judged on its own merits for relevancy, rather than being lumped in with the black hat SEO websites, scraper sites, and scam websites.