First published: March 15, 2006
Nowadays, having a website is becoming a tremendous responsibility. If you own one, you need to do at least the following things, and do them on a regular basis:
- Make sure your domain name doesn’t expire;
- Comply with the latest quality standards for graphic design, copywriting, and usability;
- Keep your code at least moderately tidy;
- Add fresh content from time to time, and update your existing content as it becomes obsolete;
- Keep a constant eye on legal issues to ensure your website doesn’t break any laws;
- Study your web statistics carefully and improve your users’ experience so that your site converts better and better;
- Track your ROI;
- Check whether anybody is stealing your web copy or other copyrighted materials, and take legal action against those who do.
That sounds like a lot of work, doesn’t it? But now I’ve got some bad news for you: you have yet another duty regarding your website, which is to keep it clean in terms of SEO (unless you are a black hat SEO and don’t care about such things).
Let’s assume you are a white hat and would like to live in peace with the engines. You never think about spamming them, you always check that other sites are as white hat as yours before linking to them, and you rest assured that you are doing everything you can to keep your white SEO hat spotless. That’s where you are wrong!
We are living in the year 2006!
Yes, we are living with today’s Internet and we have to take its realities into account. Today, you really do need to go through all of your outbound links regularly (I’d suggest at least every three months, though it may vary depending on your niche) and check all those websites again and again. Believe me, you will be amazed, astounded and sometimes shocked to see how many of them will have been banned since the last check-up, and how many of those not yet banned will simply allow themselves to get obviously dirty. A few will disappear completely and return 404s; others will change owners and themes, or turn into one of the “created-for-AdSense-only” pseudo-sites, which are swiftly becoming yet another dominant type of Internet garbage. The total number of websites you will have to de-link will probably reach 50% of your outbound links if you are in a spammy niche. (And if you used to exchange links with anyone who sent you an automated link exchange request, be prepared to remove 99.9%.)
That’s a lot of work. In one of my previous articles I described in detail the common procedure for checking sites for the most obvious spammy techniques. It is a complicated procedure that requires time and effort. If you list something like 300 websites on your resource pages, the clean-up might take a week or more. But it has to be done.
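The mechanical part of that clean-up is easy enough to script. Below is a minimal sketch, assuming your outbound links are kept in a plain text file called outbound_links.txt with one URL per line; the file name is just an illustration. A script like this only catches pages that have disappeared or stopped responding; judging whether a site that still answers has been banned or has turned spammy remains a manual job.

```python
# outbound_check.py -- flag outbound links that no longer respond cleanly.
# Assumes a hypothetical file "outbound_links.txt" with one URL per line.
# It only catches dead or unreachable pages; spotting banned or spammy
# sites among the ones that still respond needs a human review.

import urllib.request
import urllib.error

def check_url(url, timeout=10):
    """Return a short status string for a single URL."""
    request = urllib.request.Request(url, headers={"User-Agent": "link-checkup/0.1"})
    try:
        with urllib.request.urlopen(request, timeout=timeout) as response:
            return f"OK ({response.status})"
    except urllib.error.HTTPError as error:            # 404, 410, 500 and so on
        return f"HTTP error {error.code}"
    except (urllib.error.URLError, OSError) as error:  # DNS failures, timeouts
        return f"unreachable ({error})"

if __name__ == "__main__":
    with open("outbound_links.txt") as handle:
        urls = [line.strip() for line in handle if line.strip()]

    for url in urls:
        status = check_url(url)
        if not status.startswith("OK"):
            print(f"REVIEW: {url} -> {status}")
```

Even with the dead links filtered out automatically, every site that still responds needs a human eye to decide whether it has changed owners, switched themes, or slid into spam.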
Reciprocal links revisited once more
The astonishing amount of spam filling today’s Internet gives us yet another good reason not to use reciprocal linking as our main link building strategy. I still believe that there is nothing wrong with two-way links as such; it’s mostly the side effects that cause the problems.
The more links you exchange with other site owners, the more potentially bad neighbourhoods you expose yourself to. Link exchange spammers have become very good at making their automated emails look personalised; sooner or later you will fall for it, mistake their spam for the real thing, and then simply forget about it. The next time you revisit your links, expect to find a bad neighbourhood there, with everything that goes with it, like a mild and barely noticeable but still real ranking penalty, especially if you have made the same mistake several times.
The engines are now capable of keeping your whole linking history in their databases, and according to some experts, that’s exactly what they do. I don’t believe they do it for the pleasure of having a larger database; their only purpose can be to analyse that data and use it when calculating the overall authority of each and every website they know about. I also believe that every spammy neighbourhood a site has ever linked to counts against that site’s overall authority, and the longer such a link stays on the site, the more damage it is likely to cause. Even if the engines haven’t yet become smart enough to incorporate these factors into their algorithms, they will soon.
And of course, if the number of bad neighbourhoods you link to exceeds reasonable limits, the penalty will be very real and very noticeable.
If you own a general or niche-specific directory, the overall number of your outbound links is much higher than it would be if you owned an ordinary website. But directories, just like all other sites, are watched for bad neighbourhoods. That is probably one of the factors explaining why so many directories were banned from Google in recent months (though, of course, not the only one).
If you list a few thousand sites, the task of revisiting each and every one of them every few months is a tough one, even if you’ve got a team of dedicated editors willing to help you. If you are just considering launching a directory, you do need to take this into account; think twice about whether you really want to take on such a burden.
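If the listings are kept in a machine-readable list, at least the scheduling of those revisits can be automated. Here is a minimal sketch of one way to rotate through a large listing file in weekly batches so that every listing gets looked at roughly once per cycle; the file name directory_listings.txt and the twelve-week cycle are illustrative assumptions, not a prescription.

```python
# recheck_rotation.py -- split a large directory's listings into weekly batches
# so that every listing is revisited roughly once per cycle.
# The file name "directory_listings.txt" and the 12-week cycle are assumptions.

CYCLE_WEEKS = 12  # roughly a three-month revisit cycle

def batch_for_week(urls, week_number, cycle_weeks=CYCLE_WEEKS):
    """Return the slice of listings due for a check in the given week."""
    batch_size = -(-len(urls) // cycle_weeks)  # ceiling division
    start = (week_number % cycle_weeks) * batch_size
    return urls[start:start + batch_size]

if __name__ == "__main__":
    with open("directory_listings.txt") as handle:
        listings = [line.strip() for line in handle if line.strip()]

    # Week 0 of the cycle; in practice the week number would come from the date.
    due_this_week = batch_for_week(listings, week_number=0)
    print(f"{len(due_this_week)} of {len(listings)} listings are due for review this week")
```

Spreading the work this way does not make it smaller, but it keeps any single week's batch manageable for a small editorial team.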
Whether we like it or not, the Net has become overrun with spammy sites. In spite of all the attempts to clean it up, they won’t disappear in the near future. We need to stay away from such sites, and not just for fear of being penalised for linking to them, but also because it’s no good to give them any sort of credibility. That’s another reason to check all your outbound links on a regular basis, unless you know the people behind those sites really well and trust them as much as you trust yourself.
Keeping your White Hat clean amid all the filth is a tough task, but it is well worth the effort.