Spam Whackers

Exposing Various Types of Spam – Offering SEO & Webmaster Tips

July 18, 2008


Filed under: Htaccess,SEO General,SEO Tools — Connie @ 4:22 pm

A .htaccess file is an important tool for webmasters. The file is useful in regard to search engines, but it is also useful in protecting your website. A .htaccess file can only be used on an Apache webserver. If you're hosted on a Windows server, you will not be able to use a .htaccess file. A lot of what I can do routinely with .htaccess can be done on a Windows server, but it will probably have to be done by the server administrator.
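One of the most common search-engine-related uses of .htaccess is consolidating duplicate hostnames. A minimal sketch, assuming your domain is example.com (substitute your own) and mod_rewrite is available:

```apache
# 301-redirect the bare domain to the www hostname so search
# engines index only one version of each page (example.com is
# a placeholder domain).
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```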


August 28, 2007

Opt-in or Blacklist?

Filed under: Bad Bots,Htaccess,Webmaster Resources — Connie @ 4:50 pm

What do I mean by opt-in? In short, rather than always adding user agents or IP addresses to your .htaccess file to blacklist them, only allow certain bots and browsers to access your site. So rather than using a blacklist method to prevent unwanted bots, you just whitelist certain bots. All others will get a 403 error page.
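A minimal whitelist sketch of this approach (the bot names are just illustrations; "Mozilla" matches most ordinary browsers, and bear in mind that user agents can be faked):

```apache
# Mark known-good user agents, deny everyone else.
SetEnvIfNoCase User-Agent "Googlebot" goodbot
SetEnvIfNoCase User-Agent "Slurp" goodbot
SetEnvIfNoCase User-Agent "Mozilla" goodbot
Order deny,allow
Deny from all
Allow from env=goodbot
```

With Order deny,allow, the Allow directive is evaluated last and wins, so only requests that set the goodbot variable get through; everything else receives a 403.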


May 13, 2007

Take Care Using .htaccess

Filed under: Htaccess,SEO General — Connie @ 11:13 am

The .htaccess file gives webmasters a lot of control over their websites, especially if you're on a shared server. I don't have any proof, but I suspect the majority of websites on the WWW are on shared servers.

A simple error in the file can cause you problems. A couple of weeks ago I uploaded a new .htaccess to Condells and shut the site down for a day. I didn't check the site after uploading it. All traffic for about a 30-hour period got a 500 internal server error. (more…)
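A rollback habit that would have avoided this: keep a known-good copy of the file so a bad upload can be reverted in seconds. A sketch with hypothetical file contents and names:

```shell
# Stand-in for the current, working .htaccess (hypothetical content).
printf 'Options -Indexes\n' > .htaccess
# Snapshot it before making any change.
cp .htaccess .htaccess.good
# A broken upload like this one triggers 500 errors site-wide...
printf 'NotARealDirective\n' > .htaccess
# ...so check the site immediately after uploading, and roll back:
cp .htaccess.good .htaccess
cat .htaccess
```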

May 9, 2007

.htaccess is a Blessing

Filed under: Htaccess,SEO General — Connie @ 6:02 pm

As most experienced webmasters know, the .htaccess file is a valuable tool for their site. Though my first webpages were published in Dec 1999, I had never heard of a .htaccess file until I read something about it at IHY, some time after I registered there in 2004. (more…)

August 13, 2006

Blocking Bad Bots

Filed under: Bad Bots,Htaccess,Robots-Spiders — Connie @ 2:09 pm

Previously I provided a definition of bad bots. Now I want to provide you with some ways to block them from your website. You can block by user agent, IP address, or domain name.

Any of these methods will require the use of a .htaccess file. You will also need access to your actual log files.
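Before blocking anything, the log files tell you which user agents are actually hitting your site. A quick sketch, assuming the standard Apache "combined" log format, where the user agent is the sixth double-quote-delimited field (the two log lines below are made-up examples using documentation IP addresses):

```shell
# Create a tiny sample access log in combined format (example entries).
cat > access.log <<'EOF'
192.0.2.1 - - [13/Aug/2006:10:00:00 -0500] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0"
192.0.2.2 - - [13/Aug/2006:10:00:05 -0500] "GET / HTTP/1.1" 200 512 "-" "8484 Boston Project v 1.0"
EOF
# Count requests per user agent: split each line on double quotes,
# print the 6th field (the user agent), then tally and rank.
awk -F'"' '{print $6}' access.log | sort | uniq -c | sort -rn
```

Run against your real access log, the same pipeline quickly surfaces agents like the "8484 Boston Project" that have no business on your site.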

Blocking by user agent

SetEnvIfNoCase User-Agent "8484_Boston_Project" spammer
Order allow,deny
Allow from all
Deny from env=spammer

You can change spammer to any environment-variable name you want.

Blocking by IP address

Order allow,deny
Allow from all
# Replace 192.0.2.1 with the address you want to block (example address)
Deny from 192.0.2.1

Blocking by domain name

This method uses mod_rewrite in addition to the normal .htaccess rules, so mod_rewrite must be enabled on your server.

RewriteEngine On
RewriteCond %{HTTP_REFERER} badsite\.com [NC]
RewriteRule .* - [F]

Or this for multiple domains:

RewriteEngine On
RewriteCond %{HTTP_REFERER} badsite\.com [NC,OR]
RewriteCond %{HTTP_REFERER} anotherbadsite\.com [NC]
RewriteRule .* - [F]

If you want to learn more about who the bad bots are, one of the best resources is Incredibill's blog. Bill provides up-to-date information about bots that every site owner should be concerned about.
