Blocked User Agents

I just added a page that lists over 400 user agents that many webmasters block for one reason or another. Not every user agent listed is necessarily bad; some belong to small but legitimate search engines. I think the bulk, though, belong to scrapers and email harvesters.

I have explained on the page why the different user agents are included. As time allows, I will research each user agent and update the page.

You are the only one who can determine whether you should block any given user agent.

If you have comments on the list, you can make them here.

You can view the list of user agents here. 
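
As a rough illustration of how such a list might be applied server-side, here is a minimal Python sketch that checks an incoming User-Agent string against a blocklist. The three patterns shown are just a small hypothetical sample standing in for the 400+ entries on the page.

```python
import re

# Hypothetical sample of blocked user agents; the real list
# on the page has over 400 entries.
BLOCKED_PATTERNS = [
    r"EmailSiphon",  # email harvester
    r"WebZIP",       # offline downloader / scraper
    r"larbin",       # generic crawler often abused
]

# One combined, case-insensitive regex is cheaper than looping
# over hundreds of patterns on every request.
_blocked_re = re.compile("|".join(BLOCKED_PATTERNS), re.IGNORECASE)

def is_blocked(user_agent: str) -> bool:
    """Return True if the User-Agent string matches any blocked pattern."""
    return bool(_blocked_re.search(user_agent or ""))

print(is_blocked("Mozilla/5.0 (compatible; WebZIP/7.0)"))        # True
print(is_blocked("Mozilla/5.0 (Windows NT 10.0) Firefox/115.0"))  # False
```

In practice most webmasters do this with their web server's own configuration (e.g. rewrite or access rules) rather than application code, but the matching logic is the same.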

4 Responses to “Blocked User Agents”

  1. As a webmaster, you definitely should use user-agent headers to manage server traffic. But understand that this is purely a pragmatic tactic, not a serious security measure.

    I wrote more about this here:

    Webmaster Tips: Blocking Selected User-Agents
    http://faseidl.com/public/item/213126

  2. Connie says:

    It all depends on the user agent as to how great the security risk is.

    It might depend on your definition of security risk. Mine includes bots that scrape my content and post it on some other website.

  3. >>It might depend on your definition of security risk.<<
    I see what you mean, but what I was really getting at is that it is trivially easy to forge a user-agent header. So, if someone really wants to exploit a security hole (regardless of how that might be defined), you cannot prevent the exploit simply by examining the user-agent string. You may reduce some unwanted traffic by looking at user-agent strings (and, in practice, you should), but if you really want to *completely* plug the security hole, something more is needed.
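
To see just how trivial forging is, here is a sketch using Python's standard urllib: one header override makes any client claim to be any browser. The URL is only a placeholder, and no request is actually sent.

```python
import urllib.request

# A forged User-Agent: any client can claim to be any browser or bot,
# which is why user-agent filtering manages traffic but is not security.
req = urllib.request.Request(
    "http://example.com/",  # placeholder URL; the request is never sent
    headers={"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"},
)

# urllib stores header keys capitalized, so the lookup key is "User-agent".
print(req.get_header("User-agent"))
```

A server examining this request's User-Agent header would see an ordinary browser string, no matter what software actually sent it.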

  4. Great post. It all depends on the user agent as to how great the security risk is. Thanks.