A recent thread at IHY discusses directory submission software. Personally, I think all automated link submission tools are a waste of time for the end user, and will only benefit the developer in one way or another.
In this case the developer of the software also runs a link-focused directory. You can read about the automatic directory submission software (link condom applied), or the link directory (link condom applied).
I think it is a sad day when crap like this directory and its auto-submit software can flourish, even for a short period of time.
Somewhat like a shooting star. You watch it go across the sky. Suddenly it becomes even brighter, and then it is gone.
I’m waiting for the day this kind of crap is gone. It probably won’t happen in my lifetime, but I can always dream. 😀
I’m not a professional SEO. After seeing these figures I may change my mind :D.
I assume that anyone reading this will understand the function of a bot, robot, or spider.
If you don’t, my simple explanation is that they are programs designed to crawl a web page and extract information from it.
For instance, Google uses a bot, or spider, to gather information for indexing in its SE. So do the other SEs.
Every webmaster wants their site crawled by the robots (spiders) from the major SEs.
However, there are some bots or spiders that you will probably want to exclude from your site for various reasons.
I will call these badbots, although not all are necessarily bad.
Some may just be bots that I don’t want crawling my site for personal reasons.
What is a bad bot?
- A bot that does not follow the robots.txt protocol
- A bot that is scraping content from your pages to display elsewhere
- A bot that is harvesting e-mail addresses. E-mail addresses published on web pages are where a lot of the spam e-mail we receive comes from
- A bot from a country (or SE) that will not benefit my site, or one that just eats up a lot of bandwidth.
How do you deal with a badbot?
There are several ways and I will post more about that next time.
Here are a few good resources regarding bad bots.
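One simple approach, which I will expand on next time, is to scan your raw access log for clients whose user agent matches a blacklist, or that crawl pages without ever requesting robots.txt. Here is a minimal sketch in Python — the log entries, the blacklist, and the function name are all invented examples of mine, not taken from any particular tool:

```python
# Toy access-log entries: (client IP, requested path, user agent).
LOG = [
    ("66.249.64.1", "/robots.txt", "Googlebot/2.1"),
    ("66.249.64.1", "/index.html", "Googlebot/2.1"),
    ("203.0.113.9", "/index.html", "EmailSiphon"),
    ("203.0.113.9", "/contact.html", "EmailSiphon"),
]

# User-agent substrings I consider badbots (e-mail harvesters etc.).
BLACKLIST = ["EmailSiphon", "EmailWolf"]

def find_badbots(log, blacklist):
    """Return the set of IPs whose user agent matches the blacklist,
    or that crawled pages without ever fetching robots.txt."""
    fetched_robots = {ip for ip, path, ua in log if path == "/robots.txt"}
    bad = set()
    for ip, path, ua in log:
        if any(b in ua for b in blacklist) or ip not in fetched_robots:
            bad.add(ip)
    return bad

print(find_badbots(LOG, BLACKLIST))  # the e-mail harvester's IP
```

In a real setup you would parse the log file your server writes, but the idea is the same: a well-behaved spider identifies itself and checks robots.txt first; a badbot usually does neither.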
Shawn at DigitalPoint, who initially developed a link program to game the SEs, has now added the ability to apply the new Google nofollow attribute to those links.
Why has he done this? Probably because many users over the last few months have reported problems with their Google rankings. Sites using the Coop link scheme seem to be dropping like flies.
I think this post pretty well sums up why people are using the DP link Coop.
Yes, I understand why – but why should I bother to participate now? It’s not like I get a ton of traffic – the SEO benefit was the main purpose for nearly everyone, and that would be gone with nofollow everywhere.
The only reason I can see to exchange links with another site is to help your site visitors.
Personally I don’t exchange links, but that does not mean I think all link exchanges are bad.
On the other hand, if you sign up for a program that will automatically put 100s or 1000s of links on your site, you will probably get busted at some point.
From Google’s Quality Guidelines – basic principles:
Don’t participate in link schemes designed to increase your site’s ranking or PageRank. In particular, avoid links to web spammers or “bad neighborhoods” on the web, as your own ranking may be affected adversely by those links.
Why else would a webmaster participate in a scheme like this? Traffic, if any, will be small and will not provide any ROI.
If your site is listed in the ODP (aka DMOZ), Google often uses the ODP description as the snippet it displays in the SERPS. Recently I found out that MSN has been doing the same thing.
Webmasters have always been unhappy about this for several reasons – some legitimate, and some possibly not.
Back in May MSN addressed this problem with a new meta tag. You can read more about it here.
Today Google’s Vanessa Fox announced that Google was going to allow webmasters some control, in this post on the Google Sitemaps Blog.
For any spider or robot that supports this tag you can use <meta name="ROBOTS" content="NOODP"> to cover all robots. If you only want to control one, then for the moment you can use <meta name="msnbot" content="NOODP"> for MSN or <meta name="GOOGLEBOT" content="NOODP"> for Google.
Since Yahoo has their own directory I doubt they will ever be showing ODP descriptions in their SERPS.
Here is a site that seems to have an innovative link scheme: www.link-back.net/page-rank. Like Link Vault and the DP Coop, this one is surely going to go down the tubes at some point.
A few excerpts from their site:
increase your back-link numbers and therefore your website’s page rank by:
1. Back-links, page rank and keywords analysis and
2. Back-link automatic rotation program & Google-Bot detection
Get 1000 free back-links today. All this is just a click away.
Step 1: Submit a site to our web directory
Step 2: Insert one of the service codes in your site… and it’s done, no future maintenance is required, all systems of back-link are automatically self changed and maintained
I don’t see how this can be anything other than another link spam scheme designed to artificially inflate a site’s ranking.
It is clearly in violation of Google’s Guidelines.
Don’t participate in link schemes designed to increase your site’s ranking or PageRank
I would advise anyone serious about a long-term plan for success to avoid this and all other schemes that try to artificially help your site rank better.
One thing that comes up occasionally in forums is: “I’m changing hosts. Will that hurt me with the search engines?”
The simple answer is no.
Changing domain names is an entirely different issue. I’m talking about moving an existing domain from one host to another.
I always advise people who are changing hosts to keep the old site alive for 30 days. When you change hosts you will need to change the name servers from the old host to the new host. That change can take anywhere from a few minutes to a few days to propagate around the world wide web.
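If you want to see whether the change has reached your own resolver yet, a quick lookup will tell you which IP your domain currently points to from where you sit. Here is a minimal sketch using Python’s standard library (the domain is a placeholder — use your own):

```python
import socket

def current_ip(domain):
    """Resolve a domain to an IP address via the local resolver.

    While a name server change is still propagating, different
    resolvers around the world may hand back the old host's IP,
    so the answer you get here can differ from what a visitor
    elsewhere sees.
    """
    try:
        return socket.gethostbyname(domain)
    except socket.gaierror:
        return None  # the name does not resolve (yet) from here

# Compare the result against the IP address your new host assigned you.
print(current_ip("www.example.com"))
```

Once this returns the new host’s IP consistently (and your visitors stop showing up in the old host’s logs), the old account can safely be closed.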
When you change hosts there are a couple of problems you may encounter. Those problems are not Search Engine related.
1: E-mail. You may be receiving e-mail at two different IPs (or hosts). You should be able to get around this by setting up some forwards from the old host to the new host.
2: Database-driven sites. Other than this forum I do not use a database-driven site, but I think in this case setting up a 301 redirect would solve that problem.
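For what it’s worth, a 301 is nothing exotic: the old server simply answers every request with a “Moved Permanently” status pointing at the new location. In practice you would set this up in the web server’s own configuration, not in code; purely as an illustration of what the response looks like, here is a sketch in Python (the host name is a placeholder):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

NEW_BASE = "http://www.example.com"  # placeholder for the new host

class RedirectHandler(BaseHTTPRequestHandler):
    """Answer every GET with a 301 pointing at the same path on the new host."""

    def do_GET(self):
        self.send_response(301)  # Moved Permanently
        self.send_header("Location", NEW_BASE + self.path)
        self.end_headers()

    def log_message(self, *args):
        pass  # keep the sketch quiet

# To run this on the old host:
# HTTPServer(("", 8080), RedirectHandler).serve_forever()
```

Browsers and well-behaved spiders alike follow the Location header, so during the overlap period any request that lands on the old host is handed off to the new one.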
Over a year ago Kinderstart initiated a lawsuit against Google. A thread started yesterday at TW, Google Lawsuit names ‘Google Sandbox’ by Kinderstart, points out that Kinderstart is using the “sandbox theory” as part of its lawsuit against Google.
If you’re interested you can download a PDF copy of the suit here. There are 6 references to the “sandbox” in the 26 pages of the document.
I refer to the sandbox as a theory because no one knows for sure whether it exists or not. The theory tries to explain why new sites do not seem to rank for competitive keywords for many months. Many people believe in a sandbox; others do not.
Google has always been silent on the subject of the sandbox as far as I know. Nothing new there. Google remains silent about a lot of things.
Around the middle of May Mike Grehan interviewed Matt Cutts. You can read the interview here Google’s Matt Cutts: The Big Interview. Scroll down to the middle of the page and you can see what Matt says in regard to the sandbox. Nothing that would definitively confirm or deny the existence of a sandbox that I can see. The usual Google spiel on subjects they do not want to discuss.
My guess is that Kinderstart is just wasting their money in the lawsuit.
Are automated link exchange programs a valid way to get incoming links to your website? For some background information you might want to read my comments about Links.
I guess I need to clarify what I mean by automated link exchange programs. You sign up for the program, put some code on your web page, and bingo, you have thousands of sites linking to you. Recently it seems that Link Vault has been hit hard by Google. My understanding, based on blog and forum posts, is that the participants have also been hit.
It also appears that those participating in the DP (DigitalPoint) scheme are getting hit. There is no telling how many of these programs exist, but I have one to add: linksmaster.com (link condom applied). Doing a search on Google for site:www.linksmaster shows 1 page out of 212 total. Seems to me something is going on here.
It’s interesting that Linksmaster says they are
a unique link exchange service unlike any other on the Internet
What a lie this is!!!!
Fully compliant with all the major search engine guidelines and standards.
The cost for this one is only $20.00 per month.
I would really like to know what is different about them. A link exchange scheme is a link exchange scheme regardless of how you try to package and market it. At least DP does not charge you for their scheme that will probably get you banned.
When will webmasters learn that these link schemes work short term at best, and in the end will probably hurt the sites that participate as well as the site that initiated the scheme?
My thanks to Deb (aka savvy1) for posting about Linksmaster in the staff area of IHY Forums.