Bob Massa, the owner of Searchking, is on the move again, based on this thread at SEW. I really have to wonder why this self-promotional post was allowed at SEW in light of Bob's (Searchking's) past history.
But then, Danny Sullivan seems to allow a lot of stuff as long as it's not hard-core white hat.
A few years back, Searchking started selling PR. Google banned the site and all sites participating in the scheme. Google also banned some sites that were simply being hosted by Searchking. (By banned, I mean loss of PR and rankings.)
Bob Massa, the owner of Searchking, took Google to court but lost. Now he has another marketing scheme, which apparently is OK to promote at SEW regardless of who may get hurt in the process. I have to ask: after your previous marketing fiasco, why should anyone trust you now?
For those who want the background here are a few links.
Now I have no doubt that Bob is a master marketer. In fact, he used to sell Search Engine submission, if he doesn't still. If I remember correctly, Bob stated somewhere that it did no good, but if people wanted it, he was going to sell it.
Needless to say, Doug Heil (aka ihelpyou) is not happy with the thread at SEW. He has posted in regard to this at http://www.ihelpyou.com/forums.
Bob and/or Danny, I'm certainly willing for you to post your viewpoint.
A question that often comes up in forums is: what directories should I submit to? Or: should I only submit to directories with a certain (toolbar) PR? Does a directory with low PR help?
Obviously the person who asked the question was only interested in PR. Personally, PR is the least of my considerations when I occasionally submit to a directory. If the home page is a PR 0 or grayed out, then I may do a little more investigation.
If the home page has a PR 6, that means nothing in regard to your individual listing. What is the PR of the page your site will be listed on? In most cases it will be a 4 or less.
Besides that, the PR shown in the toolbar has little to do with how a site ranks in the search engines. True PR, which only the search engines know, may have some effect.
Here are a few things I consider when looking at a directory. I look at far more directories than I ever submit to.
- Does the directory appear to be built for users, webmasters, or to display Adsense?
- Do they require a link exchange? I will not link back to any directory.
- Is their link a straight a href or does it redirect?
- Would the other sites listed in the category my site would be in be considered quality sites?
- Do I think this directory will ever provide my site with some quality traffic?
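The "straight a href" test in the list above can be partially automated. Many directories link out through a redirect or click-counter script instead of pointing straight at your domain. Here is a rough heuristic sketch in Python; the URLs and the `is_direct_link` helper are my own illustration, not anything from a real directory:

```python
from urllib.parse import urlparse, parse_qs

def is_direct_link(href: str, my_domain: str) -> bool:
    """Return True if href points straight at my_domain rather than
    passing through a redirect/counter script."""
    parsed = urlparse(href)
    if parsed.netloc.endswith(my_domain):
        return True
    # Redirect scripts often carry the real destination in a query parameter.
    for values in parse_qs(parsed.query).values():
        for value in values:
            if my_domain in value:
                return False  # links to us, but only via a redirect script
    return False

# A straight link vs. a typical redirect-script link (made-up examples)
print(is_direct_link("http://www.example.com/page.html", "example.com"))                   # True
print(is_direct_link("http://somedir.com/go.php?url=http://example.com/", "example.com"))  # False
```

This only catches the common query-string pattern; path-based redirects (like `/go/123/`) would still need a header check.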
A good friend of mine, Quaderiall (a SuperMod at IHY), has recently written an article that I recommend: What Is A "Quality" Directory? Quaderiall also maintains a list of directories that he recommends.
Comments Off on Directory Submission
Yesterday Matt Cutts posted a summary of his thoughts about a few things that are going on at Google right now. The post contains a lot of links to other things he has discussed over the last few months.
The major subheadings are:
- Bigdaddy: Done by March
- Refreshing supplemental results
- Reading current feedback
- Closing thoughts
If you read his post and all the comments, you should gain some insight into what is happening with Google right now. I think his post and comments are a must-read for anyone interested in SEO. This post has drawn more comments than usual. I highly recommend that you read the Indexing Timeline.
Comments Off on Matt Cutts Update
Does W3C-validated code help you in ranking? Personally, I don't think so. I have friends who will disagree, and that is OK with me.
I spent some time today looking at sites on the first page of results that relate to my industry.
Most of them do not even have a doctype specified. Nearly all of them have gigantic CSS and JS blocks on the page rather than in an external file. Linking to an external CSS or JS file helps to keep page bloat down.
The sites that had a doctype did not validate. My conclusion is that code validation has nothing to do with ranking, at least in my industry.
So you know: I believe every website should have compliant code that will validate to W3C standards. I just do not believe that compliant code will help a site's ranking.
I welcome examples where you can prove that validated code increased rankings of any site.
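A first pass at the "no doctype specified" check above can be scripted. This is only a sketch: it tests whether a doctype declaration is present at all, not whether the page actually validates (full validation would require running the page through the W3C validator itself):

```python
import re

def has_doctype(html: str) -> bool:
    """True if the document begins with a DOCTYPE declaration,
    ignoring leading whitespace and HTML comments."""
    stripped = re.sub(r"^\s*(<!--.*?-->\s*)*", "", html, flags=re.DOTALL)
    return stripped.lower().startswith("<!doctype")

print(has_doctype("<!DOCTYPE html><html></html>"))  # True
print(has_doctype("<html><head></head></html>"))    # False
```

Running this against the source of each first-page result would quickly reproduce the survey described above.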
A few reasons I think good code is important:
1. It helps ensure your site renders the same in all browsers.
2. Extremely bad code could hang a spider up.
3. Good code will help a spider crawl your site.
4. Bringing a site up to W3C standards is not that difficult. Why not have a compliant site, whether it helps you with the SEs or not?
Yesterday Matt Cutts posted a summary of Google Press Day 2006 on his blog. The bigwigs at Google talked to the press about what is happening at Google. A lot of areas were covered. Comments by Alan Eustace, the Senior VP of Engineering, were of particular interest to me. Of course, no big Google secrets were revealed.
On crawling, Alan notes the need to crawl, index, and then score relevant results. "Speed matters." With 8 billion pages, it would take 253 years (I think I got that right) to fetch pages at one page per second. "It's important that we gently crawl the web at very high speeds." Alan talks about duplicate pages, which can make up 30-50% of pages with a naive approach.
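Alan's "253 years" figure checks out, as a quick sanity check shows. That is the whole reason Google has to fetch many pages in parallel:

```python
pages = 8_000_000_000                      # 8 billion pages in the index
seconds_per_year = 365.25 * 24 * 60 * 60   # ~31.6 million seconds

years = pages / seconds_per_year           # at one page per second
print(int(years))  # 253
```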
In regard to indexing, he works through a simple example with posting lists. Queries with two words intersect those posting lists. So if "heart" is on pages 5, 9, and 25, and "attack" is on pages 7, 9, and 22, then the intersection is page 9.
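The posting-list intersection Alan sketches is easy to reproduce. For sorted lists, the standard approach is a two-pointer walk; with his example lists for "heart" and "attack", the only common document is 9:

```python
def intersect(a, b):
    """Intersect two sorted posting lists with a two-pointer walk,
    in O(len(a) + len(b)) time."""
    i = j = 0
    out = []
    while i < len(a) and j < len(b):
        if a[i] == b[j]:
            out.append(a[i])
            i += 1
            j += 1
        elif a[i] < b[j]:
            i += 1
        else:
            j += 1
    return out

heart = [5, 9, 25]   # pages containing "heart"
attack = [7, 9, 22]  # pages containing "attack"
print(intersect(heart, attack))  # [9]
```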
Once you’ve intersected posting lists, you have a smaller set of documents to score. Alan mentions anchors and PageRank; lots of pages (and important pages) link to Stanford, so it’s fair to consider them an authority. Then Alan points out the anchortext “Knight fellows” on the Stanford home page and mentions that anchortext can be handy.
You can read the complete summary, including a long question-and-answer session, on Matt's blog.
Comments Off on Google Press Day
What is going on with this company (weindexed.com)? A moderator at the IHY forums started a thread about weindexed.com. No doubt they are spamming, IMHO, but what they are actually doing causes me to ask some questions that I don't know the answer to at this time. I have asked those questions in the thread, but possibly some of you who read the blog may have some additional comments.
It appears to me they are doing two 302 redirects but the user who clicks on a link does end up at the correct site.
Checking the header for http://www.weindexed.com/go/195/8529/90345/
HTTP/1.1 302 Moved
Date: Tue, 02 May 2006 13:19:37 GMT
you get this: http://www.weindexed.com/php/site.php?id=90345&k=8529&c=195. Checking that header in turn, you get:
HTTP/1.1 302 Found
Date: Tue, 02 May 2006 13:21:11 GMT
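The two-hop 302 chain shown above can be traced with a small helper. This is a sketch: the `fetch` callback stands in for a real HTTP HEAD request (which you could make with `http.client` or `urllib.request`), and since the live responses may change, the stub below just replays the headers observed above. The final destination URL is a made-up placeholder, as the source doesn't name the actual target site:

```python
def follow_redirects(fetch, url, max_hops=10):
    """Follow a chain of 3xx redirects and return the list of URLs visited.
    `fetch` takes a URL and returns (status_code, location_or_None)."""
    chain = [url]
    for _ in range(max_hops):
        status, location = fetch(url)
        if status not in (301, 302, 303, 307, 308) or location is None:
            break
        url = location
        chain.append(url)
    return chain

# Stub replaying the two 302 responses observed above.
responses = {
    "http://www.weindexed.com/go/195/8529/90345/":
        (302, "http://www.weindexed.com/php/site.php?id=90345&k=8529&c=195"),
    "http://www.weindexed.com/php/site.php?id=90345&k=8529&c=195":
        (302, "http://www.the-target-site.example/"),  # hypothetical final destination
    "http://www.the-target-site.example/":
        (200, None),
}

chain = follow_redirects(responses.get, "http://www.weindexed.com/go/195/8529/90345/")
print(len(chain) - 1, "redirects")  # 2 redirects
```

Swapping in a real network `fetch` would let you confirm what spiders see at each hop versus what a browser ends up on.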
Here is Google's cached page for http://www.weindexed.com/go/195/8529/90345/.
It appears to me the site is both scraping content from other sites and cloaking, so SEs get one version but visitors get another (the actual site).
My assumption may be wrong, but I have to ask what is the benefit to weindexed.com?