Google Hell?

Andy Greenberg wrote an article for Forbes entitled “Condemned To Google Hell” about supplemental results. I was getting ready to go on vacation, so I didn’t have a chance to talk to Andy, and now I wish that I had. It’s easy to read the article and come away with the impression that Google’s supplemental results are some sort of search engine dungeon where bad pages go and sit in limbo forever, and that’s just not true.

I did some quick searching, and this post from January includes a pretty good rebuttal of the “you get into supplemental results for spamming or duplicate content, and then your pages stay there for a long time” idea. I’ll quote the most relevant paragraph:

As a reminder, supplemental results aren’t something to be afraid of; I’ve got pages from my site in the supplemental results, for example. A complete software rewrite of the infrastructure for supplemental results launched in Summer o’ 2005, and the supplemental results continue to get fresher. Having urls in the supplemental results doesn’t mean that you have some sort of penalty at all; the main determinant of whether a url is in our main web index or in the supplemental index is PageRank. If you used to have pages in our main web index and now they’re in the supplemental results, a good hypothesis is that we might not be counting links to your pages with the same weight as we have in the past. The approach I’d recommend in that case is to use solid white-hat SEO to get high-quality links (e.g. editorially given by other sites on the basis of merit).

That statement still holds. It’s perfectly normal for a website to have pages in our main web index and our supplemental index. If a page doesn’t have enough PageRank to be included in our main web index, the supplemental results represent an additional chance for users to find that page, as opposed to Google not indexing the page.
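
Since the main determinant described above is PageRank, here is a deliberately toy sketch of that idea: a tiny power-iteration PageRank over a made-up internal link graph, followed by an invented cutoff that splits pages into a "main" and a "supplemental" bucket. The graph, damping factor, and threshold are all hypothetical; Google's real indexing logic is not public.

```python
# Toy sketch only: tiny PageRank power iteration, then a made-up cutoff.
# The graph, damping factor, and threshold are hypothetical illustrations.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links out to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:  # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
        rank = new_rank
    return rank

# A deep product page with only one internal link pointing at it tends to
# end up with the least PageRank.
links = {
    "home":     ["category", "about"],
    "category": ["product", "home"],
    "about":    ["home"],
    "product":  ["home"],
}
ranks = pagerank(links)

CUTOFF = 0.2  # hypothetical threshold, purely for illustration
main_index = [p for p, r in ranks.items() if r >= CUTOFF]
supplemental = [p for p, r in ranks.items() if r < CUTOFF]
```

With this made-up graph, the deep "product" page collects the smallest share of PageRank and falls below the cutoff, while the better-linked pages stay in the main bucket. That mirrors the general point: being supplemental reflects low PageRank, not a penalty.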

Okay, so that’s the general advice I’d highlight. It can also be the case that links that used to carry more weight for a website might not be counting as much. Let’s see if we can find an example of that in the article. Here’s a quote:

MySolitaire.com, another online diamond business, spent January to June of 2006 in the supplemental index. Amit Jhalani, the site’s vice president of search marketing, says he figures that cost his business $250,000 in sales, and he says he still doesn’t know why the site’s pages got Google’s thumbs-down.

“So many of the rules are vague,” Jhalani says. But he admits that he tried gray-area tactics like buying links from more established sites to juice his traffic.

Okay, so the VP of SEM for this site mentions that they tried buying links; maybe those links started to count for less. I decided to check into mysolitaire.com and see if I could find any other links that might have started counting for less. I did find a spam report where someone forwarded an email that appeared to be from mysolitaire.com:

>From: “MySolitaire Jewelry” <xxxx@mysolitaire.com>
>To: <xxxxxx @xxxxxxxxxx.org>
>Subject: Link Exchange Request from MySolitaire
>Date: Wed, 4 Jan 2006 11:59:47 -0500
>
>Dear Sir / Madam
>
> I am writing to see if you would be interested
>in setting up reciprocal links.
>
> We offer a unique and extensive line of
>Diamonds and Jewelry. Thousands of brides,
>grooms and other customers view our site every
>month as they plan their weddings, engagements
>and gifting ideas. Our Diamonds and Jewelry
>Links page is a direct link off our home page.
>Currently, we have less than 50 outgoing links
>in each category.
>
> The information and services offered on your
>site would be of great value to our visitors,
>and I believe, your visitors would find great
>value in our site. Because our two web sites are
>complementary rather than competitive, we see
>the synergy here as an opportunity for our
>mutual benefit.
>
> To exchange links with us either enter the code below onto your site:
>
> <a href=’http://www.mysolitaire.com/diamondearrings/’ target=_blank><b>
> Diamond Earrings, Diamond Stud Earrings,
>Diamond Studs, Diamond Hoops, Diamond
> Chandelier Earrings, Diamond Threader Earrings</b><br /> Diamond Earrings,
> Diamond Stud Earrings, Diamond Studs, Diamond Hoops, Diamond Chandelier
> Earrings, Diamond Threader Earrings at a great value. Only available at
> MySolitaire.com
>
> ….
>
> Mxxxx Fxxxxxxxxx
> MySolitaire
> http://www.mysolitaire.com/
> info@mysolitaire.com
> 62 West, 47th St #1409
> New York, NY 10036
> Phone: 866-697-6548 (866-MySolitaire)
> Fax: 212-840-5909

A quick Google search finds similar emails that were sent to mailing lists. Reciprocal links by themselves aren’t automatically bad, but we’ve communicated before that there is such a thing as excessive reciprocal linking. Note that the email above doesn’t say that they have less than 50 outgoing links; nope, it says “Currently, we have less than 50 outgoing links in each category.” I checked out http://www.mysolitaire.com/resources/ and by my count saw 329 different categories offered for link exchanging:

Examples of link exchange categories

I know a lot about SEO, so I decided to check out the “Search Engine Optimization” category. This was the first entry I saw:

Example link exchange entry

Now, I didn’t click through to check out that site; it could be the best SEO site in the world. But the entry doesn’t give users a great experience; heck, it’s not even a complete sentence. And it doesn’t look especially relevant or useful for a diamond ring site to exchange links like this across potentially up to 329 different categories. As Google changes algorithms over time, excessive reciprocal links will probably carry less weight. That could also account for a site having more pages in supplemental results if excessive reciprocal links (or other link-building techniques) begin to be counted less. As I said in January: “The approach I’d recommend in that case is to use solid white-hat SEO to get high-quality links (e.g. editorially given by other sites on the basis of merit).”
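
To make the "excessive reciprocal linking" idea concrete, here is a deliberately simple sketch of one way a crawler could flag it. This is purely illustrative: the site names and both thresholds are invented, and nothing here reflects how Google actually weights links.

```python
# Illustrative only: flag sites whose outbound links are mostly
# reciprocated. Thresholds and site names are invented; this is not
# Google's actual method.

def reciprocated(links, site):
    """Return the set of `site`'s outbound links that link straight back."""
    outbound = links.get(site, set())
    return {other for other in outbound if site in links.get(other, set())}

# Hypothetical directed link graph between sites.
links = {
    "diamonds.example": {"seo.example", "poker.example", "florist.example"},
    "seo.example":      {"diamonds.example"},
    "poker.example":    {"diamonds.example"},
    "florist.example":  set(),
}

MIN_RECIPROCALS = 2    # made-up floor so one-off mutual links aren't flagged
RATIO_THRESHOLD = 0.5  # made-up share; "excessive" is deliberately undefined

flagged = []
for site, outbound in links.items():
    mutual = reciprocated(links, site)
    if len(mutual) >= MIN_RECIPROCALS and len(mutual) / len(outbound) > RATIO_THRESHOLD:
        flagged.append(site)
```

In this toy graph, only the site trading links in bulk with off-topic partners trips both made-up thresholds; a single genuine mutual link between two related sites does not.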

I thought Andy Beal had an interesting take on the Forbes piece as well.

266 Responses to Google Hell?

  1. Interesting thoughts, Matt. Without wishing to foist a “what’s happening to my site” thread on your constructive blog, my Dublin site has been gradually relegated into supps over the past 3-4 weeks. I lose about 5-6 pages (of a 200+ page site) every few days. There is no intentional duplicate content, although the lack of content on picture pages may account for those. The remaining pages are filled with unique content and I can’t see any reason for their demise. No matter what you say about supps, it still feels like an old friend is gradually dying.

    At least I don’t have a $250,000 business to screw up – it’s only an engaging hobby for me.

  2. That is fascinating Matt.

    I especially find it funny after you reviewed my site and pointed out the bad link exchanging I did in the past (before I learned Google’s rules), basically the hundreds of paid link exchanges I had in the past :)

    The advice to blog was a good one; it has created many good, free links!

    I just got done reading a thread at WMW where google was criticized for basically making any rules at all, since Google controls so much of the traffic on the web.

    I guess the real dilemma is simply that if Google lets the rules be known, then all of us will do whatever we have to do to achieve the results those rules reward. The second an individual has a conscious intention to please the algo, it becomes more and more grey.

    The gradient becomes something on the order of,
    Just making good content, and who cares if it gets links or not.
    Making good content with the intention of it being link bait.
    Asking others to link to you.
    Creating some type of exchange to get the link, such as I will build your site if I can put a link on it to my site.
    I will pay you to blog about my product or site.
    And then the outright, I will pay you to link to my site.

    Would you comment on whether that is the approximate order of good to evil from the viewpoint that google is promoting?

    There is also a lot of talk on WMW about paid directories such as Yahoo directory or BOTW and whether those will have SEO value, or possibly even just create a penalty.

    Again, thanks for the access to both listen to and communicate with you. I think George W. needs a blog like this where we can all ask him questions and tell him what we think!

    dk

  3. I think if a site is going to give a quote to Forbes complaining about a lack of Google love they really need to be prepared for the worst. :)

    I can’t believe this site has a VP of search marketing and they still don’t know what happened. Where is the President of search marketing?

    Heads should roll for this.

  4. JohnMu

    Sigh. Same procedure as last year :-)

    Can you tell us some more about having a site “suddenly” rank at #30 and below for all terms? Supplemental results are usually easy to debug.

  5. I read the Forbes article yesterday and thought: “Just another search journalist wanting to use the current anti-Google sentiment (hype) to get exposure!” But even I’m not convinced that Google’s “do no evil” still stands proud.

    If you currently write an anti-Google article, you will get noticed and if your arguments are sound, you get published. The arguments in Forbes aren’t bad, but not good enough for a Forbes article in my opinion.

  6. > But the entry doesn’t give great experience for users; heck,
    > it’s not even a complete sentence. And it didn’t look really
    > relevant for users for a diamond ring site to exchange links
    > like this in potentially up to 329 different categories

    Google AdSense also often includes nonsense English & irrelevant links:
    http://www.dealcafe.com/funnies/searchgame.html

    The way AdSense is set up, only humans will be affected, not search bots, so I guess it’s outside the scope of this post. But there are a lot of spam sites out there financing themselves with Google AdSense. I often wonder, in a kind of thought experiment, how Google engineers would react if they found out a significant correlation between sites using > N AdSense blocks, and spam sites. Would they be as ruthless in transforming this find into an algorithm that lowers the ranks of these sites? Or is there a — subconscious, rather innocent, nevertheless existing — conflict of interest?

    On another note, pages like these also contain incomplete sentences, but in this case you were OK with it:
    http://spiderbites.nytimes.com/articles/pay/1980_index.html
    I wouldn’t want to see every site with nonsense English be downranked in Google, that’s not it — but this NYT page is clearly made for SEs first and humans second, and it’s examples like these that may make it harder for webmasters to see the fine line you’re seeing.

  7. I’ve heard / read many times these two statements:

    1. Reciprocal links by themselves aren’t automatically bad

    2. Excessive reciprocal links are bad

    What is “excessive”? ;) Can you give us some numbers?

    If I have 40 reciprocal links on a site with 200 content pages, is that triggering any penalties?

  8. This is not the first time; we’ve had similar stories before where a little missing knowledge became a big news article. Here is another example: online.wsj.com/public/article/SB117375265591935029-azt3SDR6a_bQwU1WbraemnGSXZ0_20070411.html

    (posting it again, may be the previous one hit the spam box due to the active link to wsj.com)

  9. Hmmmm… so what you are saying is that reciprocal linking to other (some unrelated) sites is causing the pages to go into the supplemental index? But does the site in question have poor content? Are the pages duplicates?

    Why do I ask? Well, because you guys have been claiming that Content is King all this time. If so, and if the site has good content pages, then those pages shouldn’t be in supplemental in the first place, right? “Devalued” incoming and outgoing links should not play such a part in them going into supplemental.

  10. I’m with JohnMu – what about some comments on the +30 penalty that I and other webmasters have been in for months and despite seeking feedback, cleaning up etc no change which has cost me thousands.

  11. To get rid of supplemental results, I feel it’s necessary to have a unique meta description and title on all pages of your website. My blog went totally into supplemental results a few days ago and I have posted about how I got rid of them.

    I want to ask Matt a question. How would Google treat a page whose meta description is completely different from the page content? Are such pages handled by Google manually, or do the bots take care of this?

  12. I wish the guy said “supplemental hell” instead of “Google Hell” (never heard that before), then I’d get some traffic to my site for a change :D

  13. Dyce

    Aye… seen ‘Supplemental Hell’, got one or two in there myself :( I believed it wasn’t so much to do with PageRank but more to do with how easy it was to be spidered… e.g. pages with tonnes of variables in the URL (non-SE-friendly) were generally found there… I’ve seen pages with little/no PageRank doing quite well…

    Also… don’t suppose anyone could shed any light on why Google suddenly spat out my verification of a website from the webmaster console… and now won’t verify despite the fact that the page is there… I can see it, browse to it, etc… most odd… if not… no worries :P

  14. I think many people aren’t ready to let go of paid links and reciprocal link exchanges. It’s more comforting to believe they still pass some juice and blame meta descriptions and duplicate content for supplemental results than to acknowledge 1) PageRank still has a place in Google’s algorithm and 2) Google’s link analysis will grow more sophisticated over time.

    Still, if you search for “we live together” you’ll find a porn spam site ranking #1 with sitelinks. Many backlinks are from off-topic sites with hidden links like http://www.coins2artefacts.co.uk. Spam ranking #1 – no big deal – but with sitelinks? Lol, c’mon…

  15. Amazing… I’ve never really gotten my head around the supplemental index and was of the opinion, like many others, that it was in fact “hell” :0)

    But the following from Matt –

    “It’s perfectly normal for a website to have pages in our main web index and our supplemental index. If a page doesn’t have enough PageRank to be included in our main web index, the supplemental results represent an additional chance for users to find that page, as opposed to Google not indexing the page.”

    ..has spelled it out for me .. thanks again..

  16. Well, Andy spelled my reference web site wrong. So much for getting some press off the article.

    If you get quality links to the individual pages in the supplement index, they will magically get out of the supplemental results. And by magic, I mean it makes complete sense.

    This does make things tougher for small businesses. Let’s say you have a small jewelry shop that has a traditional store front but want to sell some of your items online.

    So you create a site that organizes your jewelry in categories and sub categories. So to get to any product you have Home Page > Category Page > Sub Category > Product Page.

    I’ve noticed that unless you build external links to the Sub Category pages, your product pages will drift into supplemental. It’s very tough to get these links when you are a small business with not a lot of money or time.

  17. Hmm funny I remember a high profile space that went supplemental and a bunch of webmasters complained about the loss of traffic.

    http://www.threadwatch.org/node/10946

    Yes Google fixed it but, only after the damage was done and people made a bunch of noise. How often does this happen and webmasters and websites end up collateral damage with nowhere to turn.

  18. “Yes Google fixed it but, only after the damage was done and people made a bunch of noise. How often does this happen and webmasters and websites end up collateral damage with nowhere to turn.”

    Ummm… yeah. Graywolf, if SEOs spent more time explaining to clients what to do instead of whining on SEO blogs like an old lady about Google’s hidden agenda, maybe people would have more places to turn.

    Look at that guy’s jewelry site. Lame paid IBLs, TBPR 1 (update’s underway), and the guy has the balls to point his finger at Google? Any $100/hour link builder/viral marketer can fix his problem.

  19. Hi Matt. What you say makes perfect sense, and I suppose that’s how the supplemental index should work.
    I believe, however, that there might be certain flaws connected to it.
    My site songtext.net has a PR of 6 and over 70,000 backlinks according to the webmaster console (all legally obtained over the years, not paid for or otherwise enforced).
    In January, all of a sudden, all of the 300,000 to 1,000,000 indexed pages (hard to get absolute numbers there, as you know) dropped into the supplemental index.
    Since then I have tried various things: a complete redesign without tables, moving the content further up in the code, shorter and more individual page titles, etc.
    Don’t get me wrong, I know that not all of our pages have individual content (due to the directory-like structure), so I wouldn’t mind if 30-60% were in the supplemental index, but songtext.net was scraping by with only 900 (!) pages in the main index for the last three months (a little more now). And considering the number of backlinks and our PR, that seems like an error to me.
    Thanks for reading all this,

    Stefan

  20. Well why didn’t we have this level of public dissection in the vanishing sex blogs incident … hmmm … oh we’ll make some adjustments behind the curtain fix it , sweep it under the carpet and mum’s the word.

  21. JLH

    I think Adam Lasnik summed up the supplemental index best on Google Groups, which I completely stole so I could find it again and use as a reference on just such an occasion.

    http://webmastershelp.iblogget.com/2007/03/09/the-skinny-on-supplementals/

    Especially interesting (and it would have been good investigative journalism to find) is this statement:

    “1) Penalty? When your site has pages in our supplemental index, it does *not* indicate that your site has been penalized. In particular, we do not move a site’s pages from our main to our supplemental index in response to any violations of our Webmaster Guidelines.”

  22. I’ve read a number of times on here and on Webmaster Central that being in the supplemental index isn’t a bad thing but try telling that to the thousands of webmasters who find their sites suddenly disappearing off the radar.

    Have Google recently changed the way they classify what goes into the supplementals? Also, is there a way for this information to be integrated into Webmaster Tools?

  23. I agree with Chris. The tough thing is that, with all the spam and the measures Google has to take to “defend” their results, small businesses end up hurt in their opportunity to rank. I wish the local results were stronger in the SERPs to give them a chance.

  24. Supplemental results are bad, but how long are pages going to stay there?

  25. It seems to me that these people complaining have changed tactics, not strategy. They are still trying to game PageRank by complaining to Forbes (and getting links from them).

  26. Johnny Mnemonic

    Chris, you said:

    “It’s very tough to get these links when you are a small business with not a lot of money or time.”

    Small brick and mortar has trouble competing with large national chains, because it’s out-advertised.

    Film at 11.

    Maybe the world doesn’t really need another vendor willing to sell me jewelry online? Google, or anyone else for that matter, doesn’t owe this small jewelry shop a living. OTOH if they were to innovate and produce a unique product, then they would be sure to get some individual attention. Until then, though, I’m not sorry that they don’t show up in the search results–who needs ‘em?

  27. I love how people are always surprised and offended when a mainstream media source screws up coverage of an Internet-related issue. They just can’t get it right to save their lives.

    Steve Maich wasn’t even close, the BBC missed the mark, Computer World of all people screwed it up, and now Forbes has.

    Mainstream media should either hire dedicated people to cover Internet and/or SEO-related issues or just not bother at all.

  28. Dan

    The interesting part about this is that the vast majority of people don’t have a problem with how Google does things. It’s only when the company in question starts to use what is generally labeled either black-hat or grey-hat SEO that they get into trouble.

    While I understand why new, freshly launched web sites hang out for a bit before Google starts to rank them, there are legitimate ways to get more traffic; advertising, Digg and Reddit come immediately to mind.

    All part of the fun of it. The funny part is that the rules are clear, and how this works is clear. It’s only when someone does something that is against the rules that they whine.

    Sorry mate, no sympathy, even less after reading this article.

  29. To be fair to Matt, anyone seeing this links page knows it is there for SEO only. And that’s the whole point: anything done for SEO only, with no value (or negative value) to the user experience, counts as spam. This list is obviously spam. And the “rules” about this have been the same for a long time; the ability of Google to catch this kind of stuff has improved. However, there are a number of webmasters that do get accidentally caught in these filters. Having said that, it’s not like Google owes you anything to begin with… lol. Use unique page titles and meta descriptions on all pages, make sure your URL structure is easy for Google to understand and that each page of your site is linked to with at least one straight HTML link from within your site (p.s. the sitemap is a good place to ensure this link connectivity), and that will solve a lot of supplemental-related issues (although obviously not all).

  30. We at Daktronics Inc. got hit with Google Hell and were removed from Google’s index a few weeks ago. We didn’t know what was happening, since Google’s explanation in Webmaster Tools is very vague. We have done everything we can to find and fix the problems according to the webmaster guidelines. Emails and request forms have been submitted to Google requesting assistance, but no response. That’s poor business practice; Google doesn’t seem to care about companies like us. We are a legitimate company from Brookings, SD, one of the world’s leaders in programmable electronic signs and electronic and digital displays. http://www.daktronics.com We saw a drop in traffic and sales as a result of the removal from Google.

  31. I’ve never had a problem with Supplemental Results, of course my sites are all fairly old websites and yes, I have utilized some “get links quick” software but most of them were due to advice by Search Engine Watch (Many years ago).

    We haven’t done much link exchange outside of the clubs and organizations of our industries within the last 3 years. Some of the old links to our sites still appear on Google’s Webmaster Console, but I’ve done my best to offset them with quality links. I have no control over removing them.

    We did find ourselves, or rather some of our pages listed as supplemental, but those were https (SSL) listings and it didn’t matter, they’ve been removed and are no longer on Google.

    It’s unfortunate that many websites have found themselves with this problem; I looked at the sites myself and found evidence of bad link patterns. What I really disliked about that article, however, is that there were no solutions offered, and none were requested either. It was attacking in nature, and almost blubbering and immature. Waa waa waa, Google hates me! (Sound familiar?)

    It’s really simple to fix their listing and rankings, they just have to put effort into finding quality links to offset the bad. I’ve done it, but I started early before the onslaught of “relative links” became a must. It took awhile to find these niche sites, but I will say it didn’t cost me near $500,000 in business because I sat around and cried.

    And then came yesterday… When Google took away my shiny 5 PR (sigh..) just kidding Matt! Our hits are up, our internal page rankings are up once again back to top 5 and 10 positions as they should be. So take the PR and bury it, just keep giving me relative rankings and I’ll be happy with my pretty 4. I’m liking the change, and really happy I didn’t get caught up in the PR craze either.

  32. Kudos on the outing, Matt! Another great treat for us.

    But on a more serious note, I don’t think Google is really taking heat because Web sites with questionable content are going Supplemental. You’re taking heat because a great number of perfectly legitimate sites with good, unique content but relatively few inbound links are sinking into the Supplemental Results Index.

    And those pages rarely show up in active, well-populated queries because Google favors non-Supplemental content first (totally ignoring relevance and value in favor of link spam) AND because Google does NOT parse the text on Supplemental Pages. There is no way a Supplemental Page will show up in organic search results unless it has at least one non-supplemental link pointing to it.

    I’ve performed many site searches on a lot of people’s Web sites that were wholly in the Supplemental Results Index and those pages don’t even come up for their own internally branded site names. This is consistent, easily reproduced behavior.

    So you should really make an effort to look at this from the Webmaster’s point of view. Well-linked guys like you and me can blow off the Supplemental Results Index because we get over 1 million visitors a year. Most sites don’t get that kind of traffic.

    The smaller sites with fewer inbound links don’t deserve to be relegated to the dungeon of the Supplemental Results Index where the on-page content is not found for very specific, unique term queries. If they cannot be found even for the very unique expressions embedded in their content, it’s completely unreasonable for Google to pretend that they will be found for anything else.

    I hope you do have a great vacation, but when you come back, you’re probably going to find you and Google are facing an increased amount of criticism for favoring link spammers over relevant, valuable content that is simply link-poor.

    You can continue your crusade against paid links, but you guys created the problem and you can easily fix it by preventing ALL sites from passing link anchor text in your index.

  33. “waa waa waa Google Hates me! (Sound familiar?)”

    Sounds real familiar, Asia. I’ve been crying in my coffee over this for months.

    It is very insulting when 90% of a seven (7) year old site has been dumped into supplemental hell. I don’t do link exchanges, and if any site that I do link to happens to link back, it is merely a coincidence. For months now the other search engines have been sending 4 to 5 times the traffic that Google now sends. It used to be the other way around.

    I’ve even rebuilt the entire site (nearly 900 pages) from the ground up with the hope that things will improve. The rebuild isn’t completed yet but the site is up and running. I have time consuming housekeeping yet to do and I figure that chore will take me months to complete.

    Even when I try to find search results on Google I’m having to go far too deep into the results to find a result that fits with my query. More and more often I am now finding myself resorting to other engines that I haven’t had to use in a coon’s age to get to what I’m looking for.

    To add insult to insult the PR update clobbered the site as well.

    Is there any hope left? I don’t think so, but I’ll just keep trucking along doing my thing and maybe someday down the road all will get fixed.

  34. nerd gone bad

    Google is still getting gamed, and easily, when one website can appear in 8 of the top 10 spots for its category (and it’s not original content, it’s all spam advertising).

    http://mythicalblog.com/blog/2007/04/17/the-porn-of-web20/

  35. Any $100/hour link builder/viral marketer can fix his problem…. How?

  36. ****
    My site songtext.net has a PR of 6 and over 70.000 backlinks according to the webmaster console (all legaly obtained over the years and not paid for or otherwise enforced).
    In january, out of a sudden, all of the 300.000-1.000.000 indexed pages (hard to get absolute numbers there as you know) dropped into the supplementary index.
    ****

    Wow – 70k links and only PR6 – a PR6 attempting to hold up 1million pages………..I think you have your own answer…..

  37. Matt –
    I’m not getting your objection to the gist of that article, which I think is less about whether it is possible to get out of supplementals than about the collateral damage issue.
    You seem to be saying that going to supps does not generally involve a dramatic downgrade in rank, and that dramatic ranking changes only affect sites that deserve them?
    Though perhaps part of this is a time-frame thing – Google fixing a ranking quirk in 6-12 months may seem reasonable to you, but for some companies it means… death, or huge losses.

  38. Sometimes I wonder about how Google really thinks. Here’s an example:
    Google the phrase “business cards”. You will see that on the first page alone, out of all of the pay-per-click bidders, VistaPrint is on there 3 times! Two of them mislead you into thinking they are other companies (they’re like splash pages). However, when you click them and then click anything, you end up on VistaPrint.

    Plus, they are number 1 and 2 with a high PageRank. How can anyone compete with a company like that using unethical tactics? Am I just wrong?

    I know this fell off topic, but it drives me nuts! Matt, maybe that’s something you can address? I feel that’s a form of spam. I don’t see how that could fit in the white-hat category.

  39. Matt Crouch

    Like someone else said above, what is excessive? I don’t expect Google to actually give me an answer. But I think in general webmasters are used to seeing link directories with at least 20 categories and 20-30 links per page. Having hundreds of reciprocal links may not seem like a lot to people that look at it all day (who have high aspirations), but maybe Google all along has thought that is a ridiculous number. I realize that even hundreds of links could be a small number in comparison to sites that have thousands of natural links… but I imagine Google looks at the percentage of reciprocal vs. natural links when it measures a site’s backlinks. With that being said, say a website has nothing but reciprocal links. Are hundreds of recips bad?

  40. I ended up talking to Andy before the article went out because of my recent post on SEOmoz:

    http://www.seomoz.org/blog/how-i-escaped-googles-supplemental-hell

    I got the distinct impression that he really wanted to hear the more sensational horror stories, and, as I tried to convey in the article, I honestly have come to believe that Google doesn’t intend supplemental as a penalty. The one lesson I learned from getting my client out of supplemental was that we were suffering from dozens of small things we’ve been doing wrong over the past 2-3 years. None of it was black-hat, but much of it made for duplicate content, unfriendly search results, and a bad user experience.

    On the other hand, I can certainly vouch that having over 99% of your pages in supplemental has very real consequences. The week after my client got their site out of supplemental, Google searches almost doubled. The vast majority of these were long-tail searches that are driving high-quality traffic. Of course, these searches are good for Google, my client, and our users, as they bypass our own search results and get users straight to the content.

  41. Did you ever wonder if part of the link exchange and paid links sites that are out there are actually owned by Google or a front company owned by Google? Google sets up link exchange sites like these to catch spammers? Maybe it even gives a “link exchange” site high PR, good search engine rankings, and even has AdWords ads pointed towards the sites so people sign up and Google catches them?

    Very interesting indeed….. I could see Google doing this to catch the spammers….

    Of course Matt, I would not expect you to comment on this, but I can definitely see you doing something like this…. That’s sneaky, but spammers deserve it.

  42. Reciprocal links happen naturally all the time. I link to SEL on one post and SEL links to me from another post, for example. Excessive reciprocal exchanged links (ab/abc trades) intended for search results manipulation are what you should be careful with.

    “Steve Maich wasn’t even close, the BBC missed the mark, Computer World of all people screwed it up, and now Forbes has.”

    Yeah. I don’t mind healthy criticism but I’m seeing alot of baseless anti-Google propaganda lately.

    Ideally, a website should rank without any marketing. If it’s a good quality site, it should rank high even if I don’t bother to install a blog and blog on it 24/7 to get a bunch of people to send me links. YouTube is an example of a unique site naturally gaining links but not every site is YouTube.

    There are a ton of good quality porn sites, for example, but because there are almost zero natural links in that niche, Google has a hard time deciding what’s a good porn site and what’s spam.

  43. Matt,

    You said:

    “Reciprocal links by themselves aren’t automatically bad, but we’ve communicated before that there is such a thing as excessive reciprocal linking”

    Thank you for finally commenting specifically about reciprocal linking!

    We’ve been preaching to our thousands of customers for years to make linking decisions for the end users and not for the search engines. Our patented editor-based system was designed intentionally to charge per category instead of per link to facilitate relevant link exchange for the end user.

    No site needs 329 link categories.. most only need a few dozen at the most.

    Relevant link exchange benefits the end user’s experience by providing another knowledge gateway to other sites related to the first site’s product/service/information.

    Link exchange can be abused or it can be done right for the end user.

    Folks: Avoid full duplex link exchange services and software that guarantee you links overnight. If you do use a software or service to manage link development, use one that is EDITOR BASED so you maintain editorial discretion on making links.

    Link as if the search engines did not exist. That means, ask yourself: “Will the site I am about to link up with benefit my end user or help my end user learn more about my own product/service/information?”

    If the answer is no or you aren’t sure, skip it and move on. Don’t link to sites that don’t offer significant original information (avoid scraper sites).

    If the answer is yes, get the link while forgetting about the site’s pagerank or related metrics.

    Link exchange has existed since the WWW was invented. Google is rightly working to discern between webmasters who link for the end user and webmasters who abuse this legacy marketing method. Thank you Matt for reminding webmasters to avoid irrelevant high volume link exchange.

  44. Nice article; nothing is wrong with “supplemental results”, thanks!

    I believe even writers at Google do not know exactly what “supplemental results” are; perhaps this was written a few years ago, before so many changes: http://www.google.com/support/webmasters/bin/answer.py?answer=34473

    Google is big, of course it uses multiple architectural layers. How is it possible that search returns results within 100-200 milliseconds?

    The first logical layer is the fastest “cache” connected to the main index. The “supplemental index” should be thousands of times larger. Suppose a user tries to find Bambarbia in Toronto; of course search will return results within milliseconds. The next day/hour/minute, if many users have tried to find Bambarbia in Toronto, some more “supplemental results” will be pulled out into the fastest cache; at the end of the month they will be added to the main index too.

    Some SEOs advise using “fresh content”; that is because of two kinds of Google crawls: one is for the main index, and another is for more frequent “supplemental” updates of “fresh content”.

    I believe Google implements similar logic…
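    As a purely speculative illustration of the commenter’s layered-cache guess (the class and its behavior are invented for this sketch, not a description of Google’s real architecture), a two-tier index with promotion on demand might look like:

```python
from collections import OrderedDict

class TieredIndex:
    """Toy two-tier index: a small fast "main" tier (an LRU cache) in
    front of a large "supplemental" store, with documents promoted into
    the fast tier when they are actually requested."""

    def __init__(self, cache_size=2):
        self.cache_size = cache_size
        self.main = OrderedDict()   # small, fast tier (kept in LRU order)
        self.supplemental = {}      # large, slower tier

    def add(self, doc_id, content):
        # New documents land in the big supplemental store first.
        self.supplemental[doc_id] = content

    def lookup(self, doc_id):
        if doc_id in self.main:
            self.main.move_to_end(doc_id)       # refresh LRU position
            return self.main[doc_id], "main"
        content = self.supplemental[doc_id]
        self.main[doc_id] = content             # promote on demand
        if len(self.main) > self.cache_size:
            self.main.popitem(last=False)       # evict least recently used
        return content, "supplemental"
```

    The first lookup for a document is served from the big slow tier and promotes it; repeat lookups hit the fast tier, matching the “pulled out into fastest cache” idea above.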

  45. @Linda

    I’m not trying to make light of the situation, however, I took the opportunity to look at your website. I love the design, but you might want to look through Matt’s Archive as well as Search Engine Land and Neil Patel’s website for some tips on how to wake that site up for Search Engines. SEOmoz has some great stuff too.

    I would point them out to you, but Matt would probably cut my hand off for spamming his comments.

    I read an article a few days ago by another person who has a cooking/recipe website, who gave some great ideas on how to use social networking via cooking sites to gain traffic. It took her under 30 minutes to make contact with individuals, and within a day or so she had a natural link via recommendation on another website. I got the link off one of SEL’s Search Day feeds. I highly recommend it; I never sleep until I’ve clicked every link on that particular feed, and I learn lots each time I read them.

  46. People who are still obsessed with PR need to realize it is only a number with a green shiny mark. I know of people who rank higher than others in Google and yet have a lower PageRank than their competitor. PR is more of a publicity stunt aimed at earning SEOs money for getting clients high PR.

    Your site can be a PR8 but if it isn’t in the top 10 for its keyphrase it is worth $8.00 CAD.

  47. So:

    1. It’s okay to exchange reciprocal links, in moderation
    2. Search engine abuse will be frowned upon

    I can deal with the moderation factor

  48. I executed some site: queries. Interestingly, Google shows excellent results (a few hundred thousand pages) for some well-known sites, but it does not allow you to browse all the pages.
    For instance, site:www.bbc.co.uk – 1,940,000 pages
    http://www.google.ca/search?q=site:www.bbc.co.uk&hl=en&start=990&sa=N
    and it does not allow you to browse next 1000 results
    Try start=1000, “Sorry, Google does not serve more than 1000 results for any query. (You asked for results starting from 1000.)”.

    For my site, it stops after page number 76 (760 results, from a total of 23,500 pages), probably because those documents are not cached (bad PR, the after-1000-results mark, not enough user queries, etc.). They will be retrieved after some user queries using specific keywords, in PR order, and placed into the cache (even as supplemental results).

  49. I think Matt doesn’t like pleas to look into cases in his comments, but I’ll jump in on one.

    To Christopher Ficek — Your case has nothing to do with the content of this post. daktronics.com violated our Quality Guidelines by hiding a few dozen keywords via CSS at the top of the site, which was there as recently as last month (you seem to be acknowledging as much).

    We sent you an email to a reasonable set of email addresses to alert you. Within this email were instructions on where to go when the problem was fixed:

    https://www.google.com/webmasters/tools/reinclusion?hl=en

    Normally I would see a reinclusion request queued up in my tools, but I don’t see one for daktronics.com as of yet. That’s your next step.

  50. I believe the problem here is that we are being shown that all these links can make a site rank #1 for extremely competitive terms like “diamond engagement rings”, and that a bit of whining can get around any spam penalty.

    “In the future” is a good thought; most of the SEO community already believes reciprocals do not work well and were devalued long ago. Perhaps after viewing this we should second-guess that theory.

    I think graywolf made an accurate statement above: getting your attention so you take action may require a bit of screaming. Perhaps a little communication back (a change in your policy) would make webmasters happier. Isn’t that Google’s goal here? Better communication? When we are looking for help, getting a response would surely make a difference, rather than leaving us feeling like we are “Screaming at a Wall”, which becomes frustrating for any webmaster in any industry.

    Food for thought!

  51. spamhound

    With sites like this filling up the index, no wonder things are all screwy:

    http://www.google.com/search?q=leads2results&hl=en&safe=off&start=0&sa=N

    Most of the links back to that are about as spammy as they get.

  52. Hi Matt,

    Just as an aside, I submitted this post of yours to Digg:

    http://www.digg.com/tech_news/Google_s_Matt_Cutts_rebuts_Forbes_article_on_Google_Hell

    and had an interesting experience. While the Forbes article was submitted almost simultaneously, got promoted, and received tons of Diggs, my post wound up getting promoted only later. However, it didn’t show on the front page. Not even as a popular story in the section I submitted it to:

    http://www.digg.com/tech_news

    The only thing I can think of is that some commenter posted a HD DVD hack and it was buried per:

    http://blog.digg.com/?p=73

    I’ve heard some SEOs got buried sometimes, but I’m not an SEO and your blog is an official Google blog…

    Oh well. Finally got a non-video story on the front page and it dies right away…haha, easy come easy go…

  53. Beltira

    It seems to me people are forgetting two basic things.

    1. Search engines exist to help PEOPLE find things. If the results of a given search engine are less useful than those of another engine, then people will use that other one. As a user of search engines, I don’t care who is relegated to where, just give me useful results.

    2. Shakespeare was wrong. First we shoot all the SEOs.

    Why is it that people think they have a right to good rankings?

    Why is it no one understands that Google and the rest cannot publish the details of how they do rankings? To do so would simply destroy the value of the search engines to us end users.

    Why?

    Simple: no matter what search terms we might put in, we will find:

    300 pages hawking fake Rolexes
    400 pages of sites full of hot lonely single women just looking for a hookup tonight
    500 pages of sites offering me Viagra without a prescription (I assume to be used with the above sites).

    In all fairness, I do realize not all SEOs are out to cheat and scam the system. The problem is that the 99% of them who are make the 1% look just as bad as them.

    I think the search engines are being very nice in merely relegating those that artificially manipulated themselves up in the rankings. If it were up to me, I would flat-out ban the domain forever from appearing in the results at all.

    Harsh?

    Yes, indeed it is harsh.

    The financial rewards of getting away with scamming the search engines are so large, the risks need to match.

    Instead of criticizing Google, why do we not have blogs of people blogging about how evil the cheaters are? Because there are not enough left who have never stepped over the line. So instead they all harp on Google.

    And this “Google has a conflict of interest” crap.

    Yes, sure there is potential for them to cheat. But if they cheat, the value of the search results to the end users goes down. Users flock to another engine that doesn’t. People use Google because they like the results it gives. They are not forced there or herded there. If they were, Microsoft and Yahoo would be sure to let us all know.

    I am not saying Google is perfect and without mistakes. Google is a company of fallible human employees; they have made mistakes, and will make more in the future. As will everyone else, whether an individual or a company.

    Beltira

    O, that this too too solid search results would melt
    Thaw and resolve itself into spam!
    Or that the SEO had not fix’d
    His canon ‘gainst supplemental results! O Google! Google!
    How weary, stale, flat and unprofitable,
    Seem to me all the uses of this SEM!
    Fie on’t! ah fie! ’tis an unweeded garden,
    That grows to SEM; things ranking and SPAM in nature
    Possess it merely. That it should come to this!
    But two months dead: nay, not so much, not two:
    So excellent a web store; that was, to this,
    online pharmacy to an x10.com; so loving to my search results
    That he might not beteem the winds of heaven
    Visit the links too roughly.

  54. Kelly

    Hi Matt

    Please could you answer the question about the number of recip links asked by Tomaz (“If I have 40 reciprocal links on a site with 200 content pages, is that triggering any penalties?”)

    My comment is this – I have religiously turned down every 3 way (ie non-reciprocal) link request believing that I was doing the right thing, because I do not want to “trick” search engines. When I get a reciprocal link request I look at the site and IF I would recommend the site to someone regardless of a link offer, then I agree to the exchange. Now, based on what I am reading here, I am doing the wrong thing and should’ve taken up some of those 3 way link offers. (if the sites were link-worthy of course).

    It’s very confusing, Matt, if we are told different things at different times. Are you now saying that we should not reciprocal link, and if not, why not, if the site is good and fits in with the theme of my site?

  55. Kelly and the rest of you asking about “number of recip links” ..

    I don’t think it’s reasonable to expect Google to tell you exactly how many link exchanges you can do on a certain day or during a specific time period. Think of how that may be abused if Google stated exactly what is reasonable and what their speed limits may be.

    It’s reasonable to conclude that there are many speed limits. Separate speed limits for various types of websites, websites of different age groups and states of development, websites with global as well as national and regional links.

    The nature of the links themselves may also be factored into the “speed limit.”

    Don’t worry so much about a specific number. If you are obtaining relevant quality links for the end user and maintaining editorial discretion while doing so, it will be practically impossible to bust whatever speed limit may be in place for your market segment. In other words, you won’t bust Google’s limits if you obtain links with editorial discretion and without using full duplex software.

    Link to and obtain links from quality sites when it benefits your end user’s experience. Avoid services and software that force you to link in very high volume in a short time period. That means you might obtain one link exchange today, none tomorrow, two the next day, none for the next five days, three the next day, one the following day, none for the next week, then eight the next day and so on… that is natural link exchange activity. Avoid services and software that force you to link in high volume (more than 25-50 a day) to sites you have not reviewed.

    Avoid link exchange schemes, which include three-way links, four-way links, and anything else that appears as chicanery or gamesmanship.

  56. Walkman

    Matt,
    you say that over time some links may be counted less. Assuming that PR increases from 5 to 6, does it follow that the links being counted less is not an issue?

  57. The problem is most companies that hire a VP of SEM look for somebody with a degree in marketing and hire the first one that has been working for a few years and knows what SEM stands for. I have talked to several big companies about doing SEO for them and they are more concerned about non-SEO things. I was very surprised when looking for an SEO job that actual good knowledge of SEO was not that important.

  58. I agree here on the terms for the web pages

    like: duplicate content, no content at all, orphaned pages …

  59. Dave (Original)

    To those worried about reciprocal links. IF you have read the Google guidelines and are still uneasy about YOUR reciprocal linking, you probably have good cause. If you are simply linking out when it enhances your users experience on your site, you likely have nothing to fear.

    It is extremely unlikely, and just as unrealistic, to expect there to be some magic number that trips something. And if there were, and Matt stated it, 99% of those who read it would go exactly to that number. In other words, 99% would be linking for SEs, the wrong reason.

  60. About “reciprocal links”, the main subject…

    It’s very easy to automate the process. If the “density” of different “terms” is the same on the same page, it’s “spam”. Related to [off-topic] in newsgroups, blogs, forums, etc.
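    A toy sketch of the “uniform term density” heuristic this comment hints at (purely illustrative; the function name and threshold are invented, not anything Google has published):

```python
from collections import Counter

def top_term_uniformity(text, top_k=10):
    """Ratio of the rarest to the most common of the page's top_k terms.
    1.0 means the top terms all appear equally often (a suspiciously
    uniform "density"); natural prose usually scores well below that."""
    words = [w.lower() for w in text.split() if w.isalpha()]
    top = Counter(words).most_common(top_k)
    freqs = [count for _, count in top]
    if not freqs:
        return 0.0
    return freqs[-1] / freqs[0]

natural = "the quick brown fox jumps over the lazy dog and the fox naps"
stuffed = "cheap links buy widgets cheap links buy widgets cheap links buy widgets"
```

    On the stuffed example every top term occurs equally often (ratio 1.0), while the natural sentence scores much lower; a real spam classifier would of course combine far more signals than this.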

    From my Apache logs:
    [29/Apr/2007:19:34:39 +0000] "GET /adxmlrpc.php HTTP/1.0" 404 1684 "-" "-"
    [29/Apr/2007:19:34:40 +0000] "GET /phpAdsNew/adxmlrpc.php HTTP/1.0" 404 1684 "-" "-"
    Automated software looking for a backdoor to publish something. Funny, someone still tries to post emails to newsgroups ;)

    Sooner or later users will decide who is who. Via Google Toolbar, Alexa, …

    Do not worry about PR too much! Even if you lost 250k, you didn’t pay anything to lose it ;)

  61. JLH

    So the guy dumps his 329 categories and their links of reciprocated garbage generated through http://www.linkhelpers.net/add_url.asp?info=mysolitaire.

    For example let us say he’s got 10 links in each category or 3290 exchanged links. He of course drops them due to his public outing. Does he now have 3290 non-reciprocated links, generating a strong boost for him? Or does the Google memory remember the previous relationships? If so, for how long?

    Now let’s say a site in a similar situation, with thousands of exchanged links on it, but not having the personal attention of the head of the spam team, notices this and then dumps the pages. Does it get said boost?

    Obviously not all links on two sites are bad. Matt’s linked to me before, I’ve linked to him many many more times, nothing was exchanged, no arrangement was made, but yet I’d still think a link from thE (long E) Google’s Matt Cutts blog is worth a bunch.

    Further speculation also has me wondering if the exchange is really the bad part here, or the 329 categories of links not relevant to the content they are trying to rank for. It’s a terrible profile; the text of the site says it’s about one thing, but the vast majority of the site’s external links are about a million other subjects. The non-relevant links probably don’t add much either, as those sites are off subject and probably have the ultra-creative anchor text of “www.mysolitaire.com”

    (I really don’t expect an answer from you on my speculative questions)

  62. Deb

    Hi Matt
    so reciprocal linking is not black hat, right? So why are some link pages of my site blocked by Google? Is it because of 3-way linking?

    Deb

  63. Dave (original):

    “To those worried about reciprocal links. IF you have read the Google guidelines and are still uneasy about YOUR reciprocal linking, you probably have good cause. If you are simply linking out when it enhances your users experience on your site, you likely have nothing to fear.”

    Exactly!

    I am not an SEO, but I am from the school of common sense!

    Why is everyone getting burned up about reciprocal link ratios, numbers, categories and so on?

    I mentioned something along these lines in another thread:

    If you sell “widgets” in the UK and get enquiries from “widget buyers” in the US, you might link to a US-based “widget seller” and he might link back to you – cos he gets UK enquiries.

    If you sell “recycled widgets” you might link to a guy that sells “recycled teapots” and he might link back – cos you support that community.

    These types of links are natural, ADD VALUE for your visitors, and are plain common sense.

    Thank you and good night!

  64. Dave (Original)

    I am not an SEO, but I am from the school of common sense!

    I wish more were, seems like “common sense” is not at all common when it comes to SEO or Google.

  65. I think most people are approaching this from the wrong direction – afaik a page isn’t in supps because it has a lot of ‘bad’ links; it’s in supps because it has very few *direct* ‘good’ links (i.e. natural deep links to that page)

    Not a penalty, just a ‘this page is not valuable’ decision.

    That still doesn’t make it a good idea. Like MM said above, if a page won’t appear for a query where it has perfect onpage content, you know that the link based algo has had the dial turned way too far.

    It’s not normal for most websites to have high volumes of deep links, especially some types of e-commerce. Blogs and news sites naturally get them, and information sites will tend to over time, but sites selling ‘part #1234 for brand name model #4321’ aren’t going to get many natural links to that page, even though they provide exactly what searchers for that keyphrase are looking for. Putting that page in supplemental is unhelpful and, since it can be fixed by going and getting a few carefully placed links, just makes for more unnatural linking as people figure it out.

  66. Matt,

    I’m baffled by the supplemental results thing. My site used to have 99% of its pages in the main index. Over the last few months, I’ve lost about 70% of the pages to the supplemental index, which I know you say isn’t a penalty but it has led to a complete cessation of referrals to those pages from Google search.

    The ironic thing is that the pages that have gone supplemental are not the shortest, most copied or poor pages on my site but are in fact the main purpose and the unique content and value given by my site. ALL of these pages have gone supplemental.

    I do have reciprocal links (isn’t this part of what the web’s about?) but not excessive – no links to sites off topic and certainly vastly many times fewer reciprocal links than unique pages on the site.

    So why has the very best part of my content gone supplemental (and left the rest)?

  67. Andy Greenberg from Forbes called me looking for information on Google’s supplemental results. When I said that I believed that SEOs exaggerate what supplemental results are all about he wasn’t interested in discussing the issue further with me.

  68. Paul, that happens to me all the time. What I consider primary pages Google seems to think are secondary and end up in the supplemental index which of course means they never get returned in the SERP.

    If I could understand what it takes to make primary pages and secondary pages I would organize the sites that way. Not all my pages have to be in the main index – but some simply have to be or my site is not properly indexed.

    If you figure it out please let me know.

  69. Andy Greenberg from Forbes called me looking for information on Google’s supplemental results. When I said that I believed that SEOs exaggerate what supplemental results are all about he wasn’t interested in discussing the issue further with me.

    Well what good is telling a journalist something that contradicts his opinion? They never let facts get in the way of a good Internet story.

  70. Just as a sample of not relying on PR:

    try searching for m7640nr at Google. On my PC located in Canada it shows my site in second place after FutureShop.ca, marked with “supplemental results”. It was in 3rd place when PR was 0, one month ago. So why does search at Google not return pages even from Hewlett-Packard?
    Unbelievable! Even search at PriceGrabber and HP itself returns zero results. And someone already performed this search with Google, as I know from Webmaster Tools; “m7640nr” is real word-of-mouth.

    I need to add more URLs to the crawler, and to tune it, part-time hobby… Thanks!

  71. Well what good is telling a journalist something that contradicts his opinion? They never let facts get in the way of a good Internet story.

    You are the man for keeping it real Adam! Won’t get you much traffic but it’s better to have a soul at the end of the day.

    1. “Supplementals” are purgatory for link builders.

    2. Supplementals are great for removing things like tags, rss and other duplicates to make the real content count.

    I believe if you are bitchin’ about supplementals you are a sucky SEO.

    Anyone have anything good to say about supplementals? :)

  72. Thanks for clearing up some confusion Matt. It is nice to know you have pages in the supplemental index too.

  73. rob

    Gurtie hits the nail on the top of the canister.

    The whole shebang naturally encourages people to go out and acquire links. Not everyone has marketing nous or is web-savvy to the power of blogs and tags and all the other ways of bumping your stuff to where you want it to be. Least of all the hobbyist with kick-arse content – but hey, then again, is there really anything new now under the sun? Perhaps that’s the view: the uber-meritocratic web, the cream will rise, etc. Well, not if it’s stuck in suppsville and no one knows about it, it won’t.

  74. is there really anything new now under the sun?

    Yeah, the “hobbyist” can now get to position #1 with a few relevant, “earned” organic links.

    Next question?

    (I figure Matt is busy so anyone else want to jump in feel free)
    :)

  75. Speaking of SEO, dear Matt, a lot of people out there argue that PageRank is no longer relevant (useless) and say it does not affect SERPs at all. However, I would like honest feedback from you just to clear things up.
    Phil

  76. Interesting: PriceGrabber has PR9 with 400,000 links (it goes down to 992 on some servers); CNET has PR9 with 1,440,000. Google has almost the same number of links, 1,600,000. All links naturally come from everywhere.
    According to Alexa, CNET is 6 times more popular a site than PriceGrabber; according to scientific papers, PR reflects the probability that an average user will find you, and it is easily calculated (in theory).

  77. PriceGrabber has PR9 with 400,000 links (it goes down to 992 on some servers right now); CNET has PR9 with 1,440,000. Google has almost the same number of links, 1,600,000. All links naturally come from everywhere.
    According to Alexa, CNET is 6 times more popular a site than PriceGrabber; according to scientific papers, PR reflects the probability that an average user will find you, and it is easily calculated (in theory).
    All this math should help in sorting search results and in putting a subset of all indexes into the main index, limited to 1000 documents per search term. Least recently used might be removed.
    (sorry I tried to post under Hobbyist; pls ignore)

  78. Number of links drops to 999 for CNET, and 998 for Google. Fair enough.

    “Exchanged links”, where A voted for B and B for A, and even bigger loops, shouldn’t have any impact on PR calcs, should they? If everyone voted for everybody, PR should come out the same for everyone.
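    For what it’s worth, a fully symmetric link structure makes PageRank come out uniform rather than zero. A minimal power-iteration sketch of the textbook PageRank recurrence (an illustration only, not Google’s production computation) shows this:

```python
def pagerank(links, damping=0.85, iters=100):
    """links: dict mapping page -> list of pages it links to.
    Plain power iteration of the classic PageRank recurrence."""
    pages = list(links)
    n = len(pages)
    pr = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1.0 - damping) / n for p in pages}
        for p, outs in links.items():
            if not outs:
                continue  # dangling pages: ignored in this sketch
            share = pr[p] / len(outs)   # each outlink gets an equal share
            for q in outs:
                new[q] += damping * share
        pr = new
    return pr

# "Everyone votes for everybody": a complete graph on three pages.
complete = {p: [q for q in "ABC" if q != p] for p in "ABC"}
scores = pagerank(complete)   # perfectly symmetric -> uniform 1/3 each
```

    On a complete three-page graph the scores converge to 1/3 each, so mutual voting doesn’t zero anything out; it just stops differentiating the pages.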

  79. Shii

    This is to Peter Scott:

    It looks like you designed your Ireland website with the purpose of getting high placement for various Ireland-related search terms. I for one am quite happy that Google has slighted you. People should write their websites for humans, not computers.

  80. … but hey, then again, is there really anything new now under the sun?

    No, because everything rotates around it, not under it. ;)

  81. Aaron Pratt,

    Andy called me as well. He wanted “real world” examples of websites that are stuck in the SI. I told him that if my clients were in there, I could not tell him anyway. I appreciate Andy’s efforts to try to bring the SI results to the public as fact, but the way the article came off made it seem like Google was trying to hurt webmasters. Obviously that is not true, but as Michael Martinez has pointed out, Google needs much, much more work on deciding who to put in the SI and who to take out.

  82. I need help in knowing why supplemental results affect both new sites and old. The reason I ask: my website http://www.ecommind.com – this domain was registered in 2005, but the full-fledged site was launched in 2007 with sub-pages.

    Still, I see that after the first Google update a few of my back link results have gone to supplemental results. Another thing that has come to my notice is what happens if I search with different search engines and the online tools available.

    The search engine shows links with supplemental results, and other online tools show results excluding supplemental results.

    When will my pages come out of the supplemental results?

  83. Hi,

    This is the first time I am writing on MattCutts.com. I just wanted a clarification about supplemental results. I have been working in the search engine field for the past 3 years. I have found sites which initially go into supplemental results, and soon, if no steps are taken, they get penalized. Surely they may be using unethical SEO tactics.

    But my question is: I have seen sites which have been in Google for many years, getting lots of users, but at the same time using unethical SEO tactics. I mean, why has Google not banned those sites? Just because they are reputed sites, deriving lots of traffic, and people want to see those sites?

    But what about the unethical SEO tactics they are using? I had this question in my mind, so I thought I should ask.

  84. Lee

    Hi,

    Anybody know of a tool that can simply report back counts of indexed pages and pages in the supplemental index?

    Thanks,

    Lee

  85. When I first launched one of my sites, I found that although Google crawled most of the pages, most of them were placed in the supplemental index. After a few strong links, they were put in the main index.
    So it’s all about links, I guess.

  86. Try this simple test: create a beautiful web page and add a link to it on a home page. After a few days (maybe even the next day) you will see this new page in the search results, marked as “supplemental”. After PR is assigned and some statistics are gathered, this page may move to the main index; although I believe there is no such technical concept as a “main index”, it is simply the cached “top 1000” documents which Google tries to synchronize across tens of thousands of servers (it can’t synchronize all indexed pages!)

    Maybe I am WRONG because I’ve never read this article:
    http://www.mattcutts.com/blog/bigdaddy/

  87. Matt, isn’t the link exchange being done by mysolitaire in direct violation of Google’s Quality Guidelines?

    “Don’t participate in link schemes designed to increase your site’s ranking or PageRank.”

    I practice White hat, encouraging my clients to get good quality links based on content and quality. I tell them to avoid these sort of link building schemes, as they are against Google guidelines.

    The unfortunate reality is these techniques work. My clients argue back that their competitors’ sites have poor content that is not updated often and zero content on the home (click to enter) page, but still come up #1 for most of the searches. The only answer I can see is that they participate in these link building schemes and have thousands of reciprocal links from sites that are completely irrelevant to their business.

    It is becoming hard to stand on the soap box saying “this is the honest way to do it” when the facts show that the other way works better.

    Google has to give us White Hat SEO guys some sort of way to fight this or all our clients will go off to black hat SEOs…

  88. Tom Barber

    With all due respect to Matt and the Google team, hurry up already!

    How long do we have to play by Google’s Webmaster Guidelines, building out good content and focusing on our users, only to be outranked by crappy sites with “resource directories” clearly designed only for SEO purposes? In fact, cases where reciprocal links genuinely benefit a website’s users, especially an eCommerce site’s users, are probably very few and far-between…yet websites with entire site sections dedicated to such links consistently rank in the top 5 in Google. People using these tactics can also control the anchor text of the links they place and reciprocate for, allowing them to target their spam.

    As long as purchasing and trading links works, people will do it.

  89. Today someone visited my site using the Google query [Sony ICD-470]. The query returns a single document, and the only one!
    Sorry guys, I was not trying to fool Google (look at the cached page too), but sooner or later more and more supplemental results will be returned… for instance, from Mediapartners’ index… Bigdaddy is small…

    A classic sample query is this:
    To be, or not to be

    Google has really changed, for the first time in the past decade. And some SEOs ;( still advise removing stop words from a title ;)

  90. From this blog I have read the definition of supplemental results as:
    Google’s original stated purpose for their Supplemental Index is to augment the results for obscure queries. So if you are searching for a very particular thing, you may see supplemental results.

    It seems Forbes was being deliberately misleading to create an aura of sensationalism for their article. Whatever their intention, the article succeeded in generating a lot of traffic.

  91. Last year the query To be, or not to be (without double quotes) returned as the first result a page titled “Too hot to be truth”, possibly because of stop-word and punctuation removal before query processing (“to”, “or”, “not”, “,”). (Although stop words were not removed during indexing, as the quoted query showed.)

  92. Doug Heil

    I think that instead of blaming Google and wondering why sites are going supplemental or pages are disappearing, etc., some sites I’ve viewed from “this” thread have very good reasons why things are happening the way they are. I’ll bet many problems have almost nothing to do with incoming links to pages at all.

    Christopher of Daktronics: I’m not picking on you, but yours is one I remember very well: :)

    Your URLs are atrocious, and the home links on your interior pages go to the index.cfm extension instead of the “root” of your domain. That creates “two” identical home pages in Google. That’s only one small thing among many, many things. :)

    If sites out there actually looked and tried to learn about what makes SEs tick, etc., you really wouldn’t be wondering what is wrong with your sites. Most problems are easily fixed if you have a designer/programmer who knows what they are doing. Of course, most designers out there don’t have a clue about search engines, so you owners need to learn this stuff or find people who can help you.

  93. From Google’s viewpoint, if both queries return the same HTTP response, which one will be automatically banned for duplicate content?
    URL1: http://www.mydomain.com/
    URL2: http://www.mydomain.com/index.html ;)
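    One common way to collapse this kind of duplication is to normalize default-document filenames to the directory root before indexing or linking. A minimal sketch of that idea (the filename list and function are illustrative, not Google’s actual logic):

    ```python
    from urllib.parse import urlsplit, urlunsplit

    # Common "default document" filenames that servers serve for a bare "/".
    DEFAULT_DOCS = {"index.html", "index.htm", "index.cfm", "default.asp"}

    def canonicalize(url):
        """Collapse default-document filenames to the directory root, so that
        http://www.mydomain.com/index.html and http://www.mydomain.com/
        map to the same canonical URL."""
        scheme, netloc, path, query, _frag = urlsplit(url)
        head, _, tail = path.rpartition("/")
        if tail.lower() in DEFAULT_DOCS:
            path = head + "/"
        # Drop the fragment: it is never sent to the server anyway.
        return urlunsplit((scheme, netloc, path or "/", query, ""))

    print(canonicalize("http://www.mydomain.com/index.html"))  # → http://www.mydomain.com/
    print(canonicalize("http://www.mydomain.com/"))            # → http://www.mydomain.com/
    ```

    On the site side, linking consistently to one form (or 301-redirecting the index filename to the root) accomplishes the same thing.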

  94. Matt, I have a specific question from the real estate industry. Real estate agents have a long history of creating a reciprocal directory of agents by state. Many agents have abused this technique having hundreds of reciprocal links for each state.

    This has led Greg Boser and other SEOs to recommend that real estate agents not have any reciprocal links, contrary to your statement that reciprocal links are fine if they are not excessive.

    My brother is an agent and I built his website. Over his 20 years in real estate he has met and referred business to several agents in each state. Many SEO advisers to real estate agents are now saying it is excessive if he exchanges links with the agents he has exchanged clients with.

    My specific question: is there any reason real estate agents should not have a directory of reciprocal links with agents by state as long as they are not excessive?

    This is a major concern for the real estate industry. Given the amount of money the real estate industry spends advertising with Google, the favor of a reply and some guidance would be much appreciated. Thanks!

  95. @Aaron Pratt: Hi Aaron, see you again here…
    Just wondering how you manage to moderate and answer most of the questions here, as I notice you remark on each of these posts by others… :)

    There is much more info I could get here… thanks

  96. Disconsolate

    @Aaron Pratt: “Anyone have anything good to say about supplementals”

    Hi Aaron, hi Matt… Yes, I have.

    First: sorry for my bad English :-)

    My site is an organized collection of book reviews about Italian personalities and actors.
    I do not sell houses, cars, or videos. I’m not a merchant. I create free content to read… that’s all.

    Every biographical page is purposely created by our staff with the collaboration of the actors and so on… We link to important and contextual sources like imdb.com, abc.com, and rai.it, for example, for the exclusive convenience of our readers.

    For thoroughness, originality, and clarity, our pages are among the best ones in circulation. The websites of Italian television, the Italian version of Wikipedia, and many others copy our content… but without linking to the source (…and if they do link to us, they use rel=”nofollow”. For an encyclopedia constructed on other people’s content, citing the source with nofollow is not the height of ethics!).

    It’s a mentality and cultural problem, and I can do nothing to change it.

    But what can I do if my site has only a few poor incoming links? Unfortunately, few will cite us with a link, but all the world will copy our content. Then the only way to promote our pages is to build a little popularity through directories, or the occasional bought link.

    Over the past 3 months we have lost many positions in the Google SERPs. Our flagship section has been penalized dramatically. Today, whoever has copied our content is on top. Spam engines with one backlink and a little of my content are in the first 2 pages.
    For example, if I search literally for any sample of my text (composed of 10–15 words) in Google, my pages are in the supplemental results, but after the others that have copied my work.

    I have filed some spam reports too, out of love for my work, but with no result. Therefore I have decided that I will not file more spam reports, because reporting is an expensive activity in terms of time, and it is not effective. Sometimes a bad site even speeds up instead :( .

    I observe:
    1. Fortunately, traffic from Yahoo and Ask has grown. (Perhaps in my sector Google is less important because of the large amount of spam, and perhaps people look elsewhere to reduce the cost of their searches.)

    2. Spam blogs built around one single keyword (many of them built on the free Blogspot platform), with content copied without citing the source, become more and more numerous and do very well in the SERPs. But they are only made for AdSense.

    3. Network sites built on free hosting are increasing their presence in Google’s SERPs. Often they are sites built in directories on domains like this one: digilander.libero.it (digilander.libero.it/key1, digilander.libero.it/key2, …, digilander.libero.it/keyn), and they are heavily interlinked. In Google’s eyes, I think that situation is treated exactly like one single site, but they are independent websites. And they too are only made for AdSense.

    4. If I restrict the search to the Italian language, I still see numerous sites in English, French, Chinese, and German! Sorry, but I know there are important Italian pages for that keyword. My pages are not there. Are they considered not Italian? I do not know.

    Best Regards

  97. Hi

    So we are still discussing reciprocal links????

    http://www.seochat.com/c/a/Google-Optimization-Help/Links-Why-They-Are-Of-Little-Value-In-Helping-To-Achieve-Front-Page-Google-Results/

    I have been telling people for years they are worthless……when are we going to move forward Matt??

  98. I have been telling people for years they are worthless……when are we going to move forward Matt??

    When we as a collective move forward, not just we as the minority who already have known this for years.

    When people like Andy Greenberg can report things accurately and without any bias, making sure to gather information from all sides of a story, rather than posting some sensationalistic piece of crap designed to garner sympathy based on dominant company resentment.

    When people like Bob and Sally, the mom and pop website owners, know the things that Doug Heil referred to.

    In other words, it will be a cold day in Hell before that happens.

  99. MattKP

    You know what is funny: we all know the standards for links, content, etc., and just because we preach these things as being bad doesn’t mean they don’t work. I’m not talking about spamming, but when done with care, reciprocal linking can still benefit sites a lot. Sometimes a site needs a couple hundred well-picked reciprocal links, and that can help nudge rankings. Saying they are worthless isn’t accurate.

  100. Still a NOVICE, brand new to all this stuff, so I hope this is not a stupid Q:

    Okay, so what about sites like this one — I’m trying not to link to it:

    traffic-czar[dot]com

    At the very bottom, obscured as white text matching the white background, there is a TON of very tiny cross-links to other sites. You don’t even see them; for some reason I just happened to try to highlight and copy something at the bottom and stumbled on the hidden links by accident.

    Isn’t that verboten? Or is it okay since apparently they are all his sites? or what? I thought that was not okay.

  101. Doug Heil

    Bambarbia: It’s Google’s “CHOICE” at that point if a site doesn’t know better than to get both index.html and mydomain.com into the index at the same time. “You” as the site owner have the choice, by not linking to “both” forms on your site.

    But like I stated, that’s just a minor problem I see out there compared to the many, many other problems “most” sites have with their design/structure.

    Matt: “any” site that somehow gets “200” incoming links from exchanging links is “eventually” going to fade away in SEs. That’s a fact. I’ve “never” had a client go after link exchanges the way I see many out there doing. Yep, I said never. Most of my clients have just a few incoming links anyhoo. You don’t need many to begin with.

    children’s gifts

    If you think being on the first page for that term requires MANY incoming links… you are mistaken. That’s only one example. Yes, I have a client on the first page for that term, along with many other combos of terms… and that client has very few incoming links.

  102. Kirby

    This isn’t about what is worthless. It’s about how Google wants to dictate to webmasters what is and isn’t acceptable to Google based on how Google believes they can divine intent.

    Matt, Calculator’s post stems from the recent penalty assessed on several real estate web sites who are guilty of nothing more than linking to colleagues throughout the country. If you understand the real estate niche, you’ll know that networking is an integral part of the business and some basic research will provide evidence that these links do generate real business. This penalty hit many working families hard, effectively shutting down a primary source of revenue.

    If it warrants a penalty, then take it across the board, but also clarify the ambiguity of your statements concerning reciprocal links. “Excessive” is a meaningless quantifier and many of these sites had few reciprocal links.

  103. From Google’s viewpoint, if both queries return the same HTTP response, which one will be automatically banned for duplicate content?
    URL1: http://www.mydomain.com/
    URL2: http://www.mydomain.com/index.html

    There is some specific terminology which we need to follow, and the correct answer is:
    - No “page” is banned.
    - Some “URL” may disappear from search results, but not a “page”.
    - A page is associated with one single URL.

    The index page will be shown in the SERP, but the URL will be either URL1 or URL2 (whichever was crawled first with a 200 response code, possibly after some redirects).

    “Page” – either HTML content, HTTP content, parsed data, or raw bytes, depending on the search engine implementation. It is usually associated with a primary key which is a calculated value (MD5, etc.; two different pages may have the same MD5, but the probability is low). (BTW, how does Google implement an EQUALS() method for pages? By comparing hash values?)

    “URL” – the initial URL that finally provided us with this “page” (as part of the HTTP response), possibly after some redirects.

    A single “page” may be associated with different URLs, and only one lucky URL will be shown in the search results. Suppose URL1 & Page1 are included in the SERP, and Page003 has a link to URL2 (in human-readable words).
    In more correct (non-human) terminology: Page003 has a link to an indexed Page1 which is associated with URL1.

    Different URLs within the same domain with exactly the same content should not hurt at all, even with 301/302.
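    The comment’s idea of a content hash as a page’s primary key can be sketched like so. This is purely illustrative (it uses MD5 only because the comment mentions it, and the URLs and bodies are made up), not a description of Google’s implementation:

    ```python
    import hashlib

    def page_key(body: bytes) -> str:
        """Primary key for a fetched page: a digest of its raw bytes. Two URLs
        whose responses hash identically are treated as the same 'page'; two
        different pages colliding is possible in principle but vanishingly rare."""
        return hashlib.md5(body).hexdigest()

    # Dedup by content hash: the first URL crawled for a given page "wins".
    index = {}
    crawl = [
        ("http://www.mydomain.com/", b"<html>home</html>"),
        ("http://www.mydomain.com/index.html", b"<html>home</html>"),  # same bytes
    ]
    for url, body in crawl:
        index.setdefault(page_key(body), url)

    print(list(index.values()))  # → ['http://www.mydomain.com/']
    ```

    This matches the comment’s claim: neither URL is “banned”; one of them simply never becomes the displayed key for the page.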

  104. http://www.voelspriet2.nl/PageRank.pdf
    http://labs.google.com/papers.html
    (I’ve never read it…)
    PR is nothing. Google uses 100 times more algorithms…
    I am not a human, sorry…

  105. Different URLs with similar content, recently published:
    http://www2007.org/papers/paper194.pdf

  106. Jay Griffin

    Matt,

    Calculator raises a question that has been asked on here numerous times but no clear answer has ever been given. Your example of mysolitaire.com obviously points out what you consider to be excessive reciprocal linking and of course this has gotten lot of people talking about what is too much or what is ok.

    The fact is there are almost as many differing opinions as there are people discussing it. No one really knows at this time except Google. Even though mysolitaire.com has been singled out, as of today (May 4th) it does not appear to have been penalized. They are ranking #1 for at least 2 of their most important keywords. Maybe your using them as an example is just YOUR way of warning them before a penalty is invoked? If so, my hat is off to you.

    With that said, on April 15th several dozen real estate agent sites were hand-picked and penalized by Google for reciprocal agent-to-agent linking, and the confirmation(?) that these sites had been penalized was relayed to them by someone other than a Google employee. Yes, these sites all had reciprocal links, but NOTHING even remotely close to the degree of your example. And the links they do (did) have were to other real estate agents, not to unrelated industry sites.

    In order to clear this “penalty” they have been told they must first file a reinclusion request even though none have been excluded from the index. These hand picked sites are all owned by “moms and pops” and some, if not all, are suffering from the effects of a penalty that was implemented with not even a hint of warning “directly” from Google.

    My question is: how can Google invoke a penalty on these sites in light of the fact that for the past 2 years, on numerous occasions, people have been asking on here what Google’s position on this subject is? Several times when this issue has been brought up, URLs of real estate agent sites were provided as examples to help make a determination. To the best of my recollection, no answer was ever given. In addition, some of these same people who participate here also asked you for clarification, and provided examples of whether this kind of linking was acceptable or not, at the same get-together at Pubcon where “Big Daddy” was named. Your answer was something to the effect that you saw nothing wrong with linking to others within your own industry. Who better to link to? Maybe you remember the example given by the photographer who was linking to other photographers and vice versa?

    These penalized agent sites have dropped 30-50 spots from where they did rank and what is sad is the vast majority of sites that have not been penalized and are ranking above them have reciprocal agent to agent links. Why? Because they don’t think they are doing anything wrong. Are they wrong?

    If the answer is yes and your goal for penalizing these sites is to get the ball rolling to “clean up” what you consider excessive reciprocal linking, by YOU stating that here the word will spread like wildfire. The result is it’s likely you will see more undesirable links deleted in one month than have been deleted since the Internet was born. Mission accomplished?

  107. Lori

    Whether Google tells me my links are worthy or not is not an issue for me. They benefit my visitor, plain and simple. No schemes, no three-ways, no paid or bought. A simple exchange between two sites who appreciate each other as a resource, and that is it.

    Google can put that in their pipe and smoke it.

  108. Doug Heil

    Hi Bam, that paper is “someone else’s” algo describing what they might do. I thought you asked what Google does? I answered that question. Your particular question involved the SAME page with different URLs, which means exact content. It’s not “similar”; it’s exact.

  109. Dave (Original)

    It’s about how Google wants to dictate to webmasters what is and isn’t acceptable to Google based on how Google believes they can divine intent.

    They supply guidelines that webmasters can choose to follow, or not. They simply don’t want the tail (webmasters) wagging the dog (Google) at the expense of billions of Google users. Why would they?

    It’s patently clear that the site Matt used as an example was linking to increase PR, manipulate the SERPs, and artificially boost its rankings. All of which is outside the guidelines.

    BTW, I don’t see anywhere Matt has made mention of penalizing. I would guess such sites MAY fool Google for a short period but when their cheating is exposed they probably just slip back to where they should be.

  110. A wonderful piece of information; I have been looking for this indeed. Well, Matt, I am in the search engine optimization business, but I always stick to white-hat techniques. I do know about black-hat tricks though :p.

    Black hat can bring quick results, but at the same time it can send one down to hell.

    One doubt I have here: if someone gets too many reciprocal links, but with a restricted number per page, and only with relevant websites, is that going to count as good?

    Regards
    Dwarika

  111. Guys, there’s no need for all these worries and discussion… Just take a look at Google’s Webmaster Guidelines and your answers are there:

    Number of links?
    - Keep the links on a given page to a reasonable number (fewer than 100).

    Link Exchange?
    - Don’t participate in link schemes designed to increase your site’s ranking or PageRank. In particular, avoid links to web spammers or “bad neighborhoods” on the web, as your own ranking may be affected adversely by those links.

    Site coverage?
    - Submit a Sitemap as part of our Google webmaster tools. Google uses your Sitemap to learn about the structure of your site and to increase our coverage of your webpages.

    General worries?
    - Webmasters who spend their energies upholding the spirit of the basic principles will provide a much better user experience and subsequently enjoy better ranking than those who spend their time looking for loopholes they can exploit.
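    The “fewer than 100 links” rule of thumb from the guidelines is easy to check mechanically. A small sketch using Python’s standard html.parser (the threshold and sample HTML are just for illustration):

    ```python
    from html.parser import HTMLParser

    class LinkCounter(HTMLParser):
        """Counts <a href=...> links as they stream through the parser."""
        def __init__(self):
            super().__init__()
            self.count = 0

        def handle_starttag(self, tag, attrs):
            # Only anchors with an href are outgoing links; <a name="..."> is not.
            if tag == "a" and any(name == "href" for name, _ in attrs):
                self.count += 1

    def link_count(html):
        parser = LinkCounter()
        parser.feed(html)
        return parser.count

    page = '<p><a href="/a">one</a> <a href="/b">two</a> <a name="x">anchor</a></p>'
    print(link_count(page), "links; the guideline suggests keeping this under 100")
    ```

    Running something like this over your own pages is a quick way to spot link-heavy “resource directory” pages before Google does.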

    Regards,
    Giovani Spagnolo

  112. I have a PR3 page that is in the supplemental index; this makes no sense.

  113. Jay Griffin

    Dave (Original),

    It would be nice if Matt would acknowledge here that these sites were penalized.

  114. Doug Heil

    Jay, Matt won’t acknowledge that, because the sites are “not” penalized. Their prior method of implementing bad SEO by exchanging links has simply been discovered, and now those links are ignored by Google. That’s why Dave stated the sites are probably now in the SERPs where they “should” be. Not a penalty at all; Google is only saying that exchanging links is NOT what you or anyone should be doing, as it does you NO good in the long run.

  115. Am I one of the few to applaud the fact that Google has a supplemental index? With Yahoo! or MSN or Ask you only get a vague indication that your website is not performing well. So you’re way down the rankings but that could be because it’s a very competitive field and you’ve just been beaten out by other stronger web pages.

    Not so with Google. They make it really simple for you. Having a small fraction of your web pages in the supplemental index is not a cause for alarm. However if most of your web pages are in the supplemental index, that’s a very clear indication that you’re just not getting it. Only weak websites with little original content end up there.

    Andy Greenberg called me when preparing his Forbes article and I expressed this point of view to him. I guess it didn’t fit the picture that he wanted to present so he ignored what I said. So much for journalistic integrity.

  116. Doug Heil

    Barry wrote;
    “However if most of your web pages are in the supplemental index, that’s a very clear indication that you’re just not getting it. Only weak websites with little original content end up there.”

    Good stuff! Totally “fact”.

  117. Jay Griffin

    Doug,

    Not to step on your toes, but the last time I talked with Matt he did a very good job talking for himself, and that was even after he’d had several Sprites.

    The person who “relayed” that Google had confirmed these sites were penalized is very respected in the industry. Because of the advice he gave, quite a few site owners have made changes to their sites. Why? Because they believe what they were told. For what it’s worth, I know the person and I believe him.

    On the other hand, there are considerably more people like yourself who choose to interpret what is going on based on what they know or think they know, not what Google knows. So in this case, please give Matt or maybe another Google engineer the courtesy of answering the question of whether these sites were penalized or not. A simple yes or no is really all it takes.

  118. Matt,

    As Kirby and Jay indicated, the real estate industry is frightened and concerned because a large number of real estate agent websites have been penalized by Google and no longer rank. Apparently this is because they have reciprocal links with other agents by state.

    I have been a webmaster a very long time and this seems crazy to me. Twice my Realtor brother has referred me to excellent agents in different states, both of whom he has listed in his reciprocal link directory which is organized by state.

    Does Google really think it is excessive for agents to have a directory of agents in other states? This is what the expert advisers to the real estate industry are now saying. This seems like a stricter set of rules specifically targeting the real estate industry, and they are rightfully upset about being held to a different standard than websites from other industries.

    In fairness to Greg Boser, he clarified he thinks it’s OK for agents to have reciprocal links with recommended local service providers like mortgage brokers, home inspectors, and contractors, but not with other agents organized by state.

    See a thoughtful discussion about this topic including some great posts from the WebGuerrilla himself at:
    http://www.realestatewebmasters.com/thread15412.html

    As this discussion shows, the real estate industry is trying to do the right thing to stay within Google’s guidelines. Please help clarify those guidelines so real estate webmasters know what is the right thing to do.

    This is our simple and straightforward question: does Google consider real estate agents having reciprocal links with other agents by state excessive?

  119. Dave (Original)

    Jay, nobody here is preventing Matt from saying anything! After all, it is HIS blog, and we couldn’t stop him even IF we wanted to. So your argument that we are somehow preventing Matt from answering questions makes no sense at all.

    The person that “relayed” that Google had confirmed these sites were penalized is very respected in the industry

    Which are “these sites” you mention and who is this person who is “very respected in the industry” and what industry? You are being very vague.

    Doug, myself and a few others are simply *offering* common sense and logic that fits in with what Matt has said in the past, what is written in the Google guidelines and experience.

  120. Doug Heil

    Hi Jay, Yes, I’m writing what I “think” I know about things. What I may know just might be a thing or two…..

    Common sense usually wins out in the end.

    Without mentioning “who” these so-called respected people are, there is no way we can evaluate what you were told. I think I’m “fairly” respected as well, and I gave you my opinion. Matt can certainly answer you if he chooses to do so, but I can tell you that “reciprocal” links are NOT penalized. BUT: let’s say your site is linking out to bad neighborhoods, etc. If that is the case, your site crosses that nice line and you could in fact get penalized for that.

    There is not something out there set in stone about a site being penalized or not. Google is not going to let us all know exactly how the algo works. All we can do is use our common sense and logic and do the best we can. Don’t you think common sense should rule? I do.

  121. Doug Heil

    bimbba.com

    Hmm. I find it strange that this site filled out the free consultation form on my site last night asking for help. Does anything look strange about that? LOL

    That site is the totally BEST example of a site that is doomed to failure for almost every possible reason I can think of. It’s also a site that, in the long run, will “hurt” any other site that uses them for anything.

  122. I’d like to comment on the Real Estate examples various people have cited:

    1. Yes, I can see why a Real Estate Agent would provide a list of Real Estate Agents in other areas – cos their clients are often likely moving from one area to another.

    BUT – and it’s a big BUT:

    2. I think that we all understand that a true link is a “vote editorially given on merit”

    Someone tell me that a Real Estate Agent who links to 100s or 1000s of others is passing a “vote editorially given on merit”.

    So, let’s apply some common sense and logic to their argument:

    1. It may be that those links carried more weight than they ought to in the past.

    2. It may be that as Google continually refines its algorithms, those links now carry the weight they ought to from an SEO point of view: little or none.

    3. And therefore, such sites that have dropped in rankings have likely simply now got the natural SERP position they deserve.

    4. Rather than being penalised, I suggest it might be more accurate to think in terms of those sites having had some link values neutralised.

    So how does such a site now earn a climb in the rankings?

    Well, they could try creating new quality content: write a blog on local issues, create pages with useful information about the local community, make their sites better than those ranked above them, and the links and SERPs will come again.

  123. Oh, I forgot to add this:

    If I am searching Google for a Real Estate Agent, I would like to be served the sites with the best content and information on that topic, which possibly ain’t those with the most real estate links on a page!

    Think users – think visitors – think great websites.

    Right – I’m off to write some new content – must get me a blog sorted soon!

  124. Jay Griffin

    Dave and Doug,

    What purpose would it serve naming the parties involved?

    Besides I didn’t come here to ask a question so I could get everyone else’s answers and then come to a conclusion. The only answer I am interested in is Matt’s.

  125. Doug Heil

    Hi Jay; Naming what parties involved? and involved with what exactly?

    The domain I posted “emailed” me last night for “help”. I found it extremely funny, as that site is truly bad stuff and is all about Google PR and link exchanges. LOL

    Why did you ask anyway? Is that your site? Did you email me?

    Jay; you posted a question. People WILL answer you. You take the advice given or you don’t take the advice given. The choice is solely your choice to make. If you didn’t want answers,… don’t post questions.

    Matt is on vacation for the whole month of May. Don’t expect him to answer you anytime soon, if at all. For the record though, there is no reason for him to answer, as many posts in here are answering you correctly.

    But again; take the advice or not.

  126. Doug Heil

    BTW Jay; Please read Ray Burn’s posts as well. They are VERY good also. Then; read them again,…. and then again. :)

  127. Doug Heil

    I just read the entire thread at that REW forums.

    I guess that Calculator and Jay in here, and possibly a few others in this thread, are all in the real estate industry, right? I’ve found that agents and those in the industry just don’t get this for some reason. It’s really not hard if you truly sit back and think about the big picture.

    webguerilla in that thread actually made some good posts trying to explain things to you all, but for some reason you were not picking up on his points.

    Let’s be real clear here:

    Google does NOT penalize you if you exchange links with other real estate agents. All your links were giving you a boost in the SERPs before. Now those same links are simply “not” giving you a boost anymore. It’s not hard to understand, right? You can keep on exchanging links with other agents if you wish, and if you think they are helping your visitors, etc. That is very acceptable.

    You get into trouble with Google ONLY if you are linking out to “bad neighborhoods”. You should not be randomly exchanging links with “every” agent who asks you. You need to “research” those agent sites “first” to make sure you understand totally how they are doing business, link-wise and SEO-wise. Unless you know they are CLEAN websites… totally clean… do NOT link out to them. Linking out to a bad site doing bad things CAN get you penalized. Exchanging links with GOOD sites “cannot” get you penalized. Those links are simply not worth as much anymore.

    Think about it: why would Google reward your site just because you can exchange links with someone? They don’t. At least they don’t want to. Your links might have given a boost in the past, but they will not work going forward if Google can help it.

    I hope this is clear.

    Like I said, for some reason agents and that industry just cannot wrap their heads around this link stuff. I’m really not sure why it is so hard to understand.

  128. Dave (Original)

    What purpose would it serve naming the parties involved?

    Not to worry, that’s answer enough for me. It’s one of those situations where if you have to ask, you would never understand.

    Besides I didn’t come here to ask a question so I could get everyone else’s answers and then come to a conclusion. The only answer I am interested in is Matt’s.

    So? How exactly do you think anyone is preventing Matt from answering questions on his own Blog????

    I think this is half your trouble: you don’t read answers from others and come to your own conclusions… with just a tad of common sense applied.

    Myself & Doug are only trying to help you apply some common sense. Nothing more, nothing less.

  129. Jay Griffin

    Dave,

    How do you go from, “The only answer I am interested in is Matt’s,” to “How exactly do you think anyone is preventing Matt from answering questions on his own Blog????”

  130. Matt, you can’t keep telling us that being in the supplemental index is nothing to worry about… it is definitely something to worry about if the majority of your income comes from your website. Going supplemental can mean the difference between having a healthy business one day and not being able to pay your bills the next. This is especially so for the small ‘mom and pop’ type businesses who have no idea about SEO.

    It’s no skin off your nose if your pages go supplemental. What do you care? You are not making your living from your blog. For anyone else who has gone supplemental it is a major problem, so stop telling us we shouldn’t be concerned.

  131. Doug Heil

    Hi Paula, it’s not up to Google to tell each and every website what is wrong with their sites. It’s up to each owner to figure it out and make the necessary adjustments. How they do that is up to the owner. They can either learn this stuff on their own, or they can get others to help them. Trying to rely on a free search engine for your visitors, and then also trying to get that free search engine to tell you exactly what is wrong with each individual site, is just not feasible. I think you understand that. There is no one-size-fits-all in this industry. If you owned a brick-and-mortar store, you would actually try to figure out your own problems with it, right? You wouldn’t be asking the merchant you get free stuff from to actually do the work for you, right?

    Google is doing the best they can do by reaching out to all of you. They cannot give out “personal” advice to each site and wouldn’t want to do so. Your pages could be in supplemental for about one hundred reasons I can think of. Knowing exactly what the reason is, and then fixing it, is what separates owners in the know from those who need to hire others to help them.

  132. Dave (Original)

    How do you go from, “The only answer I am interested in is Matt’s,” to “How exactly do you think anyone is preventing Matt from answering questions on his own Blog????”

    Quite easily. It’s really very simple: just wait until Matt replies. In the interim, *anyone* can post to Matt’s blog, with or without your permission. If you don’t like the answers you are reading, don’t read them.

    Going supplemental can mean the difference between having a healthy business one day and not being able to pay your bills the next.

    “Going supplemental” isn’t THE cause of pages not ranking well. Poor rankings are the direct result of Google not seeing your pages as ‘good enough’ to rank well. The reasons for this are many. If your income depends on it, then you need to understand that you are at the mercy of the same algo as your competition. They can’t ALL be where they want. This is where content is king. So, rather than rely on a few pages, spread your wings so you are getting a good spread of traffic. That is, more content pages, then some more still.

    Your job as a site owner is to make your site so *damn good for users* that others will link to it (vote), without you having to link back. All you NEED to know about SEO is what Google states in its guidelines.

  133. Hey Matt,

    I also agree that people actually engage in these kinds of link-exchange practices. I even found a link network where one site was promoting 25 sites, all with the same format and the same links; everything was identical. My link-exchange newbie got 30 links and was happy… When I checked, I realized that only one of them was worth anything… Overall, he spent his time on those same 25 link sites.

    Are these sites not good? If they are not good, can’t we block them?

    Please do give your comment on this, Matt.

    Regards,
    Preeti Sharma

  134. I don’t think you quite understood what I was getting at, Doug. I understand what Google is doing, and I understand that work is needed when a site goes supplemental. My problem actually relates to Matt telling us that we shouldn’t worry if our pages go supplemental. I’m sorry, but we should worry. It is a concern because, even though Matt says that we aren’t being penalized, when it comes down to it we are being penalized: pages that were ranking really well previously are suddenly nowhere to be seen.

  135. Doug Heil

    Yes Paula; I understand what you are saying clearly. Read my post again, though. You need to figure out “why” the pages are now doing what they are doing. You used to do well with them and now you are not. Try not to analyze what Matt is saying. It’s your site with the problem. Your problem might not have anything to do with links, etc., and may be something totally different wrong with your site. Please understand it’s not one-size-fits-all. You should be concerned, as there is something wrong with your site. Google is not going to tell you what is wrong. I hope you understand.

  136. And by the way… my income doesn’t depend on my online businesses, so if one goes supplemental then I can still eat… but not everyone has that luxury.

  137. I don’t expect Google to tell me what is right or wrong with my site… I understand the problems of going supplemental and ways to fix it, and I am already figuring out ways to get my site out of the supplemental index. Don’t get me wrong, I actually really like Google – it’s a great search engine – but that doesn’t mean I am not going to be concerned when I am making money one day and not making it the next. Why do you think Matt has to continually write blog posts and answer questions about the supplemental index? It is a problem for a lot of us as we run around in circles trying to do the right thing by Google but it never quite works. I just don’t think Google quite understands the impact it has on a webmaster when their site goes supplemental, and I can tell that by the way Matt tells us that it is something we shouldn’t be afraid of. Google does its best, but it has to realise that it is dealing with people, not machines, and we will continue to do what we can to get our sites ranking… and we will continue to worry despite what Matt says.

    (And by the way, the site in my link above is not the site I am concerned about – this is a site I do for the love of it, and I don’t care whether it goes supplemental or not, which is why I don’t mind putting duplicate content on the site when I know it might be of value to my readers, or doing anything else that might send it to the supplemental index.)

  138. “Excessive Reciprocal Links”

    I noticed the same wrong calculations on quite a lot of sites; check http://www.seocompany.ca/pagerank/page-rank-explained.html with its linked PageRank Calculator, etc. And the problem is that we all expect at least 40 iterations in PageRank calculations.

    I believe Google performed up to 5(?) iterations pre-2006, maybe even 3. With BigDaddy, 7(?) iterations (big!), maybe more, and that explains some of the current dancing. To be fair, A->B and B->A should not play any role for similar sites (suppose A has 9 internal links and 1 external link; suppose B has 9 internal links and 1 external link to A); by removing them we can improve performance.

    How many iterations does Google do?
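To make the iteration question concrete, here is a minimal sketch of the textbook PageRank power iteration on a toy four-page graph. Only the published algorithm is shown; Google’s real iteration count, damping factor, and implementation are not public, and the graph and function here are invented for illustration.

```python
# Textbook PageRank power iteration on a toy graph (illustrative only;
# Google's actual implementation and iteration count are not public).

def pagerank(links, damping=0.85, iterations=40):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}
        for p, outs in links.items():
            if outs:
                share = damping * rank[p] / len(outs)
                for q in outs:
                    new[q] += share
            else:  # dangling page: spread its rank evenly over all pages
                for q in pages:
                    new[q] += damping * rank[p] / n
        rank = new
    return rank

# A <-> B is a reciprocal pair; C and D link only to A.
toy = {"A": ["B"], "B": ["A"], "C": ["A"], "D": ["A"]}
print({p: round(r, 3) for p, r in pagerank(toy).items()})
```

On a graph this small the values stabilise long before 40 iterations; the iteration count mainly trades precision for compute, which is why guesses about Google’s exact number are hard to verify from the outside.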

    BigDaddy crawls and decides which pages should be indexed further (it is a “crawl proxy”, not an “indexer”). So it calculates PR, checks the title, description, and URLs, retrieves outlinks, and computes an “exactly the same” signature (a unique key for a page). This caused some unforeseen “supplemental results” for outdated sites (non-unique page titles; no topic definable via anchor-text similarities; etc.).

    Additionally, as with any software upgrade (in this case even architectural changes), BigDaddy has some bugs and constant fixes.

    Can we have access to Google Bug database? ;)

    The query-processing layer changed too: old queries do not return the expected results anymore, and in most cases it is not related to the BigDaddy layer…

    Am I correct?

  139. Hi Doug, this paper IS NOT “someone else’s algo”:
    http://www2007.org/papers/paper194.pdf
    It is linked from main Google Labs page, and it was written by two Googlers from Haifa Engineering Center (Israel) and one person from University of California. It was presented at WWW-2007.
    I agree with you that Google can’t follow scientific papers (did you mean that?); it uses “average estimates” instead, even for PageRank.

  140. Doug Heil- I’m blushing!

    Paula said:

    “It is a problem for a lot of us as we run around in circles trying to do the right thing by Google but it never quite works. I just don’t think Google quite understands the impact it has on a webmaster when their site goes supplemental”

    Paula, a webmaster running around trying to do the “right thing by Google” is bound to fail IMO.

    Try doing the “right thing” for your current and potential visitors!

    Unless you want to suggest that Google makes page 1 around 250 results instead of 10 – to give the poor sites a go?!

    Also, following on from some of my earlier posts I would like to add:

    1. If you link out to all sorts without editorial merit, you increase the risk of linking to a bad “hood” – now that will get you penalised! Perhaps those real estate examples fell foul of this one – link out to loads and it’s bound to happen!

    2. I’m proud of my site (now) but I’ll ’fess up: it was pants at first!

    3. When it was pants and I was naive I linked out/ swapped links with related sites.

    4. I looked at a few of those links again and I got rid of them – if you link out to a “scraper” site when yours is great, it drags yours down to the “lowest common denominator”.

    And to re-confirm – I am not an SEO – I run a “mom and pop” myself

    Please let common sense prevail – again – build great sites please!

  141. It would be nice if Matt would acknowledge here that these sites were penalized

  142. Matt,
    What is confusing about this whole SEO thing is that I am told one thing by you and then go to these WebmasterWorld conferences and see guys like Patrick Gavin pushing link buying. This seems like such a major problem. Guys with the most money can buy the most links.

    I also met an affiliate who was a professor at a university. He told me point blank that he became an affiliate after he found that links from .edu’s are valued so much more highly than normal links. He simply started a simple link-building campaign, beginning with all of the professors he knew in his profession.

    I know Google can’t watch everything, but it disturbs me that there really is no such thing as white-hat SEO. I beg and plead for a single link where these characters are all at the top of the SERPs.

    Mike

  143. WebGuerrilla

    This thread is a great example of why you shouldn’t leave comments on when you go away on vacation.

  144. Denis

    Matt,

    I tried searching the massive amount of comments above but didn’t find this question answered. Could be that it is, in that case I’m sorry.

    Will Google’s policy of not serving its search partners the supplemental index results change? Any partner site that serves a Google search will not get any results from the supplemental index, and that is the worst part about the existence of this add-on index at all, imho.

    The day you start supplying partners with the same data the google.com search uses will be a nice day!

  145. Dave (Original)

    Try doing the “right thing” for your current and potential visitors!

    Good sound and common sense advice!

    Do that, Jay, and you and Google will be on the same page. From that point on you will at LEAST be in the running for page 1 *long term*. Also understand that while your site’s pages may drop, others will rise. In other words, there are always 10 spots filled on page 1, so your statement of “I just don’t think Google quite understands the impact it has on a webmaster when their site goes supplemental” doesn’t hold water.

    BTW, IF your pages had a sudden, permanent big drop, it could indicate that something you are doing WAS being credited and no longer is. Or, put bluntly, you may have been busted!

  146. Doug Heil

    Mike wrote:
    “What is confusing about this whole SEO thing is that I am told one thing by you and then go to these WebmasterWorld conferences and see guys like Patrick Gavin pushing link buying. This seems like such a major problem. Guys with the most money can buy the most links.”

    Why sure, Mike. If you are listening to link mongers, you get link-stuff answers. That’s no surprise. But what you do not understand is that Google is FIGHTING this link-monger mentality, and that is what this is all about. Stop listening to others heavily involved in link crap… because it’s all they do… and listen to/read Matt instead. If you think in “general” terms and use just a tad bit of common sense, you can get it.

    And trust me; don’t think for a second that the people speaking at these conferences know as much as they/you think they know. They don’t.

    Bambarbia; get rid of your PageRank brain. Matter of fact, delete the toolbar if you can’t stand to have the green bar thang turned off. It does you NO good.

    Google; TURN OFF the damn little green bar please. Thanks.

    Bam; thanks. Someone asked about linking to the “root” vs. the file extension for the front page. That paper does not detail exactly what to do and is general. Sites need to pick one way and stick with it throughout the site. Pick the .com or root to link to, as most directories are going to list your URL this way:

    yourdomain.com

    and never link to it this way:

    yourdomain.com/index.html

    Google will choose which one to index for you if you are linking to both of them. You don’t want that.
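The pick-one-form-and-stick-with-it advice can also be enforced mechanically on a site’s own internal links. Below is a hedged sketch – the helper name and the list of “default document” filenames are my assumptions, not anything Google specifies – that collapses /index.html-style URLs to the bare root form:

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical helper: collapse common "default document" filenames so
# internal links always use one canonical form (the root). The filename
# list is an assumption, not a standard.
DEFAULT_DOCS = ("index.html", "index.htm", "index.php", "default.asp")

def canonicalize(url):
    scheme, netloc, path, query, frag = urlsplit(url)
    for doc in DEFAULT_DOCS:
        if path.endswith("/" + doc):
            path = path[: -len(doc)]  # keep the trailing slash
            break
    return urlunsplit((scheme, netloc, path or "/", query, frag))

print(canonicalize("http://yourdomain.com/index.html"))  # → http://yourdomain.com/
```

Pairing this with a server-side 301 redirect that performs the same mapping means Google only ever sees one of the two URLs, so it never has to choose between them for you.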

    Shehap; Please read all the posts as we have covered your question in many of them.

  147. Thanks Doug, I am in a constant loop… PR… I found a lot of articles about “supplemental” and “main” indexes; the subject started even before 2002.

    Site quality should be the main concern; crawlers such as A9, probably Yahoo, and even the famous ;) Baiduspider can’t understand the difference between a 301 and a 404.

    I fully ignore 301/302 for my spider; some websites use session IDs and redirects, and it’s much easier to assign a page the initial URL instead of the final redirected URL such as http://www.zzz.com/;SessionID=AL56RTPSLR…QR41

    A new question just arrived… if Googlebot consistently signs in ;) as “Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)”, why can’t it use cookies and not show session IDs in so many search results?
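The session-ID problem above can also be attacked from the site’s side by canonicalizing URLs before a crawler ever sees them. A rough sketch, with an invented function name and an assumed (non-exhaustive) list of session parameter names:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Strip session-tracking parameters so session IDs never leak into the
# index. The parameter names below are common examples, not an official list.
SESSION_PARAMS = {"sessionid", "sid", "phpsessid", "jsessionid"}

def strip_session_ids(url):
    scheme, netloc, path, query, frag = urlsplit(url)
    if ";" in path:  # path-embedded IDs like /page;jsessionid=ABC123
        path = path.split(";", 1)[0]
    kept = [(k, v) for k, v in parse_qsl(query)
            if k.lower() not in SESSION_PARAMS]
    return urlunsplit((scheme, netloc, path, urlencode(kept), frag))

print(strip_session_ids("http://www.zzz.com/item;jsessionid=AL56?id=7&sid=99"))
# → http://www.zzz.com/item?id=7
```

Redirecting the session variant to the cleaned URL with a 301 keeps one stable URL per page, whether or not the visitor accepts cookies.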

  148. Just in the interests of fair play: I’ve just had a call from Andy Greenberg of Forbes, as I’m sure have others he spoke to prior to his article. He wanted to say that on reflection, it might have been better to include both sides of the debate. I’ll drink to that. :) Anyway, good for him for making such calls.

  149. People have to come to terms that their content might not be very good.

    This type of content usually lacks human interest.

    For instance, it seems like every SEO has a real estate site; they hire writers to build HUGE amounts of content to be found in those areas and are often upset with the results.

    Google is looking for the best, most useful copy algorithmically, which is not an easy task. They also look to let those who follow the current rules benefit, so those who got there via link exchange might get hosed, as they should.

    What I have found is that my most honest writings are rewarded generously, because for those moments of meditation there is no Google. You see, the meek shall inherit the index.
    ;)

  150. Sorry for the poor spwelling, not good when trying to make a point or two. ;)

  151. skipfactor

    >>He wanted to say that on reflection, it might have been better to include both sides of the debate.

    It’s a blog, not 60 Minutes. If he’s good with you posting that, it should be OK for him to post his side of the ‘debate’ here right? ;)

  152. You all know Matt is on vacation this month right?

  153. Jasmine

    HI Matt.

    An alternate theory

    MySolitaire.com knew exactly what they had done, and exactly what they had to do. They’d been penalized for dud links and needed to get some new ones FAST.

    They put their press team on the case and got their situation onto the front page of many syndicated newspaper websites. A bitch about Google bans is also sometimes worthy of a Matt Cutts review, so even more links as a result.

    Now, they have a huge number of one way links, not gained through link exchanges.

    They are not only out of the supplemental index, they are No. 1 for their desired term.

  154. Dave (Original)

    >>He wanted to say that on reflection, it might have been better to include both sides of the debate.

    So he normally states one side in writing, based on a biased webmaster, and reflects in his mind how he should have put both sides forward. Ain’t that a surprise… NOT!

    Even bad journalists make accusations on page 1 and apologize on page 10.

  155. The old version of Froogle was the best example of a black-hat backdoor to the new “fast” crawler with a “limited depth of crawl”.
    eBay is also shown on Google the same way: when I open a link from the SERP, it shows “refreshed” content. And Amazon.
    (Google Products is now robots.txt-protected.)

    Question: can I analyze HTTP referrer headers and show users more related info if they come via an SE? They will like it!

    (I am not talking about cloaking as it is implemented on WMW’s SERPs.)
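The referrer question above is easy to sketch. The snippet below pulls a visitor’s search query out of the Referer header; the host list and parameter names are assumptions reflecting common engines of that era, and many engines no longer pass the query at all:

```python
from urllib.parse import urlsplit, parse_qs

# Extract the search query from a search-engine Referer header, if any.
# Host fragments and parameter names are illustrative assumptions.
SEARCH_HOSTS = ("google.", "search.yahoo.", "bing.")

def query_from_referer(referer):
    parts = urlsplit(referer)
    if not any(h in parts.netloc for h in SEARCH_HOSTS):
        return None  # not a recognized search-engine referral
    params = parse_qs(parts.query)
    for key in ("q", "p", "query"):  # engines differ on the parameter name
        if key in params:
            return params[key][0]
    return None

print(query_from_referer("http://www.google.com/search?q=supplemental+results"))
# → supplemental results
```

Whether highlighting related content for search visitors counts as acceptable personalization or as cloaking depends on whether the page’s substance changes; showing the crawler different content than users see is the part that gets sites in trouble.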

  156. Interesting topic to pick, Matt, but why use a blatant spammer as your example? Without doubt they deserve to languish in Google hell. But what about those legitimate small sites that are also condemned?
    I work with a lot of good small businesses whose websites are buried in the supplemental index. No shady tricks, no link buying/exchanging (I won’t let them), just not much PageRank, mostly because they’re small businesses, quite often serving a local niche, and they get no chance of gaining links, because nobody can see them to link to them in the first place, since they’re in the supplemental index. Catch-22.
    The great flaw in the thinking behind the supplemental index is that popularity is more important than content. Lots of small sites have good-quality and worthwhile content relevant to a niche, but because the pages fall below the popularity radar, that content never sees the light of day in general search.
    Perhaps you might want to comment on why so many genuine content pages from an “honest” site like http://www.europeanlawmonitor.org are in the supplemental index (and you don’t have to mention the calendar pages – they are now restricted by the robots.txt file!)

  157. nippi

    Jerry, perhaps consider that your site is just not set up very well. You’ve got duplicate title and description tags throughout, and all sorts of other issues.

    You need to take a “making sites for Google 101” course; many of the problems with your site are in fact covered on Google’s page for webmasters.

    Further, you can’t rely on people finding the site to get links; your job is to get your site found before the links arrive. Do a press release, email some related businesses/sites about promoting your site, etc.

    SEO is more than waiting for people to decide your site is good and linking to you, and more than just reciprocal link requests; there are other steps in between you can take.

  158. Dave (Original)

    Jerry, pages in the supplemental index can be a symptom of low PR, not the cause. Being in Google’s supplemental index has no effect on PR, according to Google.

    ALL pages are created equal in Google’s eyes. It is what happens from that point on (by choice of the site owner) that causes pages to rise or fall. If/when the site is important and relevant you should have enough PR to spread around as the site owner sees fit. You should let the site owners know up-front that there is NO magic pill to page 1 and there are only 10 places. I.e., infinitely more will fail than succeed in making it to page 1 of the SERPs. Harsh reality.

    Having pages in Google’s supplemental index is NOT a condemnation. AFAIK they are subject to the same ranking algo as those not in the supplemental index.

    The great flaw in the thinking behind the supplemental index is that popularity is more important than content.

    I believe it’s never been a case of simply getting link volume to climb the SERPs, it’s always been quality over quantity.

  159. Hi Jerry,

    If you run a site command on the website you mentioned, I think the problem is that the site “looks” like it has loads of duplicate pages – that is more likely the problem in this example.

  160. Dave (Original)

    It seems like all your pages, Jerry, also have near-identical titles and identical meta descriptions & keywords. I suggest you fix those before pointing the finger of blame.

  161. Doug Heil

    Hi Jerry, I agree with all above. An example:

    Find New EU Directives and Legislation : European Law Monitor – EU Policy Areas

    The first part of your title tag is exactly the above on “every” page. Your description is the exact one on every page. I’ll bet there are many more reasons why the site is not doing well. You need to read and learn about SEs, etc.
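The duplicate-title problem being discussed here is easy to audit mechanically. A rough sketch follows – a real audit would fetch the live pages and use a proper HTML parser, so treat the regex as illustrative only:

```python
import re
from collections import Counter

# Count repeated <title> values across a set of pages to flag the kind
# of sitewide duplication discussed above. Regex parsing is a shortcut.
TITLE_RE = re.compile(r"<title>(.*?)</title>", re.IGNORECASE | re.DOTALL)

def duplicate_titles(pages):
    """pages: dict of URL -> raw HTML. Returns {title: count} for repeats."""
    counts = Counter()
    for html in pages.values():
        m = TITLE_RE.search(html)
        if m:
            counts[m.group(1).strip()] += 1
    return {t: n for t, n in counts.items() if n > 1}

pages = {
    "/a": "<html><title>European Law Monitor</title>...</html>",
    "/b": "<html><title>European Law Monitor</title>...</html>",
    "/c": "<html><title>COM 2007 196 Proposal</title>...</html>",
}
print(duplicate_titles(pages))  # → {'European Law Monitor': 2}
```

The same counting approach works for meta descriptions; any title or description that repeats across many URLs is a candidate for rewriting.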

  162. Hi Ray / Doug / Dave et al

    Thanks for the comments. Lots of things to come back on!

    Ray: “If you run a site command on the website you mentioned, I think the problem is that the site “looks” like it has loads of duplicate pages – that is more likely the problem in this example.”

    I think you are most probably looking at the EU work programme calendar pages – they are nearly identical. This is a known issue, and these have been dealt with using the robots.txt file to prevent them being indexed – not all the old ones have dropped out yet. But anyway, this should only affect these pages, and I would agree that if any pages should be in a supplemental index, it should be these!

    Doug – “The first part of your title tag is exactly the above on “every” page. Your description is the exact one on every page.”

    Um, if you discount the calendar pages, in general, no. The first part is the same on each page because that’s the name of the site/organisation and what it does. The second part of the title is different and unique to each page (the document/section title), and says what that page is about. This is all perfectly right and proper (even if I’d prefer to reverse the order). Let’s look at an example from a page in the supplemental index:

    Title – Find New EU Directives and Legislation : European Law Monitor – COM 2007 196 Proposal on setting a framework for data management in EU fishing sector

    Likewise, the meta description has a default short general description with an appended specific description (the default appears by itself on pages that are article lists like the one you cite):

    Description – Information on the EU: news, law, directives, and policies, Proposal concerning the establishment of a Community framework for the collection, management and use of data in the fisheries sector

    Keywords – eu, european, union, law, legislation, directives, regulations, news, policies, procedures, COM 2007 196, Fisheries, fish, fishermen, fishing

    As you can see from the above, not only are the title/description/keywords distinctive and relevant to the page, but the URL is also content specific, without an excess of parameters. (www.europeanlawmonitor.org/eu-regulations-2007/com-2007-196.html)

    This is the normal form for the majority of content pages on the site. The page content is also highly individual, i.e. not repeated on other pages in the site.

    I will accept that there might be some issues with some content duplicated from other sources – for example, and for obvious reasons, a number (but not all) of the abstracts are drawn from the official draft legislation, but this doesn’t seem to affect which pages are dumped into the supplemental index. Those where the abstract has been uniquely authored by ELM seem equally likely to end up in the supplemental index.

    Which brings us back to the desirability of popularity being a key factor in whether or not individual pages end up in the supplemental index.

    Dave – “ALL pages are created equal in Google’s eyes. It is what happens from that point on (by choice of the site owner) that causes pages to rise or fall. If/when the site is important and relevant you should have enough PR to spread around as the site owner sees fit. You should let the site owners know up-front that there is NO magic pill to page 1 and there are only 10 places. I.e., infinitely more will fail than succeed in making it to page 1 of the SERPs. Harsh reality.” and “AFAIK they are subject to the same ranking algo as those not in the supplemental index.”

    My contention here isn’t about being on page 1 – I know, and so do the site owners, that there is no magic pill, etc. Also, PageRank is not spread around as the site owner sees fit – it’s allocated automatically by Google. The issue is whether popularity is a reasonable filter to be used in place of quality when determining whether a page is consigned to Google Hell. Pages in the supplemental index are not subject to the same ranking algorithm; pages from the supplemental index are only delivered when no pages from the main index are considered sufficiently relevant, something that’s only realistically likely to happen on a specific phrase search or something particularly obscure.

    A search for, say, [ new eu regulations on fishing ] is never going to find this page because it’s in the supplemental index. Sure, you’ll find a lot of information on EU fishery regulation, etc., some of which will be more or less relevant to the searcher, but you won’t find this particular page, despite it being very specifically about a new EU regulation relating to fishing. It’s highly relevant for that search term, but despite this it will never appear in the general search results because G doesn’t think it “popular” enough.

    Let me make plain I’m not having a whinge about this particular site, but expressing my view that the general manner and criteria upon which pages are dumped into the supplemental index seems flawed because it puts too much emphasis on link popularity and not enough on quality, uniqueness and relevancy. I’m picking out one specific instance here – but over the last year I’ve seen many pages of unique and relevant content on clean and entirely “white hat” sites dumped into the supplemental index for no other reason than the site / pages appear to be not “popular” enough.

  163. Hi Jerry,

    Can you please tell me how many links I should follow on your site to reach the page
    /eu-directives-2007/
    (which is in “supplemental”), starting from the homepage?

    It looks like I can never reach it from the homepage…

    Given the info published on so many sites that “BigDaddy is very fast” and “the supplemental index has fewer constraints”, and also the very obvious fact that each crawler has… a depth of crawl(!)… this also explains why very new sites easily appear in the main index, especially with sitemaps provided, then slowly disappear.

    I suspect that BigDaddy can’t easily reach some of your pages.

    I suspect also that a similar title on so many pages is not a problem: there is “repeat the search with omitted results” for this case… I had 1000 pages with the same title in the main index, without any problem.

  164. Hi Bambarbia

    It’s two links from the home page: one to a browse list for the year; clicking the relevant year takes you to that page, which is actually a browse list of the actual legislation.

    There’s no problem with G crawling the pages (rather the opposite!), but it’s more the supplemental index issue.

    I’ve also penned a fairly major response to a few of the comments posted here, but Matt’s probably fallen asleep moderating it!

  165. Dave (Original)

    Jerry, don’t pen replies to the problems we have pointed out, fix em if you want out!

  166. Matt takes some rest before G D Night ;)

    So I am probably wrong about “depth of crawl”; I need to do some more tests.
    Can you confirm that by clicking the link [browse list for the year] and then clicking [the relevant year], I actually land on the page with URL /eu-directives-2007/?

    Funny,
    [G crawl-depth] == [PR].

    Dave, what do you think about the self-refreshing home page of Froogle – is it optimized for the “crawl-depth” rule? (It is currently G Products, with robots.txt restrictions, and G Data; it was black-hat SEO last year, and the GSERP never returned results similar to the cached ones.)

  167. Pages from the supplemental index are crawled too, and cached for a longer time (more than a year). Pages from the main index are crawled more frequently and have a shorter TTL and depth-of-crawl. Orphan pages must go to the supplemental index. Millions of pages submitted via sitemaps may go into supplemental even immediately, if nothing is known about their parent pages. Then, after discovery of “trusted” parent pages, they have a chance to move into the main index.
    It really does not matter: “main” or “supplemental”.
    I’ve noticed many, many people try to find “popular keywords” and be in 1st place for such a specific search. Other people can’t find any relevant info by issuing “Adaptec 4805 SAS Review” – they can’t find a review – and they are forced to issue more specific queries; then data from the supplemental index comes in 1st place.

    The bombed rule “anchor text is the true subject of a page” is still alive, and it must stay alive at least for “in-site constrained crawl” (within the domain only).

    The rule “do not have thousands of reciprocal links”… ;) IS a bug.

    It should be confirmed by Google that they do not trust the “true subject of a page” rule anymore, which caused a lot of bad stuff in the past… and this distrust is not related to “reciprocal links”.

    So, how do you define the subject of a page, and how do you categorize it without human intervention? Using the title?!
    OK, pages with similar titles may be united in the SERP under the same subject/category, but that will never mean the pages are not crawled.
    Sometimes “site:” queries show just 5-7 results for some sites, with a link to “repeat the search with the omitted results included”; this happens… but it is not “supplemental”-related; it happens when Google tries to put many pages under the same “virtual” title.

  168. Thanks Brian White – for some reason I didn’t get the email sent to a reasonable set of email addresses alerting us, nor the instructions. Can you resend that email to the address provided in the blog, or send it to the webmaster email in the footer of daktronics.com? We reviewed the guidelines when we first noticed daktronics was no longer on Google, and corrected the problems. We also submitted a reinclusion request for the website. About a month later the daktronics website was indexed on Google again. Is there any useful information out there that would help us do better, other than the guidelines provided by Google?

    Thanks –

    Brian White Said:

    We sent you an email to a reasonable set of email addresses to alert you. Within this email were instructions on where to go when the problem was fixed:

    https://www.google.com/webmasters/tools/reinclusion?hl=en

    I would see a reinclusion request queued up with my tools, but I don’t see one for daktronics.com as of yet. That’s your next step.

  169. Hi Dave

    “Jerry, don’t pen replies to the problems we have pointed out, fix em if you want out!”

    I’m not actually complaining (about anything, really) about the performance of the site I quoted. It was just an example – there are a lot of proper quality pages on the site, with full and proper titles, meta descriptions, and content, that are in the supplemental index. My as-yet-unpublished reply details one of those pages. I’m quite happy to concede that a lot of pages there probably should be in the supplemental index – all those calendar event pages, for example (now blocked by robots.txt).

    My point in relation to Matt’s post is that since the Big Daddy update, a lot of “real” pages on decent but small sites have dropped into the supplemental index, apparently on the grounds of link popularity, and that this to my mind is undesirable, as those pages may consequently never be accessible to those they are intended to serve.

    I can see things have improved somewhat over the last few months, as some of those pages have returned to the main index, but for me still too many good pages are in the SI. And this particularly affects the small sites and small businesses, who are always going to struggle when it comes to getting links. For example, how many people are ever going to link to a small cleaning contractor serving Essex? “Wow, this is a really interesting page on kitchen cleaning, everyone will want to know about this, I’ll put a link to them…” No, I don’t think so!

    When Matt covers the issue of the SI he focuses on the real bad boys who deserve the SI, but to me that throws a bit of a smoke screen over those good guys who have also been condemned because they don’t have the sort of offering that’s going to attract links. Doesn’t mean that their offering isn’t of interest to people who want a kitchen cleaned!

    Bambarbia – depth of crawl isn’t an issue here – none of the pages are deep, and max is 3 clicks.

  170. Dave (Original)

    Bam, no idea what you are saying but you have major alignment & page width issues with your site.

    I think you should start considering and focusing on your visitors 1st & foremost rather than SEs.

  171. As a neophyte, I am not the sharpest tool in the box, but I can tell when a company has its own agenda. And trust me, there is nothing wrong with that either. Capitalism at its best.

    My concern is that Matt et al. say one thing, and then the data indicates something else. “It” is not “important”, but often “it” is. And then the importance is admitted after the fact.

    OBL used to be worth something. Now they appear to be worthless. If someone asks me to put a link on our jewelry site, what is the benefit to me? I have heard that if the person leaves my site to go to a competitor, then Gozizzle likes me for it. Means I am helping the community (sounds socialistic to me). What about my bottom line and my shareholders? (OK, my wife is the only shareholder, and she can be tough if she does not see a profit. REALLY!)

    So what are we to do? Buy links. No you cannot do that, Gozizzle no likes that either.

    This is my livelihood, as it is for many. Just be honest, give us a level playing field and let market forces (price, quality and free shipping) dictate what happens. How do you go from number 1 on a dozen keywords to off the first two pages? Huh?

    Oh, and have a blessed day!

    Michael

  172. Jana

    Matt,

    Can you please help me out? Almost my entire site has been placed into Google supplemental pages. My domain has been around since 1998, and I have strong sites linking to mine that relate to my type of business, yet almost my entire site has been placed in Google supplemental. What I don’t understand is why all these sites that have come along in 2004 (in the same industry that I am in) are ranking higher than my site, while my site is not ranking at all. I have spent so much money on my site and at this point I don’t know what else to do.

    My site does not even show up when you type my domain name, which is a popular search term. We have been featured on different national and local TV stations, so a lot of people search for our business under the business name, but when you type my domain name into Google, what comes up is all the other sites that came along after mine, having purchased the .org and .net extensions of my business name; some have purchased mybusinessnamecalifornia.com and so on. My whole family depends on this business for a living, and losing almost all our pages to Google supplemental has really hurt my business. Please let me know what I can do to fix this problem and get my site out of Google supplemental, and what is wrong with my site that Google has decided to place almost the entire site into it. Thank you for your time and understanding.

  173. Hi Matt,

    The Supplemental Index is a big issue for most people involved in some way in SEO. I appreciate what you say in that it isn’t a “bad place”, but it is certainly not where you would like your pages to be.

    A quick question… if you have a site with many pages, some of them in the Supplemental Index, is it a good idea to use your robots.txt file to keep the less important pages on the site from being indexed?

    You may have pages within the Supplemental Index that you would like in the main index and also some pages in the main index that you wouldn’t really care about if they were in the other index.

    Would the limiting of pages being indexed improve the chances of the ones you would like out of the Supplemental Index getting to the main index?

    I know this approach would not be a cure for every situation i.e. there may be some important other factors that are causing the problem. But in isolation would this help?
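    For readers weighing the approach in the question above, one detail worth noting: a robots.txt Disallow rule stops crawling rather than indexing, so a blocked URL can still surface in results if other sites link to it; a robots meta noindex tag in a page’s head is the more direct way to keep a crawled page out of the index. A minimal sketch (the paths here are hypothetical, not from any site in this thread):

    ```
    # robots.txt – keep crawlers away from hypothetical low-value sections
    User-agent: *
    Disallow: /calendar/
    Disallow: /print/
    ```

    and, for a page that should still be crawled but not indexed, something like <meta name="robots" content="noindex"> inside that page’s head element.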

  174. Jerry D

    “For example, how many people are ever going to link to a small cleaning contractor serving Essex?”

    By using “Google Maps” that business can be top in its area with ZERO links. The best way to get a micro business to the top – for free. Why does nobody seem to praise Google when they provide such a great opportunity for the tiny businesses -for free?

  175. bethabernathy

    Say, in the case of a real estate site that has set up a page for realtors in other states to leave their website information, they keep the links on the page to under 50, i.e. 50 states. So these are topically related links going back and forth. In the real world this should be fine, as realtors get a commission from a referral. Pretty logical. But, from what I have been reading, it is when the links are not topically related that the devaluing of the page occurs. I have several competing websites in the “Lake Tahoe Real Estate” area that are ranking in the top search engine positions for this exact search phrase. Each of these top sites is participating in reciprocal linking, some topically related and some not. I am not seeing any of these sites dropping off in their positioning. So either it is a manual review or this thread is really a bunch of rubbish.

  176. Dave (Original)

    Or you’ve missed the point, or you’ve NEVER been credited for your links, or your off-topic link exchanges have YET to be neutralized.

  177. Doug Heil

    Hi Beth, yeah, you have not read this thread or Matt’s comments very well. You have missed the point. It’s the same ol’ stuff like anything else. Just because it’s working for them “now” does not mean it will work tomorrow or next week.

    One thing you did not mention is this problem: what if ONE of those sites, all linked up with each other, decides to try some kind of linking scheme/trick with some other neighborhood, and that neighborhood is bad? That would drop each and every site off the map. Also, some of those sites might already be linking out to bad sites and Goog has just not discovered it. And, as Dave said, those links might already be discounted and/or ignored.

    There are great tips in this thread if you can read things clearly and openly.

    Jana, your questions are “impossible” to answer, as things are very much on an individual site basis. I suggest you hire someone to review your site to find the problems. What site owners just don’t seem to get is that what is discussed in this thread may not be their site’s problem whatsoever. Many other issues I have found are much bigger problems and the reasons a site might not be doing well.

  178. bethabernathy

    Hi There – I did get the points both Doug & Dave mentioned. I am just looking forward to the time when my competitors’ spammy link charades are discredited. Google has been talking about evaluating links and link exchanges for several years now; I am just not seeing it happen to date. Maybe next month? BTW hi Doug….

  179. Hi,

    I was wondering why Google seems to penalize people for having duplicate sites with the same content, when sites like bizrate.com, shopzilla.com and bizrate.co.uk all have the same content but don’t come away with a scratch. It’s obvious that this company has the same content packaged in a different way. I’m just wondering why some people get dropped and others stay up on top.

  180. Dave (Original)

    Beth, I really don’t think you do get it. I would say Google has always valued links by certain categories. As time goes on, those categories have likely become many. I would say Google is striving (might be there already) to evaluate links on a link-by-link basis.

    Don’t assume a well ranked site/page, using spammy practices, is being credited for the spam. They are more likely to be there due to their *non-spammy* practices. While we often see spammers on page 1, we DON’T see the many more spammers who are languishing on page x.

    IMO Google is ONLY interested in keeping their SE the most useful and popular SE out there for their USERS. As such they don’t often penalize (strictly speaking) or ban sites/pages.

    Google’s users don’t care IF a relevant page is engaged in spammy practices. They ONLY care if it answers their needs. As such, Google puts its users’ wants and needs WELL before those of Webmasters etc.

    My advice is always to let Google and your competitors worry about their own SE and sites. Webmasters should take a leaf from Google’s book and put THEIR OWN users 1st and foremost. By doing so you are likely to become just what Google wants on page 1 of the SERPs.

  181. bethabernathy

    Hi Dave – Good points. Thanks.

  182. D C

    No question about it, Google is the best search engine at this time, so I spent much of my time reading the Google guidelines and following them to the very detail, but to this day the traffic to my website comes from AdWords (I have already spent a lot). Basically, the important pages that might drive traffic to my site are just supplemental pages. Some of them were not supplemental when they were first indexed, but after a week or two, with no changes or only minor changes, they became supplemental. Google is, I think, so hard on new sites, no matter how relevant your site is. The pages I dedicated to cakes by area all became supplemental, so I requested the URLs be removed. Then when I created the pages specific to wedding cakes and birthday cakes, they became supplemental after a week or two, not even given a chance.

    It’s just hard; I left my job a year ago. I developed the site for close to 7 months. When it was done, I spent five months tweaking the site just to fit the Google guidelines, but nothing really worked for me. Now I am about to run out of financial resources to sustain my family. My only reason for building the site was to make a difference by helping local cake businesses, and to work from home, because I am raising two young kids by myself. I think Google’s duplicate content check is so strict that it may not be realistic for some sites like mine, where the content is based on users’ postings.

    I don’t know now what to do with the site, except leave it that way and start looking for a job after a PRWEB press release.

    It’s good that there are sites like CoolbusinessIdeas.com, Killerstartups.com , businessinspiration.blogspot.com that they will blog about your site even if they don’t know you.

    I am sorry, I am just voicing out my frustration. I am now updating my resume and will start looking for a job before everything will run dry.

    Sometimes I think it just doesn’t make sense: for your relevant site to appear in search engines, your site must already be popular, unless your site is phenomenally cool.

    That’s why I now realize why the Wikipedia founder thought of building a new search engine.

    Thank you for letting me post.

    Good luck to all of you!

  183. Matt… you are a genius. I hope you never visit my site like you did with this one.

  184. Dave (Original)

    It’s just hard; I left my job a year ago. I developed the site for close to 7 months. When it was done, I spent five months tweaking the site just to fit the Google guidelines, but nothing really worked for me.

    That is likely part, or all, of the problem. Read & understand the guidelines, but THEN build a site that is the best in its topic for HUMANS, not Google.

    In a nutshell, if you create pages for humans only, you are VERY unlikely to ever go outside the guidelines.

  185. Hi DC, It appears your site just doesn’t have enough unique content. I read the first sentence on the first page googlebot would come to that is linked at the top on your front page. The first sentence reads:

    “Wedding cakes are a very special part of a wedding.”

    It’s amazing how quite a few sites are using the same or almost the same sentence. I didn’t view any other pages of your site. This is just a very quick view of it and things are easy to spot.

    http://www.google.com/search?hl=en&c2coff=1&rls=GGGL%2CGGGL%3A2006-22%2CGGGL%3Aen&q=Wedding+cakes+are+very+special+part+of+a+wedding&btnG=Search

  186. D C

    Thank you Doug for taking time to read and check one page of my site.
    But I am a bit surprised at how shallow your findings are. The first sentence of my page is not exactly the same as on other sites. I took a look at the sites in your search results, and those words appear in different sentences on other sites, not in just one sentence. For sure, if you grab the first sentence of a page, you will see results in Google, and the popular pages will be displayed first. Is Google so shallow that if the words you use in your first sentence also appear in other web pages, even if not exactly the same, then your page will be penalized as a supplemental page? How many hundreds of billions of pages are indexed by Google, and how many words are there in the dictionary? For sure, if you grab the first sentence of any page, you will get many results.

    Yes, showcasing wedding cakes is not unique, but try to find any site out there where you can find more than 150 wedding cakes complete with contact details; you can hardly find any.

    I even browsed each page of the search results for the keyword “Wedding Cakes”, one by one up to the last page, and nowhere could I find my page. This supplemental thing is crushing my hope of my site being found. All the cake detail pages are marked as supplemental.

    Also the concept of the site is unique. The site lets local cake businesses post their cakes and cakes can be found by zip code and by city. I created the wedding cake page because I think the keyword “wedding cakes” is a popular cake category.

    My point here is this… no matter how relevant or important your site and your page are, Google will not show the page unless it’s already popular through links… the worst thing is when your page is marked as a supplemental page; then your hope is crushed.

    I think Google’s marking the pages as supplemental is too strict and flawed.

  187. D C

    Thank you Dave for your comment, but the site was designed for humans, and machines are not letting it be known. Does it mean the Google guidelines are not designed for humans? Does it mean that the Google guidelines are useless?

  188. I understand fully what the supplemental index is. And perhaps some people are misrepresenting it. I have seen people very worried about their supp results, as if it affected their rankings as a whole.

    Don’t get me wrong. I think the google supplemental index is a good thing. It weeds out junk and worthless pages from ranking for anything that is even remotely worthwhile.

    It does at times catch pages that are good and deserve to be in the regular index. I have seen this; however, it isn’t hell, really. They are very easy to get out, in my experience.

    1. Make sure the meta tags make sense, are not garbled mess, keyword stuffing etc.
    2. Make sure the content on the page is not duplicate, is worthwhile and is more than a few sentences.
    3. Link to it from your home page
    4. Turn your site into a natural linkbait and build quality links to it.

    It never fails. The supp index basically tells me that Google sees the page as worth less than regular results. This tells me I need to change some things. Is that page orphaned? Is it duplicate? etc – They can be easily put back into the regular index if you fix these problems.
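
    To make point 1 of the checklist above concrete, a clean, non-stuffed head section might look something like this (a purely hypothetical example; the site name and wording are invented, not taken from any site in this thread):

    ```html
    <!-- Hypothetical: a plain, descriptive title and meta description,
         no keyword stuffing, matching the page's actual content -->
    <head>
      <title>Wedding Cakes by City and Zip Code | Example Cake Directory</title>
      <meta name="description"
            content="Photos and contact details for local wedding cake bakers,
                     browsable by city and zip code.">
    </head>
    ```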

    I also like your representation of reciprocal links. I TOTALLY agree there. For most of my sites I won’t even reciprocate, I sometimes link to a good site and then they link back, sometimes unknown to both of us until we see our logs or sometimes intentional. That is the only type of link exchange I do. I won’t reciprocate if my link is placed on a links page, what does that say for the value of my site?

    And it’s not because you or Google said no-no. It is common sense. I won’t associate myself with garbage, and I see sites that reciprocate with anything very poorly.

    Well I edited my comment a few times so hope it still makes sense :P

  189. Dave (Original)

    Thank you Dave for your comment, but the site was designed for humans, and machines are not letting it be known. Does it mean the Google guidelines are not designed for humans? Does it mean that the Google guidelines are useless?

    Then IF you cannot better your site, you will need to resign yourself to the fact that your pages do not make the cut in Google’s eyes.

    The Google guidelines are designed for Webmasters and not “useless” IMO. Although as I said, if you build your site for humans 1st and foremost you are VERY unlikely to go outside them.

    You can try and tell us all why your site should be number 1 and justify what you have done/are doing, but it won’t make one scrap of difference. There are only 10 places on page 1 and the vast majority never make the cut. Especially those who cannot act on constructive criticism.

  190. Dave (Original)

    Nice common sense post, Randy. The way in which I see links is this: IF another site’s page(s) are going to help my site visitors, I link out to them. IF they link back, good; if not, no loss. As such I link out to just about every site in the same theme as mine.

  191. D C

    This is what you quoted Dave

    “That is likely part, or all, of the problem. Read & understand the guidelines, but THEN build a site that is the best in its topic for HUMANS, not Google.

    In a nutshell, if you create pages for humans only, you are VERY unlikely to ever go outside the guidelines.”

    The problem with your comment is that right away you’re inclined to the negative side.

    Do you really want to help diagnose the problem, or do you just post for the sake of commenting?

    There is no point in what you said. I clearly specified that I followed the guidelines as laid out by Google, so why would that be the problem?

    Don’t just comment without reading the whole thing. Did I say I want my site to be no. 1 or 10 right away?

    The hope of my site’s pages getting found is crushed because they are in supplemental, nowhere to be found even up to the last page of the results.

    I think my site is being penalized because my site is new and there are fewer links into it. My links are only coming from the Yahoo directory, BTOW, Business.com and some cool sites that blog about my site. Linking to other sites with the same theme didn’t work for me. Other sites clearly stated that if they let me link to my site, their visitors would just switch to my site. I even tried to advertise on one popular site with a not so similar theme, but the following day they refunded my payment, and I was told my site was a threat to them.

    That’s why I am thinking this linking thing is flawed. I think in the end this linking thing will be to Google’s disadvantage. I think in the future, the SE which can better figure out the relevance of sites, not based on links, will emerge as the better SE.

  192. Dave (Original)

    This is what you quoted Dave

    No, they are my words and were not “quoted” by me.

    The problem with your comment is that right away you’re inclined to the negative side.

    Not at all, I’m being optimistic in the face of your pessimistic attitude.

    Do you really want to help diagnose the problem, or do you just post for the sake of commenting?

    If you believe the latter part of your own rhetorical question then it’s no skin off my nose.

    Don’t just comment without reading the whole thing. Did I say I want my site to be no. 1 or 10 right away?

    No to the 1st, and I didn’t say or imply any such thing on the 2nd. On the 1st, I think you should actually READ what Matt has stated here and in the past on supplementals.

    I think my site is being penalized because my site is new and there are fewer links into it.

    No site is penalized for being new and having fewer links (whatever that means). ALL pages in Google are subject to the same algo IMO.

    Other sites clearly stated that if they let me link to my site, their visitors would just switch to my site. I even tried to advertise on one popular site with a not so similar theme, but the following day they refunded my payment, and I was told my site was a threat to them.

    LOL!

    That’s why I am thinking this linking thing is flawed. I think in the end this linking thing will be to Google’s disadvantage. I think in the future, the SE which can better figure out the relevance of sites, not based on links, will emerge as the better SE.

    Yeah ok DC, Google is totally flawed and your site is a victim of their flaws. Perhaps Google will come to their senses soon and have you at number 1 for all related searches.

    Good luck and farewell :)

  193. Hey,

    I just noticed a particular page on my site move from the main index to the supplemental.

    Whilst the page is there for a reason I just had a good look at it and Google got it right – the content is thin. Nothing to do with links – just content!

    I’m off to freshen the content on that page – every page on a site is prime real estate for good content for your visitors.

    That may or may not put it back in the main index – I’m not too fussed either way.

    But it is my responsibility to ensure every page is there for one reason – to give useful content to my visitors.

  194. Dave (Original)

    No Ray, apparently it’s Google’s responsibility to ask each Webmaster whether their site’s pages should be on page 1 and/or part of the supplemental index ;)

  195. D C

    See how you distort the context of my comments with added insulting tone to it.

    Google search is a program and written by humans. I am hoping that by placing some comments here can trigger some good discussions. As I said Google is no doubt the best search engine but what I am bringing is a possible flaw to supplemental index algorithm.

    I was also voicing out my frustration because after one year of work in this site I just got supplemental pages.

    Just tonight, I viewed my site’s listings in Google. I have a certain subtopic with 8 pages, kind of a gallery of pictures, descriptions and contact details. The first page of that gallery, with all the details and the topic, was marked as a supplemental page, but somehow page 4 and page 8 were not marked as supplementals, and the rest of the pages are not even showing. There are no duplicate items on the pages, which show 24 items per page. I really couldn’t explain it; it defies any logic. The first page, which contains more details, is in supplementals, while pages that exist just for pagination are not. Also, all the pages for the detailed items are marked supplemental.

    By the way, I have been in IT for 19 years and I would not bring up anything if it were not truly reasonable. Humans make mistakes in code. No matter how good the people at Google are, they are susceptible to bugs, as any programmer has come to expect, or maybe the code is too strict, with very low tolerance.

    What I am just trying to suggest is that Google should give the benefit of the doubt before placing pages in the supplemental index. Crushing the hope of people who worked hard on something they believe in is really depressing. As I said, I believe my site can make a difference; I would not bother wasting my time on this blog if it were not so. I was hoping that in my own little way I could help small cake businesses (some are home-based) present their cakes online so their masterpieces can be found. Some are even struggling moms and single parents, but they are good at making cakes. They don’t have time to promote their cakes online, nor do they have advertising money… but unfortunately my site is nowhere to be found because of this supplemental thing.

    By the way, Google hell is not an appropriate term for the supplemental index, because of the possibility of getting out of it. I think it’s Google purgatory.

    I applaud Randy for his tips (wish I had more resources to work on my site). I will use them as guidelines if I have time. Keep up the good work. Thank you Doug for looking into the page; maybe just removing that sentence will help in the future. It’s just that that first sentence makes perfect sense for the page and comes naturally to mind.

    Don’t worry Dave, I won’t bother commenting anymore.

    Good Luck to all webmasters who are working so hard! Hopefully your hardwork will pay-off.

    Thank you for letting me comment here!

    Bye!

  196. D C, I can understand that you’re frustrated. Your problem is that, as both Doug and Dave have tried to point out to you, you’re frustrated for all the wrong reasons.

    From this viewpoint, it sounds like you’re pissed at Google for not giving you the traffic you feel you deserve, even though other sites such as killerstartups.com have written about you. Let me ask you a question… what was killerstartups.com going to do for you? They may send you traffic, but from other business owners, not cake purchasers.

    You’re complaining that you’re out of money, but I don’t see a thing on the site that indicates how you’re supposed to make any. I don’t see ads, I don’t see anything for sale, I don’t even see a page indicating how much an enhanced listing would cost. Where’s the money?

    You also don’t appear to be promoting yourself in places where people who would be interested in a cake (say a wedding forum) would be lurking.

    I can appreciate that you’re upset, but your problems seem to go a lot deeper than Google supplementals.

  197. Dave (the original) is right. At the SeoChat forums we get these guys, and I am sure most other internet related forums get them too. If their site is not ranking for every last phrase related to it, Google is flawed, incompetent, and problematic. If they have one page in the supplemental index, life will end.

    When all they have to do is read a little, they could easily figure out what Google wants. The sole purpose of most of these developments at Google, ever since day one, has been to make their results more relevant.

    Again IMHO the supplemental index is a good thing and does a great job.

    QUOTING DC:

    “In a nutshell, if you create pages for humans only, you are VERY unlikely to ever go outside the guidelines.”

    Huh? Well climb out of your miry nutshell, mate, because you have obviously been in the dark too long. A while back I converted most of my sites to natural, human oriented sites (not that they weren’t before – keep reading), and I have not built one link to them since then. I don’t ask for links, I don’t submit to directories, I don’t reciprocate (except in the form mentioned in the previous comment).

    How have I done? In my opinion very well. My sites have NEVER done better and any positions I gain this way hardly ever move a centimeter. Except for being hacked lately I have done very well.

    “Do you really want to help to diagnose the problem or you just post for the sake of commenting?”

    Do you always comment on articles and ask the question that is basically answered in the article? I think the “problem” has been diagnosed fine. Do you need an autographed copy?

    Besides Matt’s article there are many good articles on the supp index that explain it well, AND why you get in and how you can get out. It isn’t rocket science, just plain and simple stuff. If you just did a little reading you could find the answers.

    Again, an almost surefire way to keep your pages all in the main index is to write quality, natural, high-demand articles. I sometimes spend days, whole days, on ONE article. Right now the initial traffic probably wouldn’t seem worth it, but eventually I believe my blog will be used as a massive resource of info. Put time and effort into your site and you can take it places.

    If you don’t put time and effort into your site and treat it as if it is worth something, don’t expect Google to think it is worth much.

    Also, new sites may have supp pages at first, but as they grow in authority and fix any problems with the pages that could cause supp indexing, they will come out. And I have yet to find any site without a supplemental page; there has to be one somewhere but I haven’t found it yet ;-)

    And I need to cut this short. I got work to do :P

  198. I am the first to admit that I am far from a Google SEO expert, but from an outsider’s point of view, there seem to be some ranking rules that are just not appropriate for retail sites, and are probably a big part of why the examples in the Forbes article were all e-tailers.

    Although I understand why a link based ranking system would work for an article based website, when it comes to retail, it seems to be about the worst way to determine ranking. In our case, we have our home page, then category pages, then individual item detail pages. Typically we have found that shoppers are most likely to type in a general brand name first when they are looking for an item (we sell to young women, so this is especially true in our case). Keeping that in mind, a search for a brand name would ideally direct them to the specific category page for that particular brand on our site. However, most of our category pages have recently fallen into the supplemental listings! How exactly do we get them out?

    Category and/or brand pages are the most difficult to get quality links to, yet for the searches we receive, they are usually the most appropriate landing pages :( Why are they difficult to get others to write about and/or link to? Well, someone may write about our site in general, or they may write about a particular product they like and link to that page, but if they are going to write about a brand that we carry, they are most likely going to link directly to the website for that brand and not to our site, even if they found the brand on our site! It is nearly impossible for us to get links directly to the most important pages on our retail site, and it seems that Google has overlooked this issue when it comes to retail sites :(

    It seems strange to me that for some of the Google keyword searches for brands that we carry, there might be 1 other retailer listed on the first page that actually carries the line – then there are a bunch of eBay listings, msn links, shopping comparison sites and such, most of which never return an appropriate response to the search inquiry when you click on the listing. In some cases we are the largest supplier of a particular brand online, but we are in the supplemental pages for some unknown reason, while inappropriate sites are listed on the first page. This just seems very convoluted!!! It’s not a surprise to me that some retail sites resort to black hat tactics when they can’t get ranked on merit alone. I think we need to consider that part of the reason their sales drop so drastically when they lose their ranking isn’t just because of the drop in traffic, but because they were obviously offering what the person was searching for, and isn’t that the whole point of a search engine?

  199. My last comment, but it seems Dave has banned me from commenting here.

    Adam,

    Thanks for taking time to comment!

    I tried cake forums; they would just delete my posting. Anything that mentions a website gets deleted. It’s also time consuming to do linking. I tried this, but big sites will not even bother creating a reciprocal link after 3 months.

    I guess I put too much expectation on Google. I created a website in 2001 and right away I was getting traffic from Google, but I had no time to maintain the site at that time. Things have changed since then. I was unprepared for these supplementals. I just cannot figure out why, after five months of doing the right thing based on the Google guidelines, most pages are still supplemental.

    To tell you honestly, I like Google and I just cannot get over this supplemental thing.

    It’s not just about the money thing but this supplemental thing is crushing my hope. It doesn’t matter to me if it’s not gaining traffic at this moment as long as my pages can cross to the main index.

    Honestly, I want to make a difference. I want to help struggling local cake businesses who have no resources to advertise on the web. These cake designers bake their cakes fresh and it’s their passion. I don’t mind going back to work as long as I have a running site that already has the momentum to eventually help some people succeed. That’s more of a reward for me. I am just frustrated that after a year of effort not much has paid off, because I cannot tweak my pages to please machines and get my pages out of supplemental.

    By the way, it’s more appropriate to call it Google purgatory than Google hell.

    Thank you.

  200. Dave (Original)

    What I cannot comprehend is how, even after Matt has clearly stated:

    As a reminder, supplemental results aren’t something to be afraid of; I’ve got pages from my site in the supplemental results, for example. A complete software rewrite of the infrastructure for supplemental results launched in Summer o’ 2005, and the supplemental results continue to get fresher. Having urls in the supplemental results doesn’t mean that you have some sort of penalty at all; the main determinant of whether a url is in our main web index or in the supplemental index is PageRank. If you used to have pages in our main web index and now they’re in the supplemental results, a good hypothesis is that we might not be counting links to your pages with the same weight as we have in the past. The approach I’d recommend in that case is to use solid white-hat SEO to get high-quality links (e.g. editorially given by other sites on the basis of merit).

    …some still think they are being “penalized” and still think that the sheer volume of links they have should see them excluded from the supplemental index. Links ain’t links, and no 2 links are likely treated the same.

    Come on, this isn’t rocket science from a Webmaster perspective, it’s plain old common sense. Now, who was it that said: “The problem with common sense is it’s not so common”? Perhaps Matt’s Blog has been around longer than we think :)

  201. “supplemental results aren’t something to be afraid of” …Yeah right! So I guess when you have 92% of your retail site’s pages in supplemental results, and your sales drop by 50% because of it, you should lay back and relax??? No worries?!?

    “this isn’t rocket science from a Webmaster perspective, it’s plain old common sense.” …Well, we can’t all be webmasters, and for us small businesses who have to navigate on our own because our sales are rapidly dropping due to pages disappearing into supplemental no-man’s-land, we can’t exactly afford to hire an experienced webmaster either! It becomes a vicious circle – so you can imagine our frustration!

    It may not seem to YOU that we are being punished, but in fact we are. Small businesses are punished for not having the resources to keep up with (or figure out) the ever changing algos, and guidelines secretly imposed by Google. This creates a less than democratic environment, which in turn produces less quality returns on search results.

    Google, and those who defend their logic, seem to forget that many of us are running a business here, and that business is our sole concern (i.e. dealing with customers, buying product, photographing products, pricing, accounting, and all other back end non-SEO related duties). And while their actions may deter spammers, they also hurt those they are supposedly trying to help, and I don’t just mean US the legitimate businesses. I mean the general population who don’t find the page they are truly seeking when they search, because Google doesn’t deem it fit to display, even if it is the most accurate destination for a particular search.

    From an outsider’s perspective, it seems that Google is moving too aggressively in one direction, and is digging a deep and ineffective hole that could turn into an abyss. It may be time to change course and try a different tactic. It seems very odd to me that a company with such enormous resources can’t figure out a better way to return quality results. I am not just speaking from our own concerns either. I am aware of other sites, even our competitors, who do not show up for searches, even when they may be the most qualified result!

  202. Amazing! Thanks a lot guys, very interesting.

    My site does not have any users yet; sorry, I am focused on search engines first.
    I noticed that many pages (97%) disappeared from the index (into supplemental). I suspect it is HTTP-related: they simply expired.

    1. I had a 1-month expiration header for old pages.
    2. I currently have a 2-minute expiration header (after an upgrade in April).

    Pages with a short TTL can’t be placed even in supplemental ;) they are crawled and expire before indexing.

    If I am correct, I can have tens of thousands of Indexed Page (homepage) objects with the same and only child URL. I’ll simply reply to each “If-Modified-Since” request with new content, 200 OK, and a long expiration of 1 year. Let’s put it into supplemental.
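    To make the two expiration policies in this comment concrete, here is a minimal Python sketch of the caching headers a server might attach under each regime. The helper name and TTL values are illustrative only, and whether a crawler actually drops short-TTL pages before indexing (as the commenter suspects) is the commenter’s hypothesis, not documented behavior.

```python
from datetime import datetime, timedelta, timezone
from email.utils import format_datetime

def cache_headers(max_age_seconds, now=None):
    """Build the caching headers a server might send with a page.

    A very short max-age (e.g. 120s) tells caches the page expires
    almost immediately; a very long one marks it as stable.
    """
    now = now or datetime.now(timezone.utc)
    expires = now + timedelta(seconds=max_age_seconds)
    return {
        "Cache-Control": f"max-age={max_age_seconds}",
        "Expires": format_datetime(expires, usegmt=True),
        "Last-Modified": format_datetime(now, usegmt=True),
    }

# The 2-minute TTL the commenter describes after the April upgrade:
short_ttl = cache_headers(120)
# The proposed fix: a 1-year expiration on every response.
long_ttl = cache_headers(365 * 24 * 3600)
```

    The same helper could back an `If-Modified-Since` handler that always answers 200 OK with fresh content and long-lived headers, which is the workaround the comment proposes.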

  203. Brian:
    “…sites like bizrate.com, shopzilla.com and bizrate.co.uk all have the same content but doesn’t come away with a scratch.”

    Google too. Yahoo too. NewEgg too. TigerDirect too. Answers.com too. And Bambarbia Kirkudu tooooo ;)

    I suggest using “unique service” instead of “unique content”.

  204. Hi Rebecca, I feel your pain. I really, really do. But what you need to know is that when a site owner decides they want to build a site or they want to take their “offline” business… online and they need to build a site, what did you do at that point? Did you build the site yourself, or did you hire a “design” firm to build it?

    What you did at that point in time will be with your site until/unless you either learn all of this stuff yourself, or you hire a “quality” firm to redesign your site and help you. It’s really that simple.

    Sites like yours that sell products are actually the very easiest to get good positions for. They really, really are. Information-type sites are the hardest IMO.

    Let me ask you this: how does a 23-year-old wife with two “small” children, ages 2 and 4, work out of her home making “gift baskets” and shipping them all around the country? How did she do that? She certainly knows almost NOTHING about search engines. She had a really “bad” site when she started three years ago, built by a firm with no clue. After a redesign, her positions on ALL phrases are page one across the board and her site is very visitor friendly.

    My point is that there can only be ten spots on page one that sell the products you sell. Many of those are going to be taken by LARGE firms who may have lots of “authority” etc and even have many employees, etc. So the question you have to ask has to be:

    “How can I better my site? How can I compete with the big boys in this market?”

    That baskets site is competing with real BIG players in the market. It’s funny stuff to see sites like Amazon and giftbaskets.com on the first page of results, but then see a site run from her bedroom with two screaming kids in the background ranked in the top 5 on ALL phrases you can think of. LOL

    This is not about links and not really even about content. This is ALL parts of everything that has to do with your site put together. If you know the “how” to do things, you can do them all yourself.

  205. Rebecca; if your site is in your name on here, then I can clearly see why you are having problems right now. For example:

    http://www.shopplasticland.com/fashion/c/Dresses.html

    Do you really, really think you can get a position on the term:

    dresses

    ??

    In fact; do you really want to?

    That is only one problem I see after viewing two pages of your site. There are many, many….. many problems though.

    Another tip;

    Your front page is way too busy, with way too many internal links on it, and way too “all over the place” with its “focus”. You simply cannot target every phrase you want on one single page, nor should you want to. In your market, either your site is built “the best” and you have totally the best content that stands out from the rest, or you stand zero chance with the SEs. You are in a very tough market, and you can bet your bottom dollar other sites in your market have spent big bucks getting where they are now and finding quality people to help them.

  206. Dave (Original)

    Why is it that those who want and need help are hell-bent on defending their position and take constructive criticism personally?
    Why do some believe Google owes them a page 1 listing?
    Why don’t some realize that to be on page 1 you have to take the position from another site?
    Why don’t some realize there are only 10 spots on page 1?
    Why don’t some understand that not everyone can be on page 1?
    Why don’t some realize that 90%+ of site pages will never be seen on page 1?
    Why don’t some understand that Google exists for its users, not Webmasters?
    Why don’t some understand that coming here frustrated and convinced their site is THE site is not helpful to themselves?
    Why don’t some see that they are their own worst enemy?
    Why do some insist on shooting the messenger?
    Why do some have this constant defeatist attitude?

    I guess, as they say, you can lead a horse to water but you can’t make em drink.

  207. Bambarbia Kirkudu is always on first 3 pages ;)

    Dave, I noticed about 99.99% of all SEO related posts are:
    - Find keywords
    - Optimize site, links, titles (using these keywords)

    And all complaints are like this:
    - I was on 3rd place with search for “broogle crigle magic”, what happened, why -30/-950 penalty?

    That’s stupid… One SEO-optimized site of my friend’s, “Sasha Photo”, is always on the 1st page, 1st place. Who will find this? Only those who have “word of mouth” and don’t need to google.

    P.S.
    Recently I was looking for an info on Toyota Corolla 97 & Infinity Cappa. “Google for Users” simply didn’t work (I checked all imaginable SERP&links). I found more useful info using Ask.com & MSN.

  208. Dave (Original)

    “Always” hasn’t happened yet :)

  209. d c

    I should not be commenting again but this is just kinda funny…

    Hey Randy … wake up why are you attacking your friend Dave (orig)

    You said

    “QUOTING DC:

    “In a nutshell, if you create pages for humans only, you are VERY unlikely to ever go outside the guidelines.”

    This was not my statement, it was Dave’s. See, this is the perfect proof that you don’t even read the whole posting. Your instinct is to defend your buddy Dave and Google for whatever reason.

    The only reason I said it was quoted by Dave was because the Google guidelines say “Make pages for users, not for search engines” and Dave’s statement is kind of a version of it. (I have almost memorized these guidelines.)

    ha ha ha! LOL ! I didn’t know Randy can be a comedian… why criticize your buddy Dave’s comments and use my initials instead of Dave’s.

    You guys are fake, you kept defending each one of you but would outright attack people who would criticize Google’s supplemental thing.

    Randy you just lost your credibility.

    Are the two of you paid by Google ?

    You guys don’t listen… you are preprogrammed to comment against real humans.

    I am taking back my comment that I applauded Randy bot.

    Doug is really here to help. Check his comments. He even took time checking the sites and give you some recommendations. He’s human.

    Watch out webmasters, now you know whose comments you can ignore… plainly ignore bots’ comments.

    Just kind of a conspiracy theory… why would Google bother much if so many of our pages go supplemental? Google has AdWords; you will gain real traffic from it… but you make Google richer. By the way, we cannot blame them; it’s their own search engine.

  210. Doug, you’re getting soft in your old age. What did you let Rebecca off so easily for? Where’s Angry Doug? Bring out Angry Doug! Angry Doug is so much more fun!

    Rebecca, here’s why your remarks cannot be taken at face value:

    1) http://www.shopplasticland.com/fashion/c/Links.html . A large number of those links are reciprocal, including all of the blogs. There is no way you can say that all of the reciprocal links are purely coincidental and are purely there for users.

    2) Plasticland has not one, but two blogs (including a Myspace blog). Whether the blogs were created for SEO or online marketing excuses (they’re not good enough to be called reasons), they’re not done with the end user in mind.

    3) http://www.shopplasticland.com/fashion/c/Contact-Us.html Either you have a powerful, feature-rich, robust “CMS” (highly unlikely) or a designer who understands spam implications (the very same designer you claim not to have).

    4) You offer an affiliate program for blogs only (for those who can’t find the page, http://www.shopplasticland.com/fashion/c/Street-Team.html is the page). Why blogs? Why not any other sites? And why a pay-per-post model? Couldn’t have anything to do with the commonly held SEO belief that “blogs are search-engine friendly and rank highly” as if blogs are isolated in some search engine vacuum, could it?

    5) The fact that you made it here and you posted. As popular as Matt is (and deservedly so), his core audience is generally confined to the technically savvy and/or those who participate in the SEO community. No mom and pop site owner is going to stumble upon this blog unless they type in some really weird longtail search query into their engine of choice or they do their homework. Neither scenario is very likely.

    You may or may not have a web designer on board, but someone somewhere on your Plasticland team has just enough knowledge of web design and/or SEO to be really, really dangerous. You’re not here because your site is lily-white and not ranking; you’re here because some Plasticlandianite has pulled off some stupid SEO tricks and you got sent to the proverbial woodshed for a spanking.

    If you’re serious about why you’re here and you legitimately don’t understand what’s going on, then hopefully your next answer will reveal as such, but the smart money says that’s not the case.

  211. Dave (Original)

    DC, if you are as indecisive about what makes a good site as you are about not “commenting anymore”, it’s no wonder you can’t grasp it.

    Your last post shows what I believe are your true colors and says more about you than the people you are hurling personal abuse at. That sort of post doesn’t belong on Matt’s blog and is better suited to a teenage chat room.

    You guys don’t listen…

    Now that IS rich and it’s no surprise you have “run out of financial resources to sustain” your “family”.

  212. Guys, I indexed yooooooou ;););)
    I am very black now, with HTTP-backdoors.
    Dave, thanks, alignments are Ok (theoretically); I need 2 weeks more to recrawl with new algo (and shorter ‘tokens’). Old pages have long unformatted sentences. Constantly in-progress, almost Go:)ogle.
    Need to parse/format product desc., add “cache” correctly, etc.
    Thanks!

  213. Dave (Original)

    Happy to help Bam.

  214. I lost my credibility? Isn’t the first time someone has told me that. Would you like me to add you to the list of people who hate me?
    I don’t base my comments on what others think of me or will say. I base them on research and knowledge. If you use that knowledge and are helped, HOORAY!!! If not, that is your problem. Feel free to find your own way.

    BTW Doug was helpful… what does that have to do with anything?

    I don’t work for Google; I had a chance and lost it. I don’t want to work for Google. And as for me and Matt being buddies: we’re not. He has said stuff before that was inaccurate. When he does, I call him on it. I speak freely based on what I believe and what I have found to be fact. If that agrees with Matt, then fine. If not, I can’t help that.

    DC I just really don’t see what you are trying to accomplish. I didn’t understand half your comment. You have given nothing to this discussion that I have seen.

    And for the record I don’t know who the heck you or Dave are. I agree with whoever is right and I disagree with any wrong.

  215. Matt,

    I have to disagree with this comment you made,

    “And it didn’t look really relevant for users for a diamond ring site to exchange links like this in potentially up to 329 different categories.”

    As an online marketer I like to keep in touch and offer suggestions to my customer base. Just because you search for “Diamond rings” doesn’t mean that in the future you won’t be looking for another service. If I find a site that I like and that offers me a reason to return, I sometimes do look at their links to get ideas for products and/or services I may need or the site may suggest. Much like asking a friend for his/her advice if I am going to buy a car or purchase Viagra online.

    Now say that I, as a site owner find a great site that I think might offer my customers a product that is out of the sphere of products and or services that I offer. So I contact them and we exchange links. With two great sites and years of working to keep the white hat white, why wouldn’t we be able to pass along our authoritative status (page rank) along?

    If I find a diamond ring site and check out their links page, I would be very disappointed to find only links to other diamond ring sites, and I think it is short-sighted to think or to rank sites on that assumption. Taking away, or limiting, the ability to pass along one’s authoritative status will turn links pages into nothing more than a small web directory for a site’s focus and will not let us suggest other sites to our customers.

  216. Dave (Original)

    Now say that I, as a site owner find a great site that I think might offer my customers a product that is out of the sphere of products and or services that I offer. So I contact them and we exchange links. With two great sites and years of working to keep the white hat white, why wouldn’t we be able to pass along our authoritative status (page rank) along?

    I would say because it skews the organic SERPs in favor of the 2 sites, which is likely the only reason such linking is done, not for site visitors. If it is, why care about passing PR?

  217. Dave (Original)

    Randy, I would take losing credibility in DC’s eyes as a compliment :)

  218. D C

    Randy and Dave,

    I apologize if you got offended. Honestly, I don’t want to offend anybody.

    I am impressed by the way Adam was digging into sites, showing the possible weakness of the pages and links. He even opened up very logical questions. Those were very good constructive comments and questions.

    Thank You!

  219. I am impressed by the way Adam was digging into sites, showing the possible weakness of the pages and links. He even opened up very logical questions. Those were very good constructive comments and questions.

    Ockham had a razor. I have a shovel and a wheelbarrow to carry away the bull…doody.

    Mercy buckets for the kind words!

    And for the record I don’t know who the heck you or Dave are.

    I know who Dave is!

    Dave: should I tell him? ;)

  220. Dave (Original)

    No need, I’ll tell. I’m Dave (AKA David). :)

  221. Hi all

    Well I have been reading this post by Matt and the hundreds of comments over and over the past few weeks and I am still not sure the best way to tackle this Supplemental ‘issue’ for my site.

    I have had great success within Google’s normal search results for my web site (click my name above as I don’t want to post it here) over the last 18 months for important keyword combinations. I used to get Google reindexing the site (updating me in the cache) and therefore dropping me for 2 weeks maximum at a time for some keywords every 6-7 weeks, but have never ended up in the Supplemental wilderness before.

    I have now been in there for 4 weeks for my most important keyword combinations (probably 60% of my site), which as everyone else has pointed out equals a major drop in traffic = less income!

    I have many incoming links from respected web sites (The Sunday Times in UK being a latest recommended link in) and do not practice the dodgier SEO techniques but the people who do seem to now be stuck in my old front page positions :-(

    I know no site has a right to be anywhere near the front page in Google results but to be there for 18 months or so and then suddenly not due to being put into the Supplemental area for pages that are different, have different meta data and content is really annoying – any help would be much appreciated!

  222. Dave (Original)

    Andy, not that we can rely on Toolbar PR but it is very low on your home page. No doubt your inner pages are lower still. If you were getting decent rankings and are now not, you could have been getting credited for links you are no longer being credited for.

    Remember, Google only credits true votes (links) and likely not link exchanges (perhaps to a small degree) or any link you can add yourself on another site.

    Low PR is the number one reason pages are in the Supplemental Index.

  223. Hi Dave (original)

    Thanks for the quick feedback.

    My home page used to have a page rank of 5 or 6 (since big daddy update) and since then I have only been linked to by people who have linked to me naturally. When I launched the site over 18 months ago I always tried to steer clear of link exchanges as I know they are harmful in most cases and preferred the natural organic linking that comes from having an interesting and useful web site.

    On that note, Google’s index of who is linking to me is very outdated, both with the link: command and the incoming links within the Google Sitemaps control panel, so I guess it’s a case of waiting for Mr or Mrs G to update both incoming links and subsequently my page rank. I am going to add quite a bit of new content to the site soon, so maybe if I make some changes Google will have a fresh look as it usually does if I make large-ish structural changes.

    Here’s to making some guesses of what’s happened, crossing all limbs for good luck and hoping I fall back into favour with the big G :-)

  224. Sorry Adam, I must be slipping. :) But anyhoo; I only spent about one minute looking over the site, and you found many more problems anyway.

    It is kind of amazing that webmasters/owners come here to criticize and gripe at Google for “perceived” whatevers, but when you look at all the sites that are linked just in this thread, it becomes extremely clear why things are happening to them. The poor sites just need to get “quality” people helping them going into the future and stop listening to the wrong people/places.

  225. Yes; I can hear it alright:

    “So how the ‘H’ do we know exactly WHO to read and listen to????”

    And you have a VERY legitimate question. I cannot answer that question other than to say “Do Your Research” first. It also helps to actually learn lots of stuff as well before researching for who to help you.

    After my last sentence in the above post, I thought I had to write this. :)

  226. Dave (Original)

    IMO who or what you should listen to is very simple. Read the Google guidelines, I mean REALLY read them and read them with an open mind, but above all, with common sense. Also, REALLY read what Matt posts here and cross reference his comments with the Google guidelines. You will find that all his comments make perfect sense when you know the Google guidelines.

    Basically, it’s build yourself a killer site that others link to WITHOUT you requesting they do so. Link out to similar sites etc. and don’t spam them for a link back. Your visitors ARE going to find them anyway, so it may as well be from you. Give your visitors a REASON to come back to your site time and time again (linking out is one way of many). Open your mind and think about humans first and foremost, and consider Google only if you are unsure how they will react.

    IMO, for the most part SEO is basically common sense. If ANYONE suggests chasing links or anything not within the Google guidelines, then run away! Unfortunately the majority of so-called “white hats” are not. Be THE master of your own destiny; don’t give away your power to anyone.

    Google does NOT seek out optimized sites; it seeks the same type of sites its users seek.

  227. Ahhh, but could it not be said that Google does seek out optimized sites, Master David? Google seeks user- optimized sites with a positive Chi.

    We have now entered Search Engine Zen Land.

  228. Bread became 3-4 times expensive in Russia… Huge impact on economics… Bakeries need use AdWords now… ;)

  229. Doug,
    Sorry for the delayed response, I have been spending all my time trying to figure out how to get our site to rank higher. As an independent business owner I certainly don’t have anything better to do with my time, right? (i.e. answer customer questions, accounting, etc…)

    Yes, that is our site, and no, we have no intention of trying to rank #1 (or even #30) for the term dresses. In fact, if you had taken the time to actually read my first post you would see that we are trying to rank for brand names, many of which are 2 word key phrases, and some of which we are the largest supplier of online.

    I’ll tell you why a single mom running a gift basket company fares better in searches – she is a niche market, and she sells gift baskets. Like most fashion boutiques, we don’t just sell one type of item, we sell shoes, clothing, gifts, handbags, cosmetics, etc, etc… from dozens of different brands.

    We built our site for our customer base, not for search engines. That’s what everyone says you’re supposed to do, right? Yeah, right!

    As far as the home page being too busy, I have to laugh at that. You should see our competition (and yes, I am referring to all of the competition that ranks on the 1st page for the same brands that we carry). Our page is very clean and organized compared to some others. Don’t forget that our site is targeted at young women 18-35, not adults who prefer things simple and streamlined. I looked at your page, and it doesn’t seem very focused to me, but that’s probably because I can’t relate to your demographic any more than you can relate to ours.

    We run our business the way our type of business is usually run. We aren’t a niche business, and I don’t think that the internet should be a club that only niche businesses can get into. Maybe that doesn’t fit into the normal guidelines for SEO, but that was the point of my original posting. Not all businesses are the same, and it seems that Google doesn’t always take that into account.

  230. Multi-Worded Adam

    Against my better judgment, I am going to respond to your ridiculous comments. Although, since you obviously didn’t read my other posts, I’m guessing that this will just fall on deaf ears as well.

    1) Our Links Page – Our links page is not meant for SEO. With the exception of the links to our blogs, all of the links on that page are meant to give credit and support to those who have supported us. Each and every one of them is either a friend, or has directly contributed to our business.

    2) Our blogs – Yes, we have two. One was supposed to be an actual blog, but I can’t seem to find the time to actually write in it (I wonder why). The MySpace blog was started as an alternative to signing up for our opt-in email list. Our customers and MySpace friends can view the most recent emails and promotions there without having to sign up for emails. Our customers really like this feature, and that was the purpose.

    3) Are you reading the same blog I am??? Where did I say that we don’t have a designer??? We have a very good designer; I think anyone could tell that from looking at our site.

    4) Our affiliate program – This is also not intended for SEO purposes. If it was, do you really think we would downplay it so much? As hard as it is for you to believe, we actually get pretty good sales conversion from blog postings. Once again READ. We don’t have a pay-per-post program (we do have a small one-time $5 gift certificate for the first post); we pay a commission on sales referred from the blog postings. This is called marketing, and not all marketing directly relates to SEO. Some of it is done just for the sake of…selling products!!!

    5) Your fifth point…Wow…what the hell is going on there? No, unlike you I do not have a crush on Matt, and am not here just to be close to him. I do participate in the SEO world (as much as I can, considering my main job is actually running a business that sells tangible goods), I subscribe to several SEO emails and I read the Forbes article (see top of this string…Google Hell article…Forbes…maybe you’ve heard of it?). You are out of your mind if you think Mom and Pop ventures don’t educate themselves on every aspect of their business. Your comment “No mom and pop site owner is going to stumble upon this blog” is very unintelligent. Do you think big corporations just pop up out of nowhere? Many of them start out as Mom and Pop start-ups and grow to become huge entities. Do you think they achieve that by sticking their head in the sand???

    So, Adam of too many words, it seems that you feel we should just be content to let our site fade off into nothingness. Heaven forbid we run it like a retail business, and offer services meant to help our client base (our MySpace blog); or, when we fail to rank in Google based on merit, resort to good old fashioned guerrilla marketing like getting editorial coverage in blogs to sell our inventory. Or, the worst offense of all, link to sites run by people we actually care about and want to support.

    So glad I decided to post on this blog and put forth a point of view from a small business owner’s perspective. You guys have it all figured out huh? You know this is why Google is under such an avalanche of attack by critics these days. Maybe if they listened a little bit, and actually tried to help the many small businesses who use them, they could provide a better experience for everyone involved…Hear that Matt…Guess not.

  231. Dave (Original)

    Yes, that is our site, and no, we have no intention of trying to rank #1 (or even #30) for the term dresses. In fact, if you had taken the time to actually read my first post you would see that we are trying to rank for brand names, many of which are 2 word key phrases, and some of which we are the largest supplier of online.

    What a rude reply to someone who is trying to help a total stranger for free! Perhaps YOU should have “taken the time to actually read” Doug’s reply, as I don’t believe he did state you are trying to rank at #1.

    Your reply to Adam (again only trying to help total stranger for free) is even worse!

    I hope you languish in “Google Hell” for years to come you rude &%$^%

  232. Matt,
    We are getting ready to launch pages on our site that come from news data feeds … basically AP feeds for sports news, scores, stats, team updates, injuries, etc. I would like to bring these stories into my site to help build its size and content. However, I am told by a couple of SEO people I know that most of those will wind up in supplemental results and will probably also appear as duplicate content, since they are news feeds that go all over the internet.

    In the same discussion, they advised me against even offering the news articles to our users because it would simply decrease the ratio of indexed vs supplemental pages, and that our indexed pages would then be devalued and the rankings would drop very sharply shortly afterwards.

    Is there an ounce of truth to this? We believe it will provide value to our visitors, but if it kills us in the engines, we cannot afford the loss of traffic. Thoughts?

    THANKS!

  233. Thanks, Dave, but I’m okay with her answer. It doesn’t bother me in the slightest. As a matter of fact, it’s almost exactly what I was expecting; the only thing I didn’t expect was for there to be quite as much snark. But hey, if that’s how she needs to say “Adam, you’re right and I can’t even admit it to myself”, I’m cool with that. Some people just have a hard time coming to terms with their own guilt.

    Oh, and my man crush is on Mr. Pregnant (at least he has girl boobs).

  234. Jason Kole

    All this gray area SEO stuff is complete nonsense, in my personal opinion. Gray area is basically something that is not considered bad, right? Or else it wouldn’t be gray.

    Reciprocal linking can’t get your site penalized but reciprocal links can be given less weight. Sound fair? Maybe…

    Giving reciprocal links less weight: I’ve seen, and have been approached by, many sites that are offering to reciprocate links to your site and have supposedly already done so. Or so it seems to an inexperienced webmaster. They link to you from a page that, as far as search engines go, is non-existent. There are no links at all pointing to this page except the link in the email they sent you.
    Does Google consider this situation black hat SEO?
    I would think so. But Google can’t see it, so Google can’t stop it.

    I see sites ranking in the #1 position, and when I analyze their backlinks, to my surprise they are almost all coming from reciprocal linking sites; but when I look for the reciprocal link on the #1 ranking site, there is nothing to be found. What does this suggest? They have a hidden links page that is known only to the sites it links to, so those sites’ automated reciprocal link checking software can find the reciprocating link.

  235. Dave (Original)

    Yeah, I know. I guess after many years of helping total strangers for free we are used to this sort. As they say, every Circus has a clown and Rebecca is only trying to fill the spot until the next clown comes along.

    The irony is, if she put as much effort into her site as she does whining and abusing those trying to help nobody would have to suffer her total ignorance. Gotta laugh at her site though and how she expects to prosper online :)

  236. Hi Dave

    Thanks for the quick feedback.

    My home page used to have a page rank of 5 or 6 (since big daddy update) and since then I have only been linked to by people who have linked to me naturally. When I launched the site over 18 months ago I always tried to steer clear of link exchanges as I know they are harmful in most cases and preferred the natural organic linking that comes from having an interesting and useful web site.

    On that note, Google’s index of who is linking to me is very outdated, both with the link: command and the incoming links within the Google Sitemaps control panel, so I guess it’s a case of waiting for Mr or Mrs G to update both incoming links and subsequently my PageRank. I am going to add quite a bit of new content to the site soon, so maybe if I make some changes Google will have a fresh look, as it usually does if I make large-ish structural changes.

    Here’s to making some guesses of what’s happened, crossing all limbs for good luck and hoping I fall back into favour with the big G :-)

  237. Hi Rebecca; I’m sorry, but you are just not getting it at all. And BTW: your “designer” is “not” good.

    SEO = Website Design

    Please keep that in mind. If a site is not designed well, it’s not ready for search engines. I’m not talking about the “look” or even the layout either. ANYone can build a pretty site, and your site is pretty. The key to SEO is making that pretty site search engine friendly “and” optimized as well.

    You wrote this:

    “Yes, that is our site, and no, we have no intention of trying to rank #1 (or even #30) for the term dresses. In fact, if you had taken the time to actually read my first post you would see that we are trying to rank for brand names, many of which are 2 word key phrases, and some of which we are the largest supplier of online.”

    You are trying to rank for them? NO WAY Rebecca;

    This is your title tag on your “main” folder for “dresses”

    Your Biz Name – dresses

    That’s IT!

    That is why I asked about the word…….dresses.

    If your designer,… and even you don’t know that one of the most important aspects of design and search engines IS the title tag on “each” page of your site, then you “and” your designer have lots and lots to learn. That first category page about dresses is viewed as your most important page on “dresses”. To be targeting such a broad word as dresses is just not good.

    Oh yes Rebecca; your site is nice.

    But trust me; you have many, many problems with the site and need lots of help.

    And trust me again; the same principles of website design apply to a “niche”, as you call it, as they do to your site selling many different products. It’s all the same.

    Tell me how another client who sells ALL kinds of bedding stuff…… a main wholesaler of this stuff…… can rank on many, many different types of terms, if things really depend on the type of site? Oh yes; I could show you countless hundreds of phrases the site is up there for,….. on many, many different types of products.

    To tell you the truth; it’s much easier to do “your” type of site than anything else I know of.

    And Dave and Adam are very correct; please understand we are simply trying our bestest to help you. You are getting very “good” and very “free” advice. Do you know the going rates per hour for phone consultation from someone who knows something? High.

    BTW; I’ve never heard a site selling “gift baskets” called a niche. I think many would agree that market is a tough one. Try expanding that to the term of…. children’s gifts,… and you can bet that many out there would “not” even try, let alone get a good position on that term.

    Think of things this way:

    Each category of “your” site IS a niche. You build the site on that principle, using the “root” of EACH category as the front page, and go from there. Each root is the most important page of that category.

    If you really read that carefully, there is a real good bit of advice in it for you and anyone else reading.

  239. Jason Kole

    I’m not about to read through 200+ comments to find out what is going on here but I’m pretty sure the comment section of the blog wasn’t intended for senseless fights. Although I’m sure he expected people to come here to complain about Google.

    Giving someone advice for their site in a meant-to-be smart @$$ way is pretty ridiculous. Don’t go jumping the gun on me just yet. As I said before, I am not going to read all these posts to find out what is really going on, but it seems to me like you are trying to press it on with her.

    And if her answer is almost exactly what you were expecting, and that is what you are feuding about… maybe you should have never posted in the first place.

  240. Rebecca wrote:

    “You guys have it all figured out huh? You know this is why Google is under such an avalanche of attack by critics these days.”

    erm,… : You must be hanging out at the wrong places, Rebecca, and maybe reading the wrong SEO newsletters? Where I hang out, I see almost zero attacks from critics. Well, check that; I read totally zero attacks on Google. I’d say you are getting your info from people who really don’t know much about what they are doing.

    Also; You stated that Matt should care more about business owners, or something like that. Let me tell you that Matt and Google “do” care, as Google used to be small at one time as well. The thing is, small biz owners are NOT who Google caters to, and they are not what or why or how Google makes up their algo. They actually cater to “real” Google users who are searching on Google. Those real users may be biz owners, or they may not be. The point is that “you” and “your” site are not who Google should or does cater to. None of “our” websites are who Google caters to, nor should they be. It’s amazing you are shooting messengers like Matt who are actually reaching out to you to give you this blog with lots of good advice in it.

  241. Dave (Original)

    On that note, Google’s index of who is linking to me is very outdated, both with the link: command and the incoming links within the Google Sitemaps control panel, so I guess it’s a case of waiting for Mr or Mrs G to update both incoming links and subsequently my PageRank. I am going to add quite a bit of new content to the site soon, so maybe if I make some changes Google will have a fresh look, as it usually does if I make large-ish structural changes.

    The link: command only shows a sample of your inbound links, and the samples it does show may or may not be counted toward PR. The Sitemaps panel can be slow in catching up but should show most, if not all, links. Again, though, which links are helping with PR etc. is best left to common sense.

    Jason Kole, perhaps you SHOULD read before jumping to conclusions. Matt cannot possibly help everyone who comes here “complaining”. Some of us are offering help which should benefit the complainer and anyone else who reads. If they don’t like what they read they can simply ignore it, say thanks, and wait for Matt (who is only human and likely will never answer). There is NEVER a need to shoot the messenger and stoop to the level shown here.

  242. Well Jason, Thanks for that. Too bad you don’t have your site in your profile so we could help you as well. It seems you are not looking for help though as you won’t even take the time to read the thread. Google and Matt created this for people like you to learn from. Not reading it is not in your best interests.

  243. Jason Kole, in this case you would be very well-served to read the original comment. The problem with it is that it isn’t, and never was, what it was purported to be (i.e. a legitimate complaint from a small business owner who is negatively affected despite being as lily white as can be.)

    This is one of the more common problems that Big G seems to have to face: trying to weed through the BS and figure out who’s giving legit feedback and who is merely upset because their site doesn’t rank (anymore).

    I didn’t necessarily comment in response to Rebecca specifically…I commented in case someone out there read her comments and thought they were legitimate.

    A side note: the comment itself was held in the mod queue for at least two or three days, which means Matt had to manually approve it. This doesn’t bother me, either; but if Matt had to approve it and did so, there’s some meat on the bone.

  244. workinghard

    After reading the comments above, I think Dave, Adam, Doug and Randy have some valid points but Rebecca, DC and the rest of the people who complained about supplemental index also have legitimate concerns.

    I think the comments and discussions here in this blog are helpful for webmasters. They may find some tips to make their sites Google-friendly as they work hard to make their sites human-relevant sites. Also, who knows, Google may discover some weaknesses of their supplemental index logic based on the valid comments brought up to this blog.

    If legitimate sites continue to find their relevant pages trapped in the supplemental index after doing the best they could in line with Google’s guidelines, I think it’s better for webmasters to shift direction. Maybe it’s time not to expect too much from Google. Live.com is starting to show very relevant search results. Also, you may integrate your sites with Facebook using their developer framework API. Maybe it’s also time to allot some advertising dollars, as search engines focus more on earning money through displaying ads alongside search results. (Search engines need some revenue too, to survive and prosper.) Maybe it’s time to ask some friends to write articles or blog about your sites. Maybe you can partner and make alliances with legitimate sites and businesses.

    The web is changing, as the search engines are. Web site businesses may now realize that they just cannot depend solely on search engines anymore. Let your creative survival instincts come out, as this may bring your sites and businesses to better prosperity without solely depending on a free search engine that may constantly change. My site is also affected, and I learned a lesson. Instead of depending on Google, I am working hard and finding ways to put my site on better and more solid ground. Who knows, these hardships of small businesses being affected by supplemental indices may be their turning point to discover better ways of survival and growth.

    We also don’t know what these supplemental pages will bring to Google in the future if tons of legitimate and relevant pages are just buried and rated less important.

  245. Hi workinghard! I agree with most of your post. Good post. The thing is that all owners should already know to have multiple sources of getting visitors…. supplemental or not.

    You wrote this:

    “DC and the rest of the people who complained about supplemental index also have legitimate concerns.”

    Oh yes; legitimate concerns for sure, but those concerns have almost nothing to do with Google. It’s not Google’s fault that sites have duplicate title tags, duplicate description tags, duplicate content, or really nasty URLs; I could go on and on. Google cannot teach owners how to build a really good website that is good in all ways…. navigation, structure, architecture, etc. It’s up to owners to hire good people, or up to owners to learn this stuff themselves.

    Yes; sites are concerned, but they simply don’t know how to fix things. Complaining about the concern of supplemental indexing is not going to help them. It’s always been a case of “good” sites’ pages not ranking where they should be, but it’s also been a case of the same pages’ owners not doing the most basic common-sense stuff to fix their problems.

    You can clearly look at “profile” sites in here to clearly see all the problems and why these sites now have “concerns”.

    Let your creative survival instincts come out, as this may bring your sites and businesses to better prosperity without solely depending on a free search engine that may constantly change. My site is also affected, and I learned a lesson. Instead of depending on Google, I am working hard and finding ways to put my site on better and more solid ground.

    Well put. I’m glad you learned a lesson from this (although for your own sake, it’s too bad it was the hard way.)

    I would also agree with most of what you said. There are two things, however, I respectfully disagree with (and I say respectfully because it’s clear that you took the time to think about what you said and craft your words carefully):

    1) The people complaining about the supplemental index having legitimate concerns. There may be some people out there somewhere in the world with pages that are ranked supplemental that don’t deserve to be.

    My blog site has 97 pages in the supplemental index, and to be perfectly frank, they probably deserve to be there. I use WordPress (as does Matt) for my blog and there are some design issues within WordPress which, if left uncorrected, can create situations like this. I have only corrected some of these issues (got some more to go) recently and have seen my supplemental page count drop to 97 from 127.

    This doesn’t mean that the pages won’t be found; I still see traffic coming to those pages from various relevant search queries. They just don’t happen as often as any of us would like.

    Also, see Doug’s comments.

    2) Live isn’t showing relevant results and hasn’t been for some time now. I keep getting found under various drug terms (Viagra, Carisoprodol, Valtrex), and in many cases I’ve never even heard of the drug, nor do I have any pages on my site promoting said drugs. Mind you, Live will (hopefully) continue to evolve and grow and get rid of situations like that.

  247. Dave (original)

    workinghard, nice balanced post. I would like to address the point below though;

    If legitimate sites continue to find their relevant pages trapped in the supplemental index after doing the best they could in line with Google’s guidelines, I think it’s better for webmasters to shift direction.

    The supplemental index is not a trap for non-legitimate sites and non-relevant pages, and it has no direct effect on rankings. If pages are in the supplemental index, it often means the PR of those pages is not high enough, and, among other criteria, it’s likely the PR of those pages is the reason for their rankings. High-quality authoritative sites can quite possibly have x pages in the supplemental index.

    IF a site’s pages were NOT part of Google’s supplemental index and then suddenly one day they ARE, it’s quite likely that Google WAS passing PR for x links to the site and now it isn’t. This should be a red flag for Webmasters to review how they are obtaining links. If they still feel comfortable with the way they obtain links, then it’s just a case of Google neutralizing x inbound links that were being seen as votes.

    You can fool some of Google some of the time, but you can’t fool all of Google all of the time :)

    Such a huge discussion about nothing!
    Many still believe that Google automatically bans websites for a sudden increase in incoming links. I have millions of links to hundreds of other sites, currently “nofollow” protected. Will it hurt other sites if I remove “nofollow”? It won’t.
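    For readers unfamiliar with the attribute: a “nofollow” link is ordinary HTML with one extra attribute value, which asks search engines not to count the link as an endorsement. A minimal illustration (the URL is a placeholder):

```html
<!-- Search engines are asked to pass no endorsement (e.g. PageRank) via this link -->
<a href="http://www.example.com/product" rel="nofollow">Product page</a>
```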

  249. Dave (original)

    Millions of links to hundreds of sites. LOL! What has this to do with the topic?

    About the topic: I can’t publish “in-topic”. What is it about?!! Do you really believe that I am a spammer? LOL!!! I republished it in a blog (and changed the URI). About Google in June 2002.

    The topic is “Google-Hell”, but I don’t use such wordings.

    Why bother with supplemental at all? Why not just index everything in one list and, if Google doesn’t think a page is good, have it way down the search results? This is the bit I don’t understand; there does not seem to be a need for two indexes. But keep writing, please, I am following along.

  252. Great post!

    I also garnered a bit of information from SiteProNews.com’s article “Google’s Supplemental Index – What You Need to Know” (http://www.sitepronews.com/archives/2007/may/23.html). The section explaining how to calculate your supplemental index ratio was extremely helpful. However, it failed to clarify what a favorable ratio would be. I tested one of my larger sites and came up with a ratio of 99% (12,600/12,700). Then I tested the site’s nearest competition and they had a ratio of 30% (1,660/5,670). I understand that 99% is not good, so what would you say is a good ratio?

  253. We tried to deal with the supplemental results problem with priority in our sitemap.xml page, and it hasn’t worked.

    Our site (town-court.com – a directory of traffic courts) has a lot of its “court” pages in supplemental results. When someone searches for some courts (say, Galway Town Court, near Saratoga NY if you care), the search result is the blog page from our site where we reported that Galway Town Court had been added, instead of our page on Galway Town Court.

    I suspect there are other issues that explain this partly, but we thought we could address it by making the blog pages priority 0.1 in our sitemap.xml page, and making our court pages higher (default of 0.9, but we can change that). In the case of Galway (just an example), that court page does have sitemap priority of 0.9. All of our blog pages have priority 0.1.
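    A sitemap fragment matching the priorities described above might look like this sketch (the URL paths are illustrative, not the site’s real ones; note the sitemap protocol treats priority only as a hint relative to other URLs on the same site, not a directive):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Court page: high-priority hint -->
  <url>
    <loc>http://www.town-court.com/courts/galway-town-court</loc>
    <priority>0.9</priority>
  </url>
  <!-- Blog page: low-priority hint -->
  <url>
    <loc>http://www.town-court.com/blog/galway-added</loc>
    <priority>0.1</priority>
  </url>
</urlset>
```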

    Shouldn’t Google respect our choice and put the low priority pages in supplemental results before the high priority pages?

    I do recognize there are other issues. When we first set up the blog, the link to our court pages was different so the blog entry doesn’t link the way we’d like to the court page (it still works, but we have excess baggage). The blog archive pages have many courts on them and so may seem more “unique”. The blog archive pages do generally have higher pagerank too. At the same time, there are virtually zero links to the blog pages (only from the main page to the main blog page) while the site is set up to link to the court pages.

    We can see from our Analytics data that the blog pages are much less helpful to users than the main page, and we are thinking of some remedies (redirecting blog archive pages to the site home page, for example, or adding a larger search box to the blog pages, or even replicating part of the home page content on each blog page), but really we just thought that sitemaps priority was the answer – so far it’s not.

    Any words of wisdom from the Great Matt or his legion of followers?

  254. Tsau
    I understand that link buying is something that Google doesn’t like. But how about affiliate program links like CJ or TradeDoubler? I think those are also paid links; you just pay for them afterwards.

  255. Dave (original) wrote: “Millions of links to hundreds of sites. LOL! What has this to do with the topic?”
    To be correct, about 2 million links to 2 million unique product pages on 120 large internet stores. Google counts even links marked with rel=”nofollow”. Such links have a boost factor in configuration; it could have been 0.1 last year and could be 9.99 today. Webmaster Tools clearly shows links to Tokenizer from this “Google-Hell” page, even links outside of the top-100 mark.

  256. Matt,

    Your article is really interesting, but what about links that you don’t want and start to have just because unreliable directories spring up and put your web site inside them?
    I have watched my web site slowly lose the positions I gained over time. My web site doesn’t offer duplicate content, but all my pages are marked as supplemental, and my links have suddenly disappeared (just 2 against the 282 available in Webmaster Tools).
    What about this? How can I get back my old positioning?

    How can I report spam for those web sites I get in touch with that refuse to remove my link? I have a link that has been there since 2001 …

    Your private answer is appreciated too.

  257. Jim

    Matt:
    Our site is a retail site listing 28,000+ items. We have, over the years, had excellent Google listings with high search results. We submitted our listings through Froogle and still continue to upload the files even though Froogle has gone away. 50% of our listings have now gone into supplemental results. It has destroyed retail sales from our site. Why has this occurred and how can we fix it?
    Please help
    Thank you
    Jim

  258. Thanks Matt…
    It’s a really good post. My blog was in the Supplemental Index, and your tips really helped me…

  259. Supplemental flaw

    Supplemental Google algorithm is flawed.

    I worked hard to get these 2 pages out of the supplemental index. After 2 months of being out of supplemental and gaining a little traffic from Google, these pages went back to supplemental. These pages are loved by visitors because they are informative, with good-quality pictures, and pictures speak a thousand words… but to Googlebot that means nothing.

    Fortunately, these pages are getting traffic from search engines that are starting to give relevant results such as live.com and yahoo.com.

    Google just removed the word “supplemental” from the results, but it doesn’t make any difference, since they still send tons of your pages to the supplemental index.

    Search is evolving but I think Google’s search is evolving in the wrong direction.

  260. When searching Google, there are some results that come back with one or two columns of extra site links. Obviously, this gives the user much more information and creates a higher percentage chance of a user clicking through.

    What are these extra links called and how do you go about getting your site listed with these extra links?

    Note: only the top result ever has these extra category links
    I’m not talking about Local Results.
    There used to be one column and now I’m seeing 2 columns.

    How much of a ranking impact can a site see from switching web hosting servers?
    This would be from one company to a new company. We have dedicated IP addresses assigned, but of course will be getting a new dedicated IP address as well.
    We’re concerned that DNS changes could hurt ranking positions.

    I’ve already been in Google hell and I don’t wanna go back! ;)
    -Jack @ How to become a police officer

    I have a question about site migration. We were migrating our content (~1K pages) from domain A on hosting X to domain B on hosting Y. After the migration and a 2-month recrawling wait, we have less than 30% of the site indexed; the rest went to the supplemental index. All pages have 301 Moved Permanently redirects. Thanks!
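    For reference, a catch-all 301 from domain A to domain B (assuming an Apache host with mod_rewrite; the domain names are placeholders) can be sketched as:

```apache
# .htaccess on old domain A: permanently redirect every path to domain B
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?domain-a\.example$ [NC]
RewriteRule ^(.*)$ http://www.domain-b.example/$1 [R=301,L]
```

    Per-page Moved Permanently redirects like these preserve the mapping between old and new URLs while the new domain is recrawled.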

  264. I wish the guy said “supplemental hell” instead of “Google Hell” (never heard that before), then I’d get some traffic to my site for a change :D

  265. Hi Matt:

    I have been experiencing a strange problem lately. Since Dec 2010, some ‘loved one(s)’ have somehow been constantly placing malicious code on my website, even though the page permissions don’t allow writing to pages. What they do is put a link to some bad or banned website, and as a result I receive an email from Google that ‘suspicious activity’ was found on my website. I am thankful for that gesture, but since it has happened quite a few times (and I always have to upload my whole website again to my web hosting account on Apthosts), my inner pages have now become supplemental pages and have a PR of ‘n/a’ where they used to be PR5.

    I am not really crazy about PR, but yes, I need more exposure for my website in the right places and niches, so I try whatever-hat efforts, but not so-called black or gray hat tactics, to promote my website. Still, I am kind of wondering what my fault is when my website gets hacked or intruded upon. I am already troubled and tense about getting things back, but seeing my pages knocked down from PR5 to ‘n/a’ made me curious enough to do some research, and apparently I landed here with my curiosity. Can you help me understand that?
