Gone Supplemental

Some site owners over at WebmasterWorld have been discussing an issue where, on the Bigdaddy data centers, their sites weren't being crawled as much for the main index. That resulted in Google showing more pages from the supplemental results for those sites. GoogleGuy requested feedback with concrete details, and several people responded with enough detail that we identified and changed a threshold in Bigdaddy to crawl more pages from those sites.

I checked in that email queue tonight to see how the “gonesupplemental” feedback looked. I looked at an emergency responder site, a truck site, a ticket site, a karate site, a silver site, a T-shirt site, a site about memory, a site selling a type of document, a boating site, and a jewelry site. All were getting more pages crawled, and I expect over time that we’ll crawl more pages from these sites and similar sites that people mentioned. The biggest site that I saw had 711K pages reported, and I saw other sites with 40,400 estimated pages and 52,700 estimated pages for a site: search.

So the upshot is that if you’re one of these people who was paying attention to this issue, I think it has already improved quite a bit, and I would expect to see more pages indexed in the coming week or two. Some sites may see improvements earlier than others because of where a site happens to be in Google’s crawl cycle.

127 Responses to Gone Supplemental

  1. Thanks for the update, Matt. Webmasters and business owners are going haywire over this. I don't blame them. Lots of bling being lost, that is for sure… errr… rather, falling into competitor hands.

     Anyway, it sounds like we should all hold tight and hope for the best. Just keep letting us know about progress so we all won't be pulling our hair out.

  2. Hi Matt,

     Many thanks for the statement!
     We noticed that the MozillaBot's activity is about 50% higher than normal (about 50k pages a day). But Googlebot 2.1 is crawling only 1,000 pages a day – is this a penalty?

     We and other site owners in supplemental hell don't see the crawled pages in the BD data centers – the newest cache dates are from July 2005 or earlier. That's the worst part of supplemental hell.

    Thanks for response,
    Greets from Germany

  3. Hello Matt,

     Do you have hotel consolidator sites in mind?

    We hope you won’t forget us. 🙂

  4. I’d just like to say thanks for the speed in which this was solved (well for my site in any case 🙂 )

    Matt, regarding this comment:

     [quote]we identified and changed a threshold in Bigdaddy[/quote]

     It would be interesting to understand what triggered this in the first place, and whether it's something the site owners have mistakenly done with their sites.

  5. Hi Matt,

    I have these questions:

    Which Bot is relevant for the Bigdaddy data centers: The Googlebot 2.1 or the Mozilla Bot?

     Is it normal that it is predominantly the Mozilla Bot that is active?

     Why are all pages supplemental (except the main page), even though the Mozilla Bot has crawled approx. 50,000 pages per day for many weeks?

  6. Matt,

     I have no problem with the crawling, but neither bot is adding the crawled pages to the index.

     Neither the 2.1 bot nor the Mozilla bot.

     I have an online Google spider tracking function which shows just Google visits, so it's really easy for me to track. Incidentally, this can be made available if you need to see what I am talking about.

    Many thanks,

    BB

  7. Matt,

     First of all, thank you for responding quickly to my and others' requests for a separate thread on "Gone Supplemental".

     Our 430-page PR6 UK financial site is still "homepage only + supplementals" even though it gets crawled every day by Googlebot Mozilla and regular Googlebot.

     Why is it that our site and others affected are being crawled daily, but the pages have not yet returned to the index?

     As a reminder, the site was perfectly indexed by Big Daddy before the problem hit, so it's really odd that the pages have not returned.

    Hope somebody can look into this further as I am keeping other folks at WMW updated on progress.

    Many thanks

    Gary (Ellio)

  8. Hi Matt,
     I can't believe this. The other, older problems still haven't been solved.

    Regards

  9. Hi Matt,

    Seems to me that neither you, nor Google, have yet to grasp the scale of the current problem. It’s understandable in a way, because there is so much noise, conflicting information and so on, on the forums.

    This is categorically not a crawling problem. The crawlers are crawling like crazy, it’s just that for certain sites (no one has yet identified what they have in common) the pages crawled are not making it into the index. The problem seems to be restricted to BD. We have not yet seen one page make it into a BD index, despite being crawled multiple times. The non-BD indexes, which are getting harder and harder to find, are all functioning normally, with the new pages being added after being crawled.

    In short, Big Daddy has a big problem. From what I’ve read it seems Google are absolutely unaware of this, or else keeping it very quiet. How this has remained out of the Press so far is beyond me.

  10. Our site has been redesigned: http://www.printe-z.com. If I do a search for site:www.printe-z.com, most URLs are from our old site, marked as supplemental results. Google is hardly indexing our new site – could you do something about it? We worked very hard on optimizing our new site and I would appreciate it if you could help us in this matter.

  11. I'm having this same problem. One of my sites had about 600-700 pages indexed, and now it's down to 100 for about 2 weeks, with most of them being supplemental.

    GoogleBot has been hitting my site daily but none of the content it’s hitting is showing up in the database.

  12. I have a site with 4 sub domains including www.

    All of them have active content…all of them are supp and I do not see any return at all nor is google heavily crawling them. We are talking about 300k+ pages in the supp index for some reason.

  13. I am very grateful that my sites returned after only a short timeout in supplemental. But it is sad that many sites have now been hit for coming up on a month.

     One month with severely reduced traffic and income. I help many sites become better, especially forums, and it is sad when big sites that are well SEO'ed get so little traffic because Google goes bung for so long.

     At this moment I have got an IM from a site owner whose site shows 26k pages in some DCs, and 128k in two DCs. Not supplementals, just bad crawling. Just had a PM from another person whose site is still being kicked around badly.

     To whoever is responsible for so much, much responsibility falls. One month is a long time, Matt. And on 5 March on WMW, "a week or so to get sorted" was true for my site – why not for many more? Rather long crawl cycles.

  14. My site is down from around 18,000 pages (non BD) to around 400 pages (BD). It looks like BD was kick-started from a pre-December 2005 index and that since then, none of the 90,000-odd crawls have made it into the BD index.

    That is, BD has a fundamental bug that is preventing certain pages from ever making it into the index. Google, from all outward signs remain entirely unaware of this bug and are, therefore, extremely unlikely to track it down and fix it.

     What do we have to do to get someone at Google to sit up and take notice? Has anyone at Google counted the number of pages in BD versus the number of pages in non-BD, for example? If so, has it occurred to anyone that the huge drop (no doubt numbering in the billions of dropped pages) is down to a bug? If not, why not?

  15. Matt,
    Thanks for taking time to post this thread. It would be nice to get a debrief once everything settles down with Big Daddy. Hopefully then you can describe what Supplementals are all about.
    Like others here, several of our sites were de-indexed for a short period of time under Big Daddy, but returned right when GoogleGuy had predicted, mid last week.
    It has been a roller coaster ride for many of us since mid last year. I went back into WMW archives to read some of the old posts from that time period. It was like deja vu, comparing some of the recent experiences of webmasters over the last few weeks.
    Hopefully Big Daddy “smooths” out the ride in the future.

  16. Matt;

     I am not sure if you checked our site from the "supplementalresults" queue. But again, only our home page is not supplemental and everything else is. Although I have seen Googlebot come in many times, the pages are NOT returning to the normal index.

  17. Sorry, I don't think that the pages in "supplemental hell" have a low-crawling problem…….

  18. Is there anything being done about the forums that are getting dropped? All my main index pages were dropped in favor of my archives. Not good at all for me.

  19. Matt,

    Thanks for the update Matt.

    If a page has gone supplemental will it ever be moved back into the main index?

    -jay

  20. OK, so more pages are getting crawled, but why did millions go supplemental in the first place? What is the glitch, and has it been eliminated? What's up with the supplemental index anyway? Google should fully index sites or delete them, not throw them into some cesspool assuring they will not be accessed by searchers. Matt's thread is appreciated, but it is half a loaf.

  21. wow, after hijacking every thread for the last 2 weeks to talk about supplementals, the supplementalists are surprisingly quiet in this one. I expected the comments to fill up exponentially.

     Hmm, supplementalists… I like that word.

  22. Thank you Matt for taking care of supplemental results. We have a new redesigned site http://www.printe-z.com. If I do a search for site:www.printe-z.com most of the listings that appear are urls from our old site marked as supplemental results. Google is indexing only our home page of our new site. We worked hard to optimize our redesigned site and the supplemental results is killing the whole thing. PLEASE HELP! THANK YOU!

  23. All three of my main sites have the same problem on Big Daddy. Hugely less pages indexed but the pages that ARE indexed rank much better! What is going on??

    Just wish the deeper crawling would come back. Have been using Sitemaps from the start but even this isn’t helping now.

  24. >supplementalists”
    It does have a nice ring to it.

     As one of the silent supplementalists, I just checked and was pleasantly surprised to see all my supplemental pages no longer supplemental and now showing their previous TB PR.

    Very cool. Thanks.

  25. Ryan,

     Your post does not leave me with a good impression of you. You are acting a little smug about a problem that obviously doesn't affect you, so you try to trivialize it and those it has hurt. At last check, there were over 100 pages of comments at WMW. GoogleGuy jumped in and asked for feedback there. Matt has been involved for some time now.

     For webmasters affected by this, it is not a trivial matter. Getting feedback (and obviously a fix) from Google is probably the only thing these webmasters are looking for.

  26. Steve, the Mozilla Bot is what fetches pages for the Bigdaddy data centers.

  27. pgaz, I’m not trying to minimize that this affects people. But some of this happens in the crawl/index cycle and I can’t force that to run differently. T2DMan, that was the best estimate I would have made at the time. There were some things about the Bigdaddy crawl/index cycle that I wasn’t aware of that made it take longer. I was in a meeting yesterday and re-emphasized that I thought it was important to get more pages from those sites crawled as soon as we could, because I know that this is stressful for the webmasters who were affected.


  28. @Matt

    Thank you for your answer.

     But my site has had 50k Mozilla-Bot visits every day for weeks (!) – and still everything is supplemental except the main page!

    I believe that this is not a crawling problem.

  29. Hi Matt,

    thanks for reply!
     But we supplementalists must underline that _NONE_ of the crawled pages are making it into the BD data centers – will that be "debugged" by this new threshold?

     Greetings from DE,
    Markus

  30. Matt

     So was this problem webmaster-related, or on Google's side?

    Just want to ensure that us webmasters are not doing anything that is getting us in trouble with you guys.

     My site was in this supp thingy until the WMW GG fixes – but I want to ensure I don't do anything to get me there again.

    Thanks for your time and continued feedback

  31. Same situation here. 51,389 visits. I have half that indexed with 99% being supplemental.

  32. Same here Matt. Our site has been getting around 100k Mozilla-Bot visits a week for the last several weeks. But not one of the crawled pages has made it into the BD indexes.

     So this definitely is not a problem with the frequency of being crawled. It looks instead like a fundamental problem with the Mozilla crawler, whereby it rejects perfectly good pages (by the thousands) and simply discards them. I am 100% certain that there is a serious bug somewhere, either in the Mozilla-bot or in the BD indexes.

    Is there anything we can do to get someone to take this seriously?

  33. Steve,

    We have also been crawled every day for the last 2-3 weeks by Mozilla Googlebot but not a single page has returned yet.

     That's why it's hard to believe it's "just" a crawl problem.

    Could there be a little more to it Matt?

  34. Matt, in case you were not aware, my thread at WebProWorld on the subject is now over ten pages – one of the largest I have seen for ages, and that's in just a week or so:

    http://www.webproworld.com/viewtopic.php?t=61582

     Further evidence (if you need it) of how many site owners and webmasters are still affected.

  35. Pgaz.. I understand it’s a problem, and I agree that you want a solution.

    To be frank, it DOES affect one of my sites, but Matt said they were working on it, so I believed him and shut up about it. Guess what? It’s fixed.

     Besides, my site(s) do well enough that they don't need to rely solely on Google for traffic. I have one site that doesn't show up anywhere above page 3 in Google, but still has an Alexa traffic rank below 300,000. It's called not putting all your eggs in one basket.

     The world would be a better place if webmasters spent time trying to make their sites more helpful to users, and tweaking copy and layout to increase conversions, instead of worrying so much about SEO.

     Want to know the big secret to not having any search engine problems? It's got 3 steps: 1) Make a site that people find useful and want to share with others. 2) Write clean, W3C-valid code that follows the webmaster guidelines. 3) Wait a few months. That's all there is to it. It's really that simple. It has nothing to do with nofollow tags, supplementals, PageRank, or meta tags…

    When I came in to discuss a Microsoft release, I didn’t expect to see 20 posts relating to supplementals. There’s a time and a place for it… That’s all I was saying.

     You don't go into a physics lecture and start asking questions about Freud's views on oral fixation… the same etiquette applies to the web.

  36. Ryan,

    Thanks for your post and I apologize for being a little thin skinned here. As I mentioned earlier, our sites fully recovered last week to pre BD levels. But I still very much sympathize with what others are going through and understand their frustrations.

    Blogs are sometimes very informative and other times rumors and totally bad information gets passed around. Thus my request from Matt to put on another “training” session on Supplementals like he did with canonicalization.

  37. Ryan,

     Everything you said is what most webmasters would agree with, but you seem to be missing the point that they have performed your 3 steps, have waited, and have ranked well, only to see it completely pulled out from under them and replaced with results that are over a year old. So what good are conversion rates if your traffic and ranking are based on year-old data?

     Many of these people have put in the time and effort to do everything possible within the rules, only to have it pulled away with what seems like a snap of the fingers. To have a year's worth of work just nullified is pretty frustrating, and not everyone has the luxury of owning multiple sites in the niche in which they specialize.

     There is a place for etiquette, but when this many people are affected this much, and the answer we get up until this point is "We think we know what it is and you SHOULD be seeing some improved results", sometimes it gets a bit frustrating for those honest webmasters working hard to do everything by big G's guidelines. In that event, put your etiquette on hold and have a little sympathy for the people being affected by what seems a pretty big "glitch". We don't need lectures and analogies about physics class.

    In my case its pretty simple.

    Site A in 2004 was awful.

    Owner of Site A worked very hard in 2005 to improve.

     End of 2005, Site A was doing much, much better due to many hours of work.

    Beginning of 2006 site is awful again because Site A is being judged on 2004 results.

    Owner of site A becomes very frustrated.

     You want to ease that frustration? Communicate. This blog is a step in the right direction and we are thankful for that, but we don't need lectures on etiquette. Save it.

  38. I believe that this is not a crawling problem. My site has been supplemental for over a year – before BigDaddy was impregnated, so to speak. Somebody at Google knows what they are doing to websites re the supplemental index, and they ain't fessing up.

  39. @Matt: you said: “Steve, the Mozilla Bot is what fetches pages for the Bigdaddy data centers.”

    But why does the “normal” G-Bot still crawl daily about 50% of my website which is sitting in “supplemental hell”?

    Additionally I get Mozilla G-Bot hits but not as many as “normal” G-Bot hits. My site has been deepcrawled (completely) by the normal G-Bot twice the last two days.

    Any comment on that? What can I expect?

  40. I have the same problem: the Mozilla Bot has been *very active* for many weeks – but still "supplemental hell" (only the main page is normal).

  41. Not sure why my previous comment was deleted….

     My site is supplemental as well. My www. subdomain has only 2 pages showing… I then have 3 different subdomains (i.e. sub1.domain.com, sub2.domain.com, sub3.domain.com), using the same domain name, that are all supplemental. They total about 300k worth of pages. A lot of it probably deserves to be supplemental, but then where is all my normal content?

    Matt, I know you said we have to wait for our sites to be recrawled, but what I find strange is my logs show GoogleBot hitting my page many thousands of times since Mar 2 and yet my pages are still not in the index. I would think some of my pages at least would be back in there at this point, 20+ days later. Could this be the googlebot devoted to the supp index or is that just a rumor?

  42. Hi Matt,

    Thanks so much for staying on this subject!! And I knew it would have to be you that would actually go into that email queue to look at sites.

    We have seen some of our pages get crawled and return to the index. However, unlike most sites that were left with at least a homepage in the index, our homepage was taken out of the index and out of the supplementals. However, the homepage does have a cached result now. How could that be the case??

    How can Google have cached our homepage and yet not have it in the index? Besides the relatively large number of pages still left in the Supps, this problem is making the “time out” especially difficult.

    Any advice other than wait for another week or two?

    Thanks again.

  43. RE: “It’s called not putting all your eggs in one basket.”

     Arrrh! That's where Google won the SE wars 🙂 That is, they didn't wait for webmasters to put the eggs in the Google basket; they have (from day one) quietly been putting their Google basket under ALL our eggs.

     Great example of Catch-22. Although nowhere near as funny 🙂

     BTW, WMW, or any forum, is NOT representative of the whole www by a looooooonnnnnnnnng shot. Thank goodness!

  44. Matt,

    Thank you for this update on the supp issue. Also, thank you so much for looking at those emails.

  45. I noticed the DC’s that haven’t switched to BD were still hitting my pages and indexing 41,000 pages as of about 10 days ago when I last checked. My site map was working and pages were being updated just fine. Not with Big Daddy though.
    I was once in the supp. club but came out of that about 2 weeks ago but I just can’t seem to get more than 700 pages or so indexed.

    Granted many of those 41,000 pages are some forum, form, and not overall important pages but I should still have somewhere in the 30,000 page range even if I dumped my forum and junk pages all together.
    I have waited patiently and in fact have 2 sites with the same thing happening. The other probably has 8000 pages of mostly .html pages at that and I can only get 123 pages indexed and that never went supplemental. It just went…

     I noticed this low page index total way back when BD hit the very first DC, before any of the supplemental issues. There has been an indexing problem from the very beginning, and I just kept telling myself every day or so that "they will get to the pages", "they will index them eventually" – I don't believe that anymore.
    But I think I should also add that the 700 or so pages that are indexed are ranking fairly well.

    Thanks, Joe

  46. First of all, I want to stress that we are not complaining, as we have had our busiest month ever, but I’d like to comment further on Chris’s post about the Googlebot vs. the MozillaBot.

     Am I missing something here? The GoogleBot visits our oldest site, just like it has done for the past 3 or 4 years, every day. It indexes any new pages it finds within hours, even as recently as yesterday. Our site shows 35,000 results for a search in the non-BigDaddy search results. (Using http://216.239.59.104/search?hl=en&q=site%3Awww.printzone.com.au+-www.printzone.com.au%2Fcart%2F+-www.printzone.com.au%2Fshop%2F&btnG=Google+%E6%A4%9C%E7%B4%A2&lr=) – the minus searches are to remove supplemental, non-existent pages from the results.

    The MozillaBot, too, has been dropping in and crawling every day, but with strangely different results. The number of pages is decreasing! Yesterday the SERPs showed 104. Today there are 101. (Using http://www.google.com/search?hl=en&q=site%3Awww.printzone.com.au+-www.printzone.com.au%2Fcart%2F+-www.printzone.com.au%2Fshop%2F&btnG=Google+%E6%A4%9C%E7%B4%A2&lr=), once again minus searches to filter out supps.

    So why?? Obviously there is no difference in the pages crawled by the 2 bots. And obviously links can be followed OK. Is it MozillaBot not crawling correctly, or is it the BigDaddy datacenters that are slow to index? Or is there something about our site that the MozillaBot or BigDaddy doesn’t like? very mysterious.

  47. Maybe it’s just me who didn’t notice it, but did Google index and cache files of 500-700 kbytes before?

    I know there were exceptions to the 101 kbyte limit but previously I didn’t see such huge files indexed totally.

  48. Matt,

    I’d just like to thank you and your SES “still supplementary” team, I got an email today telling me they are going to look through my situation.

    Cheers,

    BB

     >>BTW, WMW, or any forum, is NOT representative of the whole www by a looooooonnnnnnnnng shot. Thank goodness!

     Definitive statements from insiders like Matt or GoogleGuy are worth their weight in gold. The forums can sound the alarm, but they don't provide trustworthy diagnoses or solutions unless it comes directly from MC or GG.

  50. What is meant by "Google's crawl cycle"? Please help me…

  51. Hello,

     My sites are affected and are visited by both Googlebots (normal + Mozilla bot).
     A few others who are affected also get visits from both bots.

     If someone has the time, could they check their logs to see whether they get hits from both bots?
     If yes: are you affected or not?
     If no: the same question.

     If it looks like the problems mostly occur when both bots visit, then this could be a clue to where things go wrong (handling info from both bots).

    Thanks
    (in addition to the thread at webproworld)

  52. Hi Matt,

    Am I mistaken, or has http://www.thepeoplescube.com been reinstated?

    I was just checking other websites that have been referenced here before and it looks like they were in the “gonesupplemental” club also.

  53. Somewhere I read that Google is creating a program to compete with PayPal. How do businesses break into established markets? Create a niche, or a need that solves a basic problem. From my research into the recent effects of BigBrother (oops, typo… BigDaddy), it is apparent that sites running osCommerce systems have taken the supplemental hit the hardest. Most of these sites are small to medium retailers who all use PayPal and depend on search engines for traffic. Most are too small to get into the paid ad game. They are, however, the perfect target for a new upstart PayPal competitor, and most would switch formats in a second if it would result in the clinking of coin.

     To be fair, one must also note that osCommerce sites have built-in page systems that create multiple URLs that can lead to the same info, and a switch-language feature that appears to affect different search engines in various ways. It would be fair to say that Google would be right in trying to eliminate the duplication these sites create, thus forcing a change in code on these sites. Good code is good code. What would be wrong is if Google has spotted this and created BigDaddy as a means to introduce a product as the solution to a problem of its own creation.

     Old-time theory has it that if you help people, those people will in turn help you. I would challenge Google to write a shopping cart in good code (with the option of using different payment systems, allowing for competition, which is healthy in an industry), give this system away for free, and introduce a competitive payment system to break into that market. Most people are loyal to those that help them, which would translate into $$$ for Google. Help people make money and they will make you money. Take money from people and you become a dinosaur. As in politics, time will write the complete story on this one… and so the small people must wait.

  54. Robots.txt Supplemental?

    Hi Matt,

    Would using the robots.txt to block a URL that’s gone supplemental (and you don’t want in the Index at all) get rid of it once and for all??

    Thanks!
    Sarah

  55. Maybe it would be a better idea to put it in Google Sitemap, have Google index the Supplemental Results and get rid of it.

  56. Pascal

     I was wondering that myself – we get hit by both bots and maybe that is the problem. Has anyone come back from the supplemental problem who gets a good number of visits from both bots?

  57. Gary, Steve: we can still fetch those pages, it's just that they don't always make it live. I believe someone checked in another threshold yesterday based on the meeting that I had with some crawl/index folks. That will still take a while to go fully into effect (1-2 weeks was the estimate that I heard), but it should help the remaining people who still saw some of this issue left over.

  58. Matt,

     So what you're saying is that those sites that have been heavily crawled recently won't have those pages indexed for 1-2 weeks? Or did I completely misread what you posted? I think for most of us, we just want some clarification on why we are crawled so heavily (my sitemap has almost 500k URLs) and whether most of those pages will eventually be indexed. I think we would all breathe a very large sigh of relief if this was the case, or if you even hinted towards this, since I know you can't comment on every site.

  59. Matt,

    Thanks for the continuing updates much appreciated.

     Based on your reply, I guess we need to sit tight for another 1-2 weeks to see if the latest "threshold tweak" has the desired result and allows the lost pages to return with rank and position.

    Please let me know if you hear more from the crawl/index folks in the meantime.

     I am passing your comments on in the WPW and WMW threads, as there are still many affected parties eager for information.

    Gary

  60. Matt,

    Thanks for the continuing updates much appreciated.

     Based on your reply, I guess we need to sit tight for another 1-2 weeks to see if the latest "threshold tweak" has the desired result and allows the lost pages to return with rank and position.

    Please let me know if you hear more from the crawl/index folks in the meantime.

     I am passing your comments on in the WPW and WMW threads, as there are still many affected parties eager for information.

    Gary

  61. I am watching an osCommerce site with 50,000 pages supposedly previously listed, even though there were only about 10,000 "real" pages to be had.

     The site is now showing either 3,500 or just 170 results in the BD data centres.

    Something is seriously amiss. There are no pages reported as Supplemental, they are just “gone”.

    As for cached pages, I have seen many reported at 500K, and 800K and a few as high as (I think) 1200K, even going back about a year or more now. They still show up in results, even now.

  62. Matt,

    I obviously don’t know the details, but from the symptoms it seems extremely unlikely that the BD bug could be explained or fixed by simply tweaking a “threshold”.

     Is anyone looking at this in any detail? Or are you banking on a threshold tweak to solve all of BD's problems? Can a simple threshold possibly explain the huge number of sites that are now missing between 95 and 99% of their content from the indexes? I doubt it.

  63. Hi Matt,

     We have a site, http://www.bharatbhasha.com, which had 400k+ pages indexed in the non-BD DCs, but recently, after the BD rollover, the number of pages indexed (as we see from a site: search) is fluctuating between 750 and 840. The more interesting part is that if we do a search for "site:domain directory", more than 2,000 results are returned, among which there are many supplemental results. Kindly let me know if there's any problem on our site which needs to be rectified.

    Regards,
    Mayur

  64. I now have two pages in the index, rankings have gone through the roof !!

    1000 pages to go !!

     Thanks for getting a manual review done, Matt. As long as things don't go crazy, it looks like they may go back to how they were last year.

    Cheers,

    BB

  65. Broker Boy Said,
    March 24, 2006 @ 2:11 am
    Matt,
    I’d just like to thank you and your SES “still supplementary” team, I got an email today telling me they are going to look through my situation.
    Cheers
    ……………………………

    Matt,
    I have discussed our site here and sent a “stillsupplemental” email within hours of your original request and then followed it up a week later.

    So far we have not received a reply of any kind and would really appreciate some form of communication from the team.

    Or even a fix!

    Many thanks

  66. I also sent feedback and didn't get a reply, but that's not really what I'm after anyway. An email saying "we are working on it" doesn't help.

    I just wished there was a fix at Google or a way to fix it here.

     Perhaps it is worth trying (if possible?) to allow only the MozillaBot on my site? Every test is useful, I think.

  67. Not quite sure what to make of the latest update. I have a site (mainly forum based) that has dropped from 12,000 pages to under 10 – it's been like that for a month. Some of the DCs are showing a few hundred pages.

     Another one, not forum based but a directory, has dropped a few thousand pages and is now only showing 2.

     A third site has dropped significant keywords and a few hundred pages, and when searching for its name using the URL it has disappeared, then reappeared, and is now gone again.

     Two of the three sites have been established for years and have always been visible for my chosen keywords. The other one is about 8 months old.

     I do hope that the results will settle down shortly as it is getting rather frustrating now.

    Cheers

    Steve

  68. @Toby Sodor: I have observed the same stuff. The site gets heavy visits and the amount of pages in the index goes down. Very frustrating 🙁

  69. Matt FYI

    Hi Matt,

    Just discovered and tried the email address for “stillsupplemental” and discovered it’s been “retired.”

    Thanks,
    Sarah

  70. A short update with respect to the number of indexed pages: for a few days now, there have been fewer pages in the index every day. Additionally, blog feeds blocked by robots.txt and via rel="nofollow" appear in the SERPs.

    I have a clean conscience with respect to spam and ranking for the remaining pages is excellent. Additionally Google is crawling the site heavily. But nothing appears in the index…

     Matt, could you shed some light on this? This is – to put it modestly – frightening…

  71. Guys, I wanted to give an update about this… Our site is finally showing more pages on Google. From a few DCs showing 915 pages (most not supplemental) to one data center now showing 20,500 pages, again mostly with no supplementals.

     Although this is coming back, we realize that rankings for all these pages have disappeared for us. What to do?

  72. Could you give us the IPs of those DCs? I have found some more stuff on http://64.233.171.104/

  73. Matt,

     Starting to see our pages coming back on some DCs. Looks like your tweak may have done the trick.

    Many thanks.

    Gary

  74. Hi Matt,

    I still have tens of thousands of pages missing from the index. Being crawled at a rate of about 5,000 per day for weeks now but only around 400 pages in there.

    Any ideas?

    Thanks,
    Pete.

  75. When I put up the Google Adsense on my web pages, I could do a site search for various terms and they’d show up for my site and also adsense ads. Now all I get is adsense ads.

     When I queried the Google helpline, I received this message:
    “[#53026604] Adsense Search Not Working
    Please look at one of your results. You will see the “Supplemental Result” tag in green. This tag means that the result is in our Supplemental Index, and will not be returned through AdSense for search, just through regular Google search. We appreciate your understanding.”

    Why would my site end up in the “supplemental index”?

    Thank you for your time. Love your blog!

  76. Hi Matt,

     I hope you still check this post. Our number of listings on Google just went from over 32,000 last week to 24,000 a few days ago, to 17,500 yesterday, to 13,400 today! We have no idea why, and we would really appreciate a way to find out.

    Our site had gone supplemental about a year ago and we worked hard to fix all we could find out. There was never spam but we eliminated variables in urls, changed 302s to 301s, created robots.txt, started using sitemap… We were excited to see it back up with Big Daddy but now it looks like trouble again. Is there any way to know what is causing this please?

    Regards,

    Michael

  77. As far as I’m concerned the situation is getting worse not better. I noticed a problem at the beginning of march when I had about 450 pages indexed. Then suddenly this dropped to 240 pages with half of them supplemental and all the supplemental pages were from before the middle of december. I’m now down to about 95 pages indexed with 30% supplemental and I’m losing whole sections of the website almost every week. At this rate in two more weeks none of my site will be available.

    This is really scary, is there any update on this from Google, or are they still staying quiet about the problem? I’ve even cleaned out the site to make sure all the pages validate, have good content and unique header tags. Now I have several thousand new and updated pages entirely missing from the index.

  78. What do we do now if we’re still stuck in the supplemental results and the “stillsupplemental” mailbox is no longer actively monitored?

  79. Matt –

    I hope this is still monitored. I’m a long-time mattcutts blog reader, first-time commenting.

    We seem to be in the same boat on this as well…

    Originally we had about 2900 pages indexed. Then we got rid of the “?id=” issue in our URL’s, per Google guidelines, and we started climbing to around 34k indexed pages. Now all of a sudden, overnight, we have 1 of 1 results with 30k+ similar.

    It’s bad enough we’re bigger, stronger, and have more content than ANY of our competitors, yet we can’t seem to rank above a site with 2 good keywords in its URL. Now this happens, and I guarantee our content to be superior to sites returning with higher results. There’s a lot of searchers NOT getting the best returns on their searches.

    Best wishes – and HELP!

  80. Hi,

     As of today, my site has dropped even further in the number of pages indexed. The reason I'm sending this now is that today, not only are just ~400 pages (of 1,600) indexed, but many of the remaining pages have gone supplemental. I have no idea how this could be. I use Google Sitemaps. I have worked very hard to build up quality inbound links without spammy link exchanges. I have paid quite a bit of money for quality, topical content. My site is not based on affiliate-oriented spam. I've spent quite some time building what I believe to be a quality, content-focused site, which happens to be in an area full of spammy sites (gambling). My site is not spam!

    Thanks for any help you can provide,
    Jim

  81. Jim, I think we’re “hollerin’ in a hurricane”, as my Pops used to say…

    I’ve been following the other blog discussion on this matter, and it seems we, the legitimate site owners are being lost in the cacophony of bleating from the “spammy” site owners (and tiny, content-free site owners who blame Google for their miserable, yet accurate, rankings).

    Until the porn/gambling/work-from-home site owners grow weary of squawling, I fear we’ll not be heard (or replied to).

    Best wishes!

  82. I have a small ecommerce site that is not listed by BD, leaving me with only supplemental results since early this year. I, too, have worked hard this last year to get this site indexed, only to see all that time apparently wasted.
     I can't help agreeing with supplementalrus that this has to be a way of softening up the small website owner to purchase AdWords or similar.
     Supplemental pages are just a panacea; the truth is our pages are simply not being indexed anymore. Guess I'll go look for traffic somewhere else.

  83. Matt,
    I have fairly decent pagerank and good original content, and Googlebot is spidering my site often, yet I am getting no traffic from Google. Somebody suggested my listings might have gone “supplemental.” Why do you suppose this is? If this is the case, is this a kind of penalty?

    A google link: search of my homepage is showing 64 backlinks. A site: search however is showing 147 pages which is far more pages than I actually have. I have about 60 pages. It seems many old pages I removed almost a year ago are still in Google’s cache, and were never cleared out. I don’t suppose this could be my problem? Shouldn’t these dead links be removed automatically or would I need to use Google’s remove URL tool to remove these individually?

  84. This site has 2 subdomains for Bulgaria and Spanish properties, with separate sitemaps for the primary and subdomains.

     Google originally indexed most of the primary domain pages. I then created the subdomains and moved some of the primary domain pages to the subdomains (making sure there was no duplicate content, and using permanent redirects and noindex metatags on the original pages – a rough sketch of that kind of setup follows this comment). But the subdomains were ignored, even though the original pages were removed from the index.

    I sent emails to Google, but got the usual “no penalising” reply. I then noticed that Google sitemaps offered the option to send a re-inclusion request. So, because nothing else worked, I gave it a try.

     But you cannot submit the request unless you check a radio-box ADMITTING that you infringed Google's guidelines – the submit box is grayed out until you check it. OK, I thought I had nothing to lose, so even though the site did not infringe the guidelines at any time, I submitted it.

     BIG MISTAKE!!! Google has now indexed my site – and EVERY page, including the subdomain homepages, is supplemental. The only page not supplemental is the primary domain homepage. Months of work and big money are now wasted.

     The site ranks in the top ten for hundreds of search terms in Yahoo and MSN but hardly any in Google. This serves neither the search community nor the web design industry well at all.

    The site is totally free of spam, rich in quality, contains original text, is fully W3C compliant and is properly optimised according to Google guidelines.

    I feel this is Google’s penalty for “admitting” to infringing their guidelines – which I did not. So I now warn others against using the inclusion request in sitemaps!

    The question is, how do I get Google to give this site the recognition it deserves?
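     For readers who want to see what the setup described above might look like, here is a minimal, hypothetical sketch (Apache configuration; the host name and paths are placeholders, not this commenter's real URLs):

         # Hypothetical Apache directive: permanently (301) redirect a page that
         # moved from the primary domain to a subdomain.
         Redirect permanent /bulgaria/overview.html http://bulgaria.example.com/overview.html

         # If an old page is still served rather than redirected, a robots meta tag
         # in its <head> asks engines not to keep it indexed:
         #   <meta name="robots" content="noindex,follow">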

  85. I'm late to learn about / somewhat understand this issue; but it's been quite some time since March, and it seems I have problems here.
     site:www.domain.com – with "show additional results" – reveals many supplemental pages, the majority of which (on a quick check) no longer exist on the site. They even include "pages" captured as posts were being made to the forums. Useless to anyone searching.
     The actual, current forums seem poorly indexed. It may be partly a linking issue – there had been a link to the forums from all pages; I've now added a menu with links to the forum categories, hoping this will help some.

     But all site pages seem erratically indexed.
     I use a CMS, and if I don't add a meta description to each item, pages just get the site meta description. For lower-PR pages, maybe Bigdaddy sees such pages as duplicates, no matter the content??
     I've started working through, adding meta descriptions, but I'm not sure how to do so for the forums.
     In my view, this makes Bigdaddy a significant step backwards; Google was formerly able to "see" content on pages, and even extract useful snippets for search results (hence my laziness with descriptions).

    Anyway, must admit to being puzzled at some of the “pages” in google supplemental index.
    And wonder why, in google results, I see stuff like “Google web www” tagged onto page titles.

  86. Just rechecked with site:www.domain.com, and it seems there's been a vast improvement; fewer supplemental pages on a quick glance through (hooray!), but also a major reduction in the number of pages indexed. Again looking quickly, pages that don't exist apparently no longer show in the results – which must be good for searchers.

    Not sure if related to my posting here; but if so, many thanks!

  87. Re my comment dated 22 June – I owe a thankyou to Matt and the Google team.

     My property website with two subdomains (for Spain and Bulgaria), which had been put entirely into the supplementary index except for the primary homepage, is now in the main index. Every single page!

    And because it ranks well and is informative, the traffic has more than doubled over night.

    Thanks for listening!

  88. Concerned Webmaster

    Hi

     What is the point of having 89,000 pages in the Google Supplemental Results index if the pages no longer exist? Is this adding value for Google users? Googlebot has been crawling my site and these URLs; it sees that robots.txt blocks them (also 301 redirects), and still I see these results in the Supplemental Results index. There is a bigger issue than just this: when I log in to Sitemaps I see no penalty for my site (it states that my site is in the index), but there must be one, because my index page, which is not supplemental, is buried in the Supplemental Results. I have a theory that within the Supplemental Results there is duplicate content and that is the reason why. HAS ANYONE GOT ANY IDEAS about this issue, including Matt if he's back from holiday yet?

  89. This problem seems to be rearing its ugly head again.

     My site – http://www.MoDaCo.com – seems to have been hit by a combination of plunging down the SERPs, even for terms that originate at my site (e.g. 'wm5torage'), and having the number of pages indexed cut from around 770,000 (about right) to 70,000 (eek).

    To top it all off yes, most of them have ‘gone supplemental’.

    Worst of all, there’s nowhere to turn to for help 🙁

    P

  90. I have not got any mail from your side. Please provide some feedback on this.

    Hi,

     I was reading an article about supplemental results. As far as I know, G uses a different index for supplemental results.

     And bad pages get shifted to the supplemental index.

     So now the confusion starts:

     When you search any keyword, it should show results from the main index, not from the supplemental index, as those are the bad pages for Google.

     But you can see mixed results in search.

     So what criteria is G applying behind this?

     Can anybody explain, please?

  91. I think this is related to the introduction of Google Sitemaps. I have just put one of my sites through the sitemap mill, and the very next day the site was decimated.
     Only a few pages now exist and the others have gone supplemental. So there may be a link between the introduction of Sitemaps and this constant battle to be included.

     Has anyone else had any success with Sitemaps, or have you all had problems similar to mine? Do you think the pages will appear back in the main index? Only time will tell…

  92. I’ve had the same suspicion about sitemaps, but I don’t believe they are connected with the problem. I run many websites, with and without sitemaps, and cannot see any trend amongst them. I have many Number 1 ranking sites, and others that are disappointing. Some use sitemaps, some don’t.

    However, there is no doubt in my mind that the reason any of my sites rank badly is due solely to Google placing them in the supplemental index. This is because I check carefully for duplicate content, always provide html navigation in addition to JS or Flash, always validate the code to W3C standards, and always optimise the sites properly using WebPosition Pro. I take care with meta tags, dates stamps, I don’t use redirects – I do everything necessary to produce a highly professional, totally “white hat” site. And sometimes I use sitemaps, sometimes not.

    Two of my badly performing sites even use Google analytics. You would think that in return for me giving Google free access to my site’s statistics, Google would have the good sense to say “thanks” and index the site in such a way that it will actually get search results!

    This supplementary issue is a MAJOR problem.

    I now have another irritating example of a website that suddenly went supplemental (except the homepage), then came out for about 10 days, and has now gone back to supplemental again. There is no reason for the site being supplemental – it uses sitemaps, shows no errors, the site has very simple html navigation, there is no duplication and no spamming.

    And if there was a reason for it being supplemental why would Google remove it and then return it again after only 10 days?

     Once again I have to explain to frustrated clients that this is a Google problem – nothing to do with our competence in building websites. But as website designers, how many times can you keep saying this to clients before you lose credibility and go out of business?

    I have studied this problem 24/7 and cannot see any logic for most cases of going supplemental. Sometimes I can see pages where there are coding problems which might have caused it – but mostly I am convinced that Google has algorithm or capacity problem.

    By the way, try running most of Google’s own pages through an html validation analyser. If they were not owned by Google, they would have all gone supplemental if the criteria included valid html coding!

     I have learnt to be philosophical about Google's index. But it is still disheartening when weeks or months of hard work are suddenly thrown away just because an automated routine in Google randomly decides to dump your site into supplemental.

     Finally, and to be fair to Google, I complained on this site back in June about a similar issue with another site. Google fixed the problem in days – the site is now one of my star performers. This proves Google has a problem with its supplemental index, which I believe they accept.

  93. Google – you seem to have got yourself into a real mess.

    Another of my sites has just disappeared into supplemental. It now shows the last date cached as 17 July 2005 one year ago today!!!

    And it now shows the temporary homepage before the site was even built! The previously cached image showed the current page – but that has now disappeared.

    You had been indexing the site until recently, caching the current homepage and showing a recent date when last crawled.

    What is going on??? Have you discovered the secret of time travel??

  94. Sorry Phil,
     It may have something to do with the manner in which you are working. I too looked at WebPosition, and that was not long before my site nosedived.

    “This is because I check carefully for duplicate content, always provide html navigation in addition to JS or Flash, always validate the code to W3C standards, and always optimise the sites properly using WebPosition Pro.”

    There is a lot of chat about this product and Googles views on its use. I have just grabbed this from the Google site:
    “Don’t use unauthorized computer programs to submit pages, check rankings, etc. Such programs consume computing resources and violate our Terms of Service. Google does not recommend the use of products such as WebPosition Gold™ that send automatic or programmatic queries to Google.” http://www.google.com/support/webmasters/bin/answer.py?answer=35769

    Sorry if this comes as bad news as you have invested heavily in this.

    Maybe read a few of the other blogs and we meet again…

  95. I appreciate your comments, Computer Consultant.

    I agree with you – but I NEVER use WebPosition Pro to submit or check rankings for the very reason you suggest. I only ever use it to analyse my pages on my local server, and occasionally, prospective clients pages on the internet. And always only using the page analysis tools.

     So this is not the problem.

    I have several number one ranking sites all built in exactly the same way and all optimised according to Google’s guidelines using WebPosition Pro. The product works extremely well, but I agree there are certain features like these which are best avoided.

     The only drawback with WebPosition Pro's analytical tools is that they do not check for page content duplication. That is all too easy to do by mistake, e.g. duplicate title metatags, descriptions and content.

     By the way, my last comment must have had an effect; within a couple of days the site came out of supplemental and showed the current homepage.

    Phil

  96. I run an eCommerce site (click my name) that in June had a PR5 and 40,000+ pages in the SERPs. The site went supplemental in June (except for about 70 pages) and I have been scouring all available resources to try to rectify the situation. I have reported hundreds of spam sites to Google for dupe content and doorway pages. I set up a 301 redirect to correct a www vs. non-www issue (a sketch of that kind of rule follows this comment). I also modified the site to 301 redirect all "large query" PHP pages to nice HTML pages. I submitted a re-inclusion request to Google. I see LOTS of Googlebot activity, but newly created pages with original content are not showing up months after being spidered. I use Google Sitemaps, and also submit most of my products to Google Base for Froogle inclusion. As a side note, I also run an eBay store with about 6,000 products with a lot of the same content in the product descriptions. Could this be causing a dupe issue?

     For the last few months, site:mysite.com has shown approximately 900 pages. site:mysite.com -inurl:www shows about 11,000 supplemental pages.

    If anyone has any more suggestions, I’m all ears.

    Thanks in advance, Joe
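     A common way to implement the www vs. non-www 301 redirect mentioned above was an .htaccess rule along these lines – a minimal sketch assuming Apache with mod_rewrite enabled, with example.com standing in for the real domain:

         # Hypothetical .htaccess sketch: send www requests to the non-www host
         # with a 301 (permanent) redirect. Requires Apache mod_rewrite.
         RewriteEngine On
         RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
         RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]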

  97. Hi Matt,
     When I use the operator "site:www.coimbatorecity.com" to check the indexed pages on the site, the result shows a different total number of pages every time. Is this a bug in Google search?

  98. Matt

     we have a site that is almost entirely in the supplemental index. It has unique content; is built to W3C standards as a shell and links into an Actinic platform for the e-commerce elements; has links into various pages from different sources; and doesn't have any canonical issues. In other words, it should all be OK, but most of it is in the supplemental index and as such has almost no chance of being discovered via search. What have we done wrong?

  99. I have been involved in the Google battle since early 2000, and very little changes if you look at the bigger picture.

     Google, as we will all agree, is the benchmark for online success.

     They work on surprisingly simple rules: quality, and lots of it. Never step over the mark or try anything too maverick or you will get a nasty shock. Stay away from ridiculously high percentages of search terms in the copy, and focus on top-level terms with many synonyms in the page(s), BUT BEWARE OF BROAD MATCHING. Keep your code clean and all should be well.

     So in many cases where someone says they used to do really well and now they're not, it means they have either taken their eye off the ball or been playing with borderline methodology. SEM is not fire-and-forget, and it is best to stay away from nuclear ideas.

    If anyone else has any ideas or thoughts please feel free to add them. I might be wrong, these are simply my opinions: No man can be the authority on Search, it requires an amalgamation of ideas from us all.

  100. Shifting from supplemental to main SERPs is the trick

  101. I agree with the need to avoid ANY border-line methodology. But, the truth is that Google is not consistent in the way it applies the rules.

    I have carried out tests using various websites built using both totally different and totally identical techniques. I can prove that the one certainty is that if you use duplicate title and description metatags and / or duplicate text at the start of your body text (if you use server side includes for example) then those pages will almost certainly go supplemental – and rightly so.

    However, I have some sites that are built identically and follow the Google guidelines to the letter. Some of these sites have all their pages in the main index, some have all in the supplemental index except the homepage.

    One site – the one hyperlinked to my name – has had every page placed in the supplemental index except the homepage since it was first crawled, and yet has no duplicate content and follows the guidelines perfectly.

    To prove Google doesn’t follow its own rules – it has even cached (in the supplemental index) pages which are in a folder disallowed to all robots by the robots.txt file!!

    Worse still, Google has now dropped ALL the other pages except the homepage!!! So this site now has the homepage cached plus 6 other pages -all of which are listed in the robots.txt file as disallowed!

    And Google has failed to find a subdomain of the site designed to get some new pages indexed outside the supplemental index, even though I use Google sitemaps.

    Another site using identical construction (with different content) has rocketed to instant success overnight – it achieved a PR of 3 within four months of launching, and every page is in the main index.

     I have studied this problem for a considerable time. There is no doubt that breaking basic rules will (and should) get you put into the supplemental index. But I also have proof that some sites go supplemental for absolutely no good reason – and there appears to be nothing you can do about it, except try to get Google to recognise the problem.

  102. I have taken a peek at your tags and the supplemental results in Google.

    The description tags seem pretty similar across all of the supplemental pages. Does the CMS system you use give you the ability to create your own tags? If not, you would be best advised to get hold of a more advanced system. If you can alter the tags manually, I would set to work on it.

    Additionally, your PR is only 2/10. You need to get a good PR 4/10-5/10 site to give you a link to your main URL. This needs to be done within the next three weeks, ready for the next PR update. Once you have an increase in PR, drop the link and start on a combination of directory submissions and link exchanges; offer an advertisers' section at the bottom of your main URL and drop the price to nil if they give you an IBL.

    If you need any help give me a shout.

  103. Oh, and it would be best to get the site transferred to a UK hosting company (UK IP address). It is in Germany at present; this is OK for worldwide searches, but it can hamper UK searches.

    1and1.co.uk do a very good package (go for the business one, as it can handle a CMS and mod_rewrite).

    Good Luck mate.

  104. Thanks for your helpful comments and interest, AMS. It is sometimes useful to have another pair of eyes look at what you have – I agree, there was too much similarity in the title and description tags. I had forgotten to go back and update them! So thanks for the time you spent checking them out – I have updated them now!

    I do my tags manually – do you use an automated system, and does it work well for you?

    The site is scheduled for more links when I have time, but I agree a good high-PR link would be helpful.

    Your final thought is very interesting. I have also suspected that non-UK servers can disadvantage your UK search results, and I am not happy with this company's service. I normally use Hostway in the Docklands, who are superb.

    Looked at your site. I have several good sites we could exchange links with if you are interested and once your rebuild is complete.

    Thanks again.

  105. I’ve just begun looking at a new client site that has the majority of its pages in the supplemental results. It’s a small retail ecommerce site that has never had any optimization done before, so I doubt anything devious is being done to cause the supplemental results situation. I don’t see any duplicate content issues. I’ve read most of these posts, but other than wait and see, can anyone suggest the best direction towards finding a way out of the supplemental results? Thanks!

  106. Matt,

    My site http://www.lowgidietbreakthrough.com used to be in the supplemental results, and now with the latest update the pages are showing in the normal index.

    I'm not complaining, but what could have caused this to happen?

    The reason I ask is because I have a couple of completely hand-written websites that have gone supplemental and I can't figure out why!

    Many Thanks,

  107. Hi Matt,

    We still have supplementals going back as far as Aug 2005, even though the pages have changed since then. I hope it's OK to post the URL:

    http://72.14.221.104/search?q=cache:zTkG0J7uA5sJ:www.profileheaven.com/angelshadowx7+site:profileheaven.com&hl=en&gl=uk&ct=clnk&cd=443

    http://www.profileheaven.com/profile.php?clientID=7642

    The clientID ones aren't directly linked anymore, as we replaced them with sitename.com/username. However, they never seem to drop out of the index even after a year, and we have about 500 fresh pages and 100,000 supplementals. Is it possible to get Google to drop my entire site and then re-index it?

    Thanks
    Mark

  108. I think there is still a huge problem with supplemental results.
    For example, take our site: if you do a site: search on justhom.com
    http://www.google.co.uk/search?q=site%3Awww.justhom.com&start=0&ie=utf-8&oe=utf-8

    all but one page is supplemental, and we have been through about four different reputable SEO companies in the UK; not one of them can identify a problem that would cause Google to act like this.

  109. Yes, there is a huge problem with supplemental results.

    I have a site to which I have referred Google several times in this blog (hyperlinked with my name) which is all supplemental except the homepage. It is also handwritten and is IDENTICAL in structure to another site we launched at the same time, and which instantly achieved a PR of 3 and all pages in the main index.

    The pages never get recrawled, in spite of uploading new Google Sitemaps after modifying the pages. We even added a subdomain with its own Google Sitemap, which Google has completely ignored. After months of work and considerable investment by the client, we have given up trying to resolve the problem. Sadly, my client has another site which Google has treated in the same way. One very disillusioned client, who is probably unlikely to invest in the internet again…

    In cases like this, Google behaves very irresponsibly towards the webmaster community, the vast majority of whom work within Google's own guidelines. We earn a bad reputation with our clients through no fault of our own, and we have no recourse with Google.

  110. Seems like some sites are unaffected whilst some just refuse to be found apart from a huge list of supplemental results. Google updating seems a lot slower than it used to be, and even adding a Google Sitemap doesn't seem to make it any better (or maybe it will after a long time…).
    Will Google ever be the same again?

  111. I’m starting to think that if you post a comment here with your URL hyperlinked with your name, you trigger a penalty and all your pages go supplemental 😉

  112. I can only echo what Phil Marter said, although I also understand that Google does not owe me anything… it is just so sad that so much time and effort can be so badly damaged by a Google snafu, with apparently no way of addressing the problem.

    The site http://surfing-gooroo.com/ is a wholly original work of hundreds of pages written by a broke surfer who just wanted to help people learn to surf, and maybe earn a few AdSense dollars to keep him in board wax.

    I built the site for him gratis as a favour, as he was an old friend, and I suspect it should win an award for the least commercially motivated site around, yet it is invisible. I know it is not the construction of the site, as it is built the same as many of my commercial ventures (which are doing fine).

    The site is 100% unique, original, helpful, non-commercial content that Google should "lap up", and yet 6 months ago every page but the index went supplemental, and the time it would take to build up the incoming links is prohibitive when you are working for free. So there it stays… a perfectly good, content-rich, helpful, user-centric site… buried in supplemental hell.

    Very frustrating, but for those of you who are losing serious money because of this, you have my full sympathy. I am frustrated as hell even though money is not the issue for me and it is not my site!

    I am just sad that my poor friend went to so much trouble on the strength of my well-meaning advice… and got zip for his efforts.

  113. We have several sites (which I’m not going to list for fear of reprisal), all original-content sites that had previously been indexed fairly well, which were dropped into Supplemental Hell.

    We tried everything … the pages are all unique content but we painstakingly changed the title tags and meta tags as well as double-checked every single page for duplicate content. We resubmitted the site map and it only got worse.

    We continue to add original content to our site in the form of original articles and blog posts, but this issue with Supplemental Indexing continues to be a huge issue.

    So we’ve given up … there’s nothing more we can do to follow these so-called guidelines any more closely. We have literally followed them to the letter of the law, only to be penalized for all of the painstaking effort we have put in. It is beyond frustrating. We are now down to 3 pages indexed … 1 of which is the index page, down from close to 400, and none of the new pages with additional content are even visited by Googlebot.

    We’re wasting our time with Google and will be focusing our efforts on the engines that actually reward sites for providing good original content.

    For a subject that contains endless commentary about an issue that is such an incredibly vexing problem for the webmaster community, it is stunning to see how little effort is being put into resolving it on Google’s end.

    Incredibly disappointed with Google’s clear failure on this matter. We are so frustrated with spending more of our time trying to adhere to Google’s guidelines (which appear not to really matter with regard to indexing anyway) than on what should really matter … more relevant content for our users.

  114. Hi Matt.

    Well, I was just once again searching Google's SERPs in the hope that my site had been released to the wider community. Not so; still in supplemental hell. I'm the surfer guy mentioned in a previous post, you know, the one who tried to do something helpful for the wider online learn-to-surf community, i.e. offer them something original, informed and, most importantly, not misleading or spam. God knows why I thought that might work. I mean, hey, it's only a couple of hundred pages of original content that's basically non-commercial other than your AdSense. Well built, with 5 years of research behind the material, not to mention 37 years' experience in the subject matter. But as has been said before, Google owes me nothing. That's true. But you guys sure as hell owe a better deal to those who use you as an informational resource, because at the moment the spam is at the top, not the information. Remember, Matt: if your searchers get rubbish, they search elsewhere.
    Hmm, it will be interesting to see what happens to my site after this comment. Maybe it will spontaneously combust on my mate's server.
    True, the guy who built my site feels sorry for me. I feel sorry for the millions of kids and tourists who want to learn to surf but can only access misleading, commercially driven information. After all, they are the ones I put all that effort in for.
    Have a lovely day
    Ben

  115. Hi Matt,

    I have been trying to read up on “Supplemental Results”, as all of my site's pages excluding the homepage are in the supplemental results. I cannot work out why this has happened. I used to have a PageRank of 7 and appeared very highly under “internet marketing”. I then developed a directory of my clients within my site, and these pages also appeared highly for various search terms. Eight months ago, my web site ended up with no PageRank and lost its positions. I assumed this was due to my directory, so I removed the directory from my site, rewrote some of the pages and relaunched it, this time in .php. I assumed I had done something wrong with my site, so I cleaned it up and resubmitted it eight months ago. The site has remained stagnant since. Please help, what am I doing wrong, if anything? 🙁

  116. We’ve been supplemental for some time and I’ve been tweaking bits and bobs for a while, to no avail.

    Hopefully, when the next big update comes it should sort itself out… shouldn’t it?

  117. Hi,

    From what I gathered, the use of WebPosition Gold is not recommended, as it could result in a site ban. Could you suggest a better way to track the performance of a site? It is time-consuming to check site rankings, link popularity, crawl history, number of indexed pages and all those metrics manually, especially when you're handling multiple sites.

    Any help is greatly appreciated.

    Thanks!

  118. Makes no sense… why do some pages end up being marked supplemental while others on the same site are not?

  119. Something for you, Vilayte… I am not sure how this will help you, but here goes:

    Have a look at the backlinks pointing to http://www.searchquest.co.uk/marketing.php here http://siteexplorer.search.yahoo.com/advsearch?p=http%3A%2F%2Fwww.searchquest.co.uk%2Fmarketing.php&bwm=i&bwmo=d

    You will see only 6 incoming links. Now, your website has 26,000+ links, and only 6 of them point towards the “internet marketing” page. Do you see what I am trying to say? Maybe Google has rectified its mistake. Probably Google is now treating the individual URL as a measure of authority, and in the current scenario just 6 links for the internet marketing page are too weak.

    What do you think?

  120. Hi,

    I was wondering whether timing out during Google's spidering could be a risk for going supplemental. My directory contains certain queries that may take a bit too long.
    Does anyone know how long the Google robot waits before it stops spidering a site and aborts deep indexing?

    Thanks!

  121. I’ve read through this thread, but I have a question about a supplemental result for a hijacked home page. My traffic dropped drastically (in half) from one day to the next and has continued to drop. It took me a couple of weeks to attempt to figure this out. My home page was hijacked by a Danish site, and their page (actually mine, with their URL prepended to all my links) shows as a supplemental result when I check inurl:mysite.com. Can this be the reason for my SERPs plummeting?

  122. Hi Matt,

    Our site was hacked 4 months ago and a bunch of hidden spam pages were introduced into our system. By the time we realized this, those pages were already included in the Google index, and we had hell to pay in the following weeks.

    But in October we changed our servers and rectified the situation. However, a site:www.logodesignworks.com search still shows hundreds of those bogus pages. I tried contacting Google via the webmaster tools, but there has been no response.

    Can you please advise me on what might be happening and what I can do to rectify the situation?

    Thanks in advance.
    Jeff

  123. Hey Jeff,

    Don't worry, they will be deindexed very soon. I suffered the same problem in the past. You can ask Google to delist them, and also try using a 301 redirect from such pages to your index page 🙂
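
    For example, something like this where each bogus URL used to resolve (just a rough sketch; the domain is a placeholder) sends the permanent redirect:

    <?php
    // Hypothetical example: put this at the location of the old bogus URL.
    // A 301 tells Google the page has moved permanently, so the old URL
    // should eventually be dropped from the index.
    header('HTTP/1.1 301 Moved Permanently');
    header('Location: http://www.example.com/');
    exit;
    ?>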

  124. Sorry to dig up an old post, but I have a question about supplemental results. If I do a Google search for site:www.mydomain.com and it returns “Results 1 – 75 of about 175 from http://www.mydomain.com”, does that mean that 100 pages are supplemental and cannot be found through Google search at all?

  125. Shoemoney mentioned on his blog that Aaron Wall from SEO Book had solved his supplemental problems using the robots.txt file. I’m not exactly sure how he did this, but his Robots file looks like this:

    User-Agent: Googlebot
    Disallow: /link.php
    Disallow: /gallery
    Disallow: /gallery2
    Disallow: /gallery2/
    Disallow: /gallery/
    Disallow: /category/
    Disallow: /page/
    Disallow: /pages/
    Disallow: /feed/
    Disallow: /feed

    And now everything is fine! I would love to know if anyone can make heads or tails of this…

  126. Hi,

    I’m new to all this SEO stuff, but have taken a look since most of our product pages don’t come up in search results, even though pages from competitors who have only just started trading do.

    I read here that the whole page is read in, but if I take a look at my supplemental results

    http://www.petmeds.co.uk/&

    the summary description is the navigation index. Clearly, this doesn’t help. I spoke to an SEO company who suggested adding meta description tags, so we are doing that. However, some competitors don’t use the tags at all (and I read that they carry less weight these days), and the summary information for them is the real product information.

    Now I also read that code is read left to right. Our competitors have similar layouts (index on left).

    So, head baffled, I guess I need to fix this somehow. Our site is user-friendly, or so people keep telling us, but we have no idea what we are doing wrong. Thanks!

  127. Hi,

    Sorry, I meant

    http://www.google.co.uk/search?hl=en&q=site%3Ahttp%3A%2F%2Fwww.petmeds.co.uk%2F%26&meta=

    and synoquin small breed is an example of an item with no meta description; therefore its snippet shows the navigation index, even though the page has product information.
