Feedback: What did I miss?

Okay, those are some of the areas I can think of right now; maybe more will come to me later. What did I miss? Is there anything that you especially like/dislike that Google is doing? If it isn’t related to webspam, search quality, new products, or communication, add it here.

Again, please take a moment to cogitate without checking the comments, so that I get independent opinions. For the next few minutes, I’ll be out of touch (read: not approving comments), and from 6pm on, I’m out of touch for the rest of the evening (also not approving comments).

Thanks in advance for telling me what you wish we did better.

129 Responses to Feedback: What did I miss? (Leave a comment)

  1. Matt, you were interviewed a couple of months back and in the interview you were asked what your favorite “spider” was and you said Grub.org’s spider.
    Was it just a coincidence that the day after your interview LookSmart pulled Grub off the net?
    Do you think Grub will be revived?
    Did you or your company buy it?

    TIA

  2. Matt, I have been reading you for a while now and love the fact that you share real information. For someone who is making a living with a large part of it dependent on Google, my biggest question is why some of my sites do so poorly, while some do so well. This is coming from a completely white-hat, never-spammed-a-thing-in-our-existence person (really!). We have some sites that are just ignored now that were popular a few years ago, and I just can’t figure it out. I guess what I am saying (in response to your question) is I wish there were some kind of way to know, if you are not showing up in results (but are indexed), why exactly that is. Thanks for your work here; it is appreciated.
    Happy New Year –
    Anthony

  3. Google has hurt AdWords considerably with the introduction of Quality Score. The actual system itself is not the issue, it’s the fact that AdWords has now become a black box. Before, we knew the relevant variables in the equation. CTR*CPC=RANK. We could test and come to logical conclusions. Now that is impossible. Half the variables are now completely hidden. Google refuses to tell us how the quality score system really works, giving only the vaguest of advice. How can I test effectively when I don’t have the variables I need? Am I supposed to test out a million different things until I figure it out? Sorry, I have a business to run. I can’t lose tens of thousands of clicks and waste a bunch of money trying to reverse engineer a system that will just change in 2 months anyway. I can’t stress enough how important stability is in a system like this. If you are going to make changes, fine. But don’t then refuse to tell us what those changes are. We have big campaigns that took a long time and a LOT of money to optimize. When you make everything secret you lose my trust. I can no longer trust the AdWords results are unbiased. Google can change quality scores at will to create their desired results. Our only possible response is to raise our bids. How nice that must be for Google.

    I hate Overture with a passion, but you know what? It has been my most consistent campaign for months. I’m starting to like it a lot more; at least their system is fairly stable.
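For reference, the pre-Quality-Score formula the comment describes (CTR*CPC=RANK) can be sketched in a few lines. This is a rough illustration only; the ad names and figures are made up, not Google's actual implementation:

```python
# Old-style AdWords ranking as the comment describes it: rank score = CPC * CTR.
# All figures below are hypothetical, for illustration only.

def ad_rank(max_cpc: float, ctr: float) -> float:
    """Rank score under the old model; a higher score means a higher position."""
    return max_cpc * ctr

ads = [
    {"name": "Ad A", "max_cpc": 0.40, "ctr": 0.05},  # score 0.020
    {"name": "Ad B", "max_cpc": 0.60, "ctr": 0.02},  # score 0.012
    {"name": "Ad C", "max_cpc": 0.25, "ctr": 0.10},  # score 0.025
]

# Because both variables were visible, an advertiser could predict the auction
# order exactly -- the transparency the comment says Quality Score removed.
ranked = sorted(ads, key=lambda a: ad_rank(a["max_cpc"], a["ctr"]), reverse=True)
print([a["name"] for a in ranked])
```

Under the old model a cheap, high-CTR ad (Ad C here) could verifiably outrank a higher bid; with hidden quality variables, that arithmetic can no longer be checked from the outside, which is the commenter's point.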

  4. Matt,

    It still sucks. You still don’t know the difference between La Canada real estate, and Canada real estate.

  5. So what’s this about you participating in a reality television show? (See point 3)

    🙂

  6. Matt

    I think there are 4 or 5 things Google really needs to take a look at:

    1. The data has to be fresher. I have pages that are now 404s and have been for 6-8 months, yet I can still find an older cached version in the index (albeit marked supplemental). But the question is, why is it still there?

    2. During my site structure realignment, I had to change the URL structure for a vast majority of my website. My site is an ecommerce site, so this change affected hundreds of pages. I deleted them as best I could through the removal tool, but about 130 pages had to be removed by entering each and every page in the removal tool. I was not able to just remove an entire directory, so I had to enter all 130 pages. Those pages are gone and they are not coming back. However, 6 months later I had to re-enter them all again, because once the 6-month removal was up, Google started showing them in the index again. I have also removed entire directories and prohibited Google from crawling those directories via robots.txt. Yet those came back as well and turned up listed in the index again. Yes, the robots.txt file was valid. Would it make sense to offer webmasters a way to permanently remove webpages from Google?

    3. My biggie: internal duplicate content. Many people feel that they are innocently getting hurt by this… me included. I have an e-commerce site; my index page ranks well, but no matter what I do, my category pages and my product pages won’t rank well in Google. The site is spam free, it conforms to the guidelines, and I have gone over it multiple times with a fine-tooth comb. I look at my competitors’ sites and see that their category and product pages rank just fine. They often have only internal links going to those pages, whereas I have a combination of internal and external links. And although our sites might be comparable, I have been careful to make sure our content is unique (i.e. not both using the same product descriptions). Thus I can only assume that I am suffering some kind of duplicate content penalty due to URL structure formats that changed many, many months ago. So why does Google need to be so heavy-handed about internal duplicate content?

    4. Because of my problems, I have to bring up what someone else said. Google needs to devise a way for webmasters to actually know if they are being penalized or if there is a clear problem. I understand that Google doesn’t want to give away the store, and I respect that, but for people like me who just want to get a fair shake, it ought to be easier than spending months or even years poring over SEO boards trying to diagnose a mystery problem. I don’t want to be #1, leave that for the fanatics, but if I am being penalized for duplicate content, hidden links, hidden text, or anything simple, I would like to know that so that I can fix it and be on my way. Shouldn’t there be a way to at least find out what Google hates about a particular site or page?
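For what it's worth, the directory-level blocking described in point 2 is normally expressed with Disallow rules like these (the directory names here are hypothetical):

```
User-agent: *
Disallow: /old-category/
Disallow: /old-products/
```

Each Disallow line blocks crawling of everything under that path; the commenter's complaint is that even with a valid file like this, the removed pages reappeared in the index once the removal tool's six-month window expired.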

  7. Local search for smaller towns. By this I don’t mean Google Local, which works fine, but if you search for, say, [ ], there are usually lots of sites that just list every town within the US.

    I’ve reported a few that I’ve noticed, but it seems to affect almost every service in every small town I’ve tried.

    As a real example if you search for [web design hackettstown] you will notice a listing for teammediaonline.com towards the bottom of the first page. Just swap out the URL State/Town and it works for anything, but doesn’t change the content.

  8. I think it’s great how Google is pushing boundaries by offering new products, but I have to question the boundaries. For example, why on earth was Google Talk released? Don’t get me wrong, it’s a great product and I use it to speak with a few tech-savvy friends. But do you honestly think that you can overtake MSN, Yahoo, and AIM as the main providers of IM services? It’s not a case of quality here; there is a huge monopoly, and as a normal IM user, there’s nothing that Google Talk offers that would make me switch from MSN, AIM, or Yahoo.

    And for heaven’s sake, bring Google Suggest out of beta – I love that :)!

  9. I enjoyed all your latest posts, Matt. I bet the majority of us who think we are being punished by Google are just messing up our code.

    Example: I just added alt and title tags to my home and garden blog, but am unsure whether they are compliant and not seen as spam. Should alt and title tags be the same text? Where do I go to learn about this? Surely NOT an SEO forum; are you following me?

    Further: if people do not have a good place to go to learn how to do things the correct way, they end up getting bad advice from others who are having temporary success in the search engines. I believe that when people understand what the rules are, they will gladly follow them…

  10. and Blah Blah Blah, sorry for the sloppy writing…but you get the point.

  11. When will we be able to upload entries from multiple domains using the Google Base bulk upload tool? Right now we’re only allowed to submit pages for one domain, as if there aren’t companies out there handling hundreds of clients.

    I realize most people’s “I’ve been banned” cries are just lack of knowledge and frustration mixed with a bit of over-expectation. However, wouldn’t it be nice to dismiss all of those cries with a simple tool called “R U Banned,” where you enter a URL (on a per-page OR per-domain basis) and get a yes or a no?

    Likewise, what about a simple tool called “Is Your Content a Duplicate,” which we can enter a URL into and get back one of three answers:
    – Yes, your content is a duplicate result and is not considered by Google to be the original. The following website is considered the originator (www.website.com).
    – Yes, your content is a duplicate result, but is considered by Google to be the original. Do you need to contact the following websites about copyright violation? (www.website1.com, http://www.website2.com, etcetera)
    – No, your content is not a duplicate result. It is the only version we have found in our indexes.
    Right now I use Copyscape to find out if anyone is stealing my content, but until I check rankings for those terms, it’s hard to tell who YOU think is the originator. Given the little experiment that was done on your blog by a certain other website last year, I hope that you understand how and why this is important to us. Also, give us a way to prove that the content is OURS, if indeed someone is ranking better for it than we are.

    Provide us with a tool that we can use to tell you which version of a variable URL is the MAIN version, and which ones you should ignore. I can’t put Noindex or redirect some of them because it will either affect all versions or cause your spider to get caught in an infinite loop. This tool would cure that problem.

    Get better at crawling dynamic URLs. I have a site with thousands of pages that haven’t been crawled in a month. We can’t use a URL Rewrite program because of the way our server works where this particular site is hosted, and the way our content management system is running. However, I have submitted those pages via xml sitemap to Google, have linked to the xml sitemap from the actual html web page, and have also supplied an html sitemap with all of the main pages where these internal links can be found. My navigation system is in plain text html, no java, no rollover buttons, no frames…

    AdSense, OMFG! PLEASE, please let me know where the clicks are coming from. You don’t even have to tell me which ad – just which PAGE! Even telling me which WEBSITE would be better than nothing at all! I have more than one affiliate website (which doesn’t have anything to do with my primary business) and I never know which one is producing and which one is a dud. What if one gets more traffic, but the other has higher CTRs? What if I hire writers, and want to pay them a commission on what their content makes? How can I get this information?

    I’m sure many others will have more to say, so I’ll stop there.

    Gratefully,

    Everett

  12. Matt,

    Since this is the “other” category, how ’bout your two cents on the SEO contest that is shaping up … especially since one “side/camp” says you need more Google Juice to your (non-www) URL … 😉

    alek

  13. Once again, my webmaster skills are often a large part of the problem, but in this particular case the fact that I do all my own work and have nobody to proofread my work often causes a problem.

    It seems that Google penalizes pages if you change them too often, particularly if you make the changes shortly after launching them. This is a double edged sword. I understand that Google likely sees this as a probable effort to manipulate search results, but in several instances, it was me doing dumb stuff that I didn’t notice before launching the new page or pages. For instance, last week I launched 12 new pages to do with my country. These are all strictly information pages which took me 2 months to put together, check the facts, etc. All 100% original content and unlike many, many other web pages out there, every fact is accurate and verifiable … which is why it took so long to produce.

    I launched them last week and have already discovered several mistakes and oversights despite the fact that I went over and over and over them many times to avoid just this situation in which I now find myself. Naturally, Google has already indexed them and I am afraid to correct them so soon lest I incur a penalty for my site.

    Isn’t there some way Google could ease up on changes to new pages? Say, give us webmasters one month to get it right, or at least a couple of weeks. I work alone and there is nobody to proofread for me. In addition, even if someone were to proofread, they would have to know everything I know about my country in order to find my mistakes. That’s not likely to happen! It’s a small country, but I have seen every square inch of it, photographed it, and written about it. Few people who were born here could say the same.

    So what do you say? Do you think Google could allow a grace period during which time we can make changes after launch without incurring a penalty?

  14. My comment is actually a question. You recently wrote about not having multiple sites linking into each other. We have a Web site and want to have two blogs. Will Google kick us in the shorts if we have two separate blogs that link to the main site? What if they link to the main site and not to each other? Would it be OK to have the main site link to the two blogs?

    Kathy

  15. I see sites that are blatantly keyword stuffing and getting by with fantastic rankings. Don’t want to post a specific example here, but Matt if you e-mail me I can provide an example.

    Andrea

  16. I am disappointed that there is no customer support for Google Sitemaps.

  17. How did you get such a high PR after the first update? How did you get to first on Google SERPs for your keywords?

    You’re first for “gadgets google”, 11th for “seo google”.

    I understand you have a lot of good content, and yes, you are a famous person on the web, and I guess a lot of people must have spammed your link in forums and blogs. But what is the key here that made your rank so high?

  18. Matt, I have to agree with Ledfish’s 4th point. There are a lot of honest webmasters who may have a problem, and when they are banned they really don’t know why.

    I haven’t had any problems with Google, but have known many who couldn’t figure out why something was happening.

    And Google has to be faster. It needs to index things a lot faster than it does.

  19. Hello Matt,

    First of all, excuse my English (I am Italian)… and thanks for your blog and the helpful suggestions you give us every day…

    Now let me get to something I dislike about Google (hoping it can help you improve it). My company spends between 80,000 and 100,000 Euros per year on AdWords (partly directly and partly through customers whose direct accounts we manage).

    The system is not so clear, in this sense:
    ACCOUNT A >> Keyword [pippo] >> on the first page of results with CPC = 0.40
    ACCOUNT B (a competitor of A; I manage both) >> Keyword [pippo] >> on the third page of results with CPC = 0.60
    CTR is similar for both accounts.

    Another case:
    I have some websites included only in AdWords. Every day we pay for about 20% more visits than the ones we see with PHP Stats on the site (http://www.phpstats.net).
    This does not happen when I test with other PPC companies.
    My explanation is that PHP Stats counts only unique visitors, but the AdWords system is not so precise.

    Matt, for me all this is OK, because we get a good economic return on it in any case, so it is not a big problem. But I think you should take a look, because other companies don’t think the same as us.

    For all the rest compliments because for me Google is the best!!!

    Greetings and happy new year!

    Flavio

  20. A Different Matt

    As an online store owner whose site vanished from Google without any idea why as part of the Jagger update, and hasn’t returned, I’d like a way to get feedback as to what tripped up the indexing of our pages. We pay for AdWords, have AdSense, and feed Froogle, so we are both a paying and a paid customer of Google, and we find it totally frustrating that there is no way to talk to folks at G. We understand there is a balance between good and evil, and that means G keeps lips shut to everyone. However, there must be some way (and maybe there isn’t, because everyone’s out to get G) to become a trusted partner of G. Is such a thing possible? I can’t say. But I can say we were hurt badly by Jagger, and so G is affecting the livelihoods of small folks. How to balance that against big evil nasty folks is tough.

    We busted our b*tts working on our site for the 3 months of Jagger, and now we wonder if all the change hurt our rebound after Jagger. Being without a clue as to what is good and what is bad leaves small businesses in a precarious position. We love G and yet hate G. We meet the ‘Do no evil’ spec in the G webmaster pages. I’m not whining, just explaining the position of a small business owner in a tough spot that depends on G to pass business on to us, just as we do the same in return.

  21. A Different Matt

    I know this is a double post, but I do think the feedback is important. I contacted Yahoo! customer support through a web form on Tuesday about a page they kept accessing, related to a 404 problem and the inktomisearch bot. The question was technically proper [I included Apache log entries and a complete description of the issue] and I received a reply back with a technically proper answer within 24 hours. I was surprised, and the Y! customer service rep even enclosed a quote from the search engineer they were in touch with. If G could provide that type of support… Yahooo!

  22. Make your websites valid tableless XHTML! Maybe it’s better to wait until IE7 is out (as it has much better CSS support) but you should really think about that.

  23. It seems that Google penalizes pages if you change them too often, particularly if you make the changes shortly after launching them. This is a double edged sword. I understand that Google likely sees this as a probable effort to manipulate search results, but in several instances, it was me doing dumb stuff that I didn’t notice before launching the new page or pages. For instance, last week I launched 12 new pages to do with my country.

    Yep that is me exactly, I do this, do that, try this, try that just like anyone else who is learning. The first site I made to sell my rain barrels got rewarded for this but today’s sites get no love at all. New sites get lots of visits from MSN, a couple from Yahoo but none from Google that are worth anything. New sites (even 5-8 months old) are treated like splogs in Google and even though they are highly focused Google looks at them with clouded lenses by choice.

    I am going to upgrade to WordPress v2.0 now and even change my theme, which changes the code; all of a sudden a calendar shows up, and categories appear that I removed a long time ago. It shouldn’t be an issue, but I believe Google jumps the gun.

    Prove me wrong.

  24. I hate that my site appears in the index with all of its pages spidered, but with virtually no traffic from Google. I hate that you guys don’t at least TELL us in a direct way whether we have been penalised or not. I hate that Google is EVIL sometimes, no matter what the slogan says. I would like to know what I did to make Google be so EVIL to my site. Got original content? Yes. Got inbound links? Yes. Did any evil SEO? No, I don’t think so.

    And Matt, if I somehow got lucky and you actually read my post, please just send an email and tell me if my site is penalised or not, because I’m running out of ideas, patience, and the will to live. 🙁

  25. I guess I would like to see some clarification about hiding text. I see this as such a tricky issue for Google, its trickiness being the reason nothing is forthcoming officially.

    The problem:
    Many sites use hidden text to rank well. Do a search for “vermont web design” and look at the source of some of the sites in the top ten; it’s insane.
    Solution:
    Penalize hidden text

    The problem:
    Image replacement is a massively popular technique of replacing text with an image using CSS. It even has the benefit of increasing accessibility with screen readers.
    Solution:
    Do not penalize hidden text

    The problem:
    Conflicting problems
    The Solution:
    ?
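To make the conflict concrete, this is roughly what the image-replacement technique in question looks like (a generic sketch of the common negative text-indent method, not any particular site's code; the selector and image name are made up):

```css
/* The heading text stays in the HTML for screen readers and crawlers;
   sighted visitors see the background image instead. */
h1#logo {
  width: 200px;
  height: 60px;
  background: url(logo.png) no-repeat;
  text-indent: -9999px;  /* slides the real text off-screen */
  overflow: hidden;
}
```

Mechanically, this hides text just as surely as spammy hidden text does, which is exactly the conflict the comment lays out.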

  26. I’d give my right arm for Google to offer explanations of why sites have been penalized with PageRanks of zero.

  27. Fix GOOGLE searches by domain!
    1) Looking for domains: when I search for digg.com, Google USUALLY returns pages that mention digg.com and NOT ONE digg.com page! If I am looking for digg.com, Google needs to move that domain’s pages up in their returns.

    2) NOT looking for a particular domain: Google needs a feature that BLOCKS returns for a particular domain (yes, even if it’s a google sponsor). The other day I was looking for a car part and I got 50-ZILLION returns for rockauto.com (none of which was what I wanted) and ‘ trunion -“rockauto.com”‘ DID NOT block the returns!
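One possible explanation for point 2: quoting the domain as a phrase only excludes pages that contain that literal string in their text. The operator for excluding a whole domain from results is `-site:`, e.g.:

```
trunion -site:rockauto.com
```

That said, a built-in per-user domain blocklist, as the comment asks for, would still be handier than retyping the operator on every query.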

  28. I’d like to see Google become more communicative with its customers and the webmaster community. For the better part of 2005, I was doing 4 figures worth of business a month with Google, but couldn’t seem to get anyone there to give me the time of day. I get treated better when I buy a pack of gum at the corner store. I’m starting to move my business away from Google. Another thing, why such a focus on SEOs and not with the webmaster community as a whole? Why is there a GoogleGuy at Webmasterworld and not at places like Sitepoint.com which has a more general webmaster audience that is far friendlier to Google and much more receptive to what you’re preaching?

  29. This is just a list of things I think may be useful for users, webmasters, etc.

    1. Numbered search results

    2. Bring back the number of indexed page count at the bottom of the home page 😉

    3. Subcategorize SERPS. For example a search on ‘quotes’ will show results for insurance quotes, quotations from famous people, stock quotes, etc. It would be great to have a “did you mean” for this issue as well.

    4. Language matching, if a keyword with the same exact spelling shows up in several languages, it would be great to have the option to choose the language, with a quick link above the results instead of having to do an advanced search.

    5. A weekly special Google logo 😉

    6. Exclude domains which are parked and just contain list of advertiser links using exact match search traffic to generate revenue. That is basically spam.

    7. A list of authority sites to replace DMOZ

    8. Totally discount reciprocal linking (I think this is being done already) 🙂

    9. Discount site-wide links (web design sites plaster their links all over the place). Only one of those links should be counted.

    10. Local search results should have the business listings which have websites listed first.

    11. Keep an eye on known spam areas (e.g. real estate, satellite dishes)

    12. Software/Freeware search

    13. Antivirus Program. Not as a part of Gmail 😉

    14. Pay closer attention to the “Similar Pages” link; quite often it brings back irrelevant results, just because those pages are linked to from the same page as the initial page.

    15. An option to have PDF-free SERPs. Sometimes more than half of the top ten SERPs are PDFs.

    16. Do not automatically favor .edu sites. Sometimes the pages that show up are of a professor who has mentioned the subject briefly, and ends up ranking because of the .edu domain.

    17. Patent / Trademark / Copyright search.

    18. Nicer Interface for Google Base 🙂

    19. Results beyond the top 1000. I’d pay for this one.

    20. Do away with the public PR! It is driving people crazy. Everyone concentrates too much on PR, forgetting that they should be working on providing something useful: good content, tools, etc.

    21. Decrease the importance of link-popularity. A lot of good sites get buried by sites which have an aggressive approach to artificial link development.

    22. More communication with webmasters. This is another service that I would gladly pay for.

    23. Put me on the Google payroll so that I can get paid for these ideas. Tee hee! 😉

    As always, thanks for the great blog and the opportunity to voice off.

  30. I don’t know what you did miss, but I can tell you what you’re going to miss…

    The next 4 weeks of sunlight… you’ll be spending it reading all of these replies.

    Good luck man, only a sick sadistic SOB would wish all of this feedback upon himself; and that’s why we love you Matt 🙂

    Seriously though, kudos to you for doing this. I hope it works out; or at the least that you have a team of googlers working underneath you that you can pawn all this reading upon.

    Thankfully, you don’t have to reply to it all.

    Good luck.

  31. Aaron, speaking of your changes… I just changed the dictionary page at http://www.noslang.com into 27 separate pages (1 for each letter) instead of 1 page with all the letters…

    I’ve used mod_rewrite so the URLs will look like noslang.com/dictionary/w or whatnot…

    The old full dictionary page got TONS of Google traffic… Now I’m afraid of what will happen with the new layout. I hope it doesn’t kill me.
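A mod_rewrite rule for URLs like that would typically look something like this (an .htaccess sketch; the script name `dictionary.php` and its `letter` parameter are hypothetical):

```apache
RewriteEngine On
# Map /dictionary/w (a single lowercase letter) to the underlying script.
RewriteRule ^dictionary/([a-z])$ dictionary.php?letter=$1 [L]
```

Since the old single page already ranks, the usual move is to 301-redirect it to the new letter pages rather than leave it returning a 404, so its accumulated traffic and links carry over.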

  32. Have Google stay a pure sell-side entity and be more mindful of Google’s increasing conflicts of interest.

    Example: Google should have more integrity regarding the buy vs. sell side issue of PPC (and of course SEO, but I don’t see SEO as a problem — yet). You can’t represent someone on both the buy and sell side of a transaction. It’s as absurd as having an attorney represent both sides of a case.

    The fact that Google (Yahoo and MSN for that matter) assigns account reps to large PPC accounts is a clear conflict of interest.

    The motives of a sell-side entity giving buy-side advice are suspect as long as human beings are not perfect. Look at all the crap going on in Washington right now, and look at all the people working at Google. It is inevitable that this power will be abused.

    Example: a Google account rep has a mortgage company that finds some keywords that are a treasure. No one is bidding on them, the client is making tons of money on low cost terms.

    In this scenario, it is in the best interest of Google to whisper those terms to their other PPC clients and start a bidding war. These clients think, “Oh, what a GREAT piece of advice Google just gave me. I LOVE Google!”.

    These clients then bid up the term, all the while stuffing money into Google’s pockets (hmm, why is it that Google is always making more PPC money each quarter than anyone thought possible??). The client that had the term to himself previously doesn’t realize Google screwed him; he just thinks others finally caught on. No transparency.

    I am scared that Google is going to find a way to monetize natural results (Yahoo already has).

    Matt, you might say, “but we have policies and systems that prevent this kind of thing from happening”, or, “but we’re too moral to do anything like that”. I think both of those perspectives are naive. Mankind has had policies, laws, commandments, and scriptures, all of which are penetrable.

    So have Google stay clear of any revenue model wherein it benefits from the outcome of the results it serves. (I see a lot more hard-coding of results and Google on the top of PPC results, like a search for “science news” illustrates. Nice job Google! Did you get a discount on that top PPC result? I wonder how that #1 result affects the businesses of the companies that were LEGITIMATELY bidding for, and occupying, that spot previously).

  33. Thanks for your weblog!

    If Google is still prioritizing SERPs for competitive/popular keywords toward sites listed in DMOZ, I would like to see a reduction of that emphasis.

    We have tried unsuccessfully for well over three years to get listed in an appropriate category for a high-quality site of unique, content-rich material.

    The BS some webmasters must endure [myself included] to get an appropriate listing in DMOZ is extremely high and debilitating.

    I am hoping Google will offer some relief in this regard.

    Cheers.

  34. Hey Matt,

    I enjoy your blog… What would be the easiest way to get an idea in front of Google? Especially an idea for a service that would benefit users… not a change to the search engine, but an idea for a service that would greatly benefit the public and would be a good fit for Google. And no, I don’t want any money for it; I just want to see somebody do it and do it well… It is related to weather information…

    JO

  35. Matt,

    What I’d like most from Google is an email to the webmaster when they remove a site from the index, or better yet a warning. If the point is to clean up the index, rather than punish offenders, then this system seems more fair. Currently it is possible to lose a good deal of income and not have a clear idea of why, or of how to ‘fix’ the problem.

  36. I’d like some effective means of remembering the dozens of passwords I use on many different login accounts.

    Google should design some sort of fingerprint identification screen that can be used as a password instead.

  37. Hi Matt

    You mentioned in an earlier post that supplemental results are crawled by the supplemental Googlebot and that normal Googlebot cannot change the URL of supplemental results. So does that mean that once a page goes supplemental, it is stuck as supplemental until the next supplemental Googlebot visit?

    E.g., normal Googlebot cannot make the page non-supplemental; or are you saying that even after the page is non-supplemental, a supplemental ghost exists until supplemental Googlebot visits?

    Also, with Bigdaddy I would still like clarification on whether sites that were hit by the canonical problem and now appear fixed in the new DC are expected to regain lost ranks. If this is already supposed to have happened, then site owners need to look at other causes of problems.

    Cheers

    Stephen

  38. Ever since the search engine spam wars, Google has lost “some” of its ability to handle plurals well. I’ve seen SERPs where just adding an “s” to a noun completely changes the search results.

  39. Matt, great blog. I just wanted to add my comments about a Google “bug” that annoys the heck outta me; maybe it can be fixed for the new year. Let’s say for whatever reason I’m searching Google through a Google IP address rather than the usual http://www.google.com, and I type in “Matt Cutts: Gadgets, Google, and SEO” http://66.102.7.104/search?hl=en&lr=&q=%22Matt+Cutts%3A+Gadgets%2C+Google%2C+and+SEO%22 . Let’s say I want to do a new query, but I’m too lazy to clear out what I’ve already typed in the box… I click on the Google logo and I get http://www.66.102.7.104/webhp?hl=en which obviously doesn’t work. Tell someone, somewhere to fix that bug. Thanks.

  40. I would love to see the search engine being able to understand words like “what, how, why, when, etc.”

    It would require a totally new algorithm that can understand which pages give answers to the above question words. Of course an additional problem will be how to implement this for other languages, but I am sure you guys can figure that out.

    I know Google is after information retrieval and not just data retrieval as is done now. I think it is needed to understand language before you even start thinking of real information retrieval.

  41. Ummm… How to make google better….

    Okay I typed in “2nd mortgages” and this link comes up:
    http://www.searchguide.com.au/cgi-bin/se/search.cgi?keywords=Production

    Now this is what really makes me angry. I mean, I see those links (or click on them – with their damn cookies that never expire) and the whole page is pure B.S.! What the hell do they have to do with 2nd mortgages?
    There is absolutely nothing to do with 2nd mortgages – nada, nothing! Garrrrrrrrrrrrrrrr!!! Anger!!!!!!!!!!!!

    If I wanted to see another search engine (not SPAM linkage) I would have visited Yahoo or, god forbid, MSN *shudder* (though as a side note, my blog has zero hits from Google but quite a few from MSN, which is kinda surprising because I use blogspot.com… and no, I don’t sell anything). So that’s what I HATE, I mean completely HATE!!!!!!!!!!

    Okay, okay…… Now on with the rest of it…

    I really like Gmail, and I think it would be cool if you gave everyone an RSS reader that somehow integrates with the Gmail experience (so don’t just stick it in there; put it on another tab/page or something). Google Mobile is brilliant! (and works very well with the Motorola A1000 default Symbian browser – but not Opera Mini atm…)

    Perhaps a web/handheld version of Google Talk (say, just instant messaging rather than talk), esp. now with WiFi happening everywhere.

    In Google Sidebar, a currency converter and a world time finder would be awesome (handy for non-US residents). And in Quick View, make the frequently used files links show actual frequently used items (there always seem to be new-item links instead).

    Lastly, a bookmark saver would be really cool too (perhaps somehow tied to Gmail, or to the RSS reader). It would also be great for you guys – better advertising, since you’d know directly what the consumer is interested in.

  42. AdSense

    I think it is too easy for a competitor to get you kicked off AdSense. I was averaging about $10.00 per month when all of a sudden I got about $3.00 in clicks in one day. Within days my AdSense account was shut down, and I lost the appeal. I have no idea what happened, but I am sure someone used an autoclicker to get me banned.

  43. Hi Matt. Fascinating blog.

    If you want another case like Liane’s only worse: about 13 months ago I changed my non-profit, informational site from frames and tables to xhtml and css, to improve accessibility. Its rankings in Google dropped dramatically and have never recovered.

    I update my pages and/or add to them fairly regularly. They would rapidly become out of date otherwise. But also I want to provide more of what I know my students or other researchers are looking for.

    Fortunately many of the people who would find the site useful already know where it is, or have access to it from their university intranets.

    Since Google currently provides about 1% of the site’s traffic, but Googlebot indexes the entire site every day (which makes it difficult to see the real visitors in my stats), I wonder whether I should just ban Googlebot.

  44. I think better explanations of, and alternatives to, certain issues would help. I have noticed quite a few postings here about ‘what you can’t do’ and what Google does not like, etc. I rarely see alternatives or explanations for any of it.
    It’s all very well to tell a webmaster “you can’t do this” – but do it without giving an alternative that would enhance the way the webmaster does business.
    It’s just a thought Matt and something that you may want to consider.

  45. Hi Matt,

    My big problem with you and Google is the way you handle a problem when it shows up and when it’s reported to you (I mean Google): “C’mon, it’s not true, it doesn’t exist” would be one of your favourite answers… when there is an answer.
    As a member of the French DarkSEOteam, I saw on the 302 hijack problem – which used to be called the GenHit problem at the beginning of the story – that GoogleGuy was complaining that he had no feedback and that he was waiting for examples with URLs.
    Please, this “we don’t know what you are talking about, but we will check and fix it as soon as you do your complaining job properly” attitude is stupid.
    So how could this be possible?
    http://pr10.darkseoteam.com/
    We have a great PR collection, from PR10 to PR5; feel free to check.
    As a matter of fact, we never saw GoogleGuy again in those 2 very large threads which were talking about that problem:

    Let’s Test Hijacking A Google Listing
    http://forums.searchenginewatch.com/showthread.php?t=3030&page=1&pp=20
    Come on Google, Fix it !!!
    http://forums.searchenginewatch.com/showthread.php?t=2979&page=1&pp=20

    Now, the canonical URLs problem is well known, but still not resolved – maybe on the BigDaddy DC… maybe. That means you finally got those examples you were asking for; users did their job properly, but what about you?
    Do you imagine how many webmasters lost their business because of that?
    But that wasn’t a major issue, which is why Google didn’t do anything for a long time… waiting for examples.

    Now we have another problem and Google’s attitude is exactly the same, even worse: as a Google AdWords client, I have sent about 10 e-mails to Google AdWords staff in Madrid and Paris to report the problem… no answer. That’s pretty bad, as I was facing a problem as a client. My ads wouldn’t show up; I just asked why, in French and in Spanish, by e-mail to my AdWords rep, who had always answered me very quickly before that. I’ve also filled out the feedback form many times, in French and Spanish, just in case… nothing. It’s amazing how this deafness syndrome (or should I say blindness) affects Google staff when you talk to them about things they don’t want to hear.
    I also wrote several posts about this in SEW forums, in those 2 following threads:

    AdWords In UK Sucks — Massive Duplicate Ads Get Through
    http://forums.searchenginewatch.com/showthread.php?t=8588

    AdWords No Longer Human Reviewed?
    http://forums.searchenginewatch.com/showthread.php?t=8979

    In the first thread by DaveN, the AdWords Rep even came and he wrote:

    “No sure how, but I somehow managed to miss this entire thread until this morning. Clearly, I should hang out here more often.

    In any case, I sent the thread to the right folks a short time ago. Many thanks for the heads-up.

    AWR”

    He never came back to the thread, and if you type “plumbers ripon” into Google UK you will get the same problem DaveN highlighted: the same site advertising many times.
    Maybe he really did send the thread to the right folks, but maybe they just couldn’t read it… the deafness-and-blindness syndrome Google staff seem to be particularly affected by.
    Problems, problems, so many problems. Why do people only come to you with their problems? Don’t they have anything nice to talk about?
    As long as they don’t come chanting “problems, problems, etc.” the way some other people chant “developers, developers, etc.” in such a scary way, it should be acceptable, shouldn’t it?
    http://www.ntk.net/media/developers.mpg
    My problem is a bit different.
    I work as an independent SEM and I have an online casino as a client.
    I did a lot to educate them not to use tricks – not because I want them to be as white as you would like, Matt, but just because it’s the best way to succeed.
    I won’t explain their problem here, as you will find all the details in the 2 threads I’ve mentioned, with screenshots and everything you need.
    I just wonder why my client’s ads don’t show up in AdWords, and why their AdWords rep doesn’t even want to answer them, when other casinos appear in AdWords with their own URLs, without “sneaky redirection”, without using tricks.
    If advertising gambling sites is forbidden in AdWords, the rule should be the same for everyone; otherwise Google is creating business inequality, providing a service to my competitors that it refuses to me, and that is very bad.
    Please don’t come and tell me that the problem doesn’t exist anymore just because 888.com and the other casinos I mentioned in my posts don’t show up anymore – many online casinos and casino guides are still advertising in AdWords.

  46. Hi Matt:
    Update Google Earth! Make it so you can interface it with GPS and cell phones. I think then you could use your cell phone like a GPS and actually see where you are in the world.

  47. I guess an unpopular comment but needed is this:

    Google needs to stop rewarding sites that use Adsense with better SERPs. Too often now the sites at the top of SERPs for keywords are crammed full of as many Adsense boxes and links as they can get away with, often using RSS feeds to make the site look fresh.

    Use of a Google product, either by displaying Adsense or by buying Adwords should not be a factor in the ranking of a site, IMHO.

  48. Having read all 6 feedback requests, and all suggestions made so far, I’m kinda exhausted 😉

    Since Matt said he wouldn’t respond to any of these suggestions, let me throw in a little help:

    Everett Sizemore => Regarding your AdSense issue, you CAN do what you are requesting by adding and managing CHANNELS. Look it up in AdSense help.

    Liane => You got it wrong. Google won’t penalize your site if it changes often. It’s completely the other way around (take a look at blogs – some of them change several times a day!). It will only look like you are being penalized if you suddenly erase or rename a page in order to make a new one (say you renamed country.htm to country.php). If that’s the case, you need to do a 301 redirect from the old filename to the new one. Look it up on Google, or in Matt’s blog – he’s talked plenty about it.
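    A 301 (permanent) redirect like the one described above is easy to get subtly wrong, so here is a minimal sketch of the idea – assuming a Python/WSGI setup; the country.htm → country.php rename comes from the example above, and the redirect table and app are otherwise hypothetical:

```python
# Hypothetical old-name -> new-name table. Only country.htm -> country.php
# comes from the comment above; the rest of this WSGI app is an assumption.
REDIRECTS = {
    "/country.htm": "/country.php",
}

def app(environ, start_response):
    path = environ.get("PATH_INFO", "")
    if path in REDIRECTS:
        # 301 (not 302) signals the move is permanent, so crawlers can
        # transfer the old URL's history instead of treating the new
        # filename as a brand-new page.
        start_response("301 Moved Permanently",
                       [("Location", REDIRECTS[path])])
        return [b""]
    # Everything else is served normally.
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"hello"]
```

    In practice the same thing is usually a one-line rule in the web server configuration; the point is simply that the status must be 301, which any header check tool will confirm.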

    Aaron Pratt => Whatever change you make to your site, be very careful with your 301 redirects (See above). Otherwise, you’ll “lose it all” in google and appear to be starting from scratch.

    Ben => At a glance your site looks OK, however there are a couple of things against you. First, you are linking to a site or sites which are banned. You need to either remove those links entirely or add rel=”nofollow” to them; otherwise you risk getting banned as well (by association). It sucks, I know, but you have to check all your outgoing links (and on a big site, that can be very, VERY tedious)… Secondly, your site doesn’t validate at all (whoever programmed it doesn’t understand a thing about validation). Check this out:
    http://validator.w3.org/check?uri=http://www.freepress.net
    You’ll find a LOT of errors, and that’s just on page 1… you need to check and fix these issues on your whole site…
    Once you got both things fixed, you need to wait for the next time there’s a PR update, usually every 45 days. There are forums out there that can help you, or tell you when the next one is going on. If you don’t get some form of PR then, you should go to google, and file a reinclusion request. Instructions for how to file one properly can be found here, in Matt’s blog. State that you’ve completed all these steps, and yet can’t find what is going on. Ask for help from a google engineer, but be very specific and polite.

    Ryan => As I’ve said previously, be very careful that you are returning 301 redirects. It would be wise to verify this with a header check tool. I use this one:
    http://www.rexswain.com/httpview.html

    Hey Matt, how did I do? 😉

    Regards,

    Luis Alberto

  49. Matt,

    For the sake of accurate feedback it would really be good if at least one Big Daddy centre was live with the new index 100% of the time.

    At present 66.249.93.104 keeps switching back to the default Google index, and many people are mistaking default results for Big Daddy results, causing havoc on the forums!

    You cannot ask for feedback if there are no results to give feedback on…

  50. I don’t know whether this is the right place; I’ll just try. I’d like to have two features at Blogger:
    1. As a reader, I’d like a button that gets me to the previous entry. Whenever I discover an interesting blog and want to read it all, it is annoying to search through the archives.
    2. To reduce the amount of clicking even more, I’d like to be able to display the entries in chunks, i.e. ten or even more on one page.

    Of course, as a blogger I would also like to be able to offer that to my readers.

    Best wishes

  51. Health sections have been overrun far too long by gigantic sites dealing with every conceivable subject and trolling only for the money keywords. Jagger made this catastrophic. What’s bad about these gigantic sites is that 99.999% of them rarely have more than a page devoted to the keyword or subject area. Their pages rank well based purely on site size. This has to end in the health sections for smaller sites to survive. What happened to theming in these areas? You can have a site with 300 pages devoted to the subject and rank 200th, while the large site floats a half-page fluff piece and ranks number one.

    You can’t offset what these gigantic sites are doing in health sections strictly by using AdWords. Ironically, these large sites pay virtually nothing for spamming thousands of health categories.

    Also, quit penalizing sites with good content just because they sell something. AdWords is only an adjunct; it definitely won’t carry a site.

    Plus, for God’s sake, start killing off this “content is king” thing Google spawned. It has brought more garbage to the Internet than anything in history we will ever witness. Too many people, thanks to Google, are creating page after page of worthless and stolen content to earn livings. Forum posts like “how many pages of content should I create daily to rank well in Google” are just plain ridiculous.

  52. I personally think you guys are doing a pretty good job. Apart from that I have one question to better understand the Google policy.

    How come you guys don’t put a link to your privacy policy right on your local home pages, so that it can be easily accessed by anyone? I actually don’t really get why you still haven’t done that. Sounds a little weird to me.

    In my personal opinion, considering the fact that you’re the best search engine ever (no doubt about it) and also that you’re managing millions of queries every day, it would be something really appreciated by your users. Think about it!

    Also, any news about a possible semantic/ajax technology release?

    Hope to hear from you soon.

    Jacopo

  53. Apologies for sounding abstruse but I had to make these points:

    I think that Google does not grasp the increasing level of *very legitimate* confusion, frustration, and even hostility within the experienced white hat web crowd about issues relating to penalties, severe downranking, filtering, point subtractions in the ranking algo, etc. Why should you care? It’ll start affecting market share soon, if it isn’t already – we were the guys who got people to use Google in the first place!

    The assumption at Google is that this confusion “keeps the spammers off balance”, which is certainly true to some extent, but full-time spammers are better than legitimate webmasters at finding the loopholes and actively exploiting them. Thus Google increasingly faces the challenges of really good spammers PLUS a formerly sympathetic but increasingly frustrated white hat webmaster community.

    Compliance would *dramatically* increase if people had more specific and technical ideas about what compliance entailed. Jeremy Z was one of the *world’s* most experienced internet professionals and a major search industry insider, but even he didn’t know whether Google would consider the sale of links at his site appropriate. Issues like this simply are not addressed specifically in the guidelines.

    I recommend you have some support people read the webmaster guidelines carefully and then answer your own questions about what is allowed and what is not – you’ll see how subjective they are, and thus see the challenges we all face out here in web land.

  54. I like the Google blogs like this one. It’s a good form of communication.

  55. Google earns money with AdWords – a lot of money – so I am a customer and not an applicant!
    I would like to have support like a customer.

    I am a white hat SEO.
    Take a look at my website; you can check it.

    But the SERPs of Google unteach me. They teach me it is better to create doorway pages than good content.
    🙁

    @bad neighbourhood
    Some SEOs believe that if a webmaster uses “nofollow” he tells Googlebot that the linked website isn’t OK.
    I don’t know if this is a long shot or not.
    But if so: why did you use nofollow?

    For years Google was a good alternative search engine.

    I have promoted this search engine.

    Now I have to realize that I have “cherished a viper in my bosom”.
    I am reminded of “The Sorcerer’s Apprentice” by Johann Wolfgang von Goethe.

    regards
    Monika

    >> If I am looking for digg.com, Google needs to move that domain’s pages up in their returns.

  57. >> If I am looking for digg.com, Google needs to move that domain’s pages up in their returns.

    @AMCer: It looks like you need to do a site:digg.com search instead.

  58. Penalising new websites, usually under 8 months old, is like disliking someone because they have green eyes, just because you know someone with green eyes who has done something wrong.

    Yes, these sites’ index pages should rank lower than established sites’, but they should not be dropped completely out of the index.

  59. Hi Matt,

    Adding a comment without reading the comments as requested.

    1) It’s extremely hard to know why some websites don’t get indexed. I go over a site again and again, and usually I end up sending 5, 10, 15 emails appealing (begging?) to have Google add the site. They eventually do. This is for NEW sites, not reinclusion. I build tons of links and verify everything is OK per the webmaster guidelines, and still nothing. So this is always a frustrating point. I can have one site indexed within 24 hours and another take 6+ months, both using the same technology (Mambo) – like http://www.recipetrove.com, which you all won’t index and I don’t know why – and some using the same template. I did 3 sites, all with Mambo and the same template, just in different areas of interest (general web directory, shopping site, travel site). 2 of the 3 were indexed in 1 week; the other took 6 months and many, many emails, most with no response from a human.

    2) I think a Google-run forum with a few dedicated Google staff would go a long way toward interacting with webmasters. I think many webmasters feel ignored, neglected, or just plain unnoticed (are emails sent to Google ever seen by human eyes?).

    I am sure I could think of more, but I’ll cut it off here. Do public relations *on* the internet, hot and heavy, and let webmasters know you hear their voices (and bold 24pt fonts!) :-p

  60. From a user’s standpoint, I’d love it if you could prevent those “directory” sites from showing up at all.

    When I’m trying to find a product, it’s really annoying to have to check 10-15 sites that appear to be useful, only to find out they are just another link farm.

  61. I’d like to know what Google is doing about Google Bowling.

    It appears to be really easy to sabotage a web site by putting links to a site from bad neighborhoods.

    Is there a way to combat this? Is Google doing anything about it?

  62. End Google’s participation in SEO conferences, which is like having chickens rent a booth and host a party at a foxes’ convention. 🙂

  63. I think one of the best points above, made several times: tell us why a website is broken.

    “Give us a hint or a little clue.” What would it hurt to point us in the right direction? Something like a Google site check with a pass or fail answer.

    At that point, make us provide a driver’s license or social security number or first-born to get a private second screening. Then have the Google site check give an answer of “spam, bad links, dupe content”, and have the link or tool evaporate so it cannot be abused.

    We do not need a hidden-text notice, as you have already provided one via the Google cache text-only view. Or you can highlight the page. But we review and optimize and review and review a site to no avail. I bang my head on my desk and try more things: HTTP header checks, robots exclusion, back links, validation, hidden text, on and on. This is in reference to a PR6 site that is in the index. It ranks in Yahoo and MSN, but not a stitch in Google.

    For us SEOs it would be nice to know what we are up against when we take on a client. IT REALLY SUCKS TO TAKE ON A CLIENT FOR $X AND SPEND 10X IN TIME, only to lose my shirt and credibility. We (SEOs) provide Google with good content: rich titles, descriptions, and sites that meet your guidelines.

    Help us make a living!!

  64. I would like to see you list your 10 favorite suggestions in an upcoming blog entry….

  65. It is amazing that Google hasn’t updated its guidelines in years. Does anybody at Google ever create content of any kind? What do Google employees actually do all day? I’ve updated guidelines, brochures, and instruction manuals dozens of times since 2000, making them available on and off the Internet. If you can’t provide service, provide the information for people to help themselves. Instead, customer service at Google takes the form of a blog. Webmasters with questions have to resort to one of the most notorious spamming vehicles out there: a blog.

  66. Aaron Pratt,

    Alt should always be an accurate text description of the picture.

    Title should be an accurate description of the landing page the href links to.

    For all these and more, make your way to: http://www.w3.org/

    RE: “It is amazing that Google hasn’t updated its Guidelines in years”

    They were updated about 3-6 months ago.

  67. Simple – why is Google returning the same site for nearly 80% of my searches (generic shopping), when most of the time this site doesn’t actually lead to the product? IT IS A JOKE.

    Search for Bratz Bedding.
    The no. 1 site – a comparison site – lists:
    1) an eBay page – lists generic Bratz products – no bedding
    2) Shopzilla – another comparison site
    3) uk.shopping.com – generic shopping site – dynamically produced
    4) PriceRunner – comparison shopping site
    5) tdtextiles – sells bedding – but not Bratz
    6) Harrods – again, sells bedding but not Bratz

    This site has millions of pages indexed in GG and comes in at no. 1 or thereabouts for probably millions of searches, and yet it’s hit or miss whether it actually leads to the product – CRAZY.

    This is just one example of the dynamically produced price comparison sites that congest the top of the searches – and now many, many more are being created, as networks make it so easy to create pages from merchants’ product feeds.

    maybe it’s time for ‘mom and pop’ webmasters to throw in the towel.

  68. I don’t know if this will make a difference. I am looking into the pharmaceuticals sector and it’s a mess. For example, a search for Cialis brings up so much spam. I’ve filed a spam report and a dissatisfied report several times, but nothing happened. It’s the same problem for almost any pharmaceutical search term: Viagra, Levitra, Propecia…

  69. Matt
    I read your blog almost daily and it’s generally very helpful.

    One thing with this current post

    You are asking us for feedback – what about giving us some feedback?

    1. Am I banned?
    2. Am I being penalised, and why?

    I think that your average webmaster (not SEOs – white or black hat) needs this advice. I am a ‘one man band’ with a handful of small business websites (mostly ranking well at the moment) who doesn’t have enough hours in the day to spend searching SEO forums for answers that may or may not be correct.

    Your webmaster guidelines provide the absolute basics. Every website I have ever made has conformed, yet it took me 6 months of repetitive emails – and then resorting to blogging and leaving ‘ignored by Google’ posts in as many places as possible – before I finally got an answer from a human (and I’m sure I was a lucky one), which by the way didn’t help at all with getting indexed.

    It’s understandable that you won’t give away all your secrets – but some more detailed guidelines on what is and is not acceptable would help us all. It’s not going to help the spammers, as they are looking for ways around it anyway.

    You do help us with your blogging, for which I am very grateful – but a quick 10-point list of “do this but don’t do that” would help so many in my position.

  70. The one thing that comes up time and again is the issue of communication between Google and webmasters. Google’s main task is as a ‘provider of information’ in the form of its indices.

    Many people are unaware that their site is ‘penalised’ or ‘banned’. However, many more are unsure if their site is listed in the correct position. This issue is daunting: Google is the best at providing quality results, but due to the nature of indexing, it is virtually impossible to get it right 100% of the time.

    However, for those with concerns about their site’s status, it would be a fine idea to have an area in Google Sitemaps which generates a synopsis of that status. The ‘stats/index stats’ section gives links to relevant URLs for certain information, but a more in-depth analysis stating whether the site is penalised or banned, and the reason why, would give webmasters something to work on so as to get their site back on track.

    There are two basic avenues to building a website. The first is to learn everything needed to create a legitimate website before building it. The second is to pay a professional to do it. This of course is in a perfect world; as we live in the real world, there are anomalies outside of these, such as not having the time to learn everything, or not having the money to pay a professional. So websites are built with different levels of expertise. [Some are also built with full knowledge of trying to spam or cheat, and they deservedly get banned.]

    But it’s the middle-of-the-road webmaster, with little knowledge or funds, who needs assistance and guidance, and Google should perhaps reach out to those and help more in giving them a guiding hand.

  71. In the same way there is a Google AdSense account exec, there should be a Google Search AE. Or at least have the AdSense team know what the search team is doing – the outcome all falls into the Google $ pot in the end!

    Most of the Google search “problems” of honest site owners are innocent rather than black hat. Many SEO “experts” are the online equivalent of used car salesmen, or are just as clueless as the rest of us when we read their WebmasterWorld posts.

    There needs to be a unified, authoritative service representing Google’s preferences of the moment and working with site owners who are experts in their own content field, but not as savvy in the latest tweaks at Google, or interested in spending half their lives following the fluctuations at DCs and guessing about them in forums. We have enough of a time reporting Google snags in our search categories.

    Having you select a query and analyze it here is informative, but not everyone wants to be subjected to a public comments flogging, or is lucky enough to have you pick them (go, Gwen!).

    That is why, with all the Google additions, it would be nice if Google could provide a service to unify webmasters and itself, rather than have us trying to make sense of crumbs.

    If such a service does exist, please let me know either here or by e-mail. Thank you for personally doing what you do in this blog, but as a company, Google needs to do a whole lot more…

  72. Matt

    How about something on cross linking a collection of sites that you own and operate.

    I run about 6 financial websites in the UK and am always scared to cross link them, although these links would help my surfers a lot.

    I realise that Google has an uphill battle with spammers, and cross linking is one of the tools they use, but how about some guidelines for people who want to cross link to help their surfers, a la the Google rules?

    Thanks a lot

  73. Sometimes I search for subjects and most of the top items are old – like 5+ years – however I’m usually after more recent entries, as most subjects evolve over time. I’d like to be able to influence my results – e.g. increase the influence of date on the search. I know I can restrict results to the last 3 months, but even that can be too old.

  74. My only non-spammy-searchy request would be to continue to improve the popup blocker on the toolbar. Many of my clients don’t have XP and the Google Toolbar is the only safe toolbar with a remotely effective popup blocker. Still, about 1 in 4 ads tend to get through.

  75. Also, I’m not sure if this is the right post for this, but I can’t find a better one so here goes:

    http://www.exploringpsychics.com/astrology.html

    On my screen, I see Google ads for pre-approved home loans and Comcast Internet. I’m not really sure what connection they have with astrology.

    I once even found an ad for a site that talks about how Allah calculated the speed of light, on a web page that discussed a CSS attribute (I forget the page and the attribute, though).

  76. I just have one simple little question. I’ve learned that spiders don’t read the title tag if it’s not at the top of the head tag. Looking at the code on this blog shows that your title tag is second, after the meta http-equiv. All the talk about the importance of having the title tag at the top – is it just talk, or does it have any relevance?

    Peter the Swede

  77. What I don’t like is the rumours that Google wants to get out of the search business and move towards providing real services.
    You say it is important to concentrate on one thing and do it well. Just make your search results better.

  78. faster indexing :p

  79. I agree with comment 9083: Google needs to stop putting so much weight on DMOZ. If DMOZ were run well and up to date I would agree with its importance, but that is simply not the case nowadays.

    I am not sure why, and I really do not care, because I have never seen a hit from DMOZ in my logs, but my website has been removed from DMOZ and a quality top-10 listing in the Google SERPs is now lost. I have to assume the two are related, since my site was 6th-10th for my keyword for over 3 years in Google and now I cannot find it.

  80. weary, I don’t get any of those results for Bratz Bedding… the #3 result is Target, which sells it – takes me right to the page.

    Although the #2 result is a pure spam page… and #1 is a Yahoo page with links to stores that sell Bratz bedding.

  81. I’ll keep mine short as you must be whipped by now reading all of these. But I guess you did bring it upon yourself.

    1. I think I’ve read about it somewhere as a possible solution down the line, but how about shooting an email the sites that you penalize explaining it happened and why? I know you check the whois, so you can use an email I would guess from there. That way people that are oblivious to some things they do wrong would know. Even if it is someone doing it knowingly, it will be a way for Google to say, “we caught you!”. Seems like it would save on a lot of support emails from webmasters.

    2. It seems like Google spends more time trying to prevent sites from listing well than it does on its main focus: search. Maybe in 2006 and moving forward, Google can adopt a new motto. Something like…

    “Website owners are our friends, not our enemies.”

  82. Thanks Dave, I appreciate the advice. Wow, I wish I had time to read all the posts in these great threads…

    It really does come down to setting the rules and updating the guidelines, doesn't it?

  83. Yep. SEO is not rocket science, and HTML is probably one of the easiest languages, although many unscrupulous SEO firms have a vested interest in spinning it otherwise.

    Sorry, OT I know. I have a VERY simple recipe that has worked and seen all my pages rise over the years with no adverse effect from updates. In fact, I usually wait in anticipation for updates and love seeing the spammers sweat 🙂

    1) Add unique quality content daily.
    2) Link out whenever it is helpful for your users.
    3) Exchange links with relevant sites, assuming they are not in a 'bad neighborhood'.
    4) Use the Title element to accurately and concisely describe what the page is about. Each page & Title should be unique. Remember, this IS what users see in the SERPs.
    5) Have an H1 heading that is close to the same wording as the Title.
    6) Use the Meta description. This should be an extended version of your page Title.
    7) Use the Meta keywords to include all keywords/phrases that ARE on the page itself.
    8) Write the page for humans, NOT search engines, and forget about keyword density. ALL keywords in the Title, Meta description and Meta keywords tag should be visible text on the page. DO NOT use words in these that are NOT going to be on the page itself.
    9) Use good descriptive anchor text for links. Again, ensure the words in the anchor text are visible text on the page being linked to.
    10) Think end user, not search engine.

  84. >> how about shooting an email the sites that you penalize explaining it happened and why?

    Google has already been doing that for several months now. I think the fourth (?) batch of e-mails went out in recent days…

    There is even a blog post about it here (several months ago).

  85. There have been some very good, intelligent comments on here, and there have been some very bad comments from people who just don’t know what they’re doing and want to blame Google for their ignorance.

    I just wanted to tell you Matt, that you’re a better man than I for putting up with our crap. And I mean “our” because I’m sure I’ve asked some pretty silly questions in my time as well.

    For the guy who is all pissed off about other sites showing up when he searches for his domain, try this:

    Site:yourdomain.com
    (That also tells you how many of your pages are indexed on Google)

    If you want to search for a keyword and ONLY get results from your domain, try this:

    Keyword site:yourdomain.com

    If you want to EXCLUDE a domain from the search results, try this:

    Keyword –site:excludeddomain.com

    To the guy who says he gets penalized for frequent updates, I have done tests on this and found the exact opposite to be true. Most search engines, including Google, seem to reward frequently updated websites with more frequent and deeper crawls. You may feel “penalized” because one of those updates you did wasn’t so good and your pages fell in ranking. Because you don’t give it enough time to re-index your website before making more changes, you don’t know which change caused the problem. If you don’t know what caused the problem, you don’t know what to fix. If you don’t know what to fix, your site may continue to get poor rankings until you figure out what you did wrong. But I doubt very seriously that any search engine would penalize you for updating your website too frequently. Otherwise my blog wouldn’t be the best-performing page on my entire website.

    Thanks again Matt for putting up with all of our “The Sky is Falling!” cries.

  86. Durant,

    Thanks for the laugh.

    I think Google needs to attend the SEO/SEM gabfests.

    They need to get pointers from the evil black hats so they can fight the spam wars ;-).

    Surely you don’t think that the evil black hats can flip the GoogleForce to the dark side do you?

    Matt, see that some of your people continue to attend them.

  87. A Different Matt

    Going along with my feedback thoughts …

    I’ve been trying since Dec 21 to get a feed into base.google.com which always fails with ‘Internal Error’. What a useless message.

    The feed is mostly the same as the feed we send to Froogle, so we don't understand the failure. I've combed my end and don't see a problem, but if you're a coder you know that means nothing.

    Anyway, I wait the 2 days the site says, submit via 'contact us', and don't get any response. I love the support. I realize the product is beta, but most beta programs provide *MORE* support than you get with launched programs. I'm simply frustrated with nowhere to go for help.

    Google is a big player now and needs to figure out how to balance customer support with its business plan. I think the days are gone when Google could just sit back and answer only the emails of its choice. It's time for customer support. Whether that requires a contract with customers (indexed customers, AdSense, AdWords, etc.) or whatever, simply waiting weeks and hearing nothing from support is ludicrous.

  88. I’d be interested to better understand the role that randomness plays in the search results. I think that would be worth a post:)

  89. dear matt,

    I would gladly pay Google search engineers a pretty penny if they would tell me why some of my 100% white hat websites have dropped so far back in the index.

    TURN OUR HELP REQUESTS INTO A PROFIT CENTER

    my credit card is ready and waiting!

    thank you.

  90. The ability to add URLs to your Google account and then receive feedback from Google should they drop. PLEEEEEEEASE! 🙂

  91. any way to put your responses in a crazy color or some weird format so they stick out and are easy to read on entries with many comments like this one?

  92. I would like to see some more Error-Messages with Google sitemaps:

    I provide a sitemap with every HTML file, which helps Google crawl only updated files and pages.

    In exchange, when there are errors, or I have been punished for hidden text, hidden links, or something else ugly, Google Sitemaps should show the reason the file was not indexed or was given a lower PageRank. Maybe there's room for hints like: "unlinked page", "use rel="nofollow" on links", "XHTML structure is not valid", or ….

    Thanks.

  93. okay matt,
    now tell 'em to stop posting feedback. all this feedback blogging only serves one purpose. no, it's not going to be forwarded to other googlers, nor is matt going to read it.
    he just found a nice way to have loads of unique content added to his website in a few days.
    after he collected thousands of backlinks in the blink of an eye, he needed the site to get bigger in terms of content in order to rank on more keywords.
    he is actually working on a master site which has the utmost authority on the net.
    all bots will find this the no. 1 site on the net, so they spend all day indexing its pages. you end up not having a single bot visiting any SEO sites anymore.

    thx to all seos for participating in this project.

    just finished going through all the feedback and found nothing more to add, so i ended up having to post something stupid 😉

  94. I think what surprised me the most this year from Google was the number of times I got the "Ooops, an error occurred" message (why "Ooops", anyway?)

    I know that the services that gave me those error messages are still in "beta", but really, I don't think that's a good enough excuse from a company like Google! The trophy goes to Google's online RSS feed reader. I was curious about RSS feeds back in September/October and was looking for a way to read them. I noticed Google offered a service for that, so I tried it. Big mistake! Every second or third click, I kept getting the Ooops error message. I tried it for two days, then lost patience and never tried it again (I suppose it's probably better today, but I had time to find a better way of reading RSS feeds).

    I also get that too frequently in Gmail. This service has been online for quite a while, yet it's still not completely functional for me. I must say those error messages are a LOT less frequent on Gmail (in the last few months, I got one every 2 or 3 sessions, and recently I didn't get any), but still, for a service that's been online for 2 years, it should work almost flawlessly! I also have a Yahoo mail account and I can't remember the last time I got a crash in it.

    Maybe all this happens only because I'm using Firefox? Still, that wouldn't be a good excuse to me!

    We also had 2 interesting problems with Google Analytics after the switch from Urchin:
    – a case where all our PPC links in AdWords got the Google tracking code added to them without notifying us, which made our web server crash on every PPC click for a full week before we noticed (AdWords didn't charge us for that week at least, but we'll never know how many clients we lost because of it)
    – when we tried to deactivate this automatic tracking code feature, there was a nice checkbox in the options, but every time we unchecked it and saved, it kept coming back! We later discovered that the original owner of the account had to uncheck it for this to work (thanks to my imagination).

    Soooo, in conclusion, I invite Google to do more QA before releasing updates to their live environment in 2006!

    Etienne

  95. Matt:

    Since you asked, I would like to see Google enforce their published API Terms of Service/Conditions.

    There are dozens if not hundreds (I am maintaining a list for future reference) of COMMERCIAL websites using the Google API for profitable enterprise.

    Google’s API TOS http://www.google.com/apis/api_terms.html very explicitly states “The Google Web APIs service is made available to you for your personal, non-commercial use only..”. But there are dozens of sites that are selling Google PR related products for a fee making them commercial and profitable in strict violation of Google’s API TOS.

    We own a patent pending link management application service and our users ask for Google PR data on almost a daily basis. They want it that badly especially when they see our commercial competition offering PR data in violation of Google’s API TOS.

    We tell our users that the Google API is for non commercial sites and that we cannot in good conscience violate Google’s published TOS.

    I sent a note about this to the API team 14 weeks ago and never received a reply. Disappointing.

    My company will never do anything to violate Google’s TOS but it is very frustrating for us to watch our users leave us for the competition who is in direct violation of the API TOS.

    I would like to see Google either enforce their API TOS or change the API TOS so that ethical companies like mine can compete fairly in the marketplace.

    Thank you very much for asking what Google can do better. I appreciate the opportunity to share this concern with you.

  96. Having had two sites that were temporarily ‘banned’ from Google at one point, I think it would be very helpful where a webmaster could find out WHY it was banned and WHAT to do in order to get it back in the index. Otherwise it’s a bit like big brother…. One site in particular generates 90% of my client’s income. It was a top rated site for 5 years then disappeared. No reason given, there was no spam involved. This was a large site that has taken 5 years to create, and Google randomly removed it, effectively playing with someone’s livelihood. Mysteriously, it returned in 4 weeks, once again with no explanation as to why it was removed OR why it was reinstated.

    We want to do things correctly, and I believe it WAS done correctly, but you can’t become better if someone doesn’t tell you what your infraction actually was!

  97. When I search for something that will inevitably contain much spam in the results, I intuitively scan the AdWords on the top and right, and click them feeling safe I’m going to get to the legitimate players fast. The problem lately is the spammers have taken over in that area as well.

    I wish Google had draconian measures for AdWord participants. AdWords should be a safe haven for those of us with urgent needs for valuable info and suppliers, and not just another bunch of hit-or-miss links.

    Think of how much more valuable, trusted, and therefore trafficked the ads would be if they had a better reputation for quality control.

  98. a useful tool for webmasters would be the ability to check your rankings for certain search terms, as written about in this article…

    http://www.cwd.dk/article.asp?articleid=115

    I've been "banned" from Google a number of times using software to try to determine my site's rankings. As the article above explains, it would be a lot easier for everyone (including Google) if we could simply find out our rankings with a quick search, e.g.

    rank:www.site.com search query

  99. Please, please, PLEASE go back to something like the old Google Local interface. The new one sucks, sucks big, sucks awful, and is extremely user-unfriendly.

    I have to right-click on the results links to open a new window that has the information I need to get to. And I don’t like the scrolling frames.

    The old way was clunky, the new way is CLUNKIER.

  100. I wish you could
    1) Lift the 200 URLs limit in AdSense site block list

    2) Remove 64 char. limit on entries in the AdSense block list (large and quality advertisers may have a poor adgroup that cannot be blocked because of its long URL)

    3) Improve training of AdSense and AdWords support personnel – they are mostly incompetent or pretend to be so. Learn from Microsoft how tech support should work!

    Thanks in advance

  101. Supplemental Challenged

    1) Eliminate reinclusion requests for spammers

    2) Establish a for-pay diagnostic check, with the ability to reinclude a site that was hijacked, mistakenly removed by the URL tool, or caught in other such "innocent" situations. Allow spammer sites to rejoin the index six months after they file a request and the spam is seen to be removed.

    3) Put a stake through the heart of the supplemental database

  102. I’ve been having a problem with the Firefox Google Toolbar and Gmail. When there is a search term in the Google Tool Bar, and I log into Gmail, an automatic search is done within my Gmail account for the term which appears in the Google Toolbar search box. No big deal, but it costs me an extra click. lol

    Has anyone else experienced this?

  103. Matt

    I would love to see Google crack down on web site spammers. Over the last 2 years I have sent in several reports to Google on duplicated sites that have never been removed.

    It is getting to the point that 4 or 5 of the top 10 listings within the pet vertical are the same company using exactly the same database. They are not even being sneaky about it… all they do is change the URL.

    I am also finding that instead of removing sites that hide links within banners, these sites are now at the top of your directory. The pet vertical is a mess because nothing is being done about blatant spamming techniques.

    If you contact us I will be more than happy to give examples.

    David

  104. I am having a lot of trouble with the new Google Accounts. I’ve had business accounts totally switched to a new personal GMail account (even after specifying on signup that I did not want the new address to be part of my Google Account). It took me hours and hours of my time to be able to fix it so that I could log into my AdWords account with my business address and not my personal one – and Analytics is still messed up! I have no idea at all how to switch it back to the log in that it had originally, and which shouldn’t have been changed. Funny how everything can be switched at one blow when I don’t want it, but can’t be when I do!

    Furthermore, every time I log into a GMail address, it boots me out of AdWords, and vice versa – and the same with the Analytics account!

    There should really be a better way of keeping Accounts separate.

  105. When I try to use Gmail on Eudora to send mail, I constantly get an error message saying “Error Reading From Network Cause: Connection Closed By Foreign Host” I think this is because of a similar problem discussed at this website:
    http://eudora.com/techsupport/kb/2465hq.html
    It appears that Gmail's mail servers might not be in compliance with the relevant internet standard (RFC 1939, the POP3 specification): they disconnect prematurely once they see the QUIT command, rather than sending a response back. This problem has been occurring since about April 2005, and I emailed Gmail technical support about it and got a response back in October 2005 saying they were looking into it, but it still has not been fixed.
    I’m hoping you could ask the Gmail team to look into this. It doesn’t appear on other email clients (Thunderbird/Outlook), because I gather they are not as strict regarding complying with the requirements. If you email me, I’ll be happy to give you more information.

  106. Matt,
    why can Google not flag a site as penalized and show something to this effect on the Google Toolbar? Sounds crazy, but….

    1. You could add a small note to the toolbar saying that it was relevant to website owners only, and only they would benefit from downloading it.

    2. Website owners would not waste so much of their time, and possibly yours, trying to find out whether they are penalized…. Just think: no more of those annoying emails to you and other engineers asking you to "quickly check their site when you get a minute…"!!

    3. Website owners could immediately see that a penalty has been assigned to their site, start work on correcting it, and then file a reinclusion request… the positive thing about this is that Google's search index gets that bit cleaner.

  107. Matt,

    I would like a definition of what exactly is involved in a data refresh. There is something I clearly do not understand about it and need to learn more. Since the data refresh on the 27th, there have been massive changes in the SERPs I watch, and Big Daddy is not any better. So, with that said, I would really like to know more about what happened on the 27th, if possible.

    Thanks,

    Mike

  108. I am not sure if anyone has mentioned this, but a specialized search for charitable organizations would be nice. For example if I wanted to donate to a charity which does work for the endangered mall-rat, I’d be able to do a quick search on Google Charity, and find a list of organizations who are involved in such work.

  109. While on the subject, can you add a feature to Froogle where sites that have a promotion on the product I'm looking at are tagged or shown separately?

    And sorry for leaving the same comments on two threads! I didn't realize this was the thread where I had to leave feedback.

  110. I am not sure whether this question has been raised earlier or not. New sites usually do not rank high on Google because of the aging delay. Is Google planning to communicate, or provide some operator to find out, whether a particular site is still in the Google sandbox or not?

  111. I deleted one of the five sites I was tracking with Google Analytics because it was a new site and not getting much traffic. Since I could only track five sites, I wanted to use the resources to track something else.

    The problem: I can’t add another. Now I can only track four.

    Can you put in a good word for me? 🙂

  112. Matt,

    I have been reading you for a long time and I appreciate you being available.

    First, I am totally confused regarding the “sandbox”, and am concerned with the ramifications of being placed in the “sandbox”. Where can I go to read a bit more regarding this issue and how to deal with it effectively?

    Second, we have many people that are trying to gain search reputation in foreign countries, particularly in Romania. Being that Romania is so small, and the internet has not taken off as in other parts of the world, are there any special SEO strategies that you would guide us in?

    Regards,

    Mary Stevens

  113. Separate “real” search results from link directories/shopping sites.

    Put Amazon/PriceGrabber/eBay/Nextag/about.com/etc in a different section. They never offer anything unique, anyways. Lump in with this category any page that has >3 text ads on it.

  114. To Joel Lesser, regarding the remark "There are dozens if not hundreds (I am maintaining a list for future reference) of COMMERCIAL websites using the Google API for profitable enterprise." … Joel, while you sound a little bitter, and broadcasting that LinksManager is going to tell on people might not be the best thing to do, you might want to know that a lot of places pull PR by means other than the API. From what I know, the API doesn't actually pull PR… maybe I'm wrong, but I thought it mainly pulled search results.

    I guess my point is that there is no reason to run around tattling on people, because I'm sure Google already has the same list you have.

    Scott suggested using something like rank:www.site.com search query…. What a great idea. There is obviously an issue with software like WebPosition using Google's resources… maybe Google could provide something along the lines of the API that serves up the common data these automated programs go after, easing the load of queries on the main servers.

    great feedback. Hope it’s put to use!

  115. Hi Matt,

    Just wanted to let you know that if 66.249.93.104 were to go live today, you would find out what it feels like to be an average, normal webmaster without access to the Google admin panel. Of course, you would have to pretend that your main source of income depended on this site of yours before the real impact hit you…

    Your site mattcutts.com has disappeared, vanished, been vaporized from the 66.249.93.104 SERP!

    Yup, your site is nowhere to be found; it has dropped completely out of the index on 66.249.93.104.

    I'm sure you won't let 66.249.93.104 go live without your site in the index; that would show the weaknesses in the software to way too many people/investors.

    But you have to admit, that in a dark and twisted sort of way this is very very funny…

  116. BTW – I know you set your site up for this on purpose by not protecting it with a 301 or 302, to try to prove that there are no canonicalization issues with Google.

    But it is still funny as hell… 😉

  117. Hi Matt

    I have a question: how can a website with ZERO content and 100% affiliate banners and links, which was only registered in Aug 2006 (not even 6 months ago), gain a Google PR 7 on the toolbar? http://www.milliondollarhomepage.com/ I know it has been well documented and on several news stations, but it breaks nearly every one of Google's rules.

    Regards
    Richard

  118. Sorry, typo above: registered 2005.

  119. Hey this may sound stupid or crazy but here goes.

    Okay, Google has programmers… how about Google builds its own browser, implemented with its own toolbar and access to Google content at the click of a mouse?

    Crazy? Genius? Pay Me? 🙂

    Just a small idea.

  120. I imagine this has been covered somewhere before but I missed it. I have a URL question for you. Does Google rank a page higher if it is structured like this:

    http://example.mattcutts.com

    versus like this:

    http://www.mattcutts.com/example

    ?

  121. Can we have your opinion about Hedir, it is making into news and challenging DMOZ?

  122. Why are lots of Porn links re-directing to google SERP?

    Smell is coming….. something Fishy?

  123. I belong to a blog, writingup.com. In the past couple of weeks, about a dozen of us suddenly had our Google AdSense accounts suspended. There might be a bad egg or two in the bunch, but not ALL of us are. I am NOT. We don't know what's up, and though we have tried the appeal route, we have heard SQUAT from Google. Can you recommend an appropriate course of action? Any advice would be much appreciated.

    Thanks!

  124. I would like an automated way to check if a website is banned from Google.

    It appears that if you link to a website that is banned by Google, Google AdSense refuses to show ads on that page. And even after you remove the offending link, Google AdSense still doesn’t show any ads. This is a problem for websites that accept links in a directory or for websites with a forum or for websites who accept articles for reprint (which typically have a link back to the author’s home page).

    In my case, the offending website appeared legit and submitted an excellent article for reprint on one of my websites. After checking (because I was suspicious of no Google ads showing up), I noticed the site did not appear in the Google index at all (which, I assume, means it was banned). All other pages on my thousand-plus-page website show ads right away, so it's very odd that one page out of thousands is the ONLY one that doesn't show any ads. The only difference on that one page was the link to that particular website and the content of that particular article (headers, footers, etc. are all constant for each section, since they are include files).

    This also poses a risk for our website that we didn’t really think about. Links in our directory, links in our forum and links in user-submitted articles could potentially penalize our website in Google.

    As a webmaster who strives to present good content and who avoids linking to spam, I would really like an automated way to check which websites are banned or have a low PageRank, so we can ban them as well and avoid being penalized by Google for linking to them.

    Thank you.

  125. An edit button on your blog so I can fix that glaring typo in my comment above!

  126. I've heard tales about people getting banned from AdSense because a competitor or someone malicious kept clicking on their ads. As someone said earlier, it is (supposedly) way too easy to get someone kicked off AdSense. I'm not sure if this is true, since it (thank God) has never happened to me (and I hope it never will). It may just be some spammers who are upset they got banned. But still, there are way too many stories out there about this.

    So I would suggest a way to find out the IP address of the person who clicked on the ads that got the account banned. That way, even if Google refuses to reinstate the account, the webmaster could contact the person's ISP, find out who was using the account, and get them kicked off their ISP. With Google's proof in hand, many ISPs will kick someone off for violating their TOS. Plus, if they are in a favorable jurisdiction, the webmaster could initiate a lawsuit against the person who caused Google to ban the account. With the Patriot Act and other anti-terror laws, they could easily be fined or thrown in jail for hacking if you have a good lawyer.

    Okay, so I’m being a bit extreme, especially about the lawsuit part. But my point is, there has to be some way for webmasters to have recourse against malicious people.

  127. (a continuation of the last post)

    Another thing: if a webmaster knows who is causing the malicious clicks on the Google ads, the webmaster can ban that IP address server-side so the person can't even access the site anymore… or instead show the malicious person a version of the website without Google ads (so he's not tempted to try logging in from someone else's computer to do it again).

    If you give the webmaster information on who is causing their AdSense account to be banned, the webmaster can do a lot of things to prevent them from doing it again.
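
    The server-side blocking described in this comment amounts to a tiny check at request time. Here is a minimal Python sketch of that idea; the IP sets, function name, and ad snippet are all made up for illustration, not anything AdSense actually provides:

```python
# Hypothetical server-side filter: known-malicious IPs are either blocked
# outright or served the page with the ad markup stripped out.
BLOCKED_IPS = {"203.0.113.7"}     # deny the site entirely
NO_ADS_IPS = {"198.51.100.2"}     # serve the page, but without ads

def render_page(ip, content, ad_html="<!-- ad snippet -->"):
    if ip in BLOCKED_IPS:
        return "403 Forbidden"
    if ip in NO_ADS_IPS:
        return content             # no ad markup for this visitor
    return content + ad_html
```

    An ordinary visitor gets the page plus the ad markup; a listed IP gets the ad-free version, so there is nothing left for them to click.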

  128. Or how about this? Why don't YOU ban the IP address that is clicking too much? If you detect that a particular IP address is clicking too much, simply stop showing Google ads to it. It won't be able to click anymore (or if it does, it'll just be a public service ad anyway). This would help protect webmasters from malicious people and also prevent anyone from clicking on ads too much.
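
    The threshold idea in this comment boils down to a per-IP click counter. A sketch in Python, assuming a single-server setup; the class name and the threshold of 5 are invented for illustration and are not a real AdSense mechanism:

```python
from collections import defaultdict

class ClickGuard:
    """Stop serving paid ads to an IP after it exceeds a click threshold."""

    def __init__(self, max_clicks=5):
        self.max_clicks = max_clicks
        self.clicks = defaultdict(int)   # ip -> click count

    def record_click(self, ip):
        self.clicks[ip] += 1

    def ad_for(self, ip):
        # Over the threshold, fall back to a public service ad.
        if self.clicks[ip] >= self.max_clicks:
            return "public-service-ad"
        return "paid-ad"

guard = ClickGuard()
for _ in range(5):
    guard.record_click("203.0.113.7")
```

    After the five recorded clicks above, `guard.ad_for("203.0.113.7")` returns the public service ad, while any other IP still gets paid ads, so the malicious clicker simply runs out of ads to click.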

  129. Here’s another idea. How about make it easy for AdSense users to pay writers?

    As a webmaster with many content driven websites, I am always looking for authors to contribute new unique valuable content. And on those content driven websites, we use AdSense as one method of monetizing the content. It would be nice if we could give authors a subaccount that only shows their pages, and as the webmaster, we assign what pages they get paid for. The webmaster also sets up the percentage the author gets. Authors who submit exclusive content could be given a higher percentage of the webmaster’s AdSense revenue whereas authors who submit non-exclusive content could be given a lower percentage, for example.

    It would also be better if Authors only had to have one account even if they had content on multiple websites by multiple webmasters.

    All the webmaster would have to do is select which pages the author gets a share of the revenue, the percentage of revenue they should get (on a page by page basis or site-wide), and what Google Account should be credited with the money.

    The author would be able to log into one account, see the stats for the pages he is credited for, and be able to setup his payment details.

    Google then uses the percentage setup by the webmaster and deducts the writers’ portion from the webmaster’s check/direct-deposit and credits the author with that portion.

    This would make it easy for authors to get paid for their content and would also give webmasters an easy way to entice authors to write content for their website.
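
    The split described in this comment is simple per-page arithmetic once the webmaster has assigned each page an author and a percentage. A sketch; all page names, authors, and percentages below are made up for illustration:

```python
def split_revenue(page_earnings, author_share):
    """page_earnings: {page: dollars earned};
    author_share: {page: (author, fraction of that page's earnings)}."""
    payouts = {}             # author -> total owed
    webmaster_total = 0.0
    for page, earned in page_earnings.items():
        author, pct = author_share.get(page, (None, 0.0))
        author_cut = earned * pct
        if author is not None:
            payouts[author] = payouts.get(author, 0.0) + author_cut
        webmaster_total += earned - author_cut
    return webmaster_total, payouts

# Exclusive content gets a higher share than a reprint, as suggested above.
webmaster, authors = split_revenue(
    {"exclusive-article.html": 10.0, "reprint.html": 4.0},
    {"exclusive-article.html": ("alice", 0.50),
     "reprint.html": ("bob", 0.25)},
)
# webmaster -> 8.0, authors -> {'alice': 5.0, 'bob': 1.0}
```

    Google (or the webmaster) would then deduct each author's total from the webmaster's check and credit it to the author's account.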
