Google++

If you’re not a Googler, please ignore this post.

Okay, it’s just us Googlers now, right? I’m sure you’ve seen Danny Sullivan’s post about 25 things he loves about Google and 25 things he hates about Google. If your service got a shout-out on the love list, congratulations. There’s a ton of stuff that Google is doing really well, and lots of groups listen to what our users want and work hard to make that happen.

But: the list that everyone should mull over is what Danny hates. You’ve been handed detailed bug reports (for free!) from one of the foremost experts in search. Bug reports? Yup, that’s how I’d treat them. Sure, I’d disagree with a few (#3, #4, #9, and #16 are the ones that I’d respectfully disagree with the most). But the criticisms that Danny gives should be addressed, even if the issue is mostly perception.

The blessing (even if sometimes it feels like it’s not) of working at Google is that everyone has an opinion on Google and what they want Google to be doing. Of course we’re working hard on core search quality. Of course we’re working on great products. But I implore you, gentle Googler, to listen to users, webmasters, advertisers, and publishers whenever you can. Find out what issues are hot with our user support team. Browse the feedback from the “Dissatisfied” link. Read the feedback we’ve gotten on how to improve our products, search quality, communications, webmaster-related ideas, webspam, and miscellaneous feedback. We’ve been collecting feedback for years, and if you talk to users on a regular basis, it keeps you grounded and working on the right things. We don’t always have the time or cycles to fix every issue that a user asks about. But we should always strive to. And when someone outside Google tells us something that they wish were different, we should look for scalable, robust ways to tackle it.

95 Responses to Google++

  1. Good morning Matt

    And here is my Google Loves/Hates list.

    – I love the new BigDaddy infrastructure

    – I hate that you haven’t posted a BigDaddy weather report in some time

    That was better than Danny’s, right? 🙂

  2. Please please please please please… Please!

    Please fix #18; this is causing Gmail users all over the world to cry.

    Including me (sad face)

    Cidy.

  3. I wish Danny Sullivan would clean up his SECs and forums of spammers and black hats before telling others to clean up their back yard.

  4. Hi Matt,

    Here’s some constructive criticism on the GAP/ AQI Adwords program:

    Google launched its “Google Advertising Professionals” program – run out of the US – over a year ago.

    By Feb 15, 2005 (13 months ago), two of us from the same firm had become qualified as GAPs. Maybe Google US could look at improving communications with GAPs?

    For example – after we received our Goodie box for being in the first 100 – Fedexed to Australia from the US – we’ve heard very little. In the past year – we’ve only received one GAP Specific email.

    That email (Feb 1st 2006) was to advise that Google has ‘changed the name of the qualification’ from GAP (Google Advertising Professional) to an AQI (Adwords Qualified Individual) – and advising us to change our logo. And to advise that GAP status will instead be issued at the agency level – but that program isn’t yet offered in Australia.

    I’m not criticising anyone in Google Australia – they are great – I can pick up the phone and call anyone – we regularly communicate on a range of topics with a range of people – and we get great service and attention. But the GAP program – apparently run out of the US and UK – is lacking in communications, certainly with non-US GAPs (err, I mean AQIs).

    Maybe GAP communications should occur at the Country level?

    I also think the whole agency process needs a review. You know – how the client loses all their click-through history when they move from an agency account to an account in the client’s own name…

  5. Despite submission and verification of sitemaps, crawling has been very slow lately across a range of sites.

    Thanks for listening.

  6. Hi Matt,

    Okay, here is some feedback for you. About a month ago I sent an email to Google to say that a site of mine (that is totally white hat) had mysteriously disappeared from the SERPs under our main keyterms, which we have historically always ranked well for. Anyway, in this email I asked for confirmation as to whether or not we were being penalized in any way.

    I’m happy to say that today I finally received an email back. The email said:

    “Thank you for your reply. We apologize for our delayed response. Please be assured that your site is not currently banned or penalized by Google.”

    Now, first of all, let me say that it is a relief to be able to get that confirmation- even though I was already 99% sure that nothing was penalizable on this site.

    However, let me say that one is left in a bit of a lurch when, while knowing that a penalty is not the issue, one still can’t determine what the issue is, or how long it will take to resolve itself.

    I have a hunch that the canonicalization issue you’ve made mention of might explain why the site in question is currently performing a “vanishing act”. I say this because our secondary pages are ranking under our keyterms but not our homepage. So where our homepage used to rank in the top 10 for a very competitive keyterm, a secondary page now appears on about the 5th page.

    The frustration comes from the fact that I have no way of knowing if canonicalization is in fact the issue.

    I’d appreciate it if you could comment on this kind of webmaster dilemma. Thanks, Matt; I appreciate your desire to communicate with the webmaster community.

  7. What I love about Google?

    Google brought relevance to search. I will always love that.

    I have to agree with Danny about some things he hates (and a few things he didn’t mention). What I love lately is how you, Vanessa, and the whole Sitemaps team work so hard on your blogs and at SES NYC. I really appreciate that you spent the time to talk to all us “cuttletts” and listen to our feedback. It made my trip to NYC worthwhile.

    I can’t tell you how much I appreciate Google engineers helping me with robots.txt syntax to solve duplicate-content issues. I am especially impressed that two different engineers responded to my email with two different solutions, both of which helped me eliminate the content issues and get my company behind our sitemap effort.

    I have never gotten that much response from search engines. All I get from the competitors is autoresponders!

    Ironically, what I hate the most is the AdWords customer experience. I was one of the first AdWords clients, and I have had half a dozen reps in 4 1/2 years. Now I can’t even get a rep assigned to me. Hard to figure that the natural search team is more responsive than paid search when I spend $80–90k/month on AdWords!

  8. Here is one which I have reported to google already (last year or even earlier):

    Try this search in Google: “site:wikipedia.org inurl:wiki/Image: -inurl:svg”, which should show all Image pages on Wikipedia except the SVG ones. It comes back with about 700 results, which seems a bit low. And it is: Googlebot regards URLs which end in .jpg or .gif as special and doesn’t index pages like this:

    http://en.wikipedia.org/wiki/Image:Rubber_dripping.jpg

    and also doesn’t hand the images in there to the imagebot. So searching for images or image information on Wikipedia is pretty much impossible with Google.

    Another idea would be to publish Google’s bug database. If I knew the status of my bug report about this, I might not bother to mention it again.

    cheers
    christof

    disclaimer: yes, I run a wiki with lots of images too.

  9. What I hate are the directory sites that keep popping up on top of my results. If I’m looking for a specific topic, I don’t want some heavily optimized paid directory to be in my results. I want content pages not listings I then have to wade through further. While directories can be a good way to find sites grouped into categories, I would think Google should be able to do a better job of serving these sites than the directories.

    I agree with you about #3, #4, #9, and #16.

  10. Excellent PR and I don’t mean PageRank 😉

  11. Ignoring your “please ignore” message, I read on through your article and Danny’s.

    As expressing my undying love and fascination for Google would only lead to embarrassment, I will only bring up the two things that annoy me the most.

    #1 Google Analytics and timezones… Reports are meaningful if they are easy to ‘read’. Reading the graphs referencing date and time is not so user-friendly when the time shown is Mountain View time and you live in Central Europe. Just adding a time-offset setting would make it so much easier. I just can’t use Google Analytics to show reports to management yet because of this timezone issue.

    #2 AdSense not using my Google Account… I love the single sign-on with my Google Account across the Google sites and applications. But why is Google AdSense using a different account, with different password validations?

    Am I the only one getting annoyed by these two things?

  12. Very refreshing post Matt.
    Thank you.

    What you write is profound, beyond just Google; people from virtually any company could consider your words in ways that relate to them, and the world will be further on its way to being a better place.

    Thanks again..

  13. Hi Matt,

    Mind elaborating on why you disagree with the points you disagree with?

  14. The first thing that struck me was Danny’s comment about yet another project from Google before finishing so many half-done projects. I see his point, but IMO Google has always been riding the “big wave”, so to me it does make sense for Google to go full speed ahead and fix things along the way. The only problem or issue that I do take very seriously is Blogspot, and Blogger in general: the Trojan horse, the AdSense-fraud hot spot. Looking at something like Blogwise’s spammiest submission hosts, I do wonder why guys like Eric play around on MC Hammer’s blog rather than trying harder to find a solution. Seriously: Blogger IMO goes against everything that Google stands for and became famous for. I personally hate criticism of Google, and the first thing popping into my mind is “So, you think you could run Google better?”. But looking at Blogger and comparing it to a few other blog-software companies, I have to say that it’s really an embarrassment. And the Google blog search engine should be fixed as well. 🙂

    Other than that, I can’t wait for the new online payment system and Google’s own version of eBay (I know, you probably don’t want it to be called that, but for now that’s how we anticipate it).

    Mike

  15. That is an interesting list of “to-dos”, and it is certainly true that listening to complaints is vastly more important than listening to praise.

    I would strongly support the “paid-for support” option – heck, bung it on a premium-rate phone line and outsource the service to India if necessary – legit webmasters would kill to be able to have a live conversation with someone who isn’t going to take a week to send back a generic copy/paste reply that doesn’t explain why a site has been banned.

  16. Web standards should be more than a search result for Google. Really.

  17. Well, perhaps I should state first that I’m not a Google employee, but I risked reading the article nonetheless 🙂

    It’s nice to hear how much Googlers care about feedback. But while I realize that changing a service that was just introduced not only takes time but also requires really good reasons, I can’t stop wondering why you released Google Groups in the state it’s in now. If you’re following reactions on usenet (especially in the not-so-chaotic hierarchies like de.*), you’ll notice that most are negative.

    I won’t repeat the criticism of the UI, the “intertwingling” of usenet and Google Groups, character encoding issues and the usenet posting interface here, but I wonder why there seem to be no attempts to improve this service. Sure, Google Groups is not just about usenet, but it’s the only large usenet archive left. Don’t you feel this gives Google a bit of responsibility for providing better usenet search?

  18. Matt,

    What I loved about Google was the fact that we have ranked very well for the last seven years in the very competitive UK financial market.

    However, we were bombed by the “Supplemental Hell” problem and left with a homepage + supplementals listing on Big Daddy.

    You have reported a fix, but this has not resulted in the return of our site’s index entries.

    To compound the issue, the default index has incorrectly indexed a number of our key pages as https instead of http. The error occurred due to relative linking on a site with SSL installed.

    The SSL has been removed so that the incorrect pages return 404 errors, but Google support will not help in removing the dud pages that are blocking the http index and main sub-pages.
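
    For what it’s worth, the usual self-help fix for this situation – a sketch only, assuming an Apache server with mod_rewrite enabled, and using www.example.co.uk purely as a stand-in for the real domain – is to 301-redirect every https request to its http twin rather than relying on 404s:

        # .htaccess sketch (hypothetical domain): permanently redirect
        # https duplicates to the canonical http URL
        RewriteEngine On
        # port 443 means the request arrived over SSL
        RewriteCond %{SERVER_PORT} ^443$
        RewriteRule ^(.*)$ http://www.example.co.uk/$1 [R=301,L]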

    We are a major brand in the UK financial market and consider ourselves white hat, and we would really appreciate some help, as we have been double-whammied on the Google index and are desperate for a fix!

    I know it’s not directly connected, but we are also a major advertiser, with AdWords spend of over $150,000 per month.

    Hope you can get someone to have a look for us.

  19. I thought Googlers have permission to post here!

  20. But the criticisms that Danny gives should be addressed, even if the issue is mostly perception.

    🙂

  21. Hate
    #26 Fix Supplemental Hell ASAP for honest sites – you are not taking it seriously. As always happens, you will only take action when you start losing users to other search engines – and it will be too late then.

    #27 Read feedback and implement it.

  22. Here is an example of a general weakness in search engines. http://www.buyaswatch.co.uk is the largest supplier of Swatch watches in the UK and one of only two websites in the UK that Swatch officially endorses, the other being http://www.swatch-shop.co.uk. Buyaswatch spends a small fortune on pay-per-click to be at the top of Google but comes nowhere in the natural listings at all. The other, smaller website comes second in the natural listings and spends nothing on pay-per-click.

    To add to the head-scratching, http://www.clickandbuild.com/cnb/shop/buyaswatch appears 7th in the google.co.uk list for the term Swatch. This is actually the old, empty home of http://www.buyaswatch.co.uk from several years ago and is now just an empty WorldPay page with no content.

    I have tried to tell Google that they have a ‘dead’ link, but they say that they cannot do anything about it. WorldPay are not interested, and for some reason http://www.buyaswatch.co.uk remains out of the listings.

    All very strange. I would be really interested to learn more about why these things happen.

    regards

    Chris

  23. Please add the ability to combine AdSense for Search with the Google API, or at least let some webmasters push search phrases into the search box. There’s a ton of Google Search mashups just waiting to happen to help users search Google better. AdSense for Search could provide the fuel to turn this into an avalanche of new Google Search portals. Right now it’s against the TOS to do these things. Last I checked, the vast majority of search users still don’t even know about the exclude “-” operator.

  24. Matt,

    Here’s one more to add the list from a site webmaster:

    Have Googlebot return an HTTP_REFERER, or some other indication of where it came from, when it browses pages on my site. I get 404 reports each day that show Google browsing pages that don’t exist (and thus generate 404 errors). If these 404 errors are caused by bad links somewhere on my site, I would LOVE to know where Google found them so I can fix them, and an HTTP_REFERER would help me out A LOT.
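
    In the meantime, a partial workaround – a sketch only, assuming Apache, with the format and log names purely illustrative – is to record the Referer header that ordinary visitors send when they follow the same broken links (the crawler itself sends none, so this only catches human traffic):

        # httpd.conf sketch: log the Referer with each request so that
        # 404 entries reveal which page carried the broken link
        LogFormat "%h %t \"%r\" %>s \"%{Referer}i\"" with_referer
        CustomLog logs/access_with_referer.log with_referer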

    Thanks!

  25. Matt,

    Your words seem like empty shells to me. What good is feedback if you a) get only canned responses, b) get no responses at all, or c) see no action taken? As much as I like (or liked) Google, it is becoming more and more frustrating to see which way you are heading.

    I know I am just a small fish in the pond (and my issues do not measure high up compared to other ones), but if companies start ignoring the smaller customers and only listen to the bigger ones or the ones crying foul with a bigger voice, they will lose the base that supported them in the first place and helped them to become what they are today.

    Once people start realizing that Google is not serving their best interests anymore and move on to other search engines (like what happened when everyone moved to Google), it’s difficult to reverse the trend. Marketing is a powerful tool, but sooner or later marketing phrases will not be enough to persuade people that problems do not exist.

    Please put better tools in place to allow webmasters to communicate with Google and to work with Google. Where are those emails to webmasters informing them about problems with their sites? I have not seen one, and I lost 90% (ninety!!!!) of my Google traffic on a busy (white-hat promoted) site starting about 2 weeks ago. Do you expect people like me to just sit it out and pray that Google will fix issues, or would you rather see us move away from any dependency on Google and use and promote other search engines? The small customer of today could be a market leader tomorrow that uses your competition.

    Which way will Google go?

    Christoph

  26. What about #16 do you disagree with? Every once in a while you might get a quality blog from someone who – if it weren’t for the free service – wouldn’t be blogging. But that really doesn’t justify the detrimental effects of the thousands upon thousands of splogs that tag along for the ride. Their spam probably contributes to dozens of other legitimate blogs being lost in the chaotic search results.

    You’ve gotta have a good reason for keeping Blogger free that I’m missing… since you said it was one of the points you disagree with the most. Could you please expound on this?

  27. Danny makes some great points..

    “hate” is a very strong word IMO… The primary complaint I have with Google is that Google’s API group does not respond to email, and they do not appear to be enforcing Google’s published API Terms and Conditions.

    From http://www.google.com/apis/api_terms.html:
    “… The Google Web APIs service is made available to you for your personal, non-commercial use only ….”

    Dozens (if not hundreds) of commercial sites are currently using the Google API for commercial for-profit businesses, which appears to directly violate the published API Terms & Conditions. That makes for an unfair playing field for companies like ours that are strictly playing by the rules.

    I also made the same suggestion to you, Matt, over 2 years ago in one of our first phone conversations: a paid support service for webmasters… Matt, I cannot imagine what your email load must be like from folks like us who run the web. It would be great if we could pay Google for a service that would help us get official answers to queries from your staff.

  28. My wish: make the AdWords numbers understandable and verifiable. Do it on a per-click basis if need be. January shows 50% of all my clicks coming from IP addresses that clicked more than once within a short time frame — are they counted? Which “invalid clicks” are automatically discarded? Which “invalid clicks” do I have to report to Google?

    My other wish: make Google accessible to the normal user. Your contact information (in general) is hidden in the deepest hole the web can provide. When people have problems, they often just want to know if it’s on their side or someone else’s problem – a short comment by “someone@google.com” could really help. Have someone post in the official Google Groups for your projects. Tell them “we hear you”. Have them start a blog (like yours here) and put some emotion, real people, behind the projects. Let them have a bad day and write about it for all I care. Google seems like a fortress behind a giant firewall – “join Google and never talk to anyone outside again”. Your blog has given one side of Google a personal face – don’t forget about the rest of Google!

    Oh, and the Sitemaps project goes in a similar direction: offer an interface between webmasters and Google – great work, keep it up!

  29. My previous post was to express support for JohnMu’s first paragraph, but it was stripped due to my use of tags.

    At any rate, I really, really agree with the following:

    My wish: make the AdWords numbers understandable and verifiable. Do it on a per-click basis if need be. January shows 50% of all my clicks coming from IP addresses that clicked more than once within a short time frame — are they counted? Which “invalid clicks” are automatically discarded? Which “invalid clicks” do I have to report to Google?

    Matt – you can delete my previous post if you like.

  30. What I hate about Google:

    When they (MC) ask for spam reports, and I submit one or two with a high amount of detail, including the URLs of the offending pages, and the spammer’s site still remains in the index weeks after submission.

  31. RE: the ones you disagree with.

    #9: Why does Google cache anyway? I never really understood what good it does, other than being able to see a page if the real site is down. It causes some confusion, as many people believe that if all pages in their site aren’t cached, they must be banned.

    #16: You disagree with the proposed solution, then? Not that there’s a problem? Because there sure are a lot of splogs on Blogger. I’m sure, too, that you could come up with something better than fees, but what?

  32. “Thank you for your reply. We apologize for our delayed response. Please be assured that your site is not currently banned or penalized by Google.”

    The problem with this note is that it is Microsoftesque – you’ll get it even if you’ve been algorithmically penalized to the point of no more Google traffic. I think support only checks to see that you have not been removed from the index and then sends this out.

  33. Amen on #12, #13, and #15.

    Way, way, way too many websites exist solely as a “directory” to serve up AdSense ads. These sites are often used to commit “click fraud”. We’ve all seen them, and I can’t imagine anyone likes them – except for the “scammer” that is making money serving them up.

    I don’t buy any AdSense ads – and never will until I can be guaranteed that my ads are being placed on actual websites that are content-related and not just some link farm.

    Further, allowing these types of sites to serve AdSense ads just furthers the notion that Google doesn’t give a crap about solving the “click fraud” issue. Google seems to place the burden on the advertiser to PROVE that the clicks are fraudulent – or to even DISCOVER that they were. Plenty of Joe Webmasters out there buy into the Google “do-no-evil” maxim and nonchalantly go about buying ads – never knowing that they are buying into a system that is fundamentally flawed.

    I generally like Google as a whole – however, Google seriously tarnishes its reputation with the AdSense program. Not addressing it fully, and being sued (and losing) over it, makes it even worse.

  34. One interesting thing Matt keeps repeating is that the “algorithm” will take care of most issues, even splogs on splogger?

  35. RE: the ones you disagree with

    #16 – I agree with the post above: something has to be done. Maybe it’s not to charge for Blogger, but there has to be a solution. Blogger is sooo much easier to set up than other blogging software that it has become splog central. The simplest solution (having human approval for blogs) probably isn’t feasible, but how about some sort of automated system that weeds out spammers? With all of the resources you guys have, there’s got to be a way to do that.

  36. Despite utilization of verified site maps, crawling seems exceptionally slow.

    Thanks for listening.

  37. Adam, I think it would be a meaty post to talk about the items I agree with (and disagree with) Danny about. Or it might make for several meaty posts. But I wanted to get something out last night. I’d like to talk about exactly that in the future.

    Gary and Rahul #26: I’m looking into it. Lots of people have come back, and I’m asking someone to read the stillsupplemental emails from WMW.

  38. Peter, let me try to answer the #9 one.

    #9: If I were designing a search engine, I’d prefer to have that data on hand in my database, to manipulate, query against, and search against at will, rather than have to go out over the internet to do it. The spider grabbing it, and then letting a local script play with all the data in the same database, is greatly more efficient than going out over the internet each time to get the data. It also allows room for error… if you need to recalculate something.

  39. Looks like Google just filed their 10-K. Mr WaveTheory posted his analysis of the trends cited in the 10-K. Very insightful.

  40. What I hate

    I hate it when you ask for pages to be removed and six months later they reappear, even though they return a 404 if you try accessing them. It’s frustrating because you can’t figure out if you guys are Google the search engine or Google the internet archive.

    I would think that removing 404 pages quickly would be important so that quality doesn’t suffer. What good reason could there be for keeping a page in the index that no longer exists and returns a true 404?
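
    Incidentally, for pages that should never come back, a stronger signal than a plain 404 is an explicit 410 Gone. A one-line sketch, assuming Apache’s mod_alias (the path is hypothetical):

        # .htaccess sketch: tell crawlers this page is permanently gone
        Redirect gone /old-page.html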

  41. Oh yeah, one more thing I hate about Google:
    I hate it when people put the blame on Google for taking their site out of the index when it’s their own friggen fault. I hate it when they accuse Google of censorship, of wanting to suppress their *valuable* content (often enough = spam sites), because of a mistake they made (“Google should know that I meant to do something else”, “EVERYONE does that”).
    However, I don’t know how Google could remedy that. :D

  42. Hello everyone,

    One interesting issue that Danny has a problem with is the search return count. I get why the count is higher when he added a search term. The new list is bigger because an added term means there are more matching documents, and when you’re Google, you want to appear to have the biggest index. Hey, with an additional term, we (Goog) can add docs that weren’t in the original list and that contain only the new term.

    But wait: Danny assumed adding a term should *refine* the query. The idea comes from the search refinements that engines provided in the past by looking at related-term matches. Yahoo! does related search matches. There is a little refinement going on in Goog when the new term is added; the algo will show a differently ordered list of results at the top which match the whole new query, with the new documents inserted.

    The important bit: the search return count is 180 degrees from the concept of refining by using related searches (which tend to add a term to your original query).

    It’s a matter of presenting the information in a consistent way. Without seeing a reduction in returns on a query that Danny assumes would refine the search, the audience can be confused, given their experience with other engines that do related searches (Yahoo!). You guys are all about user experience. Here is one area where you can make a small change that helps Danny, and those that learn from him and his search experiences. All you need to do is forget the “bigger is more impressive” idea behind the number of returns you report. And who better to influence by making that change? At least Danny would understand what change was made, and he would write about it, so many thousands would learn the difference. And the cease-fire between all of you with respect to showing impressive return numbers would be more valid as regards Goog.

    Yes? No?

    -detlev

  43. I love the simple front page and nice Firefox integration. Don’t lose that ever!

    I hate:

    – Third level domain spam
    – splogs that rank higher than real content
    – adwords sites strangely ranking higher than real content
    – continued reliance on poisoned sources as “good” (e.g., DMOZ)
    – old, old, untouched pages ranking way higher than new up to date pages

    Most of all, I hate the principle of “We’ll fix it in the algo”. More good and decent people have had their sites removed from the SERPs as a result of algo changes that are too wide open. The new Google rule should be “see spam, remove spam now” rather than waiting for a programmer to come up with a solution 6 months from now.

    Oh yeah, other thing I love: Matt’s blog 🙂

  44. I love that you provide helpful information to us webmasters. Thanks Matt!

  45. Hi Matt
    Just in case you ever read this: Danny’s comment about the DMOZ description is my personal pet peeve, and here’s why:
    7 years ago we created a site and got listed in dmoz. The description reads “a few listings of red widgets”.
    Whilst that was true at the time, in 7 years the site has grown from 100 pages to 27,000 pages and today boasts thousands of listings of red, blue, green, pink, etc. widgets, located in many countries, in different shapes and sizes.

    Have you ever tried asking a DMOZ editor to change your description? If you have, you might sympathise and realise that the description today is totally misleading. The site is not a crappy site that deserves this. It has ranked #1 for its major searches in all the big search engines for the past 3 years – something that would not be possible if it were spam or junk – so I think my suggestion to take another look at this policy of using DMOZ descriptions might have some merit.

  46. In regards to the search count number, who on Earth cares if it’s accurate or not? Do searchers really care? I HIGHLY doubt it.

    RE: “Search Engines Web has consistently made many practical and extremely intelligent suggestions over the history of this blog – are the search quality people reading and following and implementing the suggestions?”

    I wouldn’t bother posting anymore if I were you, SEW.

  47. Here is my list:

    What I hate:

    1. Censorship in China

    2. If you use an interface language other than American English, you can’t see any new stuff. Gmail looks like it did years ago. Compare the English and Polish versions of Google itself:

    Google Services:
    English: http://www.google.com/options/index.html (32 items)
    Polish: http://www.google.pl/intl/pl/options/index.html (7 items)

    About:
    English: http://www.google.com/about.html
    Polish: http://www.google.pl/intl/pl/about.html

    Funny, isn’t it? It’s really stupid that when you use even British English you can’t use new Gmail features (!).

    3. No Linux version of their applications. Why don’t they use GTK, Qt, Python, XUL, or another portable interface toolkit?

    4. Every other Jabber server saves messages for you when you are offline. But talk.google.com doesn’t. Argh!

    5. They don’t answer mail. I know that if I have a problem with Gmail, Google won’t help me. It looks like they don’t care about their customers. Even if you send them a good idea and they use it, they don’t answer your mail to say “thank you”. It’s pretty arrogant.

    6. They have too many secrets. I like companies that act openly. It’s one of the reasons I love Open Source so much – there is really great contact with developers (see point 5) and nobody keeps information away from you.

    7. Bad HTML. Haven’t they heard about the W3C validator?! They make silly HTML mistakes.

    What I like about Google:

    1. Great Web search results

    2. Gmail. It really rocks. It’s a new, better way of thinking about e-mail.

    3. They haven’t made a new private IM protocol for Google Talk and Gmail, but instead have used the internet standard for IM – Jabber. If other companies did that, the Internet would be a better place.

    4. Developers, developers, developers! 😉 Really brilliant people with great ideas work at Google (but not always…).

  48. “Aaron Pratt Said,

    March 16, 2006 @ 9:24 am

    One interesting thing Matt keeps repeating is that the “algorithm” will take care of most issues, even splogs on splogger? ”

    That would be great. I just don’t see it being possible to find the pearls in the pile of garbage too easily. There are some great Blogspot blogs out there, just not many percentage-wise.

  49. Here’s an idea I have. Since you offer a reinclusion request form for spammy sites that decide to turn good, why not offer a form where webmasters can submit a site for approval and removal of the “aging filter”? Because there are a number of 100% clean sites out there that get stuck “filtered” for months just because they talk about popular topics.

  50. It would take a long time, but no longer than DMOZ. In fact, I don’t see why DMOZ sites should be subjected to any sort of aging filter at all.

  51. Hello Matt

    I have your site bookmarked and visit most days. This is the first time I’ve posted.

    I think Google is doing a really good job. I like the search part of your business, because I can find what I’m looking for without a great deal of effort. I also use Gmail and Blogger.com.

    The only concerns I have are about the future of search technology when the Internet is growing at such a fast pace. I’m beginning to see stress over at Blogger, and I think the problem is most likely related to the fast-growing pace of blogs on the Internet. Sometimes I take a look at their “sites updated every 10 minutes” page, and I wonder how any company will be able to keep up with this kind of growth in content. Sometimes there are over 6000 pages updated every 10 minutes just on Blogger.

    From what I understand, those 6000 sites have already been run through the “spam filter” and passed. There must be at least 3x that number of submissions that did not get through the filter. How will Blogger, or Google for that matter, keep up with this kind of volume of new content? I’m no computer expert, but there has to be some point where the whole system will become overwhelmed and grind to a halt.

    While Blogger.com is a big part of blogging on the Internet, it’s by no means the majority of new content that is being added every single minute of the day. I’m sure you guys have some kind of plan for dealing with this tremendous growth and I hope you’ll think about addressing some of these issues and concerns in the future. Thanks!

    Mark

  52. Detlev said:

    > I get why the count is higher when he added a search term. The new list is bigger because an added term means there are more matching documents

    That would be true if Google defaulted to “or” queries.

    However, Google Help: Automatic “and” queries says:

    > By default, Google only returns pages that include all of your search terms. There is no need to include “and” between terms. Keep in mind that the order in which the terms are typed will affect the search results. To restrict a search further, just include more terms.

    So larger result sets from more refined queries are counterintuitive: with AND semantics, every document matching “a b” must also match “a” alone, so adding a term should never raise the count.

  53. Hi Matt,

    your “blog” is my 1st resource about the future of GOOG and our information-driven society. Besides all the regular relationships we have here at CNET with GOOG, it is great to get direct input and see how your readers think about it (very often very similar to our content-driven approach to search and SEO).

    One point: are you looking into the QUAERO project? My feeling is that it is NOT against Google (being against a US company is one theme for French people to get started), but that it will help GOOG and other services to become even better.

    regards
    Thomas from CNET.de / ZDNet.de

  54. >>>Gary and Rahul #26: I’m looking into it. Lots of people have come back, and I’m asking someone to read the stillsupplemental emails from WMW.

  55. The above should have said:

    >>>Gary and Rahul #26: I’m looking into it. Lots of people have come back, and I’m asking someone to read the stillsupplemental emails from WMW.

  56. Not sure what’s happening, but the above posts have been cut off twice.

    I am just saying thanks for looking into our sites problems. I hope you can find a fix ASAP.

    Gary

  57. Hi Matt..

    It looks like Big Daddy has nearly rolled out?

    But still no improvements (ranking specifically) for canonical and hijacked sites – although if you look at live PR (aka RK), these sites look like they have values calculated for them again; it just does not seem that these values have been applied to the SERPs 🙁

    Cheers

    Stephen

  58. Oops – meant to say:

    Does that make sense? Is it to be expected at this stage?

  59. Matt,

    What about pages that were removed with the URL removal tool over 6 months ago? Googlebot is crawling those pages daily, and yet they still aren’t back in the index. It’s over a month since the 6-month date passed, and they just aren’t showing up in the index (or moving to status: expired in the URL removal tool).

    Any ideas?

    Cheers,
    Nicholas.

  60. I’m just going to ask one short question, and yes I’m fully aware of the irony of it, but I’m still not sure so I’m going to ask anyway:

    Is it just me, or was this post intended only for Google employees? Yes, Matt posted it publicly in his blog, but the tone of it would suggest that he was only looking for feedback/comments from other Google-types. Or did I miss something?

  61. The new Google rule should be “see spam, remove spam now” rather than waiting for a programmer to come up with a solution 6 months from now.

    How is that scalable and robust? Pretty soon you’d end up with an algorithm full of “if-thens”, a huge database of sites NOT to list, and a lot of man-hours wasted.

    It’s better for Google to do it in the algorithm. It may be better for us short-term to do it manually… but long-term it’s better for them, while achieving the same results for us (just with a longer delay).

  62. Ryan, it isn’t that I don’t want them to adjust the algo, but I think they should take more aggressive steps against spam while waiting for the algo to get “fixed” (some would suggest “broken”, but that is another issue).

    Putting spammy domains (especially those actively using redirects and other clearly less-than-ethical methods to harvest traffic) into a temporary black hole of sorts would help give the end user a better set of SERPs, which would reflect well on Google. It would also keep those domains from giving PR / importance / backlinks to the next set of spammy pages that are going to wipe us out next month.

    I am just looking for something that stops the reward cycle that these spammers profit from. The delay to remove spam from the SERPs is what makes spamming profitable. Shorten that down dramatically, and the profit motive goes away.

  63. I am in two minds about harsher spam filters. My own site is currently blocked (re-inclusion filed) and I really don’t think I did anything wrong, but I undid what I had done as it was presumably “naughty”.

    If little me can get blocked for such a minor infraction, I dread to think what would happen if the spam filters were tightened!

    Yes – real spam is a horrid nuisance and affects my traffic as well – but do we risk “throwing the baby out with the bathwater” each time we cry for harsher spam filters?

  64. Hey Matt,

    Yes, G* does a great job and has many features for webmasters. One is the Sitemaps feature, but since 15.03.2006 it does not work anymore.
    Whatever you do, you get the message
    “unsupported file format”.

    It’s discussed on Google Groups already. Maybe you could be so nice as to pass this message on to the Sitemaps team?

    Thanks in advance.

  65. Matt, you have a great attitude, treating Danny’s rant as a bug report.

  66. Shimon, I honestly think that’s what it is. If someone as smart as Danny considers something an issue, then it’s an issue. The only question is whether the issue is mostly substance or more perception. I think the feedback about results counts is perfectly fair, for example.

    Martin, I’ll ask what’s up.

    Stephen, yup. We’re live with Bigdaddy at about 2/3rds of our data centers now.

    Adam, I think it’s too easy for Googlers to focus internally instead of externally, and if that happens it’s a mistake. Of course not everyone outside Google is going to stop reading the thread, but I’ve actually gotten a couple really nice reactions from inside Google.

    Gary, the > character causes things to get cut off, I think; it’s probably a WordPress thing.

  67. Hi Matt

    Would you be so kind as to tell us, if possible, when you expect the next Toolbar PR update?

    Have a great weekend

  68. Thanks, Matt,

    Forgot to say I am from Germany. Maybe it’s a local problem!

  69. Adam, I think it’s too easy for Googlers to focus internally instead of externally, and if that happens it’s a mistake. Of course not everyone outside Google is going to stop reading the thread, but I’ve actually gotten a couple really nice reactions from inside Google.

    I am wise to your ways, Sensei of Mattitude. Thank you.

    (Let’s see who gets the relatively obscure reference here.)

  70. Hi Matt

    Looks like more than 2/3 from here, but then again I suppose there is flux within the DCs on the rollout, and perhaps 2/3 are fully completed Big Daddy.

    Interesting that the sf giants query is now showing (IMO correctly) sanfrancisco.giants.mlb.com/NASApp/mlb/index.jsp?c_id=sf on some DCs – I guess they have thrown another/different redirect which Google picked up quickly enough :).

    On Big Daddy, are we still pretty much talking about infrastructure changes – so we would be seeing Google pick up redirects like that, etc. – rather than expecting ranking changes based on Google now picking up redirects of this nature?

    Have a good weekend.

    Cheers

    Stephen

  71. Employ more people in customer service / technical help. At the moment the responses take a long time and are often poorly related to the issue raised.

  72. >>>> Matt Cutts said, “Sure, I’d disagree with a few (#3, #4, #9, and #16 are the ones that I’d respectfully disagree with the most).”

    Hmmm, the #4 post in that thread is from GoogleGuy… So are we having a bit of a split-personality thing going on, Matt?

    That #5 post in that SEW thread, now I can agree with that… that double ‘t’ thing is spot on… 😉

  73. Err, I think Matt is referring to POINTS #3, #4, #9, and #16 on DS’s site, not in any post.

    Come on, Matt, you should know ALL Google employees are supposed to think alike. Where do you get off having your own opinion? 🙂

  74. Hey Matt,

    Sorry to harry you again with the Sitemaps error. Since that error, my indexed pages have gone down from about 5000 to 890 now, and the number is still falling. The error is still there; it shows

    “nicht unterstütztes Dateiformat” (unsupported file format)

    I have changed nothing in the Sitemaps file. It’s UTF-8 encoded and validated.

    Wish you a nice Sunday.

  75. I love having good SERPs, and I hate it when my SERPs go away.

    However, that doesn’t give me the right to whine and sue for not having good SERPs. I still can’t believe the news about KinderStart.com — suing Google for something they didn’t pay for. It made me so irritated that I had to write some commentary about it on my blog…

    http://www.familyresource.com/blog/2006/03/19/can-you-sue-for-something-you-didnt-pay-for-whiny-kinderstartcom-thinks-so/

  76. #23 – No one says ni. Am I dumb, or is there an obvious answer to this?

    I mean, you grab money from the advertising network – virtually nothing like it in IT/geek/we-love-google history: the more important you are, the more important you will be.

    And with that money, you build (great) and *free* services in a different market – the IT one – that I have no chance to beat, since my best offer, with the greatest engineering designs I could come up with, would actually *HAVE* a cost.

    Even if your products were awful hacks and slow shell CGI scripts all around, I couldn’t compete.

    Let us dream up a strange bug. Let’s say you kill the IT market using your backing in the advertising one; then you end up selling ads to yourself and we have starvation.

    Is there an available fix for this? Guidelines, maybe?

  77. Jon, I too was amazed that this business would even dream they have a snowball’s chance in hell of winning.

    However, I’m sure Google is willing to give their money back 🙂

  78. What I don’t like is being left in silence over critical issues that affect honest webmasters.

    In Jul/Aug 05 we were hacked and had robots.txt applied across our site. Days later we removed this, but went into the 180-day suspension period as outlined by G. During this period we could not get any assistance from G support except for standard uninformed replies.

    We came out of the suspension period around mid-Feb and applied 301 redirects from our old pages to our new ones. Then we got smashed with “Supplemental Hell”.

    #26 & #27 are just a small representation of the many webmasters who are desperate to be communicated with on things that matter in their day-to-day ops.

    What I’d like to see is a responsive support desk and someone updating us regularly on key forums and partner networks with what’s happening on critical fix issues such as “Supplemental Hell”.

    Then I/we can get back to living.

  79. That’s a good idea, Dave.

    Let’s all chip in and Matt can give KinderStart.com a special refund.

    I’ve got some pocket lint and a handful of crumbs from the bottom of a Doritos bag.

    What else have we got?

  80. Jon and Dave,
    I agree that suing Google has zero chance of success and is a waste of time and energy – probably an act of frustration. But I have to sympathize with this company. If you’ve read any of the posts regarding the Google “Supplemental Hell” problem and looked at the symptoms, you would see that this company is the classic victim of this Google “issue”. They have serious canonical problems which are not being addressed with 301s, and Google Big Daddy is just hammering them for this, removing all but their HOME page, of which Google appears to maintain 3 copies in its main index: one with www, one without, and an https version – all probably having to divide the site’s limited Google PR, is my guess.
    You can argue all day that this problem is the website owner’s, and I would probably agree. But Yahoo! and MSN both seem to get it right (site fully indexed under the www prefix) – thus the site keeps its ranking.
    Then you have to ask yourself: is the internet better off including this site in search results or not? They appear to have a decent website with useful content. I’ll probably get shot down by Matt, whose magic utility found some hidden text :-). I’m sure with the lawsuit, he won’t be able to comment.
    These canonical problems have been hell for lots of site owners. Sites fall into 3 categories.
    1 – Those that have total protection from canonical issues (and not just www vs non-www) – probably only a small percentage, is my guess.
    2 – Those that are unprotected but fortunate enough not to have internal or external references that trigger these canonical problems.
    3 – Those left unprotected and unfortunate enough to have these references create duplicate index entries in Google, and thus serious ranking problems.

    This is just my humble opinion, and I’m certainly no expert, but the sooner Google and Big Daddy fix these problems, the happier a lot of site owners will be.
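
    For site owners in category 3, the standard self-help fix is a server-side 301 onto one canonical host name. A rough sketch, assuming Apache with mod_rewrite and a hypothetical example.com:

        # .htaccess sketch: collapse non-www requests onto the www host
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
        RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]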

  81. Hi Matt ,

    Looking at a thread on WMW at http://www.webmasterworld.com/forum30/33552.htm, I can see a consensus amongst some webmasters who would be prepared to pay for access to a professional support service – albeit the majority understand that Google’s role isn’t that of an SEO for the general community.

    I understand Google began a test program last year of communicating with some webmasters and/or site owners. How did this progress?

    Is there the ability for this to operate in conjunction with Google Sitemaps to improve overall quality-control alignment with site owners?

    Al

  82. Hi Martin,

    I did a post today on the Sitemaps blog about why you might be getting an “unsupported file format” error.
    http://sitemaps.blogspot.com/2006/03/more-information-on-new-unsupported.html

    If you check your Sitemap and it seems to conform to what’s listed in that post, I would suggest posting to the Sitemaps group with a link to your Sitemap:
    http://groups.google.com/group/google-sitemaps

    The blog post covers most of the issues we’ve seen, so hopefully it will help.
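
    For anyone hitting the same error, the first thing to check is that the file really is plain UTF-8 XML in the Sitemaps format. A minimal example of the kind of file the parser expects (the URL is a placeholder; 0.84 was the Google Sitemaps schema version in use at the time):

        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
          <url>
            <loc>http://www.example.com/</loc>
            <lastmod>2006-03-19</lastmod>
          </url>
        </urlset>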

  83. This is something I feel strongly about: many of the problems on that page of his are problems for webmasters and not for web users. I really do feel the ‘rule’ “do it for the users, not the webmasters” comes into play here. #3 and #4 both seem to be a personal dig. I wouldn’t worry too much about his opinion.

    Keep up the good work, Matt.

  84. RE: “I can see a consensus amongst some webmasters who would be prepared to pay for access to a professional support service”

    What sort of support issues are you thinking of? If it’s a “support service” for outed sites, then I’m dead against it and highly doubt Google would oblige.

    AdSense, AdWords and even Sitemaps have a great support service in my experience.

  85. “What sort of support issues are you thinking?”

    Some of those raised in the webmaster community on the WMW link above, related to system issues.

    I can see your point if you relate it to BMW, who were able to be reinstated [it might seem unfair] – but look – they simply made an error – they’re a highly reputable corporation – they’re not in the practice of black-hat SEO.

    But, equally, I’m sure there’s a way to help thousands of Mr-and-Mrs-Average website owners or webmasters with their sites if the service was funded, and still lots of opportunity for Google to sell its add-on services.

    I saw some great seeds of suggestions that might work for Google and the community.

  86. Hey Vanessa,

    Thanks a lot. It seems to be working now.

    Greets Martin

  87. I know my opinion is harsh, but it’s my personal point of view:

    Google has converted the web into a link collection, where everybody is valued as “tell me how many links you get and I will tell you how much you are worth”.

    That is, in my point of view, a big distortion of things, and the horrible part is that it’s distorting the entire way that people develop web sites.

    I know that links can be, in an IDEAL world, a good way to know the relevance of a site… BUT that is NOT true anymore in this real world.

    The theory behind Google is good… just in theory. In the real world it is becoming just a collection of tricks to get traffic. The more tricks you know, the more traffic you get.

    But still nobody is talking about ranking results based on their content.

    The same page, with no content changes, can be ranked VERY differently in Google, depending on the links the owner can get AND – in some extreme cases – on his knowledge of Google’s many preferences about how he uses ALT attributes, titles, etc.

    Again, we are talking about everything BUT content.

    I hope that some day Google goes back to analyzing CONTENT to rank pages and forgets all this ‘spam’ stuff of links, anchors, ALTs on images, etc.

    Finally, the user is searching for content, just content… and nobody seems to remember that.

  88. Too much to list in the likes department.

    I just love Page Creator – what a treat; it’s so simple and attractive. I would pay for this if I could tweak the titles and add sitemaps.

    I would happily pay for Gmail – fantastic – I have ditched Y and MSN for email now that I’ve become accustomed to Gmail.

    I love my Google Mini – paid for happily

    Dislike = Search?? I do SEO for a living and feel there is still too much garbage out there.

    Try searching for “djs bay area” or “realtors charlotte” – OUCH!

    AOL results look much tidier…

    I like the way G involves everyone (or seemingly so).

    I like that I’ve got a real chance of making a living with Google (sharing is good).

    I like the occasional webinar.

    Em – that’s it for now

    David 28078@ gmail dot com

  89. Hey Matt,
    I read your blog every day and have gone through pretty much every single post… but WTF is Big Daddy? Could you please explain what it is?

    cheers !!

  90. It’s been a while since this was posted but I just found one more thing I don’t like about Google. Actually, I pretty much hate it.

    It’s the local results at the top of the page. I hate them because they reward those that have money or have spent money. I do all my business online and work from home. I hate the fact that larger companies can show up at the top of Google just because they have a bricks-and-mortar operation. The web should be for the little guy too.

    Now I know I can submit a location and have my business listed there but then I’d have people showing up to my house to watch me work in my underwear. I’d rather the local search be a tab at the top like images and maps. That’s all. Thanks.

  92. Hi Matt,

    I know this is an old thread, but there is one thing I really hate (ok… dislike), or maybe just don’t understand, about Google search. Why is it that just changing the interface language also completely changes the results? It seems that any change in search results should be a function of the search-language parameter and not the interface language. Also, how does Google decide which language is associated with a website? Is it a page-by-page analysis or a site-wide analysis?

    Any input would be greatly appreciated.

  93. Okay,

    So, being very unbiased about Google, I think all the issues I can imagine are mostly mentioned in this thread and in Danny’s post. I actually see an issue with the lack of rebuttal; not even other webmasters have posted many reasonable explanations for some of the things brought up here. Where are the constructive WHY responses that you’d expect from any other company, or from followers of a company? Here’s one at least…

    In regards to #9, caching pages: just think long and hard about this one, and I think any webmaster can come up with a reason or two why any search engine is better off caching a web page. First off, they’d waste – no, DESTROY – your bandwidth if they had to spider your site every time someone wanted to search for something. I don’t know about you, but I can’t afford to have Google look at the same pages a million times apiece per day. You’d be able to sue them for shutting down your site because it went beyond its bandwidth capacity. Secondly, it makes searches 100x faster. Since Google spiders sites on a reasonable basis, it doesn’t flood my bandwidth, and it allows you to type a search and find results instantly rather than over the course of what would take, at the least, minutes.

    What would the alternative be to caching pages? Of course, Google could just snip your text and images and automate the theft of all your content so that it can be searched without “cached” pages, but they’d still have all your data anyway, just not out in the open. That would seem very sketchy to me.

    I personally enjoy Google’s honesty in this regard at least, if you can’t view cached pages from a search engine, then they have your data stashed away somewhere, and you don’t even have access to it.

    A cached page from Google, to me, represents a free form of internet security for a website I chose to make public. If someone steals my website content for themselves, I not only have my own data for a court case, I have Google on my side to show that my site was in fact cached before their site was ever even on Google. Plus, say I lost every physical possession to my name… I might still get some of my content back.

    I’d cover more topics, but I have a lot to say on just one, so I’ll leave it at that. Great thread, guys; I’m glad there are a few places around that might actually have the critical throughput that Google needs. It’s not the throughput I’m concerned about, though, it’s the feedback. I think they do listen to us fairly well, considering there are millions of us to 80,000 of them. They just don’t tell us enough when they actually are listening. I bet they could expand their following a lot if they showed their dedication to user feedback more often.

  94. I admit, Google is the leader of search. And when you’re king, you make the rules. But it would be nice to have more control over what Google collects about our websites.

    Like linking content to your copyrights (via webmaster tools), then being able to manually scan across the Google index to see if content is being taken from you and duplicated on other websites in the index.

    More control over expiration of cached pages, like when your website design completely changes or is intentionally taken offline. Faster than disallow in robots.txt.

    Control over what countries you do/don’t want your website showing up in via search results, in spite of Google language tools.

    Allow comment blocks in HTML to include/exclude content you want to be indexed – a start and a finish comment block, as in the sketch below. Enterprise search and AdSense do this.
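
    The comment-block convention mentioned above is real in both of those products; a small illustration (the surrounding markup is made up) – the googleoff/googleon pair is Google Search Appliance syntax, and the google_ad_section pair is AdSense section targeting:

        <p>Normal content, indexed and used for ad targeting.</p>
        <!--googleoff: index-->
        <p>The Search Appliance skips this block when indexing.</p>
        <!--googleon: index-->
        <!-- google_ad_section_start -->
        <p>AdSense gives this block extra weight when picking ads.</p>
        <!-- google_ad_section_end -->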

    Is there someplace in Google where you can see if a website is blacklisted or untrusted, or see if websites you’re linked to are either untrusted or could bring down your ranking? Thanks, Matt.

  95. I hate something: that your blog doesn’t use excerpts, or at least some type of HTML sitemap, like wp-archives, so we can surf better here… can you Matt++ this? 🙂
