Great colleagues

Speaking of the Search Engine Strategies conference, I was thinking of things that I meant to post about but haven’t yet, and I remembered reading this post about what a great job Charles Martin did at Search Engine Strategies in Chicago. Charles had never done any search engine conferences, so we sat and prepped for an hour or so about what SES is like and what questions he might get. And he knocked it out of the park: he spoke at two different sessions and even joked around with his co-panelists. Heck, he took better notes than I usually do at a conference, including writing down all the questions that people asked. Danny Sullivan recently sent word that Charles got high marks on audience feedback, too. Charles, thanks for representing Google at SES Chicago.

I often tell people that if I failed the “bus test” (that is, if I got hit by a bus and didn’t survive), there would easily be 50 Googlers who could take my place and talk about webmaster issues, and I really think that’s true. Google does a pretty good job on webmaster communication, but we need to keep looking for ways to scale even more. It’s healthy to bring more technical Googlers into the rotation at conferences: we’re sending Brian to speak in Australia in March, and at Search Engine Strategies in New York I’ll be joined by software engineer Aaron (in addition to the loads of other Google speakers who will be there, and hopefully several other engineers; I know that some Sitemaps folks are coming too).

Especially over the last month or so, I’ve enjoyed talking to colleagues on the crawl and indexing teams, Sitemaps, and core quality about how Google could further improve webmaster/engineering communication. I’m excited about how far we’ve come, and we’ll keep looking for ways that we could scale up communication from the engineering side of Google.

38 Responses to Great colleagues

  1. TORONTO, Matt. When are you coming to TORONTO?

    I’m too lazy and cheap to go to those other places. 😀

  2. Matt: Any plans to send the infamous Google Guy to a conference … or has he retired?!? 😉

  3. Hi Matt,

    You ask how Google could improve webmaster communications? That’s simple. Just answer emails instead of leaving people in the dark.

  4. Dear Matt and All,

    Need your help guys… [fake URLs used]

    In 2003 I bought two domains: “music-store.com” and “music.com”. Assume both domains had never been used (the real domains I bought had never been used before). I launched my brand new website with 100% original content using “http://www.music-store.com” and kept the “music.com” domain unused. In 2004, AOL spotted our work and we made an agreement with them. Our new main URL moved to “http://music.aol.com”. We placed a 302 redirect from “http://www.music-store.com” to “http://music.aol.com”. By the end of 2005, as our business evolved and times changed, I decided to move away from AOL, and the reason that led me to use “-store.com” was no longer valid, so we moved to “http://www.music.com”, which is cleaner and easier for our readers to remember.

    Firstly (and probably wrongly, I know), for an entire week we made our website available through any of the three domains while our partners (REAL BUSINESS partners, not link-buying stuff, directories and so on) changed their links to our new URL “http://www.music.com”. At the end of that week our website was fully indexed by Google. So we placed 301 redirects from both “http://music.aol.com” and “http://www.music-store.com” to “http://www.music.com”, page by page (i.e. “http://music.aol.com/news.php?id=222” and “http://www.music-store.com/news.php?id=222” redirecting to “http://www.music.com/news.php?id=222”).
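
    In case it helps anyone picture it, our page-by-page 301 looks roughly like this (a simplified sketch in PHP; the domains are the placeholders above, and the exact code is just illustrative):

        <?php
        // Simplified snippet at the top of every page on the old hosts
        // (www.music-store.com and music.aol.com in the placeholder naming).
        // $_SERVER['REQUEST_URI'] keeps the path and query string, so
        // news.php?id=222 maps to the same page on the new domain.
        header('HTTP/1.1 301 Moved Permanently');
        header('Location: http://www.music.com' . $_SERVER['REQUEST_URI']);
        exit;
        ?>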

    One week later my nightmare began. In the first days of January, my website, which had always ranked NATURALLY in the top SERPs for music-related articles in my native language, vanished from Google’s SERPs; that is, you can’t find my website in the first 1000 results for any keywords you use. Also, if you query Google with [http://www.music.com] or [http://www.music-store.com] you get no result. If you query [http://music.aol.com] you will get as the result “http://www.music.aol.com”, an AOL automated search page for unknown subdomains (“http://www.music.aol.com” never existed! Our domain was ONLY “http://music.aol.com”). If you query [site:http://www.music.com] you will find the whole website indexed and cached, while the old domains still show a few hundred pages each out of tens of thousands.

    I have already tried to get in touch with Matt Cutts and Google. The first never answered my requests, and Google took two weeks to send a human answer to my reply to their automated answers (I filed a reinclusion request). Google’s answer was completely useless, saying it may be “normal fluctuations”. If IGN disappeared from the SERPs, would that be considered “normal fluctuations”? I don’t really think so. Well, my website is, in my native language, something like a more specialized IGN. If it vanishes from Google’s SERPs, Google has almost nothing relevant to show about music in my native language.

    After all this, the REALLY weird thing happened a week ago. We made some changes to our website; however, the webmaster forgot to re-enable our website’s function that verifies URLs to avoid canonical problems (that function is disabled for internal testing in our labs). That function is also responsible for the 301 redirects (a sketch of such a check appears at the end of this comment). Two days later, I noticed we started to show up partially as we used to, BUT with the old URLs. We noticed that we had forgotten to re-enable that function, and so it was re-enabled. However, we were in Google’s SERPs again and we were not sure why… Let me explain…

    If you queried either [http://www.music.com] or [http://www.music-store.com], you would get “http://www.music-store.com” as the result. If you queried [http://music.aol.com], the result would still be “http://www.music.aol.com”, which has never been our URL, as you remember. The top #1 SERPs we started to get again (as it used to be) were under “http://music.aol.com”, and we showed up again at #3 (for the same keywords) but only for our main page at “http://www.music-store.com”. However, still for the same keywords, “http://www.music.com” started to show up at #60, while it had never shown up AT ALL before (in the first 1000 results).

    As you remember, I told you we re-enabled that disabled function two days after the error, right? Well, two days after fixing that error, [http://www.music-store.com] disappeared again; [http://music.aol.com] continues to wrongly return “http://www.music.aol.com”, although we still show up in the TOP SERPs; and [http://www.music.com] started to show “music.com/” as the result (we never used links without “www.”).

    For some reason, Google is absolutely as confused about this as I am. I keep trying to talk to Google to solve the problem, but no luck so far.

    What do you think about this real case study? Can anybody help or offer advice?
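
    For reference, the “verify URL” function mentioned above boils down to a canonical-host check. A simplified sketch of the idea (placeholder domain again; this is only a guess at how such a function might look):

        <?php
        // Simplified canonical-host enforcement: if a request arrives on
        // any hostname other than the preferred "www." form, 301-redirect
        // it there so only one URL per page can get indexed.
        if ($_SERVER['HTTP_HOST'] !== 'www.music.com') {
            header('HTTP/1.1 301 Moved Permanently');
            header('Location: http://www.music.com' . $_SERVER['REQUEST_URI']);
            exit;
        }
        ?>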

  5. I’m not sure I’ll make it to Toronto, Adam. That may be a chance to ramp up speaking appearances by some other engineers. 🙂 Sounds like Toronto is a great city though.

    Valentine, I’d love to get to a place where webmasters don’t have to send emails–the ability to solve most issues by using the webmaster console or Sitemaps would be great.

  6. Matt, who will you send to the European conferences??? Anyway, great work so far.

  7. S.E.W., that’s because this is my own site, not a Google site. But there are Google engineers contributing to a ton of blogs (e.g. AdSense, AdWords, Sitemaps, Google Video, the Blogger Buzz blog, the Google Desktop Blog, the China blog that Googlers do), and also speaking at conferences too.

  8. >Few people REALLY understand the ALGO DNA – they understand the peripheral, but their insight only penetrates to a certain level. But collectively, crowd wisdom really adds pieces to the puzzle.

    That’s by design, I would assume. Couldn’t have irresponsible people using Google DNA to clone a search engine algo now, could we? Imagine the horrors that would produce.

    Seriously, I’m sure Google will go open source the day after MS does.

  9. Matt, didn’t you get the memo? SES has a whole ritual for Toronto. I wear a Matt Cutts mask and give away false “secrets” at the Google party. Can’t wait for this year’s!

    Actually all kidding aside – this would be hilarious fun but it would take me at least a year to grow facial hair, so a mask it would have to be.

  10. I am considering paying the money to come to the conference just so I can ask you when the next PR update will be. I know Big Daddy is slowing it down, but I would love to know a ballpark time frame so I don’t drive myself crazy checking every day. What should I do, Matt?

  11. Hi Matt,

    Anyone from Google going to SES China?

    Marc

  12. I wouldn’t be surprised, but I’m not sure. Strangely enough, my Mom will be in China around then though.

  13. How many engineers is Google planning on sending to SES NYC?

    I plan on attending the “Lunch with the Google Engineers”. Yet it is only 45 minutes, which is not much time. An entire session block (or two) with Google engineers for Q&A would be great. People could submit their questions ahead of time so that time is not wasted on basic issues already covered.

    Any chance the lunch will be catered by a Google Chef? *smirk*

  14. Why are those other engineers NOT contributing to this blog? Why can’t other perspectives be revealed and discussed?

    For the sheer sake of starting one of those rumours that has no real substance but is fun to watch spread across the Net, it’s entirely possible that the other engineers do contribute, but under nicknames or the assumption of other identities.

    Why, I could very well be a Google engineer myself, and you’d never know.

    Think about it. Scary, huh? 🙂

  15. Valentine, I’d love to get to a place where webmasters don’t have to send emails–the ability to solve most issues by using the webmaster console or Sitemaps would be great.

    That’d be great, Matt. But it will never happen, through no fault of your own.

    The problem is that too many things are subject to interpretation, and with different people interpreting things different ways, that leads to “Google said that I can’t put invisible text on my site” or “Doorway pages are okay, they won’t ever ban for that, Matt said so” or the infamous sandbox bitching and moaning.

    By the way, Toronto is fast deteriorating. So I can’t say I blame you for not going. It’s not what it was 10-15 years ago.

  16. Hi Matt

    I like this one most:

    When Matt answers questions, you notice how he always throws in terms like, “in my personal opinion”, “if I was to do it this way”, “if I was you”, etc. In addition, Matt has a tact of deflecting questions that he can not or should not answer.

    Adding to Matt’s selection of “deflecting terms”: “I wouldn’t be surprised if…” 🙂

    And talking about non-deflecting terms: how is it going with our friend BigDaddy? Any interim weather report?

    Have a nice weekend, Matt.

  17. > I’m excited about how far we’ve come, and we’ll keep looking for ways that we could scale up communication from the engineering side of Google.

    Would it be useful or even possible for Google to send a satisfaction/info/data gathering survey (to those who use Google Sitemaps for example)?

    I too am excited. Of course, being in a very competitive industry, sometimes I worry about too many sites getting the “right” info! In other words, I think Google keeping some things secret is positive.

    I agree with others who have posted here that spam definitely needs to be reduced. For example, I’m a retailer and I see my competitors running 2 or 3 nearly identical websites (except for their alias business names and different URLs), offering mostly identical products on each site, dominating (hogging) the 1st SERPs for main and many important keywords. Of course this shoves down or drops the rank of other good competing sites. I don’t know if Google considers this practice to be spam, although they should IMO. If not, what’s to stop me or someone else from doing the same thing, until eventually only sites owned by just 2 or 3 companies would show up on the 1st and 2nd SERPs? I also see some sites that have 2 different URLs from their site listed in a row, for example hogging positions 2 and 3. It’s good for business if you can get it and bad if you can’t.

  18. I understood that Google’s rationale over the past 6 months has been to de-rank sites where reciprocal link building was employed by their SEOs. Personally, I have stopped all practices relating to reciprocal linking and/or making low-PR submissions.

    Can you tell me how to respond to one of my clients, whom I have eloquently tutored on the changes that have been made to search engine algorithms of late?

    He runs a small travel agency in Vienna and his business is in selling tickets online – generally for the Spanish Riding School and the Vienna Boys Choir. Both are included in the title, metas and front page copy of the site, with links to their respective inside pages.

    However, when I look at Google in “Find web pages that link to http://www.viennaticket.com”, I see Entertainment Services Link Exchange – http://www.bigbook.pl/links.php?c=19&cn=Entertainment+Services. Then I look at “Find web pages that contain the term http://www.viennaticket.com” and on page 3 I find Spanish Riding School – http://www.srs.at/index.php?id=386.

    Can you explain why a link exchange page with no visible sign of the Vienna Ticket agency appearing on the page should be placed in the links section, when the Spanish Riding School – which couldn’t be more ‘relevant’ to the site – is not?

  19. Roll on “webmaster console.”

    I’ve not a bad word to say about Google from a searcher’s perspective. However, it would be great to have a better answer than “I don’t know” when my board of directors ask me “why isn’t our new website listed in Google?”

  20. Yep, it’s good to have more than just one guy at the helm during a storm… smart. 🙂

    Can someone help me with a Google/webmaster question?

    question:

    I want to make my blog more professional and have been removing old posts.

    Will my 404 page take care of this, or should I use a 301 that redirects back to the home page? From Sitemaps, the 404 seems to be all that’s needed?

    thanks,

    aaron

  21. Matt said “Valentine, I’d love to get to a place where webmasters don’t have to send emails–the ability to solve most issues by using the webmaster console or Sitemaps would be great.”

    Matt,

    That would be fantastic and couldn’t come soon enough for someone like me who, in six months of trying, has only ever gotten the response, “We understand your concern and have passed your message on to our engineering team for further investigation”.

  22. Aaron, if the posts are truly removed, I would go with a 404. Points well taken, Mistah and Valentine.
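
    For illustration, a minimal sketch of serving a true 404 for deleted posts, assuming a PHP blog front controller (post_exists() and 404.php are hypothetical names, not a real blog API):

        <?php
        // If the requested post no longer exists, send a real 404 status
        // rather than a 200 page that merely says "not found" (a soft 404),
        // so search engines can drop the URL from the index.
        if (!post_exists($_GET['id'])) {   // hypothetical lookup function
            header('HTTP/1.1 404 Not Found');
            include '404.php';             // friendly error page for visitors
            exit;
        }
        ?>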

  23. Well, I sort of answered one of my own questions. I just recalled that Google DOES consider more than one related website operated by the same organization to be spam, because Google AdWords allows one company to list only one website per keyword. It’s hard to believe that AdWords policies are more strict (or at least more strictly enforced) than Google search. I would think the opposite would be true. Now if we can just get big (daddy) Google to act like little Google! 🙂

    “No Double-Serving
    Google maintains a high standard for our user experience and the quality of our advertising. To protect the value and diversity of ads running on Google, we do not generally permit advertisers to manage multiple accounts featuring the same business or keywords. To learn more, please review our Double-Serving Policy.”

    https://adwords.google.com/select/guidelines.html

    “Can I show more than one of my ads on a page?

    To provide the best possible experience for our users and advertisers, Google does not permit multiple ads from the same or affiliated company or person to appear on the same results page. We believe that pages with multiple ads from the same company provide less relevant results and a lower quality experience for our users.”

    https://adwords.google.com/support/bin/answer.py?answer=14179

  24. Thanks, Matt.

  25. Matt,

    Been reading your blog for a couple of months and I was wondering whether you ever come to the UK? If so, when will you next be appearing? Sorry if I have missed earlier entries where you have already mentioned this.

    Thanks
    Richard

  26. Matt, could you point me to some kind of French version of Matt Cutts? 🙂 I e-mailed you and e-mailed Google but never got any reply (well, no problem, I understand you’re a busy man), so maybe there is a specific French spam team or a person I could contact and get answers from…

  27. > I’d love to get to a place where webmasters don’t have to send
    > emails–the ability to solve most issues by using the webmaster
    > console or Sitemaps would be great.

    This is what seems to be a cultural problem inside Google. They aim for the perfect automated technical solution but until it is found, non-technical solutions are not explored enough.

    Nothing wrong with the search for the holy grail, but in the meantime please think outside the “automated” mindset and throw more (educated) people at the problem to answer these emails while you work on more automation.

    “With the console, everything will be fine some day” is not practical for the people who have problems NOW. And the completion of the wonder-console will inevitably take more time than planned. I don’t buy “it is not worth it for a few months”. It is worth it, and it will be more than a few planned months.

  28. Hey Matt – I’m looking forward to meeting Brian White next month in Sydney. But I really thought you’d have stepped up and taken that gig for yourself!!!

    🙂

  29. > we’ll keep looking for ways that we could scale up communication from the engineering side of Google.

    It would really be great if webmasters could get a hint as to why their sites dropped out of the SERPs months ago for no apparent reason. A lot better than tearing our hair out daily.

    How about letting us know what it is that Google doesn’t like about our sites, and what we can do to get back into the SERPs again?

  30. Thanks for the hot tip about Brian “speaking in Australia.” I didn’t even know a conference in Oz was on the cards. Needless to say I’m locked in and looking forward to it!

  31. First off Matt, please don’t take the bus test any time soon, okay?

    When it comes to communication, I think my personal biggest gripe is that the method for reporting blatant / obvious spam issues is the same as for reporting any other “don’t like the results” issue, and often leads to a slow response or worse. If there were a more direct and effective way (potentially with some feedback) to handle the most obvious blatant spam (like third-level .info domains packed full of unlinked URLs, repeated words, and massive cross-links), I am sure that we all could help to keep the worst spammers from infesting Google.

    It is a common complaint I see in most webmaster forums I visit.

    On the plus side, blogs like this and other less than official communications are certainly helping to keep the webmaster community feeling a little more part of the game, and that too is a good thing.

    Thanks for having this blog and keeping us up to date, and let’s hope 2006 and beyond come good for all of us.

  32. Speaking of better “webmaster/engineering communications”…
    The most frustrating thing about communicating with Google is that after taking up the invitation to help clean up the index by reporting an obvious cheat website in detail via google.co.uk/contact/spamreport.html or google.com/contact/spamreport.html (where it says “We thoroughly investigate every report of deceptive practices and take appropriate action when we uncover genuine abuse”) there is no way of knowing what the reaction will be or if anyone is going to take any action at all.

    It’s even more frustrating when the reported website is still appearing in the top ten results when you repeat the same search weeks later. If a website makes changes in the meantime (removes the hidden text, or whatever it is that’s fooling the index) that would be understandable. And maybe sometimes Google disagrees with reports. But when it’s beyond any doubt that a site has been designed to fool the index (e.g. by adding a bunch of keywords in ‘invisible’ text) and it’s still in the same position weeks after being reported, it gives the impression that the report has been ignored.

    Are there simply too many reports to handle? Do most of them have to be passed over?

  33. Matt,
    It’s really great that in this forum people can ask you questions and get actual answers. It’s a really great resource and, as I’m new to SEO, a great learning experience.

    Keep up the good work.

    Will

  34. Matt,

    I agree with Alex that there is really no need to push the whole “bus test” much further. This has the dangerous makings of a subject/thread that gets pushed around for a long time…

    As for improving webmaster communication… I haven’t seen that Google is throwing a party for SES NY. Considering the quality of parties that you guys throw, when armed with a city like NYC, why would you back off? Any chance that you can rally those party-throwing-AdWords-folks to make something happen??

  35. Yes, more communication would be good.

    For example, right now, due to Google’s current implementation of Smart Pricing, people are starting to advocate removing Google ads from under-performing / non-targeted pages because they reduce the amount the webmaster gets per click. I can understand that you want to pay more for relevant targeted clicks and less for general-interest clicks, but you need to be careful how you implement it. Right now it seems to be on a site-by-site or account-by-account basis instead of a page-by-page basis.

    That means a smart webmaster will remove Google Ads from non-targeted / under-performing pages and replace them with, well, some competitor of yours that doesn’t have Smart Pricing. If you made your Smart Pricing work on a page-by-page basis, similar to PageRank, this would not be an issue, since targeted pages would be paid the higher targeted-content amount and non-targeted pages would get paid the lower general-click amount. Of course, even then, smart webmasters would be wise to replace Google Ads on under-performing pages with something without Smart Pricing that pays more, but at least, even if they don’t remove the under-performing ads, they won’t get penalized for a couple of pages that bring down the Smart Pricing average for the whole website.

    After finding out this information, I am now going to have to go through some of my high-traffic websites and see what ads I should remove. There are certain pages that, by their nature, don’t get many click-throughs or are not targeted in nature, or for some reason are not performing well. Since all Google Ads are inserted into many of my websites via include files, I will now have to figure out a way to show Google ads on some pages but not others, which is a big pain (see the sketch below). Plus then I also lose the benefit of Google Ads reporting to me the number of page views for the entire website and each section (by using the channels feature). That allows me to see what’s hot and what’s not, and develop content and articles and posts related to what people are actually interested in.
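
    One way I might do it without duplicating templates (a rough sketch only; the include name is made up):

        <?php
        // Hypothetical per-page flag: each page sets $show_ads before the
        // shared layout runs, so one include file can still serve the whole
        // site while skipping under-performing or non-targeted pages.
        // Pages that never set the flag keep showing ads by default.
        if (!isset($show_ads) || $show_ads) {
            include 'adsense_unit.php'; // the existing shared ad include
        }
        ?>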

    Also, ironically, a website with good content and a lot of it will have a LOWER click-through rate than a junk website where the only place to go is to click on the ads. If you are not careful, you are actually penalizing content sites where people stick around and read a lot of pages, and rewarding websites that encourage people to go away by clicking on an ad.

    It’s things like these that Google needs to be careful of and needs to talk with webmasters about. Many of your policies have unintended consequences for legit webmasters even though those policies are targeted at spammers. So more feedback and dialog between legit webmasters and the Google engineering team would be great.

  37. I look forward to attending the panels you’ll be on at SES in NY.
    One thing I’d love to see addressed in some session is how sites that have good placement in Google can safely go about making major changes to the design and structure of their sites.

    Two scenarios come to mind. One is a situation like that described in an earlier post, where domains need to change because of changes in the business, or in how the business wants to feature products.

    The other scenario is just when a site grows and the original structure no longer makes sense from either a site-management or a visitor perspective. The site may be too big for visitors to easily find what they want, and it therefore makes sense to spin off smaller, more targeted niche sites. (Say you started with MyPets.com and after a few years you had thousands of pages about cats, dogs, gerbils, etc., so you want to spin off a MyCats.com, MyDogs.com, etc. How could you do that effectively?)

  38. SER2006 was yesterday in Sydney. Brian did a great job, speaking at the breakfast session and again presenting at the main conference around midday. And he hung around at the conference until early evening, answered a lot of questions, and took a lot of feedback. I saw him making lots of notes in his little black book. 🙂

    It’s a long way to come to Australia for a three-day round trip – so kudos to Brian for putting in such a long day after a long flight.

    Tomorrow is Google Australia’s 3rd birthday – shame Brian couldn’t stay for the party!

    Thanks for sending Brian out to Australia, Matt.
