Video: Google Webmaster Tools

At this point, I think Google’s webmaster console offers enough useful features that every site owner should try it on a site. If you haven’t checked out Google’s webmaster tools recently, I’d highly recommend taking another look, especially for the brand new “pick www vs. non-www” feature.

Session 13: Google Webmaster Tools. The webmaster console provides tools to
– tell Google that and are the same
– test out robots.txt files
– discover crawl errors that Googlebot saw
– see some spam penalties
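For sites that can control their server configuration, the new preference pairs naturally with a server-side 301 redirect. As a rough sketch (assuming Apache with mod_rewrite enabled, with example.com as a placeholder domain):

```apache
# Hypothetical .htaccess sketch: send non-www requests to the www
# host with a permanent (301) redirect.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The same rule, with the condition and target swapped, would standardize on the non-www form instead.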

All this is happening as Sitemaps is renaming itself and tackling a wider scope: all of webmasters. There’s even an official Google blog for webmasters now at

By the way, I wanted to send shouts out to folks that have been transcribing the videos:
– Dave at ViperChill was one of the first to twig to transcribing as a linkbait idea, and has transcribed many (all?) of the videos so far.
– Philipp Lensen has been among the most meticulous. Philipp also prodded me into thinking about what the copyright should be for these videos. Based on the talks we had, I’m placing the videos under a non-commercial Creative Commons license. So as long as you don’t charge for it, feel free to mash up the videos all you want. 🙂
– Poor Rebecca Kelley is stuck transcribing for SEOmoz and is starting to ask me not to do any more video answers. Sorry to have to disappoint you, Rebecca. 😉

111 Responses to Video: Google Webmaster Tools (Leave a comment)

  1. Harith

    Good morning Matt,

    Is it possible to have those great informative valuable videos all on one page somewhere within WMC?

    You have really done a great job, Matt.

    Thanks a bunch.

  2. Zac

    Thank you Matt for all the great videos and posts.

    I’ve translated your first 10 videos into Chinese at:

    In fact I’ve been translating your SEO tips and suggestions for a few months in my blog. You have lots of fans in China as well. 🙂

  3. Stephen

    Hi Matt

    You say in your video that your domain has been tested for this preferred domain feature.

    At the moment if you do a check there are pages indexed under followed by – and if you query “” or “” as a phrase then you get nothing returned for either.

    Hmmmmz – let’s hope that is still just the fallout from the testing that has occurred on your domain?

  4. The Query Stats section of the Webmaster tools is much improved and going in the right direction…


    There is one flaw. There are two terms:

    Top search queries

    Top search query clicks

    They should have realized that it would have been difficult for some to distinguish between them 😕

    And another suggestion would be for the hyperlinked SEARCH TERMS to go to the actual page that the website in question is on – not to the 1st page of SERPs.

  5. This video is currently not available… please try again later… does this only happen for me?

  7. Speaking of suggestions, where is the best place to make a suggestion?

    For instance, ever since it’s been hot in the States, lots of humidity ads have been showing up via AdSense on one of my sites. The reason for this is I have a small box that shows the current weather, since it’s a local site.

    Someone told me that if I moved this to the bottom of the page via CSS, AdSense wouldn’t put much weight on it and would use the other text on the page to show ads. I tried this for a while, but it didn’t work.

  8. Ian

    Matt, here’s a question for you (I think it got buried down below the 100 posts the last couple of times). Perhaps it would make a suitable video.

    I am wondering, if you have a site hosted in, say, the UK, whether it will do worse in general web search than a site hosted in the US? Or if we are hosting in, say, Canada, will we be penalized in searches? Are IPs used just for helping work out that a site is a candidate for a country-specific search, where there is nothing else (e.g. a TLD), or do they factor into the rankings at all?

    I know a lot of people and clients are concerned about these sorts of issues.


    ps: Rebecca has my sympathies, I’m finding it hard to keep up with you too 😀

  9. Matt, thanks for that great rundown on Webmaster tools, etc. VERY helpful.

    One thing: WHY (oh, why) do our AdSense login and Google_other logins HAVE to be different? I have an autofill for my ID and Password which is domain triggered, and because my Google_other login ID “cannot be” the same as my AdSense login, I constantly have to re-type my ID & Password EVERY time I switch from Webmaster Tools to AdSense stats.

    HELP !

  10. Jenny Rudge


    There have been suggestions that rel=”nofollow” links might actually count against the site which is being linked to.

    The implications for reverse Google Bombing are obvious, so would you be able to confirm or deny this for us?



  11. Ian

    Now how did S.E.W. get that pretty box in his comment?

    Hmm, let’s have a go:

    Ahh well, at least his homepage is PR0 🙂

  12. Ian

    Great video Matt – I guess the new www/non-www feature is really for those who can’t do 301 redirects. Is a 301 from one to the other still the preferred way of dealing with this problem?

  13. Give that team a big slap on the back. Downloads are fine but how soon will they have a proper API?

  14. Hi Matt,

    I have one recommendation regarding the Sitemaps statistics: the Page Analysis feature could be improved by excluding from the Common Words list all prepositions and stopwords in general. These are implicitly common, and cannot be used as keywords (or, at least, they shouldn’t), so they carry no relevant information whatsoever. Also, a customizable stopword list would be welcome.


  15. Links – without links.

    A question that occurred to me today. As I run a news website, we get cited a lot – and quite often it is by our site brand name, not the full URL.

    Does Google have a way of associating plain-text, non-link references to a website as a “link credit” to the source, even though the brand name is not actually clickable?

    Obviously, I am biased in this – but it seems to me that mentioning the source/brand name is still the website “endorsing” the source – only the webmaster/writer didn’t think to be techie and turn the editorial into an actual hyperlink.

    Just wondering.



  16. All i heard was ‘dub dub dub’ and ‘no dub dub dub’

    hehe. just kidding, thanks again for another great video!

  17. Matt, I actually thought my old boss was the only person in the world who said “dub dub dub” … maybe it’s bigger than I thought.

  18. I used to use Wuh, Wuh, Wuh dot…

  19. Anyone else getting that the video isn’t available?

  20. Ben Pate

    I just looked at Rebecca’s post…

    I guess it worked cause Matt caught the bait 🙂

    Keep those videos coming Matt.

  21. Mike Franks

    Matt, any feedback on local state and town keyword spam? i.e. [town webdesign]

  22. Jeff


    In reference to the choice in Sitemaps for the www vs. non-www, does that mean we no longer have to worry about 301s?

  23. Mike Garde

    Hey Matt –

    In regards to Webmaster Central… are there plans to tell us what pages look like each other? I know I have 700+ unique pages on one site but I just ran Sitemaps and it generated 5000+ from my log. I don’t want to compare URLs all day; will Google do it for me? (PS, Webmaster Central tells me about 6 errors, not 4,300.)

    Thanks, and thanks for this blog, it’s very useful

  24. Ray

    The Adam That Doesn’t Belong To Matt:
    Yes, but clicking on the movie link a 2nd time worked for me. YMMV

    Till Yahoo/MSN/OtherGuys all get a Sitemaps/WMC, the 301s probably still matter to them.

  25. Jeff

    One other item on the preference: if we choose one, will Google adjust the PageRank and backlinks accordingly?

  26. Thanks Matt for another good video. I’m so happy to see a solution to the “dub dub dub” issue. Granted it might take time to totally sort itself out once we state our preferences, but at least there is light at the end of the tunnel.

    Make sure the folks working on developing the tool formerly known as Sitemaps know that many of us really appreciate their efforts.

  27. pavlos

    It is really encouraging that you have finally done something to help webmasters severely affected by spammers. Well done! Positive things deserve to be pointed out. I have been whining about too much being spent on AdSense/AdWords and too little on search quality, but I am fully supportive of such efforts.

    You are starting to live up to the expectations of those who have used Google since shortly after its migration to the domain.

    The only thing is that IT IS NOT ENOUGH!!! WE WANT MORE!

    Google wants spammers out (even if they have AdSense) and webmasters want them out; apart from not contributing anything other than Viagra and similar content, they are useless or potentially dangerous for your searchers. So LET’s KICK THEM OUT !!

    Sitemaps is a very positive start. I am only hoping that Google’s intention is purely to improve search quality…

    Is it true that all GS data are stored in LDAP? Want to mention this in my project report if true…

    (Pity I can’t go to SES; I am stuck with a very complex LDAP project for my uni. Hope you do a few long posts about it, or maybe do a “breaking news” Mattcast.)

  28. They should change the reporting for the most-clicked queries to show average clicks per day, or total per month, or something other than average position… as my top-showing queries are usually the same as my most clicked (not many users drill down to page 3), and they already show average position.

    This would provide far more useful information (that Google’s already looking at in whatever code produces that report), and it would also prevent inconsistencies that make you look bad… for example, one report says I’m #2 average position for a query, then right next door it says I’m #3 for that query.

  29. It would be very helpful if sitemaps would provide the page linking to the page not found when listing pages Not Found. Very time consuming to search a site’s file system and db’s to find bad links.

  30. [quote]It would be very helpful if sitemaps would provide the page linking to the page not found when listing pages Not Found. Very time consuming to search a site’s file system and db’s to find bad links.[/quote]
    It does under “Diagnostics” ==> “Crawl Errors” ==> “Not Found (x)”. It is a very useful tool. Many times the 404s are caused by off site inbound links. Nothing to do then, but to put a 301 redirect in place.

  31. I saw the link bait aspect a few days ago and reported it too 🙂 I guess I was a little slower than the other guy 🙁 Otherwise I’d have a link.

    Matt: A few comments:

    I really wish you’d issue some posts along with the videos — not transcriptions, but technical details and concrete examples of what you’re saying. For example, when you spoke about mod_rewrite, you said something along the lines of “eliminate extra parameters,” or “use mod_rewrite.” That’s a great but very loaded statement, and mod_rewrite is probably not terribly useful unless you also eliminate the underlying parameters to begin with. Mod_rewrite, AFAIK, only helps if you have lots of parameters _despite_ the fact that you have no duplicate content. A perfect example of a 3-parameter URL that would be OK for a rewrite is:

    ?site=1&category=2&product=3, where all the parameters actually materialize into major page content changes. I’d rewrite that URL.

    But many people use mod_rewrite as a magical fix and think they’re done. This is wrong. In reality, even if you rewrite URLs, if you’re just encoding the URLs with the same extra parameters — you’ll still have the duplicate content, and still have a spider trap in the worst case.

    This would be a perfect place to show some URLs and stuff that can be avoided — tracking parameters, session IDs etc — and maybe a mod_rewrite rule to encode it after we eliminate the said tracking parameters.
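    The kind of rewrite described above might look something like this (a hypothetical sketch – the parameter names, paths, and domain are placeholders, assuming Apache with mod_rewrite and internal links that already omit tracking parameters and session IDs):

    ```apache
    # Map a clean URL onto the real 3-parameter script.
    # /site/1/category/2/product/3 -> /index.php?site=1&category=2&product=3
    RewriteEngine On
    RewriteRule ^site/([0-9]+)/category/([0-9]+)/product/([0-9]+)/?$ index.php?site=$1&category=$2&product=$3 [L]
    ```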

    I love your videos. But I have an irrational disdain for cats ever since my roommate had this cat that liked to wake me up to “play” at 4AM regularly 🙂 Next time, bring your dog!

    Despite the cat, I still love the videos 🙂

    Man this comment was long …

  32. Don’t mean to bug you again.

    I think you are doing a good job in such a delicate position. I apologize for being of the “speak your mind” type. I deal with the struggles in the SERPs for the smaller, normal web sites, the average Joe business looking for a few extra telephone calls. He’s got a pretty tough go of it out there in the organics if he’s playing by the rules. I am not refined enough to remotely resemble anything near being like a professional, so please pardon my directness, I am just looking for something to say to my customers that trust me to do what is in the best interests of their web site promotions.

    Here are my questions: In order to offer the average Joe hope in international competitive niche keyphrase markets, can I claim that we have as fair a chance at getting a decent ranking in the organics as any other web site if we continue to practice SEO as described in the Quality Guidelines – basic principles?

    And secondly,

    What about rewards for fair play? Is Google assigning “reward” for web sites that adhere explicitly to the Quality Guidelines and can these “rewards” be intensified for those that are detected to not artificially inflate the importance of the web page content?

  33. Matt,
    The tools would be great if they correctly showed useful information – specifically, why a site with decent PageRank is being penalized or filtered in the SERPs. If honest webmasters had this information, I think you would be surprised how many of us would make a serious effort to clear up the issues that are causing these penalties and filters. The rampant speculation of why this and why that is happening, and all the testing of “theories,” is what gets us upset, and in many cases creates many of the black-hat tactics used to get pages reinstated. Put tools in place that provide us with information we can use to fix problems with our sites. Obviously Google knows very clearly why a penalty or filter is being generated – why can’t Google provide us with this information so we can fix the problem???

  34. Could one say that the only way to fix canonical confusion between “www” and the non “www” is via Google Webmaster Tools?

    I believe it is extremely hard to fix this and it is somehow related to the way the search engine functions, a kind of necessary failure in order to go further in beating webspam.

  35. Rob


    Question I would love to have answered either by video or just here on the comment board is this:

    Is it considered duplicate content or a punishable offense if a webmaster buys multiple domain names and has them all redirect to one main site?

    Example:, and all pointing at

  36. Harith

    Rob Said,

    “Is it considered duplicate content or a punishable offense if a webmaster buys multiple domain names and has them all redirect to one main site?”

    I would consider it as spamming the index, and deindex the whole bunch of those multiple domains, but not the one they redirect to, unless all the domains are owned by the same person/company.

    I signed up for the new webmaster console, and my reaction was – “Geez, is this really how Google views my site?”

    There was a line item for highest pagerank page, it lists the deadest page on my site for the last three months. It’s also one of my lower pagerank pages as reported by the toolbar, but I understand a lot of people think that tool is way out of sync with reality.

    The words in external links to your site is pretty neat, but it all makes me worry if I’m killing myself in the future by keeping the unrelated subjects I write about on a single site..

    One thing that really puzzles me is it reports 404 http errors, file not found, for eight navigation maps (NVM files) that I know are there because they work. Maybe that’s a problem with my server.

    I did the preferred domain thing; neat idea.

    I don’t really get the query stats; they don’t seem to have much in common with results I get from Google Analytics. The query stats seem to be more geared to some relative weighting scheme, like Google Trends. I guess that’s what’s meant by “that most often returned pages from your site.”

  38. Matt:

    Can you clarify the nofollow tag and how it impacts Page Reputation?

    Google tells us they honor the nofollow and that allowing a link in my blog will not “vote” page reputation or cause it to be spidered.

    Hence, when someone posts Trusted Partner, it shouldn’t affect Adobe one way or the other, even if the Anchor text said “Viagra – Debt Consolidation – Casino”.

    Looking at “Page analysis” in Sitemap, however, I can see if I post Widget Marketing that I will see both “Widget” and “Marketing” show up in what Google says is “In external links to your site”. I will even see “Matt” and “Cutts”.

    So, does the nofollow apply to reputation as well as page rank?

    Is sitemap giving us an unfiltered view of what external links say about our pages?

    Should I be posting this in the sitemaps area instead?


  39. Matt — here’s a video question for you:

    Does it matter if a domain is registered for 1 year, 5 years, or 10 years? Does GoogleBot or the algo look at the length of the registration period?


    A co-worker just brought this up to me and I figured I would ask you since you’re the perfect resource for the answer. I hope to see it in the next video. Thanks Matt!

    ~~ Jonathan

  40. Thanks for the mention in the post, although I don’t like the idea of writing them as link-bait; I simply wanted people to know there was a place to come to use the text to translate, or for people with hearing issues etc 😉

    Over the last few months I have viewed this blog more and more. I even went back and had a look at what the first ever post was like; I expected more of an introduction.

    Great work..long may it continue

    Great Job Mr Cutts

  41. Chris Bartow, have you tried this?

    Try it around your weather code.

    Matt, I noticed something with your video that you may wish to consider: the picture is pretty well irrelevant. No offense, dude, but your face and the languages-of-the-world map lose some utility after the 13th video, and I found that by opening up the Google Video link in a new window and having the audio going while working on other stuff, I get just as much out of it.

    So…I have two suggestions:

    1) Include screenshots or some other aspect that requires video.

    2) Get a mic and just record the audio. This might work out better for the dialup-types (I’m not one, but I noticed a few people bringing it up), and everyone would likely get the same thing out of it.

  42. I am reading SEW and other sites tonight covering SES 2006 to see what I might have missed, so far it is much the same as we have in here and in our blogs, speculation. Boy has Google confused the hell out of all of us…hehe!

    Adam, that might be cool, maybe Matt could pull the audio off that video somehow?

    And transcribing is ok but is often used for spamming the story first.

  43. JLH

    Adam who Matt doesn’t have ownership of,

    My guess is since the title of the Blog is Matt Cutts: Gadgets, Google, and SEO. Gadgets gets a higher billing than Google or SEO (unless it’s just alphabetical, which would blow my theory so I’ll ignore it), so the gadgetry (is that a word?) of the process is the reason. Matt gets to play with his camera, do some editing, post it on a Google page, and talk about SEO and Google. What could be better or more on topic?

    P.S. I see Matt’s gone supplemental with the rest of the world, so since the blog has a high PR, and most posts are original, I’m not nearly as worried as before about what I am doing wrong.

  44. Me too, Matt – not happy with the video explanations; it was a bit of a bother to download the videos, easier to go through the text. 🙁
    But I couldn’t afford to miss such valuable info, so I have to go through the pain. 🙂

  45. Hi Matt,

    it seems to me, that there is only one (kind of) t-shirt in your wardrobe. it would be nice to see you in some merchandising stuff 😉 …


  46. Jonathon, why wouldn’t you register it for 10 years? It’s only like $50 … and that’s not much of an investment if you’re serious about your site. It’s cheaper than re-registering every year too, and a lot less annoying.

    Unless you’re running a site to test out the traffic, or spamming, or running a temporary site or what not… there’s no reason to register the domain for 1 year.

    I can also think of very few (albeit some) reasons why one of the site types I just listed above should show up in the first few Google results. I’d be surprised if they didn’t take this into account…

    But in all honesty… if your site is any type of business or non-spam site, it’s in your best interest to register for 5 years anyway. It’ll save you money.

  47. My guess is since the title of the Blog is Matt Cutts: Gadgets, Google, and SEO. Gadgets gets a higher billing than Google or SEO (unless its just alphabetical which would blow my theory so I’ll ignore it) so the gadgetry (is that a word?) of the process is the reason. Matt gets to play with his camera, do some editing, post it on a google page, and talk about SEO and Google. What could be better or more on topic?

    Another gadget testing just the audio to see how others might respond to it. 🙂

    Actually, it was partly stemmed from observation of what others had said who had dialup (specifically JohnMu). It might pose another alternative. Maybe both would work, I dunno. That’s why I threw it out there.

  48. Ryan: That makes sense, but what I’m wondering is if it has any effect on the SERPs if your domain is registered for a longer period or not. You state good reasons that you would register a domain for a while, but I’m really just interested in the effect of it.

  49. I guess what I was saying, Jon, is that there’s no way it can hurt you, and if it does help you (which it probably does), it’s so small that it’s not really worth spending the amount of time it took me to type this.

  50. Jonathon, why wouldn’t you register it for 10 years? It’s only like $50 … and that’s not much of an investment if you’re serious about your site. It’s cheaper than re registering every year too; and a lot less annoying.

    That also makes the presumption that the registrar will be around in 10 years, you won’t have issues with them, and that the prices won’t drop to the point that the $50 would be less expensive in the long run.

    At least, those are my reasons.

  51. Matt,

    I just Googled:

    matt cutts Speaking Unofficially

    because I was testing how Google returns posts vs monthly archives vs main pages for blogs. The funny thing is that Google returned:

    #1 – 41k – Cached – Similar pages


    #4 – 47k – Cached – Similar pages

    Did you say they are still experimenting with the function that tells Google that and are the same, or was your preference to keep them separate?

  52. Jim


    Would you possibly comment on the state of PR in the Big Daddy infrastructure?

    It seems clear to me, and many others based on forum comments, that the last 2 PR updates have been some sort of incomplete update which applied only to pages added post-BD. My site, in particular, has many PR 3 and 4 third-level pages (added after BD), while the home page has been stuck at a 2 since Feb. In light of our legitimate link-gathering efforts, this number is impossible for me to accept. A few months back you mentioned to me that my supplemental problem is due to low PR… the inability of Google to properly update my PR seems to have me in a vicious circle. What’s a guy to do?


  53. Jack from the Netherlands

    Hi Matt,

    Thanks for interacting this way with the community, much appreciated!

    As an extension to the Preferred Domain choice (with or without www) I’d like to see another important option.

    For all my websites I’ve registered several TLD extensions, just to protect the brand name. I’m pointing these domain names to the exact same website, and it’s not meant to get a higher PR or to spam the index. The problem, however, is that visitors are not only linking to the domain I’d like them to link to; they’re linking to – say – 80% main domain, 10% other TLD, 10% other TLD.

    It would be great if there was an option to say, hey, this is my main domain name. Do count PR for all other TLD variations, but do not show them as a separate result in the Index.

    Again, it’s almost a must to buy all variations of a domain name nowadays, otherwise the domainers will register them and that’s a great nuisance.

    P.S.: I’ve chosen not to solve this with redirects in Apache. All TLDs are serving the same website, but under different domains.


  54. Supplemental Challenged

    “- see some spam penalties”


  55. g1smd

    * * “Is it considered duplicate content or a punishable offense if a webmaster buys multiple domain names and has them all redirect to one main site?” * *

    If they all issue a site-wide 301 redirect for both non-www and www URLs on the extra domains, and they all point to exactly the same place on the target domain, then you can do that with as many domains as you like.

    Those that issue a redirect will never appear in the SERPs. There is no duplicate content to consider. I assume that you’ll mainly use this to sweep up the people who mistyped your domain name.
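    A site-wide redirect of that sort could be sketched like this (hypothetical, assuming Apache with mod_rewrite on the extra domain; example-typo.com and example.com are placeholders, not the commenter’s actual sites):

    ```apache
    # Send every URL on the extra domain, www or not, to the same
    # path on the main domain with a 301.
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^(www\.)?example-typo\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
    ```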

  56. Charles

    I have used the brand new “pick www vs. non-www” feature and I hope it makes a difference. How about a ‘/’ vs. /index.php feature? Nowhere on my site do I link to index.php; I only use “/” when referring to my home page, yet Google still indexes both of them, and I am pretty sure penalizes me somehow for having duplicate pages, even though it should be clear to Google that index.php is the root of my website. This is my last thought on why Google might have dropped me out of the SERPs over 13 months ago, while allinanchor: shows my site exactly where it was the month before it was dropped.
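    One common workaround for this situation (not an official Google recommendation – just a sketch assuming Apache with mod_rewrite) is to 301 explicit requests for index.php back to the root:

    ```apache
    # Redirect explicit /index.php requests to / so only one version
    # gets indexed. Matching against THE_REQUEST (the raw request
    # line) avoids looping on the internal DirectoryIndex rewrite.
    RewriteEngine On
    RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.php [NC]
    RewriteRule ^index\.php$ / [R=301,L]
    ```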

  57. JLH

    Matt, I’ve been seeing duplicates listed under the site: command for pages based on capitalization, like Sitemap.asp and sitemap.asp, which on an IIS server are one and the same. Is that a new development? I haven’t seen it before at all.

  58. Thanks Matt.
    This post made me go to Webmaster Tools (after dropping Sitemaps for some time).
    Found a few errors I wasn’t aware of, and fixed them.

    any tips on bank account tools? 🙂

  59. Hi Matt,

    I think your videos are great, but I’ll get to the point (as no doubt you’ve already got a lot of comments to read).

    Do you know if Google’s Bot prefers ‘’ over ‘’.

    For example, if I had either: ‘’ and ‘’ which would be considered higher out of the two?

    I only ask as at the moment I’m having a small discussion on another web forum about how to section off a web site via the domain name.

    Thanks Matt, great blog and great videos, they are very helpful.

    – Chris H

  60. I’ve tried using the new tools

    One of my sites is a bit problematic – they had been blocking their homepage (in fact all php files – don’t laugh)

    I tried deleting the lines in error but this didn’t seem to have any effect
    Eventually in a brute force approach I deleted all the robots.txt (they had 4)

    Now a few days later sitemaps reports no pages indexed but when I do a site command I see loads of pages

    Argh! – this is not doing my blood pressure any good
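    For readers wondering how a robots.txt file ends up blocking all PHP pages, a rule like the following (a hypothetical sketch, not the commenter’s actual file) would do it – the wildcard disallows every URL containing .php for all crawlers that support pattern matching:

    ```
    User-agent: *
    Disallow: /*.php
    ```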

  61. Sorry for the off topic!
    Does changing your host have a negative effect on a site’s rankings?
    I’m assuming it does, as the IP address will change; do you have any suggestions to minimize the effect, or is it just “suck it and see”?

    Sorry again about the off topic, but I seem to have misplaced your mobile number 😉

  62. Sorry, just found the answer here: Webmaster Help Center.

    Hopefully it will stop others asking stupid questions …ahem!

  63. Argghh! Matt Cutts, you are a Googlicious thorn in my side…

  64. We’re eagerly waiting for your next presentation via video! 😀 Hopefully we’ll get some good answers out of it. Wait.. hopefully? Heck, I know we will. You always have good stuff. Thanks Matt!

  65. Joe

    Matt I have just updated my sites to the www as recommended.

    Here lies the issue. When I return to the main Sitemaps page (as I have several sites), there is a non-www site listed on the page for each one updated to the www.

    These are self-generated by this feature. OK, so do I delete it, or add the same sitemap to it as well?

    I don’t want duplicate content here so this would be a great thing to know.

    Being such a new feature I am not sure you can really give a safe answer either, but please consult with the team and shoot us an answer.


  66. Hi Matt,

    how are you?
    Nice videos, I love them!

    Anyway three questions:
    1. Is it bad for my ranking to have two URLs pointing to the same site (and backlinks to both of them) like:
    – I hope not.
    2. Is it right that content which comes first on a page has a higher priority? (And because of that, it’s better to have the content in front of the navigation.)
    – I don’t think so, but I’ve read it a couple of times.
    3. How much have your site visits increased since starting the videos?
    – I think it’s gone nuts…

  67. I have to correct:
    and IdOfThePost

    (had the second one in ” < > “)

  68. Hi Matt –

    Nice to see you at the Google Party! Question about Webmaster Tools: Barry’s coverage of your duplicate content comments had you saying something about reporting DMOZ linkage corrections to sitemaps?

    I can’t find anything about that in my console though it would be helpful – DMOZ response is horrible. IMHO they worry more about screening out editors for capricious reasons than delivering a responsive editing process.

  69. We are trying to build a combination of a web popular-science magazine and a reference library, the library function being similar to Wikipedia but covering a much smaller area.

    Hence we publish quite a lot of science news and articles.

    A lot of times I feel that the tools and other stuff Google puts out are made for quite static web pages, or sites that grow quite slowly with a generally static organization of data, like blogs.

    I also feel that a lot of the information on how to work with the Google search engine is made for small static pages.

    Maybe Google could make some whitepapers describing how services like ours should use the Google tools, and how to structure data in a way that works with Google.

    I also looked at Sitemaps earlier this year, but the problem I noticed is that the pages are described in one long flat list. But maybe the protocol wasn’t ready?

    For web pages like ours it would be more useful, at least for ourselves, to be able to use the sitemap to actually describe our web page. In that case it could be something that would be useful for ourselves as well. For example:

    1. Rooibos
    1.1. Rooibos_health_note
    1.2. Rooibos_business_news
    1.2.1 Rooibos_business_news_1
    1.3. Rooibos_articles
    1.3.1 Rooibos_and_training

    And so on.
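A side note on the flat-list point in #69: the Sitemaps 0.9 protocol really is a flat list of `<url>` entries, so any hierarchy has to be implied by the URL paths themselves. A minimal Python sketch (the example.com URLs and page names are hypothetical, loosely based on the rooibos outline above):

```python
# The Sitemaps 0.9 format has no nesting: every page is a sibling
# <url> entry inside one <urlset>. Hierarchy survives only in the
# URL paths. All URLs below are hypothetical.
pages = [
    "Rooibos",
    "Rooibos/health_note",
    "Rooibos/business_news",
    "Rooibos/business_news/1",
]

entries = "\n".join(
    f"  <url><loc>http://www.example.com/{p}</loc></url>" for p in pages
)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>"
)
print(sitemap)
```

Tools that consume the file see only the list; the nesting the commenter wants survives only as path segments like /Rooibos/business_news/1.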

  70. Matt Cutts,

    First off, thanks for the reply regarding (my day job). I had a response back from Google in wicked-fast time. I am sure that the prominence of the site helped to get a response, but I was very impressed and appreciate the help.

    My personal site had over 97,000 pages indexed on Tuesday August 8th, 2006 and then unindexed again the next day. My site traffic increased about 10 fold when the new pages hit the index.

    A. Why was the site indexed with the 97,800 results?
    B. Why was the site unindexed within 24 hours?
    C. Are there any factors within my control that could have caused this problem (bandwidth, database structure, sitemaps, etc.)?

    This is not the first time this has happened; it also happened shortly before the Big Daddy update.

    Thanks for your time.

    Brent (David) Payne
    Lyric Vault

  71. Hi Matt,

    How accurate is the PageRank information in Sitemaps?
    Is it more accurate than the toolbar?


  72. Matt,

    How close is Google to being able to automatically transcribe the speech from videos into text, and then ranking based on the speech content of a video?

  73. Joe

    Brent (David) Payne
    Took a look at your site; several things jump out at me:

    1. No backlinks.
    2. It looks like you have two home pages indexed; one is pretty messed up:

    Notice: Undefined index: action in d:\Inetpub\wwwroot\includes\config\config.php on line 20

    Fatal error: Call to undefined function mysql_connect() in d:\Inetpub\wwwroot\includes\database\database.php on line 2

    This comes from

    Looked at the cached version of or your first link on this Bet this is your problem

  74. Kirby


    You should have videotaped some of the “Meet the Engineers” sessions at the Google Dance. They did a fantastic job of explaining their products. Amanda was a trooper, going over Webmaster Central a gazillion times.

    What was really cool was to see the passion several had for their projects/products. I asked several what their favorite aspects were or what they felt was underused, and received a wealth of information. They were also very open to suggestions.

    I know I mentioned this to you at SES, but since they each told me they read your blog for feedback, I figured I would give them props here for the way they deftly (and very tactfully) avoided certain questions and topics. They clearly all passed the Mattspeak course that I’m sure by now is mandatory for all “Meet the Engineers” engineers.

    These folks obviously enjoy their jobs. A few have already followed up with me about a few questions.

    Thanks again.

  75. Rob

    Submitting a site map also greatly decreases the amount of time it will take for your site to be indexed. Another good read, thanks Matt.

  76. Hi, Matt !

    You have done a very good job with the latest video series. I have translated part of these at

  77. Nice to see things evolving, but for me the Google system still has bugs:

    The diagnostic page says we got no pages indexed. And indeed, when searching nothing shows up. However, when searching for site:www.handvä (IDN-code replaced with the UTF-8 letter) then everything seems fine.

    Maybe this also has something to do with the fact that on every occasion we update our main page, Google keeps TWO versions of it in the index, as demonstrated at . This was independently reported about 18 months ago on, but it didn’t seem to get any notice. A side effect of this seems to be that the ranking is split between the old page and the new one. Eventually the new one is the only one left and it has all the ranking, but as soon as the page is changed, the cycle starts over again, affecting search results.
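On the IDN point in #77: the index stores internationalized hostnames in their punycode (“xn--”) form, which may explain why the two site: queries behave differently. A quick sketch with Python’s built-in idna codec, using a hypothetical domain rather than the commenter’s:

```python
# An internationalized domain name lives in DNS (and typically in a
# search index) in its ASCII punycode form. Hypothetical domain:
domain = "bücher.example"
ascii_form = domain.encode("idna")
print(ascii_form)  # b'xn--bcher-kva.example'
```

So a site: query on the UTF-8 spelling and one on the xn-- spelling are, as far as the index is concerned, queries on two different strings for the same host.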

  78. What is the meaning of the Top Search Queries Table? Last week’s update to the sitemaps (I mean webmaster center) left this mystery untouched. My 2 theories…

    1. These are the highest-volume searches that returned my site in the results. If this is true, what is the cut-off that Google is using to say that my site was “returned”?

    All the searches in my table have my site ( in the top 10, so this could be the answer. Sometimes I see some phrases as high as 11 or 12, but never higher… And it lists me for searches like “homeschool curriculum” (where I’m 9th) but not “homeschool” (where I’m only 50th), even though it is much higher volume.

    2. These are the highest volume searches where some minimum number of people actually clicked on my link.

    This explanation makes some sense, although it’s inconsistent with Google’s explanation. But it makes sense in that it’s using similar data to what’s in the second table, just in a different way.

    Conclusion – I return to the simplest explanation: The QUERY table shows the highest volume searches where

  79. The videos have been very informative Matt, many thanks for taking the time to put them together for us.
    I was interested in the videos that included information about cloaking, as it’s something that really annoys me in the SERPs.

    I do wonder, though, how the large webmaster-related site WebmasterWorld can get away with cloaking a large percentage of its content. Is there a possible bug in the Google software, or is it just something WMW is doing cleverly that is not being picked up, or is it not classed as cloaking? If it’s not classed as cloaking, why not?

    Keep up the good work

  80. I am creating an account right now. I started to after the sitemap feature came out, but I wasn’t having issues then; now I am, and the www vs. non-www issue is finally being addressed and admitted 😉 Good stuff.

  81. Hi Matt,

    Where would I be able to see those spam-penalties? Or do they only show up if you actually have them?


  82. Hi matt,

    Thanks for the mention in the recent video (I’m famous, hehe!)

    It’s great to see the www vs. non-www issue finally being sorted, because on a site I do SEO for (Lumos Lighting) it is not possible to redirect one of the subdomains, as it messes with the Barclaycard payment service we use (security & authentication, yada yada).

    This has probably been asked more times than you care to think about, but the site mentioned above is nowhere to be seen in the SERPs for the keywords I’m targeting, whereas it’s placed on the first page on MSN… is there any indication I can look for (apart from regularly checking the SERPs) to see when my site will begin to be ranked for those terms?

    Have a great day



  83. How did we survive without videos before? Great stuff

  84. Sorry for double post.

    Does the non-www to www fix actually prevent the duplicate content problem (URL canonicalization thingy?) or just make your site look better in the SERPs lol.

    Again, enjoy your day!


  85. Why are we getting this page indexed when searching for “sponsored links”

    Thanks Matt.

  86. pick one

    Hmm, the 4 links are algorithmic. Fine, top tier sites get the 4 links. But the 4 links that get chosen are sometimes absolutely useless.

    one of the 4 links is a redirect!

  87. Every page on my main site is in UTF-8, but the Google page analysis tool reckons 80% of them are in US-ASCII. I guess only 20% of my pages actually have any extended characters in them, so 80% of them would qualify as ASCII, but 100% of them have the META HTTP-EQUIV tag setting CONTENT=”text/html; charset=UTF-8″…
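The guess in #87 is right at the byte level: US-ASCII is a strict subset of UTF-8, so a UTF-8 page containing no extended characters is byte-for-byte valid ASCII, and a byte-sniffing detector can report it as such regardless of the META tag. A quick Python check:

```python
# ASCII is a subset of UTF-8: text with no extended characters
# produces identical bytes under either encoding, so a byte-level
# sniffer can legitimately report "US-ASCII" for a UTF-8 page.
plain = "Hello, world"
accented = "Café"

assert plain.encode("utf-8") == plain.encode("ascii")  # identical bytes
assert plain.encode("utf-8").isascii()                 # pure ASCII bytes
assert not accented.encode("utf-8").isascii()          # é needs two bytes
```

So the 80%/20% split the commenter observes matches which pages happen to contain extended characters, not which META tags they carry.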

  88. Steven

    Why doesn’t Google just assume that and are the same, and have a link that says “tell Google that and are NOT the same”?
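Until a preferred-domain setting is available everywhere, the usual webmaster-side fix for #88 is a server-side 301 from one host to the other. A sketch for Apache’s mod_rewrite, assuming a hypothetical example.com and that .htaccess overrides are enabled:

```
RewriteEngine On
# Permanently redirect any request for the bare domain to the www host.
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

With the 301 in place, crawlers consolidate both hostnames onto the www version over time, which is the same outcome the preferred-domain feature aims for.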

  89. JLH


    I notice that your indexed pages are slowly but surely switching over to from even though all the links on your site are the www version. Is this a test of the Sitemaps preferred domain system?


  90. Hi Matt, burning up your bandwidth to catch up on your vids… great stuff, thanks for carrying on with it… maybe you can release them on DVD?? Only joking… 5-minute slots are great for jumping in and out of the video list.

    Cheers for the update about the webmaster tools… it’s been a while since I last logged in, so I will check them out.

  91. Top search queries are a little strange for me. I do not know exactly how Google ranks sites, because there are a few weird things there…

  92. JLH

    I don’t think the comments are monitored once they are off the front page.

  93. All one could possibly ask for, for webmasters with site issues, is a simple explanation to that webmaster of what is causing his or her specific penalty or filter. I am absolutely certain webmasters with these issues would immediately repair them. I just feel all this guessing with webmaster tools is doing no one any good at all: not Google, not webmasters, and certainly not the consumers using Google. Tell us where our specific problems exist and WE WILL FIX THEM. ENOUGH of the speculation!!!

  94. Matt or Anyone,

    I have added a robots.txt file to keep a few directories on my site from being indexed. It’s been out there for a month but the pages still exist. Do you have any idea why they are not getting removed by Google?
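One caveat for #94: robots.txt only stops future crawling; on its own it doesn’t pull pages that are already in the index (the URL removal tool is the faster route). You can at least confirm the rules parse as intended with Python’s standard urllib.robotparser, sketched here with hypothetical paths:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt blocking two directories for all crawlers.
rules = """\
User-agent: *
Disallow: /private/
Disallow: /drafts/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Blocked directory: Googlebot (covered by "*") may not fetch it.
assert not rp.can_fetch("Googlebot", "http://www.example.com/private/page.html")
# Everything else remains crawlable.
assert rp.can_fetch("Googlebot", "http://www.example.com/index.html")
```

If the rules check out and the pages still linger, it is usually just the delay between a rule being added and the next recrawl, as the reply below notes.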


  95. Dave, removing pages takes a while – give it a month or so and they’ll go.

    Why not do a PHP 301 redirect to the new page? And if you don’t have a new one, just redirect to your homepage. This way you won’t lose the value of the old page; it’ll be carried over to the new one.

  96. Christoph

    I have added the google12345.html file to the root directory of my website. Although it is there now (200 OK), the webmaster tools tell me that my 404 page gives a 200 OK code as well. Well… it doesn’t. Thanks to redirects, I cannot add the meta fallback on my index page. Whom can I contact to confirm that I am the owner of that domain? It is hard to find a personal contact in the webmaster tools help section. Thank you so much.

  97. I have been using webmaster tools for a while now, in an attempt to increase my site’s rankings.
    On the summary page for my site, under web crawl errors, I have 10 “Not found” errors. When I click on the details, all the pages listed are Shockwave Flash buttons that were deleted months ago.
    I updated my website ages ago, removing these files, and yet Google webmaster tools is still looking for them.
    Why is that? I have checked my site and I don’t have any pages using these files or linking to them. How long will these errors appear before being removed? Is it better to leave them deleted, or should I reinstall them? (Although none of my pages will link to them, at least I would not receive this error any more.)

    Thanks. Graeme

  98. Matt,

    Is there any rhyme or reason as to why a page would not get reindexed or refreshed? I have a page to which I made tons of changes, and the last cache was Nov 22. That makes it over a month.


  99. Yes, Google’s webmaster tools are really nice. They give very detailed reports on your site. And if you want more detail, you can also use Google Analytics. It is like Google’s webmaster tools, but it gives you more detail about the traffic on your website and other things. Really, Google is the best.

  100. Mad

    Really a very nice tool, but the sitemap function is useless, I think. The sitemap files have no effect.

  101. I suggest that Google Webmaster Tools display an updated PR every day (a more exact PageRank, with ranges).

  102. I’ve been using the Google webmaster tools for a while and was very happy until recently. I’m not sure why, but Query Stats, Links, & Page Analysis aren’t updating on my sites: & The golf site is showing it was last crawled on Apr. 13, 2007. Do I need to delete these sites and re-submit, or is there another fix for the problem?

    Any help would be appreciated. I used to use these tools everyday.

  103. Hi Matt,

    The Google Webmaster Tools seem to lack freshness…
    The “links” section looks more than two months old for a lot of people, and the “Googlebot crawl stats” graphs stop in May.

    Google has obviously been pushing webmasters towards these tools… but what is the point if the data is obsolete?

    It might be good to learn what is going on 😉



  104. I don’t think the comments are monitored once they are off the front page.

  105. How did we survive without videos before? Great stuff

  106. I have found that my webmaster tools external links and indexed pages stopped updating three weeks ago. I have also found many others complaining about Google webmaster tools not updating their information. How can I solve this problem, please?

  107. How do you see spam penalties in the webmaster section?

  108. Does embedding video in a webpage increase its position in the Google SERPs?

    I no longer trawl the web looking for the secrets to high SERPs – instead I follow the basic rules and strive to become as much of a resource as possible for my visitors.

    I’ve recently committed to producing quality video for my websites, but I sometimes wonder if all of the effort will increase the traffic Google sends me.

  109. I have a question regarding Google WMT: when I look at my keywords in GWMT, the top 10 are only prepositions and conjunctions (at, for, which, to, etc.). How can I get rid of those and let the nouns be on top?
    Thank you in advance.