Webmaster console features?

What do you want to see the Webmaster Central team do next? About 10 months ago, Google Blogoscoped asked what people wanted next. It’s time to ask that question again, because the team has made great progress. From the original thread:

– See the backlinks for my site: DONE. Site owners can now see their own backlinks.

– Verify an IP address is really Googlebot: DONE, not in the webmaster tool, but by using a reverse+forward DNS lookup (a rough sketch of that check appears after this list).

– An option to easily remove URLs from the index: DONE. Google’s URL removal tool has been ported into the webmaster console, and it allows site owners to see and revoke their self-removals.

– Show how many people are subscribed to my website’s feeds in Google Reader: DONE, but not in the console. Google Reader now reports these numbers when fetching feeds. Feedburner will give you even more stats for free.

– Communicate with webmasters in an authenticated way: DONE. Just last week, Google added a webmaster message center to provide authenticated communication with site owners. The Webmaster Central team has done a ton of other stuff in the last few months as well.

So let’s ask the question again. I’m going to try doing a poll (it might not work with Google Reader, so you might have to visit my actual blog page). I sat down and thought of a few features that might be cool, then added some suggestions from the original Blogoscoped thread. Vote for your favorite feature below, or leave a comment if you have a different idea. Please note: I’m sure the Webmaster Central team will be interested to see what people like, but remember that these are just suggestions. Even if something is #1 in the poll, it’s still only a suggestion. The Webmaster Central team has to be free to pursue whatever they think is the most important.

(The votes from this poll are now gone, but I’m including the options from the original poll.)

What should the webmaster central team do next?

  • More information about penalties or other scoring issues
  • Tools for detecting or reporting duplicate content
  • Show PageRank numbers instead of none/low/medium/high
  • Show links on your site that are broken
  • Score the crawlability or accessibility of pages
  • A way to list supplemental result pages
  • Some type of rank checking
  • Tool to help move from one domain to a new domain
  • Tell Google the correct country or language for a site
  • Diagnostic wizard for common site problems
  • Option to “disavow” backlinks from or to a site
  • Fetch a page as Googlebot to verify correct behavior
  • Show causes of 404 errors
  • Tell Google a parameter doesn’t matter
  • Show pages that don’t validate
  • Integrate “Add URL” feature
  • Ability to show/download all pages from a site (e.g. if your server crashed)
  • More documentation and examples

263 Responses to Webmaster console features?

  1. Umm, why no option, “ALL OF THEM!!!!” :-()

    Although I voted for “Diagnostic wizard for common site problems”, I think “More information about penalties or other scoring issues” runs a very close second if for no other reason than to prove/disprove a penalty actually exists.

    And, were the “information about penalties” to include the actual reason for the specific penalty and specific page(s) it is being applied for, it would then seem to be a toss-up, for me, as to which were more useful.

    I base this not on what I myself would necessarily find all that useful, but more on what I see most questions come up about at the Google Webmaster Tools help forum.

    I still think there should have been an “ALL OF THEM!!!” option. 😉

    Craig

  2. Hey Matt,

    We don’t want Mrs. Cutts to wake up and smell the coffee, do we? 🙂

  3. I really miss one thing: when I go through the external backlinks in Webmaster Tools, I miss an option, like a check box, to “ignore all pages from this site”. When you go through hundreds of links, it is pretty annoying to have to see links from your other sites.

  4. ♦ One feature that would be extremely helpful (but VERY resource intensive) would be to have an ARCHIVE of rankings for keywords suggested by the users …

    ♦ Also it would be helpful to filter and display the most common keywords and keyphrases in the anchor text of a site’s backlinks …

    ♦ Also, if a site is being scrutinized by the Google Webspam team – it is vital that website owners get to address their concerns DIRECTLY and confront the Google Webspam team about the standards that are being used …

    You are insular and do NOT have the benefit of the other side’s viewpoints and arguments, and there is a tendency to overgeneralize and simplify things into Good or Bad.

    Of course, one can post in forums and blogs – but no website owner really knows what is being said or discussed behind one’s back – those who dare to MUST be able to confront it.

    http://blogoscoped.com/forum/101921.html

  5. Going to bed now, Harith. 🙂 Thanks for the reminder.

  6. I would go for language and country specification. We have seen many French results and sites ending up in Swedish Google.se for specific terms. The same issue applies to the .co.uk TLD, where webmasters need a French audience to read their sites because they have French content.

    But I think this will remain a problem, since it is hard to define the searcher’s needs when he uses a generic keyword. Would Google answer him with English pages, or with the local version of that generic term based on his IP in the SERPs?

  7. Good night and I do agree with these issues.
    Tell Google the correct country or language for a site

  8. Howzabout webmaster tools giving us a “Google Site Map Creator”?

    Some of us technophobes can’t even understand the instructions for DIY, let alone do it, and the list of ‘third party’ sitemap generators includes some that are often not available (or swamped), some that add questionable extra stuff to your site, and some that produce wildly varying quality.

    I created one for a site and it listed fewer than a third of the pages. A second attempt with the same tool included all – but the code was broken. A third attempt got three quarters, so I put up with that at the time.

    It’s your sitemap protocol; surely a reliable in-house tool would be a logical and sensible move?

  9. Great post Matt – and these are some very cool tools to pull into webmaster central, if I could vote for all of them I would!

    The ones I’d most like to see however are:

    Tell Google the correct country or language for a site
    Tell Google a parameter doesn’t matter
    A way to list supplemental result pages

    Pretty please 🙂

  10. As we have a number of people who host .coms (or other non-geographic domains) in Germany, a way of saying “actually, this site is for the English market” would be an absolute godsend.

  11. please add all of them 😉

  12. “Show causes of 404 errors” is about as close as I can get to what I really want, which is to have an option to have Googlebot send a referrer header. Or at least present a referrer in the data in webmaster tools when the status code is anything other than 200.

  13. I wish I could have voted for more than one 🙁

  14. Matt, could we get all these features under one roof?

    It may be easy for you to remember which pages/URLs all these different features live at in/out of Google, but it’s hard for me (and others?).

    Thanks

  15. I went for the ‘show my broken links’ option as I’ve lost count how many times the console has told me there’s one somewhere on the site, and I’ve then had to spend forever hunting around to find the blasted thing. Seems an easy fix to me – if the console can tell us where they’re pointing to, it can’t be too big a job to also tell us where they’re pointing *from*?

    Lots of other good stuff on the list, but one I DON’T want to see implemented is page rank numbers instead of low/medium/high – why feed the obsession? This is one case where a general indication is more use than a specific, IMHO.

  16. Thanks Matt for asking + putting it into a survey which is much easier to scan than a very long thread.

    One hint for future questionnaires: it would be very nice (and easy to compute as well) to have 100 points that you can spread across more than one item. Now you have to give 100% to a single item; it would be better to be able to split that vote between several items. Maybe this makes sense.

  17. These are the more important/nice ones:

    # Tell Google the correct country or language for a site
    # Show links on your site that are broken
    # More information about penalties or other scoring issues
    # Show pages that don’t validate
    # Tell Google a parameter doesn’t matter
    # A way to list supplemental result pages

    Also, can I state, “Option to “disavow” backlinks from or to a site” – whaa? That sounds like you’re getting rather too close for comfort to Google telling people that sites linking to you can mess up your rankings 😐

  18. Robert Schroeder

    You simply MUST give more information about penalties in particular the 950 penalty. Why you continuously avoid the question is troubling to say the least.

  19. With my four websites that depend on Google traffic all penalized and without Google traffic for three months, with no one except Google knowing why, it is not hard to choose.

    # More information about penalties or other scoring issues

    It is extremely frustrating to have your websites gutted by Google without knowing why.

  20. How about the improvement of spam removal?

    I have reported a website that uses black hat techniques including the spamming of google local results many, many, many times. Yet the site is still being listed on Google and is still receiving visitors by deceptive means.

    This has led me to believe that Google is just putting on a front, claiming to fight spam and occasionally taking high-profile action (BMW) to keep up the facade of spam fighting. I would love to hear your response to this.

  21. I wasn’t sure whether it was really covered by the options:
    I’d like to know where my 404 errors are coming from – who’s generating them, I mean. I’ve tried searching to track down why some pages that don’t exist are in Google’s index, to decide if it’s worth redirecting them somewhere, but there isn’t an easy way to find that information. Then, failing that, a way to say “ignore this link” – especially for links that I can’t redirect because they never existed.

  22. First time commenter so be easy on me!
    How about a reason why a certain page is in the supplemental index,
    e.g. no links / lack of content / spam

  23. Faster updates of the data would be great. It takes between 2 days and 2 weeks for some of the information to work its way through the tool – especially the crawl rate section.

  24. I like the console a lot, and all of your suggestions sound great, but here’s what I’d like the most: the ability to type in a search phrase dynamically and have the console tell me how high my site in general or a specific page ranks in the results for that phrase at that moment. You could cut it off at 100 or so, to keep from thrashing the database.

  25. More information about penalties would, for me, be the most welcome. The reason: most site owners try their best to create sites that are clean and within Google’s guidelines. The ones that do not know who they are. However, oftentimes people see strange results without a good reason. If people knew more about why, this would not only help the site owners with good intentions, but also google.com, because people could give more informed feedback about the search results.

  26. All of them of course 🙂

    I was torn between “Tell Google the correct country or language for a site” and “More information about penalties or other scoring issues”.

    Depending on the definition of site, the ability to tell Google the correct country or language could be a huge improvement, both for site owners and users running local searches. Ideally, this would be on a page level and would allow for multiple countries, for example a German shop might serve customers in Austria, Germany and Switzerland. Limiting local searches to the country domain excludes some valid results.

    More, or rather more precise, information about why Google doesn’t like a site would be helpful too (see my rant at http://www.mattcutts.com/blog/comments-on-our-webmaster-guidelines/#comment-107773 also).

  27. Can the displayed backlinks and pagerank be more recent?

  28. Hi Matt
    This is a wonderful post and I voted, but I wish I could have voted for more than one.

    Thanks
    Deb

  29. Hi Matt,

    As I could only vote for one, I have included my list below:

    I would love to see the following,

    * Tools for detecting or reporting duplicate content
    * Show links on your site that are broken
    * Option to “disavow” backlinks from or to a site

    Thanks

  30. (Wouldn’t it make sense to shuffle the poll items, so there won’t be a skew in some way connected to the fixed list order?)

    > Option to “disavow” backlinks from or to a site

    Interesting. So that’s in case someone tries googlebowling you? Well, I still hope for Google to do this defense automatically, I don’t want to spend too much time configuring all these things (especially because there’s several search engines out there, so it’s better to have only a single open configuration for all of them — e.g. robots.txt, Sitemaps).

    From the list, I really like “Show pages that don’t validate”, but as we can only select one, I’m going for “Ability to show/download all pages from a site”. This sounds really interesting.

  31. When viewing backlinks, it would be great to have them grouped by domain first and then be able to break this down to see specific pages that are linking.

    For example:
    http://www.domain1.com (10 links)
    http://www.domain2.com (7 links)

    Then clicking the “www.domain1.com” would give me:
    http://www.domain1.com/page1 (last accessed xxx) (page rank: x)
    http://www.domain1.com/page2 (last accessed xxx) (page rank: x)
    http://www.domain1.com/page3 (last accessed xxx) (page rank: x)
    http://www.domain1.com/page4 (last accessed xxx) (page rank: x)

    All the other suggestions you mention are great. Please implement them all!

    Harry

  32. One thing I would like to add: if any of the options mentioned are considered, it would be nice if those which aren’t available any other way were given higher priority.

    Ones I think of as possibly being lower priority, and why:
    “Show causes of 404 errors” and “Show links on your site that are broken” – the webmaster can just use a site link checker, e.g. Xenu, instead. If it is not a broken link from your site, who cares. If it is a backlink, checking one’s backlinks would tell one what one needed to know.

    “Tool to help move from one domain to a new domain” – Since server-side actions are needed to do it right anyway, about the best Google could do would be to provide a tool to tell the bots to wait because a transition is taking place. Google can’t change .htaccess files or IIS configurations or all the backlinks from other sites, and can’t change people’s bookmarks, so why provide a tool that would more than likely get used wrongly in the first place.

    “Ability to show/download all pages from a site (e.g. if your server crashed)” – Unless Google has ALL pages somewhere, even if not in the Main or Supplemental index, this is of dubious value anyway. Other than that, anyone who doesn’t have backups deserves what they get or don’t get as the case may be. 😉

    “Show pages that don’t validate” – D’Oh! Firefox plugin HTML Tidy or better yet, the W3C online validation. If Google ranked based on validation, MAYBE it would be useful but the day Google does that is the day the Internet goes dark. :-()

    “A way to list supplemental result pages” – That’s easy, all the pages that aren’t in the Main index are either Supplemental or no-mental. 😉

    “Integrate “Add URL” feature” – Or better yet, get rid of it altogether.

    “Fetch a page as Googlebot to verify correct behavior” – This is actually the opposite of others I have listed in this comment. This would be HUGELY useful in the Webmaster Tools help forum. After the number of arguments about a tag not being closed “here” properly for X/HTML or an improperly nested tag “there” making the page un-seeable to Google, and even considering a number of experiments proving the opposite, having a tool like this would save a LOT of time!

    Craig

  33. There are HTML tags that you can already use to state the language and country for a page. I would prefer to use those on a per-page basis, knowing they will be read and actioned by Google, rather than having to log in somewhere at multiple search engines and then re-state what I have already stated on the site itself.

    Something I would like to see, I discussed with both you and Adam separately at SES some time ago. For my validated email address in WMT, I would like to see a list of all of the sites that have included my email address as a mailto: link, so that I can contact all those sites and get them to protect the email address from being spidered by spammers. A normal Google search will already show me sites that have my email address as anchor text, but not those where it is only inside a mailto link.

  34. # Score the crawlability or accessibility of pages

    For a good number of webmasters who wouldn’t be aware of the importance of good internal linking, this would be nice.

    # More information about penalties or other scoring issues

    Even if it opens the door to a bit of reverse engineering, which may affect the quality of results over time, learning more about the penalties and filtering effects would be a good thing in my opinion.

    But definitely, what would be very nice would be to have fresher data than the current refresh rate grants us. Having links updated on a monthly basis is not quick enough… Let’s have it weekly, please 🙂

    Anyway, thanks for asking us!

    Denis

  35. Asking webmasters about new features is really a good thing but there are still several things that could and should be improved first:

    1. I miss a validator for Sitemaps; I have a sitemapindex file that always shows up with errors and I can’t figure out why.

    2. I miss a way to submit bugs; when I log out I’m always redirected to https://www.google.com:80/webmasters/tools – and that target does not exist.

  36. I voted for “Score the crawlability or accessibility of pages”, but would like to equally back the following, as I think they are all important:

    * Tool to help move from one domain to a new domain
    * Diagnostic wizard for common site problems
    * Show pages that don’t validate

    I’d also like to see the new message center thing offered as a web feed, so that I don’t have to log in to check messages, but rather see it pop up in my feed reader, which I am more than likely to see much sooner, even if it’s just a simple “Google has a message for you, log in to see it”. Something that doesn’t require authentication would be great.

    I’d also like to be able to tell Google about planned downtime, so that if on the slim chance Google comes a’crawling when I’m moving to a new host / server, etc, it’ll know to perhaps try in 2 days, rather than be met with a host of 404’s or a site that’s full of PHP errors.

    Oh, there’s an idea, if when crawling Google comes across one of those ugly PHP errors, it could be reported in the message center.

    One final one, which is perhaps a large task: I remember a while ago I think PHPBB was targeted because it listed its version number at the bottom of each page and had known file names, and bots could just scan sites for such things and only attack the ones that listed an outdated and known-insecure version. Perhaps Google could add some sort of basic notification for popular scripts / services such as this and alert the site owner in the message center? For example, if the above were to occur, and Google scanned the text “This site is powered by PHPBB 1.2.3.4”, and Google knew that this was a version that could be hacked because of a known vulnerability, a message could appear to alert the site owner. While this of course is impossible for all services available, perhaps the well-known and widely used ones could be included?

  37. I’d like a command like the “site:” command that used to tell you how many pages of your site Google had indexed.

  38. It’s funny to see how different sites will result in different requirements; I think the size of the site and the type of users and traffic really change things for you.
    Frankly, I don’t care about the more basic stuff; it would take hours or days to download all our pages (and days or weeks to regenerate the databases from them…) from a backup, and I’d be highly surprised if we got penalized by Google. So I don’t need those things, although I can understand why they’d be added.

    But in contrast to Craig:
    Since we have a big site with close to forty thousand articles written by paid authors and a forum with almost 20 million comments, I would really like to know which of our articles link to (now) non-existent content or (now) questionable sites and which user comments link to questionable content.
    For instance, many sites we linked to years ago have disappeared and their domains are taken by link farms or are just gone now.

    Using a url-checker isn’t really feasible with a site of our size. But google already crawls our and other sites… so knowing where 404’s are coming from, knowing which pages don’t validate and finding broken links on our site would be really useful for us.

  39. Where’s the “all of them” option?

    All of the options would be helpful on some level or another.

  40. I voted for “Tell Google the correct country or language for a site”. This would be especially useful for those of us working on English sites for use in different countries such as the USA, UK, Canada, Australia, etc.

    I agree with g1smd that using existing HTML tags would probably be a better option.

  41. Would be nice to be able to send in spam reports and get some information on the status of the report.

  42. These are my special little hobby horses:

    A tool for reporting our *own* paid links. And a way of notifying us if we are being penalized for using them. Lots to discuss on this, but my little dictionary site couldn’t stay online without revenue from Text Link Ads, which return about double what Google Ads do every month. But if I found that I was definitely being punished for them, then I’d desperately look around for something else. The paid links are clearly marked as “sponsored” but I have no way of adding “nofollow” to the Text Link Ads code.

    Under “tools for detecting or reporting duplicate content,” we need a way to grade the sites shown in the reports: permitted, not permitted, indifferent. Too many bad actors rip off RSS feeds on splogs.

    A tool for reporting errors in bibliographic information in Google Print/Books. As a researcher, this one is hugely important to me. This would be particularly sweet if Google Print would recognize my Google cookie and then enable a special flag or something I could click.

    A place to submit URLs of sites that we believe are content-wise like our own; that is, to associate ourselves consciously with our *preferred* cohort. This is a little different from “disavow backlinks,” because it’s a carrot action rather than a stick. Lots of trust issues come into play here.

    A way to grade backlinks. Disavowing is fine, but I’d like to rate *why* the sites are being disavowed: spam site, adult site, etc.

  43. In telling Google the correct country for a site, it would be good if a) you could pull this information from an AdWords campaign we are running under the same login and b) if we can choose more than one country, a whole continent, or even metropolitan areas.

  44. While I voted for, and have long been a proponent of, the idea of being able to pick a country of origin for a site, I’d like to add a small qualifier: only those sites with a non-country TLD should be able to do so, to avoid the possibility of Bob claiming his site is UK-based with a .ca TLD on a Norwegian server and gaining an unfair advantage, however slight (that’s a strictly hypothetical example, for those who may get upset about it.)

    Other than that, there’s nothing there on that list that I’d love to see, and a few things I personally wouldn’t want to see (PageRank checking, rank checking, server issues, etc.) Many of these things lead to confusion and overstated importance of issues.

  45. Matt, Google is doing good and I really liked your poll…

    Can’t you provide us an option so that we can vote for more than 1 option….

    As there were a few on which we all are interested….

    But any way we really want to see all of these on our Web.Central….

    Can’t we have some kind of trend report….

  46. Two issues not raised…

    Some way to find out if a site has been banned from either Google Search or Adsense before it is purchased. It’s become such a crap shoot to purchase a domain that may have been used before. Don’t see why domains that have expired and are banned couldn’t be listed somewhere.

    While I like the idea of offering me the corrected spelling of a search term, i.e. “Did you mean?”, I do not think that should ever be used in conjunction with domains. If I search for one of my own domains I get a “Did you mean” for another site – the only things that are even close to matching are the first 3 letters (pet), the last (s), and that they are both 13-letter dot-coms. Both sites are about completely different things and both are far from being “authority sites”.

    Download all the pages from a site? Wouldn’t that make it even easier for the scrapers?

  47. Hi Matt,

    I think the duplicate content one is the most vital. On large, dynamically generated sites it’s almost impossible to try and track down which pages are duplicates and why. It would be great as well to have a tool to tell you how much of your page is useful to Google, i.e. how much of the page Google actually regards as usable content, excluding code, headers, footers, etc.

    Also, I think a supplemental section would be extremely useful. An area where you can see what pages are supplementalised and why, and which pages for a particular search are more relevant – why they are not supplemental and mine are.

    Thanks

  48. I just thought of something that isn’t on this list: more tools for reporting spam manipulation attempts where a query isn’t generated. For example, I’ve got an email in my Hotmail account from a “major name” in Internet marketing promoting a new scam marketing opportunity specifically geared toward Google, but there’s no query for it…it would be a lot easier if there were a spot to just forward this stuff on, if at all possible.

  49. Matt, before I vote, could you please clarify:

    Option to “disavow” backlinks from or to a site

    Thanks! 🙂

    -Michael

  50. Oh, Matt, and another idea… you already show some sort of ranking, as long as you rank ok and someone has already searched on the phrase.

    What about extending that to show ranking history. I mean, does Google save that data? Isn’t that something that could be shown retroactively, with nice pretty graphs and numbers? 😀

    -Michael

  51. The things that have cropped up that are most important to me this year are:

    – The ability to tell Google which domain should be indexed when multiple domains point to the same site (right now I’m handling it with PHP and multiple robots.txt files)

    – SOME kind of facility to make it easy to move a site from one domain to another, or to combine domains (I’ve had three clients bought out by other companies this year alone)

    – The ability to disavow backlinks would be VERY good. I have a lot of crap pointing to my sites.

    – I would like to be able to list all my sites (currently at 67 but will be adding more) on the dashboard, instead of having to jump around from page to page, which takes a long time to load.

    – In line with the above, the javascript Sites button on the right does not work properly if you have any number of sites in your account – you can’t scroll, because the minute you take your mouse off it, it goes away.

    – I would like all the stuff that is supposed to work in the site: and link: commands to actually work properly in GWT.

    – I would love to be able to generate nice looking PDF reports for my clients out of the console, a la Analytics.

    – Keep everything updated on a reasonably consistent schedule, and if there ARE delays, send us a message. For a while there, I went for months without a backlink update, a crawl stats update, or a query stats update. In fact, it would be even cooler if the date of last update could be posted somewhere on the page.

    – If we’re stuck with High, Medium, etc for pagerank, at least provide a frame of reference – I have some PR4 urls that are considered Medium, and others that are considered Low, and other weirdnesses.

    – It’s nice to see when Google last crawled the home page, but usually I’d like to know when Google last looked at some OTHER page on the site.

    That’s for starters – I have more.

  52. The possibility to choose the canonical domain when a website is under two or more different domains (similar to the with-www/without-www setting).

  53. the three things I want Google to do aren’t really technology based.

    (1) sort out your process for resolving infringements of copyright or terms of use. I got a pleasant and helpful response from Technorati within a day. I’m still waiting for Google to respond.

    (2) provide a contact service on your daughter sites that does not require me to (a) sign up or (b) search for 15 minutes to find it. Yes this means you will need some people sitting around in a service centre waiting to help customers. Deal with it.

    (3) kill the spammers. Don’t let people hijack your addresses to send out spam. Spam is killing networks, and so much of it is fraud that it is damaging the reputation of other companies. How long before one of those companies sues you as being the vehicle for spam? Fix it. Now.

    PS: I’d be fine with paying for blogger/gmail but frankly you’ve got the cash to resolve all of these.

  54. Maybe just the items that help a site actually “improve” as far as errors and the like, but why give spammers more tools that tell them exactly what is wrong? I’m with Adam on the idea of not putting stuff in there that is truly not important and could easily be done by looking at one’s own stats.

    PR, rank stuff, etc? I don’t think so.

    Michael; Don’t you already offer free rank check tools and many other kinds of such tools? I think many out there offer the same types of silly tools.

  55. You report 404 errors. I can easily find broken links on my own site, but I am left guessing if there’s a broken external link to my site from another site. That information isn’t secret, so why don’t you tell me the page that has the broken link so I can contact that webmaster and ask them to fix it?

  56. Why on Earth did you include “Show PageRank numbers”? That’s just a total waste of resources and the people who voted for it — while I am sure they are wonderful people — really need to learn to stop obsessing over TBPR.

  57. When will the possibility to vote end?

  58. Wow–almost 600 votes in about 8 hours. Thanks for the feedback, everybody. 🙂

    Michael VanDeMar, suppose you look at backlinks and see that some come from a spammy site or scraper. That option would let you say “Don’t count those links; I have nothing to do with that site.”

    Sourav Sharma, what sort of trend report are you looking for? Google Analytics is really strong at that.

    Deb, my WordPress plug-in only allows one vote per person, but in the future I’ll see if I can find something that would allow multiple votes per person.

  59. I’m surprised no one has mentioned this… But where is “Integrate with Google Account”?

    My google account already has my AdSense and AdWords, why wouldn’t there be a way to integrate the Webmaster console as well?

    Bonus: I won’t have to revalidate that the website in question is mine if my cookies go away… which means I can get rid of that file… which means I can get rid of the exclusion rule from robots.txt and from sitemap generator…

    Kevin
    http://technogeek.org/

  60. As many people said, there should be an option to select multiple items. Another feature I’d like to see is an integrated view of Webmaster Central, Google Analytics, and AdSense together.

  61. Hi Matt,

    I voted for the tool to tell Google the target country/language.

    3 reasons:

    1. This has to help Google, the User and the Webmaster
    2. Many still report problems or query issues with this stuff – .com hosted in UK aimed at UK but ranking better on Google.com – same issues not applying to .co.uk domains etc
    3. It is more about helping (ultimately) users rather than helping webmasters doing their job

    Sure – error reporting stuff is good too – but this one stuck out as by far the most important – by miles

  62. *** The ability to tell Google which domain should be indexed when multiple domains point to the same site (right now I’m handling it with PHP and multiple robots.txt files) ***

    You should be able to handle all of that with just a few lines of redirect code in your .htaccess file. You’ll need a 301 redirect. Google for “Mod_Rewrite” for more.

  63. Matt,

    I voted for the “some type of page ranking” option. I would like to see some type of future page rank from Google, this will help webmasters know if their marketing efforts are working and being recognized by Google. If I see Google has a positive future prediction then I know what I am doing is correct and will continue on the same path, if I see Google has a negative future prediction then I would assume what I am doing is not working or is wrong and can revamp my marketing efforts. Basically, it would be nice to know if I am going down the right path when marketing or if my marketing efforts are going unrecognized by Google, this will help webmasters to modify their marketing campaign to get the best possible results. I wouldn’t expect this to be something so solid that it helps everyone ranking, because I understand Google and search engines don’t like SEO, but it would be nice to know how well Google is receiving marketing efforts and if it is having a positive or negative effect. I would like to know how Google predicts my site and site ranking. thx

  64. *** Finding all broken links within a site ***

    Programs like Xenu Linksleuth can quickly analyse a whole site and produce a report of all broken links, time-out links, and all links that then pass through a redirect for both internal and outgoing external links. Those can then be fixed and the program run again to verify that the fix has worked.

  65. You should be able to handle all of that with just a few lines of redirect code in your .htaccess file. You’ll need a 301 redirect. Google for “Mod_Rewrite” for more.

    Thanks to you I know that already, and I can handle it in .htaccess – when I’m hosting the site(s) myself, or have enough access to be able to do that or advise the webmaster to do so. But sometimes, for reasons involving security, company politics, and just bureaucratic BS, I can’t use that method, or at least can’t get it done quickly enough. Company buyouts are a bitch.

  66. I’d like to add a way to see all outgoing links from our site and the timeline of when they were added.

    The reason being that I’d like to see if anyone hacks or spams the site, and seeing outgoing links could be a good way to monitor this from time to time.

    How about also posting these changes in the message centre so when we log in to a site, we can see what’s new or what has changed.

    300 links added
    4 Page Not Found Errors
    1 page disallowed by Robots.txt

    Could be nice to have all that is different from the last login showing in one console.

    Other than that they’re doing great but I’d also like to add my vote to ignoring links from one site so we can filter when a site is giving us a ton of links without going from page to page constantly.

  67. suppose you look at backlinks and see that some come from a spammy site or scraper. That option would let you say “Don’t count those links; I have nothing to do with that site.”

    Ok, that’s a teensy bit scary… you’re saying we need that? It’s something that will actually make a difference?

    Something that we should not only definitely have access to, but something that we should actively be watching for? And a clear advantage/necessity to signing up for GWC, and an edge over competitors not signed up for GWC, who can get hurt without knowing it, and who would have no recourse even if they did…?

    -Michael

  68. Since this is important enough that they might be doing it next, why not just flag the ones we should be concerned about, and allow us to have a Google Alert style email if it happens? That way we wouldn’t have to scan through all those links every day, looking for warning signs.

    Or maybe just put them on top of the other results…?

    -Michael

  69. You can already find all your outgoing links, and verify the status of the URL that they link to, using Xenu Linksleuth or similar.

    You can see if it returns 200, 301, 302, 401, 402, 403, 404, 500, or simply times out. It takes minutes to make these checks.

  70. You can already find all your outgoing links, and verify the status of the URL that they link to, using Xenu Linksleuth or similar.

    g1smd, he’s not talking about who you link out to, he’s talking about links coming in.

  71. Those all look great, but I like the information on the scoring the best. Also, the ability to find out when links are dead would be cool. We manage a lot of sites with thousands of pages and any help we get would be great.

  72. Edit – My bad, you were replying above me. 😀

    -Michael

  73. spiffy poll. i voted for the more info about duplicate content issues, but i just thought of this…

    when looking at the errors, like “Unreachable URLs”, i click the links to see if the page really errors. more times than not, the URLs in this list are not broken because i’ve already fixed the problem. if there was a “no longer a problem” link i could click that might help G schedule recrawls.

    you suggested “Fetch a page as Googlebot to verify correct behavior”. if you could put that on the error pages to verify that Googlebot won’t get an error, webmasters could quickly fix and dismiss site problems.

  74. Hi Matt,

    yes, a tool that gives me information WHY that one site of mine is not indexed deeper – despite this site having many really well-written pages, the main page ranked at 4, with a sitemap and everything I could imagine done right 😉 I am on my knees – please Google, tell me what – WHAT did I do to scorn you?

    That tool would be like my prayer being heard.

    Second in line would be to define the country and language of sites/pages because I have a dedicated server in Germany (best prices by far) and I don’t want this to be mistaken for a Germany only site.

    Merlin 😉

  75. Hi Matt,

    what is that plugin here in the comment submission asking me math questions? I want that, looking for something like that for a while.

    Cheers
    Merlin

  76. I have voted for “Ability to show/download all pages from a site (e.g. if your server crashed)” and I hope (when and/or if this becomes a feature) that it will also include the PHP/ASP/JS/CSS stuff in it, otherwise it will have no use; you could use the Google cache for this. So, if it does not include ALL my scripts, I would like to see a way to “spy” on the competition… just to have a little taste of what they have been up to and why Google seems to have a bias for them

  77. The authenticated Webmaster communication is awesome. I think penalty notifications are very important, and this is the best place to inform site owners of these penalties.

  78. Though I did vote, I think that before any further expansion, it would be great if the Webmaster Console proved to be a reliable and up-to-date source of the information it already “provides”.

    What I find there:

    – no messages even when you are evidently penalized (like, say, mattcutts.com/blog being dead last for the string “Matt Cutts”)

    – no updates on “Query stats” for 2-3 weeks even though it says “All data is averaged over the last 7 days”

    – an “External links” list so unstructured it borders on useless

    – contradicting information about “highest PageRank” (seeing a PR5 page of a site being shown as the highest, while having PR6 pages on the same site – one or another is obviously misleading)

    – Googlebot activity stats clearly out of sync with actual search database updates (the console says approx. 50% of the pages of a site are revisited every day on average; the SERP shows most of those – not stale, regularly changing – pages with a cache date of 2 weeks to 2 months). Or perhaps Googlebot does most of its visits solely for fun, not to actually update the cache?

    So for all the flood of love and congratulations on how great Google does in terms of webmaster communications, quite a couple of issues should be tackled, perhaps, for Webmaster Console to be of “Google quality”. Currently it certainly isn’t.

  79. “Option to “disavow” backlinks from or to a site” scares me a bit. Isn’t something like this going down the path that the old axiom that ‘external sites cannot hurt you’ no longer is valid? Do we really want to replace the paid links economy with the paid-competitor-destroyer economy?

    “Show pages that don’t validate” seems to add credence to the idea that validation is required to achieve rankings, which has been officially disavowed in the GWHG FAQ pages.
    http://groups.google.com/group/Google_Webmaster_Help/web/faqs-for-crawling-indexing-and-ranking-2

    As far as new features; before adding spinners and a new sound system to this car, I’d try to get the engine in it first. The statistics that are currently shown are often out of date. New sites get stuck in a seemingly eternal “no data is available at this time” place. It appears that the statistics are tied into popularity of the site, which is counter-intuitive to me. Does wiki log into their webmaster tools account every day to see how they are doing in the #1 spot for all their terms? I doubt it. It’s the site that’s struggling to find their place in this web world that needs the helpful information that the tools provide. I’ve been a proponent of sitemaps since inception but now I rarely even bother with it for new sites as I know it’s just not worth the effort to ftp a simple file to the root for statistics that just won’t be there.

    The word “Pending” has got to be the most frustrating piece of feedback I’ve ever seen. It’s just so open-ended. I wish that could be tightened up a bit with some sort of estimate (years, weeks, days, hours), anything. Even the cable company lets me know how long I’ll have to wait on hold to talk to a human; I’d like to see the same when I am sweating out a URL removal.

    I didn’t intend this to be a rip on webmaster tools, unfortunately it kind of sounds like that right now, I’m sorry. I love what you’ve all done so far, I just wish it would be improved a bit before expending energies on new projects.

  80. I’ve voted for more information on penalties or other scoring issues – if you remember back a bit, I used to complain here about the lack of information on scoring. I started my site in 2004, hoping by hard work and frequent updating to become an established commentator on a particular type of business strategy. On Yahoo, I’m number 34 for the subject matter, which is about what I’d expect. At this stage I can say that I am reasonably well known in my area of interest.

    But this has come despite, rather than because of, Google web search, where I’m anywhere from the 200s down. I get ten times as many referrals from Google Image Search as from ordinary web search (I signed up for enhanced image search in the console). I do well on Google blog search. Also on Technorati and others.

    The most annoying thing is how I see big media sites break so many of Google’s guidelines (massive amounts of on-page advertising, duplicate content repeated word for word from Reuters or other primary source, non-relevant links, cross-linking etc.) and there’s no talk of them being penalised. It seems to be very much a case of one law for the big guy and another for the little guy.

    Do I want more information? Oh, yes. From day one I always seemed to be crawling up a steep slippery slope, from the trials of the sandbox to dropping hundreds of places for no obvious reason. After all this time, no one can tell me the site homepage isn’t filtered downwards for the main phrase.

    So anything that can be done to provide more information through the console is welcome. I’m convinced that sites and pages have a sort of credit history in Google’s data. I also believe that this history can be wrong. So a historical record of the status of a page would also be a very good idea, with an opportunity for webmasters to put right mistakes in some way.

    I’ll go away and leave you alone for a year now!

  81. An up or down arrow to show the last move of my Query stats. This would be a quick visual on how the site is doing (going generally up or down) and which queries are slipping and need some attention.

  82. Great post Matt. Google is impressive by continuing to improve upon the better mousetrap they have already built.

    Since I recently came out of a 60 day penalty I obviously voted on “More information about penalties or other scoring issues”. I would have corrected the problem immediately if I had received an email from Google. If possible, a clear explanation of how to correct would be nice. Although, that might be impossible with the massive number of sites out there.

  83. I definitely vote for all of the above.

  84. I’d love to see the idea I just came up with. It would be incredibly difficult to pull off, given the manipulability by blackhats, but if anyone can pull it off, Big G can.

    I’ve seen a large number of automated form spam requests made as of late, all of which contain both HTML and vB-encoded hyperlinks (with requisite SEO-based anchor text). What I’d like to be able to do is to send an XML file of those requests containing the spammy information over to you guys, which you’d then be able to use to determine the legit ones from the fakes and punish (or not punish) accordingly.

    As someone who gets at least 40-50 contact form spams a day, I’d love to be able to report it via an automated means.

  85. Voted for Dup Content, though the second vote would be for Penalties…

  86. Dave (original)

    Good stuff Matt, thanks, I voted for a broken link checker.

    However, I’m concerned that Google is feeding the link and PR frenzy that already exists. Most of the feature voting options had something to do with PR, links, bans and penalties.

    I do hope Google is not falling into the trap of giving the squeakiest wheels the most oil?

  87. I’ve been thinking it for a while and Peden202 said it… I just checked: M.I.T. is ranking for a Buy Viagra search http://cef.mit.edu/other/gen2/?Buy-N-Viagra …C’mon already… Competitors spamming all over the place blatantly…trying to play fair… Adding more webmaster tools features versus handling the meat-and-potatoes issue of the volume of spam does not seem the right choice.

    Please act on spam reports. Please communicate with those filing spam reports. SERP quality is going downhill… My advice used to be if you want to learn about anything go to Google. Nowadays my advice is go to Wikipedia…then maybe you can go to a search engine…

  88. Dave (original)

    Michael VanDeMar, suppose you look at backlinks and see that some come from a spammy site or scraper. That option would let you say “Don’t count those links; I have nothing to do with that site.”

    I don’t get where you are coming from with that reason Matt. Surely Google would ONLY see a connection to the said site IF the site is linking back to the “spammy site or scraper”?

  89. I voted for “More information about penalties or other scoring issues” but the real issue is the lack of communication when Google decides something is wrong with a website. I lost 80% of my Google organic search traffic on July 12 after years of steadily increasing traffic from Google organic search and I don’t have a clue as to what has caused Google to suddenly downgrade the site. The Adsense staff, who does respond to e-mail, were sympathetic but they can’t help me and there is no one to contact at Google natural search. So here I sit in the dark, frustrated, wondering what I’m supposed to do next . . .

    I understand why Google may not wish to communicate with blatant abusers of its system, but what about sites that have been around as long as Google and have a clean history?

  90. Hey Matt,

    How are you thinking of these features…. really cool, I have voted my opinion

    Cheers,
    Harini

  91. Hi Matt,

    I’d love to see all of the above options, but particularly the first, “More information about penalties or other scoring issues”. This is always a source of confusion for webmasters when they think they are doing white hat SEO and sometimes there could be some penalty that is out of their control.

    Another one I’d add to the list is a time value on the crawl reports. When looking into network unreachable or gateway 504 issues in an organisation, it would be useful to have a timestamp with the date so we can try to pinpoint problems. With world timezones this is difficult, but maybe an option to convert from, say, GMT (maybe we need Google time 😉 ) to the webmaster’s local time in their profile?

    Regards,
    Rich.

  92. Matt,

    I feel the below is the most important thing for a website to make it perfect; of course I accept that the first option is also required.

    “Show pages that don’t validate”

    Can you please change the voting system, so that one can select more than one option?

    Thanks, Pratheep

  93. I voted for “Some type of rank checking”; it’s more important to me to check my rankings on Google than on the others.

  94. # Tell Google the correct country or language for a site

    Languages – hell, no!! There are on-page ways to say which language it is. Google must not create a gray area for this.

    Countries – there should be on-page ways for this as well. Can you imagine going to 5 search engines to say which country a site is meant for, instead of saying it on-page.

    You can give warnings for these being missing, though!

    # Show links on your site that are broken

    This is what I voted for. I’ve recently stumbled upon several maintained sites that have broken links on them (usually to third party sites).

    # Score the crawlability or accessibility of pages

    Thumbs up.

    Definitely display these as warnings:
    – All the weird things and bad practices your indexer has to compensate for should be included as warnings.
    – Dynamic, session ID URLs, infinite loops as warnings.

    # Tool to help move from one domain to a new domain

    This is what HTTP 301 was invented for. Or use on-page/on-site redirection methods to handle this for sites that cannot produce headers. robots.txt perhaps.

    # More information about penalties or other scoring issues

    This would be nice, although I’m sure it’s tricky for you to do so.

    # Tools for detecting or reporting duplicate content

    Giving hints to site owners about which pages are considered duplicate would be of use. It will make the benefit of creating original content more visible.

    Reporting pages to Google on which pages are duplicate (if that’s what you mean) is of little use. You cannot police the web on content stealing (though there have been several occasions I’ve wished you could), and duplicate content within a site should be strictly “noindex” by sites.

    # Tell Google a parameter doesn’t matter

    It *must* not be in Webmaster Central. If added, it *must* be in robots.txt.

    # Diagnostic wizard for common site problems

    Yes.

    # Some type of rank checking

    Supercool, but sounds like quite a big thing to get done.

    # Show causes of 404 errors

    Ok.

    # Option to “disavow” backlinks from or to a site

    I agree with Ian and Philipp above. What’s up?

    # Show pages that don’t validate

    Yes!

    # More documentation and examples

    Yes.

    Other issues:
    # Ability to show/download all pages from a site (e.g. if your server crashed)

    Backups are the responsibility of content owners.

    # Fetch a page as Googlebot to verify correct behavior

    The other items on the list should be able to take care of this.

    # A way to list supplemental result pages

    Don’t care for this.

    # Integrate “Add URL” feature

    Perhaps.

    # Show PageRank numbers instead of none/low/medium/high

    Hmmm. Maybe.

  95. Yes Matt, our Analytics has that…. But can we have something like, say, a graph to highlight the progress of our keywords, just in percentage or maybe with a red or green bar? See, the point is, for related stats we need to go to the Analytics account, so this is just for the sake of building a bridge between your 2 services…

    Again, here you are showing the average positions, but in GA we don’t have that feature…. so I think there is surely something we can do here to make things easier for us to manage….

    how about that….a common dashboard for both the application, where we can keep our favs report….and can also combine 2 reports to get some kind of graph…..

    These are my wild thoughts – may be you can get something from here to make things more savvy for all webmasters….

    What do you think…????

    Sourav

  96. So far 1018 votes but only 37 voted on:

    * A way to list supplemental result pages

    There must be something wrong with the poll software.

    C’mon. Give the above suggestion some love, please 🙂

  97. I think a good function to have is a form for webmasters to tell Googlebot not to crawl the site when it is scheduled to go down due to server maintenance or something else. This not only results in better efficiency for crawling activities, but can help to relieve some unnecessary anxiety on the part of webmasters whenever they go through this process. (I think telling Googlebot not to come when the site is not ready is better than relying on the 503 status code to tell it to come back again later (when??).)

    I’m sure Googlebot will appreciate the reprieve from having to come knocking on a door again and again when it should know the owner is not home. 🙂

  98. Hey Good post,

    Can we not just do all of the above, it will keep the Team busy 🙂

    Shane

  99. Hmm… Way to know if your site is under penalty seems to be a clear winner – and I’m sure it’s something the webmaster tools folks would dearly love to implement – hell, I even voted for that one.

    What I’ve often wondered is HOW you would do that without actually giving spammers the heads up about exactly just how far they can go before getting penalised.

    Confirmation of a penalty WITH specific reasons for that penalty could be quite instructional if you have 5000 separate sites testing variations of the latest SE indexing exploit.

    Confirmation of a penalty WITHOUT specific reasons for that penalty equals frustration / distress for the good guys who can’t work out why they might have a penalty.

    I think everyone wants to avoid penalty fratricide where possible but I just can’t for the life of me think of a good, equitable solution that doesn’t potentially lead to a spammier index.

    doc

  100. Hello Matt.
    I’m Lino Uruñuela from Spain.
    In Europe we have many difficulties getting our sites to show up in the local search results.
    In order for a domain to appear in Google’s local version, we must fulfill one of the following requirements:
    - a local IP
    - a local domain (.es, .de, etc.)
    - links from local domains

    This is an error. For a while now we have seen that these factors are very important, and I think this is not the best way.
    The IP should not have any importance, since the hosting companies here are very disappointing and we contract hosting in the USA.
    And it should not be a determining factor. The same goes for the .es, .de, or .it domains.

    There will be some solution to these problems?

    Because in this survey, the majority that answers is from USA and they do not have these problems.
    Thanks, Lino Uruñuela.

  101. Another thing: why not make something like blog pings to establish who wrote the content first, and that way settle in advance any doubt about which copy is the duplicate?

    It’s very easy 🙂

  102. Too bad you can’t cast 2 votes; for me, scoring issues and content duplication are equally important, but for the moment I’ve voted for the first one 🙂

  103. Robert Schroeder

    Matt,

    As you can see “More information about penalties or other scoring issues” was a clear winner.

    The 950 penalty created a lot of collateral damage.
    Reconsideration requests go unanswered.
    No messages in the “Message Center”.

    Communicate, PLEASE!

  104. I don’t know if it has been said before in this post (too many comments to read), but I would like to see a proper breakdown of Google’s monthly results so we can see the most popular search terms.

    I don’t know if there is a tool that does it already but it would be extremely useful.

  105. My suggestion would be a better keyword suggestion tool. Overture kicks ass for this, and for a very simple reason: they give a number as a measure of searches. Google’s little graph things are too inaccurate when search volumes are low, and they don’t give you enough indication of which keywords you should be concentrating on.
    Personally I only use adwords, but I do my research in Overture, which I think is a bit sad.

  106. Dave (original)

    I say leave the spammers in the dark and tell them zip, then point them to the Guidelines.

    Come on Google (Matt), don’t cater to the spammers, cater to the majority who don’t spam.

  107. Hi Matt,

    I’m with Lino on the country-local issue. My vote is for “Tell Google the correct country or language for a site”.

    I found it particularly difficult to obtain good rankings when selecting the filter “pages in {country}”.

    I would still love to have ALL features though. 😉

    Thanks

  108. Robert Schroeder

    Come-on! Dave (original),

    “I say leave the spammers in the dark and tell them zip, then point them to the Guidelines.”

    There are a lot of non-spammers with the 950 penalty – believe me!

  109. There is totally no way Google or ANY search engine should give a heads up to sites about penalties, etc. That is totally crazy stuff Matt. Sorry. If you all do that, your index will be totally overrun by those out there who would love to know exactly how far they can go before they trip the spam filter at any given time. It’s lunacy to think Google would do this.

    dockarl summed things up nicely, and I’ll repost it just so you all see it again, and then again:

    “What I’ve often wondered is HOW you would do that without actually giving spammers the heads up about exactly just how far they can go before getting penalised.

    Confirmation of a penalty WITH specific reasons for that penalty could be quite instructional if you have 5000 separate sites testing variations of the latest SE indexing exploit.

    Confirmation of a penalty WITHOUT specific reasons for that penalty equals frustration / distress for the good guys who can’t work out why they might have a penalty.

    I think everyone wants to avoid penalty fratricide where possible but I just can’t for the life of me think of a good, equitable solution that doesn’t potentially lead to a spammier index.”

  110. Matt,

    Just wanted to say that I really love your blog – you have provided incredibly useful information and insights and I thank you 🙂 I have been reading for a long time and just thought I would say hello.

    I would like some more information on link value and penalties, some kind of scoring breakdown, although I know that is a lot to ask! But those are the things that remain somewhat elusive. Links are interesting; for instance, webmasters who are attaining top ten rankings by simply pumping out a lot of template sites and domains and linking them all to each other. Yes they are contextually relevant, but not valid, useful sites. I know I am only seeing a tiny piece of the linking enigma, but that is the one thing that I would really like to learn more about.

    Duplicate content, relevant content, site structure – all of that stuff has always made sense. Building with best practices always pays off in the long run. However, I would be interested to know more about penalties, ranking info, etc., especially as you continue to modify how you score/track links.

    Anyway, love Google analytics and love webmaster tools! Thanks!

  111. Robert Schroeder

    “There is totally no way Google or ANY search engine should give a heads up to sites of penalities, etc.”

    Fine, then don’t penalize sites for no reason.

    I repeat – There are a lot of non-spammers with the 950 penalty – believe me!

    Something has got to be done!

  112. Robert,

    dockarl hit it on the head and answered your comment before you made it. The problem with your statement is that, in at least 99.9% of cases, someone has done something silly to trigger what they believe to be a penalty (usually with some random number based on a small sampling of other sites that are affected by the mysterious non-existent penalty, such as -30 or -950.) Google can’t reveal every reason, nor should they…many of the reasons sites get penalized are the same reasons sites would suffer if there were no search engines involved (link exchanges, spammy promotional techniques, etc.) Chances are that if you’re complaining, you’re doing something wrong whether you realize it or not.

    You’re also forgetting that, if those penalties did exist, they’re not all bad. In order for a site to move down in the rankings, at least one has to move up to take its place. So at most, your penalty problem negatively affects 50% of the sites for any given search term, and the odds are strongly in favour that it’s less than 50%. Why would Google cater to the minority in this instance?

    Right now, all you’re doing is making a generic statement about a penalty that doesn’t exist with no evidence to back it up and a plea for action to solve a non-issue. Why would Google or anyone else do anything about that? Would you, if someone complained about your business?

    Seriously, this -X penalty crap has got to stop. It’s old, it never existed, and it’s just an excuse designed to placate SEO wannabes rather than force people to take a hard look at their promotional tactics and realize what they’re doing wrong.

    So yeah….ixnay onway ethay oringscay, easeplay.

  113. Oops, sorry: where I wrote “Show pages that don’t validate” twice, one should have read “More information about penalties or other scoring issues”.

  114. Hmmm Robert; there are also many of us who have been developing sites with SEO in mind for over ten years now with “no” penalties whatsoever. You say for “no reason”…. I’m thinking there is always a reason, even if it’s a “mistake” by Google or something. The point is that giving specifics to a site would allow any SE spammer out there who researches to the ninth degree, with multiple “test” sites and domains, to do very well in Google by toeing that spam line/filter if they know “exactly” what caused the penalty.

    But anyway; penalties that are mistakes by Google are very, very few these days. The chance of a site getting penalized unfairly is getting less and less as we go. Build a great site with great content, and stay away from “link schemes” designed to game Google, and the site will be fine. Stay away from certain forums and places that promote these link schemes and you will be fine as well. Build the site for humans.

  115. oh, and what Adam said as well. 🙂

  116. As we have more and more clients on the webmaster console, I’d like to make one simple suggestion. On AdWords My Client Center, there is a simple drop down box that allows you to quickly find the client you wish to go to… It’s called “Jump to client” and it’s found in the upper left area of the MCC.

    Plus I voted for more info on penalties. Anything and everything you can give us would be valuable… also information we can share with clients, in simple English that they can understand, would be very much appreciated.

  117. About the last 5-6 comments.

    The thing with penalties and detecting fraud in general is that it’s a numbers game. Good stuff does get penalized sometimes. The trick is to keep the false positives to a minimum and learn to live with them (while trying to improve and get new information vectors whenever possible), and to apply a manual process for handling complaints.

    If the spammers and black hats are any good, they do a good job on their content, so it could not be distinguished from the rest, but take it to the limit to get an unfair edge over others.

    If the “baddies” get clear information (or for some things even just hints) about the critical parts of the detection algo, they will screw up the listings and make it unusable for everybody else. If Google were to communicate non-critical parts of the algo, though, then that would be useless information, as it does not contribute to anything anyway.

    Also – top positions are not and can not be taken for granted. The ugly truth.

    If there is money in it, there’s money to be made in manipulating it.

  118. Now this had an impressive number of responses…. and some insights…. how about adding the thumbs up thing like seomoz has…. you could work out who is the leader of the cutletts.

    how about adding a box with your top 5 posters

  119. Yes, all or most of them would be useful. Many of us know external tools to accomplish almost all of this, but many webmasters are not aware of them or don’t consider their results important in any way as far as being indexed in Google goes.

    If those tools and their results as applied to each site were part of Webmaster Tools as more than just mere recommendations hidden in Help Pages (which so few read and even fewer relate to their own sites), you’d go a long way towards reducing the number of threads in the Group asking for the arrogant “what’s wrong with Google” or the more humble “what’s wrong with my site”.

    I thus voted for a diagnostic wizard for most problems, hoping that “most” truly covers the most common ones and also the sneaky, yet common ones.

  120. hey matt,

    All of these are cool tools that every webmaster needs. I ask the team to work on getting everything done ASAP. 🙂

  121. How about a Webmaster Central’s API?
    A commercial API to Google Search would be nice too.

  122. yeah, and you Aussie could work out who is the leader of your inner circle of buddies, and do so at seomoz as well. Heck; I could name all the members of the inner circle. 🙂

    And yes; the thread is a good one. Giving spammers a heads up on anything is not so good however, so we have to speak out about it. Sorry if that is not okay with you. You should write about it at the seomoz blog….. oh, and also write about it at sphinn and even SPHINN IT . 🙂 You all would get a kick out of it.

  123. Robert Schroeder

    Well, there are people here with their heads in the sand.

    There are sites getting penalized for nothing, while sites that Should Be don’t.
    It’s time for a level playing field.

    Matt,

    If you want examples I got them.

  124. I’d love to see an option of Google search to only show pages that validate.

    That would remove 99% of the clutter, spam, and potential fuzz attacks to which I am exposed every day.

    Sites that validate are sites that care (or care enough to use tools that actually work). It’d be one of the biggest steps forward Google could offer in improving the quality of the Web.

  125. Yes Robert, I certainly agree with you on that one. 🙂 You should post the examples you have as that way not only Matt but others can help you figure it all out.

  126. Robert Schroeder

    I won’t post them on an open forum for obvious reasons.

    If you want to take a look – let me know how to forward the info.

  127. Matt’s not going to do that, Robert. He’s made that clear on here on more than one occasion. You can give up that line of thinking right now…it’ll save you some grief.

    There are no obvious reasons for not wanting to post a site that is allegedly penalized…and it usually means you’re trying to hide something.

    So show ’em.

  128. oh, and what Adam said as well.

    Damn right what I said, Doug. I’m da MAN. Don’t forget it, either. 😀

  129. Robert Schroeder

    “sites that Should Be don’t” Ever hear of the word competitors?

    I’m not trying to hide anything.

    It has been nothing but grief.

  130. I don’t know as I’d call it a penalty exactly, but I did experience the “950” phenomenon back at the end of May, with two of the most important pages on my main site. It put me in a panic, as my busiest season was just around the corner. Overnight, the primary search strings for those two pages, which had previously returned my site in the top five, were relegated to the back of the bus (whichever the last page was at the time of the search). Reversing the search terms left me on top, and even adding 2007 helped, but most people didn’t bother doing that. The other 400+ pages came up just fine in the top spots for their search strings (each page is a city name with a list of events). I ended up removing a couple instances of the words from the title tags and pages themselves, and that tipped me back up again; it took about ten days.

    Earlier in the year we experienced practically the same thing with a friend’s very small site devoted to a particular comic movie team. It disappeared for the main searches to the last available search page, while still coming up for some less popular searches. After a while, it popped back to #1 for everything, only this time we hadn’t done anything to it at all.

    Was that a penalty, or just an algorithm weirdness, or…? I have no idea. But the phenomenon does seem to exist, whatever you want to call it. There’s an endless thread on WMW about it.

  131. Robert Schroeder

    “I ended up removing a couple instances of the words from the title tags and pages themselves, and that tipped me back up again”

    Been there done that – no change!

  132. The duplicate content tool is a good idea.

    In my sector (the education sector in Spain) there are a lot of web sites with the same content: a company puts the same content on several web sites (the information about a course is always the same), so there are a lot of sites with the same content, and some of these sites create satellite sites to promote themselves.

    On the other hand, for internal reasons, we need to create several pages with the same content except for the city, and we don’t know if we can have duplicate content problems.

  133. “sites that Should Be don’t” Ever hear of the word competitors?

    Yes, and I learned a long time ago that competition is only a threat when your company isn’t strong enough to begin with. Again, that goes back to having something to hide and/or an insecurity.

    A good company doesn’t give two tenths of a damn about hiding things that could easily be in the public domain from the competition because it has a solid customer base and market share. They just go ahead and do it.

    It doesn’t take much to figure out that you’ve got a deeper insecurity issue than anything Google is going to solve.

  134. – Ability to email Google with errors in the webmaster tools.

  135. Robert Schroeder

    What are you talking about? I think you’re confused.

    I’m not going to post my competitors URLs here. As I said some of THEM Should Be penalized, while mine should not.

    Do you want to help or not?

  136. Friends!

    I found a nice name for this one:

    * Tools for detecting or reporting duplicate content

    Spam-O-Matic 🙂

  137. I went ahead and voted. Seems like you have a lot on your plate if you are choosing from that many items. Good luck and thanks for allowing us an input.

  138. Robert – please forward the info to lauri@gishpuppy.com. I’d love to take a look.

  139. I’m not going to post my competitors URLs here. As I said some of THEM Should Be penalized, while mine should not.

    Do you want to help or not?

    That’s not what you said…you said that you were being penalized by the “-950 penalty”.

    I never said to post your competitor URLs either…I was referring to your own URL. Usually when people make the statements they make about competitors, it’s because they’re trying to hide from them.

    If you want help, post your URL. It’s not that complicated.

  140. Robert Schroeder

    Yes, I’m being penalized by the “-950 penalty”.

    I said “There are sites getting penalized for nothing(mine), while sites that Should Be don’t(competitors).”

    I’d like somebody to compare the “sites” in question.

    I have a lot of information about the “sites” in question, so you would see what I mean.

    You need the whole picture.

    I have posted my URL in “Google Webmaster Help” and nobody found anything wrong!

  141. Since we can only choose one and no surprises that maximum votes are for “More information about penalties or other scoring issues” here are some more of the above:
    Show causes of 404 errors
    Tools for detecting or reporting duplicate content

  142. “Seriously, this -X penalty crap has got to stop. It’s old, it never existed, and it’s just an excuse designed to placate SEO wannabes”

    It’s very easy to dismiss other people’s problems in such a derogatory manner, especially if you haven’t been there with your sites.

    As I said above, you can be relegated to the back of the field even for your most obvious queries, like this blog might be for “Matt Cutts”. Don’t anybody tell me that Google is doing searchers a service by hiding your pages even from people who were out to find them!

    I mean, if StupidGardeningCompany Inc. was doing hidden text and doorway pages on its site, it is OK not to show it for, say, “gardening tips”. But as long as it stays in the index and not purged out altogether, it should definitely be the first result for “StupidGardeningCompany” regardless of spamming.

    Why? Because the guy who put this string in the search box was in all probability looking for this very company. Or doesn’t Google put the user first?

    This IS a penalty (call it minus this or minus that); it DOES happen to people who don’t do black hat; and it IS stupid. And the total lack of communication about it makes it worse.

  143. Hey Matt,

    can we get all features? 🙂

  144. All of them would be a great help. I hope the options such as ‘disavow an inbound link’ etc. give enough time for a webmaster to respond, particularly if they are away from the site/net for a couple of weeks of holiday or hospital, or where e.g. the site has been transferred/sold to a new webmaster.

  145. Apropos language and country: in Scandinavia (the Nordic countries: Sweden, Denmark, Norway, Iceland and Finland) most of us understand each other’s languages. (Well, between Swedish, Danish and Norwegian we do. In Iceland most people understand one of these languages aside from Icelandic, and in Finland many understand Swedish.) Our association “NordVisa” runs a site on the subject of the common kind of folk songs that we call “visa”. This is shared between the countries.
    The site is written in Swedish, Norwegian and Danish, sometimes on the same page.
    Two important things we do are to publish a yearly list of festivals and a list of all clubs and associations throughout all of our countries. It is read and understood by all.
    However we list very differently in the different countries, probably (partly) because of the assigned language for the page.
    It would be very nice to have a possibility to tell Google what the target countries are in an area with mutually understandable languages, since it is not possible in (x)html.
    Maybe Google could treat a site with different languages (marked up on in-site pages with mutual links) as potentially important for the different countries within areas in the world with known “common/close-related” languages.

    /chris

  146. Hi Matt! I would like to see the possibility of:

    1. More info on penalties
    2. Disavow in/out links

    Thanks

    Damien

  147. *** What I’d like to be able to do is to send an XML file of those requests containing the spammy information over to you guys ***

    I would hope that Google already has some honey-pot forums and blogs running, and is already collecting that sort of data. 🙂

  148. Dave (original)

    There are a lot of non-spammers with the 950 penalty – believe me!

    Yeah right, just like prisons are full of innocent people.

    IF you have been penalized or banned by Google you ARE a spammer.

  149. Dave (original)

    So these sites that say they have a penalty for no reason (impossible BTW) want better SERP positions so they can be found, but are NOT willing to post their URL along with PROOF of a penalty.

    That says it all 🙂

  150. Dave (original)

    It’s very easy to dismiss other people’s problems in such a derogatory manner, especially if you haven’t been there with your sites.

    Yes it is, especially when those with a “no reason penalty” wont even offer a URL.

    It is also very easy to dismiss those asking you to *prove* a “penalty for no reason” and to just keep on parroting the latest penalty buzz words while all the time pointing the finger of blame everywhere but the right place.

  151. Robert Schroeder

    “Yeah right, just like prisons are full of innocent people.”

    I’m sure there are innocent people in prison.

    Just as I’m sure there are non-spammers with the 950 penalty.

    Mistakes are made.

  152. Robert Schroeder
  153. Is that the site you are talking about Bob? Is that your post in there? If it is, it’s very clear to me why you are where you are right now. Some posts in that thread spelled it out for you as well. What’s hard to understand? You do “not” have any darn penalty whatsoever. Why do you think you do? Just because a site goes down in positions does not mean it’s a penalty of any kind.

    Your site looks very nice, and I’m sure you have good conversions, but what makes your site “better” or more unique than all the many other sites that sell shoes? I don’t see anything, so why do you feel your site should rank better than thousands of other sites?

  154. Robert Schroeder

    Well, spell it out!

    I dropped over 700 positions.

    Did you compare the other sites on page 1 now?

    Take a good look.

  155. Sigh….

  156. I don’t have to compare “other” sites to know exactly the problems of the site Bob.

    #1 NO content to speak of whatsoever.

    #2 The site does not link out to “any” quality sites at all.

    Regarding the first one; why not have an articles section where the owner can talk about everything shoes?? You create a new page with each new article. The site is stagnant with nothing new for months and months I’m sure. More pages yields more content.

    Regarding the second one; Your site does not link to any other site. NOT good. Why not link out to pertinent sites on the pages? I’m not talking about a links page…. nothing like it. You can link out to one site on a few pages that is high in quality. Sites who don’t link out to other great sites will never get anywhere.

    There ya go…. take the advice or not. 🙂

  157. Robert Schroeder

    Please compare.

    You make good points, but the other e-commerce sites have the same issues and more.

    I’d like to point them out, but…

    You don’t explain why I dropped over 700 positions.

  158. Dave (original)

    Dropping x pages in Google for a given search term is NOT proof of a penalty. If you REALLY believe it is, then tell me how many SERP pages a page must fall before it becomes a “penalty”.

    What it is *evidence* of is;

    1) You *were* being credited for things Google was fooled into crediting.

    2) You are relying on too few content pages for traffic (that’s a fact BTW).

    3) The competition heated up.

    4) Google rolled out an algo change.

    5) One or more of the above.

    Now, the site itself.

    1) Too few content pages.

    2) Too few quality inbound links.

    3) Reliance on too few pages for traffic.

    I would stop worrying about all the pages above yours and expend that energy on making your site the most important and authoritative in the market!

    Do that and you will get links that Google WILL see as true votes.

  159. sheesh. I’m done Bob. I “never” look at other sites as I don’t have to do that to know what a problem is. I just told you why the site dropped. Would you jump off the bridge because the other guy did as well? You are saying that just because some site out there is not doing something, that your site should not be doing it either? That’s crazy thinking Bob. NO TWO sites are alike at all. NO two sites should be doing what the other site is doing. Think unique. What is unique about your site? Nothing. Period.

    Do you understand how easy it is to throw up a shopping cart with a few pics and call it a website? Real easy Bob. Why the heck should your site be on the first page ahead of thousands of others?

    I’m trying to help you Bob, but you don’t seem to want to acknowledge your problems; instead you want me to go on a fishing expedition to check out other sites that have nothing to do with yours. No thanks. You have enough free advice to move up if you take it. If not, fine by me.

    Ask booksandbaskets.com what happened to her positions when she actually started writing new pages on the fly and started linking out to quality sites…. like one link per page or so…. go ahead, ask her.

    But naw; ignore my advice Bob and just keep on insisting that we compare some other site to yours….. see how far you get. 🙂

  160. Matt,

    There needs to be a way to remove URLs from the index more than one at a time without submitting an entire subdirectory. If you have a case where the bot gets into a section of the site that a webmaster forgot to make off limits, it can have terrible consequences, and it can take months for possibly thousands of URLs to naturally fall out of the index. We had this happen last year.

  161. Doug’s dead right here, Robert…if no one found a problem with this site, they weren’t looking hard enough (read: looking at all).

    Your biggest problems actually have nothing to do with search engines at all, although fixing many of the problems with your usability will also solve at least some of your search engine issues.

    In most cases, it takes a user at least 3 clicks (not going by the site map, which 99.9% of users aren’t going to touch anyway) to find most of your products. How can they buy them, and refer others to them which includes the oh-so-precious organic links that most obsess about, if they can’t find them?

    You also are blocking a potentially huge source of traffic and sales in the form of image search:

    http://images.google.com/images?svnum=10&um=1&hl=en&rls=GGLF%2CGGLF%3A2006-19%2CGGLF%3Aen&q=site%3Amisshighheel.com

    (Mind you, that doesn’t appear to be your fault…looks like a misconfigured robots.txt file).

    Those are just a couple of issues. There are quite a few more than that, some of which people actually told you in that webmaster group thing you posted and that you chose to ignore (much as I suspect you’ll ignore this, but I’m hoping someone else somewhere gets this…if they do, it’s worth it.)

    Simply put, you made the mistake 99.9% of e-commerce site owners make: you took the “ez-build” approach, using a “shopping cart solution” that creates more problems than answers (I don’t totally blame you for this…the prevailing and incorrect attitude is that Yahoo! storefronts are great for the small business owner), and most importantly that doesn’t reflect your business…unless your business is cookie cutter. Like Doug said, what’s unique about what you’re doing?

    I’m not going to compare your site to any other site either…as I said before, what your competition is doing (or not doing) is meaningless. The reasons you aren’t finding success are the holes in your own site…period.

    Now, you can hear what you want, and believe what you want, and hope the magical Google fairy will come along and make everything right…or you can start taking a hard look at your site from the user point of view and recognizing that SEO isn’t standalone, and that there are user aspects to it. It’s up to you.

  162. Robert Schroeder

    “…using a “shopping cart solution” that creates more problems than answers…”

    What do you mean by this?

  163. Would be very interested to see what Matt voted/would vote for as a site owner?

  164. Hi Robert,

    What I mean is that shopping cart software often creates bloated and erroneous code, poor navigation and architecture, and a one-size-fits-all mentality that doesn’t fit most sites (including your own). The other issues that I mentioned above qualify as well.

    They don’t address issues such as unique products and content, how to write a proper privacy and/or refund policy, and so on.

    Like I said before, I don’t blame you for knowing none of this…it comes with experience building e-commerce sites. You’re at the bottom of a fairly steep curve that we all were at the bottom of once (and most still are).

    The best thing you can do is to find a qualified e-commerce developer who can build you a custom cart. Of course, being the best thing you can do, it’s also by far the hardest…the good developers are usually too busy, and the available developers usually suck nard and wouldn’t know a good e-commerce solution if it bit them in the ass.

  165. I would like to have a way of marking up the content on my site(s) that I do NOT want to be duplicated (i.e. stolen) on other websites. Any such content found on other websites would be automatically removed from Google’s index.

    My website’s original content is very popular – so popular that dozens of webmasters are stealing it. Granted, they never ranked better than my website, but they’re still getting a lot of traffic that they don’t deserve, and making money with AdSense on those pages. Frustrating.

  166. Dave (original)

    I wouldn’t worry about changing shopping cart software until you have created at least 50 new content pages and you can stop relying on so few pages for traffic.

  167. Maybe Matt will change the poll to more correctly reflect the direction you all have taken this post in. Does -950 exist? Does it not? Does -30 exist…yawn…or better yet maybe the conversation should be more about the subject at hand.

    I’m still lobbying for stats to be updated more frequently , even for my peon sites that don’t have a PageRank of 9 and 30,000 visitors a second.

  168. Dave (original)

    Perhaps YOU should take your own advice THEN dish it out to others.

  169. Ahhh, why not have the vote as checkboxes?

    Ones i would find most useful are:

    Every One of them 😥

    What about all of the above, pretty much in that order? Add URL and additional documentation are not really needed: it is very well documented (as is everything Google), and you shouldn’t really have to add a URL; the spiders find everything with links.

  170. Hello Matt,

    There is something wrong with the poll.
    Since people tend to choose the first options, you should run the same poll later with the list reversed to see if the results are consistent.
    Best regards.

  171. What does Google do with Spanish cases like these?

    Search: http://www.google.es/search?q=dise%C3%B1o+web&sourceid=navclient-ff&ie=UTF-8&rlz=1B3GGGL_esES212ES212

    Page with Keyword Stuffing
    http://www.internacionalweb.com/
    or
    http://w2.edesign-comunicacion.com/index.html

    Diseño Web = web design …

    Personally I have reported more than 100 Spanish spam cases and NONE of them has been fixed…

  172. Please, can we have a 1st, 2nd, 3rd vote next time!

  173. Hold on, Dave…JLH might have hit on something here.

    As many experienced e-commerce developers are acutely aware, most shopping cart software is inherently flawed and takes a great deal of work to fix (e.g. osCommerce, Mal’s E-Commerce, zenCart, cubeCart, Yahoo! stores, etc.) However, there are people like Robert Schroeder who, through no fault of their own, aren’t aware of these things…and since there aren’t a high percentage of e-commerce developers who talk about them openly, there’s really no way for them to be aware.

    There’s some potential benefit to big G picking out common software code and patterns (e.g. e-commerce software, vBulletin, Joomla, PHPNuke, etc.) and letting webmasters know that they need to fix the common errors within those pieces of software (session hashes embedded in URLs, excessive clicks to get to product pages, etc.)

    I don’t think that’s really a scoring issue but if there were a short phrase for it, I’d use “Common Website Software Issues”.

    Maybe this could be added to the poll, Matt?

  174. Yeah sure, why not? Google could simply take over my firm’s entire business and reason for being by offering total website development, etc., and complete CMS systems and customized shopping carts. Heck; why does the internet need real developers when Google could do it all by itself? 😀

  175. Robert Schroeder

    “…shopping cart software often creates bloated and erroneous code…”

    And, yes I’m aware of it, but some of these things can ONLY be fixed by Yahoo! – and other issues as you said “…takes a great deal of work to fix…”

    Custom carts are very very expensive.

  176. I voted for “More information about penalties or other scoring issues” which seems to be the most popular, but would like to contend that there are others here that would really act to inform this piece and would like to suggest that they be included with it.

    “Tools for detecting or reporting duplicate content”: since dupe content can affect whether your page shows as a supplemental vs. a main listing, I think it goes along with scoring issues; even if ‘technically’ it is not a penalty, the supplemental index is one.. so this should go with it, as well as “A way to list supplemental result pages”.

    “Score the crawlability or accessibility of pages”, “Fetch a page as Googlebot to verify correct behavior” and “Diagnostic wizard for common site problems”: again, if your page is not crawlable or if your site has issues, it affects your ranking, so I think these also go with the first one.

    So I think these combined would provide us with powerful methods for making our sites more relevant and worthy..

    Thanks Matt for posting this poll!

  177. Yes Bob; it can only be fixed by Yahoo and that is exactly your problem. It’s easy to throw up a site and use yahoo. That site doesn’t even use all the plugins that yahoo offers. You take your visitors “off” your site to make a purchase which is not good either. Yahoo is running the entire site for you… not good.

    It used to be that one could simply make money on the internet just by “being”. It’s not that way, and hasn’t been that way for a few years now. There is way too much competition for the same eyeballs in each market. Shoes are very much a top item for any and all shoe affiliates. If your site is not unique, you can hang things up. Being unique can involve many different things, including “you have to spend money to make money”. You say custom carts are too expensive? I beg to differ with you totally. You make any investment back in the end, Bob. You paid “pennies” to get that Yahoo site and cart. It’s not a unique proposition.

  178. Robert Schroeder

    “That site doesn’t even use all the plugins that yahoo offers.”

    What plugins?

    “You take your visitors “off” your site to make a purchase which is not good either.”

    I can’t do anything about that.

  179. Not sure what you mean by that. You are the only one who could do something about it. 🙂

  180. Robert Schroeder

    Well, Yahoo! could do something about that.

  181. No they can’t, and no they would not. You bought their services for your site as is. They didn’t force your hand to use them. You have plenty of options out there but you chose them. You are the only one who can fix your own site. No one else can do that for you….. unless of course you pay them.

  182. Robert Schroeder

    “No they can’t, and no they would not.”

    Why not?

  183. Most Important, is Think Universal! Not only my USCIIIIII CODE but its reasoning, the need for sharing natural logic with binary machines, a voiceprint based communications which is the future of human machine interface, is the natural standard you can not avoid. We are made in the image we must understand if we are to make a machine in our image. You can not escape the truth in my invention, like the phone, it does not have a problem with every language or music or verbal logic or truth you can trust, after you know my voice and person; so please be wise, and start by knowing the end solution that I only know as it is my invention and I do own it, and can show information about it while still having full ownership of how the future machines will work, and why it is important the you also work on The Universal and not the endlessly limited ways of ascii and English as well as many other languages not yet upgraded for use by machines.

  184. I want something else: an option that will allow me to declare a single sitemap as both mobile and web. I have the same URLs for both the mobile and web versions of my sites and hence have a single sitemap.

  185. I voted for “More information about penalties or other scoring issues” because Google does a terrible job explaining themselves. Google claims content is king, and that is how it should be, but the index doesn’t necessarily back that up.

    Search for —> free rv camping vidal junction ca (I’m sure someone has statistics about how often someone searches with and without quotes)- The first result today is a site that has absolutely nothing to do with the search free rv camping in vidal junction, california except for a Google Adwords link about RV camping and campgrounds, and then nothing free. As a matter of fact…of the results displayed on the first page , NOT ONE has ANY information about the actual aggregate search term. Now since Google has determined that I want information about Vidal Junction, I suppose the results are OK, EXCEPT they still didn’t give me anything about the first terms in the string! If I wanted information about vidal junction, I’d have searched for it, not a deeper specific search where I got a LOUSY result!

    Search for —>”free rv camping vidal junction ca” – The first result is a post of mine on the Google Webmasters Help Forum, again with no content about free camping, just the terms in a primary indexed web page. The second result is my Supplemental Result where there is TRULY content about the search term.

    All you have to do is read the webmasters help forum… NOT! I’ve gotten help that is marginal at best, and inaccurate at worst. I’ve posted about this exact issue, though I now know that G is watching as the search results have changed since the last post I made. Now I’m worse off!

    Some of us really need help and guidance, and can’t afford to pay SEO wages. I’ve struggled for over 2 years to get my site noticed by G, and so far my reward is over 200 pages of supp, 18 pages in the primary index, and a whole boatload of frustration trying to figure out how a web page about Wyatt Earp Days in Tucson, AZ shows up ahead of a content page about the exact search term requested.

  186. I was going to vote – but when I read through the choices I saw that they would almost all be nice features – its too hard to choose only one.

  187. start penalising sites that are using black hat techniques such as forcing IPs and buying thousands of paid links across cooperative sites.

    have a look at every top 30 site in the finance / loans industry for starters. i used to believe it was all about content and ethical seo, but check the links of anyone in the top 10. properly check them, not just via the selfish google link: command which only gives you a taster.

  188. It would be nice to see a history of top search queries & top search query clicks.

    Thanks

  189. “No they can’t, and no they would not.”

    Why not?

    Because, as the payment processor, Yahoo! gets a nice chunk from each transaction (as does any payment processor, so Yahoo! can’t be blamed for that in and of itself.)

    Robert: depending on your definition of “very very expensive”, custom shopping carts can be that in terms of initial investment…but what you also need to factor into the equation is the loss of potential sales that you’re experiencing by using something limited and inherently flawed, such as a Yahoo! storefront. It’s like Doug said…the more you put into it, as long as you’re smart about it, the more you’ll get out of it.

    The best advice I can give you in that regard is to really shop around, do your homework, find designers with a clue and that are taking on business (a lot of us, such as myself, aren’t…so save some time in that regard).

    They’re out there…it just takes time and a lot of patience to find them.

  190. My biggest problem is when it loses the verification for a site. I have a good number of sites in there, and I often have to go back in and re-verify some of them because at some point Google got a 404 when trying to access my verification file. Why not re-check it once or twice before forcing me to re-validate? Seems like that would be easy to implement.

  191. Dave (original)

    Robert, from what I see, your choice of payment processor is the least of your problems.

  192. Dave (original)

    Kenki, read my previous replies.

  193. So Matt asks “What do you want to see the Webmaster Central team do next?”

    Well it is more than likely he will not even bother to read this, but just for the hell of it, here goes……

    The thing that I would like the team at Webmaster Central do next …is to realise that the whole world does not live in the United States!

    So what do I mean? Well it’s like this…

    It borders now on virtual impossibility for an average small business or enterprise to put a website up and get any exposure or visibility whatsoever on Google unless they have a massive budget to spend on a small army of technical experts that can advise on correct web page construction, search engine optimisation, online marketing, PPC, location of server etc etc..

    When Google first came along many years ago, it claimed to be the best thing in the search world, ensuring a vastly improved level of search capability. However I would humbly suggest that Google has now become so carried away with itself, its war against search engine spam, linking and “supposed” quality of content, that it has totally forgotten the average Joe with the average website…and particularly if he does not live in the U.S.

    The fact that most of our countries were founded on the average Joe having the motivation to get up off his backside and start a business and provide employment, seems to have been completely forgotten by the esteemed Google team.

    How would Sergey and the other multi-billionaires and multi-millionaires at Google react if I came and painted their house and then could not guarantee the work I did for them? I would suggest they would be amongst the first to stand up and start screaming.

    I’ve personally been around in SEO and online marketing since the mid-90’s and it would be fair to say that it has never been so difficult (if not impossible) to get a new website ranked on Google. In the last year, because of the way that Google has chosen to continually play around with its ranking requirements, I have made a decision to cease any and all operation in the SEO and online marketing field. And why? Because at the end of the day I can do the very best that I can for a client, but now I find that I cannot look him or her in the eye when I stick my hand out to get paid. Simply because I do not know if the work I have done for them will be successful.

    Trying to accommodate Google’s ever changing website requirements has now become little more than a joke. And sadly because of the constant meddling, so has the quality of Google’s search results.

    And the proof is in the pudding….earlier this year I was foolish enough to spend 3 months building an education website with a considerable amount of unique educational content. I really wanted to believe that the quality of content was of maximum importance to Google. So to test this out I ensured that there were few if any links to other sites. Six months later I see all that work was an absolute waste of time. All that valuable Maths, Biology, Art and Music content sits in the supplemental results. So I now sit here pondering how on earth Google can justify placing such importance on the number of links on a site, rather than providing 12 year old children with free educational material.

    So at the end of the day, here is my challenge to the big G Team. If you guys and gals really do care about the quality of search results and everyone having fair access to the Internet, why not try climbing down out of your ivory towers and make your way out of your technological jungle, just to take a quick look at the real world? If you did that, you might realise that you are surrounded by average Joes and Jills with average small businesses who simply cannot afford to spend $20,000 a month on the latest Google changes to ensure their website gets any visibility. And when you’ve seen them you might also start to realise that we have now nearly come full circle (because in all seriousness, Google has taken us from the use and validity of the directory structure right back to it).

    So Webmaster central team, if you really do care….that’s what we’d like to see you do next.

  194. Dave (original)

    LOL! There are no “objections” from me, and my post listed possible reasons in a clear and concise list form.

    Here’s the spoon fed version: 🙂

    Not enough quality links from similar sites, not enough content pages, reliance on too few phrases and reliance on too few pages.

  195. I “voted” for the “way to list supplemental pages” because it is close to what I would REALLY like to see. WHY ARE SOME OF MY PRIMARY IMPORTANT PAGES BEING DUMPED INTO SUPPLEMENTAL? WHAT TO FIX? HOW TO GET OUT OF SUPPLEMENTAL?

    PULEEZE!

  196. Matt,
    I can’t believe you just used a font tag with the attribute of color=green in your post.

    Showing your age there, fella 😉

  197. There’s one feature I and others would like to see – the ability within the webmasters url removal tool to remove all pages that include a phrase instead of removing them individually.
    For example, every page of my site that Google has listed that includes, say, “product_info.php” or “cName=” within the URL could be deleted in one go. This could be done by entering “ *product_info.php* ” or “ *cName=* ” (like you do in robots.txt) into the webpage removal tool, and all pages that include the phrase would be deleted with one click.
    Stupidly I added an SEO URL add-on which changed the look of my URLs, then I changed to another one. I now have a massive duplicate content issue and need to delete over 3000 pages. The feature described above would let me do this quickly and keep my site clean and true.
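
    (A partial workaround in the meantime, assuming it is only Googlebot you care about: Googlebot’s robots.txt parser accepts * as a wildcard, so the old URL patterns can at least be blocked from further crawling while they age out. The patterns below just reuse the examples above and are only a sketch, not a tested config:

      User-agent: Googlebot
      Disallow: /*product_info.php
      Disallow: /*cName=

    It still isn’t the one-click bulk removal being asked for, though.)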

  198. Definitely these two as they are problems that only Google can solve and we already have tools that can do most of the other technical stuff, duplicate content detection etc:

    # Tool to help move from one domain to a new domain

    # Tell Google the correct country or language for a site
    —–> and have it so it can override your current filter settings no matter where the server is physically located.

    And also this just out of interest 🙂

    # More information about penalties or other scoring issues

  199. I agree Teddie, but what tools would you use for detecting duplicate content and how would you go about removing 3000+ pages from Google without searching and spending days to get it right? 🙂

  200. I wish Google would provide the right information to all publishers so everyone could start from the same starting line and, with hard work, everyone could have a chance at success; but for now, for most of us, Google is still a big enigma.

  201. Like one of the first comment posters said: why not all of them (in an ordered fashion)? Or at least let us choose our top three; it was very hard to pick just one. More information about penalties and scoring was certainly #1 on my list, and it looks like it was for most others as well. I will be very interested in seeing how this pans out (in the webmaster console) and also (if you make it available) how many more reinclusion requests you get as a result.

    I do applaud you for asking for the feedback, and actually creating new functionality based on what users really want. The google webmaster console is becoming light years ahead (in functionality) of Yahoo’s Site Explorer.

    thanks Matt (and Webmaster Console Team)…

  202. Excellent poll, but I would like to vote for them all….

  203. Oh man, I hope what is currently the top requested feature gets implemented FAST. It’s just toooo frustrating to run a quality site and have no idea why you are obviously being penalized!!!

    Having said that, totally unrequested, I’m going to add my top five after the one I voted for seeing as it was still a tough choice. Some good ideas there!

    Diagnostic wizard for common site problems
    Tools for detecting or reporting duplicate content
    Show causes of 404 errors
    Score the crawlability or accessibility of pages
    Option to “disavow” backlinks from or to a site

    ps. cool, if you copy/paste the graph, it shows you the percentage behind the number of votes. Nice 🙂

  204. Quick I need votes for what country the site belongs in, I have been waiting for that for a while now. 😉

  205. Information as to the location of broken links.
    As Alex above said, I have spent many hours searching for a single broken link, reported by webmasters tools.
    Thanks Matt, I appreciate the great knowledge you share.
    Scott

  206. Dave (original)

    Why not all of them?

    Same ole question, which the answer to is based on common sense. If there was an “all of them” option most, if not all, would check that option. Then, nobody would know which is the most wanted.

  207. IBasket Bill – you shouldn’t be doing that via the URL removal tool – you should be using .htaccess (if it’s an Apache server) and 301 redirects to send the duplicate copies to the correct version – that would be very simple (maybe 3 lines) to accomplish and would avoid losing customers (and PR) when SEs or customers follow an outdated link.
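
    Something along these lines, for instance (a sketch only: it assumes Apache with mod_rewrite, and the old-style and new-style paths are placeholders, since the real mapping depends on which SEO URL add-on produced the duplicates):

      RewriteEngine On
      # send any request for the old URL format to its new canonical equivalent
      RewriteRule ^old-url-style/(.*)$ /new-url-style/$1 [R=301,L]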

  208. UA&RU SEO says 😀 : “Thank you, Matt, during this interview”.
    …First option, certainly.

  209. I think the tool to help move from one domain to a new domain would be very useful for webmasters; it could save lots of time.

  210. @ Ibasket Bill – we have our own crawler (we may SourceForge it soon). Duplicate content within a domain is nearly always an issue with managing URLs and not content; we just need to find where the duplicates are coming from, which we can do easily, and then it’s just housekeeping. If someone else is ripping off your intellectual property, try Copyscape.

  211. I had a little tink about this way , but now: Nice step I say!

    Your team must have had a genius as pathfinder (great music from Beggars Opera’s 3rd LP) I think.

    And now I plea for another step:

    Please integrate a link exchange into the safe webmaster console platform. There, Google could propose good and natural link partners to the webmaster. And if the proposed partner declines, that should still count toward his PR. This way the system would become a little more rational and a fair motivation for all.

    PS: ?some one noticed: the google-2007.gif goes TV
    in a new form, but it should also symbolize movement there.

    Karl

  212. Hey Matt,

    How are you thinking of these features…. really cool, I have voted my opinion

    Cheers,
    Harini

  213. Karl; I’m thinking that is a joke, right? Gawd; I hope so. 🙂

  214. Score the crawlability or accessibility of pages, I would say! I have lots of visitors and low PageRank. The number of visitors should have an impact on PageRank too.

  215. Hi Nejc; no it shouldn’t, since PageRank has nothing to do with your visitors or the number of visitors, nor should it.

  216. Hi, I’d like to have the ability to SSH into someplace at Google and type commands to view my Google Analytics. Now that would be cool.

  217. now this is a nice service

  218. I’d like it to recognise that my site is encoded in UTF-8 (as per the meta tag and the server header) and not in US-ASCII as the page analysis insists, but I think that’s a Googlebot problem rather than a webmaster console problem. 🙂
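
    (For context, the two declarations in question typically look like this; the first line comes from the server’s response headers and the meta tag sits in the page’s head, both shown only in their standard UTF-8 forms:

      Content-Type: text/html; charset=UTF-8

      <meta http-equiv="Content-Type" content="text/html; charset=UTF-8">

    If both are present and the page analysis still reports US-ASCII, that does sound like a crawler-side issue.)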

  219. Matt McLaughlin

    I would really like to see accounts allowing multiple users without having to upload multiple verification files/metatags.

    Thanks!

  220. I voted for the first option because I recently had a big problem, and if Webmaster Central had this option maybe it would have helped me a lot.

    Here is the strange case I’m dealing with.
    My blog about cars dropped from about 1000 long-tail and short-tail searches/day to about 15. I thought my site was banned, but that isn’t the case.

    It barely ranks for its own titles in “”
    I’ve looked at Webmaster Central and I have about 11,900 not-found URLs that look like this: carsandtuning.org/ar/etc

    A few months ago I had a language script but I removed it. I can’t understand how Googlebot is crawling these /ar/ pages, where it finds them, and why it insists when these pages don’t exist.

    Also, I write my own content and I’m not scraping.

    Can you please take a look? All my work is ruined and I don’t understand how this happened in just one day.

    Thank you.

  221. Ibasket Bill, the meta “noindex” tag is exactly for your case. Or robots.txt (with a better organized site structure).
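
    (That is, something like the following in the head of each duplicate page; a sketch only, since the exact pages depend on which URL style is now the canonical one:

      <meta name="robots" content="noindex,follow">

    “noindex” keeps the duplicate out of the index, while “follow” still lets its links be followed.)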

  222. Ahem, I’d also like “keep the existing tools up to date” to be on the list. The “Your page with the highest PageRank” tool is currently only displaying April, May and June for me, so it’s at least a month out of date. It used to update at least once a week. 🙁

  223. Just keep it constantly updated. It’s frustrating to see the same old thing day after day after day after live long day. Snore….zzzzz.

  224. How about a feature that shows which pages on your Website are considered a supplemental result?

  225. Hi Matt

    I totally agree with Harry. A more refined search with PR ranking as well as the number of backlinks would help in linking with other sites of the same quality etc.

  226. I think google should consider stopping toolbar PR. There are many benefits to this:

    1.) Buying and selling of links will reduce
    2.) Development of crappy MFA sites, built only to sell PR after gaining some initial PR through a couple of backlinks, will stop

    Remember the good old days when there was no PR? Back then people used to link to each other based only on merit.

    Stopping the toolbar PR can really help reduce spam if not completely stop it.

  227. Ha ha ha! That was funny!

    Can we stop laughing now?

    Ask what new tools people would like, but then remove the only tool that currently was of much use and came anywhere close to being up to date, while not removing the need for the tool in the first place.

    Did we just vote for a bunch of tools that, if they are ever implemented, will leave us all waiting for the other shoe to drop and finding they’ve been removed?

  228. How can I see my website’s results from other countries?

  229. How about some kind of help for people whose content is stolen daily and who believe they are being penalized by Google for it.

    It really would be a shame for content creators to lose the value of their work to an automated process that is wrong some of the time.

    I think many webmasters are at a loss especially when their competitors syndicate their content.

  230. I selected option one, more information about penalties or other scoring issues, but some other options are equally important to me. I think Google should now give businesses a little of the same respect it gives visitors searching for data. There should be a proper guide and diagnosis method for online businesses so they know what they are doing wrong. Everyone knows there are a huge number of companies involved in online buying and selling; can’t Google do anything to let them know what they are doing wrong and what things may affect their online business?
    All I want is an easy way to understand what someone is doing wrong, intentionally or unintentionally.

  231. I don’t know what kind of parameter is meant by “Tell Google a parameter doesn’t matter.”

    I think Google’s U.S. Government Search at http://www.google.com/ig/usgov needs to find a unique niche for itself rather than lure users who’d be better served by the official U.S. search engine at http://usasearch.gov/. I’m not sure what the difference is, but it seems the official one that’s run by the government would provide the best results for government searches.

  232. Hi Adam; you wrote:

    “…more information about penalties or other scoring issues, but some other options are equally important to me. I think Google should now give businesses a little of the same respect it gives visitors searching for data. There should be a proper guide and diagnosis method for online businesses so they know what they are doing wrong. Everyone knows there are a huge number of companies involved in online buying and selling; can’t Google do anything to let them know what they are doing wrong and what things may affect their online business?”

    So you think it’s Google’s responsibility what kind of success your business has on the internet? That’s an interesting take on things. Google offers “free” visibility and FREE visitors to “your” business; actually refers visitors to your business for free, but you don’t think Google does enough for businesses?

    Since Google’s search results are mainly objective and done via automated processes for the most part, if Google were to answer all the concerns of webmasters and businesses totally and told each and every one of you why you are not ranking; how the heck would that happen precisely? Which businesses would Google have to favor over other businesses and which 10 businesses in your market/industry would be on the first page of results if Google were to tell each and every one of you what is wrong with your business?

    I always thought it was the responsibility of the business owner either to figure out what was wrong with his own business or to hire a specialist to help figure it out. Maybe I’m too 1999, but that’s always been my take on things. I didn’t realize that in 2007 businesses want a third party to tell them what is wrong with their own businesses, and to do so for free, even though that same party is already giving those businesses free referrals anyway.

    Sounds kind of nutty, right? 🙂

  233. I want to echo and, er, “ditto” Alan Rabinowitz’s comment about somehow dealing with stolen content and any related penalties, etc….

    I had an experience a while back of finding a guy who copied several hundred pages from my CPA web site. In a way, it was actually a little funny. Sort of a backwards homage. But I wondered about the search engine penalty (if any).

    Tangential anecdote: I wrote a couple of the Dummies books, Quicken for Dummies and QuickBooks for Dummies. As a PR thing, I offer to give these away to prospective clients. The guy who plagiarized my site made the same offer… to give away free copies of my books. Geesh…

  234. Looks like everyone wants information on penalties… any chance we’ll ever get it 🙂

  235. I’d love to see the TEXT of the backlinks. Is that a possibility?

    Thanks!

  236. Umm, why no option, “ALL OF THEM!!!!” :-()

    Although I voted for “Diagnostic wizard for common site problems”, I think “More information about penalties or other scoring issues” runs a very close second if for no other reason than to prove/disprove a penalty actually exists.

    And, were the “information about penalties” to include the actual reason for the specific penalty and specific page(s) it is being applied for, it would then seem to be a toss-up, for me, as to which were more useful.

    I base this not on what I myself would necessarily find all that useful but more so, what I see most questions come up for at the Google Webmaster Tools help forum.

    I still think there should have been an “ALL OF THEM!!!” option.

    Craig

  237. I’d LOVE to see Blog Search as one of the default searches on the Google homepage vs. having to pull it down as an option.

  238. A way to list supplemental result pages 🙂

  239. Several tools would help webmasters: more information on penalties, duplicate content, and supplemental results. Also, update the links more often.

  240. Do you still read comments after 200 😉 I hope so because this is a new request…

    Google Alerts is so spammy. Week after week the same domains get through, especially if you use an “as it happens” alert. And if you create a unique keyword on your site to catch scrapers, it is really annoying that it’s hard to report those domains: you require users to type in the search term and the SERP page where they found the site. You need an option to report a domain and just say it came from Google Alerts.

  241. I’d really love to have a tool

    – for detecting or reporting duplicate content

    I’ve recently discovered a site that contains a lot of pages copied from my site… I found it by chance on Google while searching for other things.

    I think this tool would help a great deal with respecting copyrights.

  242. Hey Matt,

    I like the Google Backlink Alert idea. Why don’t you guys integrate the Google Alert System into the Webmaster Console, like a live feed of backlinks which you can sort by date, group etc. For each backlink you can do the following: disavow (including motivation?), find all links from this domain, report spam, report malware, contact webmaster (for the SEO fans 🙂 ).

  243. Some great ideas here.

    Me? I voted for being able to help Google identify the geographic focus of my site. I have a Canadian-hosted site with a .com domain. However, the site is UK-focused. With hindsight I’d obviously have been better off with a .co.uk domain. Yahoo indexes the site as a UK site… it would feel like all my Christmases had come at once if there were a mechanism for asking Google to review such sites for geographical classification.

    Thanks

    Yvonne

  244. I voted for:
    More information about penalties or other scoring issues

    I think webmasters should know more about what they are doing wrong, and focus more on building content for users not for search engines.

  245. I’m missing another kind of tool.
    I have some customers who want a remake of their page but won’t leave the basic frameset design they are using at the moment.

    So the tool I would love is one that tells Google what is what, so that a result from Google loads up the frameset even though the hit is on one of the content pages.

    I used to use a frame fix, but since Google has been working on how to deal with JavaScript (I’m guessing because of all those JavaScript menus out there), every page but the frameset page has been removed from the index. I’m guessing it’s because the script redirects or changes the window location to the frameset when a content page isn’t being viewed inside the frameset.

    So my vote would be for that, rather than any of those other tools…
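
    For reference, a typical “frame fixer” is only a few lines of client-side script, roughly like the TypeScript sketch below; the frameset filename and query parameter are made up, and this is a generic illustration rather than the commenter’s actual code. It is exactly this kind of client-side location change that the comment suspects is confusing the crawler.

        // If this content page has been loaded outside any frameset, reload the
        // frameset page and tell it which content page to restore.
        if (window.top === window.self) {
          const target = encodeURIComponent(window.location.pathname);
          window.location.replace("/frameset.html?content=" + target);   // hypothetical frameset page
        }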

  246. In the webmaster console I would like to see the PageRank numbers and a posted time for the next PageRank update. I have so many customers asking about their PageRank. Google needs to give us more information.

  247. I voted for more info on what’s wrong….

    I mean, it’s kind of ridiculous when they don’t answer contact requests, don’t say why sites get gutted, and only point to a vague webmaster guideline… How about some reasons and details on why our site was gutted? Or, God forbid, a way to fix it without waiting anywhere from an eternity to an infinite eternity for the gutting to go away.

  248. 1. More information about penalties or other scoring issues (I dropped 30 spots overnight for no reason and I haven’t changed anything in months. This might help explain such anomalies)

    2. Option to “disavow” backlinks from or to a site (Without this isn’t there room for abuse from competitors?)

  249. Voted for more info about penalties, but I would love to see all of them. You guys are miles ahead of Y & M with the Webmaster Console already; keep charging ahead, eh? 🙂

  250. Very good and interesting post, thank you Matt.

    The ones I’d most like to see however are:
    1. Tell Google a parameter doesn’t matter
    2. Tell Google the correct country or language for a site
    3. More information about penalties or other scoring issues
    4. Tool to help move from one domain to a new domain

  251. Wow, huge vote for “More information about penalties or other scoring issues.” I for one would love this.
    I am one webmaster who dropped into a -50 penalty overnight four months ago. I want nothing more than to comply with Google’s guidelines. I have been over everything and changed some things, and I am still none the wiser as to what I have done.
    So some extra information and help for people like me is greatly needed!

  252. “More information about penalties” is a great option, because it’s not fair to be penalized and not know the reason. It would also be good for Google to say how long the penalty lasts; that way you know whether there is still hope or you are already dead. You know, when you don’t have traffic from Google you’re practically dead.

    “Tell Google the correct country or language for a site” is another good point; in some countries web hosting is really expensive and people host abroad.

  253. Yay,

    I see that a geo target location tool has been added now – thanks to Matt and the Webmaster Tools team.

    http://googlewebmastercentral.blogspot.com/2007/10/better-geographic-choices-for.html

    Groovy 🙂

  254. Hi!

    Option to “disavow” backlinks from or to a site,

    because I want to let you know that this must be stopped at once:

    66.249.65.205 – – [31/Jan/2008:22:28:07 +0100] “GET /graph/Black-Smoker/?ref=SaglikAlani.Com HTTP/1.1” 200 – http://www.light2art.de “-” “Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)” “-”

    Or how else can I stop this?

    Greetings Karl
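
    Until there is a console setting for telling Google a parameter doesn’t matter, one common workaround is for the site itself to redirect any request carrying the tracking parameter to the clean URL. The TypeScript sketch below shows just the normalization step; the function name is made up, and the example URL is simply the one from the log line above.

        // Drop a tracking parameter such as "ref" so all variants collapse onto one URL.
        function stripTrackingParam(rawUrl: string, param = "ref"): string {
          const url = new URL(rawUrl);
          url.searchParams.delete(param);
          return url.toString();
        }

        console.log(stripTrackingParam("http://www.light2art.de/graph/Black-Smoker/?ref=SaglikAlani.Com"));
        // -> http://www.light2art.de/graph/Black-Smoker/

    A 301 redirect from the parameterized URL to the stripped one would then tell crawlers which version to keep.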

  255. Greetings Matt,

    I personally would like a few of those features:

    More information about penalties or other scoring issues: It would be great to know if something is wrong, especially for people who don’t know much about SEO and lack experience, or for more experienced people like me to know if new sites have penalties or they are just crappy and new. It gets a little scary when a site goes from 50 visitors a day to 1 or 2.

    Show links on your site that are broken: This is a big one! All that’s out there (that I know of) is the Xenu software for checking broken links. EXTREMELY helpful for large sites, so I don’t have to manually look at every single link. The problem is, Xenu always times out before it finishes a scan, lol. It would be awesome to have this feature in Webmaster Tools so I can find which pages these bad links are on. It could catch misspelled linking URLs as I make them. (A rough sketch of this kind of check follows at the end of this comment.)

    Show causes of 404 errors: This goes hand in hand with finding broken links, especially if I don’t know what pages the broken links are located and it’s a huge site.

    Option to “disavow” backlinks from or to a site: This would be an interesting tool because I would like to knock out some bad links that could be coming from hostile competitors or bad neighborhoods. At least I would have a little control over who links to me.

    Fetch a page as Googlebot to verify correct behavior: It would be great to see a page through the eyes of Googlebot. The reason is because I had a site a LONG time ago when I was an inexperienced webmaster. My background was that of a repeated background image ON TOP of a white background. I had white text over this image. The problem is Googlebot probably saw this as white text over white background and I got a major penalty that lasted all year and I had to close up that business. I didn’t notice it for a few months because it came up as white text on a dark picture (which looked sweet by the way). I thought this traffic was “normal” getting 1 or 2 visitors per day but I didn’t know about penalties that whole year and realized that competitors with crappy looking sites and less content than mine were getting 100s of visitors per day. Again, this goes hand in hand with the “More information about penalties or other scoring issues” feature that would help indicate problems!

    Those are the features that would help me out the most. Just thought I would throw it out there with my reasoning.

    Best
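
    A basic broken-link check like the one requested above can be approximated in a few lines of TypeScript. This sketch only probes a hard-coded list (the URLs are placeholders); a real run would feed in every link extracted from the site.

        // Probe each link with a cheap HEAD request and report anything that does
        // not end up at a 2xx status after redirects. Node 18+ provides fetch.
        // (Run as an ES module so top-level await is allowed.)
        const linksToCheck = [
          "https://example.com/",
          "https://example.com/some-old-page",
        ];

        for (const link of linksToCheck) {
          try {
            const res = await fetch(link, { method: "HEAD", redirect: "follow" });
            if (!res.ok) {
              console.log(`${link} -> HTTP ${res.status}`);
            }
          } catch (err) {
            console.log(`${link} -> request failed (${(err as Error).message})`);
          }
        }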

  256. Hello! I think more information about penalties would be the most useful and reasonable.

  257. But of course I must say these are all great 🙂 My idea (I know, it’s a bit of a stretch, but humor me) is to create a custom GOOGLE (insert clever name here) page similar to iGoogle. A place where all the amazing Google tools are integrated into one space and organized according to the user’s own taste.
    For example, I want (if I’m going to ask, I might as well go big) a fully customizable control panel where, with a click, I can add a custom analytics box on one side and a links list on the other. And so on… is it too much to ask? 😉

  258. Google Webmasters has been a huge blessing for many. The competition is fierce. Adwords, analytics, and all the tools are a big help to produce a successful site.

  259. First-time commenter, so be easy on me!
    How about a reason why a certain page is in the supplemental index,
    e.g. no links / lack of content / spam.

  260. I sometimes write abstracts of articles I find on the web. It helps give a reference point for my input. Does using this content count as duplicate content even though I give credit and a link to the original? Anybody?

  261. An automated message or warning regarding a penalty you incurred so you can at the very least have the opportunity to fix the issue.

  262. Hi Matt,
    I would like to see the “exploration date” for backlinks in Webmaster Tools.

    Thanks
    Fares AJROUD
