What would you like to see from Webmaster Tools in 2014?

A few years ago, I asked on my blog what people would like from Google’s free webmaster tools. It’s pretty cool to re-read that post now, because we’ve delivered on a lot of people’s requests.

At this point, our webmaster console will alert you to manual webspam actions that will directly affect your site. We’ve recently rolled out better visibility on website security issues, including radically improved resources for hacked site help. We’ve also improved the backlinks that we show to publishers and site owners. Along the way, we’ve also created a website that explains how search works, and Google has done dozens of “office hours” hangouts for websites. And we’re just about to hit 15 million views on ~500 different webmaster videos.

So here’s my question: what would you like to see from Webmaster Tools (or the larger team) in 2014? I’ll throw out a few ideas below, but please leave suggestions in the comments. Bear in mind that I’m not promising we’ll do any of these–this is just to get your mental juices going.

Some things that I could imagine people wanting:

  • Make it easier/faster to claim authorship or do authorship markup.
  • Improved reporting of spam, bugs, errors, or issues. Maybe people who do very good spam reports could be “deputized” so their future spam reports would be fast-tracked. Or perhaps a karma, cred, or peer-based system could bubble up the most important issues, bad search results, etc.
  • Option to download the web pages that Google has seen from your site, in case a catastrophe like a hard drive failure or a virus takes down your entire website.
  • Checklists or help for new businesses that are just starting out.
  • Periodic reports with advice on improving areas like mobile or page speed.
  • Send Google “fat pings” of content before publishing it on the web, to make it easier for Google to tell where content appeared first on the web.
  • Better tools for detecting or reporting duplicate content or scrapers.
  • Show pages that don’t validate.
  • Show the source pages that link to your 404 pages, so you can contact other sites and ask if they want to fix their broken links.
  • Or almost as nice: tell you the pages on your website that lead to 404s or broken links, so that site owners can fix their own broken links.
  • Better or faster bulk URL removal (maybe pages that match a specific phrase?).
  • Refreshing the existing data in Webmaster Tools faster or better.
  • Improve robots.txt checker to handle even longer files.
  • Ways for site owners to tell us more about their site: anything from country-level data to language to authorship to what content management system (CMS) you use on different parts of the site. That might help Google improve how it crawls different parts of a domain.

To be clear, this is just some personal brainstorming–I’m not saying that the Webmaster Tools team will work on any of these. What I’d really like to hear is what you would like to see in 2014, either in Webmaster Tools or from the larger team that works with webmasters and site owners.

546 Responses to What would you like to see from Webmaster Tools in 2014? (Leave a comment)

  1. The full list of links considered by Google as artificial

    • +1 – they know which links they deem bad, so they should share that info with the person they penalise to help resolve the problem; it saves all the back and forth and the risk of removing links that are actually fine.

    • Totally agree with you. If you guys simply tell us what the “bad links” for our site are, then we will remove them ASAP, and there would be less spam for you to filter.

      • More in depth keyword data would be great, as well as a link diagnosis tool that lets us know if we accidentally added a good link to the disavow tool.

    • +1 I was thinking about this: an evaluation of each link’s impact (good, bad, or suspicious/needs attention) and a button next to links to disavow them directly.

    • Also agree. There are so many links out there that webmasters don’t know exist in their backlink portfolio, as Google Webmaster Tools only shows a limited number of backlinks, and the same goes for third-party tools. It would be great to have a full list of links Google considers spam or unnatural.

    • I 100% agree with this comment!

      As an SEO specialist managing clients’ accounts, if I suddenly see a drop in traffic, rankings, or anything that would cause the client to question the work we are doing, then I need to know whether it has anything to do with the links to their website.

      I am constantly trying to improve UX on clients’ websites and create truly wholesome experiences for customers. If I am not sure why they are being penalised, then I want to know and rectify it immediately.

      I have never believed that “black hat” links are the way forward with regards to SEO; however, they can come from external sources, and if I don’t know that this has happened, how can I possibly help improve my client’s website by removing the bad link?

      What are people’s thoughts on this?

      • I think the problem then would be that it would be too easy for black hatters to work out techniques that are not identified by Google.

        If they did reveal what they thought was a bad link, it would have to be granular: perhaps limit how many bad links you can see a month, or something similar, to curb abuse and overuse.

    • I can’t agree with this more…

    • Guys, the reason Google will not do this is because it shifts too much power back in the hands of blackhats. The feedback loop would be too short and would be very easy for blackhats to learn and scale. True, the hardcore guys could probably learn just as much using small scale tests and Turks, etc…, but no need to make repentance too easy.

      • Matt,
        I read your comment here and could not disagree more. The more mystery there is, the more power the black hats have. The more straightforward and transparent Google is, the better the experience users will have.
        Also, in cases like mine, where I am learning about my site as I go: I haven’t linked intentionally to anyone but the Better Business Bureau. Then I get some message from Google complaining about a link or links. I have no idea where to start. I watched your video, went and looked at Webmaster Tools, and it told me I had X number of links, most of which had been there a good while. So where do I start? It makes no sense to me. If there is a problem, then tell me what it is. If Google really wants a level playing field on the web, then surely it realizes that a small company doesn’t have the resources to play these kinds of games like a billion-dollar enterprise can. We don’t have rooms full of people figuring out, or trying to figure out, Google’s every move. Therefore, giving people accurate, timely information will ultimately make your job, my job, and the user experience better, if a level playing field is what is truly desired. Thank you.

      • Well, this is my problem. A competitor’s site started a spam link-building campaign against my site incontridonne24.it; after one month, when they had reached first place for the word incontri on google.it, they surely reported my site as a spam one.

        I lost my rankings for the word incontri on google.it. I made a disavow file and submitted it to Webmaster Tools (but it wasn’t complete, because Webmaster Tools doesn’t show all links; it shows about 1/4 of them, checked against ahrefs.com), and every day new spam links appear.

        All I want is for the disavow links tool to support the option to disavow all except some links; no need for regular expressions (if you say those shift too much power back into the hands of black hats). Ahrefs gives very good details about backlinks, so the idea that Webmaster Tools shouldn’t give all the backlinks is somehow “cancelled”.

        The current disavow links tool is useless against automated blog comment spam, forum spam, etc. Please give me a democratic solution to my issue and others’, because otherwise my only way out of this is to apply the Russian method to my competitors: “fists in mouth”.
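        For reference, the disavow file Google accepts is plain text: one URL or domain per line, with `domain:` covering every link from that host and `#` marking comments. Disavowing whole domains and simply omitting the hosts you trust gets close to the “disavow all except some links” behavior asked for here. A minimal sketch (all domains below are made up):

```text
# Spammy comment/forum networks: disavow the whole domains
domain:spam-directory-example.com
domain:comment-spam-example.net
# A single bad page, rather than its whole site
http://blog-example.org/forum/profile.php?id=12345
```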

    • Exactly! I also want it. Google hides many things; e.g., in AdSense it just writes one line. How can a webmaster look through a site of thousands of pages to find the small issue they found?

    • Totally agree! That would help us so much. As far as I know, someone is running an anti-SEO campaign against my site. I’d like to know the bad links so I can disavow them.

  2. This may be contrary to Google policies… but I want to talk to someone… a consultation… whatever the cost! lol. Seriously. I am sure I am not the only one… My site is definitely better – more authoritative than what I am ranking for. I know who the competition is, have looked at their sites, and I know who has more of the market share. So when the search results don’t reflect the truth… I would really like to be able to find out why – not to complain… but to make whatever is killing my site right. I am tired of paying for consults from “experts” only to find out that they don’t know any more than me and can’t figure it out either.

    • BTW: I’m certainly not talking about the blog that my name links to, lol. I don’t link to my main site for fear of penalties!

      • Hi Michael, you would only need to fear linking to your site in comments if (a) that’s your main way of getting links and (b) the links show anchor-text over-optimization. Drop me a line and I’ll have a quick look.
        P.S. I know what I’m talking about.

        • OK, Miguel, but over-optimization would not be my fear… but rather under-optimization. That is, the anchor text is always my name… straight up, since that is who I am. I believe in doing things right. But I wouldn’t want to send signals to Google that my site is about me… because it’s not. So imagine hundreds of comments across the net on SEO blogs linking my name to a site about aloe. Yes, they are nofollow… but G lists them in WMT. I’m even afraid to use my G+ profile in the website/URL box for fear that it could discredit my authorship – that G might see my name on too many SEO blogs and discredit what I contribute to the net about aloe. Unfortunately, since G is not transparent about these things, there is too much fear to even use the net the way it was intended. Now the pendulum has swung too far in the other direction. G should rather recognize: “OK, SEOs get it… black hat is dead because eventually G will penalize it… so now it’s time to be transparent… and if SEOs take advantage of the knowledge and use it for black hat, G will figure that out and penalize them… and be transparent when we do it.”

    • Michael,

      There is a lot of talk in the SEO community about how, when Google says “create more great content and build authority blah blah,” it is a bit of a cop-out. If something is wrong with your site in terms of possible Google penalties, those penalties are pretty ruthless and long-term. And Google is very mum about them.

      Matt, I really wish you guys could be more open to requests like Michael’s. There are tens of thousands of people just like him who need more transparency.

      • Well said.

      • Authority can be a quite controversial thing.

        I wish to repeat some of my views here:

        The way I look at it is that Google is being very theoretical in the approach on which they base their algorithms. Everything has to start with theories, but nothing can work if it stays only theoretical.

        The basic things which are making the searches unnatural or not so effective may be as follows:

        1) Backlinks
        2) Domain authority
        3) Google News

        How difficult is it for a big site with enough resources and connections to get mentioned on equally big sites? How difficult would it be to have a journalist write about you if you have the right connections? Actually, not at all difficult. Then what about a site which may have equally good content, or even better, but which does not have those connections or resources to get mentioned or to get so-called authority backlinks?

        A site may bring updates about something every day on the same page by updating that page. The very old material may no longer be relevant, so why keep all that old content and use unnecessary server space? Another site makes a different page every day and gets into Google News. Just by virtue of that, it gets tons of backlinks or mentions from the news-aggregating sites. Overnight it earns a different place in Google’s eyes. Mind you, I have clear examples of items which are only opinions, presented to look like news, and Google failing to recognize them as such.

        In the IT world it is not only excellent coders that are needed to execute a project successfully. It takes coders and domain experts to do the job. A coder with exposure to projects for the automobile industry may not be good enough to execute a policy administration system for an insurance company, even if the project involves the same technologies he is well versed in. A person writing code and algorithms just on the basis of technical knowledge and theoretical assumptions or principles like links, authority, whom to punish, and who made a mistake, without any regard for the subject matter (at least to some extent), can never do the job.

        • Good points.

          I know that in the Facebook world you can pay a ‘Like farm’ in Bangladesh (it could be anywhere where labor is cheap) $20 per 1,000 ‘Likes’ (they employ staff to create and use fake Facebook accounts to ‘like’ your page). So, given that a domain costs less than $5 and hosting next to nothing, it would be similarly possible to create loads of unique domains backlinking to your site for a very small cost.

          In the old days, keyword stuffing was used to boost search engine results, until the search engines got wise to this (I’ve now removed the meta keywords tag entirely from my sites, and whilst I can’t be sure, that seems to have improved our rankings). Perhaps search engines now need to get wise to all these other SEO tricks…? Because, quite frankly, IMAO, there should not be a market for an ‘SEO consultant’!

      • Transparency would be ideal. As an expert in human behavior, I know the best way to get people to cooperate is not to incarcerate them indefinitely, but rather to limit their access to resources AND then retrain them. I think people are willing to be retrained when it’s the only perceived viable option.

    • Ignacio Colautti

      It happens to us also.
      It would be great to know why one site ranks better than another.

      Pings to Google and advice would be nice too.

    • I’ve experienced these kinds of X-Files situations many times. For instance: a very competitive keyword, my client’s site in first place (software-checked), branches in 27 countries, the market leader in Europe, the others on that page also big players; suddenly a new site appears in second position and has stayed there for months now. We’ve checked everything and couldn’t find a reason for it. Just a handful of backlinks, 80 percent from their own sites in two different countries, no Facebook profile or Google+ page, a fresh domain, etc. It’s very irritating… :)

    • You hit the nail on the head. Exactly the point; we are currently going through this same pain. We had 134K visitors per month; now it’s 30K per month. The reason? We don’t know. When I posted in the Webmaster product forum, they came up with a story about some minor HTML errors, or that Google hates directories, but those existed before too. So what has changed in two months? Again, no clue. Then somebody said the bounce rate was high; we worked on that and brought it down from 65% to 45%, but the result is the same; in fact, traffic has gone down further. Where is the transparency? What should I do?

      • Kushan, I am with you on this. I have a high quality site that has been nurtured through all the algorithm updates, and in fact got stronger with each one to the point that every article I created ended up on page 1 of Google SERP.

        Then suddenly in early November 2013 the site just disappeared from the SERPs; the best I could do was page 18 for the landing page. I have checked everything: no manual penalty, no bad links, no spam, etc. The site took an additional hit a few days ago and has slipped even further. What really irks me is that I can’t find out why, or where to go from here. The possibility of walking away from 2 years of hard work is now a reality. I have thought about starting a new site and moving all the articles across to it, but it would be soul-destroying if the same thing happened again, and again not knowing why. It’s like being charged with a crime but not told which law we have broken. How can we prepare an adequate defense?

        Interesting to note that the sites now ranking where my site was are mostly skinny, dated, with no visible SEO and no authorship.

        Unfortunately, Google, the self appointed Gods of the universe, have us over a barrel. We want to play by the rules but can’t because we don’t know what they are.

    • I agree. Number 1 would be some kind of live human support. I would gladly pay a support fee if it meant that we could get a live Googler, by e-mail, chat, or phone, who could help me understand how we can improve our site.

  3. Fresher, deeper and more accurate keyword data!

    • +1

    • I love the broken link report idea. It would be wonderful to know when links on my site point to error pages. I’d also like the following reporting about possible problems with my site:

      – Notification when links on my site point to an external page and the content of that page changes significantly

      – A list of all the spelling errors on my site

      – A list of pages without titles or meta descriptions (I already like the duplicate reports of these).

    • I’d like a robots.txt generator for my site; mine caused pages to be de-indexed, so a good default would be very helpful.

      • Yes, I think removing ‘meta keywords’ altogether has improved our results – if your content is good and what the user is looking for, the bot may do a better job than you at connecting them to it—after all, it’s got a database of zillions of search queries by actual users, which you don’t have.

        I’ve not so far found any need for robots.txt, except to prevent the 404 errors in the server log when we’re crawled.
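        For anyone in the same situation, a minimal robots.txt that allows all crawling (and so stops those 404 errors when crawlers request the file) is just:

```text
# Allow every crawler to fetch everything
User-agent: *
Disallow:
```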

    • +1 for more refined, more accurate, more granular keyword data – the stuff we used to get from referrers in Google Analytics!

    • +1 for this – the worst thing Google did to webmasters in 2013 was the HTTPS change

    • +1 would be so time efficient and stop all the guesstimates of SEO ‘experts’

    • I understand the reason for removing keyword data was to try to make websites and their content more natural rather than keyword-focused. However, I believe that the majority of SEOs do this anyway, and that they now use Bing keyword data instead; surely taking away such useful information pushes users away to other platforms?

  4. I’d like to see every page blocked by robots.txt, or at least an extract, with an API for downloading this list.

    • Yes, there is a lot of ambiguity in GWT between pages that are meta noindex vs blocked by robots.txt, which makes diagnosis hard

  5. A more up-to-date links section. Why am I always looking at ‘links to our website’ that are over six months out of date? E.g., removed links, and links from sites we no longer have links on, are still shown in that section of WMT. It’s hard to gauge how the link campaigns, link removal work, and so on are going, as the data seems to be so far out of date.

    Not sure how easy it is to alter, but it constantly irritates me!

    • Hey Steve,

      With the commonly out-of-date links in WMT, have you considered looking at tools like Majestic for your link data? At work here in Chicago, for our link-removal efforts, using Majestic’s fresh link index for our domain, coupled with their trust and citation flow metrics, has proven pretty helpful at identifying what we should leave up or try to take down. Just a thought, at least until WMT speeds up its system a bit in regard to links.


      • I should have mentioned this tool when I wrote my initial post, but it slipped my mind at the time. Serpico (http://www.serpicoseo.com) has actually done a great job at fusing both WMT data and Majestic data into one for link reports. I’ve used it before for both link removals and finding great linking prospects. Hope this helps.

    • I want to second this and add to it: more information in general about the links that Google actually sees. Remove old, obsolete links; show the date you found each link (and unveil links as soon as you find them); and give more transparency about links for manual penalties. We have a couple of clients dealing with manual penalties, and the current disavow / beg forgiveness / disavow / beg forgiveness cycle takes a very long time. If they knew what the old SEO provider did wrong, they could fix it much faster and more thoroughly, but they remain puzzled.

      Thanks for the opportunity to chime in Matt and Team.

    • Totally agree with you Steve. It’s great to have access to the backlink data but they are sooo slow to refresh. I guess this falls under Matt’s comment “Refreshing the existing data in Webmaster Tools faster or better.”

      Matt, rather than looking for answers in blog comments, how about a formal feature-request system where people can upvote the ones they want?

    • Yes, it is not easy… by design. True, link dates are always going to be affected by crawling limitations; however, it would be nice to have a show/filter option instead of having to download the links to see the last crawl/index date.

    • I keep adding 301 redirects for links that pop up in the crawl errors, though have also noted that the errors may magically disappear from one week to the next even if I don’t. The linking page invariably does not have the link, and in some cases I wonder if it ever did (particularly if the page has a PDF document). Perhaps there is a bug in GoogleBot here…?

      • Comment in one of our .htaccess files from a year ago (preceding a long list of 301 redirects):

        # Googlebot also sometimes manages to recall very old internal links years gone by, so help its hand.

        Should we really be wasting our time trying to help GoogleBot’s hand, or can it stand on its own two feet now?

  6. – exact impression and click data
    – “provided” keyword referral 😉

    • Accurate Keyword-level Data!

      But seriously… if you look at the data provided by Webmaster Tools, WT integrated into Analytics, and WT integrated into Adwords… you’ll have 3 drastically different numbers.

  7. I would really like you guys to be more sensitive to people who have their sites negatively (and often inexplicably) affected by your algorithm changes.

    You are sincerely having a real effect on tens of thousands of people’s livelihood. So more sensitivity, please.

    • Good sentiment, Alex. The Penguin penalties seem disproportionate to the crime. My small business was approximately #2 in a five-company market. In 2012 my SEO “expert” created 50 artificial links, my keyword density was high on some pages, and there were other minor issues. The Penguin punishment on Oct 4 took my site from #3 to #30 or worse on the SERPs = death. My site had minor issues (IMO); now I am ranking behind dozens of non-relevant sites. How does that help the person searching for my type of service? I fixed the issues (I think); now I wait and pray.

  8. Organic traffic figures that in some way correspond with reality?

    (Serious answer; a full, complete list of inbound linking pages to my domain)

  9. Organic keyword data?…..Worth a shot

  10. Can you provide the full functionality of the Webmaster Tools API, please? Although the documentation lists many types of data, only some of them can be used…

  11. “We do not detect any structured data on your site.”
    But there is structured data. I’d like better debug messages to see the errors, what matched and what didn’t.
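    As a point of comparison, one common cause of the “no structured data detected” message is markup that is close to, but not exactly, what the parser expects (e.g. a missing itemscope or a mistyped itemtype URL). A minimal schema.org microdata snippet that the testing tool should detect looks like this; the organization name and URLs are placeholders:

```html
<!-- Minimal schema.org Organization markup (placeholder values) -->
<div itemscope itemtype="http://schema.org/Organization">
  <span itemprop="name">Example Widgets Ltd</span>
  <a itemprop="url" href="http://www.example.com/">Home</a>
</div>
```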

  12. A tool a bit like the “Fetch as Google” tool, but for the Links to Your Site Section so you can tell Google when external links have been removed. Currently it can take months and sometimes years for external linking pages to be removed from the Links to Your Site Section after a link has been removed.

  13. What about an external redirect or canonical tool? Sometimes implementing a link rel=”canonical” element to page markup or setting up a 301 redirect on the server are not easy to accomplish tasks (especially enterprise businesses with large websites). An external tool would make it easy for Webmasters to alert Google to subtle changes in navigation without having to jump through all the hoops required for a website release.
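    For context, the page-markup mechanism this comment refers to is a single link element in the head of the duplicate page (the URL here is a placeholder); the request is for a WMT tool with the same effect that doesn’t require a website release:

```html
<!-- In the <head> of the duplicate page: declare the preferred URL -->
<link rel="canonical" href="http://www.example.com/preferred-page/" />
```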

    • I second this. Our CMS requires a new site name and marketers want the old and new site active for a few weeks. It would be great to simply indicate that site x/y/z is now a/b/c. We would still redirect pages with significant direct visits but it could accelerate the index’s learning curve and eliminate the spider banging into the thousands of now obsolete pages.

  14. I would like to see a list of penalties a site is penalized with. And I would like to see more “suggestions to improve” that are much better than they are currently.

  15. I would love to see more support for structured data, (the current tools are a nice start) and more tools to support showing me how my data was used.

  16. GWT notifies webmasters when they have a manual action, often the result of improper link building. There are no such tools for sites which are having algorithmic issues. There are generic guidelines for improving one’s site, but nothing which points out potential issues. I think I’ve got a Panda problem, but I’m not sure. As a result, I’ve gutted major sections of my site, along with following the recommendations Google has outlined.

    HTML improvements in GWT are a good start, but what about a notification regarding potential ‘content issues’? If Google provided some sort of direction, it would help us make better sites. I don’t see how providing this information would in any way help those looking to game the system. In my case, I’ve got a UGC site, so maybe Google takes issue with non-relevant topics or the indexation of members? It’s attached to a blog, so maybe that’s the area which needs improvement? Having some sort of feedback as to what might be impacting a site would go a long way: “We’ve determined [this URL] could be impacting the quality of your site.”

    As a site owner, that would allow us to quickly focus on problem areas and improve quality. Without these tools, changes made are often overkill and may be doing more harm than good.

  17. We would like to see WMT identify for us the inbound links that it thinks are low quality, rather than have us sort through 50,000 links and waste valuable company development time determining it on our own, possibly disavowing a site that is not really low quality.

    Also, WMT should do a better job of updating inbound links on a regular basis. We often see links which WMT said it found 3 years ago that are long since dead and 404.

    It would be a good feature if Webmaster Tools (WMT) could give a list of all URLs on our site that you have indexed (not just how many). This way, when WMT shows thousands more URLs than we know to be on our site, it can tell us what those URLs are.
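    Until something like this exists, one rough way to make a 50,000-row link export reviewable is to aggregate it by host, so you vet a few hundred domains instead of every URL. A minimal sketch in Python; it assumes an input of one linking URL per line, which is an assumption about your export rather than any documented WMT format:

```python
from collections import Counter
from urllib.parse import urlparse

def hosts_by_link_count(urls):
    """Count linking URLs per host, most-linked hosts first."""
    counts = Counter(urlparse(u).netloc.lower() for u in urls if u.strip())
    return counts.most_common()

# Example with a handful of made-up backlinks
links = [
    "http://spam-directory.example/page1",
    "http://spam-directory.example/page2",
    "http://spam-directory.example/page3",
    "http://good-blog.example/review",
]
for host, n in hosts_by_link_count(links):
    print(host, n)  # spam-directory.example 3, then good-blog.example 1
```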

  18. I think Google can learn from itself in this case. Take all of the identifiers (color coded buttons, examples, scores and breakdowns or even simple to ‘read at a glance’ charts) that you use in Google Site Speed Results, and apply that logic to Google Webmaster Tools.

    This will allow owners to see where they are getting a good grade from Google (nice organic backlinks, markup, crawling availability, metadata, errors, and other categories) and see what they need to improve first, or what will offer the most benefit. The grade doesn’t have to mean (1st page – A), but just an overall (This is on track to succeed with Google – A) or (This area needs to be improved to make indexing and presenting on Google better – C), and list some examples or bullet points.

    Some of this you do, but it’s the awesome presentation and guides that you use on Site Speed that could really benefit many aspects of GWT.

    • Grades are good. It would also slow down the SEO spam that is being launched against small businesses every day.

      • BTW,

        I recently came across an SEO company that is so bad that it generates 10M–20M a year using pure spam links, via direct partnerships with ALL the big yellow-pages companies. So they have thousands of clueless customers. It’s an American company as well.

  19. Some separate reports about mobile for responsive or ‘server side detection’ sites.

  20. These would be great improvements. A better API would be also great to integrate WMT in third party tools to better serve agency clients.

  21. A way of distinguishing between followed and unfollowed links when downloading your list of latest links within ‘links to your site’.

    Also a FULL list of links that Google have identified as linking to your site (currently restricted when hitting a certain volume).

  22. I would like a ‘Request Malware Review’ option that is available all the time. Right now we are clean (according to Safe Browsing), but at the same time a specific file, when downloaded, shows ‘appears malicious’; and because there are no malware warnings in Webmaster Tools, we are stuck and cannot do anything.

  23. If Webmaster Tools data refreshed faster, that would be big news. Right now Google Webmaster Tools data is not updated even within a month. The ‘Links to your site’ section should be accurate and up to date; that would help webmasters identify which links are good and which are not.

    Thanks Matt,

  24. Faster resolution of malware issues! A hacked WordPress on a site is frequently fixed in minutes, but it’ll take hours for it to clear up after you mark it for a re-scan. I can certainly understand some technical hurdles with having that be instant, but taking down a business for half a day hurts quite a bit now that Chrome stops people in their tracks (even though it’s obviously not Google’s fault in the first place).

    • I can certainly understand some technical hurdles with having that be instant, but taking down a business for half a day hurts quite a bit now that Chrome stops people in their tracks (even though it’s obviously not Google’s fault in the first place).

      Well, don’t use W*rdPr*ss (the Internet’s biggest blight as it stands right now) and you won’t have this problem. I’m sorry, but that thing has caused way more problems than it has solved, and putting your business in the hands of something free and open source means you put the business at risk. Yes, I know this is Matt’s blog, and yes, I know it’s W*rdPr*ss, but he has both work-related and personal reasons to use it (testing hacked-message functionality, search-related testing, and other things that make Matt a corner case).

      That’s not on Google at all. That’s on people not taking responsibility for the sites they build.

  25. All of these please!

    “Show the source pages that link to your 404 pages, so you can contact other sites and ask if they want to fix their broken links.
    Or almost as nice: tell you the pages on your website that lead to 404s or broken links, so that site owners can fix their own broken links.
    Better or faster bulk URL removal (maybe pages that match a specific phrase?).”

    Helping new businesses and site owners would also be a great step, with some basic advice around content and relevancy as well as technical advice.

    I’d also like a report that would tell me when links from certain sites were first detected. This would help when some links need to be looked at. Also, there are occasions when Webmaster Tools does not report all the links coming into your site, so improvements on that side would also be welcome.

    • While this is valuable, I suspect the frequency of use of such a tool would be low. Definitely something a 3rd party link tracker should build at some point

      • 404 links can easily be handled with a 301 in .htaccess. Attempting to contact other site owners who have a broken link to your site would, I’d have thought, be a waste of time compared with simply editing .htaccess.
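A minimal sketch of what the reply above describes, assuming an Apache server with mod_alias enabled; the paths and domain are invented examples:

```apache
# Permanently redirect an old, broken URL to its replacement.
Redirect 301 /old-page.html https://www.example.com/new-page.html

# Pattern-based version for a whole renamed section.
RedirectMatch 301 ^/blog/2012/(.*)$ https://www.example.com/archive/$1
```

Each rule answers requests for the old path with a 301 pointing at the new location, so inbound broken links keep working without having to contact anyone.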

  26. Hi Matt,

    I wrote a blog post (http://www.trustagents.de/blog/google-webmaster-tools-wunschliste) about features I miss. Here’s my short list in English:

    1. Indexation status download

    It would be awesome if I could pull a list of the URLs Google has indexed.

    2. Notification if the webmaster auth code gets removed

    A small website of a friend of mine got hacked and the hacker removed his auth file from the website – it took him some time to find that out.

    3. Improved usability

    It freaks me out to be unable to click on example URLs in the “URL Parameters” section. I need to copy the paths and add the hostname myself, which is quite time-consuming.

    4. Internal link accuracy

    The internal link data looks quite weird to me.

    5. Recent links: anchor text + link target information

    It’s nice to be able to pull recent links, but as only the source and date are included in the download, it’s quite hard to perform in-depth analysis without crawling the websites on your own.

    6. Improved API

    I would love to get more data through the API.

    7. Increased date ranges

    The 90-day limit is quite frustrating if you take on a new client who didn’t take care to download the data in the past.

    8. Search query download

    If you download search query data via the backend, you only get the queries without the target URLs.

    9. Crawl error download

    If you can’t access the API, it’s pretty unhandy to work with the crawl error report download due to the size limitation (1,000 rows max).

  27. Hi Matt, from a small-business webmaster who looks after his own website: it would be great to be able to see if our domain has an algorithmic penalty (Penguin/Panda). I think this would lead to better websites, as we would know straight away if we have issues. As a webmaster for my business I can only guess whether we have any algorithmic penalties, which leaves us with the daunting task of trying to fix something when we might not even have a problem. Also, if we hired an SEO company and GWMT showed a problem, we would have a good reason to question the SEO company and get them to fix their problem.

    GWMT is getting so much better for the small business owner to use to find and fix problems – please keep up the great work 🙂

    Kind Regards

    • Another good point. Protecting small businesses from bad SEO companies could add a lot more accountability and help Google track the bad SEO companies.

  28. Just one thing… whether a domain is affected by any of those algorithmic updates, with some recommendations or other additional information for recovery (the way you show manual penalty actions)…

  29. Hi Matt, I think the most useful thing you guys could provide Webmasters at the moment is if they are affected by an algorithmic penalty, and if so which one. Thanks, Chris

    • I think we all kind of want the same thing… but the reality is the algorithms just weigh a little of everything. It’s the things that we have no control over that stink – like having your keyword in your company name and URL, which creates an “over-optimized” site… or a company name that has been around since the ’50s that just so happens to have a number in it… these are some of the things that probably penalize, and when you own a company that has all of them, there needs to be some way to get an adjustment. It would be great if you could actually get Google to look at a site and recognize: “Yes, site owner… you are right… the SERPs don’t include you where they should and we need to fix it.”

  30. The feature to set URL canonicalization in more intuitive way than the current tool for Parameter Handling.

  31. Definitely the ability to download whatever of a site Google has, in case of a hard drive failure or similar. A new client of mine got into an argument with their previous web designer, who turned their website off and refused to let them have the files. It would’ve been very easy to just download a copy of the site, at least as a starting point for building a new one, rather than having to look at cached copies and download page by page, which took forever.

    • Wouldn’t that leave web designers open to being taken advantage of by unscrupulous customers? “You’re not going to pay me for 6 months worth of development? OK, I’ll just turn the site off.” “That’s OK. I’ve got a copy of your code. Thanks.”

      • Since most sites are dynamic nowadays, it’s hard to value this feature. Besides, what hosting companies don’t offer free backups nowadays?

  32. For me, it would be refreshing the crawl error data more quickly. I have errors in there now that I know for a fact were fixed globally 3 months ago. But Google seems to keep checking older versions of the pages that had the duff links, even after marking the 404s as fixed in GWT.

    Also it would be great if you could fix the many bugs in the GWT site itself. I don’t remember them all off the top of my head, but one example is ticking the ‘check all’ checkbox on the crawl errors page, then marking as fixed. The items are removed but that checkbox stays ticked; you have to click it twice to select all again.

  33. Confirmation of algorithm impacted websites plus tailored feedback on resolution areas.

  34. It would be nice to have a better faster way to report negative SEO. I don’t think it is right that innocent companies get slapped out of the SERPS for something beyond their control.

    • Hi Matt,

      Thanks for asking.

      The most important thing related to Google Webmaster Tools that comes to mind is improving the way webmasters can report cases of blackhat SEO, paid links, etc. As far as I can see, it hardly works right now, even though it’s the best way of detecting unfair attempts to improve one’s position in Google – much more sensitive and precise than algorithms.

      I’m not sure if the problem is related to Google’s internal policies or anything else but it doesn’t work the way I would personally like.

  35. For me: there is a Google Places option inside Other Resources. Inside Google Places, can you please differentiate how many people came to my local business listing through mobile and how many came through the web?

  36. You know, Matt, it’s hard to believe you’re on vacation when you ask a question like this. 😉

    The one thing I’d really love to see myself is, for anyone who has used structured data and/or authorship/publisher markup, an indication of when it will appear in search results. I get that it won’t appear right away, but if you’re showing that something has been created and submitted successfully, an idea of when it will appear in a SERP would be nice. Even if you tell us it’ll take 6-9 months from the date you noticed it, at least there’s a range in place so we know what to expect and when.


    It is not very “transparent” that Google currently does not do this… ;p

  38. Hi Matt!
    Would be useful to get data for the index status feature, such as the list of URLs crawled by Googlebot.

  39. A full download of the links that are linking to your site, not just a few of them. Also, a fresher index of those links would be nice.

  40. Give me an easy way to tie social accounts to my sites. I manage 3 company sites but have 1 Google Plus account (for content freshness). I also have 2 YouTube accounts due to the type of content we produce (quarterly reports on a specific geographical area).

    One more thing: I want the webmaster tool to tell me who it thinks should be linking to me and/or who I should be linking to, based on my content. The reason is that sometimes the news media are the only relevant sites with any authority in my industry (commercial real estate), and they (the business journals, local news media, etc.) refuse to link off their site for any reason.

    • +1 on the social side… Need to allow all such associations to be possible, configurable and verifiable… Identifies ‘digital footprint’ and helps webmasters / Google understand additional authority signals

  41. Please

    can you give some information about which URLs of each sitemap are indexed or not 😉

    pleeeeeeease 😉

  42. Hi Matt. Your suggestions are all great! As I do a lot of work to help sites remove unnatural links penalties, here is what I would love to see in WMT:

    -A complete list of links that Google is seeing pointing to our site. Quite often when we get example links from Google they are ones that are not in our WMT list of links and sometimes they are not on any of the paid backlink checkers either.

    -An indication of whether you are seeing any followed links from each url listed.

    -An indication of whether or not the link is in the site’s disavow file, and whether it has been recrawled since submission, so we know if it has actually been disavowed or is still waiting for that to happen.

    -The anchor text of the link. That could be a tough one though as some sites will link multiple times. But, if there is keyword anchor text, it would be awesome to see that in WMT.

    Thank you for asking for our opinion!

    • Marie said almost the same things I would have said.

      Also, more details in “links to your site”: the URL that is linked to, for example. I am working on link removals for a penalized site that has a few old domains redirecting to it. The links don’t show in Webmaster Tools for the old domains, but show in the current live domain’s list. It would make my job much easier if I knew which sites were linking to which domain.
      And for that matter, a list of domains that redirect to the site would be nice too. I had a couple of incidents where someone 301’d a heavily spammed domain to mine and it took me way too long to discover it.

  43. I will echo many of the comments here.

    1. A complete list of all backlinks. We shouldn’t need to use 5 different websites to pull backlink reports, none of which show the same set of links.

    2. A notice if we have been hit by an algorithm penalty and what we can do to fix it.

    3. A disavow tool that, when we submit a file, actually gives a time frame for when we can expect the file to do something.

  44. A way to get in contact with Google about your site. Right now you are unable to contact anyone at Google – yeah, only for AdWords. So a contact option, like a ticket, for special questions or considerations.

  45. I would love if WMT had something to combine or authenticate ownership of a business to merge duplicate/oddball G+ Pages. Maybe have a page that says, “Here is the list of pages. Which should live or die?” Also, I have struggled with combining pages when our company has acquired another. If there was something even as simple as a code/tag we could put on the websites/G+ pages to claim, combine, or decommission the pages it would be enormously helpful.

  46. Notices of Panda algorithmic penalties, just like sites with unnatural links get notice messages in WMT.

    Also better WMT API to get information about the pages visited by the users upon searching Google. This is possible nowadays with the old Google login tokens, but not with OAuth.

  47. Continental geotargeting in WMT, especially for the European Union: allow us to refine the hreflang default down to continental (or political union) defaults. Also, why not refine down to state-level geotargeting for large countries?

  48. 1. A complete list of all backlinks. We shouldn’t need to use 5 different websites to pull backlink reports, none of which show the same set of links.

    2. A notice if we have been hit by an algorithm penalty and what we can do to fix it.

    3. A disavow tool that, when we submit a file, actually gives a time frame for when we can expect the file to do something.

  49. 1. Log Analysis Report – what pages were crawled by Google, when they were crawled, average crawl activity per month for each page

    2. More Detailed Indexed Pages – tell us if a page is indexed or not, date that the page was first indexed

    3. Orphaned Pages Report – any pages that Google has not crawled on the site but has found via external links

    4. Duplicate Content Report – identify pages with identical content

    • +1 on #3 & #4 – though on #4, add the ‘degree’ of duplication, as it would affect page or site visibility (% of duplicate content on-site), and identify which duplicate is ‘most authoritative’ against off-site duplicate content
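The duplicate-content report in item 4 could, in spirit, work like the following minimal sketch: hash each page’s normalized text and group URLs whose hashes collide. The `pages` dict is an invented stand-in for real crawl output.

```python
from hashlib import sha256

def normalize(text):
    # Collapse whitespace and lowercase so trivial formatting
    # differences don't hide duplicated content.
    return " ".join(text.lower().split())

def find_duplicates(pages):
    """Group page URLs by a hash of their normalized content.

    `pages` maps URL -> page text. Returns only the groups with
    more than one URL, i.e. the clusters a report would surface.
    """
    groups = {}
    for url, text in pages.items():
        digest = sha256(normalize(text).encode("utf-8")).hexdigest()
        groups.setdefault(digest, []).append(url)
    return {h: urls for h, urls in groups.items() if len(urls) > 1}

# Invented example: two pages differ only in whitespace/case.
pages = {
    "/widgets": "Widgets  for sale",
    "/widgets-copy": "widgets for sale",
    "/contact": "Contact us",
}
print(find_duplicates(pages))
```

Exact-hash grouping only catches verbatim copies; real systems use shingling or simhash to flag near-duplicates and to score a ‘degree’ of overlap, but the clustering idea is the same.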

  50. Notification if a site is below the quality threshold of an algorithmic assessment (Panda, or Penguin). Additionally, automated examples of either negative links identified (when affected by Penguin), or duplicate/low quality content (when affected by Panda).

    Status updates for reconsideration requests (pending, read by Google, responded, etc).

    Status updates on the effectiveness of a disavow file (have the negative domains/URLs submitted been crawled since submission of a disavow.txt)

    More accurate backlink data.


  51. Hi Matt,
    nice, thanks. Would be nice to have more info about images.
    This could be a report about images without alt text, or whether a site is penalized by the SafeSearch filter. Also interesting: the crawl rate of Googlebot-Image. Perhaps you could add an area about top image pages or the main image search keywords. And – this would be very great – a list of those images that can be found as copies on the web. So a webmaster could see if somebody else uses his picture without permission – or he could see if he (perhaps) uses a picture without permission (with a hint that he probably made a mistake).
    Best wishes

  52. Here’s my list:

    An overall website health indicator. (Perhaps from some type of aggregate of all other data.) I just think it would be useful to get an overall “grade”.

    Permanent sitelink demotions.

    Show content keyphrases and their significance.

  53. 1.) Accurate data. We’ve lost 90% of our keyword volume from Google switching to SSL, and it’s not properly being replaced in GWT. I’m showing 300 visits for a term per day in analytics (representing only 10% of actual entrances for that term) and how many does Google Webmaster Tools say I have for that keyword? 5.
    2.) Better link reports. We’re told that nofollowed links don’t matter and that we should focus on analyzing links in Google Webmaster tools….. which contains nofollowed links. Yeah, that makes sense.
    3.) Notification when Google thinks we have wrong / broken canonical tags. It’s great that Google ignores tags it thinks are bad, but it would be awesome for us to be notified so that we can fix them.
    4.) The number of indexed pages by directory. It’s nice that you tell me that there are a couple million more pages indexed than expected. Any chance you can help me track them down?
    5.) DNS based verification that spans subdomains + the ability to combine subdomains into a single report.
    6.) Sitelink Browser. The old tool worked perfectly and I understand that as you add more sitelinks to directories, it needed to change…. but I should be able to look and see if there is anything that I don’t want in there without doing a hundred searches.
    7.) How about you actually tell us what links are bad so that we can get back to creating awesome content and services instead of wasting time using third party services based on flawed criteria to remove possibly good yet possibly bad links from my site? Instead of adding value to the web, I’m shooting at my feet hoping I miss.

  54. How about a change history with dates of algo changes? I’m not asking you to be specific, à la giving away secrets. It would be good to have a history so we can compare it with any changes we might see in traffic, rankings, etc.

  55. Since we just lost all our rich snippets (except authorship): alert messages in case something like this happens – similar to the already existing issue alerts.

  56. I can only echo so many other pleas for greater transparency. I have never deliberately tried to falsify my site content for better ranking, it is just a good honest niche resource site, loaded to the gunnels with quality content, and only linked to by other sites who value my content, or niche-specific directories.

    Yet, like so many others, I have seen my organic traffic plummet since August 2013. I am lucky in so far that I can be certain that a penalty is in place, because an incidental full site page rename exercise resulted in the penalty being lifted until it was re-indexed and the (harsh – 75% reduction!) penalty re-applied.

    But even though I am certain a penalty has been applied, I can gain no clue as to why! My assumption is that the algorithm is applying to some sites in error, but if that really is the case, how can anyone get a message back to ‘Big Brother’ to tell them they have cocked up?

    You are brave enough to stick your head above the parapet, and sure enough you take plenty of flak, but any large organisation needs feedback from the bottom in order to function properly. I can easily understand that if Google gives too much away it opens the door for bad sites to manipulate their rankings, but surely there is a way to provide some limited guidance in WMT to help the genuine?

    And finally, how about a method to trigger a site ‘re-evaluation’ in order to get a penalty lifted, so we know if we have actually done something right? The message “fix your site then wait until the next update” is not very helpful.

  57. 1. Timespan extended to (at least) 12 months
    2. Better data accuracy (I’ve noticed about 15% data loss compared to Google Analytics, which itself is losing a good 15%–25% of the real search traffic)
    3. Better Google Analytics integration (for example, the ability to aggregate data on a monthly/weekly basis and not only daily)
    4. Ability to drill down from keyword impressions/clicks into website pages and vice versa

    Enough is enough.


  58. Hi Matt,

    I would love to be able to know what people are clicking on – whether it’s the headline, image, video strip etc. from Google news. This would help us know what works in terms of photos.

    Also, it would help if you could provide us with example sites that do a good job with certain types of code, such as schema.org, accessibility, or ARIA.

    Thanks so much,

  59. One thing and one thing only: The full merging of Webmaster Tools into Google/Universal Analytics.

    The reason is plain and simple: The amount of effort, work, and change that goes into making Google Analytics an amazing product makes it seem like GWT is being left to rot. I believe that the resources (and data servers) from the Analytics team would then mean far more positive changes going forward than from any other request.

    It would lead to a team that is heavily focused on data leading to the following changes:
    -Exact data numbers rather than ranges or limited sets of data.
    -Turning UA into a form of log-server analytics system (basically having Google crawler data)
    -Better UX
    -Ability to match up multiple GWT subdomains into one profile of Analytics
    -Segmentation of GWT data
    -Crawl and Index data
    -and so much more!

    • Hi Micah. I don’t think that’s feasible considering that Analytics is an opt-in tool… and has to remain as such, otherwise webmasters and users would start complaining about stolen data and privacy.

  60. – more robust “links to your site” section (updated/fresh link data and a way to disavow links right from that section, vs. the current disavow tool process that gives no information/updates after you submit and isn’t as simple for SMBs to figure out)
    – human support and a number we can call to talk to someone, just like the AdWords support team (which is great, btw)
    – better security issues section – either someone we can talk to when a site was hacked and Google hasn’t recognized it yet, or a way to tell Google a malware/hack issue is resolved (you can’t do this when Google doesn’t show it in WMT but is still showing the site with a “warning message” under the URL in the SERPs)

  61. More feedback and transparency for webmasters who really struggle to provide a good experience and quality to their website visitors. For example: the disavow tool. How does it work? What’s the procedure? Any information on the outcome? None.

    (Personally I haven’t used the disavow tool and I hope I will not have to).

    • Kystian, Google has said that the disavow tool effectively puts a nofollow on any links that are added to it. Although Google hasn’t published the procedure or information on the outcome, a lot of people who have used it have detailed the process, so it is quite well understood now.

  62. Hi Matt, I’m coming at this from the perspective of an analyst who is relying more on WMT in light of keyword reporting changes. A couple ideas:

    1.) Flexible date range comparison feature similar to Google Analytics. Would save me the time of exporting to Excel and doing this manually with pivot tables. The “change” option in the report is helpful, but not super specific and requires constant monitoring so you don’t miss an update. Along with this, more historical data would be awesome.

    2.) Aggregated statistics concerning keyword/page position, supplementing the top 10-20 results with deeper context into the long tail. For example, a graph showing click contribution by keyword rank or percentile, or the average position of keywords containing “product XYZ” over time.



  63. You and other Googlers have been encouraging site owners to add structured markup to our sites both through public pronouncements and by adding rich snippets. Adding the markup seems like a good idea, but when I talk to business managers about it the usual response is “what is the ROI?”

    There are lots of posts about rich snippet ROI, but they just show correlations and don’t give me the tools to go to the boss and say with confidence that adding structured markup is worth the time. GWT seems like the right place to show data on the effect of rich snippets on clicks.

    I’d like to see stats for the type of rich snippet, the number of clicks on the actual snippet, the number of clicks on results that have rich snippets and, ideally, the average click thru rate at that position for the query or for a bucket of similar queries.

  64. 1) Allow the selection of multiple countries for geographic targets instead of just one country. We are a global company, and will eventually set up URLs for the more than 50 countries we serve. But it’ll take us a while to get there, and this change would allow us to target the appropriate countries associated with each of our 6 web regions.

    2) A tool to convert from http to https or vice versa. We changed from http to https 7 weeks ago, and made sure to use 301 redirects and canonical URLs. We created a new domain in WT with the https, but can’t use the Google Change of Address tool to create a change of address for this new domain. And now the number of rows in Google’s index since the change has inexplicably been halved. We did not experience this problem with Bing; BWT has a tool for notification of domain name changes, and it seems to have worked.

    • We figured out yesterday why our rows dropped from Google’s index; had nothing to do with Google. It was an accidental nofollow.

  65. When we rebuild our website we always have a lot of 301 redirects taking people from old URLs to new. What would be good is to see which 301 redirects are still being used after a time, so we know when we can remove them. Sometimes we have so many 301s that it makes the server slow.
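In the meantime, that question can be answered from the server’s own access logs; here is a rough sketch. The regex assumes an Apache/Nginx combined-format log, and the sample entries are invented:

```python
import re
from collections import Counter

# Matches the request path and status code in combined-format
# log lines; a simplification for illustration.
LOG_RE = re.compile(r'"[A-Z]+ (\S+) \S+" (\d{3})')

def redirect_hits(log_lines):
    """Count how often each path was answered with a 301.

    Paths that stop appearing here over time are redirects
    that are probably safe to retire.
    """
    hits = Counter()
    for line in log_lines:
        m = LOG_RE.search(line)
        if m and m.group(2) == "301":
            hits[m.group(1)] += 1
    return hits

# Invented sample log lines:
sample = [
    '1.2.3.4 - - [x] "GET /old-page HTTP/1.1" 301 0 "-" "-"',
    '1.2.3.4 - - [x] "GET /new-page HTTP/1.1" 200 512 "-" "-"',
    '5.6.7.8 - - [x] "GET /old-page HTTP/1.1" 301 0 "-" "-"',
]
print(redirect_hits(sample))  # /old-page was hit twice
```

Running this periodically over a month of logs shows which old URLs still receive traffic; redirects with zero hits are candidates for removal.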

  66. Social network integration. Social network markup validation.

  67. It would be very helpful to know what’s working on the site and what would Google like to see more of 🙂

  68. For the sitemap I would like to see the URLs that are currently indexed, not just a number. A number out of how many were submitted does not help me understand which ones Google is not seeing.


  69. More data in the links to your site report. I’m approaching this more from a manual penalty perspective. I’d love to see anchor text, alt text, and whether a link is follow or nofollow. Anchor text is an easy way to identify spammy links pointing to your site and can help webmasters clean up their link profiles much more easily. Also, knowing which links are nofollow could help identify other problems occurring for the site or cut down the amount of time spent removing links.

    When a reconsideration request is denied, I’d like to have more data. While I’m extremely grateful for the two to three example links we receive now, you may not get the full picture if the links provided are not diverse enough. For example, if the links that triggered your penalty are article submissions and directories but the reviewer only provided article submission links as examples, you might not approach your link profile review in the correct way. Why not show all of the links the reviewer checked when judging whether your reconsideration is approved or denied? I understand that this might help people manipulate their penalties, but I truly believe it would do more good than harm.

    Algorithmic penalty flags in Webmaster Tools would be icing on the cake.

    Thanks for taking people’s feedback and attempting to improve Webmaster Tools.

  70. A list or sample list of what pages Google has indexed (more than just site: searching Google or looking in GA landing pages). It would be helpful for large eCommerce sites to diagnose indexation and crawling issues, especially for when a site has too many pages indexed.

  71. I would like to see a website quality score or grade, with specific actions to take to improve your score for a quality website.

    Also, more importantly, automatic integration with Google Analytics. My small business clients often sign up for one or the other and don’t integrate the two tools for better data reporting. Both webmaster tools and Analytics are becoming essential to have active for every site.

  72. I know this is boring and prosaic, but we need Webmaster tools to provide some administrative capability. As a large global enterprise, we have hundreds of domains and subdomains and dozens of users.
    It would be useful to have a user admin page where we could map users against sites and quickly add/delete a user across multiple sites. Some way to create hierarchy would also be helpful – say, all German sites, or all sites in vertical market Y (low priority compared to basic admin, but the ability to roll up some high-level insights inside that structure – say, total impressions and clicks – would be helpful).

    I would also strongly support your last bullet – better ways to identify country_language / CMS across the globe. With millions of pages and thousands of sites, hreflang is challenging – pages change all the time and do not always have a match in every locale so writing rules is not a trivial matter. Even when we have our hreflang tags correct and WMT set to a country, Google cross-country SERP results are not always good so additional facilities would be appreciated.

    • Even if you don’t have hundreds of domains this is challenging. It can be very frustrating when a ccTLD isn’t enough to get Google to rank the ccTLD as primary in that regional engine.

  73. API, API, API

    All the data that’s already available by clicking 3-4 times should be available as an API. Today, if you want to get a list of all the links to your site – with source and destination, you need to effectively click thousands of times, even for a small site.

    Once the API allows you to pull the data that’s already available, the second step should be to increase the data / remove the limits. As it stands, some items are sampled (usually limited to 1,000 items per data set) or date-limited (only going back 90 days). But first we need a way to get the data out; then, the more data the merrier.

  74. To see responses to disavow files and paid-backlink reports. For disavows: we do study our work, and we disavow the links that we cannot remove. It would be great if we could at least see a notification like “this effort is not enough” or “we are looking into it.”

    For paid-backlink reports: we try to help Google improve its search algorithm, but we do not get any notification. At least you could say, after checking, “the report that you made helped us to see some different spam methods” or “the report that you made is not against our guidelines.”

    WMT helps us to see more detailed data and lets us take some actions, but we cannot see any response to our actions.

  75. Keyphrase position tracker. So we could submit say 20 phrases and see where we SERP at.

    • Most things already being mentioned, here’s what I’d vote on:

      Send Google “fat pings” of content before publishing it on the web, to make it easier for Google to tell where content appeared first on the web.

      Access to one year of data
      Full, trustworthy coverage in every data set, not just samples.
      Full list of links considered by Google as artificial
      List of indexed/blocked URLs
      Pages with identical content
      Algorithmic penalties, i.e. adjustments that aren’t natural
      Allow the selection of multiple countries for geographic targets instead of just one country.
      Bulk URL removal requests.

  76. Hi Matt –

    I don’t know how crazy this idea seems, but we are always working hard to make our site more compelling, appealing, faster, and smarter. We are proud of our work, and then our traffic drops. Was it us, or was it a Google algo change? Is there some way webmasters can see something – whether it’s an SEO score, architecture score, ranking score… just something that can give some guidance that it was our internal changes that caused a negative impact, rather than an algo change that would have happened regardless of any recent changes we made to our site?

  77. As long as Google wants us to “disavow” links we deem “bad,” why not make the process less painful? On the backlink reports, why not add a checkbox or button to disavow with one click? Also, offer a page that displays all disavowed links with a “remove from list” button.

    Also, I am seeing in some cases 404 errors generated from spam links, specifically linking to NFL-jersey pages that are non-existent (the site has nothing to do with the NFL). They are all negative-SEO attempts by competitors. Having a disavow button next to the link in the “linked from” tab would be hugely helpful for this and other situations.

  78. GWT should offer the ability to do bulk URL removal requests directly from the dashboard instead of one-by-one. (I regularly have to action URL removals for legal killed stories for a major news organisation and this would make me extremely happy)

  79. There is a badass guy who always copies what I add to my site. His PageRank is higher than mine? How funny is that?
    Something to monitor or stop this?

  80. Here’s one, although it isn’t necessarily a Webmaster Tools feature. I have a custom 404 on my site, and it does return the correct 404 code. It doesn’t fire when certain characters are part of the URL (e.g. URLs ending in ..). Those usually come from some idiot bot or script that creates links that don’t exist (e.g. webstatsdomain.com). Somehow, Google is attempting to traverse these links in turn and hitting the default 404 page instead. In those circumstances, it would be nice either to flag it in some way so we can say “this isn’t part of the site at all and never has been,” or for Google to realize that the 404 is not the “normal” custom 404 from the site and go no further.

    This is a strange case, but I suspect I’m not the only one.
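    For anyone wanting to monitor the situation described above themselves, the distinction between the site’s custom 404 and the server’s default fallback can be checked by script. This is a hypothetical helper, not anything Google provides; it assumes you know a marker string that appears only in your own custom template:

```python
# Sketch of a workaround (hypothetical helper, not a Google API):
# detect when a bogus URL is served by the server's default error page
# instead of the site's custom 404 template, by checking the status
# code and looking for a marker string that only the custom template
# contains.

CUSTOM_404_MARKER = "Sorry, we couldn't find that page"  # assumed snippet

def classify_404(status_code, body):
    if status_code != 404:
        return "not-a-404"      # e.g. a soft 200 for a missing page
    if CUSTOM_404_MARKER in body:
        return "custom-404"     # the site's own template fired
    return "default-404"        # server fallback, as with URLs ending ".."

print(classify_404(404, "<h1>Sorry, we couldn't find that page</h1>"))
print(classify_404(404, "<h1>404 Not Found</h1><hr>nginx"))
print(classify_404(200, "<h1>Oops</h1>"))
```

    Run against a list of known-bogus URLs, this flags exactly the cases where the custom handler is being bypassed.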

  81. Great post and thanks for your outreach. I love the idea of reporting tools that evaluate duplicate content that appears on-site as well as across the web. Improved mobile reporting would be awesome too.

    • I second this. Think a “Copyscape”-type tool. When people copy your material, why not flag it somewhere in WMT? This would be incredibly helpful, and I imagine super easy for Google: you spider a site, you see it is a duplicate, and you send a message or trigger something in WMT to notify whoever you believe is the original owner.

      I suppose this could cause issues if you thought you knew the original owner but in fact it was someone else…
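    For reference, a rough sense of how a “Copyscape-type” duplicate check can work, using standard text shingling (no claim that this is what Google or Copyscape actually use): split each document into overlapping word n-grams and compare the sets with Jaccard similarity.

```python
# A rough sketch of near-duplicate detection via shingling: break each
# document into overlapping word n-grams ("shingles") and compare the
# sets with Jaccard similarity. High similarity suggests copied text.

def shingles(text, n=3):
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a, b):
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

original  = "the quick brown fox jumps over the lazy dog near the river bank"
scraped   = "the quick brown fox jumps over the lazy dog near the river bend"
unrelated = "entirely different content about webmaster tools and sitemaps here"

sim_copy = jaccard(shingles(original), shingles(scraped))
sim_none = jaccard(shingles(original), shingles(unrelated))
print(round(sim_copy, 2), round(sim_none, 2))
```

    A scraped page that changes only a word or two still scores very high, while unrelated text scores near zero, which is what makes this family of techniques practical at scale.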

  82. Hi Matt,
    I’m quite happy with Google Webmaster Tools. But when I use the Fetch as Google option, it doesn’t show an instant result in Google’s index. Can you explain that?

  83. Hi Matt
    Please integrate Places into GWT.
    Surely there is a more efficient system for webmasters to verify physical addresses than sending a postcard?
    I have a 20% success rate for my clients when it comes to the postcard actually being delivered, for the following reasons:
    – Postal rather than physical address. In NZ, all medium-to-large business mail uses PO boxes rather than street addresses.
    – Complex locations. Many of my clients’ locations are kiosks / stores within major airports, with the actual rental-car depots several km away. I want to direct customers to the official airport kiosk (pin placement), but I know that in this case there is a 99.99% chance that the postcard won’t get through.
    I was delighted to see a link to Google Places in “Other Resources”, but this is just a link to the same old frustration, not the new and effective GWT tool that I crave.

  84. In addition to what Michael Haley mentioned above…

    I would sure like to receive more information as to why our site was dropped 4 pages when it puts our local competition (now ranking top 5 on the 1st page) to shame. We’ve been at this a long time, and before this Hummingbird mess our web design company’s site was #2 on the 1st page of results for local search terms related to the industry. We rank #1 with Yahoo/Bing. We provide great content, and it took a lot of time and energy to rank well; to see it all get washed away in favor of far worse sites with no content just doesn’t make any sense. We can’t stop other sites from linking to us, if that’s an issue with Hummingbird. WMT should offer facts/information on the reasons behind ranking drops.

    Thanks for the opportunity to respond.


  85. Hi Matt, I am ordering my preferences by importance from your list:

    Better tools for detecting or reporting duplicate content or scrapers.

    Indeed, since you mentioned that duplicate content is a factor, a duplicate content report would be needed.

    Show the source pages that link to your 404 pages, so you can contact other sites and ask if they want to fix their broken links. Or almost as nice: tell the pages on your website that lead to 404s or broken links, so that site owners can fix their own broken links.

    Even better: an automatic way of notifying other Webmaster Tools owners that their links pointing to ours are broken. Or a way to manually report those links to the appropriate Webmaster Tools account directly with the push of a button.
    That would avoid the whole “so you can contact other sites and ask if they want to fix their broken links”,
    which is often an unrealistic nightmare to go through and takes forever to accomplish…
    Given the existing Webmaster Tools infrastructure, Google is better off automating this process and having the proper webmaster counterparts fix their broken links through Webmaster Tools in an anonymous fashion, to reduce 404s across the web.

    Better or faster bulk url removal (maybe pages that match a specific phrase?).

    Sure. Though I am thinking a checkbox next to each link for disavowing crappy, irrelevant links from the “Links to Your Site” section would be a higher priority.

    – Refreshing the existing data in Webmaster Tools faster or better.
    Yes, particularly Internal Links and Content Keywords data, especially when a whole new structure is discovered, i.e. a site redesign or re-launch.

    – Show pages that don’t validate.
    A basic site validation would be nice, but don’t go as far as validating CSS, dynamic content or iframes, though.
    They often don’t validate, and non-critical validation errors which can’t be fixed would be unnecessary notifications.

    An additional suggestion would be to inform sites that don’t gzip their pages, which would help speed up the rate at which people enable gzip on their sites and/or servers. Many shared hosts and sites still don’t do that.
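    The gzip check is cheap for any crawler that already fetches pages: request with `Accept-Encoding: gzip` and inspect the `Content-Encoding` response header. The header inspection is shown below as a pure function for clarity; the header semantics are standard HTTP, the function itself is illustrative:

```python
# Sketch of the gzip suggestion: a crawler that already fetches pages
# could flag sites that don't compress responses just by looking at
# the Content-Encoding header returned for a request that advertised
# "Accept-Encoding: gzip".

def serves_gzip(response_headers):
    # header names (and the encoding token) are case-insensitive in HTTP
    headers = {k.lower(): v.lower() for k, v in response_headers.items()}
    return "gzip" in headers.get("content-encoding", "")

print(serves_gzip({"Content-Encoding": "gzip", "Content-Type": "text/html"}))
print(serves_gzip({"Content-Type": "text/html"}))
```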


  87. We desperately need quantitative metrics or indicators to help us understand the quality of our content! I have beat Amit Singhal’s list to death looking for answers. All of his 23 points are helpful, but they are all so subjective.

    Business owners and webmasters need analytics to help us figure out what our content is lacking! It could be as simple as giving a pass/fail for the content biggies – authority, trust, originality. We don’t need the secret sauce recipe, just a hint at the ingredients.

  88. Hi Matt, why are the office hours linked in your post all in Europe? Where are the US office hours for webmasters? https://sites.google.com/site/webmasterhelpforum/en/office-hours

  89. Hi,

    My points are below.

    It would be great if you allowed webmasters to download a complete list of indexed URLs.
    A fast link-removal service.
    Concrete reasons behind a manual penalty (not for Penguin, because when Penguin hits the reason is obvious).
    It would be great if you flagged pages that have duplicate content or an excessive number of links.


  90. I would like to see “suggestions to improve” that are much better than they are currently, in line with the Google algorithm and Panda.

  91. “The full list of links considered by Google as artificial” Agree big-time with Arnaud…

    Would like to see “<10” go bye-bye… Example: average position 2.1, impressions “<10” (i.e. ZERO) – really! Pages of relevant listings not being displayed only makes webmasters mad.
    This is where listings should be showing up, and they are not. An addition to the tool should be the why. Not only the best-tasting tuna gets to be Star-Kist!

    Google has tried to make everything a mystery, making a mountain out of a molehill.
    It is all very simple. “Do exactly what I say, please” should be more like “follow these steps and your listing will display here”. If deemed artificial, fix by this date – or here is your spanking for x amount of time until the issue is fixed.

    May God bless the hard-working, honest SEOs and small business owners who can’t find their listings after following Google guidelines.

    No one expects favoritism. Competitive organic search results are all anyone can ask for their hard work, along with tools that “work” and explain exactly what needs to be done to reach client goals other than AdWords!

    Bing and Yahoo don't play games!

  92. Hi Matt…
    Great, great improvements by adding more and more options, organic keyword data and spam reports…

  93. Hi Matt, RE: precise reason for algo penalty in WMT – please 🙂

    I am a small business owner, and DO have a Panda algorithmic penalty right now (July 11 was the hit). It has taken me MONTHS to figure out and understand what I have done wrong. I want a precise reason in WMT that I can fix quickly, direct from Google. I would rather have gotten a manual spam action. REALLY! I read that businesses get that fixed quickly, because they know exactly what to fix and how, and can do a reconsideration request.

    I have used the forums and participated in the live John Mueller hangouts, and got some good feedback, but it required me to keep asking questions and circling around issues for months until I narrowed it down myself, which has taken me away from running my business.

    What Google fails to realize is that business owners who do their own content for their websites may make UNINTENTIONAL mistakes that result in a Panda penalty. Google is punishing businesses that do not know what they have done wrong and now must pay out huge bucks to SEOs to figure it out. Just saying.

    Thanks for allowing us to post our requests/opinions here.


  94. If these kinds of updates were made in Google Webmaster Tools, all website owners would feel secure and comfortable about their data. I would recommend including these features to improve search quality and user experience.

  95. The most important things are EVERY backlink and the traffic keywords.

  96. First off, your list reads like a dream come true; it would be nice to have all this done, and I am sure you will, sooner or later. Sadly, my traffic dropped dramatically on September 23rd (the site dropped to the second page of the SERPs), and on that day and for one month afterwards Google Webmaster Tools showed one notification on my account, nicely highlighted, reading something like this: “Your traffic had a sharp increase on September 19th.” It felt ironic and it hurt.
    Anything that would help smaller sites with big hearts would be great. Not directly related, but one idea is to create a mini-Google for small sites that don’t have a chance competing with the big dogs. From Madison, WI.

  97. An overall backlink profile quality score; a content submission option (text, PDF, doc files, etc.) before making content live on the main site, to claim authorship; a score against each and every link to help webmasters identify good links and bad links; a geo-targeted keyword rank checker; and finally the “not provided” data in Analytics.

  98. It would be very helpful to have authorship stats available for all of the authors on a web site. Right now Webmaster Tools only shows my personal authorship stats — I would like to see the stats for all of the authors writing for the site.

  99. Since Google is more and more able to judge the quality of pages, it would be good to get some type of alert that the website has fallen in quality, or a “low quality alert”. I mean, we have tons of link information and we get notifications if links are unnatural – why not for the content?


  100. Hi Matt, indeed, as some other users pointed out, algo penalty or sandbox messages would be helpful. Even more helpful would be something like the early warning that exists in the AdSense system: “you have 3 days to fix errors or else”.
    I believe that most small-biz owners do not have the time or knowledge to figure out which of many possible errors they made when traffic drops, in some cases up to 70%, because they are focusing on users.
    It would also be great if WMT would update a little faster.

  101. That’s a great move from Google. Now we see fresher and more accurate data.

  102. Absolutely the second one you mention, Matt.

    Clear reporting on submitted spam reports. It would be useful to know when my spam report will be evaluated and, eventually, which action was taken or not taken, and why.

    Another good idea could be to make the ranking report data available for more than a few months or one year.

    Sure, I’d also like to see better stats about my authorship and G+ profile.

  103. I like the suggestion from Syed – a backlink profile score and maybe an on-site score, though that probably defeats the whole ideology of great content for visitors rather than SEO. Still, I quite like the idea and know many others would, too.

    Also, maybe more of our links, so we don’t need to rely on third parties.

  104. 1. A complete list of backlinks – currently webmasters have to take help from other backlink-finder tools to create a complete list.
    2. Complete data on search queries and clicks; currently there is a huge difference between the Analytics and Webmaster Tools data.
    3. Most important: changes to the disavow tool. If we disavowed certain links 4 months ago, there is currently no update on what happened to those links. A proper indication is needed of the outcome of the links in the disavow file. Once links have been disavowed, they should be removed from the “Links to your site” section of Google Webmaster Tools.
    4. If your site gets hit by an algorithmic penalty, the site owner should get a notification for it, along with what can be done to revoke the penalty. Currently this notification exists only for manual actions.
    5. A longer time period for search queries and clicks – at least 6 months to 1 year, versus the current 3 months.

    Thanks a lot to the Google team for this post, so that we can share our views. :)

  105. I would like to see a full export of crawl errors (with filtering on the kind of error), plus search functionality in the dashboard after login to find the website you’re looking for. Insights into “toxic” links would also be much appreciated.

  106. Thanks for asking Matt –

    1. Would like to see algorithmic actions as well as manual actions.

    2. Full historical data (or at least a year’s worth). C’mon, you guys have the storage!

    3. I really liked the “connected pages” feature Bing recently added. I’ve only just started playing with it, but it seems great.

    4. Site & page speed scoring. A WMT section that would display a list of pages that respond slowly or possibly have a history of responding slowly.

    5. More submissions in Fetch as Google.

    I think everything else is covered here. Really excited to see some new features!

  107. Good points from you Matt.

    1. More fresh, up-to-date data.
    2. Information about website usability. Google speaks a lot about creating value for the user, and usability is one important aspect. It would be interesting to get more information and feedback about how Google evaluates this.
    3. An improved API with more information.
    4. Information about algorithmic updates (timeline + info).
    5. A tool to see what a search result looks like in other countries, from their data centres.

    Looking forward to 2014!

  108. Dear Matt,

    I’d like to see these features added to Webmaster Tools in 2014:

    – Daily updates on impressions, clicks, links and keyword positions.
    – Differentiate good links and bad links based on Google guidelines, to help identify negative links (if someone is building negative links to my website) and disavow the bad ones.
    – If my website violates Google guidelines, Webmaster Tools should show detailed info on why the site was punished by Google – the EXACT point of violation, not just “quality”, “unnatural links”, etc., but “this is the link that violates Google guidelines”, “this is the keyword that constitutes keyword stuffing on your website”, and so on.
    – Simple live website traffic, not an advanced report.
    – Add lost links.
    – If links were disavowed, Webmaster Tools could report whether they were disavowed or not.

    And much more!

    Thanks 🙂

  109. Old cow hand on the rio grande

    More transparency. Your algo has matured enough now to deal w/ old school spam.

    If people are being filtered by an algo then tell them explicitly.

    Maybe show a “quality” gauge w/ danger zones – show a domain’s relative position on the gauge, and whether its quality perception score improved on the previous month.

    You could insulate yourselves by ensuring that you only revealed such a gauge/meter to domains that met a set of internal metrics relative to location, theme, age, previous owners, penalties, having a Twitter/FB/G+ page – clearly identifiable brands or individuals behind small businesses, perhaps.

    You’d definitely help silence folks who believe that part of Google’s agenda is to confuse and obfuscate, building a perception that spending marketing budget on SEO is high-risk.

  110. Steven Macdonald

    A Google webmaster certification program similar to how the AdWords team has one.

    We need to set a high standard, as right now there are too many amateur SEOs hurting the industry and business owners’ livelihoods. Moz might be working on something, and there are other qualifications from Market Motive or HubSpot, but ideally it should come from Google.

    You have Analytics certification. You have AdWords certification. It only makes sense to launch a Webmaster certification (seeing as they all integrate anyhow).

  111. And one more thing, Matt…

    When I crawled my website using the Fetch as Google tool in WMT, it worked fine and showed the cache date as that day’s date, but after some days the cache date reverted. Why? Is it a bug in the Google crawler?

    Thanks 🙂

  112. Keyword data… we want more organic keyword data, please. Similar to what was provided in Analytics.

  113. Hello Matt,

    The two most important changes I would like to see in Webmaster Tools are:
    1. Although Webmaster Tools currently does show keyword rankings, I would like it to allow users to enter keywords manually (the keywords clients want their website ranked for on Google). We have many paid tools, such as Moz, that provide this information, but it would be great to get it embedded in Webmaster Tools.

    2. Google updates its information/algorithms from time to time; it would help if Webmaster Tools provided the changes in detail, so users could work on their websites effectively.

    Thanks and Regards
    Neeraj Modani

  114. Dawid Dutkiewicz

    We could use an API for keyword and backlink reports 🙂 Thanks to that, I could react faster to ranking drops and rises and to possible spammy link building pointed at my site.

    And it would be nice if you gave us more information on which internal pages are duplicated and Panda-prone. Siteliner.com has such a feature, and it helps create a better user experience and helps with on-site optimisation 🙂

  115. It would be nice to have:

    1) Google algorithm updates in WMT, with a description of what changed.
    2) Suggestions about your content when Google believes it is spam or low quality, and why.
    3) More accurate keyword ranking data; at the moment it is quite odd.
    4) Absolute and relative Content Keywords views, to understand globally how relevant the site is.

    Thanks 🙂

  116. My #1 is “Improved reporting of spam”. To then be able to develop your own credibility would be an excellent extension. Most of us want the best results and so you could quickly build a set of commanders in the field to assist you with manual intervention.

  117. – Google keyword rankings
    – Review stars microdata

  118. 1. More backlink data
    2. Broken hyperlinks report
    3. Easy way to download sitemap URLs not indexed
    4. Easy way to download duplicate URLs submitted in sitemap
    5. Easy way to download details of images with missing Alt tags
    6. SEO recommendations by page – similar to Bing Webmaster tools
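    Item 5 doesn’t need to wait for a WMT report; the check can be run locally with Python’s standard html.parser. A sketch (note that an empty alt is legitimate for decorative images, so a real report would want to treat that case separately):

```python
# Sketch: find <img> tags with a missing or empty alt attribute using
# only the standard library's HTML parser.

from html.parser import HTMLParser

class MissingAltFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):  # absent or empty alt
                self.missing.append(attrs.get("src", "(no src)"))

page = ('<img src="logo.png" alt="Logo">'
        '<img src="banner.jpg">'
        '<img src="spacer.gif" alt="">')
finder = MissingAltFinder()
finder.feed(page)
print(finder.missing)
```

    Fed with each crawled page in turn, this produces exactly the downloadable list the comment asks for.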

  119. Hi Matt,
    I hope your vacation is going well!
    About Google Webmaster Tools… your suggestions are all amazing, but I would like to see deeper and more accurate keyword data, and I also suggest fixing the Google Data Highlighter; there are too many bugs in it.

    • Anna, can you talk a bit more about what bugs you see in the Google Data Highlighter?

      • I only saw one bug: the Data Highlighter didn’t find any images.
        Not on my site –
        it was a client’s site; on one other site it saw nothing, not even text.
        I think on a well-made site there is no problem. I saw no other bugs in the Data Highlighter, and not even one of these two bugs on any other site.
        I work for a web design agency, so I have many sites in WMT.

  120. Jerome Verstrynge

    Hi Matt,

    Thanks for asking us for feedback:
    – List images missing an alt attribute (in an SEO improvement opportunities section, for example).
    – Make responsive design a ranking factor to serve better results for smartphones and tablets. Reward those making efforts to serve customers better.
    – List pages not implementing responsive design.
    – List submitted sitemap pages Google won’t index. If there is a specific reason, tell it too. By taking a look at them, webmasters may identify issues to solve.
    – If someone breaks a rule (accidentally or not) from the guidelines, and Google penalizes them for that, Google should inform such people in GWT, so they can take action. Provide a list of pages breaking rules, with the corresponding list of issues. Honest mistakes do happen, people are ready to learn.
    – Sometimes, pages are indexed, but never rank for a keyword. If Google thinks a page is too low in quality or value for ranking, tell it in GWT so people can work on improving them, merging them or marking them with noindex, follow.

    It would be great if the overall priority was shifting from fighting spammers to empowering honest webmasters.


  121. Hi Matt,
    Very nice of you to reach out like this – I hope you don’t get too swamped by people’s preferences… 😉

    I agree with the guy above about a certification program for webmasters, like what you have for Analytics and AdWords, and taking the opportunity to also broaden the Webmaster Tools platform.

    I also think that connecting your different Google services for sites, like Google+ and local pages, as well as authorship markup, needs to become significantly easier. I would do it for ourselves and our clients not because of any imagined SEO benefits, but because it’s just nice to do so…

  122. Hi Matt 🙂

    I like the above points you mention! These can really help us.
    Although WMT should show only the BAD URLs, which are holding our website back.

    And everyone knows online businesses are growing via Google.
    So Google should inform them where they’re going wrong with the guidelines, so that they can correct things instead of getting a penalty over time.

    As it is very difficult to come up again.


  123. Merging WMT and Analytics into one complete dashboard for your site(s). It makes so much sense it’s a wonder it hasn’t happened yet.

  124. For me, the most important changes would be to the amount of backlink data we receive, particularly when a manual penalty has been issued. The vast majority of penalties I see still do not give example links or any insight at all, and some example links that are given do not appear anywhere in the “Links to your site” section. More transparency and support for sites prepared to spend time and effort cleaning up their act after years of being ill-advised about links.

  125. If Google really makes the updates in the list above, the business owners using Webmaster Tools to enhance their online businesses will feel more secure and will be able to rely on it more than they do now. As a marketing executive for an online business myself, I would request that Google take the list above into consideration and present an updated Webmaster Tools to its customers in 2014.

  126. I really want a tab for automated penalties, so that we know which Google update our site was penalized by.
    Also, GWT should provide all backlink data for a website; then we would not need any paid tools.

  127. Notification of Algorithmic penalties like Panda and Penguin.

  128. Could you add a couple of boobies to it?

  129. Matt,
    I’ve just been told that I’m a spammer by your Comment Moderation system.
    I assure you that I’m not. Initially I tried posting this under my normal business email address which I’ve never had problems with before when posting on your blog!

    Anyway back to my WMT request – Earliest Date Discovered for each inbound link.

    Kind Regards,


  130. Hi Matt,

    Thanks for posting this, it’s great to get the opportunity to contribute.

    Personally I think the following pieces of functionality would be extremely useful:
    – A list of URLs that have been detected as having duplicate content, currently I use the duplicate title tags and duplicate meta descriptions reports – but sometimes these don’t catch everything. Identifying duplicate body content would be really helpful.

    – A list of URLs that have been flagged as thin content (if there is such a thing)

    – A list of URLs affected by Panda and/or Penguin

    – Whether the site has been hit with a Panda and/or Penguin based penalty.

    I would love to see what you suggested regarding bulk URL removal. I have a client with 1.35 million URLs that need de-indexing; we’ve currently been waiting 7 weeks for Google to crawl and de-index them. If I could bulk remove with wildcards, that would be great (it’s not all in the same subfolder, but one file with different query strings).
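    Until bulk removal exists, one documented route to de-indexing URL patterns at scale is serving an `X-Robots-Tag: noindex` header on the matching responses, so Googlebot drops those variants as it recrawls them. A hypothetical nginx sketch; the `print=1` query parameter and server name are purely illustrative:

```nginx
# Hypothetical example: mark every response whose query string carries
# a "print=1" parameter as noindex. nginx omits add_header fields with
# an empty value, so only flagged URLs get the header.
map $args $noindex_flag {
    default        "";
    ~(^|&)print=1  "noindex";
}

server {
    listen 80;
    server_name example.com;

    location / {
        add_header X-Robots-Tag $noindex_flag;
        # ... existing proxy/static config ...
    }
}
```

    This still requires Google to recrawl each URL before it drops out, so it would not beat the 7-week wait, but it avoids submitting removals one by one.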



  131. Having the chance to really know what the bad links are.

    A section dedicated to duplicate pages.

  132. Hi Matt,

    Thanks for posting this.
    The most important change I would like to see in Webmaster Tools is exact knowledge about the backlinks.

    Thanks and regards,
    Nishant Singh

  133. I write original content that I post on 2 different blogs. I would like to be able to signal to GWT that those are both my blogs and that I really am the author of the post, without being penalised for duplicate content.
    Of course, we should only be able to do this a limited number of times.

  134. I would like a bulk URL removal feature, as well as a full list of backlinks, so that we do not have to use any other tools.

  135. Detailed instructions on how to get to the number 1 position. That would be *really* helpful! 🙂

  136. A whole new disavow process. That is, a Chrome toolbar extension that, rather than “disavow”, lets us rate a website. We click from inside Webmaster Tools to visit the page and then check the appropriate boxes:

    article site for intentionally created backlinks (includes spun articles)
    directory site that probably scraped our content
    directory site where we created a listing
    site that scraped our content
    we left a blog comment on this page
    someone else made reference to us in a blog comment
    social media site with an account we created
    site that links to our pdf files

    If we were able to rate sites in a fashion like this, we could better give you what you want. We are finding that most of the crummy links to our site just happened… we had nothing to do with them… which sort of means they happened naturally – and though they are crummy… at what point do we disavow a crummy link? How crummy should it be before we disavow, when it occurred naturally? Going through hundreds of links without really knowing what you want is a pain in the butt. But an integrated options box would sure make it easier to return valuable information to you.

  137. A pain point for us is that our domain name is http://www.companywidgetwidget.com but our brand name is displayed as Company Widget and Widget; Google is not producing sitelinks in SERPs when people type in ‘company widget and widget’. It would be good if you could inform Google that your brand has aliases, like acronyms and such – especially for new brands that might not yet have a strong and varied link profile to highlight all these alias signals.

    Barry Schwartz’s points in this post would be a great development – http://www.seroundtable.com/google-algorithmic-penalty-17703.html

  138. Here is my request: you have removed the “Not selected” data from GWT because most people were confused by its meaning. I ask you to profile all human beings on Earth by their IQ and then provide that useful data in GWT again, only to those who have evolved from apes. Thanks!

  139. Would like to know whether the site is a victim of an algo change such as Panda or Penguin. Manual penalties are usually applied to webmasters who intentionally went to the limits of Google’s guidelines. It doesn’t seem fair that those who have been innocent victims of an algo change – largely because former best practices have become worst practices – have absolutely no idea what has gone wrong or how to fix the site. A simple “we’ve detected Panda problems” or “we’ve detected Penguin problems” at least points the webmaster in the right direction. There’s no reason why we have to wander in the wilderness watching our livelihood seep away when Google surely knows what the problem is. Allowing honest webmasters to improve their useful sites would make the web a better place.

  140. I would like to see two things added to GWT functionality. The first is the ability for an agency to easily manage multiple accounts, such as how Google Analytics permission levels have improved in the past quarter. The second is for the sitemap submission to be used as a location indicator: GWT should not keep a copy of the sitemap, but instead reference that location every time the spiders check the sitemap. This would allow webmasters to make changes on the website without having to worry about submitting a new sitemap to GWT every time.
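    The “re-read on every crawl” behavior asked for here is cheap because the sitemap protocol (sitemaps.org) is plain XML, so re-reading is one fetch plus a parse. A sketch of the parsing step with Python’s standard library, run on an inline example document:

```python
# Sketch: extract the <loc> entries from a sitemap using only the
# standard library. The namespace URI is the one fixed by the
# sitemaps.org protocol.

import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.iter(SITEMAP_NS + "loc")]

example = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://example.com/</loc><lastmod>2014-01-10</lastmod></url>
  <url><loc>http://example.com/about</loc></url>
</urlset>"""

print(sitemap_urls(example))
```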

  141. I would like to see more API data available: Search Queries and Backlinks data.

    This would help a lot.

  142. Webmasters need the ability to contact someone at Google after having a manual unnatural-linking penalty revoked. We received an “unnatural linking penalty” on May 28th, 2013 that caused our traffic to drop by 70%; the penalty was revoked on August 12th. Despite having 3,000 original articles, thousands of inbound links and plenty of new content, our site has not seen any traffic return.

    Though Google has notified us the penalty was “revoked”, I believe the penalty was not actually removed, and this needs to be looked into. We don’t even come up #1 for our own name, “Professor’s House”… a one-page Wikipedia bio about a book written in 1925 does! How does a one-page site outrank a site with thousands of original articles and thousands of inbound links?

  143. These two are huge for me. The first slightly modified.

    In an export, show the source pages that link to your 404 pages, so you can contact other sites and ask if they want to fix their broken links.
    Refreshing the existing data in Webmaster Tools faster or better.

    These are some other requests that I’d love to see:

    * Increase the number of issues visible in WMT from 1,000 to {something significantly bigger than 1,000}. Sometimes I find one programmatic driver for an error, and would love to easily be able to wipe them all from the reporting when that driver is corrected.
    * Option to prioritize re-crawl of select issues. Similar to above, if we’ve fixed something that’s going to clear a large number of reported errors, some way to get them re-crawled and updated ahead of other site pages.
    * On “Blocked URLs”, include a downloadable report that shows those URLs.
    * On “URL Parameters”, ability to remove ANY parameter, not just the ones Google auto-identified.
    * Ability to show only Clicks (hide Impressions) on the “Search Queries” chart. On sites with low CTR, that would make it easier to see the day to day variances being reported.
    * “Google News” section, where links to information for webmasters from Google can be directly viewed within WMT. Basically an RSS feed that includes official Google blogs, this blog, etc.

    Thanks for this opportunity!

  144. I’d like to remove the keyword visibility data and instead show a topic relevance category for what Google thinks the website is about and the topics it could potentially tackle.

  145. Algorithmic filter/penalty notifications for things like Penguin.

  146. I’d like to see reports on algorithm penalties – I have seen some sites suffer from unknown penalties where there is absolutely nothing obvious happening, so some actual information about the nature of penalties would be helpful.

  147. +1 for showing people if they currently have algo penalties and not just manual actions against them

  148. More transparency, not only for manual penalties but also for algorithmic penalties. It is often difficult to find the cause of an algorithmic penalty.

  149. Hi Matt,

    Thanks for inviting suggestions in true Google style.

    Since Google Panda & Penguin rolled out, a set of low-quality websites have started linking to good, authoritative websites, practicing negative SEO and demanding money when we send a removal request. It is a sort of blackmail in which they fleece poor webmasters & SMEs of their hard-earned money. Even after you make the payment, they don’t remove your links and demand more money.

    These websites do not have any contact forms and have hidden whois.net IDs. Some sample websites are bestwebyellowpages DOT com, bookmarkexplore DOT com, etc. I am sure these websites appear in multiple disavow requests. It would be great if you could banish links from these websites altogether: instead of punishing the SMEs who are linked on them, remove these links from the Google index this year.

    I have created a list of 1000+ websites which demand money for removing links, and would be glad to share it with you if required.

  150. 1) A publisher rating or “preferred by algorithm” indicator for Google News
    2) If a backlink is deemed inappropriate by the algorithm, indicate the risk factor to webmaster
    3) Option to disavow a link/domain/website category in “Links to your site”
    4) Revoke a disavow, if needed
    5) It would be a fantastic feature to know the response time of disavowing each link – an approximate turn around time to test the impact of disavow

  151. How about accurate backlink data? Why can’t you remove things like DMOZ scrapers, duplicate content, nofollowed links, and “samples” of backlinks? Why not show all the links you’re counting, and none of the ones you’re not?
    I’m sure Google could show us accurate backlinks; why doesn’t it?

  152. Google Webmaster Tools needs to show all broken links, bad links, duplicate content, how much of the site has been flagged as spam, etc.

  153. A function to disavow links by checking a box next to the link in Webmaster Tools.

  154. Matt,

    (Not Provided) is bad for your users, bad for webmasters, etc. Okay, “privacy,” fine. Keep that. Nobody believes Google about that, but keep it anyway.

    Just do this, at least:

    Webmaster Tools has a “sort of” integration with Analytics. Make it better. Let the page-level reports in Analytics tell us what queries are leading people to that page.

    So we can make the page a better answer for the query.

    Benefits: you get to keep pretending that you’re protecting people’s privacy, and we get to make our websites better for users.

    That’d be awesome, thanks.

  155. Search Traffic > Search Queries:
    Clarity on what the Search Query numbers mean. For instance…

    I may have 10,000 Clicks, but underneath that number it reads “Displaying 5,000”. Which 5,000? Are they the most popular, or a random sample? Does the methodology change? Also, why only 5,000?

    For each term, Impressions and Clicks are shown. Does the Impression count include only Google-proper data, or does it include Google News, Google Images, Images in the Knowledge Graph, etc.?

    A whitepaper/online FAQ detailing the methodology for each report would be ideal.

    Crawl > Sitemaps:
    Clarity on the reason behind swings in reported numbers. For example…

    A site has 1,000 videos submitted, and this number doesn’t change. One month it will show 450 videos indexed, the next month will show 800, the next will show 500. None of them expire and all of the URLs are crawlable by search engines.

    The same goes for Search Traffic > Links to your site. One month I’ll see 1,000, then 1,200, then 200, then 800.

    Essentially, I’d like to see more consistency in the data. Currently, due to inconsistencies (as noted above), it’s difficult to trust the data (especially for decision-makers/executives), meaning there is very little appetite for addressing Server errors, site speed, etc.

    Thanks for asking!

  156. A facility to disavow spammy links created by a negative seo campaign by clicking on an option like “disavow this link”

  157. When I fetch as Google, I can choose one URL or all linked URLs.

    I’d like to be able to see the internal linkage of pages as Google sees it, to help identify silos or silo issues and where Google is associating page groups, and to give me insight into crawlability and site structure via possible crawl patterns.
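Until something like this lands in WMT, a rough version of the internal-linkage view can be approximated outside it. The sketch below breadth-first crawls internal links and builds a page-to-links map; the in-memory `site` dict is an assumption standing in for real HTTP fetches. Pages that never appear in the map, like `/orphan` here, are candidates for silo or crawlability problems:

```python
from collections import deque
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags in a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl_map(pages, start):
    """BFS over an in-memory site; returns {page: [internal links]}."""
    graph, queue, seen = {}, deque([start]), {start}
    while queue:
        url = queue.popleft()
        parser = LinkExtractor()
        parser.feed(pages.get(url, ""))
        # Keep only links that point at pages of this site.
        graph[url] = [link for link in parser.links if link in pages]
        for link in graph[url]:
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return graph

# Toy site: /orphan is never linked to, so it won't appear in the map.
site = {
    "/": '<a href="/a">A</a> <a href="/b">B</a>',
    "/a": '<a href="/b">B</a>',
    "/b": '<a href="/">home</a>',
    "/orphan": '<a href="/">home</a>',
}
link_map = crawl_map(site, "/")
```

Comparing the reachable set against the full page list is one cheap way to surface orphaned sections before guessing at how Google groups them.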

  158. Hi Matt,

    Thanks for opening the door for suggestions. First, I think referring keyword data should be transparently and openly provided and not only available to large agencies with budgets to spend tens of thousands of dollars for it. Not going to beat a dead horse here, so I’ll leave it at that.

    Also, I’d like to see better integration with other tools, and the ability to customize alerts by specifying triggers. It would also be a HUGE win to improve reports and data exports. Right now they are very basic, and helpful but not very accurate or timely.

    One more thing; I’d like the ability to report spam from competitors (websites & company names) from the get go so they can be identified before their website is harmed and businesses suffer. It’s just not fair to small businesses when they have no way to defend themselves from shady fast talking SEOs and low brow competitors. The reconsideration “request” is a joke, but a good start. Thanks again 🙂

  159. Controlling Google’s crawl on large sites is difficult; having lists of suspected duplicate or even low-quality pages would help identify/restrict unwanted pages from using up crawl budget. Or even an output number indicating crawl efficiency for a given site (how many duplicate pages get crawled versus unique/indexable ones).

    WMT continental geotargeting, especially for the European Union: allow us to refine the hreflang default down to continental (or political-union) defaults. Why not also refine down to state-level geotargeting for large countries?

    Finally having more control in specifying mobile relationships in WMT would help – linking sites/seeing what Google sees as the mobile equivalent for a given page.

  160. I’d say Google’s No. 1 priority should be to fix existing reporting issues in WMT. For example, “Structured Data” figures under “Search Appearance” often go several weeks without being updated. “Latest” inbound Web links are often out of date, too.

    Beyond that, I like the idea of:

    – Trusted bug or spam reports. “Deputizing” submitters may be overkill, but maybe Google could give priority to reports from searchers who are willing to use their Google IDs.

    – Reports on algorithmic actions (e.g., Panda or Penguin), maybe with some kind of running score that tells site owners if they’re going in the right or wrong direction. I realize that Google can’t give away the recipe for its secret sauce, but having some kind of quality score might ultimately benefit site owners and Google alike.

  161. Would love to see all 14 points you have mentioned, plus a more detailed and structured keyword data report, more detailed information about manual spam penalties and the steps to recover from them, and a notification for any algorithmic penalty. That way site owners can know what is wrong and how to fix it.

  162. Hello Matt,

    I think some of the pre-existing / core features of Google Webmaster Tools could be improved to be more useful.

    Search Traffic>Search Queries:Improve data accuracy for Impressions & Clicks. (e.g. Exactly 22,000 people don’t click one of my pages every single week.)

    Search Traffic>Top Pages: Improve data accuracy for Impressions & Clicks

    Search Traffic>Top Pages: (Expanding Page Details) When expanding a page’s data to see additional keyphrase data, the CTR, Avg Position & Change do not appear for the additional keyphrases; it would help to see all the data relative to the page, even if the numbers are small.

    Search Traffic>Top Pages: Enable a download of a Top Pages report which includes each keyphrase the page received traffic for, with clicks, impressions and CTR data.

    Google Index>Content Keywords:
    Provide an occurrence count for each keyphrase variant; the downloadable table would include this data as well. Breaking out the data could lead one to believe that “Variants encountered” within the reports and GUI would no longer be needed, since the data for each word would be available, but it’s still useful for grouping words, so I would keep it.

    Custom Reports
    The ability to combine data pre-download, such as PAGE, KEYPHRASE and RANKING (for the page and keyphrase combination) plus any other existing data, would be useful.

    Date Range of Archived Data
    It would be very helpful to have data from the entire life of a website instead of just a few months. If Google doesn’t keep an archive that wouldn’t work, so at the very least I would say 2 full years of data should be available. One could argue that WMT is more about ‘now’ and not ‘then’, but this suggestion is only an expansion of pre-existing options.

    Rankings
    WMT should have a tool which provides a rankings report for “Queries” (which it does) and for keyphrases which we are working on ranking for (which it doesn’t). I don’t always need data about 10,000 keyphrases; sometimes I just want to know about 20, 50 or 100, some of which I have no traffic for yet.

    Rankings Part Two
    When people are logged in to their Google Account, on average, what are the rankings for my website?

    Rankings Part Three
    Filters to see what rankings are when users are logged in and meet the criteria defined in demographic filters such as age range, gender, location, etc.

    A feature which allows a user to select multiple date ranges for comparison such as “April 1, 2013 to May 1, 2013” vs “Previous Period” / Another Range.

    Thanks for the opportunity to provide feedback and suggestions. If you would like more details on any of the above, feel free to reach out, I’d be more than happy to help.

  163. Pagination issues – it would be nice to have something looking into them.

  164. For each newly indexed backlink in the ‘Links to your site’ section, an indication of whether it is useful or not.

  165. A health score for indexed pages, based on some aggregation of signals already reported in WMT.

    Some facility for the Webmaster to mark links as either good or bad, directly in the /webmasters/tools/external-links tool would be valuable.

  166. We need quick, detailed results for search data as well as link data.

    Can we expect real-time data availability in 2014?

    Or could Webmaster Tools provide accurate keyword details, or helpful features like keyword suggestions or a graphical progress level in the SERPs?

  167. I would like to see a basic indicator of algorithmic changes that have affected your site. Similar to what Barry Schwartz had mentioned in his post the other day.

  168. Notification of any algorithmic penalty on WMT would be great 🙂

  169. I’d like a simple way to tell Google (Webmaster Tools) that I do not want my site indexed, because I often receive email alerts saying “Warning! Googlebot can’t index your site”; the warning is wrong, since that’s exactly what I want!
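For reference, the standard way to ask crawlers to stay away already exists: a blanket Disallow in robots.txt. (Strictly, Disallow stops crawling rather than indexing of URLs discovered via links; a `noindex` robots meta tag on the pages themselves addresses that.) A minimal example:

```text
# robots.txt — ask all crawlers to stay out of the whole site
User-agent: *
Disallow: /
```

The commenter’s point stands, though: with this in place, a “Googlebot can’t index your site” alert is noise, and a way to mark the block as intentional would be useful.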

  170. I’d be interested in seeing a way to add and verify sub-domains within the main domain rather than adding them as a separate site.

    Also, some sort of tool for managing duplicate content across languages and regions; this would be helpful both to site owners, who could avoid being penalized for non-malicious duplicate content, and to users, who would have a better chance of being served the correct content. Of course there are many ways to handle this already, but a process that works directly with Webmaster Tools might help minimize a lot of error.
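Some of this language/region machinery does exist today in the form of hreflang annotations, which tell Google that a set of URLs are alternates of one another rather than duplicates. A minimal example, with example.com as a placeholder, placed in the `<head>` of each variant:

```html
<link rel="alternate" hreflang="en-us" href="https://example.com/en-us/page.html" />
<link rel="alternate" hreflang="en-gb" href="https://example.com/en-gb/page.html" />
<!-- x-default: the page to serve when no language/region matches -->
<link rel="alternate" hreflang="x-default" href="https://example.com/page.html" />
```

The commenter’s ask is essentially for WMT to validate and surface these relationships rather than leaving them to markup alone.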

  171. Hi Matt!
    I would like a feature that helps detect harmful links and alerts you to why your website dropped. Although we have the disavow tool, it is really confusing, and it doesn’t tell you which links are bad for your website! So please make sure to get something like this live.

  172. I would like Google to offer answers in cases where a website suffers a sudden drop in visits. On most such occasions the webmaster investigates possible reasons without finding any: no duplicate content, no unnatural links, no penalty, no apparent reason at all.

    In such cases it would help to receive some kind of suggestion or warning about why site visits fell dramatically.

    Or, at the very least, a little more clarity on why rankings vary so widely from one day to the next.

    Greetings from Argentina

  173. Version the Webmaster Guidelines (and other content policies):
    1. Add a version/revision number to critical Google policy documents such as Webmaster Guidelines and Google Places Content Policy.
    2. Put text copies of those important policy documents into some system with version control and a diff tool that users can run from the web (source code control or a cms or whatever).
    3. Add a link from the existing html policy/help pages (at the bottom of the page or whatever) over to the versioned copies of docs.

    The only versioned/archived google policy docs I could find anywhere were the Privacy & Terms of Service docs at http://www.google.com/intl/en/policies/ but someone let me know if what I’m asking for already exists somewhere out there. Basically I want a way to tell WHEN there have been changes to Webmaster Guidelines and WHAT those changes were over time.

    If you do implement versioning of the docs then you might consider adding a statement that the current version of a given policy applies to all content or links, regardless of whether that content/link was created before or after the policy change. For example, a new rule about keyword-optimized backlinks in articles on 3rd party sites applies to all existing backlinks, not just backlinks created after that policy change. That fact might not be clear to some people if the Guidelines were versioned, as I’m suggesting, and one could see that a rule didn’t exist a few weeks ago or whatever.

  174. How about a little service? How about being able to reach someone at Google for anything other than AdWords (your money-making machine)? Google is an absolute disaster nowadays. I claimed the wrong name at Gmail and cannot get rid of it because everything is associated (Webmaster Tools, Analytics, etc.) and only a complete delete would help.

    Service? None. Not reachable. What a giant machinery: arrogant, with no telephone or email service. Great work, Mr. Cutts.

    And by the way: making billions without paying any taxes is just the same behaviour. Taking all the good things from all the countries but not willing to participate. Great!

  175. Definitely let users with no manual penalties know if their websites were hit by the Panda or Penguin algorithms. If a website lost its rankings because of a lot of duplicated content, for example, let us know which pages are causing the issue.

  176. Brendon Schenecker

    A list of URLs that Google has in its index!

  177. What about an “obsolete” content alert, and some recommendations for pages that are identified as orphans?

  178. Instead of “disavow”, which is a remedy for non-accountability, how about backlink vouching in GWT? When people largely agree that links from the NYT are good and links from cheapviagrablog.ru are bad, Google has crowdsourced data that is held accountable to GWT account reputations. I’d much rather have backlinks be untrusted by default, with an option to vouch for or disavow the ones Google thinks could help or hurt me over time. Reputable backlink management could then be a minor factor in the SERPs to promote its adoption.

  179. If we were given the option to suggest our preferred sitelinks – and Google could take that into consideration when selecting them, that would be very helpful.

    • That’s a really nice idea and one I’d use – sitelinks do seem a bit random! I also like the idea of pattern matching removals, I would use that

      WMT has improved since I used it a lot 3-4 yrs ago. Looking through, though, the “HTML improvements” part could easily be beefed up a bit. How about something like being able to set a category of site?

  180. Hi Matt,

    Two improvements that I feel would enhance the WMT user experience are as follows:

    1. A more up-to-date inbound link index
    2. Expand the date range for the top pages and search queries reports.

  181. Hi Matt,

    I think we should be able to see exactly which pages the robots.txt file is blocking, not just how many.
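Until WMT shows this directly, the same question can be answered locally: Python’s standard-library `urllib.robotparser` will evaluate a robots.txt against any list of candidate URLs. A small sketch, with hypothetical rules and URLs for illustration:

```python
from urllib import robotparser

# Hypothetical robots.txt rules for illustration.
rules = """\
User-agent: *
Disallow: /private/
Disallow: /tmp/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

candidates = [
    "https://example.com/",
    "https://example.com/private/report.html",
    "https://example.com/tmp/cache.txt",
    "https://example.com/blog/post",
]
# Keep the URLs the rules would block for Googlebot
# (which falls back to the "*" group here).
blocked = [url for url in candidates if not rp.can_fetch("Googlebot", url)]
```

Feed it the URL list from your sitemap and you get the per-page blocked/allowed report the commenter is asking WMT for.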

  182. Detailed authorship data on a per-site basis, rather than by GWT user account, would be fantastic. Even a list of verified authors per site and basic data by individual author (search queries, traffic estimates, etc.) would be a valuable resource.

  183. hmmm where to start.
    1) How about adding the disavow tool inside Google Webmaster Tools so it is easy to access? That would make it more user friendly.
    2) If someone posts a link to our website on a “bad” website that Google sees as spam, Google should at least say “hey, your website was linked from this site, which Google treats as spam; if you didn’t do this, click here to add it to your disavow file.” This way people can’t sabotage backlinks.
    3) If you rebuild someone’s website, it would be nice to easily download all the links Google has indexed so that webmasters can double-check that all 301s are properly added.
    4) Following on from #3, it would then be nice to tell Google “hey, delete all these files from your index because they are old pages.” Right now you have to delete every page one by one. Say the old site uses .php file extensions and the new site uses .htm; if the site has 1,000 pages, doing it one at a time is a pain.
    5) Why do we have to create http://www.whatever.com and http://whatever.com, with and without the www? Not only is it confusing, it causes duplicate issues in programs such as Google Analytics. When you link the property, you should be able to link www and non-www, but really Google needs to clean this up and merge the two together so things are organized correctly.
    6) I am constantly finding and fixing errors on the Crawl Errors page, and a lot of the links are junk. If I have a page blah.com/whatever.htm, sometimes you say there is a crawl error for blah.com/whatever.. or blah.com/whatever..htmASASDA or something with junk at the end; when I research it on the page it came from, it is a display ad. Why are you indexing display ads? Then I have to waste time creating 301 redirects to fix your crawling issues.
    7) If you find a page that is a 404 error, the “Linked from” tab should never say “No data”. If you crawled it, you have to say where it came from so that we can fix the source of the issue.
    8) The “Links to your site” feature is nice, but you can do better. We should easily be able to download all backlinks: the URL of the backlink and the page it is linking to. If you want to include alt text and other things, that’s fine too, but right now it is unorganized. A simple “Download All Backlinks” link would do.
    9) Also, a way to manually force Google to update its backlink table. Say I found a bad link and got it removed; it would be nice to say “hey Google, I took this link off this website, please update your end.”
    10) It would be my goal in life to get Matt to respond to my famous blog http://www.samuelpizzo.com/blog/google-destroying-small-business.htm If you want to provide the user with the best possible search result, why are you putting so many “link farms”, aka “big directory sites”, on page 1? Google is the directory; why would I want to go to another directory to find what I am looking for? I have asked this question 1000 times and Google is flat out scared to respond, because they know I am right and the reason they do it is so small business buys AdWords.
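On point 5, the usual workaround while the www/non-www split exists is to pick one host and 301-redirect the other to it at the server. A common Apache mod_rewrite sketch (example.com and the choice of www are placeholders; the equivalent can be done in nginx or at the application layer):

```apache
RewriteEngine On
# Redirect bare-domain requests to the www host with a permanent 301,
# so search engines and analytics see a single canonical hostname.
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

Setting the preferred domain in WMT and using consistent internal links complements the redirect, but the redirect is what actually collapses the duplicates.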

  184. I’d like to see which of my URLs are triggering rich snippets in SERPs (or what impressions were gained via rich snippets)

  185. In addition to the average SERP ranking for my site based on keywords, I’d love it to show the URL that is ranking for each search query. Being able to filter the ranking reports by the ranking URL, along with other related features, would be very useful.

  186. Would love to see the “send Google ‘fat pings’ of content before publishing it on the web, to make it easier for Google to tell where content appeared first on the web.”

    • I second this motion. Allowing a system to ping Google would solve most scraper outranking problems. Authorship and all your other methods seem to be easily duped.

  187. I work as an SEO on E-Commerce sites for my day job. There are a few things I would like to see to make it easier to diagnose the problems I see all the time.

    1. When we get 404s because of internal links, it would be nice to see a snippet of the code where the link was picked up. I want to see if the code is still on the site or if it was fixed already and GWT just didn’t get refreshed yet.

    2. I would like a way to know if a site has an algorithmic penalty. It would make it much easier to get changes approved that I know need to be done to improve the sites I work on. There are a lot of things that would be improved on our sites without my endless pushing if the other teams knew that we have an issue.

    3. I would like a way to know what pages are seen as duplicate so I can get them fixed. Once again this is hard to do without proof when the people you work with are not all SEO’s.

    4. I would like a better way to identify bad links or a bigger list of links that are considered bad. I work on multiple sites with over 1,000,000 links pointing to them. There is no easy way to go through them all as the only SEO.

    5. I would like to see a list of URLs that are penalized, if possible. This once again makes it easier to clean up any mess and get the changes done that should have been done. If we can’t get a full list, maybe a sample of them? You could have a manual and an algorithmic page with examples of URLs that are hurting the site’s rankings. These pages could show penalties like thin content, bad links, hidden content, etc.

  188. Dick Bradley said this particular one but I just saw it again today and it bears repeating…a list of the Sitelinks for a site when you review the site in GWT. I don’t think we’re overly concerned about the query generating the sitelinks (although that would be nice too)…I’d personally be fine with just knowing what they are so that we can demote them if not applicable or they’re old pages before users see and stumble upon them.

  189. Hi Matt,

    I would suggest two things for WMT in 2014,

    1. Update website backlink data frequently (maybe weekly).
    2. It would be good to have a tool in WMT that gives webmasters the freedom to disallow unwanted backlinks, so that Googlebot does not count those links. This would reduce the pain of sending link-removal request emails to individual website owners.

  190. There should be a feature in the Remove URLs tab of Google Webmaster Tools to remove URLs just by marking a checkbox, because it is very difficult to remove 404 errors in bulk: currently we have to copy each URL one by one and paste it into the Remove URL option to de-index it. My suggestion:
    Show a list of 404 pages in the Remove URL option, so we just need to select the pages and press Remove.

  191. I have found many bugs in WMT. Sorry, but can we have a bug-free Google Webmaster Tools in 2014?

    Thanks in advance.

  192. 1. A complete list of backlinks from Google Webmaster Tools, so webmasters need not rely on other backlink-finder tools to build a complete list.
    2. Complete data for search queries and clicks; currently there is a huge difference between the Analytics and Webmaster Tools data.
    3. Most important, changes to the disavow tool. If we disavowed a set of links four months ago, there is currently no update on what happened to them. A proper indication of the outcome of the links in a disavow file is needed; once links have been disavowed, they should be removed from the “Links to your site” section of Webmaster Tools.
    4. If your site is hit by any algorithmic penalty, the site owner should get a notification, along with what can be done to revoke the penalty.
    5. A longer time period for search queries and clicks: at least 6 months to 1 year.
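On point 3, for anyone who hasn’t used it: the disavow tool takes a plain-text file, one URL or domain per line, with `#` comment lines; the complaint here is that nothing in WMT later reports what became of those entries. The format itself looks like this (domains and URLs are made up for illustration):

```text
# Sites contacted for link removal on 2014-01-05; no response.
domain:spamdomain1.example
domain:spamdomain2.example
# A single page rather than a whole domain:
http://spamdomain3.example/page/with-bad-link.html
```

A per-line “processed / still counted” status next to these entries would answer most of the uncertainty the commenter describes.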

  193. If Webmaster Tools in 2014 could export the full backlink list, that would be great; I really need that function. Currently I only see the main domain of links to my site.

  194. The capacity for WMT to distinguish between disavowed and natural backlinks, or a “flagging” capability for users on single or bulk backlinks. Thank you 🙂

  195. Hi,

    Some pogo-sticking information in WMT would be very helpful. I believe most webmasters are living in denial, so getting some information about bounce-backs in the SERPs could be very helpful for both webmasters and Google.

  196. I have 2 main suggestions
    1) Sites that are often disavowed, or are reported as charging for link removal should be penalized.
    2) Do something about link rot. This actually touches on suggestion 1 as sites that suffer a lot of link rot are poorly maintained and the people running it are often hard if not impossible to contact.
    If a webmaster can find out if a site they are linked to suffers a lot of link rot, they can more easily decide if they wish to drop that site from their backlink profile.
    I help admin a comic directory, and I see all kinds of link rot. I once checked a link to what was meant to be a child-friendly webcomic and arrived at a porn site. Not a good thing to have in a child-safe area. Another comic directory had not been maintained for 8 years and yet still scored top ratings for “Online comics.” This was partly because their name was “Onlinecomics” with .net after it (I did not write down their name, as I did it once already and the system thought I was a spammer). It was also partly because they had been very good before the lack of maintenance: they asked the people they had in the directory to include some code to notify them of updates, so they could tell their members who had faved the webcomic. That code contained a link to the site. They have closed down now. However, I am willing to bet any of you AU$20 that in five years’ time I will still be able to find the coding and links to that site.

  197. UX is important, and WMT is way too technical for most. Bing’s WMT is nice from a UX point of view :).

    Google’s WMT could be way better from a UX point of view.

  198. Corporate advertisers should be aware that their ads are like a swarm of nasty blowflies popping up without invitation, disrupting important research efforts; and when there is sound, they compete with each other in volume and amount of wordage. I can’t see how this is good for the company’s image or its products.

  199. It has been mentioned a few times already, but an overall site score would be cool, with a threshold level that triggers a notification when a score hits a low (or high).

    Many of my customers add their own content/pages via a CMS, which leads to missing tags, broken links, etc. This often leads to “drift” (from a well-optimized site to one that’s not).

    Not sure if it’s doable, but sure would be nice!

  200. Matt, you mentioned previously that it would be too time-consuming to respond to every single reconsideration request from webmasters, and that was why Google only gave vague responses in reconsideration request denials. I’m sure there are plenty of webmasters who would be willing to pay for a consultation with Google support to help expedite the reconsideration process and take all of the guesswork out of it.

  201. So many webmasters have lost trust in Google; so many webmasters (small businesses) have been unfairly penalized by Google’s crusade to get rid of spam and produce the best possible results for its users. Looking at the results over the past 12 months, they are an absolute mess: spam, domain crowding, brands dominating the first-page results, hacked sites, useless YouTube videos; the list goes on. All the great sites with good quality content have disappeared, and trying to find anything in Google is like trying to find a needle in a haystack. One thing Google needs to do in 2014 is rebuild its trust with webmasters and businesses and, as so many other people have already noted, be more transparent. Google Webmaster Tools would be a great way of achieving this goal.

    1. If a webmaster’s site has been penalized, provide the webmaster with real information which will allow them to resolve the issue; don’t keep them in the dark and play games with them. This will help improve the overall quality of the results which appear in Google if the webmaster can identify the problem and fix it.
    2. Give webmasters the date when the manual or algo penalty was applied to the site and the date when it will be removed if the webmaster takes all the actions required.
    3. Communicate with the webmaster. If the webmaster submits a request for a manual review, Google could at least have the courtesy to provide constructive feedback or the reason why the site is not ranking, rather than automated, meaningless responses.

  202. Ability to specify undesirable or unrelated links from external websites in robots.txt (different from “disallow”) and have the bot take this into consideration. Going to each undesired website and asking them to remove the link to our site is simply unrealistic. Disavow has unpleasant implications, and we don’t use it out of fear of those implications. But if we could signal which links we consider unrelated, and the bot could take note of it, compare it to its own data, and factor the results of its analysis into ranking, this would be very useful.
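    For reference, Google’s existing disavow-links tool already accepts something close to this: a plain-text file listing individual URLs or whole domains to ignore. A minimal sketch of that file format (the domains below are placeholders):

```text
# Lines starting with # are comments.
# Ignore every link from an entire domain:
domain:spammy-directory-example.com
# Or ignore a single linking page:
http://link-farm-example.com/paid-links.html
```

    The gap this comment describes remains, though: the disavow file is a one-way signal with no feedback about whether Google agreed with the analysis.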

    Also, the ability to preview and compare how several browsers render targeted “goals” and “funnels” pages, and especially shopping carts, all at once in Google Analytics. Our designers have to use a couple of tools to make sure the display is OK everywhere. It would be nice to have it all in one place.

    Also, a simple basic SEO health-check package that can be run against individual pages and against the whole website: current positioning by keyword (not PageRank); the ability to specify three competitors and compare your positioning (or at least the site’s historical data per keyword); keyword density by keyword; and external links, all in one place. I realize this is bound to open a large can of worms, so to clarify: I am not suggesting many radically new things, I am suggesting putting them in one place and presenting them in a more convenient and user-friendly way (the old Google way).

    Also, if there is a penalty, it would be nice to see some of them expire within our lifetime. It seems that currently all penalties, once imposed, are forever. Our business has nearly died because of this (we dropped from page 1 to page 10 suddenly); thank goodness for other means of marketing. We have since tried everything to correct it, including adding a lot of unique and useful content, removing backlinks we don’t want, tweeting, Facebooking, Google-plussing, YouTube videos, updating the CMS, cleaning up errors, improving speed, what have you, all to no avail. We only dropped two more pages (after a few weeks). And yet our website is of high quality, we get a lot of positive feedback from visitors and clients, and we definitely have more content and useful info for users than many of our competitors. So I agree with previous commenters who say that the new ranking does not adequately reflect quality as Google has officially defined it.

  203. I would love to see a GWT video or a help page (or even a post on your blog, Matt) on how Google is NOT the web, and how the ultimate power is in the hands of webmasters, not of Google.

    This is so important because there’s a misconception running among the less experienced webmasters and web users that Google IS the Internet. No way.

    Hope Google can make this transparent, Matt.

    – Luana S.

  204. Hi Matt, I would appreciate it if we could monitor queries within GWT without the 3-month restriction on data, and also have filters the same way we have in GA. That would be awesome. Please have a look at Bing Webmaster Tools; they’re doing well in this area, plus their latest feature lets us also monitor connected media sites. Thanks.

  205. I would like to see a feature that provides webmasters with more details about the content of a site, e.g.:

    1. Content that is marked as duplicate (whether external or internal)
    2. Low-quality content
    3. Issues caused by internal linking
    4. Pages that have been penalized

  206. I have reposted this from my post on webpronews.com

    I would say that the most helpful tool Google could include would be notifications of ALGORITHMIC penalties, along with an explanation of why the algorithm penalized you and what steps you can take to remedy the problem: basically the same level of information we would get if we “earned” a manual penalty; there should be no distinction. I know for a fact we are under an algorithmic penalty, possibly from links years ago, but if we were TOLD exactly what’s wrong, then we could fix it, and hey-ho, Google gets a better-quality site in its search engine too. Simple.

    PS. I left my URL out of this post on purpose. Why? Because it seems Google would view it as an unnatural link or comment spam, would it not?!! Matt, your work on quality is to be admired, but people are SCARED of linking now. You have also created a link-detox industry AND a negative-SEO industry.

    I thought you told people not to be evil…….?

  207. Matt, most of us want to follow the guidelines and do well. Most white-hat website owners trust third-party SEO companies to look out for our best interests, but sometimes those companies cross the line and get us in trouble, perhaps beyond our control, leading to algorithmic penalties on our site. We website owners would love a Webmaster Tools feature that pinpoints any algorithmic penalties (Panda, Penguin, Hummingbird, etc.) affecting our site, so we can call the SEO companies out or at least correct the issue ourselves immediately. It would also help us QC and determine the value of the SEO companies we trust, and grade them on their work. Thanks.

  208. Hi Matt,

    It’s great and thanks for asking us what we want more in webmaster tools.

    I would like to see these changes in 2014:

    – Link data updated more quickly

    – Social network integration, like the “Social Connected Pages” feature in Bing Webmaster Tools

    – A list of the pages Google won’t index from a submitted sitemap, with the specific reason for each

    – After every algorithmic update (Panda, Penguin, and more in the future), a notification message if the website is affected, including the percentage impact

    – Published standards, and a message with important suggestions if a website falls below them

    I like the recommendation given by “Scott Van Achte” about disavow links, please consider this.


    • “– A list of the pages Google won’t index from a submitted sitemap, with the specific reason for each”

      “– After every algorithmic update … a notification message if the website is affected, including the percentage impact”

      I second these. On the algorithmic update, would it be so hard to send an email to the standard webmaster@domain?

  209. I would really like to see two things that could help me as a retailer / home SEO:

    – a complete list of the links Google has identified, rather than just a sample
    – some indication of any algorithmic penalty (rather than manual action)

    It would be great to get some form of pointer as to where to focus remedial efforts; at the moment we disavow whatever looks spammy, but it seems to make things worse.

  210. When you 404 a page, it appears in the Crawl Errors tab. That’s great, but once you have marked it as fixed, do something so it doesn’t appear anymore.

    I have been “marking as fixed” the same URLs for over 2 years, and they are still reappearing every two weeks; that’s very annoying.

    Since the site:domain.com operator only shows you 1,000 indexed pages, a complete list of these URLs in WMT would be great. For example, one of my sites has 5,000 unique pages, the site:domain.com operator shows 10,000 indexed, and the WMT Index Status tab shows 7,500. That’s crazy.

    • I think the question there is: what do they count as a URL? 5,000 unique pages does not equal 5,000 URLs in total; the count may include URLs for the pages, images, and scripts that are also included.

  211. Matt…

    How about a SPAM REPORT that works? So many times I’ve reported a site that appears 5, 6, 7, even 8 times on the first page of Google with absolutely no content at all, only an iframe with AdSense.

    Could it be because it’s in Quebec? The site in question is my competitor. I’m following Google’s guidelines, but this site is always above me.

    So a SPAM REPORT (for Quebec) would be nice. I’m not even sure anyone is reading those reports.

    Thanks anyways…

  212. GWT reports 404 errors in one place and allows me to remove URLs in another. I would like to be able to select URLs from the 404 list and have them removed in one action. This would be particularly useful when a redesign changes URLs and many have no link juice worthy of a 301.

  213. I think it would be very helpful to be notified if your site got hit by Panda!

  214. I would love to see some functionality where I can combine my subdomains with my main domain; I have no idea how to do this. I just separated some topics on my photography site by subdomain, but it looks like Google is treating them differently. It is one site and should be treated as one.

    Otherwise keep it simple … I would love to rank without any SEO effort … just create great content (text or pics!) for users.

    • I get more sitelinks when I put domain.com into the search box than with http://www.domain.com. I know that www is technically a subdomain, but the site is exactly the same for both. Maybe it’s because some backlinks use www and some don’t. Who knows? Maybe Webmaster Tools in 2014 could help us find out, or maybe Googlebot could simply be fixed so we don’t have to worry about such trivialities.
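      One standard way to take the guesswork out of the www/non-www split is a sitewide 301 redirect to whichever hostname you prefer (plus setting the preferred domain in Webmaster Tools). A sketch for Apache’s .htaccess, with example.com as a placeholder:

```apache
# Send all non-www requests to the www hostname with a permanent redirect
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

      Consolidating on one hostname also consolidates the backlinks that are currently split between the two forms.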

  215. Dear Matt,

    How about regional-language dropdown options, so everything in the Webmaster Tools account can be in a local language?

  216. Hi,
    I want to see backlinks separated into nofollow and dofollow in Google Webmaster Tools.

  217. A better debugging tool for schema.org markup. I have personally faced an issue where the markup in the HTML code was not reflected in the search results (I waited for more than a month), and I had to tag things manually with the Data Highlighter, which did not give enough control over elements.
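    When debugging cases like this, it usually helps to validate the raw markup with Google’s structured data testing tool before falling back to the Data Highlighter. A minimal schema.org microdata sketch (the product name and values here are placeholders):

```html
<!-- schema.org Product markup as microdata; values are illustrative only -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example Widget</span>
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    <span itemprop="price">19.99</span>
    <meta itemprop="priceCurrency" content="USD">
  </div>
</div>
```

    Note that even markup the testing tool accepts is not guaranteed to appear in the SERPs; rich snippet display remains at Google’s discretion, which is part of what makes this hard to debug.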

  218. Subject : Hold Period When Layout / Ad Placement Changes.

    Any time we make layout changes, or change the placement of AdSense ad units for a test, we automatically fall into a “hold period”: we see a drop in AdSense CTR, CPC, etc., which recovers over the next 2 to 3 weeks. We are often left wondering whether the hold period is over, or whether the new layout/ad placement simply isn’t working.

    It would be very helpful if the hold period were shown explicitly. How?
    Example: display a 15-day hold period that counts down day by day until the hold is over.

    -Deven J

  219. An addition to the ‘Links to your site’ section – a graph showing the historical number of backlinks that Google has found for your site(s). And the ability to pull up a list going back to day one of each of your sites, or put in a specified range of dates.

  220. Provide information on why Sitelinks are not displayed for your site, particularly in cases where a site once had Sitelinks.

  221. It would be nice to get better information on what makes up a page or site ranking for specific keywords. “Create better content” is fine for bloggers, but does little to help a corporate website or a restaurant that isn’t focused on writing every day just to rank higher in searches.

    Comparison data, so that you can evaluate how you are doing vs. competitors in a region or industry on the various key elements that make up a site’s ranking.

    Ability to embed real time reports into an external system. For example, if you work in a company that has several websites and divisions, you may want to integrate reports into a portal, allowing key decision makers to see how their areas of responsibility are doing when they log into the corporate portal rather than having to log into a separate system.

  222. Is there a way for ecommerce stores to get more guidelines on SEO? Currently a lot of us are reusing the same content as many others, and sometimes the products are almost identical. How do we write these descriptions? Will there be a penalty?

  223. Fresher, deeper and more accurate keyword data!

  224. Hi Matt, I think the deputization concept works well. Maybe there could be a verification process for search companies working with client accounts, so that manual reviewers can see that an established search company with previous successful reinclusions and good work can be fast-tracked.

    It would also be helpful if there were a “message history” in GWT, so that owners and search companies coming onto an account can see its history (without former companies being able to delete that information).

  225. It seems that Google is woefully inadequate. My father has had a part-time/full-time business for over 40 years, and yet he is on Google search page 4. Not that he cares; at 85, he is excited just to have a web page. But I question the true relevance of Google search when several farms with much less experience (6 years) are on page one.

    It seems Google has fallen into the MS cash-cow trap and now lacks innovation.

    My dad, age 85, can still whip me at math, a computer science major. He worked on the Apollo project using a slide-rule, worked with the guy who invented the wing tips you see on most commercial airplanes and worked on the infrared telescope in Hawaii.

    But then Mr. Cutts only graduated from UNC, whereas my dad graduated from NC State.

  226. Hi Matt,

    it would be great to get some better tools for seeking out duplicate content, and also some sort of traffic-light system for inbound links: red for what Google views as bad links, yellow as a warning, and green for links that are beneficial.

    We are a news publisher (This is our homepage http://bit.ly/124onp ), have a team of dedicated reporters. We publish daily exclusive interviews and have great links with all major UK TV channels and their press teams, as well as a number of US networks.

    We were ranking very well for our keywords up until Sep/Oct of this year when all of a sudden something went wrong. We are being crawled by Google but if I search for the title of our articles our Facebook page is ranking before the main site.

    We have NEVER sought backlinks, but having been informed on your brilliant forums that scrapers etc. may be an issue, I spent weeks emailing webmasters and got thousands of bad links (or what I assume to be bad links) wiped out.

    Still, I don’t want to clear all backlinks and ruin our rankings altogether. It’s so hit and miss, and as an original content producer with a team of journalists in our employ, I am disappointed that we’re running into issues that I would definitely deal with, if only I knew what they were.

    Also, we can’t submit for re-inclusion, as there are no manual actions against us. An option to re-submit a site would be a welcome new feature too.

    Thanks for taking the time to read this.

  227. I don’t think you should change anything in Webmaster Tools. Why waste people’s time playing with things designed not to work? Just send your chairman to more Bilderberg meetings and be honest with people. Google is now a system designed to cheat the masses and reward the elite.

    I give up!!!! Well done

  228. I’d definitely like to see:

    – Disavow functionality added to backlink reports (either URL or domain level), and Scott Van Achte’s idea of a check box is great.
    – Allow the search query report to combine search queries with landing pages, instead of only showing them separately.
    – Incorporate a title tag and meta description preview tool which shows exactly how many characters/pixels will appear on desktop and mobile search results.

  229. I would love to see the breakdown of impressions and search queries by location; it is really hard to determine average position. Also, the impressions generated should be separated into Web / News / Images / Blogs and so on…

  230. Good evening. I think this kind of information on page load and how to improve our sites is very useful in the face of mobile, because the trend clearly points that way. For those of us who have multiple sites, and in the spirit of brainstorming, maybe Webmaster Tools could let us compare the behavior of one site against another, so that we can replicate on one what is working well on the other, and improve both. Thank you, greetings.

  231. I wish to have the following options

    IP-address filtering, so that we can get accurate impressions and clicks.
    A sophisticated keyword research tool like Bing’s.

  232. Hello,

    I want to know one thing: why does Google change URLs? Can anybody tell me the reason, please?

  233. It would be better if you could provide details about the penalties a site has received: the type of penalty (manual or algorithmic) and the changes suggested to fix it. Also, there is no option to contact Google support if a user experiences a penalty or any other sort of problem; apart from the reconsideration request there is nothing, and even then we just receive an automated email saying “no manual action required.”
    I think you should give webmasters better support.

  234. It is great to be here so I can voice my frustration about the whole Google Penguin update, which has hurt a lot of sites, including mine. Google has always been about being white hat and advising webmasters not to spam, etc. That said, it reminds me of a parent telling his kids not to do bad things, but when the kids ask what counts as a bad thing, answering “sorry, I can’t tell you; go figure it out yourself.” That is how SEO feels these days with regard to what Google really considers spam and what hurts sites. Why not just be clear about it and let everyone know exactly what Google is looking for, rather than saying “GOOD CONTENT,” which is another broad term without a clear definition?

    That’s my wish list, and I hope Matt will help all the webmasters lost in the dark of the SEO planet.

  235. Show when a website has been penalized, and the reasons for the penalty.

  236. A fast, maybe flat, design would be cool. Also, when you have 30 to 100 domains in WMT, you first have to choose one before you can search for domains; the search field is only visible once a domain is chosen. That should be changed.
    Otherwise Google WMT is cool. Maybe it doesn’t need a redesign; we just need it fast… 🙂

  237. Google Analytics has a good “Users” module. We can click on a user name and see which sites he/she has access to.

    GWT should have a similar “Users” module too.

  238. I would like an easy way to export the searches for a specific keyword via Google’s API; paying for this function per keyword or result would be fine with me. The ability to export results from the Keyword Planner would make my work much easier 🙂

  239. Authorship markup is about correlating content with its author, so blog posts where the author is clearly specified should get the benefit of the author’s picture being displayed in the SERPs. But I have seen many websites (simple brochure sites) add the authorship markup, verify the link on a Google+ profile, and get the website owner’s picture displayed in the SERPs. I hope Google can differentiate between simple static websites and blogs in 2014, and give the benefit of authorship markup only to blog posts that follow the authorship rules.

    Authorship markup is about giving due credit to an author, to establish his or her authority and thought leadership; it is not about displaying pictures of business owners and people who just want their headshots in the SERPs.

    This kind of misuse totally defeats the purpose of the authorship markup and has the potential to pollute the SERPs more than ever.

  240. Hi Matt,

    1. Personally I’d like better insight into a domain’s link profile and better integration of the disavow process. This is a big area for improvement. The introduction of alerts for manual webspam actions and the introduction of the disavow process are all well and good, but, as some people have already mentioned, we need a more intuitive system for identifying malicious links.

    For massive organisations with bottomless pockets, malicious link identification and link removal is not a problem; they’ll just throw some money at it and employ someone to manage the process. Job done. But for smaller local businesses with limited budgets and little knowledge, managing a link removal process can be impossible and can have a seriously negative impact on their business.

    As webmasters we need a system that provides more insight into the potentially high-risk areas of a link profile. Ideally Webmaster Tools should highlight which links are potentially malicious or low-quality, advising which links require action. This should integrate with a clearer disavow process that indicates which links have already been disavowed by the webmaster and which disavowed links Google has acted on.

    This type of insight would be more beneficial and ultimately more actionable for the everyday webmaster. Ideally I’d like it combined with a more open line of communication with Google representatives, to help remove any manual webspam actions and avoid them in future.

    2. Other than the links side of things, I’d also like to see an enhancement to the Crawl Errors so when you download error details you receive the ‘Linked from’ data allowing webmasters to efficiently solve the root cause of the issue. I’ve never understood why you can’t download this!

    3. I’d definitely like to see more information on content duplication, perhaps with some kind of classification system whereby Google rates the potential impact of the duplication. I’ve seen content scraping on a massive scale by seemingly dodgy Chinese websites, but it’s always difficult to gauge how aware Google is of this and how big an impact it has. A system to report and categorise the severity of content duplication would be fantastic.

    4. Finally I would also like to see better integration of data from external web properties such as Facebook and Twitter. Bing have recently released “Connected Pages” allowing webmasters to connect with data from social profiles etc. I think that’s potentially very insightful data from which webmasters could benefit.


  241. 1- There should be a feature where we can tell Google these are our OLD URLs please do not show 404 for the same.

    2- A list of pages remaining to be indexed (so we can QC those pages and improve their quality).

    3- Mark the pages that are duplicate or have no added-value content.


    • “1- There should be a feature where we can tell Google these are our OLD URLs please do not show 404 for the same.”

      301 redirects will do this for you (301 = “permanently moved”). Assuming an Apache server, create or modify the .htaccess file in the root of your site, e.g.

      # Redirect is a mod_alias directive, so no RewriteEngine line is needed
      Redirect 301 /oldpages/page.html /newpages/page.html

  242. I would like to see the disavow tool as an option inside the menu, just below “Manual Actions”. Make it easy for users to remove bad links through Webmaster Tools, the way we use “Remove URLs” under the “Google Index” option.

  243. As a webmaster I manage a bunch of client sites. I would be happy to have the opportunity to combine data from all (or some) sites on one graph: for example, all organic visitors for the last month, to watch the general trend. That could easily be implemented by adding an “Add site” tab above the current chart. Thanks.

  244. Within Google Webmaster Tools we should be given an option to pay for feedback on why a site was given a penalty, and how to fix it. When Google, Microsoft, or Apple is dragged into court by countries around the world for breaking their rules, they are fined and change the way they do business in that country. They are not given a penalty where they lose 50% of their revenue with no understanding of what they did wrong or how to get the penalty removed.

    I understand getting a penalty for doing something wrong. If I speed, I am given a ticket by the police for breaking the law, but I am given an opportunity to defend myself; if the judge does not believe me, I am ordered to pay a fine that won’t bankrupt me. The current system makes no sense: you can spend months and months trying to guess what you did wrong, with no clue how to fix the problem.

    Yes, we were given a manual penalty… and we worked very hard to have it revoked. Despite all the work, our traffic has not increased at all. Up till May of this year 70% of our traffic came from Google; now it comes from Yahoo.

    The saddest part is Google has made it IMPOSSIBLE to get help from webmaster tools… and when you do break a rule you are ‘guilty’ forever. This should NOT be the case.

    • I would just love the opportunity to use the data within GWT without the time limit, and to be able to filter and segment it the same way we used to in GA for keyword traffic (search queries).

  245. For authenticated brands or other owned entities I would like a way to manage knowledge graph content and disapprove incorrect data in a similar manner to the way sitelinks currently function within GWT.

  246. Hi Matt, I think it would be great to update the link removal tool to accommodate multiple removals. There is nothing worse than when a client creates a new site from ASP.NET to PHP, and all the broken URLs that come with it.

  247. I think that all of the suggestions you make yourself would be of a great help to webmasters all over. Especially more in depth analysis of errors or faults within a site. Thanks for asking!

  248. I would like to see some Page Speed metrics. Perhaps a line graph showing site speed over a given time – with a threshold showing pass/fail and recommendations to improve score?

    My biggest request for Google is for them to follow their own page speed rules. Their maps and translate tool do not follow their speed guidelines for compression, caching, and image optimization.

    • Page load speed tends to be determined by the number of objects/scripts from third party sites included in the page—the fewer the better. For example, including a Facebook like button, or for that matter Google Analytics, will slow down the page loading, as each of these requires a separate HTTP request, and these take a small but appreciable time regardless of connection speed.

      I’m not sure how GWT can help here…
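      One thing the page itself can control, though, is loading third-party scripts asynchronously so they don’t block rendering. A minimal sketch (the script URLs are illustrative placeholders):

```html
<!-- async: the HTML parser continues while the script downloads -->
<script async src="https://www.google-analytics.com/analytics.js"></script>
<!-- defer: the script runs only after the document has been parsed -->
<script defer src="https://example.com/widgets.js"></script>
```

      This doesn’t remove the extra HTTP requests, but it keeps them off the critical rendering path.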

  249. Better control of sitelinks!

    We can only demote sitelink URLs – why not do it the opposite and allow us to suggest which URLs we would like to show for a given search result?

    Seems to make more sense from the bandwidth perspective – if I have 3 sitelinks I would like to show, why should I need to demote dozens of links over and over until I get the 3 I want instead of just stating – “I would like these 3 sitelinks” ?

  250. Hi Matt,

    I actually pinged a couple of suggestions at the WMT on Twitter a couple of days ago. My biggest asks would be:

    – show us ALL the links to a site that Google knows about, not just the top X. If we need to clean up links, show us every one Google is tracking to our site, and maybe flag which have been disavowed, if applicable
    – give us the ability to see all the pages indexed by Google, or at least far more than the top 550–775 that you can find with the site: query
    – I agree that a bulk upload of URLs to remove from the index would help. I’ve seen sites expose an entire staging or dev site and have it indexed; getting rid of those one at a time is a huge time suck. Having said that, I think an extra level of security on URL removals would be prudent, as I see lots of sites that still have former agencies or employees with admin access to WMT
    – how about some messaging that a disavow file for a site hit by an algorithmic update (rather than a manual action) has at least been reviewed, so we don’t have to wonder whether it was looked at and didn’t meet the criteria, or simply hasn’t been looked at yet

  251. Improved Mobile reporting and AJAX Crawling Scheme reporting please

  252. Thanks for starting this conversation!

    The Change of Address feature is currently “Restricted to root level domains only.” Our previous government domain uses the standard format of http://www.ci.[city].[state].us. It would be great to use this feature to indicate our permanent move to the alternative format of http://www.[city][state].gov.

    I also see some of our site’s search results returned with breadcrumbs in the Google links that reflect our site hierarchy, while others show a meaningless URI with parameters. While our current CMS does not use a canonical page-addressing scheme, it does so in the ever-present breadcrumbs. It would be great to configure Webmaster Tools to recognize a page element by id as a way to give Google an authoritative position in the content hierarchy, such as “this-is-the-breadcrumb-element”.
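    Rather than a custom element id, the signal Google already understands for this is breadcrumb structured data. A sketch using schema.org microdata on the breadcrumb list itself (the URLs and names are placeholders standing in for a real site hierarchy):

```html
<!-- schema.org BreadcrumbList as microdata; URLs and labels are placeholders -->
<ol itemscope itemtype="http://schema.org/BreadcrumbList">
  <li itemprop="itemListElement" itemscope itemtype="http://schema.org/ListItem">
    <a itemprop="item" href="http://www.example.gov/"><span itemprop="name">Home</span></a>
    <meta itemprop="position" content="1">
  </li>
  <li itemprop="itemListElement" itemscope itemtype="http://schema.org/ListItem">
    <a itemprop="item" href="http://www.example.gov/services/"><span itemprop="name">Services</span></a>
    <meta itemprop="position" content="2">
  </li>
</ol>
```

    Since the CMS already renders breadcrumbs on every page, adding attributes to that existing markup may be easier than introducing a new canonical-addressing scheme.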

  254. Matt,

    Since you opened the door, I have a laundry list of wishes.

    I would like to see:
    a. the removal of metadata characteristics that Google no longer uses, i.e. meta keywords
    b. the removal of PageRank as a tool, completely, unless by some chance you decide it is worth keeping (although I cannot fathom why)
    c. someplace on a given site, or in Webmaster Tools, that can track and produce an accurate report of keyword density in on-page text content
    d. in a similar vein, an accurate keyword reporting feature that indicates, via an average search obviously, where utilized keywords rank in a search for individual sites, i.e. what Raventools used to do remarkably well
    e. finally, if I am making my wishes known, the exact criteria for a positive link-building strategy, rather than an outline and a bunch of guesses

    Anyways, I look forward to hearing from you more.

    Sláinte, Matt

  255. It would be nice if I can see the afterlife of my spam report. I’ve done so many lately and don’t know anything about it. Maybe I just ‘spammed’ you because you don’t think those urls are spammy. Maybe you declined it, maybe you haven’t processed it… I/we don’t know anything.

    I think it would be a nice idea to some webmasters will be “deputized”, or you can build a distributed system and webmasters can ‘review’ others’ report and if it gets many votes your team check the site manually faster… eg. I report something and user X and Y get my report if both finds my report valid you guys check the report faster than I just said that specific site are spammy. Users/webmasters can volunteerly take part in this system. I definitely would do this 🙂

    Also, faster processing of the data would be nice. I’ve got a new site and I don’t see a single inbound link, even though the site is nearly a month old.

  256. I know Google’s not going to tell us exactly which links are spam or what pages are “low quality”, but it would be nice to see a simple 1-10 score that assesses the “Panda Risk” and “Penguin Risk” levels for each site. That would make my job easier because I would at least know where to start, and have some indication other than traffic that tells me when I’m making things better or worse.

  257. I would like:

    Geo-targeting to have a complete make-over. It is very shallow. I think Google Places could partially be wrapped up in this. Let a company set their service radius here for the whole or part of their site. Webmaster Tools and Google Places are unable to handle websites that offer part local services (such as youth camps) and part international products (such as DVDs). If a website has a well-thought-out URL structure that separates local and national, then it could easily use such a tool to help target the right people in the right places.

    Webmaster is not as helpful for local companies as international ones – as a whole.

    It would be incredible if ALL the Webmaster Data and tools were available in Google Analytics. Only putting a part of it there isn’t as helpful.

    Data needs to be available for a year, too. This 3 months only of data is killer.

    And while we are dreaming… I know you guys have all the data, so when a person signs up for Google Webmaster and Google Analytics I think it should give the historical data and not just from that day forward.

    OHHHH – and now that PPC is so important I’ll take a moment to say that to charge for clicks which don’t show up in Google Analytics (for whatever reason) is wrong. A click is a click, a visit is a visit, and they should ALL show up in Google Analytics. I realize this is about Google Webmaster Tools! Forgive my going off topic and please consider my request! I just want 1 dashboard. 1, not 3.

  258. We need the search queries option back. It is getting difficult to know what people are typing to reach our website. If you can have search queries for paid search, why can’t you have them for organic search?

    • The referrer URL is present in the HTTP request headers, and in PHP can be accessed with $_SERVER[‘HTTP_REFERER’]. Look for the q parameter to find the user’s search query.

      I agree that if the user is using ‘secure search’ (https) this information is annoyingly absent. Nothing AFAICS GWT can do about this, as it’s part of the HTTP spec—unless Google decides to discontinue ‘secure search’…

      • Sorry, originally misunderstood your post. We do internal analysis and don’t rely on Google tools, but if the Google tools were flexible enough that we wouldn’t have to, we’d use them more. It’s pretty hard to specify what we’d want in terms of flexibility, as each need is case by case, so we just log all (non-identifiable) visitor info ourselves and mine it as and when we want to do some analysis. Actually I don’t think GWT could do this, as it’s not privy to the HTTP headers…
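    A minimal sketch of the referrer-parsing approach described in this thread, shown in Python rather than PHP purely for illustration (the URLs are made up):

```python
from urllib.parse import urlparse, parse_qs

def search_query_from_referer(referer):
    """Pull the q parameter out of a Google referrer URL, if present."""
    parsed = urlparse(referer)
    if "google." not in parsed.netloc:
        return None  # not a Google referral
    # Secure (https) search strips the query, so q may simply be absent.
    values = parse_qs(parsed.query).get("q")
    return values[0] if values else None

print(search_query_from_referer(
    "http://www.google.com/search?q=webmaster+tools&hl=en"))
# -> webmaster tools
```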

  259. I see most of the comments/requests are for site owner wishlist items.
    Here are some thoughts from a Google search user:
    A report bar on the results page for each result with the following:
    a) Date of last change.
    b) How many times the search word or phrase appears on the page.
    c) How many times the page was visited in the last month using the same search word or phrase.
    d) Is the target page currently giving a 404 return (unfortunately MANY do)
    e) A “Useful” / “Not Useful” icon set (and maybe reason text) with results (user feedback related to keyword and page)
    f) An option to always open the target site in a new tab (will allow ‘e’ to work easily)

    With regard to “e” and “f”:
    I’m not saying that the ranking should be influenced by the user vote results.
    We all know how that will be abused.
    It’s simply taking “+1” and expanding it to include a “-1” option with a reason.

    And the best thing is that this sort of info will not only be useful to searchers BUT owners/developers will also be able to get very useful insight into how the public sees their site/page.

  260. Harsh Wardhan Singh

    I am not fond of Bing, but I think Google could add a section for site SEO analysis like the one in Bing, if possible. It would be helpful for figuring out the site’s present condition, especially page by page, and it would make it easier for newbie webmasters to make improvements accordingly.

  261. I’m not sure if Webmaster is the place for it but I miss the old keyword tool. Also as a small business I wish Google would highlight any bad links because not knowing creates paranoia. I don’t always know who is linking to me. I would prefer if bad links just had no value instead of negative value. This would make obtaining bad links a waste of time but would also eliminate negative SEO.

  262. What is the new keyword tool?

  263. Matt, I’d like to be able to contact Google via the Google Webmasters Console. I had a site that, due to several coding errors in the backend, eventually developed large amounts of duplicate content which appeared to be of low quality (and it was!). Back in January 2013 it completely disappeared from the SERPs overnight – probably as a result of a manual review.

    In the months since then I’ve removed all content and rebuilt it from the ground up, using a totally new backend and 100% unique content. But it’s still not ranking; a site: query still shows old content mixed with the new, even though it’s all been 404’d and removed – several times!

    There are no manual actions shown in the Webmaster Console, but there clearly must be something still hanging around, because the rankings are dreadful. It used to be possible to file a reinclusion request, but now that Webmaster Tools shows no manual action applied, the link has gone.

    I have absolutely no way to contact Google to ask for a look… which is far from ideal!

  264. When I submit my sitemap I am using it to tell you about my important URLs, which I would like indexed, please. When I see, say, x thousand indexed and 50 not indexed, I am left wondering why. It could be products available in, say, liquid and powder forms, which have duplicate meta data because they are variations and we have made a mistake. Or it could be that they simply have not been indexed yet. It would be nice to know which ones are not indexed, and why, so I can fix it.

  265. I am still working through the disavow process (sheesh!). I wish WMT returned only the links that are actually counted. When I came across this website in the list, the comment links were nofollow… but it is still in the list. I felt a little funny about adding domain:mattcutts.com to my disavow list today – but better to be safe than sorry, especially since my business is nothing about SEO.

    I’ll reiterate – a list of links with checkboxes where we could “tick” to describe the link makes more sense.
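    (For anyone following along: the disavow file discussed here is plain text, one entry per line, mixing individual URLs with domain: entries; a # starts a comment. A minimal sketch with placeholder names:)

```text
# Links we asked to have removed but couldn't
# Disavow one specific page:
http://spam.example.com/comments.html
# Disavow every link from an entire domain:
domain:shadyseo.example.com
```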

  266. Sending private mails to bloggers and webmasters when their site has an issue is the best option. Thanks

  267. Google currently gives users the following message for Unnatural inbound links:

    Google has detected a pattern of artificial or unnatural links pointing to your site. Buying links or participating in link schemes in order to manipulate PageRank are violations of Google’s Webmaster Guidelines.

    It’d be great if Google could give webmasters and site owners the full list of links that they deem unnatural, rather than just a few examples (especially for sites with tens of thousands of links). This would not only help webmasters acquire a better understanding of what Google sees as “unnatural”, but it would also help “clean up” the web faster by making sure PageRank is being passed accordingly.


  268. Error reporting: I really need the 404 and 410 errors separated, and maybe the 410 errors removed from the reports earlier.

  269. It would also be good if Google gave more information when it accesses your site. Presently it’s struggling to read the robots.txt file on several of my sites, yet fetches them every time, so it would be good to know what the issue is, as I’ve still not discovered why this is happening.

  270. Can’t understand why my comments are awaiting approval !

    • That’s because I pre-moderate comments, and I was on vacation for a few days.

      • Matt,
        You cannot go on vacation….. With all the comments here it is clear you need to do what Superman did one time and split your atoms and make another Matt Cutts to keep going.
        Joking of course, but still the amount of comments to this post is crazy. . . and indicative of the frustration out there when our sites drop ranking and we cannot figure out why.
        It is a bit like waking up in prison and no-one will tell you why

  271. Better or faster bulk url removal 🙂

  272. I would like a way if you get banned by Google for Adsense that you can be re-evaluated. I got banned the very first month when I was a newbie and would love to get a real reevaluation.

    It would also be nice to have a real person to contact. Many of the items you brainstormed are on my wishlist too!

  273. 1. Understand why Google ranks 10% of our content while some competitors get 1000% of their content ranked (duplicate websites and more; e.g. a website with 60,000 ads has 700,000 pages on Google!)

    2. Tell us when you read a spam report, because it looks like there is a delay of years between the time we post them and the time you read them. A little bit like submitting a website to DMOZ – you have time to die :-/

    3. Be able to contact your team more easily. When we are AdWords customers and spend thousands of euros on your site, the minimum is to get feedback, and not an answer like “oh sorry, we can only speak about AdWords, we don’t know anything about natural results”,
    because it looks like nobody cares about the “natural” results.

  274. It’s a good idea to help Google understand who first published a content so copied content could be identified easier.

    It would also be nice if we had more control over our sitelinks.

    Lastly, the anchor text information could be offered in a clearer way than just numbering the anchors.

    Thanks a lot!

  275. Author stats in GWT only show stats for pages for which I am the verified author. It would be great if they showed details of all authors that have been verified for a site that I own.

  276. The addition of social mentions and similar metrics, to monitor our social side. With social being such a huge part of web traffic, it seems there needs to be more information available to webmasters in this area.

    Also, as others have mentioned, a faster/more up to date ‘Pages that Link to you’ page. That would be really helpful.

    Showing the negative values of links would also be extremely useful: sometimes (quite often) it’s really not clear to us if something is negative or not – if a site is legitimately linking to ours, but the site itself is bad, we have to do a lot of work to find out how ‘legit’ this site is before deciding whether or not to get the link removed.

  277. Thank you matt.

    I believe all these features will improve our work a lot.

    A full list of the backlinks Google considers artificial or low quality would also be very useful.

    Often our competitors spam our website with links from low-quality websites, and it’s hard to find them.

    This tool will be a dream for me.

  278. In 2014 I would like #1 rank please Matt.

  279. All organic search queries.

  280. I would like to see a more detailed backlink report – something that can tell me exactly which backlink(s) are degrading my website. I was totally confused over the past couple of weeks… I tried to follow the Penguin rules, so I checked my backlinks and contacted other websites to remove some suspect ones; however, my website is now some 10 places lower than it was before the backlink removal… I suspect I may have removed fine links, even though they didn’t look that good on the surface?

  281. I’m more interested in this:

    Improved reporting of spam, bugs, errors, or issues. Maybe people who do very good spam reports could be “deputized” so their future spam reports would be fast-tracked. Or perhaps a karma, cred, or peer-based system could bubble up the most important issues, bad search results, etc.

    • The spam is becoming more complicated. I listed on this website before how search still happily indexes scraper websites. When reported, not much happens, and 2-3 weeks later you see that the index has grown with scrapers.

      I think it is a good idea to improve the spam reports in this fashion.

  282. Hey Matt, I would like to see not only 3 months of search data, but full data kept available. That would be very helpful for analyzing the coming year’s traffic through Webmaster Tools.


  283. How about a disavow link/domain review tool that highlights any links which were inadvertently uploaded to be disavowed (potentially good links)? With so many links to disavow, it’s easy to make a mistake. And to take it another step… how about ranking and classifying the spammy links disavowed?

  284. I would like to see Google give each link a value, say 1-10. And we could download that full list.

  285. Hi Matt, I would like to see an upload option for a bulk list of URLs for “Remove URLs” requests. I would also like to see the “Disavow Tools” option inside the “Google Index” menu.

  286. My 5 cents:
    – more than 2,000 search queries, to cover large sites better
    – an API to get link and search query data

  287. Yes Matt, most people are turning to this tool after the “not provided” keyword issue, and you can get a good analysis of your website through GWMT.

  288. I had the chance to put this suggestion to Syed Malik during GdayX Mumbai last month, where he was a speaker.

    Right now, when you tell Google which site version you prefer, www or non-www, you need to validate ownership of both versions first. If that’s not done, instead of requiring the user to go back and add a site, the UI could be simplified to include a validate-ownership option in that very section.

  289. Hi Matt
    Webmasters would like some real support from Google; some real answers to genuine queries. The Google Product Forums are populated by SPAMMERS AND TROLLS; they have some sort of territorial mentality; the moment someone asks a question, they pounce on him with accusations ranging from plagiarism to spam to low content, and all kinds of insults, so no one comes back again.
    As head of webspam, I would like you to look inwards at your spam-filled forums and start thinking about giving some real support, not abuse.

  290. A complete list of pages indexed by Google.

    Currently the “index status” figure in Webmaster Tools often reports thousands more pages than a “site:” query – but there’s no way to find out what they are! A big problem if you want to NoIndex them, e.g. for Panda recovery.

    +1 for Fat Pings idea.
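    (The per-page NoIndex mentioned above is set either in the page markup or as a response header; a minimal sketch:)

```html
<!-- In the page's <head>: ask search engines to drop this page from the index -->
<meta name="robots" content="noindex">
```

    For non-HTML files, the equivalent is an X-Robots-Tag: noindex HTTP response header.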

  291. Hi Matt, it would be great if you could provide all those thing you mention on the list.

  292. When exporting a list of Not Found errors to Excel, include the Referring URL data in the export!!! Super valuable for contacting sites about corrections, or for removal/disavow if they are bad links.

  293. It’s really awesome to see improvements coming to GWT. Looking forward to using them live 🙂

  294. Matt,

    I know what the smaller brands need. I can tell you that in the competitive market I work in for my clients, my work stands out amongst the best in SEO (you would likely be impressed).

    1. Most importantly, a clear alert of an algorithmic penalty. It makes little sense to eliminate some of the most useful sites online (e.g. via Panda) and not let the owners know about it, sometimes due to sheer accident (e.g. a site has published numerous stubs through technical error, or brand links have mistakenly tripped Penguin). The problem is, for Panda especially, it only takes a small amount of bad content to bring down a large site. Obviously you don’t need to be explicit, but even a general pointer to the type of issue (e.g. “your biggest problem is too many empty stubs”) would be even better.

    2. Optionally, simplify by creating a disavow interface with checkboxes and grouped domain links, instead of making us worry about formatting; ideally with anchor text for links, nofollow status, etc. Trust me, you have instilled enough fear in SEOs to prevent casual spamming, so adding these features won’t help the hardcore spammers.

  295. I’d like to see a Sitelink analysis tool – to understand why some sitelinks appear and not others, or in some cases no sitelinks appear at all.

  296. After reading these comments, it’s clear the top requests relate to algorithmic penalties: how to better identify and resolve them. It will be interesting to see whether Matt acknowledges this and whether these would be a serious consideration to add.

    Also, Matt, would it be possible to summarize the comments here by what you see as being the most important/requested items?

  297. More info on algorithm updates; something to tell you when your site might have been hit, and why. A bulk URL remover would be nice too.

  298. I would like to see improved speed and accuracy in the Google fetch feature. I have seen some major changes in the last year; the process of updating probably needs to be slowed down, as too many updates will confuse users.

  299. Why do site owners need to respond to ‘bad inbound links’? Surely disallowing any positive benefit from bad links is enough. Small site owners shouldn’t be penalised just because they can’t recognise a bad link and navigate their way around Webmaster Tools.
    Get a grip Matt – not all your customers are SEO experts.

  300. a link diagnostics tool would be awesome! one that would show what links Google considers to be bad and good.

  301. What would be very helpful in Webmaster Tools is to know where the impressions occurred. A simple map that tells me the locations of the visitors. Analytics has this feature and I would love to see it in Webmaster Tools, including CTR information per location.

    Analytics has this information, but it doesn’t tell me anything about CTRs and positions per region. All Analytics gives is the total number of visits. This information would be so helpful for identifying variation between search phrases per region in relation to positions per region.

  302. Hi Matt,

    Thanks for opening the suggestions. My two cents :
    1) We lost the capability to know the keywords typed to reach our websites. This causes a very big issue. Why not offer a proper solution instead of simply removing the old keyword tool?

    2) Configuring authorship + publisher markup is a mess for guest blogging. Why not clarify and simplify all of this with an all-in-one solution?

  303. Matt,
    I have the following feature requests:
    – Labs: Author stats that show only a combined view don’t add much value

    – Sitelinks: Can we improve it in a way that links that we demote are immediately removed from the index? (like URL Removal request turnaround time)

    – Search Queries: “< 10 clicks” doesn’t convey anything. Since Google has the number, why not show the absolute figure there? Even keywords with only 5 hits may be important information for us 🙂

    – Email alerts: a monthly/weekly summary by mail – many webmasters don’t open WM Tools on a regular basis, but would definitely open it upon a mail alert about an issue

    Thank you,

  304. 1) SERP updates in real time!!!!! would be great
    2) a graph with the percentage of each anchor text
    3) rank of inbound links
    4) rank of website pages
    5) the other 198 factors for going up in the SERPs (beta version) 🙂

  305. Webmaster Tools didn’t show me a backlink from a Blogspot subdomain. Why? Is it bad?

  306. I don’t know if this is about Webmaster Tools, but I really would like to live in a time when content is the only king. Reading many sites, I see that it is still not as good as it should be. I would like to see more info in Webmaster Tools about site content analysis, and I would like it to be faster. Sorry for my language.

  307. Would absolutely love if there were some way we could be more clued in as to why a website gains/loses rank. We switched our site to using https only (with a 301) and went from page 1 of SERP’s to not even ranking, and it would have been great to know if this was why or if something else caused us to drop off so drastically. I understand you guys can’t give out your secrets, but at least give some hints to help lead us to the right answers or to effective solutions. This process has been so frustrating!

    • I agree… and if you have a website that is poisoned for whatever reason and you 301 redirect it to a new, virgin website, the poison (algorithmic penalties) will transfer over and sicken that website too.

    • So you do a drastic thing like going from http to https, use 301s to redirect old to new, and you STILL aren’t sure if that’s the reason you lost rankings?

      If something happens shortly after you did something to your site, and you’re sure you didn’t do anything else, then you can be pretty sure that’s what caused it. Yet you are so proud of your own work and decisions that you still want to believe it might have been something else.

      I get the emotional factor behind it, but really, all you have to do is search for “SEO moving to https” and then there is no need for doubt anymore.

      Actually, you should have done that search BEFORE you decided to move to https, so you would be better prepared.
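      (For context, the http-to-https move discussed in this thread is normally done with a server-level permanent redirect. A minimal Apache sketch, assuming mod_rewrite is enabled; the host name is a placeholder:)

```apacheconf
# .htaccess: permanently (301) redirect every plain-HTTP request to HTTPS
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://www.example.com/$1 [L,R=301]
```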

  308. Yeah you mentioned the points that I would want to improve too, I agree 🙂

  309. I really like the current layout and content of webmaster tools, it’s a great resource. The only one consideration is that I would like control over which sitelinks take priority. It would give the user a better experience in being directed to key pages.

  310. 1. Show the full crawled link tree, and display and allow customization of a site’s snippets across Google (Google Search, Google Now, Google+, Google Glass, etc.). Allow customization of which links/link trees/files to show and how to show them – which snippet type to associate them with.
    2. Do a full merge of Google Analytics and Webmaster Tools into one product.

  311. Something about Knowledge Graph features? How about submitting contents to Google’s Knowledge graph database through a standard procedure?

  312. Allow webmasters who have received algorithmic penalties–especially those who have played by the rules but can no longer perform for brand terms–to understand the reasons why they have received the action. This would be helpful in cases where the Google Help Forum and Hangout members aren’t able to provide answers. It would be a win for both Google and the organic search community.

  313. Hi,

    1. (Not your department) The new look in AdSense is nice, but it takes away the link to Webmaster Tools.
    2. Allow the tools to go further back – at least 12 months, so you can see traffic trends.
    3. Allow us to update the snapshot images at least once a month.
    4. You should be able to show us how our sites are pieced together in a graphic format, instead of just the (Internal Links) list as it is now. Right now it looks ‘unGoogle’.
    5. Geographic target – we should be able to pick at least 3.

    Yep thats it!

  314. I think Webmaster Tools should include an option showing full details about incoming bad links which are harming the search engine rankings and performance of a website… so that the website owner can remove the bad links ASAP.

  315. Richer data under ‘Search Queries’ would be great 🙂

  316. Matt,
    It would be great if you could add a function that lets us see all the links that Google has in its index for our own website. I use Joomla, and during an upgrade to the latest version I hit a major bug in one of the modules I was using (I couldn’t figure out which one), and Google suddenly indexed thousands of newly created pages (a huge network of duplicate content and crazy web addresses) when my website has fewer than 100. I have no way of knowing the addresses that Google created and indexed, and removing them is very challenging (other than adding some commands to the htaccess and waiting months, if not years, for Google to remove them from its index, we can’t do anything). This is very frustrating and can kill a business (I used to rank and then disappeared from the rankings, and I am confident this is the reason, as nothing else changed).
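    (One hedged way to do the “commands in the htaccess” the commenter mentions is to answer the machine-generated junk URLs with 410 Gone, which crawlers treat as a stronger removal signal than a 404. A minimal Apache sketch, assuming the bogus URLs share a recognizable query-string pattern; the pattern here is purely a placeholder:)

```apacheconf
# .htaccess: return 410 Gone for junk URLs created by the broken module
RewriteEngine On
RewriteCond %{QUERY_STRING} ^option=com_brokenmodule [NC]
RewriteRule ^ - [G,L]
```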

  317. Addition to the current API to extract keyword data.

  318. First I love Google Webmaster Tools, well done Vanessa Fox and the team that continues to build it. Here are a few items that would be cool additions.

    I would like to see a better sample and breakdown of branded vs non-brand keywords without having to export and filter.

    I would like to see a notification of pages that Google feels need quality enhancement (i.e. a list of pages in supplemental results) – like in the old days, when we could see the “supplemental index”.

    Refreshing the existing data in Webmaster Tools faster.

    An easier (or maybe quicker) way to get GWT data for a specific section (or sub folder) of a website. For example let’s say that I wanted data for only my /products or my /blog section of the site.

    Additional CTR data (i.e. aggregate average of CTR% depending on position in a Google SERP).

    I know this is available via GA, but perhaps more insight into site engagement. An aggregate of the types of content that users are engaging with (i.e. blog post vs. traditional web page vs. video, etc.)

    Thanks Matt!

  319. Hi Matt,

    I run a small business and manage my website internally. On my end it would be helpful to get advice from webmaster tools on how our current site’s content and page structure could be improved from an authorship standpoint. This way we can improve our content and overall website with the focus on providing our site visitors the best user experience possible.

  320. I have had a recent issue with the rankings in Google. I usually appreciate every little change that is introduced and make changes to our site accordingly. But, there are some things which are beyond my control that affects my SEO rankings.
    a) There are anonymous users (such as a disgruntled ex-employee or a competitor) who can bring down my keyword rankings by creating Blogspot sites that talk ill of my website. And they do a damn good job too. This requires me to enlarge my SEO team just to keep my pages up to speed. I feel this is an unnecessary expense for my startup, which could easily be eliminated by the Google spam team.
    b) My website’s URLs are copy-pasted on irrelevant sites by these same anonymous characters. This contributes to backlink spamming as well, I suppose. Again, I cannot control this. It has forced my SEO team to check backlinks every day/week to report bogus backlinks.

    I would like to know if your spam team can stop this from happening from 2014.

  321. Well, if this gets implemented it would be a much appreciated thing in 2014. There are lots of confused webmasters who are still wondering where and when they went wrong. Trust me, all those penalties and punishments still pain us. Of the points you mentioned, I really feel these would do the trick for both the search engine and webmasters. They are as follows:

     Improved reporting of spam, bugs, errors, or issues. Maybe people who do very good spam reports could be “deputized” so their future spam reports would be fast-tracked. Or perhaps a karma, cred, or peer-based system could bubble up the most important issues, bad search results, etc.

     Option to download the web pages that Google has seen from your site, in case a catastrophe like a hard drive failure or a virus takes down your entire website. ( Less Priority – Though a good one)

     Periodic reports with advice on improving areas like mobile or page speed.

     Better or faster bulk url removal (maybe pages that match a specific phrase?).

     Improve robots.txt checker to handle even longer files.

    Thanks in Advance..

  322. I would like to see improved speed and accuracy in the Google fetch feature. Last year I saw some major changes in Google…

  323. Give us back the Keyword Tool that used to display the number of searches for any given search string. The new tool is not useful.

  324. Hi Matt,

    we would like to see the following points in Webmaster Tools 2014:

    – Show all (100%) crawled links + the date the domain was last crawled
    – Give tips on which links may be bad
    – A function to disavow a link with one click
    – A temporal link growth diagram (daily or weekly)
    – Link statistics regarding domain popularity, IP, C-class…
    – A Google visibility index similar to the Sistrix visibility index

    Thank you


  325. Maybe you can do this already; I can’t seem to find an answer. I would like to be able to use the disavow links tool preemptively, to keep scraper sites from putting up too many links to my well-ranked pages. I recently had a case where a site I promoted, while being very careful to stay within Google’s guidelines, was in 1st place and got bounced back to page 2 by a scraper site putting up hundreds of links on irrelevant pages.

    While wishing, it would be nice if there were a way to ask Google to set a limit on the number of links from any one site that are factored into ranking; I think that would curb the amount of havoc the scraper sites can wreak. A link diagnostic tool would be magnificent! While I’m on a roll, you could have Jennifer Aniston pick me up for dinner at 7:30 sharp!!

  326. I haven’t read all the comments and I don’t know if this has been said, but to counteract the “not provided” effect, it would be very interesting to have the chance to know the exact words users typed to arrive at a page.
    I really wish and expect that this is your present for us 🙂

  327. It would be better to have crawl rate control by the hour, so that webmasters can adjust the crawl rate depending on traffic. Most local websites have visitors in the daytime and less traffic at night. In those cases webmasters could set a higher crawl rate during night hours.

    This is also helpful for websites which have daily maintenance/downtime in a particular time of the day.

  328. If we could know why a website ranks below the other results, it would be really great – whether it is because of low content quality or because of low-quality backlinks.

    That would be fantabulous for webmasters: to know this about their website and then come up with some great resource.

  329. Hi Matt,

    I’m very happy with the latest Webmaster Tools interface and Analytics interface.


  330. Show us the artificial links, so we can do something to improve our rank.

  331. two things:

    – Notification when links on my site point to an external page and the content of that page changes significantly

    – A list of all the spelling errors on my site

  332. Hi Matt,

    Personally I think the disavow tool should be looked at again. As the internet becomes more and more competitive, people are finding ways to eliminate the competition, and the easiest way someone could do this is by pointing hundreds or thousands of spam links at another’s website. An instant disavow tool would be great: there are sometimes links pointing to your site that you can’t get rid of no matter how hard you try, and then Google takes months to realise you have submitted links to be disavowed; by that time you have lost hundreds of thousands of viewers, and Google has been punishing you the whole time.

    I also think that a checklist of basic SEO objectives should be introduced. This would eliminate hundreds if not thousands of spammy blogs that are just a pain, by notifying webmasters that they have too many links on a page or that there is simply not enough content to be constructive. It would let webmasters realise when they are lacking in the basics of SEO.

    Also, I think that combining Google Analytics and Webmaster Tools would be great, so that you can view all of your data in one place. More information on the keywords we rank for would be productive as well, allowing webmasters to see what they are ranking for (more specifically) and to target new keywords more easily. Incorporating your social networking presence would also be great, letting webmasters track mentions, likes, and other social networking activity.

    Well, those are just a few thoughts, and hopefully you guys will take them into consideration.


  333. I’m a Google fanboy to say the least, but one thing that Bing does better is WMT. I actually prefer Bing WMT over Google’s, except that there is less data to look at, since traffic volume from Bing is much lower.

    I’d love to see Google roll out some of the features in Bing WMT in 2014.

  334. SERP position updates in real time.
    Right now we only see the previous month’s data.

  335. I agree….. SERP position updates in real time.

  336. I’d like to see a Sitelink analysis tool – to understand why some sitelinks appear and not others, or in some cases no sitelinks appear at all.

  337. I would say I agree with those who have asked for a way to know which links are the bad ones. In addition, it would be a good idea to be able to send Google “fat pings” of content before publishing it on the web, “to make it easier for Google to tell where content appeared first on the web”, like you rightly mentioned.

    The reason I think the ping is important is that sometimes a smaller, less well-known site will publish an article, and a bigger, more popular site will scrape it and end up being recognised as the first to publish it. The ones who did all the work end up with no reward, just because they are small.

  338. I would like to see more information about local search, keywords, and demographics. Improving local search, and building tools that help local businesses thrive online, is very much needed. Link building is somewhat limited for local businesses: we have to hunt for local places we can get links from, because most people don’t get up one day and decide to link to a local business site. There needs to be more in place for local search, to make the local experience better. Thanks!

  339. Make it easier/faster to claim authorship or do authorship markup.

  340. I want to see hot girls in Google Webmaster Tools. Everyone loves hot girls; if we just had a photo in there, that would be pretty cool.

    Other than that: SERP position monitoring, either in real time or daily. I am sick of this “avg” business. We need to be able to monitor our keywords and the position we hold for each one; the average stands for nothing when some keywords average 4 for me but doing the search puts me on page 30.

  341. Hi Matt,

    I would really like better communication from Google to webmasters.
    Better messages about whether a disavow has been effective or was missing bad links,
    better spam details, etc. Just more information.

    Thanks, Joel

  342. Tell us if the links are bad or not.

  343. I use Webmaster Tools every day and it is a great tool.
    The only improvement I can think of at the moment would be to allow website owners to select which sitelinks they would like to appear under their main URL in the SERPs. If used correctly, I believe this would also make the SERPs a better place, by giving site owners the chance to show users other pages of theirs that might not usually show.

  344. The crawl stats tab could be improved with a bit more explanation, or a traffic-light system to show whether the stats for our website are OK or not doing too well.

  345. I would like to see an integration with social shares in Webmaster Tools in 2014. I would also like links that you disavow (because people linked to you against your will and you can’t remove the links) to no longer appear in your link profile.

  346. Matt, how does Google justify Zillow’s domination of all search results related to real estate? How is this not Overstock or J.C. Penney all over again? How can a company in Seattle, Washington have the number one spot for “Bangor Maine Real Estate”? Or “Boise Idaho Real Estate”? It is a travesty. Real estate buying is about as local an activity as you can get: you can’t ship it or move it. And yet they are allowed to use Google results to squeeze local real estate agents.

  347. How about a tool that will let the business owner know if the site s/he is going to be posting a comment on has them as doFollow or noFollow? Possibly as an app for Chrome.

    • Why would you need a tool like that unless you are purposely posting just to try and get backlinks to your site?

      If you want to post a genuine comment on a site or blog you wouldn’t care whether it is a follow link or no follow link.

  348. Personally, I’d suggest taking a look at what Majestic SEO does when they connect their tool to your GWT account.

    Second to that, I’d love to see actionable items, like we already get in PageSpeed. Example: “this is going wrong; you should do this and that to mitigate it.” And then keep track of the actions undertaken (a change log), as well as their impact on your website quality (number of errors, backlinks, CTR, etc.).

  349. I want to see [refreshing the existing data in Webmaster Tools faster or better], because Google only updates backlinks a few times a month, while new backlinks arrive every day and have a huge impact on our website. So we use the ahref.com tools, because they show links discovered as little as 30 minutes earlier, which helps a lot: if we get a bad link, we can contact the webmaster quickly and have it removed. But that service is paid; we have to spend $80 for a month of it. So if Google updated backlinks within days, it would be so nice.
    By the way, Mr. Matt Cutts, thanks for guiding us!

  350. A report feature that shows a quality score with areas that need improvement on a particular website.

  351. I looked at WMT and found a manual action message today. The only thing we did recently was send out a press release; our URL was in the title, but I’m surprised this sort of thing could trigger an issue.
    The bells and whistles aren’t as important as having things working properly.

  352. Google is in the unique position of having a large majority of search data: the history of trending topics, demographic information about users, and so on. I would like to see something along the lines of tailored content suggestions for a given website inside GWT. Google users want to find valuable, helpful information, and Google knows what people are looking for as well as which websites are trusted, good references on various topics. Why not have a feedback loop suggesting to the webmaster what Google thinks would make a good content addition to the site? Ways to improve your content. I know this is wishful thinking, but if you could suggest what “might” do well if a particular website published it, based on what Google already knows about that site’s audience and trustworthiness and what people are looking for within that topic, it would be very valuable IMHO.

  353. In WMT, backlinks are recorded but not their anchor text. Is there a reason for this?

    A question for Matt: in relation to ranking, is it only the links that appear in WMT that matter? From my point of view it would make more sense to organise the system that way, in terms of resources.

    I have a plethora of questions about WMT. One in particular concerns Search Queries: why don’t they ever record true positions?

  354. These things would matter most to me:

    1. Send Google “fat pings” of content before publishing it on the web.
    2. Better tools for detecting or reporting duplicate content or scrapers.
    3. Better or faster bulk URL removal.
    4. Refreshing the existing data in Webmaster Tools faster or better.

  355. I really want a tab for algorithmic penalties, so that we can find out which update our site was penalized by.

  356. Webmaster Tools helps us protect our site’s content and avoid spam, and it gives us backlink data that is very useful for tracing our site’s backlinks and seeing where our links are posted.

  357. I know this is off-topic, but Experts Exchange presents different info to search engines and to visitors. Their whole business is based around this (the answers, which help them rank higher, are shown to the Googlebot user agent but not to users). This is similar to what bmw.de was doing.

    Is Google being bought by Experts-exchange.com? If not, then this is a tremendous boon to anyone who wants an SEO shortcut (e.g. the nytimes.com archives, anyone using paid content, etc.).

  358. Most useful thing would be more accurate search queries data, and available with a much longer backlog and with better API integration (e.g. specifying date ranges). Would really help with the “not provided” issue.

  359. I would like to have the “better or faster bulk URL removal (maybe pages that match a specific phrase?)” feature. It would be extremely useful when we have to remove pages that share the same phrase in their URLs but end in a number that differentiates them. On my website there are URLs like “http://www.carspecwall.com/bmw/6-series/gran-coupe/m6-gran-coupe/m6-gran-coupe-2013/bmw-m6-gran-coupe-2013-wallpaper-001/” and “http://www.carspecwall.com/bmw/6-series/gran-coupe/m6-gran-coupe/m6-gran-coupe-2013/bmw-m6-gran-coupe-2013-wallpaper-002/”. The content on these two pages is different, and if I ever change the parent URL (I have 190 of these pages, all with different images), I would have to remove them from Google results one by one. If the feature you mentioned were available, this whole hectic process would become easy.
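    A removal batch like this could at least be assembled programmatically. A minimal Python sketch, assuming the numbered-suffix pattern from the URLs above; the resulting list is what one would feed to a hypothetical bulk-removal tool:

```python
# Build the ~190 wallpaper URLs that differ only by a zero-padded
# counter at the end, so they can be handled as one batch.
BASE = ("http://www.carspecwall.com/bmw/6-series/gran-coupe/"
        "m6-gran-coupe/m6-gran-coupe-2013/")

def numbered_urls(base, slug, count):
    """Return base + slug + '-001/', '-002/', ... up to count."""
    return [f"{base}{slug}-{i:03d}/" for i in range(1, count + 1)]

urls = numbered_urls(BASE, "bmw-m6-gran-coupe-2013-wallpaper", 190)
print(len(urls))   # 190
print(urls[0])     # ...bmw-m6-gran-coupe-2013-wallpaper-001/
```

    The same list could also drive a scripted series of single-URL removal requests until a real bulk tool exists.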

  360. A full list of “bad links” is something we will never get. Google wants the black-hatters/spammers to put effort into cleaning up the mess they made (I admit I did some bad things too in the past). You need to evaluate your links using WMT, Majestic SEO, etc., and contact webmasters to remove them. This can take months and MAY be impossible in many cases; think “Chinese link farms” and the like. Matt of course KNOWS it’s not an easy task to clean up your mess… because that’s exactly how it is supposed to be 🙂

  361. Google Webmaster Tools should also show sites with related anchor text, i.e. competitors’ websites with the same keywords and searches, so we don’t need to go somewhere else.

  362. Hi Matt

    Maybe off topic, but…

    I noticed a line saying “duplicate content”.

    How is that handled when you have a website in, let’s say, six different countries, each with its own TLD (domain.nl, domain.be, domain.de, etc.)?

    Some basic content is the same for every country, just translated… is that marked as duplicate content too?



  363. Matt, THANK YOU and yes to all of the above.

    In particular, adding these two:

    1. Option to download the web pages that Google has seen from your site, in case a catastrophe like a hard drive failure or a virus takes down your entire website.

    2. Checklists or help for new businesses that are just starting out.

    But YES (PLEASE) to all of them! I am drooling at the thought.

    Thank you in advance,

    Danny in Chicago

  364. Indranil Bhattacharjee

    Hi Matt,

    I have used the link disavow tool to get rid of bad links after a manual action, and I have also filed a reconsideration request. Could there be feedback or updates on the status of those requests? I am not sure whether the links have been removed or not; if they have not, kindly let us know what we should do to get them unlinked. The site’s ranking is affected by bad links, and I want them gone, because this is hurting my small business’s growth. Is there a finite timeline? I feel quite helpless in this regard, as I have followed all the instructions Google provides on bad link removal.

    Your feedback will be appreciated.

    Best regards

  365. Preventative SEO feedback.
    In many cases SEO is more about what not to do than what to do.
    Many web developers come up with creative solutions to product problems that can be interpreted by Google as spam. It would be great if there were a way to get an OK from Google on whether a certain solution is viable, rather than taking the risk of having your site penalized.
    In many cases, trying to err on the side of caution actually limits the user experience of your site.
    For instance:
    one of my programmers came up with the following as a solution to a product restriction we have, using iFrames.
    But since I can see this being a method that could very easily be used (and misinterpreted) as a form of cloaking, I rejected the solution in favor of something that was MUCH less practical and fair to my clients.
    Example Code


    sendMessage('remove("hostContent")') /* remove content from host */

  366. Hi Matt,

    I would like to see some features like:
    1) geo-targeted keyword rankings
    2) a quality score for backlinks
    3) options to fix spammy links quickly


  367. Given that Google all but ignores meta keyword tags, it would be nice if Google let you enter one keyword in Webmaster Tools that you, as the webmaster, believe you should rank for. Google could then do its Google-ness and see whether you’re getting traction on that ranking: whether it’s a good ranking for your site given bounces, CTR on impressions, traffic, PageRank, etc. If Google verifies that your site is getting traction on that one keyword, it lets your site rank better for that short-tail popular keyword. It would be a great way to balance webmasters who market for long-tail keywords while maintaining competition for the short-tail popular keywords.

  368. I would be happy if you would just moderate your Google webmaster forum, because some of the people posting there at the moment are adding no value and just trying to bully people who have genuine issues with their websites.

    Google is the world’s biggest company. How does it look when you have “hotlines”, personal representatives, incentivised sales (free vouchers, etc.), and proactive follow-up calls for your AdWords service, but a pack of rude hyenas taking the mickey out of someone who needs help with their website and wants to know how to make it more pleasing to Google so that it can rank in the organic search results?

    How do you think that looks to the masses?

  369. Hi Matt,

    My traffic has been going down since May and has taken another nose dive this month. I am not into SEO; I just try to create a unique website.

    I have had a Webmaster Tools account for a few years, and the only thing I ever really used it for was to change my preferred domain a few years ago.

    So yesterday I checked my Webmaster Tools and found I have over 250,000 links. I have never bothered with link building; I just do what I do, and if someone links, that is great. So I was rather surprised to see that many (is that all of them, or just a portion? If just a portion, I cannot imagine how many I have in full).

    Some seem a little high, like pinterest.com with 10,000, but I presume that is just people pinning some of my images. Others seem to be big-time scrapers who have just scraped my content (a few of them with 15,000 to 20,000 links).

    As an ordinary webmaster I would not have a clue how to remove all these types of links, or even how to stop them linking.

    The problem I feel with Webmaster Tools is that it seems to be designed for SEO-type webmasters, not ordinary webmasters who have very little knowledge of SEO.

    Because I am not that SEO savvy and have never done any link building, I never thought to check which sites were linking to me.

    Just looking at my report: if all these links were the problem, it would have helped if Google could have sent me a message saying “Unnatural links detected”, with a caveat that maybe I should try to do something about it.

    I do have another website that uses some of my content under licence, set up many years ago when they were still quite small. They have grown, so the links have grown, but the content they use is in a folder set up just for them, with a setting in robots.txt:

    User-agent: *
    Disallow: /foldertheyuse
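    A rule like the one above can be sanity-checked locally with Python’s standard urllib.robotparser (the example.com host is a placeholder, and /foldertheyuse is the commenter’s own placeholder; note that robots.txt blocks crawling, not indexing, so URLs Google already knows about may still appear in results):

```python
from urllib import robotparser

# Parse the rules directly from raw lines, without fetching anything.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /foldertheyuse",
])

# Every crawler is blocked from the folder; the rest of the site stays open.
print(rp.can_fetch("Googlebot", "http://example.com/foldertheyuse/page.html"))  # False
print(rp.can_fetch("Googlebot", "http://example.com/other-page.html"))          # True
```

    Checking the rules this way before deploying avoids accidentally blocking more (or less) of the site than intended.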

    The pages are just information pages, with no links on them back to my site, and are not even set up as webpages.

    I have tried to do some reading on disavowing links from other sites, but where do you start? It would be nice if Google made it simple for non-super-techies to just tick a couple of boxes to say “please do not count these links”.

    Sorry to rabbit on, but you did ask what would be nice to have in future Webmaster Tools.



  370. Show suggested reasons why our sites drop way down in the rankings. We think we are doing everything right and by the book, and then bam: other sites with no content, no links, and a cheesy design jump to the top, and the next thing you know you are on page 2 with no idea why and nobody to ask how to improve. Google seems to have been punishing those who do it right and rewarding those who do nothing this past year.

  371. You have covered the main points from my wishlist. I guess I’d be most keen to see proper notification of algorithm changes and frequent/scheduled PageRank updates.

  372. Oh, Matt, it may not be a feature for Google Webmaster Tools, but I am sick of seeing posts I have shared when I’m logged into my Google account. If I’m logged in and search for something I’ve written about, it will show me my existing update on Google Plus, which I know about because I wrote it.

  373. What I would really like to see is you leaving Google and starting your own search engine company, one that actually works like the old, stable Google search engine before the insane updates and the worthless disavow list.

    You could get backing from a zillion investment bankers in the Valley.

    And you would make a few hundred million people very happy if you started a new search engine with an ad system similar to AdSense but better. As you know, if you started fresh, a lot of the current problems, workarounds, and solutions you are working on at Google Search could be built into a new search engine from the very start, without all the fallout.

    Google search is like AltaVista and Netscape: its time has come and gone!

    It is time for a new, stable, clean search engine.

  374. Hi, Matt;

    I want to see inbound links to my website reported more effectively in Google Webmaster Tools; unfortunately, not all inbound links show up in the latest Webmaster Tools. In 2014 I’d like to see detailed reports on the manual spam actions applied, and reports describing why sites were affected by the new algorithms. I also want to be able to verify the inbound links to my site.


  375. I would like to have a tab “Compare to Google Properties”.
    Then I’d be able to see how my site compares to Google properties, and the reasons why any given Google property has better SERPs than mine (algo, manual, greed, etc.).
    My 2 cents.

  376. Matt knows better, but in my view, no: if you have two websites, A (English) and B (French), with the same content, Google or any other search engine will not treat it as duplicate content. Once translated, the text and the code both differ; if you understand binary, you will be aware that the binary for the two languages turns out different for the same piece of information. Duplicating content is unethical, but presenting information in translated form is not, provided you have permission from A’s author to translate it into B.
    Hope this helps. Thanks.

  377. I was wondering if there is an advantage, in terms of SEO and ranking, in the language a site is coded in. I have noticed that sites built with ASP seem to rank and get indexed better than PHP sites, but I just wanted to check, and to share this with you and your audience.


  378. Finally, there shouldn’t be any scope for SEO tips and tricks. All I want to see is SEO rules with no loopholes.

  379. We want to have the causes of algorithmic penalties, of course…

  380. How about you fix the bugs in the existing Webmaster Tools?

    Does “indexed: 0” mean a page is not being indexed? If so, why not? Or is the report broken?

    “No data”: well, that’s useful. I can’t fix it if you don’t tell me what’s broken.

    Spammy sites: I need a bigger box to enter the details in. For every site I find with spammy links, I find literally hundreds of incoming dodgy sites, each of which has hundreds or thousands of outbound links. I need to be able to upload a spreadsheet, not enter them one at a time.

    Get what you already have working before adding more stuff, because from where I am sitting Google itself is irreparably broken, and yet we have no choice but to use it.

  381. A link diagnostics tool would be great. We spend many hours with programs that may have bad information! Please help us.

  382. We want a comprehensive list of the links that Google considers spam or unnatural.

  383. Hi Matt,

    I would like to see more backlinks, and a report that shows 100% of your backlinks.
    I’m struggling to find my backlinks; I have tried many online backlink checkers, but none of them are 100% complete.

    Nothing more.


  384. Matt, what I would like to see in Webmaster Tools is a way to know why a site was ranked lower on search results pages. I see sites with no reported errors demoted for no apparent reason, at least none that one can see with the naked eye! I realize this may be hard to build, but it’s on my wish list nevertheless!

  385. I would love to see some form of “support ticket” system. The webmaster community is great for generic questions and answers, but not so great for individual cases; you often get “are you stupid?” and other mock responses from some of the brown-nosers on there. Support tickets would be particularly helpful for reconsideration requests, where you can be waiting days, weeks, months… At least with a support ticket system you would feel more confident that someone is actually there, actually updating you with progress.

  386. In addition to my “support ticket” suggestion, I am in agreement with many above when it comes to the epidemic of “negative SEO” on customers’ sites. I find it incredibly frustrating that in clear-cut cases, where a competitor has used just such tactics, a disavow of all the links is in most cases NOT good enough. You expect the customer to document evidence that they contacted all the bad-link webmasters, etc. This is a monumental task, let alone for the average guy whose livelihood is his or her business. There needs to be some clear stance by Google in favor of a company that has been the victim of negative SEO. They shouldn’t have to suffer for weeks or months.

  387. I just wish Webmaster Tools would give my site a grade based on how good it is. No need to tell me what is wrong, just how it ranks, so I could try to fix things myself and see if the grade gets better or worse. That way I could find out what Google considers good tactics, rather than just guessing.

  388. Happy Holidays!

    Instead of the standard “unnatural link” warning or penalty, I would like to see something that resembles Gmail’s spam controls. Domains being used for link schemes and spam could be marked “Spam/Blacklist” in the dashboard, and you would then not receive the benefit (or the harm) of the links coming from that particular domain. It would be very easy, and there could be a global tally of how many people have blacklisted a domain, just like for email.

  389. Google is in the unique position of knowing what people are searching for within thousands of niches as well as evaluating the various resources (websites) providing information to those searching for information. I’d like to see some type of content suggestion feedback within Webmaster Tools to inform webmasters of what types of topics ‘might do well’ if they published it based on what Google knows about the demographics of the audience visiting the website, the search term(s) entered prior to finding the website, the engagement metrics on site, etc… I think Google could seriously make the internet a better place by guiding webmasters into the types of topics/content that would perform well on a given site.

  390. We beg you: we need authorship back again for all webmasters.

    I really find it pointless to take it away from everyone because of some poor-quality authors.

    Please, I beg you, bring this back.
    Our clients saw a 150% decline in CTR after authorship disappeared.

  391. Hi Matt
    I would like GWT to send periodic PDF reports about the site via email, like Google Analytics does 😉

    Best Regards from Spain

  392. I love the sound of being able to download a backup of your site. I have so much content on mine that it would take me a ridiculous amount of time to rebuild it from scratch if I ever had to, and I probably do not have an offline version of at least a couple of hundred of the pages.

    Also, an easier way to implement Google authorship would be brilliant. Those of us who have spent years building up good content now have hundreds of pages of it, and I have to go back through all those pages to add myself as the author. Because it is my website, I don’t add ‘written by Fiona’ to each article, since visitors know it is written by me; I only add a byline to guest content, with the name of the guest writer. Part of my plan for January is to go through and add my name to my most popular content, but it’s going to take quite a while.

    Perhaps Google could take the site owner’s details from the About page or the WHOIS record and link that to their Google Plus account, so that any content on the site not credited to anyone else would automatically be credited to the site owner. That would be brilliant. Or could there be an ‘author’ tag we could add to the HTML that would carry across the site? Although I guess that wouldn’t work if there is also guest content on the site.

  393. It would be great if Webmaster Tools 2014 could have a new section that tells us which links to our site are bad or spammy. This would definitely help us make sure no dirty site is linked to us.

  394. It would be great if Webmaster Tools 2014 could show us the value of all backlinks, or highlight which links to our site are bad, so we can disavow them easily.

  395. A very simple request about a usability issue.
    Search Queries: Impressions and Clicks use the same vertical scale, but since Impressions are normally much higher, the Clicks line sits flat at the bottom and you can’t really see its changes. Please put the two on separate (or normalized) scales.
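    The scaling issue described here is the classic motivation for normalizing series before plotting them on one axis. A minimal sketch with made-up numbers, using min-max normalization (one of several reasonable choices for the fix):

```python
def normalize(series):
    """Scale a series to the range [0, 1] so its shape is visible
    regardless of its absolute magnitude."""
    lo, hi = min(series), max(series)
    if hi == lo:
        return [0.0 for _ in series]
    return [(x - lo) / (hi - lo) for x in series]

impressions = [10_000, 12_000, 9_000, 15_000]   # dwarfs the clicks line
clicks      = [120, 180, 90, 210]               # flat at the bottom on a shared axis

# After normalization both lines span 0..1 and their changes are comparable.
print(normalize(impressions))
print(normalize(clicks))   # [0.25, 0.75, 0.0, 1.0]
```

    A second y-axis for Clicks would achieve the same readability without changing the displayed numbers.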

  396. I’d like to see improvements to “Fetch as Google”. Try to fetch more than one page and then submit each to the index: you’ll get “request failed” for every second or third request.

    I understand that you likely don’t want to open up bulk submit-to-index, but the current system is just painful to use.

  397. It would be good if Webmaster Tools did not show nofollow inbound backlinks in the list of “links to your website”.

  398. It would be nice if GWT could provide a list of links with anchor text. Currently, we can only download a list of links with their discovery date.

  399. Hello Matt,

    Happy new year. I hope 2014 will be a healthy and successful year for you.

    For 2014, I’d like to see:

    - More accurate and up-to-date data for Search Queries (average positions are often misleading), and better exports (TSV for Excel, as in Google Analytics, would be great).

    - Better handling of international sites. Why can’t we specify that domain.fr is the same as domain.co.uk? Let us specify a subfolder or subdomain for multilingual websites. Maybe an option to prevent the domain.fr site from appearing in google.com, replaced by domain.com.

    - More information on crawl stats (samples of the URLs being crawled during a major increase or decrease in crawling), and information and recommendations to decrease page load time. Show Googlebot’s crawl schedule preferences (as in Bing Webmaster Tools).

    And of course more information when dodgy links, duplicate content, or a penalty have been spotted.

  400. An option in Webmaster Tools to show bad backlinks, with a direct option to disavow them, would be a great thing for us.

  401. This is a really great post, and Webmaster Tools is wonderful. The most useful thing would be more accurate search queries data.

  402. Thanks for the blog, Matt. I think Webmaster Tools is awesome, and if some of the suggestions in your list happen, great. I’d like to see the disavow tool become an easy “remove” button for domains you don’t want linking to your website.

  403. Hi Matt

    Based in the UK, I surf the internet a lot as part of my day job and analyse Google natural search listings daily. I see many flaws in the Google algorithm.

    Of particular current interest: when searching for ‘Michael Schumacher’ (search conducted on 04.01.2014), we find the official page of the former F1 driver relegated to page 2 of the SERPs. The official page is the ONLY website carrying any ‘official’ message from the Schumacher family, yet we see it usurped by multiple pages from news/information websites all peddling the same one or two stories.

    This is a really poor user experience. I don’t need to be presented with 15 different websites communicating the exact same Michael Schumacher update.

  404. If a business shares an address with two others, and they have a common phone number, how can we verify the businesses for Google local listings?

  405. In 2014 I want to be able to change the geo-target of country-specific domains like “.in” or “.au”. Please, it would be great if you could enable this!

  406. 1) I would like a tool that removes backlinks without having to ask Google to delete them.
    2) More information about duplicate content on my pages.
    3) Full control of robots.txt from Webmaster Tools.
    4) More details about errors in the code.
    5) An SEO tool for mobile devices.

    (A catalogue of spammers and the reasons for their penalties would be a nice example to help people avoid becoming new spammers in 2014.)

    thanks Google and webmaster help!

  407. 1. Google should provide a list of all pages that are indexed (at least up to 25,000-50,000). Cleaning up duplicate content would be a lot easier if people could see where the problem is. It might also encourage people to stay proactive if they could catch Google indexing unwanted content. I am sometimes shocked to find out what Google has indexed on my sites, as I didn’t even link to some of it.

    2. Google should offer a bulk link removal tool. One link at a time is crazy. I have a site with 9,000,000 pages of empty phone-number template pages for useless numbers. A noindex tag seems to be the only way to fix this problem, and it may take months.

    3. An index dumper. Sometimes starting over from scratch is the only way out, or people didn’t put up their robots.txt file before development finished. I wish I could just say, “Hey Google, erase my index completely.” All you can do now is get the results out of the search engine results pages while you wait to be recrawled; there is no real, final deletion all at once. I want the power to say “delete it all now.”

    At the very least, offer a duplicate-content index dumper: something that provides a list of suspected duplicate content and lets me delete links with specific link structures in bulk.

    4. A link-structure whitelist and blacklist, so all duplicate content can be ignored by default.


    mysite.com/search.php?q=xxxx+xxxx (good link)
    mysite.com/search.php?p=xxxx%20xxxx (bad link)
    mysite.com/search.php?p=1&q=xxxx+xxxx (bad link)
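    To make this concrete, a per-path parameter whitelist like the one above could be checked locally. A minimal sketch in Python; the WHITELIST mapping and the URLs are hypothetical, mirroring the good/bad examples:

```python
from urllib.parse import urlsplit, parse_qsl

# Hypothetical per-path whitelist: only these query parameters make a
# canonical URL; any other parameter marks the URL as a duplicate.
WHITELIST = {"/search.php": {"q"}}

def is_canonical(url: str) -> bool:
    parts = urlsplit(url)
    allowed = WHITELIST.get(parts.path)
    if allowed is None:
        return True  # path has no parameter rules
    params = {name for name, _ in parse_qsl(parts.query)}
    return params == allowed

print(is_canonical("http://mysite.com/search.php?q=xxxx+xxxx"))      # True
print(is_canonical("http://mysite.com/search.php?p=xxxx%20xxxx"))    # False
print(is_canonical("http://mysite.com/search.php?p=1&q=xxxx+xxxx"))  # False
```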

    5. A way to stop indexing for specific parts of a page. I have a dictionary search engine, and a lot of the terms will show up for different keywords entered. I wish I could have a googleoff/googleon switch so duplicate content wouldn’t be considered for indexing. You can still crawl my site and rank me (nothing to hide); just let me decide if I want something ignored for indexing.
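    For what it’s worth, a switch of exactly this shape already exists for the Google Search Appliance (not public web search): content between googleoff/googleon HTML comments is excluded from indexing. For example:

```
<p>This paragraph is indexed normally.</p>
<!--googleoff: index-->
<p>Dictionary boilerplate that should not count toward indexing.</p>
<!--googleon: index-->
```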

  408. I’d be happy to have real-time SERP updates, and backlink information too.

  409. Hi Matt

    1) We lost the ability to know the keywords people typed to reach our websites. This causes a very big issue. Why not offer a proper solution instead of simply removing the old keyword tool?

    2) Configuring authorship + publisher markup is a mess for guest blogging. Why not clarify and simplify all of this with an all-in-one solution?

  410. A way to remove FAKE reviews of businesses. Verify identity and post the real names of reviewers. If my reviews are public and my business name and info are public, then the reviewers’ names should be public, as are their comments. There is no way to make people own up to their comments; Google almost fosters anonymous reviews and fake posts.

  411. Hi Matt,

    Here’s a question from the Netherlands.
    I’ve noticed that some of my pages rank for multiple wrong keyword queries.
    So in Webmaster Tools (WMT) I used “Search Traffic” -> “Search Queries” -> “Top Pages” with the “With change” option enabled (a rough translation, but I mean the option next to “Totals”) to see the growth and decline of CTR and relevance by impressions, in percent.
    But when I unfold the details of which top page ranks for which query, I have to view every query word separately.

    I don’t ask much… but in 2014 I’d like the growth and decline data available in the unfolded view, so I can adjust, optimize, and monitor my pages as the rankings for irrelevant keyword queries decline.

    Kind regards,
    Raffiek (pronounced like “traffic” without the “t” 😉

  412. Hi Matt,
    I work for a legitimate hip-hop website that has several bloggers on staff producing original, quality content. Nothing we post is against any sort of rules. We are also very strong on social media, including G+. But as of November 26th, we’ve seen our articles moved from the 1st page of search results to the 2nd, and we all truly believe this is a mistake, unless there is something we’ve missed that isn’t part of any of the checklists we’ve found online.

  413. Webmaster Tools and Analytics are the central nervous system for any SEO practitioner, whether they are genuine or a black-hat operator selling links, fake likes, and Twitter followers.

    One idea to improve SEO practices is to introduce automatic penalties against confirmed black-hat SEO operators whose IP address is seen across multiple spammy websites and blogs that Google has access to. Furthermore, given that Google relies on social signals, understanding fake Facebook and Twitter accounts would be valuable (there are many websites out there that can detect this as well) and could be used to identify a black-hat operator.

    If a Webmaster Tools user appears to be associated with any black-hat SEO/social media practices, temporarily block them from accessing Webmaster Tools and Analytics. Tell them why they’re blocked. Give them a chance to undo all the bad they did, then reinstate their account. Put their IP address(es) on some kind of activity watchlist, and should any repeat offenses occur, the penalty gets harsher.

    Just a thought: improve quality by taking away the nerve centre these operators use to measure their fraudulent services.

  414. You mentioned recovering webpages if a website sustains a catastrophic failure. I think that is a neat idea, especially if it’s a small website with no server-side files. Over the last 12 months I have dealt with probably half a dozen support requests where a client has lost all their website files through no fault of their own, whether because their web host suddenly closed with no way to recover the files (yes, this happens) or because a rogue webmaster maliciously screwed with the website content. There are many reasons why file loss can happen, as you can probably guess.

    So your comment “Option to download the web pages that Google has seen from your site, in case a catastrophe like a hard drive failure or a virus takes down your entire website.” has my +1.

  415. Here are a few simple things I’d like to see updated in Webmaster Tools:
    1) A clear way to disavow spam links, and/or reinstate them afterward. The owner of a website should have some control over how it is linked to, with the ability to “turn links on/off”.
    2) A clear interface and controls for authorship.
    3) An easy way to determine if/why a page is being penalized.

  416. Another idea Google should consider is what I would call a “locked index”, meaning nothing can be indexed that isn’t on a specific list or sitemap. No snooping around for Google. Obviously, make it something a person has to opt into and confirm via a verification link. This would solve most duplicate content issues and get people focusing on content rather than fighting to keep their SEO perfect for Google. Autogenerated sitemaps would make it easy to keep the list updated.

    I see Google indexing irrelevant search terms on my site, clearly in an attempt to compare my site with a competitor’s (I don’t have as much content as they do). I had to stop it with an autogenerated noindex tag. People shouldn’t have to be this careful. If I want my sitemap to be all you look at, then let me control that.
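    The noindex workaround mentioned here is the standard per-page robots meta tag; it has to be emitted on every page to be excluded, which is why an index-wide lock would be simpler:

```
<!-- Page stays crawlable but is kept out of the index -->
<meta name="robots" content="noindex">
```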

  417. More ideas. I apologize, you’ve got me on a roll here.

    You should provide percentage ratings for duplicate content and thin content compared to quality content.

    Next, what I would call a Google Speeding Ticket System. If someone gets penalized, charge them $100 to be told exactly what their problem is, and a fine (another $200) to bypass the penalty. Repeat offenses get higher price tags and longer penalties. Make money off the spammers instead of this ridiculous silent treatment. With all that cash you could afford a “Google Police Station” that lets people send you plain-text complaints about whatever they think is dishonest on the internet.

    You should also consider premium webmaster tools and services. Charge money for one-hour full-site crawling, instant index dumping, duplicate content lists, full index lists, or personal SEO advice from someone at Google who will tell you exactly why your site is having problems (obviously for a high fee).

    Google’s biggest problem is that you expect too much from people. Start offering services that make SEO easy.

  418. This is definitely a good start for 2014. As webmasters, improvement is fine with us as long as it helps us in our work and betters the search engine’s results. Thank you for this update, and hopefully there will be more tools, like ones for identifying the bad links already pointing at a website.

  419. As far as I can tell, bad links are a major cause of my website’s problems. A tool for identifying these links would be extremely helpful, as would an easier way to claim authorship for articles, blog posts, etc. Causes such as spammy links should be identified more clearly in WMT for better transparency, so webmasters can fix or remove them.
    Add to this a better, more complete listing of all the links pointing to a site in WMT, and possibly a control next to each link to disavow it and/or make it nofollow. Handling nofollow this way would save new webmasters the trouble of learning HTML and how to implement it; experienced webmasters would save time as well.
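    For the disavow half of this request, Google’s existing Disavow Links tool already accepts a plain-text file, one URL or domain per line, with # comments (the domains below are made up):

```
# Disavow every link from this domain
domain:spamsite.example

# Disavow one specific page
http://spammer.example/paid-links.html
```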

  420. Santa Matt, the thing I want most from Google has nothing to do with Webmaster Tools. I want these operators back: [ ] and “ ”. I mean, I want them back like they were, “old school”, where [ ] meant exact as in completely identical: spaces, words, freaking exact. Spam hunting was so easy with that operator.

  421. How about accuracy? I have a site that Webmaster Tools says ranks for certain keywords in certain countries, and it absolutely doesn’t. Webmaster Tools says it is ranked at position 7, and it is not appearing in the first 6 pages.

    So yeah, accuracy would be nice.

  422. Matt, I have taken over a large site that had thousands of 404s. Over time I have fixed the vast majority of them. I know there are more, but the problem is I am not getting any more errors in the crawls. It has been 4 months since an error, and the download file hasn’t updated either.

    I would love the option to delete or clear this file in 2014, so that when a re-crawl is done and new errors are finally posted, I have a fresh file that I don’t have to sift through to find the new errors.
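    Until something like a clear button exists, new errors can be sifted from the downloaded file locally. A sketch in Python; the column names (URL, Response Code, Detected) are an assumption about the export’s shape, and the sample rows are made up:

```python
import csv
import io
from datetime import date

# Hypothetical shape of the WMT crawl-errors download; the real export's
# column names may differ.
SAMPLE_CSV = """URL,Response Code,Detected
http://example.com/fixed-long-ago,404,2013-06-01
http://example.com/brand-new-404,404,2014-01-02
"""

def errors_since(csv_text: str, cutoff: date) -> list:
    """Keep only crawl errors first detected after `cutoff`."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [row["URL"] for row in reader
            if date.fromisoformat(row["Detected"]) > cutoff]

print(errors_since(SAMPLE_CSV, date(2013, 12, 31)))
# ['http://example.com/brand-new-404']
```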

  423. Hello Matt,
    I’d also love to see social metrics in Google Webmaster Tools, plus a statistical report of links by type, like:
    - the number of comments, blog posts, guest posts, social shares, and social-network links. I mean a great overview that’s easy for webmasters to read without paying for a “special SEO tool”.

  424. Matt Cutts, it will be the happiest day of my life when there are search-quality metrics for social media. I have come to love following your rules after my sites were penalized; it was a good experience, and I learned many things from it. Thanks Matt 🙂

  425. Hey Matt,
    It would be awesome if there were information on whether your sites are penalised or not. Also, it would help if you could see all the links Google sees for your site, with a quicker option to get rid of spammy links.


  426. WMT geo-targeting:
    The thing is, Europe/the EU is a free-trade market. If I am in Ireland, my customer base spans all EU countries, as Ireland is part of the EU and shares a currency. If I select Ireland in WMT, one of the smallest markets in the EU, I am at a disadvantage and penalised for it. It’s also an issue regarding Ireland and the UK: if I live in the Republic of Ireland, I’ll only be targeting the south, as the North is in the UK. Frankly, it’s bloody ridiculous and Google needs to address it. There should be an option for the EU which includes the UK, Ireland, and the rest of the EU members, and there should be an option for Ireland/UK.
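    (Today the closest workaround is hreflang annotation, which at least distinguishes Irish- and UK-targeted English pages; the URLs here are hypothetical:)

```
<link rel="alternate" hreflang="en-ie" href="http://example.ie/" />
<link rel="alternate" hreflang="en-gb" href="http://example.ie/uk/" />
```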

  427. Matt, I have found that GWT can sometimes be off in keyword rankings. I’m not sure how it’s determined, whether it takes into account multiple clicks or just one-time clicks.

    I’m wondering if you could explain a bit, please.

  428. Hi Matt,
    Sounds like you’re doing it tough on holidays. It would be great if the Disavow Links tool were easier, and if it showed whether dodgy-looking links are actually bad for your site or not, as some of them are hard to tell.


  429. I’d like Google Webmaster Tools to have a better layout for us “newbies” doing our own SEO. When you’re just starting out, Analytics is super confusing too. I’m finally learning Webmaster Tools, but it sure would be helpful to get a more user-friendly layout if possible.