Feedback: Webmaster services in 2006?

How about webmasters? As a site owner or webmaster, what would make your life easier? Assume that I’ve already stressed to people the importance of nailing redirects and canonicalization. I’m talking less about stuff under the hood, and more about products or services like Analytics and Sitemaps. What would ease your stress, streamline your life, or otherwise help you as a site owner, maybe advertiser, or webmaster in 2006? If you could propose any project that would simplify the life of webmasters, what would it be?

135 Responses to Feedback: Webmaster services in 2006?

  1. How about a better understanding of penalties… perhaps a service that says what you have done wrong and suggests how to correct it?

  2. JCopia

    I’m not sure exactly what you are looking for here, but how about a penalty checker? Some kind of operator like penalty:mysite.com, to see if there are any penalties on my site, whether from the algorithms or from manual review. It is hard to fix an error on my site if I don’t know for sure that an error has been detected. It would take a lot of the guesswork out of SEO.

  3. - Top 50 SERPs for a term available via RSS.
    - Better translation services so I can read and write foreign articles, with more emphasis on the read.
    - Kill Toolbar PageRank; it still causes misunderstanding among new webmasters and small business owners.
    - More cowbell.

  4. I agree with what Adam Senour said in Feedback on Bigdaddy datacenter article about having a penalty:domain.com command…

  5. Yes, how about alerting a site owner if their site has been penalized, and why?

  6. Dan

    I like the penalty:domain idea too. Maybe it could be integrated with the sitemaps service so it wouldn’t be abused by competitors, etc.

  7. Getting rid of all the spam blogs

  8. RedSheriff

    Hi Matt,

    A penalty:domain.ext command, or some similar infrastructure also integrated into Google Sitemaps, that shows how many penalties have been applied to a site and which ones, would help improve the quality of the web overall.

    Sorry for my bad English
    :-)

  9. Well, first, being able to sign up for Analytics would be nice. I like the idea of an error or penalty checker, but I am not sure it’s feasible.

  10. anim8tr

    How about a complete conversion-tracking tool that can track a keyword click all the way through to a sale on an external merchant or third-party website.

    For example, let’s say I’m an affiliate for Widget Works. I promote their products on my website and they in turn report sales to me, either through their own tool or through another service like Commission Junction. The problem is that I have no way of knowing which keyword triggered a sale, because they don’t keep track of my keywords and they won’t let me place my tracking code on their “Thank You” page.

    If we could do this through Google Analytics it would be a big winner…

  11. Updated help pages for all things webmaster-related. I visit Google Groups often and there are many questions not being answered. This gives Sebastian a job ;) but I would also like Google to get into the game. Your advice on nailing redirects and canonicalization was the best I have seen yet. Why? Because I am trying to be my own webmaster, and boy, are there lots of questions bouncing around in my head.

    Like Sitemaps for example.

    What is the difference between a “general HTTP error” and a “404 not found”? If I have a 404 in place for all pages that no longer exist, why do I still see “general HTTP errors” listed when I log in and check Sitemaps? I want that little red error bar in Sitemaps to be clear.

    One missing page from my blog also doesn’t return a 404, while all the other missing pages do; it just shows:

    “Sorry, no posts matched your criteria”.

    There is a help page when you click the error (in sitemaps) that links you to this:

    General HTTP error – Google encountered a general HTTP error when trying to access this page. Note potential reasons for this error could include any of the HTTP errors listed above.

    AND none of the other examples give a good enough explanation, so I am still scratching my head. Could the pages not getting the 404 be ones that I deleted using Google’s URL removal tool before Matt wrote up all the helpful posts this week? If so, please include this information or expand upon the possible reasons, or there will always be head-scratchers, yes?

    Blah Blah Blah

    Yes, I may be a bit anal but if you check Google Groups many others have had and WILL HAVE this exact same question.

    Another cool thing would be an area in each tool where you can log your head-scratching event, to better Google’s customer service. This way you guys work for us, because after all, without us you do not exist. My wife is a librarian, and believe me, just that little email she got explaining the algorithm has won her over a bit more. Keep up the good work and she will be telling all her students to just “Google it”.
    :)
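The 404-versus-“general HTTP error” confusion in the comment above is usually a “soft 404”: the blog answers a missing URL with a 200 OK status and an error message in the body (the “Sorry, no posts matched your criteria” case), so a crawler sees it as a normal page rather than a missing one. A minimal sketch of the distinction, where the error phrases are made-up examples rather than any official list:

```python
# Phrases that suggest a "missing page" message in a 200 response.
# Illustrative examples only, not an official list.
ERROR_PHRASES = ("no posts matched", "page not found", "not available")

def classify(status, body):
    """Label a response as 'ok', 'hard 404', or 'soft 404'."""
    if status == 404:
        return "hard 404"    # correct: tells crawlers the page is gone
    if status == 200 and any(p in body.lower() for p in ERROR_PHRASES):
        return "soft 404"    # looks like a real page to crawlers, but isn't
    return "ok"

# classify(404, "")                                       -> 'hard 404'
# classify(200, "Sorry, no posts matched your criteria.") -> 'soft 404'
```

Feeding this the status code and body from your own missing-page URLs (e.g. via curl) shows whether the blog is really sending a 404 or a soft 404.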

  12. Mike

    Google Site Survey with some details.

    An automated site survey that spiders your site and sends back results via email, letting you know if Googlebot detected some kind of early indication that your site may be in trouble in Google’s eyes, and allowing us a specific time frame to fix it. NOT like Sitemaps, in a sense; I do not mean it like that. If it is not fixed in that time frame, then follow through with the penalty.

    I have never been banned, but I have read around, and if I understand right, it happens in the reverse order: you get an email saying that your site is out for 30 days (or something like that), you submit a reinclusion request, and so on. I can understand why that is done, but why not allow webmasters the time and chance to comply before getting a penalty? I know some may say that when you are indexed you have the chance to comply, but if you do not know you are doing something wrong, you might not see it that way. Having worked in license compliance before, we would warn the offender and allow time to rectify, and believe it or not, most people were not aware they were making a mistake and were more than willing to comply within the allotted time frame.

  13. Greg Manter

    Google webhosting would be great. Finding a reliable webhosting service is difficult.

  14. Although I haven’t ever been hit by penalties, the penalties idea is awesome…I’ve WORRIED that I’ve been hit a number of times and had to do tons of research to be certain that my page tuning simply sucked :-)

    Here’s a very simple suggestion: make it easier to find the webmaster info! I’ve had a lot of trouble finding the sitemaps, analytics, sitemap groups, etc. from your home page. You know how I’ve solved this? I do a Google search for “google sitemaps”! Update your main “human” sitemap so we can find stuff in a couple of clicks!

    Michael.

  15. Apt

    Respond to each Spam/Abuse submission with a status.

    Like, “the site(s) you reported are being investigated” or “we investigated the sites you submitted but were unable to find a violation of our Webmaster Guidelines” or “we investigated the sites you submitted, and we will be taking appropriate action until they correct the problems with their site”. Anything is better than silence, but something specific would be great.

    Also, this is unrelated, but I would love it if some ex-googlers started up a consultancy to analyze and document blackhat methods used by my competitors. It’s so rampant in my industry, I’m willing to pay someone to track it and go after any of the cheaters.

    Google would make a huge mistake by only concentrating on the quality of the top SERPs and trying to govern the rest of the long tail strictly with algorithms.

  16. I love the idea of the top x SERPs for a query being fed via RSS. I’d love to have a section in the admin control panel for http://www.teckreviews.com (shameless plug) where I could track my ranking on Google for certain keywords.

  17. Instead of several small services, I want one big thing that tells me everything I’d like to know. Merging Analytics and Sitemaps would be a good start. We need better communication from Google (“your site is being penalized because of this”). Improve the official Google Blog to take up SEO issues too; this blog is good, but should we really need to talk unofficially with you guys? Your communication and feedback with webmasters can be vastly improved. Thanks for listening :)

  18. Jon

    1. Do web standards, accessibility and semantic structure make a difference with Google’s algo, and if so, how?

    2. Can we have example pages/code of what Google would consider the perfectly structured page?

    3. I would like a Google webmaster community. One that webmasters/professionals could pay a fee for, which would give us an API key to a dedicated Google server that would allow more than 1,000 (or unlimited) queries each day. Basically, creating a true Google Webmaster community that offers access to better information, tools, and Google reps.

  19. I’d love to see a Yahoo Site Explorer-like service added to the webmaster tools in Sitemaps, with an export-to-CSV or XLS function.

    I realize this is available through the search API, but it would really be nice to have, since I check my link: and site: results by hand all the time.

    Thanks for taking input Matt. And, Go Cats (UK ’96)

  20. Oh, and a real backlinks report when queried via the Google API.

    Google’s backlink reports are such a joke that I have no idea why your company even provides them.

  21. Please integrate all the webmaster tools (Analytics, Site Maps, AdWords, AdSense) into one “Google Account”. All these tools are great, but I hate having to log in 4 times. It would be great if I could log in once and have those 4 things somewhere on the right sidebar.

  22. SpamHound

    Hurry up and get Google Urchin ready for new sign ups!

    I’m chomping at the bit ;)

  23. A penalty/ban check service (when and why), open to ALL, is also #1 on my personal list. By the way, I mentioned that a couple of days ago in one of the comments on your previous post…

  24. I am just waiting for you guys to open up room for more people on Analytics. I have been waiting for almost (perhaps more than) a month. It looks like a great tool; even now that everyone else’s enthusiasm has died down a bit, I am still waiting to try it out.

  25. EKB

    1. Some way to tell if a site has been banned or penalized, and some indication as to why. I see that others have addressed this, so I’ll just note that the current system seems quite scattered and archaic. Why are frantic webmasters jumping from blog to blog looking for Googleguy/Matt Cutts/anyone who seems to know something, doing exhaustive tests, and wearing themselves out cursing Google to the heavens when their sites take a dive? Isn’t Google basically a conduit between site owners and users, and as such, shouldn’t it work with both sides? Doesn’t it benefit everyone to open up (and centralize) the communication between Google and webmasters? Which brings me to…

    2. Is there a division at Google dedicated to “Webmaster Relations” or something similar? Such a division could set up “Google Information for Webmasters” (http://www.google.com/intl/en/webmasters/) as a centralized place to announce upcoming changes (these could be vague and generalized, to preserve the secrecy of the algorithm), allow webmasters to check their own site(s) for bans or penalties (after first verifying through sitemaps or some other method), exchange feedback, and generally keep things a little more sane and hunky-dory. Much could be done to repair the perception that Google pits itself against site owners/webmasters, who are, after all, half the equation on the web.

    Basically, I don’t think that maintaining a thick veil of secrecy around Google’s operations does much to prevent spammers, cheaters, and other bad guys, or to improve Google’s search results. Open things up a bit, and watch webmasters fall in line!

  26. Get Analytics to work in Safari. Make navigation in Analytics better (the back button doesn’t work, in my experience). Allow custom “dashboards” to be created, so I have quick links to what I want to see and don’t have to use all the hierarchical navigation. And one thing I can’t seem to figure out how to do: let me see which pages on my site were viewed by whom; that is, let me cross-reference my content with other variables (so, for example, I can see which pages people on Google’s network were viewing).

  27. Allow a way to disable Google ads on my own computer so I don’t accidentally click one and wind up getting kicked out of the program and losing all the earnings.

  28. Wayne

    QUOTE: “I’m not sure exactly what you are looking for here, but how about a penalty checker. Some kind of code like, penalty:mysite.com.” END QUOTE

    I agree this is one tool that I believe would help all webmasters. I know that Google says there isn’t much others can do to hurt your site and get it penalized, but the fact still remains that they can. There are those out there that want to do your site harm. Some people like myself do not have the knowledge to prevent this, or to catch what others have done to our sites to get them penalized.

    This would lead me to suggest that Google create and put in place a set of search strings / tools to detect these types of malicious attacks.

    Another suggestion would be to add a penalty:mysite.com operator and allow webmasters and users to view what Google deems to have been a violation of its TOS. If it is a matter of privacy, allow the owner of the site to register and acquire something like an API key, and use some type of system to verify the site owner.

    Competitors use these tactics to harm our sites and get rid of competition. I know that there are several good, relevant sites out there that should be ranking above some of their competitors, but they don’t because of malicious acts by others.

    If Google truly wants to be a friend to the webmaster community and consumers, I believe this is a must-have function Google should create.

  29. joseph

    A Google Mini, but software-based…

    cheers

  30. I guess what Google should not do is create its own world inside. It seems that the three major search engines (Google, Yahoo, and MSN) are all moving away from each other, creating chaos for us webmasters. Also, Google should give reasons why some websites are ranked first and why others are ranked so low. I think this would be great. Another feature to look at would be lookup:domain: all info about the website, since when it has been online, an approximation of traffic (I guess mostly everyone has the Google Toolbar now :) so do something like Alexa rankings), a list of keywords, and most of all the reason why it is ranked where it is, and why Google chose to rank it there.

    That would be really nice. C’mon, Matt, I know Google has the resources to do this instantaneously :D

  31. Clean up the webmaster guidelines so they are clear, remove inconsistencies and ambiguities like:

    No one can guarantee a #1 ranking on Google.
    Beware of SEOs that claim to guarantee rankings …

    later on

    For your own safety, you should insist on a full and unconditional money-back guarantee.

  32. Jan

    Hi Matt,
    I would love to see some way of establishing copyright on hand-written material. Maybe a way to submit an article and have it registered with Google? A Google copyright service? That would be unbelievably cool!

    My biggest problem is writing articles about preconstruction investing and such, and then finding my article on five of my competitors’ websites.
    I’d love to have a way of registering my personal writings as I produce them.

    Jan
    Myrtle Beach, SC

  33. How about a Google operated Whois?

    I would also very much appreciate a tool for checking the first time a domain was indexed by Google.

  34. I really wish the rules were open and visible, so everybody could know why sites are ranking well or not. There should be a way to check, and I think it would increase the value of Google search if we all knew what the rules were. Webmasters are a great community, and we can all help improve the quality of search.

  35. 1) Paid reviews for sites to determine basic problems.

    2) I can’t help but think the support team does not understand enough about the ranking system to give informed replies. Your team should give them more basic info to help webmasters. For example, many at WMW have noted that doomed sites (302 hijack, duplicate filter of death, etc.) often get the infamous “you have no penalty and your site is easily found using site:yoursite.com” note, implying you have nothing to worry about when your site has been killed.

    3) There’s got to be a way to avoid helping spammers while communicating more effectively about when some aspect of *linking* causes severe downranking. Rumors still abound about all types of filters and point deductions, and this wastes a lot of Google and webmaster time spent second-guessing the system. The Jeremy link debate was just one example of how divergent opinion is about what is legitimate linking and what is not.

    4) Guidelines should give more examples of “great user friendly structure”.

  36. Tell people what’s up with their site.

    “Why isn’t my site being indexed, listed, ranking?” Check the statistics in the Google Groups; it is probably question No. 1. The answers range from “you must spend more on AdWords” to “Google is a commie bla bla bla conspiracy bla bla” to “spam some wikis”. Why can’t a webmaster check what’s up with their domain? “You aren’t being indexed because you do not have sufficient inbound links.” One sentence from Google, and people know what to do and will work on it. The spammers know how it works anyway, so why withhold something like this from everyone because of them?

  37. spike

    A penalty checker sounds like a great idea in theory… but the end result will be that most people will only do the bare minimum to beat the penalty.

    Google will have to keep changing the “pass” level as more sites that should not be listed, get the “all clear” from the checker.

    Can’t ever see it working in practice. It’s one of the reasons that very few directories feed back their rejection reasons to applicants.

  38. When doing a search on “pages from country X”, Google returns results from sites with a domain from country X or hosted on an IP address located in country X. Google says that it may also use information from the whois record of the domain, but I’ve never seen that for cases where IP and domain are different from the whois country.

    It would be good if Google offered a way for webmasters to indicate the country from which they are. That would be especially interesting for non-US webmasters using US hosting services and non-country-specific domains. It would also be interesting for all non-US bloggers using the blogspot.com service.

    An easy way to do this would be from within the Google Sitemap account.
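The country-restrict behaviour the comment above describes can be sketched as a simple two-step heuristic: look at the country-code TLD first, and fall back to where the site is hosted. This is only an illustration of the commenter's description; the ccTLD table and the `ip_country` parameter are stand-ins, not Google's actual logic.

```python
# Tiny illustrative ccTLD-to-country table (not exhaustive).
CCTLD_COUNTRY = {".fr": "FR", ".de": "DE", ".co.uk": "GB", ".br": "BR"}

def country_of(host, ip_country=None):
    """Guess which country a site 'belongs' to.

    ip_country stands in for a lookup of where the hosting IP is
    located; it is None when that information is unavailable.
    """
    # Check longer suffixes first so ".co.uk" beats ".uk"-style overlaps.
    for suffix, country in sorted(CCTLD_COUNTRY.items(),
                                  key=lambda kv: -len(kv[0])):
        if host.endswith(suffix):
            return country      # ccTLD wins
    return ip_country           # otherwise fall back to server location
```

The gap the commenter points out is exactly the case where both steps fail or mislead: a non-country TLD (`.com`) hosted abroad, which is why a declared country in the Sitemaps account would help.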

  39. Hi:

    You mentioned that, at a conference, you used Google tools to show site owners/webmasters what they were doing right and wrong. My immediate thought was that I would pay $1,000 to have that done for one of my sites.

    Cheers,
    Ted

  40. How about some understanding on Google’s part that a huge number of small businesses around the globe rely on Google for their income and most of that income is made in the lead up to Christmas.

    If Google understands that, then just maybe Google will understand why it should never do a major update in October, November, or December, and will take some real steps to ensure that it looks after the people that rely on it.

    When this question was raised in another place, it was dismissed by GoogleGuy with “oh well, updates have to be done sometime” – now that showed a complete lack of understanding, so perhaps Google can act responsibly next time?

    Yeah I also believe in the tooth fairy and Santa Claus.

  41. A URL removal tool that’s capable of removing URLs (including supplementals) permanently from the index.

  42. Reddog58

    First off, Matt, I would like to say that this is my very first post to your blog, and what a blog it is. Thank you for taking the time to help us poor web folk out with your witty yet useful insight. Just a quick question from a relative newbie to the web world: it seems that Google hasn’t yet been able to tell original content from duplicate content. I have a few sites that have been completely copied several times, right down to the format. When reporting these thieves via the Google spam report, it seems that not much gets accomplished; the baby gets thrown out with the bathwater! I am in a highly competitive field where spam runs amok, and for those of us who follow the guidelines set forth by Google, it becomes disheartening when this happens. Is Google working on something that will be able to tell original websites from copied spam in the near future? Again, thanks for your time. I know it must be very busy at the Googleplex, and your insight helps us all strive toward SEO perfection!

    Cheers,

    Reddog58

  43. VJ

    Chung, would you also like the Google algorithm? lol

    I think that would be great to have that information, but then there would be a lot of spam sites that would exploit that information.

  44. 1. Report pages that have been penalised as suggested originally by Adam Senour and, as per my original suggestion made earlier today, use the Google Sitemaps account to give that little privacy…

    2. Google Sitemaps to help determine which country a .com or a .net belongs to, since most sites in the world are hosted outside their own countries and national domain names are harder to get outside the US and UK.

    2a. Google Sitemaps to help determine which languages you do or don’t want to be found in. I gave an example in an earlier post using the keyword ‘florida’, which is used in English, Spanish, Portuguese, and several other Romance languages.

    3. Keyword exclusion via Google Sitemaps, or via HTML meta-tagging if that can be made W3C- or Dublin Core-compliant. What is that for? Have you ever checked your stats and discovered that an incredible number of poor souls ended up on your page by searching for keywords you never knew you were competing for? Worse, these keywords are just something you never wanted to compete for.

    A keyword exclusion list would hint to the bots that, just because you mentioned a scene from Scrubs in your web development article, you don’t want to be found in a ‘scrubs series’ search, for example.

    It has to be flexible enough to set a general keyword exclusion list for the whole site and another for individual pages, but it shouldn’t override the obvious keywords that are in the content itself, meaning that if you are really making use of a keyword, the exclusion list wouldn’t serve as a way to hide portions of your content.

    It could be the solution to the Google bombing issue as well, if that is an issue: if the whole world is linking to you as ‘wears pink panties’, you just exclude the keywords and voluntarily choose to opt out of your high rankings on the street of shame.

    Benefits search relevance, user experience, usability and accessibility. Good for man and beast.

    4. Inbound-link exclusion via, who else, Google Sitemaps. Why that, then, Luis? Oh, I am happy you asked. Sometimes don’t you just feel like all those inbound links from DMOZ clones are simply not relevant to your site? Then the inbound-link exclusion list is the solution for you if you want to concentrate on the good links, not on the ones that make your link analysis list cross the world twice.

    Just like in real life, you don’t have to accept a gift just because someone gave it to you. (Make that a trademark, please.) :)

    5. Anything that is relevant in what I said about search in other languages than English at an earlier post.

    Well that is all for tonight. I hope to take the credit for them if they are ever used. Bye for now.

    Luis,

    Webalorixá

  45. Ron

    Using the Site Maps that are primarily submitted by legitimate webmasters, create some sort of automated feedback system beyond what is currently provided, i.e., if the site map spider detects something that goes against the grain of Google’s guidelines either send an email with those findings to the webmaster or post it to the log in area like the stats you currently have posted.

    I doubt that many cloaked sites, or sites using factors that go against Google’s guidelines, are submitting their sitemaps to your system, so why not provide the webmasters who do submit with advanced “trust” guidelines they can work on to help improve their sites and bring them into the highest level of compliance?

    This two way effort provides what Google is supposedly always looking for, i.e., the best SERPs possible for the user.

    Ron

  46. Scrap the crummy Supplemental Result system. Just move everything into one database.

  47. I would say a bigger webmaster section with more info on what Google likes.

    Also, perhaps provide a feature where webmasters can pay to clear their site of any filters/sandbox/penalty they might have on them; then, if Google finds any spam on their site in the future, they forfeit their money, but otherwise they get it back after a certain amount of time.

    Perhaps, along the same lines, a paid system to be reviewed/included in Google ahead of other sites might be useful. Basically, just a quick check by a Google employee to ensure that the site is high quality and good enough to be included in the database.

    Just some thoughts,

    Nathan Malone
    Austin, Texas

  48. Just another addition. OK, let’s skip the boring part (yeah, it is about Google Sitemaps):

    6. Give the site and pages the ability to position lower or step down their rank for the benefit of another site or set of pages.

    Explaining:

    I work with some massive e-commerce sites that belong to companies with several flagships and brands. One company might have a principal brand website plus other smaller brands, another company might have several white labels (affiliates) that represent the same brand, and another might just not want to take the place above another for several reasons.

    So, what do you do when you want to make sure that little brand of yours, which competes with the major brands of your own company, will not appear above the biggest-selling page of another brand? Just tell the site to position itself one spot below the main page of the other brand in case they ever appear side by side.

    There might be a case in which you simply don’t want to position your site above a charity site (after all, don’t be evil), so you tell your site to position itself immediately below that charity’s site in the SERPs. Since you can only elect to demote your own site’s ranking, there is no problem there.

    Make it sitewise only.

    Cheers,

    Luis

  49. Matt, you could give us access to your Google login (you know, the one you used at PubCon up on stage when the free wireless was down) :)

    Seriously, letting website owners know if their site is penalized, and why, would be great. Or at least the ones on Google Sitemaps.

  50. What a Maroon

    I wanna test drive your “pimped out” FF browser.

    I haven’t had a site penalized, so I’m not sure about the penalty checker. Seems like too much info for the algo crackers.

  51. RJO

    Sitemaps and Analytics are both good ideas, but neither is living up to its potential. I had Analytics on quite a few pages and have now pulled it out, because it slowed page loading to a crawl and I was almost certainly losing visitors. It takes forever for data to display on the Analytics page, and a lot of data isn’t available at all until after a two-week delay. I understand there’s supposed to be a map of your visitors showing on the main Analytics page; I’ve never seen it display (I use Mozilla on a Mac). I don’t even check the Analytics site anymore.

    Sitemaps got off to a very bumpy start also. I had two sites verified, then they suddenly became unverified with no warning; it took about a week to get them back in. Once a site is verified, there should be no reason to keep the crufty verification file on the server. The Sitemaps pages give only very minimal explanations of the data being displayed. On the error listings, are the bad URLs referrals from someone’s erroneous link (which I might want to track down or set up a rewrite for, since it will happen again and again), or were they one-off type-ins that will never appear again and so can be safely ignored? On the Sitemaps page showing query stats, can we get some indication of what we’re seeing? Data from the last day? Last week? Last month?

    I guess my general feeling is that these will eventually be good services, but it really seems like they were rushed out the door. At the moment I’d encourage G to concentrate on getting this stuff that is already available to work well, rather than to go off into new services before these are fixed.

  52. Marshall

    Yes, I agree with all of the people requesting some kind of penalty checking and notification. My site has slipped over 200 positions in the SERPs since Oct 2005. I don’t use invisible text or keyword stuffing or that kind of stupidity.

    And I certainly don’t have a problem with fixing what is wrong if I just knew what it was. I really can’t afford to buy every piece of SEO software to try and find out what is wrong.

    I run an affiliate site and so far don’t have any products of my own. I have read that Google penalizes these kinds of sites as “thin affiliate” sites. Yet manufacturers in my industry will not sell directly to you unless you have a brick-and-mortar store with regular business hours, which is just not feasible for me. So I am an “affiliate” website business in an industry that I love.

    I have content on my site related to my industry and based on my 36 years of experience, so I am not just a thin affiliate site. If this is a major part of the reason my site is being penalized, why can’t Google just say so?

    Again, it comes back to communication and feedback from Google to the webmasters. Most of us are willing to fix what is wrong if we knew what it was. As one of the comments said, “Without us you don’t exist.” No sites = no index.

  53. Bozman

    Yes, what everyone above me said. We’ve been pulling our hair out trying to understand whether specific pages of ours have been banned.

    Why not take the mystery out of the banning process? Currently, trying to understand if (and more importantly WHY) a site has been banned is about as clear as prophesying with a bull’s entrails.

    If Google really “cares” about webmasters, why not be clear and specific? Why was the site dropped? How can it be reinstated? Sometimes it seems that only pages get dropped, but how can a webmaster know for sure? The real problem is deciding whether a site/page is banned or suffering from some sort of technical problem.

    Why not offer a paid evaluation service?
    -Site ban – yes/no and if yes why.
    -Specific technical recommendations to increase crawlability.

    The current boilerplate email is not enough. I understand that more info to webmasters can make it easier to game the SERPs, but too little info raises the motivation to mass-spam the SERPs. Would you rather deal with sniper or shotgun approaches?

  54. oojee

    Please fix what is broken before you make new things.

    a starting point:

    http://www.google-adsense-sucks.com/

  55. Fair Penalty for SEO’s

    I am all for kicking off people who abuse the system, but sometimes it’s hard to judge who is at fault.

    One of my clients abruptly dropped our services, saying that we were too high-priced and there were cheaper, more effective options out there. This client then went with some blackhat SEO who tried to install code that made his website look like it belonged to the Canadian Government Business Agency and a major Canadian newspaper. Yes, he is an idiot… The problem was that my link was still at the bottom of his page saying that I did the internet marketing. Google promptly kicked him off, but my site could have been penalized without me even knowing. Luckily I had him remove the link within the first day and we were not penalized.

  56. mick

    I’d like to see, even as a one-off, a Q&A with feedback from you guys at the ’plex to webmasters.
    One question each, one answer. No follow-ups, no discussion.
    We ask the questions and they get a human answer, not an automated bog-standard reply that goes out to thousands.

    Sometimes it can feel like a lot of the questions asked go unanswered. If you can’t comment, you could say so; then at least it wouldn’t feel like we’re being ignored.

    I know you are helpful, but a lot of the time it feels like the answers come in riddles and are often vague.

    You guys often ask for feedback, like this. Imagine if no one responded?

    Matt, I’m not having a pop at you here; you are helpful, and I hope this hasn’t come across as anything other than that.

  57. Ben

    I’d like to see the AdSense code removed from websites. You’ve got URL channels – detect the URL and retrieve the code yourself.

    I’d also like to see a way to combine my AdSense and AdWords accounts, not just the usernames but the balance & stuff as well. There’s a redundant step in receiving $x in one hand and paying $y with the other.

    It’d be peachy to see G. move everything out of Beta this year instead of rolling out a dozen more “new beta services”.

    It’d be nice if the much-touted webmaster guidelines actually started to mean something.

  58. Hi Matt,

    This is part of a post I made in the spam feedback thread – sorry for my stupidity, I should’ve kept it for here:

    IF a site is being subjected to a cap/ban/filter/penalty for ANY reason, why not make that information freely available in the info:www.mysite.com results?

    That would stop the endless guessing games, conjecture, rumour spreading, wasted time, ignorance, frustration and “google bashing”, whilst simultaneously educating webmasters and would gradually “encourage” the entire SEO world to follow google quality guidelines… what could be better than that?

  59. Add an email parameter to the sitemap schema:

    E.g.:

    webmaster@domain
    no
    warning
    daily

    This would give Google an address to which warnings or notices of
    banning could be sent. The parameters are only a first suggestion – this mechanism could be expanded to give Google a direct route to whoever controls the domain, and its use would absolve Google of certain compliance requirements.

    There are many cases where a webmaster is not actually included in the default email scope of the domain – I have one such site that I can do nothing about.

    Implementing a status:domain as a general search would be a
    confidentiality exposure, IMO.
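
    The markup in the “E.g.” above appears to have been stripped when the comment was posted, leaving only the bare values (webmaster@domain, no, warning, daily). A reconstruction might look something like the sketch below; every element name is my guess, not part of any real sitemap schema:

```xml
<!-- Hypothetical sitemap extension: all element names are invented,
     reconstructed around the bare values that survived in the comment. -->
<url>
  <loc>http://domain/</loc>
  <contact>webmaster@domain</contact> <!-- address for warnings/ban notices -->
  <public>no</public>                 <!-- do not expose the address -->
  <notify>warning</notify>            <!-- class of events to mail about -->
  <frequency>daily</frequency>        <!-- how often to batch notices -->
</url>
```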

  60. Ben

    I would also like to see the AdSense referrals stuff done PROPERLY. It’s the most half-arsed referral system I’ve ever seen.

    Things I dislike about it:
    It’s January now, and I have no way of accessing referral statistics for November. There are no reports available for referrals; you can only view them on the summary, which spans the current and previous month.

    AdSense referrals also need SOLID information.

    There’s no point saying X people have been referred to AdSense, because the approval/rejection process can take weeks. A better system would give you the total you’ve referred, plus the numbers for pending/accepted/rejected.

  61. I like Sitemaps; the information is useful, but how about a little more data?
    The two pages I use most are Top Searches and HTTP Errors.

    Errors could be improved with more data. It’s great that you tell me when a link is missing from a sitemap entry (thank you! {slips off and fixes that one}), but it would be helpful to know where the other errors are coming from. When Joe Public surfs in from somewhere, there is usually a referer in the logs, so I can see what the typo is and allow for it in my htaccess. Bots don’t (generally) provide referers, so until a real person follows that link I don’t know where the break is, and bots normally surf in sooner. Can you tell me where the dud link is coming from? Just knowing there is one is more of a headache than a help.
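
    As a sketch of the htaccess fix the commenter describes, assuming an Apache server and a hypothetical misspelled inbound path:

```text
# .htaccess: permanently redirect a commonly mistyped inbound URL
# (both paths here are made-up examples) to the real page.
Redirect 301 /wdigets.html http://www.example.com/widgets.html
```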

    Top Searches could also be improved with more data. Put some numbers on it and we will love you forever. How many searches is that term getting, and how many do I get? (Although the latter is in my logs, and thus less important.) If you really wanted to go over the top, you could list a ranking against the term; that would save your bandwidth, since I wouldn’t need to open the link.
    I also find that, while the terms are linked to Google, quite often my link doesn’t appear on that page. Does the list only cover the first page, or the first hundred results, or am I ranking on a ccTLD? More information on what the link does would be good.

    I agree with the posters above that adding ‘your site is currently hitting filters due to X, Y and Z’ information on the Sitemaps pages would go a long way toward improving the quality of the web. How many of the posters on this blog are saying ‘I was doing X because I could see my competitors were doing well with it. Why aren’t I ranking anymore?’ Ignorance is rife. Your filters are applied automatically; the processing cost of reporting the filters in play shouldn’t be high, and it should have a payoff.

    HIH! I love the work you guys are doing!

  62. And one more idea: now that the Sitemaps tool allows verified communication with the owner of the site, why not allow me to select my home country (for the site, NOT for the account)?
    This could then allow you to override the ccTLD/IP-range-driven location finder.
    Yes, my .com site, hosted in Canada, still has ‘the best widgets in Brisbane, Australia’ at the bottom of every page. I realise it’s too much to ask you to analyse that, but consider that now you can get verified data from me!
    Prefill the dropdown with where you currently have me located.
    You lovely US-based people probably don’t realise how frustratingly difficult it is to understand why site X doesn’t appear in the local Google ccTLD.

  63. I’d like to see an official solution for AdSense tracking in Google Analytics, better AdSense and AdWords integration, an Analytics Lite for faster page loading, and better error reporting in Sitemaps. (“HTTP error” is not specific enough.)

  64. James

    The computer pimps at Google should tell you up front what is wrong with a site, so people can conform to your standards. By the way, boys and girls, don’t talk to SEOs, as they are evil people: they beat up old women and children, and they kick little puppies. They are like pirates trying to take over our ship’s gold. They are bad, bad people. If you love America and apple pie, then you will hate SEOs. Just pay us at Google, where we use the all-American AdWords that can solve all your problems. Just pay Google, ’cause Google is good. (subliminal: google is good, google is good, google is good) LOL. Don’t take it personally, Matt, it’s just a joke. Go Tarheels!

  65. I agree, penalty:domain.com command.

  66. Dave Anderson

    I would love to see google look at more ways of supporting public service content.

    Something along the lines of sourceforge, OSU open source or ibiblio.

    Dealing with for-profit hosting services is the bane of community-based organizations, whether real-world or virtual. I would love to see Google put a small fraction of its bandwidth and processing power into helping those groups that are trying to provide non-commercial content.

  67. Alex Henderson

    When Sitemaps came into being, I used them for one domain I owned. I learned from this that a configuration error on a server was causing a weird error that apparently made the domain “poison” to Googlebot. Sure enough, by changing the setup to be completely 100% standards-compliant, I suddenly started to get stronger SERPs and more Googlebot visits.

    It would be great to have a page, like the submit-URL page, that would let you see potential reasons why Googlebot isn’t visiting a domain, or isn’t caching or listing a page or domain. I think it would help most honest webmasters to improve their sites and make them more compliant with what Google wants to see.

    Alex

  68. Error checker / penalty notifier, definitely. Bad-neighborhood warning, definitely.

    This service should be accessible from a standard web browser without having to go through Sitemaps or whatever. A simple form to enter a URL and see what comes up negative would be nice, plus suggestions from Google about code, links, or whatever else should be looked at.

    So many webmasters do not know what they can do to be better, and I have yet to find any real information about what a bad neighborhood is. Sure, there are things that allude to it, but I believe that info is out of date now that today’s blogosphere creates pages and blogrolls that make it very difficult to decide.

    Google’s recent changes (penalizing a site for linking to a site that links to a lesbian site, and other quirks) make it impossible to really know what to do with a site. Not linking to other sites for fear of a possible penalty really hurts the web for end users, as many webmasters will simply stop linking. I think we would all like a Google page where we can check our own site, or someone else’s, to see if it is in danger of being near or in a bad neighborhood. This could be an update to the Google Toolbar, but it would probably be better as a page with definitions and suggestions, much like the W3C validator tool or the Search Engine World keyword analyzer.

    Please do not suggest using nofollow to fix this; going back through 75 web sites, most with over 100 pages, and hand-coding rel=”external nofollow” on every link is not a realistic option…

  69. Makroid

    I would like to see correct handling of robots.txt directives.

    I have a lot of sites, and Googlebot often downloads files (mainly JavaScript and CSS) that are explicitly disallowed in robots.txt.
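
    For reference, directives of the kind the commenter means might look like this (the paths are hypothetical examples):

```text
# robots.txt: keep compliant crawlers out of script and stylesheet
# directories (example paths only).
User-agent: *
Disallow: /js/
Disallow: /css/
```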

  70. The single most important feature I’d like is Analytics tracking my AdSense conversions.
    A single control panel for the three A’s (AdWords, Analytics, and AdSense) wouldn’t be a bad idea either.

  71. I agree with what Adam Senour said in Feedback on Bigdaddy datacenter article about having a penalty:domain.com command…

    And I agree with Brent Atkerson’s agreeing with my idea. :)

    I’m just going to reiterate it here so it doesn’t get lost:

    penalty:domain.com .

  72. Martin

    1) Life would be easier if Google did faster reinclusions after a dupe-content penalty. Three months is quite a long time to pay for a bug in robots.txt. I experienced this twice last year. (I hope it will never happen again.)

    2) Life would be much easier if you knew what you’d done wrong. Why not show a “Dupe content penalty because of urlexample1 and urlexample2” message, or something like that? (I think that has been mentioned before.)

    3) Slow down that Mozilla-Googlebot. It got two of my sites into serious trouble a few times.

  73. Martin,

    Try this:

    User-agent: *
    Crawl-delay: 120

    This sets a delay between crawler accesses (in seconds), so there will be at least 120 seconds, or two minutes, between a crawler’s requests for successive pages.

    You can also configure this by crawler if you want.
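
    A per-crawler variant might look like the sketch below. One caveat: Googlebot is generally reported not to honor Crawl-delay at all, so treat this as an illustration of the general robots.txt mechanism (honored by crawlers such as Yahoo’s Slurp) rather than a confirmed fix for Googlebot:

```text
# A crawler uses the most specific User-agent block that matches it,
# so Slurp gets the 30-second delay and all other crawlers get 120.
User-agent: *
Crawl-delay: 120

User-agent: Slurp
Crawl-delay: 30
```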

  74. Aaron Pratt

    Clean up the webmaster guidelines so they are clear, remove inconsistencies and ambiguities like:

    No one can guarantee a #1 ranking on Google.
    Beware of SEOs that claim to guarantee rankings …

    later on

    For your own safety, you should insist on a full and unconditional money-back guarantee.

    They could shorten the guidelines by adding the following.

    “Only 1-2% of SEO Companies are any good so be careful out there”.

    Right Matt?
    ;)

  75. A.G.

    Yes, a penalty checker of some sort, to report violations for a particular URL, would be a great tool.

    Additionally, it would be great if Froogle would accept a URL (or multiple URLs) pointing to a particular store’s feed file. A webmaster then would not need to re-upload the file after each change; instead, the feed could be generated programmatically as necessary and stored at the same location, accessible via the configured URL, each time.

    Thanks for asking, Matt.

  76. How about a more efficient/effective tool to check Google rankings?

    If…

    Ethical (and effective) search engine optimization means improving the quality of the website…

    Then…

    Having the ability to track rankings in Google would give webmasters feedback about how good a job they are doing.

    Having better tools to monitor these rankings, and possibly predict future rankings, would be valuable.

  77. Pin

    Combine Google Sitemaps, Analytics, AdSense/AdWords, the penalty checker/notification, and the Reinclusion Request form into a single control panel. Webmasters could then log into their Google account and access all the available info in one place.

  78. Ron

    Hi Matt,

    Did you guys get the reinclusion request I made? I don’t want to give the URL here, but is there any way you can help me by checking up on this via my email address, which I entered above?

    Thanks,

    Ron

  79. shri

    Matt: improve site-specific search based on the Sitemaps data. Put the site-specific data into a separate container and give owners the ability to implement a Google-quality search. As a webmaster it is a pain in the rear to integrate blog, forum, CMS and ecommerce searches into one box, and an external entity like Google could help a lot.

  80. How about Google-certified webmasters?
    Webmasters could have a Google seal of approval on their site, linked to a verification page on Google, so potential clients know that the company follows white-hat SEO techniques.

  81. This is a strange comment, but one I think is valid.
    I would like the ability to keep my site from showing up under certain key phrases. For example, one site I look after shows up under the search term “naked child models”. You can understand why I do not want my site showing up under this term, but apart from rebuilding the site and changing the company’s name there is nothing I can do. The site is called Naked Lens, and there are links on the home page to male models, female models and child models, so you can see why we come up under that search term.

  82. Martin

    Adam: Thank you. I didn’t know that this worked for Google too. In fact I have a Crawl-delay entry for Yah**. I’ll give it a try in my Googlebot section.

  83. Alex

    Well, I’d like to be able to sign up for Analytics, and I’d like my Sitemaps page to get out of “pending submission”, which it’s been in for possibly a month now. I don’t know if that’s the average turnaround, but getting out of it would certainly make the service usable for me.

  84. Sekula

    How about better image search? One that indexes EXIF/IPTC data from images, with a bot that discovers and crawls images on a regular basis.

  85. Jeff

    How about a way to tell Google that a page should be filtered out of the default SafeSearch results? Honoring ICRA labels, the ratings metatag, or any other way of explicitly limiting the page to unfiltered search results would be great!

    (I like the keyword blacklist idea too!)
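
    For reference, the kinds of labels the commenter mentions look roughly like this. The “rating” meta name is an informal convention rather than a formal standard, and the ICRA/PICS label string is elided because it was generated per site:

```html
<!-- Informal "rating" meta tag sometimes used to flag adult pages -->
<meta name="rating" content="adult">

<!-- ICRA labels were delivered via the PICS mechanism; the actual
     label string (elided here) came from ICRA's label generator. -->
<meta http-equiv="PICS-Label" content="...">
```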

  86. Justin

    Hey,
    First-time poster here. Although this is not an organic Google search suggestion, it is one for AdWords:

    The ability to upload keyword campaigns with a lot of keywords and different URLs via a global upload, like Froogle allows. I have designed a campaign with over 4,000 unique keyword phrases that would be directed to the same number of relevant pages. The reason I have not launched it yet is that I would have to submit each one individually, probably three months’ worth of work!

  87. Adam: Thank you. I didn’t know that this works for Google too. In fact I have an Crawl-delay entry for Yah**. I’ll give it a try in my Googlebot-section.

    You’re very welcome!

    How about Google certified webmasters?
    webmasters could have a Google seal of approval on their site linked to a page on Google to verify link, so potential clients know that the company follows white hat SEO techniques

    I like this idea in principle, but it needs some details filled in first:

    1) How does Google stay on top of this sort of thing?
    2) How does Google even decide who’s clean and who’s dirty?
    3) If a webmaster does something outside of the Google guidelines, is that webmaster banned along with his clients?
    4) What happens when the guidelines change?

  88. Michael Weir

    A Google CMS with built-in functionality that lets you compare your content with similarly themed websites in the Google index, check for reoccurrences of your content on other domains, see the search popularity of words or phrases contained in your content, etc.

  89. sid

    Hi Matt,
    I think what would be nice would be a Site Index Reset: a system where webmasters can instruct Google to drop the existing index for their site and start over. The problem with the current system is that when I change the structure of my site (file-name changes), Google does not drop the old pages from the index; it takes eons. Google has thousands of my pages that no longer exist and return a 404, yet they still show up under the site:domain.tld command, even after 18 months as 404s! Fixing this would not only help us webmasters but also help Google create a better, fresher, more relevant index in general. The only current tool, the URL removal tool, is quite useless, IMHO.

    thank you, sid

  90. sid

    failed to mention:

    I think the penalty:site.tld command would be nice as well, especially for those of us who have little time to work out whether a penalty is present and would rather spend it on content creation. It also matters because external factors can hurt our sites, as per your reply on why I don’t rank for my unique domain name. If external factors such as links (which can be manipulated by others) are contributing to a penalty, it would be nice to know. We can spend all day hunting for faults in on-page factors, but if they lie externally, we may never find them. An indicator of a penalty would certainly build a better web.

    Content theft: it seems to me that Google cannot tell who the original owner of content is. I see this when other sites that copy my content rank higher than my site (the original owner) on a unique query. I don’t know how the system currently works, but not being able to tell who the owner is only encourages the thieves to steal. This is a rampant problem on the web, and the current tools for fighting it are not useful. It makes people like me not want to publish, for fear that the thieves reap the benefits just by stealing.

    thank you, sid

  91. One thing I would really like to see is the ability for AdWords advertisers to check web results for a different country than the one in which they reside. For example, if I live in the U.S. and run a campaign that is only visible to Canadian residents, it’s very difficult to tell how my ads are appearing, who my competitors are, etc. – because I can’t run a search that shows me as being located in Canada.

    Some kind of emulator that would run the search without counting toward the AdWords results would be really handy.

    Other than that, I fully agree with all of the people who suggested a penalty checker. What an awesome tool that would be for ensuring quality sites and enhancing user experience!

  92. Here’s an improvement that I’ve suggested before (GG thought it was a good idea at the time).

    Add a – (minus) feature to SiteMaps, so that any URL that is preceded by a minus sign is removed from the index – not flagged in any way – simply removed completely, including from the Supplemental index.

    I know that it’s not as simple as it sounds, because references (links) to it would also need to be removed, or it would still show up in the serps as a URL only listing. But getting old pages (URLs) that no longer exist removed from the index is a bit of a pain for people, and the minus sign method would be a perfect way of doing it for webmasters.

    If not that, then a better/simpler way of getting old pages removed permanently would be very good; e.g. drop the 6 months thing, and actually remove them, instead of just flagging them.
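
    Sketching the proposal concretely (this is not a real Sitemaps feature; the minus syntax, the namespace, and the example URLs are purely illustrative):

```xml
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Normal entry: keep this URL indexed. -->
  <url>
    <loc>http://www.example.com/current-page.html</loc>
  </url>
  <!-- Proposed syntax: a leading minus would mean "remove this URL
       completely, including from the Supplemental index". -->
  <url>
    <loc>-http://www.example.com/old-page.html</loc>
  </url>
</urlset>
```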

  93. Steve

    I must agree with sid. The supplemental result thing that Google has come up with may be good for some webmasters trying to hold on to an old link to their site, but it is a real pain in the ass for me. I have pages that still show up in Google that have been excluded by a robots.txt file for more than a year now; this can be a real pain for those who know how to use Google’s cache with supplemental results.
    I went through the trouble of using the robots.txt file just for one single page, and yet Google is still showing it.
    I think that Google should have a separate internet-archive-style search for supplemental results, and this database should look for new robots.txt exclusions and remove listed pages.
    Supplemental results may look good on paper if you are trying to compare penis size (I mean index size) with the other SE players, but they actually hurt end-user experiences when the content is out of date, links from it don’t work, etc…
    2 more cents 4 ya..
    S

  94. David

    The web was built on linking. My first personal site in 1994, with its ugly blue swimming-pool background and orange fonts, had hundreds of categorized links for visitors to enjoy.
    I still have personal sites that do this, and I enjoy surfing; I often find value in the places it takes me. Google has removed some of that joy, because the SERPs are good and to the point.

    It feels as though Google penalizes webmasters for linking to external sites, but maybe that is just SEO babble. It would seem sacrilege if they did, since we wouldn’t have PageRank without linking.

    It would be nice if they would promote linking and define for webmasters what counts as too much outbound linking (per the algo).

    How many outbound links are too many, and why?

  95. Sandra,

    Can you not run the same search at http://www.google.ca ? Just a thought.

  96. otto

    I would like to see the available link tools become more transparent. The link: command obviously shows only a fraction of the links indexed. The best change would be for link: to show all the links, bringing consistency with the tools available from Yahoo and MSN. If Google is unwilling to do this, then link: should at least report the correct total while still showing only a percentage of those links.

    I would also like to see the -site: command fixed. It seems like it broke a few months ago, or the functionality was changed, and it has not been fixed. link:http://www.testdomain.com -site:testdomain.com used to show links but eliminate any internal ones. Now it shows random results, missing obvious links shown in the link:http://www.testdomain.com results and instead showing weird foreign-site links.

  97. For webmasters in the Czech and Slovak republics: turn AdSense for content ON. It is already translated into Czech and Slovak, and the targeting is GOOD (it is running on tiscali.cz), but AdSense for content is turned OFF for ordinary webmasters.

  98. Regarding Sitemaps, we get access to query stats, but it would also be useful to know how often a web site was shown for a search term, as well as the CTR.

    For Analytics, I wish it recognized that I have one Google account so I could tie it in with AdWords. Having the tab available is such a tease.

    In AdWords, in the keyword tool, I’d like to be able to see the differences in impressions and CPC when a term is used as a broad, narrow, negative or phrase match. Then I’d like to be able to instantly add a keyword to my campaign from the research tools.

    I have others but I’m drawing a blank at this moment.

    Thanks, Matt.

  99. I think a penalty search would be great. And then a separate contact route to a team at Google for problems with sites that are not spam-related.

    One of my sites has been around forever; we have been very slow and careful with it, and it promotes a geo-targeted business. The site:www.domain.com search shows a current cache, updated every other day or so. But the cache:www.domain.com search has reverted to a cache from Feb 27, ’05, when the site was all Flash. The day this happened, over a month ago, we lost the #2 ranking that was our bread and butter. It still hasn’t been fixed, whereas the site:www.domain.com cache results are constantly updating. I am pretty sure there has been no penalty; it would be nice to be able to tell someone on the Google side about a glitch.

    I could see this flooding the Google staff, so maybe a box we can submit to, but with a reply we can count on.

    Thanks – Matt

  100. I think you should get rid of showing PR, because too many people obsess over it. You can’t go to any webmaster-related website without people griping about their PR and then saying Google has some kind of prejudice against them. It gets on my nerves.

    I want more cool Google features that I can add to my site. For example, I want to be able to put in Google News feeds on my site whereby I’d only have to insert the code once, and then have some kind of interface page on Google where I could set the topics/queries and sources that I want to show on my site. It should show the headline, and then link to the article on the originating website.

    A similar thing like above would be “Google Events”. We could insert some code in our site and have feeds of events for our area shown. Also, there should be an events search operator, so I could go events:Oasis and see a list of the upcoming Oasis tour dates. Or go events:Cleveland art and see all art-related events. Site feeds would be the icing on the cake. And this Google Events thing could somehow be integrated with Google Local. Having Google Events search would be great. I love the movie search, and I’m sure I’d love an events search just as much.

    If I think of more, I’ll post those ideas.

  101. senior writer

    Sir,

    We would appreciate more help from google in protecting our hard work from content thieves who use ‘website copiers’ to illegally download our sites.

    We wonder if google could find a way to curb this menace, which appears to be growing.

    I am sure I share this concern with millions of honest webmasters.

    thanks Matt,

    Senior

  102. Webdoctor

    How about a screen that lets Google Analytics set a cookie that will permanently EXCLUDE that username+computer combination from the statistics?

    Plenty of us get dynamic ip addresses from our ISPs, and trying to exclude our own surfing from Analytics’ statistics looks like it’s impossible unless you have a static ip address.

    On the whole, though, Analytics rocks!

  103. For 2006 I’d like an AdWords bidding tool that lets me automatically bid a multiple of the conversion rate for all selected keywords. I waste hours each day doing this task.

    Why would I want to do that? Let’s say a client is happy with $20 cost per conversion. If a keyword phrase converts at 5%, my bid is $1. For 3% conversion rate, I can bid $0.60 per click.
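
    The arithmetic in the comment reduces to bid = target cost-per-conversion × conversion rate. A minimal sketch of the tool being asked for (function and variable names are mine, not an AdWords API):

```python
def max_cpc_bid(target_cpa: float, conversion_rate: float) -> float:
    """Highest per-click bid that keeps cost per conversion at target_cpa."""
    return round(target_cpa * conversion_rate, 2)

# The commenter's numbers: a $20 target cost per conversion.
print(max_cpc_bid(20.00, 0.05))  # 1.0 -> bid $1.00 at a 5% conversion rate
print(max_cpc_bid(20.00, 0.03))  # 0.6 -> bid $0.60 at a 3% conversion rate
```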

  104. Nate

    I second many things that have already been said:
    -Eliminate the Supplemental Index
    -Provide a tool for easily removing pages from the index (requires a login, maybe part of sitemaps), because noindex,nofollow tags and the robots.txt blocks never work as well as I’d like them to…

    Provide an interface for performing a “status check” on specific URL’s:
    -”Yes, we see that 301 and will be dropping your old URL within X days”
    -”Sorry, this page is too much like this other page (___) and will therefore not be included in our main index.”
    -And my personal favorite: A link popularity analyzer that shows the real measure of PR and localrank a given page has, and how much it is passing to other pages.

  105. Erik

    No penalty checker; no need to give more info to the baddies.

    Let the good kids have something instead: I’d like to know how Googlebot sees my pages, like the Lynx viewer. The webmaster info pages say Googlebot sees pages much as Lynx does, so how come pages don’t get indexed when Lynx sees them all correctly and there are enough backlinks?

    … White hat talking here. Lately I’ve been seeing so many black-hat results, and finding so many blocks on my righteous path, that I’m almost starting to figure I might as well just go underground and pick up a new hat: a grey one, or heck, maybe even a black one.

    Because there’s really no competition, or competing, if no one explains the rules. It’s only cheating and guessing…

  106. I’d love to have a way to find people that have copied my content. It’s a pain to search for that manually.

  107. Thumbs up for the suggestion of a penalty checker!

    Google has so many services that this one might already exist, but a forum where webmasters could meet to discuss Google would be great! It could be moderated by a Google employee (maybe Matt needs more work? :) ), and detailed information regarding Google updates/dances/newsworthy items could be provided.

  108. JohnC

    It’s rather clear that the SEO industry is not going to implement any type of standards. It would be nice to see Google (possibly in conjunction with those other guys) set down at least a minimum level of standards. This would go far toward helping others weed out the decent SEOs from the “jump on the bandwagon” types. I know, that’s what the webmaster guidelines are in principle; I am talking about going one step further, with specific dos and don’ts.

  109. One of the major improvements could be in the integration of the current array of Google products: a central resource for accessing all of them (i.e. a portal; I know a limited version of this currently exists, namely Google Accounts), which would be the single point of access to all services currently offered: Sitemaps, Analytics, and so on.

  110. PaidAnalyzer: my new domain has had a 301 redirect from a PR7 site for over two months. The original site was dropped within 1-2 days, but the new domain is still not in the index.

    Apparently it has hit the *sandbox*, or whatever you call it. I’d be happy to pay Google to analyze this and include it in the index.

  111. Jen

    Matt,
    I have a great idea. A few, actually.
    1) Have a program where you can enter your website URL and spider it, then have it spit out results for every page, give suggestions on what I can do to improve my website and increase my rankings, and display what is wrong with it. Maybe an explanation of why Joe Schmo, whose web site looks like my 4-year-old made it, ranks better than my site, which has about $50,000 to $100,000 a year pumped into it and is worked on daily.
    2) Maybe have a way to submit my site after I have updated it, so I don’t have to wait for Googlebot to come spider it.
    3) Try to mimic overture.com’s setup for the advertising part.
    How you guys have it set up is a little weak. I don’t use Overture as much as I use Google’s, so that would be nice!

    Other than those, thanks for the opportunity to give my $.02.
    And if you need a deal on wheels and tires, hit me up!

  112. Kelly Jones

    Rankings! We’re not supposed to use tools to check our rankings, but rankings are what we live and die by. If our rankings suck in Google, we lose money, sometimes a LOT of money. So a nice way to help us webmasters would be to provide the service for us, especially if it graphed our results over time.

    Here’s a twist, though. I’m sure there are terms we rank highly for that I’ve never thought of, possibly because people don’t click on that listing very often. So here’s a pipe dream. What if Google were able to provide me with a report something like the following:

    Searches today where you ranked in the top 30:
    Term…..Ranking…..Clicks…..CTR

    Honestly, I’ll bet there would be some interesting surprises for us webmasters. You already do something near this in the site map feature where you mention some of these terms. A bit more detail would be awesome!

    KJ
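[Editor's note: the report KJ proposes could be prototyped from ordinary per-term impression and click counts. A minimal sketch, with a hypothetical data shape and no real Google API involved:]

```python
def ranking_report(stats, max_rank=30):
    """Report terms where the site ranked in the top `max_rank`.

    stats: list of dicts with 'term', 'rank', 'clicks', 'impressions'
    (all illustrative names). Returns (term, rank, clicks, CTR) rows,
    lowest CTR first, to surface the surprising low-click terms.
    """
    rows = []
    for s in stats:
        if s["rank"] <= max_rank and s["impressions"]:
            ctr = s["clicks"] / s["impressions"]
            rows.append((s["term"], s["rank"], s["clicks"], round(ctr, 3)))
    rows.sort(key=lambda r: r[3])  # surprises (high rank, low CTR) first
    return rows
```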

  113. I would like to see a few things.

    1) I would like to see a portal pulling together all the Google things I and many other webmasters utilize. Maybe on the Google Personalized Homepage? Search, GMail, Sitemaps, Analytics, AdSense, AdSense report requests, As It Happens search alerts, AdWords, Blogger and so forth. There are so many services branching off from Google, and I much prefer the idea that if all these services really are valuable to us (and they are), they should all be accessible from one location. The speculation about a Google Browser has always had me hoping that you guys would deliver a package [or maybe a set of extensions for Firefox??] that could deliver your webmaster services neatly together.

    2) I LOVE the idea of some sort of Google check – perhaps a once a month spider that could deliver some analytics and reporting about where our sites may suffer penalty. I much prefer the idea of logging in to review your own account than to offer the tool as a search command [penalty:domain.tld]

    3) If sites are being sandboxed – even if it’s a great way to filter spam – why not offer another search option to the public to include sandboxed sites or maybe even better – to specifically search new sites!

    My two cents worth.

    Thank You.
    ER

  114. Definitely a probation period before invoking penalties, and a few hints on what penalties will get thrown to the bottom of the rankings. I agree with most of the other things that have been said here.

    Webmasters should be able to choose whom to link to, and not be penalized for it by losing site ranking.

  115. Mauricio Quiros

    1) Clear Webmaster Guidelines. Your current guidelines are hard for new people to understand at some points. Is there an encrypted message in them? Why not just tell webmasters DO THIS AND THIS AND THIS?

    2) The ability to add/delete/edit my website’s indexed pages (without losing their current organic position).

    The simpler the better.

  116. How about giving some praise for webmasters and site owners who are genuinely making an effort to create a rewarding experience, and who have followed the rules, rather than assume we are as guilty as the hacks who continue to pollute the Internet with crap?

    I, for one, have come to the opinion that it is better for me to make a site that 1.) I am proud of; and, 2.) is interesting to visitors of intelligence and good taste, rather than chasing my tail trying to figure out Google’s next move and then having to rebuild my site to suit the capriciousness of an algorithm. (I have read and understood the literature published by the founders for their thesis… and I went to Stanford too… across the quad, though.) What an incredible waste of my talent and resources to create a site that you folks deem “worthy” at the expense of giving my visitors what they want. Right on, Garth.

    I’ve decided to subscribe to the “If you build it, they will come” school of thought, and to constantly improve the quality of my site based on the needs of my visitors and sound marketing practices, not Google’s. I’m getting 95% of my visitors through word-of-mouth recommendations and 2% from Google… and we’re growing at 200% per month. On our latest rev, 20,000 page views per month ain’t bad for a site three months out of the box and one month after a complete rebuild. My visitors can find me, even if Google can’t.

    As it is, http://www.villas.com has gone from 300 solidly indexed pages to 68 since our rebuild a month ago… 10 of them real functioning links and the other 58 worthless links that Google dredged up from the first version of the site, which we retired over a year ago. So, thanks for publicly posting obsolete and moribund links to pages from a site that no longer exists, and hasn’t since 2004!

    All the best,

    Steve at Villas

  117. A site checker. E-mails from Googlebot, pointing at broken links, unparseable pages and similar problems. Opt-in, of course.
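[Editor's note: an opt-in checker like this could start from something as simple as extracting each page's link targets before testing them. A sketch using Python's standard-library HTML parser; a real crawler would then fetch each href and report non-200 responses:]

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect the href targets on a page so dead links can be reported."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Record every anchor with a non-empty href attribute.
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)
```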

  118. Brian M

    Hi Matt,

    A public “penalty:domain” command could be used by competitors to gauge the effectiveness of their attempts to push other web sites down in the results (yes, this can be done). And although a “penalty:domain” command is a desperately needed function for webmasters, it does not address the issue of “filters” that may seem like a penalty. So, here is my wish list for items that could be included in the existing Sitemaps functionality (that is only available to webmasters of a site):

    1. Private “status:domain” command that lists penalties or filters that have been triggered (i.e. duplicate content, etc.).
    2. Permanent URL page removal tool (old pages keep coming back).
    3. Permanent “Supplemental Result” removal tool (these pages hang around forever).
    4. Webmaster account to sign up for email notification of penalties and filters.

    Thanks for letting us give you our input.

    Brian M

  119. This blog is an excellent part of Google’s webmaster services. Nice job Matt ;-)

  120. Ian

    C’mon, free hosting with integrated AdSense, analytics, a blog option (like Y!), video hosting using Google video, etc etc… I really can’t fathom why this isn’t done yet!

  121. gomer

    What is needed in my opinion is communication and this does not require a new product, service or some fancy new technology.

    When a webmaster writes asking what happened to their site, have the courtesy to respond and respond with honesty. If they write asking if their site has been penalized, and if it has not been, tell them their site is not under a penalty. If their site has been penalized, say it has been. If canonical issues are the problem and you are working on a fix, say so.

    If someone writes asking why another site has not been penalized when they appear to be breaking Google rules and they have been repeatedly reported, have the courage to say that some spam is handled algorithmically rather than manually. This may help you receive more spam reports and prevent a few people from going crazy.

    What would ease your stress, streamline your life, or otherwise help you as a site owner, maybe advertiser, or webmaster in 2006?
    Honest communication.

  122. 1. A way for webmasters to avoid clicking on their own AdSense ads on their website. I know I personally click on AdSense ads less in general, since I have conditioned myself not to click on them on my own website. Sometimes I forget where I am, and it’s okay to click on other people’s websites! :-)

    You could allow webmasters to set a cookie in their browser that identifies them as the webmaster of the example.com website, and have the AdSense code on their website go into “Webmaster Preview Mode” or something like that. Make the ads a different color, or do something so that it is obvious it is in preview mode. You would need to support webmasters who have multiple websites. Another option is a Firefox extension, or a feature of the Google Toolbar.

    I’d still make the ads appear and the links clickable; just don’t charge the advertiser… or at least make it so that you can still find out the destination URL of the ad, so you can type it into the address box of the browser. I know many times I am able to look at the Google AdSense ad and find the URL, so I can still visit the advertiser without clicking on the ad on my own website. Thanks for putting that feature in, by the way. It allows me to check out the advertisers on my website without violating the AdSense TOS.

    2. I’d also like to see some clarification and reassurance about whether or not other websites can get your website banned, or damage your ranking, by linking to you from banned websites. There seems to be a lot of information floating around that implies other webmasters could attack your website and get you banned, or reduce your PageRank to 0, by THEM violating Google’s rules. Not sure if this is true, but there seems to be a lot of talk about it.

    If so, I would like to see the webmaster who is the victim of the attack to have some kind of recourse against the attacker… or at the very least make it easy to resolve the issue and make it so that the webmaster is notified.

    Perhaps you can allow webmasters to sign up as the registered contact for a domain and get notices from you about suspicious behavior, or about whether they have been banned and why.

    The biggest thing is you have to find a way to protect the innocent, rule-following webmasters from the ones who abuse the system. And that may take some creativity.

    Thanks.

    Scott
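[Editor's note: the "Webmaster Preview Mode" in Scott's point 1 could hinge on a simple server-side cookie check. A sketch with a hypothetical cookie name and format; this is not an actual AdSense feature:]

```python
def select_ad_mode(cookies, site_id):
    """Serve non-billable 'preview' ads to a site's own webmaster.

    cookies: dict of request cookies. The hypothetical
    'adsense_owner_sites' cookie lists the sites the visitor owns,
    comma-separated, so one cookie covers webmasters with many sites.
    """
    owned = {s for s in cookies.get("adsense_owner_sites", "").split(",") if s}
    return "preview" if site_id in owned else "live"
```

Preview ads would still render and link through as Scott suggests; clicks in preview mode simply would not be charged to the advertiser.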

  123. I run a solid, top-ranking site in my category with great content and have never had to hire an SEO. However, I’m getting clobbered on a number of terms by those who have. It seems there are two types of sites: those that do things like hire SEOs who propagate hundreds of BS links, and those that don’t.

    2 things would be great on that front:
    – an obvious way to notify Google of the SEO spam domains that come up when I search on competitor inbound links, so that they are omitted from future results
    – positive weighting of the voting patterns of the social bookmarking sites

    I also have a problem with other sites plagiarizing my content. It would be great to have a clear way to submit them for exclusion from Google.

    Finally, I have a problem with content moved out of a database and into HTML losing the PageRank that had built up for the URIs with specific arguments, even though I did 301s from those argument-based URIs to the new HTML pages.

    Thanks for asking what we want,
    Peter
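[Editor's note: the database-to-HTML move Peter describes boils down to mapping each old argument-based URI onto its new static page before issuing the 301. A sketch with invented path patterns, purely for illustration:]

```python
from urllib.parse import urlparse, parse_qs

def redirect_target(old_url):
    """Map an old ?id= URL to its new static path, or None if unmapped."""
    parts = urlparse(old_url)
    if parts.path == "/page.php":  # hypothetical old dynamic endpoint
        ids = parse_qs(parts.query).get("id")
        if ids:
            return "/pages/%s.html" % ids[0]
    return None  # unmapped: let the server answer 404 rather than guess
```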

  124. It would be nice if advertisers could sign up to advertise on our website via Google AdWords… on our website. With some kind of API and script, it would be nice if they never had to leave our website to advertise on our website.

  125. DougP

    I feel that Google should start a team to handle reinclusion, even for a fee. Since it is a computer that decides whether a site gets banned in most cases, there are gonna be mistakes. So something unintentional may cause a site to get banned. If you have a site and you feel it’s a good resource, you should be able to find out why your site is banned. If you are willing to pay to find out why, then it must be worth something to you. If Google does not want to make money off it, donate the $ to charity.

  126. Hi Matt,
    In terms of suggestions: my site got an enquiry asking for advice on how to report an SEO company to Google for placing black-hat SEO on a client’s site. They were asking if you had a special page where they could report it. I told them to go to the normal Google spam page, but they were looking for a specific page. I don’t know how you would verify that the named SEO company actually did the work for them; it could get a bit iffy and turn into he-said-she-said, but it seems a valid query for users to make.

    L8r
    M

  127. Real-time “validation” of individual Web pages’ or even entire Web sites’ indexing quality (like HTML validators and robots.txt validation) by Google would be most useful, but not via email. Admittedly, any instant page-quality validation done without Google’s PageRank data and a page’s “popularity” factored in tells only part of the story, but it’s what most mainstream webmasters care about… Have I overloaded the keywords, phrase repetition, or whatever within the page itself?

    Even better would be if the real-time validator spidered outward from the tested page about 2 or 3 links’ worth, but only within the same domain (.com, .net, etc); that way, the topmost pages in a small Web-site can all get a quick, clean bill-of-health.

    The Sitemaps features/console are very cool but too fussy and complex for basic use.

    [Off-De-Topic signoff.... Your recent post about the desirability of decorative laptop-labelling reminds me of a fellow ex-boatbuilder who enhanced his black ThinkPad with hot-rod-style red, yellow, and blue flames. Linear polyurethane, just like the ocean racers. Used airbrush, also professional masking-tape to protect the keyboard and screen. Even de-greased the case before painting. But he forgot while spraying that opening the laptop during "sleep" mode usually also activates the intake FAN. Glossy red motherboard....]

  128. I would love to see Google Project, a project management package. I would also love to see Google Calendar, because I hate Outlook and want to be fully online with all email etc. I exclusively use Gmail for my email now, but it’s annoying that I have to use Outlook for my appointments…

  129. DavidZ

    “I feel that google should start a team to handle reinclusion even for a fee”

    I agree with the above. My site was banned from Google, and although we know the reasons for the ban were justified, we have taken the necessary steps to meet Google’s guidelines, and we don’t intend ever to neglect our obligations from this day forth.

    I have submitted the reinclusion forms, but there is no one to speak to, and the process itself takes, I’m told, 3-4 months, perhaps as long as 6 months. It makes me very sad.

    People would pay Google $50 a call to ensure a rep works with them to reinclude their site. I just need someone to speak to and better communication.

  130. Hey Matt,

    As a beginner building my first site, I have so much trouble figuring out what’s up at Google.

    Greater transparency would make my life easier, whether that’s a penalty: command or backlink results that aren’t truncated after only a few listings.

  131. Matt,

    Weird one here… and don’t think I haven’t tried to find the answer to this before turning to gurus like your good self. Ohh, my ego is miles too big to immediately accept I don’t know ;)

    Anyway, back to the issue. I am sure that somehow our site is being penalised. The main reason I am almost sure of this is that I have checked which of ALL our indexable pages are actually IN the indices of the 3 major engines… and which are NOT. At the moment I’m doing this daily.

    Every day we end up with about 800 pages removed from the indices, and a fresh 800 or so, which had previously been removed, added back. A day or two later the newly added pages are kicked out of the index, and again the previously crawled and kicked pages are reincluded! Weird beyond belief, and it’s been going on as long as I’ve worked here!

    I’d prefer that if you comment, you didn’t include the actual URL of our site. The strange thing here is that each day, to keep abreast of new prices and updates made to descriptions and inventory listings, we release a whole fresh copy of the site, which is literally thousands of new HTML pages replacing the old ones. We basically regenerate the HTML files.

    I have NO idea at this point what the engines are doing. But any tool that could simply say something like “Your page http://www.xyz.com/page/thispagehere.html has been removed – Reason: copy duplication” would help. Now, I understand that competitors might have a field day, but not if, like with Google Sitemaps, you needed to verify your attachment to the actual site.
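[Editor's note: the daily check described in comment 131 is essentially a set difference between two days' snapshots of indexed URLs. A sketch; how the URL sets are gathered, e.g. from site: queries, is outside its scope:]

```python
def index_churn(yesterday, today):
    """Compare two daily snapshots of indexed URLs.

    Returns (removed, added): pages dropped since yesterday, and pages
    that (re)appeared today, so day-to-day churn can be logged.
    """
    return yesterday - today, today - yesterday
```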

  132. 1) Allow us to use AdSense funds to fund AdWords.
    2) Make the guidelines clear and easy to understand. It looks like your guidelines were written by some business analyst writing in legalese. You could even call it Google Guidelines for Dummies.
    3) A penalty checker, like everyone else suggested. If webmasters start to figure out the algorithm, new black-hat techniques and the like, Google will have to counter and make search results smarter. Are you up for the challenge?
    4) Some stuff I don’t have suggestions for… can I just list problems?
    a) I can’t make a site entirely out of Flash. Although I can make some really appealing eye-candy websites, I hold back because the spiders can’t follow my navigation or read my text. Search engines make things easy to find, but also make them boring. Maybe there is a work-around that I don’t know about, using sitemaps or something. Sorry if this one seems off-topic, but it would make my life easier as a web designer.

    I have more suggestions, a little late here. Maybe another day.

  133. Joe

    I’m dying to get my hands on a tool that shows penalties to my websites. Just think about how many webmasters out there really think their websites are following Google’s guidelines 100% (like me), and yet my blogs keep getting penalized left and right. My latest one, which got penalized this past Sunday, is http://wherewearebound.typepad.com, and this site has been around for years. I recently decided to blog about issues other than political stuff and, before you know it, the site was penalized. I have another blog that was penalized 3 times in a 3-week period, for one day each time; then it went back to normal until the 3rd time, when it never came back. A tool like this would be great.

    My own theory is that other webmasters get upset that our websites are out-ranking them so they file a “spam report” and submit our site in the report. The above mentioned site is not doing anything wrong. No duplicate content, no link buying or selling, quality content to the readers. I’d love to know why I keep getting hit by Google so a tool like this would be great especially since I depend on adsense income to offset some of my bills.

    Thanks Matt

  134. Hi there,

    The SEO game has changed, not because of a radically changed algorithm, but because of competition. Yes, search engines constantly change their algorithms, but that is not significant. I would say the main parts of the algorithm are the same.

    But the main problem is competition. Many people want high rankings, and so they try many tricks.

  135. @survey what about new websites launched every day?
