Feedback: Search quality in 2006?

Okay, this thread is for any search quality improvements you’d like to see that are *not* webspam-related. Are there types of queries that you wish we did better on or that have frustrated you? Are there operators that would make web searching easier for you? Do you wish we’d do something different with when or how we show snippets?

Again, take a moment to think about it without looking at the other comments. Then please add a comment. Again, I’m less looking for discussion in this thread and more looking for independent views of what Google ought to be doing to improve different types of web searches.

98 Responses to Feedback: Search quality in 2006?

  1. Google should become more time-centric. There are some topics that are timeless, but others that get stale after a couple of years. It is very frustrating to search for information and get stuff that is 7 years old.

    And yes, I could include a date in my query, but you’re asking for something to improve the user experience.

  2. Hi Matt,

    I just have a question; maybe it’s not related to this thread.

    Is a Google.com algo update reflected months later on other sites like Google.com.mx or Google.com.es? I have seen some changes on Google.com.mx over the last few days and I thought you were implementing the Jagger update there.

    Also, a command like anchordomain:http://www.domain.com would be a great idea!

  3. A “recent” operator.

    Try doing a search for “nfl draft predictions”.

    The top result is a list of predictions for last year’s NFL entry draft. Sure the next draft isn’t until the summer, but anybody searching for this would be more interested in the upcoming one than the previous one.

    It’d be nice to tell Google that I’m not concerned with things that happened last year; only recently or upcoming. That’s just one example, there are tons more for this category.

  4. I really like Microsoft’s slider adjustments, particularly the one for freshness. It seems to help me get newer items. I wish Google had a similar method to let me search specifically for items that are newer.

    Next, I would like more vertical indexes. I use the edu and gov ones all the time and they are very useful. If there was a way to set up custom verticals, in a manner similar to Rollyo, that would be awesome.

  5. how about categorizing results based on whether or not they’re commercial or informational?

    Try searching for anything loan-related. You won’t find information, only companies trying to sell you loans.

    Search engine optimization is another keyword like that. You’ll find companies, not information.

    I’d like to be able to not see any companies offering services or products to me for some searches.

  6. I would seriously wish that Google would resist the urge to suppress “freedom of speech in pictures”. I have a website with photos of The Little Mermaid IRL, and some time ago it went from top ten to 300+ – with nothing changed on it, and the rest of the website doing fine. The only possible explanation I can think of is Google considering the Little Mermaid statue too “sexy” or “nude” for its users. Seriously!

  7. Too much emphasis is placed on Amazon. Whatever causes them to rank should be toned down a little.

    Pay attention to timeliness. You occasionally see things like ebay listings in search results. They should never be there.

    Get rid of “Fill in the blank” ads from adwords. The ebay ad for any product makes me less likely to click on any of the ads.

  8. Matt,

    Thanks for opening up the floor to us.

    I preface all of this by saying that if users learned how to search correctly, most of this wouldn’t even apply. 🙂 But, we all know that is too much to ask …

    One of the comments above mine basically asks for Google to infer “intent” from the limited keywords entered into the search. Couple this with Bill Gates’ comments at CES where he called search “bad” and it basically comes down to Search AI.

    I think the “real win” for search will come from taking Search History of a user and then applying a weighted system to their search trends to then re-organize their search results using a weighted rank based on “probable” intent. Gates talks about search not knowing if you are buying a product locally, doing research, shopping online, etc. (example he gave was a search on “pizza”). Somehow, with limited user input like a search on “pizza”, Google needs to determine user intent and display the relevant results based not only on keyword matching, page rank, relevancy, etc. but also on “what exactly does the user plan on doing with this information”.

    Obviously, this is a tall order – just throwing that idea out there.

  9. Matt

    Awesome question. I’ve been itching to pass on my complaint/idea. I search for different kinds of things at Google: sometimes for products and sometimes for information.

    For instance, I’d like to buy a tractor. But first, I’d like to learn about tractors (I’d like to educate myself). For instance, I’d like to learn: what to watch out for, how tractors work, how to mate attachments, how to do maintenance, etc… However, when I do a regular Google search, no matter the relevant words used, the results are always dominated by product pages with an ‘add to cart’ button (which makes sense, since these pages are surely going to be SEO’d).

    If the search could sort the results into information and product, or simply allow me to weed out the products from the general search, it would be more useful (maybe by allowing me to not see results that connect to a shopping cart).

    I know I could try some of the other google searches for this (ie – blog search), but the results of the other google searches usually disappoint compared to the premier google search (plus I’m lazy).

  10. Well, I’m glad the Gimp no longer shows up in queries for FTP-servers. 🙂

    I’d like to have a wiki: operator for Wikipedia… Even though it’s perhaps too much to ask for since Wikipedia is not affiliated with Google.

    I’d like all the cached links to start working again – all of them used to work, right now less than half of them still do.

    I really like the snippets. In the past, EVERY single result had one – why not anymore? Especially for results toward the end of your search. It can be quite bothersome, because it often forces me to either click on it (and too often they’re large documents), or guess whether it’ll be relevant (I don’t know which option is more annoying).

    I like the “remove result” link I used to see. I haven’t seen it lately, but that may be due to me surfing in Konqueror instead of Internet Explorer. I hope to see it return when I’m back on IE. One thing though: I only used it for the function to remove entire web sites. I wish I could have it default to that option. I can’t see why you’d ever want to remove a single page? That doesn’t even seem to be worth the effort.

    I’m sorry for going a bit off-topic… Google’s feedback options have just gotten worse and worse. If you feel it’s too off-topic, please just delete this comment. After all, this is your own blog. I guess I’m too demanding anyway 😉

    Oh, I just thought of two more: why the annoying image size list box in Google’s image search? The only effect it seems to have had is that I now need one MORE click to access the feature compared to the three links. And it’s too bad as well that I have to edit the URL in order to fine-tune the sizes (e.g. if I only want to see ‘xlarge’ images, and not the entire group of medium-sized images).

  11. Typo searches

    On a few occasions over the past few months, I recall searching for certain keywords and seeing, about halfway down the page, links for a different term that were nevertheless relevant in one form or another.

    Why not have that for typo searches? This way, a searcher has an increased chance of actually getting to a better destination. Kind of like a beefed-up “did you mean”? I am not going to be specific, but I actually have to make naturally ranked pages that target typos, because on some keywords normal searchers really do misspell the terms I target, and it can sometimes be a lot cheaper than using PPC. Having that option for typo searches would let searchers locate what they are looking for more easily and make Google look a lot cleaner.

    I am sure I could come up with a bunch of other ideas, but I really think that this would be a good idea for this purpose.

    Mike

  12. A nice optional parameter in search results would be the type of content on the site: pdfs? flash? lots of pop-ups and ad networks? blog? Perhaps the number of pages on the site. I wouldn’t want to junk up the results so that’s why it would be optional. But depending on what you’re looking for, it could help determine at a glance if it is the right type of site.

  13. I search a lot for “blue widgets”. Sometimes I want to buy a blue widget. Sometimes I want to research blue widgets. Sometimes I want current news about blue widgets.

    Search engines try to guess what I’m looking for based on a noun. Why not place qualifier check boxes before the search term, such as Buy, Research, Reviews, News, Directories, Blogs, etc.?

    When I want to buy a blue widget, I’m not interested in blogs, news articles, or my search term buried within some unrelated topic – I’m looking in this case for ecommerce sites that sell blue widgets.

  14. Matt,

    Great idea creating these feedback topics! Hope you don’t get completely drowned in the avalanche of feedback!

    I strongly echo the previous requests for results based on timeliness. Much of the time when I’m searching for information, I’m looking for the latest info on something. Perhaps a weighting in the algorithm that benefits “fresher” information.

    Second: a solution to the problem of plural/singular and synonyms. Currently a search for “bridal registry” and “wedding registry” return very different results….and a search for “bridal registries” returns different results from “bridal registry”.

    This is a hard problem (especially the latter) to solve on your end, however…I recognize that. I’m inclined to want to use the keyword meta tag to communicate this to the search engines–the webmaster will know what different key phrases and words customers use to refer to their products (this also tackles the misspellings problem!).

    Certainly metatags have been wildly abused in the past by spammers–however, you may be able to deliver better results to the customer by allowing the webmaster to declare what a page is about than trying to analyze the page to see what it’s about.

    Applying your page-analysis algorithms to spotting misleading metatags seems like about the same technical problem as applying those algos to determine the page’s content. Allowing the webmaster to say the page is about “wedding registries, wedding registry, bridal registries, bridal registry, bridle registry” etc. might let you better categorize a page’s content; you could then use your ranking algos to order the results and deliver them to the consumer.

  15. I would love to see Google expand its synonymy efforts. For example, it would be great if the search results for “hypertension” and “high blood pressure” or “cars” and “automobiles” weren’t completely different.

  16. I’d really like to see some stemming technology fall into place. But since that seems to be one of the last things Google has in mind for us, I’ll continue to search four different ways on one phrase.

    I’d also like to see Google try to educate users on advanced search suggestions. Maybe a hint at the top. For example, a search on “spears” could yield the following:

    Results not what you want? Try removing irrelevant words with the – operator. Using spears -britney would eliminate 15,350,000 listings.

    It’s almost like that old proverb… “Give a man a relevant search and he learns for a day. Teach him how to make all his searches relevant and you can sit back and play videogames instead of doing hard work.” Ok, maybe a little of both is better.

  17. The one thing that absolutely infuriates me when searching for information on music, TV or movies is all the commercial sites (mainly stores) that plague my rankings. I often find myself having to type in such queries as: [“the beatles” “abbey road” -store -usd -shop -shopping -basket]. It’d be nice to be able to do a query such as [info: the beatles abbey road] that would only return results that don’t have anything commercially-related on the page (e.g. links to shopping baskets, lists of prices, etc.). I don’t want to buy the album, I want to read about it :)!

  18. I think that some of the most frustrating things for me are when articles or pages on well trusted sites rank above smaller sites that are more relevant and important to the specific search.

    True, the articles may be very relevant to the search phrase. But, they should not be considered more important as a direct result of the websites that they exist on. Obviously, I’m referring to a sort of trust ranking type of glitch, but it seems to me that many SERPS have similar occurrences in them.

    Many are simply affiliate link pages or advertisements, on well trusted sites, that are ranking only because of the site. The pages themselves are often neither relevant nor interesting.

    Just for an example related to my industry:
    Google search: Merchant Account
    Results:
    15: bizjournals.com
    16: forbes.com
    19: hostingcatalog.com
    21: ezimerchant.com
    28: online-store.co.uk

  19. I’d like to see dated information. The last time a page was cached doesn’t help me that much; if, say, you coupled that with the last time a page was updated, that would be beneficial. Maybe allow an “archive” search that lets users find information that was last updated in 2004, or something of that nature.

    Another interesting idea I think would be to allow user feedback. So if a website were to appear for a certain search phrase such as “SEM quiz”, and you look at the majority of the top 10 results, you could click a link on Google that says “This site is not relevant to my search”. Now if this website breaks a certain threshold based on user feedback for this search phrase, then a Google quality engineer can view that site personally and decide whether or not it should be included in future results for that search phrase. As we all know, just because a site ranks well doesn’t mean it’s useful for the search topic. Maybe this can be a feature of a Google account of some sort to prevent a ‘click fraud’ type of response from users. It might not even be possible to do this, but it’s just an idea.

  20. two words: People Search
    Yahoo’s is not very good at all. When you’re searching for ‘John Smith’, the two terms need to be recognized as belonging together, so that you don’t get result pages back with ‘John Wayne, Frank Jones, Bob Smith’. That has the terms you were searching for, but not in the semantic sense you were looking for.

  21. I find myself using [define:] all the time and I wish there was (no laughing, please) a shorter command for it. Something like [::] maybe?

  22. I think that the current support of rel=”nofollow” for backlinks is just a semi-solution for a much wider problem than just the antispam fight. To make it a full one, search engines should allow users to give a hint to ignore parts of the content of their own pages. In short: I suggest support for a hint like <div id="myid" rel="not4se">...</div>. I’ve just written a big post in my own blog about this possible hint.
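
    Just to illustrate, here is a rough sketch of how an indexer could honor such a hint, assuming the hypothetical rel="not4se" attribute above (Python and BeautifulSoup used purely for illustration; this is nothing any engine actually supports):

    # Illustrative sketch only: "not4se" is the hypothetical hint proposed above.
    from bs4 import BeautifulSoup

    def strip_not4se(html):
        # Drop any <div rel="not4se"> blocks before the page text reaches the indexer.
        soup = BeautifulSoup(html, "html.parser")
        for div in soup.find_all("div", attrs={"rel": "not4se"}):
            div.decompose()  # remove the element and everything inside it
        return soup.get_text(" ", strip=True)

    page = '<p>Index me</p><div id="myid" rel="not4se">navigation boilerplate</div>'
    print(strip_not4se(page))  # -> "Index me"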

  23. [define: ] should give spelling suggestions if you spell the word wrong.

  24. Give users the option to specify whether they are doing a “commercial” or “informational” search.

    Existing options (News, Local, Froogle, etc.) are nice, but as a user I’d love to just have those two choices right at the top of my search: “commercial” to let me find the product or service I’m looking for, and “informational” to find out about anything (product, service, or otherwise) from someone who’s not trying to sell it to me. Maybe this could be some kind of filter?

    One example that comes to mind is web hosting — I recently tried using Google to find a new host, but most of the “reviews” and “directories” I came across were the bucolic fabrications of web hosting companies themselves, most of whom had bad spelling and at least 10 domains with the word “host” in the title. I was left with only my considerable spelling talents to separate out who was biased and who wasn’t.

  25. I agree with EKB. Google should allow users to choose what to search for: information or commercial, even though we already have scholar.google.com etc… I think it’s not enough, and not many people know about it anyway.

    Recently I have seen some very odd websites hosted by free hosts that rank on the first page for keywords that Overture says have been searched 500k times. Now this makes me sad. A person who designs carefully and uses clean white-hat SEO methods, trying to reach at least the first page, sees a website built in 3 seconds with some odd website builder rank above them; that is insane. Google should look at the information that we provide and also how we provide it.

  26. Results for languages much different from English are too bad, in my opinion. It is the only reason why Google is so unknown to ordinary users in Russia, for example…

  27. How about something more basic? Why can’t I search for an email address? Google ignores the @ symbol and many others.

  28. duplicate results – WHY.

    In the first four pages of any search some websites are listed several times – I don’t understand why.

    why does google sometimes list 2 pages from the same site – it always does with mine – why?

    Someone searches for Bratz Bedding – GG lists my Bratz page – which has Bratz bedding on it – great – but also lists my generic bedding page which only has a link to the Bratz page. I’d be quite happy for the first page only to be listed. I’m sure visitors don’t want to see the same site again and again.

    and

    if GG stopped doing this, sites that are currently on the 3rd page might get up to the top of the second page – cool. Fair shares for everyone.

    Visitors are quite capable of ascertaining if a site has any relevant information on it – you’re just clogging up the Serps.

  29. I see a shift towards large companies in the SERPs, but rarely find such results very useful for the information they provide. Large companies have millions of pages, huge resources and content, and have now started to learn how to take advantage of search engine optimization. But I don’t find it very useful to have nationwide big companies in search results for local searches, where a smaller, local website provides better and more useful information. I am afraid that the internet will soon become a playground for huge corporations and become something like advertising on TV. Few will be able to afford it and the rest will be lost and never get the chance.

    The internet was always great because it gave the smallest companies a chance to market themselves in a very inexpensive way.

  30. Try a search for “how to be a model” without quotes.
    There’s a ton of information about the word model. What the searcher really wants is information about how to be a model like Tyra Banks, Cindy Crawford, or, well, you could probably come up with a lot of names. 😉

    I’ve got news for you: the searchers are not looking for Model Aeronautics, Model Railroader magazine, or the W3C Document Object Model.

    A search for “how to be a model” with quotes brings up exactly what the searcher is looking for. By the way–all of the Adwords ads are completely on-topic for this phrase.

    In this case, there’s only one result that is anywhere near what the searcher is looking for. I wonder if there’s any way to determine that the searcher is looking for “how to be” something–and people cannot “be” model railroads or object models.

  31. @SlyOldDog: Put the email address in “quotes” and Google will find it fine…. as long as it is in the text on the page. If it is only in a “mailto:” link then I have no idea how to find it.

    I know these threads are supposed to be about “search”, but those informational queries mentioned above as being hard to fulfil all have their answers just half a dozen clicks away through the hierarchy of http://directory.google.com/ 🙂

  32. I think forum topics should come up higher in the results; there’s a wealth of information in forums/bulletin boards that’s not found in a lot of the search results.
    In other words, I think forums should get a little kick over other sites; most of the information I learn on the internet is through forums.
    Thanks Matt

  33. The one site flaw that I run into almost every day on Google is that clicking on “More »” for additional search options clears my query from the search box. I imagine that a lot of Google searchers would like an easy way to carry their existing query over to other search channels like Blogs, Catalogs, Directory, and Scholar searches.

  34. More semantic analyses of web page texts.

  35. I am a C#/.NET developer and I have issues searching for “.net” (I know it’s a bad name… thanks MS), but nevertheless it’s a very important thing to be able to search for.

    brad

  36. I wish G would not show two listings for the same web site on a search. Most users have the default setting of 10 results per page, yet often only 5 unique sites are shown because they are listed twice. In one case, the 1st listing was for the home page, the 2nd for their Contact Us page!

    If a site is well designed the user will find the info. G just needs to show the single, most relevant page.

  37. I would like to see Google tackle expired domains again. The registrars are one step ahead by not allowing expiring domains to drop. They also keep the old whois info on them.
    I let a domain expire unwittingly, and when I discovered it, I found that it still had my exact whois info except for the admin email, and the creation date was not reset. It was being used to spam porn links.
    Matt, what is being done about expired domains?
    I see that 80% of adult serps are all expired domains.

  38. Your indexes are split out by country (e.g. http://www.google.co.uk/), but google.com shows many results that only apply to another country.
    On your Google regional sites you can select to search results from that country only, but you can’t do that in the US.

    Do a search for ‘car insurance’ on google.com and results 4, 5, 6, and 10 are UK sites that only service UK customers.
    Do the same search on http://www.google.co.uk/ and select ‘the web’ and you get only UK sites. Select ‘pages from the UK’ and you get the same results. Those top 10 results are spread throughout the top 25 on google.com, but they don’t help any visitor other than UK visitors.

    One option is to offer the option to search US only results, but Google is probably smart enough to figure out a way to clean up the SERPs without a user filter.

    In many industries a site and/or product is region-specific. It does make it harder if a site tries to accommodate multiple regions.

  39. Hmm, a couple of ideas come to mind.

    The first is this: A register of ‘known webmasters’ whose identity can be confirmed and who pursue white hat only techniques. Website owners could apply for ‘known webmaster’ status through, for example, their Google account. Their site(s) would be given ‘trusted’ status which could be reflected in SERPs.

    Commerce sites can use third party services like eTrust, but there is no real alternative for content driven sites.

    Qualifying criteria could be as strict as you like, such as only persons with at least one academic journal article published, or persons listed as officers of public companies, or persons with at least one article published on an authority site. One other category which might be included (a bit more controversial here) is ‘people known personally to the search engine community’, i.e. people who are actively involved in understanding the inner workings of search engines (attend conferences etc.) but only use whitehat. If a person is known within a community and has status within that community, they are less likely to go outside the rules of the community because they risk loss of status if caught.

    Just for the record I’m not in the first three categories and don’t consider myself to be in the fourth category.

    Basically I’m suggesting that the ‘known webmasters’ who only use whitehat should be given some degree of preferential treatment. Elitist? Certainly. But it might give webmasters something different to aim for besides backlinks backlinks backlinks.

    Of course, risks and problems would be significant, not least the risk of ‘status theft’ due to someone attempting to pass themselves off as a known webmaster.

    But food for thought, yes?

    And I know that you said don’t look at other posts beforehand, but I couldn’t resist…

    Toby Adams above makes reference to sites with an ‘Add to Cart’ button coming up too high in results. I would suggest that Google, rather than change the main algo, set up an ‘Info Only’ search, sort of a halfway house between regular results and Scholarly Papers. Info Only would look at content-driven sites with no ecommerce element. Difficult to specify criteria for this, but worth thinking about.

    Bill Hartzer discusses various interpretations of the word ‘model’ – suggesting that search context needs to be improved. How about a ‘category’ button – a ‘pre-search’ screen that would give categories rather than sites. The user would select a category, and results limited to that category could then be returned. A possible way for G to integrate tagging into their results?

  40. A follow on from my previous comment – I acknowledge that EKB put in the same idea about non-commercial information search before me.

  41. And Dean Clatworthy!

  42. Hi Matt,

    I am a bit confused, please help.

    After the Jagger update I stopped doing blog and forum submissions, but I still see that the sites doing this are getting better results, like a major increase in their actual links, site indexing and all.

    So can we practise this now or not? I am working on shakespearefinance.co.uk; initially it had 0 backlinks, but last month in December I got 10 links for it, with no further improvement in rankings and all.

    So I just want to know whether we should go with forums and blogs or not.

    Suggestion required, please.

  43. There are reserved characters in Google that can not be searched for.

    For example, the other day I was searching for drivers for my wireless network card. My device’s model number is DWL-650+. There is also a network card called the DWL-650 and I couldn’t be sure if the DWL-650 drivers were compatible with the DWL-650+, so I was only interested in drivers for the DWL-650+.

    However with + being a reserved character in Google, I found no way of searching for DWL-650 without DWL-650+

    I’ve come across this problem numerous times before, but this is the most recent example. What would be really great is an option to do a straight search where Google assumed all reserved characters, stop words, etc. were vitally important to the query.

  44. >>@SlyOldDog: Put the email address in “quotes” and Google will find it fine…. as long as it is in the text on the page. If it is only in a “mailto:” link then I have no idea how to find it.

    I know, these threads are supposed to be about “search” but those informational queries mentioned above as being hard to fulfil, have all the answers just half a dozen clicks away through the heirarchy of http://directory.google.com/ 🙂

    Actually this is not true. Google ignores the @ symbol. It might find the email address because of the name, but not because of the @ symbol. You’ll get the same results if you put a # there or any other similar symbol.

  45. What about a partnership between Google and old webmasters…. like a “Google trust partnership”.

    In which we agree to do 0% spam, pure content, etc…

  46. Search Quality…. Now that’s a subjective subject. 🙂

    I still wonder what you guys are doing in this serp: css hack. I understand the results, but not the presentation. Right in the middle there are a couple of different results, and the next page doesn’t start at result 11 but at result 8. ???

    You know what I would really love to see? Rank positions in front of the results, so I don’t have to count anymore to see which position a site is in.

  47. I get hacked off doing celeb picture searches only to find little to no content on A LOT of high-ranking sites. They have 1 image and a lot of text using keywords, but no content, or only links to other sites. If I do a search for “matt cutts pictures”, that’s what I want. I don’t want what comes back now – just look at what you get.

    I know you guys have an image search engine but that is way off the mark and not very good to be honest.

  48. These localised results are a pain in the ass. Unless you’re from the USA or Canada you won’t get the best search results available, as Google feels the need to sneak in some bad results from geo-closer locations. Even from languages I don’t speak. If I want localised results I’ll say so. Mostly I just want the best results, not the closest to home!

  49. This was posted on the thread about webspam, but it’s too important to let go.

    I’m tired of seeing results from search engines, shopping comparison engines and eBay. I’ve seen searches where the first two pages were nothing but shopping comparison engines, eBay and Amazon. Those are totally horrible results, and it points squarely to PR being your Achilles heel for providing quality SERPs.

    If people want to use those services then they can go to them. They don’t need your help finding those sites.

    Continuing on this path will result in the top 3 or so pages being filled with major corporations such as WalMart, Amazon and others with no room for the little guy. Those major corporations might have the product, but often they lack the quality customer service one might find in small business.

    Also, by not filling the top results with those types of sites, the smaller guys won’t have to hire SEO companies whose activities are questionable.

    The results will naturally improve.

  50. Search Quality improvements:

    Imagine I searched for the terms: “dog medicine” for Chihuahuas

    Once I get my results I think I should be able to modify the search results from a custom right click menu. Furthermore the right click menu should change depending on what I mouse over. The intent is to give the user much more control over the search and access to the power of the database.

    If you had this you could get rid of the advanced search and preferences and build them right into a custom right click menu (or small menu that acts like it).

    The options for the right-click menu are endless, but they need to be specific to what I am mousing over. Here are a few ideas.

    Right-Click Menu Items, Listed by What You’re Mousing Over.

    a) While Mousing Over a Result:

    1. Suggest keywords (reload page and lists the users keywords with a bunch of relevant keyword links, for him to click on) this is a good option! Links should repeat where the spelling correction text appears.
    2. Pages like this results.
    3. Advance Search (and don’t lose the persons search results if you load that page)
    4. More results per page (add 20 sites to the page)
    5. Fewer results per page (cut results by 20 sites)
    6. Go forward a page
    7. Go back a page
    8. Show results without a keyword (reload page and adds a separate box for user to add a keyword) Box should appear where the spelling correction text appears.
    9. Remove the site descriptions (reload page and add 30 or more sites, but with just the headers, making a much longer list)
    10. Add the site descriptions back (reload page and put it back to its normal length)
    11. And a zillion other things that the smart people at Google could think of that help people find things quicker
    12. Restore defaults. Just set everything back to normal. The most important one!

    b) While Mousing Over the box with the users keywords:

    1. Depending on the keyword (or phrase) your mouse is over, it offers to add quotation marks around it and the keyword in front or behind. (Reloads the page and adds quotes around both the selected keyword and the adjacent keyword or phrase.)
    2. Depending on the keyword (or phrase) your mouse is over, it offers to remove quotation marks around it and the keyword in front or behind. (Reloads the page and removes those quotes.)
    3. Offers to add the “site:”, “url:”, “linkto:” in front of the keyword your mouse is over.
    4. Offers to delete the keyword (reloads the page without the keyword)

    5. Offers long list of extra relevant keywords that make good sense. (clicking on one reloads the page and adds the keyword) (especially ones that google knows the user might want) Sometimes my brain isn’t working perfectly and I need help.

    c) While Mousing Over the google logo

    1. quick links to all of google other stuff ( the first one is “google search” that returns you to the google search page)

  51. I’ve talked about this on Webmaster World, but repetition never hurts (at least when practicing skating figures or piano scales):

    I’d like to see Google offer a simple way to skew search results toward “information” or “commercial.” For example, there might be radio buttons or separate windows for:

    “I’m searching for information on…”

    “I’m shopping for…”

    The resulting SERPs would be weighted slightly toward information or e-commerce/affiliate pages, based on secret Google criteria such as the absence or presence of a shopping cart or the page-layout characteristics. Note that I’m *not* suggesting separate indexes or a “Chinese wall” between information and commercial results; I’m merely suggesting a slightly different *weighting* of the results, based on the user’s expressed intent.

    Google could continue to have general search (i.e., what it has now) as the default, or it could have only the “information” and “commercial” options with a default to one or the other (probably information, which is the focus of Google’s corporate mission statement).

    Letting users emphasize “information” or “commercial” results would make Google’s SERPs far more useful than the unwieldy keyword-based raw data dump that users get now. (Web publishers and e-commerce businesses would benefit, too, because they’d have a better chance of reaching readers or prospects who are interested in what they have to offer.)

  52. I would really like some kind of “NEAR” search parameter — e.g., [“this phrase” NEAR “another phrase”]. I know “nearness” is a tough quality to define in code, but hey, you’ve got one or two really smart people working there!
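
    In case it helps, here is the sort of token-distance check I have in mind – just my own toy definition of “near” in Python, not a claim about how Google would implement it:

    import re

    def near(text, phrase_a, phrase_b, max_gap=10):
        # True if both phrases occur and some pair of occurrences is separated
        # by at most max_gap words (a made-up, illustrative notion of NEAR).
        tokens = re.findall(r"\w+", text.lower())

        def positions(phrase):
            words = phrase.lower().split()
            return [i for i in range(len(tokens) - len(words) + 1)
                    if tokens[i:i + len(words)] == words]

        return any(abs(a - b) <= max_gap
                   for a in positions(phrase_a) for b in positions(phrase_b))

    doc = "The widget review covers build quality and, much later in the text, battery life."
    print(near(doc, "widget review", "battery life", max_gap=12))  # True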

  53. I think there is a treasure trove of information out there that appears to be ignored by Google — it’s on public web-based discussion boards (phpBB, vBulletin, etc.). I am often searching for something that I know someone somewhere has to have mentioned on one of these boards, and I just don’t seem to find them.

    Also I would love to be able to tell it what kind of search I am doing – if I want to buy something versus if I want to find out how to do something, etc. I do “how to” searches that come up dry all the time.

    I really don’t care for the adwords that seem to show up for every single search word and bring up basically generic affiliate links pages. I am not sure what this is called but I see it in practically every search I do. It makes me hesitant to trust any of the links I see over there on the right.

  54. Hi,

    You opened this thread after I had made my suggestions at http://www.mattcutts.com/blog/bigdaddy/#comment-8716.

    Not being sure if it would be ok to transplant the whole comment to here I am just leaving a reference.

    Just in case you don’t remember: I am discussing search engines in Latin America, although the principle of my questions and suggestions actually applies to national versions of Google almost everywhere outside the English-speaking world.

    Thanks,

    Luis

  55. Don’t put Google AdSense *above* the results. I understand on the right, but over the top results?

    Most users (like my mom) aren’t very computer savvy. So yes, they click on your ads. Which IMO are NOT as relevant as the top spots (which is a thumbs up to your search results).

    Right now it’s the HIGHEST BIDDER that is most relevant to many users, which makes me ill…

  56. 1) For the love of God, please get rid of DMOZ as a source of results. Especially now that MSN has decided to use DMOZ descriptions.

    2) If you’re going to tackle paid text links (which I am so down with), what about paid-inclusion directories as well? I’m sorry, but most of those “guaranteed submission or money refunded” clauses never apply. In about 99.9999% of cases, a site’s getting in there anyway.

    3) Consider more directory sources (particularly manual free submission directory sources) and give weight to those.

    4) By the same extension, consider rewarding those who contribute both to big G (in the form of this blog or otherwise) and to the Internet community as a whole (people who have attained high ranking/mod status in high-traffic forums, people who write articles that are distributed, etc.) I’m not just saying this because these are things I do, but because they’re things a lot of people are doing to contribute something positive and help others.

    I’m not really sure how you algorithmically code that, but then again, you’re the engineer so that’s your issue. 🙂

    You might even wanna go one step further and give some love to guys like Aaron Pratt and Dave (if he has a site) and the other regulars up in dis piece.

    5) Deeper regionalization (state/province or city, for example.)

    6) Judgement of the origin of a site not by hosting IP address or TLD, but by address found on a contact page.

    For example, many of my clients don’t show up in Google Canada searches because they have .COMs and are hosted on Sectorlink servers. They shouldn’t have to be excluded for choosing an American host because Canadian hosts are garbage by comparison (I’m sorry, fellow Canucks, but it’s true. Try Sectorlink for a month after being with Blue Genesis, Black Sun or that one out of Alberta I forget the name of and you’ll see what I mean.)

    Anyway, those are top of my head thoughts.

  57. Some changes I would like to see

    – being able to search for informational sites only
    – search for commercial sites should be geo-targeted, either by specific location / region or by number of miles / km around a given location

  58. Stupid thought, but maybe some proprietary meta tags for geotargeting and categorical indexing of websites?

    e.g. <meta tag=”location” content=”Toronto (City), Ontario (Province), Canada (Country)” />

  59. Please get rid of DMOZ. That is all I ask. DMOZ editors are lame losers; they don’t do anything and make us wait 2 years, saying “don’t resubmit, it can only do bad, just wait.” There is no communication between Google and webmasters, nor between DMOZ and webmasters. We need to increase this communication – the Internet is for communicating, no? So do us a favour. Take out DMOZ and increase communication between webmasters and Google.

  60. Upgrade the weight a link from IMDb carries in designating the ‘official website’ for actors.

    IMDb is obviously an authority in determining which is the official site and trusted commentator.

    Doing so will improve the quality of results for name searches for actors’ sites.

    Really, the official site should be the #1 result, at worst #2 result with very little more than a link from IMDb.

  61. Hi Matt,

    I think that Google should continue to show the description in the snippets when possible, as the site owner knows best how to tell the consumer what the page is about.

    I used to be in the Yahoo directory and then asked them to remove my site, not because of the money but because they would show their own title and description in their results. Those are put up by their employees, who know nothing about my industry and what the page is really about.

    Big Daddy!

  62. ** NOT FOR PUBLICATION**

    Matt,
    (Nigel Johnstone/me@nigeljohnstone.com)

    I have a lot of suggestions for you, some you may already have, but bear with me. Congrats on the Video store, BTW. impressive.

    ———————————–
    Apart from the full stemming you mentioned, how about also understanding dates? E.g. 20th Feb is the same as February 20 is the same as 20/2/????. You’d have to cope with uncertain dates too: 2/3/2005 = 3/2/2005, just because you can’t know what was intended.
    So I search for
    [holidays 20th January] and it treats all these as matches:
    [holidays 20 Jan]
    [holidays 20/1/2005]
    [holidays 20/1/2006]
    [holidays 20/1/…]
    [holidays 20/1/05]

    Basically it would understand dates and partial dates.
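
    To make it concrete, here is a toy normalizer for the variants above – my own guess at an approach, using only the Python standard library and nothing like real query parsing:

    import re
    from datetime import datetime

    # Strip ordinal suffixes, try a handful of formats, and keep only
    # (day, month) so year-less and year-bearing forms all match.
    FORMATS = ["%d %B", "%d %b", "%B %d", "%b %d", "%d/%m/%Y", "%d/%m/%y"]

    def normalize_date(text):
        cleaned = re.sub(r"(\d+)(st|nd|rd|th)\b", r"\1", text.strip().lower())
        for fmt in FORMATS:
            try:
                dt = datetime.strptime(cleaned, fmt)
                return (dt.day, dt.month)
            except ValueError:
                continue
        return None

    for q in ["20th January", "20 Jan", "20/1/2005", "20/1/05"]:
        print(q, "->", normalize_date(q))   # each prints (20, 1)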

    ———————————–
    How about relative dates?:
    [yesterday’s football results]
    Converts to
    [((“yesterday’s”) or (Date:8/1/2006)) AND football AND results]

    Imagine it would pull up results that either mention the word “yesterday’s”, or have yesterday’s value (the *date* of yesterday).

    Comprehension of “tomorrow”, “last week” etc. too
    —————————————-

    Date calculation in the calculator.
    [january 20th 2006 + 1 month + 40 days] = ?

    —————————————-

    Still waiting for a previous wish item I’ve bunged into Google, an understanding of density of common food items & materials!
    [1 cup in litres] works fine, but [1 cup flour in grams] doesn’t, yet the density of flour is known. It would be nice for chefs!

    —————————————-
    Common English phrase comprehension (perhaps you do this already?)

    [bear with me cars]
    “bear with me” is an English phrase, so it would be treated as
    [(bear NEAR with NEAR me) AND cars]
    i.e. it would give extra weight to those words being close together because “bear with me” is a common English phrase.

    For *very* close bound phrases:
    [knick knack chocolate] -> [“knick knack” chocolate] (Again, I’m sure there are phrases that illustrate this much better, but knick knack is all I can think of right now.) I.e. it would bind knick and knack very tightly, since that’s how the phrase exists in English.

    Likewise for verb infinitives [to be],[to run] etc.

    —————————————-

    Now for the most complicated bugger, page rank.

    I think that sites can make their own PageRank by creating pages internally. (Correct, or has this changed?) I also suspect they leak PR by giving outbound links, a strong disincentive to give proper links. I’m not a fan of either. I think the internal PR problem creates mega-sites of fluff, and the PR penalty causes people not to give links, making it more difficult to rank. I see you guys have been trying to reduce the fluff pages (possibly with the intent of tackling this).

    Have you considered doing this approach:
    http://www.nigeljohnstone.com/archives/2006/01/how_about_somet.html

    Simply weighting the total page rank of groups of sites, so that the total internal rank of that group is equal to the total incoming rank into that group. (Read that sentence carefully, it explains the algo used on that page I just linked to).

    It seems to fix both those problems nicely and is beautifully simple. It also nicely spreads rank more thinly over spammy pages, pushing them down the ranks even further. Look at the filler pages X3, X4, X4, X6… etc.
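
    A crude sketch of how I picture the rescaling (my own simplified reading of the idea, not the exact algo on that page and certainly not anything Google does): compute ordinary PageRank first, then scale every page of a site so that the site’s internal total equals the rank flowing in from other sites.

    def rescale_by_site(pagerank, links, site_of):
        # pagerank: {page: score}; links: {page: [pages it links to]};
        # site_of: {page: site id}. Internal fluff gets spread thin because
        # only rank arriving from *other* sites counts toward a site's total.
        internal, incoming = {}, {}
        for page, score in pagerank.items():
            internal[site_of[page]] = internal.get(site_of[page], 0.0) + score
        for page, outs in links.items():
            for target in outs:
                if site_of[target] != site_of[page]:
                    incoming[site_of[target]] = (incoming.get(site_of[target], 0.0)
                                                 + pagerank[page] / len(outs))
        adjusted = {}
        for page, score in pagerank.items():
            site = site_of[page]
            factor = incoming.get(site, 0.0) / internal[site] if internal[site] else 0.0
            adjusted[page] = score * factor
        return adjusted

    pr = {"a1": 1.0, "b1": 0.6, "b2": 0.6}
    links = {"a1": ["b1"], "b1": ["b2", "a1"], "b2": ["b1"]}
    print(rescale_by_site(pr, links, {"a1": "A", "b1": "B", "b2": "B"}))
    # -> roughly {'a1': 0.3, 'b1': 0.5, 'b2': 0.5}: site B's 1.2 of self-made
    #    rank shrinks to the 1.0 it actually earns from site A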

    —————————————-

    Finally, have you ever considered convincing those higher up to drop the MSc/PhD requirements for jobs? I’d love to work for you guys, but never get past the degree prefilter! How much do I have to pay you guys to let me play?! 🙂

  63. It would be very nice if on the Google results page there were a small icon, a link next to every result item: “open in new window” (like Yahoo). I think it’s very useful.

  64. Since you’re looking for a specific type of search, I’ve noticed that, like MSN, the home page tends to carry a lot more weight than subpages even when the latter are more relevant to the search.

    For example:

    http://66.249.93.104/search?q=silk+flower+wedding+packages&hl=en&lr=&start=10&sa=N

    The last listing is a client of mine that I’ve brought up here a few times before. Now, while it’s great that the site itself is in there, would it not stand to reason that the silk flower wedding packages page is more relevant simply because it’s where the packages are?

    Also, still issues with upper/lowercase.

  65. Dead Link Removal (Not Your Own Site)

    Just because a website has an active link somewhere out there is no reason to keep it in your index for several years. And it certainly shouldn’t rank high in the SERPs! An old-fashioned “Report Dead Link” tool would help clean up the index. If a site is offline for more than a few months, there’s no reason to keep it in your index, no matter how many links point to it.

  66. I think a useful feature on Google would be the ability for searchers to put a phrase they have typed in “quotation marks” at the click of a special button next to the search box. It’s what everyone wants to do, but most people don’t know about putting things “in quotes.” Most serious searchers use quotes, so why can’t this option be available “at a click”?

  67. 2 operators pop into my mind:
    – name:
    name:”Matt Cutts” should bring up results for “Matthew Cutts”, “Matt Cutts”, “Cutts Matt”, ….

    – allinsource:
    allinsource:”rel=nofollow” should bring up sites that use link-condoms

  68. One thing I would like to see go away in the search results is the website ripoffreport.com. It may have started off as a decent site 10 years ago, but it is now just one giant flame board with no concern for the truth or accuracy of the posts. One disgruntled employee can ruin the online reputation of a small business in as little as a couple of weeks. Or if your ex-boyfriend/girlfriend wants, they can post that you are a drug addict and/or prostitute. Or any other possible scenario that you can think of.

  69. When searching for anything, I am relatively happy with the results because I have experience in searching. However, when searching locally (Local Search) the results are awful – most results are taken from local yellow pages and most of those results don’t even have websites.

    I really hope Google does not start imposing local results into the serps. When I am online, I want to find a website, not a telephone number and directions – if I wanted that I would go to a yellow pages site, or just the yellow pages.

    Not to mention it would begin the degeneration of ‘mail order’ websites, and open up a whole other way of clogging up serps with spam (keyword-stuffing pages with city and state names).

    Just a note about results I am not happy with, and frankly would prefer not to see.

    Thanks

  70. One other thing I’ve thought of, which falls in a similar vein to the purchased text links that you hate so very, very much (although personally, I consider this far more insidious):

    http://www.placesafar.com/

    The site itself isn’t what I have the issue with. I think it’s a great site to look at and has a lot of cool art.

    But it’s what’s at the bottom (“Website Design by Broadwave International”) that bothers me.

    Web design companies and SEO-types shouldn’t have to draw traffic from their own customers if they’re that good.

    I suggest elimination of any search engine benefits created by “Powered by Such and Such” or “Web Design by XXX Company” or “Search Engine Optimization Lower Slobovia” or whatever the out-of-place phrase is that’s being used.

    In the case of the example above, they don’t seem to use anything at all because it’s an image with no alt tag description…so, by the same token, image links containing no description would go as well.

    In other words, any SERP increase assigned by the following phrases (and any similar phrases I neglected to mention) would be negated:

    null phrase
    “Powered By”
    “Web Design By”
    “Search Engine Optimization”
    “Designed By”
    “Custom Site Design By”
    “Custom Website Design By”

    You get the idea.
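
    Something like the little classifier below is what I am picturing – purely illustrative, assuming the engine already has each link’s anchor (or alt) text handy, and obviously not how Google actually does anything:

    import re

    # Anchor-text patterns for designer/SEO credit links whose ranking benefit
    # would be negated; an empty anchor (an image link with no alt text, as in
    # the example above) is treated the same way.
    CREDIT_PATTERNS = [
        r"\bpowered by\b",
        r"\bweb(site)? design(ed)? by\b",
        r"\bdesigned by\b",
        r"\bcustom (web)?site design by\b",
        r"\bsearch engine optimization\b",
    ]

    def is_credit_link(anchor_text):
        text = anchor_text.strip().lower()
        if not text:
            return True
        return any(re.search(p, text) for p in CREDIT_PATTERNS)

    print(is_credit_link("Website Design by Broadwave International"))  # True
    print(is_credit_link("Read the full article"))                      # False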

  71. How about indexing more of a domain? I try to give people search on forums at our web site, but only some people get any search results for their forum (a unique subdirectory of the site), so it comes out kinda useless.

  72. Thanks for asking….

    I’d like to see a ‘find:’ operator that would land me on the first slice of the result set with a result from a URL I specify. For example, if the URL I include is result number 42 for my query, drop me onto page 5 of the result set. Assuming the URL sometimes matches more than one result (maybe it’s a domain name, for example), the ability to get to “Next” and “Previous” matches would also be helpful.

  73. Matt, one issue I have noticed on a few occasions is that a page, or two, from my site ranks above the one that those two link to; it’s probably down to PR and link pop etc. I’m sure the same happens to millions of sites.

    THE PROBLEM

    For example, suppose I have a homepage and one other page that links to my “free blue widgets” page. Now, naturally searchers would prefer to find my “free blue widgets” page by clicking a link in the Google SERPs and not land on my homepage and hunt down the link.

    SOLUTION

    Searchers can use negative keywords to narrow down their search, so why not allow webmasters to add negative keywords? Then, as in the above example, I could add “-free blue widgets” to the 2 pages currently ranking, and that should allow the real page to float closer to where the other 2 pages were.

    IMO it would be a win for searchers and site owners, and it would go a long way to making Google even more relevant.

  74. It’s amazing to me that Google is still rewarding ‘mini-networks of sites’. There are companies out there that are still setting up sites, buying $300 directory listings for all their sites, and being rewarded for it.

    I would love the opportunity to show you a specific example, but since this is a public area I’m hesitant to ‘turn someone in’.

  75. Bill, IMO Google does not give PR, link pop etc for links from paid (or even free) directories. This is due to Google stating in writing one cannot buy PR and link pop.

    When one submits to directories, one should only pay for click traffic. In other words, there is rarely a need to submit to paid directories, as most cannot deliver traffic. Also, IMO, free directories do not pass PR, with DMOZ itself (not all its downstream users) possibly one of the only ones that does.

    I know if I were Google, a site that resides in more paid (or free) directories would NOT become more relevant or authoritative. It makes no sense and would contradict Google’s statements.

  76. I’d say G’s (and most SEs’) rankings are calculated based on an inconsistent assumption. Yes, I agree with the basic philosophy behind PageRank (which basically ranks pages according to the “Wisdom of Crowds”). However, the “crowds” here are webmasters, while the search results are aimed at users. In short, we can see an inconsistency here, as the ranking factors are determined by webmasters while the majority of users who search for results are non-webmasters.

    So, why not use the end users as the “crowds” to rank pages instead of backlinks provided by webmasters? In the past, gathering info from millions of users might have been infeasible, so to some extent backlinks have been used as a proxy. However, with the advancement of G’s data centers and algo, I believe G could now gather enough info from users (from Toolbar, Analytics, etc.) and use that info to determine rankings. Factors that could be used for ranking include: bookmark ratio, avg page views, avg re-visits, time spent on sites, etc…

    I think this shift in paradigm could deter spamming, as it’s much more difficult to “bribe” thousands of users to revisit/bookmark your sites. Even if spammers try to manipulate results by revisiting their own sites often and for longer, the use of statistical analysis could easily detect such “outliers” and remove them from the calculation.
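
    For what it’s worth, the “remove the outliers” step might look something like this toy sketch – my own illustration, using a simple z-score cutoff on per-user time-on-site before averaging; a real system would be far more sophisticated:

    import statistics

    def robust_engagement(times_on_site, z_cutoff=3.0):
        # Average time-on-site after dropping extreme values, e.g. a spammer
        # repeatedly "visiting" his own page for hours to inflate the signal.
        if len(times_on_site) < 3:
            return statistics.mean(times_on_site)
        mu = statistics.mean(times_on_site)
        sigma = statistics.stdev(times_on_site)
        kept = [t for t in times_on_site
                if sigma == 0 or abs(t - mu) / sigma <= z_cutoff]
        return statistics.mean(kept)

    # 200 ordinary visits of 1.5-3 minutes plus three manipulated 2-hour "visits"
    visits = [90, 120, 150, 180] * 50 + [7200, 7200, 7200]
    print(robust_engagement(visits))   # ~135 seconds, the honest average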

  77. how about query like:

    “word1*1.0 word2*0.5 word3*0.2”

    so that we can explicitly give a weight to each word in the query?
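
    a tiny sketch of what I mean – the syntax and scoring are hypothetical, of course:

    # Parse "word*weight" terms and score a document by the weighted count
    # of matching words; plain words default to weight 1.0.
    def parse_weighted_query(query):
        terms = {}
        for part in query.split():
            word, _, weight = part.partition("*")
            terms[word.lower()] = float(weight) if weight else 1.0
        return terms

    def score(document, terms):
        words = document.lower().split()
        return sum(w * words.count(t) for t, w in terms.items())

    terms = parse_weighted_query("word1*1.0 word2*0.5 word3*0.2")
    print(score("word1 word2 word2 word3", terms))   # 1.0 + 2*0.5 + 0.2 = 2.2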

  78. It’s amazing to me that Google is still rewarding ‘mini-networks of sites’. There are companies out there that are still setting up sites, buying $300 directory listings for all their sites, and being rewarded for it.

    I would love the opportunity to show you a specific example, but since this is a public area I’m hesitant to ‘turn someone in’.

    I agree with this thought totally, especially when they set up their own “directory sites” with very little to no content other than their own network and promote the dirs across the board.

  79. I’d really like to see Google increase their daily API quota from 1000 queries to at least 5000 (to match Yahoo). This would certainly help researchers. Also it would be nice if they would do like Yahoo and MSN and enforce the query quota per IP address. This would allow applications to be distributed to users without requiring them to register for Google IDs in order to make the apps work.

  80. I like the idea of having a “page history” type of option. I know recently I have done searches and thought I had found the perfect result. When I clicked, it was a blog or news site and the topic I thought I had found was no longer on the page. Then I had to search all over again on the site I found, sometimes to no avail. I have started going to the cached versions in situations like this, but not everyone would know how to do this.

  81. So glad that you asked this one; I am turning more towards Yahoo and MSN of late. Why? Because their searches have been more relevant; Google returns far too many directories in my opinion… If I wanted a directory of sites I would go to one. The problem with searching a string or search term is that Google picks out a line or word from 3-4 different sites in the directory… I want a site that contains each of the words, not a directory of sites…

    Richard

  82. Hey Matt,

    I just found something so obscure and strange other people have tried this before (including a client of mine).

    The search is in Spanish, so as a result it has to do with the ü character.

    Bilingue

    Bilingüe

    Although if you want to wait a while to fix this one, it’d be cool with me. I’m redesigning a site in Spanish with search, and it’d be cool to say my search can do something Google’s can’t.

    So yeah…take your time. No rush. Have fun in San Diego. Take an extra week’s vacation. 😉

  83. I’m not sure why my comments were deleted (is it b/c you aren’t approving new posters?). But I think there should be an events search operator. So when I’d go events:Oasis I’d see info about their tour dates, etc. Or when I go events:arts cleveland I’d see arts related events in Cleveland.

  84. Please make the ‘link:’ operator return all links, not just a random subset.
    Besides that, I’m happy with the way Google works.

  85. 1. Get rid of the we-judge-you-all-guilty-until-proven-innocent “Sandbox Effect” or at least explain your criteria for holding back new sites. Besides stunting the natural growth of the web, you are often depriving searchers of what might be the most timely and relevant content available. Besides, it’s damn un-American.

    2. Keep kicking the stuffing out of scraper sites. They are a cancer.

    3. Do something about DMOZ or dump it. Unless it’s an engagement to Natalie Portman, twelve months or more is just too long to wait.

    4. As you continue to increase your reliance on off-page ranking factors, I find myself forced to increase my reliance on quotation marks–and I’m just too damn lazy to use the shift key.

    5. Develop and implement a formula to devalue sites in which advertising outweighs content. I am tired of visiting well-ranked pages capable of triggering epileptic fits.

    6. Continue to devalue link trading. I’m tired of getting e-mails from strangers in India who are impressed with my site.

    7. Stop making people scroll to see the number one search result. I remember the days when it was near the top of the page.

    8. eBay pages should only be returned in searches for “tax evasion.”

    9. Keep looking for ways to improve the results. It gives us an excuse to get out of bed in the morning.

    10. Results should not be influenced by good-looks, charm or personality. If I search for “Matt Cutts,” I want results for Matt Cutts, not Matt Damon.

    😉

    Thanks for listening.

    Robert

  86. 1. Get rid of the we-judge-you-all-guilty-until-proven-innocent “Sandbox Effect” or at least explain your criteria for holding back new sites. Besides stunting the natural growth of the web, you are often depriving searchers of what might be the most timely and relevant content available. Besides, it’s damn un-American.

    Sadly, this is very Canadian. Any forward thinkers with new ideas get shot down over 99% of the time.

  87. The Google Toolbar has voting buttons, which I think was a great idea but has probably gone unused. If you want to improve the quality of Google, here’s an idea for you.

    Why not make an advanced version of the toolbar (or add an advanced feature to the current one) whereby known (registered) webmasters can provide feedback on the fly? Not a day (sometimes not even a minute) goes by where I don’t do a search only to receive a few good listings with a bunch of crud mixed in. Deceptive redirects, attempts to download ActiveX spyware, expired domains redirecting to sites whose only content is paid links, 404s, etc. You name it, the Google listings “can” be full of garbage.

    So right now, as I see it, you’re trying to catch this stuff during indexing and then sometime later play catch-up with your actual listings. However, if people like myself were allowed to click a button on the toolbar that instantly notified you of the problem, and the toolbar provided all the details (search phrase, datacenter, offending URL before the redirect, etc.), then you could send out Googlebot immediately and quickly update your listings. Well, I’m assuming you could update your listings that quickly. Because if you want to improve the results in Google, it can’t be from an index that is two weeks old, and that’s what I frequently see when I search Google.

    You can’t index every page every day but you can cut down on the garbage by allowing trusted individuals to report it back to you quickly. I’ve heard you’re doing something like this in a personalized search environment but I’m not sure if it’s the same.

    For what it’s worth.

    KJ

  88. Maybe it was the initial flawed concept that the older the site, the better. I wish Google would keep the engine up to date. I have started using MSN because its listings are more “fresh”.

  89. I don’t think this is spam related, but I’ll leave that for you to decide.

    I hate the way Google allows the same page to be listed more than once by recognising affiliate links to the same URL as separate URLs. I thought this was cured some time ago.

    This morning, for example, I was searching for info on the Rich Jerk e-book, duly typed in Rich Jerk, and the first two listings were ClickBank links to the same page, only the hop=XXX being different.

    Other than that, Google is still way ahead of the other SEs.
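
    A minimal sketch, in Python, of the kind of parameter stripping that would collapse the duplicate ClickBank listings described above into one result. The example.com URLs and the exact parameter list are placeholders, not Google’s actual canonicalization logic.

        from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

        # Parameters assumed to identify only the referring affiliate, not the content.
        TRACKING_PARAMS = {"hop"}  # e.g. ClickBank's hop=XXX mentioned above

        def canonicalize(url):
            # Rebuild the URL without the tracking parameters so affiliate
            # variants of the same page reduce to a single canonical URL.
            scheme, netloc, path, query, _fragment = urlsplit(url)
            kept = [(k, v) for k, v in parse_qsl(query) if k not in TRACKING_PARAMS]
            return urlunsplit((scheme, netloc, path, urlencode(kept), ""))

        # Both affiliate links collapse to http://example.com/offer
        assert (canonicalize("http://example.com/offer?hop=affiliate1")
                == canonicalize("http://example.com/offer?hop=affiliate2"))

    With a canonical form like this, the two clickbank listings would be deduplicated before ranking rather than shown as the top two results.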

  90. The biggest problem when you’re doing research is that Google keeps returning the same web sites, usually sites with the highest PageRank or sites that are highly optimized to rank for phrases involving that keyword phrase. These big web sites dominate the results.

    One big improvement would simply be an option to search for the exact string. Most people won’t use operators such as the double quotes that specify an exact phrase; many won’t use operators at all. So a search option that says “filter results with exact phrase” would help return results that are more directly relevant. Sure, there are 2.7 million pages that have san jose apartments, but there are 311,000 that have the exact phrase, so you would filter out a lot of the big sites that contaminate the results because of their PageRank and the limitations of Google’s algo.

  91. An easier-to-use daterange: operator would be quite cool.

    You are seriously behind with regard to the operators you support. I go to Yahoo! and MSN solely to use the operators linkdomain: and ip:, not to mention linkextension:, which is really useful. There are also region: on Yahoo! and location: on MSN, which can be useful for power searchers. And you still don’t show enough backlinks.

    Plus your blog search doesn’t seem to support any of the normal operators, which is quite frustrating.

    Oh, and lastly, your site:somewhere.com/directory quite often doesn’t seem to work properly.

  92. I host a popular NFP site and we used FDSE site search, until I discovered Google’s Public Service Search (free, without ads).

    As soon as I was authorised with the G code on our site, I was alarmed to see it showed 12,000 supplemental results in G (our site has never had more than 1,000 pages, and none showed as supps previously).

    Activating the PSS had caused G to ‘find’ and dump 11,000 supplemental results into the main index.

    Worse still, the PSS returns the dead and non-existent supp results ABOVE the correct pages, rendering the PSS useless for our site.

    I had to remove it after three days, and revert to FDSE.

    Please get these dead Supps out of G.

    Before I opened the PSS account, site:ourdomain returned a true 600-page count, with no supps.

    After I opened the PSS, site:ourdomain returned a 12,000-page count, mostly dead Supps for pages that never existed.

    We fixed canonical problems months ago, and the cache on the Supps was over a year old.

    Please get these dead Supps out of G. They are ruining PSS.

    I thought G’s PSS would be better than FDSE, but it is nowhere near as good.

  93. I’m writing from Argentina, so I apologize if I make some mistakes in this email; English is not my native language.
    I remember that a few months ago you posted on your blog an invitation to suggest ideas about searching.
    Last night I was trying to find some information about a specific law of my country.
    So I used something like “ley 136376/76” in Spanish (that wasn’t the real number). I found a lot of pages, but the problem was how to tell which pages contain the actual text of the law, as opposed to pages that simply talk about it, and also how to find the latest update. So I think Google could work on this with certain commands like “law:nnnnnn #1976”, where nnnnnn is the number of the law and what follows the # is the year of publication. I know I’m talking about my country’s particular way of naming laws, but I also know every country has a lot of information that citizens need to search for and find with added value. Thanks, Matt.
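
    A rough sketch, in Python, of how the suggested operator could be pulled out of a query string. The law:<number> #<year> syntax is the hypothetical one proposed in the comment above, not a real Google operator.

        import re

        # Hypothetical operator syntax from the suggestion above: law:<number> #<year>
        LAW_QUERY = re.compile(r"law:\s*(?P<number>[\d/]+)\s*#\s*(?P<year>\d{4})")

        def parse_law_query(query):
            # Extract the law number and publication year so the engine could
            # restrict results to the official text of that specific law.
            m = LAW_QUERY.search(query)
            return (m.group("number"), m.group("year")) if m else None

        print(parse_law_query("law:136376/76 #1976"))  # ('136376/76', '1976')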

  94. It would be useful to include a little search box to the right of the top result, if that webpage’s primary feature is search and if they want you to add it. For example, a search box next to Yahoo Finance, Wikipedia, Amazon, Google Blog Search, or Google Video.

    If you aren’t going to add all of these search categories alongside your primary ones, you could at least make them easier to access.

    Also, I love the “I’m Feeling Lucky” button, but I usually search from a toolbar. It would be great if I could enter, say, the letters “ifl” before my actual search if I wanted to be taken directly to the first result.
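
    A small sketch, in Python, of how a toolbar could honor such an “ifl” prefix by building an I’m Feeling Lucky URL. The btnI parameter is the one the I’m Feeling Lucky button has historically submitted; treat the exact URL format here as an assumption for illustration.

        from urllib.parse import urlencode

        def toolbar_search_url(raw_query):
            # An "ifl " prefix means "take me straight to the first result";
            # btnI is the parameter the I'm Feeling Lucky button submits.
            if raw_query.startswith("ifl "):
                params = {"q": raw_query[len("ifl "):], "btnI": "1"}
            else:
                params = {"q": raw_query}
            return "http://www.google.com/search?" + urlencode(params)

        print(toolbar_search_url("ifl matt cutts blog"))
        # http://www.google.com/search?q=matt+cutts+blog&btnI=1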

  95. Just focus on web quality; that’s not the case now with google.com.

  96. I am facing a lot of problems with Google Sitemaps, as many sites that I have submitted have dropped in the rankings. I hope you can come out with a better version.

  97. To paraphrase the old radio shows: “long time reader, first time writer”

    My wish for ’07? Fix “Pages from the UK” (or wherever in the world). Please!

    I’ve seen quite a few sites suddenly expunged from these results for apparently no good reason. One week they’re sitting pretty for one of their key search terms, then the next…. they’re gone!

    It always seems to be UK-hosted, registered and maintained sites that use a .com TLD. For marketing reasons, most UK companies like to use the .com if it’s available (easier to remember, seems more ‘authoritative’, etc.) and the SERPs show plenty of .coms in the results, but some sites just suddenly vanish regardless of their quality and provenance. I mean literally from page 1 to nothing. The hosting is in the UK, on a UK IP address with UK-centric content for a largely UK market, with a presence established for a couple of years… no spam… webmaster tools reporting no penalties or problems… just gone, without so much as a goodbye. Some of the pages carry on being indexed, but search for “site:www.domain.com” with “pages from the UK” and the homepage isn’t indexed. Very. Very. Odd.

    If the anecdotal evidence of a 45/55% split in use of the ‘pages from…’ feature is anything like accurate, this could be a *major* commercial concern for a lot of people. From looking at the webmaster forums, it seems people have been reporting this to Google from all over the world for a good 4-5 months, and it’s a pretty serious flaw IMO.

    Happy new year, by the way!

  98. This is related to the previous comment, about regional search.

    Is it possible to tell Google which country to index a page for? For example, I have a .com hosted in Italy. At present all pages only get indexed in google.com, but the content on some pages is for the UK only, so I want those pages indexed in google.co.uk only (not in google.com), and other pages are for Spanish speakers in Spain, so I want those pages indexed in google.es (not google.com or google.co.uk). How do I tell Google where to list each page? I hope you don’t get offended by a direct question. (P.S. I read your article in dotnet mag; very interesting.) (P.P.S. I thought I would add that to save you shouting at me.)

    nicholas
