Google 2000 vs. Google 2011

I sometimes hear people say “Remember when Google launched and the results were so good? Google didn’t have any spam back then. Man, I wish we could go back to those days.” I know where those people are coming from. I was in grad school in 1999, and I remember that Google’s quality blew me away after just a few searches.

But it’s a misconception that there was no spam on Google back then. Google in 2000 looked great in comparison with other engines at the time, but Google 2011 is much better than Google 2000. I know because back in October 2000 I sent 40,000+ queries to google.com and saved the results as a sort of search time capsule. Take a query like [buy domain name]. Google’s current search results aren’t perfect, but the page returns several good resources as well as some places to actually buy a domain name. Here’s what Google returned for that query in 2000:

URL_1:http://buy-domain-name.domain-searcher.com/domains/buy-domain-name.shtml
URL_2:http://buy-domain-name.domain-searcher.com/buy-domain-name.shtml
URL_3:http://buy-domain.domain-searcher.com/domains/buy-domain.shtml
URL_4:http://buy-domain.domain-searcher.com/Map3.shtml
URL_5:http://domain-name-broker.domain-searcher.com/domains/domain-name-broker.shtml
URL_6:http://users5.50megs.com/buydomain32/
URL_7:http://users4.50megs.com/buydomain02/
URL_8:http://domain-name-service.domain-searcher.com/domains/domain-name-service.shtml
URL_9:http://domain-name-service.domain-searcher.com/Map2.shtml
URL_10:http://dns-id.co.uk/

Seven of the top 10 results all came from one domain, and the URLs look a little… well, let’s say fishy. In 1999 and early 2000, search engines would often return 50 results from the same domain in the search results. One nice change that Google introduced in February 2000 was “host crowding,” which showed only two results from each hostname (here’s what a hostname is). Suddenly, Google’s search results were much cleaner and more diverse! It was a really nice win; we even got email fan letters. Unfortunately, just a few months later people were creating multiple subdomains to get around host crowding, as the results above show. Google later added more robust code to prevent that sort of subdomain abuse and to ensure better diversity. That’s why it’s pretty much a wash now when deciding whether to use subdomains vs. subdirectories.
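The original host-crowding behavior can be sketched as a simple post-ranking filter over an ordered result list. This is a toy illustration, not Google's actual code; the function name and structure are assumptions:

```python
from urllib.parse import urlparse

def host_crowd(results, per_host=2):
    """Keep at most `per_host` results from each hostname,
    preserving rank order (a sketch of 2000-era host crowding)."""
    seen = {}
    kept = []
    for url in results:
        host = urlparse(url).hostname
        seen[host] = seen.get(host, 0) + 1
        if seen[host] <= per_host:
            kept.append(url)
    return kept
```

Note that keying on the full hostname is exactly what the subdomain spammers exploited: `buy-domain-name.domain-searcher.com` and `buy-domain.domain-searcher.com` count as different hosts, so each one gets its own two slots. The later, more robust fix had to group results by the registrable domain instead.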

Improving search quality is a process that never ends. I hope in another 10 years we look back and say “Wow, most queries were only a few words back then. And we had to type queries. How primitive!” Mostly I wanted to make the point that Google looked much cleaner compared to other search engines in 2000, but spam was absolutely an issue even back then. If someone harkens back to the golden, halcyon days when Google had no spam, take those memories with a grain of salt. 🙂

165 Responses to Google 2000 vs. Google 2011

  1. But Matt, last year you effectively dropped host crowding by putting up more results from a web site for some searches, rather than fewer. You know this is an issue that drives me insane. One result per site, subdomain, you name it — that’s all I need. Especially with the use of sitelinks, I simply don’t understand why Google can’t give me more variety in my results.

  2. Everyone always says things were better ‘back in the day’. I wonder why. Must be a weird quirk in human thinking. I think I’ll Google it to find out. Hopefully I don’t get any spam ;)

    Interesting wee article.

    Thanks.

  3. Google is getting worse. Choose whatever dates you want to compare, but the quality is degrading. My ability to find relevant information went from succeeding about 90% of the time to about 10% of the time.

    The noise surrounding this is intensifying because it is real. Why not spend your energy pointing out the problems with Google now and how you are fixing them? Why not identify why people are complaining? Or is that just their nature?

  4. I think the big issue is nostalgia … people want to believe things were better back in the day. And as things get harder and more competitive people long for opportunity they may have passed on.

    Also, so many folks are so much more aware of how search works today. The microscope is much larger & more people are looking into it. Companies like AOL leaking their business plans, Demand Media speaking theirs to the press, tons of SEO blogs offering tips, tens of thousands of people tracking almost any major change Google makes, etc. And, finally, there are a lot more folks working public relations angles in the press… sorta like how Jason smeared Squidoo until he got it whacked, so that he could then launch his own content farm.

  5. I don’t think there is anything wrong with Google’s search results. I always find what I’m looking for with Google.

  6. I just tried to send 40k queries to Google and now my IP is blocked. Thanks a lot!

  7. I’m with the other Sullivan on this. I did LOVE Google when it first came out (specifically when I got Danny’s email newsletter telling me about it) because the expectations were so low. Now that I’ve come to expect so much from Google (it is what I’ve used ever since), I am disappointed with some things, notably when too much real estate is devoted to one domain. Also, I know you must never stop innovating, but sometimes lately I just want to know what my SERP is going to look like from search to search (Instant, previews, social, etc.). But overall, very happy to be a Google user.

  8. Matt,

    Thanks for the look back. Google has come a long way, no question. But, just as soon as you present one solution, the next batch of issues and “work-arounds” begin to pop up. I look forward to seeing how Google handles mobile and location in the coming year. Learn from the past and adapt to changing browsing behaviors! Thanks – Hunter (@EHunterYoung)

  9. Matt

    Talking about spam: I think the Google WebSpam Team has been doing a great job in fighting spam in Google SERPs. But I feel the process of reporting spam still needs some improvements. For example, creating spam-reporting extensions for Firefox and IE similar to the one for Chrome. Also, the possibility to report spam through social media such as Twitter, directly by DM to the Google WebSpam Team, instead of reporting spam to you and other members of your team by “public” tweets.

  10. Danny Sullivan, we keep tuning things trying to find the right balance. For a navigational query like [ibm] or [hp] for example, it makes sense to surface more results from that site.

    Chris, I’m not saying that Google doesn’t have issues. Things like increasing index size or indexing new pages faster can impact spam, and we’re absolutely working on that. But expectations have also skyrocketed; I’ve seen people blame Google when the information just doesn’t appear to be on the web at all.

    Aaron, it’s definitely true that fewer companies/people were paying attention to search back in 2000. Now it seems like everyone is aware of it. The bright side in my mind is that when we looked at asking for user feedback in the past, people weren’t always savvy enough to spot spammy or low-quality content. As people become more aware of search quality, that potentially opens up new opportunities along the lines of Gmail’s spam filter where people tell us more about content they’d rather not see.

  11. I think in terms of a comparison with 2000, one alarming difference between results then and now is, as Aaron said, the ‘noise’ that appears in results for informational searches.

    The issue gained more exposure from Tim O’Reilly’s comments in a Washington Post piece a couple of weeks ago, when he said that he would now rather turn to Twitter than to Google for an informational search. If people lose confidence in Google for these searches, it could be a big piece of the search market to lose.

  12. I think in the past decade or so the number of spammers (and the amount of spam) must have increased exponentially. With the advent of mass SEO as well, the value of links has been realised, and this has led to thousands of hours around the world devoted to sneaky ways of link building and SERP manipulation. Considering the competition, I think Google does pretty well… it’s still my search engine of choice.

  13. Dear Matt,

    Google is amazing, no question. That said, inherent issues are growing as the paid programs grow. Search results are becoming increasingly muddy — especially due to the increase in directories present in the search results. I would love to see clean, clear results for simple searches and it feels like we are moving away from clear results.

  14. Host crowding was a great innovation but I know that many companies really want to leverage their own properties in brand-related searches and though Google’s solution isn’t perfect it’s better than trying a one-size-fits-all approach.

  15. As far as I know, search engines could use many different metrics, for example: % of first-result clicks, % of non-clicked SERPs, etc. Moreover, such metrics could show different results for each query type, query length, user region, etc.

    So, Matt, how did you measure the quality of Google’s search?
    Does Google use pointwise or listwise metrics?

    Thanks a lot for reply,

    PS. As for me, I like Google’s search for informational queries; it’s better than Yandex 🙂 But for transactional queries Yandex is much better, mainly due to market.yandex.ru and regional features.

  16. I specifically remember having to scrutinize the URLs of the first few Google results and judge whether or not they looked spammy, all the time wondering if clicking on a link would disable my back button before barraging me with popups. Black hat techniques worked like a charm back around 2000 and you’d routinely get back 2-3 prescription drug sales pages for just about any query you could imagine. Nowadays, I really only get spam from redirects after clicking through photos in Google Images.

    The fact that people now complain about the way Google chooses to sort things instead of blatant hideous spam being ranked #2, 4, and 5 for common queries is a major improvement.

  17. Well, IBM and HP aren’t necessarily navigational. People might be doing research on them, so the change potentially allows them to more easily push out negative listings about them. Like for McDonalds, I’m pretty sure there used to be an anti-McDonald’s site that came up. I don’t see it when I look now. For Coke, KillerCoke is down at the bottom — and it’s the only negative one. I’m pretty sure it used to be higher. But from a brand owner perspective, that change is golden.

  18. Trust Matt Cutts to have a time capsule of Google queries from 11 years ago 🙂

  19. About sending “queries” to Google: it would be great to include such a feature in WMT, don’t you think?

  20. Back in 2000, getting a few relevant results was impressive and people ignored false positives. Now people tend to notice poor results because most of the time the results are pretty good. You no longer have to “feel lucky” to get good results, so you start to expect that the search engine is able to answer any question. Users are more demanding because search engines have improved and they’re now strongly connected to the way people understand the Web.

  21. Could it be that people who are having a hard time finding things on Google now are searching the same way they did in 2000? Google might (and arguably should) be attempting to cater to the way people are searching now, not the way they searched 10 years ago. Don’t forget that people using a search engine need to evolve as well.

  22. Shhh Danny. Us big brand SEOs DO love that change! No need to change anything there Matt 🙂

  23. Hey Danny if you search for “anti-McDonald’s” you get lots of anti-McDonalds sites. I think that’s the point – you get what you search for?
    Matt is becoming very defensive about the bad press of late. There is no need; Google is doing a good job. No one in the UK has even heard of Blekko.

  24. No question, spam is a dominant problem. But this is a problem not only for users, but for legit websites/businesses as well. I’m just sick and tired of all these fake “SEO experts” that offer black hat techniques to manipulate the serps. And if you’re a newbie, it’s easy to fall for this scam.

    Someone mentioned above, “I think Google WebSpam Team have been doing a great job in fighting spam on google SERPs”. Yes, they do, but it takes too long. There are many sites that I’ve reported myself (with proof), and as of today no action has been taken.

    Matt, another suggestion: more tips and guidelines in Webmaster Central.

  25. I remember how much spam there was in Google 10 years ago… some of it was mine :)

    Google ain’t perfect now but to say it’s worse than even a short time ago requires rose-colored glasses IMO.

    Whether the distribution of universal results is to everyone’s liking is another story entirely…

  26. I don’t see why Google is going on the defensive. Wouldn’t it be much better to simply admit that you guys didn’t have your eyes on the ball for a while because Google diversified so quickly?

    Google risks becoming a jack of all trades, master of none.

  27. Hey Matt & Danny Sullivan

    Talking about “navigational”: please take a look at this search from today. The top 3 results are from the same site. Now take a look at the top of the pages for the three mentioned results. Do you see the keyword stuffing spam? 🙂

    Also take a look at this site on the same mentioned SERP. I call it URL spam 🙂

  28. Google wasn’t that good when it “launched”. I remember trying it when it started showing up on netscape.com, and I didn’t like it at all; I just kept using Infoseek. Of course, that would eventually change… But it is a gross misconception that Google just came and conquered because it was so much better. There is no way a search engine can be much better than the established ones without tons of data, and it took Google a while to gather enough.

  29. Oh, and Google 2000 vs. 2010 is not fair. Try Google 2005 vs. 2010 or something.

  30. I hope there will be a time when Google search engine will be able to pick the best and most relevant articles for all queries. For now, in Romania at least, in the medical field, the most relevant and informative articles are on the second page. First page results usually are taken by general big sites that happen to mention that search term.

  31. Well Google’s February 2011 update cost me two top ten positions which were replaced by third party products being sold on Amazon. Domain authority I can understand, but when an algorithm can’t determine truly quality content from a product being sold it is disheartening – especially when so many listings from Amazon appear for those queries. 🙁

    I would agree that it can be difficult to use an algorithm to determine quality content. But removing some search results and replacing them with Amazon content is not progress in my view. It is just a lazy way of being safe instead of focusing on diversity that users want. This may be why Bing has seen such a large jump in monthly search activity.

  32. I definitely like the way that Google has evolved over the years and blocked out certain attempts at gaming their engine. Unfortunately, when one door closes, another one opens – that is clearly true in the SEO world where spammers are constantly doing anything & everything they can to manipulate the results.

    Just like in sports and other competitive environments, there will always be a winner and a loser. Only, in the SEO industry, it’s often a winner and a whiner 😛

  33. From a user’s end, I like Google. It’s sort of become a home on the web (for most).
    If I need any sort of information, my reflex is to google it. Like most of us.
    The spammy content is bothersome, but is it really G’s fault? IMHO it’s not. *points finger at spam creators and promoters* The golden years might have only been golden because we didn’t know any better. There is always room for improvement, but G is a couple of football fields ahead of the competition, if not more. Keep up the good work!

    It’s nice to look back and see how the Internet has evolved. Looking forward to another 10 years with Google! 🙂

    Cheers,
    S

  34. Results are better now than back then. Just look at those crazy domains in that query above. I would never go to a site with a URL like that today, and Google’s search algorithm is smart enough now to agree with me.

  35. I think my favorite part about Google getting better over the years is that most of the blackhats had to get real jobs. Of course, by the time that happened, they had already made some serious cheese. And now they all have VP-level jobs or better. Now that’s what I call ‘win-win’.

    BTW another great improvement over the past decade has been the addition of Wikipedia pages to nearly all search queries. Right? Haha. Just messing with ya! 😉

  36. That content farm crap gets top ranking is no surprise.

    What happened to the briefly active search-result customization (sorry, I don’t recall what it was called)? After searching, one could eliminate a result… and for a shorter time there was even more customization. Did that just tee off advertisers?

    My WordPress blog, during a brief 16-hour period when Akismet was inactive, received hundreds of spam comment posts from these domains (de-hyperlinked here to avoid giving them any traffic):
    SeoMarketingServicesOnlineDOTcom
    hostinginfiniteDOTcom
    thenetworktrafficDOTcom
    ezinearticlesDOTcom

    That speaks volumes about some of the top hits I get on every topic under the sun. EzineArticles showed up in two very different searches for me this morning.

    The big trouble is obviously with trusted content. There are great tutorial websites on, say, woodworking, and there are content farm websites on the exact same topic (sometimes a direct but cheap rip-off of the text) which appear higher in search results than the real articles with real value.

  37. I sure wouldn’t want to go back to the search results of 10 years ago. I would like more consistency in the way that results are displayed now. Also, I think that personalised search has gone too far and it’s “boxing people in” and preventing them from finding something “outside the box” that could be more useful. Sometimes people don’t know what they need to find.

  38. Matt, how about publishing your 40,000 results from 2000 so the world can evaluate them independently and without cherry picking? It shouldn’t be hard to verify the claim on that basis and it would be great press for Google.

    Best,

    BW

  39. I am just a plain old user, not a search expert or SEO expert or whatever. I would say that from a plain old user’s perspective Google search results are a lot worse than just a few years ago. You can bring up all the examples you want of spam from 2000 or 2005 etc. but the results feel worse to me on almost every search.

    If I want to search for information about a product I don’t even Google it anymore: there is too much spam. For products, I search Amazon first, then Consumer Reports. I only use Google on products if I am desperate.

    Content farms (ehow.com!) are such garbage and they appear for almost everything you search on. It’s getting so bad I frequently just click on a Wikipedia topic in my results because it’s too much work to sift through the search results to find a GOOD article vs. one of the dozens of crap-quality content farm articles.

    In an arms race with spammers the spammers always win. With e-mail, your best defense is a white-list. Put everything from untrusted domains or a sender that’s not in your address book into the trash, otherwise your inbox will flood every day and you’ll give up on e-mail.

    For search, I would like to see a “block domains” box that would allow a user to block certain domains from searches and better yet a “block content farms” button that would nuke all of the domains on the “content farm list.” It would be nice to have a Google-maintained “content farm list” and a user-defined one so users could nuke content farms from searches even when those content farms are good Google advertising partners 😉
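    The “block domains” idea above amounts to a client-side filter over the result list. Here is a minimal sketch under that assumption; the function name, the blocklists, and the idea of merging a user-defined list with a maintained “content farm list” are illustrative, not any real Google feature:

    ```python
    from urllib.parse import urlparse

    def filter_blocked(results, blocklist):
        """Drop any result whose hostname is a blocked domain or a
        subdomain of one (a sketch of the 'block domains' proposal)."""
        def is_blocked(url):
            host = urlparse(url).hostname or ""
            return any(host == d or host.endswith("." + d) for d in blocklist)
        return [url for url in results if not is_blocked(url)]
    ```

    A user-defined blocklist and a maintained “content farm list” could simply be concatenated before filtering; the subdomain check matters because blocking only exact hostnames would leave `www.` and other subdomain variants untouched.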

  40. I think you are arguing the wrong point, and one that always will lose.

    The issue isn’t whether Google is better or worse. People expect more from search than they used to. I think Google has even published data that shows the average length of queries is increasing – users are becoming more sophisticated and expect more.

    Maybe the expectations are unreasonable?

  41. Seeing as people are more likely to complain about something than commend it, I am here to say: great job, Google. The results are faster and better than ever. I find myself running some pretty complex queries when faced with development issues, and Google has been amazing. Thanks again and keep up the great work.

  42. I’m really happy with what has been done with search results in the past few weeks. Lots of “bad” sites have been removed. Sure, there are now fewer results, but they are much more accurate. I can tell by the increased number of visitors my site has gotten. Last year it was buried under many spam/autogenerated sites.

    I’d be happy to send some fan letters thanks to this. 🙂

  43. My experience with Google has been mixed: at times producing very precise, relevant results, and at other times frustratingly choked with low-quality sites and/or off-topic pages. I spend lots of time in Google as a programmer, photographer, and part-time SEO consultant, and I am particularly surprised to see how often link exchanges seem to play a role in boosting SERP positions.

    I’m not referring to directories: high quality or at least marginally topically specific directories have their valuable place on the internet. I’m talking about outright link trading schemes that make no attempt to unite their membership by anything other than a desire for increasing search returns. As a consultant I steer clients away from black and gray hat techniques but their prevalence and effectiveness continue to bug me.

    I certainly don’t envy your position, Matt 🙂 – I’m sure that tackling spam and improving relevancy on a scale such as Google works with is an ongoing, monstrously complex job. Keep fighting the good fight, brother.

  44. Google 2011 is > Google 2000. Hands down, 100%. And I am not just kissing your *ss; that is, I am not a Cuttlet. 🙂 Google 2000 seemed so much better because Yahoo sucked so bad. Plus, the reason everyone is on your case is because you are now the powerhouse. I really noticed this at SMX last year. The internet community is very demanding of Google. They feel entitled to pick out every single minuscule problem. I mean, do 99.9% of Google’s users want eHow removed from their results? I highly doubt it, because I don’t. They are like the McPaper (USA Today) of the internet. Sure, they might not give you every detail and the inside scoop, but they can give you the basic facts. And sometimes that is what people want. There is a reason why USA Today has a large circulation.

    But never mind me, let the witch hunt continue! Down with McContent! I mean, eHow.

  45. The web was a LOT smaller in 2000; the large growth is due in large part to the ease of making money through AdSense.

    Instead of trying to clean up the search engine, you should put bigger restrictions on what types of sites can make money from AdSense.

    If you looked at every AdSense account and shut it down when you determined that the site’s sole purpose was to attract Google juice, you’d get a lot farther. Of course, taking your cut of the AdSense profits first and then shutting them down once they have become too obvious is much more lucrative.

  46. Sure, Google is getting better. The problem is that companies gaming Google are making huge amounts of money, so they try a lot harder. Look at this latest brouhaha over content engines like eHow, etc. They are adapting their content to be more Google-friendly, which also has the side effect of making it more useful, but they are still annoying, and they hide the small web.

  47. I use Google every day, and depending on what I’m searching for, the results are either great or awful. For example, when I search a single domain looking for some needle in a haystack fact that I know is there, Google can almost always find it for me, often with very little help from my “what was that fact I wanted?” query. Impressive.

    Where Google deserves criticism is in the garbage it returns when you’re looking for a commercial product, say shopping online. It’s not just a problem of skilled SEO spammers exploiting Google to promote their junk. It’s a matter of Google pointing its searchers at worthless pages purporting to be “portals,” when all the links they offer are linkspam leveraging Google’s own paid search business model.

    The problem isn’t that Google isn’t doing a good enough job to tease out useful results under an onslaught of SEO spammers. The problem is that Google is creating the problem of webspam by paying to support these fraudulent domains (from DNS misspellings to phony sites that have no purpose other than to provide Google-sponsored links), handing them a cut of its own proceeds. Google has a pretty serious conflict of interest here, because it’s making more money from webspam than the SEO spammers it supports.

    When called on the carpet about this issue, Google accused Bing of “stealing” its public data. Now you’re ready to talk about it, framed in the cute context of how far you’ve come in the last ten years. This is pretty insulting to the web. We don’t want nonsense and subterfuge; we want Google to stop destroying the web to make an easy profit by taxing every click, a business model that requires clicking through acres of garbage to find a nugget of useful data.

    Google, tear down your webspam business model and stop propping up SEO spammers.

  48. 10 years ago the proliferation of ‘complaint websites’ had not yet taken place. I predict the fraud and abuse will become rampant.

    It is worth mentioning that Yahoo and Bing do not rank these types of complaint websites as relevant in searches, not nearly as much as Google does. As a developer, I believe this speaks to inherent flaws in Google assigning so much value to backlinks.

    While I have not had the pleasure of having my name or company vilified, many people, companies, businesses and organizations have found themselves the victims of undeserved negative postings on complaint websites such as Rip-0F-F- Re-p0-rt,
    P1ss-0F-F- C0n$u-mers and others.

    (All purposely typed out that way because I do not want to give these websites or their owners ANY web reference weight by correctly spelling the domain names.)

    I have been following these websites for some time now and have learned two distinct things.

    1) You either love or hate them.
    2) These type of websites are turning the Internet into an information cesspool.

    Let’s talk about point number one: you either love or hate them.

    You Love Them

    You might love these types of websites because they provide an online sounding board to vent, rant, whine and complain about a negative experience you might have had with a product, service, person or company.

    I can relate. Recently my wife had a bad experience with an employee of a local retailer, part of a national jewelry store chain.

    It would have been cathartic for me to post negative reviews, but I chose to take the high road and keep my complaints between me, the store and the corporate office. As a small businessman myself, I felt that making public posts would not contribute to the common economic good: it would not reflect well on the good employees who work in this store, and it would only serve some personal desire to impugn or embarrass this company online.

    Having worked in corporate America and received a regular paycheck for many years, I probably would not have given it a second thought. I would have jumped right into the fray and posted a negative comment on a complaint website. However, for these past 10 years I have owned my own company, and that changes your perspective in a big way.

    You Hate Them

    You hate these types of websites because they are wide open to abuse. Examples of abuse include ex-employees posting false statements, or competitors posting false statements hoping to steal your sales or business.

    Most people assume that some legal action is the best way to get a fraudulent posting removed, but think again. If you have a rich uncle or your father is a partner in a huge law firm, you might try suing the website hosting the negative post. But this is extremely expensive to do, not to mention the fact that no case brought before a court is ever a clear-cut victory in your favor.

    Also remember we live in America. Negative speech, whether true or false, is protected in most states. This is a tradition that was exercised in newspapers and leaflets during political campaigns going all the way back to Benjamin Franklin, Abraham Lincoln and George Washington.

    Today, however, the big difference is that the negative speech, or in this case the ‘online postings’, remains live and online indefinitely. And this has led to the proliferation of a new cottage industry called ‘reputation management’.

    In the case of a political rival having run a lot of negative advertising against an opponent, the opponent would seek to run as many or more ads to counteract his rival, and so on. But long after the campaign was over, the dust settled and all the campaign fodder became yesterday’s papers.

    Not so on the Web.

    Fighting the Negative Search Result

    On the World Wide Web, let’s say a complaint website containing a negative comment about your business appears on the 1st page of Google just below or above your regular business website result. Yikes! What do you do?

    You could hire an online reputation management company or learn to push back this negative review yourself.

    Below are the basic steps that any online reputation management company takes to push the resulting negative website back to page 2, 3, 4 or 5 on Google.

    First, the company will review and modify pages in your website to be sure it is search-engine friendly.

    Second, they will create and/or post to blogs, directories, websites and forums to help facilitate the ‘pushing back’ of the negative website reference in specific search results.

    Helping Google to Fill The Cesspool

    Today we have all these reputation companies, web developers, web writers, and SEO teams creating USELESS content and USELESS links to fight back against all these complaint websites. Even the complaint websites are getting in on the action, charging removal fees of up to $2,400. There are even negative posts on competing complaint websites saying negative things about each other. I have even found postings from people complaining that they paid to have a posting removed, and once it was removed another appeared the next day. Sound suspicious?

    My Suggestion

    Google and its engineers can come up with a better way of serving up ‘complaint websites’ in its search results, as follows:

    a. Any complaint website that does not display the full name, email and phone number of the person making the complaint should get pushed back in the results. Or at least display an email address that is tied to some real or high-ranking website, blog or forum where the poster is a member.

    b. Any company having an A+ rating with the BBB should algorithmically negate, or receive a push-back against, any complaint website.

    I think this makes sense and sounds doable. I truly hope this posting reaches a Google decision maker who wants to see search queries cleaned up, more relevant and increasingly useful.

    Like Mom used to say: ‘If you don’t have anything nice to say about somebody, then say nothing at all.’ Thanks, Mom.

  49. I work and research in the internet marketing space every day. I shake my head every time I conduct a training class and try to explain the best white hat techniques and am countered by people telling me they are creating websites every day by the hundreds. They use scrapers and article spinning sites to generate rubbish content and load that up with AdSense. Their end game is to create links.

    This continuous growth of black-hat SEO makes me absolutely crazy. Should we blame Google just because another black hat figured out how to game the system? This is web spam, pure and simple, and within my own sphere of influence I’ve been telling people for years to avoid it, because eventually Google will build a machine smart enough to fight it. Creating a computer that could sift through information on so many levels and in so many dimensions in less than 0.1 seconds is a challenge, and the only way to make a perfect search engine is to create a perfect A.I.

    I don’t think it’s fair to say that Google is growing too large and can’t handle it either. When was the last time Google wasn’t hiring?

    Of course I do get annoyed when my search query returns a bunch of spam websites. When that happens I go back and use a longer-tail search query. I always make a note of the spam pages I find, so I can make sure none of the sites I work on will ever accidentally look like one. I’d hate for one of my legit pages to give a one-second impression of being spam to someone already frustrated by the search results.

    Here’s a question for everyone complaining: do you use long-tail searches or just a few random words? You might be surprised that Google’s search results are usually more on target with the long tail. Those long-tail searches sure help Google determine whether you want navigational, transactional, brand, or just informational results.

    I enjoy the Universal Search result pages when they are triggered as I know I immediately have real options. Now that the SERP is influenced by recommendations from others I also expect less spam to appear.

    Every day I wonder if a new feature is going to be announced that will benefit me.

    Google and Matt Cutts — Keep up the good work. Oh, and do you have a code name for the eventual A.I. you are building? Elmer? Skynet?

  50. Google’s biggest drawcard all along has been the clean white page for searching, without all the clutter. Keep it simple and people will continue to use it. The bar has definitely been raised, but that’s also a natural progression, with endless time being committed to refining search.
    I would like to see how Google could get users to rank websites and match that info against their own rankings.
    Google’s getting better but they have to constantly raise the standard so Bing can be jealous 😉

  51. Google search may be better in some absolute sense, but relatively compared to early days, the experience is much worse.

    For example, for programmer geeks, in the past I could easily find writings by other programmers about esoteric topics, be it programming, beer, or other weird topics, underground culture subjects, English writing, even angry poetry. The content was usually incredibly intelligent and informative. For one example, you’d occasionally find Richard Stallman’s home page showing up. (That’s just a celebrity example; usually these are written by nameless geeks.) These days, you can never find such things. It’s all content farms and large commercial sites such as TechCrunch. (A recent example from the blogosphere, such as searching for product reviews on washing machines, is a good illustration.)

    Also, Google Groups’ quality has gone down the drain. In the early 2000s, when Google had just acquired DejaNews, I could type a phrase and Google Groups would find messages containing that phrase. I often used this to find messages I’d written. Around the mid-2000s, that became impossible. I CANNOT EVEN FIND MESSAGES I’VE POSTED, no matter how I search. (Yes, I know the advanced search well. It’s simply broken. I can type the author’s email address, date period, and exact phrase, and it still won’t find the message.) For example, just have a look at comp.lang.lisp
    〔 http://groups.google.com/group/comp.lang.lisp/topics 〕. It’s filled with spam. E.g., at this very moment the top 3 posts are:

    SEXY HOT GIRLS 
    SIMPLE HACK TO GET $2000 FROM YOUR PAYPAL.
    Sexy Indian Babes

    It’s actually better than a year ago, when some 90% of the posts were like that.

    The same thing happens in other groups I frequent, e.g. comp.emacs. Basically, many of these groups are DEAD today. Nobody posts to them anymore.

    A few months ago, I know Google did a major cleanup of the newsgroups, but it’s still pretty bad.

    In the mid-2000s, if you used newsgroups, there were a few good alternatives, such as mame/mane (sp?), which provided a clean interface. But today, when you Google a phrase you remember from some newsgroup you read, it comes up with tens of tech newsgroup mirrors, all content farms, filled with ads and garbage (e.g. egghead, byte.com, lots of others). (Right now, I’ve spent two minutes trying to find the correct name for MANE/MAME and just failed. I tried “newsgroup mame”, “newsgroup mirror”, “newsgroup archive”, “newsgroup xahlee”, …)

    Here are two links I wrote about the issue on newsgroups; I hope they’re relevant.

    〈comp.lang.lisp is 95% Spam〉
    http://xahlee.org/UnixResource_dir/writ2/google_group_lisp_spam.html

    〈Death of Newsgroups〉
    http://xahlee.org/UnixResource_dir/writ2/death_of_newsgroups.html

    I have tried Bing occasionally in the past year. Overall, I think Google is still superior in general. I believe Matt’s claim that Google Search is better than before, but only in some absolute sense. I think the whole search experience has gone pretty bad. I pretty much agree with the numerous blog posts in recent months that have appeared on Hacker News and other sites. I read (I think on Wikipedia) a few years ago that 90% of Blogger posts are machine-generated spam. I’ve also read the same about email traffic. Google has had to battle this.

    I don’t know what to say, but I hope Google can improve more. One thing I blame Google for, perhaps wrongly, is that it seems to me Google single-handedly brewed the entire SEO movement. Google encourages this with incessant talk about how Google search works on its several webmaster, Google Analytics, etc. blogs over the years. I never understood why Google does it. On one hand, it seems Google is being helpful to webmasters, but on the other hand, it continuously talks about how Google search works for the sole purpose of SEO. 〈Why Does Google Give SEO Advice?〉 http://xahlee.org/js/why_google_helps_seo.html When you read Matt’s articles or videos, he of course often emphasizes that the best way is to have good content, but there’s always info about SEO. In a cynical view, Google does this because it spreads their search engine and ads in an indirect way. I started researching SEO this year; there are tens of thousands of companies and blogs on the topic, all rather shallow and questionable. E.g., 7 things you need to know, 5 tricks to increase traffic, 12 ways to tweet…

    sorry for the long post.

  52. Morris Rosenthal

    Matt,

    I just read the Washington Post article that Eamon mentioned. I bring it up because it took me a couple of different searches on Google to find it:

    http://www.washingtonpost.com/wp-dyn/content/article/2011/01/28/AR2011012803849.html

    I think Google’s attempt to generate fresh results was the main issue. I kept getting returns from blogs and other newspapers quoting the article, rather than the article itself.

    I commented about content farms on your last post:-)

    Morris

  53. Long tail search is the way to go on Google, whether you’re searching or buying adwords.
    But, Google’s instant results make it easier not to use the long tail and work against it.

    There are various ways to use Google to assist in avoiding the spam–it’s helpful to search blogs before going to the mainstream search. They’re SLIGHTLY less polluted with spam. It’s helpful to use the long tail. If your domain is constrained enough and you can identify the key 100 or so sites, it is helpful to use a Google Custom Search Engine.

    But, for mainstream search, I agree, hard to believe Google is better today than it was in 2000.

    Cheers,

    BW

  54. Matt, I love Google with all my heart. It is my favorite search engine, favorite cell phone software, favorite email, etc. I feel that it is still a very honest company.

    But, it pains me to say this, the spammers are winning.

    My searches are drowning in “shopping engines” (read affiliates optimized to the hilt), useless ebay affiliate sites like sportslinkup.com (really huge ebay affiliate), etc.

    Try searching for [kurt “dbkit-30”]. (without square brackets). You will see a couple of dozen of garbage sites, all clearly written by one person, redirecting to one affiliate page.

    At the very very least, I want an option of blocking ALL affiliates and ALL “shopping engines”. Just an option.

    Something happened a year ago, when somehow a lot of garbage sites reappeared in the search results.


  55. French point of view: the evolution you mentioned was quite nice, and made Google cleaner than it used to be. Nevertheless, even if content is king, we all know that ‘href’ is better. Here is the question: by giving more power to the quality team, won’t the search engine be less fair than before? Because you guys will have the ability to pick and choose the website that will be considered most relevant.

    Nice topic

  56. Wow! Many changes in Google! And it irritates me/us. 🙂

  57. You are absolutely right, and I don’t think search results are bad today; they are awesome. I know many people for whom searching is their bread and butter, and I too have been using Google since 2002. No other search engine comes close to Google to this day.

  58. Mikhail Slivinskiy, we’ve talked about how we evaluate search quality a bit at http://googleblog.blogspot.com/2008/09/search-evaluation-at-google.html . I’ll ask whether there’s more info that we could give.

    Personally, I’d love if we did a blog post along the lines of “we compared Google to how relevant we were three years ago” and dove into details on that.

    Ionut Alex Chitu, I think the way you put it in your comment is exactly right.

    Darren, I don’t mind if people have questions about Google’s search quality. For a while I felt like I’d blogged about everything I needed to say, but now I’ve got lots of topics on my to-do list for when I can find the time.

    Yousaf Sekander, I think the recent spate of articles has been a really good call to action for us to redouble our efforts.

    “Oh and google 2k vs 2k10 is not fair. Try google 2005 vs 2010 or something.” We don’t have versions of Google for every year, but we do have some Google results from three years ago or so. I’ll see whether we can compare those.

    Cory Howell, too true.

    Mike Card, thanks for the suggestion. Let me get back to you on that.

    Dave Naffziger, that would be a good subject for a different blog post.

    Jon Bishop and Oskari and Tom Blue, thanks. 🙂

    Matthew Perosi, we usually avoid naming projects scary things like “Skynet.” They tend to be more positive, like “Fluffy Bunnies” or “Phil.”

    Igor Chudov, thanks for the specific search; I’ll pass that example around in Google.

  59. Matt, as we see more search users and webmasters day by day, we also see more spammers day by day. At a broad level, search evaluators think spam results are being sorted out, but from the user’s perspective spam results are getting more and more common. It’s high time to adopt some sort of strong metric that weighs user behavior against search queries to sort out spam and duplicate results.

  60. I’m an occasional Google search user, and it is simply a fact that results have degraded over the past 1-2 years. Many other people seem to agree, and are starting to talk about it. Lately when I try to search for things, I get a page full of spam and give up. This is particularly true when my search string has commercial value to it.

    Results are probably better than in 2000, as your post implies, but they are also a lot worse than 2 years ago. Google needs to integrate social dynamics and credibility into search result rankings.

  61. Since last year, each time I input a book title, tens of spam web pages are returned, pretending that a free download link can be found on the page. Has your team noticed this problem? It is a real headache for me, and I really hope it can be overcome soon.

  62. “Google and Matt Cutts — Keep up the good work. Oh, and do you have a code name for the eventual A.I. you are building? Elmer? Skynet?”

    Dude, there’s only one possible name they could use…CompuGlobalMegaHyperNet. And if Bill Gates ever tries to “buy Google out”, I get first right of refusal on the right to produce and distribute the “buyout” on DVD and Blu-Ray.

    I just need to find a way to work Vinnie Jones into the mix. He’s always such a badass.

  63. Wasn’t the introduction of LSI supposed to be Google’s way of filtering the spam out of the top results?

  64. Hey Matt, it’s great that you saved the ‘time capsule’ of Google search results from 2000. You say you were in grad school in ’99, so when you ran the 40k+ queries, was this before you started work at the big G? I believe in Google wholeheartedly and think that the SERPs in all aspects have just gotten better and better with refinement and time.

  65. Having been on the internet since the mid-’90s, I can tell you how bad search results were back then. The problem back then was that there was no information. Think of how many sites there were then compared to now. So yeah, I get what people are saying about there being more spam these days, but I remember back in the day not being able to get any information because there weren’t enough websites. You take the good with the bad. Sure, there’s more spam, but there’s also tons more information out there. Keep up the great work, Google.

  66. Have you thought of a search engine based on human feelings? Or a social network?

  67. Maybe today’s results are better than 10 years ago, but my personal observation is that Google does not often return quality results. You buy a domain name for a long-tail keyword, put some generic phrases on the webpage, and you are on the first SERP page, well ahead of real quality sites. I think Google went astray with its 200+ signals.

  68. Nice article. You should write a longer text (or make a video) to explain chronologically all the tricks spammers have been using every time Google added some new functionality or used some spam-fighting technique and how Google reacted to them.

    It would be an interesting review of the art of computer antispamming and “bad” webmasters would learn a lot of things they don’t need to do now to fool Google. Also, a long-time perspective of the mistakes made in the past can help “good” webmasters and Google to intuit how this fight will be in the future and make the web a better place.

    There are lots of webmasters who could help find solutions to current spam problems if they knew what those problems are. An updatable, chronological list of threats and solutions, from 1999 to the current ones, would make spam fighting more “open source” — with all the risks this implies, but probably the answers to those threats would arise before anyone could take advantage of them. Also, a big collective would be able to imagine new future attacks before Google even notices them.

  69. “Unfortunately, just a few months later people were creating multiple subdomains to get around host crowding, as the results above show. Google later added more robust code to prevent that sort of subdomain abuse and to ensure better diversity”

    So now black hatters have simply moved to registering multiple domains for a few dollars each and hosting them on different IP addresses. It’s that easy. I see several of these domain farms dominating the SERPs. I report them as spam and nothing ever happens, except that the white-hat sites all go out of business.

  70. Hands up anybody who feels they could have done better – with reasoning.

  71. “Suddenly, Google’s search results were much cleaner and more diverse! It was a really nice win–we even got email fan letters”

    I like how you say “we” when you weren’t a part of the team back then. People do this sort of thing with everything. Take Super Mario as an example. The game wasn’t all that great, but people think it was. For some reason people look at memories through a rosy lens.

  72. It’s really frustrating. Sometimes we are trying to find something urgent on Google but never find the original pages. I don’t understand why Google gives priority to pages that are not directly related to the products.

    Thanks
    Sudip

  73. The good old “Google is dead because of spammy results” story shows up at least once every year, pretty much since forever… hyping up and then kicking down is one of the most popular news themes, true for pop singers, actors, search engines, TV series, etc. I still think Google’s results are good, albeit presented with more and more clutter, but it’s still easy to ignore… nothing like the popups of the late 1990s, and no repeating background music yet either… though hey, that could be an interesting experiment… audio cues to emphasize search results!

  74. @Matt
    When I first started following this blog, it was because I was somewhat new to SEO (I had actually done a lot of work on it, but usually I did the work as advised by somebody more expert than I…then I started working on my own projects). At that point, the majority of your postings had to do with sage wisdom, logic of how Google’s rankings worked, advice on how to handle / manage rankings, and a ton of valuable info on Webmaster Tools, Analytics, and quite a few other great topics. I don’t mind the increasingly higher ratio of personal postings; I’m a big believer in changing the channel if you don’t like the show, so I still read most of the personal ones, but sometimes skip them if time is short or content is just too far from my interests. But lately, I am not seeing anything that is educational, but rather a bit more dramatic.

    Obviously the signal-to-noise ratio is still low, but this post and the one before it feel very much like they are driven by Google’s marketing hand. I’m not exactly jumping up and down to yell about how Google’s result quality has gone downhill, but it’s clear that a lot of other people are (especially those over at Lifehacker/Gawker, a slowly increasing portion of the tech community, and even Apple…but I hardly take Apple seriously). While Google certainly can and should promote their search (and I prefer it be done honestly, as opposed to the fiasco from the last week or so), I really miss seeing quality content coming through this channel.

    Please bring back the Matt Cutts from 3 years ago…just as many feel Google was better 3-5 years ago, that’s very much how I’ve come to feel about this blog.

    Awesome: Electronics reviews, hobby projects, SEO/WebMaster topics, Event Info and even vacation photos

    Weak: Political garbage, FUD, Propaganda

  75. Slight correction, when I mentioned “the article before this one”, I meant the Google v. Bing fiasco, not the tiny posting about stripping personal info from jpegs in Ubuntu 🙂

  76. I partially agree with Danny that BIG brand names should not be given the most expensive real estate nowadays (SERPs). The more diverse the results, the better a search engine is. For example, when I searched for HP dv6500, the top four results were from HP. The rest were not what I really wanted. My desired result was at position 1 on page 2, from CNET. Had HP been given just one spot, my desired result would have been on page one, and that would have saved me an additional click.

  77. I remember coming across Google by chance in the late ’90s and being amazed at the quality of the search results. “Finally!” I thought, “just what I am looking for.” Since then I have been a huge fan. The many innovative and useful tools that have been developed since then have been quite amazing, not just the improved search results. Thanks for the blog post.

  78. “One nice change that Google introduced in February 2000 was “host crowding,” which only showed two results from each hostname (here’s what a hostname is).”

    Really? Try going to Google Australia (google.com.au) and typing in ‘real estate’. The first four results are (and have been for a very long time) all from the same domain name.

    In what way does the user benefit from being presented with the same website for the first four results?

  79. My biggest problem is that when I do a search, rather than getting a relevant company or site, I get a dozen different article sites spammed with Adsense. Most of the article sites contain the same article over and over. It is all a bunch of gibberish and is completely irrelevant.

  80. Matt,
    Seems like you are ignoring a cause-and-effect cycle that Google’s dominance and lack of transparency have caused. Consider that by not setting clear guidelines, you encourage experimentation (gaming) and spamming. Consider how usable and uncluttered the web would be if you did have strict rules about the number of links on a page. What about the idea that PageRank shouldn’t leak on no-followed internal links? UI has grown way too complicated, largely because you won’t let me tell you which links should and shouldn’t pass PageRank.

    Heck, you could single-handedly put the domain-parking spam pages out of existence overnight. Wouldn’t it be nice to have all those domains back in the pool actually available for entrepreneurship?

    Finally, consider that the more you promote big-brand answers (the 4-pack), the more you become the search engine of the establishment, and less the tool for the people who put you where you are.

  81. I agree with Matt’s observation that Google has improved its results over the last decade. The reason I believe this to be true is that one has to take into account the amount of data that existed 10 years back versus now: there is a lot more data, most of it noise, so the algorithms have to sift through much more. The types of data that have to be indexed or searched have also increased. Kudos to Google and Matt Cutts for what has been achieved, and I’m sure everyone, including Danny Sullivan, recognizes the achievements the Google team has made in the past decade.

    Danny also makes a valid point that most searchers would prefer one result per domain in the search results, which hopefully gives the user a wider range of results.

    Having said that, there is an advantage to host crowding, as the researcher can tell whether a particular domain has authority for the term searched. For example, if I were to search for “search engine news”, it would definitely help if I could see how many results there are per domain, as it would help me make a better decision about which result to click on; domains with bigger host crowding are likely to have more information than the others. (I agree this is not true for every search you do, but it is true for the majority of them.)

    This is a Catch-22 situation.

    The solution that came to my mind was to use Ajax on the search results page: next to each domain there would be a “+” sign which expands to show the other results from the same host, along with a count of how many there are. This could help the researcher make a more qualified decision about which result to try first. I feel this is worth a shot as an option in Google Search Labs.
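    The core of this “+” idea is just grouping ranked results by hostname and surfacing one hit per host with a count of collapsed siblings. A rough sketch, with made-up URLs and an invented output shape (it says nothing about how Google implements host crowding):

```python
# Sketch of host crowding with an expandable "+": keep one visible
# result per hostname, plus a count the "+" control could expand.
from urllib.parse import urlparse
from collections import OrderedDict

def collapse_by_host(ranked_urls):
    groups = OrderedDict()           # preserves original rank order
    for url in ranked_urls:
        host = urlparse(url).hostname
        groups.setdefault(host, []).append(url)
    # One visible result per host; "more" is the collapsed count.
    return [{"url": urls[0], "host": host, "more": len(urls) - 1}
            for host, urls in groups.items()]

serp = collapse_by_host([
    "http://a.example/page1",
    "http://a.example/page2",
    "http://b.example/",
    "http://a.example/page3",
])
```

    Here the three a.example hits collapse into one entry with `more == 2`, which is exactly the count a researcher could use to judge a domain’s depth on the topic before expanding it.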

    What do you think?

  82. Hi Matt

    Is it cloaking if you present the content of a private-sales company website to spiders but not to users? The business proposition doesn’t allow displaying the sales before the user logs in, but is it OK to let spiders index the site in this case? Of course, the user knows it’s a private-sales site even before clicking on the result in the SERP.

    Thanks
    Arthur

  83. I am not a techie person, but I have depended on google for the last few years to do research. I have been asking the same sort of queries for a few years, for both websites and images. I now cannot depend on the results I get. The weight is now given to personal pages, flickr images, and misspelled variations. University sites, images from authoritative sources, personal websites with detailed information rarely show up.

    Instead of getting dependable and correct information, I am now deluged with some kind of wider answer, one that is so undiscerning and totally inclusive that the results are useless.

    Please fix it. I don’t know what you did in the last few weeks, but you broke it.

  84. Don’t results depend on what’s out there? Google returns the best possible results for your query. Maybe we should question the content and content creators instead of Google.

    Sure, there may be something excellent buried on the 15th page, but users can bring it to the top with a link, and that’s how search engines like Google measure quality.

    So overall, I think Google is doing a good job (for free! (organic)) with the billions of documents out there.

  85. Commercial vs. informational searches. I’d agree that Google’s results have gotten infinitely better from a commercial standpoint. There are still some issues. Lack of variety is one, with only a few stores really mattering in the results. The other is the abundance of shopping engines that dominate the results. Sorry, but I’m using Google as my search engine, not to get results from another search engine.

    But the big difference is in informational. I’d say this has changed a lot over the last few years as large, VC-funded content farms have found ways to make informational searching much more profitable. If I was looking for ways to make cookies, I used to get some really interesting hobby sites with people passionate about it. Now I’ll get eHow, etc., with run-of-the-mill information.

    So I think when discussing results, you have to separate commercial from informational. Commercial has gotten much better; informational has gotten worse. Google needs to find a way to get me to the site of a baker, chef, or cooking enthusiast when I want to make something, not to the latest VC-funded content farm that had some guy overseas write a generic article for 30 cents.

  86. Very interesting post. It always amazes me how basic technology was “back in the day” and how we just put up with it. I guess, what other choice did we have?

  87. In 2000 I loved to use Google simply because the search result page was clean, just like the Google site itself: search and nothing else, no crowding by other portal features.

    Then I had ADVERTS showing up at the top (sponsored results or whatever) as well as at the side of the pages. There were also privacy issues (see ixquick.com). So for me, Google in 2000 vs. Google in 2011 is the difference between heaven and hell, and I use it no more.

  88. Google needs to take a page out of the Giuliani book of leadership on the “broken windows” theory: being responsive to street-level “smaller problems” also keeps the bigger problems at bay.

    Google needs to have an active abuse investigation team that looks into, responds to, and reports on link abuse. I know of a number of small-business sites that continually get outranked by their competitors because those competitors are using link farms. It would not take long for word to get out that Google investigates black-hat link building, and a sufficient penalty would be enough of a stick to deter people.

    For many, the carrot of poor-quality links to keep up with competitors is more compelling than the stick of an unlikely penalty from Google, especially when competitors are using these tactics.

    I tell my clients that implementing white-hat search strategies is the only way to protect their online brand with search engines, but I also let them know it is unlikely to get them any visibility, because that is not the game their competitors are playing.

    Groupon claims to have “unlocked” the local commerce equation. The most significant difference in their model was a commitment to feet on the street and front-line people. I don’t think everything can be solved by algorithms, but I think feet on the street to deal with “broken windows” would be a much bigger deterrent across the industry than randomly singling out some companies like JC Penney. Good for you with JC Penney, but this doesn’t help level the playing field for many businesses I know of, where I could provide you with sites and inbound links for competitors that could easily be investigated and dealt with.

  89. While the amount of spam getting into the index has grown since the launch of Google, I still find its search results to be the best out of all the major search engines I’ve compared (Yahoo, Bing, etc.).

    And while Google has a lot more services at its disposal that I don’t really use, I will still love Google for all it’s worth.

    Just imagine what Google will be like in another 10 years, in 2021.

  90. Is it really better? I doubt it. Sure, the expectations for search quality may be higher than before, but don’t forget you’re now 11 more years ahead. Time enough and manpower enough to make your algorithm better than it is. It still begins and ends with link building. That’s the only thing that really counts. Which sites rank high? The sites with the highest-PR links, bought links, or links gained by article spamming. I’ll only say NYT/JC Penney: several weeks at #1 on countless search phrases, and not one of the 24 thousand employees noticed it?
    That was against the Google guidelines… if I had done this, you would have kicked me light years out of the Google universe. Let’s talk again in 2020.

  91. It reminds me of when I was a boy and could buy a packet of crisps for 2 pence. Those were the good old days, lol. Time is a great healer, so to speak, and I guess people will be saying the same about Google in another 10 years. That’s life as we knew it, or should I say, how we want to remember it!

  92. What we need is a new system of Meritocracy. When there were only a handful of decent pages that were linked to from other sources, it was easy to identify the level of credibility of one page over another. It was also easy to game, which is why there have been battles between spammers and Google for the last 10 years.

    I would propose that the simplest solution to righting the wrongs in the search engine mess we have today across all search engines is to make authorship relevant (in direct relation to their areas of expertise, their sphere of influence in social media, etc.). Authorship today is nearly impossible to identify outside of what domain the content resides on (and really, who only has content on one domain!), and existing markup and publishing mechanisms do not have the necessary components to verify ownership in a way that is tamper proof. But if you can solve this problem, if authorship can become relevant, then I think there’s hope for “algorithmic” search engines.

    Otherwise “the end is nigh” in regards to algorithmic engines. In that eventuality, curated content (and mechanisms for navigating that content) is the only viable solution moving forward.

    Personally, I think the reason I look back at 1999/2000 as good times was the lack of AdWords, Google Local, etc., which made me feel like I was getting “clean” results. Now the top 3 results are all paid links, local searches are full of Google Local results, Wikipedia is often in the first slot, etc., and you have to be trained to ignore these things if you really want to see the “true” organic results. I don’t begrudge Google this fact, but it is a fact. The search results “feel” junkier because of the clutter that has been added in order to monetize and “improve the search results”.

  93. No question that for me, the search quality and results continue to improve. But given the ongoing arms race between search engines and people looking for an edge in search results, it has to continually improve.

    At a high level, I like the idea that Mike Card has proposed. However, the challenge is in the details of making this a workable approach. I’ve seen something similar, with poor to medium success, for newspapers trying to manage the occasional inappropriate comments they receive on a news article.

  94. 10 years from now I hope Google’s code is open source 🙂

  95. This is what I call “evolution”. Google has done a good job, as I always get what I want, and now I have many more options in the left-hand navigation.

    Cheers!
    Agam

  96. I must give some kudos to Google for doing a decent job fighting spam in the search results, considering how many internet users (and Google users) there are now compared to the year 2000.

  97. Does this scenario remind anyone of the “Thumb” from The Hitchhiker’s Guide to the Galaxy? Google updates its algo to address spam, the spammers adjust their tactics to get around it. Repeat.

    Last decade it was link farms.

    Now it is content farms.

    Each time Google responds and corrects the problem…and they do it well enough that I still prefer their SERPs over the competition.

  98. I think in the year 2000 search volumes were much smaller, so Google had less exposure and hence less spam. At that time other websites were more popular. With the passage of time Google has improved, and I personally think Google 2011 is better than Google 2000 because the search engine has clearly matured.

  99. Hi Matt,
    as long as you guys continue to strive to be the best and most accurate search engine, giving the fairest results and sticking to “Don’t be Evil,” your services will always improve.
    thanks for being there.

  100. Alastair McKenzie

    My issue with Google in the ‘old days’ wasn’t spam, it was relevancy. Spam levels seemed pretty equal across search engines. I spent my time trying to persuade colleagues and clients not to use Google like sheep simply because everyone else did, but to think for themselves and try other search engines because very often they would get better (more relevant) results. It used to really irritate me that Google was so dominant – all hype, no trousers.

    I stopped around 2007 when Google began to stride away from the others and have been an enthusiastic Google evangelist ever since.

  101. I think that the largest problem with Google today is nefarious companies stuffing hidden links and spamming hidden keywords on each of their pages. This cuts out competitors who aren’t willing to use black hat SEO.

    For a particularly egregious example, Google:
    “national graphic design, web design, logo, and artistic”

    Dozens of pages of the same hidden link stuffing by one web design firm – Pressley Design.
    I could give hundreds of other example search strings with a ton of web design firms doing the same thing.

    Is there no way to recognize these repeated strings and check to see if they are visible on the page?
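    The heuristic I have in mind is dead simple: flag text that sits inside elements whose inline style makes them invisible. Here is a toy sketch (purely illustrative; a real spam classifier would also need the rendered CSS and layout, not just inline styles):

    ```python
    # Minimal hidden-text detector: walks HTML and collects text inside
    # elements styled as invisible. Inline styles only; external CSS,
    # off-screen positioning, and background-matching colors are ignored.
    from html.parser import HTMLParser

    HIDDEN_STYLES = ("display:none", "visibility:hidden", "font-size:0")

    class HiddenTextFinder(HTMLParser):
        VOID = {"br", "img", "hr", "meta", "input", "link"}  # no close tag

        def __init__(self):
            super().__init__()
            self.stack = []        # one bool per open element: hidden or not
            self.hidden_text = []  # text the heuristic thinks is invisible

        def handle_starttag(self, tag, attrs):
            if tag in self.VOID:
                return
            style = (dict(attrs).get("style") or "").replace(" ", "").lower()
            hidden = any(h in style for h in HIDDEN_STYLES)
            # children of a hidden element are hidden too
            self.stack.append(hidden or bool(self.stack and self.stack[-1]))

        def handle_endtag(self, tag):
            if tag not in self.VOID and self.stack:
                self.stack.pop()

        def handle_data(self, data):
            if self.stack and self.stack[-1] and data.strip():
                self.hidden_text.append(data.strip())

    finder = HiddenTextFinder()
    finder.feed('<p>visible</p><div style="display:none">web design <b>logo</b></div>')
    # finder.hidden_text → ['web design', 'logo']
    ```

    Run that over the repeated keyword strings and you have at least a first-pass signal that the text was never meant for human visitors.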

  102. I see too many negative remarks on a positive article on change. People have additional expectations from Google, that is given, but how about appreciating rather than criticizing. I for one know that when I am appreciated I do better.

  103. One thing I would love to see is a shift toward more dynamic results, possibly even full customization by the user: in other words, the ability to change the exact algorithm used for a given search. If, say, reputability is most important for my particular search, I would like to be able to tweak Google to return results accordingly. On the other hand, if I care very little about reputability and am more concerned with relevancy, I would love to be able to adjust that as well. You could potentially automate this with some sort of feedback mechanism whereby Search would automatically tailor itself to the common user without any need to manually tweak the algorithm (no technical knowledge necessary). Relevant to this entry is the fact that such a thing has great potential for fighting spam in a way that goes beyond mere static algorithm adjustments (which will always be deficient). I believe this sort of thing is likely the future of search and I would love to see Google implement it.
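    As a rough illustration of what I mean (the signal names, values, and weights below are entirely made up; a real engine would expose far richer knobs), per-user re-ranking could be as simple as a weighted sum over ranking signals:

    ```python
    # Sketch of user-tunable ranking: each searcher supplies weights for
    # named signals; a page's score is just the weighted sum of its signals.

    def score(page, weights):
        """Combine per-page signals using a searcher's own weights."""
        return sum(weights.get(name, 0.0) * value
                   for name, value in page["signals"].items())

    # Two hypothetical results with made-up signal values.
    pages = [
        {"url": "a.example", "signals": {"relevance": 0.9, "reputation": 0.2}},
        {"url": "b.example", "signals": {"relevance": 0.5, "reputation": 0.9}},
    ]

    # A searcher who cares mostly about reputation re-ranks the same results.
    prefer_reputation = {"relevance": 0.3, "reputation": 1.0}
    ranked = sorted(pages, key=lambda p: score(p, prefer_reputation), reverse=True)
    # ranked[0]["url"] → 'b.example'
    ```

    The same result set gets a different order for each searcher, with no change to the underlying index; the feedback mechanism would just be learning these weights instead of asking for them.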

  104. I think Google has got loads better than it was ten years ago. The points that some people have made, that we’re getting smarter at spotting spam are dead on. Also, I can’t help thinking that as Google evolves, so do the spammers so the cycle will never end. I personally, have flirted with Bing recently – just to see if they’re better – they’re not – at least in my opinion… I like Google! It returns the best results for me.

  105. You are a liar Matt

  106. Oh yeah, I remember when I was in school and I actually taught a short class on HOW to search using HotBot, Lycos and AltaVista. Finding what you wanted took a skill; not too hard to learn, but teaching people “exact term match”, Boolean, AND, OR, NEAR expressions was tougher than I thought it would be. I was floored when I first started using Google! I had to ‘un-teach’ myself all that stuff! Dear Google, I adore you and all you do!

  107. I think Google is doing a great job. I think it’s us, as searchers, who need to become more sophisticated in our search terms.

    The expectations for Google are so high, it must be tough to live up to what everyone expects it to be able to do – especially when there are thousands of people out there trying to work around the system.

  108. Ahh yes the good old days when you could rule not just the top results but sometimes the Top 10. I had it happen to me once in 1998 (pre-Google). There was my pitiful little website in the entire top 10, and for what is today one of the toughest search terms to rank for, but back in the day there was hardly any competition. The search term? Search Engine Optimization

  109. There will always be haters.

  110. Hi Matt,
    What plans does Google have to deal with the mountain of dubious blogs that are set up daily and act like link farms to boost companies to the top of the Google index? We notice our main competitors are doing this all the time, and at the moment it seems as though Google is rewarding them. I read the article on JC Penney in the New York Times [http://www.nytimes.com/2011/02/13/business/13search.html] with interest and wonder when this will filter down to the smaller companies doing just the same. I came across a link this morning from a funeral home website that now serves as a link center on subjects as diverse as ‘Ladders’, ‘Infertility’, ‘Used Dental Equipment’ etc. Sites like these are filling the Google index with junk, making the already hard task of finding worthwhile content even harder.

  111. Matt,

    I had a question about forum profile spam. I’ve seen some cases where websites have completely dominated results for years with nothing but software-generated forum profiles. Is that on Google’s radar anytime soon?

    Thank You

  112. There’s something in all of this that doesn’t make any sense to me. Have all the Google skeptics searched on Bing or any of the other search engines recently? It’s true that Google isn’t perfect, but isn’t it incredibly clear that its technology is vastly superior to all the other search engines at the moment? I just don’t see the value in constantly attacking Google search quality when it’s so clearly obvious that Google is light years ahead of everyone else in this regard.

  113. Wow, Google was totally not on my radar back then… and I was playing in the search space but focused more on Yahoo, Altavista, Excite and Infoseek…

    How times have changed and man, Google… “You’ve come a long way baby” seems to fit.

    I can’t even imagine what 2020 Google will look like.

  114. To be perfectly honest, if you do a search for loanfinder on google.co.uk you will find the same company has monopolized page 1 of Google by using multiple domains, the end result being exactly the same as you describe in 2000. This is a big challenge for Google to resolve, but it needs to happen.

    Companies in the UK do this to push negative press off the first 1-3 SERPs, and it’s so very easy to do. This really needs resolving; I just wish I had a solution.

  115. Hi Matt,

    I need to sound off here, this is getting to be a bit absurd:

    1. Google wants to reward unique content and downplay duplicate content, all well and good, yet you reward major comparison shopping sites like TheFind.com and Kaboodle.com which steal other people’s content without asking (i.e. the crawler FatBot is TheFind’s robot used to steal content) and place them at the top of the rankings. Even if you create unique content, it goes to waste in many cases, as Google rewards the thieves.

    2. Amazon has been dominating SERPs even more recently. Is this thanks to their enormous copywriting staff dedicated to high-quality content? No, they’re getting by on everyone else’s back – and doing so in the face of your content quality claims.

    4. Google’s stance on the Overstock style links is so retroactively reactionary, ambiguous and absurd that it would take hours to articulate my position. In lieu of that, take a moment yourself to consider why the stance is an absolute breakdown of logic. Go so far as to re-acquaint yourself with the original purpose behind the anchor text of a link. I’d love to hear the results of this exercise.

    5. 100% agree that multiple SERP listings for a single domain is 3 steps in the wrong direction. Every other change Google is making moves toward a more diverse experience. Baffling.

    Ok, that’s enough.

    Google needs to stop talking out of both sides of their mouth.

  116. I agree with Danny Sullivan; all I really need is one or two pages from each domain. Just give me the most relevant pages and I will be happy. It drives me insane to have to scroll past multiple results from several domains, past the place listings, the shopping section…

    Why can’t that all be available in the side bar, and leave the main real estate for one listing for each of the most relevant organic results? Or add a “see more from this domain” instead of automatically listing them all in the results.

  117. Host crowding helps in a way because it gives all the others a chance to compete! 🙂
    That is what I call a drastic change!! 🙂

  118. The truth is that Google itself is responsible for large amounts of poor content being created, and it’s becoming even worse.

    Google loves fresh unique content, so let’s spin the brains out of the article! For example: I have articles that are actually readable and spun more than 2700% (several thousand words of variations for an article of 600 words). In order to spread and rank well I needed to spread something unique, but at the same time I have polluted the internet and become an SEO spammer, because that same article has cycled over and over again in different variations, without any further value for the visitor. Sure, I could just write another unique article, but I like to create my own future and create the conditions to manipulate, instead of being at Google’s mercy.

    My opinion is that quality content should be re-posted everywhere. To avoid brainless spam, the “googpolice” will inevitably, in the future, need to incorporate people into their precious algo code: user interaction, similar to the abused FB Like, etc. (not giving ideas, since SEO spam works)

    Overall, Google can’t “own” the internet and not be responsible for spam, especially since the majority of this so-called “SPAM” (I would say “desperately trying to rank among pro-AdSense spammers,” like blogger.com, youtube…) is tightly related to AdSense.

    One more thing: since Facebook is the prototype of the future SE, I also hope that Facebook will put Google where it belongs: any place but first. People, not code, are the true movers and shakers of the internet, regardless of Google’s computing power.

    My two cents: if Google wants to provide better quality to the people, engage people’s opinions in the search, but… naaah, who gives a crap what Google does. As far as conversions and traffic go, I will always choose Bing, Yahoo and other smaller engines. For everyday browsing through crap, I will always choose Google…

  119. Google still seems to return the most relevant results for me, hence I use it all the time. I find it really annoying when a toolbar I have updates itself and I am left with a useless Yahoo search box.

  120. It’s very important for Google to improve the search results, and they are improving. But it personally disappoints me when I see 6 or 7 results out of 10 from Local Listings instead of more from the true organic ranking. Earlier the Google results were good, but nowadays the organic results are getting polluted with the “key phrases” from my past queries; Google’s algorithm is not taking care, and is mixing the results for those past key phrases into the results for current searches on completely different subjects!
    Example: http://www.google.com/#hl=en&sugexp=ldymls&xhr=t&q=web+design+mumbai&cp=17&pq=web%20hosting%20india&pf=p&sclient=psy&aq=0&aqi=&aql=&oq=web+design+mumbai&pbx=1&bav=on.1,or.&fp=42ea6e12edc6080

  121. Rajendra Singh

    Dear Friend,
    I am quite sure that the search results have improved in the USA, but what about the search results in India, especially career-related reference sites? Some of them have become a real menace. They violate each and every rule of Google search. They write the URL of the source websites in their content titles just to confuse Google search users. There are several webmasters who have multiple sites with similar content.
    Why don’t you come up with a policy for reference sites, especially those related to careers and jobs?

  122. I still remember the first time someone said “google”; it was in 2001, and ever since that first try I have thought it was the best search engine around.

  123. Results are better now than back then. Just look at those crazy domains in that query above. I would never go to a site with a URL like that today, and Google’s search algorithm is smart enough now to agree with me.

  124. I’m a big fan of Google. It’s the only search engine I use. But being a full-time SEO consultant, it’s becoming more and more obvious that Google still has a long way to go. It’s disheartening when you follow Google’s guidelines, spend time adding unique high-quality content, and follow strict white hat SEO methods to rank for a certain keyword, only to see a useless spammer’s site with NO content in 2nd position on the first page. *sigh

  125. Google 2011 > Google 2000. Here is a question for modern-day Google, however: is there any chance in the near future that Google would devalue incoming links?

  126. I’m happy with about 80% of the queries I perform and usually find what I am looking for very quickly. Some of the Google Places results seem to be less relevant than I would expect. I am still amazed though by the relevance of most results.

  127. Sorry to elaborate on the question above (and not use the business name as the user ID)… I am asking because, after all the bad publicity with content farms, Google recently changed their algorithm. With the recent additional bad publicity around Overstock.com/JC Penney and their incoming links, is there any notion that the algorithm may devalue incoming links in the near future?

  128. Web Spam has *much improved* in our business category, commercial real estate. Just wanted to say thanks to the people in charge. I bet you don’t get that too often.

  129. Hi Matt,

    Any idea why Google is so dedicated to supporting copyright infringement while not ranking the website the original content was stolen from in the search results at all? BestGore.com is the original reality news website, with original, well-researched content from which all similar websites derive theirs (90% steal it word for word), yet Google rigorously returns pages with content stolen from BestGore.com in the top results while BestGore.com itself is nowhere to be found. I’m trying to understand why.

    BestGore.com is well established and has been around for years. It’s WP powered so each time a post is made, pings are sent and Google is also notified through sitemap submission. So Google knows new content was published almost instantly, yet for some reason it always refuses to include it in the results, and instead it waits until content stealing websites plagiarize the text and images from BestGore.com and when that happens, Google places those plagiarizing website in top result for relevant searches.

    Why such ignorance for the original source that takes weeks to research and put together and so much support for criminals who simply steal these weeks of research and material gathering? Why is Google such a rigorous supporter of copyright infringement and leaves the most relevant website in the niche out of its results entirely?

    To be honest, BestGore.com gets a lot of traffic because it is the original source so type in traffic reaches high numbers. This is not a plea for more traffic. BestGore.com will do just fine as it has from the beginning even without Google. But why such tireless support for copycats who are above all criminals by infringing on the original content owner’s copyright?

    The fact that Google refuses to rank the original and most relevant source highly (thus not offering users the most relevant results) – that’s one thing, but that it supports and sends so much traffic to thieves is alarming. And Google knows where the original source came from, because thieves must wait for BestGore.com to post the content first before they can come over, copy it word for word, save all the images, blur out the watermark and post it on their sites.

    Do you have any answer as to why Google supports copyright infringement and turns its back on the original source from which the thieves steal content to get Google’s traffic?

    Mark

  130. When I search for something, it’s not uncommon for me to have to reconfigure the search parameters for time/date to ensure I receive searches that are more recent than years ago.
    Aside from this, spam sites are becoming more and more pestiferous as they figure out Google’s alleged “new classifier”.

  131. The one thing I find frustrating recently on google.co.uk is the amount of low-quality websites ranking above authoritative domains. Has anyone else noticed this in their industry?

  132. I remember well switching to Google from Alta Vista. The reason was simple; the results returned on AV were not relevant, and many were misleading. It was not because Alta Vista, the then number one search engine, started including ads.

    Today I ran a search on “ATKFastUserSwitching.exe”. I challenge you to do the same and find any useful information. If you don’t want to use the above term, open Task Manager, pick any process, and try a search for relevant, useful information.

    Is it Google’s fault they are being gamed – of course not.
    Will that make any difference to people looking for an answer – no.

  133. Hello Matt,

    Enjoy reading your blog and GREATLY appreciate your work and diligence to clean up the web! I’ve really been waiting to hear your thoughts on the new algorithm (Panda); I’m guessing finding time to blog in your line of work is rare… I’m the IT manager for an ecommerce site that was hit very hard by the algorithm, and it’s got me thinking: does Google classify sites based on type, such as ecommerce, blog, forum, article, educational, etc.? I’m wondering how all ecommerce sites don’t get classified as having “scraped” content, since the descriptions and features are generally exact copies of the manufacturer’s.

    Thanks for listening,
    Lucas

  134. Not only have the search results changed, but also the searchers. Google was just coming out in 2000, primarily as a resource for educational facilities; the world was starting to learn the power of the web, but it was nowhere near as mainstream as today. Everyone now uses Google, and the way that we use it has also changed. It is far more commercialized, in terms of its content and in the tool itself as a revenue generator for the company.

  135. I remember fully those days of search. Search Engines were annoying to use and you had to trawl through search results for several pages in order to locate a legitimate non-spam entry.

    You also tended to bookmark pages a lot more, and also manually entered urls of known web pages, because you couldn’t rely on SEs to help you locate good quality content.

  136. Good post Matt.

    I also remember the overwhelming spam back in the day. A few of my close friends at the time were very heavy into that sort of thing. I do keep my eyes open for more local upgrades to Google in the near future.

    Thanks again for the post Matt. You always bring the value. Cheers!

  137. Yeah, I remember that time with Google search, but times have changed and people are scrambling to get to the top of the search results for organic traffic. Unless they shy away from cheap advertising, they can still do well with advancedwebads, which is really cheap with its unlimited impressions and clicks for a flat monthly fee. They’ll surely get results.

  138. Back in the day when search engine options were severely limited, I kept documents full of links to key websites so I could easily return to those sites to find the information I needed.

    I no longer have to be an ad hoc librarian. For most things, a quick Google search will pull up the key sites I use often. Even better, if a new piece of relevant information has emerged, it often shows up in the query results, allowing me to stay far more up-to-date than with my old system of keeping static links in local documents.

    Google was a game changer. For me, it was a positive game changer, and I remain loyal to Google’s search offering because I recognize how much it did change how I obtained information on the web.

  139. I agree with the great comment from SEO Mentor: Google now incorporates all sorts of results on the first page, making it look kind of messy or flimsy.
    As a workaround you can log out of your Google account so they won’t identify your locality, and the results will be more relevant.

    I would suggest giving the user an option (similar to when you spell something wrong):
    Your search includes a locale. Would you like to see results from that locale, results from the Internet, or both?

  140. I agree with Martha Jennings’ comment above – Google is exceptional compared to other search engines, and the majority of the time it returns the results I would like. However, with the paid services (AdWords), sometimes the ads/results aren’t even relevant to the term/phrase you were searching for, which in turn doesn’t return the desired results. Maybe this is something that could be worked on?

  141. Spam is still working. Now they use things like wikis and Drupal, which also allow people to modify the content of a page. I don’t understand why this in particular can’t be identified. I know you say that you identify it – but if you have a look at the results of some searches, you will find pages which rank very well for very difficult keywords by using those spam methods.

  142. Hi, and nice article. Google is greater every day than it was the day before, and it’s obvious it will keep improving in the future.

    I had an idea for achieving more powerful searches, but I didn’t find any ear to hear it.
    So I just wrote it down for whoever would like to take it and develop it. I hope it’ll be Google.
    Maybe it could be interesting for you to take a look:
    http://stabbquadd.over-blog.com/article-la-recherche-par-contexte-62271087.html

    It’s in French, I know, because I’m French, but there’s a translation further down the page. It’s not perfect because, like many of my compatriots, I have no talent for languages. But I hope it’s understandable enough.

    I’ll probably be back on your blog someday, as some SEO related search query got me there, and that’s a kind of information I often look for.

  143. Patrick Oborn, spam master of monumental proportions, never seems to get punished for deploying oodles of fake content scripted and published by bots on a daily basis, as just one facet of his affiliate spam network worth millions. Now I know whose butt he’s been kicking for the past decade.

  144. I agree with Danny’s comment – one listing per domain/subdomain. I looked at one search query: there were sponsored ads, then the domain ranked #1 and had 7 sitelinks (taking up enough space for two results), then a “see more results” link. Subdomain1 ranked 2, subdomain2 ranked 3, and then there was another “see more results from this site” link. This is just crazy – this website is a one-product brand, and the subdomains were a blatant way for them to dominate the first page.

    I’m not sure about your examples of IBM and HP, Matt. You need to give searchers a little more credit. The results tend to be repeats of what is already in the sitelinks. If a searcher is looking for a “job at IBM”, they would type that. Same with “HP support” or whatever. Poor old Heinz, who make the lovely HP, only comes up at number 9.

  145. Google’s Panda strategy for 2011 is too transparent:

    “Fill Google Places and SERP results with poor-quality results so that more and more people start depending on the AdWords listings at the top and right for their business needs.”

    Brilliant extinction-level idea – Panda, what an appropriate mascot for the latest update.

    Blekko is already showing promising results; with a few million more users, Blekko will be able to weed out all the junk MFA pages from SERP results in no time.

  147. Sites like eHow and About are killing your results. I actually didn’t mind it much when, back in the day, Wikipedia was at the top of the list. I don’t understand why you still sell AdWords to sites like these; they’re easy to spot. Also, I’ve read a bunch of times that copied content is penalized, but from the results I’ve seen recently, this isn’t the case.

  148. Unfortunately I got the impression the recent update made the results more spammy than before, at least in the electronics sector. Sites with spun content, scrapers with nice designs, and Squidoo and HubPages spam are all over the SERPs, while smaller focused sites with good, straight-to-the-point information have literally disappeared.

    It looks like subpages of huge “jack of all trades” portal sites got a major boost, although they often offer little or no specific or useful information, just some very general stuff without real relevancy. The more subpages your site has, the better, no matter the content quality?

    I thought less, but good, content would be honored, but the opposite is happening right now. Kinda frustrating if you spend a lot of time on quality content.

    As for the multiple SERP listings: 1 or 2 is really enough, especially since multiple results often lead to the same or very similar content, so the extra results don’t add any extra value.

  149. Spammers are always one step ahead of Google; it seems Google finds it hard to catch up with them.

  150. I like it.
    You are all doing a fine job over there.
    It doesn’t have to be perfect, and 30% there is good enough 😉

  151. Matt,

    Can we get a definitive answer on the whole subdomain/subdirectory thing that is a little more recent? I know you said it is pretty much a wash, but it is the “pretty much” that will make my job a ton harder if we can’t use subdomains.

    JR

  152. I agree with the earlier comment. There should be a better/easier to use reporting system for spammy pages in the SERPs.

  153. It is funny how we all seem to remember only the good when we look back, and I guess that holds true for Google searches as well. Nothing was quite as good as we remember it.

  154. I understand that it is a never-ending story when it comes to improving results. However, I still see some sites that use very old tricks to boost SEO results, such as hidden text, transparent images, etc.

    I know there is a form to report a particular site, but how would one go about reporting a network of sites which use funny tricks to boost their main site? I cannot find where to send such a report.

  155. I have used Google in business since its inception. It has changed. Like everyone, I noticed the spam and junk sites in the results growing over the past couple of years, and now they seem to have addressed that with some success. The downside? A reduction in result quantity and depth.

    Yahoo, AltaVista and many others all deemed themselves “search kings” in their day and what killed them were limited and skewed results. Microsoft always tilted results and it is only after heavily pushing Bing (and frequently “borrowing” from Google searches) that it has started to get some momentum.

    What’s lost is the ability to deeply datamine. This is what I have done for years–not for personal information but for technical, graphics and scientific information. Obscure computer software references, old equipment hardware specifications, work-arounds and issues for hardware and software in various configurations.

    In art, I would look for particular representations or styles for comparison and sometimes for archived photography of specific events.

    In the science and technical fields, my sought after data might be an obscure paper by a relatively unknown professor or a blog by a specialist in a niche field.

    The ability to do these sorts of searches reliably through Google now is all but destroyed. You get the limited results they provide and have no capability to use Google to look beyond them.

    In an effort to stop scrape sites (which I applaud) and in an effort to be more “user friendly” (which I understand), Google has eliminated all other options and dumbed down the very thing that allowed them to supplant the other search engines I mentioned before.

    I’ll close with a brief example using graphics searches. I routinely would look through hundreds and sometimes thousands of graphics results to find the comparisons I needed. In such searches now, I’m fortunate to get two pages of pictures to view.

    Google does not need to discommodate the casual user to repair this–they simply need to enable options for serious searches that due to the implementation of current search algorithms, aren’t possible now.

    This isn’t a case of pining for “the good old days” or expressing gripes – this is an observation of a serious decline in the ability to sift quality information in quantity. As it is now, you take the information Google decides “you need” and have no other options.

  156. Hi Matt
    I think that on 90+% of search issues Google has improved over the years, until this multi-domain issue Danny Sullivan complains of.
    I know you understand what Danny means in the first comment, but look at this screen cap of a search for cottage signs.

    http://smg.photobucket.com/albums/v12/ibselin_too/web/?action=view&current=googleserpspam.gif

    I am starting to call this SERP Spam

    3 Adwords ads, then image results, and finally 11 pages from 2 domains.

    If people don’t like the results from the first couple of the most relevant pages from a domain, why display an additional 5 pages from that domain?

    Google is limiting the available choices; you know people hate to scroll through several pages to find what they want.
    It’s almost at the point where one or two sites occupy the whole first page. (Even though my clients often benefit from multiple listings, I still find this irritating for my own use.)

    Even though I know it’s not the case, I wouldn’t be surprised if rumors started to fly about Google shaping traffic to preferred sites.
    I would love to hear your opinion on how you feel personally, when you have to go through several pages of SERPS because the first page is dedicated to a couple of sites?

    Thanks Matt, I really appreciate your great info and work.
    Scott

  157. I noted the bit where you said that Google only returns two search results per domain. Why is it therefore that post-panda, I often see Amazon results in #1, #2 and #3? Not a criticism as such, just an observation.

    In respect of the article itself, Google isn’t perfect but there is no better alternative. I tried my hardest to prove that Bing and Yahoo were better than Google, for a post on one of my blogs, and failed miserably. None of them are perfect. The biggest issue that I have is seeing pages full of results from 2009. I often have to add ‘2011’ to the end of my search terms in order to obtain up-to-date resources. Try searching for the term ‘Is Bing Better Than Google?’ on each of the three big engines, and you largely find tonnes of articles from 2009. Google did at least provide one or two newer sources; Yahoo and Bing did not.

    Google search, the best of a bad bunch 😉

  158. Out of curiosity, I wanted to know whether the ‘host crowding’ factor has been ignored by Google search lately. Currently I work in the book industry, so ISBN search is something I do a lot, and seeing 4 results apiece from the same domains really drives me crazy.

    Please do something. It would really help if we see one (1) domain at a time.

    If you would like to investigate try this ISBN 0061284424.

  159. I sort of think of Google as a library. Back in 2000 the librarian did not know much about search results, so when someone came into the library and asked for a specific book, they would go off into the library and come back with all sorts of junk. Over time the librarian has become more experienced, and provides information which is much more relevant. Going forward I am sure search results will be even more refined. It’s like anything in life… You start, you learn, you improve…

    Bit of a weird analogy, but I always explain to people Google is like a huge library, and the “search” button triggers the virtual librarian to go off and get what you want …!!!

  160. I still think that exact match domains have too much power. But other than that I think that google does a great job of giving you the most relevant search results. Keep up the good work!

  161. Hi Matt, With regards to the Panda update, would an entire site be affected if there is thin content only on the subdomain? Or would both the subdomain and the domain be “pandafied”?

  162. I have left spam/link farm and scraping reports on the appropriate Google report form until I’m blue in the face over the last several months. One site illegally quotes a page of my site verbatim, and nothing has been done in months. Another site rocketed up the rankings due to high-ranking “link farm” pages. I’ve been reporting it and the link farms for months. Still nothing. Google states they are against such underhanded practices, but my experience shows they don’t care. Especially as the site bolstered with link farms is earning Google money, since it has AdWords on the pages of its site. – David Honaker, Phoenix, AZ David.Honaker@gmail.com

  163. Yes, search and searchers in 2000 were different compared to 2011. The results we are getting today are very different from those of 2000.

  164. Okay, I was going to have a damn good rant, but I think it’s better if I summarise it as follows: I started designing sites 11 years ago. My first decent site still holds a top-3 ranking, as do all the sites I developed up until 5 years ago. Marketing through these sites is still viable, although local search plays havoc with trade (we’re talking a real business here, not an “out to get traffic” site). However, given the recent updates to Google’s search algorithm, I wouldn’t even try to start a new business with a view to getting any form of business as a result of search results. Whereas 5 years ago I could have achieved a top-10 ranking for relevant keywords with 1,000 backlinks, the same simply wouldn’t happen now. Getting decent, relevant backlinks was hard enough back then, but now no-one even wants to consider it. Content may be king, but it certainly doesn’t drive relevant traffic, with the majority of keyword combinations relevant to any form of business already having been sewn up.

    The thing is, why should I have to pay for advertising, which costs pretty much a wage each month, when the returns are meagre at best? It’s probably the reason why we pulled the plug on AdWords and plunged our marketing division into more traditional marketing techniques that are known to produce a 2-5% return. It’s not that AdWords isn’t quantifiable; completely the opposite: it is, and it simply doesn’t add up. I can flyer-drop 100 businesses and get a return of 5-10% from them over the space of 6 months. With search, however, any business is now pitching against so many others that they’re crowded out.

    To sum up, the idea of having a website only works when someone lands on your site, which ain’t gonna happen without a decent brand that can be picked out from the crowd. So what’s the point in marketing online without a brand nowadays? Ergo all the websites in the UK turning to TV advertising to develop a brand, rather than Google.

  165. hi Matt,
    advice for you: to avoid this subdomain mess (e.g. http://to-avoid-this-subdomain-mess.mattcutts.com/), please change the existing rewrite rules in your root .htaccess to:

    # Redirect the bare domain to the www host
    RewriteCond %{HTTP_HOST} ^mattcutts\.com$ [NC]
    RewriteRule ^(.*)$ http://www.mattcutts.com/$1 [R=301,L]
    # Redirect every other hostname (i.e. stray subdomains) to www as well
    RewriteCond %{HTTP_HOST} !^mattcutts\.com$ [NC]
    RewriteCond %{HTTP_HOST} !^www\.mattcutts\.com$ [NC]
    RewriteRule ^(.*)$ http://www.mattcutts.com/$1 [R=301,L]

    regards,
    Alex
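
    For readers following along, Alex’s two rule blocks can be collapsed into one: any request whose hostname is not the canonical www host gets 301-redirected to it, which also covers the bare domain. A minimal sketch, assuming Apache mod_rewrite in a root .htaccess, with www.example.com standing in for the canonical host:

    # Assumption: mod_rewrite is enabled; www.example.com is the canonical host.
    RewriteEngine On
    # Any hostname other than the canonical one (bare domain, stray subdomains)
    # is 301-redirected to www, preserving the requested path.
    RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

    Because the bare domain also fails the !^www test, this single condition subsumes both of the blocks above.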
