Text link follow-up

At this point, it shouldn’t be a surprise what I have to say about any particular site (Hi Jeremy!) selling links. Danny gives a good recap here, and I’m happy that Danny can channel me and say what I would say at this point. Let’s see how succinctly I can say it. 🙂 Many people who work on ranking at search engines think that selling links can lower the quality of links on the web. If you want to buy or sell a link purely for visitors or traffic and not for search engines, a simple method exists to do so (the nofollow attribute). Google’s stance on selling links is pretty clear and we’re pretty accurate at spotting them, both algorithmically and manually. Sites that sell links can lose their trust in search engines.
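For readers unfamiliar with the mechanism, the nofollow attribute mentioned above is just a value added to a link's `rel` attribute. A minimal sketch of what a link sold purely for traffic might look like (the URL and anchor text here are made up for illustration):

```html
<!-- An ordinary link, which search engines may count toward rankings: -->
<a href="http://example.com/">Example Sponsor</a>

<!-- The same link sold purely for visitors: rel="nofollow" tells
     search engines not to pass ranking credit through it. -->
<a href="http://example.com/" rel="nofollow">Example Sponsor</a>
```

Human visitors see and can click both links identically; only crawlers treat them differently.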

Okay, everyone should expect me to say those things. Let’s lighten up this post a bit. Would anyone be surprised to find that some link buyers turn around and then sell links to other sites? And that those links may not be of the highest quality? Let’s take a concrete example. Jeremy vetted his sponsored links trying to remove anything reminiscent of blog comment spam, but take one of Jeremy’s sponsors, www.thisisouryear.com. Can you get from that site to the “Lesbian Gay Sex Positions” site at www.gay-sex-positions.com in two mouse clicks? Looks like there may be some scraped content on that porn site.

Just to be clear: it’s Jeremy’s site. Of course he can try any experiment he wants (YPN, AdSense, BlogAds, AdBrite, Chitika, Amazon affiliate program, selling links with nofollow, selling links without nofollow, offering flying lessons to the 10,000th visitor, selling pixels, auctioning lemurs, etc.) to make money. Many such experiments cause no problems for search engines. But if a web site does use a technique that can potentially cause issues, it’s understandable that search engines will pursue algorithmic and manual approaches to keep our quality high.

I take it as progress that most people would expect what I was going to post. So, other than the two-clicks-to-scraped-lesbian-porn, how many people could have guessed everything I was going to say? 🙂

130 Responses to Text link follow-up (Leave a comment)

  1. I’d like to see this lemur auction you blog of.

  2. 2 whole clicks? Somebody needs to find a way to better streamline that..

  3. With your suggestion of using nofollow I was wondering if you could “succinctly” (hehe) answer whether or not there is a penalty for overusing nofollow due to page rank hoarding, etc.?

    Great interview at Performancing btw.

  4. As one of the advertisers… this whole thing getting picked up everywhere is really annoying. Oh, well let me clarify… it’s annoying because our completely redeveloped site is going up late today or tomorrow… so now all this industry attention is going to see the crap we’ve had up for a few years too long. Other than that this would have been great. We jumped at the chance to have a link on a site that attracted like-minded people and potentially some customers. We also didn’t use superduper optimized anchor… at all, because we do just want the traffic… although some PR couldn’t hurt. We also didn’t want ROS anchor text…

    Hopefully we won’t draw the wrath of M. Cutts, as we’ve been doing reasonably well for a few years before/without this link, which has had no measurable SE impact as of yet.

  5. > Many people who work on ranking at search engines think that selling links can lower the quality of links on the web

    How does selling links harm the web user any more than selling graphical ads does?

    I think it would be more accurate to say:

    “Selling links lowers the quality of search algorithms employed by for-profit entities such as Google.”

  6. Hey Matt,

    Sorry for being a bit out of context here but I just visited your blog through a link found on some other site and was wondering as to why you preferred to use word press instead of blogger which is the product from the same company that you work for?
    Any insights into your thoughts?

    Regards,
    –An avid Internet browser

  7. “Sites that sell links can lose their trust in search engines.”

    Ah. The “Trust” word. Will Jeremy lose TrustRank because of this?

  8. Once again, great comments, Matt. I am the perfect example of why selling and buying links are bad. I am not a webmaster, and heck, until 2 years ago I didn’t even know what PR was…LOL.

    Not knowing any better, I listened to a webmaster I hired who suggested I purchase links to get my site ranked higher. Being in real estate and understanding that position within the search engines was important to a major portion of our business, I followed his advice and was paying at that time $2000 per month for these links.

    He was right about buying the links, within 7 or 8 weeks I was in the top 3 in Google and Yahoo for my top keywords, city plus real estate and city plus homes.

    After about 3 months, I fell as fast as I got to the top. By taking the advice of my webmaster and not knowing any better, I learned one expensive lesson, which cost me at that time over $17,000 with nothing to show for it. I am no longer in that city and that website isn’t worth a dime to me. All the hand-written content that I placed on the site is now worthless. The money I spent to have the site designed and coded is now worthless. So now, IMHO, Google cracking down on those that purchase links and sell links may prevent others who don’t know any better from allowing these sharks to con them out of their money.

    If you are interested Matt in taking a look at this site I am referring to its http://www.atlanta-real-estate-homefinder.com. I am no longer in that market but you might be able to use this as a tool to make a point with.

    Also, I asked this before and didn’t get an answer, but is Google doing anything about the real estate industry as it pertains to link farms, doorway pages, and purchased links? This is a major, major issue in Las Vegas. Yes, that is my market, but if most of us play by the rules, shouldn’t the rest of us be held to the same standards?

  9. > “How many people could have guessed everything I was going to say?”

    I heard from a colleague that Danny Sullivan did an excellent Matt Cutts impersonation during the evening session.

    Matt’s comments on link building remind me of the old Carnac routine from Johnny Carson. We all know what he’s going to say before he says it.

    If I’m reading the Google patent application from earlier this year correctly, my understanding is that there’s an aging factor involved, much like fine wine. That, in and of itself, should discourage someone wanting to pay $300 a month for a link from Jeremy’s blog, where the full benefit may not be felt for up to a year.

    All this link building discussion will be less important if latent semantic indexing becomes a major force in the algos. But LSI will probably require compute power at least several orders of magnitude greater than is currently available. And even then, I doubt that LSI will truly produce better search results. For now, quantity of links, followed by quality of links, are the two ways to force-feed the SERPs.

  10. Because graphical ads are generally obvious to the naked eye. They’re usually self-promoting in their “text” or whatever they use to grab the user’s eye, they often don’t match the layout of the site (creating a colour clash), and they’re usually bigger.

    A text link could be buried anywhere in a site’s code by a savvy web designer, and may appear in the middle of code that has been indexed by an engine, but at the top or bottom of the page within a browser. Plus, text links are quite often completely innocuous and there is no way that a user can conclusively determine whether or not a link is paid for.

    The user, therefore, has to go on the somewhat naive assumption that the link is there because it’s a link of some quality, not because the owners of the linked site forked over a bunch of cash for the link.

    Paid text links don’t lower the quality of the algorithm because the algorithm is the exact same regardless. It’s a formula, nothing more, nothing less.

    The problem with paid text links is that it creates a potential scenario where search results become indirectly sold to the highest bidder.

    Like Matt says, if it’s a potential for traffic that the user wants, then the linking site could include a “nofollow” tag, obfuscate the link with a Javascript, make it a graphic link and use no alt text to describe the graphic in question, embed it in a Flash, whatever. There are ways around this.

    Yes, they’re all an absolute pain in the ass, but they do work.

    The only issue that I have with the whole post is the idea that Jeremy has to remove sponsored links or alter the manner in which he presents them in order to please. In this case, Jeremy isn’t linking to a porn site or a scraper site directly. He’s three levels removed from it. How can he reasonably be expected to go through every advertiser’s site 3 levels deep to track if Bob the advertiser links to Ted who links to Sally’s scraper lesbian porn site?

    And does that mean Jeremy should be punished for someone doing something stupid 3 levels down the chain? I don’t think so. That advertiser could have been added before the scraper site made its way in. The advertiser has about a gazillion links on the linked page, and while the link in question is fairly obvious, it’s not Jeremy’s site to control. So why should Jeremy be punished for this? I don’t get it.

  11. I never would have guessed you were going to say anything about auctioning lemurs 😉

  12. http://ezinearticles.com/?Organic-SEO-And-Link-Building&id=111464 This is an article I just submitted to ezine articles about link building. I hope it is useful and accurate.

  13. Personally I think there is way too much emphasis on optimizing your site to rank well in search engines. Search engines should be the ones worrying about rankings.

    Nevertheless, if you don’t spend your hard to come by time optimizing your site so search engines making billions of dollars can find you, you won’t be found by anyone.

  14. Matt: “Many people who work on ranking at search engines think that selling links can lower the quality of links on the web

    Then they are mistaken. It may lower the quality of the serps, but that’s not the web.

    Matt: If you want to buy or sell a link purely for visitors or traffic and not for search engines, a simple method exists to do so (the nofollow attribute).

    A simple method has always existed, Matt. It’s called placing a hyperlink in a web page, and it pre-dates search engines. rel=nofollow is only for the engines, and the engines are not web.

    Matt: Google’s stance on selling links is pretty clear and we’re pretty accurate at spotting them, both algorithmically and manually. Sites that sell links can lose their trust in search engines.

    No problem. Search engines are free to count and discount any links they want. It’s your engine and you can do what you want with it, but it’s rather arrogant to assume that bought links are bought for ranking purposes. Advertising was bought long before search engines were ever dreamed of. If search engines find it difficult to deal with it, it is a problem of their own making, and the solution must be of their own making.

    Yes, you would be expected to say those things, but you mean them, so they need responses. I’m sorry Matt, but the engines’ links problems are of their own making. The answer is to stop putting so much emphasis on link text, and hurry up with a radically different way of ranking pages, but you know that, don’t you 😉

  15. when you talk about “penalizing” jeremy for being 3 levels away from this other site, I really hope the people at google aren’t actually using penalties.

    Think about it. If being linked somewhere could hurt your site, what’s to stop all of your competitors from submitting your link to, or buying a link to your site on, these places?

    Just pay $200 and link your competitor’s site and watch them fall out of Google? That would be fun!

  16. Good evening Matt

    There is a moral issue involved here!

    ” Sites that sell links can lose their trust in search engines.” isn’t enough at all.

    When a site is not in accordance with Google’s webmaster guidelines, a penalty should be expected, no matter the name of the site owner. Right?

    I believe that a site should lose its PR value (say, for 3 or 6 months) if Google spots it, algorithmically or manually, selling links without rel=nofollow.

    For example, Jeremy Zawodny’s blog has a PR8

    Why should such site continue having that PR8 if it is selling links without nofollow in daylight?

    Such site deserves not less than a white PR bar, IMO 🙂

  17. >> search engines will pursue algorithmic and manual approaches to keep our quality high…

  18. What we fail to realize is that search technology is constantly evolving and it takes time to get it 100% right. It’s not like email anti-spam services or antivirus software, where you get definition updates daily. I hope it will get to that point, and we can have more personalized, relevant search.

  19. Matt – Are you going to start selling links on your blog?

  20. > rel=nofollow is only for the engines, and the engines are not web.

    THANK YOU!

    Is it just me, or does anyone else have a problem with the fact that we’re ‘supposed’ to mark up our code in such a way that HELPS GOOGLE MAKE MORE MONEY? (it helps Google’s SERPs stay relevant… and when the SERPs are relevant, there is more longterm revenue from AdWords)

    Do you not see the hubris in asserting that Google (or search engines) = the Web?

    So why should a webmaster have to help Google and Yahoo! and use a nofollow? Last time I checked, those two companies weren’t charities.

    [ / troll ]

  21. Matt – if you penalize Jeremy you take one of the best sites out of view of Google users. This could have the desired chilling effect on link selling, but it’s not consistent with providing users with the best content for the query.
    I don’t understand the significance of “two clicks away”. At Google.com after a number of common queries (e.g. “teens”) the user is only ONE click from highly objectionable content.

  22. I’m glad that Google is taking a stand against this. If a webmaster isn’t buying or selling links for PR then they should have no problems placing the rel=nofollow tag or they should be willing to accept the fact that the search engines won’t rank them as well.

  23. This smacks of whackness. Google is allowed to sell “Sponsored Links,” but anybody who tries to make money on their own website by offering a service like paid web site reviews (on related topics, mind you) gets penalized. Whack. And anti-competitive… I hope you all continue this and get sued. End up like INTL, MSFT and all the other anti-competitive entities out there…

  24. But wait. I thought the original PageRank algorithm and link quantity/quality matter very little in the current search engine algorithms. At least that’s what I’ve been told every time someone from the press suggested that Google’s engine was a one-trick pony.

    If the backward link count/analysis matters little, it shouldn’t screw up ranking algorithms by any noticeable amount.

  25. >>Just pay $200 and link your competitor’s site and watch them fall out of google? That would be fun!

    What stops you?

    There are legal questions arising from a method like that: definitely civil, and quite possibly criminal, aspects to it. People should seek counsel before they even think of trying that. You do not have the jurisdiction or authority to place ads on behalf of an entity without consent. That is like Pepsi bringing out bad Coca-Cola ads without them knowing. LAWSUIT. Now that is not Fun!!!

  26. Great stuff! I love it when spammers are busted and all their pals jump to their defence 🙂

    To all saying “Why should we have to……” You don’t have to, just like Google doesn’t have to rank you. All very simple stuff IMO.

    The patently clear motto IMO is: if you buy links, only pay for click traffic. If paying for click traffic, keep in mind that Google can deliver thousands of times more targeted click traffic than most sites… and it’s FREE!

  27. We are only nano-moments away from a Google browser toolbar plugin that turns some percentage of Google surfers into a quality control task force. They will be paid a small amount, like the Amazon HITS system (mechanical turk), for completing a short course in grading web sites. Then as they surf, they will use a simple drop-down dialog box from the desktop plugin, with several Google specified checkboxes and 5 point scale items for web quality metrics, site tagging, and abuse/spam reports. Then those people will receive additional micropayments for every site they grade, with an AdSense-like smart pricing algorithm that raises or lowers their pay, based on a performance quotient. The performance quotient itself will be policed by peer members in the Google Quality Control corps, who will grade each other in an anonymous fashion in realtime.

    You heard it here first. 🙂

  28. RE: “Just pay $200 and link your competitor’s site and watch them fall out of google? That would be fun!”

    I doubt it, all you would do is pay for people to visit your competitors site! Not my idea of fun 🙂

  29. maybe we should all follow WMW and then there would be nothing to search for at the engines ,no breaking of any rules and the world would be an utopian paradise 😉

    cheers

  30. Practice what you preach, man!

    Do a google search for any of the keywords you mentioned in your posting and tell me that you have ZERO sites with questionable content! And don’t get me started about the displayed adwords links… That’s more people PAYING Google to list questionable content. I guess if you pay Google directly it is not a problem, right? Paid text-links are BAD because the ad-money could be going into adwords directly.

    Matt, while your blog is interesting to read, in the last couple of postings you’ve managed to spread an image of Google the Web-Enforcer, Google the money-greedy “we know everything about you guys” engine… It makes me sad.

    I’m sure you can track me and my sites down, ban my “spam sites” (which, btw, Google personally delivers the ads to, and which make lots of traffic for Google’s advertisers and give Google half or more of the ad income. If my spam site ranks higher than your “natural” site, perhaps the issue is not the spam site but the algorithm itself? And if the visitors to my spam site are directed to the things they really wanted, through AdSense/AdWords, is it not the RIGHT experience for THEM?).

    Next thing we’ll see is html-validation and spell-checking required for ranking. No, Google doesn’t validate, but they link to questionable sites as well, make money selling links to questionable sites and link without nofollow. I see their rules don’t apply to themselves. “Do no evil” and only do good to those who pay.

  31. Lightening things up a bit…

    > two-clicks-to-scraped-lesbian

    I think I’ve just found a great domain name for a site catering to active girl-lovers.

    Or, taken in part (“Scraped Lesbian”) a GREAT name for a hard rock girl band.

    Thanks, Matt!

  32. Supplemental Challenged

    The nofollow thing doesn’t pass the “do things for users, not search engines” test. Some paid links reflect exactly what they should reflect: some huge/authority/important site advertising on various appropriate web properties to gain more visitors. These links do reflect relevance, importance, and quality. Other paid links are completely useless… trivial sites buying PageRank or link text volume from some wildly off-topic site.

    Classifying these two together is poor search engineering. Discerning the valuable from the non-valuable is hard, but that is a search engine’s job.

    Quality websites take responsibility for their decisions, be they advertising or not, while lower quality websites reveal themselves via their random paid linking. The engine’s inability to grasp niche relevance and authority is the problem here.

  33. I don’t think a punishable relationship should be fabricated between SiteA and SiteC just because SiteB links to them both.

    Google links to a number of porn sites. Should anyone linking to Google be penalised?

    Or should people manually check every site they link to every day to ensure that they still have the same content/whatever that they did when originally creating the link?

    Google needs to stop relying on links rather than hope to change the huge, unsorted, irrelevant, paid, unpaid, useful, useless, spam, non-spam mess of links between sites.

    Using your analogy, I could probably find porn within a site or two of Microsoft.com. To test that theory, I did:

    http://www.microsoft.com links to http://www.asp.net links to
    http://weblogs.asp.net/scottgu/archive/2003/09/23/28748.aspx links to [insert long list of blog comment spam]

  34. “www.microsoft.com links to http://www.asp.net links to
    http://weblogs.asp.net/scottgu/archive/2003/09/23/28748.aspx links to [insert long list of blog comment spam]”

    Funny, so does Matt Cutts’ blog (now!).

  35. With no need to read the comments above, let me just say that I agree 100%. It’s very hypocritical IMO for Jeremy to do this. He can sell as much advertising as he likes and use his Yahoo! given fame to do anything that he likes, but going against something that is being preached by search engines, that is extremely offensive to anyone who has been trying to do what is right!

  36. wow.

    I just can’t believe how much traffic this is generating on the web in general – both for and against. I mean, everyone in the industry is blogging about this.

    Don’t we have anything else to talk about? It’s not that I don’t have an opinion but god, isn’t there anything else out there worth talking about?

  37. >Is it just me, or does anyone else have a problem with the fact that we’re >’supposed’ to mark up our code in such a way that HELPS GOOGLE MAKE >MORE MONEY? (it helps Google’s SERPs stay relevant… and when the SERPs >are relevant, there is more longterm revenue from AdWords)

    You do not have to do anything. You only need to abide by Google’s conditions if you want to do well in the search engine. The choice has always been, and always will be, up to the webmaster. Claiming they are dictating or forcing you to do something is silly. You are free to ignore Google all you want.

  38. >> Great stuff! I love it when spammers are busted and all their pals jump to their defence

    I’m with you on this, Dave. Watching them go is pretty funny.

    JasonK:

    >> Matt, while your blog is interesting to read, in the last couple of postings you’ve managed to spread an image of Google the Web-Enforcer, Google the money-greedy “we know everything about you guys” engine… It makes me sad.

    To who?

    While I don’t agree with everything Matt has said (and I think Matt’s reasonably intelligent enough to assume that not everyone would), how does him posting warnings that Google is aware of less-than-completely-ethical behaviour and allowing webmasters the opportunity to correct any errors in terms of their coding, link sales, advertising sales and/or SEO spread this image?

    In order for him and Google to spread that image, they’d have to go to each and every one of our sites and use us as examples.

    >> “And if the visitors to my spam-site are directed to the things they really wanted, through adsense/adwords, is it not the RIGHT experience for THEM?”

    That is just stretching. First of all, you admit to spamming the engine (and then have the balls to go off half-cocked.) Second, if a user detects a low-content spam page, they’re generally a lot less likely to click through to something else.

    >> Next thing we’ll see is html-validation and spell-checking required for ranking. No, Google doesn’t validate, but they link to questionable sites as well, make money selling links to questionable sites and link without nofollow.

    I’d love to see validation and spell checking be required, especially the latter.

    Hell, I’d like to see it taken one step further and see XHTML 1.0 Strict validation be required before ranking.

    Proper spelling makes documents a lot easier to read and serves the dual purpose of avoiding the trick a lot of webmasters STILL pull whereby words are intentionally misspelled to gain competitive ranking on them.

    Don’t believe me? Here, check this out:

    http://www.google.ca/search?sourceid=navclient&ie=UTF-8&rls=DVXA,DVXA:2005-04,DVXA:en&q=ringtoen

    Spell checking gets rid of that issue in a heartbeat. There is the possibility of collateral damage as the result of uncommon words (such as my last name), but overall it’s an idea that makes a buttload of sense.

    Validation (especially XHTML) goes a long way toward ensuring that people will be served pages that won’t crash, will load more or less the same in browsers, and will in general provide a positive user experience. The latter is something of a synonym for relevancy.

    Hey Matt, run with the spell checking and validation thing. JasonK just came up with a hell of a good idea.

    PhilC:

    >> Then they are mistaken. It may lower the quality of the serps, but that’s not the web.

    It does, however, degrade the quality of the user experience, and that’s bad enough.

    If the average web user is on Site A, and they click through to Site B, then they’re generally assuming that Site B is of some relevance and that it is something worth visiting.

    However, if Site B purchased that hyperlink from Site A, then the user’s own relative naiveté is used against him/her and their behaviour becomes influenced by a relatively undemocratic force.

    Boney:

    >>Is it just me, or does anyone else have a problem with the fact that we’re ’supposed’ to mark up our code in such a way that HELPS GOOGLE MAKE MORE MONEY? (it helps Google’s SERPs stay relevant… and when the SERPs are relevant, there is more longterm revenue from AdWords)

    And therein lies the precise reason that the code SHOULD be used on sold links….the exact opposite behaviour holds true. The results become less relevant when hyperlink selling for SEs occurs.

    While I agree that it’s not up to us as webmasters to ensure search engine relevancy (that’s their job), we owe them a certain ethical responsibility not to do things that would influence the engines in a less than organic fashion, since that negatively impacts the user experience.

    Search Engines Web:

    >> — In a PURE world, there would be NO Sponsor links AT THE TOP of the SERPs
    — In a PURE world, there would be NO Sponsor in the Yahoo Directory
    — In a PURE world, there would be NO Sponsor or Adwords in the SERPs – but one link where Surfers could click and go to a separate Sponsor Page!!!

    And in the real world, the search engines would go bankrupt in about six months. Besides, no one is forcing you to click the links. If you don’t like them being there, use one of the tier-2 engines that have next to no results and no sponsor links.

  39. Glad you liked that, graywolf. 🙂 I liked your “click here” experiments you tried, btw. More people need to do a few empirical experiments like that.

    Rob, I agree. Here Alexa opened up a way to rip over a large chunk of the web (not unlike the Google Programming contest where we provided a CD with example .edu data to operate on), and it barely made a ripple. Weird, esp. since our position on selling links has been out there before..

  40. Engadget and most of WeblogsInc are selling Text links all over their sites (engadget sells a text link for $2k per month)

    Being realistic people don’t really click them, they are being sold and bought purely for SEO / PR / link building.

    So….when you are going to penalise Engadget?

  41. Can you get from that site to the “Lesbian Gay Sex Positions” site at http://www.gay-sex-positions.com in two mouse clicks? Looks like there may be some scraped content on that porn site.

    Matt, if that’s the case, then Google should be penalized as well. The DMOZ clone located at http://www.google.com/dirhp is full of websites from which you can get to “Lesbian Gay Sex Positions” type sites in two mouseclicks or one mouseclick or that are just these.

    I don’t think that we should research every single site that wants to advertise with 3 levels deep. With the rest I second PhilC.

  42. Matt, are you going to take jeremy’s link out of your blog? We can find a porno site in three or four mouse clicks from this blog, then. 😉

    Great comment, PhilC.

  43. quoting Adam’s quote here

    >> Then they are mistaken. It may lower the quality of the serps, but that’s not the web.

    >It does, however, degrade the quality of the user experience, and that’s bad enough.

    So what are your thoughts on when Google allows KNOWN SPAM SITES to still display Google ads to fund the sites that pollute competing search results?

    To me that is the single biggest issue which really shows how self serving their linking tips and quality of web rhetoric are.

  44. Lesbian what? Damn, I got to go have a look at this stuff..er for purely educational purposes. 😉

    I am no expert, but the way Matt has explained it is that paid links will not get the same bang for your buck anymore; they will not help, but will not hurt, at the same time? The seller of the links at the top of the tree will be devalued and lose PR when discovered, so you will be paying for nothing. Correct?

    I can just see it now, an admin. is getting a little broke and tired of waiting, he packs up a suitcase and kisses his wife and child goodbye. He is going to a top secret linking trading meeting, a place where people trade in darkness.

    I am practicing meditation, looks like website building is a task for those who can delay gratification. Oh well…

  45. Apples to Oranges 🙂

    RE: So what are your thoughts on when Google allows KNOWN SPAM SITES to still display Google ads to fund the sites that pollute competing search results?

    Spam Sites can still dish up relevant ads for users, they just don’t rank for their spammy tactics.

  46. I have been trying to get a hold of someone at google for over 2 months now, with no luck at all. I am building a website http://www.copystopy.com which uses the google api to help the website find text on a webpage that matches other web pages throughout the web. Which in turn will help to fight plagiarism and copyright infringement. I need written consent from google to use the api for this business idea. I am hoping that you can put me in contact with someone that works for google to help me out.

  47. In another move designed to appear emo-ish, Google is penalising sites who use offline advertising as it lowers the quality of their annual report.

    Google claims to have already detected a number of businesses using offline advertising to make searching unnecessary and believes this is a practice which can harm the web.

    “People are free to use billboards, phone directories, signs and tv/radio commercials”, says Matt Cutts, “but to avoid penalty you must display a disclaimer saying the product is irrelevant. If you don’t you risk people thinking a non-Google advertisement is relevant, and that is impossible!”

  48. Matt glad you liked them, I agree I wish more people would test things like that.

  49. >> So what are your thoughts on when Google allows KNOWN SPAM SITES to still display Google ads to fund the sites that pollute competing search results?

    I have no idea what you’re trying to ask here, since there are two seemingly different thoughts.

    I don’t like the fact that Google displays ads on low-content sites that were created for spam purposes. However, I personally haven’t seen a “KNOWN SPAM SITE” (i.e. a banned site) that runs Google ads.

    Could you be so kind as to provide an example?

    And that doesn’t even take into account pages that Google has been able to block for obvious spamming.

    As Dave quite rightly pointed out (hey Dave, you’re like the only one making any damn sense in here right now), the low-content sites tend to rank, at best, for terms and phrases so obscure that the average bear wouldn’t go looking for them anyway.

    Do I think Google needs to take a more stringent approach to who and what shows their ads? Absolutely. And now that I’m aware of it (thanks to another post Matt made), I’m going to do my part to report the things that I see that are spammy.

    I also know that they do remove spammers once they catch them, because one particular scumbag that I know of lost her account once she was caught doing it (and has been bitching about the engine ever since).

    And since I have no clue what you’re trying to say with the second part (or how it relates to the first part), I can’t answer that.

  50. I think Google is having a tough time waging this battle against paid links and this is something that worries them.

    Matt seems very eager as of late to let us all know that paid links are being detected and negated. When the sandbox first came out, we heard nothing about it from Google and still don’t hear much about it. Why? They are in full control and comfortable with how the sandbox is working and they don’t need to talk about it. With paid links, they could use the help and want to get the word out.

    Sure, Google has found some paid links, but they are missing many and they know it too. That is why this is also a PR (public relations) battle with SEOs. Don’t bother buying links, we’ll find them and they will be worthless: putting out that statement and having people buy into it makes their battle easier.

    I think a lot of resources are being spent by Google in having the algorithm handle this in addition to the manual efforts required to find and negate paid link networks and Google would like to ‘educate’ webmasters so they don’t have to fight that battle.

    Something I find ironic: advertising on Google and having ads displayed in the ‘content’ network also counts as backlinks. I know because I have seen it happen to my sites (although I never advertised for that reason). I cannot remember if it was Google or another of the big three that picked them up, but at least one search engine did for sure. We never hear Google referring to advertising on their network as link buying, but it is no different.

    I also don’t believe that link buyers appearing on these sites have nothing to worry about. At the very least, I suspect, link buyers appearing on ‘paid networks’ are probably brought under greater scrutiny and perhaps can spend longer in the sandbox. If that is true, the sad part is that your competitors can in fact hurt you, or at least make you stay in the sandbox longer. Why would Google go through all this effort to detect paid links, then find a site with 80% paid links coming in, and then treat that site like all the rest?

    PhilC is bang on with his entire post, particularly:
    …the engines’ links problems are of their own making. The answer is to stop putting so much emphasis on link text, and hurry up with a radically different way of ranking pages…

    I think a great search engine should be only an observer of the web, choosing the most relevant sites for a query. With Google’s PageRank, TrustRank and overall high emphasis on links, it has gone on to shape the web and push it towards paid links. Now they need your help in stopping this.

  51. “However, I personally haven’t seen a “KNOWN SPAM SITE” (i.e. a banned site) that runs Google ads”.

    Here you go, Adam; take your pick.

    I am more concerned about the a-holes who scraped my articles from my water garden site via RSS feeds when it was brand new and ran them through articlebot. A few of them have PR; will they be my future competition? Lame!

    In fairness to Matt, he is not part of the AdSense team, but this is an issue that needs to be addressed; I am sure we all agree, yes?

    Good night

  52. I think the real words to live by are:

    Google owes you nothing
    and
    You don’t owe Google anything

    Some SEO guy posted that up and I think it sums it up very well.

  53. Oh, one more.

    I think you guys need to calm down about the paid link thing. Matt has said that they are just devalued, so stop paying for them; you are getting ripped off! In fact, most things that are a bit shady are not punished, they are just devalued, so you might drop off in the SERPs, but you only have yourself to blame, eh?

    and Adam, I know the sites I linked above are maybe not banned in Google, but they are absolute sh*t sites that decrease the chances of good sites succeeding. Think about it: once all this crap is gone, there is so much opportunity for those who have something fresh to offer. True?

    Can Google detect an articlebot document? I know I can, but can Google? I am very interested in learning about this… I am removing .rss from my new sites.

    My head hurts. 🙁

  54. How does selling links harm the web user any more than selling graphical ads does?

    Wanted to ask the exact same question. Link buying is no different than buying advertisement. It is just a very normal market force.

  55. Re: paid links

    Macy’s buys paid adverts in The New York Times…does that make them a less trusted company?

    Your argument here doesn’t hold water… you are pointing to a flaw in your system, not a reason why paid links are bad.

  56. >> We are only nano-moments away from a Google browser toolbar plugin that turns some percentage of Google surfers into a quality control task force.

    I’m fairly confident that this is happening and is a factor in the sandboxing. How many times have people like Matt said something to the effect of… “if you’ve bought a link for traffic, you’re OK”?

    So, a link on site A points to site B… Google knows that the link was followed, either through toolbar data, AdSense data or analytics data. If that link gets followed enough times, it is passing traffic and… err… ranking factors.

    Oh well, just letting a conspiracy out for the more knowledgeable folks to debate.

  57. One more thing…I went looking for high PR sites lately…so there are not so many PR 10 sites….

    What I want to know is why blogger.com is equal to newyorktimes.com and apple.com

    Are you telling me the quality of info from blogger is the same as The New York Times??!

    And that the quality of info from Apple or Adobe equals The New York Times?

    That’s crazy stuff…

  58. “Sites that sell links can lose their trust in search engines”

    Matt, in Vegas I asked you about the paper that suggested that page rank be scored like LDL and HDL cholesterol; PR weighted as good or spam based on its source. Your answer was that “it was an interesting read”.

    Sounds a little like there may be a lot more to that paper’s hypothesis.

    If for argument’s sake we extrapolate that to paid links, so that any PageRank gained from the link is scored as spam PR, that could negate or reduce the value of a paid link without necessarily hurting a site (say, Jeremy’s blog) for simply generating revenue (a little sympathy here, Matt, since his Y! stock isn’t doing nearly as well as yours).

    Now if I’m totally off here and Jeremy is screwed not only for making a buck, but also for linking to a page that is linking to a page about scraped lesbians (what if they were just rug burns?), then do I need to do a forensic link analysis of all pages I have linked to? Perhaps this is where the data from Alexa will come in handy.

    Regardless, Google is going to have to hire a few thousand people just to handle the forthcoming flood of spam reports about inappropriate 2+ level deep linking.

    Should we reference “scraped lesbians” in the subject?

  59. PhilC >> Then they are mistaken. It may lower the quality of the serps, but that’s not the web.

    Adam Senour >> It does, however, degrade the quality of the user experience, and that’s bad enough.

    If the average web user is on Site A, and they click through to Site B, then they’re generally assuming that Site B is of some relevance and that it is something worth visiting.

    However, if Site B purchased that hyperlink from Site A, then the user’s own relative naiveté is used against him/her and their behaviour becomes influenced by a relatively undemocratic force.

    —————————————-

    I have to disagree with you there, Adam. When links are bought, what is at the other end of the link is made clear in the link itself, or in the introductory text to the link. Nobody’s experience is degraded. People don’t assume relevance when they click from SiteA to SiteB – not relevance between the sites/pages, anyway. They assume that the destination page is about what the link says it’s about, and nothing else.

  60. Matt,

    We all certainly understand Google’s problem with paid links as I think each of us has seen SERPs for specific verticals ruled by those with large text link purchasing budgets. At the same time I do feel that Google’s stance on this is a little heavy handed.

    This whole line of reasoning becomes an infinitely slippery slope. What about paid directories? Should they also be considered in this discussion? How about merchants with affiliate programs that use traditional HTML links for their affiliate links? There are too many potential cases to list them all, but ultimately where do you draw the line?

    Every Google user does really want more relevant results but the real question should be how much are we going to be required to contribute to achieve them?

    The “nofollow” attribute is very useful but the comments on this post make it pretty easy to imagine a future where everyone’s offsite links include them. We are all only a few clicks away from scraper porn and most likely have been even when the internet was still in DARPA’s hands. I’d be willing to bet that Kevin Bacon would get a kick out of this.

    Regardless of anyone’s opinions, I’m sure that this theme has had a nice impact on your pageviews this week. It’s too bad you don’t have AdSense loaded, if you did I could buy a link with no SEO value. 🙂

    Take Care.

  61. Matt

    I must say I’m a little disturbed about all this talk about penalizing sites that sell links.

    I buy a lot of advertising for my site on sites that are relevant and which in my opinion will increase my traffic and also help raise awareness of my site with their visitors. The advertising can be either text or graphics; it really all depends on the owner of the website and what he is willing to allow. Personally I prefer the graphic route because I can do much more with it and have a higher probability of having a visitor click through to my website. I rarely consider, when buying advertising on these sites, what ranking benefit it will give me in Google. What I am concerned about is gaining exposure and traffic via their visitors, and if the traffic is not there, I just don’t renew the advertising.

    But in all the advertising I do, hardly any of the sites I advertise with put even an average amount of time into SEO, and if I requested that my link on their site contain a “rel=nofollow” they would look at me like I’m some space alien, or give me one of those stupid looks that my dog does when I ask him what he wants for X-mas. My point is, only people who are heavily into SEO know what the “rel=nofollow” link attribute is and thus would even consider using it. There is a large percentage of websites under the control of people who frankly couldn’t give a crap about Google or its standards.

    It would be a shame if I had to make it a condition of my advertising that the websites I advertise on know of, understand and utilize the Google “rel=nofollow” link standard, just so that my site doesn’t get harmed when I am simply trying to gain traffic from a relevant source. I have to agree with the other Matt: that idea sounds awfully predatory and also like an attempt to restrain trade.

    If Google, either algorithmically or manually, wants to devalue the PR or whatever value the links outbound from a site may provide in relation to Google’s SERPs, that’s fine.

    But what I am hearing sounds like you/Google are saying that if a site sells advertising and doesn’t utilize a Google-created link standard, then we (Google) are going to “PENALIZE” that site by reducing its trust and rank in our SERPs. If so, I would imagine that you will soon be on the receiving end of numerous lawsuits, which, when deemed in a court of law to have enough merit to proceed, will be escalated into a class action lawsuit. Google will then also probably become the focus of an anti-trust investigation by the FTC. Google can only hide behind the “our SERPs are ours and we can rank them any way we like” argument for so long.

  62. In another move designed to appear emo-ish, Google is penalising sites who use offline advertising as it lowers the quality of their annual report.

    Google claims to have already detected a number of businesses using offline advertising to make searching unnecessary and believes this is a practice which can harm the web.

    “People are free to use billboards, phone directories, signs and tv/radio commercials”, says Matt Cutts, “but to avoid penalty you must display a disclaimer saying the product is irrelevant. If you don’t you risk people thinking a non-Google advertisement is relevant, and that’s just not fair on us, the Do No Evil guys.”

  63. Matt,

    The fact of the matter here, no matter which side of the SE you stand on, is that text links work!

    Google sells them, Yahoo sells them, MSN sells them, websites sell them, advertisers get business, PR or NO PR.
    – Agreed!

    Here’s another issue. –>Relevancy.
    But where is relevancy accurate when an advertiser, let’s say an office supply company, advertises on a completely different industry site?

    You see it as a PR attempt, but from a marketer’s perspective, we see it as a targeted audience, a strategic traffic ad. The site visitors are all heavy users of the office product, so for the advertiser it is 100% the right spot. Relevant to the SE it is not, but relevant to the audience it is (do I sound like Yoda?). The SE, however, does not think, it only analyzes, so it does not know that the advertiser is actually reaching out to its target audience for exposure.
    – Disagree (probably one of the few, but there’s more to relevancy than straight-out categories; you’re overlooking some good stuff)

    So I have to say that you and I see ads differently, from a SE perspective and not from the marketers point of view. Today’s “smart ads” offer targeted audiences, branding, and prospects, so we actually tell our clients to forget the relevancy in Google because this is your targeted audience (Like AOL at the superbowl –sports & technology?). That’s TRUE marketing!

    How about trying to add that one to your algorithm. I think that could be called “Real-Life” Relevancy! Get any ideas?
    🙂

    Advertising online is the way of the web. In G’s efforts to create an interlinked world, it has created the largest advertising community ever seen and brought the online market into a whole new area, and this is helping the world economy.

    Google has always stated, or at least it used to, that no business should be reliant on the SEs as their only source of income, and advertising can offer more traffic, branding and prospects than SOME SEs.

    So while we understand G’s need to protect the algorithm, the question is whether or not G understands the needs of companies to have exposure in as many targeted places as possible without being penalized for it, or for embracing the new economy of online advertising.
    -Disagree

    Nofollow or not, an ad can be as powerful as a Search Engine! Our stats show as much traffic from ads as the SE’s, not bad!
    🙂

    What do you think? Don’t forget my relevancy idea, sound like a possibility?

    Best!
    -Alan

  64. I suffer from dyslexia and read things 3 or 4 times just to confirm I understand it.

    And from what I understand Matt has said, if you buy text links thinking it will help you in the ranks and get you to the top or close, you are wrong. If that stops one person spending USD 6000 to have links across the Jupiter network for a month, or whatever it costs on indiainfo.com, then Matt is doing a lot of people a favor by telling them: DON’T BUY, you’re wasting $$$, the links don’t count.

    Sites that are selling these paid links are doing it mostly on a promise that it will help you get better rankings, or they lead folks to believe it will. And they all do; if not expressed, it’s implied. Yeah, they mention the traffic also; they’re trying to make a sale. Then those sites should be penalized unless they change their practice. They are selling folks a false promise. Their PR is reduced because they are promoting the artificial inflation of PR, which does not work, and therefore they are not delivering the product they have sold. Fair.

    Lawsuits are usually because one has suffered damages. The people suffering the damages are the folk getting ripped off buying these links under these false premises.

    Maybe because I am dyslexic I seem to pick up on SPIN quicker than most. You know, seeing things backwards. Come on.

  65. Matt here’s an example of why I believe you are off base on this:

    1. I have a local service business with a web site.
    2. I advertise on local radio station, newspaper and content web sites that know nothing about “nofollow”.
    3. I buy these ads because of the click from local users and the exposure to my local community.
    4. Any SEO benefit of these ads is secondary.
    5. My business paying for these ads for months and months should be a great indicator to a search engine that I’m a great site for this local service.
    6. The sites I advertise on are never going to use “nofollow”. They don’t know or care enough to bother.

    Conclusion: whatever search engine is able to best figure out which links are relevant and which are spam wins. Best of luck Google, Yahoo and MSN.

    BTW, trying to influence the web with the “nofollow” tag seems very unlikely to succeed, at least beyond blog software. I sincerely hope that at least one of the search engines is not counting on this to happen. We all need somewhere to go for relevant search results.

  66. Adam Senour

    If you are going to be high and mighty with your attitude, please help learners like myself by practising what you preach. Otherwise I’m likely to think that you are not as white hat as you make out.

    >I’d love to see validation and spell checking be required, especially the latter.
    >Hell, I’d like to see it taken one step further and see XHTML 1.0 Strict validation be required before ranking.

    >Validation (especially XHTML) goes a long way toward ensuring that people will be served pages that won’t crash, will load more or less the same in browsers, and will in general provide a positive user experience. The latter is something of a synonym for relevancy.

    Hmmm. Try this page http://www.adamwebdesign.ca/

    Does not have the language declaration xml:lang=”en” in the DTD

    Fails validation on 3 points

    Error Line 120 column 27: duplicate specification of attribute “class”.
    Featured Client
    You have specified an attribute more than once. For instance, you have used the “height” attribute twice on the same “img” tag.

    Error Line 121 column 24: duplicate specification of attribute “class”.
    Featured Client

    Error Line 129 column 147: there is no attribute “target”.
    … background-color: #D6D6D6;” target=”_new”title=”Wedding Photography, Family
    You have used the attribute named above in your document, but the document type you are using does not support that attribute for this element.

    >Proper spelling makes documents a lot easier to read and serves the dual purpose of avoiding the trick a lot of webmasters STILL pull whereby words are intentionally misspelled to gain competitive ranking on them.
    > I’m going to do my part to report the things that I see that are spammy.

    Are you using your meta KEYWORDS

    and your use of [CDATA] to hide keywords as an example?

    –>

  67. OK, Matt’s blog obviously doesn’t nullify code tags

    Are you using your meta KEYWORDS

    meta name=”KEYWORDS” content=”Toronto web design, web design, web site design, website design, web development, web design company, web hosting, website submission, website marketing, e-commerce Toronto, ecommerce Toronto, web hosting Toronto, Toronto web designers, web services, Toronto website design, Toronto web site design, Toronto web development, ADAM Web Design”

    and your use of [CDATA] to hide keywords as an example?

    start comment code

    end comment code

  68. Matt, I’m just wondering: how many of the spam sites that you manually remove run AdSense? Could it be that with AdSense you (Google) are giving people the number-one reason to even try to rank spam sites? Sure, there are more and more possibilities, but AdSense is #1. Something to think about.

    Another thought: why do you still have to manually remove sites? Are the algorithms so easy to trick? If Google has to specifically devalue paid links, does that mean the algorithm relies on links too strongly? If paid links are devalued, does that devalue Yahoo! ($300/entry) and DMOZ (some editors bribed)?

    Regarding the (x)html validation / spelling verification: sadly, this will probably be the way that Google will (need to) go. How else can you do a proper block-level analysis and analyze the content if it is not available in a proper format and with proper spelling (word frequency analysis)? However, at the same time, this goes against the trend being set by devaluing paid links: it will benefit those who have the resources (and MONEY) to have someone do it for them. Joe Webmaster with his unique content will be left behind. Bye bye content, hello commerce…
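    The word frequency analysis mentioned above is easy to sketch. Here is a deliberately minimal toy version (an illustration only; nothing like what a search engine actually runs, and far simpler than real block-level analysis):

```python
# Toy sketch of a word frequency analysis, as mentioned above.
# Purely illustrative; a real engine's content analysis is far more
# sophisticated than simple word counting.
import re
from collections import Counter

def word_frequencies(text):
    """Lowercase the text, pull out word tokens, and count them."""
    words = re.findall(r"[a-z']+", text.lower())
    return Counter(words)

freqs = word_frequencies("Paid links are links that are paid for.")
print(freqs.most_common(2))  # [('paid', 2), ('links', 2)]
```

    Even something this crude shows why proper spelling matters to such an analysis: a misspelled word simply lands in a different bucket.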

  69. RE: “I suffer from dyslexia and read things 3 or 4 times just to confirm I understand it.”

    Your post makes me believe you have fully understood it and applied logic and common sense. It’s a pity those WITHOUT dyslexia cannot grasp such a basic concept.

    Come on guys, this isn’t rocket science. If you buy links for SE purposes you are likely to be wasting your money. If you buy links for click traffic then you are fine.

  70. > Come on guys, this isn’t rocket science. If you buy links for SE purposes you are likely to be wasting your money. If you buy links for click traffic then you are fine.

    That’s only the surface Dave. The real issue is that the web does not belong to Google. Keeping Google relevant is their goal and their problem.

    There is no benefit “for the internet” to embrace rel=”nofollow”. It’s not about making the internet a better place, it’s simply about making Google (the corporation) a better (more profitable) search engine.

    Would you allow Yellow Pages (or anyone uninvited) the liberty of dictating how you monetize your business Dave?

  71. RE: “The real issue is that the web does not belong to Google. Keeping Google relevant is their goal and their problem.”

    That would be correct and as such, nobody HAS to do anything Google requests. The choice is STILL in the hands of the site owner.

    Why is it so hard to grasp????

    RE: “There is no benefit “for the internet’” to embrace rel=”nofollow”. It’s not about making the internet a better place, it’s simply about making Google (the corporation) a better (more profitable) search engine.”

    I don’t agree! By being open, transparent and using rel=”nofollow”, the link mongers will be a LOT less likely to fool the unwary into thinking they can buy their way up the SERPs.

    Oh, also, as Google is now a publicly held company, millions of moms & pops ALSO prosper when/if Google makes more $$s.

    RE: “Would you allow Yellow Pages (or anyone uninvited) the liberty of dictating how you monetize your business Dave?”

    It depends; if they stated you must fit a profile to be listed in their Yellow Pages, I would weigh it up and decide based on the facts.

    Let’s cut the dramatization of this: nobody, anywhere, at any time is having Google dictate their business strategy. We ALL have a CHOICE, always have and always will.

    Imagine what the SE world would be like if Yahoo and/or MSN were top dog. I BET you would then see little but paid placement.

  72. Hi Ben

    “That’s only the surface Dave. The real issue is that the web does not belong to Google. Keeping Google relevant is their goal and their problem.”

    Please don’t try to make it look that simple. The facts are:

    As a webmaster/publisher I am a free man and can do with my sites whatever I want. I can sell links, create doorway pages, steal content from other fellow publishers and play with meta tags as much as I like. Google and the other SEs have no influence on that.

    On the other hand, it is Google’s and the other SEs’ right to make webmaster guidelines, which publishers MUST respect if they wish to be included in the index of any of those SEs.

    As such we are talking about free choice on both parts of the equation.

  73. > fooling the unwary into thinking they can buy their way up the SERPs.

    Looks to me like they can already buy ABOVE, and most of the way DOWN, the SERPs.

    I have Google set up to show 100 results per page. I searched for accounting software, and counted 85 ads. There were 3 up the top, and 82 in the right column, literally going down nearly the entire length of the results.

    > the link mongers will
    find another way, as always, and keep on publishing AdSense while they do it.

    > having Google dictate their business strategy
    Google has said if you want to sell text links, do it their way at reduced value or they will reduce the value. What’s that word where you have to choose but you’re only given one choice?

    > Imagine what the SE world would be like if Yahoo and/or MSN were top dog. I BET you would then see little but paid placement.

    See above post about accounting software. Yahoo and MSN have some above/below/beside, but nowhere near 85 in total.

  74. RE: “Looks to me like they can already buy ABOVE, and most of the way DOWN the serps already.”

    Sorry, totally disagree. From what I see in Google’s SERPs, ‘might is not right’ and moms & pops are on a level playing field with the business giants. I know this to be true, as I run a very small web business yet can outrank M$ on many of my chosen key terms.

    RE: “Google has said if you want to sell text links, do it their way at reduced value or they will reduce the value. What’s that word where you have to choose but you’re only given one choice?”

    No, what they say is: if you sell text links, charge only for click value. If you buy text links, make sure you are not paying for SERP boosting, as you are likely to become your own worst enemy. However, anyone can ignore this, take heed, or do it however they like.

    Google doesn’t have to list a site and any site can tell Google to go jump…However, only spammers and fools bite the FREE hand that feeds them.

  75. nuevojefe said:

    “Hopefully we won’t draw the wrath of M. Cutts. as we’ve been doing reasonably well for a few years before/without this link which has had no measurable SE impact as of yet.”

    I’m one of the advertisers on Jeremy’s blog also.

    I’m in the same boat as you… I’ve done and am doing very well organically and hate to think that I put everything I’ve worked for at risk just by purchasing a temporary link for temporary traffic. I had no idea it would have this much impact… I was thinking “High profile site, high profile traffic!”

    Matt, is there any sign of hope that I won’t get banished organically for purchasing this lone link?

    P.S. I’ve requested to have my link removed from Jeremy’s pages as this is getting way too controversial for my taste.

  76. Chris_Y:

    I don’t know where you got that keyword list from. It’s not the one on the site now. And the version of code on my site passes through Bruce Clay’s keyword density analyzer with only one issue (one I cannot resolve, so I’m going to have to leave it).

    It may have come when very old versions of sites and experiments started getting cached on my server. I restarted it this morning. That could have been your issue.

    As far as the comment tags, where exactly is it stated that what I’m doing is wrong? It’s stated in various SEO-type sites that there should be something for comment tags on every page. So I fail to see what’s wrong with that.

    Not only that, what did this have to do with incorrect spelling?

    The target=”_new” attribute is used because, at the time I did that site (last year), it was my first-ever attempt with XHTML. I had never used it before and at the time, did not know how to have the user leave the site and still keep it validation-friendly. I’ve developed a Javascript since then in that regard, but I’ve never been a fan of using JS to accomplish this purpose. Have you got a better way? I’d love to see it.

    That’s the only validation issue that I saw.

    I sense anger in you, Chris_Y. A cup of herbal tea may be in order.

  77. Matt,

    I have a question that came to me last night on the rel=”nofollow” issue.

    Let’s say Page A has 20 links on it. Of those 20 links, 3 apply the rel=”nofollow” attribute.

    Based on what you just said, no PR value will be assigned to the 3 with said attribute, which makes sense.

    The question becomes, what happens to the other 17 links in terms of PR distribution? Does the full PR get distributed evenly among the other 17, or does it get distributed among the other 17 as if there were 20 links to distribute PR to?

    In other words, does each of the other 17 links get 1/17 of the PR distribution, or 1/20?

    If it’s a case of the former, that does lead to the possibility of a different variety of blog spam. Let’s say I decide to start a blog. And with every blog post I make, I have some sort of “signature” (a la forum posts), containing a link with anchor text that DOESN’T use the rel=”nofollow” attribute. Because the other blog post comments have no distributed PR, a larger percentage of the distribution goes to my links.

    I’m not doing this (since I don’t have a blog because no one ever listens to me), but I do know of someone that is (among about 20 other spammy things).
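    The two possibilities in the question above can be sketched numerically. This is a toy model of my own, not Google’s actual algorithm (real PageRank also involves a damping factor and iteration over the whole link graph), but it shows what each followed link would receive under either interpretation:

```python
# Toy model of the question above: when 3 of a page's 20 outlinks
# carry rel="nofollow", does each followed link get 1/17 or 1/20 of
# the page's distributable PR? (Simplified; not Google's algorithm.)

def pr_per_followed_link(page_pr, total_links, nofollow_links, dilute):
    """PR share per followed link.

    dilute=True:  nofollow links still count in the denominator,
                  so each followed link gets page_pr / total_links.
    dilute=False: PR is split only among followed links, so each
                  gets page_pr / (total_links - nofollow_links).
    """
    followed = total_links - nofollow_links
    if followed <= 0:
        return 0.0
    return page_pr / (total_links if dilute else followed)

print(pr_per_followed_link(1.0, 20, 3, dilute=True))   # 0.05 (1/20)
print(pr_per_followed_link(1.0, 20, 3, dilute=False))  # ~0.0588 (1/17)
```

    Note that under the first interpretation (dilution), adding nofollow to some links does not increase what the remaining links receive, which would remove the incentive for the “signature link” scheme described above.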

  78. Sorry Matt, but where’s the price tag on the assumed bought links?

    Many catalogues list companies for money; is that forbidden now?? Most paid web promotion and advertising includes LINKS…

    What about link exchange? Is it not just as objectionable as link buying?

    Sorry, I don’t believe you. Maybe it is possible to recognize a handful of link sellers on eBay, but that’s not the solution to avoid all bought links and to clean the web of spammers.

    Best regards from good old Germany….

  79. Adam Senour

    Hehe. Just got a fly up my ass about people extolling the virtues of XHTML 1.1 when it will screw up most browsers when implemented properly, using the correct “application/xhtml+xml” parsing. Plus the XML prologue throws IE into quirks mode and screws up the CSS box model.

    I’ve still got your source code from this morning if you want me to email it to you, so you can see the keyword list.

    To open in a new window without using JS, you need to make a custom doctype. Accessify gives a good example here: http://accessify.com/features/tutorials/new-windows/

  80. Thanks to Google there will be less money spent on Internet to fight for PageRanks. This is bad for the webmasters and good for users.

    So, to make it clear: since I am a webmaster, I need to make a bunch of websites and point direct links at the main big website so it gains some weight in PR? Because I don’t think anyone would want to give away their high-PR privilege to other people for free, unless they get something in return: money.

  81. Chung – you should just wait for people (not sure who exactly …) to link to you of course!

    Adam – the easiest way is to just do this:
    <a href="the url" onclick="window.open(this.href); return false;">Your link is now valid and will work whether the user has JS or not</a>

  82. Ben: a variant of that is what I use on other sites. But as I said, I’m not a fan because it’s a JScript version.

    Chris_Y: I actually found that example once before, but never could understand or follow it.

    My interpretation is that, if I just copy and paste the code that buddy has put up and save it as something.dtd, that it would allow me to use both XHTML and the target module and still validate.

    The problem is that I don’t follow it enough to be confident enough to try it anywhere, nor have I ever seen an example of it being used other than the one they provide, which again, I don’t totally follow.

    I’d also like to understand exactly what’s going on with the code that I’m copying and pasting in case I should ever want to modify it down the road. With this example, I don’t.

    Is there any site that has dumbed down the explanation a shade?

  83. I wonder to what degree buying text links would hurt your listings. I was reading an article in the Salt Lake Tribune (http://www.sltrib.com/ci_3181115) and found that Yahoo Travel is buying run of site links on the bottom of every page.

    Interesting…

  84. Adam – is there any reason why you can’t just use Transitional instead of Strict? They’re both valid/well-formed XML documents. Transitional allows you to set the target, though.

  85. RE: “I wonder to what degree buying text links would hurt your listings”

    More likely to hurt your wallet as opposed to any SERP position. That is, unless you get into a PR-buying frenzy like the ones many SEO sites out there stir up. Then you could be banned altogether.

  86. As with SEO – is there white hat link buying and black hat link buying?

    There is a clear difference between a camera company buying a link on a camera site and a gambling company buying a link on a university website. Right?

  87. Ben,

    A hunch that eventually the web evangelists of the world will somehow be heard by people other than their own kind, and will somehow convince SEs, the average Joe on the street, and most companies that to develop a truly effective website, XHTML Strict is the way to go because it is the highest web coding standard available.

    Wait…that ain’t happening any time soon. 🙂

    Seriously, part of the reason is because XHTML Strict is the highest standard available, and where possible I’d like to stick to that. Here in Toronto, Dr*amw*av*r and Flash are still considered “essential web development tools” and the idea of using XHTML is so foreign for the most part that it gives me a certain competitive advantage.

    Oddly enough, the few people in the GTA who parrot the traditional pro-web-standards party lines are, for the most part, Dr*amw*av*r users, which to me is kind of like trying to make the PGA Tour by practicing on a miniature golf course.

    I just found a site last week that’s still using HTML 2.0….and is regularly updated. HTML 2.0! No typo. Two point oh.

    So basically, it’s a self-perceived competitive advantage in the market I’m in. That, and I’ve more or less gotten used to it over the past 18 months and have done some stuff with it and Photoshop I wouldn’t have dreamed of 2-3 years ago.

  88. Adam

    Texas Star uses the custom DTD on this example page http://www.texastar.com/tips/2004/target_blank.shtml; however, I haven’t found anything that explains the W3C instructions on building a custom DTD in words that I can understand.

  89. Matt,

    I’ve thought quite a bit about the no-follow and link+PR-selling issue, and though my reasons for being uncomfortable with Google’s stance on this have changed somewhat from my original no-follow commentary, I still think that Google’s heading down a slippery and unfortunate slope here.

    Specifically, Google is quite literally trying to determine *intent*… and until Sergey pushes through his brain chip implants idea, I see this as quite the losing battle for y’all.

    * * *

    Here are some other “bad intents” / “bad motive” examples:

    – Acme is one of my paid clients, and I gush rhapsodically and suck-up-ingly about Acme (with links, of course) several times in my blog over the course of a couple of months.

    – My acquaintance John, a blogger of considerably less talent and good looks than myself, is desperate for GoogleJuice, so he pays me one MILLION dollars to write up a 100% positive review of his blog (with, of course, a link to his site in several paragraphs).

    – I get really drunk one night at Molly’s (again) and lose a bet. Ahem, and the loser (me) has to write a limerick praising and linking to a cheesy porn site of the winner’s choosing. As a result, I’m stuck blogging about “ScrapedLesbians.com” and — let me tell you, Matt — that’s a humdinger to rhyme with stuff. TWICE!

    * * *

    In each of the cases above, I’m linking to a site that I don’t TRULY believe in or sincerely recommend. But it’s not on a nice separated or delineated list of “Sponsored Links” on the side. Nor do these links necessarily stand out amongst the quite-diverse set of links already in place on my multi-topic blog.

    In short, there’s no way in heck for Google to determine whether I’m selling Page Rank or simply abominably sincere in my bad taste (naturally).

    * * *

    Worse yet, I see Google’s publicized attempts to ferret out Webmasters’ intents as something quite likely to backfire. One of my friends — we’ll call him Jerk — has a very high PR site and he sells links for the express purpose of giving GoogleJuice to his ‘clients.’ Right now, he’s a clumsy oaf about it. But were he to stumble upon this discussion, he’d do what any savvy evil Jerk would do: Simply disguise sponsored links as editorial choices (e.g., within descriptive paragraphs).

    So, as a result of Google’s attempts to determine intent, Webmasters will simply blur that intent… blurring, specifically, the line between the editorials and the ads.

    * * *

    I have a rather off-the-wall but well-meaning counter-proposal. You guys practically own Firefox, right? 😉 Why not work with those fellas and the IE 7 peeps to make no-follow links look different than regular links (squiggly orange underlines by default, whatever). Then, at least over time, Webmasters using or not using no-follow could be judged not by Google, but rather by visitors’ ethic-o-meters.

    For instance, if I go to theoretically-trusted-news-site.com and see a bunch of links to questionable sites (whether editorial or advertisements) without link condoms, I’m probably going to think less of that site’s ethics and trustworthiness.

    In the long haul ideally we readers on the Web would think similarly of lists of sponsored links or advertorials sans link condoms on any popular sites (CNN, The Onion, etc.).

    * * *

    In other words, Matt, I’m proposing that a Web of Intent be judged and juried by informed readers, not by search engines. Give browser makers and readers the tools to see link condoms visibly, give them the information to understand what they MEAN, and I think that this will do more good long-term than a single search engine trying to determine (and potentially punish) intent without the benefits of mind-reading chips.

    Thanks for taking this into consideration 🙂
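
The browser-side idea above (make nofollow links look visibly different) can be sketched in a few lines. The token check is the only part shown running here; the styling loop is illustrative, and the wavy-orange style is just the commenter's example:

```javascript
// rel is a space-separated token list (e.g. "external nofollow"), so test
// whole tokens rather than substrings ("nofollower" must not match).
function hasNofollow(relValue) {
  if (!relValue) return false;
  return relValue.trim().split(/\s+/).includes("nofollow");
}

// In a browser, a userscript or extension could then mark the
// "link condom" links visibly, something like:
//
//   for (const a of document.querySelectorAll("a[rel]")) {
//     if (hasNofollow(a.getAttribute("rel"))) {
//       a.style.textDecoration = "underline wavy orange";
//     }
//   }

console.log(hasNofollow("external nofollow")); // true
console.log(hasNofollow("nofollower"));        // false
```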

  90. Oops! Given the stellar and more frequent contributions of Adam S. in these threads, I probably should start identifying myself as something other than just “Adam.” So, for now at least, “Adam L” it is! 🙂

  91. Matt,

    Since it often comes up within the topic of paid links, and because the Matt v. Jeremy ‘debate’ is just plain stupid anyway, I think it’s time you said something about the idea of “Google bowling,” just to put it to bed… which is where I’m going next.

    Oh, yeah, almost forgot – case sensitive ‘security codes’ suck, don’t blame me I voted for the other guy, and thanks to whoever came up with nofollow, it’s a very nice solution for those who ‘just sell advertising.’

  92. Jeremy Zawodny is an idiot.

  93. Here’s the problem, Matt, with your position. As an avid user, if I click a link and don’t like the experience, I won’t go to that site again. Or I’ll abolish any desire to click links on “Jeremy’s” site if he keeps putting Viagra or Porn links up.

    Your stance is heavily colored by the company you work for, because actual user experience is being conflated with what Google Thinks a User’s Experience is (Yahoo as well). User experience is survival of the fittest for a web publisher and should NOT be dictated by a Search Engine Dictatorship. If I get to your site and I’m bored, I’ll go elsewhere. That only impacts the website’s owners, not Google, not Yahoo, not anyone else.

    So why are you pretending to be on the “poor website owners'” side? PageRank doesn’t mean much to anyone but SEO/SEM people. Most user-experiencers (both web savvy and unsavvy) don’t give a toot about that little green bar at the top. It certainly doesn’t make me buy someone’s product. I care more about what other users have said about a product/site/etc. for my trust issues. So once again, I implore, why not let people buy links? So what? Or is it because that’s not money in Google’s Adsense/Adwords pocket — by small publishers capitalizing on their own ad traffic?

    just my 2 cents, maybe not worth much, but i’m curious

  94. Matt,

    I understand that Google is discouraging webmasters from building a network of sites. However, lots of those networked sites existed long before the webmasters heard about Google and PageRank, and link exchange between those sites was considered natural. Now, as Google starts penalizing those sites for interlinking: for honest webmasters who are caught in this situation, would you care to give us some recommendation on how to deal with it? Should we combine our network of sites into one single domain and redirect URLs to a single URL? If so, what is the proper procedure for doing that without tripping Google’s penalty?

    Best,

    Pin.

  95. I think the whole rel=”nofollow” thing for sponsored etc links ought to be in the webmaster guidelines, as it’s very clear and simple, and people can be pointed specifically towards it.

    Am I correct in saying that this situation is the same as the O’Reilly one, i.e. that the site wouldn’t itself be penalized (as long as it’s not really nasty stuff it’s linking to, say), but all external links from the site would become untrusted?

    By the way, how does this work for domains/subdomains etc? I think a follow-up post clarifying all of these issues would be great 🙂

  96. From a Google point of view, and this is only a GUESS: if the spider goes to webdirectory.com, a site about the environment, and then the bot leaves to a bunch of unrelated sites (Car Insurance, Santa Barbara Hotels, Point-of-Sale Paper Rolls), and this slows down G’s ability to deliver quality results quickly, then they have even more reason to penalize the site for selling the links. It is affecting their ability to deliver quality. And just as the sites selling the links are trying to make a buck, so is Google, by delivering better results.

    Exchanging links with irrelevant sites also creates the above problem, I guess. I think if one reads the Google guidelines carefully, it is obvious that paid text links and exchanged links fail the guidelines in almost every situation: they take away from “Create a useful, information-rich site, and write pages that clearly and accurately describe your content.” The guidelines clearly state not to use “tricks” to increase PR, and paid links and exchanged links are for the most part a trick to manipulate the engine. I know – I have been guilty of both.

    If Google feels like it is affecting the user experience, then it is their prerogative to do what they want to help increase the quality of the experience of searching with Google.

    Perhaps Google should add this to the Guidelines: Only link to sites you believe will benefit the viewer of your site and the site you link too should have valuable content for your viewers. Or something to clearly define the type of links you should have and not.

    I just suffered a removal of my site for “hidden text” recently. Believe me, when you see this: Importance of page (0/10), you will add the nofollow tags to the paid links in a second, just like I found the hidden text and got it removed in a flash. At least I got a warning letter; you might not get one.

  97. Matt, I just don’t believe that you can really detect purchased and sold links unless the webmaster is stupid enough to use a well-known brokerage network. Otherwise, you’re just speculating.

    I challenge you to detect links I purchase and sell with my PR7 site http://free-backup.info. You can’t!

    Sorry Matt, it was a nice FUD campaign while it lasted.

    – Chad

  98. This does not make any sense; text links have been around since before you could even put up a picture. Are you saying that the search engines (Google, MSN, etc.), which are all full of billions and billions of text links, use the nofollow tag? Do eBay, Slashdot, and DMOZ (a link directory) all use the nofollow tag, or is this rule just for us folks trying to feed our kids?

    You people (google) are nuts,

    Google will die just like the rest… of the past.. sooner or later, I guess I can start the process by ending my gmail account etc.

    BTW Google you are starting to sound a lot like old BILL did a few years ago when it was going to be his internet

  99. Hi Matt, I have a couple of websites and wondered what your thoughts were about interlinking between them? I currently do not interlink, as I think this is wrong; however, I notice many sites do it and it gets them good rankings. I expect the way to do it is to have the sites on different servers and under different ownership.

    An example of this type of linking that works is for the keyword Ringtones. The number #1 site has been for a long time http://www.ringtonesgalore.co.uk . It’s a nice enough site but has an interesting linking structure. Most links to the site are from http://www.pubsgalore.co.uk who in turn receives links from http://www.schoolsgalore.co.uk. Also involved in this chain are http://www.fansgalore.co.uk and check the design – http://www.davemcnally.com/lyrics/ . If these sites can get good ranking doing this sort of linking. Can I ? and is it best to move to different servers?

    Regards

  100. You know, if someone wants to buy links, let them. He works for Yahoo – so what? You say they have a search engine; how many of you actually use it? Come on, people. It doesn’t matter what he does; he still has more money than you and still works for Yahoo making more money than you. Just drop the issue already.

    We all know there are those that view it as wrong, but it doesn’t change the fact that he is still going to do it. Boycotting Yahoo might speak out somewhat, but it’s still not going to generate enough response to get anywhere. Eh. I’ve wasted more time on this than I wanted to by typing.

  101. > Oops! Given the stellar and more frequent contributions of Adam S. in these threads, I probably should start identifying myself as something other than just “Adam.” So, for now at least, “Adam L” it is!

    A belated (mostly because I never noticed) and a genuine thanks for the notice, kind sir. 🙂

    I figure if Matt’s going to take the time to give us enlightenment, the least I can do (and any of us for that matter) is to do the same. Even if we don’t all agree (and we never will), at least Matt gets the feedback he needs to make Google a better engine.

    By the way, I love your squiggly line idea for the “nofollow” links. I’d like to take your suggestion a little bit differently, and a bit further, and make the following suggestion (since it’s browser-independent).

    What if the Google Toolbar were used as a rough measure of text link popularity and the value of said links?

    To explain:

    Let’s say I’m on Matt’s site (wow, what a stretch, huh?) I make my post, Aaron Pratt makes his post, Adam 1A (the original) makes his post.

    We all know that Google, like any other engine, can extract hyperlinks from code, so the Google spiders will easily be able to determine where the links are in this code. So the three links mentioned above can easily be extracted for tracking-type purposes.

    Now…since the toolbar is browser-based, it can be reasonably assumed that the traffic data that Google sees from it is about 99.9999% real people traffic data (someone could try and tip the scales with it by automating something, but Google could quickly and easily pick that up, as they presumably do with AdSense already).

    Other users come along who run the Toolbar and read Matt’s blog posts, containing Aaron’s, Adam 1A’s and my replies. And these users for some reason love Aaron’s SEO Buzzbox, clicking on it 20 times. They like Adam 1A’s site, clicking on it 10 times. And they hate mine, not clicking on it at all.

    (Side note: Aaron, I like your site. I’m just using this as example.)

    What Google could presumably do is alter the appearance of those hyperlinks (like they do on certain form fields) to indicate which links are good links (possibly a green highlight?) based on traffic, and also which links lead to bad neighbourhoods (a nice red highlight or something).

    The traffic could also be used within the engine itself (if it isn’t already a factor) as a ranking mechanism.

    There are some downsides to the argument (webmasters inadvertently linking to spam areas and looking bad as a result, for example), but I can see more that’s in favour of this methodology than the nofollow methodology.

    Perhaps the biggest plus is that it would somewhat minimize the effect that buying a text link would have. If I buy a text link on Site B and no one clicks on it, then it will also have the dual effect of providing me a reduced (perhaps no) impact in terms of results.

    What say you, Mr. Cutts?
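
The toolbar proposal above amounts to weighting each outbound link by observed clicks instead of counting every link equally. A toy sketch of that normalization, using the made-up click counts from the comment (the domain names are placeholders, not real sites):

```javascript
// Turn raw click counts per outbound link into normalized weights.
// A link that no real visitor ever clicks ends up with weight 0, which is
// the "reduced impact" for unclicked bought links the comment describes.
function clickWeights(clicks) {
  const total = Object.values(clicks).reduce((sum, n) => sum + n, 0);
  const weights = {};
  for (const [url, n] of Object.entries(clicks)) {
    weights[url] = total === 0 ? 0 : n / total;
  }
  return weights;
}

const w = clickWeights({
  "aaron.example": 20,   // clicked 20 times
  "adam1a.example": 10,  // clicked 10 times
  "adam2.example": 0,    // never clicked
});
console.log(w);
```

With these numbers the first link gets two thirds of the weight, the second one third, and the unclicked link gets nothing at all.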

  102. And to keep this separate, what I originally came here for (before I got sidetracked):

    I was wondering what you thought of the concept of http://www.hedir.com , and the possibility that one day it could replace DMOZ as Google’s directory of choice.

  103. Looks like the links aren’t gonna see month two, according to TLA!

  104. As always a very interesting discussion.

    Here are some observations:

    http://www.gay-sex-positions.com/ apparently has a PR of 6!

    Penalising spelling would hit sites that just make mistakes, use local dialects, or are sites about spelling/misspelling – not just sites using misspelt words to capture search-results positioning. What’s wrong with that anyway, since the user will have entered the misspelt word for the result to come up?

    It would be wonderful if all web sites could be somehow (magically?) evaluated for relevance/greatness/worth the instant they appeared on the internet … of course they can’t be. Therefore new sites with smaller budgets have a very poor chance of ever becoming known however good they are with any sort of advertising. Established sites have every chance of remaining the only ones at the top of search engines results because just by being there they will attract ‘organic’ links.

    The network of organic links is what Google originally emphasised as the best way of analysing the value of sites on the internet. Probably this was reasonably true at the start. But like experiments at the quantum level, Google’s own presence in the equation now has a massive effect on the www itself. Sites are now found not so much ‘organically’ as through their position in Google’s own search engine results …

    This suggests Google should reduce its dependence on linking and mainly rank results on content, as mentioned above. A good site is a good site even if very few people have found it as yet.

  105. Please note typo in above –

    known however good they are with any sort of advertising

    should read

    known however good they are without any sort of advertising

    by which I really meant

    known however good they are without advertising

  106. Hi Matt,
    Got a quick question for you, if you don’t mind. We all know that the purchase of ROS links falls well outside the terms and conditions laid out by Google. What I am interested in is the best way for a client to drop the links. I am in discussions with a client regarding this issue and have taken the stance that the ROS links should go (not an opinion they were overly enamoured with, but they are listening). However, they asked what the impact on their positioning would be from dropping them, and whether there was a safe solution. These links were put in place by a previous SEO company; the client was aware of them going up, as they paid for them, but was unaware that the links transgressed the T&Cs.

    Thanks in advance

    Mike

  107. You know that people will just keep on buying them though..

    I stopped when my first site got banned 🙂

  108. I personally think that Google are of course allowed to do what they want; it’s their engine, computers, technology, etc. However, it’s quite easy to see that Google are moving towards becoming a big bad organisation, à la MS. I’m sure there’s another search engine player just round the corner. Anyone else noticed that Google seems to be getting worse the more money it makes?

  109. I think it’s in Google’s interest to stop people spending money on text links so they can spend more money on AdWords.

  110. The system that Google has created is wide open for abuse as it stands. Linking is just another flaw in the system, which Google’s PR rush has encouraged. There are many complaints on the internet of competitors building bad links to rivals or, failing that, abusing competitors’ AdSense accounts just to get them banned by Google. It makes it so easy to get rid of the competition nowadays, thanks to Google!

    What are you guys doing about this Matt?

  112. The real questions are:
    1. Has Google made the Web a better place?
    2. Is Google evil?
    3. What would we do if we were Google?
    4. What are Matt’s real thoughts?

    Answers:
    1. Since going public they’ve pretty much crapped on the Web as a whole in the pursuit of pleasing shareholders.
    2. My Mama always said, “Evil is as evil does”.
    3. Become self-centered and greedy.
    4. How can I go on defending Satan? Hummmm… The money’s good and it’s only my soul….

    Also, how can Google have the balls to penalize others when they directly link to more porn than ANYONE? That’s right children of all ages…use image search without the “safe search” feature and you can see things that may disturb you for life…

    Doubt Matt’s posting this one…

  113. I think buying links NOW is the equivalent of changing META tag information 10 years ago. The search engines are getting smarter and know which links are bought and which are natural. I say get in while the gettin’s good – and in the meantime, build up your natural links for when the well dries up!!

  114. Due to the interlinking nature of the Internet, where sites link to one another to form a “web” of hyperlinked websites, industrious webmasters began exchanging hyperlinks (reciprocal links) to one another’s sites in the hope of improving their search engine position. But sites that buy links may lose their trust in search engines.

  115. Let’s be fair: some good top IM and SEO forums sell links, and if Google penalizes them, we will lose great communities. I think the threats will not touch the big fish; maybe only some middle-sized marketers.

  117. I have a site and was recently approached by a very good site about selling ad space. I generally don’t sell link space, but after a closer review I found that the site was very good and deserved a link. I wanted to link to that site for free. So I linked to him, and the site owner paid me nevertheless. So in this case, I am linking to a site because it is good. Isn’t that called a vote? Why should I cover that vote with a ‘nofollow’? This is the very basis on which Google started. Links are counted as votes; or, more specifically, the search engines don’t have brains to evaluate a site by looking at its quality, and hence rely on links given by other sites, on the assumption that those sites have linked after a thorough evaluation of the site they are linking to. Isn’t that exactly what I am doing here? I evaluate the site and give the link under ‘sponsored ads’ or any other caption for that matter. The difference: the site owner decides to pay me for the favor. So where does the question of nofollow arise? And why should I use nofollow in my links anyway? Didn’t Google say in its webmaster guidelines to make sites for users and not for search engines?? So why should I bother about a rule like this?

    Overall, using nofollow on paid links is one thing which I think no one should follow. It doesn’t make sense unless you link to junk/unrelated websites for money. In that case, you are sending traffic that came to your site via search engines on to bad sites, which is bad for the visitors. So who suffers in the end? The visitors.

    I think Google should follow Yahoo in following no-follow links, in order to follow what the website is all about. Decide what the site is dealing with by checking all its outbound links. If a site links to porn or other illegal stuff, why have it in the index (ranking high in the SERPs) and why point people to such a site?

    Hope everyone followed that. Is there anyone with no-follow? OK, get in touch with me at m_mukesh22 at the email provider that starts with y, and don’t forget to put a dot-com extension. Ciao

  119. I am an attorney. I see merit to the antitrust-violation arguments. It appears that Google is trending toward making it impossible for sites to get high placement or traffic unless website owners purchase pay-per-click ads. But at the same time, it also appears that Yahoo! and MSN will take over again, because Google’s search result quality will go down, as Google will sell a pay-per-click ad to anyone, even a porn or “spam” site. (So much for a free exchange of ideas.)

    I have pretty much given up on trying to rank high in Google, as I find pay-per-click repulsive, and a legal website like mine is almost impossible to rank with high PR unless you pay for high-PR links. But now I am told I will be penalized if I do that! (Unless I pay Google, of course.)

    I figure the market will favor Yahoo! and MSN, and I am just gonna have to be happy with my first-page rankings there and give up on Google. For that same reason I am probably going to stop using Google altogether.

    For those with unique sites, Google may be cool. But a personal injury attorney site like mine won’t do well unless you pay $5,000 or more per month to FindLaw or Martindale-Hubbell for their links, or are super old and trusted by Google, or pay per click. Even though they are basically “selling” their links. It is all hypocrisy, it appears.

    It’s spam unless you pay Google or a big company like FindLaw or Lexis. /s/ A very frustrated lawyer.

  120. I may be a little naive, indeed I do feel a bit like the little boy who thought the king was naked when everybody else was admiring his new clothes. If I ask a relevant site to link to mine and they insist on payment. I pay up. They link me. How does anybody know that I’ve paid? It’s just a link.

    If I pay lots of irrelevant sites for links then, yes, the search engines will pick them up. So surely the issue is not whether you pay for a link but whether the link is relevant, n’est-ce pas?

  121. OK, Matt. I went on an article-writing campaign and, yes, I moved up. But I am still bummed, because FindLaw websites get way better rankings from the paid links they offer (they just call them “listings”). The thing is, the FindLaw listing links have very little valuable content, nowhere near as good as what my articles offer.

  122. I started using Text Link Ads yesterday. Today I read several scary articles about potentially being penalized by Google for having them. The Text Links came off the site.
    I would like to run them, as they certainly pay better than the alternatives for a site getting off the ground. Eventually my site will rely on private sponsors because of my niche, but in the meantime I am experimenting with Adsense and others as placeholders.
    I do not want to undermine the work I’ve done to get this site off the ground. I am not stupid, and I realize that my relationship with Google is far more important than a few dollars a month for some links. Can I publish a half dozen text link ads on my site without fear, or is it better to leave them off?

  123. Google has created a monopoly by banning paid links; it’s good for Google but bad for everyone else.

  124. I’ve been reading your blog for some hours now, and I’d like to ask a question.

    We’ve developed a social network with subscribing users such as photographers, designers, and agencies. Subscribers, in addition to some other privileges, can post a URL to their website.

    At first we had these as followed links; however, as the subscribers increased (and after reading the information about nofollow) we decided to turn all the links to nofollow because of Google. In reality, though, we believe this is not fair to the subscribers, as most of them have interesting sites which we would “vote” for. But because of the subscription pricing, we have removed followed links so that they will not be interpreted as link selling, etc.

    Moreover the subscribers don’t really know the difference between follow and nofollow so they don’t seem to care or notice.

    My question is this: have we gone to an extreme by applying nofollow to our subscribers’ links?

    Thanks in Advance,
    Alexandros

  125. It’s tricky to know what’s best to do for a specific domain; you just have to keep plugging away at it.

  126. I can see both sides to the coin. People want to be able to make money without a ton of sweat and labor so they decide, “hey I’m going to sell links on my site. I should be able to do that and not get penalized.” BUT I can also see how crappy spam sites could just throw money at a situation and get better rankings even though they are NOT providing any real value to the end user. Kind of an issue when you use links as a gauge to judge who’s the best resource for “said” information.

  127. Does Matt Cutts actually talk anywhere about selling links from one’s own website?
    I ask because there are organizations out there that I can sell a business listing to, which would of course include a link to their website. The end result would be the same, especially if we have a directory set up for the particular demographic we specialize in, to offer themed links such as on this page here:
    http://www.petap.org/distance-learning-online-schools-directory.cfm

  128. As a web developer myself, I can tell you that buying links is effective but gets very expensive. I only recommend it to some start-ups to get their site initially into the search engines; after that, I advise they get a bit more legitimate and find better ways to promote their site.

    Never sign up to these 1,000 links schemes as they really are bad.

  129. Matt, I would like a clear answer to this simple question: can I sell links on my website without using the nofollow tag, or will I get some penalty? Is this bad?

    Thanks

  130. Buying links is a slippery slope. I like adwords better.
