SEO Mistakes: sneaky JavaScript

In one of my earlier posts, I said

I’ll also mention some specific “high risk” techniques and give the reasons why I’d avoid them.

When everyone talks about red or blue widgets, it can be hard to get the point across clearly, so this time I’m going to give a concrete example. For this example, you’ll want to make sure that JavaScript is off (e.g. using PrefBar, as we talked about earlier). If you do a search like [ optimization] (or use words like sells, nj, or even danny sullivan), you’ll find urls such as the one I found. If you check out that page, you’ll find text like

seop. I need webseek either baiduspider intelliseek is focused on scrubtheweb etc.
spidering is focused on alphasearch cannot be northernlight.
Buy greg notess and planetsearch by serchengine and find details of webtop is required by metasearcher.
This website has information on euroseek and mirago products. teome depends on ssp.
supersearch with advantage, what are people searching for and teona.
This website has information on excite’s and meta searches. searchday features. metaspy depends on 703.
infind needs metacrawler com resources. inktomisearch meta searches, cyber411 – gigablast, searchtheweb, argus clearinghouse, espotting of danny sullivan.

Is that something a normal person would write? No, it’s pretty much complete gibberish. Plus, I see multiple typos (teome and teona for Teoma). What’s an “espotting of Danny Sullivan”? Elsewhere on the site, it says that “Our website sells danny sullivan is infomak either quigo to smartsearch.” I happen to know that Danny Sullivan is not available for purchase–see how all this industry research pays off? 🙂 Of all the industries to scrape in, SEO is a poor choice: people in that industry are much more likely to notice someone using their content.

So this text appears to be autogenerated, and autogenerated from scraped pages (not just snippets or visible text of web pages–I found text that only appears in the comments of other HTML pages). Now why did I ask you to turn off your JavaScript? The answer is at the bottom of the page:
Hmm. JavaScript that just does a redirect to the root page. And it doesn’t just set the location, it does some obfuscation and uses eval to unescape a cryptic string. But you can see the result well enough by reading the string, or by turning JavaScript back on and reloading the page.
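To make that concrete, here’s a hypothetical reconstruction of the pattern (the escaped string and the target are invented for illustration – this is not the spam page’s actual code):

```javascript
// Hypothetical reconstruction of an eval/unescape redirect – not the
// actual spam page's code. The redirect target hides inside an escaped
// string, so a casual glance at the source shows only gibberish.
var payload = "%77%69%6E%64%6F%77%2E%6C%6F%63%61%74%69%6F%6E%3D%27%2F%27";

// On the spam page, a line like this would fire the redirect:
//   eval(unescape(payload));

// But as described above, you can see the result just by decoding the string:
console.log(unescape(payload)); // window.location='/'
```

Decoding the string is exactly the “read the string” trick mentioned above: the obfuscation only hides the redirect from people who don’t look.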

So let’s recap the high-risk techniques that I would recommend avoiding:

  • Don’t use programs that automatically generate doorway pages.
  • It especially looks bad if the doorway pages are gibberish.
  • It really especially looks bad if the content you use is scraped content.
  • If you’re considering scraping content, the SEO industry is one of the worst places to do it.
  • If you scrape SEO content and end up scraping a couple spam pages, you may get noticed even more because someone is investigating the other spam pages.

and then:

  • If you make lots of pages, don’t put JavaScript redirects on all of them.
  • If you’re doing JavaScript redirects, don’t obfuscate the code–it just makes people think that you’re doing things after lots of deliberate consideration.
  • If you do obfuscate code, ask yourself: can a regular person still look at this code and tell what it’s doing without even knowing JavaScript?

That’s what I can think of right now. For a web design company or SEO to employ techniques like this is especially bad–SEOs should absolutely know better. Every SEO company should be well aware of our webmaster guidelines, especially guidelines such as “Don’t employ cloaking or sneaky redirects” and “Don’t load pages with irrelevant words.”

In the comments on an earlier post, someone asked “It’s fine to take out one instance of spam, but do you work on the more general case of this type of spam?” (I’m paraphrasing a bit). That’s an interesting point, and the fact is that we work hard at improving our search quality with better algorithms. From this example, you can certainly make a case for checking for sneaky JavaScript redirects, plus things like 100% frames that don’t help the user. So it’s good to take action on individual instances of spam when we find it, but of course we’re working on better algorithmic solutions as well. In fact, I’ll issue a small weather report: I would not recommend using sneaky JavaScript redirects. Your domains might get rained on in the near future.

76 Responses to SEO Mistakes: sneaky JavaScript

  1. Presumably this will extend to dodgy redirects from throwaway domains to the “real” domain and not just be limited to internal redirects as well?

  2. Matt,

    I have been having a problem lately with somebody scraping content from my blog and running an AdWords campaign on top of it. They don’t link back to my site, and on some searches for my content I don’t show up because of duplicate content.

    I have talked to google and they basically said there is nothing I can do – I have tried to talk to the website owner, but they are completely unreasonable.

    2 questions:

    1) Is there anything else I can do?
    2) How do you feel about people taking rss feeds and regurgitating them? Even if they are linking to me, I might get penalized for duplicate content, right?

    Thanks for all the great info.


  3. It’s funny you mention Danny Sullivan (aka GoogleGuy in your post).

  4. Wow Danny Sullivan is googleguy?

  5. Hi Matt.

    Let me say that was a damn fine post – it goes the extra mile in showing what you can say as an individual on your personal blog, compared to what can be said under an official-sounding nom de plume. Congrats pal.

    Great information for everyone out there, none of it was “new to me” but I do want to thank you for putting it in plain English, especially for the mini weather report. I better go get to work on something that you guys at the Plex can’t detect so simply. I love SpiderMonkey 🙂

  6. I read that the other day on the DP forums. I didn’t think it was confirmed by Google, though.

  7. Richard Rinehart

    Who is danny sullivan?

  8. Matt – perhaps I am confused as to what is considered a “sneaky redirect”, but would a javascript that loads a frameset be considered “sneaky”?

    I’ve been using this code on framed pages for years:

    onLoad="if (parent.frames.length==0) top.location='';"

    Am I to understand that soon any page with that code would get penalized?

  9. I have to agree, that was an excellent post that was easy to understand. BTW prefbar is an excellent tool.

  10. I’d highly doubt Danny Sullivan is GoogleGuy, as GoogleGuy is an employee of Google, and Danny is not 😉

  11. Danny Sullivan – Runs the SES Shows, operates – basically built from the ground up most of the relationships between the webmaster/SEO community and the search engines.

    Gotta agree with Jason, here. This was a damn fine post and it’s this kind of example and exposition that can really help to convince those who think they might outsmart GG to think again.

  12. Matt, in general a really good post, as always from you! 🙂 However, I have to tease you a little bit – I know you can take it hehe

    Matt, I know Danny is (most often) not for sale 🙂 … but do you remember some of the many really funny examples of a similar kind of autogenerated AdWords ads that apparently get approved for larger advertisers such as eBay? As I recall, some of the searches triggered ads such as “Women For Sale” and “Babies For Sale”. I believe most people – including the unofficial AdWords forum rep – agreed that many of the examples, however funny they were, were not appropriate and probably should not have been approved.

    So, why did they get approved? (I know that’s not really your area …)
    And why is it bad to put my dear friend Danny on sale when paid ads for other similar “silly” items and people apparently get approved? I am sure it can’t have anything to do with money … 😉

    Sorry, Matt, I just could not help myself …

  13. Hello Matt,

    Speaking of sneaky javascript redirects, is Google’s current use of them on the main SERPs still just the same old occasional, temporary sampling? Or is this a permanent feature now? It seems to me to be sticking around for longer than usual.

    (For those who might not know, if you search for, say [milly], then hover over “”, you’ll see the URL [] in the status bar. But, for JS enabled browsers, the actual URL is something like [], which then uses a 302 redirect to send you to the actual URL. Google use JS to hide the tracking (I suppose it’s more user-friendly too), and use the tracking URL to monitor/sample clickthrough data, as disclosed on their Privacy page: (“Google may present links in a format that enables us to understand whether they have been followed. We use this information to understand and improve the quality of Google’s search technology. For instance, this data helps us determine how often users are satisfied with the first result of a query and how often they proceed to later results.”)

  14. for the record, I know danny is not googleguy 🙂

  15. Hi Matt,
    I really liked that you were able to show an example of bad SEO, simply because it’s not the kind of thing that people would do in any of the popular SEO forum sites. If anyone ever brings up an example of a spam site, they usually get a mouthful from a few of the long-time members.

    Great example.

  16. Matt,

    Interesting post. We recently fell victim to someone else’s sneaky javascript that confused googlebot.

    Is Google doing anything about malicious javascript from offending, “hijacking” sites? Here is what I think happened to us:

    1. A scrapersite wrote a spider that copied all our pages and all their content.
    2. On all the copied pages, they changed any link that pointed to our site to point to scrapersite.
    3. They posted all of these copied pages on their server in a subdirectory. Everything under that directory had our exact directory structure.
    4. They added many carriage returns to the top of our pages, so nothing was identifiable during the redirect from the Google cache pages or when an SE visitor clicked the link from Google.
    5. They added javascript to the bottom of each page that allowed the page to load up and then redirect to scrapersite.

    The javascript redirect has the markings of a 302 hijack because googlebot attributed our site to the other, but it is the javascript redirect that did the damage. Google displays scrapersite as the provider of our site pages, like a 302 hijack problem. Our site was potentially penalized for duplicate content…

  17. hey Detlev ..

    for the record :

    for the record, I know googleguy is not danny


  18. Milly,

    How are the links you describe sneaky? I think Matt is referring to cases where a spammer wants one thing indexed, Google ranks the page based on the spammy text (above in the example), and the user gets something much different. Google is reserving the right to rank against what the user sees and determine relevancy for themselves.

    Apples and oranges.

  19. Actually wasn’t Danny along with his entire organization just sold recently? =) I think he was!

  20. I am sorry but what is 100% frames?

  21. >> I happen to know that Danny Sullivan is not available for purchase–see how all this industry research pays off?

  22. Who’s been espotting GG?

  23. softwareengineer99

    Thanks for the informative post.
    Would you care to comment on why Google uses the same sneaky redirect on their own pages in FF?

  24. Brian,
    There is a difference between letting people see they are being redirected and hiding the redirection altogether. Google is hiding the redirection, much like other sneaky redirects: a user thinks they will be exiting out to the domain name shown in the SERPs, when actually Google will rewrite the URL using an onmousedown event.

    Obviously Google can show the user that they are being redirected and no one will mind. The fact that they are redirecting using an onmousedown event makes it sneaky. An ethical company like Google should practice what they preach. If Google feels the need to hide the redirection using an onmousedown event, rather than displaying it, why should other webmasters be penalized for this?

    Don’t get me wrong. I have nothing against Google redirecting visitors.
    However, it makes me sick to think that they are not doing it openly, but rather hiding it, making it a sneaky redirect.

    One more question. If another site were to detect the User Agent and then, based on the UA, deliver an onmousedown event (for users) and plain links (for search engines), wouldn’t this be considered cloaking? Google is rewriting the URLs only for FF users currently, using the rwt function. The same searches performed using IE don’t seem to have the sneaky redirect using the rwt function.

    So I can foresee a filter in the future that will look for onmousedown and onclick events on webpages and penalize sites based on excessive usage of such events. My guess is this filter will be applied to all domains except

    Matt, keep up the good work. I am sure your blog will help ethical web professionals a lot in coming years.
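    For reference, the rewrite being described can be sketched roughly like this – a plain-object simulation rather than real browser code, with a purely illustrative tracking URL (this is not Google’s actual code):

```javascript
// Simulated onmousedown URL rewrite (illustrative names and URLs).
// The href shown on hover is the destination site; the handler swaps
// in a tracking URL at the instant the mouse button goes down.
var link = {
  href: "http://www.example.com/",
  onmousedown: function () {
    this.href = "http://tracker.example.net/url?q=" +
      encodeURIComponent("http://www.example.com/");
  }
};

console.log(link.href); // what the status bar shows on hover
link.onmousedown();     // fired when the user presses the mouse button
console.log(link.href); // where the click actually goes
```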

  25. eWhisper aka Brad

    Just to reiterate DaveN (not that he needs it at all 😉 )
    Danny is not for sale.
    All the posts pointing towards this site that Danny is GoogleGuy are wrong.
    And forget all the rumors.

    Matt’s putting up some very quality info on this blog. Great job Matt, you’ve made my bloglines…err. Google personal home page.

  26. Excellent post once again. By the feedback, I can see that some of the readers are confused… especially on the “sneaky redirect” part.

    The problem here is that the code executes automatically. They’re not showing a link that someone has to click on, so the comparison to the Google redirects is way off base. The Google redirects are actually a middle-man: they use JavaScript to show the URL you will actually go to, but they track the click with a redirect. The example Matt gave is completely different in that it is passive – you go to the page, then you get immediately thrown somewhere else. That’s the sneaky part: you can’t normally see the page you thought you’d see, because the redirect takes you elsewhere.


  27. Brian,
    I wasn’t confused: I was shoehorning in a tangential question (which I hope Matt will find time to answer, btw. It was a question, not a jibe).

    But still, the topics aren’t as disparate as you appear to think. “Sneaky” implies only subterfuge, not necessarily malevolence. Both softwareengineer99/lid and I acknowledged the benign reason behind Google’s JS sleight-of-hand, but it’s still … sneaky.

    You might like to consider softwareengineer99’s comment again, on its own merits, rather than as if it were a confusion (which it surely wasn’t). Yes, it’s apples and oranges: both are fruit.

    The reason, I’d guess, that Google obfuscate the redirect is that 99% of users probably only care where the link goes, not how it gets there, and are thus arguably better served by not seeing the other stuff in the URL. Why then only for JS-enabled browsers? Why only for some UAs (Opera too, btw, not just FF), if we’ve got that right? I dunno.

    It used to be that Google just showed the tracking redirect URL, during its sampling, without any obfuscation (JS or otherwise). Indeed, Matt once implied criticism (I put it no stronger) of other search engines which – unlike Google then – used JS to hide the tracking. Times change?

  28. Hi Matt,

    Very nice post 😉
    I love checking search results with Javascript off to see what kind of stuff webmasters want to hide. 😉
    Well, I have to say that despite the Bourbon update, which took down numerous sites with no cheating (just a bit overoptimised), it is amazing to see that search results are still full of crap hidden under a silly javascript.
    My theory is that old sites can do a lot more black hat SEO without getting caught than newer sites. Am I wrong?


  29. Matt. The example you gave is fair enough. It’s pure spam and being penalised for it is part of the game. But another part of the game is that Google doesn’t mind taking out some innocent sites when attempting to weed out guilty sites – the well-being of individual websites is not Google’s responsibility. But I have very serious reservations about weeding out “sneaky javascript” redirects because framed sites *must* use auto-redirects. Otherwise people will land on pages (from the serps) that are useless to them when they are viewed outside of their framed environment.

    Before you go “raining” on any websites, can you assure us that very sound safeguards are incorporated into the algorithm that will avoid wrongly penalising those framed sites? It’s one thing to get rid of outright spam, but it would be quite wrong to dump innocent sites to do it.

  30. You’re all wrong about Danny NOT being for sale. And funny that you mentioned that Mikkel, because that’s exactly where he’s being sold.

    “Danny Sullivan
    Great deals on Danny Sullivan
    Shop on eBay and Save!


  31. Patrick Cornwell

    Thanks Matt for your mini weather report. I’m pleased you guys get into such detail when it comes to tackling spam-like tactics but I also share concern as to the ambiguity I am reading within this particular post. Are we talking about sneaky automatic redirects, or those which are triggered upon an action, or both?

    I second PhilC’s concern, but on the subject of clicked links: I have a couple of sites where link clicks are tracked ‘onClick’ but the eventual URL is exactly as displayed in the HREF. To me that would suggest that nothing sneaky is going on, but again it makes me slightly nervous reading posts like this that I’m about to fall out of the index!

  32. Matt, I generally enjoy reading your posts, but this one makes me worry. I believe that only the simplest of script snippets can be programmatically evaluated and classified as sneaky or not. More complex scripts involving conditional redirects must be considered in the proper execution context, which is difficult if not impossible to do programmatically. That said, I can see more innocent sites becoming collateral damage of Google’s reign. Just today Google indexed the content of my external JavaScript. Has the onslaught started already?

  33. I’m very interested in how they plan on doing this. You can’t make an algorithm just aware of the different ways of doing JS redirects, as there are just so many of them. I think Google is actually constructing (limited?) DOMs of the pages they fetch and running a JS interpreter on them, possibly using Mozilla XPCOM and SpiderMonkey. Or maybe not? Perhaps they have their own HTML cleanup -> DOM routines, which begs the question: will we see a browser based on them?

  34. He even comes with free shipping…

    Danny sullivan
    Find Anything on the Internet!
    Free Shipping on Danny sullivan,RNWA:2003-35,RNWA:en&q=danny+sullivan

  35. The problem with the term “sneaky redirects” is that the decision about whether they are sneaky or not is subjective. I wrote Google to ask if a technique I was using with my affiliate URLs was legal or not. They promptly hid most of my site, docked my rankings, DECIDED my tactics were sneaky and never wrote me back. My business is ruined. I’m having to look at other ways to make money which is taking me away from a site I’ve had for 9 years – long before Google arrived. I spent 2 years changing and changing the site to “conform” to Google’s constantly-changing whims – when I simply wanted to work on the site — and now this.

    That sort of subjectivity, in the final analysis, is simply unethical.

  36. One thing I worry about with search engines, however, is that from a human’s eyes, they really aren’t that linguistically sophisticated. An example for you – I had a VERY simple analogy on one of my pages, one that NO human over the age of 6 would ever be confused by. The Adsense bot (which I assume is as sophisticated as the search spider), however, was completely thrown off base. I can understand that; it’s just a dumb bot and has never even sat in so much as a 2nd grade Language Arts class. That said, it worries me that real-life humans depend on these bots to determine what means what. In the example you give, you’re right. And it’s right to try to weed out the spam, but I fear G may be throwing out the baby with the bathwater in many cases (what would Adsense do with THAT sentence – even though it’s a well-known cliche at that!). I fear a dumbed-down web that capitulates to spiders and not the human intellect. … As I said, can’t blame the spiders.

  37. PhilC, we’re aware that sites have code to show frames in the correct way. I believe that such sites will not have any cause for concern–this is a common idiom.
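    For anyone who hasn’t seen that idiom, here’s a minimal sketch of the check, with the decision pulled into a plain function so it can run outside a browser (the URL is illustrative):

```javascript
// The common frame-restoring check: a framed page loaded directly
// (outside its frameset) has no parent frames, so it should send the
// visitor back to the frameset page. In a real page this lives in the
// body tag, e.g. onLoad="if (parent.frames.length==0) top.location='...';"
function framesetRedirect(parentFramesLength) {
  return parentFramesLength === 0
    ? "http://www.example.com/frameset.html" // reload inside the frames
    : null;                                  // already framed: stay put
}

console.log(framesetRedirect(0)); // loaded bare: redirect to the frameset
console.log(framesetRedirect(2)); // inside the frameset: stay (null)
```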

  38. Matt,
    Since JavaScript redirect code is not being allowed, is it a bad idea to use the Meta Refresh HTML tag on pages?

  39. Hi

    I did like your blog postings and comments, but I have a problem – maybe someone could help me out.

    Does anyone know how to pass HTML table data from a parent form to a child form’s HTML table? Or maybe you could let me know any URL related to this topic.

    Any kind of help is highly appreciated.


    Imran Hashmi

  40. var a="window.", a2="location.", a3="replace", a4="('http://domainname.cmo')";
    var str=""+a+a2+a3+a4;

    how about code like this in a .js file? 🙂 maybe banned?

  41. Srinath, that code doesn’t look good. 🙂 There’s very few reasons to write code in that way.
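    Joining the pieces from that snippet shows why it looks bad – the concatenation assembles a plain redirect call, so the indirection hides nothing (this only builds the string; nothing here executes a redirect):

```javascript
// The concatenation from the comment above, joined together. An eval(str)
// would turn it into a live redirect; printing it shows the intent.
var a = "window.", a2 = "location.", a3 = "replace",
    a4 = "('http://domainname.cmo')";
var str = "" + a + a2 + a3 + a4;

console.log(str); // window.location.replace('http://domainname.cmo')
```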

  42. I’m a very ethical SEO, myself and I get annoyed when I see spam in use. I have been talking to a company in an office just a few doors down who are using this very technique to make thousands off of some big brand names.

    When they told me they were using JavaScript redirects I thought, “Surely not – Google would obviously pick up on that and ignore the page?!”. Turns out, that’s not the case.

    Now even though it’s clear you guys at Google have known about this technique since at least August ’05, it’s still used extensively for countless sites (and that’s just the company I’ve been talking to). So how come the algo isn’t getting rid of this technique yet? Surely Google is clever enough to determine that the text in your post is gibberish, and seeing a redirect in there as well can’t be that hard… I’m baffled that people are still getting away with this?!


  43. I agree with the comment above about subjectivity – there’s too much opportunity for abuse in such a system. Google should really be more sensitive to and cooperative with small business owners. Not every small business owner is a “black hat” evil doer to be treated as a pest. Because Google has become the effective doorway to the Internet for so many people, they have unparalleled control over the fate of someone’s business. As such, Google needs to keep in mind their responsibility to the community as a whole; including small businesses that depend on fair search treatment for their business.

    If you ask me, Google should be bending over backwards to help small businesses in situations like these. With the situation they’re in (controlling the Internet as AT&T controlled the phone), I’d anticipate government regulators stepping in eventually to prevent them from having too much control over companies. The more small businesses that complain that Google put them out of business, the quicker the regulators will arrive at the Googleplex. Just my $.02, but I do hope Google warms up to the small business owner a bit more in 2006. 🙂

  44. As for the quoted text like «need webseek either baiduspider intelliseek» – it doesn’t look like spam/scraping, it’s probably the last revision of NewSpeek :p

  45. It would really help if Google spidered the web in such a way as to make the webserver think it is a user and not a bot, and compared the results.

    Thank you, Matt, and thanks to Google for letting you write.

  46. I found some stuff the other day and was hoping it could be explained. If something I say doesn’t make sense, please excuse me – I have been up late a few nights trying to figure it out.
    I know slickcar….com and commandoalarms….com and a few other commando sites are all owned by the same person and sell the same thing. But if you do a search for slickcar and paypal you get some really weird results. It’s a long story how I stumbled on it, but I thought it was well different. At first I thought it might be a sneaky redirect thing, but I couldn’t find the code like Matt had listed in one of his examples. Besides, it isn’t gibberish, though the text changes every time you reload the page. It’s loaded with google ads 🙂 I’ll give you gurus an example.….info/ and….info/
    I added the extra periods of course, since I’m not sure if there is a rule against urls here… I’m still learning this stuff… what is this? good, bad, otherwise? clever? is it a google thing?

  47. Thanks Matt, nice way of clarifying things. But is SEO = Google? I mean, how many engines are actually looking that deeply? OK, Google is THE engine. But how many engines implement such javascript filters? Sometimes a coder can’t do without a redirect, or has to obfuscate content to protect it from content-crawling leechers… What do you think?

  48. >It would really help if Google spiders the web in such a way as to make the webserver think it is a user and not a bot and compare results.

    They won’t be doing that, because it would mean removing the user-agent, and people would get angry because they wish to know when Googlebot visits their page. There’s no solution, because you could easily write a php script to detect the user-agent and show some content depending on it; and since Google has to identify its user-agent, it would make no difference changing it from Googlebot to, say, “MSIE – Googlebot”, since people would just use that user-agent to block or alter code.

  49. good resource of info.

  50. Thanks for the above article, Matt.
    I have understood not to use any sneaky JavaScript and
    not to load pages with irrelevant words.

  51. If the redirection is set to be made by the server, from the main URL ( to any other, and not by javascript commands,
    how do the bots interpret this?

  52. S.M.Khurram Quaseem

    Who is danny sullivan?

    Danny Sullivan is the Managing Editor of the famous Search Engine Watch. A well-known authority on search engines, he is widely known as the “Search Engine Guru”. His creativity and knowledge in writing articles are indescribable. Danny is also a well-known speaker at conferences and is known for his eloquence.
    Danny worked at newspapers like the LA Times and the Orange County Register and then turned his concentration to the web. Danny Sullivan started his career in the web world in 1995, managing a web marketing company which later became Search Engine Watch.
    Danny strongly believes search engine marketing has a future. At one conference where he spoke, Danny compared search engines to a ‘reverse broadcast network’.
    Danny successfully provides website consulting services, from site conception to internet publicity. He also authors the monthly search engine report newsletter. For the Search Engine Strategies conferences focusing on search engine marketing issues, he writes expert session content.
    Danny Sullivan is climbing heights in the field of search engine marketing, alert to changes in trend and making newer innovations in the field.

  53. I don’t know what business school some of you people went to. Personally, if I put my own time and money into optimizing a site about blue widgets, and it begins to receive traffic from (you know who), and I decide to redirect that traffic to another blue widget site that is willing to pay for it – well, isn’t this still a capitalist country? And if that person decides not to pay for it anymore, why shouldn’t I sell it to a competitor? This is Business 101.
    Would you do the same with a popular phone number?

  54. I guess javascript should only be used when you have to, and not forced in – especially for generating doorway pages.

  55. Thank you, Matt, and thanks to Google for letting you write.

  56. Hi all!

    I am in the beginning stages of developing a CMS solution in PHP/MySQL, and to allow for percentage settings I use javascript to get the client’s browser width before redirecting to the CMS starter page (index.php?ClientWidth= *THE JAVASCRIPT VALUE*).

    I haven’t obfuscated my code, and it only redirects to a relative link within the same domain.

    Would I be penalized for this by the search engines? It kinda looks like a black hat redirect until you actually look at the code to find out what I’m doing!

    This is the code used in the index.htm file:


    if (window.innerWidth) scrW = window.innerWidth; //For Firefox / Mozilla / Opera

    if (document.body.clientWidth) scrW = document.body.clientWidth; //For Internet Explorer

    urlString = "index.php?ClientWidth=" + scrW;

    window.location = urlString;


    If this is deemed black-hatting, what would an alternative solution to the problem be? I really can’t think of one, but I don’t want the search engines to throw my nice shiny new CMS out before I’ve even finished building it!

    Any help would be MUCH appreciated!

    I’ll even post my email address –

  57. What about sites using a meta redirect written in javascript that aren’t intending to do spammy stuff like this? If I put a piece of javascript on one page to prevent users from viewing that page directly (as opposed to via an include), will it be considered spam? If I don’t obfuscate it, then technically it should be appropriate, and it’s not on every page – just one page that the user shouldn’t be on. I would use a php redirect, but that would mess up the script from actually working, so I have to use the meta redirect… any advice on whether this would get penalized?

  58. Well, I found an answer in the end!

    First, place this code after tag and before any other content (in php of course..)

    if (!isset($_SESSION[‘clientagentwidth’]))
    echo ”.”n”;
    echo ”.”n”;
    echo ”.”n”;
    //For Internet Explorer
    echo ‘ if (window.innerWidth) scrW = window.innerWidth; ‘.”n”;

    //For Firefox / Mozilla / Opera
    echo ‘ if (document.body.clientWidth) scrW = document.body.clientWidth;’.”n”;

    echo ‘ document.getwidth.ClientWidth.value = scrW;’.”n”;

    echo ‘ document.getwidth.submit();’.”n”;
    echo ”.”n”;
    echo ”.”n”;

    and then all you have to do is pick up the value with $_POST and set it as a $_SESSION variable; your system will then have the client’s available width, in pixels, for the duration of the session, and it won’t need the form submission again, as the script checks for the session width variable.

    I hope this has been of some help to you.

    To see it in action, this is my temporary dev site


  59. oops, the message had some code removed!

    you also need to echo out a form wrapper and a javascript wrapper. I can’t work out how to get the code to display on here, so if you need it, email me at and I’ll be happy to send you full examples of the script.


  60. Good resource of info. Thanks.

  61. thanks for the info!

  62. Wouldn’t it be nice if all webmasters were honest and used clean SEO?

  63. Thanks for the useful info.

  64. Hi Matt,

    I was wondering if you ever heard of the site They offer a very unique and different way for a site’s links page to display an image of the landing page of the site you are linked to when your cursor is hovered over the text link. They do use javascript, and I was wondering if this could hurt or affect my rankings with the search engines? I have contacted them regarding my concern, but they really did not seem to think it would be a concern. Do you have any thoughts on this?

    Thank you in advance.

  65. Exactly what is that? Will it let me see sites that are blocked?

  66. Snap is a search engine isn’t it?

  67. So would a meta refresh redirect be considered a “sneaky redirect” in Google’s eyes? Or, a subdomain redirect where a subdomain is created and forwarded to an affiliate link for example?

    I know Google favors 301 redirects, but from an affiliate link standpoint where a webmaster wants uniform looking outgoing links, are these methods considered acceptable?

    Thx, JJ

  68. What is the prefbar tool?

  69. Scraper sites are just such a bad idea.

    The best way to get search engine rankings is good SEO, on page and off page.

    None of this black hat nonsense.

  70. Hi Matt, I posted a comment here that I told you not to approve, about a site doing a cloaked user-agent duplicate-content attack on my site. I just noticed today, after I talked about the site as a blind item in a webmaster forum, that they took it out already. So no harm done; everything is back to normal so far. I did submit a spam report, but it looks like they fixed everything already. Aside from that, if you believe this comment is irrelevant and does not need to be on this blog post, you can simply delete it too.

  71. Hi, just a question – I use a single page with a single inline script to check for javascript. If it’s not enabled, you view the current page; if it’s enabled, you get to see the whole site (which uses javascript for some layout).

    Is this acceptable?

    It appears openly in the header and is a single line to redirect to index2.html, which is a local file in the same domain. All subsequent links to the home page (from any other page on the site) point to index2, so this redirect is only fired the first time you enter the site, and never again if you bookmark any of the other pages.



  72. Nice article – you are the person I was searching for. Look at it, it’s my new website’s portal page, which is full of javascript. Since search engines can’t read javascript, what is the solution for it?

  73. Great article. I have to agree with everything. Thanks for the tools.

  74. Let me know how I’d redirect someone to the place which they must see while having javascript off.