In one of my earlier posts, I said
> I’ll also mention some specific “high risk” techniques and give the reasons why I’d avoid them.
Here’s a sample of the sort of doorway-page text I’m talking about, reproduced exactly as it appears:

> seop. I need webseek either baiduspider intelliseek is focused on scrubtheweb etc.
> spidering is focused on alphasearch cannot be northernlight.
> Buy greg notess and planetsearch by serchengine and find details of webtop is required by metasearcher.
> This website has information on euroseek and mirago products. teome depends on ssp.
> supersearch with advantage, what are people searching for and teona.
> This website has information on excite’s and meta searches. searchday features. metaspy depends on 703.
> infind needs metacrawler com resources. inktomisearch meta searches, cyber411 – gigablast, searchtheweb, argus clearinghouse, espotting of danny sullivan.
Is that something a normal person would write? No, it’s pretty much complete gibberish. Plus, I see multiple typos (teome and teona for Teoma). What’s an “espotting of Danny Sullivan”? Elsewhere on the site, it says that “Our website sells danny sullivan is infomak either quigo to smartsearch.” I happen to know that Danny Sullivan is not available for purchase–see how all this industry research pays off? Of all the industries to scrape in, SEO is a poor choice: people in that industry are much more likely to notice someone using their content.
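To make the point concrete, here is a toy sketch of one reason gibberish like this stands out statistically: natural English prose is dominated by common, everyday words, while keyword-stuffed doorway text is mostly coined brand tokens (“webseek”, “scrubtheweb”, and so on). This is purely my own illustration, not any search engine’s actual spam-detection method; the word list and threshold below are invented for demonstration, and a real system would use a full dictionary and far more signals.

```python
import re

# Tiny stand-in for a real English vocabulary list; invented for this example.
COMMON_WORDS = {
    "i", "need", "either", "is", "focused", "on", "etc", "of", "all",
    "the", "industries", "to", "scrape", "in", "a", "poor", "choice",
}

def common_word_ratio(text: str) -> float:
    """Fraction of tokens that appear in a basic English vocabulary."""
    tokens = re.findall(r"[a-z]+", text.lower())
    if not tokens:
        return 0.0
    hits = sum(1 for t in tokens if t in COMMON_WORDS)
    return hits / len(tokens)

# A line from the doorway page above vs. an ordinary English sentence.
gibberish = ("seop. I need webseek either baiduspider intelliseek "
             "is focused on scrubtheweb etc.")
normal = "Of all the industries to scrape in, SEO is a poor choice."

print(common_word_ratio(gibberish))  # well under the normal sentence's ratio
print(common_word_ratio(normal))
```

Even with this crude heuristic, the doorway text scores far lower than ordinary prose, which hints at how cheaply machine-generated pages can be flagged at scale.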
So let’s recap the high-risk techniques that I would recommend avoiding:
- Don’t use programs that automatically generate doorway pages.
- It especially looks bad if the doorway pages are gibberish.
- It really especially looks bad if the content you use is scraped content.
- If you’re going to scrape content anyway, the SEO industry is one of the worst places to do it.
- If you scrape SEO content and happen to pick up a couple of spam pages, you may attract even more attention, because someone investigating those spam pages can stumble across yours too.
That’s what I can think of right now. For a web design company or an SEO to employ techniques like this is especially bad–SEOs should absolutely know better. Every SEO company should be well aware of our webmaster guidelines, especially guidelines such as “Don’t employ cloaking or sneaky redirects” and “Don’t load pages with irrelevant words.”