At SES NYC, I’ll be talking more about how we’ve been ramping up our program to email webmasters when we notice problems with sites. We want to let legitimate sites know when we spot issues like hidden text.
Based on that, I was interested to see an article about a site in the Philippines that is removing hidden text from its front page. Kudos to myayala.com for taking that action. A quick check still shows some hidden text though, e.g. http://sureseats.myayala.com/ has some white-on-white text that I highlighted:
If you’re doing a reinclusion request, my advice is to make sure that all of the hidden text is gone from the site before requesting reinclusion. It may just be an oversight, but doing a full clean-up on a site before doing a reinclusion request really improves the chances of everything going smoothly.
Matt, I’m sure this has already crossed your mind, but I’ll go ahead anyway.
Why not have a second spider (or set of them) that does a follow-up check of X pages on a site, declaring itself as IE or something?
It would probably be a lot easier to have it automatically detect that the page changed rather than have you casually browsing around and highlighting text 😉
It wouldn’t need to check the whole site; a random sample of a few pages would be enough to tell fairly accurately whether it’s spamming or not. The only problem would be a site receiving a legitimate update between the real and the disguised spider requesting it, but that could be factored in pretty easily.
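Something like this rough sketch is what I have in mind – purely illustrative, not anything Google actually runs: it assumes Python with the requests library, and the user-agent strings and similarity threshold are made-up placeholders.

# Sketch of the "second spider" idea: fetch a sampled page twice with
# different User-Agent headers and flag it if the two responses diverge
# sharply. The UA strings and the 0.7 threshold are arbitrary examples.
import difflib
import requests

BOT_UA = "Googlebot/2.1 (+http://www.google.com/bot.html)"
BROWSER_UA = "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)"

def looks_cloaked(url, threshold=0.7):
    """Return True if the page served to a 'bot' differs sharply from
    the page served to a 'browser' – a possible sign of cloaking."""
    bot_html = requests.get(url, headers={"User-Agent": BOT_UA}, timeout=10).text
    browser_html = requests.get(url, headers={"User-Agent": BROWSER_UA}, timeout=10).text
    similarity = difflib.SequenceMatcher(None, bot_html, browser_html).ratio()
    return similarity < threshold

if __name__ == "__main__":
    print(looks_cloaked("http://www.example.com/"))

A legitimate update landing between the two fetches would lower the similarity too, which is the caveat mentioned above – comparing only the stable parts of the page, or re-sampling later, would be one way to factor that in.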
Matt,
I really appreciate the information you give webmasters for building the best possible sites as far as SEO *and* Google’s terms are concerned.
But in the case of hidden text I have a question: what about text that is hidden for accessibility reasons? I’m thinking of all those “skip links” that aren’t visible to the average user with a common browser.
IMHO Google should distinguish between hidden text whose only purpose is to rank better and hidden text meant to give users with screen readers a richer experience.
What do you think? Hidden text is not evil per se. It depends on the purpose. But how should Google manage this?
Best,
Lars
Hi Matt
I like your pilot program of contacting webmasters. However, I don’t see you mentioning much about the webspam team fighting back against keyword stuffing. Any news on that front?
Have a great day.
Ben, that’s a fine way to detect cloaking, but not to detect hidden text. We’ve been doing some neat stuff lately though.
Harith, we have been working on some new stuff. It’s a little early to say how that will pan out, though.
Good idea
Wish we were based nearer NY so I could come along.
How do I make sure that, for the sites I look after, I am down as the webmaster?
Maybe expanding the sitemap to include info on the owner and a technical contact, a bit like DNS, could be one way to collect the information.
“It may just be an oversight, but doing a full clean-up on a site before doing a reinclusion request really improves the chances of everything going smoothly.”
Sure, but that means you know what has to be cleaned up. And that means Google has replied to your e-mails 🙂 Which is not always the case.
>Harith
Isn’t all this talk of fighting a little adversarial? ;-)
Sometimes all this talk of penalties and elimination and removal and cleansing and evil spam is… well, you know… OTT!
It would be very nice if all this silly stuff just didn’t count/work, etc.
This algo is a bit of a baby really: very young, prone to tantrums, and it poops on you when you least expect it! Maybe one day it’ll mature to a state where the perceived need for public admonishments is but a distant memory.
Funny how things improve; hell, look at computing! Abacus to Pentium…
Is G still in the abacus stage of development, or has it reached Babbage or even Turing status? 🙂
Do you have a view on this, Matt? If so, where would you place the algo on a developmental curve – Kaizen cop-outs disallowed 😉
Do you now email webmasters every time you detect that they are using an illicit method, or just for “good” sites? I guess it’s not an automated process but a manual selection. And do you email foreign webmasters too (e.g. for German sites)?
I reported 4 sites with hidden text and repeated words, but nothing happened. They’re still at the top of the Google results page.
Automatically detecting page changes is a better idea.
Emailing you to tell you of problems is an excellent idea. I have one specific page on a website that for some reason got penalized, and I can’t for the life of me see what caused it. An email letting me know that I needed to do something would have been a great help!
With regard to hidden text, would you class the CSS rule:
display: none;
as hidden text? I use this on my projects as an alternative to alt text, so when someone chooses to see no style or uses a screen reader, the text in the tag that has display: none; is read to the user instead of displaying an image.
This is something that the likes of csszengarden uses – where I picked up the technique.
Hi Matt,
Was just wondering if you were going to comment on the recent news of a UK SEO firm being allegedly banned from Google? I won’t say the name, but it has been featured heavily in the search news recently.
Thanks.
I think what the spiders should do is check the background color of divs, tables, or the body and see whether it matches the text color. That way you raise an alarm and a real human can check over the site.
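A crude first pass along those lines might look like this – illustrative only: it checks nothing but inline style and bgcolor attributes, assumes the BeautifulSoup library, and a real crawler would need full stylesheet and colour-name resolution.

# Flag elements whose inline text colour equals their own inline background
# colour, as a first-pass alarm for a human reviewer. Hypothetical helper,
# not anything a search engine is known to run.
import re
from bs4 import BeautifulSoup

def inline_value(style, prop):
    # (?<![-\w]) stops 'color' from matching inside 'background-color'
    m = re.search(r"(?<![-\w])" + re.escape(prop) + r"\s*:\s*([^;]+)", style or "", re.I)
    return m.group(1).strip().lower() if m else None

def suspicious_elements(html):
    """Yield tags whose inline 'color' matches their inline background colour."""
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup.find_all(True):
        style = tag.get("style", "")
        fg = inline_value(style, "color")
        bg = inline_value(style, "background-color") or (tag.get("bgcolor") or "").lower() or None
        if fg and bg and fg == bg and tag.get_text(strip=True):
            yield tag

sample = '<div style="color:#fff; background-color:#fff">real estate real estate</div>'
print([t.get_text() for t in suspicious_elements(sample)])

It would miss equivalent colours written differently (#fff vs. white) and anything set via external CSS, which is exactly why the final call should go to a human.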
In order to avoid spam, I don’t include any email addresses or mailto’s in my site content so the spam bots can’t target me – and yet an address such as “contact@domain.com” or “webmaster@domain.com” will still get untold loads of spam. My whois address gets spam too (both email and mass-mailed junk), so I use private registration.
I like the idea of Google automatically alerting me to issues with my site, but I fear my own spam-avoidance practices will exclude me. Is there any alternative? Perhaps a secure webmaster registry on Google linking my contact address to the domains I manage?
LMAO, I could screw up a wet dream. Sorry all, I was trying to get the source code to show up without live links, so as to show the hidden link values and the URLs they point to along with the anchor text. I am clueless.
I know this is way off topic, but the URL Removal Tool seems to be broken. I would normally just use meta tags or robots.txt, but I took a subdomain completely offline over a month ago and the web site is still listed.
http://services.google.com/urlconsole/controller
If you can pass the message on, or let me know an alternate method it would be appreciated.
I’ve received this mail (in Spanish; translated below):
Hello, my name is Víctor and I work for http://www.interdominios.com
We are currently working on search engine promotion for some of our clients’ websites, and for that we need a fairly large network of sites.
We would like to reach an agreement that would consist of placing 5 links generated through a PHP or ASP script. These links come from an exchange system: your page would display links from other sites, and in exchange those sites would display our links, which would get us the search engine promotion we want thanks to you. For this we would pay you €50 per month. Keep in mind that your only job would be to place the links and then forget about it, and if you need help, I have no problem lending you a hand with the installation.
As for the links that would appear on your page, you don’t have to worry about whether they are for casinos, gambling, miracle pharmaceutical products or Viagra, or politics, and of course there are no links to sex sites or anything that could damage your site’s image or its rankings. We always choose good-quality pages, like yours 😉
Let me know what you think.
You can contact me by email at victor.andres@gmail.com, or if you prefer the phone you can call me at 653 49 75 77.
Regards, and have a good day.
—
************************************
Víctor Andrés Pérez
victor.andres@interdominios.com
************************************
GRUPO INTERDOMINIOS S.A.
http://www.interdominios.com
Tlf. 902 199 915
Fax 916 323 917
They offered me €50/month to place 5 links.
In my country there is a lot of spam with hidden text, doorway pages, and other blackhat SEO.
I try to report it, but blackhat SEO keeps increasing.
Sometimes I think it’s treated as normal SEO on Thai websites.
Rather than a system to email webmasters I’d prefer simply being able to enter a URL or domain into Google and getting any ban/penalty information along with the other information stored on it.
For instance, the homepage of a new site of mine that is 100% whitehat and 100% unique content was still zeroed in the toolbar in the most recent toolbar update, despite having some good incoming links and a good number of pages indexed. Google knows about the incoming links, Google has many pages from the site indexed, but actual search rankings aren’t good, and the homepage was still listed at 0 PR. What’s even weirder is that subpages had PR. I don’t know if it is a bug or a penalty (I can’t imagine it’d be a penalty, but who knows. I do know that the former owner of the IP used it for bittorrent porn or something – I found some really weird Chinese referrals in my error logs – but I moved off that IP last month.) In any case I don’t want to waste some Google employee’s time by sending in a request if there isn’t a penalty and it’s just some bug that’ll be worked out as Google updates. Even if I did send a reinclusion request, I don’t know what I’d say, since Matt recommends that we repent our sins and say it won’t happen again. With this site I have no idea what could have possibly triggered any penalty. No invisible text, no cloaking, the only bought incoming link is a Yahoo directory listing, and all the content was written specifically for the site by some well-paid writers; if it’s not unique, someone is using it without permission.
So, long comment made short: if there were an automated way to check for a ban/penalty and get a decisive answer, it would certainly make my job easier.
How about adding the warning functionality to the Sitemaps control panel? This would allow webmasters to ensure they are complying and avoid the “no email, no problem” assumption.
On a related note: How about a feed of the errors listed in the SiteMaps control panel?
Great! It’s about time. So often, pages and sites disappear and we are left guessing why. Maybe now, at least, we’ll know why.
that’s a great idea. I’d love to know what is going on with my sites. But why not just make it part of Google Sitemaps? We have a tab for robots.txt and errors. Why not just tack one on for penalties? That way only people owning the sites would have access to their information.
E-mail is great but I think you’re already sitting on a much better way of doing it.
Hell, put the re-inclusion request right in there too while you’re at it.
It seems that since my posting of this spam, the site owner has removed the hidden links from their source code. Just in case the site owner comes on and says it was never there, I have taken a screenshot of the source code before it was changed. I have posted this to my server just in case my integrity is quested again.
Thanks Wayne, you finally did me a favor. Just as an FYI, the original designer of this old frames site put navigation links at the bottom of every single page, with links back to the home page as well as to his own site. That way, if the non-framed pages are ever found first (which happens regularly, as they do come up in relevant searches), viewers are guided back to the properly framed version of the site with the left-hand navigation links. All the other pages of the site still contain these bottom-of-the-page navigation links, minus the link back to the designer’s site, which was promptly removed back when we were all told site-wide external links were “bad.”
The page you pointed out is one of the two frames pages on the site. One frames page has the left hand navigation for the site and the other is the top frame which is merely the site’s logo to “brand” it and make all the pages look the same. As this page has never had any content, no one has ever had a reason to even look at it before, and every professional SEO that has looked at this site has missed this one, even when I had them remove the rest of the original SEO’s links back to his site during the site wide link clean up.
It is clear that these were navigation links and that they were not intended for spamming – it’s not like it was a long list with hundreds of keywords as you make it sound. And of course as soon as you posted about it they have been removed. As Fred said above – in order to fix something “you have to know what has to be cleaned up.” I have always fixed everything that has ever been pointed out to me right away and this was obviously not intentional.
As for the rest of your “complaint,” there should always be a non frames version of a framed site that contains the content of the framed page. That is what you are supposed to do when you have a framed site. This is not considered duplicate content and does not increase the page size of the site.
If you see anything else legitimate, please feel free to point it out and it will be fixed immediately if it is wrong. I am happy to do it. And just FYI again, I do still have those old emails (as well as the page snapshots) between us where you copied my site word for word on about 18 pages and refused to take it down (until I was forced to report you for fear of getting slapped with a duplicate content penalty myself), even after I contacted you personally and privately and gave you over a 30-day “grace period” to take it down because you said you were moving, so let’s not talk about integrity. It would be nice if you would contact me directly for a change and extend me the same courtesy to correct any errors you might find, as I did with you originally.
Hey Wayne,
You have hidden text at the bottom of your home page, but since it was hidden, you will probably say you never noticed and blame some SEO.
BTW, your integrity has never been “quested”.
Hey Kirby, don’t just talk – let’s see it. I know you are one of Diann’s linking partners and I have a good idea which one, but I have listened to you and a few others who come to defend her. Stop talking and show me the proof.
Open Letter to the friends at AdSense Team
I hope some of the folks at AdSense Team read this post, or Matt to ask them to read it.
Visiting a PR future ranking tool, I saw an AdSense spot referring to the following:
———————————-
Boost your ranking
The one and only script that includes content generator plug-in
darksideoptimization.com
——————————
Clicked the AdSense link to arrive at a site offering a search engine cloaking script:
“About this script
The main purpose of this script is to boost your search engine traffic by delivering different content to real visitors and search engine bot. Many people believe that this cloaking method will not work in major search engines. But in reality, this method had bring in ten of thousands of dollars revenue for our networks.”
My question is: how come AdSense allows such an advertiser to promote a script to cheat the search engines, including Google?
I.e., while the webspam team and the Google search quality team are doing their best to fight spam and blackhat tactics, we are witnessing the AdSense team allowing the opposite!!!
Oh well….
Diann, I guess you will have to wait for my response; I think my post tripped a spam filter since I quoted a lot of what you said. Check back later for the response.
Wayne, highlight the white space at the bottom of las-vegas-homefinder.com and you will see it. It’s probably an innocent mistake, as stuff happens, but it’s there nonetheless. FYI, it’s not a matter of defending anyone, just of not blaming all success or failure on something as hard to define as spam. As this example points out, not everything is spam.
Perhaps Google was smart enough to figure out that the links on the other site were no more blackhat than your hidden copyright notice – unless you think that this is why you don’t rank for “All Rights Reserved”.
Kirby, I wouldn’t know about that site as it hasn’t been touched in over a year. I don’t update that site, and I am sure you can understand that if someone were going to use hidden text to manipulate the search engines, it would be for a primary keyword 🙂
Matt,
I see some posts here that refer to cloaking, but since I’m a newbie, perhaps you can shed a little light on the subject for me?
I understand the basic concept of cloaking (I think I do anyway) and see how it appears to expressly violate Google’s terms, but I also see a number of listings in the SERPs that appear to be cloaked pages (the result sends you to a landing page at a different URL that does not appear to have the same content that is in the result description).
How successful has Google been in finding these sites and removing them? I would assume that those listings are removed from the SERPs, but do they also remove the landing site (the assumption being that the owner of that site is the one who created the cloaked pages on a different URL)? Or does that site get away with it because they’re using a different URL?
It’s a little frustrating to spend all your time trying to do it the “right” way only to be passed up in the SERPs by people with less emphasis on white-hat SEO…
Any clarity you can provide would be greatly appreciated!
Hey Matt,
Got a question and it may just be a stupid one but hopefully you or someone else can answer it.
If I have a table with a background colour different from the page background (say blue) and I put white text over that, and presuming my actual main background is white, does the Google algorithm see that as fine, or as hidden text? I presume it has the ability to work out that the table background offsets the text.
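To make the question concrete, here is a toy illustration – not how Google actually evaluates this, just showing that “hidden” only makes sense relative to the effective background, i.e. the nearest ancestor that actually paints a colour. It assumes the BeautifulSoup library and old-style bgcolor attributes.

# White text in a blue cell on a white page is visible; the check has to
# walk up to the nearest painted background before comparing colours.
from bs4 import BeautifulSoup

def effective_background(tag):
    """Walk up the tree to the nearest explicitly set background colour."""
    node = tag
    while node is not None and getattr(node, "get", None):
        bg = node.get("bgcolor")
        if bg:
            return bg.lower()
        node = node.parent
    return "#ffffff"  # assume a white page by default

def is_hidden(tag, text_color):
    # naive string comparison; real code would normalise names vs. hex values
    return text_color.lower() == effective_background(tag)

html = '<body bgcolor="#ffffff"><table bgcolor="blue"><tr><td>some white text</td></tr></table></body>'
cell = BeautifulSoup(html, "html.parser").find("td")
print(is_hidden(cell, "white"))  # False: the cell sits on a blue table
print(is_hidden(cell, "blue"))   # True: blue-on-blue would disappear

So in principle the table background can offset the text, but only if whatever does the checking resolves the background the way a browser would.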
Thanks
An automated process that would allow us to check why/if a site got penalized would be awesome. One of my sites got dropped with the Jagger update. I verified the code and anything else I could think of. I put in a reinclusion request, but the results are very disappointing. I also used the “not satisfied with the results” option, but again – no luck. The site has unique content, was very well ranked, and then it just dropped off the (Google) earth.
For the honest webmaster trying to make a living off the Internet, these things can have a significant impact. I was hoping BD would bring the site back, but now I am at a loss. So, having a way to know what I might have done wrong on the site would be very beneficial.
Christoph
Any word on how we can be assured of getting an email? What method does Google use to determine which email address to send a note to?
Glad to hear the email “notification” program is on track, though I agree that Sitemaps is probably the best way to allow you to more scalably deal with notification. However I’m still thinking that many, many site problems still stem from algo issues that are not addressed in the guidelines. It would sure be nice to find a way to address that without compromising the secrets to the vault.
OK, I’ll stop after this, but prompted by the real estate one earlier, I did a search for real estate several times in a row:
http://www.google.co.uk/search?&q=%22real+estate+real+estate+real+estate%22
Are there any pages there not using hidden text / keyword stuffing?!? Including the National Trust:
Historic Real Estate Program — National Trust for Historic …
real estate real estate real estate real estate real estate real estate real
estate real estate real estate real estate real estate real estate real estate …
http://www.nationaltrust.org/historic_homeowner/buying_selling/era_real_estate.html
I am so glad you are contacting webmasters before banning them. That gives me a sigh of relief right there. I’ve worked hard on my various websites and would be emotionally discouraged if I were to be penalized in Google and not told why and not told how to fix it.
It may be normal for your home page to have a low PR if Google considers it too broad or not specific enough. Remember, Google and most other search engines prefer to send users directly to the pages that have what the searcher is looking for. Looking at my stats on various websites, I have never seen Google send someone directly to my home page; usually they arrive at a page that is more appropriate to their search. Which is actually what I want. In fact, sometimes I get frustrated because Google sends the person to a similar page but not the best page (for example, sending them to a page that links to an item instead of the page with the item they are looking for). Also, consider that PR is determined, in part, by people linking to you. Other people may not be linking to your home page; they may be linking to your inside pages instead. That would explain the low PR for the home page.
Also, switching topics, a way to preview your comments would be good. I’ve included a blockquote (which I assume is supported since other people have used them above), but am not sure it will work. So the ability to preview your comment before posting would be good.
I just realized that what I said a minute ago is not totally accurate. Google does send people directly to the home page when they enter the URL of the website into Google instead of into the browser’s address bar and then click on that result. I found out that some people do that on purpose to avoid the website showing up in the address bar’s list of recently visited websites.
Is this a valid place to point out sites that are going blackhat? After all, I know of quite a few sites that rank quite highly for their keywords, yet have hidden/tiny/white links in their pages.
One in particular has 60 valid links on the homepage, followed by 80 links in a div called “hid”, wherein all of the links are white text on a white background, with a font size of 1 pixel.
I hesitate to mention it, unless this is considered a good place to do so.
Matt, I see in my logs that you and others at Google are busy looking at what pages one of my websites has. If there is a problem, why don’t you email me? My address is very visible on the pages. Let me know what to change; I’m standing by. Why waste everyone’s time: your time with several manual checks and my time with the re-inclusion process in case you decide to penalize me. P.S. My sites are built in accordance with Google’s guidelines, but Google has been very hard on white hat affiliate sites lately.
I find this whole idea somewhat ludicrous. If you quite simply build a site using relevant content, and take the high ground, your site will be free of spam… agreed?
If not, you are wasting others money who have to use Adwords to promote their businesses, and therefore should be given no sympathy from legitimate webmasters or Google.
Occasionally there are a few who become penalised for unwittingly adding spam to their sites, but on the whole, these people are looking to ‘jump the queue’ and gain an advantage over their competition.
I don’t see how any webmaster can use the argument that ‘my SEO told me to’ or ‘someone else did it’.
We are talking about commerce here, which is something Google is acutely aware of and enjoys significant revenue from. To advise spammers of ‘the error of their ways’ is daft in my book. Hang ’em out to dry! You are in fact allowing these people a means to correct their ways with a little bit of seo help from Google. Isn’t that a reward for dodgy dealing?
The site above that you’ve illustrated is deplorable in my book and this site should not be given any correction as you would only offer this sort of content in the full knowledge that you are breaking Google’s quality guidelines. In other words, you are a spammer!
Personally, I think Google could better their time improving serps by allowing one page result from any domain (first page results for one of my keyphrases only shows 6 unique domains due to duplication. Come on! Spread the wealth guys!), dealing with spammers with multiple domains redirecting to one main site, redirects intended to retain a previous (obsolete) page’s rankings on Google and getting rid of those directories ( I thought that was what Froogle was for)
Please excuse the rant, but this is from someone who spends 30% of their profit on Adwords as they are on page 2 of the serps when they have competitors making £50k a month courtesy of spam.
Matt, if you can get hold of my email address via this post. Please contact me and I’ll give you chapter and verse.
Apart from that Matt, Google’s great. Long may it continue!
Hi,
If I sell rosey widgets, pink widgets, and blue widgets, and I write on my site:
rosey widgets, pink widgets, blue widgets without it being part of a sentence, will I be keyword stuffing? If yes, what if I write on my site:
I sell:
rosey widgets, pink widgets, blue widgets
Will I be keyword stuffing then?
so Matt, is your team hiring? If they ever are, will you let me know?
I notice that most of the top search results for keywords like newborn photography have hidden text using font=”-1″. It’s been like that for over a year and a half since I’ve been watching. Does Google not look for that, or is it just very slow at cleaning up the SERPs?
Funny…I read this, and I hear “Highway to Hell” in my head. Anyone else have that problem?
Actually, Harith raises an interesting issue with that. Some of the ads that are being contextually served may not apply to the overall theme of the site.
For example, I’ve noticed on a site I recently launched pertaining to layouts that the word “column” triggers AdSense results related to columns in the building/architectural sense.
How would Harith and I go about reporting our respective issues, please and thank you?
Hello Matt,
Can you please give us an update on why there was a massive drop in pages around Feb 21st in various Bigdaddy datacenters? It is disappointing because one of my excellent information sites lost almost 50% of its pages. All pages were 100% unique. Is there any particular reason for this?
Those pages still look fine on non-Bigdaddy DCs.
The problem is being discussed in various forums:
http://www.highrankings.com/forum/index.php?showtopic=20500&hl=
http://www.webmasterworld.com/forum30/33244.htm
Hi Matt,
Is there any other trick apart from fully cleaning the site before requesting reinclusion? I already tried reinclusion for an important site (3 times within 8 months) but did not get any reply, and of course no reinclusion either. Is there a special trick to get any feedback from your team?
My question is, when is Google going to do anything about all the scraper sites, and about sites (that I have reported) that don’t have any content but Google ads?
If I do a search for my domain name, I see a lot of sites that have used my domain name just to get visitors. Tons of them don’t even show a link or anything on the page, but if you view the source, the domain name is there along with the keywords I use for my site.
I do NOT think that sites like that should even be in the index, but I see no result from reporting a violation. Redirects and other sneaky tricks are bad, but to me there are more important things, like web site thieves being indexed in Google. If anyone at Google were REALLY interested in getting quality web sites indexed, then all these sites would be gone, but it looks like it just comes down to a “personal” thing between the web site owner and the thief. To me it is like identity theft!
Matt,
That site that was using hidden text that I asked you about…took it out… I feel great about that. I am glad that your efforts are making a difference. Thanks again.
Malcolm,
LOL. Great find.
It’s amazing the blatant keyword stuffing that goes on still AND how Google is indexing it.
I think a few emails need to be sent out.
What about people who buy up an existing site, fill it up with spam, and use JavaScript redirects to some other affiliate site? I am seeing a lot of this right now. It seems that Google cannot catch JavaScript redirects at the moment. I have seen a site that has been up for months doing this.
Hmmm… So, George… what do the following (at the bottom of your homepage) have to do with your hotel site:
@Adam Senour and Harith: You can click on “Ads by Goooooogle” associated with each adsense panel to provide feedback on the ads or the site displaying the ads. 🙂
I think it would be fair to email webmasters when Google detects a problem with their site. I learned the hard way that I was breaking the rules. My site was banned for nearly six months. Many novice webmasters are simply making honest mistakes and they’re being classified as black hats. Google Sitemaps could evolve into a great tool for detecting site problems.
Hi from another Matt 🙂
We face the situation where we don’t know what to clean up.
Large numbers of pages suddenly lost rankings in September. Some older pages solidly retained their earlier rankings. New pages stopped ranking.
The site in question is a news website, run by a pack of journalists. We definitely do not have keyword stuffing. We definitely do have some press releases. No cloaking either. Could it have been a duplicate content penalty due to the press releases? That might explain why some pages retain rankings, but what explains new pages not ranking?
We were well known after a couple of years; our brand was getting established as a site with a unique take on current events. Now the site isn’t worth working on anymore, since we need to focus on where we can actually make money. The brand name is slowly disappearing from everyone’s memory after six months of obscurity in the SERPs. And we’re clueless about what to do.
Posting in WW turns up no way to investigate. I could certainly remove every single page on the site that is even remotely related to any press release. But what if nothing happens? And why do something so drastic unless you know it’s the problem? Or could the problem be those sites that copy my content onto thousands of pages and cloak it for Googlebot?
It would be great if you could help, Matt. I spent 2 years posting in WW about the advantages of being ethical and never, ever messing with Google, and it hits me of all people 🙂
Is there any way to check if a certain URL is delisted? I’m not talking about a full report of delisted sites but say a webmaster didn’t receive the warning e-mail for some reason (caught up in a mail spamkilling program for example), the site gets delisted and someone suggests checking if the site is delisted or not.
Would they have to check search results for something that they know should be listed (and if they get no results then they can assume they are delisted) or would they have to e-mail Google and ask?
Jimmy.
Wouldn’t that spammy text be red flagged by google regardless, hidden or not?
Nick Said,
February 27, 2006 @ 6:42 am
Wouldn’t that spammy text be red flagged by google regardless, hidden or not?
____________________________________________________________
I don’t believe so, or the site that I mentioned, which has been #1 in Google for Las Vegas real estate for 3 years, would have been removed.
Thu: does that apply if we own the site though? I figured there might be some sort of other way based on that alone.
I know that this doesn’t have anything to do with my original post, just like the comments, but Beyond5Stars is not only a hotel site; the name is for anything that would be beyond 5 stars, e.g. jewellery and all the other “expensive” things you can come up with. We are the first to regret that we haven’t had any jewellers add listings, but we wanted to attract them too. Guess it didn’t work, so I have removed it!
I don’t know a lot of SEO stuff and might be doing things wrong 🙁 but I guess I must be doing something right when other people steal from me. If anyone has ideas on how to do better SEO for my site, I’d gladly accept them! I want to follow the rules just like most of you guys!
Don’t know how the heck I missed this issue last Sept, but I sure am glad to see Google being proactive on this. It should cut down on the “Why does Google hate me?” questions I get on my blog and the emails your support team must get. Matt, can you tell us approximately how many emails you expect to be sending on a monthly basis? I read somewhere you’ve sent fewer than 100 so far. Cheers, Kalena
I sure wish they would have emailed me 🙁
I have had a site for about a year and it has been indexed in Google, with PageRank and all. A while ago it seemed to have come out of the sandbox, since I started getting hits from Google, but today you can’t even find it if you search for domain.com. I can see a lot of scraper sites using the domain name and text from the site, but not my own site. THIS SUXXXX BIG TIME!!! And I wish there was a way to really find out what’s going on here, since I never got any email from Google…
Yes, nothing would have been better for me than a mail from Google saying “we are removing you for these reasons.”
Hmm, but then I have not been removed – just this inexplicable drop in rankings for 70% of pages in September, and that’s it!
Marc,
I have the same question that you do about the background color of tables & text. Just wondering if you ever got an answer to your question? Thanks.
George: I see your site listed when I searched the domain. Maybe there’s an update so there are differences in results based on the server we hit. I’ve noticed differences in results when I do a site search on my own site.
Also, it looks like the scraper sites are no longer listed when I search for beyond5stars.
I know it’s maddening to see scrapers steal your content and slap adsense on it. I don’t think they’re singling out sites, though. They look for sites that contain keywords they want to use.
Matt, how’s the feedback for the spam report tool working? I mean, how fast is Google’s response on that front? I’ve already reported a hidden text issue many, many times with no action so far.
The offending site is www[dot]letras[dot]terra[dot]com[dot]br. They have keyword stuffing on every page.
I’d appreciate it if you could pass that on.
I am not really sure that hidden text is the major problem with people getting banned. I know it would be hard for Google to say “hey, your site is banned for this,” because then people would just keep making sites that push right up to that level until they don’t get banned. There needs to be some way that people can pay for reinclusion or something. Sites unintentionally trigger alerts and this results in them getting banned. Think about it.
Here is some software that can detect hidden text before Google does.
http://www.detect-hidden-text.com
Because it’s IE-based, it is able to detect all kinds of hidden text regardless of the tricks used.
I think Google should not count that text, but I don’t think they should ban entire websites for it.
Hi Matt,
How relevant is the URL in determining search result ranking? If there are 10 sites which each have approximately the same amount of links, meta tags, content, videos, blog entries, etc., and the user searched for the string “red fast foreign cars”, and nine of the sites had URLs containing one of the search terms while one URL contained all of the search terms – wouldn’t Google always put more weight on the site whose URL contained exactly what the user actually searched for?
Hello Matt
Please tell me, does keyword density play a very important role in SEO???
If yes, then do we have to compromise on quality content to attain a top rank in the SERPs?
Please give me appropriate suggestions regarding this.
Thanks a lot.