We’ve started a pilot program to alert sites that we consider to be outside our quality guidelines. Some of this was already discussed on Threadwatch, but let’s put the info in one place in convenient Q&A format:
Q: Are you writing to every site that receives a spam penalty?
A: No. Right now we’re running this as a test.
Q: What sort of sites are you writing to?
A: This is not targeted at sites like buy-my-cheap-viagra-here-while-consolidating-your-debt-and-buy-some-posters-about-online-casinos.com, but more at sites that have good content yet may not be as savvy about what their SEO was doing, or what that “Make thousands of doorway pages for $39.95” software was doing.
Q: Are these sites penalized forever?
A: No, they can return to the index if they correct or remove the pages that were violating our guidelines. See my previous post about how to do a reinclusion request for advice on that.
Q: Are you emailing webhosts too? Where are you getting email addresses from?
A: We’re not trying to email webhosts, just the site owner or webmaster. Our primary way of finding whom to contact is via email addresses found on the web site itself. If there really aren’t any, we try a few addresses like webmaster@domain.com and support@domain.com. We may try a contact address found via a whois search as another backup, but we will avoid emailing the technical or admin contact from whois.
Q: Can you give me an idea of what an example email might look like?
A: Sure. Here’s an example one for hidden text.
Dear site owner or webmaster of http://www.chefrevival.com.au/,
While we were indexing your webpages, we detected that some of your
pages were using techniques that were outside our quality guidelines,
which can be found here: http://www.google.com/webmasters/guidelines.html
In order to preserve the quality of our search engine, we have
temporarily removed some webpages from our search results. Currently
pages from http://www.chefrevival.com.au/ are scheduled to be removed for at least 30 days.

Specifically, we detected the following practices on your webpages:

On http://www.chefrevival.com.au/, we noticed the following hidden text: “Chef Revival Chef Uniforms – A range of stylish, comfortable and durable chef uniforms designed to withstand the pressures of today’s kitchens, Chef apron Chef Jackets Chef Pant Chef trouser Chef headwear Chef Apron Chef Shirt Chef Neckties, Chef aprons Chef Jackets Chef Pants Chef trousers Chef headwears Chef Aprons Chef Shirts Chef Neckties, traditional check chefwear clothes”

We would prefer to have your pages in Google’s index. If you wish to be
reincluded, please correct or remove all pages that are outside our
quality guidelines. When you are ready, please submit a reinclusion
request at http://www.google.com/support/bin/request.py

You can select “I’m a webmaster inquiring about my website” and
then “Why my site disappeared from the search results or dropped in
ranking,” click Continue, and then make sure to type “Reinclusion
Request” in the Subject: line of the resulting form.

Sincerely,
Google Search Quality Team
Q: Matt, are you excited about this?
A: Heck yeah. I’m glad we’re trying to proactively contact webmasters and site owners when there’s an issue with their site in Google. I’m so excited that I split an infinitive in that sentence, didn’t I? Doh! 🙂
As I said over at TW, well done, mate. Communication with webmasters is a great step forward. I'd like to see the emails sent only to on-site contact addresses or the admin contact from the whois information, but other than that, a huge thumbs up from me.
Hi Matt, may we have an idea of how many messages have been sent so far? I am surprised that only one person (Hampstead), AFAIK, has reported receiving this type of message.
Does that mean that most blacklisted websites are really spammy? Does that mean that Google is not so good at detecting JS redirects? Or simply that this is a very small-scale test?
If a split infinitive is good enough for Star Trek, it’s good enough for me 🙂
This first came up at SEF, where the guy who received the email said that it had also been copied to the host, and that’s a cause for concern. Judging by the email addresses used, it was one of the can’t-find-an-email-address-on-the-site types.
It’s quite possible that a host will not want a site that spams the engines, in case the whole server becomes a bad neighborhood and affects many other sites on the server. It’s possible that it could cause real problems for the site if the host is also emailed, as the guy at SEF said they were.
I think the idea is excellent, but I don’t like the idea of also contacting the host.
Correction to my previous post. The guy at SEF said:-
“Email was sent to the email addresses found on the website. Also to generic addresses like webmaster@ and the like. I was also surprised to find that the host were copied in too.”
The email address was found on the site, and still the host was also emailed.
Matt
That's what I call “Google's Fair Play”. This move should promote friendship and partnership between Google and “white hat” webmasters.
We've sent under one hundred so far, Sebastien. That was to check on the workflow and see how things went. One thing we found was that Hampstead got an email to the technical & admin contact email from a whois search, so we're going to change it so that we try not to email those addresses. That's why I mentioned in the Q&A that we wouldn't be doing that. That's the sort of thing we wanted to learn from doing a pilot program.
The other thing to bear in mind is that these sites are mostly unaware of the whole SEO thing. Turn off JS and do [site:sewshop.com] on Yahoo, then look at a few of the cached pages like http://www.sewshop.com/notions-sewing-supply.html
The site itself is for a real small business, but they clearly made a bad choice and bought some piece of software that makes gibberish pages that do a JavaScript redirect with code like
var1=15;
var2=var1;
if(var1==var2) document.location="http://www.sewshop.com/acart/index.htm";
The owner of that business is probably not going to go to WMW, SEW, SEF, IHU, TW, or other SEO hangouts.
PhilC, regarding the comment you said to cancel, I'm guessing the text was the most likely reason. I'd be curious to know what the site was.
Hi Matt –
Great new program if you can expand it to reach most who are in the dark about ranking problems.
You’ve mentioned JS redirects a LOT as evil, yet most sites that show popunders use them without penalty, e.g. big news sites. JS redirection is acceptable in *some* circumstances, right?
It’s a nice idea.
I wonder (if the idea is fully rolled out) how long it will be before we find networks of sites set up solely to test which techniques are triggering the spam filters.
A spammer could even set up a mail filter to grab the complaint message, auto-fix the spam and submit the reinclusion request.
Hi Matt,
For clients that may have bad pages on their sites (Like old, ill-advised Traffic Equalizer pages), can I get away with 302s that redirect from the bad stuff to the home page, or do I need to 404 them?
That's awesome, Matt; best of luck with the new program. I know that if I were overlooking something on my site that was hindering my PR or SEO, I would love an email from someone at Google explaining what's wrong.
2 Kudos from me.
>> PhilC, regarding the comment you said to cancel, I’m guessing the text was the mostly likely reason. I’d be curious to know what the site was.
The site is http://www.holidays.org.uk
There was/is no text of any kind that I’m aware of that would cause a flag. I sent a Reinclusion Request, and I’d be happy to copy it to you if I knew where to send it to (my email is phil@ukzone com, or phil@webworkshop.net). The request includes all the details of what was and is in the site.
The site wasn't 100% squeaky clean at the time it was penalised, although there was nothing massive in it, but it's totally clean now. I'd be very interested to know what caused the penalty – especially if it was the javascript redirects, which are essential for the functioning of the site.
Brian, I would definitely not just do a 302/301 from old doorway pages–those pages should be removed. Any doorway pages should be removed, and then return a 404 status code.
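For anyone wondering what "remove the pages and return a 404" looks like in practice, here is a minimal sketch, not an official recommendation, assuming an Apache server and a hypothetical /doorway-*.html naming pattern for the removed pages; adjust the pattern to your own URLs:

# Hypothetical .htaccess sketch (Apache mod_alias): any URL matching the old
# doorway-page pattern now returns a 404 Not Found instead of a redirect.
RedirectMatch 404 ^/doorway-.*\.html$
# Or, to signal that the pages are permanently gone, return 410 instead:
# RedirectMatch gone ^/doorway-.*\.html$

Deleting the files themselves has the same effect for exact URLs; a rule like the one above just guarantees the error status even if a stray copy or query-string variant of a doorway URL is still requested.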
Matt,
I used a DMOZ scraping script on one of my domains and the domain got banned. I deleted all the files from the server a month ago. The only mistake I made on the domain was using that script.
Can I apply for reinclusion? After reinclusion, will the domain still have some black marks, or will it be whitelisted?
I ask because I bought that domain for my dream project in 2003.
Hi Matt,
I like this idea very much.
Another topic I’d like to see you discuss is duplicate content penalties. I don’t know if the penalties are as severe as people say they are, but they shouldn’t be.
What I think Google should be aggressive about is penalizing scrapers and people who copy content from the copyright owner's site to another domain. But if you have duplicate content on your own site – one page for printing, one page with large print, one page for Palm users, and so on – why should that site be penalized? Why is the onus on the site owner to decide where to disallow your bot?
Instead of penalizing a site for internally duplicate content, why not choose the best page within the site?
I do understand that you don't want a site to display and target 5 pages for the sake of search engines, but I think the current system is Google changing the fabric of the web in order to make its search engine results better, rather than doing a better job of organizing the content that's there the way it is.
I read a lot of comments on SEO forums and here like “Google says JS redirects are bad but there are many legit uses of JS redirects.”
I just want to point out that what triggers the anti-spam countermeasure IS NOT the JS redirect but THE PURPOSE for which the JS redirect has been added by the webmaster.
While Google currently is focusing on JS based methods, the final goal of any anti-spam algorithm is to detect spam REGARDLESS of the method used by the spammer.
So, don't think in terms of the METHODS that Google wants to fight but in terms of the webmaster's INTENT that Google could dislike.
Fantastic news to hear, Matt. When a site returns to the index after all the spam has been removed, does Google still impose a penalty of some sort, and how quickly can the site owner expect to see normal, natural rankings again?
Many of my sites have been down in the rankings in Google for a long time now. A few of them ranked high in Google over a year to a year and a half ago. All my sites now have top rankings in Yahoo and MSN. The thing is, I don't know if I've been hit by a penalty. I haven't done anything different or shady since they ranked high or were on the radar. I have seen Google index or split the two URLs (www and non-www), but I started using all absolute internal URLs not too long ago and that seems to have helped Google get the right URL; the ranking is still poor, though. I'm nervous about trying a 301 because Yahoo and MSN seem to have no problems ranking the site and I don't want to mess with that, plus I have heard stories of sites that continued to have problems after 301'ing, or had new problems. I've kind of given up even trying with Google, because at least there is Y and M. Boy, though, I do miss finding my sites in G. It was good traffic.
Matt
Will the new scheme be restricted to sites that have been banned manually or are picked up by the automated spam triggers? The idea is great and will help many people who have stepped over the line somehow.
Regards
IanC
There’s nothing at all wrong with split infinitives! I hope Google isn’t going to start sending us email about grammatical mistakes *grin*.
(Though a voluntary service for pointing out likely mistakes on web pages would be a real boon, especially for non-native language users.)
On grammar checking, I reckon Google could do better than Microsoft Word – see http://chronicle.com/temp/reprint.php?id=y0przoap7fzws5o9ez7p7vv3h95hdg6a
And this could be part of a lightweight web-app word-processor… (My mother is using OpenOffice and that works fine, but it’s at least an order of magnitude more complex – counting controls and menu options – than she actually needs.)
Matt
I have a suggestion.
Why not create an "Alerting site owners to problems" opt-in mailing list that webmasters can subscribe to? Promote the list on Google's webmaster guidelines pages, among the webmaster community, and to SEO specialists (so they can advise their clients to subscribe). This way you would avoid possible problems in countries where sending "unsolicited" alert emails is illegal.
I personally wish such a list had existed 6 months ago 😉
Matt, here’s a wacky question for you. It appears to me that in Google SERPs, the link displayed always has a space inserted somewhere in it (the text, not the href itself). Is this a move by Google to help stop spammers from skimming content from Google SERPs to generate their own spammy pages?
Michael
YES. Communication!
😀
Alan Perkins wrote: “I wonder (if the idea is fully rolled out) how long it will be before we find networks of sites set up solely to test which techniques are triggering the spam filters.”
This has been discussed in various forums, but I don’t think it could happen within a reasonable timeframe, and I don’t think that it would be worth doing.
I'm just guessing, but I think it's likely that the type of no-no the system will target is stuff that all SEOs are aware of, so there'd be no point in testing. I really don't think that the email is going to say that you have one too many instances of a certain word in the Title, or anything so refined.
I’m still guessing when I suggest that it wouldn’t be a rolling and automated system, and that it’s more likely to be used when (a) someone at Google is hand-checking a site for spam, and (b) when Google runs a profile/filter over the index which flags certain sites for checking by hand.
If my guesses are correct, then test sites would often be missed, making them worthless, and the time it takes to spot a test site would often be far too long to bother with.
Why do I guess those things? Matt said that they have sent 100 emails out for the trial. If it were anywhere near a fully automated system, they would surely have tested with more.
Just my guesses, of course.
Matt,
Thank you for all your time and effort with this blog and all the helpful posts for those webmasters that are trying to do it “right”.
I am amazed at the interaction; it is another example of why Google is the best, will be the best and their simple mission is accomplished from the top down. As a user, I remember leaving altavista in the late 90’s because I could find a needle in a haystack with Google.
In addition, thank you to Google for correcting our inaccurate penalty. I look forward to, and request, any feedback that will help our site be listed on Google AND help our site be the best resource, with accurate and simple navigation to our services and subjects.
What would be the From: address on these emails?
With the gazillions of sites using questionable tactics, how did you arrive at which hundred or so to call out in your pilot test?
Mr. Cutts,
The problem wasn’t with your split infinitive. It was the use of the word “proactively.” 🙂
Alan Perkins – you are nearing grandmaster level. I love it! Thinking ahead to the day when SEO’s test what degree of black hat technique triggers a Google contact letter! It would be funny if it weren’t so true.
So using split-infinitives will result in Google-banning?
I think it is a good thing that you've started the pilot. There are a lot of people out there who are unaware their sites are penalized.
I am trying to stick to the guidelines; however, it looks to me like Google is totally ignoring my site (it is only 4 months old), while it is ranked exceptionally well in MSN.
The hardest thing is to figure out if I am on the right track. At least there will be some warning in the future if I am not.
Matt,
We like the idea. I have 2 points I would like to run past you…
1. Would it not be better to give an advance warning, then check again, then ban if nothing has been done? Some people may innocently lose business because of this.
2. It would be nice if we could have a tag to put in our pages with the email address we would like used for communication. Or perhaps we could list the domains we use in our Google account and associate an email address with each domain; if a site you are going to ban is listed in an account, you would email the appropriate address. You could use the same verification system as Sitemaps. This would also mean spammers can't grab the emails, but we could still set up dedicated addresses for collecting information about our sites from you.
Get in touch via searchgrub if you wish to discuss these ideas further, however just thought I should run them in front of you 🙂
Dave
Searchgrub.com
Matt — again, supplying Google with a contact email for an individual site should be part of the Sitemap spec. You should ask webmasters what they really want Google to know about them and their site, and build that into the protocol.
Great idea from Google, hope it will continue. I think it is a nice way to increase awareness of search engine techniques.
Matt, if action is taken at once, will Google still remove/penalize the site for the full 30 days, or can the site get back in earlier?
Another thing, please, if you can advise me: does this have anything to do with the recent changes in Google's latest results? I think many people noticed a change. What I mean is that sometimes we find good sites with quality content on Google that then disappear or have their rank pushed back to later pages, while other spammy or lower-quality sites appear instead.
thanks again for the great idea,
That's a really good idea, especially for people like us who had a site dropped from the index last year that is still not back, but have absolutely no clue as to why. It's horrible to be in that situation, so it would be fantastic to be given an idea as to why.
Great idea and I really hope it works.
I think this is a great idea. Thank you
Matt: Thank you for this update. This is great news, since I have apparently been penalized because my site host handled 301 redirects incorrectly. This was not intentional; however, my site has been penalized and I am praying every day to be re-included. After being upset about not being contacted, I quickly realized that with the number of sites out there it is unrealistic to expect Google to tell the intentional from the non-intentional. Now, it looks like it is possible! Thank you!
John Sabia
Fort Lauderdale Real Estate
Apropos of Alan Perkins' comment above, it was my understanding that Google simply banned large blocks of sites wholesale without examining them by hand, because to do so would allow spammers to reverse engineer the spam filters…
This did give me visions of angry robots roaming the countryside zapping the peasants who "looked funny."
This pilot program sounds like a step in the right direction for better communication between webmasters and Google. I think there are many, many, many cases in which webmasters would clean up their sites, if they only knew what they needed to do. Of course, there are also many cases of webmasters who should already know what should be changed.
Matt, would you like to be a guest on SEO Radio to talk about this and other SEO/webmaster topics? Please email us if you are interested, we would love to have you on the show.
Big thumbs up for this – I hope the test proves successful.
Anything that opens up a dialog between the search engines and web publishers has got to be applauded. Seems like a real win-win idea.
With Google’s market share – a webmaster losing their Google traffic might as well shut up shop – and the technically naive, or badly advised, who’ve strayed from the white hat path, might never know what hit them.
Nice one Google.
Matt,
What if you have a site and know that something is wrong? With this, is it possible to get an answer from Google as to what can be done to correct the problem?
Greetings Matt,
I appreciate all the great insight you're giving us in your blog posts. Thanks for this tip. I launched my site TECK Reviews this week after some experimental SEO, and I'm closely monitoring my results to ensure I haven't done anything wrong 🙂
I know this is not the best place to ask, but I couldn't find your email anywhere (probably because you'd get hundreds of emails a week, heh). I was wondering, does Google employ interns at all? I'm based in the UK, about to start a degree, and would love to work at Google in a sandwich year. (Yes, I'm sure you hear that every day.) Just wondering.
Best Regards,
– Dean
“[we email] support@domain.com”
Have you guys never heard of Spam traps?
Hi Matt
Off topic…sorry.
Would you be kind enough to convey my "BIG THANK YOU for a job well done" to the Google team/engineers who handled my reinclusion request?
And once again… thank you, Matt, for posting the reinclusion request guidelines on your blog.
Well, I have a different position on Google and spam.
Google loves spam sites. Google has a good opinion of sites with:
1. Keyword density over 14%
2. Hidden text
3. Cloaking
Without these "qualities," Google cannot index "quality" sites well.
I've reported to Google many sites that perform trivial spam.
Before, these sites were at positions 8 and 2 of the second page of the SERPs.
Now these sites rank very, very well. It's strange… but that's how it is.
Look at this : http://www.google.com/search?hl=en&q=cerco+casa+roma&btnG=Google+Search
I asked Google to list all the results for real estate in Rome.
Your results may naturally depend on the datacenter, but look at http://www.babelecase.it/gat10.htm, the first result in Italy.
Somebody in the SEO world is spreading real misinformation. It's a strategy to cut competitors out of the market for search engine results.
My English is too… bad. Sorry.
Best regards
Hi Matt,
This blog is really helpful for webmasters like us who always try their best to do it right. I must say it's a great initiative from Google to inform webmasters about anything they might have done beyond the scope of Google's guidelines.
I just have a few suggestions that you might consider:
a) Could Google have a request form or something similar through which webmasters (who suspect a penalty or ban on their site) can ask Google to advise them on anything they are doing wrong?
I mean, that would open two-way communication: not just Google proactively informing webmasters, but also webmasters taking the initiative to ask Google for help.
b) Also, for sites that rank well in other major SEs like MSN and Yahoo but are not doing well in Google (my site is an example 🙁), sites that are not spammy and are actually trying to provide good content and service to their visitors: could Google take any initiative to help these websites do things Google's way so that they do well in Google SERPs as well?
I believe Google's motto is to provide searchers with useful results, so if Google helps these sites, which are actually good but unable to do well in Google SERPs, they would eventually come up in Google searches, and thereby Google would actually be providing its visitors with more good results.
This is great news from Google. Thanks for the update.
Does Google apply its English rules only to English sites, or the same rules to all the other languages?
The US is the most important market on the net, but what about the other markets?
I filed several spam reports for spam sites, hidden text, and providers using redirects on their clients' domains, and Google ignored them (I wrote in English – bad English, but I think anyone can understand it).
Spanish and Portuguese sites are outside the rules. If you want to spam, it's easy: host in a country with a different language. Google allocates the site to the server's IP and lets you get away with hidden text, redirects, etc.
Hi Matt
Would it not be far easier to roll this out to the SEM/SEO firms that practice white-hat, user-friendly, "help Google help users" work within the quality guidelines, so that they can reach out to all their clients and clean things up even more?
I can’t wait to read the forums on this one.
Clint
Hi Matt
I would gladly pay Google, via a "webmaster registration site," to inform me if one of my websites was making any errors.
Tony
(The Man who built R2D2)
Matt:
As a professional marketing company we appreciate when search engines (and not JUST any search engine, but Google) work with us to get our clients the best results.
It is greatly appreciated.
Sharifah Hardie
Hello Matt,
May I just say I love the blog (I really do!)
My site was born at a similar time (July 17th?)
http://www.FitnessBegin.com
You have no toolbar PageRank and neither do I.
It will be interesting to see when PageRank becomes active for you and me, you being an employee.
Actually, from what you say, Google seems to have the end searcher in mind (very credible), so I'm sure they'll be impartial.
Still, it's interesting, as your backlinks are very high with avid readers like myself.
Anyway, keep blogging and I'll keep reading (or whatever they say active blog readers do).
You seem to have a talent there!
Best regards,
Simon Gould
http://www.FitnessBegin.com
Hi Matt
I just found out about this news from a newsletter, which led me to your blog.
I hope I will read more insider stuff from this blog.
cheers
Hi Matt
You mentioned in a previous post “What’s an update?”:
“I’m happy to try to give weather reports when we do our update scoring/algo data though”.
Would you be kind enough to inform us whether there is an update (algo changes) going on right now?
Thanks!
In response to "Search Engines Web" – I disagree with you. Google has every right to penalise a website that goes blatantly against its guidelines, and it is about time they began to clean up some of the results. It may be a good website with valuable information (which Google is saying it is); however, there are many ways for them to improve their positioning by using correct page construction elements, without losing anything at all aesthetically or from a marketing point of view. Easy shortcuts for a business website are not a good long-term solution, especially if they go against known SE guidelines and industry best practice. Why should a business expect free traffic from the SEs if it cannot respect them? It would be so easy to fix the situation – and to improve the quality and appeal for visitors at the same time.
In case anyone is interested, I've been having a little rant, putting in my 2 cents' worth over this initiative, at SEF in Hampstead's thread. 😉
Whilst I commend Google for its efforts and initiatives, I think that if they are serious in their aim to provide better search results, they need to be more proactive with honest webmasters who don't set out to intentionally spam search engines but get caught up in imperfect filters. One minute you can be flavour of the month and the next you're public enemy number 1. At least in the civilized world you'd be entitled to a trial. Filing re-inclusion requests as previously recommended is, in my experience, a waste of time. If you are lucky enough to even get a reply, it is mostly a cut-and-paste response which bears no relevance to the situation. Maybe this is sour grapes on my part, but recently I had a site drop dramatically in the rankings. The very next day I got an invite from the AdSense team to help optimize ads on the site because the click rate was so low. Could it be that people don't click on the ads because they are already happy with what they have found? Sadly, 30,000 Google users a day are now denied access to my site.
As far as I can see, Google still has problems with 301 redirects, doesn't obey robots.txt, and has a bad habit of clinging on to long-dead pages that no longer exist; and if you use the URL removal tool, Google will bring pages back from the dead once the request has expired, even though they haven't existed for months. Perhaps you could expand on these points in future topics.
We are told to build websites for users, not for search engines, and I would wholeheartedly agree with that sentiment, but it simply isn't possible anymore. Because of the whims of search engines, webmasters now have to pander to the search engines over the needs of the users for fear of tripping one of many half-baked filters.
Keep up the good work though. I enjoy reading your posts.
Great move by doing this, Matt! Maybe this will help remove the stigma floating around that "Google is evil".
Chuckle – Google ain't evil (I believe they try), they just don't always succeed… If pathways to site owners are being sought, I say Google should live up to its own guidelines and get rid of those that don't meet the stated requirements. Why send e-mails – just do it…. Naive site owners would wake up QUICKER…
I think this is a great idea, it’s a shame you don’t go one step further and tell us when an aggressive filter has been applied to our sites! I have one PR5 domain which is now all but useless, since you applied multiple levels of filtering on it starting in Feb 2005. If I knew why Google has developed a deep-rooted hatred for this site, I’d happily do something about it – but since no information was given, it’s hard to fully determine why filters were applied and what to do to remove them so my business can actually make some money.
It’s obvious to me that Google now has a long list of “don’t likes” and some of them are fairly unreasonable in nature. The latest ones appear to be:-
1. Sites with strong affiliate content.
2. Increasing filtering on new sites which adopt any kind of early link building practices, even with similar-themed high Google ranked sites.
Re: 2) above – I particularly hate the way Google treats new web sites which don’t acquire many quality links soon after launch. They nearly all languish in the merky depths of the search results for anything up to 15 months, unable to surface whilst Google tries to milk their owners for Adwords revenue. All other search engines treat new sites with a lot more dignity, especially well designed, non-spam sites. In the early days at least, Google doesn’t seem to care whether a site meets the Google webmaster guidelines or not – it still applies heavy filtering to make sure no money is made. I can only think that the link acquisition trend analysis part of the Google algorithm is being pretty unfair to these sites as just about any form of early stage SEO reduces a site’s already awful rankings down to abysmal levels.
One more thing, when is Google going to detect hidden text and ban sites for it? A recent search for “web design newbury” shows the top listed sites to have hidden text, invisible H1 and H2 headings, hidden spam content lists with { display: none } in the style sheet. Whilst my sites languish in the 100’s these sites are happily #1 ranked in Google as it seems unable to spot these nasty SEO practices. I even found one of these web design sites using the same nasty hidden spam text technique to stuff adverts for his services into his client’s web sites. They even rank in Google for “web design newbury” too even though they’re sites listing nannies! It makes me extremely angry that these type of sites do well whilst Google filters mine to oblivion!!
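To illustrate what that comment is describing, here is a hypothetical sketch of the kind of hidden-text markup involved; the class name, headings, and keywords are invented for illustration, and it is shown only so honest site owners can recognize (and avoid) the pattern:

<!-- Hypothetical example of hidden text via CSS, the technique described above -->
<style type="text/css">
  .seo-stuffing { display: none; }  /* block is never rendered for visitors */
</style>
<div class="seo-stuffing">
  <h1>web design newbury</h1>
  <h2>cheap web design, affordable web design, newbury web designers</h2>
  keyword list repeated purely for search engines, invisible to users
</div>

Visitors never see the block, but a crawler reading the raw HTML does, and that mismatch between what users and search engines see is exactly what the quality guidelines treat as hidden text.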
It’s high time that the sophisticated Google algo found this kind of stuff.
Give this some thought and check out that search term if you want a laugh. For me, I don’t know whether to laugh or cry, given your recent Algo changes….
What happens if a site uses a JavaScript redirect for legitimate purposes, such as geo-redirection to send users, based on their IP, to the appropriate country site? The site still needs to place some content on the initial page for search engines to index and rank it well. Would this type of redirect be considered spam if there is content on the initial page that visitors don't see?
Hey Matt,
Will those first 100 webmasters who get the email also get a black Google cap like the first 100 Adwords Professionals got?
🙂
This is, in my opinion, a huge step forward for Google. I have been approached by several agitated webmasters who, after dealing with shady SEO companies, did not have a clue why their site was removed from the index. I could not give them a valid answer either, as I did not know. Now we have reasons, we can fix things, and we can make the web a better place all round.
Google's communication with the webmaster/SEO community has really come on in leaps and bounds with all these nice new initiatives.
Matt, you are really making it easier to be a white hat webmaster!
Kim
Hello Matt
I wish Google would just come out with a paid service that we could use to see if the site in question passes muster in Google's eyes. I would certainly pay a reasonable fee to submit my site(s) and know for sure whether what I am doing with my sites is offensive to Google.
No one would have to use it if they did not want to, and no guarantee would be needed, just an honest opinion from Google about the submitted site so we could be sure it was not in violation of any Google policies for being listed.
It just really sucks not knowing for sure. I could hire some high-buck SEO guy, but why should I have to just to know if my site is OK? I would much rather simply pay some of those bucks to Google and know for sure whether my site has a clean bill of health.
I bet this would generate a pretty damn good income as I am sure I am not the only one who would end up using it.
Matt, would you be able to tell us the domain name that these emails will originate from?
If Google is going to email the contacts in the whois record then I know a lot of people use spamguards in whois recs. I operate the myprivacy.ca whois privacy service and would be willing to whitelist the Google MTA’s so these messages can get through.
Mr Cutts,
Google could show this information from the search box:
WHY: http://www.mydomain.com
Steve
I noticed that something like 80% of the real estate websites seem to have vanished from the SERPs.
They only show up for their domain name.
What is Google's plan for the real estate industry?
Is it to have just a few websites listed on Google, forcing all the realtors to use AdWords?
It seems like good news, but most spam sites I see have Google AdSense on them, which is the main reason for producing spam content on those sites, isn't it?
Google is also getting a lot of bad press lately due to their redirect hijacking problem.
Hi Matt,
How do you guys plan on handling things like this:
http://forums.seochat.com/t55946/s.html
Seems your spam systems need a lot of work still.
Matt,
I’ve only recently become aware of this blog, and appreciate the feedback from you and Google.
One of my clients was a recent recipient of the above-mentioned pilot program email. He wasn't even aware he was doing anything wrong, but he corrected the problem and applied for reinclusion. How long can he expect to wait for a reply? What is typical?
Thanks in advance for your reply.
My site http://www.buy-and-sell-car-secrets.com/index.html was dropped from Google on 9-23-05. I had hired an SEO guy to do some very basic SEO work and found out, immediately after my site was dropped, that he had added some hidden text. I went through each and every page, because of course he didn't keep a record of which pages he had done this to (it was only a few), and removed the hidden text. I've done several reinclusion requests explaining what had happened, but they only referred me to the webmaster guidelines the first time, and now they are just ignoring me. Google is killing me. No one will ever touch my sites again, but this isn't fair. It's been about 6 weeks now. I see that Google keeps spidering my site and then a few days later they drop it again. Is there anything else I can do? I'm desperate.
Thank you.
Hi Matt
My site has been down for about 30 days. I have sent several reinclusion requests. Can you tell me how long reinclusion takes?
Thank you
My site (above) is back on Google! It got dropped during that update. I got an email from some nice person at Google last week assuring me that it had NOT been banned or penalized. Only a few pages have a PR so far. My home page was a PR5 before so it will be interesting to see how long it will take before it gets back its PR.
Hi there, I found your blog recently and have been trying to digest as much info as possible. Thank you by the way.
Has this program of alerting webmasters gone anywhere? I sure hope it rolls out all the way. See, I have been playing with computers for years and years, but this search engine stuff really has me stumped, although if this webmaster notification tool works out, it would be a huge relief.
The bad neighborhood thing in Google is so mysterious that there seems to be no real definition of it, or even examples of what to look for. All I see is people saying adult, prescription, and link farms. Well, it is sooo hard to tell what a link farm is today; I have seen blogrolls that make me wonder… and certainly adult and prescription web sites are online, and people are looking for them, so these can't all be bad.
My problem is that I want to do everything right. But the issue of what a bad neighborhood is, and how to tell if a site I link to becomes a bad neighborhood, has me stumped. In Google's first ten search results today there is one that offers a free program, in exchange for a link, that will alert you if there are links on your site to a bad neighborhood. But now I think that may itself be a bad neighborhood. I don't know! HELP!
Also, what is this crosslinking thing, or is it out-of-date news with all these blogs crosslinking… will Technorati be a crosslinking link-farm bad neighborhood next update?
I am so confused, but this little webmaster notification thing you posted about here may fix that for everybody, and get everyone to build better web sites… and that makes me happy!
Please.. any update would help me sleep at night!
Steve
I am adding the RSS for comments on this page, although it appears that perhaps you do not have time, or do not reply to comments here, maybe?
I hope more discussions ensue on these topics, as I have posted basically the same question on the Google community forums with no replies… ugh!
I think it’s a great approach, and frankly I’m amazed that such an effort is even being made by google, considering the sheer number of sites that are delisted each day (any stats on that?).
My question is: if a domain is listed as "private" in the whois, will they still send an email to the proxy email address?
Hello Matt,
Thank you for developing this blog. It’s a great resource but I am still stumped on how to get back into Google’s good graces.
Our web site SummerJobs.com has been online since 1996. We attract tens of thousands of visitors to our site each month, list hundreds of seasonal job listings each summer, and are linked to from hundreds of other web sites.
However we have dropped in search results on Google from page 1 (for more than 8 years!) to page 15+ for the keywords “summer jobs”.
For the past year we have torn through our web code in hopes of finding answers to what is making Google drop us. We’ve studied their webmaster guidelines religiously. Every time we think we may have found the “thing” that is wrong — we submit a re-inclusion request and wait.
We have repeatedly asked Google to provide us with more information, but only receive the canned email responses. How can we get included in that pilot program to alert sites that are considered outside Google’s quality guidelines?
Your advice is much appreciated.
With Regards,
Michelle
Michelle all you have to do is have patience. Start using some nice white hat SEO and I guess time will tell.
I have a similar problem to many of you guys. I had a forum and information site that was fairly well listed in Google, and now it has been wiped out entirely. I'm not sure what happened!!
A search for http://www.buggynews.com shows that no pages are included in the index. Both MSN and Yahoo still include me.
I'm at a loss as to what has happened. I've sent several re-inclusion requests, but so far have not heard back from anyone, and seen no results.
Well… I finally got a reply from my reinclusion request of about 3 weeks ago. A representative took the time to let me know that I am not banned or penalized, but that I’m just not included at this time. Doesn’t make much sense to me, as I was well included previously.
It has, as so many of the comments above show, always been devastatingly frustrating for honest webmasters to see their work – which may well be their major or sole income source – blasted away by utterly unknowable, impersonal algorithms. It is, I would say, *vital* that search engines provide some method of acknowledgement and feedback (other than canned form emails) for inquiring webmasters. This pilot seems a start, but it needs to go to production as soon as possible.
Frankly, I am amazed that Congress has not yet taken up search engines. For one or a very small number of companies to have the power to essentially destroy innocent businesses, even (or perhaps especially) by simple inadvertence, is astonishing. The so-far successful "private enterprise" and "free speech" arguments will not hold under scrutiny – think of credit-reporting services, and the constraints on them. Google and the other SEs have at least as much make-or-break power as a credit-reporting service.
I am sure that Google, at least, wants to be both friendly to honest webmasters and clean ethically. It is thus, to repeat, essential that those whose very livelihoods are at risk have some means to know what is wrong, and what they need to do to fix it.
Please expedite this program!
Just read this article and can only imagine the massive volume of emails Google may be sending to bad webmasters! I’d like to see an opt-in notification service, perhaps tied in to the URL submit form and Sitemaps. Where do I sign up? 🙂
Just curious if "G" plans to do something like this in Sitemaps?
Matt,
This sounds like a great idea. I certainly encourage it.
A question I have is: will Google confirm for a webmaster when a site is not being penalized in any way?
The reason I ask is that while the Big Daddy update has been rolling out, one of our homepages has curiously disappeared from the results under our main search terms. Secondary pages do show up, but not the home page. This doesn't make any sense, because there is nothing we're doing on the site that is outside of the Google webmaster guidelines.
For webmasters in our position, it’d be very helpful to be able to determine if this is a glitch that will resolve itself or if something else is at work.
My guess is that this issue might relate to the canonicalization resolution issue you mentioned is part of this update. But it would sure be helpful to be able to confirm this.
Also, this strange homepage disappearance was not showing up in the test BD DCs that you indicated prior to the full update. So that's even more perplexing.
So, all that is to say that it would be helpful to be able to determine what is or is not happening with a certain URL as pertains to the Google SERPS. Especially when a site has consistently ranked highly and then suddenly disappears from the SERPS for no apparent reason.
I’d really appreciate it if you’d offer some feedback to this question. Thanks in advance.
The problem is that right now it's all a guessing game in these situations. And of course this is a little stressful when a high ranking is important for site revenues.
Matt,
I think this is a great idea. A few of my sites were penalized in Yahoo for about 8-9 months for what I think was crosslinking and duplicate content. If they had a service like this, I would have corrected the problems immediately. Instead, they just penalized me for what I think was a fixed amount of time. I had corrected my sites shortly after the penalization, and Yahoo still kept me in time-out for about 8-9 months.
I wish they would have emailed me when my site with a PR of 3 suddenly disappeared after a year in the index.
I used the reinclusion request, but all I got was a standard email telling me to search the FAQs for answers, which doesn't help at all. You can read all you want, but I need specifics.
Google could use the sitemaps pages to tell people if there is something wrong with the site(s), if they’ve found duplicate content or any other violations so people could fix it before the site(s) gets dropped from the index.
I don't know why my site got dropped after a year, and with the new Big Daddy update I could see it climb in the SERPs just to suddenly be dropped again, which sucks big time 🙁
Matt said:
The site itself is for a real small business, but they clearly made a bad choice and bought some piece of software that makes gibberish pages that do a JavaScript redirect with code like
var1=15;
var2=var1;
if(var1==var2) document.location="http://www.sewshop.com/acart/index.htm";
Now, if this is bad, then I have reported tons of sites that use this type of code, and the only thing that happened was that MY SITE DISAPPEARED, while all the sites using the code got better rankings using my domain name and scraped content. What kind of a deal is that? 🙁
This is the site http://12824DOT48g6kl.DOTinfo/ with this code:
georgia department of correction offender query
this domain has tons of “numbered” subdomains. Is this spamming or not??????
First off, thanks for your time at SES NYC, Matt. It was good to get your feedback on the issues that are really specific to our site, and also just to get the impression that you genuinely care about what those of us on this side of the line are going through.
I asked you about a site participating in a link farm that seemed to be doing well nonetheless. You mentioned the possibility that the site deserved a top ranking after the link farm was discounted. I just believe that I could have shot our own site in the foot–our site is nowhere to be found anymore! We were really climbing over the last 2-3 weeks, and now we’ve been pushed off the entire map. What could have happened?
Brian, I would definitely not just do a 302/301 from old doorway pages–those pages should be removed. Any doorway pages should be removed, and then return a 404 status code.
Matt, my 2nd issue was that I’ve been cleaning a site up since it was hit by Traffic Power over 2 years ago. Those TP redirect pages were eliminated immediately (2+ years ago), but they are now in the Supplementals. All of our legit content pages are in the “Omitted” results & I don’t know how to get the index to read our site properly. Instead of the 404 status code, I told my webmaster to set up a “Custom Error Page” that redirects visitors to the homepage. This is now in the place of any of those old TP pages and also any other possible mistaken URLs entered by visitors. Is this acceptable?
I really really really hope that asking for advice & help didn’t lead to crippling our rankings in the SERPs. Any advice for what I may have done wrong? If not, why the immediate & precipitous drop when nothing else changed?
I JUST realized that the entire site went supplemental?!? Matt, please check out the thread over at http://www.webmasterworld.com/forum30/33351.htm
…
How can I request that one of these notification emails be sent? Thanks.
-Matt
I'm not sure you are serious about this issue… is it serious or just a joke? Because if informing webmasters of their failures to follow Google's TOS were a serious idea, why does Google never answer such requests? Instead of starting pilot programs like that, I would prefer to get an answer to a serious and friendly question when I can't understand my error. But I never got one….
Matt,
I think that this is a great idea. Why not tell people, “Hey we noticed this __________ and if you want to maintain your placement then change it.”
There are a lot of people out there with decent sites but not much knowledge about what the rules are.
Hi Matt:
You mention that doorway pages are affected by this. I have read various articles on doorway pages; is it true that if a page is linked within the site (e.g., on the site map page) and is easily accessible by anyone visiting the site, it is not considered a doorway page?
I think this is a great approach to take…many companies get scammed or misrepresented, many SEOs sail a little close to the wind, hell, SEO itself necessarily pushes the envelope…giving people a chance to make good is a very grown-up attitude.
And, great blog!
Hi Matt: greetings from Germany. I have been reading your blog for 5 or 6 hours now to find any hint or small tip about what Google did on the 12th of April with our site. Google has kicked out the complete website, and I have now been searching for 22 days for mistakes on our website. I read the information for webmasters and the webmaster guidelines, but as far as I can tell (I am not a programmer) I found nothing that conflicts with the webmaster guidelines.
I wish Google would send me an email like the one in the first posting so I would know what is wrong with our page. I'm thinking about our server crash on the 12th of April; the site was down between 8:00 and 12:00 in the morning… Is that the reason? No, I don't think so…. Please excuse my English… it's been a long time.
Hi Matt,
Does Google offer newsletters?
Hi Matt,
Nice blog! This is the first time I've seen it; very nice, I enjoyed reading it all.
I have a question that I think relates to many webmasters. When using an affiliate program, you are actually duplicating your site content. How will Google know not to penalize those sites?
We gave our previous webmaster the right to create affiliate websites that are duplicates of our site in some ways. We own the domains, but he makes the commission on every sale he generates from the sites he wrote. Will that be considered spamming? Can we be penalized because of his actions? As far as I know he didn't put in any code to trick Google, but he created static pages for all the products he promotes (which are all ours).
P.S. Google sending emails to webmasters is a good idea, but most of the common email addresses are not valid, because every email spammer knows them too. So you have to create an email address that is hard to detect and does not show up on your domain registration… With all this in mind, your "penalized" email will not reach its target destination.
I have an idea: let webmasters initiate a request by visiting a special page created by Google (webmasters would enter their domain) that would check whether the website was penalized and then reply to the webmaster's request.
LOVE TO HEAR BACK FROM YOU!
This is a great idea. I could also see this being used in a very deceitful way by data miners trying to get security information from a company, e.g., using dummy pages to get people to log into their Gmail account.
Great to see Google being proactive on this issue. I am sure many innocent companies have lost thousands of dollars from changes being made that penalized them in the future.
This sounds like a great idea, but I am not sure how practical it is. Even if Google tries its best, they may not even reach 1% of the webmasters who are penalised. Google customer support is already overloaded and slow, and I don't think all the replies from webmasters are going to help that. No wonder that even after almost a year of this blog, there is not much difference in the reality on the ground.
Augustine
http://www.hostingadvices.com
Someone sent me this thread from the Dreamweaver user group. I'm the web designer, but I think our web host did something this weekend to deliberately remove Margaret's Cleaners from the Google search engine (because he was upset for no good reason). I filed a Reinclusion Request already. I'm using my private email because we think our "host" is also reading our email. Is there any way to find out whether a removal request occurred?
Your Mr. Nice Guy approach is great in terms of getting good PR (public relations) points, but instead of focusing your attention on unknowing site owners who have been victimized by spammy SEO companies, why don't you go after the SEO companies that optimized these sites using unacceptable tactics?
You're not dealing with the root cause of the problem by sending these nice emails to the victimized sites. Why don't you campaign for rooting out spammy SEO companies instead and really go after them aggressively? If you can take down 1 spammy SEO company, you'd be taking down dozens if not hundreds of spammy sites. These spammy SEO companies would then think twice about using SEO spam to optimize future clients if they knew that Google would penalize them for it quickly.
I redesigned a website last April (simply-cedar.com), which was a year old at the time, and it appears to be penalized because the main keywords don't rank at all. If the other designer did something that caused a penalty, we have no knowledge of it. We sent in a reinclusion request a couple of months ago, but not knowing the reason for the penalty, the request went unanswered and the site sits in limbo for we know not what. I've had several SEO people look at this site and they can't find anything wrong with it.
It looks like this project never went anywhere.
Hi Matt,
We feel our competitors are spamming our website – spamming not in the orthodox sense, but by placing our links on bad/porn sites and on a ring of completely unrelated websites. I have yet to check whether those websites are penalized or in the spam database, but in case they are, what options are open to us? We do not want our listings (rankings) and PR to be adversely affected by these "bad" links (which may already have happened, because we know Google just had a PR update and we fell – though we can't say for sure until the update is complete, the trend right now is that we have fallen). We have asked the webmaster to remove the links, but if they were placed intentionally, then the links may not come off so easily. It would be great to have your advice and guidance in this matter.
Looking forward to hearing from you soon,
Best Regards,
Kieron
I wonder if this has started again, because recently I found that one of our sites was dropped from the SERPs for every popular keyword it had ranked for, for almost 3 years.
We never received any notification, nor do we seem to violate any rules, but I am kind of surprised as to what has happened. Any updates on this, guys?
It's unfortunate this doesn't seem to have been pursued any further. It's a great idea and shows Google's forward thinking. How come it got stuck?
How about Google's algorithm for duplicate content? Suppose there are 10 sites with the same content (just like the ready-made AdSense websites); would Google give consideration to the site that was indexed first, instead of penalizing every site and having them all delisted?
Today I received one of these e-mails as well. While I generally appreciate Google's approach, it seems to me Google is relying on its algorithms a bit too much. From the e-mail (originally in German; roughly translated):
"[…] In particular, we detected the use of the following techniques on your pages:
* Pages such as xyz.de that forward to pages such as
http://www.xyz.de/index.htm using a redirect that does not
comply with our guidelines."
The funny thing is, I am using standard Apache settings, meaning xyz.de/ and xyz.de/index.htm deliver identical content, and the redirect, if provoked, is a 301 permanent.
I absolutely cannot see any violation of Google's guidelines here.
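For anyone in the same situation who wants to rule this out, a common workaround at the time was to pick one canonical URL and redirect the other to it explicitly. A minimal sketch, assuming Apache with mod_rewrite enabled in an .htaccess file (the file names are generic placeholders, not taken from the commenter's site):

# Hypothetical .htaccess sketch: collapse /index.htm and /index.html onto the
# bare directory URL with a single 301, so only one URL serves the homepage.
RewriteEngine On
# Act only on direct client requests, not Apache's internal DirectoryIndex lookup
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.html?\ HTTP
RewriteRule ^index\.html?$ / [R=301,L]

Whether the e-mail above was actually triggered by this is anyone's guess; the sketch only shows how to make the duplicate-URL question moot.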
Best,
Stefan
Hello,
I received one of those fake emails about banning a page of ours (which does not exist). Nevertheless, I was shocked at first, because I thought maybe they had just named it wrong in the mail.
To suggest some ideas for serious warnings:
* Why not check whether that page or email address was ever used by a registered user, and then deliver such an essential message to his inbox there, only announcing it by email: "Dear Mr. X, please log in to your Google account and check your inbox."
In such a case, there would be no doubt about the seriousness of the message.
* In any case, there should first be a "warning" and time to repair the offending scripts or whatever, scaled to how excessive the unwanted technique is (very bad, e.g., 5 business days; bad, 7 days; slight ones, 2 weeks…).
(We don't use anything like that intentionally; however, as written in a post above, it could be the unintentional use of a script that is banned or uses unaccepted techniques, which one might not always be able to see from the outside.)
* When I received this fake email, I wanted to contact someone from Google, but I only found several email forms that gave automatic responses. Finally, what a surprise, I found a form and received a reaction from a Google employee… There should be ONE definite communication channel to ask through when essentially necessary – and blocking a webpage is surely a VERY essential case (maybe access to a personal contact by using a certain code written in the email, or something like that).
I know it's very difficult to solve, but I think only the big players are able to find solutions against such frauds and hoaxes and fakes and whatever… There are many, many serious web workers out here (like us) who of course struggle for success and who suffer greatly from the unwanted methods of too many people worldwide…
Regards from Germany
Hans
Matt wrote about this nearly 2 years ago. Did this idea die off?
I could have used an email to tell me that the reciprocal links I did over a year ago were going to get me penalized.
Too bad this never went anywhere.
I wonder if they are hiring at KMart.
This webmaster notification email should be sent before the removal is in progress! Today, if you get this email, you have no chance to avert the removal. Even if you resolve the issue instantly and instantly send a reconsideration request, you get removed anyway.
In addition, you don't have any way to contact someone to discuss the issue and reasonably resolve it.
You just get removed. You know, some people rely on Google; some people need it for their business, and you may get removed because of some trivial issue.
Yes, it's done to protect Google's quality. But I ask you, what is more damaging for Google's quality: one single word rated as "hidden text," or the removal of thousands of pages with quality content?
Dear Sir, I like Google, but for all the things you guys have thought of, there is no place in all of this material to contact Google to say Happy Birthday. Yahoo has one.
Whatever happened with this initiative? Has anyone actually received an email (or other notification) regarding a site about to be banned? Just curious, as I had an issue with Yahoo and wish that something like this had actually existed before I was yanked. It was due to an SEO I paid whose practices were not what Yahoo considered legit. Notification would be awesome.
I got a letter from Google
“Currently pages from [url] are scheduled to be removed for at least 30 days.”
I had 1 hidden sentence on my index page, just so the search engines could index it, because my site is Flash; that's it. When I got this message I removed the sentence within 2 minutes and sent a reinclusion request.
Will I still be penalized for 30 days? That seems a little harsh to me.
My website has reappeared in Google after 4 days of absence.
I guess reinclusion requests do actually work. 🙂
I think sometimes Google is not able to find pages on a person's site, so it might not show a page it had previously indexed. The best way that I have found to get around this problem is to have a site map that you regularly update.
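For reference, a minimal XML Sitemap file following the public sitemaps.org 0.9 protocol looks something like the sketch below; the URL and date are placeholders, and the optional tags can be dropped:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- one <url> entry per page you want crawled -->
    <loc>http://www.example.com/</loc>
    <lastmod>2007-01-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>

Save it as sitemap.xml at the site root and submit it through the search engines' sitemap/webmaster tools (Google Sitemaps is mentioned elsewhere in this thread) so newly added or moved pages get picked up.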
I think Google's alerts are a very good idea. I hope other
search engines will start doing this as well. Very helpful.
One of my sites at http://www.maidright.info has not-so-hidden text that seemed to help with ranking for a while, but it drops out of the listings for weeks on end all the time. It is of poor design, built only with Fireworks, but it is a site that our customers like. That is really what is most important, or was, but I hate losing the volume of results from long tails. If I pull the added text, will it be reindexed and stop dropping? If I use alt tags with info on each pic, would that help? Questions, questions…
Dear Mr. Cutts,
If I had had support@domain.com, would I have gotten the alert? Because I was not aware, and now it's too late. Or is it still in beta?
Yours truly,
Dave
This was a good idea; I hope the idea is still alive.
This is a good initiative from Google. Everybody can be assured that we are given fair play in the rankings of Google SERPs. It should reduce the number of people using or applying black-hat strategies in search engine optimization.
I would very much like to know when there's an issue with my site in Google. Any suggestions would always be greatly appreciated!
Always try to avoid hidden sentences on your index page.
Why does Google waste my time by directing me to pay-to-view (paywalled) web sites? This is a waste of bandwidth. It includes a lot of stuff at IEEE and New Scientist, etc.
That's a great idea, but where do you get the time for that? Having a goal of weeding out bad SEO practices is great, but wow, very time consuming. Now, if there were any way you could identify the SEO company that caused the infraction and then find all the websites they are running SEO packages for, that would definitely help cut down the time.
Definitely alert site owners. Most owners have their sites drop and it kills them and they don’t know why. The more communication for well intentioned non-Viagra-pushing websites out there just trying to do good business – the better.
Great service! I received a note from the "Google Search Quality Team" this afternoon and I initially thought that the email itself was spam. However, after researching the facts, I did indeed find that an unknown entity had set up shop on my domain and spoofed Western Union, and had even captured details for one unsuspecting user already.
Kudos to you and your team! My only suggestion is that you come up with a forensics tool that backtracks the initial incident and zings the perpetrator. But perhaps that is a pipe-dream.
Thanks again,
So, Sheriff Cutts… thanks for deleting my post here yesterday. A case of you not liking it when a site owner tells you a truth. Your "posse" falsely blacklisted our site, and then when we protested, they reversed it, but only after you had caused us commercial damage.
So again my question: "Who appointed you?" … and what happened to "when I make a mistake, I apologise"?
Dear Matt,
Why doesn't Google bother to alert AdSense publishers when their ads are click-bombed? Instead, they simply disable the accounts of innocent publishers. Is there any way that you could alert webmasters when this happens?
With hope that you’d answer my query,
Sreejesh,