Google recently beefed up our webmaster quality guidelines with more info, examples, etc. Normally I’d start the conversation and highlight important points. Let’s turn it around this time. Check out the additional info in the webmaster guidelines — what do you see that is unclear? Are there places where you think the wording is poor or confusing?
I’m so glad you asked Matt 🙂
Hidden text and links:
Assume some software is used to create site pages. Part of the page creation includes a hidden link (via CSS) that only shows if you print the page or view it in print preview. Would this be considered outside the Google guidelines?
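To make it concrete, here is a minimal sketch of the kind of markup I mean (the class name and URL are just placeholders, not the actual software's output):

<a href="/print-version" class="print-only">Printer-friendly version</a>

.print-only { display: none; }
@media print {
  .print-only { display: inline; }
}

The link is invisible in a normal browser window but appears when the page is printed.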
Please note, I’m not asking if this will incur a penalty, ban or even if it will benefit the pages in Google.
Oops, forgot to say that, IMO, the updated guidelines are worded less ambiguously than the old. However, nothing is perfect so please keep updating them as need be.
Matt
I wish to see more info and tips within the webmaster quality guidelines about "Submit a reconsideration request". Something along the lines of your previous post, "Filing a reinclusion request".
Matt, care to comment on why John Chow doesn’t rank for his own name anymore?
http://www.google.com/search?hl=en&q=john+chow
His blog is read by a lot of people who try to copy him. Explaining to everyone where he went wrong would be a great help to a lot of bloggers.
If you don’t explain where he went wrong people will just keep doing it (whatever “it” is).
I still have questions about “Duplicate content”, and especially about blog archives.
Let's take your article 'The role of humans in Google search'. I can find the full article on multiple pages:
– Homepage (http://www.mattcutts.com/blog/)
– Article Permalink (http://www.mattcutts.com/blog/the-role-of-humans-in-google-search/)
– June 2007 archive (http://www.mattcutts.com/blog/2007/06/)
– Google/SEO category archive (http://www.mattcutts.com/blog/type/googleseo/)
I looked at your robots.txt file, and I would have thought you would have blocked the archive pages from indexing, as archives typically just contain 'duplicate content'.
When I google for 'The role of humans in Google search', only the article page shows up (and the comments feed page). Did Google choose that page because that's the page lots of other people link to? What would be best practice for handling duplicate content created by blog archive pages?
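For example, would something along these lines in robots.txt be the reasonable way to keep the archive pages out (the paths here just assume the URL patterns above, so treat them as illustrative)?

User-agent: *
Disallow: /blog/2007/
Disallow: /blog/type/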
(There also seems to be a Blogspot fan of you and AdSense showing up in those results.)
Dear Matt,
Excuse my bad English…
"Make pages for users, not for search engines." Really? A little example: years ago I created a second domain which pointed directly to a subfolder of another domain. It was what many users of a community wanted. Google removed the main domain from the SERPs, and I learned from it: make pages for users AND for search engines. The Google guidelines are more important than the desires of the users.
I always get confused with the names. What is “Webmaster Central” and what is “Webmaster tools”? It’s hard to send people to their “accounts” (for the user, website, verified site?) there without confusing them :-).
The robots.txt entry needs information about the limits on how much of a robots.txt file gets processed. There's nothing worse than having the parser stop partway through a line and read
Disallow: /
when the full line was actually
Disallow: /someextremelylongpath
I realize the initial review at Yahoo costs $299 and it is not a paid listing. However, subsequent years also cost that much, without a review. Is the first year OK and the next year bad (as a paid listing)?
What is http://www.google.com/addurl/?continue=/addurl for?
Why do you continue to explicitly list “Web Position Gold” and none of the others?
And one snarky one which should be a top priority – “If you’d like to discuss this with Google, or have ideas for how we can better communicate with you about it, please post in our webmaster discussion forum.”
This needs to be changed to “please post on one of the many third party blogs and forums” or “on one of our employee’s private blogs” (aka here). Why is this discussion not in the official Google group?
I have one question:
Why is eBay treated differently than any other webmaster in terms of the quality guidelines?
I see lots of indexed internal search, keyword spam, subdomain spam…
Is eBay getting away with it because of their huge advertising budget on AdWords?
How can Google expect webmasters to follow your quality guidelines if there are always plenty of eBay websites and subdomains at the top of the rankings, full of exactly the stuff your guidelines say you want to get rid of?
like:
“Avoid “doorway” pages created just for search engines”
That's what eBay does all the time.
And that's what almost every webmaster out there thinks right now about the quality of your index.
Hidden links (again): Assume you've got inaccessible navigation (the classic case being navigation built with an element search engines can't read), and you're providing an alternative form of navigation for search engines in the form of links that are not visible to users but replicate exactly the contents of that element. Does that comply with the guidelines or go against them?
YES! The guidelines, even after all this time, have nothing about how to deal with language-specific content, especially international variants of a particular language. For example, how to specify the contents of a page, in a method that is understood by Google, as either US-English or UK-English.
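The only markup-level hint I'm aware of is the standard HTML lang attribute, e.g.:

<html lang="en-GB">

but the guidelines don't say whether Google actually pays attention to it, which is exactly the kind of thing they should spell out.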
The lack of internationalisation information there does make it seem like Google is “just another American company” that treats the rest of the world as an afterthought.
A second issue – on the page http://www.google.com/support/webmasters/bin/answer.py?answer=66359 it would be helpful to give examples of non-page-content information (meta and title) that can cause duplicate content, as a reference to point people to.
Wow, JohnMu, good point. This discussion is already occurring on the Google webmaster help group after being mentioned on the Webmaster Central blog.
I think it is great that Matt is getting the word out and requesting more input (as he often does here). By the way, hello Matt, sorry for “third personing” you.
JohnMu's note that this isn't being discussed in the right place has a lot of merit though, Matt. Many of the most active and loyal Google webmaster help group members recently spoke up about how much it irks them to have Google "stars" post things like this, or make semi-official statements, on third-party blogs instead of in the community they are working to build.
So from one point of view, maybe you could have gone to the Google group thread, made an informative, interesting post, and then gotten the extra input that way, right where the discussion had already begun.
From another point of view, you will likely get more input here than the thread on the group will get, so I see your reasoning for putting it up here, but it does seem to fragment the existing dialogue and could have been an opportunity to "pimp the group" a bit.
Don’t know, I am tired. I have comments on the webmaster guidelines naturally, and I will post them on the Google webmaster help thread.
It doesn’t seem to provide guidelines for proper use of rel=”nofollow”.
Where it says “Make sure your web server supports the If-Modified-Since HTTP header” you could save 99% of the world’s population the effort of checking, by confirming that modern versions of Apache and IIS support it.
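Or at least show what a quick manual check looks like. Roughly, you send a conditional request and hope to see a 304 back (the hostname and date here are placeholders):

GET / HTTP/1.1
Host: www.example.com
If-Modified-Since: Sat, 30 Jun 2007 00:00:00 GMT

HTTP/1.1 304 Not Modified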
Hi Matt,
I’ve a longish comment on the changes here:
http://sebastianx.blogspot.com/2007/06/google-enhances-quality-guidelines.html
In short:
– Hidden text/links
– Cloaking
– Sneaky redirects
– Paid links
– Hallway/doorway pages, thin SERPs
need clarifications here and there.
Thanks
Sebastian
I know it's minor, but how about a link to the Lynx browser where you mention it?
“Make sure that your TITLE and ALT tags are descriptive and accurate.”
There are still no “alt tags” in the W3C specification, only “alt attributes.” On that note, it would be nice if Google would include at least one link to w3.org in their “technical guidelines” section, e.g. to the HTML specification or the W3C’s accessibility guidelines.
“If your company buys a content management system, make sure that the system can export your content so that search engine spiders can crawl your site.”
Perhaps "export/display" would be more correct.
"When your site is ready: Submit it to Google at http://www.google.com/addurl.html."
Does this really deserve to be on the top page as a top tip? I never submit any of my sites these days; I think the tip is confusing as it takes emphasis away from other points.
“Submit a Sitemap as part of our Google webmaster tools”
This tip should emphasize that it's *optional*. Again, this can be confusing to webmasters new to the topic, who now feel like they *must* offer a Sitemap file, which they simply don't need to do.
“The Google crawler doesn’t recognize text contained in images.”
Unless you use alt text, I guess, which might be worth clarifying here (I can already see management saying that "images are bad for rankings").
“If you decide to use dynamic pages (i.e., the URL contains a “?” character), be aware that not every search engine spider crawls dynamic pages as well as static pages.”
Might be worth linking to something explaining .htaccess files and how one can rewrite URLs.
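For instance, a single mod_rewrite rule in .htaccess along these lines (the paths and parameter name are purely illustrative) maps a static-looking URL onto a dynamic one:

RewriteEngine On
RewriteRule ^products/([0-9]+)/?$ /products.php?id=$1 [L]

With that in place, a request for /products/42 is served by /products.php?id=42.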
“Keep the links on a given page to a reasonable number (fewer than 100).”
I'm curious why exactly 100 links is a "reasonable number." If you decide to open up the front page with a tag cloud, for instance, more than 100 links may be reasonable. I'm not sure a precise number should be included at all, or does Google simply stop parsing at link #101?
“Does this help my users?”
Might rephrase “users” to “visitors.”
Please correct the spelling/word formation.
http://www.vinodlive.com/2007/06/29/gloogle-sepell-cehck-psls/
Hey Matt:
Looks good. Don't do deceptive things and stay out of bad neighborhoods, like your Mom told you when you were five. There is a saying that you learn all the important stuff by the time you are six.
Should add “If you are worried that you might be doing something wrong you probably are.”
Cheers,
Ted
“If you’d like to discuss this with Google, or have ideas for how we can better communicate with you about it, please post in our webmaster discussion forum.”
I posted the same question 3 times in the "webmaster discussion forum" and never got a helpful answer.
So, here it is again:
How do I get rid of the -950 penalty?
Regarding “Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don’t add much value for users coming from search engines.” – would that cover the DMOZ clones? What about the Google directory and the various versions across those domains/subdomains?
Regarding automated queries: how much is "automated"? Is a browser plugin for PR an automated query? Is a tool on a server that queries Google (be it for rankings, number of pages indexed, etc.), which the user runs manually, automated? Or are only tools which run on a scheduler or which query URLs from a database automated?
This is poorly written; it does not take into account that you are NOT telling people how to get good rankings without employing some of the HARMLESS techniques.
It is fine to say 'don't do this and don't do that', but what alternatives are there for those small websites that cannot compete successfully with the well-funded websites?
The BASIC guidelines are really poorly written and show no empathy for the desperation of some webmasters who may have started off pure and good, but finally got so disappointed that they began deploying these tactics.
In other words, there is no compassion or empathy in the guidelines.
WHAT RIGHT does anyone have to tell someone to make their pages for Humans and not for Search Engines. That is none of your business!!
That is a business decision that every site owner should make for herself or himself. In many cases you MUST create some pages for Search Engines – unless you can afford tens of thousands of dollars for PPC or Public Relations. It is one thing to tell a large firm to follow these guidelines, but since the playing field is NOT equal, you cannot hold the small Webmaster to the same standards – they MUST have some 'real world' privileges.
Link schemes are an important asset for many sites. It is like a support system among small sites to help each other. The bottom line is that people who use these schemes may not have any other way of promoting because of extremely low budgets. IT IS A SUPPORT GROUP – be more tolerant. Some link schemes have standards before allowing sites to participate; they should be judged differently than a completely malicious link scheme. Go the extra distance and judge each link scheme on its own merits. People's livelihoods are at stake.
Doorway pages should be tolerated if there is no other cost effective option to get traffic for misspellings or synonyms. We are not living in an age of CONCEPT SEARCHING, so tactics must be employed that deal with the limitations of search technology as it exists in 2007.
It is not YOUR problem if small sites get NO traffic and large sites dominate the SERPs. It is not YOUR worry that Webmasters are frustrated. You get paid regardless of what concerns they face. YOUR job is not affected because those who have played by the rules get no income from their small sites. And those who get angry and start to react in a manner you disapprove of – GET heartlessly BANNED (that will show them).
In fact, if they give up all hope and buy ADWORDS, you gain. 😕
Hidden Text:
There are some quality sites out there using CSS to offset text so that it's not visible in a browser. See http://www.copyblogger.com for an example of a site that does this. The Copyblogger logo at the top of the site is an image that has been absolutely positioned using CSS and sits directly above a div that contains a hyperlink to the Copyblogger home page. So, a nice, aesthetically pleasing image that changes the cursor when hovered over to indicate a hyperlink.
The link text "Copyblogger" has been offset using CSS (text-indent:-9999px;) so that it's not visible on the page. Now, strictly according to the guidelines this is bad because you cannot see this text. The reality is that the text is exactly what you see in the browser. So is this a breach of the guidelines or not?
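For anyone who hasn't seen it, the technique boils down to something like this (selectors, sizes, and the image path are just illustrative, not Copyblogger's actual code):

<div id="logo"><a href="http://www.copyblogger.com/">Copyblogger</a></div>

#logo a {
  display: block;
  width: 300px;
  height: 80px;
  background: url(logo.png) no-repeat;
  text-indent: -9999px;
}

The link text is pushed off-screen and the logo image shows in its place, so users see exactly what the hidden text says.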
There are other examples, such as the RSS icon with hyperlink text that is descriptive ("Copyblogger RSS Feed") but again not visible.
Copyblogger
Philipp Lenssen,
Following W3C standards has never seemed to be a requirement; not that I disagree, but posting a link to the W3C could imply there is one. I have seen, over time, numerous pages in Google's index which are missing body tags, have code outside the html tags, and so on. I guess it is more about the message and not the delivery (following the guidelines, of course).
http://www.google.com/support/webmasters/bin/answer.py?answer=35291
>No one can guarantee a #1 ranking on Google.
Beware of SEOs that claim to guarantee rankings, allege a “special relationship” with Google, or advertise a “priority submit” to Google. There is no priority submit for Google. In fact, the only way to submit a site to Google directly is through our Add URL page or through the Google Sitemaps program, and you can do this yourself at no cost whatsoever.
followed by
>Make sure you’re protected legally.
For your own safety, you should insist on a full and unconditional money-back guarantee. Don’t be afraid to request a refund if you’re unsatisfied for any reason, or if your SEO’s actions cause your domain to be removed from a search engine’s index. Make sure you have a contract in writing that includes pricing. The contract should also require the SEO to stay within the guidelines recommended by each search engine for site inclusion.
So no one can guarantee anything, yet people are supposed to get a guarantee, so what exactly are they getting a guarantee for?
Matt
I like the information under the heading "Quality guidelines – specific guidelines", and for more information we can click through on each point, but it would be better to include a brief summary within the points themselves, so we can read it there without clicking, I think.
Thanks
Deb
It would be really good if you can cover the use of rel=”nofollow”.
I like the updates. I am glad that they are there and more clear. I don’t think that it is necessary for Google to outline any good techniques. Only what will not be tolerated.
I would have benefited from these guidelines myself and would have been able to avoid getting in trouble in the first place.
Thank you for the update to these guidelines. Hopefully they will help keep more people working on best practices instead of gaming the system.
And hopefully my website will be back soon!!!
Here is my main complaint. Nothing is timely or working. The webmaster tools seem dated (as in not showing day-to-day current information the way Google Analytics does). I know GA is embedded in the page, but couldn't there be an option to join the two to get better snapshots?
The same is true for working with a spam report. I have submitted a site in the past and no action was taken (while the site is clearly spam and breaks every quality guideline rule you have). You had said in previous posts that spam submissions from webmaster tools take precedence over anonymous claims. There is no feedback from this process, so I just tend to lose faith in using the tools.
So these guidelines give me the same feel…somewhat dated. Like the ‘alt tag’ example above. I think that they need revisions that answer the questions above – the questions related to image replacement and ajax techniques that use display: none or negative indenting.
I have a problem with the guidelines as a whole. They’re a nice concept but seeing as Google rarely seems to take notice of them itself then they’re not that useful.
Just take a look at perhaps the most competitive area in search: finance. One can frequently see breaches of the guidelines going completely unpunished. For example: http://www.google.co.uk/search?source=ig&hl=en&q=loans&btnG=Google+Search&meta=
The top result here is a member of the Digital Point Ad Network, an automated link generation scheme. They’re in top position for a competitive term. Yahoo says they have over 6 million inbound links to the homepage alone. Look at the source of one of these and it’s clearly an automated linking scheme, which is also against the Google guidelines/rules. But this site is not penalised for this behaviour, and in fact gets rewarded by being ranked in top position.
So, my point is: why should webmasters pay any attention to Google's guidelines when blatant breaches of those guidelines result in no punishment and even in favourable positioning? Why even try to follow Google's rules and create a better product for Google? And therefore why is Google spending time updating guidelines it doesn't even enforce? What's the point? We might as well all sign up to DP ads and enjoy the benefits, because Google won't do anything to you.
Seriously, why even ask? You know your guidelines are cryptic and that is how you want them to remain. Besides, even if we tell you something, you guys do not listen. Google actually encourages dirty SEO, especially if you use AdWords. I tell your folks about sites that employ black-hat SEO techniques and what do you do? Put them on the first page for competitive keywords… Speaking of AdWords, BAN all MFA sites already! I know you're desperate for revenue to maintain your lofty market cap, but please, it does not enhance the internet user's experience and is high-priced spam.
I'm wondering if there is a time limit between reinclusion requests. Obviously not something you want to do bi-monthly, but I can imagine webmasters getting antsy with no response or change after 3 months.
SearcHEngineSWeB, with all due respect to you, I disagree that any business, regardless of size or budget, should be given any special graces with doing blackhat SEO. Where do you draw the line? And if a company that is marginally bigger than yours does better than you, do you lower the bar again? The wrong answer is to allow blackhat SEO if your company is below a certain size. The right answer is for no-one to engage in blackhat SEO and for Google to actually penalise anyone that does fairly across the board, thereby levelling the playing field a bit more. This is not happening currently.
Doorway pages – again, completely unnecessary with a good site architecture and site strategy plan. You can use your existing site to cover misspellings if you plan accordingly.
It’s not Google’s responsibility to look out for how your business does online, that’s yours. It is Google’s responsibility, however, to ensure that rules that are created are enforced equally. Again, this isn’t happening. Whilst it’s not happening, it makes a mockery of these guidelines and the useful intentions behind them.
“A good rule of thumb is whether you’d feel comfortable explaining what you’ve done to a website that competes with you.”
This is funny. I'm sure Google and Yahoo sit down for lunch every week and discuss what you guys do to gain more market share against one another. Do you think I'm going to my competition to tell them how I'm going to beat them, whether it's advertising online or on the radio?
Hey Matt. I know I wasn’t an SEO before, but I just took a new SEO job..
Anyway, I'm curious about the "don't use automated programs like Web Position Gold" part. Out of respect for your TOS, I don't use it. The problem is, we have many clients who like it and want us to use it.
Aside from lots of manual labor involved in gathering the data on my own, is there a Google recommended way of providing the same features without abusing Google resources?
Is there a penalty for using such programs? I know it’d be pretty easy for Google to see what domains they keep querying.
The sitemaps console is a great resource, but I think it could be a little better as far as mimicking the deliverables of the automated programs.
Perhaps Google could expand on this feature and offer alternatives to the masses who are so dependent upon programs like web position gold. I’d like to see that part drilled into in depth a little more.
Graywolf, I interpreted that as getting a guarantee that they won't do anything blackhat or harm your domain. On a re-read, perhaps it does need a little clarification.
I would have to say I found this a little confusing:
“Use text instead of images as spiders to not understand the text in images”
followed by:
“Make sure that your TITLE and ALT tags are descriptive and accurate.”
ALT tags are for images (as we know), but it sounds a bit contradictory. Spiders DON'T understand the text in images?? Does that mean spiders don't read ALT tags?
Sorry – that should have read:
“Try to use text instead of images to display important names, content, or links. The Google crawler doesn’t recognize text contained in images.”
Thanks for asking for feedback. Regarding Paid Links, originally I got the impression that the penalties for paid links would be on SELLER of the link, as referenced here by Adam Lasnik:
http://www.marketingpilgrim.com/2007/03/googles-lasnik-wishes-nofollow-didnt-exist.html#comment-22729
But then the new guidelines came out with a section on “Why should I report paid links?” :
http://www.google.com/support/webmasters/bin/answer.py?answer=66736&topic=8524
In that piece it appears that the burden has been switched from the SELLER to the BUYER, based on statements like this, ” Buying links in order to improve a site’s ranking is in violation of Google’s webmaster guidelines and can negatively impact a site’s ranking in search results.”
I realize Google's stance on buying and selling links; that's not my confusion. What I don't understand is how Adam says that the seller will be penalized, and even gives a reason why (so other sites cannot hurt your ranking): "We're aware of and very strongly tuned against facilitating the "Googlebowling" you've expressed concerns about… and, in fact, I've never seen an example of that in the wild." And yet when the official documents come out, they say the exact opposite.
Thanks for your time, and I’ll hang up and listen.
Hey Matt,
I actually very much liked the document, Google policies affect so many businesses around the world it’s nice to see clear guidelines from the source.
Anyways, what I liked best are the relevant links here and there. Could you guys append (it's not in the quality section but in the technical one, but since you asked…) "Make sure your web server supports the If-Modified-Since HTTP header" with a link to an internal or external page on how to check for this? Or at least add something like "Apache from version X and IIS from version Y support this feature by default"? It would be cool if I could check it myself without involving our technical guy.
Regarding content duplication: we run different third-level domains for our different applications, and they all have some generic pages (like privacy, terms of use, about the company) available "locally" because we don't want users to leave the current website, and our download page is pretty much the same except for a different order of products on different domains.
Anyways, if you could maybe make a tool available telling us exactly how similar Google finds two particular pages and what the penalty for having them is, that would be great.
Keep up the good work; it's nice to see that even though you guys are #1, you still care.
Spiders don't understand a pictorial representation of text, but they can read the alt tags associated with images, since those live as text in the source code of the page.
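In other words, markup roughly like this (the filename and wording are purely illustrative) gives the crawler something to read even though the pixels themselves are opaque to it:

<img src="traffic-chart.gif" alt="Chart of site traffic for June 2007">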
Sneaky Javascript redirects –
This section is vague at best. I have a good long-term grasp of the issues involved, but I still had to read this section several times before I thought I understood what Google was trying to say. I think what you're trying to say is that using JavaScript to manipulate search rankings is bad…
Doorway pages – ???
IMO, This entire section needs to be rewritten.
Are “Doorway Pages” considered good or bad?
Is a "Doorway Page" the same as a "Landing Page"?
Is a site map going to make all my "Doorway Pages" cool with the Goo?
If I put a navigation bar on my "Doorway Pages" so users can navigate the site, are those pages still considered "Doorway Pages" by the Goo?
HI Matt.
I have the same question as Ryan has. We don't want to use programs like Web Position Gold, but if we want to check rankings for a lot of keywords on a larger site, what would be a better alternative? Personally I don't use that software to submit sites or anything. Is it a violation even to use it just to check positions? The guidelines say don't use the software; it would be great if there were a little more explanation of why you shouldn't use it and what will happen if someone does.
Thank you for asking the question. We are so glad that people like you are trying to solve our problems.
Suresh
“Submit your site to relevant directories such as the Open Directory Project and Yahoo!, as well as to other industry-specific expert sites.”
Who judges which directory is relevant and which is not? Google? Google says: make sure other sites link to you. How? The only reasonable choice is buying links. As long as our competitors do that, we do not have another option. At the moment we are living in a world where we can't make a success without breaking Google's guidelines.
3 things:
1 – a working example of how to correctly include the URL of your XML sitemap in your robots.txt file (a rough sketch of my understanding follows after this list)
2 – suggesting how to test a website for WAI compliance and why this is a good practice to get into. Perhaps a link to cynthiasays.com for their online validation tool?
3 – All of the above suggestions that involve W3C Standards being pushed more onto new webmasters.
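On point 1, my understanding is that it's a single line added to robots.txt, something like this (the URL is a placeholder):

Sitemap: http://www.example.com/sitemap.xml

But an official, working example in the guidelines would remove any doubt.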
Change nothing. You’ll only regret letting the SEO community dictate to you how you should word your guidelines.
Just as the SEO community has come to regret allowing Google to dictate what they can and cannot do on their own pages.
In other words, you and the Naboo form a symbiotic circle. Surely you realize that whatever happens to them will also affect you.
Thank you for asking Matt.
It would be helpful to webmasters if Google would define “link schemes” and explain the subtle difference between full duplex high volume link programs and editor based low volume link exchange for the end user.
These days, many webmasters and SEOs think any type of link exchange is bad due to rampant misinterpretation of comments from engineers at the major search engines. Google has directly contributed to this issue by providing PageRank, which gives overzealous webmasters the drive to manipulate search results by participating in full-duplex link exchange programs and services. But what about the good guys who are making link exchange decisions for the end user?
http://www.google.com/support/webmasters/bin/answer.py?answer=66736&topic=8524 states:
"Google works hard to ensure that it fully discounts links intended to manipulate search engine results, such as link exchanges and purchased links."
This guideline appears to be targeted towards explaining paid links – so why the subtle reference to link exchange at the very bottom of this guideline? That is sloppy in my opinion and should be further explained or omitted from this guideline.
http://www.google.com/support/webmasters/bin/answer.py?answer=66736&topic=8524 is the only webmaster guideline that specifically mentions "link exchange", and doing so outside the context of the original guideline title/question is misleading and confusing.
Google also mentions the term "link scheme" in numerous guidelines without defining it anywhere. How about a glossary of webmaster terms? I know other webmaster forums publish glossaries, but that is useless IMO; Google should publish a webmaster glossary and link to it from vague terms such as "link schemes" and "bad neighborhoods".
I will carefully presume that "link scheme" means full-duplex, fully automated linking, whereby a webmaster pays his 50 bucks and then gets auto-linked to hundreds of sites overnight. But what about editor-based link exchange for the end user, conducted in natural/slow volume??
If my site sells motorcycle parts and a site that sells cycle graphics wants to exchange links with me, why can’t I link exchange with this site from my links pages without fear of being penalized in Google??
When Google implies that I may be penalized for link exchange when it benefits my end user, Google is negatively affecting my ability to do business with like minded sites.
Link exchange between relevant like minded sites has existed since the WWW was invented. Webmasters will hopefully continue to link their sites with quality relevant sites when it benefits the end user, in slow natural volume – while avoiding services and software that make linking guarantees overnight.
When Google makes subtle comments regarding “link exchange” thrown into a guideline regarding paid links, it confuses webmasters and adds to the paranoia related to relevant link exchange for the end user.
Once and for all, Google should simply define the difference between full-duplex link exchange, where links are not obtained using editorial discretion, and relevant editor-based link exchange for the end user.
A Google webmaster glossary would be a great improvement to the webmaster guidelines.
Regarding WPG: it is objectionable because it exploits search engines' resources and does not adhere to robots.txt. So WPG is the SEM equivalent of an email harvester. Think about it: WPG makes millions of dollars and doesn't spend a cent on support or bandwidth. The cost to the search engines of serving and maintaining support for WPG is likely in the millions.
The search engines don't want you to do this, but until recently they did little to provide a substitute. Now I believe you can get that info from one of the Google webmaster or analytics tools. Digital Point also has a rank checker product, but it requires a SOAP API key, which Google no longer supports, or at least isn't issuing new ones for.
Hey all, I just wanted to let you know that the person on the webmaster central team who pushed the updated guidelines has already dropped me an email to say that they're reading this thread. I'll probably get together with them next week to see if there are ways we can take some of these suggestions, look for rough edges in the additional information, and polish it more.
I totally agree with Ryan on the "Don't send automated queries to Google" point. How else is anyone supposed to track and monitor their search rankings? We can all manually type in each query and then browse the results to see our rankings, but that's no different in terms of server processing than automated queries (e.g., Web Position). If Google doesn't want these types of programs used, then there needs to be an alternative solution.
On the topic of hidden text: what about text behind Flash? For accessibility, if a person doesn't have Flash installed we want them to still see the text contained in the Flash movie, so the text is placed behind the Flash. Would this be considered deceptive by Google? If so, what is the suggested best practice?
BTW, why don’t you ask for suggestions before the change is made next time?
Hey Matt
There still seems to be a lot of confusion in SEO circles regarding whether incoming links can negatively affect SERP rankings, whether it is the webmaster’s fault or not.
I have had a web page drop in ranks and the only thing I see different is that a link spammer has linked to it from an obviously deceptive ( hidden and covered up text ) webpage.
I never ask for links, ever. I go with the precept that quality content will merit links naturally ( as you sorta’ said in an earlier article ) but I can’t control who decides to link to what. I have asked the webmaster to remove his link, to no avail.
Here's the question: can this webpage of mine be penalized due to someone else's SEO strategy of offering an outbound link to my webpage from his spam webpage?
The example page and the one debate about this was held in a forum here : http://www.daniweb.com/forums/thread78073.html
Thanks
I’d like to see this sentence explained.
“In particular, avoid links to web spammers or “bad neighborhoods””
What’s a bad neighborhood?
Who are web spammers?
Can you tell us which sites we are allowed to link to?
Can you cover ‘buying links’?
Hey Matt,
It seems pretty clear to me but, some folks only want to hear what they want to hear!
Could you clarify how Google feels about DIV tags hidden via external CSS and used to provide "alternative" content for Flash?
There is a debate over at SEW:
http://forums.searchenginewatch.com/showpost.php?p=110920&postcount=18
Thanks for your time!
LOL Matt; Too funny.
I'm with Michael Martinez on this issue. You are well aware that spammers, those who don't follow your guidelines, and those who "interpret" your guidelines in whatever way happens to fit their biz model would be truly happy campers if Google did more and more "clarifying" of the guidelines. Some of us actually understand them totally and simply use common sense. "Many" others would love much more detail so they know how close they can come to that line before it counts as spam. I mean, look at the posts in here already from people wanting MORE detail. LOL. One even claims "what's a small site owner to do" if they cannot spam… or something like that. Sheesh… my answer? Get a REAL SEO firm to help you.
It’s really not in Google’s best interest to give “more detail” at all. Most whitehats understand the guidelines perfectly.
In a nutshell; if you are doing something for “both” real visitors to your site and a spider, it’s just not spam. That should take care of the “hidden links” issue as well. If you define what a hidden link is and when it’s spam, you would also have to define “everything” that is not seen by a browser window. That would be an impossible task.
Just an IMO post, but an experienced and educated one. 🙂
Lotso wrote:
“Doorway pages – ???
IMO, This entire section needs to be rewritten.
Are “Doorway Pages” considered good or bad?
Is a "Doorway Page" the same as a "Landing Page"?
Is a site map going to make all my "Doorway Pages" cool with the Goo?
If I put a navigation bar on my "Doorway Pages" so users can navigate the site, are those pages still considered "Doorway Pages" by the Goo?"
This is what I'm talking about. My, oh my. Gee, let's see: doorway pages are good things if they are reached by real visitors and are part of the actual site. Doorway pages are bad things if they are created strictly for spiders. Doorway, landing, entry, subway, or maylay pages… what the heck difference does it make what you call them? If you create them for spiders only, it's darn spam. Easy. I thought you knew all of this already? The rest of the post is more of the same silly stuff. Use your common sense.
S.E.W. … your post is just a "rolling on the floor with laughter" type of thing. How about all those "small businesses" you're concerned about being given to "other" firms who know what they are doing, so that small biz can do better in Google?
Matt expected the types of posts in this thread though. Google expected them as well.
Looks good. I’d like to see even more information on paid links, in a post a while back you said that reviewed paid links in directories are okay, and I’d just like to see more on that as well as more safe ways to buy and sell advertising. Thanks!
Geee Dougie when did you go to work for Google?
Doug, as Google's new spokesperson, I am glad you have such a grasp of what Google is trying to say to webmasters and can explain it so well to all the uneducated masses that frequent Matt's blog.
However Doug, I was trying to point out that those passages, as written, could be considered rather vague by people who don't have your genius and overwhelming understanding of what Google wants or does not want.
You will need to properly redo the French webmaster help center ("Webmasters Guidelines" © Google France). Someone needs to tell them!
Wherever you mention “alt tags” can you change that to read “alt attributes” instead please.
Your site statistics. If you’re not seeing referral traffic for the keywords and phrases that you think you should be seeing, you haven’t done enough for those keywords and phrases to be a playa.
Why would a search engine company introduce empathy into what is an objective process, thereby making it subjective? This is just silly.
A guarantee that a site won’t get banned would be a good start. A guarantee that any SEO tactics would not inhibit the user experience in any way whatsoever would be another one, albeit a much trickier one.
There are ways.
Personally, my only issue with the guidelines is clarification. I think clarifying specific issues further with examples would be a benefit, even with the possibility of manipulation as a result of transparency, as Michael Martinez pointed out. There are too many people out there twisting the guideline wording around to suit whatever purpose they want, and too many unknowing webmasters following the spin doctors, to worry about whether a relative few people will find the loopholes.
That, and I wouldn’t mind if they were a lot tougher. Of course, I wouldn’t mind if you guys went all the way to the levels you do in the Accessibility search. Strip it down to the 10% of sites that are well-coded and built for users, and you’d actually be helping webmasters in the long run by forcing them to code to higher standards if they want to get in. Seriously, that thing rocks. You gotta start plugging it more.
Oh…and if you could answer Dave’s question, that’d be great. He won’t stop talking about it. 😉
If you need an example, there’s one at http://www.wikipedia.com/Main_Page (the “printfooter” div/code).
I’m tending to agree with those commenting on the Doorway pages section.
It seems that in Google’s attempt to make the internet a better place, they’ve dictated what is and isn’t acceptable in terms of website design (like plain html pages versus pages using flash and javascript).
As of now, the doorway page section says to me that “Yeah, you can have a really cool looking/working site that benefits the internet community but no one will be able to find you because we won’t allow you to create content for search engines to access.” I know several awesome looking and very useful sites that are that way because of the flash and javascript used. Sadly these pages have an extremely hard time ranking because of how Google indexes pages. The only chance they have is to create a page that simply gives Google an idea of what the site’s about, which may or may not be seen as “illegal” in Google’s eyes.
I do strongly disagree with doorway pages used to spam the search engines so perhaps the section could clarify how far is too far and whether or not websites can use anything to inform Google of what the site’s about without spamming.
Doorway pages
Forget doorway pages, how about doorway domains? Domains owned by one person, with similar content, used to flood the top positions in search for "keywords" to sell a product. Example: say I sell slot machines and have 10 sites ranking in the top 10 positions in organic search. (Don't laugh, someone just sent me an email complaining about this.)
Well, how about it? Care to do something about it then update your guidelines to cover greed? If not please tell me because in the area where I sell my product I will be forced to do the same to compete.
Pagerank has many flaws.
Thanks,
Aaron
LOL – Let me write them for you as I am apparently pretty good at writing stuff without actually saying anything!
The bottom line is that the more you add to the guidelines, the more complicated they will be to interpret. Simplicity should be the focus. To that end, a list of the practices that can get a website penalized or banned should be all that is included. Otherwise you might as well hire a team of lawyers to run circles with semantics and make it a 20-page document instead of guidelines.
A good method might be to simply state the ultimate goal of what Google is trying to accomplish (providing consumers the most relevant results for their search query), followed by a list of unacceptable practices that inhibit or mislead that process. A separate section could include tips on how to allow a website to be spidered more easily. Nothing more should need to be explained beyond that.
Nor do I think it should be necessary to explain how websites should be created to benefit consumers other than to say that websites should put the consumer before the spider.
I think this is a case of “keep it simple sT%$^&@” and take the questions and opinion out of it. Here are some examples:
Don't buy links to increase your SERPs
Don't trade links to increase your SERPs
Following any of the above practices could subject your website to a penalty or a ban from the index.
Of course there would be more than 2, but that’s the idea.
The tips section could explain that the spider can’t read images and use alt descriptions to help guide someone just learning or working on their own, but should not be considered part of the “guidelines”. Instead, it would be a FAQ or something.
Matt
Regarding:
“Don’t send automated queries to Google.”
Do you agree that these automated queries retrieve valuable information from an SEO perspective? How do webmasters access that information without these automated queries? Manual queries aren't as productive. Look at Googlebot: it's not a manual process, and it also consumes computing resources.
“Another useful test is to ask: Does this help my users? Would I do this if search engines didn’t exist?”
But the fact is: search engines do exist. And it's just normal that webmasters try their best to optimize their websites for Google. They just love Google! And they want more Google!
Look at the title of this page: Matt Cutts: Gadgets, Google, and SEO » Comments on our webmaster guidelines?
Would you write this title if search engines didn't exist? I don't think you would. (How many visitors read the title?!…) But this title is not written only for Google. It's perfect for visitors. It summarizes perfectly the content of the page. It's not stuffed with keywords. I think SEO is also good copywriting. And that's a win/win situation. You give Google a better idea about the content of your website and you write better content, which will improve your visitors' experience. How many webmasters are writing better titles because of Google? Even better content?!…
And look at the URL: you have all these keywords: comments-on-our-webmaster-guidelines. Is that just for Google? But isn't that positive? Aren't you helping Google to better index your website? Or is it a trick?
Don't get me wrong. I love Googlebot. 😀 I owe a lot to Google and AdWords.
Why is it that you can give us these guidelines and yet, for some reason, the corporate geniuses above me refuse to change anything that is specifically stated?! I really don't understand it sometimes, but that happens so much everywhere it just blows my mind… It's not that difficult to rank, especially with Google being nice enough to give Quality Guidelines. I mean, you guys go as far as to break it down nice and simple. If you are doing things that are stated in the specific section and you are wondering why you aren't ranking for terms that you probably should… you might be a redneck.
I'm sorry Matt for venting here, but I just can't yell at the higher-ups enough, and seriously, for the past month I have been highlighting a few items from the specific section that we are doing and passing them around… Nothing has changed!! I can't understand how this company can own half of America offline and not translate that into online domination.
Thank you for letting me release some anger, and thank you for making those Quality Guidelines, not just because without them people honestly would think it was some crazy Google logic that decides who gets ranked, but because they let the average guy compete with the soulless corporations. Google is far from soulless, because I know when you guys do run the world and we are all plugged into the GoogleBrain, it will be nothing like the Matrix. Much less humans harvested as energy sources, more neural-interactive massages…
TomFrog wrote:
“Do you agree that these automated queries retrieve valuable information from a seo perspective? How do webmasters access that information, without these automated queries? Manual queries aren’t so productive?! Look at google bot. It’s not a manual process. It also consumes computing resources.”
Tom; please read multi-worded Adam’s post again, then again and again. Many of us do not use or need something to “auto” ANYthing at all. Why? If your client insists on some kind of “rank” report, your client can go to google.com and do a search on his phrases, right? Why would you have to do that for him/her anyway? It’s easy to do manually for the client. I guess if you are actually charging this client to do this kind of report, then you and your client have much bigger issues. While your client or you are doing searches, if you are not in the first 3 pages on a term, you know you have to work harder on that term. No need for some automated report, right?
I guess I'm stumped about these auto-check reports of ANY kind at all. They are pretty useless and do not "help" your client in any way. You, as the "SEO", don't need them either. Read your log files and stats. They tell you "much" more than a silly rank on a term. They really do.
Tom wrote this:
"Look at the title of this page: Matt Cutts: Gadgets, Google, and SEO » Comments on our webmaster guidelines?
Would you write this title if search engines didn’t exist? I don’t think you would.”
Why not? It sure does help me when skimming titles in here. Title tags are done for BOTH spiders and real users. Not spam. Meta descriptions are done for BOTH spiders and real users. Not spam.
If you do something strictly for the spiders…. spam. This isn’t hard.
Matt, not trying to be picky but here is something from your Design and Content guidelines:
“Make sure that your TITLE and ALT tags are descriptive and accurate.”
Since there is no such thing as an ALT tag, it would probably make more sense to rephrase it to:
“Make sure that your TITLE tags and ALT attributes are descriptive and accurate.”
Doug
We don't live in a perfect world. And webmasters are writing better titles because of Google. Webmasters are using h1 because of Google. That doesn't mean they shouldn't look at web design standards, accessibility guidelines, and quality copywriting. But Google and better search results are great incentives.
Spam = only for the spiders? I don't agree. What about the URL? Is that for the user? No way! Is it spam? No. Matt's using a search-friendly URL. It's not spam.
Stats? You don't read raw stats. I look at raw stats every day on the command line. You read a stats report that's produced every day by an automated process that chews through all that raw information and serves you yummy, easy-to-read stats.
I'm not referring to those automated SEO magic scripts. I'm talking about organization and productivity and information gathering for analysis. I was referring to automated queries; I didn't mention Web Position Gold.
The same applies to submission programs. It's not about the automation; I don't like that. It's about organization. Have you ever used CRM software? It's not going to sell your services and products for you. And you decide whether you're going to spam everyone with the same sales letter, or write a personalized email for every sales lead. It's just an instrument. Some of it is good software; some isn't.
Technically, Matt didn’t write the title here, Tom. The WordPress software combined his blog name with the title of the post (which, as you stated, is perfect for visitors). So he probably would have written the same thing if search engines didn’t exist, simply because the software is what really does the work for him.
Not as such. It’s a WordPress setting that autogenerates the permalink based on the title. Whatever Matt puts into his title will be the default permalink. This can be overridden if Matt wants, but he can’t be bothered for the most part, and there’s really no good reason for him to since the titles he creates are pretty good from a permalink standpoint as well.
In other words, most of what Matt appears to be doing for SEO is autogenerated by WordPress, and Matt’s just going with whatever the defaults are. (Sorry Matt…I told them all your dirty little secret. 😉 )
Hi Matt
I'm of the same opinion as others that the section on software tools such as Web Position Gold is a little too vague; it's not even a section but a short statement with no explanation.
Whilst the use of software such as this, and the many others available, may go against your recommendations, would one actually be penalised, and why?
Virtually anyone with a large number of keywords and rankings to check uses some kind of software tool or self-coded script. Small webmasters also find software like this extremely useful when first starting their online businesses. SEO is a complex industry now, and much as we first invented computers (and search engines!!!) to ease manual workloads, we should also be expected to use tools to lessen the burden of checking and submitting information manually.
Maybe I would not even have had to write the above paragraph if it was explained WHY we should not use these programs. Remember that it's not just the big boys who read Google's guidelines; in fact it's the millions of small webmasters who will be looking over them, and Google will have put "the fear" into them by making such short, unexplained statements.
I wonder why wordpress does that? 😉
Matthew wrote:
“Virtually anyone with a large amount of keywords and rankings to check use some kind of software tool or self coded script”
I'm begging someone to please explain to me WHY you feel the need to check ranks using automated software. NO ONE has told me the exact reasons you are selling your clients this fake stuff, something they can check themselves. So what if you have many, many terms you are targeting? What makes the diff? Can't you check your stats or logs to see what terms you are getting referrals from? If you are not getting referrals on a particular term, that means one of two things.
Either the term is not being searched on so it’s useless to target it. OR:
You are not on the first three pages of search results for the term, as you can clearly see by your stats if you are not getting any referrals from it.
I just don’t get it folks. Are you actually charging clients for this type of “rank check” service? Wouldn’t your time be better spent doing something else? For another thing; don’t you find that if “you” do a search from your puter on a term, and your client does a search from their puter on the same term, that you get different results on that term anyhoo?
Please tell me the exact reasons you need to check silly ranks. I stopped doing these types of reports, which mean "zero" to and for clients, about 4 years ago. ALL other types of software that scour search engines, like "link monger" type software for backlinks, etc., are totally worthless and no benefit to your client whatsoever. Do stuff that actually helps a client's website instead, i.e., helps them sell more product or whatever the goal of the particular site is. Knowing a "rank" does not do that at all.
Ya got something there Igor. Auto software crap is BIG time business in this industry. It’s not in their best interests for the majority of site owners to actually get knowledgeable at all. It appears it’s not in the best interests of certain types of SEO’s for owners to get knowledgeable either. All I can figure is that some out there actually “charge” for the privilege of a rank report. 🙂
Doug Heil Said,
June 29, 2007 @ 3:16 pm
“If you do something strictly for the spiders…. spam. This isn’t hard.”
So Site Maps are now SPAM?
Doug Heil Said,
June 29, 2007 @ 3:16 pm
“If you do something strictly for the spiders…. spam. This isn’t hard.”
So having a robots.txt is now SPAM?
I agreed with your entire post Igor until the very last sentence. I feel there "are" good SEOs only. Everyone else is not an SEO at all, but something else entirely. They are the ones who hijack the industry. Good SEOs actually care about their clients' sales and goals. It's the "bad" people who call themselves SEOs who do not teach clients that a rank is just a rank and nothing more. A rank does not get you sales, nor does a rank help your website convert visitors into doing whatever it is the owner wants them to do. A rank is just a rank and nothing more. It's too bad these so-called "other" types are charging clients for this instead of charging clients for things that actually matter. I think it's a "see what I did for you" thang for this type of SEO. It's this type of SEO not teaching clients what's important. It's this type of SEO who takes the money even if the client's website is butt-ugly and could not sell if it ranked number one on the most-searched-on term. These types of "so-called" SEOs hurt the industry in a big way. Good SEOs/developers would "never" take money from a bad site unless they actually redesigned/restructured it, or in other words… started from scratch.
So does this mean all the link directories that are using many of these tactics will be punished?
Dude, I’m right here. Lasnik should now be referred to as “Secondary Adam”, or “Adam #2”. I got here first. 😀
Actually I have two questions:
First – I have seen a program that is an RSS aggregator which generates a page with a news headline and one or two lines from the news article; the page is automatically added to the site and links to the original news article. It is used solely to generate pages for the site and to provide a link back to the index page. One site that uses this is ranked in the top ten on Google for the search "Website Design" and has well over 6,000 pages indexed, with the majority being "news" pages generated by this aggregator. Is this type of program OK to use?
Second – If one has done something to cause a problem with being indexed on Google and the problem is corrected, is there a way to get Googlebot to come back earlier than normal to recrawl the page? Sometimes it takes 2 or 3 weeks for Google to return and crawl a page. Can we "invite" Google back earlier? The site hasn't been banned, nor is it in supplemental… but it has dropped significantly due to a bad judgement call on a change to the index page.
Thanks for the info and continued communications……… Eric in KY
Eric (the 2nd part of your Q – it's a bit off topic) –
I’ve heard it suggested that submitting a sitemap may ‘jolt’ the bot into an earlier crawl – meh – not too sure about that – but something that DOES seem to work is getting some links (pref to a deep page) on the front page of a well crawled site – gbot will find your site from there – so if you’ve got any friends with decent PR websites or blogs, ask them for some help.
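If you want to try the sitemap route, here's a minimal sketch of pinging Google with a sitemap URL (a Python illustration, assuming Google's documented sitemap ping endpoint at http://www.google.com/ping is available; the sitemap URL itself is hypothetical):

from urllib.parse import quote
from urllib.request import urlopen

# Hypothetical sitemap location - swap in your own.
sitemap_url = "http://www.example.com/sitemap.xml"

# The ping tells Google the sitemap (and the URLs in it) changed; a 200
# response only means the ping was received, not that a recrawl is promised.
with urlopen("http://www.google.com/ping?sitemap=" + quote(sitemap_url, safe="")) as resp:
    print(resp.status)

No guarantees it speeds anything up, but it costs nothing to try alongside the deep links mentioned above.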
doc
New guideline:
Do not trade/buy links or advertise with anyone except Google or we will ban and penalize your site.
What is so wrong with buying links or selling links on your site? Why is it that Google can make their money from selling advertising, but they will penalize websites that sell ads or buy ads? Yes, a text link is an ad. Can you say hypocrisy?
MWA
“Dude, I’m right here. Lasnik should now be referred to as “Secondary Adam”, or “Adam #2″. I got here first.”
Or more correctly:
Adam Lasnik = Adam Jr.
Multi-Worded Adam = Adam Senior 🙂
That actually works, Harith, since Senior is how my last name is pronounced (although it’s not spelled the same.)
Adam Senior 🙂
You may wish to add to your robots.txt the following:
User-agent: *
Disallow: /*/feed/$
Disallow: /*/feed/rss/$
Disallow: /*/trackback/$
Disallow: /comments/feed/
Disallow: /*/2007/$
Disallow: /feed/
Disallow: /index.php
And between <head> and </head>:
<META NAME="GOOGLEBOT" CONTENT="NOODP">
<META NAME="ROBOTS" CONTENT="INDEX, FOLLOW">
Just in case 🙂
Regarding:
“Well frog! This might be something like Matt refers to as a gray area! In actuality Google states in GQG that a Website should not sell links because it tries to manipulate Google search results!”
It’s TomFrog.
Matt did differentiate between directories and link farms. True directories are great tools for google and will help improve search results, part of the human touch Matt was referring to on a previous post. Directories do not sell links.
Selling links is a very complicated issue. Sometimes you visit a blog and you know that it exists only for "review me" type posts. And a few SEOs sell links from just a handful of blogs, posting the same content in each one; at, say, 10 links per blog x 10 blogs, that's 100 backlinks. But you also find quality blogs that get extra revenue from selling links and that don't accept low-quality websites. Google created an asset: backlinks. And there's a market for that asset.
Google needs good content. If there's a decrease in quality content, google's revenues will suffer. And some of those quality blogs and other quality websites gain some extra income from selling links. It's not advertising. It's link love. And that extra income allows the owner to spend more time researching and writing.
I don't think google should treat those "review me" blogs with several twin blogs the same as quality blogs from hard-working webmasters that offer google something it needs: content.
I think marketing and webdesign are more important than just SEO. A website that complies with webdesign and accessibility standards and that has a great marketing campaign can do well in the search results. Having someone who knows SEO will help, but just for tweaking. It's not about SEO. It's all marketing.
Phew! Matt, you really opened up the hornet’s nest on this one. I’m exhausted just reading through the comments and from thoughts that are going through my mind with all of this.
If we follow the guidelines and then end up getting removed from the index anyway, submit a reinclusion request to which there is no response, how the f#$%! are we supposed to know where we went wrong?
“No one can guarantee anything, but make sure you get a guarantee?” Brilliant! So the client should get a full refund for being penalized when I have no idea what caused the penalty in the first place. Some of my clients like to take an active role in promoting their site – how do I know it’s not something the client did and didn’t ask me about first that got him penalized? How do I know that all this extra effort it’s taking to try to get him re-indexed that I’m not charging him for was my fault in the first place?
To those who say rankings are meaningless, how about the small-guy SEO who is trying to land new clients who want to know what rankings the SEO has achieved for other sites and for which terms? Then, when you explain "rankings don't mean squat", the prospect thinks you're a scam artist who doesn't know how to get decent rankings. These clients want to know their rankings, and most of them don't want to sit there doing manual queries and then trying to find their site in the listings, nor do they want to pay me to do that for them. Using an automated rankings checker allows me to provide the information they want without charge. I agree that charging for something that runs in the background and requires virtually no input and no labor is not honest, but what about those of us who aren't charging for these reports and are just trying to spend time on more productive pursuits? I agree that 90% of the automated software that's out there is pure garbage intended to spam the SEs, but there are some tools that actually do help improve our productivity, that are not used for spamming purposes, and that in many ways help keep our costs down. What about those software vendors, aren't they allowed to sell their products – even those that are useful tools and not spambots?
And there are both TITLE tags (in the HEAD) AND title attributes on links in the BODY of a page. This shouldn't be too hard to clarify, just avoid interchanging "tags" with "attributes" since they aren't the same. And of course, spiders can read alt attributes for an image but they can't read text that has been rendered (burned) into an image. I can't imagine that would be too hard to figure out, yet many "webmasters" and web designers out there still insist on using image-based and Flash-based navigation elements.
“There are no good SEOs, only good webmasters”. Some of us act as both SEO and webmaster for our clients. I guess we should feel conflicted by that.
Penalized for link exchange and paid links? Give me a break. Sure there are abuses and they shouldn’t be too tough to spot for links between sites that have no correlation and for massive automated link generators, but what about trying to build traffic from relevant sites, what about online advertising to drive more traffic, what about hosting paid links in order to monetize a site and earn a living, and what about AdWords – aren’t they paid links? Let’s say I forget about rankings and do all these things to get traffic. Will Google differentiate? How?
Maybe get more specific on what Google won't tolerate, and eliminate the vagueness and confusion. Then, if none of those things occurred and our site still drops out of the index altogether, how about letting us know why? I understand why an algorithm needs to be confidential, but if things were more specific on the penalty end, fewer people would do them, fewer prospects would hire SEOs who engage in those techniques, and more prospects who want to help with their own site would know clearly what they can't do.
How is anyone supposed to make a living in this environment?
Thanks, Matt. It took some guts to throw this out there, but at least you did (he said as he left the room and started banging his head against the wall).
I’m glad I don’t drink beer and I don’t have any footer links. I don’t care if link sales is a white, black or grey area. I’m not a SEO.
But, I’m not a YES man.
Each time you search for a keyword and find a website that has adsense all over the place and you have to scroll down to find some useless content, then you have to ask google for accountability. It also affects the search results! And google has an important role to play in this area. What drives those websites is adsense.
Google wants Microsoft products that include search features to allow the user to choose from different search providers, including google. So I also feel that I have the right to ask google to allow small website owners to choose from different revenue sources, including link selling. If google recognizes that a certain website has a certain trust, reflected in its pagerank, on what grounds can they allege that a website can't accept payment to refer a website (meaning link love) to google? It was google that empowered that website with its pagerank. Is it fair to say that these websites are causing problems for the search results? Well, eliminate spam sites from the search results. Then let's see. I might change my mind then.
I think Google is worried about link buying because it is 97% cheaper to get visitors that way than with adwords, and clickfraud is not a problem.
Harith, did that, it did nothing, this works better.
Igor, it’s a WordPress blog like Matt’s, only I hacked it a bit more than he did (so that’s why I have a bit better understanding of the inner workings, although I still don’t know anywhere near as much about it as I would like to). I don’t know where you get 100k from, but I do plan on tweaking the layout somewhat next weekend. And it was done in Photoshop, and that’s as low as it gets.
Then educate your prospect, rather than leave the prospect misinformed and chasing rainbows. You know. They don’t. Tell them. That’s no excuse, that’s just laziness.
If you’re good enough, you’ll know. Just promoting a site by itself shouldn’t get it banned. It’s usually a stupid pet trick that causes that.
If you’re this upset and angry over something, dude, you’re in the wrong line of work.
Igor, there’s one, not two, and I did the Save for Web thing (as I did with all the images on my site). That’s the best result.
Randy Wrote:
“To those who say rankings are meaningless, how about the small guy SEO who is trying to land new clients who want to know what rankings the SEO has achieved for other sites and for which terms? Then, when you explain “rankings don’t mean squat”, the prospect thinks you’re a scam artist who doesn’t know how to get decent rankings. These clients want to know their rankings and most of them don’t want to sit there doing manual queries and then trying to find their site in the listings, nor do they want to pay me to do that for them. Using an automated rankings checker allows me to provide the information they want without charge.”
It's called "teaching" and "education". MOST all clients or potential clients "think" they are hiring you to "get them ranks". Think about that for a moment. Do you really believe a client hired you to "get ranks"? Really?
Personally, I actually believe a site owner (or whoever) hires an SEO because they need "help" with their website. That's a broad and general statement, but it's a fact. That means they need help with their website. Oh, I already said that… they really do, though. Needing help is "not" some sort of rank at all. That client of yours needs to "make more sales". Period. He just thinks he needs a rank, but what he fails to understand, and so do MOST SEOs, is that clients really want MORE SALES. Getting that takes much, much more than a simple rank in google. A rank gets you "visitors". That is it. What's next?
Educate your client about what is important. I see truly bad sites in the serps consistently. Do these bad sites actually make a sale? Almost never. But I do know the so-called SEO who helped that bad site get a "rank" sure did make a sale, right? The poor site has good ranks, but cannot make a sale. What's wrong with that picture? Clients are "not" the experts in this industry. The SEO who is taking that client's money IS the expert. The client does not know that what he really wants is help with his website, and not just a simple rank.
Read that a couple of times please.
Igor, thanks for the help, but no thanks. I’m not afraid to ask for help on something when I want it, but this isn’t one of those times (in fact, most times I don’t).
Like I said, I’m changing the layout slightly, and as a result certain elements (including that pic) will be changed as well. So don’t waste any more of your time on my account.
“Cloaking refers to the practice of presenting different content or URLs to users and search engines. Serving up different results based on user agent may cause your site to be perceived as deceptive and removed from the Google index.”
I think it’s time to go beyond “may cause” and define when Google is OK with cloaking. I searched this website and you have not explained this on more than a case by case basis focusing on reg-req’d pages.
I’d like to echo what was said 3 years ago on SERoundtable..
“…we need the search engines to come up with a clear acceptable cloaking policy.”
…and this year on SearchEngineLand…
“I urged Google to alter its cloaking policy to something stressing that cloaking was bad only if not approved” … “Did what you do help or harm the search results, in Google’s opinion? If you were harming search results, Google’s always reserved the right to boot you out. And if you were technically violating guidelines but not in a harmful way, Google’s always reserved the right to turn a blind eye. Or rather, an approving eye, an eye knowing that it’s intent that matters, not some technicality.”
Are UK Web sites (www.somenameorother.co.uk) that show up less in Google search results at Google.co.uk when using the “pages from the UK” option than they do when using “the web” option being penalized for something they’ve done, or is it a fault in Google’s systems?
Google should scan all the web sites with AdSense on them. I bet you will find out that 90% of these sites are nothing but search engine spam(mers) using every black hat SEO technique under the sun.
Hi Matt, I am glad to have found your blog.
I wish to ask: if a webmaster runs a marketing campaign asking his readers to link to him in return for a prize or money, how does that relate to abusing the Google quality guidelines? Because in the Internet world, links to his website are all a webmaster can really ask for in his marketing campaign. And such a marketing campaign shouldn't be limited to advertising links that the webmaster pays for; a webmaster might actually pay his readers for linking to him too.
Can you let me know more about this? Thank you.. 🙂
Matt
When does GOOG intend to update the webmaster guidelines in Danish?
Hi Matt:
Much more clarification is needed in these guidelines. Since it’s such an important subject – I’m sort of shocked that it’s so short. Reading through the posts – it seems the guidelines create more questions than clear “guidelines”. Don’t you agree that webmaster “guidelines” should clearly “guide” webmasters? The guys creating them should ask themselves the question “Do these guidelines ‘guide’ or create more questions and confusion?”
Thanks for the opportunity to give input.
I’m sorry but this is not a forum for discussions, it’s a blog asking for user comments and not more “unofficial” answers or advice!
If your name is not Matt Cutts and you don’t work for Google, please refrain from attempting to answer questions or giving advice on topics that you are not qualified to address.
Reading your unofficial rhetoric in response to the legitimate comments posted herein is a waste of my time and I'm sure confusing to others.
Thanks,
Brian
Not IMO. IF you buy a link, pay ONLY for traffic. If you pay for PR or any sort of SERP boost, you are wasting money and giving your money to link mongers.
Brian wrote:
"Reading your unofficial rhetoric in response to the legitimate comments posted herein is a waste of my time and I'm sure confusing to others."
Well Brian, then you don’t have to read anything you don’t want to read. If things are “confusing” to you, maybe you need to learn more about things? Our opinions in here are just that…. our opinions. You just stated your opinion, right? So what’s the prob? Matt Cutts states his “opinion” in here as well, and is “not” an official of Google while doing so. I’m sorry if you did not know that fact, so now you know.
What do you think would happen if only questions were asked in here without any answers? Matt’s blog would be pretty darn dead, right?
When members ask questions some of us can answer, why the heck not answer them? What you are supposed to do is figure out for yourself who to believe and who not to believe. If all you wish to do is wait for Matt to spoon-feed you, then you have a long time to wait.
If you sell links, base the price on the amount of relevant traffic you can provide and use nofollow. Using nofollow is in YOUR best interest so you don't end up being associated with a 'bad neighborhood'.
Igor
Does WGHF stand for:
Winter Garden Heritage Foundation?
igor
Maybe you, keniki and SearcHEngineSWeB join forces and start your own forum/blogs/Insane Asylum/webmaster Psychiatric hospital etc..
Sorry couldn’t resist 🙂
Dear Matt Cutts,
From what I'm reading here on duplicate content, I wonder if a practice I am doing (and many of my peers) is counterproductive for us. Here are my examples.
Since last summer, Ebay support has been actively suggesting that we store sellers create guides, wiki and blog entries on their site for search exposure. They also encourage us to duplicate that content on our store pages (on the ebay site) and also go to blogger and copy/paste it there to increase traffic even more. They also now have these ebay-to-go widgets and ughh affiliate site packages we can implement to help us with our sales.
Can you clarify the impact of this duplicate content in regards to my little personal site vs the content I place on the monster ebay site? Have I somehow put a noose around my site's neck Matt?
More and more of us are growing up and moving from ebay into the real internet. If we don’t have a basic understanding of how to keep our “neighborhood” on the internet clean, we may be inadvertently contributing to a junky experience because of our current understanding of how to drive traffic. What do you think?
I can’t expect Google to adjust their guidelines to speak to just my situation, but maybe if you can reply that would be helpful to understand. I would appreciate it.
Fruity
Matt, how about some “do no evil” comments on this ?
http://google-health-ads.blogspot.com/2007/06/does-negative-press-make-you-sicko.html
Hi!
From whom should a hypothetical inventor of the wheel get natural links
(in an also hypothetical world that uses the internet, but only sledges for transport)?
Natural I think is only
one more word with delicate content
Self needing borders
Frank D. Said
The Google guidelines are more important than the desires of the users.
Because no one sees the site
if google does not like the owners *
On my site are hypotheses of mine,
which may be abnormal today
but in 10 years they could be standards
(if someone could discuss them with me),
but if Google only likes American-style, easy amusement content,
it could be very complicated to find such consequential things.
And to the people who are laughing here:
buy some old educational books
and then tell me: today's knowledge is 4 ever!
Ian M Said,
For example, how to specify the contents of a page, in a method that is understood by Google, as either US-English or UK-English.
JohnMu Said,
I realize the initial review at Yahoo costs $299 and it is not a paid listing. However subsequent years also cost that much — without a review. Is the first year ok and the next year bad (as a paid listing)?
and where is the border between paid and allowed?
Greetings Karl
Brian Ussery Said,
June 30, 2007 @ 6:28 pm
“I’m sorry but this is not a forum for discussions, it’s a blog asking for user comments and not more “unofficial” answers or advice!
If your name is not Matt Cutts and you don’t work for Google, please refrain from attempting to answer questions or giving advice on topics that you are not qualified to address.
Reading your unofficial rhetoric in response to the legitimate comments posted herein is a waste of my time and I'm sure confusing to others."
*************
I agree with Brian.
This is getting out of hand…
You can't hold a conversation here, you can't answer a post here without one of these trolls attacking you and calling you a spammer or telling you a bunch of bull that is their incorrect opinion on how google works or what google wants.
I come here to discuss google issues, read what a google engineer has written or in the case of this thread, answer a request from the owner of the blog.
I did not come to this blog to read some trolls opinion about what he thinks google wants or to be attacked by one of these trolls.
I guess if the trolls here were correct once in a while it would not be quite so bad, but the trolls here are consistently wrong and consistently giving bad advice to webmasters that are seeking reliable knowledge about google.
Matt PLEASE control (or do away with) the trolls here and raise the level of discussion on this blog, instead of allowing the discussion to lower itself to the inevitable… "You're a spammer, no you're a spammer" that these trolls seem to thrive on…
Matt,
I wish Google would issue ‘safe harbor’ guidelines — and a clear, direct way for webmasters to submit challenges when their compliant site suddenly vanishes from the rankings anyhow.
Running my site has always been a fascinating (and scary) look at how Google operates. It’s a small, content-rich site that easily sits atop its tiny, tiny niche. Judged purely on content and user-experience, my site has no trouble earning top search rankings. It has no need of black or even gray-hat tactics.
My site, and others like it, are exactly the sort of sites search engines ought to favor. And yet, overnight, it can go ‘poof’ and vanish as if it never existed.
I can speculate as to why this happened (did Google actually care that I replaced Javascript affiliate ads with HTML last month – one per page in the product review section?) but there is no way for me to know for certain.
Google blasts the entire site, not just an offending page or section.
The new webmaster guidelines look a lot like the old guidelines. They are designed not so much to help designers create compliant sites, but rather not to give too much info away to black hatters looking to game the system.
Give designers unambiguous instructions on how to avoid suddenly being vaporized. Is that really so hard to do?
“supplementary which is approximately Spam….”
igor you’re wrong time and time again. please stop commenting.
“What do you expect? I want Matt to answer every one of you complaints,”
no, but we sure don’t want trolls like you to respond to us.
Matt, I would like some clarification regarding "mirrors", which don't seem to be covered in the guidelines.
I have set up 2 sites, one ".com" and another ".com.au", where the ".com.au" site is an almost exact copy (mirror). The main server ".com" is physically located in the UK and the mirror is in Australia. Using the IP2Location database, I have modified our CMS to automatically compose page URLs sending the client to either .com or .com.au based on whichever is closer and effectively faster to download. (I've put a rough sketch of that logic after my questions below.)
My desired clarifications are:
(a) Does this pose a threat to our web-rankings? By doing this, we seem to be in opposition to the “Duplicate Content” guideline.
(b) Is there some kind of etiquette I can follow regarding mirrors with respect to Google? – From a human's point of view it is a major speed improvement.
(c) Does the GoogleBot that spiders Australian sites live in Australia, i.e. have an Australian IP address? (I could check my logs for this – but clarification of the possibilities would be ideal)
(d) Should I encourage GoogleBot to index the ".com.au" even though it is duplicate content, for the reason that it is preferable for Australian visitors? – To do this I could simply NOT modify the URLs where the user-agent is 'GoogleBot'.
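As promised above, here is a rough sketch of the geo-based URL selection our CMS does (a simplified Python illustration, not our actual code; the lookup table stands in for the IP2Location database and the domains are placeholders):

# Toy lookup table standing in for the IP2Location database.
COUNTRY_BY_IP = {"203.0.113.7": "AU", "198.51.100.4": "GB"}

def country_for_ip(ip_address):
    return COUNTRY_BY_IP.get(ip_address, "GB")

def base_domain(ip_address):
    # Australian visitors get the local mirror; everyone else the UK-hosted .com.
    if country_for_ip(ip_address) == "AU":
        return "http://www.example.com.au"
    return "http://www.example.com"

def page_url(ip_address, path):
    # The CMS composes every internal link against the closer mirror,
    # so the page content is identical but the host differs.
    return base_domain(ip_address) + path

print(page_url("203.0.113.7", "/products/widget.html"))  # the AU visitor gets the .com.au mirror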
The problem with any guidelines put forth so far is that they aren’t hard and fast rules. Some of the guidelines should be read as “our algo can’t get this right a majority of the time, so please help us out and don’t do this”.
That puts mom & pop at a distinct disadvantage because they do not know the true risk to reward. They aren’t button pushers, don’t know how to buy links that don’t scream “PAID LINK”, and don’t have the time or tech to spam blogs.
Case in point is the recent reciprocal link issue. Most of those hit spent hours emailing colleagues asking for links. Not only were links traded, but relationships were formed. However, since SEs don’t understand how some business is still based on people networking with each other, these people were hit.
The guidelines are basically useless and ignored by any SEO worth their salt. Merely rewriting FUD won't cut it. What will make a difference is adopting a policy, when penalties are applied, of applying the same penalties equally across all sites that are violating that guideline within the same set of serps. That will send a message that will resonate.
Anything less is just Google’s failed attempt at gun control where only the bad guys have the guns.
Well, I guess you covered most of them there Matt, and perhaps you may want to add to your list the misspellings in web pages that are responsible for higher rankings on misspelled versions of the searched terms.
Like “home business” misspelled “home busines” etc…
I am extremely dissatisfied with your results for "st george real estate". It's funny because all of google's biggest problems come to light on this page.
1. http://www.realtor.com/stgeorge/ Authority domain ranking while giving almost no information about St. George other than a search of homes. Surely there are real estate sites in St. George that give a full picture of St. George real estate.
2. www.4davebarton.com Wow, let me count the ways he is cheating the system. Let me see: he's a member of a real estate webring and link partners.com, and he has about 50 directories spamming his main page. Just to name a few.
3. utahrealestate.utah.com/stgeorgerealestate/ Basically these are blank template pages of a super authority site, utah.com; once again they used the authority of their domain to rank for something they have no content for.
4. www.thesmithteam.org This site is down. They have a filler of adsense on the page and that is all. Surely there are better websites in St. George than this.
5. www.davidellisrealtor.com On the surface this site looks pretty damn good. Then when you dig a little deeper you see that he is doing link exchanges with Real Estate Webmasters, is part of the keller williams link scheme, and has rehashed other people's content on his pages.
6. www.stgeorgechamper.com/housing This is an ok page for #6 even though it doesn't have crap for real estate information about st george. Again an authority site ranks for something it has little authority on.
7. realestate.yahoo.com/utah/saint_George Authority site with no information about st george other than yahoo's search of listings you have to pay Yahoo to be included in. I am looking for St George real estate information and a search like this only provides 1/2 of what I am looking for, and it's far from complete because it isn't tied in with the Washington County MLS.
8. http://www.newhomes.com/utah/utah_st_george_new_Homes.html Templated webpage with no real content. The authority site wins again. When will google figure this out?
9. www.homegain.com/local_real_estate/UT/saint_george These people sell leads to agents. They have no good information about St. George and they don't provide a search of results. Basically they sucker people into filling out the form, then sell that lead to a poor realtor who thinks he is getting a quality lead when in fact he is getting garbage.
10. www.stgeorgerealestateinfo.com Old site that spams directories for rankings. He is also part of a real estate link exchange group as well as a real estate web ring. This site and david ellis are the only 2 sites you could even justify having in the top 10 for St George based on content or search. Yet it first pops up at #10. Way to go google, your results suck.
First of all, I really appreciate being able to read an interview in a magazine from an engineer at one of the biggest internet companies in the world, and then later get to comment directly on the blog. Kudos.
I'd like to see much more thorough guidelines. There are too many gray areas, and if a company like BMW or Ricoh can get delisted for blackhat endeavors, I think it's an area that needs more detail. I realize that the more thorough you make your guidelines, the more you risk exposing the algorithms behind pagerank, but there has to be more than this.
I've spent the last two years tweaking my websites to be better for my users and SE-friendly without using any questionable methods, but I've noticed that as my rank rises, so do spam sites trying to use my name to build ranks for their clients.
I don’t want to get penalized for these, and it would be nice if I had a tool in the webmaster central to flag these sites as something I am not attached to in any way. I’ve tried the old ‘report spam sites’ route, but it doesn’t seem to go anywhere. And forget asking them to take it down. That’s like trying to convince igor this is someone else’s blog… 😉
– Mark
This is why my very first post in this thread was asking Matt why he started this type of thread in the first place.
No one called you a name Lotso….. at least not in “here”. I think people know you internet-wide anyway, so nothing needs to be said. 🙂
And yes; only spammers who think they are SEO’s have real problems with the current guidelines. I am excluding all real webmasters and owners with that statement as you all are legitimately trying to learn. All other real SEO’s have never had a problem knowing what the guidelines mean and don’t mean. Aside from maybe using a few different words here and there, the guidelines are fine as they are.
I have to agree: whilst the guidelines are sufficiently vague to ensure that google keeps its algos secret, you need to provide feedback to website owners.
I just had an old domain, put a new site on it (after leaving nothing on the site for three months to clear the indexes), and it started getting good results; now the site doesn't even rank no. 1 for its own domain name. But hang on, I followed the guidelines: I put loads of original content on the site and made sure the site was designed with end users in mind. Oh, and I even got relevant links from other trusted sites.
So what do I do instead? Contact google? Yeah right, I would be dead from old age before I got an answer. No, I now have to use adwords much more aggressively than I did before.
But hang on, adwords are pretty much the same as paid links, as I use them to increase my ranking in the search engines …
Small gripe, printing the Webmaster Guidelines is terrible, the main content only showing in a thin, center column… Google could use a good dose of a decent print.css stylesheet!
Can you clarify how the 'no duplicate content' mantra applies to sites using technologies like Endeca/Mercado? These search engines strive to give the user guided navigation without dead ends. It's totally normal for people using this type of software to have two urls pointing to the same content/results because the user can take different paths to the same results. It's pretty much 'tas' but under a different name.
For example:
/search/Color:Red/Finish:Painted/
/search/Finish:Painted/Color:Red/
The user can choose either path, but the search results presented are exactly the same. Is this duplicate content? Yes. Am I trying to abuse the system? No. Should my page rank suffer? No.
It seems we spend a lot of time worrying about such things for no apparent reason, without a clear indication of what 'no duplicate content' really means.
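For what it's worth, one workaround is to normalize the facet order before building the URL, so every path to the same result set produces one canonical address that the site always links to (or 301s to). A rough Python sketch, using the same hypothetical /search/ path style as above rather than anything Endeca-specific:

def canonical_search_url(facets):
    # facets is a dict such as {"Color": "Red", "Finish": "Painted"}.
    # Sorting by facet name means /search/Color:Red/Finish:Painted/ and
    # /search/Finish:Painted/Color:Red/ collapse to a single URL.
    parts = ["%s:%s" % (name, value) for name, value in sorted(facets.items())]
    return "/search/" + "/".join(parts) + "/"

# Both click paths end up at the same address:
print(canonical_search_url({"Color": "Red", "Finish": "Painted"}))
print(canonical_search_url({"Finish": "Painted", "Color": "Red"}))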
Thoughts?
Matt, personally I don’t think Google will ever come close to pleasing most of the Webmasters most of the time. Perhaps a simple intro to the guidelines on page 1 would be the most productive? Something like;
Google is not an evil Giant waiting to pounce on any site that goes outside our guidelines. We truly want as many pages in our database as possible. We fully understand that many pages have elements which could be considered outside our guidelines. Think of each of these 'elements' as drops of water. While one to a few probably won't make much difference, many drops can soon become a bucket full! As such, our guidelines should be read with the best interest of Google searchers in the forefront of your mind. Also, common sense should always be applied.
“enough” for who?
Hey,
i have a question on the “hidden text” thing.
We use the css styles "display:none" or "visibility:hidden" to hide some text and display it with javascript when the user performs an action.
Is this bad for us?
If it is bad, why?
All the new stuff like AJAX uses this. Will google destroy this new technology?
>Brian Ussery Said,
>June 30, 2007 @ 6:28 pm
>lots0 Said,
>July 1, 2007 @ 6:38 am
>corey Said,
>July 1, 2007 @ 9:02 am
>Harith Said,
>July 1, 2007 @ 1:57 am
>igor
>Maybe you, keniki and SearcHEngineSWeB join forces and start your own
>forum/blogs/Insane Asylum/webmaster Psychiatric hospital etc..
I agree with Brian, Lots0, Corey, Harith … trolling sucks. I’d add a few troll-handles to Harith’s list though. Sigh.
And the purpose of your post is?
* Make pages for users, not for search engines. Don’t deceive your users or present different content to search engines than you display to users, which is commonly referred to as “cloaking.”
What, not even a little bit? 🙂
* Avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you’d feel comfortable explaining what you’ve done to a website that competes with you. Another useful test is to ask, “Does this help my users? Would I do this if search engines didn’t exist?”
This is insane. We're not on a level playing field if we stop SEOing our sites; instead the spammers who ignore your guidelines will be sitting pretty in the 1-100 spots in the serps. The problem is that google traffic is massively important. How about splitting the results further into "I'm looking for a business or service" and "I'm researching information"? That way you can identify commercial and non-commercial results.
* Don’t participate in link schemes designed to increase your site’s ranking or PageRank. In particular, avoid links to web spammers or “bad neighborhoods” on the web, as your own ranking may be affected adversely by those links.
But links into a site are the ONLY way to influence google???
* Don’t use unauthorized computer programs to submit pages, check rankings, etc. Such programs consume computing resources and violate our Terms of Service. Google does not recommend the use of products such as WebPosition Gold™ that send automatic or programmatic queries to Google.
Does google have a list of AUTHORISED programs? no… i thought not!
Quality guidelines – specific guidelines
* Avoid hidden text or hidden links.
This should state unless they play an important role in the non-browser based viewing of a page.
* Don’t use cloaking or sneaky redirects.
* Don’t send automated queries to Google.
* Don’t load pages with irrelevant keywords.
* Don’t create multiple pages, subdomains, or domains with substantially duplicate content.
So you’ll be deleting ebay from your listings then? no… i thought not.
* Don’t create pages that install viruses, trojans, or other badware.
* Avoid “doorway” pages created just for search engines, or other “cookie cutter” approaches such as affiliate programs with little or no original content.
Again Dell, Amazon etc all do this!
* If your site participates in an affiliate program, make sure that your site adds value. Provide unique and relevant content that gives users a reason to visit your site first.
Hi Mark, I just viewed your blog. I see you have been doing design/marketing, etc for 7 years now. I am quite surprised by the post you just made. I really, really am. This stuff is not hard. You must be learning at some strange places. For some reason you cannot understand the guidelines and what they mean. Is this really true or is your post just being sarcastic?
“* Avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you’d feel comfortable explaining what you’ve done to a website that competes with you. Another useful test is to ask, “Does this help my users? Would I do this if search engines didn’t exist?””
Why do you say that is “insane”? What about that paragraph don’t you understand?
Matt.
Why are Google's guidelines so cryptic?
Why is there never a concrete example of your guidelines in action? You could purposefully set up some example sites so we can see the guidelines in practice.
I note your affiliate link guidelines. What makes an affiliate site valuable?
Take a look at Moneysupermarket.com. Is this about to be penalised?
Try not to give me a politicians answer please, it helps nobody.
Doug: It's insane because they can't 100% automatically kick spammy sites out… there are always people trying to trick their way to the top, and while I don't condone black hat techniques, there needs to be some way of evening the playing field.
It's like the police asking the law-abiding to 'be nice' whilst there are murderers at the door.
Keniki hush those gums and go build some more “accessibility” link farms 🙂
Well sure Mark, but it’s that way no matter what we may be discussing. There are always bad people in every industry and walk of life. Your prior post makes it seem like you didn’t understand the guidelines at all. Spammy results are in every search engine. That’s nothing new. Any site practicing se spam will “eventually” get caught at it. The guidelines are truly what they are….. guidelines. They are meant for owners and webmasters as a “guide” to what is a good site for the Google users of their search engine. If people choose not to follow them, then they choose to be people who love to gamble with their sites/businesses/domains. Personally; I’d rather sleep well at night knowing that when I get up, no client or site of mine will be banned or penalized.
Doug, I agree. Mark these are guidelines. As Matt said at a conference … and Matt, feel free to edit if I’m wrong …
“You are free to do what you want as a webmaster, I am fully supportive of that idea. These are just guidelines. But by the same token, I am (Google is) also free to remove you from the results if you do something that violates our quality guidelines.”
igor, don’t use matt’s comments to discuss other topics. that’s called hijacking. if you want to talk about me please contact me. what you said about supplemental is wrong, the end.
sebastian, thanks for echoing my point and the points of others.
Doug Heil Said,
July 2, 2007 @ 5:18 am
“Your prior post makes it seem like you didn’t understand the guidelines at all.”
Doug not only do you not understand the guidelines, you seem to have a problem with reading comprehension as well.
Doug, did you understand what Matt was asking for when he started this thread? From your aggressive and inane comments to posters, it looks like the subject of this thread went way over your head.
Doug, I am still waiting for you to clarify your comment about how ANYTHING done just for googlebot is SPAM…
Things like the robots.txt file or site maps being SPAM is really something only a person with little or no real world experience would say.
igor berger Said,
July 1, 2007 @ 3:03 am
This topic is getting old!
Matt, needs to do a new one.
Maybe about his cats, or the Amazons!
Hey Igor… Get a life.
Igor, the guidelines are not crystal as they don’t take into account the wide and varied use of a website – printing pages for example may require hidden text only viewable when printing – is this against the guidelines?
All in all, i think the guidelines are pure smoke and mirrors whilst Google serves its own self interest which to be fair, doesn’t include non-paying customers.
mark
Lotso; I assume your questions take a little common sense to answer. If not; I’ll answer for you.
A robots.txt file and a "Google" site map are options "you" can use or not, and they are options the SEs themselves actually suggest you use. ALL major engines state to use them so the spiders know which pages you want indexed. Google suggests you use the site maps if your site cannot be crawled real well.
So tell me Lotso; what about that is spam to you? Is it being deceptive to Google even though they suggest them for your site? Nope. Not. How can it be spam if it’s not being deceptive to Google?
Goodness; I feel like I’m answering questions from brand new site owners who actually do not know things.
Lotso Wrote:
“Doug, did you understand what Matt was asking for when he started this thread? From your aggressive and inane comments to posters, it looks like the subject of this thread went way over your head.”
Yep. You didn’t however. He asked for “comments”. He didn’t ask “just” for negative comments, but just comments. That means we can already agree with the guidelines as they are OR not. I choose to agree with them, and also question others about why they disagree with them. What’s wrong with that? Nothing. Doesn’t it spur “learning”? It sure does. It’s called “teaching” 101. If Matt doesn’t want people to learn in here, he will say so. For now; I always lean towards places that want others to learn. You just don’t want to hear about it. That’s your right, so just don’t read my posts.
Your questions are questions that new people might ask. Forcing me to answer them is plain silly don’t ya think? But maybe I should think of it as you “trying” to learn as well? If that is the case, then good for you. 🙂
Mark wrote:
“the guidelines are not crystal as they don’t take into account the wide and varied use of a website – printing pages for example may require hidden text only viewable when printing – is this against the guidelines?”
There is no way anybody could possibly state ALL the different and many ways a website can be built. Guidelines are guidelines. Your question about "print" is impossible to answer as each circumstance could be very different than the next. If you are doing something for your users, it's not spam. It's really simple if you use a common sense approach. If you are "concerned" about what you are doing/implementing, maybe you shouldn't be doing it that way. Do it a different way that helps your users.
Hmm, new guidelines (and better – amazing what you can do with hypertext), same old arguments!
Those that want to get it, will get it – apart from paid links of course, where the guidelines that do exist are still vague.
That’s if you can find any reference to paid links in the guidelines, starting out at the link Matt gave:
http://www.google.com/support/webmasters/bin/answer.py?answer=35769
Please explain why Google continues to crawl via proxies and index the hijacked pages, which distresses many a webmaster, when the very process violates a bunch of webmaster guidelines such as:
# Don’t use cloaking or sneaky redirects.
That’s how the pages get crawled via the proxy server, so why is it permitted to continue?
# Don’t create multiple pages, subdomains, or domains with substantially duplicate content.
Google actually creates the multiple pages via the proxy server, et tu brute?
# Avoid tricks intended to improve search engine rankings.
What other purpose would the proxy server have of allowing Google to crawl through them in the first place?
I could go on but I think you get the point…
igor berger Said,
July 2, 2007 @ 7:40 am
“Google Guidelines are crystal and they have been crystal clear as far as I remember but you and others like you want to make something else out of them to benefit your dirty style of Webmastering…”
Igor, you and Doug are both geniuses way above and way beyond human comprehension.
Matt was asking and I quote Matt Cutts… “what do you see that is unclear? Are there places where you think the wording is poor or confusing?”
I pointed out two parts that I thought were written badly…. I was never asking for any changes to the google guidelines.
Unlike you, I was doing what Matt asked… I pointed out some things that were written poorly and did not get google's point across well…
How you can twist that into making me a “dirty webmaster” is way beyond me.
Doug…. You're the one saying that robots.txt and site maps were SPAM… You said it…. oh so wise google one, why can't you back it up?
Doug Heil Said,
June 29, 2007 @ 3:16 pm
“If you do something strictly for the spiders…. spam. This isn’t hard.”
Doug can you point me to anywhere it says that in google’s guidelines???
You can’t because google did not say it…
Doug, you make stuff up and then say it's "google's guidelines"… now why would you do that?
Doug, because it was pointed out to me that a big time genius like yourself often does not understand sarcasm very well…
I thought I would point out that nowhere in this thread did Matt Cutts ever ask for "Comments".
Doug Heil Said,
July 2, 2007 @ 9:41 am
“…He asked for “comments. “
Thanks for calling me a big time genius. I’m not, but thanks for noticing. 🙂
He didn’t mention “comments”?
I guess you don’t count the most important part of the thread then. You know, the part that is called a “title” of the thread where Matt has this question:
Comments on our Webmaster Guidelines?
just like that.
I know you love it when forums and other places out there are constantly criticizing, etc, but I'm very sure every website out there would love to have a few people actually state they agree with things as they are. Some of us see zero need to change the Google guidelines "yet again". I know some of you will never get it at all, no matter how things are worded or changed. Those are just the facts.
Today I have read all the new guidelines; they are much better than the old ones, describing everything very well. I will also make some changes on our homepage to make sure everything meets the google guidelines 100% in the future.
Right now I've found a complete URL list of the spammer who copied everything from us and is spamming the google index now. I hope this info helps. http://www.olympostravel.com/koylar1.html
best regards
Heiko
You can deface webpages with web page defacement 2.0:
http://drawhere.com/site/http://mattcutts.com/blog/
to canadafred:
it is possible, but it can be very tricky and could hurt your rankings as well if you're a spammer and you're using the same techniques to promote your site.
I would suggest you get into search engine reputation management, pushing the bad reviews down:
http://www.joostdevalk.nl/search-engine-reputation-management-make-it-look-natural/
If Matt didn’t want comments, he can turn the option for comments off for an individual post (and has in the past).
Not only that, he’s asking for comments:
Matt is looking for feedback on behalf of big G. Doug gave his feedback (that he felt that nothing was wrong). I don’t agree with Doug on this issue personally, but he’s not doing anything wrong, unless saying something that the rest of the SEO community would rather not hear is wrong.
Why are you picking a fight anyway, lots0?
lots0 is a bare-faced liar with major insecurity issues. I would guess he was the school bully at one time (or still is) and now he is below even that.
He is best ignored; after doing so about 100 times he finally takes the hint.
Is it possible to encrypt javascript so it's unbreakable, perhaps with some server challenge/response mechanism? I've heard rumors from the underground of a special encrypting javascript. It sounds like an opium pipe dream, though.
Hi Matt,
I think the guidelines are great – to the point and they tell you exactly what "not to do".
[quote]Google works hard to ensure that it fully discounts links intended to manipulate search engine results, such link exchanges and purchased links. If you see a site that is buying or selling links, If you see a site that is buying or selling links, let us know.We’ll use your information to improve our algorithmic detection of paid links.[/quote]
Just a grammatical point touched on earlier:
“such AS link exchanges” ?
and “if you see a site that is buying or selling links, if you see a site that is buying or selling links, please let us know” (repeated).
[quote]Submit your site to relevant directories such as the Open Directory Project and Yahoo!, as well as to other industry-specific expert sites.[/quote]
This is the only thing I find confusing from the Guidelines – we all understand "don't buy links", but most "relevant directories" charge a review fee yet most do not use rel="nofollow" – yet the Guidelines say DO submit, if relevant?
I know this has been asked tons of times – sorry!
Ray, Google isn’t against buying links for traffic and will only discount (not penalize) links that are paid in the main. So, if you pay for a link, or review, ensure the traffic is relevant and high enough to justify the cost.
Don’t EVER pay based on the PR of the page. All you are doing in that way is lining the pocket of some low life link monger.
Hi Dave (original),
I totally agree with your interpretation, it is just that this issue is still a tad confusing on the Guidelines – which Matt asked for feedback on.
What i would love to know is this:
Do web positioning software titles leave some kind of digital footprint?
In the interests of the webmaster world as a whole, clarification is needed.
These software titles are big business.
They are selling goods that supposedly improve positioning when in fact the exact opposite could well be true.
My thought is that Google CAN trace this activity and DOES penalise sites that use this software.
And before anyone jumps on their soapbox and has a personal dig at me, I think this is a very pertinent question, don't you?
SEOcock, of course those tools leave footprints when they access the Google servers :). However, in general it’s impossible to tell whether or not a site owner is the person using the tool — you might be checking up on your competition and it would be unfair to penalize them because you used tools like that. But things are not always “in general” – imagine the IP that uses those tools matches the IP that logs into the Google account for that site, or matches the IP that accesses the Adsense, Adwords or Analytics data for that site. That would be a fairly strong signal that you are using those tools for your own sites. Does Google do that? Who knows. But they certainly could, and it would be their right if they wanted to do so.
Even if they don’t penalize the site, they have the right (even without the guidelines) to block your IP because of a hint that you might be using tools like that, for your site or for someone elses. If you’re using those tools to create reports for clients (imo – see other postings above – useless), it would be pretty bad to have to explain the incomplete statistics where Google decided to block your IP for an hour or a day or more… Don’t get started on that and you won’t have problems with it.
I don't know what you mean by "this software improves positioning". No it doesn't. It doesn't help the webmaster or client or owner in any way. Is it a pertinent question? I guess it is if you have to ask about it. 🙂 I would think they sure do leave footprints. I know many of them "claim" they don't leave any, but the very fact they claim that means they "know" that Google does not like them. I don't get why people think they have to use them. This is an area where our industry really falters in a big way by not educating people about what actually helps and what does not.
Doug – they sure do make it easier to track your progress and monitor your website across platforms tho!
What “progress” do you need to track or “monitor” that is not in your stats program already? I think this issue has been asked and answered a few times already. If you don’t get my answer to this the first or second time, you just won’t get it now either.
I think that what a few people who are so vocal in this thread are forgetting is that Matt asked for comments on the Webmaster Guidelines. That definitely shouldn't include direct insults from those who think they know it all against people who have chosen to express comments and concerns here. What could have been a very healthy and useful collection of comments has instead turned into a "you don't know your asp from your elbow" flaming session.
People with varying levels of skills and backgrounds use Google’s Webmaster tools and the guidelines need to be as universally understandable as they can be in order to be truly “user-friendly”. Comments here are for the purpose of generating feedback from Google’s user community – very valuable – not as an opportunity to take pot shots at those who post – worthless and counter productive.
Okay Randy. So it’s okay for a member in here to ask about a ‘product’ like WPG, etc, etc, etc, but it’s not okay for other members to answer them about that same product? I gotcha. Last I knew, Google states quite clearly in the guidelines to NOT use WPG and others similar. But yet, you don’t want us to help you or anyone else about that same product. Gotcha.
Doug. Here’s the deal (and apologies to Randy here).
You my friend, think that by posting all these types of comments on Matt’s blog, Google will somehow think you’re an angel and give you the highest paid job at the company.
Playing teachers pet all the time, knocking serious questions and concerns that people have. It’s boring mate, you are boring the board.
Yours is not the best way, yours is not the only way, it's just a way.
I for one am bored of you thinking you know best and thinking you own this thread.
Either you rein back that ridiculous know-it-all attitude or I'll make you famous. Your choice.
I think you should offer an apology to those who you have been rude to. Be grown up and accept a grown up debate or simply be quiet.
Your choice.
(sorry to all that had to endure this post)
Matt, it'd be good if something could be added to the guidelines on what to do if your site complies 100% with google's guidelines but you still find it penalized. It seems to me that the last thing google wants is for their results to not be as relevant as they could be, and it's thus in the interest of both google and the webmaster to have some way to resolve these cases. Cases where sites (clearly) break the guidelines are totally different, of course! While I'm on that, it would be pleasant for these same webmasters not to have to admit to wrongdoing when filing a reinclusion request when they haven't actually done anything wrong according to those guidelines. It's quite frustrating to have to spend your hours trying to fix something that isn't broken, and also a waste of good time that could be put to use in creating even more content rather than reading endless posts in forums/blogs that all don't have the answer 🙁 Even better would of course be if penalties showed up in the webmaster tools area in these cases, or if there were some kind of way of contacting google about them, but I figure that might be tricky to do automatically.
LOL SEOCOCK. Too funny chap. I just “read” your blog. It appears you have many issues with “many” people including Matt Cutts. What’s up with that? It looks like you really shouldn’t be posting in here at all. You are very welcome to your opinion, but you are the one who posed the question about WPG and wanted to know if Google could find “footprints” with it. I answered your question. If you don’t want to read comments, I suggest you don’t read them or simply tend to “your” own blog.
WPG is written about in the Google guidelines along with software like it. You ask the question. You get an answer. Answers are our opinions. Matt Cutts can give his opinion as well. You can state your opinion. We all can give our opinions.
And no; I’m too stupid to work for Google. They already know that is a fact. Heck; I have NO college degree for one thing. But I can tell you this; reading you and your blog, it may help you to read and learn at many places as you seem to have lots to learn. You state I need to grow up? Again; people can read your blog to find out who needs to grow up. 🙂
I write the Help content for Webmaster Tools – and wow, that’s a lot of feedback: thank you.
Some of the changes – for example, fixes to the embarrassing ” tag” error – are no-brainers, and we’ve already made the changes. Other comments and suggestions require some more thought, and we’ll be reviewing these over the next few weeks.
Again, thank you for the feedback and suggestions – we’re all reading this thread very closely.
I have a question on automated queries.
A lot of websites send automated queries to look up either search rankings or pagerank.
I myself was about to launch a Page Rank checker type of site but thought that it violates Google’s TOS and dropped the idea.
But on the other hand I see so many of these so called sites, which I’m pretty sure don’t have any type of automated query agreement with Google and are still doing it.
Can u please clarify if this is allowed or not.
thx
I fully agree with Doug. SEOCOCK, your post is nothing more than a childish flame born from frustration.
The vast majority of “SEO tools” are marketed to the uninformed or misinformed. Or, put bluntly, they prey on the ignorant.
By using “SEO tools” you penalize yourself, both in the SERPs and hip pocket.
One of my sites apparently was removed from the Google index. I don’t care much about this particular site since it is dated and whether or not it gets indexed doesn’t matter. The notification however pointed to the webmaster guidelines, but I couldn’t find anything related to the specific issue highlighted.
In addition the issue highlighted doesn’t exist (or I don’t understand what it is trying to say, maybe something got lost in the translation since the mail was in German):
Wir haben auf Ihren Seiten insbesondere die Verwendung folgender Techniken festgestellt:
*Seiten wie z. B. example.com, die zu Seiten wie z. B. http://www.example.com/index.htm mit Hilfe eines Redirects weiterleiten, der nicht mit unseren Richtlinien konform ist.
Translation: In particular, we have detected the use of the following techniques on your pages: * Pages such as example.com that redirect to pages such as http://www.example.com/index.htm using a redirect that does not comply with our guidelines.
Now since when does Google consider redirects within a site evil? Plus, the referenced domain example.com does not even exist, nor does the homepage redirect either.
What bothers me is two things: One, the alleged violation of content quality guidelines doesn’t exist and there is no guideline against using redirects either. Two, there is no way to give feedback about this (unless Matt happens to ask for feedback about the guidelines, thanks Matt!) and the only option is a re-inclusion request where I have to confirm that I have understood and fixed the problem … can’t do unless I understand what the problem is.
Matt, It would be nice if some of the questions addressed were answered.
I myself have been in the SEM industry for the past 3 years and as I am sure everyone knows Google has been the major target of all SEM companies for their clients.
I long ago took the Google way and stopped looking at websites from the search engine robots point of view but rather at the average joe browsing the website.
As an SEM our job is to help the marketing side of the website. I can promise you the major problems our clients have are overpriced programmers who do a bad job code-wise and do not use industry standards, combined with a poor understanding of what the Internet is and how it differs from the classic marketing that lots of people are used to and that is still taught in universities worldwide.
I myself don't see the problem with using automated tools to check stats, while also checking a site's analytics stats and comparing the two. Someone mentioned that if you don't get traffic, that means either you aren't ranked, or you are ranked and no one searches. So how would I know what I rank for without checking keywords?
Some people have huge websites with many sections that target hundreds of keywords. It's not a must to check positions, as most content-heavy websites receive their traffic from long-tail searches, but that's what clients want; it's a way of tracking SEO progress.
I don't think Google will punish the website or the keyword being searched for, but will rather target the IP address making automated requests. I have seen this happen and IPs get magically banned (rightly so).
Just my two cents …
Thanks for reading 🙂
P.S. Google Webmaster Tools is a great way to discover the real relationship your website has with Googlebot.
The Google guidelines url seems to need posting again:
http://www.google.com/support/webmasters/bin/answer.py?answer=35769
You all should read them. The part about automated software does not need any “clarification” that I can see. Here it is:
“Don’t use unauthorized computer programs to submit pages, check rankings, etc. Such programs consume computing resources and violate our Terms of Service. Google does not recommend the use of products such as WebPosition Gold™ that send automatic or programmatic queries to Google.”
The first words should be all you need…… “Don’t use..”
The guidelines are pretty clear as they are, if you would simply read them. Yeah Dave; all that coming from an anonymous person who even hides his "info" in the whois and has NO info about who he is on his blog either.
Doug,
I don’t believe that
“Don’t use unauthorized computer programs to submit pages, check rankings, etc. Such programs consume computing resources and violate our Terms of Service. Google does not recommend the use of products such as WebPosition Gold™ that send automatic or programmatic queries to Google.”
will get your website banned; I think they will just ban the person running the automated software.
If that was the case people would be trying to knock each other out of Google using this method.
Simon
Doug Heil Said,
July 3, 2007 @ 8:25 am
Okay Randy. So it’s okay for a member in here to ask about a ‘product’ like WPG, etc, etc, etc, but it’s not okay for other members to answer them about that same product?
That’s fine, Doug. But why the attitude? You just proved my point (in fact, I knew you couldn’t leave it alone).
And thanks, Riona, it’s good to know that you’re reading the thread.
Seriously, what is an empty threat like this really supposed to accomplish? I’ve got two words for you, SEOCock: “Anger Management”. You can’t back this up, so why even bother?
Randy: Doug’s not really giving attitude. He’s pointing out the obvious (that software like WPG leaves footprints and goes against Google guidelines). The problem is that most people don’t want to hear it, much like a lot of things that are wrong with the SEO industry. It’s ostrich logic at its finest…duck your head in the sand and see no evil, hear no evil, speak no evil. And it’s really hard not to get a little edgy once in a while when the same questions are repeated over and over again by the people who should really know better.
Riona, I will agree with Randy that it’s good to see you’re reading this thread. I’m just not sure if you’re going to get much out of it.
Thanks for asking, Matt,
What I’m wondering about is the Sitemap.
As stated on the Guidelines page, I understand a page should have fewer than 100 links, so I divided the sitemap on one of my sites into 3 pages so each has fewer than 100 links.
Now, what if my site's content reaches thousands of pages? Let's say it reaches 10,000 pages: should I then create 100+ sitemap pages?
That's a question that has been on my mind for quite a while…
Thanks in advance! 🙂
Sincerely,
Marcus
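One way to handle that, sketched below in Python under some assumptions (a flat list of URLs, illustrative file names, and a 99-link chunk size), is to split the URL list into pages of fewer than 100 links each:

# Minimal sketch: split a large URL list into HTML sitemap pages of
# fewer than 100 links each. File names, chunk size and markup here
# are illustrative assumptions, not anything Google mandates.
import math

def write_sitemap_pages(urls, chunk_size=99, prefix="sitemap"):
    # Number of sitemap pages needed so each stays under 100 links.
    pages = math.ceil(len(urls) / chunk_size)
    for i in range(pages):
        chunk = urls[i * chunk_size:(i + 1) * chunk_size]
        items = "\n".join(f'<li><a href="{u}">{u}</a></li>' for u in chunk)
        html = (f"<html><body><h1>Sitemap page {i + 1} of {pages}</h1>\n"
                f"<ul>\n{items}\n</ul>\n</body></html>")
        with open(f"{prefix}-{i + 1}.html", "w", encoding="utf-8") as f:
            f.write(html)

# 10,000 URLs would come out to roughly 102 pages of 99 links each.
write_sitemap_pages([f"http://www.example.com/page{n}.html" for n in range(10000)])

Separately, XML Sitemaps submitted through Webmaster Tools are a different mechanism: the sitemaps.org protocol allows up to 50,000 URLs per file and ties multiple files together with a sitemap index, so the under-100-links suggestion really concerns HTML sitemap pages meant for users.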
One thing that is sorely needed in the guidelines is instructions on geo-location. At the moment, this information is buried in the Q&A system, but most webmasters don't even know they should be asking the question in the first place, so it's not very helpful there. Perhaps in the Technical guidelines.
As the owner of fitflex.com I find it extremely frustrating, since I have been penalized for close to 15 months and feel that in those months I have abided by every guideline. All the experts from the major boards, including the Google boards, have no idea why I continue to be penalized. Even in terms of the affiliate sales aspect of my site, which my site does offer, the guidelines suggest reviews, ratings and comparisons to offer more value; I do all of those in the thousands and have been doing so for the entire length of the penalty, in addition to hundreds of pages of unique and compelling content updated weekly. I used to be a thin affiliate at the time of the penalty, but at this time fitflex.com is nowhere near that.
How is one supposed to get out of a penalty if they are meeting all the written guidelines? How is one supposed to crack the reason for a penalty if there is no feedback in webmaster tools following genuine re-consideration requests based on months of research and good faith? How is one supposed to guess at the cause of a penalty that is not written in the guidelines?
Hi Nate; If you are “very” clean now; then “eventually” your penalty will be lifted. This isn’t a one size fits all type thing. You were penalized and are paying the price, just like all sites who go against the guidelines as they “will” eventually get caught.
If you really don’t know how to go about figuring out if you really cleaned things up, then you have to find someone who can help you with it. Google is “not” going to help you. If they told every site owner what the exact problem was/is, then all spammers would be able to go right up to that line, or go over, and then Google would tell them exactly what happened. It’s just not feasible and downright impossible for Google to consider.
Your site isn’t in your profile in here, so it’s also impossible for anyone to give you any advice.
As far as prior posts in this thread; I never stated that people who use WPG would be banned. Not even close. What I did say was that the guidelines are “very” clear as to running automated software that pounds the se’s servers needlessly. The guidelines state this:
“Don’t use…”
That could not be clearer in my mind. Do people really have a problem with those two words? I'm constantly amazed in this industry. 🙂 It really does not matter what happens "if" you use them, does it? Isn't it enough that the guidelines state "don't use"? Who cares what happens if you are caught? Don't use the auto stuff. You will have no problems then, so you won't have to wonder "what happens".
Instead of making the auto-software wording "more clear", as Google has stated they want from this thread, maybe they should make those two words "not as clear" as they are now.
As far as clients "wanting" you to do rank reports for them? You are wrong, sorry. Clients just "think" they want that. They really, really do not want a silly report such as ranks. Not at all. What they do want is more sales and more advice about how to get sales. That's what they want. If SEOs in this industry cannot educate their own clients, why are they even in this industry? It's no wonder the industry has the bad rap it has. Many of us shake our heads in amazement as "others" don't put their collective feet down and put a stop to the nonsense.
I believe the almighty dollar is a big reason for the state the industry is in. Look at all the software tools out there that are allowed to "show"/tout their wares "everywhere": at conferences, in forums, in Yahoo ads, MSN ads, Google ads. Everywhere. In most forums, nothing is ever said about the uselessness of all of this software. Instead, we all see people touting this tool and that tool, and to whose benefit? That's right: the owner of the software's benefit. If you think that rank-check tool is helping your client or their website in any way, you just truly don't have a clue about what a "true" SEO/designer really is. Those are just the facts.
And before you bash me; this post is MY opinion and I’m sticking to it. “Many” others agree with me but are not as outspoken about things as I am. Someone needs to do it/say it…….
Hi Doug, my url is posted in the thread 🙂
And yes I agree that Google can’t tell how the entire process works, then real spammers will take advantage of it, that’s common sense. Just posting on what “I” would like to see done for cases that relate to people like me that truly do put effort into following the guidelines, about 15 months worth since the penalty, and would like some more feedback.
Thanks for your post!
Nate, what makes you think you are being penalized?
IF you are truly clean (as per the guidelines) you likely have no penalty and are simply not ranking as well as you want.
Nate, think about that from Google’s side. In their eyes you are/were one of these “real spammers” as is anybody that is spamming their SE.
How many decent links do you have pointing to your site?
The site is actually http://www.fitflex.com, for those who didn’t see it in Nate’s post (which at least covers Doug).
Nate, you have a number of issues (not the least of which is that so-called experts are giving you a line of BS). You already acknowledged a large issue (i.e. that you're "affiliate marketing" as opposed to actually selling something). Dave touched upon another major issue on the surface: that you don't have any of the right kind of links (i.e. links that will send you worthwhile traffic).
You’re too busy worrying about how to please Google and not worrying enough about how to please users.
I have a question on following webmaster guideline points:
“Keep the links on a given page to a reasonable number (fewer than 100)”
“Offer a site map to your users with links that point to the important parts of your site. If the site map is larger than 100 or so links, you may want to break the site map into separate pages.”
What is Google's behaviour with nofollow links?
E.g., in my HTML sitemap there are 110 links, including navigation bar links, and 20 of them are marked nofollow. Does Google count nofollow links? Would Google consider that 110 links or 90 links? Kindly resolve this confusion.
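For what it's worth, counting the two figures in that example is easy to automate. Here is a minimal sketch, assuming the beautifulsoup4 package and an illustrative HTML snippet; whether Google counts nofollowed links toward the roughly-100-links suggestion is exactly the open question here.

# Minimal sketch: count total <a href> links and nofollowed links on a page.
# Assumes the beautifulsoup4 package; the sample HTML is illustrative.
from bs4 import BeautifulSoup

html = """
<ul>
  <li><a href="/page1.html">Page 1</a></li>
  <li><a href="/page2.html" rel="nofollow">Page 2</a></li>
</ul>
"""

soup = BeautifulSoup(html, "html.parser")
links = soup.find_all("a", href=True)
nofollow = [a for a in links if "nofollow" in (a.get("rel") or [])]

print("total links:", len(links))                      # 110 in the question
print("nofollow links:", len(nofollow))                # 20 in the question
print("followed links:", len(links) - len(nofollow))   # 90 in the question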
Hi Nate; I just viewed your site. You are in one of the toughest markets/industries out there. As Adam said, it wouldn't surprise me if you were not penalized at all.
My firm would never help you with "just SEO" without a total redesign from scratch. A black background with white text is "VERY" hard on the eyes. Many of your visitors just leave when they find it. I could not read your text without squinting.
What makes your site unique compared to the other thousands of sites out there selling pills and the like? If you are going to pick something to sell as an affiliate, why the heck did you pick that one anyway? lol That would be my last choice. Actually; I wouldn’t even redesign the site as I don’t do pills and things similar anyway. I’m simply telling you that unless your site is one of the best out there, you stand almost no chance of doing anything much in the se’s. I know that is not what you want to hear, but I can only write my thoughts as I see them.
Dave:
I agree, I am working on increasing backlinks, thanks!
Multi Worked Adam:
“You’re too busy worrying about how to please Google and not worrying enough about how to please users.”
Couldn't be farther from the truth; it's the other way around. For the past 15 months I have done nothing but improve the site for my users (reviews, ratings, articles, advice, downloads, free email advice, etc.).
Doug:
How is my site unique? For one, I am not a pill store; I used to be a thin affiliate but am no longer. I am a rich, content-driven health and fitness resource to help people share opinions on goals, get educated, etc., AND, if they choose, to guide them to the supplements that help them reach those goals. I provide a value-added aspect through reviews, ratings and comparisons, as Google itself recommends. 99% of affiliate sites are template-driven with duplicate content descriptions, etc. Not fitflex. Along with product reviews, ratings and comparisons, I offer audio/video downloads, hundreds of articles, email and online advice, and such.
As for being penalized, yes I am. There have been dozens of proven -30 penalty case scenarios out there and I am one of the first. Sure, most of you guys won't believe it (I am not asking anyone to) and will say I just don't rank well… that's fine, to each his own.
Once again, thanks very much for the input!
PS – I am not interested in SEO, just keeping my limited visitors happy as my bookmark rate has grown from 15% to well over 37%.
To get back to topic on hand by Matt:
My recommendation for the guidelines is to make them clearer in regard to legitimate (non-thin-affiliate) online resources that contain affiliate links. The guidelines do suggest ratings, reviews and comparisons for a value-added visitor experience, as well as generally unique, relevant and updated content: a site that puts visitors above click-throughs, a site that would remain highly useful without its affiliate links.
I understand that many affiliate sites out there are thin and spammy, and this is a tough line to draw.
What if a site meets all these criteria yet continues to sustain a penalty?
Thanks to those Google staff that take my post into consideration.
Best regards,
Nate
Nate, explain how you *know* you are being penalized. You can't keep stating that without proving it! With the lack of decent links you have, I would say you won't get past page 100 of the SERPs for any decent phrases.
Why not simply err on the side of caution and do as they request?
The penalty does exist for me, has for the past 15 months, and has been commented on by Google staff a few times indirectly. I know because others have been in the exact same penalty with the exact same symptoms and have had the penalty removed with a reconsideration request a few days later. It happens the same way, each and every time.
I am not here to defend my penalty. I've had enough evidence shown to me first-hand that it is very real, and many sites have been affected by it; yet as I mentioned, the penalty was removed within a few days once the penalizing aspect(s) were removed and mentioned in a reconsideration request.
This infamous penalty has been discussed so many times, and unfortunately it's very difficult to comprehend if one is not afflicted by it. Many try to disprove it, yet cannot prove that it does *not* exist.
Dave, if you are interested, here is a prime example of a site that was affected by this exact penalty; find the post by Mark, someone I chatted back and forth with trying to help him find the cause, as we shared the same symptoms of the -30 penalty. After a quick reconsideration request and a closer look, Mark returned to the rankings in a flash, as if nothing had ever happened. The penalty was removed. The penalty did exist and continues to for many, many sites, and is not just an example of poor rankings, etc.
http://www.seroundtable.com/archives/013423.html
That is what brings me back once again to the topic at hand of this thread before it gets way off topic: new suggestions for the guidelines, which I made clear for those in my shoes.
Nate, you just answered your own question without realizing it:
If you didn’t care, you wouldn’t be pissed off over what you perceive to be a penalty. You also wouldn’t be posting on SEO boards claiming the same thing.
You may be doing that, but it’s not immediately obvious if you are. A quick glance through the Wayback Machine indicates that your site doesn’t appear all that different than it did 15 months ago.
Not only that, good sites have a way of spreading through various traffic sources (e.g. email, social media sites that aren’t manipulated by the traffic-generating crowd, organic links from other sites) in such a manner that you won’t even need to worry about what traffic you’re getting from one source. Where are your links from other bodybuilding sites? Where are other bodybuilding blogs (if any exist) with posts that talk about your site? Where are your organic links from other sites that say you have unique and interesting content, instead of you having to tell everyone?
In other words, Google isn’t the problem here…the problem is that your own target market is telling you something that you can’t or don’t want to hear right now. You solve those issues, you’ll likely solve at least some of your SEO woes as well.
So you are now saying you have no penalty????
When/if a site is being penalized and the site removes the spammy elements, don't assume your position is being held open for you, because it's not. It could easily be a case of Google having been fooled by the spam and giving credit where none was due; now Google are positioning your site without being gamed.
As I said, with your lack of true votes you don't stand a snowball's chance in hell of ranking in your market. IMO the only one penalizing your site is yourself. I can see credible and factual evidence of your site simply not making the cut in your market, but no evidence (credible or otherwise) that Google are penalizing you.
I know that is not what you want to hear, but I'm not one to sugar-coat things and say what one wants to hear.
Thank you for sharing your personal, theoretical opinions. I will take a closer look at your suggestions to end this 15 month penalty.
Nate
Nate, your snarky remark speaks more about you than anyone.
Your perception of being penalized for 15 months is based entirely on personal theories. You have not offered or shown one iota of proof. On the other hand, our belief that you are likely not penalized is based entirely on fact: the fact that you don't have any decent links.
As I said, the only penalty is likely self-inflicted, as you close your ears to facts but open them to any sentence containing the word "penalty". Some people are their own worst enemies and cannot be helped.
Oh, I forgot one question. How has your sustained belief that you have been penalized for the past 15 months been working for you? Ponder on that.
"Duplicate content" has always been a tough one for me. Most websites have duplicate content in some way or another, whether they are linking to additional information or restating their mission statement. This could be made a little bit clearer.
I've seen some webmaster tools that analyze your pages and report back a percentage of how alike they are.
I'm guessing some duplicate content is a necessary part of websites. What are your thoughts on this?
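Those percent-alike tools generally boil down to a text-similarity measure. Here is a minimal sketch of one common approach (word shingles plus Jaccard similarity); it is only an illustration of the idea, not how Google actually detects duplicate content.

# Minimal sketch: rough "percent alike" score for two pages using word
# shingles and Jaccard similarity. An illustration of the general idea,
# not Google's actual duplicate-content detection.
import re

def shingles(text, size=5):
    words = re.findall(r"\w+", text.lower())
    return {tuple(words[i:i + size]) for i in range(max(len(words) - size + 1, 1))}

def similarity(text_a, text_b):
    a, b = shingles(text_a), shingles(text_b)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

page_a = "Google recently updated the webmaster quality guidelines with more examples."
page_b = "Google recently updated the webmaster quality guidelines with extra examples and tips."
print(f"{100 * similarity(page_a, page_b):.0f}% alike")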
– Check for broken links and correct HTML.
Matt,
does this actually play a key role in the rankings? I see many sites that don't have correct HTML and they do quite well.
Even the google.com home page is not W3C compliant and produces quite a few warnings.
Maybe it's time for Google to clean up its own site to set an example and give more weight to W3C-compliant sites.
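On the broken-links half of that guideline, a check against your own pages is cheap to automate. A minimal sketch, assuming the requests package and an illustrative URL list (some servers reject HEAD requests, in which case falling back to GET is a common refinement):

# Minimal sketch: report links on your own site that no longer resolve.
# Assumes the requests package; the URL list is illustrative.
import requests

urls = [
    "http://www.example.com/",
    "http://www.example.com/old-page.html",
]

for url in urls:
    try:
        response = requests.head(url, allow_redirects=True, timeout=10)
        if response.status_code >= 400:
            print(f"broken ({response.status_code}): {url}")
    except requests.RequestException as exc:
        print(f"unreachable: {url} ({exc})")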
Tung, that is actually an age old debate on most SEO forums.
My view;
Being W3C compliant doesn’t equate to good and/or relevant content.
Google ranks pages on their relevance & importance to a search term.
Unless the code is so bad that a browser cannot display it properly, users don't care whether a site is W3C compliant.
W3C-compliant code, more often than not, doesn't enhance the user's experience while on the page.
Google have http://labs.google.com/accessible/, which likely factors in good coding practice. But as you will see, it caters to visually impaired users.
IMO Google (or Yahoo, MSN, etc.) will never factor W3C-compliant code into their *main SE*, as it would only dilute relevancy and would cater to a minority at the expense of the majority.
Dave.. lol.. relax. The only theories going around are your higher than high theories.. imagine for one second that you are wrong. Ponder that.
“Nate, your snarky remark speaks more about you than anyone.”
Take your own advice.
Once again, thanks for your theories, they do not apply to my penalty.
Out, Nate.
Before I make my final exit, thank you to Google for listening to my guideline suggestions; much appreciated on behalf of myself and the many webmasters in the same shoes! I understand that it's a very fine line between spammers and genuine webmasters who are making an effort to meet the guidelines.
David, it's been fun, but there is no longer any reason to attempt to have a mature discussion with you, since you are incapable of accepting any theories or facts besides your own. You said it clearly with, "some people are their own worst enemies and cannot be helped."
You mention "links" in every post; please take a second to read my first reply to you once again. Here it is: "I agree, I am working on increasing backlinks, thanks!"
As the most recent outcome of this real penalization/filter, my PR fell from 5 to 0, and backlinks and the cache date are returning no info: a crystal-clear sign that a penalty/filter has been triggered on top of, or in replacement of, the original minus-thirty penalty. So backlinks, in Google's eyes, are impossible to view at this point, as they are being blocked from the Google Toolbar.
I work on increasing value-added content daily, and that is how I will earn backlinks. So Dave, it takes time to gain natural links, and those are the links that count. I avoid link schemes at all cost and will earn proper links naturally over time. However, to rank well again, I also need to find the cause(s) that has kept me penalized, since the minus-thirty penalty moves and keeps all results no better than position #31, including searches for your own brand/domain name. But like most minus-thirty penalties that I have seen (dozens), they have all been released after a successful reconsideration request. All positions return to pre-penalty levels within a few days, even for sites that had no change in content or links, etc.
That is what brought me to this topic started by Matt, to send a suggestion on how a webmaster is to find a cause of a penalty when all written rules are being followed.
I realize that you made up your mind about the minus-thirty penalty a long time ago, but do a quick search for the penalty on Google; it can't hurt to hear theories other than your own, right? Besides, it has never been proven that it doesn't exist, and all the facts and evidence clearly suggest it does. It is, in my eyes, a penalty.
Either way, there is nothing else that can be said on this item. We each have our opinions, and that's what life is about. I continue in good faith to build my site for my visitors, and will return once my site has regained Google's trust.
Dave, if you would like to continue this discussion, as I am making my final post here, visit us at seroundtable, WMW or Google Groups, where the minus-thirty penalty is discussed in a mature manner by both believers and those penalized; some great ideas come out and progress is being made. I'd seriously love your input there amongst the others.
Signing out,
Nate
Dave: do a series of searches on the Accessibility search, especially in areas that are traditionally spammy. One of the joys of the Accessibility search is that about 99.9% of crap gets removed, just because spammers are inherently lazy creatures who couldn’t give 2/10ths of a damn about accessibility issues.
I’m a believer that at some point, accessibility will be a factor for this reason and for reasons of political/social pressure from countries such as England where at least some level of accessibility in a site is a legal requirement now.
It's not clear with duplicated content. How does Google recognize the original content? Maybe by index date, maybe by the number of incoming links; some people talk about PR… It would be useful information for webmasters.
Accessibility and W3C compliance are surely different things? It is a good point about Government legislation – accessibility is for sure a legal requirement in the UK as Multi-Worded Adam states.
Also, it is now a legal requirement in the UK for a company to display certain information on its website including:
Registered Office Address
Company Registration #
Tax (VAT) #
Tel: #
and so on – but many many sites in the UK do not comply!
Whilst it is not entirely logical that such non-compliance should affect a SERP I can see a point in the future whereby commercial companies are “vetted” for a variety of issues and given a gold star rating or whatever – A Yahoo Directory+, or Trading Standards 2.0 if you please.
Another problem in the UK is companies that go bust and then set up a "phoenix" the next day with the same website and SERPs. "Olden is golden" is not much use here!
So, you heard it here first – How about a paid for “Google Commercial Vendor Quality Directory” with a paid annual submission and vetting?
Sweet…I learned not one, but two things today. That’s good stuff, Ray Burn. I never knew that about British companies being legally required to put that stuff on their websites.
How would that apply to foreign companies doing business in England?
And on a note unrelated to anything, I like your site… and more importantly, the concept it represents. Big up for taking some strain off of Mother Earth!
Hi Multi Worded Adam,
Here is a link on UK legal requirements (not sure if I can help with your question on imports into UK)
http://www.out-law.com/page-7594
and thanks for the kind words about our site….cheers!
Adam, think about that. The reason for less spam there is simply fewer users of that SE. Just as virus writers and hackers target popular software, spammers target popular SEs. IF Google used accessibility and/or W3C compliance in their main SE, spammers would without doubt adapt and still spam. Accessibility and/or W3C compliance is not a viable solution to stopping spam; at best it's a band-aid.
Without doubt, yes. I have mentioned this to Adam before in one of our debates.
I cannot see Google ever fighting spam at the expense of relevancy. For me at least, it just makes no sense.
Yes, I too can see a day where Countries (and/or States within Countries) take a tougher stance on accessibility issues. This may even extend to pressure on SEs to factor in accessibility, but not in their main search. Not more than they have now at least.
Google have already taken the initiative by creating a separate SE for accessibility issues. That alone speaks volumes IMO; they do not want a minority dictating to a majority.
Dave: you’re also assuming that they would use accessibility and/or W3C compliance and tell the public that it’s a factor. For all you and I know, it could be a factor right now; I doubt very strongly that it is, and it’s certainly not a big one. But the technology is there in at the very least a crude form.
Not only that, you’re also assuming that, if Google came out tomorrow and said that quality code was one of the ranking factors, most people would listen and understand.
Google has mentioned that they target link exchanges, and most SEO-types still link exchange.
Google has mentioned hidden text and links for SEO purposes, and people still hide text and links.
Google has mentioned redirects, and people still redirect.
Google has mentioned auto-generated content, and people still auto-generate content en masse.
Google has mentioned blog posting “contextual ad” networks, and people still participate in those, too.
Google could hire a series of skywriters, put up billboards everywhere, post onto every webmaster forum, blog, and portal there is, and most people still do the exact opposite of what the announcement implies that we’re supposed to do.
This still doesn't even take into account the single best reason for including code quality in an algorithm: it forces people who want to rank to learn how to code properly and build websites that are better for users.
This isn't a minority dictating to the majority, either; it's the end user getting a better product.
By the way, Ray Burn, thanks for that out-law.com link. Interesting reading material (and I forgot about out-law.com).
Adam, you missed this from me in reference to accessibility and/or W3C compliance : “Not more than they have now at least.”
Google cannot afford to be the slightest bit transparent in their algos so don’t expect any announcements any time soon 🙂
Also, I’m not so much assuming spammers would adapt, I’d bet money on it. History is the best predictor of future behavior.
Trust me Adam, Google will use a separate SE for accessibility and/or W3C compliance IF it becomes a meaningful factor and/or they are forced and/or there is money in it. They have a separate SE for accessibility already, so that's a clue right there 😉
Just as Browsers cannot afford to be too strict on coding, neither can SEs.
Sorry Adam, that is not true at all. The BEST website is the one that gives the user what they want. Users could not care less whether the site they are browsing has code errors; they only care that it displays in their browser, which 99% of sites with code errors do.
Why do you think the 2 most popular browsers are displaying ‘bad code’ just fine in most cases? Let me tell you, it’s because they, like SEs, cannot afford to be fussy about strict coding.
Code errors and proper coding are two different animals, Dave. Just because code “validates” doesn’t mean that it’s good code…it just means that the code validates.
And your percentage guess of 99% is way off. There are a large number of browser issues with both bad and good code…webmaster boards are littered with them. And look at all the sites that are still “optimized for Internet Explorer at 1024 x 768 resolution” out there…it’s a problem.
Better coding = a better user experience and is an important part of giving users what they want, browsers displaying crappy coding or otherwise. There’s nothing in the world that can be said to indicate anything else.
Adam, the topic at hand IS W3C-valid code being given weight in the Google SERPs. So now you believe Google should be subjective in ranking based on "good code", whatever that is. Who decides what "good code" is? Why should a page that is potentially less relevant and important to a search term be given an extra boost in the SERPs just because it has "good code", when both display just fine for the vast majority of users?
Code “optimized for Internet Explorer at 1024 x 768 resolution” is smart coding IMO, as that is what the majority of users browse with. Don’t confuse “optimized for” as not displaying in other Browsers or screen resolutions.
I browse the Web all day, most days; I would guess you do too. Very rarely do I encounter a site that gives me problems due to coding; in fact, it's extremely rare. I, like most users, couldn't give a hoot about the underlying code; it's the content that I and most users want.
Trust me, when the day comes, it won't be in their main SE.
Trust me again when I say, if giving *extra* points to Accessibility coding, W3C compliant code or “good code” were advantageous to Google users of their main SE, they would already be doing so.
Explain to us when an update is happening. I write for customers, and that usually means we rank well for relevant terms. But when you are on the first page for various terms and one day they drop to the 6th or 7th page, you begin to wonder: am I violating a Google policy? Will this correct itself in a week or so?
I understand that if we are writing for clients we shouldn’t care, but if that was 100% the case no one would even read the webmaster guidelines. Is there any way to truly know that updates are taking place or understand why after years on the first page and despite being “white hat” you might drop?
Generally much clearer – unfortunately in attempting to make things clearer, sometimes new ambiguity is introduced.
For example on http://www.google.com/support/webmasters/bin/answer.py?answer=66353 it says “using white text on white background”.
I feel that this should actually read "using text of a colour the same as, or very similar to, the colour of the background".
To people who understand the concept of hidden text it is obvious that the specific colour is irrelevant; however, to the average person, this is not so obvious.
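"Very similar to the colour of the background" can even be made concrete by comparing the two colours numerically. A minimal sketch follows; the hex values are illustrative, and the idea of a cut-off is an assumption for illustration, not anything Google publishes.

# Minimal sketch: measure how close a text colour is to its background
# colour. The example values are illustrative only.
def hex_to_rgb(value):
    value = value.lstrip("#")
    return tuple(int(value[i:i + 2], 16) for i in (0, 2, 4))

def colour_distance(foreground, background):
    fg, bg = hex_to_rgb(foreground), hex_to_rgb(background)
    return sum((f - b) ** 2 for f, b in zip(fg, bg)) ** 0.5

print(colour_distance("#ffffff", "#ffffff"))  # 0.0    -> identical (white on white)
print(colour_distance("#fefefe", "#ffffff"))  # ~1.7   -> visually indistinguishable
print(colour_distance("#000000", "#ffffff"))  # ~441.7 -> maximum contrast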
Just had a read through the guidelines and they seem clear to me.
Whilst reading through the guidelines, I came across one which states: "make pages for users and not for search engines".
Do you think webmasters are still concentrating more on building their sites so they are search-engine friendly rather than user friendly?
Very clear guidelines. But I was hoping I could get more detailed information about the ranking algorithm so that I can do a better job in SEO.
Hi Matt,
Have you got any thoughts on the removal of the following guidelines?
* Have other relevant sites link to yours.
* Submit your site to relevant directories such as the Open Directory Project and Yahoo!, as well as to other industry-specific expert sites.
Lots of people talking about it on the forums.
Bryn
Matt,
Why is Google treating URLs as case sensitive? Because of this, a whole lot of links on our website can show up as duplicate pages. We were working on the best way to avoid duplicate web pages and content with the same titles and meta keywords. Can't your algorithm consider URLs in lowercase when comparing pages?
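For background, only the scheme and host of a URL are case-insensitive; paths may legitimately differ by case, which is why a search engine cannot simply lowercase everything. The usual webmaster-side fix is to pick one canonical casing and redirect everything else to it. A minimal sketch of the normalization step, with illustrative URLs (on a live site this would typically be a 301 redirect rule in the web server):

# Minimal sketch: normalize URLs so casing differences don't create
# apparent duplicates. Lowercasing the path is a site policy choice here,
# not something required by the URL spec (paths are case-sensitive).
from urllib.parse import urlsplit, urlunsplit

def normalize(url):
    parts = urlsplit(url)
    return urlunsplit((
        parts.scheme.lower(),
        parts.netloc.lower(),   # scheme and host are case-insensitive anyway
        parts.path.lower(),     # site policy: serve every path in lowercase
        parts.query,
        parts.fragment,
    ))

print(normalize("HTTP://WWW.Example.com/Products/Index.HTM"))
# -> http://www.example.com/products/index.htm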
I understand that if we are writing for clients we shouldn’t care, but if that was 100% the case no one would even read the webmaster guidelines