A thread on WMW started Dec. 20th asking whether there was an update, so I’m taking a break from wrapping presents for an ultra-quick answer: no, there wasn’t.
To answer in more detail, let’s review the definitions. You may want to review this post or re-watch this video (session #8 from my videos). I’ll try to summarize the gist in very few words though:
Algorithm update: Typically yields changes in the search results on the larger end of the spectrum. Algorithms can change at any time, but noticeable changes tend to be less frequent.
Data refresh: When data is refreshed within an existing algorithm. Changes are typically toward the less-impactful end of the spectrum, and are often so small that people don’t even notice. One of the smallest types of data refreshes is an:
Index update: When new indexing data is pushed out to data centers. From the summer of 2000 to the summer of 2003, index updates tended to happen about once a month. The resulting changes were called the Google Dance. The Google Dance occurred over the course of 6-8 days because each data center in turn had to be taken out of rotation and loaded with an entirely new web index, and that took time. In the summer of 2003 (the Google Dance called “Update Fritz”), Google switched to an index that was incrementally updated every day (or faster). Instead of a monolithic monthly event, Google would refresh some of its index pretty much every day, which generated much smaller day-to-day changes that some people called everflux.
Over the years, Google’s indexing has been streamlined, to the point where most regular people don’t even notice the index updating. As a result, the terms “everflux,” “Google Dance,” and “index update” are hardly ever used anymore (or they’re used incorrectly 🙂 ). Instead, most SEOs talk about algorithm updates or data updates/refreshes. Most data refreshes are index updates, although occasionally a data refresh will happen outside of the day-to-day index updates. For example, updated backlinks and PageRanks are made visible every 3-4 months.
Okay, here’s a pop quiz to see if you’ve been paying attention:
Q: True or false: an index update is a type of data refresh.
A: Of course an index update is a type of data refresh! Pay attention, I just said that 2-3 paragraphs ago. 🙂 Don’t get hung up on “update” vs. “refresh” since they’re basically the same thing. There are algorithms, and there’s the data that the algorithms work on. A large part of changing data is our index being updated.
I know for a fact that there haven’t been any major algorithm updates to our scoring in the last few days, and I believe the only data refreshes have been normal (index updates). So what are the people on WMW talking about? Here’s my best MEGO guess. Go re-watch this video. Listen to the part about “data refreshes on June 27th, July 27th, and August 17th 2006.” Somewhere on the web (can’t remember where, and it’s Christmas weekend and after midnight, so I’m not super-motivated to hunt down where I said it) in the last few months, I said to expect those (roughly monthly) updates to become more of a daily thing. That data refresh became more frequent (roughly daily instead of every 3-4 weeks or so) well over a month ago. My best guess is that any changes people are seeing are because that particular data is being refreshed more frequently.
Matt
Thanks for such a detailed, informative post explaining the various data pushes and data refreshes. Do you really need to “refresh” our sites’ rankings so often? 🙂
The query operator changes recently are more interesting. Can you clarify what has been done with inanchor, allinanchor, and site in the past few months? You did mention earlier this year that site especially was getting some attention.
Hi Matt, thanks for the info. This particular data refresh seems to have resulted in a lot of personal-content sites (i.e. photos, free templates, and software) getting the heave-ho from the top ten spots. They seem to have been replaced with a load of if-you-want-free-templates-follow-this-link kind of pages. I know when these kinds of refreshes happen we’re told to wait and see, but I spent months waiting and seeing last time and nothing ever changed until the next monthly refresh. Do we write off another month of Google traffic and start again in February?
Cheers & Merry Christmas to all
Colin
Michael, inanchor: and allinanchor: are used by practically nobody except SEOs. And I’m including Googlers in the don’t-use-much column.
Thanks for the explanations. It’s good that Google includes new indexing results faster. I’m just wondering how it can happen that very old results are also appearing?!
The issues are especially with “supplemental results”: pages with cached data from January or February 2006 newly appeared last month. I’ve seen it especially on one domain, which I completely redesigned.
Those pages have been gone from the Google index and Google cache since May, as they have not existed since May; they were deleted with the URL console in May and send 301s/404s if you call them up from the Google SERPs (other search engines do not have them any more). They reappeared on all Google datacenters at the end of last month / beginning of December, and they are ranking in a site: request above all the existing pages.
I do not know if it is allowed to post the link here, but I am talking about the domain I placed in the “URL” field of this form.
So the question is: why does Google dig those old URLs back into the index and cache, and do I have to worry about them? Are they a sign that my domain is not healthy?
Sorry about my bad English – and Merry Christmas (or “season’s greetings”) to all readers from good old Germany.
Since I’m the top nominee for 2006 SEO PIA, I’ll point out that you didn’t really answer Michael Martinez’s question there; you just side-stepped it. The “site:” command is especially screwy, pushing supplemental results much higher. In many cases it also seems to be scrambling the SERPs much more. Previously there was some rough hierarchy to the way the results were ordered; now it’s wonkadoodle. For example:
http://www.google.com/search?q=site%3Awolf-howl.com
Clearly the most important or most-linked-to pages are not first. If you’re just going to “randomize” the results because you think SEOs are using the data, say so – it’s just easier for everyone involved. However, I would suggest making a more hierarchical option available in the Webmaster Central console. Sure, SEOs will use it, but only on their own sites, and it gives regular site owners some diagnostic tools for their site, much the same way people want real backlink information for their sites.
So now get back to wrapping and have a merry!
Can you explain what causes caches to “roll back” to an earlier version? One day I see a brand-new cache… the next day, one from a week ago.
One thing a few people are noticing with the site: command, is that it suddenly started indexing ws_ftp.log files, which resulted in a lot of cr@p being indexed and thrown into supplemental – and all those pages are now appearing first in a site: list. I see it with one of my sites, and it is also a site that was affected by one of the latest index updates. Now, those log files have been there forever, so it’s interesting that they are suddenly being indexed and/or showing at the top of a site: list. Perhaps they’ve always been indexed and are only now showing up in a site: list…hard to tell as they might have just been there all along but out of sight. Still…it may mean “something”.
And GrayWolf, I’ve been a royal PIA lately too. 🙂
Graywolf, I was trying to say gently that hardly anyone at Google pays any attention to inanchor: or allinanchor: other than to ask if the results are technically correct. Discussing the results for searches with that operator is like saying “I rank really well for the are-these-words-in-my-title operator” while ignoring the 99+ other factors that make up ranking. Rankings for inanchor:/allinanchor: simply can’t be generalized to search results, and people shouldn’t expect them to be.
DazzlinDonna/Graywolf, site: used to show purely random pages a year or so ago. Now site: tends to show shorter urls higher instead of a random order. I’m happy to see an example if anyone wants to post an example where a site: search looks strange, but I’ll pre-ask people to step into Googlers’ shoes and realize that supplemental results by themselves don’t indicate badness/penalties/problems.
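As a toy illustration of the ordering heuristic Matt describes here (shorter URLs shown higher instead of a random order) – this is purely illustrative, not Google’s actual implementation, and the example URLs are made up:

```python
# Toy sketch: order a site's indexed URLs the way Matt describes
# site: results being ordered -- shorter URLs first.
# Purely illustrative; NOT Google's actual implementation.
def site_result_order(urls):
    # Sort by URL length, breaking ties alphabetically for stability.
    return sorted(urls, key=lambda u: (len(u), u))

urls = [
    "http://example.com/blog/2006/12/some-long-post-title",
    "http://example.com/",
    "http://example.com/about",
]
for u in site_result_order(urls):
    print(u)  # homepage (shortest URL) comes out first
```

Under this heuristic, the root page and other short, top-level URLs float to the top of a site: listing regardless of their importance or link counts, which matches the behavior commenters describe.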
>supplemental results by themselves don’t indicate badness/penalties/problems
Unless of course the pages that are supplemental are the pages you want to rank 😉
>show shorter urls higher instead of a random order
Thanks for the explanation, but from a site owner’s perspective the most useful metric would probably be “importantness” within the site. Length of URLs? Well, that’s just wonky.
Matt, hope you have a great Christmas! I have been meaning to ask you about supplemental pages but hadn’t seen any recent threads that were appropriate; this one looks good. We are using Co-op CSE on our other site, a film reference site that does not get updated a lot. I believe just 4 of maybe 100 pages are in the regular index, and my question is: are there any plans, or things I can do, to get Co-op CSE to call up supplemental pages?
They are great resources for people in the film industry but IMO the CSE is pretty useless if it only displays 3 or 4 pages in the results. Any words of wisdom?
Hi Matt:
Merry Christmas.
I think my biggest concern is that I’m seeing some web sites getting booted into oblivion by simple ‘data refreshes’. Typically, after ‘data refreshes’ I see authority sites reigning supreme; however, I’m also noting questionable ones getting favorable placements (and by questionable I mean high PR (5+) but irrelevant content).
Are substantial drops (say, 200-300 positions) routine in data refreshes?
This is troublesome because you think to yourself, “self, follow the rules,” but when you see significant SERP changes as a result of ‘data refreshes’ you wonder, “OK, back to the drawing board, what did I do wrong?” Then the next question is, “do I follow suit with what Mr. Irrelevant Content did to get ranked?”
Of course the answer is no, but you can imagine the Google angst involved….. 😛
I have noticed that my backlinks keep on fluctuating on Google.
(By using link:www.yourdomain.com)
Sometimes it shows 50 backlinks and sometimes 20.
How can the backlinks decrease without actually being removed from the linking sites?
Also, why does Google show the fewest backlinks compared to Yahoo and MSN?
Thanks.
I just wanted to say that I’m loving the attitude. About time you started to throw down a little, Matt. Attaboy!
Really great for you to take out some time for us, thank you Matt.
Thanks for the update on this Matt!
I hope everyone has a great Christmas and Holiday Season!
Charles, I’ll ask about using CSE with supplemental results. My guess is that that wouldn’t happen in the short-term, but I’ll try to find out more.
I see where you’re coming from, Barry. My guess is that the variance of these particular data refreshes would go down over time, much in the same way that everflux eventually faded into the background to where people hardly notice it these days.
So if the DCs update faster than once a month, then why do I see such a huge discrepancy in the SERPs for one of my sites between the different DCs?
theworldsgreatestguy dot com ranks #1 in SERP for (“the worlds greatest guy”) on EVERY Google data center that I could locate but TWO.
http://64.233.167.147/
http://64.233.167.104/
Both of these have listed me as #30+ for the past 4-6 months. Why the huge difference and why won’t the DCs finally agree on a similar number if they refresh so often?
I am not seeking a fix to my specific site; I just want to know WHY it doesn’t work like I would imagine. It has me frustrated and I couldn’t Google an answer.
Thanks so much, Isaac
Matt
Does a data refresh affect the number of backlinks displayed by link: operator on different DCs?
Any plans to update the Google Directory? 🙂
Hi,
1) Matt, I realise that this is a mug’s game, but someone who I trust is blowing a gasket because his 10y+ old authority sites (and his competitors’ sites) seem to have been replaced in the G SERPs by a bunch of worthless non-authority stuff or even his internal directory pages! So, there *may* be something beyond all the usual GOOGLE==EVIL cussing going on.
2) An interesting problem that I’ve already taken up with AdWords support, and it isn’t losing me any sleep since it’s not a for-profit site: I created a new .mobi (i.e. handheld-friendly) front-end to an existing site that you treat well (and whose mirrors you treat well), but the Google SERPs don’t like the .mobi, and AdWords won’t give it impressions even though its CTR is actually the best I have anywhere, and the AdWords support people agree that I seem to be doing everything right with a bona fide site. So the generic question is: are your normal duplicate-content filters confused by material re-presented for other media (in this case, vastly stripped down), or do they understand the necessary difference between guideline-conforming .mobi sites and the ugly-but-works HTML presentations of the same material?
Rgds
Damon
>Now site: tends to show shorter urls higher …
… now I know why you crawl from short to long 🙂
http://www.cre8asiteforums.com/forums/index.php?showtopic=38021
As far as I understand, there is something important going on: many people are complaining that even though their positions on queries did not drop, click-through is significantly lower. The explanation for this is that Google is pushing inserts to the top of the listings – movies, products, news, ads, notices for Blogger, Google Base (Base is very important), Google Local (this is a real pain), music, images, stock quotes, etc.
To be honest, Google Base especially – with very low-quality results full of spam – plus the images, news, and stock quotes are annoying and harm the user experience. When Google puts its own service at the top, ahead of the world’s best competing services, what’s next? The blackmailing has started. Google has made a major change and is hijacking all the important queries with its own services, which are far worse than competitors’. Google’s crawler, index, ranking, and finally its web service are purely $$$ driven.
I have just checked over 30 queries that brought me most of my traffic in recent months; my site is still listed, but the main reason click-through dropped fourfold (four times! or from 100 to 25, for those who like percentages) is Google Local.
Adding Google Local at the top of the results is the real reason traffic dropped. I am sure that “vacation rentals” is not the only vertical that is doomed by Google Local, Google Base, or some other service. I also hope that users will become “top link blind” once they realize that the services Google forces on them are not as good as some other specialized sites. Your business, your rules… but in my humble opinion you are pulling the strings too hard.
Hi Matt
I wonder if this is a good time to ask a question about backlinks?
Our site has a fairly large number of backlinks in total, but for the last 3 or 4 months the actual figure shown for link:www…. appears to fluctuate daily – yet it always shows 1 of 3 figures: either 127, 1000, or 1250, never any other figure. It changes from day to day, possibly even several times a day (although I could be imagining that, to be honest).
Are you able to explain how or why that might be happening to us? As I say, it started doing this only about 3-4 months ago at most.
Many thanks for any insights.
– Mark McNeece.
RE: “So what are the people on WMW talking about? ”
========================================
I’ve been seeking an answer to that since its inception. Most people there are so obsessed with Google that they totally ignore their own users’ needs.
Hi Matt,
Happy Holidays!
Is it possible to gain any perspective on a filter and how it fits into the system (i.e., is it a data refresh)? Taking it further, how about the error rate a filter is allowed in the name of accuracy?
By that I mean: if you add a filter and it removes millions of spam/duplicate sites, what is an acceptable percentage of sites that are not spam yet may still be filtered? Is this even possible to calculate? (Perhaps a future blog post.)
Matt,
I have noticed in the last little while a dramatic drop in the number of indexed pages on a number of sites (my own and a number not my own) when using the “site:” operator. How does this relate to what you are talking about here with the refreshes, and could this be why people think there was a major change?
Hi Matt,
“I’m happy to see an example if anyone wants to post an example where a site: search looks strange,…”
http://www.google.de/search?hl=de&q=site%3Awww.poezdka.de&meta=
Supplemental results (pages that no longer exist) are at the front of the results. Nothing random about it: first the old supplemental results, then the usual pages as they have always been shown before.
So why is it strange? Those pages didn’t show up in any request at Google for months. Then they appeared one month ago.
Again, Merry Christmas to all 🙂
Hello, Matt!
Last night 5 of my sites were dropped from the index. Google Sitemaps shows ‘no pages included’ when yesterday all was OK!
All I can see in the referrer logs for now is that if a user searches Google for my domain name, the link to my site is on page 3, and everything before it is doorway crap.
Also, at another site (which is less affected), the results look like they don’t stop dancing: 5 minutes in the results, then gone.
It looks like something really happened today, maybe an ‘update’ error or something like it.
By the way, I’m talking about old, popular sites (more than 4-6 years old): exclusive content, good positions at Google before, Yahoo/DMOZ listings, etc.
Yes, there is in fact a massive algo update going on right now:
😮 . 😮
Do a search for SEARCH ENGINE OPTIMIZATION
on most datacenters, there is a new champ after 2 years.
In fact the SERPs are completely different for many, many competitive terms
http://www.mcdar.net/q-check/datatool.asp?yoursite=en.wikipedia.org/wiki/Search_engine_optimization&keyword=Search%20Engine%20Optimization&bank=.400&Pages=
Hi Matt,
First Merry Christmas!!
“In fact the SERPs are completely different for many, many competitive terms”
So my site has dropped from 3 to 15 on one keyword search and is totally absent on others.
Is there any way for me to determine why it dropped so much, and what to do to correct it, given the changes that Google has implemented?
Keep up the great work, and thanks,
Bill
After thinking about it some more Matt, I’m wondering if you can shed some insight re: ‘consistent data refreshes’. Is there any form of ‘old data’ being recycled in ‘refreshed results’? (Data maybe 1 year old?)
Again, Merry Christmas :).
After the last Google “update,” quite a lot of mature software-developer sites were wiped from the SERPs. Positions on traffic-generating keywords dropped by 200-700 places.
Let me tell the truth: today Google can’t list straight sites offering real stuff. Instead, affiliates/resellers are listed – sites with completely stolen content and fabricated “articles” made just for AdSense. Just look at the subdomain spammer qarchive.org.
Also, can anyone tell me why the SERPs on different datacenters are so different now? On google.com the position is #1, but on google.de it’s #200.
I’ve seen many weird results lately, and I’ve seen weird stuff with the supplemental index. If it’s not a bad index, and doesn’t give any problems, then what is it?? 🙂
Hope you all had a nice christmas evening 😉
Hi Matt,
First off, thanks for the very useful blog.
How come, if there isn’t a periodic update but rather small continual updates, we often see many webmasters complaining at similar times?
I have 3 MySpace resource sites, all completely different, on different servers, with different hosts, etc., yet all 3 sites had massive drops in traffic from the Google SERPs around the same time (2 weeks ago).
As you can see from the website supplied, it’s a whiter-than-white, clean, W3C-compliant website, yet we still see drops.
Many others seem to have seen negative results, as you can see at http://forums.digitalpoint.com/showthread.php?t=203431
So what are we webmasters doing wrong?
Don’t worry Matt. We still use the term Everflux correctly all the time at PlumberSurplus.com. Thanks for the post.
Merry Christmas!
Tim
Thanks, Matt, for the update and the detailed information on data refreshes; it clears up some doubts. I hope you are relaxing – have a great holiday!
I’m liking this continuous updating of everything (or whatever you want to call it). The days of waiting for the next update are over, and I am happy about it. Now we just have to wait for a page to be indexed, and the results pretty much always come in on the weekend.
So I guess it’s not everflux, unless your conscious mind is only active during the weekend, of course. 🙂
But seriously: all noticeable changes in the SERPs seem to happen during the weekend. Is that the experience of others as well?
As for me, positions are still very low and vary quite a bit on different DCs for my keyword set. Only ONE relevant site is always on top in my niche; the others are spammy “directories” and affiliates.
Hey Matt,
I wrote about the supplemental index several months ago. I still don’t get why the supplemental tag is applied to some types of pages and not others. Thank you for pointing out a bit more on algo updates. Could you talk a bit about the supplementals and how that works?
You end up with searches like this:
http://www.google.com/search?q=A+provision+in+a+mortgage+or+deed+of+trust+that+allows+the+lender+to+demand+immediate+payment+of+the+balance+of+the+mortgage+if+the+mortgage+holder+sells+the+home&num=100&hl=en&lr=&as_qdr=all&filter=0
Where obviously everyone is copying everyone else and yet… where are the supplementals? They can’t all be the original source of info.
Yet other searches have intentionally duplicated content and supplemental tags in the search results: http://www.google.com/search?hl=en&lr=&q=%22The+Day+the+Earth+Stood+Still+is+one+of+the+many+classic+movies+mentioned%22&btnG=Search
Thanks! Keep up the good work.
Scott
About a month ago I started using Google Webmaster Tools. The Query stats tab shows irrelevant data for my site: it displays a search query at an average position of #4, but if I check the SERPs I find my site near position 700 for that query.
Matt, thanks again for clarifying the correct terminology there. I have to say, though, like Graywolf we have seen some bizarre stuff happening. I commented on SE Roundtable on Barry’s first article about this – you can have the screenshots if you want – pure junk coming back on the site: operator for affected domains, but it seems to be stabilising since the 19th. It looked more like some kind of accidental data corruption than an intentional update.
Belated Merry Xmas to you and the family.
I’m recovering from surgery here – Merry Christmas to all as well. Matt, we are SEOs at Bizresearch – I’ve been doing this for nearly ten years – and I can’t believe how much we’re all writing on Christmas weekend/week about supplemental results, but I, too, have questions about them. Our client, Pier 1, has the FAST site search engine in use, which generates two different types of URLs, including URLs of the kind you and Vanessa have noted as not search friendly. We’re working with Pier 1 and FAST to create a solution. In the meantime, we noted that Google picked up the shorter versions of the URLs, which are the breadcrumb versions, so we began to link to these shorter versions. However, one problem – they’re all going into supplemental. We’re hoping to get the IT team to update the robots.txt files so we can possibly exclude the non-friendly versions. Do you have any other recommendations specifically for Pier1.com? Thanks Matt.
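As a minimal sketch of the robots.txt exclusion mentioned here – the path below is only a hypothetical pattern based on the gateway URL discussed later in this thread; the real rule would have to match the site’s actual FAST-generated URL structure:

```
# Hypothetical sketch only: keep the long, non-friendly search-engine
# URLs out of the crawl while leaving the shorter breadcrumb-style
# URLs alone.
User-agent: *
Disallow: /catalog/gateway.aspx
```

Note that robots.txt only blocks crawling; the shorter breadcrumb URLs would still need inbound links for Google to treat them as the canonical versions.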
Today’s results provide more spam than I ever saw, and one group of people especially wins: they use Google Groups to get rankings.
It’s very disappointing to see blackhatters win more and more often. :/
Sometimes I wonder whether I should go with the blackhatters.
#### ….”It has URLs which you and Vanessa have noted as not search friendly. We’re working with Pier 1 and Fast to create a solution”….
I’d think that solution would be easy, right? You start creating search-engine-friendly URLs and do 301 redirects from the old to the new. If the site is on an Apache server, things should be fairly straightforward with a mod_rewrite.
#####….”In the meantime, we noted that Google picked up the shorter versions of the URLs, which are the breadcrumb versions. We then began to link to these shorter version of URLs. However, one problem – they’re all going into supplemental”….
If you don’t have 301s in place, Google won’t know which URL you want to use and will pick one for you – and normally it won’t be the right one. You can’t have both URLs in the index leading to the same page. Using the 301 tells the bot to dump the old URL and pick up the new one.
This all sounds like most any database-driven website with bad URLs. It should be easy stuff to fix.
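As a sketch of the 301 approach described above, an Apache .htaccess rule might look like this – the URL pattern and the `id` parameter are hypothetical, just to show the shape of the rule:

```apache
# Hypothetical sketch: permanently redirect an old database-driven
# URL to its new search-friendly equivalent.
# e.g. /product.aspx?id=123  ->  /products/123/
RewriteEngine On
RewriteCond %{QUERY_STRING} ^id=([0-9]+)$
RewriteRule ^product\.aspx$ /products/%1/? [R=301,L]
```

The trailing `?` in the substitution discards the old query string, and `R=301` makes the redirect permanent so the bot drops the old URL and picks up the new one.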
Oh my.
http://www.pier1.com/catalog/gateway.aspx?fh_location=%2f%2fpier1direct%2fen_US%2fcategories%3c%7b110296%7d&fh_refpath=facet_59232842
My firm would rebuild the backend instead of screwing around with URLs. The entire backend is a huge mess; it needs a new CMS, etc., and needs to be completely redone. Without researching further, I doubt you could do a 301 and magically get things to work. It’s a mess.
Surely Pier 1 knows they have programmers without a clue about search engines, right? I suggest a total backend revamping. I’d say that firm is not lacking for money, so they may as well build it correctly.
sheesh. I didn’t mean to link to them. Can someone please edit it to not be a link?
Matt,
“I’m happy to see an example if anyone wants to post an example where a site: search looks strange,…”
Many sites in Germany have been showing strange supplemental results and bad rankings since the beginning of December. I think there was a bad data push on December 7. 😉
Example 1
http://www.google.de/search?hl=de&q=site%3Awww.firmen-banner.de&btnG=Google-Suche&meta=
Example 2
http://www.google.de/search?hl=de&q=site%3Awww.heureka-shop.com&btnG=Suche&meta=
Example 3
http://www.google.de/search?hl=de&q=site%3Awww.mode-und-geschenkideen.de&btnG=Suche&meta=
Chalsie
Hello,
Do you have any comment/explanation on the recent ranking issues that have been experienced by many adult & sexuality sites/bloggers?
See:
http://www.tinynibbles.com/blogarchives/2006/12/google_is_broken_1.html
http://blog.babeland.com/2006/11/28/google-delivers-lump-of-coal-to-babeland/#comments
I’m looking at rankings for different keywords and still find that the top results for the term “sökmotoroptimering” in Sweden are companies that violate Google’s guidelines. They put links back to their own site in customers’ sitemaps and in other places on their customers’ sites, and this is done without telling the customers. When can we expect an update on this matter in Sweden, so this kind of linking will not influence the results? When can we expect an update to several algorithms in Sweden, so there will be less spam in the results?
For the nth time, it seems that Matt is side-stepping the issue. He devotes 8 paragraphs to defining “data refresh” vs. “index update” – which is just semantics. The real question being asked on WebmasterWorld is WHY there were such drastic changes during the last month. Unfortunately, Matt ends his blog with a weak guess about refresh frequency. I think it is clear and obvious that there is no logical reason why an increase in refresh frequency has anything to do with this (i.e., authority sites dropping 100+ ranks one day and then recovering days later for no obvious reason). Now if Matt had said, for example, that Google didn’t have enough server resources to handle the increase in refreshing billions of web pages – now THERE is an answer the webmaster community would find useful (although in that case, I would ask why Google is increasing the refresh frequency if it can’t handle the increase in server load). Unfortunately, instead of a useful response, all we get is either silence or a weak guess.
So the index update is what people used to call the Google Dance, when it happened on a monthly basis. Now it happens on a daily basis and they call it everflux.
Backlink updates and PageRank updates are also types of data refresh.
So, the big scary updates like Florida and Sandbox are algorithm updates!?
Thanks for taking the time to explain this.
Art, I’m not trying to side-step the issue. I believe that a data refresh which used to be every 3-4 weeks is now happening more like every day. So the changes in ranking that some people were seeing on the 17th or 27th during the summer months can now happen every day.
Senaia, that’s not a bad summary. Florida and Jagger were changes in our algorithms to score documents, for example.
Matt, I have just posted a message on a WPW forum regarding the recent issues and your replies here, which is basically my opinion addressed to Art.
I would appreciate it if you could comment on this possibility (regarding unstable rankings):
“Well, I DO think that an everyday refresh frequency could cause such ‘drastic’ fluctuations in rankings.
The rankings are now the result of a constantly changing data set whose factors – new/old global pages, filters, frequency of crawling, etc. – are no longer ‘stable’ as they were before, when the factors were tied to a snapshot of the data set that stayed valid for a few weeks.”
OK, so it’s somewhat like tossing a mixed salad? Before, Google would update/refresh the “salad” every month or so, and affected veggies would talk about it as it happened. Now the salad is being tossed daily, and there are hundreds of complaints vegetating online. OK, so that explains part of what’s going on. But why so much frenetic tossing before the holidays? Why were the choicest croutons, which used to be at the top of the salad, suddenly marinating in sauce at the bottom of the bowl for days before resurfacing? Why isn’t the daily tossing a little more gentle? (As indicated in the WmW threads, the tossing is/was affecting large portions of the salad, not just rotten veggies.) This is the unanswered question being asked by the webmaster community, particularly those who lost holiday sales during all the hard tossing, and this is what I had referred to as “side-stepping”.
So, it looks like much of the site: “supplementals coming up first” problem is resolving now. I’m guessing either a bug was fixed or a new, clean data refresh has taken place. Good to see.
Doug et al.,
Large corporate sites typically have extremely talented IT teams behind them. It’s actually much more complex than merely rewriting or redirecting URLs; there are numerous factors in the URLs that cannot be easily modified by anyone, regardless of their amazing talent. What I’d like to do is ask Matt and Google to consider the FAST site search URL structure as one that not all retailers can control, and to consider it acceptable. It’s not an easy fix – if it were, they’d likely have done it a long time ago. This is a challenge that we need others involved in, in order to come up with a solution.
Thanks.
HI Matt,
How long does it take to update the indexes? I am in the UK, and it seems that all of a sudden my main page has been dropped to page 14 and beyond. Is this me, or is this the update? I could really do with some advice.
James
Matt, that was a wonderful explanation of how things, especially updates, work at Google. But still, sometimes we see sudden upheavals in the ranking of a site for a keyword – why does that happen?
R.Anand
http://www.AroundDelhi.com
Is this the same for the other regional Googles, like google.ca and google.co.uk?
So does a PageRank update fall under a category of its own? From my experience they tend to last for a week or longer (until they finish).
Is the PageRank shown on the Google toolbar current? If not, how old is it?
The Google Toolbar is showing the website’s actual PR. The PageRank changes after the periods described here: http://www.seocompany.ca/pagerank/page-rank-update-list.html
Matt,
Let’s assume that we have done our best to implement the inanchor/allinanchor factors in our new site during implementation, and assume that the SERP results are at the bottom of page 1 or on page 2 after getting indexed. Do the SERPs change if we leave the SEO alone over a period of time? Can a page change its SERP position on its own? My reason for asking is that I would like to know, statistically, what a reasonable period of time is to track SERPs without making any changes, in order to understand the true effectiveness of the SEO and usability implementation, or of SEO changes.
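On the question of how long to track SERPs without making changes: one simple approach is to log the observed position each day and summarize a fixed window with a mean and standard deviation, so that ordinary day-to-day everflux noise (small spread around a steady mean) can be told apart from a genuine shift. A minimal sketch, using only the Python standard library and hypothetical rank numbers:

```python
import statistics

def rank_stability(daily_ranks, window=14):
    """Summarize the last `window` daily rank observations for one keyword.

    Returns (mean, stdev). A small stdev around a steady mean suggests
    ordinary everflux noise; a shifted mean suggests a real change.
    """
    recent = daily_ranks[-window:]
    return statistics.mean(recent), statistics.pstdev(recent)

# Hypothetical two weeks of observed positions for one keyword
ranks = [12, 11, 13, 12, 12, 14, 11, 12, 13, 12, 11, 12, 13, 12]
mean, stdev = rank_stability(ranks)
print(round(mean, 1), round(stdev, 2))  # prints "12.1 0.83"
```

With day-to-day spread this small, a week or two of observations is already enough to see that the position is stable; a larger spread would argue for a longer window before judging any SEO change.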
I still have a question about indexing. What is the minimum number of days between site revisits in Google? I know it’s different for every search engine.
Thanks Matt.
As a newbie I was biting my nails to get an answer about when the updates happen and how they work, and you helped me out with the information on your website. 🙂
I am so convinced by the kind of information you provide on your website that I actually posted your link on my website blog (http://internetmarketing.find2k.com/) so that people can learn more about Internet marketing.
Good to see some insight into how Google works. There is a lot of misinformation out there.
Hi! I’ve been watching the changing PR. My impression is that the current change is different from previous updates. Can anybody out there confirm?
http://www.rosehr.info is now PR4, and most pages have gained a PageRank; most are now PR3. But I noticed some pages from highly ranked domains have gone down. Greetings
Hi, can anyone give me a brief explanation of the Florida, Jagger, and Big Daddy updates? What happened in each of them? I was unaware of all these updates, and right now I have very little time to learn everything about them.
Hi Matt.
Interesting information. I cannot imagine all the new information that is indexed every day, much less how search engines handle all that stuff, but that’s good for everyone; fresh, recent content is one of the favorite reasons to surf the web.
Regards.
Hi Matt
Interesting information. Is it fully updated?
Regards
Is this the twilight zone or what?
How come it seems that I feel the effects today of changes you made years ago? It’s totally weird, but every time I experience an issue, I find that it was caused by algorithm changes from years ago.
I would just say that PageRank is just one of the factors that are important in ranking web sites, but as we already know, many web pages rank much better with a lower PR. So think about which way you will go and where you will put more effort.
Yesterday I noticed my site’s index page had dropped from a mediocre PR3 to a dismal PR2, and subsequently my site has dropped out of the rankings. Yet there has not been a toolbar update for over 180 days.
Is this a glitch in the Google algorithm, or has my site been downgraded until the next update?
Nice info, though: how does PR change? What are the factors for a higher or lower PR?
I noticed some new pages with fewer visits have a higher PR than some older sites with more visitors. So age and visitors aren’t the deciding factors?
Thanx, S
Hi Matt, what I see is different from what you said. Could you explain it for me, or for all of us? Daily index updates exist, but I have another scenario: sometimes my site gets a boost in the SERPs for all of its keywords. It doesn’t seem to be an index update or an algorithm update (it happens to no other sites), and it happens to many sites at different times. Webmasters, do you notice this too?
Hi Matt, thanks for the info. I have seen a number of large movements back and forward through the rankings of late on a number of different sites I operate. Is this Google testing algorithms, or just a usual Google Dance, as you call it?
Thanks again for clarifying the correct terminology there.
That’s a great question. I have a Canadian company with a .ca domain, and I am only targeting Canadian searches. Most of my backlinks are from .com and .net sites, but I would like to know if I need to invest more time in .ca backlinks.
I can’t believe I never stumbled onto this post before; it explains almost everything I ever wanted to ask about Google Dances, algorithm updates, and data refreshes.
Thanks for a great post.
I might have to read this one once more 🙂
Have a nice day
thnxx
hi,
These days I am not getting top rankings in Google for my simple keywords (like “carpet cleaning sydney”). What has happened to Google nowadays? Please help me bring my keywords to the top in Google. What can I do for optimization? Please give me a solution.
Hi Matt,
When these changes happen, could you expect your position in Google to drop drastically, or to disappear temporarily from certain searches and not others?
Looks like the index refreshes based on the category of a search. I could only imagine that the News category and related search terms would update more frequently than the results for a “local plumber”. Also, I’m sure the new impact of social feeds in Google search will have a significant effect on data refreshes. When I was at a Dolphins (NFL) game this year, I was able to find out about a player injury faster from a Google Tweet feed in the results than by asking people around me in the stadium. How did Google know to switch the search result for the quarterback to the Tweet feed? My guess is the algorithm is sophisticated enough to study social patterns. Are you all studying seismic patterns at Google? It sure seems you can detect social earthquakes and respond with a search shuffle. Cool stuff!
Awesome post and a great read. It’s a pity I did not stumble upon this some months ago; it would have saved me a few strands of hair. I’m interested in a comment by Zack: “I could only imagine the category of News and related search terms would update more frequently than the results for a ‘local plumber’.” Being in HVAC, I am of course interested in whether Google might learn to treat local business searches as diligently as it does News-type searches.
Thanks for the tips Matt!
Hello Matt,
This is Tarun Singh from Delhi, INDIA. I am working as an SEO at an event management company. The reason I’ve come to your blog is that I have more than 25 websites, all based on WordPress. Tell me, how can I handle them all at the same time? Is on-page optimization alone OK, or do I need to do off-page optimization as well?
A little late in reading this, but as always a very inspired and informative post. Keep up the good work.