Pubcon notes, part 1

I’m here at WebmasterWorld Pubcon. Jumbled bunch of thoughts so far:

  • The timing worked out to announce a new webmaster console in Google Sitemaps. I signed up today and it’s pretty sweet. For example, you can now see crawl errors, page timeouts, robots.txt errors, unreachable URLs, etc. Just really useful hard data that tells you whether you have crawl problems and what they are. And you don’t need a sitemap to use this functionality; you just create an empty file to verify that you own the domain. Check it out.
  • Yahoo! knows how to throw a party. They threw a fun shindig at Pure last night with good food and free drinks. I stuck to Sprite, but several webmasters were drinking with both hands.
  • Lots of people liked the price of Google Analytics (free). More to post on Analytics later.
  • I finally got to hear Jeremy Zawodny speak (we always seem to be on different panels at the same time). He’s making hundreds of dollars a month from AdSense, and he hasn’t gotten around to using YPN yet. He said he’s gotten a “talking to” three times over the last three years. I haven’t gotten a “talking to” from my company yet, and would prefer not to.
  • The conference went pretty smoothly, or at least the panels I was on did. The coffee talk Q&A was full of juicy questions that I’ll try to recap later. Lots of people have said hello today, which is nice. I’ll be at Pubcon tomorrow, so please come up and say hi if you see me.

I’m heading downstairs to the hotel lobby to hang with webmasters for a few hours; catch ya later.

93 Responses to Pubcon notes, part 1

  1. How about your business cards? Did you opt for the spider monkey or did he get drunk @ yahoo?

  2. Looks pretty sweet… Hopefully once there is some data in it, Google will finally tell me why it broke up with me. πŸ™‚

    http://www.digitalpoint.com/~shawn/2005/11/google-broke-up-with-me.html

  3. Hi Matt

    Looking forward to reading your comments about those juicy coffee talk Q&A questions πŸ™‚

    Have a great day wherever you are.

  4. Hey Matt –

    Excellent job in your talks and the site reviews today, and it’s always nice to get a chance to chat with you.

  5. Hi Matt

    Would be pretty cool if Sitemaps showed which URL Google was treating as the root – e.g. if it picked domain.com over http://www.domain.com or http://www.domain.com/en/index.html etc.

    And then obviously if it was wrong the ability to change the root would be extra cool πŸ™‚

    Just a thought.

    Enjoy the rest of the conference.

    Cheers

    Stephen

  6. What’s this, Matt πŸ™‚ Google Reader πŸ™‚ http://www.google.com/reader/things/intro

  7. Hi Matt,
    On the new Sitemaps features, we can read:
    “Top search query clicks are the top queries to Google that directed traffic to your site (based on the number of clicks to your pages in our search results).”
    Does this confirm that Google tracks all clicks from the search results?
    Thanks

  8. Hi Matt,

    That’s so good that you are keeping us posted on everything about the PubCon.

    BTW what’s YPN?

    Thanks,

    White

    Well, I guess that’s the Yahoo! Publisher Network.

    Please correct me if I’m wrong.

    Thanks,

    White

  10. With the recent launch of all these new tools and services, Google is clearly no longer just a search engine company. Google is now more like a “Web Intelligence Agency”, or shall we say “Webnopoly”.

    Their products – AdWords, AdSense, Reader, Sitemaps, Analytics, Toolbar, etc. – collect and send a lot of information to Google. If I think of the Internet as its own country, Google is a very scary entity.

  11. I like the query stats but…

    would it not be possible to make it a bit longer?
    There are only 5 entries, and 4 on the right and 3 on the left don’t interest me.
    It is nice, but these queries are the ones I happen to rank for incidentally, not the ones I would *like* to rank for (you’ll also see that one query on the right and one on the left are listed twice).
    Would it not be possible to see stats like how many clicks per day/week, and whether the number is increasing?
    Maybe it should be possible to remove the uninteresting queries so you get stats on just the interesting ones.

    Just another question: I rank for many queries that I don’t try to rank for. Are those clicks helping the ranking of the domain as a whole, or are they worthless since I’m not trying to rank for them?

  12. german, you can’t get visits per day and such stuff from Google’s click tracking on the SERPs, because they don’t track SERP clicks permanently, try Google analytics instead.

    I like the new site stats, especially because they are a result of Google listening to Webmasters (in the sitemap related Google groups).

  13. G Sitemaps is a good tool because admins can now check whether there are errors in their code before bitchin’ at Google. Just another example of how these tools help G and us at the same time, and also a reason why they should be free. We help you grow, just don’t forget this Google! πŸ˜€

    I am watchin’ Pubcon via good ol’ G Alerts, haven’t plinked around in G Feeds yet, wrote up a blog about “alerts” and seo today that nobody will likely see but scrapers, until my new site passes the year mark, oh well.

    Thanks for letting us know Matt, I notice Aaron Wall was sending off data on you to his blog from there, heh heh

    -Aaron

  14. Google has been tracking which ads you click on the SERPs for a long time now.

    In fact, one of their recent patents mentions a method of ranking websites based on the amount of clicks, and analyzing the amount of time spent on page to determine relevancy.

    That is to say, it’s also possible Google is tracking whether or not you visit the page, then click the back button and come right back. Obviously, if you click a link, and 2 seconds later are back at the search results, then that page probably wasn’t what you were looking for.

    With search, Sitemaps, Analytics, etc., it’s only a matter of time before (like somebody else said) every click somehow touches a Google server.

    Even scarier is the fact that the Patriot Act gives the government permission to tap into this database for any reason, without telling us.

    I trust Google, but I’m not sure I trust my government with all that information.

  15. While it’s not exactly the type of data that I made reference to in my comment on the ‘More Info on Updates’ post back in October (http://www.mattcutts.com/blog/more-info-on-updates/#comments), I’m pleased to see that G is doing something to provide webmasters with feedback on ‘How Google Sees You’.

    Now if it could just have an indicator saying ‘Sandbox Level’……;)

  16. Hi Matt,

    I love the concept behind the new Sitemaps functions and was thrilled to see them. However, I was a little let down to see just a “Top 5” search queries – to reiterate what german said above.

    It’d be great to know which terms a site ranks for overall, as I already know the top 5.

    Beta Phase 3?

  17. Hehe David…

    Matt,
    I’m sure Yahoo did all of that to help convince you to move over… that email was really for you. They were planning to get you drunk and throw you in a van and take you to the Y! HQ and pick your brain while teasing you with dirty lingerie and tube socks…

    Anyways, back to the post. I really LOVE the new update to Sitemaps. Gives a much better view on what our sites are doing and how googlebot is reacting to it.

    Do you have any more weather reports on J3? My site took a huge hit (actually, it went down for some keywords and up for others; I suspect the ones it went down for were just searched more often, since traffic dropped a bundle) and I’m wondering whether the results will stay this way or whether more tweaks are on the way.

  18. Sebastian,

    My site is not a new site anymore; it will have its first birthday in the next couple of days (it got partially out of the sandbox in June, and through this update for the most important terms). My problem is that I am ranking all over the board: yesterday I ranked for “seek technical job in Russia”, which has nothing to do with my business. There are so many keywords I rank for, some of them actually competitive too, but out of the 5 keywords Google shows me my name twice, my surname once, one slightly related term, and one important one. I don’t care about people typing my name into Google; I want to know about the other terms, but people are actually clicking on my name first in Google.

    Actually, I know what people are typing; it’s in my logs. The “start=” part of the referring URL also shows the results page I was found on. For other webmasters it could be interesting, though.

    I know they must be tracking SERPs. My SERPs are higher on days when I put myself out there and have a lot of traffic, and they drop on weekends when there is less traffic.

  19. german, unlike others, Google tracks clicks on SERPs only sporadically. Mouse over the links on the SERPs: sometimes you see a tracking URI which redirects, sometimes the real URI. The top 5 clicked search terms on your stats page are made up from a tiny fraction of all user clicks (on SERPs only), so they are useless for low-traffic sites. Whether, and if so how, current rankings are influenced by user data is still secret sauce. Those who know won’t tell. Tracking clicks on the SERPs is only one chapter of the whole story.

  20. Hey Matt,

    I know it’s probably just me being anal, but I wanted to point out a grammar error on your page. I know I’d want someone to point one out to me if they found one:

    “The biggest change for new users: you can now add a site to your account even before create a Sitemap for it.”

    “even before creating a sitemap” or “even before you create a sitemap”

    Anyways, just wanted to let you know, say “Hey” (we’ve briefly met a couple of times at conferences), and say thanks for all of the updates and contact you have with the community. πŸ™‚

    Ben Wills

  21. Just to say great presentation yesterday Matt!

    Remember, what happens in Vegas stays in Vegas *Cough* MSN Search comparison *Cough*

  22. Matt,

    Jason Lee Miller at Webpronews has published a transcript of your Q&A session at PubCon. I’m interested in the following quote (which I’m assuming is accurately quoted):

    “On original content and credit:

    “We’ve got some projects underway to help determine β€˜who wrote this first.'”

    I am glad to hear this, because I’ve seen first hand evidence of the algo getting things somewhat wrong in this area. Up until now, I’ve not made reference to specific pages in comments here, but in this case I think it is necessary to illustrate the point:

    On November 12 I published an article with the heading “Bill Gates Points Towards ‘Anything On-Demand’ Future”. A fairly long and specific phrase. Much of the title is repeated in the text. The location is http://www.madeforone.com/news/20051112-Bill-Gates-Media.html .

    When I do a search with the exact title of the article, the original article appears anywhere from 7 to 30 in SERPS. Many of the results above it are pages that legitimately linked to the article, or quoted part of it with a link to the original, so I wouldn’t consider them to be spam. However, they still outrank the original page.

    Other results that appear in the results above the original article are unrelated pages.

    It does seem that there is a difficulty at present for the algo to determine the originator of a particular piece of content.

    On the other hand, perhaps, the reason for the poor showing of this particular page for its own title is related to the overall difficulties for the site in question A.J. (After Jagger). I’ve yet to discover the reason for these difficulties.

    I filed a ‘dissatisfied with results’ feedback report on this, and can only wait and see what happens. Hopefully any development that helps G analyse page content for originality will come soon.

  23. Wow. I love the new Google stats. Finally some feedback straight from the source. This rocks! What I’m interested in is the Query Stats.

    5 and 5 are just not enough for me. If we could see maybe 100 on each side, I’d be a happy man. I assume the data is already there so why not give us access to it? Come on. Please?

    πŸ˜‰

  24. Matt,
    What about sites that can’t upload a sitemap file or the “empty file” that the new service suggests? Specifically, how about blogs using Blogger.com (from Google).

    There are many blogs that are better than most domain-based websites. Shouldn’t they also have the opportunity to utilize the sitemap services?

    Any suggestions for getting included or could you possibly throw some weight around?

    Thanks.

  25. [You can post as much or as little of this as you choose]

    I just wanted to let you know that the amount of time you gave and the patience you showed were commendable. I can’t say I’ve ever seen a mob of web geeks act like that.

    If ever there was a personification of a #1 organic ranking, the social dynamic on Wednesday was it. Your PR (personal ranking) was at least 9-10, and you had a ton (hundreds) of valuable “human” backlinks hanging on your every word. It was a little surreal, and you are a great sport to engage and answer all those questions (or queries) the way you did.

  26. PubCon rocks!

    Looking forward to PubCon 2006.

    :insert spam link: :insert more spam links:

    – DSLxp

  27. Great gadget, the site map thing, I like it! It gives some neat data.

  28. Tearinghairout,

    I am no expert, but say a big site like SEW takes a snippet out of your article and posts it with a link to the originator (you) at the bottom. Their two-sentence blog scrape will be #1 in the search in under a day. Why? It’s the authority equation, and the only way you can beat it is to become an authority yourself by maintaining your focus and not doing blogs on “fun with whipped cream” if you are all about “microsoft”. Why do you think these big sites’ articles get shorter and shorter and lack any real substance? (I am not speaking of SEW; I like to read their morning scrapes with coffee) πŸ˜‰

    Just keep doing what you do, you will find a slot if your junk fits into the “collective” matrix.

  29. whoa…! convention parties…sounds cool πŸ˜‰

    PS I love the spell check on the google toolbar

  30. Matt,

    Good to see you at PUBCON. Thanks for all of your hard work communicating with the SEO community! Much appreciated!

    Am excited about Googlelytics and G!Sitemaps.

    Just FYI: 35,000 Yahoo! Stores can’t verify their URLs on Google Sitemaps because the verify code has both uppercase and lowercase letters, and all of our stores can only create pages with lowercase URLs. I posted the same on Google Groups.

    BTW When are you guys launching the (free?) Google Stores?

    Rob

  31. Matt,

    Loved your blog. I got so hooked I read all the posts from beginning to end in 4 hours. Any new stuff from Google really excites me, and the amount of buzz it generates, from the stock market to the technical and legal communities, is just awesome.

    By the way, for Google Base it’d be cooler if the search were restricted to just Google Base. For example, if I search for “wanted jobs”, it also pulls links that Google has indexed for “wanted” but that don’t have anything to do with an actual job posting. That kinda sucks, ain’t it!!!

    Peace

  32. Matt,

    Why doesn’t Google combine Sitemaps with Analytics on one screen? And in addition, perhaps in the future, Google Sitemaps could inform webmasters why their sites are penalized/banned, all on the same screen?

    Pin.

  33. It’s been mentioned that Matt has not recommended buying text links. I have to disagree.

    Let’s say I find a blog about a product in my industry and want to buy a text link because it will generate traffic to the site. I would say it’s a wonderful idea.

    Maybe the transcript I read was a cut down version of what was said.

    Or maybe Matt could have said for purposes of increasing PR and SERPs that text link buying might not be fruitful.

  34. Hi Johny Favourite

    According to the recap I read, Matt is only in doubt about whether purchased text links work for ranking on the SERPs (not whether they generate direct traffic from, for example, a blog they appear on), I guess:

    ————————————————————-
    Q: Let’s go back to text links.
    A: Best links are earned, not sold or traded. You may not get what you pay for. He said, if someone is selling text links, they should give you a free test trial to make sure it works. They have both manual and algorithmic approaches to detect paid links. He said Google.com gets emails asking to trade links. The guy who came up with the pixel homepage thing, that was creative.
    ————————————————————–

  35. Google Analytics (free) is an excellent thing. I am going to implement Google Analytics on my website and will post feedback after testing.

    If it works fine, then we won’t need to depend on Urchin πŸ™‚

    Thanks a lot for launching Google Analytics (free).

  36. That’s OK then!

    I just read that he would not recommend buying text links at all!

    I just had the image of no one ever buying any links on my http://www.blog-about-everythiing-but-mostly-mortgages-electricals-seo-sexual-enhancement-tablets-get-rich-quick-schemes-and-all-the-free-goods-you-could-ever-imagine-and-all-you-have-to-do-is-send-me-cash-in-a-brown-paper-bag.com

    And obviously I don’t want my main source of income to suffer!

  37. Interesting stuff! I’ve got to try the new Sitemaps when I have time.

  38. yes, great analytics. anyone who uses a real analytics tool will notice one thing: the results are way off! the slowness, ugh.. the lack of accuracy, ugh. the ease with which it is fooled, ugh.

    there is a reason google analytics is free. it is worthless. if i paid for this, i would feel cheated. there are several other free log file apps that are more accurate, they just don’t get hyped as much – and perhaps are not as “user friendly”

    good job on the PR front though, gotta give Google credit where credit is due.

  39. Hi Matt,

    Is it possible for a page to be penalized for duplicate content if there are identical pages listed in the supplemental index, with slightly different names… i.e. /widgets.html, /Widgets.html, & /WIDGETS.html?

    I have put 301 redirects in place to correct the problem, but I think the good page is penalized/filtered.

    If this is the case, what can a webmaster do to fix it?

    Thanks.
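The case-variant fix described above (301-redirecting /Widgets.html and /WIDGETS.html onto /widgets.html) can be sketched as a tiny routing rule. This is an illustration only; the function and path set are hypothetical, and a real site would usually do this in server configuration rather than application code:

```python
def canonicalize_case(path, canonical_paths):
    """Map a request path to its canonical-case equivalent.

    canonical_paths: the site's real paths, e.g. {"/widgets.html"}.
    Returns (status, location): 200 for an already-canonical path,
    301 plus the canonical path for a case variant, 404 otherwise.
    """
    if path in canonical_paths:
        return 200, path
    lowered = path.lower()
    if lowered in canonical_paths:
        # Case variant of a real page: redirect permanently so search
        # engines fold the duplicates onto one URL.
        return 301, lowered
    return 404, None
```

The same idea is what the commenter's 301 redirects express in .htaccess form: one page, one URL, everything else redirects.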

  40. Wow, Google just keeps coming up with new stuff. I can’t even keep track anymore. I just found the Google reader and Google video for the first time. Anyone know any others?

  41. Howdy!

    I especially like how Sitemaps is a feedback mechanism, showing you how googlebot views your site. That is a treasure trove of information, right there.

    It was funny hearing that some mass-emails have been sent to google.com, requesting link exchanges. That’s just hilarious … talk about shooting yourself in the foot.

    Now … the pixel homepage … what exactly was that?

  42. million dollar homepage dot com (I won’t gratify it with a link)

    You’re very lucky you don’t know what it was. It’s stupid.

    It was hype for a week… it actually made him money. Then millions of people copied it. In fact, you can buy the code to make one for $50 if you search hard enough (please don’t)

    But what would actually cause a person to go there? Sure, it was a unique idea, but it has no market. Why would I want to look at a webpage that’s a jumble of links?

    I’m surprised he gets traffic at all.

  43. The additions to the sitemaps area are excellent. But there’s one addition that I’d really like to see in the sitemaps file itself – the minus sign. A simple minus sign in front of a URL could be used to tell Google to remove the file from the index. It would be a whole lot easier than the current method.

    I suggested it when sitemaps first came out, and GG liked the idea, but he isn’t on the sitemaps team, so he said he’d pass it along. I guess they didn’t like it, but it would be very useful for webmasters in many situations.
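The minus-sign convention proposed above is hypothetical (Sitemaps are XML files, and no such syntax exists in the protocol), but the parsing idea is simple enough to sketch against a plain URL list:

```python
def split_sitemap_urls(lines):
    """Split a plain-text URL list into (keep, remove) lists using the
    proposed convention: a leading '-' marks a URL for removal from the
    index; every other non-blank line is a normal sitemap entry."""
    keep, remove = [], []
    for line in lines:
        line = line.strip()
        if not line:
            continue  # skip blank lines
        if line.startswith("-"):
            remove.append(line[1:])
        else:
            keep.append(line)
    return keep, remove
```

A real implementation would extend the XML schema instead, but the appeal of the suggestion is exactly this kind of simplicity.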

  44. Brandon, the pixel homepage was on a site called http://www.themilliondollarhomepage.com quite amusing if you ask me.

    Anyhow, I am happy with the sitemaps thing but really not too sure about Google Analytics. I’ve tried it out but am unsure about it.

    Also, I saw Google Base was up the other day. I have submitted an article there, though I am waiting to hear Google hype us up about it.

  45. >if someone is selling text links, they should give you a free test trial to make sure it works.

    Well there are time delays associated with feedback loops, and in some markets conversions are delayed a bit, in others conversion rates may be fairly low with a high value per customer.

    what percent of quality publishers will give you a free one or two month trial to ensure their ads work with your site? Where is an example of anything like that offline?

    You have to enter your credit card details into Google before you get AdWords up and running. It is not the job of the publisher selling ad space to ensure the ad buyer is not messing up on some other fronts (be it SEO or conversion or whatever).

  46. Google is fast approaching King Kong status and seems to be willing to stand on anything to get what they want. Who controls their social conscience?

  47. Matt,

    Any comments on the Sitemaps 404 verification error? Are accounts with previously verified sites going to be re-verified? Obviously there’s a strong chance that a fair number of people are collecting stats on sites they should not have access to.

    Thanks,

  48. I got data from Sitemaps last night and it’s much more comprehensive than I expected.

    The errors list is especially useful. Indirectly, it led me to find a scraped copy of one of my pages, which I’ve spam reported.

    So I take my (white!) hat off to G for doing this.

  49. “Yahoo! knows how to throw a party. They threw a fun shindig at Pure last night with good food and free drinks. I stuck to Sprite, but several webmasters were drinking with both hands.”

    Yes I did πŸ™‚ 11 Jack and Cokes and a couple of double Scotches on the rocks, and I almost had to crawl back to my hotel. Matt, thanks for the information you gave us all. It helped me immensely when you answered my question at the Pub the other night; you’re a gentleman and a scholar.

  50. Yeah, you keep drinking while Matt probes you with his cybernetic eye: “Hmm, I see much spam in this man’s head, will have to write down his URL and exclude him” lol!

    See for yourself: http://www.webguerrilla.com/photos/album/1394062/page/1/photo/64560901

    HA!!! I love you guys, black hats, white hats, google hats, it’s all good fun!!!

    πŸ˜€

    Peace and Love,

    -AP

  51. Hi,

    Great addition, but I have a question about the PageRank thing in the Sitemaps console. My site is VERY new, yet it already shows 100% of pages as MEDIUM PR?

    Google shows me no real inbound links and the toolbar says I have PR0 πŸ˜€ so who’s right, the toolbar or my console?

    Sounds like a console bug to me….

  52. Hi Matt…

    Hope you had a great time in Vegas and won some blackjack hands (or poker hands??)

    Sad but true, but back to work…

    My question is not about PubCon. I tried to submit a spam report at

    http://www.google.com/contact/spamreport.html

    but it’s not working. I get sent to http://services.google.com/cgi-bin/feedback/feedback.py
    and it displays an error…

    Regards

  53. Hi Matt –

    I hope you got out of LV safe, sound and solvent. You clearly win the champion belt for search rock star. I hope you get to share some of your general observations about PubCon. I’ve been to two of them and was impressed by the quality of the sessions and the general audience.

    One presenter said “WMW used to be the spammers’ conference” and I’m wondering how you think things are changing in that respect. You and Brett should convince Larry and Sergey to attend one of these – they’d enjoy the “internet energy” level, which has got to be as high as anywhere but Google’s or Yahoo’s HQ.

  54. Question for Mr. Cutts that is totally unrelated, but I will give this a go. What do I have to lose? I either go to a forum and get last year’s advice, or I throw one down and hope it is picked up by Matt’s cybernetic eye. πŸ˜€

    Here it is:

    I have a site that is not getting traffic from Google (I did a dumb thing: I deleted the entire site and started over) and I am a little tired of it, so I am considering paying, YES paying, for my link to be on a legit travel site that gets insane amounts of traffic. It is not really related to what I do, but I do not care; did I say I need traffic? BUT here is the thing: it shows a PR8 and my site is only a PR5. Will this boost my PageRank to a dangerous level and set off a flag requiring a manual check? The problem is I need traffic, NOT PR, but I see how this could be seen as spamming. Did I say I need traffic, not PR??? Grrr…

    (Any of you minions know?)

    Thanks,

    -Aaron Pratt

  55. Canonical issues still exist at Jagger3.

    Referral sites would be a great addition to the Sitemap product. If a problem exists with an external link, it would be nice to see what website is generating the problem link.

    Just did a check this AM on a J3 test site and still see the same problem. One of my top PR5 pages, which pre-J3 garnered one of the top five search positions for a relevant keyword, now appears down in the 60-70 range. Searching for a snippet from this page returns 2 identical pages from my site, both listed AFTER a spamming, low-PR AdSense page that scraped our site. The URLs are identical except one has a double slash. I’m assuming there’s a “duplicate content” penalty being applied. I checked our site and there is no internal reference with a double slash, so I assume it’s from an external website.

    Google Sitemap recently displayed HTTP errors where its spider found some site referencing one of our pages with 2,3,4, and 5 slashes in the URL. It would be nice to know who. Our site logs show no such entries.
    Is it possibly Google?
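Collapsing the stray slashes described above is a one-line normalization. This sketch deliberately operates on the path portion only, since the “http://” prefix must keep its own double slash; whether to normalize or 301-redirect such requests is a site-configuration choice, not anything Google prescribes:

```python
import re

def collapse_slashes(path):
    """Normalize runs of '/' in a URL *path* to a single slash, so that
    /news//page.html and /news/page.html resolve to one canonical URL.
    Apply only to the path, never to a full URL (scheme separators)."""
    return re.sub(r"/{2,}", "/", path)
```

A server-side 301 from the collapsed-slash variant to the canonical path would prevent the duplicate-content situation the commenter describes.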

  56. There’s one part of the new Sitemaps stuff that doesn’t work 100%: verifying the ownership of a site.

    The verification includes doing a “404 probe” by requesting a page that the site cannot contain. But not all sites return 404s when a page doesn’t exist. Some sites return a useful page instead – a 200. I have such a site, and I can’t verify it because the Sitemaps system refuses to verify it when a 404 isn’t returned.

    What’s the point in that?

  57. Cancel my question. I’ve realised the need for the 404 probe. It’s *because* sites like mine exist.

    The 404 probe isn’t a solution – again, because sites like mine exist. I suggest that Google adopts a much better verification method of the type that’s being talked about in the forums – a robots.txt entry, or something on the homepage, for instance.
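The 404-probe handshake described in these comments can be sketched as a pure check. Here `fetch_status` stands in for an HTTP request, and the probe and verification filenames are made up for illustration; this is a reconstruction of the described behavior, not Google’s actual code:

```python
def looks_verifiable(fetch_status, verification_file):
    """Mimic the described handshake: the site must return 404 for a
    page that cannot exist AND 200 for the verification file.

    fetch_status: callable path -> HTTP status code.
    A soft-404 site (one that answers 200 for everything) fails the
    probe, because its 200s prove nothing about the verification file.
    """
    probe = "/no-such-page-404-probe.html"  # hypothetical probe path
    if fetch_status(probe) != 404:
        return False  # can't trust any 200 from this site
    return fetch_status("/" + verification_file) == 200
```

This also shows why the commenter’s site fails: the probe exists precisely to reject sites whose 200 responses are meaningless.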

  58. Matt, I want to test this text link buying thing out for myself. How much for a link?

  59. Well, the Urchin team failed to mention this at PubCon during all the hoopla. Frustrating, since I got all excited about it and now can’t use it.

    “Google Analytics has experienced extremely strong demand, and as a result, we have temporarily limited the number of new signups as we increase capacity”.

  60. hey matt, I spoke with you a bit at the Pubcon. We talked about bike trails, hospitals, and I think I even told you about the forest fires. I snagged you on the way out of Canada and pinned you at the bathroom for a bit. By the way, I did a little search on “Ireland bathroom”; those are some interesting results πŸ˜‰
    Thanks to Matt and Google, it was fun for sure.
    -justin

  61. I guess I was off topic in my last couple of posts; is this why they were deleted? There are just too many questions. Oh well, I will figure this all out quietly on my own.

    Anyhow, the thing I like about Analytics is that it shows regions. I have a website on rainwater harvesting, and Analytics confirms that people in India are in great need of information, so I will provide it. In this way Analytics is VERY useful indeed.

  62. Hi Matt

    I guess by now you have reached Kentucky for Thanksgiving.

    Just to say that we are still awaiting that recap of PubCon coffee talk Q&A πŸ™‚

    Have a great Sunday.

  63. PhilC – I had the same problem… (I had to rename my .htaccess file until after the verification process was completed)!

    Thanks for posting – I thought I was the only one – lol

  64. Hi Matt

    I hope you read the post on the older blog entries – eg – if someone added a comment on the Bacon Polenta – would you know and read it – just wondering ?

    Anyway, to move slightly away from my canonical URL postings: it seems that Google has a problem determining the root page of my site. The J3 DCs seem ordered for some sites, but my homepage does not come top πŸ™ If I do a “Company Name Ltd” search, my root page is not the first returned from my site; if I do a site:www.domain.com http://www.domain.com, my root page is not top, etc.

    So – I wonder if Google is having a problem determining my root page. I have done a 301 from non-www to www and all my internal links point to the www page.

    Is there anything else that makes it easier for Googlebot to determine the correct root page?

    All the best

    Stephen

  65. Analytics ain’t cutting it – this seems like an Alpha release, rather than a beta release open to the general public. Google should’ve spent some more time with this, instead of encouraging others to leave their paid analytics software for this tracking that is hardly functioning.

    I’m a bit disappointed.

  66. ClickyB. I wonder if the verification file needs to stay in place in case Google does repeat verifications, or if everything can be removed once the verification is done. I added a line in my .htaccess file to get the site verified, and I’m leaving it, and the verification file itself, in place in case Google does impromptu repeat verifications.

  67. Matt do you plan on going to the Boston show in the spring??

  68. Hey Matt,

    Thanks for all the great updates in the past month, along with the Pubcon coverage. Anyhow, I know this may be a little off topic, but take a look at this search query and look how many blog spams there are. Is this how the Jagger update is going to turn out? I know it may be a bit selfish to hope that my site would rank higher, but if you take a look you’ll understand. 3-4 blogs covering the top 30 search results is not what Google should be about.

    http://www.google.com/search?as_q=wow+gold&num=30&hl=en&btnG=Google+Search&as_epq=&as_oq=&as_eq=&lr=&as_ft=i&as_filetype=&as_qdr=all&as_occt=any&as_dt=i&as_sitesearch=&as_rights=&safe=images

  69. I seem to have the same problem as Shawn, and it almost seems like a canonical issue, showing http://mps-golf.com instead of http://www.mps-golf.com. Almost ALL of my pages are no longer indexed or cached by Google, and I will probably be completely gone by tomorrow.

    I don’t know what’s going on, but it’s not good. πŸ™

  70. Sitemap stats are great on the whole, but it’s frustrating to see Googlebot reporting a 404 error on a page that never existed. That is, someone tried to link to you but mistyped the URL. The stats tell you Googlebot couldn’t find the page it was after, but they don’t tell you where Googlebot found the broken link. So you can’t tell the other webmaster in the hope they’ll fix it.

  71. I asked this question of Google Adwords and was told to write support at Google because they didn’t have an answer. No response yet. I’ll try it here also.

    We placed tracking code in all of our Adwords ads similar to the following:

    http://www.mysite.com/?ref=Google

    Pre Jagger, I noticed that by clicking on the ads within Adwords, our page came up with a different PR than the non tracking code flavor (usually 1 or 2 less). With the canonical issues raised with Jagger, I asked Adwords if there was any potential problem with this and was it possibly hurting our search engine rankings because of the apparent distinction that Google was making between pages with and without tracking code.

    I know this is beating a dead canonical horse, but at least this is one I can control. I have several hundred ads under Adwords that I will need to change and also scrap our tracking program that looks for these codes.

    Should I remove all these tracking codes?

    Any definitive answer would be great! 🙂
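    One common way webmasters sidestep this kind of duplicate-URL problem (a rough sketch, not official Google guidance: the `ref` parameter comes from the example above, everything else here is hypothetical) is to log the tracking parameter server-side and then normalize to a single clean URL, so a search engine only ever sees one version of the page:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical list of tracking parameters to strip; "ref" is from the
# example URL above (http://www.mysite.com/?ref=Google).
TRACKING_PARAMS = {"ref"}

def canonical_url(url):
    """Strip known tracking parameters so every ad variant of a page
    collapses to one canonical address."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k not in TRACKING_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), parts.fragment))
```

    With this in place, a server can record the `ref` value for its own stats and then 301-redirect to `canonical_url(request_url)`, so the tracked and untracked flavors of a page never compete with each other.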

  72. You could hear a pin drop in here, eh? Hmm, interesting. 😉

  73. @Stephen didn’t it show up on a link: query?

  74. Matt,

    The stats site would be really nice if it worked with Yahoo! stores. Whose bright idea was it to use a mixed-case filename? Yahoo! stores doesn’t allow uppercase filenames, so I can’t verify the domain. Or maybe this was intentional…

    Besides, it was my understanding that common practice was to use lowercase filenames for web sites. Is Google trying to change that?

  75. Stephen Newton, unless I’ve misunderstood you, you are mistaken. There was a security issue with the new sitemaps data, and Google fixed it by doing a 404 probe. They request a non-existent page to make sure that a 404 is returned. They need a 404 to come back for the non-existent page so they can be sure that, when they receive a 200 for the real verification page, it really is for that page. So if the verification says there’s a 404 problem, it means they can’t be satisfied that you put the verification page on the site.

  76. PhilC and Stephen,
    I think Stephen was referring to 404s in general, because Google does report a number of these on our site as well, but doesn’t give the referring site that maintains these links. The Google report contains additional 404s beyond what we’ve seen in our site log stats, and it would be nice if Google would tell us where they came from. I mentioned an example of this earlier in this post, where some site was referring to a non-existent page on our site with 2, 3, 4, and 5 slashes in the URL name. Almost looked like a fishing expedition.
    This info was in the Google reports that predate the security fix Google recently implemented.

  77. I was rushing to get out when I posted the last message, and I wasn’t very clear about it, so I’ll explain it in a bit more detail.

    First I’ll say that Google’s “404” message does not mean that there is a faulty link somewhere. It ONLY means that the site doesn’t return a 404 when a non-existent page is requested.

    The security problem with the new sitemaps data was that some sites don’t return a 404 when a non-existent file is requested. Instead, they usually return a useful page with a 200 code. Initially, Google requested only the verification file and assumed the file existed if a 200 was returned, so the site was verified. That meant anyone could add any site that doesn’t return a 404 for non-existent pages to their list and get it verified. The site would return a non-404 code whether or not the verification file existed, and Google would assume the file did exist. So anyone could see the data for any of those sites, including sites like aol.com.

    Google needs to see a 404 when the verification file is missing, to know that the file wasn’t placed on the site. So their quick fix is to request a file that they know doesn’t exist on the site; if they don’t get a 404 back, they can’t verify that the verification file exists, because all file requests return something other than a 404 whether the files exist or not.

    That’s the reason for the 404 message that you got, Stephen. It doesn’t mean that anything is broken or faulty. It only means that your site never returns a 404, and Google can’t verify the verification file. That’s all.
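    The probe logic described above can be sketched like this (a hypothetical illustration of the idea, not Google’s actual code; the probe filename and function names are made up, and the status checker is injectable so the logic can be exercised without a live site):

```python
import urllib.request
import urllib.error

def http_status(url):
    """Return the HTTP status code for a URL."""
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

def verify_site(base_url, verification_file, get_status=http_status):
    # Step 1: probe a page that should not exist. A well-behaved server
    # answers 404; a server that answers 200 for everything makes the
    # next check meaningless, so verification must fail.
    if get_status(base_url + "/no-such-page-probe.html") != 404:
        return False
    # Step 2: only now is a 200 for the verification file trustworthy.
    return get_status(base_url + "/" + verification_file) == 200
```

    Under this scheme, the “404 error” shown in the console means step 1 failed: the site answered something other than 404 for a page that cannot exist, so no 200 response can be trusted.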

  78. Matt.

    I’m sure that Google will come up with a better solution than the 404 probe, but in the meantime, you could ask them to make the “404 error message” understandable. It really doesn’t tell us anything at the moment. I had to go reading in forums to find out what the problem was, because it looked like a faulty Google system to me.

  79. Matt, in a few discussions I mentioned the ViewState in .NET. I think I told you it fell under a meta tag. I was wrong. It is a hidden input in the main form of a .NET app.
    -justin

  80. Matt –

    Help, I’m still lost in the pub at the New York New York Casino…
    ummm …. where did everybody go?

  81. Why do all of my comments keep getting deleted? Is there a blog policy that I’m not aware of?

  82. Alexa creates a small txt file that you upload to your site, and then they retrieve it and verify the contents. That’s a much smarter way.

  83. The GoogleBase bulk upload is a real pain, and I’m not the only one having problems. I’ve tried half a dozen experiments to get the items published, but all the system ever does is come back with “invalid URL” messages every time. There’s nothing wrong with the URLs – no funny characters, no name/value pairs, no anything except straightforward full URLs to html files.

    Most of my experiments just had one item in the file, and the items get published fine when I submit them as individual items, but GoogleBase doesn’t want to know the same URLs when they are in a bulk upload file.

    It sounds as if the file is formatted wrongly (it’s a .txt file), but I’ve even gone through it in a hex viewer to make sure that the tabs are where they are supposed to be (that’s the reason for only putting one item in – it’s easier to check), and that no unwanted characters have found their way into the file anywhere. Everything is as it’s supposed to be, but it won’t go through.
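    A tab-delimited file like that can be sanity-checked locally before uploading. A generic sketch (the expected field count and sample fields are hypothetical, not GoogleBase’s actual spec):

```python
def check_bulk_file(lines, expected_fields):
    """Report lines whose tab-separated field count is wrong or that
    contain stray control characters (the kind a hex viewer would find)."""
    problems = []
    for num, line in enumerate(lines, start=1):
        row = line.rstrip("\r\n")
        fields = row.split("\t")
        if len(fields) != expected_fields:
            problems.append((num, "expected %d fields, got %d"
                             % (expected_fields, len(fields))))
        for ch in row:
            # Flag control characters other than the tab delimiter itself.
            if ord(ch) < 32 and ch != "\t":
                problems.append((num, "unexpected control character %r" % ch))
                break
    return problems
```

    Running the bulk file through a check like this at least rules out the formatting explanations (wrong tab count, hidden control characters) before blaming the URLs themselves.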

  84. Hey Matt, just wanted to say you handled yourself well under the pressure from Brett during Coffee Talk and during the SEM Smack Down Forum at PubCon. I didn’t get a chance to run into you during the show but my brother said he bumped into you and said what’s up. I got a lot of good info and that nonsense MSN vs. Google API was ridiculous. I was lmao!! Catch you later man.

  85. Hmm anybody notice Matt hasn’t been posting for awhile? Anyone know why?

  86. Daniel, I think he’s on a well-deserved vacation in Kentucky.
    Boonesboro, Matt?

  87. Daniel Said,

    “Hmm anybody notice Matt hasn’t been posting for awhile? Anyone know why?”

    Inigo is in Kentucky for Thanksgiving 🙂

  88. Gord Hotchkiss wrote about Matt in MediaPost’s Search Insider.
    Great piece; everyone should read it. Very true.
    http://publications.mediapost.com/index.cfm?fuseaction=Articles.showArticle&art_aid=36721&art_search=

  89. How/why does Google differentiate between ‘buying a link’ and ‘buying an AdWords ad’? Fundamentally they are the same thing; it’s ironic that you’re penalised if you buy a link from someone directly rather than from Google.

  90. Google’s changes have been good to our site, so we are happy. However, we have noticed that as Google improves the ranking of our site, Yahoo dramatically lowers our ranking at the same time. (From 12 to 47.)

    Is this a coincidence? Or is Yahoo that much different than Google?

  91. Now how long till next update?

  92. β€œYahoo! knows how to throw a party. They threw a fun shindig at Pure last night with good food and free drinks. I stuck to Sprite, but several webmasters were drinking with both hands.”

    Too right! The barmaid was excellent too – after my 3rd JD and Coke she would just pour them as I was walking towards the bar! What a star 😀

  93. Does no one have a clue why the PR shown in the sitemaps console and the PR in the toolbar are completely off?

    I have a feeling the toolbar is right and Google has more problems. Lots of problems seem to be appearing; MSN search looks pretty nice sometimes, and I find myself having to use it to find something relevant 🙁

    I agree with link buying, we should be allowed to buy links from whoever we want. Google should not be trying to control things like that, they should be spending their time fixing the problems in the search engine !

    Why not build a fast reporting system and have it manned. Users can report spammy links even faster than now, and the sites can be vanished off Google quickly. If they have contact info the author of the site can complain. Make users register to be able to report sites, and give them adwords or even direct revenue BONUSES for helping Google.

    I will put in a little time every day killing bad sites 😀
