Boston Pubcon 2006, Day 1

Stream of unconsciousness:

I’m on a bus (technically, the “Silver Line”) heading into Boston. Now is when that EVDO 3G wireless would be handy, but the hotel evidently has Wi-Fi. I’m happy to say that the United flight was smooth, and I got a few hours of sleep. You always remember when your entire day is ruined because a flight is delayed or a plane is broken, but you never remember when the flight attendants are nice and the plane leaves on time. Kinda like forgetting to be thankful when you’re not sick, I guess.

Later: I’m really optimistic. Good omens so far:
– The change machine was broken, so I didn’t have my $1.25 exact change. The bus driver took a dollar and waved me on back.
– Daffodils are in bloom here. I love daffodils, but they’re already gone in California and Kentucky. I guess spring comes later in Boston.
– An airport employee was happily munching on tater tots. Someone called to her and by way of greeting, the woman said “You see me eating tater tots?” with a smile in her voice.
– Another person, a “T” employee, kept me on track by pointing me to the E Green Line.
– Once I got into the hotel, four people said hello before I reached the check-in desk, and I only recognized two of them. So people are stopping me to say hi, which is nice.

And the best omen: wifi at the conference is free and everywhere! O frabjous day! I can watch sessions without being out of touch, and I can keep up on email/Bloglines.

I got checked in early and caught the tail end of Malcolm Gladwell (Blink? The Tipping Point? You know this man, yes?). He seemed to regret that search engines take a just-the-facts approach; this is a man who likes serendipity and seeing novel things. Given the choice of a search engine or the blogosphere, he said he’d take the blogosphere. 🙂

In the affiliate and mumble microsite mumble panel, ChrisR noted how appealing to emotions can increase sales. And he made the fine point that specific stuff often converts better than general stuff.

In the local panel, Baked Jake mentioned that DMV is the #8 most common local search on TrueLocal, and urged people not to forget government-related searches. Jake also mentioned that golf course searches have a great return-on-investment. Thai Tran from Google was on the panel, and mentioned how content can make it into Local Search on Google:
– yellow pages-type data
– data that site owners submit directly to Google
– crawling the web

Then we had the Google-sponsored luncheon. It was pretty cozy (maybe a couple hundred people or so?). I embarrassed Martinibuster into going out into the hallway to let people know about the luncheon. Some highlights:
– I briefly talked about things that had launched since the last WMW, such as Google Calendar, Google Finance, and Bigdaddy.
– I mentioned where to find/talk to Googlers at the conference. Google is speaking on 7-8 panels, so Googlers aren’t hard to find. 🙂
– Amanda and Vanessa gave a live demo of Sitemaps, including the recent robots.txt checker.
– My favorite part: Amanda deliberately typed “Disalow” in the robots.txt tool to show that 1) Google will still treat that correctly and 2) the checker tool will warn you about the typo. Several people in the audience thought it was an accidental typo until they realized Amanda did it on purpose. It’s interesting how uncomfortable people get when they believe a speaker is mistyping something. 🙂
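(Aside: a toy version of that kind of directive check is easy to sketch. This is just an illustration in Python, not Google’s actual tool — the directive list and the fuzzy-match cutoff are my own choices.)

```python
# Toy sketch of a robots.txt directive checker (illustrative only -- not
# Google's actual Sitemaps tool). Like the real checker, it flags a
# near-miss like "Disalow" while still treating it as the intended rule.
import difflib

KNOWN_DIRECTIVES = ["user-agent", "disallow", "allow", "sitemap", "crawl-delay"]

def check_robots_line(line):
    """Return (recognized_directive, warning) for one robots.txt line."""
    if ":" not in line:
        return None, None
    field = line.split(":", 1)[0].strip().lower()
    if field in KNOWN_DIRECTIVES:
        return field, None
    # Tolerate common typos by matching against the closest known directive.
    close = difflib.get_close_matches(field, KNOWN_DIRECTIVES, n=1, cutoff=0.8)
    if close:
        return close[0], f'"{field}" looks like a typo of "{close[0]}"'
    return None, f'unknown directive "{field}"'

directive, warning = check_robots_line("Disalow: /private/")
# directive == "disallow"; warning points out the typo
```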

Questions from the audience were all good, from “Can you break these stats down more for different countries/languages?” (Answer: we’ll pass that feedback on) to “Can I access sitemaps for my clients?” (Answer: yes. If you control the domain, you can verify a sitemap for both of you, and then you could both access stats on that domain.) Aha: WebProNews covered the luncheon.

What else? The blogger/podcast/RSS session was really fun. Daron Babin, Amanda Watlington, and Greg Hartnett spoke, with Anne Kennedy moderating. Both Jeremy Zawodny and I were in the audience, so they pulled us up for the Q&A part of it. Some highlights:
– A nice lady mentioned that sometimes Google could return raw RSS or XML links in our search, and that looked ugly and didn’t help users much.
– I know Danny likes Feedburner, but I don’t use it because I feel a little weird about giving my RSS feeds over to someone else. Turns out that several people on the panel felt the same way. One solution that people mentioned was to keep the feeds at an internal link on your own domain and then do a 302 over to Feedburner. That makes it easy to undo later if you want to take your feeds back in-house.
– At some point during the conversation above, I felt pretty dumb. I blog, but don’t really feel like a native blogger. When I started blogging, I did about a week of research, then I just picked WordPress, tweaked it a bit, and haven’t done much otherwise. Even my blog template is pretty vanilla, which is fine because I assume you’re reading me via a feed reader. That’s why I include the full text and pictures of my posts in my feeds. I don’t care very much about RSS/Atom/whatever, and I feel vaguely guilty about that. I sometimes feel that if I were a real blogger, I would dig through XML for fun.
– Daron mentioned the danger of chiclet inflation (“Add to MyYahoo”, “Add to Google Reader”, “Subscribe in Bloglines”, and so on) and that he uses a dropbox on I think he also said that he switched from Shoutcast to Icecast. Either that, or the other way around.
– I thought again that it might be fun to do a podcast, and again wished that there was a rock-solid/cheap/reliable/scalable place to dump 30-50 MB mp3s. Any recommendations?
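The 302-to-Feedburner trick from the feeds discussion above is simple enough to sketch. Here’s a minimal WSGI version in Python (the target feed URL is a made-up placeholder); in practice you’d more likely use a web-server rewrite rule than application code:

```python
# Minimal sketch of the "keep your feed URL, 302 elsewhere" idea.
# Readers subscribe to a feed URL on your own domain; a 302 (temporary)
# redirect points their feed readers at the hosted feed. Delete the
# redirect later and the feed comes back in-house without breaking
# any subscriptions.
FEED_TARGET = "http://feeds.example.com/myblog"  # hypothetical hosted feed

def feed_app(environ, start_response):
    if environ["PATH_INFO"] == "/feed/":
        # 302, not 301: we're *not* telling subscribers to change URLs.
        start_response("302 Found", [("Location", FEED_TARGET)])
        return [b""]
    start_response("404 Not Found", [("Content-Type", "text/plain")])
    return [b"not found"]
```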

Okay, moving on. I’d been talking to Mike Grehan about doing an interview for, oh, two years or so, and we finally got around to doing that. We ended up talking for 40-50 minutes and discussing lots of meaty topics such as the differences between IP delivery and cloaking. I hope he’ll put that audio up some time, and I also hope to see technical posts by Mike Grehan, Amanda Watlington, and Daron Babin about the type of audio set-up that they each have.

Then it was well after 4 p.m., and I realized that I was going to miss Gordon Hotchkiss’ thin-slicing search session with Ron Belanger. Hopefully someone else caught it and can fill me in. So instead, I hunkered in the hotel room and wrote this up.

Now comes the dilemma: it’s about 6pm. Should I go to the YPN party? They never sent me a YPN invite, but they invited me to their party. Or should I hang around in the hotel pub and chat SEO? Or should I try to sleep a little bit to prepare for that 9am Blogger panel? I’m mighty low on sleep after that red-eye flight.

60 Responses to Boston Pubcon 2006, Day 1

  1. Hey Matt, that twas brillig 🙂

    I’d love to hear your thoughts here on IP delivery vs. cloaking. I know there are a ton of things we do on our site to customize what we present to the user…not just based on IP, but based on browser type and version, screen resolution (when we can detect it).

    I understand the official Google stand on this, but I’d be very interested to hear as much as you can tell us on what’s considered safe and what’s going to get us in trouble. While we aren’t trying to be deceptive, I am always concerned that things we do for the user can get us in trouble with your spam-spotting code.

  2. Here’s the short answer from Google’s perspective:

    IP delivery: delivering results to users based on IP address.
    Cloaking: showing different pages to users than to search engines.

    IP delivery includes things like “users from Britain get sent to the, users from France get sent to the .fr”. This is fine–even Google does this.

    It’s when you do something *special* or out-of-the-ordinary for Googlebot that you start to get in trouble, because that’s cloaking. In the example above, cloaking would be “if a user is from Googlelandia, they get sent to our Google-only optimized text pages.”

    So IP delivery is fine, but don’t do anything special for Googlebot. Just treat it like a typical user visiting the site.
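    A sketch may make the distinction concrete (the country table and URLs here are invented for illustration; a real setup would use a GeoIP database):

```python
# Illustrative sketch of legitimate IP delivery: route visitors by
# country, and give Googlebot exactly what a normal visitor from the
# same location would get. The table and URLs are invented examples.
COUNTRY_SITES = {"GB": "http://example.co.uk/", "FR": "http://example.fr/"}
DEFAULT_SITE = "http://example.com/"

def choose_site(country_code, user_agent):
    # Note what's *absent*: no "if 'Googlebot' in user_agent" branch.
    # Adding one -- routing the crawler somewhere special -- is what
    # turns ordinary IP delivery into cloaking.
    return COUNTRY_SITES.get(country_code, DEFAULT_SITE)

# Googlebot crawling from a US IP sees what a US visitor sees:
assert choose_site("US", "Googlebot/2.1") == choose_site("US", "Mozilla/5.0")
```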

  3. Just read’s report about the Sitemap demo regarding webmasters looking for better communication w/ Google. Something sort of just popped up: instead of communicating with webmasters via email, which is less and less reliable these days, how about using the Sitemaps account as the base and establishing a sort of ticket system (or even just a one-way/reply-only messaging system)? This would be much more reliable and, for webmasters managing multiple websites, much more convenient.

  4. Hello Matt,

    Our site went from 33000 listings on Google 2.5 weeks ago to 950 today. It is affecting our traffic. We can’t find anyone to speak to or to email.

    I read your guidelines on comments and I know that you no longer handle site-specific situations, but public access to someone at Google who could help would be great. Large companies seem to have that access.

    We contacted a serious SEO expert favorably quoted by you on your blog, and he could not see the reason why this would happen. How, then, can a small business figure it out?



  5. > again wished that there was a rock-solid/cheap/reliable/scalable place
    > to dump 30-50 MB mp3s. Any recommendations?

    Google Pages? Or don’t you agree on the rock-solid/cheap/reliable/scalable part? 🙂

  6. Ok, I don’t really expect an answer to this question, but since I seem to have the luck of being only the 2nd poster on this thread and since you’ve got that blogger panel, I’ll try my luck. Here’s what I posted before (sorry, it’s not that I’m so lazy I can’t write the post over…or maybe I am that lazy)


    This is not related to traffic power, but I had a question about having a regular website and also having a blog. Let’s say the blog was created for the sole purpose of being an adjunct to your site. That is, while blogging about your content area of interest you continually reference things on your main site and even use the blog to promote new and older content that exists on your main site. As a function of this, of course, you are posting a lot of links to your own site. Is this, or could this, be considered spamming? Sounds ridiculous to me, but these days I am forced to wonder how google might view it. Should I use the “nofollow” attribute when I post links from my blog to my regular website?

    BTW, I have to say, you are a relative dynamo Matt.

  7. Matt, the spammers love to confuse the masses by blurring the lines between content delivery and cloaking. An official definition (like the one above) in the Google guidelines would be great at shutting up the black hats!

  8. Matt, is it just me, or is the security code thingy for posting getting fuzzier and fuzzier? Maybe it’s those two sake martinis I just had…

    Seriously, allow me to bare my nuts here for a second:

    Let’s say…hypothetically speaking of course…that I alter the anchor text for links for googlebot vs. a human. I do this because for a human on a given page, the context implies additional info on a link that googlebot LIKELY doesn’t get.


    On a page that’s all about honeymoon specials, I might have links for: Hawaii, Tahiti, Fiji, etc. To a human, that makes sense because of the page context. But I want googlebot to know that, and return those pages from my site instead of my pages about Tahiti currency, or Tahiti weather, etc. when Google picks a page from my site to return for a search for Tahiti honeymoon specials.

    Now, I’ve contrived this example a bit, but my point is: what I’m meaning to do is not deceptive, either to Google or the user. Yet, I can certainly see how an algorithm meant to catch deceptive diffs between pages returned for googlebots vs. users may fry my butt on this.

    Of course, I’m assuming Google considers anchor text when ranking pages for a term 🙂 The PR3 and PR4 pages that rank above mine for certain phrases make me pretty certain I’m not a total crack-smoker here!

    Am I making sense?


  9. Great summary of the first day Matt 🙂 Thanks for keeping us up to date. As for a solution to hosting a podcast, I’d recommend something like or which are both free MP3 hosting sites that offer lo-fi/hi-fi streaming as well as direct MP3 downloads.

    – Dean

  10. You should sleep after the YPN party. 🙂

    (Of course, I just got back from said party.)

    Hopefully we’re all semi-coherent in the morning for the Q&A.

  11. Jean-François Lavigne

    I don’t know what the exact requirements are for hosting podcasts, but rock-solid/cheap/reliable/scalable sure sounds like the new Amazon S3 service. You could upload your MP3s and make them publicly available there.


  12. Yah, the YPN party was nice. Yahoo! is really good at parties. And now I have three “I ♥” T-shirts that I’m not exactly sure what I should do with. 🙂

    Michael, I appreciate your example, but if you’re deliberately showing different links to Googlebot than to users, that’s cloaking and very high-risk behavior, because we want to score the same page that the user would see. My best advice is to figure out links that are useful both to users and for crawling purposes, and show that set of links. And remember that web pages can be created at will, so there are always options such as a site map on your domain that is mostly useful to crawlers (but not ugly to humans), which can augment whatever linkage you want to show to regular users.

  13. Nice summary Matt – You should get no sleep more often.

  14. Dude, why didn’t I know about the YPN party? I spoke to the YPN guy and the other Y girls at length, too… 🙁

    Matt, I enjoyed the luncheon. That was the first time I’ve seen you in person and heard you speak. I did take away a few things, but mostly enjoyed your candor.

  15. I’m glad you came to the YPN Party! Philipp and I are going to have to visit the GooglePlex soon! We’ll have to plan it over Google Calendar 😉

  16. Hi Matt,

    I wish I could have made it to PubCon, I hope you’re having fun. I would think you should get the sleep. I had the opportunity to meet you at SES San Jose last August and I have a quick question; I am starting a new site and would like to know if there are any pros and cons to the following.
    If my site is should I have it redirect to or can I keep it as is? I guess my question is, does a redirect to the www have any advantages?
    Sorry to be off topic. Have fun in Boston
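    For context, what I’m considering is the usual pick-one-hostname setup, roughly like the sketch below (hostnames are placeholders; normally this would be a web-server rewrite rule rather than application code):

```python
# Sketch of picking one canonical hostname and permanently redirecting
# the other to it, so links and indexing consolidate on a single version.
# Hostnames are placeholders for illustration.
def canonical_redirect(host, path, canonical="www.example.com"):
    """Return (status, location): 301 to the www host, or 200 if already there."""
    if host != canonical:
        # 301 = permanent: tells crawlers the www version is the real one.
        return 301, f"http://{canonical}{path}"
    return 200, None

assert canonical_redirect("example.com", "/about") == (301, "http://www.example.com/about")
```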


  17. Yeah. I feel bad for not going to check the hotel pub, but a friend said that earlier tonight there were only 4-5 people there. And it’s 12:30 and I’m on a panel at 9am. So I think I’ll crash now.

  18. Heheh, I decided to go to sleep so I could keep up with you and Jeremy in the panel tomorrow. Looking forward to seeing you again.

  19. Sounds like a good day to me, Matt. Too bad I haven’t had the chance to come to the US to visit NY or Boston. But knowing myself, I will one day 🙂

    I’m wondering how the interview with Mike went 😛

  20. Matt, thanks for the wrap up. Sounds like a pretty good conference so far.

    Also, you may want to check out as a cheap(er) way of podcasting. I haven’t used it personally (yet), but I know a lot of people who have and they love it. They seem to have reasonable rates for the hosting and streaming they do. Just FYI.

  21. “- I thought again that it might be fun to do a podcast, and again wished that there was a rock-solid/cheap/reliable/scalable place to dump 30-50 MB mp3s. Any recommendations?”

    How about a 20% project? 🙂

  22. Sounds like you guys are having a ball over there. Matt – sorry to continue the discussion re: IP delivery/cloaking, but how are you meant to smoothly channel users from different countries to specific content hosted under one domain without it appearing like cloaking? The moment you start redirecting US IPs to US content, it would seem there is a high probability that:

    1) You’ll start indexing from the wrong part of the website, i.e. a country-specific sub/dir, most likely the US one.
    2) If you review from a different country than the one you spidered it from, it will appear like cloaking.

    This is a big problem for international businesses that correctly use the .com extension, whereby they have general international content as the global default from the homepage and then country-specific content (UK, US, French, etc.) in sub/dirs, and want to give their users the best content experience by channelling (redirecting) them to it, but without upsetting the proper indexation topology of the website from the default homepage.

    I’ll tap Mike for an advance on the interview but any suggestions are extremely welcome.

  23. Great post, Matt! On Feedburner, no way! No way did I hand over my feeds to them. I didn’t go with Feedburner until they came up with a way for the feeds to use my own address (look close — see how they are This means Feedburner does the tracking, but the feed addresses themselves have stayed effectively in house. I could leave Feedburner and not lose anyone on those feeds. They formalized the service later after I started so anyone can get it. It’s called MyBrand (and for $3 per month, cheap insurance anyone should take out). Always keep your own domain!

  24. Hey Matt, any further comments you might shed on the mediapartners bot debacle going on right now would be much appreciated. There’s a lot of concern right now that it means that pages with AdSense are going to be indexed more frequently than non-AdSense pages. Now that you’ve confirmed that BigDaddy looks at the mediabot data, we need more information on what exactly is happening….

  25. “I thought again that it might be fun to do a podcast, and again wished that there was a rock-solid/cheap/reliable/scalable place to dump 30-50 MB mp3s. Any recommendations?”

    My (future) recommendation:
    combined with Opera-like voice search; the visually disabled community would like that too.

  26. Hi Matt

    This one would be very interesting for sure:

    – 3:30 p.m. – 5 p.m. on Wednesday, April 19th, 2006: It’s the Search Engine Super Session! I’m hoping for plenty of Q&A time.

    If you get some time, would you be kind enough to post a few of the Q&As that you were involved in?

    Thanks a bunch.

  27. “never remember when the flight attendants are nice?” Have you ever tried flying one of the Asian carriers? Believe me, there have been instances where service was so impeccable, I didn’t want to reach the destination. I had a transpacific flight yesterday on United and it was the best service I’ve ever received on an American carrier. It still didn’t compare to the average service I’ve received on Asian carriers, though.

  28. Matt

    Completely un-related query here, but is there any way to report spelling mistakes to the Google Earth crew? The island of St Agnes in the Isles of Scilly (post code TR21 in the UK should find it) is spelt wrong, which annoys me 🙂


    P.S. Click my name for some holiday snaps in said Islands, including several nice ones on St Agnes!

  29. Heya Matt,

    Can someone at G respond to all the WMW hype right now with the de-indexing of sites (lots of them) at a maddening pace?

  30. What about split testing? Is that considered cloaking? I really want to test, but am afraid I will be penalized for it. Thanks.

  31. > a rock-solid/cheap/reliable/scalable place to dump 30-50 MB mp3s. Any recommendations?

    My experience? Rock solid, cheap, great reputation, and they do a bunch of stuff with NPR podcasts (last I heard?). Must be doing something right.

    Works for me… 😉

  32. Hi Matt:

    First of all, I want to thank you for running such an interesting blog for webmasters. I also want to apologize for asking about a topic that surely has nothing to do with this post, but I need to ask urgently:

    For a year and a half I have been working on a website, spending months and months trying to improve it, but for many months now Google seems to have me completely penalized for something and barely sends me any visits. Could you tell me if I am doing something wrong, or what I can improve so that Google sends me visits again?

    Thank you for everything, Matt.

    [Sorry for my English]

  33. Javier,

    The links at the bottom can’t possibly be good.

  34. Mmmm… tater tots! 🙂

  35. Javier Sanchez

    PR0 on the non-www, PR6 on the www. Looks like the canonical problem (although it still looks like you can get crawled and can appear in the top 20 for your unique name etc. – so not that badly hit, really).

    A fix is supposed to be forthcoming for these types of errors, isn’t it Matt 😉 ?

  36. The best way I’ve found to sleep on a plane is to just not sleep the night before, then chug a beer or two at the airport. That was particularly useful on flights to Korea and back in the military. The advice may be a bit late, but there’s always the ride home.


  37. Javier Sanchez:

    What’s with the random links at the bottom of the page?

  38. As much as I hate to jump on a bandwagon, like Henry Elliss I’ve noticed some anomalies with Google Earth (cool tool as it is), specifically as it pertains to the mapping of Canadian postal codes. I just never bothered to ask because I use the Google Earth tool for horsing around with, nothing too serious.

    For example, L4A 7X4 (my old postal code from about 7 years ago, but still valid) generates a map about 10 km north of where it actually is (if you scroll down on the map to where Lazy Lake/Island Lake are, that’s the correct area.)

    It would be cool to report this stuff, because I’ve been looking for a replacement for Mapquest/Mapblast for generating customer maps, and I usually do it by postal code because putting in a street address can put me as much as 20 minutes off where I’m supposed to be going.

  39. Nice talking with you yesterday Matt and thanks for answering that question about client Sitemap accounts.

  40. Adam Senour:

    What’s with the random links at the side of the page?

  41. Matt, thx for the suggestion of adding the sitemap – that’s a good solution that isn’t as space-constrained as a typical sidebar menu might be (in terms of length of anchor text).

    We’ll remove the cloaking on the anchor text on our links.

    Next question: does having related words NEAR the anchor text accomplish the same thing as having the words IN the anchor text as far as determining the content of the linked-TO page?

    Examples: contextual nav menus like the following:

    Hawaii > Maui > Activities > Snorkeling


    – snorkeling
    – scuba
    – surfing

    Matt, I think we all realize that there’s a delicate balance between letting evil spam monsters find out things that let them “work the system” and sharing info that helps the only semi-evil among us to build website links that work for both humans and bots…anything you can share or advise is always appreciated. – MC

  42. I wonder if Matt knows about Yahoo wanting his Mom’s URL to her blog. Read about it here

  43. One other thing, if the counter isn’t invisible, is this kind of promotion legal? Can I make a counter that people use and change the anchor text to target words unrelated to the counter and the theme of the webpage that the counter is placed on? Cause if so I will have a free counter available by tomorrow 🙂 Your comments would be much appreciated. Thanks man.

  44. Good to meet you today Matt. Thanks for making yourself so accessible at these conferences. I know it’s appreciated by everyone.

  45. Sorry, I’ll disagree here. I’ll serve the bots what they need and people what they need. For example, if I block the images directory with my robots file, there’s no need to serve the images up to any of the bots. Why is it important for the search indexing bots to know about AdSense/Chitika/YPN? They don’t benefit from them in any way. If you’ve got comment links and trackback links with nofollow tags, why bother to show them to the engines? They’re not following them anyway, right?

  46. Graywolf… while you’re free to do so, I wouldn’t advise disagreeing with the Google webmaster guidelines.

  47. Graywolf,

    The people politely ask that you feed the bots the same thing as you feed the people, because they trust the search engines to provide the whole relevancy thing more than they trust you to do it on everyone’s behalf. Count me as one of those people.

  48. Adam Senour:

    What’s with the random links at the side of the page?

    rob: there’s a difference between contextually served Google ads and 5-6 links which are generated at random with no explanation, bearing no actual relevance to the topic of the site.

    I trust you’ll see the folly of your statement and stop now.

  49. Matt,

    If you really want to host a few audio files, let me know. We’re putting some ‘spensive equipment up to make for high availability and would be happy to host some files for you.

    Cheers, and I hope to see you in Omaha come November.

  50. No offense meant to Google or the other search engines, but as the creator of that content I think I’m the best qualified person to decide what is the most relevant part. By removing the “less relevant parts” I am actually helping Google understand what the page is about.

  51. >I wouldnt advise disagreeing…

    No sense arguing with the cuttlets, Graywolf. They are the new self-appointed arbitrators of spam as it relates to Google guidelines. Their motto: “Learn it, love it, live it.”

  52. Good morning Matt

    Would you be kind enough to elaborate on what tedster said on WMW:
    “Matt Cutts mentioned that some new things were going to be worked on at — and also at”


  53. Danny, thanks for the info–good to know. Sorry that I got that wrong.

    Eric, I talked about mediabot more today and even made a couple PowerPoint slides. I may post about this more when I get back from WMW, but: pages with AdSense will not be indexed more frequently. It’s literally just a crawl cache, so if e.g. our news crawl fetched a page and then Googlebot wanted the same page, we’d retrieve the page from the crawl cache. But there’s no boost at all in rankings if you’re in AdSense or Google News. You don’t get any more pages crawled either.
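    In other words, the behavior is roughly this (a toy sketch I’m using to illustrate the idea, not actual Google code):

```python
# Toy sketch of a shared crawl cache: whichever crawler fetches a URL
# first stores the page, and later crawlers reuse that stored copy
# instead of re-fetching. This reduces duplicate fetch traffic; it has
# nothing to do with rankings or how many pages get crawled.
crawl_cache = {}
fetch_log = []  # records real fetches so cache hits are visible

def fetch(url):
    """Stand-in for an actual HTTP fetch."""
    fetch_log.append(url)
    return f"<html>content of {url}</html>"

def crawl(bot_name, url):
    if url not in crawl_cache:
        crawl_cache[url] = fetch(url)  # first bot to ask does the fetch
    return crawl_cache[url]            # later bots get the cached copy

crawl("mediabot", "http://example.com/page")
crawl("Googlebot", "http://example.com/page")  # served from the cache
assert fetch_log == ["http://example.com/page"]  # only one real fetch
```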

  54. Thanks, Matt, that’s what I was assuming it was — “AdSense pull” and not “AdSense push” (see my blog for the diagrams showing what I mean). Hopefully this will cause the fuss to die down now….

  55. Thanks Matt for the nice coverage. Perhaps you would like to help out SER with the next SES coverage? :p

    Just to get things nice and clear, are you saying that Googlebot can tell if unique content is being delivered to any of its IPs during a crawl? I understand if this would be considered too much to elaborate on. I also hope that you weren’t trying to elicit additional information from Mike through the use of any red liquids…

    Any chance you could be a little more specific about Daron’s “chiclet inflation” warning? Does this mean that automated activities (bots?) may click on those features, and if so, what is the problem with that? This is what it seems like to me, given Daron’s briefly described solution.

  56. Matt

    How about sending Googlebot to a different server so it can crawl smoothly, without putting pressure on the servers that users see?

    I am talking about serving exactly the same content here. This is cloaking; is it bad though?

    Many thanks

  57. Great post Matt, the WiFi is just great, am loving it.

  58. Boston has nice flowers. Your comment box functionality is great. Nice site design too.

Hi Matt,

    I have one question on mediabot. I have not allowed any of the crawlers in my robots.txt file, so will mediabot still crawl my pages?

    I am also going to update my site, and I don’t want to allow crawlers to crawl .php pages. So if I do:

    User-agent: Googlebot
    Disallow: *.php$

    will mediabot crawl those pages or not, since the two bots are different?


  60. Hi, I just wondered if AdSense could improve our site’s poor indexing, and found Matt’s answer above: “But there’s no boost at all in rankings if you’re in AdSense or Google News. You don’t get any more pages crawled either.” So I quit the idea of getting AdSense on our site. Thanks.