SES NYC 2006, Day 4

It was snowing today in NYC. Pretty cool.

I stayed up late getting my PowerPoint ready for Meet the Crawlers. That session is usually introductory, so I wanted to mix things up by showing some examples from Analytics and Sitemaps. Of course, if I had *read* the speaker prep, I would have seen that Danny was thinking the same thing, and therefore wanted to change the session to be all Q&A. Maybe I’ll post the PowerPoint anyway at some point, just in case people are interested. I’ll ask PR about that.

Doing it as Q&A was great, because people got to ask exactly what they wanted. Lee Odden covered the session here. I’m the one in the bright orange T-shirt. I packed nice outfits (slacks and button-down shirts), but after a couple days I remembered that no one cares a whit about what the search engine reps look like; they only care about what they have to say.

I just got off my six-hour plane ride and I’m tired, so I’ll leave out my meta-commentary on the write-up.

The most interesting thing I saw today was on the plane ride back. I had a flight with Song, which has DishTV/movies/music/games for each individual seat. The fellow next to me somehow crashed his display, so I got to see it reboot. It was running Linux! I could tell from the cute little Tux logo + the distinctive log messages scrolling by. I’m sure that’s common knowledge, but it was fun for me to notice. Also, at one of the flight attendant stations there was a port labeled “Gigabit Ethernet.” Neat.

Scoble snuck in a Channel9 guy and got someone to snap a photo, and I saw the picture for the first time today. What the picture doesn’t show is that Scoble let me keep the Channel9 guy. I feel some mischief may be coming over me. 🙂

Let’s see, what else. Someone came up who had gotten one of our email alerts about issues with their site. They told me what they’d changed and just wanted to see whether the changes were sufficient (they were). It was nice to see that program having a positive impact.

And now, je suis profoundly tired, and I think I have a dentist appointment at 9 a.m. tomorrow. I’ll probably rest up over the weekend and then come in on Monday and work with someone on going over all the feedback we got. Thanks again to everyone who came up and said hello, asked a question, or gave suggestions. It was really nice talking to everyone.

27 Responses to SES NYC 2006, Day 4

  1. Good morning Matt

    How old is that scruffy little goatee of yours? It looks like you have copied mine, which was “created” in 1990.

    Are we talking copyright infringement here? 🙂

    Sleep well

  2. Hi Matt!

    Many pages have been lost on BigDaddy in the last few days – every page except the main one – and many seem to have gone into the old Supplementals.

    Please look at this thread:

    http://www.webmasterworld.com/forum30/33351.htm

    Is it a penalty for these pages, or a problem?

    Thank you for your feedback!

  3. Yes, Matt.

    Please – feedback is required on that thread.

    Cheers

    Stephen

  4. Matt, if you come to the SES in Germany, I hope you leave your “nice outfits (slacks and button-down shirts)” at home. It might help loosen this country’s stuffy dress code!

  5. Matt

    The above two posters are not kidding. There appears to be a major, widespread problem, and it is creating a lot of disgruntled webmasters. People are reporting downward traffic trends because of the problem, which translates into lost sales, sleepless nights, stressed-out, caffeine-embalmed bodies, and angry, conspiracy-ridden webmasters.

  6. Hey Matt, it was nice to see your departure from the blue-oxford-and-chinos uniform!

    Meet the Crawlers is one of my favorite sessions, and I think it’s impressive how you maintain your composure after getting the same questions over and over at every conference.

  7. Google: Matt shows his Google Sitemaps data using the new version of Sitemaps. For some reason Matt’s blog is #2 for a phrase like free porn on Google Local. Shows a variety of information on his blog.

    It must be tough having a blog with high PR and trust when you start showing up for the phrase “free porn”.

    Matt is currently #9 for “SEO” – be scared, be VERY scared, muhaha! 😉

  8. I concur with Armi, Stephen and Ledfish. ANY new solid information on BigDaddy, indexing problems, or supplemental results would be a great service to the hundreds (thousands?) of webmasters affected by this.

  9. Good to know I am not the only one with a BigDaddy issue. I heard a lot of talk about this very issue at SES. I had a chance to speak with Matt about this a bit, and he said he is going to look into it.

    I have to give Danny props for coining the funniest term of the conference: “Matt Cutts and the Cuttlets,” referring to the army of people who follow him around wherever he goes at the evening forum.

    A good time was had by all in New York, but I am very happy to be back in the 80-degree weather in Phoenix! I gotta bounce :). Going to catch a spring training game!!

  10. Matt,

    When you get a chance, look into this puppy:

    http://www.katedrala.cz

    Sure looks like you have indexed an open proxy:

    http://www.google.com/search?num=100&hl=en&lr=&newwindow=1&c2coff=1&q=site%3Akatedrala.cz+inurl%3Aanonym&btnG=Search

    I can’t look at the cache or click on the links and get any response, but you should have the HTML that goes with those 9,370 pages (I expect maybe the home pages and other ranking pages of several thousand websites).

  11. Whoa, give the guy a break. This thread isn’t about ‘hey Matt, check out this query I found’. I don’t see this free porn query either, for that matter – unless there’s a backlink spam exercise about. Also, I’m sure Matt needs some sleep before *shiver* a dental visit. Poor guy. But yeah:

    Matt: powerpoint.zip! Supplemental results!

    Feed us dammit! 😉

  12. Dude, if you were up until 3 a.m. and then in sessions at 8, you probably added another meaning to “meet the crawlers”.

  13. RE: Armi, Stephen, Ledfish above. No lie, there is some big supplemental problem. mattcutts.com is the only communication channel to the inside of the Borg – sort of like an intergalactic homing beam to the Klingons to try to reason with them.
    If I had a teleport machine I would beam myself into the SERPs engine room and say, “Computer, please fix the supplementals.” Then I’d go up to the bridge and tell Mr. Sulu: warp speed 10. Finally Spock (Matt Cutts) would look at the algorithm and tell the captain, “Sir, this will never happen again. There must be peace and harmony in our galaxy.”

  14. I had a supplemental problem in the first place – but traffic was increasing, as were the rankings. Now the site seems to have fallen completely off the SERPs…

    I’m with you guys, Armi, Stephen, Ledfish & MaxD–I just checked the results at 64.233.161.105, and the entire site (except the homepage) seems to have gone supplemental! This is the thread, Matt, and without some sort of answer we’ll be in some serious trouble. Please offer some guidance here!

    I just made a post on another one of your threads about how great it was to see that you would be able to help me fix the index after 100 old Traffic Power pages were all in the supplementals, relegating our legit pages of quality & unique content to the “omitted results”. Instead, now the entire site went into the supplementals?!? What on Earth could we have done wrong?

    Thanks in advance–

  15. OK, I’ve done the WMW threadwatching. There is a great quote from a Senior Member – ANY feedback would be seriously huge…

    The key thing to remember is that BigDaddy is being deployed to resolve canonical problems.

    I suspect sites are going supplemental down to the home page to identify the canonical domain. Then the site is re-spidered and rebuilt in Google’s index using “new infrastructure,” revised code that better handles canonical issues.

    Could it be that for some reason only my site is getting hit in my industry space? Meanwhile the Scientologists with their sites are blog-spamming to get themselves on top – which happens to be working.

    Sheesh, what’s a guy supposed to do?

  16. I found this comment very interesting:

    “Google: Every search engine crawls in a different way. Mentions vs indexed. There are instances where we know about the url, but we did not crawl it. Your site may not have enough PageRank for us to do a deep crawl.”

    “Your site may not have enough PageRank for us to do a deep crawl.”

    That’s for all you PageRank hunters out there.

    😉

  17. Glad you enjoyed SES, and I’m genuinely sorry I wasn’t there – it would be interesting to finally meet the Princess Bride Guy one day. 🙂

    Anyway, what’s really interesting about the supplemental index issue is how Google is claiming to have indexed completely non-existent URLs.

    Something seems to be stripping and then reinventing URLs for pages that have never existed. I reported it on Threadwatch; hope it helps.

    In the meantime, sounds like you have some junior engineers who are making your job more difficult. More sleepless nights to come?

  18. Matt,

    Please give us some feedback on whether there is a problem with all the sites that went completely supplemental except for their homepages.

    I’d love to know whether this is a positive thing or whether we all need to go back to the drawing board.

  19. Matt, many fans in China keep asking about the possibility of you presenting at the Search Engine Strategies conference in Nanjing, China, which will be held March 17th–18th. It is such a rush after the New York conference, but we are all looking forward to having you here with us.
    I would be glad to assist you with more information about the Nanjing event and Chinese issues.

  20. From what I see on WMW, Google knows about it now. So Robin Kay, and anyone else dealing with supplemental issues, I would refer you to this thread:

    http://www.webmasterworld.com/forum30/33351-16-10.htm

    Appreciate the communication from Google.

  21. Hi Matt,

    I have done a major ‘overhaul’ on my site with a brand new design. I have a couple of questions about this:

    1. I’ve submitted a new Sitemap to Google, which it seems to have indexed, but it’s not updating on all DCs. I’m just wondering how long it actually takes for all DCs to update and for Googlebot to stop visiting the previous links from the old Sitemap?

    2. I was running a script for a marketplace I have, which was no doubt crap, so I have created my own HTML pages for it. I’m just wondering how to stop Googlebot from visiting the old links served by the CGI script – do I just have to add it to the robots.txt file? (See the robots.txt sketch after the comments.)

    Cheers,

    Luke

  22. George Chaney

    Enjoyed reading about SES – get some rest, as you’re most likely gonna have a lonnnnnnng next week.

    As for this stuff: http://www.webmasterworld.com/forum30/33351-16-10.htm

    The old saying “Don’t Panic” comes to mind for those doing nothing wrong. These things work themselves out, typically sooner rather than later. If Google’s aware, as many have made them, it’s gonna get taken care of pretty soon, I’m sure.

    Have a great rest of the weekend (holiday Monday for us down in my neck of the woods)!

    Cheerios

  23. The ppt would be cool to look over. Lai-tro!!

  24. Google is more than scaring me. My business started almost exactly a year ago (2/05). Sales quickly skyrocketed and I had to hire three people. Most of my business was coming from Google, which I was very thankful for. Since the October update I have had to let go all three employees, as visitors/customers have dropped off the map.

    Our site is a nice site that does not try to use any spamming techniques. It’s just a true-blue site. I’m very confused as to how it could have been getting so many hits and then all of a sudden just drop off the map. It has greatly impacted my income and other people’s lives/jobs.

    I’m crossing my fingers and saying my prayers that things turn around and good honest businesses/sites can rise to the top again.

    Take care all,
    Scott

  25. Hi Matt,

    What is Google’s opinion of using the Google API for automated search engine ranking tracking?

    There is a very nice tool, but I worry that it conflicts with the Google TOS:

    http://forums.digitalpoint.com/showthread.php?p=681136&posted=1

    If you can help, that’d be great.

    Thanks,

    Ben

  26. Hi Matt,

    In his piece on duplicate content from SES NY, David Utter said to use robots.txt to stop spiders from visiting specific landing pages.

    Given that http://www.oursite.com/?referrer=google (which is an exact duplicate of http://www.oursite.com, and has therefore long since been 301’d to http://www.oursite.com) is still stuck in the supplementals with a cache from January 2005 – could we just add /?referrer=google to our robots.txt to resolve the issue? (See the robots.txt sketch after the comments.)

    Or:
    1) Given the 301 that’s in place (even without the 301, the /? in the string automatically defaults http://www.oursite.com/?referrer=google to http://www.oursite.com), would this block you from http://www.oursite.com as well?
    2) And once the robots.txt has this added, do we still need to wait for the “supplemental bot” to pick it up?

    NOTE: Our full URL is in my email address.

    Thanks as always!

  27. Hey Matt,

    I missed Thursday’s session due to weather/car issues. This is what I get for living near a conference and having to go home every night.
    Any way I could see the PowerPoint (even though I missed the session)?
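
A minimal robots.txt sketch for the old-CGI-links question in comment 21, assuming (hypothetically) that the old marketplace script lives at /cgi-bin/marketplace.cgi – the actual path isn’t given above:

    # Hypothetical path; adjust to wherever the old script actually lives.
    User-agent: Googlebot
    Disallow: /cgi-bin/marketplace.cgi

A robots.txt rule only stops crawling, though; if the new HTML pages live at different URLs, a 301 redirect from each old CGI URL to its replacement is usually the cleaner way to move Googlebot over.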
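
And a sketch for the robots.txt question in comment 26, using the commenter’s example URLs. Disallow values are treated as prefix matches against the path plus query string, so this blocks the tracking URL without touching the bare homepage:

    # Blocks /?referrer=google (and any other /?referrer=... value),
    # but not / itself, since "/" does not start with "/?referrer=".
    User-agent: *
    Disallow: /?referrer=

That said, with the 301 already in place, a robots.txt block would also keep crawlers from re-fetching the old URL and seeing the redirect, so it may be simpler to let the 301 do its work.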
