I almost made it. I got through one day of a three day conference, and I was still blogging, caught up on email, and I’d checked my RSS feeds. Then on the second day, I spoke on three panels, stayed up talking SEO until 3:30, and it all crashed down. Now 80-90 emails sit unread in my inbox, and I’m behind on everything else too.
During the conference, I was talking to Vanessa Fox from the Sitemaps team. You probably know her from the Sitemaps blog, and she was also at WMW Boston. It turns out that she took lots of notes at the organic site review panel.
“Would you like to do a guest post on my blog?” I asked. “Sure, why not?” Vanessa replied. That’s cool, because my summary of that panel would have been something like:
The last time I did this panel, SEOs realized how much things like paid links could stick out like a sore thumb. In a different panel at WMW Boston, Rae Hoffman illustrated that other SEOs could easily see paid links using open tools like Yahoo’s Site Explorer, so it doesn’t even require the special tools that a search engine has. On the bright side, every site in the organic site review panel looked white-hat and had serious questions; paid links didn’t come up for discussion once during the panel.
without going into the detail of all we talked about. So I’m glad Vanessa is willing to cover it in more detail. Without further ado, here are Vanessa’s notes on that session:
I’ve been having a great time here at Pubcon Boston, talking to webmasters, getting feedback, and learning about what they’d most like from Google. I sat in on the Organic Site Reviews session, both because Google Sitemaps is a site review tool from a different perspective and because I wanted to make funny faces at Matt while he was talking.
I normally blog for Google Sitemaps, but Matt asked me if I wanted to do a guest post here (probably to keep me too busy paying attention to make funny faces at him).
The strongest point I got from the session (and I knew it already, but it became so apparent) is that you don’t need any special tools or secret knowledge to evaluate your site. Search engines want to return relevant and useful results for searchers. All you really need to do is look at your site through the eyes of your audience. What do they see when they get to your site? Can they easily find what they’re looking for? Webmasters are looking for some other secret key, but really, that’s all there is to it.
The panelists (Matt, Tim Mayer from Yahoo, Thomas Bindl from ThomasBindl.com and Bruce Clay from Bruce Clay, Inc.) looked over the sites that the audience asked about and offered up advice.
Subscription-based sites

Googlebot and other search engine bots can only crawl the free portions that non-subscribed users can access. So, make sure that the free section includes meaty content that offers value. If the article is about African elephants and only one paragraph is available in the free section, make sure that paragraph is about African elephants, and not an introductory section that talks about sweeping plains and brilliant sunsets. If it’s the latter, the article is likely to be returned in results for, oh say, [sweeping plains] and [brilliant sunsets] rather than [African elephants].
And compare your free section to the information offered by other sites that are ranking more highly for the keywords you care about. If your one free paragraph doesn’t compare to the content they provide, it only makes sense that those sites could be seen as more useful and relevant.
You could make the entire article available for free to users who access it from external links and then require login for any additional articles. That would enable search engine crawlers to crawl the entire article, which would help users find it more easily. And visitors to your site could read the entire article and see first-hand how useful your articles are, which would make a subscription to your site more compelling.
You could also structure your site in such a way that the value available to subscribers is easily apparent. The free content could provide several useful paragraphs on African elephants. Then, rather than one link that says something like “subscribe to read more”, you could list several links to the specific subcategories available in the article, as well as links to all related articles. You could provide some for free and some behind a subscription (and make the distinction between the two obvious).
African elephants — topics available once you register:
Social patterns ($)
Wildlife refuges ($)
History of elephants ($)
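A linking structure like that might be sketched in HTML along these lines (the URLs and page structure here are hypothetical, just to illustrate the idea):

```html
<!-- Free content: several substantive paragraphs on African elephants -->
<h1>African Elephants</h1>
<p>African elephants are the largest living land animals ...</p>

<!-- Instead of a single "subscribe to read more" link, list the
     specific subscriber-only sections, so both visitors and crawlers
     can see what "more" actually means. -->
<h2>Topics available once you register</h2>
<ul>
  <li><a href="/elephants/social-patterns">Social patterns</a> ($)</li>
  <li><a href="/elephants/wildlife-refuges">Wildlife refuges</a> ($)</li>
  <li><a href="/elephants/history">History of elephants</a> ($)</li>
</ul>
```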
Sure, having all those keywords on the page might help the page return in results for those keywords, but it’s not just about search engines. Visitors to your site have a much better idea of what’s available after registration with a linking structure like that than with “subscribe to read more”. Visitors want to know what “more” means. Ultimately, you care about users, not search engines. You just want the search engines to let the users know about your site. You want to make users happy once they do know about it.
Flash

It’s not only Googlebot who doesn’t watch a 20-second video load before the home page comes into view. A lot of users don’t either. Some users don’t want to wait that long; other users don’t have Flash installed. If all of your content and menus are in Flash, search engines may have a harder time following the links. If you feel strongly about using Flash, just make an HTML version of the page available as well. The search engine bots will thank you. Your users will thank you. Feel free to block the Flash version from the crawlers with a robots.txt file, since you don’t need your pages indexed twice. If your home page is Flash, put the navigation outside of the Flash content. You could offer a choice on the home page so users can choose either the HTML version or the Flash version of the site. (You might be surprised at what users choose.)
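For example, if the Flash version of the site lived under a directory like /flash/ (a hypothetical path), blocking it from crawlers takes just two lines in robots.txt:

```
User-agent: *
Disallow: /flash/
```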
Images as text and navigation
What is true for Flash is also true of images. Many users have images turned off. Try viewing your site with images turned off in your browser. Can you still see all the content and links?
Sites in general
You know what your site’s about, so it may seem completely obvious to you when you look at your home page. But ask someone else to take a look and don’t tell them anything about the site. What do they think your site is about?
Consider this text:
“We have hundreds of workshops and classes available. You can choose the workshop that is right for you. Spend an hour or a week in our relaxing facility.”
Will this site show up for searches for [cooking classes] or [wine tasting workshops] or even [classes in Seattle]?
It may not be as obvious to visitors (and search engine bots) what your page is about as you think.
Along those same lines, does your content use words that people are searching for? Does your site text say “check out our homes for sale” when people are searching for [real estate in Boston]?
Next, consider this page name:
It doesn’t take a special tool to know that the URL isn’t user-friendly. Compare it to:
But you can have too much of a good thing. It also doesn’t take a special tool to know that this page name isn’t user-friendly:
And speaking of putting a dash in URLs, hyphens are often better than underscores [Ed. Note: bolded by Matt 🙂 ]. african-elephants.html is seen as two words: “african” and “elephants”. african_elephants.html is seen as one word: african_elephants. It’s doubtful many people will be searching for that.
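This isn’t Google’s actual tokenizer, but a rough sketch of why the distinction exists: most word-splitting routines (including the regex `\w` character class) treat an underscore as part of a word, while a hyphen acts as a separator:

```python
import re

def words(url_slug):
    # \w matches letters, digits, AND the underscore, so a regex-based
    # word splitter keeps "african_elephants" glued together, while a
    # hyphen falls between matches and splits the words apart.
    return re.findall(r"\w+", url_slug)

print(words("african-elephants.html"))  # ['african', 'elephants', 'html']
print(words("african_elephants.html"))  # ['african_elephants', 'html']
```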
* Don’t break your content up too much. Users don’t like to continuously click to get the content they’re looking for. Search engines can better determine what your site is about when a lot of key content is on one page. When the content is broken up too much, it’s not as easily searchable. FAQ pages don’t have to be 15 different tiny pages; often one page with 15 questions on it is better for users and search engines.
* Make sure your content is unique. Give a searcher a reason to want to click your site in the results.
* Make sure each page has a descriptive <title> tag and headings. The title of a page isn’t all that useful if every page has the same one.
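For instance (a made-up page, just to illustrate the point): a generic title repeated across the whole site tells searchers nothing, while a page-specific one describes what that page is actually about.

```html
<!-- Too generic: the same title on every page isn't useful -->
<title>Welcome to our site</title>

<!-- Better: describes this particular page's content -->
<title>African Elephants: Habitat, Diet, and Social Patterns</title>
```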
* Minimize the number of redirects and URL parameters [Ed. Note: I’d keep it to 1-2 parameters if possible]. And don’t use “&id=” in the URL for anything other than a session ID. Since it generally is a session ID, we treat it as such and usually don’t include those URLs in the index.
Google isn’t secretive about these tips (http://www.google.com/webmasters/guidelines.html). And the panelists in the session reviewed the sites using tools readily available. They looked at the sites, read through them, and clicked around. No magic needed.