2006 Pubcon in Vegas: Thursday site reviews

I woke up early on Thursday and was at the convention center by 8am to check on my backpack and laptop. It was still tucked away under a table, untouched. Whew! I hunkered down in the speaker room and started working on my slides. (I hate seeing the same presentations over and over again at conferences, so I always try to make a new presentation for each show.)

By 9am, I was still really behind, so I decided to skip Danny’s keynote and kept chugging. I missed a few other sessions on Thursday, but I figured it was worth it to be prepared.

Around 1:30, I sat on a site review panel that Brett had signed me up for, alongside Greg Boser, Tim Mayer, Danny Sullivan, and Todd Friesen, with Jake Baillie moderating. The idea behind a site review panel is that people volunteer their sites and get advice on how to do things better. We discussed a promotional gifts company, the Dollar Stretcher site, a real estate company in Las Vegas, a chiropractic doctor, a real estate licensing company, a computer peripheral site, a hi-fi sound store, and a day spa in Arizona. In his PubCon recap, Barry said I ripped on sites. For most of the sites I tried to give positive advice, but I wasn’t afraid to tell people about potential problems.

Once again, I sat on the end and had my wireless and VPN working so that I could use all of my Google tools. The promotional gifts company had a couple of issues. For one thing, I was immediately able to find 20+ other sites that also belonged to the promotional gifts person. The other sites offered overlapping content and overlapping pages on different urls. The larger issue was that searching for a few words from a description quickly turned up dozens of other sites with the exact same descriptions. We discussed the difficulty of adding value to feeds when you’re running lots of sites. One thing to do is to find ways to incorporate user feedback (forums, reviews, etc.). The wrong thing to do is to try to add a few extra sentences or to scramble a few words or bullet points trying to avoid duplicate content detection. If I can spot duplicate content in a minute with a search, Google has time to do more in-depth duplicate detection in its index.
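
For the technically inclined, here’s a toy sketch of how easy that kind of check is to automate. To be clear, this is a hypothetical illustration for this write-up, not Google’s duplicate detection; the descriptions and function names are made up.

```python
# Toy near-duplicate check: compare the k-word runs ("shingles") that two
# product descriptions share. High overlap suggests copied feed content.
# This is a hypothetical sketch, not Google's algorithm.

def shingles(text, k=5):
    """Return the set of k-word runs in the text, lowercased."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(max(0, len(words) - k + 1))}

def overlap(a, b, k=5):
    """Jaccard overlap of the two texts' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a, k), shingles(b, k)
    return len(sa & sb) / max(1, len(sa | sb))

desc_a = "This ergonomic stress ball is perfect for trade shows and giveaways"
desc_b = "This ergonomic stress ball is perfect for trade shows and events"
print(f"shingle overlap: {overlap(desc_a, desc_b):.0%}")  # high overlap = likely duplicates
```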

Next up was the Dollar Stretcher. I’d actually heard of this site before (stretcher.com), and everything I checked off-page and on-page looked fine. The site owner said that Google was doing fine on this site, but Yahoo! didn’t seem to like it. Greg Boser managed to find a sitemap-type page that listed hundreds of articles, but in my opinion it only looked bad because the site has been live since 1996 and they had tons of original articles that they had written over the years. So how should a webmaster make a sitemap on their site when they have hundreds of articles? My advice would be to break the sitemap up across several pages. Instead of hundreds of links all on one page, you could organize your articles chronologically (each year could be a page), alphabetically, or by topic. Danny and Todd noticed a mismatch between uppercase url titles on the live pages and lowercase url titles according to Yahoo!’s Site Explorer, and a few folks started to wonder if cloaking was going on. The site was definitely hitting a “this is a legit site” chord for me, and I didn’t think they were cloaking, so I checked with a quick wget and also told Googlebot to fetch the page. It all checked out–no cloaking going on to Google. I gently tried to suggest that it might be a Site Explorer issue, which a few people took as a diss. No diss was intended to Site Explorer; I think it’s a fine way to explore urls; I just don’t think stretcher.com was trying to pull any tricks with lowercasing their titles to search engines (not that cloaking to lowercase a title would help a site anyway).
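
For the curious, the user-agent check I did by hand boils down to something like the sketch below. The URL is a placeholder, and this only catches user-agent cloaking; a sophisticated cloaker would key off crawler IP ranges instead, so treat this as a first-pass check rather than proof either way.

```python
# Fetch the same page as a generic client (like wget) and as a Googlebot-like
# client, then compare the responses. Identical bytes = no user-agent cloaking
# on this fetch. Uses only the Python standard library.
import urllib.request

def fetch(url, user_agent):
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req) as resp:
        return resp.read()

url = "http://www.example.com/"
as_wget = fetch(url, "Wget/1.10")
as_googlebot = fetch(url, "Mozilla/5.0 (compatible; Googlebot/2.1; "
                          "+http://www.google.com/bot.html)")

if as_wget == as_googlebot:
    print("Same response either way; no user-agent cloaking on this fetch.")
else:
    print("Responses differ; could be cloaking, or just rotating ads/session IDs.")
```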

Holy crawp, this is taking forever to write. Let me kick it into high gear. The Las Vegas real estate site was functional (~100-ish pages about different projects, plus ten about-us/contact sort of pages), but it was also pretty close to brochureware. There was nothing compelling or exciting about the site. I recommended looking for good ways to attract links: surveys, articles about the crazy construction levels in Vegas, contests–basically just looking at ways to create a little buzz, as opposed to standard corporate brochureware sites. Linkbait doesn’t have to be sneaky or cunning; great content can be linkbait as well, if you let people know about it.

The chiropractor site looked good. Danny Sullivan made the good point that they wanted to show up for a keyword phrase, but that phrase didn’t occur on the site’s home page. Think about what users will type (and what you want to rank for), and make sure to use those words on your page. The site owner was also using Comic Sans, which is a font that a few people hate. I recommended something more conservative for a medical site. Greg Boser suggested getting to know local medical associations and similar community organizations. I recommended local newspapers, and gave the example that when my Mom shows up with a prepared article for her small hometown newspaper about a charity event, they’re usually happy to run it or something close to it. Don’t neglect local resources when you’re trying to promote your site.

My favorite moment came with the real estate licensing site: in less than a minute, I was able to find 50+ other domains that this person had–everything from learning Spanish to military training. So I got to say “Let us be frank, you and I: how many sites do you have?” He paused for a while, then said “a handful.” After I ran through several of his sites, he agreed that he had quite a few. My quick take is that if you’re running 50 or 100 domains yourself, you’re fundamentally different from the chiropractor with his one site: with that many domains, each domain doesn’t always get as much loving attention, and that can really show. Ask yourself how many domains you have, and whether it’s so many that lots of them end up a bit cookie-cutter-like.

Several times during the session, it was readily apparent that someone had tried to do reciprocal links as a “quick hit” to increase their link popularity. When I saw that in the backlinks, I tried to communicate that 1) it was immediately obvious to me, and therefore our algorithms can do a pretty good job of spotting excessive reciprocal links, and 2) in the instances that I looked at, the reciprocal links weren’t doing any good. I urged folks to spend more time looking for ways to make a compelling site that attracts viral buzz or word of mouth. Compelling sites that are well-marketed attract editorially chosen links, which tend to help a site more.

The computer peripheral site had a few issues, but it was a solid site. They had genuine links from e.g. Lexar listing them as a place to buy their memory cards. When you’re a well-known site like that, it’s worth trying to find even more manufacturers whose products you sell. Links from computer part makers would be pretty good links, for example. The peripheral site had urls like /i-Linksys-WRT54G-Wireless-G-54Mbps-Broadband-Router-4-Port-10100Mbps-Switch-54Mbps-80211G-Access-Point-519, which looks kinda cruddy. Instead of using the first 14-15 words of the description, the panel recommended truncating the keywords in the url to ~4-5 words. The site also had session ID stuff like “sid=te8is439m75w6mp” that I recommended dropping if they could. The site also had product categories, but the urls were like “/s-subcat-NETWORK~.html”. Personally, I think having “/network/” and then having the networking products in that subdirectory is a little cleaner.
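
To illustrate the kind of cleanup the panel had in mind, here’s a small sketch. The helper names are made up for this example; the slug and session ID are the ones from the site above.

```python
# Sketch of the two URL fixes discussed: shorten a long keyword slug to its
# first few words, and drop sid=... session parameters so crawlers see one
# stable URL per page. Helper names are hypothetical.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def shorten_slug(slug, max_words=5):
    """Keep only the first few hyphen-separated words of a slug."""
    return "-".join(slug.strip("/").split("-")[:max_words])

def strip_session_id(url):
    """Remove the sid parameter while leaving the rest of the query intact."""
    parts = urlsplit(url)
    query = [(k, v) for k, v in parse_qsl(parts.query) if k != "sid"]
    return urlunsplit(parts._replace(query=urlencode(query)))

print(shorten_slug("/i-Linksys-WRT54G-Wireless-G-54Mbps-Broadband-Router-519"))
# -> i-Linksys-WRT54G-Wireless-G
print(strip_session_id("/store/item?sid=te8is439m75w6mp&item=42"))
# -> /store/item?item=42
```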

The HiFi store was fine, but this was another example where someone had 40+ other sites. Having lots of sites isn’t bad, but I’ve mentioned the risk that not all the sites get as much attention as they should. In this case, 1-2 of the sites were stuff like cheap-cheap-(something-related-to-telephone-calling).com. Rather than any real content, most of the pages were pay-per-click (PPC) parked pages, and when I checked the whois on them, they all had “whois privacy protection service” on them. That’s relatively unusual. Having lots of sites isn’t automatically bad, and having PPC sites isn’t automatically bad, and having whois privacy turned on isn’t automatically bad, but once you get several of these factors all together, you’re often talking about a very different type of webmaster than the fellow who just has a single site or so.

Closing out on a fun note, the day spa was done by someone who was definitely a novice. The site owner seemed to have trouble accessing all the pages on the site, so she had loaded a ton of content onto the main page. But it was a real site, and it still ranked well at Yahoo! and Google. The site owner said that she’d been trying to find a good SEO for ages, so Todd guilted the audience and said “Will someone volunteer to help out this nice person for free?” A whole mess of people raised their hand–good evidence that SEOs have big hearts and that this site owner picked the right conference to ask for help. 🙂

Okay, that’s a rundown of SEO feedback on some real-world sites, and this post is like a gajillion words long, so I’ll stop now and save more write-ups for later.

97 Responses to 2006 Pubcon in Vegas: Thursday site reviews

  1. Thanks for the straight-dope approach on this one, some great information here. I’ve already quoted you twice on other forums. (with link credit of course)

  2. Hi Matt,
    Really enjoyed the PubCon and your comments and presentation. I asked you a question about URL listing in Google Local results. Just reminding you about it, and I will be happy to give the information again.
    Thank you very much.
    Keep up the good work,
    Suresh

  3. Hey Matt,

    Thanks for the post. For me, this explains a lot of the confusion that has come about re the multiple domain issue. A couple key things I note in this post are that having multiple domains in and of itself is not a bad thing and there will not be a “Google penalty” applied. However, you realistically can’t expect to give the necessary attention to all the sites to make them rank well. Unless you forgo sleep or have a team of content writers and SEOs at your disposal, your non-primary sites will suffer due to lack of proper development.

    Secondly, in a later paragraph you comment on the fact that when a lot of the secondary sites are privacy protected, MFA, and PPC, that does raise a red flag that maybe the webmaster is doing more than just trying to build quality websites for public use. Thus, maybe Google does become suspicious and possibly dings the sites across the board for that webmaster.

    Third, you mentioned that you knew that this person owned 40+ sites, and afterwards when you clicked on the WhoIs info that they were all privacy protected. I think you just let it slip that Google has access to all the WhoIs info regardless of what is protected from public domain. That’s a pretty big statement to make and will probably make a lot of people nervous.

    Thanks again for your comments!

    Hey – did it ever occur to you that you are like Alan Greenspan, where even your slightest comments will be analyzed and re-interpreted to death so that webmasters can gain even the slightest insight into the Google Dance? Funny (and kind of scary). 🙂

  4. A good read thanks Matt. More posts like this would go a long way in helping people understand the things they should or should not be doing.

  5. One of the best articles you’ve written. If you could do this more often it would help greatly, because it gives us some real examples of what we can do better or shouldn’t be doing.

  6. Thanks for the neat summary Matt, it sounds like we missed something 🙂
    Can you tell us something more about the handling of non-case-sensitive URLs? How does Google notice? What happens?

    >Google has access to all the WhoIs info regardless of what is protected
    >from public domain
    How that? A proxy is a proxy 🙂 — I can proxy domains for you. But of course, if you run the same Adsense-ID on your domains, it’ll be easy to spot (with the right tools).

  7. Brian B,

    So true. Summarizing other peoples words will never die as long as privacy laws remain…..LOL 🙂

  8. I was howling at the real estate licensing review. I’m sure the poor site owner must have heard me since I was only sat 2 rows behind him.

    I didn’t take your comment about site explorer as a diss at first, but it was funny to see the audience’s reaction.

    I also enjoyed the comparison of Google local to Yahoo local results on the chiropractor’s site, shame msndude wasn’t also on the panel, we could have had some real interesting SERP comparisons 🙂

    Just out of interest, why did you poll the audience on +-30 and red/blue?

  9. You’re correct up to the whois speculation, Brian B. All I did was take one of the domains and run “whois domain.com” from a command-line and noticed that whois data privacy protection was on for that domain. Then I did the same with 1-2 more domains to verify it. So I wasn’t using any special Google data or tool for noting the whois info was private. Sorry if I gave that impression.

    Back to multiple sites: multiple sites by themselves aren’t bad. Heck, it’s easy to get several domains if you register typos of your main domain and 301/permanent redirect them to your main domain (Google often gets domains like gogle.com because someone will register them in bad-faith and then we end up with them). But if you’ve got that many sites and they’re all separate, it’s hard to find the time to develop all those standalone sites well unless, as you mention, you never sleep. 🙂
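
    In case anyone wants to see what that looks like in practice, here’s a bare-bones sketch: a tiny server you could point a typo domain at, answering every request with a permanent redirect to the main site. Real sites would normally do this with a one-line rule in their web server config; the domain here is a placeholder.

    ```python
    # Minimal permanent-redirect server for a typo domain. A 301 tells
    # crawlers the move is permanent, so the typo domain consolidates
    # into the main one.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    CANONICAL = "http://www.example.com"  # the real site (placeholder)

    class RedirectHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            self.send_response(301)
            self.send_header("Location", CANONICAL + self.path)
            self.end_headers()

    if __name__ == "__main__":
        HTTPServer(("", 8080), RedirectHandler).serve_forever()
    ```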

  10. > I think you just let it slip that Google has access to all the WhoIs info regardless of what is protected from public domain.

    There are plenty of ways to tell that one person is running multiple sites. Whois is only one piece of the puzzle. In addition to using adsense ids and analytic ids as JohnMu points out, many webmasters will run dozens of sites from the same IP address… If all those sites are all “privacy protected” from the same registrar with the same nameservers, guess what? Probably all owned by the same person.

  11. Great post Matt … so it seems that having multiple domains is not such a bad idea, as long as they’re not cookie-cutter sites and have unique content …

  12. Pittbug, it is interesting to see which way the audience skews. There were a lot more older-than-30 SEOs than under 30, for example, which was interesting. If you’re not willing to shake things up a little bit from time to time, you get peanut butter memos written about you. Plus no one had to raise their hand either way on the red state/blue state poll if they felt uncomfortable. 🙂

    JohnMu, Google doesn’t really care much (at all?) whether your titles are uppercase or lowercase. Personally, I’d still use the “correct” case, e.g. capitalize place names, but that’s just to make the site better for users. That’s why I was trying to say that it didn’t make sense to cloak if you didn’t even change the words but instead changed uppercase to lowercase. I don’t think stretcher.com was cloaking.

    Glad you liked this, Chris. I’d be happy to do more posts like this when I can scrape out the time (which isn’t as often as I’d like).

  13. Thanks for the detail Matt. This is the one thing I was truly sorry I missed this time around.

    “Having lots of sites isn’t automatically bad, and having PPC sites isn’t automatically bad, and having whois privacy turned on isn’t automatically bad, but once you get several of these factors all together, you’re often talking about a very different type of webmaster than the fellow who just has a single site or so.”

    That says it all. It’s all a matter of fitting the profile.

    I do have one concern. Let’s hope that such profiling is used to trigger manual reviews and not automatically punishing websites that on an individual basis are doing nothing wrong. You cannot tell from whois data whether the registrant is a 10 person company running 50 sites or if it’s one person running 50 sites.

    @johnMu Someone on Net Income said they used to work for a registrar (which Google is) and that registrars have the real info of privately registered domains. Don’t know if that’s true. And yes, they can use AdSense, AdWords, and your host to figure it out also.

  14. Matt,
    It was very nice to meet you at PubCon, I actually met you on the stairs of the Irish Pub Friday. Thanks for all your efforts. I have a quick question regarding this statement made on one of the site reviews:

    “The site also had session ID stuff like “sid=te8is439m75w6mp” that I recommended to drop if they could.”

    When you recommended dropping session id’s, do you mean altogether or just disallowing them from being indexed? If we have to totally eliminate session id’s, how do we keep track of unique visitors shopping? We have issues with this very thing.

    Thanks in advance, and I took notes on all your suggestions, they were right on!

    Ty
    All-Spec.com

  15. Hey Matt, thanks for letting me hit you with some of that DrinkBait on Thursday. I had a blast and your session was awesome!

  16. Sorry, I didn’t mean whether upper or lower case is better — I was thinking more in terms of those non-standards-conformant Windows servers that are not case-sensitive, i.e. you might have a link to /page.asp and one to /PAGE.asp (and Page.asp, etc.) – leading a standards-conformant bot to index various URLs for the same page. Does Google recognize them as being the same page, or does it just end up in the duplicate content filter (i.e. dropping the lower-value versions instead of adding the combined value up)?

  17. I liked this part:

    “great content can be linkbait as well, if you let people know about it.”

    The “if you let people know about it” part is great,…. but equally difficult. Easy to say for people that are already known, but nobody links to you unless they like you. And the issue is: people don’t like you unless they think they know you, or unless they think you’re known by many other people.

    Content alone doesn’t get you links. You need to be known to get links. Only if it (at least) looks like you’re known will your content be valued as much.

    Ask Danny Sullivan to write a good article about something, but place it on a beginner SEO site and say it was written by the unknown beginner. There’s little chance that that article will get the links it would get on Sullivan’s blog.

    Just content is a waste of time without the required marketing efforts.

  18. Hey, you’re exactly right Peter. It’s the whole state of blogging now.

    No offense meant, but if Matt didn’t work at Google, nobody would read his blog.

    The same is true for Scoble. If he were to create a blog without his name or microsoft’s on it, nobody would read.

    I think it’d be a fun experiment for somebody like Scoble or Calacanis, or somebody to do. Create a blogger blog without using your name, your company, or any of your contacts to promote it. I’m pretty confident it wouldn’t be successful.

    It’s not like it was when I started writing (blogging wasn’t a word yet), back when domains cost $35, and you had to have your own hosting account, and know enough Perl to write a CMS (PHP wasn’t big yet, neither was MySQL… ahh, text files…).

    It was a pain in the ass, which guaranteed that only people who were good writers, or had something to say, had blogs… and getting links was easy then. I miss 1998….

  19. Matt … out of all the sessions, this one was one of the best in terms of knowledge as well as entertainment. Having industry giants give their critique on a certain URL is excellent. A super-session like this should happen every time at PubCon and SES conferences.

    Thank you.

    I loved it when you got that guy with his 50+ sites. Funny moment.

  20. Yeah, I forgot to mention…
    this article is good. You should do more site reviews on here. I like them, and they’re great for sending people to when you need to explain something.

  21. Really good post Matt. I agree with the comment above that it was one of the best. It was packed full of useful information.

  22. Great post Matt! It’s almost like being there.

    Wow! – I had just taken my tinfoil hat off, but now I think I need to put it back on. 😉

    Note to self: First, create several new identities, email accounts, P.O. Boxes and LLC’s… and then register the domain names for my MFA empire.

  23. Wow, this is even better than being there Matt thanks.

  24. Ah, gotcha JohnMu. I think we do pretty well on that, but someone at Google might be looking at e.g. improving case-sensitive canonicalization for IIS servers. See, I’d made it for months without using the word canonicalization and now ya made me say it. 😉
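
    (For the flavor of what case canonicalization means on the site side: a site on a case-insensitive server could collapse /PAGE.asp, /Page.asp, and /page.asp into one indexed URL by redirecting mixed-case requests to lowercase, something like the toy sketch below. That’s a generic site-side fix, not a description of what Google does internally.)

    ```python
    # Toy case-canonicalization check: if the requested path contains
    # uppercase letters, answer with a 301 to the all-lowercase version
    # so only one URL form gets indexed.
    def canonical_redirect(path):
        """Return (301, lowercase_path) if the path needs lowercasing, else None."""
        lower = path.lower()
        return (301, lower) if path != lower else None

    print(canonical_redirect("/PAGE.asp"))  # (301, '/page.asp')
    print(canonical_redirect("/page.asp"))  # None: already canonical
    ```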

    Jeremy, that’s exactly right. The only reason that I was looking for more sites belonging to a person was because we were in a site review and I wanted to have a holistic view of what kind of webmaster I was dealing with.

    Hawaii SEO, before people start going all tinfoil hat, I just want to reiterate that I only knew they had whois protection because I took one of their domains and checked it with a whois from a command-line. That’s something anyone can do.

    Peter (IMC), I agree that the marketing aspect of it can be challenging, but there are ways to break into the consciousness. For example, from this conference, one fellow took his drinkbait.com idea and pushed it to the max. Another example is seoloser.com, which is getting some attention because the guy is teasing the SEO elite for not being very social. Often finding the right angle is hard, but once you’ve found it, a new approach can really pay off.

  25. Matt

    God bless. I can’t imagine you writing such a rich post without several cups of Colombian coffee. Are you sure you are still off caffeine? 🙂

  26. Matt, talking of duplicate content, I was browsing around as you do… 🙂 and noticed Google’s subdomains and robots.txt are all screwy. The Google directory and anything intentionally set up to be indexed from one subdomain seems to be available at other combinations of Google subdomains, and Yahoo! and yourselves are indexing some of them.

    http://www.search-engine-war.co.uk/2006/11/google_i_ndexes.html

  27. I hid my whois information because I did not want the world having my home address until I got the “suite #” recently.

    I also use the same wordpress template on 5 of my sites because I designed them to convert and pay some bills while I build them.

    I use wordpress because it is a database driven CMS with RSS and other functions.

    At some point I will rent my own redundant server and have my sites hosted on the same IP.

    The above does not make me a spammer, correct?

  28. I am still off the caffeine, Harith. I think my body has rebooted now, so I’m just looking for the right time to get back on and get that first-week-on-diet-red-bull boost. 🙂

    Teddie, I know that some of that is scheduled to be improved soon. I agree it’s suboptimal though.

  29. No problem. Disappointed I couldn’t make PubCon; it sounds like a blast. Perhaps you can try and make it to Europe this year? I have something like 500,000 airmiles to use up, so if Google can’t afford it I can always get you a ticket 🙂

  30. So Matt, how are you making the determination that sites are in fact owned by the same person? For example, Alexa thinks Kim Krause is associated with some undesirable characters:

    http://www.alexa.com/data/details/main?q=www.cre8pc.com&url=cre8pc.com

    Now I know this is false, since I was also associated with those sites but made a nuisance of myself until they fixed it (no comments).

  31. Long post – short conclusion:

    O(telling-people-what-to-do-and-making-sure-they-obey)
    is smaller than
    O(finding-out-what-they-did-and-ranking-it-without-interference)

    Do your math.

  32. Hi Matt,

    Thanks a lot for your availability at the PubCon to discuss everyone’s issues.

    As you might remember, one issue that I raised in our talks in Las Vegas was geotargeting.

    In fact, since the beginning of the year, we have been severely hit by the new geotargeting algorithms, as have many other sites on the net.

    As you know, we are based in Europe, selling only online, and we do it via “.com” domains with a web server based in the USA. Not a very uncommon situation, as many businesses have chosen the .com as a sign of internationalization of the service they provide, and decided to host in the US simply because the best datacenters in the world for dedicated hosting are there.

    Google still beautifully rewards such sites’ popularity in the international (US) results, but relegates them to 20-or-30-something positions (therefore the 3rd/4th page) on European local editions of Google, even if the choice made by the European user was “results in English”.

    You can then check the incredible difference with the .co.uk for example: same English speaking market, closer to any European place of business.

    This paradoxical condition leads these sites to be “more invisible” to an audience (the UK one) that they could serve well, if not better than the one where they rank well.

    Here comes my point.

    I fully agree that, if the user searches for a certain product or service without specifying the location, he could mean and imply “close to my place, please”.

    Therefore, searches like “pizza delivery” or “taxi” must be geotargeted as much as possible, to the maximum extent.

    But, if you search 1) in a non-local language (for example, you sit in the Netherlands and you search for “Widgets in London”, which is an English phrase although searched on the Dutch version of Google) and 2) for something related to a place specified in the keywords (“London”), I’m not so sure the Dutch user meant “providers of widgets in London which reside close to my place, please”, as in this case he specified that:

    – He’s able and willing to accept results in English so he/she wants to broaden his/her horizons outside his/her home Country;

    – He’s interested in Widgets in London, which could be reserved on any website on a worldwide level, not necessarily on a .nl website.

    In fact: let’s exaggerate the example and create a paradox.

    If we sit in, say, Pakistan and we search for “Widgets in London” on the Pakistani local version of Google, the geotargeting approach will instruct it to show .pk domains/sites dealing with Widgets in London first, then will show, with a bit of a “penalty”, the biggest worldwide providers of Widgets based in London, which are probably the best players in the world if you need to buy Widgets in London.

    So, those who have chosen an international TLD (.com) and have based their servers in a reliable datacenter in California will not find it “fair” to be excluded (3rd page of results: excluded) from the European SERPs as “they are not local to it”.

    Actually those businesses are “very local”, as they are 100% European businesses, and they serve Dutch customers 100% well, as well as American ones, of course.

    Leaving aside the multilanguage side of the business (I can guess a workaround for the Dutch version of Google: register domain.nl, host it in the Netherlands and write the content in Dutch, although this exposes the site to one of my objections below under the “workarounds” section), as you can imagine an English-speaking site cannot multiply the English version for the Irish, British, Australian, New Zealander, South African etc. markets, as it will create duplicate content everywhere….

    In fact, even if you registered domain.ie, domain.co.uk, domain.com.au, domain.co.za, domain.co.nz etc., you’d end up delivering the very same English content, as the language variations are too tiny to be done on 5 different English-speaking markets.

    —-

    I also would like to point out some aspects which, if well analyzed, will lead us to evaluate the current principles used by many engines for geotargeting as pure myths:

    —–
    “Place of business = Place of location of the server” is an outdated concept that should not be followed.
    —–

    The OECD (Organisation for Economic Co-operation and Development, http://www.oecd.org), at the early stages of the e-commerce boom, tried to outline guidelines to determine the residency of an online business. The first guess was to establish a correlation between the physical location of the server and the actual place of the e-business.

    Because of the various hosting plans scattered around the world, which led to big distortions in the application of trade laws, the OECD soon changed the approach stated above.

    In fact, imagine a company hosting its servers in Anguilla and, for this reason only, not paying taxes on its profits because that was a zero-tax jurisdiction; or hosting in the US and having to pay all US taxes and not the European ones (even if the company was all European and its managers and employees had never touched US soil in their entire lives)… a bit distortive, isn’t it?

    In 2002 they in fact changed their approach, to signify that the place of business (where a business is located) is not to be investigated only and merely through the too-easy and inaccurate approach of “where the server is located”; instead they diverted attention to more substantial factors, like where decisions are taken and where business is operated, conducted and founded.

    http://www.oecd.org/dataoecd/46/32/1923380.pdf

    “a server cannot, by itself, constitute a permanent establishment” says this paper.

    Translated into geotargeting terms, we could state that “a server location cannot, by itself, constitute proof of residency or of a relationship of that business with a certain Country”.

    I wanted to make this “tax” example as this is a very prestigious Organization that was voluntarily “patching”, 5 years ago, the same geotargeting principle currently adopted by the main search engines (you rank better in the Country where the server is located), as they realized that the physical location of the server is not enough to prove that the business has links with that Country.

    ——————————————————————————–
    “dot com” + American IP is not as relevant as “dot nl” and Dutch IP

    Because of their world-wide usage, the .com domains cannot be regarded as “hints of American roots”, as everyone in the world uses them: “.com = international” rather than “.com = American”.

    Also, the super-advanced US hosting industry attracted and is attracting too many international sites for hosting there to be considered a sign of localization.

    Therefore, while a .nl TLD hosted in Netherlands can be a good proof of localization, a .com hosted in US has a much lower chance to be guessed right according to this rule.

    ——————————————————————————–
    Country of the Domain Record could be a good hint

    By running “whois” on the domains, we’ll see that many .com domains have a non-US Country as the Country record. This could be a hint (not necessarily valid for all cases, OK, but still “a hint”) of the localization of a business: maybe an additional score in favour of our “Europeanity” if we are Europeans, for example…

    ——————————————————————————–
    Geo-Link-Popularity could be another good hint

    Assigning a higher value to the incoming links originating from IPs based in the same Country of the searcher could be another interesting aspect.

    Example: if I search for “Widgets in London” and I sit in Pakistan, I may want to give more relevance (geo-PageRank) to what the Pakistani websites are linking to for that keyword, so that the link popularity will be “biased” by the votes of my co-citizens, which will count more.

    In this way, the criteria stated above about TLD inspection and IP localization are going to be defeated or ignored, as the “geo link pop” is going to determine which resources are relevant for my fellow citizens.

    From a different approach, any given url could be investigated from the geo-popularity point of view, and accounted for by the various countries of residence of its incoming links.

    In the example of this post (international service provision) you’ll end up seeing that the site is linked from all the Countries of the world, and flag the site as “international”, i.e. “good for all English-speaking versions” of the index.

    In the case of a home delivery service based in NYC, you’ll see that the site will be mainly linked from New Yorker websites, therefore you’ll be able to define it as “local and related to NYC”. And so on.

    ——————————————————–
    WORKAROUNDS in case all the above will not be adopted?
    ——————————————————–

    Well, we could comment that all I wrote above is not possible to implement in real life. OK. Let’s assume none of my remarks are reasonable.

    So, what should a company do with a .com hosted in the US, selling in the English language to the entire world, if they want to keep their Australian, Irish, New Zealander, South African and British customers via organic search results?

    Various solutions:

    **** Duplicate its website in various “copies” distributed across localized servers in the countries I mentioned above?

    business.co.uk
    business.ie
    business.com.au
    business.com.nz
    business.com.za
    etc.?

    Not a good move. This will be dup content or, if the above domains are used with a 301, they will eventually be ignored in favour of the only one which has no 301 on it.

    **** Split its website content in small portions, hosting on those countries a portion of the website with the following scheme?

    business.co.uk
    business.ie
    business.com.au
    business.com.nz
    business.com.za

    Not a good move. “Size matters” for Google, MSN etc.… therefore cutting a big site (which is somewhat already ranked on SEs) into slices to create a series of small (therefore less-liked) invisible sites will not help it rank well in any of those markets. We’ll end up not selling anything to anyone…

    Also: what am I going to sell to Irish travellers? “Small Blue Widgets” only? If they wanted “Big Red Widgets”, which I can provide, am I going to push them to the US site?

    So, this will increase the possibility of being penalized for cross-linking between the small sites, as I must tell the Irish visitors that I sell “Small Blue Widgets” but also have “Big Red Widgets” offered on the US site (“come, click here and check them out”), otherwise I’ll do commercial damage to our site.

    **** Duplicate its website in various “copies” distributed across localized servers in the countries I mentioned above, customizing a bit the language to adapt to the American English, British English Australian English etc.?

    Not a good move: if you sell the same product, you will surely end up making dup content anyway. I’ll still sell “Widgets” at the end of the day, and the only way to describe those concepts is by using the same terms… Or should I speak like Shakespeare for the British and describe a bed as a “magnificent treasure and masterpiece of slices of wood positioned in horizontal fashion, which are the base of a multitude of fluffy fabric layers, all of this called a bed”? 🙂

    **** Move the server on the market they prefer the most?

    Not a good move: as you sell in English and your service/product could be bought worldwide, you will probably serve 50% US and 50% UK customers (the UK ones have now sadly disappeared because of geotargeting in our case, but I can swear that was the ideal partition until 2005 🙂 ): wherever we host the server, the clientele will be diminished in the Country where we do not host.

    ++++
    In short: those who have an internationally available service, described in the English language, are penalized by geotargeting.

    I’d really be interested in knowing what your ideas are on this.

    I guess this is a common problem, common to many sites.

    ——————-

    Editorial note: My apologies in advance if I haven’t explained some concepts well, if I have made spelling mistakes, or if my tone sounded wrong or offensive: all I wrote is intended as a plain explanation of my concerns, with no accusatory mood or tone and with a very “scientific, collaborative” approach. If it sounded like I used a bad tone, I apologize in advance.

    Also, if you find it too long for the blog, you can edit it as you like.

    My intent is only to point out a possible problem, and I have tried to anonymize the case and keep it generic and “useful for everyone”, though it is of course related to our business.

    Thanks a lot,

    20kMilesGuy

  33. First registation and now “crawp”? I’m thinking Matt’s just trying to come up with a word for Webster’s now that Will Smith got “jiggy” into the dictionary. 😉

    Funny thing about that Comic Sans font. I once built a commercial site (back in the dark days of 1999 when no one had taste) in cyan Comic Sans on a dark blue and magenta background, with lots of animated GIFs. This site broke EVERY conventional law of web design imaginable at the time, it was an eyesore, it was awful…and it worked extremely well. This thing made the client almost $20,000 one week. And I was able to keep it that way until early 2002 before I had to change it.

    It’s really strange what people will flock to. Anyone else remember My Trailer Park?

  34. Matt,

    Thanks for attending PUBCON and for sharing such valuable information with everyone. Not only were you willing to talk even to the littlest person, but from what I saw, you did it with dignity and grace. I am a novice SEO and your demeanor is what made me willing to reach out for help.

    I am confused by your blog. I own an internet jewelry site and started buying jewelry-related domains many years ago, and probably own something north of 125, all parked with GoDaddy except the site I am using. I heard you question almost everyone about their domain collection and, knowing how many I owned, became nervous.

    Is it not likely that most 30-something website owners or SEO people would own many domains? Heck, many years ago it was like gambling, hoping to get a good one and sell it.

    Should I get rid of them?
    Should they not be parked?
    Should I not have the WhoIS privatized?

    or should I have them all 301 redirected to my ecommerce jewelry site?

    Sorry, I read your blog and I just have not come away with a clear understanding of what to do or not to do. It is clear that if you cannot provide relatively unique content for each one, you should not have them live, but other than that, I was left wondering…….

    I appreciate any response you can offer !

    THANKS!

    Michael

  35. Good info, thanks for the recap. Anyone close to that real estate licensor guy get a good look at just HOW BIG his hands were? I assume they must have been huge. 😉

    Sort of OT, what’s a good way to go about requesting a manual review of a site which has had all its images filtered as being adult (they’re not) in G image search?

  36. This is a great post Matt, more of the same please! On another note, I’m just wondering why you’ve gone off the caffeine — you’re not pregnant are you?

  37. I would have loved to see a follow-up question to the “how many people use Internet Explorer?” — “How many people use IE only for webmaster duties to make sure their sites render in IE?”

  38. graywolf, happily I have more confidence in my tools than I do in Alexa’s data. 🙂 If you’ll notice, several times I asked the site owner “Is X one of your sites?” and each time it was. 🙂

    Keri, maybe I’ll ask that next time. 🙂 I liked your post at
    http://www.morgretdesigns.com/index.php/2006/11/22/things-to-fix-before-letting-a-search-engineer-review-your-website/
    Reading the write-up on Rae’s session, there were definitely “high-order bits” to pay attention to first.

    Matt_Not_Cutts, it’s pretty simple. One day I found myself having two energy drinks to get a chock-full day of stuff done. So I figured I’d skip caffeine for a few weeks and let my body reboot and recalibrate. That way, I can jump back *on* the caffeine wagon again soon, and get more done with a smaller dosage. 😉

    (Updated this comment to point to Keri’s new website location.)

  39. Did you just “diss” site explorer and alexa data?

  40. Matt –

    On the topic of reciprocal linking. At one time I did engage in link exchanges (maybe 20 or so) but I haven’t exchanged links in nearly 2 years. Over time some of our partners have removed our links so I’ve removed theirs too. All of these exchanges made good sense (our topics were directly related).

    I’m still trying to figure out why Google doesn’t like us. I recently checked one topic and we ranked around 150. (I’d post it here, but you’d probably drop me down to 151…) I talk about the history of the term, how to find companies like this and even give examples – a very thorough explanation. Exactly what the reader would be looking for. Still, we sit at 150 with a lot of lower quality information ahead of us.

    I’ve spent the last five months re-writing nearly all of our content to make it more useful to visitors (three more categories to go).

    Could the problem be these reciprocal links? That seems kind of harsh.

  41. Matt, It was a pleasure to meet you again, and thank you for that mega burger.

    It’s scary that you may be my evil twin, what with the same age, similar size, same choice of food and even removal of the huge onion ring from the food.

    Carry on the good work mate and I’ll do my best to carry on trying to beat you 😀

  42. Great post Matt,

    Let us know more about these learning-Spanish-to-military-training pages. Were they quality pages? Can you provide us some urls for checking?

    Thanks

  43. Jason Duke, it gives pause for thought, doesn’t it? It’s like the cartoon where the wolf and the dog get up in the morning, get breakfast, and then spend all day trying to destroy each other. 🙂

    BillyS, you might consider doing a reinclusion request via the webmaster console (look for the Tools box in the upper right-hand corner when you first sign in).

  44. I usually click into your blogs and hope for something new, and this is probably my favorite post by far! Even your responses to others are amazing. (Of course, you already won fanship at PubCon.) I read up on SeoLoser’s view on PubCon, and I laughed the entire time, thinking about how each day I went home to my boyfriend and told him how unfriendly SEO people were, like they were hiding something.

    Danny Sullivan was a great meet; one of my clients was an original client of his, and we spoke about that more than anything. I wasn’t interested in his how-to’s, since I got them first-hand from day 1, but more about him. It was neat. I did finally get up to meet you on the final day; I’m sure you probably threw away my business card as well. But that 3-minute conversation about duplicate content left me confident.

    Seconds after speaking with you, I spoke to Tim Mayer about the same issue and, to my astonishment, he did nothing. (Facetious…) I did get his business card, and hopefully he will take some time to help me. It felt like pulling teeth to get him to look into something – and that really determines whether I return to 50k ads on Yahoo or keep padding for more on Google.

    In any case, you did an excellent job at PubCon. I’ve been in SEO since 9/11/2001, after being let go from a travel company. I’m still a bit perplexed about Google AdWords and its contribution to scraper sites (it’s been my biggest peeve with Google), but after PubCon and meeting you, I’m confident that the issue will be resolved some day, and in many ways, even if a solution isn’t found, I still feel like I was heard and that the situation mattered to you. (Google has earned my ad dollars in more ways than one. In fact, you are the reason my clients, who came with me to PubCon, have agreed to increase budgets in 2007.)

    On the site reviews, I have to say… even with my years in SEO, I learned so much that day. By the way… I can’t for the life of me find this “Tim” guy who used to run the west region for Google about 3 years ago – got a Google lamp, shirt and hat… a Google party at the Paris Hotel, and then he disappeared. Help me!! I liked working with him. 🙁

  45. Well… a new event for me to make sure that I turn up at next year!… especially as it seems an excellent place to sound out concerns or the latest matters of interest. For instance……
    …. it is about time that Google addressed the issue of linking domains to the location address of their hosting partner in the local SERPs, instead of allowing a tag or the webmaster console to direct the focus region for SERPs for a website! This inaccuracy impacts hundreds of thousands of people around the world and will only get worse in the global marketplace as more people choose web hosting suppliers abroad. Anyhow,
    clearly some excellent involvement and time spent by everybody; praise goes out to all who organise and take part in such a great event. God bless you all.

  46. Matt Said:
    “Peter (IMC), I agree that the marketing aspect of it can be challenging, but there are ways to break into the consciousness. For example, from this conference, one fellow took his drinkbait.com idea and pushed it to the max. Another example is seoloser.com, which is getting some attention because the guy is teasing the SEO elite for not being very social. Often finding the right angle is hard, but once you’ve found it, a new approach can really pay off. ”

    Oh yes, the marketing aspect is not so difficult when you’re writing for the thing you love. In the English language the SEO community is well established and actually exists… but in most other countries… well… there’s no such thing as an SEO community… 🙂 But marketing for our own site(s) isn’t the difficult part…. It gets difficult when you’re writing for client sites. For example, imagine writing for a site that sells the base material for fences. 🙂 Luckily nobody else is into that stuff, and you don’t need many links to get high rankings. Just optimizing the site itself is generally enough for markets like that. (I bet that feels like the old days for some SEOs 🙂 )

    Anyway, I wrote that post just to get some focus on the popularity side of “great content”, because it seems often forgotten, while one can’t go without the other. (It also shows why a PR algorithm has such great value.)

  47. Hey Matt! Wow, I’m shocked (and thrilled) you found your way to my blog! :)… had a great time at the show and loved your site reviews.

  48. By the way Matt (sorry for the double post, just remembered this), during your panel on Thursday, when the room erupted as people realized one of the questionable sites actually appeared to be doing quite well in Google, I was just dying waiting for you to say “Okay. Refresh” into your microphone… I guess you’re too nice to ban on the spot 😉

  49. Matt,

    I don’t understand why you present reciprocal linking as bad. Reciprocal on-topic links were around long before Google, and they are the most logical way to start promoting a new site. Because Google penalizes sites with reciprocal links (even those which stopped exchanging years ago), blog spam has risen to the top of Google results.

    I continue building up reciprocal links with good sites, even though my site is in the supplemental index because of reciprocal links (either by a Google penalty or by a strange algorithm) and I haven’t had Google traffic for over 18 months. Reciprocal links bring at least several qualified visitors.

  50. This is a very good post for people like us who are interested in getting our sites reviewed by experts, but for whom it’s practically not possible because we live outside of the USA. I wish you would start a section on this blog and review sites periodically (you could ask people like us to submit sites for review). I have a content-rich site, but it doesn’t seem to be doing well in search engine rankings, and I also don’t have the luxury of anybody volunteering to do SEO!!!!

  51. I’m just starting to get into SEO. Having been offered the chance to go to Vegas for a conference, I jumped at it, and I have to say, I found it all pretty interesting. I even decided to start a blog when I got back (although I haven’t done too much with it yet). I’d just like to say thank you for participating in the conference, because I really learned a lot from listening to the site reviews.

    P.S.: Oh, and it shocked even me that the Spa woman ranked as well as she did.

  52. So you have to be a woman with a badly optimized site to get free SEO help? 🙂

    In SEO, 1+1 never seems to be 2; it’s more like 1 or even -1. Why? You set up a site with excellent content, first-class usability, impeccably SE-optimized, and then the only way to get decent traffic to your website is to buy it. I don’t find it particularly correct that an SE stays this vague about how and when it ranks your website.
    I don’t have a big name or the means to come to PubCon, organize something like DrinkBait or do a PPC campaign.
    I suppose I have to stay a big nobody in Belgium.
    Oh, what a cry-baby I am.

  53. Woooo Baby!!
    That made for a good read!
    First off, that was the first time I have seen the My Trailer Park site, and the SEO Loser site had me in stitches!

    PubCon rocked! Matt, thank you publicly for reviewing my site. I implemented almost everything suggested by you and the google engineers and my site went from 640 in Google for my major term “My Town Name” “My profession” to number 10 over the past week, all white hat baby!

    I have not removed the Comic Sans yet. The site has always converted visitors to customers (patients) well. I just couldn’t get it to rank well in Google until now. I promise I will test it without comic sans, and see the results. My wife is saying “If Matt Cutts told you to get rid of the Comic Sans, then get rid of the Comic Sans!” So good job to all of you and especially the PR division for organizing it on every level so that Google came across so well!

    I spend a lot of time on WebmasterWorld, and I think Google did a great job on PR at this conference. The tone of posts (and this is a generalization) has gone from a slightly suspicious stance on Google to “hey, they are great guys, they’re helpful, they’re not trying to hide anything, and they throw great parties.”

    The Google Get Together was off the hook. I did feel a little sorry for some of the engineers who got spotted in the crowd. The SEO’s were hungry for data, and the poor engineers did their best to handle the barrage of questions and opinions launched by the mildly inebriated SEO’s in the crowd. I learned tons that hour.

    Finally I was amazed watching you, (Matt) after the Site Review. There were literally a hundred people, pressing in on all sides, (even from the back) all with their one make or break question to ask. You really held your own in that environment, very relaxed and composed. I know from experience that is not an easy place to be, especially when a lot of these guys have their very livelihood based on the answers you give them.

    All in all, PubCon was a wonderful welcoming experience into the world of SEO for me, and I cannot thank you and all of Google enough for the help you have given me, and the way you have changed the world.

    Much Love,

    dk

  54. I am getting a strange feeling that guglee is encouraging single-site lame webmasters and punishing professionals who make money off the web with hundreds of sites. It’s especially strange that guglee itself is making money off the web and buying sites and companies. It’s an “I can but you can’t” attitude, and that sucks. That doesn’t encourage me to be in the whitehat category at all. I wanna be all white and fluffy, but guglee is FORCING me to limit my earning potential with these strange preferences. And it will not work that way, because I won’t let it.

  55. matt said:

    >>>graywolf, happily I have more confidence in my tools than I do in Alexa’s data.

    Although you said you did not use tools for your whois queries, of course. 😉 😉 😉

  56. Well it looks like it has all been said… but kudos on a great post and all of your personal responses.

    One thing I am still unclear on… our very small company must own 200 domains but we only have 4 or 5 websites. Most of our effort goes into 3 of them. How is this viewed by Google?

    Are you going to be making a stop at SES in Chicago?

  57. Great post and really good reading for the novice SEO and webmaster wanting to do more. Would you suggest registering domains to a company, to avoid any perceived problems with one person being the main contact while lots of people are working on the domains?

    Tons of ideas to try, and as usual we need good, original content ;-)), but you have to start somewhere. In my day job I am a vet and know the hassle when I do evening talks, with everyone hanging on my every word!! And the crush at the end, the price of fame. Keep up the good work.

    Thomas

  58. Matt –

    Thanks for the post and insight at PubCon.

    We have been wallowing with our Google results forever now (more than a year). We have changed our structure to ‘fix’ duplicate content issues along w/ other SEO fixes like robots.txt, clean url structure and 301’s on all of our parked domains.

    You mentioned backlinks and a reinclusion request when I handed you my scribbled-on card. My questions are about our 300+ registered domains (are they still possibly hurting us?), and about our 98k pages that are in Google’s crawl but have a PageRank of 0 (90%) or low (10%). We have thousands of pages of good unique content and hundreds of affiliates (pointed at a subdomain blocked by robots.txt). Could our backlinks hurt us when they are not under our control?

    With such a large site it is possible that we have something else broken… thus the scratching our head bit.

    Thanks,

    T

  59. Care to elaborate on “if you let people know about it”? Is there a recommended approach, or do you just stuff it in front of their faces on the site?

    Mark

  60. Nice post, Matt! Very helpful. In fact, the entire concept of a site review – especially at the hands of such knowledgeable people as yourself, Sullivan, et al. – must have been wonderful to watch. Sorry I missed it – maybe next time.

    Can you recommend firms (reputable firms, that is) that offer this type of site review service from their top-end SEO talent?

    Thanks.

  61. For those of us who may have bought up typos of our domains and have other generic domains 301’ed to our actual site, would you recommend using an IP-funnel? http://www.bruceclay.com/seo-tech-tips/techtips.htm

  62. Matt,
    I was at Pubcon and was shocked at how little you said about that Arizona Day Spa site. Come on man, that site was junk all the way. The only thing it had going for it was age.

    This is why people are always trying to buy aged domains, etc. Because people are trying to be listed on Google.

    You mean to truly tell me that out of the search ‘Arizona Day Spas’ (1.3 million results returned on that search), she deserves her current position of #2? That site has still not changed. What a bunch of junk spam at the bottom. You yourself laughed when people were joking about that.

    Google is going to have to start adding newer sites with less backlinks or the results are always going to be junk.

    Mike

  63. Thanks for your input on Pubcon. You did rip on sites but many of those sites needed it quite a bit. I do like your points on spreading your interest a little too thin if you have too many sites. Hopefully I’ll be there at the next Pubcon to see you there.

  64. Great article Matt. I’m just trying to get into SEO myself, if you can call nearly a year “just trying to get into” 🙂 and love all the great tips you impart on us.

    Cheers!

  65. Coming in late here Matt, but thanks for the great work. I can’t get over the reaction of some folks to honest discussion… re the reference to ‘ripping’ sites. I don’t see anything being ‘ripped’; I reckon that those who think so wouldn’t know honest criticism and good advice from a thousand-dollar bill in the gutter.

    Keep on with work like this, as there _are_ many folks out there who can separate the wheat from the chaff.

  66. I have a question regarding search results.

    I have two computers. One is a Mac running Safari, and the other is a Windows machine running Firefox. If I put the same key term into each machine and run the searches at roughly the same time, I get two different results.

    Does Google serve results differently to each browser, or is this something else? I always get different results with Safari than even my clients running Firefox in a different part of the city.

  67. Nick: I seem to have this problem when working with other people in different areas as well. Usually I find that we are connecting to different data centers with different sets of data. Now if I can just figure out which one has the newer data 🙂 Usually, not always but usually, giving them the IP of a specific data center ensures that we can look at the same results.
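
    A minimal sketch of the check I do (nothing Google-specific beyond the hostname; the printed IP will vary by location):

        import socket

        # Print the datacenter IP this machine resolves www.google.com to.
        # Two people in different cities often get different IPs, i.e.
        # different datacenters carrying different snapshots of the index.
        print(socket.gethostbyname("www.google.com"))

    Once you both know which IP you each resolve to, querying the same one directly means you are comparing against the same index.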

    Matt: Thanks for the post! I have been working to convince some people that reciprocal linking isn’t useful for established sites (especially when the linking goes to sites that are completely unrelated) so you’ve given me a little more ammunition!

  68. I have a question: I own 4 sites, and I turned on WHOIS privacy when I registered them. I didn’t do that with any bad black-hat intention.

    I want to know: can it penalize my sites in the “eyes” of the search engines?

    Thanks.

  69. Great explanation, Matt, but I’m trying to understand what Google does when it finds duplicate content. Will the site(s) be banned from the results?

    As you say: “If I can spot duplicate content in a minute with a search, Google has time to do more in-depth duplicate detection in its index.” But you don’t really tell us what the penalty is, or even whether there is an actual penalty, for sites that are doing this.

    If I search, for example, for the term: hotels in Barcelona,
    I sometimes see 2 or 3 sites that appear to be a 100% copy of another site, just on a different URL. Of course, I understand that for some terms a lot of content comes from third-party sites, and therefore almost all sites about that term use the same content and can’t be banned.
    But if a site is 100% the same all over the site, then there is no other explanation but that they are spamming.
    If Googlebot can find duplicate content, and Google penalizes sites for doing that, then why are those sites still there?
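
    To make the question concrete, here is one classic way near-duplication can be measured (word shingles plus Jaccard overlap); this is purely an illustrative sketch, not Google’s actual method:

        # Illustrative near-duplicate check: compare two pages by the
        # overlap of their 5-word shingles. Not Google's real algorithm.
        def shingles(text, w=5):
            words = text.lower().split()
            return {tuple(words[i:i + w]) for i in range(len(words) - w + 1)}

        def jaccard(a, b):
            sa, sb = shingles(a), shingles(b)
            return len(sa & sb) / len(sa | sb) if (sa | sb) else 0.0

        page_a = "cheap hotels in barcelona near the beach with free breakfast daily"
        page_b = "cheap hotels in barcelona near the beach with free parking daily"

        # A score near 1.0 means the pages are essentially copies.
        print(jaccard(page_a, page_b))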

  70. I have a client in the cosmetic surgery field, and the site that outranks him for a keyword uses 100% copied content, including illustrations from the American Society of Plastic Surgeons website. I’m not sure Google is doing such a good job of banning duplicate-content sites.

  71. “An inference in search of legitimacy” is the thought that comes to mind when I read the (semi-reliable) quotes about your comments concerning individuals associated with multiple websites and/or domain names. Signs of common ownership or control of multiple domains/websites make a sensible trigger for specific processing – for either the absence of signals of quality OR the presence of signals of . . errr . . junk – but multiple domain/website associations do not make for a valid ranking or penalizing criterion.

    Sure, I mod a domain forum. That’s a good signal for the fact that I likely control scads of domains, a fact I’ve never hidden behind a single proxy or false field. But, guess what? I’m well into middle age, I have multiple interests, and I have a depth of knowledge on multiple issues that is likely greater than that of many who might just be coming to discover an interest in any subject I might choose to write about – relating to those many domains I’m associated with. I may not be THE authority, but I am certainly able to speak WITH authority on a multitude of issues or subjects or activities, based on real-life hands-on experience, book and classroom training, litigation experience (where I get to learn all sorts of fun things), etc. – on multiple topics.

    When any SE’s algo can recognize evidence of real-life subject-matter experience as a signal – the sharing of insight and/or experience, a/k/a knowledge – without the algo having to wait for link love to approve or confirm the value – well, that will be the day, won’t it? Too bad if the algo thinks that the poli-sci PhD hanging out at the university is actually a greater authority on politics than the little grunt domainer who also did a few stints as a town mayor. Too bad if all those domains mean I lack signals of expertise on the subject of shopping for a used car, even though I’ve owned dozens and helped many more people choose a car. Too bad that the motorhome domain I have will have to languish because of those import-export domains I have. I thought I might have some interesting tales to tell of my family’s adventures, including the one about my son and me gutting and rebuilding (and still rebuilding) the motorhome. And on and on it goes.

    If what you suggest holds water – that multiple domain ownership is a signal to downgrade – then for certain you have signaled to me that the algo is officially hamstrung by its humanlike intelligence, faulty thing that it may be at times. Fortunately, humans can learn. One can only hope the same is true for the latest “belief” that the algo may be laboring under. ;-P

  72. Hey Matt. Thanks for the article.

    I have many health-related websites, each for a corresponding medical journal or newspaper. Sometimes we publish articles that are relevant to more than one journal and may appear in two physical journals as well as on two internet sites.

    Is this okay to do?

    Thanks a lot. Take care!

  73. Hey Matt

    Wow, after all this time, the fact that reciprocal links can cause a site harm is finally divulged.

    It is a start.

    As for the domain issue and seeing behind private data… Google is also a domain registrar (I guess a lot of people forgot that), so it can see who owns any site… no smoke and mirrors here, boys & girls…

  74. Hi Matt,

    from Spa Lady in Arizona. I sent you an e-mail a little while back; I am not sure if you received it. I know you probably get 1000’s of e-mails, so I will not hold it against you :):) I wanted to say thank you sooo much for reviewing my site at Pubcon. I learned a lot, but I still have lots and lots to learn when it comes to SES and SEO. I plan on attending more conventions that will teach me how to do SES and SEO correctly.
    If you have any other advice for me, please let me know. You can send the advice to the e-mail I provided :):) I would really appreciate it… In the meantime, I am still looking for a programmer who knows ASP. My site is programmed in ASP, and my shopping cart desperately needs to be fixed 🙁 it’s been broken for 2 years. I have paid several programmers lots of money to fix it, but no one has been able to solve the problem yet, or guide me to the right solution. If you know of anyone, please refer them my way :):)
    Thank you again for your help. It was a pleasure to meet you, and I hope to see you again at the next Pubcon.

    Wishing you and your family a very Happy Holiday Season!

    Thank you,
    Jana:)

  75. Thanks for the article Matt,

    Seemingly there are a lot of people using the “if I throw enough darts, one of them is bound to hit the target” philosophy. Although I do manage a number of sites, 12 or so at the moment, all of them get a similar amount of attention, with me managing them and a team of writers who work on content with me.

    I was wondering what you would call an optimum number of sites for one person to be running. E.g., should there be one full-time member of staff per site, or is having one writer researching and writing good-quality information for me to publish on 3 sites enough?

    Your thoughts would be appreciated.

  76. Matt, these are the articles that everybody wants to see! Great answers to burning questions in the SEO industry.

    The two current burning questions I seem to see everybody asking:

    1. What is going on with the -30 position penalty, where a site mysteriously loses 30 positions one day? Is it duplicate content, link text that’s too common, or something else?

    2. Since we’re on the topic of webmasters having multiple sites, what is the policy on linking them to each other? What if they’re all in the same general field? It can’t be a crime to link to one’s own websites, can it?

    Thanks Matt and Merry Christmas. 🙂

  77. Hey Matt,

    Just been curious about this: if I have 50 domains, and I use the same template and interlink those websites for more links, but those sites have completely different content, would you think that I am spamming?

    Regards,
    Marcus

  78. A great article, Matt! With super reads like this I’m getting to know exactly what I have to do to optimize my websites.
    What I noticed about some new sites was that they targeted big one-word keywords on their sites, and when they can’t get any traffic they give up. My advice is to target lower-traffic words and get some traffic to start with. Then later re-optimize the keywords for slightly higher-ranked words.

  79. A good read, thanks Matt. More posts like this would go a long way toward helping people understand the things they should or should not be doing.

  80. Although I agree in principle with what you’re saying, you’ve glossed over the importance of a large link network, which we all knew about before there were any worthwhile search engines: TRAFFIC without the search engine itself.

  81. Matt,

    I am confused about what approach to take with international site structures. Are you best served by keeping translations all on one domain:

    http://www.mydomain.com
    http://www.mydomain.com/fr
    etc.

    Or by breaking it out amongst country-specific domains linked together?

    http://www.mydomain.com
    http://www.mydomain.fr
    etc.

    I have researched your blog and the Google webmaster forums and have been unable to find a definitive answer.

    I appreciate the time you have put into this blog… it’s very informative.

    Thanks

  82. Hi Matt,

    Thanks for your wise and useful suggestions:

    1) If you have hundreds of articles, linking them all from one page, i.e. a single sitemap, is not a great idea. It is always advisable to break the sitemap page up into multiple sitemap pages.
    2) For a new website, instead of developing plenty of reciprocal links, it is better to go for viral marketing instead.
    3) Convert dynamic URLs to static ones (a sketch follows below).
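
    A rough sketch of point 3 using Apache mod_rewrite (the script name and parameter below are hypothetical):

        # .htaccess sketch: expose static-looking URLs for a dynamic script.
        RewriteEngine On

        # Serve the clean URL /articles/123 from the old dynamic script.
        RewriteRule ^articles/([0-9]+)$ /article.php?id=$1 [L]

        # 301 requests for the old dynamic URL to the clean form, so only
        # one version gets indexed. THE_REQUEST holds the original request
        # line, so this condition does not fire on the internal rewrite
        # above (which would otherwise cause a redirect loop).
        RewriteCond %{THE_REQUEST} \?id=([0-9]+)
        RewriteRule ^article\.php$ /articles/%1? [R=301,L]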

  83. Hi Matt,

    I’m not all caught up in the SEO game (yet), so I have an outsider’s perspective on the issues discussed here, and I have to say that I find this stuff to be very, very bizarre.

    What’s wrong with reciprocal linking? It’s healthy for related web sites to link to each other. Why must this linking be one-way? If I were researching content on a web site, I would never learn about and benefit from a related web site because of this bizarre rule that discourages two-way linking. Imagine imposing this requirement on all of the roads that connect our towns and cities: because town A has a one-way road to town B, I can’t get to A from B. What a strange and stupid rule!

    This is an artifact of the ongoing dynamic between search engines and web site owners, one that has led to an unnatural distortion of the internet’s linking structure.

    If one person owns 200 web sites, it doesn’t necessarily mean that it’s a one-man show. Ever heard of outsourcing? There is nothing wrong with being entrepreneurial. If I hired 200 different niche experts to manage 200 separate niche web sites, I shouldn’t be penalized for that. That scenario is not farfetched; it’s a description of a company. If Google chooses to discourage this, then the ecology of the Internet will get further twisted and distorted.

    How far will the duplicate content issue get pushed? Some well-known Internet marketing ‘gurus’ are dispensing some really ludicrous stuff about this. One in particular says that I must avoid duplicate content at the sentence level. I mean, come on, am I to avoid using common expressions of speech? Are we to change the way that human beings naturally communicate? I really should confront the ‘guru’ about this particular absurdity, but his blog is closed to comments, so I thought I’d toss it in here.

    Well, my diatribe is complete.

  84. Hello Matt,
    Our site is the HiFi site mentioned above in your blog, one of the ones reviewed at Pubcon in Vegas. The URL is…

    http://www.hifisoundconnection.com

    Once we returned home from Pubcon, I was very puzzled by the comment that we owned many different telecom sites, which in fact we did not. Upon checking with our 3rd-party provider, Infopia, and the domain host they use, Oracle, we found that our site had been grouped with about 50 others, as you mentioned, and this must have been what you were seeing. Upon notification of this (I pointed them to your blog as proof), Oracle claimed to have fixed the problem. I am just wondering how widespread a problem this is for site owners like myself who trust 3rd parties but may have damage like this, which we do not even know about, affecting our rankings. Is there any way for the average site owner to check for problems such as this?
    Thank you again for reviewing our site. I am really grateful that you pointed this out to us, or we would never have known.

    Chris

  85. Matt,
    Also, what kind of penalty could this have inflicted on our site, and what is the best way to tell if it has been removed? We did notice an increase in our Google rankings once Oracle removed us from that URL grouping; however, we were making all kinds of changes at that time.
    Thanks again,
    Chris

  86. Good read, Matt. I also agree that content alone is difficult, especially if you do not market it. I am really waiting for the next update to see if Google will change the way SEO is done.

  91. Hi
    I have read your comments, Matt, but there is nothing definitive to answer my question, which I am sure many others also wonder about.

    I have a .com web site that is hosted in the US for cost reasons.

    I also have registered .co.uk and .ie domains for the same domain.

    The .ie is hosted in Ireland, and the .co.uk is hosted in the US, I think.

    All domains 301 redirect to the .com address.

    Obviously, hosting duplicate sites in Ireland and the UK would lead to problems, and not only that, it’s almost impossible to manage three separate websites.

    What is the best solution here, to get good rankings for the .com, .co.uk and .ie in Google?

  94. Hey Matt, thou shalt not speaketh about the DrinkBait without a Linketh to the Drinkbait! (especially when you even put the .com after it!). Hook me up man!

  95. Hi Matt,

    On the issue of domain privacy: I run a genuine niche adult dating site and use domain privacy to hide my home address. a) Will this affect our SE rankings? And b) if I removed it, would SEs then see the domain as being registered by someone new and sandbox it again?

    On a similar note: if a site does change ownership, does it get sandboxed again?

    The reason I ask is that we have set up a new company, and we wonder if we should change the registration to our new company name or leave it in my own name.

    Many thanks,

    Paul

  96. PS: Do they ever review adult sites at Pubcon or other conventions? If not, who would you recommend to perform a site review? We do not want to go down the route of paying for an SEO review only for the reviewer to use black-hat methods that risk us receiving a penalty.

    Cheers,

    Paul

  97. What is the best way to do SEO for real estate sites in Turkish?
