ASP.NET 2 + url rewriting considered harmful in some cases

Sometimes people ask me “Does Google make any distinction in scoring between Apache, IIS, or other web servers?” And I’m happy to say “Nope. You can use any web server you like and Google will rank the pages independently of the web server platform.”

But someone in AdSense mentioned an interesting case that they’d heard of. Apparently, doing url rewrites in ASP.NET 2 can sometimes generate an HTTP status code of 302 instead of a 200. This issue isn’t specific to Googlebot (it would impact any search engine bot). The best write-up I’ve seen is at http://communityserver.org/forums/536640/ShowThread.aspx. It looks like one of the first places to notice this was here (note: that post is in French; an English translation is here).

It sounds as though if this issue (ASP.NET 2 + url rewriting generating a 302 instead of a 200) affects you, your site may drop out of most search engines. So how would you debug this? Fiddler is one handy tool for Windows. For Firefox, you might use the Live HTTP Headers extension to see the actual request your browser sent, and the raw reply from the web server.
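
If you’d rather script the check, here’s a minimal C# sketch (the URL and rewritten path are made up) that prints the status code a bot-like request actually receives:

    using System;
    using System.Net;

    class StatusCheck
    {
        static void Main()
        {
            // Hypothetical rewritten URL to test.
            HttpWebRequest req = (HttpWebRequest)WebRequest.Create(
                "http://www.example.com/rewritten/page.aspx");
            // The reports say the bug only fires for certain user agents,
            // so pretend to be a bot.
            req.UserAgent = "Googlebot/2.1 (+http://www.google.com/bot.html)";
            req.AllowAutoRedirect = false; // surface a 302 instead of following it
            try
            {
                using (HttpWebResponse resp = (HttpWebResponse)req.GetResponse())
                    Console.WriteLine((int)resp.StatusCode + " " + resp.StatusDescription);
            }
            catch (WebException ex)
            {
                // 4xx/5xx responses arrive as exceptions with the response attached.
                HttpWebResponse resp = ex.Response as HttpWebResponse;
                Console.WriteLine(resp != null
                    ? (int)resp.StatusCode + " " + resp.StatusDescription
                    : ex.Message);
            }
        }
    }

A healthy rewritten page should print “200 OK”; a bare 302 (or a 500) is the symptom described above.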

I would also recommend Google’s Sitemaps tool. That team recently upgraded Sitemaps to show more details on errors that Googlebot saw when we tried to fetch pages from a site. The upgraded Sitemaps console also lets you download errors as a CSV file for debugging. I found out that I had a few urls with errors:

[Screenshot: Sitemaps errors]

Clicking on the red oval above lets me download a file listing the problems that Googlebot had crawling my site.

(Thanks for mentioning this, Antoine!)

66 Responses to ASP.NET 2 + url rewriting considered harmful in some cases

  1. By far the best feature of Sitemaps. It’s worth it just for that. Now you know what page gbot had a problem with. It’s helped me out on more than one occasion when I’ve made a monster mistake in an include file and things seem fine on my end, but then I see that gbot tripped over it 3 days later. Without that knowledge I may have found out a month later, after much damage had been done to my listings.

    They also expanded the anchor text list on other sites. That helps in knowing what other people may have done without your knowledge. I think the Sitemaps program is going to evolve into great things. And I can’t say enough about the help that Vanessa (or whichever Google employee it is that day) has given in the support group. The topics often stray, but some pretty sound information is exchanged between Google and webmasters there.

    John

  2. Even after reading the two posts, I couldn’t quite figure out what the final result was – didn’t the 302 toss an alternate URL that then failed with a return code of 500 – Server Failure (?)

    And purely as an academic exercise, perhaps there’s a bright side here: at least ASP.NET was tossing a 302 (temporary) rather than a 301 (permanent), which would probably be a stronger signal (although not quite as much as a 404 from the original URL) to the search engine that the page no longer exists?!? 😉

    I.e. the related question is if a web server returns a 404 by mistake (or something is screwed up as in this example), does GoogleBot swing by again to give it a few more shots before it decides the page is really, really gone? Or assuming inbound links, does it still swing by to check it out periodically regardless of previous response codes?

    alek

    P.S. I’ve used Live HTTP Headers extension for a while – rocks.

  3. Perhaps it might also be good to talk about 302s in general and how they can damage your website externally, unless all the strange things we see, like sites losing their rankings, are just foul play from spammers abusing the dup filter.

    Thanks Matt

  4. KAPA, I haven’t noticed any “problems” with 302s in general as long as they resolve to a 200 page, and you don’t have redirect after redirect after redirect. After these past few Google updates, I’ve noticed great improvement in how Google crawls my 302-redirected “search engine friendly” urls. We have gone to great lengths to make sure that all links on the site use the “friendly” url and that the “ugly” url isn’t a link anywhere, to keep Googlebot from thinking we have duplicate content. Our 302s are working as well for MSN and Yahoo.

    So what does Matt think about 302s….??? That’s the big question.

  5. While you say that your spider has no preference for either, can I note that it is far easier to set up 301 redirects from non-www to www on Apache than it is on IIS. With IIS you usually have to pay for extra modules (ISAPI_Rewrite, etc.).

    More importantly, Apache URL requests are case sensitive, meaning that it is far harder to get accidental duplicate content. A request for page.html serves the page of content that you expected, and a request for Page.html serves a 404 error on Apache, while on IIS, both URLs would serve the same content and lead to “duplicate content”.

  6. Were there any comments from Rob Howard?? The Telligent server guy??

    I find it strange that I did not pick up this chatter from the ASP.NET folks.

    I was under the impression that this was a closed issue. Am I wrong?

  7. I’m sure Google doesn’t discriminate between web servers as such – but as an IIS user – here’s something that I think may cause problems: Windows case insensitivity. For example:

    On Apache, if you create a page called “mypage.html” – Google will index and rank it. If anyone creates a link to that page using different case, the link will be broken (and probably noticed and fixed).

    But on IIS, you can create “mypage.htm” – but it’s really easy to end up getting links to “MyPage.htm”, “MYPAGE.htm”, “myPage.htm” etc etc. Of course these pages are all identical – but Google seems to index them all, presumably splitting page rank between them.

    I’m starting to implement a re-direct to make sure all pages go to their correct “case” – but I bet most windows webmasters don’t.

    Could Google implement something so that pages where the page names are identical but in a different case are amalgamated if they’re being served from a Windows server?

    Might be too big an overhead I guess – but I bet it would save some space in the index, and should give slightly better search results, as it would accurately represent the fact that on a Windows box, they’re all the same page.
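
    A sketch of that kind of case-normalizing redirect in ASP.NET (assuming the site can run managed code; the exact rule is up to you):

        // In global.asax: 301 any request whose path isn't already
        // lowercase, so only one casing of each URL ever gets indexed.
        void Application_BeginRequest(object sender, EventArgs e)
        {
            string path = Request.Path;
            string lower = path.ToLowerInvariant();
            if (path != lower)
            {
                Response.StatusCode = 301;
                Response.AddHeader("Location", lower + Request.Url.Query);
                Response.End();
            }
        }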

  8. It is a problem with the Community Server application only; it has nothing to do with ASP.NET 2.0.

    Your title suggests it is an ASP.NET 2.0 problem.

    I have ASP.NET 2.0 + url rewriting (http://blogs.x2line.com) and I get 200 in the above cases.

  9. Most web frameworks use 302 redirects to… well… redirect. I’ve never noticed a problem before – what exactly is the issue?

    Also, the sitemaps screenshot shows 4xx errors – are these the pages that the 302 resolves to?

  10. /pd and alek, this is not a conventional 302 issue. I believe it returns a 302 without giving any destination url.

  11. Hey Matt,

    Very interesting post. It’s great having you back.

  12. Matt,

    I went back and re-read things again and I think I have it correct. From this writeup – http://todotnet.com/archive/0001/01/01/7472.aspx – it appears the server bombs when you are redirecting but ONLY for certain user agents – and suggests the final result is an error code 500 which is an Internal Server Error per http://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html

    If you/the AdSense guys have an ASP.NET page that demonstrates this behavior, it would be easy to run Live HTTP Headers on it and see what the final response is. A 302 with target URL would just try to refresh the original page (right?) – so heck, that may be why you end up with a 500, since eventually the servers say that’s enough. I betcha a beer (or how ’bout (more) flowers for your wife versus my wife) that’s the final response! 😉

    And yea, from the writeup, this is a bizarro bug and doesn’t appear targeted at Google in any way … despite what the tin-foil hat crowd might think.

    alek

  13. Oops – add these two words to the post above:
    “A 302 with *NO DESTINATION* target URL would just try to refresh the original page (right?)”

  14. While you say that your spider has no preference for either, can I note that it is far easier to set up 301 redirects from non-www to www on Apache than it is on IIS. With IIS you usually have to pay for extra modules (ISAPI_Rewrite, etc.).

    Actually, there are two options in this regard (IIS 5.0 and beyond) that are free and don’t require much extra effort.

    The “Set Up Two Sites” In IIS Way

    Basically, it’s what it says it is. You set up two different sites, one for the non-www and one for the www version. Then, for the home directory of the non-www version, set up a redirect to the www version and ensure that it’s a permanent redirect for this resource (301).

    The other way is through ASP.

    That’s the oversimplified version, but it can be made to redirect to individual pages and stuff. And it does work.

    So…it can be done in IIS, and for free. Just in case anyone was confused by this.

    As far as the .NET thing is concerned, that doesn’t surprise me. I’ve dabbled in it and it’s a bit of a mess as far as some things go, so I stuck with classic ASP. But that’s just me.

  15. Dammit…my code didn’t show up.

    Matt, make it show up. I’m too lazy to retype it. 🙂

  16. The question is actually what kind of URL rewriting Community Server is using. We do URL rewriting in ASP.NET v2 by hand in the global.asax and do not experience any of these problems. Everything is well `200’ed` 😉
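
    For reference, a minimal sketch of that by-hand global.asax approach (the /articles/ pattern and page names here are made up):

        // In global.asax (the System.Web namespace is available by default there):
        void Application_BeginRequest(object sender, EventArgs e)
        {
            string path = Request.Path;
            if (path.StartsWith("/articles/"))
            {
                string slug = path.Substring("/articles/".Length);
                // Server-side rewrite: the visitor's URL never changes
                // and the response goes out as a normal 200, not a 302.
                Context.RewritePath("/article.aspx", String.Empty,
                    "slug=" + HttpUtility.UrlEncode(slug));
            }
        }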

  17. Sorry for doing a second post, but so far I can’t confirm that this is an ASP.NET v2 issue. See my latest blog post for details.

  18. Hi Matt,

    All this redirect stuff has caused many problems for many sites. When I started using a 301 from the non-www to the www version, many “bad things” happened to my site in Google. It has come to the point that I am now considering removing the 301 to see what happens…

    Is it safe to believe that not having a 301 from the non-www to the www version will somehow fix my problems? I see that your own blog does not use this.

  19. Isn’t this a general case of: if your server is broken, the search engines will ignore it? Yes, .NET (1-2) has issues, but if you know how, they’re easy to work around so you can still make a technically sound site.

    Matt: How does Google handle the case-issue? Does it recognize IIS servers and ignore URL case or does it handle it with the “usual” duplicate content filters?

  20. Personally, on IIS I prefer doing it with an ISAPI filter anyway, because the regex is a bit more standard.

    Nick, rewrites != redirects
    http://msdn.microsoft.com/library/default.asp?url=/library/en-us/dnaspp/html/urlrewriting.asp

  21. Matt, make it show up

    Use

  22. Yup .. I’m way smarter than you are 😀

    use

  23. I’ve developed some major ASP.NET / open source consumer web sites: one with .NET 1.0 and a second as a member of the .NET 2.0 Atlas beta. When building the site with .NET 1.0, I used an ISAPI filter. Unless you need to do some very sophisticated processing (beyond what’s easily done with regex), it’s really better to stay on the ISAPI filter route for performance and maintainability reasons. Take a look at isapirewrite.com

    By the way, this sounds like a problem with Community Server (www.communityserver.org) as an earlier post noted, not with ASP.NET 2.0

  24. Hi Matt,

    Just a small “detail”: the bug throws a 302 error for the Community Server engine (an ASP.NET 2 blogging engine), but for a “normal” site, if you don’t redirect errors (302), Googlebot will get a 500 error.

  25. On IIS, I had to set up 404 errors for any case-issue other than lower case. Then I submitted a Google XML sitemap that was only lower case. Finally, I used the URL removal tool to manually remove some of the UpPeR case URLs that had been indexed and were just duplicate pages of lower case URLs.

    It has taken 6 months, but the duplicate content issues are fading away and are almost gone…

    Great to have you back Matt!!!

  26. Thanks Matt! Great to have you back. Even though you were only away for a month, I missed your posts. This one on url rewriting is very helpful for a new client I have, so I’m glad you posted about it.

  27. But the sitemap crawl errors STILL don’t tell us WHERE Google was when it found the error.

    It gives me a url that it couldn’t find. Usually it’s obvious that this was an href error somewhere: probably I left out “http://” when I shouldn’t have, so of course that’s seen as local but it isn’t.

    But WHERE? Where was Google when it found that? Knowing that would save me tremendous effort in tracking down the problem and fixing it.

  28. Matt,

    I’ve been doing ASP rewrites in classic ASP for years. I’ve always done this with the Server.Transfer method. It sends out the 200 status code instead of a 302. I also use a similar method for pages which have moved and use the 302 status code. I use the 302 permanent redirect with pages that have moved, such as http://www.truckinfo.net/truckingschools.asp. It’s all dynamic but appears as static pages.

  29. Oops, I meant a 301 redirect… not 302.
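
    For anyone following along in ASP.NET rather than classic ASP, a rough C# sketch of the two techniques (the paths here are placeholders):

        protected void Page_Load(object sender, EventArgs e)
        {
            // Rewrite-style: hand the request to another page server-side;
            // the browser still sees a 200 for the URL it asked for.
            Server.Transfer("/trucking-schools.aspx");
        }

        protected void OldPage_Load(object sender, EventArgs e)
        {
            // Moved-page style: a real 301 so engines update their index.
            Response.StatusCode = 301;
            Response.AddHeader("Location", "http://www.example.com/new-page.aspx");
            Response.End();
        }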

  30. Good to see you back Matt…

    You’re already helping me out.

    This ASP rewrite thing was giving me a brain tumor.

    Google needs to give you a big raise.

  31. Hi Matt,

    I understand that Google has a policy of not discussing/announcing algorithm changes, for fear that any such information would be exploited by spammers.

    All well and good, but here’s what I don’t get: What would be the harm in at least acknowledging when a major change has occurred? Those of us who have been badly affected are fairly sure that Google introduced a major change – probably a new (desperately misguided) spam filter – on the 27th of June (–use_experimental_spamscore ??). So far, you have chosen to ignore all requests for any information regarding this change.

    What would be the harm in acknowledging the change (without disclosing any details), and recommending a course of action for those of us who may have been caught in the crossfire? A special email address to contact, for example, would be a big help. Surely the large number of complaints from webmasters (all over all of the forums) could be taken as a sign of serious unintended consequences of the recent change/s that, at least, merit a degree of urgent manual review?

  32. If you have 2 services or products that are not related, should you have one website or two? Example: Let’s say I design websites and I also sell point of sale software. Should these 2 distinct services be on the same site, or will it end up hurting SEO results?

  33. It’s definitely not an ASP.NET problem .. I have been using url rewriting for a long time with .NET and not had any issues getting sites spidered.

    The only issue I have experienced is that the form is by default rendered to submit to the real filename, but as a relative path .. e.g. /fake/page.aspx wants to submit to “fake.aspx” in the /fake/ directory.
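
    A widely used workaround for that form-action problem (this is the approach from the MSDN url-rewriting article linked in an earlier comment, trimmed to a sketch) is a form control that omits the action attribute entirely, so the browser posts back to the rewritten URL it is already on:

        using System.Web.UI;
        using System.Web.UI.HtmlControls;

        // Drop-in replacement for the page's <form runat="server">
        // that renders no action attribute at all.
        public class ActionlessForm : HtmlForm
        {
            protected override void RenderAttributes(HtmlTextWriter writer)
            {
                writer.WriteAttribute("name", Name);
                Attributes.Remove("name");
                writer.WriteAttribute("method", Method);
                Attributes.Remove("method");
                Attributes.Render(writer);
                if (ID != null)
                    writer.WriteAttribute("id", ClientID);
            }
        }

    You register it like any custom control and swap it in for the standard form tag.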

  34. First, it is an ASP.NET issue. And the reason for the 302 results is that Community Server will redirect to an error page (logging the exception at the same time).

  35. Ben, I can tell you this is definitely a .NET 2 bug (or an IIS 6 bug, or both); please read the posts that explain the problem before saying it is not.
    The problem does not occur in all rewriting cases, so please take time to read the posts… If you can’t reproduce it, maybe you are a lucky guy with a non-public patched .NET 2 version…

  36. I use URL rewriting extensively on my golf course map & directory site, FindMyGolfCourse.com, and I just checked it with Fiddler (I got the latest version right before that). I put in the Yahoo bot agent string. Got a 200 status code, which is what I would expect to see. So I wonder if it is something to do with Community Server, or maybe the built-in rewriting in ASP.NET 2.0. I wrote a blog post about URL rewriting early this year, and that is the code I use for my rewriting engine, so maybe there is a huge difference, I don’t know. My blog is Community Server, but the older version, 1.1, and I do not seem to have problems getting indexed with it.

  37. ** How does Google handle the case-issue? Does it recognize IIS servers and ignore URL case or does it handle it with the “usual” duplicate content filters? **

    It indexes the multiple URLs, and makes some of them Supplemental:
    http://www.google.com/search?q=cosnam+po6&filter=0

  38. Hey Matt, question for you.
    Somebody is currently DDOSing one (or several, I can’t tell) of my websites.

    We’ve unplugged the server from the ethernet jack hoping to stop it (10 hours later, it didn’t work).

    Anyway.. as Googlebot visits me daily, what effect will this have on my rankings?

    I’m 90% sure that the culprit is a competing site of mine, as right before it started, this “hacker group” launched a competing site to mine and started smearing mine on forums.

    anyway, is it possible they’re doing this to knock me off page 1 of the search engines? Will that work? I don’t think I saw that SEO technique mentioned in any books.

    God I wish they’d stop.

  39. Nix: Please post a reproducible example. I’m not able to reproduce this bug in any way, so for me it’s still not an ASP.NET v2 issue.

  40. Great error report Matt. With the large uptake of .NET v2 for its friendly URL option, this is a timely warning.

  41. Hi Matt,

    Interesting about Sitemaps, but I am confused. I have taken a lot of pages off my site in the last month, and many pages were removed from our server a long, LONG time ago, but despite these pages 404ing (I have checked) they are still in site:www.firefly-it.com

    Should these not drop off at some point!!? Some of the pages have been removed for nearly 1 year now. This just seems like a waste of server space for you guys to keep cached copies of pages that have not been there for such a long time.

    We are really trying to work on the quality of our site and writing good content. Fair play, a little while back we had a directory of web design plus city names, which you obviously thought were doorway pages, but at least the quality was there, i.e. no crappy ads, and the content was good. It was just that we couldn’t be assed to write about web design in every town in the UK. Anyway, you removed those pages from your index and we subsequently removed them from our site to match your quality guidelines, but looking again now I notice that some of them are back in your cache. I am just not sure what it is that you are up to. It seems that your indexing system / rules for caching change day in day out. And I mean, why have they come back into the cache if we have taken them away? We have no crappy outward links; everything on the site is cool so far as I can see. It just appears to me (black box view) that there is something up with your indexing / caching system. 1 year to drop pages that are not there is too long. Here are some examples which are particularly bothering me.

    A CRM demo we did a LONG time ago. STILL THERE?
    Contact List Contacts List Groups E-Mail Campaign Action History …
    Firefly IT — CRM System. … Contact List. Create New Record. Previous Page – 1 – Next Page Viewing records 1 to 1 of 1 (Page 1 of 1) …
    http://www.firefly-it.com/firefly-crm-demonstration/?PHPSESSID=89fabdd636786260eba40bbccff618ef – 5k – Supplemental Result –

    These are the pages you thought were doorways (questionable, I’d say), but we removed them a long time ago, yet they’re still in your cache. WHY?
    Website Design Aberfoyle, Aberfoyle Website Design website design …
    Website design Aberfoyle. … website design Aberfoyle Website Design Aberfoyle. Welcome to Firefly IT where website design is our speciality ! …
    http://www.firefly-it.com/website-design-a/website-design-aberfoyle.html – 14k – Supplemental Result

    A CMS demo we did yonks ago too – still there 🙂
    phpCommander Create a file Create a directory Refresh Upload file …
    Actions . . img, 4096, 2004 – 11 – 26, drwxrwsr – x. langues, 4096, 2004 – 10 – 21, drwxrwsr – x. htmlarea, 4096, 2004 – 10 – 21, drwxrwsr – x …
    http://www.firefly-it.com/phpcommander/cms/index.php?Directory=.%2Fcms&sort=sortDateDESC – 65k – Supplemental Result – Cached – Similar pages

    Cheers Matt, and hope you had a good holiday 😉

  42. Andreas… there is a link to a zip in the 2 posts (in the French and English versions). Guys, did you read the posts?

  43. Hi Matt,

    Had a relaxing vacation? Good post about the distinction between web servers.

    However, Google treats some pages strangely:
    I redirect all my sites’ old files neatly with 301s in Apache, put a good sitemap on the site and submit an xml file to Google Sitemaps. With all this effort, Google still seems to have problems indexing the domain: it indexes http://www.domain.com thoroughly, but it also indexes domain.com, grrr.

    I try to prevent this (duplicate content) with:

    RewriteCond %{HTTP_HOST} !^www\. [NC]
    RewriteRule .* http://www.%{HTTP_HOST}%{REQUEST_URI} [QSA,R=301,L]

    But the Google spider won’t listen. It still lists the domain.com pages; nothing seems to work ;-(

    About the distinction between web servers and redirection: it’s far easier for Apache users to make a 301 redirect with a .htaccess file than for IIS users, who have to put in much more effort: contact their administrator to install an ISAPI filter, or use the IIS manager to redirect all their files.

    Great tool, that Google Sitemaps.

    Greetz from the Netherlands

  44. @ Nix – I’ll take your word for it – and yes, I’m lucky. I only get to play with fully up-to-date dedicateds heh.

  45. Thanks for the info on web servers and new updated sitemap.

    With a 301 redirect, it takes some time for Google to pick up the new pages being redirected to.

  46. Dave (Original)

    Paul, this works for me;

    Options +FollowSymLinks
    RewriteEngine on
    RewriteCond %{HTTP_HOST} !^www\.domain\.com [NC]
    RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]

  47. To summarize the problem:
    1. Use RewritePath(url, false); /* false IS important to reproduce the problem */
    2. Make a request to a subdirectory such as http://domain.com/toto/coucou with the user agent indicated on the blog.
    3. There’s an error!

    If you use true in the RewritePath method, all is ok :o)
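
    In code, the repro boils down to something like this (a sketch; the /toto/ mapping is made up, the two-argument overload is real):

        // In global.asax:
        void Application_BeginRequest(object sender, EventArgs e)
        {
            if (Request.Path.StartsWith("/toto/"))
            {
                // rebaseClientPath = false reportedly triggers the bug
                // for the affected user agents...
                Context.RewritePath("/real.aspx", false);
                // ...while passing true works fine:
                // Context.RewritePath("/real.aspx", true);
            }
        }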

  48. Matt, I have for a year or two suspected that a disturbingly high percentage of so called ‘google dropped my site’ or ‘google sucks because it isn’t ranking my site high’ is due to problems with rewriting of urls, and failure to rewrite them.

    This concrete example confirms my suspicion. For some reason, the vast majority of webmasters absolutely will not consult with experts when doing these rewrites, even though it’s the single place where an error can create nearly disastrous results, or totally disastrous ones. Since I know of few things harder than doing a quality url rewrite, it’s really not surprising that so many people get messed up like that. I’ve also suspected that of these poorly implemented or absent rewrites, a large percentage are based on IIS servers.

    Unfortunately, few webmasters ever bother to even mention the server their sites are running on, as if that doesn’t matter.

  49. Unfortunately, few webmasters ever bother to even mention the server their sites are running on, as if that doesn’t matter.

    The problem with that statement is the part that’s missing: a lot of webmasters, surprisingly and disturbingly, don’t even know what server and platform their sites are running on.

  50. h2, it’s true that pretending to be a search engine with telnet/Lynx or a user-agent switcher can often help debug stuff.

    Paul, I know that’s feedback that I’ve heard before and passed on. In an ideal world, no one should need to worry about www or non-www issues.

    Allan Stewart, in general that’s something where the next time the Supplemental Googlebot visits, it will see the pages are gone and drop them. As long as you’re returning a 404 or a 301 to a sensible page, I wouldn’t worry about them very much. It doesn’t harm you to have those extra pages showing.

    Ryan, I mentioned this recently in a post, but as soon as Gbot can crawl you again, you’ll usually start showing up again in a few days.

    lots0, glad I could help.

  51. The new enhancements to SiteMaps are awesome. Kudos to Google for making this search index information public and easy to get at.

    As far as the ASP issues go – man, why do people put up with so much pain hosting on Windows boxes? Just use Apache for God’s sake.

  52. As far as the ASP issues go – man, why do people put up with so much pain hosting on Windows boxes? Just use Apache for God’s sake.

    Show me where the pain is in a box that hasn’t gone down in 2.5 years, has had to be restarted exactly twice, and takes 2 minutes to set up a new site on it.

    Seriously, this is one of the biggest problems with ASP in general: people don’t know how to code it properly. And to be fair, part of that came from Microsoft’s default coding examples, which were meant to illustrate the easiest way to do something, not the best. Nevertheless, it’s an opinion based on ignorance.

  53. Note: the two restarts were due to server component installs, not crashing.

  54. Dave (Original)

    BUT, were the component installs to stop the server crashing? 🙂

    Aren’t Windows servers targeted by hackers more than Linux?

  55. I don’t like IIS because of all the duplicate content issues that it exposes, not least the Variable Case URL Problem.

  56. “upgraded Sitemaps to show more details on errors that Googlebot saw when we tried to fetch pages from a site.”

    You know Matt, that 302 thing is still such a problem… I market a site that was being 302-hijacked for a long time by a directory. Googlebot never got past the homepage of the site… and I have had Google Sitemaps on the site for a long time as well; however, there are never any Crawl Errors found. But I have 1000 urls in the sitemap, and only one page (the homepage) was ever indexed by Googlebot.

  57. Hi Matt,

    The information posted here about URL rewriting in ASP.NET 2 is really great. We have previously developed a few URL rewriting projects successfully in ASP.NET, and Google supports our websites very well. But after reading this useful information I will take more care while coding in ASP.NET 2.

  58. Matt,
    My competitor is doing some tricky things with his website to boost his rank, which I thought Google was supposed to catch. The term Sacramento Acura is very important to both of us. His website is http://www.elkgroveacura.com. He also owns an EXACT duplicate called http://www.sacramentoacura.com which directs all of its links to http://www.elkgroveacura.com. This has made it impossible to take the #1 spot for the search term. I thought Google was penalizing webmasters for this kind of activity. I have reported him several times via Sitemaps to no avail. Is anyone looking at this kind of behavior?

  59. Well, now I know where to buy my next Acura.

    🙂

  60. Hi,

    I’m hoping someone could help me with a dilemma I’m facing right now. After reading articles about dynamic URLs and how they are not SEO friendly, I am convinced that to help improve our website’s SERP position we should consider using static URLs. But the problem is convincing our programmers, because doing the rewrite will take them months. So here are a couple of questions for you guys…

    Our site uses the following URL format for almost all of the articles we publish:
    http://www.xyz.com/xyz/headlines/article_display.jsp?xyz_content_id=1002985563

    1. Will search engines have a problem crawling or indexing such pages?
    2. Is it possible to get high rankings in SERPs with that URL?
    3. Do you think rewriting URLs for dynamic pages such as the example above is worthwhile?
    4. Based on experience, roughly how long does it take to do the rewrite?

    Hope to hear from you! Thanks!

  61. Hey,
    I just wrote an article about URL rewriting in ASP.NET 2.0 for virtual directories.

  62. Thanks for info.

    I’m developing ASP.NET 2.0 web controls to display Google AdSense units (e.g. AdUnit/LinkUnit). I used url rewriting for the library’s website http://AdSenseASP.NET/2.0-WebControls/ but will check whether it is an issue or not.

  63. Hi

    I am using ASP.NET 2.0, Isapi_Rewrite (Helicon).

    I have a set of pages that, when requested under the ISAPI filter, render, and then after a minute or so they display “Internet Explorer cannot display the webpage”.

    The page is rendered in what would be a .NET 1.1 virtual directory, but is running under .NET 2.0 (but I couldn’t see this as an issue).

    Has anyone else had this? Is it an issue with .NET 2.0 or the ISAPI extension? I have raised this with Helicon, but no response as yet.

    The links are as follows for your info:

    Under ISAPI – http://www.packyourbags.com/Travel-Guide/Default.aspx
    Not under ISAPI – http://www.packyourbags.com/pys-netv2/travel-guide/Default.aspx

    If anyone has come across the same thing I would be interested to know.

    Regards
    Anthony Hook

  64. As usual, great article Matt!

    You might like Force Apache to output any HTTP Status Code with ErrorDocument

    I set up an automated system to view all 57 Apache response codes and ErrorDocuments, saving the headers and returned content for future reference. Use this page as a reference when designing scripts that use headers. Ex: 404 Not Found, 200 OK, 304 Not Modified, 503 Service Temporarily Unavailable, etc.

    When a status code is encountered, Apache serves the header and the ErrorDocument for the error code. So you can see any header and ErrorDocument by causing that error on Apache.

    For instance, if you request a file that doesn’t exist, a 404 Not Found is issued and the corresponding ErrorDocument is served with the 404 Not Found header. So we can see what Apache 404 errors and response codes look like, but how do we cause errors for the 56 other Apache response codes?
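
    For context, the ErrorDocument directive being described pairs a status code with a page, one line per code (the paths here are assumptions):

        ErrorDocument 404 /errors/404.html
        ErrorDocument 503 /errors/503.html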

  65. I use IIS Rewrite for IIS. I am able to write user friendly URLs like:

    http://jobs.verkoops.com/items/job_industry_equal_technology_and_keyword_equal_sql_and_salary_over_75000

    In a regular ASP.NET page this would have a bunch of URL parameters, but not with IIS Rewrite for IIS.

  66. Has anyone come up with a rewrite rule to be able to rewrite an arbitrary number of querystring elements?

    Ex:
    /sb-kw-KEYWORD-page-3-anything-anythingvalue-anything2-anything2value-.aspx

    nextsearch.aspx?keyword=KEYWORD&page=3&anything=anythingvalue&anything2=anythingvalue2

    etc

    Thoughts?
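
    One way to handle an arbitrary number of elements is to skip the rewrite rule entirely and pair up the path tokens in code. A sketch (the /sb- prefix, -.aspx suffix and target page mirror the example above; the module name is made up):

        using System;
        using System.Text;
        using System.Web;

        public class PairwiseRewriteModule : IHttpModule
        {
            public void Init(HttpApplication app)
            {
                app.BeginRequest += delegate(object sender, EventArgs e)
                {
                    HttpContext ctx = ((HttpApplication)sender).Context;
                    string path = ctx.Request.Path;
                    if (!path.StartsWith("/sb-") || !path.EndsWith("-.aspx"))
                        return;

                    // Drop the "/sb-" prefix and "-.aspx" suffix, then pair
                    // the remaining '-'-separated tokens as name=value.
                    string middle = path.Substring(4, path.Length - 4 - 6);
                    string[] tokens = middle.Split('-');
                    StringBuilder query = new StringBuilder();
                    for (int i = 0; i + 1 < tokens.Length; i += 2)
                    {
                        if (query.Length > 0) query.Append('&');
                        query.Append(HttpUtility.UrlEncode(tokens[i]));
                        query.Append('=');
                        query.Append(HttpUtility.UrlEncode(tokens[i + 1]));
                    }
                    // Server-side rewrite, so the response stays a 200.
                    ctx.RewritePath("/nextsearch.aspx", String.Empty, query.ToString());
                };
            }

            public void Dispose() { }
        }

    Register it under <httpModules> in web.config; the obvious caveat is that values containing a hyphen would need an escaping convention.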
