ASPnix adds ISAPI Rewrite - Finally!

Back in July of 2006 someone asked on the forum for ASPnix, the web host that specializes in CommunityServer, to add ISAPI Rewrite to their servers so that customers can clean up their URLs. Seven people including myself chimed in to ask for it. Over the past eight months, little was said by ASPnix except by a former staffer who implied it would harm the stability of their servers and who gave no real indication that any consideration was being made to offer a solution for URL rewriting.

Well, on Feb 22nd, Roma confirmed that ASPnix will finally be offering ISAPI Rewrite on its web servers. That’s yet another IIS-centric web host that has finally freed its customers from the shackles of poorly designed URL Hell! Hooray!

Now let’s just hope that Scott Watermasysk can be convinced to add URL rewriting support to CommunityServer using ISAPI Rewrite, to eliminate .ASPX extensions and more, sooner rather than later.

Buzzwords

They say people can’t understand an abstract concept unless they have language to describe it. For example, because Tahitians don’t have a word for sadness, they think of sadness as they would a physical illness.

As we are immersed in a world of rapid change, we need many new words to describe previously unidentified concepts. And when one of those new concepts inspires the masses, the media latches on and a buzzword is born. And though everyone scoffs at them, we simply couldn’t discuss so many new concepts without using buzzwords. Like it or not, buzzwords are here to stay as the pace of change accelerates.

Recent examples of Internet buzzwords are ‘AJAX’ and ‘Web 2.0’, with the latter often being derided as meaningless and just hype. But ‘Web 2.0’ is, by definition, not meaningless! Nay, the term ‘Web 2.0’ identifies the nature and level of activity on the web not seen since the dotcom crash. So if ‘Web 2.0’ were truly meaningless, there wouldn’t be a buzzword for it! Of course, whether or not ‘Web 2.0’ actually describes anything of tangible value distinct from prior periods is a matter of significant debate. :)

The reason buzzwords are so beneficial and will continue to be used is that they give people a shared context in which to communicate efficiently, and that has incredible value. Of course most buzzwords are merely shorthand for "the next big thing," but that’s just the nature of the hyped-up world we live in.

As an aside, the reason the term ‘Web 2.0’ has attracted so much derision is that it grouped together hard-to-pin-down concepts having little more in common than the current era. The shared context for ‘Web 2.0’ is ‘the period starting around 2003’, and since there is little value in discussing ‘the benefits of the period starting around 2003’, the value of the shared context is diminished and dissonance results. It would have been much better had the purveyors of Web 2.0 done more to segment and focus attention on the individual concepts instead of defining the umbrella that covered them. Ah, but easier said than done.

On the other hand, when a buzzword defines a concise and well understood concept, the shared context can create many orders of magnitude more value than the concept on its own, as has been the case with the term ‘AJAX.’ Of course the downside to buzzwords is that wherever they go, hype will follow, and that you just can’t avoid!

Can Microsoft’s Developer Division Compete Moving Forward?

I’ve been planning to blog about this for some time but just haven’t gotten to it. Well here goes…



Note: The day after I posted this I decided to add headings to make the argument easier to follow.


Is Microsoft’s Approach Failing?

I believe Microsoft’s legacy processes simply cannot react fast enough to the innovation happening in the open source arena on the language and web framework fronts. Microsoft’s developer division typically operates on three-year version cycles in which it first architects Visual Studio and related technologies in a vacuum. In recent years they’ve even thrown out alphas and betas to the Microsoft faithful to get feedback, and thankfully they’ve used a lot of that feedback. But that approach just isn’t working in the market anymore. When the release cycles of scripting-language frameworks like Ruby on Rails and Django and CMS platforms such as Drupal are sometimes as little as a few months, it’s really hard to wait around for the next version of Visual Studio.

After Ten Years: Too Little, Too Late?

It would be different if Microsoft’s developer technologies provided at least 95 percent of what workaday developers need on a daily basis, but they don’t. Case in point: we still don’t have the ability to do complete URL rewriting for ASP.NET on IIS even though Apache has had mod_rewrite for years. Looking back, how many years of massively duplicated developer effort in the field did it take before Microsoft finally provided a login module and a method of managing site-wide templates (i.e. "MasterPages")?!? Oh, about a decade from when they first released Active Server Pages.

Providing Solutions Frequently Just Not a Priority

It’s not just that Microsoft’s developer division takes too long to offer new solutions to recurring needs; it’s that they place such a low priority on providing those solutions. Three-year development cycles testify to that fact, especially when you consider it takes Microsoft many releases to address fundamental needs. The guys on the product management teams at Microsoft are really smart people, but they often can’t see how much trouble their decisions cause people in the field. They see the world of creating Visual Studio, but they don’t see the world of using Visual Studio to develop software.

Core “Real World” Problems Not Addressed

What’s more, Microsoft architects its developer products in a vacuum; they don’t use them to solve "real world" problems. Sure, they may use them internally for developing products, but when does the average developer’s project look like product development at Microsoft? They often create excellent software, but software that either doesn’t solve real world problems or does so in a totally over-engineered manner. While running Xtras I watched many a developer launch a 3rd-party component business because they had identified a need while working on a real world project. However, once they saw small success as a vendor they started developing, designing, and even envisioning new products in a vacuum. And often those products either didn’t address real world needs or did so in a really unnatural manner.

Microsoft is a much worse example of this. Their saving grace thus far has been the market share and financial resources to brute-force their products into the market, and many of the faithful won’t even look at others’ offerings to understand why some of Microsoft’s offerings so miss the mark. I know; until recently I was one of them.

Values "Sugar-Free" Over Productivity

And Microsoft’s product managers often dismiss feature requests that would make development a LOT easier as simply being "syntactic sugar." For example, one such dismissed feature request I made years ago was for simplified property references in VB.NET. I wanted a single-line syntax for declaring properties that didn’t need anything special, something like:

Property Foo Into _Foo

Instead of nine lines of:

Private _Foo
Property Foo
   Get
      Return _Foo
   End Get
   Set(ByVal value)
      _Foo = value
   End Set
End Property

That would have reduced the number of lines of VB.NET code by probably half an order of magnitude. But they just weren’t interested in it because it “bloated the language and otherwise had no value” (I am paraphrasing from memory.)

Focuses on Details, NOT the Big Picture

Even more, I advocated an advanced scripting language that would be a lot like today’s "in-vogue" scripting languages; I called my proposal VBScript.NET. But my suggestions were dismissed for esoteric reasons and I was told that Top Minds Are Working On It! (Well, evidently not, or so many developers wouldn’t be moving to PHP, Ruby, and Python.) Microsoft’s culture is to argue semantics when reality doesn’t match their world view, and they are blissfully willing to ignore the pain that continues to exist.

Revolutionary Paths Are Often Dead-Ends

What’s more, probably because of its financial resources and a hubris that comes from being the industry leader, Microsoft has a bad habit of attempting huge revolutionary jumps instead of small evolutionary steps. Rather than creating lots of little independent layers of loosely coupled components, each with its own independent functionality, documentation, and rationale for existence, Microsoft often builds monolithically complex solutions where the individual components are highly coupled, undocumented, hidden beneath the covers, and frankly carry functionality that was never fleshed out the way it would have been had each component been developed to stand on its own. This creates bloated and fragile systems that are often extremely hard to debug and that no passionate community of supporters surrounds.

ASP.NET: Wrong Medium, Wrong Model

ASP.NET is a perfect example of many of these problems. Rather than study the web and realize it was a violently different platform from desktop apps, Microsoft chose to shoehorn an event model onto the web and use a page-oriented implementation. Not only did they get the medium wrong, they also got the model wrong. And this decision resulted in an outrageously complex pipeline processing model with tons of code that is hard to debug or even understand, and that requires lots of high-end developers to figure it out and repeatedly explain to newbies what they need to do just to be able to do some of the simplest things, things that are brain-dead easy in PHP, for example.
But hundreds of thousands of Microsoft-centric developers just trudged along and accepted it as the next best thing because Microsoft said so. And for a short time, I was one of those true believers.

ASP.NET: Exceptional Engineering, Answers Wrong Questions

Now, however, even many Microsoft developers are starting to see ASP.NET for what it really is: an exceptionally engineered product that answers the Wrong Questions. Former ASP.NET developers are moving to the platforms I mentioned earlier (Ruby on Rails, Django, and Drupal) simply because those platforms offer developers the syntactic sugar they crave, and because the developers of those platforms focused on solving pain, as the pain they were solving was their own.

Open-Source: Answering the Right Questions, Rapidly

Open-source development by nature results in lots of little independent layers, and communities have sprouted or are sprouting to support each of those layers. Each of those layers has had an opportunity to be fleshed out, and by comparison it shows. How can something like open-source PHP on Apache take on mighty Microsoft’s ASP.NET and IIS, and win? Because they answer the right questions, and they did so in far less than a decade.

Is there any hope for Microsoft’s Developer Division?

Which brings me back to the original question:


Can Microsoft’s Developer Division Compete Moving Forward?

Frankly, though I really like the .NET Framework and hope I’m wrong, I’m completely skeptical.

Announcing WellDesignedUrls.org

Those of you who read my blog know that I strongly believe in the importance of URL design. For years it has bothered me that we see so many URLs on the web that look like the following example of poor URL design from Jeffrey Veen’s 2001 book The Art & Science of Web Design:

http://www.site.com/computers.dll?1345,1,,22,567,009a.html

Back in August of 2005 I finally got my thoughts together and wrote the post Well Designed Urls are Beautiful. Well, from anecdotal evidence (I don’t track my blog stats very closely) it appears that post has become my blog’s most popular post! The popularity of that post combined with several other facts inspired me to go ahead and launch a website with the following mission:

"Providing best practices for URL design, and to raise awareness of the importance of URL design especially among providers of server software and web application development tools."

The "facts" I referenced above are:

  • I continue to feel strongly about URL design yet many are still oblivious to the benefits,
  • I still have a lot more to say on the topic, and
  • It appears that good URL design is one of the many tenets of Web 2.0, partly because of AJAX, mashups, and REST-based APIs, meaning that it won’t be such an uphill battle!

The name of the website/wiki is WellDesignedUrls.org and for it I have the following goals:

  • To create a list of "Principles" as best practices for good URL design,
  • To cultivate how-to articles about implementing good URL designs on the various platforms like ASP.NET, LAMP and Ruby on Rails, servers like IIS and Apache, and web development tools like Visual Web Developer and Dreamweaver,
  • To cultivate general how-to articles and resources for tools such as mod_rewrite and ISAPI Rewrite and others,
  • To cultivate "solutions sets" for mod_rewrite and ISAPI Rewrite and others that can clean up the URLs on well known open-source and commericial web applications,  
  • To grade web applications, websites, and web development tools by giving them a "report card" on how well or how poorly they follow best URL design practices,  
  • To document URL structure of major web applications and major websites,
  • To recognize people who are "Champions for the URL Design cause" (those who’ve written articles and essays promoting good URL design), and  
  • To provide resources for further reading about good URL design.

The wiki is clearly new and thus a work in progress, so it will probably be a while before it realizes all these things I mention. However, as I have time and am able to recruit others to help, I think it will become an important advocate for good URL design and a great central resource for best practices. And if you’ve read this far, I’m hoping that you’ll consider either contributing when you feel you have something relevant, or at least start considering the value of URL design in your own web application development and point people in the wiki’s direction when applicable. Thanks in advance for the help!

P.S. I also plan to launch a WellDesignedUrl blog in the near future.

Subscribe to my RSS feed if you want to be notified when the blog goes live.

Anti-phishing tactic helps the “Well Designed Url” cause

Today Joris Evers at CNET posted an article about the security developers for the four main web browsers discussing how to make surfing the Web safer. One of the tactics mentioned was Microsoft’s plan for IE7 to show the URL in the address bar on all Internet windows to help users identify fraudulent sites. Whereas the trend has somewhat been for many websites to eliminate the address bar on their secondary windows to make their websites look slicker (see what happens when the bad marketing wonks get involved, and when techies become over-enamored with techniques like AJAX), this move will shine the light more brightly on the lowly URL.

In the past I have blogged about good URL design for websites and the related topics of wanting mod_rewrite functionality for IIS and the tool ISAPI Rewrite that brings mod_rewrite functionality to IIS, so it should be clear I’m passionate about the virtue of incorporating URL design into the overall design of a website. More specifically, my personal opinion is that URL design is one of the more important aspects of web design. And though at least one person in this world disagrees with me, Mark Kamoski is wrong. :)

What’s cool about IE7 requiring the URL to be visible at all times, besides the obvious anti-phishing benefits, is that it will hopefully cause more website stakeholders (marketers, developers, etc.) to think more about the design of their website’s URLs.

And that would be a good thing.

P.S. Actually, I’d love to see all Windows applications do what Windows Explorer does and support a URL of sorts (maybe call it an "LRL" as in Local Resource Locator?) Wouldn’t it be great to see apps like Word, Excel, QuickBooks, and even Visual Studio be written as a series of state changes where the URL/LRL could represent in a user readable format each uniquely-representable state (with some obvious caveats)? Just imagine how that would empower the creation of solutions by composing applications… but I digress as that is the topic for a future day’s blog post.

P.P.S. I almost don’t want to say this next thing as it could obviate the need for exposing URLs to guard against phishing, but I’m too intellectually honest not to. I see a huge market opportunity for Verisign, with the support of browser and server vendors, to enhance their SSL certificates to include a "Phishing-Safe" seal of approval. Today website owners only need to pay for a certificate if they are collecting sensitive information, but in the future I could see it becoming a de facto requirement for any website with a login to need a "phishing-safe" certificate, raising the bar on lots of hobby forum sites, etc. But I once again digress… Oops, I should have read the whole article before pontificating here; it looks like they are discussing just such a concept.

Links and Discussion related (indirectly) to ISAPI Rewrite

I just found a blog post by Shirley E. Kaiser on her blog Brainstorms & Raves containing an awesome collection of links and related discussion about Apache’s .htaccess.

While admittedly I write mostly for an audience of developers who use Microsoft technologies, many of the items discussed apply to Microsoft’s IIS if you use a 3rd-party tool named ISAPI Rewrite. This tool provides many of the same features as Apache’s .htaccess on IIS via its httpd.ini config file and implements most of its functionality in a manner identical to mod_rewrite on Apache. I absolutely love this tool and I’ve previously blogged about ISAPI Rewrite as well as The Importance of Well-Designed URLs, the latter of which is IMO the most important reason you absolutely need ISAPI Rewrite if you are hosting a website on IIS.
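To give a flavor of how the two line up, here is a minimal httpd.ini sketch, assuming ISAPI Rewrite 2’s syntax (the host name and paths are hypothetical, and the format differs subtly from mod_rewrite’s, so check the tool’s docs):

[ISAPI_Rewrite]

# Canonicalize the host name, one of the classic .htaccess recipes
RewriteCond Host: ^example\.com$
RewriteRule (.*) http\://www\.example\.com$1 [I,RP]

# Permanently redirect a page that moved so old links keep working
RewriteRule /old/page\.html /new/page\.html [I,RP]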

Anyway, the blog post covers topics and links to articles that:

  • Explain how and why to rewrite and/or redirect URLs,
  • Discuss techniques for reducing hotlinking and bandwidth theft,
  • Talk about blocking bad bots and comment spammers,
  • Cover regular expressions needed to match the URLs to rewrite or redirect,
  • Explain Robots.txt files,
  • List which bots are good and which are bad, and
  • Cover HTTP error codes.

A great resource!

All I want for IIS7 is my mod_rewrite!

I’ve recently been spending a lot of time pondering and pontificating on web architecture, and it occurs to me that Microsoft’s Internet Information Server (IIS), now in its sixth version, is still pathetically lacking in one key feature that I think is critical for properly architecting websites. And this key feature has been available for Apache for a long time in the form of mod_rewrite.

What is this key feature? The ability to create dynamically defined virtual URLs that contain only directories, i.e. URLs that don’t require a file name with a specific extension. Sure, you can easily support dynamically defined virtual URLs using a custom HTTP handler with IIS6 and ASP.NET; they will look something like this:

http://www.trucks.com/toyota/4runner.aspx
http://www.trucks.com/nissan/pathfinder.aspx
http://www.trucks.com/honda/ridgeline.aspx
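For what it’s worth, the handler side of that really is easy. Here’s a minimal sketch of such a custom HTTP handler (the class name, the URL parsing, and the placeholder output are all hypothetical):

Imports System.Web

Public Class TruckHandler
    Implements IHttpHandler

    Public Sub ProcessRequest(ByVal context As HttpContext) _
            Implements IHttpHandler.ProcessRequest
        ' Parse "/toyota/4runner.aspx" into make "toyota" and model "4runner"
        Dim parts() As String = context.Request.Path.Trim("/"c).Split("/"c)
        Dim make As String = parts(0)
        Dim model As String = parts(1).Replace(".aspx", "")
        ' A real handler would look the vehicle up and render a full page
        context.Response.Write("<h1>" & make & " " & model & "</h1>")
    End Sub

    Public ReadOnly Property IsReusable() As Boolean _
            Implements IHttpHandler.IsReusable
        Get
            Return True
        End Get
    End Property
End Class

You then point the URLs you want at it in web.config, e.g. <add verb="*" path="*.aspx" type="TruckHandler" /> under <httpHandlers>.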

But it won’t support dynamically defined virtual URLs that look like this without some C++-derived ISAPI magic:

http://www.trucks.com/toyota/4runner/
http://www.trucks.com/nissan/pathfinder/
http://www.trucks.com/honda/ridgeline/

However, there is a reasonably inexpensive third party ISAPI filter known as ISAPI_Rewrite that provides this capability for IIS.
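With it installed, a single rule can map the clean URLs above onto the handler-friendly .aspx ones. A minimal sketch, assuming ISAPI_Rewrite 2’s httpd.ini syntax (the pattern is illustrative only):

[ISAPI_Rewrite]

# Map the clean, extensionless truck URLs onto the .aspx URLs
# that the custom HTTP handler already knows how to serve
RewriteRule /(toyota|nissan|honda)/([^/.]+)/$ /$1/$2.aspx [I,L]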

In addition, there’s another feature that is needed, and that’s the ability to cleanse HTML output before it’s sent to the browser. Why would you need that? For example, if you are using a content management system like DotNetNuke that has a mind of its own with respect to the URL formats it generates and you want to use clean and concise URLs, you need to be able to filter and transform the HTML after your CMS generates it but before IIS sends it on to the web surfer’s browser. There is evidently an inexpensive product named Speerio SkinWidgets DNN with a "widget" called PageSwiffer with which you can supposedly filter and transform outbound HTML, though I’ve not tested it.
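If you are on ASP.NET you can also roll this yourself via the HttpResponse.Filter extension point, which lets you wrap the output stream in your own Stream. Below is a minimal sketch; the class name and the URL replacement are hypothetical, and a production version would need to buffer across chunk boundaries:

Imports System.IO
Imports System.Text

Public Class UrlCleaningFilter
    Inherits Stream

    Private ReadOnly _inner As Stream

    Public Sub New(ByVal inner As Stream)
        _inner = inner
    End Sub

    Public Overrides Sub Write(ByVal buffer() As Byte, ByVal offset As Integer, ByVal count As Integer)
        ' Decode the outbound HTML chunk, swap the ugly CMS-style URL
        ' for a clean one, and pass the result along toward the browser
        Dim html As String = Encoding.UTF8.GetString(buffer, offset, count)
        html = html.Replace("/Default.aspx?tabid=23", "/honda/")
        Dim cleaned() As Byte = Encoding.UTF8.GetBytes(html)
        _inner.Write(cleaned, 0, cleaned.Length)
    End Sub

    ' The rest of the Stream contract either delegates or declines
    Public Overrides Sub Flush()
        _inner.Flush()
    End Sub

    Public Overrides ReadOnly Property CanRead() As Boolean
        Get
            Return False
        End Get
    End Property

    Public Overrides ReadOnly Property CanSeek() As Boolean
        Get
            Return False
        End Get
    End Property

    Public Overrides ReadOnly Property CanWrite() As Boolean
        Get
            Return True
        End Get
    End Property

    Public Overrides ReadOnly Property Length() As Long
        Get
            Throw New NotSupportedException()
        End Get
    End Property

    Public Overrides Property Position() As Long
        Get
            Throw New NotSupportedException()
        End Get
        Set(ByVal value As Long)
            Throw New NotSupportedException()
        End Set
    End Property

    Public Overrides Function Read(ByVal buffer() As Byte, ByVal offset As Integer, ByVal count As Integer) As Integer
        Throw New NotSupportedException()
    End Function

    Public Overrides Function Seek(ByVal offset As Long, ByVal origin As SeekOrigin) As Long
        Throw New NotSupportedException()
    End Function

    Public Overrides Sub SetLength(ByVal value As Long)
        Throw New NotSupportedException()
    End Sub
End Class

Hook it up early in the request, e.g. Response.Filter = New UrlCleaningFilter(Response.Filter) from Global.asax or an HttpModule.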

I’m sure both of these products are great, but dang it all, Microsoft, this functionality really should have been baked into IIS a long time ago. How is it that something that should have been this easy to add has been overlooked for so long?

Anywho, here’s hoping these features show up in IIS7.

Well Designed URLs are Beautiful!

With all the talk of AJAX these days, and with my concerns about poorly implemented AJAX-based sites and what they may mean for the web, I’m once again reminded of an opinion I’ve held for a long time: well designed URLs are one of the most valuable aspects of the web. Put more succinctly:

Well Designed URLs are Beautiful!

The following are my (current) set of rules for how to ensure beautiful URLs:

Well Designed URLs Point to Content that Does Not Change

Theoretically, each URL points to a unique view of specific content, or a specific "state" if you will. And I contend that should include URLs that point to dynamically generated web pages.

Of course many URLs point to content that changes with each view (such as advertisements) and/or that is modified based on the current login state of the person viewing the content. Both of these cases corrupt the purity of the ideal stateless URL, but in my pragmatic opinion they are okay as long as the core content for a given URL is static.

URLs that point to home pages and section home pages often change their content, but to me that is okay too. Web users generally don’t expect portal pages to have static content, but all other URLs should point to content that doesn’t change.

Well Designed URLs Don’t Change

This should go without saying, but then again, how often have you found exactly what you were looking for on a search engine or in a website’s "links" section, only to get a 404 after you click the link? Sometimes this happens because the website owner went out of business, but usually it’s because a careless website owner or developer simply reorganized the website without considering all the dead links they were creating, and all the opportunities for traffic they lost for themselves.

Of course most application servers and most content management systems make it almost impossible to "do this right." For example, what’s with Microsoft’s IIS, now in version 6.0, that you can’t serve virtual URLs unless they have an extension (most commonly .aspx)?!? Sheesh!

Well Designed URLs Can Be Stored

When you see a web page you like, you should be able to bookmark its URL and return to view the same thing later. When you see a web page you want a friend to see, you should be able to cut & paste its URL into an email so they can see in their browser exactly what you saw (this is especially helpful if what they see is a bug in your server-side scripting!). And when someone who is blogging or building out a website finds your related web page, they should be able to include your URL as a link in their web page, and it should work as a link for anyone who views their site. Plus, if a URL can be stored, it can be indexed by a search engine.

Well Designed URLs Only Use Parameters with Forms-Driven Queries

Many websites use parameters to data-drive their pages. In most cases, those URLs are just ugly. If I’m looking for sweaters at Sears, I should click a link that points to www.sears.com/sweaters/, not www.sears.com/products?type=23.

Instead, URL parameters are best used only on pages that allow users to submit a query entered into form fields. All other URLs should be composed and readable.

Well Designed URLs are Readable and Hierarchical

URLs can and should be part of a website’s user interface. Well designed URLs that are readable and hierarchical provide wonderfully rich information to a website user about the structure and content of a website. Websites with non-readable and non-hierarchical URLs don’t.

Well Designed URLs Mean Something

Besides being readable, a web page’s URL should mean something to the website viewer. Having "/honda/" in a URL helps the website user understand the site; having "/tabid23/" in a URL does not.

Of course, who violates this rule the worst? Content management systems. And it seems the more expensive the CMS, the worse it violates this rule (can you say "Vignette"?).

Well Designed URLs are Readable in Print

When you see a web page you’d like to reference in print, you want its URL to be readable, not a collection of what appear to be random letters and numbers (i.e. not like a "GUID"). Imagine a reader trying to type in 38 apparently random letters and numbers; that’s simply a painful thought.

Well Designed Websites Have Atomic Leaf Nodes

How many sites have a URL to display a collection of items but no unique URLs for each specific item? Just as an atom is indivisible, so should be leaf-node web pages, each with its own specific and understandable URL.

Well Designed URLs Are Hackable

A website that has a web page for the relative URL "/cars/toyota/4runner" should also have a web page for "/cars/toyota/" and for "/cars/."

Well Designed URLs Can Be Guessed

Let’s say a website user is on a really slow link. If your URLs are well designed, chances are they can guess the URL for the page they want. Or if you, god forbid, have a broken link, maybe they can correct it.

Well Designed URLs Are As Short As Possible but As Long As Necessary

URLs should be short. Short URLs can more easily be pasted into emails without wrapping, and short URLs can be printed in advertisements, for example. However, URLs should be long enough to be readable, not obscure their meaning, and retain the website’s hierarchy.

If you can’t make URLs short enough while remaining readable, retaining meaning, and retaining hierarchy, create alternate URLs for print, advertisements, etc.

Well Designed Links Do Not Trigger JavaScript

How often do I find web pages with links that trigger JavaScript? Grrrrr!!!! (Can you say "__doPostBack()"? Yes, I feared you could.) What’s wrong with JavaScript links? Users often use the browser’s status bar to view the URL and get a clue about where a link will take them. Not with JavaScript. Plus, many users hold down the shift key to launch a link in a new window. NOT WITH JAVASCRIPT!!! (Can you feel my anger? Well designed links do NOT trigger JavaScript.)
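To make it concrete, here is the kind of link an ASP.NET LinkButton renders versus a plain one (the control name is hypothetical):

<!-- What a LinkButton renders: nothing useful in the status bar,
     and shift-clicking to open a new window is broken -->
<a href="javascript:__doPostBack('lnkNext','')">Next page</a>

<!-- A well designed link: transparent, bookmarkable, shift-clickable -->
<a href="/cars/toyota/4runner/">Next page</a>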

If this is so bad, why is this done? For the application server developer’s convenience, and not for the user’s convenience; that’s for sure.

Well Designed Search Forms Always Use GET

Search engine result pages provide content too, and their URLs need to point to content that doesn’t change. Well, of course they do change over time as they display what is current, but that’s appropriate for a search engine result page. If a search form uses POST, its result page URL is only useful as the action of that search form, and worthless in every other case.
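In markup terms the fix is a single attribute. A minimal sketch, where the /search path and the q parameter are illustrative:

<!-- method="get" puts the query in the URL, e.g. /search?q=sweaters,
     so the result page can be bookmarked, emailed, and indexed -->
<form action="/search" method="get">
  <input type="text" name="q" />
  <input type="submit" value="Search" />
</form>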

For a perfect example of a search page that violates this rule, check out CycleTrader’s Search Page.

Well Designed URLs Have More Friends than Just Me

Of course I’m not the first to make the case for URLs, and I probably won’t be the last. Here are some of the better essays about quality URLs:

There are also a few tools that can help:

Well Designed URLs are an Asset

Some people rail against the URL and say it is an overly technical anachronism to which non-technical people should not be exposed. I completely disagree.

Just like so many other technical things that have become such a part of common culture over the years as to be all but invisible, such as radio station frequencies, checking account numbers, ATM passcodes, and highway speed limits, so too will URLs become so straightforward that they’ll soon not even be recognized as technical. Instead, they’ll be viewed as obvious and valuable, because they fundamentally are.

So, in a nutshell:

Well Designed URLs are Beautiful!