August 29, 2004

Open Standardization vs. Market-Driven Innovation

A comment sent to my last blog posting has started me thinking about one of the more troublesome questions about the current state of the programming world: at what point does standardization (especially non-proprietary standardization) become more (or less) important than market-driven innovation?

To be on the safe side, let me define my terms. Market-driven innovation is essentially pushing an API or architecture to "completion" for the purpose of upgrading or innovating some kind of application, be it a paint program, a game, or an operating system. It is market-driven in that the API must be published at the same time as the application itself so that users and third-party developers can make use of it. For example, WinFX, the API for Longhorn, is very much tied into that architecture. Inasmuch as the product (Longhorn) offers new functionality via an expanded (and perhaps improved) set of interfaces, these APIs essentially act as de facto standards.

Open standards, on the other hand, are generated (more or less) transparently, are not specifically tied to a product release, and (somewhat secondarily to this discussion) do not have license fees associated with them. Instead of describing what is (as a market-oriented API would do), such standards typically set benchmarks that must be met for a given implementation to be considered compliant with the standard. Thus, these standards serve as a goal rather than a fiat description.

I'm aiming this particular discussion at the W3C, though other standards bodies are implied in the same arguments. One of the comments I hear most frequently about the W3C Recommendation process is that it is too slow. Certainly by the standards of commercial software vendors (such as Microsoft) it is. Take the SVG standard as an example. In early 1999, there was a fair amount of discussion about forming a W3C committee for the express purpose of building a language for describing vector graphics, leading to the formation of the SVG group after two proposals (PGML and VML) had been offered up by Adobe and Microsoft/Macromedia respectively. The language was formalized in 2000, along with the release of one reference implementation, Adobe's SVG Viewer. Four years later, there are still critical things going into the standard (text flow, multimedia support, and XML file import/export capabilities, among other things), and the notion of interactive SVG has effectively been spun off into the XBL initiative.
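For readers who haven't worked with it, here is a minimal sketch of what an SVG document looks like; the shapes and attribute values here are purely illustrative, but the markup itself is valid against the SVG 1.0/1.1 specifications:

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- A minimal, purely illustrative SVG document: a circle with a caption -->
    <svg xmlns="http://www.w3.org/2000/svg" width="200" height="220">
      <circle cx="100" cy="100" r="60" fill="red" stroke="black" stroke-width="2"/>
      <text x="100" y="200" text-anchor="middle" font-size="16">Hello, SVG</text>
    </svg>

Any conformant viewer (Adobe's SVG Viewer among them) should render this the same way, which is precisely the point of a compliance-oriented standard.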

Four years ago, .NET was still an interesting marketing idea, we were all still trying to get the taste of Windows ME out of our mouths, and Linux was still a geek platform, though there were signs of respectability. On Internet time, four years is practically a whole generation. So why have these guys been moving at a snail's pace?

Or have they? The Law of Three seems to be an extraordinarily common pattern within programming, and it can be roughly summarized as:

A software application can only be considered finished when it has gone through its third release.

There are actually any number of perfectly good reasons for this. A few:

  1. The first version of a product is its conceptualization, the second is its working prototype, the third is its final proof.
  2. The first version tells customers what's possible, the second tells the developers what the customers want, and the third is where the two meet in compromise.
  3. No standard exists by itself; it takes a while (and a couple of different prototypes) before all of the potential interactions come to light.
I can give more, but I like the symmetry of three here.

An open standard is not a product. Rather, it is a meta-product ... it is a description of the realistic expected behavior of an application that provides the greatest degree of compatibility with other applications while simultaneously providing the greatest degree of performance for any given implementation. It is a political document as much as a technological one, because all of the primary players in the field must agree upon where the balance between compatibility and performance should lie. Politics moves slowly, especially in the business world, because all of the participants realize that they are ceding competitive advantages in the technology by participating in the standards process in the first place. Thus sometimes the most trivial points become major roadblocks, and deep-level architectural decisions can prove excruciatingly painful.

Yet, if you look around today, you may notice that SVG has quietly crept into just about every major graphics application on the planet, from Macromedia Flash and Adobe Acrobat to Microsoft's Visio. It's even beginning to show up in marketing literature and bullet points. This is not because Adobe or Macromedia or Microsoft has stopped believing wholeheartedly that its home-grown solution is far and away superior to SVG. Instead, it's the fear that all of a sudden their competitors have started to incorporate SVG, the market is beginning to clamor for it, and if they don't have that particular bullet point, they'll lose market share to the ones that do.

In that respect, Open Standard solutions are "Dark Horse" candidates: nowhere near as flashy as what each company produces in-house, yet the best compromise for interchange and for keeping a competitor from seizing that market advantage. This means that the "biggies" in any given field will be at best lukewarm toward such standards, but they are also very much aware that there is a whole crop of new kids on the block that may very well hit the turning point and eclipse them all. Those new kids, on the other hand, having no advantage in building yet another in-house standard, are perfectly happy to take a pre-existing standard and build the best shell for it, because much of that expensive initial design work has effectively been done for them for free.

I think this is one of the reasons I am so ambivalent about companies that are deciding to jump to XAML and build XAML processors while Microsoft is still trying to get its act together with Longhorn. XAML is cool -- I've worked with a couple of different iterations of it, and while I have some quibbles about the somewhat monolithic nature of the schema, I have to admit that there are some very smart minds there playing with the lower-level architectures. It was a good design choice (especially if they get away from the requirement of pre-compiling it, security be damned), and it may very well prove the salvation of Microsoft as its product market otherwise decays.
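To make the comparison concrete, here is a rough sketch of the kind of XAML fragment I've been experimenting with. Be warned that this is from memory and purely illustrative; the element names and namespace URI shifted between Longhorn preview builds, so don't treat it as canonical:

    <!-- Illustrative only: names and the namespace URI varied across Longhorn builds -->
    <Canvas xmlns="http://schemas.microsoft.com/2003/xaml">
      <Button Canvas.Top="20" Canvas.Left="20">Click Me</Button>
      <Text Canvas.Top="60" Canvas.Left="20">Declarative markup, compiled down to CLR objects</Text>
    </Canvas>

The resemblance to SVG is not accidental: both describe a scene graph declaratively in XML. The difference is who controls the vocabulary.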

However, all of those companies that are jumping onto the XAML bandwagon are placing themselves at the mercy of Microsoft, and once Longhorn is released, "XAML compatible" is going to be very much suspect. Such companies may be able to play the role of Borland and maintain alternative interpreters (as Borland did with its OWL libraries back in the early 1990s), but most customers who want XAML viewers/interpreters are going to want Microsoft's version, if only because all it will take is one republishing statement for Microsoft to effectively render all other versions of XAML obsolete and unrunnable. Don't think they'd do it? They'd be foolish not to. Five letters ... DRDOS.

SVG and other XML standards (such as XForms) are going to have setbacks, are going to undergo multiple revisions in the face of user pressure, and are going to emerge stronger for it. Once SVG 1.2 is published, I suspect that it will be the last word on SVG proper, though binding architectures through XBL will take a couple more years to arrive. The next version of SVG (1.3 or 2.0) will be the logical jump from 2D space to 3D, and it will likely involve a collision between two open standards groups that will push the completion of that process well into the 2006 time frame (just in time for Longhorn). More on that struggle later.

However, the point I'm trying to make here is that I can reliably count on being able to use SVG 1.2 ten years from now with little or no modification, providing a great deal of the stability that's necessary to move to the next level of abstraction on the web. Is anyone willing to bet me that XAML files generated today will still run the same way 25 years from now? That's what taking your time with the political processes involved in Open Standardization buys you: stability upon which to build the next layer of abstraction.

I welcome other people's comments on this thread here at http://metaphoricalweb.blogspot.com. I've deliberately stacked the deck here in favor of the Open Standards side, and if you feel strongly about the disadvantages of the Open Standards process compared to Closed Standards (or just feel I'm full of it), feel free to let me know.

