December 4, 2004

Internet Time and the Long Now

I wish to thank Edd Dumbill for syndicating the Metaphorical Web on his superb blog aggregator site http://planet.xmlhack.com. It's an honor and a privilege, and I hope to produce commentary in keeping with the luminaries already writing there.

I've been pondering for a while the state of the XML world. XML has now existed as a standard for nearly seven years, XSLT and XPath for six, SVG for four and change. On the scale of Internet time (how retrograde that expression seems now) these technologies are no longer bright and shiny new; they are actually approaching middle age. Heck, Microsoft would have gone through two world-changing marketing campaigns by now and would be gearing up for the third. So why does it seem that XML is still just getting started?

Oh, you could single out all the usual suspects, of course: the W3C moving at a snail's pace, vendor lock-in keeping these open standards from hitting their full stride, the lack of marketing to push the standards to the forefront of computing, and so forth (all of which are true to a certain extent). Over the last year or so, however, another idea has occurred to me, one that is perhaps heretical in the world of nanosecond processors, yet one that has nagged and tugged at my consciousness for a while.

Daniel Hillis, a founder of Thinking Machines and for a number of years an Imagineer at Disney, has of late spent a great deal of time concentrating on another project of his, The Clock of the Long Now. This particular clock is unusual in that it is intended to run for 10,000 years. To put that into perspective, 10,000 years ago Europe was just being repopulated after the last of the continent-covering glaciers had retreated, and the ur-language that would in time branch into the Indo-European family had, by some accounts, only begun to spread.

Thus, to build such a clock, you have to design it to be simple, to be self-maintaining (who knows how many centuries may pass without anyone even knowing of it) and to be intuitively obvious to whatever culture eventually finds it. In many ways, the Clock is the ultimate exercise in user interface design. It is also, in its own way, a remarkable statement about the nature of time itself and man's interaction with it.

So what does such a clock have to do with XML? In any period of intense innovation, a pattern tends to emerge, regardless of the technology in question. Most of this should be familiar as a variant of the adoption curve:

  1. An innovation is created, usually a variant of some existing technology but with a twist that moves the technology in a completely unexpected direction.
  2. "Hobbyists" emerge, the earliest adopters who work with out while the technology is still in rough stages. They often end up becoming the evangelists for that technology later on.
  3. The technology attracts the attention of early investors, who see it as a way to start companies that exploit the technology. Such investors seldom come from the ranks of the hobbyists, but they are keyed in enough to know where the hobbyists are most excited.
  4. The technology goes into its hype phase, where marketers begin promoting it for everything from improving efficiency to curing the common cold. This phase usually involves the appearance of semi-nude women seductively wrapped around the piece of hardware, working at a computer with the software, or otherwise doing improper things with a technological implementation.
  5. The backlash occurs as people realize that it does not in fact cure the common cold, may actually decrease efficiency in certain ways, and tends to be a big turnoff to attractive young women at bars (who are usually after the early investors, not the schleppy users).
  6. Meanwhile, those people who utilize the technology for a living figure out whether the technology actually does meet a specific need of theirs, and in the process will usually provide feedback to strengthen the technology in question.
  7. If the technology manages to get past this stage into adoption, its development shifts from paradigm shifting innovation to more stable improvements, especially if the same technology inspires other implementations/variations. This competitive phase usually lasts until one particular implementation manages to gain a 90/10 mindshare. This phase also usually sees the emergence of some body of standards that define interconnectivity of that particular technology (for instance, standard gauge sizes in railroads, standard socket types, HTML). These standards usually reflect the implementation of the dominant player.
  8. The technology is then able to sit with comparatively minor changes for a significant period of time - years or even decades. But because those changes are anchored to the standards, the dominant player has the most to lose and the least to gain in changing those standards to reflect shifts in new technologies. Standards that were based upon a given implementation consequently freeze that implementation, until eventually the standard is out of sync with further innovation.
  9. The dominant standard-bearer can then find its position upended very quickly -- within a couple of years -- leaving it with an aging infrastructure and a massive installed base. It either abandons that base and moves on, putting itself into a much more vulnerable position, or it clings to that base even as the base itself moves on to newer technology. In either case, the technology may remain for a while longer in a kind of white dwarf state, slowly cooling to a brown cinder. Periodically, the technology may be resurrected for a revival (think of all of the coin-op video games that now exist as software emulations), but such technology holds entertainment value only (like restoring an 1890s steam locomotive, not because it is even remotely competitive, but because it has sentimental value).

Given this birth-to-death cycle for most technologies, why does XML feel so different? My suspicion is that XML is different -- it is a technology for the Long Now. XML of course has its roots in SGML, a technology that remained obscure to most people outside of corporate libraries and academia, but one that can in turn be traced back to the work of Charles Goldfarb in the 1960s. This means that it in fact predates another "long-now" technology: the Unix/C duality. SGML was not intended as a means to gain temporary market share; indeed, the high cost of creating SGML implementations meant that it was really commercially viable only for the largest of organizations.

SGML is not an implementation-driven technology. It was from the first a vehicle that required consensus, because its focus was the heart of document communication: from the start it was intended to model the way that people think, not whatever happens to be the most convenient way for machines to store artificially rigid class constructions. Because it was a meta-language, SGML was by its very nature intended to be implementation-independent, long before Sun's "Write Once, Run Anywhere" slogan came into play.

In addition to this, SGML is declarative. Its operative verb is BE, not DO. This let it provide a level of abstraction that bypassed the thousand-class monstrosities that emerged over the course of the next three decades. That abstraction lives on with XML, to the extent that it is dramatically affecting both the theory and practice of coding.
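
To make the BE/DO distinction concrete, here is a small illustrative fragment -- real SVG, though the particular shapes are my own invention. The markup simply states what the graphic is; it never says how to draw it:

    <!-- Declarative: the document states what the graphic IS. -->
    <svg xmlns="http://www.w3.org/2000/svg" width="200" height="100">
      <rect x="10" y="10" width="180" height="80" fill="steelblue"/>
      <text x="100" y="55" text-anchor="middle" fill="white">Hello</text>
    </svg>
    <!-- An imperative toolkit would instead issue a sequence of DO steps:
         obtain a graphics context, call its rectangle- and text-drawing
         routines in order, and repaint on every resize or exposure. -->

Any conforming renderer is free to decide how that description becomes pixels, which is precisely the point.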

XML, in turn, has refined this notion of working with abstractions through an abstract interface, with the underlying assumption that so long as the agreed-upon behavior is met, the specific implementation is irrelevant. One effect of this has been the increasing dominance of XML abstraction layers that push the imperative code -- the C++ and Java and Perl of the world -- down the processing stack, away from the business logic and toward a common low-level substrate. XUL, SVG, XAML, XFaces, XForms, etc. all manifest this principle to some degree. Create a binding layer such as XBL, and you can hide the API even more, with the consequence that you can increasingly reproduce sophisticated bodies of object code with XML representations. The imperative code never goes away completely (nor should it), but it becomes much less significant in the scheme of things.

As a consequence of this, while there have certainly been companies that have ridden some aspect of XML through the technology business cycle described above, I think that XML as a technology is acting very much like the rise of mitochondria early in the evolution of complex life - providing a substrate for a whole new kind of life (or in this case a whole new kind of programming paradigm). The problem, from the standpoint of those in what had been the dominant paradigm - the framework-based OOP system - is that XML provides few of the features that traditionally make money for vendors: lock-in of file formats, opacity of data structures, variations in access formats that advantage one vendor over another, arcane API implementations, reliance upon a specific language, and so forth.

Web services make it possible to do things that would have been impossible otherwise (the dominant use of which seems to be less about providing unique data feeds and more about performing transformations between varying schemas). XML-based GUIs (and even full application sets) are now becoming the standard way of building human faces to applications, cutting down on the reliance on language-specific toolkits. XML is even being used for the discovery of non-XML APIs, something which usually indicates a transitional phase (if you can discover non-XML APIs, you are invoking an abstraction layer within the interface, which in turn makes it easier to decouple the implementations from an imperative basis).
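
That schema-to-schema transformation is easy to picture with a small sketch. The following XSLT 1.0 stylesheet is purely hypothetical -- the vocabularies (a supplier's <product> records and a purchaser's <item> records) are invented for illustration -- but the shape of the solution is what matters: one declarative document maps one declarative vocabulary onto another, with no imperative glue code in sight.

    <!-- Hypothetical mapping from a supplier's catalog vocabulary
         to a purchaser's order vocabulary. -->
    <xsl:stylesheet version="1.0"
        xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
      <xsl:template match="/catalog">
        <order>
          <xsl:apply-templates select="product"/>
        </order>
      </xsl:template>
      <xsl:template match="product">
        <item sku="{@id}">
          <description><xsl:value-of select="name"/></description>
          <price><xsl:value-of select="price"/></price>
        </item>
      </xsl:template>
    </xsl:stylesheet>

Run through any standard XSLT processor, the supplier's catalog comes out the other side speaking the purchaser's dialect.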

Microsoft has gone through several core changes of technology (and rather more marketing "initiatives") in the last 30 years, yet curiously few of them have had staying power beyond about five years. This is not a disparagement of their products (which, for all of my grumpiness about Microsoft, even I will concede are very usable and professional); rather, I think it is indicative of the commercial software vendor market as a whole (I could replace Microsoft with Sun or Hewlett Packard in the above statement and be equally correct). Their definition of Internet Time gets conflated with their definition of Business Time, where next quarter is more important than next year and five years out is an eternity.

Ultimately, though, Internet Time to me is something that isn't measured in nanoseconds and months between deployment cycles. If, as I do, you believe that XML is becoming the neural circuitry of the Internet, then I have to wonder if perhaps the real Internet Time is measured in decades, and maybe even centuries.

Indeed, when Danny Hillis finally gets around to inscribing the operating instructions of the Long Now Clock on its stainless steel base, perhaps, just perhaps, he should write it ... in XML.
