Successful technology standards are those created by the market, rather than those imposed on the market. Because markets are fluid and difficult to predict, an entity will often mandate a technology standard in an attempt to insulate itself from the detrimental effects of technology’s dynamic nature. An unfortunate side-effect is that the entity is deprived of precisely what we seek from technology: its dynamic ability to find new ways to solve problems. And when governments are urged to adopt technology standards, it is often an attempt to use the purchasing power and influence of a large government entity to create change in technology markets, rather than merely a misguided effort to ensure “best practices.” Either motivation is doomed to failure, especially in the area of software.
There are two big reasons it is important for governments not to choose sides in software standards and licensing wars. First, it is simply too easy to bet on the wrong horse. The computer scientist John F. Sowa summarized this idea as the “law of standards”: “Whenever a major organization develops a new system as an official standard for X, the primary result is the widespread adoption of some simpler system as a de facto standard for X.”
Second, and a point that flows from the first, even when you choose the best option available, that option will not be the best for long, nor will it be the best solution for all problems. Indeed, some have argued that “worse is better,” in that simpler, less elegant technologies that get to market more quickly also improve more quickly. For this reason, Linus Torvalds, the creator and chief developer of Linux, a shining example of worse-is-better, has refused to commit to any particular standard for how Linux applications interface with the operating system, because to do so would be to commit to the exact design Linux had when the interface was created, as he explained in a recent podcast interview for the Linux Foundation.
There have been periodic initiatives to convince governments to adopt particular “open standards,” such as the idea of the “single universal file format” promoted by the Government Open Source Conference (GOSCON). Top-down technology directives are almost always a bad idea, but special problems arise when governments adopt them. Governments are not as nimble at change as individuals or private organizations, nor would we want them to be. We want our governments to have fair and stable procedures for making decisions. Indeed, when governments do not follow established procedures, we consider their decisions arbitrary and capricious. But mandating particular technology standards goes too far in the opposite direction by casting a fluid, innovative field in cement.
We should try to learn from past mistakes in this area. In the 1980s, the US Department of Defense (DoD) hired the best and brightest to create a genuinely superior programming language called Ada. The DoD wanted all software systems to be built in Ada to reduce the cost of development and maintenance, and established Ada as the standard language throughout the four military services. Congress even passed legislation to mandate its use; any exceptions required a “special exemption.”
The only problem was that another programming language called “C” was spreading like wildfire, in spite of (or perhaps because of) the fact that it was not nearly as “good” a language. In contrast to Ada, the apparent simplicity of C made it very easy for less-skilled programmers to create systems that “crashed” and were hard to maintain. Nevertheless, C became the de facto standard for systems programming, and many defense program directors sought waivers to procure systems built in C and other languages rather than Ada. Furthermore, there was very little commercial adoption of Ada, despite the expert consensus on Ada’s elegance and superiority as a programming language and a huge built-in market in the DoD. Only when the DoD finally dropped the Ada mandate in 1997 did it begin to see the cost savings and reliability it had hoped Ada would deliver.
In hindsight, the best thing the government did in its effort to adopt a standard systems language was to provide a mechanism for waivers and exemptions from the requirement. If the government had not provided for these exceptions, project managers who had different needs would have risked litigation and censure for considering better solutions. The lesson from the Ada experience is that governments should commit to learning and decision-making processes, not to technologies.
This comports with the recommendations of a recent report on interoperability and innovation from Harvard’s Berkman Center for Internet & Society. Its researchers talked with hundreds of experts and focused on three case studies to explore the relationship between interoperability and innovation. Their most pertinent recommendation is that “[t]he role of the state is to promote private sector solutions ex ante and to intervene only in appropriate cases, when markets fail in certain ways.” They also conclude that “interoperability is not an unqualified good and should not be seen as an end in itself.”
In an attempt to “level the playing field” and promote the ideals of “openness,” promoters of government standards threaten to cut users off from the market-leading solutions they most demand, supposedly for the greater good. In so doing, they forget that government users, like the rest of us, have real-world problems that need to be solved. If we allow some people to tie government’s hands by mandating particular standards, we hinder government’s ability to provide us services. And that is what we have governments for, not for side-stepping the market by attempting to pick technology winners for “the greater good.”