CHAPTER 4 Standards and standardization processes

Introduction

The world is full of standards. Standards regulate, simplify and make possible an extensive division of labour which should be recognized as a necessary basis for far-reaching modernization processes (REFs noe STS).

Also in the world of computers there is a rich variety of standards. The conventional wisdom, however, is that standards are either simple and straightforward to define or purely technical (REFs). One, if not the, key theme running right through this book is that the development of an information infrastructure, necessarily including the standards, should instead be recognized as a highly complex socio-technical negotiation process. There is a pressing need to develop our understanding of how "social, economic, political and technical institutions (...) interact in the overall design of electronic communication systems" (Hawkins 1996, p. 158). There is accordingly a need to classify and conceptualize standards in order to grasp their role in the development of information infrastructures.

This chapter first outlines the basic argument for standards for communication technology. Strictly speaking, "Communication systems cannot function without standards" (Hawkins 1996, p. 157). We then provide a taxonomy of different types of standards. Standards for information infrastructures are worked out within quite distinct institutional frameworks. We describe the most influential, international institutions aiming at open information infrastructures. Lastly, we briefly review the literature on standardization.

The economy of scale argument

Standardized technology abounds and makes perfectly good sense. It simplifies otherwise complicated choices, enables large-scale integration, and is the basis for a division of labour. The standardization of the design of cars created such a division of labour between car manufacturers and suppliers of standardized parts ranging from roller bearings and lamps to complete motors (when? REF). For communication technology there is in addition a very influential, brute-force argument. It is simple and is typically illustrated by the following figure.

[Figure: The number of different links as a function of the number of nodes.]

The figure shows how the number of communication connections, or protocols (the edges), rapidly increases as the number of communicating partners (the nodes) rises. In the left-hand case, every pair of communicating partners needs a protocol to communicate. With 4 partners the required number of protocols is 6. The number of protocols increases rapidly: with 5 partners, 10 protocols are required, with 6 partners 15, etc. In general, the number of protocols is given by the formula n(n-1)/2, where n is the number of nodes. In the right-hand situation, all communication is based on one single shared, standardized protocol. What is needed is a link between each partner (node) and the standard. As a consequence, the number of links to be established and maintained increases only linearly with the number of partners -- given the existence of a shared standard. With 4 partners, 4 links are required, with 5 partners, 5 links, etc. Already with a relatively small community of communicating partners, the only feasible strategy is to use a shared, standardized protocol, an Esperanto language which everyone reads and writes. A solution with pairwise protocols among the communicating partners simply does not scale; it works in principle but not in practice. Larger communication networks are simply impossible to build and manage if not based on standards.
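The quadratic versus linear growth can be checked with a few lines of code (a sketch in Python; the function names are ours, not from any standard library):

```python
def pairwise_links(n):
    """Number of links when every pair of partners needs its own protocol."""
    return n * (n - 1) // 2

def shared_standard_links(n):
    """Number of links when every partner connects to one shared standard."""
    return n

# Reproducing the figures in the text: 4 partners -> 6 vs 4 links,
# 5 partners -> 10 vs 5 links, 6 partners -> 15 vs 6 links.
for n in (4, 5, 6, 100):
    print(n, pairwise_links(n), shared_standard_links(n))
```

With 100 partners the pairwise approach already requires 4950 separate protocols, while the shared standard requires only 100 links -- the brute-force argument in numbers.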

As mentioned in the previous chapter, we consider standards not only important from a practical and economic perspective, but also an essential and necessary constituting element. If an "infrastructure" is built only on the basis of bilateral arrangements, it is no real infrastructure. It is then just a collection of separate, independent connections.

The brute force argument sketched above makes a lot of sense. The ideal of establishing a shared standard is easy to understand and support. The problem, however, is how to decide which areas should be covered by one standard, how different standards should relate to each other and how to change them as their environment changes, i.e. how to pragmatically balance the idealized picture of everybody sharing the same standard against the messy, heterogeneous and irreversible character of an information infrastructure. This act of balancing -- as opposed to dogmatically insisting on establishing one, shared standard -- is very close to the heart of the design of an information infrastructure. It is an issue we explore in greater detail in the subsequent chapters.

Types of standards

Standards abound. David and Greenstein (1990, p. 4) distinguish among three kinds of standards: reference, minimum quality and compatibility standards. II standards belong to the last category, that is, standards which ensure that one component may successfully be incorporated into a larger system given adherence to the interface specification of the standard (ibid., p. 4). One may also classify standards according to the processes whereby they emerge. A distinction is often made between formal, de facto and de jure standards. Formal standards are worked out by standardisation bodies. Both OSI and Internet are formal according to such a classification. De facto standards are technologies standardised through market mechanisms, and de jure standards are imposed by law.

De facto standards are often developed by industrial consortia or vendors. Examples include the HTML format for the WorldWideWeb, a new version of which is currently being developed by the W3 consortium; IBM's SNA protocol; CORBA, a common object-oriented architecture for distributed computing; the new version of Unix developed by X/Open; and the Health Level 7 standard for health care communication. Some of these consortia operate independently of the international standardisation bodies, others align their activities more closely. For instance, the W3 consortium is independent of, but closely linked to, the standardisation process of the IETF (see further below).

Internet standards

The Internet is built of components implementing standardized communication protocols. Among these are well-known ones such as TCP, IP, SMTP (email), HTTP (World Wide Web), FTP (file transfer) and TELNET (remote login). But the Internet includes many more standards - in June 1997 there were in fact 569 officially registered Internet standards (RFC 2200). These standards are split into different categories.

Maturity levels and status

There are two independent categorizations of protocols. The first is the "maturity level", which in the Internet terminology is called the state of standardization. The state of a protocol is either "standard", "draft standard", "proposed standard", "experimental", "informational" or "historic". The second categorization is the "requirement level" or status of a protocol. The status is either "required", "recommended", "elective", "limited use", or "not recommended".

When a protocol is advanced to proposed standard or draft standard, it is labelled with a current status.

In the Internet terminology, computers attached to or otherwise part of the network are called "systems." There are two kinds of systems - hosts and gateways. Some protocols are particular to hosts and some to gateways; a few protocols are used in both. It should be clear from the context of the particular protocol which types of systems are intended.

Protocol states are defined as follows:

Typically, experimental protocols are those that are developed as part of an ongoing research project not related to an operational service offering. While they may be proposed as a service protocol at a later stage, and thus become proposed standard, draft standard, and then standard protocols, the designation of a protocol as experimental may sometimes be meant to suggest that the protocol, although perhaps mature, is not intended for operational use.

Protocol status is defined as follows:

 

Internet standards - numbers and growth

 

              94/7   Added   97/6    New   Removed
Standard        58       2     60      7         5
Draft           21      28     49     45        17
Proposed       161      99    260    150        51
Experimental    53      45     98     55        10
Informational   23      34     57     54        20
Historic        35      10     45     10         0
Total          351     218    569    321       103

Table 1. Registered Internet standards by category in July 1994 (94/7) and June 1997 (97/6). "New" and "Removed" count standards entering and leaving each category in the period; "Added" is the net change.

Among the 569 registered Internet standards there are 60 standard protocols, 49 draft standards, 260 proposed, 98 experimental, 57 informational and 45 historic. The growth in standards from July 1994 is, as illustrated in Table 1, quite significant - about 62%. However, the growth has been rather uneven among the various standards categories - 3.5% growth in full standards, 133% in draft and 60% in proposed standards. Looking at the numbers in Table 1, we see that the number of proposed and experimental standards is growing rapidly. Lots of new standards are popping up, while close to none advance to the top level. An important explanation for this is the continuing growth of the Internet, both in the number of computers (and users) connected and in the complexity of the technology. Within this complex and unstable environment, developing stable and accepted standards gets increasingly difficult (Alvestrand 1996). This points to the more general question of whether the organisation of Internet standardisation, which has so far proven so successful, has reached its limit, i.e. whether there is a need for a more formal, more ISO-like organisation (see also earlier remarks in chapter 2). Alternatively, the size the Internet is approaching may represent a limit for how far that kind of network can grow.

Types of Internet standards

The Internet standards specify lots of different kinds of communication (sub-)technologies. We will here mention some of them just to give a flavour of what all these protocols are about. Interested readers should consult the truly vast electronic archive the Internet community keeps of its reports and discussions (see the list cited in the section on methodological issues in chapter 2).

First of all there are protocol specifications like TCP, IP, SMTP, HTTP, RTP (Transport protocol for Real Time Applications), FTP, etc. These are the standards most often associated with the Internet.

In addition, there are lots of auxiliary protocols, protocols offering services to the others. These include PPP (Point-to-Point Protocol), SNMP (for network management), the Echo protocol, DNS (Domain Name System, a distributed database system mapping a symbolic Internet address (like diagnostix.ifi.uio.no) to a numerical one (like 129.240.68.33)), tools for DNS debugging, the Time Server Protocol, etc. There are further lots of standards defining security systems (the Kerberos authentication service, the encryption control protocol, MIME object security services, signed and encrypted MIME, etc.).
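The service DNS provides can be illustrated with a toy lookup table (an illustration of the mapping only, not of the protocol; real DNS distributes this table across a hierarchy of name servers, and the one entry below is simply the example used in the text):

```python
# Toy name-to-address table standing in for the distributed DNS database.
HOSTS = {
    "diagnostix.ifi.uio.no": "129.240.68.33",  # the example from the text
}

def resolve(name):
    """Return the numerical address for a symbolic name, as DNS does."""
    try:
        return HOSTS[name]
    except KeyError:
        raise LookupError(f"no address record for {name}")

print(resolve("diagnostix.ifi.uio.no"))  # -> 129.240.68.33
```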

The Internet runs across many different physical networks; accordingly there are standards defining how to implement one protocol on top of these networks, typically IP on ARPANET, Wideband Network, Ethernet networks, IEEE 802, transmission of IP traffic over serial lines, NETBIOS and FDDI. There is also a significant number of standards defining gateways between protocols, like the FTP - FTAM Gateway Specifications and X.400 - MIME Body Equivalencies. One group of standards defines how to construct various identifiers (including addresses) like IP addresses and URL specifications.

SNMP is the Internet network management protocol, defining the general rules for management of any part of the Internet. However, to set up a management system, additional information about the specific networks and protocols is required. Accordingly, there is one standard, SMI, defining how this information should be specified as well as standards (MIBs, Management Information Bases) defining how to manage specific parts of the Internet using for instance ATM, Ethernet, IP, TCP, UDP, DECNET and X.500.

Lastly, there are standardized data formats like MAIL (Format of Electronic Mail Messages), MIME (Multipurpose Internet Mail Extensions) and MIME extensions (MIME Media Types, MIME message header extensions for non-ASCII), HTML, SGML Media Types, Using Unicode with MIME, NETFAX (File format for exchange of images), MIME encapsulation of Macintosh files and Serial number arithmetic.

Standards for health care information infrastructures

We give a brief outline of the different II standards related to the health care sector. A comprehensive overview of various international standardization efforts can be found in (De Moor, McDonald and van Goor 1993).

CEN TC/251

In chapter 2 we presented CEN TC/251 as the standardisation body most widely accepted as the authoritative one. We will in this section give a brief overview of what kinds of standards this body is defining. CEN/TC 251 has split its work into seven different subfields which are called:

  1. Healthcare Terminology, Semantics and Knowledge Bases
  2. Healthcare Information Modelling and Medical Records
  3. Healthcare Communications and Messages
  4. Medical Imaging and Multimedia
  5. Medical Device Communication in Integrated Healthcare
  6. Healthcare Security and Privacy, Quality and Safety
  7. Intermittently Connected Devices

Within these areas standards of very different types are defined. We will here illustrate this with the standards defined by xx 1996 within four areas. Within "Healthcare Terminology, Semantics and Knowledge Bases," standards with the following titles were defined:

Within the second area, "Healthcare Information Modelling and Medical Records," standards were defined with the following titles:

Within "Healthcare Communications and Messages" the standards defined were

Within "Medical Imaging and Multimedia" the following standards were defined:

Each of these standards is also given a status similar to those used in the Internet community, reflecting its maturity level (pre-standard, standard).

The standards cover a wide range of different health care related phenomena. They are also related to very different aspects of information infrastructure development and use. Several specify "messages," i.e. the structure of information to be exchanged. These standards also specify the institutions (or partners) between which the information should be exchanged, and when (for instance that a lab report should be sent from the lab to the ordering unit when the ordered analyses are finished). For these messages, a separate group of standards defines the formats and protocols to be used when exchanging them. Another group of standards defines the semantics of important data fields in terms of the nomenclatures and classification systems to be used. Further, there are standards defining the overall architectures of health care information systems and infrastructures, and even methodologies for developing various types of standards.

While CEN standards have been considered the most important in Europe, there are lots of others. Some of these are overlapping, and in some cases a standard defined by one body is simply adopted by another. For instance, the DICOM standard for medical images developed by ACR/NEMA has broad acceptance in this area and has accordingly more or less been adopted by CEN as it is. Similarly, the EDIFACT messages defined by CEN are also given official status within the EDIFACT bodies.

EDI

EDI denotes electronic exchange of form-like information between different organizations. It is often used to transfer electronic equivalents of already standardized paper forms. Examples are orders, invoices, customs declaration documents, bank (payment) transactions, etc. Within health care there is a vast range of such forms, including lab orders and reports, admission and discharge letters and drug prescriptions.

Within EDI, the EDIFACT format is the most widely accepted standard (among standardization bodies, maybe not among users). Accordingly, EDIFACT has been chosen as a basis for exchange of form-like information in health care as well.

In the EDIFACT community, there are three forms of standards: messages, segments and data elements. The electronic equivalent of a paper form is a message. A message is, in EDIFACT terms, composed of segments and segment groups, where a segment group is (recursively) defined by segment groups and/or segments. A segment is composed of data elements - single or composite, the latter being composed of a number of single ones.
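The recursive composition described above can be sketched as data types (an illustration of the structure only, not an official EDIFACT definition; the segment tags and values in the example message are made up):

```python
from dataclasses import dataclass
from typing import List, Union

@dataclass
class DataElement:
    """A single data element holding one value."""
    value: str

@dataclass
class CompositeElement:
    """A composite data element, composed of a number of single ones."""
    components: List[DataElement]

@dataclass
class Segment:
    """A segment: a tagged sequence of single or composite data elements."""
    tag: str
    elements: List[Union[DataElement, CompositeElement]]

@dataclass
class SegmentGroup:
    """A segment group, (recursively) made of segments and segment groups."""
    members: List[Union[Segment, "SegmentGroup"]]

@dataclass
class Message:
    """The electronic equivalent of a paper form."""
    name: str
    body: List[Union[Segment, SegmentGroup]]

def count_segments(part) -> int:
    """Recursively count the segments in any part of a message."""
    if isinstance(part, Segment):
        return 1
    if isinstance(part, (SegmentGroup, Message)):
        members = part.members if isinstance(part, SegmentGroup) else part.body
        return sum(count_segments(m) for m in members)
    return 0

# A hypothetical lab-report message: a header segment plus one group
# holding a result segment with a composite element (analyte, value).
msg = Message("LABRPT", [
    Segment("HDR", [DataElement("1")]),
    SegmentGroup([
        Segment("RES", [CompositeElement([DataElement("Hb"),
                                          DataElement("14.2")])]),
    ]),
])
print(count_segments(msg))
```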

EDIFACT messages are defined internationally. Such international messages are often accompanied by specification of national or sectorwise subsets as well as "exchange agreements" between pairs of communicating partners. Such specifications define in more detail how a general message should be used within a region or sector or between specific partners.
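The relation between a general message and a national or sectorwise subset can be sketched as a restriction check (a hypothetical illustration; the segment tags and the rules of this particular subset are invented for the example):

```python
# Hypothetical segment tags; a subset may forbid optional segments and
# make others mandatory, but it adds nothing to the general message.
GENERAL_MESSAGE = {"UNH", "BGM", "DTM", "NAD", "RES", "UNT"}
NATIONAL_SUBSET = {
    "allowed": {"UNH", "BGM", "NAD", "RES", "UNT"},   # DTM not used here
    "required": {"UNH", "BGM", "UNT"},
}
assert NATIONAL_SUBSET["allowed"] <= GENERAL_MESSAGE  # restriction only

def conforms(segment_tags, subset):
    """Check a concrete message instance against the subset rules."""
    used = set(segment_tags)
    return used <= subset["allowed"] and subset["required"] <= used

print(conforms(["UNH", "BGM", "RES", "UNT"], NATIONAL_SUBSET))  # True
print(conforms(["UNH", "DTM", "UNT"], NATIONAL_SUBSET))         # False
```

An "exchange agreement" between two partners would restrict the message further in the same fashion.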

Standardisation processes and strategies

Information infrastructures, like many other kinds of large technical systems (Hughes 1987), are standardised by formal, quasi-democratic, international standardisation bodies (Lehr 1992). These standardisation bodies have to follow predefined procedures and rules regulating the status, organisation and process of developing standards. In recognition of the limits of both market forces and hierarchical control, formal standardisation is a key strategy for developing an information infrastructure (OECD 1991).

Different standardization institutions organise the process of standardisation quite differently along several important dimensions, including the way participation in the process is regulated, how voting procedures are organised, the requirements proposed standards have to meet at different stages in the process, the manner in which information about ongoing standardisation is made public, and the bureaucratic arrangements for how work on one specific standard is aligned with other efforts.

Standardization processes have only recently become a topic for research and debate. Branscomb and Kahin (1995) discuss three possible models for NII standards development, quite close to David's three categories mentioned above, which they have given the following names:

  1. The Applications Model: Intense Competition and Ad Hoc Consortia.
  2. The Internet Model: A Cooperative Platform for Fostering Competition.
  3. The Telecommunications Model: From National-Level Management to Open Competition

We will now present OSI and EDIFACT as examples of the telecommunications model, and the Internet as an example of the Internet model. We will also present standardization processes and strategies adopted in health care, which in fact combine all the models to some extent.

OSI

OSI is short for the ISO Open Systems Interconnection reference model. OSI was worked out by the International Standardization Organization (ISO) in the early 80s. ISO is a voluntary, non-treaty organization that was founded just after World War II and produces international standards of a wide range of types. The members of ISO are national standardization bureaus rather than individuals or organizations. Members include national standardization organisations like ANSI (from the US), BSI (from Great Britain), AFNOR (from France) and DIN (from Germany) (Tanenbaum 1989).

ISO is divided into a number of technical committees according to whether the focus is on specification of nuts and bolts for construction work, paint quality or computer software. There are several hundred technical committees within ISO. The "real" work is done by the several thousand non-paid volunteers in the working groups. A number of working groups belong to a subcommittee which, again, belongs to a technical committee. To refer to the members of the working groups as non-paid volunteers merely implies that they are not paid or employed by ISO as such. They are typically employed by large vendors, consumer organizations or governmental institutions. Historically, vendors have dominated standards committees (Jakobs 1998; Lehr 1992).

The development of OSI protocols follows (in formal terms) democratic procedures with representative participation under the supervision of the ISO (Lehr 1992). The standardization process aims at achieving as broad a consensus as possible. Voting is representative, that is, each member (a representative of a national bureau of standardization) is given a certain weight.

An ISO standard passes through certain stages. It starts as a draft proposal, worked out by a working group in response to one representative's suggestion. The draft proposal is circulated for six months, during which it may be criticized. Given that a substantial majority is favourable, the criticism is incorporated to produce a revised document called a draft international standard. This is circulated for both comments and voting, which in turn feeds into the final document, an international standard, which gets approved and published. When faced with controversial issues, the process may back-track in order to work out compromises that mobilize sufficient support in the voting. In this way, as was the case with OSI, the standardization process may stretch over several years.

OSI protocols are developed by first reaching a consensus about a specification of the protocol. The protocol specifications are assumed to be implemented as software products by vendors. The implementation is independent of the standardisation process. Because of the formal and political status of OSI protocols, most Western governments have decided that II in the public sector should be based on OSI protocols.

The implementation and diffusion of OSI protocols have not proceeded as anticipated by those involved in the standardisation processes. One of the main reasons is that they have been developed by large groups of people specifying the protocols without any required implementation and without considering compatibility with non-OSI protocols (Rose 1992). This has resulted in very complex protocols and serious unforeseen problems. The protocols cannot run alongside other networks, only within closed OSI environments. The protocols are big, complex and ambiguous, making them very difficult to implement in compatible ways by different vendors. The definition of profiles mentioned earlier is an attempt to deal with this problem.

EDIFACT

EDIFACT, short for Electronic Data Interchange for Administration, Commerce and Transport, is a standardization body within the United Nations. This has not always been the case. EDIFACT has during the last couple of decades transformed dramatically, both in content and institutionally. It started as an informal body of about 30 people world-wide (Graham et al. 1996, p. 9). Since then it has grown into a huge, global bureaucracy involving several hundred participants. The small group of people who established EDIFACT chose the United Nations as their overarching organization because they expected its perceived neutrality to contribute favourably to the diffusion of EDIFACT messages (ibid., p. 9). Representation in EDIFACT is, as is usual within the United Nations, through national governments rather than through the national standards bodies, as would have been the case had EDIFACT chosen to align with ISO instead.

During 1987, EDIFACT reorganized into three geographically defined units: North America, Western Europe and Eastern Europe. These have later been extended with the areas Australia/New Zealand and Japan/Singapore. The areas are required to coordinate their activities. Although the areas jointly cover a significant part of the world, the vast majority of the work has taken place within the Western European EDIFACT board (ibid., p. 9). This is due to the close alignment with the TEDIS programme of the Commission of the European Community. In 1988, the EDIFACT syntax rules were recognized by ISO (LIASON???XXXX).

EDIFACT messages pass through different phases before reaching the status of a proper standard. A draft is first circulated to the secretariat and all other regional secretariats to be assessed purely technically before being classified as status level 0. Moving up to status 1 requires consensus from all the geographical areas. After status 1 is achieved, the proposal has to wait for at least a year to allow implementations and assessments of use. If and when it reaches status 2, it has become a United Nations standard and is published.

The EDIFACT organization also specifies rules for design of messages. For instance, if an existing message can be used, that one should be adopted rather than designing a new one. In the same way the rules specify that the existing segments and data elements should be reused as far as possible.

Internet

As illustrated in chapter 2, the organization of the development of the Internet has changed several times throughout its history. The organizational structure has changed from that of a research project into one very close to that of a standard standardization body. Along with the organizational structure, the rules have also changed.

Internet is open to participation for anyone interested, but without ensuring representative participation. The development process of Internet protocols follows a pattern different from that of OSI (RFC 1994; Rose 1992). A protocol will normally be expected to remain in a temporary state for several months (minimum six months for proposed standard, minimum four months for draft standard). A protocol may be in a long term state for many years.

A protocol may enter the standards track only on the recommendation of the IESG; and may move from one state to another along the track only on the recommendation of the IESG. That is, it takes action by the IESG to either start a protocol on the track or to move it along.

Generally, as a protocol enters the standards track a decision is made as to the eventual status, i.e. the requirement level or applicability (elective, recommended, or required) the protocol will have, although a somewhat less stringent current status may be assigned. It is then placed in the proposed standard state with that status. The initial placement of a protocol is thus into the state of proposed standard. At any time the status decision may be revisited.

The transition from proposed standard to draft standard can only be by action of the IESG and only after the protocol has been proposed standard for at least six months.

The transition from draft standard to standard can only be by action of the IESG and only after the protocol has been draft standard for at least four months.

Occasionally, the decision may be that the protocol is not ready for standardization and will be assigned to the experimental state. This is off the standards track, and the protocol may be resubmitted to enter the standards track after further work. There are other paths into the experimental and historic states that do not involve IESG action.

Sometimes one protocol is replaced by another and thus becomes historic, or it may happen that a protocol on the standards track is in a sense overtaken by another protocol (or other events) and becomes historic.
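The standards-track rules above can be summarized as a small table of permitted transitions (a sketch based on the minimum durations quoted in this section; the IESG decision itself is of course a human judgment, not a computation, and the off-track paths into the experimental and historic states are left out):

```python
# Permitted moves along the Internet standards track, each requiring
# IESG action, with the minimum number of months a protocol must have
# spent in the originating state before it may advance.
TRACK = {
    ("proposed standard", "draft standard"): 6,
    ("draft standard", "standard"): 4,
}

def may_advance(state, target, months_in_state):
    """True if the IESG could move a protocol from state to target."""
    minimum = TRACK.get((state, target))
    return minimum is not None and months_in_state >= minimum

print(may_advance("proposed standard", "draft standard", 7))   # True
print(may_advance("draft standard", "standard", 3))            # False
print(may_advance("standard", "proposed standard", 99))        # False
```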

Standards develop through phases which explicitly aim at interleaving the development of the standard with practical use and evaluation (RFC 1994, 5). During the first phase (a Proposed Standard), known design problems should be resolved but no practical use is required. In the second phase (a Draft Standard), at least two independent implementations need to be developed and evaluated before it may pass on to the final phase, that is, to be certified as a full Internet Standard. This process is intended to ensure that several features are improved, the protocols are lean and simple, and they are compatible with the already installed base of networks. Internet standards are to function in a multi-vendor environment, that is, achieve "interoperability by multiple independent parties" (ibid., p. 5).

A key source for identifying the design principles shared by the vast majority of the Internet community is the set of procedural arrangements for developing Internet standards. The standards pass through three phases which explicitly aim at interleaving the development of the standard with practical use and evaluation:

These procedures are explicitly aimed at recognizing and adopting generally-accepted practices. Thus, a candidate specification is implemented and tested for correct operation and interoperability by multiple independent parties and utilized in increasingly demanding environments, before it can be adopted as an Internet Standard.

(RFC 1994, p. 5)

The Internet community consists, in principle, of everybody with access to the Internet (in the sense of (ii) above) (RFC 1994). Participation in the e-mail discussions, either general ones or those devoted to specific topics, is open to anyone who submits an e-mail request in the way specified (see http://www.ietf.org). The Internet community may participate in the thrice-yearly meetings of the Internet Engineering Task Force (IETF). The IETF dynamically decides to establish and dismantle working groups devoted to specific topics. These working groups do much of the actual work of developing proposals. At the IETF meetings design issues are debated. It is furthermore possible to organise informal forums called BOFs ("birds of a feather") at these meetings.

The IETF nominates candidates to both the 13 members of the Internet Architecture Board (IAB) and the 10 members of the Internet Engineering Steering Group (IESG). The IETF, the IESG and the IAB constitute the core institutions for the design of the Internet. Their members are part-time volunteers. In principle, they have distinct roles: the IETF is responsible for actually working out the proposals, the IESG for managing the standardisation process and the IAB for the overall architecture of the Internet together with the editorial management of the report series within Internet called Requests For Comments (RFC). In practice, however, the "boundaries of the proper role for the IETF, the IESG and the IAB are somewhat fuzzy" as the current chair of the IAB admits (Carpenter 1996). It has proven particularly difficult, as vividly illustrated in the case further below, to negotiate how the IAB should exercise its role and extend advice to the IESG and the IETF about the overall architecture of the Internet protocols.

Standardization of health care IIs

The organization of CEN standardization work

CEN, the European branch of ISO, follows the same rules and organizational principles as ISO does in the definition of OSI protocols. Standards and work programmes are approved in meetings where each European country has a fixed number of votes. As presented above, the work is organized in eight so-called work groups, from WG1 up to WG8, each group responsible for one area. The tasks demanding "real work", for instance the development of a proposal for a standardized message, are carried out in project teams. More than 1500 experts have been involved (De Moor 1993).

In areas where EDIFACT is used, the definition of the EDIFACT message is delegated to the EDIFACT standardization body, WEEB (Western European EDIFACT board), which has established a so-called "message design group," MD9, for the health sector. The two bodies have a liaison agreement regulating their cooperation. This alignment is furthermore strengthened by the fact that a number of the members of WG3 within CEN TC 251 are also members of WEEB MD9. As of July 1995, the secretariat of WEEB was moved to CEN.

On the national level, standardization work is organized to mirror that on the European level. Typical work tasks include specifying national requirements for a European standard, validating proposed European standard messages and adapting European standardized messages to national needs.

CEN/TC 251 follows the ISO/OSI model.

Industrial consortia

To the best of our knowledge, there are no standardization efforts related to health care that can be said to follow the Internet model. Industrial consortia, on the other hand, play a significant role. The earlier mentioned HL-7 standard is organized by one such consortium. This organization has, or at least had in its early days, rules (possibly informal agreements) saying that the larger companies were not allowed to participate.

Within the field of medical imaging, the standardization work organized by ACR/NEMA (American College of Radiology/National Electrical Manufacturers Association), which develops the DICOM standard, seems to be by far the most influential. ACR/NEMA could be seen as a rather traditional standardization body. In this field, however, it operates as an industrial consortium, and the work is dominated by large companies like General Electric, Siemens and Philips.

Standards engineering

Expositions like the one in this chapter of the contents and operations of the various standardisation bodies involved in information infrastructure development may easily become overwhelming: the number and complexity of abbreviations, institutions and arrangements is impressive (or alarming). This book deals with the design -- broadly conceived -- of information infrastructures and hence needs to come to grips with the practical and institutional setting of standardisation. Still, it is fruitful to underscore a couple of themes of particular relevance to us that might otherwise drown in the many details surrounding standardisation.

Specification or prototyping

The three models presented by Branscomb and Kahin (REF) categorize standardization processes primarily according to organizational and governmental principles. We will here draw attention to important differences between the OSI and Internet processes seen as technological design strategies. Even if never made explicit, we believe that the approaches followed by, on the one hand, OSI (and EDIFACT and CEN) and, on the other hand, the Internet can be presented as two archetypical approaches to the development of information infrastructures. To explain this, we attempt to make explicit some of the underlying assumptions and beliefs.

The principal underlying assumption of OSI's approach is that standards should be developed in much the same way as traditional software engineering: by first specifying the system design, then implementing it as software products and finally putting it into use (Pressman 1992). Technical considerations dominate. As in traditional software engineering (Pressman 1992, 771), OSI relies on a simplistic, linear model of technological diffusion -- in this case, of the adoption of formal standards. The standardisation of Internet protocols is based on different assumptions. The process is close to an approach to software development much less widely applied than traditional software engineering, namely one stressing prototyping, evolutionary development, learning and user involvement (Schuler and Namioka 1993).

In the Internet approach the standardisation process unifies the development of formal standards and their establishment as de facto ones. There is currently an interesting and relevant discussion about whether the Internet's approach has reached its limits (see Eidnes 1994, p. 52; Steinberg 1995, p. 144). This is because it is not only the technology that changes: as the number of users grows, the organisation of the standardisation work changes as well (Kahn 1994).

Contingency approaches

Due to the vast range of different "things" being standardized, it is unlikely that there is one best approach for all cases. Rather, one needs a contingency approach in the sense that one needs to identify different approaches and criteria for choosing among them (David 1985?). Such criteria include the general maturity of the technology to be standardized, its range of diffusion, the number of actors involved, etc. This also implies that the approach used may change dynamically over time as the conditions for it change. The evolution of the standardization approaches followed as the Internet has grown illustrates such a contingency approach.

Branscomb and Kahin (op. cit.) hold that the Internet demonstrates the remarkable potential (although perhaps also the outer limits) of standards development and implementation in concert with rapid technological change. However, their view is that interoperability becomes harder to achieve as the functionality of the NII expands and draws in more and more vendors. They also note that it remains uncertain how well Internet-style standards processes scale beyond their current reach. The difficulties in defining and implementing a new version of IP support this view. This issue will be analysed extensively throughout this book.

Branscomb and Kahin expect that the U.S. Congress (and governments in other countries) will proceed with progressive deregulation of the telecommunication industry. This means that the importance of standardization will grow, while the government's authority to dictate standards will weaken. Yet these standards must not only accommodate the market failures of a deregulated industry; they must support the much more complex networks of the NII.

According to Branscomb and Kahin, all face the paradox that standards are critical to market development but, once accepted by the market, standards may threaten innovation, inhibit change, and retard the development of future markets. They conclude that these risks require standards processes to be future oriented. They consider Internet practices to break down the dichotomy between anticipatory and reactive standards by promoting iterative standards development with concurrent implementation.

State of the art in research into standardisation

In activities aiming at implementing the NII and Bangemann reports, standards are identified as the key elements (ref.). However, it is becoming increasingly accepted that current standardization approaches will not deliver (ref.). "Current approaches" here means the telecommunication model (including ISO's). This approach is widely experienced as all too slow and inflexible. Some believe the democratic decision process is the problem, and that it should be replaced by more managerial governance models (ref.).

Although the success of the Internet approach implies that it should be more widely adopted, this approach, as pointed out by Branscomb and Kahin, has its limitations as well. This rather miserable state of affairs is motivating a growing research activity into standardization and standardization approaches. We will here briefly point to some of the major recent and ongoing activities -- activities this book is also intended to contribute to.

include STS-type material

digital libraries

Harvard project

hawkins & mansell - political institutional economy

diffusion theorists


1. This is the source of some controversy. Some prefer to regard only OSI as "formal" due to properties of the standardisation process described later. This disagreement is peripheral to our endeavour and is not pursued in this book.

2. Health Level 7 is a standard worked out by an ad-hoc group of smaller vendors in the United States, later affiliated with the American National Standards Institute, ANSI (see http://www.mcis.duke.edu/standards/HL7/hl7.htm).

3. Number of Internet standards in July 1994 (RFC 1610).

4. Increase in the number of Internet standards from July 1994 to June 1997 (RFCs 1720, 1780, 1800, 1880, 1920, 2000, 2200).

5. Number of Internet standards in June 1997 (RFC 2200).

6. Total number of new Internet standards from July 1994 to June 1997 (RFCs 1720, 1780, 1800, 1880, 1920, 2000, 2200).

7. The number of Internet standards deleted from the official list from July 1994 to June 1997 (ibid.).

8. There are 53 Internet standards assigned a number. We have here counted the number of RFCs included in the official list of standards.

9. A good place to start looking for RFCs and information about Internet standards is http://ds.internic.net.

10. Information about the work of CEN TC/251 is found at http://

11. The term "Internet" may denote either (i) the set of standards which facilitate the technology, (ii) the social and bureaucratic procedures which govern the process of developing the standards or (iii) the physical network itself (Krol 1992; RFC 1994). This might create some confusion because a version of the Internet in the sense of (i) or (iii) has existed for many years whereas (ii) is still at work. We employ the term in the sense of (ii) in this context. To spell out the formal organisation of the Internet in slightly more detail (RFC 1994), anyone with access to the Internet (that is, in the sense of (iii)!) may participate in any of the task forces (called IETF) which are dynamically established and dismantled to address technical issues. The IETF nominates candidates to both the Internet Advisory Board (IAB, responsible for the overall architecture) and the Internet Engineering Steering Group (IESG, responsible for the management and approval of the standards). The IAB and IESG issue all the official reports which bear the name "Requests For Comments" (RFC). This archive was established along with the conception of the Internet some 25 years ago. It contains close to 2000 documents including all the formal, proposed, draft and experimental standards together with a description of their intended use. The RFCs also record a substantial part of the technical controversies as played out within working groups established by the IETF, as well as independent comments. Minutes from working group meetings are sometimes published as RFCs. In short, the RFCs constitute a rich archive which sheds light on the historic and present controversies surrounding the Internet. It seems to be a rather neglected source of information and accordingly an ideal subject matter for an informed STS project providing us with the social construction of the Internet. It is an electronic archive which may be reached on the World Wide Web at http://ds.internic.net.
