CHAPTER 2 Cases: The Internet and Norwegian Health Care

Throughout this book, a number of examples will be used to discuss and illustrate various issues. These examples will primarily be selected from two cases - the building of two different information infrastructures: the Internet and an infrastructure for the exchange of form-like information in the Norwegian health care sector. The building of these infrastructures will be presented in this chapter. We also discuss methodological issues regarding how reasonable it is to draw general conclusions about information infrastructures from these cases. Our approach is pragmatic. We present an emerging picture of information infrastructure standardisation and development based on the empirical material at hand. This picture will be adjusted as more experience with information infrastructures is gained. The two cases exhibit, we believe, a number of salient features of information infrastructure building.

In order to make our use of the two cases as clear as possible, we identify both the important lessons to be learned and the more accidental, less reproducible aspects.

Internet

"The Internet has revolutionized the computer and communications world like nothing before" (Leiner et al., 1997, p. 102).

"The Internet today is a widespread information infrastructure, the initial prototype of what is often called the National /or Global or Galactic) Information Infrastructure" (ibid., p. 102).

As indicated by these quotes, the Internet is widely held to be the primary successful example to learn from when trying to realize the envisioned information infrastructures (Kahin and Branscomb 1995, Digital lib. 1995). We share this view, and will accordingly present the development of the Internet from its very beginning up to today. We will in this section give a brief overview of this development, pointing to what we believe to be the important steps and events that indicate what could be included in strategies for building future information infrastructures. This presentation draws heavily on (Leiner et al., 1997).

We also include a few cautionary remarks about the danger of idolizing the Internet. It is a lot easier to acknowledge its historical success than to feel confident about its future.

The beginning

The first notes related to the work leading to the Internet are a few papers on packet switching as a basis for computer networking written in 1962. The first long distance connections between computers based on these principles were set up in 1965. In 1969 the first nodes of ARPANET were linked together, and in 1971-72 the NCP 1 protocol was implemented on this network, finally offering network users the possibility of developing network applications. In 1972 e-mail was introduced, motivated by the ARPANET developers' need for easy coordination. From there, e-mail took off as the most popular network application.

The TCP/IP core

The original ARPANET grew into the Internet. The Internet was based on the idea that there would be multiple independent networks of rather arbitrary design, beginning with the ARPANET as the pioneering packet switching network, but soon to include packet satellite networks, ground-based packet radio networks and other networks. The Internet as we now know it embodies a key underlying technical idea, namely that of open architecture networking. In this approach, the choice of any individual network technology was not dictated by a particular network architecture but rather could be selected freely by a provider and made to interwork with the other networks through a meta-level "Internetworking Architecture".

The idea of open-architecture networking was guided by four critical ground rules (Leiner et al., 1997):

  1. Each distinct network would have to stand on its own, and no internal changes could be required of any such network to connect it to the Internet.
  2. Communication would be on a best-effort basis; if a packet did not make it to its final destination, it would shortly be retransmitted from the source.
  3. Black boxes (later called gateways and routers) would be used to connect the networks; they would retain no information about the individual packet flows passing through them, keeping them simple and avoiding complicated adaptation and recovery from failures.
  4. There would be no global control at the operations level.

The original Cerf and Kahn (1974) paper on the Internet described one protocol, called TCP, which provided all the transport and forwarding services in the Internet. Kahn had intended that the TCP protocol would support a range of different transport services, from the totally reliable sequenced delivery of data (the virtual circuit model) to a datagram service in which the application made direct use of the underlying network service, which might imply occasional lost, corrupted or reordered packets.

Although Ethernet was under development at Xerox PARC at that time, the proliferation of LANs was not envisioned, much less PCs and workstations. The original model was national-level networks like ARPANET, of which only a relatively small number were expected to exist. Thus a 32-bit IP address was used, of which the first 8 bits signified the network and the remaining 24 bits designated the host on that network. This assumption, that 256 networks would be sufficient for the foreseeable future, was clearly in need of reconsideration when LANs began to appear in the late 1970s.
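To make the address arithmetic concrete, the following small sketch (an illustration in Python; the sample address is arbitrary and the code is not part of any historical implementation) splits a 32-bit address according to the original 8/24 assumption:

    # Illustration only: the original ARPANET-era reading of a 32-bit IP address,
    # with the first 8 bits naming the network and the last 24 bits the host.
    import ipaddress

    def split_original(addr: str) -> tuple[int, int]:
        value = int(ipaddress.IPv4Address(addr))  # the address as a 32-bit integer
        network = value >> 24                     # top 8 bits: at most 256 networks
        host = value & 0xFFFFFF                   # bottom 24 bits: the host on that network
        return network, host

    print(split_original("10.0.0.1"))  # -> (10, 1)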

However, the initial effort to implement TCP resulted in a version that only allowed for virtual circuits. This model worked fine for file transfer and remote login applications, but some of the early work on advanced network applications, in particular packet voice in the 1970s, made clear that in some cases packet losses should not be corrected by TCP, but should be left to the application to deal with. This led to a reorganization of the original TCP into two protocols, the simple IP which provided only for addressing and forwarding of individual packets, and the separate TCP, which was concerned with service features such as flow control and recovery from lost packets. For those applications that did not want the services of TCP, an alternative called the User Datagram Protocol (UDP) was added in order to provide direct access to the basic service of IP.
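To illustrate how this division of labour appears to an application programmer, the following sketch (an example written against the standard Python sockets interface; the host names and ports are arbitrary, not drawn from the text) contrasts the reliable byte stream offered by TCP with the bare datagram service offered by UDP:

    # Illustration only; assumes network access to the example host.
    import socket

    # TCP: connection-oriented, ordered delivery, lost packets are retransmitted.
    tcp = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    tcp.connect(("example.com", 80))
    tcp.sendall(b"GET / HTTP/1.0\r\nHost: example.com\r\n\r\n")
    print(tcp.recv(100))  # first bytes of the reply, delivered in order
    tcp.close()

    # UDP: each sendto() is an independent datagram; loss and reordering are
    # left to the application, as the packet voice experiments required.
    udp = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    udp.sendto(b"ping", ("example.com", 9))  # may simply be dropped; no guarantees
    udp.close()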

ARPANET replaced NCP with TCP/IP in 1983. This version of TCP/IP was officially given version number 4.

New applications - new protocols

A major initial motivation for both the ARPANET and the Internet was resource sharing, for instance allowing users on the packet radio networks to access the time-sharing systems attached to the ARPANET. Connecting the two networks was far more economical than duplicating these very expensive computers. However, while file transfer (the ftp protocol) and remote login (Telnet) were very important applications, electronic mail has probably had the most significant impact of the innovations from that era. E-mail provided a new model of how people could communicate with each other, and changed the nature of collaboration, first in the building of the Internet itself (as is discussed below) and later for much of society.

In addition to e-mail, file transfer, and remote login, other applications were proposed in the early days of the Internet, including packet-based voice communication (the precursor of Internet telephony), various models of file and disk sharing, and early "worm" programs illustrating the concept of agents (and viruses). The Internet was not designed for just one application but as a general infrastructure on which new applications could be conceived, exemplified later by the emergence of the Web. The general-purpose nature of the service provided by TCP and IP makes this possible.

As TCP moved to new platforms, new challenges were met. For instance, the early implementations were done for large time-sharing systems. When desktop computers first appeared, it was thought by some that TCP was too big and complex to run on a personal computer. However, workable implementations were developed, showing that such small computers could be connected to the Internet as well.

As the number of computers connected increased, new addressing challenges appeared. For example, the Domain Name System was developed to provide a scalable mechanism for resolving hierarchical host names (like ifi.uio.no) into Internet addresses. The requirements for scalable routing approaches led to a hierarchical model of routing, with an Interior Gateway Protocol (IGP) used inside each region of the Internet and an Exterior Gateway Protocol (EGP) used to tie the regions together.
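As a small illustration of the service the Domain Name System provides, the following sketch (an example using the standard Python library; the host name is taken from the text) resolves a hierarchical host name into the Internet addresses registered for it:

    # Illustration only: ask the resolver (and thereby DNS) for the addresses
    # behind a hierarchical host name.
    import socket

    host = "ifi.uio.no"
    addresses = {info[4][0] for info in socket.getaddrinfo(host, None)}
    print(host, "->", addresses)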

Diffusion

For quite some time, the Internet (or at that time ARPANET) was primarily used by its developers and within the computer networking research community. As the next major step, the technology was adopted by groups of computer science researchers. An important use area was the development of basic support services for distributed computing in environments based on workstations connected to LANs, such as distributed (or networked) file systems. A crucial event in this respect was the incorporation of TCP/IP into the Unix operating system. Additional important elements were the fact that the code was freely distributed and, lastly, the on-line availability of the protocols' documentation.

In 1985 the Internet was established as a technology supporting a broad community of researchers and developers.

The evolution of the organization of the Internet

"The Internet is as much a collection of communities as it is a collection of technologies" (ibid. p. 106).

It started with the ARPANET researchers working as a tight-knit community, the ARPANET Working Group (Kahn 1994). In the late 1970s, the growth of the Internet was accompanied by the growth of the interested research community and accordingly also an increased need for more powerful coordination mechanisms. The role of the government was initially to finance the project, to pay for the leased lines, gateways and development contracts. In 1979, participation was opened up to a wider segment of the research community by setting up the Internet Configuration Control Board (ICCB) to oversee the evolution of the Internet. The ICCB was chaired by ARPA (Kahn 1994, p. 16). In 1980 the US Department of Defence adopted TCP/IP as one of several standards. In 1983 the ICCB was replaced by the IAB, the Internet Activities Board. The IAB delegated problems to task forces; there were initially 10 such task forces. The chair of the IAB was selected from the research community supported by ARPA. In 1983 TCP/IP was chosen as the standard for the Internet, and ARPA delegated the responsibility for certain aspects of the standardisation process to the IAB.

During the period 1983-1989 there was a steady growth in the number of task forces under the IAB. This gave rise to a reorganisation in 1989 where the IAB was split into two parts: (i) the IETF (Internet Engineering Task Force), which was to consider "near-future" problems, and (ii) the IRTF (Internet Research Task Force) for more long-term issues.

The Internet was significantly stimulated by the High Performance Computing initiative in the mid-80s, when a number of supercomputing centres were connected by high-speed links. In the early 90s the IAB was forced to charge nominal fees from its members to pay for the growing administration of the standardisation process.

In 1992 a new reorganisation took place. The Internet Society was established as a professional society, and the IAB was constituted as part of it. While keeping the abbreviation, the name was changed from Internet Activities Board to Internet Architecture Board. The IAB delegated the responsibility for the Internet Standards to the top-level organization within the IETF, the IESG (Internet Engineering Steering Group). The IETF as such remained outside of the Internet Society to function as a "mixing bowl" for experimenting with new standards.

The Web's recent development and widespread adoption brought in a new community. Therefore, in 1995, a new coordination organization was formed - the World Wide Web Consortium (W3C). Today, the W3C is responsible for the development (evolution) of the various protocols and standards associated with the Web. The W3C is formally outside the IETF but is closely related to it. This is partly due to formal arrangements, but more important is the similarity between the standardisation processes of the two (see chapter 11), and the fact that many of those active in the W3C have been involved in other Internet activities for a long time, being members of the larger Internet community and sharing the Internet culture (Hannemyr 1998, ++).

Future

The Internet has constantly been changing, seemingly at an increasing speed. The phasing out of IP version 4, adopted by the whole ARPANET in 1983, is now at its beginning. The development of the new version, IP version 6, started in 1991, and it was made a draft standard in 1996 (see further explanation in chapter 4). The transition of the Internet to the new version is at its very beginning. The details of this evolution are an important and instructive case of changing a large infrastructure and will be described in detail in chapter 10.
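To give a concrete impression of the most visible difference between the two versions, the following small sketch (an illustration; the addresses are arbitrary examples) contrasts the 32-bit addresses of IP version 4 with the 128-bit addresses of IP version 6:

    # Illustration only: IPv4 addresses are 32 bits long, IPv6 addresses 128 bits.
    import ipaddress

    v4 = ipaddress.ip_address("129.240.64.2")  # example IPv4 address
    v6 = ipaddress.ip_address("2001:db8::1")   # example IPv6 address (documentation prefix)
    print(v4.version, v4.max_prefixlen)        # 4 32
    print(v6.version, v6.max_prefixlen)        # 6 128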

The development of this new version turned out to be far more complicated than anticipated - both technologically and organizationally. Technologically due to the complexity of the Internet, organizationally due to the number of users and user groups involved. The Internet is supposed to be the underlying basis of a wide range of new services, from new interactive media to electronic commerce. To play this future role the Internet has to change significantly. New services such as real-time transport, supporting, for instance, audio and video streams, have to be provided. It also has to be properly adapted to new lower-level services such as broadband networks like ATM and Frame Relay, portable computers (laptops, PDAs, cellular phones) and wireless networks enabling a new paradigm of nomadic computing, etc. The required changes confront us with challenging technological as well as organizational issues.

"The most pressing question for the future of the Internet is not how the technology will change, but how the process of change and evolution itself will be managed" (Leiner at al., p. 108).

Highlights

We will here point to what we consider the key lessons to be learned from the Internet experience (so far), lessons which will be in focus throughout this book.

First of all, the Internet has constantly changed. It has changed in many ways, including:

  1. The Internet's character has changed. It started as a research project, developing new basic network technologies. As it developed, it also became a network providing services for its developers, then a network providing services to research communities, and lastly a network supporting all of us. As the network changed, its organization had to adapt - and it has done so.
  2. The Internet has constantly been growing. It has grown in terms of the number of nodes (hosts and routers) connected, in the number of users and use areas, and in terms of protocols and applications/services.
  3. The Internet has had to change. It has had to change to accommodate its own growth as well as its changing environment. Among the former changes are the introduction of the Domain Name System, the change from the initial 32-bit addressing scheme to the one used in IPv4, and now the definition of IPv6 to allow continued growth. Among the latter are the changes necessary when PCs and LANs were developed and diffused, and the above-mentioned ongoing changes to adapt to broadband and wireless networks, etc.

Basic principles for the development have been:

  1. Establishing a general basis for experimental development of applications and services. This general basis was a packet-switched computer communications network, itself being subject to experimentation.
  2. Applications and services have been implemented to support specific local needs. Widely used services are based on applications that turned out to be generally useful. E-mail and the World Wide Web are examples of applications originally developed for such specific local needs.
  3. When an application has proved to be of general interest, its specification is approved as a standard.
  4. The Internet has always been a heterogeneous network. It has been heterogeneous in that it from its very beginning was designed to integrate, and run across, various basic networks like telephone, radio, satellite, etc. It has also been heterogeneous by accepting two alternative protocols, TCP and UDP, on the same level. Today the Internet is also heterogeneous in that it integrates various networks on higher levels, like America On-Line and Prodigy with their own protocols and services, and e-mail networks based on other protocols like X.400, cc:Mail, etc. 2

Historical coincidences

Given the present, overwhelming success of the Internet, there is a pronounced danger of idealizing the Internet experience. It is important to develop a sense of how far the Internet experience is relevant as a basis for generalized lessons, and what should be regarded as more or less historically contingent, irreproducible decisions. The Internet has, of course, a number of historically contingent features which distinguish it and make it difficult, if not impossible, to intentionally reproduce. Among these are:

In addition, as in all cases, there have been a number of coincidences where independent events have happened at the same time, opening up possibilities and opportunities that created a success story.

International Organization for Standardization (ISO)

ISO has a long history of developing standards in all kinds of areas, ranging from the length of a meter to nuts and bolts (REF). Due to the way it is intended to mimic quasi-democratic decision processes with representative voting (see chapter 4 for further details), ISO has a unique position within national technology policy. Member nations agree to make ISO standards official, national standards. This implies that ISO standards are automatically promoted by key public actors in the areas they cover. And in some countries (like Norway), ISO standards are granted the status of law.

When the need for open (i.e. non-proprietary) computer communication standards was gaining wide acceptance, ISO was a quite obvious choice of body to be responsible for the standardization work. The development of the OSI model and its protocol suite, covering everything from the coding of physical signals to applications like e-mail and secure transactions, started in XXXX. The standards are specified in terms of a so-called reference model, the Open Systems Interconnection (OSI) model. It specifies seven layers which in sum make up what was supposed to become the basis of what we call information infrastructures.
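For reference, the seven layers of the reference model are listed in the sketch below (standard textbook material, not drawn from this chapter's sources):

    # The seven OSI layers, from bottom to top.
    OSI_LAYERS = [
        (1, "Physical"),      # coding of physical signals
        (2, "Data Link"),
        (3, "Network"),
        (4, "Transport"),
        (5, "Session"),
        (6, "Presentation"),
        (7, "Application"),   # e.g. e-mail, file transfer, directory services
    ]
    for number, name in OSI_LAYERS:
        print(number, name)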

For several years there was a "religious war" between the Internet and OSI supporters (ref.). Beyond the religious aspects, the Internet and OSI work reflected different visions about what our future world of computer communications should look like, giving different actors different roles. Accordingly, the battles over technological design alternatives were also battles over positions in future markets (Abbate 1995). In addition, the war had a general political content, as the Internet technology was developed by Americans, accordingly giving them a competitive advantage. For the Europeans, then, it was important to make OSI different from the Internet technology, putting them in an equal position.

The OSI approach was indeed different. It negates virtually every element in the list of "highlights" above. Attempts have been made to develop the protocols by everybody coming together and agreeing on the protocol specifications. No experimentation, no implementation before standardization, no evolution, no change, no heterogeneity.

Now the war is over, although there might still be some lonely soldiers left in the jungle whom this message has not yet reached. 3

EDI and EDIFACT

One important form of computer communication is EDI (Electronic Data Interchange). This form of communication covers the exchange of information between different organizations, typically information that has previously been exchanged as paper forms, often even formally standardized forms. Such forms include orders, invoices, consignment notes and freight bills, customs declaration documents, etc. In the business world, EDI infrastructures have been built for quite some time, and bodies taking care of the standardization have been set up. The seemingly largest and most important activity is related to EDIFACT. 4 EDIFACT is a multifaceted creature. It is a format, or language, for defining data structures, combined with rules for how such structures should be encoded into character streams to be exchanged. Further, it includes a set of standardized data structures and a large bureaucratic organization controlling the EDIFACT standards and standardization processes.

The primary items to be standardized in the EDIFACT world are "messages." An EDIFACT message is typically an electronic equivalent of a paper form.
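To give a flavour of what such a message looks like "on the wire", the following sketch assembles a small EDIFACT-style message in Python. The syntax (segments terminated by an apostrophe, data elements separated by "+", components by ":") follows the general EDIFACT rules, but the segment names and contents are invented for illustration and do not reproduce any actual standardized message:

    # Illustration only: an invented, EDIFACT-style "lab report" message.
    segments = [
        "UNH+1+LABRPT:D:96A:UN",         # message header (message name invented)
        "PNA+PAT+12345678901",           # patient identification (illustrative)
        "RSL+HB:HAEMOGLOBIN+14.2+G/DL",  # one result line (illustrative)
        "UNT+4+1",                       # message trailer: segment count and reference
    ]
    print("'".join(segments) + "'")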

The EDIFACT standardization organization is part of the United Nations system. This was a deliberate choice by those starting the EDIFACT standardization work, believing that the United Nations would give the activities the best possible legitimacy.

The EDIFACT format was defined in the early seventies, while the formal standardization activities started in XX.

Health care information infrastructures

Health care information infrastructures have been developed and used over a period of more than ten years and have taken different shapes over time. Two main forms of use are transmission of form-like information and (possibly real-time) multi-media information. Typical examples of the former include: lab orders and reports exchanged between general practitioners, hospitals and labs; admission and discharge letters between general practitioners, specialists, and hospitals; prescriptions from general practitioners to pharmacies; exchange of non-medical information like ordering of equipment and food and invoices from hospitals and general practitioners to health insurance offices for reimbursement.

Typical examples of the latter type include: telemedicine services, that is, computer-based services which usually include real-time multi-media conferencing systems supporting a physician requesting advice from another physician at another institution; access to databases and Web servers containing medical information; and PACS (picture archiving systems for X-ray images). In this book, we focus on the former type, i.e. transmission of form-like information.

The various forms of information exchange are overlapping and interconnected. The same piece of information may be exchanged as part of different transactions, for instance, by transmission of a digital X-ray image either using a multi-media conference system or attached to an e-mail. Furthermore, any organisational unit may engage in transactions with several other units. A lab, for instance, may communicate with a number of general practitioners, other labs, and other hospital wards. This is what distinguishes such systems from stand-alone applications and turns them into infrastructure.

The development of health information infrastructure standards -- not to mention their implementation in products and adoption by user organisations -- has been slow. Based on personal experience, it seems that the more formal the standardisation process is, the slower the adoption becomes. Industrial consortia seem so far to be the most successful. As, to the best of our knowledge, there does not exist any systematic evaluation, this is difficult to "prove." But studies in sectors other than health care exist. The evaluation of the European Union's programme for diffusion of EDI in trade, the TEDIS programme, lends support to the view that formal standardisation is incredibly slow - design as well as diffusion (Graham et al. 1996). An evaluation of EDIFACT on behalf of the United Nations concludes similarly (UN 1996).

EDI Infrastructure in the Norwegian health care sector

Fürst

The development of electronic information exchange between health care institutions in Norway started when a private lab, Dr. Fürst's Medisinske Laboratorium in Oslo, developed a system for lab report transmission to general practitioners in 1987. The system was very simple -- the development time was only 3 weeks for one person (Fiskerud 1996). The interest of Dr. Fürst's laboratory was simply to make a profit by attracting new customers. The system was based on the assumption that it would help general practitioners save much of the time otherwise spent on manual entry of lab report data, and that the general practitioners would find the possibility of saving this time attractive. Each general practitioner receives on average approximately 20 reports a day, which take quite some time to register manually in their medical record systems.

The system proved to be a commercial success and brought them lots of general practitioners as new customers. This implied less profit for the other labs. Within a couple of years, several non-private labs (in hospitals) developed or bought systems with similar functionality in order to be competitive. Alongside the growing number of labs adopting systems for exchange of reports, an increasing number of actors saw a wider range of applications of similar technology in other areas. These actors were represented within the health sector as well as among possible vendors of such technology. For all of them it was perceived as important that the technologies should be shared among as many groups as possible in order to reduce costs and enable interconnection of a wide range of institutions.

The network Fürst established is still in use. Currently Fürst delivers the reports to about 2000 regular customers through the network.

Telenor - standardization

After an initial experiment, Telenor (the former Norwegian Telecom) launched the project "Telemedicine in Northern Norway" in 1987, which ran until 1993. Standardisation has always been considered important within the telecommunication sector. This attitude, combined with Telenor's interest in the largest possible markets, made them take it for granted that the new health information infrastructure standards should be like any other telecommunication standard: "open" and developed according to the procedures of formal standardisation bodies.

As there was a general consensus about the need for standards, the fight started about what these standards should look like and how they should be developed. Based on their interests in general solutions and rooted in the telecommunication tradition of international standardisation, Telenor searched for international activities aiming at developing "open" standards. The IEEE (Institute of Electrical and Electronics Engineers) P1157 committee, usually called Medix, did exactly that. This work was the result of an initiative to develop open, international standards taken at the MEDINFO conference in 1986. Medix, which was dominated by IT professionals working in large companies like Hewlett Packard and Telenor and some standardisation specialists working for health care authorities, adopted the dominant approach at that time, namely that standards should be as open, general and universal as possible.

The idea of open, universal standards underlying the Medix effort implied using existing OSI (Open Systems Interconnection) protocols defined by ISO as the underlying basis. The Medix effort adopted a standardisation approach -- perfectly in line with textbooks in information systems development -- according to which the development should be based on an information model being a "true" description of the relevant part of reality, that is, the health care sector, independent of existing as well as future technology. (More on this in later chapters, particularly 8 and 12.) Individual messages would be derived from the model more or less automatically.

While the focus was directed towards a comprehensive information model, lab reports were still the single most important area. However, for those involved in Medix the task of developing a Norwegian standardised lab report message had, by around 1990, been translated into the development of a proper object-oriented data model of the world-wide health care sector.

In addition to the information model, the protocols and formats to be used had to be specified. In line with the general strategy, as few and as general protocols and formats as possible should be included. Medix first focused on the ISO standard for exchange of multi-media documents, ODA (Open Document Architecture), believing it covered all needs for document-like information. However, around 1990 most agreed that EDIFACT should be included as well. The Europeans who most strongly advocated EDIFACT had already established a new body, EMEDI (European Medical EDI), to promote EDIFACT in the health sector. In Norway, a driving force behind the EDIFACT movement was the "Infrastructure programme" run by a governmental agency (Statskonsult) during 1989-92. Promoting Open Systems Interconnection standards and EDIFACT systems based on Open Systems Interconnection were key goals for the whole public sector (Statskonsult 1992).

Several other standardization activities were considered and promoted by various actors (vendors). Andersen Consulting, for instance, promoted the HL-7 5 standard. The Ministry of Health hired a consultant to make its own proposal - which was immediately killed by other actors in this arena. The dispute was settled in 1990 when the Commission of the European Community delegated to CEN (Comité Européen de Normalisation, the European counterpart of ISO) the responsibility for working out European standards within the health care domain in order to facilitate the economic benefits of a European inner market. CEN established a so-called technical committee (TC 251) on 23 March 1990 dedicated to the development of standards within health care informatics. From this time Medix disappeared from the European scene. However, the people involved moved to CEN, and CEN's work to a large extent continued along the lines of Medix.

Both CEN and Medix were closely linked to the OSI and ISO ways of thinking concerning standards and standardization. Accordingly, the work has been based on the same specification-driven approach - and with seemingly the same lack of results.

Isolated networks for lab report exchange

As mentioned above, a large number of labs bought systems similar to Fürst's. Some of these were based on early drafts of standardized EDIFACT messages, later updated to the standardized versions. These networks were, however, not connected to each other. They remained isolated islands - each GP connected could only receive reports from one lab.

Each of the networks got a significant number of GPs connected within a short period after the network technology was put in place. From then on, the growth has been very, very slow.

Pilots

In other areas, a number of pilot solutions have been implemented and tested in parallel with the standardization activities. This has been the case for prescriptions, lab orders, physicians' invoices, reports from X-ray clinics and from other labs like pathology and micro-biology, etc. However, none of these systems has been adopted for regular use.

Cost containment in the public sector

All the activities mentioned above have been driven by an interest in the improvement of medical services. Some overlapping activities were started under Statskonsult's "Infrastructure programme." The overall objectives of this programme were to improve productivity, service quality, and cost containment in the public sector.

It is widely accepted that physicians get too much money for their work from the social insurance offices. Through improved control the government could perhaps save more than a hundred billion Norwegian kroner a year. 6 Spending on pharmaceuticals is high, and is accordingly another important area for cost containment. In addition, the health care authorities wanted enhanced control concerning the use of drugs by patients as well as the prescription practices of physicians concerning habit-forming drugs.

As part of the Infrastructure programme, KITH (Kompetansesenteret for IT i helsesektoren) was hired by Statskonsult to work out a preliminary specification of an EDIFACT message for prescriptions (KITH 1992). A project organization was established, also involving representatives of the pharmacies and the GPs.

The interests of the pharmacies were primarily improved logistics and the elimination of unnecessary retyping of information (Statskonsult 1992). By integrating the system receiving prescriptions with the existing system for electronic ordering of drugs, the pharmacies would essentially have established a just-in-time production scheme. In addition, the pharmacies viewed it as an opportunity for improving the quality of service to their customers. A survey had documented that as many as 80% of their customers were favourable to reducing waiting time at the pharmacies as a result of electronic transmission of prescriptions (cited in Pedersen 1996).

As the standardization activities proceeded, the project drifted (Ciborra 1996, Berg 1997 a, b) away from the focus on cost containment. The improved logistics of the pharmacies became the benefit most in focus. The technological aspects of the standardization work contributed to this, as such an objective appeared easier to obtain through a standard application of EDI technology.

This drifting might have been an unfortunate event, as the potential reduction in public spending could have significantly helped raise the required funding for developing a successful system.

Highlights

We will here, as we did for the Internet, point to some lessons to be learnt from this experience:

  1. Simple solutions designed and deployed under a strong focus on the specific benefits to be obtained have been very successful.
  2. Health care is a sector with a wide range of different overlapping forms of communication, and communication between many different overlapping areas.
  3. Focusing on general, universal standards makes things very complex, organizationally as well as technologically. The specification of the data format used by Fürst takes one A4 page. The specification of the CEN standardized EDIFACT message for lab reports takes 500 (!) pages (ref. CEN). Where the CEN message is used, the data being exchanged are exactly the same as in the Fürst solution. The focus on general solutions also makes the benefits rather abstract, and the solutions are hard to sell to those who have the money. Furthermore, there is a long way to go from such general standards to useful, profitable solutions.
  4. A strong focus on standards makes success unlikely.

 

Methodological issues

As pointed out above and elaborated further in chapter 3, there is a wide variety of information infrastructure standards produced within bodies like ISO/CEN, EDIFACT, and the Internet Society. These standards are on different levels and deal with issues like message definitions, syntax specification, protocols, file type formats, etc. Some standards are general purpose, others are sector-specific (for instance, for health care), and some are global while others are regional. Most of them are currently in the making. Our study does not provide evidence for drawing far-reaching conclusions regarding all types of information infrastructure standards. We believe, however, that the health care standards we have studied are representative of crucial parts of the standards of the information infrastructures envisioned in, for instance, the Bangemann report (1994), and that the picture of standardisation emerging from our analysis contains important features.

Studying the development of information infrastructures is not straightforward. There are at least two reasons for this which have immediate methodological implications worth reflecting upon.

First, the size of an information infrastructure makes detailed studies of all elements practically prohibitive. The Internet, for instance, consists of an estimated 10 million nodes with an unknown number of users, and more than 200 standards which have been, and still are being, extended and modified over a period of 25 years within a large, geographically dispersed organisation where a number of vendors (Sun, IBM, Microsoft), commercial interests (MasterCard, Visa) and consortia (W3C) also attempt to exercise influence. This implies that the notion of an actor in connection with information infrastructure standardisation is a fairly general one in the sense that it is sometimes an individual, a group, an organisation or a governmental institution. An actor may also be a technological artifact -- small and simple, or a large system or network like the Internet or EDIFACT.

Actor network theory has a scalable notion of an actor as Callon and Latour (1981, p. 286) explain: "[M]acro-actors are micro-actors sitting on top of many (leaky) black boxes". In other words, actor network theory does not distinguish between a macro- and micro-actor because opening one (macro) black-box, there is always a new actor-network. It is not a question of principle but of convenience, then, which black-boxes are opened and which are not. To account for information infrastructure standardisation within reasonable space limits, it is necessary to rely on such a scalable notion of an actor. A systematic, comprehensive empirical study is prohibitive. In our study, we have opened some, but far from every, black-box. Several black-boxes have been left unopened for different reasons: some due to constraints on writing space, some due to lack of data access and some due to constraints on research resources. It might be desirable to have opened more black-boxes than we have done. We believe, however, we have opened enough to be able to present a reasonable picture of standardisation.

Our empirical evidence is partly drawn from the standardisation of EDI messages within the health care sector in Norway. A method of historical reconstruction from reports, minutes and standards documents, together with semi-structured and unstructured interviews, has been employed, partly based on (Pedersen 1996). One of the authors was for a period of three years engaged by two of the companies involved (Telenor and Fearnley Data) in the development of the standards. Our account of the case has been presented to, discussed with and validated by one of the key actors (KITH, Norwegian: Kompetansesenteret for IT i Helsesektoren A/S).

Second, the fact that information infrastructures are currently being developed and established implies that there is only limited material about the practical experience of which solutions "survive" and which "die", i.e. which inscriptions are actually strong enough to enforce the desired pattern of use. Hence, we are caught in a dilemma. On the one hand, the pressure for grounding an empirical study suggests that we need to await the situation and let the dust settle before inquiring closer. On the other hand, we are strongly motivated by a desire to engage in the ongoing process of developing information infrastructures in order to influence them (Hanseth, Hatling and Monteiro 1996).

Methodologically, the use of the Internet as a case, in particular the IPng case described in chapter 10, is a historical reconstruction based on several sources. The Internet keeps a truly extensive written record of most of its activities, an ideal source for empirical studies related to the design of the Internet. We have used the archives for (see chapter 4 for an explanation of the abbreviations) IETF meetings including BOFs, working group presentations at IETF meetings (ftp://ds.internic.net/ietf/ and http://www.ietf.org), RFCs (ftp://ds.internic.net/rfc/), minutes from IPng directorate meetings (ftp://Hsdndev.harvard.edu/pub/ipng/directorate.minutes/), the e-mail list for big-internet (ftp://munnari.oz.au/big-internet/list-archive/) and several working groups (ftp://Hsdndev.harvard.edu/pub/ipng/archive/), internet drafts (ftp://ds.internic.net/internet-drafts/), IESG membership (http://ietf.org/iesg.html#members), IAB minutes (http://info.internet.isi.edu:80/IAB), IAB membership (http://www.iab.org/iab/members.html) and information about IAB activities (http://www.iab.org/iab/connexions.html). The archives are vast, many thousand pages of documentation in total. The big-internet e-mail list, for instance, receives on average about 200 e-mails every month. As a supplement, we have conducted in-depth semi-structured interviews lasting about two hours with two persons involved in the design of the Internet (Alvestrand 1996; Eidnes 1996). One of them is an area director within the IETF and hence a member of the IESG.

 


1. NCP, Network Control Protocol

2. This is a somewhat controversial standpoint. The Internet community has always stressed that "reality" out there is heterogeneous, and accordingly a useful running network has to run across different basic network technologies. This has been a crucial argument in the "religious war" against OSI. However, the Internet community strongly believes in "perfect" technical solutions (Hannemyr 1997), and accordingly refuses to accept gateways between not perfectly compatible solutions (Stefferud 199x). We return to this when discussing gateways in chapter 11.

3. OSI has been strongly supported by public agencies, and most governments have specified their GOSIPs (Government OSI Profiles), which have been mandatory in government sectors. These GOSIPs are still mandatory, and Internet protocols are only accepted to a limited extent.

4. EDIFACT is an abbreviation for Electronic Data Interchange For Administration, Commerce and Transport.

5. HL-7 is an abbreviation of Health Level 7, the number "7" referring to level 7 in the OSI model.

6. These numbers are of course controversial and debatable.
