CHAPTER 5 Openness and flexibility

Introduction

In this chapter we look more closely at the open character of information infrastructures underscored in chapter 3. We discuss and illustrate how this openness is materialized and spell out (some of) its implications. In particular, we look at how openness generates requirements for flexible infrastructures.

At the core of this lies a dilemma. On the one hand, an expanding II supporting a growing population of users and new services accumulates pressure to make changes; on the other hand, this has to be balanced against the conservative influence of the huge, already installed base of elements of the II. There is simply no way to accomplish abrupt changes to the whole II requiring any kind of overall coordination (for instance, so-called flag-days) because it is "too large for any kind of controlled rollout to be successful" (Hinden 1996, p. 63).

Standardization and flexibility are opposites, yet both are required. This makes the tension, interdependencies and interaction between the two crucial.

Openness

Defining openness

Infrastructures, as explained in chapter 3, are open in the sense that there are no limits to the number of users, stakeholders, vendors involved, nodes in the network and other technological components, application areas, network operators, and so on. This defining characteristic does not necessarily imply the extreme position that absolutely everything is included in every II. It does imply, however, that one cannot draw a strict border saying that there is one infrastructure for what is on one side of the border and others for what is on the other side, with no connections between them.

The enabling character of infrastructures means that their use areas should not be predetermined or specified ahead of design and implementation. Enabling implies that infrastructures should be used where and as needed, as needs change and use opportunities are discovered. This aspect of infrastructures is well illustrated by the growth in the number of fields where the Internet is used, from the initial small group of researchers to virtually everybody.

Each use area has its categories and groups of users. If the number of use areas is unlimited, so is the number of users, which in turn makes it impossible to define strict limits for the number of components included in the infrastructure and the number of vendors and other stakeholders involved in its design and operation.

In the discussion about open systems (XXX Carl Hewitt), the term open usually means that a system is embedded in an environment and cannot be properly understood without being considered as a part of that environment. Our use of the term open includes this sense as well.

Infrastructures are embedded in environments, upon which they are dependent. Change in the environment might require change in the infrastructure. Infrastructures are parts of their environments in a way making it difficult to define a strict border between infrastructure and environment. The infrastructure is a part of its environment just as the environment is a part of the infrastructure.

The Internet, for instance, runs over various basic telecommunication infrastructures like ATM and telephone networks. Are these networks a part of the Internet or not? In the same way, health care infrastructures (for instance patient record and telemedicine infrastructures) are integrated with medical instruments (MR, CT, X-ray, ultrasound, ECG, EEG or endoscopes). Are they a part of the infrastructure?

When a specific information infrastructure is used, say an EDI network for the transmission of lab reports or prescriptions, it will in fact be used in combination with other infrastructures like the ordinary telephone. In the case of lab reports, the telephone is used in emergencies. In the case of prescriptions, pharmacies will call the general practitioner when something requires clarification. The EDI network does not work unless it is combined with others in this way. The overall lab report and prescription transmission infrastructure thus also includes the telephone infrastructure.

Unlimited use

As mentioned above, the Internet is a good example of the impossibility of defining the areas and ways of using an infrastructure. Email was initially designed to support the coordination and management of the first version of the Internet. Today it is used in virtually all kinds of activities. The enabling character of infrastructures makes defining their use areas impossible. This further implies that defining user requirements is impossible.

Unlimited number of stakeholders

Unlimited use necessarily implies unlimited users as well as other stakeholders. The stakeholders involved in the design and use of an information infrastructure are many and diverse, as the case of the Internet illustrates.

The present dog-fight over the Internet provides ample evidence of the conflicting interests of the many stakeholders (Monteiro 1998). The last years have witnessed a transformation of the Internet from an esoteric toy nurtured by a small, US based boys' club to an infrastructure supporting a rich spectrum of users and interests. The traditional Internet community was small and cosy, consisting of academics and a collection of designers from large software vendors in the US. Only in 1992 did this traditional community become a minority user group in the Internet. Today, there are a number of new stakeholders on the scene. The picture is highly dynamic and changes every day. With the widespread use of the WorldWideWeb, the Internet opened up to a new, large and heterogeneous user community of young and old, experienced and non-experienced, technical and non-technical. Important elements of the design moved outside the control of the IETF. Especially influential are the establishment of the W3C consortium and efforts to promote electronic transactions. Key actors here are large software vendors (including IBM, Sun, Microsoft, Apple and HP) and financial institutions (including Visa and MasterCard). It is interesting, but regrettably beyond the present scope, to study the strategies of these vendors towards open standards. Microsoft, for instance, has traditionally been extremely "closed" in the sense that the different Microsoft products were hardwired into each other, thus keeping competitors out (as did IBM during the golden age of mainframes). Within Internet technology in general and the Web in particular, areas where Microsoft has been lagging considerably behind, they are perfectly happy to be daring and explorative (XXXKILDE Haakon Lie 1996). They promote continued experimentation rather than early adoption of an existing solution, a solution developed by a competitor.

Another slow-starting, but presumably influential, group of stakeholders is the telecom operators. Coming from a different paradigm altogether (Abbate 1994), telecom operators are increasingly trying to align with the Internet rather than work against it. In Europe, for instance, telecom operators are key actors among Internet access providers (REF Knut HS, Jarle B). The media, especially TV and newspapers, struggle to come to grips with their role and use of the Internet. Until now, however, they have played a marginal role.

A more technologically biased, but historically important, sense in which information infrastructures involve numerous stakeholders concerns interoperability and portability. An information infrastructure should avoid too tight a connection with any specific vendor. It should be multi-platform, ideally allowing a rich selection of operating systems, hardware architectures, applications and programming languages. XXFINN PASSE SITAT Needless to say, this is more easily stated than practised, as illustrated in chapter 10.

Unlimited size of an information infrastructure

It is not given from the outset how big, that is, how widespread, an information infrastructure will be. It is an open, empirical question. Note that despite the apparent similarities, the situation is distinct from that in the product world. It is certainly true that nobody can tell for sure exactly how many copies of MS Word will be sold, that is, the "size" of MS Word is unknown beforehand. But this has little or no impact on the design (beyond, of course, the increased pressure in terms of appealing functionality and appearance). There is no intrinsic connection between different stand-alone copies of MS Word. This is fundamentally different for an information infrastructure. As all nodes need to function together -- regardless of their number -- the design needs to cater for this. It must be designed to be open, allowing an indefinite number of nodes to hook up.

An information infrastructure has to be designed to support a large, but unknown, population of users. So if hardwired constraints to further expansion surface, changes have to be made. These changes also take place during rapid diffusion; the diffusion is in itself an important reason for the need for change. The number of hosts connected to the Internet grew from about 1,000 to over 300,000 during 1985-1991 (Smarr and Catlett 1992). The Matrix Information and Directory Services estimated the number at about 10 million in July 1995 (McKinney 1995).

A network with just a few hosts has to be designed differently from one with millions. When the number of hosts is open, the design has to be open as well.

The wider environment - a more rapidly changing world

Dominant accounts of today's business and organizational reality are full of concepts like "globalization," "competitiveness," "flexibility," and change. The "reality" depicted by these concepts is not isolated to the business world; it is creeping into virtually all of our social life. Such accounts are found in the popular press as well as in the scientific literature.

We are witnessing a rapidly increasing number of theoretical, speculative or empirical accounts dealing with the background, contents and implications of a restructuring of private and public organisations. The sources of these accounts mirror the complex and many-faceted issues raised, which are of an economic (OECD 1992), social (Clegg 1990), political (Mulgan 1991) and technological (Malone and Rockart 1993; Malone, Yates and Benjamin 1991) nature. A comprehensive account of this is practically prohibitive; the only feasible strategy is to focus attention on a restricted set of issues.

New organisational forms are assumed important in order to achieve enhanced productivity, competitiveness, flexibility, etc. New organisational forms are usually of a network type positioned between markets and hierarchies. The discussions about new organisational forms borrow from a number of sources.

From economics, basically relying on transaction-cost considerations, there is a growing pressure to accommodate to the "information economy" (Ciborra 1992; OECD 1992). Transaction-cost considerations fail to do justice to the dynamically changing division of labour and functions, which are two important aspects of new organisational forms (Ciborra 1992). Within the business policy literature the arguments focus on innovation processes as facilitated through strategic alliances and globalization, which emerge pragmatically from concerns about maintaining competitiveness in a turbulent environment (Porter 1990; von Hippel 1988). In organisational theory, one emphasises the weaknesses of centralised, bureaucratic control in terms of responsiveness to new situations (Clegg 1990). Ciborra (1992) sees new organisational forms as rational, institutional arrangements to meet the increased need for organisational learning. Technological development within information and communication technology is identified by some scholars as the driving force behind the restructuring of organisations (Malone, Yates and Benjamin 1991; Malone and Rockart 1993).

Even such a brief exposition of theoretical considerations should make it evident that the issue of new organisational forms is vast. When we turn to what exists of empirical evidence, the picture gets even more complicated, because the empirical material documents a far less clear-cut picture, containing numerous contradictory trends (Applegate 1994; Capello and Williams 1992; Orlikowski 1991; Whitaker 1992).

A basic underlying theme in all these accounts is the view that our world is becoming increasingly open as we become more integrated. The integration is primarily due to improved transport and communication technology. As we are more tightly interwoven with a larger environment, our life is more open to influence by others, which again implies increased instability and unpredictability. The organizational response to this is increased flexibility and communication in order to adapt to and interoperate with the environment more effectively. In this sense, information and communication technology is both a cause and an effect of this trend, generating a continuously faster spinning spiral.

Implications: the need for flexible infrastructures

Systems and infrastructures

Information infrastructures are, of course, in a sense information systems. As such they share many properties. There is a general and fairly well-known argument that the use (at least the requirements) of an information system evolves over time. Boehm (198xx), for instance, includes this in his spiral model for systems development when explaining that systems development is really like aiming at a moving target. Requirements are neither complete nor stable. They are only gradually uncovered, and they are dynamic. For larger systems it is also the case that it is impossible to foresee all relevant issues and problems; they are discovered as we go along, and the technology must be changed accordingly (Parnas and Clements 1986).

Hence, it is hardly news that requirements about use evolve. Still, mainstream systems development is biased towards early and fairly stable specifications (Pressman 1992). There is an alternative, much less widely applied, approach stressing prototyping, evolutionary development, learning and user involvement (Schuler and Namioka 1993). Systems development is viewed, in principle, as a mutual learning process where designers learn about the context of use and end-users about the technical possibilities (REF noe blautfisk greier). The first versions of a system are poor in quality compared to later ones. They are improved as users gain experience in using them and discover what is needed as well as how the technology may be adapted to improved ways of working. For users it is impossible to tell in advance what kind of technology will suit their needs best. User influence is an illusion unless it is based on experience of use.

The rationale for the importance of enabling a mutual learning process associated with systems development is, at least in part, an ideologically biased one (REF empiriske studier). A less dogmatic approach, and one more immediately relevant to the development of information infrastructures, would be to inquire empirically into actual changes that have been implemented in information infrastructures in response to evolving patterns of use. This may be illustrated by a few of the changes some OSI and Internet standards have undergone during their lifetimes so far.

OSI

OSI protocols have in fact been quite stable after their formal approval. This stability may to a large extent be explained by the fact that most OSI protocols did not diffuse. The OSI standard for e-mail, however, was approved in 1984. Four years later a new version came. It differed so much that a number of its features were incompatible with the earlier version (Rose 1992).

Internet

The Internet has so far proved remarkably flexible, adaptable and extendable. It has undergone a substantial transformation -- constantly changing, elaborating or rejecting its constituting standards -- during its history. To keep track of all the changes, a special report is issued approximately quarterly, giving all the latest updates (RFC 1995). These changes also take place during rapid diffusion, and, as noted above, the very diffusion is itself an important reason for the need for change: the number of hosts grew from about 1,000 to over 300,000 during 1985-1991 (Smarr and Catlett 1992) and to an estimated 10 million by July 1995 (McKinney 1995).

The need for an II to continue to change alongside its diffusion is recognised by the designers themselves as expressed in an internal document describing the organisation of the Internet standardisation process: "From its conception, the Internet has been, and is expected to remain, an evolving system whose participants regularly factor new requirements and technology into its design and implementation" (RFC 1994, p. 6).

The IETF has launched a series of working groups which, after 4-5 years, are still struggling with different aspects of these problems. Some are due to new requirements stemming from new services or applications. Examples are asynchronous transfer mode (ATM) and other high-speed networks, video and audio transmission, mobile computers and financial transactions (secure credit card purchases). Other problems, for instance routing, addressing and net topology, are intrinsically linked to and fuelled by the diffusion of the Internet itself (RFC 1995). As commercial actors have become involved, the "triple A-s" - authentication, authorization, and accounting - have appeared as important issues. Until recently, the Internet was a non-commercial network and these issues were non-existent. For commercial service providers and network operators, they are absolutely necessary. This commercial turn requires that new features be added to a wide range of existing protocols.

The above illustrations indicate that nothing suggests that the pace of change, or the need for flexibility to change the Internet, will cease; quite the contrary (Smarr and Catlett 1992; RFC 1994, 1995).

During the period between 1974 and 1978, four versions of the bottom-most layer of the Internet, that is, the IP protocol, were developed and tested out (Kahn 1994). For almost 15 years it has been practically stable. It forms in many respects the core of the Internet by providing the basic services upon which all others build (cf. our earlier description). An anticipated revision of IP is today the subject of "spirited discussions" (RFC 1995, p. 5). The discussions are heated because the stakes are high. The problems with the present version of IP are acknowledged to be so grave that the Internet, in its present form, cannot evolve for more than an estimated 10 years without ceasing to be a globally inter-connected network (ibid., pp. 6-7; Eidnes 1994, p. 46). This situation is quite distinct from the more popular conception of an inevitable continued development of the Internet. There is a whole set of serious and still unresolved problems. Among the more pressing ones is the problem that the "address space" will run out in a few years. The Internet is based on the fact that all nodes (computers, terminals and printers) are uniquely identified by an address. The size of this space is finite and determined by how addresses are represented and assigned. Exhausting the current address space is serious because it will block any further diffusion of the Internet for the simple reason that there will not be any free addresses to assign to new nodes wishing to hook up. The difficulty is that if one switches to a completely different way of addressing, one cannot communicate with the "old" Internet. One is accordingly forced to find solutions which allow the "old" (that is, the present) version of IP to function alongside the new, yet non-existing one.
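
To make the arithmetic concrete, the following is a minimal sketch, assuming the commonly cited field widths of 32 bits for the present version of IP and 128 bits for its anticipated successor. The code and its names are illustrative only and are not taken from any Internet specification.

```python
# A back-of-the-envelope illustration (not from any RFC) of why a fixed-width
# address space is finite, and why simply widening the addresses breaks
# compatibility with the installed base.

IPV4_ADDRESS_BITS = 32    # the "present" IP discussed above (assumed width)
IPV6_ADDRESS_BITS = 128   # the anticipated successor (assumed width)

print(f"32-bit addresses can name at most {2 ** IPV4_ADDRESS_BITS:,} nodes")
print(f"128-bit addresses can name roughly {2 ** IPV6_ADDRESS_BITS:.2e} nodes")


def parse_old_address(packet: bytes) -> str:
    """An 'old' node hardwired to read a 4-byte address field."""
    return ".".join(str(b) for b in packet[:4])


# A 16-byte address cannot be represented in the old 4-byte field, so the two
# formats are incompatible on the wire; the old and the new IP must therefore
# be made to run side by side during any transition.
print(parse_old_address(bytes([192, 0, 2, 1])))
```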

This case is elaborated at greater length in chapter 10 (with a full account in Monteiro 1998). A core dilemma is how to balance the urge to make changes against the conservative influence of the installed base (see chapter 9). As a standard is implemented and put into widespread use, the effort of changing it increases accordingly, simply because any change needs to be propagated to a growing population of geographically and organisationally dispersed users, as captured by the notion of "network externalities" (Antonelli 1993; Callon 1994, p. 408) or the creation of lock-ins and self-reinforcing effects (Cowan 1992, pp. 282-283).

As the components of IIs are inter-connected, standardisation sometimes requires flexibility in the sense that to keep one component standardised and stable, others must change. Enabling network connections for mobile computers, for instance, requires new features to be added to IIs (Teraoka et al. 1994). These may be implemented as extensions to the protocols at either the network, transport or application level of the OSI model. If one wants to keep one layer stable, others must change.

Enabling flexibility to change

The primary principle for enabling flexibility is modularization, i.e. "black-boxing." Modularization as a strategy for coping with design is employed by most engineers, not only those involved with II (Hård 1994). It could, however, be maintained that in the case of computer science (including the development of II), modularization is systematically supported through a large and expanding body of tools, computer language constructs and design methodologies. Elaborating this would carry us well beyond the scope of this chapter, but it is indeed possible to present the historical development of a core element of computer science, namely the evolution of programming languages, as very much concerned with exactly how to find constructs which support flexibility to change in the long run by pragmatically deciding how to restrict or discipline local flexibility. The interested reader might want to recast, say, the controversy over structured programming along these lines, that is, recognising the call for structured constructs as a means of enabling flexibility in the long run by sacrificing local flexibility of the kind the GOTO statement offers. (The GOTO statement offers great flexibility in how to link micro-level modules together at the cost of diminishing the flexibility to change the modules later on.)

Decomposition and modularization are at the same time a basis for flexibility in II: flexibility presupposes modularization. The reason for this, at least on a conceptual level, is quite simple. The effect of black-boxing is that only the interface (the outside) of the box matters. The inside does not matter and may accordingly be changed without disturbing the full system, provided the interface looks the same. As long as a box is black, it is stable and hence standardised. In this sense standardisation is a precondition for flexibility.
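
The point can be made concrete with a minimal sketch. The names below (Transport, TcpTransport, MobileTransport, deliver_report) are invented for illustration; the only claim is the one made above, namely that a caller depending on a stable interface is untouched when the inside of the box changes.

```python
# A minimal sketch of "black-boxing": as long as the interface stays stable,
# the inside of the box can change without disturbing the rest of the system.
from typing import Protocol


class Transport(Protocol):
    """The standardised 'outside' of the box: all that other modules ever see."""

    def send(self, destination: str, payload: bytes) -> None: ...


class TcpTransport:
    def send(self, destination: str, payload: bytes) -> None:
        print(f"TCP -> {destination}: {len(payload)} bytes")


class MobileTransport:
    """A later, different implementation; callers are unaffected by the change."""

    def send(self, destination: str, payload: bytes) -> None:
        print(f"mobile link -> {destination}: {len(payload)} bytes")


def deliver_report(transport: Transport) -> None:
    # This caller depends only on the stable interface, not on any implementation.
    transport.send("lab.example.org", b"lab report")


deliver_report(TcpTransport())
deliver_report(MobileTransport())
```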

Two forms of this modularization need to be distinguished. Firstly, it may give rise to a layered or hierarchical system. OSI's seven-layered communication model provides a splendid example of this. Each layer is uniquely determined through its three interfaces: the services it offers to the layer immediately above, the services it uses from the layer immediately below, and the services a pair of sender and receiver on the same level make use of.
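
As a sketch of what such layering buys in terms of flexibility, consider the following toy stack (the layer classes are invented and vastly simplified compared to OSI's seven layers): each layer calls only the layer immediately below it, so one layer can be replaced without touching the layers above.

```python
# A toy three-layer stack illustrating hierarchical modularization: each layer
# offers services upwards and uses only the services of the layer below.
class PhysicalLayer:
    def transmit(self, bits: bytes) -> None:
        print(f"on the wire: {bits!r}")


class NetworkLayer:
    def __init__(self, below: PhysicalLayer) -> None:
        self.below = below

    def send_packet(self, address: str, data: bytes) -> None:
        header = f"[to {address}]".encode()
        self.below.transmit(header + data)  # uses only the layer below


class ApplicationLayer:
    def __init__(self, below: NetworkLayer) -> None:
        self.below = below

    def send_mail(self, address: str, text: str) -> None:
        self.below.send_packet(address, text.encode())


# Because each layer sees only the interface of the layer below, the network
# layer could be replaced (say, by a new version of IP) without touching the
# application layer.
stack = ApplicationLayer(NetworkLayer(PhysicalLayer()))
stack.send_mail("gp.example.org", "prescription")
```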

Secondly, modularization may avoid coupling or overlap between modules by keeping them "lean". One way this modularization principle is applied is by defining mechanisms for adding new features without changing the existing ones. In the new version of IP, for instance, a new mechanism is introduced to make it easier to define new options (RFC 1995). Another example is the WorldWideWeb, which is currently both diffusing and changing very fast. This is possible, among other reasons, because it is based on a format defined such that an implementation may simply skip, or read as plain text, elements it does not understand. In this way, new features can be added while old and new implementations continue to work together.
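
A minimal sketch of this skip-what-you-do-not-understand rule follows. The tagged message format and the tag names are invented, not the actual Web or IP option formats, but the forward-compatibility mechanism is the same in spirit: a sender that adds a new element does not break receivers built before that element existed.

```python
# An "old" implementation that simply skips elements it does not understand,
# so a format can grow new features without breaking installed receivers.
KNOWN_TAGS = {"patient", "result"}  # what the old implementation understands


def render(message):
    # message: a list of (tag, value) pairs produced by some sender
    for tag, value in message:
        if tag in KNOWN_TAGS:
            print(f"{tag}: {value}")
        else:
            # Unknown element: ignore it (or show it as plain text) instead of failing.
            print(f"(ignored unknown element '{tag}')")


# A "new" sender adds an extra element; the old receiver still works.
render([("patient", "Jane Doe"), ("result", "negative"), ("priority", "urgent")])
```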

Hampering flexibility to change

There are three principal ways in which the flexibility to change an II is hampered. Breaking either of the two forms of modularization enabling flexibility described above accounts for two of them. To illustrate how lack of hierarchical modularization may hamper flexibility, consider the following violation found in OSI. In the application-level standard for e-mail, the task of uniquely identifying a person is not kept apart from the conceptually different task of implementing the way a person is located. This hampers flexibility because if an organisation changes the way its e-mail system locates a person (for instance, by changing its network provider), all the unique identifications of the persons belonging to the organisation have to be changed as well. Most OSI protocols are good illustrations of violations of the "lean-ness" principle. Although the OSI model is an excellent example of hierarchical modularization, each OSI protocol is so packed with features that it is hardly possible to implement and even harder to change (Rose 1992). The reason is simply that it is easier to change a small and simple component than a large and complex one. Internet protocols are much simpler, that is, leaner, than OSI ones, and accordingly easier to change.
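
The e-mail example can be sketched as follows. The address formats and the mapping are invented for illustration; the point is only that when identification embeds location, a change of network provider forces every stored identifier to change, whereas a separate mapping localises the change.

```python
# Coupled: the identifier embeds the provider, so changing provider invalidates
# every stored identifier of the organisation's members.
coupled_id = "jane.doe@oldprovider.example"

# Decoupled: a stable identifier plus a separately maintained location mapping.
identifier = "jane.doe"
locations = {"jane.doe": "oldprovider.example"}

# Changing provider now touches only the mapping, not the identifiers themselves.
locations["jane.doe"] = "newprovider.example"
print(identifier, "->", locations[identifier])
```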

The third source of hampered flexibility is the diffusion of the II itself. As noted above, when a standard is implemented and put into widespread use, the effort of changing it increases accordingly, because any change needs to be propagated to a growing population of geographically and organisationally dispersed users, as captured by the notions of "network externalities" (Antonelli 1993; Callon 1994, p. 408) and lock-ins or self-reinforcing effects (Cowan 1992, pp. 282-283).

At the moment, the Internet appears to be approaching a state of irreversibility. Consider the development of a new version of IP described earlier. One reason for the difficulties in developing a new version of IP is the size of the installed base of IP protocols, which must be replaced while the network is running (cf. the rate of diffusion cited earlier). Another major difficulty stems from the inter-connectivity of standards: a large number of other technical components depend on IP. An internal report assesses the situation more precisely: "Many current IETF standards are affected by [the next version of] IP. At least 27 of the 51 full Internet Standards must be revised (...) along with at least 6 of the 20 Draft Standards and at least 25 of the 130 Proposed Standards." (RFC 1995, p. 38).

The irreversibility of an II has not only a technical basis. An II turns irreversible as it grows, due to the number of, and relations between, the actors, organisations and institutions involved. In the case of the Internet, this is perhaps most evident in relation to new, commercial services promoted by organisations with different interests and backgrounds. The transition to the new version of IP will require coordinated action from all of these parties. There is a risk that "everybody" will wait for "the others", making it hard to be an early adopter. As the number of users as well as the types of users grow, reaching agreement on changes becomes more difficult (Steinberg 1995).

The interdependencies between standardization and flexibility

We have sketched, drawing on both conceptual arguments and empirical illustrations, in what sense an information infrastructure needs to be open-ended and flexible. The major difficulty, however, may be to replace one working version with another working one, as change will introduce some kind of incompatibility which may cause a lock-in situation. As noted above, because the components of information infrastructures are inter-connected, keeping one component standardised and stable may require others to change; support for mobile network connections, for instance, can be added as extensions at the network, transport or application level (Teraoka et al. 1994). This aspect of information infrastructures we might call anticipated and alternating flexibility.

There are, as we have seen in earlier chapters, many Internet standards. These standards do not all fit into a tidy, monolithic form; their inter-relationships are highly complex. Some are organised in a hierarchical fashion, as illustrated by the bulk of standards from OSI and the Internet as outlined below. Others partly overlap, as, for instance, where application-specific or regional standards share some but not all features. Yet others are replaced, wholly or only in part, by newer standards, creating a genealogy of standards. This implies that the inter-dependencies of the totality of standards related to an II form a complex network. The heterogeneity of II standards, the fact that one standard includes, encompasses or is intertwined with a number of others, is an important aspect of II. It has, we argue, serious implications for how the tension between standardisation and flexibility unfolds in II.

We have in this chapter focused on just one type of flexibility -- flexibility to change. Another type is flexibility in use: the information infrastructure may be used in many different ways, serving different purposes as it is, without being changed. This is exactly what gives infrastructures their enabling function, and stressing this function makes use flexibility crucial. Flexibility of use and flexibility of change are linked in the sense that increased flexibility of use decreases the need for flexibility of change, and vice versa. An aspect of both is to provide possibilities for adaptation to different local needs and practices, avoiding unacceptable constraints being imposed by some centralized authority. Flexibility of use is the topic of chapter 7, where we develop the notion of an "inscription" to talk about the way material (or non-material) artefacts (attempt to) shape behaviour. To do so, however, it is necessary to frame this within a broader understanding of the interplay between artefacts and human behaviour. To this end, we develop in the following chapter (chapter 6) a theoretical framework called actor-network theory (ANT).
