In a six-page special, The THES looks at the impact of new technology on our society and the degree to which we control it or it has come to control us.
There is growing political and commercial pressure for engineering changes to the core of the net. Paul David considers the pros and cons.
Everyday life in the world's economically advanced regions has been touched and, in some parts, transformed by the advent of the internet. The expansion of scale that this system has achieved in a decade is breathtaking. The internet can be regarded as the largest artefact in the known universe: there are more than 100 million network hosts, some 200 million PCs connected online, and almost 30 million websites on the world wide web. The pace of growth in global connectivity and internet usage and the phenomenal proliferation of innovations in applications software elicit almost universal applause. These are marvels that distinguish the internet's performance as a communications medium from its predecessors, such as the telegraph and telephone networks. More than increasing the world's communications capabilities, it has had huge social and economic impacts, not least through its effect in generating tools to exploit the explosively expanding information resources to which it has made effortless global access possible.
The user perceives the internet as though it were one single homogeneous network, but in reality it is a loosely integrated heterogeneous network of networks. The openness and transparency of this "connection-less" communications system are properties derived from the distinctive "end-to-end" design of its architecture and transmission control mechanisms. These features enable the internet to tolerate extreme diversity and heterogeneity in the technical specifications of its constituent networks and platforms. That has made joining the system highly attractive to new network operators, internet service providers (ISPs) and users. The fixed costs of joining the network remain minimal, as there is no need for extensive reconfiguration of previously customised local-area or wide-area networks to conform to new standards, nor to replace pre-existing custom-designed programs with internet software. The transparency of the internet's architecture, moreover, affords a particularly accommodating platform for developers of applications innovations. Software can be designed to run on the computers situated at the network's "edges" - taking data inputs and generating data outputs that traverse the intervening channels - without having to pay attention to the specifics of the computer hardware and software that perform the routing functions of the communication system. Although the internet's end-to-end architecture has made this communications system readily "extensible" and highly encouraging to innovation in both hardware and software applications, the technology of this infrastructure is not static.
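The "edge" programming model described above can be illustrated with a minimal sketch. In the toy echo exchange below (written in Python, using only the standard library, with a hypothetical loopback set-up standing in for the wider network), neither endpoint's code refers to any router or intermediate network: each addresses only the other endpoint, which is precisely what the end-to-end design permits.

```python
import socket
import threading

def run_echo_server(state):
    # The server sits at one "edge": it reads bytes and writes them back,
    # knowing nothing about how the intervening channels carry them.
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("127.0.0.1", 0))        # loopback stands in for the network
    srv.listen(1)
    state["port"] = srv.getsockname()[1]
    state["ready"].set()
    conn, _ = srv.accept()
    data = conn.recv(1024)
    conn.sendall(data)                # echo the payload back unchanged
    conn.close()
    srv.close()

def echo_round_trip(message: bytes) -> bytes:
    state = {"ready": threading.Event()}
    server = threading.Thread(target=run_echo_server, args=(state,))
    server.start()
    state["ready"].wait()
    # The client, at the other edge, likewise addresses only the endpoint;
    # the routing in between is entirely opaque to the application.
    cli = socket.create_connection(("127.0.0.1", state["port"]))
    cli.sendall(message)
    reply = cli.recv(1024)
    cli.close()
    server.join()
    return reply

print(echo_round_trip(b"hello, edge to edge"))
```

Because the application logic lives wholly at the edges, neither side would need changing if the path between them were re-routed through entirely different hardware.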
Today, strong pressures are mounting for engineering changes to the core of the network. The variety of "adaptive network modifications" under consideration should not, however, be construed as obvious steps in some automatic process of technological optimisation that will deliver an enhanced version of the internet we know and love. Some of the proposed engineering modifications would alter the performance of this communications system in important respects, and they should therefore be made subject to independent expert assessments carried out within explicit public policy guidelines. Whatever gains in social and economic welfare are to be expected from these "improvements", the full extent of their implications should be understood, so that direct performance gains may be properly weighed against the possible losses to stakeholders of other social and economic benefits - particularly those deriving from the transparency of the internet's inherited end-to-end architecture. At present, however, there is nothing in the political economy of the internet to ensure that assessments of this kind will be carried out, or that they will influence the engineering steps that are taken in reaction to perceived drawbacks of the architecture.
Many of these deficiencies are not new. They appeared quickly after the network of networks was thrown open to general public and commercial traffic in the mid-1990s. The most salient among them are the difficulties of: blocking delivery of nuisance messages, offensive content, and politically disturbing material; suppressing malicious actions (for example, the release of destructive software "viruses"); and pricing the usage of bandwidth to reduce delays in transmission arising from congestion.
Technical remedies for some of these are already appearing in the form of so-called filters installed at the edges of the network. Indeed, the filters being installed by end-use organisations and ISPs (because they are useful for "traffic analysis" and better capacity planning) are equally available for deployment by third parties that neither need nor ask for users' consent.
According to a recent report, the government of China has been able to effectively "firewall" the entire country, thereby controlling connections with the rest of the internet in addition to monitoring the content of internally generated traffic. What makes this feasible for an authoritarian government and a business corporation alike is that a relatively small number of paths connects their domains to the rest of the network; the same would be true for a large ISP. Inserting firewalls and filters at those few passage points is an effective and comparatively low-cost means of imposing selective controls on the messages that residents of the domain are able to exchange with the rest of the world. Equally, it permits clandestine traffic analysis and content monitoring by outside parties. This possibility gives rise to understandable concerns that - especially in the post-September 11 climate - the large ISPs may find it difficult to resist requests from government agencies to permit this to be done in the interests of security.
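The economics of such chokepoint control can be sketched in a few lines. In the toy model below (the host names and blocklist are hypothetical, invented purely for illustration), every packet leaving the domain must cross one of a handful of border gateways, so installing a single filter rule at each gateway polices the whole domain's traffic, however many hosts sit behind it.

```python
# Toy model of chokepoint filtering: a domain whose traffic reaches the
# wider net only through a few border gateways. The blocklist and host
# names below are hypothetical.
BLOCKED_HOSTS = {"banned.example"}

def gateway_filter(packet: dict) -> bool:
    """Return True if the packet may pass the gateway's filter."""
    return packet["dst"] not in BLOCKED_HOSTS

def send_via_gateways(packets: list) -> list:
    # Every outbound packet crosses exactly one of the few gateways,
    # so one filter rule per gateway controls the entire domain.
    return [p for p in packets if gateway_filter(p)]

traffic = [{"dst": "news.example"}, {"dst": "banned.example"}]
print(send_via_gateways(traffic))   # only the permitted packet survives
```

The same handful of vantage points that makes blocking cheap also makes wholesale traffic analysis cheap, which is the source of the monitoring concerns raised above.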
The insertion of technical devices to enable governments to exercise control for political purposes, or to protect the integrity of the communications system, is only one way in which the original architectural features of the internet may be compromised. Powerful business interests are also at work, particularly those of the major ISPs that have emerged during the past several years. From this quarter come inducements for engineering innovations that would support certain high-value data transport services - services for which the precursor networks forming the internet were not designed. The transmission control and internet protocols - which reassemble the data packets in order, re-transmit lost packets and confirm complete delivery - offer only a "best effort" quality of service.
While these have been successful in supporting a wide range of applications, they do not establish a dedicated connection between sender and receiver; and so they cannot guarantee users when, or even whether, a message will be delivered. Network services such as email and web browsing easily tolerate the transmission delays and delay variations that are characteristic of the TCP mechanism, but those delays fatally degrade voice telephony and video services over the existing internet.
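The distinction can be made concrete with a small simulation. The sketch below (a simplified model, not real networking code: the delays are random numbers standing in for network latency) mimics an IP-like network that delivers packets with independent random delays, and a TCP-style receiver that re-sorts them by sequence number. The message arrives intact - which is all email needs - but the timing spread (jitter) survives reassembly untouched, which is what breaks real-time voice and video.

```python
import random

def best_effort_delivery(packets, seed=0):
    # Model an IP-like network: each packet gets an independent random
    # delay, so arrival order and timing are not guaranteed.
    rng = random.Random(seed)
    arrivals = [(rng.uniform(0.01, 0.20), seq, data) for seq, data in packets]
    arrivals.sort()                       # the network delivers by arrival time
    return arrivals

def tcp_like_reassembly(arrivals):
    # A TCP-style receiver re-sorts by sequence number: the message is
    # reconstructed intact, but the delay variation is untouched.
    in_order = sorted(arrivals, key=lambda a: a[1])
    message = b"".join(data for _, _, data in in_order)
    jitter = max(t for t, _, _ in arrivals) - min(t for t, _, _ in arrivals)
    return message, jitter

packets = list(enumerate([b"voice ", b"over ", b"best-", b"effort"]))
message, jitter = tcp_like_reassembly(best_effort_delivery(packets))
print(message)   # the full message is recovered: b'voice over best-effort'
print(jitter)    # ...but with a timing spread a phone call could not hide
```

Email tolerates that jitter because the whole message is read at once; a telephone conversation, played out in real time, cannot.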
Consequently, would-be vendors of voice telephony and real-time video on the internet, and of other complementary services, have a keen interest in proposals to modify the layer of technology that controls and manages flows of data-packets, in order to achieve a "quality of service" approximating that of the public switched telephone network. But the technologies that provide this would require modifying the internet's routers in ways that terminal hardware and software would need to recognise and take into account. This is perhaps the most likely of the plausible evolutionary paths along which the termination of end-to-end architecture would be driven by private business initiatives.
It is in connection with these commercial pressures that another of the numerous consequences of the internet's privatisation has acquired unexpected significance. The enlarged scope for business strategies based on exploiting opportunities for "regulatory bypass" has posed new challenges to the ability of public authorities to regulate telecommunications industries in the interests of consumers. This has happened because the terms on which the internet was transferred into private operation and ownership opened interconnections to other public telecommunications domains, such as those based on cable or satellite, allowing migration into these fields by the providers of all sorts of commercial communications services via the internet. This has enabled some to escape legal and administrative restraints imposed on businesses using other communication modes.
In the United States, network operators in the long-regulated telephone business that offer broadband access to the internet have been required, largely for reasons of competition policy, to provide their customers with open and non-discriminatory access to other broadband ISPs. Cable companies, on the other hand, although performing the same functions, find themselves under no corresponding regulatory constraints.
This leaves the way open for some (weakly regulated) ISPs to pursue a strategy of creating what might be described as "restricted access shopping precincts in cyberspace": islands and archipelagos on the internet where subscribing customers will be offered pre-selected bundles of communications services, information applications, auctions and other e-shopping opportunities, as well as games and databases. Such a strategy would obviously create sources of indirect profit for the "cyber-mall landlord"; the firm in question would exercise considerable market power vis-a-vis the originators of the variety of goods and services being offered there.
Even though restricted cyber-malls of this kind might prove attractive to many customers seeking convenient and cheap access to standardised packages of regularly upgraded information goods and services, the adverse long-term effects on competition would curtail the economic benefits derived by consumers. Significantly, the effectiveness of the internet as a platform for net-wide innovation would be compromised. It is not surprising that informed observers have expressed dismay that the existing regime of regulation (and non-regulation) in the US may permit cable companies to bundle broadband access with selected application services offerings. Indeed, the effect would be to unleash particularly powerful private economic incentives for what could be termed the "business balkanising" of the internet.
Rational discussion of the trade-off between the different policy options - the need for widespread innovation on the one hand and protection from such problems as terrorist actions coordinated through the internet and "spam" on the other - requires a far greater measure of public awareness of the implications of the solutions being proposed.
Lacking such understanding, one is at a particularly marked disadvantage in trying to protect the special performance features of the internet as a public domain for "information discovery", for the creation of new modes of discourse and democratic participation, and for the facilitation and coordination of learning, research and innovative activities that respond to the needs and desires of diverse communities throughout the world.
Paul A. David is senior research fellow at All Souls College, Oxford, and professor of economics at Stanford University.