The Internet has united the world as never before. But is it in danger of breaking apart? Cybersecurity, geopolitical tensions, and calls for data sovereignty have made many believe that the Internet is fragmenting.
In this incisive new book, Milton Mueller argues that the “fragmentation” diagnosis misses the mark. The rhetoric of “fragmentation” camouflages the real issue: the attempt by governments to align information flows with their jurisdictional boundaries. The fragmentation debate is really a power struggle over the future of national sovereignty. It pits global governance and open access against the traditional territorial institutions of government. This conflict, the book argues, can only be resolved through radical institutional innovations.
Will the Internet Fragment? is essential reading for students and scholars of media and communications, international relations, political science and STS, as well as anyone concerned about the quality of Internet governance.
Page count: 166
Year of publication: 2017
Cover
Title Page
Copyright
1 Coming Undone?
The “unified and unfragmented space”
The mismatch
“Fragmentation of the Internet” as a move in policy discourse
Interrogating fragmentation
Overview of key theses
2 A Taxonomy of “Fragmentation”
The unifragged Internet
A taxonomy of “fragmentation”
Alignment vs. globalization
Exemplar of alignment: the Chinese DNS proposal
Notes
3 The Illusion of Technical Fragmentation
Defining technical fragmentation
Network benefits
National Internets
Kill switches
A split DNS root
Incompatible protocols
Application layer incompatibilities
Why connectivity wins
Notes
4 Alignment: Cyberspace Meets Sovereignty
Methods of alignment
The contradictions of alignment
Alignment is an illusion
Notes
5 Confronting Alignment
Enhanced international legal cooperation
Giving up: Embracing national interest over global Internet
The “multistakeholder model”
Notes
6 Popular Sovereignty in Cyberspace
Sovereignty in history
Sovereignty and territoriality
Changing units of governance
Net nationalism
Identity
Displacement and articulation
Notes
References
Index
End User License Agreement
Digital Futures Series
Milton Mueller, Will the Internet Fragment?
Neil Selwyn, Is Technology Good for Education?
MILTON MUELLER
polity
Copyright © Milton Mueller 2017
The right of Milton Mueller to be identified as Author of this Work has been asserted in accordance with the UK Copyright, Designs and Patents Act 1988.
First published in 2017 by Polity Press
Polity Press, 65 Bridge Street, Cambridge CB2 1UR, UK
Polity Press, 350 Main Street, Malden, MA 02148, USA
All rights reserved. Except for the quotation of short passages for the purpose of criticism and review, no part of this publication may be reproduced, stored in a retrieval system or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, without the prior permission of the publisher.
ISBN-13: 978-1-5095-0125-0
A catalogue record for this book is available from the British Library.
The publisher has used its best endeavours to ensure that the URLs for external websites referred to in this book are correct and active at the time of going to press. However, the publisher has no responsibility for the websites and can make no guarantee that a site will remain live or that the content is or will remain appropriate.
Every effort has been made to trace all copyright holders, but if any have been inadvertently overlooked the publisher will be pleased to include any necessary credits in any subsequent reprint or edition.
For further information on Polity, visit our website: politybooks.com
An alarming message about the Internet is being voiced around the world. The Internet is in danger of splitting up, fragmenting. “After 20 years of connecting the world ever more tightly,” the Financial Times wrote in late 2014, “the Internet is about to become Balkanised” (FT Reporters, 2014). A year later, two respected legal scholars lamented that “The era of a global Internet may be passing. Governments across the world are putting up barriers to the free flow of information across borders . . . breaking apart the World Wide Web” (Chander and Le, 2015). The most apocalyptic vision came from cybersecurity expert Eugene Kaspersky, who claimed that “Internet fragmentation will bring about a paradoxical de-globalization of the world, as communications within national borders among governmental bodies and large national companies become increasingly localized” (Kaspersky, 2013, December 17).
How serious are these claims? There is certainly a political backlash against trade, immigration, and the European Union that seems to match these concerns about the Internet. How much of a threat, if any, do current trends in information and Internet policy pose to globalization? The more the idea of fragmentation or Balkanization plays a role in global Internet governance debates, the more important it becomes to interrogate those concepts. What do we really mean by Internet fragmentation? What are its manifestations? Is there really a risk that the global Internet could be divided up into distinct fiefdoms?
Engaging with these questions opens up a rich set of issues in communications policy and global governance. What seemed at first to be a simple dichotomy – globalized vs. territorialized information, an “open” vs. a “closed” Internet – turned out to be much more complex. Many of the things being called fragmentation are indeed destructive attempts by political authorities or private monopolies to limit and control the potential of information technology. But at the same time, many of the methods and techniques used for these bad purposes are also used by legitimate local actors to enhance or protect their own networks and their users’ security and freedom of action. The basic, yet hard-to-grasp fact is that digital technology is so flexible and powerful that it enables both destructive and protective forms of control in a variety of contexts.
“Fragmentation” is really the wrong word with which to approach this problem. In this book I will argue that the network effects and economic benefits of global compatibility are so powerful that they have consistently defeated, and will continue to defeat, any systemic deterioration of the global technical compatibility that the public Internet created. The rhetoric of “fragmentation” is in some ways a product of confusion, and in other ways an attempt to camouflage another, more inflammatory issue: the attempt by governments to align the Internet with their jurisdictional boundaries. The fragmentation debate is really a power struggle over the future of national sovereignty in the digital world. It’s not just about the Internet. It’s about geopolitics, national power, and the future of global governance.
Turn the clock back to Sao Paulo, Brazil, in April 2014. Nearly 2,000 people from business, government, civil society, and the technical community converged on this city to discuss Internet governance. The meeting, dubbed the NETMundial by its proud Brazilian hosts, was a bold attempt to bring together a community both shocked and mobilized by Edward Snowden’s revelations of NSA spying. One of the instigators of the meeting, Fadi Chehadé, at that time the President and CEO of ICANN, announced, “If we cannot find a way to govern the Internet on an equal footing, in an open transparent way this year, we might descend into a fragmented version of the Internet” (Chehadé, 2014).
Reacting to the news that the Internet had become a tool of globalized, mass surveillance, the NETMundial congregated to forge agreement on some basic principles for global Internet governance. Though many of the principles debated at the meeting proved contentious, it had no trouble coming to consensus on this one:
UNIFIED AND UNFRAGMENTED SPACE
The Internet should continue to be a globally coherent, interconnected, stable, unfragmented, scalable and accessible network-of-networks, based on a common set of unique identifiers . . . that allows data packets/information to flow freely end-to-end . . . (NETMundial, 2014)
However awkwardly phrased, the principle that the Internet should be unified and unfragmented was considered fundamental; it sat in the event’s outcome document alongside principles such as HUMAN RIGHTS and SECURITY, STABILITY AND RESILIENCE.
NETMundial was only one of the many manifestations of a world-embracing universalism or globalizing tendency that has always been present in the technical vision of the Internet. One of the inventors of the Internet protocols, Vinton G. Cerf, wrote:
From a technical standpoint, the original shared vision guiding the Internet’s development was that every device on the Internet should be able to exchange data packets with any other device that was willing to receive them. Universal connectivity among the willing was the default assumption . . . (Drake, Cerf, and Kleinwachter, 2015)
One of the most strident advocates of this vision is the designer of the World Wide Web protocol, Tim Berners-Lee. Berners-Lee believes that the Web is a universe that is (or should be – for the line between the normative and the positive is always fuzzy when computer scientists talk about policy) subject to uniform laws, which he compares matter-of-factly to the laws of physics:
Ants, Neurons, objects, particles, people. In each case, the whole operates only because the parts interoperate. The behaviour of the whole is in some way dictated by the rules of behaviour of the parts. This may be a view influenced too much by physics, but I find it useful. It makes you think about how you predict the rules of the whole from the rules of the parts, and then as a global engineer (constitution writer, etc) how you can phrase the local laws to engender the global behaviour he desires. For people, we call these rules variously the constitution, laws, or codes of ethics, for example. These rules are things which are accepted across the board. For particles, we call them the laws of physics. For web objects they are the protocol standards. (Berners-Lee, 1995)
Berners-Lee, and many others like him, hold up as a guiding norm the idea that “an application should function at one point in the network as it does at any other; a website should look the same to a person in China as it does to a person in Chile. In other words, the experience of every Internet user should be the same regardless of geographic location, computer type, or any other distinguishing characteristic of the user” (Hill, 2012).
This “laws of physics” approach is echoed by the Internet Society’s description of the essential features of the Internet – what they call, again echoing the language of mathematical axioms or natural laws, one of the “Internet invariants.”
Global reach, integrity: Any endpoint of the Internet can address any other endpoint, and the information received at one endpoint is as intended by the sender, wherever the receiver connects to the Internet. Implicit in this is the requirement of global, managed addressing and naming services. (Daigle, 2015)
This commitment to perfect interoperability, to a seamlessly interconnected, borderless, and transparent cyberspace, is not a recent turn. It is an almost religious principle of the Internet technical community, built into its DNA. The US Department of Defense (DoD) funded the development of the Internet protocols not, as is commonly assumed, so that the network would survive a nuclear war, but because it wanted its field personnel to be able to communicate seamlessly regardless of what system or physical medium they were using. DoD wanted a single protocol to unify – to internetwork – any and all of their data communications.
They got one, and more. The community of computer scientists and network engineers they funded bought into the principle of interoperability with a fervor that exceeded the military’s original intent. And their commitment to that principle proved to be right. The economic and social benefits of interoperability among civilians and businesses in a digitizing world vastly exceeded its minor contribution to US military communications. In the 1980s, when personal computers and other digital devices began to proliferate, the open, nonproprietary Internet protocols met a powerful need. It took only about a dozen years for the Internet protocols, officially standardized late in 1981 and implemented from 1982 to 1984, to take over the world of digital communications completely. From about 1993 on, adoption of the Internet protocols reached the critical mass needed to create the bandwagon effect of self-sustaining growth. The Internet succeeded precisely because it overcame the compatibility barriers – the technical fragmentation – of the world of national telephone monopolies and multiple proprietary data networking protocols into which it was born.
Though rooted in the ideals of the technical community, the “unfragmented space” was a vision with profound political and economic implications. It is a vision that militates against jurisdictional boundaries on the flow of information. It is a vision that, if carried out consistently, drastically diminishes the power of local politicians and governments to shape and control information. With respect to the information economy, it is globalization on steroids. A system that is engineered to make communications and information accessible and interoperable across the board enables commercial exchanges of digital goods and information services among any two connected parties. In other words, it implies pure free trade in information services, a globalized market unprotected by customs checkpoints or tariffs. In a unified and unfragmented space, any entrepreneur with a new idea can make the world their marketplace. Turning the tables on the state, it moves from a regime requiring prior permission from national regulators to a regime of permissionless innovation (Thierer, 2014). As one Internet technologist put it, “The Internet was not designed to recognize national boundaries. It’s not being rude – they just weren’t relevant” (Daigle, 2013).
Of course, the technology also enables its users to opt out of any particular exchange of information. It provides all kinds of means by which those who are not willing to accept packets from others can block them. And that is where the quasi-religious fervor for global compatibility meets its moderating principle.
It has become a cliché to note that the “unified and unfragmented space” created by the victory of the Internet protocols was filled not only with innovative economic and social activity, but also with the crimes and conflicts that accompany human interactions in every other space. Along with the innovations, efficiencies, and creative new forms of entertainment and interaction came thieves, bullies, fraudsters, child abusers, spies, vandals. Most of the time, but not always, our services and devices can be configured to restrict these kinds of abuses, but usually only after the fact. Yet those who recite this litany of Internet-related problems rarely pause to ask what makes these problems so distinctive and disruptive.
Internet governance is hard not simply because networked digital devices have created all kinds of new problems, but also because of the mismatch between its global scope and the political and legal institutions for responding to societal problems. The state, law, policies, regulations, and courts are human society’s primary mechanism for handling crime and conflict. But the world of states is not unified and unfragmented. It is territorial and sovereign. There is a fundamental misalignment between a unified cyberspace and the far more fragmented legal and institutional mechanisms humans have devised to govern themselves. The engineers dreaming of global compatibility succeeded in doing their part of the job. They left it to the rest of us to figure out how to devise an institutional response.
Nowhere is the mismatch between global cyberspace and the territorial state more evident than in the domain of cybersecurity. As digital technology penetrates more and more of society, cybersecurity becomes relevant to national security, with all that implies for military power and international relations. For many state actors (and many critical non-state actors), the Snowden revelations merely confirmed what they had suspected all along: the United States has hegemony over the Internet, and all the talk about a globalized free flow of information is nothing but ideological cover for its hegemony. When the US International Strategy for Cyberspace says “The United States supports an Internet with end-to-end interoperability, which allows people worldwide to connect to knowledge, ideas, and one another through technology that meets their needs,” they see not an ideal, but a self-serving rationalization for a US information empire. Whether those accusations are true or not, it is undeniable that cyberspace is increasingly seen as a place where nation-states compete for power (Segal, 2016).
Thus while the Snowden revelations produced the NETMundial’s reassertions of the universality principle, they also produced strong forces pushing in the opposite direction. The solution to NSA spying, some asserted, was to restrict the global flow of information. Require companies to store their users’ data in local jurisdictions; require Internet routing to stay within the borders of the country; require governments or users to rely on local companies rather than foreign companies for services like email and cloud computing. The call for protection was often couched in the language of sovereignty. There are now explicit demands for data sovereignty, technological sovereignty, or various other labels for some kind of governmental or jurisdictional overlay on information networks. They come from Brazil, Germany, the UK, and the European Commission as well as from China, Russia, Saudi Arabia, and Iran. The message is the same: realign states and cyberspace.
It was in this context that “fragmentation” and “Balkanization” became prevailing themes in Internet policy. The New America Foundation’s Sasha Meinrath, a frequent recipient of US State Department funding, complained that “the motivations of those nations questioning America’s de facto control over the global Internet may vary, but their responses are all pointing in the same troubling direction: toward a Balkanized Internet.” A paper funded and published by the World Economic Forum announced, “A growing number of thought leaders in government, the private sector, the Internet technical community, civil society and academia have expressed concerns over the past two years that the Internet is in some danger of splintering or breaking up into loosely coupled islands of connectivity” (Drake, Cerf, and Kleinwachter, 2016). But this problem goes much further back than June 2013. The Snowden revelations merely amplified and exacerbated longstanding tensions between Internet communications and national sovereignty, tensions that had been growing for the preceding 20 years (Mueller, 2010).
Once we understand the critical role that the idea of fragmentation or Balkanization plays in global Internet governance debates, it becomes all the more important to interrogate the concept. What do we really mean by Internet fragmentation, anyway?
Pursuing that question brings us face to face with a central paradox of the Internet. The Internet is a network of networks; a vast collection of independently managed but interoperable information systems. As such, the Internet protocols (coupled with increasingly powerful information processing capabilities) foster both universal interoperability and
