
INTERNET REGULATION: CHOICE OR NEED?

Author:
Int. B.Tech, LL.B (Hons.) with Specialisation in Cyber Laws
Roll: R120211016
2011-2017
DISSERTATION
Submitted under the guidance of: Ms. Akanksha Singh
Assistant Professor
This dissertation is submitted in partial fulfillment of the
degree of B.Tech., LL.B. (Hons.)

College of Legal Studies


University of Petroleum and Energy Studies
Dehradun
2015-1
ABSTRACT

This paper presents an overview of the literature on Internet content regulation in


general and Internet blocking in particular, as part of the research project on Internet
blocking in liberal democracies of the Digital Humanities Research Collaboration at
the Göttingen Centre for Digital Humanities. It starts by presenting the main debates
about Internet regulation and governance of the last twenty years. Scholars of Internet
governance remain divided about the role played by the nation-state in the digital
realm as well as the disruptive potential of the Internet for society and politics in
general. There is, however, broad consensus that new forms of regulation (e.g. "code
as law") have emerged and that private actors play an important part in Internet
regulation. The report then assesses the challenges and opportunities presented by
digital content for policy-makers before reviewing various points and techniques of
control that have been implemented to deal with problematic content. These include in
particular technical blocking, removal of search results and takedown procedures.
The final section of the review then assesses recent legal and empirical
scholarship pertaining to Internet blocking before discussing future research steps.
Introduction

Since the introduction of the World Wide Web and browsers in the early 1990s,
there has been an explosion of content available across state boundaries, in
easily duplicable format, through the Internet. This development was initially
interpreted as a formidable opportunity for democracy and civil liberties. The
Internet has been, and continues to be, perceived as the infrastructure and tool of
global free speech (Mueller, 2010). Many optimists hoped that, free from state
intervention or mainstream media intermediaries, citizens would be better
informed about politics, at lower cost and more efficiently. The need for content
control was however discussed as soon as the Internet became accessible to the
greater public. As with the emergence of previous communication and media
technologies, pressure rapidly built up to demand more control over what type of
content is accessible to whom (Murray, 2007). The regulation of content is
linked to a broader discussion about the regulability of the Internet, which is the
focus of section 1, before turning to content regulation per se in section 2.

1. Internet Regulation
One of the characteristics of the literature on Internet regulation in general and
content regulation in particular is the use of often vague terminologies and
concepts that are not clearly distinguishable and lack direct connections to
empirical foundations (Hofmann, 2012). Authors writing on Internet regulation
do so from a given perspective. In particular, they diverge on two central
aspects: the role of the nation-state and the disruptive potential of the Internet.

The role of the nation-state in regulating the digital realm in comparison with other
actors such as corporations or civil society remains disputed. For some scholars, the
nation-state is the main actor capable of directly or indirectly regulating social
behaviour. For others, the state is one among a variety of competing actors and has
lost its dominance.
Their perspective is generally reflected in the terminology used, focusing on
governance instead of regulation when taking into account a broader set of
actors and processes than interactions centered around the state.

The transformative or disruptive impact digital technologies may have on politics and
society in general divides scholars. Early Internet policy debates have fuelled utopian
and dystopian scenarios. The Internet has been perceived as the instrument of
global free speech on the one side and as a tool leading to a new type of
sophisticated surveillance state on the other. Although nuances run through both the
optimistic and the pessimistic strands of the literature, a recurrent criticism is that
they are based on either social or technological determinism. Such narratives are
emblematic of the emergence of any new technology and not particular to the Internet.
Similar narratives surrounded the emergence of previous technologies such as the
telegraph, the radio or television (Flichy, 2001; Vanobberghen, 2007). Technologies
are socially constructed. They do nonetheless generate social affordances, a term
widely used in human-computer interaction studies and defined as "the
possibilities that technological changes afford for social relations and social
structure" (Wellman, 2001, 228). They hold certain potentialities that can be positive
(new forms of sociability, rapid information transmission, spaces for open
collaboration) as well as negative (lack of control or oversight, reduced privacy,
increased surveillance and cyberattacks). However, in terms of regulation, the
literature remains divided between authors who consider that there is (or should be)
something distinctly different about Internet politics, compared to other policy
fields, and those who consider that Internet regulation is maturing and resembling
more traditional policy fields, similar to the emergence of environment issues as a
new policy field at the end of the twentieth century.

The discussion about the regulation of the Internet has shifted from whether the
Internet can be regulated at all to how it is regulated and by whom. The question
pitted the so-called cyber-libertarians, who contested any exterior assertion of
power, be it by states or other actors (section 1.1), against legal and political
scholars arguing that the Internet was in fact regulated, although through different
regulatory modalities (section 1.2).
For some authors, states continue to play a significant role in these regulatory
arrangements, while there is widespread agreement that the Internet has become an
object of political struggle for states and various other actors alike (section 1.4).
However, the narrowly defined state-centric perspective on Internet regulation has
more recently been criticized as "cyber-conservatism" (Mueller, 2010; DeNardis,
2010) by a third set of scholars interested in the institutional innovations and
broader power dynamics at play in Internet governance (section 1.5).

Cyberlibertarians
Because of the Internet's decentralised and global architecture, early Internet
enthusiasts and cyber-libertarian scholars perceived cyberspace as a social
space beyond territorially-based regulation that should remain free from
governmental or corporate intervention (see for instance Johnson and Post,
1996, and Barlow's famous "Declaration of the Independence of Cyberspace").1
Internet freedom was thought to be hardwired into its technological
infrastructure, as exemplified by the often-quoted phrase "the Net interprets
censorship as damage and routes around it".2 Nation-states in particular were
perceived as illegitimate and powerless actors with no means to enforce state
sovereignty in cyberspace. The only legitimate form of decision-making for
cyberspace would have to be developed organically "with the consent of the
majority of the citizens of cyberspace" (Murray, 2007, 7).

Much has been written about the Internet's open, minimalist and
decentralised architecture that allowed for its rapid success, its integration with any
other computer network and the rapid development of new applications such as
the World Wide Web or email programs. The Internet has been built upon the
end-to-end principle, which stipulates that application-specific functions are
hosted at the endpoints of the network (e.g. servers or personal
computers) instead of at intermediary nodes (e.g. routers). Like postal mail
delivery agents, intermediaries route data packets from one endpoint to
another without needing to know what the datagram contains or will be used
for. The end-to-end principle is central to current net neutrality debates
(see below), which focus on new technological possibilities for
intermediaries to perform certain application-specific functions (e.g.
distinguishing between peer-to-peer file-sharing and video streaming).

1 Barlow, J. P. (8 February 1996). A Declaration of the Independence of Cyberspace. Available at:
https://projects.eff.org/~barlow/Declaration-Final.html. See also "Cyberspace and the American
Dream: A Magna Carta for the Knowledge Age" (Dyson et al., 1996) and "Birth of a Digital Nation"
(Katz, 1997).

2 Quote attributed to John Gilmore, civil liberties advocate and co-founder, together with J. P.
Barlow, of the digital rights organisation Electronic Frontier Foundation (EFF).
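The end-to-end principle lends itself to a minimal sketch: a content-neutral intermediary decides where to send a datagram by reading only its headers, while the payload stays opaque to it. The following Python toy is illustrative only; the Datagram type, addresses and link names are invented for this example and do not correspond to any real router implementation.

```python
from dataclasses import dataclass

@dataclass
class Datagram:
    src: str        # source endpoint address
    dst: str        # destination endpoint address
    payload: bytes  # application data: opaque to intermediaries

# Hypothetical forwarding table mapping destination addresses to outgoing links.
ROUTING_TABLE = {"10.0.0.2": "link-a", "10.0.0.3": "link-b"}

def forward(dgram: Datagram) -> str:
    """End-to-end-respecting intermediary: the forwarding decision
    depends only on the destination header, never on the payload."""
    return ROUTING_TABLE.get(dgram.dst, "default-link")

# Distinguishing, say, file-sharing from video streaming would require
# inspecting dgram.payload inside the network -- precisely the kind of
# application-specific function at stake in net neutrality debates.
print(forward(Datagram("10.0.0.1", "10.0.0.2", b"<opaque data>")))  # link-a
```

The point of the sketch is the signature of `forward`: nothing in the function touches `payload`, so the intermediary cannot discriminate by content even in principle.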

The protocols and standards developed during the 1960s and 1970s by the so-called
Internet founders, academics and government engineers funded by the U.S.
Department of Defense, are still the basis of today's Internet. To protect their
achievements, the founders established a series of institutions, in particular the
Internet Engineering Task Force (IETF) in 1986, to regulate and develop new
standards. These institutions were perceived by cyber-libertarians and many of
the founders as new and better forms of governance under the leitmotiv of
"rough consensus and running code" (Dave Clark, Internet founder, quoted
in Goldsmith and Wu, 2006, 24). Although the cyber-libertarian perspective
was rapidly criticised as technologically deterministic and contrary to empirical
evidence of increased state and corporate control of the Internet's infrastructure and
content, its main tenets and values continue to inform current policy discussions
and the self-regulatory practices that proliferate online.

Code as law and other cyberspace regulations


The cyber-libertarian perspective was rapidly challenged, notably by Joel Reidenberg
(1998), who argued that despite the challenge to territorial borders posed by global
networking, new models and rules would emerge in which the state continues to be
involved. If governments would not necessarily be able to directly regulate
cyberspace, they would at least be able to influence two distinct regulatory borders:
contractual agreements between Internet Service Providers (ISPs) and the network
architecture, in particular technical standards. He referred to "Lex Informatica"
as the possibility of regulating user behaviour by influencing the Internet's underlying
technological infrastructure and system design choices. The concept would
influence the cyber-paternalist position and the future debate about Internet
regulation (Klang, 2005).

The argument that the Internet is not unregulable but in fact regulated by its
architecture was expanded upon in the 1999 publication of U.S. law professor
Lawrence Lessig's seminal book Code and Other Laws of Cyberspace. Lessig
argued that the Internet is not a space free from state intervention and that
computer code, the Internet's underlying infrastructure, provides a powerful
tool for regulating behaviour online (Lessig, 1999, 2006). For Lessig, code is one
of the four modalities of regulation, next to law, social norms and markets. The
latter are institutional constraints, which do not allow for immediate control over
human behaviour and are considered by a large majority of observers as
insufficient to effectively regulate global Internet traffic. Lawsuits are time- and
cost-intensive, and often ineffective when dealing with the change of scale brought
about by the Internet.
Social norms are easily violated and can be manipulated. Market constraints can be
circumvented in many ways, and commercial entities depend on effective
protection by social norms and the legal system for enforcement (Boas, 2006).
Whether Internet design choices are a regulatory modality in themselves or not remains
debated (McIntyre and Scott, 2008). Murray and Scott (2001) have for instance
criticised Lessig's modalities as over- or under-inclusive. They propose to speak
instead of hierarchical (law), community-based (norms), competition-based
(market) or design-based controls.

Nonetheless, it is widely acknowledged that code plays a central role in controlling
user behaviour, often in support of legal arrangements. This focus on "code as
law", or the control of user behaviour through technical design features, has been
widely taken up by scholars who point to the inherent political dimension of the
Internet's infrastructure, made of hardware, cables, standards and protocols (see
for instance DeNardis, 2009, 2012). As we will see, this aspect is also highly
relevant for content control. Contrary to the cyber-libertarian position, which
postulates that freedom was inscribed in the architecture of the Internet and thus
beyond state control, Lessig's position asserted that whoever controlled the code
controlled user behaviour. In practice, this means that private actors, the owners
of the Internet's infrastructure made of hardware and software, play an increasingly
important role in regulating the digital realm, while the state in which they operate
can indirectly control the infrastructure by regulating the intermediaries.

With the turn of the millennium, the discussion has thus clearly shifted from
whether the Internet can be regulated at all to how and by whom it is, and whether
there is anything distinctly new about the phenomenon. Here, scholars remain
divided between those insisting on the dominant role of nation-states in Internet
regulation, pointing to the increasing amount of state legislation directed towards
the digital realm (e.g. Goldsmith and Wu, 2006), and those who argue that more
attention should be paid to the new processes and institutions emerging at the
international level and the key role played by private actors in Internet politics
(e.g. DeNardis, 2009; Mueller, 2010).

Cyberpaternalism or the return of the nation-state

In their 2006 book Who Controls the Internet? Illusions of a Borderless World,
Goldsmith and Wu (2006) recognise that the Internet challenges state power but
argue that since the 1990s, governments across the world have increasingly
asserted their power to make the global Internet more bordered and subject to
national legislation. They provide numerous examples of interactions in which
the nation-state emerged as the dominant actor, starting with the LICRA v.
Yahoo! Inc. (2000) case in France, which led Yahoo! to remove Nazi memorabilia
from its auction website worldwide to comply with French law,3 and the long
interactions between the Internet's

3 In the 2000 ruling LICRA v. Yahoo! Inc., the Tribunal de Grande Instance of Paris exercised
territorial jurisdiction on the grounds that the prejudice caused by the content hosted abroad took
place on French territory. The Court required Yahoo! to prevent French users from accessing Nazi
memorabilia on its auction website. The company complied with the judgement even though a U.S.
District Court judge considered in 2001 that Yahoo! could not be forced to comply with the French
laws, which were contrary to the First Amendment. The ruling was reversed in 2006 by a U.S. Court
of Appeals. For Goldsmith and Wu (2006), the French state could exert pressure upon Yahoo!
because the company held several assets for its operation in France that the French state could have
acted upon should Yahoo! have refused to comply with the French court order.
founding fathers and the U.S. government over the Internet's domain name
system (referred to as "the root", see Mueller, 2002) that eventually led to the
establishment of ICANN and, of course, the establishment of the Chinese "great
firewall" as an illustration of what a government that really wants to control
Internet communications can accomplish (Goldsmith and Wu, 2006, 89). In
sum, "[a] government's failure to crack down on certain types of Internet
communication ultimately reflects a failure of interest or will, not a failure of
power" (ibid.). Similarly, for Deibert and Crete-Nishihata (2012), it was a
conscious decision by the US and other Western states not to directly regulate
the Internet in the early 1990s, leaving operating decisions to the Internet's
engineering community, which functioned on a basis of consensus building and
requests for comments. This was to foster innovation and economic growth at a
time when one could only speculate as to how precisely the Internet would
develop over time.

In fact, the first motivation for Internet regulation was to situate Internet exchanges
within existing legal categories (e.g. is the Internet similar to the telephone or to
broadcasting media?) or to create new categories and, in rare cases, new institutions.
However, in the early to mid-1990s, states largely maintained the status quo ante, either to
protect emerging business models or established governmental practices (Froomkin,
2011, 5).

The Internet As An Object Of Political Struggle


Although the state-centric and cyber-libertarian perspectives are still present in
Internet governance discussions, the dominant perspective nowadays is that the
state is one, albeit important, actor among a variety of stakeholders interested
in shaping the Internet's future. Those stakeholders hold different interests,
norms and values as to how the Internet should develop. The
technical community, which was instrumental in developing the Internet in the first
place, aims to protect the open and decentralised architecture of the Internet from
governmental or corporate encroachments.4

4 Internet engineers remain the principal decision-makers over the Internet's critical resources, most
notably the domain name system through ICANN, but also in the domain of standards setting (e.g. the
IETF). Those resources are heavily contested but, notably due to the protection of the U.S.
Private actors are increasingly important in shaping the Internet's development.
Some corporations have based their business model on the relatively unregulated
environment of the 1990s and early 2000s, with limited intermediary liability (e.g. for
Internet service providers (ISPs) or online content providers (OCPs)), and
have succeeded in monetizing their Internet activities principally through paid, and
increasingly targeted, advertisement (e.g. Google or Facebook). Other actors, for
instance the entertainment industry, have been severely challenged by new practices
developing online, such as the widespread sharing of copyrighted material. Attempts to
roll back piracy have generally led to further technological developments such
as peer-to-peer technologies (Musiani, 2011). Other actors increasingly shift
their activities to the Internet, bringing with them different norms and interests than
those of the early Internet communities (Deibert and Crete-Nishihata, 2012;
Rasmussen, 2007; Castells, 2001). States, which are driven by security and
public order concerns, are by no means in agreement over who should control the
Internet, although all recognise the fundamental importance
of the network as a global communication infrastructure and business
opportunity. Finally, a broad transnational movement of NGOs, associations,
information groups and individuals has emerged over the years, largely in response to
regulatory attempts to introduce more control of the network of networks but also, at
times, to demand that governments intervene in business practices that are considered
to harm the end-to-end principle at the basis of the Internet (Mueller et al.,
2004; Breindl and Briatte, 2013; Haunss, 2011; Löblich and Wendelin, 2012).

Through digital convergence, most information and communication
activities have shifted to the Internet. The separate legal and regulatory
instruments that governed entertainment consumption, print and broadcast
media, libraries, public information delivery and so on are now converging on
the Internet, encompassing "the entirety of communication and information
policy" (Mueller, 2010, 10). States alone are not able to regulate the Internet
without at least relying on private actors to assist them. Many Internet issues
extend beyond national borders, making coordinated action necessary.

4 (cont.) government, a far-reaching reform of the U.S.-centred domain name system has so far been
avoided, although concessions have been made to other governments and private actors (Mueller, 2002).

WHY WE SHOULDN'T REGULATE THE INTERNET

Many people argue that it would be wrong to attempt to regulate the Internet and advance arguments such as
the following:

The Internet was created as a totally different kind of network and should be a free space. This
argument essentially refers back to the origins of the Net, when it was first used by the military as an
open network designed to ensure that the communication always got through, and then by academics
who largely knew and trusted each other and put a high value on freedom of expression.

The Internet is a 'pull' not a 'push' communications network. This argument implicitly accepts that it
is acceptable, even necessary, to regulate content which is simply 'pushed' at the consumer, such as
conventional radio and television broadcasting, but suggests that it is unnecessary or inappropriate
to regulate content which the consumer 'pulls' to him or her, such as by surfing or searching on the
Net.

The Internet is a global network that simply cannot be regulated. Almost all content regulation is
based on national laws and conventions and of course the Net is a worldwide phenomenon, so it is
argued that, even if one wanted to do so, any regulation of Internet content could not be effective.

The Internet is a technically complex and evolving network that can never be regulated. Effectively
the Web only became a mass medium in the mid 1990s and, since then, developments - like Google and
blogging - have been so rapid that, it is argued, any attempt to regulate the medium is doomed.

Any form of regulation is flawed and imperfect. This argument rests on the experience that
techniques such as blocking of content by filters have often been less than perfect - for instance,
sometimes offensive material still gets through and other times educational material is blocked.

WHY WE SHOULD REGULATE THE INTERNET

However, there are strong arguments in favour of some form of regulation of the Internet, including the
following:

The Internet is fundamentally just another communications network. The argument runs: if we
regulate radio, television, and telecommunications networks, why don't we regulate the Net? This
argument suggests that not only is the Internet, in a sense, just another network; as a result of
convergence it is essentially becoming the network, so that, if we do not regulate the Net at all,
effectively over time we are going to abandon the notion of content regulation.

There is a range of problematic content on the Internet. There is illegal content such as child abuse
images; there is harmful content such as advice on how to commit suicide; and there is offensive
content such as pornography. The argument goes that we cannot regulate these different forms of
problematic content in the same way, but equally we cannot simply ignore it.

There is criminal activity on the Internet. Spam, scams, viruses, hacking, phishing, money
laundering, identity theft, grooming of children... almost all criminal activity in the physical
world has its online analogue and again, the argument goes, we cannot simply ignore this.

The Internet now has users in every country, totalling several billion. This argument implicitly
accepts that the origins of the Internet involved a philosophy of free expression but insists that the
user base and the range of activities of the Net are now so fundamentally different that it is a mass
medium and needs regulation like other media.

Most users want some form of regulation or control. The Oxford Internet Survey (OxIS) published
in May 2005 had some typical indications of this. When asked if governments should regulate the
Internet, 29% said that they should. When asked who should be responsible for restricting children's
content, 95% said parents, 75% said ISPs and 46% said government.

REGULATING ILLEGAL CONTENT

It is a major proposition of this presentation that any sensible discussion of regulation of the Internet needs
to distinguish between illegal content, harmful content, and offensive content. I now deal with these in turn.

In the UK, effectively illegal content is regulated by the Internet Watch Foundation through a self-regulatory
approach.

What is the nature of the IWF?

It was founded by the industry in late 1996 when two trade bodies - the Internet Service Providers'
Association (ISPA) and the London Internet Exchange (LINX) - together with some large players
like BT and AOL came together to create the body.

It has an independent Chair selected through open advertisement and appointed by the Board.

The Board consists of six non-industry members selected through open advertisement and three
industry members chosen by the Funding Council.
There is a Funding Council which has on it representatives of every subscribing member.

The IWF has no statutory powers. Although in effect it is giving force to certain aspects of the
criminal law, all its notices and advice are technically advisory.

The IWF has no Government funding, although it does receive European Union funding under the
Commission's Safer Internet plus Action Plan

Although not a statutory body and receiving no state funding, the IWF has strong Government
support as expressed in Ministerial statements and access to Ministers and officials.

The IWF has a very specific remit focused on illegal content, more specifically:

images of child abuse anywhere in the world

adult material that potentially breaches the Obscene Publications Act in the UK

criminally racist material in the UK.

The IWF has been very successful in fulfilling that remit:

The number of reports handled has increased from 1,291 in 1997 to 23,658 in 2005.

The proportion of illegal content found to be hosted in UK has fallen from 18% in 1997 to 0.3% in
2005.

The number of funders has increased from 9 in 1997 to 60 in 2005.

No Internet Service Provider has ever been prosecuted and the reputation of the ISP community has
been greatly enhanced.

Then Prime Minister Tony Blair described the IWF as "perhaps the world's best regime for tackling
child pornography".

How is illegal material removed or blocked under the IWF regime?

There is a 'notice and take down' procedure for individual images which are both illegal and hosted in
the UK.

The IWF compiles a list of newsgroups judged to be advertising illegal material and recommends to
members that these newsgroups not be carried. About 250 newsgroups are 'caught' by this policy.
The IWF compiles a list of newsgroups known regularly to contain illegal material and again
recommends to members that these newsgroups not be carried. A small number of additional
newsgroups are 'caught' by this policy.

Most recently and most significantly, ISPs are blocking illegal URLs using the IWF's child abuse
image content (CAIC) database and technologies like BT's Cleanfeed. The number of URLs on
this list - which is updated twice a day - is between 800 and 1,200.
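The URL-list blocking described above can be sketched as a lookup. Cleanfeed is publicly documented as a two-stage hybrid: traffic to suspect hosts is first diverted to a proxy, which then compares the full URL against the confidential list, so most traffic is never inspected at all. The Python below is an illustration of that two-stage idea only; the blocklist entries, domain names and function name are invented placeholders, since the actual CAIC list is confidential.

```python
from urllib.parse import urlsplit

# Invented placeholder entries; the real CAIC list is confidential.
BLOCKLIST = {("bad.example.org", "/abuse/page1")}

# Stage 1: a coarse set of suspect hosts, so most traffic bypasses inspection.
SUSPECT_HOSTS = {host for host, _ in BLOCKLIST}

def check_request(url: str) -> str:
    parts = urlsplit(url)
    if parts.hostname not in SUSPECT_HOSTS:
        return "pass"      # stage 1: destination not suspect, never inspected
    if (parts.hostname, parts.path) in BLOCKLIST:
        return "blocked"   # stage 2: exact URL match on the proxy
    return "pass"          # same host, different page: allowed through

print(check_request("http://bad.example.org/abuse/page1"))  # blocked
print(check_request("http://bad.example.org/other"))        # pass
```

The two-stage design matters in practice: checking every URL on the network path would not scale, while blocking whole hosts would over-block lawful pages hosted on the same server.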

The problem now for the IWF - and indeed for the other such hotlines around the world - is abroad, more
specifically:

United States - the source of 40% of illegal reports in 2005

Russia - the source of 28% of illegal reports in 2005

Thailand, China, Japan & South Korea - the source of 17% of illegal reports in 2005

In early 2005, a study by the International Centre for Missing and Exploited Children (ICMEC) in the
United States found that possession of child abuse material is not a crime in 138 countries and that, in 122
countries, there is no law dealing with the use of computers and the Internet as a means of distribution of
child abuse images. So the UK needs the cooperation of other
governments, law enforcement agencies and major industry players if we are to combat and reduce the
availability of child abuse images in this country and around the world.

Since the IWF's remit is illegal material, there are some areas of the law which might be amended in
ways that would suggest a minor extension to the IWF's existing remit, specifically:

The proposed new law on possession of extreme adult pornographic material

The proposed new law on incitement to religious hatred

A possible review of the law on incitement to racial hatred

A possible review of the law on protection of minors in relation to adult pornographic material

A possible review of the law on the test of obscenity in relation to adult pornographic material

However, the IWF has absolutely no intention or wish to engage with harmful or offensive content, so the
proposals that now follow are my personal suggestions for discussion and debate.

REGULATING HARMFUL CONTENT


It is my view that currently there is Internet content that is not illegal in UK law but would be regarded as
harmful by most people. It is my contention that the industry needs to tackle such harmful content if it is to
be credible in then insisting that users effectively have to protect themselves from content which, however
offensive, is not illegal or harmful. Clearly it is for Government and Parliament to define illegal content. But
how would one define harmful content?

I offer the following definition for discussion and debate: "content the creation or viewing of
which involves, or is likely to cause, actual physical or possible psychological harm". Examples of material
likely to be caught by such a definition would be incitement to racial hatred or to acts of violence and the
promotion of anorexia, bulimia or suicide.

Often when I introduce such a notion into the debate on Internet regulation, I am challenged by the question:
how can you draw the line? My immediate response is that, in this country (as in most others), people are
drawing the line every day in relation to whether and, if so, how and when one can hear, see, or read various
forms of content, whether it be radio, television, films, videos & DVDs, or newspapers & magazines.
Sometimes the same material is subject to different rules - for instance, something unacceptable for
broadcast at 8 pm might well be permissible at 10 pm, or a film which is unacceptable for an '18' certificate
in the cinema might receive an 'R18' classification in a video shop.

Therefore I propose in relation to Internet content that we consult bodies which already make judgements on
content about the creation of an appropriate panel. Such bodies would include the Ofcom Content Board, the
BBC, the Association for Television On Demand (ATVOD), the British Board of Film Classification
(BBFC), and the Independent Mobile Classification Body (IMCB). I would suggest that we then create an
independent panel of individuals with expertise in physical and psychological health who would draw up an
agreed definition of harmful content and be available to judge whether material referred to them did or did
not fall within this definition.

What would one do about such harmful content?

There should be no requirement on ISPs to proactively monitor content to which they are providing
access in order to determine whether it is harmful.

Reports of suspected material from the public should be submitted to a defined body.

This body should immediately refer this material to the independent panel which would make a
judgement and a recommendation as to whether it was in fact harmful.

A database of sites judged to be harmful should be maintained by the defined body.

ISPs should be invited to block access to such sites on a voluntary basis.


Each ISP should be transparent about its policy in relation to blocking or otherwise of such content
and set out its policy in a link from the homepage of its web site, as many sites do now in relation to
privacy policy.
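The voluntary blocking arrangement outlined in the steps above can be sketched in code. This is purely illustrative: the database contents, function names and messages are all invented here, and no real scheme or ISP system is being described.

```python
# Illustrative sketch of the voluntary blocking step: an ISP that has opted
# into the scheme consults the defined body's database of sites judged
# harmful before answering a lookup. All names and data are hypothetical.

# Database of sites judged harmful, as maintained by the defined body.
HARMFUL_SITES = {"example-harmful.test", "another-harmful.test"}

def resolve(domain: str, participates_in_scheme: bool) -> str:
    """Return a notional resolver decision for one lookup."""
    if participates_in_scheme and domain in HARMFUL_SITES:
        # A participating ISP blocks the lookup and points the user to the
        # page where it publishes its blocking policy, as proposed above.
        return "BLOCKED: see this ISP's content policy page"
    return "RESOLVED: " + domain

print(resolve("example-harmful.test", participates_in_scheme=True))
print(resolve("example-harmful.test", participates_in_scheme=False))
```

Because blocking is voluntary, the same domain resolves normally for an ISP outside the scheme, which is why the transparency obligation on each ISP's web site matters.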

REGULATING OFFENSIVE CONTENT

Once we have effective regimes for illegal and harmful content respectively, one has to consider that
material which is offensive - sometimes grossly offensive - to certain users of the Internet. This is content
which some users would rather not access or would rather that their children not access.

Now identification of content as offensive is subjective and reflects the values of the user who must
therefore exercise some responsibility for controlling access. The judgement of a religious household would
probably be different from that of a secular household. The judgement of a household with children would
probably be different from that of one with no children. The judgement of what a 12 year old could access
might well be different from what it would be appropriate for an 8 year old to view. Tolerance of sexual
images might be different from tolerance of violent images.

It is my view that, once we have proper arrangements for handling illegal and harmful content, it is
reasonable and right for government and industry to argue that end users themselves have to exercise control
in relation to material that they find offensive BUT we should inform users of the techniques and the tools
that they can use to exercise such control. What are such techniques and tools? They include:

Labelling of material through systems such as that of the Internet Content Rating Association (ICRA)
- The ICRA descriptors were determined through a process of international consultation to establish a
content labelling system that gives consistency across different cultures and languages.

Rating systems drawn up by third parties such as parents' or children's organisations - Whereas
labelling should as far as possible be value-free, rating systems act on those labels to express a value
judgement that should be explicit, so that users of the system know what kinds of material are likely
to be blocked.

Filtering software of which there are many different options on the market - The European
Commission's Safer Internet Programme has initiated a study aiming at an independent assessment of
the filtering software and services. Started in November 2005, the study will be carried out through
an annual benchmarking exercise of 30 parental control and spam filtering products or services,
which will be repeated over three years.

Search engine 'preferences' which are unknown to most parents - Google, the most used search engine,
has the word 'preferences' in tiny text to the right of the search box and clicking on this reveals the option
of three settings for what is called 'SafeSearch Filtering', yet this facility is virtually a secret to most
parents.

Use of the 'history' tab on the browser which again is unknown to many parents - This is a means for
parents to keep a check on where their young children are going in cyberspace, although there has to
be some respect for the privacy of children.

Promotion of education, awareness and media literacy programmes - Section 11 of the
Communications Act 2003 provides that Ofcom has a duty to promote media literacy and the
Department for Culture, Media & Sport (DCMS) has granted £500,000 a year for this purpose, but a
very wide range of organisations have a role to play in the promotion of such programmes.

Of course, it would help parents and others with responsibility for children if they could buy a PC with
filtering software pre-installed and set at the maximum level of safety and if the default setting for all web
browsers was a child-safe mode. Then adult users of such hardware and software would have the
opportunity, when they wished, to switch to a less filtered or completely open mode.
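The division of labour described above - value-free labels attached to content, acted on by a value-laden rating policy chosen in the household - can be sketched as follows. The category names, levels and policy values here are all hypothetical; real systems such as ICRA's used far richer vocabularies.

```python
# Sketch of label-based filtering: a publisher attaches value-free
# descriptors; a household applies a third-party rating policy to them.
# Categories and levels are invented for illustration.

# Value-free descriptors a publisher might attach to a page.
page_labels = {"nudity": 2, "violence": 0, "gambling": 1}

# A rating policy: explicit value judgements about the maximum label
# level acceptable in this household for each category.
family_policy = {"nudity": 0, "violence": 1, "gambling": 0}

def allowed(labels: dict, policy: dict) -> bool:
    """Allow the page only if no labelled category exceeds the policy's ceiling."""
    return all(labels.get(category, 0) <= limit
               for category, limit in policy.items())

print(allowed(page_labels, family_policy))  # this page exceeds the ceilings
```

The point of the separation is that the same labels can serve many different policies: a secular and a religious household, or parents of a 12 year old and of an 8 year old, simply supply different ceilings.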

WHAT ELSE WOULD HELP?

Looking at Internet content generally, what else would help? Let me make a few final suggestions:

There should be more proactive media comment and debate by spokespersons who can and will speak
for the industry as a whole rather than simply for particular companies or products. Too often debates
in the media are ill-informed or unbalanced because the industry does not engage as effectively as it
could.

There should be a user-friendly, well-publicised web site offering advice on the full range of Internet
content issues. The 'Get Safe Online' initiative is a useful start here.

There should be a UK body like the Internet Rights Forum in France with government, industry and
civil society representation. This could discuss problems of Internet content, stimulate wider debate,
and make recommendations.

The Internet Governance Forum (IGF) set up by the United Nations, following the World Summit on
the Information Society in Tunis, should become a genuinely useful focus for global discussion of
issues of Internet content.

In the medium term, we are going to see:


a blurring of linear ('push') and non-linear ('pull') material through developments like video on
demand (VoD), near VoD, the personal video recorder (PVR), BBC's iPlayer and Internet Protocol
Television (IPTV).

a convergence of broadcasting and the Internet both technologically and commercially.

more Net users using higher speeds and spending more time and doing more things online.

THE IMPLICATIONS FOR BROADCASTING

If we can determine an acceptable regime for regulating Internet content, this will raise the question of why
we regulate broadcasting content so differently. Indeed, why do we regulate broadcasting at all?

Historically there have been three reasons:

1. Broadcasting has used scarce spectrum and, in return for free or 'cheap' spectrum, broadcasters have
been expected to meet certain public service broadcasting obligations and this requirement has
provided leverage to regulators to exercise quite strong controls on content. BUT: increasingly
broadcasting is not done using scarce spectrum; it uses satellite, cable or simply the telephone lines
with technologies like ADSL.

2. Broadcasting has been seen as having a special social impact because so many people watched the
same programme at the same time. BUT: increasingly with multi-channel television and time-shifting
devices like the VCR and the PVR, any given broadcast is probably seen by a relatively small
proportion of the population.

3. Broadcasting has been seen as a 'push' technology over which viewers had little control once they
switched on the television set. BUT: increasingly viewers are 'pulling' material through the use of
VCRs, PVRs, video on demand, near video on demand, podcasting, and so on.

Therefore it is possible to argue that the historic reasons for regulating broadcasting in the traditional ways
are fast disappearing. In these circumstances, one could well argue that broadcasting should not be regulated
much differently from how we regulate the Internet or at least how we might regulate the Internet in the
manner proposed in this article.

Therefore regulation of broadcasting would focus on illegal and harmful content, leaving offensive content
as a matter essentially for viewers to block if they thought that appropriate for their family. This would
suggest a convergence of the regulation of broadcasting and the Internet to a model which, compared to the
present situation, would involve a lot less regulation for broadcasting and a bit more regulation for the
Internet.
Two issues are crucial here:

This could not be done overnight. Consumers have strong expectations regarding broadcasting
regulation and these expectations would have to be managed through some kind of transitional
process.

We could not simply abandon most broadcasting regulation without empowering viewers to make
informed choices through the provision of proper tools and better media literacy.

Institutional change and multi-stakeholderism


Authors such as Milton Mueller (2010) consider that the only solution to current Internet
issues such as intellectual property rights, cybersecurity, content regulation and critical
Internet resources, one that would preserve the open and disruptive character of the Internet, is
institutional change in the way communication and information technology has been
regulated so far. He therefore speaks about Internet governance at the international level to
highlight 'the coordination and regulation of interdependent actors in the absence of an
overarching political authority' (Mueller, 2010, 8, emphasis in original). For Mueller
(2010, 4) the Internet challenges the nation-state because communication takes place at a
global scope and at unprecedented scale while control is distributed, meaning that decision-
making units over network operations are no longer closely aligned with political
units; new institutions have emerged to manage the Internet's critical resources (e.g.
domain names, standards and protocols) beyond the established nation-state system, while
dramatically lowering the costs for collective action, thus allowing new transnational
networks of actors and forms of activism to emerge. Similarly, for Braman (2009) the
increasing importance of information policy manifests and triggers profound changes in
how the state and governance function. Old categories need therefore to be
reassessed in light of Internet governance and more fluid forms of decision-making.

To deal with Internet issues at an international level, the United Nations launched the
World Summit on the Information Society (WSIS) in 2002, which opposed proponents of a
state-centric regulatory regime to supporters of a 'more open, pluralistic, and
transnational policy-making framework' (Mueller, 2010, 10). Civil society
and private businesses in particular demanded to be integrated into the discussion, asking for more
multi-stakeholder participation (Mueller, 2010). The Summits held in Geneva in
2003 and Tunis in 2005 resulted in a series of declarations, action plans and agendas.5
The WSIS opposed the United States, defending its unilateral control of
ICANN, 'one of the few globally centralised points of control on the Internet'
(Mueller, 2010, 61), to Europe, on the one side, and emerging countries, on the
other, both demanding more influence over the domain name system and
Internet governance in general. The Tunis Agenda explicitly praised the role of
the private sector in the Internet's daily operating decisions, but also paved the
way for a long-term reform of ICANN (for more information see Mueller, 2010)
and mandated the creation of a non-binding, multi-stakeholder forum to discuss
Internet governance issues on an annual basis. Since then, seven Internet
Governance Forums (IGFs) have taken place in various locations, offering a
unique, yet non-binding, platform for discussion and dialogue about Internet-
related issues by a broad range of stakeholders. However, the IGFs, in which any
actor can participate, are repeatedly criticised for not resulting in concrete
outcomes, with several critics turning to other forums reserved to member states,
such as the International Telecommunication Union (ITU), a UN agency,
to defend their interests in global Internet politics, notably in the domain
of Internet content regulation.

Governments have attempted on several occasions to use the established inter-
governmental process of the ITU to gain more influence over the U.S.-
dominated critical Internet resources, in particular the domain name system,
and more recently by attempting to extend the ITU's competencies to Internet
policy at the World Conference on International Telecommunications (WCIT)
in December 2012, where the ITU's 1988 International Telecommunication
Regulations (ITRs) were debated by all member states. The negotiations were
heavily criticised for being closed to non-governmental stakeholders and non-
transparent, and for potentially leading to more restrictive Internet policies,
especially by authoritarian regimes. The most contentious amendments were
vaguely formulated, leaving space for various interpretations in national law that

5 The Geneva meeting adopted the Declaration of Principles: Building the Information Society: A
Global Challenge in the New Millennium (2003); The Tunis meeting led to the Tunis Agenda for the
Information Society (2005).
could be used as a legitimation by authoritarian states to filter political content
and crack down on opponents. Several Western and African states refused to
sign the final declaration, with the U.S. being the most vocal defender of Internet
freedom, stating that:

The Internet has given the world unimaginable economic and social benefit during
these past 24 years. All without UN regulation. We candidly cannot support an ITU
Treaty that is inconsistent with the multi-stakeholder model of Internet governance.6

The controversial negotiations resulted in a split between Western states, which
refused to sign the treaty, and emerging economies, in particular Russia and China,
which demanded more state control over Internet regulation and signed the final
treaty. While the present system is far from ideal, it is nonetheless perceived by most
Western states as the best possible solution to protect their interests (and the
interests of their IT industries) and to prevent authoritarian states from interfering
directly in Internet regulation. Nonetheless, various commentators have
rejected the apparent opposition between the freedom-protecting West and the
authoritarian, repressive East by pointing to the fact that maintaining the status quo
perpetuates the dominance of the U.S. and U.S. business interests (e.g. Google
accompanied the WCIT with a particularly vocal campaign to defend Internet
freedom) in Internet governance. Poorer countries can only voice their positions
at intergovernmental conventions such as the ITU. As it is, they are in effect
excluded from gaining any weight in Internet regulation, with some countries
even arguing that the U.S. uses denial of Internet services, among other forms of
sanctions, for policy leverage. Also, the U.S. defence of Internet freedoms at
the international level stands in sharp contrast to a series of domestic policies,
adopted in the name of security, that increase control over networks while possibly
reducing citizens' freedoms.7 Not surprisingly, many of these policies deal with
Internet content regulation.

6 Terry Kramer, head of the US delegation to WCIT, quoted in Arthur, Charles (14 December 2012).
'Internet remains unregulated after UN treaty blocked', The Guardian, available at:
http://www.guardian.co.uk/technology/2012/dec/14/telecoms-treaty-internet-unregulated?INTCMP=SRCH

7 See for instance: Powell, Alison (20 December 2012). 'A sticky WCIT and the battle for control of the
Internet', Free Speech Debate, available at: http://freespeechdebate.com/en/discuss/a-sticky-wcit-and-the-
battle-for-control-of-the-Internet/; Mueller, Milton (13 December 2012). 'What really happened in Dubai?',
Internet Governance Project, available at:
http://www.Internetgovernance.org/2012/12/13/what-really-happened-in-dubai/.
2 Digital content as a new policy problem

The early literature on Internet content regulation has primarily focused on the
tension between online content regulation and human rights, in particular
freedom of expression and privacy, and constitutional principles such as the rule
of law. Especially state-led initiatives were interpreted as censorship and
critically analysed by freedom of expression advocates, computer scientists and
legal scholars. A second wave of literature focused essentially on authoritarian
states to document how countries such as China or Iran started to build
national firewalls and sophisticated filtering systems (Deibert et al., 2008;
Clayton et al., 2006; Wright et al., 2011). The spread of information control
systems throughout the world, including in Western democracies (Deibert and
Crete-Nishihata (2012, 341) speak of a 'norm regression' to designate the
transition from the belief in the Internet as a freedom technology to the
increasing demands for more information controls), has more recently led to
the emergence of a more empirical, sometimes apolitical, literature that
views Internet blocking not as the exception but rather as a global
norm in emergence (Deibert et al., 2010, 2011a; McIntyre, 2012).

Internet content regulation is one of the drivers of Internet governance
(Mueller, 2010). As various authors have noted, making available content such
as pornography or copyrighted material has in fact significantly contributed
to driving Internet growth and, in response, led to increasing efforts to control
and regulate the Internet (Johnson, 1997; Zittrain, 2003; Murray, 2007). It has
indeed been this type of content that was the object of early concern, notably
because of the characteristics of digital content and its intrinsic relationship to
freedom of expression. We will therefore first examine the particularities of
digital content (section 2.1) before presenting the evolution of content controls
through different points and techniques of control (section 2.2) to finally assess
recent empirical research (section 2.3).

Characteristics Of Digital Content

The Internet's role in providing access to information and facilitating global free
expression has been repeatedly underlined by commentators, politicians (e.g.
Clinton's 'Freedom to Connect') and institutional reports (e.g. Dutton et al.,
2011; La Rue, 2011; OECD, 2011). However, the borderless nature of
information exchanges conflicts with the body of pre-existing regimes on
information and content regulation that have been established at the national
level. Attempts to harmonise these regulatory regimes often lead to conflicts,
especially since information, and the control thereof, gains in strategic and
economic importance (Busch, 2012).

Although all democratic countries protect freedom of expression through a
series of national and international legal instruments, each country holds a
margin of appreciation to introduce speech-based restrictions into its laws
(Akdeniz, 2010). Countries have 'differing human rights approaches' (Brown
and Marsden, 2013, 204). Especially in Europe, freedom of expression has
never been an inalienable right but is balanced against other rights, such as
the respect of privacy or public order (Akdeniz, 2010; Zeno-Zencovich, 2009). While
freedom of expression was long associated with freedom of the press and
the mass media in general, resulting in a series of regulations of these sectors,
the convergence of broadcasting and telecommunications and the emergence of
the Internet have fundamentally altered the situation. The Internet has in effect
'resurrected the notion of freedom of expression as an individual liberty'
(Zeno-Zencovich, 2009, 100), meaning that any actor can express himself freely
on the Internet and potentially reach a broad audience. The question thus
remains 'whether and to what extent any regulation might be desirable or
necessary' (Zeno-Zencovich, 2009, 103). Furthermore, scholars wonder whether
technology-based forms of content control are proportionate to and respectful
of human rights protections, in particular freedom of expression and privacy.

Online content differs from previous types of content in its digital nature. danah
boyd (2008, 2009) distinguishes five 'by default' properties of digitised
content: digital content is persistent, replicable, scalable, searchable
and (de)locatable. Online messages are automatically recorded and archived.
Once content is placed online, it is not easy to remove. Digital content can be
easily duplicated: for example, over a thousand mirror sites emerged within a week after
WikiLeaks' web hosting contract was terminated by Amazon Cloud Services
after the publication of U.S. diplomatic cables in 2010 (Brown and Marsden,
2013; Benkler, 2011, see below). Digital copies are perfect copies of the
original, contrary to the products of previous recording technologies such as the
tape recorder. The potential visibility of digital content is high. Digital content
can be easily transferred to almost anywhere on the globe in a matter of
seconds. The emergence of search engines allows users to access content but
also provides a new opportunity to filter which content is found, depending on the
algorithm used. Finally, mobile technologies dislocate us from physical
boundaries while at the same time locating us in geographical spaces. Content is
accessible in ever more places, yet technologies are increasingly constructed to
locate us in space and propose location-based content. These properties are not
only socially significant, as shown in boyd's research, but also politically, in that
they introduce new social practices and policy responses that may or may not
challenge existing economic, legal and political arrangements.

Previous technologies transmitted content to more readily confined
geographical areas (Akdeniz, 2010), often through more centralised institutions
that could more easily be controlled by governments. By contrast, the
Internet's architecture is highly decentralised, in the hands of private actors,
notably due to the liberalisation of the telecommunication industry in the 1980s
and 1990s in many liberal democracies (Mueller, 2010), and interconnected at the
global level. For Brown and Marsden (2013, 176), 'data packets can take a wide
range of routes from sender to recipient, reducing the ability of intermediate
points to block communications'. In sum, the Internet's architecture empowers
the periphery over the centre of the network (Froomkin, 2011). Nation states
and content producers experience a loss of direct control over information flows
(DeNardis, 2012).

While the Internet has challenged state sovereignty and oversight over content
control, Internet technologies also offer new possibilities of control by
automating the monitoring, tracking, processing and filtering of large amounts
of digital data. Although the Internet has often been praised for its decentralised
nature, removing the gatekeeping function of intermediaries, certain nodes (e.g.
routers or servers) are increasingly taught to distinguish particular types of
content, be this for reasons of managing network congestion, dealing with
security threats, developing for-profit services or restricting access to certain
kinds of content (Bendrath and Mueller, 2011; Fuchs, 2012). In fact, many of
the technologies used to block Internet content can be used for both valid and
undemocratic purposes. They constitute so-called 'dual use' technologies, often
produced by the cybersecurity industry of Western states, and have been rapidly
adopted by authoritarian regimes (e.g. China built its Great Firewall using the
U.S. company Cisco's technologies, see Goldsmith and Wu, 2006). Surveillance
technologies used for law enforcement or traffic management purposes in
Western democracies especially are invariably exported to authoritarian regimes,
where they are employed against activists and citizens in violation of human
rights protections (Brown and Korff, 2012).

The Open Net Initiative (ONI), a consortium of universities and private
institutions, emerged in 2002 to map content restrictions across the world. Since
2006, it has mapped content-access control on the Internet in 70 states,
probed 289 ISPs within those states, and tested Web access. ONI identifies
four phases of Internet access and content regulation, three of which are the titles
of ONI's Access books (Deibert et al., 2008, 2010, 2011a):

THE OPEN COMMONS phase lasted from the 1960s to 2000. Cyberspace was perceived as
distinct from offline activities and was either ignored or only lightly regulated.
The period was marked by the belief in an open Internet that enabled global
free speech.

THE ACCESS DENIED phase, from 2000 to 2005, was marked by states
increasingly erecting filters to prevent their citizens from accessing certain
types of content. China in particular emerged as the poster child of content
restrictions by building a highly sophisticated filtering regime
that covers a wide range of content. These controls are either based on
existing regulations or on new legal measures.

THE ACCESS CONTROLLED phase, from 2005 to 2010, saw states develop more
sophisticated forms of filtering, designed to be more flexible and offensive
(e.g. network attacks) to regulate user behaviour, including through
registration, licensing and identity regulations to facilitate online
monitoring and promote self-censorship. 'The salient feature of this phase is
the notion that there is a large series of mechanisms (including those that are
non-technological) at various points of control that can be used to limit and
shape access to knowledge and information' (Deibert et al., 2011b, 10).
Filtering techniques are situated at numerous points of the network; controls
evolve over time and can be limited to particular periods such as elections or
political turmoil. Egypt's complete disconnection from the Internet in January
2011 represents the most extreme form and has triggered wide debates about
state-controlled 'kill switches' (see for instance Wu, 2010). To achieve
more fine-grained controls, states need to rely on private actors through
informal requests or stricter regulation.

ACCESS CONTESTED is the term used for the fourth phase from 2010 onwards,
during which the Internet has emerged as a battlefield of competing
interests for states, private companies, citizens and other groups.
Democratic states are increasingly vocal in wanting to regulate the
Internet. 'The contest over access has burst into the open, both among
advocates for an open Internet and those, mostly governments but also
corporations, who feel it is now legitimate for them to exercise power
openly in this domain', write Deibert et al. (2011b, 14). A wide variety of
groups recognise the growing ubiquity of the Internet in everyday life and
the possible effects of access controls, with some openly questioning the
open standards and protocols that were thought to have been achieved for good in
the 1960s and 70s. The foundational principles of an open and decentralised
values at all stages of decision-making both within states and in the
international realm.
Conflicts about online information and content are highly diverse, ranging from
privacy to copyright to freedom of expression to security issues. The fight against
child pornography or child abuse images has been one of the main reasons for
introducing stricter content controls and block lists, especially in liberal
democracies.8 Tools that are often associated with the diffusion of illegal content,
such as peer-to-peer file-sharing or circumvention tools, have equally become the
target of censors (Deibert et al., 2008). States are far from the only actors
showing an interest in controlling Internet content, especially since content owners
increasingly invest in network operating facilities. The emerging conflicts generally
oppose consumers and producers of information who hold diverging interests in
controlling or restricting the propagation of information (Busch, 2012). The
following section offers a short overview of the evolution of content regulation.

From Endpoint To Bottleneck Regulation


The early 1990s, ONI's 'open commons' phase, was essentially a period
of no or very limited state intervention, in which governments, especially in
liberal democracies, adopted a laissez-faire approach towards the nascent network to
leave space for innovation and new developments (regulation was thus mainly user or
market driven). In the domain of spam control, this type of self-regulatory system
proved to be very effective as spam lists were edited collaboratively online and used
by email systems to detect unwanted messages (Mueller, 2010). However, since
the early 2000s, states have increasingly asserted their control online (Deibert and
Crete-Nishihata, 2012; Deibert et al., 2010).

As stated previously, much attention has been paid to the filtering of
Internet access by authoritarian regimes such as China or Iran (Deibert et al.,
2008, 2010; Boas, 2006). However, liberal democracies, which base their
legitimacy on the protection of civil liberties such as freedom of expression and
the rule of law, are also increasingly considering technological solutions

8 Sexual representations of children are frequently referred to as child pornography, a term rejected by
child protection associations and police forces as hiding the child abuse behind those images.
to content regulation. Automatic information controls have emerged as a new
policy tool in liberal democracies and a global norm for reasserting state
sovereignty in the online realm (McIntyre, 2012; Deibert et al., 2010; Zittrain
and Palfrey, 2008). In fact, Edwards (2009, 39) argues that technological
manipulations of Internet access to content have been widely disparaged in the
West. The Internet Watch Foundation's (IWF) role in blocking content in the
UK, for instance, was unknown to a majority of users until the Wikipedia
blocking of 2008.9

In authoritarian regimes, the government is generally directly involved in
controlling Internet traffic (see for instance the Tunisian example in Wagner,
2012). In liberal democracies, direct government regulation has been hard, if not
impossible, to implement (see below). Section 1 has shown that Internet
regulation is driven by various forms of private and public orderings and
different combinations of regulatory modalities, notably the use of design
features by private actors. It is thus not surprising that we see these mechanisms
also at work in Internet content control. However, states' attempts at directly
regulating speech online against the backdrop of existing media and information
and communication policies have been met with widespread criticism for being
ineffective or for disregarding existing human rights protections (Brown, 2008).

Although the Internet's role is to route data packets over the network without
regard for what content they carry,10 infrastructure is increasingly co-opted to
perform content controls. This can take the form of denying certain actors access
to hosting platforms or financial intermediaries, as in the case of WikiLeaks (see
below). Infrastructure is also increasingly used to enforce intellectual property
rights, through Internet access termination laws for repeated copyright
infringement (so-called 'three-strikes' or graduated

9 In 2008, access to the editing function of Wikipedia was blocked for all UK users after an image of a
naked girl, the cover image of a famous rock band's album that could be officially purchased in UK
stores, was flagged as child pornography and added to the IWF's blocklist. Because ISPs used a
particular type of proxy blocking, this resulted in blocking access to Wikipedia's editing function for all
UK users.

10 This principle has also gained widespread political salience through the net neutrality movement,
particularly in the U.S. and, since 2009, increasingly in Europe. Net neutrality advocates defend the
Internet's end-to-end principle, opposing traffic discrimination that might prioritise certain types of
traffic over others. For more information, see Marsden (2010).
response mechanisms as implemented for instance in France, Ireland or the UK,
see Yu, 2010) or through domain name seizures (DeNardis, 2012).

In the West, the rapid development of the mobile Internet has contributed to
increased control and filtering mechanisms being built into mobile access
(Zittrain, 2008). As a result of increased government intervention and corporate
demands, the Internet is increasingly 'bordered' (Goldsmith and Wu, 2006).
For some private actors, the development of technological means of control was
a condition for providing access to the content they owned in the first place. This is
for instance the case for the entertainment industry, which uses Geo-ID technologies
to control where in the world users can have access to its content.
Technological innovation online is largely driven by advertising revenues.
Large Internet corporations such as Google or Facebook are heavily reliant
on advertising as their main source of revenue: over the last three years, about
96% of Google's total revenue was generated by advertising.11 To increase the
effectiveness of advertising, contextual or targeted advertising has
emerged, proposing ads based on
the assumption that a user interested in one article will be interested in
similar products. Following this logic, the next step is to track user behaviour
across websites, collecting data to establish user profiles and propose the products
that best fit each user's preferences (Kuehn, 2012). To this end, new technologies
and tools had to be developed to track, collect and process personal data. The
development of Geo-ID technologies, for instance, allows Internet users to be
located geographically with great accuracy, making targeted Internet advertising easier.
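The kind of lookup that Geo-ID systems perform can be sketched in a few lines. This is a minimal illustration: the IP ranges, country codes and licensing rule below are invented for the example, whereas real Geo-ID services rely on large, commercially maintained IP-to-location databases.

```python
import ipaddress

# Hand-made, illustrative mapping of IP ranges to country codes (these are
# documentation ranges, not real allocations).
GEO_TABLE = {
    ipaddress.ip_network("203.0.113.0/24"): "DE",
    ipaddress.ip_network("198.51.100.0/24"): "US",
}

# Hypothetical licensing rule: this content may only be accessed from Germany.
LICENSED_COUNTRIES = {"DE"}

def country_of(ip):
    """Return the country code for an IP address, or None if unknown."""
    addr = ipaddress.ip_address(ip)
    for network, country in GEO_TABLE.items():
        if addr in network:
            return country
    return None

def may_access(ip):
    """Geo-restriction check: allow only requests located in a licensed country."""
    return country_of(ip) in LICENSED_COUNTRIES

print(may_access("203.0.113.7"))   # German range: access granted
print(may_access("198.51.100.9"))  # U.S. range: access denied
```

The same lookup serves both purposes discussed above: selecting location-relevant advertisements and enforcing territorial licensing restrictions.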

In parallel, a surveillance and security industry has emerged whose interest
lies in collecting, processing and selling data, as well as in expanding the
technological possibilities for tracking user behaviour and filtering out unwanted
content. It is estimated that the cybersecurity market is in the order of a
hundred billion U.S. dollars annually. Commercial providers of networking
technology have a stake in the securitization of cyberspace and can inflate
threats to serve their more parochial market interests, argue

11 Google Investor Relations, 2012 Financial Tables, last consulted on 17 January 2013, available at:
http://investor.google.com/financial/tables.html
Deibert and Crete-Nishihata (2012, 340). Most of the surveillance and filtering
systems used in authoritarian regimes are provided by North American and
European businesses that developed them for companies, governments and
individual users (Hintz, 2012; Fuchs, 2012). Often, these technologies are
customised to the particular demands of authoritarian regimes (Deibert and Crete-
Nishihata, 2012), but this is not always the case, as shown by the Tunisian pre-
revolutionary filtering regime, which relied on Western blocklists to control what
type of content its citizens could access (Wagner, 2012).

Although the Internet is a decentralised network, there exists a series of points of
control (Zittrain, 2003) that have rapidly been used by governments and
corporations to insert control into the Internet's infrastructure. The four main
techniques of content control online are technical blocking (i.e. through
blocklists that either exclude (blacklist) or include (whitelist)
particular types of content), removal of search results, takedown requests and
self-censorship (Deibert et al., 2008).

Point of control | Technique                                                  | Actors               | Challenges
Source           | Law enforcement                                            | Courts               | Costly, time intensive, identification, national
Intermediary     | Notice-and-takedown                                        | Any actor            | Chilling effects
Intermediary     | Server takedown, domain deregistration, rating systems     | State, company       | Risk of overblocking
Intermediary     | Technical blocking                                         | State, company       | Accountability, overblocking, mission creep, transparency
Destination      | (Parental) control filter, surveillance, social techniques | State, company, user | Opt-in/opt-out

Figure 1: Internet content controls

Figure 1 provides a summary of the main forms of content control on the Internet,
which will be discussed in more detail below. Since the early 1990s, there has been
an evolution in the way content has been regulated, ranging from enforcement at
the source (section 2.2.1) to enforcement at the destination (section 2.2.2) and
enforcement through intermediaries (section 2.2.3), including through automatic
filtering.
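The two blocklist modes of technical blocking mentioned above, blacklisting and whitelisting, can be sketched as follows. The list contents and URLs are invented for illustration; real blocklists are compiled by hotlines or regulators and enforced in ISP infrastructure rather than in application code.

```python
# Hypothetical list contents, for illustration only.
BLACKLIST = {"badsite.example/illegal-page"}          # deny these, allow everything else
WHITELIST = {"encyclopedia.example", "news.example"}  # allow these hosts, deny everything else

def allowed_under_blacklist(url):
    # A blacklist blocks only the listed resources; all unlisted content passes.
    return url not in BLACKLIST

def allowed_under_whitelist(url):
    # A whitelist permits only listed hosts; all unlisted content is blocked.
    host = url.split("/", 1)[0]
    return host in WHITELIST

print(allowed_under_blacklist("news.example/article"))          # True: not listed
print(allowed_under_blacklist("badsite.example/illegal-page"))  # False: listed
print(allowed_under_whitelist("unknown.example/page"))          # False: host not whitelisted
```

The sketch also makes the asymmetry of the two approaches visible: a blacklist tends to underblock (new problematic pages pass until listed), while a whitelist tends to overblock (all unlisted legitimate content is denied).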

Enforcement At The Source

Early attempts to deal with problematic content targeted the endpoints of the
network, i.e. the producers and consumers of problematic content (Zittrain,
2003). States could effectively intervene when the content was produced in the
country, by arresting and prosecuting individuals, or when the company hosting
the content held assets in the said country. Because the U.S. company
CompuServe had office space and hardware in Munich, Germany, Bavarian
prosecutors succeeded in 1996 in pressuring the group to block access to 200
Usenet messaging boards containing hard pornography and child abuse images
illegal under German law. Similarly, a Parisian court ordered Yahoo! in 2000
to remove Nazi memorabilia items from its online auction site (see section 1.3).
The company complied, although reluctantly, because it held several assets in
France that could be seized by the courts. A similar case is Dow Jones v.
Gutnick (2002), in which the High Court of Australia decided that the U.S.
company Dow Jones was liable under Australian defamation law for an
unfounded mention of Joseph Gutnick in one of its articles, which was also
published online. All three examples demonstrate the state's power to
regulate effectively what type of content can be accessed on its national
territory. Because the technology of the late 1990s and early 2000s did not allow
content controls to be limited to particular geographical areas, the result of
all three state interventions was that content illegal in one state was effectively
removed from the global Internet, i.e. also from states where the content was legal.
The extraterritorial effects of the judgements caused much controversy, especially in
the U.S., where commentators condemned the censoring of the U.S. Internet
through foreign speech restrictions (Goldsmith and Wu, 2006; Deibert et al.,
2010).

However, the U.S. was among the first to introduce legislation criminalising the
initiation of a transmission of indecent material to minors. The
Communications Decency Act of 1996 (CDA) aimed to introduce content
restrictions for minors but was eventually struck down by the courts as
unconstitutional under the First Amendment's free speech protection.
Since it is not necessarily possible online to distinguish between minors and adults,
the restrictions would effectively have applied to all Internet users, which was
considered excessively chilling of free speech. All liberal democracies dispose of a
more or less broad set of laws that can be used against the source of digital content,
e.g. in cases of defamation, trade secret misappropriation or particular types of
speech (Zittrain, 2003).

Nonetheless, enforcement remains problematic when the endpoint is not situated
on the state's territory and/or traditional legal procedures cannot cope with the
exponential amount of content at stake (Mueller, 2010). The latter is particularly
the case for pornographic or copyrighted material that is exchanged massively
between individuals. Lawsuits are costly and time-consuming and do not seem to
affect the behaviour of the many others engaging in similar practices. Furthermore,
identification remains a challenge, although ISPs increasingly cooperate with
law enforcers in identifying the sources of problematic material. Enforcement is
impossible when the source of the content is situated beyond the country's
jurisdiction and no cooperation mechanism exists, or when both states disagree
on whether the content is illegal (e.g. the case of Yahoo v. LICRA, 2000). Finally,
some actors may also wish to control content in more subtle ways in an attempt
to avoid costly and time-consuming lawsuits against their own customers (Zittrain,
2003). Confronted with massive online copyright infringement, the entertainment
industry, for instance, first engaged in mass litigation, or threats thereof, against
individual file-sharers and the companies operating online file-sharing platforms,
often driving the latter out of business in a variety of countries (Yu, 2004).
However, for each platform shut down, new ones would emerge, avoiding the
pitfalls that had brought their predecessors down. As part of these copyright
wars fought in all developed democracies, industry players have come up with
ever more aggressive tactics to defend their business model, including lobbying
for stricter copyright enforcement legislation, education campaigns,
copy-protection technologies (e.g. DRM) and, more timidly, licensing to online
retailers such as iTunes or, more recently, Spotify (Yu, 2004).

Efforts to harmonise content regulation at the international level have been
made, notably at the EU level with regard to sexually explicit images of children,
racism, terrorism and cybercrime (Akdeniz, 2010). However, despite the
international consensus on tackling issues such as child abuse and the trafficking
of images of those abuses, international cooperation to remove such content and
prosecute the offenders remains the exception. In fact, the introduction of Internet
filtering tools to prevent access to such content correlates with a decrease in
cooperative efforts to remove the content at the source (Villeneuve, 2010).

Enforcement At The Destination

Control at the destination includes personal computer filtering software that
allows monitoring and controlling what type of content is accessed from the
destination's personal computer or network. These filters can be built directly into
computers; sometimes they are also integrated into Web browsers. These
types of filters are used in the corporate environment to prevent employees
from accessing leisure or illegal content from within the company's
network. Parents are also customers of so-called parental control filters to
monitor and control what type of content their children access.

Additionally, governments in liberal democracies have reflected on so-called
opt-out filters, which would be installed by default on particular Internet
connections. In the U.S., funding for schools and libraries is, for instance, linked
to the installation of blocking software that filters out child pornographic
material (Zittrain, 2003; Brown, 2008). In other countries (e.g. France), ISPs are
bound by law to provide their customers with so-called opt-in filters, which
are activated upon the customer's request.

To avoid state intervention, in the 1990s the World Wide Web Consortium
(W3C) initiated the Platform for Internet Content Selection (PICS) to develop a
global rating system that would enable users to determine their own access to
Internet content (Resnick and Miller, 1996). The idea was notably supported by
important publishing and media conglomerates and institutionalised through the
Internet Content Rating Association (ICRA) in 1999, whose members were early
Internet industry actors, and supported by the European Safer Internet Action
Plan from 1999 to 2002. However, the attempt to introduce ratings similar to
those for motion pictures and television content failed due to the lack of user
adoption and the difficulty of rating highly dynamic and ever-increasing amounts
of Internet content (Mueller, 2010; Brown and Marsden, 2013; Oswell, 1999).

Enforcement Through Intermediaries

Enforcement through intermediaries has become increasingly popular as a way to
avoid the pitfalls of the techniques mentioned previously. Following Zittrain
(2003, 11), this method promises easier enforcement but less legal certainty.
Intermediaries can be situated at the source or destination of content. Destination
ISPs present the particular attraction for law officials of being situated within a
state's jurisdiction and are thus more readily subject to regulation.
Intermediary-based regulations, bottleneck or chokepoint regulation
(Froomkin, 2011), allow governments to transfer the technical implementation of
their content legislation to those providing access to the Internet. Boyle (1997,
202) noted already in the 1990s that the turn to privatised and
technologically-based enforcement to avoid practical and constitutional obstacles
seems to be the rule rather than the exception.
Since then, the reliance on private actors has steadily increased. For Marsden
(2011, 12), governments have outsourced constitutionally fundamental regulation
to private agents, with little or no regard for the legitimacy claims. In practice,
ISPs thus adopt self-regulatory or co-regulatory practices in association with
governmental or independent institutions (Marsden, 2011; McIntyre, 2012). This
mode of action tends to become standard in governmental regulation but raises
questions of effectiveness, transparency and accountability.

ISPs are not the only intermediaries in a position to enforce content regulations.
Information providers (e.g. search engines), financial intermediaries, DNS
registrars, and hardware and software producers are also key actors.
Zittrain and Edelman (2002) noted already in 2002 that Google filtered
its search results in accordance with local laws, e.g. removing Nazi propaganda
and right-wing extremism in Germany. Goldsmith and Wu (2006) point to the
regulation of financial intermediaries in the U.S. to fight offshore gambling
websites. By forbidding all major credit card institutions to transfer money to
offshore gambling accounts, the U.S. government has effectively impacted user
behaviour. It is still possible for U.S. citizens to engage in offshore gambling, but
the transaction procedure is significantly more cumbersome than previously. Also,
commercial owners of the Internet's infrastructure (financial intermediaries,
website hosts, etc.) play an essential role in that they can deny service to
controversial speakers, thus depriving them of being heard. After the
whistleblowing website WikiLeaks released thousands of U.S. diplomatic cables
in 2010, its domain name was rapidly made unavailable, its data was refused
hosting by Amazon's cloud computing platform, and the most popular payment
services to WikiLeaks were interrupted. The organisation, the website and the
person of Julian Assange rapidly came under attack by both
private and public actors (Benkler, 2011). WikiLeaks is an extreme case that still
triggers wide debate. It illustrates nonetheless that the U.S. government could
not directly prevent the website from publishing the controversial cables.
The termination of its hosting platform can also be considered a minor
inconvenience, given that various other actors across the globe rapidly offered
hosting and mirrored the cables on hundreds of websites. Removing content
from the Internet once it generates broad public interest is thus
near to impossible.12 The interruption of services by all major global financial
intermediaries is, however, more problematic. It resulted in the loss of 95% of
WikiLeaks' revenue and led WikiLeaks to publicly announce the suspension of
further publications.13 Though the group continues to publish, its

12 This phenomenon is also referred to as the Streisand effect, following a case in which the U.S. singer
and actress Barbra Streisand used legal means to remove a picture of her villa from the Internet,
unwittingly generating so much publicity that the picture was replicated to such an extent that the legal
action had to be abandoned.

13 Addley, Esther and Deans, Jason (24 October 2011). WikiLeaks suspends publishing to fight
financial blockade, The Guardian, available at:
http://www.guardian.co.uk/media/2011/oct/24/wikileaks-suspends-publishing.
activity is considerably reduced and WikiLeaks continues to face financial
difficulties.14

While it is true that other intermediaries should not be overlooked, ISPs and
online content providers (OCPs) merit particular attention. As gatekeepers of the
Internet, ISPs have the technical capability to monitor their users' activities and
are able to block access to particular types of content through ever more
sophisticated blocking techniques (for an overview see Murdoch and Anderson,
2008). OCPs such as Facebook or Google attract millions of users on what have
been called quasi-public spheres, spaces that function as shopping malls or
town squares in the digital realm. However, their content policies are largely
defined by their terms of use and by contract law, which does not benefit from the
same constitutional free speech protections as governmental regulation (York,
2010; MacKinnon, 2012). Nonetheless, their content policy decisions impact
millions of users across the world. For MacKinnon (2012), these giant Internet
companies in fact represent new corporate sovereigns that make crucial
decisions about the type of content one can or cannot access. In her 2012 book
Consent of the Networked, she demands increased transparency and
accountability from corporate and governmental sovereigns, while rejecting a
state-led initiative or stricter legislation. A further self-regulatory measure that
has attracted attention is the Global Network Initiative, a process set up by
Yahoo, Google, Microsoft, human rights groups and academics in 2006 to reflect
on how companies can uphold human rights in the digital realm, particularly
when operating in authoritarian regimes.15 A number of reports and human
rights commitments have resulted from the initiative, which has however failed
to attract further corporations to join the effort.

14 Greenberg, Andy (18 July 2012). WikiLeaks Reopens Channel for credit card donations, dares Visa
and MasterCard to block them again, Forbes, available at:
http://www.forbes.com/sites/andygreenberg/2012/07/18/wikileaks-reopens-channel-for-credit-card-
donations-dares-visa-and-mastercard-to-block-it-again/.

15 The European Parliament has for instance demanded sharper export controls of dual-use technologies.
See: European Parliament (27 September 2011). Controlling dual-use exports. Available at:
http://www.europarl.europa.eu/news/en/pressroom/content/20110927IPR27586/html/Controll ing-dual-use-
exports.
In the mid-1990s, many Internet industry actors in liberal democracies
established private organisations, often supported by public funds, specifically
to deal with sexual images of children. These private bodies set up hotlines to
allow Internet users to flag problematic content and to facilitate takedown and
prosecution by the police. One of the more successful hotlines is run by the
Internet Watch Foundation (IWF), set up in 1996 by the British Internet industry
and part of the broader Inhope network, the International Association of Internet
Hotlines. In the U.S., the National Center for Missing and Exploited Children
(NCMEC) pre-existed the Internet but increasingly focuses on online child
abuse. Hotlines were a response to the fact that the police were unable to
deal effectively with illegal content online (Mueller, 2010). A second reaction
was rating systems, which equally developed in the 1990s but failed, as indicated
previously. The organisations behind the hotlines, such as the IWF, then
converted to supporting the current notice-and-takedown system.

Internet service providers (ISPs) are generally exempt from liability for the
content carried or hosted on their servers as long as they are unaware of its
illegal nature and remove the content swiftly upon notification. This principle
has notably been enshrined in the U.S.16 and in the European mere conduit
provisions (e-commerce directive, 2000). The importance of this principle has been
repeatedly underlined by advocacy groups and international organisations (La
Rue, 2011; OECD, 2011). However, the current notice-and-takedown regime
encourages ISPs to remove content swiftly as soon as they are notified of its
potentially illegal or harmful nature in order to avoid liability. This results in
so-called chilling effects on free speech, as content is taken down upon
notification with little or no assessment of whether it is actually illegal. A
growing number of reports suggest that perfectly legal content is being removed
under notice-and-takedown procedures.17 When

16 Section 230 of the Communications Decency Act (CDA) states that "No provider or user of an
interactive computer service shall be treated as the publisher or speaker of any information provided
by another information content provider". Sections of the Digital Millennium Copyright Act
(DMCA, 1998) also provide safe harbor provisions for copyrighted material.

17 The website http://www.chillingeffects.org/, a project of the Electronic Frontier Foundation (EFF) and
various U.S. universities, aims to inform Internet users about their rights in dealing with
not complying with takedown requests, ISPs or OCPs risk being held liable, as
has recently been the case with Google and Yahoo in two defamation cases in
Australia.18

Furthermore, research by Moore and Clayton (2009) indicates that there are
strong variations in removal times after a request, depending on the type of
content being taken down. Despite the lack of an overarching legal framework,
phishing websites19 are removed very rapidly, while child abuse images, which
are illegal across the globe, suffer long removal times. The authors argue that
this has to do with the incentives of the actors involved: banks act very
promptly, while child abuse images are dealt with by the police and encounter
many jurisdictional issues when not situated within the police's own country.

To counter critiques about notice-and-takedown excesses, Google has published
since 2009 all state-initiated removal requests as part of its Google
Transparency Report. This provides a useful source of information on what type
of content is requested to be removed, by which state actors and for which
reasons. Among the countries that request the most content to be removed
are the U.S. and Germany, although no reliable information is available for
China, for instance. Google also lists the requests from copyright owners. Figure
2 presents a screenshot of the rapid increase in copyright removal requests
addressed to Google search per week.20 Since July 2012, the requests have
increased exponentially.

18 Holland, Adam (28 November 2012). Google Found Liable in Australian Court for Initial Refusal to
Remove Links, in: Chilling Effects, accessed on 18 December 2012 at:
http://www.chillingeffects.org/weather.cgi?WeatherID=684

19 Phishing websites are sites that appear genuine (typically banking sites) in order to dupe Internet
users into entering their passwords and login credentials, which are then used for fraud.

20 Google Transparency Report, Copyright removal Requests, retrieved on January 18, 2013 from:
https://www.google.com/transparencyreport/removals/copyright/.
Figure 2: Copyright removal requests to Google search per week

As a result of the difficulty of controlling or restricting digital content, Internet
blocklists have become increasingly popular for dealing with problematic
content. In the UK, for instance, the IWF started to use a blocklist from 2004
onwards, which nearly all ISPs use voluntarily or to pre-empt governmental
legislation. Blocklists establish a system of upstream filtering, without
consulting the users whose access is affected (Edwards, 2009; McIntyre,
2012). Internet filtering or blocking technologies provide an automatic
means of preventing access to or restricting distribution of particular
information (McIntyre and Scott, 2008, 109). They resemble traditional forms of
censorship (e.g. the Index of the Catholic Church) but many authors argue that,
because of their automatic and self-enforcing nature, they are qualitatively
different from prior forms of content control and pose new problems, in particular
in terms of accountability and legitimacy (Brown, 2008; McIntyre and Scott,
2008; Deibert et al., 2008; McIntyre, 2012). Studying Internet content restrictions
remains challenging, however, notably due to technological and methodological
issues.

Assessing Internet Filtering And Directions For Future Research


The assessment of automatic or technology-based content restrictions has only
just begun, predominantly by legal scholars reflecting on the legitimacy and
accountability of this type of mechanism (section 2.3.1). The technical nature of
filtering systems has also been investigated, with research only just emerging on
the political and economic drivers of filtering systems (section 2.3.2).
Legal And Democratic Questions

Internet blocking techniques have led to several instances of over-blocking,
where legitimate content was equally blocked,21 and are often criticised as
ineffective, since Internet users can choose from a wide range of tools to
circumvent blocking. As Brown and Marsden (2013) state, the super-user is in
effect able to bypass information controls. However, this is not the case for the
broad majority of users. Moreover, the absence of information about what type
of content is blocked and by whom precludes any clear accountability
mechanism. Automatic blocking poses a series of democratic questions relating
to the proportionality of Internet blocking, the fundamental rights of freedom of
expression and privacy, and questions of due process and the rule of law as
regards its implementation.

Authors such as Bambauer (2009) have thus called for process-tracing
frameworks to assess Internet filtering regimes in light of a series of democratic
principles, in particular openness, transparency, narrowness and accountability.
Initiatives such as the British IWF would fail on most of these criteria despite
being successful in removing child pornography from UK servers, argues
Edwards (2009). The problem is indeed that private organisations such as the
IWF perform a quasi-judicial function: content is removed without any
intervention of a judge, and the organisation is not held accountable for its
actions. In the case of the Wikipedia over-blocking mentioned previously, the
discussion centred on the IWF's lack of legal competence for deciding whether
the image was illegal, the fact that no notification about the blocking had been
given, and the absence, at the time, of any possibility of appeal (Edwards, 2009).
Blocking systems thus remain particularly vulnerable to questions of
effectiveness and of respect for democratic principles and human rights.
However, many authors no longer reject blocking techniques outright but argue
for increased

21 Blocklists have sometimes been leaked on the Internet. The Australian Communications and Media
Authority (ACMA) blocklist was leaked by WikiLeaks in 2009 and several sites were found not to
conform to ACMA's content rating system, for instance the website of a dentist in Queensland. More
recently, in March 2012, more than 8,000 websites, including Google and Facebook, were blocked by
the Danish child pornography list. See EDRi (14 March 2012). Google and Facebook Blocked by the
Danish Child Pornography Filter, available at: http://www.edri.org/edrigram/number10.5/danish-filter-
blocks-google-facebook.
transparency and accountability to be introduced into the existing systems (see
for instance Bambauer, 2009; Edwards, 2009; for a critical stance on this
development see Mueller, 2010).

Much of the literature on Internet blocking adopts a legal perspective, focusing
on particular types of content blocking (e.g. child pornography for McIntyre,
2012) or particular countries (for Germany alone, two dissertations have been
published analysing the implications of Internet blocking in light of the
particular legal system; see Greve, 2011; Heliosch, 2012). Country comparisons
remain the exception.

Brown and Marsden (2013) use a transdisciplinary framework to assess hard
cases of Internet governance, including online censorship, principally in the
Anglo-Saxon world and the usual suspects of Internet blocking, China and Iran.
They conclude that appeal and due process principles are nearly absent and that
democratic scrutiny is reduced overall. They call for the development of
international standards and the adoption of best practices (e.g. the IWF before it
engaged in automatic blocking, banks' responses to phishing, and a combination
of spam filtering and takedown procedures). The answer should be to go to the
source: arresting producers not blocking viewing (2013, 309).

The political controversies and discourses surrounding Internet blocking have
until recently been the object of little research. McIntyre and Scott (2008)
explored the rhetoric underlying blocking proposals, with McIntyre (2012)
distinguishing between two main types of argument advanced against Internet
blocking: practical and principled ones. Practical arguments refer to the
functional aspects of Internet blocking, such as the blocking techniques used,
whether Internet blocking is an effective way of stopping the diffusion of
problematic content, and the societal issues that drive the production and
diffusion of such content. Principled arguments appeal directly to democratic
and human rights principles, mainly freedom of expression and constitutional
safeguards such as due process, public oversight, and transparency and
accountability mechanisms. Breindl (2012) and Breindl and Wright (2012)
proposed an analysis of the networks of actors and discourses surrounding two
government proposals to introduce Internet blocking of child pornography in
France and Germany. Breindl (2012) concludes that the network of actors
opposing Internet blocking was structurally different in the two countries. The
German opponents' network was particularly large and cohesive. Furthermore,
they dominated all core frames of the debate, succeeding in providing a coherent
alternative in removing, not blocking, the content. In France, however, the
debate was largely dominated by other issues, leaving few discursive
opportunities to challengers of Internet blocking, which was eventually adopted
in France and revoked in Germany.

Measuring Technological Blocking

Internet filtering and blocking is based on a diversity of technologies, including
DNS tampering, IP header filtering, deep packet inspection and end-user
filtering software (Murdoch and Anderson, 2008). These techniques are often
combined into hybrid filters, with states starting with IP address or DNS
filtering before moving on to more sophisticated methods to increase
effectiveness (Deibert et al., 2008, 2010). The EU's CIRCAMP blocklist, used by
various national hotlines, relies on DNS filtering, even though it can easily be
circumvented. British Telecom's Cleanfeed system employs a hybrid filter.
In the U.S., the National Center for Missing and Exploited Children
(NCMEC) has provided voluntary blocklists to ISPs since 2007, many of whom
use them to filter their networks. Some companies, such as AT&T, also use the
so-called hash values provided by NCMEC to monitor and filter private
communications for known images based on their hash value, thus including
non-Web content (McIntyre, 2012). The computer science literature focuses on
the detailed implementation of these blocking techniques, principally in
authoritarian regimes (for China, see for instance Wright et al., 2011; but see
Clayton (2005) for an analysis of the UK Cleanfeed system; see also our
technical report about Internet blocking).

Policy analyses of the development of filtering regimes, in particular in
liberal democracies, are lacking. The OpenNet Initiative (ONI) has been
the frontrunner with the publication of the Access books (Deibert et al.,
2008, 2010, 2011a), including detailed country profiles. Their research represents
the first systematic attempt to document information restrictions around the globe.
Their research methods combine on-the-ground fieldwork and collaboration with
local experts with the measurement of content restrictions using specialised
software (Faris and Villeneuve, 2008). Their testing draws on user reports of
blocked content, notably through the crowd-sourced website Herdict, set up by the
Berkman Center for Internet & Society at Harvard University. Their measurements
result in scores ranging from pervasive, substantial, selective and suspected to no
evidence of filtering, across four broad categories of content: political, social,
conflict and security, and Internet tools. They then correlate the filtering scores
with World Bank indicators of the rule of law and of voice and accountability,
concluding that there is no straightforward relationship between the rule of law
and Internet filtering, while countries with low voice and accountability scores
also show strong filtering scores (Faris and Villeneuve, 2008). For liberal
democracies, Deibert et al. (2010) find no evidence of filtering, partly because,
for legal and ethical reasons, ONI abstains from measuring child pornographic
content (Faris and Villeneuve, 2008). In comparison to authoritarian regimes,
liberal democracies therefore show no evidence of filtering, while much anecdotal
evidence suggests growing filtering in these countries too.

Non-academic reports produced by NGOs and freedom of expression advocates
(see for instance the annual reports by Freedom House, 2012; Reporters
Without Borders, 2012) provide an important source of information about what
type of content is blocked in the countries included in the reports. Publications
by advocacy groups such as European Digital Rights (McNamee, 2011a, b) or
AK Vorrat often provide up-to-date information on the latest developments in
Internet blocking. Groups such as the Tor network, which developed the widely
used circumvention tool, are also interested in gathering data about online
blocking and have developed the Open Observatory of Network Interference (OONI),
which for the moment provides data about blocking incidents in only two countries
but might be expanded in the future.

The lack of reliable data measuring Internet blocking was already noted
in 2003 by Zittrain, who subsequently participated in building up
ONI and Herdict. More recently, the New America Foundation's Open Technology
Institute, the PlanetLab Consortium, Google Inc. and individual
researchers have initiated the Measurement Lab, a Web platform that can host a
variety of network measurement tools for broadband and mobile connections.
While some of the available tests are more specifically targeted at measuring the
quality of broadband connections, the use of deep-packet inspection (DPI), a
technology that allows operators to open up data packets and examine their content,
has come to the centre of attention more recently. DPI is used for a variety of
reasons, including bandwidth management, network security or lawful
interception, but can also be used to regulate content, prioritise certain products
over competing services, target advertising or enforce copyright (Bendrath and
Mueller, 2011). As a result, several teams of researchers have developed new
tools to measure and assess DPI use by Internet service providers, which is
unregulated in most countries (see for instance Dischinger et al., 2010).
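The principle behind DPI can be made concrete with a deliberately simplified sketch: unlike header-based filtering, DPI looks inside the payload for protocol signatures. The policy set and function names below are invented for illustration; real DPI engines classify live traffic at line rate with far richer fingerprints. The BitTorrent handshake prefix used as a signature, however, is the real one from the protocol specification:

```python
# Toy deep-packet inspection: classify a packet by its payload bytes.
PROTOCOL_SIGNATURES = {
    # A BitTorrent handshake starts with byte 0x13 (length 19)
    # followed by the string "BitTorrent protocol".
    b"\x13BitTorrent protocol": "bittorrent",
    b"GET ": "http",
}

def classify_payload(payload: bytes) -> str:
    """Match a payload against known protocol signatures."""
    for signature, protocol in PROTOCOL_SIGNATURES.items():
        if payload.startswith(signature):
            return protocol
    return "unknown"

def should_throttle(payload: bytes, throttled: set) -> bool:
    """Apply a (hypothetical) ISP policy to a classified packet."""
    return classify_payload(payload) in throttled

policy = {"bittorrent"}  # hypothetical policy throttling P2P traffic
print(should_throttle(b"\x13BitTorrent protocol" + b"\x00" * 8, policy))
```

Even this toy version shows why DPI is politically sensitive: the decision to throttle depends on reading the content of the communication, not merely its addressing information.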
First academic assessments have emerged: Dischinger et al. (2008), for
instance, assessed BitTorrent blocking, finding particularly high values for
U.S. ISPs such as Comcast. More recent research by Mueller and Asghari (2012)
and Asghari et al. (2012b), using the Glasnost test available on M-Lab,
investigates the use of DPI technology for throttling or blocking peer-
to-peer (P2P) applications over three years. They use bivariate analysis to test
possible economic and political drivers of DPI technology and its
implementation by 288 ISPs in 75 countries.22
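The bivariate analysis described above amounts to computing correlation coefficients between candidate drivers and measured DPI use. The sketch below shows the core calculation on invented toy data (the real study uses Glasnost measurements for 288 ISPs, not these numbers):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented toy data: per-ISP bandwidth cost index vs. share of
# Glasnost tests showing throttling (illustrative only).
bandwidth_cost = [1.0, 2.0, 3.0, 4.0, 5.0]
dpi_throttling = [0.05, 0.10, 0.20, 0.30, 0.45]

r = pearson_r(bandwidth_cost, dpi_throttling)
print(round(r, 3))  # a strongly positive r on this toy data
```

A bivariate test of this kind can only establish correlation, not causation, which is why the studies cited here interpret their coefficients against institutional context rather than as direct evidence of drivers.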

They find that DPI use is surprisingly widespread worldwide, including in
liberal democracies, in particular Canada and the UK. They find that market
factors such as bandwidth scarcity, the cost of bandwidth and lower levels of
competition correlate with higher DPI use. Furthermore, political factors such
as governmental censorship and weak privacy protections correlate with higher
DPI use. Contrary to their initial expectations, they find that the strength of the
copyright industry does not correlate with DPI use by ISPs. Similarly, network
security correlates negatively with ISPs' DPI use.

22 For a critical assessment of methodological issues regarding Internet throttling measurements see Asghari
et al. (2012a).
Interestingly, Mueller and Asghari (2012) find that governmental regulation
in the U.S. and Canada did not, by itself, determine DPI use. In both countries, DPI use
resulted in public protests, litigation and the development of new regulation
based on net neutrality principles. The public confrontation clearly impacted DPI
use in the U.S., where ISPs considerably decreased their use of the technology
even after the FCC ruling was challenged. In Canada, however, the new,
uncontested regulation did not reduce DPI use, which actually increased after
the regulation was passed. Legislation alone is therefore not able to explain this
apparent paradox.

Future research
The literature review presented the main research questions and findings on
Internet content regulation as they have evolved since the introduction of the
Internet in the early 1990s. Of particular interest is the nature of new
regulatory arrangements that range from self- to co- to state regulatory
interventions (see also Marsden, 2011) set in place to respond to growing
concerns about a wide range of illegal or harmful content such as copyright
infringing material or content deemed harmful to minors or threatening
public order.

The various techniques and points of control have been discussed to highlight
where states and private actors could intervene to control digital information
flows. Particular attention has been paid to blocking techniques and their legal and
democratic implications. Finally, we have discussed recent research
providing empirical evidence of the amount of blocking carried out in liberal
democracies, identifying several shortcomings.

First, there remains a lack of reliable and longitudinal data about what type of
content is blocked or removed, by which type of actor, where and through which
process. Recent initiatives such as the M-Lab provide first opportunities to
gather and analyse large amounts of data but nonetheless present several
methodological challenges (see for instance Asghari et al., 2012a). Regulatory
authorities such as the U.S. Federal Communications
Commission (FCC) or the Body of European Regulators for Electronic
Communications (BEREC) are in the process of carrying out large broadband
connection tests that might result in relevant data for this research project in the
coming years.

Second, there is a clear opportunity to carry out a comparative public policy
research project specifically on liberal democracies. As much of the literature
has until now focused on authoritarian regimes, liberal democracies merit
examination in their own right. They present both possibilities and challenges for
research. On the one hand, more reliable data and indicators exist about
the political and institutional systems of liberal democracies. On the other hand,
Internet blocking initiatives are often carried out by private actors and lack
democratic scrutiny and public oversight. Access to reliable data remains,
again, a challenge.

Finally, there has been limited attention to the political drivers and factors
surrounding the adoption and implementation of blocking techniques.
Much of what we know about Internet blocking in liberal democracies is
the result of media reports and the work of freedom of expression advocates,
with little systematic analysis. Future research will benefit from a comparative and
systematic perspective on Internet blocking in liberal democracies in particular.
REVIEW OF LITERATURE

Nowadays, Internet-based learning or online education is finding increasingly wide use, and may
prove effective in facilitating advanced coursework for both urban and rural
students. For instance, online learning can provide effective strategies for offering courses and
field experiences in special education teacher preparation programs (Collins, Schuster, Ludlow, &
Duff, 2002). Organizations apply online learning by employing Web-based training in their employee
training programs (Gravill & Compeau, 2008). The majority of universities in the US are adding
asynchronous Web-based instruction to their undergraduate degree programs (Bell & Akroyd,
2006; Chen, 2002; Lynch & Dembo, 2004; Miller & Lu, 2003).
The use of Internet-based technology is embedded in most learning activities today. According to
the US Department of Education, as of the 2006/2007 academic year 97% of all public two-year
degree-granting institutions and 88% of public four-year degree-granting institutions offered
college-level distance classes (National Center for Educational Statistics, U.S. Department of
Education, 2002). In a related finding, Picciano and Seaman (2007) estimated that approximately
one million K-12 students would take an online class within the next few academic years (Picciano &
Seaman, 2007). Thus, it is important for students to learn new skills as technology changes or is
introduced in their learning environments (Perry, Phillips, & Hutchinson, 2006). Learners are
increasingly expected to assess and manage their own learning needs. Wageman (2001) stated that
self-management is a disciplinary skill that offers benefits, and learners have to learn this
particular skill. Also, Vonderwell and Savery (2004) suggested that students, especially in online
learning environments, need to be prepared for changing demands related to online situations with
respect to technology, learning management, pedagogical practice and social roles.

Online learning environments are platforms where educational courses are delivered through the
Internet, or using Web-based instructional systems, either in real time (synchronously) or
asynchronously. Reid (2005) stated that a Web-based instructional system or online learning is
easy and inexpensive compared to traditional learning methods. According to Moore and
Kearsley (2005), Web-based instruction can make extensive use of network technologies to
incorporate a variety of organizational, administrative, instructional, and technological
components, offering flexibility concerning the new methodology of learning. Online learning
is self-managed when an instructor provides the software programs and resources to transfer new
skills while the learner controls the process to achieve their own objective of acquiring those new
skills (Gravill & Compeau, 2008). Therefore, the process of online learning is
implemented by the learner, and the learner becomes an active controller instead of a
passive learner, which has been the norm in the past. Online learners need to understand the
dynamics of an online setting (Vonderwell & Savery, 2004). Learners need to know how online
learning works: interactions, relations, perceptions, and the role of learners. But are they ready for
online learning environments?
Learner readiness influences most institutions, from their curricular development and
pedagogies to entire academic divisions dedicated to Web-specific delivery (Blankenship &
Atkinson, 2010).
According to Hung and others (2010), online learner readiness involves five dimensions: self-
directed learning, motivation, computer/Internet self-efficacy, self-control, and online communication
self-efficacy (Hung, Chou, Chen, and Own, 2010). In particular, learners have to realize their
responsibility for guiding and directing their own learning: for time management, for keeping up
with the class, for completing the work on time, and for being active contributors to instruction.
Most of those activities are an important part of self-regulated learning, which becomes the answer
for preparing novice online students to be successful learners in online learning environments.
Self-regulated learning has become a central topic in facilitating learning in online learning
environments. Self-regulated learning strategies have been identified (Boekaerts & Corno, 2005;
Dweck, 2002; Perry et al., 2006). Self-regulated learning is learning behavior that is guided by
1) metacognition, or thinking about one's thinking, which includes planning, monitoring, and regulating
activities; 2) strategic action, such as organizing, time management, and evaluating personal
progress against a standard; and 3) motivation to learn, such as self-belief, goal setting, and task
value (Boekaerts & Corno, 2005).
Learners choose the best approach to learn the material and gain the skills they need, and these
approaches then become their habits. These processes are called self-regulated learning strategies
(Boekaerts & Corno, 2005; Dweck, 2002; Perry et al., 2006). To manage these self-regulated learning
experiences effectively, individuals have to make self-directed choices of the actions they will
undertake or the strategies they will invoke to meet their goals. These become learning
habits when learners use them often as their learning strategies. Self-regulated learning strategies
have the potential of becoming study skills and habits through repetitive use.
Individuals who are self-regulated learners believe that opportunities to take on challenging tasks,
practice their learning, develop a deep understanding of subject matter, and exert effort will give
rise to academic success (Perry et al., 2006). Given this background on self-regulated
learning strategies in online learning environments, teachers could adapt self-regulation strategies
to match their teaching styles; instructors might apply these strategies to develop their course
activities effectively; researchers can use this review as secondary data pointing to issues that
need to be researched; and, finally, learners could apply the findings from research in their own
learning strategies in order to improve their learning skills and the effectiveness of their online
learning environments. The purpose of this review is to provide educational researchers,
instructors, practitioners, and online learners with an understanding of extant research and
theories on academic self-regulated learning and its influence on learner success in online
education. This review will address the applicability of employing a theoretical framework of
self-regulation for understanding learner success in an online learning environment, including
gaps in the literature and suggestions for future inquiry.

According to Barry Zimmerman (2002), self-regulated learning involves the regulation of three
general aspects of academic learning. First, self-regulation of cognition involves the control of
various cognitive strategies for learning, such as the use of deep processing strategies and
planning, monitoring, and regulating, which refer to awareness, knowledge, and control of cognition
that result in better learning and performance. Second, self-regulation of behavior involves the
active control of the various resources students have available to them, such as their time, their
study environment, the place in which they study, and their help-seeking from others, such as peers
and teachers, to help them learn effectively.

Awareness of thinking: Part of becoming self-regulated involves awareness of effective thinking
and analysis of one's own thinking habits. This is known as metacognition, or thinking about
thinking, which Brown & Campione (1990) and Flavell (1978) first described. They showed that
children from 5 to 16 years of age become increasingly aware of their own personal knowledge
states, the characteristics of tasks that influence learning, and their own strategies for monitoring
learning. Paris and Winograd (1990) summarized these aspects of metacognition as children's
developing competencies for self-appraisal and self-management, and discussed how these aspects
of knowledge can help direct students' efforts as they learn.

Freedom of speech and expression.


4. Freedom of expression
This chapter will examine the right to freedom of expression provided for by article 10
of the European Convention for the Protection of Human Rights and Fundamental
Freedoms, and outline some of the political statements made in relation to Internet and
freedom of expression.
4.1. Legal point of departure
Freedom of expression is a fundamental human right1, which draws on values of
personal autonomy and democracy. Freedom of expression is closely connected to

1 In its very first session in 1946, the UN General Assembly stated that freedom of information is a fundamental human right and is
the touchstone of all the freedoms to which the United Nations is
consecrated (A/RES/59(1): Para.1).
freedom of thought and is a precondition for individuals' self-expression and self-fulfilment.
The right to express oneself enables an open debate about political, social
and moral values, and encourages artistic and scholarly endeavour free of inhibitions
(Jacobs and White 1996:223). Freedom of expression is not absolute, since open debate
and personal autonomy can cause conflict between the values and rights respected by the
system. Therefore, rights of expression can be limited by the system.
The right to freedom of expression is provided for in the Universal Declaration of
Human Rights Article 19, the International Covenant on Civil and Political Rights
Article 19, the American Convention on Human Rights Article 13, the African Charter
on Human and Peoples' Rights Article 9, and the European Convention for the
Protection of Human Rights and Fundamental Freedoms Article 10. The point of
departure of this chapter will be Article 10 of the European Convention (ECHR) and the
related case law.

1. Everyone has the right to freedom of expression. This right shall include freedom to
hold opinions and to receive and impart information and ideas without interference by public authority and
regardless of frontiers. This Article shall not prevent States from
requiring the licensing of broadcasting, television or cinema enterprises.
2. The exercise of these freedoms, since it carries with it duties and responsibilities, may
be subject to such formalities, conditions, restrictions or penalties as are prescribed by
law and are necessary in a democratic society, in the interests of national security,
territorial integrity or public safety, for the prevention of disorder or crime, for the
protection of health or morals, for the protection of the reputation or rights of others, for
preventing the disclosure of information received in confidence, or for maintaining the
authority and impartiality of the judiciary (Article 10, ECHR).
The European Court (the Court) has described freedom of expression as one of the
essential foundations of a democratic society, one of the basic conditions for its progress
and for the development of every man (Handyside 1976:23). Paragraph one of Article
10 lays down the freedoms protected, whereas paragraph two sets conditions for
legitimate restrictions on these freedoms. If the conditions laid down in the second
paragraph are not fulfilled, a limitation on freedom of expression will amount to a
violation of the European Convention.
Using Habermas' concepts of system and lifeworld, we would say that Article 10,
paragraph one provides protection for the individual's and the press's right to exercise
communicative actions in the public sphere, whereas paragraph two provides for
legitimate system restrictions on this freedom2. The legitimate system restrictions are in
2 A set of basic rights concerned the sphere of the public engaged in rational-critical debate (freedom of opinion and speech,
freedom of press, freedom of assembly and association, etc.) (Habermas 1989:83).
principle only connected to the state, and not private parties, since the Convention
regulates a relationship between the individual and the state. However, as we shall see
later on, the question of positive state obligations might extend to protecting individuals from restrictions by
third parties.
4.2. Freedoms protected
The freedoms protected in Article 10, paragraph one are:
Freedom to hold opinions. This freedom implies that the state must not try to
indoctrinate its citizens nor make distinctions between those holding specific
opinions and others. The freedom gives citizens the right to criticise the
government and form opposition3.
Freedom to impart information and ideas, which gives citizens the right to
distribute information and ideas through all possible lawful sources.
Freedom to receive information, which includes the right to gather information
and to try to get information through all possible lawful sources4.
Freedom of the press. This freedom is not explicitly mentioned in paragraph one,
but has been underlined by the Court in several cases, where the Court has put
strong emphasis on the public's right to know5.
Freedom of radio and TV broadcasting. The freedom is applicable also to radio
and television broadcasting, since the specific possibility to introduce a licensing
procedure implies that the freedom as such must be applicable6.
Since strong content diversity is one of the main features of Internet communication, it
is interesting to see to what degree a potential diversity of expressions is protected
under Article 10. In an important judgment from 1976 the Court stressed the pluralism
of expressions protected under Article 10: Subject to paragraph 2 of Article 10, it is
applicable not only to information or ideas that are favourably received or regarded as
inoffensive or as a matter of indifference, but also to those that offend, shock or disturb the state or any
sector of the population. Such are the demands of that pluralism, tolerance and broadmindedness without
which there is no democratic society
(Handyside 1976:23)7. As illustrated by the judgment, the contents of the expressions
3 Certain positions have inherent limitations to the right to express opinions, for example civil servants and prisoners.

4 UDHR Article 19 and ICCPR Article 19 also refer to the right to seek information.
5 The Court has often stressed the public interest or public debate factor, for instance in the Sunday Times
case 1979, the Lingens case 1986 and the Jersild case 1994.

6 For a long time the Commission saw no incompatibility between state monopolies of radio and TV and the Convention.
However, in 1993 the Court gave judgment in the Austrian radio monopoly case
(Informationsverein Lentia and others 1993), where the Commission had come to the conclusion that a
violation of Article 10 existed. The issue is also mentioned in CCPR General Comment 10: "Effective

7 In assessing the right to freedom of expression provided for in the International Covenant on Civil and Political Rights Article
19, the Human Rights Committee has also stressed that all forms of expression
are entitled to the same degree of protection. Article 19, paragraph 2, must be interpreted as
encompassing every form of subjective ideas and opinions capable of transmission to others, which are
compatible with article 20 of the Covenant, of news and information, of commercial expression and
advertising, of works of art, etc.; it should not be confined to means of political, cultural or artistic
expression (CCPR/C/47/D/359/1989).
seem to be irrelevant to the applicability of Article 10. The fact that the information
concerned is of a commercial nature or that the freedom of expression is not exercised
in a discussion of matters of public interest is also indifferent (Van Hoof 1998:559).
Up to this point no cases concerning Internet content regulation and Article 10 have
come before the European Court. However, cases have been raised before courts in the
US, which I will examine in the following chapter. First, here is an outline of some of
the key concepts concerning Article 10.
The fact that Article 10 protects the free expression of opinions implies that a rather
strong emphasis is laid on the protection of the specific means by which the opinion is
expressed. Any restriction of the means will imply a restriction of the freedom to
receive and impart information and ideas. However, the means by which a particular
opinion is expressed are protected only insofar as they are means which have an
independent significance (exclusive means factor) for the expression of the opinion
(Van Hoof 1998:559). Since there exists no comparable alternative for individuals to
communicate in cyberspace, one could argue that the independent significance of the
Internet as a specific means for expressing opinions and receiving information is rather
strong.
Another important concept in Article 10, not least in the light of the Internet's borderless
nature, is the term regardless of frontiers. The term indicates that the state must admit information from
beyond the frontiers of the country, both to be imparted and received,
subject to the possible restrictions laid down in the second paragraph8. The term is not
applied very much in case law, but is interesting in the light of upcoming cases related
to the Internet.
The Court has applied strict supervision to preventive restraints on expressions,
reflected in those cases where the Court has held that the intended purpose of a ban on
publication, the prevention of the disclosure of information, could no longer justify the
prohibition because the information had already become public from another source.
This was the case in The Observer and Guardian case from 1991, where the Court held
that since the information (Spycatcher) was now in the public domain, and therefore no
longer confidential, no further damage could be done to the public interest that had not
already been done (The Observer and Guardian 1991: Para.42). When discussing
content regulation on the Internet it is important to bear in mind that regulatory means such
as governments' bans on certain information, as carried out in Singapore or China, or
mandatory filters on public computers, as carried out in Denmark or the US, restrict
access to information which is already in the public World Wide Web
domain. Thus it is not a question of preventing the disclosure of the information, but
rather a question of restricting individuals' access to information which has already been made
public.
Regarding individuals' right to receive information, the term indicates that the
collection of information from any source should in principle be free, unless legitimate
restrictions under paragraph two can be raised (Van Hoof:562). In line with this
interpretation the Court has ruled that the right to receive information basically
prohibits a government from restricting a person from receiving information that others
wish or may be willing to impart to him (Leander 1987:29).
It is still not clear to what extent the freedom to receive information entails an obligation
on the part of the state to impart information (Van Hoof:565). According to the Council of Europe's 1982
Declaration on the Freedom of Expression and Information, the member
states shall pursue an open information policy in the public sector, including access to
information, in order to enhance the individual's understanding of, and his ability to
disseminate freely, political, social, economic and cultural matters (CoE 1982:1-2).
8 For judgments concerning the imparting and receiving of information from abroad see Groppera Radio AG and others 1990 or
Autronic AG 1990.
Although the declaration is not a legal document, it does give some direction as to the
legal/political trend within the member states.
Regarding the level of protection provided for by Article 10, paragraph one is primarily
a negative obligation on the state not to interfere in individuals' right to express
opinions. It might however also entail positive obligations on the state to protect
individuals from third-party interference, thus raising the question of Drittwirkung9.
The Court has held that, although the essential object of many provisions of the
Convention is to protect the individual against arbitrary interference by public
authorities, there may in addition be positive obligations inherent in an effective respect
of the rights concerned (Ozgur Gundem 2000: Para.42). Genuine, effective exercise of
this freedom does not depend merely on the State's duty not to interfere, but may require
positive measures of protection, even in the sphere of relations between individuals
(Ibid: Para.43). Thus the state obligation not to interfere might extend to a positive
obligation to interfere in order to protect the individual from third-party restrictions. The
concept of positive state obligations is still developing, and there is no clear legal
interpretation of the issue so far.
Summing up on Article 10, paragraph one includes a rather broad guarantee of
individuals' freedom of expression. Using Habermas' terminology we would say that
individuals' freedom to form and express opinions in the public sphere, whatever the
subject may be, is legally provided for in Article 10, and has been effectively implemented by the
Court over the past years. Also, at first sight, the restrictions in paragraph two are
formulated broadly. However, the Court has taken the position that the exceptions to
4.3. Admissible restrictions


The restrictions, which are admissible according to paragraph two, fall into three
categories (CoE 1999:20):
Protection of the public interest (national security, territorial integrity, public
safety, prevention of disorder or crime, protection of health or morals).
Protection of other individual rights (protection of the reputation or rights of
others10, prevention of the disclosure of information received in confidence).
Necessity of maintaining the authority and impartiality of the judiciary.

9 Van Hoof speaks of a kind of indirect Drittwirkung in cases where provisions of the Convention, notably Articles 3, 10 and 11,
imply state measures in order to make their exercise possible, i.e. the rights
inferred for individuals imply a positive obligation on the part of the Contracting States to take measures
vis-à-vis third private parties (Van Hoof 1998:23).

10 Politicians can use libel statutes to curtail criticism in an effective manner. It is therefore important that politicians' right to be
protected against libel does not amount to outlawing criticism. In the Lingens case,
the Court held that the penalty imposed on Mr Lingens (a journalist convicted of public defamation)
amounted to a kind of censure, which could discourage him from making criticism of that kind in the
future (Lingens 1986).
Whereas the first and third points concern restrictions referring to the interest of the system, the second point concerns restrictions referring to the lifeworld, that is, the freedoms of other individuals.
A key concept entailed in paragraph two is that of duties and responsibilities, which play a particularly important part in three circumstances: firstly, in cases regarding freedom of the press; secondly, in cases where the person possesses a special status; and thirdly, in cases where the protection of morals is involved (Van Hoof 1998:576).
The Court has often stressed that it is incumbent on the press to impart information and
ideas on political issues as well as other areas of public interest. Not only does the
press have the task of imparting such information and ideas: the public also has a right
to receive them (Lingens 1986:26). Accordingly, restrictions on freedom of expression
where the press is involved are interpreted narrowly by the Court (Ibid:571). Also,
several cases have been raised concerning the issue of restrictions on persons with
special status, such as soldiers, teachers, civil servants, and servicemen. Case law shows
that the Court does not easily accept restrictions on freedom of expression due to special
duties and responsibilities on these groups (Van Hoof 1998:578). We could argue that even though these persons act in a capacity as system representatives (for instance the school system), their right to freedom of expression (that is, lifeworld communication) seems to prevail over their system relation when the Court balances the interests of system and lifeworld respectively. Finally, duties and responsibilities play an important part in cases where the protection of morals is invoked to justify a restriction on freedom of expression11. Case law seems to indicate that if the concept of morals is invoked on good grounds, this leads to a broad margin of appreciation, since state authorities are found to be in a better position to judge national morals. Thus the system of state authorities can - to a relatively strong degree - invoke the protection of morals as a legitimate ground for restricting individuals' freedom of expression.
When assessing a restriction on freedom of expression, the Court applies a three-part test: (1) the restriction must be prescribed by law and meet the corresponding criteria of precision and accessibility, (2) it must have a legitimate aim as provided in Article 10 paragraph two, and (3) it must be necessary in a democratic society. The term necessary in a democratic society implies that there must be a pressing social need for the limitation and that it must be proportionate to the legitimate aim pursued (Guardian and Observer 1991: Para.40). The necessity test is the ultimate and decisive criterion for the Court. When assessing the proportionality of the restriction in question, the Court examines whether the formalities, conditions, restrictions or penalties imposed on the exercise of freedom of expression are proportionate to the legitimate aim pursued, that is, the restriction should be neither overbroad nor permitted if a less restrictive alternative would serve the same goal.

11 See Handyside 1976, Müller and others 1988, Otto-Preminger-Institut 1994.
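The three-part test applies its conditions cumulatively: a restriction fails as soon as any one condition is unmet. The following sketch is a purely illustrative schematisation in Python; the field names are mine, not the Court's terminology.

```python
# Illustrative schematisation of the Court's three-part test under
# Article 10(2). The field names below are invented for this sketch.
from dataclasses import dataclass

@dataclass
class Restriction:
    prescribed_by_law: bool      # precise and accessible legal basis
    legitimate_aim: bool         # an aim listed in Article 10 paragraph two
    pressing_social_need: bool   # "necessary in a democratic society"
    proportionate: bool          # not overbroad; no less restrictive alternative

def admissible(r: Restriction) -> bool:
    """All four conditions are cumulative: failing any one defeats the restriction."""
    return (r.prescribed_by_law
            and r.legitimate_aim
            and r.pressing_social_need
            and r.proportionate)

# A vague, overbroad measure fails even if its aim is legitimate.
assert not admissible(Restriction(False, True, True, False))
assert admissible(Restriction(True, True, True, True))
```

The conjunction mirrors the Court's practice of treating the necessity and proportionality limbs as the decisive hurdle once a legal basis and legitimate aim are established.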

4.4. Political statements


As outlined in chapters two and three, Internet bears potential for an empowered lifeworld by providing individuals with essentially new means of communicative action. Let me illustrate how the political system has addressed this potential.
In the Council of Europe's 1982 Declaration on the Freedom of Expression and Information, the member states explicitly addressed the issue of information and communication technology, stating that the continued development of information and communication technology should serve to further that right, regardless of frontiers, to express, to seek, to receive and to impart information and ideas, whatever their source (CoE 1982:1)12. To support this aim, the member states have agreed to the following objectives (excerpts, my emphasis) (Ibid:1-2):
Absence of censorship or any arbitrary controls or constraints on participants in
the information process, on media content or on the transmission and
dissemination of information.
The pursuit of an open information policy in the public sector, including access
to information, in order to enhance the individuals understanding of, and his
ability to disseminate freely political, social, economic and cultural matters.
And to intensify their co-operation in order:
To promote the free flow of information, thus contributing to international
understanding, a better knowledge of convictions and traditions, respect for the
diversity of opinions and the mutual enrichment of cultures.
To ensure that new information and communication techniques and services,
where available, are effectively used to broaden the scope of freedom of
expression and information.
UNESCO has also addressed the issue of Internet and freedom of expression in a 1999 draft on Cyberspace Law, a set of principles to be promoted by UNESCO (excerpts, my emphasis) (UNESCO 1999:4-5):
Communication Principle: The right of communication is a fundamental human
right.
Universal Service Principle: States should promote universal services where, to
the extent possible given the different national and regional circumstances and
resources, the new media shall be accessible at community level by all

individuals, on a non-discriminatory basis regardless of geographic location.

12 CoE (the MM-S-OD specialist group) is currently (June 2001) preparing a draft recommendation on self-regulation on Internet, but the draft is not yet public. For further information see http://www.humanrights.coe.int/media/events/2001/FORUM-INFO(EN).doc
Ethics Principle: States and users should promote efforts, at the local and
international levels, to develop ethical guidelines for participation in the new
cyberspace environment.
Free expression Principle: States should promote the right to free expression and
the right to receive information regardless of frontiers.
Access to Information Principle: Public bodies should have an affirmative
responsibility to make public information widely available on the Internet and to
ensure the accuracy and timeliness of the information.
States should preserve and expand the public domain in cyberspace.
The United Nations Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, Mr Abid Hussain, has also stressed Internet's effect on freedom of expression. In his annual reports of 1998, 1999 and 2000 to the Commission on Human Rights, the Special Rapporteur underlines the importance of Internet in the free flow of information, ideas and opinions, for instance Internet's potential for bringing out dissenting voices and shaping the political and cultural debate (E/CN.4/2000/63). According to the Special Rapporteur, Internet is inherently democratic, and online expressions should be guided by international standards and be guaranteed the same protection as is given to other forms of expression (Ibid).
As illustrated above, international organisations have made rather strong statements related to Internet and freedom of expression. The CoE stresses the absence of any arbitrary controls or constraints on participants in the information process and argues for a free flow of information, thus broadening the scope of freedom of expression. The Special Rapporteur wishes to give online expressions the same protection as is given to other forms of expression, and UNESCO underlines the issue of general access to Internet at community level. UNESCO also stresses that states should preserve and expand the public domain in cyberspace, which, in the framework of Habermas, can be seen as an encouragement to protect the lifeworld-oriented communicative sphere in cyberspace. UNESCO further proposes the development of ethical guidelines for participation in cyberspace, which could imply common codes of conduct implemented by private parties, as I shall return to in chapter six. Even though neither the CoE Declaration, the UNESCO draft on Cyberspace Law nor the annual reports of the Special Rapporteur have legal standing, they all point to the international political focus on and awareness of cyberspace, not least in the field of freedom of expression.

5. Cases
I will now examine some recent cases dealing with the dilemma of online content regulation. Since there have not yet been any cases concerning Internet and freedom of expression before the European Court, I will employ some of the most well-known US judgments in the field of Internet communication. Chapter five explores the main arguments of the cases, whereas chapter six uses the cases to discuss the legal and political space so far defined for regulating online expressions, drawing on the framework of system and lifeworld. The cases concern the two main types of content regulation: the first is state regulation of individuals' right to express opinions and receive information; the second is private parties' self-regulation. Here follows a brief introduction to the cases concerning state regulation.
5.1. State regulatory cases
Restrictions on the right to express opinions
The most well-known legislation on online content regulation is the US Communications Decency Act (CDA), which was passed as part of the Telecommunications Act in 1996. The CDA sought to impose criminal penalties on anyone who used Internet to communicate material that, under contemporary community standards, would be deemed patently offensive to minors under 18 years of age. The CDA provided two affirmative defences to prosecution: 1) use of a credit card or other age verification system, and 2) any good faith effort to restrict access by minors. The law was passed by Congress and signed by the President in January 1996, but was ruled unconstitutional, first by the District Court for the Eastern District of Pennsylvania in June 1996 and then by the Supreme Court in June 199713. Following CDA, the Child Online Protection Act (COPA) was enacted by Congress in
October 1998, as an attempt to cure the constitutional defects of CDA. COPA sought to
impose criminal penalties against any commercial website that made material that is
deemed "harmful to minors" available on the World Wide Web to anyone under 17
years of age. Thus COPA was narrower in scope aiming only at commercial
communications published on the WWW. Federal judges struck down COPA in 1998, 1999 and 2000, and the Supreme Court has now (May 2001) decided to hear the arguments on COPA.
Restrictions on the right to receive information
The latest initiative from the American Congress aiming at protecting children on Internet is the Children's Internet Protection Act (CHIPA), targeted at all schools and public libraries that accept federal money. The law mandates that Internet-connected computers be equipped with software that blocks or filters out material deemed obscene or harmful to minors. Adults must also use filtered terminals, but they have the option of asking library supervisors to override the filter for "bona fide research or other lawful purpose." CHIPA was attached to the federal budget bill and passed in Congress in December 2000. In March 2001, the American Civil Liberties Union and the American Library Association, along with several individual users, libraries and public agencies, filed lawsuits in federal court calling the law unconstitutional.

13 "As a matter of constitutional tradition, in the absence of evidence to the contrary, we presume that governmental regulation of the content of speech is more likely to interfere with the free exchange of ideas than to encourage it. The interest in encouraging freedom of expression in a democratic society outweighs any theoretical but unproven benefit of censorship" (concluding paragraph). US Supreme Court: Reno, Attorney General of the United States, et al. v. American Civil Liberties Union et al., Case no. 96-511, decided 26.6.1997. Referred to as (Supreme Court on CDA).
Another important case concerning public libraries and Internet is the Loudoun County Library case, a US civil action concerning a public library policy which prohibited library patrons' access to certain content-based categories of Internet publications. Ten individual plaintiffs claimed that the library's Internet policy infringed on their right to free speech under the First Amendment. The defendant, the Board of Trustees of the Loudoun County Library, argued that a public library has an absolute right to limit what it provides to the public and that any restrictions on Internet access do not implicate the First Amendment. In its judgment, the District Court for the Eastern District of Virginia stressed that although the defendant is under no obligation to provide Internet access to its patrons, it has chosen to do so and is therefore restricted by the First Amendment in the limitations it is allowed to place on patron access.
In December 2000 the Danish Parliament considered a proposal, B46, to mandate the use of filtering technology on all public computers in order to protect children. Following the first hearing in Parliament, the proposed Act was replaced by a parliamentary recommendation, which proposes that libraries, schools and so forth should establish local Net-etiquettes14. Subsequently, Birkerød Public Library announced that it had installed filter software on all its public computers in order to prevent library patrons and minors from accessing websites containing pornographic material. According to Birkerød library, pornographic material is not information within the library's definition of information, and therefore has no protection as such.15
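Filter software of the kind at issue in CHIPA and in public library policies typically matches page content or URLs against keyword blocklists. A minimal sketch in Python (the blocklist and the example pages are invented here for illustration, not taken from any actual filtering product) shows why courts have worried that such filters restrict both more and less than intended:

```python
# Illustrative sketch of naive keyword-based content filtering, of the kind
# installed on public library terminals. The blocklist is hypothetical.
BLOCKED_KEYWORDS = {"sex", "nude", "xxx"}

def is_blocked(page_text: str) -> bool:
    """Return True if any blocked keyword appears as a word in the page text."""
    words = page_text.lower().split()
    return any(word in BLOCKED_KEYWORDS for word in words)

# The filter blocks the page it is meant to block ...
assert is_blocked("free xxx pictures")
# ... but it also blocks constitutionally protected material, e.g. health
# information: the overbreadth problem identified by the US courts.
assert is_blocked("advice on safe sex education for teenagers")
# And a trivial misspelling slips through: the under-blocking problem.
assert not is_blocked("s3x pictures")
```

The sketch makes the legal point concrete: because the filter cannot judge context, overblocking of protected expression and underblocking of targeted material are built into the technique itself.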
In the following I will explore the cases using the structure: public space and cyberspace, Internet as media, the right to impart information, the margin of appreciation, the necessity test, and the right to receive information.
The proposed legislation in CDA, COPA, and CHIPA can all be categorised as state attempts to regulate the communicative sphere of Internet. Whereas CDA is directed at all communications taking place on Internet, thus encompassing both system and lifeworld, COPA is restricted to communications in the commercial sphere of Internet.
If we look at the kind of communication the Acts are aiming at, CDA and COPA both seek to restrict individuals' right to express opinions, whereas CHIPA and B46 aim at

14 For speeches from the Parliament hearing on B46 see http://www.ft.dk/Samling/20001/MENU/00550064.htm (in Danish)

15 For the library's arguments in the filter debate see http://www.birkerod.bibnet.dk/filterdebat.htm (in Danish)
restricting individuals' right to receive information. This is also the question at issue in the two library cases, Loudoun and Birkerød, which both concern library policies restricting individuals' right to receive information through the use of mandatory filters. Common to the cases on self-regulation, which I shall return to later, is the fact that they are content restrictions enforced by private parties. The various means of self-regulation might be subject to government support, as exemplified by the EU supporting the development of codes of conduct, but their enforcement is subject neither to democratic control nor to judicial review.
Right to express opinions
Individuals' right to express opinions and impart information is addressed most thoroughly in CDA and COPA. However, whereas COPA is directed at commercial parties (system sphere), CDA imposes restrictions on all Internet communications, whether these originate from individuals, non-profit organisations or commercial parties (both lifeworld and system sphere). In both cases, the Courts stated that the law in question is a content-based restriction on speech and should be subject to strict scrutiny.
In assessing the scope of the CDA, the Courts agreed that it violates the First Amendment due to its vagueness, overbreadth and many ambiguities concerning its scope. The communication at issue, whether denominated indecent or patently offensive, is entitled to constitutional protection, and the CDA as a government-imposed content-based restriction on speech may only be upheld if it is justified by a compelling government interest and if it is narrowly tailored to effectuate that interest (District Court on CDA:23). The Supreme Court stated that CDA's use of undefined terms such as indecent and patently offensive was likely to provoke uncertainty among speakers as to content and the relation between the two terms (Supreme Court on CDA:
Syllabus: D). For instance, each of the two parts of the CDA uses a different linguistic form: the first uses the word "indecent", while the second speaks of material that "in context, depicts or describes, in terms patently offensive as measured by contemporary community standards, sexual or excretory activities or organs". The linguistic vagueness combined with the severity of CDA's criminal penalties may well cause speakers to remain silent rather than communicate even arguably unlawful words, ideas and images (Supreme Court on CDA:8). The Court found that the uncertainty of the terms used undermines the likelihood that the CDA has been narrowly tailored to the goal of protecting minors from potentially harmful material. Thus the overly broad scope of the Act was a main defect found by the Court. Applying the terminology of the European Court, the Act was neither proportionate to the legitimate aim pursued nor precise enough.
The Court confirmed that the government has a compelling interest in protecting children, but argued that although the protection of children is an important goal, it should not interfere with the legitimate rights of adults to speak or listen to matters not fit for children. As an example, the Court mentions chat rooms: if an adult knows that one or more members of a 100-person chat group is a minor, and it therefore would be a crime to send the group an indecent message, this would surely burden communication among adults (Ibid:15).
Regarding the issue of potentially harmful material, the Court stressed that potentially harmful material is not by nature different from other material, since sexually explicit material is an integral part of the different kinds of Internet communications, and a search engine might retrieve material of a sexual nature through an imprecise search, just as it might retrieve other irrelevant material. The accidental retrieval of sexually explicit material is one manifestation of the larger phenomenon of irrelevant search results (District Court on CDA:16). When evaluating adults' freedom of expression, the Court made it clear that the First Amendment protects sexual expressions which are indecent but not obscene, and that the fact that society may find speech offensive is not a sufficient reason for suppressing it (Supreme Court on CDA:15).
When assessing CDA's likely effect on the free availability of online material, the Courts found that in many cases it would be either technologically impossible or economically prohibitive to comply with the CDA without impeding the posting of online material which adults have a constitutional right to access (District Court on CDA:25). Online speakers talking through chat rooms or newsgroups have no practical means of controlling who receives the information, nor can content providers determine the identity and age of every user accessing their material. Implementation of age verification systems would place an inappropriate burden not least on individuals and non-profit organisations. Concerning the affirmative defences in CDA, the Court concluded that such defences would not be feasible for most non-commercial web publishers, and that even with respect to commercial publishers, the technology had yet to be proven effective in shielding minors from harmful material.

In COPA the scope of communications was narrowed to web communications made for commercial purposes (Court of Appeals on COPA:6). Commercial purposes were defined as covering those individuals or entities that are engaged in the business of making such communications16. In narrowing the scope of communications, COPA sought to meet the Supreme Court's criticism of the CDA's wide scope. However, when assessing COPA, the Court again stressed that web publishers cannot prevent users from certain geographical districts from accessing their material.
16 COPA defines a person engaged in the business as one who makes a communication, or offers to make a communication, by means of the world wide web, that includes any material that is harmful to minors, devotes time, attention, or labor to such activities, as a regular course of such person's trade or business, with the objective of earning a profit as a result of such activities (Court of Appeals on COPA:8).

The District Court on COPA also discussed the costs and burdens COPA imposes on web publishers and on the adults who seek access to sites covered by COPA. The Court
found that the only affirmative defences available were the implementation of credit card or age verification systems, and that either system would impose significant residual or indirect burdens upon web publishers (Ibid:14). Both systems would require an individual seeking access to material otherwise permissible to adults to reveal personal information. Because of the likelihood that many adults would choose not to reveal this personal information, websites complying with COPA might experience a loss of traffic and thus be disadvantaged compared with other commercial sites, simply because they contained material which might be harmful to minors. Pursuant to its strict scrutiny analysis of COPA, the Court held that COPA placed too large a burden on protected expressions. In particular, the high economic cost of implementing an age verification system could cause web publishers to cease publishing the material. Furthermore, the difficulty of shielding minors from potentially harmful content might lead web publishers to censor more information than necessary. Therefore, both CDA and COPA would impose a disproportionate burden on web publishers, and might have a negative effect on the free availability of constitutionally protected material. The Court concluded that the government lacks an interest in enforcing an unconstitutional law, and that losing First Amendment freedoms, even if only for a moment, constitutes irreparable harm (Ibid:16).
In assessing less restrictive means, the Court examined filter solutions, which I shall return to below. The conclusion was that the public interest factor weighs in favour of having access to a free flow of constitutionally protected speech, and that the government had not proved that less restrictive means would not be at least as effective in achieving the legitimate purposes of CDA and COPA.
Summing up on the Courts' assessment of individuals' right to impart information on Internet, a few points should be emphasised:
Online communication, whether denominated indecent or patently offensive, is entitled to constitutional protection; hence the Courts stress the diversity of online expressions protected.
A content-based restriction on online speech may only be upheld if it is justified by a compelling government interest and if it is narrowly tailored to effectuate that interest.
The protection of children is an important goal, but it should not interfere with the legitimate rights of adults to speak or listen to matters not fit for children.
Existing technology does not permit material to be restricted to particular states or jurisdictions; thus web publishers cannot prevent users from certain geographical districts from accessing their material.

Regulation of access
As illustrated by the Swedish example, the expressions of Flashback were denied access to cyberspace by every Swedish Internet Service Provider; thus the only means of appearance for the discussions on Flashback was through a foreign Internet Service Provider. This is not necessarily a problem, as long as the individual can choose another service provider. However, what if service providers on a global level decided that they did not want to host websites with certain types of content? If they all agreed on codes of conduct restricting morally offensive expressions, then the minority protection entailed in the right to freedom of expression would no longer prevail on Internet, and certain legal expressions would no longer have means of appearance on Internet.
The customer contract of Cybercity is another example of private parties setting restrictions on expressions. Referring to the discussion above, this is legitimate from a system perspective, but problematic when perceived from a lifeworld perspective. Again we can argue that if this type of customer contract becomes the norm, one might fear that individuals or organisations with marginalised views - whether of a political, sexual, or religious character - will be denied access to Internet. Currently, individuals have the option to change service provider if they are being restricted. But the tendency outlined in the previous chapter is hard to ignore. Private parties such as the GBDe collaboration increasingly engage in self-regulation in order to meet their customers' demand for a safer online environment. And governments, as exemplified by the EU, increasingly encourage this tendency towards self-regulation.
GBDe has addressed content as a specific area and has designated Walt Disney as the responsible private party. Will this eventually imply that the diversity of Internet, of expressions as they exist in society, will be replaced by a Disney version of reality? Will the lifeworld elements currently present cease to exist because the reality of the new public sphere is too offensive or provoking? We need to recall that there is nothing in cyberspace that does not exist in the physical world: the variety of expressions on Internet derives from human beings. The only difference is that whereas freedom of expression in the physical public sphere has limited reach and means of appearance, the public cyber sphere is far-reaching and with stronger means of appearance, thus making the diversity of opinions more visible and accessible to everyone. In this sense we could argue that Internet gives freedom of expression practical reality. In the public sphere of Internet, freedom of expression is not merely a principle but effectively a new way for individuals to voice their opinions and seek information. This is what gives the communicative sphere of Internet its potential. But it is also the feature behind the call for content regulation, not least by private parties concerned with consumer demand.

6.7. Positive state obligation


If we acknowledge that part of cyberspace can be perceived as public sphere, the freedom to express opinions in this sphere should be entitled to protection by the state. The protection is not only a matter of non-interference by the state, but might also entail positive state obligations to protect individuals against interference from third parties, as discussed in chapter four.
Even though commercial parties in principle have a right to limit which expressions are allowed on their servers, the current development towards Internet Service Providers setting standards for the legal content their customers are allowed to communicate is a de facto restriction on individuals' right to express opinions in the public sphere of cyberspace, and could lead to a situation where certain discussions have no means of appearance. A parallel in physical public space would be a situation where public space was outsourced to private parties who restricted which expressions were allowed when individuals met on a street corner or spoke in a public park.
The question is whether such a situation would not require positive measures of state protection in order to provide effective respect for individuals' freedom of expression. If private parties managed physical public space, we could surely demand that it be supervised by the state in order to secure compliance with the fundamental rights of the individual.
A positive state obligation in relation to Internet Service Providers as public sphere managers could be to secure that rights of expression prevail in the public cyber sphere, whether in cyber assemblies or on websites. Hence restrictions imposed by private parties should be subject to scrutiny by the state in order to secure that the principles of freedom of expression are protected. Another aspect of the positive state obligation could be the need to secure that every citizen has means of access to Internet as a new public sphere. Recalling Habermas' description of the public sphere, we could argue that accessibility is a precondition for the public sphere: The public sphere of civil society stood or fell with the principle of universal access. A public sphere from which specific groups would be eo ipso excluded was less than merely incomplete; it was no public sphere at all (Habermas 1989:85). By acknowledging Internet as partly public sphere, and further acknowledging that access to this public sphere - to express oneself and to receive information as provided for in the right to freedom of expression - is vital for democratic participation and development, we can argue that Internet access is crucial for citizens' ability to take part in society. This could call for a positive state obligation not only to protect online expressions from third party restrictions, but also to secure individuals' access to the public sphere of Internet. The positive state obligation could be to provide citizens with Internet access, for instance in public libraries. As illustrated in the previous chapter, Denmark has chosen to provide this cyber-access for all by securing Danish citizens' Internet access in public libraries, while at the same time allowing a public library to restrict citizens' right to receive and seek information.

CONCLUSION

Let me attempt to summarize this presentation and my recommendations:

The Internet cannot and should not be regulated like old media.

However, more can and should be done, especially in relation to harmful content.

New initiatives should be low-cost, practical and promoted on a voluntary basis.

Most problematic Internet content is neither illegal nor harmful; users must take appropriate responsibility while being advised on tools and techniques.

Over time, we should regulate broadcasting and the Internet in a less differentiated manner.

If we do not have a rational debate on the regulation of the Internet and come up with practical and effective proposals, then many of the one-third of UK homes that are still not on the Internet will be deterred from going online, many of those who are on the Net will be reluctant to use it as extensively and confidently as they should, and we run the risk that scare campaigns will be whipped up around particularly harmful or offensive content, tempting politicians and regulators to intervene in ways that the industry would probably find unhelpful.
Norms codified in codes of conduct, customer contracts and/or access criteria, chat policies, or filtering systems can be effective regulators. However, given that the right to freedom of expression is by its very nature a protection of minorities and dissenters voicing their opinion, a privately defined set of norms regulating online communication is a problematic path. Since freedom of expression is meant to protect especially those communications that shock, offend or disturb - thus the legitimate right of lifeworld to oppose system - one should be very cautious towards a development where private parties define which conversations shall be allowed, and which shall not. Self-regulation is a dangerous path when applied to public sphere communication, since it commercialises something that is not a commodity.
As stated above, there is nothing in the architecture of rating schemes such as PICS
that prevents rating and filtering systems from expanding vertically. In particular, in
order to enforce common codes of conduct, Internet service providers might decide to
implement filtering systems at access level, thereby meeting their customers' demand
for a safer Internet. So far the Internet industry has not agreed on common means of
content regulation or on common codes of conduct, but given the tendencies outlined
above, the call for states to take a stand on self-regulatory schemes becomes ever more
important. If the Internet is acknowledged as partly a public sphere, it deserves the
same level of protection that is provided for physical public-sphere communication.
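The vertical expansion warned about above is visible in the mechanics of label-based filtering itself. The following sketch is purely illustrative: the category names and numeric thresholds are invented, and real PICS labels use the W3C PICS-1.1 label syntax and service-defined rating vocabularies rather than Python dictionaries. The point it makes is architectural: the same policy check works identically whether the policy is set by a parent in a browser or by an ISP for every subscriber.

```python
# Toy, PICS-style label filter. Category names ("violence", "nudity")
# and threshold values are invented for illustration; real PICS labels
# follow the W3C PICS-1.1 syntax and per-service rating vocabularies.

def blocks(label_ratings, policy_limits):
    """Return True if any rated category exceeds the policy's limit."""
    return any(
        label_ratings.get(category, 0) > limit
        for category, limit in policy_limits.items()
    )

# Hypothetical policy and content label:
policy = {"violence": 1, "nudity": 0}       # maximum tolerated ratings
page_label = {"violence": 0, "nudity": 2}   # ratings claimed for a page

print(blocks(page_label, policy))  # True: the "nudity" rating exceeds 0
```

Whether the `policy` dictionary lives in a household browser or on an ISP's access-level proxy changes nothing in this check; only who controls the policy changes, which is precisely the regulatory question at stake.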
The alternative is a privatised wild west, where individuals' expression and
information retrieval are subject to arbitrary restrictions with no judicial review or
democratic legitimacy. In relation to positive state obligations, it is time for states not only to concern
themselves with the protection of minors but also, more generally, with the positive
protection of freedom of expression on the Internet. Individuals' right freely to
impart and receive information within a legal frame must be protected from
restrictions imposed by private parties, as outlined in the examples above. The concern for minors
should be balanced with a concern for the principles inherent in freedom of expression:
the protection of open debate, diversity and minority expression. Furthermore,
individuals' means to participate in the public cyber sphere should be positively secured,
for instance by providing Internet access at public libraries.
The time when the Internet was merely an alternative communicative channel has passed.
Cyberspace today is an important part of living as a private and public individual in the
modern world. It is a way of speaking and listening; an essential part of being human.
In the language of the European Court, it has strong independent significance.
Accordingly, access to communicate in cyberspace should be positively provided for by
states, as a natural part of democratic development and compliance with human rights.
In order to protect the communicative sphere on the Internet we need to reinforce the state
as protector. In recent years states have turned to self-regulation as the preferred path
when dealing with potentially harmful content on the Internet. However, self-regulation
potentially endangers individuals' freedom of expression because it regulates lifeworld
communication according to commercial system codes. Neither the protection of
freedom of expression nor that of human dignity can be left to private parties to regulate. The
current tendency, with service providers' self-regulation and commercial interests setting the scene,
endangers citizens' fundamental rights. The Internet is both a commercial
sphere (system) and a public communicative sphere (lifeworld), and law, not arbitrary
action by private parties, must protect the latter. This is the only way to ensure
transparency, accountability and democratic legitimacy.
The good news is that we do not need a lot of new regulation. We have existing
international standards in our human rights treaties: human rights that are,
precisely like the Internet, global in scope. The current challenge,
and state obligation, is to ensure that these standards are protected on the Internet by
applying them in the current Internet policy and regulation debate, whether it concerns
freedom of expression, privacy or freedom of assembly. This is not to say that the
task is easy, but the foundation is laid and the alternative is unacceptable. In the case of
freedom of expression, the alternative implies private parties' restrictions on individuals'
right to express opinions and receive information in the public cyber sphere.
With the Internet we have gained new means for humans to express themselves. It is time
for states to grant these expressions the same protection that we apply to expressions in the
physical world.

References

1. Akdeniz, Y. (2010). To block or not to block: European approaches to content regulation, and implications for freedom of expression. Computer Law & Security Review, 26:260-272.

2. Asghari, H., Mueller, M. L., van Eeten, M. J. G., and Wang, X. (2012a). Making Internet measurements accessible for multi-disciplinary research: an in-depth look at using M-Lab's Glasnost data for policy research. Submitted to IMC'12.

3. Asghari, H., van Eeten, M. J. G., and Mueller, M. L. (2012b). Unravelling the economic and political drivers of deep packet inspection: an empirical study of DPI use by broadband operators in 75 countries. Paper presented at the GigaNet 7th Annual Symposium, November 5, Baku, Azerbaijan.

4. Bambauer, D. E. (2009). Guiding the censor's scissors: A framework to assess Internet filtering. Brooklyn Law School Legal Studies Paper, (149):1-72.

5. Bendrath, R. and Mueller, M. (2011). The end of the Net as we know it? Deep packet inspection and Internet governance. New Media & Society, 13(7):1142-1160.

6. Benkler, Y. (2011). A free irresponsible press: WikiLeaks and the battle over the soul of the networked fourth estate. Working draft, forthcoming in Harvard Civil Rights-Civil Liberties Law Review.

7. Boas, T. (2006). Weaving the authoritarian web: The control of Internet use in non-democratic regimes. In: Zysman, J. and Newman, A., editors, How Revolutionary Was the Digital Revolution? National Responses, Market Transitions, and Global Technology, pp. 361-378. Stanford University Press, Palo Alto, CA.

8. Boyd, Danah (2008). Taken Out of Context: American Teen Sociality in Networked Publics. PhD thesis, University of California, Berkeley.

9. Boyd, Danah (2009). Social media is here to stay... now what? Paper presented at the Microsoft Research Tech Fest, Redmond. http://www.danah.org/papers/talks/MSRTechFest2009.html

10. Boyle, J. (1997). Foucault in cyberspace: surveillance, sovereignty, and hardwired censors. University of Cincinnati Law Review, 66:177-205.

11. Braman, S. (2009). Change of State: Information, Policy, and Power. MIT Press, Cambridge, MA.

12. Breindl, Y. (2012). Discourse networks on state-mandated access blocking in France and Germany. Paper presented at the 7th GigaNet Symposium, November 5, Baku, Azerbaijan.

13. Breindl, Y. and Briatte, F. (2013). Digital network repertoires and the contentious politics of digital copyright in France and the European Union. Policy & Internet, forthcoming.

14. Breindl, Y. and Wright, J. (2012). Internet filtering trends in liberal democracies: French and German regulatory debates. Paper presented at the FOCI'12 workshop, 2nd USENIX Workshop on Free and Open Communications on the Internet, August 6, 2012, Bellevue, WA.

15. Brown, I. (2008). Internet filtering: be careful what you ask for. In: Kirca, S. and Hanson, L., editors, Freedom and Prejudice: Approaches to Media and Culture, pp. 74-91. Bahcesehir University Press, Istanbul.

16. Brown, I. (2010). Internet self-regulation and fundamental rights. Index on Censorship, 1:98-106.

17. Brown, I. and Korff, D. (2012). Digital freedoms in international law: Practical steps to protect human rights online. Technical report, Global Network Initiative.

18. Brown, I. and Marsden, C. (2013). Regulating Code: Good Governance and Better Regulation in the Information Age. MIT Press, Cambridge, MA.

19. Busch, A. (2012). Politische Regulierung von Information: eine Einführung. In: Busch and Hofmann (2012), pp. 24-47.

20. Busch, A. and Hofmann, J., editors (2012). Politik und die Regulierung von Information, volume 46 of Politische Vierteljahresschrift. Nomos, Baden-Baden, Germany.

21. Castells, M. (2001). The Internet Galaxy: Reflections on the Internet, Business, and Society. Oxford University Press.

22. Clayton, R. (2005). Failures in a hybrid content blocking system. Presented at the Workshop on Privacy Enhancing Technologies, Dubrovnik, Croatia.

23. Clayton, R., Murdoch, S. J., and Watson, R. N. M. (2006). Ignoring the great firewall of China. In: Danezis, G. and Golle, P., editors, Privacy Enhancing Technologies Workshop (PET 2006), LNCS. Springer.

24. Deibert, R. J. and Crete-Nishihata, M. (2012). Global governance and the spread of cyberspace controls. Global Governance, 18(3):339-361.

25. Deibert, R. J., Palfrey, J. G., Rohozinski, R., and Zittrain, J. (2008). Access Denied: The Practice and Policy of Global Internet Filtering. Information Revolution and Global Politics. MIT Press, Cambridge, MA.

26. Deibert, R. J., Palfrey, J. G., Rohozinski, R., and Zittrain, J. (2010). Access Controlled: The Shaping of Power, Rights, and Rule in Cyberspace. Information Revolution and Global Politics. MIT Press, Cambridge, MA.

27. Deibert, R. J., Palfrey, J. G., Rohozinski, R., and Zittrain, J. (2011a). Access Contested: Security, Identity, and Resistance in Asian Cyberspace. Information Revolution and Global Politics. MIT Press, Cambridge, MA.

28. Deibert, R. J., Palfrey, J. G., Rohozinski, R., and Zittrain, J. (2011b). Access contested: Toward the fourth phase of cyberspace controls. In: Deibert, R. J., Palfrey, J. G., Rohozinski, R., and Zittrain, J., editors, Access Contested: Security, Identity, and Resistance in Asian Cyberspace, pp. 3-20. MIT Press, Cambridge, MA.

29. DeNardis, L. (2009). Protocol Politics: The Globalization of Internet Governance. MIT Press, Cambridge, MA.

30. DeNardis, L. (2010). The privatization of Internet governance. Paper presented at the GigaNet 5th Symposium, Vilnius, Lithuania.

31. DeNardis, L. (2012). Hidden levers of Internet control. Information, Communication & Society, 15(5):720-738.

32. Dischinger, M., Gummadi, K. P., Marcon, M., Mahajan, R., Guha, S., and Saroiu, S. (2010). Glasnost: Enabling end users to detect traffic differentiation. Paper presented at the USENIX Symposium on Networked Systems Design and Implementation (NSDI).

33. Dischinger, M., Mislove, A., Haeberlen, A., and Gummadi, K. P. (2008). Detecting BitTorrent blocking. Paper presented at IMC.

34. Dutton, W. H., Dopatka, A., Hills, M., Law, G., and Nash, V. (2011). Freedom of connection, freedom of expression: the changing legal and regulatory ecology shaping the Internet. Technical report, UNESCO.

35. Dyson, E., Gilder, G., Keyworth, G., and Toffler, A. (1996). Cyberspace and the American dream: A magna carta for the knowledge age. Information Society, 12(3):295-308.

36. Edwards, L. (2009). Pornography, censorship and the Internet. In: Edwards, L. and Waelde, C., editors, Law and the Internet. Hart Publishing, Oxford, UK.

37. Faris, R. and Villeneuve, N. (2008). Measuring global Internet filtering. In: Deibert, R. J., Palfrey, J. G., Rohozinski, R., and Zittrain, J., editors, Access Denied: The Practice and Policy of Global Internet Filtering, pp. 5-28. MIT Press, Cambridge, MA.

38. Flichy, P. (2001). L'imaginaire d'Internet. La Découverte, Paris.

39. Freedom House (2012). Freedom on the Net 2012: A Global Assessment of Internet and Digital Media. Technical report, Freedom House.

40. Froomkin, M. (2011). Lessons learned too well: The evolution of Internet regulation. Technical report, CDT Fellows Focus series.

41. Fuchs, C. (2012). Implications of deep packet inspection (DPI) Internet surveillance for society. The Privacy & Security Research Paper Series 1, Uppsala University.

42. Goldsmith, J. and Wu, T. (2006). Who Controls the Internet? Illusions of a Borderless World. Oxford University Press, Oxford/New York.

43. Greve, H. (2011). Access-Blocking: Grenzen staatlicher Gefahrenabwehr im Internet. PhD thesis, Humboldt-Universität zu Berlin.

44. Haunss, S. (2011). The politicization of intellectual property: IP conflicts and social change. WIPO Journal, 3(1):129-138.

45. Heliosch, A. (2012). Verfassungsrechtliche Anforderungen an Sperrmaßnahmen von kinderpornographischen Inhalten im Internet, volume 10 of Göttinger Schriften zur Internetforschung. Göttinger Universitätsverlag.

46. Hintz, A. (2012). Challenging the digital gatekeepers: International policy initiatives for free expression. Journal of Information Policy, 2:128-150.

47. Hofmann, J. (2012). Information und Wissen als Gegenstand oder Ressource von Regulierung. In: Busch and Hofmann (2012), pp. 5-23.

48. Johnson, D. R. and Post, D. G. (1996). Law and borders: the rise of law in cyberspace. Stanford Law Review, 48(5):1367-1402.

49. Johnson, P. (1997). Pornography drives technology: Why not to censor the Internet. Federal Communications Law Journal, 49(1):217-226.

50. Katz, J. (1997). Birth of a digital nation. Wired, 5(04).

51. Klang, M. (2005). Controlling online information. Paper presented at WSIS, Internet Governance and Human Rights, Uppsala, Sweden.

52. La Rue, F. (2011). Report of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression. Technical Report A/HRC/17/27, Human Rights Council, Seventeenth session, Agenda item 3.

53. Lessig, L. (1999). Code: And Other Laws of Cyberspace. Basic Books, New York.

54. Lessig, L. (2006). Code: And Other Laws of Cyberspace, Version 2.0. Basic Books, New York.

55. Löblich, M. and Wendelin, M. (2012). ICT policy activism on a national level: Ideas, resources and strategies of German civil society in governance processes. New Media & Society, 14(6):899-915.

56. MacKinnon, R. (2012). Consent of the Networked: The Worldwide Struggle for Internet Freedom. Basic Books, New York.

57. Marsden, C. (2010). Net Neutrality: Towards a Co-Regulatory Solution. Bloomsbury Publishing, London.

58. Marsden, C. T. (2011). Internet Co-Regulation: European Law, Regulatory Governance and Legitimacy in Cyberspace. Cambridge University Press, UK.

59. McIntyre, T. (2012). Child abuse images and cleanfeeds: Assessing Internet blocking systems. In: Brown, I., editor, Research Handbook on Governance of the Internet. Edward Elgar, Cheltenham.

60. McIntyre, T. J. and Scott, C. (2008). Internet filtering: Rhetoric, legitimacy, accountability and responsibility. In: Brownsword, R. and Yeung, K., editors, Regulating Technologies: Legal Futures, Regulatory Frames and Technological Fixes, pp. 109-124. Hart Publishing, Oxford, UK.

61. McNamee, J. (2011a). Internet blocking: crimes should be punished and not hidden. Technical report, EDRi.

62. McNamee, J. (2011b). The slide from self-regulation to corporate censorship. Discussion paper, European Digital Rights.

63. Moore, T. and Clayton, R. (2009). The impact of incentives on notice and take-down. In: Managing Information Risk and the Economics of Security, pp. 199-223. Springer US.

64. Mueller, M., Pagé, C., and Kuerbis, B. (2004). Civil society and the shaping of communication-information policy: Four decades of advocacy. The Information Society: An International Journal, 20(3):169.

65. Mueller, M. L. (2002). Ruling the Root: Internet Governance and the Taming of Cyberspace. Information Revolution and Global Politics Series. MIT Press, Cambridge, MA.

66. Mueller, M. L. (2010). Networks and States: The Global Politics of Internet Governance. Information Revolution and Global Politics Series. MIT Press, Cambridge, MA.

67. Mueller, M. L. and Asghari, H. (2012). Deep packet inspection and bandwidth management: Battles over BitTorrent in Canada and the United States. Telecommunications Policy, 36(6):462-475.

68. Murdoch, S. J. and Anderson, R. (2008). Tools and technology of Internet filtering. In: Deibert, R. J., Palfrey, J. G., Rohozinski, R., and Zittrain, J., editors, Access Denied: The Practice and Policy of Global Internet Filtering, pp. 29-56. MIT Press, Cambridge, MA.

69. Murray, A. D. and Scott, C. (2001). The partial role of competition in controlling the new media. Presented at the Competition Law and the New Economy conference, University of Leicester.

70. Musiani, F. (2011). Privacy as invisibility: Pervasive surveillance and the privatization of peer-to-peer systems. tripleC: Cognition, Communication, Co-operation, 9(2):126-140.

71. OECD (2011). Joint declaration on freedom of expression and the Internet. Technical report, OECD.

72. Oswell, D. (1999). The dark side of cyberspace: Internet content regulation and child protection. Convergence: The International Journal of Research into New Media Technologies, 5(4):42-62.

73. Rasmussen, T. (2007). Techno-politics, Internet governance and some challenges facing the Internet. Research Report 15, Oxford Internet Institute.

74. Reidenberg, J. (1998). Lex informatica: The formulation of information policy rules through technology. Texas Law Review, 76(3):553-584.

75. Reporters Without Borders (2012). Internet Enemies 2012. Report, Reporters Without Borders for Press Freedom.

76. Resnick, P. and Miller, J. (1996). PICS: Internet access controls without censorship. Communications of the ACM, 39(10):87-93.

77. Vanobberghen, W. (2007). 'The Marvel of our Time': Visions about radio broadcasting in the Flemish Catholic press, 1923-1936. Paper presented at the 25th IAMCR conference, Paris, France.

78. Villeneuve, N. (2010). Barriers to cooperation: An analysis of the origins of international efforts to protect children online. In: Deibert et al. (2010), pp. 55-70.

79. Wagner, B. (2012). Push-button-autocracy in Tunisia: Analysing the role of Internet infrastructure, institutions and international markets in creating a Tunisian censorship regime. Telecommunications Policy, 36(6):484-492.

80. Wellman, B. (2001). Physical place and cyberplace: The rise of personalized networking. International Journal of Urban and Regional Research, 25(2):227-252.

81. Wright, J., de Souza, T., and Brown, I. (2011). Fine-grained censorship mapping: information sources, legality and ethics. In: Proceedings of the Freedom of Communications on the Internet Workshop.

82. Wu, T. (2010). The Master Switch: The Rise and Fall of Information Empires. Atlantic Books, London.

83. York, J. C. (2010). Policing content in the quasi-public sphere. Technical report, OpenNet Initiative Bulletin.

84. Yu, P. (2010). The graduated response. Florida Law Review, 62:1373-1430.

85. Yu, P. K. (2004). The escalating copyright wars. Hofstra Law Review, 32:907-951.

86. Zeno-Zencovich, V. (2009). Freedom of Expression: A Critical and Comparative Analysis. T & F Books UK.

87. Zittrain, J. (2003). Internet points of control. Boston College Law Review, 43(1).

88. Zittrain, J. (2008). The Future of the Internet and How to Stop It. Yale University Press, New Haven, CT.

89. Zittrain, J. and Edelman, B. (2002). Localized Google search result exclusions: Statement of issues and call for data. Available at http://cyber.law.harvard.edu/filtering/google/.

90. Zittrain, J. and Palfrey, J. G. (2008). Internet filtering: The politics and mechanisms of control. In: Deibert, R. J., Palfrey, J. G., Rohozinski, R., and Zittrain, J., editors, Access Denied: The Practice and Policy of Global Internet Filtering, pp. 29-56. MIT Press, Cambridge, MA.
